UK Vintage Radio Repair and Restoration Discussion Forum

UK Vintage Radio Repair and Restoration Discussion Forum (https://www.vintage-radio.net/forum/index.php)
-   Vintage Tape (Audio), Cassette, Wire and Magnetic Disc Recorders and Players (https://www.vintage-radio.net/forum/forumdisplay.php?f=35)
-   -   Different Dolby B / C Calibration Standards (https://www.vintage-radio.net/forum/showthread.php?t=156259)

Mach One 3rd May 2019 5:00 pm

Different Dolby B / C Calibration Standards
 
Here is something I have only ever been able to guess at. It goes back to the Eighties, when I splashed out on a brand new stereo cassette deck - my second. The first (a lovely Teac with totally un-electronic transport controls) still worked, but bizarrely was putting clicks on tapes even in play. (I can't get used to thinking of cassettes and CDs as "vintage", by the way...)

I decided to buy an NAD deck because I had the impression that it was one of the best of the affordable decks around. Having bought it and recorded a couple of cassettes on it, I noticed that my original cassettes sounded a bit dull on the new machine. Cassettes recorded on the new one sounded perfect on it, of course. I can't remember how, but I came to the conclusion that it was a Dolby issue - as if the new deck required a higher level of audio to work properly, as it were. As I did not want to re-do all my audio cassettes I took it back and settled for a Sony at an equivalent price, which did not exhibit the same problem.

Incidentally, when I borrowed a relatively high-end Panasonic at the same time that was also problematic. It seemed that Sony were the only good make that I could rely on to continue recording my cassettes.

Were there two different Dolby calibration settings once?

Electronpusher0 3rd May 2019 5:53 pm

Re: Different Dolby B / C Calibration Standards
 
Are you sure it was a Dolby issue and not a tape head azimuth issue?

Peter

Ted Kendall 3rd May 2019 6:05 pm

Re: Different Dolby B / C Calibration Standards
 
No, although I think Dolby level for cassettes was 200nWb/m and that for open reel was 250nWb/m, if memory serves. Some NAD decks had a feature called "play trim" which was meant to compensate for HF losses on incoming tapes and bring the Dolby into its sweet spot.

Cassette machines in general and Dolby machines in particular were always pushing the limits of available technology, and undoubtedly some manufacturers were more careful with line-up than others. Azimuth was always a moveable feast, too - it was uncommon for two machines to match, and many machines gave a different answer if you removed and re-inserted the cassette. Fascinating as I have to admit they are, two comments from the Sounds Good programme in the 70s stick in my mind - one was a dealer admitting that the hi-fi cassette was "a fluke", and the other was John Longden, the EIC of Radio London, giving his opinion that all the refinements were meant to make something that was basically rather horrid work.

Admittedly, Revox, Nakamichi and one or two others achieved this, but even they struggled with Dolby C - I don't think I've ever heard a Dolby C cassette recording which didn't display artifacts. B, properly lined up, could actually work well. C was all right E-E (monitoring electronics-to-electronics, with no tape in the chain), but once the tape was introduced, things went awry - it was just too finicky on line-up.

paulsherwin 3rd May 2019 6:41 pm

Re: Different Dolby B / C Calibration Standards
 
Dolby tracking errors caused all sorts of problems when playing a tape recorded on one machine on something else. There were even tracking errors on the same machine because different tape formulations had different sensitivities. BASF chrome tapes produced tracking errors when used to record on any deck set up for the higher sensitivity TDK/Maxell pseudochromes, even if the record bias was adjusted appropriately.

Dolby errors are the main reason that commercial prerecorded cassettes tend to sound murky if played with Dolby B enabled. BASF were very successful in persuading record companies to use their chrome formulations.

wd40addict 3rd May 2019 7:04 pm

Re: Different Dolby B / C Calibration Standards
 
These days you can do the Dolby B decode in software and experiment with the optimum level.

I always thought a weakness in Dolby B was the variance in reference level between different manufacturers. With the A system reference tones were recorded to allow optimum playback on a different machine.
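The point about decoding in software and experimenting with the level can be illustrated with a toy model. This is not real Dolby B (which is a sliding-band system acting only on high frequencies); it is a crude static compander sketch with made-up threshold figures, showing why a replay level error is magnified by the decoder - which is why mistracking sounds dull rather than merely quiet:

```python
# Toy compander sketch - NOT real Dolby B, just the level-tracking principle.
# All threshold figures below are invented for illustration.

def boost(level_db):
    """Encoder boost: +10 dB below -40 dB, 0 dB above -20 dB, linear between."""
    if level_db <= -40:
        return 10.0
    if level_db >= -20:
        return 0.0
    return (-20.0 - level_db) / 2.0

def encode(level_db):
    """Level actually put on tape: quiet passages are boosted."""
    return level_db + boost(level_db)

def decode(recorded_db):
    """Invert encode() by bisection (encode is monotonically increasing)."""
    lo, hi = -120.0, 20.0
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if encode(mid) < recorded_db:
            lo = mid
        else:
            hi = mid
    return round((lo + hi) / 2.0, 6)

quiet_hf = -30.0                  # a quiet passage, inside the companded range
on_tape = encode(quiet_hf)        # recorded at -25.0 dB
perfect = decode(on_tape)         # -30.0: exact recovery when levels match
miscal = decode(on_tape - 3.0)    # deck replays 3 dB low -> decodes to -36.0
print(perfect, miscal)
```

In the linear region of this toy law the encoder halves level changes, so the decoder doubles them: a 3 dB calibration error comes out as 6 dB after decoding. Real Dolby B confines the effect to the treble, hence the characteristic dullness.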

Ted Kendall 3rd May 2019 8:24 pm

Re: Different Dolby B / C Calibration Standards
 
Quote:

Originally Posted by paulsherwin (Post 1142363)
Dolby errors are the main reason that commercial prerecorded cassettes tend to sound murky if played with Dolby B enabled. BASF were very successful in persuading record companies to use their chrome formulations.

The brand of tape stock is surely irrelevant here - if the duplication equipment was correctly adjusted (and all too often it wasn't), the replay machine should have been presented with a tape modulated at the correct level and accurate decoding should have ensued. The adoption of 120µs for commercially duplicated chrome stock was intended to reduce HF crushing and thus improve Dolby tracking. Sadly, the consistency and ultimate quality of high speed duplicated cassettes seldom exceeded the mediocre.

cmjones01 3rd May 2019 8:48 pm

Re: Different Dolby B / C Calibration Standards
 
There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes, found that Sony's 'Dolby' level wasn't quite the same as everyone else's - something like 235nWb/m compared with the standard 200nWb/m. I checked, and indeed the measurements specified in the service manual around the Dolby decoder chips seemed to reinforce this.

I didn't use Dolby B much back in the cassette era. My Technics RS-B355 deck (a budget model) had both B and C. I used C for a while which worked fairly well on its own recordings but I soon got fed up with the pumping artefacts and just switched off noise reduction altogether for recording and playback.

I recently acquired a Sony WM-D6C Walkman professional which also has Dolby C. Trying to play back any of my late 80s-era Dolby C encoded cassettes is a disaster on it, though. I haven't tried realigning the playback level to fix it.

The elephant in the room with cassettes was always the azimuth adjustment. It was accepted schoolboy folklore that for best results you had to play back cassettes on the same machine they were recorded on. Even good quality hi-fi decks seemed to leave the factory with different azimuth settings, and even the tiniest misalignment makes such a huge difference that it's a real problem.

Chris

paulsherwin 3rd May 2019 8:55 pm

Re: Different Dolby B / C Calibration Standards
 
Ted:

The problem is the differing sensitivity of the different tape stock. BASF chrome has very low noise levels but also low MOLs. A correctly recorded BASF chrome cassette will play back 3-6dB lower than a good pseudochrome. It's quite easy to see this effect when playing back commercial cassettes recorded on different stock. There is no practical way for tape duplicators to counteract this effect. The playback machine can be Dolby calibrated for high or low MOLs but not both.

I'm not picking on BASF here - it's just that they were the biggest manufacturers of this sort of tape, and marketed it aggressively to both the commercial and domestic markets.

Of course, there are lots of other reasons why a musicassette can sound lousy. They got a lot better towards the end of the cassette era as the duplicators switched to decent ferric stock and introduced HX-Pro style variable bias.

TIMTAPE 4th May 2019 12:19 am

Re: Different Dolby B / C Calibration Standards
 
Quote:

Originally Posted by cmjones01 (Post 1142401)
There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes, found that Sony's 'Dolby' level wasn't quite the same as everyone else's - something like 235nWb/m compared with the standard 200nWb/m. I checked, and indeed the measurements specified in the service manual around the Dolby decoder chips seemed to reinforce this...

I've never come across this. Back in the early 80s, though, there was a discrepancy in cassette reference level in the high frequencies. I think BASF used a slightly different method of arriving at the reference level, which resulted in their calibration tapes measuring differently from those of other cal tape manufacturers. This wasn't a Dolby issue as such, but since Dolby exaggerates misalignment issues, especially in the highs, it would have affected Dolby tracking on machines aligned to the different standard. I think the issue was eventually sorted out.

TIMTAPE 4th May 2019 2:57 am

Re: Different Dolby B / C Calibration Standards
 
Quote:

Originally Posted by paulsherwin (Post 1142402)
Ted:

The problem is the differing sensitivity of the different tape stock. BASF chrome has very low noise levels but also low MOLs. A correctly recorded BASF chrome cassette will play back 3-6dB lower than a good pseudochrome. It's quite easy to see this effect when playing back commercial cassettes recorded on different stock. There is no practical way for tape duplicators to counteract this effect. The playback machine can be Dolby calibrated for high or low MOLs but not both.

I'm not sure I go along with this, Paul. For a tape with sensitivity 3 dB below reference we simply increase the record gain by 3 dB. MOL, on the other hand, is the maximum level a tape is capable of for a given level of distortion. Since Dolby B and C only kick in at low levels, MOL shouldn't affect their performance unless the MOL is very low. That's my understanding anyway.

TIMTAPE 4th May 2019 5:05 am

Re: Different Dolby B / C Calibration Standards
 
Quote:

Originally Posted by cmjones01 (Post 1142401)
There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes, found that Sony's 'Dolby' level wasn't quite the same as everyone else's - something like 235nWb/m compared with the standard 200nWb/m. I checked, and indeed the measurements specified in the service manual around the Dolby decoder chips seemed to reinforce this.

Adding to the previous post, is it possible you came across the different standards for describing the same flux level? For example, on an alignment tape I have here the same flux level is described as 295nWb/m (ANSI) and 320nWb/m (DIN).

I'm not sure on this, but I think cassette Dolby level at 400Hz is 185nWb/m (ANSI), which is 200nWb/m (DIN) - again, the same flux level.



Powered by vBulletin®
Copyright ©2000 - 2024, vBulletin Solutions, Inc.
Copyright ©2002 - 2023, Paul Stenning.