Different Dolby B / C Calibration Standards
Here is something to which I have only ever guessed at an answer. It goes back to the Eighties, when I splashed out on a brand new stereo cassette deck - my second. The first (a lovely Teac with totally un-electronic transport controls) still worked, but bizarrely was putting clicks on tapes even in play. (I can't get used to thinking of cassettes and CDs as "vintage", by the way...)
I decided to buy an NAD deck because I had the impression that it was one of the best of the affordable decks around. Having bought it and recorded a couple of cassettes on it, I noticed that my original cassettes sounded a bit dull on the new machine. Cassettes recorded on the new one sounded perfect on it, of course. I can't remember how but I came to the conclusion that it was a Dolby issue - as if the new deck required a higher level of audio to work properly, as it were. As I did not want to re-do all my audio cassettes I took it back and settled for a Sony at an equivalent price, which did not exhibit the same problem. Incidentally, when I borrowed a relatively high-end Panasonic at the same time that was also problematic. It seemed that Sony were the only good make that I could rely on to continue recording my cassettes. Were there two different Dolby calibration settings once? |
Re: Different Dolby B / C Calibration Standards
Are you sure it was a Dolby issue and not a tape head azimuth issue?
Peter |
Re: Different Dolby B / C Calibration Standards
No, although I think Dolby level for cassettes was 200 nWb/m and that for open reel was 250 nWb/m, if memory serves. Some NAD decks had a feature called "play trim" which was meant to compensate for HF losses on incoming tapes and bring the Dolby decode back into its sweet spot.
Cassette machines in general and Dolby machines in particular were always pushing the limits of available technology, and undoubtedly some manufacturers were more careful with line-up than others. Azimuth was always a variable feast, too - it was uncommon for two machines to match, and many machines gave a different answer if you removed and re-inserted the cassette. Fascinating as I have to admit they are, two comments from the Sounds Good programme in the 70s stick in my mind - one was a dealer admitting that the hi-fi cassette was "a fluke", and John Longden, the EIC of Radio London, giving his opinion that all the refinements were meant to make something that was basically rather horrid work. Admittedly, Revox, Nakamichi and one or two others achieved this, but even they struggled with Dolby C - I don't think I've ever heard a Dolby C cassette recording which didn't display artifacts. B, properly lined up, could actually work well. C was all right E-E, but once the tape was introduced into the chain, things went awry - it was just too finicky on line-up. |
Re: Different Dolby B / C Calibration Standards
Dolby tracking errors caused all sorts of problems when playing a tape recorded on one machine on something else. There were even tracking errors on the same machine because different tape formulations had different sensitivities. BASF chrome tapes produced tracking errors when used to record on any deck set up for the higher sensitivity TDK/Maxell pseudochromes, even if the record bias was adjusted appropriately.
Dolby errors are the main reason that commercial prerecorded cassettes tend to sound murky if played with Dolby B enabled. BASF were very successful in persuading record companies to use their chrome formulations. |
Re: Different Dolby B / C Calibration Standards
These days you can do the Dolby B decode in software and experiment with the optimum level.
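For anyone curious what "experimenting with the level" actually does to the decode, here is a toy static model. This is emphatically not a real Dolby B implementation (the real circuit is a dynamic sliding-band compander); it only captures the level-dependence that matters here: the encoder boosts quiet HF content, the decoder applies the inverse cut in a feedback arrangement, and a playback-level calibration error leaves a residual gain error. The 10 dB maximum boost and the linear taper to Dolby level are simplifying assumptions.

```python
def encode_boost_db(level_db):
    """Simplified static Dolby-B-style boost: 10 dB for very quiet
    signals, tapering linearly to 0 dB at Dolby reference level (0 dB).
    Purely illustrative; the real curve and its dynamics differ."""
    return max(0.0, min(10.0, -level_db * 0.25))

def decode_cut_db(seen_level_db):
    """Decoder cut found by fixed-point iteration, mimicking the
    feedback arrangement of a real decoder (the cut is derived from
    the decoder's own output level, so a matched decode inverts the
    encode exactly)."""
    cut = 0.0
    for _ in range(200):
        cut = encode_boost_db(seen_level_db - cut)
    return cut

def mistracking_db(true_level_db, playback_offset_db):
    """Residual gain error after encode -> (mis)calibrated playback
    -> decode. 0 dB means perfect tracking; negative means the decoder
    is over-cutting HF (the 'dull' sound described in this thread)."""
    boost = encode_boost_db(true_level_db)
    seen = true_level_db + boost + playback_offset_db
    return boost - decode_cut_db(seen)

print(f"matched calibration: {mistracking_db(-30.0, 0.0):+.2f} dB")
print(f"tape plays 3 dB low: {mistracking_db(-30.0, -3.0):+.2f} dB")
```

In this model a matched calibration decodes exactly, while a 3 dB playback-level error leaves about 1 dB of excess HF cut at mid levels - which is presumably the sort of offset a control like NAD's "play trim" let the user dial out by ear.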
I always thought a weakness in Dolby B was the variance in reference level between different manufacturers. With the A system reference tones were recorded to allow optimum playback on a different machine. |
Re: Different Dolby B / C Calibration Standards
There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes, found that Sony's 'Dolby' level wasn't quite the same as everyone else's - something like 235 nWb/m compared with the standard 200 nWb/m. I checked, and indeed the measurements specified in the service manual around the Dolby decoder chips seemed to reinforce this.
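As a rough check on how much such a calibration difference would matter, a quick back-of-envelope calculation (the 235 nWb/m figure is the one recalled above, so treat it as illustrative):

```python
import math

# Level offset between two playback calibrations, in dB.
# 200 nWb/m is the common Dolby reference level; 235 nWb/m is the
# figure recalled above for Sony decks (illustrative, from memory).
offset_db = 20 * math.log10(235 / 200)
print(f"{offset_db:.2f} dB")  # about 1.40 dB
```

A 1.4 dB offset sounds small, but because a Dolby B decoder's gain depends on the level it sees, the error shifts the decode of every quiet HF passage and can be audible as mild dulling or brightening.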
I didn't use Dolby B much back in the cassette era. My Technics RS-B355 deck (a budget model) had both B and C. I used C for a while, which worked fairly well on its own recordings, but I soon got fed up with the pumping artefacts and just switched off noise reduction altogether for recording and playback. I recently acquired a Sony WM-D6C Walkman Professional which also has Dolby C. Trying to play back any of my late 80s-era Dolby C encoded cassettes is a disaster on it, though. I haven't tried realigning the playback level to fix it. The elephant in the room with cassettes was always the azimuth adjustment. It was accepted schoolboy folklore that for best results you had to play back cassettes on the same machine they were recorded on. Even good quality hi-fi decks seemed to leave the factory with different azimuth settings, and even the tiniest misalignment makes such a huge difference that it's a real problem. Chris |
Re: Different Dolby B / C Calibration Standards
Ted:
The problem is the differing sensitivity of the different tape stock. BASF chrome has very low noise levels but also low MOLs. A correctly recorded BASF chrome cassette will play back 3-6 dB lower than a good pseudochrome. It's quite easy to see this effect when playing back commercial cassettes recorded on different stock. There is no practical way for tape duplicators to counteract this effect. The playback machine can be Dolby calibrated for high or low MOLs but not both. I'm not picking on BASF here - it's just that they were the biggest manufacturer of this sort of tape, and marketed it aggressively to both the commercial and domestic markets. Of course, there are lots of other reasons why a musicassette can sound lousy. They got a lot better towards the end of the cassette era as the duplicators switched to decent ferric stock and introduced HX-Pro-style variable bias. |
Re: Different Dolby B / C Calibration Standards
Quote:
I'm not sure on this, but I think cassette Dolby level at 400 Hz is 185 nWb/m (ANSI), which corresponds to 200 nWb/m (DIN) - the two figures describe the same physical flux level, just measured by different methods. |
Copyright ©2002 - 2023, Paul Stenning.