3rd May 2019, 8:48 pm   #7
cmjones01
Nonode
Join Date: Oct 2008
Location: Warsaw, Poland and Cambridge, UK
Posts: 2,669
Re: Different Dolby B / C Calibration Standards

There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes I found that Sony's 'Dolby' level wasn't quite the same as everyone else's: something like 235 nWb/m compared with the usual 200 nWb/m. I checked, and the levels specified in the service manual around the Dolby decoder chips did seem to bear this out.
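
Just to put a rough number on that mismatch (my own back-of-envelope sketch, treating the 235 nWb/m figure as approximate): flux level is an amplitude-like quantity, so the offset works out to about 1.4 dB, which is enough to push a Dolby decoder noticeably away from its intended tracking point.

[code]
import math

# Reference fluxivities in nWb/m (figures quoted above; 235 is approximate)
sony_dolby_level = 235.0
standard_dolby_level = 200.0

# Flux is an amplitude-like quantity, so the offset in dB is 20*log10 of the ratio
offset_db = 20 * math.log10(sony_dolby_level / standard_dolby_level)
print(f"Dolby reference offset: {offset_db:.2f} dB")  # roughly 1.4 dB
[/code]

Since the B and C companding curves are level-dependent, an offset of that size shifts the decoder's thresholds and can show up as a frequency response error on decoded playback.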

I didn't use Dolby B much back in the cassette era. My Technics RS-B355 deck (a budget model) had both B and C. I used C for a while, which worked fairly well on its own recordings, but I soon got fed up with the pumping artefacts and switched off noise reduction altogether for both recording and playback.

I recently acquired a Sony WM-D6C Walkman Professional, which also has Dolby C. Trying to play back any of my late-80s Dolby C encoded cassettes on it is a disaster, though. I haven't tried realigning the playback level to fix it.

The elephant in the room with cassettes was always azimuth adjustment. It was accepted schoolboy folklore that for best results you had to play back cassettes on the same machine they were recorded on. Even good quality hi-fi decks seemed to leave the factory with different azimuth settings, and the tiniest misalignment makes such a big difference that it's a real problem.
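
To illustrate roughly how sensitive it is (my own figures, using the standard sinc-function azimuth-loss formula and assumed typical compact cassette values of 0.6 mm track width and 4.76 cm/s tape speed): an error of ten to twenty minutes of arc already costs several dB at 10 kHz, and Dolby decoding then exaggerates the resulting treble loss.

[code]
import math

def azimuth_loss_db(error_arcmin, freq_hz, track_width_m=0.6e-3, tape_speed_mps=0.0476):
    # Classic azimuth loss: 20*log10(sin(x)/x) with x = pi * w * tan(a) / wavelength.
    # Track width and tape speed are assumed typical compact cassette values.
    wavelength = tape_speed_mps / freq_hz          # recorded wavelength on the tape
    angle_rad = math.radians(error_arcmin / 60.0)  # minutes of arc -> radians
    x = math.pi * track_width_m * math.tan(angle_rad) / wavelength
    return 20 * math.log10(abs(math.sin(x) / x)) if x else 0.0

for arcmin in (5, 10, 20):
    print(f"{arcmin:2d} arcmin error: {azimuth_loss_db(arcmin, 10_000):6.1f} dB at 10 kHz")
[/code]

The loss follows sin(x)/x, so it gets rapidly worse with both frequency and error angle, which is why two decks that each sound fine on their own tapes can sound awful playing each other's.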

Chris
__________________
What's going on in the workshop? http://martin-jones.com/