4th May 2019, 12:19 am   #9
TIMTAPE
Octode
 
Join Date: Jul 2006
Location: Perth, Western Australia
Posts: 1,969
Re: Different Dolby B / C Calibration Standards

Quote:
Originally Posted by cmjones01
There is some evidence that not all cassette deck manufacturers actually used the same reference playback level for their Dolby decoding. I did some work on my Sony TC-K81, a somewhat high-end 3-head deck, and while searching the web for information on standard level tapes, found that Sony's 'Dolby' level wasn't quite the same as everyone else's - something like 235 nWb/m compared with the standard 200 nWb/m. I checked, and indeed the measurements specified in the service manual around the Dolby decoder chips seemed to reinforce this...
I've never come across this. Back in the early '80s, though, there was a discrepancy in cassette reference level in the high frequencies. I think BASF used a slightly different method of arriving at the reference level, which resulted in their calibration tapes measuring differently from those of other cal-tape manufacturers. This wasn't a Dolby issue as such, but since Dolby exaggerates misalignment errors, especially in the highs, it would have affected Dolby tracking on machines aligned to the different standard. I think the issue was eventually sorted out.
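
For what it's worth, the gap cmjones01 describes is easy to put a number on. Here's a quick back-of-the-envelope sketch in Python; the 235 nWb/m figure is taken from the quoted post, not anything I've measured myself:

Code:
import math

# Reference fluxivities (nWb/m) from the discussion above.
# 200 nWb/m is the commonly cited Dolby reference level for cassette;
# 235 nWb/m is the level the Sony TC-K81 service data reportedly implies
# (figure taken from the quoted post, not independently verified).
standard_nwb = 200.0
sony_nwb = 235.0

# A level ratio expressed in decibels: dB = 20 * log10(ratio)
offset_db = 20 * math.log10(sony_nwb / standard_nwb)

print(f"Sony vs. standard Dolby level: {offset_db:+.2f} dB")
# Prints roughly +1.40 dB. A constant decoding-level offset of that
# size shifts where the Dolby B/C companding thresholds sit, which is
# the kind of error that shows up as mistracking in the highs.

So around 1.4 dB, which is small on a meter but well within the range where Dolby tracking starts to suffer, especially with Dolby C.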