Thread: FET Questions
27th Dec 2013, 2:58 am   #66
Synchrodyne
Re: FET Questions

Quote:
Originally Posted by G6Tanuki View Post
Pondering this, I wonder - in the US UHF TV took a long time to get off the ground - to the point where it was considered very much an afterthought to TV designers until well into the 1980s. Remember, unlike the UK they had 525-line NTSC colo(u)r on VHF from the 1950s... so no real pressure to put much effort into UHF... and UHF doesn't cover US-style distances well either.
UHF TV was definitely of secondary importance in the USA. The three main networks, ABC, CBS and NBC, were on the VHF channels in the great majority of locations, and PBS was also on VHF in most areas. Thus the UHF channels were occupied mostly by local and independent broadcasters. Dallas-Fort Worth (DFW) was a typical example, with VHF TV stations on channels A2 (PBS), A4 (CBS), A5 (NBC), A8 (ABC), A11 (Independent) and A13 (PBS), and with 3 or 4 independents on UHF.

Thus UHF tuners were very simple devices until the 1970s. At some stage, possibly early in the 1970s, the FCC mandated that TV receivers must tune the UHF channels as well as they did the VHF channels, and this resulted in detented tuning mechanisms, although not necessarily with any electronic changes. The addition of RF amplifiers, as in the RCA KRK-226 case, might have coincided more or less with the advent of varactor tuning. Possibly the use of varactors resulted in greater loss and/or higher noise, such that low-noise gain ahead of the diode mixer became desirable?

Quote:
Originally Posted by G6Tanuki View Post
I also ponder whether the US FET TV- and FM-radio tuner designers were more focussed on coping with lots of strong competing stations on VHF [particularly on the VHF-TV and 'Band II' FM broadcast radio-band] whereas in more-regulated State-controlled-media Europe we tended to have many fewer stations. Cross-modulation resistance (hence early use of MOSFETs and diode-mixers) could have been more important to the US designers? When I visited Dallas/Fort Worth in the early-1980s tuning a FM radio from one end of the band to the other revealed precious few unoccupied channels.
With US VHF TV tuners, cross-modulation was evidently the key performance parameter as an indication of front end non-linearity, and one that had gone backwards significantly with the transition from valves to bipolar transistors. The use of mosfets solved the problem, and allowed solid state tuners to equal or better the best valved models. TI at least had also proposed the use of jfets, but mosfets were better for the application. At the time the cross-modulation concern would have been more about the adverse effects of alternate, or even more distant, channels than about the effects of adjacent channels, but the latter became important once TV receivers were fitted with “cable ready” front ends.

On the face of it, the European VHF TV situation might have been less demanding in that fewer stations were available in any given area, and that in turn might have justified the continued use of bipolar VHF tuners. But it was not necessarily so.

My own experience with two Philips receivers, a K9 in Auckland during the 1970s and a later model (I do not recall which chassis) in Wellington during the early 1980s, was that their respective VHF tuners were quite miserable in terms of cross-modulation performance. At each site, there was just one Band I and one Band III channel available. At both, reasonable outdoor aerials were needed to obtain interference- and ghost-free reception. And at both, something like 20 dB of attenuation (at the set input) was needed to get rid of what was very visible cross-modulation. In the Auckland case, using an aerial array that I had installed, an earlier valved HMV monochrome receiver had no problem with the same signal. By playing around with the internal RF agc offset control, one could drive the front end into overload, but even just before the onset of overload, there was no evidence of cross-modulation. The Wellington aerial was a professional installation, and the installer had to add the attenuator to get rid of the cross-modulation. My conclusion was that it was poor or careless design on the part of Philips; it probably should have used a mosfet-based tuner in these receivers. But at the time Philips seemed to be bipolar-oriented.
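
As a side note, that 20 dB figure fits the usual third-order picture of cross-modulation. Below is a minimal numerical sketch, assuming a simple memoryless cubic device characteristic; the coefficient and signal levels are arbitrary illustrative values, not data for any real tuner. The point it makes is that the transferred modulation falls by roughly 2 dB for every 1 dB taken off the interfering carrier, so 20 dB of attenuation buys about 40 dB less cross-modulation.

Code:
# Cross-modulation through a memoryless third-order (cubic) characteristic:
#   y = x + a3 * x**3
# A wanted carrier A*cos(w1*t) plus an AM interferer B*(1 + m*cos(wm*t))*cos(w2*t)
# leaves the wanted carrier carrying a transferred modulation index of roughly
#   m_x ~ 3 * |a3| * B**2 * m      (small-distortion approximation)
# so each 1 dB off the interferer removes about 2 dB of cross-modulation.
# a3, B and m below are arbitrary illustrative values, not tuner data.

import math

a3 = 0.05   # hypothetical third-order coefficient (1/V^2)
m = 0.5     # interferer's own modulation depth

def xmod_index(B):
    """Approximate modulation index transferred onto the wanted carrier."""
    return 3 * abs(a3) * B ** 2 * m

for B in (1.0, 0.1):   # interferer amplitude in volts; 0.1 V is 20 dB down
    mx = xmod_index(B)
    print(f"interferer {20 * math.log10(B):+5.1f} dB -> "
          f"cross-mod index {mx:.5f} ({20 * math.log10(mx):+.1f} dB)")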

With US FM there were concerns similar to those with VHF TV. As you say, the FM band was often pretty much full, and in DFW it was end-to-end occupied with what were mostly highly compressed transmissions “shouting” at the potential audience, with KERA (PBS) and WRR (Dallas City-owned classical) being in a very small minority with minimal compression. Both from time to time received questions from poorly-informed listeners as to why they were relatively “quiet”. I can’t recall their exact responses, but KERA at least implied that compression was a form of distortion and thus highly undesirable (which I don’t suppose went down too well with the purveyors of compression equipment, who no doubt preferred to put a positive spin on their distortion machines).

Anyway, FM receivers with bipolar front ends could and did have difficulties in that environment; for example, my Beolit 707 was not too happy. At least judging by the RCA and TI papers, the key parameter relating to FM front end linearity was the half-IF response, and FETs were generally better than bipolar devices at minimizing it. Here, though, the European and UK FM tuner makers adopted FET-based front ends in the same time period as, if not earlier than, their Japanese and US counterparts, I imagine because they weighed heavily the needs of the export markets, including the USA. Some delayed the change from valves to solid state until FETs were available and economic, and so avoided the bipolar woes. On timing, I don’t have the exact dates, but I think that B&O and Sony both updated their respective “5000” models from bipolar to FET front ends at around the same time, circa 1969. Radford offered an “export” FET front end option for its otherwise bipolar FM tuner by 1967.
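
For what it is worth, the frequency arithmetic of the half-IF response is easy to sketch. The snippet below assumes the conventional 10.7 MHz FM IF and high-side local oscillator injection; the station frequency is just an example, not any particular broadcaster. An interferer sitting IF/2 above the wanted channel reaches the IF via second harmonics of both itself and the local oscillator, since 2 x fLO - 2 x fspur = IF.

Code:
# Locating the "half-IF" spurious response for an FM broadcast front end.
# Assumes the conventional 10.7 MHz IF and high-side LO injection
# (f_lo = f_wanted + IF); the wanted station frequency is just an example.
# The spur reaches the IF via second harmonics: 2*f_lo - 2*f_spur = IF.

IF = 10.7          # MHz, standard FM broadcast intermediate frequency
f_wanted = 98.1    # MHz, example wanted station

f_lo = f_wanted + IF        # high-side local oscillator
f_spur = f_wanted + IF / 2  # half-IF spurious response frequency

print(f"wanted  : {f_wanted:6.2f} MHz")
print(f"LO      : {f_lo:6.2f} MHz")
print(f"half-IF : {f_spur:6.2f} MHz  "
      f"(check: 2*f_lo - 2*f_spur = {2 * f_lo - 2 * f_spur:.2f} MHz = IF)")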

For FM, jfets as well as mosfets were used in the early days. The VHF-TV objections, such as limited agc range for an RF jfet cascode, and minimal gain for a jfet mixer, were of much less consequence in FM applications.

Returning to UHF TV, the UK might have had the most severe reception conditions, with its transmitter network carefully designed to provide national coverage. Whilst the service area maps showed the boundaries for good reception, I suspect that had they also shown the boundaries at which transmitters could cause material interference, these would have been much further out, an inevitable consequence of designing for contiguous service areas. On the other hand, for many US cities, except perhaps in the northeast, it would be unlikely that any UHF transmissions from other cities would be strong enough to cause interference problems. In DFW, one could not find any out-of-area UHF TV transmissions using the ICOM R7000 VHF-UHF communications receiver, and scarcely any out-of-area VHF TV transmissions, either.

Anyway, fairly exacting standards were set by BREMA for UK UHF TV tuners from the start, and they more-or-less had to be four-gang. The cross-modulation performance of some of the early bipolar designs was evidently not brilliant, but then signal strengths as delivered to the receivers might not have been that high. Still, the previously-mentioned Mullard U321 seems to have included a higher front end overload threshold as part of its design brief, albeit in this case by using a grounded-base bipolar stage, I imagine with reasonable current. Maybe a mosfet would have been better still, though. It did not occur to me before, but I wonder if a diode mixer was chosen because a realizable bipolar mixer, similar to that used in earlier designs, might have overloaded before the new, higher-headroom RF amplifier did.

Quote:
Originally Posted by G8HQP Dave View Post
Active mixers are much noisier than an amplifier using the same technology. Passive (e.g. switching) mixers can be very quiet. This means that when getting near the limit of what your technology can do, the first thing that goes is active mixers, then amplifiers, then passive mixers.
So the take on this is that (appropriate) mosfets were quiet enough for use as RF amplifiers for both the VHF TV and UHF TV cases.

And the gain available at VHF was sufficient to allow a mosfet mixer to follow a single mosfet RF stage without prejudicing noise performance.

But at UHF, the gain available from a single mosfet RF stage was insufficient to offset the noise contribution of a following mosfet mixer, so it was preferable to use a passive mixer followed by a low noise amplifier.
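
Putting rough numbers on that with the Friis cascade formula, F_total = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1*G2) + ..., shows the effect. The stage line-ups below are my own illustration of the argument: the gains and noise figures are only guesses for devices of that era, not measurements of any particular tuner, and the diode mixer’s noise figure is taken as roughly equal to its conversion loss, which is the usual approximation.

Code:
# Friis cascade noise figure, to put numbers on the VHF vs UHF mixer argument.
#   F_total = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1*G2) + ...   (linear quantities)
# All gains and noise figures below are illustrative guesses, not tuner data.

from math import log10

def db_to_lin(db):
    return 10 ** (db / 10)

def cascade_nf_db(stages):
    """stages: list of (gain_dB, nf_dB) in signal order; returns overall NF in dB."""
    f_total = 0.0
    g_running = 1.0
    for i, (g_db, nf_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        f_total += f if i == 0 else (f - 1) / g_running
        g_running *= db_to_lin(g_db)
    return 10 * log10(f_total)

lineups = {
    # VHF: a mosfet RF stage has enough gain to mask a mosfet mixer's noise.
    "VHF: mosfet RF amp -> mosfet mixer":
        [(15, 3.0), (3, 10.0)],
    # UHF: the same devices give less gain and more noise, so an active
    # mixer now dominates the cascade...
    "UHF: mosfet RF amp -> mosfet mixer":
        [(8, 4.5), (0, 12.0)],
    # ...whereas a lossy but quiet diode mixer followed by a low-noise
    # amplifier keeps the overall figure down.
    "UHF: mosfet RF amp -> diode mixer -> low-noise amp":
        [(8, 4.5), (-6.5, 6.5), (20, 2.0)],
}

for name, stages in lineups.items():
    print(f"{name}: {cascade_nf_db(stages):.1f} dB overall")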

Cheers,
Synchrodyne