UK Vintage Radio Repair and Restoration Discussion Forum – Vintage Television and Video

27th Apr 2018, 8:40 pm   #1
Pieter H
Tuners in Philips G6-G11 CTV chassis

Hi,

I'm happy to announce the upload of the third part of my Philips tuner history.

One of the issues I'm still struggling with is the tuners used in the colour TV platforms that the UK/Mullard organization modified from the original K-chassis developed in Eindhoven: the G6, G8, G9 and G11 chassis. Even in the official service manuals the information on the tuners is minimal, essentially only a service code.

What would be most helpful are pictures of the tuners and their 12nc code labels, either inside the sets or as separate modules. In the end these provide the most information. Hope you can help me!

Cheers, Pieter
28th Apr 2018, 4:36 am   #2
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

Quote:
Originally Posted by Pieter H
I'm happy to announce the upload of the third part of my Philips tuner history.
Thanks very much – magnificent and a real tour-de-force! Reading through it twice already has fully occupied a wet and windy Saturday afternoon! I think I might have a few questions, to follow, but meanwhile it was good to find some clarity on the issue of germanium vs. silicon npn vs. silicon pnp transistor performance at UHF.

Cheers,
28th Apr 2018, 9:08 am   #3
toshiba tony
Re: Tuners in Philips G6-G11 CTV chassis

Invaluable document. Memories!
3rd May 2018, 2:54 am   #4
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

Here are some questions and comments on just a few of the many topics covered in the epic “Philips TV Tuners Part 3”.

Tuned RF Input for UHF Tuners:

The case of the AT6380 where a tuned input was added ahead of an early Ge transistor RF amplifier in order to reduce noise at the higher frequency end of the UHF band does show that the “conventional wisdom” on this issue was a broad-brush approach that did not address all situations.

That conventional wisdom suggested that the best noise performance was achieved with an aperiodic input, with the required RF selectivity being provided by a bandpass tuned interstage. This in turn indicated 3-gang UHF tuners.

The UK case was an exception. The image rejection requirement in the UK was much higher than in the rest of Europe. This was due to the decision to move the vision IF (VIF) from 38.9 to 39.5 MHz after the European UHF channel plan was determined at ITU Stockholm 1961, including the co-siting of n and (n+10) channel transmitters. This required extra RF selectivity, typically provided by adding a 4th gang for a tuned RF input. But there were other ways in which it was done. In the late 1960s, Thorn used the 4th gang to tune an input image rejector in one case, and to facilitate a triple-bandpass interstage in another, in both cases retaining an aperiodic input. That suggests that – with devices that did not need any additional help at the top end of the UHF band – Thorn saw a benefit, presumably a noise benefit, in retaining the aperiodic input.

In the USA, the traditional UHF tuners usually had a bandpass tuned input ahead of their diode mixers. That was needed for additional RF selectivity. RF amplifiers in US UHF tuners appeared to have arrived at about the same time that varactor tuning was introduced. The lower Q of early varactor circuits indicated that some additional help with RF selectivity was required, and this was provided by a tuned input. Also, as the VHF tuners moved over to varactors, it was more difficult to provide for feeding the UHF tuner output into the VHF tuner RF amplifier without additional switching, so the UHF tuner output went to the VHF mixer instead, requiring more overall gain in the UHF tuners.

Combined VHF-UHF Tuners with Separate UHF Oscillators:

Whilst it was logical that the three transistors used in such tuners as the Pi1 be used in essentially the same way on both VHF and UHF, looking at the UHF case alone, it was unusual in European practice to use separate transistors for the oscillator and mixer functions. The self-oscillating mixer seems to have been preferred. I should imagine that the use of separate transistors was the better approach, but that the self-oscillating mixer was considered to be adequate, and saved a device. Perhaps that goes back to the valve era. A separate oscillator would have required an additional triode valve, making for a three-valve tuner, probably ruled out on cost grounds. Maybe that thinking was carried over to the transistor era.

VHF Tuners with Self-Oscillating Mixers:

This one is really the opposite of the previous case, and I must admit I was very surprised to find that it was actually done. The customary reason given for its not being widely used is that once TV receiver IFs moved up to be just under the lower edge of Band I, it would be difficult to avoid regeneration at the lower Band I channels with a self-oscillating mixer. The regeneration issue is why, for example, the pentode mixer (in the form of triode-pentode valves such as the 6X8, 6U8 and PCF80) was adopted in the early 1950s, even though the pentode was contra-indicated at Band III. In the transistor era, grounded base mixers were used, and in the USA also cascode bipolar mixers to avoid the regeneration problem. When dual-gate mosfets arrived and were used in VHF TV tuners, the mixers had both signal and local oscillator on gate 1, so that gate 2 screened the drain from both inputs. In contrast, radio applications usually had signal on gate 1 and local oscillator on gate 2.

So the question arises - how did Philips avoid the regeneration problem with self-oscillating VHF mixers?

Intermediate Frequencies:

Intriguing was the Philips VF5 case where the 32.7 MHz VIF was used not only for System L (where it was standard), but also for System E, which put the System E SIF at 43.85 MHz, actually overlapping the lower end of Band I and channel F2.

I wondered why not use the established 28.05 MHz VIF for System E, with SIF at 39.2 MHz, the same as for System L. This arrangement ensured that the System E LO frequencies fell on an opposite channel carrier frequency, and that there was no IF channel overlap with Band I.

VIF-low was necessary to facilitate reception of channels F2 and F4, for which only oscillator-high conversion was feasible. And also for the L’ Band I channels, which were inverted (vision carrier high).
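
The arithmetic behind those IF numbers is easily checked; here is my own working, using the vision-to-sound spacings of the two systems as quoted above:

Code:
# Worked check of the IF figures quoted above (my own arithmetic).
SPACING_E = 11.15  # MHz vision-to-sound, System E (819 lines)
SPACING_L = 6.5    # MHz vision-to-sound, System L (625 lines)

# VF5 arrangement: common 32.7 MHz VIF for both systems.
vif = 32.7
print(vif + SPACING_E)  # 43.85 MHz System E SIF - inside Band I, channel F2
print(vif + SPACING_L)  # 39.2 MHz System L SIF

# Established arrangement: common 39.2 MHz SIF for both systems.
sif = 39.2
print(sif - SPACING_E)  # 28.05 MHz System E VIF
print(sif - SPACING_L)  # 32.7 MHz System L VIF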

In the solid-state era, one may imagine that the better screening that was possible – for example both tuners and IF strips in those small rectilinear metal boxes – obviated the need for worrying too much about where the in-band LOs fell, or that the SIF was within Band I. That made possible the upward movement of the System E IF channel so that the E and L VIFs were the same, rather than the SIFs.

But I wonder if the primary driver may have been to facilitate the use of synchronous vision demodulation. The quasi-synchronous IC types of the late 1960s and 1970s all used tank circuits to offset some of the quadrature errors produced by limiting an asymmetric sideband signal. So having the same VIFs for Systems L and E avoided the need to adjust synchronous demodulator tank circuit tuning according to the system being received at any given time. That problem did not arise with diode demodulators, which rectified whatever they received (and also added quite a bit of quadrature distortion in the process). Two SIFs may have been less of a problem in the solid-state era, and there were some (but not all) synchronous demodulation AM SIF ICs that did not require tank circuits, and so were effectively wideband.

That said, I am not sure about the chronology of the move to synchronous demodulation for the positive modulation case. The early ICs, such as the Motorola MC1330 (1969) and Philips TCA270 (1972) appear to have been oriented to negative modulation systems, although I think that the Siemens TBA440 (also 1972) could handle both negative and positive. The TDA2542 was the positive modulation counterpart to the TDA2540/1, from about the mid-1970s, and the TDA2543 was its counterpart for the AM sound channel.

Of course in System IF choices one might say that from the early days, the Belgian and border-area multi-norm receivers did not respect the notion that System E LOs should fall upon an opposite carrier frequency. But then they were intended to receive just a small number of Band III System E channels, so probably did not run into the LO interference issue.

Also interesting was the AT7650 VHF tuner, intended to receive all of the VHF System E channels and the channel E7 Luxembourg System F transmitter. That produced the standard IFs for System E, 28.05 MHz VIF and 39.2 MHz SIF, but it must have been a very rare case where a 7 MHz channel European transmission was transposed to a VIF-low IF channel, 33.7 MHz VIF and 39.2 MHz SIF, which would have required LO-low.

And somewhat surprising was that the use of the Italian IF evidently persisted into the 1970s. The V302 seems to have been the last Philips VHF TV tuner to have catered for this. The Italian IF was probably a case of standardizing too early, as the Italian IF channel of 40 to 47 MHz was decreed early in 1952. It was apparently derived from American practice, where the IF channel was 41 to 47 MHz. Italy’s first experimental 625-line transmitter at Torino, supplied from the US, initially worked in the 6 MHz wide channel A6, which was retained and expanded to 7 MHz as Italian channel C. This IF choice probably explains the odd spacings of the Italian Band III channels, positioned to avoid possible interferences. Within that IF channel, as far as I know the SIF and VIF were normally 40.25 and 45.75 MHz respectively, but Philips moved them slightly to 40.4 and 45.9 MHz respectively. The ITU Stockholm 1961 European UHF planning meeting appears to have taken account of both the Italian and the standard European IF channels when looking at interference patterns that affected channel allocations, so at this stage a change to the standard IF was evidently anticipated.

Use of Mosfets:

Something apparent from the chronology is that Philips waited quite a long time, until 1980, before introducing the dual-gate mosfet RF amplifier into its TV tuners. Although, to be fair, I think that was generally consistent with what the other European setmakers did. I have a vague notion that Tandberg had been an early mover in using mosfets in VHF TV tuners, but I no longer have the reference for that.

As far as I know, RCA was the first anywhere to introduce a VHF TV tuner with a dual-gate mosfet RF amplifier, namely its KRK-142 of late 1968. This also had a cascode bipolar mixer for better performance.

Other US makers generally waited the year or two until protected-gate devices became available before adopting mosfet RF amplifiers for their VHF TV tuners. But Zenith had an early model that combined a bipolar grounded-base RF amplifier with a dual-gate mosfet mixer. The latter was more generally adopted through the 1970s, as well.

RCA was also about the first to use a dual-gate mosfet RF amplifier in a UHF tuner, in its KRK-226 of 1975.

Evidently the American TV setmakers were concerned about the relatively poor performance of bipolar VHF tuners, and so often held on to valve types, at least for their more expensive models, until the mosfet type became available. Certainly with the early mosfet models, some work was done to demonstrate their equivalence or superiority to their valved predecessors.

One could argue that the USA was a more difficult case for VHF TV reception, with multiple transmitters in the larger cities. (When I lived in the Dallas-Fort Worth area, we had 5, later 6 VHF TV stations, in the latter case occupying channels A2, A4, A5, A8, A11 and A13.) A specific problem was the channel A6 colour beat. So in turn one might deduce that US receivers needed better VHF tuners than were required elsewhere.

But there is contrary evidence. Here in New Zealand the Philips K9 was a very popular receiver in the early days of colour, say mid- through later 1970s. At that time the major cities usually had just one Band I and one Band III transmitter, so reception conditions were apparently quite mild. One K9 that I was familiar with in Auckland suffered severe cross-modulation when fed with the signal from an outdoor aerial array which an earlier valved monochrome receiver had handled with ease. As I recall around 20 dB of signal attenuation at the receiver input was needed to clear the cross-modulation. I had a similar experience in Wellington in the early 1980s with a later Philips model whose K-number I do not recall; there something like 12 or 18 dB of attenuation was needed, although I don’t remember the exact figure. An oddity was that Pye NZ had introduced a colour TV chassis with a mosfet-based tuner around 1974-75. I think that the chassis may have been non-Philips, and the tuner may well have come from the USA, or perhaps Japan.


Cheers,
4th May 2018, 12:46 pm   #5
Pieter H
Re: Tuners in Philips G6-G11 CTV chassis

Hi Synchrodyne,
thanks a lot for your extensive feedback and comments, most interesting and food for some deeper thoughts and analysis. Unfortunately (well, at least for this discussion) I'm leaving on a week's holiday, so I can't respond right now. When I'm back I'll dive into it and come back to you as soon as possible.

Regards, Pieter
20th May 2018, 6:36 pm   #6
Pieter H
Re: Tuners in Philips G6-G11 CTV chassis

Hi Synchrodyne,

finally back from holidays, so time to respond to your many inputs.
And again, before I start, thanks a lot for your careful reading and thorough feedback! Much appreciated.
Let me try to answer and respond to your inputs.

Quote:
Tuned RF Input for UHF Tuners:

The case of the AT6380 where a tuned input was added ahead of an early Ge transistor RF amplifier in order to reduce noise at the higher frequency end of the UHF band does show that the “conventional wisdom” on this issue was a broad-brush approach that did not address all situations.

That conventional wisdom suggested that the best noise performance was achieved with an aperiodic input, with the required RF selectivity being provided by a bandpass tuned interstage. This in turn indicated 3-gang UHF tuners.

The UK case was an exception. The image rejection requirement in the UK was much higher than in the rest of Europe. This was due to the decision to move the vision IF (VIF) from 38.9 to 39.5 MHz after the European UHF channel plan was determined at ITU Stockholm 1961, including the co-siting of n and (n+10) channel transmitters. This required extra RF selectivity, typically provided by adding a 4th gang for a tuned RF input. But there were other ways in which it was done. In the late 1960s, Thorn used the 4th gang to tune an input image rejector in one case, and to facilitate a triple-bandpass interstage in another, in both cases retaining an aperiodic input. That suggests that – with devices that did not need any additional help at the top end of the UHF band – Thorn saw a benefit, presumably a noise benefit, in retaining the aperiodic input.
I can't speak for the whole world, but from what I see within the Philips tuner developments it looks like the initial concept was to keep the UHF input - still a major challenge at the time for consumer equipment! - as simple as possible, meaning no tuned input match. This was used in the first valve UHF tuners as well as in the AT6370, the first transistor tuner. However, (noise) performance was almost certainly not optimal and could be improved with a tuned input match. From the AT6380 onwards this was standard in all UHF tuners, with the exception of the KD1, but even this one quickly moved back to 4-gang tuning in the KD2.
In my analysis I haven't found any UK-specific tuners in this respect. But in line with your assumption the UK requirements for RF selectivity might have pushed the designs in that direction. However, up to this stage Philips maintained global architectures for its tuners, so no special UK versions yet (apart from the IF setting of course). Also I don't think that the N+10 requirement led to special requirements for UK tuners only. N+10 in UHF (and N+11 in VHF) is a generic issue for all systems with IFs in the 35-45MHz range, being the image channel of the wanted signal.
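
To put a number on that image relationship, a quick back-of-envelope check (my own arithmetic, not from the tuner history):

Code:
# With oscillator-high conversion the image lies 2 x VIF above the
# wanted vision carrier, so its offset in channels is 2 x VIF / width.
def image_offset_in_channels(vif_mhz, channel_mhz):
    return 2.0 * vif_mhz / channel_mhz

# UHF, 8 MHz channels: either VIF puts the image about ten channels up.
print(image_offset_in_channels(38.9, 8.0))  # 9.725
print(image_offset_in_channels(39.5, 8.0))  # 9.875

# VHF, 7 MHz channels: roughly eleven channels - the N+11 case.
print(image_offset_in_channels(38.9, 7.0))  # about 11.1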

Quote:
Combined VHF-UHF Tuners with Separate UHF Oscillators:

Whilst it was logical that the three transistors used in such tuners as the Pi1 be used in essentially the same way on both VHF and UHF, looking at the UHF case alone, it was unusual in European practice to use separate transistors for the oscillator and mixer functions. The self-oscillating mixer seems to have been preferred. I should imagine that the use of separate transistors was the better approach, but that the self-oscillating mixer was considered to be adequate, and saved a device. Perhaps that goes back to the valve era. A separate oscillator would have required an additional triode valve, making for a three-valve tuner, probably ruled out on cost grounds. Maybe that thinking was carried over to the transistor era.
I think you're absolutely right; I'm convinced that tuner design was driven until well into the 1970s by mantras like "minimize the number of valves or transistors". So for twenty years the rule was two valves or three transistors per tuner, even when combi VHF-UHF tuners were introduced. This was undoubtedly driven by the need to limit component cost. And when it was decided to use a separate oscillator, it meant that the mixer could not be a transistor but became a diode again, see the U322 from 1975. It was the ELC family that finally introduced separate RF and MO transistors per band, adding up to 7 transistors per tuner. So yes, these approaches can be persistent!

Quote:
VHF Tuners with Self-Oscillating Mixers:

This one is really the opposite of the previous case, and I must admit I was very surprised to find that it was actually done. The customary reason given for its not being widely used is that once TV receiver IFs moved up to be just under the lower edge of Band I, it would be difficult to avoid regeneration at the lower Band I channels with a self-oscillating mixer. The regeneration issue is why, for example, the pentode mixer (in the form of triode-pentode valves such as the 6X8, 6U8 and PCF80) was adopted in the early 1950s, even though the pentode was contra-indicated at Band III. In the transistor era, grounded base mixers were used, and in the USA also cascode bipolar mixers to avoid the regeneration problem. When dual-gate mosfets arrived and were used in VHF TV tuners, the mixers had both signal and local oscillator on gate 1, so that gate 2 screened the drain from both inputs. In contrast, radio applications usually had signal on gate 1 and local oscillator on gate 2.

So the question arises - how did Philips avoid the regeneration problem with self-oscillating VHF mixers?
This refers to the VD1-12ET5732, which indeed introduced a VHF self-oscillating MO. However, we should not compare the transistor circuit with the earlier valves. In a valve mixer it indeed helped to use a pentode, giving better anode-to-grid isolation and less regeneration than a triode. However, in a grounded-base self-oscillating MO as in the VD1, the collector-to-base capacitance is much less relevant. But other than that I can't claim to have the real answer. At the same time we have to admit that the VD1 had a long life for a tuner, so apparently the performance was more than acceptable for the demanding German market.

Quote:
Intermediate Frequencies:

Intriguing was the Philips VF5 case where the 32.7 MHz VIF was used not only for System L (where it was standard), but also for System E, which put the System E SIF at 43.85 MHz, actually overlapping the lower end of Band I and channel F2.

I wondered why not use the established 28.05 MHz VIF for System E, with SIF at 39.2 MHz, the same as for System L. This arrangement ensured that the System E LO frequencies fell on an opposite channel carrier frequency, and that there was no IF channel overlap with Band I.

VIF-low was necessary to facilitate reception of channels F2 and F4, for which only oscillator-high conversion was feasible. And also for the L’ Band I channels, which were inverted (vision carrier high).

In the solid-state era, one may imagine that the better screening that was possible – for example both tuners and IF strips in those small rectilinear metal boxes – obviated the need for worrying too much about where the in-band LOs fell, or that the SIF was within Band I. That made possible the upward movement of the System E IF channel so that the E and L VIFs were the same, rather than the SIFs.
Indeed an intriguing tuner, the VF5, and I was happy to finally have a circuit diagram. Note that the specs list the tuner only for system-E (819 lines) channels F2 and F4! F3 (PC 56,15MHz, SC 63,4MHz) would require an LO at 23,45MHz, which is of course impossible.
I agree that the change to a VIF at 32,7MHz was driven by the emerging IF ICs. I'll try to cover that in the next chapter.

Quote:
But I wonder if the primary driver may have been to facilitate the use of synchronous vision demodulation. The quasi-synchronous IC types of the late 1960s and 1970s all used tank circuits to offset some of the quadrature errors produced by limiting an asymmetric sideband signal. So having the same VIFs for Systems L and E avoided the need to adjust synchronous demodulator tank circuit tuning according to the system being received at any given time. That problem did not arise with diode demodulators, which rectified whatever they received (and also added quite a bit of quadrature distortion in the process). Two SIFs may have been less of a problem in the solid-state era, and there were some (but not all) synchronous demodulation AM SIF ICs that did not require tank circuits, and so were effectively wideband.
Again, to be worked out in the next chapter, but your statement seems to be correct. With the emergence of SAW IF filters and QSS IF demodulators the flexibility on the VIF became much less, so the trend reversed to a fixed VIF and a flexible SIF depending upon the standard.

So much for the moment; I'll respond to the mosfet topic in a separate post.

Regards, Pieter
27th May 2018, 12:38 am   #7
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

Hi Pieter:

Thanks very much for your comprehensive follow-up.

Re the UK UHF situation and its image rejection requirement, the attached Wireless World item provided some background:

[Attachment: WW 196210 p.477 UK TV IF.jpg]

Essentially it stated that the decision to move the UK standard VIF from 38.9 to 39.5 MHz significantly increased the image rejection requirement. That VIF change was made after the European UHF channelling plan was developed at the ITU Stockholm 1961 (ST61) meeting. The planning appears to have been based upon the VIFs existing at the time. From this page of the ST61 documentation:

[Attachment: ST61 TA p.57.jpg]

although the actual IFs used in the calculations were not stated, one may deduce certain aspects. For example, the Italian case was calculated for both the Italian standard 45.75 MHz and CCIR standard 38.9 MHz VIFs. The Russian case was evidently calculated for two VIFs. One would have been the then-standard 34.25 MHz, the other might have been the CCIR 38.9 MHz number as a surrogate for a future higher VIF in that vicinity, which later materialized as 38.0 MHz. (Or perhaps 38.0 MHz was in view even then.)

I have not found any information as to why the UK VIF was moved up to 39.5 MHz. There did not seem to be any intrinsic necessity, as 38.9 MHz was used in South Africa for System I with the same UHF channels as Europe and with the addition of Band III channels. In the absence of “hard” information my best guess is that it was done to assist those setmakers who wanted to use a dual-Nyquist IF channel to simplify standards switching. Pye had done that with its prototype dual-standard receiver. But with the -6 dB points on the dual-Nyquist curve at 34.65 MHz (405 VIF) and 38.9 MHz (625 VIF), the 625 vision bandwidth was a paltry 4.25 MHz against the 5.5 MHz transmitted. (Recalling that the 1950 CCIR Gerber decision was a choice between 4.25 MHz vision bandwidth in a 6 MHz channel and 5 MHz vision bandwidth in a 7 MHz channel, and was made in favour of the latter.) Moving the 625 VIF up as far as reasonably possible was desired, and I suspect that 39.5 MHz was about the limit without creating near-impossible conditions in respect of image rejection, etc. This allowed a 4.85 MHz 625 vision bandwidth for the dual-Nyquist case, still on the poor side but perhaps just escaping being risible.
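
The bandwidth arithmetic is simple enough to set down (my own sketch of the figures just quoted):

Code:
# With both Nyquist flanks at -6 dB on one IF curve, the available
# 625 vision bandwidth is just the gap between the two VIFs.
VIF_405 = 34.65  # MHz

print(38.90 - VIF_405)  # 4.25 MHz with the 38.9 MHz 625 VIF
print(39.50 - VIF_405)  # 4.85 MHz after the move to 39.5 MHz
# Either way well short of the 5.5 MHz transmitted.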

[Attachment: WW 196110 p.514 Pye Dual-Standard TV Receiver IF Bandpass.jpg]

That VIF change did require a rethink in respect of UHF tuners. Cyldon had been the first UK maker to introduce such, and its initial UT model was three-gang, with an aperiodic input. It was reconfigured with a fourth gang and a tuned input. Mullard also had 4-gang valved UHF tuners, such as the AT6380:

[Attachment: Mullard AT6380-02 Schematic.jpg]



Cheers,
31st May 2018, 11:30 am   #8
Pieter H
Re: Tuners in Philips G6-G11 CTV chassis

Hi Synchrodyne,

an interesting discussion, the UK VIF choice for 39,5MHz. Like you, I also haven't found any convincing technical reason, which makes me conclude it must have been political!

As my story on the first 10 years of TV development in Europe clearly shows, choices on line rates and the associated IF standardization were pure politics! This was the time when countries (i.e. the technical community, big business management and politicians) still thought that for competitiveness and local employment it was best to have national standards. With the well-known result of initially some 6-7 different TV standards in Europe (depending on how you count, and including the Belgian versions). In the meantime we have learned the hard way that in consumer electronics that's not how it works!

With the introduction of 625-line UHF and colour most countries complied (with the exception of France and its 819 lines), but not with respect to vision-sound frequency distance and the IF. So my guess is that UK policy makers were forced to accept the 625 lines (or maybe were even happy to, who knows; the 405-line system had proven to be pretty mediocre), but in order to keep some level of uniqueness imposed a larger bandwidth and a 5,999MHz vision-sound distance ("the UK goes for higher picture quality" or similar). Essentially the same policy as the French, without going to the 11MHz 819-line extreme.

Given this vision-sound distance, 0,5MHz greater than the CCIR's, one argument for a higher VIF could have been to move it 0,5MHz upwards in order to keep the SIF identical at 33,4MHz. As we've seen in my tuner history, until the emergence of ICs set makers in general tried to keep the SIF fixed as much as possible and then vary the VIF depending upon the standard. But this doesn't seem to be the reason, because the VIF was moved upwards by 0,6MHz, making the SIF 100kHz higher at 33,5MHz vs. the 33,4MHz of CCIR. Strange, and indeed unexplainable so far. So it looks like the arguments for 39,5-33,5 were more political than economical.

But I'm sure that, given the desire to go to 39,5MHz, there was some technical analysis done that showed a slightly better interference performance under certain conditions, thus justifying the choice of 39,5-33,5MHz. And so it happened.

I admit, mostly speculation, but the best I can make of it. Looking forward to your thoughts.

Kind regards, Pieter
9th Jun 2018, 2:57 am   #9
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

As you say, there were quite a few twists and turns in the development of analogue TV transmission and reception systems, with political and economic factors involved as well as technical. Mostly we can ascertain what happened, but why it happened is not always apparent.

My inclination is first to look for possible technical reasons that support what was done. When the search for these comes up empty, then the supervention of politics is a reasonable assumption. That is not to deny that in some cases the technical and political requirements happened to align; had they not, the political choice might have taken precedence.

Here is my distillation of the history, looking first at the lines counts and transmission parameters, then at the IF choices.

The UK adopted 405/50 in 1936 as about the best that could reasonably be done at the time, with approximately the same line frequency as the RCA 343/60 system. Positive vision modulation was adopted as making the most efficient use of transmitter capability at the time. AM sound was adopted as FM was not yet quite a contender. It was originally double sideband, with vestigial sideband adopted in the late 1940s, using the 0.75 MHz NTSC number, albeit upper rather than lower.

NTSC chose 525/60 in 1941. All parameters were open for discussion and choice on their merits except the channel width of 6 MHz, which was predetermined. Thus vestigial sideband, negative vision modulation, inclusion of equalizing pulses and FM sound were chosen as the better/best. Negative modulation was chosen because of the apparent ease with which black-level AGC could be obtained in receivers. There was evident concern about maintaining transmitter linearity at lower modulation levels towards white level, and the original white level specification was 15% maximum. (The 10% minimum, to accommodate intercarrier sound, was a later RTMA addition.) Until fairly late in the NTSC proceedings, the line choice was 441, the same as the RMA system, itself representing an advance in line frequency over the UK 405/50 system. (The 441/60 equivalent would have been about 525/50.). Donald Fink advanced the flatness of field (lack of lininess) argument in favour of 525, and that was chosen, thus producing what was a durable standard. (Much later, Fink wrote that an 8 MHz channel would have been a better choice.)

Russian work on 625/50 started c.1944. It was an adaptation of the NTSC 525/60 system to suit 50 Hz power supplies, based upon approximately equivalent line frequencies. Without a predetermined channel width constraint, the Russians chose 6 MHz vision bandwidth, ostensibly to match 16 mm film capability and an 8 MHz channel. But the vestigial sideband was 0.75 MHz, the same as for NTSC. Regular broadcasting started in 1948.

The French 819/50 “high definition” system was adopted in 1948 and on-air in 1949. It had 10.4 MHz vision bandwidth in a 14 MHz channel, later reduced to 13.15 MHz for the tête-bêche plan by eliminating the outer guard bands. It had a 2.0 MHz vestigial sideband, about the same proportion of the full sideband as for the NTSC case, originally on the upper side but later on either side in the tête-bêche system. It was positive/AM without equalizing pulses. The reasons for the latter choices are unknown, but it might have been for similarity with the existing 441/50 system. At one stage it was envisaged that both systems would be retained, although by late 1948 the 441/50 system was given a definite end-date of the 1st of 1958. By 1948, the NTSC’s thoughts on the benefits of negative modulation were being questioned, e.g. see Wireless World 1948 December pp.439-440. So perhaps positive modulation was chosen for technical reasons as well. In respect of the line count, at that time Western Europe generally had not settled upon a standard, although 625/50 was in view.

The Western Europe situation looks a bit murky. Germany picked up the 625/50 system from the Russian work. Apparently there was ongoing debate about the channel width. One may deduce that 8 MHz was seen as too much, and that the 6 MHz American channel was attractive. An American 625/50 transmitter was installed at Torino in 1949, this working in a 6 MHz channel. Meanwhile Philips had proposed a 567/50 system as being a better fit for the desired 6 MHz channel than 625/50, but that did not garner much support. I’d guess that on one hand, anything lower in line count than what the Russians were doing would have been unacceptable, but equally, so would have been directly copying the Russian system. The late argument was apparently between 4.25 and 5 MHz video bandwidths in 6 and 7 MHz channels respectively. (4.25 MHz was an interesting number, given that NTSC was 4 MHz, increased to 4.2 MHz in 1953 to accommodate colour. It [4.25 MHz] suggested that the 6 MHz channel proponents realized that a 4 MHz vision bandwidth was “undersized” and that they needed to push that number as far out as possible.) The 1950 Gerber compromise came out in favour of the 7 MHz channel.

The 6 MHz 625/50 channel resurfaced for use in Argentina (and later in other 50 Hz Latin American countries) in 1951. In those cases there was a need to stay with the American 6 MHz channelling pattern, used in the adjoining 60 Hz countries.

By the end of 1951, it looked as if the number of 625/50 transmission variants – three – might have been not much smaller than the number of 625/50 transmitters actually in regular broadcasting service!

Then in 1952-53 came the Belgian variations. Here the problem was that the transmitters would need to be able to switch between incoming programmes in either 625/50 or 819/50 format. So commonality of channel arrangements and other parameters was required. Presumably spectrum availability pointed to the 7 MHz channel, which meant that the 819/50 service would be of very limited vision bandwidth as compared with the French version. Positive vision modulation and AM sound (with pre-emphasis) were chosen for both. Why, I don’t know. Had negative/FM been chosen for both, then the 625/50 system would have been the same as that elsewhere in Western Europe, which seems simpler. Maybe the assessment at the time was that positive was a bit better. Or perhaps “equality of treatment” across the Flanders/Wallonia “divide” demanded that both the 625/50 and 819/50 systems be modified as compared with their prototypes, not just one of them. The Belgian 819/50 signal included equalizing pulses, presumably to make it as much like the 625/50 signal as possible.

In the mid-to-late 1950s, European UHF transmitter network planning was heading to a common 8 MHz channel with fixed vision carrier position. This was in order to maximize utilization of the available channels. Countries using the existing CCIR 625/50 system would simply replicate this at UHF, albeit in 8 MHz rather than 7 MHz channels.

Without existing 625/50 services, both the UK and France, who were heading towards the use of 625/50 for future services, were free to fully exploit the possibilities of the 8 MHz UHF channels.

Early UK 625/50 work was with the Russian parameters, logical given the availability of an 8 MHz channel. Late in its deliberations, the TAC (Television Advisory Committee) reviewed this and determined that optimum use of the 8 MHz channel would be obtained by increasing the vestigial sideband from 0.75 to 1.25 MHz, and decreasing the main sideband from 6 to 5.5 MHz. This was recommended in 1960 and eventually adopted for UK 625/50 UHF transmissions, which followed the standard negative/FM pattern. One might say that the Russians had overlooked the desirability of vestigial sideband proportionality when transposing from NTSC. I am inclined to credit the TAC with having primarily technical motives for its choice, although one could not totally discount a leaning – perhaps even subconscious - to finding a good technical reason for doing differently to the Russians whilst still making full use of the 8 MHz channel. Insofar as future UK receivers would be dual-standard 405/625, that alone would have been a significant non-tariff barrier if indeed that was part of the thinking.
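
The proportionality point is perhaps clearer in numbers (sideband figures as quoted in this thread; the ratios are my own rough working):

Code:
# Vestigial sideband as a fraction of the main sideband.
systems = {
    "NTSC/M":      (0.75, 4.2),  # (vestigial, main) sidebands in MHz
    "Russian D/K": (0.75, 6.0),
    "UK I":        (1.25, 5.5),
    "French L":    (1.25, 6.0),
}
for name, (vsb, main) in systems.items():
    print(f"{name}: {vsb / main:.1%}")
# The straight Russian transposition kept NTSC's 0.75 MHz vestige
# against a 6 MHz main sideband (12.5% versus NTSC's ~18%), which is
# the disproportion the TAC's 1.25 MHz choice addressed.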

France opted for both the 1.25 MHz vestigial sideband and the 6 MHz main sideband for 625/50, this requiring utilization of the outer guardbands, as it had done with its 819/50 tête-bêche channelling. Why 6 MHz for the main sideband I do not know. My speculation is that it was done in consideration of the future adoption of SECAM colour. The initial SECAM proposal used AM (DSB) colour subcarriers. With DSB AM and the use of simple envelope detectors at the receiving end, any single-sidebanding caused by restricted vision bandwidth would have caused quadrature distortion which was potentially deleterious. A 6 MHz vision bandwidth allowed plenty of room for colour subcarrier upper sidebands without truncation. I imagine that the same argument might have been used when FM subcarriers were adopted for SECAM, as one supposes that single-sidebanding of an FM signal would also cause distortion. However, the later use of SECAM for System B transmissions suggests that single-sidebanding of the FM subcarriers was not a show-stopper.

Positive/AM was chosen for domestic 625/50 transmissions, apparently for commonality with the existing 819/50 transmissions, this easing the design of dual-standard receivers. This has sometimes been assigned as a political decision, creating a non-tariff trade barrier, although I have never seen a support case for this assertion. Contrary evidence is that negative/FM was chosen a little later for the Outre-Mer territories, where there were no 819/50 transmissions (well, just a few in Algeria for a while and I think also in Togo), so the dual-standard receiver issue did not arise. Had creating a tariff barrier been the primary objective, then it seems more likely that the Outre-Mer territories would also have had the positive/AM system. Also, as future domestic receivers would be 625/819 dual-standard, a de facto non-tariff barrier was already in place.

Some Western European countries with existing VHF 625/50 transmissions adopted the 1.25 MHz vestigial sideband for their UHF transmissions. This included Belgium, who used negative/FM at UHF. That was interesting, as when UHF reception facilities were added to Belgian multistandard receivers, they also covered the French positive/AM system.

The system designations A through F were promulgated at the CCIR 1959 Los Angeles meeting. There was some order to them, based upon line count. I.e. A for 405, B,C,D for 625 and E,F for 819. Within the 625 group, they appear to have been ordered by increasing channel width, with negative/FM preceding positive/AM at a given width. With E and F it looks to have been a time-based ordering, in that E preceded F.

G through L arrived at the ITU 1961 Stockholm European VHF/UHF planning meeting. Here the ordering seems to have been on the basis of increasing total (i.e. main + vestigial) vision bandwidth.

K’, originally K* (I don’t know when it was changed) was assigned at or before the ITU 1963 Geneva African VHF/UHF broadcasting conference. This covered the French Outre-Mer 625/50 system. Also in the documentation for that meeting, the 525/60 system was described as System M. From that one might infer that the M (and probably N) designations were assigned between the 1961 and 1963 ITU meetings.

By mid-1963, there were no fewer than ten 625/50 transmission standards, namely B, C, D, G, H, I, K, K’, L, N. From a reception viewpoint, that number could be reduced to seven, as B/G, C, D/K, H, I, L, N. Or, if vestigial sideband variations were ignored, to five: B/G/H, C, D/K/K’, L, N.

Of course, it gets more complicated when one factors in the later addition of colour and the later still addition of stereo/two-channel sound.

Turning to IFs, one may make the generalization that standard numbers were developed following the implementation of VHF channelling and channel assignment schemes, whilst UHF channel assignments were worked out on the basis of established standard IFs. But there were exceptions. Standard numbers were usually preceded by a period of ad hockery.

The US was early with an initial standard, 25.75 to 26.4 MHz VIF range, later known as the “low” or “20 MHz” IF, but soon learned that higher was desirable, so after careful study by the RMA, the 45.75 MHz VIF was chosen, known as the “high” or “40 MHz” IF. This was then used by the FCC as the basis for planning geographical UHF channel assignments.

In Europe, Italy elected to follow the American precedent, choosing the 45.75 MHz VIF in 1952 ahead of the ITU Stockholm 1952 (ST52) European VHF planning meeting. Evidently this action placed some constraints on VHF channel frequency assignments, and the Italian set thus differed from those used elsewhere in Western Europe. So this looks to have been a case where the decision was more political than technical.

The European 625/50 standard VIF of 38.9 MHz appears to have been established in 1954 after careful consideration, although it had been used before then. Similarly the British standard 405/50 VIF of 34.65 MHz was also established that year in anticipation of Band III transmissions starting in 1955. I am not sure when the French standard VIF of 28.05 MHz was established, but I’d guess during 1955. The tête-bêche channelling system limited the VIF choices, and the selection of 28.05 MHz rendered unusable channel F3, which had been included in the ST52 planning.

I imagine that the initial Russian standard 34.25 MHz VIF was established on technical grounds. The Russian channelling system, including the use of Band II, would have provided a different set of conflicts when it came to IF choices.

In the Australian case, the original 36.0 MHz standard VIF was more-or-less coincident with the announcement of the unusual VHF channelling scheme, also involving Band II frequencies.

The ITU 1961 Stockholm European VHF-UHF planning meeting (ST61) appears to have assumed the following VIFs in respect of UHF channel allocations:

Western Europe generally, Systems G and H – 38.9 MHz.

Italy, Systems G and H - both the Italian standard 45.75 MHz and the European standard 38.9 MHz.

UK, System I – 38.9 MHz. But as noted, the UK moved this to 39.5 MHz post-ST61 for reasons not yet known, although as best may be determined there was no compelling technical need to do so.

Russia and Eastern Europe generally – 34.25 MHz and a higher number in the vicinity of 38.9 MHz. The 38.0 MHz number arrived somewhat later, although it might have been in view in 1961.

France, System L – 32.7 MHz. This number arose because to simplify dual-standard receiver design, it was decided that the SIF for System L should be the same as for System E, namely 39.2 MHz. That put the System L VIF at 32.7 MHz, and also meant a vision-low IF channel, in turn requiring oscillator-low frequency changing in receivers. That was not a problem at UHF or at Band III, but it was at Band I. Hence when 625/50 was extended to Band I, it was with inverted channels as System L’, which in turn allowed oscillator-high operation. There was a line of consequences extending from the tête-bêche channelling decision right down to System L’.
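
That chain of consequences can be sketched numerically (my own arithmetic; the sample vision carrier frequencies are illustrative only, not a channel plan):

Code:
# Why oscillator-low worked for System L everywhere except Band I.
VIF_L = 32.7  # MHz, vision-low IF channel (SIF at 39.2 MHz)

for band, pc in (("UHF", 471.25), ("Band III", 183.25), ("Band I", 55.75)):
    print(band, pc - VIF_L)  # oscillator-low LO frequency
# UHF (438.55) and Band III (150.55) are unremarkable, but Band I
# gives 23.05 MHz - below the band and unworkably low. The inverted
# L' channels (vision carrier high) allow oscillator-high instead:
print(55.75 + VIF_L)  # 88.45 MHz LO; the sound carrier 6.5 MHz below
                      # vision still converts to the 39.2 MHz SIF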


Cheers,
11th Jun 2018, 5:57 am   #10
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

Quote:
Originally Posted by Synchrodyne
The system designations A through F were promulgated at the CCIR 1959 Los Angeles meeting. There was some order to them, based upon line count. I.e. A for 405, B,C,D for 625 and E,F for 819. Within the 625 group, they appear to have been ordered by increasing channel width, with negative/FM preceding positive/AM at a given width. With E and F it looks to have been a time-based ordering, in that E preceded F.
Not so. I have since found Report 124 from the CCIR 1959 meeting. It did not include the system letters. It simply described seven systems as 405, 525, 625, Belgian 625, IBTO 625, 819 and Belgian 819 respectively.

I had previously misread the statement in the Technical Annex to the Stockholm 1961 meeting, which was:

2.1 The television standards for Bands I, II and III (see C.C.I.R. Report No. 124) are designated in the Plans as
follows:

A – 405-line system
B – 625-line system
C – Belgian 625-line system
D – l.B.T.O. 625-line system
E – 819-line system
F – Belgian 819-line system


What appears to have been meant is that whilst the systems had been described in CCIR Report 124, the letter designations were developed at and for the purposes of the ITU ST61 plan, and were not part of that report. Thus, the A through F designations were assigned in 1961, essentially at the same time as the G through L designations. It seems obvious now, but my misapprehension has existed for quite a long time. One may imagine that with so many systems to consider, a simple designation system was highly desirable for those doing the planning.

Given that ST61 was concerned only with European matters, it seems reasonable that only systems of interest to Europe were included in the list.

These system designation letters were included in Report 308 from the CCIR 1963 Geneva meeting. Report 308 replaced Report 124. Additionally, Report 308 used the letters M and N for the 525- and Latin American 625-line systems respectively.

Report 308-1 from CCIR 1966 added system K1 (written that way rather than as K’), which, as already mentioned, had first appeared as K* from the ITU 1963 Geneva African VHF-UHF planning meeting.

The detailed history of those CCIR TV system reports would be another topic.

Quote:
Originally Posted by Synchrodyne
In the mid-to-late 1950s, European UHF transmitter network planning was heading to a common 8 MHz channel with fixed vision carrier position. This was in order to maximize utilization of the available channels. Countries using the existing CCIR 625/50 system would simply replicate this at UHF, albeit in 8 MHz rather than 7 MHz channels.

Without existing 625/50 services, both the UK and France, who were heading towards the use of 625/50 for future services, were free to fully exploit the possibilities of the 8 MHz UHF channels.
Report 123 from CCIR 1959 Los Angeles, referring to a 1958 Moscow meeting, included the following in respect of the UK and France:

The UK had yet to make a decision on what standard to use in Bands IV and V, but in any event would adopt the 8 MHz channel if that was generally adopted in Europe. And if it did choose 625 lines, it would be with 6 MHz vision bandwidth and 6.5 MHz vision-to-sound spacing.

France was similarly undecided, but accepted the 8 MHz channel if that were the majority choice, and stated that any 625-line system should have at least 6.5 MHz vision-to-sound spacing.

Quote:
Originally Posted by Synchrodyne
France opted for both the 1.25 MHz vestigial sideband and the 6 MHz main sideband for 625/50, this requiring utilization of the outer guardbands, as it had done with its 819/50 tête-bêche channelling. Why 6 MHz for the main sideband I do not know. My speculation is that it was done in consideration of the future adoption of SECAM colour. The initial SECAM proposal used AM (DSB) colour subcarriers. With DSB AM and the use of simple envelope detectors at the receiving end, any single-sidebanding caused by restricted vision bandwidth would have caused quadrature distortion which was potentially deleterious. A 6 MHz vision bandwidth allowed plenty of room for colour subcarrier upper sidebands without truncation.
The ITU 1963 Geneva African VHF-UHF documents included a submission on the benefits of the K* system. Argued therein was that the 1.25 MHz vestigial sideband was “right-sized” for minimizing quadrature distortion, and that the 6 MHz vision bandwidth allowed for a symmetrical ± 1.5 MHz colour subcarrier bandwidth (about 4.43 MHz), preferable for NTSC colour as well. I imagine that the same arguments had previously led the French to the System L parameters.


Cheers,
11th Jun 2018, 6:32 am   #11
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

On IFs, the respective CCIR reports associated with the Plenary meetings from 1956 Warsaw through 1986 Dubrovnik give approximate timelines for some of the changes, although I think that the CCIR was simply publishing what was voluntarily reported rather than the results of its own forensic work. There was evidently some quite late reporting.

1970 New Delhi Report 184-1 shows the Russian standard VIF as 38.0 MHz whereas it was 34.25 MHz in the 1966 Oslo Report 184. And for Japan both the old “low” VIF of 26.75 MHz and the new “high” VIF of 58.75 MHz are shown, whereas in 1966 only 26.75 MHz was shown.

1978 Kyoto Report 184-3 shows only the Japanese “high” IF. It also added the African System K1 VIF of 40.2 MHz, noting that this was obtained from document 44 of the 1963 Geneva African conference. So there was quite a time lag before this one got into the system. Document 44 was a rather arcane treatment of IF derivation, but overall I think a good worked example. The starting point was that the Band III channel set was a given, but that the three Band I channels were not yet set. So the problem was to find a combination of a satisfactory IF with a suitable Band I channel set (within 41 to 68 MHz and non-overlapping).

The Italian VIF was shown as 45.75 MHz through to 1970 New Delhi, and 38.9 MHz from 1974 Geneva.

The UK 625 VIF of 39.5 MHz did not appear until 1970 New Delhi.

I’d hazard a guess that over the years Philips used every IF in the book, and some that weren’t, and very likely a bigger range than any other TV tuner maker. I have the 1990 issue of the Philips Data Handbook DC-03 “Television Tuners, Coaxial Aerial Input Assemblies”, and this shows a wide range of IFs according to TV system and destination geography.


Cheers,
14th Jun 2018, 1:48 pm   #12
Pieter H
Re: Tuners in Philips G6-G11 CTV chassis

Hi Synchrodyne,

another major piece of analysis, thanks!
I'm not going to respond to every individual paragraph, also because I agree with most of your analysis. So only a few items:

This is what I wrote on the Belgian VHF standards:
To complicate matters further, Vlaanderen (Flanders, the Dutch-speaking part of Belgium) defined their 625-line system as a compromise with the French 819-line standard, so AM sound and positive modulation, whereas Wallonie, the French-speaking part of Belgium, as well as Luxembourg, opted for the 819-line system but squeezed into the 7MHz channel.
So in the end this was a typical Belgian compromise:
  • There was some Flemish-Wallonie unity (channel width @ 7MHz, 5,5MHz Snd AM)
  • But apart from that the Flemish system could be seen as a CCIR-B version.
  • And the Wallonie system as a 819-line version.
Everybody happy, although at the expense of two more standards.

As to Italy, please note that they indeed conceptually took over the US NTSC IF and VHF-I channel arrangement, but slightly modified:
- IF was 45,9MHz, not the NTSC-M 45,75MHz
- channel spacing was 7MHz, not the 6MHz NTSC
In my overviews there is not a single Philips Italian tuner with 45,75MHz; they are all 45,9, moving to 45,9/38,9 combos and ultimately 38,9 (when they were no longer special Italian tuners).

As to the UK 39,5MHz VIF, it was suggested to me by David Norton that the system-I N+1 sound IF trap would be the same as the sound carrier of the VHF 405-line channel 1. Which is correct!
  • System-I (UHF only): channel 8MHz, vision-sound distance 6MHz, so the N+1 sound IF is 2MHz above the VIF: 39,5+2 = 41,5MHz
  • System-A (VHF only): channel 1 vision 45MHz, sound 41,5MHz
Seems like a very plausible explanation, where, if I'm correct, Ch1 was the Crystal Palace transmitter from 1957 for the greater London area. Apparently this was a serious interference issue, especially with the first transistor tuners and their sensitivity to crosstalk. Moving the IF upwards in order to put the N+1 SIF trap exactly on 41,5MHz improved this issue.
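
The coincidence is easy to verify (my own sketch of the arithmetic in the two points above):

Code:
# The adjacent-channel sound carrier sits 8 - 6 = 2 MHz from the
# wanted vision carrier, so it converts to 2 MHz above the VIF.
VIF_I = 39.5    # MHz, System I vision IF
CHANNEL = 8.0   # MHz, System I channel width
V_TO_S = 6.0    # MHz, System I vision-sound distance

print(VIF_I + (CHANNEL - V_TO_S))  # 41.5 MHz - the IF sound trap

# System A channel 1 (Crystal Palace): vision 45 MHz, sound 3.5 MHz
# below it, i.e. the very same 41.5 MHz the trap already rejects.
print(45.0 - 3.5)                  # 41.5 MHz
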
In this context it is interesting to note how quickly the conversion from 405 to 625 lines happened - at least at set level: only the 1966 G6 first colour TV chassis and the 210 B&W chassis used a multi-mode IF, with a 34,65/38,15MHz IF for the 405-line system and 39,5/33,5MHz for 625 lines. See the IF curves below:
[Attachment: Philips 1967 AT7672 IF in UK 210 chassis.jpg]
From 1970 all Philips sets were in principle UHF-only!

Cheers, Pieter
16th Jun 2018, 2:32 am   #13
Synchrodyne
Re: Tuners in Philips G6-G11 CTV chassis

Thanks Pieter. Regarding the UK system I VIF choice, I agree that alignment of the lower channel sound rejection point with channel B1 sound at 41.5 MHz is a plausible reason for the shift from 38.9 to 39.5 MHz. It could have been a standalone reason, or it could have been an additional reason for an otherwise desired upward shift in order to better accommodate receivers with dual-Nyquist IF bandpasses. In the latter case, it was likely the reason that determined the actual magnitude of that upward shift.

Somewhere there must be a BREMA document that would explain all. Maybe one day it will surface.

One of the questions implied in that VIF selection was how close could one get to the bottom edge of Band I, or rather the lowest carrier frequency in Band I without causing undue problems. In the French case the closest approach generally was between the 39.2 MHz SIF and the 41.25 sound carrier of channel F2. But border-area multisystem receivers used 39.9 MHz VIF for system L, and sometimes also for system E. In the French African case, system K’, the VIF was 40.2 MHz against an assumed channel one vision carrier of 42.25 MHz.

On that basis, perhaps the UK system I VIF could have been pushed as high as 40.25 MHz. But another limiting factor was image rejection for the (n+10) case, which would have indicated less than 40 MHz by an adequate margin. 39.5 MHz would have been close to, if not at that limit. So it effectively killed two birds with one stone.

The Philips 210 IF bandpass is quite interesting. It seems to be of the quasi-double Nyquist type, in that the 34.65 MHz -6dB point on the 405 Nyquist flank remains in place on 625, giving a 625 vision bandwidth of 4.85 MHz. But that flank is steepened up by the 33.9 MHz trap, not used on 405.

In the Italian case, it could be that for its own reasons, Philips decided that moving from 45.75 to 45.9 MHz was beneficial. The documentation that I can find all shows 45.75 MHz as the “official” VIF until the change was made to the European standard of 38.9 MHz.

The CCIR TV IF reports from 1959 Los Angeles through 1970 New Delhi all show 45.75 MHz. That from 1974 Geneva shows the change to 38.9 MHz.

[Attachment: from CCIR 1959 Los Angeles Report 98.jpg]

Click image for larger version

Name:	from CCIR 1970 New Delhi Report 184-1.jpg
Views:	85
Size:	35.3 KB
ID:	164678

Click image for larger version

Name:	from CCIR 1974 Geneva Report 184-2.jpg
Views:	71
Size:	43.0 KB
ID:	164679

I imagine that Philips would have had good reason to deviate. I understand that it was instrumental in choosing the 38.9 MHz number, as the reference given for that (which I haven't seen) was an article in Funk und Ton no. 8, 1954, pp. 129-138, "Choice of an intermediate frequency for television receivers to suit the C.C.I.R. standard", by W. Holm and W. Werner. I imagine that was the same W. Werner who co-authored the Philips Technical Library book "Television" with F. Kerkhof, and the same W. Holm who wrote the book "How Television Works". The latter was a good "starter" book, and I still have my copy, bought back in 1964.

I guess a pertinent question is whether the deviation to 45.9 MHz was general amongst the setmakers and tuner makers, or whether Philips did that alone.

Cheers,
Synchrodyne is offline  
Old 7th Jul 2018, 3:58 pm   #14
Pieter H
Tetrode
 
Join Date: Sep 2017
Location: Waalre, Netherlands
Posts: 67
Default Re: Tuners in Philips G6-G11 CTV chassis

Hi Synchrodyne,

our discussion on where the UK IF choices, or standard-related IF choices in general, came from kept haunting me. Putting it all together I've come to the following observations.
  • It seems that in the ideal case one would like to position the LO frequency in an "empty" frequency space between channels.
  • This is done most consistently in the French system E 819-line standard, where most (Philips) receivers had the LO exactly on the band edge of the active pair/impair (even/odd) band, while it additionally coincided with the sound carrier of the opposite band.
  • This is exactly what was done for UK CCIR-I. The need to change from the CCIR-G/H 38.9 MHz IF arose because the system I N+4 sound carrier is, in comparison, 0.5 MHz higher. For reasons as yet unclear to me there was apparently a preference to put the LO nearer the PC than the SC.
  • So for CCIR-I the centre between the N+4 SC and the N+5 PC would have been 39.0 MHz, but it was likely moved to 39.5 MHz to put the N+1 IF sound trap on the channel A1 sound carrier at 41.5 MHz, as discussed earlier.
  • The interesting thing is that the G/H 38.9 MHz is similarly well placed for the standard with 5.5 MHz picture-sound distance AT UHF! Yet 38.9 MHz was standardised at a time when only VHF was being transmitted off-air, which suggests the choice was mainly determined by the future UHF use. I don't have the 1954 article by Holm and Werner on this, so can't check whether this assumption is true.
  • I also looked at the location of the image frequency given the IF choice (i.e. 2 × IF above the wanted PC). Here again we see that the G/H and I IFs fall neatly between an SC and a PC. Even stronger, looking at the numbers one could argue that placing the image frequency EXACTLY between the N+9 SC and the N+10 PC was the main design criterion, with the LO frequency and IF then automatically being half that number. That would then be the reason for the 39.5 MHz of CCIR-I: the image falls exactly between those two carriers at +78 and +80 MHz. And similarly for the 38.9 MHz of G/H! (See the sketch below this list.)
  • The "older" VHF off-air standards from the early 1950s (B and D) probably had their IF choice determined either by UHF (B) or by other practical considerations. Given the much lower number of channels, the above arguments on LO positioning applied only to the lowest VHF-III channels; in all other cases the LO was outside the received band, and the choice was thus less critical.
  • This changed, however, with the introduction of cable S-channels: with the VHF-III band all of a sudden substantially extended, LO placement could have become an issue. I'm still checking if and how this translated into additional tuner specs.
  • As to the Italian (Philips) IF of 45.9 MHz, it is noteworthy that this is exactly one channel width (7 MHz) up from 38.9 MHz, and would thus give the same performance from an interference-rejection perspective as the standard 38.9 MHz. Might have been the reason, but just a theory.
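To make the midpoint arithmetic behind these observations explicit, here is a small Python sketch (my own reconstruction, assuming 8 MHz UHF channels and the LO above the signal; all offsets in MHz relative to the wanted picture carrier):

Code:
# For a given VIF: the LO falls in the gap between the N+4 sound
# carrier and the N+5 picture carrier, the image in the gap between
# the N+9 sound carrier and the N+10 picture carrier.
def placement(name, vif, channel, vision_sound):
    lo, image = vif, 2 * vif
    n4_sc, n5_pc = 4 * channel + vision_sound, 5 * channel    # gap holding the LO
    n9_sc, n10_pc = 9 * channel + vision_sound, 10 * channel  # gap holding the image
    print(f"{name}: LO +{lo:.2f} in {n4_sc:.1f}..{n5_pc:.1f} (mid {(n4_sc + n5_pc) / 2:.2f}), "
          f"image +{image:.2f} in {n9_sc:.1f}..{n10_pc:.1f} (mid {(n9_sc + n10_pc) / 2:.2f})")

placement("System I  ", 39.5, 8.0, 6.0)  # image sits exactly on the 79.00 midpoint
placement("System G/H", 38.9, 8.0, 5.5)  # LO and image both clear of the carriers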

Below is a table I made to summarize all this.
Green fields indicate LO and image frequencies falling exactly between an SC and a PC. Dark orange fields indicate the LO in the middle of the video band, while light orange fields are cases where the LO is just between the upper video band edge and the SC. The dark yellow fields in the image column are only an issue in the case of cable channels.

I don't pretend this is the full or only story, but it is too nice and consistent to be irrelevant to the IF choice. Looking forward to your view on this.

Cheers, Pieter
[Attached image: IF standards.jpg]
Pieter H is offline  
Old 16th Jul 2018, 12:48 am   #15
Synchrodyne
Nonode
 
Join Date: Jan 2009
Location: Papamoa Beach, Bay of Plenty, New Zealand
Posts: 2,943
Default Re: Tuners in Philips G6-G11 CTV chassis

Hi Pieter:

Thanks for the detailed analysis, which is quite compelling. I have added a few comments, as follows, but nothing material in respect of the numbers themselves.

Quote:
Originally Posted by Pieter H View Post

our discussion on where the UK IF choices, or standard-related IF choices in general, came from kept haunting me. Putting it all together I've come to the following observations.
  • It seems that in the ideal case one would like to position the LO frequency in an "empty" frequency space between channels.
  • This is done most consistently in the French system E 819-line standard, where most (Philips) receivers had the LO exactly on the band edge of the active pair/impair (even/odd) band, while it additionally coincided with the sound carrier of the opposite band.
  • This is exactly what was done for UK CCIR-I. The need to change from the CCIR-G/H 38.9 MHz IF arose because the system I N+4 sound carrier is, in comparison, 0.5 MHz higher. For reasons as yet unclear to me there was apparently a preference to put the LO nearer the PC than the SC.
  • So for CCIR-I the centre between the N+4 SC and the N+5 PC would have been 39.0 MHz, but it was likely moved to 39.5 MHz to put the N+1 IF sound trap on the channel A1 sound carrier at 41.5 MHz, as discussed earlier.
That all makes sense. As it was advantageous to move the IF upwards a little, moving it to give coincidence between the A1 SC and the N+1 SC was even more advantageous.

Quote:
Originally Posted by Pieter H View Post
  • The interesting thing is that the G/H 38.9 MHz is similarly well placed for the standard with 5.5 MHz picture-sound distance AT UHF! Yet 38.9 MHz was standardised at a time when only VHF was being transmitted off-air, which suggests the choice was mainly determined by the future UHF use. I don't have the 1954 article by Holm and Werner on this, so can't check whether this assumption is true.
It does seem possible that Holm and Werner were looking at a future that included UHF. In the USA, the 45.75 MHz IF was derived from consideration of the VHF requirements, and then the FCC used it to plan UHF channel assignments whilst avoiding interference problems. This resulted in a list of co-located and adjacent service area assignments to be avoided, known as the “UHF taboos”. Holm and Werner were no doubt aware of this history, and perhaps wanted to consider the UHF case in their work. I am not sure when the idea of standard pan-European 8 MHz channels for UHF was first mooted, but I think it was quite early on. A general consensus appears to have been reached at a 1958 CCIR Moscow study group meeting, reported in the CCIR 1959 Los Angeles documents, which suggests that the question had been under study for a while before then.

Quote:
Originally Posted by Pieter H View Post
  • As to the Italian (Philips) IF of 45.9 MHz, it is noteworthy that this is exactly one channel width (7 MHz) up from 38.9 MHz, and would thus give the same performance from an interference-rejection perspective as the standard 38.9 MHz. Might have been the reason, but just a theory.
Well spotted! I’d missed that connection.


Cheers,
Synchrodyne is offline  
Old 16th Jul 2018, 2:04 pm   #16
Richard_FM
Octode
 
Richard_FM's Avatar
 
Join Date: Sep 2017
Location: Stockport, Cheshire, UK.
Posts: 1,999
Default Re: Tuners in Philips G6-G11 CTV chassis

I have only had the chance to scan through, but did Philips make some tuners specially for the Irish market?
Richard_FM is offline  
Old 16th Jul 2018, 2:30 pm   #17
dragonser
Heptode
 
dragonser's Avatar
 
Join Date: Aug 2009
Location: Carshalton, Surrey, UK.
Posts: 734
Default Re: Tuners in Philips G6-G11 CTV chassis

Hi,
I think one of the early Philips sets had six buttons, and you could select which band and system each button would work on. I think 625 lines on Band I or Band III was an option...
As far as I know the G8 and G11 only had UHF tuners...
but my memory is not as clear as I would like.
__________________
Regards Peter B
dragonser is offline  
Old 16th Jul 2018, 2:53 pm   #18
kan_turk
Hexode
 
kan_turk's Avatar
 
Join Date: Mar 2014
Location: Dublin, Ireland
Posts: 396
Default Re: Tuners in Philips G6-G11 CTV chassis

All Philips sets sold in Ireland at the time had VHF and UHF capability, with either separate tuners or integrated VHF/UHF tuners. G8s sold in Ireland definitely had VHF and UHF tuners from the then-current Philips range.

J
kan_turk is offline  
Old 17th Jul 2018, 11:44 pm   #19
Synchrodyne
Nonode
 
Join Date: Jan 2009
Location: Papamoa Beach, Bay of Plenty, New Zealand
Posts: 2,943
Default Re: Tuners in Philips G6-G11 CTV chassis

A question: what type of VHF tuner was used on the Australian version of the Philips K9 chassis, which would have been produced from c.1975? My recollection is that the Australian version of the K9 was unusual in having a rotary channel selector rather than the customary push-button arrangement. Whether it was a rotary tuner per se, or simply a varicap tuner with rotary channel switching, I don't know. And at that time the tuner would have covered the Australian Band II and out-of-band channels.

Presumably it would also have had the Australian standard IF. Actually there were two: the original 36.0 MHz VIF of 1956 and the later 36.875 MHz VIF, which was in place circa 1970. Although the latter was offered as an alternative, not a replacement, I think it was the dominant number in the colour era.

This question came to mind after reading this thread: https://www.vintage-radio.net/forum/...d.php?t=148117.

The New Zealand version of the K9 had what I think was the European standard VHF tuner. It had six pushbuttons with corresponding presets that were calibrated with channels (E) 2 through 12, with an “automatic” flipover between Bands I and III, i.e. between channels 4 and 5. The corresponding NZ channels were generally one number lower than the European channels, and evidently there was no problem getting down to NZ1, 44 to 51 MHz, which was somewhat below E2, 47 to 54 MHz.


Cheers,
Synchrodyne is offline  
Old 20th Jul 2018, 8:26 pm   #20
Pieter H
Tetrode
 
Join Date: Sep 2017
Location: Waalre, Netherlands
Posts: 67
Default Re: Tuners in Philips G6-G11 CTV chassis

Richard (post #16),
No, as far as I can see there have never been dedicated Ireland tuners within the Philips tuner range. Just one general remark: the fact that I haven't seen them doesn't mean they never existed! New tuners keep popping up regularly.

The best way to verify this would be to look inside some Philips sets made for Ireland and photograph the tuner label. You never know! But as Peter and kan_turk already suggest, most likely they were standard Western European VHF-UHF tuners.

Cheers, Pieter
Pieter H is offline  