Old 9th Jun 2018, 2:57 am   #9
Synchrodyne
Nonode
 
Join Date: Jan 2009
Location: Papamoa Beach, Bay of Plenty, New Zealand
Posts: 2,944
Default Re: Tuners in Philips G6-G11 CTV chassis

As you say, there were quite a few twists and turns in the development of analogue TV transmission and reception systems, with political and economic factors involved as well as technical. Mostly we can ascertain what happened, but why it happened is not always apparent.

My inclination is first to look for possible technical reasons that support what was done. When the search for these comes up empty, then the supervention of politics is a reasonable assumption. That is not to rule out that in some cases the technical and political requirements simply happened to align, and that had they not, the political choice might have taken precedence.

Here is my distillation of the history, looking first at the line counts and transmission parameters, then at the IF choices.

The UK adopted 405/50 in 1936 as about the best that could reasonably be done at the time, with approximately the same line frequency as the RCA 343/60 system. Positive vision modulation was adopted as making the most efficient use of transmitter capability at the time. AM sound was adopted as FM was not yet quite a contender. The system was originally double sideband, with vestigial sideband adopted in the late 1940s, using the 0.75 MHz NTSC number, albeit on the upper rather than the lower side.

NTSC chose 525/60 in 1941. All parameters were open for discussion and choice on their merits except the channel width of 6 MHz, which was predetermined. Thus vestigial sideband, negative vision modulation, inclusion of equalizing pulses and FM sound were chosen as the better/best options. Negative modulation was chosen because of the apparent ease with which black-level AGC could be obtained in receivers. There was evident concern about maintaining transmitter linearity at lower modulation levels towards white level, and the original white-level specification was 15% maximum. (The 10% minimum, to accommodate intercarrier sound, was a later RTMA addition.) Until fairly late in the NTSC proceedings, the line choice was 441, the same as the RMA system, itself representing an advance in line frequency over the UK 405/50 system. (The 441/60 equivalent would have been about 525/50.) Donald Fink advanced the flatness-of-field (lack of lininess) argument in favour of 525, and that was chosen, thus producing what was a durable standard. (Much later, Fink wrote that an 8 MHz channel would have been a better choice.)

Russian work on 625/50 started c.1944. It was an adaptation of the NTSC 525/60 system to suit 50 Hz power supplies, based upon approximately equivalent line frequencies. Without a predetermined channel width constraint, the Russians chose a 6 MHz vision bandwidth, ostensibly to match 16 mm film capability, and an 8 MHz channel. But the vestigial sideband was 0.75 MHz, the same as for NTSC. Regular broadcasting started in 1948.
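For what it's worth, the approximate line-frequency equivalences underlying these adaptations are easy to check. A quick sketch (Python) of the 2:1 interlace arithmetic, purely illustrative:

```python
# Rough check of the line-frequency equivalences mentioned above.
# For a 2:1 interlaced system, line frequency = lines x (field rate / 2).

def line_freq_hz(lines: int, field_rate_hz: int) -> float:
    """Horizontal scanning frequency of a 2:1 interlaced system."""
    return lines * field_rate_hz / 2

pairs = [
    ((343, 60), (405, 50)),   # RCA system vs. UK 405/50
    ((441, 60), (525, 50)),   # RMA system vs. its notional 50 Hz equivalent
    ((525, 60), (625, 50)),   # NTSC vs. the Russian adaptation
]

for (l1, f1), (l2, f2) in pairs:
    a, b = line_freq_hz(l1, f1), line_freq_hz(l2, f2)
    print(f"{l1}/{f1}: {a:7.0f} Hz   {l2}/{f2}: {b:7.0f} Hz   ratio {a/b:.3f}")
```

In each pair the two line frequencies agree within a couple of percent, which is the sense in which 625/50 was the 50 Hz "equivalent" of 525/60.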

The French 819/50 “high definition” system was adopted in 1948 and on-air in 1949. It had 10.4 MHz vision bandwidth in a 14 MHz channel, later reduced to 13.15 MHz for the tête-bêche plan by eliminating the outer guard bands. It had a 2.0 MHz vestigial sideband, about the same proportion of the full sideband as for the NTSC case, originally on the upper side but later on either side in the tête-bêche system. It was positive/AM without equalizing pulses. The reasons for the latter choices are unknown, but it might have been for similarity with the existing 441/50 system. At one stage it was envisaged that both systems would be retained, although by late 1948 the 441/50 system was given a definite end-date of the 1st of 1958. By 1948, the NTSC’s thoughts on the benefits of negative modulation were being questioned, e.g. see Wireless World 1948 December pp.439-440. So perhaps positive modulation was chosen for technical reasons as well. In respect of the line count, at that time Western Europe generally had not settled upon a standard, although 625/50 was in view.

The Western Europe situation looks a bit murky. Germany picked up the 625/50 system from the Russian work. Apparently there was ongoing debate about the channel width. One may deduce that 8 MHz was seen as too much, and that the 6 MHz American channel was attractive. An American 625/50 transmitter was installed at Torino in 1949, this working in a 6 MHz channel. Meanwhile Philips had proposed a 567/50 system as being a better fit for the desired 6 MHz channel than 625/50, but that did not garner much support. I’d guess that on one hand, anything lower in line count than what the Russians were doing would have been unacceptable, but equally, so would have been directly copying the Russian system. The late argument was apparently between 4.25 and 5 MHz video bandwidths in 6 and 7 MHz channels respectively. (4.25 MHz was an interesting number, given that NTSC was 4 MHz, increased to 4.2 MHz in 1953 to accommodate colour. It [4.25 MHz] suggests that the 6 MHz channel proponents realized that a 4 MHz vision bandwidth was “undersized” and that they needed to push that number out as far as possible.) The 1950 Gerber compromise came out in favour of the 7 MHz channel.

The 6 MHz 625/50 channel resurfaced for use in Argentina (and later in other 50 Hz Latin American countries) in 1951. In those cases there was a need to stay with the American 6 MHz channelling pattern, used in the adjoining 60 Hz countries.

By the end of 1951, it looked as if the number of 625/50 transmission variants – three – might have been not much smaller than the number of 625/50 transmitters actually in regular broadcasting service!

Then in 1952-53 came the Belgian variations. Here the problem was that the transmitters would need to be able to switch between incoming programmes in either 625/50 or 819/50 format. So commonality of channel arrangements and other parameters was required. Presumably spectrum availability pointed to the 7 MHz channel, which meant that the 819/50 service would be of very limited vision bandwidth as compared with the French version. Positive vision modulation and AM sound (with pre-emphasis) were chosen for both. Why, I don’t know. Had negative/FM been chosen for both, then the 625/50 system would have been the same as that elsewhere in Western Europe, which seems simpler. Maybe the assessment at the time was that positive was a bit better. Or perhaps “equality of treatment” across the Flanders/Wallonia “divide” demanded that both the 625/50 and 819/50 systems be modified as compared with their prototypes, not just one of them. The Belgian 819/50 signal included equalizing pulses, presumably to make it as much like the 625/50 signal as possible.

In the mid-to-late 1950s, European UHF transmitter network planning was heading to a common 8 MHz channel with fixed vision carrier position. This was in order to maximize utilization of the available channels. Countries using the existing CCIR 625/50 system would simply replicate this at UHF, albeit in 8 MHz rather than 7 MHz channels.

Without existing 625/50 services, both the UK and France, who were heading towards the use of 625/50 for future services, were free to fully exploit the possibilities of the 8 MHz UHF channels.

Early UK 625/50 work was with the Russian parameters, logical given the availability of an 8 MHz channel. Late in its deliberations, the TAC (Television Advisory Committee) reviewed this and determined that optimum use of the 8 MHz channel would be obtained by increasing the vestigial sideband from 0.75 to 1.25 MHz, and decreasing the main sideband from 6 to 5.5 MHz. This was recommended in 1960 and eventually adopted for UK 625/50 UHF transmissions, which followed the standard negative/FM pattern. One might say that the Russians had overlooked the desirability of vestigial sideband proportionality when transposing from NTSC. I am inclined to credit the TAC with having primarily technical motives for its choice, although one could not totally discount a leaning – perhaps even subconscious – to finding a good technical reason for doing differently to the Russians whilst still making full use of the 8 MHz channel. Insofar as future UK receivers would be dual-standard 405/625, that alone would have been a significant non-tariff barrier if indeed that was part of the thinking.
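The proportionality point is easy to see numerically. A quick sketch (Python), using the bandwidth figures quoted above; the grouping labels are mine, purely for illustration:

```python
# Vestigial sideband as a proportion of the main (vision) sideband,
# using the figures quoted in the text (MHz).

systems = {
    "NTSC (M)":      (0.75, 4.2),   # (vestigial, main sideband)
    "Russian (D/K)": (0.75, 6.0),
    "UK 625 (I)":    (1.25, 5.5),
}

for name, (vsb, main) in systems.items():
    print(f"{name:14s} VSB/main = {vsb / main:.3f}")
```

The Russian 0.75/6.0 ratio comes out well below the NTSC proportion, while the TAC's 1.25/5.5 restores (indeed slightly exceeds) it, which is consistent with the "overlooked proportionality" reading.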

France opted for both the 1.25 MHz vestigial sideband and the 6 MHz main sideband for 625/50, this requiring utilization of the outer guardbands, as it had done with its 819/50 tête-bêche channelling. Why 6 MHz for the main sideband I do not know. My speculation is that it was done in consideration of the future adoption of SECAM colour. The initial SECAM proposal used AM (DSB) colour subcarriers. With DSB AM and the use of simple envelope detectors at the receiving end, any single-sidebanding caused by restricted vision bandwidth would have caused quadrature distortion which was potentially deleterious. A 6 MHz vision bandwidth allowed plenty of room for colour subcarrier upper sidebands without truncation. I imagine that the same argument might have been used when FM subcarriers were adopted for SECAM, as one supposes that single-sidebanding of an FM signal would also cause distortion. However, the later use of SECAM for System B transmissions suggests that single-sidebanding of the FM subcarriers was not a show-stopper.

Positive/AM was chosen for domestic 625/50 transmissions, apparently for commonality with the existing 819/50 transmissions, this easing the design of dual-standard receivers. This has sometimes been assigned as a political decision, creating a non-tariff trade barrier, although I have never seen a supporting case for this assertion. Contrary evidence is that negative/FM was chosen a little later for the Outre-Mer territories, where there were no 819/50 transmissions (well, just a few in Algeria for a while and I think also in Togo), so the dual-standard receiver issue did not arise. Had creating a tariff barrier been the primary objective, then it seems more likely that the Outre-Mer territories would also have had the positive/AM system. Also, as future domestic receivers would be 625/819 dual-standard, a de facto non-tariff barrier was already in place.

Some Western European countries with existing VHF 625/50 transmissions adopted the 1.25 MHz vestigial sideband for their UHF transmissions. This included Belgium, who used negative/FM at UHF. That was interesting, as when UHF reception facilities were added to Belgian multistandard receivers, they also covered the French positive/AM system.

The system designations A through F were promulgated at the CCIR 1959 Los Angeles meeting. There was some order to them, based upon line count: A for 405; B, C and D for 625; and E and F for 819. Within the 625 group, they appear to have been ordered by increasing channel width, with negative/FM preceding positive/AM at a given width. With E and F it looks to have been a time-based ordering, in that E preceded F.

G through L arrived at the ITU 1961 Stockholm European VHF/UHF planning meeting. Here the ordering seems to have been on the basis of increasing total (i.e. main + vestigial) vision bandwidth.

K’, originally K* (I don’t know when it was changed), was assigned at or before the ITU 1963 Geneva African VHF/UHF broadcasting conference. This covered the French Outre-Mer 625/50 system. Also in the documentation for that meeting, the 525/60 system was described as System M. From that one might infer that the M (and probably N) designations were assigned between the 1961 and 1963 ITU meetings.

By mid-1963, there were no fewer than ten 625/50 transmission standards, namely B, C, D, G, H, I, K, K’, L, N. From a reception viewpoint, that number could be reduced to eight, as B/G, C, D/K, H, I, K’, L, N. Or, if vestigial sideband variations were ignored, to five: B/G/H, C, D/K/K’, L, N.

Of course, it gets more complicated when one factors in the later addition of colour and the later still addition of stereo/two-channel sound.

Turning to IFs, one may make the generalization that standard numbers were developed following the implementation of VHF channelling and channel assignment schemes, whilst UHF channel assignments were worked out on the basis of established standard IFs. But there were exceptions. Standard numbers were usually preceded by a period of ad hockery.

The US was early with an initial standard, 25.75 to 26.4 MHz VIF range, later known as the “low” or “20 MHz” IF, but soon learned that higher was desirable, so after careful study by the RMA, the 45.75 MHz VIF was chosen, known as the “high” or “40 MHz” IF. This was then used by the FCC as the basis for planning geographical UHF channel assignments.
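Those two American VIF choices fix the corresponding sound IFs, since with the usual oscillator-high conversion the sound IF sits one intercarrier spacing (4.5 MHz for System M) below the vision IF. A quick sketch (Python) of that arithmetic, purely illustrative:

```python
# System M places the sound carrier 4.5 MHz above the vision carrier.
# With an oscillator-high converter the IF spectrum is inverted, so the
# sound IF falls 4.5 MHz below the vision IF.

INTERCARRIER_MHZ = 4.5  # System M vision-to-sound spacing

def sound_if(vision_if_mhz: float) -> float:
    """Sound IF for a given vision IF, oscillator-high conversion."""
    return vision_if_mhz - INTERCARRIER_MHZ

for vif in (25.75, 45.75):   # the "20 MHz" and "40 MHz" US vision IFs
    print(f"VIF {vif} MHz -> SIF {sound_if(vif)} MHz")
```

This gives the familiar 21.25 and 41.25 MHz sound IFs for the low and high American IF standards respectively.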

In Europe, Italy elected to follow the American precedent, choosing the 45.75 MHz VIF in 1952 ahead of the ITU Stockholm 1952 (ST52) European VHF planning meeting. Evidently this action placed some constraints on VHF channel frequency assignments, and the Italian set thus differed from those used elsewhere in Western Europe. So this looks to have been a case where the decision was more political than technical.

The European 625/50 standard VIF of 38.9 MHz appears to have been established in 1954 after careful consideration, although it had been used before then. Similarly the British standard 405/50 VIF of 34.65 MHz was also established that year in anticipation of Band III transmissions starting in 1955. I am not sure when the French standard VIF of 28.05 MHz was established, but I’d guess during 1955. The tête-bêche channelling system limited the VIF choices, and the selection of 28.05 MHz rendered channel F3, which had been included in the ST52 planning, unusable.

I imagine that the initial Russian standard 34.25 MHz VIF was established on technical grounds. The Russian channelling system, including the use of Band II, would have provided a different set of conflicts when it came to IF choices.

In the Australian case, the original 36.0 MHz standard VIF was more-or-less coincident with the announcement of the unusual VHF channelling scheme, also involving Band II frequencies.

The ITU 1961 Stockholm European VHF-UHF planning meeting (ST61) appears to have assumed the following VIFs in respect of UHF channel allocations:

Western Europe generally, Systems G and H – 38.9 MHz.

Italy, Systems G and H - both the Italian standard 45.75 MHz and the European standard 38.9 MHz.

UK, System I – 38.9 MHz. But as noted, the UK moved this to 39.5 MHz post-ST61 for reasons not yet known, although as best may be determined there was no compelling technical need to do so.

Russia and Eastern Europe generally – 34.25 MHz and a higher number in the vicinity of 38.9 MHz. The 38.0 MHz number arrived somewhat later, although it might have been in view in 1961.

France, System L – 32.7 MHz. This number arose because to simplify dual-standard receiver design, it was decided that the SIF for System L should be the same as for System E, namely 39.2 MHz. That put the System L VIF at 32.7 MHz, and also meant a vision-low IF channel, in turn requiring oscillator-low frequency changing in receivers. That was not a problem at UHF or at Band III, but it was at Band I. Hence when 625/50 was extended to Band I, it was with inverted channels as System L’, which in turn allowed oscillator-high operation. There was a line of consequences extending from the tête-bêche channelling decision right down to System L’.
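The IF arithmetic in that chain can be sketched quickly (Python). The 200 MHz vision carrier is an arbitrary illustrative value, and the System E case assumes a channel with the sound carrier 11.15 MHz below vision; only the IF relationships matter:

```python
# Sketch of the System E / System L IF arithmetic described above,
# checking that both land on the common 39.2 MHz sound IF.

def ifs(f_vis: float, f_sound: float, f_osc: float) -> tuple[float, float]:
    """Vision and sound IFs for given carrier and local-oscillator
    frequencies (all MHz); the IF is just the difference frequency."""
    return abs(f_osc - f_vis), abs(f_osc - f_sound)

f_vis = 200.0  # arbitrary illustrative vision carrier, MHz

# System E: sound 11.15 MHz below vision; oscillator-high by 28.05 MHz.
vif_e, sif_e = ifs(f_vis, f_vis - 11.15, f_vis + 28.05)

# System L: sound 6.5 MHz above vision; oscillator-low by 32.7 MHz,
# giving the vision-low IF channel mentioned above.
vif_l, sif_l = ifs(f_vis, f_vis + 6.5, f_vis - 32.7)

print(f"System E: VIF {vif_e:.2f} MHz, SIF {sif_e:.2f} MHz")
print(f"System L: VIF {vif_l:.2f} MHz, SIF {sif_l:.2f} MHz")
```

Both cases come out with the 39.2 MHz sound IF, which is the dual-standard simplification the French were after; the price, as noted, was oscillator-low working for System L and hence the inverted System L’ channels at Band I.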


Cheers,
Synchrodyne