Television Standards Converters, Modulators etc: standards converters, modulators and anything else for providing signals to vintage televisions.

#81
Retired Dormant Member
Join Date: Jun 2010
Location: Co. Limerick, Ireland.
Posts: 1,183
Why not have the CF player do the interlacing?
This is how DVD playback usually works.

#82
Rest in Peace
Join Date: Jul 2011
Location: Bridgnorth, Shropshire, UK.
Posts: 787
Oh Kat, you've saved me hours of hair-pulling - thank you!

My player is very dumb, Neon Indicator (I still don't know your name!), and the data has to be stored in strict chronological order. It's not a problem having the two fields combined in ffmpeg's output - the utility I plan to write can deal with that.

I have my Linux machine now, and I've added an audio DAC to my player. Just the play/stop/rew/ffwd buttons left to add.

#83
Rest in Peace
Join Date: Jul 2011
Location: Bridgnorth, Shropshire, UK.
Posts: 787
I had a question, Kat, but I think the 'reinterlace' argument probably answers it: interlacing is not a simple matter of taking odd and even lines from a progressively scanned image, but I'm sure ffmpeg is taking care of that. Marvellous program!

#84
Retired Dormant Member
Join Date: Jun 2010
Location: Co. Limerick, Ireland.
Posts: 1,183
Interlacing is actually fairly simple if the progressive source frame rate is half the field rate, as then you simply play the odd and even lines of each source frame in successive fields.

De-interlacing is fraught with horrors and can only be done with no loss of quality if the original source was progressive at half the field rate (i.e. 24fps film speeded to 25fps and interlaced at 50Hz, as there is no movement between fields).

http://wiki.videolan.org/Deinterlacing
http://www.100fps.com/
http://en.wikipedia.org/wiki/Deinterlacing

Creating interlaced from progressive, though, really is easy. If the source is really interlaced (i.e. a simple weave will not work, as you get a comb artefact on any movement because objects are not in the same position 1/50th of a second later), there is actually no way to "perfectly" de-interlace. True film source is never "real" interlace; there is no additional temporal information in fields versus frames.

Why was interlacing chosen for TV? Film is 24fps, which is actually a bit flickery, so each frame is shown twice in the projector (fake 48fps). In the 1930s (or indeed up to the 1990s) it was not technically possible to show each transmitted TV frame twice. Using 50Hz (or 60Hz) rather than 48Hz also avoids moving "hum bars" on receivers and flicker from studio lighting: unless you have DC lighting and very good PSUs, your TV rate needs to be the same as the mains. But full 50fps (or 60fps) progressive would mean a 20kHz to 32kHz line rate instead of 10kHz to 16kHz and need twice the transmission bandwidth.

So they had the idea of doing something film can't do: interlace. The entire frame would be 25fps or 30fps, but the scanning would be two sub-frames (fields) twice as fast with half the lines each. This actually increases the quality compared with progressive 25fps (which was possible); the point is to get a refresh rate more than twice that of film (50Hz or 60Hz). Only sharp detail one line high will show the 25Hz flicker.

In HDTV they often argue that 1080p is higher resolution than 1080i. In fact, for a static image they are the same! Also, in the 50Hz world, HD film is natively only 25fps (speeded from 24fps and sound pitch corrected), so there is NO difference at all between film transfers at 1080i (25fps, 50 fields) and 1080p (50fps).

True TV interlace source actually has higher temporal resolution than film: the 25fps resolution is 720x576 or 1920x1080 and the 50fps resolution is 720x288 or 1920x540 (film is only 24fps and, depending on format, transferred at 480, 576 or 1080 lines). So you can't de-interlace without losing vertical resolution (blending adjacent lines), temporal resolution (blending across frames), or both. Modern de-interlacing is much improved; older schemes really blurred movement.

The 60Hz (ex-NTSC) world has an extra problem with film source, which is particularly why "interlace" is regarded as "evil" and progressive as "good". But in the 50Hz world, interlace improves the quality for free.
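As a concrete illustration of the "easy direction" above, here is a minimal sketch assuming a progressive source at exactly half the field rate and a bare greyscale frame buffer; the dimensions and function name are only placeholders, not anything from the thread's actual code.

```c
/* Illustrative sketch only: weave one progressive frame (at half the field
 * rate) into its two fields.  The frame is treated as a plain array of
 * LINES rows of WIDTH luma samples; real video also carries chroma,
 * blanking and syncs. */
#include <stdint.h>
#include <string.h>

#define WIDTH 720
#define LINES 576

/* Copy every second line of 'frame', starting at 'first_line' (0 or 1),
 * into 'field', which holds LINES/2 rows. */
static void extract_field(const uint8_t frame[LINES][WIDTH],
                          uint8_t field[LINES / 2][WIDTH],
                          int first_line)
{
    int dst = 0;
    for (int src = first_line; src < LINES; src += 2)
        memcpy(field[dst++], frame[src], WIDTH);
}

/* Played alternately, extract_field(frame, 0) then extract_field(frame, 1)
 * give perfect interlace: there is no motion between the two fields of one
 * frame, so no comb artefacts can appear. */
```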

#85
Retired Dormant Member
Join Date: Dec 2003
Location: North London, UK.
Posts: 6,168
Another way of looking at interlace is that it was the first video compression system. It served very well until the era of standards conversion when separating vertical and temporal information became important. Doing this well ranges from tricky to downright impossible. The results of getting it wrong are all too visible.
Conversely, as neon_indicator has said, going from progressive to interlace is trivial and free of artifacts. You just read the odd lines in field 1, go back to the top and read the even lines in field 2. Since there is always an odd number of lines in total, the second field starts halfway along a line. Using an odd number of lines like this made interlace happen automatically on CRT displays; an even number of lines would have required some special circuitry to move the scan down by a line at the start of the second field.

Interlace is now a bit of a curse. Field storage is now trivial, modern compression systems work better when temporal and vertical information are clearly separated, and LCD and many other displays are inherently progressive.

#86
Retired Dormant Member
Join Date: Jun 2010
Location: Co. Limerick, Ireland.
Posts: 1,183
It's more of a curse in the 60Hz world than the 50Hz one, though: the 3:2 pull-down artefact.

Why do we have to de-interlace? Because if you need to change the resolution you need a progressive image. In theory DLP, LCD and plasma displays can display progressive (like VGA CRTs) or interlace (traditional TV CRTs), but unlike a CRT they have a fixed native resolution. Although in the analogue era there was 441, 405, 819, 525, 625 and even 1125 (which is the "why" of 1080), unless you lived in Belgium or were a broadcaster, your TV set really only had to do one resolution.

A cheap HD Ready TV is typically 1366 x 768 (basically a widescreen version of PC 1024 x 768, WXGA, not a TV resolution at all). But digital transmission uses:

4:3 aspect
384 x 288
544 x 576
704 x 576
720 x 576
(768 x 576 would be square pixel, but doesn't exist)

16:9 aspect (mostly anamorphic)
384 x 288
544 x 576
704 x 576
720 x 576
(1024 x 576 would be square pixel, but doesn't exist)
1440 x 720 (rare, not in UK/Ireland)
1440 x 1080
1440 x 1088 (anamorphic)
1920 x 1080 (square pixel)

That's 12 resolutions, and all are 25fps 50Hz interlace. Broadcast has more legal resolutions than DVD. Of course there are also the 60Hz-area resolutions. So the TV set doesn't bother with a native interlace mode; it's no use to it. It actually has to de-interlace and resize EVERY signal. I recommend "HD Ready" sets only for non-HD.

You can also see why an SD 720x576 LCD is so poor compared with a CRT, especially on an analogue source. To display analogue properly you need to sample at typically 1440x576 and then anti-alias to the screen's native resolution. Ironically the "HD Ready" sets, which are not much good for HD, are thus much better for analogue than the basic LCDs.

Why am I writing all this? Well, in creating material for your 405-line player, or designing an FPGA Aurora-like box, you nowadays need to understand digital sources. You need to understand the trade-offs of different de-interlacing schemes, resample to the new resolution and then re-interlace. You can't sensibly convert to 405 lines without de-interlacing first.

12" on 405 lines is the same quality of sharpness as a 42" HDTV at the same typical viewing distance! 405 lines is/was HDTV, as long as the screen isn't too big.
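A small worked example of the square-pixel arithmetic behind that list; the function name is mine and the figures are just the familiar 576-line and 1080-line cases from above.

```c
/* Rough illustration: a 576-line picture is 576 active lines high whatever
 * its stored width, so the square-pixel width a set has to resample to
 * depends only on the display aspect ratio:
 *   4:3  -> 576 * 4/3  = 768
 *   16:9 -> 576 * 16/9 = 1024
 * A 720x576 anamorphic 16:9 picture therefore has to be stretched from 720
 * to 1024 "square" pixels, then resized again to the panel's native grid. */
#include <stdio.h>

static int square_pixel_width(int active_lines, int aspect_num, int aspect_den)
{
    return active_lines * aspect_num / aspect_den;
}

int main(void)
{
    printf("4:3  576-line:  %d\n", square_pixel_width(576, 4, 3));    /* 768  */
    printf("16:9 576-line:  %d\n", square_pixel_width(576, 16, 9));   /* 1024 */
    printf("16:9 1080-line: %d\n", square_pixel_width(1080, 16, 9));  /* 1920 */
    return 0;
}
```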

#87
Retired Dormant Member
Join Date: Dec 2003
Location: North London, UK.
Posts: 6,168
While I agree with most of your previous post, I must take issue with this point. Fortunately both 405 and 625 are 50Hz interlaced systems, hence you can do a respectable conversion within each field. It is theoretically better to de-interlace the 625 source, convert to 405 and then re-interlace, but AFAIK nobody has ever done this, and I suspect that the practical gain would be very small indeed.

The BBC's CO6/509 (digital, 4-line intrafield interpolator) and the Aurora (3-line intrafield interpolator) are both as close to perfect as is needed in any practical application. The Pineapple converter had a framestore and offered a frame mode; there was no proper de-interlace, so it looked great on stationary pictures but grim on any movement. Since the Pineapple had very simple 2-line interpolation, it can't really be used to judge the effect of de-interlacing.
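For anyone wondering what an intrafield interpolator actually does, here is a hedged sketch of the simplest 2-line kind (the sort attributed to the Pineapple above). The Aurora and CO6/509 use more lines and better filter weights; the line counts and fixed-point precision here are only rough placeholders.

```c
/* Sketch of a 2-line intrafield interpolator: each output line is a weighted
 * mix of the two nearest source lines in the SAME field, so no field store
 * or de-interlacing is needed.  IN_LINES/OUT_LINES are approximate active
 * line counts per field, not the Aurora's real parameters. */
#include <stdint.h>

#define WIDTH     720
#define IN_LINES  288   /* roughly the active lines in one 625-line field */
#define OUT_LINES 188   /* roughly the active lines in one 405-line field */

static void convert_field(const uint8_t in[IN_LINES][WIDTH],
                          uint8_t out[OUT_LINES][WIDTH])
{
    for (int y = 0; y < OUT_LINES; y++) {
        /* Position of this output line within the input field, in 1/256ths. */
        long pos  = (long)y * IN_LINES * 256 / OUT_LINES;
        int  line = (int)(pos >> 8);
        int  frac = (int)(pos & 0xFF);                 /* weight of the lower line */
        int  next = (line + 1 < IN_LINES) ? line + 1 : line;

        for (int x = 0; x < WIDTH; x++)
            out[y][x] = (uint8_t)((in[line][x] * (256 - frac) +
                                   in[next][x] * frac) >> 8);
    }
}
```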

#88
Rest in Peace
Join Date: Jul 2011
Location: Bridgnorth, Shropshire, UK.
Posts: 787
At the risk of going off topic, if we haven't already: I find a field frequency of fifty Hertz a bit flickery, so twenty-five would be intolerable. My understanding of interlace is that it reduces flicker without causing a massive increase in bandwidth.

Talking of flickery lights - I am particularly sensitive to such things. I am frequently distracted by LED rear lights, the lights on the console at crossings and those new-fangled LED cat's eyes. Why don't they make them switch faster?!!!

#89
Retired Dormant Member
Join Date: Dec 2003
Location: North London, UK.
Posts: 6,168
Subjective flicker depends heavily on the angle the screen subtends at the eye. Flicker is mainly seen in peripheral vision. At normal TV viewing distances flicker is a lot less obvious than on computer monitors, which are usually much closer to the eye.

#90
Heptode
Join Date: Jul 2008
Location: Selby, North Yorkshire, UK.
Posts: 979
Slightly off topic, but it fits nicely with the current discussion: various film studios are now shooting at 48fps (for example 'The Hobbit').

Apparently this is because the limitation of 24fps is less relevant now that most cinemas in the target audience areas are no longer using film, and because of the Blu-ray/HDTV market (and whatever will supersede it). It still doesn't help the 60Hz areas with their 3:2 issues.

#91
Retired Dormant Member
Join Date: Dec 2003
Location: North London, UK.
Posts: 6,168
Standards converters are now good enough to do a good conversion between almost any arbitrary pair of standards, especially if no interlace is involved. This kind of technology isn't cheap to implement in real time, though there's no reason to do it in real time when preparing a film for DVD release. Whether film distributors actually take the necessary care is another matter.

#92
Retired Dormant Member
Join Date: Jun 2011
Location: Berkshire
Posts: 389
Some people are more susceptible to it than others.

#93
Retired Dormant Member
Join Date: Jun 2010
Location: Co. Limerick, Ireland.
Posts: 1,183
Quote:
"Generally you can't sensibly change resolution without de-interlacing first."

From the 405-625 crossover era until the last few years, "deinterlacing" was poor, so certainly historically it made more sense not to de-interlace. If you were designing from scratch today for multiple resolutions, frame rates and source types, you likely would de-interlace. I think that using VLC, FFmpeg or other common PC tools, or modern GPU engines, you really have to de-interlace first before resizing.

#94
Heptode
Join Date: Feb 2010
Location: Duffort, Gers, France
Posts: 706
I read somewhere that the adaptability of the human brain also plays a role. TVs don't flicker when you look straight at them because we spend so much time looking straight at them that the brain adapts to filter out the annoying flicker. One problem with computer monitors is that they use frequencies other than 50Hz so until you get desensitized you notice 60Hz or 70Hz even though 50Hz might not be a problem. Nowadays with LCDs it doesn't really matter anyway.
__________________
Stuart
The golden age is always yesterday - Asa Briggs

#95
Retired Dormant Member
Join Date: Jul 2005
Location: West Yorkshire, UK.
Posts: 1,700
We don't need to understand any of this when the developers of the tools we have available already do. Stuff data in one end of FFmpeg (practically any resolution, encoding or container; progressive, interlaced, whatever), pull the handle and data falls out of the other end. If you look at the documentation you'll find it has several de-interlacing algorithms, and I think it'll attempt automatic 3:2 pull-down, inverse telecine, etc. (not something I've needed to do). You just tell FFmpeg what you want to get out and it handles everything else. IME it does a reasonable job with default settings, though there are plenty of options available to tweak it if needed.

In this instance, all I've aimed to do is produce data in the format required by Karen's player. FFmpeg, the bit of C code and the script tying them together do that. Refining it to produce better results can happen once there are some results. One assumption I've made is that the frame rate of the video being converted is 25fps; I haven't passed FFmpeg any options for frame rate conversion. I think that's a reasonable assumption; Karen and I are both in the UK, so the majority of source material is likely to be 25fps.

It's the availability of tools such as FFmpeg which makes this project practical. Converting source material to the required format only involves simple 'massaging' of the output of FFmpeg and doesn't require an in-depth knowledge of decompression, de-interlacing, scaling etc.; that's already taken care of.

But that's only because I can choose from several de-interlacing algorithms on MythTV from my armchair with the remote, via the on-screen setup menus. I've found it's better not to de-interlace. The choice seems to come down to 'combing' on motion or a soft picture, and subjectively I prefer the remarkably crisp picture I get on 405 with de-interlacing turned off. (I remember several people commented on the excellent picture quality at the NVCF demo once I'd hit the Pye monitor hard enough to get it to work. I had de-interlacing turned off then, too.)

So normally if the source material is interlaced (e.g. DVB-T), the display device is interlaced (e.g. a 'proper' 405/625/819 CRT telly) and the frame rates match, I don't bother de-interlacing. IMO it isn't necessary and the picture quality is better (noticeable on a monitor, probably less so on a TV set).

Kat

#96
Retired Dormant Member
Join Date: Jul 2005
Location: West Yorkshire, UK.
Posts: 1,700
My pleasure (I enjoy hacking little bits of code together.)
I figured that, as you mentioned your Unix/C programming is a little rusty, one way back into it is to be given something which vaguely works as a starting point rather than starting from scratch. Consider it "unfinished" (please); it's not up to my usual standards. For a start, that main 'while' loop terminates when 'fread' returns no items rather than when end-of-file is reached; it just happens to work, but it's wrong. Obviously there's no error handling anywhere either, but IMO it's easier to follow what it's doing without it being twice as long and checking for and handling all possible error conditions.

In this instance, with only one input and one output, it seemed simpler to read from stdin and write to stdout; it avoided any mucking around with parsing/validating command line arguments. To handle audio as well, it'd need two inputs. I think it'd be possible to create another stream, tell FFmpeg to write audio to it and have the 'data massaging' program read from that as well. The program would then need to accept an argument - the stream FFmpeg is writing to. Or use two streams, one for audio and one for video, rather than reading from stdin at all. I think it's preferable to pipe the output(s) of FFmpeg into a program and then redirect the output of that to the CF card; it saves having to have enough disk space for temporarily storing large files.

If you've got it all installed, you should have man pages for the standard (and other) C libraries; e.g. 'man 3 fread' brings up the appropriate 'Programmer's Manual' page on my system. KDE's 'help' system will show man (and info) pages too; I forget there's GUI software! Surely the X Window System exists just so you can have loads of xterms open at once...

I hope you don't mind me joining in. A lot of what you need at the 'PC end' is something I've already been thinking about for another project.

Kat
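For what it's worth, here is a minimal sketch (not Kat's actual program) of the stdin-to-stdout block-copying skeleton being described, with the fread/end-of-file termination done the 'proper' way and basic error reporting added. The 512-byte block size is just a placeholder; the real record size depends on the player's data format.

```c
/* Minimal sketch: copy fixed-size blocks from stdin to stdout, terminating
 * on genuine end-of-file rather than simply on a zero return from fread(),
 * and reporting read/write errors. */
#include <stdio.h>
#include <stdlib.h>

#define BLOCK_SIZE 512   /* hypothetical record size */

int main(void)
{
    unsigned char buf[BLOCK_SIZE];
    size_t n;

    /* Read until fread() returns nothing, then ask why it stopped. */
    while ((n = fread(buf, 1, sizeof buf, stdin)) > 0) {
        /* ...massage the data here before writing it out... */
        if (fwrite(buf, 1, n, stdout) != n) {
            perror("fwrite");
            return EXIT_FAILURE;
        }
    }
    if (ferror(stdin)) {          /* distinguish a read error from EOF */
        perror("fread");
        return EXIT_FAILURE;
    }
    return EXIT_SUCCESS;          /* feof(stdin) is the normal exit path */
}
```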

#97
Rest in Peace
Join Date: Jul 2011
Location: Bridgnorth, Shropshire, UK.
Posts: 787
I've had this player idea in my head for some time now but knew I lacked the skills needed to prepare the data required by the card. In some ways I joined this forum in the hope that I could link up with someone who could fill that gap. I am not disappointed!

You've all but solved the data preparation problem, Kat, and I am eternally grateful. I appreciate that it is not finished, but I'm sure I can manage now that you've done the difficult part. It means I can concentrate on the bit I do best - low-level firmware (and hardware, provided it doesn't involve a microscope).

I can prove the PIC code responsible for video timing using the Microchip simulation tools, but ultimately I will need a monitor or television to really prove it works. I may have a transport problem there. Does the Aurora do 405 to 625 conversion?
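As an aside, a rough sketch of the basic timing numbers any 405-line generator has to hit, derived simply from 405 lines and 50 interlaced fields per second; the constant names are mine and sync pulse widths are deliberately left out, so check the real specification before committing PIC code.

```c
/* Basic 405-line (System A) timing figures, derived from 405 lines per
 * frame and 25 frames (50 interlaced fields) per second.  Sync widths and
 * porches are NOT included here. */
#define LINES_PER_FRAME   405
#define FRAMES_PER_SEC    25
#define FIELDS_PER_SEC    50

#define LINES_PER_SEC     (LINES_PER_FRAME * FRAMES_PER_SEC)  /* 10125 Hz  */
#define LINE_PERIOD_NS    (1000000000L / LINES_PER_SEC)       /* ~98765 ns */
#define FIELD_PERIOD_US   (1000000L / FIELDS_PER_SEC)         /* 20000 us  */
/* 202.5 lines per field: the odd half-line is what makes interlace happen. */
```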

#98
Retired Dormant Member
Join Date: Jul 2005
Location: West Yorkshire, UK.
Posts: 1,700
As I mentioned, the data preparation part is related to another project. Someone once wrote of standards conversion with a PC, "So when you say you have built a converter with a cast off 486 PC I won’t believe you" and, "I still believe that you can’t do it with an old 486".
I'm still not so sure about full standards conversion (I haven't ruled it out), but I reckon a 405-line player is viable: pulling pre-prepared data (512 bytes per line with embedded audio) from a hard drive partition, then throwing it at a graphics card and sound card. Very similar to what you're doing, just with ancient PC hardware.

Embedding the audio in each line of video is inspired by the BBC's 'Sound-in-Syncs' distribution system, which I read about in Practical Television, January 1969 issue. This inserts two 10-bit samples in each line sync pulse (625-line system). They used a compressor/expander and pre-emphasis/de-emphasis which "improves the signal-to-noise ratio of the p.c.m system by 13dB and makes the 10-digit system better in this respect than a 12-digit one without a compressor-expander system."

Finding the article again answers the earlier question about audio bandwidth: "In addition to the saving in line rental that this gives there is an appreciable improvement in the quality of the sound signal in the network, the sound bandwidth being increased from about 10kc/s to approximately 14kc/s."

So if your audio is at least 12-bit and has a bandwidth somewhere north of 10kHz, preferably nearer 14kHz, you're roughly matching the BBC.

Kat
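Purely as an illustration (the real record layout is whatever Kat and Karen settle on), here is one way two 10-bit samples might be packed into the header of a 512-byte line record, loosely in the spirit of the Sound-in-Syncs idea above; the layout, sample width and function name are all assumptions.

```c
/* Hypothetical sketch only: pack two 10-bit audio samples into the first
 * four bytes of a fixed 512-byte line record, leaving the remaining bytes
 * for that line's video samples. */
#include <stdint.h>

#define LINE_RECORD_SIZE  512   /* assumed record size per TV line        */
#define AUDIO_HEADER_SIZE 4     /* two 10-bit samples fit in four bytes   */

static void pack_line_audio(uint8_t record[LINE_RECORD_SIZE],
                            uint16_t s0, uint16_t s1)   /* samples 0..1023 */
{
    uint32_t packed = ((uint32_t)(s0 & 0x3FF) << 10) | (s1 & 0x3FF);
    record[0] = (uint8_t)(packed >> 16);   /* top 4 bits of the pair  */
    record[1] = (uint8_t)(packed >> 8);
    record[2] = (uint8_t)(packed);
    record[3] = 0;                         /* spare/flag byte, unused */
}
```

If the player also carried two samples per 405-line scan line, the audio sample rate would be 405 x 25 x 2 = 20,250 Hz, i.e. roughly 10kHz of usable bandwidth; the BBC's 625-line version gets its ~14kc/s from the higher 15,625 Hz line rate.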

#99
Dekatron
Join Date: May 2006
Location: Invercargill, New Zealand
Posts: 3,375
I'm a bit late 'in' to this, but would a band I NZ channel 2/3 modulator be any use, for the vision part at least? I've got one here out of an old VCR that's unlikely to see much use, yours for the cost of postage. Obviously the sound part will be no use.
It's probably the same as the ones I sent Kat years ago (see here). Channels 2 and 3 are 55.25 MHz and 62.25 MHz vision respectively. I'm not aware of anything around for channel 1 (45.25 MHz), although Jaycar used to do an AU channel 0/1 (46.25 / 57.25) one.

#100
Retired Dormant Member
Join Date: Dec 2003
Location: North London, UK.
Posts: 6,168
Kat raises a number of interesting points in post #98. The "old 486" comment was mine, in one of my standards conversion articles.
The BBC SIS parameters were determined by what was feasible at the time and also by what would survive journeys over long analogue lines. Karen, your system is not constrained by these. You should be able to find the original BBC reports on the subject here: http://www.bbc.co.uk/rd/publications/rdreports.shtml