Talk:Interlaced video

More disadvantages

Another disadvantage of interlacing is that it can only be displayed as-is, and only on a CRT, without causing problems. Displaying each field or frame twice to reduce flicker (which occurs below roughly 70 fps) will probably make the interlace artifacts too visible and ruin the image, unless complex deinterlacing (which can introduce problems of its own) is performed first. (So-called "100 Hz TVs" are common in Europe today.) The same problems occur when interlaced material is shown on a flat-panel display such as an LCD or plasma monitor, so interlace is not suitable for video distribution today.

That should be mentioned in the article. Actually I think this is a bigger disadvantage of interlacing than the "major" one about reduction of vertical resolution and flickering as mentioned in the article already.

ABostrom 01:37, 2004 Dec 20 (UTC)

I agree with Atlant that the "a" belongs with "frequency". Pdn 03:01, 16 Mar 2005 (UTC)

Terminology Origin

The page currently says "It is sometimes called the top field (since the top line is numbered one, which is odd)". I'm not experienced in this terminology, so maybe that is the correct origin, but a more plausible explanation to me would be that the odd lines are called the top field because if you split the screen into pairs of lines, the odd lines are the ones on top in each pair. 129.110.240.1 00:00, 19 Mar 2005 (UTC)

Flicker

I am not an expert on the subject; however, my understanding is that the interlaced scanning method has a greater level of flicker than non-interlaced (progressive) scanning methods. Here, however, the author states that "..This takes a finite length of time, during which the image on the CRT begins to decay, resulting in flicker. An interlaced display reduces this effect.." which seems to contradict what I have read in other sources, including the Wikipedia article on progressive scan. Maybe someone who knows for certain could rectify this.

There are several aspects to flicker, which is why one can get different answers depending on the exact question asked.
If you think of the screen "as a whole", then a screen that is written from top to bottom 60 (50) times per second appears to flicker a lot less than a screen that is written from top to bottom 30 (25) times per second. So considering the screen as a whole (say, looking at it from a distance), interlace definitely helps suppress this sort of flicker.
But then we have the problem of some very small detail on the screen that occupies only one scan line (say, part of some white line-art shown on an overall black screen). Because the object is only one scan line high, this object is only illuminated with every even or odd field, so even with interlacing, the object still flickers at 30 (25) Hz. So in this case, interlace doesn't help the flicker. And if the object is displayed on some sort of (say) uniformly grey screen, the object will appear to flicker much more than the screen as-a-whole. Because of this, systems that generate captions and titles for TV usually take a lot of care to not draw objects that are just one scan line high.
(Did this explanation help?)
Atlant 00:06, 12 May 2005 (UTC)[reply]
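
To make the explanation above concrete, here is a minimal Python sketch (illustrative NTSC-style numbers are assumed; they are not taken from this discussion) showing why the screen as a whole refreshes at the field rate while a detail only one scan line high refreshes at half that rate:

    # Flicker sketch: which fields light a given object on an interlaced display.
    FIELD_RATE_HZ = 60        # fields drawn per second (assumed NTSC-style value)
    LINES_PER_FRAME = 480     # visible lines, interleaved as two 240-line fields

    def refresh_rate_of_object(top_line, height_in_lines):
        """How many times per second an object is redrawn.

        Even-numbered lines belong to one field, odd-numbered lines to the
        other. An object is refreshed at the full field rate only if it spans
        lines from both fields; otherwise it is lit by every other field and
        flickers at half the field rate.
        """
        lines = range(top_line, top_line + height_in_lines)
        fields_hit = {line % 2 for line in lines}
        return FIELD_RATE_HZ if len(fields_hit) == 2 else FIELD_RATE_HZ / 2

    print(refresh_rate_of_object(0, LINES_PER_FRAME))  # 60.0: the whole screen
    print(refresh_rate_of_object(100, 1))              # 30.0: a one-line detail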

Comparing interlace to non-interlace

There is a confusion that runs through this article when comparing interlace to non-interlace. What exactly is being compared? That is, what factors are changing and what factors are being held constant? (Scan rate, lines per frame, etc).

The muddle starts in the first paragraph. "The method causes less visible flickering than non-interlaced methods." This is true under some conditions and not true under others. It all depends on what is being compared.

Another problematic statement is, "The major disadvantage of interlacing is the reduction in vertical display resolution." One can just as easily say that the major advantage of interlace is the increase in vertical resolution. Again, it depends on what is being compared. If you compare an N-line non-interlaced system with an N*2-line interlaced system (a very reasonable comparison to make), which has more vertical resolution? Mirror Vax 09:06, 14 August 2005 (UTC)[reply]

The problem here is that in many places the article compares regular interlaced scanning to a hypothetical 30-frame progressive scan rate that has probably never been seen since the 1920s, and would flicker like a strobe light. No one advocates this. Anything currently calling itself "Progressive scan" must rely on either frame repeating built into the set (as in HDTV) or conversion to PAL/NTSC before broadcast (as in film transfers and "film look" video). For the rewrite, all such comparisons should be made between technologies of similar time periods and realities. Algr

I'm certainly no expert on this subject, but it seems to me the first item in "The alternatives to interlacing" is inaccurate. "Doubling the bandwidth used and transmitting full frames instead of each field" describes a progressive input at the same frequency as the interlaced one, which should certainly give you a better image. I.e., instead of showing only half the pixels in an image every 60th of a second, you would be scanning and transmitting all the pixels every 60th of a second. How could that not result in better picture quality? You would not have twitter, nor combing. (BTW, I'm not saying progressive 60Hz broadcast is practical or desirable) cortell
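
The bandwidth question raised in the two comments above comes down to simple pixel-rate arithmetic. A rough sketch, using assumed 625-line/50 Hz-style figures rather than anything stated in this discussion:

    # Uncompressed pixel rates for the configurations being compared.
    def pixel_rate(width, lines, images_per_second):
        """Pixels transmitted per second, before any compression."""
        return width * lines * images_per_second

    interlaced_50i  = pixel_rate(720, 288, 50)  # 50 fields/s, 288 lines per field
    progressive_25p = pixel_rate(720, 576, 25)  # 25 full frames/s
    progressive_50p = pixel_rate(720, 576, 50)  # 50 full frames/s

    print(interlaced_50i == progressive_25p)  # True: same raw bandwidth
    print(progressive_50p / interlaced_50i)   # 2.0: "doubling the bandwidth"

In other words, an N-line interlaced signal and an N-line progressive signal at half the image rate cost the same raw bandwidth; sending full frames at the field rate costs twice as much.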

More cleanup needed!

I was going to put the POV-section and inappropriate tone headers on the "Interlacing as a data compression technique" section, but I decided to take a crack at 'em myself. Some parts I left alone (save for fixing typos) because I couldn't immediately come up with a better wording. On the other hand, some parts (such as the "like a tank" bit) were easy to replace. I hope that the changes I made didn't introduce any technical inaccuracies. Meanwhile, this article still needs a good going-over to improve the tone and remove spelling and punctuation errors. It may also be in need of technical clarification, as another person commented, but I'll leave that to those with more knowledge of the subject than I. --Jay (Histrion) (talkcontribs) 19:02, 1 December 2005 (UTC)[reply]

Removal of the "Interlacing as a compression technique" section

I object to the removal of the "Interlacing as a compression technique" section in the following changeset:

(cur) (last) 12:10, 3 December 2005 Smyth (→Interlacing as a data compression technique - As the article makes quite clear, there never were any full frames.)

Also, the rationale presented by the editor is bogus. Of course there were full frames. The image focused by the lens on the back of the vidicon tube or on a CCD, and recorded in its sensors, is a full frame. In both cases only half of the lines of sensors are read out and encoded; the other half is simply reset in preparation for the next readout cycle.

The lines were ignored because 60 fields per second produces less eyestrain on a CRT than the alternative 30 frames per second. Saving bandwidth has nothing to do with it at all, and comparing interlacing's 2:1 "compression ratio" with MPEG's 22:1 is ridiculous. Much of the other text I removed was either out of place (discussion of how deinterlacing is computationally intensive) or completely wrong (stating that progressive devices are twice as complex as interlaced ones, or that interlaced video is impossible to edit).
But the article was missing some discussion of how interlaced signals interact with digital compression, so thanks for adding it. – Smyth\talk 17:11, 4 December 2005 (UTC)[reply]
Interlacing is clearly intended to save bandwidth when transmitting video-based material (50/60 Hz), because 50 progressive frames take twice the bandwidth of 25 interlaced ones. I wasn't talking about film-based material (24/25 Hz, as well as NTSC 30 Hz), where interlacing serves a different role: flicker avoidance. The point of the interlacing-versus-MPEG comparison was that digital compression is better suited to saving bandwidth than interlacing when transmitting video-based material. However, I admit that this could be put more clearly, without the detailed comparison.
And of course progressive devices are twice as complex, because the MPEG decoder requires 1) frame buffers twice as large, 2) twice the CPU power for block transformation, twice the CPU power and bandwidth for encoding the transformed blocks into the stream, and storage twice as capacious and twice as fast; however, it is stupid to limit the frame format to the capabilities of current technology, because the technology will develop tomorrow, but the recordings/works of art will be spoiled by the limitations today.
I will re-edit the article taking this into account, assuming that your misunderstanding was solely due to the unexplained difference between video-based and film-based material. Greengrass
I don't think that anyone has ever seriously suggested that capturing 50 or 60 progressive frames a second is a sensible use of either bandwidth or storage. The proper comparison is between 2N fields/sec and N frames/sec, not 2N fields/sec and 2N frames/sec. – Smyth\talk 19:59, 4 December 2005 (UTC)[reply]
A progressive image is universally considered superior to an interlaced one, whether it is filmed at 24 frames a second or 50 or 60 frames a second. Bandwidth or storage use is of little importance if the picture looks bad. Anyway, one cannot say I did not cover the topic of bandwidth in depth. Greengrass
60 progressive frames per second IS sensible and is precisely what ABC and ESPN are doing with their 720p sports broadcasts. 24-30 frames per second is never adequate for sports. Interlace creates obvious line crawl around the lines on the field, even at 1080i. Algr 09:12, 27 January 2006 (UTC)[reply]

Interlace illustration

I'm new here so I thought I'd put this picture up for review before putting it up on the real article:

[animation comparing progressive and interlaced versions of the same image]

This picture illustrates how sharpness is lost due to interlace. The left and center images are identical except for interlace. But the center image has too much flicker because objects in the picture tend to line up with just one field. In the flag, for example, there are areas where one field hits mostly red lines, while the other hits mostly white.

Real interlaced transmissions deal with this by blurring out such details, as seen in the image on the right. However, while a line doubler might restore such an image to progressive scanning, the necessary blurring would remain.

A technical note: this animation runs at 30 Hz, not the 50-60 Hz that real monitors use, and may appear even slower on your monitor. The effect of interlace is therefore magnified. Algr Dec 16, 2005
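
For anyone who wants to reproduce this kind of illustration, the sketch below shows one way to split a frame into its two fields and alternate them in a GIF. It assumes a hypothetical source file "frame.png" and the Pillow and NumPy libraries; it is not how the original animation was made.

    from PIL import Image
    import numpy as np

    frame = np.array(Image.open("frame.png").convert("RGB"))

    def field(img, odd):
        """Keep one field: black out every line belonging to the other field."""
        out = img.copy()
        out[(0 if odd else 1)::2] = 0   # zero the other field's scan lines
        return out

    # Alternating the two fields mimics interlaced display. A GIF viewer runs
    # far slower than a real 50/60 Hz monitor, which is why the flicker in
    # such animations looks exaggerated, as noted above.
    fields = [Image.fromarray(field(frame, odd)) for odd in (False, True)]
    fields[0].save("interlace_demo.gif", save_all=True,
                   append_images=fields[1:], duration=33, loop=0)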

Interesting, but it needs lots of caveats about how the interlaced frames will be torn to pieces because they aren't synchronized with your monitor's refresh rate. – Smyth\talk 13:20, 17 December 2005 (UTC)[reply]
Well, it will be a while before we get readers who have never seen an interlaced TV set, so I think a warning like "The effects of interlace are magnified for illustrative purposes." should do. In 20 years everyone will be able to run the animation at 50 Hz, and it will let people know what interlace used to look like. Algr Dec 17, 2005
The problem is, the effects of interlace aren't strictly "magnified", because you don't get frame tearing on an interlaced TV set, and so the animation makes interlacing look much worse than it actually is. Would it still be instructive if you reduced it to 10Hz, or even less? – Smyth\talk 11:24, 18 December 2005 (UTC)[reply]
Tearing, huh? I'm not seeing that on my screen - maybe because I have an LCD? Here it is at 10 Hz: Algr 16:54, 18 December 2005 (UTC)[reply]

[the same comparison animation, slowed to 10 Hz]

Much better. But why are the colors different in the "progressive" image? – Smyth\talk 20:04, 18 December 2005 (UTC)[reply]
I dimmed the progressive image to match the loss of brightness caused by black lines over the interlaced version. Strangely, one or the other looks brighter depending on what angle I view the monitor at. The brightness matches when I view the illustration at a 90° angle. (Face on to the monitor.) Algr 22:26, 18 December 2005 (UTC)[reply]

I've decided I'm not too happy with my drawing. If someone wants to offer up something better, I can put it under the scan lines. A good image should have near-horizontal lines or texture approaching the width of the scan lines. Algr 08:46, 24 January 2006 (UTC)[reply]


Get rid of this image. In a properly designed interlaced system a static image would look nearly the same as the progressive one. This is disinformation. 221.28.55.68 21:08, 23 March 2006 (UTC)[reply]

Has there ever been a "properly designed interlaced system"? VGA exists because the effects of interlace couldn't be dealt with. (The Amiga tried to do it. It worked for video production but was a liability everywhere else.) Why is progressive scan output of DVDs such a big selling point then? Algr 21:53, 23 March 2006 (UTC)[reply]
Thank you for illustrating. It is helpful. Avé 23:26, 23 March 2006 (UTC)

I made a new comparison animation, but I don't think that I made it right; here it is:
File:Progressive vs interlace.gif
Make a new one please...--Finest1 02:27, 25 June 2006 (UTC)[reply]

It looks like you scaled the lines after they were placed on the image, because they look much too fine, and rather irregular. Also, what is the copyright status of that image that the lines are on? Algr 08:58, 25 June 2006 (UTC)[reply]

Uh... I don't know. Delete it!--Finest1 19:22, 28 June 2006 (UTC)[reply]

I disagree with "24-30Hz is a sufficient video frame rate for smooth motion"

The statement "24-30Hz is a sufficient video frame rate for smooth motion" doesn't ring true for me. 50i video is much smoother than 25p. I can't think of a good alternative though.

I think you may be confusing the field rate with the frame rate. Otherwise, an entire film industry disagrees with you :-) (There's no doubt that 50 frames/second looks better than 24 frames/second, but "sufficient for smooth motion" is a lesser criterion.)
Atlant 12:45, 12 February 2006 (UTC)[reply]
What I suppose I mean is that 50i material (i.e. sports) is clearly smoother than and easily distinguishable from 25p material, which has a definite "stutter" (modern drama). I wouldn't say that 25p is sufficient for smooth motion, but it's enough to maintain the illusion of motion. 12p is even enough sometimes (at least when you're watching Wile E. Coyote chase Road Runner) to maintain that illusion, but that doesn't mean it's smooth.
What you are describing is called high motion, which I wrote about. I agree that the second bit about more frames being wasteful is wrong, so I've changed this whole bit. BTW, you might want to sign your posts by typing four of these "~" at the end. Algr 16:04, 5 March 2006 (UTC)[reply]

Just a thought

As a hobbyist video editor I get to deal quite a lot with interlace and its artefacts in video footage. I can recommend one very good and reliable source of information about interlace: http://www.100fps.com. I learned a great deal about interlace there. And indeed, a lot of this Wikipedia article is messy. Using 100fps as a reference we could correct a lot of things. Robert 22:26, 1 March 2006 (UTC)[reply]

Thanks for the reference. I'll include it. Algr 16:06, 5 March 2006 (UTC)[reply]


Needs more work

This is supposed to be an encyclopedia! If you don't know, don't guess! I am not enough of an expert to author a complete rewrite, but I know much of this is totally wrong.

  • Example from "Application" section:

"The film solution was to project each frame of film twice, a movie shot at 24 frames per second would thus illuminate the screen 48 times per second."

WRONG! Think about it a little bit before posting. MOVIES ARE SHOT (AND SHOWN) AT 24 FRAMES PER SECOND. Period.

"Illuminating the screen 48 times per second" implies displaying the frame, then (unnecessarily!?) blanking, then displaying the same frame again.

All of this discussion about flicker is confusing and misleading. The perception of flickering doesn't depend on frame rate or field rate...it simply depends on HOW LONG THE SCREEN IS DARK or dimmed. More than maybe 1/60th of a second (give or take a zero...I'M JUST GUESSING. Somebody please look it up before moving this guess to the article) makes the viewer see flicker. Modern movie projectors can flip to the next frame very fast, blanking the screen for ever-smaller periods as the technology improves. I'm GUESSING that they're keeping the frame illuminated 75% of the time or more these days, thus taking less than 1/96th of a second to blank, move to the next frame, and unblank.

And, CRT makers can control the PERSISTENCE of the phosphor (how long it continues to glow after being struck by the electron beam during the scan.) They tune to the application. If the phosphor is too fast for the frame rate, flicker will be seen; if it's too slow, moving objects will leave trails.

  • The first part of the "Application" section is confusing too, and goes against what we see in real life:

"When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent flicker. The exact rate necessary varies by brightness, with 40 hz being acceptable in dimly lit rooms, while up to 80 hz may be necessary for bright displays that extend into peripheral vision."

Firstly, movie theaters show movies at 24 Hz, not 40 Hz or 48 Hz, as explained above. Secondly, the wide screen at the theater extends way farther into one's peripheral vision than a television or desktop monitor, unless you have your face up to the glass. Thirdly, refresh rate is completely irrelevant to "flicker", as explained above. Fourthly, I don't think I agree with the assertion (guess?) that "the rate necessary varies by brightness". Brightness of what? The display? The environs? The display relative to the environs? If it's the latter (which seems to make the most sense), then movie screens are by far the brightest vs. a dark room. Movies have a 24 fps frame rate, and are certainly "acceptable in a dimly-lit room."

The whole "Application" section needs to be rewritten/expunged.

  • The section about interlacing as a compression method leaves me highly skeptical too. Some people guessing again, is my guess. I vote it be ripped out. My understanding is, there's no compression advantage to interlaced video. Higher field rate, lower resolution....twice as many half-size images to compress as progressive-scan. Halving the vertical resolution is unlikely to cut the compressed field size in half vs. a full frame; so if anything, there's probably a net loss.

At any rate I have no idea how to fix it. Rip out whole sections and rewrite them, maybe? But certainly the WRONG datum about 48 illuminations per second on movies needs to be gracefully done away with somehow.

Somebody help me out here, I'm afraid to even touch this article.

""Illuminating the screen 48 times per second" implies displaying the frame, then (unnecessarily!?) blanking, then displaying the same frame again." I believe that is exactly what is done, the reason being that blanking at 48Hz is less noticable than at 24Hz. Of course it would be better to actually display different frames, but for cost reasons film is limited to 24fps. I'm not sure what this has to do with interlacing, though. Mirror Vax 04:45, 20 March 2006 (UTC)[reply]
To the unsigned commenter: You are being quite rude with your tone, and you are also quite wrong. Please don't touch the article. I specifically wrote that article to address some common misperceptions that I have encountered. Do some open-minded research and you will find that the situation is quite different from what you have in mind. Algr 15:49, 20 March 2006 (UTC)[reply]

... My apologies. I did research it, and movie projectors DO blank at 2x the frame rate, or even 3x. So that part is OK. And, sorry if my tone was offensive. I'm new to this; I'll try to keep it scholarly and objective in the future. ...I'm still dissatisfied with the article as a whole though. --shyland
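
For what it's worth, the double/triple shuttering discussed above can be put into numbers with a small sketch (the duty-cycle figures below are assumptions for illustration, not measured projector data):

    FRAME_RATE = 24  # frames per second on film

    def dark_interval_ms(blades, open_fraction):
        """Length of each dark (blanked) interval, in milliseconds.

        blades: how many times the shutter interrupts the light per frame
                (2 = the classic 48 Hz double shutter, 3 = 72 Hz).
        open_fraction: fraction of each cycle the shutter lets light through.
        """
        flashes_per_second = FRAME_RATE * blades
        return (1.0 - open_fraction) / flashes_per_second * 1000.0

    print(dark_interval_ms(2, 0.5))   # ~10.4 ms of darkness per blank at 48 Hz
    print(dark_interval_ms(3, 0.75))  # ~3.5 ms of darkness per blank at 72 Hz

Shorter dark intervals are what suppress visible flicker, even though the frame rate itself stays at 24 per second.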

I'm still planning on working on it. I only rewrote the top part so far. Algr 04:33, 21 March 2006 (UTC)[reply]

I edited the title of this section to make it less offensive... I'll keep my hands off the article. Shyland 16:52, 21 March 2006 (UTC)[reply]

long-persistence phosphors

which used "slow" or long-persistence phosphor to reduce flicker.

This isn't correct. Long-persistence phosphors were never used in consumer devices - only for things like radar. On a TV set, the phosphor dims out within 30 scan lines or so (about an inch), so this has no effect on interlace either. See the photo here: Refresh rate Algr 06:14, 15 May 2006 (UTC)[reply]

Misunderstanding of afterglow and 'persistence of vision'.

  • There is no significant afterglow on a CRT TV display. I once took a photograph at high shutter speed to test this, and found that the picture faded over about a quarter of the picture height, in other words in 1/200th of a second. Nor is 'persistence of vision' the simple thing it seems. I believe, from experiments, that it is processed image content in the brain that persists, not the image on the retina. Interlacing works because of this. The brain does not combine subsequent frames; if it did we would see mice-teeth on moving verticals, as we do on computer images or stills, which in fact we don't see on a CRT display. The brain also cannot register fine detail on moving images, hence I think little is lost on a proper CRT interlaced display, while motion judder is reduced as well as flicker. Modern LCD and plasma displays seem to me to be fundamentally unsuited to interlaced video, since they necessitate de-interlacing in the TV with inevitable loss. In theory, it is not impossible to make an interlaced plasma or LCD display, in which the two fields were lit up alternately, but in practice this would halve brightness, even if the response were fast enough. In view of this, I think it is a great pity that the 1080i HD standard was created, since it is unlikely ever to be viewed except as a de-interlaced compromise on modern displays. If 1080p/25 (UK) were encouraged worldwide, then de-interlacing would not be needed. 1080p at 25 fps requires no more bandwidth than 1080i, of course, but it has the advantage of being close enough to 24 fps to avoid 'pull-down' on telecine from movies (in the UK we just run movies 4% fast and get smoother motion). It also fits well with the commonly used 75 Hz refresh rate of many computer monitors, which would just repeat each frame three times for smooth motion. In high-end TVs, processing using motion detection could be used to generate intermediate frames at 50 or 75 Hz, as was done fairly successfully in some last-generation 100 Hz TVs. Reducing motion judder in this way as an option is a better way of using motion detection than de-interlacing, because it can be turned off or improved as technology progresses. I note that the EBU has advised against the use of interlace, especially in production, where it recommends that 1080p/50fps be used as a future standard. --Lindosland 15:26, 23 June 2006 (UTC)[reply]
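
As an illustration of the de-interlacing trade-off described in the comment above, here is a minimal "bob" de-interlacer sketch (NumPy assumed; it is not the algorithm used in any particular TV). Each output frame is rebuilt from only one field, so half of the original scan lines have to be interpolated:

    import numpy as np

    def bob_deinterlace(frame, odd_field):
        """Rebuild a full-height picture from a single field.

        frame:     interleaved frame, shape (lines, width) or (lines, width, channels)
        odd_field: which field to keep (True keeps lines 1, 3, 5, ...)
        """
        start = 1 if odd_field else 0
        field = frame[start::2].astype(np.float32)
        out = np.repeat(field, 2, axis=0)          # double each kept line
        # Replace the duplicated lines with the average of their neighbours,
        # softening the line doubling at the cost of vertical resolution.
        out[1:-1:2] = (field[:-1] + field[1:]) / 2
        return out[:frame.shape[0]].astype(frame.dtype)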