Could you provide a (made-up) screenshot? ~~helix84 12:48, 11 January 2006 (UTC)
This artifact occurs on plasma televisions, too. Why is the article written specifically for video games? --18.104.22.168 03:28, 21 October 2007 (UTC)
Does this happen on progressive scan outputs, where the whole frame is pushed at once, or only on interlaced displays? Cherrera 06:39, 25 February 2006 (UTC)
- It happens on all displays: you can't push the entire image to the screen in a single instant. Screens are drawn line by line, whether it's lines 1, 2, 3, 4, 5, 6... on progressive scan, or 1, 3, 5... then 2, 4, 6... on interlaced. Tearing occurs whenever the image changes while the raster is not at the point of v-sync. 22.214.171.124 17:24, 18 June 2007 (UTC)
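The line-by-line scanout described above can be illustrated with a toy simulation (a sketch for illustration only, not real driver code): a "monitor" reads the frame buffer one row at a time, and if the renderer replaces the buffer mid-scan, the displayed image mixes two frames.

```python
# Toy simulation of raster scanout: the display reads the frame
# buffer one line at a time; if the renderer replaces the buffer
# mid-scan, the displayed image mixes two frames -- a "tear".

def scan_out(framebuffer, swap_at_line, new_frame):
    """Read lines top to bottom; the buffer is swapped partway through."""
    displayed = []
    for line in range(len(framebuffer)):
        if line == swap_at_line:      # renderer finishes a new frame mid-scan
            framebuffer = new_frame
        displayed.append(framebuffer[line])
    return displayed

old = ["old"] * 6   # frame currently being scanned out
new = ["new"] * 6   # frame the renderer just completed

torn = scan_out(old, swap_at_line=3, new_frame=new)
print(torn)  # ['old', 'old', 'old', 'new', 'new', 'new'] -- tear at line 3
```

The same logic applies to progressive and interlaced scan; only the order in which lines are read changes.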
we need an image to illustrate this. I'm a mutant geek, and even I'm not sure what it looks like. Sys Hax 08:29, 29 November 2006 (UTC)
- You've never seen tearing? Lucky you; it really annoys me in games, especially first-person shooters. Here is a screenshot I Googled, but don't just insert it into the article, as I have no idea of its copyright status: http://img141.imageshack.us/img141/7274/naamloos9ep.jpg Notice there appear to be four separate frames spliced together like a photo-fit; that's because the image in the V-RAM updated four times faster than the graphics card could push it to the monitor. 126.96.36.199 17:24, 18 June 2007 (UTC)
I just noticed another wiki has an illustration; maybe we can use it? http://www.mythtv.org/wiki/index.php/Frame_display_timing 188.8.131.52 17:27, 18 June 2007 (UTC)
I thought page tearing is only a major problem on CRTs, because CRTs scan using lines from top to bottom, whereas digital screens update at once. Part of this reasoning comes from the way the GunCon works: it relies on CRT scanline timings to determine its position, which is why it doesn't work on a digital display. Also, in the several years I've played PC games on LCD screens, I've never encountered page tearing. And while I've encountered poor timing artifacts in some movies, namely from DVDs, those were mostly due to poor de-interlacing or motion-compensation techniques. Besides, it doesn't make sense for digital displays to use scan lines. XenoL-Type 10:11, 23 November 2007 (UTC)
LCDs definitely experience tearing when playing PC games. I can turn off v-sync in Bioshock and within moments I'll notice the hideous screen tearing and have to turn v-sync back on.
It's already been covered that if you want to fix tearing, use v-sync. I will always turn on v-sync if the option is there, no question. But what if the option isn't there? (i.e. almost all console games) Is there ANYTHING you can do to get rid of the tearing? One of those new 120Hz TVs? (I doubt it.) Unfortunately, more and more console games are tearing without ANY v-sync option, and it's driving me nuts, to the point where I refuse to buy any game with confirmed tearing! --Exodite (talk) 07:33, 25 November 2007 (UTC)
I thought page tearing is only a major problem in CRTs because CRTs scan using lines from top to bottom, whereas digital screens update at once.
The image is transmitted to LCDs and CRTs the same way: pixel by pixel, row by row (DVI is essentially a digitized version of VGA, complete with sync and blanking). Tearing happens if the frame buffer gets updated with a new image while the image is being transmitted. The time it takes to transmit the image depends on the refresh rate and the amount of blanking, but it is slightly less than the refresh period, which is ~16ms at 60Hz, for example. Even if LCDs updated the screen instantaneously (impossible, because that would require infinite bandwidth), the DVI/VGA interface would make them susceptible to tearing. I don't know if this applies to laptops, which probably don't use VGA/DVI between the built-in display and the frame buffer. Totsugeki (talk) 13:53, 14 March 2008 (UTC)
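The timing Totsugeki mentions can be worked out directly: at 60 Hz the refresh period is 1/60 ≈ 16.7 ms, and the active scanout takes slightly less than that because of blanking. A back-of-the-envelope calculation (the line counts here are illustrative 1080p figures; real values come from the video timing standard in use):

```python
# Back-of-the-envelope scanout timing at 60 Hz.
# Line counts are illustrative (actual values depend on the video
# timing mode, e.g. as reported by the display's EDID).

refresh_hz = 60
total_lines = 1125        # lines per frame including blanking (illustrative)
active_lines = 1080       # visible lines (1080p)

refresh_period_ms = 1000 / refresh_hz                        # ~16.7 ms per frame
active_time_ms = refresh_period_ms * active_lines / total_lines

print(round(refresh_period_ms, 2))  # 16.67
print(round(active_time_ms, 2))     # 16.0
```

So roughly 0.7 ms of each refresh period is blanking; the rest of the time the link is busy transmitting visible lines, and a buffer update anywhere in that window can tear.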
The article says: "Screen tearing can be prevented by altering both the device outputting the video and also the device displaying it." How can altering the display device prevent tearing, when it is the output device that's feeding the display device with a torn image? .oisyn (talk) 12:06, 9 June 2008 (UTC)
- It is possible if you alter *both* ends of the chain, but it requires at least these things:
- The monitor would need its own page-flipping and double-buffering implementation.
- The monitor must be incapable of altering the currently displayed buffer.
- The monitor must be incapable of altering its back buffer except on a whole-frame basis.
- The monitor must be incapable of flipping between buffers except during its internal vertical sync/refresh.
- There would need to be a cable and protocol specialized for this kind of operation.
- The video card would need to insist on always using a graphics API and protocol that generates whole frames, from the biggest screen repaint to the lowliest mouse pointer effect, except for the absolute basics like VESA framebuffer or console text.
- The video card would need to completely insulate this stuff from the driver, relegating the user's software to (re)generating the virtual scene and executing various generic control functions like buffer swaps or rendering new frames.
- In the end, it's more trouble than it's worth, it breaks compatibility with a lot of software, and all you've done is moved the issue into the monitor (thus mostly re-inventing the dumb terminal). Vanessaezekowitz (talk) 07:06, 3 August 2009 (UTC)
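For comparison, the conventional fix on the video-card side, v-sync with double buffering, amounts to something like this (a minimal sketch for illustration; real implementations use the graphics API's buffer-swap call with v-sync enabled):

```python
# Minimal sketch of v-sync double buffering on the renderer side:
# the renderer draws into a back buffer, and the swap happens only
# at the (simulated) vertical sync, so the scanout never sees a
# half-updated frame.

def render_loop(frames_to_render):
    front = ["blank"] * 4            # buffer the display scans out
    back = ["blank"] * 4             # buffer the renderer draws into
    displayed = []
    for n in range(frames_to_render):
        back = [f"frame{n}"] * 4     # draw a complete new frame off-screen
        # ...scanout of `front` happens here, always a whole frame...
        displayed.append(list(front))
        front, back = back, front    # swap only at vertical sync
    return displayed

for frame in render_loop(3):
    assert len(set(frame)) == 1      # every displayed frame is whole, never torn
```

The cost of this approach, as the article notes, is that the renderer may have to wait for the next vertical sync before swapping, which is why disabling v-sync can raise frame rates at the price of tearing.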
I've rewritten the article and also included a prevention section since most people coming here will most likely be looking for a solution... Please do add what you know and some references are probably needed as well. ShadowFusion (talk) 11:58, 19 May 2008 (UTC)
I see tearing frequently occurring in DV if the tape is played back by a different camera model and brand than the one it was recorded with, but never with standalone DV decks used for playback. It can't be a display issue, as it's obviously inside the original stream: when capturing to PC via Firewire, the tearing is present in the resulting files on the hard drive. And it can't be a graphics card issue, because all you need to do is switch to the original recording camera model, or even better a standalone deck. For instance, I own two different JVC cams, a 1CCD and a 3CCD (the latter with more sophisticated error correction, even), and a tape recorded on one of them will always play back with tearing on the other, and vice versa. Same if recorded with a JVC then played back on a Canon cam, and vice versa; same goes for Sony, Panasonic, etc. (Something to do also with the fact that each company has their own brand DV codec with a different FourCC each (DVSD for MicrosoftDV, CDVC for the software version of CanopusDV...), perhaps?) This tearing is always coupled with a sharp noise or audio dropout. The article ought to be updated accordingly and not focus entirely on monitors, graphics cards, and video games. --184.108.40.206 (talk) 00:33, 3 January 2010 (UTC)
"Beam tracing" section
The beam tracing section describes a technique relevant to screen tearing; however, it has nothing to do with beam tracing, and beam tracing has nothing to do with screen tearing. Leave the content of the section in place, but change the title to whatever is the appropriate name for the technique (if it has such a name; otherwise call it "v-sync aware scheduling" or "alternative techniques" or ...). — Preceding unsigned comment added by 220.127.116.11 (talk) 20:16, 25 August 2012 (UTC)
- Very much agree with this... Since the original author did not reply, I took the liberty of doing this. --Mangostaniko (talk) 16:52, 13 March 2016 (UTC)
Please can someone indicate in the article how "tearing" should be pronounced, e.g. tearing as in ripping (to rhyme with "pairing"), or tearing as in crying (to rhyme with "hearing")? Ianhowlett (talk) 17:12, 15 July 2015 (UTC)