Screen tearing

From Wikipedia, the free encyclopedia
A typical video tearing artifact (simulated image)

Screen tearing is a visual artifact in video display where a display device shows information from multiple frames in a single screen draw.[1]

The artifact occurs when the video feed to the device is not synchronized with the display's refresh rate. It can be caused by non-matching refresh rates, in which case the tear line moves as the phase difference changes, at a speed proportional to the difference between the frame rates. It can also occur simply from a lack of synchronization between two equal frame rates, in which case the tear line sits at a fixed position corresponding to the phase difference. During video motion, screen tearing creates a torn look as the edges of objects (such as a wall or a tree) fail to line up.
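The relationship between rate mismatch and tear-line motion can be illustrated with a minimal simulation (a hypothetical sketch, not drawn from any real display API): the tear appears at the scanline the refresh has reached when the new frame arrives, and that phase advances each refresh by the ratio of the two rates.

```python
# Hypothetical simulation of where the tear line lands on successive
# refreshes when an unsynchronized source at `source_fps` feeds a display
# refreshing at `display_hz`. With equal rates the phase difference is
# constant, so the tear stays at one row; with unequal rates the phase
# drifts each refresh and the tear line moves.

def tear_positions(source_fps, display_hz, height=1080, refreshes=5, phase=0.25):
    """Return the tear line's scanline for each of `refreshes` refreshes.

    `phase` is the fraction of the refresh already scanned out when the
    new frame arrives; it advances by display_hz/source_fps (mod 1) per
    refresh, which is zero exactly when the rates match.
    """
    positions = []
    for _ in range(refreshes):
        positions.append(round(phase * height))
        phase = (phase + display_hz / source_fps) % 1.0
    return positions

print(tear_positions(60.0, 60.0))  # equal rates: tear fixed at row 270
print(tear_positions(59.0, 60.0))  # mismatched rates: tear drifts downward
```

With equal rates the tear line is stationary (a fixed phase difference); a 1 Hz mismatch makes it crawl down the screen by roughly 1/59 of the height per refresh.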

Tearing can occur with most common display technologies and video cards, and is most noticeable in horizontally moving visuals, such as slow camera pans in a movie or classic side-scrolling video games.

Screen tearing is less noticeable when more than two frames finish rendering during the same refresh interval, since the screen then shows several narrower tears instead of a single wider one.

Prevention[edit]

Ways to prevent video tearing depend on the display device and video card technology, software in use, and the nature of the video material. The most common solution is to use multiple buffering.

Most systems use multiple buffering and some means of synchronization of display and video memory refresh cycles.

Vertical synchronization[edit]

Vertical synchronization is an option in most systems in which the video card is prevented from doing anything visible to the display memory until after the monitor finishes its current refresh cycle.

During the vertical blanking interval, the driver orders the video card to either rapidly copy the off-screen graphics area into the active display area (double buffering), or treat both memory areas as displayable, and simply switch back and forth between them (page flipping).

Nvidia and AMD video adapters provide an 'Adaptive VSync' option, which enables vertical synchronization only when the frame rate of the software exceeds the display's refresh rate, and disables it otherwise. That eliminates the stutter that occurs as the rendering engine's frame rate drops below the display's refresh rate.[2]
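The stutter that adaptive schemes avoid can be made concrete with a small model (a hypothetical sketch of the rule described above, not vendor code): with vertical synchronization on, every frame waits for a refresh boundary, so the presented rate quantizes to the refresh rate divided by a whole number.

```python
import math

def presented_fps(render_fps, refresh_hz, vsync):
    """Effective on-screen rate. With vsync, each frame is held until the
    next refresh, so the rate snaps down to refresh_hz / n for integer n."""
    if not vsync:
        return render_fps
    return refresh_hz / math.ceil(refresh_hz / render_fps)

def adaptive_vsync(render_fps, refresh_hz):
    # Rule described in the text: sync only when rendering outpaces refresh.
    return presented_fps(render_fps, refresh_hz, vsync=render_fps > refresh_hz)

print(presented_fps(50, 60, vsync=True))  # 30.0: the classic vsync stutter
print(adaptive_vsync(50, 60))             # 50: vsync off, tearing tolerated
print(adaptive_vsync(90, 60))             # 60.0: vsync on, capped at refresh
```

A 50 frame/s engine on a 60 Hz display drops all the way to 30 frame/s under forced vsync; the adaptive rule instead accepts tearing in that regime to keep the higher rate.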

Alternatively, technologies like FreeSync[3] and G-Sync[4] reverse the approach, adapting the display's refresh rate to the content coming from the computer. Such technologies require specific support from both the video adapter and the display.

Complications[edit]

When vertical synchronization is used, the frame rate of the rendering engine gets limited to the video signal frame rate. That feature normally improves video quality but involves trade-offs in some cases.

Judder[edit]

Vertical synchronization can also cause artifacts in video and movie presentations, since they are generally recorded at 24–30 frames per second, significantly lower than typical monitor refresh rates. When such a movie is played on a monitor running at a typical 60 Hz refresh rate, the video player misses the monitor's deadline fairly frequently, and the intervening frames are displayed slightly faster than intended, resulting in an effect similar to judder. (See Telecine: Frame rate differences.)

Input lag[edit]

Video games, which use a wide variety of rendering engines, tend to benefit visually from vertical synchronization since a rendering engine is normally expected to build each frame in real time, based on whatever the engine's variables specify at the moment a frame is requested. However, because vertical synchronization causes input lag, it interferes with the interactive nature of games,[5] and particularly interferes with games that require precise timing or fast reaction times.

Benchmarking[edit]

Benchmarking a video card or rendering engine generally implies that the hardware and software render the display as fast as possible, without regard to the monitor's capabilities or the resulting video tearing. Otherwise, the monitor and video card would throttle the benchmarking program, causing invalid results.

Other techniques[edit]

Some graphics systems let the software time its memory accesses to a fixed point in the display hardware's refresh cycle, a technique known as using a raster interrupt or racing the beam. In that case, the software writes to the areas of the display that have just been scanned out, staying just behind the monitor's active refresh point. That allows for copy routines or rendering engines with less predictable throughput, as long as the rendering engine can "catch up" with the monitor's active refresh point when it falls behind.

Alternatively, software can instead stay just ahead of the active refresh point. Depending on how far ahead one chooses to stay, that method may demand code that copies or renders the display at a fixed, constant speed. Too much latency causes the monitor to overtake the software on occasion, leading to rendering artifacts, tearing, etc.
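The "staying behind the beam" variant can be sketched as follows (a hypothetical model with no real video hardware; the beam is just a loop index): each scanline is rewritten only after the beam has passed it, so a single sweep never mixes rows from two frames.

```python
# Hypothetical sketch of racing the beam: the software rewrites each
# scanline just after the refresh point leaves it, so one refresh never
# shows a mixture of old and new frames.

HEIGHT = 8
screen = ['old'] * HEIGHT   # stand-in for display memory, one entry per scanline

def scanout_and_update(new_frame):
    """One refresh: the beam sweeps top to bottom while the software
    follows just behind it, rewriting rows the beam has already shown."""
    shown = []
    for beam_row in range(HEIGHT):
        shown.append(screen[beam_row])  # the beam displays the current contents
        screen[beam_row] = new_frame    # safe to write: the beam is already past
    return shown

print(scanout_and_update('new'))  # the whole sweep still shows 'old': no tear
print(screen)                     # after the sweep, memory holds 'new'
```

The new frame becomes visible one full sweep later, which is the same latency vertical synchronization imposes, but without needing a second buffer.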

Demo software on classic systems such as the Commodore 64 and ZX Spectrum frequently exploited those techniques, relying on the predictable timing of their video systems to achieve effects that might otherwise be impossible.

References[edit]

  1. ^ How to fight tearing, virtualdub.org, 2005-10-31, archived from the original on 2015-05-30, retrieved 2015-05-19
  2. ^ Adaptive VSync, nvidia.com, retrieved 2014-01-28
  3. ^ AMD FreeSync technology, amd.com, https://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync
  4. ^ Nvidia G-Sync is a smooth move for PC games, cnet.com, https://www.cnet.com/news/nvidia-g-sync-is-a-smooth-move-for-pc-games/
  5. ^ Derek Wilson (2009-07-16), Exploring Input Lag Inside and Out, AnandTech, retrieved 2012-01-15