Flicker is a visible change in brightness between cycles displayed on video displays. It applies especially to the refresh interval of cathode ray tube (CRT) televisions and computer monitors, as well as to plasma-based computer screens and televisions.
Flicker occurs on CRTs when they are driven at a low refresh rate, allowing the brightness to drop for time intervals sufficiently long to be noticed by a human eye – see persistence of vision and flicker fusion threshold. For most devices, the screen's phosphors quickly lose their excitation between sweeps of the electron gun, and the afterglow is unable to fill such gaps – see phosphor persistence. A refresh rate of 60 Hz on most screens will produce a visible "flickering" effect. Most people find that refresh rates of 70–90 Hz and above enable flicker-free viewing on CRTs. Use of refresh rates above 120 Hz is uncommon, as they provide little noticeable flicker reduction and limit available resolution.
Flat-screen plasma displays exhibit a similar effect: the plasma pixels fade in brightness between refreshes.
In LCD screens, the LCD itself does not flicker; it preserves its opacity unchanged until it is updated for the next frame. However, to prevent accumulated damage to the liquid crystal, LCD displays quickly alternate the voltage applied to each pixel between positive and negative, a technique called 'polarity inversion'. Ideally this would not be noticeable, because every pixel has the same brightness whether a positive or a negative voltage is applied. In practice there is a small difference, which means that every pixel flickers at about 30 Hz. Screens that apply opposite polarity per line or per pixel reduce this effect compared to screens that drive the entire display at the same polarity, and the inversion scheme used by a given screen can sometimes be detected with test patterns designed to maximize the effect.
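Such test patterns can be sketched in a few lines. The pairing of pattern to inversion scheme below is an assumption about a typical panel, and the function names are illustrative, not from any real test suite:

```python
# Sketch: generate inversion-test patterns of the kind mentioned above.
# On a panel using per-pixel ("dot") inversion, a one-pixel checkerboard
# puts all lit pixels on the same polarity phase, making the ~30 Hz
# residual flicker easier to see; alternating full lines does the same
# for per-line inversion. (Pattern/scheme pairing is an assumption.)

def checkerboard(width: int, height: int) -> list[list[int]]:
    """1-pixel checkerboard: stresses per-pixel polarity inversion."""
    return [[(x + y) % 2 for x in range(width)] for y in range(height)]

def line_pattern(width: int, height: int) -> list[list[int]]:
    """Alternating horizontal lines: stresses per-line inversion."""
    return [[y % 2] * width for y in range(height)]

print(checkerboard(4, 2))  # [[0, 1, 0, 1], [1, 0, 1, 0]]
print(line_pattern(4, 2))  # [[0, 0, 0, 0], [1, 1, 1, 1]]
```

On a susceptible screen, the pattern matching the panel's inversion scheme appears to shimmer; the others look steady.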
Much more of a concern is the LCD backlight. Earlier screens used fluorescent lamps that flickered at twice the mains frequency, 100 or 120 Hz; most modern screens use an electronic ballast operating at 25–60 kHz, far outside the human-perceptible range. LED backlights need not flicker at all, though some designs dim the backlight by switching the LEDs on and off rapidly (pulse-width modulation), which can make flicker noticeable at low brightness settings.
In film projection, flicker is unavoidable: the projector must block the light while the film advances from one frame to the next. The standard frame rate of 24 fps produces very obvious flicker at that rate, so even very early movie projectors added extra vanes to the rotating shutter to interrupt the light even while the film was not moving. Most common is a three-vane shutter, raising the flicker rate to 72 Hz. Home movie projectors (and early theater projectors) often have four vanes, raising the 18 fps used by silent film to 72 Hz. Digital projectors typically use DLP mirrors, which do not flicker. Some 3D projection systems rapidly alternate between the picture for each eye (switching the polarization at the same rate) at 144 Hz. Movie projectors typically use an incandescent or arc lamp, which does not itself flicker.
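The shutter arithmetic above reduces to a single multiplication; a minimal sketch (the helper name is illustrative, not from any projector API):

```python
# The perceived flicker rate of a film projector is the frame rate
# multiplied by the number of shutter vanes: each frame is shown once
# per vane before the film advances.

def shutter_rate_hz(frame_rate_fps: float, vanes: int) -> float:
    """Flicker rate seen by the viewer for a multi-vane shutter."""
    return frame_rate_fps * vanes

# Sound film at 24 fps with a three-vane shutter flickers at 72 Hz.
print(shutter_rate_hz(24, 3))  # 72
# Silent film at 18 fps needs four vanes to reach the same 72 Hz.
print(shutter_rate_hz(18, 4))  # 72
```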
Televisions use interlaced video, so the screen flickers at twice the rate (50 or 60 Hz) at which the full image changes (25 or 30 Hz). This was considered necessary even in the very first televisions, which used very slow phosphors.
The exact refresh rate necessary to prevent the perception of flicker varies greatly with the viewing environment. In a completely dark room, a sufficiently dim display can run as low as 30 Hz without visible flicker; at normal room and display brightness, the same refresh rate would produce flicker severe enough to be unwatchable.
The human eye is most sensitive to flicker at the edges of the human field of view (peripheral vision) and least sensitive at the center of gaze (the area being focused on). As a result, the greater the portion of our field of view that is occupied by a display, the greater the need for high refresh rates. This is why computer monitor CRTs usually run at 70 to 90 Hz, while TVs, which are viewed from further away, are seen as acceptable at 60 or 50 Hz (see analog television standards).
Software can cause flicker by briefly displaying an unintended intermediate image. For example, drawing a page of text by first blanking the area to white in the frame buffer and then drawing the text 'on top' of it makes it possible for the blank region to appear momentarily on screen. This approach is usually much faster and easier to program than setting each pixel directly to its final value.
When it is not feasible to set each pixel only once, double buffering can be used: the software creates an off-screen drawing surface, draws to it (with as much intermediate flicker as it likes), and then copies the finished image to the screen all at once, so the visible pixels change only once. While this technique eliminates software flicker, it can also be very inefficient.
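The technique can be sketched without any graphics library by treating nested lists as framebuffers. The `Screen` class and drawing routine below are illustrative stand-ins for a real display API:

```python
# A minimal sketch of double buffering using plain Python lists as
# "framebuffers". Real code would use a graphics API; the names here
# (Screen, present, render_frame) are hypothetical.

class Screen:
    """Fake display: `visible` is what the user would see."""
    def __init__(self, width: int, height: int):
        self.width, self.height = width, height
        self.visible = [[0] * width for _ in range(height)]

    def present(self, buffer):
        # Copy the off-screen buffer to the screen in one step,
        # so the visible pixels change only once per frame.
        for y in range(self.height):
            self.visible[y][:] = buffer[y]

def render_frame(screen: Screen, pixel_value: int):
    # Draw into an off-screen buffer: blank it first, then draw
    # "on top" -- the intermediate blank state is never visible.
    back = [[255] * screen.width for _ in range(screen.height)]  # blank to white
    for y in range(screen.height):
        for x in range(screen.width):
            if (x + y) % 2 == 0:          # some arbitrary drawing
                back[y][x] = pixel_value
    screen.present(back)                  # "flip": copy once

screen = Screen(4, 2)
render_frame(screen, 0)
print(screen.visible)  # [[0, 255, 0, 255], [255, 0, 255, 0]]
```

The intermediate all-white state exists only in `back`, never in `screen.visible`, which is exactly what eliminates the visible flicker.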
Flicker is used intentionally by developers on low-end systems to create the illusion of more objects or colors/shades than the hardware can actually display, or as a fast way of simulating transparency. While typically thought of as a mark of older systems like 16-bit game consoles, the flicker technique continues to be used on new systems, such as the temporal dithering used to fake true color on most new LCD monitors.
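The principle behind temporal dithering can be sketched in one dimension: a display limited to intensities 0 and 1 fakes an intermediate shade by alternating between them across frames, and the eye averages the result. This is a simplified error-accumulation scheme, not the algorithm of any particular display controller:

```python
# A minimal sketch of temporal dithering for a single pixel: emit a
# sequence of 0/1 frames whose time-average approximates a target
# intensity the display cannot show directly. The function name and
# scheme are illustrative.

def dithered_frames(target: float, n_frames: int) -> list[int]:
    """Emit 0/1 frames whose running average tracks `target` (0..1)."""
    frames, error = [], 0.0
    for _ in range(n_frames):
        error += target
        if error >= 0.5:          # owe enough brightness: bright frame
            frames.append(1)
            error -= 1.0
        else:
            frames.append(0)
    return frames

frames = dithered_frames(0.25, 8)
print(frames, sum(frames) / len(frames))  # [0, 1, 0, 0, 0, 1, 0, 0] 0.25
```

At a high enough frame rate the eye sees only the 0.25 average, but at low rates the alternation is perceptible as flicker, which is exactly the trade-off described above.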
Flicker of a CRT monitor can cause various symptoms in those sensitive to it, such as headaches in migraine sufferers and seizures in people with epilepsy.
As flicker is most clearly seen at the edge of our vision, there is no obvious risk in using a CRT, but prolonged use can cause a sort of retinal shock in which the flickering is still seen even when looking away from the monitor. This can produce a form of motion sickness arising from the discrepancy between the movement detected by the fluid in the inner ear and the motion the eyes see. Symptoms include dizziness, fatigue, headaches and (sometimes extreme) nausea. The symptoms usually disappear within a week without CRT use, and typically last only a few hours unless the exposure has been prolonged.