The television industry has adopted ultra high definition television (UHDTV) as its 4K standard. As of 2013, some UHDTV models are available to general consumers for under $1,000. However, due to a lack of available content, 4K television has yet to achieve mass-market appeal. Using horizontal resolution to characterize the technology marks a shift from the previous generation, high definition television, which categorized media by vertical resolution (1080i, 720p, 480p, etc.).
The Digital Cinema Initiatives (DCI) consortium established a standard resolution of 4096 × 2160 pixels (about 8.8 megapixels, aspect ratio ≈17:9) for 4K film projection. This is the native resolution for DCI-compliant 4K digital projectors and monitors; pixels are cropped from the top or the sides depending on the aspect ratio of the content being projected. The DCI 4K standard has twice the horizontal and twice the vertical resolution of DCI 2K, and therefore four times as many pixels overall. Because DCI 4K does not conform to the 16:9 aspect ratio of 1080p Full HD, it is not an exact multiple of the 1080p raster.
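The figures above can be checked directly. The short sketch below (assuming the standard DCI 2K container of 2048 × 1080 and 16:9 Full HD at 1920 × 1080) works out the megapixel count, the 2×/4× scaling relative to DCI 2K, and the aspect-ratio mismatch with Full HD:

```python
# Compare DCI 4K with DCI 2K and consumer Full HD (1080p).
DCI_4K = (4096, 2160)
DCI_2K = (2048, 1080)    # DCI 2K container
FULL_HD = (1920, 1080)   # 16:9 consumer standard

w4, h4 = DCI_4K
w2, h2 = DCI_2K

print(f"DCI 4K pixel count: {w4 * h4 / 1e6:.1f} MP")        # ~8.8 MP
print(f"Horizontal scale vs DCI 2K: {w4 // w2}x")           # 2x
print(f"Vertical scale vs DCI 2K:   {h4 // h2}x")           # 2x
print(f"Total pixel ratio:          {(w4 * h4) // (w2 * h2)}x")  # 4x

# DCI 4K's ~17:9 aspect ratio differs from 16:9 Full HD, which is
# why it is not an exact integer multiple of the 1080p raster.
print(f"DCI 4K aspect:  {w4 / h4:.4f}")                     # ~1.8963 (~17:9)
print(f"Full HD aspect: {FULL_HD[0] / FULL_HD[1]:.4f}")     # ~1.7778 (16:9)
```

Note that 4096 × 2160 = 8,847,360 pixels, which rounds to the 8.8 megapixels quoted above.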
4K digital films may be produced, scanned, or stored in a number of other resolutions depending on the storage aspect ratio used. In the digital film production chain, a resolution of 4096 × 3112 is often used for acquiring "open gate" or anamorphic input material, a resolution derived from the historical resolution of scanned Super 35 mm film.
The main advantage of recording video at the 4K standard is that fine spatial detail, such as individual strands of hair, is resolved well, whereas such detail renders poorly at 2K resolutions. If a 4K recording is downscaled to 2K, more detail is apparent than a native 2K recording would have captured. Increased fineness and contrast are then possible in output to DVD and Blu-ray. Some cinematographers choose to record at 4K when using the Super 35 film format to offset any resolution loss that may occur during video processing.
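The downscaling step described above can be illustrated with the simplest possible scaler: a 2×2 box filter, in which each 2K output pixel averages four 4K source pixels, so every captured sample contributes to the result. This is only an illustrative sketch on a tiny grayscale "frame"; real scalers use more sophisticated filters:

```python
# Minimal sketch of 2x2 box-filter downsampling, a simple model of
# reducing a 4K frame to 2K: each output pixel is the average of a
# 2x2 block of source pixels. (Illustrative only; production scalers
# use higher-quality resampling filters.)

def downsample_2x(frame):
    """Average each 2x2 block of a grayscale frame (list of rows)."""
    h, w = len(frame), len(frame[0])
    return [
        [
            (frame[y][x] + frame[y][x + 1] +
             frame[y + 1][x] + frame[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A tiny 4x4 "frame" stands in for a 4K capture.
src = [
    [10, 20, 30, 40],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [50, 60, 70, 80],
]
out = downsample_2x(src)
print(out)  # [[15.0, 35.0], [55.0, 75.0]]
```

Because every output value is derived from four real samples rather than one, a frame downscaled this way carries more information than a frame captured natively at the lower resolution, which is the rationale for mastering at 4K even for 2K delivery.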