Display resolution
The display resolution of a digital television or display device is the number of distinct pixels in each dimension that can be displayed. It can be an ambiguous term, especially as the displayed resolution is controlled by different factors in cathode ray tube (CRT) displays and in flat-panel or projection displays that use fixed picture-element (pixel) arrays.
It is usually quoted as width × height, with the units in pixels: for example, "1024×768" means the width is 1024 pixels and the height is 768 pixels. This example would normally be spoken as "ten twenty-four by seven sixty-eight".
One use of the term “display resolution” applies to fixed-pixel-array displays such as plasma display panels (PDPs), liquid crystal displays (LCDs), digital light processing (DLP) projectors, or similar technologies, and is simply the physical number of columns and rows of pixels creating the display (e.g., 1920×1080). A consequence of having a fixed grid display is that, for multi-format video inputs, all displays need a "scaling engine" (a digital video processor that includes a memory array) to match the incoming picture format to the display.
Note that the use of the word resolution here is a misnomer, though common. The term "display resolution" is usually used to mean pixel dimensions, the number of pixels in each dimension (e.g., 1920×1080), which says nothing about the resolution of the display on which the image is actually formed: resolution properly refers to the pixel density, the number of pixels per unit distance or area, not the total number of pixels. In digital measurement, the display resolution would be given in pixels per inch. In analog measurement, if the screen is 10 inches high, then the horizontal resolution is measured across a square 10 inches wide. This is typically stated as "lines of horizontal resolution, per picture height";[citation needed] for example, analog NTSC TVs can typically display 486 lines of "per picture height" horizontal resolution, which is equivalent to 648 total lines of actual picture information from left edge to right edge. This would give NTSC TV a display resolution of 648×486 in actual lines of picture information, but a display resolution of 640×480 in "per picture height" terms.
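As an illustration of the arithmetic above, the following sketch converts a "per picture height" figure to edge-to-edge lines by multiplying by the display aspect ratio; the function name and the 4:3 example are purely illustrative.

```python
def lines_edge_to_edge(lines_per_picture_height, aspect_ratio=4 / 3):
    """Convert horizontal resolution quoted 'per picture height' to
    total lines across the full picture width."""
    return round(lines_per_picture_height * aspect_ratio)

# 486 lines per picture height on a 4:3 NTSC picture:
print(lines_edge_to_edge(486))  # 648
```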
Considerations
Some commentators also use this term to indicate a range of input formats that the display's input electronics will accept, often including formats greater than the screen's native grid size even though they have to be down-scaled to match the screen's parameters (e.g., accepting a 1920×1080 input on a display with a native 1366×768 pixel array). In the case of television inputs, many manufacturers will enlarge the input slightly to "overscan" the display by as much as 5%, so input resolution is not necessarily display resolution.
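As a rough sketch of the down-scaling step described above (not the algorithm any particular scaling engine uses), the following computes the largest aspect-preserving fit of an input format onto a native pixel array; the function name and the letterboxing behaviour are assumptions for illustration.

```python
def fit_to_panel(in_w, in_h, panel_w, panel_h):
    """Scale an input picture to fit a fixed pixel array while
    preserving its aspect ratio (unused panel area is letterboxed)."""
    scale = min(panel_w / in_w, panel_h / in_h)
    out_w, out_h = round(in_w * scale), round(in_h * scale)
    return out_w, out_h, scale

# A 1920x1080 input down-scaled onto a 1366x768 native panel:
print(fit_to_panel(1920, 1080, 1366, 768))  # (1365, 768, 0.7111...) -- one column lost to rounding
```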
The eye's perception of "display resolution" can be affected by a number of factors—see Image resolution and Optical resolution. One factor is the display screen's rectangular shape, which is expressed as the ratio of the physical picture width to the physical picture height. This is known as the aspect ratio. A screen's physical aspect ratio and the individual pixels' aspect ratio may not necessarily be the same. An array of 1280×720 on a 16:9 display has square pixels. An array of 1024×768 on a 16:9 display has rectangular pixels.
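The relationship described above can be expressed as a simple ratio: the pixel aspect ratio is the physical (display) aspect ratio divided by the aspect ratio of the pixel array. A minimal sketch, with an illustrative function name:

```python
from fractions import Fraction

def pixel_aspect_ratio(display_w, display_h, pixels_w, pixels_h):
    """Pixel aspect ratio = display aspect ratio / pixel-array aspect ratio."""
    return Fraction(display_w, display_h) / Fraction(pixels_w, pixels_h)

print(pixel_aspect_ratio(16, 9, 1280, 720))  # 1   (square pixels)
print(pixel_aspect_ratio(16, 9, 1024, 768))  # 4/3 (wide rectangular pixels)
```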
An example of pixel shape affecting "resolution" or perceived sharpness: displaying more information in a smaller area using a higher resolution makes the image much clearer or "sharper". However, newer LCD screens and similar fixed-grid displays run at a single native resolution; setting a lower resolution on these kinds of screens will greatly decrease sharpness, as an interpolation process is used to "fix" the non-native resolution input into the display's native resolution output.
While some CRT-based displays may use digital video processing that involves image scaling using memory arrays, ultimately "display resolution" in CRT-type displays is affected by different parameters such as spot size and focus, astigmatic effects in the display corners, the pitch of the color phosphor triads and of the shadow mask or aperture grille (as in Trinitron) in color displays, and the video bandwidth.
Overview
Analog television systems use interlaced video scanning: two sequential scans called fields (50 fields per second for PAL, 60 for NTSC), one containing the odd-numbered scan lines and the other the even-numbered scan lines, together give a complete picture or frame (25 or 30 frames per second). This is done to save transmission bandwidth, but a consequence is that in picture tube (CRT) displays the full vertical resolution cannot be realized. For example, the maximum detail in the vertical direction would be for adjacent lines to be alternately black and white. This is not a great problem on a progressive display, but on an interlaced display it produces unacceptable flicker at the slower frame rate, which is why interlace is unsuitable for fine detail such as computer word processing or spreadsheets.
For television, this means that if the picture is intended for interlaced displays it must be vertically filtered to remove this objectionable flicker, at the cost of vertical resolution. According to the Kell factor the reduction is to about 85%, so a 576-line PAL interlaced display has only about 480 lines of vertical resolution, and a 486-line NTSC interlaced display about 410 lines. Similarly, 1080i digital interlaced video (the "i" in 1080i refers to "interlaced") would need to be filtered to about 910 lines for an interlaced display, although a fixed-pixel display (such as an LCD television) eliminates the inaccuracies of scanning and can thus achieve Kell factors as high as 95%, or about 1020 lines. The Kell factor applies equally to progressive scan: using a Kell factor of 0.9, a 1080p HDTV system using a CCD camera and an LCD or plasma display resolves only about 1728×972 lines.
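A worked sketch of the Kell-factor arithmetic used above; the factor values are the ones quoted in the text and the function name is illustrative only.

```python
def effective_lines(nominal_lines, kell_factor):
    """Effective resolvable lines after applying the Kell factor."""
    return round(nominal_lines * kell_factor)

print(effective_lines(576, 0.85))   # 490  -- the text rounds this to about 480 (PAL interlace)
print(effective_lines(486, 0.85))   # 413  -- text quotes about 410 (NTSC interlace)
print(effective_lines(1080, 0.85))  # 918  -- text quotes about 910 (1080i)
print(effective_lines(1080, 0.9))   # 972  -- 1080p with a 0.9 Kell factor (1728x972 with 1920*0.9 = 1728)
```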
Fixed pixel array displays such as LCDs, plasmas, DLPs, LCoS, etc. need a "video scaling" processor with frame memory, which, depending on the processing system, effectively converts an incoming interlaced video signal into a progressive video signal. A similar process occurs in a PC and its display with interlaced video (e.g., from a TV tuner card). The downside is that interlace motion artifacts are almost impossible to remove, resulting in horizontal "toothed" edges on moving objects.
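To make the interlaced-to-progressive conversion concrete, here is a minimal sketch of two classic deinterlacing strategies, "weave" (combine both fields, which produces the toothed combing on motion) and "bob" (line-double a single field); real scaling engines use far more sophisticated motion-adaptive methods, and the frame layout assumed here (a list of scan lines) is purely illustrative.

```python
def weave(odd_field, even_field):
    """Interleave the two fields into one full frame (combs on motion)."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

def bob(field):
    """Line-double a single field (no combing, but halves vertical detail)."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame
```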
In analog connected picture displays such as CRT TV sets, the horizontal scanlines are not divided into pixels, but by the sampling theorem, the bandwidth of the luma and chroma signals implies a horizontal resolution. For television, the analog bandwidth for luminance in standard definition can vary from 3 MHz (approximately 330 lines edge-to-edge; VHS) to 4.2 MHz (440 lines; live analog) up to 7 MHz (660 lines; DVD). In high definition the bandwidth is 37 MHz (720p/1080i) or 74 MHz (1080p/60).
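The figures above follow from the sampling theorem: a signal of bandwidth B carries at most one cycle, i.e. two picture "lines", per period 1/B, so over an active line time T the edge-to-edge horizontal resolution is roughly 2·B·T. A sketch, assuming an active line time of about 52.7 µs for standard-definition systems (that constant is an assumption here, not something stated in the text), which reproduces the quoted figures only approximately:

```python
def horizontal_lines(bandwidth_hz, active_line_time_s=52.7e-6):
    """Rough edge-to-edge line count implied by a luma bandwidth."""
    return round(2 * bandwidth_hz * active_line_time_s)

print(horizontal_lines(3.0e6))  # 316 -- text quotes ~330 for VHS
print(horizontal_lines(4.2e6))  # 443 -- text quotes ~440 for live analog
```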
Current standards
Televisions
Televisions use the following resolutions:
- Standard-definition television (SDTV):
- 480i (NTSC standard uses an analog system of 486i split into two interlaced fields of 243 lines)
- 576i (PAL, 720×576 split into two interlaced fields of 288 lines)
- Enhanced-definition television (EDTV):
- 480p (720×480 progressive scan)
- 576p (720×576 progressive scan)
- High-definition television (HDTV):
- 720p (1280×720 progressive scan)
- 1080i (1920×1080 split into two interlaced fields of 540 lines)
- 1080p (1920×1080 progressive scan)
Computer Monitors
Computer monitors have higher resolutions than most televisions. As of July 2002, 1024×768 eXtended Graphics Array was the most common display resolution.[1][2] Many web sites and multimedia products were re-designed from the previous 800×600 format to the higher 1024×768-optimized layout.
The availability of inexpensive LCD monitors has made the 5:4 aspect ratio resolution of 1280×1024 more popular for desktop usage. Many computer users, including CAD users, graphic artists and video game players, run their computers at 1600×1200 resolution (UXGA, Ultra eXtended Graphics Array) or higher if they have the necessary equipment. Other recently available resolutions include oversize aspects like 1400×1050 SXGA+ and wide aspects like 1280×800 WXGA, 1440×900 WXGA+, 1680×1050 WSXGA+, and 1920×1200 WUXGA. A new more-than-HD resolution of 2560×1600 WQXGA was released in 30" LCD monitors in 2007. In 2010, 27" LCD monitors with a resolution of 2560×1440 were released by multiple manufacturers, including Apple.[3] Panels for professional environments, such as medical use and air traffic control, support resolutions of up to 4096×2160,[4][5] which as of September 2011 is the maximum resolution available in a single monitor. The most common computer display resolutions are as follows:[6]
Width | Height | % of Internet Users |
---|---|---|
1024 | 768 | 22.63 |
1366 | 768 | 15.63 |
1280 | 800 | 14.55 |
1280 | 1024 | 7.96 |
1440 | 900 | 6.92 |
1680 | 1050 | 3.75 |
1920 | 1080 | 3.70 |
1600 | 900 | 3.12 |
1360 | 768 | 2.65 |
1024 | 600 | 2.37 |
1152 | 864 | 1.91 |
1280 | 768 | 1.84 |
1280 | 720 | 1.66 |
800 | 600 | 1.44 |
1920 | 1200 | 1.04 |
1280 | 960 | 0.86 |
768 | 1024 | 0.80 |
1093 | 614 | 0.54 |
1024 | 640 | 0.28 |
1152 | 720 | 0.26 |
Other | | 6.08 |
- Note: These statistics were gathered from visitors to three million websites, normalised to counteract geolocational bias, and may not be representative of computer users in general. They cover the three-month period from June to August 2011.
These are the results of the Steam hardware survey of July 2011 (note that these figures reflect video-gaming enthusiasts only):[7]
Code | Name | Aspect ratio | Width | Height | % of Steam users |
---|---|---|---|---|---|
XGA | eXtended Graphics Array | 4:3 | 1024 | 768 | 5.12% |
XGA+ | eXtended Graphics Array Plus | 4:3 | 1152 | 864 | 1.04% |
WXGA | Widescreen eXtended Graphics Array | 16:9 | 1280 | 720 | 0.69% |
WXGA | Widescreen eXtended Graphics Array | 16:10 | 1280 | 800 | 5.28% |
SXGA (UVGA) | Super eXtended Graphics Array | 4:3 | 1280 | 960 | 0.95% |
SXGA | Super eXtended Graphics Array | 5:4 | 1280 | 1024 | 11.80% |
HD | High Definition | 16:9 | 1360 | 768 | 1.42% |
HD | High Definition | 16:9 | 1366 | 768 | 6.50% |
WXGA+ | Widescreen eXtended Graphics Array Plus | 16:10 | 1440 | 900 | 9.20% |
HD+ | High Definition Plus | 16:9 | 1600 | 900 | 4.05% |
UXGA | Ultra eXtended Graphics Array | 4:3 | 1600 | 1200 | 0.81% |
WSXGA+ | Widescreen Super eXtended Graphics Array Plus | 16:10 | 1680 | 1050 | 18.01% |
FHD (Full HD) | Full High Definition | 16:9 | 1920 | 1080 | 21.78% |
WUXGA | Widescreen Ultra eXtended Graphics Array | 16:10 | 1920 | 1200 | 7.80% |
QHD (WQHD) | Quad High Definition | 16:9 | 2560 | 1440 | 0.65% |
Other | | | | | 4.92% |
When a computer display resolution is set higher than the physical screen resolution (native resolution), some video drivers make the virtual screen scrollable over the physical screen, thus realizing a two-dimensional virtual desktop with its viewport. Most LCD manufacturers do make note of the panel's native resolution, as working at a non-native resolution on LCDs will result in a poorer image, due to dropping of pixels to make the image fit (when using DVI) or insufficient sampling of the analog signal (when using a VGA connector). Few CRT manufacturers will quote a true native resolution, since CRTs are analog in nature and can vary their display from as low as 320×200 (emulation of older computers or game consoles) to as high as the internal board will allow, or until the image becomes too detailed for the vacuum tube to recreate (i.e. analog blur). Thus CRTs provide a variability in resolution that LCDs cannot provide (LCDs have fixed resolution).
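As an illustration of the virtual-desktop panning described above (not any particular driver's implementation), the following clamps the viewport so that the physical screen always stays within the larger virtual desktop; the function and parameter names are illustrative.

```python
def clamp_viewport(pointer_x, pointer_y, screen_w, screen_h, virtual_w, virtual_h):
    """Top-left corner of the physical viewport, centred on the pointer and
    clamped so it never scrolls past the edge of the virtual desktop."""
    x = min(max(pointer_x - screen_w // 2, 0), virtual_w - screen_w)
    y = min(max(pointer_y - screen_h // 2, 0), virtual_h - screen_h)
    return x, y

# A 1024x768 physical screen panning over a 1600x1200 virtual desktop:
print(clamp_viewport(1500, 1100, 1024, 768, 1600, 1200))  # (576, 432)
```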
In recent years the popularity of the 16:9 aspect ratio has resulted in more notebook display resolutions adhering to it. 1366×768 (HD) has become popular for most notebook sizes, while 1600×900 (HD+) and 1920×1080 (FHD) are available for larger notebooks.
As far as digital cinematography is concerned, video resolution standards depend first on the aspect ratio of the frames in the film stock (which is usually scanned for digital intermediate post-production) and then on the actual point count. Although there is no single set of standardized sizes, it is commonplace within the motion picture industry to refer to "nK" image "quality", where n is a (small, usually even) integer that translates into a set of actual resolutions depending on the film format. As a reference, consider that for the 4:3 (around 1.33:1) aspect ratio that a film frame (whatever its format) is expected to horizontally fit in, n is the multiplier of 1024 such that the horizontal resolution is exactly 1024×n points. For example, 2K reference resolution is 2048×1536 pixels, whereas 4K reference resolution is 4096×3072 pixels. Nevertheless, 2K may also refer to resolutions like 2048×1556 (full aperture), 2048×1152 (HDTV, 16:9 aspect ratio) or 2048×872 pixels (Cinemascope, 2.35:1 aspect ratio). It is also worth noting that while a frame's pixel array may be, for example, 3:2 (720×480 NTSC), that is not what will be seen on screen (i.e. 4:3 or 16:9, depending on the shape of the rectangular pixels).
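A small sketch of the "nK" naming convention described above: the 4:3 reference resolution for a given n is 1024·n points horizontally and three-quarters of that vertically, while other aspect ratios keep the same horizontal point count. Function and parameter names are illustrative.

```python
def nk_reference_resolution(n, aspect_w=4, aspect_h=3):
    """Horizontal and vertical point counts for an 'nK' reference frame."""
    width = 1024 * n
    height = round(width * aspect_h / aspect_w)
    return width, height

print(nk_reference_resolution(2))           # (2048, 1536) -- 2K, 4:3 reference
print(nk_reference_resolution(4))           # (4096, 3072) -- 4K, 4:3 reference
print(nk_reference_resolution(2, 2.35, 1))  # (2048, 871)  -- approximately the 2048x872 Cinemascope frame quoted above
```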
Evolution of standards
Many personal computers introduced in the late 1970s and the 1980s were designed to use television sets as their display devices, making the resolutions dependent on the television standards in use, including PAL and NTSC. Picture sizes were usually limited to ensure the visibility of all the pixels in the major television standards and the broad range of television sets with varying amounts of overscan. The actual drawable picture area was therefore somewhat smaller than the whole screen, and was usually surrounded by a static-colored border. Also, interlaced scanning was usually omitted in order to provide more stability to the picture, effectively halving the vertical resolution. 160×200, 320×200 and 640×200 on NTSC were relatively common resolutions in the era (224, 240 or 256 scanlines were also common). In the IBM PC world, these resolutions came to be used by 16-color EGA video cards.
One of the drawbacks of using a classic television was that the computer display resolution was higher than the TV could decode. Chroma resolution for NTSC/PAL televisions is bandwidth-limited to a maximum of 1.5 MHz, or approximately 160 pixels wide, which led to blurring of the color for 320- or 640-pixel-wide signals and made text difficult to read. Many users upgraded to higher-quality televisions with S-Video or RGBI inputs that helped eliminate chroma blur and produce more legible displays. The earliest, lowest-cost solution to the chroma problem was offered in the Atari 2600 Video Computer System and the Apple II+, both of which offered the option to disable the color and view a legacy black-and-white signal. On the Commodore 64, GEOS mirrored the Mac OS method of using black-and-white to improve readability.
The 640×400i resolution (720×480i with borders disabled) was first introduced by home computers such as the Commodore Amiga and (later) Atari Falcon. These computers used interlace to boost the maximum vertical resolution. These modes were only suited to graphics or gaming, as the flickering interlace made reading text in word processor, database, or spreadsheet software difficult. (Modern game consoles solve this problem by pre-filtering the 480i video to a lower resolution. For example, Final Fantasy XII suffers from flicker when the filter is turned off, but stabilizes once filtering is restored. The computers of the 1980s lacked sufficient power to run similar filtering software.)
The advantage of a 720×480i overscanned computer was an easy interface with interlaced TV production, leading to the development of Newtek's Video Toaster. This device allowed Amigas to be used for CGI creation in various news departments (example: weather overlays), drama programs such as NBC's seaQuest, WB's Babylon 5, and early computer-generated animation by Disney for the Little Mermaid, Beauty and the Beast, and Aladdin.
In the PC world, the IBM PS/2 VGA and MCGA (multi-color) on-board graphics chips used a non-interlaced (progressive) 640×480×16 color resolution that was easier to read and thus more useful for office work. It was the standard resolution from 1990 to around 1996.[citation needed] The standard resolution was 800×600 until around 2000. Microsoft Windows XP, released in 2001, was designed to run at a minimum of 800×600, although it is possible to select the original 640×480 in the Advanced Settings window. Linux, FreeBSD, and most Unix variants use the X Window System and can run at any desired resolution as long as the display and video card support it; they tend to remain usable even on small screens, though not all applications support very low display resolutions.
Programs designed to mimic older hardware such as Atari, Sega, or Nintendo game consoles (emulators) when attached to multiscan CRTs, routinely use much lower resolutions such as 160×200 or 320×400 for greater authenticity.
Commonly used
The list of common display resolutions article lists the most commonly used display resolutions for computer graphics, television, films, and video conferencing.
Overscan and underscan
Most television display manufacturers "overscan" the pictures on their displays (CRTs, PDPs, LCDs, etc.), so that the effective on-screen picture may be reduced from 720×576 (or 720×480) to 680×550 (or 680×450), for example. The size of the invisible area somewhat depends on the display device. HD televisions do this as well, to a similar extent.
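A sketch of the overscan arithmetic: with an overscan of p percent per edge, the visible picture is the nominal frame shrunk by 2p percent in each dimension. The function name and the roughly 3%-per-edge example figure are assumptions chosen to land near the 680×550 example above.

```python
def visible_picture(width, height, overscan_per_edge=0.03):
    """Visible picture size after losing a fraction of each edge to overscan."""
    factor = 1 - 2 * overscan_per_edge
    return round(width * factor), round(height * factor)

print(visible_picture(720, 576))  # (677, 541) -- close to the 680x550 example
print(visible_picture(720, 480))  # (677, 451)
```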
Computer displays, including projectors, generally do not overscan, although many models (particularly CRT displays) allow it. CRT displays tend to be underscanned in stock configurations, to compensate for the increasing distortions at the corners.
See also
- Computer display standards has a detailed list of display resolutions (e.g. VGA 640×480, WUXGA 1920×1200, ... etc.).
- Resolution independence
- Display aspect ratio
- Widescreen
- List of displays by pixel density
- Pixel density of Computer displays – PPI (for example, a 20" screen at 1680×1050 gives 99.06 PPI; see the sketch after this list)
- Video scaler
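The PPI figure quoted in the list above follows from the diagonal pixel count divided by the diagonal size in inches; a minimal sketch (the function name is illustrative):

```python
from math import hypot

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

print(round(pixels_per_inch(1680, 1050, 20), 2))  # 99.06
```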
References
- ^ "Higher screen resolutions more popular for exploring the Internet according to OneStat.com", July 24, 2002.
- ^ "Screen resolution 800×600 significantly decreased for exploring the Internet according to OneStat.com", April 18, 2007.
- ^ Apple Releases New Cinema Display: 27 inches, 2560x1440 Resolution
- ^ http://www.eizo.com/global/products/duravision/fdh3601/
- ^ EYE-LCD 6400-4K
- ^ StatCounter Global Statistics — Cumulative worldwide figures for the three months June to August 2011.
- ^ "Primary Display Resolution", The Steam hardware survey."
- Sony SXRD 4K Projector (SRXR110) resolution retrieved from [1]
External links
- How many dots has it got? — Fourmilab
- ScreenResolution.org — Free online browser screen tester; shows the screen resolution of your current monitor; realtime statistics on Internet users’ screen resolution
- Video Format Resolutions — Video Technology Magazine
- Browser Display Statistics — W3Schools
- Standard resolutions used for computer graphics equipment, TV and video applications and mobile devices.
- Screen resolution simulator