Image quality

From Wikipedia, the free encyclopedia

Image quality (whose measurement is often referred to as image quality assessment, IQA) is a characteristic of an image that measures the perceived image degradation, typically compared to an ideal or perfect reference image. Imaging systems may introduce some amounts of distortion or artifacts in the signal, for example during transcoding, which affects the subjectively experienced quality and the Quality of Experience of end users.

In photographic imaging

An image is formed on the image plane of the camera and then measured electronically or chemically to produce the photograph. The image formation process may be described by the ideal pinhole camera model, where only light rays from the depicted scene that pass through the camera aperture can fall on the image plane.[a] In reality, this ideal model is only an approximation of the image formation process, and image quality may be described in terms of how well the camera approximates the pinhole model.[1]

An ideal model of how a camera measures light is that the resulting photograph should represent the amount of light that falls on each point of the image plane during the exposure. This model is only an approximate description of the light measurement process of a camera, and image quality is also related to the deviation from this model.

In some cases, the image for which quality should be determined is primarily not the result of a photographic process in a camera, but the result of storing or transmitting the image. A typical example is a digital image that has been compressed, stored or transmitted, and then decompressed again. Unless a lossless compression method has been used, the resulting image is normally not identical to the original image and the deviation from the (ideal) original image is then a measure of quality. By considering a large set of images, and determining a quality measure for each of them, statistical methods can be used to determine an overall quality measure of the compression method.

In a typical digital camera, the resulting image quality depends on all three factors mentioned above: how much the image formation process of the camera deviates from the pinhole model, the quality of the image measurement process, and the coding artifacts that are introduced in the image produced by the camera, typically by the JPEG coding method.

By defining image quality in terms of a deviation from the ideal situation, quality measures become technical in the sense that they can be objectively determined as deviations from the ideal models. Image quality can, however, also be related to the subjective perception of an image, e.g., a human looking at a photograph. Examples include how colors are represented in black-and-white and color images, or the fact that the loss of image quality caused by noise depends on how the noise correlates with the information the viewer seeks in the image, rather than on its overall strength. Another example of this type of quality measure is Johnson's criteria for determining the necessary quality of an image for detecting targets in night vision systems.

Subjective measures of quality also relate to the fact that, although the camera's deviation from the ideal models of image formation and measurement in general is undesirable and corresponds to reduced objective image quality, these deviations can also be used for artistic effects in image production, corresponding to high subjective quality.

Image quality assessment categories

Several techniques and metrics can be computed objectively and evaluated automatically by a computer program. They can be classified depending on the availability of a reference image, or of features extracted from a reference image:[2]

  • Full-reference (FR) methods – FR metrics try to assess the quality of a test image by comparing it with a reference image that is assumed to have perfect quality, e.g., the original image versus a JPEG-compressed version of it.
  • Reduced-reference (RR) methods – RR metrics assess the quality of a test and reference image based on a comparison of features extracted from both images.
  • No-reference (NR) methods – NR metrics try to assess the quality of a test image without any reference to the original one.

Image quality metrics can also be classified in terms of measuring only one specific type of degradation (e.g., blurring, blocking, or ringing), or taking into account all possible signal distortions, that is, multiple kinds of artifacts.[3]

Image quality metrics

A large number of image quality models have been developed in recent decades, mostly stemming from academic research.[2][3]

Full-reference metrics

Well-known and often-used metrics include Peak Signal-to-Noise Ratio (PSNR), Structural Similarity (SSIM), and Visual Information Fidelity (VIF).

A large fraction of the picture quality measurement tools that are commercially deployed by the television and home cinema industries utilize SSIM.[citation needed] VIF serves as a core picture quality prediction engine in the Netflix VMAF video quality monitoring system, which quality-controls all Netflix encoded videos that are streamed worldwide. Both SSIM and VIF were developed at the Laboratory for Image & Video Engineering (LIVE).
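As a concrete illustration of a full-reference metric, PSNR expresses the mean squared error between a reference and a test image on a logarithmic decibel scale: PSNR = 10·log10(MAX²/MSE). A minimal pure-Python sketch, assuming 8-bit grayscale images given as 2-D lists (the function name is illustrative):

```python
import math

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized
    grayscale images, given as 2-D lists of pixel values."""
    # Mean squared error over all pixels.
    squared_errors = [
        (r - d) ** 2
        for ref_row, dist_row in zip(reference, distorted)
        for r, d in zip(ref_row, dist_row)
    ]
    mse = sum(squared_errors) / len(squared_errors)
    if mse == 0:
        return float("inf")  # identical images have infinite PSNR
    # PSNR = 10 * log10(MAX^2 / MSE)
    return 10 * math.log10(max_value ** 2 / mse)

ref = [[100, 100], [100, 100]]
dist = [[100, 102], [98, 100]]
print(round(psnr(ref, dist), 2))  # → 45.12
```

Higher PSNR indicates smaller pixel-wise deviation from the reference. PSNR is simple and widely reported, but it correlates with perceived quality less well than structural metrics such as SSIM, which is one motivation for the perceptual metrics discussed above.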

No-reference metrics

The NR algorithms BRISQUE,[4] BLIINDS,[5] DIIVINE,[6] and NIQE[7] are based on natural scene statistics (NSS) and were also developed at the LIVE group.

Image quality factors

Blown highlights are detrimental to image quality. Top: Original image. Bottom: Blown areas highlighted in red.
At full resolution, this image has clearly visible compression artifacts, for example along the edges of the rightmost trusses.
  • Sharpness determines the amount of detail an image can convey. System sharpness is affected by the lens (design and manufacturing quality, focal length, aperture, and distance from the image center) and the sensor (pixel count and anti-aliasing filter). In the field, sharpness is affected by camera shake (a good tripod can be helpful), focus accuracy, and atmospheric disturbances (thermal effects and aerosols). Lost sharpness can be restored by sharpening, but sharpening has limits: oversharpening can degrade image quality by causing "halos" to appear near contrast boundaries. Images from many compact digital cameras are sometimes oversharpened to compensate for lower image quality.
  • Noise is a random variation of image density, visible as grain in film and as pixel-level variations in digital images. It arises from basic physics: the photon nature of light and the thermal energy inside image sensors. Typical noise reduction (NR) software reduces the visibility of noise by smoothing the image while excluding areas near contrast boundaries. This technique works well, but it can obscure fine, low-contrast detail.
  • Dynamic range (or exposure range) is the range of light levels a camera can capture, usually measured in f-stops, EV (exposure value), or zones (all factors of two in exposure). It is closely related to noise: high noise implies low dynamic range.
  • Tone reproduction is the relationship between scene luminance and the reproduced image brightness.
  • Contrast, also known as gamma, is the slope of the tone reproduction curve in a log-log space. High contrast usually involves loss of dynamic range — loss of detail, or clipping, in highlights or shadows.
  • Color accuracy is an important but ambiguous image quality factor. Many viewers prefer enhanced color saturation; the most accurate color isn't necessarily the most pleasing. Nevertheless it is important to measure a camera's color response: its color shifts, saturation, and the effectiveness of its white balance algorithms.
  • Distortion is an aberration that causes straight lines to curve. It can be troublesome for architectural photography and metrology (photographic applications involving measurement). Distortion tends to be noticeable in low-cost cameras, including cell phones, and low-cost DSLR lenses. It is usually very easy to see in wide-angle photos. It can now be corrected in software.
  • Vignetting, or light falloff, darkens images near the corners. It can be significant with wide angle lenses.
  • Exposure accuracy can be an issue with fully automatic cameras and with video cameras, where there is little or no opportunity for post-exposure tonal adjustment. Some cameras even exhibit exposure memory: exposure may change after very bright or dark objects appear in a scene.
  • Lateral chromatic aberration (LCA), also called "color fringing", including purple fringing, is a lens aberration that causes colors to focus at different distances from the image center. It is most visible near corners of images. LCA is worst with asymmetrical lenses, including ultrawides, true telephotos and zooms. It is strongly affected by demosaicing.
  • Lens flare, including "veiling glare", is stray light in lenses and optical systems caused by reflections between lens elements and the inside barrel of the lens. It can cause image fogging (loss of shadow detail and color) as well as "ghost" images that can occur in the presence of bright light sources in or near the field of view.
  • Color moiré is artificial color banding that can appear in images with repetitive patterns of high spatial frequencies, like fabrics or picket fences. It is affected by lens sharpness, the anti-aliasing (lowpass) filter (which softens the image), and demosaicing software. It tends to be worst with the sharpest lenses.
  • Artifacts – software (especially operations performed during RAW conversion) can cause significant visual artifacts, including data compression and transmission losses (e.g., low-quality JPEG), oversharpening "halos", and loss of fine, low-contrast detail.
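To make the "factors of two" in the dynamic range bullet above concrete: the number of stops (EV, zones) spanned by a ratio of light levels is its base-2 logarithm. A small sketch with illustrative numbers, not measurements of any particular camera:

```python
import math

def dynamic_range_stops(max_level, min_level):
    """Dynamic range in stops (factors of two) between the brightest
    and darkest usable light levels a camera can capture."""
    return math.log2(max_level / min_level)

# A sensor whose usable signal spans a 4096:1 ratio covers 12 stops.
print(dynamic_range_stops(4096, 1))  # → 12.0
```

Each additional stop doubles the ratio between the brightest and darkest usable levels; since sensor noise sets the darkest usable level, higher noise directly reduces the measurable dynamic range, as noted above.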

References
  1. ^ Perry Sprawls, Ph.D. "Image Characteristics and Quality". Retrieved 2014-06-24. 
  2. ^ a b Kim-Han Thung and Paramesran Raveendran. "A Survey of Image Quality Measures". Retrieved 2015-03-16. 
  3. ^ a b Shahid, Muhammad; Rossholm, Andreas; Lövström, Benny; Zepernick, Hans-Jürgen (2014-08-14). "No-reference image and video quality assessment: a classification and review of recent approaches". EURASIP Journal on Image and Video Processing. 2014: 40. doi:10.1186/1687-5281-2014-40. ISSN 1687-5281. 
  4. ^ Mittal, A.; Moorthy, A. K.; Bovik, A. C. (December 2012). "No-Reference Image Quality Assessment in the Spatial Domain". IEEE Transactions on Image Processing. 21 (12): 4695–4708. doi:10.1109/tip.2012.2214050. ISSN 1057-7149. 
  5. ^ Saad, M. A.; Bovik, A. C.; Charrier, C. (August 2012). "Blind Image Quality Assessment: A Natural Scene Statistics Approach in the DCT Domain". IEEE Transactions on Image Processing. 21 (8): 3339–3352. doi:10.1109/tip.2012.2191563. ISSN 1057-7149. 
  6. ^ Moorthy, A. K.; Bovik, A. C. (December 2011). "Blind Image Quality Assessment: From Natural Scene Statistics to Perceptual Quality". IEEE Transactions on Image Processing. 20 (12): 3350–3364. doi:10.1109/tip.2011.2147325. ISSN 1057-7149. 
  7. ^ Mittal, A.; Soundararajan, R.; Bovik, A. C. (March 2013). "Making a “Completely Blind” Image Quality Analyzer". IEEE Signal Processing Letters. 20 (3): 209–212. doi:10.1109/lsp.2012.2227726. ISSN 1070-9908.
  • Sheikh, H.R.; Bovik A.C., Information Theoretic Approaches to Image Quality Assessment. In: Bovik, A.C. Handbook of Image and Video Processing. Elsevier, 2005.
  • Guangyi Chen; Stephane Coulombe. "An Image Visual Quality Assessment Method Based on SIFT Features". Journal of Pattern Recognition Research (JPRR): 85–97.


Notes

  1. ^ The pinhole model is relevant for the majority of cameras, but exceptions such as omnidirectional cameras and push broom scanners provide other models for the ideal image formation.
