False color refers to a group of color rendering methods used to display images in color which were recorded in the visible or non-visible parts of the electromagnetic spectrum. A false-color image is an image that depicts an object in colors that differ from those a photograph (a "true-color" image) would show.
In addition, variants of false color such as pseudo color, density slicing and choropleths are used for information visualization, either of data gathered by a single grayscale channel or of data not depicting parts of the electromagnetic spectrum (e.g. elevation in relief maps or tissue types in magnetic resonance imaging).
Types of color renderings 
True color 
To understand false color, a look at the concept behind true color is helpful. An image is called a "true-color" image when it offers a natural color rendition, or when it comes close to it. This means that the colors of an object in an image appear to a human observer the same way as if this observer were to directly view the object: A green tree appears green in the image, a red apple red, a blue sky blue, and so on. When applied to black-and-white images, true-color means that the perceived lightness of a subject is preserved in its depiction.
An exact true-color rendition is impossible in practice; deviations between the colors of the object and the colors of its depiction, known as metameric failure, can be caused by:
- Different spectral sensitivities of the human eye and of an image capture device (e.g. a "camera")
- Different spectral emissions/reflections of the object and of the image render process (e.g. a "printer" or "monitor")
- Differences in spectral irradiance in the case of reflective images (e.g. photo prints) or reflective objects – see color rendering index (CRI) for details
The result of a metameric failure would be, for example, an image of a green tree which shows a different shade of green than the tree itself, a different shade of red for a red apple, a different shade of blue for the blue sky, and so on. Color management (e.g. with ICC profiles) can be used to mitigate this problem within the physical constraints.
Approximate true-color images gathered by spacecraft are an example where images have a certain amount of metameric failure, as the spectral bands of a spacecraft's camera are chosen to gather information on the physical properties of the object under investigation, and are not chosen to capture true-color images.
False color 
A false-color image sacrifices natural color rendition (in contrast to a true-color image) in order to ease the detection of features that are not readily discernible otherwise – for example the use of near infrared for the detection of vegetation in satellite images. While a false-color image can be created using solely the visible spectrum (e.g. to accentuate color differences), typically some or all of the data used is electromagnetic radiation (EM) from outside the visible spectrum (e.g. infrared, ultraviolet or X-ray). The choice of spectral bands is governed by the physical properties of the object under investigation.
As the human eye uses three "spectral bands" (see trichromacy for details), three spectral bands are commonly combined into a false-color image. At least two spectral bands are needed for a false-color encoding, and more than three bands can be combined into the three RGB display channels – with the eye's ability to discern three channels being the limiting factor. In contrast, a "color" image made from one spectral band, or an image made from non-EM data (e.g. elevation, temperature, tissue type), is a pseudo-color image (see below).
For true color, the RGB channels (red "R", green "G" and blue "B") from the camera are mapped to the corresponding RGB channels of the image, yielding an "RGB→RGB" mapping. For false color this relationship is changed. The simplest false-color encoding is to take an RGB image in the visible spectrum, but map it differently, e.g. "GBR→RGB". For "traditional false-color" satellite images of Earth an "NRG→RGB" mapping is used, with "N" being the near-infrared spectral band (and the blue spectral band being unused) – this yields the typical "vegetation in red" false-color images.
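The channel mappings above can be sketched in a few lines of NumPy. The band values below are hypothetical, chosen only to show how reassigning bands to display channels produces the "vegetation in red" effect:

```python
import numpy as np

# Hypothetical 2x2 scene with four spectral bands, reflectances in [0, 1].
# Vegetation (top row) reflects strongly in the near infrared ("N").
n = np.array([[0.9, 0.8], [0.1, 0.2]])  # near infrared
r = np.array([[0.2, 0.3], [0.4, 0.5]])  # red
g = np.array([[0.4, 0.5], [0.3, 0.3]])  # green
b = np.array([[0.1, 0.2], [0.6, 0.7]])  # blue

# True color: RGB -> RGB (each band drives its own display channel).
true_color = np.dstack([r, g, b])

# Traditional false color: NRG -> RGB (blue band unused).
# The NIR band drives the red display channel, so vegetation appears red.
false_color = np.dstack([n, r, g])
```

Because the near-infrared band is routed to the red display channel, the strongly NIR-reflective pixels dominate in red, which is exactly the look of traditional false-color satellite imagery.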
False color is used (among others) for satellite and space images: Examples are remote sensing satellites (e.g. Landsat, see example above), space telescopes (e.g. the Hubble Space Telescope) or space probes (e.g. Cassini-Huygens). Some spacecraft, with rovers (e.g. the Mars Science Laboratory "Curiosity") being the most prominent examples, have the ability to capture approximate true-color images as well. Weather satellites, in contrast to the spacecraft mentioned above, produce grayscale images from the visible or infrared spectrum.
Pseudo color 
A pseudo-color image is derived from a grayscale image by mapping each intensity value to a color according to a table or function. Pseudo color is typically used when a single channel of data is available (e.g. temperature, elevation, soil composition, tissue type, and so on), in contrast to false color which is commonly used to display three channels of data.
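A minimal sketch of such a mapping, assuming a simple blue-to-red ramp as the lookup table (real applications often use perceptually designed colormaps instead):

```python
import numpy as np

# Build a 256-entry lookup table: intensity 0 -> blue, intensity 255 -> red.
lut = np.zeros((256, 3), dtype=np.uint8)
lut[:, 0] = np.arange(256)          # red component rises with intensity
lut[:, 2] = 255 - np.arange(256)    # blue component falls with intensity

# Hypothetical single-channel data (e.g. a grayscale temperature image).
gray = np.array([[0, 128], [200, 255]], dtype=np.uint8)

# NumPy integer indexing applies the table per pixel, yielding an RGB image.
pseudo = lut[gray]
```

Each grayscale value simply selects a row of the table, so the entire mapping is one array-indexing operation.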
Another familiar example of pseudo color is the encoding of elevation using hypsometric tints in physical relief maps, where negative values (below sea level) are usually represented by shades of blue, and positive values by greens and browns.
Depending on the table or function used and the choice of data sources, pseudo-coloring may increase the information contents of the original image, for example adding geographic information, combining information obtained from infrared or ultra-violet light, or other sources like MRI scans.
A further application of pseudo coloring is the presentation of image-processing results; that is, colors are changed in order to make an image easier to understand.
Density slicing 
Density slicing, a variation of pseudo color, divides an image into a few colored bands and is (among others) used in the analysis of remote sensing images. For density slicing the range of grayscale levels is divided into intervals, with each interval assigned to one of a few discrete colors – this is in contrast to pseudo color, which uses a continuous color scale. For example, in a grayscale thermal image the temperature values can be split into bands of 2 °C, with each band represented by one color – as a result the temperature of a spot in the thermograph can be read more easily by the user, because the discernible differences between the discrete colors are greater than those of images with continuous grayscale or continuous pseudo color.
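The 2 °C example can be sketched as follows; the temperatures and the four-color palette are illustrative assumptions:

```python
import numpy as np

# Hypothetical thermograph values in °C.
temps = np.array([[18.3, 19.9], [21.2, 24.7]])

# Slice into 2 °C bands starting at 18 °C:
# band 0: 18-20, band 1: 20-22, band 2: 22-24, band 3: 24-26.
band = ((temps - 18.0) // 2.0).astype(int)

# One discrete color per band (blue, green, yellow, red).
palette = np.array([[0, 0, 255],
                    [0, 255, 0],
                    [255, 255, 0],
                    [255, 0, 0]], dtype=np.uint8)

sliced = palette[band]  # each pixel gets its band's color
```

Unlike the continuous ramp of a pseudo-color table, neighboring temperatures within the same 2 °C interval receive the identical color, which is what makes the bands easy to tell apart.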
Choropleth 
A choropleth is an image or map in which areas are colored or patterned proportionally to the category or value of one or more variables being represented. The variables are mapped to a few colors; each area contributes one data point and receives one color from these selected colors. Basically it is density slicing applied to a pseudo-color overlay. A choropleth map of a geographic area is thus an extreme form of false color.
See also 
- NASA World Wind uses several false-color satellite image layers
- List of software palettes – False color palettes section
- Imaginary colors, points in a color space that correspond to a color perception that cannot be produced by any physical (non-negative) light spectrum.