A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors found in digital cameras, camcorders, and scanners to create a color image. The filter pattern is 50% green, 25% red, and 25% blue, and is therefore also called BGGR, RGBG, GRGB, or RGGB, depending on the position from which the repeating 2×2 tile is read.
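The repeating 2×2 RGGB tile can be sketched in a few lines of NumPy (an illustrative sketch; the function name is ours, not from any camera API):

```python
import numpy as np

def bayer_mask(h, w):
    """Boolean masks for an RGGB Bayer tile repeated over an h x w sensor.

    Each 2x2 tile is laid out as:
        R G
        G B
    """
    r = np.zeros((h, w), dtype=bool)
    g = np.zeros((h, w), dtype=bool)
    b = np.zeros((h, w), dtype=bool)
    r[0::2, 0::2] = True  # red on even rows, even columns
    g[0::2, 1::2] = True  # green on even rows, odd columns
    g[1::2, 0::2] = True  # green on odd rows, even columns
    b[1::2, 1::2] = True  # blue on odd rows, odd columns
    return r, g, b
```

On any even-sized sensor, exactly half of the sites come out green and a quarter each red and blue.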
Alternatives to the Bayer filter include both various modifications of colors and arrangement and completely different technologies, such as color co-site sampling, the Foveon X3 sensor, dichroic mirrors, or a transparent diffractive-filter array.
Bryce Bayer's patent (U.S. Patent No. 3,971,065) in 1976 called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the physiology of the human eye: during daylight vision, luminance perception in the human retina uses the M and L cone cells combined, which are most sensitive to green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; the sample values they record become, after interpolation, image pixels. When Bayer registered his patent, he also proposed a cyan-magenta-yellow combination, another set of complementary colors. This arrangement was impractical at the time because the necessary dyes did not exist, but it is used in some newer digital cameras. The main advantage of the newer CMY dyes is their improved light-absorption characteristic; that is, their quantum efficiency is higher.
The raw output of Bayer-filter cameras is referred to as a Bayer pattern image. Since each pixel is filtered to record only one of three colors, the data from each pixel cannot fully specify each of the red, green, and blue values on its own. To obtain a full-color image, various demosaicing algorithms can be used to interpolate a set of complete red, green, and blue values for each pixel. These algorithms make use of the surrounding pixels of the corresponding colors to estimate the values for a particular pixel.
Different algorithms requiring various amounts of computing power result in varying-quality final images. This can be done in-camera, producing a JPEG or TIFF image, or outside the camera using the raw data directly from the sensor.
Demosaicing or "debayering" can be performed in different ways. Simple methods interpolate the color value from pixels of the same color in the neighborhood. For example, once the chip has been exposed to an image, each pixel can be read. A pixel with a green filter provides an exact measurement of the green component, while the red and blue components for this pixel are obtained from its neighbors. For a green pixel, the two nearest red neighbors can be averaged to yield the red value; similarly, the two nearest blue neighbors can be averaged to yield the blue value.
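The averaging described above can be sketched for a single green photosite (assuming an RGGB layout with the green site on an even row and odd column, so its red neighbors lie left and right and its blue neighbors above and below; an interior pixel is assumed, and the function name is illustrative):

```python
import numpy as np

def interpolate_at_green(raw, y, x):
    """Estimate R and B at a green photosite by averaging its two nearest
    red and blue neighbors (bilinear sketch; RGGB layout, green site on an
    even row / odd column, interior pixel assumed)."""
    red = (raw[y, x - 1] + raw[y, x + 1]) / 2.0   # left/right red neighbors
    green = raw[y, x]                             # measured directly
    blue = (raw[y - 1, x] + raw[y + 1, x]) / 2.0  # above/below blue neighbors
    return red, green, blue
```

For a red or blue photosite the same idea applies, with green taken from the four edge neighbors and the opposite chroma color from the four diagonal neighbors.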
This simple approach works well in areas with constant color or smooth gradients, but it can cause artifacts such as color bleeding where there are abrupt changes in color or brightness, which are especially noticeable along sharp edges in the image. Because of this, other demosaicing methods attempt to identify high-contrast edges and interpolate only along these edges, not across them.
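The edge-aware idea can be illustrated by estimating green at a red photosite: compare the horizontal and vertical differences between the surrounding green samples and average only along the direction of lower contrast. This is a simplified sketch of the principle, not a reproduction of any specific published algorithm:

```python
import numpy as np

def green_at_red_edge_directed(raw, y, x):
    """Edge-directed green estimate at a red photosite (interior pixel,
    RGGB layout): interpolate along the direction of lower contrast so
    that values are never averaged across an edge."""
    dh = abs(raw[y, x - 1] - raw[y, x + 1])  # horizontal green difference
    dv = abs(raw[y - 1, x] - raw[y + 1, x])  # vertical green difference
    if dh < dv:    # lower contrast horizontally: interpolate along the row
        return (raw[y, x - 1] + raw[y, x + 1]) / 2.0
    elif dv < dh:  # lower contrast vertically: interpolate along the column
        return (raw[y - 1, x] + raw[y + 1, x]) / 2.0
    # no clear edge direction: fall back to the four-neighbor average
    return (raw[y, x - 1] + raw[y, x + 1] + raw[y - 1, x] + raw[y + 1, x]) / 4.0
```

Near a vertical edge the horizontal difference is large, so the estimate is taken from the vertical neighbors only, and vice versa.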
Other algorithms are based on the assumption that the color of an area in the image is relatively constant even under changing light conditions, so that the color channels are highly correlated with each other. The green channel is therefore interpolated first, followed by the red and then the blue channel, so that the red-to-green and blue-to-green color ratios remain locally constant. There are other methods that make different assumptions about the image content and, starting from those assumptions, attempt to calculate the missing color values.
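Under the constant-ratio assumption, a missing red value at a blue photosite can be reconstructed as the local green value times the average red-to-green ratio of the four diagonal red neighbors (RGGB layout, interior pixel; `green_full` is the already-interpolated green plane, and the helper is a hypothetical sketch of the principle):

```python
import numpy as np

def red_at_blue_constant_ratio(green_full, raw, y, x):
    """Estimate red at a blue photosite assuming R/G is locally constant:
    take the average R/G ratio of the four diagonal red neighbors and
    multiply it by the interpolated green at this site (sketch only)."""
    eps = 1e-6  # guard against division by zero in dark areas
    ratios = [
        raw[y + dy, x + dx] / (green_full[y + dy, x + dx] + eps)
        for dy in (-1, 1) for dx in (-1, 1)  # diagonal red neighbors
    ]
    return green_full[y, x] * sum(ratios) / 4.0
```

Interpolating ratios (or differences) rather than raw red and blue values is what keeps the reconstructed chroma consistent with the higher-resolution green channel.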
Images with small-scale detail close to the resolution limit of the digital sensor can be a problem for the demosaicing algorithm, producing a result that does not look like the original scene. The most frequent artifact is moiré, which may appear as repeating patterns, color artifacts, or pixels arranged in an unrealistic maze-like pattern.
False color artifact
A common and unfortunate artifact of color filter array (CFA) interpolation or demosaicing is known as false coloring. This artifact typically manifests itself along edges, where abrupt or unnatural shifts in color occur as a result of interpolating across, rather than along, an edge. Various methods exist for preventing and removing this false coloring. Smooth hue transition interpolation is used during demosaicing to prevent false colors from appearing in the final image. Other algorithms, however, can remove false colors after demosaicing; these have the benefit of removing false-coloring artifacts while allowing a more robust demosaicing algorithm to be used for interpolating the red and blue color planes.
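One common post-demosaicing approach of this kind suppresses false colors by median-filtering the color-difference planes R−G and B−G, which are smooth in natural images, and rebuilding red and blue from the smoothed differences. A minimal NumPy-only sketch (the 3×3 median filter is written out by hand so the example stays self-contained):

```python
import numpy as np

def suppress_false_color(rgb):
    """Post-demosaic false-color suppression: median-filter the R-G and
    B-G difference planes with a 3x3 window and rebuild R and B from the
    smoothed differences (illustrative sketch; borders left untouched)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    def median3(p):
        out = p.copy()
        h, w = p.shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y, x] = np.median(p[y - 1:y + 2, x - 1:x + 2])
        return out

    out = rgb.copy()
    out[..., 0] = g + median3(r - g)  # rebuild red from smoothed R-G
    out[..., 2] = g + median3(b - g)  # rebuild blue from smoothed B-G
    return out
```

An isolated false-color spike becomes an outlier in the difference plane and is removed by the median, while the luminance detail carried by the green channel is preserved.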
Another side effect of CFA demosaicing, which also occurs primarily along edges, is known as the zipper effect. Simply put, zippering is another name for edge blurring that occurs in an on/off pattern along an edge. This effect arises when the demosaicing algorithm averages pixel values across an edge, especially in the red and blue planes, resulting in its characteristic blur. As mentioned before, the best methods for preventing this effect are the various algorithms that interpolate along, rather than across, image edges. Pattern recognition interpolation, adaptive color plane interpolation, and directionally weighted interpolation all attempt to prevent zippering by interpolating along edges detected in the image.
However, even with a theoretically perfect sensor that could capture and distinguish all colors at each photosite, Moiré and other artifacts could still appear. This is an unavoidable consequence of any system that samples an otherwise continuous signal at discrete intervals or locations. For this reason, most photographic digital sensors incorporate an optical low-pass filter (OLPF), also called an anti-aliasing (AA) filter. This is typically a thin layer directly in front of the sensor, and works by effectively blurring any potentially problematic details that are finer than the resolution of the sensor.
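The role of the OLPF can be mimicked in one dimension: a pattern finer than the sample pitch aliases into a coarse false pattern, while a slight pre-blur before sampling suppresses it. An illustrative sketch with made-up numbers, not a model of any particular filter:

```python
import numpy as np

# A cosine with 0.45 cycles per point is far finer than what sampling
# every 4th point can represent, so the samples alias to a false low
# frequency. Blurring *before* sampling (the OLPF's job) removes the
# offending detail so the alias never appears.
n = 64
x = np.arange(n)
signal = np.cos(2 * np.pi * 0.45 * x)          # detail beyond the sampling limit
sampled = signal[::4]                          # "sensor" reads every 4th point
blurred = np.convolve(signal, np.ones(4) / 4, mode="same")  # crude box low-pass
sampled_blurred = blurred[::4]
# The unblurred samples oscillate strongly at a false low frequency,
# while the pre-blurred samples are strongly attenuated.
```

The same trade-off applies to the real filter: the blur discards fine detail the sensor could never record faithfully in exchange for removing moiré.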
The Bayer filter is almost universal on consumer digital cameras, and widespread among other cameras. Alternatives include:
- CYGM filter (cyan, yellow, green, magenta)
- RGBE filter (red, green, blue, emerald)
- Foveon X3 sensor (no mosaic)
Some mosaic filters add unfiltered pixels, for a quarter, a half, or some other fraction of the sensor pixels. Their formats may include:
- CMYW (cyan, magenta, yellow, and white)
- Jeff Mather (2008). "Adding L* to RGBG".
- dpreview.com (2000). "Sony announce 3 new digital cameras". Archived from the original on 2011-07-21.
- Margaret Brown (2004). Advanced Digital Photography. Media Publishing. ISBN 0-9581888-5-8.
- Thomas Maschke (2004). Digitale Kameratechnik: Technik digitaler Kameras in Theorie und Praxis. Springer. ISBN 3-540-40243-8.
- Wang, Peng; Menon, Rajesh (29 October 2015). "Ultra-high-sensitivity color imaging via a transparent diffractive-filter array and computational optics". Optica. 2 (11): 933. doi:10.1364/optica.2.000933.
- Patent US3971065 - Color imaging array - Google Patents
- Diallo, Amadou. "Adobe's Fujifilm X-Trans sensor processing tested". dpreview.com. Retrieved 20 October 2016.
- "Adobe Improves X-Trans Processing in Lightroom CC Update: Promises More to Come". Thomas Fitzgerald Photography Blog. Retrieved 20 October 2016.
- RGB "Bayer" Color and MicroLenses, Silicon Imaging (design, manufacturing and marketing of high-definition digital cameras and image processing solutions)
- eLynx image processing library, a large set of Bayer mosaic manipulation source code licensed under the GPL.
- Efficient, high-quality Bayer demosaic filtering on GPUs
- Global Computer Vision
- Review of Bayer Pattern Color Filter Array (CFA) Demosaicing with New Quality Assessment Algorithms
- Digital Camera Sensors