Histogram matching

[Figure: An example of histogram matching]

Histogram matching is a method in image processing for adjusting the colors of one image to match those of another, using the two images' histograms.

It is possible to use histogram matching to balance detector responses, as a relative detector calibration technique. It can also be used to normalize two images that were acquired over the same location and under the same local illumination (such as shadows), but with different sensors, atmospheric conditions, or global illumination.

The algorithm

Given two images, the input image (the one to be adjusted) and the reference image, we compute their histograms. Next, we calculate the cumulative distribution functions of the two images' histograms: F_1 for the input image and F_2 for the reference image. Then for each gray level G_1 ∈ [0, 255], we find the gray level G_2 for which F_1(G_1) = F_2(G_2); this defines the histogram matching function M(G_1) = G_2. Finally, we apply the function M to each pixel of the input image.
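
With discrete histograms an exact solution of F_1(G_1) = F_2(G_2) rarely exists, so in practice one takes the smallest G_2 whose cumulative value reaches F_1(G_1). The following is a minimal sketch in Python with NumPy under that convention; the function name and the assumption of integer-valued grayscale inputs are ours, not part of the source:

```python
import numpy as np

def match_histogram(source, reference, levels=256):
    """Adjust `source` so its gray-level histogram matches `reference`."""
    # Histograms of both images.
    src_hist = np.bincount(source.ravel(), minlength=levels)
    ref_hist = np.bincount(reference.ravel(), minlength=levels)

    # Cumulative distribution functions F_1 (input) and F_2 (reference).
    F1 = np.cumsum(src_hist) / source.size
    F2 = np.cumsum(ref_hist) / reference.size

    # For each gray level G_1, find the smallest G_2 with
    # F_2(G_2) >= F_1(G_1); this is the matching function M(G_1) = G_2.
    M = np.searchsorted(F2, F1, side="left").clip(0, levels - 1)

    # Apply M to every pixel of the input image.
    return M[source]
```

For example, match_histogram(image, reference) with two 8-bit grayscale arrays returns the input image remapped so that its cumulative histogram follows that of the reference.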

Multiple histogram matching

The histogram matching algorithm can be extended to find a monotonic mapping between two sets of histograms. Given two sets of histograms P = {p_i}_{i=1}^{k} and Q = {q_i}_{i=1}^{k}, the optimal monotonic color mapping M is calculated to minimize the distance between the two sets simultaneously, namely min_M Σ_{i=1}^{k} d(M(p_i), q_i), where d(·,·) is a distance metric between two histograms. The optimal solution is calculated using dynamic programming.[1]
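
The source above does not fix the distance metric d or the details of the recurrence, so the sketch below is only an illustration, not the paper's algorithm. It assumes d is the sum of squared per-bin differences and uses the fact that a monotonic mapping sends consecutive runs of source gray levels to successive target levels, which yields a standard interval-partition dynamic program (all names are ours):

```python
import numpy as np

def multi_histogram_match(P, Q):
    """Find a monotonic gray-level mapping M minimizing
    sum_i d(M(p_i), q_i) over k histogram pairs via dynamic
    programming. P, Q: arrays of shape (k, L). Here d is assumed
    to be the sum of squared per-bin differences."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    k, L = P.shape

    # cum[:, i] = mass of source bins [0, i) in each of the k histograms.
    cum = np.zeros((k, L + 1))
    cum[:, 1:] = np.cumsum(P, axis=1)

    INF = float("inf")
    # C[i, j]: minimal cost with source bins [0, i) assigned and
    # target bins [0, j) finalized; back[i, j] records the split point.
    C = np.full((L + 1, L + 1), INF)
    C[0, 0] = 0.0
    back = np.zeros((L + 1, L + 1), dtype=int)

    for j in range(1, L + 1):
        for i in range(L + 1):
            best, arg = INF, 0
            # Source bins [s, i) all map to target bin j - 1
            # (s == i leaves target bin j - 1 empty).
            for s in range(i + 1):
                if C[s, j - 1] == INF:
                    continue
                mass = cum[:, i] - cum[:, s]
                cost = C[s, j - 1] + np.sum((mass - Q[:, j - 1]) ** 2)
                if cost < best:
                    best, arg = cost, s
            C[i, j], back[i, j] = best, arg

    # Backtrack to recover M: source gray level -> target gray level.
    M = np.zeros(L, dtype=int)
    i = L
    for j in range(L, 0, -1):
        s = back[i, j]
        M[s:i] = j - 1
        i = s
    return M
```

For k = 1 this reduces to matching a single histogram pair under the assumed metric. The run time is O(k·L^3), which is fine for illustration at small L but too slow for 256-level images without further optimization; see the cited paper for the exact formulation.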

References

  1. Shapira, D.; Avidan, S.; Hel-Or, Y. (2013). "Multiple Histogram Matching". Proceedings of the IEEE International Conference on Image Processing.
