# Phase congruency

Phase congruency is a measure of feature significance in digital images, and a method of edge detection that is particularly robust against changes in illumination and contrast.

## Foundations

Phase congruency reflects the behaviour of the image in the frequency domain. It has been noted that edge-like features have many of their frequency components in the same phase. The concept is similar to coherence, except that it applies to functions of different wavelengths.

For example, the Fourier decomposition of a square wave consists of sine functions, whose frequencies are odd multiples of the fundamental frequency. At the rising edges of the square wave, each sinusoidal component has a rising phase; the phases have maximal congruency at the edges. This corresponds to the human-perceived edges in an image where there are sharp changes between light and dark.

## Definition

Phase congruency measures the alignment of the Fourier components of a signal, weighted by their amplitudes ${\displaystyle A_{n}}$, relative to the sum of those amplitudes.

${\displaystyle PC(t)=\max _{\bar {\phi }}{\frac {\sum _{n}A_{n}\cos(\phi _{n}(t)-{\bar {\phi }})}{\sum _{n}A_{n}}}}$

where ${\displaystyle \phi _{n}(t)}$ is the local or instantaneous phase of the ${\displaystyle n}$-th component, which can be calculated using the Hilbert transform, and ${\displaystyle A_{n}}$ is its local amplitude. When all the phases are aligned, the phase congruency equals 1.