Correlation function (astronomy)
In astronomy, a correlation function describes the distribution of galaxies in the universe. By default, "correlation function" refers to the two-point autocorrelation function. The two-point autocorrelation function is a function of one variable (distance); it describes the excess probability of finding two galaxies separated by this distance, over and above the probability that would arise if the galaxies were scattered independently and with uniform probability. It can be thought of as a lumpiness factor: the higher its value at some distance scale, the lumpier the universe is at that scale.
The following definition (from Peebles 1980) is often cited:
- Given a random galaxy in a location, the correlation function describes the probability that another galaxy will be found within a given distance.
However, this can only be correct in a statistical sense, averaged over a large number of galaxies chosen as the first, "random" galaxy. If just one galaxy were chosen, the definition would fail: first, it is meaningless to speak of a single "random" galaxy, and second, the resulting function would vary wildly depending on which galaxy was chosen, contradicting its definition as a function.
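This averaging is exactly what pair-counting estimators do in practice. A minimal sketch, using the simple "natural" estimator DD/RR − 1 (data-data pair counts compared against an unclustered random catalogue); the toy catalogues, function names, and brute-force pair counting are illustrative assumptions. Real surveys use refinements such as the Landy–Szalay estimator and tree-based pair counting.

```python
import numpy as np

def pair_counts(points, bins):
    """Count distinct pairs of points per separation bin (brute force; fine for small N)."""
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)  # each pair counted once
    return np.histogram(d[iu], bins=bins)[0]

def xi_natural(data, randoms, bins):
    """Natural estimator: xi(r) = (DD/RR) * Nr(Nr-1)/(Nd(Nd-1)) - 1."""
    dd = pair_counts(data, bins)
    rr = pair_counts(randoms, bins)
    nd, nr = len(data), len(randoms)
    norm = (nr * (nr - 1)) / (nd * (nd - 1))
    with np.errstate(divide="ignore", invalid="ignore"):
        return dd / rr * norm - 1.0

rng = np.random.default_rng(0)
data = rng.random((300, 3))      # stand-in "galaxy" positions in a unit box
randoms = rng.random((1000, 3))  # unclustered reference catalogue
bins = np.linspace(0.1, 0.5, 5)
xi = xi_natural(data, randoms, bins)
```

Because the stand-in "galaxies" here are themselves uniformly scattered, the estimated xi should hover near zero in every bin; clustered data would push it positive at small separations.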
Assuming statistical homogeneity and isotropy, the two-point autocorrelation function can be written as

\xi(r) = \langle \delta(\mathbf{x}) \, \delta(\mathbf{x} + \mathbf{r}) \rangle,

where \delta(\mathbf{x}) = (\rho(\mathbf{x}) - \bar{\rho}) / \bar{\rho} is a unitless measure of overdensity, defined at every point. Letting P(k) denote the power spectrum of the overdensity field, it can also be expressed as the integral

\xi(r) = \frac{1}{2\pi^2} \int_0^\infty k^2 \, P(k) \, \frac{\sin kr}{kr} \, dk.
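The correlation function is the Fourier transform of the power spectrum P(k): for an isotropic field, xi(r) = (1/2pi^2) * integral of k^2 P(k) sin(kr)/(kr) dk. A sketch of evaluating this numerically for a toy Gaussian power spectrum P(k) = exp(-a k^2), chosen here as an assumption because its transform has a closed form to check against:

```python
import numpy as np

def xi_from_power_spectrum(r, power, k):
    """xi(r) = (1/2pi^2) * integral k^2 P(k) sin(kr)/(kr) dk, via the trapezoid rule."""
    kr = k * r
    # np.sinc(x) = sin(pi x)/(pi x), so np.sinc(kr/pi) = sin(kr)/(kr) and handles kr -> 0
    integrand = k ** 2 * power * np.sinc(kr / np.pi)
    area = np.sum((integrand[1:] + integrand[:-1]) * np.diff(k)) / 2.0
    return float(area) / (2.0 * np.pi ** 2)

# toy Gaussian power spectrum with a known analytic transform
a = 0.5
k = np.linspace(1e-4, 20.0, 20000)
P = np.exp(-a * k ** 2)

r = 1.0
numeric = xi_from_power_spectrum(r, P, k)
# analytic transform of the Gaussian: sqrt(pi)/(8 pi^2 a^{3/2}) * exp(-r^2/(4a))
analytic = np.sqrt(np.pi) / (8 * np.pi ** 2 * a ** 1.5) * np.exp(-r ** 2 / (4 * a))
```

The numeric and analytic values agree closely here; for a realistic cosmological P(k) the integral is oscillatory and needs more careful quadrature, but the structure is the same.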
The n-point autocorrelation functions for n greater than 2 or cross-correlation functions for particular object types are defined similarly to the two-point autocorrelation function.
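A cross-correlation measures the excess probability of finding a pair with one object drawn from each of two catalogues (say, two galaxy types) at a given separation. A self-contained sketch, again with illustrative toy catalogues and a deliberately simple estimator (pairs between the two catalogues, normalized by random-random pairs):

```python
import numpy as np

def cross_pair_counts(a, b, bins):
    """Count pairs (one point from catalogue a, one from b) per separation bin."""
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1))
    return np.histogram(d, bins=bins)[0]

def xi_cross(cat_a, cat_b, randoms, bins):
    """Simple cross-correlation estimator: xi_ab(r) = (DaDb / RR) * Nr^2/(Na*Nb) - 1."""
    dadb = cross_pair_counts(cat_a, cat_b, bins)
    # randoms vs. randoms counts each pair twice and includes zero-distance
    # self-pairs, which fall below the first bin edge; the double counting
    # is consistent with the Nr^2 normalization below
    rr = cross_pair_counts(randoms, randoms, bins)
    norm = len(randoms) ** 2 / (len(cat_a) * len(cat_b))
    with np.errstate(divide="ignore", invalid="ignore"):
        return dadb / rr * norm - 1.0

rng = np.random.default_rng(1)
cat_a = rng.random((250, 3))     # e.g. one galaxy type
cat_b = rng.random((250, 3))     # e.g. another galaxy type
randoms = rng.random((1000, 3))  # unclustered reference catalogue
bins = np.linspace(0.1, 0.5, 5)
xi_ab = xi_cross(cat_a, cat_b, randoms, bins)
```

With two independent uniform catalogues, xi_ab should be near zero in every bin; physically associated populations would show a positive signal at small separations.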
The correlation function is important for theoretical models of physical cosmology because it provides a means of testing models that make different assumptions about the contents of the universe.