# Kirkwood approximation

The Kirkwood superposition approximation was introduced in 1935 by John G. Kirkwood as a means of representing a discrete probability distribution.[1] The Kirkwood approximation for a discrete probability distribution ${\displaystyle P(x_{1},x_{2},\ldots ,x_{n})}$ is given by

${\displaystyle P^{\prime }(x_{1},x_{2},\ldots ,x_{n})={\frac {\prod _{{\mathcal {T}}_{n-1}\subseteq {\mathcal {V}}}p({\mathcal {T}}_{n-1})}{\frac {\prod _{{\mathcal {T}}_{n-2}\subseteq {\mathcal {V}}}p({\mathcal {T}}_{n-2})}{\frac {\vdots }{\prod _{{\mathcal {T}}_{1}\subseteq {\mathcal {V}}}p({\mathcal {T}}_{1})}}}}$

where

${\displaystyle \prod _{{\mathcal {T}}_{i}\subseteq {\mathcal {V}}}p({\mathcal {T}}_{i})}$

is the product of probabilities over all subsets of variables of size i in variable set ${\displaystyle \scriptstyle {\mathcal {V}}}$. This kind of formula has been considered by Watanabe (1960) and, according to Watanabe, also by Robert Fano. For the three-variable case, it reduces to simply

${\displaystyle P^{\prime }(x_{1},x_{2},x_{3})={\frac {p(x_{1},x_{2})p(x_{2},x_{3})p(x_{1},x_{3})}{p(x_{1})p(x_{2})p(x_{3})}}}$
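For general n, the continued fraction above unrolls into a product of marginals with alternating exponents: the size-(n−1) marginals enter with exponent +1, the size-(n−2) marginals with −1, and so on down to the single-variable marginals. The following NumPy sketch computes this directly; the function name `kirkwood` and the dense-array representation of the joint are illustrative choices, not from the original sources:

```python
import itertools
import numpy as np

def kirkwood(P):
    """Kirkwood approximation of a joint distribution P over n discrete
    variables: multiply together all marginals of sizes 1 .. n-1, with
    exponent +1 for size n-1, -1 for size n-2, and so on, alternating."""
    n = P.ndim
    Pk = np.ones_like(P)
    for size in range(1, n):
        sign = (-1) ** (n - 1 - size)  # alternating exponent
        for subset in itertools.combinations(range(n), size):
            # Marginalize out every variable not in this subset.
            axes = tuple(a for a in range(n) if a not in subset)
            marg = P.sum(axis=axes)
            # Broadcast the marginal back to the full joint shape.
            shape = [P.shape[a] if a in subset else 1 for a in range(n)]
            Pk = Pk * marg.reshape(shape) ** sign
    return Pk
```

For n = 3 this reproduces the displayed formula, and for n = 2 it reduces to the independence approximation p(x1)p(x2).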

The Kirkwood approximation does not generally produce a valid probability distribution (the normalization condition is violated). Watanabe claims that for this reason informational expressions of this type are not meaningful, and indeed there has been very little written about the properties of this measure. The Kirkwood approximation is the probabilistic counterpart of the interaction information.
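The failure of normalization is easy to check numerically. A minimal sketch using the three-variable formula above, with an arbitrary made-up joint distribution over three binary variables (the specific numbers carry no significance):

```python
import itertools
import numpy as np

# An arbitrary made-up joint distribution over three binary variables.
P = np.array([0.10, 0.15, 0.05, 0.20,
              0.05, 0.10, 0.20, 0.15]).reshape(2, 2, 2)

# Pairwise and single-variable marginals.
p12, p23, p13 = P.sum(axis=2), P.sum(axis=0), P.sum(axis=1)
p1, p2, p3 = P.sum(axis=(1, 2)), P.sum(axis=(0, 2)), P.sum(axis=(0, 1))

# Three-variable Kirkwood approximation.
Pk = np.empty_like(P)
for i, j, k in itertools.product(range(2), repeat=3):
    Pk[i, j, k] = p12[i, j] * p23[j, k] * p13[i, k] / (p1[i] * p2[j] * p3[k])

print(Pk.sum())  # ~1.0017, not 1: P' is not a valid probability distribution
```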

Judea Pearl (1988 §3.2.4) indicates that an expression of this type can be exact in the case of a decomposable model, that is, a probability distribution that admits a graph structure whose cliques form a tree. In such cases, the numerator contains the product of the intra-clique joint distributions and the denominator contains the product of the clique intersection distributions.
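The simplest decomposable example is a three-variable Markov chain x1 → x2 → x3: its cliques {x1, x2} and {x2, x3} form a tree whose intersection is {x2}, so the joint factors exactly as p(x1, x2) p(x2, x3) / p(x2). A sketch with made-up conditional probability tables (an illustrative construction, not Pearl's own example):

```python
import itertools
import numpy as np

# Made-up Markov chain x1 -> x2 -> x3 over binary variables.
p_x1 = np.array([0.3, 0.7])
p_x2_given_x1 = np.array([[0.9, 0.1],
                          [0.2, 0.8]])   # rows: x1, cols: x2
p_x3_given_x2 = np.array([[0.6, 0.4],
                          [0.25, 0.75]])  # rows: x2, cols: x3

# True joint: p(x1) p(x2|x1) p(x3|x2).
P = np.empty((2, 2, 2))
for i, j, k in itertools.product(range(2), repeat=3):
    P[i, j, k] = p_x1[i] * p_x2_given_x1[i, j] * p_x3_given_x2[j, k]

# Intra-clique joints and the clique-intersection marginal.
p12, p23, p2 = P.sum(axis=2), P.sum(axis=0), P.sum(axis=(0, 2))

# Clique factorization: product of clique joints over intersection marginals.
Q = np.empty_like(P)
for i, j, k in itertools.product(range(2), repeat=3):
    Q[i, j, k] = p12[i, j] * p23[j, k] / p2[j]

print(np.allclose(P, Q))  # True: the factorization is exact here
```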

## References

1. ^ Kirkwood, J. G. (1935), Statistical mechanics of fluid mixtures, Journal of Chemical Physics 3, 300.
• Jakulin, A. & Bratko, I. (2004), Quantifying and visualizing attribute interactions: An approach based on entropy, Journal of Machine Learning Research, (submitted) pp. 38–43.
• Matsuda, H. (2000), Physical nature of higher-order mutual information: Intrinsic correlations and frustration, Physical Review E 62, 3096–3102.
• Pearl, J. (1988), Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann, San Mateo, CA.
• Watanabe, S. (1960), Information theoretical analysis of multivariate correlation, IBM Journal of Research and Development 4, 66–82.