Variation of information

In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements). It is closely related to mutual information; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the variation of information is a true metric, in that it obeys the triangle inequality. Even more, it is a universal metric, in that if any other distance measure places two items close to each other, then the variation of information will also judge them close.[1]

Definition

Suppose we have two clusterings (partitions of a set into disjoint subsets) X and Y, where X = \{X_{1}, X_{2}, \ldots, X_{k}\} and Y = \{Y_{1}, Y_{2}, \ldots, Y_{l}\}. Let n = \sum_{i} |X_{i}| = \sum_{j} |Y_{j}|, and let p_{i} = |X_{i}| / n and q_{j} = |Y_{j}| / n. Then the variation of information between the two clusterings is:

VI(X; Y) = H(X) + H(Y) - 2 I(X; Y)

where H(X) is the entropy of X and I(X; Y) is the mutual information between X and Y.
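
Here H(X) and I(X; Y) are the entropy and mutual information of the discrete distributions induced by the two partitions. Writing r_{ij} = |X_{i} \cap Y_{j}| / n for the fraction of elements shared by X_{i} and Y_{j}, they can be computed as

H(X) = - \sum_{i} p_{i} \log p_{i}

I(X; Y) = \sum_{i,j} r_{ij} \log ( r_{ij} / (p_{i} q_{j}) )

with the usual convention that 0 \log 0 = 0.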

This is equivalent to the shared information distance.
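
As a minimal illustrative sketch (not taken from the cited references), the formula above can be evaluated directly from two per-element label assignments, here in Python; the function name and the choice of natural logarithm are arbitrary:

    import math
    from collections import Counter

    def variation_of_information(labels_x, labels_y):
        """Variation of information between two clusterings, each given as a
        list of per-element cluster labels over the same n elements."""
        assert len(labels_x) == len(labels_y)
        n = len(labels_x)

        p = Counter(labels_x)                  # cluster sizes |X_i|
        q = Counter(labels_y)                  # cluster sizes |Y_j|
        r = Counter(zip(labels_x, labels_y))   # overlaps |X_i ∩ Y_j|

        def entropy(counts):
            # Entropy of the distribution induced by a partition.
            return -sum((c / n) * math.log(c / n) for c in counts.values())

        # Mutual information I(X; Y) from the joint overlap distribution.
        mi = sum((c / n) * math.log((c / n) / ((p[i] / n) * (q[j] / n)))
                 for (i, j), c in r.items())

        return entropy(p) + entropy(q) - 2 * mi

    # Identical clusterings are at distance 0; two maximally "crossed"
    # 2-cluster partitions of 4 elements are at distance 2 ln 2.
    print(variation_of_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 0.0
    print(variation_of_information([0, 0, 1, 1], [0, 1, 0, 1]))  # ≈ 1.386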

References

  1. ^ Alexander Kraskov, Harald Stögbauer, Ralph G. Andrzejak, and Peter Grassberger, "Hierarchical Clustering Based on Mutual Information" (2003). arXiv:q-bio/0311039

Further reading

  • Arabie, P.; Boorman, S. A. (1973). "Multidimensional scaling of measures of distance between partitions". Journal of Mathematical Psychology 10: 148–203. doi:10.1016/0022-2496(73)90012-6. 
  • Meila, Marina (2003). "Comparing Clusterings by the Variation of Information". Learning Theory and Kernel Machines: 173–187. doi:10.1007/978-3-540-45167-9_14. 
  • Meila, M. (2007). "Comparing clusterings—an information based distance". Journal of Multivariate Analysis 98 (5): 873–895. doi:10.1016/j.jmva.2006.11.013.
  • Kingsford, Carl (2009). "Information Theory Notes" (PDF). Retrieved 22 September 2009. 
