Talk:Statistical distance
WikiProject Statistics (Start-class, Mid-importance)
Comparison table
It would be nice to have a comparison table showing how far each of the distance or divergence measures is from being a metric. Please feel free to fill in the missing (-) entries; references are welcome too. I mostly copied the data from the articles on these distances:
| Divergence between distributions | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric |
|---|---|---|---|---|---|
| Kullback–Leibler divergence | no | yes | no | yes | no |
| Hellinger distance | - | yes | - | - | - |
| Total variation distance of probability measures | yes | yes | yes | yes | yes |
| Jensen–Shannon divergence | yes | - | - | - | no |
| Jensen–Shannon distance | yes | yes | yes | yes | yes |
| Lévy–Prokhorov metric | yes | yes | yes | yes | yes |
| Bhattacharyya distance | - | - | - | - | no |
| Wasserstein metric | yes | yes | yes | yes | yes |
| Divergence between a point and a distribution | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric |
|---|---|---|---|---|---|
| Mahalanobis distance | - | - | - | - | - |
Olli Niemitalo (talk) 10:22, 3 December 2014 (UTC)
- That's nice. I've completed the missing entries for the total variation distance. - Saibod (talk) 23:54, 3 March 2016 (UTC)
There is no mention of the statistical distance used in 100% of the crypto papers I've encountered: SD(u, v) = (1/2) ∑_i |v_i - u_i|. Is there a reason for that, or is it just missing? For example, "Three XOR-Lemmas -- An Exposition" by Oded Goldreich states it. "Randomness Extraction and Key Derivation Using the CBC, Cascade and HMAC Modes" by Yevgeniy Dodis, Rosario Gennaro, Johan Hastad, Hugo Krawczyk and Tal Rabin gives the same definition. Those are the first two papers I checked.
David in oregon (talk) 05:08, 1 October 2016 (UTC)
Never mind. It was a special case of the total variation distance. David in oregon (talk) 23:27, 1 October 2016 (UTC)
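For discrete distributions, the coincidence noted above is easy to check numerically: the cryptographic definition SD(u, v) = (1/2) ∑ |v_i - u_i| equals the total variation distance, which for a finite space is attained by the event where one distribution exceeds the other. A minimal sketch (the probability vectors are made-up examples, not from any cited paper):

```python
# Two made-up discrete probability distributions over three outcomes.
u = [0.5, 0.3, 0.2]
v = [0.4, 0.4, 0.2]

# Cryptographic statistical distance: half the L1 distance between the vectors.
sd = 0.5 * sum(abs(vi - ui) for ui, vi in zip(u, v))

# Total variation distance: max_A |u(A) - v(A)|; on a finite space the
# maximizing event A is the set of outcomes where v exceeds u.
tv = sum(vi - ui for ui, vi in zip(u, v) if vi > ui)

print(sd, tv)  # the two values agree (up to floating-point rounding)
```

The agreement of `sd` and `tv` is exactly why the crypto definition does not need a separate article: it is the total variation distance restricted to discrete distributions.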