Talk:Statistical distance

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by David in oregon (talk | contribs) at 23:27, 1 October 2016. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Mid-importance on the importance scale.

Comparison table

It would be nice to have a comparison table showing how far the distance or divergence measures are from being a metric. Please feel free to fill in missing (-) data; references are welcome too. I mostly copied data from the articles on these distances:

Divergence between distributions                  | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric
Kullback–Leibler divergence                       | no        | yes         | no                  | yes                        | no
Hellinger distance                                | -         | yes         | -                   | -                          | -
Total variation distance of probability measures  | yes       | yes         | yes                 | yes                        | yes
Jensen–Shannon divergence                         | yes       | -           | -                   | -                          | no
Jensen–Shannon distance                           | yes       | yes         | yes                 | yes                        | yes
Lévy–Prokhorov metric                             | yes       | yes         | yes                 | yes                        | yes
Bhattacharyya distance                            | -         | -           | -                   | -                          | no
Wasserstein metric                                | yes       | yes         | yes                 | yes                        | yes

Divergence between a point and a distribution     | Symmetric | Nonnegative | Triangle inequality | Identity of indiscernibles | Metric
Mahalanobis distance                              | -         | -           | -                   | -                          | -

Olli Niemitalo (talk) 10:22, 3 December 2014 (UTC)[reply]
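Some of the table's entries can be spot-checked numerically. The sketch below (an illustration, not part of the original discussion) uses the standard discrete-distribution formulas for the Kullback–Leibler divergence and the total variation distance, and checks symmetry and one instance of the triangle inequality on a few hand-picked distributions; it demonstrates the properties on examples only, it does not prove them.

```python
# Numerical spot-checks of some table entries (a sketch, not a proof).
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) for discrete distributions
    return float(np.sum(p * np.log(p / q)))

def tv(p, q):
    # Total variation distance: half the L1 distance for discrete distributions
    return 0.5 * float(np.sum(np.abs(p - q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])
r = np.array([0.1, 0.1, 0.8])

print(kl(p, q), kl(q, p))               # the two values differ: KL is not symmetric
print(tv(p, q) == tv(q, p))             # True: TV is symmetric
print(tv(p, r) <= tv(p, q) + tv(q, r))  # True: triangle inequality holds here
```

A single counterexample suffices to show KL is not a metric, which is why one asymmetric pair settles the "Symmetric: no" entry, while the "yes" entries would need a proof rather than examples.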

That's nice. I've completed missing entries on the total variation distance. - Saibod (talk) 23:54, 3 March 2016 (UTC)[reply]

There is no mention of the statistical distance used in 100% of the crypto papers I've encountered: SD(u, v) = (1/2) ∑_i |v_i − u_i|. Is there a reason for that, or is it just missing? For example, "Three XOR-Lemmas -- An Exposition" by Oded Goldreich states it. "Randomness Extraction and Key Derivation Using the CBC, Cascade and HMAC Modes" by Yevgeniy Dodis, Rosario Gennaro, Johan Håstad, Hugo Krawczyk and Tal Rabin states the same definition. Those are the first two papers I checked.

David in oregon (talk) 05:08, 1 October 2016 (UTC)[reply]

Never mind. It was a special case of the total variation distance. David in oregon (talk) 23:27, 1 October 2016 (UTC)[reply]
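The equivalence noted above can be checked numerically on a finite sample space: the crypto-style definition SD(u, v) = (1/2) ∑_i |v_i − u_i| agrees with the total variation distance defined as the maximum, over all events A, of |u(A) − v(A)|, because that maximum is attained by the event {i : u_i > v_i}. The sketch below (an illustration, not part of the original discussion) compares the two on a small example by brute-forcing all events.

```python
# Sketch: the crypto-style statistical distance equals the total variation
# distance on a finite sample space.
from itertools import chain, combinations

def sd(u, v):
    # statistical distance as defined in the crypto papers cited above
    return 0.5 * sum(abs(a - b) for a, b in zip(u, v))

def tv_by_events(u, v):
    # total variation as a maximum over all events (subsets of outcomes)
    n = len(u)
    subsets = chain.from_iterable(combinations(range(n), k) for k in range(n + 1))
    return max(abs(sum(u[i] for i in A) - sum(v[i] for i in A)) for A in subsets)

u = [0.5, 0.3, 0.2]
v = [0.2, 0.5, 0.3]
print(sd(u, v), tv_by_events(u, v))  # both 0.3, up to floating-point rounding
```

Brute-forcing all 2^n events is only feasible for tiny n; it is used here purely to make the "maximum over events" definition concrete.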