|WikiProject Statistics||(Rated Start-class, Low-importance)|
|WikiProject Mathematics||(Rated Start-class, Low-importance)|
Computer vision category
I removed this article from the computer vision category. This B-distance is probably useful in some parts of CV, but:
- It is not a concept developed within CV or specific to CV.
- There is no material in this article which relates it to CV.
--KYN 22:19, 27 July 2007 (UTC)
The articles do seem to be about the same thing. One problem may be the title to adopt. Should "distance" be used, considering that it isn't a "distance" in the metric sense? So what is the common usage? Melcombe (talk) 16:31, 15 May 2008 (UTC)
- Google seems to prefer distance (13,000 versus 3,000 hits). Btyner (talk) 20:14, 10 May 2009 (UTC)
- It is a measure of dissimilarity, even though it does not obey the triangle inequality (see Kailath's paper, referenced in the main article). Jrvz (talk) 16:54, 8 November 2010 (UTC)
The term "distance" certainly applies to the Bhattacharyya distance. It is a different type of distance than the traditional Euclidean distance between two points, but there are many other "distances", such as the chessboard distance, the chamfer distance or the Kullback-Leibler distance (also called divergence). What the B.D. is measuring is how far apart two "things" are, where the "things" are statistical distributions: the closer the distributions, the smaller the distance. "Bhattacharyya distance" is a term widely used in statistics and in other areas such as computer vision and image processing, where it refers to the distance between distributions of classes obtained from an image. So it is not directly related to C.V. but is widely used there. — Preceding unsigned comment added by Creyes (talk • contribs) 13:41, 12 February 2013 (UTC)
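To make the point above concrete: for two discrete distributions (e.g. normalized image histograms), the Bhattacharyya coefficient is BC = sum_i sqrt(p_i * q_i) and the distance is D_B = -ln(BC). A minimal Python sketch (the function name is just for illustration):

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two discrete distributions.

    p and q are sequences of probabilities, each summing to 1.
    The coefficient BC = sum_i sqrt(p_i * q_i) lies in [0, 1];
    identical distributions give BC = 1, hence distance 0.
    """
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -math.log(bc)

# The closer the distributions, the smaller the distance:
p = [0.4, 0.4, 0.2]
q = [0.2, 0.4, 0.4]
assert abs(bhattacharyya_distance(p, p)) < 1e-12   # identical -> 0
assert bhattacharyya_distance(p, q) > 0            # different -> positive
```

Note that D_B is symmetric in p and q but, as discussed above, does not satisfy the triangle inequality, so it is not a metric.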
Comments from Talk:Bhattacharya coefficient
Note: in article's Definition section, in paragraph relating it to Mahalanobis distance, figures are mentioned but none are present.
Error in simplified formula?
In the definition based on mean and variance it says "where sigma_p is the variance of the p-th distribution". Is sigma really the variance? Shouldn't it be the standard deviation? Since sigma^2 is used everywhere in the formula, I suspect it should read "where sigma^2_p is the variance of the p-th distribution". Can someone confirm or refute that, please?
In the referenced article (which also says that sigma is the variance), this expression is derived from the more general expression involving covariance matrices. Since the covariance matrix is the generalization of the variance to higher dimensions, and the derivation leads to the same expression with sigma as the standard deviation (thus sigma squared as the variance), this must be an error in that article as well. Can someone please confirm that?
- Using Maxima I can quickly derive Bhattacharyya distance for normal distributions directly like that:
N[m,s](x) := 1/(s*sqrt(2*%pi)) * exp(-(x-m)**2/(2*s**2)) $
assume(sp > 0, sq > 0) $
-log(integrate(sqrt(N[mp,sp](x) * N[mq,sq](x)), x, -inf, inf));
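For anyone without Maxima at hand, the same check can be done numerically. The sketch below (function names are just for illustration) compares a midpoint-rule integration of -ln of the integral of sqrt(N_p(x) * N_q(x)) against the closed form from the article, with sigma_p read as the standard deviation (so sigma_p^2 is the variance); the two agree, supporting the correction proposed above:

```python
import math

def normal_pdf(x, m, s):
    # Normal density with mean m and standard deviation s.
    return math.exp(-(x - m)**2 / (2 * s**2)) / (s * math.sqrt(2 * math.pi))

def bhattacharyya_numeric(mp, sp, mq, sq, lo=-50.0, hi=50.0, n=200_000):
    # Midpoint-rule integration of sqrt(p(x) q(x)) over a wide interval.
    dx = (hi - lo) / n
    bc = sum(math.sqrt(normal_pdf(lo + (i + 0.5) * dx, mp, sp) *
                       normal_pdf(lo + (i + 0.5) * dx, mq, sq))
             for i in range(n)) * dx
    return -math.log(bc)

def bhattacharyya_closed_form(mp, sp, mq, sq):
    # Closed form for two univariate normals, with sp and sq as
    # standard deviations (so sp**2 and sq**2 are the variances).
    return (0.25 * math.log(0.25 * (sp**2 / sq**2 + sq**2 / sp**2 + 2))
            + 0.25 * (mp - mq)**2 / (sp**2 + sq**2))

# The two computations agree only under the standard-deviation reading:
diff = abs(bhattacharyya_numeric(0.0, 1.0, 1.0, 2.0)
           - bhattacharyya_closed_form(0.0, 1.0, 1.0, 2.0))
assert diff < 1e-4
```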