# Bhatia–Davis inequality

In mathematics, the Bhatia–Davis inequality, named after Rajendra Bhatia and Chandler Davis, is an upper bound on the variance σ² of any bounded probability distribution on the real line.

Suppose a distribution has minimum m, maximum M, and expected value μ. Then the Bhatia–Davis inequality states:

${\displaystyle \sigma ^{2}\leq (M-\mu )(\mu -m).}$

Equality holds if and only if all of the probability is concentrated at the endpoints m and M.
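As a numerical illustration (a sketch in plain Python; the two discrete distributions below are hypothetical examples, not taken from the source), the bound is strict when mass sits in the interior of [m, M] and is attained when all mass sits at the endpoints:

```python
def variance_and_bound(values, probs):
    """Return (variance, Bhatia-Davis bound (M - mu)(mu - m))
    for a discrete distribution given by values and probabilities."""
    mu = sum(v * p for v, p in zip(values, probs))
    var = sum(p * (v - mu) ** 2 for v, p in zip(values, probs))
    m, M = min(values), max(values)
    return var, (M - mu) * (mu - m)

# Mass in the interior of [0, 1]: strict inequality.
var, bound = variance_and_bound([0.0, 0.5, 1.0], [0.25, 0.5, 0.25])
print(var, bound)  # 0.125 < 0.25

# All mass at the endpoints: the bound is attained.
var, bound = variance_and_bound([0.0, 1.0], [0.3, 0.7])
print(var, bound)  # both 0.21
```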

The Bhatia–Davis inequality is stronger than Popoviciu's inequality on variances.
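The comparison can be made explicit with the AM–GM inequality applied to the factors M − μ and μ − m:

${\displaystyle (M-\mu )(\mu -m)\leq \left({\frac {(M-\mu )+(\mu -m)}{2}}\right)^{2}={\frac {(M-m)^{2}}{4}},}$

so σ² ≤ (M − μ)(μ − m) ≤ (M − m)²/4, which recovers Popoviciu's inequality.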

A lower bound for the variance, based on the Bhatia–Davis inequality, was derived by Agarwal et al.:[1]

${\displaystyle (M-\mu )(\mu -m)-{\frac {(M-m)^{3}}{6}}\leq \sigma ^{2}}$