Stein's lemma

Stein's lemma,[1] named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its applications to statistical inference — in particular, to James–Stein estimation and empirical Bayes methods — and to portfolio choice theory. The theorem gives a formula for the covariance of one random variable with the value of a function of another, when the two random variables are jointly normally distributed.

Statement of the lemma

Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E(g(X)(X − μ)) and E(g′(X)) both exist (the existence of the expectation of any random variable is equivalent to the finiteness of the expectation of its absolute value). Then

${\displaystyle E{\bigl (}g(X)(X-\mu ){\bigr )}=\sigma ^{2}E{\bigl (}g'(X){\bigr )}.}$
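As a minimal numerical sketch of this identity (not part of the original article; the choices of g, μ, and σ below are arbitrary assumptions), both sides can be estimated by Monte Carlo:

```python
# Monte Carlo check of E[g(X)(X - mu)] = sigma^2 E[g'(X)] for normal X.
# The test function g = tanh and the parameters mu, sigma are illustrative.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=1_000_000)

g = np.tanh                              # a differentiable g with the required moments
g_prime = lambda t: 1.0 / np.cosh(t)**2  # derivative of tanh

lhs = np.mean(g(x) * (x - mu))           # estimates E[g(X)(X - mu)]
rhs = sigma**2 * np.mean(g_prime(x))     # estimates sigma^2 E[g'(X)]
print(lhs, rhs)                          # agree up to Monte Carlo error
```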

More generally, suppose X and Y are jointly normally distributed. Then

${\displaystyle \operatorname {Cov} (g(X),Y)=\operatorname {Cov} (X,Y)E(g'(X)).}$
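The bivariate form can be checked the same way (again a sketch, not from the source; the mean vector, covariance matrix, and test function are arbitrary assumptions):

```python
# Monte Carlo check of Cov(g(X), Y) = Cov(X, Y) E[g'(X)] for jointly normal (X, Y).
import numpy as np

rng = np.random.default_rng(1)
mean = np.array([0.0, 1.0])
cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])             # Cov(X, Y) = 0.8
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

g, g_prime = np.sin, np.cos              # illustrative differentiable test function

lhs = np.cov(g(x), y)[0, 1]              # sample Cov(g(X), Y)
rhs = cov[0, 1] * np.mean(g_prime(x))    # Cov(X, Y) E[g'(X)]
print(lhs, rhs)                          # agree up to Monte Carlo error
```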

Proof

In order to prove the univariate version of this lemma, recall that the probability density function for the normal distribution with expectation 0 and variance 1 is

${\displaystyle \varphi (x)={1 \over {\sqrt {2\pi }}}e^{-x^{2}/2}}$

and that the density for a normal distribution with expectation μ and variance σ² is

${\displaystyle {1 \over \sigma }\varphi \left({x-\mu \over \sigma }\right).}$

Then use integration by parts, together with the identity ${\displaystyle \varphi '(x)=-x\varphi (x)}$; the moment assumptions guarantee that the boundary term ${\displaystyle g(x)\varphi (x)}$ vanishes at ${\displaystyle \pm \infty }$.
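For the standardized case μ = 0, σ = 1 the computation is short; the general case follows by applying it to the function z ↦ g(μ + σz) with Z standard normal. A worked sketch:

```latex
\begin{aligned}
E\bigl(g(X)\,X\bigr)
  &= \int_{-\infty}^{\infty} g(x)\,x\,\varphi(x)\,\mathrm{d}x
   = -\int_{-\infty}^{\infty} g(x)\,\varphi'(x)\,\mathrm{d}x \\
  &= -\Bigl[g(x)\,\varphi(x)\Bigr]_{-\infty}^{\infty}
   + \int_{-\infty}^{\infty} g'(x)\,\varphi(x)\,\mathrm{d}x
   = E\bigl(g'(X)\bigr),
\end{aligned}
```

where the first equality in the second line is integration by parts and the bracketed boundary term vanishes under the stated moment assumptions.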

More general statement

Suppose the distribution of X belongs to an exponential family; that is, X has the density

${\displaystyle f_{\eta }(x)=\exp(\eta 'T(x)-\Psi (\eta ))h(x).}$

Suppose this density has support ${\displaystyle (a,b)}$, where ${\displaystyle a}$ and ${\displaystyle b}$ may be ${\displaystyle -\infty }$ and ${\displaystyle \infty }$ respectively, and that ${\displaystyle \exp(\eta 'T(x))h(x)g(x)\rightarrow 0}$ as ${\displaystyle x\rightarrow a}$ or ${\displaystyle x\rightarrow b}$, where ${\displaystyle g}$ is any differentiable function with ${\displaystyle E|g'(X)|<\infty }$ (when ${\displaystyle a}$ and ${\displaystyle b}$ are finite, it suffices that ${\displaystyle \exp(\eta 'T(x))h(x)\rightarrow 0}$). Then

${\displaystyle E{\Bigl (}{\bigl (}h'(X)/h(X)+\sum _{i}\eta _{i}T_{i}'(X){\bigr )}g(X){\Bigr )}=-E{\bigl (}g'(X){\bigr )}.}$

The derivation is the same as in the special case above, namely integration by parts.
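As an illustrative check (an assumption made here, not part of the source), the identity can be verified by Monte Carlo for a Gamma density written in the form above with ${\displaystyle T(x)=(\log x,\,x)}$, ${\displaystyle \eta =(\alpha -1,\,-\beta )}$, and ${\displaystyle h\equiv 1}$ on ${\displaystyle (0,\infty )}$, so that ${\displaystyle h'/h+\sum _{i}\eta _{i}T_{i}'(x)=(\alpha -1)/x-\beta }$:

```python
# Monte Carlo sketch of the exponential-family identity for Gamma(alpha, beta);
# alpha, beta and the test function g are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 3.0, 2.0                   # shape and rate (alpha > 1 so E[1/X] exists)
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=1_000_000)

g = lambda t: t**2                       # test function with finite moments
g_prime = lambda t: 2.0 * t

lhs = np.mean(((alpha - 1.0) / x - beta) * g(x))  # E[(h'/h + sum_i eta_i T_i') g(X)]
rhs = -np.mean(g_prime(x))                        # -E[g'(X)]
print(lhs, rhs)                          # both approximately -2 * alpha / beta = -3
```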

If we only know that ${\displaystyle X}$ has support ${\displaystyle \mathbb {R} }$, then it can happen that ${\displaystyle E|g(X)|<\infty {\text{ and }}E|g'(X)|<\infty }$ and yet ${\displaystyle f_{\eta }(x)g(x)}$ does not tend to ${\displaystyle 0}$ as ${\displaystyle x\rightarrow \infty }$. To see this, take ${\displaystyle g(x)=1}$ and let ${\displaystyle f_{\eta }}$ have infinitely many spikes toward infinity while remaining integrable. One such example can be adapted from ${\displaystyle f(x)={\begin{cases}1&x\in [n,n+2^{-n}),\ n\in \mathbb {N} \\0&{\text{otherwise}}\end{cases}}}$ after smoothing the bumps, so that ${\displaystyle f}$ is smooth.
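To see concretely why this construction works (a sketch; indexing the bumps by n = 1, 2, … is an assumption made to pin down the example), take g ≡ 1:

```latex
\int_{-\infty}^{\infty} f(x)\,\mathrm{d}x
  \;=\; \sum_{n=1}^{\infty} 2^{-n} \;=\; 1,
\qquad\text{while}\qquad
f(n) \;=\; 1 \quad \text{for every } n \in \mathbb{N},
```

so f integrates to 1 yet takes the value 1 arbitrarily far out, and smoothing the indicator bumps preserves both properties.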

Extensions to elliptically contoured distributions also exist.[2][3]