# Le Cam's theorem


In probability theory, Le Cam's theorem, named after Lucien Le Cam (1924 – 2000), states the following.

Suppose:

• $X_{1},X_{2},X_{3},\ldots$ are independent random variables, each with a Bernoulli distribution (i.e., equal to either 0 or 1), not necessarily identically distributed.
• $P(X_{i}=1)=p_{i},{\text{ for }}i=1,2,3,\ldots$
• $\lambda _{n}=p_{1}+\cdots +p_{n}.$
• $S_{n}=X_{1}+\cdots +X_{n}$ (i.e., $S_{n}$ follows a Poisson binomial distribution).

Then

$\sum _{k=0}^{\infty }\left|\Pr(S_{n}=k)-{\lambda _{n}^{k}e^{-\lambda _{n}} \over k!}\right|<2\left(\sum _{i=1}^{n}p_{i}^{2}\right).$

In other words, the sum has approximately a Poisson distribution, and the above inequality bounds the approximation error in terms of the total variation distance.
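The inequality can be checked numerically. The sketch below (helper names are illustrative, not from the source) computes the exact Poisson binomial PMF of $S_n$ by repeated convolution and compares it, term by term, with the Poisson PMF of parameter $\lambda_n$:

```python
import math
import numpy as np

def poisson_binomial_pmf(p):
    """Exact PMF of S_n = X_1 + ... + X_n via repeated convolution."""
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf  # pmf[k] = Pr(S_n = k) for k = 0, ..., n

def le_cam_gap_and_bound(p):
    """Left-hand side of Le Cam's inequality and the bound 2 * sum(p_i^2)."""
    lam = sum(p)
    pmf = poisson_binomial_pmf(p)
    poisson = np.array([math.exp(-lam) * lam**k / math.factorial(k)
                        for k in range(len(pmf))])
    # For k > n, Pr(S_n = k) = 0, so those terms contribute the Poisson tail.
    gap = float(np.abs(pmf - poisson).sum() + (1.0 - poisson.sum()))
    bound = 2.0 * sum(pi**2 for pi in p)
    return gap, bound

gap, bound = le_cam_gap_and_bound([0.1, 0.2, 0.05, 0.3])
print(gap < bound)  # the inequality holds
```

Convolution is exact for small $n$; for large $n$, FFT-based or recursive methods are more efficient.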

By setting $p_{i}=\lambda _{n}/n$, we see that this generalizes the usual Poisson limit theorem.
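To make the reduction explicit: with $p_{i}=\lambda /n$ for a fixed $\lambda$ (so that $\lambda _{n}=\lambda$), the bound becomes

```latex
\sum_{k=0}^{\infty}\left|\Pr(S_n = k) - \frac{\lambda^{k} e^{-\lambda}}{k!}\right|
  < 2\sum_{i=1}^{n}\left(\frac{\lambda}{n}\right)^{2}
  = \frac{2\lambda^{2}}{n}
  \;\xrightarrow[n \to \infty]{}\; 0,
```

so the binomial distribution $\mathrm{Bin}(n,\lambda /n)$ converges to the Poisson distribution with mean $\lambda$ in total variation, which is the classical Poisson limit theorem.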

When $\lambda _{n}$ is large, a better bound is possible:

$\sum _{k=0}^{\infty }\left|\Pr(S_{n}=k)-{\lambda _{n}^{k}e^{-\lambda _{n}} \over k!}\right|<2\left(1\wedge {\frac {1}{\lambda _{n}}}\right)\left(\sum _{i=1}^{n}p_{i}^{2}\right).$

It is also possible to weaken the independence requirement.
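The extra factor $1\wedge 1/\lambda _{n}$ matters precisely when $\lambda _{n}$ is large. A quick check (a sketch with illustrative values, not from the source) with $n=100$ coins of bias $0.1$, so $\lambda _{n}=10$ and $\sum p_{i}^{2}=1$, shows the basic bound of $2$ is vacuous while the improved bound of $0.2$ is informative:

```python
import math
import numpy as np

p = [0.1] * 100          # lambda_n = 10, sum of p_i^2 = 1.0
lam = sum(p)

# Exact Poisson binomial PMF by repeated convolution
pmf = np.array([1.0])
for pi in p:
    pmf = np.convolve(pmf, [1.0 - pi, pi])

poisson = np.array([math.exp(-lam) * lam**k / math.factorial(k)
                    for k in range(len(pmf))])
# Terms with k > n reduce to the Poisson tail, since Pr(S_n = k) = 0 there.
gap = float(np.abs(pmf - poisson).sum() + (1.0 - poisson.sum()))

basic = 2.0 * sum(pi**2 for pi in p)                            # = 2.0 (vacuous)
improved = 2.0 * min(1.0, 1.0 / lam) * sum(pi**2 for pi in p)   # = 0.2
print(gap, improved, basic)
```

Here the total variation sum on the left is well below $0.2$, so the improved bound is the only one carrying information in this regime.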