# Free Poisson distribution

In the mathematics of free probability theory, the free Poisson distribution is a counterpart of the Poisson distribution in conventional probability theory.

## Definition

The free Poisson distribution[1] with jump size ${\displaystyle \alpha }$ and rate ${\displaystyle \lambda }$ arises in free probability theory as the limit of repeated free convolution

${\displaystyle \left(\left(1-{\frac {\lambda }{N}}\right)\delta _{0}+{\frac {\lambda }{N}}\delta _{\alpha }\right)^{\boxplus N}}$

as N → ∞.

In other words, for each ${\displaystyle N}$ let ${\displaystyle X_{1}^{(N)},\ldots ,X_{N}^{(N)}}$ be freely independent random variables, each taking the value ${\displaystyle \alpha }$ with probability ${\displaystyle {\frac {\lambda }{N}}}$ and the value 0 with the remaining probability. Then the law of ${\displaystyle X_{1}^{(N)}+\cdots +X_{N}^{(N)}}$ converges, as ${\displaystyle N\to \infty }$, to the free Poisson law with parameters ${\displaystyle \lambda ,\alpha }$.

This definition is analogous to one of the ways in which the classical Poisson distribution is obtained from a (classical) Poisson process.

The measure associated to the free Poisson law is given by[2]

${\displaystyle \mu ={\begin{cases}(1-\lambda )\delta _{0}+\nu ,&{\text{if }}0\leq \lambda \leq 1\\\nu ,&{\text{if }}\lambda >1,\end{cases}}}$

where

${\displaystyle \nu ={\frac {1}{2\pi \alpha t}}{\sqrt {4\lambda \alpha ^{2}-(t-\alpha (1+\lambda ))^{2}}}\,dt}$

and has support ${\displaystyle [\alpha (1-{\sqrt {\lambda }})^{2},\alpha (1+{\sqrt {\lambda }})^{2}]}$.
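As a numerical sanity check, the continuous part ${\displaystyle \nu }$ can be integrated over its support: its total mass is ${\displaystyle \lambda }$ when ${\displaystyle \lambda \leq 1}$ (the atom at 0 carries the remaining ${\displaystyle 1-\lambda }$) and 1 when ${\displaystyle \lambda >1}$, and the mean of the whole measure is ${\displaystyle \lambda \alpha }$. A minimal sketch with NumPy; the helper names and parameter values below are ours, chosen for illustration:

```python
import numpy as np

def mp_density(t, lam, alpha):
    """Density of the continuous part nu of the free Poisson law."""
    disc = 4 * lam * alpha**2 - (t - alpha * (1 + lam))**2
    # clip tiny negative rounding errors at the edges of the support
    return np.sqrt(np.maximum(disc, 0.0)) / (2 * np.pi * alpha * t)

def nu_integral(f, lam, alpha, n=200_001):
    """Trapezoidal integral of f(t) nu(dt) over the support of nu."""
    a = alpha * (1 - np.sqrt(lam))**2
    b = alpha * (1 + np.sqrt(lam))**2
    t = np.linspace(a, b, n)
    y = f(t) * mp_density(t, lam, alpha)
    return float(np.sum((y[:-1] + y[1:]) * (t[1] - t[0]) / 2))

# total mass of nu: lambda for lambda <= 1 (atom at 0 carries 1 - lambda), 1 for lambda > 1
print(nu_integral(np.ones_like, 0.5, 1.0))  # ~0.5
print(nu_integral(np.ones_like, 2.0, 1.0))  # ~1.0
# the mean of the free Poisson law is lambda * alpha
print(nu_integral(lambda t: t, 2.0, 3.0))   # ~6.0
```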

This law also arises in random matrix theory as the Marchenko–Pastur law. Its free cumulants are equal to ${\displaystyle \kappa _{n}=\lambda \alpha ^{n}}$.
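The free cumulants determine the moments by summing over non-crossing partitions; equivalently, the moments satisfy the recursion ${\displaystyle m_{n}=\sum _{k=1}^{n}\kappa _{k}\sum _{i_{1}+\cdots +i_{k}=n-k}m_{i_{1}}\cdots m_{i_{k}}}$ with ${\displaystyle m_{0}=1}$. A hedged sketch of this recursion with ${\displaystyle \kappa _{k}=\lambda \alpha ^{k}}$; for ${\displaystyle \alpha =1}$ the resulting moments are the Narayana polynomials in ${\displaystyle \lambda }$, which serves as a cross-check (function names are ours):

```python
from math import comb

def sum_products(m, k, s):
    """Sum of m[i_1] * ... * m[i_k] over i_1 + ... + i_k = s, i_j >= 0."""
    if k == 0:
        return 1.0 if s == 0 else 0.0
    return sum(m[i] * sum_products(m, k - 1, s - i) for i in range(s + 1))

def free_moments(n_max, lam, alpha):
    """Moments m_1..m_{n_max} from the free cumulants kappa_k = lam * alpha**k,
    via m_n = sum_k kappa_k * sum_{i_1+...+i_k = n-k} m_{i_1} ... m_{i_k}."""
    m = [1.0]  # m_0 = 1
    for n in range(1, n_max + 1):
        m.append(sum(lam * alpha**k * sum_products(m, k, n - k)
                     for k in range(1, n + 1)))
    return m[1:]

# For alpha = 1, m_n = sum_k N(n, k) lam^k with Narayana numbers
# N(n, k) = C(n, k) * C(n, k - 1) / n.
lam = 2.0
print(free_moments(4, lam, 1.0))
print([sum(comb(n, k) * comb(n, k - 1) // n * lam**k for k in range(1, n + 1))
       for n in range(1, 5)])
# both lines print [2.0, 6.0, 22.0, 90.0]
```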

## Some transforms of this law

We give values of some important transforms of the free Poisson law; the computations can be found, e.g., in the book Lectures on the Combinatorics of Free Probability by A. Nica and R. Speicher.[3]

The R-transform of the free Poisson law is given by

${\displaystyle R(z)={\frac {\lambda \alpha }{1-\alpha z}}.}$
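The R-transform is the generating function of the free cumulants, ${\displaystyle R(z)=\sum _{n\geq 0}\kappa _{n+1}z^{n}}$, so its Taylor coefficients should come out as ${\displaystyle \kappa _{n+1}=\lambda \alpha ^{n+1}}$. A quick numerical check, recovering the coefficients by sampling ${\displaystyle R}$ on a small circle and applying an FFT (a standard Cauchy-integral trick; the parameter values are arbitrary choices of ours):

```python
import numpy as np

lam, alpha = 1.5, 0.7
N, r = 64, 0.5  # sample count and circle radius; need r < 1/alpha

z = r * np.exp(2j * np.pi * np.arange(N) / N)   # points on the circle |z| = r
R = lam * alpha / (1 - alpha * z)               # R-transform of the free Poisson law
# n-th Taylor coefficient c_n = (1/N) sum_j R(z_j) e^{-2*pi*i*j*n/N} / r^n
coeffs = np.real(np.fft.fft(R) / N) / r**np.arange(N)

for n in range(5):
    print(coeffs[n], lam * alpha**(n + 1))      # c_n matches kappa_{n+1}
```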

The Cauchy transform (which is the negative of the Stieltjes transformation) is given by

${\displaystyle G(z)={\frac {z+\alpha -\lambda \alpha -{\sqrt {(z-\alpha (1+\lambda ))^{2}-4\lambda \alpha ^{2}}}}{2\alpha z}}}$
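The R- and Cauchy transforms are inverse to each other in the sense that ${\displaystyle R(G(z))+1/G(z)=z}$, which the two formulas above can be checked against numerically. The branch of the square root is the one making ${\displaystyle G(z)\sim 1/z}$ at infinity; for the test point below (an arbitrary choice of ours in the upper half-plane) the principal branch is the correct one:

```python
import cmath

lam, alpha = 2.0, 1.0

def G(z):
    """Cauchy transform of the free Poisson law (principal square-root branch)."""
    s = cmath.sqrt((z - alpha * (1 + lam))**2 - 4 * lam * alpha**2)
    return (z + alpha - lam * alpha - s) / (2 * alpha * z)

def R(z):
    """R-transform of the free Poisson law."""
    return lam * alpha / (1 - alpha * z)

z = 3.0 + 2.0j
w = G(z)
print(abs(R(w) + 1 / w - z))   # ~0: verifies R(G(z)) + 1/G(z) = z
```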

The S-transform is given by

${\displaystyle S(z)={\frac {1}{z+\lambda }}}$

in the case that ${\displaystyle \alpha =1}$.
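The S- and R-transforms are related by ${\displaystyle S(z)=1/R(zS(z))}$ (a standard identity; see Nica–Speicher), so the formula above can be checked against the R-transform with ${\displaystyle \alpha =1}$. A small numeric sketch, with an arbitrary test point of our choosing:

```python
lam = 2.0

def S(z):
    """S-transform of the free Poisson law for alpha = 1."""
    return 1.0 / (z + lam)

def R(z):
    """R-transform of the free Poisson law for alpha = 1."""
    return lam / (1.0 - z)

z = 0.3
print(abs(S(z) - 1.0 / R(z * S(z))))   # ~0: verifies S(z) = 1/R(zS(z))
```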

## References

1. ^ D. Voiculescu, K. Dykema, A. Nica: Free Random Variables. CRM Monograph Series, American Mathematical Society, Providence, RI, 1992.
2. ^ James A. Mingo, Roland Speicher: Free Probability and Random Matrices. Fields Institute Monographs, Vol. 35, Springer, New York, 2017.
3. ^ A. Nica, R. Speicher: Lectures on the Combinatorics of Free Probability. Cambridge University Press, 2006, pp. 203–204.