
Mixed Poisson distribution: Difference between revisions

From Wikipedia, the free encyclopedia
Content deleted Content added
(1) Space between f(x) and dx; (2) Do not italicize punctuation or digits in this context, nor things like cos, log, det, max, etc. See WP:MOSMATH. (3) Some of the TeX code was more complicated than it needs to be. (4) Alignment of punctuation.
Line 10: Line 10:
| parameters = <math>\lambda\in (0, \infty)</math>
| parameters = <math>\lambda\in (0, \infty)</math>
| support = <math>k \in \mathbb{N}_0</math>
| support = <math>k \in \mathbb{N}_0</math>
| pdf = <math>\int\limits_{0}^{\infty} \frac{\lambda^{k}}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda</math>
| pdf = <math>\int\limits_0^\infty \frac{\lambda^k}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda</math>
| cdf =
| cdf =
| mean = <math>\int\limits_{0}^{\infty} \lambda \,\,\pi(\lambda)d\lambda</math>
| mean = <math>\int\limits_0^\infty \lambda \,\,\pi(\lambda)\,d\lambda</math>
| median =
| median =
| mode =
| mode =
| variance = <math>\int\limits_{0}^{\infty} (\lambda+(\lambda-\mu_\pi)^2) \,\,\pi(\lambda)d\lambda</math>
| variance = <math>\int\limits_0^\infty (\lambda+(\lambda-\mu_\pi)^2) \,\,\pi(\lambda) \, d\lambda</math>
| skewness = <math>\Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-\frac{3}{2}} \,\Biggl[\int\limits_0^\infty(\lambda-\mu_\pi)^3\,\pi(\lambda)\,d{\lambda}+\mu_\pi\Biggr]</math>
| skewness = <math>\Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-3/2} \,\Biggl[\int\limits_0^\infty(\lambda-\mu_\pi)^3 \, \pi(\lambda) \, d{\lambda}+\mu_\pi\Biggr]</math>
| kurtosis =
| kurtosis =
| entropy =
| entropy =
| pgf = <math>M_\pi(z-1)</math>
| pgf = <math>M_\pi(z-1)</math>
| mgf = <math>M_\pi(e^t-1)</math>, with <math>M_\pi</math> the MGF of ''π''
| mgf = <math>M_\pi(e^t-1)</math>, with <math>M_\pi</math> the MGF of {{pi}}
| char = <math>M_\pi(e^{it}-1)</math>
| char = <math>M_\pi(e^{it}-1)</math>
| fisher =
| fisher =
Line 28: Line 28:


== Definition ==
== Definition ==
A [[random variable]] ''X'' satisfies the mixed Poisson distribution with density ''π(λ)'' if it has the probability distribution<ref name="Willmot">{{Cite web |last=Willmot |first=Gord |date=2014-08-29 |title=Mixed Compound Poisson Distributions |url=https://www.cambridge.org/core/services/aop-cambridge-core/content/view/S051503610001165X |url-status=live |website=Cambridge |pages=5-7 |doi=10.1017/S051503610001165X}}</ref>
A [[random variable]] ''X'' satisfies the mixed Poisson distribution with density {{pi}}(''λ'') if it has the probability distribution<ref name="Willmot">{{Cite web |last=Willmot |first=Gord |date=2014-08-29 |title=Mixed Compound Poisson Distributions |url=https://www.cambridge.org/core/services/aop-cambridge-core/content/view/S051503610001165X |url-status=live |website=Cambridge |pages=5-7 |doi=10.1017/S051503610001165X}}</ref>


: <math>\operatorname{P}(X=k) = \int\limits_{0}^{\infty} \frac{\lambda^{k}}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda</math>.
: <math>\operatorname{P}(X=k) = \int_0^\infty \frac{\lambda^k}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda. </math>


If we denote the probabilities of the Poisson distribution by ''q<sub>λ</sub>(k)'', then
If we denote the probabilities of the Poisson distribution by ''q''<sub>''λ''</sub>(''k''), then


: <math>\operatorname{P}(X=k) = \int\limits_{0}^{\infty} q_\lambda(k) \,\,\pi(\lambda)\,\mathrm d\lambda</math>.
: <math>\operatorname{P}(X=k) = \int_0^\infty q_\lambda(k) \,\,\pi(\lambda)\,\mathrm d\lambda. </math>


== Properties ==
== Properties ==


* The [[variance]] is always bigger than the [[expected value]]. This property is called [[overdispersion]]. This is in contrast to the Poisson distribution where mean and variance are the same.
* The [[variance]] is always bigger than the [[expected value]]. This property is called [[overdispersion]]. This is in contrast to the Poisson distribution where mean and variance are the same.
* In practice, almost only densities of [[Gamma distribution|gamma distributions]], [[Log-normal distribution|logarithmic normal distributions]] and [[Inverse Gaussian distribution|inverse Gaussian distributions]] are used as densities ''π(λ)''. If we choose the density of the [[gamma distribution]], we get the [[negative binomial distribution]], which explains why this is also called the Poisson gamma distribution.
* In practice, almost only densities of [[Gamma distribution|gamma distributions]], [[Log-normal distribution|logarithmic normal distributions]] and [[Inverse Gaussian distribution|inverse Gaussian distributions]] are used as densities {{pi}}(''λ''). If we choose the density of the [[gamma distribution]], we get the [[negative binomial distribution]], which explains why this is also called the Poisson gamma distribution.


In the following let <math>\mu_\pi=\int\limits_{0}^{\infty} \lambda \,\,\pi(\lambda)d\lambda\,</math> be the expected value of the density <math>\pi(\lambda)\,</math> and <math>\sigma_\pi^2 = \int\limits_{0}^{\infty} (\lambda-\mu_\pi)^2 \,\,\pi(\lambda)d\lambda\,</math> be the variance of the density.
In the following let <math>\mu_\pi=\int\limits_0^\infty \lambda \,\,\pi(\lambda) \, d\lambda\,</math> be the expected value of the density <math>\pi(\lambda)\,</math> and <math>\sigma_\pi^2 = \int\limits_0^\infty (\lambda-\mu_\pi)^2 \,\,\pi(\lambda) \, d\lambda\,</math> be the variance of the density.


=== Expected value ===
=== Expected value ===
The [[expected value]] of the Mixed Poisson Distribution is
The [[expected value]] of the Mixed Poisson Distribution is


: <math>\operatorname{E}(X) = \mu_\pi</math>.
: <math>\operatorname{E}(X) = \mu_\pi.</math>


=== Variance ===
=== Variance ===
For the [[variance]] one gets<ref name="Willmot"/>
For the [[variance]] one gets<ref name="Willmot"/>


: <math>\operatorname{Var}(X) = \mu_\pi+\sigma_\pi^2</math>.
: <math>\operatorname{Var}(X) = \mu_\pi+\sigma_\pi^2. </math>


=== Skewness ===
=== Skewness ===
The [[skewness]] can be represented as
The [[skewness]] can be represented as


: <math>\operatorname{v}(X) = \Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-\frac{3}{2}} \,\Biggl[\int\limits_0^\infty(\lambda-\mu_\pi)^3\,\pi(\lambda)\,d{\lambda}+\mu_\pi\Biggr]</math>.
: <math>\operatorname{v}(X) = \Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-3/2} \,\Biggl[\int_0^\infty(\lambda-\mu_\pi)^3\,\pi(\lambda)\,d{\lambda}+\mu_\pi\Biggr].</math>


=== Characteristic function ===
=== Characteristic function ===
The [[characteristic function]] has the form
The [[characteristic function]] has the form


: <math>\varphi_{X}(s) = M_\pi(e^{is}-1)\,</math>.
: <math>\varphi_X(s) = M_\pi(e^{is}-1).\,</math>


Where <math> M_\pi </math> is the [[moment generating function]] of the density.
Where <math> M_\pi </math> is the [[moment generating function]] of the density.
Line 68: Line 68:
For the [[probability generating function]], one obtains<ref name="Willmot"/>
For the [[probability generating function]], one obtains<ref name="Willmot"/>


: <math>m_{X}(s) = M_\pi(s-1)\,</math>.
: <math>m_X(s) = M_\pi(s-1).\,</math>


=== Moment-generating function ===
=== Moment-generating function ===
The [[moment-generating function]] of the mixed Poisson distribution is
The [[moment-generating function]] of the mixed Poisson distribution is


: <math>M_{X}(s) = M_\pi(e^s-1)\,</math>.
: <math>M_X(s) = M_\pi(e^s-1).\,</math>


== Examples ==
== Examples ==
{|
{|
|style="vertical-align: top;"|{{Math theorem|Compounding a [[Poisson distribution]] with rate parameter distributed according to a [[gamma distribution]] yields a [[negative binomial distribution]].<ref name="Willmot"/>}}
|style="vertical-align: top;"|{{Math theorem|Compounding a [[Poisson distribution]] with rate parameter distributed according to a [[gamma distribution]] yields a [[negative binomial distribution]].<ref name="Willmot"/>}}
{{Math proof|Let <math>\pi(\lambda)=\frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda}</math> be a density of a <math>\mathrm{\Gamma}\left(r,\frac{p}{1-p}\right)</math> distributed random variable.
{{Math proof|Let <math>\pi(\lambda)=\frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda}</math> be a density of a <math>\operatorname{\Gamma}\left(r,\frac{p}{1-p}\right)</math> distributed random variable.


<math>\begin{align}
<math>\begin{align}
Line 92: Line 92:
{{Math proof|Let <math>\pi(\lambda)=\frac1\beta e^{-\frac \lambda\beta}</math> be a density of a <math>\mathrm{Exp}\left(\frac1\beta\right)</math> distributed random variable. Using [[integration by parts]] ''n'' times yields:
{{Math proof|Let <math>\pi(\lambda)=\frac1\beta e^{-\frac \lambda\beta}</math> be a density of a <math>\mathrm{Exp}\left(\frac1\beta\right)</math> distributed random variable. Using [[integration by parts]] ''n'' times yields:
<math display=block>\begin{align}
<math display=block>\begin{align}
\operatorname{P}(X=k)&=\frac{1}{k!}\int\limits_0^\infty \lambda^k e^{-\lambda} \frac1\beta e^{-\frac \lambda\beta}\mathrm d\lambda\\
\operatorname{P}(X=k)&=\frac{1}{k!}\int\limits_0^\infty \lambda^k e^{-\lambda} \frac1\beta e^{-\frac \lambda\beta} \, \mathrm d\lambda\\
&=\frac{1}{k!\beta}\int\limits_0^\infty \lambda^k e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\
&=\frac{1}{k!\beta}\int\limits_0^\infty \lambda^k e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\
&=\frac{1}{k!\beta}\cdot k!\left(\frac{\beta}{1+\beta}\right)^k\int\limits_0^\infty e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\
&=\frac{1}{k!\beta}\cdot k!\left(\frac{\beta}{1+\beta}\right)^k\int\limits_0^\infty e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\

Revision as of 02:51, 29 November 2022

mixed Poisson distribution
Notation:
Parameters: λ ∈ (0, ∞)
Support: k ∈ ℕ₀
PMF: ∫₀^∞ (λ^k / k!) e^(−λ) π(λ) dλ
Mean: ∫₀^∞ λ π(λ) dλ
Variance: ∫₀^∞ (λ + (λ − μ_π)²) π(λ) dλ
Skewness: (μ_π + σ_π²)^(−3/2) [ ∫₀^∞ (λ − μ_π)³ π(λ) dλ + μ_π ]
MGF: M_π(e^t − 1), with M_π the MGF of π
CF: M_π(e^(it) − 1)
PGF: M_π(z − 1)

A mixed Poisson distribution is a univariate discrete probability distribution. It arises from assuming that a random variable is Poisson distributed, where the rate parameter itself is treated as a random variable. It is therefore a special case of a compound probability distribution. Mixed Poisson distributions are used in actuarial mathematics as a general approach to modelling the distribution of the number of claims and are also examined as epidemiological models.[1] The mixed Poisson distribution should not be confused with the compound Poisson distribution or the compound Poisson process.[2]

Definition

A random variable X satisfies the mixed Poisson distribution with density π(λ) if it has the probability distribution[3]

P(X = k) = ∫₀^∞ (λ^k / k!) e^(−λ) π(λ) dλ.

If we denote the probabilities of the Poisson distribution by q_λ(k), then

P(X = k) = ∫₀^∞ q_λ(k) π(λ) dλ.
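
The two-stage structure of this definition lends itself to simulation: first draw the random rate λ from the mixing density π, then draw the count from a Poisson distribution with that rate. The following is a minimal sketch in Python; the gamma mixing density and its parameters are arbitrary illustrative choices, not part of the definition.

import numpy as np

rng = np.random.default_rng(0)

def sample_mixed_poisson(sample_pi, size, rng):
    # Draw from a mixed Poisson distribution: first the random rate
    # lambda ~ pi, then the count X | lambda ~ Poisson(lambda).
    lam = sample_pi(size, rng)
    return rng.poisson(lam)

# Illustrative mixing density: a gamma distribution with arbitrary parameters.
def sample_gamma_pi(size, rng):
    return rng.gamma(shape=3.0, scale=2.0, size=size)

x = sample_mixed_poisson(sample_gamma_pi, size=100_000, rng=rng)
print(x.mean(), x.var())  # empirical mean and variance of the mixture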

Properties

  • The variance is always bigger than the expected value. This property is called overdispersion. This is in contrast to the Poisson distribution, where mean and variance are equal.
  • In practice, almost only densities of gamma distributions, log-normal distributions and inverse Gaussian distributions are used as mixing densities π(λ). If we choose the density of the gamma distribution, we get the negative binomial distribution, which explains why this is also called the Poisson–gamma distribution.

In the following let μ_π = ∫₀^∞ λ π(λ) dλ be the expected value of the density π(λ) and σ_π² = ∫₀^∞ (λ − μ_π)² π(λ) dλ be the variance of the density.

Expected value

The expected value of the mixed Poisson distribution is

E(X) = μ_π.

Variance

For the variance one gets[3]

Var(X) = μ_π + σ_π².
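
These two formulas can be checked with a short Monte Carlo sketch, again assuming a gamma mixing density with arbitrary illustrative parameters: for a gamma density with shape a and scale θ one has μ_π = aθ and σ_π² = aθ², and the simulated mean and variance of X should be close to μ_π and μ_π + σ_π².

import numpy as np

rng = np.random.default_rng(1)
a, theta = 3.0, 2.0                         # illustrative gamma mixing density

mu_pi = a * theta                           # mean of the mixing density
sigma2_pi = a * theta**2                    # variance of the mixing density

lam = rng.gamma(a, theta, size=1_000_000)   # lambda ~ pi
x = rng.poisson(lam)                        # X | lambda ~ Poisson(lambda)

print(x.mean(), "vs", mu_pi)                # expected value: mu_pi
print(x.var(), "vs", mu_pi + sigma2_pi)     # variance: mu_pi + sigma_pi^2 (overdispersion)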

Skewness

The skewness can be represented as

v(X) = (μ_π + σ_π²)^(−3/2) [ ∫₀^∞ (λ − μ_π)³ π(λ) dλ + μ_π ].

Characteristic function

The characteristic function has the form

φ_X(s) = M_π(e^(is) − 1),

where M_π is the moment-generating function of the density π.

Probability generating function

For the probability generating function, one obtains[3]

m_X(s) = M_π(s − 1).

Moment-generating function

The moment-generating function of the mixed Poisson distribution is

M_X(s) = M_π(e^s − 1).
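
This identity can be checked by Monte Carlo in the same illustrative gamma setting (arbitrary parameters; s is chosen small enough that the gamma MGF M_π(e^s − 1) exists, i.e. e^s − 1 < 1/θ).

import numpy as np

rng = np.random.default_rng(2)
a, theta = 3.0, 2.0                      # illustrative gamma mixing density
s = 0.1                                  # requires exp(s) - 1 < 1/theta

lam = rng.gamma(a, theta, size=1_000_000)
x = rng.poisson(lam)

mgf_mc = np.exp(s * x).mean()                          # Monte Carlo estimate of E[exp(sX)]
mgf_formula = (1 - theta * (np.exp(s) - 1)) ** (-a)    # M_pi(e^s - 1) with the gamma MGF
print(mgf_mc, mgf_formula)                             # the two values should be close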

Examples

Theorem — Compounding a Poisson distribution with rate parameter distributed according to a gamma distribution yields a negative binomial distribution.[3]

Proof

Let π(λ) = ((p/(1−p))^r / Γ(r)) λ^(r−1) e^(−(p/(1−p))λ) be the density of a Γ(r, p/(1−p))-distributed random variable. Then

P(X = k) = ∫₀^∞ (λ^k / k!) e^(−λ) π(λ) dλ
         = ((p/(1−p))^r / (k! Γ(r))) ∫₀^∞ λ^(k+r−1) e^(−λ/(1−p)) dλ
         = ((p/(1−p))^r / (k! Γ(r))) Γ(k+r) (1−p)^(k+r)
         = (Γ(k+r) / (k! Γ(r))) p^r (1−p)^k.

Therefore we get X ~ NB(r, p).
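
The result can also be verified numerically. A hedged sketch (it assumes SciPy is available; r and p are arbitrary illustrative values): integrate the Poisson pmf against the Γ(r, p/(1−p)) density and compare with the negative binomial pmf.

import numpy as np
from scipy import integrate, stats

r, p = 2.5, 0.4                      # illustrative parameters
rate = p / (1 - p)                   # gamma rate, so the SciPy scale is (1 - p) / p

def mixed_pmf(k):
    # P(X = k) as the Poisson pmf averaged over the gamma mixing density.
    integrand = lambda lam: stats.poisson.pmf(k, lam) * stats.gamma.pdf(lam, a=r, scale=1 / rate)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

for k in range(5):
    print(k, mixed_pmf(k), stats.nbinom.pmf(k, r, p))   # the two columns should agree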

Theorem — Compounding a Poisson distribution with rate parameter distributed according to an exponential distribution yields a geometric distribution.

Proof

Let π(λ) = (1/β) e^(−λ/β) be the density of an Exp(1/β)-distributed random variable. Using integration by parts k times yields:

P(X = k) = (1/k!) ∫₀^∞ λ^k e^(−λ) (1/β) e^(−λ/β) dλ
         = (1/(k! β)) ∫₀^∞ λ^k e^(−λ(1+β)/β) dλ
         = (1/(k! β)) · k! (β/(1+β))^k ∫₀^∞ e^(−λ(1+β)/β) dλ
         = (1/(1+β)) (β/(1+β))^k.

Therefore we get X ~ Geo(1/(1+β)).
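
A corresponding numerical sketch (again assuming SciPy; β is an arbitrary illustrative value) compares the mixture probabilities with the geometric pmf (1/(1+β)) (β/(1+β))^k on k = 0, 1, 2, ...

import numpy as np
from scipy import integrate, stats

beta = 1.5                               # illustrative mean of the exponential mixing density

def mixed_pmf(k):
    # P(X = k) as the Poisson pmf averaged over the Exp(1/beta) mixing density.
    integrand = lambda lam: stats.poisson.pmf(k, lam) * stats.expon.pdf(lam, scale=beta)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

p = 1 / (1 + beta)                       # success probability of the resulting geometric law
for k in range(5):
    print(k, mixed_pmf(k), p * (1 - p) ** k)   # the two columns should agree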

Table of mixed Poisson distributions

mixing distribution | mixed Poisson distribution[4]
gamma | negative binomial
exponential | geometric
inverse Gaussian | Sichel
Poisson | Neyman
generalized inverse Gaussian | Poisson-generalized inverse Gaussian
generalized gamma | Poisson-generalized gamma
generalized Pareto | Poisson-generalized Pareto
inverse-gamma | Poisson-inverse gamma
log-normal | Poisson-log-normal
Lomax | Poisson–Lomax
Pareto | Poisson–Pareto
Pearson’s family of distributions | Poisson–Pearson family
truncated normal | Poisson-truncated normal
uniform | Poisson-uniform
shifted gamma | Delaporte
beta with specific parameter values | Yule

Literature

  • Jan Grandell: Mixed Poisson Processes. Chapman & Hall, London 1997, ISBN 0-412-78700-8.
  • Tom Britton: Stochastic Epidemic Models with Inference. Springer, 2019, doi:10.1007/978-3-030-30900-8.

References

  1. ^ Willmot, Gordon E.; Lin, X. Sheldon (2001), "Mixed Poisson distributions", Lundberg Approximations for Compound Distributions with Insurance Applications, vol. 156, New York, NY: Springer New York, pp. 37–49, doi:10.1007/978-1-4613-0111-0_3, ISBN 978-0-387-95135-5, retrieved 2022-07-08
  2. ^ Willmot, Gord (1986). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16 (S1): S59–S79. doi:10.1017/S051503610001165X. ISSN 0515-0361.
  3. ^ a b c d Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". Cambridge. pp. 5–7. doi:10.1017/S051503610001165X.
  4. ^ Karlis, Dimitris; Xekalaki, Evdokia (2005). "Mixed Poisson Distributions". International Statistical Review / Revue Internationale de Statistique. 73 (1): 35–58. ISSN 0306-7734.