# Talk:Mixture distribution

WikiProject Statistics (Rated C-class, Mid-importance)

This article is within the scope of the WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.


I have only seen these two terms used synonymously. 205.155.65.236 06:30, 31 October 2007 (UTC)

## Rename

Should we rename this to Mixture distribution so as to include the discrete case? —3mta3 (talk) 12:27, 24 June 2009 (UTC)

————————————————

I suggest renaming this to Mixture family, similarly to how we have the exponential family. Unlike all other "distribution" articles, this model cannot be completely described by a finite number of parameters: there always remains freedom in choosing the "mixed-in" densities p_i(x), just as there is similar freedom in the exponential family.  … stpasha »  15:20, 16 January 2010 (UTC)

## Simulation

I think the text should include a Simulation section. It is very easy to simulate such a variable: first, sample an index i from the discrete set {1, 2, ..., n} with weights w_i, then sample from the i-th component distribution. Is there any source for this? Albmont (talk) 17:29, 1 July 2009 (UTC)
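A minimal sketch of that two-step scheme in Python; the weights and component distributions below are made up purely for illustration:

```python
import random

random.seed(0)

# Hypothetical three-component mixture (weights and components are illustrative)
weights = [0.3, 0.5, 0.2]
components = [
    lambda: random.gauss(0.0, 1.0),     # N(0, 1)
    lambda: random.gauss(5.0, 2.0),     # N(5, 4)
    lambda: random.expovariate(1.0),    # Exp(1)
]

def sample_mixture():
    # Step 1: pick component i with probability w_i
    i = random.choices(range(len(weights)), weights=weights)[0]
    # Step 2: draw from the chosen component's distribution
    return components[i]()

samples = [sample_mixture() for _ in range(100_000)]
```

The empirical mean of `samples` should approach the weighted mean of the component means (here 0.3·0 + 0.5·5 + 0.2·1 = 2.7).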

## Incorrect diagram

An IP editor placed this in the main text: "this is wrong: it shows the concept, but the 'mixed normal pdf' should be underneath the 3 pdfs, because each pdf integrates to 1 on its own." I have hidden both this and the diagram for now. Melcombe (talk) 15:55, 28 June 2010 (UTC)

## Incorrect formula for variance (2nd moment; section 'Moments')

There appears to be a mistake in the formula for the variance of a mixture distribution: The general formula is:

${\displaystyle \operatorname {E} [(X-\mu )^{j}]=\sum _{i=1}^{n}w_{i}\operatorname {E} [(X_{i}-\mu _{i}+\mu _{i}-\mu )^{j}]=\sum _{i=1}^{n}\sum _{k=0}^{j}{\binom {j}{k}}(\mu _{i}-\mu )^{j-k}w_{i}\operatorname {E} [(X_{i}-\mu _{i})^{k}]}$

For j = 2 (the variance), we get:

${\displaystyle {\begin{aligned}\operatorname {E} [(X-\mu )^{2}]=\sigma ^{2}&=\sum _{i=1}^{n}\sum _{k=0}^{2}{\binom {2}{k}}(\mu _{i}-\mu )^{2-k}w_{i}\operatorname {E} [(X_{i}-\mu _{i})^{k}]\\&=\sum _{i=1}^{n}\left[(\mu _{i}-\mu )^{2}w_{i}+2(\mu _{i}-\mu )w_{i}\cdot 0+w_{i}\operatorname {E} [(X_{i}-\mu _{i})^{2}]\right]\\&=\sum _{i=1}^{n}w_{i}\left((\mu _{i}-\mu )^{2}+\sigma _{i}^{2}\right)\end{aligned}}}$

This differs from the formula in the article:

${\displaystyle \operatorname {E} [(X-\mu )^{2}]=\sigma ^{2}=\sum _{i=1}^{n}w_{i}(\mu _{i}^{2}+\sigma _{i}^{2})-\mu ^{2}.}$

I believe the mistake is that

${\displaystyle (\mu _{i}-\mu )^{2}\neq (\mu _{i}^{2}-\mu ^{2})}$

## Edits

I altered the statement that the variance formula applied to normal distributions only. It is general.

Source: https://statisticalmodeling.wordpress.com/2011/06/16/the-variance-of-a-mixture/ — Preceding unsigned comment added by Svein Olav Nyberg (talkcontribs) 22:25, 17 November 2015 (UTC)
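As a sanity check that the variance formula is not specific to normal components, here is a sketch comparing it against simulation for a mixture of two exponentials; the weights and rates below are arbitrary choices for illustration:

```python
import random

random.seed(1)

# Illustrative mixture of two exponential components (nothing normal here):
w = [0.4, 0.6]
rates = [1.0, 0.5]                    # Exp(rate): mean = 1/rate, var = 1/rate**2
mu_i = [1.0 / r for r in rates]       # component means
var_i = [1.0 / r**2 for r in rates]   # component variances

# General formula from the thread: sigma^2 = sum_i w_i (mu_i^2 + sigma_i^2) - mu^2
mu = sum(wi * mi for wi, mi in zip(w, mu_i))
var_formula = sum(wi * (mi**2 + vi) for wi, mi, vi in zip(w, mu_i, var_i)) - mu**2

# Empirical variance by direct simulation of the mixture
n = 200_000
samples = []
for _ in range(n):
    i = random.choices([0, 1], weights=w)[0]
    samples.append(random.expovariate(rates[i]))
m = sum(samples) / n
var_empirical = sum((x - m) ** 2 for x in samples) / n
```

With these numbers the formula gives 0.4·(1 + 1) + 0.6·(4 + 4) − 1.6² = 3.04, and the simulated variance should land close to that.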

## Finite, countable, uncountable mixtures

The term "discrete" is more concise than "finite and countable". Also, "continuous" reads better than "uncountable" here, even though it is more restrictive, and it is more accessible to someone with just a calculus background. We could then note that the definition generalizes in the context of measure theory. — Preceding unsigned comment added by 207.11.1.161 (talk) 08:44, 20 February 2016 (UTC)

## Generating functions

There should be a new section about moment generating (and maybe characteristic) functions of mixture distributions. Kjetil B Halvorsen 15:10, 19 April 2017 (UTC) — Preceding unsigned comment added by Kjetil1001 (talkcontribs)
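Such a section would follow directly from the definition: since the mixture density is a weighted sum, the expectation of e^{tX} distributes over the components. A sketch of what it could state:

${\displaystyle M_{X}(t)=\operatorname {E} [e^{tX}]=\sum _{i=1}^{n}w_{i}\operatorname {E} [e^{tX_{i}}]=\sum _{i=1}^{n}w_{i}M_{X_{i}}(t),\qquad \varphi _{X}(t)=\sum _{i=1}^{n}w_{i}\varphi _{X_{i}}(t).}$

That is, the moment generating function (and likewise the characteristic function) of a finite mixture is the weighted sum of the component generating functions, with the mixture weights w_i.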