# Talk:Truncated normal distribution

WikiProject Mathematics (Rated Start-class, Low-importance)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
Field: Probability and statistics
WikiProject Statistics (Rated Start-class, Low-importance)

This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.


## Mean

The expression for the mean is given as: ${\displaystyle \mu +{\frac {\phi (\alpha )-\phi (\beta )}{Z}}\sigma }$. This must be incorrect, because it sometimes gives mean values outside the truncation bounds. For example, ${\displaystyle \mu =0}$, ${\displaystyle \sigma =2}$, ${\displaystyle a=2}$, ${\displaystyle b=10}$ gives a mean of 1.53. I believe the correct expression is ${\displaystyle \mu +{\frac {\phi (\alpha )-\phi (\beta )}{Z}}\sigma ^{2}}$. Agreed? Jtmcg1128 (talk) 18:03, 13 October 2011 (UTC)

• The expression ${\displaystyle \mu +{\frac {\phi (\alpha )-\phi (\beta )}{Z}}\sigma }$ is correct. It's an issue of ${\displaystyle \phi (\cdot )}$ being defined as the standard normal pdf: ${\displaystyle \phi (\xi )={\frac {1}{\sqrt {2\pi }}}\exp \left(-{\frac {1}{2}}\xi ^{2}\right)}$, see discussion on the pdf definition below. The pdf of a distribution with arbitrary mean and standard deviation is ${\displaystyle {\frac {1}{\sigma }}\phi ({\frac {x-\mu }{\sigma }})}$. (Karekafi (talk) 17:25, 14 November 2011 (UTC))
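The disagreement above is easy to settle numerically. The sketch below evaluates the article's mean formula ${\displaystyle \mu +\sigma {\frac {\phi (\alpha )-\phi (\beta )}{Z}}}$ (with ${\displaystyle \phi }$ the standard normal pdf) for the example values in the first comment, using only the Python standard library; the parameter values are taken from that comment, not from the article itself.

```python
from statistics import NormalDist

# Example values from the comment above: mu=0, sigma=2, a=2, b=10.
mu, sigma, a, b = 0.0, 2.0, 2.0, 10.0
std = NormalDist()  # standard normal: phi = std.pdf, Phi = std.cdf

alpha, beta = (a - mu) / sigma, (b - mu) / sigma
Z = std.cdf(beta) - std.cdf(alpha)

# Mean formula as given in the article, with phi the *standard* normal pdf
mean = mu + sigma * (std.pdf(alpha) - std.pdf(beta)) / Z
print(mean)  # ≈ 3.05, which lies inside [2, 10] as required
```

The value 1.53 quoted in the first comment is what one gets by dropping the factor of ${\displaystyle \sigma }$, which suggests that comment's check omitted it.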

## Regarding the pdf

• I am concerned that ${\displaystyle f(x;\mu ,\sigma ,a,b)={\frac {{\frac {1}{\sigma }}\phi ({\frac {X-\mu }{\sigma }})}{\Phi ({\frac {b-\mu }{\sigma }})-\Phi ({\frac {a-\mu }{\sigma }})}}}$ cannot be the formula for the PDF of a truncated normal random variable. Say there is a left-truncated (a = 0) normal random variable with positive mean. If we choose X to be a negative value, then ${\displaystyle \phi ({\frac {X-\mu }{\sigma }})}$ is positive, ${\displaystyle \Phi ({\frac {b-\mu }{\sigma }})}$ is 1 and ${\displaystyle \Phi ({\frac {a-\mu }{\sigma }})}$ is positive. Altogether, the PDF cannot be zero as it should be. Perhaps defining it piecewise is the most logical idea because I cannot think of an explicit formula.
• Well, the article already mentioned that the domain of X is [a,b]. Thus ${\displaystyle f(x;\mu ,\sigma ,a,b)}$ is zero outside [a, b]. So, in your example, if a = 0, then f(x) is zero for negative x. Robbyjo (talk) 20:04, 20 February 2008 (UTC)
• In my opinion the formula for the pdf is incorrect. It should be: ${\displaystyle f(x;\mu ,\sigma ,a,b)={\frac {\phi ({\frac {X-\mu }{\sigma }})}{\Phi ({\frac {b-\mu }{\sigma }})-\Phi ({\frac {a-\mu }{\sigma }})}}}$. In the current version, if you truncate at a=-inf, b=+inf you will not get the normal distribution. Compare also: http://rss.acs.unt.edu/Rdoc/library/msm/html/tnorm.html —Preceding unsigned comment added by 128.143.16.201 (talk) 20:31, 20 February 2008 (UTC)
• It's not a typo; note that ${\displaystyle \phi (\cdot )}$ is the standard normal pdf. So ${\displaystyle {\frac {1}{\sigma }}\phi ({\frac {X-\mu }{\sigma }})}$ gives you the pdf for ${\displaystyle X\sim N(\mu ,\sigma )}$. Write that out and you'll see why. Josuechan (talk) 23:11, 20 February 2008 (UTC)
• I reverted the numerator for the PDF back to ${\displaystyle {\frac {1}{\sigma }}\phi \left({\frac {X-\mu }{\sigma }}\right)}$ because the edit by IP: 152.78.63.13 appears incorrect as Josuechan pointed out above. I also cleaned up the formatting of the discussion a little. --V madhu (talk) 18:17, 4 August 2009 (UTC)
The correct density function is ${\displaystyle f(x;\mu ,\sigma ,a,b)={\frac {{\frac {1}{\sigma }}\phi ({\frac {x-\mu }{\sigma }})}{\Phi ({\frac {b-\mu }{\sigma }})-\Phi ({\frac {a-\mu }{\sigma }})}}}$. This function is both positive and integrates to 1 (by the change of variables ${\displaystyle dz={\frac {dx}{\sigma }}}$). The incorrect version, listed above, integrates to ${\displaystyle \sigma }$. Iwaterpolo (talk) 01:57, 3 June 2010 (UTC)
• I agree on the PDF formulation. For an easier understanding I added the exact definition of the standard normal pdf in the text below. This may help people not completely familiar with the exact notation to more quickly understand the problem. (Karekafi (talk) 16:40, 14 November 2011 (UTC))
• I have changed back to ${\displaystyle f(x;\mu ,\sigma ,a,b)={\frac {{\frac {1}{\sigma }}\phi ({\frac {x-\mu }{\sigma }})}{\Phi ({\frac {b-\mu }{\sigma }})-\Phi ({\frac {a-\mu }{\sigma }})}}}$. Note that ${\displaystyle \phi (\cdot )}$ is the standard normal pdf, while the left-hand side f(x) is the truncated normal pdf. The integral is then exactly 1. — Preceding unsigned comment added by Syz2 (talkcontribs) 03:17, 24 April 2014 (UTC)
• Could it be that the PDF in the box misses ${\displaystyle 1/\sigma }$ factor in the numerator in front of ${\displaystyle \phi (\xi )}$? — Preceding unsigned comment added by Fuzzyrandom (talkcontribs) 19:50, 14 July 2015 (UTC)
• I think so, Fuzzyrandom. These formulae are a little confusing, with so many previous definitions... Given that I use this distribution once in a while, I have had this bookmarked for ages to try and clean it up, but I never get to it :-| I'll add that one. Tks. - Nabla (talk) 23:38, 14 July 2015 (UTC)
• PS: Interestingly it was me who introduced the said possibly confusing notation back in 2011! (that's my previous edit here...) :-) Oh well... at least the PDF was OK then. - Nabla (talk) 23:51, 14 July 2015 (UTC)
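The normalization argument in this thread can be checked directly. The sketch below integrates both candidate densities numerically (midpoint rule, stdlib only); the parameter values are arbitrary ones chosen for illustration. The version with the ${\displaystyle 1/\sigma }$ factor integrates to 1, the version without it to ${\displaystyle \sigma }$, as Iwaterpolo states above.

```python
from statistics import NormalDist

# Arbitrary illustrative parameters (not from the article)
mu, sigma, a, b = 1.0, 2.0, -1.0, 4.0
std = NormalDist()  # standard normal: phi = std.pdf, Phi = std.cdf
Z = std.cdf((b - mu) / sigma) - std.cdf((a - mu) / sigma)

def f(x, with_factor=True):
    """Candidate truncated-normal pdf, with or without the 1/sigma factor."""
    num = std.pdf((x - mu) / sigma)
    if with_factor:
        num /= sigma
    return num / Z

# Midpoint-rule integration over the support [a, b]
n = 100_000
dx = (b - a) / n
integral_with = sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx
integral_without = sum(f(a + (i + 0.5) * dx, False) for i in range(n)) * dx
print(integral_with, integral_without)  # ≈ 1.0 and ≈ sigma = 2.0
```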

## Entropy formula

Appears to be wrong. The values from the formula don't agree with numerical computation of the entropy. I derived the case for a one-sided truncated normal, and that differs from this case, but I haven't had time to go back and derive the two-sided case. It would be nice if someone could track down a reference for this or find the correct formula. --Jpillow (talk) 17:20, 12 January 2012 (UTC)

• I derived the correct formula by taking the mean of the negative log density. The confusion in the incorrect formula appears to be that it took ${\displaystyle \mu }$ to be the mean of the truncated r.v. The difference cancels the last term in the variance, which yields the simple formula for the entropy. I haven't looked up a reference for the entropy. If anyone can find one, that would be good. Troos (talk) 18:27, 28 September 2013 (UTC)

## Simulation

The simulation section appears to be wrong/misleading. I understand the formula as drawing uniformly between the lower and upper bounds of the CDF of the non-truncated normal (with appropriate parameters), then inverting to obtain the actual value. However, in this case the use of ${\displaystyle \Phi }$ is misleading because it is referring to the CDF of the normal distribution with parameters ${\displaystyle \mu ,\sigma }$ instead of the standard normal.

There are several ways to resolve this, but I feel that the following would be easiest. Note that using ${\displaystyle \alpha ,\beta }$ with the standard normal CDF instead of ${\displaystyle a,b}$ would require the result to be multiplied by ${\displaystyle \sigma }$ and then added to ${\displaystyle \mu }$.

A random variate x defined as ${\displaystyle x=\Phi '^{-1}(\Phi '(a)+U*(\Phi '(b)-\Phi '(a)))}$ with ${\displaystyle \Phi '}$ the CDF of a normal distribution with mean ${\displaystyle \mu }$ and variance ${\displaystyle \sigma ^{2}}$, and ${\displaystyle \Phi '^{-1}}$ its inverse, ${\displaystyle U}$ a uniform random number on ${\displaystyle (0,1)}$, follows the distribution truncated to the range ${\displaystyle (a,b)}$.

Heheman3000 (talk) 23:42, 29 December 2012 (UTC)

Heheman3000, yes it was incorrect. I changed it a couple months back. Mguggis (talk) 23:29, 31 July 2013 (UTC)
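Heheman3000's formulation above translates directly into code. The sketch below implements the inverse-CDF sampler ${\displaystyle x=\Phi '^{-1}(\Phi '(a)+U\,(\Phi '(b)-\Phi '(a)))}$ using Python's stdlib `NormalDist`, which supplies both ${\displaystyle \Phi '}$ (`cdf`) and ${\displaystyle \Phi '^{-1}}$ (`inv_cdf`) for a normal with the given ${\displaystyle \mu ,\sigma }$; the function name and test values are illustrative, not from the article.

```python
import random
from statistics import NormalDist

def truncated_normal_sample(mu, sigma, a, b, rng=random):
    """Draw one sample from N(mu, sigma^2) truncated to (a, b),
    via the inverse-CDF formula x = Phi'^{-1}(Phi'(a) + U*(Phi'(b) - Phi'(a)))."""
    d = NormalDist(mu, sigma)      # Phi' = d.cdf, Phi'^{-1} = d.inv_cdf
    u = rng.random()               # U ~ Uniform(0, 1)
    p = d.cdf(a) + u * (d.cdf(b) - d.cdf(a))
    return d.inv_cdf(p)            # always lands in (a, b)

# Illustrative check with the values from the Mean thread above
samples = [truncated_normal_sample(0.0, 2.0, 2.0, 10.0) for _ in range(10_000)]
print(min(samples), max(samples))  # both within (2, 10)
```

Because every draw costs one `inv_cdf` evaluation and never rejects, this stays efficient even when (a, b) sits far in the tail, where naive rejection sampling from the untruncated normal would discard almost every draw.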