I'm copying & pasting article content in here, so that I can clip bits I don't need, and add reminder notes (relevant to the SOA P-exam) or personal notes (scattered pedantry).
- 1 Encoding a random variable (p.d.f., c.d.f., ch.f., m.g.f.)
- 2 Special discrete random variables
- 3 Stuff I haven't copypasted into place yet
Encoding a random variable (p.d.f., c.d.f., ch.f., m.g.f.)
The most natural way to describe a discrete random variable is to give its probability distribution function $p_X$, where $p_X(x)$ reports the probability of the event $X = x$. To describe a wider class of random variables, we interpret this $p(x)$ as specifying the atoms of a finite real measure (on whatever space supports it, usually $\mathbb{R}$).
A complete description of a random variable may be given in the form of a cumulative distribution function, characteristic function, or moment-generating function, defined by
$$F_X(x) = \Pr(X \le x), \qquad \varphi_X(t) = \mathbb{E}\left[e^{itX}\right], \qquad M_X(t) = \mathbb{E}\left[e^{tX}\right].$$
We drop the subscript as soon as we're sure Prof. Renalis is not looking.
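(Personal note: a minimal sketch of these encodings for a fair six-sided die, just to fix the definitions in my head; the function names are mine, not from the article.)

```python
import math

# A fair six-sided die as a discrete random variable:
# its p.d.f. assigns mass 1/6 to each outcome 1..6.
pdf = {x: 1 / 6 for x in range(1, 7)}

def cdf(x):
    """F(x) = Pr(X <= x): accumulate the atoms at or below x."""
    return sum(p for atom, p in pdf.items() if atom <= x)

def mgf(t):
    """M(t) = E[e^{tX}], a finite sum for a discrete distribution."""
    return sum(p * math.exp(t * x) for x, p in pdf.items())

def chf(t):
    """phi(t) = E[e^{itX}]; complex-valued and always defined."""
    return sum(p * complex(math.cos(t * x), math.sin(t * x))
               for x, p in pdf.items())

print(cdf(3))    # 0.5
print(mgf(0.0))  # 1.0 -- M(0) is always 1
print(chf(0.5))
```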
Special discrete random variables
Binomial and Negative Binomial
Formally, 'trials' are i.i.d. binary random variables, with p.d.f. $p(1) = p$, $p(0) = 1 - p$.
Binomial: Given $n$ trials, the probability of obtaining exactly $k$ successes is
$$\Pr(X = k) = \binom{n}{k} p^k (1-p)^{n-k}.$$
(It's the probability of obtaining k successes and then n-k failures, times the number of ways to rearrange where those k successes and n-k failures occur.)
A binomial rv has mean $np$ and variance $np(1-p)$.
Negative binomial: Given a required threshold of $r$ successes, the probability of needing exactly $n$ trials is
$$\Pr(N = n) = \binom{n-1}{r-1} p^r (1-p)^{n-r}.$$
(It's like the binomial, except we can't rearrange the final success.)
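A quick sanity-check sketch of both p.d.f.s (each should sum to 1 over its support); it assumes the standard-library math.comb, and the names are mine:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def neg_binomial_pmf(n, r, p):
    """Probability that the r-th success arrives on exactly the n-th trial:
    the last trial must be a success, so only the first n-1 trials rearrange."""
    return comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

# Each p.d.f. sums to 1 over its support (the second, approximately,
# since its support is infinite and we truncate the sum).
print(sum(binomial_pmf(k, 10, 0.3) for k in range(11)))         # 1.0
print(sum(neg_binomial_pmf(n, 3, 0.3) for n in range(3, 500)))  # ~1.0
```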
Stuff I haven't copypasted into place yet
There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables.
The moment-generating function does not always exist even for real-valued arguments, unlike the characteristic function. There are relations between the behavior of the moment-generating function of a distribution and properties of the distribution, such as the existence of moments.
The moment-generating function of a real-valued random variable $X$ is
$$M_X(t) := \mathbb{E}\left[e^{tX}\right], \qquad t \in \mathbb{R},$$
wherever this expectation exists; $M_X(0)$ always exists and is equal to 1.
A key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the characteristic function always exists (because it is the integral of a bounded function on a space of finite measure), and thus may be used instead.
More generally, where $\mathbf{X} = (X_1, \ldots, X_n)^{\mathrm{T}}$ is an $n$-dimensional random vector, one uses $\mathbf{t}^{\mathrm{T}}\mathbf{X}$ instead of $tX$:
$$M_{\mathbf{X}}(\mathbf{t}) := \mathbb{E}\left[e^{\mathbf{t}^{\mathrm{T}}\mathbf{X}}\right].$$
The reason for defining this function is that it can be used to find all the moments of the distribution. The series expansion of $e^{tX}$ is
$$e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots,$$
hence
$$M_X(t) = \mathbb{E}\left[e^{tX}\right] = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots,$$
where $m_n$ is the $n$th moment. If we differentiate $M_X(t)$ $i$ times with respect to $t$ and then set $t = 0$, we shall therefore obtain the $i$th moment about the origin, $m_i$.
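To convince myself: a sympy sketch that differentiates the normal m.g.f. $e^{t\mu + \sigma^2 t^2/2}$ (it also appears in the table below) and recovers the first two moments about the origin; assumes sympy is available:

```python
import sympy as sp

t = sp.symbols('t', real=True)
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# M_X(t) for X ~ N(mu, sigma^2).
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

# Differentiate i times and set t = 0 to get the i-th moment.
m1 = sp.diff(M, t, 1).subs(t, 0)  # -> mu
m2 = sp.diff(M, t, 2).subs(t, 0)  # -> mu**2 + sigma**2
print(sp.simplify(m1), sp.simplify(m2))
```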
| Distribution | Moment-generating function $M_X(t)$ | Characteristic function $\varphi(t)$ |
|---|---|---|
| Binomial $B(n, p)$ | $(1 - p + pe^t)^n$ | $(1 - p + pe^{it})^n$ |
| Uniform $U(a, b)$ | $\dfrac{e^{tb} - e^{ta}}{t(b - a)}$ | $\dfrac{e^{itb} - e^{ita}}{it(b - a)}$ |
| Normal $N(\mu, \sigma^2)$ | $e^{t\mu + \sigma^2 t^2/2}$ | $e^{it\mu - \sigma^2 t^2/2}$ |
| Gamma $\Gamma(k, \theta)$ | $(1 - t\theta)^{-k}$ | $(1 - it\theta)^{-k}$ |
| Multivariate normal $N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ | $e^{\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu} + \frac{1}{2}\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}$ | $e^{i\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu} - \frac{1}{2}\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}$ |
| Laplace $L(\mu, b)$ | $\dfrac{e^{t\mu}}{1 - b^2 t^2}$ | $\dfrac{e^{it\mu}}{1 + b^2 t^2}$ |
| Cauchy $\mathrm{Cauchy}(\mu, \theta)$ | not defined | $e^{it\mu - \theta \lvert t \rvert}$ |
| Negative binomial $NB(r, p)$ | $\left(\dfrac{1 - p}{1 - pe^t}\right)^r$ | $\left(\dfrac{1 - p}{1 - pe^{it}}\right)^r$ |
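(Reminder-note sketch: spot-checking the Gamma row by Monte Carlo with numpy; the parameter values are arbitrary choices of mine, with $t < 1/\theta$ so the m.g.f. exists.)

```python
import numpy as np

rng = np.random.default_rng(0)
k, theta, t = 2.5, 1.2, 0.3  # shape, scale, and a point with t < 1/theta
samples = rng.gamma(k, theta, size=1_000_000)

empirical = np.mean(np.exp(t * samples))  # sample estimate of E[e^{tX}]
closed_form = (1 - t * theta) ** (-k)     # Gamma row of the table
print(empirical, closed_form)             # agree to ~2-3 decimals
```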
So the characteristic function is a Wick rotation of the moment-generating function $M_X(t)$.
The moment-generating function is given by the Riemann–Stieltjes integral
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} \, dF(x),$$
where $F$ is the cumulative distribution function.
Sum of independent random variables
If $X_1, X_2, \ldots, X_n$ is a sequence of independent (and not necessarily identically distributed) random variables, and
$$S_n = \sum_{i=1}^{n} a_i X_i,$$
where the $a_i$ are constants, then the probability density function for $S_n$ is the convolution of the probability density functions of each of the $X_i$, and the moment-generating function for $S_n$ is given by
$$M_{S_n}(t) = M_{X_1}(a_1 t)\, M_{X_2}(a_2 t) \cdots M_{X_n}(a_n t).$$
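A numerical check of this product rule for two independent normals, with arbitrary constants $a_1, a_2$ of my choosing; the Monte Carlo estimate of $\mathbb{E}[e^{tS_n}]$ should match the product of the closed-form m.g.f.s from the table:

```python
import numpy as np

rng = np.random.default_rng(1)
t = 0.4
a1, a2 = 2.0, -1.0                           # arbitrary constants
x1 = rng.normal(1.0, 0.5, size=1_000_000)    # X1 ~ N(1, 0.5^2)
x2 = rng.normal(-2.0, 1.5, size=1_000_000)   # X2 ~ N(-2, 1.5^2), independent

def normal_mgf(t, mu, sigma):
    return np.exp(mu * t + sigma**2 * t**2 / 2)

lhs = np.mean(np.exp(t * (a1 * x1 + a2 * x2)))                   # E[e^{t S_n}]
rhs = normal_mgf(a1 * t, 1.0, 0.5) * normal_mgf(a2 * t, -2.0, 1.5)
print(lhs, rhs)  # both ~ exp(1.86)
```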
Vector-valued random variables
For vector-valued random variables $\mathbf{X}$ with real components, the moment-generating function is given by
$$M_X(\mathbf{t}) = \mathbb{E}\left[e^{\langle \mathbf{t}, \mathbf{X} \rangle}\right],$$
where $\mathbf{t}$ is a vector and $\langle \cdot, \cdot \rangle$ is the dot product.
The most important property of the moment-generating function is that if two distributions have the same moment-generating function, then they are identical at all points. That is, if for all values of $t$,
$$M_X(t) = M_Y(t),$$
then
$$F_X(x) = F_Y(x)$$
for all values of $x$ (or equivalently, $X$ and $Y$ have the same distribution). This statement is not equivalent to "if two distributions have the same moments, then they are identical at all points", because in some cases the moments exist and yet the moment-generating function does not, because in some cases the limit
$$\lim_{n \to \infty} \sum_{i=0}^{n} \frac{t^i m_i}{i!}$$
does not exist. This happens for the lognormal distribution.
Calculations of moments
The $n$th moment about the origin is recovered by differentiating the moment-generating function $n$ times and evaluating at $t = 0$:
$$m_n = \mathbb{E}[X^n] = M_X^{(n)}(0);$$
$n$ should be a nonnegative integer.
Hoeffding's lemma provides a bound on the moment-generating function in the case of a zero-mean, bounded random variable.
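As a reminder note, the standard statement of the lemma (from memory, not from the pasted text): if $\mathbb{E}[X] = 0$ and $a \le X \le b$ almost surely, then
$$\mathbb{E}\left[e^{tX}\right] \le \exp\!\left(\frac{t^2 (b - a)^2}{8}\right) \qquad \text{for all } t \in \mathbb{R}.$$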
Relation to other functions
Related to the moment-generating function are a number of other transforms that are common in probability theory:
- characteristic function
- The characteristic function $\varphi_X(t)$ is related to the moment-generating function via $\varphi_X(t) = M_{iX}(t) = M_X(it)$: the characteristic function is the moment-generating function of $iX$, or equivalently the moment-generating function of $X$ evaluated on the imaginary axis. This function can also be viewed as the Fourier transform of the probability density function, which can therefore be deduced from it by the inverse Fourier transform.
- cumulant-generating function
- The cumulant-generating function is defined as the logarithm of the moment-generating function; some instead define the cumulant-generating function as the logarithm of the characteristic function, while others call this latter the second cumulant-generating function.
- probability-generating function
- The probability-generating function is defined as $G(z) = \mathbb{E}\left[z^X\right]$. This immediately implies that $G(e^t) = \mathbb{E}\left[e^{tX}\right] = M_X(t)$. (All three relations are sketched in code after this list.)
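A consolidating sketch of the three relations above for the binomial distribution, assuming sympy; I start from the p.g.f. and derive the rest:

```python
import sympy as sp

t, z = sp.symbols('t z')
n = sp.symbols('n', positive=True, integer=True)
p = sp.symbols('p', positive=True)

# Binomial B(n, p): probability-generating function G(z) = E[z^X].
G = (1 - p + p * z) ** n

M = G.subs(z, sp.exp(t))   # m.g.f.: M(t) = G(e^t)
phi = M.subs(t, sp.I * t)  # ch.f.: M evaluated on the imaginary axis
K = sp.log(M)              # cumulant-generating function: log of the m.g.f.

# First cumulant = mean = n*p.
print(sp.simplify(sp.diff(K, t).subs(t, 0)))  # n*p
```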