Noncentral chi-squared distribution

Noncentral chi-square
Probability density function (plot)
Cumulative distribution function (plot)

Parameters        $k > 0$ (degrees of freedom); $\lambda \geq 0$ (non-centrality parameter)
Support           $x \in [0, +\infty)$
PDF               $\frac{1}{2} e^{-(x+\lambda)/2} \left(\frac{x}{\lambda}\right)^{k/4 - 1/2} I_{k/2-1}\!\left(\sqrt{\lambda x}\right)$
CDF               $1 - Q_{\frac{k}{2}}\!\left(\sqrt{\lambda}, \sqrt{x}\right)$, with Marcum Q-function $Q_M(a, b)$
Mean              $k + \lambda$
Variance          $2(k + 2\lambda)$
Skewness          $\frac{2^{3/2}(k + 3\lambda)}{(k + 2\lambda)^{3/2}}$
Excess kurtosis   $\frac{12(k + 4\lambda)}{(k + 2\lambda)^2}$
MGF               $\frac{\exp\!\left(\frac{\lambda t}{1 - 2t}\right)}{(1 - 2t)^{k/2}}$, for $2t < 1$
CF                $\frac{\exp\!\left(\frac{i\lambda t}{1 - 2it}\right)}{(1 - 2it)^{k/2}}$

In probability theory and statistics, the noncentral chi-square distribution (or noncentral chi-squared distribution, noncentral $\chi^2$-distribution) is a noncentral generalization of the chi-square distribution. It often arises in the power analysis of statistical tests in which the null distribution is (perhaps asymptotically) a chi-square distribution; important examples of such tests are the likelihood-ratio tests.

Background

Let $X_1, X_2, \ldots, X_k$ be k independent, normally distributed random variables with means $\mu_1, \mu_2, \ldots, \mu_k$ and unit variances. Then the random variable

$$\sum_{i=1}^{k} X_i^2$$

is distributed according to the noncentral chi-square distribution. It has two parameters: $k$ which specifies the number of degrees of freedom (i.e. the number of $X_i$), and $\lambda$ which is related to the means of the random variables $X_i$ by

$$\lambda = \sum_{i=1}^{k} \mu_i^2 .$$

$\lambda$ is sometimes called the noncentrality parameter. Note that some references define $\lambda$ in other ways, such as half of the above sum, or its square root.

This distribution arises in multivariate statistics as a derivative of the multivariate normal distribution. While the central chi-square distribution is the squared norm of a random vector with $N(0_k, I_k)$ distribution (i.e., the squared distance from the origin to a point taken at random from that distribution), the non-central $\chi^2$ is the squared norm of a random vector with $N(\mu, I_k)$ distribution. Here $0_k$ is a zero vector of length k, $\mu = (\mu_1, \ldots, \mu_k)$ and $I_k$ is the identity matrix of size k.
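
As an illustration of this construction, the following minimal Python sketch (assuming NumPy and SciPy are available; the values of k and the means are arbitrary) builds the sum of squares of unit-variance normals and compares the simulated sample with SciPy's noncentral chi-square distribution via a Kolmogorov-Smirnov test:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    k = 4                                  # degrees of freedom
    mu = np.array([1.0, -0.5, 2.0, 0.0])   # means of the k unit-variance normals
    lam = np.sum(mu**2)                    # noncentrality parameter: sum of squared means

    # X_i ~ N(mu_i, 1); the sum of their squares should follow chi'^2_k(lam)
    X = rng.normal(loc=mu, scale=1.0, size=(200_000, k))
    samples = np.sum(X**2, axis=1)

    print(stats.kstest(samples, stats.ncx2(df=k, nc=lam).cdf))

A large p-value indicates that the simulated sums are consistent with the noncentral chi-square distribution with $\lambda = \sum_i \mu_i^2$.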

Definition

The probability density function (pdf) is given by

$$f_X(x; k, \lambda) = \sum_{i=0}^{\infty} \frac{e^{-\lambda/2} (\lambda/2)^i}{i!}\, f_{Y_{k+2i}}(x),$$

where $Y_q$ is distributed as chi-square with $q$ degrees of freedom.

From this representation, the noncentral chi-square distribution is seen to be a Poisson-weighted mixture of central chi-square distributions. Suppose that a random variable J has a Poisson distribution with mean $\lambda/2$, and the conditional distribution of Z given J = i is chi-square with k + 2i degrees of freedom. Then the unconditional distribution of Z is non-central chi-square with k degrees of freedom, and non-centrality parameter $\lambda$.
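
This mixture representation can be illustrated by two-stage sampling. A minimal Python sketch (NumPy and SciPy assumed; the values of k and λ are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    k, lam = 3, 5.0
    n = 200_000

    # Two-stage sampling: J ~ Poisson(lam/2), then Z | J = i ~ chi-square(k + 2i)
    J = rng.poisson(lam / 2.0, size=n)
    Z = rng.chisquare(df=k + 2 * J)

    # The mixture should agree with the noncentral chi-square(k, lam) distribution
    print(stats.kstest(Z, stats.ncx2(df=k, nc=lam).cdf))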

Alternatively, the pdf can be written as

$$f_X(x; k, \lambda) = \frac{1}{2} e^{-(x+\lambda)/2} \left(\frac{x}{\lambda}\right)^{k/4 - 1/2} I_{k/2-1}\!\left(\sqrt{\lambda x}\right),$$

where $I_\nu(y)$ is a modified Bessel function of the first kind given by

$$I_\nu(y) = \left(\frac{y}{2}\right)^{\nu} \sum_{j=0}^{\infty} \frac{(y^2/4)^j}{j!\,\Gamma(\nu + j + 1)}.$$

Using the relation between Bessel functions and hypergeometric functions, the pdf can also be written as:[1]

$$f_X(x; k, \lambda) = \frac{e^{-\lambda/2}}{2^{k/2}\,\Gamma(k/2)}\; {}_{0}F_{1}\!\left(; \tfrac{k}{2}; \tfrac{\lambda x}{4}\right) e^{-x/2} x^{k/2 - 1}.$$
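
As a sanity check on the Bessel-function form, the sketch below (SciPy assumed; the grid of x values and the parameters are arbitrary) evaluates it directly and compares with scipy.stats.ncx2.pdf:

    import numpy as np
    from scipy import stats, special

    k, lam = 4, 2.5
    x = np.linspace(0.1, 20, 5)

    # Bessel-function form of the pdf (valid for lam > 0)
    pdf_bessel = 0.5 * np.exp(-(x + lam) / 2) * (x / lam) ** (k / 4 - 0.5) \
                 * special.iv(k / 2 - 1, np.sqrt(lam * x))

    print(np.allclose(pdf_bessel, stats.ncx2.pdf(x, df=k, nc=lam)))  # True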

Siegel (1979) discusses the case k = 0 specifically (zero degrees of freedom), in which case the distribution has a discrete component at zero.

Properties

Moment generating function

The moment-generating function is given by

$$M(t; k, \lambda) = \frac{\exp\!\left(\frac{\lambda t}{1 - 2t}\right)}{(1 - 2t)^{k/2}}, \qquad 2t < 1.$$
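
A quick Monte Carlo check of this formula (a sketch, not a proof; the parameter values are arbitrary and t must satisfy 2t < 1):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    k, lam, t = 3, 4.0, 0.1

    # Closed-form MGF versus the empirical mean of exp(t*X)
    mgf_closed = np.exp(lam * t / (1 - 2 * t)) / (1 - 2 * t) ** (k / 2)
    samples = stats.ncx2.rvs(df=k, nc=lam, size=500_000, random_state=rng)
    mgf_mc = np.mean(np.exp(t * samples))

    print(mgf_closed, mgf_mc)  # the two values should be close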

Moments

The first few raw moments are:

$$\mu'_1 = k + \lambda$$
$$\mu'_2 = (k+\lambda)^2 + 2(k+2\lambda)$$
$$\mu'_3 = (k+\lambda)^3 + 6(k+\lambda)(k+2\lambda) + 8(k+3\lambda)$$
$$\mu'_4 = (k+\lambda)^4 + 12(k+\lambda)^2(k+2\lambda) + 4(11k^2 + 44k\lambda + 36\lambda^2) + 48(k+4\lambda)$$

The first few central moments are:

$$\mu_2 = 2(k+2\lambda)$$
$$\mu_3 = 8(k+3\lambda)$$
$$\mu_4 = 12(k+2\lambda)^2 + 48(k+4\lambda)$$

The nth cumulant is

$$\kappa_n = 2^{n-1}(n-1)!\,(k + n\lambda).$$

Hence

$$\mu'_n = 2^{n-1}(n-1)!\,(k + n\lambda) + \sum_{j=1}^{n-1} \frac{(n-1)!\,2^{j-1}}{(n-j)!}\,(k + j\lambda)\,\mu'_{n-j}.$$
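
In particular, the mean $k + \lambda$, variance $2(k + 2\lambda)$, skewness $2^{3/2}(k + 3\lambda)/(k + 2\lambda)^{3/2}$ and excess kurtosis $12(k + 4\lambda)/(k + 2\lambda)^2$ follow directly from the cumulants. A short SciPy check (parameter values arbitrary):

    from scipy import stats

    k, lam = 5, 3.0

    # Mean, variance, skewness and excess kurtosis derived from the cumulants
    # kappa_n = 2^(n-1) * (n-1)! * (k + n*lam)
    mean = k + lam
    var = 2 * (k + 2 * lam)
    skew = 2 ** 1.5 * (k + 3 * lam) / (k + 2 * lam) ** 1.5
    exkurt = 12 * (k + 4 * lam) / (k + 2 * lam) ** 2

    print((mean, var, skew, exkurt))
    print(stats.ncx2.stats(df=k, nc=lam, moments='mvsk'))  # should match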

Cumulative distribution function

Again using the relation between the central and noncentral chi-square distributions, the cumulative distribution function (cdf) can be written as

$$P(x; k, \lambda) = e^{-\lambda/2} \sum_{j=0}^{\infty} \frac{(\lambda/2)^j}{j!}\, Q(x; k + 2j),$$

where $Q(x; k)$ is the cumulative distribution function of the central chi-square distribution with k degrees of freedom, which is given by

$$Q(x; k) = \frac{\gamma(k/2,\, x/2)}{\Gamma(k/2)},$$

and where $\gamma(s, t)$ is the lower incomplete gamma function.

The Marcum Q-function $Q_M(a, b)$ can also be used to represent the cdf:[2]

$$P(x; k, \lambda) = 1 - Q_{\frac{k}{2}}\!\left(\sqrt{\lambda}, \sqrt{x}\right).$$
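
The series form of the cdf is straightforward to evaluate by truncation. The sketch below (SciPy assumed; the 200-term truncation and the parameter values are arbitrary choices) compares a truncated sum with scipy.stats.ncx2.cdf; gammainc is SciPy's regularized lower incomplete gamma function, which equals the central chi-square cdf used above:

    import numpy as np
    from scipy import stats, special

    k, lam, x = 4, 6.0, 7.5

    # Poisson(lam/2)-weighted series of central chi-square cdfs, truncated at 200 terms
    j = np.arange(200)
    weights = stats.poisson.pmf(j, lam / 2)
    cdf_series = np.sum(weights * special.gammainc((k + 2 * j) / 2, x / 2))

    print(cdf_series, stats.ncx2.cdf(x, df=k, nc=lam))  # should agree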

Approximation (including for quantiles)

Abdel-Aty[3] derives (as "first approx.") a non-central Wilson-Hilferty approximation:

$$\left(\frac{\chi'^2}{k + \lambda}\right)^{1/3}$$

is approximately normally distributed, i.e.,

$$\left(\frac{\chi'^2}{k + \lambda}\right)^{1/3} \approx \mathcal{N}\!\left(1 - \frac{2}{9}\,\frac{k + 2\lambda}{(k + \lambda)^2},\; \frac{2}{9}\,\frac{k + 2\lambda}{(k + \lambda)^2}\right),$$

which is quite accurate and adapts well to the noncentrality. Also, the variance $\frac{2}{9}\,\frac{k + 2\lambda}{(k + \lambda)^2}$ becomes $\frac{2}{9k}$ for $\lambda = 0$, the (central) chi-squared case.
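
A small numerical comparison of this approximation with the exact cdf (SciPy assumed; parameter values arbitrary):

    import numpy as np
    from scipy import stats

    k, lam = 6, 4.0
    x = np.array([2.0, 8.0, 15.0, 25.0])

    # Non-central Wilson-Hilferty: (X/(k+lam))^(1/3) is approximately normal
    # with mean 1 - 2*h/9 and variance 2*h/9, where h = (k + 2*lam)/(k + lam)**2
    h = (k + 2 * lam) / (k + lam) ** 2
    approx = stats.norm.cdf(((x / (k + lam)) ** (1 / 3) - (1 - 2 * h / 9))
                            / np.sqrt(2 * h / 9))

    print(approx)
    print(stats.ncx2.cdf(x, df=k, nc=lam))  # exact values for comparison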

Sankaran[4] discusses a number of closed-form approximations for the cumulative distribution function. In an earlier paper,[5] he derived and states the following approximation:

$$P(x; k, \lambda) \approx \Phi\!\left\{\frac{\left(\frac{x}{k + \lambda}\right)^{h} - \left(1 + h p \left(h - 1 - 0.5 (2 - h) m p\right)\right)}{h \sqrt{2p}\,\left(1 + 0.5 m p\right)}\right\},$$

where

$\Phi(\cdot)$ denotes the cumulative distribution function of the standard normal distribution;

$$h = 1 - \frac{2}{3}\,\frac{(k + \lambda)(k + 3\lambda)}{(k + 2\lambda)^2}, \qquad p = \frac{k + 2\lambda}{(k + \lambda)^2}, \qquad m = (h - 1)(1 - 3h).$$

This and other approximations are discussed in a later text book.[6]

For a given probability, these formulas are easily inverted to provide the corresponding approximation for $x$, to compute approximate quantiles.
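
For example, Sankaran's approximation above can be inverted in this way. The sketch below (SciPy assumed; parameter values arbitrary) computes the resulting approximate quantile and compares it with scipy.stats.ncx2.ppf:

    import numpy as np
    from scipy import stats

    k, lam, q = 5, 10.0, 0.95

    # Sankaran's parameters as defined above
    h = 1 - (2 / 3) * (k + lam) * (k + 3 * lam) / (k + 2 * lam) ** 2
    p = (k + 2 * lam) / (k + lam) ** 2
    m = (h - 1) * (1 - 3 * h)

    # Invert P(x; k, lam) ~ Phi((u - c)/s) with u = (x/(k+lam))**h
    z = stats.norm.ppf(q)
    c = 1 + h * p * (h - 1 - 0.5 * (2 - h) * m * p)
    s = h * np.sqrt(2 * p) * (1 + 0.5 * m * p)
    x_approx = (k + lam) * (c + s * z) ** (1 / h)

    print(x_approx, stats.ncx2.ppf(q, df=k, nc=lam))  # approximate vs. exact quantile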

Derivation of the pdf

The derivation of the probability density function is most easily done by performing the following steps:

  1. Since $X_1, \ldots, X_k$ have unit variances, their joint distribution is spherically symmetric, up to a location shift.
  2. The spherical symmetry then implies that the distribution of $X = \sum_{i=1}^k X_i^2$ depends on the means only through the squared length, $\lambda = \sum_{i=1}^k \mu_i^2$. Without loss of generality, we can therefore take $\mu_1 = \sqrt{\lambda}$ and $\mu_2 = \cdots = \mu_k = 0$.
  3. Now derive the density of $X = X_1^2$ (i.e. the k = 1 case). Simple transformation of random variables shows that

$$f_X(x; 1, \lambda) = \frac{1}{2\sqrt{x}} \left( \varphi\!\left(\sqrt{x} - \sqrt{\lambda}\right) + \varphi\!\left(\sqrt{x} + \sqrt{\lambda}\right) \right) = \frac{1}{\sqrt{2\pi x}}\, e^{-(x+\lambda)/2} \cosh\!\left(\sqrt{\lambda x}\right),$$

  where $\varphi(\cdot)$ is the standard normal density (a numerical check of this density appears after this list).
  4. Expand the cosh term in a Taylor series. This gives the Poisson-weighted mixture representation of the density, still for k = 1. The indices on the chi-square random variables in the series above are 1 + 2i in this case.
  5. Finally, the general case. We have assumed, without loss of generality, that $X_2, \ldots, X_k$ are standard normal, and so $X_2^2 + \cdots + X_k^2$ has a central chi-square distribution with (k − 1) degrees of freedom, independent of $X_1^2$. Using the Poisson-weighted mixture representation for $X_1^2$, and the fact that the sum of independent chi-square random variables is also chi-square, completes the result. The indices in the series are (1 + 2i) + (k − 1) = k + 2i as required.
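
The k = 1 density in step 3 can be checked numerically against SciPy's implementation (a minimal sketch; the value of λ and the grid of x values are arbitrary):

    import numpy as np
    from scipy import stats

    lam = 3.0
    x = np.linspace(0.1, 10, 5)

    # Density of X = X_1^2 with X_1 ~ N(sqrt(lam), 1), as derived in step 3
    pdf_k1 = np.exp(-(x + lam) / 2) * np.cosh(np.sqrt(lam * x)) / np.sqrt(2 * np.pi * x)

    print(np.allclose(pdf_k1, stats.ncx2.pdf(x, df=1, nc=lam)))  # True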

Related distributions

  • If $V$ is chi-square distributed, $V \sim \chi_k^2$, then $V$ is also non-central chi-square distributed: $V \sim {\chi'}_k^2(0)$.
  • A linear combination of independent noncentral chi-squared variables $\xi = \sum_i \lambda_i Y_i + c$, with $Y_i \sim {\chi'}^2_{m_i}(\delta_i^2)$, is generalized chi-square distributed.
  • If $V_1 \sim {\chi'}_{k_1}^2(\lambda)$ and $V_2 \sim {\chi'}_{k_2}^2(0)$ and $V_1$ is independent of $V_2$, then a noncentral F-distributed variable is developed as $\frac{V_1/k_1}{V_2/k_2} \sim F'_{k_1,k_2}(\lambda)$.
  • If $J \sim \operatorname{Poisson}\!\left(\tfrac{\lambda}{2}\right)$, then $\chi_{k+2J}^2 \sim {\chi'}_k^2(\lambda)$.
  • If $V \sim {\chi'}_2^2(\lambda)$, then $\sqrt{V}$ takes the Rice distribution with parameter $\sqrt{\lambda}$.
  • Normal approximation:[7] if $V \sim {\chi'}_k^2(\lambda)$, then $\frac{V - (k+\lambda)}{\sqrt{2(k+2\lambda)}} \to N(0, 1)$ in distribution as either $k \to \infty$ or $\lambda \to \infty$.
  • If $V_1 \sim {\chi'}_{k_1}^2(\lambda_1)$ and $V_2 \sim {\chi'}_{k_2}^2(\lambda_2)$, where $V_1$ and $V_2$ are independent, then $W = V_1 + V_2 \sim {\chi'}_k^2(\lambda_1 + \lambda_2)$ where $k = k_1 + k_2$.
  • In general, for a finite set of $V_i \sim {\chi'}_{k_i}^2(\lambda_i),\ i \in \{1, \ldots, N\}$, the sum of these non-central chi-square distributed random variables $Y = \sum_{i=1}^N V_i$ has the distribution $Y \sim {\chi'}_{k_y}^2(\lambda_y)$, where $k_y = \sum_{i=1}^N k_i$ and $\lambda_y = \sum_{i=1}^N \lambda_i$. This can be seen using moment generating functions as follows: $M_Y(t) = M_{\sum_i V_i}(t) = \prod_{i=1}^N M_{V_i}(t)$ by the independence of the random variables. It remains to plug in the MGF for the non-central chi-square distributions into the product and compute the new MGF; this is left as an exercise (a numerical check appears in the sketch after this list). Alternatively it can be seen via the interpretation in the background section above as sums of squares of independent normally distributed random variables with variances of 1 and the specified means.
  • The complex noncentral chi-squared distribution has applications in radio communication and radar systems.[citation needed] Let $(z_1, \ldots, z_k)$ be independent scalar complex random variables with noncentral circular symmetry, means of $\mu_i$ and unit variances: $\operatorname{E}\!\left[|z_i - \mu_i|^2\right] = 1$. Then the real random variable $a = \sum_{i=1}^k |z_i|^2$ is distributed according to the complex noncentral chi-square distribution:

$$f(a) = e^{-(a + s^2)} \left(\frac{a}{s^2}\right)^{(k-1)/2} I_{k-1}\!\left(2\sqrt{s^2 a}\right),$$

where $s^2 = \sum_{i=1}^k |\mu_i|^2$.
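
As promised above, the additivity property can be checked by simulation (SciPy assumed; parameter values arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    k1, lam1 = 3, 2.0
    k2, lam2 = 5, 4.5
    n = 200_000

    V1 = stats.ncx2.rvs(df=k1, nc=lam1, size=n, random_state=rng)
    V2 = stats.ncx2.rvs(df=k2, nc=lam2, size=n, random_state=rng)

    # The sum should again be noncentral chi-square with df = k1 + k2, nc = lam1 + lam2
    print(stats.kstest(V1 + V2, stats.ncx2(df=k1 + k2, nc=lam1 + lam2).cdf))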

Transformations

Sankaran (1963) discusses the transformations of the form $z = \left[(X - b)/(k + \lambda)\right]^{1/2}$. He analyzes the expansions of the cumulants of $z$ up to the term $O\!\left((k + \lambda)^{-4}\right)$ and shows that the following choices of $b$ produce reasonable results:

  • $b = (k - 1)/2$ makes the second cumulant of $z$ approximately independent of $\lambda$
  • $b = (k - 1)/3$ makes the third cumulant of $z$ approximately independent of $\lambda$
  • $b = (k - 1)/4$ makes the fourth cumulant of $z$ approximately independent of $\lambda$

Also, a simpler transformation $z_1 = \left(X - (k - 1)/2\right)^{1/2}$ can be used as a variance-stabilizing transformation that produces a random variable with mean $\left(\lambda + (k - 1)/2\right)^{1/2}$ and approximately constant variance.

Usability of these transformations may be hampered by the need to take the square roots of negative numbers.
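
Both the variance-stabilizing effect of the simpler transformation and the negative-argument caveat can be seen in a short simulation (a sketch only; negative arguments are clipped to zero here purely for illustration, and the parameter values are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    k = 5
    b = (k - 1) / 2                              # shift used in the simpler transformation

    for lam in (1.0, 5.0, 20.0, 100.0):
        X = stats.ncx2.rvs(df=k, nc=lam, size=200_000, random_state=rng)
        z1 = np.sqrt(np.clip(X - b, 0.0, None))  # square roots of negative values clipped to 0
        print(lam, np.var(X), np.var(z1))        # Var(X) grows with lam; Var(z1) stays roughly constant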

Various chi and chi-square distributions

Name                                    Statistic
chi-square distribution                 $\sum_{i=1}^k \left(\frac{X_i - \mu_i}{\sigma_i}\right)^2$
noncentral chi-square distribution      $\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2$
chi distribution                        $\sqrt{\sum_{i=1}^k \left(\frac{X_i - \mu_i}{\sigma_i}\right)^2}$
noncentral chi distribution             $\sqrt{\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2}$

where $X_1, \ldots, X_k$ are independent $N(\mu_i, \sigma_i^2)$ random variables.

Occurrences

Use in tolerance intervals

Two-sided normal regression tolerance intervals can be obtained based on the noncentral chi-square distribution.[8] This enables the calculation of a statistical interval within which, with some confidence level, a specified proportion of a sampled population falls.

Notes

  1. ^ Muirhead (2005) Theorem 1.3.4
  2. ^ Nuttall, Albert H. (1975): Some Integrals Involving the Q_M Function, IEEE Transactions on Information Theory, 21(1), 95–96, ISSN 0018-9448
  3. ^ Abdel-Aty, S. (1954). Approximate Formulae for the Percentage Points and the Probability Integral of the Non-Central χ2 Distribution. Biometrika 41, 538–540. doi:10.2307/2332731
  4. ^ Sankaran, M. (1963). Approximations to the non-central chi-squared distribution. Biometrika, 50(1–2), 199–204
  5. ^ Sankaran, M. (1959). "On the non-central chi-squared distribution", Biometrika 46, 235–237
  6. ^ Johnson et al. (1995) Continuous Univariate Distributions Section 29.8
  7. ^ Muirhead (2005) pages 22–24 and problem 1.18.
  8. ^ Derek S. Young (August 2010). "tolerance: An R Package for Estimating Tolerance Intervals". Journal of Statistical Software. 36 (5): 1–39. ISSN 1548-7660. Retrieved 19 February 2013., p.32

References

  • Abramowitz, M. and Stegun, I. A. (1972), Handbook of Mathematical Functions, Dover. Section 26.4.25.
  • Johnson, N. L., Kotz, S., Balakrishnan, N. (1995), Continuous Univariate Distributions, Volume 2 (2nd Edition), Wiley. ISBN 0-471-58494-0
  • Muirhead, R. (2005) Aspects of Multivariate Statistical Theory (2nd Edition). Wiley. ISBN 0-471-76985-1
  • Siegel, A. F. (1979), "The noncentral chi-squared distribution with zero degrees of freedom and testing for uniformity", Biometrika, 66, 381–386
  • Press, S.J. (1966), "Linear combinations of non-central chi-squared variates", The Annals of Mathematical Statistics, 37 (2): 480–487, doi:10.1214/aoms/1177699531, JSTOR 2238621