
Bussgang theorem



In mathematics, the Bussgang theorem is a theorem of stochastic analysis. The theorem states that the cross-correlation between a Gaussian signal and the output obtained by passing it through a nonlinear operation is proportional to the signal's autocorrelation. It was first published by Julian J. Bussgang in 1952 while he was at the Massachusetts Institute of Technology.[1]

Statement

Let $\{X(t)\}$ be a zero-mean stationary Gaussian random process and let $\{Y(t)\} = g(X(t))$, where $g(\cdot)$ is a nonlinear amplitude distortion.

If $R_X(\tau)$ is the autocorrelation function of $\{X(t)\}$, then the cross-correlation function of $\{X(t)\}$ and $\{Y(t)\}$ is

$R_{XY}(\tau) = C R_X(\tau),$

where $C$ is a constant that depends only on $g(\cdot)$.

It can be further shown that

$C = \frac{1}{\sigma^3\sqrt{2\pi}} \int_{-\infty}^{\infty} u\, g(u)\, e^{-\frac{u^2}{2\sigma^2}}\, du,$

where $\sigma^2$ is the variance of $X(t)$.
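
The relation can be illustrated numerically. The following sketch is an added example, not part of the original statement; the AR(1) process, the choice $g = \tanh$, and all variable names are illustrative assumptions. It generates a stationary Gaussian process, applies a memoryless nonlinearity, and compares the empirical cross-correlation with $C R_X(\tau)$, where $C$ is evaluated from the integral above.

    # Monte Carlo sketch of Bussgang's theorem: R_xy(tau) ~= C * R_x(tau).
    # Assumptions: x is a stationary Gaussian AR(1) process, g = tanh (illustrative choices).
    import numpy as np
    from scipy.integrate import quad

    rng = np.random.default_rng(0)
    n, a, sigma = 500_000, 0.9, 1.5          # sample size, AR(1) coefficient, std of x
    w = rng.normal(0.0, sigma * np.sqrt(1 - a**2), n)
    x = np.zeros(n)
    for k in range(1, n):                    # zero-mean stationary Gaussian process
        x[k] = a * x[k - 1] + w[k]

    g = np.tanh                              # memoryless nonlinear distortion
    y = g(x)

    # Bussgang constant C = (sigma^3 sqrt(2 pi))^-1 * int u g(u) exp(-u^2/(2 sigma^2)) du
    C, _ = quad(lambda u: u * g(u) * np.exp(-u**2 / (2 * sigma**2)), -np.inf, np.inf)
    C /= sigma**3 * np.sqrt(2 * np.pi)

    for lag in (0, 1, 5):
        Rx = np.mean(x[:n - lag] * x[lag:])    # autocorrelation   R_x(lag)
        Rxy = np.mean(x[:n - lag] * y[lag:])   # cross-correlation R_xy(lag)
        print(lag, round(Rxy, 3), round(C * Rx, 3))   # last two columns should agree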

Derivation for One-bit Quantization

It is a property of the two-dimensional normal distribution that the joint density of $y_1$ and $y_2$ depends only on their covariance and is given explicitly by the expression

$p(y_1, y_2) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left( -\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)} \right),$

where $y_1$ and $y_2$ are standard Gaussian random variables with correlation $\phi_{y_1 y_2} = \rho$.

Assume that $r_2 = Q(y_2)$, where $Q(\cdot)$ is the distortion characteristic; the correlation between $y_1$ and $r_2$ is

$\phi_{y_1 r_2} = E\big(y_1\, Q(y_2)\big) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y_1\, Q(y_2)\, p(y_1, y_2)\, dy_1\, dy_2 .$

Since

$\int_{-\infty}^{\infty} y_1\, p(y_1, y_2)\, dy_1 = \rho\, y_2\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{y_2^2}{2}},$

the correlation may be simplified as

$\phi_{y_1 r_2} = \rho \int_{-\infty}^{\infty} y_2\, Q(y_2)\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{y_2^2}{2}}\, dy_2 .$

The integral above is seen to depend only on the distortion characteristic $Q(\cdot)$ and is independent of $\rho$.

Remembering that $\rho = \phi_{y_1 y_2}$, we observe that for a given distortion characteristic $Q(\cdot)$ the ratio $\frac{\phi_{y_1 r_2}}{\phi_{y_1 y_2}}$ is the constant

$K_Q = \int_{-\infty}^{\infty} y_2\, Q(y_2)\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{y_2^2}{2}}\, dy_2 .$

Therefore, the correlation can be rewritten in the form

$\phi_{y_1 r_2} = K_Q\, \phi_{y_1 y_2} .$

The above equation is the mathematical expression of the stated "Bussgang's theorem".

If $Q(y) = \operatorname{sign}(y)$, which corresponds to one-bit quantization, then

$K_Q = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} y\, e^{-\frac{y^2}{2}}\, dy = \sqrt{\frac{2}{\pi}} .$

[2][3][1][4]
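
As an added illustration (the sampling scheme and variable names below are assumptions, not from the cited sources), the value $K_Q = \sqrt{2/\pi}$ can be checked by Monte Carlo: for standard Gaussian pairs with correlation $\rho$, the sample mean of $y_1\operatorname{sign}(y_2)$ should approach $\sqrt{2/\pi}\,\rho$.

    # Monte Carlo check that E[y1 * sign(y2)] = sqrt(2/pi) * rho (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(1)
    rho, n = 0.6, 1_000_000
    u = rng.standard_normal(n)
    v = rng.standard_normal(n)
    y1 = u
    y2 = rho * u + np.sqrt(1 - rho**2) * v   # standard Gaussian pair with correlation rho

    print(np.mean(y1 * np.sign(y2)), np.sqrt(2 / np.pi) * rho)   # both approximately 0.479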

Arcsine law

If the two random variables are both distorted, i.e., $r_1 = Q(y_1)$ and $r_2 = Q(y_2)$, the correlation of $r_1$ and $r_2$ is

$\phi_{r_1 r_2} = E\big(Q(y_1)\, Q(y_2)\big) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} Q(y_1)\, Q(y_2)\, p(y_1, y_2)\, dy_1\, dy_2 .$

When $Q(y) = \operatorname{sign}(y)$, the expression becomes

$\phi_{r_1 r_2} = \frac{1}{2\pi\sqrt{1-\rho^2}} \left[ \int_0^{\infty}\!\!\int_0^{\infty} + \int_{-\infty}^{0}\!\!\int_{-\infty}^{0} - \int_{-\infty}^{0}\!\!\int_0^{\infty} - \int_0^{\infty}\!\!\int_{-\infty}^{0} \right] e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2,$

where $A = y_1^2 + y_2^2 - 2\rho y_1 y_2$.

Noticing that

$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} p(y_1, y_2)\, dy_1\, dy_2 = \frac{1}{2\pi\sqrt{1-\rho^2}} \left[ \int_0^{\infty}\!\!\int_0^{\infty} + \int_{-\infty}^{0}\!\!\int_{-\infty}^{0} + \int_{-\infty}^{0}\!\!\int_0^{\infty} + \int_0^{\infty}\!\!\int_{-\infty}^{0} \right] e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2 = 1,$

and that, by symmetry, $\int_0^{\infty}\!\!\int_0^{\infty} e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2 = \int_{-\infty}^{0}\!\!\int_{-\infty}^{0} e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2$ and $\int_{-\infty}^{0}\!\!\int_0^{\infty} e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2 = \int_0^{\infty}\!\!\int_{-\infty}^{0} e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2$,

we can simplify the expression of $\phi_{r_1 r_2}$ as

$\phi_{r_1 r_2} = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_0^{\infty}\int_0^{\infty} e^{-\frac{A}{2(1-\rho^2)}}\, dy_1\, dy_2 - 1 .$

Also, it is convenient to introduce the polar coordinates $y_1 = R\cos\theta$, $y_2 = R\sin\theta$. It is thus found that

$\phi_{r_1 r_2} = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_0^{\pi/2}\!\!\int_0^{\infty} e^{-\frac{R^2 - 2\rho R^2 \cos\theta\sin\theta}{2(1-\rho^2)}}\, R\, dR\, d\theta - 1 = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_0^{\pi/2}\!\!\int_0^{\infty} e^{-\frac{R^2 (1 - \rho\sin 2\theta)}{2(1-\rho^2)}}\, R\, dR\, d\theta - 1 .$

Integration gives

$\phi_{r_1 r_2} = \frac{2\sqrt{1-\rho^2}}{\pi} \int_0^{\pi/2} \frac{d\theta}{1 - \rho\sin 2\theta} - 1 = \frac{2}{\pi}\arcsin(\rho) .$

This is called the "arcsine law", which was first found by J. H. Van Vleck in 1943 and republished in 1966.[2][3] The arcsine law can also be proved in a simpler way by applying Price's theorem.[4][5]

The function $\frac{2}{\pi}\arcsin(\rho)$ can be approximated as $\frac{2}{\pi}\rho$ when $\rho$ is small.
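
The arcsine law is easy to verify by simulation; the following sketch is an added example with illustrative parameter choices, comparing the sample correlation of $\operatorname{sign}(y_1)$ and $\operatorname{sign}(y_2)$ with $\frac{2}{\pi}\arcsin(\rho)$.

    # Monte Carlo check of the arcsine law E[sign(y1) sign(y2)] = (2/pi) arcsin(rho).
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000
    for rho in (0.1, 0.5, 0.9):
        u = rng.standard_normal(n)
        v = rng.standard_normal(n)
        y1, y2 = u, rho * u + np.sqrt(1 - rho**2) * v   # correlated standard Gaussian pair
        print(rho, np.mean(np.sign(y1) * np.sign(y2)), 2 / np.pi * np.arcsin(rho))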

Price's Theorem

Given two jointly normal random variables $y_1$ and $y_2$ with joint probability density function

$p(y_1, y_2) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left( -\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)} \right),$

we form the mean

$I(\rho) = E\big(g(y_1, y_2)\big) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2)\, p(y_1, y_2)\, dy_1\, dy_2$

of some function $g(y_1, y_2)$ of $(y_1, y_2)$. If $g(y_1, y_2)\, p(y_1, y_2) \to 0$ as $(y_1, y_2) \to \infty$, then

$\frac{\partial^n I(\rho)}{\partial \rho^n} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, p(y_1, y_2)\, dy_1\, dy_2 = E\left( \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n} \right) .$

Proof. The joint characteristic function of the random variables $y_1$ and $y_2$ is by definition the integral

$\Phi(\omega_1, \omega_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} p(y_1, y_2)\, e^{j(\omega_1 y_1 + \omega_2 y_2)}\, dy_1\, dy_2 = \exp\left( -\frac{\omega_1^2 + \omega_2^2 + 2\rho\,\omega_1\omega_2}{2} \right) .$

From the two-dimensional inversion formula of the Fourier transform, it follows that

$p(y_1, y_2) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Phi(\omega_1, \omega_2)\, e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1\, d\omega_2 = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \exp\left( -\frac{\omega_1^2 + \omega_2^2 + 2\rho\,\omega_1\omega_2}{2} \right) e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1\, d\omega_2 .$

Therefore, plugging the expression of $p(y_1, y_2)$ into $I(\rho)$ and differentiating with respect to $\rho$, we obtain

$\frac{\partial^n I(\rho)}{\partial \rho^n} = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2)\, (-\omega_1\omega_2)^n\, \exp\left( -\frac{\omega_1^2 + \omega_2^2 + 2\rho\,\omega_1\omega_2}{2} \right) e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1\, d\omega_2\, dy_1\, dy_2 .$

After repeated integration by parts, using the identity $(-\omega_1\omega_2)^n\, e^{-j(\omega_1 y_1 + \omega_2 y_2)} = \frac{\partial^{2n}}{\partial y_1^n\, \partial y_2^n} e^{-j(\omega_1 y_1 + \omega_2 y_2)}$ and the condition that $g(y_1, y_2)\, p(y_1, y_2) \to 0$ at $\infty$, we obtain Price's theorem.

[4][5]
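
A simple sanity check of the theorem for $n = 1$ (an added example; the choice $g(y_1, y_2) = y_1^3 y_2$ and the finite-difference scheme are assumptions made for convenience): here $\frac{\partial^2 g}{\partial y_1\, \partial y_2} = 3y_1^2$ with mean 3, so Price's theorem predicts $\partial I/\partial\rho = 3$.

    # Numerical check of Price's theorem with n = 1 and g(y1, y2) = y1**3 * y2.
    # Prediction: dI/drho = E[d^2 g / (dy1 dy2)] = E[3*y1**2] = 3.
    import numpy as np

    rng = np.random.default_rng(3)
    m = 2_000_000
    u = rng.standard_normal(m)
    v = rng.standard_normal(m)

    def I(rho):
        # Monte Carlo estimate of I(rho) = E[g(y1, y2)], reusing u and v across rho values
        y1 = u
        y2 = rho * u + np.sqrt(1 - rho**2) * v
        return np.mean(y1**3 * y2)

    rho, h = 0.4, 1e-3
    dI = (I(rho + h) - I(rho - h)) / (2 * h)   # central finite difference in rho
    rhs = np.mean(3 * u**2)                    # sample estimate of E[3*y1**2]
    print(dI, rhs)                             # both approximately 3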

Proof of Arcsine law by Price's Theorem

If $g(y_1, y_2) = \operatorname{sign}(y_1)\operatorname{sign}(y_2)$, then $\frac{\partial^2 g(y_1, y_2)}{\partial y_1\, \partial y_2} = 4\,\delta(y_1)\,\delta(y_2)$, where $\delta(\cdot)$ is the Dirac delta function.

Substituting into Price's theorem, we obtain

$\frac{\partial E\big(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\big)}{\partial \rho} = \frac{\partial I(\rho)}{\partial \rho} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} 4\,\delta(y_1)\,\delta(y_2)\, p(y_1, y_2)\, dy_1\, dy_2 = \frac{2}{\pi\sqrt{1-\rho^2}} .$

When $\rho = 0$, $I(\rho) = E\big(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\big) = 0$. Thus

$I(\rho) = E\big(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\big) = \int_0^{\rho} \frac{2}{\pi\sqrt{1-x^2}}\, dx = \frac{2}{\pi}\arcsin(\rho),$

which is Van Vleck's well-known result of the arcsine law.

[2][3]
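
The final integration step can also be reproduced symbolically; the snippet below is an added check (assuming SymPy is available) that the antiderivative of $\frac{2}{\pi\sqrt{1-x^2}}$ is $\frac{2}{\pi}\arcsin(x)$.

    # Symbolic check of the integration step in the proof above.
    import sympy as sp

    x = sp.symbols('x')
    antiderivative = sp.integrate(2 / (sp.pi * sp.sqrt(1 - x**2)), x)
    print(antiderivative)   # 2*asin(x)/pi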

Application

This theorem implies that a simplified correlator can be designed. Instead of having to multiply two signals, the cross-correlation problem reduces to the gating of one signal by the sign (polarity) of the other: because the cross-correlation survives one-bit quantization up to the known factor $K_Q = \sqrt{2/\pi}$, one input of the multiplier can be replaced by a hard limiter.[citation needed]
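
A sketch of such a simplified ("polarity") correlator follows; the signal model, variable names, and scale-factor recovery are illustrative assumptions rather than a prescribed design. One multiplier input is replaced by a hard limiter, and the full cross-correlation is recovered up to the known factor $\sqrt{2/\pi}$.

    # Illustrative one-bit (polarity) correlator: gate x by the sign of y instead of multiplying.
    # For zero-mean Gaussian inputs, E[x * sign(y)] = sqrt(2/pi) * E[x*y] / std(y).
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    s = rng.standard_normal(n)               # common component of the two signals
    x = s + 0.5 * rng.standard_normal(n)
    y = s + 0.5 * rng.standard_normal(n)

    full = np.mean(x * y)                    # conventional correlator (multiplication)
    onebit = np.mean(x * np.sign(y))         # simplified correlator (gating by polarity of y)
    recovered = onebit * np.std(y) / np.sqrt(2 / np.pi)
    print(full, recovered)                   # both approximately 1.0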

References

  1. ^ a b Bussgang, J. J. (March 1952). "Cross-correlation function of amplitude-distorted Gaussian signals". Res. Lab. Elec., Mass. Inst. Technol., Cambridge, MA, Tech. Rep. 216.
  2. ^ a b c Van Vleck, J. H. (1943). "The Spectrum of Clipped Noise". Radio Research Laboratory Report No. 51. Harvard University.
  3. ^ a b c Van Vleck, J. H.; Middleton, D. (January 1966). "The spectrum of clipped noise". Proceedings of the IEEE. 54 (1): 2–19. doi:10.1109/PROC.1966.4567. ISSN 1558-2256.
  4. ^ a b c Price, R. (June 1958). "A useful theorem for nonlinear devices having Gaussian inputs". IRE Transactions on Information Theory. 4 (2): 69–72. doi:10.1109/TIT.1958.1057444. ISSN 2168-2712.
  5. ^ a b Papoulis, Athanasios (2002). Probability, Random Variables, and Stochastic Processes. McGraw-Hill. p. 396. ISBN 0-07-366011-6.

Further reading