Statistical signal processing

Statistical signal processing is an area of applied mathematics and signal processing that treats signals as stochastic processes, dealing with their statistical properties (e.g., mean, covariance). Because of its very broad range of applications, statistical signal processing is taught at the graduate level in electrical engineering, applied mathematics, pure mathematics/statistics, and even biomedical engineering and physics departments around the world, although important applications exist in almost all scientific fields.

Basic signal model

In many applications, a signal is modeled as a function consisting of both a deterministic and a stochastic component. A simple example, and also a common model of many statistical systems, is a signal x(t) that consists of a deterministic part s(t) added to noise w(t), which in many situations can be modeled as white Gaussian noise:

    x(t) = s(t) + w(t)

where w(t) ~ N(0, σ²).

White noise simply means that the noise process is completely uncorrelated. As a result, its autocorrelation function is an impulse:

    R_w(τ) = E[w(t) w(t + τ)] = σ² δ(τ)

where δ(τ) is the Dirac delta function.
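
As a brief illustration (not part of the original article), the following Python sketch simulates this model with an assumed 5 Hz sinusoid as the deterministic part and estimates the sample autocorrelation of the noise, which for white noise should be approximately σ² at lag zero and near zero at all other lags:

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed demo parameters (not from the article): a 5 Hz sinusoid
    # sampled at 1 kHz for 2 s, with noise variance sigma^2 = 0.25.
    fs = 1000.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    sigma = 0.5

    s = np.sin(2 * np.pi * 5.0 * t)      # deterministic component s(t)
    w = rng.normal(0.0, sigma, t.size)   # white Gaussian noise w(t) ~ N(0, sigma^2)
    x = s + w                            # observed signal x(t) = s(t) + w(t)

    def autocorr(v, max_lag):
        """Biased sample autocorrelation R_v(k) for lags 0..max_lag."""
        v = v - v.mean()
        return np.array([np.mean(v[: v.size - k] * v[k:]) for k in range(max_lag + 1)])

    r = autocorr(w, 5)
    print("lag 0 (expect ~ sigma^2 = 0.25):", round(r[0], 3))
    print("lags 1-5 (expect ~ 0):", np.round(r[1:], 3))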

Given information about a statistical system and the random variable from which it is derived, we can increase our knowledge of the output signal; conversely, given the statistical properties of the output signal, we can infer the properties of the underlying random variable. These statistical techniques are developed in the fields of estimation theory, detection theory, and numerous related fields that rely on statistical information to maximize their efficiency.
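
To make the detection-theory side concrete, here is a minimal, assumed sketch (the signal shape, noise level, and threshold are illustrative choices, not from the article) of deciding whether a known deterministic signal is present in white Gaussian noise by correlating the observation with the signal and comparing against a threshold:

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed setup: decide between H0 (noise only) and H1 (known signal
    # s plus noise) from the correlation statistic T = <x, s>.
    n = 500
    s = np.sin(2 * np.pi * np.arange(n) / 50.0)   # known signal
    sigma = 2.0                                   # noise standard deviation

    def signal_present(x, s, sigma):
        # Under H0, T is Gaussian with mean 0 and std sigma * ||s||,
        # so a 3-sigma threshold gives a ~0.13% false-alarm probability.
        T = np.dot(x, s)
        return T > 3.0 * sigma * np.linalg.norm(s)

    x0 = rng.normal(0.0, sigma, n)        # H0: noise only
    x1 = s + rng.normal(0.0, sigma, n)    # H1: signal plus noise
    print("noise only   -> signal detected:", signal_present(x0, s, sigma))
    print("signal+noise -> signal detected:", signal_present(x1, s, sigma))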

Example

The Computation of Average Transients (CAT) is used routinely in FT-NMR (Fourier-transform nuclear magnetic resonance) spectroscopy to improve the signal-to-noise ratio of NMR spectra. The signal is measured repeatedly n times and then averaged.

Assuming that the noise is white and that its variance is constant in time, it follows by error propagation that averaging n measurements reduces the standard deviation of the noise by a factor of √n:

    σ_avg = σ / √n

so the signal-to-noise ratio grows as √n. Thus, if 10,000 measurements are averaged, the signal-to-noise ratio is increased by a factor of 100 (√10,000), enabling the measurement of 13C NMR spectra at the natural abundance (1.1%) of 13C.
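
The √n improvement is easy to verify numerically. The following sketch uses assumed values (a constant signal of amplitude 1 and unit-variance noise rather than real NMR data) and shows the noise standard deviation of the average shrinking as σ/√n:

    import numpy as np

    rng = np.random.default_rng(2)

    # Assumed demo values: a constant "signal" of amplitude 1.0 observed
    # with additive white Gaussian noise of standard deviation 1.0.
    signal = 1.0
    sigma = 1.0
    points = 256   # samples per measurement

    for n in (1, 100, 10000):
        # n repeated measurements of the same signal, each with fresh noise
        scans = signal + rng.normal(0.0, sigma, (n, points))
        averaged = scans.mean(axis=0)

        noise_std = averaged.std()   # should be close to sigma / sqrt(n)
        print(f"n = {n:>5}:  noise std ~ {noise_std:.4f}  "
              f"(theory {sigma / np.sqrt(n):.4f}),  SNR ~ {signal / noise_std:.1f}")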

Further reading

  • Scharf, Louis L. (1991). Statistical Signal Processing: Detection, Estimation, and Time Series Analysis. Boston: Addison-Wesley. ISBN 0-201-19038-9. OCLC 61160161.
  • Stoica, Petre; Moses, Randolph (2005). Spectral Analysis of Signals (PDF). Upper Saddle River, NJ: Prentice Hall. (Chinese edition, 2007.)
  • Kay, Steven M. (1993). Fundamentals of Statistical Signal Processing. Upper Saddle River, New Jersey: Prentice Hall. ISBN 0-13-345711-7. OCLC 26504848.
  • Wong, Kainam Thomas. Statistical Signal Processing lecture notes, University of Waterloo, Canada.
  • Sayed, Ali H. (2008). Adaptive Filters. NJ: Wiley. ISBN 978-0-470-25388-5.
  • Kailath, Thomas; Sayed, Ali H.; Hassibi, Babak (2000). Linear Estimation. NJ: Prentice Hall. ISBN 978-0-13-022464-4.