Statistical signal processing

Statistical signal processing is an approach to signal processing which treats signals as stochastic processes, utilizing their statistical properties to perform signal processing tasks. Statistical techniques are widely used in signal processing applications. For example, one can model the probability distribution of noise incurred when photographing an image, and construct techniques based on this model to reduce the noise in the resulting image.

Examples of statistical signal models

A classic problem in signal processing is estimating a signal, or its underlying parameters, from noisy observations. This is typically accomplished using either a Bayesian or a frequentist model. We provide a classic example of each.

Noise reduction

Let ${\displaystyle x(t)}$ be an unknown, random, zero-mean, normally-distributed, wide-sense stationary process. We would like to estimate ${\displaystyle x(t)}$ from noisy measurements ${\displaystyle y(t)=x(t)+w(t)}$, where ${\displaystyle w(t)}$ is a noise process, uncorrelated with ${\displaystyle x(t)}$, which is likewise zero-mean, normally-distributed, and wide-sense stationary. Then, the optimal estimator is obtained by passing ${\displaystyle y(t)}$ through a linear filter whose frequency response is given by

${\displaystyle H(\omega )={\frac {S_{xx}(\omega )}{S_{xx}(\omega )+S_{ww}(\omega )}}}$

where ${\displaystyle S_{xx}(\omega )}$ and ${\displaystyle S_{ww}(\omega )}$ are the spectral densities of ${\displaystyle x(t)}$ and ${\displaystyle w(t)}$, respectively.
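This filter is the non-causal Wiener filter. A minimal discrete-time sketch of the idea, using NumPy: here ${\displaystyle x[n]}$ is taken to be an AR(1) process (an assumption made so that its spectral density has a simple closed form), ${\displaystyle w[n]}$ is white noise, and the filter is applied in the frequency domain via the FFT.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Simulate x as an AR(1) process x[t] = a*x[t-1] + e[t] (illustrative choice:
# AR(1) is wide-sense stationary for |a| < 1 and has a known spectral density).
a, sigma_e = 0.9, 1.0
e = rng.normal(0, sigma_e, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + e[t]

# Noisy measurements y = x + w, with w white noise uncorrelated with x.
sigma_w = 1.0
y = x + rng.normal(0, sigma_w, n)

# Theoretical spectral densities at the DFT frequencies:
#   S_xx(w) = sigma_e^2 / |1 - a e^{-jw}|^2   (AR(1) spectrum)
#   S_ww(w) = sigma_w^2                        (white noise is flat)
omega = 2 * np.pi * np.fft.fftfreq(n)
S_xx = sigma_e**2 / np.abs(1 - a * np.exp(-1j * omega)) ** 2
S_ww = np.full(n, sigma_w**2)

# Wiener filter H(w) = S_xx / (S_xx + S_ww), applied in the frequency domain.
H = S_xx / (S_xx + S_ww)
x_hat = np.real(np.fft.ifft(H * np.fft.fft(y)))

mse_noisy = np.mean((y - x) ** 2)
mse_filtered = np.mean((x_hat - x) ** 2)
```

Because ${\displaystyle 0<H(\omega )<1}$, the filter attenuates frequencies where the noise spectrum dominates the signal spectrum; `mse_filtered` should come out well below `mse_noisy`. (The FFT implements circular convolution, so there are small edge effects; this is only a sketch of the frequency-domain formula.)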

Parameter estimation

Many types of signals can be modeled using a small number of unknown parameters. This can be useful, for example, in signal compression (only the parameters need be stored, rather than the entire signal), as well as signal analysis (the parameters may hint at underlying characteristics of the model generating the signals).

As an example, suppose a discrete-time signal ${\displaystyle x[n]}$ is modeled as an autoregressive process,

${\displaystyle x[n]=a_{1}x[n-1]+a_{2}x[n-2]+\cdots +a_{k}x[n-k]+w[n]}$

where ${\displaystyle w[n]}$ is additive white Gaussian noise and ${\displaystyle a_{1},\ldots ,a_{k}}$ are unknown deterministic parameters (the autoregression parameters). Then, from observations of ${\displaystyle x[n]}$, it is of interest to estimate the autoregression parameters. This may be done using the method of least squares, or using the more computationally efficient Yule-Walker equations.
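A minimal sketch of both estimators for a simulated AR(2) process (the order ${\displaystyle k=2}$ and the coefficient values are illustrative assumptions): least squares regresses ${\displaystyle x[n]}$ on its ${\displaystyle k}$ past values, while Yule-Walker solves a Toeplitz system built from the sample autocovariances.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
a_true = np.array([0.6, -0.2])  # stable AR(2) coefficients (assumed values)
k = len(a_true)

# Simulate x[n] = a1*x[n-1] + a2*x[n-2] + w[n] with white Gaussian noise w.
w = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(k, n):
    x[t] = a_true @ x[t - k:t][::-1] + w[t]

# Least squares: regress x[n] on (x[n-1], ..., x[n-k]).
X = np.column_stack([x[k - j:n - j] for j in range(1, k + 1)])
target = x[k:]
a_ls = np.linalg.lstsq(X, target, rcond=None)[0]

# Yule-Walker: sample autocovariances r[m], then solve R a = r[1:],
# where R is the k-by-k Toeplitz matrix with entries r[|i-j|].
r = np.array([x[: n - m] @ x[m:] / n for m in range(k + 1)])
R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
a_yw = np.linalg.solve(R, r[1:])
```

With enough samples both estimates converge to the true coefficients; the Yule-Walker route is cheaper because it needs only ${\displaystyle k+1}$ autocovariances and a small Toeplitz solve rather than a regression over all ${\displaystyle n}$ observations.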