Hilbert spectral analysis

From Wikipedia, the free encyclopedia

Hilbert spectral analysis is a signal analysis method that applies the Hilbert transform to compute the instantaneous frequency of a signal. If \theta(t) is the instantaneous phase of the analytic signal obtained from the Hilbert transform, the instantaneous frequency is

\omega = \frac{d\theta(t)}{dt}.
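As a minimal sketch of this definition, the analytic signal, instantaneous phase, and instantaneous frequency can be computed numerically with SciPy's `hilbert` routine (the test signal, sampling rate, and use of SciPy here are illustrative assumptions, not part of the method's definition):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                # sampling rate in Hz (arbitrary choice)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.0 * t)           # toy signal: a pure 50 Hz tone

analytic = hilbert(x)                      # analytic signal x(t) + i * H[x](t)
theta = np.unwrap(np.angle(analytic))      # instantaneous phase theta(t)
inst_freq = np.diff(theta) * fs / (2 * np.pi)  # d(theta)/dt, converted to Hz
```

For this tone, `inst_freq` stays close to 50 Hz away from the edges of the record; finite-length edge effects at the start and end of the signal are a well-known limitation of the discrete Hilbert transform.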

After performing the Hilbert transform on each component signal, the data can be expressed in the following form:

X(t)=\sum_{j=1}^{n}a_j(t)\exp\left(i\int\omega_j(t)\,dt\right).

This equation gives both the amplitude and the frequency of each component as functions of time. It also enables us to represent the amplitude and the instantaneous frequency as functions of time in a three-dimensional plot, in which the amplitude can be contoured on the frequency–time plane. This frequency–time distribution of the amplitude is designated the Hilbert amplitude spectrum, or simply the Hilbert spectrum.
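The construction of this frequency–time amplitude distribution can be illustrated for a single component: bin the instantaneous frequency at each time sample and place the instantaneous amplitude in the corresponding cell of the frequency–time plane. The chirp signal, sampling rate, and bin edges below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert

fs = 500.0                                  # sampling rate in Hz (arbitrary choice)
t = np.arange(0, 2.0, 1.0 / fs)
phase = 2 * np.pi * (20 * t + 10 * t**2)    # toy "mode": chirp sweeping 20 -> 60 Hz
x = np.cos(phase)

z = hilbert(x)                              # analytic signal
amp = np.abs(z)                             # instantaneous amplitude a(t)
freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)  # instantaneous frequency in Hz

# Hilbert spectrum H(f, t): amplitude contoured on the frequency-time plane
fbins = np.linspace(0, 100, 51)             # 50 frequency bins of 2 Hz each
H = np.zeros((len(fbins) - 1, len(freq)))
idx = np.digitize(freq, fbins) - 1          # frequency bin for each time sample
valid = (idx >= 0) & (idx < len(fbins) - 1)
H[idx[valid], np.nonzero(valid)[0]] = amp[:-1][valid]
```

Each column of `H` is one time instant; for this chirp the energetic ridge climbs from the 20 Hz bins toward the 60 Hz bins as time advances, which is exactly the frequency–time picture the Hilbert spectrum is meant to convey. For multicomponent data, the amplitudes of all components would be accumulated into the same plane.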

The Hilbert spectral analysis method is an important part of the Hilbert–Huang transform.


References

  • Alan V. Oppenheim and Ronald W. Schafer, Discrete-Time Signal Processing, Prentice-Hall Signal Processing Series, 2nd ed., 1999.
  • Huang, N. E., et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proc. R. Soc. Lond. A, vol. 454, pp. 903–995, 1998.