Blackman–Tukey transformation

From Wikipedia, the free encyclopedia

The Blackman–Tukey transformation (or Blackman–Tukey method) is a digital signal processing method to transform data from the time domain to the frequency domain. It was originally programmed around 1953 by James Cooley for John Tukey at John von Neumann's Institute for Advanced Study as a way to get "good smoothed statistical estimates of power spectra without requiring large Fourier transforms."[1] It was published by Ralph Beebe Blackman and John Tukey in 1958.

Background

Transformation

In signal processing, transformation from the time domain to another domain, such as the frequency domain, is used to focus on the details of a waveform; many of those details can be analyzed much more easily in another domain than in the original. Several methods exist for transforming data from the time domain to the frequency domain; the most prominent is the Fourier transform, which the Blackman–Tukey method uses. Before the advent of fast computers and the 1965 rediscovery of the fast Fourier transform (FFT), the large number of computations required for the discrete Fourier transform motivated researchers to reduce the number of calculations, resulting in the (now obsolete) Blackman–Tukey method, which is based on the Wiener–Khinchin theorem.[2]
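The Wiener–Khinchin relation underlying the method can be checked numerically: for a length-N sequence, the DFT of the circular autocorrelation equals the periodogram |X[k]|²/N. A minimal sketch in Python with NumPy (the test signal and scaling convention here are illustrative, not from the original sources):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
x = rng.standard_normal(N)  # real-valued test signal

# Periodogram: squared magnitude of the DFT, scaled by 1/N
periodogram = np.abs(np.fft.fft(x)) ** 2 / N

# Circular autocorrelation r[m] = (1/N) * sum_n x[n] * x[(n+m) mod N]
r = np.array([np.dot(x, np.roll(x, -m)) for m in range(N)]) / N

# Wiener-Khinchin: the DFT of the autocorrelation equals the periodogram
psd_from_autocorr = np.real(np.fft.fft(r))

assert np.allclose(periodogram, psd_from_autocorr)
```

This identity is what lets the power spectrum be estimated from the autocorrelation function rather than from a full Fourier transform of the raw data.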

Statistical estimation

Statistical estimation is used to determine the expected values of statistical quantities. The expected values are those values one expects among random values, estimated from samples drawn from a population. In time series analysis, discrete data obtained as a function of time are usually the only data available, rather than samples of a population taken simultaneously.

This difficulty is commonly avoided by assuming the process is ergodic, so that time averages over a single record can stand in for averages over an ensemble of realizations, even though the record is random and not periodic over all portions of time.

Blackman–Tukey transformation method

The method is fully described in Blackman and Tukey's 1958 journal articles, republished as their 1959 book "The measurement of power spectra, from the point of view of communications engineering",[3] and consists of the following steps:

  1. Calculate the autocorrelation function of the data
  2. Apply a suitable window function to the autocorrelation, and finally
  3. Compute a discrete Fourier transform (now done with an FFT) of the windowed autocorrelation to obtain the power density spectrum
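The three steps above can be sketched in Python with NumPy. The function name, the choice of a Hann lag window, and the parameter defaults below are illustrative assumptions, not prescriptions from Blackman and Tukey's text:

```python
import numpy as np

def blackman_tukey_psd(x, max_lag):
    """Sketch of the Blackman-Tukey estimator: autocorrelate,
    window the autocorrelation, then Fourier transform."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # Step 1: biased autocorrelation estimate for lags 0..max_lag
    r = np.array([np.dot(x[:N - m], x[m:]) / N for m in range(max_lag + 1)])
    # Step 2: apply a lag window (a Hann window here; Blackman and
    # Tukey discuss several choices) over lags -max_lag..max_lag
    lags = np.concatenate([r[::-1], r[1:]])   # symmetric about lag 0
    windowed = lags * np.hanning(2 * max_lag + 1)
    # Step 3: DFT of the windowed autocorrelation gives the PSD estimate;
    # rotate so lag 0 sits at index 0, making the transform real-valued
    return np.real(np.fft.fft(np.fft.ifftshift(windowed)))

# Example: unit-variance white noise should give a roughly flat
# spectrum with average level near 1
rng = np.random.default_rng(1)
psd = blackman_tukey_psd(rng.standard_normal(4096), max_lag=64)
```

The truncation lag `max_lag` trades bias against variance: a short lag smooths heavily (low variance, high bias), while a long lag resolves finer spectral detail at the cost of a noisier estimate.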

Autocorrelation smooths the estimate in place of averaging several waveforms, and the lag window tapers the autocorrelation function toward zero at its extremes. When more data must be correlated and the system has sufficient memory, the overlap-save sectioning technique can be applied to speed up the computation. If the autocorrelation function in the Blackman–Tukey method is itself computed with an FFT, the result is called the fast correlation method of spectral estimation.
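The fast correlation variant mentioned above computes the autocorrelation itself with an FFT. A minimal sketch, assuming the standard zero-padding trick (pad to at least twice the signal length so circular correlation matches linear correlation):

```python
import numpy as np

def autocorr_fft(x):
    """Autocorrelation via FFT ("fast correlation"): zero-pad to 2N
    so the circular correlation equals the linear correlation."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    X = np.fft.fft(x, 2 * N)                      # padded transform
    return np.fft.ifft(X * np.conj(X)).real[:N] / N  # biased estimate

# Direct lag-by-lag computation for comparison
x = np.arange(8, dtype=float)
direct = np.array([np.dot(x[:8 - m], x[m:]) / 8 for m in range(8)])
assert np.allclose(autocorr_fft(x), direct)
```

The FFT route costs O(N log N) rather than the O(N·M) of computing M lags directly, which is why it displaced the direct computation once the FFT became widely available.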

References

  1. ^ Cooley, James. "The Re-Discovery of the Fast Fourier Transform Algorithm" (PDF). web.cs.dal.ca. Archived from the original (PDF) on 2012-12-24. However, we had a previous collaboration in 1953 when Tukey was a consultant at John Von Neuman's computer project at the Institute for Advanced Study in Princeton, New Jersey, where I was a programmer. I programmed for him what later became the very popular Blackman-Tukey method of spectral analysis [5]. The important feature of this method was that it gave good smoothed statistical estimates of power spectra without requiring large Fourier transforms. Thus, our two collaborations were first on a method for avoiding large Fourier transforms since they were so costly and then a method for reducing the cost of the Fourier transforms.
  2. ^ Wunsch, Carl (Spring 2005). "Lecture notes on process of using The Blackman-Tukey method to solve a problem". MIT OpenCourseWare. Retrieved 2022-04-11. Prior to the advent of the FFT and fast computers, power density spectral estimation was almost never done as described in the last section. Rather the onerous computational load led scientists, as far as possible, to reduce the number of calculations required. The so-called Blackman-Tukey method... ... The Blackman-Tukey estimate is based upon ... and the choice of suitable window weights...A large literature grew up devoted to the window choice. Again, one trades bias against variance through the value M, which one prefers greatly to minimize. The method is now obsolete because the ability to generate the Fourier coefficients directly permits much greater control over the result. The bias discussion of the Blackman-Tukey method is particularly tricky, as is the determination of ν. Use of the method should be avoided except under those exceptional circumstances when for some reason only R~(τ) is known.
  3. ^ Blackman, R. B.; Tukey, J. W. (1958). The Measurement of Power Spectra, from the point of view of Communications Engineering (new Dover 1959 edition, an unabridged and corrected republication of Part I and Part II of "The Measurement of Power Spectra from the Point of View of Communications Engineering" which originally appeared in the January 1958 and March 1958 issues of Volume XXXVII of the Bell System Technical Journal. ed.). American Telephone and Telegraph Company. Retrieved 2022-04-11.