Delay equalization

From Wikipedia, the free encyclopedia

In signal processing, delay equalization corresponds to adjusting the relative phases of different frequencies to achieve a constant group delay, typically by adding an all-pass filter in series with an uncompensated filter.[1] Methods have also been developed for group-delay equalization of band-pass filters.[2]
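
The series connection described above can be illustrated with a short numerical sketch (not drawn from the cited references): the group delay of a cascade is the sum of the group delays of its stages, and an all-pass section, whose magnitude response is flat, can be tuned to add delay where the uncompensated filter has little, flattening the total. The Python/SciPy fragment below assumes an illustrative fourth-order elliptic low-pass filter and an unoptimized first-order all-pass coefficient; a practical equalizer would use several all-pass sections with optimized pole locations.

```python
import numpy as np
from scipy import signal

# Uncompensated filter: a 4th-order elliptic low-pass whose group delay
# rises sharply toward the passband edge (edge chosen here at 0.3*Nyquist).
b, a = signal.ellip(4, 1, 40, 0.3)
w, gd_filter = signal.group_delay((b, a), w=512)   # w in rad/sample, 0..pi

# First-order all-pass section H(z) = (alpha + z^-1) / (1 + alpha*z^-1).
# Its magnitude is 1 at every frequency; with a negative coefficient its
# group delay peaks at low frequencies, where the elliptic filter's delay
# is smallest.  alpha = -0.5 is illustrative, not an optimized value.
alpha = -0.5
b_ap = np.array([alpha, 1.0])
a_ap = np.array([1.0, alpha])
_, gd_allpass = signal.group_delay((b_ap, a_ap), w=w)

# Series connection: the group delays add, so the all-pass reshapes the
# total delay without changing the magnitude response of the cascade.
gd_total = gd_filter + gd_allpass

passband = w < 0.3 * np.pi
print("delay ripple in passband, uncompensated: %.2f samples"
      % np.ptp(gd_filter[passband]))
print("delay ripple in passband, with all-pass: %.2f samples"
      % np.ptp(gd_total[passband]))
```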

References

  1. ^ Quélhas, M. F.; Petraglia, A. (2003). "Group Delay Equalization of Discrete-Time Filters". pp. 924–927.
  2. ^ Žiška, P.; Laipert, M. (2006). "Band-pass Filter Group Delay Equalization". Proceedings of the 38th Symposium on Systems Theory: 289–293.