In mathematics, the Dirichlet conditions are sufficient conditions for a real-valued, periodic function f to be equal to the sum of its Fourier series at each point where f is continuous. Moreover, the behavior of the Fourier series at points of discontinuity is determined as well: there the series converges to the midpoint of the left and right limits of f. These conditions are named after Peter Gustav Lejeune Dirichlet.
The conditions are:
- f must be absolutely integrable over a period.
- f must have a finite number of extrema in any given bounded interval, i.e. there must be a finite number of maxima and minima in the interval.
- f must have a finite number of discontinuities in any given bounded interval, and each of these discontinuities must be finite (the function must not blow up to infinity there).
These three conditions are satisfied if f is a function of bounded variation over a period.
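To see why the condition on extrema has teeth, consider the classic counterexample sin(1/x) (an illustration chosen here, not taken from the article): it oscillates ever faster as x → 0⁺, so it has infinitely many extrema in any interval (0, b] and fails the second condition. A small numerical sketch counts sign changes on [a, 1] as a proxy, since between any two consecutive zeros there is at least one extremum:

```python
import numpy as np

def sign_changes(a, samples=2_000_000):
    """Count sign changes of sin(1/x) on the interval [a, 1]."""
    x = np.linspace(a, 1.0, samples)
    s = np.sign(np.sin(1.0 / x))
    s = s[s != 0]                      # drop any exact zeros on the grid
    return int(np.count_nonzero(np.diff(s)))

# The count grows without bound as a -> 0, so sin(1/x) has infinitely
# many extrema near 0 and violates the second Dirichlet condition.
for a in (0.1, 0.01, 0.001):
    print(a, sign_changes(a))
```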
Dirichlet's theorem for 1-dimensional Fourier series
We state Dirichlet's theorem assuming f is a periodic function of period 2π with Fourier series expansion

f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty} \left[ a_n \cos(nx) + b_n \sin(nx) \right],

where

a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos(nx)\, dx, \qquad b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin(nx)\, dx.
The analogous statement holds irrespective of what the period of f is, or which version of the Fourier expansion is chosen (see Fourier series).
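The coefficient formulas above can be checked numerically. As an illustration (the example function is an assumption, not part of the article), take the square wave f(x) = sign(sin x), whose well-known Fourier series is (4/π) Σ sin(nx)/n over odd n, so a_n = 0 and b_n = 4/(nπ) for odd n:

```python
import numpy as np

# Midpoint rule on one period [-pi, pi]; the midpoints avoid landing
# exactly on the square wave's jumps at 0 and +/-pi.
N = 200_000
dx = 2 * np.pi / N
x = -np.pi + (np.arange(N) + 0.5) * dx

def f(x):
    return np.sign(np.sin(x))

def a(n):
    """Cosine coefficient a_n = (1/pi) * integral of f(x) cos(nx) dx."""
    return np.sum(f(x) * np.cos(n * x)) * dx / np.pi

def b(n):
    """Sine coefficient b_n = (1/pi) * integral of f(x) sin(nx) dx."""
    return np.sum(f(x) * np.sin(n * x)) * dx / np.pi

print(b(1), 4 / np.pi)   # b_1 agrees with the known value 4/pi
print(a(1), b(2))        # both essentially zero for this odd square wave
```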
- Dirichlet's theorem: If f satisfies Dirichlet conditions, then for all x, we have that the series obtained by plugging x into the Fourier series is convergent, and is given by
- \frac{a_0}{2} + \sum_{n=1}^{\infty} \left[ a_n \cos(nx) + b_n \sin(nx) \right] = \frac{f(x^+) + f(x^-)}{2},
- where the notation
- f(x^{\pm}) = \lim_{t \to x^{\pm}} f(t)
- denotes the right/left limits of f.
A function satisfying Dirichlet's conditions must have right and left limits at each point of discontinuity, or else the function would need to oscillate at that point, violating the condition on maxima/minima. Note that at any point where f is continuous,

\frac{f(x^+) + f(x^-)}{2} = f(x).
Thus Dirichlet's theorem says in particular that under the Dirichlet conditions the Fourier series for f converges and is equal to f wherever f is continuous.
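The midpoint behavior at a jump can be seen concretely. As a hypothetical illustration (not taken from the article), take the 2π-periodic step with f(x) = 0 on (−π, 0) and f(x) = 1 on (0, π); its Fourier series works out to 1/2 + Σ (2/(nπ)) sin(nx) over odd n. At the jump x = 0 every sine term vanishes, so the series converges to the midpoint 1/2, while at a point of continuity it converges to f(x):

```python
import numpy as np

def partial_sum(x, N):
    """Partial Fourier sum of the unit step, using odd harmonics up to N."""
    n = np.arange(1, N + 1, 2)
    return 0.5 + np.sum((2 / (np.pi * n)) * np.sin(n * x))

# At the discontinuity x = 0, every partial sum already sits at the
# midpoint (f(0+) + f(0-)) / 2 = (1 + 0) / 2 = 1/2.
print(partial_sum(0.0, 999))         # -> 0.5

# At a point of continuity, e.g. x = pi/2, the sums approach f(pi/2) = 1.
print(partial_sum(np.pi / 2, 9999))
```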