Fluctuation-dissipation theorem

From Wikipedia, the free encyclopedia

The fluctuation–dissipation theorem (FDT) or fluctuation–dissipation relation (FDR) is a powerful tool in statistical physics for predicting the behavior of systems that obey detailed balance. Given that a system obeys detailed balance, the theorem is a general proof that thermal fluctuations in a physical variable predict the response quantified by the admittance or impedance of that same variable (such as voltage or temperature difference), and vice versa. The fluctuation–dissipation theorem applies to both classical and quantum mechanical systems.

The fluctuation–dissipation theorem relies on the assumption that the response of a system in thermodynamic equilibrium to a small applied force is the same as its response to a spontaneous fluctuation. Therefore, the theorem connects the linear response relaxation of a system from a prepared non-equilibrium state to its statistical fluctuation properties in equilibrium.[1] Often the linear response takes the form of one or more exponential decays.

The fluctuation–dissipation theorem was originally formulated by Harry Nyquist in 1928,[2] and later proven by Herbert Callen and Theodore A. Welton in 1951.[3]

Qualitative overview and examples

The fluctuation–dissipation theorem says that when there is a process that dissipates energy, turning it into heat (e.g., friction), there is a reverse process related to thermal fluctuations. This is best understood by considering some examples:

If an object is moving through a fluid, it experiences drag (air resistance or fluid resistance). Drag dissipates kinetic energy, turning it into heat. The corresponding fluctuation is Brownian motion. An object in a fluid does not sit still, but rather moves around with a small and rapidly-changing velocity, as molecules in the fluid bump into it. Brownian motion converts heat energy into kinetic energy—the reverse of drag.
If electric current is running through a wire loop with a resistor in it, the current will rapidly go to zero because of the resistance. Resistance dissipates electrical energy, turning it into heat (Joule heating). The corresponding fluctuation is Johnson noise. A wire loop with a resistor in it does not actually have zero current, it has a small and rapidly-fluctuating current caused by the thermal fluctuations of the electrons and atoms in the resistor. Johnson noise converts heat energy into electrical energy—the reverse of resistance.
When light impinges on an object, some fraction of the light is absorbed, making the object hotter. In this way, light absorption turns light energy into heat. The corresponding fluctuation is thermal radiation (e.g., the glow of a "red hot" object). Thermal radiation turns heat energy into light energy—the reverse of light absorption. Indeed, Kirchhoff's law of thermal radiation confirms that the more effectively an object absorbs light, the more thermal radiation it emits.

Examples in detail

The fluctuation–dissipation theorem is a general result of statistical thermodynamics that quantifies the relation between the fluctuations in a system at thermal equilibrium and the response of the system to applied perturbations.

The model thus allows, for example, the use of molecular models to predict material properties in the context of linear response theory. The theorem assumes that applied perturbations, e.g., mechanical forces or electric fields, are weak enough that rates of relaxation remain unchanged.

Brownian motion

For example, Albert Einstein noted in his 1905 paper on Brownian motion that the same random forces that cause the erratic motion of a particle in Brownian motion would also cause drag if the particle were pulled through the fluid. In other words, the fluctuation of the particle at rest has the same origin as the dissipative frictional force one must do work against, if one tries to perturb the system in a particular direction.

From this observation Einstein was able to use statistical mechanics to derive the Einstein–Smoluchowski relation

D = μ k_B T,

which connects the diffusion constant D and the particle mobility μ, the ratio of the particle's terminal drift velocity to an applied force. k_B is the Boltzmann constant, and T is the absolute temperature.
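As a numerical illustration, the relation can be combined with the Stokes mobility of a sphere, μ = 1/(6πηr); the particle radius and fluid viscosity below are illustrative assumptions, not values from the text:

```python
# Einstein-Smoluchowski relation: D = mu * kB * T.
# Mobility taken from Stokes drag for a sphere; parameters are illustrative.
import math

kB = 1.380649e-23    # Boltzmann constant, J/K
T = 298.0            # absolute temperature, K
eta = 8.9e-4         # viscosity of water, Pa*s (assumed)
r = 0.5e-6           # particle radius, m (assumed)

mu = 1.0 / (6 * math.pi * eta * r)  # mobility, (m/s) per newton of force
D = mu * kB * T                     # diffusion constant, m^2/s

print(f"D = {D:.2e} m^2/s")
```

For these parameters D comes out near 0.5 μm²/s, the familiar order of magnitude for micron-sized particles diffusing in water.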

Thermal noise in a resistor

In 1928, John B. Johnson discovered and Harry Nyquist explained Johnson–Nyquist noise. With no applied current, the mean-square voltage depends on the resistance R, the temperature T, and the bandwidth Δν over which the voltage is measured:

⟨V²⟩ = 4 k_B T R Δν.
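The magnitude of the Johnson–Nyquist noise ⟨V²⟩ = 4 k_B T R Δν is easy to evaluate; a minimal sketch, assuming an illustrative 1 kΩ resistor at 300 K measured over a 10 kHz bandwidth:

```python
# Johnson-Nyquist noise: <V^2> = 4 kB T R * bandwidth.
# R, T and the bandwidth below are illustrative assumptions.
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0           # temperature, K
R = 1e3             # resistance, ohm
bandwidth = 1e4     # measurement bandwidth, Hz

v_rms = math.sqrt(4 * kB * T * R * bandwidth)  # root-mean-square voltage
print(f"rms noise voltage: {v_rms * 1e9:.0f} nV")
```

The result is a few hundred nanovolts, which is why this noise floor matters mainly in sensitive, high-impedance measurements.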

General formulation

The fluctuation–dissipation theorem can be formulated in many ways; one particularly useful form is the following:[citation needed]

Let x(t) be an observable of a dynamical system with Hamiltonian H_0(x) subject to thermal fluctuations. The observable x(t) will fluctuate around its mean value ⟨x⟩_0 with fluctuations characterized by a power spectrum S_x(ω). Suppose that we can switch on a time-varying, spatially constant field f(t) which alters the Hamiltonian to H(x) = H_0(x) − f(t)x. The response of the observable x(t) to a time-dependent field f(t) is characterized to first order by the susceptibility or linear response function χ(t) of the system

⟨x(t)⟩ = ⟨x⟩_0 + ∫_{−∞}^{t} dτ f(τ) χ(t − τ),

where the perturbation is adiabatically (very slowly) switched on at τ = −∞.

The fluctuation–dissipation theorem relates the two-sided power spectrum (i.e. both positive and negative frequencies) S_x(ω) of x to the imaginary part of the Fourier transform χ̂(ω) of the susceptibility χ(t):

S_x(ω) = (2 k_B T / ω) Im χ̂(ω).

The left-hand side describes fluctuations in x; the right-hand side is closely related to the energy dissipated by the system when pumped by an oscillatory field f(t) = F sin(ωt + φ).

This is the classical form of the theorem; quantum fluctuations are taken into account by replacing 2 k_B T / ω with ℏ coth(ℏω / 2k_B T) (whose limit for ℏ → 0 is 2 k_B T / ω). A proof can be found by means of the LSZ reduction, an identity from quantum field theory.[citation needed]
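As a consistency check, the classical form of the theorem can be verified in closed form for a simple model: an overdamped particle in a harmonic trap (γ dx/dt = −kx + f(t) + thermal noise), for which both χ̂(ω) = 1/(k − iγω) and the power spectrum S_x(ω) = 2 k_B T γ/(k² + γ²ω²) are known. The sketch below, with illustrative parameters, compares the two sides of the theorem at several frequencies:

```python
# Check S_x(omega) = (2 kB T / omega) * Im chi(omega) for an overdamped
# particle in a harmonic trap (Ornstein-Uhlenbeck process).
# kB_T, gamma and k are illustrative assumptions.
import math

kB_T = 4.1e-21   # thermal energy at room temperature, J
gamma = 1e-8     # friction coefficient, kg/s
k = 1e-6         # trap stiffness, N/m

def chi_imag(omega):
    # imaginary part of the susceptibility chi(omega) = 1 / (k - i gamma omega)
    return gamma * omega / (k ** 2 + (gamma * omega) ** 2)

def spectrum(omega):
    # Fourier transform of the autocorrelation A(t) = (kB T / k) exp(-(k/gamma)|t|)
    return 2 * kB_T * gamma / (k ** 2 + (gamma * omega) ** 2)

for omega in (1.0, 10.0, 100.0, 1000.0):
    assert math.isclose(spectrum(omega),
                        (2 * kB_T / omega) * chi_imag(omega), rel_tol=1e-12)
print("fluctuation-dissipation relation verified")
```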

The fluctuation–dissipation theorem can be generalized in a straightforward way to the case of space-dependent fields, to the case of several variables or to a quantum-mechanics setting.[3]


Derivation

We derive the fluctuation–dissipation theorem in the form given above, using the same notation. Consider the following test case: the field f has been on for infinite time and is switched off at t = 0,

f(t) = f_0 θ(−t),

where θ(t) is the Heaviside step function.

We can express the expectation value of x by the probability distribution W(x,0) and the transition probability P(x′, t | x, 0):

⟨x(t)⟩ = ∫ dx′ ∫ dx x′ P(x′, t | x, 0) W(x, 0).

The probability distribution function W(x,0) is an equilibrium distribution and hence given by the Boltzmann distribution for the Hamiltonian H(x) = H_0(x) − x f_0:

W(x, 0) = e^{−βH(x)} / ∫ dx′ e^{−βH(x′)},

where β = 1/(k_B T). For a weak field β x f_0 ≪ 1, we can expand the right-hand side

W(x, 0) ≈ W_0(x) [1 + β f_0 (x − ⟨x⟩_0)],

here W_0(x) is the equilibrium distribution in the absence of a field. Plugging this approximation into the formula for ⟨x(t)⟩ yields

⟨x(t)⟩ = ⟨x⟩_0 + β f_0 A(t),    (*)

where A(t) is the auto-correlation function of x in the absence of a field:

A(t) = ⟨[x(t) − ⟨x⟩_0][x(0) − ⟨x⟩_0]⟩_0.
Note that in the absence of a field the system is invariant under time-shifts. We can rewrite ⟨x(t)⟩ − ⟨x⟩_0 using the susceptibility of the system: for f(t) = f_0 θ(−t) the linear response gives

⟨x(t)⟩ = ⟨x⟩_0 + f_0 ∫_t^∞ dτ χ(τ),

and hence we find with the above equation (*)

A(t) = k_B T ∫_t^∞ dτ χ(τ).    (**)
To make a statement about frequency dependence, it is necessary to take the Fourier transform of equation (**). By integrating by parts, it is possible to show that

χ̂(ω) = ∫_0^∞ dt e^{iωt} χ(t) = iωβ ∫_0^∞ dt e^{iωt} A(t) + β A(0).
Since A(t) is real and symmetric, it follows that

Im χ̂(ω) = ωβ ∫_0^∞ dt A(t) cos(ωt) = (ωβ/2) ∫_{−∞}^∞ dt A(t) e^{iωt}.
Finally, for stationary processes, the Wiener–Khinchin theorem states that the two-sided spectral density is equal to the Fourier transform of the auto-correlation function:

S_x(ω) = ∫_{−∞}^∞ dt A(t) e^{iωt}.
Therefore, it follows that

S_x(ω) = (2 k_B T / ω) Im χ̂(ω).

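The intermediate identity (**) can also be checked numerically for a concrete model. For an overdamped particle in a harmonic trap, χ(t) = (1/γ) e^{−(k/γ)t} for t > 0 and A(t) = (k_B T/k) e^{−(k/γ)t} are both known in closed form; the parameters below are illustrative and in arbitrary units:

```python
# Numerical check of equation (**): A(t) = kB T * integral_t^inf chi(tau) dtau
# for chi(t) = (1/gamma) exp(-(k/gamma) t) and A(t) = (kB T / k) exp(-(k/gamma) t).
import math

kB_T, gamma, k = 1.0, 2.0, 3.0  # arbitrary units, illustrative
rate = k / gamma

def chi(tau):
    # susceptibility of the overdamped harmonic trap, tau > 0
    return math.exp(-rate * tau) / gamma

def A_exact(t):
    # closed-form autocorrelation function
    return (kB_T / k) * math.exp(-rate * t)

def integral_chi(t, t_max=20.0, n=200_000):
    # trapezoid rule on [t, t_max]; the exponential tail beyond t_max is negligible
    h = (t_max - t) / n
    s = 0.5 * (chi(t) + chi(t_max))
    for i in range(1, n):
        s += chi(t + i * h)
    return s * h

for t in (0.0, 0.5, 1.0):
    assert math.isclose(kB_T * integral_chi(t), A_exact(t), rel_tol=1e-6)
print("equation (**) verified")
```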
Violations in glassy systems

While the fluctuation–dissipation theorem provides a general relation between the response of equilibrium systems to small external perturbations and their spontaneous fluctuations, no general relation is known for systems out of equilibrium. Glassy systems at low temperatures, as well as real glasses, are characterized by slow approaches to equilibrium, and so must be studied over long time-scales, during which they remain out of equilibrium.

In the mid-1990s, in the study of the non-equilibrium dynamics of spin glass models, a generalization of the fluctuation–dissipation theorem was discovered[citation needed] that holds for asymptotic non-stationary states, in which the temperature appearing in the equilibrium relation is replaced by an effective temperature with a non-trivial dependence on the time scales. This relation is proposed to hold in glassy systems beyond the models for which it was initially found.

Quantum version

The Rényi entropy, like the von Neumann entropy, is not an observable in quantum physics, since it depends nonlinearly on the density matrix. Ansari and Nazarov have proven an exact correspondence that reveals the physical meaning of the Rényi entropy flow in time. This correspondence is similar in spirit to the fluctuation–dissipation theorem and allows the measurement of quantum entropy using the full counting statistics (FCS) of energy transfers.[4][5][6]

References

  1. ^ David Chandler (1987). Introduction to Modern Statistical Mechanics. Oxford University Press. p. 255. ISBN 978-0-19-504277-1.
  2. ^ Nyquist H (1928). "Thermal Agitation of Electric Charge in Conductors". Physical Review. 32: 110–113. Bibcode:1928PhRv...32..110N. doi:10.1103/PhysRev.32.110.
  3. ^ a b H.B. Callen, T.A. Welton (1951). "Irreversibility and Generalized Noise". Physical Review. 83: 34–40. Bibcode:1951PhRv...83...34C. doi:10.1103/PhysRev.83.34.
  4. ^ Ansari & Nazarov (2016)
  5. ^ Ansari & Nazarov (2015a)
  6. ^ Ansari & Nazarov (2015b)


Further reading