Hellmann–Feynman theorem

From Wikipedia, the free encyclopedia

In quantum mechanics, the Hellmann–Feynman theorem relates the derivative of the total energy with respect to a parameter to the expectation value of the derivative of the Hamiltonian with respect to that same parameter. According to the theorem, once the spatial distribution of the electrons has been determined by solving the Schrödinger equation, all the forces in the system can be calculated using classical electrostatics.

The theorem has been proven independently by many authors, including Paul Güttinger (1932),[1] Wolfgang Pauli (1933),[2] Hans Hellmann (1937)[3] and Richard Feynman (1939).[4]

The theorem states that

$$\frac{dE_\lambda}{d\lambda} = \left\langle \psi_\lambda \left| \frac{d\hat{H}_\lambda}{d\lambda} \right| \psi_\lambda \right\rangle, \qquad (1)$$

where

  • $\hat{H}_\lambda$ is a Hamiltonian operator depending upon a continuous parameter $\lambda$,
  • $|\psi_\lambda\rangle$ is an eigenstate (eigenfunction) of the Hamiltonian, depending implicitly upon $\lambda$,
  • $E_\lambda$ is the energy (eigenvalue) of the state $|\psi_\lambda\rangle$, i.e. $\hat{H}_\lambda |\psi_\lambda\rangle = E_\lambda |\psi_\lambda\rangle$.
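The theorem is straightforward to verify numerically. The following sketch (an illustration added here, not part of the standard treatment; the matrices are arbitrary choices) checks the statement for the ground state of a small parameter-dependent Hermitian matrix, comparing a finite-difference derivative of the lowest eigenvalue against the expectation value of $d\hat{H}/d\lambda$:

```python
import numpy as np

# Model Hamiltonian H(lam) = H0 + lam * V; H0 and V are arbitrary
# real-symmetric matrices chosen purely for illustration.
H0 = np.array([[1.0, 0.3],
               [0.3, 2.0]])
V = np.array([[0.5, -0.2],
              [-0.2, 0.1]])

def ground_state(lam):
    """Lowest eigenvalue and normalized eigenvector of H(lam)."""
    evals, evecs = np.linalg.eigh(H0 + lam * V)
    return evals[0], evecs[:, 0]

lam, eps = 0.7, 1e-6

# Left-hand side: dE/d(lam) by central finite differences.
dE_numeric = (ground_state(lam + eps)[0] - ground_state(lam - eps)[0]) / (2 * eps)

# Right-hand side: <psi| dH/d(lam) |psi> = <psi|V|psi>.
_, psi = ground_state(lam)
dE_hf = psi @ V @ psi

print(dE_numeric, dE_hf)  # the two values agree to high accuracy
```

Note that the eigenvector never has to be differentiated: only the explicit dependence of the Hamiltonian on the parameter enters the right-hand side.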


Proof

This proof of the Hellmann–Feynman theorem requires that the wavefunction be an eigenfunction of the Hamiltonian under consideration; however, one can also prove more generally that the theorem holds for non-eigenfunction wavefunctions that are stationary (the partial derivative is zero) with respect to all relevant variables (such as orbital rotations). The Hartree–Fock wavefunction is an important example of an approximate wavefunction that still satisfies the Hellmann–Feynman theorem. A notable example where the Hellmann–Feynman theorem is not applicable is finite-order Møller–Plesset perturbation theory, which is not variational.[5]

The proof also employs an identity of normalized wavefunctions – that derivatives of the overlap of a wavefunction with itself must be zero. Using Dirac's bra–ket notation, these two conditions are written as

$$\hat{H}_\lambda |\psi_\lambda\rangle = E_\lambda |\psi_\lambda\rangle,$$

$$\langle \psi_\lambda | \psi_\lambda \rangle = 1 \quad \Longrightarrow \quad \frac{d}{d\lambda} \langle \psi_\lambda | \psi_\lambda \rangle = 0.$$
The proof then follows through an application of the derivative product rule to the expectation value of the Hamiltonian viewed as a function of $\lambda$:

$$\frac{dE_\lambda}{d\lambda} = \frac{d}{d\lambda} \langle \psi_\lambda | \hat{H}_\lambda | \psi_\lambda \rangle = \left\langle \frac{d\psi_\lambda}{d\lambda} \middle| \hat{H}_\lambda \middle| \psi_\lambda \right\rangle + \left\langle \psi_\lambda \middle| \hat{H}_\lambda \middle| \frac{d\psi_\lambda}{d\lambda} \right\rangle + \left\langle \psi_\lambda \left| \frac{d\hat{H}_\lambda}{d\lambda} \right| \psi_\lambda \right\rangle$$

$$= E_\lambda \frac{d}{d\lambda} \langle \psi_\lambda | \psi_\lambda \rangle + \left\langle \psi_\lambda \left| \frac{d\hat{H}_\lambda}{d\lambda} \right| \psi_\lambda \right\rangle = \left\langle \psi_\lambda \left| \frac{d\hat{H}_\lambda}{d\lambda} \right| \psi_\lambda \right\rangle.$$
Alternate proof

The Hellmann–Feynman theorem is actually a direct, and to some extent trivial, consequence of the variational principle (the Rayleigh–Ritz variational principle) from which the Schrödinger equation may be derived. This is why the Hellmann–Feynman theorem holds for wavefunctions (such as the Hartree–Fock wavefunction) that, though not eigenfunctions of the Hamiltonian, do derive from a variational principle. This is also why it holds, e.g., in density functional theory, which is not wavefunction-based and for which the standard derivation does not apply.

According to the Rayleigh–Ritz variational principle, the eigenfunctions of the Schrödinger equation are stationary points of the functional (sometimes called the Schrödinger functional for brevity):

$$E[\psi, \lambda] = \frac{\langle \psi | \hat{H}_\lambda | \psi \rangle}{\langle \psi | \psi \rangle}. \qquad (2)$$
The eigenvalues are the values that the Schrödinger functional takes at the stationary points:

$$E_\lambda = E[\psi_\lambda, \lambda], \qquad (3)$$
where $\psi_\lambda$ satisfies the variational condition:

$$\left. \frac{\delta E[\psi, \lambda]}{\delta \psi(x)} \right|_{\psi = \psi_\lambda} = 0. \qquad (4)$$
By differentiating Eq. (3) using the chain rule, one obtains:

$$\frac{dE_\lambda}{d\lambda} = \frac{\partial E[\psi_\lambda, \lambda]}{\partial \lambda} + \int \left. \frac{\delta E[\psi, \lambda]}{\delta \psi(x)} \right|_{\psi = \psi_\lambda} \frac{d\psi_\lambda(x)}{d\lambda} \, dx. \qquad (5)$$
Due to the variational condition, Eq. (4), the second term in Eq. (5) vanishes. In one sentence, the Hellmann–Feynman theorem states that the derivative of the stationary values of a function(al) with respect to a parameter on which it may depend can be computed from the explicit dependence alone, disregarding the implicit one. Because the Schrödinger functional can depend explicitly on an external parameter only through the Hamiltonian, Eq. (1) trivially follows.
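This stationarity argument can also be checked numerically. The sketch below (illustrative; the model matrix and the one-parameter ansatz are arbitrary choices) minimizes the Rayleigh quotient over a restricted trial family that does not contain the true eigenvectors, and confirms that the derivative of the optimal energy with respect to $\lambda$ still equals the expectation value of $\partial\hat{H}/\partial\lambda$, exactly as predicted once the implicit term drops out:

```python
import numpy as np

def H(lam):
    # 3x3 model Hamiltonian with explicit lam-dependence (arbitrary illustration).
    return np.array([[1.0 + lam, 0.4,       0.2      ],
                     [0.4,       2.0,       0.1 * lam],
                     [0.2,       0.1 * lam, 3.0 - lam]])

# Explicit derivative dH/d(lam).
dH = np.array([[1.0, 0.0,  0.0],
               [0.0, 0.0,  0.1],
               [0.0, 0.1, -1.0]])

def E_opt(lam):
    """Minimize the Rayleigh quotient over the restricted normalized ansatz
    psi(theta) = (cos theta, sin theta, 0), which spans only the first two
    basis vectors, so the optimum is NOT an eigenvector of the full H."""
    thetas = np.linspace(0.0, np.pi, 100001)
    psis = np.stack([np.cos(thetas), np.sin(thetas), np.zeros_like(thetas)])
    energies = np.einsum('in,ij,jn->n', psis, H(lam), psis)
    k = np.argmin(energies)
    return energies[k], psis[:, k]

lam, eps = 0.3, 1e-3

# Derivative of the stationary (here: minimal) value by finite differences...
dE_num = (E_opt(lam + eps)[0] - E_opt(lam - eps)[0]) / (2 * eps)

# ...equals the explicit dependence alone: <psi*| dH/d(lam) |psi*>.
_, psi = E_opt(lam)
dE_hf = psi @ dH @ psi

print(dE_num, dE_hf)  # agree although psi is not an eigenstate of H(lam)
```

The term involving the variation of the optimal trial state drops out precisely because the energy is stationary with respect to the ansatz parameter at the optimum.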

Example applications

Molecular forces

The most common application of the Hellmann–Feynman theorem is to the calculation of intramolecular forces in molecules. This allows for the calculation of equilibrium geometries – the nuclear coordinates where the forces acting upon the nuclei, due to the electrons and other nuclei, vanish. The parameter $\lambda$ corresponds to the coordinates of the nuclei. For a molecule with $N$ electrons with coordinates $\{\mathbf{r}_i\}$, and $M$ nuclei, each located at a specified point $\{\mathbf{R}_\alpha\}$ and with nuclear charge $Z_\alpha$, the clamped-nucleus Hamiltonian is

$$\hat{H} = -\sum_{i=1}^{N} \frac{\hbar^2}{2m_e} \nabla_i^2 - \sum_{i=1}^{N} \sum_{\alpha=1}^{M} \frac{Z_\alpha e^2}{|\mathbf{r}_i - \mathbf{R}_\alpha|} + \sum_{i<j}^{N} \frac{e^2}{|\mathbf{r}_i - \mathbf{r}_j|} + \sum_{\alpha<\beta}^{M} \frac{Z_\alpha Z_\beta e^2}{|\mathbf{R}_\alpha - \mathbf{R}_\beta|}.$$
The $x$-component of the force acting on a given nucleus $\gamma$ is equal to the negative of the derivative of the total energy with respect to that coordinate. Employing the Hellmann–Feynman theorem, this is equal to

$$F_{X_\gamma} = -\frac{\partial E}{\partial X_\gamma} = -\left\langle \psi \left| \frac{\partial \hat{H}}{\partial X_\gamma} \right| \psi \right\rangle.$$
Only two components of the Hamiltonian contribute to the required derivative – the electron–nucleus and nucleus–nucleus terms. Differentiating the Hamiltonian yields[6]

$$\frac{\partial \hat{H}}{\partial X_\gamma} = -Z_\gamma e^2 \sum_{i=1}^{N} \frac{x_i - X_\gamma}{|\mathbf{r}_i - \mathbf{R}_\gamma|^3} + Z_\gamma e^2 \sum_{\alpha \neq \gamma}^{M} Z_\alpha \frac{X_\alpha - X_\gamma}{|\mathbf{R}_\alpha - \mathbf{R}_\gamma|^3}.$$
Insertion of this into the Hellmann–Feynman theorem returns the $x$-component of the force on the given nucleus in terms of the electronic density $\rho(\mathbf{r})$ and the atomic coordinates and nuclear charges:

$$F_{X_\gamma} = Z_\gamma e^2 \left[ \int \rho(\mathbf{r}) \, \frac{x - X_\gamma}{|\mathbf{r} - \mathbf{R}_\gamma|^3} \, d\mathbf{r} - \sum_{\alpha \neq \gamma}^{M} Z_\alpha \frac{X_\alpha - X_\gamma}{|\mathbf{R}_\alpha - \mathbf{R}_\gamma|^3} \right].$$
Expectation values

An alternative approach for applying the Hellmann–Feynman theorem is to promote a fixed or discrete parameter which appears in a Hamiltonian to be a continuous variable solely for the mathematical purpose of taking a derivative. Possible parameters are physical constants or discrete quantum numbers. As an example, the radial Schrödinger equation for a hydrogen-like atom is

$$\hat{H}_l \, \psi_l(r) = \left[ -\frac{\hbar^2}{2\mu r^2} \frac{d}{dr} \left( r^2 \frac{d}{dr} \right) + \frac{l(l+1)\hbar^2}{2\mu r^2} - \frac{Ze^2}{r} \right] \psi_l(r) = E_l \, \psi_l(r),$$
which depends upon the discrete azimuthal quantum number $l$. Promoting $l$ to be a continuous parameter allows for the derivative of the Hamiltonian to be taken:

$$\frac{\partial \hat{H}_l}{\partial l} = \frac{(2l+1)\hbar^2}{2\mu r^2}.$$
The Hellmann–Feynman theorem then allows for the determination of the expectation value of $\frac{1}{r^2}$ for hydrogen-like atoms:[7]

$$\left\langle \frac{1}{r^2} \right\rangle = \frac{2\mu}{(2l+1)\hbar^2} \left\langle \psi_{nl} \left| \frac{\partial \hat{H}_l}{\partial l} \right| \psi_{nl} \right\rangle = \frac{2\mu}{(2l+1)\hbar^2} \, \frac{\partial E_n}{\partial l} = \frac{2\mu}{(2l+1)\hbar^2} \cdot \frac{\mu Z^2 e^4}{\hbar^2 n^3} = \frac{Z^2}{a_0^2 \, n^3 \left( l + \frac{1}{2} \right)},$$

where $a_0 = \hbar^2 / (\mu e^2)$ is the Bohr radius.
In order to compute the energy derivative, the way $n$ depends on $l$ has to be known. These quantum numbers are usually independent, but here the solutions must be varied so as to keep the number of nodes in the wavefunction fixed. The number of radial nodes is $n - l - 1$, so $n = l + 1 + (\text{number of nodes})$ and hence $\partial n / \partial l = 1$.
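This result can be checked directly against the analytic hydrogen radial wavefunctions. The sketch below (atomic units with $Z = a_0 = 1$; an illustration added here) integrates $|R_{nl}(r)|^2$ to obtain $\langle 1/r^2 \rangle$ for the 1s and 2p states and compares with the Hellmann–Feynman formula $Z^2 / \big(a_0^2 n^3 (l + \tfrac{1}{2})\big)$:

```python
import numpy as np

# Radial grid (atomic units, Z = 1, a0 = 1).
r = np.linspace(1e-6, 60.0, 200001)
dr = r[1] - r[0]

# Normalized hydrogen radial wavefunctions R_nl(r).
R_10 = 2.0 * np.exp(-r)                             # 1s: n = 1, l = 0
R_21 = r * np.exp(-r / 2.0) / (2.0 * np.sqrt(6.0))  # 2p: n = 2, l = 1

def expect_inv_r2(R_nl):
    # <1/r^2> = integral of |R_nl|^2 r^2 * (1/r^2) dr = integral of |R_nl|^2 dr
    f = R_nl**2
    return np.sum((f[:-1] + f[1:]) * 0.5) * dr      # trapezoidal rule

def hf_formula(n, l):
    # Hellmann-Feynman result: Z^2 / (a0^2 n^3 (l + 1/2)) with Z = a0 = 1.
    return 1.0 / (n**3 * (l + 0.5))

print(expect_inv_r2(R_10), hf_formula(1, 0))  # both close to 2
print(expect_inv_r2(R_21), hf_formula(2, 1))  # both close to 1/12
```

For the 1s state the integral can also be done by hand, $\int_0^\infty 4 e^{-2r}\,dr = 2$, in agreement with $1/(1^3 \cdot \tfrac{1}{2}) = 2$.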

Van der Waals forces

At the end of his paper, Feynman states: "Van der Waals' forces can also be interpreted as arising from charge distributions with higher concentration between the nuclei. The Schrödinger perturbation theory for two interacting atoms at a separation $R$, large compared to the radii of the atoms, leads to the result that the charge distribution of each is distorted from central symmetry, a dipole moment of order $1/R^7$ being induced in each atom. The negative charge distribution of each atom has its center of gravity moved slightly toward the other. It is not the interaction of these dipoles which leads to van der Waals's force, but rather the attraction of each nucleus for the distorted charge distribution of its own electrons that gives the attractive force."

Hellmann–Feynman theorem for time-dependent wavefunctions

For a general time-dependent wavefunction satisfying the time-dependent Schrödinger equation, the Hellmann–Feynman theorem is not valid. However, the following identity holds:[8][9]

$$\frac{\partial}{\partial \lambda} \langle \psi_\lambda(t) | \hat{H}_\lambda | \psi_\lambda(t) \rangle = i\hbar \, \frac{\partial}{\partial t} \left\langle \psi_\lambda(t) \middle| \frac{\partial \psi_\lambda(t)}{\partial \lambda} \right\rangle + \left\langle \psi_\lambda(t) \left| \frac{\partial \hat{H}_\lambda}{\partial \lambda} \right| \psi_\lambda(t) \right\rangle,$$

for $\psi_\lambda(t)$ satisfying $i\hbar \, \partial \psi_\lambda(t) / \partial t = \hat{H}_\lambda \psi_\lambda(t)$.
The proof only relies on the Schrödinger equation and the assumption that partial derivatives with respect to λ and t can be interchanged.


References

  1. ^ Güttinger, P. (1932). "Das Verhalten von Atomen im magnetischen Drehfeld". Zeitschrift für Physik. 73 (3–4): 169–184. Bibcode:1932ZPhy...73..169G. doi:10.1007/BF01351211. S2CID 124962011.
  2. ^ Pauli, W. (1933). "Principles of Wave Mechanics". Handbuch der Physik. Vol. 24. Berlin: Springer. p. 162.
  3. ^ Hellmann, H (1937). Einführung in die Quantenchemie. Leipzig: Franz Deuticke. p. 285. OL 21481721M.
  4. ^ Feynman, R. P. (1939). "Forces in Molecules". Physical Review. 56 (4): 340–343. Bibcode:1939PhRv...56..340F. doi:10.1103/PhysRev.56.340.
  5. ^ Jensen, Frank (2007). Introduction to Computational Chemistry. West Sussex: John Wiley & Sons. p. 322. ISBN 978-0-470-01186-7.
  6. ^ Piela, Lucjan (2006). Ideas of Quantum Chemistry. Amsterdam: Elsevier Science. p. 620. ISBN 978-0-444-52227-6.
  7. ^ Fitts, Donald D. (2002). Principles of Quantum Mechanics : as Applied to Chemistry and Chemical Physics. Cambridge: Cambridge University Press. p. 186. ISBN 978-0-521-65124-0.
  8. ^ Epstein, Saul (1966). "Time-Dependent Hellmann-Feynman Theorems for Variational Wavefunctions". The Journal of Chemical Physics. 45 (1): 384. Bibcode:1966JChPh..45..384E. doi:10.1063/1.1727339.
  9. ^ Hayes, Edward F.; Parr, Robert G. (1965). "Time-Dependent Hellmann-Feynman Theorems". The Journal of Chemical Physics. 43 (5): 1831. Bibcode:1965JChPh..43.1831H. doi:10.1063/1.1697020.