Arnold diffusion

From Wikipedia, the free encyclopedia

In applied mathematics, Arnold diffusion is the phenomenon of instability of nearly integrable Hamiltonian systems. The phenomenon is named after Vladimir Arnold, who was the first to publish a result in the field in 1964.[1][2] More precisely, Arnold diffusion refers to results asserting the existence of solutions to nearly integrable Hamiltonian systems that exhibit a significant change in the action variables.

Background and statement

For integrable systems, one has the conservation of the action variables. According to the KAM theorem, if we perturb an integrable system slightly, then many, though certainly not all, of the solutions of the perturbed system stay close, for all time, to solutions of the unperturbed system. In particular, since the action variables were originally conserved, the theorem tells us that there is only a small change in action for many solutions of the perturbed system.
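This conservation can be read off directly from Hamilton's equations when the integrable Hamiltonian is written in action-angle variables (a standard computation, stated here for orientation):

$$
H = H_0(I), \qquad
\dot I = -\frac{\partial H_0}{\partial \varphi} = 0, \qquad
\dot\varphi = \frac{\partial H_0}{\partial I} = \omega(I).
$$

Each action $I$ is thus a constant of motion, each angle advances at the fixed frequency $\omega(I)$, and the motion is quasiperiodic on an invariant torus. The KAM theorem concerns the survival of most of these tori under a small perturbation $H_0(I) + \varepsilon H_1(I, \varphi)$.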

However, as first noted in Arnold's paper,[1] there are nearly integrable systems for which there exist solutions that exhibit arbitrarily large growth in the action variables. More precisely, Arnold considered the example of a nearly integrable Hamiltonian system with Hamiltonian

$$
H(I_1, I_2, \varphi_1, \varphi_2, t) = \tfrac{1}{2}\left(I_1^2 + I_2^2\right) + \varepsilon(\cos\varphi_1 - 1)\bigl(1 + \mu(\sin\varphi_2 + \cos t)\bigr).
$$

He showed that for this system, with any choice of $A, B$ where $0 < A < B$, and any sufficiently small $\varepsilon > 0$, there exists a $\mu_0 > 0$ such that for all $0 < \mu < \mu_0$ there is a solution to the system for which

$$
I_2(0) < A, \qquad I_2(T) > B
$$

for some time $T > 0$.
References

  1. Arnold, Vladimir I. (1964). "Instability of dynamical systems with several degrees of freedom". Soviet Mathematics. 5: 581–585.
  2. Diacu, Florin; Holmes, Philip (1996). Celestial Encounters: The Origins of Chaos and Stability. Princeton University Press. p. 193. ISBN 0-691-00545-1.