Arnold diffusion

In applied mathematics, Arnold diffusion is the phenomenon of instability of nearly integrable Hamiltonian systems. The phenomenon is named after Vladimir Arnold, who was the first to publish a result in the field, in 1964.[1][2] More precisely, Arnold diffusion refers to results asserting the existence of solutions to nearly integrable Hamiltonian systems that exhibit a significant change in the action variables.

Background and statement

For integrable systems, one has the conservation of the action variables. According to the KAM theorem, if we perturb an integrable system slightly, then many, though certainly not all, of the solutions of the perturbed system stay close, for all time, to those of the unperturbed system. In particular, since the action variables were originally conserved, the theorem tells us that there is only a small change in action for many solutions of the perturbed system.
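As a brief illustration of why the actions are conserved in the integrable case (using action-angle notation $I$, $\varphi$, $H_0$, $\omega$ introduced here only for concreteness), Hamilton's equations for an integrable Hamiltonian $H_0(I)$ read

\dot{\varphi} = \frac{\partial H_0}{\partial I} = \omega(I), \qquad \dot{I} = -\frac{\partial H_0}{\partial \varphi} = 0,

so each action variable is constant along every solution. For a small perturbation $H_0(I) + \varepsilon H_1(I, \varphi)$ satisfying the hypotheses of the KAM theorem (in particular a nondegeneracy condition on $H_0$), this picture survives for many, though not all, initial conditions: their orbits remain on slightly deformed invariant tori, and their actions stay close to the initial values for all time.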

However, as first noted in Arnold's paper,[1] there are nearly integrable systems for which there exist solutions that exhibit arbitrarily large growth in the action variables. More precisely, Arnold considered the example of a nearly integrable Hamiltonian system with Hamiltonian

H(I_1, I_2, \varphi_1, \varphi_2, t) = \frac{1}{2}\left(I_1^2 + I_2^2\right) + \varepsilon\,(\cos\varphi_1 - 1)\bigl(1 + \mu\,(\sin\varphi_2 + \cos t)\bigr).
He showed that, for this system, for any choice of $A$ and $B$ with $0 < A < B$ and any $\varepsilon > 0$, there exists a $\mu_0 > 0$ such that for all $0 < \mu < \mu_0$ there is a solution to the system for which

I_2(0) < A \quad \text{and} \quad I_2(T) > B

for some time $T > 0$.
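For concreteness, and assuming the standard form of Arnold's example as written above, the corresponding Hamilton's equations are

\dot{\varphi}_1 = I_1, \qquad \dot{\varphi}_2 = I_2, \qquad \dot{I}_1 = \varepsilon \sin\varphi_1 \bigl(1 + \mu\,(\sin\varphi_2 + \cos t)\bigr), \qquad \dot{I}_2 = -\varepsilon\mu\,(\cos\varphi_1 - 1)\cos\varphi_2.

In particular $\dot{I}_2$ is of size $\varepsilon\mu$, so an order-one change in $I_2$ can only accumulate over a very long time; the content of Arnold's result is that such an accumulation does in fact occur along suitable orbits, however small $\mu$ is taken.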

References

  1. ^ a b Arnold, Vladimir I. (1964). "Instability of dynamical systems with several degrees of freedom". Soviet Mathematics. 5: 581–585.
  2. ^ Florin Diacu; Philip Holmes (1996). Celestial Encounters: The Origins of Chaos and Stability. Princeton University Press. p. 193. ISBN 0-691-00545-1.