# Arnold diffusion

In applied mathematics, Arnold diffusion is the phenomenon of instability of nearly integrable Hamiltonian systems. The phenomenon is named after Vladimir Arnold, who was the first to publish a result in the field in 1964.[1][2] More precisely, Arnold diffusion refers to results asserting the existence of solutions to nearly integrable Hamiltonian systems that exhibit a significant change in the action variables.

## Background and statement

For integrable systems, one has the conservation of the action variables. According to the KAM theorem, if we perturb an integrable system slightly, then many, though certainly not all, of the solutions of the perturbed system stay close, for all time, to the solutions of the unperturbed system. In particular, since the action variables were originally conserved, the theorem tells us that there is only a small change in action for many solutions of the perturbed system.
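This conservation statement can be made quantitative in action-angle coordinates. As a standard illustration (the symbols ${\displaystyle H_{0},H_{1}}$ here are generic, not taken from Arnold's example), write the perturbed Hamiltonian as ${\displaystyle H=H_{0}(I)+\epsilon H_{1}(I,\phi )}$; Hamilton's equations then give

${\displaystyle {\dot {\phi }}={\partial H_{0} \over \partial I}+\epsilon {\partial H_{1} \over \partial I},\qquad {\dot {I}}=-\epsilon {\partial H_{1} \over \partial \phi },}$

so each action variable changes at a rate of order ${\displaystyle \epsilon }$ at most. The content of the KAM theorem is that, for most initial conditions, these order-${\displaystyle \epsilon }$ oscillations never accumulate into a change of order one; Arnold diffusion concerns the exceptional solutions for which they do.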

However, as first noted in Arnold's paper,[1] there are nearly integrable systems for which there exist solutions that exhibit arbitrarily large growth in the action variables. More precisely, Arnold considered the example of a nearly integrable Hamiltonian system with Hamiltonian

${\displaystyle H(I,p,q,\phi ,t)={\frac {1}{2}}I^{2}+{\frac {1}{2}}p^{2}+\epsilon (\cos q-1)+\epsilon \mu (\cos q-1)(\sin \phi +\cos t)}$

He showed that for this system, for any choice of ${\displaystyle \epsilon ,\delta ,K>0}$ with ${\displaystyle K\gg \delta }$, there exists a ${\displaystyle \mu _{0}>0}$ such that for all ${\displaystyle 0<\mu <\mu _{0}}$ there is a solution to the system for which

${\displaystyle I(0)<\delta {\text{ and }}I(T)>K\,}$

for some time ${\displaystyle T\gg 0.}$
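A minimal numerical sketch can make the setup concrete. The code below is an illustrative sketch, not Arnold's argument: it assumes the factored form ${\displaystyle H={\tfrac {1}{2}}I^{2}+{\tfrac {1}{2}}p^{2}+\epsilon (\cos q-1)(1+\mu (\sin \phi +\cos t))}$ of Arnold's Hamiltonian, and the integrator, parameter values, and initial condition are arbitrary choices. It derives Hamilton's equations, treating ${\displaystyle (I,\phi )}$ and ${\displaystyle (p,q)}$ as canonical pairs with time entering explicitly, and integrates them with a basic Runge-Kutta stepper so that the action ${\displaystyle I(t)}$ can be tracked.

```python
import math

# Hamilton's equations for a system of Arnold's form (factored coupling assumed):
#   H = I**2/2 + p**2/2 + eps*(cos(q) - 1)*(1 + mu*(sin(phi) + cos(t)))
# (I, phi) and (p, q) are canonical pairs; the time t enters explicitly.
def rhs(t, state, eps, mu):
    I, phi, p, q = state
    dI = -eps * mu * (math.cos(q) - 1.0) * math.cos(phi)  # I' = -dH/dphi
    dphi = I                                              # phi' = dH/dI
    dp = eps * math.sin(q) * (1.0 + mu * (math.sin(phi) + math.cos(t)))  # p' = -dH/dq
    dq = p                                                # q' = dH/dp
    return (dI, dphi, dp, dq)

# One classical fourth-order Runge-Kutta step of size h.
def rk4_step(t, state, h, eps, mu):
    def shift(s, k, c):
        return tuple(si + c * ki for si, ki in zip(s, k))
    k1 = rhs(t, state, eps, mu)
    k2 = rhs(t + h / 2, shift(state, k1, h / 2), eps, mu)
    k3 = rhs(t + h / 2, shift(state, k2, h / 2), eps, mu)
    k4 = rhs(t + h, shift(state, k3, h), eps, mu)
    return tuple(si + h / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(state, k1, k2, k3, k4))

# Integrate from t0 to t1 in n steps, returning the final state (I, phi, p, q).
def evolve(state, t0, t1, n, eps, mu):
    h = (t1 - t0) / n
    t = t0
    for _ in range(n):
        state = rk4_step(t, state, h, eps, mu)
        t += h
    return state

# With mu = 0 the rotator (I, phi) decouples from the pendulum (p, q),
# so I is conserved exactly; with mu > 0 the coupling lets I change.
final = evolve((0.3, 0.1, 0.2, 0.5), 0.0, 10.0, 2000, eps=0.01, mu=0.0)
```

With ${\displaystyle \mu =0}$ the right-hand side of ${\displaystyle {\dot {I}}}$ vanishes identically, so the integration returns ${\displaystyle I}$ unchanged; switching on a small ${\displaystyle \mu }$ makes ${\displaystyle {\dot {I}}=O(\epsilon \mu )}$. A short integration will not exhibit the diffusion itself, since ${\displaystyle T}$ grows rapidly as ${\displaystyle \mu \to 0}$; Arnold's theorem is precisely the nontrivial statement that these tiny kicks can accumulate into an arbitrarily large change in the action.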