In physics, an infrared divergence (also IR divergence or infrared catastrophe) is a situation in which an integral, for example a Feynman diagram, diverges because of contributions from objects with energy approaching zero, or, equivalently, because of physical phenomena at very long distances.
Infrared divergences appear only in theories with massless particles (such as photons). They represent a legitimate effect that a complete theory often implies. In the case of photons, for example, the energy is given by E = hν, where ν is the frequency associated with the particle; as the frequency goes to zero, as for soft photons, an infinite number of such particles is needed to carry any finite amount of energy. One way to deal with the divergence is to impose an infrared cutoff and take the limit as the cutoff approaches zero, and/or to refine the question being asked. Another way is to assign the massless particle a fictitious mass and then take the limit as the fictitious mass vanishes.
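The cutoff procedure can be sketched with a toy calculation (not part of the article's sources): the 1/E integrand below stands in for the low-energy behaviour of soft-photon emission, and the function name and grid parameters are illustrative assumptions. The integral over photon energies grows like ln(1/λ) as the infrared cutoff λ is lowered, diverging only in the λ → 0 limit.

```python
import math

def ir_integral(cutoff, e_max=1.0, bins=4000):
    """Approximate the toy soft-photon integral of dE/E from `cutoff` to `e_max`.

    Uses the midpoint rule on a geometric (log-spaced) grid, which resolves
    the 1/E singularity near the cutoff that a uniform grid would miss.
    This is an illustration of the cutoff technique, not a QED calculation.
    """
    ratio = (e_max / cutoff) ** (1.0 / bins)  # constant ratio between bin edges
    total, a = 0.0, cutoff
    for _ in range(bins):
        b = a * ratio
        total += (b - a) / (0.5 * (a + b))  # midpoint rule on [a, b]
        a = b
    return total

# The regulated integral is finite for any cutoff > 0, but grows without
# bound (like ln(e_max/cutoff)) as the cutoff is taken to zero:
for lam in (1e-2, 1e-4, 1e-6):
    print(f"cutoff={lam:g}  integral≈{ir_integral(lam):.4f}  "
          f"ln(1/cutoff)={math.log(1.0 / lam):.4f}")
```

Each halving of the cutoff adds only a fixed amount (ln 2) to the integral, which is why the divergence is logarithmic and therefore mild.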
The divergence is usually in particle number rather than energy and is not empirically troubling, in that all measurable quantities remain finite (unlike the ultraviolet catastrophe, where the energies involved diverge).
When an electric charge is accelerated (or decelerated), it emits bremsstrahlung radiation. Semiclassical electromagnetic theory, as well as the full quantum electrodynamic analysis, shows that an infinite number of soft photons are created, but only a finite number are detectable: the remainder, because of their low energy, fall below any finite detection threshold, which must necessarily exist. Even though most of the photons are not detectable, they cannot be ignored in the theory; quantum electrodynamic calculations show that the transition amplitude between any states with a finite number of photons vanishes. Finite transition amplitudes are obtained only by summing over states with an infinite number of soft photons.
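The vanishing of finite-photon amplitudes can be made plausible with a simple toy model (an assumption for illustration, not the QED derivation): treat soft-photon emission as a Poisson process whose mean photon number grows like a coupling constant times ln(E_max/λ), where λ is an infrared cutoff. The function name, the coupling value, and the Poisson form itself are all illustrative choices. As the cutoff is removed, the mean diverges and the probability of any fixed finite photon number goes to zero, while the probabilities still sum to one over all photon numbers.

```python
import math

def finite_photon_probability(n, cutoff, e_max=1.0, coupling=0.1):
    """Toy model: probability of emitting exactly n soft photons.

    Assumes a Poisson distribution with mean coupling * ln(e_max/cutoff),
    mimicking the logarithmic growth of soft-photon emission as the
    infrared cutoff is lowered. Purely illustrative, not a QED formula.
    """
    mean = coupling * math.log(e_max / cutoff)
    return math.exp(-mean) * mean ** n / math.factorial(n)

# The probability of any fixed finite photon number shrinks as the
# cutoff is removed, even though the total probability stays 1:
for lam in (1e-3, 1e-9, 1e-27):
    p0 = finite_photon_probability(0, lam)
    print(f"cutoff={lam:g}  P(0 photons)≈{p0:.4f}")
```

In this picture, only quantities that sum over all photon numbers (such as a total rate) remain finite as the cutoff is removed, mirroring the statement that finite transition amplitudes require summing over states with arbitrarily many soft photons.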