Third law of thermodynamics


The third law of thermodynamics is sometimes stated as follows, regarding the properties of systems in equilibrium at absolute zero temperature:

The entropy of a perfect crystal at absolute zero is exactly equal to zero.

At absolute zero (zero kelvin), the system must be in a state with the minimum possible energy, and the above statement of the third law holds true provided that the perfect crystal has only one minimum-energy state. Entropy is related to the number of accessible microstates, and for a system consisting of many particles, quantum mechanics indicates that there is only one such state (called the ground state) with minimum energy.[1] If the system does not have a well-defined order (if its order is glassy, for example), then in practice some finite entropy will remain even as the system is brought to very low temperatures, because the system becomes locked into a configuration with non-minimal energy. The constant value is called the residual entropy of the system.[2]

The Nernst–Simon statement of the third law of thermodynamics concerns thermodynamic processes at a fixed, low temperature:

The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.

Here a condensed system refers to liquids and solids. A classical formulation by Nernst (actually a consequence of the Third Law) is:

It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.

Physically, the Nernst–Simon statement implies that it is impossible for any procedure to bring a system to the absolute zero of temperature in a finite number of steps.[3]


The third law was developed by the chemist Walther Nernst during the years 1906–12, and is therefore often referred to as Nernst's theorem or Nernst's postulate. It states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps."[4]

An alternative version of the third law of thermodynamics as stated by Gilbert N. Lewis and Merle Randall in 1923:

If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

This version states not only that ΔS reaches zero at 0 K, but that S itself also reaches zero, provided the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome.[5]

With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:

S - S_0 = k_B \ln \Omega

where S is entropy, kB is the Boltzmann constant, and Ω is the number of microstates consistent with the macroscopic configuration. The counting of states is from the reference state of absolute zero, which corresponds to the entropy S0.


In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the location and orientation of each part of the crystal. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, and the crystal becomes the same everywhere.

The third law provides an absolute reference point for the determination of entropy at any other temperature. The entropy of a system, determined relative to this zero point, is then the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural logarithm of the number of ground states times Boltzmann's constant kB = 1.38×10⁻²³ J K⁻¹.

The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because ln(1) = 0. If the system is composed of one billion identical atoms lying within the matrix of a perfect crystal, the number of permutations of one billion identical things taken one billion at a time is Ω = 1. Hence:

S - S_0 = k_B \ln\Omega = k_B\ln{1} = 0

The difference is zero, so the initial entropy S0 can be any selected value so long as all other such calculations include that as the initial entropy. As a result, the initial entropy value S0 = 0 is selected for convenience.

S - S_0 = S - 0 = 0
S = 0

By way of example, suppose a system consists of 1 cm³ of matter with a mass of 1 g and a molar mass of 20 g/mol. The system consists of 3×10²² identical atoms at 0 K. If one atom absorbs a photon of wavelength 1 cm, that atom becomes unique, and the number of permutations of one unique atom among the 3×10²² atoms is Ω = N = 3×10²². The entropy, energy, and temperature of the system rise and can be calculated. The entropy change is:

\Delta S = S - S_{0} = k_{B} \ln{\Omega}

From the second law of thermodynamics:

\Delta S = S - S_0 = \frac{\delta Q}{T}


\Delta S = S - S_0 = k_B \ln\Omega = \frac{\delta Q}{T}

Calculating entropy change:

S - 0 = k_B \ln N = 1.38 \times 10^{-23}\,\mathrm{J/K} \times \ln(3 \times 10^{22}) = 70 \times 10^{-23}\,\mathrm{J/K}

The energy change of the system as a result of absorbing the single photon whose energy is ε:

\delta Q = \epsilon = \frac{hc}{\lambda} = \frac{6.62 \times 10^{-34}\,\mathrm{J \cdot s} \times 3 \times 10^{8}\,\mathrm{m/s}}{0.01\,\mathrm{m}} = 2 \times 10^{-23}\,\mathrm{J}

The temperature of the system rises by:

T = \frac{\epsilon}{\Delta S} = \frac{2 \times 10^{-23}J}{70 \times 10^{-23}J/K} = \frac{1}{35} K

This can be interpreted as the average temperature of the system over the range 0 < S < 70×10⁻²³ J/K.[6] A single atom was assumed to absorb the photon, but the temperature and entropy change characterize the entire system.
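The arithmetic of this example can be checked with a short numerical sketch (the constants are CODATA values; N and the wavelength are the figures assumed above):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

N = 3e22             # number of identical atoms in the sample
wavelength = 0.01    # photon wavelength, m (1 cm)

# One excited atom can sit at any of the N sites, so Omega = N
delta_S = k_B * math.log(N)

# Energy of the single absorbed photon
epsilon = h * c / wavelength

# Average temperature over the entropy change (from delta_S = delta_Q / T)
T = epsilon / delta_S

print(f"delta_S ≈ {delta_S:.1e} J/K")   # ≈ 7.1e-22 J/K, i.e. ~70e-23 J/K
print(f"epsilon ≈ {epsilon:.1e} J")     # ≈ 2.0e-23 J
print(f"T ≈ {T:.3f} K")                 # ≈ 0.028 K ≈ 1/35 K
```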

An example of a system which does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least kB ln 2 (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.

In addition, glasses and solid solutions retain large entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium. Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".
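The residual entropy of proton-disordered ice Ih can be estimated with Pauling's classic counting argument, in which each molecule contributes roughly W = 3/2 allowed proton configurations; the 3/2 factor is Pauling's approximation, not an exact result. A minimal sketch:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# Pauling's estimate: each molecule contributes a factor W = 3/2 to the
# number of allowed proton configurations, so Omega = (3/2)**N_A per mole.
S_residual = N_A * k_B * math.log(3 / 2)

print(f"residual entropy of ice Ih ≈ {S_residual:.2f} J/(mol·K)")  # ≈ 3.37
```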

For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. However, ferromagnetic materials do not in fact have zero entropy at zero temperature, because the spins of the unpaired electrons are all aligned and this gives a ground-state spin degeneracy. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly-degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).[citation needed]

Mathematical formulation

Consider a closed system in internal equilibrium. As the system is in equilibrium, there are no irreversible processes, so the entropy production is zero. During slow heating, small temperature gradients are generated in the material, but the associated entropy production can be kept arbitrarily low if the heat is supplied slowly enough. The increase in entropy due to the added heat δQ is then given by the second law of thermodynamics, which states that the entropy change of a system is given by

\mathrm{d}S = \frac{\delta Q}{T}. (1)

The temperature rise dT due to the heat δQ is determined by the heat capacity C(T,X) according to

\delta Q=C(T,X) \mathrm{d}T. (2)

The parameter X is a symbolic notation for all parameters (such as pressure, magnetic field, liquid/solid fraction, etc.) which are kept constant during the heat supply. For example, if the volume is constant we get the heat capacity at constant volume CV. In the case of a phase transition from liquid to solid, or from gas to liquid, the parameter X can be one of the two components.[clarification needed] Combining relations (1) and (2) gives

\mathrm{d}S = \frac{C(T,X)}{T}\mathrm{d}T. (3)

Integration of Eq.(3) from a reference temperature T0 to an arbitrary temperature T gives the entropy at temperature T

S(T,X) = S(T_0,X) + \int_{T_0}^{T} \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (4)

We now come to the mathematical formulation of the third law. There are three steps:

1. In the limit T0→0 the integral in Eq.(4) is finite.[clarification needed] So we may take T0 = 0 and write

S(T,X)=S(0,X) + \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (5)

2. The value of S(0,X) is independent of X.[clarification needed] In mathematical form

S(0,X) = S(0). (6)

So Eq.(5) can be further simplified to

S(T,X)=S(0) + \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (7)

Equation (6) can also be formulated as

\lim_{T \rightarrow 0}\left( \frac{\partial S(T,X)}{\partial X}\right)_T = 0. (8)

In words: at absolute zero all isothermal processes are isentropic. Eq.(8) is the mathematical formulation of the third law.

3. Classically, one is free to choose the zero of the entropy, and it is convenient to take

S(0)=0 (9)

so that Eq.(7) reduces to the final form

S(T,X) = \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (10)
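The finiteness of the integral in Eq.(10) can be illustrated numerically. The sketch below assumes a Debye-like low-temperature heat capacity C(T) = aT³ with an arbitrary coefficient a, and checks the quadrature against the closed form aT³/3:

```python
# Numerical check of Eq.(10) for an assumed Debye-like low-temperature
# heat capacity C(T) = a*T**3; the coefficient a is arbitrary.
a = 1.0e-3  # J/K^4, illustrative value

def C(T):
    return a * T**3

def entropy(T, steps=100_000):
    """Midpoint-rule quadrature of C(T')/T' from 0 to T (avoids T' = 0)."""
    dT = T / steps
    return sum(C((i + 0.5) * dT) / ((i + 0.5) * dT) * dT for i in range(steps))

T = 10.0
print(entropy(T))       # numerical value of Eq.(10)
print(a * T**3 / 3)     # closed form: integral of a*T'^2 dT' = a*T^3/3
```

Because C/T = aT² is integrable down to T′ = 0, the entropy stays finite, exactly as step 1 above requires.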

However, reinterpreting Eq. (9) in view of the quantized nature of the lowest-lying energy states, the physical meaning of Eq.(9) goes deeper than just a convenient selection of the zero of the entropy. It is due to the perfect order at zero kelvin as explained above.

Consequences of the third law

Fig.1 Left: Absolute zero can be reached in a finite number of steps if S(0,X1) ≠ S(0,X2). Right: an infinite number of steps is needed, since S(0,X1) = S(0,X2).

Can absolute zero be obtained?

The third law is equivalent to the statement that

"It is impossible by any procedure, no matter how idealized, to reduce the temperature of any system to zero temperature in a finite number of finite operations".[7]

The reason that T=0 cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter X from X2 to X1. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way.[8] If there were an entropy difference at absolute zero, T=0 could be reached in a finite number of steps. However, at T=0 there is no entropy difference so an infinite number of steps would be needed. The process is illustrated in Fig.1.

Specific heat

A non-quantitative description of his third law that Nernst gave at the very beginning was simply that the specific heat can always be made zero by cooling the material down far enough.[9] A modern, quantitative analysis follows.

Suppose that the heat capacity of a sample in the low-temperature region has the form of a power law C(T,X) = C0T^α asymptotically as T → 0, and we wish to find which values of α are compatible with the third law. We have

\int_{T_0}^T \frac{C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime = \frac{C_0}{\alpha}\left(T^{\alpha} - T_0^{\alpha}\right). (11)

By the discussion of the third law above, this integral must be bounded as T0→0, which is possible only if α > 0. So the heat capacity must go to zero at absolute zero

 \lim_{T \rightarrow 0}C(T,X)=0. (12)

if it has the form of a power law. The same argument shows that it cannot be bounded below by a positive constant, even if we drop the power-law assumption.

On the other hand, the molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by CV=(3/2)R with R the molar ideal gas constant. But clearly a constant heat capacity does not satisfy Eq. (12). That is, a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics. We can verify this more fundamentally by substituting CV in Eq. (4), which yields

S(T,V) = S(T_0,V) + \frac{3}{2}R \ln \frac{T}{T_0}. (13)

In the limit T0→0 this expression diverges, again contradicting the third law of thermodynamics.
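The contrast between Eq.(11) and the divergence of Eq.(13) can be made concrete with a short numerical check; the exponents, temperatures, and C0 below are illustrative values only:

```python
import math

# Eq.(11) for C(T) = C0*T**alpha: finite as T0 -> 0 only when alpha > 0.
# alpha = 0 reproduces the logarithmic divergence of Eq.(13).
def entropy_integral(alpha, T=1.0, T0=1e-12, C0=1.0):
    if alpha == 0.0:
        return C0 * math.log(T / T0)          # diverges as T0 -> 0
    return (C0 / alpha) * (T**alpha - T0**alpha)

for alpha in (1.5, 1.0, 0.5):
    print(alpha, entropy_integral(alpha))     # tends to the finite limit C0/alpha
# constant heat capacity: the result keeps growing as T0 shrinks
print(entropy_integral(0.0, T0=1e-6), entropy_integral(0.0, T0=1e-12))
```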

The conflict is resolved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi–Dirac statistics and Bose particles follow Bose–Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases

C_V = \frac{ \pi^2}{2}R \frac{T}{T_F} (14)

with the Fermi temperature TF given by

T_F = \frac{1}{8 \pi^2}\frac{N_A^2h^2}{MR}\left( \frac{3\pi^2N_A}{V_m}\right)^{2/3}. (15)

Here NA is Avogadro's number, Vm the molar volume, and M the molar mass.

For Bose gases

C_V \approx 1.93\,R\left( \frac{T}{T_B}\right)^{3/2} (16)

with TB given by

T_B \approx \frac{1}{11.9}\frac{N_A^2h^2}{MR}\left( \frac{N_A}{V_m}\right)^{2/3}. (17)

The specific heats given by Eq.(14) and (16) both satisfy Eq.(12). Indeed, they are power laws with α=1 and α=3/2 respectively.
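For any power-law heat capacity C(T) = C0T^α, Eq.(10) integrates to S(T) = C(T)/α, so both quantum gases reach zero entropy at T = 0. A quick numerical check of this relation for the two exponents (C0 = 1 is arbitrary):

```python
# For C(T) = C0*T**alpha, Eq.(10) gives S(T) = C(T)/alpha. Check this for
# the Fermi-gas exponent alpha = 1 (Eq. 14) and the Bose-gas exponent
# alpha = 3/2 (Eq. 16); C0 = 1 and T = 2.0 are illustrative values.
def S_numeric(alpha, T, C0=1.0, steps=200_000):
    # midpoint-rule quadrature of C(T')/T' = C0*T'**(alpha - 1) from 0 to T
    dT = T / steps
    return sum(C0 * ((i + 0.5) * dT) ** (alpha - 1.0) * dT for i in range(steps))

for alpha in (1.0, 1.5):
    T = 2.0
    print(alpha, S_numeric(alpha, T), T**alpha / alpha)  # numeric vs C(T)/alpha
```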

Vapor pressure

The only liquids near absolute zero are ³He and ⁴He. Their heats of evaporation have limiting low-temperature values given by

L=L_0+C_pT (18)

with L0 and Cp constant. If we consider a container, partly filled with liquid and partly gas, the entropy of the liquid–gas mixture is

S(T,x) = S_l(T) + x\left(\frac{L_0}{T} + C_p\right) (19)

where Sl(T) is the entropy of the liquid and x is the gas fraction. Clearly the entropy change during the liquid–gas transition (x from 0 to 1) diverges in the limit of T→0. This violates Eq.(8). Nature solves this paradox as follows: at temperatures below about 50 mK the vapor pressure is so low that the gas density is lower than the best vacuum in the universe. In other words: below 50 mK there is simply no gas above the liquid.

Latent heat of melting

The melting curves of ³He and ⁴He both extend down to absolute zero at finite pressure. At the melting pressure liquid and solid are in equilibrium. The third law demands that the entropies of the solid and liquid are equal at T=0. As a result the latent heat of melting is zero and the slope of the melting curve extrapolates to zero as a result of the Clausius–Clapeyron equation.
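The vanishing slope follows from the Clausius–Clapeyron equation, written here for molar quantities with subscripts l and s for the liquid and solid phases:

\frac{\mathrm{d}p}{\mathrm{d}T} = \frac{S_l - S_s}{V_l - V_s}

Since the third law forces S_l − S_s → 0 at T = 0 while the molar volumes of the two phases remain different, the slope dp/dT must itself go to zero.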

Thermal expansion coefficient

The thermal expansion coefficient is defined as

\alpha_V = \frac{1}{V_m} \left(\frac{\partial V_m}{\partial T}\right)_{p}. (20)

With the Maxwell relation

\left(\frac{\partial V_m}{\partial T}\right)_{p} = -\left(\frac{\partial S_m}{\partial p}\right)_T (21)

and Eq.(8) with X = p, it follows that

\lim_{T \rightarrow 0}\alpha_V=0. (22)

So the thermal expansion coefficient of all materials must go to zero at zero kelvin.

References


  1. ^ J. Wilks The Third Law of Thermodynamics Oxford University Press (1961).[page needed]
  2. ^ Kittel and Kroemer, Thermal Physics (2nd ed.), page 49.
  3. ^ Wilks, J. (1971). The Third Law of Thermodynamics, Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry. An Advanced Treatise, Academic Press, New York, page 477.
  4. ^ Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics, New York, ISBN 0-88318-797-3, page 342.
  5. ^ Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy 10 (3): 274–84. Bibcode:2008Entrp..10..274K. doi:10.3390/e10030274. 
  6. ^ Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw Hill. p. 438. ISBN 0-07-052046-1.
  7. ^ Guggenheim, E.A. (1967). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth revised edition, North-Holland Publishing Company, Amsterdam, page 157.
  8. ^ F. Pobell, Matter and Methods at Low Temperatures, (Springer-Verlag, Berlin, 2007)[page needed]
  9. ^ Einstein and the Quantum, A. Douglas Stone, Princeton University Press, 2013.
