A.34 The third law

In the simplest formulation, the third law of thermodynamics says that the entropy at absolute zero temperature is zero.

The original theorem is due to Nernst. A more recent formulation is

“The contribution to the entropy of a system due to each component that is in internal equilibrium disappears at absolute zero.” [D. Ter Haar (1966) Elements of Thermostatistics. Holt, Rinehart & Winston.]

A more readable version is

“The entropy of every chemically simple, perfectly crystalline, body equals zero at the absolute zero of temperature.” [G.H. Wannier (1966) Statistical Physics. Wiley.]
These formulations allow for the existence of meta-stable equilibria. The third law in its simple form assumes that, strictly speaking, the ground state is unique and that the system is in true thermal equilibrium. Experimentally, however, many substances do not appear to approach zero entropy; random mixtures as well as ice are examples. They may not be in true equilibrium, but if true equilibrium is never observed, the distinction is academic.

The zero of entropy is important for mixtures, in which you need to add the entropies of the components together correctly. It also has implications for the behavior of various quantities at low temperatures. For example, it implies that the specific heats become zero at absolute zero. To see why, note that in a constant volume or constant pressure process the entropy changes are given by

\Delta S = \int_0^T \frac{C}{T'}{\,\rm d}T'

If the specific heat $C$ did not become zero at $T = 0$, this integral would diverge at its lower limit, giving an infinite entropy at that temperature instead of zero.
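The divergence argument can be checked numerically. Below is a small sketch (my own, not from the text) that integrates $C/T$ on a log-spaced grid, comparing a Debye-like specific heat $C \propto T^3$, which vanishes at absolute zero, against a constant $C$; all names are made up for illustration.

```python
import numpy as np

def entropy_change(C, T_max, eps, n=200_000):
    """Trapezoidal approximation of the integral of C(T)/T from eps to T_max."""
    T = np.geomspace(eps, T_max, n)        # log-spaced grid resolves T -> 0 well
    f = C(T) / T
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(T)))

# Specific heat that vanishes at T = 0 (Debye-like, C = T^3):
# the integral stays finite as the lower limit eps -> 0.
for eps in (1e-2, 1e-4, 1e-6):
    print(entropy_change(lambda T: T**3, 1.0, eps))   # -> about 1/3 each time

# Constant specific heat (C = 1): the integral grows like ln(1/eps),
# i.e. it diverges, so the entropy at absolute zero would be infinite.
for eps in (1e-2, 1e-4, 1e-6):
    print(entropy_change(lambda T: np.ones_like(T), 1.0, eps))
```

The constant-$C$ results grow without bound as the lower limit shrinks, which is exactly the infinite-entropy contradiction the text describes.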

Another consequence of the third law is that it is not possible to bring a system completely to absolute zero temperature, even in idealized processes. That seems pretty self-evident from a classical point of view, but it is not so obvious in quantum terms. The third law also implies that isothermal processes become isentropic when absolute zero temperature is approached.

It may seem that the third law is a direct consequence of the quantum expression for the entropy,

S = - k_{\rm B}\sum_q P_q \ln(P_q)

At absolute zero temperature, the system is in the ground state. Assuming that the ground state is not degenerate, there is then only one nonzero probability, $P_q = 1$, and for that probability $\ln(P_q)$ is zero. So the entropy is zero.
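As a quick illustration (my own sketch, not from the text), this entropy can be evaluated for Boltzmann probabilities $P_q \propto e^{-E_q/k_{\rm B}T}$ over a few energy levels: it vanishes as $T \to 0$ when the ground state is unique, but tends to $k_{\rm B}\ln 2$ for a doubly degenerate ground state. The energy values are arbitrary.

```python
import numpy as np

def gibbs_entropy(E, T, kB=1.0):
    """S = -kB * sum(P_q ln P_q) with Boltzmann probabilities P_q ~ exp(-E_q/(kB*T))."""
    w = np.exp(-(E - E.min()) / (kB * T))  # shift by E.min() for numerical stability
    P = w / w.sum()
    P = P[P > 0]                           # 0*ln(0) terms contribute nothing
    return float(-kB * np.sum(P * np.log(P)))

E = np.array([0.0, 1.0, 2.0])       # unique ground state
for T in (1.0, 0.1, 0.01):
    print(T, gibbs_entropy(E, T))   # entropy heads to zero as T -> 0

E_deg = np.array([0.0, 0.0, 1.0])   # doubly degenerate ground state
print(gibbs_entropy(E_deg, 0.01))   # stays near kB*ln(2) instead
```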

Even if the ground state is not unique, often it does not make much of a difference. For example, consider the case of a system of $I$ noninteracting spin 1 bosons in a box. If you could really ignore the effect of all particle interactions on the energy, the $I$ spin states would be arbitrary in the ground state. But even then there would be only about $\frac12I^2$ different system states with the ground state energy, chapter 5.7. That produces an entropy of only about $-k_{\rm B}\ln(2/I^2)$. It would make the specific entropy proportional to $\ln(I)/I$, which tends to zero for a large-enough system.
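The $\ln(I)/I$ scaling is easy to verify numerically. A throwaway sketch (function name my own): the specific entropy of about $\frac12I^2$ equally likely ground states shrinks rapidly with system size.

```python
import math

def s_degenerate(I):
    """Specific entropy, in units of kB, for roughly I^2/2 equally likely
    system states at the ground state energy: s = ln(I^2/2)/I ~ 2 ln(I)/I."""
    return math.log(I * I / 2) / I

for I in (10**2, 10**6, 10**20):
    print(I, s_degenerate(I))   # vanishes for a large-enough system
```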

On the other hand, if you ignore electromagnetic spin couplings of nuclei in a crystal, it becomes a different matter. Since the nuclear wave functions have no measurable overlap, to any conceivable accuracy the nuclei can assume independent spatial states. That gets rid of the (anti) symmetrization restrictions on their spin. And then the associated entropy can be nonzero. But of course, if the nuclear spin does not interact with anything, you may be able to ignore its existence altogether.

Even if a system has a unique ground state, the third law is not as trivial as it may seem. Thermodynamics deals not with finite systems but with idealized systems of infinite size. A very simple example illustrates why it makes a difference. Consider the possibility of a hypothetical system whose specific entropy depends on the number of particles $I$, temperature $T$, and pressure $P$ as

s_{\rm h.s.}(I,T,P) = \frac{IT}{1+IT}

This system is consistent with the statements of the third law given above: for any given system size $I$, the entropy becomes zero at zero temperature. However, the idealized infinite system always has specific entropy 1; its entropy does not go to zero at zero temperature. The third law should be understood to say that such a hypothetical system does not exist.

If infinite systems seem unphysical, translate it into real-life terms. Suppose your test tube holds, say, $I = 10^{20}$ particles of the hypothetical system instead of infinitely many. Then to reduce the specific entropy from 1 to 0.5 would require the temperature to be reduced to a completely impossible $10^{-20}$ K. And if you double the number of particles in the test tube, you would need another factor-of-two reduction in temperature. In short, while formally the entropy for the finite hypothetical system goes to zero at absolute zero, the temperatures required to do so have no actual meaning.
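These numbers follow directly from the hypothetical formula: the specific entropy equals 0.5 exactly when $IT = 1$, that is, at $T = 1/I$. A short sketch (function name my own):

```python
# Hypothetical specific entropy s = I*T / (1 + I*T) from the text.
def s_hyp(I, T):
    return I * T / (1 + I * T)

I = 10**20
print(s_hyp(I, 1e-30))       # for fixed I, s does go to 0 as T -> 0
print(s_hyp(I, 1.0 / I))     # -> 0.5, but only at the impossible T = 1/I = 1e-20 K
print(s_hyp(2 * I, 1.0 / I)) # twice the particles at that same T: s = 2/3, not 0.5
print(s_hyp(10**40, 1e-20))  # enormous I: s stays essentially 1 at any fixed T > 0
```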