Entropy

For other senses of the term entropy, see entropy (disambiguation).

In thermodynamics and statistical mechanics, the thermodynamic entropy (or simply the entropy) S is a measure of the internal microscopic disorder present in a system; or, equivalently, the number of possible internal configurations available to the system. The entropy can also be understood as the "quality" of heat flowing between two bodies. Entropy measures energy that is no longer available to perform useful work within the current environment. The SI unit of entropy is J/K (joules per kelvin), which is the same as the unit of heat capacity.

In thermodynamics, entropy is a function of the parameters describing the macroscopic state of the system, such as pressure and temperature. In statistical mechanics, entropy is a function of the way the system is distributed over its set of microstates. The wider the set of microstates over which the system fluctuates, the higher its entropy. Since statistical mechanics gives a probabilistic treatment of a system's thermal fluctuations, a higher entropy also means a greater lack of information about the exact microscopic state of the system. As such it has many similarities with Shannon entropy, which appears in information theory.

An important and well-known law of physics, known as the second law of thermodynamics, states that the entropy of an isolated system can never decrease. We will explain the meaning of the "second law" in a subsequent section.

The concept of entropy was originally introduced in 1865 by Rudolf Clausius, in the context of classical thermodynamics. In 1877, Ludwig Boltzmann formulated an alternative definition of entropy as a measure of disorder, which is now regarded as one of the basic postulates of the theory of statistical mechanics. The closely-related concept of information entropy, used in information theory, was introduced by Claude Shannon in 1948.

Statistical entropy

The macroscopic state of the system is defined by a distribution on the microstates that are accessible to the system in the course of its thermal fluctuations. For a quantum system with a discrete set of microstates, if E_i is the energy of microstate i and p_i is the probability that it occurs during the system's fluctuations, then the entropy of the system is

S = -k_B Σ_i p_i ln p_i

The quantity k_B is a physical constant known as Boltzmann's constant, which, like the entropy, has units of heat capacity. The logarithm is dimensionless.
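
As a concrete illustration, the following short Python sketch (not part of the original article; the function name and the example distributions are ours) evaluates the sum above for a discrete probability distribution. A broad, nearly uniform distribution yields a larger entropy than a sharply peaked one.

    import math

    K_B = 1.380649e-23  # Boltzmann's constant in J/K

    def statistical_entropy(probabilities):
        """Return S = -k_B * sum_i p_i ln p_i for a discrete distribution."""
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

    # Four microstates: the uniform distribution gives S = k_B ln 4,
    # while a sharply peaked distribution gives a much smaller entropy.
    print(statistical_entropy([0.25, 0.25, 0.25, 0.25]))
    print(statistical_entropy([0.97, 0.01, 0.01, 0.01]))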

This definition remains valid even far from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system or as a system in exchange with its surroundings. The set of microstates over which the sum is taken is called a statistical ensemble. Each statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, from an isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).

Note that the above expression for the statistical entropy is, apart from the factor of k_B, a discrete version of the Shannon entropy.

Boltzmann's principle

In Boltzmann's definition, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties (or macrostate). To understand what microstates and macrostates are, consider the example of a gas in a container. At a microscopic level, the gas consists of a vast number of freely moving atoms, which occasionally collide with one another and with the walls of the container. The microstate of the system is a description of the positions and momenta of all the atoms. In principle, all the physical properties of the system are determined by its microstate. However, because the number of atoms is so large, the motion of individual atoms is mostly irrelevant to the behavior of the system as a whole. Provided the system is in thermodynamic equilibrium, the system can be adequately described by a handful of macroscopic quantities, called "thermodynamic variables": the total energy E, volume V, pressure P, temperature T, and so forth. The macrostate of the system is a description of its thermodynamic variables.

There are three important points to note. Firstly, to specify any one microstate, we need to write down an impractically long list of numbers, whereas specifying a macrostate requires only a few numbers (E, V, etc.). Secondly, the usual thermodynamic equations only describe the macrostate of a system adequately when this system is in equilibrium; non-equilibrium situations can generally not be described by a small number of variables. For example, if a gas is sloshing around in its container, even a macroscopic description would have to include, e.g., the velocity of the fluid at each different point. In other words, the macroscopic state of the system can be described by a small number of variables only if the system is at global thermodynamic equilibrium. Thirdly, more than one microstate can correspond to a single macrostate. In fact, for any given macrostate, there will be a huge number of microstates that are consistent with the given values of E, V, etc.

We are now ready to provide a definition of entropy. Let Ω be the number of microstates consistent with the given macrostate. The entropy S is defined as

S = k_B ln Ω

where k_B is Boltzmann's constant.

The statistical entropy reduces to Boltzmann's entropy when all the accessible microstates of the system are equally likely. This uniform distribution is also the one that maximizes a system's entropy for a given set of accessible microstates; in other words, it is the macroscopic configuration in which the disorder (or lack of information) is maximal. As such, according to the second law of thermodynamics, it is the equilibrium configuration of an isolated system. Boltzmann's entropy is the expression of the entropy at thermodynamic equilibrium in the micro-canonical ensemble.
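
To see the reduction explicitly (a short derivation, not spelled out in the original text): if the system is equally likely to be in any of its Ω accessible microstates, then p_i = 1/Ω for every i, and the statistical entropy becomes

    S = -k_B Σ_{i=1..Ω} (1/Ω) ln(1/Ω) = -k_B ln(1/Ω) = k_B ln Ω,

which is exactly Boltzmann's expression above.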

This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems in terms of the statistical behaviour of their constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic. One important property of S follows readily from the definition: since Ω is a natural number (1, 2, 3, ...), S is either zero or positive (this is a property of the logarithm).

Disorder and the second law of thermodynamics

We can view Ω as a measure of the disorder in a system. This is reasonable because what we think of as "ordered" systems tend to have very few configurational possibilities, and "disordered" systems have very many. As an illustration of this idea, consider a set of 100 coins, each of which is either heads up or tails up. The macrostates are specified by the total number of heads and tails, whereas the microstates are specified by the facings of each individual coin. For the macrostates of 100 heads or 100 tails, there is exactly one possible configuration, corresponding to the most "ordered" state in which all the coins are facing the same way. The most "disordered" macrostate consists of 50 heads and 50 tails in any order, for which there are 100891344545564193334812497256 (100 choose 50, roughly 1.01 × 10^29) possible microstates.
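
A quick numerical check of this counting (a Python sketch of ours, not part of the original article; math.comb computes the binomial coefficient) shows how sharply the number of microstates, and hence the Boltzmann entropy k_B ln Ω, peaks at the 50/50 macrostate:

    import math

    K_B = 1.380649e-23  # Boltzmann's constant in J/K

    for heads in (100, 99, 75, 50):
        omega = math.comb(100, heads)    # microstates with this many heads up
        entropy = K_B * math.log(omega)  # Boltzmann entropy of this macrostate
        print(heads, omega, entropy)

    # heads = 100 gives omega = 1 and S = 0 (perfect order), while heads = 50
    # gives omega of about 1.01e29, the most disordered macrostate and the largest S.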

Even when a system is entirely isolated from external influences, its microstate is constantly changing. For instance, the particles in a gas are constantly moving, and thus occupy a different position at each moment of time; their momenta are also constantly changing as they collide with each other or with the container walls. Suppose we prepare the system in an artificially highly-ordered equilibrium state. For instance, imagine dividing a container with a partition and placing a gas on one side of the partition, with a vacuum on the other side. If we remove the partition and watch the subsequent behavior of the gas, we will find that its microstate evolves according to some chaotic and unpredictable pattern, and that on average these microstates will correspond to a more disordered macrostate than before. It is possible, but extremely unlikely, for the gas molecules to bounce off one another in such a way that they remain in one half of the container. It is overwhelmingly probable for the gas to spread out to fill the container evenly, which is the new equilibrium macrostate of the system.

This is an illustration of a principle that we will prove rigorously in a subsequent section, known as the Second Law of Thermodynamics. This states that

The total entropy of an isolated system can never decrease.

Since its discovery, the idea that disorder tends to increase has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the Second Law applies only to isolated systems. For example, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. Nevertheless, it has been pointed out that the universe may be considered an isolated system, so that its total disorder should be constantly increasing. We will discuss the implications of this idea in the section on Entropy and cosmology.

It is important to distinguish between "disorder" in the context of entropy and the colloquial sense of the word, which is a vague notion associated with "chaos". The "disorder" to which we refer in this article is a specific, well-defined quantity.

Counting of microstates

In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as coarse graining. In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined. It is defined only up to an additive constant. (As we will see, the thermodynamic definition of entropy is also defined only up to a constant.)
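
The additive-constant ambiguity can be made concrete with a toy calculation (our own illustration, using a single particle on a line rather than a real gas): if the particle's position in a box of length L is coarse-grained into cells of width δx, then Ω = L/δx and S = k_B ln(L/δx), so halving δx shifts S by the constant k_B ln 2 while leaving entropy differences unchanged.

    import math

    K_B = 1.380649e-23  # Boltzmann's constant in J/K
    L = 1.0             # box length (arbitrary units)

    for delta_x in (1e-3, 5e-4):
        omega = L / delta_x                    # number of coarse-grained position cells
        print(delta_x, K_B * math.log(omega))  # the two entropies differ by k_B ln 2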

This ambiguity can be resolved with quantum mechanics. The quantum state of a system can be expressed as a superposition of "basis" states, which can be chosen to be energy eigenstates (i.e. eigenstates of the quantum Hamiltonian.) Usually, the quantum states are discrete, even though there may be an infinite number of them. In quantum statistical mechanics, we can take Ω to be the number of energy eigenstates consistent with the thermodynamic properties of the system.

An important result, known as Nernst's theorem or the third law of thermodynamics, states that the entropy of a system at zero absolute temperature is a well-defined constant. This is due to the fact that a system at zero temperature exists in its lowest-energy state, or ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and (since ln(1) = 0) this means that they have zero entropy at absolute zero. Other systems have more than one state with the same, lowest energy, and have a non-vanishing "zero-point entropy". For instance, ordinary ice has a zero-point entropy of 3.41 J/(mol·K), due to the fact that its underlying crystal structure possesses multiple configurations with the same energy (a phenomenon known as geometrical frustration).
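
The quoted figure can be compared with Pauling's classic counting argument for ice (not reproduced in this article), which assigns roughly 3/2 allowed hydrogen-bond configurations per water molecule and hence a zero-point entropy of about

    S_0 ≈ R ln(3/2) ≈ 8.314 J/(mol·K) × 0.405 ≈ 3.4 J/(mol·K),

in good agreement with the value quoted above.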

Thermodynamic definition of entropy

In this section, we discuss the original definition of entropy, as introduced by Clausius in the context of classical thermodynamics. Clausius defined the change in entropy of a thermodynamic system, during a reversible process in which an amount of heat dQ is introduced at constant absolute temperature T, as

dS = dQ/T

This definition only makes sense once the absolute temperature has been defined; the thermodynamic definition of temperature is given below.

Clausius gave the quantity S the name "entropy", from the Greek word τροπή, "transformation". Since this definition involves only differences in entropy, the entropy itself is only defined up to an arbitrary additive constant.

Heat engines

Clausius' identification of S as a significant quantity was motivated by the study of reversible and irreversible thermodynamic transformations. A thermodynamic transformation is a change in a system's thermodynamic properties, such as temperature and volume. A transformation is reversible if it is quasistatic, meaning that it is infinitesimally close to thermodynamic equilibrium at all times. Otherwise, the transformation is irreversible. To illustrate this, consider a gas enclosed in a piston chamber, whose volume may be changed by moving the piston. If we move the piston slowly enough, the density of the gas is always homogeneous, so the transformation is reversible. If we move the piston quickly, pressure waves are created, so the gas is not in equilibrium, and the transformation is irreversible.

A heat engine is a thermodynamic system that can undergo a sequence of transformations which ultimately return it to its original state. Such a sequence is called a cyclic process, or simply a cycle. During some transformations, the engine may exchange energy with the environment. The net result of a cycle is (i) mechanical work done by the system (which can be positive or negative, the latter meaning that work is done on the engine), and (ii) heat energy transferred from one part of the environment to another. By the conservation of energy, the net energy lost by the environment is equal to the work done by the engine.

If every transformation in the cycle is reversible, the cycle is reversible, and it can be run in reverse, so that the energy transfers occur in the opposite direction and the amount of work done switches sign.

Definition of temperature

In thermodynamics, absolute temperature is defined in the following way. Suppose we have two heat reservoirs, which are systems sufficiently large that their temperatures do not change when energy flows into or out of them, and a reversible cycle that exchanges heat with the two reservoirs. If the cycle absorbs an amount of heat Q from the first reservoir and delivers an amount of heat Q′ to the second, then the respective reservoir temperatures T and T′ obey

Q/T = Q′/T′
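
For instance (illustrative numbers of our own choosing): a reversible cycle that absorbs Q = 300 J from a reservoir at T = 300 K and delivers Q′ = 200 J to a reservoir at T′ = 200 K satisfies Q/T = Q′/T′ = 1 J/K; by conservation of energy, the remaining 100 J appears as work done by the cycle.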

Proof: Introduce an additional heat reservoir at an arbitrary temperature T_0, as well as N reversible cycles with the following property: the j-th such cycle operates between the T_0 reservoir and the T_j reservoir, transferring energy dQ_j to the latter. From the above definition of temperature, the energy extracted from the T_0 reservoir by the j-th cycle is (T_0/T_j) dQ_j.

Now consider one cycle of the heat engine, accompanied by one cycle of each of the smaller cycles. At the end of this process, each of the N reservoirs has zero net energy loss (since the energy extracted by the engine is replaced by the smaller cycles), and the heat engine has done an amount of work equal to the energy extracted from the T_0 reservoir,

W = T_0 Σ_j dQ_j/T_j

If this quantity is positive, this process would be a perpetual motion machine of the second kind, which is impossible. Thus,

Σ_j dQ_j/T_j ≤ 0

Now repeat the above argument for the reverse cycle. The result is

Σ_j dQ_j/T_j = 0 (reversible cycles)

Now consider a reversible cycle in which the engine exchanges heats dQ_1, dQ_2, ..., dQ_N with a sequence of N heat reservoirs with temperatures T_1, ..., T_N. A positive dQ means that energy flows from the reservoir to the engine, and a negative dQ means that energy flows from the engine to the reservoir. We can show (see the proof above) that

Σ_i dQ_i/T_i = 0.

Since the cycle is reversible, the engine is always infinitesimally close to equilibrium, so its temperature is equal to that of any reservoir with which it is in contact. In the limiting case of a reversible cycle consisting of a continuous sequence of transformations,

∮ dQ/T = 0 (reversible cycles)

where the integral is taken over the entire cycle, and T is the temperature of the system at each point in the cycle.
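
The vanishing of this cyclic integral can be checked numerically for the textbook case of an ideal-gas Carnot cycle (an illustration of ours, not part of the original article; it assumes a monatomic ideal gas and uses the standard isothermal and adiabatic relations, which are not derived here):

    import math

    R = 8.314           # gas constant, J/(mol K)
    gamma = 5.0 / 3.0   # heat-capacity ratio of a monatomic ideal gas
    n = 1.0             # amount of gas, mol

    T_hot, T_cold = 500.0, 300.0   # reservoir temperatures, K
    V1, V2 = 1.0e-3, 2.0e-3        # volumes bounding the hot isotherm, m^3

    # The adiabatic legs (dQ = 0) connect the isotherms via T * V**(gamma - 1) = const.
    V3 = V2 * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))
    V4 = V1 * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))

    # Heat absorbed by the gas on each isotherm.
    Q_hot = n * R * T_hot * math.log(V2 / V1)    # positive: absorbed from the hot reservoir
    Q_cold = n * R * T_cold * math.log(V4 / V3)  # negative: rejected to the cold reservoir

    print(Q_hot / T_hot + Q_cold / T_cold)  # ~0: the sum of dQ/T over the cycle vanishes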

Entropy as a state function

We can now deduce an important fact about the entropy change during any thermodynamic transformation, not just a cycle. First, consider a reversible transformation that brings a system from an equilibrium state A to another equilibrium state B. If we follow this with any reversible transformation which returns the system to state A, our above result says that the integral of dQ/T over the combined cycle is zero. This implies that the value of the integral of dQ/T in the first transformation depends only on the initial and final states.

This allows us to define the entropy of any equilibrium state of a system. Choose a reference state R and call its entropy S_R. The entropy of any equilibrium state X is

S_X = S_R + ∫ dQ/T

where the integral is taken along any reversible transformation from state R to state X.

Since the integral is independent of the particular transformation taken, this equation is well-defined.
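
The path independence can be illustrated with a small calculation (our own sketch; it assumes one mole of a monatomic ideal gas with C_V = 3R/2 and uses the standard ideal-gas expressions for the heat absorbed along isothermal, constant-volume and constant-pressure steps):

    import math

    R = 8.314
    n = 1.0
    C_v = 1.5 * R       # constant-volume molar heat capacity, monatomic ideal gas
    C_p = C_v + R       # constant-pressure molar heat capacity

    T1, V1 = 300.0, 1.0e-3   # state A
    T2, V2 = 600.0, 3.0e-3   # state B

    # Path 1: heat at constant volume V1 from T1 to T2, then expand isothermally
    # at T2 from V1 to V2 (for an ideal gas the isothermal heat is n R T ln(V2/V1)).
    dS_1 = n * C_v * math.log(T2 / T1) + n * R * math.log(V2 / V1)

    # Path 2: heat at constant pressure n*R*T1/V1 until the volume reaches V2
    # (the temperature is then Tm = T1 * V2 / V1), then change T at constant volume V2.
    Tm = T1 * V2 / V1
    dS_2 = n * C_p * math.log(Tm / T1) + n * C_v * math.log(T2 / Tm)

    print(dS_1, dS_2)   # equal (about 17.8 J/K): the integral depends only on A and B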

Entropy change in irreversible transformations

We now consider irreversible transformations. It can be shown that the entropy change ΔS during any transformation between two equilibrium states satisfies

ΔS ≥ ∫ dQ/T

where the equality holds if the transformation is reversible.

Notice that if dQ = 0, then ΔS ≥ 0. This is the Second Law of Thermodynamics, which we have discussed earlier.

Suppose a system is thermally and mechanically isolated from the environment. For example, consider an insulating rigid box divided by a movable partition into two volumes, each filled with gas. If the pressure of one gas is higher, it will expand by moving the partition, thus performing work on the other gas. Also, if the gases are at different temperatures, heat can flow from one gas to the other provided the partition is an imperfect insulator. Our above result indicates that the entropy of the system as a whole will increase during these processes (it could in principle remain constant, but this is unlikely). Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of stable equilibrium, since a transformation to any other equilibrium state would cause the entropy to decrease, which is forbidden. Once the system reaches this maximum-entropy state, no part of the system can perform work on any other part. It is in this sense that entropy is a measure of the energy in a system that "cannot be used to do work".
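
As a simple illustration of such an entropy increase (numbers of our own choosing, assuming two identical bodies of constant heat capacity C = 100 J/K in place of the gases): if heat flows through the partition until bodies initially at 400 K and 200 K reach the common temperature 300 K, the total entropy change is

    ΔS = C ln(300/400) + C ln(300/200) = C ln(9/8) ≈ +11.8 J/K,

which is positive, as the Second Law requires.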

Measuring entropy

In real experiments, it is quite difficult to measure the entropy of a system. The techniques for doing so are based on the thermodynamic definition of the entropy, and require extremely careful calorimetry.

For simplicity, we will examine a mechanical system, whose thermodynamic state may be specified by its volume V and pressure P. In order to measure the entropy of a specific state, we must first measure the heat capacity at constant volume and at constant pressure (denoted C_V and C_P respectively), for a successive set of states intermediate between a reference state and the desired state. The heat capacities are related to the entropy S and the temperature T by

C_X = T (∂S/∂T)_X

where the X subscript refers to either constant volume or constant pressure. This may be integrated numerically to obtain a change in entropy:

ΔS = ∫ (C_X/T) dT

We can thus obtain the entropy of any state (P, V) with respect to a reference state (P_0, V_0). The exact formula depends on our choice of intermediate states. For example, if the reference state has the same pressure as the final state,

S(P, V) = S(P, V_0) + ∫ (C_P/T) dT

where the integral runs along the constant-pressure path from the reference temperature T(P, V_0) to the final temperature T(P, V).
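
A minimal numerical sketch of this procedure (ours, not from the article; the heat-capacity function below is a made-up smooth stand-in for real calorimetric data):

    def C_p(T):
        """Hypothetical molar heat capacity at constant pressure, J/(mol K)."""
        return 20.0 + 0.01 * T

    T_ref, T_final, steps = 300.0, 400.0, 1000
    dT = (T_final - T_ref) / steps

    delta_S = 0.0
    for i in range(steps):
        Ta = T_ref + i * dT
        Tb = Ta + dT
        delta_S += 0.5 * (C_p(Ta) / Ta + C_p(Tb) / Tb) * dT  # trapezoidal rule for (C_P/T) dT

    print(delta_S)  # about 20 ln(400/300) + 0.01 * 100 = 6.75 J/(mol K)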

In addition, if the path between the reference and final states lies across any first order phase transition, the latent heat associated with the transition must be taken into account.

The entropy of the reference state must be determined independently. Ideally, one chooses a reference state at an extremely high temperature, at which the system exists as a gas. The entropy in such a state would be that of a classical ideal gas plus contributions from molecular rotations and vibrations, which may be determined spectroscopically. Choosing a low-temperature reference state is sometimes problematic since the entropy at low temperatures may behave in unexpected ways. For instance, a calculation of the entropy of ice using a low-temperature reference state, assuming no entropy at zero temperature, falls short of the value obtained with a high-temperature reference state by 3.41 J/(mol·K). This is due to the "zero-point" entropy of ice mentioned earlier.

The arrow of time

Entropy is the only quantity in the physical sciences that "picks" a particular direction for time, sometimes called an arrow of time. As we go "forward" in time, the Second Law of Thermodynamics tells us that the entropy of an isolated system can only increase or remain the same; it cannot decrease. In contrast, the physical laws governing processes at the microscopic level, such as Newtonian mechanics, do not pick out an arrow of time. Going forward in time, we might see an atom moving to the left, whereas going backward in time, we would see the same atom moving to the right; the behavior of the atom is not qualitatively different in either case. On the other hand, we would be shocked if a gas that originally filled a container evenly spontaneously shrank to occupy only half the container.

The reader may have noticed that the Second Law allows for the entropy remaining the same. If the entropy is constant in either direction of time, there would be no preferred direction. However, the entropy can only be a constant if the system is in the highest possible state of disorder, such as a gas that always was, and always will be, uniformly spread out in its container. The existence of a thermodynamic arrow of time implies that the system is highly ordered in one time direction, which would by definition be the "past".

Unlike most other laws of physics, the Second Law of Thermodynamics is statistical in nature, and its reliability arises from the huge number of particles present in macroscopic systems. It is not impossible, in principle, for all 10^23 atoms in a gas to spontaneously migrate to one half of the container; it is only fantastically unlikely, so unlikely that no macroscopic violation of the Second Law has ever been observed.
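
To put "fantastically unlikely" into numbers (a back-of-the-envelope estimate of ours, treating each particle as equally likely to be found in either half of the container): the fraction of microstates with all N particles in one given half is 2^-N, which is already about 8 × 10^-31 for N = 100 and is inconceivably smaller for N of order 10^23.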

In 1867, James Clerk Maxwell introduced a now-famous thought experiment that highlighted the contrast between the statistical nature of entropy and the deterministic nature of the underlying physical processes. This experiment, known as Maxwell's demon, consists of a hypothetical "demon" that guards a trapdoor between two containers filled with gases at equal temperatures. By allowing fast molecules through the trapdoor in only one direction and only slow molecules in the other direction, the demon raises the temperature of one gas and lowers the temperature of the other, apparently violating the Second Law. Maxwell's thought experiment was only resolved in the 20th century by Leó Szilárd, Charles H. Bennett, and others. The key idea is that the demon itself necessarily possesses a non-negligible amount of entropy that increases even as the gases lose entropy, so that the entropy of the system as a whole increases. This is because the demon has to contain many internal "parts" if it is to perform its job reliably, and therefore has to be considered a macroscopic system with non-vanishing entropy. An equivalent way of saying this is that the information possessed by the demon about which atoms are "fast" or "slow" can be considered a form of entropy known as information entropy.

Unsolved problem in physics:

Arrow of time: Why did the universe have such low entropy in the past, resulting in the distinction between past and future and the second law of thermodynamics?

Many physicists believe that all phenomena that behave differently in one time direction can ultimately be linked to the Second Law of Thermodynamics. This includes the fact that ice cubes melt in hot coffee rather than assembling themselves out of the coffee, that a block sliding on a rough surface slows down rather than speeding up, and that we can remember the past rather than the future. (This last phenomenon, called the "psychological arrow of time", has deep connections with Maxwell's demon and the physics of information.) If the thermodynamic arrow of time is indeed the only arrow of time, then the ultimate reason for a preferred time direction is that the universe as a whole was in a highly ordered state at the Big Bang. The question of why this highly ordered state existed, and how to describe it, remains an area of research.

Entropy and cosmology

We have previously mentioned that the universe may be considered an isolated system. As such, it may be subject to the Second Law of Thermodynamics, so that its total entropy is constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.

If the universe can be considered to have increasing entropy, then, as Roger Penrose has pointed out, an important role in the disordering process is played by gravity, which causes dispersed matter to accumulate into stars, which collapse eventually into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes.

The role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamical model to the universe in general. Although entropy does increase in an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.

Entropy in literature

  • Isaac Asimov's The Last Question, a short science fiction story about entropy
  • Thomas Pynchon, an American author who deals with entropy in many of his novels

References

  • Fermi, E., Thermodynamics, Prentice Hall (1937)
  • ISBN 0716710889
  • Penrose, Roger, The Road to Reality: A Complete Guide to the Laws of the Universe (2005), ISBN 0679454438
  • Reif, F., Fundamentals of statistical and thermal physics, McGraw-Hill (1965)
  • Rifkin, Jeremy, Entropy, Viking (1980)