History of entropy

From Wikipedia, the free encyclopedia
Content deleted Content added
extracting location on {{cite book}}
→‎References: using references-small per MediaWiki:common.css
Line 44: Line 44:


== References ==
== References ==
<div style="font-size:85%">
<div class="references-small">
#{{note|Mendoza}} {{cite book|author=Mendoza, E. |title=Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius|location=New York | publisher=Dover Publications |year=1988|id=ISBN 0486446417}}
#{{note|Mendoza}} {{cite book|author=Mendoza, E. |title=Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius|location=New York | publisher=Dover Publications |year=1988|id=ISBN 0486446417}}
#{{note|tribus}} M. Tribus, E.C. McIrvine, “Energy and information”, Scientific American, 224 (September 1971).
#{{note|tribus}} M. Tribus, E.C. McIrvine, “Energy and information”, Scientific American, 224 (September 1971).

Revision as of 07:00, 8 May 2006

The history of entropy, essentially, is the development of the ideas set forth to understand theoretically why a certain amount of the usable energy released from combustion reactions is always lost to dissipation or friction, i.e. rendered unusable. In this direction, in 1698 the engineer Thomas Savery built the first steam engine. Others soon followed, such as the Newcomen engine (1712) and the Cugnot steam tricycle (1769). These early engines, however, were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost into what seemed like a state of immeasurable randomness. Over the next two centuries, physicists chipped away at this puzzle of lost energy; the result was the concept of entropy.

Classical thermodynamic views

In 1803, the mathematician Lazare Carnot published a work entitled Fundamental Principles of Equilibrium and Movement. This work includes a discussion of the efficiency of fundamental machines, i.e. pulleys and inclined planes. Lazare looked past the details of particular mechanisms to develop a general discussion of the conservation of mechanical energy. Over the next three decades, the term “Carnot’s theorem” was taken as the statement that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity, i.e. of the useful work done. From this Lazare drew the inference that perpetual motion was impossible. This loss of moment of activity was the first-ever rudimentary statement of what is now known as entropy: energy lost to dissipation and friction.[1]

In 1823, Lazare Carnot died in exile. The following year, in 1824, Lazare’s son Sadi Carnot, having graduated from the École Polytechnique training school for engineers but living on half-pay with his brother Hippolyte in a small apartment in Paris, wrote Reflections on the Motive Power of Fire. In this work, Sadi visualized an ideal engine in which any caloric converted into work could be reinstated by reversing the motion of the cycle, a concept subsequently known as thermodynamic reversibility. Sadi further postulated, however, that some caloric is always lost rather than converted into mechanical work; hence no real heat engine could realize the Carnot cycle’s reversibility, and every real engine was condemned to be less efficient. Building on his father’s work, Sadi thus advanced the concept that “some caloric is always lost”. This lost caloric was a precursor of entropy as we now know it. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics.

Statistical thermodynamic views

In 1850, a mathematical formulation of entropy was first introduced in the context of classical thermodynamics by Rudolf Clausius in his analysis of Sadi Carnot's 1824 work on thermodynamic efficiency. It was not until 1865, however, that Clausius singled the quantity out and gave it the name “entropy”, derived from the Greek word τροπή (tropē), meaning “transformation”.

In 1877, Ludwig Boltzmann formulated the alternative definition of entropy S, defined as:

S = kB ln Ω

where

kB is Boltzmann's constant and
Ω is the number of microstates consistent with the given macrostate.

Boltzmann saw entropy as a measure of statistical "mixedupness" or disorder. This concept was soon refined by J. Willard Gibbs, and is now regarded as one of the cornerstones of the theory of statistical mechanics.
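
For a rough sense of scale (an illustrative calculation, not one drawn from Boltzmann's own work), a system with Ω = 10^23 equally probable microstates has

S = kB ln(10^23) = 23 kB ln 10 ≈ 7.3 × 10^−22 J/K,

taking kB ≈ 1.38 × 10^−23 J/K.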

Information theory views

An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories, electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. It seems that Shannon was initially not particularly aware of the close similarity between his new quantity and the earlier work in thermodynamics. In 1949, however, after he had been working on his equations for some time, he happened to visit the mathematician John von Neumann, and the two discussed what Shannon should call the “measure of uncertainty”, or attenuation, in phone-line signals in his new information theory. Accounts of the conversation vary. According to one source, Shannon recalled:[2]

My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’

According to another source, when von Neumann asked him how he was getting on with his information theory, Shannon replied:[3]

The theory was in excellent shape, except that he needed a good name for “missing information”. “Why don’t you call it entropy”, von Neumann suggested. “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”

Shannon's information entropy is a much more general concept than statistical thermodynamic entropy. Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution; so in general it has no connection to any thermodynamic entropy. However, as E. T. Jaynes championed in a series of papers starting in 1957, the statistical thermodynamic entropy can be seen as just a particular application of Shannon's information entropy to the particular set-up of statistical mechanics, a view many physicists find helpful.
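
To make the correspondence Jaynes championed concrete (an illustrative sketch rather than part of the historical account), Shannon's entropy of a probability distribution p1, …, pΩ and the Gibbs entropy of statistical mechanics take the forms

H = −Σ pi log pi   and   S = −kB Σ pi ln pi,

with the sums running over the possible states i. When all Ω microstates are equally probable, pi = 1/Ω and the Gibbs expression reduces to S = kB ln Ω, Boltzmann's formula; on this reading, thermodynamic entropy is Shannon's information entropy applied to the distribution over microstates, expressed in natural logarithms and scaled by kB.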

Recent views

In recent years, there has been a push, especially with regard to the nature of evolution and human life, to accommodate an anthropic entropy perspective. One such application is the concept of corporate entropy, put forward somewhat humorously by the authors Tom DeMarco and Timothy Lister in Peopleware, their classic book on growing and managing productive teams and successful software projects. Here, they view energy waste as red tape and business-team inefficiency as a form of entropy, i.e. energy lost to waste. In the second edition (1999) of the book, they take a hard, incisive look at people, teams and their surroundings, and in doing so formulate theories on how to fight corporate entropy. The concept has caught on and is now common jargon in business schools.

Terminology overlap

When necessary, to disambiguate between the statistical thermodynamic concept of entropy, and entropy-like formulae put forward by different researchers, the statistical thermodynamic entropy is most properly referred to as the Gibbs entropy. The terms Boltzmann-Gibbs entropy or BG entropy, and Boltzmann-Gibbs-Shannon entropy or BGS entropy are also seen in the literature.

References

  1. ^ Mendoza, E. (1988). Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius. New York: Dover Publications. ISBN 0486446417.
  2. ^ Tribus, M.; McIrvine, E. C. (1971). “Energy and information”. Scientific American, 224 (September 1971).
  3. ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 9812384006.