Entropy (energy dispersal): Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 00:14, 23 July 2013

The description of entropy as energy dispersal provides an introductory method of teaching the thermodynamic concept of entropy. In physics and physical chemistry, entropy has commonly been defined as a scalar measure of the disorder of a thermodynamic system. This newer approach sets out a variant approach to entropy, namely as a measure of energy dispersal or distribution at a specific temperature. Under this approach, changes in entropy can be quantitatively related to the distribution or the spreading out of the energy of a thermodynamic system, divided by its temperature.

The energy dispersal approach to teaching entropy was developed to facilitate teaching entropy to students beginning university chemistry and biology. This new approach also avoids ambiguous terms such as disorder and chaos, which have multiple everyday meanings.

Problem: entropy as disorder is hard to teach

The term "entropy" has been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels.

Such descriptions have tended to be used together with commonly used terms such as disorder and chaos which are ambiguous, and whose everyday meaning is the opposite of what they are intended to mean in thermodynamics. Not only does this situation cause confusion, but it also hampers the teaching of thermodynamics. Students were being asked to grasp meanings directly contradicting their normal usage, with equilibrium being equated to "perfect internal disorder" and the mixing of milk in coffee from apparent chaos to uniformity being described as a transition from an ordered state into a disordered state.[1]

The description of entropy as the amount of "mixedupness" or "disorder," as well as the abstract nature of the statistical mechanics grounding this notion, can lead to confusion and considerable difficulty for those beginning the subject.[2][3] Even though courses emphasised microstates and energy levels, most students could not get beyond simplistic notions of randomness or disorder. Many of those who learned by practising calculations did not understand well the intrinsic meanings of equations, and there was a need for qualitative explanations of thermodynamic relationships.[4][5]

Solution: entropy as energy dispersal

Entropy can be described in terms of "energy dispersal" and the "spreading of energy," while avoiding all mention of "disorder" and "chaos" except when explaining misconceptions. All statements about where and how energy is dispersing or spreading have been recast in terms of energy dispersal, so as to emphasise the underlying qualitative meaning.[2]

In this approach, the second law of thermodynamics is introduced as "Energy spontaneously disperses from being localized to becoming spread out if it is not hindered from doing so", in the context of common experiences such as a rock falling, a hot frying pan cooling down, iron rusting, air leaving a punctured tyre and ice melting in a warm room. Entropy is then depicted as a sophisticated kind of "before and after" yardstick: it measures how much energy is spread out over time as a result of a process such as heating a system, or how widely spread out the energy is after something happens in comparison with its previous state, as in gas expansion or the mixing of fluids at a constant temperature. The equations are explored with reference to these common experiences, with emphasis that in chemistry the energy that entropy measures as dispersing is the internal energy of molecules.
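The "before and after" yardstick described above can be made quantitative with the standard textbook formulas, which the article itself does not state; the following sketch assumes the usual ideal cases of heating at constant pressure (ΔS = nCp ln(T2/T1)) and isothermal ideal-gas expansion (ΔS = nR ln(V2/V1)):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_s_heating(n, cp, t1, t2):
    """Entropy change when n moles with constant molar heat
    capacity cp (J/(mol*K)) go from temperature t1 to t2 (K)."""
    return n * cp * math.log(t2 / t1)

def delta_s_expansion(n, v1, v2):
    """Entropy change for isothermal ideal-gas expansion from
    volume v1 to v2: the same energy spread over more space."""
    return n * R * math.log(v2 / v1)

# A hot pan (0.5 mol of iron, cp ~ 25 J/(mol*K)) cooling from 400 K to 300 K:
print(delta_s_heating(0.5, 25.0, 400, 300))  # negative: energy leaves the pan
# One mole of gas doubling its volume at constant temperature:
print(delta_s_expansion(1.0, 1.0, 2.0))      # positive: energy more spread out
```

The signs match the qualitative picture: energy dispersing out of the pan lowers its entropy (the surroundings gain more), while the expanding gas spreads its unchanged internal energy over a larger volume.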

The statistical interpretation is related to quantum mechanics in describing the way that energy is distributed (quantized) amongst molecules on specific energy levels, with all the energy of the macrostate always in only one microstate at one instant. Entropy is described as measuring the energy dispersal for a system by the number of accessible microstates, the number of different arrangements of all its energy at the next instant. Thus, an increase in entropy means a greater number of microstates for the final state than for the initial state, and hence more possible arrangements of a system's total energy at any one instant. Here, the greater 'dispersal of the total energy of a system' means the existence of many possibilities.[citation needed][6]
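The link between microstate counting and entropy in the paragraph above is the Boltzmann relation S = k·ln W (standard statistical mechanics, not spelled out in the article). A minimal sketch, assuming the textbook example of a free expansion that doubles the volume and hence doubles each particle's accessible positions:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """S = k_B * ln(W) for W equally probable microstates."""
    return K_B * math.log(w)

# Doubling the volume gives each of N particles twice as many
# accessible positions, so W grows by a factor of 2**N.
N = 6.022e23  # one mole of particles
delta_s = N * K_B * math.log(2)  # = k_B * ln(2**N), summed per particle
print(delta_s)  # about 5.76 J/K, i.e. R*ln 2 for one mole
```

A greater number of accessible microstates thus translates directly into a larger entropy, consistent with "more possible arrangements of a system's total energy at any one instant".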

Continuous movement and molecular collisions, visualised as being like bouncing balls blown by air as in a lottery machine, can then lead on to showing the possibilities of many Boltzmann distributions and a continually changing "distribution of the instant", and on to the idea that when the system changes, dynamic molecules will have a greater number of accessible microstates. In this approach, all everyday spontaneous physical happenings and chemical reactions are depicted as involving some type of energy flow from being localized or concentrated to becoming spread out into a larger space, always to a state with a greater number of microstates.[7]
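The Boltzmann distributions mentioned above (populations proportional to exp(−E/kT), a standard result not derived in the article) can be sketched numerically; the energy values and temperatures below are illustrative, not from the article:

```python
import math

def boltzmann_populations(levels, kT):
    """Fractional populations p_i proportional to exp(-E_i / kT)."""
    weights = [math.exp(-e / kT) for e in levels]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def gibbs_entropy(ps):
    """Entropy (in units of k_B) of a probability distribution."""
    return -sum(p * math.log(p) for p in ps)

levels = [0.0, 1.0, 2.0]  # energy levels, arbitrary units
cold = boltzmann_populations(levels, kT=0.5)
hot = boltzmann_populations(levels, kT=5.0)

# At the higher temperature the energy is spread over more levels,
# so the distribution's entropy is larger:
print(gibbs_entropy(cold) < gibbs_entropy(hot))  # True
```

At low temperature nearly all molecules sit in the ground level; at high temperature the populations flatten out, illustrating energy spreading over more accessible levels.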

This approach provides a good basis for understanding the conventional approach, except in very complex cases where the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot.[7] Thus, in situations such as the entropy of mixing, when two or more different substances at the same temperature and pressure are mixed so that there is no net exchange of heat or work, the entropy increase is due to the literal spreading out of the motional energy of each substance in the larger combined final volume. Each component's energetic molecules become more separated from one another than in the pure state, where they collided only with identical adjacent molecules, leading to an increase in the number of accessible microstates.[8]
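The entropy of mixing described above follows the standard ideal-mixing formula ΔS = −R Σ nᵢ ln xᵢ (textbook result, not stated in the article); a minimal sketch:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_of_mixing(moles):
    """Ideal entropy of mixing for components at the same T and P:
    dS = -R * sum(n_i * ln(x_i)) over mole fractions x_i.
    Positive because each component's motional energy spreads
    into the larger combined final volume."""
    total = sum(moles)
    return -R * sum(n * math.log(n / total) for n in moles)

# Mixing one mole each of two different ideal gases:
print(entropy_of_mixing([1.0, 1.0]))  # 2*R*ln 2, about 11.53 J/K
```

Note the formula depends only on mole fractions, consistent with the text's point that the increase comes from each component's energy occupying the larger volume, not from any heat exchange.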

Variants of the energy dispersal approach have been adopted in a number of undergraduate chemistry texts[citation needed], mainly in the United States. An advanced text, Physical Chemistry, 8th edition, by Peter Atkins of Oxford University and Julio De Paula, says: "The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and 'the dispersal of matter and energy' that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of microstates associated with the same total energy." (p. 81)[9]

Websites have made the energy dispersal approach accessible not only to students of chemistry but also to lay readers seeking a basic intuitive understanding of thermodynamic entropy; one such page sets out the qualitative simplicity of the notion of entropy.[7]

The energy dispersal approach has been criticised by Arieh Ben-Naim.[10]

History of energy dispersal

The concept of "energy dispersal" as a description of entropy appeared in William Thomson's (Lord Kelvin) 1852 article "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy."[11] Thomson distinguished between two types or "stores" of mechanical energy: "statical" and "dynamical." He discussed how these two types of energy can change from one form to the other during a thermodynamic transformation. When heat is created by any irreversible process (such as friction), or when heat is diffused by conduction, mechanical energy is dissipated, and it is impossible to restore the initial state.[12][13]

In the mid-1950s, with the development of quantum theory, researchers began speaking about entropy changes in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels, such as by the reactants and products of a chemical reaction.[14]

In 1984, the Oxford physical chemist Peter Atkins, in The Second Law, a book written for laypersons, presented a nonmathematical interpretation of what he called the "infinitely incomprehensible entropy" in simple terms, describing the second law of thermodynamics as "energy tends to disperse". His analogies included an imaginary intelligent being called "Boltzmann's Demon", who runs around reorganizing and dispersing energy, to show how the W in Boltzmann's entropy formula relates to energy dispersal. This dispersal is transmitted via atomic vibrations and collisions. Atkins wrote: "each atom carries kinetic energy, and the spreading of the atoms spreads the energy…the Boltzmann equation therefore captures the aspect of dispersal: the dispersal of the entities that are carrying the energy."[15]

Stanley Sandler, in his 1989 Chemical and Engineering Thermodynamics, described how given any thermodynamic process, a quantity TS can be interpreted as the amount of mechanical energy that has been converted into thermal energy by viscous dissipation, dispersion, and other system irreversibilities.[16] In 1997, John Wrigglesworth described spatial particle distributions as represented by distributions of energy states. According to the second law of thermodynamics, isolated systems will tend to redistribute the energy of the system into a more probable arrangement or a maximum probability energy distribution, i.e. from that of being concentrated to that of being spread out. By virtue of the First law of thermodynamics, the total energy does not change; instead, the energy tends to disperse from a coherent to a more incoherent state.[17] In his 1999 Statistical Thermodynamics, M.C. Gupta defined entropy as a function that measures how energy disperses when a system changes from one state to another.[18] Other authors defining entropy in a way that embodies energy dispersal are Cecie Starr[19] and Andrew Scott.[20]

In a 1996 article, the physicist Harvey S. Leff set out what he called "the spreading and sharing of energy."[21] Another physicist, Daniel F. Styer, published an article in 2000 showing that "entropy as disorder" was inadequate.[22] In an article published in the 2002 Journal of Chemical Education, Frank L. Lambert argued that portraying entropy as "disorder" is confusing and should be abandoned. He has gone on to develop detailed resources for chemistry instructors, equating entropy increase as the spontaneous dispersal of energy, namely how much energy is spread out in a process, or how widely dispersed it becomes – at a specific temperature.[2][23]

References

  1. ^ Microsoft Encarta 2006. © 1993–2005 Microsoft Corporation. All rights reserved.
  2. ^ a b c Frank L. Lambert, 2002, "Disorder--A Cracked Crutch for Supporting Entropy Discussions," Journal of Chemical Education 79: 187. Updated version available online.
  3. ^ Frank L. Lambert, "The Second Law of Thermodynamics (6)."
  4. ^ Carson, E. M., and Watson, J. R., (Department of Educational and Professional Studies, Kings College, London), 2002, "Undergraduate students' understandings of entropy and Gibbs Free energy," University Chemistry Education - 2002 Papers, Royal Society of Chemistry.
  5. ^ Sozbilir, Mustafa, PhD studies: Turkey, A Study of Undergraduates' Understandings of Key Chemical Ideas in Thermodynamics, Ph.D. Thesis, Department of Educational Studies, The University of York, 2001.
  6. ^ Frank L. Lambert, The Molecular Basis for Understanding Simple Entropy Change
  7. ^ a b c Frank L. Lambert, Entropy is simple, qualitatively
  8. ^ Frank L. Lambert, Notes for a “Conversation About Entropy”: a brief discussion of both thermodynamic and "configurational" ("positional") entropy in chemistry.
  9. ^ Atkins, Peter, and Julio De Paula (2006). Physical Chemistry, 8th edition. Oxford University Press. ISBN 0-19-870072-5.
  10. ^ Review of "Entropy and the second law: interpretation and misss-interpretationsss" in Chemistry World
  11. ^ Jensen, William. (2004). "Entropy and Constraint of Motion." Journal of Chemical Education (81) 693, May
  12. ^ Thomson, William (1852). "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy." Proceedings of the Royal Society of Edinburgh, April 19.
  13. ^ Thomson, William (1874). "Kinetic Theory of the Dissipation of Energy", Nature IX: 441-44. (April 9).
  14. ^ Denbigh, Kenneth (1981). The Principles of Chemical Equilibrium, 4th Ed. Cambridge University Press. ISBN 0-521-28150-4.
  15. ^ Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.
  16. ^ Sandler, Stanley I. (1989). Chemical and Engineering Thermodynamics. John Wiley & Sons. ISBN 0-471-83050-X.
  17. ^ Wrigglesworth, John (1997). Energy and Life (Modules in Life Sciences). CRC. ISBN 0-7484-0433-3. (see excerpt)
  18. ^ Gupta, M.C. (1999). Statistical Thermodynamics. New Age Publishers. ISBN 81-224-1066-9. (see excerpt)
  19. ^ Starr, Cecie (1992). Biology - the Unity and Diversity of Life. Wadsworth Publishing Co. ISBN 0-534-16566-4.
  20. ^ Scott, Andrew (2001). 101 Key ideas in Chemistry. Teach Yourself Books. ISBN 0-07-139665-9.
  21. ^ Leff, H. S., 1996, "Thermodynamic entropy: The spreading and sharing of energy," Am. J. Phys. 64: 1261-71.
  22. ^ Styer D. F., 2000, Am. J. Phys. 68: 1090-96.
  23. ^ Frank L. Lambert, "A Student's Approach to the Second Law and Entropy."

Further reading

Texts using the energy dispersal approach

  • Atkins, P. W., Physical Chemistry for the Life Sciences. Oxford University Press, ISBN 0-19-928095-9; W. H. Freeman, ISBN 0-7167-8628-1
  • Benjamin Gal-Or, "Cosmology, Physics and Philosophy", Springer-Verlag, New York, 1981, 1983, 1987 ISBN 0-387-90581-2
  • Bell, J., et al., 2005. Chemistry: A General Chemistry Project of the American Chemical Society, 1st ed. W. H. Freeman, 820pp, ISBN 0-7167-3126-6
  • Brady, J.E., and F. Senese, 2004. Chemistry, Matter and Its Changes, 4th ed. John Wiley, 1256pp, ISBN 0-471-21517-1
  • Brown, T. L., H. E. LeMay, and B. E. Bursten, 2006. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, ISBN 0-13-109686-9
  • Ebbing, D.D., and S. D. Gammon, 2005. General Chemistry, 8th ed. Houghton-Mifflin, 1200pp, ISBN 0-618-39941-0
  • Ebbing, Gammon, and Ragsdale. Essentials of General Chemistry, 2nd ed.
  • Hill, Petrucci, McCreary and Perry. General Chemistry, 4th ed.
  • Kotz, Treichel, and Weaver. Chemistry and Chemical Reactivity, 6th ed.
  • Moog, Spencer, and Farrell. Thermodynamics, A Guided Inquiry.
  • Moore, J. W., C. L. Stanistski, P. C. Jurs, 2005. Chemistry, The Molecular Science, 2nd ed. Thompson Learning. 1248pp, ISBN 0-534-42201-2
  • Olmsted and Williams, Chemistry, 4th ed.
  • Petrucci, Harwood, and Herring. General Chemistry, 9th ed.
  • Silberberg, M.S., 2006. Chemistry, The Molecular Nature of Matter and Change, 4th ed. McGraw-Hill, 1183pp, ISBN 0-07-255820-2
  • Suchocki, J., 2004. Conceptual Chemistry 2nd ed. Benjamin Cummings, 706pp, ISBN 0-8053-3228-6