Former good article Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.
This article is of interest to the following WikiProjects:
WikiProject Physics (rated B-class, Top-importance)
WikiProject Chemistry (rated B-class, High-importance)
WikiProject Mathematics (rated B-class, High-importance; field: Mathematical physics). One of the 500 most frequently viewed mathematics articles.
This article has been mentioned by a media organisation:

deeply muddled

The lead of this article is deeply muddled in concept. It needs radical revision. A word or two here or there are as drops in the ocean. Chjoaygame (talk) 16:19, 26 June 2014 (UTC)

Agreed. W. P. Uzer (talk) 17:12, 26 June 2014 (UTC)

How about..

In thermodynamics, entropy (usual symbol S) is a measure of the disorder of an identified system of particles with a given total energy. It directly relates to the probability that the system is in a particular state compared with all possible states or arrangements. According to the second law of thermodynamics, the entropy of real isolated systems always increases; such systems spontaneously decay towards thermodynamic equilibrium, the configuration with maximum entropy or disorder. The result of increasing entropy is to dissipate energy evenly throughout the system, so pressure, temperature, chemical potential, mechanical energy (motion), etc. get smoothed out. The total energy of the system is not changed, but the quality, or ability of the system to do work, is reduced, ultimately to zero at equilibrium. Theoretical ideal systems with no energy conversion to heat, due say to friction, maintain constant entropy. Systems which are not isolated may decrease in entropy when subject to an input which has the effect of creating order, like the freezer box in a refrigerator. Since entropy is a state function, the change in the entropy of a system is the same for any process going from a given initial state to a given final state, whether the process is reversible (ideal) or irreversible (real). However, irreversible processes increase the combined entropy of the system and its environment.

Gyroman (talk) 05:07, 28 June 2014 (UTC)
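The state-function point in the proposed wording above can be checked numerically. Below is a minimal sketch (illustrative values, not article text; one mole of ideal gas doubling its volume at constant temperature is assumed): the system's entropy change is the same for a reversible isothermal expansion and for an irreversible free expansion, but only the irreversible path increases the combined entropy of system and surroundings.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def delta_S_isothermal(n, V1, V2):
    """Entropy change of n mol of ideal gas between volumes V1 and V2
    at constant temperature: dS = n R ln(V2/V1). Being a state function,
    it is the same for any path between the two states."""
    return n * R * math.log(V2 / V1)

dS_sys = delta_S_isothermal(1.0, 1.0, 2.0)  # doubling the volume

# Reversible path: heat Q = T dS flows in from the surroundings, whose
# entropy falls by exactly the same amount.
dS_surr_reversible = -dS_sys
# Irreversible free expansion: no heat exchanged, surroundings unchanged.
dS_surr_free = 0.0

print(round(dS_sys, 3))                       # 5.763 J/K on either path
print(round(dS_sys + dS_surr_reversible, 3))  # 0.0: reversible keeps the total constant
print(round(dS_sys + dS_surr_free, 3))        # 5.763: irreversible increases the total
```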

This is just as deeply muddled. Chjoaygame (talk) 08:45, 28 June 2014 (UTC)
I stated what is confusing and misleading about the use of "evolve"; the only answer given was that certain authors use it that way. I stated why it is confusing and misleading, and the same applies to those authors. Well, now it's your turn. OK, tell us all: exactly what is muddled, incorrect or missing from the above proposal? Gyroman (talk) 01:58, 1 July 2014 (UTC)
Thank you for this kind invitation. Sad to say, I have to decline it. Chjoaygame (talk) 06:07, 1 July 2014 (UTC)
For me, I think there needs to be more considered explanation of what entropy actually is, before talking in detail about the second law of thermodynamics and so on. W. P. Uzer (talk) 07:32, 1 July 2014 (UTC)
It's interesting that the same comment (re entropy: what is it?) appears on the talk page of the Introduction to Entropy article. You'll know most of this, but please bear with me. This is what I think is the source of the confusion:
(1) There are two equations for entropy. The first, discovered by Rudolf Clausius in the mid-19th century, is actually the change in entropy of a SYSTEM, calculated as the integral of heat flow divided by the absolute temperature at the boundary of a subsystem, to or from its surroundings, both contained within the whole SYSTEM. What is not so apparent is that differences in macroscopic properties such as temperature or pressure between subsystems within a larger system represent an ordered state, as opposed to a disordered state where the energy is smoothly distributed (= equilibrium). The point is that only an ordered system, with differences in pressure or temperature between its subsystems, allows us to get any work from the system. As you know, the amount of work can be calculated from a temperature-entropy diagram as an area ∑(∆T × ∆s) for a process, like for a steam engine.
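As an illustration of the Clausius form (a sketch with assumed, illustrative values: 1 kg of water heated from 20 °C to 100 °C), the integral of dQ/T can be evaluated numerically and compared with the closed form m c ln(T2/T1):

```python
import math

def clausius_entropy_change(m, c, T1, T2, steps=10000):
    """Numerically integrate dS = dQ/T for heating a body of mass m (kg)
    with constant specific heat c (J/(kg K)) from T1 to T2 (kelvin)."""
    dT = (T2 - T1) / steps
    S, T = 0.0, T1
    for _ in range(steps):
        dQ = m * c * dT           # heat added in this small step
        S += dQ / (T + 0.5 * dT)  # divide by the midpoint temperature
        T += dT
    return S

m, c = 1.0, 4186.0       # roughly 1 kg of water
T1, T2 = 293.15, 373.15  # 20 C to 100 C
numeric = clausius_entropy_change(m, c, T1, T2)
closed_form = m * c * math.log(T2 / T1)  # m c ln(T2/T1)
print(round(numeric), round(closed_form))  # both about 1010 J/K
```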
(2) The second equation, by Ludwig Boltzmann, discovered around the turn of the 20th century, calculates the absolute entropy of a SYSTEM, and this is the general or fundamental equation for entropy. It directly measures how disordered the system is. The equation S = k ln W should be as well known as E = mc^2; that it is not is a great failing in science education. The Clausius equation is easily derived from the Boltzmann equation by making just one simplifying assumption (i.e. the system is already at equilibrium, or W = Wmax). This derivation must never be done the other way round! The reason is that calculation of entropy change by heat transfer across a subsystem boundary is NOT a complete statement of the entropy of a system. This is clear from the Boltzmann equation: "W" is just a number, and "k" is the Boltzmann constant, which is required to give "S" some units. We MUST note that "order" includes logical states (not all of which can be accounted for by heat transfer), as well as physical states. Now "W" is the number of microstates (logical arrangements, or 'phase space' if every particle is identified and each is exchanged with every other) in a given macrostate (the set of physical particle properties available). Clearly "W" is incalculable for any real system: just one litre of gas gives a logical phase space of microstates exceeding the size of the universe!
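Boltzmann's S = k ln W can be made concrete with a toy model (a sketch under an assumed model, not a real calculation: N labelled particles, each in the left or right half of a box, so the macrostate "n on the left" has W = C(N, n) microstates):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """Absolute entropy S = k ln W, where W counts the microstates
    compatible with a given macrostate."""
    return k_B * math.log(W)

N = 100
W_even   = math.comb(N, N // 2)  # particles spread evenly: the most microstates
W_sorted = math.comb(N, N)       # all particles on one side: W = 1

print(boltzmann_entropy(W_sorted))  # 0.0: one microstate, a perfectly ordered state
print(boltzmann_entropy(W_even) > boltzmann_entropy(W_sorted))  # True
```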
(3) When the absolute entropy of a system and the term ORDER are inadequately defined or left out, you end up with a gaping hole in your definition of the second law. Just saying 'entropy increases' or 'never decreases' (which needs more definitions: 'real', 'ideal', 'reversible', 'irreversible') makes little sense if entropy has not been comprehensively defined. Entropy is not about heat or energy; it's all about ORDER. 'W' is a probability term, and ordered means less probable. Energy dissipation is a RESULT of the decay in ORDER, or increase in PROBABILITY, and with this understanding a clearer definition of the second law results: "Isolated systems tend to move to their most probable state".
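The closing definition ("isolated systems tend to move to their most probable state") can be illustrated with the same toy box model (a sketch, with all 2^N microstates assumed equally likely): near-even macrostates carry almost all of the probability once N is large.

```python
import math

def prob_within(N, frac):
    """Probability that the fraction of N particles found on the left lies
    within +/- frac of one half, with all 2**N microstates equally likely."""
    lo = math.floor(N * (0.5 - frac))
    hi = math.ceil(N * (0.5 + frac))
    count = sum(math.comb(N, n) for n in range(lo, hi + 1))
    return count / 2 ** N

print(round(prob_within(100, 0.1), 3))    # ~0.965 even for just 100 particles
print(prob_within(1000, 0.1) > 0.999999)  # True: deviations become vanishingly rare
```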
The emphasis on Rudolf Clausius to the exclusion of Ludwig Boltzmann is at the root of the problem. Gyroman (talk) 14:31, 1 July 2014 (UTC)
This is a good point. Perhaps the article would be clearer if it were re-organized into two distinct parts: (1) Entropy change, including Clausius and the Second Law, and (2) Absolute entropy, including Boltzmann, the Third Law, information theory, etc. Dirac66 (talk) 15:46, 1 July 2014 (UTC)
Agreed, that the difference between relative entropy as defined by Clausius and absolute entropy as fixed by the third law is important.
But I think it important also to distinguish explicitly between macroscopic entropy, as defined by Clausius and the third law, and microscopic or statistical entropy, as defined by probability and information theory. I think it unwise to try to elide this distinction, because such an elision hides from the student an important aspect of the logical structure of the situation. The distinction having been made, it is then in order to show the linkage.
I do not agree that the word "order" should be enshrined as the key to entropy. At least one expert tackles this oft used word in this context, Walter T. Grandy, in Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, 2008, ISBN 978-0-19-954617-6, p. 56. It is ok to talk about order, but not to enshrine it.
I think that the here-relevant characteristic feature of thermodynamic equilibrium is its time invariance. It is true that Planck considers homogeneity to be characteristic. But I would say that such homogeneity should be time-invariant; I think Planck assumes this without saying so. A fluctuation from a non-equilibrium state could lead to instantaneous homogeneity. Repeated measurements on a state of thermodynamic equilibrium may show fluctuations, but they must be statistically time-invariant for the equilibrium state. Chjoaygame (talk) 18:07, 1 July 2014 (UTC)
So no comment from anyone else, all 460+ watchers! You're all happy to have a "deeply muddled", degraded presentation? I am still going through the references; however, I do not find any attempt to explain the intended meaning of "evolve" concerning isolated thermodynamic systems. Rather, it seems to me it is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (i.e. 'any change in gene frequency' as opposed to an 'average increase in complexity of genes'; and further still into the mire, we have the complete absence of the meaning of 'complexity' itself). Clearly there is a significant group of scientists fighting the claims of another group of scientists as to the real implications of the second law for evolution. This is the sort of thing which brings professional 'bias' into science. Please STOP there; it's NOT RELEVANT here.
What is relevant is that in the context of an encyclopedia there is no justification for using any word contrary to its dictionary definition, and such 'hidden' meanings will confuse and mislead. That is what needs addressing, and it is singularly unsatisfactory to claim certain scientists 'express' it that way; no, it's bias! All this over a single word; it is obviously of much higher importance to you editors than you are admitting in this discussion. Sorry, but that leads me to conclude you are more part of the problem than the solution. So I think we are heading for some sort of independent resolution, and I am left with no alternative but to change it and force the issue.
Done Gyroman (talk) 00:49, 23 July 2014 (UTC)
Dear Gyroman, sad to say you are mistaken. You tell us that you think that the use of the word 'evolve' "is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (ie 'any change in gene frequency' as opposed to an 'average increase in complexity of genes' and further still into the mire we have the complete absence of the meaning of 'complexity' itself)." No, the use of the word 'evolve' here is reflective of common usage in this area of science. That's all.
Witness the title of a recent monograph on the subject: Entropy and the Time Evolution of Macroscopic Systems, Grandy, W.T, Jr (2008), Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6. Other examples from a standard text on this very matter: "We take a metal rod prepared at time zero in such a way that its temperature varies linearly from one end to the other. We then let it evolve freely and measure the change of temperature as a function of time at a given point. It is a fact of experience that, if the experiment is repeated a large number of times under identical conditions, its result is reproducible." Also "Given a system, described at time t = 0 by an arbitrary ensemble distribution, it is possible to explain its subsequent time evolution by the exact laws of classical or quantum mechanics." Many more examples in this book.<Balescu, R. (1975), Equilibrium and Nonequilibrium Statistical Mechanics, John Wiley & Sons, New York, ISBN 0-471-04600-0, p. 371.> Editor Waleswatcher has above here supplied other examples.
While you are right in general to look to ordinary language as the first candidate for Wikipedia wording, in this particular case the local language is established. Moreover, the dictionary that you choose as criterion for ordinary language is only one of many. I take here the liberty of copying one of the eight meanings listed in the Oxford English Dictionary: "6. Of circumstances, conditions, or processes: To give rise to, produce by way of natural consequence."<Oxford English Dictionary, Oxford English Dictionary Second Edition on CD-ROM (v. © Oxford University Press 2009>. Best to choose a good dictionary when you can.
You point to my comment that the article is deeply muddled. Better you study some thermodynamics texts to try to work out what is wrong with it than try to impose your choice of words based on one popularist author. The word 'decay' that you prefer also has its local meanings. They sometimes fit with the time course of natural processes, but not always. Greatly respect Atkins though we may, his book is not a reliable source for the present purpose. Judging from the one source you cite, you have not done nearly enough reading in this area to provide you with the background to identify a reliable source. Chjoaygame (talk) 02:16, 23 July 2014 (UTC)

The OED gives the Latin verb evolvere, to roll out or unroll, as the origin of the English verb "evolve", which was in use in the 17th century, long before anyone could conceive of species evolving. "Evolve" feels more neutral than "decay", which connotes a loss, such as of structural integrity (decay of a neighborhood, tooth, etc.) or of substance (decay of a particle, signal, etc.). I suppose one could argue that two systems that have drifted into a common thermal equilibrium have lost their heterogeneity, but to say that a previously segregated neighborhood that had drifted into homogeneity had "decayed" would seem like an uncalled-for value judgment.

Could common ground be found in the verbiage "drift into thermal equilibrium"?

As for explaining this drift tendency in the lead, any such explanation would open a can of worms unsuited to a lead and more appropriate for the body. A link should be ok though, say to the principle of maximum entropy or maximum entropy thermodynamics. Vaughan Pratt (talk) 16:44, 31 October 2014 (UTC)

How can entropy change if it is held constant?

According to the article, "in theory, the entropy of a system can be changed without changing its energy. That is done by keeping all variables constant, including temperature (isothermally) and entropy (adiabatically). That is easy to see, ..."

I'm finding it hard to see. Could someone kindly explain how entropy can be changed by keeping it constant? Vaughan Pratt (talk) 17:13, 31 October 2014 (UTC)

No worries, mate. This is easily explained by the law of the purple cow, that black is white. It seems I made a mistake above, when I limited my remark to the lead. I think the muddle is not limited to the lead. Chjoaygame (talk) 18:31, 31 October 2014 (UTC)
Progressing beyond the foregoing, I have deleted the section. As it stood, as pointed out by User:Vaughan Pratt, it verged on nonsense. If someone sees merit in the section, perhaps they will restore it in a form that makes good sense. There remains much in this article that also verges on nonsense. Chjoaygame (talk) 05:09, 1 November 2014 (UTC)

Yet more questionable material

Thanks, Chjoaygame. In the meantime the article seems to be rapidly filling up with yet more strange ideas. Examples:

"The early classical definition of the properties of the system assumed equilibrium."

How can the entropy of a system that is in equilibrium be anything but constant? This would make dS = dQ/T a vacuous equation since both sides would be zero.

"If heat is transferred out the sign would be reversed giving a decrease of entropy of the system."

Surely the second law of thermodynamics makes this impossible.

"The statistical mechanics description of the behavior of a system is necessary as the definition of the properties of a system using classical thermodynamics become an increasingly unreliable method of predicting the final state of a system that is subject to some process. "

What "increasing unreliability"? Increasing how? If the concern is with the temperature of the ground state of an atom (nanokelvins) or the derivation of Planck's law or the various distributions then say so. Otherwise this smacks of unsourced stream-of-consciousness writing.

I don't work on this article or I'd straighten this out myself. Hopefully one of the article's regular maintainers will do so. Chjoaygame @Chjoaygame: or PAR @PAR:, what do you think? Vaughan Pratt (talk) 08:26, 10 November 2014 (UTC)

Thank you for this ping. Like you, I don't work on this article, and am not a maintainer of it. I just loosely glance at it.Chjoaygame (talk) 11:20, 10 November 2014 (UTC)

"If heat is transferred"[edit]

"If heat is transferred out the sign would be reversed giving a decrease of entropy of the system." The sign convention is: entropy into the system is positive and entropy out is negative. The entropy of a system can be decreased provided it transfers at least as much entropy to another system. On the largest scale the entropy of the universe always increases. A system that converts heat into entropy takes in heat at a high temperature and converts some of that heat into work (zero entropy out as work) but it must dump the remaining heat to a low temperature source. As the entropy is dQ/T the entropy out is greater than the entropy into the system (note the temperature in the denominator). I must admit, that with this article there is a lot of questionable writing. I will keep reading and writing but it will take years to get to the point that I am able to do the subject justice. Zedshort (talk) 01:29, 21 November 2014 (UTC)

definitions of entropy

I haven't been too active lately, but reading some of the recent additions, I really have to take issue with the paragraph beginning "There are two related definitions of entropy:...". There is essentially only one definition of thermodynamic entropy, the "classical" or "macroscopic" definition, which makes no mention of molecules, atoms, etc. The statistical mechanics "definition" of thermodynamic entropy is not a definition, it is an explanation of thermodynamic entropy. Its value lies in the fact that it is an excellent explanation. For example, you cannot define temperature in terms of some average energy per particle (per degree of freedom, actually) unless you measure the individual particle energies and calculate an average. I suppose this is possible in some experiments, but it is not generally the case. This statistical mechanical description of temperature thus serves only as an explanation of temperature, not a definition. The same general idea is true of any thermodynamic parameter, including entropy. Physical science is a science of measurement and "definitions" must be traceable back to raw measurements, not to some picture or model in your mind, no matter how repeatedly and accurately that picture or model explains your measurements. PAR (talk) 15:08, 21 November 2014 (UTC)

I would like to express my agreement with the just-above comment of PAR.Chjoaygame (talk) 20:24, 21 November 2014 (UTC)
As a non-scientist, I don't find that argument at all convincing (surely you need a picture or model of some sort to persuade you that "measurement" is happening at all? and anyway, why on earth not base a definition on a well-accepted model?), but what matters is not so much what we say here, but what respected scientists in the field say - do they regard the statistical description as a definition, or just an explanation? And what about other kinds of entropy than the thermodynamic kind? For me, this article is all still rather confused, and the lead section is possibly one of the worst of all - it's all over the place, without ever explaining what the first sentence is supposed to mean. W. P. Uzer (talk) 20:42, 21 November 2014 (UTC)
My reading of respected scientists supports the view posted by PAR. Apart from this, I can agree with W. P. Uzer that the article is confused. I think the comment of PAR offers a constructive contribution to sorting out the confusion. Chjoaygame (talk) 21:13, 21 November 2014 (UTC)
I don't like to quibble with those who likely know more of the matter than I do, but "statistical definition of entropy" gives no shortage of hits on Google Scholar, including several similar to this from Physical Review Letters: "...just as the statistical definition of entropy is widely considered more general and fundamental than the original thermodynamic definition..." W. P. Uzer (talk) 21:52, 21 November 2014 (UTC)
I see your comment as reasonable, not as a quibble. Still I prefer the view of PAR.
I think the key here is that this article announces that it is about entropy in thermodynamics.
To answer your argument about a model. Yes, some sort of model is more or less a presupposition of a measurement. But PAR is pointing out that for measuring entropy, the model is most often, or by default, the macroscopic thermodynamic one, which is very well accepted. Indeed it would be hard to think of another way of actually making a measurement. The statistical mechanical model does not readily lend itself to direct measurement. The macroscopic thermodynamic way is really based on a model, with such concepts as thermodynamic equilibrium imagined as experimentally realized.
Part of the problem is that the word entropy was more or less stolen from its original thermodynamic definition by Shannon, for his statistical purpose, at the suggestion of von Neumann, partly on the ground that people often feel baffled by it, as you may read at History of entropy#Information theory. Not an auspicious move, I would say. Another part of the problem is that many people feel that explanation is more "fundamental" than is statement of fact. And that such greater "fundamentality" is better. You have quoted such a person. Part of the problem with the statistical definitions is that they are several and divers.
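For what it's worth, the formal kinship that prompted the borrowing is easy to exhibit (a sketch, assuming a four-state system with equal probabilities): Gibbs's statistical-mechanical entropy has the same functional form as Shannon's, differing by the constant k and the base of the logarithm.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p):
    """Shannon entropy H = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    """Gibbs (statistical-mechanical) entropy S = -k sum p_i ln p_i, in J/K.
    Same form as Shannon's H, scaled by k and using natural logarithms."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.25, 0.25, 0.25, 0.25]  # four equally likely microstates
print(shannon_entropy(p))                                 # 2.0 bits
print(math.isclose(gibbs_entropy(p), k_B * math.log(4)))  # True: reduces to k ln W
```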
At the risk of provoking an indecorous range of responses, may I suggest for the present purpose, mainly for this article, that we might try to work out some agreement along the following lines? The primary or default meaning of the word entropy is the macroscopic thermodynamic one indicated by PAR. When the context does not more or less enforce other meanings, the plain word entropy is by default to be taken to have this primary meaning. Again, when the context does not more or less enforce other meanings, they are preferably primarily to be distinguished and explicitly indicated by up-front use of qualifying or modifying words; for example we would say "Shannon entropy" when that is what we mean. It would be reasonable, in this article, that a section about Shannon entropy could announce that within such section, for the sake of brevity, the plain word entropy is used to mean Shannon entropy unless otherwise explicitly indicated; yes, just within such section. That is not to try to say what other articles should do.
Perhaps that is putting my neck out. But at least it is more or less rational or systematically executable, considering that there is currently confusion in the absence of some such agreement. Chjoaygame (talk) 05:42, 22 November 2014 (UTC)
To Chjoaygame - I think that is a good suggestion, announce at the beginning of the article that the subject is thermodynamic entropy, and note that the statistical explanation constitutes a definition of a more general entropy, e.g. Shannon entropy.
To W. P. Uzer - Sure, you can have multiple definitions of entropy, just as you can have multiple definitions of a straight line in geometry. As long as the definitions have been proven by somebody somewhere to be consistent, and you accept that proof on faith, that is, as long as you don't want to worry yourself about the axiomatic, logical development of geometry, you can "do" geometry without running into problems. Same with thermodynamics, if you don't want to bother with the axiomatic, logical structure of thermodynamics, one, two, five definitions of entropy are fine, as long as somebody, somewhere has shown them to be equivalent, and you accept their work on faith. You can "do" thermodynamics just fine. I am interested in understanding thermodynamics, rather than just "doing" it, and if you come at it from that angle, multiple definitions don't cut it, just as multiple definitions of a straight line don't cut it in axiomatic geometry. However, it seems to me that there is no need to mislead someone who wishes to study geometry by declaring that there are multiple definitions of a straight line. It doesn't hurt to simply declare that there is one definition, from which the others can be shown to follow, and then you won't have to unlearn what you have learned if you ever wish to bother yourself with an axiomatic development. The situation is similar, but much more pronounced in quantum mechanics. If you ever want to understand quantum mechanics rather than just "do" it, the distinction between measurement and the picture you have in your mind to explain those measurements must be kept brilliantly clear. Just because Phys Rev and Google make statements about two definitions of entropy does not constitute a mathematical proof that this is the way to go. 
When Phys Rev states that the stat mech definition of entropy is superior, I take that to really mean that it suggests definitions which can be measurement-based that are outside the scope of classical (pseudo-equilibrium) thermodynamics. If you wish to accept pronouncements by Phys Rev as ultimate truth, you will probably be ok as long as you don't become interested in the logical structure of thermodynamics. If you don't wish to accept their statements, but instead try to understand what is going on, then the mere presence of multiple definitions, some of which are unmeasurable, has to bother you. Again, it doesn't hurt to state that there is the phenomenological definition of thermodynamic entropy, and the statistical mechanical explanation, an explanation which opens an understanding of situations beyond classical thermodynamics, an understanding which can be used to develop an axiomatic, measurement-based treatment of these situations.
I had both the fortunate and unfortunate experience of being taught thermodynamics and statistical mechanics by Dr. Ta-You Wu. I say unfortunate because he was very strict about the theoretical foundations, which I could not grasp immediately, and it took me years to finally appreciate the truth of what he tried to beat into my relatively thick head. PAR (talk) 14:14, 22 November 2014 (UTC)