
Talk:Entropy

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Vh mby (talk | contribs) at 01:43, 6 August 2014 (→I thought you would never ask !). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Former good article: Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.
Article milestones
Date | Process | Result
June 22, 2006 | Good article nominee | Listed
February 20, 2008 | Good article reassessment | Delisted
Current status: Delisted good article



Thermodynamic Systems Decay

In the opening paragraphs the article refers to isolated thermodynamic systems "evolving toward equilibrium" and, further down, "progressing toward equilibrium"; these statements are misleading in that they convey some idea of improvement. The correct term is decay (ref Peter Atkins - Creation Revisited). So I would like to suggest the revised wording.. "decay toward equilibrium" in all cases where this is the intended meaning. Vh mby (talk) 12:44, 18 May 2014 (UTC)[reply]

Does that not, in turn, convey some idea of deterioration? Wouldn't a more neutral term be "moving"? Although for me, "evolving" and "progressing" don't imply improvement necessarily, so I'm not sure there's a problem. W. P. Uzer (talk) 13:33, 18 May 2014 (UTC)[reply]
Sorry to burst your bubble here but of course decay means "some kind of deterioration".. That is the precise understanding of the Second Law which the article is supposed to convey! By declaring your personal ideas.. "evolving" and "progressing" don't necessarily imply improving you make the very point I raise.. (confusion about the meaning of the Second Law.. which is certainly not 'neutral') I suggest you READ Peter Atkins before you air your personal ideas as encyclopedic. Vh mby (talk) 03:03, 19 May 2014 (UTC)[reply]
Technical terms sometimes can carry unfortunate semantics when interpreted using their more everyday sense, and it is difficult to steer clear of this. I think it is clear enough from the context that such semantics do not apply, and would suggest that no attention be given to it from this sort of perspective. OTOH, in a technical sense, "decay" does carry with it the meaning of a natural unidirectional tendency or relaxation to a "final state" (even if only in the macroscopic sense), as with radioactive decay. In this sense, "decay" seems to me to be a better fit. —Quondum 15:50, 18 May 2014 (UTC)[reply]
The meaning of the word decay as commonly understood does not colour or confound the technical meaning in any way (unlike the current terms). The processes that produce observable decay are the very same ones the mathematics attempts to explain. That, I would suggest, is why P. Atkins uses it so emphatically. The entry by Mr/s W.P.Uzer demonstrates, I would suggest, the need for such a change. Vh mby (talk) 03:03, 19 May 2014 (UTC)[reply]
If (roughly speaking) most scientists say "decay", then I'm quite happy for it to be used, but the fact that one author uses it is not conclusive in my opinion. As I understand your objection to "evolve" and "progress", it is also just your personal view (that these words imply some kind of improvement) - my personal view that decay implies some kind of worsening is then equally valid. W. P. Uzer (talk) 18:19, 19 May 2014 (UTC)[reply]

Well now we know why the article got de-listed from "good": it has suffered some decay.[1] If budding contributors don't understand that it is their responsibility to find appropriate references to support their opinions, and at the very least to read and understand the validity, in the scientific community, of those given, I would suggest they don't belong here. So I shall include the reference.. (Done) Vh mby (talk) 01:35, 22 May 2014 (UTC)[reply]

Reference

  1. ^ dictionary.com, "decay", verb: "to decline in excellence"

It seems Waleswatcher considers himself above Wiki protocols and above published authorities on the subject. My ref: Atkins, Peter W. (1992). Creation Revisited: The Origin of Space, Time and the Universe. Penguin Books. ISBN 0140174257. So I would request Waleswatcher to state here in this discussion page an answer to the question: to what (state) do isolated thermodynamic systems evolve..? Please include your reference. Mike 01:47, 31 May 2014 (UTC)

I'm not sure who the above comment is from, but if you look at the "evolve" versus "decay" section of this page, you will see I gave many references, with quotes, all to references used in PhD level physics courses on thermal and statistical physics (and hence substantially more authoritative than a pop science book written by a chemist). To answer your question, thermodynamic systems evolve to the equilibrium state. Waleswatcher (talk) 00:17, 1 June 2014 (UTC)[reply]
Well I did not realize you created a whole new topic, sorry about that.. but it appears to me you are just further confusing the issue.. see below. vh mby 03:31, 16 June 2014 (UTC) — Preceding unsigned comment added by Vh mby (talkcontribs)

edit about heat, work, and transfer of energy with matter

The edit summary, and references given, state the well established reasons for the edit that said "In the model of this present account, as shown in the diagram, it is important that the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.<Born, M. (1949). Natural Philosophy of Cause and Chance, Oxford University Press, London, pp. 44, 146–147.><Haase, R. (1971). Survey of Fundamental Laws, chapter 1 of Thermodynamics, pages 1–97 of volume 1, ed. W. Jost, of Physical Chemistry. An Advanced Treatise, ed. H. Eyring, D. Henderson, W. Jost, Academic Press, New York, p. 35.>"Chjoaygame (talk) 23:28, 24 May 2014 (UTC)[reply]

Why is it important to have paths for heat and work different from those for matter transfer? Paradoctor (talk) 23:38, 24 May 2014 (UTC)[reply]
The edit summary, and references given, state the well established reasons for the edit.Chjoaygame (talk) 00:54, 25 May 2014 (UTC)[reply]
a) Why isn't this in the article?
b) If this is "well established", then there should be secondary sources instead of the primary sources provided. Considering the level of generality we're talking about, this should be mentioned in the major textbooks, shouldn't it? Paradoctor (talk) 09:10, 25 May 2014 (UTC)[reply]
c) The Born reference does not seem to support the statement, could this be an instance of WP:SYN? Please provide a quote of the Haase reference supporting the claim, so we can verify. Paradoctor (talk) 10:52, 25 May 2014 (UTC)[reply]
Thank you for drawing attention to this.
The usual introductory presentations of definitions of work and heat for thermodynamics start simply by considering closed systems, that is to say systems for which matter transfer is not allowed. For this, Born was a main leader of the move to insist that heat be defined as a residual transfer of energy after mechanical transfer as work is calculated. No question is considered then of what would be so if matter transfer were allowed. Many texts are vague or uninformative about it. The question is considered at some length in the Wikipedia article on the first law of thermodynamics. Perhaps that may help. I am not sure that it is appropriate for me to try to summarize that here.
Born on page 44 writes "... thermodynamics is definitely connected with walls or enclosures. ... in free contact with its neighbourhood ... the flux of energy and material constituents through its boundary, which themselves cannot be reduced to mechanics." In an appendix, on pages 146–149, Born provides more detail. He presents a model in which transfers of matter and of energy as work pass through physically separate portals. Such a separation is also shown in the diagram for the model in the present Wikipedia article.
The present problem is to analyze a change of the entropy content of a system due to a thermodynamic process. It is due not only to entropy production within the system during the process, but also to transfer of entropy. The present article tackles the problem by considering transfers of energy as heat and work. In general, those quantities are defined only in paths in which there is no transfer of matter, as shown in the diagram in the article. In paths in which there is transfer of matter, they are in general not definable. That is what Born means when he writes "which themselves cannot be reduced to mechanics". Haase on page 34 writes: "On the other hand, work and heat are not functions of state; they essentially relate to interactions of the system with its surroundings. Hence these quantities as defined heretofore have no definite meaning for open systems. (cf. Defay (1929). See also Haase (1956a))." Haase goes on to point out "There are however, important exceptions: the external work Wa and the dissipated work Wdiss can always be calculated from external actions. If, for instance, there is flow of electricity .... ". These exceptions are for quantities of work that pass by paths separate from matter transfer. Haase then summarizes "But the total work done on the open system remains indefinite."
Born and Haase are not alone. For example, Münster writes on page 50 "As explained in §14 the classical point of view (equivalence of heat and work) as well as Carathéodory's point of view (definition of heat) are meaningless for open systems."<Münster, A. (1970), Classical Thermodynamics, translated by E.S. Halberstadt, Wiley–Interscience, London, ISBN 0-471-62430-6.> Further references about this are in the Wikipedia article about the first law.
There are those who feel that this is not a happy situation, and they offer arbitrary definitions that are not tied to the basic physics of heat and work. Such arbitrary definitions are not appropriate here, where the paths are shown to be separate in the diagram.
One needs to be careful in reading texts on non-equilibrium thermodynamics, where the word heat is used in the vague senses of the nineteenth century, when it was still ok to speak as if heat were found within a body like internal energy, rather than in the strict sense nowadays used that insists that heat is energy only in transfer. I will not here go into detail about this, beyond commenting that it does not override the above comments.Chjoaygame (talk) 23:22, 25 May 2014 (UTC)[reply]
I see no response to the immediately foregoing. I think it justifies removal of the tag [failed verification]. As for the tag [why?], I am not persuaded that a detailed reason for the tagged sentence is needed in the article. The basic reason why it is needed is that the formula that is offered applies only to a special model as set out in the section and illustrated in the diagram in the article. Without that special model, the formula has no meaning.
I am writing in the reference to the special nature of the model, which was deleted, I think inappropriately.Chjoaygame (talk) 04:55, 30 May 2014 (UTC)[reply]

"evolve" versus "decay"

Regarding this choice of terms - the problem with "decay" is that it implies that something changes form, disappears, or otherwise gets reduced or eliminated. But in many circumstances that is not what happens. For example when two systems are put into thermal contact, heat flows between them until their temperatures are equal. Nothing is decaying (well I suppose you could say the temperature difference is decaying, but that's convoluted and unclear). Similarly, drop some ink into a glass of water and the ink will mix into the water, increasing the entropy, but again nothing is "decaying". So decay is not necessary, "evolve" is more neutral and more accurate. That point of view is shared both by references (for instance Kittel and Kroemer never use the term decay in the many places they discuss the second law - their statement of it is simply that the entropy will most probably increase if the system is not in equilibrium) and other editors (W. P. Uzer above, and Chjoaygame here https://en.wikipedia.org/wiki/Talk:Second_law_of_thermodynamics ). Waleswatcher (talk) 16:06, 30 May 2014 (UTC)[reply]
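(A minimal numerical sketch of the thermal-contact example above, with hypothetical values and assumed constant heat capacities; this is only an illustration, not drawn from Kittel and Kroemer or any other source cited here. It shows that the colder body gains more entropy than the hotter body loses, so the total entropy increases even though neither body "decays".)

import math

# Two bodies with assumed equal, constant heat capacities, placed in thermal contact.
C = 1000.0                      # heat capacity of each body, J/K (hypothetical)
T_hot, T_cold = 400.0, 300.0    # initial temperatures, K (hypothetical)

# With equal constant heat capacities the common final temperature is the mean.
T_final = (T_hot + T_cold) / 2

# For constant heat capacity, dS = C dT / T integrates to C * ln(T_final / T_initial).
dS_hot = C * math.log(T_final / T_hot)    # negative: the hot body loses entropy
dS_cold = C * math.log(T_final / T_cold)  # positive: the cold body gains more entropy
print(dS_hot, dS_cold, dS_hot + dS_cold)  # the sum is positive: total entropy increases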

If you don't understand the meaning of entropy or deliberately want to confuse people (as the article so admirably does) then you won't know what decays.. and from what you say this seems to be the case..
Entropy is a measure of DISORDER and in every example you cite ORDER is what decays. Mike 03:22, 16 June 2014 (UTC)

Here's another reference that uses evolve in exactly the same way as in the article, and never decay (this is just one I can easily link to - there are many more). P. 122: "The second law, as expressed in (4.13), is responsible for the observed arrow of time in the macroscopic world. Isolated systems can only evolve to states of equal or higher entropy." http://www.damtp.cam.ac.uk/user/tong/statphys/sp.pdf Waleswatcher (talk) 16:13, 30 May 2014 (UTC)[reply]

More references, just for fun.

People don't come here for FUN.. this is an encyclopedia not your playground Mike 03:22, 16 June 2014 (UTC)

From Kardar, Statistical Physics of Particles: "The time evolution of systems toward equilibrium is governed by the second law of thermodynamics.... Do all systems naturally evolve towards an equilibrium state?...What is the time evolution of a system that is not quite in equilibrium?... In contrast to kinetic theory, equilibrium statistical mechanics leaves out the question of how various systems evolve to a state of equilibrium." The term "decay" is never used in the context of the second law or entropy anywhere in the book; "evolve" is used throughout.

From Landau and Lifschitz, Statistical Physics: "...its macroscopic state will vary with time...the system continually passes from states of lower to those of higher entropy until finally the system reaches...". Again, no "decay".

From Huang, Introduction to Statistical Physics, explaining the second law in some examples of distributions of gas molecules: "...almost any state that looks like (a) will evolve into a uniform state like (b). But..."

Need I go on? Waleswatcher (talk) 17:03, 30 May 2014 (UTC)[reply]

You seem to think this site is the sole property of your favourite physicists and the place to champion your pet theory.. well it's not. Genuine scientific enquiry doesn't care about your favourite theory or the battle you are having with whoever (creationists in this case no doubt)
The dictionary usage and understanding of the word "evolve" is unequivocally "develop"..
"The Science Dictionary
To undergo biological evolution, as in the development of new species or new traits within a species.
To develop a characteristic through the process of evolution.
To undergo change and development, as the structures of the universe.
Example sentences
The world - and the employment marketplace - evolve and progress. Superbugs evolve when common bacterial infections develop resistance to the
The letter urges regulators to help each firm develop a plan that would evolve. Our culture evolves, sometimes rapidly, and teaching styles with it--but"
The usage is common across every dictionary I have checked.. The point is that in an encyclopedia articles must use words consistent with their most commonly understood dictionary definition. You have absolutely no special 'rights' here and it appears to me you not only don't understand entropy.. you don't know what an encyclopedia is..
There are over 450 people watching this site so can I get some educated comments here please.. It is precisely this sort of misinformation and distortion which destroys the clarity of such important articles. Peter Atkins uses 'decay' because it is exactly what thermodynamic systems do. Mike 03:22, 16 June 2014 (UTC)
vh mby 03:57, 16 June 2014 (UTC)
Words often have somewhat different meanings in scientific contexts than in "ordinary" contexts - this applies equally to "decay" (which in commonly understood usage means to deteriorate or decompose) as it does to "evolve". Waleswatcher has given several examples (presumably there are many more) of the scientific use of "evolve" in this context; you seem to be relying on one single author who uses "decay". That being the case, at the present time, "evolve" seems to be the more appropriate choice. W. P. Uzer (talk) 07:52, 16 June 2014 (UTC)[reply]
Any one else..?? vh mby 08:42, 16 June 2014 (UTC) — Preceding unsigned comment added by Vh mby (talkcontribs) Sorry about the identity crisis I am having {vh mby = Mike = Gyroman} hope this works Gyroman (talk) 02:14, 17 June 2014 (UTC)[reply]
In physics, both "decay" and "evolve" get used with specific meanings. For example, the term "exponential decay" is well-established, and a web search on "decays towards" gives many hits, many of which relate to thermodynamics, but also in subjects such as engineering ("the voltage slowly decays to zero") and even biology (referring to population densities). One speaks of the time evolution of a system; this says nothing about where it is headed, merely that it may be changing as a function of time governed by physical laws. Thus the state a system might be said to evolve according to the Schrödinger equation. The term "decay" is used to mean something more specific: that a process of relaxation is occurring. Systems evolve over time, but properties like entropy may decay over time, so I suggest we choose a term according to the subject of the sentence. In short, I suggest saying "systems evolve [in time]" and "entropy decays [towards thermodynamic equilibrium]". I see a sentence that is in need of attention: "Historically, the concept of entropy evolved in order to explain". I'd suggest "the concept of entropy arose". I also see the use of the word "progress" in "systems tend to progress", which could be replaced. —Quondum 14:48, 16 June 2014 (UTC)[reply]
"properties like entropy may decay over time" and "entropy decays [toward thermodynamic equilibrium]" This, I am very sorry to say, reveals an astounding level of IGNORANCE of the very term we are trying to define. The fact of your confusion (as a knowledgeable contributor?) only serves to confirm my objections to the confusion caused by the muddled meaning of "evolve" in this context. Gyroman (talk) 02:14, 17 June 2014 (UTC)[reply]
I think there is a sign problem involved with using the word "decay". In ordinary language, the word "decay" suggests a quantity which decreases with time, which might be your examples of voltage or population density under appropriately defined circumstances. But of course the entropy of a system plus its surroundings increases with time, and I think this is the root of many people's reluctance to use the word decay in relation to entropy. Dirac66 (talk) 15:30, 16 June 2014 (UTC)[reply]
And so the confusion continues.. Gyroman (talk) 02:14, 17 June 2014 (UTC)[reply]
I think we should stick to language as used in the discipline (by which I'm implying that I disagree that your point applies). My suggestion should however break the deadlock: the actual examples in the article refer to a system, not to the entropy, wherever the term "evolves" occurs, and thus "evolves" should be retained there. So while I feel that the term "decay" is the correct one to use when one is referring to what happens to the entropy (i.e. one should not say that the entropy of a system evolves), I see no examples in the article to apply it to. —Quondum 17:23, 16 June 2014 (UTC)[reply]
Physicists' publications are by definition biased.. (everyone is, actually). Encyclopedic content tries to overcome these biases by conferring with a broad cross-section of the relevant knowledgeable community. Those who write school text books (like P Atkins) are going to be scrutinized to a higher degree than those publishing books to make their own particular point to a limited audience. A lot of science is argued out between authors and positions change as a result. Ok, more than one author is needed on my part.. Let me thus put to you all a consensus of a highly reputable scientific community which has already been done for us. I refer to the Cambridge Encyclopedia (CE) fourth edition 2000.. In which it is stated..

"The main aim of the CE is to provide a succinct, systematic and readable guide to the facts, events, issues, beliefs, and achievements which make up the sum of human knowledge". Here of course "events, issues, beliefs etc" are simply factually reported. The CE is not misrepresenting beliefs as intrinsic statements of fact. Please don't get sidetracked on that.

It states "Entropy: In thermodynamics, a numerical measure of disorder; ... As a system becomes increasingly disordered its entropy increases. For example, the entropy of a system comprising a drop of ink and a tank of water increases when the drop of ink is added to the water and disperses through it, since dispersed ink is highly disordered. Entropy can never decrease, which in the ink-in-water example amounts to the observation that the particles of ink never spontaneously gather themselves back into a single drop." Before you start on the purist 'isolated system' requirement it is basically covered by the word "spontaneously" meaning 'of itself' or 'without external influence'. Perfectly clear to any reader.

The Wikipedia should of course go further and cover history, people, mathematics etc.. but must on no account cloud or confuse already accepted (by relevant scientific community) meaning as I would suggest is given in the CE. Your problem here is the meaning of the word "evolve". If you would not write "thermodynamic systems develop towards equilibrium" and you claim different meanings for the word 'evolve' depending upon the context then you are faced with the unavoidable consequence of defining 'evolve' in this context. Which implies equating the 'evolve' used here to "increase in disorder" which is by no means obvious. For that reason I suggest the better and more appropriate word is the one used in the text books "decay" meaning decay in state of order. The term 'decay' is further more closely related to the mathematical usage of decay as expressed for any single value, which in this case is a probability term, that a particular state exists. Gyroman (talk) 02:14, 17 June 2014 (UTC)[reply]
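(As a purely illustrative aside on the ink-in-water example quoted above: a minimal sketch, with hypothetical numbers, of the ideal configurational entropy of mixing, which is one standard way to see why the dispersed ink has the higher entropy. Not taken from the CE or any other source cited here.)

import math

k = 1.380649e-23   # Boltzmann constant, J/K
x_ink = 1.0e-3     # hypothetical mole fraction of ink once dispersed in the water
N_ink = 1.0e20     # hypothetical number of ink particles
N_total = N_ink / x_ink
x_water = 1.0 - x_ink

# Ideal entropy of mixing: dS = -k * N_total * sum over components of x * ln(x).
dS_mix = -k * N_total * (x_ink * math.log(x_ink) + x_water * math.log(x_water))
print(dS_mix)  # positive (about 0.01 J/K here): the dispersed arrangement has higher entropy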

Ok done Gyroman (talk) 12:11, 18 June 2014 (UTC)[reply]

Would Mr W P Uzer care to define "evolve" in a manner clearly synonymous with what is known and accepted concerning any isolated system ie it will irreversibly undergo an "increase in disorder".! Gyroman (talk) 01:10, 19 June 2014 (UTC)[reply]

Gyroman (or Vh mby, or whatever your name is), "evolve" in this context simply means to change or develop from one form to another. That's how it's used in the references I gave, and that by the way is the dictionary definition in three places I just checked (not that the dictionary definition really matters - what matters here is the language reliable sources for this topic use). However I do agree with you that it's not an ideal term in that it might give the wrong impression to some people, but in my opinion (and it seems that of every other editor that has commented on the issue) "decay" is worse. Perhaps there is a third term that is better than both, but I can't think of one. Waleswatcher (talk) 05:59, 19 June 2014 (UTC)[reply]
My point exactly.. 'evolve' = change or develop (same as I said), is completely misleading in the context of trying to convey to ALL READERS the relentless downward trend in order which is the essential characteristic of every isolated thermodynamic system. Any reader checking the meaning of evolve, with its synonym 'develop', is going to get the wrong idea. (May I say, just like you seem to have).. and so it seems our Mr Uzer is content with a consensus of ignorance.. Well this is not a matter of individual, clearly erroneous opinion, and to leave it without clarification is unacceptable in this context. Either define evolve as the opposite of develop or get rid of it.. Gyroman (talk) 12:45, 23 June 2014 (UTC)[reply]
Would you care to say why you consider 'decay' worse than 'evolve'..? Gyroman (talk) 12:52, 23 June 2014 (UTC)[reply]
Would you also care to state how your "language reliable sources" define 'evolve'..? Gyroman (talk) 12:56, 23 June 2014 (UTC)[reply]
"Evolve" just seems to be the word that most scientists use in this context, whereas "decay" isn't, nor does "decay" seem a particularly apt metaphor (for what the system does, that is; it may be a good word for what the order does, as I think has been pointed out). I would be quite happy if the whole thing could be rewritten in such a way that more people might understand it, and this might well be done without using the word "evolve", but simply replacing that word with "decay" doesn't seem to me to be helping: it makes it more confusing for most people who have some familiarity with other scientific writing on the topic, and at best no less confusing for the rest. Like I said at the start, if there really is a problem here, I would suggest using a simple word like "move" or "go". W. P. Uzer (talk) 16:02, 23 June 2014 (UTC)[reply]
Let me understand you clearly.. 'order' decays.. but the 'system' doesn't.!! There are two substantial problems with that..
(1) The absolute entropy of any system is based on one and only one variable, a number. It doesn't even have units.. it's a probability! (the thermodynamic probability that the particular state it is in exists out of all possible states). As the single descriptor it can hardly be used to account for two different results?
(2) How can you describe what is happening to the 'system' as different to the primary effect of order decay in the system without even mentioning it?
You did say you wanted more people to actually understand this.. Well I would suggest confusion over the direction energy takes as measured by its entropy is at the very heart of the problem here. Equilibrium is the lowest state of any system by any descriptor you choose and "move" or "go" fail miserably to convey what is actually happening! But 'evolve = develop' is manifestly its inverse!! Please explain? Gyroman (talk) 08:15, 25 June 2014 (UTC)[reply]
You seem to want to make (metaphorical) value judgements about different physical states - ordered is "good", unordered is "bad", so to describe changing to a "worse" state you insist on using a term that implies worsening, and can't accept a term that (to you) implies improvement. But others, I suspect, have no such judgements in mind - they are happy to follow scientific convention in using the word "evolve" just to denote a spontaneous change from one state to another state, and might use "decay" to denote a decrease in some quantity (a system is not a quantity, though its order might be) or decomposition of some object (which is not what happens to the system here). I don't think this usage of "evolve" can lead to any particular confusion, as it is anyway stated in the same sentence that the "evolution" (change) is in the direction of thermal equilibrium, but using more everyday wording might make it more accessible to some readers. W. P. Uzer (talk) 11:20, 25 June 2014 (UTC)[reply]

We are not even on the same planet here.. HELLO! is there anyone out there who understands
(a) The purpose of Dictionaries
(b) The meaning of words
(c) What happens to thermodynamic systems.?
"good" and "bad" are not my words, "worse" is Waleswatchers word, mine was "improvement", as conveyed by the word "evolve", (almost universally given the synonym "develop" with clear examples showing development of new logical or structural elements), THAT IS NOT JUST MY "OPINION", and it is the OPPOSITE OF WHAT OCCURS to isolated thermodynamic systems! ORDER is not one of a number of "quantities" of the system it is the ONLY quantity of relevance to the word entropy.. and it goes in ONE direction, and that is most emphatically the opposite indicated by evolve. If you can't define evolve as you would like (its not your word to do that with), different to the dictionary then I say it is deceptive to persist with it and a major reason for the downgrade of clarity. (reader understanding) Gyroman (talk) 13:41, 26 June 2014 (UTC)[reply]
It's not my preferred meaning of evolve, it's what appears to be established scientific usage (as per evidence given a long time ago). W. P. Uzer (talk) 17:12, 26 June 2014 (UTC)[reply]

Note "system evolves" "system decays" The same holds on Google Web and Books, and also when adding any of the prepositions "to"/"into"/"towards". Paradoctor (talk) 01:38, 23 July 2014 (UTC)[reply]

deeply muddled

The lead of this article is deeply muddled in concept. It needs radical revision. A word or two here or there are as drops in the ocean.Chjoaygame (talk) 16:19, 26 June 2014 (UTC)[reply]

Agreed. W. P. Uzer (talk) 17:12, 26 June 2014 (UTC)[reply]

How about..

In thermodynamics, entropy (usual symbol S) is a measure of the disorder of an identified system of particles with a given total energy. It directly relates to the probability that the system is in a particular state compared with all possible states or arrangements. According to the second law of thermodynamics the entropy of real isolated systems always increases; such systems spontaneously decay towards thermodynamic equilibrium, the configuration with maximum entropy or disorder. The result of increasing entropy is to dissipate energy evenly throughout the system, so pressure, temperature, chemical potential, mechanical energy (motion) etc. get smoothed out. The total energy of the system is not changed, but the quality or ability of the system to do work is reduced, ultimately to zero at equilibrium. Theoretical ideal systems with no energy conversion to heat (due, say, to friction) maintain constant entropy. Systems which are not isolated may decrease in entropy when subject to an input which has the effect of creating order, like the freezer box in a refrigerator. Since entropy is a state function, the change in the entropy of a system is the same for any process going from a given initial state to a given final state, whether the process is reversible (ideal) or irreversible (real). However, irreversible processes increase the combined entropy of the system and its environment.

Gyroman (talk) 05:07, 28 June 2014 (UTC)[reply]
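(On the freezer example in the proposed wording above: a minimal numerical sketch, with hypothetical values, of how a non-isolated system can decrease in entropy while the combined entropy of system plus surroundings still increases. The idealized expressions -Q/T_cold and (Q+W)/T_hot assume heat exchange at effectively constant temperatures; this is only an illustration, not from any source cited here.)

Q = 1000.0                     # heat removed from the freezer contents, J (hypothetical)
T_cold, T_hot = 255.0, 295.0   # freezer and room temperatures, K (hypothetical)
W = 400.0                      # work input to the refrigerator, J (hypothetical; must exceed Q*(T_hot/T_cold - 1))

dS_freezer = -Q / T_cold       # the freezer contents lose entropy (they become more ordered)
dS_room = (Q + W) / T_hot      # the room gains entropy from the rejected heat Q + W
print(dS_freezer, dS_room, dS_freezer + dS_room)  # the sum is positive: combined entropy increases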

This is just as deeply muddled.Chjoaygame (talk) 08:45, 28 June 2014 (UTC)[reply]
I stated what is confusing and misleading about the use of "evolve"; the only answer given was that certain authors use it that way. I stated why it is confusing and misleading and the same applies to those authors.. Well now it's your turn.. OK, tell us all.. exactly what is muddled, incorrect or missing from the above proposal? Gyroman (talk) 01:58, 1 July 2014 (UTC)[reply]
Thank you for this kind invitation. Sad to say, I have to decline it.Chjoaygame (talk) 06:07, 1 July 2014 (UTC)[reply]
For me, I think there needs to be more considered explanation of what entropy actually is, before talking in detail about the second law of thermodynamics and so on. W. P. Uzer (talk) 07:32, 1 July 2014 (UTC)[reply]
It's interesting that the same comment (re entropy.. What is it?) appears in the talk page to the Introduction to Entropy article. You'll know most of this but please bear with me.. This is what I think is the source of the confusion..
(1) There are two equations for entropy. The first, discovered by Rudolf Clausius in the mid 19th century, is actually the change in entropy of a SYSTEM, calculated as the integral of heat flow divided by the absolute temperature at the boundary of a subsystem to or from its surroundings, both contained within the whole SYSTEM. What is not so apparent is that differences in macroscopic properties such as temperature or pressure between subsystems within a larger system represent an ordered state as opposed to a disordered state where the energy is smoothly distributed = equilibrium. The point is that only an ordered system with differences in pressure or temperature between its subsystems allows us to get any work from the system. As you know.. the amount of work can be calculated from a temperature-entropy diagram as an area ∑(∆T x ∆s) for a process.. like for a steam engine.
(2) The second equation, by Ludwig Boltzmann, discovered around the turn of the 20th century, calculates the absolute entropy of a SYSTEM and this is the general or fundamental equation for entropy. It directly measures how disordered the system is. The equation S = k ln W should be as well known as E = mc^2.. this is a great failing in science education. The Clausius equation is easily derived from the Boltzmann equation by making just one simplifying assumption (ie the system is already at equilibrium, or W = Wmax); this derivation must never be done the other way round! The reason is that calculation of entropy change by heat transfer across a subsystem boundary is NOT a complete statement of the entropy of a system. This is clear from the Boltzmann equation: "W" is just a number and "k" is the Boltzmann constant, which is required to give "S" some units. We MUST note "order" includes logical states (not all of which can be accounted for by heat transfer), as well as physical states. Now "W" is the number of microstates (logical arrangements, or 'phase space' if every particle is identified and each is exchanged with every other) in a given macrostate (physical arrangement, set of particle properties available); see the counting sketch at the end of this comment. Clearly "W" is incalculable for any real system.. just one litre of gas gives a logical phase space of microstates exceeding the size of the universe!
(3) When the absolute entropy of a system and the term ORDER are inadequately defined or left out you end up with a gaping hole in your definition of the second law. Just saying 'Entropy increases' or 'never decreases' (needs more definitions: 'real', 'ideal', 'reversible', 'irreversible') makes little sense if entropy has not been comprehensively defined. Entropy is not about heat or energy, it's all about ORDER. "W" is a probability term and ordered means less probable. Energy dissipation is a RESULT of the decay in ORDER or increase in PROBABILITY and with this understanding a clearer definition of the second law results.. "Isolated systems tend to move to their most probable state".
The emphasis on Rudolf Clausius to the exclusion of Ludwig Boltzmann is at the root of the problem. Gyroman (talk) 14:31, 1 July 2014 (UTC)[reply]
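(A toy counting sketch of point (2) above, referred to there; my own illustration, not drawn from any source cited here. For a handful of particles distributed between the two halves of a box, the evenly spread macrostate has by far the largest number of microstates W, and hence the largest S = k ln W, while the fully ordered arrangement has W = 1 and S = 0.)

import math

k = 1.380649e-23   # Boltzmann constant, J/K
N = 20             # toy number of distinguishable particles in a box split into two halves

# W(n) = number of microstates with n particles in the left half = binomial coefficient C(N, n).
for n in (0, 5, 10):
    W = math.comb(N, n)
    print(n, W, k * math.log(W))   # W and S = k ln W are largest at the even split n = N/2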
This is a good point. Perhaps the article would be clearer if it were re-organized into two distinct parts: (1) Entropy change, including Clausius and the Second Law, and (2) Absolute entropy, including Boltzmann, the Third Law, information theory, etc. Dirac66 (talk) 15:46, 1 July 2014 (UTC)[reply]
Agreed, that the difference between relative entropy as defined by Clausius and absolute entropy as fixed by the third law is important.
But I think it important also to distinguish explicitly also between macroscopic entropy as defined by Clausius and the third law, and microscopic or statistical entropy as defined by probability and information theory. I think it unwise to try to elide this distinction, because such an elision hides from the student an important aspect of the logical structure of the situation. The distinction having been made, it is then in order to show the linkage.
I do not agree that the word "order" should be enshrined as the key to entropy. At least one expert tackles this oft used word in this context, Walter T. Grandy, in Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, 2008, ISBN 978-0-19-954617-6, p. 56. It is ok to talk about order, but not to enshrine it.
I think that the here-relevant characteristic feature of thermodynamic equilibrium is its time invariance. It is true that Planck considers homogeneity to be characteristic. But I would say that such homogeneity should be time-invariant; I think Planck assumes this without saying so. A fluctuation from a non-equilibrium state could lead to instantaneous homogeneity. Repeated measurements on a state of thermodynamic equilibrium may show fluctuations, but they must be statistically time-invariant for the equilibrium state.Chjoaygame (talk) 18:07, 1 July 2014 (UTC)[reply]
So no comment from anyone else.. all 460+ watchers! You're all happy to have a "deeply muddled" degraded presentation..? I am still going through the references.. however I do not find any attempt to explain the intended meaning of "evolve" concerning isolated thermodynamic systems. Rather it seems to me it is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (ie 'any change in gene frequency' as opposed to an 'average increase in complexity of genes', and further still into the mire we have the complete absence of the meaning of 'complexity' itself). Clearly there is a significant group of scientists fighting the claims of another group of scientists as to the real implications of the second law for evolution. This is the sort of thing which brings professional 'bias' into science. Please STOP there, it's NOT RELEVANT here.
What is relevant is that in the context of an encyclopedia there is no justification for using any word contrary to its dictionary definition, and such 'hidden' meanings will confuse and mislead. That is what needs addressing, and it is singularly unsatisfactory to claim certain scientists 'express' it that way.. no, it's bias! All this over a single word.. it is obviously of much higher importance to you editors than you are admitting in this discussion. Sorry, but that leads me to conclude you are more part of the problem than the solution. So I think we are heading for some sort of independent resolution and I am left with no alternative but to change it and force the issue.
Done Gyroman (talk) 00:49, 23 July 2014 (UTC)[reply]
Dear Gyroman, sad to say you are mistaken. You tell us that you think that the use of the word 'evolve' "is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (ie 'any change in gene frequency' as opposed to an 'average increase in complexity of genes' and further still into the mire we have the complete absence of the meaning of 'complexity' itself)." No, the use of the word 'evolve' here is reflective of common usage in this area of science. That's all.
Witness the title of a recent monograph on the subject: Entropy and the Time Evolution of Macroscopic Systems, Grandy, W.T, Jr (2008), Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6. Other examples from a standard text on this very matter: "We take a metal rod prepared at time zero in such a way that its temperature varies linearly from one end to the other. We then let it evolve freely and measure the change of temperature as a function of time at a given point. It is a fact of experience that, if the experiment is repeated a large number of times under identical conditions, its result is reproducible." Also "Given a system, described at time t = 0 by an arbitrary ensemble distribution, it is possible to explain its subsequent time evolution by the exact laws of classical or quantum mechanics." Many more examples in this book.<Balescu, R. (1975), Equilibrium and Nonequilibrium Statistical Mechanics, John Wiley & Sons, New York, ISBN 0-471-04600-0, p. 371.> Editor Waleswatcher has above here supplied other examples.
While you are right in general to look to ordinary language as the first candidate for Wikipedia wording, in this particular case the local language is established. Moreover, the dictionary that you choose as criterion for ordinary language is only one of many. I take here the liberty of copying one of the eight meanings listed in the Oxford English Dictionary: "6. Of circumstances, conditions, or processes: To give rise to, produce by way of natural consequence."<Oxford English Dictionary, Oxford English Dictionary Second Edition on CD-ROM (v. 4.0.0.3) © Oxford University Press 2009>. Best to choose a good dictionary when you can.
You point to my comment that the article is deeply muddled. Better you study some thermodynamics texts to try to work out what is wrong with it than try to impose your choice of words based on one popularist author. The word 'decay' that you prefer also has its local meanings. They sometimes fit with the time course of natural processes, but not always. Greatly respect Atkins though we may, his book is not a reliable source for the present purpose. Judging from the one source you cite, you have not done nearly enough reading in this area to provide you with the background to identify a reliable source.Chjoaygame (talk) 02:16, 23 July 2014 (UTC)[reply]

The side bar formula on work and entropy change

I agree. But the assumption of maintaining equilibrium should have started on the first equation, not the second. Otherwise it looks to the casual reader like you missed the chain rule in calculus. — Preceding unsigned comment added by 2601:C:8D80:249:250:8DFF:FEB5:7FA4 (talk) 22:24, 31 May 2014 (UTC)[reply]

asking for reliable source

I have asked for reliable sourcing for a statement in the article: "It has more recently been extended in the area of non-equilibrium thermodynamics."

An opinion is expressed by Lieb, E.H., Yngvason, J. (2003), The Entropy of Classical Thermodynamics, chapter 8, pp. 147–195, of Entropy, edited by Greven, A, Keller, G. Warnecke, G. (2003), Princeton University Press, Princeton NJ, ISBN 0-691-11338-6. They write on page 190:

"Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it a clearly satisfactory way. (For example, Boltzmann's famous H-Theorem shows the steady increase of certain function called H. This, however is not the whole story, as Boltzmann himself knew; for one thing, in equilibrium (except for ideal gases), and for another, no one has so far proved the increase without making severe assumptions, and then only for a short time interval (cf. Lanford 1975) ...)
"It is not clear if entropy can be consistently extended to non-equilibrium situations in the desired way. ..."

An opinion is expressed by Grandy, W.T., Jr, (2008), Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6, on page 153. He writes:

"A century later Jaynes (1971) demonstrated that the H-theorem, therefore the Boltzmann equation upon which it is based, cannot be generally valid, even for a dilute gas. ..."Chjoaygame (talk) 11:33, 15 July 2014 (UTC)[reply]
Mathematical treatment of physical systems quite often results in sets of equations to which there is no analytical solution possible.. 3D aerodynamics and aero-elasticity problems often end up that way. But that does not mean the physical situation cannot be described or measured in such a way as to give reliable conclusions about macro behaviour. The problem with your references doing admirable in-depth mathematical treatment may be a bit like not seeing the forest for the trees. If probability is properly considered as the sole driving force behind the second law or entropy increase, it is not too difficult to answer the question "What drives the direction of entropy change in non-equilibrium, isolated thermodynamic systems?" The answer simply falls out of the question! Since by definition it is only average behaviour in macroscopic terms being sought.. the system must tend toward a more probable state. Heat must still move from hot to cold.. etc.
And by the way, the reference you are really missing is one that states that isolated thermodynamic systems evolve and defines 'evolve' as the opposite of its synonym 'develop'! Gyroman (talk) 01:24, 23 July 2014 (UTC)[reply]
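(A minimal stochastic sketch of the "tend toward a more probable state" point in the comment above, under assumed toy dynamics, the Ehrenfest urn model; my own illustration, not from the references under discussion. Starting from an improbable, fully ordered split, the occupation drifts toward the most probable 50/50 macrostate and then fluctuates about it.)

import random

random.seed(0)
N = 1000        # toy number of particles
n_left = N      # start in an improbable, ordered state: all particles in the left half

# Assumed dynamics: at each step one particle, chosen at random, hops to the other half.
for step in range(1, 20001):
    if random.randrange(N) < n_left:
        n_left -= 1   # a left-half particle hops right
    else:
        n_left += 1   # a right-half particle hops left
    if step % 5000 == 0:
        print(step, n_left)   # drifts toward N/2 = 500, the most probable macrostate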
Dear Editor Gyroman, one can see that you mean well here, but you have not come near to offering an adequately reliable source for that general claim about entropy.Chjoaygame (talk) 02:30, 23 July 2014 (UTC)[reply]

So what exactly is the quantity which increases according to the second law?

The above discussion is confusing to non-experts, because the second law was originally classical and is in fact frequently used to study non-equilibrium states. If Clausius and Kelvin said that the entropy of the (system plus surroundings) always increases for non-equilibrium processes, then they must have provided some definition of the quantity which increases. The two sentences questioned were "Historically, the classical thermodynamics definition developed first." and "It has more recently been extended in the area of non-equilibrium thermodynamics." The second sentence has now been deleted by Chjoaygame, and perhaps it was not quite accurate or reliably sourced. But I think it would be useful to readers to provide a more accurate explanation of why entropy can be used to describe non-equilibrium states and processes. I think it has to do with considering reversible processes which are infinitesimally removed from true equilibrium states, but I won't try to write a complete (and sourced) statement because I am certain that Chjoaygame can write a more accurate explanation of this point than I can. Please. Dirac66 (talk) 20:42, 23 July 2014 (UTC)[reply]

"disorder" Gyroman (talk) 01:33, 6 August 2014 (UTC)[reply]

I thought you would never ask !

The entropy of classical thermodynamics is a state variable for the energy picture U = U(S,V,{Nj}), and a state function for the entropy picture S = S(U,V,{Nj}), of a thermodynamic system in its own state of internal thermodynamic equilibrium. For physical systems in general, the classical entropy is not defined.

A straightforward but perhaps long-winded, though very safe, statement of the second law might go as follows:

One may consider an initial set {i} of several thermodynamic systems, each in its own state of internal thermodynamic equilibrium with entropy {Si}. There may then occur a thermodynamic operation by which the walls between those systems are changed in permeability or otherwise altered, so that there results a new and final set {f} of physical systems, at first not in thermodynamic equilibrium. Eventually they will settle into their own states of internal thermodynamic equilibrium having entropies {Sf}. The second law asserts that the sum of the final entropies is never less than the sum of the initial entropies: Σ Sf ≥ Σ Si.

Every natural thermodynamic process is irreversible. 'Reversible processes' in thermodynamics are virtual or fictional mathematical artifices, valuable, indeed practically indispensable, devices for equilibrium thermodynamic studies. Mathematical artifices nevertheless.

Thermodynamic operations have been implicitly recognized, though not so named, since the early days. Kelvin spoke of "inanimate agency": "It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects." Logically this implies that he contemplates "animate agency"; that means, in modern language, 'thermodynamic operation'. They are essential for thermodynamic reasoning.

A step to dealing with non-equilibrium problems is to consider physical systems near enough to being in their own states of internal thermodynamic equilibrium that one can take the entropy to be the same function of the same state variables as for equilibrium. This is an approximation that works very well for many problems, and much valuable work has been done with it. But for an article on entropy to allow it, without specific notice that it is an approximation, I think is loose.

A further step towards non-equilibrium thermodynamics is to try to work with a scalar entropy that is a function of an extended set of state variables, that for example may include fluxes and time rates of change or spatial gradients of classical state variables. This works for a wider range of problems. Again I think an article on entropy that intends to include this should say so explicitly, not just loosely imply it.

A further step towards non-equilibrium thermodynamics is to use a thoroughly non-classical extension of the concept of entropy, the multiple-time hierarchy of entropies. The two-time entropy is a function of a time interval, between the initial state and the final state. It provides a criterion for the expected direction of non-equilibrium processes, when the one-time entropy is not an adequate guide. But I would say that for Wikipedia it is to be regarded as research matter, and reliable sources for it are not too many. I think in this article it calls for explicit, rather than just vaguely or loosely implied, mention.

Loose statements that refer to changes in entropy in an isolated system I think are indeed loose statements, in an article on entropy. The classical entropy of a physical system not in its own state of internal thermodynamic equilibrium is not defined. The entropy of an isolated thermodynamic system in its own state of internal thermodynamic equilibrium does not change. Statements that refer to changes have many implicit but tacit presuppositions that are not likely to be apparent to readers not familiar with the subject.

Loose statements in this article can easily be used to support wild speculation, such as about the entropy of the universe, something that is hardly definable, and certainly not classically defined. How far do we want this article to supply such support?

Loose statements may be argued for because they 'help' readers who want a 'quick and efficient' glimpse of entropy. Perhaps. But are they really well served by inviting them to accept loose statements? And what about those who want to learn something factual and reliable?

It would not be easy to change the article to say this kind of thing.Chjoaygame (talk) 10:31, 24 July 2014 (UTC)Chjoaygame (talk) 10:42, 24 July 2014 (UTC)[reply]

Hm. I notice that some of these points are discussed in the article on nonequilibrium thermodynamics, so we could just add a link to that article for supplementary information instead of repeating everything.
However I think there is a simpler version suitable for this article. I have now consulted several undergraduate textbooks to refresh my memory, and they say that entropy is a state function which is defined for a reversible path as the line integral of dQ/T, and for an irreversible path as having the same value as for the reversible path. So ΔS is operationally defined for any system as ΔS = ∫ dQrev/T, with the integral evaluated along a reversible path from the initial to the final state.
We do of course have this equation in the intro already, but I believe we should add that since S is a state function, it can also be used to evaluate ΔS for irreversible (nonequilibrium) changes if the integral is evaluated for a reversible path. We could also point out that for both equilibrium and nonequilibrium, this classical definition only gives ΔS and not an absolute S. Dirac66 (talk) 16:26, 24 July 2014 (UTC)[reply]
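(A minimal worked example of that operational definition, assuming an ideal gas; my own illustration, not taken from the textbooks mentioned above. In an irreversible free expansion no heat flows, yet ΔS is obtained by integrating dQrev/T along a reversible isothermal path between the same initial and final states, because S is a state function.)

import math

R = 8.314462618    # gas constant, J/(mol K)
n = 1.0            # amount of ideal gas, mol (hypothetical)
T = 300.0          # temperature, K (unchanged in a free expansion of an ideal gas)
V1, V2 = 1.0, 2.0  # initial and final volumes (hypothetical; only the ratio matters)

# Along a reversible isothermal path, dQrev = T dS = n R T dV / V, so
# dS = integral of dQrev / T = n R ln(V2 / V1).
dS = n * R * math.log(V2 / V1)
print(dS)   # about +5.76 J/K, the same value as for the irreversible free expansion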
I am not proposing to try to say these things in the present article, and I am not suggesting that we should link this article to the one on non-equilibrium thermodynamics, which is rather messy. I am just saying that the present article might benefit from more caution in what it suggests of the universality of entropy as a guide to physics. I don't have any particular proposed changes to the article in mind right now.
The worry is not about irreversible processes. Physically, thermodynamic processes are all irreversible. Only mathematical virtual "processes" can be reversible. The worry is about processes that do not start and finish in states of thermodynamic equilibrium. Processes that exactly start and finish in states of thermodynamic equilibrium are found in laboratories, but not so often outside them.Chjoaygame (talk) 17:51, 24 July 2014 (UTC)[reply]
Yes, I realize that reversible processes and also equilibrium initial and final states are only idealizations which can approximate reality. This is of course true of many scientific concepts - for example pure substances do not exist as any analytical chemist will tell you. But all of these idealizations are still useful approximations to reality in favorable cases. Dirac66 (talk) 02:40, 26 July 2014 (UTC)[reply]
Of course you are right there.
I was just worried that the reader might get the impression that the way from classical entropy to a hoped-for thermodynamics of general physical processes was all plain sailing, guided by the Admiralty charts.Chjoaygame (talk) 02:56, 26 July 2014 (UTC)[reply]
So what is the answer to the original question..? Gyroman (talk) 01:43, 6 August 2014 (UTC)[reply]