Talk:Entropy
Former good article: Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.

Article milestones:
June 22, 2006 - Good article nominee: Listed
February 20, 2008 - Good article reassessment: Delisted

Current status: Delisted good article


First sentence

Are people opposed to editing the first sentence? It looks like it has been stable for a long time, but, as far as I can tell, it's not even grammatically correct. It doesn't really say anything. Any thoughts? Sirsparksalot (talk) 14:38, 18 September 2012 (UTC)

Personally, I agree. The sentence is very busy, with far too many slashes, and it requires too much prior knowledge of the terminology to be useful to the average reader. I think expanding it to become an opening paragraph of its own could be helpful. Here's how it reads now:
Entropy is the thermodynamic property toward equilibrium/average/homogenization/dissipation: hotter, more dynamic areas of a system lose heat/energy while cooler areas (e.g., space) get warmer / gain energy; molecules of a solvent or gas tend to evenly distribute; material objects wear out; organisms die; the universe is cooling down.
Here's what I would probably change to make it clearer:
Entropy is the thermodynamic property toward equilibrium. Entropy is the property that produces average, creates homogenization, and causes dissipation. Entropy arises from the second law of thermodynamics, which says that uneven temperatures will attempt to equalize. Although, according to the first law of thermodynamics, energy cannot be created or destroyed, the hotter, more dynamic areas of a system will lose heat or energy, causing cooler areas to get warmer or gain energy. Examples of entropy include thermal conduction, in which heat transmits from hotter areas to cooler areas, until thermal equilibrium is reached. It also includes diffusion, where molecules of solute in a solvent, or mixtures of gases or liquids, tend to evenly and irreversibly distribute. Material objects wear out, organisms die, and the universe is cooling down.
Does that seem to read any better? Zaereth (talk) 20:18, 19 September 2012 (UTC)
Well done Zaereth! ... One minor typo: "causing cooler areas to get warmer" [or does your original conform with the rules of standard American English?] ... And a more significant point: perhaps replace your "and the universe is cooling down" [at the very end] with a reference to Wikipedia's Heat Death of the Universe. I know what you mean, but as you know, as long as hydrogen atoms still exist, they have the potential to fuse into helium (and so on up the scale) - releasing more heat, which (by the First Law) must go somewhere, even if some of it ends up locked inside black holes? ... (which is, in any event, a somewhat complicated concept?) --DLMcN (talk) 09:39, 20 September 2012 (UTC)
Yes, that was a typo. (I'm famous for my typos, "fat-fingers," and dyslexisms.) Thanks for catching that. I really like this type of collaboration, because ... you can actually look at the building of an encyclopedia as a reversal of entropy. As Dr. Ian Malcolm (Jurassic Park) said, "Life will find a way." I corrected that typo, so thanks again.
I would love to discuss black holes, and gravity in particular, but this isn't really the place for it. Black holes also seem to defy entropy, and many theories exist about that as well. I have no objection to adding a link to heat-death of the universe. However, unlike thermal conduction or diffusion (which are knowns), heat death is still a theory, one which the theory of dark energy may topple. Perhaps if we add a clause at the end of the sentence ... something like, "...and the universe is cooling down, possibly leading to the heat-death of the universe." That way we can present the view without necessarily endorsing it. Does that sound ok? Zaereth (talk) 13:08, 20 September 2012 (UTC)
But on average (i.e., taking it as a whole) I do not think we can really say that the universe is 'cooling down'. Individual stars, yes - as they become White Dwarfs and then burn out completely - but the heat which they progressively lose must go elsewhere. --DLMcN (talk) 13:46, 20 September 2012 (UTC)
When talking about the universe cooling, we're really referring to the cosmic microwave background radiation. In the beginning, this blackbody radiation was supposedly very hot and energetic, like visible light or even shorter wavelengths. As the universe spreads farther and farther into infinity, it becomes more and more spread-out and dispersed, which equates to cooling. Now this background radiation has stretched from very hot light into microwaves, which indicates that the universe itself is within just a few degrees of absolute zero. Zaereth (talk) 17:27, 20 September 2012 (UTC)
Let's give this a cursory once-over:

Entropy is the thermodynamic property toward equilibrium.

  • That isn't even English. Words need to be added if it is to have even the potential to be meaningful.

Entropy is the property that produces average, creates homogenization, and causes dissipation.

  • Incorrect. The cumulative effect of umpteen-many random quantum state jumps (if one is describing reality using an approximation which allows them) is what tends to create homogenisation and cause dissipation. Entropy is (or rather can be) a measure of that homogenisation, or extent to which dissipation has occurred.
  • "Average" is a statistical measure used by analysts of the system. It is not something "produced" by entropy.

Entropy arises from the second law of thermodynamics,...

  • Err, no. The Second Law is a law about entropy. But entropy can sometimes still be defined in circumstances where the Second Law is hard to apply.

... which says that uneven temperatures will attempt to equalize.

  • That is not a standard statement of the Second Law. It would be hard to apply the statement, for example, to a domestic refrigerator -- a standard case for the application of the Second Law.

Examples of entropy include...

  • The examples which follow are all examples of entropy increase. They are not examples of entropy, which is a measure of a particular aspect of a system, or of part of a system, at a moment in time.
Sorry if it may seem as if I'm being pedantic, but part of the challenge of the lead section of an article is that above all that section needs to be clear and to be accurate. Jheald (talk) 14:56, 20 September 2012 (UTC)
... a challenge which, I do not disagree, the present lead also fails badly. Jheald (talk) 15:00, 20 September 2012 (UTC)
I agree that it's not perfect, and there are many challenges in defining entropy. I also agree that the lede should be clear and accurate. However, the lede, by nature, is going to be somewhat vague and simplistic. Most of what is written there now adheres to the commonly accepted definition that entropy is: "The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity." One great problem is that there are many definitions of entropy, depending on what science you prefer. Macroscopic entropy is defined somewhat differently from microscopic entropy. Some consider it to be a mysterious driving force, while others consider it to be a measure of that mysterious force. Some scientists believe that there will never be a satisfactory definition, but I don't think the lede is the place for all of that confusion. Here is where we need to start with a certain, albeit vague, definition, so that the inexperienced reader will have some sort of context for understanding the body of the text. So, my question to you is, what would you change to improve the first sentence? Zaereth (talk) 16:50, 20 September 2012 (UTC)
Part of what people want from WP is to cut through vagueness and mystery -- where it's uninformed or unnecessary -- and instead get a good steer on what the scientific understanding of the term actually is.
So to start with, entropy is not a "force" or a "tendency". It is a measure of a particular aspect of a system, at a particular moment in time.
It is the universe which shows a tendency towards increased uniformity (for reasons which actually can be quite readily understood). Entropy is a numerical measure, that can be calculated, that quantifies that uniformity.
Straightening this confusion out is the most pressing thing that needs to be fixed in the proposed text.
More precisely, entropy quantifies the number of microscopic states of the system which are compatible with a particular macroscopic description. More uniform macroscopic states are associated with larger numbers of microscopic states. As the system gets jostled about, and gets bounced about from its initial microscopic state to "nearby" microscopic states, it is more likely to find itself in a macroscopic state associated with more microscopic states than fewer microscopic states. So the effect of that random evolution is to move the system to a state which macroscopically seems more homogeneous, closer to uniformity -- and makes any spontaneous move to a macroscopic state of less uniformity very unlikely. That's the heart of the idea of entropy, which may be worth setting out in the paragraph immediately after the fold -- with citations to informal introductions setting out the concept in this way. Jheald (talk) 17:33, 20 September 2012 (UTC)
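To make the microstate-counting picture concrete, here is a minimal Python sketch (added for illustration, not part of the original thread; the two-halves-of-a-box model and N = 100 are assumptions chosen for simplicity):

```python
# Count the microstates W(n) compatible with the macrostate "n of N particles
# are in the left half of a box", and report Boltzmann's S = k ln W in units
# of k_B.
from math import comb, log

N = 100  # number of particles (illustrative assumption)
for n in (0, 25, 50, 75, 100):
    W = comb(N, n)          # number of microstates for this macrostate
    print(f"n_left={n:3d}  W={W:.3e}  S/k_B={log(W):6.2f}")
```

The even split (n = 50) has roughly 1e29 microstates while the all-left macrostate has exactly one, which is why random jostling overwhelmingly carries the system toward the uniform macrostate, as described above.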
I've occasionally offered re-writes to clarify bits of the lead in the past, eg diff, or further back this, which I still think is quite a nice balanced intro. They mostly haven't stuck, so if it's all right with you, I'm quite happy to let somebody else have a go, and just try to offer encouragement and what are intended to be constructive comments from the sidelines. Jheald (talk) 17:46, 20 September 2012 (UTC)
Yes, but that type of statistical interpretation will be very difficult for the newcomer to understand. The actuality is that entropy is a mathematical model which serves to describe some property of a system which cannot be expressed simply in some other way. For example, here is a quote from the book The Entropy Principle: Thermodynamics for the Unsatisfied, by André Thess: "A simple example shows us this is not the case. When a stone falls into a well, the energy of the water increases by the same amount as the decrease in potential energy of the stone. Can this process run spontaneously in the reverse direction? It would certainly not violate the first law of thermodynamics if the water would spontaneously cool down a little bit and conspire to throw the stone into the sky. Nevertheless, our experience shows us that such a process doesn't ever occur. Our physical intuition suggests that "something" has been lost in the system consisting of the stone and the water. In what follows we will see that this "something" can be accurately described in mathematical terms and leads us to the concept of entropy."
The book Entropy and Information Theory gives yet another definition, and Thermal Engineering yet another. Chaos theory has yet another use for entropy. And then we come to "entropic forces", such as osmotic pressure or depletion. In considering all of this, we still have the problem of presenting the information in the best way that can be understood by a general audience. This is typically in a non-linear fashion, with the simplistic, vague, elementary school definitions coming first, and then expanding, from there on, in a pyramid fashion. You can find this spelled out in books like On Writing Well: The Classic Guide to Writing Nonfiction or Reading and Writing Nonfiction Genres. So how can we incorporate all of this into one sentence, which will give an accurate definition of entropy? Zaereth (talk) 18:41, 20 September 2012 (UTC)
Note that a lead isn't necessarily meant to be an introduction. Per WP:LEDE, it is supposed to be a summarisation of the article -- how much of the content of the article can you summarise, restricted to four paragraphs (eg for Google, or someone else that just reproduces the lead, throwing all the rest of the article away). The article should certainly build up in as introductory and accessible a way as possible, but the place for that introduction to start is below the fold. The lead is supposed to summarise the whole article.
With regard to entropy, there is long-standing pedagogical discussion as to whether it is better to start from a macroscopic basis -- since the macroscopic idea of entropy can be developed entirely in macroscopic terms, with a microscopic picture only added later; or whether it is better to start from the microscopic picture, being more explicit, concrete and tangible, and therefore leading to a much more physically motivated, understandable sense of what is going on, and why this is how things work. Myself, I would tend to the latter; or at any rate bringing up the microscopic picture very quickly to explain why the macroscopic state function has the properties it does.
The macroscopic and microscopic views of entropy are fairly compatible. The chaos theory use of the term is rather different, and doesn't have much overlap. "Entropic force" follows fairly quickly given a good grasp of thermodynamic entropy, being the derivative of the free energy with respect to a parameter, the derivative being dominated by changes in the entropy-containing term in the free energy. Jheald (talk) 19:12, 20 September 2012 (UTC)
Fair enough. However, this is not about the lede; it's about the first sentence. The question it needs to answer is: What is entropy? Forget, if you will, my attempt to expand upon what the sentence already says (in order to make it readable); just look at the way it's currently written. Now, imagine you're in front of an audience of people ranging from elementary-school students to college students to senior citizens; some know all about science and others know nothing. Now imagine that you've been assigned the task of defining entropy, but must do it in only one sentence. What would you say? Zaereth (talk) 20:31, 20 September 2012 (UTC)
(Personally, in that position I'd say FU, I'm not gettin' paid enough for this. :-) But seriously, this is the most important sentence; the sentence which determines whether the majority of an audience "reads on or just moves on." Does anybody have any ideas, because, as it reads now, it's borderline gibberish. As Jheald pointed out, it's not even a complete sentence. Anybody? "Entropy is..." Zaereth (talk) 01:18, 21 September 2012 (UTC)
I've had a look at the diffs, but would rather see a simpler definition in the lede, saving the actual math for the body. I do appreciate your help with this difficult problem. Perhaps someone else will come along and help us out. Zaereth (talk) 18:41, 20 September 2012 (UTC)

The only correct definition is this: "The entropy of a system is the amount of information needed to specify the exact physical state it is in." You can then decide in what units to measure this information, which defines the prefactor k in S = k Log(Omega). But it should be clear that if a system can be in Omega number of states, that Log(Omega) is proportional to the number of bits of information needed to point out exactly which of the Omega states the system actually is in.

With this fundamental definition of entropy it is also easy to make contact with thermodynamics, and then you also have a rigorous definition of heat, work and temperature in terms of fundamental concepts. The second law follows naturally from all of this. Count Iblis (talk) 02:23, 21 September 2012 (UTC)
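As a small numeric illustration of the unit-choice point above (added here; the Boltzmann constant is the standard value, the choice of Omega is an arbitrary assumption):

```python
# S = k log(Omega): the prefactor k only fixes the unit of information.
from math import log, log2

k_B = 1.380649e-23      # Boltzmann constant, J/K
Omega = 2**100          # assume 2^100 accessible microstates

bits = log2(Omega)      # information to single out one microstate: 100 bits
S = k_B * log(Omega)    # the same quantity in thermodynamic units
print(f"{bits:.0f} bits  <->  {S:.3e} J/K")   # 100 bits <-> ~9.6e-22 J/K
```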

My suggestion:
Entropy is an artificially constructed quantity obtained by dividing the amount of heat supplied to an object (or emitted from one), by its absolute temperature. It has proved to be very useful in the development of thermodynamic theory. In layman's terms, it can be regarded as an indicator of the usefulness of a particular packet of energy. --DLMcN (talk) 05:30, 21 September 2012 (UTC)
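A worked instance of this heat-over-temperature definition (added for illustration, using the standard latent heat of fusion of ice):

```python
# Reversible heat transfer at a fixed temperature: Delta S = Q / T.
Q = 334e3    # J, heat needed to melt 1 kg of ice (latent heat ~334 kJ/kg)
T = 273.15   # K, melting point: the temperature at which the heat is absorbed
print(f"dS = {Q / T:.0f} J/K per kg of ice melted")   # about 1223 J/K
```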
Well that definition works, I suppose, for open systems, where energy is being added or taken away. I'm curious about Count Iblis' idea of starting with Shannon's entropy rather than Clausius'. This is not compatible with the way that the rest of the lede is written, but, if it is actually easier to tie into both macroscopic and microscopic entropy, then perhaps this is a better way. Such elucidation would need to come rather quickly, so I'm curious as to how the Count would tie this all together. Personally, if I was to start from scratch, rather than trying to expand what is already there, I might try to start with something like: "Entropy is a measure of the randomness of a system, the distribution of energy within a system, or the measure of uncertainty of the system. In thermodynamics, a closed system of uneven temperatures has low entropy, while a closed system at thermal equilibrium is at maximum entropy." or something like that. Then, perhaps a brief description of how it applies to statistical entropy, and finally information entropy, all in the first paragraph. In the next paragraph I would probably go into increasing entropy, how it applies to the second law, etc... Anyhow, I'll be gone for the next week or so, so I'll let you all work out a better definition. (It's really nice to be a part of this type of collaboration, for a change, and I thank you all for your responses.) Zaereth (talk) 06:10, 21 September 2012 (UTC)
I'm glad to see that someone took the initiative and changed the first sentence while I was away. It reads much better this way. It still doesn't cover every aspect, but doing so is trickier than I had originally thought. Perhaps I'll sleep on it some more, and try out some ideas, here, sometime in the future. Zaereth (talk) 23:41, 27 September 2012 (UTC)
@Zaereth: Sorry to have gone off-air on you last week. I'd started writing a reply, but then some quite pressing things IRL had to get dealt with.
I would agree that the new first sentence is a marked improvement on what was there before; and it may well be close to the best way forward. On the other hand, the way the definition (the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work) is true is perhaps a bit more subtle than the average reader first encountering the sentence might suspect, so I'm a bit concerned that if we're going to use this as our headline statement of entropy, without misleading or confusing people, then the statement needs to be unpacked a bit, and the sense in which it is true needs to be explained somewhere -- perhaps the first paragraph below the fold; or an explanatory footnote, perhaps.
To be a bit clearer about what I'm getting at: the statement we now give is true, but only when two rather non-obvious points are understood. First, the temperature in "per unit temperature" is not the system temperature, but is the temperature of the coldest accessible heat sink the system can dump heat into. Secondly, the "thermal energy unavailable to do work" is also not a state variable of the system: rather, it too changes, depending on how cold a heat sink can be found. Contrary then to what might be first impressions, neither of these quantities is therefore a state variable, but it is the surprise proposition of classical thermodynamics that their ratio S is a state variable.
"A measure of the energy unavailable to do useful work" is a standard first-line definition of entropy in dictionaries of physics and dictionaries of science. As a form of words, it always makes me uncomfortable though, because entropy is absolutely not an energy. For that reason, I do think our new "unavailable energy per unit temperature" is better. But if we're going to run with that, I do think we need to make clear: temperature of what; and that (unlike the entropy) the availability/unavailability is not a state property depending only on the system state itself.
The other thing I'd say is that there does seem now to be a huge amount of repetition and duplication in the article as it now stands, without (to me) any very clear sense of shape or structure. It seems to me that a few years back the article was much more streamlined and clearly structured, and all-in-all in better shape. Since then it looks as if people may have copied and pasted in various headline sections from various other articles, without much sense of the structure and shape of what was already here. So some judicious pruning and shaping and trimming would (I think) be very welcome. Jheald (talk) 07:23, 29 September 2012 (UTC)
One other tiny thing that for me just slightly jarred is the phrase per unit of temperature. I'm happy to talk about "per unit" of some extensive quantity like volume or mass, that can be parcelled up into separate pieces; or even "per unit" of a temperature range, where the change in temperature can be parcelled out degree by degree. But here the resonance seems just slightly wrong, because we're not thinking of either the system temperature or the reservoir temperature changing, so any association with stepping the temperature unit by unit seems just slightly deceptive. Jheald (talk) 10:56, 30 September 2012 (UTC)
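A rough numeric sketch of the two caveats raised above -- that the "unavailable" energy depends on the coldest accessible sink, while S itself does not (all numbers are illustrative assumptions, and ideal Carnot conversion is assumed):

```python
# Heat Q held at T_hot carries entropy S = Q/T_hot, fixed by the state alone.
# How much of Q is unavailable for work depends on the sink: T_cold * S.
Q = 1000.0       # J of heat in a reservoir (assumed)
T_hot = 500.0    # K, reservoir temperature (assumed)
S = Q / T_hot    # 2 J/K, a property of the state itself

for T_cold in (300.0, 200.0, 100.0):   # candidate heat sinks
    unavailable = T_cold * S           # waste heat that must be rejected
    available = Q - unavailable        # Carnot limit: Q * (1 - T_cold/T_hot)
    print(f"T_cold={T_cold:3.0f} K: unavailable={unavailable:4.0f} J, "
          f"available={available:4.0f} J")
```

The entropy stays 2 J/K throughout; only the split between available and unavailable energy moves with the choice of sink.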
Hi Jheald. Thanks for your response. I understand about real life because it often gets in my way too, so no need to apologize. In fact, I'm rarely online on weekends, and only happened to stop by my computer today.
You make a valid point, although I might also point out that the main noun in the sentence is "measure," which leads to the question, "A measure of what?" So, as written, the sentence doesn't say that entropy equals energy, but that entropy is equivalent to a measure of energy states. (The difference between energy states, perhaps?) My typical M.O. is to let it stew in my subconscious for a while, and come back when an idea of how to concisely word something comes to mind. I'll think about your suggestions while I'm gone for the next few days, and perhaps I may be able to come back with an idea or two we can toss around. Thanks again for your input and any ideas you can think of. Zaereth (talk) 22:17, 29 September 2012 (UTC)
Np. I'm aware that I never got back to you with what my ideal clean and succinct "one line" explanation of entropy would be; indeed I'm not sure I have one. For a slightly longer pitch, I think I'd probably still cleave to something pretty close to the diff I suggested before, that you weren't so impressed with, which to me combines a number of standard one-line definitions -- "it's a state function"; "it's a measure of unusable energy"; "it's a measure of disorder/mixedupness"; "it's the amount of uncertainty remaining given a macroscopic specification" -- in a way that I hope presents them understandably and as compatible and related. There's probably a bit more that should be added or amplified, that may make the lead feel slightly dissatisfying until it is there; but this I still think is my best shot, so I'm happy to leave it for others to take up and take further or not as they wish.
Taking a look at other snapshots of the article in (say) six-monthly jumps may also throw up some other alternative approaches that might be possibilities to re-consider, going forward. (As well as perhaps being of interest in its own right, as a case-study of how a text that particular people feel isn't quite right yet can end up getting blown about, hither and yon). Jheald (talk) 11:28, 30 September 2012 (UTC)

I've been thinking about this since my last post here. I've read through lots of books and websites alike. I thought it would be easy enough to come up with one sentence which could sum up what entropy means, but, it turns out, there are as many different definitions as there are people giving them. I'm beginning to agree with Swinburne, in that perhaps we still lack the vocabulary, or even the fundamental understanding of the concept, to put it eloquently in words. (In fact, check this out if you want to see what this exact discussion looked like a hundred years ago.)

In all of this, however, there appears to be one definition that all branches of science can agree upon. It almost seems too obvious, yet it may be like the forest that cannot be seen through all of the trees. Perhaps it may be best to start by saying that entropy is a ratio (joules per kelvin) or a rate measurement, similar to miles per hour (velocity), cents per dollar (profit margin), or ounces per gallon (mixing rate). Maybe, if we start with a very simple statement like that, the rest of the lede and, hopefully, the article will make more sense to the newcomer. (Actually, I think the analogy to profit margin is a rather good one for negentropy, because it's easy for anyone to visualize, whereas entropy is more like the expense margin.) Zaereth (talk) 01:28, 4 January 2013 (UTC)

Saying entropy is a ratio says nothing, really. I'm in favor of trying to draw on people's intuitive notion of entropy in the lede. If you look at a movie running forward and backwards (sound off), you can tell when it's running forward and when it's running backward. Your intuitive understanding of entropy allows you to make that decision. No law of physics is broken in a backward running movie except the second law of thermodynamics, which is essentially a definition of entropy. Broken glass reassembles, beer jumps out of a glass into a pitcher, melted ice unmelts, etc. PAR (talk) 21:56, 4 January 2013 (UTC)
I understand that, and I agree. However, we have to start somewhere. The question is, how do we get from here to there? Nearly every book on non-fiction writing recommends the same thing: Start with a sentence that is all-encompassing, even though it may be rather vague, and then immediately expand on that definition. Since everyone can seem to agree on the mathematical definition, perhaps it is best to start with that. Just as it is meaningless to describe speed in terms of distance only, the reader should be made aware, right off the bat, that it is meaningless to describe entropy in terms of temperature or energy alone. Entropy is a concept that requires a comparison of the two.
I should note that I'm only thinking about adding a sentence to what we already have, in order to clarify what the next sentence (which is currently the first sentence) means. Of course, this "pointing out the obvious" is mainly for the newcomer, but it may help give them a point of reference which can help elucidate the rest of the text. Zaereth (talk) 22:53, 4 January 2013 (UTC)

Restructure

There is a serious structural problem with this article. Most of the material in this article should be moved to Entropy (classical thermodynamics) and this article should then be redirected to Entropy (disambiguation). That is the only way to make plain the other subject areas of entropy from information theory, quantum mechanics, etc. No wonder this article has been so confused and has failed to find focus.--190.204.70.243 (talk) 07:18, 9 January 2013 (UTC)

I second that motion. siNkarma86—Expert Sectioneer of Wikipedia (talk) 18:01, 9 January 2013 (UTC)
For an article such as this, I would rather see it become a parent article rather than a simple DAB page. DABs are great for completely different subjects that simply share the same name, like Mercury (planet) and mercury (element). Here we have different aspects of the same subject. Personally, I think it's better to briefly summarize them in the parent article, and provide "main article links" to the subordinate articles. A similar example is the potential energy article, which is the parent article of gravitational potential energy, nuclear potential energy, and so on. I think this subject can benefit from having a similar parent article, provided we can make it much more readable. (Actually, I think this article could also use an introduction section, briefly summarizing the introduction to entropy article, as well.) Zaereth (talk) 18:29, 9 January 2013 (UTC)
A lot of other languages point to this page. Redirecting it might be disruptive. Almost all of the classical thermodynamics should be moved to Entropy (classical thermodynamics).--61.141.152.67 (talk) 05:03, 11 January 2013 (UTC)
I agree with Zaereth and above. Have this as an introductory article for entropy in its many forms, with links to specific subjects. Then go back and fix the links to this page if appropriate. PAR (talk) 07:55, 11 January 2013 (UTC)
Scholarpedia's article on Entropy is a good example of what an overall article introducing the many meanings of entropy could look like. Note that SP is licensed under a strict non-commercial license though, which is not compatible with WP, so we can't just copy the SP article over here either in whole or part.
Note that when this has been raised in the past, there has been a vocal lobby protesting that the primary meaning of entropy is the thermodynamic one, as e.g. often first encountered in high-school chemistry, or through discussions of the Second Law in mass-audience physics material. That in the past is what has led to this article having the scope it has (i.e. entropy in thermodynamics, with non-thermodynamic understandings of the term entropy handed off to other articles).
I'm not against re-scoping this article, but we should perhaps raise this to a formal RfC, give links to past discussions, and advertise it widely, including especially at WT:PHYSICS and WT:CHEMISTRY. Jheald (talk) 14:06, 11 January 2013 (UTC)
I've been giving this a lot of thought over the last few months. Perhaps a top-down approach is not the best way to tackle this. Maybe it would be better to try reverse-engineering, from the bottom up. The more I think about it, the more it seems like we need a fresh place to start, rather than trying to reword or clarify what is already there. I don't have much spare time to actually sit down and work it all out at once. However, I've been thinking that, within the coming weeks, I would begin an introduction section in my sandbox. (Anyone who has seen my work knows I'm fond of intros.) Intros tend to give a little more latitude, rather than trying to cram it all in the lede, giving some room in which to tie it all together.
I like a good challenge, and this should surely present one. I am also excited because I see great potential for some serious collaboration here, and am happy that we have people representing all of the various aspects of entropy, such as Jheald, PAR, Count Iblis, Dolphin51... (Forgive me if I missed anyone.) I can't promise quick turn-around but, once I get something written, I hope I can get some input from everyone. Thanks. Zaereth (talk) 22:00, 11 January 2013 (UTC)
I've started working on an intro-section, at User:Zaereth/sandbox. If anyone is interested in leaving some feedback, or helping to correct any mistakes, it would be appreciated. Zaereth (talk) 23:54, 18 January 2013 (UTC)
The initial paragraphs of your introduction confuse entropy with heat capacity. Entropy is not a rate: it is an integral. Heat capacity is C = dQ/dT, but entropy is S = ∫ dQ/T. I have removed your first two paragraphs, which introduce the confusion.--212.3.132.195 (talk) 13:31, 27 January 2013 (UTC)
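The derivative-versus-integral distinction drawn here can be made concrete with a short sketch (added for illustration; treating the heat capacity of water as constant is an assumption):

```python
# Heat capacity is a derivative, C = dQ/dT; entropy change is the integral
# of dQ/T. With constant C_p, Delta S = C_p * ln(T2/T1).
from math import log

C_p = 4186.0            # J/K, heat capacity of 1 kg of water (assumed constant)
T1, T2 = 280.0, 370.0   # K, initial and final temperatures

Q = C_p * (T2 - T1)         # total heat supplied: about 377 kJ
dS = C_p * log(T2 / T1)     # entropy change: about 1167 J/K
print(f"Q = {Q:.0f} J, dS = {dS:.1f} J/K")
# Note dS is not Q divided by any single temperature -- it is an integral.
```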

(Undent) Well, I was trying to avoid that confusion. I think you're right, though, that it did introduce some confusion. Perhaps describing it as an interval variable rather than a ratio variable is better. (I was thinking that, because both joules and kelvin are ratio variables, entropy must be a ratio as well.) However, as the new addition is written, it doesn't make much sense, especially to a newcomer. As PAR mentioned above, saying that entropy is an "abstract function of state" doesn't really say anything, and I believe this only adds to the confusion. The first sentence there should concisely state exactly what the math says entropy is. The difficulty lies in making the correct translation.

Entropy is definitely not an abstract thing, but a measurable property of heat, which I was trying to define from the macroscopic, thermodynamic standpoint first, before getting into other forms. To see this, perhaps it would be helpful to point out the difference between entropy and heat capacity. Heat capacity is the amount of energy that needs to be added to a certain amount of something to change its entire temperature a single degree. For instance, it takes a certain amount of energy to raise the temperature of a gallon of water a single degree.

On the other hand, entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only. Entropy does not deal with the heat capacity of the entire substance, but only with the energy needed to change (or "maintain" perhaps would be a better word) the temperature at the boundary where energy is being transferred.

In other words, as energy is added to the gallon of water, the temperature of the boundary does not change instantly. If it did, the energy and temperature would be equal, and the entropy would be nothing. Instead, if adding 1000 joules only increases the boundary temperature to 800 kelvin, then logic dictates that some of that energy is being used for something else. By dividing 1000 by 800, we get 1.25. If providing 800 degrees at the boundary is 100% of the energy needed to perform work (in this example, performing work is simply heating the entire gallon one degree), then you will actually need to add 125% of the needed amount. The rest of that energy will not be used for work (temperature change), and will only be released as waste once the gallon of water cools. I think the main thing to understand is that entropy is not just something that occurs in machinery, but it occurs anytime heat transfers. Zaereth (talk) 01:01, 29 January 2013 (UTC)

"entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only."
What you wrote makes no sense. The correct relation is that temperature is (proportional to) the amount of energy that must be added to change the entropy by a given amount.
"If it did, the energy and temperature would be equal, and the entropy would be nothing."
????? Entropy is not a form of energy. Nor is temperature.
You seem quite confused. Jheald (talk) 22:17, 29 January 2013 (UTC)
Ok, so the definition you're giving me for entropy is T = dQ/dS. I don't doubt that your math skills are better than mine, but what confuses me is how a definition for temperature can be a definition for entropy. Zaereth (talk) 22:36, 29 January 2013 (UTC)
I've been trying to think about a good analogy when talking about things like temperature, entropy, or energy on the macroscopic scale. One that comes to mind is electrical energy (watt-seconds). Electrical energy is defined by the parameters "power (W)," "amps (I)," and "volts (E)." The definitions of each of these parameters are: power, W = IE; amps, I = W/E; and volts, E = W/I. None of these parameters are the same thing, but they are all necessary dimensions of something called watt-seconds, or "electrical energy." Similarly, the relationship between entropy, temperature, and thermal energy are all necessary parameters of something we, as a matter of convenience, call "heat." A quick look at a TS diagram can easily show the relationship. Zaereth (talk) 19:19, 30 January 2013 (UTC)
I'm not sure that's helpful (as well as being WP:OR). You can have entropy (and energy and temperature, for that matter) without having heat.
In response to your earlier comment, it's a change people come to as their understanding of the subject develops. So when people get their first thorough introduction to entropy -- e.g. perhaps in the context of the thermodynamics lectures of a first year undergraduate chemistry course -- then what seem natural are energy, and temperature that one can measure on a thermometer. These are the familiar safe everyday touchstones that one works out from; whereas entropy can seem nebulous, abstract, artificial -- a state function (as the syllabus may require one to prove), but one for which the meaning seems rather remote.
On the other hand, by the time one has moved on to statistical mechanics, entropy is something one can calculate, something that comes to seem completely instinctive (and also very fundamental); and then it is temperature which starts to seem the derived quantity, and in extreme cases not so easy to measure, unless one comes to define it as 1/(dS/dQ). Jheald (talk) 19:41, 30 January 2013 (UTC)
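A hedged sketch of that last point -- that in statistical mechanics one computes S by counting states and recovers temperature as 1/(dS/dE) -- where the two-level-system model, the level spacing, and the particle count are all assumptions chosen for illustration:

```python
# N independent two-level systems, each either in its ground state or excited
# at energy eps. S(n) = k ln C(N, n) for the macrostate with n excited;
# temperature then falls out of T = 1 / (dS/dE), with E = n * eps.
from math import comb, log

k_B = 1.380649e-23   # J/K
eps = 1e-21          # J, level spacing (assumed)
N = 1000             # number of two-level systems (assumed)

def S(n):
    return k_B * log(comb(N, n))   # Boltzmann entropy of the macrostate

n = 200
dS_dE = (S(n + 1) - S(n - 1)) / (2 * eps)   # central difference for dS/dE
print(f"T = {1 / dS_dE:.1f} K")             # about 52 K for these numbers
```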
It seems like some of the later paragraphs of the introduction are not specifically about entropy. They mention entropy and talk about heat flow, but they do not prepare the reader to, for example, actually use or even become familiar with a TS diagram. Explaining entropy should go beyond just reiterating the second law of thermodynamics. It should familiarize the reader with what it is like to solve problems with real entropy data. Even a high school student can perform a calculation with a TS diagram if properly instructed.--86.96.65.146 (talk) 20:37, 30 January 2013 (UTC)
Yes, and you can have volts and amps without having power. Forgive me for the OR, but I wasn't aware that I needed to cite talk page discussions. For a reference, the book Kinetic Theory and Thermodynamics says, "The quantity Q/T is a definite thermal property of the working substance and is called Change in entropy.... The entropy of a substance is a real physical quantity like energy, pressure, temperature that can be measured in a laboratory.... From the above relation, we can say that the dimension of heat energy are the same as that of the product of entropy and absolute temperature." For a ref about electrical energy, see the book Basic Electrical And Electronics Engineering. I agree with you on the statistical side of things, but I fear beginning with that approach will lead to a "cannot see the forest through the trees" syndrome. Zaereth (talk) 20:54, 30 January 2013 (UTC)
It is a real physical quantity, but when it is "measured in a laboratory", what happens is that heat and temperature are more directly observed and then entropy change is computed.--86.97.247.44 (talk) 00:36, 5 February 2013 (UTC)
I understand that. Those words are from R. K. Agrawal, the author of the book. Personally, however, I'm beginning to agree with Hans Fuchs (The Dynamics of Heat: A Unified Approach to Thermodynamics and Heat Transfer): The problem appears to be one of cognitive linguistics. On the one hand, there is the language of math, and I think we all have an intuitive understanding of that language, (whereas others may speak it quite fluently). On the other hand, there is English, which is far more complex, containing a much broader and subtler variety of functions and relationships. The disjoint between math (thermodynamics in particular) and other forms of language seems to be due to the inability to translate an otherwise abstract math formula, and then project that translation metaphorically in standard cognitive imagery and standard cognitive dimensions. I understood this even before I read Fuchs' book, which is why I gave the above example about electricity (eg: amps and volts are easy to measure, but watts must be calculated). These same types of cognitive structures, expressions, and equations are found in all languages, and are almost always based on the relationship between at least three characteristics. (ie: This and that are those. These are those but not that, etc...) Personally, however, I've decided to just leave well enough alone until the linguistics is sorted out. My only suggestion is to include some real-world examples, which can be found in abundance, from cosmology to pattern recognition, if one really cares to look. Zaereth (talk) 04:10, 5 February 2013 (UTC)

(undent) Getting back to the topic of restructuring. It might be helpful to see what some of the other languages do on this topic.

But English currently has:

Perhaps we should restructure along the lines of the French and German.--190.73.248.92 (talk) 14:15, 2 February 2013 (UTC)

Most of the various other uses of "entropy" have been moved out of this article. The article is now focused better on just thermodynamic entropy. We may have to settle for that for now.--76.220.18.223 (talk) 22:54, 6 February 2013 (UTC)
Mostly very sound, I think.
But in my view the article should at least touch on the view that the entropy state function is simply the amount of information (in the Shannon sense) that would be needed to specify the full microstate of the system, that is left unspecified by the macroscopic description.
Many people find that this is the way they like to think about what entropy fundamentally is (although, equally, there are others that utterly dislike it). Jheald (talk) 23:10, 6 February 2013 (UTC)
On further inspection, that is still stated pretty much word-for-word in the lead; but perhaps there does need to be some more in the article to contextualise it. Jheald (talk) 23:13, 6 February 2013 (UTC)
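For context, a minimal sketch of that Shannon reading (the four-state toy distribution is an arbitrary assumption, not drawn from the article):

```python
# Shannon entropy: the average information, in bits, still needed to pin
# down the exact microstate given only the macroscopic probabilities.
from math import log2

p = [0.5, 0.25, 0.125, 0.125]        # probabilities of four microstates
H = -sum(x * log2(x) for x in p)     # H = 1.75 bits left unspecified
print(f"H = {H} bits")
```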
Mostly restored the "information theory" section. Added your description as an introductory paragraph. It still seems a bit long.--200.246.13.126 (talk) 16:48, 7 February 2013 (UTC)

Reversible vs. irreversible

The article so far fails to distinguish clearly between entropy changes due to a reversible process and an irreversible process. The two are usually conceptually distinct and the article should strive to provide an example of each so that the distinction is clear to the reader.--212.3.132.195 (talk) 13:35, 27 January 2013 (UTC)

I agree. There seem to be some common misconceptions about what a reversible process is, and why most processes are irreversible. There is nothing in the 2nd law that says a closed system cannot spontaneously run in reverse. All that is needed is to add all of the work to the waste. However, the universe is supposed to be the only closed system. Therefore, it is entirely possible that the universe will, one day, stop expanding, collapse in on itself and --Bang-- form the universe all over again. However, the universe is simply not running in that direction right now. Within the universe, there are only open systems. For example, I can gather a certain amount of air and compress it. This will concentrate the heat energy, adiabatically raising its temperature. This is a reversal of entropy. However, it is only a small reversal in a very large river. In actuality, energy had to be added to the compressor through the motor, and the motor's electrical energy (no matter how it was made) came from the sun on its way to the universe. The compressor is simply a small reversal along its way. Some of the energy from the compressor cannot wait, and will disperse into the universe. Therefore, if you try to power the compressor by releasing the air into an air motor, you will never be able to pump the compressor up. Ultimately, the energy is on a one-way trip, and any reversal is simply like an eddy current in a river, being small and having little effect on the overall flow. Zaereth (talk) 21:47, 29 January 2013 (UTC)

Specific entropy

I have removed the anchor for "specific entropy" and put a sentence to define it in the lead. I have also updated the redirect specific entropy to just point to this article with no anchor. The anchor I removed did not make sense and seems to have been added by someone just searching for the first occurrence of the phrase, which turned out to be inappropriate.--178.210.45.18 (talk) 18:11, 6 February 2013 (UTC)

Peer review?

So what else does this article need before it is ready for peer review? One thing that would be nice is if we could have just one section that deals with the classical and statistical descriptions.--200.165.161.254 (talk) 00:01, 8 February 2013 (UTC)

Merged the two sections after the "History" section. I also added a "Function of state" section under "definitions". Much better outline: now, each element of the material has a natural home. It is really how these concepts come together in one obscure function that makes entropy difficult to grasp.--189.51.14.154 (talk) 14:17, 8 February 2013 (UTC)
Another thing about the style of the article: it really should be about the math, i.e. geared for science and engineering students at university. That is the adult version of the article. Generalizations about entropy almost all belong in the section (or article) about the second law, rather than here. So... what else does this article need before it might be considered ready for peer review?--77.122.164.30 (talk) 19:28, 11 February 2013 (UTC)
Um. Are you all the same person, who keeps changing IP (which seems to shift from continent to continent)? Or were you three different people? It would be a bit helpful if you could get a user-id (or user-ids plural, if you're more than one of you), so the rest of us can know which of the edits were all made by the same guiding mind...
That said, I think the idea of peer review is a really good one, because no question this should be one of the most important articles under WP:PHYSICS, and in my view it simply hasn't been up to scratch.
I have to confess I haven't read through your edits in detail. But one key question that anyone taking on to reshape this article needs to come to a view about is what relationship it is going to have to all our other articles on entropy -- in particular to Entropy (classical thermodynamics), which I think is in reasonably good shape, in terms of being relatively accessible and not over-complicating things too early, but also covering a lot of good material. So wherever you take this article, it needs to have a distinctive role, that is different from any of the other articles we have on entropy. In the past, I think, it has been to outline the idea of entropy in classical thermodynamics, and also the idea of entropy in statistical thermodynamics, and to discuss how the two are compatible, while leaving quite a lot of each subject to be addressed in more depth by the more specific articles. So this article becomes a coat-stand, if you like, to hang various more detailed more specific articles off. I think that is how it has been in the past. But it might be objected that such a structure may not be perfect, because too many people may only get to this article, and not read things that would actually be valuable to them, because they never get to the sub-articles that would discuss them. But I think the real question is first to get clear to yourself what ideally you think the scope of this article should be for it to contain.
Assuming you get past that stage, something that almost all our Physics articles are very bad at compared to the rest of Wikipedia is referencing. (Perhaps because the ethos in Physics is so much to be able to work things through for yourself from first principles, if you really understand them.) For contrast, compare one of our few entries in Physics that is rated as having Featured Article status, namely Equipartition of energy, and see just how dense the referencing is. If you really want to take this article a step further, one of the things that could help would be to take some of the favourite classic textbooks on thermodynamics -- eg Reif, Kittel etc -- and add the relevant page references to each key idea or each section in this article. That also has the advantage of helping you to really scrutinise this article, and make sure that everything we say really does solidly reflect something presented in what we would consider an authoritative source. Working through the books may also have the flipside advantage of drawing your attention to perhaps anything we really ought to be discussing or pointing out in our text but maybe are not.
As for peer review, when you feel you've got the article into a state that you feel happy with, you could post at WT:PHYSICS or WT:CHEMISTRY or perhaps WT:AST asking for an informal internal review. Don't probably expect too much from this -- while there are some very high calibre people out there (particularly in the Maths project), most of us are just run of the mill editors, and it may be a long time since we studied thermo at Uni, or alternatively typical editors may be students who are still studying their way through their first degree -- either way typical editors probably don't have the grasp of detail that a really good peer review needs. But they can be very useful in telling you how your article reads to somebody not particularly initiated -- does it communicate the big ideas well, so they stand out? Does it flow? Does it avoid getting slowed down in unnecessary technicalities, that could be left until the reader has a better grasp of the overall shape of the topic? That's the sort of quality of input maybe to expect from such an informal internal review.
There is also WP's official assessment process -- how does the article compare with the criteria to be rated B, or A, or WP:GA, or WP:FA? From what I've seen this also may tend to be judged by generalists, who may be very hot on whether a particular sentence has a supporting reference link, or on whether the references are typeset just so, but who may be very weak on the actual science, and simply not qualified to spot whether the article may have glaring inaccuracies, glaring gaps, or material that simply isn't the best way to explain concepts.
So the scary one is asking for external peer review. This is probably best done when you have got very straight what intended scope you want the article to cover; got some informal internal reviewers to tell you whether they think it flows and develops reasonably; and when you've done quite a lot of work referencing it, and making sure it reasonably tracks the principal and most authoritative sources. At that stage, it becomes a question of who you know (or who anybody at WT:PHYSICS knows) who actually teaches this stuff at a university, who could be persuaded to dig into it and give it a proper critique. It's a high bar to aim for, but if you pass that, you may be well on your way to getting it acclaimed as a featured article, and seeing your article as the big draw in the hot spot of the front page for a day. Jheald (talk) 22:46, 11 February 2013 (UTC)
If it could just get to be an A-class article or maybe a Good article, that would be nice. If it were in a state where a university student looking for answers such as "What is it? How was it discovered?" would feel that they got those answers, at least, that would be nice. The main changes made recently were to remove any suggestion that this is some sort of disambiguation page. This article is probably best named entropy (thermodynamics), since that is how the categories (and the French and German articles) are structured. The fate of the other two articles (entropy (classical thermodynamics) and entropy (statistical thermodynamics)) seems unclear, since there is considerable overlap with this one. Look at the new section named "Entropy of a system", with material that is peculiar to neither definition. The choices seem to be: 1. Just give up and do nothing. 2. Merge the other two articles back into this one. 3. Merge most of this article back into the other two. 4. Just disclaim that the other two are specialized articles with a lot of overlap with this one. Any other viable options?--189.51.14.154 (talk) 23:29, 11 February 2013 (UTC)