Talk:Entropy: Difference between revisions


:: I think the present example, although Count Iblis is strictly correct, is not the best. I think it would be better to define, for example, a macrostate where all the black cards are on top, all the red cards on the bottom. Then we could talk about the number of microstates that correspond to this macrostate. That might be enough, but then we might make the intuitive argument that a macrostate where no five cards in a row were the same color would be a macrostate that had more microstates, and therefore more entropy than the first macrostate. [[User:PAR|PAR]] ([[User talk:PAR|talk]]) 00:34, 1 December 2009 (UTC)
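For concreteness, here is a rough sketch (my own illustration of PAR's counting argument, not taken from any source under discussion) of how the two macrostates compare under S = k ln Ω, counting orderings of a 52-card deck:

 # Sketch: compare ln(omega) for the "all black on top" macrostate of a 52-card deck
 # with the count for the deck as a whole (all orderings allowed).
 from math import factorial, log
 omega_sorted = factorial(26) * factorial(26)   # orderings within the black block times the red block
 omega_total = factorial(52)                    # every possible ordering of the deck
 print(log(omega_sorted))   # S/k for the sorted macrostate, about 122
 print(log(omega_total))    # S/k if any ordering is allowed, about 156

A looser macrostate such as "no five consecutive cards of the same colour" admits many colour sequences rather than just one, so its Ω (and hence its entropy) lies between these two counts, which is the intuition PAR describes.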

== Confusing statement ==

From the lead:

:"Increases in entropy correspond to irreversible changes in a system"

From a layman's standpoint this is clearly not true. Taking the melting-ice example in the graphic, the ice can be re-frozen. Does it mean irreversible ''without expenditure of energy'' perhaps? My competence in the subject is not sufficient to make any changes. [[Special:Contributions/81.129.128.164|81.129.128.164]] ([[User talk:81.129.128.164|talk]]) 22:01, 1 January 2010 (UTC).


Former good article: Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.
Article milestones
Date | Process | Result
June 22, 2006 | Good article nominee | Listed
February 20, 2008 | Good article reassessment | Delisted
Current status: Delisted good article

Exact Differential

There is an error in the integral defining the entropy difference: even if dS is an exact differential (if the transformations are reversible, that is), dQ is NOT an exact differential, and this should be indicated by a barred "d" or a "delta" instead of a simple "d" in the formula. The best would be a barred "d", like the "\dj" in the MiKTeX distributions. I cannot make this correction myself because I don't know LaTeX that well :) —Preceding unsigned comment added by 89.97.102.194 (talk) 09:22, 22 September 2008 (UTC)[reply]

According to my knowledge of thermo, there is no such thing as ΔQ, since there is no difference "Q2 - Q1" for it to stand for. So using Q1,2 or 1Q2 is, or at least should be, more scientifically correct. --E147387 (talk) 08:42, 27 November 2008 (UTC)[reply]

In physics and mathematics we call this "abuse of notation". Abuse of notation is very common in the physics and math literature. Since Wikipedia is supposed to reflect the literature, there is really no problem. What we should do is explain that "dQ" is not to be interpreted as a differential of a function "Q", as such a function does not exist. It merely represents an infinitesimal amount of heat added to the system. Count Iblis (talk) 14:45, 27 November 2008 (UTC)[reply]
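For reference, one way to typeset the distinction in the article (only a suggestion; it uses \delta rather than a barred d, since as far as I know the barred-d glyph is not available in the wiki <math> renderer) is:

 <math>\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}</math>

Here δQ denotes an infinitesimal amount of heat added reversibly, not the differential of a state function Q.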

Error?

Could someone please correct this: Failed to parse (unknown function\DeltaS): \Delta G=\Delta H - T\DeltaS ? —Preceding unsigned comment added by 81.180.224.38 (talk) 04:39, 22 September 2008 (UTC)[reply]

The following code will do what you want. Math markup is in LaTeX form.
 <math>\Delta G=\Delta H - T \Delta S</math>

Note that the space is required to separate the command \Delta from the S, otherwise LaTeX tries to interpret a nonexistent command "\DeltaS".

New intro

Rudolf Clausius' 1879 book (2nd Ed.) Mechanical Theory of Heat (see page 107 for the beginning of the entropy discussion) is now available in Google Books. Thus, I have started updating the intro to the correct presentation, i.e. in Clausius' own words. --Sadi Carnot 21:36, 30 July 2007 (UTC)[reply]

I have reverted your last change which was:

"In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[1] In short, entropy is a variable that quantifies the affects of irreversibility in natural processes."

Your paragraph is all true, but it is quite unintelligible to the average reader and far too concise. You also left the lead far too short and no longer a summary of the whole article. However, I recognise that you probably were going to add something more. Also, entropy is not anymore what Clausius wrote. We should be describing entropy as it is now understood and used, not its historical roots. Please stop and discuss your changes here. --Bduke 23:28, 30 July 2007 (UTC)[reply]

Current lead

Bduke, all I did was move the bulk of the lead to an "overview" section. The current lead paragraph (which is completely unreferenced), shown below, is filled with errors (especially the etymology):

The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn) [wrong (τροπή meaning "transformation", )]) in thermodynamics is central to the 2nd law of thermodynamics, which deals with physical processes and whether they occur spontaneously [wrong (the measure of spontaneity is "free energy" as per the combined law of thermodynamics)]. Spontaneous changes occur with an increase in entropy [wrong (only for isolated systems)]. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed [close (in some cases, but no reference)]. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved [correct (but does not explain what the connection is to entropy)].

I'll move the bulk of the lead back in, but I'm still correcting all this mess; for instance, all these suppositions need to be referenced. As to your statement "entropy is not anymore what Clausius wrote", there is some truth to this (in verbal terms), but entropy, at its core, is what he wrote (in conceptual and mathematical terms). Now that the original paper is available, I intend to include a blend of this, as well as modern views, in the lead. No need to do any further reverting; please work together on this. --Sadi Carnot 07:05, 31 July 2007 (UTC)[reply]

It seems good now. I'm guessing, however, that if the lead keeps growing, some of it will have to be moved into a new "overview" section (which is what I was attempting to do before), as per WP:LEAD, which states that the opening section "should contain up to four paragraphs, should be carefully sourced as appropriate, and should be written in a clear, accessible style so as to invite a reading of the full article". --Sadi Carnot 07:16, 31 July 2007 (UTC)[reply]

Sadi, please discuss it more here and let us see what others think. I do not agree one little bit that it "seems good now", but I'm not going to revert. The problem is that the new first paragraph is NOT "written in a clear, accessible style so as to invite a reading of the full article". It will completely put off most readers, particularly those who are coming to it from a discipline other than physics but realise that this is a part of physics they need to know about. This has been a long-term problem with this article and particularly its lead, but I just do not seem to be able to convince you and others. I cannot work together with you on it, because it is the very opposite of what I would like to see. Keep the lead simple. Let it attract people with very different views of why they came to read it. Make it intelligible. --Bduke 07:38, 31 July 2007 (UTC)[reply]

I agree with you: any subject should be written in the way that best conveys information in a digestible manner. One should not, however, bend, twist, misconstrue or even misrepresent basic science and logic for the sake of readability. To review, as things currently stand we are debating the first two sentences of the article. Please explain what your fuss is about (with these two sentences). All I did was to correct wrong information and to add a reference. --Sadi Carnot 08:00, 31 July 2007 (UTC)[reply]

Lead comparison

To give you a comparative idea of why the lead is in “good shape” now, below is the current lead for the energy article (with which there seem to be no issues):

Energy

In physics, energy (from the Greek ενεργός, energos, "active, working")[2] is a scalar physical quantity, often represented by the symbol E,[3] that is used to describe a conserved property of objects and systems of objects.

Entropy

In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[4]

There really is no need to make a big fuss over entropy; it’s basically the same thing as energy, only in a non-conservative sense. If you think the average reader is going to be “put off” by this sentence, then you might as well go over to the energy article and post a note on that talk page as well, because I see no difference between these two sentences in terms of difficulty. In short, the first sentence has to define the term. This is the way it is in all science articles. --Sadi Carnot 08:08, 31 July 2007 (UTC)[reply]

I think the lead to energy could be improved somewhat, but it really is not as difficult or off-putting as the current one to "Entropy". I do not think the first sentence has to define the term. It may do so, but often it is better to say in general terms what it is about, where it is used etc. and define it later. This does not mean being inexact or misleading. I do not want to sound patronising, but I think it is clear that you have never taught entropy or energy to people who are somewhat apprehensive about the topics. If you had you would see quite clearly what to me "the fuss is about". It has to attract people. It has to be simple, so readers can decide whether they need to get deeper. It currently does not do these things. I am busy with other things, so I am going to leave it to you. When you have done, put it to peer review and try to get it to featured article status. That will bring many others to comment on the article. --Bduke 09:32, 31 July 2007 (UTC)[reply]

(Edit conflict note: this was written before seeing Bduke's post above) IMO, the lead sentence is as clear as mud. What is transformation-content? What is "dissipative energy use"? That 19th century quote can be impenetrable to the modern reader. I'd rather have a definition like this one (although it's not perfect either):
"Quantity the change in which is equal to the heat brought to the system in a reversible process at constant temperature divided by that temperature. Entropy is zero for an ideally ordered crystal at 0 K. In statistical thermodynamics, S = k ln W, where k is the Boltzmann constant and W the number of possible arrangements of the system."[1]
This definition has the deficiency of not saying what entropy is good for or what it "is", but it is concrete and clear. Saying what entropy "is" gets into issues of interpretations or analogies, of which everyone has a favorite. --Itub 09:41, 31 July 2007 (UTC)[reply]

Dave souza's revert

Sadi, your enthusiasm for obscure historical definitions is noted, but this article is about informing newcomers to the subject and Bduke's considerable expertise on the subject has produced a much better lead than the proposed change, so I've restored it. .. dave souza, talk 09:56, 31 July 2007 (UTC)[reply]

Dave, you reverted several of my edits (corrections to errors) just now, not just the definition. I'm flexible on this; however, I want to see a reference (or several) in the opening sentence and I don't want to see sloppy (incorrect) sentences. The sentence "spontaneous changes occur with an increase in entropy" is only correct for isolated systems; the novice reader will think it applies to all situations. The etymology is wrong too; I added an original source reference and you have reverted this as well. Also, the lead needs to be four concise paragraphs, with the rest moved to an overview section. Please be considerate of my editing efforts. If you want to blend in a new reference to make it easier to read, then do so. The lead you reverted to:
The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy.
is completely incorrect, e.g. see spontaneous process. This is what I am trying to clean. --Sadi Carnot 16:48, 31 July 2007 (UTC)[reply]
Spontaneous process is misleading at best and certainly needs work. It does not clearly say that the entropy change is for the system only, and that there is also an entropy term for the surroundings; the free-energy change is really a measure of the total entropy change. --Bduke 22:32, 31 July 2007 (UTC)[reply]
Well, we can discuss whether the article should sharpen its discussion of the difference between the entropy of the universe and the entropy of the system. But, as the article spontaneous process makes quite clear, chemists define the word spontaneous to mean a process in which the entropy of the universe increases - i.e. a process allowed by the 2nd law. Jheald 17:17, 31 July 2007 (UTC)[reply]
(As a physicist, that usage always makes me uncomfortable -- it seems misguided to me to call a reaction spontaneous, if in practice it doesn't occur spontaneously, because the reaction barrier is too high. But who am I to argue with chemists in full flood?) Jheald 17:20, 31 July 2007 (UTC)[reply]
I kind of agree, but the distinction between thermodynamic control and kinetic control of a reaction is a useful one. --Bduke 22:32, 31 July 2007 (UTC)[reply]

Etymology

On the subject of the derivation of the word entropy, in Greek τροπή comes from τρέπω just like for example in English "cessation" comes from "cease". The verb τρέπω is the root word, and "chase, escape, rotate, turn" gives a good sense of what it means. The noun τροπή means a τρέπω-ing, hence a turning, a changing, a transformation.

I have to agree, in the strongest terms, with Itub above, when he writes that your proposed lead sentence "is as clear as mud. What is transformation-content? What is "dissipative energy use"?" These terms should be left in the 19th century. He is so right, when he writes, they are simply "impenetrable to the modern reader". I have reverted this ancient cruft, and would do so again without hesitation. Jheald 17:43, 31 July 2007 (UTC)[reply]

Jheald, I was the one that added the original etymology (from Perrot's A to Z Dictionary of Thermodynamics), and now that I've seen the 2nd Edition of the book (page 107), I have corrected it to how Clausius coined it. Thank you. --Sadi Carnot 17:55, 31 July 2007 (UTC)[reply]
The difference is that εν + τρέπω understood as the "chasing/ escaping/ rotating/ turning" "inside" the system actually gives quite a helpful steer towards understanding what entropy is. "Transformation" doesn't. Jheald 18:02, 31 July 2007 (UTC)[reply]
A measure of the unavailability of a system’s energy to do work. This actually is rather unhelpful. T<sub>R</sub>S is a measure of the energy unavailable to do work. The dependence on the reservoir temperature T<sub>R</sub> is fundamental. If T<sub>R</sub> was zero, then all the energy would be available to do work. It therefore is not helpful to suggest that S on its own is a measure of the unavailability of a system’s energy to do work. Jheald 18:07, 31 July 2007 (UTC)[reply]
Oh, and while you're at it, please learn enough about information theory to understand why saying Shannon entropy is "attenuation in phone-line signals" is imbecilic. Jheald 18:12, 31 July 2007 (UTC)[reply]

I added a second ref note per your request:

  • The etymology of entropy, in modern terms, according to Perrot’s A to Z of Thermodynamics, can be interpreted to mean, “from the Greek root εντροπη, the act of turning around (τροπη, change of direction), implying the idea of reversibility”.

I hope this helps. As to the new 2005 Oxford Dictionary of Physics definition, do you really have to complain about every reference? First Clausius is too historic, now Oxford is too unhelpful. Give me a break. I'm only trying to add references to the article so as to give it credibility, rather than original research. As to Shannon, fix it if you know of a better wording. --Sadi Carnot 18:21, 31 July 2007 (UTC)[reply]

As to Shannon, I was rather hoping you might go away and actually learn something, so you don't continue to inflict nonsense like this any more.
As for the opening definitions, no I'm not going to give you a break. Settling for misleading is not acceptable. "A measure of the unavailability of a system’s energy to do work" is horribly misleading, because that unavailability depends utterly on the reservoir temperature.
Finally, connecting τροπη with the idea of reversibility is a spectacularly unhelpful intuition, even by your standards. What is valuable about the link with τρέπω = "chase, escape, rotate, turn" is that it gives some idea of internal molecular confusion. No, that's not what Clausius was thinking of when he coined the phrase. But it's the most valuable connection today. Clausius's original etymology frankly isn't helpful for the intro. Jheald 18:39, 31 July 2007 (UTC)[reply]

Jheald’s comments

Jheald, let me get this straight: from your point of view, I’m an imbecile and you want me to go away? --Sadi Carnot 02:36, 1 August 2007 (UTC)[reply]

No, but from time to time, like all of us, you may write things which make you look clueless. At which point, the best solution is to get a clue. I told you 18 months ago that this sort of statement about information entropy was misconceived, and yet you still trot it out. Jheald 08:22, 1 August 2007 (UTC)[reply]
In any event, thanks for the nice comments, I've added them to my user page. --Sadi Carnot 03:21, 1 August 2007 (UTC)[reply]

Economic entropy

I have changed economic entropy from being a quantitative value to a semi-quantitative value. I would go further and call it qualitative, but people might disagree. I fail to see how it can be quantitative without a mathematical definition, which the article implies it lacks by filing it under sociological definitions. I would argue that quantitative measurements must be of a known quantity if they are to be named as such. Thanks User A1 11:26, 6 August 2007 (UTC)[reply]

Entropy and the relative number of states

Ω, the number of microstates, in S = k ln Ω might be better interpreted as a relative number of states which would be a dimensionless quantity for which the logarithm would be defined.

On p. 24 of Wolfgang Pauli's Statistical Mechanics (Vol. 4 of Pauli Lectures on Physics) he comments,

"The statistical view also permits us to formulate a definition of entrophy for nonequilibrium states. For two states, 1 and 2 we have

S2 - S1 = k log(W2/W1);

leaving the additive constant unspecified, we obtain

S = k log W.

Because of the logarithm, and because the probabilities of independent states multiply, the additivity of entropy is maintained." --Jbergquist 18:41, 2 October 2007 (UTC)[reply]
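As a one-line gloss on the additivity remark (my own restatement, in the article's math markup): because the state counts of independent systems multiply,

 <math>S_{12} = k \ln (W_1 W_2) = k \ln W_1 + k \ln W_2 = S_1 + S_2.</math>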

Gibbs entropy as fundamental but not defined?

Under "Miscellaneous definitions", "Gibbs entropy" is described as being the "usual statistical mechanical entropy of a thermodynamic system". However, Gibbs entropy does not appear to be defined in this article, and the linked article on "Gibbs entropy" does not define some of the terms used. 68.111.243.96 20:18, 17 October 2007 (UTC)[reply]

Entropy (disambiguation)

Editors of this page might like to look over the recent to-and-fro at Entropy (disambiguation). User:Thumperward seems dead set (IMO) on making the page harder to use. Compared to e.g. this edit, he seems determined to

  • remove the link to Introduction to entropy
  • remove links directing people to find additional entropy articles in the categories
  • reduce the structuring of the page between thermodynamic entropy and information entropy.

-- all of which (IMO) are mistakes. Anyhow, there have been a series of reverts and counter-reverts (I've now had my 3 for the day), and there's discussion on the talk page there, if anybody wants to have a look. Jheald 13:52, 18 October 2007 (UTC)[reply]

Never mind, we seem to have come to agreement. Edit war over :-) Jheald 15:47, 18 October 2007 (UTC)[reply]

"calculated using the multiplicity function" ????

And if I click on the wiki link for multiplicity function I see the expression for a system of N noninteracting spins :) Also, we should avoid using misleading examples of a system whose energy levels are exactly degenerate. It is better to define it as F. Reif does in his textbook: Ω(E) is the number of energy eigenstates with energy between E and E + δE, where δE is a macroscopically small energy interval. The entropy defined in this way depends on the choice of δE, but this dependence becomes negligible in the thermodynamic limit. δE cannot be set to zero, because then for generic systems Ω = 1, and the entropy becomes identically zero.

Basically what happens is that if you specify the energy of a system with infinite accuracy, then there can be only one microstate compatible with that energy specification. This entropy is the so-called fine-grained entropy, while the entropy defined with the nonzero δE is the coarse-grained entropy. Count Iblis 15:24, 23 October 2007 (UTC)[reply]
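In symbols, the coarse-grained definition described above (a sketch of Reif's convention, as I read it) is

 <math>S(E) = k \ln \Omega(E, \delta E),</math>

where Ω(E, δE) is the number of energy eigenstates lying between E and E + δE.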

Clausius Inequality

Shouldn't the inequality be a less than or equal to sign rather than a greater than or equal to sign? That should probably be fixed. --Lee —Preceding unsigned comment added by 24.3.168.99 (talk) 05:39, 21 November 2007 (UTC)[reply]

No - the inequality is correct. Read the statement just above the equation where it states that the result should be positive or zero for the extreme situation. This is a basic statement of the 2nd Law of Thermodynamics. But thanks for checking. PhySusie (talk) 13:09, 21 November 2007 (UTC)[reply]

I wish I knew how to add things to my own comments

My mistake on that one, haha. I overlooked the part where the article explained heat transfer from the body was positive (every class I've ever taken has had it as negative). That's why I'm used to seeing the Inequality with a less than or equal to sign. Thanks again! —Preceding unsigned comment added by 24.3.168.99 (talk) 06:45, 23 November 2007 (UTC)[reply]

GA Sweeps (on hold)

This article has been reviewed as part of Wikipedia:WikiProject Good articles/Project quality task force in an effort to ensure all listed Good articles continue to meet the Good article criteria. In reviewing the article, I have found there are some issues that may need to be addressed, mainly whether having a separate "Introduction to" article is a sufficient replacement for writing at a general level. In my view it is not, but I'm happy to be convinced otherwise, or post the article at Good Article Reassessment for other opinions.

There are a few other fixes listed below that are needed to keep Entropy at a good article standard.

GA review – see WP:WIAGA for criteria


I was initially of two minds about using an "introduction to" article as a general-audience workaround. On the one hand, yes, it is a very good way to avoid the difficulties inherent in explaining advanced concepts at a basic level, but on the other hand, it bears similarities to a "criticisms of" POV fork. After reading through the lead of the article, I think there's a good reason to merge the "introduction to" article into this one: when writing for a general audience, one tends to focus more on clarity than absolute precision, and it is this clarity that is crucial for the lead of this sort of highly technical article. Also, most of the sections in this article are summaries of other articles, but still focused at a very high level. It would be better if the summaries were geared to a general audience and the high-level material left for the main articles.

  1. Is it reasonably well written?
    A. Prose quality:
    The lead could be made clearer and more focussed, especially the first paragraph. See comment after review.
    B. MoS compliance:
    There are some Manual of Style problems in addition to the general-audience discussion above. They're only small errors, but there are quite a few of them. For example, remember to avoid slipping into textbook style and using "we" in derivations, and that punctuation after a math tag must go inside. See Wikipedia:Manual of Style (mathematics).
  2. Is it factually accurate and verifiable?
    A. References to sources:
    B. Citation of reliable sources where necessary:
    There are a number of unsourced facts, especially in the history section. The GA criteria specify that, at a minimum, every statement that could be contested must be sourced with an inline citation.
    C. No original research:
    The ice melting example has to count as original research unless it is sourced. As it's apparently used in many textbooks, this won't be hard to fix.
  3. Is it broad in its coverage?
    A. Major aspects:
    B. Focused:
  4. Is it neutral?
    Fair representation without bias:
  5. Is it stable?
    No edit wars, etc:
  6. Does it contain images to illustrate the topic?
    A. Images are copyright tagged, and non-free images have fair use rationales:
    B. Images are provided where possible and appropriate, with suitable captions:
  7. Overall:
    Pass or Fail:


Regarding the lead, it contains 7 different definitions of entropy:

  1. [it] is a measure of the unavailability of a system’s energy to do work.
  2. is a measure of the randomness of molecules in a system.
  3. entropy is a function of a quantity of heat which shows the possibility of conversion of that heat into work.
  4. [is] the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state
  5. [it has] often been defined as a change to a more disordered state at a molecular level.
  6. has been interpreted in terms of the "dispersal" of energy.
  7. is defined by the differential quantity dS = δQ / T

While I'm sure they're all true, it makes the lead seem very cluttered, and I come away with a cluttered idea of what entropy is. I think that if the lead were refactored such that each paragraph had a clear, single focus, it would improve the article dramatically.

Feel free to drop a message here if you have any questions, and many thanks for all the hard work that has gone into this article thus far! --jwandersTalk 03:58, 19 February 2008 (UTC)[reply]

Delisted from GA. See review directly above. --jwandersTalk 22:28, 20 February 2008 (UTC)[reply]

other sections on entropy: economics, and as metaphor

There should be some other sections on entropy as it appears in macroeconomic theory, and as it has been used outside of science as a metaphor.

The first: I do not mean the technical (information theory) use of entropy as a measure of this or that quality of information generated about one or another economic variable. I mean: there's a revolution happening in economics, as the old Newtonian mechanism is being replaced by a macro perspective that understands that economic processes are one-way flows that happen in time, in which scarce (low-entropy) matter and energy is taken up into the economy and degraded (high-entropy) matter and energy is extruded/exhausted/discarded. I wrote a paragraph on this that seems to have disappeared. Maybe I didn't follow directions or rules?

The second: the idea of entropy has been widely used by poets, historians, fiction writers, thinkers of all kinds, some of whom understood it, some of whom didn't. Still, a comprehensive article on the subject could give a brief survey of these uses. I wrote some text on this, too, and it isn't there today. Problem? —Preceding unsigned comment added by 128.252.254.30 (talk) 19:24, 1 March 2008 (UTC)[reply]

Lede context

Hello,

A few questions, which I post without any

  1. Is thermodynamics a branch of physics? I always thought of it as a branch of chemistry - particularly the statistical mechanics considerations, though I can see how it could go either way.
  2. Is entropy really a purely thermodynamic property? I would have thought that entropy is a statistical property which finds use in fields such as thermodynamics, information theory, etc.

Maybe I am just favouring statistical mechanics... Thanks User A1 (talk) 22:59, 25 March 2008 (UTC)[reply]

Ad 1. Our Thermodynamics article starts like this: "Thermodynamics (...) is a branch of physics that studies the effects of changes...". The article History of entropy describes how the notion was already well developed (see also Classical thermodynamics and Entropy (classical thermodynamics)) before the statistical explanation was developed (see Statistical thermodynamics and Entropy (statistical thermodynamics)).
Ad 2. Entropy is not a purely thermodynamic concept, although it originally was, and the statistical definition used in thermodynamics is specific to that field. However, as it is, it is the thermodynamic concept that is described by this article. I am in favour of renaming this article Entropy (thermodynamics), a name that currently redirects here, as does Thermodynamic entropy. See also the discussion raging at Talk:Entropy (disambiguation).  --Lambiam 21:40, 26 March 2008 (UTC)[reply]


Requested move

Entropy → Entropy (thermodynamics) — The article appears to discuss thermodynamics only, and fails to review entropy in other branches of physics, information science and mathematics. —linas (talk) 04:14, 27 March 2008 (UTC)[reply]

Once again, the stupidity of masses rears its ugly head, as the above exhibits in spades. At the risk of being uncivil, I say "fuck wikipedia". If this is what the cornhole editors with their heads stuck up their asses want, this is what they get. linas (talk) 02:16, 8 April 2008 (UTC)[reply]

Error in Explanation

'then entropy may be (most concretely) visualized as the "scrap" or "useless" energy'

Usually in an article discussing a useful combination of more basic physical quantities, the units of the item are given. In this article they are not explicitly covered. Big mistake. And it leads to incorrect statements like the one above. Entropy is not energy. The term energy has a whole lot of baggage that comes with it, and to suggest that entropy carries the same baggage (say like conservation) contributes to a gross misunderstanding of what is going on. I hope authors/editors will be much more careful. Properly presenting the ideas of physical chemistry requires much more rigor than present in this article. blackcloak (talk) 05:32, 7 June 2008 (UTC)[reply]
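For reference (standard values, not tied to any source in this discussion): from dS = δQ<sub>rev</sub>/T the SI unit of entropy is the joule per kelvin,

 <math>[S] = \mathrm{J\,K^{-1}},</math>

and the statistical form S = k ln Ω carries the same unit through the Boltzmann constant k ≈ 1.38 × 10<sup>−23</sup> J K<sup>−1</sup>, since ln Ω is dimensionless. Energy, by contrast, is measured in joules, which underlines the point that entropy is not an energy.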

Thanks for the comment, this article has been subject to a sort of tug of war between various perceptions of how to explain a difficult concept involving advanced mathematics in a simple way accessible to the layman. This earlier version was edited by an educator, and may be nearer what you were looking for. The article's gone through numerous intermediate stages, as in this version, and the lead has been stripped down to the point where it's probably missing out on essentials while still including misleading cruft. Rather beyond me, but your assistance in a rewrite will be greatly appreciated. Note, of course, that thermodynamic entropy applies to more than physical chemistry. . dave souza, talk 08:14, 7 June 2008 (UTC)[reply]
Well, today the average lay person is much more familiar with information-theoretical concepts because many people have a computer these days (certainly those people who visit this page :) ). So, we can explain the rigorous formulation much more easily than, say, Landauer could half a century ago. Why can't Maxwell's demon be effective? Today that's almost a no-brainer to a ten-year-old. Count Iblis (talk) 13:29, 7 June 2008 (UTC)[reply]

I rarely have time these days to think about this article, but I want to make a comment in response to blackcloak. I suggest that the urge to follow "the ideas of physical chemistry requires much more rigor than present in this article" does more harm than good and probably explains why students in physical chemistry never really understand entropy. What is needed at least at first is not rigor but clarity, so the reader can see what entropy is actually about and why they need to learn about it. Rigor can follow later. I am not of course suggesting that the "clarity" phase should be false, but it does not need to be rigorous. It also needs to take into account that many students have a poor background in mathematics. --Bduke (talk) 22:36, 7 June 2008 (UTC)[reply]

On the other hand, claiming entropy is "scrap or useless energy" is not clear, and is not good. It does not help understanding if entropy is confused with energy. The unusable energy is T<sub>R</sub>S, where T<sub>R</sub> is the temperature of the coldest accessible reservoir. Jheald (talk) 20:08, 9 June 2008 (UTC)[reply]
I entirely agree. My concern is the general urge for total rigour that can make the article totally unclear. --Bduke (talk) 02:29, 10 June 2008 (UTC)[reply]
Why don't we say that the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state? This definition is perfectly understandable to most lay people. Count Iblis (talk) 02:45, 10 June 2008 (UTC)[reply]
It is totally confusing to someone who comes across the term "entropy" in connection with chemistry. But even wider, where on earth do you get the idea that it "is perfectly understandable to most lay people"? Can you back that up with a source? --Bduke (talk) 23:01, 10 June 2008 (UTC)[reply]
Well, the problem is that people are taught thermal physics in the wrong way in high school. At university part of our work is to let the students unlearn what they learned in high school. Entropy is fundamentally an information theoretical or statistical concept, just like heat, temperature etc. are. If we just say that entropy is related to heat and temperature, we aren't explaining anything.
I'm not saying that we should explain everything mathematically, that's not necessary. Compare e.g. how Chaitin explains Gödel's theorem to lay people. The information theoretical approach clearly works very well here, precisely because lay people are familiar with computers, computer memory etc. etc.. Count Iblis (talk) 21:23, 11 June 2008 (UTC)[reply]
I do not know where you come from, but most of the students I have taught physical chemistry to in the last few decades had not even studied physics at High School and had not come across entropy before. I see you are a physicist. It seems to me that what is familiar and obvious to your students is very far from being so to everyone else. I give up. I just do not see a way forward. This article will continue to be dominated by physicists and it will continue to be totally unclear and unhelpful to anybody else. --Bduke (talk) 23:17, 11 June 2008 (UTC)[reply]
We do have to explain everything from the start to the students. I don't really believe that genuinely interested people who are willing to learn can fail to understand something as simple as entropy. But they do have to be open to the idea that their intuitive ideas about entropy, heat and temperature may be wrong.
The reason why people find physics difficult is because we don't teach it properly until the students go to university. Just think about how well you would read and write English if you were not taught to read and write until you were 18 years old. Now, if our reaction to this problem is to dumb things down even more, we are only going to make the problem worse. We have to keep in mind that Wikipedia is also read by many children in primary and high school. They would benefit from being exposed to real physics instead of the dumbed-down physics stuff they are taught in school. Count Iblis (talk) 02:40, 12 June 2008 (UTC)[reply]
Bduke, just thought I would wade in here. If you want to make progress, one suggestion would be to create a page in your own user namespace, e.g. User:Bduke/entropy, and then use that to construct what you believe to be a good modification - that way you can actually point at something and say "this is a better explanation; what do you think?" rather than "currently the way the article does it is wrong, we should do it a better way". More likely you will get a more enthusiastic response from other editors. See WP:BOLD. On the other hand, this is more work :( - Can't win em all, huh? User A1 (talk) 00:40, 12 June 2008 (UTC)[reply]
I actually did that long ago, but under a different title which I forget. I deleted it. It led to a rewrite of the intro para, as Dave Souza mentions in the second para above in this section. I had other things to do and it just reverted back to where it is now. It is just too hard unless others recognise that we do have a real problem with this article and many others. I just do not have the time to fight this alone. --Bduke (talk) 01:01, 12 June 2008 (UTC)[reply]

Defining entropy as the maximum amount of information you could theoretically store in the system without affecting its macroscopic state is not understandable to most lay people IMO. That definition only makes sense if the reader is acquainted with a quite technical meaning of "information", which takes the reader who doesn't know it in a nearly circular path of confusion. It is also counterintuitive to suggest that a gas "holds" more information than a solid, for example. What do you mean by "hold"? Why are hard drives not gaseous then? ;-) Like I suggested above already, I think it is best to start with a clear and unambiguous definition such as [6], even if it doesn't explain what entropy is good for or what it "is". The analogies and examples can come later. --Itub (talk) 08:54, 12 June 2008 (UTC)[reply]

Once more into the lead, dear friends

As a layman, my opinion is that the current lead section has some problems. In the opening sentence – "In thermodynamics (a branch of physics), entropy is a measure of the unavailability of a system’s energy to do work." – "(a branch of physics)" is superfluous and misleading as it's also a branch of chemistry and mechanical engineering. Best explained in more depth later.

"It is a measure of the randomness of molecules in a system" is completely baffling to me. The jump from availability of energy to do work to "randomness" of molecules makes no sense. I'd be much happier with modified borrowings from the introduction to entropy article, explaining what it is and how it's measured in statistical terms. Thus, a proposal, concluding with the statistical and information meanings. The derivation of the term adds nothing to basic understanding, and so should be moved to the body of the text. –

In thermodynamics, entropy is a measure of the unavailability of a system’s energy to do work. It provides a measure of certain aspects of energy in relation to absolute temperature, and is one of the three basic thermodynamic potentials: U (internal energy), S (entropy) and A (Helmholtz energy). Entropy is a measure of the uniformity of the distribution of energy. In calculations, entropy is symbolised by S and is a measure at a particular instant, a state function. Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Often change in entropy, symbolised by ΔS, is referred to in relation to change in energy, δQ.
It is central to the second law of thermodynamics and the fundamental thermodynamic relation, which deal with physical processes and whether they occur spontaneously. Spontaneous changes, in isolated systems, occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed.
Statistical mechanics introduces calculation of entropy using probability theory to find the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. Statistical mechanical entropy is mathematically similar to Shannon entropy which is part of information theory, where energy is not involved. This similarity means that some probabilistic aspects of thermodynamics are replicated in information theory.

The question of use in physics, chemistry and engineering is covered by the mention of temperature, pressure, density, and chemical potential, in my opinion. To refer to the previous discussion, saying that "the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state" is meaningless to me. If it has some deep meaning not covered by the last paragraph, perhaps that explanation could be simplified and added. . . dave souza, talk 09:30, 24 June 2008 (UTC)[reply]

Not so bad as a starting point, but I do have some issues:
  • a measure of certain aspects of energy isn't good. It's vague to the point of being confusing, rather than clarifying. (1) It's too abstract. Entropy is a particular property of particular physical systems. On the other hand certain aspects of energy suggests (to me) certain characteristics of energy such as eg that energy can't be created or destroyed. But what would be a measure of that? So I find the phrasing unintuitive and confusing. (2) It's not just energy that entropy is associated with. You can know things about the system that change its entropy, without changing its energy distribution.
  • Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Entropy isn't energy. And the relationship is a differential one, because adding heat to a system will generally change its temperature. Furthermore, Q is not a state function. (Incidentally, a measure at a particular instant is a curious way to define a state function; though I think I see what you're getting at).
  • the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. What you mean, I think, is that a microstate contains all the information of the whole system; so defines exactly how all the energy is distributed across the whole system at that instant. But what's written is very unintuitive. I fear anyone who doesn't already know what a microstate is will inevitably think of all the energy piled up in one place.
  • The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. This is gibberish. What is it trying to say?
As for entropy as a measure of the freedom/randomness/uncertainty/information of a system. This isn't meant to follow intuitively from the "unavailability of a system's energy to do work". Rather, it's a deeper, contrasting, microscopic view of what entropy fundamentally is. The laws of physics are deterministic, and they preserve uncertainty. So if you're pushing heat through a heat engine, there has to be at least as much uncertainty (entropy) associated with it at the exhaust stage as at the input stage. If the exhaust is at a temperature T<sub>R</sub>, that means an energy of at least T<sub>R</sub>S must be dumped as waste heat, if S was the entropy put in at the input stage.
That's why saying the freedom/randomness/uncertainty/information can't decrease is equivalent to there being a limit on the availability of the system's energy to do work. Jheald (talk) 10:41, 24 June 2008 (UTC)[reply]
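To put a formula behind the T<sub>R</sub>S point (a standard textbook bound, sketched here rather than quoted from any particular source): if heat Q<sub>in</sub> is drawn from a source at temperature T<sub>in</sub>, it brings entropy ΔS = Q<sub>in</sub>/T<sub>in</sub> with it, and at least T<sub>R</sub>ΔS must be rejected as waste heat at the reservoir temperature T<sub>R</sub>, so

 <math>W \le Q_{\mathrm{in}} - T_R\,\Delta S = Q_{\mathrm{in}}\left(1 - \frac{T_R}{T_{\mathrm{in}}}\right),</math>

which is just the Carnot limit; with T<sub>R</sub> = 0 all of Q<sub>in</sub> would be available as work.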
Perhaps it would help if we don't stick to the wiki-convention of defining the term in the first sentence. We could just say in the first sentence that entropy is an important quantity in thermal physics, and then give a general introduction of thermal physics. We should not mention entropy again until we are ready to give a precise definition that is understandable to the reader who has read the text till that point. Count Iblis (talk) 13:32, 24 June 2008 (UTC)[reply]

User Comment

I have a B.S. in physics, and I am studying for the subject test, so I use Wikipedia a lot. I really appreciate how useful Wikipedia is. I have a collection of texts that I have either saved from undergraduate classes or purchased here and there, but the articles on Wikipedia usually clarify some otherwise mystifying points about certain topics. While I deeply appreciate all the efforts that everyone has made to present a valid, understandable presentation, there are two things that concern me. One is about the contributors to Wikipedia in general, and the other is about this article in particular.

First, I know that one of the specified conditions is that people be courteous to each other, but they are obviously not. This disturbs me on several levels. It disturbs me on a casual level, just because it is so distracting when I'm trying to study. When I run into all this rude stuff that people are writing to and about each other, it makes me want to just shut it down. On a more professional level, it seems to reflect the general lack of respect that scientific types seem to have for each other. I don't know if it's because of competitiveness, or because we spend too little time interacting with people, that we don't learn to treat others the way we want them to treat us. I think it's some of both. Regardless, I wish you people would be nice.

Now, about the article. Somebody break out the smelling salts. I'm interested in the concept of entropy as it pertains to physics. It is distracting to include the informational aspects of the concept mixed in with an article about physics. The two should be separate. At the beginning, there should be a link for people who are interested in the informational aspects of the concept. Just because both uses of the term are equally valid and important doesn't mean that they have to be mixed together in the same article. I have seen countless other articles where this problem has been addressed very effectively. The reader is directed to the article about the usage that he or she is interested in. If one so desires, he or she can go back and read the other article. Putting the information in separate articles doesn't slight one discipline or the other. Putting them together is not efficient for the reader. I don't know why there needs to be so much debate about it. It seems clearly disorganized to mix it all together. Thank you for all your hard work. Is there enough bread around my compliment, critique, compliment sandwich? —Preceding unsigned comment added by 98.163.102.226 (talk) 17:35, 8 July 2008 (UTC)[reply]

Entropy and Information theory

On 9 July 2008 the following text was added by Kissnmakeup. These comments are not appropriate for the article Entropy, but are most appropriate for this Talk page. The text was deleted by Jheald on 9 July.

This does not belong in this article. This article is about thermodynamics. There is a link at the top of this article for readers who want to know about information theory. The person or people who insist that this should be here need to reconsider out of courtesy and common sense. Both uses of the term should have their own articles. There is no reason why an article about thermodynamics should include a discussion about information theory.

I am restoring the text, this time to Talk:Entropy, for the benefit of those who regularly work on Entropy. Dolphin51 (talk) 03:00, 10 July 2008 (UTC)[reply]

Fair enough. I think the section at the moment clearly isn't explaining well enough what I also wrote in my edit summary, viz. that many people do find this interpretation very useful as a way of understanding *thermodynamic* entropy.
Specifically, (as the Maximum entropy thermodynamics page puts it) that statistical mechanics should be seen as an inference process: a specific application of inference techniques rooted in information theory. And that thermodynamic entropy is exactly the same thing as information entropy - it is simply an application of information entropy to a particular situation, and a particular framework of questions of interest.
So thermodynamic entropy represents the amount of information you do not have about the exact state of a system, if all you know are the values of the macroscopic variables.
That explains why learning information about the system (eg as in the Szilard engine thought experiment) can change its entropy, i.e. allow you to extract additional useful work, even if nothing has changed about the system itself.
It also, for many physicists, answers the metaphysical questions like "So what is entropy? Where does it come from?" Answer: it is exactly the same thing as information entropy, and it comes from where information entropy comes from.
The article would be better if it presented this much more directly. At the moment, what physicists who take this line actually think, and why they think it, is very much buried. It's not surprising that readers who haven't met the idea before are being confused. And in addition, the last paragraph of the current #Entropy and Information theory simply isn't true. There's a very straightforward interpretation of δQ = T dS. It's the definition of temperature -- how much energy you have to put in to increase the information entropy of the system by one nat. In systems of natural units, temperature is measured in (Energy units per nat). Secondly, the second law of thermodynamics. The information interpretation is what reconciles the Second Law with determinism and Liouville's theorem. The laws of the universe preserve information (Liouville's theorem) - but as time goes on, they make it less useful, as more of the information you had becomes related to microscopic correlations rather than macroscopic properties. So the effective amount of information you can use about the system has gone down; corresponding to an increase in its classical thermodynamic entropy.
Most theoretical physicists think of entropy in this way, I would claim. So, for example, Seth Lloyd, Programming the Universe, just because I happen to have it to hand. Page 65 (and following): "In particular, the physical quantity known as entropy came to be seen as a measure of information registered by the individual atoms that make up matter."
Information entropy has everything to do with thermodynamic entropy. Jheald (talk) 08:10, 10 July 2008 (UTC)[reply]
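The "energy per nat" reading of δQ = T dS mentioned above can be written compactly (a sketch, using the statistical-mechanics convention in which S is measured in nats, i.e. k = 1):

 <math>\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N},</math>

i.e. 1/T is the amount by which the (information) entropy rises per unit of energy added at fixed volume and particle number.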

Further User Comment

I will concede that the yes or no question analogy illuminates the similarity between information entropy and thermodynamic entropy, but it doesn't explain thermodynamic entropy, nor does it pertain to thermodynamic entropy. To say that thermodynamic entropy is information entropy does not explain thermodynamic entropy unless one already understands information entropy which is A DIFFERENT FIELD OF STUDY. At least the vegetable soup part has been moved out of the way of the part that matters. —Preceding unsigned comment added by Kissnmakeup (talkcontribs) 23:32, 14 July 2008 (UTC)[reply]

The information theoretical definition is the fundamental definition of entropy. Most modern textbooks on thermodynamics and statistical physics teach it that way. The old fashioned way of introducing thermal physics in terms of heat engines etc. does not explain what entropy is at all. In that approach you simply have to postulate the existence of a quantity called entropy which is then related to heat and temperature. Neither heat, nor entropy can be rigorously defined in this approach. Count Iblis (talk) 00:21, 15 July 2008 (UTC)[reply]

Actually, the statistical mechanical definition makes perfect sense to me. There are x number of possible quantum states with an equal probability of the system being in any one of those states. It's like a digital computer. You have x number of possible yeses or no's, 0's or 1's, trues or falses, or blacks or whites. Whatever you call it, to say that is informational is like saying that two electrons can't have the same quantum numbers because Pauli said so. Two electrons couldn't have the same quantum numbers before Pauli came along. To say that it is because Pauli said so is silly and certainly doesn't explain physically why they can't be the same. Just like information entropy doesn't explain thermodynamics unless you already know it as information entropy. Entropy existed before information science. I think that you should include a discussion about Boolean algebra, the binomial distribution, and machine language for those who already understand the same concepts from still other points of view.

This is the biggest problem in education. To teach, one must explain things in terms that the student can understand. By student, I mean someone who doesn't already know the concepts or the terminology. That is why they are students. The more one learns about something, the less connected he or she becomes with the novice. The concepts and terminology become so entrenched in the brain that it is impossible for an "expert" to look at it from the point of view of someone who doesn't know it yet. People who are emotionally needy try to make another feel stupid because they don't understand it. Such people are incapable of actually teaching, even though they may be employed as teachers. To teach, one must be able to put himself in the shoes of someone who has never heard of it, and respect the effort to learn. Treat the student with dignity, and put it in terms that can be understood. I know that is what you all are trying to do. You're doing tremendous work in providing this information. I have a great deal of reverence for it, being from a background that is not exactly "two commas". Knowledge should not be for sale.

Now, back to entropy.

Another thing that is confusing about discussions of entropy is the lack of stress on the idea that when a system is disturbed from equilibrium, the entropy is decreased by the constraints or whatever causes the departure from equilibrium. Granted, the definitions state that the entropy of "spontaneous" (meaning "undisturbed?") processes always increases. It seems that the idea of entropy increasing is stressed so abundantly more than the spontaneous part, that it leads a novice to miss or forget the reverse process in which entropy decreases, which is very misleading. Thank you. Kissnmakeup (talk) 11:57, 18 July 2008 (UTC)[reply]

Entropy and Information/Communication Theory

This article seems to be an overview of entropy as the term has been used in many disciplines, with the main emphasis on its use in thermodynamics. The section on information theory omits a split in meaning that is not very consequential, but needs to be pointed out. I believe Shannon used the term for the information rate of a channel. If we define the information (in bits) of a symbol as <math>\log_2(1/p)</math>, where p is the probability of that symbol being emitted by the source, then the entropy is the mean information per symbol emitted, i.e. <math>\sum_i p_i \log_2(1/p_i)</math>. Yet the article contains this snippet: "and the entropy of the message system was a measure of how much information was in the message", which is treating entropy as information.

So one definition makes entropy be a synonym for information, and the other makes entropy be an information rate.

Finally, there is this sentence: "For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message." This leads the reader to a psychological idea of entropy, which is very far from the notion of entropy in information theory, which has to do with probabilities of symbols. And the qualifier about equally probable messages is really confusing. I don't know why it is needed, nor what the author was trying to get at. DMJ001 (talk) 04:29, 6 August 2008 (UTC)[reply]
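A small sketch (illustrative only, with a made-up symbol distribution) of the distinction drawn above between entropy as mean information per symbol and the total information of a message:

 # Shannon entropy of a source: H = sum_i p_i * log2(1/p_i), in bits per symbol.
 from math import log2
 p = {'a': 0.5, 'b': 0.25, 'c': 0.25}           # hypothetical symbol probabilities
 H = sum(pi * log2(1/pi) for pi in p.values())  # 1.5 bits per symbol
 N = 100                                        # symbols in a message
 print(H, N * H)                                # per-symbol entropy vs. ~150 bits for a typical message
 # For M equally probable messages (p = 1/M each), the entropy is log2(M) bits,
 # i.e. the number of yes/no questions needed to identify which message was sent.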

No, this article is specifically about thermodynamic entropy. There is another article, information entropy for the (more general) concept of entropy in information theory. Information entropy appears in this article only because many people find it the most useful way to think about what thermodynamic entropy fundamentally is.
Entropy in information theory is defined for any probability distribution, and Shannon's original papers do this; that is the most fundamental form of the information entropy concept. Entropy rate, as you correctly explain, has a slightly different meaning. It's true that in information theory people sometimes talk about the entropy of a source, meaning its entropy rate, and compare this to the channel capacity. But questioned on the point, they would freely accept that this is a derived idea, the more fundamental idea being the entropy (or information) of any general probability distribution.
Not so fast there. Take a look at Shannon's original 1948 paper. He very clearly considers entropy to be information per symbol. Here is one quote: "H or H' measures the amount of information generated by the source per symbol or per second." Then he discusses messages. He gets to the probability p that some message of length N has been generated. He says that the entropy is given by H = log(1/p) / N. (The = has a dot over it to indicate, I think, that this is an estimate based on a sample.)
DMJ001 (talk) 03:07, 9 August 2008 (UTC)[reply]
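(A small numerical sketch of the H = log(1/p)/N estimate being discussed, under the assumption of a memoryless source with invented probabilities: for a long message, (1/N)·log2(1/p(message)) comes out close to the per-symbol entropy.)

```python
import math
import random

random.seed(0)
symbols = ["a", "b", "c", "d"]        # hypothetical alphabet
probs = [0.5, 0.25, 0.125, 0.125]     # assumed symbol probabilities
p_of = dict(zip(symbols, probs))

N = 100_000
message = random.choices(symbols, weights=probs, k=N)

# For a memoryless source, log2(1/p(message)) is the sum of per-symbol surprisals
log2_inv_p = sum(math.log2(1.0 / p_of[s]) for s in message)

print(log2_inv_p / N)                              # estimate from one long message, close to 1.75
print(sum(p * math.log2(1.0 / p) for p in probs))  # exact per-symbol entropy, 1.75 bits
```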
Fair enough, though actually Shannon uses both meanings of entropy. In section 20 of the paper (page 35 of the Lucent pdf), he's defining entropy for general pdfs. But for a source, what's more useful is the (average) entropy per symbol - i.e. the average of the entropy of the probability distribution for the next symbol. It's still an entropy, it's just the entropy of a slightly more specific thing; we can call it the entropy rate if we want to think of it that way, or we can call it the entropy per symbol. Jheald (talk) 18:48, 13 August 2008 (UTC)[reply]
As to the "psychological" view of entropy, I agree this may be a stumbling block to people in the article, because entropy in information theory is just as "real" as entropy in thermodynamics. (In fact User:linas has told me he finds it rather more real [7]). On the other hand, some subjectivity of both information entropy and thermodynamic entropy isn't entirely wrong. (See the maximum entropy thermodynamics article for more discussion). Entropy in information theory is reflective of a state of knowledge; different states of knowledge give rise to different entropies. In thermodynamics too, the Szilard engine thought experiment is an illustration of this - knowing particular microscopic information can sometimes let you extract more macroscopic useful work. It's also a useful idea when thinking about entropy increase in the context of Loschmidt's paradox. So the subjective element may not be completely out of place. But as I've said before, the whole section on the connection between information entropy and thermodynamic entropy could probably stand a re-write. Jheald (talk) 10:35, 6 August 2008 (UTC)[reply]

To clarify, are you saying that the sum total of all the possible microscopic energy states, along with any constraints on these particles, is the equivalent of the information in the so-called "message" of the information entropy definition? If so, I still don't understand why one who is trying to learn about thermodynamics must go off on a tangent and take time to learn all the nuances of information entropy just to get an illustration or example of "entropy", when all a person really wants to know is what thermodynamic entropy is? Kissnmakeup (talk) 14:19, 6 August 2008 (UTC)[reply]

The "message", in the information theory perspective, would be the identification of the single 'true' microscopic state, from out of the set of all the potentially possible microscopic states and their probability distribution, that are compatible with the macroscopic variables and any other constraints on the system.
I think you are trying to extend the similarities of the entropy of statistical thermodynamics and the entropy of information theory too far.
But back to my original point about entropy as information vs. entropy as information rate. If you want entropy to be information (and I actually prefer this, even if it was not the way Shannon used the term), then the entropy of a symbol, message, or anything else, is -log of the conditional probability of the symbol, message, or whatever.
And the "condition" is whatever the observer knows at that point. Thus entropy is relative to an observer. For example, the entropy rate of an AES-encrypted counter is 128 bits per chunk to an observer that does not know the key (it looks like 128 bit random blocks), but is zero to an observer who does (she can predict each output with probability 1).
DMJ001 (talk) 03:07, 9 August 2008 (UTC)[reply]
Sure, but that dependence on what you know is the case with thermodynamic entropy too. See for example section VI, "the 'anthropomorphic' nature of Entropy" in E.T. Jaynes' 1965 paper Gibbs vs. Boltzmann entropies.
The -log formula is right if you know the symbol, message or anything else has a particular numerical probability. But if, based on what you know, different symbols have different possible probabilities, then the -Σ p log p formula is an appropriate quantification of your uncertainty. And if you don't know those probabilities, you should assign them so that the sum -Σ p log p is maximised (Gibbs' algorithm). On this, see Jaynes' original 1957 paper. [8]. -- Jheald (talk) 18:48, 13 August 2008 (UTC)[reply]
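(For what it's worth, the maximisation just mentioned can be spelled out in a couple of lines for the case where normalisation is the only constraint; this is the generic Lagrange-multiplier sketch, not anything taken from Jaynes' paper.)

Maximise -Σ_i p_i log p_i subject to Σ_i p_i = 1. Setting the derivative of -Σ_i p_i log p_i + λ(Σ_i p_i - 1) with respect to each p_i to zero gives -log p_i - 1 + λ = 0, so all the p_i come out equal: p_i = 1/n for n alternatives, and the maximised entropy is log n.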


The amount of information you are missing, by not yet having received the message, is the information entropy of the message; and also the thermodynamic entropy of the system.
The reason people find it helpful, as I've tried to write above, is they find it helps them with the question you asked, "what is thermodynamic entropy?" Answer: thermodynamic entropy is missing information; specifically, information about what microstate the universe is actually in. That helps people who worry about Loschmidt's paradox. (How can entropy really increase, if physical dynamics are deterministic and measure-preserving? Answer: the information we had is still there, but in effect unusable, so effectively we might as well forget we ever had it at all). And it meshes well with the Szilard engine scenario. (Learning some information, the entropy really is reduced, and we find we then really can (at least in principle) extract a little more useful work). Jheald (talk) 15:07, 6 August 2008 (UTC)[reply]
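(To put a number on the Szilard-engine remark above, assuming a single acquired bit and room temperature, the extra work that can in principle be extracted is at most)

W ≤ k T ln 2 ≈ (1.38×10^-23 J/K)(300 K)(0.693) ≈ 2.9×10^-21 J per bit.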

Now see, this is great. This is exactly what I mean. I'm just beginning to learn about thermo, so I'm not up to Loschmidt's paradox or Szilard engines yet. If I were, I'd go read about Loschmidt's paradox or Szilard engines. Right now, as much as I would like to know it all, I have to manage my time because I have constraints on my system, so I have to focus on what really matters right now. I still have what some of us would call a "finite" amount of time to learn what I need to know right now about the basics, which is why there are, I presume, articles about Loschmidt's pair of ducks and lizard engines for those who are to the point that they are ready to learn about those things. Those articles about pairs of ducks and lizard engines could talk about the information entropy in thermodynamic entropy for the sake of those people who want to know about it. Whereas, in this article you could talk more about the curl of F not being zero and what that has to do with entropy, I mean something that I have heard of that at least remotely pertains to this at the appropriate level for this article, unless you want to show off to people who don't have time for it how much more you think you know and try to make students feel stupid for trying to learn, which, I think, is what's going on here. You can say, for those readers who want to know about pairs of ducks, lizard engines, and information entropy, "Here, follow this link to there," and not put the entire cyclopedia Jhealdia in this one article. But, thank you for your attention and your contributions to an outstanding project. I really don't have the time to waste here. I don't think I'll be returning to this page, or sending anyone else to it either. Kissnmakeup (talk) 21:02, 6 August 2008 (UTC)[reply]

An editor put an Arxiv paper in Further Reading

Here is the paper that was recently added: Umberto Marini Bettolo Marconi, Andrea Puglisi, Lamberto Rondoni, Angelo Vulpiani (5 March 2008). "Fluctuation-Dissipation: Response Theory in Statistical Physics". arXiv.

I'm moving this new item here for discussion, after removing it from the article. Though it is potentially of interest, three points:

  1. This is only an Arxiv paper, not a refereed publication
  2. The other items in Entropy#Further reading are of an introductory nature
  3. The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam. EdJohnston (talk) 21:32, 17 August 2008 (UTC)[reply]

In principle, this article could be included, as it is a big review article which is in press in Physics Reports. However, it would not be appropriate to just include this article without explaining the fluctuation dissipation theorem. If we do that, then it is still not clear if this particular review article would be the best reference (I haven't read it yet). Count Iblis (talk) 22:49, 17 August 2008 (UTC)[reply]

    • Response to "An editor put an Arxiv paper in Further Reading"

First Point : Quote : "This is only an Arxiv paper, not a refereed publication"

Response : It has actually been published in Physics Reports.

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6TVP-4S2F5J5-2&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_version=1&_urlVersion=0&_userid=10&md5=5b08887e9e0b5250474fee813a641d54

Third Point : Quote : "The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam."

Response : I can assure you that this ("spam") was not intended. The fluctuation-dissipation theorem essentially relates response functions to equilibrium quantities, and it is the notion of fluctuation about equilibrium which ties all of them together (or at least that's what I had in mind when making the changes). For entropy & fluctuations, see chapter 14 of Pathria.

http://books.google.co.uk/books?hl=en&id=6cUbnWO2NNwC&dq=pathria+statistical&printsec=frontcover&source=web&ots=K3Isz3MRcW&sig=RV47jGVM6mbUT8HK8VhbAYbuFyo&sa=X&oi=book_result&resnum=4&ct=result#PPR9,M1

Second Point : —Preceding unsigned comment added by Arjun r acharya (talkcontribs) 23:22, 17 August 2008 (UTC) In regard to the second point, I think that is completely valid (and it also applies to the "Second law of thermodynamics" article, where I also appended this reference). On a second look at the entropy article, it seems a bit too in-depth a reference to be included, and I do understand why one would want to omit such a reference, for the sake of preserving clarity. —Preceding unsigned comment added by Arjun r acharya (talkcontribs) 23:18, 17 August 2008 (UTC)[reply]

Edit war

There is an edit war about this addition:-

It needs discussion here. If the anon editor restores the material again, I will revert it and protect the article. Bduke (talk) 01:22, 21 August 2008 (UTC)[reply]

User comment

I happened upon this article while looking for more information on entropy for a class, and it looks like it could use some clean-up. There's a lot of "entropy can be thought of like A" and "entropy is kind of like B" before any concise definition is given. It seems like the statistical mechanical definition should be given earlier, and then an attempt should be made to show the relationship between this and other ways of thinking about entropy.

That's just my two cents. Thanks for reading! —Preceding unsigned comment added by 168.156.89.189 (talk) 15:54, 12 March 2009 (UTC)[reply]

Lede issues (redux)

Just stumbled across this article and I have to say, as a lay reader with no real knowledge of physics, I read the lede and it didn't even seem to say what entropy is. It says what it's used for and how it's defined mathematically, but not what it actually is... rʨanaɢ talk/contribs 11:46, 29 April 2009 (UTC)[reply]

Agreed. It was bad enough that I came here (and pretty quickly saw how the different definitions of Entropy seem to have resulted in a compromise lede involving only uncontentious math). The problem is, this is the entropy article people will wind up going to first, and the lede is useless for a layperson. And I'm talking about pretty nerdy laypeople, too. Better a long lede that explains several meanings than an equation intelligible only to people with no need for the article. 99.192.48.185 (talk) 15:23, 26 July 2009 (UTC)[reply]

Entropy and the ergodic hypothesis

With only a few undergraduate physics courses in my background, I'm not really qualified to discuss this issue, but it seems to me that the concept of entropy is related to the ergodic hypothesis. If others agree then I'd invite them to add some relevant material to both this article and to the article on the ergodic hypothesis.

W.F.Galway (talk) 15:38, 13 May 2009 (UTC)[reply]

Extensive is too much extention

In the section Entropy and Cosmology, I changed the phrase "extensive doubt" into "some doubt" - in reality, lack of interest in formulating a conclusion doesn't equate to extensive doubt, and this is the real reason for the controversy... not to say that we can find the answer easily, but the current controversy does not involve an extreme majority of scientists saying entropy models don't apply to the Universe as a whole. The jury is still out, with competing arguments, and any "extensive doubt" phrasing is misleading (at least with current arguments) today... —Preceding unsigned comment added by 206.248.106.175 (talk) 20:21, 20 May 2009 (UTC)[reply]

Figure is stepping on the text

In Section 4.4 "Chemical Thermodynamics" the figure is covering some of the text. Someone who knows how, please fix it.

Can you please list your browser and screen resolution? I am not seeing any problems under Firefox 3.0.9 at 1280×1024. If you can try multiple resolutions and list the ones where it is a problem, we may be able to work around it. User A1 (talk) 01:54, 22 May 2009 (UTC)[reply]
Sure. I am using FireFox 2.0.0.14 on MacOS 10.5.6. I tried FireFox 3.xxx but did not like it because of major changes in the way it uses windows. I am using the standard resolution for my brand-new Mac 15" MacBook Pro. Bill Jefferys (talk) 02:45, 22 May 2009 (UTC)[reply]
Someone has fixed this. It now displays correctly (as above). FYI the resolution is 1440x900. Bill Jefferys (talk) 02:59, 22 May 2009 (UTC)[reply]

Restrict explanation of entropy to known facts

I recommend that the main article on entropy be limited to its meaning in the context of the Second Law of Thermodynamics, because I found the overall article to be chaotic due to all of the extensions and the resulting lack of any cohesive core description or definition of entropy.

I recommend that information theory be entirely separate because it is itself fundamental, and is mathematics. While the two may coincide in a sense, the entropy of physical processes was historically the first concept / law / process to be developed, and the additional emphasis on math may block some readers from understanding something that does not require math.

I would recommend that all the extensions be described in separate entries, including cosmology, although I grant it is highly relevant. Cosmology remains well beyond any widely accepted scientific formulation; hence, it is little help to my understanding of entropy, and the main article is about entropy, not cosmology.

The main article about entropy should be the minimum required to explain the Second Law of Thermodynamics to an educated reader who may lack the background in mathematics to grasp its detail but can surely understand the basic concepts. Then the additional concepts which build upon an understanding of entropy will be simplified, because, for example, the reader can proceed to cosmology without trying to grasp the meaning of information theory or entropy as it relates to evolution.

Thomasrwilson (talk) 07:30, 5 August 2009 (UTC)[reply]

Re: Restrict explanation of entropy to known facts

After attempting to reread the entry I at least don't think my comments above are especially useful. That is surely in part due to not having read the entry on entropy correctly. Also, I have always found thermo to be exhausting - so simple but it just wasn't given what it needed - an E=mc^2 formulation - instead it has about 10 different statements. My formulation is as follows:

Entropy is the internal push of all contained systems, including the universe, toward chaos (as such, it would be a Force). Or, alternatively, entropy is the chaos that is spontaneously produced by systems; that is, without any outside force or direction or other consideration, systems will produce a certain amount of useless energy (equivalently, they will lose energy, giving it off as friction) or randomness (a synonym for chaos in some situations); they will become smoothed out, as gases do when released in a large room, or they will become randomized, as in the case of encrypted information.

1. I once saw/read a proof of Shannon's theory that information is proportional to N log2 N that used nice (clever) definitions of what information had to be in the context of a bit of data. It wasn't more than a printed page long. Has anyone seen it? I saw it in about 1982 at the dept. of transportation (DOT), transportation systems center (TSC) in Cambridge, MA. Maybe "proof" isn't the right word - it was more a motivation.

But I was also working on digital filtering and was struck by the number of processes that were somehow NlogN-related. Like the FFT and quicksort - they required NlogN steps and were highly efficient. Also, there seemed to be a limiting factor whereby NlogN (log2) represented the degree of compression that was possible - AND it represented, by the same token, the degree of encryption possible. In fact, those things were only correct to a first approximation - given enough memory, sorting can be done almost in a single pass; however, NlogN.

By the way, Shannon is one of the most fascinating characters you could imagine, from his development of information theory and computer basics, to wartime code breaking, to beating the gamblers of Las Vegas and the stock market, ... and of course his unfortunate affliction with Alzheimer's. He died in 2001.

I'm out of time.

Thomasrwilson (talk) 13:29, 5 August 2009 (UTC)[reply]

Major rewrite underway

I have started to rewrite this article. The information theoretic introduction will be kept, but the connection with thermodynamics will be made more clear right in the lead. The second law of thermodynamics and the fundamental thermodynamic relation will be written down explicitly in the lead. The advantage of the statistical approach is that all these statements come with explanations of how exactly they follow from the fundamental definition, so everything will be more clear.

I don't have the time to edit a lot every day. What I still have to do in the lead is explain why S can only increase, given the way we defined the entropy. Then we say that the way S depends on the external variables and the internal energy fixes all the thermal properties of the system. Then we define the temperature in terms of the derivative of S w.r.t. E, which leads to the relation dS >= dq/T.

We then have all the important properties explained heuristically in the lead, so lay persons can just read the lead and see the most important properties of S and read what follows from what and why. Count Iblis (talk) 01:06, 7 August 2009 (UTC)[reply]
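(A compressed sketch of the chain of steps described above, under the usual assumptions: S = k log Ω for an isolated system and quasistatic heat transfer; the detailed route differs between textbooks.)

Define S(E, x) = k log Ω(E, x), where x stands for the external variables, and define the temperature by 1/T = (∂S/∂E)_x.
For a reversible (quasistatic) change, the fundamental relation dE = T dS - δw together with the first law δq = dE + δw gives dS = δq/T.
For an irreversible change, extra entropy is generated inside the system, so in general dS ≥ δq/T, with equality only in the reversible case.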

Statistical thermodynamics

Hello, I am trying to make improvements to this article by incorporating references from Enrico Fermi but someone keeps deleting them without making any effort to incorporate them. These are viable calculations that clearly show the connection between Boltzmann's equation and the thermodynamic definition of entropy. I cannot understand why such contributions would be deleted when they address the key issue that this article wishes to consider. —Preceding unsigned comment added by Quantumechanic (talkcontribs) 02:04, 7 August 2009 (UTC)[reply]

The problem is that you cannot define entropy from thermodynamics alone. It is reverse logic and no modern textbook uses this approach anymore (with the possible exception of chemistry and engineering texts). Now, it used to be done like that (e.g. like in the book by Fermi, and Fermi, b.t.w., did not invent the thermodynamical derivations as you erroneously wrote in your version) but then you get a very complicated text that to engineers and chemists may look better (because they learned it that way) but it actually doesn't explain one iota. What on Earth does the equation
dS = dq/T
really mean if you:
a) have not explained what heat is?
b) have not defined what T is?
Count Iblis (talk) 02:19, 7 August 2009 (UTC)[reply]
I agree. Certainly there is no reason to start from thermodynamics in this article if one can simply start from a more universal definition of entropy. Plastikspork (talk) 02:40, 7 August 2009 (UTC)[reply]
The disambiguation information states that this page is about entropy in thermodynamics. Your statement that "no modern textbook uses this approach anymore (with the possible exception of chemistry and engineering texts)" is unsourced and unreliable. Also, your statement that "you cannot derive entropy from thermodynamics alone" is unsourced and appears to constitute original research. In short, there is a separate page for information entropy and that is where this information-based material belongs. In fact, I have multiple textbooks, which I can cite in the article, which develop entropy in a thermodynamics framework. Hence, since this article is on thermodynamic entropy, there is no need to rely extensively on material belonging to another WP page. Locke9k (talk) 01:20, 24 November 2009 (UTC)[reply]
You can certainly develop the concept of entropy via thermodynamics only, but then it is a purely phenomenological quantity that cannot be explained much deeper. Fortunately, the subject is not taught this way anymore to physics students. Also, obvious statements are not OR. It should be clear that thermodynamics alone cannot possibly tell you the number of microstates corresponding to some given macrostate. Count Iblis (talk) 02:10, 24 November 2009 (UTC)[reply]

The physical meaning of heat and temperature

Heat is energy. When we talk about entropy we are discriminating between heat that can do useful work and heat that is necessary to maintain the thermodynamic state (i.e. volume, pressure and temperature) of the system. Temperature is a measurement of the heat contained in an isolated system. For example, a given mass of aluminum at a particular temperature will have a specific heat capacity. That capacity corresponds to the amount of energy necessary to maintain that mass of aluminum at that temperature.

Quantumechanic (talk) 02:39, 7 August 2009 (UTC)[reply]

Temperature can be defined using the derivative of entropy with respect to energy. One can start by defining entropy without defining temperature. Plastikspork (talk) 02:42, 7 August 2009 (UTC)[reply]
In addition, there is already an article on Entropy (classical thermodynamics). Plastikspork (talk) 02:44, 7 August 2009 (UTC)[reply]

OK, I'm giving it another edit but trying to stick with the more general definition of entropy. I hope you like this one better. —Preceding unsigned comment added by Quantumechanic (talkcontribs) 18:57, 7 August 2009 (UTC)[reply]

Sounds like a reasonable plan. Thank you. Plastikspork (talk) 23:20, 7 August 2009 (UTC)[reply]

Unjustified reverts

Now references to Fermi, Boltzmann and a recent work by Gardiner and Zoller have been deleted despite my repeated attempts to introduce new pertinent references and make this article read in a more encyclopedic style. Reverting these additions is inappropriate unless the source material is bad or somehow misinterpreted. But the deleted source material substantiates those precise equations by which we have defined entropy in this article. Furthermore, the mathematics by which the definition can be derived is clarified by these sources and additions.

It is my understanding that because these recent reverts have been deleting sourced material and replacing it with unsourced information, I am supposed to revert again. But before I do anything else I would appreciate feedback from anyone with an interest in improving this article. Quantumechanic (talk) 22:54, 7 August 2009 (UTC)[reply]

My only objection is to making the lead section unnecessarily dense with quantum statistical physics and thermodynamics terms. It would be great if the lead section could use a more general probabilistic definition of entropy, and then introduce the connections to quantum statistical mechanics and thermodynamics a bit further down. This is, of course, my opinion. Thank you for your contributions. Plastikspork (talk) 23:18, 7 August 2009 (UTC)[reply]


Yes, the formalism using density matrices is welcome further down in the article. Most of the article is intended for people who don't know the density matrix formalism. If someone knows about density matrices, then that person will very likely know a lot about thermodynamics. If such a person looks at this article, then it is most likely for technical stuff, e.g. about entanglement entropy etc. etc. So, the section you intend to write about that must have such a focus.
I don't think it is a good idea to focus too much on sources at this stage. Of course, we have very strict rules about what we write being sourced. But that is to keep crank editors away. The disadvantage of simply copying from textbooks is that you take a text on, say, page 400 of a textbook that is meant for students who have read through pages 1 to 399 and who have followed other physics courses (e.g. quantum mechanics). What we need to do is think about how we can best explain things to a, say, advanced high schooler who is very interested in physics and willing to spend some time to learn stuff. Count Iblis (talk) 00:54, 8 August 2009 (UTC)[reply]

It would be great to explain everything in as simple terms as possible, so we should work together to make the article as clear as we can. By pursuing an article that presents a statistical definition of entropy in the introduction, we have the opportunity to enhance the article's style if it is written in a clear and consistent manner. Our primary goal must be the integrity of the article itself. Deleting viable sources defeats that goal.

If we choose to define entropy statistically and introduce the subject with respect to quantum states, mentioning the density operator in the introduction and the relationship between its simple diagonal form and its trace after the unitary transformation of interest helps clarify the equations being presented. I believe such clarification makes the article self-consistent. If we choose to go deeper into the matrix formalism of the entropy derivation within the article this will also be consistent. But a description of matrix mechanics is beyond the scope of this article.

The only other option I can see is to get rid of any mention of quantum mechanics from the introduction and focus on Boltzmann. Boltzmann's equation comprises a simple, statistical definition of entropy that can be introduced by his own reference. A reference to Fermi can also substantiate that Boltzmann did in fact prove his theorem. That way references to quantum optics texts could be reserved until later in the article. Quantumechanic (talk) 02:40, 8 August 2009 (UTC)[reply]

We could talk about "micro states" instead of "quantum states" in the lead. I think the lead should be as simple as possible and just summarize the important properties of the entropy. In the next sections we'll give detailed explanations.
I don't really see the point in following sources very precisely. There is no point in saying that "S = k Log(Omega), see Boltzmann". You want to explain this (in your own words) first, and then you can add the source. Also note that physics texts should not be history texts.
The things that need to be put in the article are the equal prior probability postulate, an argument why entropy can only increase, the definition of temperature, the definition of work, the definition of heat, the second law. These things should be explained in a self-consistent way in the article, not just dropped into the article with a source. Count Iblis (talk) 03:19, 8 August 2009 (UTC)[reply]

Request for comments

Now the introduction may be improved, but this article still needs a lot of work. Please let me know whether you think my efforts are consistent with our discussion. Also if anyone finds any mathematical errors please keep me apprised. I have not had time to check all of the equations yet but I will.

I decided to work on this page to hopefully give it more encyclopedic style. I would appreciate constructive criticism with respect to that goal. Also I recommend we take a critical look at the entire article and edit or remove any questionable content while taking care to preserve all pertinent sources and explain them better if necessary. Looking forward to your responses. Quantumechanic (talk) 23:21, 8 August 2009 (UTC)[reply]

I would suggest moving the stuff about the density matrix out of the lead to a later section. Just mentioning S = k Log(Omega) is enough. You then keep the lead section understandable for a, say, 16-year-old high schooler. You can put the derivation of S = k Log(Omega) from a density matrix formalism in that later section.
About the next section on enthalpy: this is already covered in the wiki article on the Gibbs energy. The statement in this section comes with extra conditions: the system is in contact with a heat bath which also maintains the system at constant pressure. I think that in the first section after the lead it would be more appropriate to consider an isolated system. You can simply set up the definition of the microcanonical ensemble in that section.
What we need to do in the first few sections after the lead is explain why the entropy will increase when you approach thermal equilibrium, how temperature is defined, the definition of heat and work (from the microscopic perspective), the steps that lead to the second law of thermodynamics etc. etc. Count Iblis (talk) 23:54, 8 August 2009 (UTC)[reply]
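(As a toy illustration of S = k Log(Omega) in the microcanonical setting just mentioned - not proposed article text, and the two-level model is only an assumed example: for N independent two-level units with n of them excited, Omega is a binomial coefficient, so the entropy can be computed directly.)

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_two_level(N, n):
    # S = k ln(Omega), with Omega = C(N, n): ways to place n excitations among N sites
    ln_omega = math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
    return k_B * ln_omega

# A macroscopic number of units gives an entropy of order 1 J/K
print(entropy_two_level(1e20, 1e19))
```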

Those are good suggestions, Count. I realize that this article has some things to say about free energy but that there are already other articles on that topic. The same goes for entropy in information theory. I think that we should make an effort to consolidate various topics including enthalpy in the organization of the article. We should try to organize the article in a manner that makes it easy to read while building a conceptual understanding for the reader. Quantumechanic (talk) 01:36, 9 August 2009 (UTC)[reply]

I think you should change your writing style. Forget about sources and literal statements from books for the moment. I just read the section about enthalpy line by line (didn't do that yesterday) and it was completely flawed. Your text confuses free energy with actual work performed, and you don't mention the conditions under which the statements that are correct are valid. Now, enthalpy changes measure the heat supplied to the system (when pressure is kept constant), not changes in internal energy. Also, the assumption that processes be quasistatic was not mentioned; the whole notion of quasistaticity hasn't even been introduced yet.
So, I had no choice but to delete that entire section on enthalpy. Count Iblis (talk) 15:57, 9 August 2009 (UTC)[reply]

I do not appreciate your deleting my work. It would be more constructive to help improve it. I am aware that we need to add a definition of the diagonal form of the density matrix to make the article more readable. But there is no need to delete viable equations and deleting good references is unacceptable. Each time you do that I am entitled to revert.

I disagree with you about the section on enthalpy. My writing assumes that the free energy is available to do useful work. I am trying to keep the discussion of free energy as simple as possible to avoid confusion. The important physics is that the work is done. If the system fails to make use of the available work, that is an engineering problem. But this is a physics article and not an engineering article. It should clarify the meaning of entropy. It is not intended to teach the engineering of a thermodynamic engine. Quantumechanic (talk) 16:57, 9 August 2009 (UTC)[reply]

It is perfectly acceptable to delete nonsensical texts. It is nonsensical to start introducing entropy using a density matrix formalism, let alone without even defining what a density matrix is. Then, having started that way, the statement that Tr(rho log(rho)) is invariant is trivial, but you're then making some big point about that. And a citation to an article on "quantum noise" instead of some basic textbook is inappropriate. Count Iblis (talk) 17:07, 9 August 2009 (UTC)[reply]

Quantum Noise is a textbook. I am not aware of any basic texts that introduce a statistical definition of entropy in terms of the probability of states. Quantumechanic (talk) 17:10, 9 August 2009 (UTC)[reply]

Almost all textbooks I know do this. Count Iblis (talk) 17:21, 9 August 2009 (UTC)[reply]

Fixing the definition of entropy

I suggest that we change the first paragraph somewhat as follows, possibly without the parentheses:

In thermodynamics, Entropy, usually denoted by the symbol S, is a physical quantity that is used to measure the internal energy of a system with respect to temperature. The entropy is related to the ability of the system to do useful work because it quantifies the heat energy internal to the system state that cannot do any work. The energy available to do work (free energy) is the difference between the total energy (enthalpy) and the internal energy (S T).

Please let me know what you think about this one. If the entropy is not clearly delineated from the free energy the definition is vague and ambiguous. Quantumechanic (talk) 17:48, 9 August 2009 (UTC)[reply]

Well, entropy is not "used to measure the internal energy of a system" at all. You can either give a definition that is really correct (e.g. the amount of information you would need to specify the exact state given what we know about the system), or leave such more precise definitions for later and then simply describe the way entropy is related to energy, heat and work.
In the second part of your text, you write that entropy quantifies the heat content of the internal energy, but then you must say that this is for a system that is kept at constant T. If you don't do that, then the relation is only valid for infinitesimal changes of entropy, as the temperature can change.
Another thing: I just saw that in the last part of this article you edited in an integral expression of S involving Q(T). This looks strange to me and I'll check if this is correct right now. Count Iblis (talk) 18:02, 9 August 2009 (UTC)[reply]

Good points, thank you. Maybe this is better?

In thermodynamics, Entropy, usually denoted by the symbol S, is a physical quantity that is related to the internal energy of a system by multiplying the internal energy and the temperature of the system. The entropy is related to the ability of the system to do useful work because the heat energy internal to the system state cannot do any work. The energy that is available to do work (free energy) is the difference between the total energy (enthalpy) and the internal energy (S T).

I agree that those integrals look strange. I think it's a style issue mostly. The differentials look cleaner when lower case letters are used for the functions. But what was there before looked like it was edited by someone who doesn't know calculus. Quantumechanic (talk) 18:12, 9 August 2009 (UTC)[reply]


Let's see. You propose to write:

"S is a physical quantity that is related to the internal energy of a system by multiplying the internal energy and the temperature of the system."

Do you understand what you wrote here?

"The entropy is related to the ability of the system to do useful work"

Ok, so far. But this:

"because the heat energy internal to the system state cannot do any work."

doesn't make sense, because heat is not a thermodynamic state variable.

Technically you are correct because the internal energy does the work that puts the system into its state. So we really need to say that the internal energy cannot do any external work. Quantumechanic (talk) 19:02, 9 August 2009 (UTC)[reply]

Last part:

"The energy that is available to do work (free energy) is the difference between the total energy (enthalpy) and the internal energy (S T)."

This looks ok, if you say that this is for a system in contact with a heat bath. In case of enthalpy, you look at a system that is kept at constant pressure (and pressure-volume work against whatever keeps the system at constant pressure then doesn't count as useful work). Count Iblis (talk) 18:29, 9 August 2009 (UTC)[reply]

It is true that most chemists are concerned with a reaction that occurs at atmospheric pressure and must account for changes in volume in terms of work done against the atmosphere. In this simple case the change in enthalpy is merely the heat absorbed in the process. When we talk about entropy we are actually more interested in what happens in a closed vessel where the pressure may increase as new components are introduced. Then the volume would be constant and no work is done against the atmosphere. Thus the change in enthalpy contributes only to the internal energy and no external work is done. But the enthalpy is still equal to the heat absorbed in the process.
Contrary to what you may believe, the work done against the atmosphere due to a change in volume of a chemical reaction does count as useful work. Of course only the free energy is available to be extracted. If you try to get more, the system will operate less efficiently and the pressure will end up above atmosphere. Quantumechanic (talk) 19:02, 9 August 2009 (UTC)[reply]
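(For reference on the constant-pressure point being argued in this thread, a minimal derivation under the usual textbook assumptions: quasistatic process, pressure-volume work only.)

With H = U + PV and dU = δQ - P dV,
dH = dU + P dV + V dP = δQ + V dP,
so at constant pressure ΔH = Q_p (the heat absorbed), while at constant volume ΔU = Q_v.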

Integral expressions

I removed this edit:


Thus for a constant temperature process the derivative of the entropy with respect to heat is constant

If the temperature of the system is not constant, then the entropy must be integrated over both variables

Since Q and T are mutually dependent in general, the integral becomes considerably more complicated


Comments:

A) Since when did Q get promoted to a thermodynamic state variable?

B) Ignoring point A), how do you get a double integral over T with integration measure "dT"? You could have replaced dQ by dQ/dT dT and gotten an expression in terms of a heat capacity; that would have made sense (see the worked form below).

C) Then, in the last integral expression with Q(T), where did that come from? If you somehow integrate over Q (which you actually cannot do, and the expression is flawed to start with), you would have lost one integral sign. Count Iblis (talk) 18:19, 9 August 2009 (UTC)[reply]
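(Spelling out the substitution suggested in point B), assuming a reversible path along which a heat capacity C(T) is defined.)

dS = δQ_rev/T and δQ_rev = C(T) dT, so
ΔS = ∫_{T1}^{T2} C(T)/T dT,
which reduces to ΔS = C ln(T2/T1) when C is constant over the range.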

Consider an infinitesimal reversible transformation with entropy difference dS and an amount of heat absorbed dQ at temperature T. Thus dS = dQ/T;
that is substantially the first equation. If T is not constant, the differential must be considered in two variables, i.e. dQ and dT. That is why the integral gets more complicated. Quantumechanic (talk) 18:34, 9 August 2009 (UTC)[reply]
Ok, I think this provides the answer to the question I asked in the title of this thread beyond any reasonable doubt. Count Iblis (talk) 18:41, 9 August 2009 (UTC)[reply]

If you don't like the way I wrote down the integrals, why don't you suggest a way to fix them. I'm sure we can write the differential equation for the entropy when the temperature is not constant. Quantumechanic (talk) 19:07, 9 August 2009 (UTC)[reply]

Actually, I'm very glad about these integrals you wrote down. It settles the important question of how to proceed with editing this wiki article. Count Iblis (talk) 19:23, 9 August 2009 (UTC)[reply]

If you mean that we should edit out old stuff that doesn't make sense rather than try to preserve it and improve it, I agree. But I do not see a problem with writing the differential equation for the entropy when the temperature is variable. Why don't you suggest a form for that? Of course the integrals are fine now but they do not apply to a variable temperature process. Quantumechanic (talk) 19:43, 9 August 2009 (UTC)[reply]

I support Count Iblis' current version. It is clearer, more concise, and less "all over the place" than what was there before. As for a way to edit this article without unnecessary friction, I would advise QM to use (from this point on) multiple edits to make small incremental changes so that editors are not confronted with one massive change, but rather lots of smaller ones. It makes things easier to review, and if there is a problem with one, it does not invalidate all the rest of them. Headbomb {ταλκκοντριβς – WP Physics} 21:40, 9 August 2009 (UTC)[reply]
I concur. Part of the problem is that each edit looks like a massive rewrite when one looks at the diff. Plastikspork (talk) 22:10, 9 August 2009 (UTC)[reply]

Thank you for such valuable suggestions. Obviously I am a newb. But I can help make this article read in a more encyclopedic style, and maybe even make it more clear to the layman at the same time.

I feel that a clear and concise definition of entropy belongs right in the beginning. We have gone through both classical and quantum notions for such a definition. Regardless of which one we choose to maintain, we need to make sure that our assertions are adequately sourced for verification and further reading (and to keep with wiki policy).

Please give me more feedback so I can contribute most effectively. Thank you! Quantumechanic (talk) 22:39, 9 August 2009 (UTC)[reply]

Care to explain how those integral expressions you edited into this article were "sourced for verification and further reading (and to keep with wiki policy)"? :(
You could have sourced your expressions or not sourced them; anyone who knows anything about thermodynamics knows that these expressions are wrong. So, you see that giving a source is irrelevant. It is only relevant when we're done writing this article and want to get this article to FA status. Count Iblis (talk) 23:03, 9 August 2009 (UTC)[reply]
An excellent source on the topic, IM(biased)O are Darrel Irvine's lecture notes on solid state thermo, available via MIT OCW. Right-hand column, click for annotated PDFs. Awickert (talk) 03:26, 11 August 2009 (UTC)[reply]

The introduction still doesn't make sense.

When the natural log is written in the entropy equation, Boltzmann's constant must be multiplied by ln 10. I suggest that we specify the logarithm in base 10 to fix this and clarify how Boltzmann's relation is related to the quantum mechanical description. Quantumechanic (talk) 00:25, 10 August 2009 (UTC)[reply]

The equations are incorrect when specified in the natural log. I think it is misleading to write Boltzmann's equation differently from how he originally expressed it, especially without correctly defining the constant. Furthermore the logarithm in the source material Quantum Noise is written with respect to base 10. It is derived from the Boltzmann relation and uses the same constant. Quantumechanic (talk) 01:04, 10 August 2009 (UTC)[reply]
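(For reference, the two logarithm conventions differ only by a constant factor, since log10(x) = ln(x)/ln(10).)

S = k ln Ω = (k ln 10) log10 Ω ≈ 2.303 k log10 Ω,

so writing the formula with a base-10 logarithm just rescales the prefactor; with the usual Boltzmann constant k ≈ 1.38×10^-23 J/K, it is the natural-log form that is standard.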

What you don't seem to understand is that most editors here are experts in this subject. Then if you look in your book and see "log" and you see that we write "ln", then there are a few possibilities:
a) We are wrong (not just we here but other editors on other related wiki articles).
b) The book is wrong.
c) You may be mistaken that "log" always means "log in base ten".


The first reaction most people have when things don't seem to add up is to check and double-check whether they themselves have made an error or made an assumption that is not valid. Only if, after checking and double-checking and deriving the result ourselves to get to the bottom of the matter, we have proof that the mistake is really not ours, and there is thus some anomaly (a mistake in a source or whatever), do we raise the problem.
Your first reaction, not just now but from the first time you started to edit here, was to make claims of errors in this article right away without even attempting to provide a detailed proof on the talk page (as indicated by some of your edit summaries).
There is nothing wrong with writing here about something that doesn't seem to add up. But constantly claiming that there is an error, and that you have proof of that because you think that your source says something other than what is written here, is just irritating. Count Iblis (talk) 01:22, 10 August 2009 (UTC)[reply]

I guess you're right. The book may be wrong unless (c) applies. (We already agreed that the normalized relation is a natural log.) So we need a better reference. And we need to explain it better. Boltzmann's relation teaches us about a collection of microstates in thermal equilibrium such that the lowest energy states are the most probable. If we explain it in terms of Boltzmann relaxation ratios we can clarify both the natural log and the minus sign. What do you think? Quantumechanic (talk) 15:38, 10 August 2009 (UTC)[reply]

Before we can explain the Boltzmann factor exp(-E/kT), we first need to explain the things that are explained in detail here. This section was written by me to provide for a first principles derivation that was missing in Wikipedia about a year ago. I have just made some small edits to the wiki pages on the first law of thermodynamics and on the page on work, in order to avoid circular definitions.
Now, in this article, we need to keep the very detailed math out, so we need to refer to other wiki pages where the things are explained. But that means that we may need to modify those wiki pages as well, if something is missing there. Count Iblis (talk) 16:05, 10 August 2009 (UTC)[reply]

The change in internal energy of a transformation that does useful work is important to the concept of entropy. We have already discussed some of the material in the article here especially about enthalpy and the first law. When we introduce entropy we can describe the physics in much more general terms. Free energy may be extracted from a variety of thermodynamic systems ranging from batteries to steam engines. So we shouldn't be limited to P, V systems in the general discussion of the first and second laws. Quantumechanic (talk) 16:45, 10 August 2009 (UTC)[reply]

Proposal: move equations out of lead

Now I looove math, but the average human being who wants to know about entropy may or may not have the background to deal with an analytical introduction to it. I suggest that:

  • The many disconnected clauses of the lead are combined into a coherent paragraph.
  • The mathy material is moved out of the lead (though still being described in words) or into a second paragraph

Awickert (talk) 04:09, 15 August 2009 (UTC)[reply]

Sounds good. Plastikspork (talk) ―Œ 04:29, 15 August 2009 (UTC)[reply]
Yes, I agree. I think one could perhaps keep the equation k Log(Omega) with a statement that this is Boltzmann's famous formula that appears on his tombstone. I have been making small edits to the wiki articles on related subjects such as heat and work, and will make more substantial edits to the wiki article on the second law, before coming back here. This is to make sure that when we explain something here or in a related wiki article, you don't get a circular explanation, like when you explain X in terms of Y and Z but then the wiki link to Y explains Y in terms of Z and X. Count Iblis (talk) 13:13, 15 August 2009 (UTC)[reply]
OK, just FYI I am on vacation now and will be in the field this weekend, but I will be around again next week, and will be happy to help translate physics to English. Awickert (talk) 03:58, 16 August 2009 (UTC)[reply]

I have included a proof of the second law (I took it from the article on the fundamental thermodynamic relation which I wrote about a year ago). I do refer to this proof in the fundamental thermodynamic relation article on some other pages, so it should not be removed until the wiki links have been updated.

The proof starts from the relation S = k Log(Omega), so we can now write about the second law from the information theoretical perspective in this article and refer for the technical details to the second law article. Count Iblis (talk) 02:18, 16 August 2009 (UTC)[reply]

Information theory concept?

I believe the first section of this article was edited by someone from an information theory background, ignoring the rest of the article. Surely it can't be introduced as an information theory concept, when for decades it was purely a thermodynamic idea, then eventually applied in information theory? There is more than enough at the end of the article to explain the application of entropy as a concept in information theory. I also feel the start is written like a Batman vs. Superman argument. 192.198.151.37 (talk) 09:14, 17 September 2009 (UTC)[reply]

I agree. There is a separate article dedicated to the concept of entropy in information theory. See Entropy (Information theory). Dolphin51 (talk) 11:58, 17 September 2009 (UTC)[reply]
I don't agree. Most modern courses on statistical physics explain entropy using a rigorous information theoretical approach, or they do it more heuristically by introducing entropy as k Log(Omega). Definitions of heat, work, temperature, etc. are all given in terms of the fundamental definitions. Only then do we discuss heat engines etc. in class. What we don't do is start with heat engines, assuming that somehow temperature and heat don't need to be explained, and then build up the theory. This is the historical approach and it is a recipe for disaster if you teach it this way. And, as explained here, this is true in general, not just in case of thermodynamics.
I do agree that the article needs a lot of improvement to explain things better. I started to work on this article last month, but I was caught up with more urgent matters on Wikipedia which have so far used up all the available time I can afford to spend here. Count Iblis (talk) 14:05, 17 September 2009 (UTC)[reply]
Entropy can't / shouldn't be introduced via heat? That doesn't sound plausible. The 2nd law becomes almost trivial to explain. Heat flows from the hotter to the cooler, hence dS=dQ/T > 0. --Michael C. Price talk 17:55, 17 September 2009 (UTC)[reply]
Yes, but one needs to explain why the equation dS = dQ/T for reversible processes holds in the first place, see e.g. here. A big problem you face when you introduce entropy via dS = dq/T (not just introduce, but also build on it) is that the Second Law (that entropy increases) appeals to a notion of thermal equilibrium where you can define a temperature at all. But the fact that entropy increases is precisely due to non-equilibrium effects.
This problem is actually also mentioned in the criticism section of the second law article, which addresses the problems with the approach by Clausius. Count Iblis (talk) 18:36, 17 September 2009 (UTC)[reply]
I'm not saying that dS=dQ/T > 0 provides a proof, but it certainly provides an accessible explanation. --Michael C. Price talk 20:43, 17 September 2009 (UTC)[reply]
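(The accessible argument above, written out for a small amount of heat δQ passing from a hot reservoir at temperature T_h to a cold one at T_c < T_h.)

dS_total = -δQ/T_h + δQ/T_c = δQ (1/T_c - 1/T_h) > 0.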

Assuming Count Iblis is correct, some strategic changes should be made. The present version of Entropy is clearly focused on thermodynamics and should be merged with Entropy (classical thermodynamics). A new version of Entropy should be created, based on the modern fundamental approach to entropy, and pointing towards each of the existing articles covering entropy:

The present version of Entropy is clearly based, almost exclusively, on the classical thermodynamical perspective of entropy. It has an image of ice melting in a glass of water; it has an image of Rudolph Clausius; and many references to heat and temperature and thermodynamics. But the opening paragraph is an incongruous and incompatible reference to information theory. Dolphin51 (talk) 23:19, 17 September 2009 (UTC)[reply]

There should be separate articles for each of the main types of entropy. Sticking it all in one article is too much. The main article can just talk about how the different concepts relate to each other. --Michael C. Price talk 21:50, 18 September 2009 (UTC)[reply]


New Synthesis?

I am relatively new to Wikipedia and have made some significant changes/additions to the beginning of this article in the past two days. My purpose is to synthesize and summarize the three opposing viewpoints so that each of their strengths can work together (instead of opposing each other and confusing the curious). Admittedly, I focus more on the microscopic/information viewpoint than the macro/thermo viewpoint. However, I tend to learn and explain more pedagogically than mathematically, which is why I use equations sparingly. I am open to suggestion, critique, criticism and conflagration in effigy. I am not an expert on the subject, merely a BA in chemistry whose favorite courses were P-chem and quantum mechanics. But I feel that I have a reasonable grasp of this important concept. Am I wrong?

There have been few changes to my edits and I am looking for some feedback. There seems to be a lot of vitriol on this page, but I think that we can all work together and get along (and hopefully get back on the featured article list). One problem I currently have is that the fourth paragraph is very disjointed from the first three, which I have been working on. But I don't want to delete the concepts. Perhaps someone can insert a better segue.

--User:Kpedersen1 20:28, 4 October 2009 (UTC)[reply]

I think you did a great job. I think the rest of the article should now be rewritten to bring it firmly in line with the information theoretical introduction. The connection between thermodynamics and the fundamental definition of entropy should be made clear. We could start with the temperature. The definition 1/T = dS/dE can be easily motivated when you consider two initially isolated systems each with their own Omega(E) function.
Then, I think some statements that are now made in the lead about the universe, and the S T_R energy associated with information content, should be explained elsewhere in the article in more detail. Maxwell's Demon and Laplace's Demon can be introduced to illustrate the relevance of information in thermodynamics in a less abstract way. Formulas for entropy, like the Sackur-Tetrode formula for the entropy of an ideal gas, could be given. Count Iblis (talk) 21:15, 4 October 2009 (UTC)[reply]
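(A sketch of the two-Omega motivation mentioned above; this is the standard textbook argument, with the presentation varying from source to source.)

For two weakly coupled systems sharing a fixed total energy E = E1 + E2, the number of joint microstates is Ω1(E1) Ω2(E - E1). The overwhelmingly most probable division of the energy maximises log Ω1 + log Ω2, which requires d log Ω1/dE1 = d log Ω2/dE2. With S = k log Ω and 1/T = dS/dE, this condition reads T1 = T2 at equilibrium, which is exactly the behaviour one wants a temperature to have.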
The Feynman Lectures on Computation has an excellent chapter on the thermodynamics of computation: the bit : k log 2 connection, why resetting bits takes energy, and how Bennett's machine can run on bits. --Dc987 (talk) 08:47, 20 October 2009 (UTC)[reply]

Information Concept First

A recent edit dropped in the thermodynamic definition of entropy as the first sentence (thank you for fixing it, Count Iblis). I think this thermodynamic definition is very important, but should not be introduced until later in the article - certainly as the main concept in the thermodynamic approach section. I understand the frustration of thermo people who see the information concept introduced first, since it was developed later. But given what we now know about entropy, the information concept is a more foundational approach. In quantum mechanical systems, entropy is explicitly defined as information. (The black hole information paradox deals with the thermodynamics of a black hole, but defines the 2nd law as "information cannot be destroyed in the universe".) Thermodynamic systems are composed of quantum mechanical ones; thus thermodynamic entropy is an emergent property of informational entropy. Certainly thermodynamics has more importance in our daily lives, since we generally deal solely with the classical limits of quantum mechanical laws, but a true understanding of any physics must begin with its physical origins (not its intellectual ones).

Therefore I think the introduction section should deal almost strictly with microstates, information, and entropy defined as the potential for disorder (and, alternatively, the amount of information needed to describe a system). Once this ground is covered, we can delve into how this affects the amount of useful work a system can perform, entropy changes of phase transitions, entropy of mixing, etc., and how these can be traced back to entropy as information (the standard formulas relating the two pictures are collected below for reference). Hopefully, if we all work together and embrace the duality of entropy as energy and information, we can get this very important page back on the featured article list. If you have objections, please discuss them here so that we can compromise before slashing and burning the hard work of others (I have been very careful not to remove any ideas in my edits, at most rephrasing them and re-organizing their structure). Kpedersen1 contributions (talk) 10:22, 9 October 2009 (UTC)[reply]
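The standard formulas relating the statistical and informational pictures discussed above, for reference (textbook forms, not specific to any editor's wording; p_i denotes the probability of microstate i):

```latex
\begin{align}
  S &= k_B \ln \Omega            &&\text{(Boltzmann: } \Omega \text{ equiprobable microstates)}\\
  S &= -k_B \sum_i p_i \ln p_i   &&\text{(Gibbs)}\\
  H &= -\sum_i p_i \log_2 p_i    &&\text{(Shannon, measured in bits)}\\
  S &= (k_B \ln 2)\, H           &&\text{(the same quantity in different units)}
\end{align}
```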

There is no such thing as a duality of entropy. Look up Landauer's_principle and Entropy_in_thermodynamics_and_information_theory. --Dc987 (talk) 08:37, 20 October 2009 (UTC)[reply]

Measure of Entanglement

Entropy is a measure of entanglement. [[9]] And entanglement is not even mentioned in the article. Not even once. --Dc987 (talk) 08:22, 20 October 2009 (UTC)[reply]

I assume that you mean "entanglement" in the rigorously defined, quantum mechanical sense (cf. Einstein, Podolsky, and Rosen). Perhaps the most powerful aspect of classical thermodynamics is that it is independent of the specific formulation of statistical mechanics which gives rise to it as an emergent mathematical framework; as such, the definition of entropy exists completely independently of entanglement (as implied by the applicability of the entropy concept to information theory). Reference to entanglement is therefore unnecessary. If you think that a subsection on this is appropriate, find some external resources and write the section. Calavicci (talk) 19:38, 1 November 2009 (UTC)[reply]
I'm not sure that any definition that doesn't mention the entanglement part would be a complete definition rather than just a classical approximation. The article already has a subsection on entropy in QM: [[10]]. But it is technical. And it doesn't mention the entanglement part. The main article on Von_Neumann_entropy does (a small numerical check of the quoted example is given below):
Intuitively, this can be understood as follows: In quantum mechanics, the entropy of the joint system can be less than the sum of the entropies of its components because the components may be entangled. For instance, the Bell state of two spin-1/2's, (|↑↓⟩ + |↓↑⟩)/√2, is a pure state with zero entropy, but each spin has maximum entropy when considered individually. The entropy in one spin can be "cancelled" by being correlated with the entropy of the other.
--Dc987 (talk) 23:02, 4 November 2009 (UTC)[reply]
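A small numerical check of the quoted example (a sketch using NumPy; the basis ordering |↑↑⟩, |↑↓⟩, |↓↑⟩, |↓↓⟩ and the helper function are my own choices, not taken from the quoted article):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # discard numerical zeros
    return -np.sum(evals * np.log(evals))

# Bell state (|up,down> + |down,up>)/sqrt(2) in the basis {uu, ud, du, dd}
psi = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())             # density matrix of the joint (pure) state

# Reduced density matrix of the first spin: partial trace over the second spin
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(von_neumann_entropy(rho))    # ~0.0   : the joint state is pure
print(von_neumann_entropy(rho_A))  # ~0.693 : ln 2, each spin alone is maximally mixed
```

The joint state carries zero entropy while each spin alone carries ln 2, which is exactly the "cancellation" the quoted passage describes.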
A definition of entropy without mentioning entanglement is, indeed, complete. The idea is that entanglement is only one of the numerous properties of systems which can lead to entropy when analyzed statistically. Entropy is not dependent on entanglement to exist. Trying to incorporate it in the definition is trying to define an abstract concept with a concrete example, like defining a derivative by saying that velocity is the first derivative of position with respect to time. Calavicci (talk) 21:06, 6 December 2009 (UTC)[reply]
"Entropy is not dependent on entanglement to exist." - I wouldn't be so sure about that. If you consider the QM definition of entropy, the density matrix pretty much specifies the entanglement between the system and the observer. --Dc987 (talk) 06:28, 8 December 2009 (UTC)[reply]

But entanglements can be reversible, whereas entropy is not. This suggests entropy is not a measure of entanglements, but only of irreversible entanglements.--Michael C. Price talk 19:50, 1 November 2009 (UTC)[reply]

Er, you can either use entropy to quantify the entanglement between two systems or you can not. You can. So entropy is a measure of entanglement. The opposite is not necessarily true, but either way, entanglement is an important part of entropy and I think it should be mentioned in the article.
Offtopic: Speculatively, physical entropy and the second law can be a consequence of entanglement and decoherence, but I don't think it is an established fact. IMHO to discuss the reversibility you need the time arrow, and this arrow would point backwards if you reverse an entanglement (if you really reverse it, not just leak to the environment); but it's not really relevant here. --Dc987 (talk) 23:02, 4 November 2009 (UTC)[reply]
Your use of entropy as a measure of entanglement violates the 2nd Law, where the entanglements are reversible.--Michael C. Price talk 02:13, 5 November 2009 (UTC)[reply]
Er, it is not me who uses entropy that way. See: [11], [12]
Continuing Offtopic: Not really. Reversible by definition implies that the arrow of time is pointing backwards. And the 2nd Law only requires the increase of entropy in the direction of the arrow of time. BTW, can you give an example of reversible entanglement?
--Dc987 (talk) 05:57, 5 November 2009 (UTC)[reply]

There are two statements:
1. Entanglement entropy is merely one of the contributions to the whole entropy;
2. The whole entropy equals the entanglement entropy;

I understand that the first statement is a well-established fact. The second statement is speculation. --Dc987 (talk) 05:57, 5 November 2009 (UTC)[reply]


I think the relation between entropy and entanglement is worth discussing in this article after we finish rewriting the first few sections. Perhaps some easy to understand example involving qubit states can be used to explain it. Count Iblis (talk) 14:31, 5 November 2009 (UTC)[reply]
About reversibility, in principle everything is reversible (at least, assuming unitary time evolution). The reason that entropy can increase is that we don't look at the exact microscopic state of the system, but instead at a coarse-grained picture of it. I think that entropy in QM can be formally defined just like in the classical case; the only difference is that you have to replace "bits" by "qubits". This then automatically includes all the subtleties caused by entanglement. But to be sure, I'll look it up. Count Iblis (talk) 14:39, 5 November 2009 (UTC)[reply]
Hehe. You've mentioned "easy to understand" and "qubit states" in the same sentence.
Yes, entropy in QM can formally be defined just like in the classical case; I think the Von Neumann definition does it. The trouble with this definition is that it takes a lot of effort to take in, while a few examples involving a couple of electrons are very intuitive. --Dc987 (talk) 20:32, 5 November 2009 (UTC)[reply]
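For reference, the definition being referred to, in standard textbook notation (a sketch, not a quotation from the Von Neumann entropy article):

```latex
% Von Neumann entropy of a state described by the density matrix \rho:
\begin{equation}
  S(\rho) = -k_B\,\mathrm{Tr}\!\left(\rho \ln \rho\right)
          = -k_B \sum_i \lambda_i \ln \lambda_i ,
\end{equation}
% where the \lambda_i are the eigenvalues of \rho. When \rho is diagonal in a
% classical basis this reduces to the Gibbs entropy, and for a pure state
% (\rho = |\psi\rangle\langle\psi|) it vanishes.
```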

Entropy vs Chaos

I've heard mention, in talking with some physicists, that even though entropy is counted as chaos in a small closed system, on a universal scale it would be more accurate to describe entropy as a movement towards order... in other words, the heat death of the universe would result in the closest thing to absolute predictability of the universe. However, I'm unable to find what I would consider a good citation for this. —Preceding unsigned comment added by Starfyredragon (talkcontribs) 21:13, 3 November 2009 (UTC)[reply]

Weird thing for them to say, since the heat death has maximal heat released and heat, by definition, is unpredictable.--Michael C. Price talk 02:16, 5 November 2009 (UTC)[reply]

Cosmic inflation and entropy: please expand

Would someone expand the section which reads, "The entropy gap is widely believed to have been originally opened up by the early rapid exponential expansion of the universe"? It needs sources to start with. Also, did expansion open a gap because when distant stars and planets become unobservable, their entropy contributions no longer count and the observable universe decreases in entropy? Or was there solely an increase in the total possible entropy of the universe? If so, was this because the universe cooled down and more kinds of broken symmetry (like atoms and molecules) became possible? Or was it because there was just more space for particles to be distributed in? I'm sure there are some egregiously wrong-headed questions in here - ruling them out may be a way to begin adding more content to the article. Wnt (talk) 07:09, 21 November 2009 (UTC)[reply]

Greek etymology

I know this sounds a bit trivial, but still.. :P

Do you think this section:

The word "entropy" is derived from the Greek εντροπία "a turning toward" (εν- "in" + τροπή "a turning")

(right above the TOC)

should be placed, in a shortened form, right next to the first instance of the word "Entropy" at the very beginning of this article? That's how it's usually done with other scientific/philosophical/psychological/whatever Greek/Latin/etc.-derived terms (e.g. Dynamics, Logos, etc.), and I think it makes more sense for it to be right where the word is found for the first time rather than somewhere else in the page. Something like "Entropy (from the Greek εντροπία, "a turning toward")[4]" maybe..

Sounds good? • jujimufu (talk) 10:51, 23 November 2009 (UTC)[reply]

Focus of this article

It seems to me that this article is presently off point. We have an article on Entropy (information theory). This page is stated to be on "entropy in thermodynamics". Nevertheless, the focus of the introduction is on information entropy, and there is considerable discussion of that topic in this article. This article needs to be refocused on the material it purportedly covers according to the disambiguation information: thermodynamic entropy. What's more, having a very technical article and a redirect to an "introduction to entropy" page just really isn't the right way to do things. I am sure that this article can be written in a way that is initially clear to the layperson, or at least to the generally technical person, and then progresses into more technical material later. Furthermore, I am sure that the material on information entropy will benefit from being moved to its existing, dedicated page. I will thus begin to make edits in these directions. Locke9k (talk) 00:36, 24 November 2009 (UTC)[reply]

There is Classical thermodynamics (1824) and Statistical thermodynamics (1870+). It seems to me that you are trying to move the article towards the 1824 view. --Dc987 (talk) 10:36, 25 November 2009 (UTC)[reply]
I'm not clear what you mean exactly. I think that the reworked intro clearly reflects the statistical thermodynamic view of entropy, as well as the classical view. The only thing I am trying to move the focus away from is information entropy, which is actually a separate (although related) topic and was being made the focus of parts of this page. I think that it's very important to cover both the classical and statistical mechanical views of entropy within this article, as it's not really possible for a lay person to get the complete historical and physical view of entropy otherwise. Also, I don't think that these two viewpoints are in any way competing; rather, they are complementary. Locke9k (talk) 16:45, 25 November 2009 (UTC)[reply]

Original research

Various parts of this article are uncited, dubious, and seem likely to be original research. I am removing such material and, where it seems useful, copying it to this page. Please do not re-add it to the article without citation. Following is the first such section.

"The true importance of entropy and the second law can be understood by altering the second law in a "toy universe" and studying the implications. Given the knowledge we now possess about atomic/quantum theory, it is no stretch to generalize all universal phenomena as "particle interaction". If the second law required entropy to remain constant, then particle behavior would be limited so as not to increase the balance of interacting systems (new interactions would only be allowed if they terminated old interactions). The universe would either be static or strange. And if the second law were reversed (total universal entropy decreased with time) then particle interaction would either require a certain amount of particle combination (annihilation of sets of particles into smaller sets) or interaction itself would be required to diminish. However, this does not mean that the second law bestows absolute freedom of interaction; total universal entropy must still increase with any physical process. Thus the spontaneity of any physical process can be predicted by calculating its (the more positive is, the more probable the process is). "

Locke9k (talk) 04:13, 24 November 2009 (UTC)[reply]

Sounds like a MaxEnt approach. --Dc987 (talk) 20:36, 24 November 2009 (UTC)[reply]

Deck of cards analogy

All states of the deck of cards are equally likely. So why specify any one state to start with and then declare that another state is more randomly arranged after shuffling? I have seen this before and doubted it then; no specific state has any more combinations than one, the same number as the starting state. —Preceding unsigned comment added by Quaggy (talkcontribs) 18:01, 30 November 2009 (UTC)[reply]

What is left out in the text (in order not to make it technical, I presume) is that you need to count the number of microstates consistent with a certain macrostate. In this case we distinguish between decks of cards for which we do not have any information about how they are arranged and decks of cards for which we know a priori that they are arranged in some neat, ordered way. These two types of decks of cards have to be considered different macroscopic objects.
The entropy is then the logarithm of the number of ways you can re-arrange the cards such that it would still be the same macrostate. Now, what a macrostate is is a matter of definition (in practice we make the choice that is most relevant from an experimental point of view). In this case we can define two macrostates corresponding to the deck of cards being ordered or not ordered, but you can choose an alternative definition in which you do not make any such distinction. Count Iblis (talk) 18:21, 30 November 2009 (UTC)[reply]
I think the present example, although Count Iblis is strictly correct, is not the best. I think it would be better to define, for example, a macrostate where all the black cards are on top, all the red cards on the bottom. Then we could talk about the number of microstates that correspond to this macrostate. That might be enough, but then we might make the intuitive argument that a macrostate where no five cards in a row were the same color would be a macrostate that had more microstates, and therefore more entropy than the first macrostate. PAR (talk) 00:34, 1 December 2009 (UTC)[reply]
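A small numerical illustration of the macrostate PAR proposes (a sketch in Python; entropy is taken in units where k = 1, and the counting convention is an assumption for illustration only):

```python
from math import comb, factorial, log

# Macrostate A: all 26 black cards on top, all 26 red cards on the bottom.
# The black cards can still be ordered among themselves in 26! ways, and
# likewise the red cards, so this macrostate contains 26! * 26! microstates.
omega_A = factorial(26) * factorial(26)

# Macrostate B: "any arrangement at all" - every one of the 52! orderings counts.
omega_B = factorial(52)

# Entropy of each macrostate, S = ln(Omega), in units where k = 1.
print(log(omega_A))   # ~122.5
print(log(omega_B))   # ~156.4

# Counting only the colour pattern: the ordered macrostate is a single pattern
# out of comb(52, 26) ~ 5e14 possible patterns.
print(comb(52, 26))
```

The "all black on top" macrostate is compatible with far fewer arrangements than an unconstrained deck, which is the sense in which it carries less entropy.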

Confusing statement

From the lead:

"Increases in entropy correspond to irreversible changes in a system"

From a layman's standpoint this is clearly not true. Taking the melting-ice example in the graphic, the ice can be re-frozen. Does it mean irreversible without expenditure of energy perhaps? My competence in the subject is not sufficient to make any changes. 81.129.128.164 (talk) 22:01, 1 January 2010 (UTC).[reply]

  1. ^ Clausius, Rudolf. (1879). Mechanical Theory of Heat, 2nd Edition. London: Macmillan & Co.
  2. ^ Harper, Douglas. "Energy". Online Etymology Dictionary.
  3. ^ International Union of Pure and Applied Chemistry (1993). Quantities, Units and Symbols in Physical Chemistry, 2nd edition, Oxford: Blackwell Science. ISBN 0-632-03583-8. p. 12. Electronic version.
  4. ^ Clausius, Rudolf. (1879). Mechanical Theory of Heat, 2nd Edition. London: Macmillan & Co.