
Talk:Entropy

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Arjun r acharya (talk | contribs) at 23:18, 17 August 2008 (An editor put an Arxiv paper in Further Reading). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Former good article: Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.

Article milestones
  • June 22, 2006 – Good article nominee – Listed
  • February 20, 2008 – Good article reassessment – Delisted

Current status: Delisted good article

New intro

Rudolf Clausius' 1879 book (2nd Ed.) Mechanical Theory of Heat (see page: 107 beginnings of entropy discussions) is now available in Google books. Thus, I have started updating the intro to the correct presentation, i.e. in Clausius' own words. --Sadi Carnot 21:36, 30 July 2007 (UTC)[reply]

I have reverted your last change which was:

"In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[1] In short, entropy is a variable that quantifies the effects of irreversibility in natural processes."

Your paragraph is all true, but it is quite unintelligible to the average reader and is far too concise. You also left the lead far too short and not a summary of the whole article. However, I recognise that you probably were going to add something more. Also, entropy is no longer what Clausius wrote. We should be describing entropy as it is now understood and used, not its historical roots. Please stop and discuss your changes here. --Bduke 23:28, 30 July 2007 (UTC)[reply]

Current lead

Bduke, all I did was move the bulk of the lead to an "overview" section. The current lead paragraph (which is completely un-referenced), shown below, is filled with errors (especially the etymology):

The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn) [wrong (τροπή meaning "transformation", )]) in thermodynamics is central to the 2nd law of thermodynamics, which deals with physical processes and whether they occur spontaneously [wrong (the measure of spontaneity is "free energy" as per the combined law of thermodynamics)]. Spontaneous changes occur with an increase in entropy [wrong (only for isolated systems)]. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed [close (in some cases, but no reference)]. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved [correct (but does not explain what the connection is to entropy)].

I'll move the bulk of the lead back in, but I'm still correcting all this mess; for instance, all these suppositions need to be referenced. As to your statement "entropy is not anymore what Clausius wrote", there is some truth to this (in verbal terms), but entropy, at its core, is what he wrote (in conceptual and mathematical terms). Now that the original paper is available, I intend to include a blend of this, as well as modern views, in the lead. No need to do any further reverting; please work together with me on this. --Sadi Carnot 07:05, 31 July 2007 (UTC)[reply]

It seems good now. I'm guessing, however, that if the lead keeps growing, some of it will have to be moved into a new "overview" section (which is what I was attempting to do before), as per WP:LEAD, which states that the opening section "should contain up to four paragraphs, should be carefully sourced as appropriate, and should be written in a clear, accessible style so as to invite a reading of the full article". --Sadi Carnot 07:16, 31 July 2007 (UTC)[reply]

Sadi, please discuss it more here and let us see what others think. I do not agree one little bit that it "seems good now", but I'm not going to revert. The problem is that the new first paragraph is NOT "written in a clear, accessible style so as to invite a reading of the full article". It will be completely off-putting to most readers, particularly those who are coming to it from a discipline other than physics but realise that this is a part of physics they need to know about. This has been a long-term problem with this article, and particularly its lead, but I just do not seem to be able to convince you and others. I cannot work together with you on it, because it is the very opposite of what I would like to see. Keep the lead simple. Let it attract people with very different views of why they came to read it. Make it intelligible. --Bduke 07:38, 31 July 2007 (UTC)[reply]

I agree with you: any subject should be written in the manner that best conveys information in a digestible way. One should not, however, bend, twist, misconstrue or even misrepresent basic science and logic for the sake of readability. As things currently stand, we are debating the first two sentences of the article. Please explain what the fuss is about with these two sentences. All I did was correct wrong information and add a reference. --Sadi Carnot 08:00, 31 July 2007 (UTC)[reply]

Lead comparison

To give you a comparative idea of why the lead is in “good shape” now, below is the current lead of the energy article (with which there seem to be no issues):

In physics, energy (from the Greek ενεργός, energos, "active, working")[2] is a scalar physical quantity, often represented by the symbol E,[3] that is used to describe a conserved property of objects and systems of objects.
In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[4]

There really is no need to make a big fuss over entropy; it’s basically the same thing as energy, only in a non-conservative sense. If you think the average reader is going to be “put off” by this sentence, then you might as well go over to the energy article and post a note on that talk page as well, because I see no difference between these two sentences in terms of difficulty. In short, the first sentence has to define the term. This is the way it is in all science articles. --Sadi Carnot 08:08, 31 July 2007 (UTC)[reply]

I think the lead to energy could be improved somewhat, but it really is not as difficult or off-putting as the current one to "Entropy". I do not think the first sentence has to define the term. It may do so, but often it is better to say in general terms what it is about, where it is used, etc., and define it later. This does not mean being inexact or misleading. I do not want to sound patronising, but I think it is clear that you have never taught entropy or energy to people who are somewhat apprehensive about the topics. If you had, you would see quite clearly what to me "the fuss is about". It has to attract people. It has to be simple, so readers can decide whether they need to go deeper. It currently does not do these things. I am busy with other things, so I am going to leave it to you. When you are done, put it to peer review and try to get it to featured article status. That will bring many others to comment on the article. --Bduke 09:32, 31 July 2007 (UTC)[reply]

(Edit conflict note: this was written before seeing Bduke's post above) IMO, the lead sentence is as clear as mud. What is transformation-content? What is "dissipative energy use"? That 19th century quote can be impenetrable to the modern reader. I'd rather have a definition like this one (although it's not perfect either):
"Quantity the change in which is equal to the heat brought to the system in a reversible process at constant temperature divided by that temperature. Entropy is zero for an ideally ordered crystal at 0 K. In statistical thermodynamics, S = k ln W, where k is the Boltzmann constant and W the number of possible arrangements of the system."[1]
This definition has the deficiency of not saying what entropy is good for or what it "is", but it is concrete and clear. Saying what entropy "is" gets into issues of interpretations or analogies, of which everyone has a favorite. --Itub 09:41, 31 July 2007 (UTC)[reply]
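The first half of Itub's quoted definition ("heat brought to the system in a reversible process at constant temperature divided by that temperature") can be made concrete with a standard textbook calculation; here is a minimal Python sketch (the latent-heat figure is an assumed textbook approximation, not from this page) applying ΔS = Q_rev/T to the melting of ice:

```python
# Entropy change when ice melts reversibly at its melting point,
# using dS = dQ_rev / T at constant temperature.
# Assumed values: latent heat of fusion ~334 kJ/kg, melting point 273.15 K.

LATENT_HEAT_FUSION = 334_000.0  # J/kg (approximate)
T_MELT = 273.15                 # K, at 1 atm

def entropy_of_melting(mass_kg: float) -> float:
    """Entropy change Q_rev / T for melting at constant temperature."""
    q_rev = mass_kg * LATENT_HEAT_FUSION  # heat absorbed reversibly
    return q_rev / T_MELT                 # result in J/K

delta_s = entropy_of_melting(1.0)
print(f"dS = {delta_s:.1f} J/K")  # roughly 1.2 kJ/K for a kilogram of ice
```

Because the temperature is constant during the phase change, the integral of δQ/T reduces to Q/T, which is what makes this the standard worked example.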

Dave souza's revert

Sadi, your enthusiasm for obscure historical definitions is noted, but this article is about informing newcomers to the subject and Bduke's considerable expertise on the subject has produced a much better lead than the proposed change, so I've restored it. .. dave souza, talk 09:56, 31 July 2007 (UTC)[reply]

Dave, you reverted several of my edits (corrections to errors) just now, not just the definition. I'm flexible on this; however, I want to see a reference (or several) in the opening sentence, and I don't want to see sloppy (incorrect) sentences. The sentence "spontaneous changes occur with an increase in entropy" is only correct for isolated systems; the novice reader will think it applies to all situations. The etymology is wrong too; I added an original source reference and you have reverted this too. Also, the lead needs to be four concise paragraphs, and the rest moved to an overview section. Please be considerate of my editing efforts. If you want to blend in a new reference to make it easier to read, then do so. The lead you reverted to:
The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy.
is completely incorrect, e.g. see spontaneous process. This is what I am trying to clean. --Sadi Carnot 16:48, 31 July 2007 (UTC)[reply]
Spontaneous process is misleading at best and certainly needs work. It does not clearly say that ΔS is for the system only, and that -ΔH/T is also an entropy term - for the surroundings. -ΔG/T is really a measure of the total entropy change. --Bduke 22:32, 31 July 2007 (UTC)[reply]
Well, we can discuss whether the article should sharpen its discussion on the difference between the entropy of the universe and the entropy of the system. But, as the article spontaneous process makes quite clear, chemists define the word spontaneous to mean a process in which the entropy of the universe increases - ie a process allowed by the 2nd law. Jheald 17:17, 31 July 2007 (UTC)[reply]
(As a physicist, that usage always makes me uncomfortable -- it seems misguided to me to call a reaction spontaneous, if in practice it doesn't occur spontaneously, because the reaction barrier is too high. But who am I to argue with chemists in full flood?) Jheald 17:20, 31 July 2007 (UTC)[reply]
I kind of agree, but the distinction between thermodynamic control and kinetic control of a reaction is a useful one. --Bduke 22:32, 31 July 2007 (UTC)[reply]

Etymology

On the subject of the derivation of the word entropy, in Greek τροπή comes from τρέπω just like for example in English "cessation" comes from "cease". The verb τρέπω is the root word, and "chase, escape, rotate, turn" gives a good sense of what it means. The noun τροπή means a τρέπω-ing, hence a turning, a changing, a transformation.

I have to agree, in the strongest terms, with Itub above, when he writes that your proposed lead sentence "is as clear as mud. What is transformation-content? What is "dissipative energy use"?" These terms should be left in the 19th century. He is so right, when he writes, they are simply "impenetrable to the modern reader". I have reverted this ancient cruft, and would do so again without hesitation. Jheald 17:43, 31 July 2007 (UTC)[reply]

Jheald, I was the one that added the original etymology (from Perrot's A to Z Dictionary of Thermodynamics), and now that I've seen the 2nd Edition of the book (page 107):
I have corrected it to how Clausius coined it. Thank you. --Sadi Carnot 17:55, 31 July 2007 (UTC)[reply]
The difference is that εν + τρέπω understood as the "chasing/ escaping/ rotating/ turning" "inside" the system actually gives quite a helpful steer towards understanding what entropy is. "Transformation" doesn't. Jheald 18:02, 31 July 2007 (UTC)[reply]
A measure of the unavailability of a system’s energy to do work. This actually is rather unhelpful. T_R S is a measure of the energy unavailable to do work. The dependence on the reservoir temperature T_R is fundamental. If T_R were zero, then all the energy would be available to do work. It is therefore not helpful to suggest that S on its own is a measure of the unavailability of a system’s energy to do work. Jheald 18:07, 31 July 2007 (UTC)[reply]
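For readers following Jheald's point here, it can be made precise with the standard Carnot bound (a textbook result, sketched rather than quoted from the article): of heat Q drawn from a body at temperature T, an engine rejecting waste heat to a reservoir at temperature T_R can deliver at most

```latex
W_{\max} \;=\; Q\left(1 - \frac{T_R}{T}\right) \;=\; Q - T_R\,\Delta S,
\qquad \Delta S = \frac{Q}{T}.
```

The unavailable portion is the product T_R ΔS, which vanishes as T_R goes to zero; the entropy alone does not fix how much energy is unavailable.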
Oh, and while you're at it, please learn enough about information theory to understand why saying Shannon entropy is "attenuation in phone-line signals" is imbecilic. Jheald 18:12, 31 July 2007 (UTC)[reply]

I added a second ref note per your request:

  • The etymology of entropy, in modern terms, according to Perrot’s A to Z of Thermodynamics, can be interpreted to mean, “from the Greek root εντροπη, the act of turning around (τροπη, change of direction), implying the idea of reversibility”.

I hope this helps. As to the new 2005 Oxford Dictionary of Physics definition, do you really have to complain about every reference? First Clausius is too historic, now Oxford is too unhelpful. Give me a break. I'm only trying to add references to the article so as to give it credibility, rather than original research. As to Shannon, fix it if you know of a better wording. --Sadi Carnot 18:21, 31 July 2007 (UTC)[reply]

As to Shannon, I was rather hoping you might go away and actually learn something, so you don't continue to inflict nonsense like this any more.
As for the opening definitions, no, I'm not going to give you a break. Settling for misleading is not acceptable. "A measure of the unavailability of a system’s energy to do work" is horribly misleading, because that unavailability depends utterly on the reservoir temperature.
Finally, connecting τροπη with the idea of reversibility is a spectacularly unhelpful intuition, even by your standards. What is valuable about the link with τρέπω = "chase, escape, rotate, turn" is that it gives some idea of internal molecular confusion. No, that's not what Clausius was thinking of when he coined the phrase. But it's the most valuable connection today. Clausius's original etymology frankly isn't helpful for the intro. Jheald 18:39, 31 July 2007 (UTC)[reply]

Jheald’s comments

Jheald, let me get this straight: from your point of view, I’m an imbecile and you want me to go away? --Sadi Carnot 02:36, 1 August 2007 (UTC)[reply]

No, but from time to time, like all of us, you may write things which make you look clueless. At which point, the best solution is to get a clue. I told you 18 months ago that this sort of statement about information entropy was misconceived, and yet you still trot it out. Jheald 08:22, 1 August 2007 (UTC)[reply]
In any event, thanks for the nice comments, I've added them to my user page. --Sadi Carnot 03:21, 1 August 2007 (UTC)[reply]

Economic entropy

I have changed economic entropy from being a quantitative value to a semi-quantitative value. I would go further and call it qualitative, but people might disagree. I fail to see how it can be quantitative without a mathematical definition, which the article concedes by filing it under sociological definitions. I would argue that quantitative measurements must be of a known quantity if they are to be named as such. Thanks User A1 11:26, 6 August 2007 (UTC)[reply]

Entropy and the relative number of states

Ω, the number of microstates, in S = k ln Ω might be better interpreted as a relative number of states which would be a dimensionless quantity for which the logarithm would be defined.

On p. 24 of Wolfgang Pauli's Statistical Mechanics (Vol. 4 of Pauli Lectures on Physics) he comments,

"The statistical view also permits us to formulate a definition of entropy for nonequilibrium states. For two states, 1 and 2, we have

S2 - S1 = k log(W2/W1);

leaving the additive constant unspecified, we obtain

S = k log W.

Because of the logarithm, and because the probabilities of independent states multiply, the additivity of entropy is maintained." --Jbergquist 18:41, 2 October 2007 (UTC)[reply]
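Pauli's additivity remark is easy to check numerically. The following Python sketch (the spin-counting example is an illustration of mine, not Pauli's) shows that because multiplicities of independent subsystems multiply, the k log W entropies add, and that the difference form S2 - S1 = k log(W2/W1) is insensitive to the additive constant:

```python
import math
from math import comb

K_B = 1.380649e-23  # J/K, Boltzmann's constant

def boltzmann_entropy(w: float) -> float:
    """S = k log W, with W a dimensionless (relative) number of states."""
    return K_B * math.log(w)

# Two independent subsystems, e.g. 20 spins with 10 up and 30 spins with 15 up.
w1 = comb(20, 10)
w2 = comb(30, 15)

# Multiplicities of independent systems multiply...
w_total = w1 * w2
# ...so, because of the logarithm, the entropies add:
assert math.isclose(boltzmann_entropy(w_total),
                    boltzmann_entropy(w1) + boltzmann_entropy(w2))

# The difference form never sees the unspecified additive constant:
delta_s = K_B * math.log(w2 / w1)  # S2 - S1 = k log(W2 / W1)
```

The assertion passes because log(W1 W2) = log W1 + log W2, which is exactly the point of Pauli's remark.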

Gibbs entropy as fundamental but not defined?

Under "Miscellaneous definitions", "Gibbs entropy" is described as being the "usual statistical mechanical entropy of a thermodynamic system". However, Gibbs entropy does not appear to be defined in this article, and the linked article on "Gibbs entropy" does not define some of the terms used. 68.111.243.96 20:18, 17 October 2007 (UTC)[reply]

Entropy (disambiguation)

Editors of this page might like to look over the recent to-and-fro at Entropy (disambiguation). User:Thumperward seems dead set on (IMO) making the page harder to use. Compared to e.g. this edit, he seems determined to

  • remove the link to Introduction to entropy
  • remove links directing people to find additional entropy articles in the categories
  • reduce the structuring of the page between thermodynamic entropy and information entropy.

-- all of which (IMO) are mistakes. Anyhow, there have been a series of reverts and counter-reverts (I've now had my 3 for the day), and there's discussion on the talk page there, if anybody wants to have a look. Jheald 13:52, 18 October 2007 (UTC)[reply]

Never mind, we seem to have come to agreement. Edit war over :-) Jheald 15:47, 18 October 2007 (UTC)[reply]

"calculated using the multiplicity function" ????

And if I click on the wiki link for multiplicity function I see the expression for a system of N noninteracting spins :) Also, we should avoid using misleading examples of a system whose energy levels are exactly degenerate. It is better to define it as F. Reif does in his textbook: Ω(E) is the number of energy eigenstates with energy between E and E + δE, where δE is a macroscopically small energy interval. The entropy defined in this way depends on the choice of δE, but this dependence becomes negligible in the thermodynamic limit. It cannot be set to zero, because then for generic systems Ω(E) = 1, and the entropy becomes identically zero.

Basically what happens is that if you specify the energy of a system with infinite accuracy, then there can be only one microstate compatible with that energy specification. This entropy is the so-called fine-grained entropy, while the entropy defined with the nonzero δE is the coarse-grained entropy. Count Iblis 15:24, 23 October 2007 (UTC)[reply]
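Count Iblis's point about the width of the energy window can be illustrated with a toy nondegenerate spectrum (the random-level model below is an illustration of mine, not Reif's): Ω counts eigenstates in a window of width δE, and k ln Ω depends on δE only logarithmically, while shrinking the window to zero kills the count entirely:

```python
import math
import random

def coarse_grained_entropy(levels, e0, delta_e, k=1.0):
    """S = k ln Omega, with Omega the number of eigenstates in [e0, e0 + delta_e)."""
    omega = sum(1 for e in levels if e0 <= e < e0 + delta_e)
    if omega == 0:
        raise ValueError("no eigenstates in the window: fine-graining went too far")
    return k * math.log(omega)

random.seed(0)
# A generic (nondegenerate) spectrum: 100000 levels spread over [0, 1).
levels = [random.random() for _ in range(100_000)]

s_wide = coarse_grained_entropy(levels, 0.5, 1e-2)  # ~1000 levels in the window
s_thin = coarse_grained_entropy(levels, 0.5, 1e-3)  # ~100 levels in the window
# Shrinking delta_e tenfold changes S by only about k ln 10, a correction that
# is negligible next to the extensive O(N) part in a macroscopic system. But at
# delta_e = 0 a generic window holds at most one level, and S collapses to zero:
# the fine-grained entropy described above.
```

The logarithm is what makes the choice of δE harmless: multiplying the window width rescales Ω, which only shifts S by an additive constant of order k.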

Clausius Inequality

Shouldn't the inequality be a less than or equal to sign rather than a greater than or equal to sign? That should probably be fixed. --Lee —Preceding unsigned comment added by 24.3.168.99 (talk) 05:39, 21 November 2007 (UTC)[reply]

No - the inequality is correct. Read the statement just above the equation where it states that the result should be positive or zero for the extreme situation. This is a basic statement of the 2nd Law of Thermodynamics. But thanks for checking. PhySusie (talk) 13:09, 21 November 2007 (UTC)[reply]

I wish I knew how to add things to my own comments

My mistake on that one, haha. I overlooked the part where the article explained that heat transfer from the body is positive (every class I've ever taken has had it as negative). That's why I'm used to seeing the inequality with a less-than-or-equal-to sign. Thanks again! —Preceding unsigned comment added by 24.3.168.99 (talk) 06:45, 23 November 2007 (UTC)[reply]
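For later readers of this thread: the sign flip traced above to the heat-sign convention can be written out explicitly (these are standard textbook forms, not quotes from the article). With δQ counted positive when heat flows into the system, the Clausius inequality for a cyclic process reads

```latex
\oint \frac{\delta Q}{T} \;\le\; 0 ,
```

with equality for a reversible cycle. Counting heat given off by the body as positive reverses the sign of δQ and hence of the integral, giving the greater-than-or-equal-to form; the two statements are equivalent.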

GA Sweeps (on hold)

This article has been reviewed as part of Wikipedia:WikiProject Good articles/Project quality task force in an effort to ensure all listed Good articles continue to meet the Good article criteria. In reviewing the article, I have found there are some issues that may need to be addressed, mainly whether having a separate "Introduction to" article is a sufficient replacement for writing at a general level. In my view it is not, but I'm happy to be convinced otherwise, or post the article at Good Article Reassessment for other opinions.

There are a few other fixes listed below that are needed to keep Entropy at a good article standard.

GA review – see WP:WIAGA for criteria


I was initially of two minds about using an "introduction to" article as a general-audience workaround. On the one hand, yes, it is a very good way to avoid the difficulties inherent in explaining advanced concepts at a basic level, but on the other hand, it bears similarities to a "criticisms of" POV fork. After reading through the lead of the article, I think there's a good reason to merge the "introduction to" article into this one: when writing for a general audience, one tends to focus more on clarity than absolute precision, and it is this clarity that is crucial for the lead of this sort of highly technical article. Also, most of the sections in this article are summaries of other articles, but still focused at a very high level. It would be better if the summaries were geared to a general audience and the high-level material left for the main articles.

  1. Is it reasonably well written?
    A. Prose quality:
    The lead could be made clearer and more focussed, especially the first paragraph. See comment after review.
    B. MoS compliance:
There are some Manual of Style problems in addition to the general-audience discussion above. They're only small errors, but there are quite a few of them. For example, remember to avoid slipping into textbook style and using "we" in derivations, and that punctuation after a math tag must go inside. See Wikipedia:Manual of Style (mathematics).
  2. Is it factually accurate and verifiable?
    A. References to sources:
    B. Citation of reliable sources where necessary:
There are a number of unsourced facts, especially in the history section. The GA criteria specify that, at a minimum, every statement that could be contested must be sourced with an inline citation.
    C. No original research:
    The ice melting example has to count as original research unless it is sourced. As it's apparently used in many textbooks, this won't be hard to fix.
  3. Is it broad in its coverage?
    A. Major aspects:
    B. Focused:
  4. Is it neutral?
    Fair representation without bias:
  5. Is it stable?
    No edit wars, etc:
  6. Does it contain images to illustrate the topic?
    A. Images are copyright tagged, and non-free images have fair use rationales:
    B. Images are provided where possible and appropriate, with suitable captions:
  7. Overall:
    Pass or Fail:


Regarding the lead, it contains 7 different definitions of entropy:

  1. [it] is a measure of the unavailability of a system’s energy to do work.
  2. is a measure of the randomness of molecules in a system.
  3. entropy is a function of a quantity of heat which shows the possibility of conversion of that heat into work.
  4. [is] the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state
  5. [it has] often been defined as a change to a more disordered state at a molecular level.
  6. has been interpreted in terms of the "dispersal" of energy.
  7. is defined by the differential quantity dS = δQ / T

While I'm sure they're all true, it makes the lead seem very cluttered, and I come away with a cluttered idea of what entropy is. I think that if the lead were refactored such that each paragraph had a clear, single focus, it would improve the article dramatically.

Feel free to drop a message here if you have any questions, and many thanks for all the hard work that has gone into this article thus far! --jwandersTalk 03:58, 19 February 2008 (UTC)[reply]

Delisted from GA. See review directly above. --jwandersTalk 22:28, 20 February 2008 (UTC)[reply]

other sections on entropy: economics, and as metaphor

There should be some other sections on entropy as it appears in macroeconomic theory, and as it has been used outside of science as a metaphor.

The first: I do not mean the technical (information theory) use of entropy as a measure of this or that quality of information generated about one or another economic variable. I mean: there's a revolution happening in economics, as the old Newtonian mechanism is being replaced by a macro perspective that understands that economic processes are one-way flows that happen in time, in which scarce (low entropy) matter and energy is taken up into the economy and degraded (high entropy) matter and energy is extruded/exhausted/discarded. I wrote a 'graph on this that seems to have disappeared. Maybe I didn't follow directions or rules?

The second: the idea of entropy has been widely used by poets, historians, fiction writers, thinkers of all kinds, some of whom understood it, some of whom didn't. Still, a comprehensive article on the subject could give a brief survey of these uses. I wrote some text on this, too, and it isn't there today. Problem? —Preceding unsigned comment added by 128.252.254.30 (talk) 19:24, 1 March 2008 (UTC)[reply]

Lede context

Hello,

A few questions, which I post without any

  1. is thermodynamics a branch of physics? I always thought of it as a branch of chemistry - particularly the statistical mechanics considerations, though I can see how it could go either way.
  2. Is entropy really a purely thermodynamic property? I would have thought that entropy is a statistical property which finds use in fields such as thermodynamics, information theory, etc.

Maybe I am just favouring statistical mechanics... Thanks User A1 (talk) 22:59, 25 March 2008 (UTC)[reply]

Ad 1. Our Thermodynamics article starts like this: "Thermodynamics (...) is a branch of physics that studies the effects of changes...". The article History of entropy describes how the notion was already well developed (see also Classical thermodynamics and Entropy (classical thermodynamics)) before the statistical explanation was developed (see Statistical thermodynamics and Entropy (statistical thermodynamics)).
Ad 2. Entropy is not a purely thermodynamic concept, although it originally was, and the statistical definition used in thermodynamics is specific to that field. However, as it is, it is the thermodynamic concept that is described by this article. I am in favour of renaming this article Entropy (thermodynamics), a name that currently redirects here, as does Thermodynamic entropy. See also the discussion raging at Talk:Entropy (disambiguation).  --Lambiam 21:40, 26 March 2008 (UTC)[reply]


Requested move

EntropyEntropy (thermodynamics) — The article appears to discuss thermodynamics only, and fails to review entropy in other branches of physics, information science and mathematics. —linas (talk) 04:14, 27 March 2008 (UTC)[reply]

Once again, the stupidity of masses rears its ugly head, as the above exhibits in spades. At the risk of being uncivil, I say "fuck wikipedia". If this is what the cornhole editors with their heads stuck up their asses want, this is what they get. linas (talk) 02:16, 8 April 2008 (UTC)[reply]

Error in Explanation

'then entropy may be (most concretely) visualized as the "scrap" or "useless" energy'

Usually in an article discussing a useful combination of more basic physical quantities, the units of the item are given. In this article they are not explicitly covered. Big mistake. And it leads to incorrect statements like the one above. Entropy is not energy. The term energy has a whole lot of baggage that comes with it, and to suggest that entropy carries the same baggage (say like conservation) contributes to a gross misunderstanding of what is going on. I hope authors/editors will be much more careful. Properly presenting the ideas of physical chemistry requires much more rigor than present in this article. blackcloak (talk) 05:32, 7 June 2008 (UTC)[reply]

Thanks for the comment, this article has been subject to a sort of tug of war between various perceptions of how to explain a difficult concept involving advanced mathematics in a simple way accessible to the layman. This earlier version was edited by an educator, and may be nearer what you were looking for. The article's gone through numerous intermediate stages, as in this version, and the lead has been stripped down to the point where it's probably missing out on essentials while still including misleading cruft. Rather beyond me, but your assistance in a rewrite will be greatly appreciated. Note, of course, that thermodynamic entropy applies to more than physical chemistry. . dave souza, talk 08:14, 7 June 2008 (UTC)[reply]
Well, today the average lay person is much more familiar with information-theoretical concepts, because many people have a computer these days (certainly those people who visit this page :) ). So we can explain the rigorous formulation much more easily than, say, Landauer could half a century ago. Why can't Maxwell's demon be effective? Today that's almost a no-brainer to a ten-year-old. Count Iblis (talk) 13:29, 7 June 2008 (UTC)[reply]

I rarely have time these days to think about this article, but I want to make a comment in response to blackcloak. I suggest that the urge to follow "the ideas of physical chemistry requires much more rigor than present in this article" does more harm than good and probably explains why students in physical chemistry never really understand entropy. What is needed at least at first is not rigor but clarity, so the reader can see what entropy is actually about and why they need to learn about it. Rigor can follow later. I am not of course suggesting that the "clarity" phase should be false, but it does not need to be rigorous. It also needs to take into account that many students have a poor background in mathematics. --Bduke (talk) 22:36, 7 June 2008 (UTC)[reply]

On the other hand, claiming entropy is "scrap or useless energy" is not clear, and is not good. It does not help understanding if entropy is confused with energy. The unusable energy is TR S, where TR is the temperature of the coldest accessible reservoir. Jheald (talk) 20:08, 9 June 2008 (UTC)[reply]
I entirely agree. My concern is the general urge for total rigour that can make the article totally unclear. --Bduke (talk) 02:29, 10 June 2008 (UTC)[reply]
Why don't we say that the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state? This definition is perfectly understandable to most lay people. Count Iblis (talk) 02:45, 10 June 2008 (UTC)[reply]
It is totally confusing to someone who comes across the term "entropy" in connection with chemistry. But even wider, where on earth do you get the idea that it "is perfectly understandable to most lay people"? Can you back that up with a source? --Bduke (talk) 23:01, 10 June 2008 (UTC)[reply]
Well, the problem is that people are taught thermal physics in the wrong way in high school. At university part of our work is to let the students unlearn what they learned in high school. Entropy is fundamentally an information theoretical or statistical concept, just like heat, temperature etc. are. If we just say that entropy is related to heat and temperature, we aren't explaining anything.
I'm not saying that we should explain everything mathematically, that's not necessary. Compare e.g. how Chaitin explains Gödel's theorem to lay people. The information theoretical approach clearly works very well here, precisely because lay people are familiar with computers, computer memory etc. etc.. Count Iblis (talk) 21:23, 11 June 2008 (UTC)[reply]
I do not know where you come from, but most of the students I have taught physical chemistry to in the last few decades had not even studied physics at High School and had not come across entropy before. I see you are a physicist. It seems to me that what is familiar and obvious to your students is very far from being so to everyone else. I give up. I just do not see a way forward. This article will continue to be dominated by physicists and it will continue to be totally unclear and unhelpful to anybody else. --Bduke (talk) 23:17, 11 June 2008 (UTC)[reply]
We do have to explain everything from the start to the students. I don't really believe that genuinely interested people who are willing to learn can fail to understand something as simple as entropy. But they do have to be open to the idea that their intuitive ideas about entropy, heat and temperature may be wrong.
The reason why people find physics difficult is because we don't teach it properly until the students go to university. Just think about how well you would read and write English if you were not taught to read and write until you were 18 years old. Now, if our reaction to this problem is to dumb things down even more, we are only going to make the problem worse. We have to keep in mind that Wikipedia is also read by many children in primary and high school. They would benefit from being exposed to real physics instead of the dumbed-down physics stuff they are taught in school. Count Iblis (talk) 02:40, 12 June 2008 (UTC)[reply]
BDuke, Just thought I would wade in here. If you want to make progress, one suggestion would be to create a page in your own user-namespace, eg User:Bduke/entropy and then use that to construct what you believe to be a good modification - that way you can actually point at something and say "this is a better explanation; what do you think" rather than "Currently the way the article does it is wrong, we should do it a better way". More likely you will get a more enthusiastic response from other editors. see WP:BOLD. On the other hand, this is more work :( - Can't win em all, huh? User A1 (talk) 00:40, 12 June 2008 (UTC)[reply]
I actually did that long ago, but under a different title which I forget. I deleted it. It led to a rewrite of the intro para as Dave Souza mentions in the second para above in this section. I had other things to do and it just reverted back to where it is now. It is just too hard unless others recognise that we do have a real problem with this article and many others. I just do not have the time to fight this alone. --Bduke (talk) 01:01, 12 June 2008 (UTC)[reply]

Defining entropy as the maximum amount of information you could theoretically store in the system without affecting its macroscopic state is not understandable to most lay people IMO. That definition only makes sense if the reader is acquainted with a quite technical meaning of "information", which takes the reader who doesn't know it in a nearly circular path of confusion. It is also counterintuitive to suggest that a gas "holds" more information than a solid, for example. What do you mean by "hold"? Why are hard drives not gaseous then? ;-) Like I suggested above already, I think it is best to start with a clear and unambiguous definition such as [6], even if it doesn't explain what entropy is good for or what it "is". The analogies and examples can come later. --Itub (talk) 08:54, 12 June 2008 (UTC)[reply]

Once more into the lead, dear friends

As a layman, my opinion is that the current lead section has some problems. In the opening sentence – "In thermodynamics (a branch of physics), entropy is a measure of the unavailability of a system’s energy to do work." – "(a branch of physics)" is superfluous and misleading as it's also a branch of chemistry and mechanical engineering. Best explained in more depth later.

"It is a measure of the randomness of molecules in a system" is completely baffling to me. The jump from availability of energy to do work to "randomness" of molecules makes no sense. I'd be much happier with modified borrowings from the introduction to entropy article, explaining what it is and how it's measured in statistical terms. Thus, a proposal, concluding with the statistical and information meanings. The derivation of the term adds nothing to basic understanding, and so should be moved to the body of the text. –

In thermodynamics, entropy is a measure of the unavailability of a system’s energy to do work. It provides a measure of certain aspects of energy in relation to absolute temperature, and is one of the three basic thermodynamic potentials: U (internal energy), S (entropy) and A (Helmholtz energy). Entropy is a measure of the uniformity of the distribution of energy. In calculations, entropy is symbolised by S and is a measure at a particular instant, a state function. Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Often change in entropy, symbolised by ΔS, is referred to in relation to change in energy, δQ.
It is central to the second law of thermodynamics and the fundamental thermodynamic relation, which deal with physical processes and whether they occur spontaneously. Spontaneous changes, in isolated systems, occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed.
Statistical mechanics introduces calculation of entropy using probability theory to find the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. Statistical mechanical entropy is mathematically similar to Shannon entropy which is part of information theory, where energy is not involved. This similarity means that some probabilistic aspects of thermodynamics are replicated in information theory.

The question of use in physics, chemistry and engineering is covered by the mention of temperature, pressure, density, and chemical potential, in my opinion. To refer to the previous discussion, saying that "the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state" is meaningless to me. If it has some deep meaning not covered by the last paragraph, perhaps that explanation could be simplified and added. . . dave souza, talk 09:30, 24 June 2008 (UTC)[reply]

Not so bad as a starting point, but I do have some issues:
  • a measure of certain aspects of energy isn't good. It's vague to the point of being confusing, rather than clarifying. (1) It's too abstract. Entropy is a particular property of particular physical systems. On the other hand certain aspects of energy suggests (to me) certain characteristics of energy such as eg that energy can't be created or destroyed. But what would be a measure of that? So I find the phrasing unintuitive and confusing. (2) It's not just energy that entropy is associated with. You can know things about the system that change its entropy, without changing its energy distribution.
  • Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Entropy isn't energy. And the relationship is a differential one, because adding heat to a system will generally change its temperature. Furthermore, Q is not a state function. (Incidentally, a measure at a particular instant is a curious way to define a state function; though I think I see what you're getting at).
  • the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. What you mean, I think, is that a microstate contains all the information of the whole system; so defines exactly how all the energy is distributed across the whole system at that instant. But what's written is very unintuitive. I fear anyone who doesn't already know what a microstate is will inevitably think of all the energy piled up in one place.
  • The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. This is gibberish. What is it trying to say?
As for entropy as a measure of the freedom/randomness/uncertainty/information of a system. This isn't meant to follow intuitively from the "unavailability of a system's energy to do work". Rather, it's a deeper, contrasting, microscopic view of what entropy fundamentally is. The laws of physics are deterministic, and they preserve uncertainty. So if you're pushing heat through a heat engine, there has to be at least as much uncertainty (entropy) associated with it at the exhaust stage as at the input stage. If the exhaust is at a temperature TR, that means an energy of at least TR S must be dumped as waste heat, if S was the entropy put in at the input stage.
That's why saying the freedom/randomness/uncertainty/information can't decrease is equivalent to there being a limit on the availability of the system's energy to do work. Jheald (talk) 10:41, 24 June 2008 (UTC)[reply]
Perhaps it would help if we don't stick to the wiki-convention of defining the term in the first sentence. We could just say in the first sentence that entropy is an important quantity in thermal physics, and then give a general introduction of thermal physics. We should not mention entropy again until we are ready to give a precise definition that is understandable to the reader who has read the text till that point. Count Iblis (talk) 13:32, 24 June 2008 (UTC)[reply]

User Comment

I have a B.S. in physics, and I am studying for the subject test, so I use Wikipedia a lot. I really appreciate how useful Wikipedia is. I have a collection of texts that I have either saved from undergraduate classes or purchased here and there, but the articles on Wikipedia usually clarify some otherwise mystifying points about certain topics. While I deeply appreciate all the efforts that everyone has made to present a valid, understandable presentation, there are two things that concern me. One is about the contributors to Wikipedia in general, and the other is about this article in particular.

First, I know that one of the specified conditions is that people be courteous to each other, but they are obviously not. This disturbs me on several levels. It disturbs me on a casual level, just because it is so distracting when I'm trying to study. When I run into all this rude stuff that people are writing to and about each other, it makes me want to just shut it down. On a more professional level, it seems to reflect the general lack of respect that scientific types seem to have for each other. I don't know if it's because of competitiveness, or because we spend too little time interacting with people, that we don't learn to treat others the way we want them to treat us. I think it's some of both. Regardless, I wish you people would be nice.

Now, about the article. Somebody break out the smelling salts. I'm interested in the concept of entropy as it pertains to physics. It is distracting to include the informational aspects of the concept mixed in with an article about physics. The two should be separate. At the beginning, there should be a link for people who are interested in the informational aspects of the concept. Just because both uses of the term are equally valid and important doesn't mean that they have to be mixed together in the same article. I have seen countless other articles where this problem has been addressed very effectively. The reader is directed to the article about the usage that he or she is interested in. If one so desires, he or she can go back and read the other article. Putting the information in separate articles doesn't slight one discipline or the other. Putting them together is not efficient for the reader. I don't know why there needs to be so much debate about it. It seems clearly disorganized to mix it all together. Thank you for all your hard work. Is there enough bread around my compliment, critique, compliment sandwich? —Preceding unsigned comment added by 98.163.102.226 (talk) 17:35, 8 July 2008 (UTC)[reply]

Entropy and Information theory

On 9 July 2008 the following text was added by Kissnmakeup. These comments are not appropriate for the article Entropy, but are most appropriate for this Talk page. The text was deleted by Jheald on 9 July.

This does not belong in this article. This article is about thermodynamics. There is a link at the top of this article for readers who want to know about information theory. The person or people who insist that this should be here need to reconsider out of courtesy and common sense. Both uses of the term should have their own articles. There is no reason why an article about thermodynamics should include a discussion about information theory.

I am restoring the text, this time to Talk:Entropy, for the benefit of those who regularly work on Entropy. Dolphin51 (talk) 03:00, 10 July 2008 (UTC)[reply]

Fair enough. I think the section at the moment clearly isn't explaining well enough what I also wrote in my edit summary, viz. that many people do find this interpretation very useful as a way of understanding *thermodynamic* entropy.
Specifically, (as the Maximum entropy thermodynamics page puts it) that statistical mechanics should be seen as an inference process: a specific application of inference techniques rooted in information theory. And that thermodynamic entropy is exactly the same thing as information entropy - it is simply an application of information entropy to a particular situation, and a particular framework of questions of interest.
So thermodynamic entropy represents the amount of information you do not have about the exact state of a system, if all you know are the values of the macroscopic variables.
That explains why learning information about the system (eg as in the Szilard engine thought experiment) can change its entropy, i.e. allow you to extract additional useful work, even if nothing has changed about the system itself.
It also, for many physicists, answers the metaphysical questions like "So what is entropy? Where does it come from?" Answer: it is exactly the same thing as information entropy, and it comes from where information entropy comes from.
The article would be better if it presented this much more directly. At the moment, what physicists who take this line actually think, and why they think it, is very much buried. It's not surprising that readers who haven't met the idea before are being confused.
In addition, the last paragraph of the current #Entropy and Information theory simply isn't true. There's a very straightforward interpretation of δQ = T dS. It's the definition of temperature -- how much energy you have to put in to increase the information entropy of the system by one nat. In systems of natural units, temperature is measured in (Energy units per nat).
Secondly, the second law of thermodynamics. The information interpretation is what reconciles the Second Law with determinism and Liouville's theorem. The laws of the universe preserve information (Liouville's theorem) - but as time goes on, they make it less useful, as more of the information you had becomes related to microscopic correlations rather than macroscopic properties. So the effective amount of information you can use about the system has gone down; corresponding to an increase in its classical thermodynamic entropy.
Most theoretical physicists think of entropy in this way, I would claim. So, for example, Seth Lloyd, Programming the Universe, just because I happen to have it to hand. Page 65 (and following): "In particular, the physical quantity known as entropy came to be seen as a measure of information registered by the individual atoms that make up matter."
Information entropy has everything to do with thermodynamic entropy. Jheald (talk) 08:10, 10 July 2008 (UTC)[reply]
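To make the δQ = T dS reading above concrete, here is a toy numeric check of my own (not from the article): for a two-level system in the canonical ensemble, with k_B = 1 so that entropy is in nats, the heat needed per nat of entropy added comes out equal to the temperature. The function names and the choice of a two-level system are just for illustration.

```python
from math import exp, log

def two_level(T, eps=1.0):
    """Equilibrium energy and Gibbs entropy (in nats) of a two-level
    system with levels 0 and eps, at temperature T (units with k_B = 1)."""
    z = 1.0 + exp(-eps / T)          # partition function
    p1 = exp(-eps / T) / z           # Boltzmann probability of the upper level
    p0 = 1.0 - p1
    energy = eps * p1
    entropy = -(p0 * log(p0) + p1 * log(p1))
    return energy, entropy

# Central difference around T = 0.5: the energy dE you must add
# per nat of entropy dS should equal the temperature itself.
T, h = 0.5, 1e-4
e_lo, s_lo = two_level(T - h)
e_hi, s_hi = two_level(T + h)
print((e_hi - e_lo) / (s_hi - s_lo))   # ≈ 0.5, i.e. ≈ T
```

The same check works at any temperature, which is just dS = dE/T in disguise.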

Further User Comment

I will concede that the yes or no question analogy illuminates the similarity between information entropy and thermodynamic entropy, but it doesn't explain thermodynamic entropy, nor does it pertain to thermodynamic entropy. To say that thermodynamic entropy is information entropy does not explain thermodynamic entropy unless one already understands information entropy which is A DIFFERENT FIELD OF STUDY. At least the vegetable soup part has been moved out of the way of the part that matters. —Preceding unsigned comment added by Kissnmakeup (talkcontribs) 23:32, 14 July 2008 (UTC)[reply]

The information theoretical definition is the fundamental definition of entropy. Most modern textbooks on thermodynamics and statistical physics teach it that way. The old fashioned way of introducing thermal physics in terms of heat engines etc. does not explain what entropy is at all. In that approach you simply have to postulate the existence of a quantity called entropy which is then related to heat and temperature. Neither heat, nor entropy can be rigorously defined in this approach. Count Iblis (talk) 00:21, 15 July 2008 (UTC)[reply]

Actually, the statistical mechanical definition makes perfect sense to me. There are x number of possible quantum states with an equal probability of the system being in any one of those states. It's like a digital computer. You have x number of possible yeses or no's, 0's or 1's, trues or falses, or blacks or whites. Whatever you call it, to say that is informational is like saying that two electrons can't have the same quantum numbers because Pauli said so. Two electrons couldn't have the same quantum numbers before Pauli came along. To say that it is because Pauli said so is silly and certainly doesn't explain physically why they can't be the same. Just like information entropy doesn't explain thermodynamics unless you already know it as information entropy. Entropy existed before information science. I think that you should include a discussion about boolean algebra, the binomial distribution, and machine language for those who already understand the same concepts from still other points of view.

This is the biggest problem in education. To teach, one must explain things in terms that the student can understand. By student, I mean someone who doesn't already know the concepts or the terminology. That is why they are students. The more one learns about something, the less connected he or she becomes with the novice. The concepts and terminology become so entrenched in the brain until it is impossible for an "expert" to look at it from the point of view of someone who doesn't know it yet. People who are emotionally needy try to make another feel stupid because they don't understand it. Such people are incapable of actually teaching, even though they may be employed as teachers. To teach, one must be able to put himself in the shoes of someone who has never heard of it, and respect the effort to learn. Treat the student with dignity, and put it in terms that can be understood. I know that is what you all are trying to do. You're doing a tremendous work in providing this information. I have a great deal of reverence for it, being from a background that is not exactly "two commas". Knowledge should not be for sale.

Now, back to entropy.

Another thing that is confusing about discussions of entropy is the lack of stress on the idea that when a system is disturbed from equilibrium, the entropy is decreased by the constraints or whatever causes the departure from equilibrium. Granted, the definitions state that the entropy of "spontaneous" (meaning "undisturbed?") processes always increases. It seems that the idea of entropy increasing is stressed so abundantly more than the spontaneous part, that it leads a novice to miss or forget the reverse process in which entropy decreases, which is very misleading. Thank you. Kissnmakeup (talk) 11:57, 18 July 2008 (UTC)[reply]

Entropy and Information/Communication Theory

This article seems to be an overview of entropy as the term has been used in many disciplines, with the main emphasis on its use in thermodynamics. The section on information theory omits a big split in meaning that is not very consequential, but needs to be pointed out. I believe that Shannon used the term for the information rate of a channel. If we define the information (in bits) of a symbol as log₂(1/p), where p is the probability of that symbol being emitted by the source, then the entropy is the mean information per symbol emitted, i.e. H = Σᵢ pᵢ log₂(1/pᵢ). Yet the article contains this snippet: "and the entropy of the message system was a measure of how much information was in the message", which is treating entropy as information.

So one definition makes entropy be a synonym for information, and the other makes entropy be an information rate.

Finally, there is this sentence: "For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message." This leads the reader to a psychological idea of entropy, which is very far from the notion of entropy in information theory, which has to do with probabilities of symbols. And the qualifier about equally probable messages is really confusing. I don't know why it is needed, nor what the author was trying to get at. DMJ001 (talk) 04:29, 6 August 2008 (UTC)[reply]
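For concreteness, the per-symbol entropy DMJ001 describes can be sketched in a few lines of Python (a toy illustration of my own, not from the article or from Shannon's paper):

```python
from math import log2

def shannon_entropy(probs):
    """Mean information per symbol: H = sum_i p_i * log2(1/p_i), in bits."""
    return sum(p * log2(1.0 / p) for p in probs if p > 0)

# A fair coin source: each symbol carries log2(1/0.5) = 1 bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit per symbol

# A biased source is more predictable, so its mean information is lower.
print(shannon_entropy([0.9, 0.1]))    # ~0.469 bits per symbol
```

Multiplying H by the source's symbols per second gives the entropy rate DMJ001 contrasts with it.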

No, this article is specifically about thermodynamic entropy. There is another article, information entropy for the (more general) concept of entropy in information theory. Information entropy appears in this article only because many people find it the most useful way to think about what thermodynamic entropy fundamentally is.
Entropy in information theory is defined for any probability distribution, and Shannon's original papers do this; that is the most fundamental form of the information entropy concept. Entropy rate, as you correctly explain, has a slightly different meaning. It's true that in information theory people sometimes talk about the entropy of a source, meaning its entropy rate, and compare this to the channel capacity. But questioned on the point, they would freely accept that this is a derived idea, the more fundamental idea being the entropy (or information) of any general probability distribution.
Not so fast there. Take a look at Shannon's original 1948 paper. He very clearly considers entropy to be information per symbol. Here is one quote: "H or H' measures the amount of information generated by the source per symbol or per second." Then he discusses messages. He gets to the probability p that some message of length N has been generated. He says that the entropy is given by H = log(1/p) / N. (The = has a dot over it to indicate, I think, that this is an estimate based on a sample.)
DMJ001 (talk) 03:07, 9 August 2008 (UTC)[reply]
Fair enough, though actually Shannon uses both meanings of entropy. In section 20 of the paper (page 35 of the Lucent pdf), he's defining entropy for general pdfs. But for a source, what's more useful is the (average) entropy per symbol - ie the average of the entropy of the probability distribution for the next symbol. It's still an entropy, it's just the entropy of a slightly more specific thing; we can call it the entropy rate if we want to think of it that way, or we can call it the entropy per symbol. Jheald (talk) 18:48, 13 August 2008 (UTC)[reply]
As to the "psychological" view of entropy, I agree this may be a stumbling block to people in the article, because entropy in information theory is just as "real" as entropy in thermodynamics. (In fact User:linas has told me he finds it rather more real [7]). On the other hand, some subjectivity of both information entropy and thermodynamic entropy isn't entirely wrong. (See the maximum entropy thermodynamics article for more discussion). Entropy in information theory is reflective of a state of knowledge; different states of knowledge give rise to different entropies. In thermodynamics too, the Szilard engine thought experiment is an illustration of this - knowing particular microscopic information can sometimes let you extract more macroscopic useful work. It's also a useful idea when thinking about entropy increase in the context of Loschmidt's paradox. So the subjective element may not be completely out of place. But as I've said before, the whole section on the connection between information entropy and thermodynamic entropy could probably stand a re-write. Jheald (talk) 10:35, 6 August 2008 (UTC)[reply]

To clarify, are you saying that the sum total of all the possible microscopic energy states along with any constraints on these particles is the equivalent of the information in the so called "message" of the information entropy definition? If so, I still don't understand why one who is trying to learn about thermodynamics must go off on a tangent and take time to also learn about all the nuances of information entropy in order to learn about thermodynamic entropy in order to provide an illustration or example of "entropy" when all a person really wants to know is what is thermodynamic entropy?Kissnmakeup (talk) 14:19, 6 August 2008 (UTC)[reply]

The "message", in the information theory perspective, would be the identification of the single 'true' microscopic state, from out of the set of all the potentially possible microscopic states and their probability distribution, that are compatible with the macroscopic variables and any other constraints on the system.
I think you are trying to extend the similarities of the entropy of statistical thermodynamics and the entropy of information theory too far.
But back to my original point about entropy as information vs. entropy as information rate. If you want entropy to be information (and I actually prefer this, even if it was not the way Shannon used the term), then the entropy of a symbol, message, or anything else, is -log of the conditional probability of the symbol, message, or whatever.
And the "condition" is whatever the observer knows at that point. Thus entropy is relative to an observer. For example, the entropy rate of an AES-encrypted counter is 128 bits per chunk to an observer that does not know the key (it looks like 128 bit random blocks), but is zero to an observer who does (she can predict each output with probability 1).
DMJ001 (talk) 03:07, 9 August 2008 (UTC)[reply]
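DMJ001's AES-counter example can be mimicked with a seeded PRNG (a loose, purely illustrative stand-in for a keyed cipher; the names here are made up): to an observer holding the key, every block is predicted with probability 1, so the missing information - the entropy - is zero, while to anyone else the stream looks like random 128-bit blocks.

```python
import random

KEY = 12345                        # hypothetical shared key / seed

def keystream(seed, n):
    """n pseudo-random 128-bit blocks from a seeded generator."""
    rng = random.Random(seed)
    return [rng.getrandbits(128) for _ in range(n)]

observed = keystream(KEY, 4)       # what goes over the wire
predicted = keystream(KEY, 4)      # what a key-holding observer computes

# With the key, each block has conditional probability 1:
# -log2(1) = 0 bits of missing information per block.
print(predicted == observed)       # True
```

Without the seed, no such prediction is possible, which is exactly the observer-relative entropy DMJ001 describes.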
Sure, but that dependence on what you know is the case with thermodynamic entropy too. See for example section VI, "the 'anthropomorphic' nature of Entropy" in E.T. Jaynes' 1965 paper Gibbs vs. Boltzmann entropies.
The -log formula is right if you know the symbol, message or anything else has a particular numerical probability. But if, based on what you know, different symbols have different possible probabilities, then the -Σ p log p formula is an appropriate quantification of your uncertainty. And if you don't know those probabilities, you should assign them so that the sum -Σ p log p is maximised (Gibbs' algorithm). See on this Jaynes' original 1957 paper. [8]. -- Jheald (talk) 18:48, 13 August 2008 (UTC)[reply]
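As a small numeric illustration of Jheald's last point (my own sketch, not from either paper): among distributions over the same set of outcomes, committing to less - spreading the probability more evenly - gives a larger -Σ p ln p, with the uniform distribution topping out at ln n.

```python
from math import log

def gibbs_entropy(probs):
    """-sum_i p_i ln p_i, in nats."""
    return -sum(p * log(p) for p in probs if p > 0)

uniform = [0.25] * 4               # maximally noncommittal: entropy = ln 4
skewed  = [0.7, 0.1, 0.1, 0.1]     # some commitment: lower entropy
peaked  = [0.97, 0.01, 0.01, 0.01] # near-certainty: entropy close to 0

# Entropy strictly decreases as the distribution gets more committed.
for dist in (uniform, skewed, peaked):
    print(gibbs_entropy(dist))
```

So "assign probabilities to maximise -Σ p ln p subject to what you know" picks the uniform distribution when nothing at all is known.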


The amount of information you are missing, by not yet having received the message, is the information entropy of the message; and also the thermodynamic entropy of the system.
The reason people find it helpful, as I've tried to write above, is they find it helps them with the question you asked, "what is thermodynamic entropy?" Answer: thermodynamic entropy is missing information; specifically, information about what microstate the universe is actually in. That helps people who worry about Loschmidt's paradox. (How can entropy really increase, if physical dynamics are deterministic and measure-preserving? Answer: the information we had is still there, but in effect unusable, so effectively we might as well forget we ever had it at all). And it meshes well with the Szilard engine scenario. (Learning some information, the entropy really is reduced, and we find we then really can (at least in principle) extract a little more useful work). Jheald (talk) 15:07, 6 August 2008 (UTC)[reply]

Now see, this is great. This is exactly what I mean. I'm just beginning to learn about thermo, so I'm not to Loschmidt's paradox or Szilard engines yet. If I were, I'd go read about Loschimdt's paradox or Szilard engines. Right now, as much as I would like to know it all, I have to manage my time because I have constraints on my system, so I have to focus on what really matters right now. I still have what some of us would call a "finite" amount of time to learn what I need to know right now about the basics, which is why there are, I presume, articles about Loschimdt's pair of ducks and lizard engines for those who are to the point that they are ready to learn about those things. Those articles about pairs of ducks and lizard engines could talk about the information entropy in thermodynamic entropy for the sake of those people who want to know about it. Whereas, in this article you could talk more about the curl of F not being zero and what that has to do with entropy, I mean something that I have heard of that at least remotely pertains to this at the appropriate level for this article, unless you want to show off to people who don't have time for it how much more you think you know and try to make students feel stupid for trying to learn, which, I think is what's going on here. You can say, for those readers who want to know about pairs of ducks, lizard engines, and information entropy, "Here, follow this link to there," and not put the entire cyclopedia Jhealdia in this one article. But, thank you for your attention and your contributions to an outstanding project. I really don't have the time to waste here. I don't think I'll be returning to this page, or sending anyone else to it either. Kissnmakeup (talk) 21:02, 6 August 2008 (UTC)[reply]

An editor put an Arxiv paper in Further Reading

Here is the paper that was recently added: Umberto Marini Bettolo Marconi, Andrea Puglisi, Lamberto Rondoni, Angelo Vulpiani (5 March 2008). "Fluctuation-Dissipation: Response Theory in Statistical Physics". arXiv.org.

I'm moving this new item here for discussion, after removing it from the article. Though it is potentially of interest, three points:

  1. This is only an Arxiv paper, not a refereed publication
  2. The other items in Entropy#Further reading are of an introductory nature
  3. The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam. EdJohnston (talk) 21:32, 17 August 2008 (UTC)[reply]

In principle, this article could be included, as it is a big review article which is in press in Physics Reports. However, it would not be appropriate to just include this article without explaining the fluctuation dissipation theorem. If we do that, then it is still not clear if this particular review article would be the best reference (I haven't read it yet). Count Iblis (talk) 22:49, 17 August 2008 (UTC)[reply]

    • Response to "An editor put an Arxiv paper in Further Reading"

First Point : Quote : "This is only an Arxiv paper, not a refereed publication"

Response : It has actually been published in Physics Reports.

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6TVP-4S2F5J5-2&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_version=1&_urlVersion=0&_userid=10&md5=5b08887e9e0b5250474fee813a641d54

Third Point : Quote : "The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam."

Response : I can assure you that this ("spam") was not intended. The fluctuation-dissipation theorem essentially relates response functions to equilibrium quantities, and it is the notion of fluctuation about equilibrium which ties all of them together (or at least that's what I had in mind when making the changes). For entropy & fluctuations, see chapter 14 of Pathria.

http://books.google.co.uk/books?hl=en&id=6cUbnWO2NNwC&dq=pathria+statistical&printsec=frontcover&source=web&ots=K3Isz3MRcW&sig=RV47jGVM6mbUT8HK8VhbAYbuFyo&sa=X&oi=book_result&resnum=4&ct=result#PPR9,M1

In regards to the second point, I think that is completely valid (and it also applies to the "Second law of thermodynamics" article, where I also appended this reference). On a second look at the entropy article, it seems a bit too in-depth a reference to be included, and I do understand why one would want to omit such a reference, for the sake of preserving clarity.
