Talk:Entropy

Entropy from "nothing"? I don't think so....

The Wikipedia article on "Entropy" states in the first line:

"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines."

This statement is interesting because, as we know, entropy is not measured in units of energy, but rather as a change of energy divided by temperature.

I also observed the Fundamental thermodynamic relation:

∆U = T∆S - p∆V

I realized that the equation somehow implies, for the same change in internal energy ∆U, that the amount of p∆V is limited only by the value of T∆S. The mainstream idea is that increasing entropy somehow increases the amount of energy not available to the system. However, if this were true, then the temperature would have to decrease faster than the entropy increases, for by doing so, if S increased, the sum of ∆U and p∆V would not have to increase. If ∆U is the change of internal energy, then it is easy to see that p∆V is the change in external energy.


Changing a pressure at constant volume is like changing a force without changing the displacement: by itself, there is no work associated with such forces. Of course you cannot just change the pressure in a system of constant volume without changing the forces in that volume. Fundamentally, the change in pressure in a "constant" volume is really the result of changes in the proximal distance between particles in that system. The hotter the particles in the volume are, the greater the variation in the distances between those particles; and because of the inverse square law, and the fact that the root mean square of a set of unequal real values is always greater than the average, the higher the average forces between the particles, even if the average proximal distance between particles does not change. Therefore, in a sense, V∆p at one scale is actually p∆V at a smaller scale. V∆p is otherwise, implicitly, a part of the internal energy ∆U.


Thus, it is obvious that T∆S = ∆U + p∆V is the change of total energy.

If S is conserved between systems, then it is easy to deduce a direct, linear relationship between temperature and energy of a given system, which is exactly what one expects from the kinetic theory of heat:

∆U = T∆S - p∆V
∆U + p∆V = T∆S
(∆U + p∆V)/T = ∆S

Where ∆S is the entropy change of one system at the expense of another (i.e. an amount of entropy "in transit").
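For reference, the relation itself can be checked on one mole of ideal gas undergoing a reversible infinitesimal change (a standard textbook exercise, independent of the interpretation argued here):

\[
dU = C_V\,dT, \qquad
T\,dS = T\Big(C_V\frac{dT}{T} + R\frac{dV}{V}\Big) = C_V\,dT + p\,dV,
\]

so that dU = T dS - p dV is satisfied identically, with T dS the heat absorbed and p dV the work done in the reversible change.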

Also notice that if T decreases with time, then for a given entropy "in transit" (∆S), the total energy "in transit" (∆U + p∆V) literally decreases, which corresponds directly with the requirement that such energy becomes unavailable to do work. Technically, this would make T∆S equivalent to exergy rather than energy. So there is no need to assume that the total entropy of a universe must increase to explain this. All that is required is that entropy is transferred from one domain to another domain of a lower temperature.

The first line of the article on fundamental thermodynamic relation states: "In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in internal energy in terms of infinitesimal changes in entropy, and volume for a closed system."

This makes it clear that the interpretation in mainstream science is that this "fundamental thermodynamic relation" is for closed systems, implying that somehow that all the entropy change is manifested directly by the system. They have no concern for the possibility that entropy could flow into what they have instead considered as a "closed system" (which reads as "a system that is thermally isolated from the environment, in the sense of not being able to receive energy from the outside"). They've come to think that energy transfer is necessary for a transfer of entropy because they treat energy as the fundamental substance of matter. So to them, the entropy in these said "closed" systems arises solely due to the interactions of the energy already present, and of that, only the part which they know how to plug into the fundamental thermodynamic relation. Thus, they do not include entropy from external sources, nor even from subatomic particles themselves, for as of yet, the entropy of systems within atoms is not apparent to them. Additionally, they have not accepted the idea that entropy can flow into a system when that system is energetically isolated from the environment. Thus, it is no wonder that they think entropy can be created from nothing.Kmarinas86 (Expert Sectioneer of Wikipedia) 19+9+14 + karma = 19+9+14 + talk = 86 15:04, 27 January 2011 (UTC)[reply]

The first (opening) paragraph is completely WRONG

"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat"

The above first "Entropy" article (opening) paragraph is completely WRONG! Entropy is NOT a "measure of the energy not available for useful work", since entropy is an equilibrium property and work is energy in transfer during a process. Actually, the higher the system temperature, the higher the entropy (everything else being the same) and the more potential for work with reference to the same surroundings (so work is not a property but a process variable). The second sentence is confusing and ambiguous, thus not accurate. Maximum efficiency is NOT "when converting energy to work," but when work is obtained in reversible processes (like the ideal Carnot cycle). The last (third) sentence is confusing and plain WRONG: "During this(?) work entropy accumulates in the system" - the entropy does NOT accumulate (what system, where?), since entropy is not associated with work but with thermal energy per unit of absolute temperature. Actually, during the ideal Carnot cycle (maximum work efficiency) the entropy is not generated but conserved: the entropy rate into the cycle is equal to the entropy rate out of the cycle. It is not a good idea to start the Entropy article with confusing and ambiguous, thus inaccurate, statements. See the entropy definition at: http://www.kostic.niu.edu/Fundamental_Nature_Laws_n_Definitions.htm#DEFINITION_of_ENTROPY —Preceding unsigned comment added by 24.14.178.97 (talk) 04:25, 14 February 2011 (UTC)[reply]

I am confused by the opening paragraph too. I agree with the above post. According to the stated definition of entropy, in a reversible thermodynamic process (0 change in entropy) 0% of the energy would not be available for useful work. So for a Carnot engine operating between two reservoirs at any two different temperatures (0 change in entropy) there would be 0% energy not available for useful work. That strikes me as being incorrect, because that would mean the engine was 100% efficient: i.e. 100% of the input energy would be available for useful work, which, of course, is not even theoretically possible (unless Tc = absolute zero).
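To make this concrete, take a Carnot engine operating between reservoirs at T_h = 600 K and T_c = 300 K (temperatures chosen only for illustration):

\[
\eta = 1 - \frac{T_c}{T_h} = 0.5, \qquad
\frac{Q_c}{Q_h} = \frac{T_c}{T_h} = 0.5, \qquad
\Delta S_{\text{total}} = -\frac{Q_h}{T_h} + \frac{Q_c}{T_c} = 0,
\]

so even with zero entropy change, half of the input heat is rejected to the cold reservoir and is unavailable for work.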
Also: what is "convertible energy"? That is a term I am not familiar with. In thermodynamics there is heat flow (Q), internal energy (U) and work (W). Which of those three categories of energy does "convertible energy" fall into?
The author(s) of the opening paragraph appear to be confusing entropy with heat flow that is unavailable to do work. They are two very different things. As is correctly pointed out above, even in an idealized reversible thermodynamic process (entropy change = 0) there is heat flow that is unavailable to do work.
This is most unfortunate. How many thermodynamics students are being confused by this incorrect definition of Entropy?! AMSask (talk) 23:16, 11 April 2011 (UTC)[reply]
Count me as one. Worse yet, as someone who works with a lot of information and computer science, the definition of "randomness" used in the article is incredibly vague and useless, and impossible to relate to anything I already understand. "Random" to me means that multiple items in a series are mutually independent, so that any item cannot be used to determine the state of another item. That definition obviously has no meaning whatsoever in the context of the provided paragraph. Now I'm aware that there is a definition of entropy that relates to the number of available states in a system, and that "randomness" is sometimes used as a shorthand for that concept, and I believe that is where the author was trying to go, but he stops well short of clarity. To sum the problem up in a sentence, I think both the paragraph regarding "convertible energy" and the "randomness" bit can be replaced with the sentence "entropy is that thing, you know what it is, with the heat and the randomness and the time's arrow, yeah that" without any loss of information or clarity. Fortunately, I found this excellent write-up while googling, which may be useful for other frustrated students looking at the talk page for the same reason I am: http://www.tim-thompson.com/entropy1.html -- Anonymous frustrated student. — Preceding unsigned comment added by 173.29.64.199 (talk) 21:26, 31 May 2011 (UTC)[reply]
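One standard way to make the "randomness" language precise, connecting it to the number of available states mentioned here (a sketch, not the article's wording):

\[
S = k_B \ln W, \qquad
H = -\sum_{i=1}^{W} p_i \log_2 p_i = \log_2 W \quad \text{when } p_i = \tfrac{1}{W},
\]

so "more random" means more equally probable microstates W compatible with the macrostate, not statistical independence of items in a data series.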

I agree with this criticism, but a previous rewrite initiative to fix this failed, because most editors here want to edit from the classical thermodynamics perspective, and then you can't give a good definition of entropy. Perhaps we'll need to try again. Count Iblis (talk) 21:39, 31 May 2011 (UTC)[reply]

The disambiguation page for entropy shows there is a whole family of articles with the word entropy in the title. This article (Entropy) is the original one, and is still devoted almost exclusively to the classical thermodynamic application of entropy, even though there is a separate article titled Entropy (classical thermodynamics). The solution to the problem described above by Count Iblis would appear to be to convert Entropy into a brief introduction to the word to explain its many applications, and then direct readers to the various articles on entropy and allow the readers to choose which application they want to investigate. Everything in Entropy that is dedicated to classical thermodynamics should be merged into Entropy (classical thermodynamics) (or Entropy (statistical thermodynamics)).
Some examples of those various articles are:
• Introduction to entropy
• Entropy (classical thermodynamics)
• Entropy (statistical thermodynamics)
• Entropy (information theory)
• Entropy in thermodynamics and information theory
Dolphin (t) 23:44, 31 May 2011 (UTC)[reply]

Inaccurate Statements

The article contains a number of inaccurate statements. Let me name a few:

"Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not..." Entropy, as defined by Clausius, is a conserved function in the Carnot cycle along with the internal energy. For efficiencies other than the Carnot efficiency the equality in the first formula in the section "Classical Thermodynamics" becomes an inequality.

"Thermodynamics is a non-conserved state function..." contradicts the first formula in the section "Classical Thermodynamics".

"For almost all practical purposes, this [Gibbs's formula for the entropy] can be taken as the fundamental definition of entropy since all other formulas for S can be derived from it, but not vice versa." This is definitely incorrect; just consider the entropy of degenerate gases which is proportional to the number of particles.Bernhlav (talk) 08:20, 2 April 2011 (UTC)[reply]


Hello and welcome to Wikipedia. I appreciate your help and encourage to fix the flaws in the article - be bold. However, a quick browse through our policies and guidelines would be handy first - I'll drop you a welcome on your personal talk page. Zakhalesh (talk) 08:23, 2 April 2011 (UTC)[reply]
As long as you can cite your corrections to reliable sources, please make them. It is unlikely someone else will if you don't. InverseHypercube (talk) 08:43, 2 April 2011 (UTC)[reply]

Hello,

Thank you. I do not know how to make the changes because that will remove references that are cited. Let me give you an example of the first of the above statements.

The concept of entropy arose from Clausius's study of the Carnot cycle [1]. In a Carnot cycle, heat Q_1 is absorbed isothermally from a 'hot' reservoir at the higher temperature T_1, and heat Q_2 is given up isothermally to a 'cold' reservoir at a lower temperature T_2. Now according to Carnot's principle, work can only be done when there is a drop in temperature, and the work should be some function of the difference in temperature and the heat absorbed. Carnot did not distinguish between Q_1 and Q_2, since he was working under the hypothesis that caloric theory was valid and hence heat was conserved [2]. Through the efforts of Clausius and Kelvin, we know that the maximum work that can be done is the product of the Carnot efficiency and the heat absorbed at the hot reservoir: W = (1 - T_2/T_1) Q_1. In order to derive the Carnot efficiency, 1 - T_2/T_1, Kelvin had to evaluate the ratio of the work done to the heat absorbed in the isothermal expansion with the help of the Carnot-Clapeyron equation, which contained an unknown function known as the Carnot function. The fact that the Carnot function could be the temperature, measured from zero, was suggested by Joule in a letter to Kelvin, and this allowed Kelvin to establish his absolute temperature scale [3]. We also know that the work is the difference between the heat absorbed at the hot reservoir and that rejected at the cold one: W = Q_1 - Q_2. Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy, and it became the first law of thermodynamics [4].

Now equating the two expressions gives Q_1/T_1 = Q_2/T_2. If we allow Q_2 to incorporate the algebraic sign, this becomes a sum, Q_1/T_1 + Q_2/T_2 = 0, which implies that there is a function of state that is conserved over a complete cycle. Clausius called this state function entropy. This is the second law of thermodynamics.

Then Clausius asked what would happen if less work were done than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work, and the equation would now be converted into an inequality, W < (1 - T_2/T_1) Q_1. When the second equation is used to express the work as a difference in heats, we get Q_1 - Q_2 < (1 - T_2/T_1) Q_1, or Q_2/T_2 > Q_1/T_1. So more heat is given off to the cold reservoir than in the Carnot cycle. If we denote the entropies by S_i = Q_i/T_i for the two states, then the above inequality can be written as a decrease in the entropy, S_1 - S_2 < 0. The wasted heat implies that irreversible processes must have prevented the cycle from carrying out maximum work.
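A numerical illustration of the inequality (temperatures and heat input chosen only for illustration): take T_1 = 600 K, T_2 = 300 K and Q_1 = 1000 J, so that

\[
\Big(1 - \frac{T_2}{T_1}\Big) Q_1 = 500\ \text{J}
\]

is the most work the cycle can deliver. If it actually delivers only W = 300 J, then Q_2 = Q_1 - W = 700 J and

\[
\frac{Q_2}{T_2} = 2.33\ \text{J/K} \;>\; \frac{Q_1}{T_1} = 1.67\ \text{J/K},
\]

i.e. more entropy is discharged at the cold reservoir than entered at the hot one, which is the signature of the irreversibility described above.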

Statistical Thermodynamics

Of all the thermodynamic functions, entropy has the unique role of having one foot in the macroscopic world and the other in the microscopic world of atoms. According to Boltzmann, the entropy is a logarithmic measure of the number of micro-complexions that correspond to, or are indistinguishable from, a given macroscopic state. Since the latter is a very large number, instead of being a proper fraction, Planck referred to it as a 'thermodynamic' probability [5]. Boltzmann was always of the opinion that large numbers gave better assurance than mere fractions.

As an illustration of how Boltzmann determined his entropy, let us consider the distribution of N particles in k urns [6]. The a priori probability of there being n_i particles in the i-th urn is p_i^{n_i}. The n_i are not all independent but must obey the condition n_1 + n_2 + ... + n_k = N, and since probability must be conserved, p_1 + p_2 + ... + p_k = 1. Then the probability that the first urn contains n_1 particles, the second urn contains n_2, and so on, is P = [N!/(n_1! n_2! ... n_k!)] p_1^{n_1} p_2^{n_2} ... p_k^{n_k}, which is known as the multinomial distribution [7]. Now Boltzmann did not know what the a priori probabilities were, so he supposed that they were all equal, p_i = 1/k, for all the urns. According to Boltzmann we should not discriminate among the urns. [But when he attaches different energies to the different urns, he precisely does discriminate among them.] The probability P thus becomes a 'thermodynamic' probability whose logarithm, ln P, he set proportional to the entropy S. The constant of proportionality, the universal gas constant divided by Avogadro's number for one mole of gas, was later determined by Planck in his study of black-body radiation, at the same time as he derived his constant [8]. Thus, Boltzmann's assumption of a priori equal probabilities [9] converts the multinomial distribution into a thermodynamic probability which, instead of being an actual probability, is a very big number.
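The connection between this 'thermodynamic' probability and the entropy formula can be sketched with Stirling's approximation (a standard step, not specific to the cited texts):

\[
\ln W = \ln\frac{N!}{n_1!\cdots n_k!} \;\approx\; -N\sum_{i=1}^{k} f_i \ln f_i,
\qquad f_i = \frac{n_i}{N},
\]

so ln W grows linearly with N (a 'very big number'), whereas the true multinomial probability carries the extra factor p_1^{n_1}⋯p_k^{n_k} that keeps it below one.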

Entropy is a measure of the error that we commit when we observe a value different from its true value, which we know to be the average value. When we specify which average we are considering, the entropy will be determined from Gauss's error law, which equates the average value with the most probable value. This implies that the entropy must be the same function of the observations that it is of the average value. And with the appropriate choice of entropy we obtain, as the probability distribution, the well-known Poisson distribution, under the condition that Stirling's approximation for the factorial is valid.

If we consider the independent probabilities of the occupation of the k different urns, we must consider the product of the probabilities, where we must now attach an index to the average value to show that it is the average value for the i-th urn. The product then gives the multinomial distribution, where the a priori probabilities are now seen to be the relative average occupations, p_i = ⟨n_i⟩/N. Thus we see that the entropy appearing here is the Gibbs entropy, which ensures that the distribution is a true probability, less than one, by virtue of its strict concavity. The strict concavity of the entropy is its defining property, and corresponds to the thermodynamic criteria of stability. Consequently, the selection of which average is the true value, together with Gauss's error law, determines the form of the entropy. The entropy is thus said to be the potential for the law of error, and this completes Boltzmann's principle [10].


I think this is clearer than what is found in the text. The same holds true for the other points mentioned.Bernhlav (talk) 16:15, 3 April 2011 (UTC)[reply]

Thank you! If you provide citations, I will try to incorporate it into the article text. InverseHypercube 17:51, 3 April 2011 (UTC)[reply]
I can certainly provide you with references. But if I told you where I took it from would that be considered self-publicity by Wikipedia? I can do the other points I have mentioned, and others that I haven't.Bernhlav (talk) 09:34, 4 April 2011 (UTC)[reply]
No, it wouldn't be considered self-publicity, unless it linked to your own site, and even then if it was of an academic nature it wouldn't be a problem (see WP:IRS for how to identify good sources for Wikipedia). Please write all your points with references (and in an encyclopedic manner), and I'll see what I can do about adding them. Thanks for your contributions! InverseHypercube 22:58, 4 April 2011 (UTC)[reply]
I have added the references and have included part of the statistical thermodynamics, but I can't see it when I click "show preview".
Note that the "Gibbs entropy" is the fundamental definition of entropy; it also applies to degenerate gases. Bernhlav (talk)
You forgot to put some closing reference tags; that's why you couldn't see it. Anyways, I fixed it now. The statistical thermodynamics section seems to be unsourced, so it can't be added as it is now (Wikipedia is very strict about citing sources). The first part seems all good, though, so I'll be adding it soon. Thanks again! InverseHypercube 23:34, 5 April 2011 (UTC)[reply]
I have added the references. There must be a mistake, I didn't say the Gibbs entropy applies to degenerate gases: it doesn't. Actually, this could have been the motivation for Planck's search for a logarithmic form of the entropy that would be valid in every frequency zone from ν to ν + dν in his theory of black-body radiation. Bernhlav (talk) 11:48, 6 April 2011 (UTC)[reply]
I wrote that. And it does apply, but you then have to take the probability distribution to be over the set of the entire N-particle states of the gas. If you work with the individual molecules, then it obviously does not work. In statistical physics, it is standard to let the "i" in "P_i" refer to the complete state of the gas, not to the state of a single molecule. Count Iblis (talk) 14:55, 6 April 2011 (UTC)[reply]
The entropy of a degenerate gas is proportional to N, the number of molecules which is not conserved, and not to the logarithm of it. Relativistically, it also follows that S and N are relativistic invariants, and are proportional to one another.

I don't understand "the complete state of the gas" as the problem concerns the combinatorics, i.e. how to distribute a certain number of particles among given states.Bernhlav (talk) 16:47, 6 April 2011 (UTC)[reply]

What you are doing is giving a treatment based on some assumptions that allows you to do statistics with the molecules themselves. This only gives the correct result in special cases. As you point out, you don't get the correct result in the case of e.g. degenerate gases. However, within statistical mechanics, we can still treat degenerate gases using the Gibbs formula for entropy, but this works by considering an ensemble of systems. So, you consider a set consisting of M copies of the entire gas of N molecules you are looking at (you can also make the number of molecules variable and fix it later using the chemical potential, appearing as a Lagrange multiplier when you maximize the probability while fixing the average internal energy and average number of particles), and the probability distribution is over the set of possible N-particle quantum states of the entire gas. Count Iblis (talk) 17:46, 6 April 2011 (UTC)[reply]
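A minimal sketch of the construction described here, with the temperature and chemical potential entering as Lagrange multipliers when the Gibbs entropy is maximized over the N-particle states i of the whole gas:

\[
\text{maximize } -\sum_i P_i \ln P_i \ \text{ subject to }\ \sum_i P_i = 1,\ \sum_i P_i E_i = \langle E\rangle,\ \sum_i P_i N_i = \langle N\rangle
\]
\[
\Longrightarrow\quad P_i = \frac{e^{-\beta(E_i - \mu N_i)}}{\sum_j e^{-\beta(E_j - \mu N_j)}},
\]

from which the Fermi-Dirac or Bose-Einstein occupation numbers follow once the states i are labelled by sets of single-particle occupation numbers.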
Either the particle number is constant and the chemical potential is a function of temperature or the particle number is a function of temperature and the chemical potential is constant, which can also be zero in the case of a boson gas. The constraint of constant internal energy on the maximization of the multinomial coefficient is contrary to the assumption that the a priori probabilities of occupation of the states are equal. The Gibbs entropy is a limiting form as the Renyi entropy clearly shows. If the Gibbs entropy would apply to degenerate gases, Planck wouldn't have bothered looking for another logarithmic entropy in each frequency interval. The Gibbs entropy corresponds to the Poisson distribution which is a limiting form of the negative binomial and binomial distributions. Hence it cannot account for FD or BE statistics.Bernhlav (talk) 07:09, 7 April 2011 (UTC)[reply]
You can simply put the whole system in a box and keep internal energy and particle number constant. Then, you deal with the mathematical difficulties of treating the Fermi gas, using various tricks/approximations that become exact in the N to infinity limit. You then find the usual Fermi-Dirac distribution. This is not a surprise, because in the thermodynamic limit, the microcanonical ensemble is equivalent to the canonical and grand canonical ensembles. Count Iblis (talk) 14:16, 7 April 2011 (UTC)[reply]
There is a terrible confusion in the literature about a seeming equivalence of ensembles. Nothing of the sort, not even in the so-called thermodynamic limit. It is impossible to define temperature in the microcanonical ensemble, and anyone who tries (and there have been many!) is wrong. Tricks or not, if the entropies are different for different probability distributions, which they are, not one entropy will give all three statistics. And since there are only two distributions (binomial and negative binomial) tending to a limiting distribution (Poisson) in the high-temperature limit, there is no such thing as intermediate, or para-, statistics. Bernhlav (talk) 16:14, 7 April 2011 (UTC)[reply]

If a system can be divided into two independent parts A and B, then

S = S_A + S_B.

Thus the entropy is additive. The number of particles in the systems is also additive. If the substance is uniform, consisting of the same type of particles at the same temperature, then any portion of it can be thought of as a sum of identical pieces, each containing the same number of particles and the same entropy. Thus the portions' entropy and particle number should be proportional to each other. So what is the problem? JRSpriggs (talk) 00:42, 15 April 2011 (UTC)[reply]

Just because two quantities are additive does not mean that they are proportional to one another! If you take an ideal gas, its entropy will be [eqn (42.7), Landau and Lifshitz, Statistical Physics, 1969]

S = N ln(eV/N) - N f'(T),

where f is some function of the temperature, and the prime means differentiation with respect to it. If I multiply all extensive variables by some constant, λ, then it is true that the entropy will be increased λ times, but the entropy has the form given above, so that it is not a linear function of N.
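The distinction can be checked directly on the form quoted above (a standard homogeneity argument, written out here for clarity):

\[
S(\lambda V, \lambda N; T) = \lambda N \ln\frac{e\,\lambda V}{\lambda N} - \lambda N f'(T) = \lambda\, S(V, N; T),
\]
\[
S(V, \lambda N; T) = \lambda N \ln\frac{eV}{\lambda N} - \lambda N f'(T) \neq \lambda\, S(V, N; T),
\]

so the entropy is extensive (homogeneous of degree one in all the extensive variables taken together) without being a linear function of N alone.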

However, the entropy of black-body radiation,

S = (4/3) a V T^3

[which is eqn (60.13) in the same reference], where a is the radiation constant, is proportional to the number of photons, N ∝ V T^3. The entropy has lost its logarithmic form. Moreover, the Renyi entropy

S_α = (1/(1-α)) ln Σ_i p_i^α

for all α > 0 is also additive (and concave in the interval 0 < α ≤ 1). It becomes the Gibbs-Shannon entropy in the limit α → 1, so that the Gibbs-Shannon entropy is a limiting form of the Renyi entropy. Not to mention the non-additive entropies, to which the Havrda-Charvat entropy,

S_q = (1/(q-1)) (1 - Σ_i p_i^q),

belongs, which is also mentioned in the text under the name of the Tsallis entropy.
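The limit mentioned here can be written out with l'Hôpital's rule (a standard calculation, added for clarity):

\[
\lim_{\alpha\to 1}\frac{\ln\sum_i p_i^{\alpha}}{1-\alpha}
= \lim_{\alpha\to 1}\frac{\sum_i p_i^{\alpha}\ln p_i \,/\, \sum_i p_i^{\alpha}}{-1}
= -\sum_i p_i \ln p_i,
\]

which is the Gibbs-Shannon form.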

There is still the more fundamental reason why the Gibbs-Shannon entropy cannot be the most general form of the entropy: it corresponds to the Poisson distribution, which is a limiting distribution in the limit where N → ∞ and p → 0 such that their product Np is constant. So the Gibbs-Shannon entropy is not the most general form of entropy from which all others can be derived. Bernhlav (talk) —Preceding unsigned comment added by 95.247.255.15 (talk) 10:27, 17 April 2011 (UTC)[reply]
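The limiting procedure referred to here is the standard binomial-to-Poisson limit (written out for clarity):

\[
\binom{N}{n} p^{n} (1-p)^{N-n} \;\longrightarrow\; \frac{\bar n^{\,n} e^{-\bar n}}{n!}
\qquad \text{as } N\to\infty,\ p\to 0,\ Np = \bar n \ \text{fixed},
\]

which is the sense in which the Poisson distribution, and the entropy corresponding to it, is described above as a limiting form of the binomial and negative-binomial cases.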

To 95.247.255.15: I was just trying to respond to the claim that degenerate gases were an exception. I did not say that additivity was sufficient by itself. I was talking about uniform substances which can be divided into many small identical parts.
If you want me to comment further on this, then please explain why your equation for the entropy of an ideal gas includes the factor N in the denominator inside the logarithm. JRSpriggs (talk) 21:42, 17 April 2011 (UTC)[reply]
Hi JRSpriggs: All you need are the equations of state PV = NRT and E = (3/2)NRT; then the Gibbs-Duhem equation

E d(1/T) + V d(P/T) - N d(μ/T) = 0

can be integrated to give

S = N s_0 + N R ln[ (E/N)^{3/2} (V/N) ],

where s_0 is a constant having dimensions of an entropy density. The exponents in the logarithm must be such that the entropy be an extensive quantity. This is explained in H. Callen, Thermodynamics, 2nd ed., pp. 67-68. The N in the denominator is nothing more than the inverse probabilities in the Gibbs-Shannon entropy,

S = Σ_i p_i ln(1/p_i).

Bernhlav (talk) 07:49, 18 April 2011 (UTC)

To Bernhlav: When varying N (quantity of substance), extensive quantities like E (energy) and V (volume) should not be held constant. Rather it is the ratios e = E/N and v = V/N which are constant. Thus your formula becomes

S = N [ s_0 + R ln( e^{3/2} v ) ],

as required. JRSpriggs (talk) 10:06, 18 April 2011 (UTC)[reply]

To JRSpriggs: If you do that, then you must also write S/N = s_0 + R ln( e^{3/2} v ), and the particle number disappears completely! You can do the same by dividing all extensive variables by V. This is because the entropy is extensive. 87.3.220.56 (talk) 06:08, 19 April 2011 (UTC)[reply]
To 87.3.220.56: Thanks for your support. Can I take it that you also agree with me that Gibbs' formula is the fundamental definition of entropy? JRSpriggs (talk) 08:40, 23 April 2011 (UTC)[reply]
The fundamental definition of thermodynamic entropy is given by the second law. It is entirely macroscopic, and has nothing to do with statistical mechanics. Statistical mechanics describes the entropy in microscopic terms, and develops a mathematical equation which is equivalent to the thermodynamic entropy. Statistical mechanics neither defines nor redefines entropy; it reveals the underlying microscopic and statistical nature of the thermodynamic entropy. PAR (talk) 14:30, 23 April 2011 (UTC)[reply]
Thermodynamics defines entropy in terms of the extensive thermodynamic variables. Its statistical counterpart is defined in terms of probabilities. A complete set of probabilities is not a complete set of extensive variables. Isn't that a re-definition? Bernhlav (talk) 15:20, 23 April 2011 (UTC)[reply]
Entropy in statistical mechanics is an integral over a complete set of probabilities, which is not the same thing. Also, these probabilities are assumed to be equal for each microstate, not measured to be equal. The fact that the calculated statistical entropy can be identified with the thermodynamic entropy supports the assumption and the whole train of reasoning. It's like you are picking through possible microscopic explanations of thermodynamic entropy until you find the right one, and accept it as an explanation, but it is not a redefinition. If statistical mechanics were to yield an explanation of entropy which is at odds with the thermodynamic definition, statistical mechanics would be wrong, not thermodynamics. Thermodynamics and entropy in particular are based on macroscopic measurements. Statistical mechanics seeks to give a microscopic explanation of those measurements. In the realm of thermodynamics, statistical mechanics makes no measurements of its own, and therefore cannot operationally redefine any thermodynamic concept. PAR (talk) 18:10, 23 April 2011 (UTC)[reply]
You can still observe correlation functions, fluctuations etc. that are beyond the realm of thermodynamics. So, statistical mechanics is more fundamental than thermodynamics; you could compare it to the relation between quantum mechanics and classical mechanics. Count Iblis (talk) 18:41, 23 April 2011 (UTC)[reply]
The 2nd law can be proven using the Gibbs-Shannon/microscopic definition. Cannot be proven from classical theory. Ergo we adopt Gibbs-Shannon as fundamental. -- cheers, Michael C. Price talk 19:32, 23 April 2011 (UTC)[reply]
Count Iblis - absolutely true. That's why I started the last sentence with "In the realm of thermodynamics...". Statistical mechanics is so successful in predicting and explaining the experimental facts and laws of thermodynamics that it may be considered a valid theory. But the realm of its applicability goes beyond thermodynamics, where it is again successful.
Michael C. Price - The second law in classical theory is primary: it is an experimental fact. If the Gibbs-Shannon definition failed to predict the second law, it would be wrong, not the second law. The fact that the Gibbs-Shannon definition implies the second law is proof that the assumptions made in the Gibbs-Shannon definition are valid, not that the second law is valid. In this sense, it does not provide a "proof" of the second law, it provides an explanation. PAR (talk) 20:55, 23 April 2011 (UTC)[reply]
I said I would add the information provided by Bernhlav to the article. Do any of you have revisions to make to it? Thanks. InverseHypercube 03:56, 24 April 2011 (UTC)[reply]
PAR - the 2nd law, using Gibbs-Shannon, follows from unitarity. -- cheers, Michael C. Price talk 05:29, 24 April 2011 (UTC)[reply]
Thus proving the validity of the assumptions made in the Gibbs-Shannon definition of entropy. Not proving the second law. PAR (talk) 15:34, 24 April 2011 (UTC)[reply]
What assumptions would those be? I don't see them. -- cheers, Michael C. Price talk 16:17, 24 April 2011 (UTC)[reply]

I think the most important one is the assumption that each microstate is equally probable (equal a priori probability). This is not as simple as it sounds, because for a thermodynamic process, in which things are changing, you will have to deal with the time element, i.e. how collisions produce and maintain this equal a priori probability. PAR (talk) 17:07, 24 April 2011 (UTC)[reply]

You don't have to assume that each microstate is equally probable. You can derive that from dS = 0. And that is derived from unitarity. -- cheers, Michael C. Price talk 17:21, 24 April 2011 (UTC)[reply]
Could you give a short expansion on what you are saying or recommend a book or article that would expand on it? I'm having trouble following your argument. PAR (talk) 19:18, 24 April 2011 (UTC)[reply]
Equiprobability at equilibrium follows from varying Gibbs-Shannon S(P_i) and using conservation of probability to constrain the dP_i. dS vanishes when P_i = constant. (For the unitarity argument see Second_law_of_thermodynamics#General_derivation_from_unitarity_of_quantum_mechanics.)-- cheers, Michael C. Price talk 20:21, 24 April 2011 (UTC)[reply]
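A sketch of the variation being described (with λ a Lagrange multiplier for the probability constraint):

\[
\delta\Big[-\sum_i P_i \ln P_i - \lambda\Big(\sum_i P_i - 1\Big)\Big] = 0
\;\Longrightarrow\; -\ln P_i - 1 - \lambda = 0
\;\Longrightarrow\; P_i = e^{-(1+\lambda)} = \text{const.},
\]

so the Gibbs-Shannon entropy is stationary, and by concavity maximal, when all the P_i are equal.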
To PAR: You have made two incorrect statements: (1) "these probabilities are assumed to be equal for each microstate" This is not an assumption, it follows from Liouville's theorem (Hamiltonian). (2) "The second law ... is an experimental fact" No, it is only approximately true. I believe that experiments have been done which show very small systems changing from a higher entropy state to a lower entropy state. JRSpriggs (talk) 01:46, 25 April 2011 (UTC)[reply]
How do equiprobabilities follow from Liouville's theorem? -- cheers, Michael C. Price talk 02:19, 25 April 2011 (UTC)[reply]
At Liouville's theorem (Hamiltonian)#Symplectic geometry, it says "The theorem then states that the natural volume form on a symplectic manifold is invariant under the Hamiltonian flows.". In other words, the volume of a set of states in phase space does not change as time passes, merely its location and shape. So if we identify a micro-state as a certain volume in phase space (perhaps h^{3N}), then each micro-state will have the same probability in the long run (assuming perturbations cause mixing across orbits). JRSpriggs (talk) 02:39, 25 April 2011 (UTC)[reply]
  • To Michael C. Price - ok, I will read up on this. I have never read the Everett thesis, but it looks excellent.
  • To JRSpriggs - Yes, the further you are from the thermodynamic limit (infinite number of particles) the more error you encounter in the direct statement of the second law. I always take the second law to mean "in the thermodynamic limit...", so that's the experimental fact I am referring to. If you do not take this meaning for the second law, then our disagreement is only semantic. Regarding Liouvilles theorem, I think your last statement "assuming perturbations..." is what I was referring to when I said "you have to deal with the time element...". Speaking classically, simply because the probability density of states in phase space remains constant (Liouville) without collisions does not imply that the probability becomes evenly distributed with collisions. PAR (talk) 03:15, 25 April 2011 (UTC)[reply]
  • To Michael C. Price - Reading the Everett thesis ([1]) - please tell me if you agree with this - define the information "entropy" of a pure wave function with respect to some operator as S = -Σ_i P_i ln P_i, where the P_i are the usual squared moduli of the projections of the wave function on the eigenvectors of the operator. Then a measurement will collapse the wave function, yielding minimum (zero) entropy - one P_i will be unity, the others zero. Unitary propagation of the wave function forward in time (using e.g. the Schroedinger equation) will cause these P_i to change, increasing the "entropy". This part is not obvious to me, but possible - For a system with many degrees of freedom, the "entropy" will increase until it reaches a maximum - i.e. all P_i are roughly equal (the larger the system, the less roughly). PAR (talk) 16:00, 25 April 2011 (UTC)[reply]
  • Everett's derivation of the increase in entropy (see pages 29/30, and c.127 in his thesis), doesn't rely on starting from a collapsed state. His proof is interpretation independent, so works (I think) whether you collapse the wf or not. It doesn't rely on the equiprobability assumption, which is something I just threw in because the topic surfaced here. -- cheers, Michael C. Price talk 16:22, 25 April 2011 (UTC)[reply]

I don't think that the traditional foundations of statistical mechanics are still being taken very seriously, see e.g. here. The reason why statistical mechanics works is still being vigorously debated. What I find interesting is the Eigenstate Thermalization Hypothesis (ETH), which basically boils down to assuming that a randomly chosen eigenstate of an isolated system with a large number of degrees of freedom will look like a thermal state. See here for an outline and here for a paper by Mark Srednicki on this topic. Count Iblis (talk) 15:44, 25 April 2011 (UTC)[reply]

  1. ^ B. H. Lavenda, "A New Perspective on Thermodynamics", Springer, 2009, sec. 2.3.4.
  2. ^ S. Carnot, "Reflexions on the Motive Power of Fire", translated and annotated by R. Fox, Manchester University Press, 1986, p. 26; C. Truesdell, "The Tragicomical History of Thermodynamics", Springer, 1980, pp. 78-85.
  3. ^ J. Clerk-Maxwell, "Theory of Heat", 10th ed., Longmans, Green and Co., 1891, pp. 155-158.
  4. ^ R. Clausius, "The Mechanical Theory of Heat", translated by T. Archer Hirst, van Voorst, 1867, p. 28.
  5. ^ M. Planck, "Theory of Heat", translated by H. L. Brose, Macmillan, 1932, part 4, chapter 1.
  6. ^ L. Boltzmann, "Lectures on Gas Theory", translated by S. G. Brush, Univ. California Press, 1964, part 1, sec. 6; P. Ehrenfest and T. Ehrenfest, "The Conceptual Foundations of the Statistical Approach in Mechanics", translated by M. J. Moravcsik, Cornell Univ. Press, 1959, sec. 12.
  7. ^ W. A. Whitworth, "Choice and Chance", 3rd ed., Deighton Bell, 1886.
  8. ^ M. Planck, "The Theory of Heat Radiation", translated by M. Masius, American Institute of Physics, 1988, p. 145.
  9. ^ R. C. Tolman, "The Principles of Statistical Mechanics", Oxford Univ. Press, 1938, sec. 23.
  10. ^ B. H. Lavenda, "Statistical Physics: A Probabilistic Approach", Wiley-Interscience, 1991, chapter 1.