Talk:Entropy

Former good article: Entropy was one of the Natural sciences good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.

Article milestones
Date                Process                      Result
June 22, 2006       Good article nominee         Listed
February 20, 2008   Good article reassessment    Delisted

Current status: Delisted good article


In general

The terms "in general" and "generally" are often ambiguous. For example, the section on The second law of thermodynamics begins:

"The second law of thermodynamics states that in general the total entropy of any system will not decrease other than by increasing the entropy of some other system." [italics added]

The last part of this section notes (without any cross-reference to the sentence quoted above) an apparent exception:

"Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system."

However, it is not made clear whether this is the only exception.

Also, any exceptions noted here should be made harmonious with the Second law of thermodynamics article.

The terms "in general" and "generally" appear elsewhere in the article, but this is arguably the one most in need of clarification.

Dorkenergy (talk) 07:26, 4 July 2011 (UTC)

I agree that the use of the terms in general and generally is better understood by recognizing that they are clichés than by looking for any precise meaning they might convey. On Wikipedia, good writers provide good content, and better writers improve the prose by removing clichés and other speed humps. There are many science-oriented topics in which a universal truth must be carefully qualified. For example, the principle of conservation of mechanical energy is a universal principle, but it must be qualified by saying "providing the only forces at work are conservative forces." It would be unacceptable for Wikipedia to take a short cut and say "In general, mechanical energy is conserved." Wikipedia articles must say what they mean. There are no short cuts. Dolphin (t) 07:55, 4 July 2011 (UTC)
How unlikely does an event have to be before one is permitted to say that it can never happen? This is the kind of issue one will get into if one tries to be more specific than just saying that entropy generally does not decrease. JRSpriggs (talk) 08:23, 4 July 2011 (UTC)
Greater specificity is certainly required where there is substantial controversy over the implications and applications of the concepts. In this instance, the religious vs. scientific debate on the implications of the Second Law requires that greater specificity. Also, readers might reasonably question the relationship to virtual particles or other quantum mechanical considerations. Dorkenergy (talk) 14:06, 4 July 2011 (UTC)
I think you make some interesting points. Do you have a proposal for how the sentence using these terms can be reworded or what should be added to help clarify for the reader? Currently I think I understand what is meant, but I am certainly open to even better explanations. § Music Sorter § (talk) 23:17, 4 July 2011 (UTC)
If, in a system isolated from its environment, the entropy is S1 at time t1, and S2 < S1 with t1 < t2, then the probability of finding the system to have entropy S2 at time t2 is less than or equal to exp((S2 − S1)/k), where k is Boltzmann's constant (by my OR). Given how small Boltzmann's constant is, this probability might as well be zero for any noticeable reduction in entropy. JRSpriggs (talk) 08:33, 5 July 2011 (UTC)
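
To get a feel for how prohibitive that bound is, here is a minimal numerical sketch (my own illustration; the entropy decrease of 1e-21 J/K is an assumed, deliberately tiny example value, not a figure from this discussion):

    import math

    k = 1.380649e-23      # Boltzmann's constant, J/K
    delta_S = -1.0e-21    # assumed entropy decrease of 1e-21 J/K (a deliberately tiny example)

    # Upper bound on the probability of observing this decrease: exp(delta_S / k)
    p_max = math.exp(delta_S / k)
    print(p_max)          # roughly 3.5e-32, i.e. effectively never observed

Even for this microscopically small dip the bound is about 3.5e-32; any macroscopic decrease pushes it far below anything that could ever be observed, which is the point being made above.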

It is impossible for us to cause a decrease in total entropy. However, very small decreases do occur spontaneously, though very rarely and very briefly. If the system is not already at equilibrium, this tendency would be overwhelmed by the increase in entropy due to dissipation of free energy. Any attempt to use the spontaneous decreases to do useful work is doomed to failure, because more work would be expended to identify and capture them than could thereby be generated. JRSpriggs (talk) 09:07, 14 July 2011 (UTC)

I now see that we have an article on this subject. It is fluctuation theorem. JRSpriggs (talk) 20:59, 25 August 2011 (UTC)
Also interesting. You can consider large fluctuations to a lower-entropy state and then ask what the most likely time evolution leading to such a fluctuation looks like. The answer is rather simple: it is the time reversal of the way the system would relax back to equilibrium, starting from the low-entropy state. For large objects, say a house, that approach equilibrium by slowly falling apart bit by bit, you are led to conclude that the fluctuation that leads to its reappearance consists of a large number of fluctuations that conspire to build it up from dust. Count Iblis (talk) 01:32, 26 August 2011 (UTC)

Kinetic energy of a piston is not thermal energy. Thermal energy is not conserved.

"Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to be entropically favored, or to proceed in a particular direction. It determines that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature,"

The reality is that energy does not always remain thermal. In fact, a car's internal combustion engine converts some amount of thermal energy into vehicle motion. In no sense can a temperature be defined for a vehicle as a function of vehicle speed, tire speed, crankshaft speed, or the speed of any other rotating and/or reciprocating car part.

It doesn't take long to realize that energy that leaves the system, or any energy that remains in the system in some form other than thermal energy, will contribute to TdS. All TdS is somehow accounted for in this way. I italicized the latter to highlight the fact that not all TdS constitutes unusable "waste heat". The quantity TdS also encapsulates thermal energy lost to the expansion itself, which is a mechanism for doing expansion work that requires no heat transfer whatsoever from one reservoir to another for thermal energy to be lost; look it up at adiabatic process.

Regeneratively recovering any work done (kinetic energy of pistons, vehicles, etc.), such as by running electric motors in reverse (e.g. regenerative braking), can reduce the rate at which entropy increases. This recovered non-thermal energy can indeed flow spontaneously from a lower to a higher temperature, because the direction in which non-thermal energy flows is not principally determined by temperature, but rather by inertia and by forces external to the body, which are largely unrelated to temperature, such as the force that pulls a cannonball toward the Earth. Entropy generally rises because recovering kinetic energy that was previously thermal energy derived from waste heat supplies only a tiny fraction of our energy resources.

However, the idea that energy in general cannot flow from cold to hot spontaneously is clearly flawed. A windmill milling grain is a clear example of this, wherein only non-thermal energy plays a role as far as the work done is concerned. Ask yourself, "What temperature of wind and what temperature of grain is relevant to the mere fact that a windmill operates?" That limitation exists only for thermal energy. And there are some exceptions even to the rule concerning thermal energy. (Example: shine a warm light onto a much brighter light, more so than the other way around, by simply inserting a wavelength-specific filter in between. Though such an effect is of limited utility at this time, it does constitute an example where thermal energy can flow spontaneously from cold to hot. Again, it's unlikely, but it's still spontaneous.) siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk
23:19, 22 August 2011 (UTC)
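
For the adiabatic-process point above, here is a minimal numerical sketch (assuming an ideal diatomic gas and arbitrary example values for temperature and volume) of how thermal energy is lost to expansion work without any heat flowing between reservoirs:

    # Reversible adiabatic expansion of an ideal gas: T * V**(gamma - 1) is constant,
    # so the gas cools and does work with no heat transferred to or from any reservoir.
    gamma = 1.4            # heat-capacity ratio, assuming a diatomic ideal gas
    T1 = 300.0             # assumed initial temperature, K
    V1, V2 = 1.0, 2.0      # assumed volumes before/after expansion (arbitrary units)

    T2 = T1 * (V1 / V2) ** (gamma - 1)
    print(T2)              # about 227 K: internal (thermal) energy dropped although Q = 0

The temperature, and hence the thermal energy, drops purely because the gas does work on its surroundings; no reservoir-to-reservoir heat transfer is involved.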

Entropy as an apparent effect of conservation of information.

If entropy is considered an equilibrium property, as in energy physics, then it conflicts with the conservation of information. But the second law of thermodynamics may simply be an apparent effect of the conservation of information; that is, entropy is really the amount of information it takes to describe a system, and each reaction creates new information, but information cannot be destroyed. That means the second law of thermodynamics is not an independent law of physics at all, but just an apparent effect of the fact that information can be created but not destroyed. The arrow of time is thus not about destruction, but about the continuous creation of information. This explains how the same laws of physics can cause self-organization. Organized systems are not in any way less chaotic than non-organized systems, and the waste heat life produces can, in an information-physical sense, be considered beings eliminated by evolution rather than a step towards equilibrium. It is possible that an overload of information will cause the arrow of time to extract vacuum energy into usefulness rather than lead to heat death. 217.28.207.226 (talk) 10:49, 23 August 2011 (UTC)Martin J Sallberg

You could reinterpret Loschmidt's paradox to say that the total information entropy of, let's say, a gas must remain constant due to time-reversal symmetry. The paradox is that the thermodynamic entropy increases. The paradox is resolved by realizing that the total information entropy is equal to the marginal entropy (proportional to the thermodynamic entropy), in which correlations in velocity and position between particles are ignored, plus the mutual entropy (the entropy due to correlations), the sum of which is constant: as the thermodynamic entropy increases, the mutual entropy decreases. So there is no creation of information entropy, and total information entropy is conserved. See Mutual information#Applications of mutual information. PAR (talk) 15:07, 23 August 2011 (UTC)
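
A toy numerical sketch of the decomposition PAR invokes (my own illustration, with an assumed joint distribution for two correlated binary variables standing in for the particle coordinates):

    import math

    # Assumed toy joint distribution p(x, y) for two correlated binary variables.
    p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    def H(dist):
        # Shannon entropy (in bits) of a dict of probabilities.
        return -sum(q * math.log2(q) for q in dist.values() if q > 0)

    # Marginal distributions obtained by summing out the other variable.
    px, py = {}, {}
    for (x, y), q in p.items():
        px[x] = px.get(x, 0.0) + q
        py[y] = py.get(y, 0.0) + q

    joint = H(p)                    # total ("joint") entropy H(X,Y)
    marginal_sum = H(px) + H(py)    # sum of marginal entropies H(X) + H(Y)
    mutual = marginal_sum - joint   # mutual information I(X;Y) >= 0

    print(joint, marginal_sum, mutual)
    # H(X,Y) = H(X) + H(Y) - I(X;Y): the marginal sum exceeds the joint
    # entropy by exactly the information stored in the correlations.

This is only the two-variable version of the many-particle bookkeeping described above, but it shows the identity at work: the sum of marginal entropies exceeds the conserved joint entropy by exactly the mutual information, which is where the "missing" information sits.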

I actually meant that each state and relative position that the particles, or waves, or strings, or whatever, have been in is information, so the total amount of information increases over time, since information can never be destroyed. You confused it with "information entropy"; what I mean is that entropy is not an independent property at all, but just an apparent effect of the fact that information is created but never destroyed. 95.209.181.217 (talk) 18:28, 23 August 2011 (UTC)Martin J Sallberg

psychological entropy is bullshit