Talk:Entropy as an arrow of time
This is the talk page for discussing improvements to the Entropy as an arrow of time article. This is not a forum for general discussion of the article's subject.
This level-5 vital article is rated C-class on Wikipedia's content assessment scale.
Citations
I removed some comments from the article about "footnote needed" and similar things. When you think a section needs to be backed by some paper, add the [citation needed] tag. This tells people that you think a citation is needed. War (talk) 06:12, 22 May 2008 (UTC)
Overview section
This opening statement is outdated: "Unlike most other laws of physics, the Second Law of Thermodynamics is statistical in nature, and therefore its reliability arises from the huge number of particles present in macroscopic systems." Research from 2012 to the present indicates that most other laws of physics, including gravity, are statistically emergent properties of the Second Law of Thermodynamics itself, as so-called "entropic forces." See, e.g., Erik Verlinde's holographic theory of gravity. This article needs a fundamental reorganization to specifically highlight entropic forces, thus connecting this article to other areas of physics instead of isolating it from them.
Some thoughts
This article asks why the universe began with low entropy.
But heat death of the universe says this: "Recent work in inflationary cosmology has inverted the heat death notion, suggesting that the early universe was in a thermal equilibrium and thus heat death–like state before cosmic expansion. Meanwhile, in an expanding universe, some believe the maximum possible entropy increases far more quickly than the actual entropy with each time increment, pushing the universe continually further away from an equilibrium state despite increasing entropy. Furthermore, the very notion of thermodynamic modelling of the universe has been questioned, since the effects of such factors as gravity and quantum phenomena are very difficult to reconcile with simple thermodynamic models, rendering the utility of such models as predictive systems highly doubtful according to some."
In what sense did the universe begin at minimal entropy, if entropy is defined as thermal equilibrium?
Oneismany 08:55, 3 May 2006 (UTC)
Entropy's pretty fucked up, eh?
--Davidknippers 06:59, 12 May 2006 (UTC)
To answer the above question, the early universe is not considered to have been at minimal entropy, because its spectral distribution was that of a black-body (and consequently at thermal equilibrium.)
Tkircher 05:40, 30 July 2006 (UTC)
The true answer is that this statement is simply false. I will correct it soon. The early universe was NOT in a thermal equilibrium, which is a state of high entropy. Details will appear soon in heat death of the universe. Dan Gluck 20:12, 8 September 2006 (UTC) I have added the details there and you're invited to read. Dan Gluck 12:37, 11 September 2006 (UTC)
thegreatturtleman 00:23, 06 January 2011 (MST)
I may be mistaken, but it seems that the concept of the time arrow takes one line of time as we perceive it, with many segues off of it based on events. It may be more useful to consider dimensional theory. 1D would be only a line segment that is infinitely thin. 2D would give you the y axis as well, but it is still infinitely thin. 3D is the world that we live in now. Everything in existence can be considered in this one 3D block. We in this "frame" are traveling through the 4th dimension. This creates a line of the 3rd dimension that we are in. Entropy as in thermodynamics provides an explanation that energy is lost in many different forms, but in traveling through the 4th dimension there would have to be some kind of resistance, such as friction or relative gravity, and I do not believe that affects our ability to travel through this 4th dimension. Like I said earlier, the idea that time lines segue brings us into the 5th dimension and then likely to the 6th, changing small and large scales and then the effects through the timeline or 4th dimension. If these were only being created at the time, this could be the inefficiency that causes entropy, but once again, I believe otherwise. This thought is similar to the geocentric universe theory. We are not the center of the timeline. It isn't that these segues come from our line; it's that they already exist. Something small like a molecule moving left or right could have a line segment (traveling through the 4th dimension) adjacent to another line segment (also traveling through the 4th dimension) where everything could be exactly the same up until that one point, and then the effects after such an event. Just as we believe that all 3 dimensions we can perceive can go infinitely in all directions, who is to say it isn't true if you look at dimensions 4, 5, and 6 the same way? Energy can neither be created nor destroyed, only redirected.
That entropy is the same now as ever and will stay the same; perhaps its effects are just a little less understood.
Entropy increases in both directions
Chris Wallace (computer scientist) demonstrated ([1]) that if the laws of physics are reversible, then given an arbitrary state, entropy will increase in both directions of time. There is a nifty interactive simulation (simplified) of this on Lloyd Allison's home page([2]), but that seems to be down at the moment. --njh 03:21, 2 December 2006 (UTC)
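Wallace's point can be illustrated numerically. The sketch below is not his simulation; it is a minimal toy model of my own (non-interacting particles in a box with reflecting walls, a coarse-grained entropy over grid cells, all parameters chosen only for illustration). Starting from an arbitrary low-entropy state with no built-in time asymmetry, the coarse-grained entropy rises whether we run the reversible dynamics forward or backward:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, NBINS, T = 2000, 1.0, 8, 5.0  # particles, box size, grid cells per axis, run time

def coarse_entropy(pos):
    # Coarse-grained Boltzmann entropy: -sum p ln p over occupancy of NBINS^2 cells.
    counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                  bins=NBINS, range=[[0, L], [0, L]])
    p = counts.ravel() / len(pos)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def evolve(pos, vel, t):
    # Free flight between perfectly reflecting walls at 0 and L: deterministic,
    # time-reversible, energy-conserving.  Reflections are folded in closed form.
    x = pos + vel * t
    return L - np.abs(np.mod(x, 2 * L) - L)

# An "arbitrary" low-entropy state: particles clustered in one corner, with
# Gaussian velocities.  Nothing about it singles out a direction of time.
pos0 = rng.uniform(0.0, 0.25, size=(N, 2))
vel = rng.normal(size=(N, 2))

s0 = coarse_entropy(pos0)
s_forward = coarse_entropy(evolve(pos0, vel, T))    # toward the "future"
s_backward = coarse_entropy(evolve(pos0, -vel, T))  # toward the "past"
print(s0, s_forward, s_backward)  # entropy rises in both time directions
```

Because the state at t = 0 is atypical in both time directions, it sits at a local entropy minimum of its own trajectory, which is exactly the point of the argument.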
NB arbitrary state -- not Big bang. 1Z 16:11, 3 April 2007 (UTC)
Oh my
An equivalent way of saying this is that the information possessed by the demon on which atoms are considered "fast" or "slow", can be considered a form of entropy known as information entropy.- wtfhax
Three scenarios
The three scenarios at the end of the article miss the fact that the universe most likely is not expanding. Einstein's theory with conservation of energy, as opposed to the Big Bang, predicts the Hubble constant as it is observed (accelerating with acceleration 2.5E-36 1/s2). From this it predicts the density of the universe (6.5E-27 kg/m3), the acceleration of space probes (7E-10 m/s2), and the size of pieces of non-luminous matter (1 m). It predicts also distances to some quasars. Usually when a theory predicts so many things right and none wrong it is worth looking into, especially when it is created by Einstein. Jim 19:43, 18 June 2006 (UTC)
What you have written is completely false. The theory of the Big Bang is today fully accepted by virtually all cosmologists. Dan Gluck 14:24, 8 September 2006 (UTC)
Change of name of article
FYI, I am contemplating moving this article to arrow of time (physics) and possibly splitting it into several articles. I admit I am prejudiced: I think that cosmological explanations for the arrow of time are teleological hogwash, right up there with the work of the Bogdanov brothers. Also, entropy does have something to do with it, but this article fails to sketch this correctly. linas 15:47, 2 August 2006 (UTC)
- Starting arrow of time (physics) sounds like it might be a reasonable idea as long as you reference it as it is discussed in physics books; however, moving this article doesn’t make any sense. I was the one who moved it here in the first place. The entropy page was getting too long; see: Talk:Entropy/Archive2#Article is now 46 kilobytes long (the limit is 32) for a discussion of this. Moreover, as far as I am aware, the term “arrow of time” was first used by Arthur Eddington in his 1928 discussions about “entropy”; see for example pgs. 69-88 and 295 in Eddington’s book The Nature of the Physical World. Hence the name of the article entropy (arrow of time).
- As this book was written in a loose tone (without equations) for philosophers and for the general public, the idea spread rapidly from there. Harold Blum’s 1950 Time’s Arrow and Evolution cites Eddington’s 1928 book as the source of the term, based on the argument that the second law of thermodynamics, i.e. the law that energetically quantifies the phenomenon of irreversibility, points in the direction of all real events in time.
- Certainly, cleaning, removing teleological references, reshuffling things, etc., or even starting a new article, such as arrow of time (physics) or arrow of time (cosmology), or even a new category, i.e. category:arrow of time, may be in order, but moving the article is a bad idea – entropy is the arrow of time as it was originally defined by Eddington. Thanks:--Sadi Carnot 18:12, 2 August 2006 (UTC)
- OK, well, there has been a lot of progress in mathematics and physics since 1928, or 1950, or even since Stephen Hawking invoked cosmology as an explanation (in the mid-1980's ???). None of the modern theories of the arrow of time deny the importance of entropy; however, they don't use entropy as the starting point; rather it is something that is a computable outcome.
- Now, I suppose we could turn this article into a "what Eddington thought about it all" article, but ... I wouldn't be the one to do that. Instead, what I saw was that the article on Arrow of time seemed to provide the layman's overview of the general idea, while this article seemed to somehow be more in-depth, more physics-based.
- I tried to capture what I believe is the state-of-the-art in understanding the arrow of time in the section titled "dynamical systems" that I added this morning. As best as I understand it, the idea of the arrow of time is explained by the structure of certain topological vector spaces, and the fact that operators on these spaces are not time-symmetric even though the underlying microscopic physical systems are time-reversible. I am not sure how wide-spread or how accepted this idea is, but there are some books out on it, and relatively famous people like David Ruelle and many others (e.g. Caroline Series, David Mumford) have been writing about aspects of it. I believe this formulation is fully consistent with formulations of entropy -- e.g. the Kolmogorov entropy can be precisely computed for some of these models. No one is denying that entropy is important, but it's not the cause, it's the effect.
- So the net result is we have an article that talks about the arrow of time in several different ways -- from the old statistical mechanics point of view, from the cosmology point of view, from the weak-interaction/CP-violation point of view, and from the dynamical systems point of view. If you don't want to rename the article, then I'd suggest moving most of the content out of this article, and leaving this one to describe Eddington's ideas, and nothing more. Personally, I'd rather rename it, and re-organize it so that the discussion of entropy becomes historically interesting, but no longer central to the modern understanding (as I understand it). linas 23:41, 2 August 2006 (UTC)
- Linas, I would suggest starting up a second page and then move some of the physics content there. Remember, more than half of this page was written by people who contributed their writing, with the understanding that it was going to go on the entropy page. Regarding grammar, if you use abbreviations, such as PSL(2,R), please state what these are in word form (assume the reader may have printed this article out and is reading it somewhere); also, I wouldn’t use the phrase “as of 2006”. Something about it doesn’t sound right. We shouldn’t be filling articles with the latest research and theories from last month. If we did, it would be more like "science weekly" rather than an encyclopedia article. Just some food for thought. --Sadi Carnot 14:34, 5 August 2006 (UTC)
- Linas, entropy is still the concept used regarding this matter in almost every field in physics, so it is not only of historical value. Additionally, I agree that to say the second law is the CAUSE of the arrow of time is wrong, because it is not the cause - IT IS the arrow of time, in the sense that this is the simplest (and at least the most widely used) way to encompass most known phenomena related to the arrow of time. In any case, it is not only of historic interest. I think what you mean is that current research in the field involves trying to identify a more accurate description of, and mathematical insight into, the arrow of time.
Regarding the article, this just means that we should separate the last part of the article under "current research", as opposed to the general discussion in entropy. I have done this, and am still doing some cleanup; as soon as I finish I'll put down the cleanup tag. Hope you're OK with this. Dan Gluck 20:00, 8 September 2006 (UTC) Regarding the name, I think the best option is either not to change it, or to change it to "thermodynamic arrow of time". Arrow of time (physics) is not good because all arrows of time are physical. Dan Gluck 20:14, 8 September 2006 (UTC)
- Linas, can you begin building your page or pages independently? With a consensus of acceptance, topics which become redundant here can be removed. Regards, ... PeterEasthope (talk) 05:53, 29 December 2011 (UTC)
Essay on irreversibility
After a hopelessly long discussion with User:Jpod2 on my talk page about entropy and irreversibility and what not, I wrote a small essay that mostly captures what I believe to be the "state of the art" about the arrow of time. I post it below; it is far too raw (and controversial?) to be a real article. linas 04:10, 5 October 2006 (UTC)
How to prove irreversibility
How can one show that a dynamical system with deterministic, time-reversible (i.e. energy-conservative) equations of motion can somehow become irreversible? I think this question now has a clear answer in principle, even if actual examples are hard to prove in practice. Here's how it's done:
First one must prove that a solution to the equations of motion is ergodic over some portion of phase space. Ergodicity implies that the trajectory of a classical "point particle" will be dense on some subset of phase space of non-zero measure. Here, the motion of a "classical point particle" should be understood to mean a single solution to the differential equations subject to some initial conditions. A trajectory that is dense can be visualized as a space-filling curve. When a trajectory is dense, one is justified in considering the closure of this set. Consider, for example, a discrete dynamical system that can only take values that are rational numbers. The rationals are dense in the reals, and the closure of the rationals may be taken to be the reals (well, also the p-adics...). The natural topology of the reals is fundamentally different than that on the rationals. When one considers physical observables on the closure, (such as correlation functions, densities, spectra and the like), these observables will usually have a radically different structure on the closure, than they do on the dense subset.
The canonical example of something that has a radically different form on the closure of a dense set, as compared to the dense set itself, is the transfer operator. When the trajectories of a time-reversible dynamical system are considered, the transfer operator will be unitary, encapsulating the reversibility. The eigenvalues of the transfer operator will lie on the unit circle on the complex plane. But when one considers the transfer operator on the topological closure, one discovers that its spectrum is quite different, typically having eigenvalues that are not on the unit circle. Eigenvalues whose magnitude is less than one correspond to states that decay over time. Such eigenvalues are typically paired with others whose magnitude is greater than one: these are the states that "blow up" in the future, and are thus non-physical. In this way, one is forced into a situation which is manifestly irreversible: one can only keep the decaying states (and the steady states). (The eigenvalue one corresponds to the steady state(s): this is the largest allowed eigenvalue, and is also why the transfer operator is sometimes called the "Frobenius-Perron operator", in reference to the Frobenius-Perron theorem.)
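A concrete, standard instance of this dependence of the spectrum on the function space is the doubling map x → 2x mod 1. (The doubling map is not itself invertible, so it is only a minimal illustration of the spectral phenomenon, not of the full reversible story above.) Restricted to polynomials, its transfer operator is upper triangular in the monomial basis with eigenvalues 2^-k: exactly the kind of decaying spectrum the paragraph describes. The sketch below just builds that matrix numerically:

```python
import numpy as np
from math import comb

# Transfer (Frobenius-Perron) operator of the doubling map x -> 2x mod 1:
#     (Lf)(x) = ( f(x/2) + f((x+1)/2) ) / 2
# On a monomial x^j:
#     L(x^j) = 2^(-j-1) * ( x^j + sum_i C(j,i) x^i )
# so in the basis {1, x, ..., x^n} the operator is upper triangular.

n = 8
M = np.zeros((n + 1, n + 1))
for j in range(n + 1):
    for i in range(j + 1):
        # coefficient of x^i in L(x^j)
        M[i, j] = (comb(j, i) + (i == j)) / 2 ** (j + 1)

eigs = np.sort(np.linalg.eigvals(M).real)[::-1]
print(eigs)  # 1, 1/2, 1/4, ...: every mode decays except the invariant density
```

The eigenvalue 1 belongs to the uniform (invariant) density; all other modes decay, even though the underlying map preserves Lebesgue measure. This is the loss of reversibility on the larger space in miniature.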
So under what conditions is it acceptable to consider the closure of a dense ergodic set? It is acceptable when one can show that, by varying the initial conditions of the PDE's (but keeping the energy constant), the resulting trajectories result in the closure. That is, when the union of all possible trajectories forms the closure.
So has this sequence of steps ever been done for anything resembling a realistic physical system? I believe it has: it has been done for the Kac-Zwanzig model. This is a simple model of a single one-dimensional classical particle moving in a fixed, external potential, weakly coupled to N harmonic oscillators. It is completely deterministic and energy-conservative, even in the case where the frequencies of the harmonic oscillators are more-or-less uniformly and randomly distributed. In such a case, the set of oscillators can be envisioned as a model of a heat bath weakly coupled to the moving particle. For most fixed, external potentials, the system is chaotic but not ergodic (right?) for finite N. But something interesting happens in the limit of large N. It has been recently shown by Gil Ariel in a PhD thesis that this system has a strong limit theorem, a strong form of a central limit theorem: that, when averaging over all possible initial conditions, the system behaves as a stochastic process. Now, stochastic processes and measure theory are really the same thing, just having two different vocabularies. A strong limit theorem is a theorem that states that the closure of a set of trajectories is equal to the union of the set of trajectories taken over all initial conditions -- this is exactly what we want. In essence, Ariel has proven the time-irreversibility of a more-or-less realistic system that is purely deterministic and reversible for finite N. A similar form of irreversibility has been proven before for much simpler, less "physically realistic" systems: the baker's map, and I believe all Axiom A systems.
(Since WP has no article on strong limits: an example of a strong limit theorem is strong mixing: the core idea is that one can freely interchange the order in which two different limits can be taken, with the difference being at most a set of measure zero).
Thus the general study of irreversibility boils down to finding ergodic systems, and proving strong limit theorems on them. Off-topic, but I am very intrigued by the fact that the "classical" exactly-solvable ergodic systems, e.g. Anosov flows and, in general, motion in homogeneous spaces, have a rich self-similar structure, and, when graphed, can strongly resemble Brownian motion. I'm also intrigued that there are integrable systems with highly non-trivial constants of motion: e.g. the Lax pairs of the Toda lattice. Hmmm.
Some people say that "in real life", it's quantum mechanics that describes nature. So, a brief note: what happens when one quantizes a deterministic but ergodic dynamical system? Well, one finds that the spacing between energy levels goes as 1/N, and one finds that the wave functions are fractal, and space-filling. Here, "fractal and space-filling" means that, even for very small volumes of space, one can find a large number of wave functions all of whose probabilities are significant (and not exponentially small). They are also "space filling" in the sense that, by properly choosing phases between them, they can be made to destructively interfere over large regions. This has been shown for deterministic diffusion, and for dynamical billiards. The net result of having fractal wave functions with tightly-spaced energy levels is that it becomes (literally) impossible to prepare a pure state. That is, the only states that can be prepared are states which, over time, will come to occupy the entire space: that will macroscopically have the appearance of diffusing and spreading, despite the fact that the time evolution is completely and utterly unitary. The unitary evolution simply makes the wave functions (which occupied all of space) no longer interfere destructively, thus giving the appearance of spreading (when in fact "they were there all along"). The closely-spaced energy levels mean that the ratio of energies between two energy levels seems to be irrational, and thus one has what looks like decoherence.
That is how I currently understand irreversibility in physics and mathematics. linas 04:10, 5 October 2006 (UTC)
Second-person
This article is a good read, but it's too chatty and adopts a second-person perspective. This should be avoided, per the style guidelines. Chris Cunningham 16:17, 15 February 2007 (UTC)
- I've changed the cleanup tag to a more specific one and tried to make some things "better" (whether I succeeded or not is another story). A fairly easy way to conform to guidelines would be to simply use the third person ("one") more often. Esn 05:50, 5 March 2007 (UTC)
A Mistake?
The Second Law of Thermodynamics allows for the entropy to remain the same. If the entropy is constant in either direction of time, there would be no preferred direction. However, the entropy can only be a constant if the system is in the highest possible state of disorder, such as a gas that always was, and always will be, uniformly spread out in its container. The existence of a thermodynamic arrow of time implies that the system is highly ordered in one time direction, which would by definition be the "past". Thus this law is about the boundary conditions rather than the equations of motion of our world.
Should not it be the 'highest possible state of order', and 'that the system is highly disordered in one time direction'? Forgive me if I am wrong. --210.84.41.237 05:46, 21 April 2007 (UTC)
Oh, and I found another example:
A more orderly state has become less orderly since the energy has become less concentrated and more diffused.
--210.84.41.237 05:51, 21 April 2007 (UTC)
- Sorry but you are wrong, the article is fine. Ordered things become less ordered with time. Dan Gluck (talk) 07:33, 3 March 2009 (UTC)
- Even if this cloud was highly ordered in the past, it isn't in the future, so why should it be in the past? Using statistical math is never about order; it's about approximation. We can't pinpoint a gas molecule in time; position and speed can't be measured at the same time. So time isn't about that. Might it be that time could rather be explained in states? For example, two photons either hit each other or do not hit each other. A single photon could not be affected by time; it has no state change. State changes can happen with multiple particles, in any direction. All these quantum-scale events are almost random states. Although some information (our world) is conserved within this system and so builds up next states based upon previous states. Since we are at a much larger scale (a result of too many minor states), to us the direction of time isn't reversible like it is in quantum mechanics. Well, I guess the whole talk is hypothetical, as in our information world we can't observe something going backward in time (well, ahem, if we exclude the air flight to Elbonia). Peter-art —Preceding comment was added at 21:28, 31 October 2007 (UTC)
- I didn't understand anything. Dan Gluck (talk) 07:33, 3 March 2009 (UTC)
state changes instead of time ?
Reading some examples about gas clouds, I wonder if that's all right to use as an example. How can a gas cloud be highly ordered in the past? It isn't in the future, so why should it have been in the past? Using statistical math doesn't make past things fixed, or framed; statistics is about approximations. We can't pinpoint a gas molecule in time, as position and speed can't be measured at the same time. So I wondered whether time isn't about that at all.
Might it be that time could rather be explained in states? For example, two photons either hit each other or don't hit each other. That's called a state change. A universe with a single photon could not be affected by time; it has no state-change possibilities and nothing to take distance from.
In our world, state changes happen with multiple particles, in any time direction. Quantum-scale events are "almost" random states. Some information (our world) is conserved within this system and so builds up next states based upon previous states. In our large-scale world (a result of many minor QM states), the direction of time isn't reversible to us like it is in quantum mechanics. Well, I guess the whole talk is hypothetical, as in our information world we can't observe something going backward in time (that is, if we exclude the air trip to Elbonia).
If I remember correctly, something like this has been discussed at Edge.org by Julian Barbour, as he writes in his article The End of Time. Peter-art —Preceding comment was added at 21:52, 31 October 2007 (UTC)
Reference commentary
I suggest that the commentary currently being inserted into the references be copied here (where it belongs) for a proper debate to take place. I will respond further in a week (busy at moment). --Michael C. Price talk 17:45, 28 January 2008 (UTC)
removing claims of proofs of the second law until they are no longer controversial
I suggest removing mention of putative proofs of the 2nd law. For over 100 years people have been unsuccessfully trying to prove this from microscopic physics. I just checked the latest of such proofs: Mechanical Proof of the Second Law of Thermodynamics based on Volume Entropy by Michele Campisi, submitted to Stud. Hist. Phi. MP. I think I found 6 errors:
1. The energy eigenstates could have accidental degeneracies, not due to any symmetry. The author assumes no degeneracy because there are no symmetries.
2. In equilibrium, only the expectation value of the density matrix is expected to be diagonal, not the density matrix itself. Too strong of an assumption. In fact, at t_f, the density matrix is expected to be wildly oscillating.
3. The no-level-crossing assumption is unjustified.
4. Eqn 19 does not follow from eqn 18. There is a term ln(K+3/2)*Sum_{n=0,K}(p_n-p’_n) which must be subtracted from the RHS of eqn 19. This subtraction invalidates the final inequality (S_f>=S_i).
5. The fact that the entropy seems to be monotonically increasing for any t_f>t_i, regardless of whether equilibrium has been achieved, is not what one would expect for small systems (there is no specialization to macroscopic systems) and means there is something wrong with the proof (1-4 above), unlike what the author claims, namely that the entropy only increases when the density matrix happens to be diagonal.
6. Any proof of entropy increase in a closed system should work for almost any initial probability distribution of microstates, not just the one assumed by the author. The system should evolve after some time towards equal probabilities for the microstates, as that distribution maximizes the entropy. It is almost certain, for any physical system, that the density matrix itself will not be diagonal with equal entries no matter how much time elapses.
However, either the time average or the quantum expectation value of the purity of the density matrix (Tr(rho^2)) should evolve towards 1/N, where N is the number of microstates. Even one example of such an evolution would be great, but as far as I know, none exists (without unjustified "approximations", aka cheating). Next, I will debunk (or corroborate the validity of) Gemmer, Jochen; Otte, Alexander & Mahler, Günter (2001), "Quantum Approach to a Derivation of the Second Law of Thermodynamics", Phys. Rev. Lett. 86 (10). Give me a few days. Or it might be weeks.
OK, I just looked at Gemmer et al. and a few things are not clear to me:
1. Where is there a derivation of eqn 18 (they say it is too involved a derivation to present in PRL, but no reference to a derivation is given)? What is it that one is finding an average of: is it the total purity, or just the partial trace over environmental states? Is a uniform distribution for the elements of the density matrix assumed, or some other distribution, and what is its physical justification?
2. Same question for eqn 22: what is being computed, the gas purity, or the gas-plus-container purity? It looks like the gas plus container, but then the purity should be conserved, which they imply is not the case for their apparently oscillating eqn 22.
3. Eqn 23 does not follow from 22. There is a factor of Sum_{B,D}{P^c_B*P^c_D} missing from the first term and a factor of Sum_{A,C}{P^g_A*P^g_C} missing from the second term; maybe they are identically 1?
4. If eqn 23 is wrong then the whole "proof" is wrong for the specific case of no degeneracy--the time average is not equal to the ensemble average. But then they make the unproven claim that the result is more general than eqn 23 for the case of no degeneracy. Why should we believe this? Are we doing science or religion?
My personal opinion about this is that the second law is not provable from microscopic dynamics (or initial conditions) without cheating (usually unintentional) and unjustified circular assumptions (ones which get the 2nd law as the result). I suspect there is something missing from our microscopic equations of motion, either quantum or classical, that once inserted would give irreversibility and the second law. —Preceding unsigned comment added by Clejan (talk • contribs) 23:53, 30 January 2008 (UTC)
- Your last comment suggests you are trying to prove/disprove the 2nd Law from the equations of motion. This is misguided. As the article says, the origins of the 2nd Law are to be found in the boundary conditions, not the equations of motion (which are reversible or CPT invariant).--Michael C. Price talk 06:36, 2 February 2008 (UTC)
Not CPT invariant, but just T invariant. The Gemmer paper does not seem to use an asymmetry in boundary conditions in their "proof", but as I claim above, it could very well be wrong. Experimentally, I think a macroscopic system always evolves towards a larger entropy if it can (that is if you didn't already start it in a maximal entropy, and if it is ergodic, unlike some glasses), which suggests that it is not just the boundary conditions (do you mean initial conditions?) that are responsible for the second law. I don't think you can concoct any experimental initial conditions that will lead to entropy decrease (after initial transients) for a macroscopic closed system (or an experimentally obtainable approximation thereof). Even if you don't have very good control over the initial conditions, you can concoct an experiment where you allow them to vary and repeat the experiment many times, but I bet that you will never find entropy decrease. If that's correct, then it must be something besides the initial conditions. Since it isn't the dynamics we know, it must be dynamics we are yet unaware of. (talk) 19:00, 4 February 2008 (UTC)
- Dear Clejan - you can't get the proper initial conditions for an inversion of the second law with a system of many particles simply because you can't control each particle individually well, but theoretically you can get them (see the example in the correlation section in the article). Clearly uncorrelated initial conditions always lead, according to theory, to a subsequent increase in entropy. The only problem is why the initial conditions are such, and we can infer them only by assuming similar initial conditions further in the past. That's why you ultimately can (and in fact must) explain everything by the initial conditions at the big bang. Dan Gluck (talk) 09:02, 3 March 2009 (UTC)
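The point that an entropy decrease needs finely tuned, correlated initial conditions can be made concrete in a toy model (my own illustrative sketch, with all parameters arbitrary). Non-interacting particles spread out of a corner of a box; reversing every velocity at time T produces a state that, by construction, evolves back to lower entropy. Random initial conditions essentially never look like that reversed state:

```python
import numpy as np

rng = np.random.default_rng(2)
N, L, NBINS, T = 2000, 1.0, 8, 5.0

def coarse_entropy(pos):
    # Coarse-grained entropy: -sum p ln p over occupancy of NBINS^2 grid cells.
    counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                  bins=NBINS, range=[[0, L], [0, L]])
    p = counts.ravel() / len(pos)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def evolve(pos, vel, t):
    # Free flight with reflecting walls at 0 and L, folded in closed form.
    return L - np.abs(np.mod(pos + vel * t, 2 * L) - L)

pos0 = rng.uniform(0.0, 0.25, size=(N, 2))  # clustered: low entropy
vel = rng.normal(size=(N, 2))

# Evolve to time T, tracking the sign flips the reflections put on velocities.
unfolded = pos0 + vel * T
pos_T = L - np.abs(np.mod(unfolded, 2 * L) - L)
vel_T = np.where(np.mod(unfolded, 2 * L) < L, vel, -vel)

# The "conspiratorial" state (pos_T, -vel_T) looks high-entropy now, but its
# hidden correlations steer it back into the corner: entropy DECREASES.
pos_back = evolve(pos_T, -vel_T, T)
print(coarse_entropy(pos_T), coarse_entropy(pos_back))
```

For generic, uncorrelated velocities at pos_T the entropy would stay high; only this measure-zero, velocity-reversed state undoes the spreading, which is the practical content of the remark above.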
I think it is fair to say that nowadays there is a consensus among experts that the arrow of time is due to cosmological boundary conditions.[ See for example http://arxiv.org/abs/hep-th/0410270] The article should reflect this and not give the impression that the other "explanations" mentioned are somehow competitive with this consensus view. Boltzy (talk) 06:03, 22 May 2008 (UTC)
- Is this a joke? Since when does consensus make for scientific truth? Instead of your argumentum ad verecundiam logical fallacy, you need to address Clejan's last paragraph, directly above your comment, which is quite specific in its criticism of suggesting cosmology imparts a thermodynamic arrow of time. That would suggest that global conditions give rise to a law of physics, the 2nd law of thermodynamics, that has direct local action--completely backwards from the scoping of other laws of physics, where global effects are the result of local action. Entropy will increase in an isolated system even if your time zero is not the Big Bang, so it makes no sense to suppose that cosmological initial conditions somehow create a dynamics law that drives the behavior of a system for which you pick a non-coinciding time zero. This point is also made here http://motls.blogspot.com/2007/06/is-cosmology-behind-second-law.html ThVa (talk) 19:55, 18 August 2008 (UTC)
For the standard view of these matters, see the Scholarpedia article at
http://www.scholarpedia.org/article/Time%27s_arrow_and_Boltzmann%27s_entropy
Note that Lebowitz is a world authority on the subject, a senior professor at Rutgers. I repeat: the Wikipedia article should reflect the consensus view. It can take note of the less nutty alternative theories [though in this particular case all of the alternatives are pretty nutty], but it should make their status clear. Of course consensus and the words of established authority are essential for Wikipedia to work; otherwise the whole project will be taken over by crackpots. Lebowitz is a real expert, Motl is not. Boltzy (talk) 08:10, 5 January 2009 (UTC)
- I totally agree that this is what the article should reflect. But doesn't it already do that? Dan Gluck (talk) 07:42, 3 March 2009 (UTC)
- PS - Dear ThVa, Wikipedia should reflect the consensus in the scientific community because otherwise we may start having articles about all kinds of bizarre pseudo-physical theories. According to your line of thought, we should insert them because perhaps the truth lies there. Additionally, though the consensus argument is enough, I would like to explain how the cosmological boundary conditions can affect every experiment we are doing in a closed system: since the universe started with very low (zero?) correlations between its molecules and different parts, every experiment we are doing with gas etc. will have a high probability of having molecules which are uncorrelated in the beginning (at least no correlation which is relevant to the experiment). Hence the entropy will grow in this particular experiment as well. Dan Gluck (talk) 07:59, 3 March 2009 (UTC)
A proposed change of name
[edit]I suggest changing the name of the article to "Thermodynamic arrow of time". Any objections/suggestions? Dan Gluck (talk) 09:36, 3 March 2009 (UTC)
Quantum optical perspective
[edit]In the field with which I am most familiar, the arrow of time is considered to arise in open systems due to coupling to unmeasured (unmeasurable) degrees of freedom. Time evolution in quantum mechanics of closed systems is a unitary group. For every time evolution operator taking the state of the system at a given time as an input and returning the state of the system at a later time as an output, there exists an inverse operator that carries the system into the past. In order to model behaviors that exhibit an arrow of time, such as spontaneous emission, the common approach is to include couplings to a large (usually infinite) set of external degrees of freedom. The behavior of the open system is then determined by taking a "partial trace" over the reservoir degrees of freedom. The resulting time evolution of the open subsystem is non-unitary and exhibits an arrow of time. The time evolution for the open system forms a quantum dynamical semigroup, in which only time evolution operators carrying the system into the future are defined (semigroups lack inverses). Identifying the arrow of time as a property of open systems also evades Loschmidt's paradox and having to explain the absence of Poincaré recurrence.
In light of this, may I suggest a section about open systems? 128.223.131.79 (talk) 04:50, 8 August 2009 (UTC) jmh
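The open-system picture described above can be illustrated with a toy calculation. This is only a sketch, not part of the original discussion: the single-qubit amplitude-damping channel and the decay probability `g` are chosen purely for illustration. The Kraus-operator form of the channel is what remains after the partial trace over the reservoir; one application already destroys purity (so no unitary could have produced it), and repeated application drives every input to the ground state:

```python
import numpy as np

g = 0.3  # decay probability per step (assumed value, for illustration)
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]])  # no-decay Kraus operator
K1 = np.array([[0, np.sqrt(g)], [0, 0]])      # decay (spontaneous emission)

def damp(rho):
    """One step of the amplitude-damping channel (reservoir already traced out)."""
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Trace preservation: sum_i K_i^dagger K_i = identity.
assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))

# Start in the pure superposition state |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

purity = [np.trace(rho @ rho).real]
for _ in range(50):
    rho = damp(rho)
    purity.append(np.trace(rho @ rho).real)

# Non-unitary: purity drops below 1 after a single step.
print(purity[1])
# Semigroup fixed point: every input relaxes toward the ground state |0><0|.
print(np.allclose(rho, np.diag([1.0, 0.0]), atol=1e-4))
```

The composite system-plus-reservoir evolution stays unitary; the non-invertible semigroup structure appears only after the partial trace, which is exactly the mechanism the comment above describes.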
Contradictory section
[edit]It appears that the last two paragraphs of the Correlations section are contradictory. The first of the two paragraphs claims that the measured thermodynamic entropy is independent of correlations. Further, it claims that correlations between particles will decrease the information entropy. Then, the following paragraph claims that as correlations in an initially uncorrelated system increase, the entropy must also increase. This is absolutely contrary to the previous paragraph. Additionally, the reference [1] given is not of the peer-reviewed sort, and is thus somewhat suspect (not that the peer review process is perfect). zipz0p (talk) 23:18, 13 September 2009 (UTC)
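On the disputed claim that correlations lower the information entropy: for two subsystems the joint Shannon entropy satisfies S(A,B) = S(A) + S(B) − I(A;B), so positive mutual information I pulls the joint entropy below the sum of the marginals. A toy two-bit check (the distributions below are hypothetical, chosen only to illustrate the identity):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability array (zeros skipped)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two fair bits, uncorrelated: joint entropy equals the sum of marginals (2 bits).
p_indep = np.array([[0.25, 0.25], [0.25, 0.25]])
# Same marginals, perfectly correlated: joint entropy drops to 1 bit.
p_corr = np.array([[0.5, 0.0], [0.0, 0.5]])

for p in (p_indep, p_corr):
    px, py = p.sum(axis=1), p.sum(axis=0)
    mutual_info = shannon(px) + shannon(py) - shannon(p)
    print(shannon(p), mutual_info)
# prints 2.0 bits with I = 0 for the uncorrelated pair,
# then 1.0 bit with I = 1 for the correlated pair.
```

Whether the *thermodynamic* (measured) entropy tracks the joint or the marginal quantity is precisely the point the two paragraphs in the article disagree on; the arithmetic above only fixes what the information-theoretic side of the claim means.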
References
Cosmology/A notable exception is the wave function collapse
[edit]Quote: A notable exception is the wave function collapse in quantum mechanics, which is an irreversible process. It has been conjectured that the collapse of the wave function may be the reason for the Second Law of Thermodynamics. However it is more accepted today that the opposite is correct, namely that the (possibly merely apparent) wave function collapse is a consequence of quantum decoherence, a process which is ultimately an outcome of the Second Law of Thermodynamics.
This section needs some citations and clarification. First - what is it doing in Cosmology? Shouldn't it be in Quantum mechanics?
Second it uses the phrase "wave function collapse" in two different ways: 1. "wave function collapse" as in wave function collapse in the Copenhagen interpretation; 2. "wave function collapse" as in the "appearance of the wavefunction collapse" in the Decoherence interpretation. Confusing as hell.
And third, it states that the quantum decoherence process (quantum theory) is an outcome of the Second Law (classical theory). That is backwards: the classical law should be the classical limit of the quantum theory. My proposal here would be to replace this part with the following:
A notable exception is the [quantum entanglement] process in quantum mechanics, which is an irreversible process. It has been conjectured that the entanglement may be the reason for the Second Law of Thermodynamics. The [quantum decoherence] process (resulting from the entanglements between the open system and environment) produces the Second Law of Thermodynamics in the classical limit.
--Dc987 (talk) 07:37, 3 October 2009 (UTC)
I will second these complaints and add my own complaints. The entire Cosmology section of this article is full of weasel words and original research. Probably the worst part: "It has been conjectured that the collapse of the wave function may be the reason for the Second Law of Thermodynamics." Generally, this entire section does not belong in an encyclopedia article. Our next step is to mark this article up with wikipedia warning banners. Hopefully the person who wrote this nonsense will come here and defend it. (Although I honestly cannot imagine a reasonable defense of this tripe.) Miloserdia (talk) 11:12, 27 April 2011 (UTC)
Revert please
[edit]Due to edits of User:Wyhiugl and User:Sage123, this once great article has been reduced to plain pseudo-scientific nonsense (I'm referring to this recent version). I have reverted it a second time to an older one, but Wyhiugl has re-reverted once and I suspect he will do it again, so I post this to explain and try to avoid an edit war.
This diff illustrates the nature of the changes. I fail to see how any entropy phenomenon in my refrigerator is induced by gravity. The rest of the article is similarly original research at best, with irrelevant sources.
Wyhiugl called the good old definition of entropy "gibberish" in the summary; that may be a sign of a certain attitude. I suggest reverting to the last sane version (I already did), but I fear it's going to degenerate into an edit war.
Dear Wyhiugl!
Please let me know why you had to completely rewrite a great article. Even if your claims here are true, you should have at least kept the basic material in place.
Tibixe (talk) 20:00, 13 February 2011 (UTC)
- I agree that the recent changes are at best WP:OR/WP:SYNTH. Gravity has nothing to do with entropy as the arrow of time. I've reverted to the long-standing version. Headbomb {talk / contribs / physics / books} 00:35, 14 February 2011 (UTC)
- I also agree, same reason as Headbomb. --Steve (talk) 00:48, 14 February 2011 (UTC)
- Confirmed as socks of Antichristos. They have been blocked, along with a few other accounts. Headbomb {talk / contribs / physics / books} 03:05, 14 February 2011 (UTC)
Correlations
[edit]The meaning of correlation in this context should be defined, either directly here or by providing a link to an explanation. Without this explanation many readers will not understand this section. Regards, PeterEasthope (talk) 06:07, 29 December 2011 (UTC)
This whole section looks mistaken. The correlations work exactly the opposite way in the thought experiment described. In the beginning (of "experiment A") the whole gas is in one half of the box, i.e. the positions of the gas molecules are much more correlated than later on, when they occupy the whole box. Actually, if we split the box, put gas in one part and make vacuum in the other, and then wait a bit, the gas-filled part will have particles with "correlated" speeds. If we remove the splitter, we get the perfect initial condition for "experiment A". So, IMHO, this whole section is erroneous. --176.63.223.236 (talk) 22:38, 25 February 2017 (UTC)
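For what it's worth, the free-expansion setup discussed in this section is easy to sketch numerically. The toy model below (collisionless particles, reflecting walls; the particle count, bin count, and evolution time are arbitrary choices) starts with all particles in the left half of the box; the coarse-grained spatial-bin entropy grows toward its maximum as the gas spreads out, even though the microscopic dynamics are exactly reversible:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, nbins = 10_000, 1.0, 20

# Low coarse-grained entropy: all particles start in the left half of the box.
x = rng.uniform(0, L / 2, N)
v = rng.normal(0, 1, N)

def coarse_entropy(x):
    """Coarse-grained (Boltzmann-style) entropy: -sum p_i ln p_i over spatial bins."""
    counts, _ = np.histogram(x, bins=nbins, range=(0, L))
    p = counts / N
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def evolve(x, v, t):
    """Collisionless free flight with reflecting walls at 0 and L.

    A reflecting trajectory is periodic with period 2L, so fold it back in."""
    y = np.mod(x + v * t, 2 * L)
    return np.where(y > L, 2 * L - y, y)

S0 = coarse_entropy(x)
S1 = coarse_entropy(evolve(x, v, 5.0))
print(S0, S1)  # S grows from about ln(nbins/2) toward ln(nbins)
```

Reversing every velocity at t = 5 would march the particles straight back into the left half (Loschmidt's point), so the entropy growth here comes from the coarse-graining plus the uncorrelated initial condition, not from the dynamics.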
Do We Need Maxwell's Demon?
[edit]I don't believe the section on Maxwell's Demon is sufficiently relevant to the arrow of time to justify its inclusion here. Maybe it should have been placed in the article on Entropy? In any case I propose that it be cut from this page despite being well written & informative. — Preceding unsigned comment added by Tony999 (talk • contribs) 01:33, 4 July 2012 (UTC)
Intermingled?
[edit]In the section titled "Quantum Mechanics" in this article, the rational and irrational numbers are given as an example of "intermingled" sets. These sets are dense in each other -- I assume that's what they mean by "intermingled".
Are these unnamed physicists (I say unnamed because the citations are still missing) suggesting that there are states that are literally dense in each other in Hilbert space? How can that be well-defined in a physical theory? 166.137.101.164 (talk) 18:46, 12 July 2014 (UTC) Collin237
(Another) Contradictory section
[edit]The first sentence of this article is a rather ill-put conclusion: "Entropy is the only quantity in the physical sciences (apart from certain rare interactions in particle physics; see below) that requires a particular direction for time, sometimes called an arrow of time." The sentence clearly contradicts itself, as well as the Wikipedia article "T-symmetry", which describes THREE ways an arrow of time reveals itself in otherwise reversible physics. I understand that the second law of thermodynamics is very different from other time-symmetry violations, but it seems too hasty to brush aside other phenomena with the same time-asymmetric conclusions, especially when problems like Loschmidt's paradox remain unsolved. If nothing else, please take into account the contradictions between the two articles I mentioned. — Preceding unsigned comment added by 104.162.23.189 (talk) 22:27, 23 June 2015 (UTC)
Contradictory Paragraph
[edit]I found this text in the article which belongs on the talk page:
The above paragraph contradicts itself by first saying that the early Universe was in a state of "maximum entropy" which then "increased" beyond its already maximized entropy via the expansion of the space containing the matter. Cosmic inflation isn't analogous to letting the air out of a tire, given that the "creation" of more space via inflation has no effect on the ordering of matter within its space, since individual particles of matter are "unaware" of their increasing volume. If you define entropy as a measure of homogeneity, one can only conclude that any further clumping (subatomic particles combining to form atoms) will only decrease the overall homogeneity of the system, regardless of the space in which it resides. Simply playing "Jedi" mind games doesn't point the arrow of time in the right direction. By example: you create a one cubic meter volume of particles X seconds after the Big Bang. Y seconds later in time your one cubic meter volume of particles is still only one cubic meter, since the measurement standard has also increased relative to its surrounding space.
(In the "Cosmology" section)
Pinging the editor who wrote it: @Ftherrmann1326:
No opinion here, do with this what you will. -- ProgrammingGeek (Page! • Talk! • Contribs!) 07:48, 30 July 2016 (UTC)
Lots of grammar mistakes like commas separating independent clauses and also strange tone like "it is easy to understand"
[edit]I submitted an edit but quickly realized this page is just rife with mistakes, and the tone does not match the rest of Wikipedia. One sentence even starts with "it is easy to understand". Theguyi26 (talk) 22:59, 14 March 2019 (UTC)
possible Original Research disclaimer removed
[edit]The example of two fluids to illustrate entropy is neither controversial nor original. It was referenced in classical Greek texts.
Small labelling change would be very helpful
[edit]Hi, first contribution on Wikipedia for me! Thanks to everyone for all their hard work.
I did struggle to make sense of the section "Mathematics of the arrow". Bodies are labelled as T1 and T2 but then T is used to refer to temperature. It did take some effort to realise that, in fact, T1 and T2 are the temperatures of the two bodies. I suggest it needs to say eg "a body at temperature T1". Also, I did think that the convention for such variables was small case letters, with capitals for the general relationships, like E = MC squared. And, that would feel more intuitive to me.
Also, less significantly, whilst Δ S is defined as the entropy change, I suggest it would be more useful to first define S: you would then also be educating readers not familiar with delta symbols as to their meaning. Very simple solution: entropy is defined as Q/T. We could simply have 'Entropy, S, defined as Q/T'.
Timhohfee (talk) 13:27, 26 October 2021 (UTC)
https://en.wikipedia.org/wiki/Mass%E2%80%93energy_equivalence
E = mc^2 lowercase m, lowercase c
https://en.wikipedia.org/wiki/List_of_letters_used_in_mathematics_and_science
When reading a page like "Entropy as an arrow of time", please note that there are already pages "Arrow of time" and "Entropy" which cover a lot of the fundamentals. If, like me, you do not get every technical detail at first read, visit the Wikipedia portal and look for the "Outline of ..." or "Introduction to ..." pages.
Thanks for an enjoyable discourse, I hope you have a nice day. — Preceding unsigned comment added by 49.199.247.57 (talk) 12:04, 20 April 2022 (UTC)
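The S and ΔS bookkeeping discussed in this section can be shown with a short numeric check (the temperatures and heat quantity below are arbitrary illustrative values). Heat Q flowing from a hot body at temperature T1 to a cold body at T2 changes the total entropy by ΔS = Q/T2 − Q/T1, which is positive whenever T1 > T2:

```python
# Entropy bookkeeping for a small heat transfer Q from a hot body at T1
# to a cold body at T2 (temperatures in kelvin, values chosen arbitrarily).
T1, T2 = 400.0, 300.0   # T1 > T2
Q = 100.0               # joules transferred from body 1 to body 2

dS_hot = -Q / T1        # hot body loses entropy Q/T1
dS_cold = +Q / T2       # cold body gains entropy Q/T2
dS_total = dS_hot + dS_cold

print(dS_total)  # about +0.0833 J/K: positive, as the second law requires
```

Reversing the flow (heat moving spontaneously from cold to hot) flips the sign of Q and makes ΔS negative, which is exactly the case the second law forbids.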
"Dynamical systems" section contains nonsense
[edit]In the section "Dynamical systems" there are currently the following sentences:
"For some of these systems, it can be explicitly, mathematically shown that the transfer operators are not trace-class. This means that these operators do not have a unique eigenvalue spectrum that is independent of the choice of basis. In the case of the baker's map, it can be shown that several unique and inequivalent diagonalizations or bases exist, each with a different set of eigenvalues."
This is nonsense on multiple levels.
First: "Eigenvalue spectrum" is a weird phrasing. It's either called "the spectrum" or (in special cases) "the set of eigenvalues", but not both. In general, the set of eigenvalues is only a subset of the spectrum, but not necessarily equal to the whole spectrum.
Second: There is no such thing as a non-unique spectrum or a basis-dependent spectrum. By its very definition the spectrum is unique and basis independent, because there simply is nothing in the definition it could possibly be dependent on (except the operator itself), especially not a basis.
Related: There is no such thing as "diagonalizations with different sets of eigenvalues". That's the thing about diagonalizing an operator: Exactly the eigenvalues will appear on the diagonal. That's the only thing that *can* appear on the diagonal. And there is only one set of eigenvalues. That's also why "inequivalent diagonalizations" does not make any sense. All diagonalizations are unitarily equivalent, almost by definition, because they all have the eigenvalues on the diagonal. That's the whole point of a diagonalization. Otherwise it wouldn't be a diagonalization.
Third: What the hell does "several unique but inequivalent" mean? If something is unique, there is only one of that thing. At most it can mean "there are several, but all are equivalent". But "several inequivalent" implies multiple things. It cannot possibly be both.
Fourth: Not being trace-class has very little to do with diagonalizations existing or not. For example, all compact operators are diagonalizable, but not all compact operators are trace-class.
The section then continues seemingly inferring
"It is this phenomenon that can be offered as an "explanation" for the arrow of time. That is, although the iterated, discrete-time system is explicitly time-symmetric, the transfer operator is not."
which is just plain wrong. The transfer operator of Baker's map is an invertible unitary operator, because Baker's map is invertible and measure-preserving.
I'm sure the original author meant to write something meaningful. I simply have no idea what that was, so I will not attempt to correct the section. 2A02:2454:854C:3600:2530:8649:C490:5D9B (talk) 23:19, 1 July 2023 (UTC)
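The point above that the baker's map is an invertible, measure-preserving bijection is easy to check numerically. This is only a sketch: the forward map and its explicit inverse below are the standard formulas, written out for illustration, and the round-trip test is over randomly drawn points:

```python
import numpy as np

def baker(p):
    """Baker's map on the unit square: stretch by 2 in x, cut, and stack in y."""
    x, y = p
    return (2 * x) % 1.0, (y + np.floor(2 * x)) / 2.0

def baker_inv(p):
    """Explicit inverse of the baker's map (it is a measure-preserving bijection)."""
    x, y = p
    return (x + np.floor(2 * y)) / 2.0, (2 * y) % 1.0

# Spot checks on the two halves of the square.
print(baker((0.25, 0.25)))  # left half:  (0.5, 0.125)
print(baker((0.75, 0.5)))   # right half: (0.5, 0.75)

# Round trip: the inverse undoes the forward map on random points.
rng = np.random.default_rng(1)
pts = [tuple(rng.uniform(0, 1, 2)) for _ in range(1000)]
err = max(max(abs(a - b) for a, b in zip(baker_inv(baker(p)), p)) for p in pts)
print(err)  # at machine precision: the map is invertible
```

Since the map is a bijection preserving Lebesgue measure, its transfer (Koopman/Frobenius-Perron) operator on square-integrable densities is unitary, which is the correction the comment above makes to the article text.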
Highest Entropy State of Universe not likely to be “all matter collapsed into black holes.”
[edit]"(compared to a high-entropy state of having all matter collapsed into black holes, a state to which the system may eventually evolve)"
Although black holes have the highest state of entropy in the current state of our Universe, the ultimate end point is more likely the "heat death" of the Universe (actually this is one of three possibilities known at this time, the other two being a big crunch and a vacuum decay), not black holes eating everything up and staying as black holes forever. Hawking and others showed that black holes have a temperature (albeit very low) and eventually every black hole will evaporate over incredibly long time periods. See Katie Mack's book "The End of Everything." Joh71962 (talk) 14:01, 15 January 2024 (UTC)