Talk:Maximum entropy thermodynamics
|WikiProject Physics||(Rated Start-class, Mid-importance)|
|This page was nominated for deletion on 28 October 2005. The result of the discussion was keep.|
- 1 From stub to article
- 2 Introduction could be more friendly
- 3 This entry is hard to follow
- 4 Average entropy, measured entropy and entropy fluctuations
- 5 meaning of the words 'subjective' and 'objective'
- 6 The latest addition to the list of caveats with the argument is wrong and should be deleted
- 7 New section on criticism
- 8 increase of thermodynamic entropy
- 9 epistemic and objective
- 10 MaxEnt
From stub to article
- technical note that strictly the entropy should be relative to a prior measure. -> Principle of minimum cross-entropy (Kullback-Leibler distance). In thermodynamics we usually assume the "principle of equal a-priori probability" over phase space, so the two are then equivalent.
- section on philosophical implications regarding the conceptual problems of statistical mechanics, second law, etc.
- -- (?) DONE Jheald 22:07, 2 November 2005 (UTC)
- (?) some more algebra, and a simple nonequilibrium example (eg Brownian motion?)
-- Jheald 12:47, 28 October 2005 (UTC)
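The note above, that maximizing entropy relative to a uniform prior measure coincides with minimizing the Kullback-Leibler distance, can be illustrated with a minimal numerical sketch (all numbers are illustrative, not from any source): with a uniform prior m over N states, D(p||m) = log N − H(p), so the two variational principles pick out the same distribution.

```python
import numpy as np

# Illustrative sketch (assumed values): uniform prior over N states.
N = 4
m = np.full(N, 1.0 / N)                # uniform prior measure
p = np.array([0.1, 0.2, 0.3, 0.4])     # some candidate distribution

H = -np.sum(p * np.log(p))             # Shannon entropy of p
D = np.sum(p * np.log(p / m))          # Kullback-Leibler distance D(p||m)

# Identity: D(p||m) = log(N) - H(p) for any p when m is uniform,
# so maximizing H is the same problem as minimizing D.
assert np.isclose(D, np.log(N) - H)
```

This is why, under the "principle of equal a-priori probability", the two formulations are equivalent; with a non-uniform prior only the cross-entropy form survives.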
Introduction could be more friendly
(from the Article deletion page):
Note to author - please consider adding a few paragraphs up front in layman talk before getting on to the partial differentials. There ought to be something you can say about maximum entropy that I can slip into a casual conversation. Denni☯ 23:56, 28 October 2005 (UTC)
- You know, I tend to agree with you. But the problem is that whatever egghead wrote the damn article probably would have real trouble talking to a regular human being. Someone that bright probably would have trouble answering a question as easy as "Hey man, what's up?" You'd probably get as an answer some canned formulaic response that they'd learned to the question, if not some Larry-Wallesque humor. If anything, I really fucking like this article. It reminds me of the good old days of wikipedia, when PhD level researchers from a buncha universities would pop on during a coffee break and spit out (arguably useful) articles on cutting edge theory, heavy academia or particle physics. Eventually the whole project got taken over by hack librarians, star trek bloggers and geeky high-school students, with the result that they pretty much demanded a level of organization to the writing that your average professor was unable/unwilling to do on their coffee break. This scared them off and most articles, even about important subjects, end up bland, overly-verifiable and smelling like they were written by some windex-and-spam fried committee. Honestly, the only good articles left on wikipedia are the ones on topics so obscure that people find nothing in them to be contentious. Like this article, for example. But anyways, brother, I certainly see your points. Sorry for the outburst. Long-live Myspace, Facebook, Wal-Mart and Tom Hanks. —Preceding unsigned comment added by 10.250.65.158 (talk) 01:22, 6 October 2007 (UTC)
- Me again. There's the additional problem that sufficiently complex information reaches a point of irreducibility, beyond which any further boiling down deprives the subject-matter of its inherent factuality. (cf. What the Bleep Do We Know http://en.wikipedia.org/wiki/What_the_Bleep_Do_We_Know%21%3F , a film designed to make quantum mechanics comprehensible to people of average intellect and new-age spiritual interest). You try to make something like that so that regular folks can understand it, and the result is a hodge-podge of miswrought analogies and irreconcilable metaphors. Really, people are better off knowing that they just don't fucking know anything about a particular subject than thinking they can define five or ten vocabulary words of the industry jargon.
This entry is hard to follow
I have done a fair bit of reading and writing of Wiki entries on math and science, and this entry strikes me as one of the hardest to follow out of all those I've read. This entry badly needs to be turned around, but I don't quite know where to begin. If anyone conjectures that I am incompetent to form an opinion of this nature, let me point out that my education included Bayesian statistics and Shannon information.
This entry is important simply because E.T. Jaynes founded its subject with his 1957 Princeton Ph.D. thesis written under Eugene Wigner; Wigner was a worthy fellow, and Jaynes became one himself. In fact, I first encountered his name while learning Bayesian statistics. I think of Jaynes as having advanced the legacy of Gibbs and Boltzmann. In any event, this entry should also include a paragraph giving the history of its subject.
Another reason why this topic is a worthy one is that subjective probability and Bayesian statistics can be powerful tools in the philosophy of science; see, for example, the work of Nick Bostrom. If the data can be adequately summarized via a likelihood function, then Bayes's Rule is a powerful way of reasoning inductively from those data. Hence I am surprised that this entry does not mention Bayes's Rule. 220.127.116.11 (talk) 06:25, 25 December 2008 (UTC)
Average entropy, measured entropy and entropy fluctuations
At the moment the article isn't very clear as to when it's talking about expectations (either as a constraint, or a prediction), and when about actual measurements. For example, in the discussion of the 2nd law, the measured macroscopic quantities probably won't come in bang on the nose of the predicted values -- instead (we assume) they will be within the margin of predicted uncertainty.
This especially needs to be much cleaned up in the context of entropy, particularly if we're going to discuss the fluctuation theorem.
Also, the new measurements will therefore contain (a little) new information, over and above the predicted distribution. So it's not quite true that SI is unchanged. It will still be a constant, but strictly speaking it will become a different constant, as we propagate back the new information, sharpening up our phase-space distribution for each instant back in time.
-- Jheald 15:51, 1 November 2005 (UTC)
- -- DONE Jheald 22:07, 2 November 2005 (UTC)
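The point in the thread above, that measured macroscopic quantities land within a predicted margin of uncertainty rather than exactly on the predicted expectation, can be sketched numerically (the distribution and values here are assumed for illustration only):

```python
import numpy as np

# Illustrative sketch (all numbers assumed): a predicted distribution
# gives an expectation value; a finite run of simulated "measurements"
# scatters around it within a few standard errors.
rng = np.random.default_rng(0)

p = np.array([0.5, 0.3, 0.2])       # assumed predicted distribution
x = np.array([0.0, 1.0, 2.0])       # values of the measured quantity

mean_pred = p @ x                           # predicted expectation
var_pred = p @ (x - mean_pred) ** 2         # predicted variance

n = 10_000
samples = rng.choice(x, size=n, p=p)        # simulated measurements
mean_obs = samples.mean()

# The observed mean is generally not exactly the predicted one, but it
# falls well within the predicted margin of uncertainty.
stderr = np.sqrt(var_pred / n)
assert abs(mean_obs - mean_pred) < 5 * stderr
```

The small discrepancy mean_obs − mean_pred is exactly the "(a little) new information" mentioned above, which would sharpen the distribution on updating.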
meaning of the words 'subjective' and 'objective'
The article at present, in a section called 'The nature of probabilities in statistical mechanics', writes: "According to the MaxEnt viewpoint, the probabilities in statistical mechanics are subjective (epistemic, personal), to the extent that they are conditioned on a particular model for the underlying state space (e.g. Liouvillian phase space). They are also conditioned on a particular partial description of the system (the macroscopic description of the system used to constrain the MaxEnt probability assignment). The probabilities are objective to the extent that given these inputs, a uniquely defined probability distribution will result." [bold type by present talk page writer here]
I think this is not quite right. It is true that critics of the MaxEnt viewpoint view it as dealing with 'subjective (personal) probabilities', but the holders of the MaxEnt viewpoint do not. The holders of the MaxEnt viewpoint hold that the probabilities are simply objective and epistemic. They hold that being epistemic logically precludes them from being subjective or personal. The root of the word epistemic, used by Aristotle, is usually translated as "scientific knowledge", in direct and explicit contrast with "opinion", which is regarded as essentially subjective and personal. In view of this tradition of thinking and language, it is self-contradictory to say that something is both epistemic and subjective/personal. I think all will agree that in science, to label something as 'subjective/personal' is pejorative. That the probabilities are conditioned on a particular model does not make them subjective or personal unless one hides or conceals or fails to state the model. The MaxEnt people are keen to put the model upfront and explicit, and not to hide it. Indeed, "the probabilities are objective because given these inputs, as is the case in a properly stated proposition, a uniquely defined probability distribution will result." It is only according to the opponents of the MaxEnt viewpoint that the MaxEnt viewpoint deals with "subjective/personal" probabilities. The sentence in the article that says otherwise is not an accurate statement, and should be deleted. Perhaps it could be replaced, if one was keen, by a sentence like "According to the opponents of the MaxEnt viewpoint, it deals with subjective or personal probabilities, but the holders of the MaxEnt viewpoint categorically deny this pejorative mischaracterization."
The article at present continues: "At a trivial level, the probabilities cannot be entirely objective, because in reality there is only one system, and (assuming determinism) a single unknown trajectory it will evolve through. The probabilities therefore represent a lack of information in the analyst's macroscopic description of the system, not a property of the underlying reality itself." This is not an argument that the probabilities are not objective. It is an objective fact that the analyst is stating that he has a lack of information. Objectivity is not about direct contact with the underlying reality itself: it is about how the situation is conceived and described: objectivity is an epistemic notion, not an ontological one. The two quoted sentences are inaccurate, misleading, and should be deleted. Chjoaygame (talk) 05:54, 17 October 2009 (UTC)
note of intention to edit
I have above proposed to make some edits to the effect that 'subjective' is not the right word to label the epistemic meaning of probability. The word 'subjective' is used by opponents of MaxEnt theory as a politely pejorative epithet, but that something is epistemic does not make it subjective. MaxEnt people do not think that probability is subjective. So far, no replies to my above remarks that are preliminary to my efforts at editing to this effect. Anyone with a comment?Chjoaygame (talk) 07:00, 23 October 2009 (UTC)
more argument about opinion versus scientific knowledge
On page 44 of Probability Theory, Jaynes writes, in a slightly uncharacteristic appeasement of critics, that "any probability assignment is necessarily 'subjective' in the sense that it describes only a state of knowledge, and not anything that could be measured in a physical experiment." Jaynes puts single quote marks ' ' on the word subjective. This indicates that he is not using the word in its natural sense, but is exhibiting it in the strained sense of his critics. In its natural sense, the word is opposite to the word objective. Jaynes goes on, on page 45, to say that the probability assignments that he regards as valid are objective. Jaynes here puts ' ' marks on the word objective, to indicate that he is critically examining it, as distinct from merely using it. There are things objective that cannot be demonstrated in a physical experiment. That 0 = 0 and exp(i pi) = -1 are objective facts of mathematics that cannot be demonstrated in an empirical experiment. True, they are not empirical facts; they are only analytic or logical facts, but they are still objective. So it is with the objective character of Jaynes' probabilities. As mentioned above, the real distinction here can be traced back further than Plato and Aristotle; the distinction is as valid today as it was in the old days. It is between doxa, meaning reputation, expectation, or opinion, and episteme, meaning scientific knowledge. The opposition is fundamental. Doxa is subjective, episteme is objective. Jaynes intended his probabilities as objective and not subjective. The Wikipedia article should make that clear, and make it clear that it is the opponents of Jaynes who like to call his probabilities subjective. Chjoaygame (talk) 10:05, 23 October 2009 (UTC)
edit to focus on Jaynes' view that probability is epistemic and impersonal
The latest addition to the list of caveats with the argument is wrong and should be deleted
The following has been added to the list of caveats with the argument.
- 4. The use of Shannon information-entropy in Boltzmann probabilistic approach to thermodynamics requires very restricted assumptions about the preparation of the system, as well on the nature of the collision mechanism (Markov processes). The relation between entropy and information has been exploited as a general rule to extract information in other situations than thermodynamics, many of these generalizations are based on little or inapplicable evidence.
It is imprecise and inaccurate. It was written by someone who was not thinking clearly at the time of writing, but who was not aware of that. The use of information theoretic arguments requires not the making of restricted assumptions, but the precise explicit statement of assumptions that would often otherwise be left unstated, though still implicitly, and thus unscientifically, made. The use of information theoretic arguments does not require specific assumptions about the collision mechanism, but does require explicit statement of whatever assumption is made. The writer of the addition seems likely to have been in the habit of making implicit unstated assumptions, and apparently does not like the information theoretic requirement that he mend his ways in this respect.
- New caveat removed, along with a ref to a self-publication by the contributor () -- self-promotion is not encouraged; and self-pubs are not considered a WP:RS.
- More generally, to amplify what Chjoaygame writes above, you should read some of what Jaynes has to say on the subject; and, e.g., the discussion of the H-theorem by Jaynes or Gull.
- The MaxEnt position is that a MaxEnt distribution is the consequence of a deliberate assumption that no relevant factors need to be considered other than those captured in the expectation values of a particular set of parameters, and the nature of the underlying assumed space. The form p log p appears, not because of a detailed collision mechanism, but because of detailed axiomatic arguments (applicable to probability distributions in quite arbitrary contexts -- see maximum entropy distribution).
- The MaxEnt people recognise full well that what is being made is a prediction, based on some quite specific falsifiable assumptions, not a mathematically-established consequence. Mathematical systems can surely be constructed -- because physical ones definitely can -- where details of the initial state, and/or the mixing process, do survive to affect the final state distribution. But the crucial point is that this isn't seen as a failure of the MaxEnt algorithm, or the MaxEnt philosophy -- rather, it is flagging a much more concrete failing in the detailed assumptions that are being fed to the MaxEnt method. It's seen as a strength in the MaxEnt method, that it can flag a failure in the modelling. That is good, because it is an opportunity to learn; and a way to demonstrate that something beyond the originally-chosen set of parameters makes a difference and has to be taken into account.
- The "caveats" set of points in the article is intended to reflect views and self-perspectives that the MaxEnt people themselves have of the MaxEnt process.
- External criticism, if you can relate it to what WP would consider independently published reliable sources, should probably go into a different category, called something like "reception" or "criticism". But be careful you aren't criticising MaxEnt as not providing something that it isn't ever claiming to do. Jheald (talk) 12:35, 14 February 2010 (UTC)
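The claim above, that the MaxEnt form follows from the variational problem rather than from any collision mechanism, can be sketched numerically (the energy levels and constraint value here are assumed purely for illustration): constraining only the expectation of one parameter and maximizing the p log p entropy yields the exponential (Gibbs) family p_i ∝ exp(−beta·E_i), with beta fixed by the constraint.

```python
import numpy as np

# Hypothetical example: three "energy" levels and a mean-energy constraint.
E = np.array([0.0, 1.0, 2.0])   # assumed levels (illustrative)
E_target = 0.6                  # assumed constraint on the mean

def mean_energy(beta):
    """Mean energy of the Gibbs distribution p_i ∝ exp(-beta*E_i)."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# Solve <E>(beta) = E_target by bisection; <E> decreases as beta grows.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > E_target:
        lo = mid            # mean too high: need larger beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)

w = np.exp(-beta * E)
p = w / w.sum()
assert abs(p @ E - E_target) < 1e-9   # constraint satisfied exactly
```

No dynamics enter anywhere: the exponential form and the value of beta come only from the stated constraint and the assumed state space, which is the point being made about falsifiable explicit assumptions.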
New section on criticism
In a response to Chjoaygame, editor Jheald wrote above:
- External criticism, if you can relate it to what WP would consider independently published reliable sources, should probably go into a different category, called something like "reception" or "criticism". But be careful you aren't criticising MaxEnt as not providing something that it isn't ever claiming to do.
I am surprised that the current version of this article does not contain a section with criticism of Jaynes and MaxEnt theories, both of which have received strong criticism in the specialized literature. I am going to add a new section with the concise but strong criticism given by Radu Balescu in a recent monograph. JuanR (talk) 19:43, 17 December 2010 (UTC)
I have modified non-NPOV changes by user CoupledMap. CoupledMap has deleted a fact from the online reference (Kleidon, Axel; Lorenz, Ralph D. (2004)):
- "[...] Jayne's MaxEnt formulation [...] has for so long failed to be accepted by the majority of scientists."
and substituted his non-NPOV version. CoupledMap writes that:
- "Several scientist such as Joel Lebowitz or Roger Penrose are clearly in favor of maximum entropy thermodynamics ."
The reference that CoupledMap adds is an unpublished report which has nothing to do with MaxEnt. Lebowitz does not cite Jaynes' work on MaxEnt, much less claim to support it! The report contains an appendix A where letters from several authors published in journals are reprinted. One of them, from some MaxEnt people, cites Jaynes' work in its criticism of Lebowitz. In his reply, Lebowitz shows his dislike of MaxEnt. Nor is it claimed anywhere in the reference added by CoupledMap that Roger Penrose is in favor of MaxEnt. JuanR (talk) 00:42, 12 January 2011 (UTC)
increase of thermodynamic entropy
The new edit did not make clear that it refers to a transient non-equilibrium situation, in which the system either is started far from equilibrium, or is watched until eventually, after a time comparable with the age of the universe (if such exists), it fluctuates back to a transient state far from equilibrium. The edit distracted from the point that was being made where the edit was inserted. If the editor wants to discuss this kind of situation, he should state his concern clearly, not just as an unexplained interruption to an account of a different matter, which referred to an all-time average. Chjoaygame (talk) 00:11, 27 December 2011 (UTC)
epistemic and objective
The editor who made the edit that I have undone was writing in good faith. But he is writing from his perspective of thinking of things along Kantian lines, not from the perspective of maximum entropy thermodynamics. The problem that he sees with the terms objective and subjective is not present when things are considered from the perspective of maximum entropy thermodynamics. That is not to say that the problem is not important for Kantian theory. The paragraph of the article that leads that editor to write his "however" explains why his "however" is irrelevant for the present page. The edit was an advertisement for Kantian theory but did not actually say anything useful for this page. I am not criticizing Kantian theory by these remarks. I am saying that the edit did not contribute to the value of this page. I hope that the editor may read again the paragraph that he thinks needs his "however", and come to agree that his "however" was non-contributory to this page. Chjoaygame (talk) 16:20, 19 January 2013 (UTC)
- Perhaps so. Yes, it is better to spell out the words as you have done in your edit.
- As a matter of collegiality, it would be easier for other editors if you would very kindly choose yourself a very safely unidentifiable user name and register your usage here with it, instead of just using an IP address. It just makes conversation more comfortable. You just sign with four tildes. Chjoaygame (talk) 01:20, 2 February 2015 (UTC) Chjoaygame (talk) 01:44, 2 February 2015 (UTC)
- I might get around to that; most of my edits are very small.
- Good to chat. The variable IP address is disconcerting, even for small or rare edits. I very much prefer the four tildes trick. Best to take some care to create a name that is unidentifiable.
- BTW, I don't mean it's the "baby" of some editor of this article! Rather, MaxEnt sounds like a proprietary algorithm or a variable name, and in any case a recent invention, yet actually the idea goes back to papers written well before camelback abbreviations were common.
- I haven't yet finished reading the article, but I have trouble seeing why there would be contention about these ideas. Misunderstanding, maybe; it is not that easy. But it's often the case that thermodynamics generates controversy.
- Yes, I agree about abbreviation.
- It seems that the "subjectivity" involved is no different from conditioning a probability on an event (meaning a subset of a measure space). Coding a state of knowledge or opinion into sets, numbers and probabilities is as objective a description as you can get. Besides "episteme", it reminds me of Popper's "Third world", which contains not only knowledge, but also "the state of an argument".
- I noticed by google that "MaxEnt" really does refer to a software package, as well as a long-running conference series (20 or 25 years). So I guess the vocabulary word, as well as the group-feeling of pro and contra, goes back a long way. It appears to be aligned with a similar polarization about Bayesian statistics. All of which is new to me. But not to you, as I see from these talk pages. 18.104.22.168 (talk) 09:37, 5 February 2015 (UTC)