# Talk:Statistical mechanics

WikiProject Physics (Rated B-class, Top-importance)
This article is within the scope of WikiProject Physics, a collaborative effort to improve the coverage of Physics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
B  This article has been rated as B-Class on the project's quality scale.
Top  This article has been rated as Top-importance on the project's importance scale.
WikiProject Mathematics (Rated C-class, High-importance)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
C  This article has been rated as C-Class on the project's quality scale.
High  This article has been rated as High-importance on the project's importance scale.
Field: Mathematical physics

## Out-of-place quotation?

"Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics." -- David L. Goodstein, States of Matter

It is a little weird to start an article with a quotation like that. — Miguel 07:33, 2004 May 28 (UTC)

I know, but it is such a great quote from a standard reference, I thought it was worth including. Michael L. Kaufman 22:29, May 28, 2004 (UTC)

It is pretty interesting, maybe inclusion at the end of the article. Edsanville 05:40, 19 Aug 2004 (UTC)

Perhaps the page should link to Maxwell-Boltzmann distribution (AC)

## Structure of statistical mechanics articles

I'd like to make some more contributions to the Stat. Mech. subject, so I was trying to figure out the present structure. I've made up the following "summary" and I suppose it ought to go on the Stat. Mech. page, but I don't know where. Any suggestions or modifications? Paul Reiser 21:32, 11 Dec 2004 (UTC)

General:

## What if quantum mechanics improves?

I wonder: what would statistical thermodynamics become if quantum mechanics turned out to be deterministic? Would it be more accurate? In other words, would the bridge be more resistant and better built? Bête spatio-temporelle —Preceding unsigned comment added by 193.54.238.42 (talk) 12:08, 8 March 2010 (UTC)

## Factorising Z: is it correct?

${\displaystyle Z=\sum _{i}\exp \left(-\beta (E_{ti}+E_{ci}+E_{ni}+E_{ei}+E_{ri}+E_{vi})\right)}$
${\displaystyle =\sum _{i}\exp \left(-\beta E_{ti}\right)\exp \left(-\beta E_{ci}\right)\exp \left(-\beta E_{ni}\right)\exp \left(-\beta E_{ei}\right)\exp \left(-\beta E_{ri}\right)\exp \left(-\beta E_{vi}\right)}$
${\displaystyle =Z_{t}Z_{c}Z_{n}Z_{e}Z_{r}Z_{v}\,}$

Is the sum of products equal to the product of sums, or am I missing something here? (Igny 02:55, 8 September 2005 (UTC))

If only it were that easy :-) It is generally false, unless all the energies are completely independent (completely uncoupled).
Consider an example:
Zf = Zf (pr(f1) + pr(f2)),
Zg = Zg (pr(g1) + pr(g2)).
(Zf)(Zg) = Zf (pr(f1) + pr(f2)) Zg (pr(g1) + pr(g2))
= Zf Zg (pr(f1)pr(g1) + pr(f1)pr(g2) + pr(f2)pr(g1) + pr(f2)pr(g2))
Does this equal Zfg? -- Only if the probabilities for f and g are completely independent, so pr(f1)pr(g1) = pr(f1,g1) etc; ie pr(f1|g1) = pr(f1), etc.
Assuming you can neglect correlations between f and g may or may not be a reasonable first guess. Jheald 22:29, 6 November 2005 (UTC)
Line added to article, to reflect this Jheald 22:46, 6 November 2005 (UTC)
You're right, this is linked to the idea of independent degrees of freedom. There are still quite a few cases where it works; obvious examples are the monatomic ideal gas and paramagnetic systems. Actually, any system that has N quadratic degrees of freedom can be reduced to a set of N or fewer independent quadratic degrees of freedom. For example, this is what happens with the diatomic ideal gas, and even with some quite complex systems like phonons in a solid. The degrees of freedom (physics and chemistry) article will explain this when I have finished with it. Actually, in all the examples I give here, the partition function breaks down into a product involving only individual independent dofs. ThorinMuglindir 23:02, 6 November 2005 (UTC)
Assuming the interactions really are quadratic, of course, so can be diagonalised; and also that we really are talking about equilibrium/maximum entropy distributions, so we can ignore any preparation effects. Jheald 23:13, 6 November 2005 (UTC)
The factorization does not work at all in liquids. As a general statement, one may separate treatments of separable systems, in which the factorization works, and non-separable systems, in which the factorization is utterly wrong.Phillies 20:36, 21 November 2006 (UTC)
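The point of this thread can be checked numerically. Below is a minimal sketch (the energy levels and the coupling constant J are invented for illustration): when the joint energy is a plain sum E_f + E_g, the joint partition function factorises exactly; adding any coupling term breaks the factorisation.

```python
import math

beta = 1.0
E_f = [0.0, 1.0]  # illustrative energy levels of subsystem f
E_g = [0.0, 2.0]  # illustrative energy levels of subsystem g

def Z(levels):
    """Partition function of a single subsystem."""
    return sum(math.exp(-beta * e) for e in levels)

# Uncoupled: E(i, j) = E_f[i] + E_g[j], so Z_joint = Z_f * Z_g exactly
Z_joint = sum(math.exp(-beta * (ef + eg)) for ef in E_f for eg in E_g)

# Coupled: add an invented interaction term J*i*j; factorisation now fails
J = 0.5
Z_coupled = sum(math.exp(-beta * (E_f[i] + E_g[j] + J * i * j))
                for i in range(2) for j in range(2))

print(Z_joint - Z(E_f) * Z(E_g))    # ~ 0: exact factorisation
print(Z_coupled - Z(E_f) * Z(E_g))  # nonzero: coupling spoils it
```

The uncoupled case is exactly the independence condition pr(f,g) = pr(f)pr(g) discussed above.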

All prior comments about this deeply flawed sub-section are quite correct, though much could be saved by simply introducing a heading "Application to Independent Subsystems" or better yet "Application to Simple Ideal Gases". Even then, there are many issues of pedagogy and rigor that must be addressed: 1) Where is the Z_N = Z(single)^N? 2) Where is the division by N!? Both of these are far more fundamental and important than the approximate decomposition of a molecule into independent internal degrees of freedom. 3) The mixing of classical integration results with quantum mechanically rigorous summations needs to be indicated and maybe even motivated. The vibrational results are summations for harmonic oscillators, and the rotational results are integrations for high-temperature rigid rotors. 4) Neither the Z_t nor the Z_c expressions are in any way correct for an ideal gas or any other system. Partition functions--even single-particle partition functions--are intrinsically dimensionless, as the original definition in the article clearly indicates. In actuality, the *correct* single-particle Z_t is the product of the article's Z_t and its Z_c. The article confuses the configurational INTEGRAL with a partition function. 5) What are all these theta's? There is no motivation for or definition of characteristic temperatures anywhere in the article. 6) If one is intent on introducing these various expressions for the Z's, then at least indicate the genuine triumphs of the whole undertaking: the Sackur-Tetrode equation for the third-law entropies of monatomic ideal gases, and the direct calculation of third-law entropies for stiff diatomics like NO or CO, both without any consideration of the character of the substance in condensed phases. Here are clear examples where statistical mechanics unequivocally trumps classical thermodynamics!

All in all, this article has a strong chemical smell (pun intended). Though myself a chemist, I have taught statistical mechanics in chemistry, chemical engineering, materials science and physics at both the undergraduate and graduate levels. Each course requires a very different presentation. (And remember J. W. Gibbs was the very first U.S. Ph.D. engineer!) No physicist would develop statistical mechanics in this fashion. I therefore suspect many of the earlier critiques were from physicists. A little more of their point of view would help a great deal.

One point that I often make in class is that quantum mechanics developed in a very real sense from statistical mechanics. Both Planck (black body radiation) and Einstein (solid heat capacities) had more faith in Boltzmann and Gibbs--however reluctantly in Planck's case--than they did in Newton. A good portion of the "old quantum mechanics" rested on preserving the viability of statistical mechanics, even at the cost of jettisoning ideas in classical mechanics. (Of course it didn't hurt that other successful early applications, like the photoelectric effect and the electronic structure of hydrogen, helped cement quantum ideas independently.) Sure these seminal statistical mechanical applications deserve mention here, particularly since they are also examples of results for independent subsystems.68.41.166.61 21:02, 14 January 2007 (UTC) For whatever reason, my previous post (68:41:166:61) did not show my name. Wmadden 21:06, 14 January 2007 (UTC)

## Article could do with *extensive* slimming

Since this article was first begun, there is now a whole *category* of articles on statistical mechanics.

Can I suggest therefore, that as much of the detail as possible is handed off to other articles, out of this one; leaving this article just as a general introductory tour/overview?

A much shorter, more focussed, less extensive, less duplicative article would be better. Jheald 23:05, 6 November 2005 (UTC)

The problem is that a lot of articles link here. Many probably reference some expression in one ensemble or another, or the concept of a statistical ensemble. Doing this would mean going through the links and sending them to the appropriate article. ThorinMuglindir 23:58, 6 November 2005 (UTC)

## Fundamental postulate

I am not entirely certain of this, but I wanted to add a paragraph to the "fundamental postulate" section which mentions the subtle difference between W and Ω :

In Boltzmann's original formulation for entropy (S = k log W), the quantity W is defined to be the number of microstates which are consistent with the macrostate. In the nomenclature of the day, W was referred to as the "number of complexions". This postulate is necessary as it provides the conditions under which the equivalence W = Ω holds.

What do you Wikipedians think? Good to add, or should be completely neglected? --HappyCamper 22:29, 24 December 2005 (UTC)

Good to add, but only as an historic footnote. 81.83.108.233 19:06, 6 January 2006 (UTC)

DEFINITELY NOT. Statistical thermodynamics is a branch of thermodynamics that tries to interpret some thermodynamic phenomena in a statistical manner. Statistical mechanics tries to interpret some phenomena in CONDENSED MATTER PHYSICS (NOT THERMODYNAMICS) in statistical ways.

This is not the fundamental postulate of Gibbs' statistical mechanics. I refer you to his book. In Gibbs' statistical mechanics, the canonical ensemble is primary, and all states of the system are not equally likely; instead, the log of the statistical weight is proportional to E/kT, where E is the energy of the state. Furthermore, Gibbs uses a different formula for the entropy, and is emphatic that you cannot refer to the number of microstates of a classical system, because the states are labelled by real variables, so the number of states is continuously (uncountably) infinite.Phillies 20:43, 21 November 2006 (UTC)

### From comments page, attributed and dated using history

The discussion of 'fundamental postulate' is incorrect. It refers only to the formulation of Boltzmann, and not to the very different formulation of Josiah Willard Gibbs. In understanding the difference between the Gibbs and Boltzmann approaches, you actually have to read Gibbs' book Elementary Principles in Statistical Mechanics, and not later summaries, most of which appear to pass through Paul and Tatiana Ehrenfest's 1912 article "The Conceptual Foundations of the Statistical Approach in Mechanics", now available from Dover Press as a mostly-English translation. I am reasonably sure that I am not the first to have noted that the Ehrenfest presentation of Gibbs' book does not do as well as might have been desired by Gibbs, but that needs to be researched.

A few issues are treated in my textbook Elementary Lectures in Statistical Mechanics (Springer-Verlag). In particular, in the actual book by Gibbs, which is a reasonably reliable source on what Gibbs wrote:

1) Gibbs used a different fundamental postulate, _not_ the principle of equal a priori probabilities, because Gibbs viewed the canonical ensemble as primary and the microcanonical ensemble as secondary. In modern notation, Gibbs viewed Wj = exp(- beta Ej) as fundamental.

2) The notion that statistical mechanics is only applicable to large systems is not found in Gibbs' book. Indeed, he deliberately compares his treatment with a treatment that he does not identify as Boltzmann's, showing the differences in the predictions of the two models.

3) Gibbs certainly does not speak of the H-Theorem. After all, in Gibbsian statistical mechanics the entropy is a constant of the motion.

Phillies 21:25, 17 November 2006

## Merge Statistical thermodynamics

• Yes - merge it into this article. PAR 03:32, 19 July 2006 (UTC)
• Concur - 04:13, 19 July 2006 (UTC)
• Disagree - If Britannica can have separate articles for each of these terms, I don’t see why Wikipedia can’t as well. There is much overlap, yet each term has enough peculiarity such to allow full books and textbooks to exist in their own standing. If you compare the table of contents in Gibbs’ Statistical Mechanics with the table of contents in Schrödinger’s Statistical Thermodynamics, you see that they have related but separate agendas. --Sadi Carnot 21:03, 19 July 2006 (UTC)
Comment: Sadi, I did as you suggested, but to me it looks like the two books are covering essentially the same subject (with differences in emphasis). I've never been able to discern any significant difference in the usage or meaning of the two terms, although "statistical mechanics" is far more common. I think (but am not sure) that the difference in usage is simply one of preference, not of substance. The OED has an entry for "statistical mechanics" (attributed to Gibbs) but none for "statistical thermodynamics". I can't read Britannica online. Could you summarize what they think the difference is? Nonsuch 21:34, 19 July 2006 (UTC)
More commentary -- Hill wrote one book called "Statistical Mechanics" and another called "Introduction to Statistical Thermodynamics". But in the introduction to the latter he says "Such an explanation falls rather within the province of statistical mechanics or statistical thermodynamics, terms which we regard in this book as synonymous". Nonsuch 21:45, 19 July 2006 (UTC)
• Comment - At the moment, the statistical thermodynamics article is much more gentle and introductory, for a more general reader; while the statistical mechanics article goes straight for the maths, and piles it on. I suspect that distinction is also foreshadowed in the titles: statistical thermodynamics (I think) is more what a generalist might call the subject, seeing it as part of thermodynamics as a whole; whereas (I suspect) people who would call it statistical mechanics are more likely to be specialists, emphasising it as a subject in its own right. It seems to me the articles at the moment perform rather different functions, and it might be useful to keep a very light article just motivating what the concept is, in parallel with a proper thorough heavy-duty treatment. Jheald 22:03, 19 July 2006 (UTC).
I'm not very keen on merging the articles either, but from my perspective, I've always thought of statistical thermodynamics as being a subdiscipline of statistical mechanics. --HappyCamper 22:06, 19 July 2006 (UTC)
As a graduate physics student, I was taught thermodynamics by Dr. Ta-You Wu. I remember he asked a thermodynamics question in class and I volunteered an answer that was essentially a statistical mechanics answer. He about ripped my head off - he explained that the two approaches to the same problem are entirely distinct. Thermodynamics as a theory is practically complete. (Einstein said thermodynamics "is the only physical theory of universal content that … will never be overthrown.") Statistical mechanics attempts to derive and explain the basic assumptions of thermodynamics, in order to produce a theory which explains thermodynamics, and yields results which extend beyond thermodynamics. I am not familiar with the term "statistical thermodynamics", but it sounds like, as mentioned above, that it might be a subdiscipline of statistical mechanics dedicated only to explaining thermodynamics. PAR 22:57, 19 July 2006 (UTC)
• Comment: I’m quite sure it would take a few weeks worth of reading to dig out exactly what the technical differences are for each term. For the moment, I would suggest putting see also headers at the top of each article, so that as time moves on editors and readers can add what they know to each, based on their unique knowledges. Also, to note, a Google search shows about 24 million hits for SM and 9 million hits for ST. Lastly, if this helps, here are introductions to the ‘02 entries from Britannica for each term:
Did you remember to quote the Google searches? I make it 10 million SM, 350,000 ST, a 30:1 ratio. Nonsuch 00:20, 20 July 2006 (UTC)

### Statistical mechanics

branch of physics that combines the principles and procedures of statistics with the laws of both classical and quantum mechanics. It aims to predict and explain the measurable properties of macroscopic systems on the basis of the properties and behaviour of the microscopic constituents of those systems. Statistical mechanics, for example, interprets thermal energy as the energy of atomic particles in disordered states and temperature as a quantitative measure of how energy is shared among such particles. Statistical mechanics draws heavily on the laws of probability, so that it does not concentrate on the behaviour of every individual particle in a macroscopic substance but on the average behaviour of a large number of particles of the same kind.

The mathematical structure of statistical mechanics was established by the U.S. physicist J. Willard Gibbs in his book Elementary Principles in Statistical Mechanics (1902), but two earlier physicists, James Clerk Maxwell of Great Britain and Ludwig E. Boltzmann of Austria, are generally credited with having developed the fundamental principles of the field with their work on thermodynamics. Over the years the methods of statistical mechanics have been applied to such phenomena as Brownian motion (i.e., the random movement of minute particles suspended in a liquid or gas) and electric conduction in solids. They also have been used in relating computer simulations of molecular dynamics to the properties of a wide range of fluids and solids.

### Statistical thermodynamics

Thermodynamics is the study of the various properties of macroscopic systems that are in equilibrium and, particularly, the relations between these various properties. Having been developed in the 1800s before the atomic theory of matter was generally accepted, classical thermodynamics is not based on any atomic or molecular theory, and its results are independent of any atomic or molecular models. This character of classical thermodynamics is both a strength and a weakness: classical thermodynamic results will never need to be modified as scientific knowledge of atomic and molecular structure improves or changes, but classical thermodynamics gives no insight into the physical properties or behaviour of physical systems at the molecular level.

With the development of atomic and molecular theories in the late 1800s and early 1900s, thermodynamics was given a molecular interpretation. This field is called statistical thermodynamics, because it relates average values of molecular properties to macroscopic thermodynamic properties such as temperature and pressure. The goal of statistical thermodynamics is to understand and to interpret the measurable macroscopic properties of materials in terms of the properties of their constituent particles and the interactions between them. Statistical thermodynamics can thus be thought of as a bridge between the macroscopic and the microscopic properties of systems. It provides a molecular interpretation of thermodynamic quantities such as work, heat, and entropy.

Research in statistical thermodynamics varies from mathematically sophisticated discussions of general theories to semiempirical calculations involving simple, but nevertheless useful, molecular models. An example of the first type of research is the investigation of the question of whether statistical thermodynamics, as it is formulated today, is capable of predicting the existence of a first-order phase transition. General questions like this are by their nature mathematically involved and require rigorous methods. For many scientists, however, statistical thermodynamics merely serves as a tool with which to calculate the properties of physical systems of interest.

From these entries, I would gauge that Britannica views SM to be a sub-branch of physics and ST to be sub-branch of thermodynamics. As an example of a similar sort of debate on topic semantics you might want to see: Talk:Physical chemistry#Chemical physics vs. Physical chemistry. I hope this helps?--Sadi Carnot 00:14, 20 July 2006 (UTC)
There doesn’t seem to be a strong consensus to merge these articles. I bought another book yesterday on statistical thermodynamics (1965) by Leonard K. Nash, so I will try to build up that article and simply put “otheruses” links at the top of each article to connect the two. I’m no expert on either of these topics, but there are dozens of books on each at the book store. As an example, it took me some time to figure out what the difference is between thermochemistry and chemical thermodynamics; for some time, “chemical thermodynamics” was a redirect to “thermochemistry” although they are technically not the same subject. I will proceed to make this change:--Sadi Carnot 15:14, 2 August 2006 (UTC)

### Statistical Physics

In addition to Statistical Mechanics (conflict of interest confession: my textbook is Elementary Lectures in Statistical Mechanics) and Statistical Thermodynamics, there is also a large collection of texts that title themselves 'statistical physics', perhaps with modifiers. They appear to be a bit different in their coverage than stat mech or stat thermo texts. I can suggest several characteristic features of graduate-level statistical physics texts, not shared by all of them. Note that I am describing 'characteristic' features, not all found in all books, and this is not meant to be a pejorative list (though you may draw conclusions as to why I wrote my own text).

Limited treatment of interacting systems other than Ising models and renormalization group. Treatment of phase diagrams--or stat mech applied to nondilute systems--somewhat heavily focused on the approach to the critical point with less attention to liquids or solutions. Relatively strong treatment of au courant topics in research. More focus on Boltzmann than Gibbs, including the H theorem. Less tendency to see chemical equilibria, charging potentials, or topics studied by J.G. Kirkwood, who after all was the leading light of American statistical mechanics for something like the second quarter or a bit later of the 20th century. There is also a typical distinction between intended physics and chemistry audiences here: Physics audiences are more likely to hear about linear response theory; chemical audiences are more likely to hear about Mori-Zwanzig theory.

There is some tendency in both statistical physics and chemical thermodynamics texts to mix thermodynamic and statistico-mechanical arguments in reaching a conclusion. George Phillies 130.215.96.64 21:23, 9 January 2007 (UTC)

With a slight risk of restarting an old battle, I've posted some suggestions about a general merger of this topic at Wikipedia_talk:WikiProject_Physics#Statistical_physics_.2F_mechanics_.2F_thermodynamics. Djr32 (talk) 13:19, 22 November 2008 (UTC)

## Helmholtz free energy - F or A?

Is there a discussion somewhere about which letter to use for Helmholtz free energy? --HappyCamper 19:51, 23 July 2006 (UTC)

See Helmholtz free energy. WP is following the current thinking from IUPAC to the extent of using the letter A; but not to the extent of deleting the word "free" from the phrase "free energy". Jheald 21:35, 23 July 2006 (UTC)
Hmm...because in this article, F is already being used. It does not seem to be used consistently on Wikipedia at least... --HappyCamper 23:41, 23 July 2006 (UTC)
Use of F in this article is an aberration. Generally the other articles have (now) been standardised to IUPAC notation. Persistence of the word "free" is partly a hangover from when the original articles were created and partly due to tradition / conservatism / lack of awareness (I guess) among editors and contributors. In my view we _should_ be consistent, and retain "free" in discussions of "free energy" concepts, but not when referring to the variables "Gibbs energy" and "Helmholtz energy". — DIV (128.250.204.118 10:49, 29 December 2006 (UTC))

## FYI

I am fairly sure that the standard format rule in Wikipedia is to not link headers in the main article. I would suggest that someone correct this.--Sadi Carnot 17:12, 2 August 2006 (UTC)

## "List of notable textbooks in statistical mechanics" proposed for deletion

People following this page might like to express their views in the AfD, one way or the other. Jheald 03:55, 8 October 2006 (UTC).

## What is SiSj

The random walker section of the article references a <SiSj> in one of the equations, but what this is is not explained. 69.143.208.183 22:32, 3 December 2006 (UTC)

Looks like it's stating that the S_i's are completely not correlated, but I'm not sure what S is. --HappyCamper 14:15, 4 December 2006 (UTC)
I believe S_i is the ith step. Fephisto 03:50, 13 December 2006 (UTC)
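Assuming S_i is indeed the i-th step of the walk, taking values ±1 with equal probability (an assumption based on the reading above, since the article's definition is unclear), the property <S_i S_j> = δ_ij for uncorrelated steps is easy to check by simulation:

```python
import random

random.seed(0)
n_walks, n_steps = 20000, 10

# Estimate <S_i S_j> for i != j (here i=0, j=1) and <S_i^2> (i=j)
cross, same = 0.0, 0.0
for _ in range(n_walks):
    steps = [random.choice((-1, 1)) for _ in range(n_steps)]
    cross += steps[0] * steps[1]  # product of two distinct steps
    same += steps[0] * steps[0]   # square of one step

print(cross / n_walks)  # ~ 0: distinct steps are uncorrelated
print(same / n_walks)   # exactly 1: each step squared is 1
```

This is the step that makes the mean-square displacement grow linearly with the number of steps: all the i ≠ j cross terms average away.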

## Isn't this an old-fashioned view of statistical mechanics?

Maximum Entropy and far-off-equilibrium stat mech, starting with the work of Jaynes, did away with the need to assume anything like the postulate of equal a priori probability, and put stat mech on a firm footing as a method of *inference* about physical systems.

It seems as though this article reflects a 1950s view, as presented in (e.g.) the Huang textbook. Does it not need updating?

Bkalafut 19:16, 11 April 2007 (UTC)

I would agree that this treatment is somewhat out of date, but there is frankly no consensus on this question. There are a lot of people who believe that the traditional expression is still the valid one. There are others who believe that stat mech is embodied by the ergodic hypothesis. Zathraszathras 19:16, 17 May 2007 (UTC)

## Information theory section appears incorrect

I believe the "Information function" equation and explanation are incorrect. The equation is closer to Information Entropy (${\displaystyle H}$) than Information Gain (${\displaystyle I}$), but is still not quite right. It should be:

${\displaystyle H(X)=-\sum _{x\in \mathbb {X} }p(x)\log _{2}p(x)}$.

(See Information theory.) The explanation is slightly off as well. When all the states have equal probability, the information entropy is maximal. Information entropy is a measure of the uncertainty associated with a random variable, and specifically it describes the shortest average message length, in bits, that can be sent to communicate the true value of the random variable to a recipient. --- Jeff

Jwmillerusa 16:55, 10 June 2007 (UTC)
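Jeff's formula is easy to sanity-check numerically: for a fixed number of outcomes, the uniform distribution does maximise H. A small sketch (the distributions are chosen arbitrarily for illustration):

```python
import math

def entropy(p):
    """Shannon entropy in bits: H(X) = -sum over x of p(x) * log2 p(x)."""
    return -sum(px * math.log2(px) for px in p if px > 0)

uniform = [0.25] * 4             # maximal uncertainty over 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]    # one outcome much more likely

print(entropy(uniform))  # 2.0 bits = log2(4), the maximum
print(entropy(skewed))   # strictly less than 2.0 bits
```

The 2 bits for the uniform case matches the coding interpretation mentioned above: four equally likely outcomes need two bits per message on average.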

## Mechanics or thermodynamics?

This article is a mess at the moment. There seem to be large sections of the article where, within each section, one of the terms ("statistical mechanics" or "statistical thermodynamics") is used consistently, but the article overall seems to keep switching between the two terms with no explanation. This has presumably been the case ever since the explanation in the first sentence was removed. (For anyone else who's new here, this is the first edit to this talk page since statistical mechanics and statistical thermodynamics were merged in Dec 2008.) Brian Jason Drake 04:58, 3 May 2009 (UTC)

The terms are used interchangeably, so I'm not sure it's such a bad thing for the article to reflect that usage - however, I think it was a bad idea to remove the explanation in the first sentence that you mention (I think I added that explanation in the first place...) I'm not sure that putting it back in would stop the article being a mess - I think it needs quite a bit of work! Djr32 (talk) 10:37, 3 May 2009 (UTC)

## Partition functions - multiple issues

1. Wrong equation for the vibrational partition function

The equation appears incorrect: the sign in the exponential in the denominator is inconsistent with the sum over harmonic-oscillator levels. My derivation indicates that the proper equation is:

${\displaystyle Z_{v}=\prod _{j}{\frac {e^{-\theta _{vj}/2T}}{1-e^{-\theta _{vj}/T}}}}$

Note the zero-point factor ${\displaystyle e^{-\theta _{vj}/2T}}$ in the numerator and the minus sign in the exponent in the denominator. Here is my derivation. We assume that each vibrational mode is independent.

{\displaystyle {\begin{aligned}E_{n}&=\hbar \omega \left(n+{\frac {1}{2}}\right)\\\theta _{v}&={\frac {\hbar \omega }{k}}\\Z_{vj}&=\sum _{n=0}^{\infty }e^{-\theta _{v}\left(n+{\frac {1}{2}}\right)/T}=\sum _{n=0}^{\infty }e^{-\theta _{v}/2T}e^{-\theta _{v}n/T}=e^{-\theta _{v}/2T}\sum _{n=0}^{\infty }e^{-{\frac {\theta _{v}}{T}}n}\end{aligned}}}

The geometric series gives ${\displaystyle \sum _{n=0}^{\infty }e^{-\alpha n}={\frac {1}{1-e^{-\alpha }}}}$ for ${\displaystyle \alpha >0}$, so

{\displaystyle {\begin{aligned}Z_{vj}&={\frac {e^{-\theta _{v}/2T}}{1-e^{-\theta _{v}/T}}}\\Z_{v}&=\prod _{j}Z_{vj}=\prod _{j}{\frac {e^{-\theta _{vj}/2T}}{1-e^{-\theta _{vj}/T}}}\end{aligned}}}
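The geometric-series step can be cross-checked numerically, comparing the standard closed form e^(-θ_v/2T) / (1 - e^(-θ_v/T)) against a directly truncated sum over levels (the θ_v and T values below are illustrative, not taken from the article):

```python
import math

def Zv_closed(theta, T):
    """Closed form for one harmonic mode: e^(-theta/2T) / (1 - e^(-theta/T))."""
    x = theta / T
    return math.exp(-x / 2) / (1 - math.exp(-x))

def Zv_sum(theta, T, n_max=2000):
    """Direct truncated sum over levels E_n = k * theta * (n + 1/2)."""
    x = theta / T
    return sum(math.exp(-x * (n + 0.5)) for n in range(n_max))

theta, T = 3000.0, 300.0  # illustrative values with theta_v >> T
print(Zv_closed(theta, T))
print(Zv_sum(theta, T))   # agrees with the closed form
```

Note that Z_v < 1 at low temperature is expected once the zero-point factor is kept; the entropy stays non-negative because it involves the temperature derivative of ln Z as well.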

2. Wrong equation for the linear rotational partition function

The correct equation should be ${\displaystyle {\frac {T}{\theta _{r}}}}$. The current equation is obviously incorrect by dimensional analysis alone. I don't know what ${\displaystyle \sigma }$ means; maybe it should be in the denominator as well.

3. Those are molecular partition functions

The functions in the table are molecular partition functions ${\displaystyle q}$, not partition functions ${\displaystyle Z}$. The two are related via this equation:

${\displaystyle Z={\frac {q^{N}}{N!}}}$

When calculating thermodynamic properties, we usually use ${\displaystyle \ln Z}$ and combine the terms resulting from exponentiating by ${\displaystyle N}$ and dividing by ${\displaystyle N!}$ with the translational partition function, but that doesn't mean we can ignore the difference altogether.
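The relation Z = q^N / N! can be sketched numerically (N and q below are invented illustrative values; `math.lgamma(N + 1)` gives ln N! exactly, and Stirling's approximation ln N! ≈ N ln N - N recovers it closely):

```python
import math

N = 1000       # number of molecules (illustrative)
q = 1.0e6      # molecular partition function (illustrative value)

# ln Z = N ln q - ln N!  (exact, via the log-gamma function)
lnZ = N * math.log(q) - math.lgamma(N + 1)

# Stirling's approximation for ln N!
lnZ_stirling = N * math.log(q) - (N * math.log(N) - N)

print(lnZ)
print(lnZ_stirling)  # agrees to a small relative error
```

This is the combination step mentioned above: after taking logs, the N! correction is just an additive term, conventionally folded into the translational part.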

4. Symbols not explained

Symbols used in the equations are not explained anywhere. The meaning of, for example, ${\displaystyle \theta _{v}}$ is not immediately obvious to non-experts.

I would recommend to create a separate article, called molecular partition function, and put those equations along with derivations in that article. --Tweenk (talk) 14:45, 14 June 2010 (UTC)

## Misnomer?

Most sciences create models of reality. This science applies probability theory. Does this mean that it creates a probabilistic model? Does this mean that it does not create a statistical model? In my perception, rather well supported by the Wikipedia articles, probability is a class of models that can be used to represent our views of the world, while statistics is a class of methods to process data (of observations). In the case of statistical mechanics, I think the intention was to characterize the model rather than the processing of observational data. --Ettrig (talk) 17:00, 27 January 2011 (UTC)

You wouldn't be the first to suggest that "probabilistic mechanics" might be a more apt term, but "statistical mechanics" is the established one and it's too late to change well-established usage. Qwfp (talk) 23:20, 27 January 2011 (UTC)
I am (of course) not suggesting that we change the name. But maybe the article should explain this. --Ettrig (talk) 11:06, 28 January 2011 (UTC)
I agree, but we need a reliable source.... and by a stroke of googluck, i've just found one : the first paragraph of this book chapter. I'll attempt to add something to the article. Qwfp (talk) 12:10, 28 January 2011 (UTC)

## Molecular partition functions - multiple issues

I was (re-)writing the wikipedia article on the Rotational partition function and had to change notation to match it with the total partition function.

Now by accident I found that this main article also contains some lines on molecular partition functions. The section, starting with "It is often useful to consider...", doesn't even have a title here. The post two posts above mine, by user "Tweenk", also made some remarks on a couple of problems. A concern is that there are now several notations for the partition functions in use: ${\displaystyle Z}$, ${\displaystyle \zeta }$, ${\displaystyle q}$, ... (and I'm pretty sure there is a Wikipedia page which calls it ${\displaystyle Q}$ too.)

I suggest taking the information here, in the two stubs Translational partition function and Vibrational partition function, as well as in Rotational partition function (now the biggest one), and merging it into one article, "molecular partition function". Then link it from the main article. This way one doesn't have to give, for example, the "${\displaystyle E=\sum E^{i}}$" explanation four times across all these articles. 129.247.247.239 (talk) 09:15, 13 January 2012 (UTC)
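A toy numerical check of the "${\displaystyle E=\sum E^{i}}$" point (hypothetical level spacings and ${\displaystyle \beta =1}$, none of which come from the articles in question): when the energy separates into independent mode contributions, the partition function factorizes into per-mode factors, which is exactly the derivation those articles each repeat.

```python
import itertools
import math

beta = 1.0
E_vib = [0.0, 1.0, 2.0]  # hypothetical vibrational levels
E_rot = [0.0, 0.5, 1.5]  # hypothetical rotational levels

# Direct sum over all combined states, with E = E_vib + E_rot
Z_total = sum(math.exp(-beta * (ev + er))
              for ev, er in itertools.product(E_vib, E_rot))

# Factorized form: product of per-mode partition functions
Z_v = sum(math.exp(-beta * e) for e in E_vib)
Z_r = sum(math.exp(-beta * e) for e in E_rot)

print(Z_total, Z_v * Z_r)  # the two agree up to rounding
```

Merging the derivation into one article would let the three mode-specific pages simply cite this factorization once.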

## Degrees of freedom

Should the ‘Degrees of freedom’ link in the ‘Fundamentals’ section, which currently points to Degrees of freedom (statistics), actually point to Degrees of freedom (physics and chemistry), since the latter mentions rotations and vibrations, as does the present article? Puzl bustr (talk) 16:20, 23 February 2012 (UTC)

Done. Melcombe (talk) 15:37, 25 February 2012 (UTC)

## Tables and meaning "V o B"

If it's OK with peeps, I changed the table:

Summary of ensembles in statistical mechanics:

| Ensembles: | Microcanonical | Canonical | Grand canonical |
| --- | --- | --- | --- |
| Constant variables | E, N, V o B | T, N, V o B | T, μ, V o B |
| Microscopic features | Number of microstates ${\displaystyle \Omega }$ | Canonical partition function ${\displaystyle Z=\sum _{k}e^{-\beta E_{k}}}$ | Grand canonical partition function ${\displaystyle \Xi \ =\ \sum _{k}e^{-\beta (E_{k}-\mu N_{k})}}$ |
| Macroscopic function | ${\displaystyle S\ =\ k_{B}\ \ln \Omega }$ | ${\displaystyle F\ =\ -\ k_{B}T\ \ln Z}$ | ${\displaystyle \ F-\mu N=-\ k_{B}T\ln \Xi }$ |

which is so difficult to edit - the code is:

{| class="WSerieV" tableborder="1" cellspacing="0" cellpadding="8" style="border:1px solid; padding: 0.3em; clear: right;margin: 0px 0px 5px 1em; border:1px solid #999; border-bottom:2px solid; border-right-width: 2px; text-align:center;line-height: 1.2em; font-size: 95%"
| rowspan="2"| Summary of ensembles in<br />statistical mechanics
| colspan="3" bgcolor="#C8D8FF" style="border-left:1px solid; border-right:1px solid; border-top:1px solid;" | '''Ensembles:'''
|-----
| bgcolor="#C8D8FF" style="border-left:1px solid; border-top:1px solid;" | [[Microcanonical ensemble|Microcanonical]]
| bgcolor="#C8D8FF" style="border-left:1px solid; border-top:1px solid;" | [[Canonical ensemble|Canonical]]
| bgcolor="#C8D8FF" style="border-left:1px solid; border-right:1px solid; border-top:1px solid;" | [[Grand canonical ensemble|Grand canonical]]
|-----
| style="border-left:1px solid; border-top:1px solid; background:#CCFFCC;" | Constant variables
| style="border-left:1px solid; border-top:1px solid;" | E, N, V o B
| style="border-left:1px solid; border-top:1px solid;" | T, N, V o B
| style="border-left:1px solid; border-top:1px solid; border-right:1px solid;" | T, μ, V o B
|-----
| style="border-left:1px solid; border-top:1px solid; background:#CCFFCC;" | Microscopic features
| style="border-left:1px solid; border-top:1px solid;" | <small> Number of [[Microstate (statistical mechanics)|microstates]] </small><br /><br />$\Omega$
| style="border-left:1px solid; border-top:1px solid;" | <small> Canonical partition function<br /><br />$Z = \sum_k e^{-\beta E_k}$
| style="border-left:1px solid; border-top:1px solid; border-right:1px solid;" | <small> Grand canonical partition function<br /><br />$\Xi \ = \ \sum_k e^{ -\beta (E_k - \mu N_k ) }$
|-----
| style="border-left:1px solid; border-top:1px solid;border-bottom:1px solid; background:#CCFFCC;" | Macroscopic function
| style="border-left:1px solid; border-top:1px solid; border-bottom:1px solid;" | $S \ = \ k_B \ \ln \Omega$
| style="border-left:1px solid; border-top:1px solid; border-bottom:1px solid;" | $F \ = \ - \ k_B T \ \ln Z$
| style="border-top:1px solid; border-left:1px solid; border-right:1px solid; border-bottom:1px solid;" | $\ F - \mu N =- \ k_B T \ln \Xi$
|}

to a simpler and more uniform appearance [nicer colours (which are non-essential) can be added back later; the current ones look very drab and dull]:

Summary of ensembles:

| | Microcanonical | Canonical | Grand canonical |
| --- | --- | --- | --- |
| Constant variables | E, N, V o B | T, N, V o B | T, μ, V o B |
| Microscopic features | Number of microstates ${\displaystyle \Omega }$ | Canonical partition function ${\displaystyle Z=\sum _{k}e^{-\beta E_{k}}}$ | Grand canonical partition function ${\displaystyle \Xi =\sum _{k}e^{-\beta (E_{k}-\mu N_{k})}}$ |
| Macroscopic function | ${\displaystyle S=k_{B}\ln \Omega }$ | ${\displaystyle F=-k_{B}T\ln Z}$ | ${\displaystyle F-\mu N=-k_{B}T\ln \Xi }$ |
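For concreteness, the microcanonical and canonical columns of the table above can be sketched numerically. This is a toy illustration with made-up energy levels and units in which ${\displaystyle k_{B}=1}$; nothing here comes from the article itself.

```python
import math

def entropy_S(omega):
    # Microcanonical column: S = k_B ln(Omega), with k_B = 1
    return math.log(omega)

def helmholtz_F(energies, kT):
    # Canonical column: Z = sum_k exp(-E_k / kT), F = -k_B T ln Z
    Z = sum(math.exp(-E / kT) for E in energies)
    return -kT * math.log(Z)

# Toy two-level system with energies 0 and 1
print(entropy_S(2))                     # S for Omega = 2 microstates
print(helmholtz_F([0.0, 1.0], kT=1.0))  # F at kT = 1
```

At low temperature ${\displaystyle F}$ approaches the ground-state energy, as expected from ${\displaystyle F=-k_{B}T\ln Z}$.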

which is far easier to edit:

{| class="wikitable"
|-
! Summary of ensembles
! colspan="3"| Ensembles
|-
! Header text
! [[Microcanonical ensemble|Microcanonical]]
! [[Canonical ensemble|Canonical]]
! [[Grand canonical ensemble|Grand canonical]]
|-
! Constant variables
| E, N, V o B
| T, N, V o B
| T, μ, V o B
|-
! Microscopic features
| <div class="plainlist">
*Number of [[Microstate (statistical mechanics)|microstates]]
*<center>$\Omega$</center>
</div>
| <div class="plainlist">
*Canonical partition function
*<center>$Z = \sum_k e^{-\beta E_k}$</center>
</div>
| <div class="plainlist">
*Grand canonical partition function
*<center>$\Xi = \sum_k e^{ -\beta (E_k - \mu N_k ) }$</center>
</div>
|-
! Macroscopic function
| $S = k_B \ln \Omega$
| $F = - k_B T \ln Z$
| $F - \mu N =- k_B T \ln \Xi$
|-
|}

Please see WP:Accessibility#Tables. I have no clue what "V o B" means either. Does anyone here know?

For the other tables - I have no clue why there is the colour syntax bgcolor="white" for every row when the colours are equal to, or not far off, the default colours; for example:

Nuclear: ${\displaystyle Z_{n}=1\qquad (T<10^{8}K)}$
Electronic: ${\displaystyle Z_{e}=W_{0}e^{kTD_{e}}+W_{1}e^{-\theta _{e1}/T}+\cdots }$
Vibrational: ${\displaystyle Z_{v}=\prod _{j}{\frac {e^{-\theta _{vj}/2T}}{e^{\theta _{vj}/T}-1}}}$
Rotational (linear): ${\displaystyle Z_{r}={\frac {T}{\sigma \theta _{r}}}}$
Rotational (non-linear): ${\displaystyle Z_{r}={\frac {1}{\sigma }}{\sqrt {\frac {{\pi }T^{3}}{\theta _{A}\theta _{B}\theta _{C}}}}}$
Translational: ${\displaystyle Z_{t}={\frac {(2\pi mkT)^{3/2}}{h^{3}}}}$
Configurational (ideal gas): ${\displaystyle Z_{c}=V\,}$

which is

:{| class="wikitable"
|-
! style="text-align: left" | Nuclear
| bgcolor="white" | $Z_n = 1 \qquad (T < 10^8 K)$
|-
! style="text-align: left" | Electronic
| bgcolor="white" | $Z_e = W_0 e^{kT D_e} + W_1 e^{-\theta_{e1}/T} + \cdots$
|-
! style="text-align: left" | Vibrational
| bgcolor="white" | $Z_v = \prod_j \frac{e^{-\theta_{vj} / 2T}}{e^{\theta_{vj} / T} - 1}$
|-
! style="text-align: left" | Rotational (linear)
| bgcolor="white" | $Z_r = \frac{T}{\sigma\theta_r}$
|-
! style="text-align: left" | Rotational (non-linear)
| bgcolor="white" | $Z_r = \frac{1}{\sigma}\sqrt{\frac{{\pi}T^3}{\theta_A \theta_B \theta_C}}$
|-
! style="text-align: left" | Translational
| bgcolor="white" | $Z_t = \frac{(2 \pi mkT)^{3/2}}{h^3}$
|-
! style="text-align: left" | Configurational (ideal gas)
| bgcolor="white" | $Z_c = V\,$
|-
|}

... I just removed those bits, and rewrote as:

{| class="wikitable"
|-
! style="text-align: left" | Nuclear
| $Z_n = 1 \qquad (T < 10^8 K)$
|-
! style="text-align: left" | Electronic
| $Z_e = W_0 e^{kT D_e} + W_1 e^{-\theta_{e1}/T} + \cdots$
|-
! style="text-align: left" | Vibrational
| $Z_v = \prod_j \frac{e^{-\theta_{vj} / 2T}}{e^{\theta_{vj} / T} - 1}$
|-
! style="text-align: left" | Rotational (linear)
| $Z_r = \frac{T}{\sigma\theta_r}$
|-
! style="text-align: left" | Rotational (non-linear)
| $Z_r = \frac{1}{\sigma}\sqrt{\frac{{\pi}T^3}{\theta_A \theta_B \theta_C}}$
|-
! style="text-align: left" | Translational
| $Z_t = \frac{(2 \pi mkT)^{3/2}}{h^3}$
|-
! style="text-align: left" | Configurational (ideal gas)
| $Z_c = V\,$
|-
|}

which appears as

Nuclear: ${\displaystyle Z_{n}=1\qquad (T<10^{8}K)}$
Electronic: ${\displaystyle Z_{e}=W_{0}e^{kTD_{e}}+W_{1}e^{-\theta _{e1}/T}+\cdots }$
Vibrational: ${\displaystyle Z_{v}=\prod _{j}{\frac {e^{-\theta _{vj}/2T}}{e^{\theta _{vj}/T}-1}}}$
Rotational (linear): ${\displaystyle Z_{r}={\frac {T}{\sigma \theta _{r}}}}$
Rotational (non-linear): ${\displaystyle Z_{r}={\frac {1}{\sigma }}{\sqrt {\frac {{\pi }T^{3}}{\theta _{A}\theta _{B}\theta _{C}}}}}$
Translational: ${\displaystyle Z_{t}={\frac {(2\pi mkT)^{3/2}}{h^{3}}}}$
Configurational (ideal gas): ${\displaystyle Z_{c}=V\,}$

with identical appearance, but less code!

In addition, I cleaned up the article. 08:27, 3 June 2012 (UTC)

For now, "V o B" will be changed to "V" [what is "o B" supposed to mean? I can't think of anything, and it isn't even mentioned in the article (if it is, then it's certainly easy to miss...)]. Also the row heading "Constant variables" will sound very strange to readers; it should be "Variables (suppressed constant for ensemble)". If someone reverts these next changes then please explain, and not just revert... 08:40, 3 June 2012 (UTC)
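As a sanity check on the translational entry ${\displaystyle Z_{t}=(2\pi mkT)^{3/2}/h^{3}}$ in the tables above, here is a quick numerical evaluation for argon at 300 K (the choice of gas and temperature is mine; the constants are standard SI values):

```python
import math

# Standard SI constants
k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J s
amu = 1.66053907e-27  # atomic mass unit, kg

def Z_t_per_volume(mass_kg, T):
    # Translational partition function per unit volume,
    # Z_t = (2 pi m k T)**(3/2) / h**3; the configurational
    # factor Z_c = V supplies the volume
    return (2.0 * math.pi * mass_kg * k_B * T) ** 1.5 / h ** 3

# Argon (~39.95 amu) at 300 K: roughly 2.5e32 per cubic metre,
# i.e. an enormous number of thermally accessible translational states
print(Z_t_per_volume(39.95 * amu, 300.0))
```

This is the inverse cube of the thermal de Broglie wavelength, which is why the translational factor dwarfs the rotational and vibrational ones for a gas at room temperature.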

## Split fundamental assumption?

There seem to be umpteen different ways of stating the fundamental assumption, each with unique interpretations. It's probably an article's worth of content all to itself. Of course, an overall introduction/summary should stay on this page. That's my opinion, anyway. Teply (talk) 06:41, 10 July 2012 (UTC)

I doubt there are umpteen different ways of stating the fundamental postulate. If there are, then by all means start an appropriate article and state them. You don't need a split tag to do that. As the article stands, there is no need to split, therefore I am removing the tag. Op47 (talk) 13:29, 14 September 2012 (UTC)

## The original "statistical mechanics" from over 100 years ago: not just for thermodynamics.

For anyone interested in statistical mechanics, Gibbs' book "Elementary principles" is worth reading, and besides comprehension there is no obstacle as it is freely available. I want to point out some things mentioned in the preface that were a bit surprising to me:

• When Gibbs talks about statistical mechanics, he simply refers to the mechanics (motion) of a system where the positions and velocities are not exactly known, but are rather described by a probability distribution (p. viii). In other words, if we start out with an arbitrary probability cloud, or statistical ensemble, in phase space, how does that cloud evolve over time? (In today's quantum view, one would ask how does an arbitrary density matrix evolve over time)
• Thermodynamics is a neat application of statistical mechanics (and thermodynamics inspired it), but actually statistical mechanics is far more general and worth studying in its own right. (p. viii) And, besides, even for thermodynamics, statistical mechanics lets us go beyond classical thermodynamics to talk not just about equilibrium macroscopic systems, but also equilibrium microscopic systems with any number of degrees of freedom.
• The first chapter derives results related to all statistical ensembles, even to those which are not in any sort of equilibrium. This general type of mechanics (encompassing equilibrium and non-equilibrium) is what Gibbs sees as the core of statistical mechanics, and the fundamental equation as he sees it is the Liouville equation.
• Only around chapter 4 does he finally begin to apply statistical mechanics to thermodynamics.
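For reference, the Liouville equation that Gibbs treats as the fundamental equation can be written in Poisson-bracket form (modern notation, not Gibbs's original):

```latex
\frac{\partial \rho}{\partial t} = \{H, \rho\}
  = \sum_{i=1}^{n} \left(
      \frac{\partial H}{\partial q_i}\frac{\partial \rho}{\partial p_i}
    - \frac{\partial H}{\partial p_i}\frac{\partial \rho}{\partial q_i}
    \right)
```

where ${\displaystyle \rho (q,p,t)}$ is the probability density over phase space; equilibrium ensembles are simply the stationary solutions ${\displaystyle \partial \rho /\partial t=0}$.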

At the moment this wikipedia article discusses exclusively the thermodynamic or equilibrium applications of statistical mechanics. Is this an oversight, or has the modern definition of statistical mechanics become restricted to equilibrium? --Nanite (talk) 08:28, 15 August 2013 (UTC)

You are correct. A more general approach would be an improvement for the article. Dauto (talk) 15:21, 15 August 2013 (UTC)
Ok, phew. I'll let this talk topic stew for a bit more in case anyone else has some insights. I was reading around a bit more and I realized that the equilibrium side ought just to be called "equilibrium statistical mechanics". I guess that somewhere along the way the word equilibrium got dropped, probably because most studies of statistical mechanics focus exclusively on the equilibrium side. So the easy fix would be to substitute the word equilibrium back in wherever necessary, and perhaps rename this article to equilibrium statistical mechanics, making room for a more general statistical mechanics article. However, I don't quite feel up to the task, knowledge-wise, of writing the more general article. --Nanite (talk) 20:48, 15 August 2013 (UTC)

## Assessment comment

The comment(s) below were originally left at Talk:Statistical mechanics/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

 It's only "high" in maths because it's a fairly small but much-used branch, especially in Physics. It's only "start" in maths because a non-mathematician would be left pretty clueless. Tompw 11:23, 5 October 2006 (UTC)

Last edited at 15:21, 14 April 2007 (UTC). Substituted at 06:57, 30 April 2016 (UTC)