

May 3, 2007

Boltzmann Entropy

This semester, I’ve been teaching a Physics for non-Science majors (mostly Business School students) class.

Towards the end of the semester, we turned to Thermodynamics and, in particular, the subject of Entropy. The textbook had a discussion of ideal gases and of heat engines and whatnot. But, somewhere along the line, they made a totally mysterious leap to Boltzmann’s definition of Entropy. As important as Boltzmann’s insight is, it was presented in a fashion totally disconnected from Thermodynamics, or anything else that came before.

So, equipped with the Ideal Gas Law, and a little baby kinetic theory, I decided to see if I could present the argument leading to Boltzmann’s definition. I think I mostly succeeded. Herewith, a somewhat fancied-up version of the argument.

We start with Clausius’s definition1 of the entropy

(1)  $Q = T\,dS$

the First Law of Thermodynamics

(2)  $dU = Q - W$

(where $W = p\,dV$) and the Ideal Gas Law

(3)  $pV = NkT = \frac{1}{\alpha}U$

where $U$ is the internal energy of the gas and $\alpha = 3/2$ for a monatomic ideal gas.

Let’s consider an isothermal process. Since $T = \text{const}$, $dU = 0$, and hence $Q = W = \int p\,dV = NkT\ln(V_f/V_i)$. Comparing this with (1), we conclude that
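As a sanity check, the isothermal heat can be verified numerically by integrating $p\,dV$ with $p = NkT/V$. The particle number, temperature, and volumes below are illustrative values chosen for the sketch, not anything fixed by the derivation:

```python
import math

# Numerically integrate the isothermal work W = ∫ p dV with p = N k T / V,
# and compare with Q = N k T ln(V_f / V_i).  (dU = 0 at constant T, so Q = W.)
N = 6.022e23          # number of molecules (illustrative)
k = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0             # temperature, K (illustrative)
V_i, V_f = 1.0, 2.0   # initial and final volumes, m^3 (illustrative)

steps = 100_000
dV = (V_f - V_i) / steps
# Midpoint-rule sum for the work integral
W = sum(N * k * T / (V_i + (j + 0.5) * dV) * dV for j in range(steps))

Q_exact = N * k * T * math.log(V_f / V_i)
print(W, Q_exact)  # the two agree to many digits
```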

(4)  $S = Nk\ln(V) + f(T)$

where $f(T)$ is some volume-independent function of the temperature.

Repeating the same analysis for an adiabatic process, $Q = 0$, and hence $dU = \alpha Nk\,dT = -W = -\frac{NkT}{V}\,dV$, or

(5)  $\alpha\,\frac{dT}{T} = -\frac{dV}{V}$

Since $dS = 0$, we can solve for the previously unknown function $f(T)$:

(6)  $S = Nk\ln(VT^\alpha) + \text{const} = k\ln{(VT^\alpha)}^N + \text{const}$

where the constant is independent of both $V$ and $T$.
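One can also check (6) numerically: stepping along the adiabat condition (5), the combination $VT^\alpha$ — the argument of the logarithm — should stay constant, i.e. $dS = 0$. The starting point and step count below are arbitrary choices for the sketch:

```python
# Integrate the adiabat condition (5), alpha dT/T = -dV/V, with simple
# Euler steps, and check that V * T^alpha stays constant along the way.
alpha = 1.5           # monatomic ideal gas
V, T = 1.0, 300.0     # illustrative starting point (arbitrary units, K)
invariant0 = V * T**alpha

steps, V_final = 200_000, 3.0
dV = (V_final - V) / steps
for _ in range(steps):
    # alpha dT/T = -dV/V  =>  dT = -T/(alpha V) dV
    T += -T / (alpha * V) * dV
    V += dV

print(V * T**alpha / invariant0)  # stays ~1: dS = 0 along the adiabat
```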

This is (almost) the answer we are after. But it behooves us to pause and note that it has a very suggestive interpretation. We don’t know where any particular gas molecule is located. But we do know that it must be somewhere within the volume $V$. Similarly, we don’t know what the velocity of any particular gas molecule is. But baby kinetic theory2 tells us that $v_{\text{RMS}} \propto T^{1/2}$. So $T^{3/2}$ is (proportional to) the volume in “velocity space” in which we expect to find the molecule of gas, and $VT^{3/2}$ is the volume in “phase space” for a single molecule. It represents, in other words, our lack of knowledge of the state of that gas molecule. For $N$ molecules, the volume in phase space is ${(VT^{3/2})}^N$, which is what appears as the argument of the logarithm in (6).
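The $v_{\text{RMS}} \propto T^{1/2}$ scaling is easy to see in a toy simulation: in equilibrium, each velocity component of a molecule is Gaussian with variance $kT/m$, so sampling velocities at two temperatures and comparing RMS speeds should give the square-root ratio. The mass and temperatures below are illustrative:

```python
import math, random

# Sample Maxwell-Boltzmann velocities at two temperatures and compare
# the RMS speeds; the ratio should be (T_hot / T_cold)**0.5.
random.seed(0)
k = 1.380649e-23      # Boltzmann constant, J/K
m = 6.6e-27           # roughly a helium atom's mass, kg (illustrative)

def v_rms(T, n=100_000):
    sigma = math.sqrt(k * T / m)  # std dev of each velocity component
    return math.sqrt(sum(sum(random.gauss(0, sigma)**2 for _ in range(3))
                         for _ in range(n)) / n)

r = v_rms(400.0) / v_rms(100.0)
print(r)  # close to (400/100)**0.5 = 2
```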

So the Boltzmann entropy is $k$ times the natural logarithm of the volume in phase space of the system.

That’s about as far as I got in my lecture, but one can go a little further. (6) is wrong because it isn’t extensive. If we take two containers of the same gas, at the same temperature and pressure, we should find that the total entropy $S = S_1 + S_2$. Instead, with (6), we find

${(S - S_1 - S_2)}^{\text{naïve}} = Nk\ln(N) - N_1 k\ln(N_1) - N_2 k\ln(N_2)$

where $N = N_1 + N_2$.
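This failure of extensivity is easy to exhibit numerically: hold $T$ and $V/N$ fixed (same temperature and pressure), combine two equal containers, and the naïve entropy picks up exactly the discrepancy above. The temperature and volume-per-molecule below are illustrative values; the additive constant in (6) drops out of the comparison:

```python
import math

# Naive entropy from (6), dropping the additive constant.
k = 1.380649e-23      # Boltzmann constant, J/K
alpha = 1.5           # monatomic ideal gas
T = 300.0             # common temperature, K (illustrative)
v_per_N = 4.0e-26     # V/N held fixed (same pressure), m^3 (illustrative)

def S_naive(N):
    V = v_per_N * N
    return N * k * math.log(V * T**alpha)

N1 = N2 = 1e23
gap = S_naive(N1 + N2) - S_naive(N1) - S_naive(N2)
# The post's formula for the discrepancy:
predicted = ((N1 + N2) * k * math.log(N1 + N2)
             - N1 * k * math.log(N1) - N2 * k * math.log(N2))
print(gap, predicted)  # nonzero, and equal to each other
```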

But this discrepancy is easy to fix. The quantity that behaves extensively is

(7)  $S = Nk\ln\left(\frac{VT^\alpha}{N}\right) + k(c_1 N + c_2)$

where $c_{1,2}$ are constants. Using Stirling’s formula, for large $N$, we can then write this as

(8)  $S = k\ln\left(\frac{{(VT^\alpha)}^N}{N!}\right) + k\bigl((c_1 - 1)N + c_2\bigr)$

That is, we should treat the gas molecules as identical particles, and take $k$ times the logarithm of the volume in phase space, where we’ve modded out by the permutations of the $N$ particles.
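A quick check that (8) really reduces to (7): with $\ln N! \approx N\ln N - N$, the two expressions (dropping the $c_{1,2}$ constants) should agree up to the subleading $\tfrac{1}{2}\ln(2\pi N)$ correction in Stirling’s formula. The values below are illustrative, with $k$ set to 1:

```python
import math

# Compare the k ln((V T^a)^N / N!) form of (8), using the exact ln N!
# via lgamma, against the Stirling form N k ln(V T^a / N) + k N of (7).
k = 1.0                       # set k = 1; units drop out of the comparison
alpha, V, T = 1.5, 2.0, 5.0   # illustrative values
N = 500

S8 = k * (N * math.log(V * T**alpha) - math.lgamma(N + 1))  # exact ln N!
S7 = k * N * math.log(V * T**alpha / N) + k * N             # Stirling form
print(S8, S7)  # differ only by ~ (1/2) ln(2 pi N), subleading in N
```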

Despite having had to gloss over a couple of steps where a little calculus was required, I’m rather proud of this “elementary” derivation. I don’t think I’ve seen anything even remotely resembling a satisfactory explanation in any of the elementary textbooks (even the calculus-based ones).

1 The course was, by no means, calculus-based. Expressions like “$dX$” mean “a small change in $X$.” So (1) was read as: add a small amount of heat, $Q$, to the system at a temperature $T$, and you get a small change in the entropy, $dS = Q/T$. As a result, I had to cheat in a couple of steps in the derivation. But these weren’t terribly big cheats.

2 We’d previously argued for this, on the basis of a simple model, in which molecules, whose average kinetic energy is $\tfrac{1}{2}mv_{\text{RMS}}^2 = \tfrac{3}{2}kT$, collide elastically with the walls of the container. This simple-minded model reproduces the pressure, $p$, predicted by the Ideal Gas Law.

Posted by distler at May 3, 2007 12:45 PM

10 Comments & 2 Trackbacks

Read the post Entropy for Non-Majors
Weblog: Science After Sunclipse
Excerpt: Every once in a while (well, actually, pretty frequently) I see a post out there in the Blagopelago which makes me feel bad about ranting so much and discussing science so little. Today’s entry in this category is Jacques Distler’s treatme...
Tracked: May 3, 2007 3:14 PM

Re: Boltzmann Entropy

Nice. It also shows the falsity of the oft-made claim that quantum mechanics is needed to get the correct entropy formula.

Posted by: Mark Srednicki on May 18, 2007 2:07 PM | Permalink | Reply to this

Identical particles


One of the bizarre, but oft-made, statements is that you can’t deal with identical particles in classical mechanics.

This is rubbish.

If we remove the diagonals (where particles coincide) from the configuration space, and impose boundary conditions (hard-sphere, or whatever is appropriate), then the symmetric group $S_N$ acts freely on the phase space $\mathcal{M}$.

Moreover, this action commutes with the Hamiltonian and with the symplectic form, and so the dynamics descends to the quotient space, $\mathcal{M}/S_N$.

That’s the setting for the classical dynamics of identical particles, and it’s the volume of this quotient that appears in (8).

Posted by: Jacques Distler on May 18, 2007 2:37 PM | Permalink | PGP Sig | Reply to this

Re: Identical particles

Hmmm. This post just jumped up to the top of Planet Musings, so I figure this is a good opportunity to ask a dumb question before my morning caffeine.

Since it’s definitely possible, what are the motivations for doing classical mechanics with identical particles? First, I guess, is making the entropy an extensive quantity; I suppose, also, that if one had done QFT without ever having seen classical stat mech, one might want to work with identical particles. Are there other reasons to say, “Today I’ll mod out by the symmetric group?”

Posted by: Blake Stacey on July 12, 2007 8:46 AM | Permalink | Reply to this

Re: Identical particles

Are there other reasons to say, “Today I’ll mod out by the symmetric group?”

Classical statistical mechanics is the obvious answer. Atoms (as conceived in the 19th Century) are identical particles. And if we are to handle them properly, we ought to treat them as such.

One consequence is that this makes the entropy (and other thermodynamic quantities) extensive. But, regardless, it’s the right thing to do.

Posted by: Jacques Distler on July 12, 2007 9:52 AM | Permalink | PGP Sig | Reply to this

Re: Boltzmann Entropy

I’ve been meaning to actually go through this since it first came up on Mixed States a while ago.

Would the proper interpretation of an extensive property be that it is linear in N, whereas here, in the lead-up to equation 7, we see that the entropy goes like N_i*ln(N_i) for each container? Or is it something other than that?

Posted by: agm on July 12, 2007 6:59 PM | Permalink | Reply to this


Would the proper interpretation of an extensive property be that it is linear in N …?

It’s a little more than that. It’s linear in $N$, provided we hold $T$ and $N/V$ fixed.

Posted by: Jacques Distler on July 12, 2007 8:58 PM | Permalink | PGP Sig | Reply to this

Re: Extensivity

So extensive properties require a system with a constant number density of particles undergoing isothermal processes. For an ideal gas, that’s just an isobaric process, no?

It seems like a requirement to hold the number density and the temperature constant would be linked to some sort of invariant, perhaps an adiabatic one, that serves as an index for the properties of the system since the product of constants is another constant.

Posted by: agm on July 13, 2007 2:35 PM | Permalink | Reply to this

Re: Extensivity

I don’t quite understand you.

You asked what an extensive property is.

I defined it as one which is additive when combining subsystems, provided the subsystems have the same pressure and temperature (etc.).

This has nothing to do with isothermal, isobaric (or whatever) processes.

Posted by: Jacques Distler on July 13, 2007 10:05 PM | Permalink | PGP Sig | Reply to this
Read the post Opera and MathML
Weblog: Musings
Excerpt: A rant.
Tracked: January 31, 2008 12:01 AM

Re: Boltzmann Entropy


Apart from the extensivity, is this deduction not similar to the one in Fermi’s Thermodynamics (Dover, 1956), page 61?

Albert Alberts, Amsterdam NL.

Posted by: albert alberts on March 27, 2008 6:28 AM | Permalink | Reply to this

Re: Boltzmann Entropy

boy! talk about falling flat on one’s face!

Posted by: denzl on April 11, 2008 5:40 PM | Permalink | Reply to this
