

July 20, 2024

What Is Entropy?

Posted by John Baez

I wrote a little book about entropy; here’s the current draft:

If you see typos and other mistakes, or have trouble understanding things, please let me know!

An alternative title would be 92 Tweets on Entropy, but people convinced me that title wouldn’t age well: in a decade or two, few people may remember what ‘tweets’ were.

Here is the foreword, which explains the basic idea.

Foreword

Once there was a thing called Twitter, where people exchanged short messages called ‘tweets’. While it had its flaws, I came to like it and eventually decided to teach a short course on entropy in the form of tweets. This little book is a slightly expanded version of that course.

It’s easy to wax poetic about entropy, but what is it? I claim it’s the amount of information we don’t know about a situation, which in principle we could learn. But how can we make this idea precise and quantitative? To focus the discussion I decided to tackle a specific puzzle: why does hydrogen gas at room temperature and pressure have an entropy corresponding to about 23 unknown bits of information per molecule? This gave me an excuse to explain these subjects:

  • information
  • Shannon entropy and Gibbs entropy
  • the principle of maximum entropy
  • the Boltzmann distribution
  • temperature and coolness
  • the relation between entropy, expected energy and temperature
  • the equipartition theorem
  • the partition function
  • the relation between expected energy, free energy and entropy
  • the entropy of a classical harmonic oscillator
  • the entropy of a classical particle in a box
  • the entropy of a classical ideal gas.
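As a rough numerical check of the “23 unknown bits per molecule” figure, here is a small sketch. It assumes the figure corresponds to the tabulated standard molar entropy of hydrogen gas, about 130.7 J/(mol K) at 298 K and 1 atm, and converts that to bits per molecule by dividing by $N_A k \ln 2$:

    import math

    R = 8.314462618     # gas constant, J/(mol K); equals N_A times k
    S_molar = 130.68    # standard molar entropy of H2 at 298 K and 1 atm, J/(mol K)

    nats_per_molecule = S_molar / R                       # entropy per molecule in nats
    bits_per_molecule = nats_per_molecule / math.log(2)   # one bit is ln(2) nats

    print(round(bits_per_molecule, 1))   # 22.7, i.e. about 23 bits per molecule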

I have largely avoided the second law of thermodynamics, which says that entropy always increases. While fascinating, this is so problematic that a good explanation would require another book! I have also avoided the role of entropy in biology, black hole physics, etc. Thus, the aspects of entropy most beloved by physics popularizers will not be found here. I also never say that entropy is ‘disorder’.

I have tried to say as little as possible about quantum mechanics, to keep the physics prerequisites low. However, Planck’s constant shows up in the formulas for the entropy of the three classical systems mentioned above. The reason for this is fascinating: Planck’s constant provides a unit of volume in position-momentum space, which is necessary to define the entropy of these systems. Thus, we need a tiny bit of quantum mechanics to get a good approximate formula for the entropy of hydrogen, even if we are trying our best to treat this gas classically.

Since I am a mathematical physicist, this book is full of math. I spend more time trying to make concepts precise and looking into strange counterexamples than an actual ‘working’ physicist would. If at any point you feel I am sinking into too many technicalities, don’t be shy about jumping to the next tweet. The really important stuff is in the boxes. It may help to reach the end before going back and learning all the details. It’s up to you.

Posted at July 20, 2024 6:02 PM UTC


13 Comments & 0 Trackbacks

Re: What Is Entropy?

Typo:

“$1/2^{1024}$ carries one kilobyte of information”

It should either be

“$1/2^{1024}$ carries one kilobit of information”

or

“$1/2^{8192}$ carries one kilobyte of information”

Posted by: Mike Stay on July 20, 2024 8:33 PM | Permalink | Reply to this

Re: What Is Entropy?

Although if you want to be super precise, “kilobyte” in colloquial usage is ambiguous, since in the context of information storage it could mean either 1000 bytes or 1024 bytes.

$1000^n$ is used by:

  • hard drive and storage manufacturers

  • measurements of network speeds (1 Mbps is 1,000,000 bits per second)

  • online storage providers (Dropbox’s paid plan for 2TB is 2000GB for instance)

  • macOS and iOS

  • Ubuntu (see https://wiki.ubuntu.com/UnitsPolicy )

  • most modern GNOME applications and most GUI applications on desktop Linux

  • hard drive manufacturers have been sued over this, and US courts have agreed with hard drive manufacturers that 1 GB = 1000 MB

  • the International System of Units (SI) and the International Electrotechnical Commission (IEC) both use the decimal definitions (1 kB = 1000 B)

Microsoft is pretty much the lone holdout.

ISO/IEC 80000 (following the IEC) introduces separate binary prefixes Ki, Mi, Gi, Ti, etc., spelled “kibi”, “mebi”, “gibi”, “tebi”, etc., that denote powers of 1024; the JEDEC memory standards, by contrast, keep the traditional binary meanings of K, M and G.

Posted by: Mike Stay on July 20, 2024 8:52 PM | Permalink | Reply to this

Re: What Is Entropy?

Thanks for catching the error—I’ve fixed it.

Maybe I’ll insert a brief remark about how ‘kilobyte’ is ambiguous. I hadn’t known base 10 was winning! I’ve heard people complain that they’re being cheated when their terabyte drive has only 91% of the storage they expected, since they thought they were getting a tebibyte drive.
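For what it’s worth, here is a quick check of that 91% figure (just the ratio of a decimal terabyte to a binary tebibyte), as a small sketch:

    # Ratio of a decimal terabyte (10^12 bytes) to a binary tebibyte (2^40 bytes).
    TB = 10**12
    TiB = 2**40
    print(TB / TiB)   # about 0.909, so a "1 TB" drive holds ~91% of a tebibyte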

Posted by: John Baez on July 21, 2024 9:50 AM | Permalink | Reply to this

Re: What Is Entropy?

Does anyone define temperature in terms of conserved quantities other than energy, e.g. angular momentum of spins in a solid state system?

Posted by: Mike Stay on July 20, 2024 8:37 PM | Permalink | Reply to this

Re: What Is Entropy?

Those other quantities are important, but they’re not called ‘temperature’. They’re called the ‘thermodynamic conjugates’ of the conserved quantities in question.

I haven’t heard people discuss the thermodynamic conjugate of angular momentum or momentum, but Landau uses these (perhaps implicitly—I forget) to argue that an object in thermodynamic equilibrium with its surroundings will not be moving or spinning. The thermodynamic conjugate of particle number, for conserved particles, is called ‘chemical potential’, and it’s super-important in chemistry.

In the third to last section of my book I discuss the thermodynamic conjugate of volume, and hint at how you can study these ideas more generally. Maybe I should say more. No, that would be the start of another book.
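To spell out the general pattern (only a sketch, with the usual textbook sign conventions rather than the book’s notation): in the entropy picture each conserved quantity gets a conjugate intensive variable from a partial derivative of the entropy,

    dS = \frac{1}{T}\, dE + \frac{p}{T}\, dV - \frac{\mu}{T}\, dN

so $1/T$ is the conjugate of energy, and the conjugates of volume and particle number are built from the pressure and the chemical potential in the same way.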

Posted by: John Baez on July 21, 2024 9:45 AM | Permalink | Reply to this

Re: What Is Entropy?

The derivation of the Boltzmann distribution by Lagrange multipliers, and the discussion of the Hagedorn temperature, are IMO particularly yummers, bravo!

[Maybe the ratio

$k h^{-1} = 2.0836\,6191\,2\ldots \times 10^{10}$ hertz per kelvin

is a reasonably-sized measure for the relation of action to temperature; IIUC it involves not only Boltzmann and Hertz but secretly de Broglie as well?]
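In case it’s useful, a quick check of that ratio from the exact post-2019 SI values of $k$ and $h$ (a minimal sketch):

    k = 1.380649e-23      # Boltzmann constant, J/K (exact by definition)
    h = 6.62607015e-34    # Planck constant, J*s (exact by definition)
    print(k / h)          # about 2.0837e10 Hz per kelvin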

[There may be an earlier version of this note in moderation, if so please delete one or the other!]

Posted by: jack on July 21, 2024 4:09 PM | Permalink | Reply to this

Re: What Is Entropy?

That frequency of $\sim 10$ gigahertz per kelvin seems enormously high: what’s it supposed to mean? Are we saying that if $E = k T$ and a system has temperature $T = 1$ kelvin, its energy $E$ is such that the phase of its wavefunction is oscillating at a frequency of $E/h \sim 10$ gigahertz? That seems very high.

Hmm, I guess it’s actually not! But now I’m even more confused:

The cosmic microwave background (CMB) radiation is a thermal quasi-uniform black body radiation which peaks at 2.725 K in the microwave regime at 160.2 GHz, corresponding to a 1.9 mm wavelength as in Planck’s law.

How are we getting 160.2 gigahertz? Maybe there’s a factor of $2\pi$ in here?

I can work this out, and I should. I hadn’t realized 10 gigahertz corresponds to such a low temperature. Just imagine how low the temperature is corresponding to a typical 1 megahertz AM radio frequency!
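For what it’s worth, a quick numerical check suggests the extra factor is not $2\pi$ but Wien’s displacement law in frequency form: the peak of Planck’s law in frequency sits at $h\nu/kT = x$, where $x$ solves $x = 3(1 - e^{-x})$, so $x \approx 2.821$, and $2.821\,(k/h)\,(2.725\ \mathrm{K}) \approx 160$ gigahertz. A minimal sketch of the arithmetic:

    import math

    k = 1.380649e-23      # Boltzmann constant, J/K
    h = 6.62607015e-34    # Planck constant, J*s
    T_cmb = 2.725         # CMB temperature, K

    # Peak of Planck's law in frequency: h*nu/(k*T) = x, where x solves x = 3(1 - e^{-x}).
    x = 3.0
    for _ in range(50):
        x = 3.0 * (1.0 - math.exp(-x))   # fixed-point iteration, converges to about 2.8214

    nu_peak = x * k * T_cmb / h
    print(nu_peak / 1e9)   # about 160.2, i.e. 160.2 GHz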

Posted by: John Baez on July 22, 2024 2:34 PM | Permalink | Reply to this

Re: What Is Entropy?

Yes,

Boltzmann et al say $E = kT$ while de Broglie says $E = h\nu$, so

    $T = h k^{-1} \nu$

seems inescapable. This seems intuitively to say that Avogadro’s number is one of those puzzling Very Large numbers…

[Srsly: I encountered $h k^{-1}$ once when trying to calculate something involving the

https://en.wikipedia.org/wiki/Helen_(unit)

which is another fundamental measure, like Planck’s, of action. I think we should be told…]

Posted by: jack on July 23, 2024 3:22 PM | Permalink | Reply to this

Re: What Is Entropy?

Could

$Z(X,\beta) = \sum_{i \in X} \exp(-\beta E_i)$

(regarded as a function of $\beta \in \mathbb{R}_+$) perhaps sometimes extend to a (Schwartz) distribution in $\beta$ defined on (some nontrivial part of) the complex plane $\mathbb{C} \supset \mathbb{R}_+$?

[Asking for a friend, cf.

https://arxiv.org/abs/math/0611240 ;

there are related questions about heat kernels.

I just posted this back at John’s original questions about partition functions as Euler characteristics but maybe here is where it belongs.]
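For a finite state space the sum above is a finite sum of exponentials, hence an entire function of $\beta$, so the delicate part of the question is really about infinitely many states. A tiny sketch, with made-up energy levels purely for illustration:

    import cmath

    # Hypothetical finite list of energy levels (arbitrary units).
    energies = [0.0, 1.0, 1.5, 3.0]

    def Z(beta):
        # Partition function sum_i exp(-beta * E_i); beta may be complex.
        return sum(cmath.exp(-beta * E) for E in energies)

    print(Z(1.0))         # real inverse temperature
    print(Z(1.0 + 2.0j))  # complex beta: still perfectly well defined here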

Posted by: jack morava on July 24, 2024 8:26 PM | Permalink | Reply to this

Re: What Is Entropy?

Great read! I especially enjoyed the development starting with finitely-many states, where probability is easy and temperature is counterintuitive; negative temperature makes perfect sense now.

Typos:

On page 92 (pdf page 99) you wrote

The entropy for distinguishable particles has a term equal to $\frac{3}{2} kN$, while for distinguishable particles it has a term equal to $\frac{3}{2} kN$

Looks like the second half of that sentence should be about indistinguishable particles and $\frac{5}{2}$.

On page 26 (pdf page 33) you wrote the Shannon entropy as $\sum_i p_i \ln p_i$ instead of the unit-agnostic form $\sum_i p_i \log p_i$ that you prefer above and elsewhere. Maybe you did this on purpose to avoid fiddling with the constant $\lambda$, but it jumped out at me since you made this a point of distinction with the Gibbs entropy on page 18.

Finally on page 73 (pdf page 80) you wrote “game plane”, which, while evocative, presumably should be “game plan”.

Posted by: Jon on July 22, 2024 8:06 AM | Permalink | Reply to this

Re: What Is Entropy?

Thanks! The issue regarding bases of logarithms is quite annoying because there’s no simple way to make everything beautiful except to use base $e$ everywhere… which goes against my desire to allow base 2, to make it easy to talk about ‘bits’. That’s a purely childish desire as far as this book goes: I just figure people often think about ‘bits’ when they talk about information, so sometimes I want to give them bits.

You located the fault line where I give up using $\log$ and start using $\ln$, to make the differentiation simple. Unfortunately that’s not the exact same place where I start putting Boltzmann’s constant $k$ in front of the entropy: I want to wait and introduce $k$ a bit later. So there is Shannon entropy with $\log p$, Gibbs entropy with $k \ln p$, and this half-breed with $\ln p$.

If I weren’t trying to make contact with experimental data, I would have used units where $k = 1$.

Anyway, I have fixed this error in at least a stop-gap manner.
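To illustrate the bookkeeping (a sketch, with an arbitrary example distribution): the three quantities differ only by constant factors, since the entropy in bits is the entropy in nats divided by $\ln 2$, and the Gibbs entropy is $k$ times the entropy in nats.

    import math

    k = 1.380649e-23                  # Boltzmann constant, J/K
    p = [0.5, 0.25, 0.125, 0.125]     # an arbitrary example distribution

    H_nats  = -sum(q * math.log(q) for q in p)   # entropy using ln
    H_bits  = H_nats / math.log(2)               # same entropy measured in bits
    S_gibbs = k * H_nats                         # Gibbs entropy, in J/K

    print(H_bits)    # 1.75 bits for this distribution
    print(S_gibbs)   # about 1.67e-23 J/K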

Posted by: John Baez on July 22, 2024 2:21 PM | Permalink | Reply to this

Re: What Is Entropy?

I have a question regarding problem 25 on p. 33. First you assume only $E_1 \neq E_2$, but in the first two sub-problems you state inequalities which (IIUC) only hold if $E_1 < E_2$. Should the initial assumption be $E_1 < E_2$, or did I get something wrong?

Thank you for the great read. I didn’t expect entropy to be so interesting mathematically.

Posted by: Stéphane Desarzens on August 7, 2024 4:33 PM | Permalink | Reply to this

Re: What Is Entropy?

Thanks for catching that mistake! Of course if $E_1 \neq E_2$ then one of them is smaller than the other. But to state the problems it’s convenient to know which one is smaller, so then I assumed $E_1 < E_2$, without mentioning it. But that’s ridiculous: I should explicitly assume $E_1 < E_2$ from the very start! A corrected version should appear 15 minutes from now.

I’m glad you’re enjoying the book. Yes, entropy is very interesting mathematically!

Posted by: John Baez on August 8, 2024 11:44 AM | Permalink | Reply to this
