Compositional Thermostatics (Part 2)
Posted by John Baez
guest post by Owen Lynch
In Part 1, John talked about a paper that we wrote recently:
- John Baez, Owen Lynch and Joe Moeller, Compositional thermostatics.
and he gave an overview of what a ‘thermostatic system’ is.
In this post, I want to talk about how to compose thermostatic systems. We will not yet use category theory, saving that for another post; instead we will give a ‘nuts-and-bolts’ approach, based on examples.
Suppose that we have two thermostatic systems and we put them in thermal contact, so that they can exchange heat energy. Then we predict that their temperatures should equalize. What does this mean precisely, and how do we derive this result?
Recall that a thermostatic system is given by a convex space $X$ and a concave entropy function $S \colon X \to [-\infty, \infty]$. A 'tank' of constant heat capacity, whose state is solely determined by its energy, has state space $X = \mathbb{R}_{> 0}$ and entropy function $S(U) = C \log U$, where $C$ is the heat capacity.
Now suppose that we have two tanks of heat capacity $C_1$ and $C_2$ respectively. As thermostatic systems, the state of both tanks is described by two energy variables, $U_1$ and $U_2$, and we have entropy functions

$$S_1(U_1) = C_1 \log U_1, \qquad S_2(U_2) = C_2 \log U_2.$$
By conservation of energy, the total energy of both tanks must remain constant, so

$$U_1 + U_2 = U$$

for some fixed $U$, or equivalently

$$U_2 = U - U_1.$$

The equilibrium state then has maximal total entropy subject to this constraint. That is, an equilibrium state $(U_1, U_2)$ must satisfy

$$S_1(U_1) + S_2(U_2) = \sup_{U_1' + U_2' = U} S_1(U_1') + S_2(U_2').$$
We can now derive the condition of equal temperature from this condition. In thermodynamics, temperature is defined by

$$\frac{1}{T} = \frac{\partial S}{\partial U}.$$
The interested reader should calculate this for our entropy functions, and in doing this, see why we identify $C$ with the heat capacity. Now, manipulating the condition of equilibrium, we get

$$S_1(U_1) + S_2(U - U_1) = \sup_{U_1'} S_1(U_1') + S_2(U - U_1').$$

As a function of $U_1$, the right-hand side of this equation must have derivative equal to $0$. Thus,

$$\frac{\partial}{\partial U_1} \left( S_1(U_1) + S_2(U - U_1) \right) = 0.$$

Now, note that if $U_2 = U - U_1$, then

$$\frac{\partial}{\partial U_1} S_2(U - U_1) = -\frac{\partial S_2}{\partial U_2}(U_2).$$

Thus, the condition of equilibrium is

$$\frac{\partial S_1}{\partial U_1}(U_1) = \frac{\partial S_2}{\partial U_2}(U_2).$$
Using the fact that

$$\frac{1}{T_i} = \frac{\partial S_i}{\partial U_i},$$

the above equation reduces to

$$\frac{1}{T_1} = \frac{1}{T_2},$$

so we have our expected condition of temperature equilibration!
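As a sanity check, the constrained maximization above can be carried out numerically. The following is a minimal sketch (the heat capacities $C_1 = 1$, $C_2 = 3$ and total energy $U = 8$ are made-up numbers for illustration): a brute-force search over energy splits finds the entropy maximum, and at that maximum the temperatures $T_i = U_i / C_i$ (from $1/T_i = \partial S_i / \partial U_i = C_i / U_i$) agree.

```python
import math

# Made-up numbers for illustration: two tanks with heat
# capacities C1, C2 and fixed total energy U.
C1, C2, U = 1.0, 3.0, 8.0

def S1(U1): return C1 * math.log(U1)
def S2(U2): return C2 * math.log(U2)

# Brute-force the constrained maximum of S1(U1) + S2(U - U1)
# over a fine grid of energy splits (endpoints excluded to
# avoid log(0)).
best_U1 = max(
    (i * U / 100000 for i in range(1, 100000)),
    key=lambda U1: S1(U1) + S2(U - U1),
)
best_U2 = U - best_U1

# Temperatures from 1/T = dS/dU: for S = C log U, T = U/C.
T1 = best_U1 / C1
T2 = best_U2 / C2
print(T1, T2)  # the two temperatures agree at the maximum
```

The maximizer lands at $U_1 = C_1 U / (C_1 + C_2)$, exactly the split that equalizes the two temperatures.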
The result of composing several thermostatic systems should be a new thermostatic system. In the case above, the new thermostatic system is described by a single variable: the total energy of the system, $U$. The entropy function of this new thermostatic system is given by the constrained supremum:

$$S(U) = \sup_{U_1 + U_2 = U} S_1(U_1) + S_2(U_2).$$
The reader should verify that this ends up being the same as a system with heat capacity $C_1 + C_2$, i.e. with entropy function given (up to an additive constant) by

$$S(U) = (C_1 + C_2) \log U.$$
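One way to check this numerically (a sketch with invented heat capacities, not numbers from the post): compute the constrained supremum on a grid for several values of $U$ and confirm that it differs from $(C_1 + C_2) \log U$ by a shift that does not depend on $U$ — entropy here is only pinned down up to an additive constant.

```python
import math

# Invented heat capacities for illustration.
C1, C2 = 1.0, 3.0

def composite_entropy(U, steps=50000):
    """Grid approximation of sup_{U1+U2=U} C1 log U1 + C2 log U2."""
    return max(
        C1 * math.log(U1) + C2 * math.log(U - U1)
        for U1 in (i * U / steps for i in range(1, steps))
    )

# The difference from (C1 + C2) log U should be the same
# constant for every total energy U.
shifts = [composite_entropy(U) - (C1 + C2) * math.log(U)
          for U in (1.0, 2.0, 5.0, 10.0)]
print(shifts)  # all (approximately) the same constant
```

The constant shift works out to $C_1 \log\frac{C_1}{C_1+C_2} + C_2 \log\frac{C_2}{C_1+C_2}$, so up to that additive constant the composite really is a tank of heat capacity $C_1 + C_2$.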
A very similar argument goes through when one has two systems that can exchange both heat and volume; both temperature and pressure are equalized as a consequence of entropy maximization. We end up with a system that is parameterized by total energy and total volume, and has an entropy function that is a function of those quantities.
The general procedure is the following. Suppose that we have $n$ thermostatic systems, $(X_1, S_1), \ldots, (X_n, S_n)$. Let $Y$ be a convex space, that we think of as describing the quantities that are conserved when we compose the thermostatic systems (i.e., total energy, total volume, etc.). Each value of the conserved quantities corresponds to many different possible values for $x_1 \in X_1, \ldots, x_n \in X_n$. We represent this with a relation

$$R \subseteq X_1 \times \cdots \times X_n \times Y.$$
We then turn $Y$ into a thermostatic system by using the entropy function

$$S(y) = \sup_{R(x_1, \ldots, x_n, y)} S_1(x_1) + \cdots + S_n(x_n).$$
It turns out that if we require $R$ to be a convex relation (that is, a convex subspace of $X_1 \times \cdots \times X_n \times Y$), then $S$ as defined above ends up being a concave function, so $(Y, S)$ is a true thermostatic system.
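The general procedure can be sketched in code. In this illustration (the function names and the discretized representation of the relation are my own, not from the paper), the relation $R$ encodes conservation of total energy for two tanks, and the composite entropy reproduces the two-tank calculation above:

```python
import math

# Illustrative sketch: compose thermostatic systems (X_i, S_i)
# along a relation R by a constrained supremum. The relation is
# represented by a function that, for each y, samples tuples
# (x_1, ..., x_n) with R(x_1, ..., x_n, y).

def compose(entropies, relation_samples):
    """Return S(y) = sup over sampled related tuples of sum_i S_i(x_i)."""
    def S(y):
        return max(
            sum(Si(xi) for Si, xi in zip(entropies, xs))
            for xs in relation_samples(y)
        )
    return S

# Example: two tanks, R = { (U1, U2, U) : U1 + U2 = U }.
C1, C2 = 1.0, 3.0
S1 = lambda U1: C1 * math.log(U1)
S2 = lambda U2: C2 * math.log(U2)

def energy_conservation(U, steps=10000):
    # Grid sample of the convex relation U1 + U2 = U.
    return [(i * U / steps, U - i * U / steps) for i in range(1, steps)]

S = compose([S1, S2], energy_conservation)
print(S(8.0))  # close to (C1 + C2) * log(8) plus a constant
```

The same `compose` sketch works for any number of systems and any sampled convex relation, e.g. one that also conserves total volume.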
We will have to wait until a later post in the series to see exactly how we describe this procedure using category theory. For now, however, I want to talk about why this procedure makes sense.
In the statistical mechanical interpretation, entropy is related to the probability of observing a specific macrostate. As we scale the system, the theory of large deviations tells us that seeing any macrostate other than the most probable macrostate is highly unlikely. Thus, we can find the macrostate that we will observe in practice by finding the entropy maxima. For an exposition of this point of view, see this paper:
- Jeffrey Commons, Ying-Jen Yang and Hong Qian, Duality symmetry, two entropy functions, and an eigenvalue problem in Gibbs’ theory.
There is also a dynamical systems interpretation of entropy, where entropy serves as a Lyapunov function for a dynamical system. This is the viewpoint taken here:
- Wassim M. Haddad, A Dynamical Systems Theory of Thermodynamics, Princeton U. Press.
In each of these viewpoints, however, the maximization of entropy is not global, but rather constrained. The dynamical system only maximizes entropy along its orbit, and the statistical mechanical system maximizes entropy with respect to constraints on the probability distribution.
We can think of thermostatics as a ‘common refinement’ of both of these points of view. We are agnostic as to the mechanism by which constrained maximization of entropy takes place and we are simply interested in investigating its consequences. We expect that a careful formalization of either system should end up deriving something similar to our thermostatic theory in the limit.