## September 17, 2023

### Counting Algebraic Structures

#### Posted by John Baez

The number of groups with $n$ elements goes like this, starting with $n = 0$:

0, 1, 1, 1, 2, 1, 2, 1, 5, …

The number of semigroups with $n$ elements goes like this:

1, 1, 5, 24, 188, 1915, 28634, 1627672, 3684030417, 105978177936292, …

Here I’m counting isomorphic guys as the same.
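These counts can be reproduced by brute force for very small $n$. Here's a sketch in Python (my illustration, not from the post): it enumerates all multiplication tables on $\{0, \dots, n-1\}$, keeps the associative ones, and identifies tables related by a relabeling of the elements. The enumeration is completely naive, so it's only feasible for $n \le 3$.

```python
from itertools import product, permutations

def count_semigroups(n):
    """Count associative binary operations on {0, ..., n-1},
    identifying operations that differ by a bijective relabeling."""
    elems = range(n)
    seen = set()
    count = 0
    # Encode an operation as a flat tuple: op[x*n + y] is x*y.
    for op in product(elems, repeat=n * n):
        if op in seen:
            continue
        # Keep only associative operations: (x*y)*z == x*(y*z).
        if any(op[op[x*n + y]*n + z] != op[x*n + op[y*n + z]]
               for x in elems for y in elems for z in elems):
            continue
        count += 1
        # Mark every relabeling of this operation as already counted.
        for s in permutations(elems):
            inv = [0] * n
            for i, v in enumerate(s):
                inv[v] = i
            seen.add(tuple(s[op[inv[x]*n + inv[y]]]
                           for x in elems for y in elems))
    return count

print([count_semigroups(n) for n in range(4)])  # → [1, 1, 5, 24]
```

For $n = 2$ this finds the familiar 8 associative tables falling into 5 isomorphism classes; counting up to isomorphism *and* anti-isomorphism would give the smaller sequence 1, 1, 4, 18, ….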

But how much do we know about such sequences in general? For example, is there any sort of algebraic gadget where the number of gadgets with $n$ elements goes like this:

1, 1, 2, 1, 1, 1, 1, 1, … ?

No! Not if by “algebraic gadget” we mean something described by a bunch of operations obeying equational laws — that is, an algebra of a Lawvere theory.

This follows from a 1967 result of László Lovász.

On Mastodon, Omar Antolín sketched a proof that greases the wheels with more category theory. It relies on a rather shocking lemma:

Super-Yoneda Lemma. Let $\mathsf{C}$ be the category of algebras of some Lawvere theory, and let $A, B \in \mathsf{C}$ be two algebras whose underlying sets are finite. If the functors $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are unnaturally isomorphic, then $A \cong B$.

Here we say the functors $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are unnaturally isomorphic if

$\mathrm{hom}(X,A) \cong \mathrm{hom}(X,B)$

for all $X \in \mathsf{C}$. We’re not imposing the usual commuting naturality square — indeed we can’t, since we’re not even giving any specific choice of isomorphism!

If $\mathrm{hom}(-,A)$ and $\mathrm{hom}(-,B)$ are naturally isomorphic, you can easily show $A \cong B$ using the Yoneda Lemma. But when they’re unnaturally isomorphic, you have to break the glass and pull out the Super-Yoneda Lemma.

Given this shocking lemma, it’s easy to show this:

Theorem. Let $A, B$ be two algebras of a Lawvere theory whose underlying sets are finite. If $A^k \cong B^k$ for some natural number $k$ then $A \cong B$.

Here’s how. Since $A^k \cong B^k$, we have natural isomorphisms

$\mathrm{hom}(-,A)^k \cong \mathrm{hom}(-, A^k) \cong \mathrm{hom}(-, B^k) \cong \mathrm{hom}(-,B)^k$

so for any $X \in \mathsf{C}$ the sets $\mathrm{hom}(X,A)^k$ and $\mathrm{hom}(X,B)^k$ have the same cardinality, and hence so do $\mathrm{hom}(X,A)$ and $\mathrm{hom}(X,B)$. This means we have an unnatural isomorphism

$\mathrm{hom}(-,A) \cong \mathrm{hom}(-,B)$

The lemma magically lets us conclude that

$A \cong B$

Now, how do we use this to solve our puzzle? Let $a(n)$ be the number of isomorphism classes of algebras whose underlying set has $n$ elements. We must have

$a(n^k) \ge a(n)$

since we’ve just seen that nonisomorphic algebras with $n$ elements give nonisomorphic algebras with $n^k$ elements. So, for example, we can never have $a(4) \lt a(2)$, since $4 = 2^2$. Thus, the sequence can’t look like the one I showed you, with

$a(0) = 1, \; a(1) = 1, \; a(2) = 2, \; a(3) = 1,\; a(4) = 1, ...$
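As a sanity check, here's a small Python snippet (mine, not from the post) verifying the constraint $a(n^k) \ge a(n)$ against the two sequences quoted at the top, for the indices where both terms are listed:

```python
# Counts of groups and of semigroups with n elements, up to isomorphism,
# as listed at the top of the post (indexed from n = 0).
groups     = [0, 1, 1, 1, 2, 1, 2, 1, 5]
semigroups = [1, 1, 5, 24, 188, 1915, 28634, 1627672, 3684030417]

def check_power_constraint(a):
    """Check a(n**k) >= a(n) for every n >= 2, k >= 2 within range."""
    for n in range(2, len(a)):
        k = 2
        while n ** k < len(a):
            assert a[n ** k] >= a[n], (n, k)
            k += 1

check_power_constraint(groups)      # e.g. a(4) = 2 >= a(2) = 1
check_power_constraint(semigroups)  # e.g. a(4) = 188 >= a(2) = 5
```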

Nice! So let’s turn to the lemma, which is the really interesting part.

I’ll just quote Omar Antolín’s proof, since I can’t improve on it. I believe the ideas go back to Lovász, but a bit of category theory really helps. Remember, $A$ and $B$ are algebras of some Lawvere theory whose underlying sets are finite:

Let $\mathrm{mon}(X, A)$ be the set of monomorphisms, which here are just homomorphisms that are injective functions. I claim you can compute the cardinality of $\mathrm{mon}(X, A)$ using the inclusion-exclusion principle in terms of the cardinalities of $\mathrm{hom}(Q, A)$ for various quotients of $X$.

Indeed, for any pair of elements $x, y \in X$, let $S(x, y)$ be the set of homomorphisms $f \colon X \to A$ such that $f(x) = f(y)$. The monomorphisms are just the homomorphisms that belong to none of the sets $S(x, y)$, so you can compute how many there are via the inclusion-exclusion formula: you’ll just need the cardinality of intersections of several $S(x_i, y_i)$.

Now, the intersection of some $S(x_i, y_i)$ is the set of homomorphisms $f$ such that for all $i$, $f(x_i) = f(y_i)$. Those are in bijection with the homomorphisms $Q \to A$ where $Q$ is the quotient of $X$ obtained by adding the relations $x_i = y_i$ for each $i$.

So far I hope I’ve convinced you that if $\mathrm{hom}(-, A)$ and $\mathrm{hom}(-, B)$ are unnaturally isomorphic, so are $\mathrm{mon}(-, A)$ and $\mathrm{mon}(-, B)$. Now it’s easy to finish: since $\mathrm{mon}(A, A)$ is non-empty, so is $\mathrm{mon}(A, B)$, so $A$ is isomorphic to a subobject of $B$. Similarly $B$ is isomorphic to a subobject of $A$, and since they are finite, they must be isomorphic.

Beautiful!
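To make the inclusion-exclusion step concrete, consider the simplest Lawvere theory of all — the one with no operations — so algebras are bare finite sets, homomorphisms are arbitrary functions, and $|\mathrm{hom}(X,A)| = |A|^{|X|}$. The sketch below (my code, not Antolín's) computes $|\mathrm{mon}(X,A)|$ exactly as in the quoted proof, summing $(-1)^{|T|}\,|\mathrm{hom}(Q_T, A)|$ over sets $T$ of identified pairs, where $Q_T$ is the corresponding quotient:

```python
from itertools import combinations

def injections_by_inclusion_exclusion(x, m):
    """Count injections from an x-element set into an m-element set by
    inclusion-exclusion over the sets S(p, q) = {f : f(p) = f(q)}."""
    pairs = list(combinations(range(x), 2))
    total = 0
    for r in range(len(pairs) + 1):
        for chosen in combinations(pairs, r):
            # The intersection of the chosen S(p, q) is hom(Q, A), where
            # Q is the quotient of the x-set by the chosen identifications;
            # |Q| = number of connected components of the graph they form.
            parent = list(range(x))
            def find(i):
                while parent[i] != i:
                    i = parent[i]
                return i
            for p, q in chosen:
                parent[find(p)] = find(q)
            classes = len({find(i) for i in range(x)})
            total += (-1) ** r * m ** classes
    return total

# Agrees with the direct count m (m-1) ... (m-x+1):
assert injections_by_inclusion_exclusion(3, 4) == 24
assert injections_by_inclusion_exclusion(2, 5) == 20
assert injections_by_inclusion_exclusion(4, 3) == 0
```

In a general Lawvere theory one replaces $m^{|Q|}$ by $|\mathrm{hom}(Q,A)|$ for the algebra quotient $Q$; the inclusion-exclusion bookkeeping is identical.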

But if you look at this argument you’ll see we didn’t use the full force of the assumptions. We didn’t need $A$ and $B$ to be algebras of a Lawvere theory. They could have been topological spaces, or posets, or simple graphs (which you can think of as reflexive symmetric relations), or various other things. It seems all we really need is a category $\mathsf{C}$ of gadgets with a forgetful functor

$U \colon \mathsf{C} \to \mathsf{FinSet}$

that is faithful and has some extra property… roughly, that we can take an object in $\mathsf{C}$ and take a quotient of it where we impose a bunch of extra relations $x_i = y_i$, and maps out of this quotient will behave as you’d expect. More precisely, I think the extra property is this:

Given any $X \in \mathsf{C}$ and any surjection $p \colon U(X) \to S$, there is a morphism $j \colon X \to Q$ such that the morphisms $f \colon X \to Y$ that factor through $j$ are precisely those for which $U(f)$ factors through $p$.

Can anyone here shed some light on this property, and say which faithful functors $U \colon \mathsf{C} \to \mathsf{FinSet}$ have it? There are some papers that should help, but I haven’t had time to absorb them yet.

By the way, there’s a name for categories where the super-Yoneda Lemma holds: they’re called right combinatorial.

And there’s a name for the sequences I’m talking about. If $T$ is a Lawvere theory, the sequence whose $n$th term is the number of isomorphism classes of $T$-algebras with $n$ elements is called the fine spectrum of $T$. The idea was introduced here:

• Walter Taylor, The fine spectrum of a variety, Algebra Universalis 5 (1975), 263–303.

though Taylor used not Lawvere theories but an equivalent framework: ‘varieties’ in the sense of universal algebra. For a bit more on this, go here.

I’m interested in which sequences are the fine spectrum of some Lawvere theory. You could call this an ‘inverse problem’. The direct problem — computing the fine spectrum of a given Lawvere theory — is already extremely difficult in many cases. But the case where there aren’t any equational laws (except trivial ones) is manageable:

Some errors in Harrison’s paper were corrected here:

I suspect Harrison and Tureček’s formulas could be nicely derived using species, since they’re connected to the tree-like structures discussed here:

• François Bergeron, Gilbert Labelle and Pierre Leroux, Combinatorial Species and Tree-Like Structures, Cambridge U. Press, Cambridge, 1998.

For all I know these authors point this out! It’s been a while since I’ve read this book.

Posted at September 17, 2023 1:00 AM UTC


### Re: Counting Algebraic Structures

“If the functors hom(−,A) and hom(−,A) are unnaturally isomorphic” Pretty sure they’re naturally isomorphic.

Posted by: Allen Knutson on September 17, 2023 3:15 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

You forgot the letter A has a reflection symmetry!

But yes, I meant B.

Posted by: John Baez on September 17, 2023 4:10 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Neat! I think the following hypothetical variant of the theorem on powers could give even stronger constraints on such “number of algebras” sequences:

Theorem? Let $A$, $B$, $C$ be algebras of a Lawvere theory whose underlying sets are finite. If $A \times C \cong B \times C$, then $A \cong B$.

The problem with the analogous proof is that trying to cancel $- \times \mathrm{hom}(-,C)$ from an isomorphism may not be possible because $\mathrm{hom}(X, C)$ could be empty for some $X$.

But perhaps one can get this to work with some additional cleverness? Or are there obvious counterexamples?

If it’s true, then one could conclude that a “number of algebras” sequence must increase (non-strictly) not only along powers, but along arbitrary multiples, provided that there is at least one algebra of every positive finite cardinality.

Posted by: Tobias Fritz on September 17, 2023 4:52 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

You might look at

If you have a single constant in the signature and the constant always gives a subalgebra then you never have empty hom sets and you can do this.

Posted by: Benjamin Steinberg on September 18, 2023 3:58 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

This works if every nonempty finite algebra contains a copy of the 1-element algebra. So groups, rings, monoids and semigroups are fine.

Posted by: Benjamin Steinberg on September 18, 2023 4:07 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Alas, Tobias’s conjectured theorem isn’t true in full generality. Here’s a counterexample.

Take the theory consisting of three constants and no equations. Thus, an algebra is a set $A$ together with distinguished elements $a_1$, $a_2$ and $a_3$, in order, which may or may not be equal.

Let $A = B = C = \{1, 2, 3\}$. Make $A$, $B$ and $C$ into algebras by taking the following constants:

• $a_1 = 2$, $a_2 = 2$, $a_3 = 3$

• $b_1 = 1$, $b_2 = 2$, $b_3 = 3$

• $c_1 = 2$, $c_2 = 3$, $c_3 = 3$.

Then $A \times C$ is the 9-element set $\{1, 2, 3\} \times \{1, 2, 3\}$ with distinguished elements

$(2, 2), (2, 3), (3, 3),$

and $B \times C$ is the same 9-element set with distinguished elements

$(1, 2), (2, 3), (3, 3).$

Since both $A \times C$ and $B \times C$ are 9-element sets whose three distinguished elements are all distinct, they are isomorphic as algebras. Concretely, the self-map of $\{1, 2, 3\}^2$ that exchanges $(2, 2)$ and $(1, 2)$ while leaving everything else fixed defines an isomorphism of algebras $A \times C \cong B \times C$.

On the other hand, $A$ is not isomorphic to $B$, because the three distinguished elements of $B$ are distinct but those of $A$ are not.
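Here's a quick machine check of this counterexample — a brute-force search over bijections (my sketch, not part of Tom's comment). An algebra of the theory of three constants is a set with an ordered triple of distinguished elements, and an isomorphism is a bijection matching the triples in order:

```python
from itertools import product, permutations

def isomorphic(X, cX, Y, cY):
    """Is there a bijection X -> Y carrying the ordered constants cX
    to cY?  Brute force; fast enough for the 9-element sets below."""
    X, Y = list(X), list(Y)
    if len(X) != len(Y):
        return False
    for image in permutations(Y):
        f = dict(zip(X, image))
        if all(f[a] == b for a, b in zip(cX, cY)):
            return True
    return False

S = [1, 2, 3]
cA = (2, 2, 3)            # constants of A
cB = (1, 2, 3)            # constants of B
cC = (2, 3, 3)            # constants of C

P = list(product(S, S))   # the underlying 9-element set of the products
cAC = tuple(zip(cA, cC))  # constants of A x C: (2,2), (2,3), (3,3)
cBC = tuple(zip(cB, cC))  # constants of B x C: (1,2), (2,3), (3,3)

assert isomorphic(P, cAC, P, cBC)    # A x C and B x C are isomorphic
assert not isomorphic(S, cA, S, cB)  # but A and B are not
```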

Posted by: Tom Leinster on September 19, 2023 8:21 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Fiendishly clever, Tom! Your trick reminds me of how geometric realization of simplicial sets preserves products thanks to the miracle of degeneracies. The simplicial 1-simplex (the simplicial set that looks like an interval) has vertices $0$ and $1$. It has a degenerate 2-simplex with vertices $0, 0, 1$ and another with vertices $0, 1, 1$. These give the product of two simplicial 1-simplexes a nondegenerate 2-simplex with vertices $(0,0), (0,1)$ and $(1,1)$. As we move along from $(0,0)$ to $(0,1)$ to $(1,1)$, at each stage the first coordinate may stay the same, or the second coordinate may stay the same, but they don’t both stay the same.

Posted by: John Baez on September 19, 2023 10:53 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

No, I just guessed the general conjecture would be false and looked for counterexamples in the simplest possible algebraic theories.

Posted by: Tom Leinster on September 20, 2023 4:04 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

That’s very nice indeed! It’s a shame that my conjecture is wrong in general.

In fact, I realize now that also my conjectured corollary about every “number of algebras” sequence being non-strictly increasing along multiples is wrong. For example, the number of Boolean algebras on $2$ elements is one, but the number of Boolean algebras on $2 \cdot 3$ elements is zero. In this case, the problem is not that $\mathrm{hom}(-,C)$ can be empty, which is how my envisioned argument fails in your example, but rather the complete absence of algebras in cardinalities that are not powers of two.

Posted by: Tobias Fritz on September 20, 2023 10:32 AM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Oh, that’s a very simple counterexample, Tobias!

Given a set of natural numbers closed under multiplication, is there a Lawvere theory whose finite algebras only have cardinalities in this set?

Posted by: John Baez on September 20, 2023 10:45 AM | Permalink | Reply to this

### Re: Counting Algebraic Structures

The number of outer automorphisms of $S_n$ is $1,1,2,1,1,1,2,1,\ldots$.

Posted by: James Sheppard on September 19, 2023 2:28 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Posted by: John Baez on September 19, 2023 10:58 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

I didn’t find that the politest of responses. Of course I mis-wrote: my point was that the automorphism group of $S_n$ is not what one “expects” (ie $S_n$ again) precisely when $n=2,6$, and that set doesn’t have the property of algebraic structures which is under discussion.

Posted by: James Sheppard on September 20, 2023 8:54 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Sorry, I was trying to joke around, which is always risky without facial expressions to indicate it. I suspected many people would skim over the sequence and not notice that the first “2” was wrong.

If you expand the scope a bit and look for epimorphisms from $S_n$ to $S_m$ with $m \le n$ then you’ll also find our friend the sign homomorphism from $S_n$ to $S_2$ and one more exotic delight: the nontrivial homomorphism from $S_4$ to $S_3$, which arises from the 3 ways to partition a 4-element set into 2-element subsets.

Posted by: John Baez on September 20, 2023 9:49 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

It is almost surely unrelated, but talk of the nontrivial homomorphism $S_4 \to S_3$ makes me think of the unique non-null-homotopic map $S^4\to S^3$.

Posted by: David Roberts on September 21, 2023 12:58 AM | Permalink | Reply to this

### Re: Counting Algebraic Structures

Actually I think the nontrivial homomorphism $S_4 \to S_3$ is related to the existence of a nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$.

If we take the 4-element set $\{1,2,3,4\}$, there are 3 ways to split it:

$\{1,2\} + \{3,4\}, \quad \{1,3\} + \{4,2\}, \quad \{1,4\} + \{2,3\},$

and $S_4$ permutes these 3 splittings. This gives a nontrivial homomorphism $S_4 \to S_3$.
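This homomorphism is small enough to verify exhaustively. Here's a sketch in Python (mine, added for illustration): it represents each splitting as a set of blocks, lets each permutation of $\{1,2,3,4\}$ act on the splittings, and checks that the induced map is a surjective homomorphism onto $S_3$ with kernel of order 4:

```python
from itertools import permutations

# The three splittings of {1,2,3,4} into pairs of 2-element blocks.
splittings = [
    frozenset({frozenset({1, 2}), frozenset({3, 4})}),
    frozenset({frozenset({1, 3}), frozenset({2, 4})}),
    frozenset({frozenset({1, 4}), frozenset({2, 3})}),
]

def act(g, splitting):
    """Apply a permutation g of {1,2,3,4} (given as a dict) to a splitting."""
    return frozenset(frozenset(g[x] for x in block) for block in splitting)

def phi(g):
    """The induced permutation of the splittings, as a tuple:
    phi(g)[j] = index of g applied to the j-th splitting."""
    return tuple(splittings.index(act(g, s)) for s in splittings)

S4 = [dict(zip((1, 2, 3, 4), p)) for p in permutations((1, 2, 3, 4))]

# Homomorphism property: phi(g o h) = phi(g) o phi(h).
for g in S4:
    for h in S4:
        gh = {x: g[h[x]] for x in (1, 2, 3, 4)}
        assert phi(gh) == tuple(phi(g)[j] for j in phi(h))

assert len({phi(g) for g in S4}) == 6             # the image is all of S_3
assert sum(phi(g) == (0, 1, 2) for g in S4) == 4  # kernel: the Klein four-group
```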

Similarly, if we take $\mathbb{R}^4$ with its usual basis $e_1, e_2, e_3, e_4$, the space of self-dual elements of $\Lambda^2 \mathbb{R}^4$ is 3-dimensional with basis

$e_1 \wedge e_2 + e_3 \wedge e_4, \quad e_1 \wedge e_3 + e_4 \wedge e_2, \quad e_1 \wedge e_4 + e_2 \wedge e_3,$

and $\mathrm{SO}(4)$ preserves this 3-dimensional space. This gives a nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$.
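Here's a numerical sanity check of this with NumPy (my sketch, not from the post). A 2-vector is encoded as an antisymmetric $4 \times 4$ matrix $M$, on which $R \in \mathrm{SO}(4)$ acts by $M \mapsto R M R^{\top}$; we take a random rotation, express its action in the self-dual basis above, and confirm the resulting $3 \times 3$ matrix lies in $\mathrm{SO}(3)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def E(i, j):
    """The 2-vector e_i ∧ e_j as an antisymmetric 4x4 matrix (0-based)."""
    M = np.zeros((4, 4))
    M[i, j], M[j, i] = 1.0, -1.0
    return M

# Self-dual basis of Λ²R⁴, matching the post (e1..e4 are indices 0..3).
S = [E(0, 1) + E(2, 3),   # e1∧e2 + e3∧e4
     E(0, 2) + E(3, 1),   # e1∧e3 + e4∧e2
     E(0, 3) + E(1, 2)]   # e1∧e4 + e2∧e3

def random_SO4():
    """A random special orthogonal 4x4 matrix via QR decomposition."""
    Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    return Q

R = random_SO4()

# Extract the 3x3 matrix of M -> R M Rᵀ in the self-dual basis,
# using that the S_a are orthogonal with <S_a, S_a> = tr(S_aᵀ S_a) = 4.
phi = np.array([[np.trace(Sb.T @ (R @ Sa @ R.T)) / 4 for Sa in S]
                for Sb in S])

# The self-dual subspace really is preserved (no residual outside it) ...
for a, Sa in enumerate(S):
    residual = R @ Sa @ R.T - sum(phi[b, a] * S[b] for b in range(3))
    assert np.allclose(residual, 0)
# ... and the induced map lies in SO(3).
assert np.allclose(phi @ phi.T, np.eye(3))
assert np.isclose(np.linalg.det(phi), 1.0)
```

The determinant comes out as $+1$ rather than just $\pm 1$ because $\mathrm{SO}(4)$ is connected and the induced map is the identity at $R = I$.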

We can think of these two phenomena as ‘the same thing’ over the field with one element and the field $\mathbb{R}$. But the real case is different because we also have a 3-dimensional space of anti-self-dual elements of $\Lambda^2 \mathbb{R}^4$, with basis

$e_1 \wedge e_2 - e_3 \wedge e_4, \quad e_1 \wedge e_3 - e_4 \wedge e_2, \quad e_1 \wedge e_4 - e_2 \wedge e_3.$

This gives a different nontrivial homomorphism $\mathrm{SO}(4) \to \mathrm{SO}(3)$. But arguably this is because $\mathbb{R}$ lets us use minus signs, which we can’t do in the field with one element.

I should really see how this story plays out in all the finite fields $\mathbb{F}_p$. I get a bit confused about orthogonal groups and exterior algebras in characteristic $p$, especially when $p = 2$.

Posted by: John Baez on September 21, 2023 1:11 PM | Permalink | Reply to this

### Re: Counting Algebraic Structures

This stuff is so cool, John! It’s so nice to see categorical tools being useful in such a concrete, ‘hard math’ question.

Re the condition on the functor $U:C \to \mathrm{FinSet}$ to prove super-Yoneda, it looks very much like exponentiability for a functor.

Quoting from the nLab:

A functor $p\colon E\to B$ is exponentiable if for any morphism $\alpha\colon a\to b$ in $E$ and any factorization $p a \overset{\beta}{\to} c \overset{\gamma}{\to} p b$ of $p \alpha$ in $B$, we have:

1. there exists a factorization $a \overset{\tilde{\beta}}{\to} d \overset{\tilde{\gamma}}{\to} b$ of $\alpha$ in $E$ such that $p \tilde{\beta} = \beta$ and $p \tilde{\gamma} = \gamma$, and

2. any two such factorizations in $E$ are connected by a zigzag of commuting morphisms which map to $id_c$ in $B$.

You see, the first condition is almost an exact match of yours.

Posted by: Matteo Capucci on September 21, 2023 2:24 PM | Permalink | Reply to this
