### The Ten-Fold Way (Part 2)

#### Posted by John Baez

How can we discuss all the kinds of matter described by the ten-fold way in a single setup?

It’s a bit tough, because 8 of them are fundamentally ‘real’ while the other 2 are fundamentally ‘complex’. Yet they *should* fit into a single framework, because there are 10 super division algebras over the real numbers, and each kind of matter is described using a super vector space — or really a super Hilbert space — with one of these super division algebras as its ‘ground field’.

Combining physical systems is done by tensoring their Hilbert spaces… and there *does* seem to be a way to do this even with super Hilbert spaces over different super division algebras. But what sort of mathematical structure can formalize this?

Here’s my current attempt to solve this problem. I’ll start with a warmup case, the threefold way. In fact I’ll spend most of my time on that! Then I’ll sketch how the ideas should extend to the tenfold way.

Fans of lax monoidal functors, Deligne’s tensor product of abelian categories, and the collage of a profunctor will be rewarded for their patience if they read the whole article. But the basic idea is supposed to be simple: it’s about a multiplication table.

### The $\mathbb{3}$-fold way

First of all, notice that the set

$\mathbb{3} = \{1,0,-1\}$

is a commutative monoid under ordinary multiplication:

$\begin{array}{rrrr} \mathbf{\times} & \mathbf{1} & \mathbf{0} & \mathbf{-1} \\ \mathbf{1} & 1 & 0 & -1 \\ \mathbf{0} & 0 & 0 & 0 \\ \mathbf{-1} & -1 & 0 & 1 \end{array}$
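This is easy to confirm by brute force. Here is a small Python sketch (the variable names are mine, purely illustrative) checking closure, associativity, commutativity, and the identity:

```python
# Brute-force check that {1, 0, -1} is a commutative monoid
# under ordinary multiplication.
M = [1, 0, -1]

# Closure: every product lands back in M.
assert all(a * b in M for a in M for b in M)

# Associativity and commutativity.
assert all((a * b) * c == a * (b * c) for a in M for b in M for c in M)
assert all(a * b == b * a for a in M for b in M)

# 1 is the identity.
assert all(1 * a == a for a in M)
```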

Next, note that there are three (associative) division algebras over the reals: $\mathbb{R}, \mathbb{C}$ or $\mathbb{H}$. We can equip a real vector space with the structure of a module over any of these algebras. We’ll then call it a **real**, **complex** or **quaternionic** vector space.

For the real case, this is entirely dull. For the complex case, this amounts to giving our real vector space $V$ a **complex structure**: a linear operator $i: V \to V$ with $i^2 = -1$. For the quaternionic case, it amounts to giving $V$ a **quaternionic structure**: a pair of linear operators $i, j: V \to V$ with

$i^2 = j^2 = -1, \qquad i j = -j i$

We can then define $k = i j$.
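As a concrete sketch, here is a quaternionic structure on $\mathbb{R}^4$: the operators $i$ and $j$ written as real $4 \times 4$ matrices, namely left multiplication by the quaternions $i$ and $j$ in the basis $(1, i, j, k)$. This uses NumPy and is just an illustration of the relations above:

```python
# A quaternionic structure on R^4: left multiplication by the quaternions
# i and j, as real 4x4 matrices in the basis (1, i, j, k).
import numpy as np

I4 = np.eye(4)

# Left multiplication by i:  1 -> i,  i -> -1,  j -> k,  k -> -j
i = np.array([[0, -1, 0,  0],
              [1,  0, 0,  0],
              [0,  0, 0, -1],
              [0,  0, 1,  0]])

# Left multiplication by j:  1 -> j,  i -> -k,  j -> -1,  k -> i
j = np.array([[0,  0, -1, 0],
              [0,  0,  0, 1],
              [1,  0,  0, 0],
              [0, -1,  0, 0]])

# The defining relations of a quaternionic structure:
assert (i @ i == -I4).all() and (j @ j == -I4).all()
assert (i @ j == -(j @ i)).all()

# And k = ij squares to -1, as it should.
k = i @ j
assert (k @ k == -I4).all()
```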

The terminology ‘quaternionic vector space’ is a bit quirky, since the quaternions aren’t a field, but indulge me. $\mathbb{H}^n$ is a quaternionic vector space in an obvious way. $n \times n$ quaternionic matrices act by multiplication on the *right* as ‘quaternionic linear transformations’ — that is, *left* module homomorphisms — of $\mathbb{H}^n$. Moreover, every finite-dimensional quaternionic vector space is isomorphic to $\mathbb{H}^n$. So it’s really not so bad! You just need to pay some attention to left versus right.

Now: I claim that given two vector spaces of any of these kinds, we can tensor them over the real numbers and get a vector space of another kind. It goes like this:

$\begin{array}{cccc} \mathbf{\otimes} & \mathbf{real} & \mathbf{complex} & \mathbf{quaternionic} \\ \mathbf{real} & real & complex & quaternionic \\ \mathbf{complex} & complex & complex & complex \\ \mathbf{quaternionic} & quaternionic & complex & real \end{array}$

You’ll notice this has the same pattern as the multiplication table we saw before:

$\begin{array}{rrrr} \mathbf{\times} & \mathbf{1} & \mathbf{0} & \mathbf{-1} \\ \mathbf{1} & 1 & 0 & -1 \\ \mathbf{0} & 0 & 0 & 0 \\ \mathbf{-1} & -1 & 0 & 1 \end{array}$

So:

- $\mathbb{R}$ acts like 1.
- $\mathbb{C}$ acts like 0.
- $\mathbb{H}$ acts like -1.
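Here is a brute-force Python sketch of that correspondence — the table entries are copied from above, and the dictionary names are mine:

```python
# Check that the tensor-product table for real, complex and quaternionic
# vector spaces matches multiplication in the monoid {1, 0, -1}
# under the correspondence R -> 1, C -> 0, H -> -1.
label = {1: "real", 0: "complex", -1: "quaternionic"}

# The table from the text: tensoring (over R) a vector space of one kind
# with one of another kind yields a vector space of the kind shown.
tensor = {
    ("real", "real"): "real",
    ("real", "complex"): "complex",
    ("real", "quaternionic"): "quaternionic",
    ("complex", "real"): "complex",
    ("complex", "complex"): "complex",
    ("complex", "quaternionic"): "complex",
    ("quaternionic", "real"): "quaternionic",
    ("quaternionic", "complex"): "complex",
    ("quaternionic", "quaternionic"): "real",
}

for a in (1, 0, -1):
    for b in (1, 0, -1):
        assert tensor[label[a], label[b]] == label[a * b]
```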

There are different ways to understand this, but a nice one is to notice that if we have algebras $A$ and $B$ over some field, and we tensor an $A$-module and a $B$-module (over that field), we get an $A \otimes B$-module. So, we should look at this ‘multiplication table’ of real division algebras:

$\begin{array}{lrrr} \mathbf{\otimes} & \mathbf{\mathbb{R}} & \mathbf{\mathbb{C}} & \mathbf{\mathbb{H}} \\ \mathbf{\mathbb{R}} & \mathbb{R} & \mathbb{C} & \mathbb{H} \\ \mathbf{\mathbb{C}} & \mathbb{C} & \mathbb{C} \oplus \mathbb{C} & \mathbb{C}[2] \\ \mathbf{\mathbb{H}} & \mathbb{H} & \mathbb{C}[2] & \mathbb{R}[4] \end{array}$

Here $\mathbb{C}[2]$ means the 2 × 2 complex matrices viewed as an algebra over $\mathbb{R}$, and $\mathbb{R}[4]$ means the 4 × 4 real matrices.

What’s going on here? Naively you might have hoped for a simpler table, which would have instantly explained my earlier claim:

$\begin{array}{lrrr} \mathbf{\otimes} & \mathbf{\mathbb{R}} & \mathbf{\mathbb{C}} & \mathbf{\mathbb{H}} \\ \mathbf{\mathbb{R}} & \mathbb{R} & \mathbb{C} &\mathbb{H} \\ \mathbf{\mathbb{C}} & \mathbb{C} & \mathbb{C} & \mathbb{C} \\ \mathbf{\mathbb{H}} & \mathbb{H} & \mathbb{C} & \mathbb{R} \end{array}$

This isn’t true, but it’s ‘close enough to true’. Why? Because we always have a god-given algebra homomorphism from the naive answer to the real answer! The interesting cases are these:

$\mathbb{C} \to \mathbb{C} \oplus \mathbb{C}, \qquad \mathbb{C} \to \mathbb{C}[2], \qquad \mathbb{R} \to \mathbb{R}[4]$

where the first is the diagonal map $a \mapsto (a,a)$, and the other two send numbers to the corresponding scalar multiples of the identity matrix.

So, for example, if $V$ and $W$ are $\mathbb{C}$-modules, then their tensor product (over the reals! — all tensor products here are over $\mathbb{R}$) is a module over $\mathbb{C} \otimes \mathbb{C} \cong \mathbb{C} \oplus \mathbb{C}$, and we can then pull that back along the diagonal map $\mathbb{C} \to \mathbb{C} \oplus \mathbb{C}$ to get a $\mathbb{C}$-module.

What’s really going on here?

There’s a monoidal category $Alg_{\mathbb{R}}$ of algebras over the real numbers, where the tensor product is the usual tensor product of algebras. The monoid $\mathbb{3}$ can be seen as a monoidal category with 3 objects and only identity morphisms. And I claim this:

**Claim.** There is an oplax monoidal functor $F : \mathbb{3} \to Alg_{\mathbb{R}}$ with
$\begin{array}{ccl}
F(1) &=& \mathbb{R} \\
F(0) &=& \mathbb{C} \\
F(-1) &=& \mathbb{H}
\end{array}$

What does ‘oplax’ mean? Some readers of the $n$-Category Café eat oplax monoidal functors for breakfast and are chortling with joy at how I finally summarized everything I’d said so far in a single terse sentence! But others of you see ‘oplax’ and get a queasy feeling.

The key idea is that when we have two monoidal categories $C$ and $D$, a functor $F : C \to D$ is ‘oplax’ if it preserves the tensor product, not up to isomorphism, but up to a specified *morphism*. More precisely, given objects $x,y \in C$ we have a natural transformation

$F_{x,y} : F(x \otimes y) \to F(x) \otimes F(y)$

If you had a ‘lax’ functor this would point the other way, and they’re a bit more popular… so when it points the opposite way it’s called ‘oplax’.

(In the lax case, $F_{x,y}$ should probably be called the **laxative**, but we’re not doing that case, so I don’t get to make that joke.)

This morphism $F_{x,y}$ needs to obey some rules, but the most important one is that using it twice gives two ways to get from $F(x \otimes y \otimes z)$ to $F(x) \otimes F(y) \otimes F(z)$, and these must agree.

Let’s see how this works in our example… at least in one case. I’ll take the trickiest case. Consider

$F_{0,0} : F(0 \cdot 0) \to F(0) \otimes F(0),$

that is:

$F_{0,0} : \mathbb{C} \to \mathbb{C} \otimes \mathbb{C}$

There are, in principle, two ways to use this to get a homomorphism

$F(0 \cdot 0 \cdot 0 ) \to F(0) \otimes F(0) \otimes F(0)$

or in other words, a homomorphism

$\mathbb{C} \to \mathbb{C} \otimes \mathbb{C} \otimes \mathbb{C}$

where remember, all tensor products are taken over the reals. One is

$\mathbb{C} \stackrel{F_{0,0}}{\longrightarrow} \mathbb{C} \otimes \mathbb{C} \stackrel{1 \otimes F_{0,0}}{\longrightarrow} \mathbb{C} \otimes (\mathbb{C} \otimes \mathbb{C})$

and the other is

$\mathbb{C} \stackrel{F_{0,0}}{\longrightarrow} \mathbb{C} \otimes \mathbb{C} \stackrel{F_{0,0} \otimes 1}{\longrightarrow} (\mathbb{C} \otimes \mathbb{C})\otimes \mathbb{C}$

I want to show they agree (after we rebracket the threefold tensor product using the associator).

Unfortunately, so far I have described $F_{0,0}$ in terms of an isomorphism

$\mathbb{C} \otimes \mathbb{C} \cong \mathbb{C} \oplus \mathbb{C}$

Using this isomorphism, $F_{0,0}$ becomes the diagonal map $a \mapsto (a,a)$. But now we need to really understand $F_{0,0}$ a bit better, so I’d better say what isomorphism I have in mind! I’ll use the one that goes like this:

$\begin{array}{ccl} \mathbb{C} \otimes \mathbb{C} &\to& \mathbb{C} \oplus \mathbb{C} \\ 1 \otimes 1 &\mapsto& (1,1) \\ i \otimes 1 &\mapsto &(i,i) \\ 1 \otimes i &\mapsto &(i,-i) \\ i \otimes i &\mapsto & (-1,1) \end{array}$

This may make you nervous, but it truly is an isomorphism of real algebras, and it sends $a \otimes 1$ to $(a,a)$. So, unraveling the web of confusion, we have

$\begin{array}{rccc} F_{0,0} : & \mathbb{C} &\to& \mathbb{C}\otimes \mathbb{C} \\ & a &\mapsto & a \otimes 1 \end{array}$
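If you’re still nervous, here is a numerical spot-check in Python of the isomorphism used above, writing it on pure tensors as $\phi(a \otimes b) = (a b, a \bar{b})$ (my notation for it; the sample points are Gaussian integers, so the arithmetic is exact):

```python
# Spot-check that phi(a (x) b) = (a*b, a*conj(b)) defines an algebra map
# C (x) C -> C (+) C over R, and that it sends a (x) 1 to the diagonal (a, a).
def phi(a, b):
    return (a * b, a * b.conjugate())

samples = [1 + 0j, 1j, 2 - 3j, -1 + 1j]  # Gaussian integers: exact arithmetic

# phi respects the multiplication (a (x) b)(c (x) d) = ac (x) bd ...
for a in samples:
    for b in samples:
        for c in samples:
            for d in samples:
                lhs = phi(a * c, b * d)
                rhs = tuple(x * y for x, y in zip(phi(a, b), phi(c, d)))
                assert lhs == rhs

# ... and sends a (x) 1 to (a, a).
for a in samples:
    assert phi(a, 1) == (a, a)
```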

Why didn’t I just say that in the first place? Well, I suffered over this a bit, so you should too! You see, there’s an unavoidable arbitrary choice here: I could just as well have used $a \mapsto 1 \otimes a$. $F_{0,0}$ looked perfectly god-given when we thought of it as a homomorphism from $\mathbb{C}$ to $\mathbb{C} \oplus \mathbb{C}$, but that was deceptive, because there’s a choice of isomorphism $\mathbb{C} \otimes \mathbb{C} \to \mathbb{C} \oplus \mathbb{C}$ lurking in this description.

This makes me nervous, since category theory disdains arbitrary choices! But it seems to work. On the one hand we have

$\begin{array}{ccccc} \mathbb{C} &\stackrel{F_{0,0}}{\longrightarrow} &\mathbb{C} \otimes \mathbb{C} &\stackrel{1 \otimes F_{0,0}}{\longrightarrow}& \mathbb{C} \otimes \mathbb{C} \otimes \mathbb{C} \\ a &\mapsto & a \otimes 1 & \mapsto & a \otimes (1 \otimes 1) \end{array}$

On the other hand, we have

$\begin{array}{ccccc} \mathbb{C} &\stackrel{F_{0,0}}{\longrightarrow} & \mathbb{C} \otimes \mathbb{C} &\stackrel{F_{0,0} \otimes 1}{\longrightarrow} & \mathbb{C} \otimes \mathbb{C} \otimes \mathbb{C} \\ a &\mapsto & a \otimes 1 & \mapsto & (a \otimes 1) \otimes 1 \end{array}$

So they agree!

I need to carefully check all the other cases before I dare call my claim a theorem. Indeed, writing up this case has increased my nervousness… before, I’d thought it was obvious.

But let me march on, optimistically!

### Consequences

In quantum physics, what matters is not so much the algebras $\mathbb{R}$, $\mathbb{C}$ and $\mathbb{H}$ themselves as the categories of vector spaces — or indeed, Hilbert spaces — over these algebras. So, we should think about the map sending an algebra to its category of modules.

For any field $k$, there should be a contravariant pseudofunctor

$Rep: Alg_k \to Rex_k$

where $Rex_k$ is the 2-category of

- $k$-linear finitely cocomplete categories,
- $k$-linear functors preserving finite colimits, and
- natural transformations.

The idea is that $Rep$ sends any algebra $A$ over $k$ to its category of modules, and any homomorphism $f : A \to B$ to the pullback functor $f^* : Rep(B) \to Rep(A)$.

(Functors preserving finite colimits are also called right exact; this is the reason for the funny notation $Rex$. It has nothing to do with the dinosaur of that name.)

Moreover, $Rep$ gets along with tensor products. It’s definitely true that given real algebras $A$ and $B$, we have

$Rep(A \otimes B) \simeq Rep(A) \boxtimes Rep(B)$

where $\boxtimes$ is the tensor product of finitely cocomplete $k$-linear categories. But we should be able to go further and prove $Rep$ is monoidal. I don’t know if anyone has bothered yet.

(In case you’re wondering, this $\boxtimes$ thing reduces to Deligne’s tensor product of abelian categories given some ‘niceness assumptions’, but it’s a bit more general. Read the talk by Ignacio López Franco if you care… but I could have used Deligne’s setup if I restricted myself to finite-dimensional algebras, which is probably just fine for what I’m about to do.)

So, if my earlier claim is true, we can take the oplax monoidal functor

$F : \mathbb{3} \to Alg_{\mathbb{R}}$

and compose it with the contravariant monoidal pseudofunctor

$Rep : Alg_{\mathbb{R}} \to Rex_{\mathbb{R}}$

giving a guy which I’ll call

$Vect: \mathbb{3} \to Rex_{\mathbb{R}}$

I guess this guy is a contravariant oplax monoidal pseudofunctor! That doesn’t make it sound very lovable… but I love it. The idea is that:

$Vect(1)$ is the category of real vector spaces

$Vect(0)$ is the category of complex vector spaces

$Vect(-1)$ is the category of quaternionic vector spaces

and the operation of multiplication in $\mathbb{3} = \{1,0,-1\}$ gets sent to the operation of tensoring any one of these three kinds of vector space with any other kind and getting another kind!

So, if this works, we’ll have combined linear algebra over the real numbers, complex numbers and quaternions into a unified thing, $Vect$. This thing deserves to be called a $\mathbb{3}$-graded category. This would be a nice way to understand Dyson’s threefold way.

### What’s really going on?

What’s really going on with this monoid $\mathbb{3}$? It’s a kind of combination or ‘collage’ of two groups:

- The Brauer group of $\mathbb{R}$, namely $\mathbb{Z}_2 \cong \{-1,1\}$. This consists of Morita equivalence classes of central simple algebras over $\mathbb{R}$. One class contains $\mathbb{R}$ and the other contains $\mathbb{H}$. The tensor product of algebras corresponds to multiplication in $\{-1,1\}$.

- The Brauer group of $\mathbb{C}$, namely the trivial group $\{0\}$. This consists of Morita equivalence classes of central simple algebras over $\mathbb{C}$. But $\mathbb{C}$ is algebraically closed, so there’s just one class, containing $\mathbb{C}$ itself!

See, the problem is that while $\mathbb{C}$ is a division algebra over $\mathbb{R}$, it’s not ‘central simple’ over $\mathbb{R}$: its center is not just $\mathbb{R}$, it’s bigger. This turns out to be why $\mathbb{C} \otimes \mathbb{C}$ is so funny compared to the rest of the entries in our division algebra multiplication table.

So, we’ve really got two Brauer groups in play. But we also have a homomorphism from the first to the second, given by ‘tensoring with $\mathbb{C}$’: complexifying any real central simple algebra, we get a complex one.

And whenever we have a group homomorphism $\alpha: G \to H$, we can make their disjoint union $G \sqcup H$ into a monoid, which I’ll call $G \sqcup_\alpha H$.

It works like this. Given $g,g' \in G$, we multiply them the usual way. Given $h, h' \in H$, we multiply them the usual way. But given $g \in G$ and $h \in H$, we define

$g h := \alpha(g) h$

and

$h g := h \alpha(g)$

The multiplication on $G \sqcup_\alpha H$ is associative! For example:

$(g g')h = \alpha(g g') h = \alpha(g) \alpha(g') h = \alpha(g) (g'h) = g(g'h)$

Moreover, the element $1_G \in G$ acts as the identity of $G \sqcup_\alpha H$. For example:

$1_G h = \alpha(1_G) h = 1_H h = h$

But of course $G \sqcup_\alpha H$ isn’t a group, since “once you get inside $H$ you never get out”.

This construction could be called the **collage** of $G$ and $H$ via $\alpha$, since it’s reminiscent of a similar construction of that name in category theory.
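Here is a Python sketch of the collage construction (the names are mine), tested on the case that rebuilds the threefold-way monoid: $G = \{1,-1\}$, $H$ the trivial group (which I encode additively as $\{0\}$), and $\alpha$ the only possible homomorphism.

```python
# The collage G ⊔_α H of a group homomorphism α: G -> H.
# Elements are tagged pairs ('G', g) or ('H', h).
def collage(mult_G, mult_H, alpha):
    """Return the multiplication map of G ⊔_α H."""
    def mult(x, y):
        (tx, vx), (ty, vy) = x, y
        if tx == 'G' and ty == 'G':
            return ('G', mult_G(vx, vy))
        if tx == 'H' and ty == 'H':
            return ('H', mult_H(vx, vy))
        if tx == 'G':                        # g h := alpha(g) h
            return ('H', mult_H(alpha(vx), vy))
        return ('H', mult_H(vx, alpha(vy)))  # h g := h alpha(g)
    return mult

G, H = [1, -1], [0]
mult = collage(lambda a, b: a * b,   # multiplication in G = {1, -1}
               lambda a, b: 0,       # multiplication in the trivial group
               lambda g: 0)          # alpha: everything to the identity

elements = [('G', g) for g in G] + [('H', h) for h in H]

# Associativity, commutativity, and 1_G acting as the identity:
assert all(mult(mult(x, y), z) == mult(x, mult(y, z))
           for x in elements for y in elements for z in elements)
assert all(mult(x, y) == mult(y, x) for x in elements for y in elements)
assert all(mult(('G', 1), x) == x for x in elements)
```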

**Question.** What do monoid theorists call this construction?

**Question.** Can we do a similar trick for any field? Can we always take the Brauer groups of all its finite-dimensional extensions and fit them together into a monoid by taking some sort of collage? If so, I’d call this the **Brauer monoid** of that field.

### The $\mathbb{10}$-fold way

If you carefully read Part 1, maybe you can guess how I want to proceed. I want to make everything ‘super’.

I’ll replace division algebras over $\mathbb{R}$ by super division algebras over $\mathbb{R}$. Now instead of 3 = 2 + 1 there are 10 = 8 + 2:

8 of them are central simple over $\mathbb{R}$, so they give elements of the super Brauer group of $\mathbb{R}$, which is $\mathbb{Z}_8$.

2 of them are central simple over $\mathbb{C}$, so they give elements of the super Brauer group of $\mathbb{C}$, which is $\mathbb{Z}_2$.

Complexification gives a homomorphism

$\alpha: \mathbb{Z}_8 \to \mathbb{Z}_2$

namely the obvious nontrivial one. So, we can form the collage

$\mathbb{10} = \mathbb{Z}_8 \sqcup_\alpha \mathbb{Z}_2$

It’s a commutative monoid with 10 elements! Each of these is the equivalence class of one of the 10 real super division algebras.
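A quick Python sketch of this monoid, writing both groups additively and taking $\alpha$ to be reduction mod 2 (the obvious nontrivial homomorphism $\mathbb{Z}_8 \to \mathbb{Z}_2$); the 'R'/'C' tags are just my labels for the real and complex chunks:

```python
# The monoid 10 = Z_8 ⊔_α Z_2, with α: Z_8 -> Z_2 reduction mod 2.
# 'R' tags elements of Z_8 (real super division algebras),
# 'C' tags elements of Z_2 (complex ones).
def mult(x, y):
    (tx, vx), (ty, vy) = x, y
    if tx == 'R' and ty == 'R':
        return ('R', (vx + vy) % 8)
    # any product involving a 'C' element lands in Z_2, via reduction mod 2
    return ('C', (vx + vy) % 2)

ten = [('R', n) for n in range(8)] + [('C', n) for n in range(2)]
assert len(ten) == 10

# It is a commutative monoid with identity ('R', 0):
assert all(mult(mult(x, y), z) == mult(x, mult(y, z))
           for x in ten for y in ten for z in ten)
assert all(mult(x, y) == mult(y, x) for x in ten for y in ten)
assert all(mult(('R', 0), x) == x for x in ten)
```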

I’ll then need to check that there’s an oplax monoidal functor

$G : \mathbb{10} \to SuperAlg_{\mathbb{R}}$

sending each element of $\mathbb{10}$ to the corresponding super division algebra.

If $G$ really exists, I can compose it with a thing

$SuperRep : SuperAlg_{\mathbb{R}} \to Rex_{\mathbb{R}}$

sending each super algebra to its category of ‘super representations’ on super vector spaces. This should again be a contravariant monoidal pseudofunctor.

We can call the composite of $G$ with $SuperRep$

$SuperVect: \mathbb{10} \to Rex_{\mathbb{R}}$

If it all works, this thing $SuperVect$ will deserve to be called a $\mathbb{10}$-graded category. It contains super vector spaces over the 10 kinds of super division algebras in a single framework, and says how to tensor them. And when we look at super *Hilbert* spaces, this setup will be able to talk about all ten kinds of matter I mentioned last time… and how to combine them.

So that’s the plan. If you see problems, or ways to simplify things, please let me know!

## Re: The Ten-Fold Way (Part 2)

In retrospect I’m making things too hard on myself, and even doing things wrong, in the ‘tricky’ case of tensoring two real vector spaces equipped with a complex structure! I should just treat them as complex vector spaces and tensor them over $\mathbb{C}$. That’s physically right (for combining quantum systems), and it also seems to correspond to what we should do by taking the Brauer group of the complex numbers seriously, as a separate ‘chunk’ of our collage.

I believe that with this fix, the functor $F : \mathbb{3} \to Alg_{\mathbb{R}}$ will still be oplax monoidal, and devoid of funny arbitrary choices.