Classical vs Quantum Computation (Week 12)
Posted by John Baez
In this week’s class on Classical vs. Quantum Computation, we finally see a simple example of how ‘processes of computation’ show up as 2-morphisms in a 2-category. In this example, our 2-category is just a monoidal category:
- Week 12 (Feb. 1) - 2-categories of computation. The word problem for monoids. Given a presentation of a monoid, we get not only a monoid but also a strict monoidal category where the relations in the presentation are interpreted not as equations but as ‘rewrite rules’: that is, morphisms. Terminating and confluent categories. The Diamond Lemma. Normal forms. The monoidal functor from this monoidal category back to the monoid itself, under which the rewrite rules are squashed down to equations.
Supplementary reading:
- Mathworld articles: Reduction system, Confluent, Finitely terminating, Knuth-Bendix completion algorithm.
- Yves Lafont and Alain Proute, Church-Rosser property and homology of monoids.
Last week’s notes are here; next week’s notes are here.
In today’s class I considered a particular example of a monoid presentation. Here’s an excerpt from week70, my summary of a talk by Yves Lafont which used the same example to illustrate some different ideas:
Yves Lafont gave a talk with strong connections to n-category theory. Recall that a monoid is a set with an associative product having a unit element. One way to describe a monoid is by giving a presentation with “generators”, say
a, b, c, d,
and “relations”, say
ab = a, da = ac.
We get a monoid out of this in an obvious sort of way, namely by taking all strings built from the generators a, b, c, and d, and then identifying two strings if you can get from one to the other by repeated use of the relations. In math jargon, we form the free monoid on the generators and then mod out by the relations.
Suppose our monoid is finitely presented, that is, there are finitely many generators and finitely many relations. How can we tell whether two elements of it are equal? For example, does
dacb = acc
in the above monoid? Well, if the two are equal, we will always eventually find that out by an exhaustive search, applying the relations mechanically in all possible ways. But if they are not, we may never find out! (For the above example, the answer appears at the end of this article in case anyone wants to puzzle over it. Personally, I can’t stand this sort of puzzle.) In fact, there is no general algorithm for solving this “word problem for monoids”: one can even write down a specific finitely presented monoid for which no algorithm works.
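Here is a minimal sketch, in Python, of that exhaustive search, assuming just the two relations above. The helper names (`neighbours`, `provably_equal`) and the length cap are mine: without the cap this is only a semi-decision procedure, which finds a proof when the words are equal but may run forever when they are not; with the cap the demo always halts, at the price of possibly answering False for equal words whose proofs pass through longer intermediate words.

```python
from collections import deque

# The two relations of the monoid, usable in either direction.
RELATIONS = [("ab", "a"), ("da", "ac")]

def neighbours(word):
    """All words obtained from `word` by one application of a relation,
    in either direction, at any position."""
    for lhs, rhs in RELATIONS:
        for old, new in ((lhs, rhs), (rhs, lhs)):
            i = word.find(old)
            while i != -1:
                yield word[:i] + new + word[i + len(old):]
                i = word.find(old, i + 1)

def provably_equal(u, v, max_len=8):
    """Breadth-first search for a chain of relations joining u and v.
    The length cap max_len only makes the demo terminate; it is not part
    of the general search, which may run forever on unequal words."""
    seen, queue = {u}, deque([u])
    while queue:
        w = queue.popleft()
        if w == v:
            return True
        for x in neighbours(w):
            if len(x) <= max_len and x not in seen:
                seen.add(x)
                queue.append(x)
    return False

print(provably_equal("dacb", "acc"))   # compare with the answer at the end of the post
```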
However, sometimes things are nice. Suppose you write the relations as “rewrite rules”, that go only one way:
ab → a
da → ac
Then if you have an equation you are trying to check, you can try to repeatedly apply the rewrite rules to each side, reducing each side to “normal form”, and see if the normal forms are equal. This will only work, however, if some good things happen! First of all, your rewrite rules had better terminate: it had better be that you can only apply them finitely many times to a given string. This happens to be true for the above pair of rewrite rules, because each rule decreases the total number of b’s and d’s. Second of all, your rewrite rules had better be confluent: it had better be that if I use the rules one way until I can’t go any further, and you use them some other way, we both wind up with the same thing! If both these hold, then we can reduce any string to a unique normal form by applying the rules until we can’t do it any more.
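As a small, hedged illustration of this strategy, here is a Python sketch that keeps applying the two rewrite rules above until neither matches; the function name and the fixed rule order are mine. Whether the answer could depend on which match you choose at each step is exactly the confluence question discussed next.

```python
# A minimal sketch of one-way rewriting, assuming the two rules above.
RULES = [("ab", "a"), ("da", "ac")]

def normalize(word, rules=RULES):
    """Keep rewriting the leftmost match of the first applicable rule
    until no rule applies; the result is a normal form for `word`."""
    while True:
        for lhs, rhs in rules:
            i = word.find(lhs)
            if i != -1:
                word = word[:i] + rhs + word[i + len(lhs):]
                break
        else:
            return word

print(normalize("abb"))   # 'a'   (ab -> a applied twice)
print(normalize("dda"))   # 'acc' (da -> ac applied twice)
```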
Unfortunately, the rules above aren’t confluent; if we start with the word dab, you can apply the rules like this
dab → acb
while I apply them like this
dab → da → ac
and we both terminate, but at different answers. We could try to cure this by adding a new rule to our list,
acb → ac.
This is certainly a valid rule, which cures the problem at hand… but if we foolishly keep adding new rules to our list this way we may only succeed in getting confluence and termination when we have an infinite list of rules:
ab → a
da → ac
acb → ac
accb → acc
acccb → accc
accccb → acccc
and so on. I leave you to check that this is really terminating and confluent. Because it is, and because it’s a very predictable list of rules, we can still use it to write a computer program that solves the word problem for the monoid at hand (see the sketch after the next list of rules). But in fact, if we had been cleverer, we could have invented a finite list of rules that was terminating and confluent:
ab → a
ac → da
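To make this concrete, here is a hedged Python sketch (the names and packaging are mine) that reduces words to normal form under the cleverer finite system ab → a, ac → da, and also under the predictable infinite family above, with the rules ab → a, acb → ac, accb → acc, … packaged as a single pattern (an a followed by any number of c’s loses a b that comes right after) together with da → ac. Since each system is terminating and confluent, two words are equal in the monoid exactly when their normal forms agree; running this on dacb and acc settles the puzzle (compare with the answer at the end of the post).

```python
import re

def normalize(word, rules):
    """Rewrite the leftmost match of the first applicable rule, over and
    over, until no rule applies; for a terminating and confluent system
    the result is the unique normal form of `word`."""
    while True:
        for pattern, replacement in rules:
            new = pattern.sub(replacement, word, count=1)
            if new != word:
                word = new
                break
        else:
            return word

# The "cleverer" finite system:  ab -> a,  ac -> da.
finite = [(re.compile("ab"), "a"), (re.compile("ac"), "da")]

# The predictable infinite family  ab -> a, acb -> ac, accb -> acc, ...
# packaged as one pattern, together with  da -> ac.
infinite = [(re.compile(r"a(c*)b"), r"a\1"), (re.compile("da"), "ac")]

for rules in (finite, infinite):
    u, v = normalize("dacb", rules), normalize("acc", rules)
    print(u, v, u == v)    # equal normal forms <=> equal in the monoid
```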
Lafont went on to describe some work by Squier:
- Craig C. Squier, Word problems and a homological finiteness condition for monoids, Journal of Pure and Applied Algebra 49 (1987), 201-217.
- Craig C. Squier, A finiteness condition for rewriting systems, revision by F. Otto and Y. Kobayashi, to appear in Theoretical Computer Science.
- Craig C. Squier and F. Otto, The word problem for finitely presented monoids and finite canonical rewriting systems, in Rewriting Techniques and Applications, ed. J.-P. Jouannaud, Lecture Notes in Computer Science 256, Springer, Berlin, 1987, 74-82.
which gave general conditions that must hold for there to be a finite terminating and confluent set of rewrite rules for a monoid. The nice thing is that this relies heavily on ideas from n-category theory. Note: we started with a monoid in which the relations are equations, but we then started thinking of the relations as rewrite rules or morphisms, so what we really have is a monoidal category. We then started worrying about “confluences”, or equations between these morphisms. This is typical of “categorification”, in which equations are replaced by morphisms, which we then want to satisfy new equations of their own (see “week38”).
For the experts, let me say exactly how it all goes. Given any monoid M, we can cook up a topological space called its “classifying space” KM, as follows. We can think of KM as a simplicial complex. We start by sticking in one 0-simplex, which we can visualize as a dot like this:
    O

Then we stick in one 1-simplex for each element of the monoid, which we can visualize as an arrow going from the dot to itself. Unrolled a bit, it looks like this:
    O---a---O

Really we should draw an arrow going from left to right, but soon things will get too messy if I do that, so I won’t. Then, whenever we have ab = c in the monoid, we stick in a 2-simplex, which we can visualize as a triangle like this:
        O
       / \
      a   b
     /     \
    O---c---O

Then, whenever we have abc = d in the monoid, we stick in a 3-simplex, which we can visualize as a tetrahedron like this:
               O
              /|\
             / | \
            /  b  \
           a   |   bc
          /   _O_   \
         /  _/   \_  \
        /  _ab   c_  \
       /_/           \_\
      O--------d--------O

And so on… This is a wonderful space whose homology groups depend only on the monoid, so we can call them H_k(M). If we have a presentation of M with only finitely many generators, we can build KM using 1-simplices only for those generators, and it follows that H_1(M) is finitely generated. (More precisely, we can build a space with the same homotopy type as KM using only the generators in our presentation.) Similarly, if we have a presentation with only finitely many relations, we can build KM using only finitely many 2-simplices, so H_2(M) is finitely generated. What Squier showed is that if we can find a finite list of rewrite rules for M which is terminating and confluent, then we can build KM using only finitely many 3-simplices, so H_3(M) is finitely generated! What’s nice about this is that homological algebra gives an easy way to compute H_k(M) given a presentation of M, so in some cases we can prove that M has no finite terminating and confluent list of rewrite rules, just by showing that H_3(M) is too big. Examples of this, and many further details, appear in Lafont’s work (a compact formulaic packaging of the construction of KM follows these references):
- Yves Lafont and Alain Proute, Church-Rosser property and homology of monoids, Mathematical Structures in Computer Science 1 (1991), 297-326. Also available at http://iml.univ-mrs.fr/~lafont/publications.html
- Yves Lafont, A new finiteness condition for monoids presented by complete rewriting systems (after Craig C. Squier), Journal of Pure and Applied Algebra 98 (1995), 229-244. Also available at http://iml.univ-mrs.fr/~lafont/publications.html
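For readers who want the simplicial construction of KM pinned down, here is one standard way to package it, the nerve (or bar construction); the notation below is mine, not from the lecture or the excerpt, but it matches the triangles and tetrahedra drawn above.

```latex
% KM as a simplicial set: one k-simplex for each k-tuple of monoid elements.
% The i-th face either drops an outer element or multiplies two adjacent
% ones, exactly as in the triangle (a, b, ab) and tetrahedron (a, b, c, abc).
\[
  (KM)_k = M^k, \qquad
  d_i(m_1,\dots,m_k) =
  \begin{cases}
    (m_2,\dots,m_k)                    & i = 0,\\
    (m_1,\dots,m_i m_{i+1},\dots,m_k)  & 0 < i < k,\\
    (m_1,\dots,m_{k-1})                & i = k.
  \end{cases}
\]
% H_k(M) is then the homology of the chain complex with C_k the free abelian
% group on M^k and boundary the alternating sum of the face maps:
\[
  \partial(m_1,\dots,m_k) \;=\; \sum_{i=0}^{k} (-1)^i\, d_i(m_1,\dots,m_k).
\]
```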
(Answer: dacb = ddab = dda = dac = acc.)