Alas, I don’t know what to do about this problem without significantly modifying our setup. Josh mentioned multicategories as one approach, but I have no idea how this would work for the ‘quantum logic’ systems our setup is designed to include. So, I guess I’ll just mention the problem.
In a way it doesn’t really matter, since we’re just trying to set up tenuous bridges between various subjects… just enough to get more people talking to each other. But, it’s good to know about the hot water we’re getting into.
I think the subsection on proof theory will leave many scratching their heads about the point of it all, because, as presented there, proof theory appears to be nothing more than an especially cumbersome way of presenting the construction of proofs/morphisms.
To my mind, the crucial point underlying proof theory is that body of results called cut elimination theorems, which, for a category theorist not accustomed to thinking about these things, says something quite incredible: that you can dispense with composition (your rule $(\circ)$)!! That’s really a crude way of putting it, so let me say a bit more.
Imagine that you’re setting out to prove a difficult theorem, $A \vdash B$. Along the way you will need to prove a bunch of propositions which will get you from point $A$ to point $B$: $A \vdash C_1$, $C_1 \vdash C_2$, …, $C_n \vdash B$. (The cut rule, or rule $(\circ)$, would then allow you to conclude $A \vdash B$.) In the case of a nontrivial theorem, there is no doubt that the intermediate formulas $C_1, \ldots, C_n$ could be quite complicated. If you actually wanted to mechanize the process (of deciding which $C_i$ you will need), the problem would seem to be hopeless: the search space for the requisite intermediate formulas is infinite, unbounded. So having to rely on the cut rule would seem to be a very risky proposition, from that point of view.
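To make the role of the cut rule explicit (this is just my paraphrase, in the notation above): each application splices two entailments along an intermediate formula,

$$\frac{A \vdash C_1 \qquad C_1 \vdash C_2}{A \vdash C_2}\;(\circ), \qquad \frac{A \vdash C_2 \qquad C_2 \vdash C_3}{A \vdash C_3}\;(\circ), \qquad \ldots, \qquad \frac{A \vdash C_n \qquad C_n \vdash B}{A \vdash B}\;(\circ),$$

and nothing in the rule itself constrains what the $C_i$ may be.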
Nevertheless, Gentzen and his successors showed that for deductive systems of a carefully specified type, one can get around this fundamental difficulty: they proved meta-theorems to the effect that any entailment (or sequent) that can be proven with the cut or composition rule can also be proven without it! For example, as an application to the types of monoidal categories you and Mike are discussing in your paper, one can decide if there exists a morphism from one formula $A$ built up from $\otimes$ and $\multimap$ (your lollipop connective – sorry, I don’t know the LaTeX code for that) to another $B$, that is, a morphism $A \to B$ built out of the data for the particular type of structured category being studied (monoidal, monoidal closed, symmetric monoidal closed, etc.). In effect, in the system left after the cut rule is removed, the only way one can infer an entailment or morphism below the line is by starting with entailment(s) above the line with a simpler logical structure. In a lot of such deductive systems which have been studied by categorists, the entailment one is after is built up from simpler ones, one connective at a time.
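As a tiny illustration (my own example, using the standard left/right rules for $\otimes$ and $\multimap$; the $\multimap$ rules are the ones recalled just below): to decide whether there is a morphism $A \otimes (A \multimap B) \to B$, a cut-free search only ever passes through subformulas of the sequent in question, e.g.

$$\frac{\dfrac{A \vdash A \qquad B \vdash B}{A,\; A \multimap B \vdash B}\;(\multimap L)}{A \otimes (A \multimap B) \vdash B}\;(\otimes L)$$

so the search space is finite.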
[There is also a more refined, categorified cut elimination, which allows one to decide not only if there exists a morphism $A \to B$, but whether two such morphisms built from syntactically different deductions are the same on the basis of the axioms (commutative diagrams) for the type of structured monoidal category being studied. This is the crucial step toward solving a coherence problem for that type of structure, i.e., deciding which diagrams commute. The fact that Gentzen cut elimination admits a categorified form is one of the great insights due to Lambek.]
But it should come as no surprise that to get away with a cut elimination theorem, one has to be very careful in just how the deductive system is set up – sometimes a delicate matter. In multiplicative intuitionistic linear logic (the logic for symmetric monoidal closed categories), the Gentzen method gives rules which introduce connectives one at a time, either on the left or the right of the entailment symbol $\vdash$ (the turnstile). For example, there is a rule

$$\frac{\Gamma \vdash A \qquad \Delta, B \vdash C}{\Gamma,\; A \multimap B,\; \Delta \vdash C}$$

which introduces the connective $\multimap$ on the left (here $\Gamma$, $\Delta$ denote lists of formulas). There is a separate rule

$$\frac{\Gamma, A \vdash B}{\Gamma \vdash A \multimap B}$$

for introducing $\multimap$ on the right. The various rules come in pairs like this, balanced “just so”, so that a cut elimination theorem can go through. In particular, I think one really does need sequents of the form

$$A_1, A_2, \ldots, A_n \vdash B$$

in order to get cut elimination to come out right. (Just FYI, the cut rule would take a form like

$$\frac{\Gamma \vdash A \qquad \Delta, A, \Delta' \vdash B}{\Delta, \Gamma, \Delta' \vdash B}$$

in this setting.)
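Just to make the decidability point tangible, here is a minimal sketch of backward cut-free proof search for this fragment, written in Haskell. Everything here (the datatype, the function names, the treatment of the axiom rule) is my own illustrative choice rather than anything from the paper, and I’m assuming the symmetric case, so contexts are split without worrying about order:

```haskell
import Data.List (delete)

-- Formulas of the (⊗, ⊸, I) fragment.
data Formula
  = Atom String
  | Unit                       -- the monoidal unit I
  | Tensor Formula Formula     -- A ⊗ B
  | Lolli Formula Formula      -- A ⊸ B
  deriving (Eq, Show)

-- All ways of splitting a context into two parts (symmetric case,
-- so order within each part does not matter).
splits :: [a] -> [([a], [a])]
splits []       = [([], [])]
splits (x : xs) = concat [ [(x : l, r), (l, x : r)] | (l, r) <- splits xs ]

-- Backward proof search: 'provable gamma b' decides whether the sequent
-- gamma ⊢ b has a cut-free derivation.
provable :: [Formula] -> Formula -> Bool
provable gamma b =
  axiom || unitR || unitL || tensorR || tensorL || lolliR || lolliL
  where
    -- identity axiom, restricted to atoms (identity at compound
    -- formulas is then derivable, as usual)
    axiom = case b of
      Atom _ -> gamma == [b]
      _      -> False
    -- right and left rules for the unit I
    unitR = null gamma && b == Unit
    unitL = or [ provable (delete Unit gamma) b | Unit `elem` gamma ]
    -- right rule for ⊗: split the context between the two premises
    tensorR = case b of
      Tensor p q -> or [ provable g p && provable d q | (g, d) <- splits gamma ]
      _          -> False
    -- left rule for ⊗: replace A ⊗ B in the context by A, B
    tensorL = or [ provable (p : q : delete f gamma) b
                 | f@(Tensor p q) <- gamma ]
    -- right rule for ⊸: move the antecedent into the context
    lolliR = case b of
      Lolli p q -> provable (p : gamma) q
      _         -> False
    -- left rule for ⊸: prove the antecedent from part of the context,
    -- then use the consequent together with the rest
    lolliL = or [ provable g p && provable (q : d) b
                | f@(Lolli p q) <- gamma
                , (g, d) <- splits (delete f gamma) ]

-- Small test: A, A ⊸ B ⊢ B is provable (evaluation), A ⊢ B is not.
main :: IO ()
main = do
  let a = Atom "A"; c = Atom "B"
  print (provable [a, Lolli a c] c)   -- True
  print (provable [a] c)              -- False
```

The point is that every backward rule application strictly decreases the number of connectives in sight, so the recursion bottoms out; with the cut rule in play there would be no such bound, since the cut formula can be anything at all.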
Multicategories are one semantic device for interpreting such sequents. Another is to think of the “commas” as the monoidal product in a monoidal strictification (a formal way of associating to a given monoidal category an equivalent strict monoidal category). I spoke a little about this back here; see also the comments which follow (including where Tom Leinster chipped in with a remark on unbiased monoidal categories).
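In either reading, the intended interpretation (as I understand it) is that a sequent

$$A_1, A_2, \ldots, A_n \vdash B$$

denotes a multimorphism $(A_1, \ldots, A_n) \to B$, or equivalently, after strictifying, an ordinary morphism $A_1 \otimes A_2 \otimes \cdots \otimes A_n \to B$.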
I may have more comments later.
Re: Logicians Needed Now
You probably know that chemistry (p. 35 of your paper) and logic have been related before. I recall Charles Peirce writing on this. More recently we have the chemical abstract machine.
Hmm, the connection goes back at least as far as Hegel.