## April 30, 2013

### How Does Applied Math Impact Foundations?

#### Posted by John Baez

This weekend I’m giving a talk on “The Foundations of Applied Mathematics”. It’s mostly about how the widespread use of diagrams in engineering, biology, and the like may force us to think about math in new ways, at least if we take those diagrams seriously.

I wouldn’t typically emphasize the term ‘foundations of mathematics’ when talking about this. But I’m speaking at a Category-Theoretic Foundations of Mathematics Workshop, so I want to fit in.

Unfortunately, this means I need to think a bit more generally about how applications of mathematics can impinge on foundational issues. I need to do this just so I’m not taken by surprise when people start objecting or asking tough questions.

So, I’d like to hear what you have to say. Preferably before Sunday morning!

Posted at 8:21 PM UTC | Permalink | Followups (44)

## April 24, 2013

### The Michael and Lily Atiyah Portrait Gallery

#### Posted by Simon Willerton

A new portrait gallery opened this week in the James Clerk Maxwell Building, which houses the Edinburgh University maths department. It consists of seventy portraits of mathematicians selected, as the name suggests, by Michael and Lily Atiyah.

Due to the magic of the internet, you do not have to travel to Edinburgh to see the exhibition as Andrew Ranicki, who helped organise the gallery, has made it available on his website.

The Michael and Lily Atiyah Portrait Gallery

The commentaries on the photographs offer interesting personal insights from the Atiyahs. I’d definitely recommend that you younger mathematicians out there give them a read.

Posted at 8:07 AM UTC | Permalink | Followups (4)

## April 23, 2013

### Modal Types

#### Posted by David Corfield

If Martin-Löf dependent type theory is a formal system which projects down via truncation to (typed) first-order logic, might we not expect there to be a modal type theory which would project down to first-order modal logic? I’ll use this post to play around with the idea.

Normally we take a modal logic to be a system which allows us to study certain ways of qualifying propositions.

It is necessarily the case that $P$; it is obligatory that $P$; it is known that $P$; etc.

With first-order modal logic, these propositions may have free variables which we can then bind by a quantifier. This allows us to express controversial metaphysical theses such as:

‘Everything is necessarily self-identical’ or ‘For every thing, it is necessarily the case that that thing is identical to itself’ ($\forall x \Box (x = x)$).

It still seems that only propositions are modalised. But given all we’ve been hearing at the Café from Mike on type theory, we know that propositions can be considered as just a certain kind of type. Then, if we can modalise propositions-as-types, why not modalise all types? This is what happens in modal type theory.

Posted at 10:34 AM UTC | Permalink | Followups (23)

## April 17, 2013

### Koudenburg on Algebraic Weighted Colimits

#### Posted by Simon Willerton

My student Roald Koudenburg recently defended his thesis successfully, and yesterday he put it on the arXiv.

Roald Koudenburg, Algebraic weighted colimits

I will give a rough caricature of what he does. For a much nicer overview, I suggest you read the well-written introduction to the thesis! (Relating to the Café, there’s even an example involving Simon Wadsley’s Theorem into Coffee.)

Posted at 6:00 PM UTC | Permalink | Followups (8)

### Scones, Logical Relations, and Parametricity

#### Posted by Mike Shulman

Consider a formal system, such as logic, type theory, or some programming language. Suppose you want to prove some “meta-theoretic property” of this system, such as the following.

• Consistency: There is no proof of false. This is obviously a desirable property of a formal system.
• The existence property: if it is provable that “there exists an $x$ such that $P(x)$”, then there is some particular $t$ such that $P(t)$ is provable.
• The disjunction property: if “$P$ or $Q$” is provable, then either $P$ is provable or $Q$ is provable. The existence and disjunction properties are not true of all formal systems: they are characteristic of constructive logics. (Excluded middle obviously violates the disjunction property, while the axiom of choice obviously violates the existence property.) In fact, arguably they are a defining feature of constructive logics, where proving that something exists ought to involve “constructing” a particular such thing.
• Canonicity: any term of natural-number type evaluates to (or, is provably equal to) some numeral (and various similar statements). This is an important property for any system that can function as a programming language: we want our programs to compute results!
• Parametricity: I’ll explain this later on.

How might you go about proving properties like this? If you’re a syntactically minded type theorist, you might try to use induction over types. That is, you first prove something for “base types” such as the natural numbers $N$, then prove that if it holds for $A$ and $B$ then it holds for derived types such as $A\times B$, $A+B$, and $A\to B$, etc. After a lot of work reformulating the thing you’re trying to prove as a sufficiently general statement for such an induction to go through, you finally make it all work and write QED. Then, because the reformulation you ended up with involves assigning “relations” (sometimes unary, sometimes binary) to types in an inductive way, you call the method logical relations.
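To give a small taste of that inductive pattern in executable form, here is a toy Python sketch (entirely my own construction, not the actual method used for the theorems above): over a single finite base type, every simple type has finitely many inhabitants, so the binary logical relation can literally be computed by recursion on the type.

```python
from itertools import product

# Types: 'Bool' is the base type, ('*', A, B) is a product, ('->', A, B)
# a function type.  Functions are represented extensionally, as a tuple of
# outputs indexed by a fixed enumeration of the domain.

def elems(ty):
    """Enumerate all inhabitants of a simple type over Bool."""
    if ty == 'Bool':
        return [False, True]
    tag, A, B = ty
    if tag == '*':
        return [(a, b) for a in elems(A) for b in elems(B)]
    if tag == '->':
        return list(product(elems(B), repeat=len(elems(A))))
    raise ValueError(ty)

def apply_fn(A, f, a):
    """Apply a function-as-tuple f to an argument a of type A."""
    return f[elems(A).index(a)]

def related(ty, v, w):
    """The binary logical relation, by induction on the type: equality at
    the base type, componentwise at products, and 'related arguments go
    to related results' at function types."""
    if ty == 'Bool':
        return v == w
    tag, A, B = ty
    if tag == '*':
        return related(A, v[0], w[0]) and related(B, v[1], w[1])
    if tag == '->':
        return all(related(B, apply_fn(A, v, a), apply_fn(A, w, b))
                   for a in elems(A) for b in elems(A) if related(A, a, b))
    raise ValueError(ty)

# At Bool -> Bool there are four inhabitants; negation is related to
# itself but not to the identity.
arrow = ('->', 'Bool', 'Bool')
neg = (True, False)     # sends False to True and True to False
ident = (False, True)
```

The point to notice is the function-type clause: relatedness at $A \to B$ is *defined* as taking related arguments to related results. That clause is forced on you by the induction, and it is exactly the step that tends to look unmotivated on first encounter.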

I’ve gathered that this sort of thing is perfectly intuitive to a type theorist. However, when I first encountered it, being a category theorist, I had a very hard time understanding it. Certain steps seemed extremely unmotivated, and I didn’t see any principled reason for what was going on. I think I understand the type theorists’ point of view a little better now — but fortunately there’s another way to structure the whole argument, which I think makes much more sense to a category theorist.

Posted at 5:19 PM UTC | Permalink | Followups (12)

## April 16, 2013

### Tutte Polynomials and Magnitude Functions

#### Posted by Tom Leinster

Last summer in Barcelona, Joachim Kock floated the idea that there might be a connection between two invariants of graphs: the Tutte polynomial and the magnitude function. Here I’ll explain what these two invariants are, I’ll give you some examples, and I’ll ask for your help in understanding what the magnitude function tells us.

The Tutte polynomial of a graph is old and well-understood. It turns a graph $G$ into a polynomial $T_G$ in two variables, with natural number coefficients: $T_G(x, y) \in \mathbb{N}[x, y]$.

The magnitude function of a graph is new and ill-understood. It turns a graph $G$ into a rational function $\mu_G$ in one variable, with rational coefficients: $\mu_G(q) \in \mathbb{Q}(q)$. It can also be expanded as a power series with integer coefficients: $\mu_G(q) \in \mathbb{Z}[[q]]$.

I can’t figure out what the relationship between the two is — if any — or what information is contained in the magnitude function. It’s exasperating! I’ll be very pleased if someone can shed light on the situation.
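For concreteness, here is a Python sketch using sympy (the helper names are mine) that computes both invariants for a small graph: the Tutte polynomial by the usual deletion–contraction recursion, and the magnitude function as the sum of the entries of $Z^{-1}$, where $Z_{i j} = q^{d(i, j)}$ and $d$ is shortest-path distance, which is the definition of graph magnitude I'm assuming here.

```python
from collections import defaultdict, deque
import sympy as sp

x, y, q = sp.symbols('x y q')

def connected(a, b, edges):
    """Is there a path from a to b using the given edge list?"""
    adj = defaultdict(set)
    for s, t in edges:
        adj[s].add(t)
        adj[t].add(s)
    seen, stack = {a}, [a]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return b in seen

def tutte(edges):
    """Tutte polynomial of a multigraph (edge list of vertex pairs),
    by deletion-contraction."""
    if not edges:
        return sp.Integer(1)
    (u, v), rest = edges[0], edges[1:]
    if u == v:                       # a loop contributes a factor of y
        return y * tutte(rest)
    # contract the edge by renaming v to u in the remaining edges
    contracted = [(u if s == v else s, u if t == v else t) for s, t in rest]
    if not connected(u, v, rest):    # a bridge contributes a factor of x
        return x * tutte(contracted)
    return tutte(rest) + tutte(contracted)

def magnitude(edges, n):
    """Magnitude function of a connected graph on vertices 0..n-1:
    the sum of the entries of Z^(-1), where Z[i,j] = q**d(i,j)."""
    adj = defaultdict(set)
    for s, t in edges:
        adj[s].add(t)
        adj[t].add(s)
    def dists(src):                  # breadth-first shortest-path distances
        d = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in d:
                    d[w] = d[u] + 1
                    queue.append(w)
        return d
    D = [dists(i) for i in range(n)]
    Z = sp.Matrix(n, n, lambda i, j: q ** D[i][j])
    return sp.cancel(sum(Z.inv()))

triangle = [(0, 1), (1, 2), (0, 2)]
T = tutte(triangle)           # x**2 + x + y
M = magnitude(triangle, 3)    # 3/(2*q + 1)
```

For the triangle $K_3$ this gives $T_{K_3}(x, y) = x^2 + x + y$ and $\mu_{K_3}(q) = 3/(1 + 2q)$, whose expansion $3(1 - 2q + 4q^2 - \cdots)$ illustrates the integer power-series coefficients mentioned above.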

Posted at 9:45 PM UTC | Permalink | Followups (64)

## April 12, 2013

### Colouring a Graph

#### Posted by Tom Leinster

It had always seemed to me that category theory provided no useful perspective on graph theory. But yesterday I learned a small fact that caused me to slightly revise that opinion. It’s that colourings of graphs — which seem to be a major source of questions in graph theory — are simply homomorphisms into a complete graph.

I’ll explain this, and I’ll describe a conjecture that Tom Hirschowitz told me about yesterday: for graphs $G$ and $H$,

The chromatic number of $G \times H$ is the minimum of the chromatic numbers of $G$ and $H$.

It’s amazing that something so basic is unknown!
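Both ingredients are easy to make concrete in code. The sketch below (all names mine) checks that a proper $k$-colouring of a graph is exactly a homomorphism to $K_k$, and verifies the conjectured equality, which I believe is known as Hedetniemi's conjecture, by brute force for one small case: the categorical product of $C_5$ and $K_3$.

```python
def symmetrize(edges):
    """Store an undirected edge set as a set of ordered pairs."""
    return {(u, v) for (s, t) in edges for (u, v) in [(s, t), (t, s)]}

def complete(k):
    """Edge set of the complete graph K_k on vertices 0..k-1."""
    return symmetrize((i, j) for i in range(k) for j in range(i + 1, k))

def is_hom(f, G_edges, H_edges):
    """Does the map f (a dict) send every edge of G to an edge of H?"""
    return all((f[u], f[v]) in H_edges for (u, v) in G_edges)

def chromatic_number(vertices, edges):
    """Smallest k admitting a proper colouring, i.e. a homomorphism to K_k,
    found by backtracking."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
    order = sorted(vertices, key=lambda v: -len(adj[v]))
    def extend(colour, i, k):
        if i == len(order):
            return True
        v = order[i]
        for c in range(k):
            if all(colour.get(w) != c for w in adj[v]):
                colour[v] = c
                if extend(colour, i + 1, k):
                    return True
                del colour[v]
        return False
    k = 1
    while not extend({}, 0, k):
        k += 1
    return k

def tensor(Gv, Ge, Hv, He):
    """Categorical product: pairs are adjacent iff adjacent in both factors."""
    V = [(g, h) for g in Gv for h in Hv]
    E = {((g1, h1), (g2, h2)) for (g1, g2) in Ge for (h1, h2) in He}
    return V, E

C5_v, K3_v = list(range(5)), list(range(3))
C5_e = symmetrize((i, (i + 1) % 5) for i in range(5))
K3_e = complete(3)
Pv, Pe = tensor(C5_v, C5_e, K3_v, K3_e)

colouring = {0: 0, 1: 1, 2: 0, 3: 1, 4: 2}   # a proper 3-colouring of C5
```

A spot-check of one example is of course no evidence for the conjecture; cases with small chromatic number were settled long ago, so this is only a sanity check that the definitions hang together.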

Posted at 12:54 AM UTC | Permalink | Followups (55)

## April 5, 2013

### Category-Theoretic Foundations in Irvine

#### Posted by John Baez

I’d like to remind you of this workshop, which is coming up soon:

The schedule of talks is available now; I’ll just tell you what they’re about.

Posted at 12:06 AM UTC | Permalink | Followups (13)

## April 3, 2013

### Iterating the Free Monoid Construction

#### Posted by John Baez

The free monoid on a set $X$ is

$F X = 1 + X + X^2 + X^3 + \cdots$

and the same formula works in many other categories. I guess it works in any monoidal category with countable coproducts, where the tensor product distributes over them.

Sometimes it’s nice to study this free monoid construction using geometric series, by pretending that

$F X = \frac{1}{1 - X}$

It doesn’t exactly make sense, at least not to me, since I can’t think of any categories where I know both what “$1 - X$” means for an object $X$, and what the “reciprocal” of an object means. But we can still sometimes squeeze some useful insights out of this idea.
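In $Set$, for instance, $F X$ is just the set of finite words in $X$, and the formula can be illustrated directly in Python by truncating the coproduct at some finite length (names mine):

```python
from itertools import product as cartesian

def words(X, max_len):
    """Truncation of FX = 1 + X + X^2 + ... : all words over X of length
    at most max_len, with words as tuples (the empty tuple is the unit)."""
    out = [()]
    for n in range(1, max_len + 1):
        out.extend(cartesian(X, repeat=n))
    return out

X = ['a', 'b']
FX3 = words(X, 3)               # 1 + 2 + 4 + 8 = 15 words

# The monoid structure is concatenation, with the empty word as unit:
assert ('a',) + ('b', 'a') == ('a', 'b', 'a')
assert () + ('a',) == ('a',)

# Iterating gives words of words: |words(X, 2)| = 7, so the length-2
# truncation of FFX has 1 + 7 + 7**2 = 57 elements.
FFX2 = words(words(X, 2), 2)
```

Counting the truncations already shows how quickly the iterated construction grows, which is part of what makes the question below interesting.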

But Mike Stay asks, what about $F F F X$?

Posted at 7:42 PM UTC | Permalink | Followups (28)

## April 1, 2013

#### Posted by John Baez

For any permutation $\sigma$ of 3 letters, we can find real polynomials in one variable, say $f_1, f_2, f_3$, such that

$f_1(x) < f_2(x) < f_3(x)$

when $x$ is negative, but

$f_{\sigma(1)}(x) < f_{\sigma(2)}(x) < f_{\sigma(3)}(x)$

when $x$ is positive.

However, Kontsevich noted that the analogous statement for permutations of 4 letters is false. It’s not true that if $\sigma$ is any permutation of 4 letters, we can find real polynomials in one variable $f_1, f_2, f_3, f_4$ such that

$f_1(x) < f_2(x) < f_3(x) < f_4(x)$

when $x$ is negative, but

$f_{\sigma(1)}(x) < f_{\sigma(2)}(x) < f_{\sigma(3)}(x) < f_{\sigma(4)}(x)$

when $x$ is positive.

It’s only true for 22 of these 24 permutations.
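For three letters it is easy to exhibit explicit witnesses. The following Python spot-check (my own illustrative choices of polynomials, checked numerically at sample points rather than proved) realizes two permutations: the full reversal with $f_1 = x$, $f_2 = 0$, $f_3 = -x$, and the transposition of the first two with $f_1 = x$, $f_2 = -x$, $f_3 = x^2 + 1$.

```python
def order(fs, t):
    """Indices 1..len(fs), sorted so that the values f_i(t) increase."""
    return tuple(sorted(range(1, len(fs) + 1), key=lambda i: fs[i - 1](t)))

# Illustrative witnesses (my own choices, not from the post):
reversal = [lambda t: t, lambda t: 0 * t, lambda t: -t]     # aims at (3, 2, 1)
swap12 = [lambda t: t, lambda t: -t, lambda t: t * t + 1]   # aims at (2, 1, 3)

for t in (-10.0, -1.0, -0.25, -0.01):       # negative side: natural order
    assert order(reversal, t) == (1, 2, 3)
    assert order(swap12, t) == (1, 2, 3)
for t in (0.01, 0.25, 1.0, 10.0):           # positive side: permuted order
    assert order(reversal, t) == (3, 2, 1)
    assert order(swap12, t) == (2, 1, 3)
```

In both cases the orderings can also be verified by hand for all $x$: for instance $-x < x < x^2 + 1$ holds for every $x > 0$ because $x^2 - x + 1 > 0$ everywhere.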

Posted at 6:45 PM UTC | Permalink | Followups (19)