
February 27, 2015

Concepts of Sameness (Part 4)

Posted by John Baez

This time I’d like to think about three different approaches to ‘defining equality’, or more generally, introducing equality in formal systems of mathematics.

These will be taken from old-fashioned logic — before computer science, category theory or homotopy theory started exerting their influence. Eventually I want to compare these to more modern treatments.

If you know other interesting ‘old-fashioned’ approaches to equality, please tell me!

Posted at 10:21 AM UTC | Permalink | Followups (29)

February 26, 2015

Introduction to Synthetic Mathematics (part 1)

Posted by Mike Shulman

John is writing about “concepts of sameness” for Elaine Landry’s book Category Theory for the Working Philosopher, and has been posting some of his thoughts and drafts. I’m writing for the same book about homotopy type theory / univalent foundations; but since HoTT/UF will also make a guest appearance in John’s and David Corfield’s chapters, and one aspect of it (univalence) is central to Steve Awodey’s chapter, I had to decide what aspect of it to emphasize in my chapter.

My current plan is to focus on HoTT/UF as a synthetic theory of \infty-groupoids. But in order to say what that even means, I felt that I needed to start with a brief introduction about the phrase “synthetic theory”, which may not be familiar. My current draft of that “introduction” is already more than half the allotted length of my chapter, so clearly it’ll need to be trimmed! But I thought I would go ahead and post some parts of it in its current form; so here goes.

Posted at 6:15 AM UTC | Permalink | Followups (42)

February 25, 2015

Concepts of Sameness (Part 3)

Posted by John Baez

Now I’d like to switch to pondering different approaches to equality. (Eventually I’ll put all these pieces together into a coherent essay, but not yet.)

We tend to think of x = x as a fundamental property of equality, perhaps the most fundamental of all. But what is it actually used for? I don’t really know. I sometimes joke that equations of the form x = x are the only really true ones — since any other equation says that different things are equal — but they’re also completely useless.

But maybe I’m wrong. Maybe equations of the form x = x are useful in some way. I can imagine one coming in handy at the end of a proof by contradiction where you show some assumptions imply x \ne x. But I don’t remember ever doing such a proof… and I have trouble imagining that you ever need to use a proof of this style.

If you’ve used the equation x = x in your own work, please let me know.

Posted at 1:20 AM UTC | Permalink | Followups (28)

February 23, 2015

Concepts of Sameness (Part 2)

Posted by John Baez

I’m writing about ‘concepts of sameness’ for Elaine Landry’s book Category Theory for the Working Philosopher. After an initial section on a passage by Heraclitus, I had planned to write a bit about Gongsun Long’s white horse paradox — or more precisely, his dialog When a White Horse is Not a Horse.

However, this is turning out to be harder than I thought, and more of a digression than I want. So I’ll probably drop this plan. But I have a few preliminary notes, and I might as well share them.

Posted at 3:31 AM UTC | Permalink | Followups (45)

Concepts of Sameness (Part 1)

Posted by John Baez

Elaine Landry is a philosopher at U. C. Davis, and she’s editing a book called Categories for the Working Philosopher. Tentatively, at least, it’s supposed to have chapters by these folks:

  • Colin McLarty (on set theory)
  • David Corfield (on geometry)
  • Michael Shulman (on univalent foundations)
  • Steve Awodey (on structuralism, invariance, and univalence)
  • Michael Ernst (on foundations)
  • Jean-Pierre Marquis (on first-order logic with dependent sorts)
  • John Bell (on logic and model theory)
  • Kohei Kishida (on modal logic)
  • Robin Cockett and Robert Seely (on proof theory and linear logic)
  • Samson Abramsky (on computer science)
  • Michael Moortgat (on linguistics and computational semantics)
  • Bob Coecke and Aleks Kissinger (on quantum mechanics and ontology)
  • James Weatherall (on spacetime theories)
  • Jim Lambek (on special relativity)
  • John Baez (on concepts of sameness)
  • David Spivak (on mathematical modeling)
  • Hans Halvorson (on the structure of physical theories)
  • Elaine Landry (on structural realism)
  • Andrée Ehresmann (on a topic to be announced)

We’re supposed to have our chapters done by April. To make writing my part more fun, I thought I’d draft some portions here on the n-Café.

Posted at 12:20 AM UTC | Permalink | Followups (29)

February 18, 2015

Quantum Physics and Logic at Oxford

Posted by John Baez

There’s a workshop on quantum physics and logic at Oxford this summer:

Posted at 9:52 PM UTC | Permalink | Followups (1)

February 12, 2015

Can a Computer Solve Lebesgue’s Universal Covering Problem?

Posted by John Baez

Here’s a problem I hope we can solve here. I think it will be fun. It involves computable analysis.

To state the problem precisely, recall that the diameter of a set of points A in a metric space is

diam(A) = \sup\{d(x,y) : x, y \in A\}

Recall that two subsets of the Euclidean plane \mathbb{R}^2 are isometric if we can get one from the other by translation, rotation and/or reflection.

Finally, let’s define a universal covering to be a convex compact subset K of the Euclidean plane such that any set A \subseteq \mathbb{R}^2 of diameter 1 is isometric to a subset of K.
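As a small concrete aside (not from the post; the function names are mine): for a finite point set the sup in the definition of the diameter is attained, so it’s just a maximum over pairs and easy to compute.

```python
from itertools import combinations
from math import dist, sqrt

def diameter(points):
    """Diameter of a finite point set: the maximum pairwise distance.
    (For finite sets the sup in the definition is attained.)"""
    if len(points) < 2:
        return 0.0
    return max(dist(p, q) for p, q in combinations(points, 2))

# The vertices of an equilateral triangle with unit side have diameter 1,
# so any universal covering must contain an isometric copy of them.
triangle = [(0.0, 0.0), (1.0, 0.0), (0.5, sqrt(3) / 2)]
```

Here `diameter(triangle)` is 1, which is one way the known lower bounds on the area get started: a universal covering must hold every diameter-1 set, the equilateral triangle included.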

In 1914 Lebesgue posed the puzzle of finding the universal covering with the least area. Since then people have been using increasingly clever constructions to find universal coverings with smaller and smaller area.

My question is whether we need an unbounded amount of cleverness, or whether we could write a program to solve this puzzle.

There are actually a number of ways to make this question precise, but let me focus on the simplest. Let \mathcal{U} be the set of all universal coverings. Can we write a program that computes this number to as much accuracy as desired:

a = \inf\{\mathrm{area}(K) : K \in \mathcal{U}\} \; ?

More precisely, is this real number a computable?

Right now all we know is that

0.832 \le a \le 0.844115376859\dots

though Philip Gibbs has a heuristic argument for a better lower bound:

0.84408 \le a

Posted at 12:53 AM UTC | Permalink | Followups (32)

February 9, 2015

Higher-Dimensional Rewriting in Warsaw

Posted by John Baez

This summer there will be a conference on higher-dimensional algebra and rewrite rules in Warsaw. They want people to submit papers! I’ll give a talk about presentations of symmetric monoidal categories that are important in electrical engineering and control theory. There should also be interesting talks about combinatorial algebra, homotopical aspects of rewriting theory, and more:

Here’s a description…

Posted at 10:36 PM UTC | Permalink | Post a Comment

February 4, 2015

More on the AMS and NSA

Posted by Tom Leinster

Just a quickie. This month’s Notices of the AMS ran an article by Michael Wertheimer, recently-retired Director of Research at the NSA, largely about the accusation that the NSA deliberately created a backdoor in a standard cryptographic utility so that they could decode the messages of anyone using it.

Wertheimer’s protestations garnered an unusual amount of press and a great deal of scepticism (e.g. Le Monde, Ars Technica, The Register, Peter Woit, me), with the scepticism especially coming from crypto experts (e.g. Matthew Green, Ethan Heilman).

Some of those experts — also including Bruce Schneier — are writing to the Notices pointing out how misleading Wertheimer’s piece was, with ample historical evidence. And crucially: that in everything Wertheimer wrote, he never actually denied that the NSA created a backdoor.

If you support this letter — and if, more broadly, you think it’s important that the AMS reconsider its relationship with the NSA — then you can add your signature.

Posted at 2:37 AM UTC | Permalink | Followups (7)

Lebesgue’s Universal Covering Problem

Posted by John Baez

Lebesgue’s universal covering problem is famously difficult, and a century old. So I’m happy to report some progress:

• John Baez, Karine Bagdasaryan and Philip Gibbs, Lebesgue’s universal covering problem.

But we’d like you to check our work! It will help if you’re good at programming. As far as the math goes, it’s just high-school geometry… carried to a fanatical level of intensity.

Posted at 1:23 AM UTC | Permalink | Followups (12)

January 19, 2015

The Univalent Perspective on Classifying Spaces

Posted by Mike Shulman

I feel like I should apologize for not being more active at the Cafe recently. I’ve been busy, of course, and also most of my recent blog posts have been going to the HoTT blog, since I felt most of them were of interest only to the HoTT crowd (by which I mean, “people interested enough in HoTT to follow the HoTT blog” — which may of course include many Cafe readers as well). But today’s post, while also inspired by HoTT, is less technical and (I hope) of interest even to “classical” higher category theorists.

In general, a classifying space for bundles of X’s is a space B such that maps Y \to B are equivalent to bundles of X’s over Y. In classical algebraic topology, such spaces are generally constructed as the geometric realization of the nerve of a category of X’s, and as such they may be hard to visualize geometrically. However, it’s generally useful to think of B as a space whose points are X’s, so that the classifying map Y \to B of a bundle of X’s assigns to each y \in Y the corresponding fiber (which is an X). For instance, the classifying space B O of vector bundles can be thought of as a space whose points are vector spaces, where the classifying map of a vector bundle assigns to each point the fiber over that point (which is a vector space).

In classical algebraic topology, this point of view can’t be taken quite literally, although we can make some use of it by identifying a classifying space with its representable functor. For instance, if we want to define a map f : B O \to B O, we’d like to say “a point v \in B O is a vector space, so let’s do blah to it and get another vector space f(v) \in B O”. We can’t do that, but we can do the next best thing: if blah is something that can be done fiberwise to a vector bundle in a natural way, then since Hom(Y, B O) is naturally equivalent to the collection of vector bundles over Y, our blah defines a natural transformation Hom(-, B O) \to Hom(-, B O), and hence a map f : B O \to B O by the Yoneda lemma.
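Spelled out, the Yoneda step here is just the standard bijection (stated for clarity; nothing beyond textbook Yoneda):

Nat(Hom(-, B O), Hom(-, B O)) \cong Hom(B O, B O)

so a construction that acts naturally on all vector bundles at once picks out exactly one map f : B O \to B O.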

However, in higher category theory and homotopy type theory, we can really take this perspective literally. That is, if by “space” we choose to mean “\infty-groupoid” rather than “topological space up to homotopy”, then we can really define the classifying space to be the \infty-groupoid of X’s, whose points (objects) are X’s, whose morphisms are equivalences between X’s, and so on. Now, in defining a map such as our f, we can actually just give a map from X’s to X’s, as long as we check that it’s functorial on equivalences — and if we’re working in HoTT, we don’t even have to do the second part, since everything we can write down in HoTT is automatically functorial/natural.

This gives a different perspective on some classifying-space constructions that can be more illuminating than a classical one. Below the fold I’ll discuss some examples that have come to my attention recently.

Posted at 6:25 PM UTC | Permalink | Followups (15)

January 9, 2015

The AMS Must Justify Its Support of the NSA

Posted by Tom Leinster

That’s the title of a letter I’ve just had published in the Notices of the AMS (Feb 2015, out yesterday). Text follows. There’s also a related letter from Daniel Stroock of MIT.

Plus, there’s an article by the NSA’s director of research, Michael Wertheimer. I have a few points to make about that — read on.

Posted at 7:11 AM UTC | Permalink | Followups (19)

January 5, 2015

Mathematics and Magic: the de Bruijn Card Trick

Posted by Simon Willerton

A mathematician hands out a pack of cards to a group of five people. They repeatedly cut the deck and then take a card each. The mathematician tries to use telepathy to divine the cards that the people are holding but unfortunately due to solar disturbances, the mind waves are a bit scrambled and the mathematician has to ask a few questions to help unscramble the images being received. “Who had porridge for breakfast?” “Who is holding a red card?” “Is anyone a Pisces?” “Who has a dog called Stanley?” The answers to these questions are sufficient to allow the mathematician to name the card that each person is holding. The audience applaud wildly.

[Image: five playing cards: A♠, 2♦, 5♣, 3♠, 6♦]

I learnt about this trick in a book I got for Christmas.

The first thing to note is that the authors are both respected mathematicians, so it is perhaps not surprising to learn that the mathematics involved is actually non-trivial. In my undergraduate course on Knots and Surfaces I do a few knot and rope tricks to enliven the lectures and to demonstrate some of the ideas in the course, but these are generally sleight-of-hand tricks, unlike the tricks in this book, which all have some interesting mathematics underlying them.

In this post I want to wear my mathematician’s hat and explain how the above card trick is based on the existence of de Bruijn sequences. Of course, if I were wearing my magician’s hat, I wouldn’t be allowed to reveal how the trick works!
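To make the underlying object concrete (a sketch, not from the post): with a 32-card deck, the relevant object is a binary de Bruijn sequence of order 5, a cyclic bit string in which every window of 5 consecutive bits occurs exactly once, so the red/black answers of 5 adjacent cards pin down their position in the deck. Here is the standard FKM (Lyndon-word concatenation) construction; the function name is illustrative:

```python
def de_bruijn(k: int, n: int) -> list[int]:
    """Generate a de Bruijn sequence B(k, n) via the FKM algorithm:
    concatenate, in lexicographic order, the Lyndon words over
    {0, ..., k-1} whose length divides n."""
    a = [0] * (k * n)
    seq: list[int] = []

    def db(t: int, p: int) -> None:
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

# Binary, order 5: a cyclic sequence of 2^5 = 32 bits in which every
# 5-bit window appears exactly once, enough to index a 32-card deck.
bits = de_bruijn(2, 5)
```

Cutting the deck merely rotates the cyclic sequence, which is why the repeated cuts in the performance do nothing to disturb the trick.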

Posted at 7:36 PM UTC | Permalink | Followups (8)

January 1, 2015

Integral Octonions (Part 12)

Posted by John Baez

This time I’d like to swerve from the main road and tell you about a cute connection between lattices and braided monoidal categories.

We’ve been looking at a lot of lattices lately, like the E 8\mathrm{E}_8 lattice and the Leech lattice. A lattice is an abelian group under addition, so we can try to categorify it and construct a ‘2-group’ with points in the lattice as objects, but also some morphisms. Today I’ll show you that for a lattice in a vector space with an inner product, there’s a nice 1-parameter family of ways to do this, each of which gives a ‘braided 2-group’. Here the commutative law for addition in our lattice:

a + b = b + a

is replaced by an isomorphism:

a + b \cong b + a

And this has some fun spinoffs. For example: for any compact simple Lie group G, the category of representations Rep(T) of any maximal torus T \subseteq G, with its usual tensor product, has a 1-parameter family of braidings that are invariant under the action of the Weyl group.
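To sketch how such a family can arise (my own summary; the notation, in particular the parameter q, is not from the post): for a lattice L in an inner product space with pairing \langle -, - \rangle, one can try to define the braiding on objects a, b \in L as multiplication by a scalar depending on the inner product,

\beta_{a,b} : a + b \to b + a, \qquad \beta_{a,b} = q^{\langle a, b \rangle}

for a fixed invertible scalar q. The hexagon identities then come down to bilinearity of \langle -, - \rangle, and varying q traces out a 1-parameter family of braidings.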

What is this good for? I don’t know! I hope you can help me out. The best clues I have seem to be lurking here:

Posted at 1:17 AM UTC | Permalink | Followups (13)

December 31, 2014

Can One Explain Schemes to a Biologist?

Posted by John Baez

Tonight I read in Lior Pachter’s blog:

I’m a (50%) professor of mathematics and (50%) professor of molecular & cell biology at UC Berkeley. There have been plenty of days when I have spent the working hours with biologists and then gone off at night with some mathematicians. I mean that literally. I have had, of course, intimate friends among both biologists and mathematicians. I think it is through living among these groups and much more, I think, through moving regularly from one to the other and back again that I have become occupied with the problem that I’ve christened to myself as the ‘two cultures’. For constantly I feel that I am moving among two groups — comparable in intelligence, identical in race, not grossly different in social origin, earning about the same incomes, who have almost ceased to communicate at all, who in intellectual, moral and psychological climate have so little in common that instead of crossing the campus from Evans Hall to the Li Ka Shing building, I may as well have crossed an ocean.

I try not to become preoccupied with the two cultures problem, but this holiday season I have not been able to escape it. First there was a blog post by David Mumford, a professor emeritus of applied mathematics at Brown University, published on December 14th. For those readers of the blog who do not follow mathematics, it is relevant to what I am about to write that David Mumford won the Fields Medal in 1974 for his work in algebraic geometry, and afterwards launched another successful career as an applied mathematician, building on Ulf Grenander’s Pattern Theory and making significant contributions to vision research. A lot of his work is connected to neuroscience and therefore biology. Among his many awards are the MacArthur Fellowship, the Shaw Prize, the Wolf Prize and the National Medal of Science. David Mumford is not Joe Schmo.

It therefore came as a surprise to me to read his post titled “Can one explain schemes to biologists?” in which he describes the rejection by the journal Nature of an obituary he was asked to write. Now I have to say that I have heard of obituaries being retracted, but never of an obituary being rejected. The Mumford rejection is all the more disturbing because it happened after he was invited by Nature to write the obituary in the first place!

The obituary Mumford was asked to write was for Alexander Grothendieck, a towering figure in 20th-century mathematics.

Posted at 6:36 AM UTC | Permalink | Followups (8)