## March 11, 2018

### Stabilization of Derivators

#### Posted by Mike Shulman

*(guest post by Ian Coley)*

I recently posted a paper to the arXiv which reconstructs an old paper of Alex Heller. Heller’s *Stable homotopy theories and stabilization* is one of a few proto-derivator papers that are still oft-cited by those of us studying derivators — a subject absent from this website since the two papers of Mike Shulman and Kate Ponto were published in 2014! Therefore, before getting into the paper itself, it’s worth recalling what a derivator is supposed to be and do. For those interested in the long version, check out the nLab article or Moritz Groth’s excellent paper.

## March 10, 2018

### Cognition, Convexity, and Category Theory

#### Posted by John Baez

*(guest post by Tai-Danae Bradley and Brad Theilman)*

Recently in the Applied Category Theory Seminar our discussions have returned to modeling natural language, this time via *Interacting Conceptual Spaces I* by Joe Bolt, Bob Coecke, Fabrizio Genovese, Martha Lewis, Dan Marsden, and Robin Piedeleu. In this paper, convex algebras lie at the heart of a compositional model of cognition based on Peter Gärdenfors’ theory of conceptual spaces. We summarize the ideas in today’s post.

Sincere thanks go to Brendan Fong, Nina Otter, Fabrizio Genovese, Joseph Hirsh, and other participants of the seminar for helpful discussions and feedback.

## March 4, 2018

### Coarse-Graining Open Markov Processes

#### Posted by John Baez

Kenny Courser and I have been working hard on this paper for months:

- John Baez and Kenny Courser, Coarse-graining open Markov processes.

It may be almost done. So, it would be great if you folks could take a look and comment on it! It’s a cool mix of probability theory and double categories.

‘Coarse-graining’ is a standard method of extracting a simpler Markov process from a more complicated one by identifying states. We extend coarse-graining to open Markov processes. An ‘open’ Markov process is one where probability can flow in or out of certain states called ‘inputs’ and ‘outputs’. One can build up an ordinary Markov process from smaller open pieces in two basic ways:

- composition, where we identify the outputs of one open Markov process with the inputs of another,

and

- tensoring, where we set two open Markov processes side by side.
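As a toy illustration of the identification step, here is a hypothetical discrete-time sketch (not code from the paper, which works with continuous-time open Markov processes): coarse-graining an ordinary Markov chain amounts to lumping states together according to a partition, which is well defined when every state in a block assigns the same total probability to each block.

```python
def lump(P, partition):
    """Coarse-grain the transition matrix P over a partition of its states.

    Each block of the partition becomes a single state of the coarse-grained
    chain. This is well defined only when P is 'lumpable' for the partition:
    every state in a block must send the same total probability to each block.
    """
    Q = []
    for block in partition:
        # total probability from each state of this block into every block
        rows = [
            [sum(P[i][j] for j in target) for target in partition]
            for i in block
        ]
        # lumpability check: all states in the block must agree
        assert all(r == rows[0] for r in rows), \
            "chain is not lumpable for this partition"
        Q.append(rows[0])
    return Q

# Three states identified down to two: block A = {0, 1}, block B = {2}.
P = [[0.5,  0.25, 0.25],
     [0.25, 0.5,  0.25],
     [0.25, 0.25, 0.5]]
Q = lump(P, [[0, 1], [2]])
# Q == [[0.75, 0.25], [0.5, 0.5]]
```

The lumpability check is the discrete-time shadow of the compatibility conditions a coarse-graining 2-morphism must satisfy: identifying states is only harmless when the identified states behave alike toward the rest of the system.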

A while back, Brendan Fong, Blake Pollard and I showed that these constructions make open Markov processes into the morphisms of a symmetric monoidal category:

- A compositional framework for Markov processes, $n$-Category Café, January 12, 2016.

Here Kenny and I go further by constructing a symmetric monoidal *double* category where the 2-morphisms include ways of coarse-graining open Markov processes. We also extend the previously defined ‘black-boxing’ functor from the category of open Markov processes to this double category.

But before you dive into the paper, let me explain all this stuff a bit more….

## March 1, 2018

### Univalence From Scratch

#### Posted by Mike Shulman

Martín Escardó has written “a self-contained, brief and complete formulation of Voevodsky’s Univalence Axiom” in English and Agda: