## September 25, 2007

### Progic II

#### Posted by David Corfield

My research time at present largely consists of the time it takes me to travel to Canterbury and back. Here’s what I came up with on the subject of progic on yesterday’s trip down.

A probabilistic predicate, $P$, I said back here, is a map from $X$ to $[0, 1]$. But another way to look at it is as a conditional probability distribution on the set $2$ given $X$. This is a map in the Kleisli category of the probability monad. In fact, our probabilistic predicate is a map $P: X \to 2$. Now we can look at the composition of a distribution over $X$ and this $P$.

Let’s take $X$ to be a set of dogs and $Q(x)$ a probability distribution over $X$, perhaps recording my degree of belief about which dog I saw just now as it dashed past. Then if $P(x)$ is the predicate recording my belief as to whether $x$ is a poodle, $\sum_x P(x) \times Q(x)$ is my degree of belief that I just glimpsed a poodle.
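For concreteness, here is how that calculation looks for finite distributions in Python. The dogs and all the probabilities are made up purely for illustration:

```python
# A sketch with invented numbers: X is a finite set of dogs, Q a
# distribution over X (my belief about which dog dashed past), and P a
# probabilistic predicate giving the probability that each dog is a poodle.
Q = {"rex": 0.5, "fifi": 0.3, "bruno": 0.2}   # distribution over X
P = {"rex": 0.1, "fifi": 0.9, "bruno": 0.2}   # probabilistic predicate on X

# Degree of belief that I glimpsed a poodle: sum_x P(x) * Q(x)
belief_poodle = sum(P[x] * Q[x] for x in Q)
print(belief_poodle)  # ≈ 0.36
```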

This is one of Mumford’s random constants, i.e., probabilistic predicates over {*}, or maps in the Kleisli category $1 \to 2$.

Now how much of all that stuff about functions between $X$ and $Y$ and adjunctions that I mentioned before can we bring with us? First, why not have probabilistic functions, or in other words conditional distributions $Own(y | x)$, e.g., recording my uncertainty as to the owner of dog $x$? Then for every probabilistic predicate on humans, I can pull it back to give a probabilistic predicate on dogs. E.g., if $F(y)$ records my degree of belief that person $y$ is French, I have a degree of belief, $\sum_y F(y) \cdot Own(y | x)$ that $x$ is owned by a Frenchman, and $\sum_{x, y} F(y) \cdot Own(y | x) \cdot Q(x)$ that the dog I just glimpsed is owned by a Frenchman.
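The pullback along $Own(y | x)$ can be sketched the same way, again with entirely invented owners and numbers:

```python
# Hypothetical data: Own[x][y] = Own(y | x), my uncertainty about who owns
# dog x; F[y] = my degree of belief that person y is French; Q = my belief
# about which dog I just glimpsed.
Own = {"rex":  {"alice": 0.7, "bob": 0.3},
       "fifi": {"alice": 0.2, "bob": 0.8}}
F = {"alice": 0.9, "bob": 0.1}
Q = {"rex": 0.6, "fifi": 0.4}

# Pullback: belief that dog x is owned by a Frenchman, sum_y F(y) * Own(y|x)
owned_by_french = {x: sum(F[y] * Own[x][y] for y in F) for x in Own}

# Belief that the glimpsed dog is owned by a Frenchman:
# sum_{x,y} F(y) * Own(y|x) * Q(x)
belief = sum(owned_by_french[x] * Q[x] for x in Q)
```

So a probabilistic predicate on humans pulls back to one on dogs just by matrix-vector multiplication with the conditional distribution.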

But can we form those adjunctions? An obvious choice of a structure to place on the probabilistic predicates on $X$ is a partial order where $P \leq Q$ if and only if $P(x) \leq Q(x)$, for all $x \in X$. So if ‘Poodle’ $\leq$ ‘owned by a Frenchman’, do we have something like ‘owner of at least one poodle’ $\leq$ ‘French’? It seems to me unlikely that there are mappings from probabilistic predicates over $Y$ to those over $X$, adjoint to this ‘multiplication’ by $Own(y | x)$.

Posted at September 25, 2007 9:34 AM UTC


### Re: Progic II

We can see the adjoints at play in the case of predicate logic preserving (co)limits.

Substitution being both a left and right adjoint, we have

Being owned by a French and happy person is equivalent to being owned by a French person and being owned by a happy person.

(There being just one owner.)

Being owned by a French or a happy person is equivalent to being owned by a French person or being owned by a happy person.

So, the product and coproduct are preserved.

Being an owner of only poodles and an owner of only black dogs is equivalent to being an owner only of black poodles.

But now ‘and’, including the one implicit in ‘black poodles’, cannot be replaced by ‘or’.

Being an owner of a poodle or an owner of a black dog is equivalent to being an owner of a black dog or a poodle.

Now ‘or’ cannot be replaced by ‘and’.

This nice picture is going to fall apart presumably in the probabilistic case.

Posted by: David Corfield on September 25, 2007 2:33 PM | Permalink | Reply to this

### Re: Progic II

You might want to look into using the category of probability kernels (where the objects are measurable spaces, and the morphisms are probability kernels between these spaces, which can be thought of either as families of probability measures in the target space indexed by the domain, or as a linear map from probability measures in the domain to that in the target). It seems to be a fairly convenient category for managing concepts such as relative information, relative products, conditional expectation, independence, etc.; they also show up in learning theory. (Strangely, however, they seem to fail to show up much on the internet; I was unable to find a good introduction to them online, but see for instance page 20 of this book.)

[Hmm, weird, I can’t get that URL to show up at all. It was Kallenberg’s “Foundations of modern probability” on Google Books.]

Posted by: Terry Tao on September 25, 2007 3:41 PM | Permalink | Reply to this

### Re: Progic II

Thanks. Something to read on the train. Google Books seems to delight in missing out passages where kernels are mentioned.

Posted by: David Corfield on September 25, 2007 6:29 PM | Permalink | Reply to this

### Re: Progic II

Terry wrote:

You might want to look into using the category of probability kernels (where the objects are measurable spaces, and the morphisms are probability kernels between these spaces, which can be thought of either as families of probability measures in the target space indexed by the domain, or as a linear map from probability measures in the domain to that in the target).

This sounds exactly like the ‘Kleisli category’ or ‘category of free algebras’ that David and I are talking about!

Indeed there are always two equivalent ways to describe morphisms in such situations. The simpler way is the first one you mention. This is what I called a probabilistic morphism from $X$ to $Y$: a map from $X$ to the space of probability measures on $Y$:

$X \stackrel{f}{\to} P Y$

But the second way makes it a bit more obvious how to compose these morphisms.

I composed my morphisms the hard way, like this:

$X \stackrel{f}{\to} P Y \stackrel{P g}{\to} P P Z \stackrel{\mu_Z}{\to} P Z$

where in the last step we convert a probability measure on probability measures on $Z$ to a probability measure on $Z$. This is standard ‘Kleisli category’ yoga.

Posted by: John Baez on September 25, 2007 9:18 PM | Permalink | Reply to this

### Re: Progic II

It’s nice to see progress on the progic project. Do you think it’s philosophically pukka enough to slip through the foundationalist filter into the vats of funding? Certainly both probability and logic still count as ‘philosophical’ — while the territories of geometry and group theory have somehow been lost to philosophers.

What follows is a brief attempt to understand, explain, and popularize what you’ve done so far.

Anyway:

You seem to be blending logic as done in the topos $Set$ with ‘my’ probability monad $P : Set \to Set$. Or maybe you’re blending logic as done in the topos (?) of Polish spaces with ‘your’ probability monad $P : Pol \to Pol$. I guess when making initial forays into the subject it’s not worth getting too specific about which formalism you’re using. The approach using sets has the advantage of being cheap and low-tech — everyone thinks they know what a set is. But eventually you’ll want to use the high-budget Polish space approach, to get Lebesgue measure on $[0,1]$ into the game!

One advantage of keeping both options on the table is that it provides an excuse to imagine you’re working very generally, starting from any topos equipped with a ‘probability monad’. I’m not sure what a probability monad actually is at this level of generality. But that doesn’t matter! I’ll just take the formal structure of ‘topos with monad’ and use it to examine what you’ve done so far…

Given objects $X$ and $Y$ in our topos, a probabilistic morphism from $X$ to $Y$ is a morphism from $X$ to probability measures on $Y$: $f: X \to P Y$

You’ve given two examples:

• A point in $X$ is a morphism $f: 1 \to X$, so a probabilistic point is a morphism $f: 1 \to P X$, that is, a probability measure on $X$.
• A predicate on $X$ is a morphism $f: X \to \Omega$, where $\Omega$ is the subobject classifier. So, a probabilistic predicate on $X$ is a morphism $f: X \to P\Omega$. In your examples $\Omega = 2$ and $P\Omega = [0,1]$, so a probabilistic predicate on $X$ just assigns a probability to each point of $X$.

As you note, we can compose a probabilistic point and a probabilistic predicate and get a point in $P\Omega$, that is, a probability. This is just the probability that our probabilistic predicate is true for our probabilistic point.

How can we compose probabilistic morphisms? We can do it thanks to the wonders of the Kleisli category… err, the category of free algebras of $P$.

Let me remind the kids how this works:

Given a probabilistic morphism $f$ from $X$ to $Y$, $f: X \to P Y$, and a probabilistic morphism $g$ from $Y$ to $Z$, $g: Y \to P Z$, we can compose $f$ with $P g: P Y \to P P Z$ and then use our ability to squash probability measures on a space of probability measures down to mere probability measures: $\mu_Z : P P Z \to P Z$. The result is a probabilistic morphism from $X$ to $Z$: $\mu_Z \circ P g \circ f : X \to P Z$.
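For the finite-distribution version of the monad, this Kleisli composition is just marginalisation. Here is a minimal sketch (all spaces and numbers invented for illustration):

```python
# Kleisli composition for the finite-distribution monad on Set: a
# probabilistic morphism X -> Y is a function sending each x to a dict
# {y: probability}. The double sum below is exactly mu_Z . P g . f.
def kleisli_compose(f, g):
    """Compose probabilistic morphisms f: X -> PY and g: Y -> PZ."""
    def composite(x):
        out = {}
        for y, p in f(x).items():        # distribution on Y
            for z, q in g(y).items():    # push forward along g, then flatten
                out[z] = out.get(z, 0.0) + p * q
        return out
    return composite

# Toy usage with hypothetical spaces X = {0}, Y = {"a","b"}, Z = {"u","v"}:
f = lambda x: {"a": 0.5, "b": 0.5}
g = lambda y: {"u": 1.0} if y == "a" else {"u": 0.5, "v": 0.5}
h = kleisli_compose(f, g)
# h(0) == {"u": 0.75, "v": 0.25}
```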

So: go ahead and blend everything you know about logic — formulated using topoi — with probability theory — formulated using the probability monad. A bunch of stuff will work just because you’ve got a topos with a monad on it. But at some point, you’re gonna want to know how the monad gets along with the topos structures: products, coproducts, exponentials, etc. What can you say about these:

$P(X \times Y)$

$P(X + Y)$

$P(Y^X)$

For example, does the probability monad admit a strength? Is it monoidal? I think so.

Someone who actually knows topos theory, like Todd Trimble, could probably list three dozen ways that a monad can get along with a topos.

Posted by: John Baez on September 25, 2007 4:23 PM | Permalink | Reply to this

### Re: Progic II

By the way, I know that in the comment above I sound like one of those pompous jerks who try to embed any new idea in some elaborate formalism just to make it harder to understand.

It’s just that people have put a lot of thought into topoi and monads, so it’s fun if you know about them to see how much of David’s progic project can be phrased in those terms — and doing this raises some new questions, namely, how the operations of logic (built into the topos) and probability theory (built into the monad) interact.

In short, smashing big abstract formalisms against each other can shoot off interesting sparks.

Posted by: John Baez on September 25, 2007 5:28 PM | Permalink | Reply to this

### Re: Progic II

I’ve never tried to understand what ‘admitting a strength’ really means. Is there an easy example to get what it’s about?

This ‘getting along’ you mention between monad and topos is the tricky part. In the case of logic over $Set$ you have $Pred(X + Y) = Pred(X) \times Pred(Y)$, where $Pred$ gives you the predicates on a set.

And, as I mentioned, substitution of predicates from one set to another by way of a function preserves both join and meet.

But in the case of the probability monad, distributions over $(X + Y)$ don’t neatly match up with distributions over $X$ and over $Y$. I suppose you could say they give you a distribution over $X$ and one over $Y$ and a parameter to tell you the mixture. But even there, if $X$ or $Y$ is outside the support, then there’s trouble.
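The decomposition just described, and the trouble at the boundary, can be made concrete for finite distributions (the sets and weights below are invented for illustration):

```python
# A sketch of the decomposition: a distribution over the disjoint union
# X + Y determines a mixture weight plus conditional distributions over X
# and over Y -- except when one side carries zero weight, where the
# conditional distribution on that side is undefined (returned as None).
def decompose(dist, X):
    """Split a distribution over X + Y into (weight on X, dist|X, dist|Y)."""
    p_X = sum(p for k, p in dist.items() if k in X)
    p_Y = 1.0 - p_X
    on_X = {k: p / p_X for k, p in dist.items() if k in X} if p_X > 0 else None
    on_Y = {k: p / p_Y for k, p in dist.items() if k not in X} if p_Y > 0 else None
    return p_X, on_X, on_Y

X = {"x1", "x2"}
dist = {"x1": 0.2, "x2": 0.3, "y1": 0.5}
w, dX, dY = decompose(dist, X)
# w ≈ 0.5, dX ≈ {"x1": 0.4, "x2": 0.6}, dY ≈ {"y1": 1.0}
```

So $P(X + Y)$ looks like pairs of distributions together with a mixing weight, but degenerately at the endpoints of the weight.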

Posted by: David Corfield on September 26, 2007 3:36 PM | Permalink | Reply to this

### Re: Progic II

I’m not too clear on why people invented strong monads in the first place. But, it seems pretty likely that the probability monad admits a strength, since there’s an obvious map

$X \times PY \to P(X \times Y)$

sending a point $x \in X$ and a probability measure $\mu$ on $Y$ to the probability measure $\delta_x \otimes \mu$ on $X \times Y$.

I haven’t directly checked that it satisfies all the axioms, but I bet it does… especially since the Wikipedia article defines commutative strong monads on symmetric monoidal categories and says “One interesting fact about commutative strong monads is that they are ‘the same as’ symmetric monoidal monads.”

While I again haven’t bothered to check it, it seems pretty obvious that the probability monad is monoidal, meaning we have a very well-behaved map

$P X \times P Y \to P(X \times Y)$

sending a probability measure $\mu$ on $X$ and a probability measure $\nu$ on $Y$ to the probability measure $\mu \otimes \nu$ on $X \times Y$.

In short, I get the impression that while there are lots of subtly different conditions saying that a monad gets along with tensor products, the probability monad satisfies all of these with respect to the cartesian product.
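For finite distributions, both maps are one-liners. A sketch, with invented measures:

```python
# The two maps above, for finite distributions:
# strength:  X x PY -> P(X x Y), sending (x, mu) to delta_x tensor mu;
# monoidal:  PX x PY -> P(X x Y), sending (mu, nu) to the product mu tensor nu.
def strength(x, mu):
    """(x, mu) |-> delta_x tensor mu on X x Y."""
    return {(x, y): p for y, p in mu.items()}

def monoidal(mu, nu):
    """(mu, nu) |-> product measure mu tensor nu on X x Y."""
    return {(x, y): p * q for x, p in mu.items() for y, q in nu.items()}

mu = {"heads": 0.5, "tails": 0.5}
nu = {"a": 0.3, "b": 0.7}
# strength("x0", nu) == {("x0", "a"): 0.3, ("x0", "b"): 0.7}
# monoidal(mu, nu)[("heads", "a")] ≈ 0.15
```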

Posted by: John Baez on September 28, 2007 5:03 PM | Permalink | Reply to this

### Re: Progic II

…it seems pretty obvious that the probability monad is monoidal…

So then:

The Kleisli category of a monoidal monad has a canonical monoidal structure, induced by the monoidal structure of the monad. The canonical adjunction between C and the Kleisli category is a monoidal adjunction with respect to this monoidal structure.

Posted by: David Corfield on September 28, 2007 5:19 PM | Permalink | Reply to this

### Re: Progic II

Robin wrote:

A symmetric monoidal closed category $C$ is enriched over itself, and a strong monad is just a monad on $C$ in the 2-category of $C$-enriched categories. It turns out that a monad admits a tensorial strength just when it “is” strong in this sense.

Well heck — then why didn’t they just come out and admit it?

However, I’m not sure I understand. You seem to be giving a slick definition of a strong monad on a symmetric closed monoidal category. But they seem to be defining a strong monad on an arbitrary monoidal category.

I don’t see how having a strong monad on a monoidal category is enough to force that category to be symmetric and closed. So, maybe your slick definition of strong monad only applies to symmetric closed monoidal categories, not more general ones? But maybe you’re hinting that the extra generality is not worth worrying about?

With all this talk of monoidal monads, and strengths, and the like, we keep bumping into a very elegant idea: taking any doctrine and looking at monads in that doctrine! I’d like to better understand the full power of this idea. For example: I guess in nice cases we get a category of algebras for the monad which again lies in the doctrine?

(Of course, a doctrine can itself be seen as a categorified monad. This is just an example of how, past a certain level of abstraction, every concept interacts with every other in mind-bendingly circular ways.)

Posted by: John Baez on September 29, 2007 3:04 AM | Permalink | Reply to this

### Re: Progic II

However, I’m not sure I understand. You seem to be giving a slick definition of a strong monad on a symmetric closed monoidal category. But they seem to be defining a strong monad on an arbitrary monoidal category.

I think you understand perfectly, John, though I should have explained myself more carefully. If you think about C-enriched monads on a symmetric monoidal closed C, then (provided you’re clever enough, like Anders Kock) you eventually realise that they can be characterised in a way that only uses the tensor product, via the “tensorial strength”. Then you can turn around and say, let’s use this characterisation as a definition! So you have then generalised the notion to an arbitrary monoidal category.

Of course this is a common trick in mathematics: take some definition, find a characterisation that makes sense in a more general setting, then turn around and use that characterisation as a generalised definition that makes sense in the wider setting.

I’d dearly love to think about your second question; but I have to submit my thesis on Monday, so I’d better get back to it. :-)

Posted by: Robin on September 29, 2007 10:42 AM | Permalink | Reply to this

### Re: Progic II

PS. There’s also an ‘abstract nonsense’ way of generalising the notion of self-enriched monad to arbitrary monoidal categories. I don’t know that this gives the same result as the tensorial strength definition, but I’d be surprised if it didn’t. (Perhaps Todd Trimble knows?)

We know from Brian Day’s thesis work that every monoidal category (every promonoidal category, even) embeds into a monoidal biclosed one: a (pro)monoidal structure on $C$ induces a monoidal biclosed structure on $[C, Set]^{op}$. So that means that a (pro)monoidal category $C$ is $[C, Set]^{op}$-enriched in a canonical way.

What does it mean for a monad on $C$ to be enriched in $[C, Set]^{op}$? I bet it’s equivalent to having a tensorial strength.

Posted by: Robin on September 29, 2007 11:31 AM | Permalink | Reply to this

### Re: Progic II

Hi Robin,

No, I’d never thought about that, but since you gave me a shout-out, and have other more important things to do yourself (best wishes for your defense by the way!), sure, let me have a go at it.

I’ll first expand just a little on what Robin said before, about the connection between enrichment and strength. If $V$ is a closed monoidal category and we start with a strong functor $T: V \to V$, then we get a $V$-enrichment of $T$:

$u^v \to (u^v \otimes T v)^{T v} \to T(u^v \otimes v)^{T v} \to T u^{T v}$

where we have used in turn the unit of the adjunction $(-) \otimes T v \dashv (-)^{T v}$, the strength on $T$, and the evaluation or counit of $(-) \otimes v \dashv (-)^v$. In the other direction, starting with a $V$-enriched $T: V \to V$, we get a strength:

$u \otimes T v \to (u \otimes v)^v \otimes T v \to T(u \otimes v)^{T v} \otimes T v \to T(u \otimes v)$

where I’ll leave it to the reader to figure out what’s going on at each step. So, as Robin said, any functor $T: Set \to Set$, being Set-enriched, carries a canonical strength. (Although my saying ‘canonical’ is overkill – there’s just one strength, period. I think.)

Okay, now let’s tackle Robin’s claim. Suppose given a monoidal category $C$. Each functor $T: C \to C$ induces a functor $T_!: Set^{C^{op}} \to Set^{C^{op}}$ by left Kan extension:

$T_{!}(F)(c) = C(c, T-) \otimes_C F := \int^d C(c, T d) \otimes F d$

(here the $\otimes_C$ is really ‘module composition’) and now we are considering what it would mean if $T_!$ were $Set^{C^{op}}$-enriched, or equivalently admits a $Set^{C^{op}}$-strength,

$F \otimes T_!(G) \to T_!(F \otimes G),$

where the tensor is Day convolution. Since $T_!$ is cocontinuous and $\otimes$ is separately cocontinuous in each of its arguments, and every $F$ is a colimit of representables, such a strength is (by naturality) uniquely determined by how it behaves on representables $F = hom(-, c)$, $G = hom(-, d)$. Here

$T_!(G)(x) = T_!(hom(-, d))(x) = hom(x, T-) \otimes_C hom(-, d)$

which reduces to $hom(x, T d)$ by the Yoneda lemma (in the formulation which says that $hom$ behaves as a unit for module composition). So $T_!(G) = hom(-, T d)$. Also,

$F \otimes G = hom(-, c) \otimes hom(-, d) \cong hom(-, c \otimes d)$

because the Yoneda embedding $C \to Set^{C^{op}}$ is a monoidal functor. So in the end, the strength

$F \otimes T_!(G) \to T_!(F \otimes G)$

boils down to

$hom(-, c) \otimes hom(-, T d) \to T_!(hom(-, c \otimes d))$

or to

$hom(-, c \otimes T d) \to hom(-, T(c \otimes d))$

and now we apply Yoneda one last time to arrive at (hurray!) our strength

$c \otimes T d \to T(c \otimes d),$

just as Robin predicted.

Whew. That calls for another cup of coffee…

Posted by: Todd Trimble on September 29, 2007 2:22 PM | Permalink | Reply to this

### Re: Progic II

One day I’d love to see this kind of material worked out in terms of a few examples.

Is anything profound being said if the probability monad is chosen?

Posted by: David Corfield on September 30, 2007 4:52 PM | Permalink | Reply to this

### Re: Progic II

Oh, I forgot, Robin said “monad”, not just a functor. It all works out fine, just the same, i.e., Robin spoke the truth: a strong monad $T$ on a monoidal category $C$ is tantamount to the left Kan extension $T_!: Set^{C^{op}} \to Set^{C^{op}}$ being a $Set^{C^{op}}$-enriched monad.

Posted by: Todd Trimble on September 29, 2007 2:44 PM | Permalink | Reply to this

### Re: Progic II

All monads on Set are strong: they can’t help it. This is obvious if (and only if?) you know the original motivation for the definition of strength (by Anders Kock).

A symmetric monoidal closed category C is enriched over itself, and a strong monad is just a monad on C in the 2-category of C-enriched categories. It turns out that a monad admits a tensorial strength just when it “is” strong in this sense.

But “Set-enriched” is vacuous: all monads (functors, natural transformations) are Set-enriched! So every Set-monad has a canonical strength.

Posted by: Robin on September 28, 2007 6:36 PM | Permalink | Reply to this

### Re: Progic II

Combining probability theory and topos theory is not pompous. It is actually a very old idea invented by Dana Scott (for Boolean-valued models) in his proof of the independence of the Continuum Hypothesis.
Scott’s idea is simple: if we add random real numbers, then the continuum does not have cardinality $\aleph_1$. The proof, of course, was a bit more difficult … :-)

Matthew Jackson has recently picked up the topic again:
http://www.lawrence.edu/fast/jacksoma/research.html

For instance, he shows that both measures and measurable functions are internal real numbers. This is similar to the fact that the reals in the topos of sheaves over a topological space are precisely the continuous real-valued functions.

Posted by: Bas Spitters on September 26, 2007 10:05 AM | Permalink | Reply to this