## December 8, 2019

### Random Permutations (Part 8)

#### Posted by John Baez

Last time we saw a strong connection between random permutations and Poisson distributions. Thinking about this pulled me into questions about partitions and the moments of Poisson distributions. These may be a bit of a distraction — but maybe not, since every permutation gives a partition, namely the partition into cycles.

**Puzzle 11.** Suppose raindrops are falling on your head, randomly and independently, at an average rate of one per minute. What’s the average of the cube of the number of raindrops that fall on your head in one minute?

To get started we need a bit of probability theory. Suppose random events are happening at an average rate of one per minute, in a nice independent way so that the history so far doesn’t affect the rate of new events now. Let $P(k,t)$ be the probability that $k$ events have happened between time $0$ and time $t$. Then:

$\frac{d}{d t} P(k,t) = P(k-1,t) - P(k,t)$

since if $k-1$ events have happened an extra event will bring the total to $k$, while if $k$ events have already happened an extra one will make the total stop being $k$.

Now, it takes a bit of work to figure out $P(k,t)$ given this equation and the obvious initial conditions

$P(k,0) = \left\{\begin{array}{ccl} 1 &\mathrm{if}& k = 0 \\ 0 &\mathrm{if}& k \gt 0 \end{array}\right.$

But if someone tells you the answer, you just need elementary calculus to check that it’s right:

$P(k,t) = \frac{t^k}{k!} e^{-t}$

Back to our raindrop problem. Here the time $t$ is 1 minute, so the probability of $k$ raindrops landing on your head in this time is

$\frac{1}{k!} e^{-1}$

Sanity check: these probabilities sum to 1.
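Both claims are easy to check numerically. Here is a quick Python sketch of my own (not part of the original argument): it tests the master equation by a central finite difference and confirms that the probabilities at $t = 1$ sum to 1.

```python
import math

def P(k, t):
    """Poisson probability of k events by time t at unit rate: t^k e^{-t} / k!."""
    return t**k * math.exp(-t) / math.factorial(k)

# Check the master equation dP(k,t)/dt = P(k-1,t) - P(k,t)
# by a central finite difference.
h = 1e-6
for k in range(1, 6):
    for t in (0.5, 1.0, 2.0):
        numerical = (P(k, t + h) - P(k, t - h)) / (2 * h)
        exact = P(k - 1, t) - P(k, t)
        assert abs(numerical - exact) < 1e-8

# Sanity check: the probabilities at t = 1 sum to 1
# (truncated at k = 50, far past where the terms matter).
total = sum(P(k, 1.0) for k in range(50))
print(total)
```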

We’re trying to figure out the expected value of the cube of the number of raindrops that land on your head in a minute. We now know this is

$\frac{1}{e} \sum_{k=0}^\infty \frac{k^3}{k!}$

So this is the sum we need to do. But being mathematicians, let’s generalize. Let’s figure out the expected value of the $n$th power of the number of raindrops that land on your head in a minute:

$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!}$

The sum here looks like a mutant version of

$\sum_{k=0}^\infty \frac{n^k}{k!} = e^n$

But I mention this only so you don’t get confused and take a wrong turn!

To compute

$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!}$

it’s nice to express the powers $k^n$ in terms of ‘falling powers’:

$k^{\underline{n}} = k (k-1) (k-2) \cdots (k - n + 1)$

The power $k^n$ counts the number of functions from $n$ to $k$, where now I’m identifying a natural number with a set having that number of elements. Similarly, the falling power $k^{\underline{n}}$ counts the number of injections from $n$ to $k$. Why? With an injection, the first element of $n$ has $k$ choices of where to go, but the next one has only $k-1$, and so on.
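To see the injection count concretely, here is a small brute-force check in Python (my own sketch; the function names are made up for illustration):

```python
from itertools import permutations

def falling(k, n):
    """Falling power: k (k-1) (k-2) ... (k-n+1)."""
    result = 1
    for j in range(n):
        result *= k - j
    return result

def count_injections(n, k):
    """Brute-force count of injections n -> k: ordered choices
    of n distinct elements from a k-element set."""
    return sum(1 for _ in permutations(range(k), n))

# Falling powers count injections.
for n in range(4):
    for k in range(6):
        assert falling(k, n) == count_injections(n, k)
print("falling powers count injections")
```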

There’s a nice formula expressing ordinary powers as linear combinations of falling powers:

$k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \; k^{\underline{i}}$

Here $\left\{ \begin{array}{c} n \\ i \end{array} \right\}$ is just the number of partitions of an $n$-element set into $i$ nonempty parts.

This formula has a wonderfully slick proof. Any function $f \colon n \to k$ factors as a surjection followed by an injection:

$n \stackrel{f}{\to} \mathrm{im} f \hookrightarrow k$

and the surjection gives a partition of $n$ into $i$ parts, where $i$ is the number of points in the image $\mathrm{im} f$. Conversely a partition of $n$ into $i$ parts together with an injection $i \hookrightarrow k$ determines a function $f \colon n \to k$. So, the number of functions $f \colon n \to k$ is the sum over $0 \le i \le n$ of the number of partitions of $n$ into $i$ parts times the number of injections $i \hookrightarrow k$. And that’s all the formula says!
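Here is a quick computational check of this identity, in a Python sketch of my own. It computes the Stirling numbers of the second kind by the standard recurrence $\left\{ \begin{array}{c} n \\ i \end{array} \right\} = i \left\{ \begin{array}{c} n-1 \\ i \end{array} \right\} + \left\{ \begin{array}{c} n-1 \\ i-1 \end{array} \right\}$:

```python
def stirling2(n, i):
    """Stirling number of the second kind: the number of partitions of an
    n-element set into i nonempty parts, via the standard recurrence."""
    if n == 0:
        return 1 if i == 0 else 0
    if i == 0:
        return 0
    return i * stirling2(n - 1, i) + stirling2(n - 1, i - 1)

def falling(k, i):
    """Falling power k (k-1) ... (k-i+1)."""
    result = 1
    for j in range(i):
        result *= k - j
    return result

# Check k^n = sum_i S(n, i) * k^(falling i) for small n and k.
for n in range(6):
    for k in range(6):
        assert k ** n == sum(stirling2(n, i) * falling(k, i) for i in range(n + 1))
print("identity verified for n, k < 6")
```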

So, the expected value of the $n$th power of the number of raindrops that fall on your head is

$\frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!} \; = \; \frac{1}{e} \sum_{k=0}^\infty \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!}$

But in fact we can simplify this! Since there are no injections $i \hookrightarrow k$ when $i \gt k$, the falling power vanishes in this case, so our sum is just

$\frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!}$

And now we can do the sum over $k$ first. Since

$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = 0}^\infty \frac{k(k-1)\cdots (k-i+1)}{k(k-1) \cdots 1} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e$

we get

$\frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!} \; = \; \sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\}$

which is the total number of partitions of an $n$-element set!

In our original question we were interested in $n = 3$. There are 5 partitions of a 3-element set:

$\{\{1,2,3\}\},$ $\{\{1,2\}, \{3\}\}, \; \{\{2,3\}, \{1\}\}, \; \{\{3,1\}, \{2\}\},$ $\{\{1\}, \{2\}, \{3\}\}$

So, the expected value of the cube of the number of raindrops that land on your head in a minute is 5.
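If you don’t trust the calculation, you can simulate it. Here is a Monte Carlo check in Python (a sketch of my own; it samples Poisson variates by Knuth’s multiplication method, which is fine for small means):

```python
import math
import random

random.seed(1)

def sample_poisson(lam):
    """Sample a Poisson variate by Knuth's multiplication method."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

# Monte Carlo estimate of E[X^3] for X ~ Poisson(1); the exact answer is 5.
N = 200_000
mean_cube = sum(sample_poisson(1.0) ** 3 for _ in range(N)) / N
print(mean_cube)  # ≈ 5
```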

Now that we’re done, let’s reflect on what happened, and introduce more jargon so you can read more about the concepts involved.

The function

$P(k,t) = \frac{t^k}{k!} e^{-t}$

is called a Poisson distribution. It’s the answer whenever you want to know the probability of a given number $k$ of events occurring in some amount of time $t$ if these events occur with a constant average rate, set here to $1$ for convenience, independently of the time since the last event.

The parameter $t$ is the mean of the Poisson distribution. We worked out the $n$th moment of the Poisson distribution with mean $1$. In other words, we worked out the expected value of $k^n$ in this case:

$\sum_{k = 0}^\infty k^n P(k,1) \; = \; \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!}$

We got the number of partitions of an $n$-element set, which is called the Bell number $B_n$, after the author of a fantasy novel about a planet where all mathematics is done by men.

This result is called Dobiński’s formula:

$\frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!} = B_n$
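Dobiński’s formula is easy to test numerically. Here is a Python sketch of my own, computing Bell numbers by the standard recurrence $B_{m+1} = \sum_{k=0}^m \binom{m}{k} B_k$ and truncating the Dobiński sum:

```python
import math

def bell(n):
    """Bell numbers via the recurrence B_{m+1} = sum_k C(m, k) B_k."""
    B = [1]
    for m in range(n):
        B.append(sum(math.comb(m, k) * B[k] for k in range(m + 1)))
    return B[n]

def dobinski(n, terms=60):
    """Truncated Dobinski sum (1/e) * sum_k k^n / k!."""
    return math.exp(-1) * sum(k ** n / math.factorial(k) for k in range(terms))

# The truncated sum should round to the Bell number.
for n in range(8):
    print(n, bell(n), round(dobinski(n), 6))
```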

Dobiński published a paper about it in 1877, but it seems to me that he only proved some special cases.

The number of partitions of an $n$-element set into $i$ parts, $\left\{ \begin{array}{c} n \\ i \end{array} \right\}$, is called a Stirling number of the second kind. Given the definitions it’s obvious that

$\sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} = B_n$

(We can stop the sum at $i = n$ without changing its value.)

The identity

$k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, k^{\underline{i}}$

is more interesting, since it connects Stirling numbers of the second kind to falling powers, which are also known as falling factorials, falling sequential products, and various other things. The proof I gave comes from a paper of Rota.

Category theorists will say it relies on the uniqueness of the epi-mono factorization of a function between finite sets. Rota doesn’t use these terms, but he talks about the ‘kernel’ of a function, which is his name for the epi in this factorization.

In this paper Rota goes ahead and proves Dobiński’s formula, but he does not explicitly discuss its connection to the Poisson distribution. Someone on Wikipedia translated his argument into the language of probability, which I like. Unfortunately the Wikipedia article merely reduces Dobiński’s formula to the identity

$\frac{1}{e} \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = 1$

and stops there, remarking that the $i$th factorial moment of the Poisson distribution of mean $1$ equals $1$. Factorial moments are like ordinary moments, but with the power $k^i$ replaced by the falling power $k^{\underline{i}}$.
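This factorial-moment claim is also easy to verify numerically (a Python sketch of my own):

```python
import math

def falling(k, i):
    """Falling power k (k-1) ... (k-i+1)."""
    result = 1
    for j in range(i):
        result *= k - j
    return result

# The i-th factorial moment of the Poisson distribution with mean 1,
# (1/e) * sum_k k^(falling i) / k!, should equal 1 for every i.
for i in range(8):
    moment = math.exp(-1) * sum(falling(k, i) / math.factorial(k) for k in range(60))
    assert abs(moment - 1) < 1e-9
print("all factorial moments up to i = 7 equal 1")
```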

I was frustrated to see the argument fizzle out before it ended. Then I saw why

$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e$

It follows from some facts about species and groupoid cardinality.

If you have a species, namely a functor $F$ from the groupoid of finite sets to the category of sets, you can think of it as describing a type of structure you can put on finite sets. For each natural number $k$, $F(k)$ is the set of structures of this type that you can put on a $k$-element set. The species has a generating function

$f(x) = \sum_{k = 0}^\infty \frac{|F(k)|}{k!} x^k$

and the number $f(1)$ has a nice meaning if the sum involved is well-defined. The category of elements $\int F$ has finite sets equipped with the given structure as objects and bijections preserving the structure as morphisms. This category $\int F$ is a groupoid, and the number $f(1)$ is just the groupoid cardinality of $\int F$.

Now fix a finite set $i$. There’s a species $F$ that assigns to each finite set $k$ the set of all injections $i \hookrightarrow k$. The generating function of this species is

$f(x) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} x^k$

Thus,

$f(1) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!}$

is the cardinality of the groupoid $\int F$ whose objects are finite sets equipped with an inclusion of the set $i$. But this groupoid is equivalent to the groupoid of finite sets and bijections! Why? Because the morphisms in $\int F$ are bijections that preserve the inclusion of the set $i$. Thus, points in the image of this inclusion have no ‘freedom of movement’, and our morphism boils down to an arbitrary bijection between the remaining points.

Since $\int F$ is equivalent to the groupoid of finite sets, it must have the same cardinality, namely

$\sum_{k = 0}^\infty \frac{1}{k!} = e$

So, we get

$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e$

Later, Akiva Weinberger pointed out to me that this identity can be shown by a simple reindexing:

$\sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e$

This is precisely a decategorified version of my observation that $\int F$ is equivalent to the groupoid of finite sets.

By the way, Dobiński’s formula, written this way:

$\sum_{k = 0}^\infty \frac{k^n}{k!} = e B_n$

says the cardinality of the groupoid of finite sets equipped with a function from a fixed finite set is $e$ times the number of partitions of that set. After a while this starts to seem obvious, if you keep thinking about epi-mono factorizations.

So groupoid cardinality establishes a nice connection between Poisson distributions, partitions, and injections. I doubt I’ve gotten to the bottom of it, since I haven’t really seen why the combinatorics is connected to Poisson processes. I’m also not sure how much all this will help me with permutations. But it was fun.

Posted at December 8, 2019 9:55 PM UTC


### Re: Random Permutations (Part 8)

While probably all your regular audience here knows, it may be appropriate to mention for the benefit of passersby that Bell didn’t realise, or at least didn’t acknowledge, that his work was a fantasy. (I like the description, though!)

Posted by: L Spice on December 9, 2019 2:36 PM

### Re: Random Permutations (Part 8)

Sure, I don’t mind clarifying for anyone who doesn’t know it’s a joke. Eric Temple Bell’s famous Men of Mathematics was helpful to me as a boy in the deep pre-internet era, trying to imagine what mathematicians were like. Only later did I realize how over-romanticized and inaccurate it was, and the sexist title came to disgust me. This edition even has “Men” in bright red.

The mathematician Clifford Truesdell wrote:

> …[Bell] was admired for his science fiction and his Men of Mathematics. I was shocked when, just a few years later, Walter Pitts told me the latter was nothing but a string of Hollywood scenarios; my own subsequent study of the sources has shown me that Pitts was right, and I now find the contents of that still popular book to be little more than rehashes enlivened by nasty gossip and banal or indecent fancy.

Rebecca Goldstein described the mathematician characters in Men of Mathematics thus:

> Some of these people sounded as if they had to be changelings, non-human visitors from some other sphere, with powers so prodigious they burst the boundaries of developmental psychology, lisping out profundities while other children were playing with their toes.

Posted by: John Baez on December 9, 2019 6:33 PM

### Re: Random Permutations (Part 8)

Typo: the $n$ after the first equation should be a $k$.

Posted by: Blake Stacey on December 9, 2019 4:33 PM

### Re: Random Permutations (Part 8)

Thanks! Fixed! Indecision about what letter to use for what, many revisions…

Posted by: John Baez on December 9, 2019 6:33 PM

### Re: Random Permutations (Part 8)

Ignacio Larrosa Cañestro points out the charming similarity between Dobiński’s formula

$B_n = \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!}$

and the generating function for partitions

$e^{e^x - 1} = \sum_{n = 0}^\infty \frac{B_n}{n!} x^n$

Combining them, we get this:

$e^{e^x} = \sum_{n,k = 0}^\infty \frac{k^n}{n! k!} x^n$

What’s the best way to think about all this?

Posted by: John Baez on December 9, 2019 6:44 PM

### Re: Random Permutations (Part 8)

I seem to be very typo prone today, but your last identity is just

$e^{e^x} = \sum_{k=0}^{\infty} \frac{e^{k x}}{k!} = \sum_{k=0}^{\infty} \frac{1}{k!} \sum_{n=0}^{\infty} \frac{(k x)^n}{n!}$

Dividing both sides by $e$ and extracting the coefficient of $x^n$ does seem like a slick proof of Dobiński’s formula.

Posted by: David E Speyer on December 9, 2019 9:18 PM

### Re: Random Permutations (Part 8)

That’s great! I’m kind of glad I didn’t think of it, since the longwinded proof I wrote down here taught me a lot of interesting things. But this is clearly the most efficient way to prove Dobiński’s formula. I’ve added it to Wikipedia. Now I should find a reference, so I don’t get accused of “original work”.

I also fixed up the proof that was already there.

Posted by: John Baez on December 10, 2019 6:54 AM

### Re: Random Permutations (Part 8)

I need information about G. Dobiński’s biography, and a photo of him.

Posted by: Olga Belova on December 10, 2019 1:52 PM

### Re: Random Permutations (Part 8)

I have no information about G. Dobiński. I’m curious to know who he was. If anyone has information (or a photo), please post a link to it here!

Posted by: John Baez on December 11, 2019 1:48 AM

### Re: Random Permutations (Part 8)

For the combinatorics using partitions these might be helpful:

https://rolandspeicher.com/lectures/fpt1819/s2/ (see Thm 2.16 there)

https://rolandspeicher.files.wordpress.com/2019/02/nica-speicher.pdf (no classical case, just as a reference)

I just skimmed the post and didn’t think about it too much, but I guess this might help.

Posted by: Alex on December 11, 2019 10:13 AM
