The Bus Stop Problems
Since we had so much fun with Bayes Theorem in a recent post, I can’t resist another.
Young Economics whippersnapper Evan Soltas posed two problems to do with Bayesian probability:
- You arrive at a bus stop in an unfamiliar part of town. Assume that buses arrive at the stop as a Poisson process, with an unknown (to you) rate, $\lambda$. You don’t know $\lambda$, but say you have a prior probability distribution for it, $p_0(\lambda)$.
- What’s your expected wait time, $\langle T\rangle$, for the next bus to arrive?
- Say you’ve been waiting for a time $t$. What’s your posterior probability distribution, $p(\lambda)$, and what’s your new expected wait time?
- Let’s add some more information. Say that riders arrive at the bus stop via an independent Poisson process with an (unknown to you) rate, $\mu$. Whenever a bus arrives, all those waiting at the stop get on it. Thus, the number of people waiting is the number who arrived since the last bus. Say you arrive at the stop to find $n$ people already waiting. You wait for a time, $t$, at which point there are $N$ other people waiting at the stop (i.e., $N-n$ arrived while you were waiting).
- Given this data, what’s your posterior probability distribution, $p(\lambda,\mu)$?
- What’s your new expected wait time, $\langle T\rangle$?
These questions illustrate one of my favourite points of view on Bayes Theorem, namely that it induces a flow on the (infinite-dimensional!) space of probability distributions. Understanding the nature of that flow is, I think, the key task of the subject.
Infinite dimensions are hard to get an intuition for, so one of the first tasks is to cut the problem down to a finite-dimensional one.
Say we have a finite-dimensional family of probability distributions. We will call that family natural for the problem at hand if the Bayes flow keeps us within that finite-dimensional space.
A natural family of probability distributions for Problem 1 is the $\Gamma$-distribution. Let us choose a prior in that family $p_0(\lambda) = f(\lambda; \kappa, a) = \frac{\kappa^a \lambda^{a-1} e^{-\kappa\lambda}}{\Gamma(a)}$ where, for reasons that will be apparent in a moment, we require $\kappa\gt 0,\, a\gt 1$. For fixed $\lambda$, the expected wait time is $\int_0^\infty d\tau e^{-\lambda \tau} \lambda\tau= \frac{1}{\lambda}$ So, given our prior, we expect to wait $\langle T\rangle = \int_0^\infty d\lambda \frac{p_0(\lambda)}{\lambda} = \frac{\kappa}{a-1}$
Applying Bayes Theorem, our posterior distribution, after waiting a time $t$, is $\begin{split} p(\lambda; t) &= \frac{e^{-\lambda t} p_0(\lambda)}{\int_0^\infty d\lambda' e^{-\lambda' t} p_0(\lambda')}\\ &= f(\lambda; \kappa+t, a) \end{split}$ which, as announced, is just a shift of parameters of the $\Gamma$-distribution. We immediately conclude that our expected wait time $\langle T\rangle = \frac{\kappa+t}{a-1}$ has gone up. The longer we wait, the longer we expect to continue having to wait!
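Both closed-form results are easy to check by brute force. The sketch below (prior parameters and the waiting time are illustrative values, not anything from the problem) discretizes $\lambda$, computes $\langle T\rangle$ under the $\Gamma$-prior, then reweights by the likelihood $e^{-\lambda t}$ and recomputes:

```python
import numpy as np
from math import gamma

# Gamma prior, parameterized as in the post:
# f(lambda; kappa, a) = kappa^a lambda^(a-1) exp(-kappa lambda) / Gamma(a)
kappa, a = 10.0, 3.0                      # illustrative values (kappa > 0, a > 1)

lam = np.linspace(1e-6, 10.0, 400_001)    # fine grid for brute-force integrals
dlam = lam[1] - lam[0]
prior = kappa**a * lam**(a - 1) * np.exp(-kappa * lam) / gamma(a)

# Prior expected wait: integral of p0(lambda)/lambda, which should be kappa/(a-1).
T_prior = float(np.sum(prior / lam) * dlam)
assert abs(T_prior - kappa / (a - 1)) < 1e-3

# After waiting a time t with no bus, reweight by the likelihood e^{-lambda t}.
t = 4.0
post = prior * np.exp(-lam * t)
post /= np.sum(post) * dlam

# The posterior expected wait matches the shifted-parameter result (kappa+t)/(a-1).
T_post = float(np.sum(post / lam) * dlam)
assert abs(T_post - (kappa + t) / (a - 1)) < 1e-3
assert T_post > T_prior   # the longer you wait, the longer you expect to wait
```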
You should pause to convince yourself that’s the generic behaviour, whatever prior you assumed.
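Indeed, nothing here depends on the $\Gamma$-prior. A quick numerical check with a different prior (uniform on an interval, a choice made purely for illustration) shows the same monotone growth of $\langle T\rangle$:

```python
import numpy as np

# Prior: uniform on [0.5, 2.0] -- an arbitrary non-Gamma choice, for illustration.
lam = np.linspace(0.5, 2.0, 100_001)

def expected_wait(t):
    """Posterior expected wait E[1/lambda] after waiting a time t with no bus."""
    w = np.exp(-lam * t)              # flat prior, so the weight is just the likelihood
    return float(np.sum(w / lam) / np.sum(w))

waits = [expected_wait(t) for t in range(0, 11)]
# The expected wait only ever grows as you keep waiting.
assert all(w1 < w2 for w1, w2 in zip(waits, waits[1:]))
```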
What about Problem 2? The first task is to compute, for fixed rates, $\lambda,\mu$, the probability that
- There are $n$ people waiting at the stop, when you arrive.
- You wait for a time, $t$, during which
- no bus arrives, but
- $N-n$ more riders arrive.
The answer is $P(N,n; \lambda,\mu) = e^{-(\lambda+\mu)t} \frac{{(\mu t)}^{N-n}}{(N-n)!}\, \frac{\mu^n \lambda}{{(\lambda+\mu)}^{n+1}}$
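This likelihood can be checked by direct simulation, using the fact that, by memorylessness, the time since the last bus when you arrive is itself exponentially distributed with rate $\lambda$. A Monte Carlo sketch (the rates, counts, and waiting time below are illustrative choices):

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
lam, mu, t = 1.0, 2.0, 0.5    # illustrative bus rate, rider rate, waiting time
n, N = 1, 3                   # riders waiting on arrival, riders waiting at time t
M = 2_000_000                 # Monte Carlo samples

# Time since the last bus when you arrive is Exp(lam), by memorylessness.
age = rng.exponential(1.0 / lam, M)
already = rng.poisson(mu * age)             # riders who arrived since the last bus
next_bus = rng.exponential(1.0 / lam, M)    # time until the next bus
new = rng.poisson(mu * t, M)                # riders arriving during your wait

hits = (already == n) & (next_bus > t) & (new == N - n)
p_mc = float(hits.mean())

p_exact = (exp(-(lam + mu) * t) * (mu * t)**(N - n) / factorial(N - n)
           * mu**n * lam / (lam + mu)**(n + 1))
assert abs(p_mc - p_exact) < 2e-3
```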
The second task is to find a natural family of probability distributions for this problem. I don’t know its name, but there is an obvious 5-parameter family, which is the natural generalization of the $\Gamma$-distribution, $\begin{split} p_0(\lambda, \mu) &= g(\lambda,\mu;\kappa,\rho, a,b,c)\\ &= \frac{\kappa^{a+c}\rho^b}{\Gamma(a+c)\Gamma(b)\, {}_2F_1(b,-c,1-a-c;\kappa/\rho)}\, \lambda^{a-1} \mu^{b-1} {(\lambda+\mu)}^c e^{-(\kappa\lambda+\rho\mu)} \end{split}$ Note that
- For $c=0$, this is just a product of independent $\Gamma$-distributions. $g(\lambda,\mu;\kappa,\rho,a,b,0) = f(\lambda;\kappa,a)f(\mu;\rho,b)$
- For positive integer $c$, the hypergeometric function is just a finite-order polynomial ${}_2F_1(b,-c,1-a-c;x) = \frac{1}{{(a)}_c} \sum_{k=0}^c {(b)}_k {(a)}_{c-k}\binom{c}{k} x^k$
- There’s an obvious symmetry $\begin{gathered} \lambda\leftrightarrow\mu \\ \kappa\leftrightarrow\rho \\ a\leftrightarrow b \end{gathered}$
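For positive integer $c$, all of this can be checked numerically: evaluating the terminating series for ${}_2F_1$ directly should reproduce the normalization of $g$ computed by brute-force integration. A sketch, with illustrative parameter values:

```python
import numpy as np
from math import gamma, factorial

# Illustrative parameters, with c a positive integer.
a, b, c = 1.5, 2.0, 2
kappa, rho = 1.0, 2.0

def poch(x, k):
    """Rising factorial (x)_k = x (x+1) ... (x+k-1)."""
    out = 1.0
    for j in range(k):
        out *= x + j
    return out

def hyp(b, c, a, x):
    """Terminating series for 2F1(b, -c; 1-a-c; x), c a positive integer."""
    return sum(poch(b, k) * poch(-c, k) / (poch(1 - a - c, k) * factorial(k)) * x**k
               for k in range(c + 1))

# Closed-form normalization constant of g, as given above.
Z_closed = (gamma(a + c) * gamma(b) * hyp(b, c, a, kappa / rho)
            / (kappa**(a + c) * rho**b))

# Brute-force double integral of the unnormalized density (midpoint rule).
h = 0.01
grid = np.arange(h / 2, 20.0, h)
L, M = np.meshgrid(grid, grid, indexing="ij")
Z_numeric = float(np.sum(L**(a - 1) * M**(b - 1) * (L + M)**c
                         * np.exp(-(kappa * L + rho * M))) * h * h)
assert abs(Z_numeric - Z_closed) / Z_closed < 1e-2
```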
Applying Bayes Theorem, the posterior probability is $p(\lambda,\mu; N,n,t) = g(\lambda,\mu;\kappa+t,\rho+t, a+1,b+N,c-n-1)$ which, again, is just a shift of parameters of the distribution. The expected wait time $\langle T\rangle = \frac{\kappa}{a+c-1} \frac{{}_2F_1(b,-c,2-a-c;\kappa/\rho)}{{}_2F_1(b,-c,1-a-c;\kappa/\rho)}$ transforms accordingly. The dependence on $N,n,t$ is, alas, somewhat complicated. You can spend some hours convincing yourself that it is what you should expect.
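The parameter shift itself is a one-line check: likelihood times prior, evaluated pointwise, should be a constant multiple of the shifted-parameter member of the family. A sketch (all parameter and data values below are made up for the check):

```python
import numpy as np
from math import factorial

def kernel(lam, mu, kappa, rho, a, b, c):
    """Unnormalized density of the 5-parameter family g."""
    return (lam**(a - 1) * mu**(b - 1) * (lam + mu)**c
            * np.exp(-(kappa * lam + rho * mu)))

def likelihood(lam, mu, n, N, t):
    """P(N, n; lam, mu), as computed above."""
    return (np.exp(-(lam + mu) * t) * (mu * t)**(N - n) / factorial(N - n)
            * mu**n * lam / (lam + mu)**(n + 1))

rng = np.random.default_rng(1)
lam = rng.uniform(0.1, 3.0, 100)               # random test points in the plane
mu = rng.uniform(0.1, 3.0, 100)
kappa, rho, a, b, c = 1.0, 2.0, 1.5, 2.0, 2    # illustrative prior parameters
n, N, t = 1, 3, 0.7                            # illustrative data

post = likelihood(lam, mu, n, N, t) * kernel(lam, mu, kappa, rho, a, b, c)
shifted = kernel(lam, mu, kappa + t, rho + t, a + 1, b + N, c - n - 1)
ratio = post / shifted
# The ratio is constant in (lam, mu): Bayes updating is just a parameter shift.
assert np.allclose(ratio, ratio[0])
```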
Whether you think a Poisson process is a good model for a real public transit system probably depends on your political persuasion.
Update: 12/29/2013
In return, I should pose the follow-up problem:
- Same setup as Problem 1, but now you’ve been to the bus stop $n$ times previously and had to wait for times $\{t_1,t_2,\dots,t_n\}$. What is your posterior probability distribution for $\lambda$ (hint: the $\Gamma$-distribution is natural for this problem, too), and what is your expected wait time?
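If you want to check a guessed answer to the follow-up without spoiling it, a brute-force numerical posterior is easy to set up (the prior parameters and waiting times below are made up):

```python
import numpy as np
from math import gamma

# Prior f(lambda; kappa, a) as before; the waits t_1..t_n are fully observed.
kappa, a = 10.0, 3.0                   # illustrative prior parameters
waits = np.array([2.0, 5.0, 3.5])     # made-up observed waiting times

lam = np.linspace(1e-6, 10.0, 400_001)
dlam = lam[1] - lam[0]
prior = kappa**a * lam**(a - 1) * np.exp(-kappa * lam) / gamma(a)

# Likelihood of fully observed waits: prod_i lambda * exp(-lambda * t_i).
like = lam**len(waits) * np.exp(-lam * waits.sum())
post = prior * like
post /= np.sum(post) * dlam

T = float(np.sum(post / lam) * dlam)   # posterior expected wait, E[1/lambda]
```

Compare `T` against whatever closed form you derive from the hint.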
Re: The Bus Stop Problems
Regarding the first problem, and the fact that the expected waiting time increases: I find it fascinating that this implies a rational agent might decide to wait for the bus for a while, then give up and walk once the expected remaining wait exceeds the time it takes to walk.
I was forced to ponder the exact same problem during my school years and never figured this out…