# Coupling from the past

Among Markov chain Monte Carlo (MCMC) algorithms, coupling from the past is a method for sampling from the stationary distribution of a Markov chain. Unlike many MCMC algorithms, which produce only approximate samples after a finite run, coupling from the past yields, in principle, a perfect (exact) sample from the stationary distribution. It was invented by James Propp and David Wilson in 1996.

## The basic idea

Consider a finite-state, irreducible, aperiodic Markov chain $M$ with state space $S$ and (unique) stationary distribution $\pi$ ($\pi$ is a probability vector). Suppose that we come up with a probability distribution $\mu$ on the set of maps $f:S\to S$ with the property that for every fixed $s\in S$, its image $f(s)$ is distributed according to the transition probability of $M$ from state $s$. An example of such a probability distribution is the one where $f(s)$ is independent of $f(s')$ whenever $s\neq s'$, but it is often worthwhile to consider other distributions. Now let $f_{j}$ for $j\in \mathbb {Z}$ be independent samples from $\mu$.
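For a concrete instance of such a $\mu$, one can draw $f(s)$ independently for each state $s$ from row $s$ of the transition matrix. A minimal Python sketch, assuming the chain is given as a row-stochastic list-of-lists matrix `P` (the function name `sample_map` and this representation are illustrative choices, not from the original text):

```python
import random

def sample_map(P, rng):
    """Sample a random map f: S -> S where, independently for each
    state s, f(s) is drawn from row s of the transition matrix P.
    This is the simplest valid choice of the distribution mu.
    The map is returned as a list: f[s] is the image of state s."""
    n = len(P)
    return [rng.choices(range(n), weights=P[s])[0] for s in range(n)]
```

By construction, applying such a map to a state distributed according to $\pi$ takes one step of the chain, which is exactly the property required of $\mu$.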

Suppose that $x$ is chosen randomly according to $\pi$ and is independent of the sequence $f_{j}$. (We do not worry for now about where this $x$ comes from.) Then $f_{-1}(x)$ is also distributed according to $\pi$: by our assumption on the law of $f_{-1}$, applying $f_{-1}$ amounts to taking one step of the chain, and $\pi$ is stationary for $M$. Define

$$F_{j}:=f_{-1}\circ f_{-2}\circ \cdots \circ f_{-j}.$$

Then it follows by induction that $F_{j}(x)$ is also distributed according to $\pi$ for every $j\in \mathbb {N}$. Now here is the main point: it may happen that for some $n\in \mathbb {N}$ the image of the map $F_{n}$ is a single element of $S$. In other words, $F_{n}(x)=F_{n}(y)$ for every $y\in S$, so we do not need access to $x$ in order to compute $F_{n}(x)$. The algorithm then consists of finding some $n\in \mathbb {N}$ such that $F_{n}(S)$ is a singleton and outputting its unique element. Designing a distribution $\mu$ for which finding such an $n$ and computing $F_{n}$ is not too costly is not always obvious, but this has been accomplished successfully in several important instances.
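The procedure above can be sketched in a few lines of Python, using the independent-coupling choice of $\mu$ described earlier. This is a naive sketch for a small finite chain given as a row-stochastic matrix, not an efficient implementation; the function and variable names are illustrative:

```python
import random

def cftp(P, seed=0):
    """Coupling from the past with independent random maps.

    Returns one exact sample from the stationary distribution of the
    finite chain whose transition matrix is P (a list of rows, each
    summing to 1)."""
    rng = random.Random(seed)
    n = len(P)
    maps = []  # maps[j-1] is the sampled map f_{-j}; never resampled
    while True:
        # Extend one more step into the past by sampling a fresh map.
        maps.append([rng.choices(range(n), weights=P[s])[0]
                     for s in range(n)])
        # Compute F_j = f_{-1} o f_{-2} o ... o f_{-j} on all of S:
        # start a chain in every state at time -j, run forward to 0.
        states = list(range(n))
        for f in reversed(maps):  # apply f_{-j} first, f_{-1} last
            states = [f[s] for s in states]
        if len(set(states)) == 1:  # F_j(S) is a singleton: coalesced
            return states[0]
```

Note that the loop applies the oldest map first: extending further into the past means prepending steps, so the composition always ends with the same $f_{-1}$.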

## The monotone case

There is a special class of Markov chains for which there are particularly good choices of $\mu$, together with a tool for determining whether $|F_{n}(S)|=1$. (Here $|\cdot |$ denotes cardinality.) Suppose that $S$ is a partially ordered set with order $\leq$ that has a unique minimal element $s_{0}$ and a unique maximal element $s_{1}$; that is, every $s\in S$ satisfies $s_{0}\leq s\leq s_{1}$. Also, suppose that $\mu$ may be chosen to be supported on the set of monotone maps $f:S\to S$. Since a composition of monotone maps is monotone, $F_{n}$ is itself monotone, and therefore $|F_{n}(S)|=1$ if and only if $F_{n}(s_{0})=F_{n}(s_{1})$. Thus, checking for coalescence becomes rather easy. The algorithm can proceed by choosing $n:=n_{0}$ for some constant $n_{0}$, sampling the maps $f_{-1},\dots ,f_{-n}$, and outputting $F_{n}(s_{0})$ if $F_{n}(s_{0})=F_{n}(s_{1})$. If $F_{n}(s_{0})\neq F_{n}(s_{1})$, the algorithm doubles $n$ and repeats as necessary until an output is obtained. (The algorithm does not resample the maps $f_{-j}$ that were already sampled; it reuses the previously sampled maps when needed.)
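As an illustration of the monotone case, consider a nearest-neighbour random walk on the totally ordered set $S=\{0,\dots ,N\}$ that steps up with probability $p$ and down otherwise, clipped at the boundaries. For each value of a shared uniform variable $u$, the resulting update is a monotone map, so it suffices to track only the two chains started at $s_{0}=0$ and $s_{1}=N$. A sketch of the doubling scheme under these assumptions (this particular chain and all names are illustrative, not from the original text):

```python
import random

def monotone_cftp(N, p, seed=0):
    """Monotone coupling from the past for a random walk on {0,...,N}.

    The update f_u(s) = min(N, s+1) if u < p else max(0, s-1) is
    monotone in s for every u, so coalescence of the whole state space
    is detected by running only the bottom (0) and top (N) chains.
    Uses the doubling scheme, reusing previously sampled u's."""
    rng = random.Random(seed)
    us = []          # us[j-1] drives the step taken at time -j
    n = 1
    while True:
        while len(us) < n:       # extend randomness further into the past
            us.append(rng.random())
        lo, hi = 0, N
        for u in reversed(us[:n]):   # apply f_{-n} first, f_{-1} last
            step = 1 if u < p else -1
            lo = min(N, max(0, lo + step))
            hi = min(N, max(0, hi + step))
        if lo == hi:             # sandwiching: all of S has coalesced
            return lo
        n *= 2                   # double the time horizon and retry
```

The two bracketing chains are driven by the same uniforms, so monotonicity guarantees that every intermediate starting state stays sandwiched between them; when the bracket collapses, the whole state space has coalesced.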