# Craps principle

In probability theory, the craps principle is a theorem about event probabilities under repeated independent and identically distributed (i.i.d.) trials. Let ${\displaystyle E_{1}}$ and ${\displaystyle E_{2}}$ denote two mutually exclusive events which might occur on a given trial. Then the probability that ${\displaystyle E_{1}}$ occurs before ${\displaystyle E_{2}}$ equals the conditional probability that ${\displaystyle E_{1}}$ occurs given that ${\displaystyle E_{1}}$ or ${\displaystyle E_{2}}$ occurs on a given trial, which is

${\displaystyle \operatorname {P} \left[E_{1}\mid E_{1}\cup E_{2}\right]={\frac {\operatorname {P} [E_{1}]}{\operatorname {P} [E_{1}]+\operatorname {P} [E_{2}]}}}$

The events ${\displaystyle E_{1}}$ and ${\displaystyle E_{2}}$ need not be collectively exhaustive (if they are, the result is trivial).[1][2]
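The statement can be checked numerically. Below is a minimal Monte Carlo sketch (the function name and parameters are ours, not from the source): each trial produces ${\displaystyle E_{1}}$ with probability ${\displaystyle p_{1}}$, ${\displaystyle E_{2}}$ with probability ${\displaystyle p_{2}}$, and neither otherwise, and trials repeat until one of the two events occurs.

```python
import random

def prob_e1_before_e2(p1, p2, n_runs=100_000, seed=0):
    """Monte Carlo estimate of P(E1 occurs before E2) under i.i.d. trials."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_runs):
        while True:
            u = rng.random()
            if u < p1:            # E1 occurs on this trial
                wins += 1
                break
            elif u < p1 + p2:     # E2 occurs on this trial
                break
            # otherwise neither occurred; repeat the trial
    return wins / n_runs

# With p1 = 3/36 and p2 = 6/36 (the craps point-4 case treated later),
# the estimate should be close to p1 / (p1 + p2) = 1/3.
estimate = prob_e1_before_e2(3 / 36, 6 / 36)
exact = (3 / 36) / (3 / 36 + 6 / 36)
```

Note that the estimate depends only on the ratio of ${\displaystyle p_{1}}$ to ${\displaystyle p_{2}}$: shrinking both by a common factor only lengthens the runs, not the outcome frequencies.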

## Proof

Let ${\displaystyle A}$ be the event that ${\displaystyle E_{1}}$ occurs before ${\displaystyle E_{2}}$. Let ${\displaystyle B}$ be the event that neither ${\displaystyle E_{1}}$ nor ${\displaystyle E_{2}}$ occurs on a given trial. Since ${\displaystyle B}$, ${\displaystyle E_{1}}$ and ${\displaystyle E_{2}}$ are mutually exclusive and collectively exhaustive for the first trial, we have

${\displaystyle \operatorname {P} (A)=\operatorname {P} (E_{1})\operatorname {P} (A\mid E_{1})+\operatorname {P} (E_{2})\operatorname {P} (A\mid E_{2})+\operatorname {P} (B)\operatorname {P} (A\mid B)=\operatorname {P} (E_{1})+\operatorname {P} (B)\operatorname {P} (A\mid B)}$

and ${\displaystyle \operatorname {P} (B)=1-\operatorname {P} (E_{1})-\operatorname {P} (E_{2})}$. Since the trials are i.i.d., the process effectively restarts after a trial on which neither event occurs, so ${\displaystyle \operatorname {P} (A\mid B)=\operatorname {P} (A)}$. Substituting these into the displayed equation and solving for ${\displaystyle \operatorname {P} (A)}$ gives the formula ${\displaystyle \operatorname {P} (A)={\frac {\operatorname {P} (E_{1})}{\operatorname {P} (E_{1})+\operatorname {P} (E_{2})}}}$.

The other equation follows from the definition of conditional probability and the fact that ${\displaystyle E_{1}}$ and ${\displaystyle E_{2}}$ are mutually exclusive:

${\displaystyle \operatorname {P} [E_{1}\cup E_{2}]=\operatorname {P} [E_{1}]+\operatorname {P} [E_{2}]}$

and

${\displaystyle E_{1}\cap (E_{1}\cup E_{2})=E_{1}}$

so by the definition of conditional probability,

${\displaystyle \operatorname {P} [E_{1}\cap (E_{1}\cup E_{2})]=\operatorname {P} \left[E_{1}\mid E_{1}\cup E_{2}\right]\operatorname {P} \left[E_{1}\cup E_{2}\right]}$

Combining these three yields the desired result.

## Application

If the trials are repetitions of a game between two players, and the events are

${\displaystyle E_{1}:\mathrm {player\ 1\ wins} }$
${\displaystyle E_{2}:\mathrm {player\ 2\ wins} }$

then the craps principle gives the conditional probability of each player winning a given repetition, given that someone wins (i.e., given that a draw does not occur). In fact, the result depends only on the relative marginal probabilities of winning ${\displaystyle \operatorname {P} [E_{1}]}$ and ${\displaystyle \operatorname {P} [E_{2}]}$; in particular, the probability of a draw is irrelevant.

### Stopping

If the game is played repeatedly until someone wins, then the conditional probability above gives the probability that player 1 wins the game. This is illustrated below for the original game of craps, using an alternative proof.

## Etymology

If the game being played is craps, then this principle can greatly simplify the computation of the probability of winning in a certain scenario. Specifically, if the first roll is a 4, 5, 6, 8, 9, or 10, then the dice are repeatedly re-rolled until one of two events occurs:

${\displaystyle E_{1}:{\textrm {the\ original\ roll\ (called\ 'the\ point')\ is\ rolled\ (a\ win)}}}$
${\displaystyle E_{2}:{\textrm {a\ 7\ is\ rolled\ (a\ loss)}}}$

Since ${\displaystyle E_{1}}$ and ${\displaystyle E_{2}}$ are mutually exclusive, the craps principle applies. For example, if the original roll was a 4, then the probability of winning is

${\displaystyle {\frac {3/36}{3/36+6/36}}={\frac {1}{3}}}$
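The same ratio-of-counts computation works for every possible point. As a sketch (the variable names are ours), the common factor of ${\displaystyle 1/36}$ cancels, so exact winning probabilities follow directly from the counts of dice combinations:

```python
from fractions import Fraction

# Number of ways to roll each point total with two fair dice;
# a 7 (the losing roll) can be made 6 ways.
WAYS = {4: 3, 5: 4, 6: 5, 8: 5, 9: 4, 10: 3}
WAYS_SEVEN = 6

# Craps principle: P[win | point] = P[E1] / (P[E1] + P[E2]);
# the factor 1/36 cancels, leaving a ratio of counts.
win_prob = {point: Fraction(w, w + WAYS_SEVEN) for point, w in WAYS.items()}
```

For example, `win_prob[4]` is ${\displaystyle 3/(3+6)=1/3}$, matching the calculation above, and by symmetry the 10 gives the same value.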

This avoids having to sum the infinite series corresponding to all the possible outcomes:

${\displaystyle \sum _{i=0}^{\infty }\operatorname {P} [{\textrm {first\ }}i{\textrm {\ rolls\ are\ ties,\ }}(i+1)^{\textrm {th}}{\textrm {\ roll\ is\ 'the\ point'}}]}$

Mathematically, the probability of rolling ${\displaystyle i}$ ties followed by rolling the point is

${\displaystyle \operatorname {P} [{\textrm {first\ }}i{\textrm {\ rolls\ are\ ties,\ }}(i+1)^{\textrm {th}}{\textrm {\ roll\ is\ 'the\ point'}}]=(1-\operatorname {P} [E_{1}]-\operatorname {P} [E_{2}])^{i}\operatorname {P} [E_{1}]}$

The summation becomes an infinite geometric series:

${\displaystyle \sum _{i=0}^{\infty }(1-\operatorname {P} [E_{1}]-\operatorname {P} [E_{2}])^{i}\operatorname {P} [E_{1}]=\operatorname {P} [E_{1}]\sum _{i=0}^{\infty }(1-\operatorname {P} [E_{1}]-\operatorname {P} [E_{2}])^{i}}$
${\displaystyle ={\frac {\operatorname {P} [E_{1}]}{1-(1-\operatorname {P} [E_{1}]-\operatorname {P} [E_{2}])}}={\frac {\operatorname {P} [E_{1}]}{\operatorname {P} [E_{1}]+\operatorname {P} [E_{2}]}}}$

which agrees with the earlier result.
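The agreement between the partial sums of this series and the closed form can be checked numerically for the point-4 case (a small sketch under the assumption of fair dice):

```python
# Each term of the series is P[i ties, then the point] = (1 - p1 - p2)^i * p1.
p1, p2 = 3 / 36, 6 / 36          # P[E1]: roll the point (4); P[E2]: roll a 7
tie = 1 - p1 - p2                # probability a single roll settles nothing
partial_sum = sum(tie ** i * p1 for i in range(1000))
closed_form = p1 / (p1 + p2)     # craps-principle answer: 1/3
```

Since ${\displaystyle 0<1-\operatorname {P} [E_{1}]-\operatorname {P} [E_{2}]<1}$, the truncation error shrinks geometrically, and a thousand terms already agree with the closed form to machine precision.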

## References

1. ^ Susan Holmes (1998-12-07). "The Craps principle 10/16". statweb.stanford.edu. Retrieved 2016-03-17.
2. ^ Jennifer Ouellette (31 August 2010). The Calculus Diaries: How Math Can Help You Lose Weight, Win in Vegas, and Survive a Zombie Apocalypse. Penguin Publishing Group. pp. 50–. ISBN 978-1-101-45903-4.