Gambler's Ruin

Let's say you are playing a game where you win a dollar with probability \(p\) and lose a dollar with probability \(q = 1 - p\). You start the game with \(a\) dollars and your opponent starts the game with \(b\) dollars. The game is over when either one of you loses all of his money and is ruined. This is the famous gambler's ruin problem.

If \(r_k\) is the probability that you are eventually ruined when your bankroll stands at \(k\) dollars, then

\[r_k = \frac{\alpha^k - \alpha^N}{1 - \alpha^N}\]

where

\[\begin{aligned} \alpha & = \frac{q}{p} = \frac{\mbox{probability of losing a game}}{\mbox{probability of winning a game}}\\ N & = a + b \end{aligned}\]
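As a quick check of the formula, here is a minimal Python sketch (my own illustration, not from the post or the book) that evaluates \(r_k\) with \(k = a\), i.e. a starting bankroll of \(a\) dollars, and compares it with a Monte Carlo simulation of the game. The names ruin_probability and simulate_ruin are made up for this example.

import random

def ruin_probability(p, a, b):
    # Exact ruin probability when you start with a dollars and your
    # opponent starts with b dollars, using the formula above with k = a.
    q = 1.0 - p
    N = a + b
    if p == q:
        return b / N          # fair-game limit of the formula: r_a = b / N
    alpha = q / p
    return (alpha**a - alpha**N) / (1.0 - alpha**N)

def simulate_ruin(p, a, b, trials=50_000):
    # Monte Carlo estimate: play the game to the end many times and
    # count how often you are the one who goes broke.
    ruined = 0
    for _ in range(trials):
        k, N = a, a + b
        while 0 < k < N:
            k += 1 if random.random() < p else -1
        if k == 0:
            ruined += 1
    return ruined / trials

p, a, b = 0.48, 10, 20
print("formula:   ", ruin_probability(p, a, b))   # about 0.88
print("simulation:", simulate_ruin(p, a, b))

With \(p = 0.48\) the game is slightly against you, and both numbers come out near 0.88 even though you start with a third of the total money on the table.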

Now what happens if you are playing against an opponent with unlimited resources and you put no limits on when you will stop?

To answer this question, take the limit of the above equation as \(N\) goes to infinity. When the game is in your favor (\(p > q\), \(\alpha < 1\)), the limit gives \(r_k=\alpha^k\), meaning the chance of ruin decreases exponentially with increasing \(k\). When the game is not in your favor (\(p < q\), \(\alpha > 1\)), the limit gives \(r_k=1\), meaning ruin is certain, no matter how much money you start with.
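You can watch this limit emerge numerically. The short sketch below (again my own illustration, reusing the ruin_probability function from the previous example) prints the ruin probability for a fixed starting bankroll of \(a = 10\) dollars as the opponent's bankroll \(b\) grows, once with a small edge and once without.

a = 10
for p in (0.52, 0.48):                  # slight edge vs. slight disadvantage
    alpha = (1 - p) / p
    print(f"p = {p}:  alpha**a = {alpha**a:.4f}")
    for b in (10, 100, 1000):
        r = ruin_probability(p, a, b)
        print(f"    b = {b:5d}  ruin probability = {r:.4f}")

With the edge, the ruin probability settles near \(\alpha^{10} \approx 0.45\) as \(b\) grows; without it, the probability climbs toward 1.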

So if you're playing against an opponent with unlimited resources and placing lots of bets over a given period of time, it's crucial that you have an edge, no matter how small.

The derivation of the above formula as well as more information about the gambler's ruin problem can be found in our book The Coin Toss: The Hydrogen Atom of Probability.


© 2010-2012 Stefan Hollos and Richard Hollos
