# The St. Petersburg paradox

Simon introduced me to the St. Petersburg paradox the other day. Here’s how it goes. You flip a coin over and over until you get a tails. You win 2^n dollars, where n is the total number of coin tosses, including the final tails toss. So a tails straight away wins you $2; HT wins you $4; HHT $8; HHHT $16; and so on.

The question is: how much should you be willing to stake to play the game?

Now the answer differs depending on whether you can play repeatedly or only once. If your pockets are infinitely deep and time is no object, then any stake, no matter how large, is worth paying. Here’s why.

The expected earnings from a game are calculated by multiplying each possible payout by its probability and summing over every eventuality. The possible outcomes are shown below, with the columns showing the coin pattern (C), the associated winnings (W), the probability of that event happening (P) and the expected value of that line (E).

| C     | W   | P       | E  |
|-------|-----|---------|----|
| T     | $2  | 0.5     | $1 |
| HT    | $4  | 0.25    | $1 |
| HHT   | $8  | 0.125   | $1 |
| HHHT  | $16 | 0.0625  | $1 |
| HHHHT | $32 | 0.03125 | $1 |
| …     |     |         |    |

As you’ll notice, each line contributes $1 of expected winnings. Summing the infinite series therefore gives infinite expected winnings: each increasingly remote string of heads carries a pot exactly large enough to offset the remoteness of it happening.
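The per-line arithmetic is easy to check in code. Here’s a minimal sketch in Python (the function names are mine, not from the post): line n of the table pays 2^n with probability 2^−n, so truncating the series after k lines gives an expected value of exactly $k, and it grows without bound.

```python
import random

def truncated_expected_value(k: int) -> float:
    """Expected winnings from the first k lines of the table.

    Line n pays 2**n dollars with probability 2**-n, so each line
    contributes exactly $1.
    """
    return sum((2 ** n) * (0.5 ** n) for n in range(1, k + 1))

def play(rng: random.Random) -> int:
    """Play one game: flip until tails; payout is 2**n for n total tosses."""
    n = 1                      # the tails toss that ends the game counts too
    while rng.random() < 0.5:  # heads: keep flipping
        n += 1
    return 2 ** n
```

Averaging `play` over many games converges only very slowly, because the mean is dominated by rare, enormous payouts — which is exactly why the infinite expectation feels paradoxical.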

So it’s worth a billion dollars per game. You’d need 29 heads in a row to win back that amount or more, but over enough games, the odds are such that you’d eventually do it. And if you got 35 heads in a row, you’d win around $68 billion. You could pick any stake greater than $1 billion, and the numbers would always show it’s worth it.
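The break-even run length for any stake follows directly from the payout rule: you need the smallest n with 2^n ≥ stake. A quick sketch (names are mine):

```python
import math

def tosses_to_recoup(stake: float) -> int:
    """Smallest number of tosses n such that the payout 2**n covers the stake."""
    return max(1, math.ceil(math.log2(stake)))

def prob_at_least(n: int) -> float:
    """Probability a game lasts at least n tosses: the first n - 1 must be heads."""
    return 0.5 ** (n - 1)
```

For example, `tosses_to_recoup(1_000_000_000)` is 30 — that is, 29 heads then a tail — and such a run turns up once every 2^29 ≈ 537 million games on average, so "over time" is doing a lot of work.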

But if you only had enough money for a single game, what would you stake? The game is certainly worth $2, as that’s the minimum you can win. It’s arguably worth $3, as that gives you a 50% chance of losing a dollar and a 50% chance of winning a dollar or more. At $4, you have a 50% chance of losing $2, a 25% chance of breaking even, and a 25% chance of winning $4 or more. How high a stake is worthwhile is subjective, depending on the value of money to the player, or more importantly, on how damaging losing the stake would be. If you have $1m in the bank, you might risk $10 on a game. If you’re down to your last $10, you’re unlikely to do the same.
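That single-game trade-off can be made exact. A sketch using Python’s exact fractions (again, my naming): classify each line of the payout table against a candidate stake.

```python
from fractions import Fraction

def stake_outcomes(stake: int, max_n: int = 60):
    """Return (P(lose money), P(break even), P(come out ahead)) for a stake.

    Walks the first max_n lines of the payout table; the tail beyond
    max_n is lumped into the win bucket, since payouts only grow.
    """
    lose = even = win = Fraction(0)
    for n in range(1, max_n + 1):
        p = Fraction(1, 2 ** n)   # probability of this line
        payout = 2 ** n
        if payout < stake:
            lose += p
        elif payout == stake:
            even += p
        else:
            win += p
    win += Fraction(1, 2 ** max_n)  # remaining tail: all payouts exceed stake
    return lose, even, win
```

`stake_outcomes(4)` gives (1/2, 1/4, 1/4) and `stake_outcomes(3)` gives (1/2, 0, 1/2), matching the figures above.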

Very interesting conundrum.

### Comments

**One Response to “The St. Petersburg paradox”**


What are we actually saying when we calculate a probability? There are two ways, IMHO, that we can look at it.

1) Probability is the odds that any given physical event will actually occur OR

2) Probability is the odds that we will guess the right outcome of any physical event. I.e. it is an epistemological issue.

I vote the latter. The Monty Hall problem, I think, is explained much better when we think of it as 2) rather than 1).

The second reason is determinism. If you roll a six-sided die, the probability of the number it actually shows is 1, and the probability of it landing on any other face is 0. In other words, once rolled, the die could not have ended up on any face other than the one it actually did. Given that the die’s future landing state could only ever be what it turned out to be, the physical probability of that outcome was certainty. It could not have been any other way.

The only probability of interest is then type 2): we don’t KNOW what that certain outcome will be, so we guess. So probability is about knowledge, or the lack of it, rather than anything physical.