How Surprising is the Two-Envelope Paradox?

Matt Weiner writes:

I don’t think we want it to be the case that, once you open the first envelope, you have a reason to switch. Now, one issue here is the mathematical impossibility of defining a probability distribution on which every rational number is equally likely–probabilities are supposed to be countably additive, which makes it impossible to assign the same value to an infinite number of different ones. [Afterthought: Anyone know if non-standard analysis can do any work here?]

Actually you don’t even need non-standard analysis. As John Broome (1995) pointed out, you can get just the result that Matt doesn’t want within standard probability theory. To be sure, you can’t get the result that, whatever you see in the envelope, it is equally likely that there is twice as much in the other envelope as half as much. But you can get the result that, whatever you see, you should switch. Here’s one way. (It could be Broome’s exact method for all I remember – I don’t have the paper in front of me, but the numbers feel familiar from what I remember of it.)

Let’s assume we have a red envelope and a blue envelope, a currency denominated in utils, and a God (Eris seems appropriate) who chooses how much to put in the envelopes as follows. She takes a coin that has a 2/3 probability of falling tails, and tosses it repeatedly until it falls heads. Let n be the number of tosses this takes. (In the probability-zero case where it falls tails forever, stipulate that n = 1.) She then takes a fair coin and tosses it once to determine what goes in the envelopes by the following rules.

If heads, then 2^n utils in the red envelope and 2^(n+1) utils in the blue envelope.

If tails, then 2^n utils in the blue envelope and 2^(n+1) utils in the red envelope.
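
For concreteness, here is a minimal Python sketch of Eris’s procedure as just described. The function name eris_envelopes is simply a label of my own, and the sketch assumes the set-up exactly as stated (n counts the tosses of the biased coin, including the final heads).

```python
import random

def eris_envelopes(p_tails=2/3):
    """One run of Eris's procedure: returns the utils in each envelope."""
    # Toss the biased coin (2/3 tails) until it lands heads; n counts the tosses.
    n = 1
    while random.random() < p_tails:  # this toss was tails, so toss again
        n += 1
    # Toss a fair coin to decide which envelope gets the larger amount.
    if random.random() < 0.5:  # heads
        return {"red": 2 ** n, "blue": 2 ** (n + 1)}
    else:  # tails
        return {"red": 2 ** (n + 1), "blue": 2 ** n}
```

Sampling it repeatedly produces pairs of adjacent powers of two (2 and 4, 4 and 8, and so on), and the smallest amount that can ever appear in either envelope is 2, which is what Case 1 below relies on.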

Let’s assume she now gives you the red envelope and you somehow (presumably with the help of another God – this part sort of defeats Eris’s purposes) get to see what’s in it. It turns out there are only two cases worth working through in detail.

Case 1: You see 2 utils. Since 2 is the smallest amount Eris ever puts in an envelope, it must be the smaller amount of the pair, so you know that there are 4 utils in the other envelope and the expected gain from switching is 2.

Case 2: You see 4 utils. In this case you know that there are either 2 utils in the other envelope or 8. The prior probability (i.e. the probability before Eris tosses any coins, which we can assume is the last point at which she tells you what’s going on) that the red envelope would end up with 4 and the blue envelope with 2 is 1/6: that requires the biased coin to take exactly one toss (probability 1/3) and the fair coin to fall tails (probability 1/2). The prior probability that the red envelope would end up with 4 and the blue envelope with 8 is 1/9: that requires exactly two tosses (probability 2/9) and heads on the fair coin. So by a quick application of Bayes’s theorem, the posterior probability that the blue envelope has 2 is 0.6, and that it has 8 is 0.4. So the expected value of the blue envelope is 2 * 0.6 + 8 * 0.4 = 4.4, and the expected gain from swapping is 0.4.
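
If it helps, here is a quick sanity check of that Case 2 arithmetic in exact fractions, again assuming the set-up exactly as stated above (the helper name p_tosses is mine):

```python
from fractions import Fraction

# Probability that the biased coin (2/3 tails) takes exactly k tosses to land heads.
def p_tosses(k):
    return Fraction(2, 3) ** (k - 1) * Fraction(1, 3)

half = Fraction(1, 2)  # the fair coin

# The two ways the red envelope can end up holding 4 utils:
p_red4_blue2 = p_tosses(1) * half  # n = 1, fair coin tails: red gets 2^2 = 4, blue gets 2^1 = 2
p_red4_blue8 = p_tosses(2) * half  # n = 2, fair coin heads: red gets 2^2 = 4, blue gets 2^3 = 8

total = p_red4_blue2 + p_red4_blue8
post_blue2 = p_red4_blue2 / total  # posterior that blue holds 2
post_blue8 = p_red4_blue8 / total  # posterior that blue holds 8
expected_blue = 2 * post_blue2 + 8 * post_blue8
expected_gain = expected_blue - 4

print(post_blue2, post_blue8, expected_blue, expected_gain)
# prints: 3/5 2/5 22/5 2/5  (i.e. 0.6, 0.4, 4.4 and a gain of 0.4)
```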

What about all the other cases, you ask? Well, I won’t try doing the algebra in HTML, but it isn’t too hard to prove that for any x larger than 2 that you see, the expected gain from swapping is x/10. So whatever you see, you should not only prefer to swap, you should be prepared to pay at least 0.4 utils to swap. And note that we’ve used only standard (countably additive) probability functions and standard reasoning here.
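
In the same spirit, here is the algebra done numerically rather than in HTML: a short sketch checking that if the red envelope shows x = 2^k for any k ≥ 2, the expected gain from swapping comes out at x/10. The structure of the calculation is my own; the claim it checks is the one just stated.

```python
from fractions import Fraction

def p_tosses(k):
    # Probability that the biased coin (2/3 tails) takes exactly k tosses to land heads.
    return Fraction(2, 3) ** (k - 1) * Fraction(1, 3)

half = Fraction(1, 2)  # the fair coin

for k in range(2, 12):
    x = 2 ** k  # the amount you see in the red envelope
    p_blue_smaller = p_tosses(k - 1) * half  # n = k - 1, fair coin tails: blue holds x / 2
    p_blue_larger = p_tosses(k) * half       # n = k, fair coin heads: blue holds 2 * x
    total = p_blue_smaller + p_blue_larger
    expected_blue = (x // 2) * (p_blue_smaller / total) + (2 * x) * (p_blue_larger / total)
    gain = expected_blue - x
    assert gain == Fraction(x, 10)  # the x/10 claim in the text
    print(x, gain)
```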

By the way, although I’ll leave the proof for another day, I’m pretty sure that the expected utility of receiving one of the envelopes does not have to be infinite for a two-envelope paradox style situation to arise, i.e. a situation where you will want to swap whatever you see in your envelope, whichever envelope you are given. (No one I know of has ever said that it has to be, but it’s easy to misread David Chalmers as saying that it must be.) In the case here, the expected value of Eris’s gift is infinite. (Well, until she starts getting you to pay to switch envelopes.) But I’m pretty sure that by mixing positive and negative payouts in the distribution it’s possible to produce a paradoxical distribution with each envelope having a strictly undefined expected utility.
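
Finally, a small check on the claim above that the expected value of Eris’s gift is infinite: under the set-up as stated, the truncated expectation of (say) the red envelope grows without bound as we let the biased coin run longer. This is just my own numerical sketch, not anything from Broome.

```python
from fractions import Fraction

def p_tosses(k):
    # Probability that the biased coin (2/3 tails) takes exactly k tosses to land heads.
    return Fraction(2, 3) ** (k - 1) * Fraction(1, 3)

def truncated_expectation(N):
    # E[red envelope], summing only over n = 1 .. N.  Given n, the red envelope
    # holds 2^n or 2^(n+1) with equal chance, so its conditional mean is 3 * 2^(n-1).
    return sum(p_tosses(n) * 3 * 2 ** (n - 1) for n in range(1, N + 1))

for N in (5, 10, 20, 40):
    print(N, float(truncated_expectation(N)))
# Each term of the sum is (4/3)^(n-1), so the partial sums diverge.
```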