# A Counterexample to Conditional Countable Additivity

That title should draw the readers in! Here is something that I was prompted to think about by Timothy Williamson’s Analysis piece. Countable additivity is the following principle.

If S is a countable set of propositions, and any two members of S are incompatible, then the probability that one member of S is true is the sum of the probabilities of each member of S. In symbols, if the members of S are p1, p2, …, then Pr(p1 v p2 v …) = Pr(p1) + Pr(p2) + …

One consequence of countable additivity is that it is impossible to be certain that some member of S obtains while the probability of each member of S is zero. And that implies there cannot be an ‘even’ distribution of probabilities over a countable set. So if you believe countable additivity, you believe that there are pretty serious constraints on what kinds of sets there can be ‘even’ distributions of probability over. (For example, there can’t be a way of selecting a real number at random such that, for any two intervals of the same length, the probability of the number falling in each interval is the same. It is easy to show that this violates countable additivity.) Can we find a counterexample to countable additivity that doesn’t posit such a dubious ‘even’ distribution? I think not, but we can find something similar.
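The parenthetical interval example can be spelled out with a short calculation (my sketch, not part of the original text): carve the real line into the unit intervals [n, n+1). If intervals of equal length must get equal probability, each gets some common value p, and countable additivity over this countable partition would require

```latex
1 = \Pr(\mathbb{R})
  = \sum_{n \in \mathbb{Z}} \Pr\big([n, n+1)\big)
  = \sum_{n \in \mathbb{Z}} p ,
```

which fails either way: the sum is 0 if p = 0 and diverges if p > 0.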

Two fair coins, a dime and a nickel, will be tossed an infinite number of times sequentially. (That is, the number of tosses will be countable.) For any non-negative integer n, let Dn be the proposition that the dime will land heads exactly n times, and Nn the proposition that the nickel will land heads exactly n times. Let D = D0 v D1 v D2 v …, i.e. the proposition that the dime will land heads only a finite number of times. Similarly, let N = N0 v N1 v N2 v ….

Now what is the value of Pr(Dk | D), for arbitrary k? I think there is a reasonable case to be made that it is 0. Here is the argument. Let Dk,r be the proposition that exactly k of the first r tosses are heads, and let D,r be D0,r v D1,r v D2,r v …. In the circumstances, D,r is guaranteed to be true, though D of course is not. Now it isn’t hard to prove that the limit, as r tends to infinity, of Pr(Dk,r | D,r) is 0. I take that to be a good reason to say Pr(Dk | D) is 0.
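The finite-r calculation is easy to check numerically (my own sketch; the helper name is mine, not from the post). Since D,r is a tautology about the first r tosses, conditioning on it changes nothing, and Pr(Dk,r | D,r) is just the binomial probability of exactly k heads in r fair tosses, which shrinks to 0 as r grows:

```python
from math import comb

def pr_Dkr_given_Dr(k, r):
    # D,r (some number of heads among the first r tosses) is a tautology,
    # so conditioning on it changes nothing: this is just the binomial
    # probability of exactly k heads in r fair tosses.
    return comb(r, k) / 2**r

for r in (10, 100, 1000):
    print(r, pr_Dkr_given_Dr(3, r))
```

For k = 3 the values fall from about 0.117 at r = 10 to something astronomically small by r = 1000.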

Note that the probability distribution over the Dk conditional on D is not in any sense an ‘even’ distribution. It’s true that the probability of each Dk is the same, namely 0. But ‘even’ distributions usually satisfy stronger constraints than this. We might want, for instance, the conditional probability Pr(Di | Di v Dj), for i < j, to be 1/2. But in this case an argument similar to the one above can be used to show that Pr(Di | Di v Dj) is in fact 0. So it is not any kind of ‘even’ distribution. Nor did we assume the existence of any such distribution in setting up the problem. But there is a sort of problem here.
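The claim about Pr(Di | Di v Dj) can also be checked at finite r (again my own calculation, with a hypothetical helper name): the 2**r normalisers cancel, leaving a ratio of binomial coefficients that tends to 0 for i < j.

```python
from math import comb

def pr_Di_given_Di_or_Dj(i, j, r):
    # Pr(exactly i heads | exactly i or exactly j heads) among the first r
    # fair tosses; the 2**r factors cancel, leaving a ratio of binomials.
    return comb(r, i) / (comb(r, i) + comb(r, j))

for r in (10, 100, 1000, 10000):
    print(r, pr_Di_given_Di_or_Dj(2, 5, r))
```

For i = 2, j = 5 this starts around 0.15 at r = 10 and heads rapidly toward 0, since comb(r, j) dominates comb(r, i) when i < j and r is large.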

The problem is that although Pr(Dk | D) is 0 for each k, Pr(D0 v D1 v D2 v … | D) = Pr(D | D) = 1. So, conditional on D, we have a violation of countable additivity.

Some may think this is no real problem. Since Pr(D) is not defined (finite additivity alone does not settle its value), probabilities conditional on D should not be taken to be defined either. I think this is a mistake, and that conditional probabilities should be taken to be primitive. But that is one way out of the puzzle raised here.

The reason that I like countable additivity is that (given standard assumptions) it is equivalent to conditional conglomerability. That is, it is equivalent to the principle that, where S = {p1, p2, …} is a set of pairwise incompatible propositions, if Pr(A | pi) > x for all i, then Pr(A | p1 v p2 v …) > x. I think that is a pretty plausible principle; indeed it looks close to being a principle of logic to me. But the case I just described suggests a failure of a very similar principle.

Let N > D be the proposition that is true iff the number of heads in the nickel tosses is infinite, or it is finite and the nickel landed heads more often than the dime did. Again using reasoning similar to the above, it seems very plausible that the following is true for all values of k.

Pr(N > D | N & Dk) > 1/2.

Indeed, that probability is, if not 1, then arbitrarily close to 1. Using a conditional version of countable conglomerability, we can infer from the truth of every instance of that schema that Pr(N > D | N & D) > 1/2. However, we can also argue that

Pr(N > D | Nk & D) < 1/2.

Indeed, that probability is, if not 0, then arbitrarily close to 0. Again using a conditional version of countable conglomerability, we can infer that Pr(N > D | N & D) < 1/2.
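Both directions can be sanity-checked with a finite-r analogue (my own gloss; the helper names are mine): pin one coin’s head count at a fixed value and treat the other coin’s head count among its first r tosses as Binomial(r, 1/2).

```python
from math import comb

def pr_more_than(k, r):
    # Pr(Binomial(r, 1/2) > k): more than k heads among r fair tosses.
    return sum(comb(r, m) for m in range(k + 1, r + 1)) / 2**r

def pr_fewer_than(j, r):
    # Pr(Binomial(r, 1/2) < j): fewer than j heads among r fair tosses.
    return sum(comb(r, m) for m in range(j)) / 2**r

# Finite-r stand-in for Pr(N > D | N & Dk): dime pinned at k heads,
# nickel's count Binomial(r, 1/2) -- this tends to 1 as r grows.
# Finite-r stand-in for Pr(N > D | Nj & D): nickel pinned at j heads,
# dime's count Binomial(r, 1/2) -- this tends to 0 as r grows.
k = j = 5
for r in (20, 100, 1000):
    print(r, pr_more_than(k, r), pr_fewer_than(j, r))
```

The two columns pull toward 1 and 0 respectively, which is exactly the symmetry that makes the two conglomerability arguments collide.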

Something has gone deeply wrong. But I don’t quite know what it is.

I also don’t know whether all of this is well known in the relevant literature. I used to try to keep up a little with debates about countable additivity, but that was a while ago, I’ve forgotten a lot, and I didn’t know that much to start with. So maybe I’m just reinventing the sled here. But it all seems like an interesting set of problems to me.

## 2 Replies to “A Counterexample to Conditional Countable Additivity”

1. I haven’t seen anything like this, and I hope I’ve been reading enough and talking to enough people that something like this would have been pointed out to me if it already existed. It’s somewhat troubling.

However, I wonder about the step where you argue that P(Di | Di v Dj) is the limit of P(Di,r | Di,r v Dj,r) – this reasoning, where you take a probability in the infinitary case to be the limit of the corresponding finite cases, is exactly what gives us failures of countable additivity (and conglomerability) in the unconditional cases.

But regardless of the particular argument you used there, it does look very plausible that P(Di|D)=0 – after all, it really looks like this value should be non-decreasing in i, which requires that it be 0. The only way out of this that I see right now is to say that D isn’t even an event in the algebra, which seems like a harsh response.

2. I was thinking about that way out, but I worried that it would have another bad consequence, namely that the algebra isn’t a sigma-algebra. But maybe that’s just something that we have to live with.