July 10th, 2005

What do Dutch Book Arguments Prove?

It’s sometimes said that probability theory is the logic of partial belief, meaning by this that a person whose credences do not conform to the probability calculus is incoherent in just the same way that a person whose beliefs are logically inconsistent is incoherent. (We’re setting aside for purposes of this post whether logical inconsistency is a major, or even a minor, epistemic failing. The issue is whether not being a probabilist is like being an inconsistent person, however good or bad that is.)

It seems to me that this can’t be right. In particular, it seems that Dutch Book arguments cannot establish it. The most a Dutch Book argument can show is that the non-probabilist evaluates bets in a way that leads to their giving a positive evaluation to some sets of bets that provably have negative net value taken collectively. But this is compatible with the agent having no mistaken logical views.

Note that there are lots of kinds of belief an agent could have that could not be true but that do not involve logical error. Here are three interesting categories of such belief.

Which of these errors is the non-probabilist making? It is hard to see how it could be more than the last. Consider someone who engages in the following reasoning.

What should my credence in p v q be? Well, my credence in p is 0.37. And my credence in q is 0.27. And p and q are logically incompatible. So my credence in p v q should be 0.37 plus 0.27, that is, 0.54. That is, my credence in p v q is 0.54.

There are no logical errors in this bit of reasoning, just a mathematical error. So having non-probabilistic credences doesn’t imply making any logical mistakes; at most it implies a mathematical mistake.
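For concreteness, here is a minimal sketch of the exploitation, using the numbers from the example above. Each bet is priced at the agent’s stated credence, so each is fair by his lights, yet the package loses money in every possible world. (The Python framing and the three-world enumeration are purely illustrative assumptions, not part of the argument.)

    # Agent's (non-probabilistic) credences from the example above:
    # C(p) = 0.37, C(q) = 0.27, C(p v q) = 0.54, with p and q incompatible.
    credences = {"p": 0.37, "q": 0.27, "p_or_q": 0.54}

    # The bookie sells the agent $1 bets on p and on q at the agent's prices,
    # and buys from the agent a $1 bet on (p or q) at the agent's price.
    # stake = +1 means the agent is the buyer; stake = -1 means the agent sells.
    bets = [("p", +1, credences["p"]),
            ("q", +1, credences["q"]),
            ("p_or_q", -1, credences["p_or_q"])]

    def holds(prop, world):
        """world is 'p', 'q', or 'neither', since p and q are incompatible."""
        if prop == "p_or_q":
            return world in ("p", "q")
        return prop == world

    for world in ("p", "q", "neither"):
        net = sum(stake * ((1.0 if holds(prop, world) else 0.0) - price)
                  for prop, stake, price in bets)   # agent's net gain
        print(world, round(net, 2))                  # -0.1 in every world

In every world the agent is down 0.10, which is just the gap between the correct 0.64 and the mistaken 0.54.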

That’s not to say there are no logical constraints on credences. I think there are, but they are all framed in terms of comparative probabilities, not numerical probabilities. For instance, I think the following is (akin to) a logical constraint, unlike the constraint that credences be probabilities. (C here is the credence function.)

C(p | q) > C(s)
C(p | ~q) > C(s)
So, C(p) > C(s)

A credence function that doesn’t satisfy this constraint is flawed in just the way that inconsistent beliefs are flawed. But this isn’t the way that non-probabilistic credences are flawed.
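By way of contrast, here is a quick, purely illustrative spot-check that any credence function which is in fact a probability function satisfies the displayed constraint: since C(p) = C(p|q)C(q) + C(p|~q)C(~q), C(p) is a weighted average of the two conditional credences and so cannot fall below both of them. (The random sampling below is just a sanity check on that bit of algebra, not part of the original argument.)

    import random

    # If C is a probability function then C(p) = C(p|q)C(q) + C(p|~q)C(~q),
    # a weighted average of the two conditional credences. So if both exceed
    # C(s), then C(p) must exceed C(s) as well.
    for _ in range(100_000):
        c_q = random.uniform(0.001, 0.999)      # C(q), kept away from 0 and 1
        c_p_given_q = random.random()           # C(p|q)
        c_p_given_not_q = random.random()       # C(p|~q)
        c_s = random.random()                   # C(s)
        if c_p_given_q > c_s and c_p_given_not_q > c_s:
            c_p = c_p_given_q * c_q + c_p_given_not_q * (1 - c_q)
            assert c_p > c_s                    # never fails
    print("No counterexample found among the sampled probability functions.")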

Posted by Brian Weatherson in Uncategorized



12 Responses to “What do Dutch Book Arguments Prove?”

  1. Kenny Easwaran says:

    That’s interesting – you put van Fraassen’s reflection principle on a firmer footing than the principle that credences should be probabilities! I suppose this would just strengthen the point I make in my paper on conditional probabilities (http://socrates.berkeley.edu/~fitelson/few/easwaran.pdf).

    I imagine you’d also assign something like logical status to the requirement that C(p|p)=1?

    Anyway, I don’t entirely see why these principles should be any stronger than the one that credences should be probabilities. I’ve always been most strongly convinced of each of these principles by the dutch book arguments in their favor rather than anything else. Of course, van Fraassen does give lots of other arguments for the reflection principle in “Belief and the Problem of Ulysses and the Sirens”.

    I never thought of dutch book arguments as suggesting that a violator was logically incoherent. Instead, I always thought of them as an argument that a rational person should obey them. This would be akin to the sort of rationality advocated by one’s favorite brand of decision theory, and not mere logical consistency. Given most analyses of rationality, being logically infallible and omniscient is necessary for rationality, but not sufficient. So it’s no surprise that one could violate the norms of rationality while still being logically coherent.

  2. Trent Dougherty says:

    It seems to me that in the mathematical case you focused on the process of reasoning when the focus should be on the results. I don’t think the bayesian primarily wants to criticize the booked individual’s train of thought, but rather primarily the wrongness of the destination. In the mathematical case the individual holds a belief such that there is a formal system which the mathematician takes to be normative which shows the result is wrong—indeed incoherent given the individual’s beliefs regarding sums. Now the bayesian wants to say much the same thing about the booked individual: they have a belief such that there is a formal system which the bayesian takes to be normative that shows that the result is wrong—indeed incoherent given plausible preferences. You can get from here to irrationality or even irresponsibility if the individual is recalcitrant to correction (or as Kenny points out by definition).

  3. Barry Lam says:

    Brian,

    To keep the logic/rationality divide as distinct as possible pretheoretically, I think it’s better to say that probability is the logic of the contents of partial belief, and logic proper is the logic of the contents of beliefs, that is, propositions. This requires richer types of content, but we leave open the general question of how logic and rationality are related.

    In your argument about whether or not someone who reasons thus and so makes any “logical” mistakes, I assume that you are invoking our intuitive notion of logicality and not any technical notion, lest the argument reduce to triviality (if a logical mistake just is a classical logical fallacy, then of course any mistake about numbers is not such a fallacy!). On the intuitive notion of logicality, though, lots of things that philosophers don’t consider logical are as good a candidate as others: color incompatibility, family relations, and so on. You still have a point, however, that arithmetical errors don’t seem to be logical errors on an intuitive notion of logicality.

    I think the way to think of probability as a logic of the contents of partial beliefs is to consider what it would take to reduce probabilistic incoherence to the technical, classical conception of logical inconsistency (where this is neither the intuitive notion of logicality, nor some notion directly tied to the rules for right reasoning). To reduce the notion of probabilistic incoherence, or failure to satisfy the K-axioms, to classical inconsistency, one needs to help oneself to the notion of a coherent utility function, and then to definitions like the following: “A believes that a bet on P is fair iff…(the usual statement about probabilities, utilities, and expected returns)” and maybe “A believes that a book of bets is unfair iff A believes that one of the bets in the book is unfair”.

    Then the DBAs return the result that probabilistically incoherent agents have classically inconsistent beliefs. That’s building way too much into the probabilist’s view. I don’t think the probabilist who relies on DBAs needs to rely on this much to reduce probabilistically incoherent contents of partial beliefs to classically inconsistent contents of ordinary beliefs, but it’s the clearest way of seeing what the project is supposed to be. I think this is a version of the Skyrms line, from whom I got my probability education.

  4. Peter says:

    Two issues which often seem to be ignored here:

    - Probability is just ONE means of representing uncertainty, and is not the only one with a consistent set of reasonable axioms. I may, for example, throw away the Law of Excluded Middle and use Dempster-Shafer theory. Am I in logical error because I view the D-S axioms as more appropriate to a particular application domain than the Kolmogorov axioms? I have heard Bayesians say that I am, but they are wrong. (When I hear these attacks, I am reminded of Frege’s failure to understand Hilbert’s geometry.)

    - Dutch book arguments are based on convergence of infinite sequences, and so are liable to attack on constructivist grounds. We all inhabit a finite world, and all of us (and all our machines) are resource-bounded reasoners; prima facie, therefore, supporters of DB arguments have some justifying to do before their arguments are acceptable, IMO.

  5. Brian Weatherson says:

    A few quick comments on all of the above.

    Kenny,

    I don’t mean to be endorsing Reflection here. The principles I’m supporting are synchronic, not diachronic. They are stated using conditional probabilities, but they are still synchronic principles.

    And I don’t think it is even close to true that “Given most analyses of rationality, being logically infallible and omniscient is necessary for rationality, but not sufficient”. Indeed I would say that is true on approximately zero analyses of rationality outside of the formal epistemology movement. Certainly the denial of that by, say, Richard Foley and Gilbert Harman is not regarded as being what is radical about their views.

    Trent,

    I agree entirely that in some sense the non-probabilist is incoherent. The issue is just what this incoherence comes to. I think it is (at most) the same kind of incoherence as the person who believes 37+27=54, i.e. mathematical incoherence. And, as the failures of logicism showed, that isn’t the same thing as logical incoherence. (Actually this point is tricky, and maybe I’m going to eventually want to retract it, because it is arguable that 37+27=64 is a logical truth, even if not all theorems of arithmetic are logical truths. Maybe I’ll have to come back to that.)

    Barry,

    I agree this is a pointless thing to say if we mean to stipulate that logic is, say, first order predicate calculus with identity. (Indeed, the ‘synchronic reflection’ principle that I say is logical in the intended sense isn’t part of regular predicate calculus.) But I do think the fuzzy intuitive notion is still useful and important enough to use here.

    Peter,

    The DS theory, at least the version I know, doesn’t give up the Law of Excluded Middle. Any coherent belief function still has B(p v ~p) = 1. The only person to defend at length a theory about coherent credence functions where C(p v ~p) = 1 fails is, to my knowledge, me (in “From Classical to Constructivist Probability”). I do think the constructivist point here is one that the DB theorist has to do some work to respond to, though I think this is just the same work that any proponent of classical logic has to do.

  6. Trent Dougherty says:

    Brian,

    I agree about the failure of logicism, which is why I adverted to the subject’s own beliefs about sums. Plausibly the subject will have the usual beliefs about how to add digits by carrying, and then all they need are the beliefs that 7 + 7 = 14 and 1 + 2 + 3 = 6. From these beliefs, together with the mistaken belief that results from the error, a contradiction can be derived. The individual can be evaluated diachronically w.r.t. how they came to the mistaken belief, or they can be evaluated synchronically as simply holding contradictory beliefs. The former is not essentially a logical evaluation; the latter, I think, is.

  7. Kenny Easwaran says:

    I suppose my comments about the relation between logic and rationality betray my differing levels of exposure to different ideas of rationality. (But isn’t a Kantian rational agent supposed to be able to consider all the consequences of universalizing her maxims, and doesn’t that require at least some sort of unrealistically large amount of logical knowledge?)

    And I suppose I shouldn’t have identified van Fraassen’s reflection principle with this law of conditional probabilities. He specifically defends a version of the principle that should apply even if probability isn’t the appropriate model of belief, and if one thinks that conditionalization isn’t the appropriate way to update beliefs then this synchronic fact comes apart from the diachronic one in yet another way. I just figured that these principles were closely enough related that it would make sense to call both reflection principles, even if they’re distinct at some level.

    I should check the theories given in Roeper and LeBlanc’s book “Probability Theories and Probability Semantics” (I think that’s the right title). They definitely describe a probability calculus that gives intuitionist logic the way Popper’s gives classical logic, but I’ll have to double-check whether or not C(p v ~p)=1.

  8. Branden Fitelson says:

    A nice (state-of-the-art) survey of Popper-style approaches to non-classical probability (including intuitionistic probability) has been given in a recent talk by David Miller, at:

    http://www2.warwick.ac.uk/fac/soc/philosophy/staff/miller/chuaqui.pdf

    I think C(p v ~p | T) = 1 fails in all of these intuitionistic systems, which date back at least to Popper and Miller’s 1980 system. Since these are conditional probability systems, there is no unconditional probability function. In this sense, they differ from Brian’s approach, which takes unconditional probability as primitive (in Kolmogorov style).

  9. Barry says:

    Sorry Brian, I didn’t want to make it seem like that was my main point. It wasn’t at all. The idea is just that DBAs for probabilism attempt to show either (1) that non-K-axiom-satisfying probabilities along with a coherent utility function imply that you have classically inconsistent beliefs (that a set of bets is both all fair and not all fair) or (2) that non-K-satisfying probabilities along with a coherent utility function COMMIT you to having classically inconsistent evaluations of a set of bets.

    Of course, if you are looking for non-classical inconsistency, choose the right set of probability axioms and the DBAs will get you the results for those. The point is just that DBAs reduce incoherence to classical inconsistency via utility and some principle about what it is to take a bet to be fair.

  10. Peter says:

    Brian —

    Further to your and my posts on DS theory, I checked my copy of Klir and Wierman (“Uncertainty-Based Information”), which presents a single mathematical framework for probability, evidence, and possibility theories. They define belief functions as satisfying superadditivity. For disjoint sets, this comes down to: the belief assigned to the union of two disjoint sets is greater than or equal to the sum of the beliefs assigned to the individual sets.

    This condition seems to me to contradict one part of LEM, the part which claims that {p} and {not-p} exhaust all the possibilities of a universal set, since it allows the belief assigned to the union set

    {p} U {not-p}

    to be greater than the sum of the beliefs assigned to {p} and to {not-p}.
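    To see the superadditivity concretely, here is a toy sketch (the numbers are made up, not taken from the book): a mass function can commit weight directly to the two-element set {p, not-p}, and then Bel({p}) + Bel({not-p}) falls short of Bel({p} U {not-p}) = 1.

        # Toy Dempster-Shafer belief function; the numbers are purely illustrative.
        mass = {
            frozenset({"p"}): 0.3,
            frozenset({"not_p"}): 0.2,
            frozenset({"p", "not_p"}): 0.5,   # weight committed to the union but to neither alone
        }

        def bel(subset):
            """Belief in a subset: total mass of focal elements contained in it."""
            return sum(m for focal, m in mass.items() if focal <= subset)

        print(bel(frozenset({"p"})))              # 0.3
        print(bel(frozenset({"not_p"})))          # 0.2
        print(bel(frozenset({"p", "not_p"})))     # 1.0 > 0.3 + 0.2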

  11. George Kahrimanis says:

    It seems to me that my (current!) outlook is relevant to Brian’s main question. My apologies if I misuse the space.

    In my view the (original) DBA is mainly about a multiplicity of solutions:
    (A) Pr(gain) > 0
    (B) Pr(loss) = 1
    which looks like logical inconsistency. I dare say that this multiplicity is similar to the multiplicity we find in the Reference Class Problem: in both cases, we state a problem as completely as we can but still find a multiplicity of solutions.

    (A difference between the RCP and the DB Problem is that in the RCP we have two (at least) coherent probability spaces disagreeing with each other, but in the DBP there is no such structure. I am not sure at the moment whether this distinction is not a trivial formality.)

    Inasmuch as we find a way to live with the RCP (I have in mind a variant of Venn’s practical solution for an English consumptive in Madeira), the issue of the “multiplicity of solutions” can be addressed, in principle. W.r.t. the DBA, the practical upshot is that a second Dutch bookie would undercut the deal offered by the first one, and so on; that is, competition between Dutch bookies would kill the Dutch book. Moral: there is no harm caused by being incoherent in a free market. In this way we dodge the pragmatic DBA.

    To recap, it seems to me that the multiplicity of solutions “comes with the territory” of applied probability, unlike, say, in physics, where we have relativistic covariance. E.g., an elevator in free fall may be described, equivalently, as if in a space with zero gravity, so that it makes no difference whether one adopts the one or the other description. On the other hand, in applied probability we may encounter problems admitting distinct and non-equivalent formulations.

    Back to the original issue: additivity of partial beliefs, or “credences”. Even though I do not find the DBA convincing (in view of the above considerations), I still support additivity, for the following reason. I recognize only those credences that are based on objective (that is, commonly or tentatively agreed) chances; that is, no betting on theories and no hanky-panky with “ignorance priors”. The allowed credences are subject to statistical tests. If C(A)=p and C(~A)=q, then in a series of N repetitions the expected number of A’s is Np and that of ~A’s is Nq. Unless p+q=1, in large enough samples we shall have a guaranteed significant deviation of the actual numbers from the expected ones. It seems to me that a pretheoretical razor exists, or should exist, that eliminates any hypothesis that will necessarily be rejected by the evidence.

  12. George Kahrimanis says:

    Oops, I must make a couple of improvements to my (George Kahrimanis) “rant” above:

    1. About the non-equivalent formulations of a problem: so long as probability is not regarded as real but only as relative to specific conditions, there is no logical inconsistency (to my mind).

    2. I regret that I mentioned betting on theories and unsupported priors: that is beside the issue. Let me try again. Incoherence is rejected by statistical tests, as in the following argument. A habitually incoherent thinker holds not just one pair of incoherent beliefs but a sequence of incoherent pairs, and if he contemplates (only) a corresponding sequence of Dutch-book-like procedures (leaving out utility — say, Dutch books played with pebbles instead of money), the expected net “gain” will be positive according to his beliefs, though the actual net gain would necessarily be negative. Then in large enough samples we shall have a guaranteed significant deviation of the actual number from the expected one. I repeat: it seems to me that a pretheoretical razor exists, or should exist, that eliminates any hypothesis that will necessarily be rejected by the evidence.
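    A rough Python sketch of this sort of test (the prices are stipulated, sweetened slightly in the agent’s favour so that each bet looks strictly favourable given credences C(p)=0.37, C(q)=0.27, C(p v q)=0.54, with p and q incompatible): the expected cumulative gain grows with the number of rounds, while the actual result is guaranteed to be negative whatever happens.

        # Agent buys a $1 bet on p for 0.36, a $1 bet on q for 0.26,
        # and sells a $1 bet on (p or q) for 0.55. By his credences each
        # trade gains him 0.01 in expectation, +0.03 per round in total.
        expected_per_round = (0.37 - 0.36) + (0.27 - 0.26) + (0.55 - 0.54)

        def actual_gain(world):
            """Net gain in a given world; the answer is the same in all three."""
            p_pay = 1.0 if world == "p" else 0.0
            q_pay = 1.0 if world == "q" else 0.0
            p_or_q_pay = 1.0 if world in ("p", "q") else 0.0
            return (p_pay - 0.36) + (q_pay - 0.26) + (0.55 - p_or_q_pay)

        rounds = 1000
        print("expected:", round(rounds * expected_per_round, 2))      # +30.0
        print("actual:  ", round(rounds * actual_gain("neither"), 2))  # -70.0 (same for 'p' or 'q')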