There was a fair bit of back and forth in the “previous thread”:http://tar.weatherson.org/2010/08/20/philosophy-in-the-new-york-times/ on just what we stakes-sensitive folks were claiming to be stakes-sensitive. So I thought I’d list what *I* think is stakes-sensitive, and perhaps others who think there is stakes-sensitivity somewhere can chime in, either in comments or on their own blogs/sites.
Three qualifications before I start.
First, I’m really interested in *odds*-sensitivity, not *stakes*-sensitivity. I think you get some stakes-sensitivity effects when you have to decide whether to bet $20 against a few seconds’ work. For instance, you might double check on your phone something we’d ordinarily say you knew, because the act of checking has a positive expected return. I think that’s a case of long odds defeating knowledge. That doesn’t mean, of course, that I think of losing $20 as a high-stakes situation.
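The arithmetic behind the $20 case can be made explicit. A minimal sketch, with hypothetical numbers not drawn from the post (the credences, the $20 stake, and the dollar value placed on a few seconds of checking are all illustrative assumptions):

```python
# Hypothetical numbers: when is it worth double-checking a claim
# before staking $20 on it?

def expected_gain_from_checking(credence, stake, checking_cost):
    """Expected value of double-checking before risking `stake` on a
    proposition held with the given credence, assuming (for the sketch)
    that the check is perfectly reliable and averts the loss if the
    proposition is false."""
    expected_loss_if_unchecked = (1 - credence) * stake
    return expected_loss_if_unchecked - checking_cost

# At credence 0.99, the expected loss from not checking (0.01 * $20 =
# $0.20) exceeds a $0.05 checking cost, so checking pays ...
print(expected_gain_from_checking(0.99, 20.0, 0.05) > 0)       # True

# ... but at credence 0.99999 the expected loss ($0.0002) is below the
# cost of checking, so checking does not pay.
print(expected_gain_from_checking(0.99999, 20.0, 0.05) > 0)    # False
```

The point the numbers illustrate: what matters is the odds of error against the cost of checking, not the absolute size of the stake — losing $20 is not a high-stakes situation, yet long odds can still make checking rational.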
Second, I’m primarily interested in the way in which various things are *constitutively* dependent on stakes. If the stakes rise, and I therefore collect more evidence, and my credence/belief/knowledge/evidence then changes, that doesn’t in itself mean the kind of sensitivity at issue here is displayed. I also think that ‘thinking through’ a question is often a way of collecting more logical/mathematical/epistemological evidence. So this kind of causal dependence of belief etc. on stakes is not what’s at issue here, though it is surely a central feature of our epistemic life. Any theory that said that, for ordinary humans, rising stakes have no causal impact on how much evidence we collect and think through is surely too absurd to be taken seriously.
Third, it’s very important to distinguish various ways in which beliefs can be strong. There are plenty of pairs of propositions _p, q_ such that:
- There are odds that I would regard as favourable for a bet on _q_, but not for a bet on _p_. In that sense, my credence in _q_ is higher than my credence in _p_.
- In the ordinary sense, I believe _p_ but not _q_, so in the ordinary sense, I have a stronger degree of belief in _p_ than in _q_.
Here’s one instance of that. Let _p_ be that in the weekend’s election, the seat of La Trobe was won by “a college friend”:http://en.wikipedia.org/wiki/Laura_Smyth of mine, and _q_ be that this particular lottery ticket will lose.
This means that phrases like ‘degree/strength of belief/confidence/credence’ are systematically ambiguous. By ‘credence’ I mean the state that bears a close relationship to betting behaviour, and by ‘belief’ I mean the state such that someone who believes _p_ takes _p_ for granted when making theoretical or practical decisions.
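The betting side of the distinction can be sketched numerically. The credences and odds below are hypothetical illustrations, not figures from the post:

```python
# Hypothetical numbers: credence can rank q (this lottery ticket loses)
# above p (my friend won the seat), even though in the ordinary sense
# p is believed outright and q is not.

def favourable_bet(credence, win, lose):
    """Would a bet paying `win` if right and costing `lose` if wrong
    have positive expected value at this credence?"""
    return credence * win - (1 - credence) * lose > 0

cred_p = 0.95      # p: the friend won La Trobe (believed outright)
cred_q = 0.999999  # q: this lottery ticket loses (not believed outright)

# At long odds -- risk $1000 to win $1 -- only the lottery proposition
# clears the bar, so in the betting sense credence in q exceeds
# credence in p.
print(favourable_bet(cred_p, 1, 1000))   # False
print(favourable_bet(cred_q, 1, 1000))   # True
```

No betting threshold captures the other notion: believing _p_ but not _q_ is a matter of what gets taken for granted in reasoning, which is exactly why the two senses of ‘strength of belief’ come apart.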
Having said that, here’s what I think is stakes-sensitive.
- Credences are not stakes-sensitive, since credences are defined in terms of dispositions over decision possibilities that include high-stakes situations.
- Beliefs are stakes-sensitive, since in high stakes situations, less is taken for granted by rational actors.
- Evidence is stakes-sensitive, since in high stakes situations, different sources are rationally relied upon.
- Knowledge is stakes-sensitive for at least both of the last two reasons.
- Fixing credence, knowledge is still stakes-sensitive since of course credences don’t vary with stakes.
- Fixing belief, knowledge is still stakes-sensitive.
In my original “paper on pragmatic encroachment”:http://brian.weatherson.org/cwdwpe.pdf I hinted that the last claim is false. But I’ve changed my mind. I think that in cases where the agent is wrong about what the stakes are, or even has mistaken credences about the stakes, knowledge can be affected by stakes even fixing everything else. Those cases are rare, and haven’t been much discussed in the literature. (They certainly haven’t been tested in any experiments.) But I think they are important for getting the details of stakes-sensitivity right.