I don’t think the evidence that knowledge is simply true belief has been taken seriously enough by many in the philosophical community. So I’m going to try again here to get people to do so. The following example is quite long, but I think that’s necessary to remove some possible distractions. In particular, part of my theory is that stress on the word ‘know’ or its cognate in knowledge claims changes the acceptability of those claims, so a longer story gives us more context and hence more natural stress patterns and hence a better guide to what’s really happening. (I’m much indebted to various conversations with Polly Jacobson, Jeff King and Jason Stanley for getting me to realise the importance of stress in these matters.)
The Virus
A nasty virus has been released at your workplace, and everyone is at risk of infection. The virus isn’t extremely infectious, but it isn’t fun to have, so it’s important to clamp down on it as soon as possible. Unfortunately, one of the two tests that people have been using to see whether they have the virus is not very good. The other test is fine: not perfect, but pretty good by medical standards.
But the bad test is quite bad. The people using it were told it is 98% accurate. That is a small exaggeration, but in any case the headline accuracy figure is close to irrelevant. The test is ‘accurate’ only because it mostly returns negative results and most people don’t have the virus. So it gets it right for about 95% or so of people. But only about 1/3 of those who get positive test results actually have the virus. So there are a lot of false positives floating around your workplace.
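To see how a test can be ‘accurate’ overall while most of its positives are false, here is a minimal back-of-the-envelope sketch. The prevalence, sensitivity and specificity figures are my own illustrative assumptions, not numbers from the story; they’re chosen only to reproduce roughly 95% overall accuracy with about a third of positives being genuine.

```python
# Illustrative numbers only (assumed, not from the story): chosen to give
# ~95% overall accuracy while only ~1/3 of positive results are correct.
prevalence = 0.03    # fraction of the workplace that actually has the virus
sensitivity = 0.90   # P(test positive | has virus)
specificity = 0.95   # P(test negative | no virus)

true_pos = prevalence * sensitivity
false_pos = (1 - prevalence) * (1 - specificity)
true_neg = (1 - prevalence) * specificity

accuracy = true_pos + true_neg              # overall "gets it right" rate
ppv = true_pos / (true_pos + false_pos)     # chance a positive result is correct

print(f"overall accuracy: {accuracy:.0%}")       # ~95%
print(f"positives that are genuine: {ppv:.0%}")  # ~36%, i.e. about a third
```

The point is just the familiar base-rate one: because most people don’t have the virus, the true negatives dominate the accuracy figure while the false positives dominate the positive results.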
Here are the numbers so far for various salient groups:
5 people have the virus and believe that they do because they used the good test.
4 people have the virus and believe that they do because they used the bad test.
6 people don’t have the virus but believe that they do because they used the bad test.
8 people have the virus but haven’t taken a test, so don’t think they have it.

Making matters worse, your boss would prefer that news of the virus didn’t get out, thinking it will send the company’s share price into a downward spiral. He would prefer there’d been no tests at all. Having heard that there’s been more testing, he storms into your office asking, “HOW MANY people know that they have the virus now?”
What do you answer?
It’s philosophically defensible to say zero, because no one is 100% beyond a shadow of a ghost of a shade of a doubt certain that they have it. But in the circumstances not many bosses would take that to be an acceptable answer.
It’s even more philosophically defensible to say five, because only five have a warranted true belief that they have the virus. (I presume that since the 4 made a false inference from false premises to conclude they have the virus, their belief is not warranted unless warrant is a totally trivial condition.) But again, that doesn’t seem like the most appropriate thing to say in the circumstances.
If your boss knew the underlying facts, the answer he’d expect, I think, is nine. And I think that’s the right answer.
To back up this intuition, consider what happens if the boss continues questioning you in the following way. Molly is one of the 4 who believe, for bad reasons, that they have the virus.
BOSS: Does Molly know she has the virus?
YOU: No.
BOSS: Does Molly believe she has the virus?
YOU: Yes.
BOSS: Does Molly have the virus?
YOU: Yes.
BOSS: Then whatdya mean she doesn’t know she has it?
YOU: Let me tell you about late 20th century epistemology.
The next step, of course, is you being fired. In the circumstances, true belief is enough for knowledge.
But note that nine is the largest answer you could acceptably give. You shouldn’t answer fifteen, though the Boss might appreciate it if your answer informed him that another 6 people think they have the virus. That’s probably relevant information, but those people shouldn’t be grouped in with the people who know they have the virus.
And, of course, the eight people who don’t even think they have the virus shouldn’t be considered. It’s clearly wrong to answer seventeen, even though seventeen people do, in fact, have the virus.
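For what it’s worth, here is a toy tally of the four groups above under the three candidate readings of the boss’s question (knowledge as true belief, as mere belief, as mere truth). The counts come from the story; the way I’ve encoded the groups is just my own bookkeeping.

```python
# Each group from the story: (count, has_virus, believes_they_have_it)
groups = [
    (5, True,  True),   # virus + good test, so they believe it
    (4, True,  True),   # virus + bad test, believe it for bad reasons
    (6, False, True),   # no virus, but the bad test says they do
    (8, True,  False),  # virus, untested, don't think they have it
]

true_belief = sum(n for n, has, believes in groups if has and believes)
belief      = sum(n for n, has, believes in groups if believes)
truth       = sum(n for n, has, believes in groups if has)

print(true_belief)  # 9  -- the answer if knowing = truly believing
print(belief)       # 15 -- the answer if knowing were mere belief
print(truth)        # 17 -- the answer if knowing were mere truth
```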
I hope you agree with all my intuitions here. What should we make of them philosophically?
The most natural explanation of the data, I think, is that knowledge is simply true belief, though sometimes when someone says S knows that p, they speaker mean that S has a warranted, or justified, or certain, or approved by God, belief that p. Semantically, all that they mean is that S truly believes that p. Questions, especially questions by people in authority not concerned with niceties of speaker meaning, tend to bring out semantic meaning, so in your little conversation with Boss, ‘know’ reverts to its basic meaning of true belief. That’s why the right answer is nine, though perhaps if you have a cute enough smile you can get away with five or zero without being fired.
I’m not saying that’s the best explanation of all the data concerned with knowledge talk. But I do think it’s the best explanation of this bit of data. There are two other explanations of the data that people have tried in the past.
One of these I won’t say much about. This is the contextualist approach. I’ve argued against contextualism here before, and I think in general the various objections that Jason Stanley and Ernie Lepore and John Hawthorne have made against contextualism in various places work. But I don’t want to argue for that here so much as set it aside. My main target is the invariantist who thinks that (non-trivial) warrant is necessary for knowledge.
What can that philosopher say about the appropriateness of nine as an answer to Boss’s question? The response I usually get is the inverse of my own: although ‘knowledge’ really denotes warranted true belief, sometimes the speaker meaning of a knowledge ascription can be somewhat weaker than this. Here all Boss cares about is true belief, so he speaker means “How many people truly believe they have the virus?”, and that’s how you should answer.
I used to think this answer was incoherent – speaker meaning can only add to the content of a term, not subtract from it. But that was probably too quick. The real problem with this response is that it can’t really explain the data. If ‘knowledge’ semantically means warranted true belief, but its speaker meaning can be simply true belief on some occasions, why couldn’t its speaker meaning be simply belief, or simply truth? If we can subtract part of the semantic meaning out, why not the other parts? I don’t think there’s any good explanation of this available to the invariantist who holds that knowledge is warranted true belief. If there’s any explanation of it at all, I suspect it will be very complicated.
Well, this was all rather quick, but I think there’s a somewhat powerful case to be made here that knowledge is simply true belief. Obviously this theory will have to rely on some very heavy-duty pragmatics in order to explain most of the cases philosophers have talked about. But since virtually every case considered in epistemology classrooms involves stress (usually comparative stress with an unclear comparison) on ‘know’, I think a good theory of stress can explain a lot of the data apparently inconsistent with the claim that knowledge is simply true belief. Could it, or any other pragmatic theory, explain all of that data? I don’t know, but I’d like to see some clever people argue one way or the other.
Quick acknowledgment at the end. The case here is somewhat modelled on various cases John Hawthorne has used for various purposes, but it does have one or two new touches. In particular, the use of questions to push the knowledge = true belief line is John’s, but the extra point that these cases do not support knowledge = truth or knowledge = belief is, I think, original.