“Philip Pettit”:http://www.princeton.edu/~ppettit/ has a very good paper in the latest _Analysis_ on the issue of when you should defer to majority wisdom. (Philip Pettit, “When to Defer to Majority Testimony – And When Not,” _Analysis_ 66.3, July 2006, pp. 179–87.) Those of you who have _Analysis_ should go read it, and come back here when you’re done. Since I’m a blogger, I’m going to pick on a relatively minor technical point, but see the bottom of the post for more substantive points of agreement with the paper.
Now Philip’s overall position is relatively subtle, but one of the key points he makes is that deferring to the majority view on every question is liable to leave you with inconsistent beliefs, so you shouldn’t do it. This should be a fairly well-known point, but it is worth rehearsing the proof.
Imagine that there are two atomic propositions under discussion, _p_ and _q_. And there are three people in your group, A, B and C. (It won’t matter which you are.) A and B believe that _p_, but C thinks it is false. B and C believe that _q_, but A thinks it is false. So the majority believes that _p_, the majority believes that _q_, and, since A and C each reject one of the conjuncts, the majority also believes that not-(p & q). If you follow majority ‘wisdom’ on all counts, you’ll end up with inconsistent views.
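For concreteness, here’s a quick tally of that case (my own sketch in Python, not anything from Philip’s paper, with made-up variable names) showing that each of _p_, _q_ and not-(p & q) commands a two-thirds majority:

bc. # Three voters' verdicts on the atomic propositions p and q (True = believes it)
votes = {"A": {"p": True, "q": False},
         "B": {"p": True, "q": True},
         "C": {"p": False, "q": True}}
def share(test):
    # fraction of voters whose verdicts satisfy the test
    return sum(test(v) for v in votes.values()) / len(votes)
print(share(lambda v: v["p"]))                   # 2/3: majority for p
print(share(lambda v: v["q"]))                   # 2/3: majority for q
print(share(lambda v: not (v["p"] and v["q"])))  # 2/3: majority for not-(p & q)

So a simple-majority policy endorses all three verdicts at once, and they can’t all be true.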
Surprisingly, Philip suggests that a supermajoritarian position works better. (He doesn’t say this is a good idea to follow in all cases, but he does seem to think it avoids this problem.)
bq. There is another sort of approach that will do better, however. This is not to allow just any majoritarian challenge to reverse a belief but to allow only a certain sort of supermajoritarian challenge to do so. This would amount to a policy of global supermajoritarian revision. In order to see why this approach need not involve the same problems as the majoritarian counterpart, imagine that you are prepared to defer only to a supermajority of 70%, and to every such supermajority. And suppose that, as before, you are confronted by a situation in which a majority of others hold that _p_, that _q_, and that not-p&q, where you hold that not-p, not-q and not-p&q. Suppose in particular that 70% hold by _p_ and that 70% hold by _q_, giving you a reason to defer to the group on those issues. Will such supermajoritarian deference raise a problem? No, as it happens, it won’t. You will be led to adopt the majority view that _p_, and the majority view that _q_, since each proposition commands the requisite supermajority of 70%. But you will also be allowed to revise your belief that not-p&q, thereby ensuring that your beliefs are consistent. You will not be forced, inconsistently, to hold by the majority view that not-p&q, since this will not be supported by a majority of 70%. (184-5)
Philip has an argument for why the supermajoritarian position avoids this particular problem, but it seems to me that it runs into a very similar one. Change the case so that there are four people in the group, and three propositions under discussion. Their opinions are:
* A, B and C think _p_ is true, D thinks it is false.
* A, B and D think _q_ is true, C thinks it is false.
* B, C and D think _r_ is true, A thinks it is false.
Now a 75% supermajority believes each of _p_, _q_ and _r_, and, since A, C and D each reject one of the conjuncts, a 75% supermajority also believes not-(p & q & r). Since 75% clears the 70% threshold, the supermajoritarian gets led into inconsistency just as before. So supermajoritarian deference also seems like a bad idea.
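Again for concreteness, here’s the same sort of sketch (mine, in Python, using the 70% threshold from the passage quoted above) confirming that all four propositions clear the bar:

bc. # Four voters' verdicts on p, q and r, with a 70% deference threshold
votes = {"A": {"p": True,  "q": True,  "r": False},
         "B": {"p": True,  "q": True,  "r": True},
         "C": {"p": True,  "q": False, "r": True},
         "D": {"p": False, "q": True,  "r": True}}
def share(test):
    # fraction of voters whose verdicts satisfy the test
    return sum(test(v) for v in votes.values()) / len(votes)
checks = [("p", lambda v: v["p"]),
          ("q", lambda v: v["q"]),
          ("r", lambda v: v["r"]),
          ("not-(p & q & r)", lambda v: not (v["p"] and v["q"] and v["r"]))]
for label, test in checks:
    print(label, share(test), share(test) >= 0.7)  # each line prints: 0.75 True

Deferring to every 70% supermajority therefore leaves you believing _p_, _q_, _r_ and not-(p & q & r) all at once.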
In the context of this paper, though, this isn’t a very big deal I think, since what I’m objecting to could be excised without damaging the main theme. What’s really important is the point Philip makes in leading off, about the key reason you shouldn’t defer on matters of extreme epistemic importance.
bq. It would be objectionably self-abasing to revise your belief on matters like intelligent design or the wrongness of abortion just in virtue of finding that others whom you respect take a different view. Or so most of us think. To migrate towards the views of others, even under the sorts of assumptions given, would seem to be an abdication of epistemic responsibility: a failure to take seriously the evidence as it presents itself to your own mind. (181)
That seems exactly right to me. The view Philip is arguing against here is a kind of epistemic collectivism, the view that we _should_ let our beliefs be guided by the majority opinion amongst suitably informed, suitably intelligent, suitably impartial judges. The view he is arguing for is a personal responsibility view. At the risk of coming off all Ronald Reagan on y’all, at some level people just have to take responsibility for their own beliefs, and not regard themselves as merely jury members with a single vote on the most important issues they consider. The collectivist view tends to run into technical problems, even paradoxes, when we try to develop it. But quite apart from that, it suggests a deeply unappealing picture of our epistemic agency, one that we should firmly reject.
Hopefully I’ll post more on this over the coming weeks, but for now there’s a football team to cheer for…