Game Theory and Uncertainty

This semester I ended up teaching some stuff on the risk/uncertainty distinction and some stuff on game theory in fairly quick succession, and that got me thinking about how the two might interact. In particular, I started thinking about what would happen if you expanded the definition of mixed strategies to include strategies where each option was given an uncertain, as opposed to merely a risky, probability of being played. (E.g. you play A if p is true and B if ~p is true, where p is a proposition about which you don’t have a numerical probability judgment.)

The upshot is that in some cases you radically expand the range of Nash equilibria. In particular, I discovered a game that by orthodox lights has a unique Nash equilibrium, the pure strategy of playing option C, but which, once uncertainty-based mixtures are allowed, suddenly includes a new equilibrium in which both players play a mixture of A and B. Just what the philosophical implications of this are is unclear, and I haven’t done much to clarify them. In earlier work I’d argued that as important as the risk/uncertainty distinction is to epistemology, it isn’t that important to decision theory. Indeed, I argued it can be effectively ignored in decision theory. If you need to pay careful attention to the risk/uncertainty distinction to work out all the Nash equilibria in a game, that’s more evidence that the equilibrium concepts that game theorists toss around are radical departures from standard single-person decision theory. I could be totally confused here though, and I’m much more confident in the technical result than in any philosophical conclusions drawn from it. The technical part is written up in this short note.

bq. “Game Playing Under Ignorance”:http://brian.weatherson.org/gpui.pdf
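For readers who want to experiment with the orthodox side of this, here is a minimal sketch of how one checks a game for pure-strategy Nash equilibria. The payoff matrix below is a made-up illustration, not the game from the note above: it bolts a matching-pennies subgame on A and B onto a safe option C that is a strict best reply to itself, so (C, C) comes out as the only pure equilibrium.

```python
from itertools import product

# Hypothetical 3x3 game (NOT the game from the linked note): a
# matching-pennies subgame on A/B plus a safe option C that is a
# strict best reply to itself.
STRATS = ["A", "B", "C"]

# PAYOFFS[(row, col)] = (row player's payoff, column player's payoff)
PAYOFFS = {
    ("A", "A"): (1, -1), ("A", "B"): (-1, 1), ("A", "C"): (0, 0),
    ("B", "A"): (-1, 1), ("B", "B"): (1, -1), ("B", "C"): (0, 0),
    ("C", "A"): (0, 0),  ("C", "B"): (0, 0),  ("C", "C"): (2, 2),
}

def pure_nash_equilibria(payoffs, strats):
    """Return all pure-strategy profiles with no profitable unilateral deviation."""
    eqs = []
    for r, c in product(strats, repeat=2):
        # Row player cannot improve by switching rows against column c.
        row_best = all(payoffs[(r, c)][0] >= payoffs[(d, c)][0] for d in strats)
        # Column player cannot improve by switching columns against row r.
        col_best = all(payoffs[(r, c)][1] >= payoffs[(r, d)][1] for d in strats)
        if row_best and col_best:
            eqs.append((r, c))
    return eqs

print(pure_nash_equilibria(PAYOFFS, STRATS))  # → [('C', 'C')]
```

The interesting part of the note, of course, is what happens when the A/B mixing probabilities are uncertain rather than merely risky, and that step isn’t something a brute-force checker over numerical probabilities can capture.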