Hall on Causation

Ned Hall has a paper in a recent Philosophical Studies where he defends a new account of causation. The crucial idea is that we have to distinguish between default and deviant states for certain events/objects. Applied to simple cases like neuron diagrams, the default state of a neuron is to not fire. The theory gets a few complicated cases right, but it seems not to get some even more complicated cases right. I'll state the theory (hopefully not too badly), then offer a counterexample to it. There's a little bit of interpretation here, because Hall never quite states the theory like this, so I might be horribly misinterpreting something.

C is a cause of E iff there are some events K such that

  • each event in K consists of some entity being in a deviant, rather than a default, state
  • had every event in K not happened, E would still have happened
  • had every event in K not happened, every event in the causal network of which C and E are parts either would have happened in the same way, or would have reverted to its default state
  • had every event in K not happened, E would have been counterfactually dependent on C
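As I read it, the test can be run mechanically over simple neuron diagrams. Here is a sketch of that idea; the encoding is my own reconstruction, not Hall's formalism. I'm assuming the default state of every neuron is not firing (so the candidate sets K are sets of actual firings), and modelling "had K not happened" by holding those neurons in their default state. The `fire` and `hall_cause` names and the layered evaluation order are my own inventions for illustration.

```python
# A sketch (mine, not Hall's) of the test stated above, over simple
# neuron diagrams. Assumptions: the default state of a neuron is not
# firing, so deviant events are exactly the actual firings, and "had K
# not happened" holds those neurons in their default state.
from itertools import combinations

def fire(rules, order, sources, off=frozenset()):
    """Evaluate the diagram in causal order; return the set of firings."""
    fired = set()
    for n in order:
        if n in off:
            continue  # held at its default (non-firing) state
        if n in sources or rules[n](fired):
            fired.add(n)
    return fired

def hall_cause(rules, order, sources, c, e):
    """Return a witnessing set K if c counts as a cause of e, else None."""
    actual = fire(rules, order, sources)
    if c not in actual or e not in actual:
        return None
    candidates = sorted(n for n in actual if n not in (c, e))
    for r in range(len(candidates) + 1):
        for k in combinations(candidates, r):
            off = frozenset(k)
            cf = fire(rules, order, sources, off)
            if e not in cf:
                continue        # E must still have happened
            if not cf <= actual:
                continue        # nothing may go from default to deviant
            if e in fire(rules, order, sources, off | {c}):
                continue        # E must now depend counterfactually on c
            return set(k)
    return None

# Early pre-emption: C fires E via D; A is a back-up that would have
# fired E via B, but D's firing cuts B off.
rules = {
    'C': lambda f: False, 'A': lambda f: False,   # exogenous sources
    'D': lambda f: 'C' in f,
    'B': lambda f: 'A' in f and 'D' not in f,
    'E': lambda f: 'D' in f or 'B' in f,
}
order = ['C', 'A', 'D', 'B', 'E']
print(hall_cause(rules, order, {'C', 'A'}, 'C', 'E'))  # {'A'}
```

On this toy pre-emption diagram the test delivers the right verdicts: C comes out a cause of E, with K the firing of the back-up A (removing A's firing restores the dependence of E on C without putting anything into a deviant state), while the pre-empted back-up A comes out a non-cause.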

Here is a kind of counterexample to it. First the diagram, then the description.

[Diagram: Counterexample to Hall]

  • A sets off a chain of events that, if unchecked, will result in E.
  • B initiates a threat to E.
  • C causes the threat to be cancelled early.
  • C also initiates a chain that, if left unchecked, will cancel the threat late.
  • D pre-empts this second chain, and will cancel the threat late if it is still present.

C causes E by defusing a threat to it, but it fails Hall's test. E isn't counterfactually dependent on C. If D hadn't happened, E would have been counterfactually dependent on C. But if D hadn't happened, an event that is actually at default, the one labelled G on the diagram, would have been in a deviant state, which violates the third condition above. So there is no set K of the required kind, and Hall's theory says C is not a cause of E. But that verdict is wrong, since C is what defuses the threat to E.

I think there are also cases where Hall’s theory mistakenly classifies a non-cause as a cause, but those cases are more contentious and I’ll leave them for another post.

3 Replies to “Hall on Causation”

  1. I used, of all things, Microsoft Publisher. It was pretty easy to set up once you're familiar with guidelines and the like.

    I'd thought it would be easy to use flowchart-type things, but I just found it easier to draw regular circles and arrows.

    15 years ago I could have done this thing in my sleep using Corel Draw. But I don’t even know if that still exists, and if it does it has probably got so fancy that it is hard to do simple diagrams with. In any case, I don’t have a copy of any version. (An old version of Corel Draw 3 or 5 might come in handy sometimes, so maybe I should try to dig one up!)

  2. Nice! Damn you, of course. Here's a somewhat simpler counterexample that doesn't rely on causation-by-threat-canceling (and I'm embarrassed to say that I already knew of this example, having devised it some time ago for another purpose): A sends signals to C and D. Simultaneously, B sends signals to D and E. The signal from A causes C to fire; likewise, the signal from B causes E to fire. But the simultaneous incoming signals from A and B cause D to short out, so it doesn't fire. If it had fired, it would have sent a stimulatory signal to F. Next, C and E, having fired, each send a signal to F, which is a stubborn neuron (needing two stimulatory signals in order to fire). F fires.

    The initial firings of A and B jointly cause F to fire. But there is dependence on neither A nor B: If A hadn’t fired, B would have caused both D and E to fire, and each in turn would have sent one of the two needed signals to F. Similarly if B hadn’t fired.

    Here, as in your example, we can save the account if we allow that a “reduction” of the actual situation can remove neurons themselves — thinking of the presence of these neurons as being a ‘deviation’ from a default state in which nothing is there at all. (That strikes me as unattractive, for various reasons I’ll skip over.) But otherwise there’s trouble: no nomologically possible situation in which strictly fewer firings happen is one in which F depends on A, or on B.
