We read two critics of the kind of position Taurek takes. (Glover is responding to a similar argument made by Elizabeth Anscombe while Kamm discusses Taurek directly.)
We mostly talked about Taurek himself, however; the critics we read did not occupy much of our discussion. Indeed, we barely got to Kamm at all.
Glover points out that Taurek’s math is funny. Taurek agrees that you would be required to save one person stranded on a rock and at risk of drowning as the tide comes in. Here 1 > 0. But if there were two people on one rock and one on another, they should have equal chances, provided there isn’t a special reason to favor one over the other. Here 2 = 1 or, if you like, the first 1 = 1 and the second 1 = 0. That is not consistent with 1 > 0.
If I were Taurek, I would say that this is what you get when you treat people as having infinite value or, more modestly, it’s what you should expect from a guy who denies that the numbers matter. Taurek’s whole point is that people aren’t like other things. Since you can’t add their value together you shouldn’t expect the mathematical operation of addition to work.
Zane was massively suspicious of Taurek on this point. He questioned Taurek’s assertion that he isn’t putting a value on different people’s lives; he questioned whether Taurek was really counting people equally, since Taurek seems to count each of the two as worth .5 when they’re in competition with one; and he said that he thinks you can add different people’s desires to live together, such that the numbers do make sense. Toward the end of class he said that Taurek gives each life an equal chance but that each life does not make an equal contribution. Zane was on to something here, and I promise to make up for my failure to come up with a decent paper topic about Parfit by having a topic about Taurek.
Emma and Kenny liked the point about the infinite worth of human beings. Jacob questioned whether it’s true.
We considered a bunch of ways of describing the case where there are two people on one rock (call them “A” and “C”), one person on the other rock (“B”), and only enough time to make it to one rock before the tide comes in.
Taurek describes it like this: we have a choice about whether to save two lives (A and C) or one (B). Assuming we don’t have any special reason for preferring one over the others, the fairest way to decide what to do is flip a coin, so everyone has an equal 50-50 chance of living.
Chris suggested that the way to give them equal chances would be to give each a one in three chance of being rescued. In effect, A and C would each have a two-thirds chance, assuming that if either A or C wins you would pick up the other.
Kenny wondered whether A and C shouldn’t have a 75% chance of being rescued. (I didn’t record his reasoning and can’t reconstruct it. Sorry Kenny!)
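To check the arithmetic behind these proposals, here is a small Python sketch (the names and rock assignments come from the case above; the variable names are mine). It computes each person’s exact chance of survival under Taurek’s coin flip and under Chris’s one-in-three lottery, on the assumption that whoever wins the lottery, you rescue everyone on the winner’s rock.

```python
from fractions import Fraction

people = ["A", "B", "C"]
# A and C share one rock; B is alone on the other.
same_rock = {"A": {"A", "C"}, "B": {"B"}, "C": {"A", "C"}}

# Taurek's coin flip: heads, rescue the rock with A and C; tails, rescue B.
# Every individual survives with probability 1/2.
coin_flip = {p: Fraction(1, 2) for p in people}

# Chris's proposal: draw one of the three names with equal probability,
# then rescue everyone on the winner's rock.
lottery = {
    p: sum(Fraction(1, 3) for winner in people if p in same_rock[winner])
    for p in people
}

print(coin_flip)  # everyone: 1/2
print(lottery)    # A and C: 2/3 each; B: 1/3
```

So on Chris’s method the two-person rock ends up with a two-thirds chance, just as he said; the coin flip instead gives each rock (and so each person) an even chance.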
I imagined a series of one-to-one comparisons like this:

A vs. B
C vs. no one
(and so on, for larger groups)
The idea is that treating people as equals means pitting equals against equals. On the first line, compare one person on one rock against one person on the other rock. Do the same on the second line. And the third, and so on.
If you compare the value of saving A with the value of saving B, you’ll see they’re equal and you have no basis for choosing one rather than the other. So the first line doesn’t tell you what to do. OK, so move to the next line. Compare the value of saving C against the value of saving no one. C wins. So you should rescue C.
When you rescue C, you will face the choice of rescuing A compared with rescuing no one. You will be at A’s rock when you get C and there isn’t time to get to B’s rock so it’s A or no one. A wins. So you should save C and A.
On my method of treating everyone as equals, the larger number will always win. Each line but the last will be a tie. The last member of the larger group will win compared with no one. When you save the last member of the larger group, it will make sense to save every other member of the larger group. Whether that is a good or bad feature of my method is up to you to decide.
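The method just described can be put as a short procedure (a toy Python sketch; the function name and representation are mine). Pair off the members of the two groups line by line: a person against a person is a tie and decides nothing, while a person against no one wins, so the larger group always carries the day.

```python
from itertools import zip_longest

def pairwise_choice(rock1, rock2):
    """Pit equals against equals, one line at a time.

    Each line compares one person from rock1 against one from rock2.
    Person vs. person is a tie and decides nothing; person vs. no one
    is a win for that person's whole rock.
    """
    for a, b in zip_longest(rock1, rock2):
        if a is not None and b is None:
            return rock1  # save everyone on rock1
        if b is not None and a is None:
            return rock2  # save everyone on rock2
    return None  # equal numbers: every line ties, no verdict

print(pairwise_choice(["A", "C"], ["B"]))  # ['A', 'C'] -- the larger group
print(pairwise_choice(["A"], ["B"]))       # None -- a tie; flip a coin
```

With equal groups every line is a tie and the method is silent, which is exactly the feature noted above: the larger number always wins, and ties are left to something like a coin flip.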
We didn’t talk much about Kamm. Here is what she does in a nutshell. Everything builds towards what she calls the Aggregation Argument (Kamm 1993, 85). Here’s the argument.

1. It is equally bad if A dies or if B dies.
2. It is worse if both B and C die than if B alone dies.
3. Therefore, it is worse if B and C die than if A dies.

Here is another way to put it (where “<” means “worse than” and “=” means “is equally bad as”).

A = B
B < B + C
Therefore, A < B + C
Throughout her presentation, she is concerned with this kind of reply to what she is saying: “there isn’t any such thing as its being better or worse, period; there’s only better and worse from someone’s point of view. So whenever you say it’s better that some people die than that other people die, your argument has to reflect a kind of bias that comes from taking up the point of view of one of the participants.” (She takes this to be Taurek’s main point.)
She tackles this in two steps:
She tries to show that changes can make the world worse even though it doesn’t seem that way from everyone’s point of view. She does this in §II by arguing that the world gets worse as different people get headaches. It’s worse when Jim gets a headache and still worse when Joe gets one even though neither Jim nor anyone else feels any more pain when Joe gets a headache than they did before he got a headache.
Then she tries to show that changes can leave the world just as bad as it was before even though it is made worse from some people’s point of view and better from other people’s point of view. She does this in §III by imagining that people’s headaches trade off: when Jim gets his, Joe’s goes away and vice versa.
Where she is going is that “better” and “worse” can be evaluated from an impersonal point of view rather than from anyone’s particular perspective.
Those are the points from today’s class that you should know or have an opinion about.
Glover, Jonathan. 1977. Causing Death and Saving Lives. New York: Penguin Books.
Kamm, F. M. 1993. Morality, Mortality: Volume I: Death and Whom to Save from It. Oxford: Oxford University Press.
Taurek, John. 1977. “Should the Numbers Count?” Philosophy & Public Affairs 6 (4): 293–316.