I have been wondering something. When one is theorizing about which theory of normative ethics is correct, what is one to do about moral intuitions? Most people seem to agree that moral intuitions function as data for moral theorizing. But how exactly does that work? It seems to me that there are precisely three possibilities:

  1. The theory is always right when there is a conflict between what the theory implies and what people’s moral intuitions are
  2. The moral intuitions are always right when there is a conflict between what the theory implies and what people’s moral intuitions are
  3. The theory is sometimes right and the moral intuitions are sometimes right

None of these seem satisfactory. I examine them below.

1. The theory is always right when there is a conflict between what the theory implies and what people’s moral intuitions are

If this is the case, how does one test the theory? Or test theories against each other, for instance hedonistic utilitarianism vs. interest utilitarianism? It seems that moral intuitions are what theories are tested against. One question comes to mind: What percentage of moral intuitions is the theory supposed to fit? If 100%, then the theories are just theories about human psychology, which is interesting enough but seems not to be what people have in mind when they do normative ethics. If some lower percentage, then more than one theory could fit the same percentage of moral intuitions without fitting the same intuitions. How should one decide between the theories in such a case?

2. The moral intuitions are always right when there is a conflict between what the theory implies and what people’s moral intuitions are

In that case we don’t need any theory of normative ethics beyond: whatever human moral intuitions say is right. (Plus something about how to handle disagreement among intuitions.)

3. The theory is sometimes right and the moral intuitions are sometimes right

In that case: Under which conditions is the theory right, and under which conditions are the moral intuitions right?
