Social intuitionism

From Wikipedia, the free encyclopedia

In moral psychology, social intuitionism proposes that moral judgments and actions are caused more by intuition than by reasoning. It contrasts with earlier rationalist theories of morality, such as Lawrence Kohlberg's stage theory of moral reasoning. Jonathan Haidt (2001) greatly de-emphasizes the role of reasoning in reaching moral conclusions, asserting that moral judgment arises primarily from intuition, with reasoning playing only a marginal role in most moral decision-making. On this view, conscious thought processes serve as a kind of post hoc justification of decisions already made.

His main evidence comes from studies of "moral dumbfounding," in which people have strong moral reactions but fail to articulate any rational principle to explain them.[1] An example situation that activates moral intuitions is the following: Imagine that a brother and sister sleep together once. No one else knows, no harm befalls either one, and both feel it brought them closer as siblings. Most people imagining this incest scenario have a very strong negative reaction, yet cannot explain why.[2] Haidt suggests that unconscious intuitive heuristics[3] generate our reactions to morally charged situations and our moral behavior. When people do reason about morality, he argues, that reasoning is independent of the processes that actually produce their moral decisions.[4]

Haidt's model also holds that moral reasoning is more likely to be interpersonal than private, reflecting social motives rather than abstract principles. He grants that interpersonal discussion (and, on rare occasions, private reflection) can activate new intuitions, which are then carried forward into future judgments.

Reasons to doubt the role of cognition

Haidt (2001) lists four reasons to doubt the cognitive primacy model championed by Kohlberg and others.[5]

  1. There is considerable evidence that many evaluations, including moral judgments, take place automatically, at least in their initial stages (and these initial judgments anchor subsequent judgments).
  2. The moral reasoning process is highly biased by two sets of motives, which Haidt labels "relatedness" motives (relating to managing impressions and having smooth interactions with others) and "coherence" motives (preserving a coherent identity and worldview).
  3. The reasoning process has repeatedly been shown to create convincing post hoc justifications that people believe even though the justifications do not correctly describe the actual reasons for their choices. This was demonstrated in a classic paper by Nisbett and Wilson (1977).
  4. According to Haidt, moral action covaries more with moral emotion than with moral reasoning.

Objections to Haidt's model

Joseph Paxton and Joshua Greene (2010) review evidence suggesting that moral reasoning plays a significant role in moral judgment, including counteracting automatic tendencies toward bias.[6] Greene and colleagues have proposed an alternative to the social intuitionist model: deontological moral judgments, which involve rights and duties, are driven primarily by intuition, while utilitarian judgments aimed at promoting the greater good arise from controlled cognitive reasoning processes. Christopher Santos-Lang (2014) argued that machines exist of both the reasoning and intuitionist varieties, that each type has relative advantages, and that human evaluative diversity spans that of machines. On this view, Kohlberg and Haidt were both mistaken in trying to fit all of humanity into a single model, as though one type of person were morally best.[7][8]

References

  1. ^ Haidt, Jonathan (2012). The Righteous Mind. Pantheon.
  2. ^ Haidt, Jonathan (2012). The Righteous Mind. Pantheon.
  3. ^ Haidt, Jonathan (2012). The Righteous Mind. Pantheon.
  4. ^ Haidt, Jonathan (2012). The Righteous Mind. Pantheon.
  5. ^ Haidt, J. (2001). "The emotional dog and its rational tail: A social intuitionist approach to moral judgment." Psychological Review, 108, 814–834.
  6. ^ Paxton, J., & Greene, J. (2010). "Moral reasoning: Hints and allegations." Topics in Cognitive Science, 2, 511–527.
  7. ^ Santos-Lang, Christopher (2014). "Chapter 6: Moral Ecology Approaches" (PDF). In van Rysewyk, Simon; Pontier, Matthijs. Machine Medical Ethics. New York: Springer. pp. 74–96. 
  8. ^ Santos-Lang, Christopher (In Press). "Measuring computational evaluative differences in humans" (PDF). WCSED Working Paper. 
