Social intuitionism

From Wikipedia, the free encyclopedia

In moral psychology, social intuitionism is a model that proposes that moral positions and judgments are: (1) primarily intuitive ("intuitions come first"), (2) rationalized, justified, or otherwise explained after the fact, (3) taken mainly to influence other people, and are (4) often influenced and sometimes changed by discussing such positions with others.[1]

This model diverges from earlier rationalist theories of morality, such as Lawrence Kohlberg's stage theory of moral reasoning.[2] Jonathan Haidt (2001) de-emphasizes the role of reasoning in reaching moral conclusions. Haidt asserts that moral judgment arises primarily from intuition, with reasoning playing a smaller role in most of our moral decision-making; conscious thought processes serve as a kind of post hoc justification of our decisions.

His main evidence comes from studies of "moral dumbfounding",[3] in which people have strong moral reactions but fail to establish any kind of rational principle to explain their reaction.[4] An example situation in which moral intuitions are activated is as follows: Imagine that a brother and sister sleep together once. No one else knows, no harm befalls either one, and both feel it brought them closer as siblings. Most people imagining this incest scenario have a very strong negative reaction, yet cannot explain why.[5] Referring to earlier studies by Howard Margolis[6] and others, Haidt suggests that we have unconscious intuitive heuristics which generate our reactions to morally charged situations and underlie our moral behavior. He suggests that when people explain their moral positions, they often miss, if not hide, the core premises and processes that actually led to those conclusions.[7]


Haidt's model also states that moral reasoning is more likely to be interpersonal than private, reflecting social motives (reputation, alliance-building) rather than abstract principles. He does grant that interpersonal discussion (and, on very rare occasions, private reflection) can activate new intuitions which will then be carried forward into future judgments.

Reasons to doubt the role of cognition

Haidt (2001) lists four reasons to doubt the cognitive primacy model championed by Kohlberg and others.[8]

  1. There is considerable evidence that many evaluations, including moral judgments, take place automatically, at least in their initial stages (and these initial judgments anchor subsequent judgments).
  2. The moral reasoning process is highly biased by two sets of motives, which Haidt labels "relatedness" motives (relating to managing impressions and having smooth interactions with others) and "coherence" motives (preserving a coherent identity and worldview).
  3. The reasoning process has repeatedly been shown to create convincing post hoc justifications for behavior that are believed by people despite not actually correctly describing the reason underlying the choice. This was demonstrated in a classic paper by Nisbett and Wilson (1977).
  4. According to Haidt, moral action covaries more with moral emotion than with moral reasoning.

Objections to Haidt's model

Joseph Paxton and Joshua Greene (2010) review evidence suggesting that moral reasoning plays a significant role in moral judgment, including counteracting automatic tendencies toward bias.[9] Greene and colleagues have proposed an alternative to the social intuitionist model, the dual process model,[10] which suggests that deontological moral judgments, which involve rights and duties, are driven primarily by intuition, while utilitarian judgments aimed at promoting the greater good are driven by controlled cognitive reasoning processes.

Other researchers have criticized the evidence cited in support of social intuitionism relating to moral dumbfounding, arguing these findings rely on a misinterpretation of participants' responses.[11][12]


  1. ^ Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. Loc. 913, Kindle ed. ISBN 978-0307377906.
  2. ^ Levine, Charles; Kohlberg, Lawrence; Hewer, Alexandra (1985). "The Current Formulation of Kohlberg's Theory and a Response to Critics". Human Development. 28 (2): 94–100. doi:10.1159/000272945.
  3. ^ McHugh, Cillian; McGann, Marek; Igou, Eric R.; Kinsella, Elaine L. (2017-10-04). "Searching for Moral Dumbfounding: Identifying Measurable Indicators of Moral Dumbfounding". Collabra: Psychology. 3 (1). doi:10.1525/collabra.79. ISSN 2474-7394.
  4. ^ Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. Loc. 539, Kindle ed. In footnote 29, Haidt credits the coinage of the term moral dumbfounding to social/experimental psychologist Daniel Wegner.
  5. ^ Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. Loc. 763, Kindle ed.
  6. ^ Grover, Burton L. (1989-06-30). "Patterns, Thinking, and Cognition: A Theory of Judgment by Howard Margolis. Chicago: University of Chicago Press, 1987, 332 pp. (ISBN 0-226-50527-8)". The Educational Forum. 53 (2): 199–202. doi:10.1080/00131728909335595. ISSN 0013-1725.
  7. ^ Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. Loc. 1160, Kindle ed.
  8. ^ Haidt, Jonathan (2001). "The emotional dog and its rational tail: A social intuitionist approach to moral judgment". Psychological Review. 108 (4): 814–834.
  9. ^ Paxton, Joseph M.; Greene, Joshua D. (13 May 2010). "Moral Reasoning: Hints and Allegations". Topics in Cognitive Science. 2 (3): 511–527. doi:10.1111/j.1756-8765.2010.01096.x. PMID 25163874.
  10. ^ Greene, J. D. (14 September 2001). "An fMRI Investigation of Emotional Engagement in Moral Judgment". Science. 293 (5537): 2105–2108. doi:10.1126/science.1062872. PMID 11557895.
  11. ^ Guglielmo, Steve (January 2018). "Unfounded dumbfounding: How harm and purity undermine evidence for moral dumbfounding". Cognition. 170: 334–337. doi:10.1016/j.cognition.2017.08.002. PMID 28803616.
  12. ^ Royzman, Edward B.; Kim, Kwanwoo; Leeman, Robert F. (2015). "The curious tale of Julie and Mark: Unraveling the moral dumbfounding effect". Judgment and Decision Making. 10 (4): 296–313.