Joshua Greene (psychologist)

From Wikipedia, the free encyclopedia
Fields: experimental philosophy, experimental psychology, moral psychology
Institutions: Harvard University
Alma mater: Princeton University, Harvard University
Doctoral advisors: David Lewis and Gilbert Harman
Website: http://www.wjh.harvard.edu/~jgreene/

Joshua D. Greene is a Professor of Psychology at Harvard University and director of its Moral Cognition Lab. His work focuses on the intersection of psychology, neuroscience, and moral philosophy.

Education and career

Greene earned a bachelor's degree in philosophy at Harvard University in 1997. He then pursued a philosophy PhD at Princeton University under the supervision of David Lewis and Gilbert Harman. His 2002 dissertation, The Terrible, Horrible, No Good, Very Bad Truth About Morality and What to Do About It, argues against moral-realist language and in defense of non-realist utilitarianism as a better framework for resolving disagreements.[1] Greene served as a postdoctoral fellow at Princeton in the Neuroscience of Cognitive Control Laboratory before returning to Harvard in 2006.

Dual-process theory

Greene and colleagues have advanced a dual-process theory of moral judgment, which proposes that the brain contains two competing moral subsystems:

  1. Emotional, intuitive, deontological judgments (e.g., don't push the fat man off the footbridge in the trolley problem), and
  2. Rational, calculated, utilitarian judgments (e.g., push the fat man off the footbridge to save a greater number of lives).

In one of the first experiments to suggest a moral dual-process model,[2] Greene and coauthors showed that "personal" moral dilemmas, such as whether to push the fat man off the footbridge, engaged several emotion-associated brain regions that were not activated by more impersonal moral or non-moral choices. They also showed that, on these "personal" dilemmas, subjects who made the intuitively unappealing (utilitarian) choice had longer reaction times than those who made the more emotionally congenial choice.

A follow-up study[3] compared "easy" personal moral questions, to which subjects responded quickly, with "hard" dilemmas (like the footbridge problem), to which they responded slowly. When responding to the hard problems, subjects displayed increased activity in the anterior dorsolateral prefrontal cortex (DLPFC) and inferior parietal lobes, areas associated with cognitive processing, as well as in the anterior cingulate cortex, which has been implicated in detecting conflict between two competing inputs, as in the Stroop task. This comparison showed that harder problems recruit different brain regions, but not that activity differs for the same moral problem depending on the answer given. The second part of the study addressed this: for a given question, subjects who made the utilitarian choice showed higher activity in the anterior DLPFC and the right inferior parietal lobe than subjects who made the non-utilitarian choice.

These two studies were correlational, but subsequent work has suggested a causal influence of emotional versus cognitive processing on deontological versus utilitarian judgments.[4][5][6] A 2008 study[7] by Greene and colleagues showed that cognitive load lengthened response times when subjects made utilitarian moral judgments but had no effect on response times for non-utilitarian judgments, suggesting that utilitarian reasoning requires extra cognitive effort.

Moral Tribes

Drawing on dual-process theory, as well as evolutionary psychology and other neuroscience work, Greene's book Moral Tribes (2013) explores how our ethical intuitions play out in the modern world.[8] Greene posits that humans have an instinctive, automatic tendency to cooperate with others in their social group in tragedy-of-the-commons scenarios ("me versus us"). For example, in a cooperative investment game, people are more likely to do what is best for the group when they are under time pressure or when they are primed to "go with their gut"; conversely, cooperation can be inhibited by rational calculation.[9] However, on questions of inter-group harmony ("us versus them"), automatic intuitions run into a problem, which Greene calls the "tragedy of commonsense morality": the same ingroup loyalty that achieves cooperation within a community leads to hostility between communities. In response, Greene proposes a "metamorality" based on a "common currency" that all humans can agree upon, and suggests that utilitarianism—or, as he calls it, "deep pragmatism"—is up to the task.[10]

Reception

Moral Tribes received several positive reviews.[11][12] Thomas Nagel critiques the book, suggesting that Greene is too quick to infer utilitarianism specifically from the general goal of constructing an impartial morality; Kant and Rawls, he notes, offer other impartial approaches to ethical questions.[10]

Robert Wright calls[13] Greene's proposal for global harmony ambitious and adds, "I like ambition!" But he also claims that people have a tendency to see facts in a way that serves their ingroup, even if there's no disagreement about the underlying moral principles that govern the disputes. "If indeed we’re wired for tribalism," Wright explains, "then maybe much of the problem has less to do with differing moral visions than with the simple fact that my tribe is my tribe and your tribe is your tribe. Both Greene and Paul Bloom cite studies in which people were randomly divided into two groups and immediately favored members of their own group in allocating resources -- even when they knew the assignment was random." Instead, Wright proposes that "nourishing the seeds of enlightenment indigenous to the world’s tribes is a better bet than trying to convert all the tribes to utilitarianism -- both more likely to succeed, and more effective if it does."

Bibliography

  • Greene, Joshua D.; Sommerville, R. Brian; Nystrom, Leigh E.; Darley, John M.; Cohen, Jonathan D. (2001). "An fMRI investigation of emotional engagement in moral judgment". Science 293 (5537): 2105–2108. doi:10.1126/science.1062872. PMID 11557895. (1866 citations as of 30 Nov. 2013)
  • Greene, Joshua D.; Nystrom, Leigh E.; Engell, Andrew D.; Darley, John M.; Cohen, Jonathan D. (2004). "The neural bases of cognitive conflict and control in moral judgment". Neuron 44 (2): 389–400. doi:10.1016/j.neuron.2004.09.027. (962 citations as of 30 Nov. 2013)
  • Greene, Joshua; Haidt, Jonathan (2002). "How (and where) does moral judgment work?". Trends in Cognitive Sciences 6 (12): 517–523. doi:10.1016/S1364-6613(02)02011-9. (901 citations as of 30 Nov. 2013)

References

  1. ^ Greene, Joshua. "The Terrible, Horrible, No Good, Very Bad Truth about Morality and What to Do About it". Retrieved 24 November 2013. 
  2. ^ Greene, Joshua D.; Sommerville, R. Brian; Nystrom, Leigh E.; Darley, John M.; Cohen, Jonathan D. (2001). "An fMRI investigation of emotional engagement in moral judgment". Science 293 (5537): 2105–2108. doi:10.1126/science.1062872. PMID 11557895. 
  3. ^ Greene, Joshua D.; Nystrom, Leigh E.; Engell, Andrew D.; Darley, John M.; Cohen, Jonathan D. (2004). "The neural bases of cognitive conflict and control in moral judgment". Neuron 44 (2): 389–400. doi:10.1016/j.neuron.2004.09.027. 
  4. ^ Mendez, M. F.; Anderson, E.; Shapira, J. S. (2005). "An investigation of moral judgement in frontotemporal dementia". Cognitive and Behavioral Neurology 18 (4): 193–197. doi:10.1097/01.wnn.0000191292.17964.bb. 
  5. ^ Koenigs, M.; Young, L.; Adolphs, R.; Tranel, D.; Cushman, F.; Hauser, M. et al. (2007). "Damage to the prefrontal cortex increases utilitarian moral judgments". Nature 446 (7138): 908–911. doi:10.1038/nature05631. 
  6. ^ Valdesolo, P.; DeSteno, D. (2006). "Manipulations of emotional context shape moral judgment". Psychological Science 17 (6): 476–477. doi:10.1111/j.1467-9280.2006.01731.x. 
  7. ^ Greene, Joshua D.; Morelli, Sylvia A.; Lowenberg, Kelly; Nystrom, Leigh E.; Cohen, Jonathan D. (2008). "Cognitive load selectively interferes with utilitarian moral judgment". Cognition 107 (3): 1144–1154. doi:10.1016/j.cognition.2007.11.004. 
  8. ^ Greene, Joshua (2013). Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. Penguin Press. ISBN 978-1594202605. 
  9. ^ Greene, Joshua D. "Deep Pragmatism". Edge. Retrieved 24 November 2013. 
  10. ^ a b Nagel, Thomas. "You Can't Learn About Morality from Brain Scans: The problem with moral psychology". New Republic. Retrieved 24 November 2013. 
  11. ^ Waytz, Adam (2 November 2013). "‘Moral Tribes’ by Joshua Greene". Boston Globe. Retrieved 24 November 2013. 
  12. ^ "Moral Tribes: Emotion, Reason, and the Gap Between Us and Them". Kirkus Reviews. 19 August 2013. Retrieved 24 November 2013. 
  13. ^ Wright, Robert (23 October 2013). "Why Can't We All Just Get Along? The Uncertain Biological Basis of Morality". The Atlantic. Retrieved 24 November 2013. 
