Bias blind spot

From Wikipedia, the free encyclopedia

The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others, while failing to see the impact of biases on one's own judgment.[1] The term was coined by Emily Pronin, a social psychologist in Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross.[2] The bias blind spot is named after the visual blind spot.

Causes of bias blindness

The bias blind spot may be caused by a variety of other biases and self-deceptions.[3]

Self-enhancement biases may play a role, in that people are motivated to view themselves in a positive light. Biases are generally seen as undesirable,[4] so people tend to think of their own perceptions and judgments as being rational, accurate, and free of bias. The self-enhancement bias also applies when analyzing our own decisions, in that people are likely to think of themselves as better decision makers than others.[3]

People also tend to believe they are aware of "how" and "why" they make their decisions, and therefore conclude that bias did not play a role. However, many decisions are shaped by biases and cognitive shortcuts, which are unconscious processes. By definition, people are unaware of unconscious processes, and therefore cannot see their influence in the decision-making process.[3]

Research has shown that even when people are made aware of various biases acting on their perception, decisions, or judgments, they are still unable to control them. This contributes to the bias blind spot in that even if a person is told that they are biased, they are unable to alter their biased perception.[3]

Role of introspection

Emily Pronin and Matthew Kugler have argued that this phenomenon is due to the introspection illusion.[5] In their experiments, subjects had to make judgments about themselves and about other subjects.[6] They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters then explained cognitive bias and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own bias and that of others.

Pronin and Kugler's interpretation is that, when people decide whether someone else is biased, they use overt behaviour. On the other hand, when assessing whether or not they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives.[5] Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as reliable indication that they themselves, unlike other people, are immune to bias.[6]

Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias.[6] Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.

Differences of perceptions

People tend to attribute bias in an uneven way. When people reach different perceptions, they each tend to label the other person as biased and themselves as accurate and unbiased. Pronin hypothesizes that this misattribution of bias may be a source of conflict and misunderstanding between people. For example, in labeling another person as biased, one may also interpret that person's intentions cynically. But when examining their own cognitions, people judge themselves based on their good intentions. In this case, one may attribute another's bias to "intentional malice" rather than an unconscious process.[7]

Pronin also hypothesizes ways to use awareness of the bias blind spot to reduce conflict and to think in a more "scientifically informed" way. Although people are unable to control the influence of bias on their own cognitions,[3] one may keep in mind that biases act on everyone. Pronin suggests that people might use this knowledge to separate others' intentions from their actions.[7]

References

  1. ^ Pronin, E., Lin, D., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28(3), 369-381.
  2. ^ Emily Pronin, Center for Behavioral Decision Research
  3. ^ a b c d e Antony, P. (2009). Unconscious bias and the limits of director independence. University of Illinois Law Review, (1), 237-294. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1392625
  4. ^ Pronin, E. (2007). Perception and misperception of bias in human judgment. Trends in Cognitive Sciences, 11(1), 37-43.
  5. ^ a b Gilovich, Thomas; Nicholas Epley; Karlene Hanko (2005). "Shallow Thoughts About the Self: The Automatic Components of Self-Assessment". In Mark D. Alicke; David A. Dunning; Joachim I. Krueger (eds.). The Self in Social Judgment. Studies in Self and Identity. New York: Psychology Press. p. 77. ISBN 978-1-84169-418-4.
  6. ^ a b c Pronin, Emily; Matthew B. Kugler (July 2007). "Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot". Journal of Experimental Social Psychology (Elsevier) 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011. ISSN 0022-1031. 
  7. ^ a b Pronin, E. (2008). How we see ourselves and how we see others. Science, 320(5880), 1177-1180.