Curse of knowledge


The curse of knowledge is a cognitive bias that leads better-informed parties to find it extremely difficult to think about problems from the perspective of lesser-informed parties. The effect was first described in print by the economists Colin Camerer, George Loewenstein and Martin Weber, though they give original credit for suggesting the term to Robin Hogarth.[1]


Although the economists Colin Camerer, George Loewenstein, and Martin Weber were the first to coin the term "curse of knowledge" and to describe and effectively define the phenomenon, they were, by their own account, not the first to document or study the effect. In their publication they state that "All the previous evidence of the curse of knowledge has been gathered in psychological studies of individual judgments", referring readers to Baruch Fischhoff's 1975 work, which also concerns the hindsight bias.[1]


From these origins, Camerer, Loewenstein, and Weber first applied the curse of knowledge to economics, both to explain why the assumption that better-informed agents can accurately anticipate the judgments of less-informed agents is not inherently true, and to support the finding that sales agents who are better informed about their products may, in fact, be at a disadvantage against less-informed agents when selling those products.[1][2] This is said to occur because better-informed agents fail to ignore the privileged knowledge they possess and are thus "cursed", unable to sell their products at a value that more naïve agents would deem acceptable.

More recently, researchers have linked the curse of knowledge to false-belief reasoning in both children and adults, as well as to difficulties in children's theory of mind development.[3] The bias is reportedly weaker in adults than in children, who experience exaggerated effects; however, it was also found that for adults, "knowledge becomes a more potent curse when it can be combined with a rationale (even if only an implicit one) for inflating one's estimates of what others know".

In one experiment, one group of subjects "tapped" a well-known song on a table while another group listened and tried to identify it. Some "tappers" described a rich sensory experience in their minds as they tapped out the melody. Tappers on average estimated that 50% of listeners would identify the specific tune; in reality, only 2.5% of listeners could.[4][5] Related to this finding is the phenomenon experienced by players of charades: the actor may find it frustratingly hard to believe that his or her teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.

It has also been suggested that the curse of knowledge could contribute to the difficulty of teaching.[6]

References


  1. Camerer, Colin; Loewenstein, George & Weber, Martin (1989). "The curse of knowledge in economic settings: An experimental analysis". Journal of Political Economy 97: 1232–1254. doi:10.1086/261651.
  2. Birch, S. A. J. & Bernstein, D. (2007). "What kids can tell us about hindsight bias: A fundamental constraint on perspective-taking?". Social Cognition 25: 78–97.
  3. Birch, S. A. J. & Bloom, P. (2007). "The curse of knowledge in reasoning about false beliefs". Psychological Science 18: 382–386.
  4. Heath, Chip & Heath, Dan (2007). Made to Stick. Random House.
  5. Ross, L. & Ward, A. (1996). "Naive realism in everyday life: Implications for social conflict and misunderstanding". In T. Brown, E. S. Reed & E. Turiel (Eds.), Values and knowledge (pp. 103–135). Hillsdale, NJ: Erlbaum.
  6. Wieman, Carl (2007). "The 'Curse of Knowledge', or Why Intuition About Teaching Often Fails". APS News 16 (10). Retrieved 8 March 2012.