Curse of knowledge

From Wikipedia, the free encyclopedia

The curse of knowledge is a cognitive bias that leads better-informed parties to find it extremely difficult to think about problems from the perspective of lesser-informed parties. The effect was first described in print by the economists Colin Camerer, George Loewenstein and Martin Weber, though they give original credit for suggesting the term to Robin Hogarth.[1]

An example of this bias is a tailor selling clothes. Because the tailor has made a dress, he is intimately familiar with its craftsmanship, features, and fabric. When pricing the dress for sale, however, he needs to take the point of view of an uninformed customer, someone who walks into the store with no prior knowledge of the owner, the dressmaker, or how difficult the item was to make. However hard the tailor tries to take the customer's point of view, he cannot completely set aside his knowledge of the dress's quality, and will therefore assume the customer values the dress, and will pay for it, more than is actually true.

History of concept

The term "curse of knowledge" was coined in the Journal of Political Economy by economists Colin Camerer, George Loewenstein, and Martin Weber. The aim of their research was to document the idea that bad judgments can result from an inability to predict the actions of lesser-informed parties. They describe the economic impact of this bias as two-fold: better-informed parties may suffer losses in a deal when they should not, and the curse of knowledge can partly cancel out the market consequences of information asymmetry, the advantage one party gains by knowing more than the other.[1]

The idea that better-informed parties may suffer losses in a deal or exchange was seen as an important addition to economic theory. Most theoretical analyses of situations where one party knew less than the other had focused on how the lesser-informed party attempted to learn more information to minimize the asymmetry. These analyses, however, assume that better-informed parties can optimally exploit their informational advantage when, in fact, they cannot: people cannot ignore their additional information even when, in a bargaining situation, they should.[1]

For example, suppose two people are bargaining over how to divide money or provisions, and one party knows the size of the amount being divided while the other does not. To fully exploit his advantage, the informed party should make the same offer regardless of the amount to be divided.[2] In practice, however, informed parties offer more when the amount to be divided is larger.[3][4] Informed parties are unable to ignore their better information, even when they should.[1]

In their 1989 publication, Camerer, Loewenstein, and Weber state that: "All the previous evidence of the curse of knowledge has been gathered in psychological studies of individual judgments," referring readers to Baruch Fischhoff's work from 1975 on hindsight bias, a cognitive bias whereby knowing the outcome of an event makes it seem more predictable than it actually was. Fischhoff's work examined, in particular, whether individuals can recognize and account for this bias.[5]

In one experiment, Fischhoff showed that participants did not know that their outcome knowledge affected their responses, and that even if they did know, they could still not ignore or defeat the effects of the bias.[5] These participants could not accurately reconstruct their previous, less knowledgeable states, which directly relates to the curse of knowledge. Fischhoff theorized that this poor reconstruction occurred because the participant was "anchored in the hindsightful state of mind created by receipt of knowledge."[5] This receipt of knowledge returns to the idea of the curse proposed by Camerer, Loewenstein, and Weber: a knowledgeable person cannot accurately reconstruct what a person without the knowledge, be it themselves or someone else, would think or how they would act.

In his paper, Fischhoff questions our failure to empathize with ourselves in less knowledgeable states, noting that how well people manage to reconstruct the perceptions of lesser-informed others is a crucial question for historians and for "all human understanding."[5]

Experimental evidence

Birch and Bloom found that people's ability to reason about another person's actions can be compromised by knowledge of the outcome of an event. The participant's perception of the plausibility of an event also mediated the extent of the bias: if the event was less plausible, knowledge was less of a "curse" than when there was a potential explanation for how the other person could act.[6] More recently, researchers have linked the curse of knowledge to false-belief reasoning in both children and adults, as well as to theory-of-mind development difficulties in children.

In another experiment related to the curse of knowledge, one group of participants "tapped" a well-known song on a table while another group listened and tried to identify the song. Some "tappers" described a rich sensory experience in their minds as they tapped out the melody. Tappers estimated that, on average, 50% of listeners would identify the specific tune; in reality, only 2.5% of listeners could do so.[7][8]

Related to this finding is the phenomenon experienced by players of charades: The actor may find it frustratingly hard to believe that his or her teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.


In Camerer, Loewenstein, and Weber's article, it is noted that the setting closest in structure to the market experiments done would be underwriting, a task in which well-informed experts price goods that are sold to a less-informed public. Investment bankers value securities, experts taste cheese, store buyers observe jewelry being modeled, and theater owners see movies before they are released; they then sell those goods to a less-informed public. If they suffer from the curse of knowledge, high-quality goods will be overpriced and low-quality goods underpriced relative to optimal, profit-maximizing prices; prices will reflect characteristics (e.g., quality) that are unobservable to uninformed buyers.[1]

The curse of knowledge has a paradoxical effect in these settings. By making better-informed agents think that their knowledge is shared by others, the curse helps alleviate the inefficiencies that result from information asymmetries (a better-informed party having an advantage in a bargaining situation), bringing outcomes closer to those of complete information. In such settings, the curse on individuals may actually improve social welfare.


Economists Camerer, Loewenstein, and Weber first applied the curse of knowledge to economics in order to explain why the assumption that better-informed agents can accurately anticipate the judgments of lesser-informed agents does not hold in practice. They also sought to support the finding that sales agents who are better informed about their products may, in fact, be at a disadvantage against less-informed agents when selling their products. This is because better-informed agents fail to ignore the privileged knowledge they possess and are thus "cursed," unable to sell their products at a value that more naïve agents would deem acceptable.[1][9]

It has also been suggested that the curse of knowledge could contribute to the difficulty of teaching.[10] Because of the curse of knowledge, it may be ineffective, or even harmful, to reason about how students view and learn material from the teacher's perspective rather than from what students themselves have verified. The teacher already has the knowledge he is trying to impart, but the way that knowledge is conveyed may not be the best for those who do not yet have it.

References

  1. ^ a b c d e f g Camerer, Colin; George Loewenstein; Martin Weber (1989). "The curse of knowledge in economic settings: An experimental analysis". Journal of Political Economy 97: 1232–1254. doi:10.1086/261651. 
  2. ^ Myerson, Roger B. "Negotiation in Games: A Theoretical Overview". In Uncertainty, Information, and Communication: Essays in Honor of Kenneth J. Arrow, vol. 3, edited by Walter P. Heller, Ross M. Starr, and David A. Starrett. New York: Cambridge Univ. Press, 1986.
  3. ^ Forsythe, Robert; Kennan, John; and Sopher, Barry. "An Experimental Analysis of Bargaining and Strikes with One Sided Private Information." Working Paper no. 87-4. Iowa City: Univ. Iowa, Dept. Econ., 1987.
  4. ^ Banks, Jeff; Camerer, Colin F.; and Porter, David. "Experimental Tests of Nash Refinements in Signaling Games." Working paper. Philadelphia: Univ. Pennsylvania, Dept. Decision Sci., 1988.
  5. ^ a b c d Fischhoff, Baruch. "Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgment under Uncertainty". Journal of Experimental Psychology: Human Perception and Performance 1 (August 1975): 288–299.
  6. ^ Birch, S. A. J., & Bloom, P. (2007). "The curse of knowledge in reasoning about false beliefs". Psychological Science, 18, 382-386.
  7. ^ Heath, Chip; Dan Heath (2007). Made to Stick. Random House. 
  8. ^ Ross, L., & Ward, A. (1996). "Naive realism in everyday life: Implications for social conflict and misunderstanding". In T. Brown, E. S. Reed & E. Turiel (Eds.), Values and Knowledge, (pp. 103–135). Hillsdale, NJ: Erlbaum.
  9. ^ Birch, S. A. J., & Bernstein, D. (2007). "What kids can tell us about hindsight bias: A fundamental constraint on perspective-taking?", Social Cognition, 25, 78-97.
  10. ^ Wieman, Carl (2007). "The 'Curse of Knowledge', or Why Intuition About Teaching Often Fails". APS News. The Back Page 16 (10). Retrieved 8 March 2012.