Cognitive miser

From Wikipedia, the free encyclopedia

In psychology, the human mind is considered to be a cognitive miser due to the tendency of people to think and solve problems in simpler and less effortful ways rather than in more sophisticated and more effortful ways, regardless of intelligence.[1] Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending computational effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain how and why people are cognitive misers.[2][3]

The term cognitive miser was first introduced by Susan Fiske and Shelley Taylor in 1984. It is an important concept in social cognition theory and has been influential in other social sciences, including, but not limited to, economics and political science.[2]


The naïve scientist and attribution theory

Before Fiske and Taylor's cognitive miser theory, the predominant model of social cognition was the naïve scientist. First proposed in 1958 by Fritz Heider in The Psychology of Interpersonal Relations, this theory holds that humans think and act with dispassionate rationality, engaging in detailed and nuanced thought processes for both complex and routine actions.[4] In this way, humans were thought to think like scientists, albeit naïve ones, measuring and analyzing the world around them. Applied to human thought processes, naïve scientists seek the consistency and stability that come from a coherent view of the world and a need for environmental control.[5][page needed]

In order to meet these needs, naïve scientists make attributions.[6][page needed] Thus, attribution theory emerged from the study of the ways in which individuals assess causal relationships and mechanisms.[7] Through the study of causal attributions, led by Harold Kelley and Bernard Weiner amongst others, social psychologists began to observe that subjects regularly demonstrate several attributional biases including but not limited to the fundamental attribution error.[8]

The study of attributions had two effects: it created further interest in testing the naïve scientist model, and it opened up a new wave of social psychology research that questioned the model's explanatory power. This second effect helped to lay the foundation for Fiske and Taylor's cognitive miser.[5][page needed]


Heuristics
Much of the cognitive miser theory is built upon work done on heuristics in judgment and decision-making,[9][page needed] most notably Amos Tversky and Daniel Kahneman's results published in a series of influential articles.[10][11][12] Heuristics can be defined as the "judgmental shortcuts that generally get us where we need to go—and quickly—but at the cost of occasionally sending us off course."[13] In their work, Kahneman and Tversky demonstrated that people rely upon different types of heuristics, or mental shortcuts, in order to save time and mental energy.[12] However, relying upon heuristics instead of detailed analysis, like the information processing employed by Heider's naïve scientist, makes biased information processing more likely.[5][page needed] Some of these heuristics include the representativeness heuristic (the inclination to assign specific attributes to an individual the more closely he or she matches the prototype of that group),[10] the availability heuristic (the inclination to judge the likelihood of an event by the ease with which examples of it come to mind),[5][page needed][10] and the anchoring heuristic (the inclination to overweight the importance and influence of an initial piece of information).[12] The frequency with which Kahneman, Tversky, and other attribution researchers found that individuals employed mental shortcuts to make decisions and assessments laid important groundwork for the overarching idea that individuals and their minds act efficiently rather than analytically.[9][page needed]

The cognitive miser theory

The wave of research on attributional biases done by Kahneman, Tversky, and others effectively ended the dominance of Heider's naïve scientist within social psychology.[9] Fiske and Taylor, building upon the prevalence of heuristics in human cognition, offered their theory of the cognitive miser. It is, in many ways, a unifying theory which suggests that humans engage in economically prudent thought processes, instead of acting like scientists who rationally weigh costs and benefits, test hypotheses, and update expectations based upon the results of the experiments that are our everyday actions.[2] In other words, humans are more inclined to act as cognitive misers, using mental shortcuts to make assessments and decisions about issues and ideas they know very little about, as well as issues of great salience. Fiske and Taylor argue that acting as cognitive misers is rational given the sheer volume and intensity of information and stimuli humans take in.[2][14] However, other psychologists also argue that the cognitively miserly tendency of humans is a primary reason why "humans are often less than rational".[3]


The implications of this theory raise important questions about both cognition and human behavior. In addition to streamlining cognition in complicated, analytical tasks, cognitive misers are also at work when dealing with unfamiliar issues as well as issues of great importance.[2][14] Voting behavior in democracies is one arena in which the cognitive miser is at work. Acting as a cognitive miser should lead those with expertise in an area to more efficient information processing and streamlined decision making.[15] However, as Lau and Redlawsk note, acting as a cognitive miser who employs heuristics can have very different results for high-information and low-information voters. They write, "...cognitive heuristics are at times employed by almost all voters, and that they are particularly likely to be used when the choice situation facing voters is complex... heuristic use generally increases the probability of a correct vote by political experts but decreases the probability of a correct vote by novices."[15] In democracies, where no vote is weighted more or less because of the expertise behind its casting, low-information voters, acting as cognitive misers, can make choices with broad and potentially deleterious consequences for a society.[15]

Updates and later research

Later models suggest that the cognitive miser and the naïve scientist create two poles of social cognition that are too monolithic. Instead, Fiske and Taylor, Arie W. Kruglanski, and other social psychologists offer an alternative explanation of social cognition: the motivated tactician.[2] According to this theory, people employ either shortcuts or thoughtful analysis based upon the context and salience of a particular issue. In other words, this theory suggests that humans are, in fact, both naïve scientists and cognitive misers.[5][page needed]

References
  1. ^ Stanovich, Keith E. (2009). "The cognitive miser: ways to avoid thinking". What intelligence tests miss: the psychology of rational thought. New Haven: Yale University Press. pp. 70–85. ISBN 9780300123852. OCLC 216936066. See also other chapters in the same book: "Framing and the cognitive miser" (chapter 7); "A different pitfall of the cognitive miser: thinking a lot, but losing" (chapter 9).
  2. ^ a b c d e f Fiske, Susan T.; Taylor, Shelley E. (1991) [1984]. Social cognition (2nd ed.). New York: McGraw-Hill. ISBN 978-0070211919. OCLC 22810253.
  3. ^ a b Toplak, Maggie E.; West, Richard F.; Stanovich, Keith E. (April 2014). "Assessing miserly information processing: an expansion of the Cognitive Reflection Test". Thinking & Reasoning. 20 (2): 147–168. doi:10.1080/13546783.2013.844729.
  4. ^ Heider, Fritz (1958). The psychology of interpersonal relations (1st ed.). New York: John Wiley & Sons. ISBN 978-0898592825. OCLC 225326.
  5. ^ a b c d e Crisp, Richard J.; Turner, Rhiannon N. (2014). Essential social psychology (3rd ed.). New York: Sage Publications. ISBN 9781446270769. OCLC 873005953.
  6. ^ Kassin, Saul; Fein, Steven; Markus, Hazel Rose (2016). Social psychology (10th ed.). Cengage Learning. ISBN 9781305580220. OCLC 952391832.
  7. ^ Ross, Lee (1977). "The intuitive psychologist and his shortcomings: distortions in the attribution process". In Berkowitz, Leonard (ed.). Advances in experimental social psychology. 10. New York: Academic Press. pp. 173–220. ISBN 978-0120152100. OCLC 1283539.
  8. ^ Jones, Edward E.; Harris, Victor A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0.
  9. ^ a b c Barone, David F.; Maddux, James E.; Snyder, Charles R. (1997). Social cognitive psychology: history and current domains (1st ed.). New York: Plenum Press. ISBN 978-0306454752. OCLC 36330837.
  10. ^ a b c Kahneman, Daniel; Tversky, Amos (1973). "On the psychology of prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747.
  11. ^ Tversky, Amos; Kahneman, Daniel (1973). "Availability: a heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.
  12. ^ a b c Tversky, Amos; Kahneman, Daniel (1974). "Judgment under uncertainty: heuristics and biases". Science. 185 (4157): 1124–1131. doi:10.1126/science.185.4157.1124. PMID 17835457.
  13. ^ Gilovich, Thomas; Savitsky, Kenneth (1996). "Like goes with like: the role of representativeness in erroneous and pseudoscientific beliefs" (PDF). The Skeptical Inquirer. 20 (2): 34–40. Archived from the original on 2016-03-07.
  14. ^ a b Scheufele, Dietram A.; Lewenstein, Bruce V. (17 May 2005). "The public and nanotechnology: how citizens make sense of emerging technologies". Journal of Nanoparticle Research. 7 (6): 659–667 [660]. doi:10.1007/s11051-005-7526-2.
  15. ^ a b c Lau, Richard R.; Redlawsk, David P. (4 Oct 2001). "Advantages and disadvantages of cognitive heuristics in political decision making". American Journal of Political Science. 45 (4): 951–971. doi:10.2307/2669334. JSTOR 2669334.
