Heuristic (from the Greek "Εὑρίσκω", "I find" or "I discover") refers to experience-based techniques for problem solving, learning, and discovery that give a solution which is not guaranteed to be optimal. Where exhaustive search is impractical, heuristic methods are used to speed up the process of finding a satisfactory solution via mental shortcuts that ease the cognitive load of making a decision. Examples of this method include using a rule of thumb, an educated guess, an intuitive judgment, stereotyping, or common sense.
The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems.
- If you are having difficulty understanding a problem, try drawing a picture.
- If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
- If the problem is abstract, try examining a concrete example.
- Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
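The most basic of these strategies, trial and error, can be sketched in a few lines of Python. This is only an illustrative sketch (the function name and candidate range are not from the source): it guesses candidate values in order and checks each one, stopping at the first that fits.

```python
def trial_and_error(predicate, candidates):
    """Guess-and-check: return the first candidate that satisfies
    the predicate, or None if none of them does."""
    for x in candidates:
        if predicate(x):
            return x
    return None

# Find an integer solution of x**2 + x == 12 by trying values in order.
solution = trial_and_error(lambda x: x**2 + x == 12, range(-20, 21))
print(solution)  # -> -4 (the first satisfying value; x = 3 also works)
```

Like any heuristic, this yields a satisfactory answer, the first solution encountered, rather than an exhaustive enumeration of all solutions.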
In psychology, heuristics are simple, efficient rules, learned or hard-coded by evolutionary processes, that have been proposed to explain how people make decisions, come to judgments, and solve problems typically when facing complex problems or incomplete information. These rules work well under most circumstances, but in certain cases lead to systematic errors or cognitive biases.
Although much of the work of discovering heuristics in human decision-makers was done by the Israeli psychologists Amos Tversky and Daniel Kahneman, the concept was originally introduced by Nobel laureate Herbert A. Simon. Simon's original, primary object of research was problem solving; he showed that we operate within what he called bounded rationality. He coined the term "satisficing", which denotes the situation where people seek solutions, or accept choices or judgments, that are "good enough" for their purposes although they could be optimized.
Gerd Gigerenzer focused on the "fast and frugal" properties of heuristics, i.e., using heuristics in a way that is largely accurate and thus avoids most cognitive bias. In one line of research, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organizations rely on heuristics in an adaptive way. They also found that ignoring part of the information relevant to a decision, rather than weighing all the options, can actually lead to more accurate decisions.
Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, the Cognitive-Experiential Self-Theory (CEST) also takes an adaptive view of heuristic processing. CEST distinguishes two systems that process information. At some times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully, and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally. From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.
In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness. According to this theory, when somebody makes a judgment (of a "target attribute") that is computationally complex, a rather easier calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without being aware of this happening. This theory explains cases where judgments fail to show regression toward the mean. Heuristics can be considered to reduce the complexity of clinical judgements in healthcare.
Theorized psychological heuristics
- Anchoring and adjustment – Describes the common human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number (anchor). Children estimated the number of jellybeans to be closer to the anchor number that they were given. 
- Availability heuristic – A mental shortcut that occurs when people judge the probability of events by the ease with which examples come to mind. For example, in a 1973 Tversky and Kahneman experiment, the majority of participants reported that there were more words in the English language starting with the letter K than words with K as the third letter. In fact, there are twice as many English words with K as the third letter as words that start with K, but words that start with K are much easier to recall and bring to mind.
- Representativeness heuristic – A mental shortcut used when judging the probability of an event under uncertainty; that is, judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment, participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. 80–90% of participants responded that it was more likely for Linda to be both a feminist and a bank teller than just a bank teller, even though the likelihood of two events occurring together cannot be greater than that of either event individually. For this reason, the representativeness heuristic exemplifies the conjunction fallacy.
- Naïve diversification – When asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.
- Escalation of commitment – Describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit.
- Familiarity heuristic – A mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. Especially prevalent when the individual experiences a high cognitive load.
- Affect heuristic
- Contagion heuristic
- Effort heuristic
- Fluency heuristic
- Gaze heuristic
- Peak-end rule
- Recognition heuristic
- Scarcity heuristic
- Similarity heuristic
- Simulation heuristic
- Social proof
- Take-the-best heuristic
Heuristics were also found to be used in the manipulation and creation of cognitive maps. Cognitive maps are internal representations of our physical environment, particularly of spatial relationships. These internal representations are used from memory as a guide in our external environment. It was found that when questioned about map imagery and distances, people commonly introduced distortions into their images. These distortions took the form of regularization: images were represented as more like pure abstract geometric figures, even though the originals were irregular in shape.
There are several ways that humans form and use cognitive maps, and visual intake is a key part of mapping. The first is by using landmarks, where a person uses a mental image to estimate a relationship, usually distance, between two objects. The second is route-road knowledge, which is generally developed after a person has performed a task and is relaying the information of that task to another person. The third is survey, in which a person estimates a distance based on a mental image that, to them, might appear like an actual map. This image is generally created when a person's brain begins making image corrections, which take five forms:
1. Right-angle bias: a person straightens out an image, such as a mental map of an intersection, giving everything 90-degree angles, when in reality the angles may differ.
2. Symmetry heuristic: people tend to think of shapes, or buildings, as more symmetrical than they really are.
3. Rotation heuristic: a person takes a naturally (realistically) distorted image and straightens it out in their mental image.
4. Alignment heuristic: similar to the previous, people mentally align objects to make them straighter than they really are.
5. Relative-position heuristic: the distances between landmarks in a person's mental image are distorted according to how well the person remembers each item, rather than reflecting their actual positions.
Another method of creating cognitive maps is auditory intake based on verbal descriptions. Using a mapping based on another person's visual intake, a listener can create a mental image, such as directions to a certain location.
"Heuristic device" is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y. A good example is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in that sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. This means that the "ideal city" as depicted in The Republic is not given as something to be pursued, or to present an orientation-point for development; rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one would opt for certain principles and carry them through rigorously.
"Heuristic" is also often used as a noun to describe a rule-of-thumb, procedure, or method. Philosophers of science have emphasized the importance of heuristics in creative thought and constructing scientific theories. (See The Logic of Scientific Discovery, and philosophers such as Imre Lakatos, Lindley Darden, William C. Wimsatt, and others.)
In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.
The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.
For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary deadline is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.
The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have an incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make an economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the application for a patent was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries, such as software, should be protected for different lengths of time.
Heuristics are created to aid people in decision making, helping them expend as little effort as needed to reach a quick decision on various topics. In the study of media effects, judgmental heuristics have been shown to play an active role in simplifying news and political communication. Use of these cues and other signals from elites allows average people to achieve a modest level of rationality in reaching a decision, without devoting the significant cognitive effort normally required to arrive at thoughtful and considered choices. Analogous to a filing cabinet, when people encounter new information, they automatically search within their brain to associate it with something familiar. Once associated, they “file” it away in that “drawer”, where it can be referenced later; once filed, that association is hard to change. Mass media and advertisers understand this well and implement it in almost every aspect of advertisement. This method of processing extends beyond social media and advertisements, and is often used as a key tool in political agendas.

The limited capacity model and other information-processing models have been influential in the study of how people encode, store, and retrieve political information. Most people maintain only a minimal level of interest in public affairs, and therefore employ simplifying shortcuts to arrive at political judgments. Common examples include referring to the complex military and intelligence activities of NATO forces in the Middle East simply as “the war on terror,” to a reversal of a specific policy or position as a “flip-flop,” and to any type of broad government assistance program as “socialism.” As a result of such heuristic thinking, biased beliefs tend to follow, further reinforcing disingenuous mechanisms for information processing.
One such example, according to Schneider, is the hindsight bias, which “refers to people's tendency to believe, in retrospect, that an event was more predictable than it actually was.” By employing heuristic thinking in decision making, one not only reinforces existing thoughts and opinions but also diminishes one's openness to other solutions or ideas. Heuristics are useful when quick judgments need to be made, but in the long run they are often detrimental to educational growth and encourage a knowledge-deficit approach to life.
Risk assessment of new technologies offers another example of how ordinary citizens seek shortcuts to arrive expediently at judgments. Most people maintain a low level of interest in issues that are not central to their daily lives, such as developments in the various fields of science and technology. Media frames can produce powerful heuristics that have a significant impact on public opinion about a given new technology. Research has shown that media frames suggesting high risk often lead to strong negative perceptions and possible rejection of a technology. An example is the casting of genetically modified foods as “Frankenfoods” and the use of illustrations containing visual cues to Frankenstein's monster.
Heuristics and science opinions
The public also relies on specific heuristics to form opinions about science and science news. Scientists and communicators often assume that the public objectively accumulates and evaluates scientific information to develop opinions, but research has shown heuristics have a larger effect than specific science knowledge.
For example, a 2007 study examined how people in the United States developed opinions about agricultural biotechnology. Their results showed that the public used key heuristics to arrive at their opinions rather than specific knowledge of biotechnology. Specifically, the heuristics they used were deference to scientific authority, trust in scientific institutions, and whether they had seen media coverage of biotechnology. Different heuristics were used for different demographic groups, and actual knowledge of biotechnology played a small role in opinion formation.
This study also brings up the idea of using current media news as a heuristic. Whatever information has been most recently presented by the media is likely to be more accessible in an individual’s mind. This information can then be used as a shortcut in evaluating an issue, and is used heuristically in place of lengthier cognitive processing using past information.
The heuristics we use may also come directly from individuals: opinions of trusted or elite individuals may themselves become a heuristic. When evaluating a decision or problem, individuals can turn to these trusted or elite individuals for their opinions. Rather than evaluating the information surrounding the decision, the individual uses these trusted opinions as informational shortcuts to make their decisions.
Heuristics used when forming opinions can also be ideologically based. A 2008 study looked at the relationship between religion and opinions about nanotechnology. This research found that the more religious the citizens of a country, the less likely they were to support nanotechnology. This suggests that people used religion as a shortcut or heuristic; they were not informed about nanotechnology, but because their religious beliefs cautioned them against some forms of technology, they used an ideological heuristic to form their opinions about an unknown technology.
Different individuals use different heuristics to process the information before them based on their available schema and the framing of the information. Issues may resonate with different schemata depending on the individual and the way the issue is framed. For example, “drilling for oil” may activate schemata relating to corporate profits, environmental disasters, and exploitation of workers, while “exploring for energy” may activate schemata related to protecting the environment, national pride, and innovation. These two terms refer to the same activity, but when they are framed differently, different schemata are activated, which results in the use of different heuristics.
Stereotyping is a type of heuristic that all people use to form opinions or make judgments about things they have never seen or experienced. Stereotypes work as mental shortcuts for everything from inferring a person's social status from their actions to assuming that a tall plant with a trunk and leaves is a tree, even if we have never seen that particular type of tree before. Stereotypes, as described by Lippmann, are the pictures in our heads, built from our experiences as well as from what we are told about the world. These "pictures in our heads" allow us to make judgments without first-hand experience of a topic, which is what heuristics are all about.
Stereotypes have a negative reputation as a tool of racism, but everyone uses them, and they are an effective way to form opinions or to pass judgment on things we do not fully understand.
Stereotypes explain, for example, how we can identify a plant in another country as a tree: because we have been told what a tree looks like, and have seen many types of trees, we have images in our brains of the characteristics that make up a tree. So when we see something with similar characteristics, even though we have never been told that this particular plant is a tree, we can judge that it most likely is one. We have thus used a mental shortcut to make a decision, instead of going to a native of the region and asking, "Is this a tree?"
Stereotyping is also more likely to occur as a result of heuristic use when people are fatigued. In a 1990 study by Galen V. Bodenhausen, it was found that judging a situation at a non-optimal circadian time can elicit the misuse of heuristics. In this study, college participants were asked to judge the alleged "guilt" of students accused of misbehaviors on campus. In one condition, the students in question were associated with stereotypes; in the other, they were accused of the same acts but not associated with a stereotype. Participants were more likely to use the heuristic of the associated stereotype to conclude that the students were guilty at their non-optimal time of day: "morning" people were more likely to use heuristics when questioned at night, while "night" people were more likely to use them in the morning.
In computer science, a heuristic is a technique designed for solving a problem more quickly when classic methods are too slow, or for finding an approximate solution when classic methods fail to find any exact solution. By trading optimality, completeness, accuracy, and/or precision for speed, a heuristic can quickly produce a solution that is good enough for solving the problem at hand, as opposed to finding all exact solutions in a prohibitively long time.
One way of achieving this computational performance gain consists in solving a simpler problem whose solution is also a solution to the more complex problem.
A heuristic is used, for example, in the A* search algorithm, whose aim is to find a short path from one node to another. A heuristic that returns higher estimates guides the search to a path quickly, but the path may not be the shortest; a heuristic that returns lower, more conservative estimates makes the search slower, but yields a shorter path.
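A minimal sketch of A* on a small grid illustrates both ideas above: the Manhattan-distance heuristic is obtained by solving the simpler, relaxed problem of walking to the goal while ignoring walls, and because it never overestimates the true cost, the path found is shortest. (The grid, costs, and function names here are illustrative assumptions, not from the source.)

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid (cells with 1 are walls).
    Returns the number of steps from start to goal, or None if unreachable."""
    def h(p):
        # Manhattan distance: cost of the relaxed problem with no walls,
        # so it never overestimates (an admissible heuristic).
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, node)
    best_g = {start: 0}                  # cheapest known cost to each node
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > best_g.get(node, float('inf')):
            continue                     # stale entry, already improved
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float('inf')):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # -> 6 (around the wall in row 1)
```

Multiplying the heuristic by a weight greater than 1 (weighted A*) would make it inadmissible: the search expands fewer nodes and finishes faster, but the path it returns may no longer be the shortest.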
In human-computer interaction, heuristic evaluation is a usability-testing technique devised by expert usability consultants. In heuristic evaluation, the user interface is reviewed by experts and its compliance to usability heuristics (broadly stated characteristics of a good user interface, based on prior experience) is assessed, and any violating aspects are recorded.
In software development, a heuristic approach can facilitate a well-designed user interface, enabling users to navigate complex systems intuitively and without difficulty. The interface may guide the user when necessary using tooltips, help buttons, invitations to chat with support, and so on, providing help when needed. In practice, however, the designer of the user interface may not find it easy to strike the optimum balance of assistance. An example of a heuristic approach is Google's primary product, search, which involves highly complex algorithms operating over a massive amount of data. The user interface is greatly simplified to make for an intuitive experience: the requested search terms are entered into a box and submitted with a single click. Results are found by searching both for the precise term submitted and, by applying fuzzy logic, for near-matches and associations (e.g., a search for 'Jonathan Smith' also returns results for 'John Smith'). This means that Google can return information that the user wants but may not have asked for, through a simple and intuitive user interface. If the results returned are incorrect, the user is given the option of performing an "advanced search" to provide more information for a more targeted response.
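The near-match idea can be illustrated with Python's standard-library difflib, which scores string similarity and returns close candidates. This is only a toy sketch of fuzzy matching over a hypothetical name index, not Google's actual algorithm.

```python
from difflib import get_close_matches

# A small illustrative index of names; the query is an inexact spelling.
index = ["Jonathan Smith", "John Smith", "Joan Smyth", "Jane Doe"]

# Return up to 3 entries whose similarity ratio to the query is >= 0.6,
# so near-matches are surfaced even though none matches exactly.
matches = get_close_matches("Jon Smith", index, n=3, cutoff=0.6)
print(matches)  # includes "John Smith"; "Jane Doe" falls below the cutoff
```

Raising the cutoff trades recall for precision, the same optimality-for-speed trade-off that characterizes heuristics generally.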
Software developers and targeted end-users alike disregard heuristics at their own peril. End users often need to increase their understanding of the basic framework that a project entails (so that their expectations are realistic), and developers often need to push to learn more about their target audience (so that their learning styles can be judged). Business rules crucial to the organization are often so obvious to the end-user that they are not conveyed to the developer, who may lack domain knowledge in the particular field of endeavor the application is meant to serve.
A proper Software Requirements Specification (SRS) models the heuristics of how a user processes information on-screen. An SRS is ideally shared with the end-user well before the actual Software Design Specification (SDS) is written and the application is developed, so users' feedback about their experience can be used to adapt the design of the application. This saves much time in the Software Development Life Cycle (SDLC). Unless heuristics are adequately considered, the project will likely suffer many implementation problems and setbacks.
In engineering, a heuristic is an experience-based method that can be used as an aid to solve process design problems, varying from size of equipment to operating conditions. By using heuristics, time can be reduced when solving problems. Several methods are available to engineers. These include Failure mode and effects analysis and Fault tree analysis. The former relies on a group of qualified engineers to evaluate problems, rank them in order of importance and then recommend solutions. The methods of forensic engineering are an important source of information for investigating problems, especially by elimination of unlikely causes and using the weakest link principle. Because heuristics are fallible, it is important to understand their limitations. They are aids that facilitate quick estimates and preliminary process designs.
- Behavioral economics
- Daniel Kahneman
- Failure mode and effects analysis
- List of biases in judgment and decision making
- Problem solving
- Social heuristics
- Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, Addison-Wesley, p. vii. ISBN 978-0-201-05594-8
- Pólya, George (1945) How to Solve It: A New Aspect of Mathematical Method, Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5 ISBN 0-691-08097-6
- Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond "Heuristics and Biases"". European Review of Social Psychology 2: 83–115. Retrieved 14 October 2012.
- Daniel Kahneman, Amos Tversky, and Paul Slovic, eds. (1982) Judgment under Uncertainty: Heuristics & Biases. Cambridge, UK, Cambridge University Press ISBN 0-521-28414-7
- "Heuristics and heuristic evaluation". Interaction-design.org. Retrieved 2013-09-01.
- Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK, Oxford University Press. ISBN 0-19-514381-7
- Gigerenzer, Gerd and Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology. Vol. 62. Ssrn.com. p. 451-482.
- "Cognitive experiential self theory - Psychlopedia". psych-it.com.au. 2008-10-18. doi:10.1177/1745691611429354. Retrieved 2013-09-01.
- Epstein, S., Pacini, R., Denes-Raj, V. & Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology, 71, 390-405.
- Kahneman, Daniel; Shane Frederick (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich, Dale Griffin, Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. OCLC 47364085.
- Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics". American Economic Review (American Economic Association) 93 (5): 1449–1475. doi:10.1257/000282803322655392. ISSN 0002-8282.
- Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing 26: 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x.
- Smith, H. (1999). Use of the anchoring and adjustment heuristic by children. Current Psychology: A Journal For Diverse Perspectives On Diverse Psychological Issues, 18(3), 294-300. doi:10.1007/s12144-999-1004-4
- Harvey, N. (2007). Use of heuristics: Insights from forecasting research. Thinking & Reasoning, 13(1), 5-24. doi:10.1080/13546780600872502
- Sternberg, Robert J.; Karin Sternberg (2012). Cognitive Psychology (6th ed.). Belmont, CA: Wadsworth, Cengage Learning. pp. 310–315. ISBN 978-1-111-34476-4.
- K. M. Jaszczolt (2006). "Defaults in Semantics and Pragmatics", The Stanford Encyclopedia of Philosophy, ISSN 1095-5054
- Roman Frigg and Stephan Hartmann (2006). "Models in Science", The Stanford Encyclopedia of Philosophy, ISSN 1095-5054
- Olga Kiss (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking", Perspectives on Science, vol. 14, no. 3, pp. 302-317, ISSN 1063-6145
- Gerd Gigerenzer and Christoph Engel, eds. (2007). Heuristics and the Law, Cambridge, The MIT Press, ISBN 978-0-262-07275-5
- Lin, Tom C. W., A Behavioral Framework for Securities Risk (April 16, 2012). 34 Seattle University Law Review 325 (2011). Available at SSRN: http://ssrn.com/abstract=2040946
- Eric E. Johnson (2006). "Calibrating Patent Lifetimes", Santa Clara Computer & High Technology Law Journal, vol. 22, p. 269-314
- Robles, Harmony A. (March 26, 2012). "Heuristics: Helpful, Harmful, and Somewhere in Between - Applied Social Psychology (ASP)".
- Schneider, F. W., Coutts, L. M., & Gruman, J. A. (2012). Applied social psychology, understanding and addressing social and practical problems. (2nd ed.). Thousand Oaks, CA: Sage Publications, Inc.
- Scheufele, D. A. (2006). Messages and heuristics: How audiences form attitudes about emerging technologies. In J. Turney (Ed.), Engaging science: Thoughts, deeds, analysis and action (pp. 20-25).
- Brossard, D., & Nisbet, M. C. (2007). Deference to scientific authority among a low information public: Understanding U.S. opinion on agricultural biotechnology. International Journal of Public Opinion Research, 19(1), 24-52. doi: 10.1093/ijpor/edl003
- Popkin, S. L. (1991). The reasoning voter: Communication and persuasion in presidential campaigns. Chicago, IL: University of Chicago Press. (Chapters 3 & 4, pp. 44–95).
- Scheufele, D. A., Corley, E. A., Shih, T.-j., Dalrymple, K. E., & Ho, S. S. (2009). Religious beliefs and public attitudes to nanotechnology in Europe and the US. Nature Nanotechnology, 4(2), 91 - 94. doi: 10.1038/NNANO.2008.361
- Bodenhausen, G. V. (1990). Stereotypes as judgmental heuristics: Evidence of circadian variations in discrimination. Psychological Science, 1(5), 319-322. doi:10.1111/j.1467-9280.1990.tb00226.x
- How To Solve It: Modern Heuristics, Zbigniew Michalewicz and David B. Fogel, Springer Verlag, 2000. ISBN 3-540-66061-5
- Russell, Stuart J.; Norvig, Peter (2003), Artificial Intelligence: A Modern Approach (2nd ed.), Upper Saddle River, New Jersey: Prentice Hall, ISBN 0-13-790395-2
- The Problem of Thinking Too Much, 2002-12-11, Persi Diaconis