Adaptive bias is the idea that the human brain has evolved to reason adaptively, rather than truthfully or even rationally, and that cognitive bias may have evolved as a mechanism to reduce the overall cost of cognitive errors, rather than merely their number, when decisions must be made under conditions of uncertainty.
Error Management Theory
According to Error Management Theory, when making decisions under conditions of uncertainty, two kinds of errors need to be taken into account—"false positives", i.e. deciding that a risk or benefit exists when it does not, and "false negatives", i.e. failing to notice a risk or benefit that exists. False positives are also commonly called "Type I errors", and false negatives are called "Type II errors".
Where the cost or impact of a Type I error is much greater than the cost of a Type II error (e.g., wrongly concluding that questionable water is safe to drink), it can be worthwhile to bias the decision-making system towards making fewer Type I errors, i.e. making it less likely to conclude that a particular situation exists. This by definition also increases the number of Type II errors. Conversely, where a false positive is much less costly than a false negative (blood tests, smoke detectors), it makes sense to bias the system towards maximising the probability that a particular (very costly) situation will be recognised, even if this often leads to the (relatively inexpensive) event of noticing something that is not actually there. This trade-off is exhibited in modern airport screening: maximising the probability of preventing a high-cost terrorist event results in frequent, low-cost screening hassles for harmless travelers who represent a minimal threat.
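The smoke-detector case above can be sketched numerically. This is a minimal illustration with hypothetical error rates and costs (none of the numbers come from the article): when a false negative is far more costly than a false positive, the "sensitive" bias has the lower expected cost even though it makes more errors in total.

```python
# Hypothetical costs: a nuisance alarm is cheap, a missed fire is catastrophic.
COST_FALSE_POSITIVE = 1
COST_FALSE_NEGATIVE = 1000

def expected_cost(p_false_positive, p_false_negative):
    """Expected cost per decision given the two error probabilities."""
    return (p_false_positive * COST_FALSE_POSITIVE
            + p_false_negative * COST_FALSE_NEGATIVE)

# A "sensitive" detector errs often, but almost never misses a fire;
# a "strict" detector makes fewer errors overall, but misses more fires.
sensitive = expected_cost(p_false_positive=0.20, p_false_negative=0.001)
strict    = expected_cost(p_false_positive=0.01, p_false_negative=0.050)

# More total errors (0.201 vs 0.060), yet lower expected cost (1.2 vs 50.01).
assert sensitive < strict
```

The point of the sketch is that minimising expected *cost* and minimising the *number* of errors pull the bias in opposite directions once the two error costs are sufficiently asymmetric.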
Haselton & Buss (2003) state that cognitive bias can be expected to have developed in humans for cognitive tasks where:
- decision-making is complicated by a significant signal-detection problem (i.e. when there is uncertainty)
- the solution to the particular kind of decision-making problem has had a recurrent effect on survival and fitness throughout evolutionary history
- the costs of a "false positive" or "false negative" error dramatically outweigh the cost of the alternative type of error
The costly information hypothesis
The costly information hypothesis is used to explore how adaptive biases relate to cultural evolution within the field of dual inheritance theory. The focus is on the evolutionary trade-off in cost between individual learning (e.g., operant conditioning) and social learning. If the more accurate information that could be acquired through individual learning is too costly, evolution may favor learning mechanisms that are biased towards less costly (though potentially less accurate) information acquired via social learning.
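The trade-off can be illustrated with a toy payoff comparison. All numbers here are illustrative assumptions, not values from the literature: individual learning is modeled as accurate but expensive to perform, social learning as cheap but less reliable, and the channel with the higher net payoff is the one selection would favor.

```python
# Hypothetical toy model of the costly information hypothesis.
def net_payoff(p_accurate, benefit, cost):
    """Expected benefit of acting on the information, minus acquisition cost."""
    return p_accurate * benefit - cost

# Individual learning: accurate information at a high acquisition cost.
individual = net_payoff(p_accurate=0.95, benefit=10, cost=6)
# Social learning: cheaper to acquire, but the copied information may be wrong.
social     = net_payoff(p_accurate=0.70, benefit=10, cost=1)

# Under these assumed parameters the cheaper, less accurate channel wins.
assert social > individual
```

Which channel wins depends entirely on the parameters; the hypothesis is that recurrent cost structures of this kind, not accuracy alone, shaped which learning biases evolved.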
See also
- Guided variation and biased transmission – Adaptive biases in dual inheritance theory
- Psychological adaptation
- Curse of knowledge
References
- Haselton, M.G.; Buss, D.M. (2003). "Biases in Social Judgment: Design Flaws or Design Features?" (PDF). In Forgas, Joseph P.; Williams, Kipling D.; von Hippel, William. Social Judgments: Implicit and Explicit Processes. New York, NY: Cambridge University Press. pp. 23–43. ISBN 9780521822480.
- Haselton, Martie G.; Nettle, Daniel; Andrews, Paul W. (2005). "The Evolution of Cognitive Bias" (PDF). In Buss, D.M. The Handbook of Evolutionary Psychology. Hoboken: Wiley. pp. 724–746. doi:10.1002/9780470939376.ch25. ISBN 9780470939376.
- Henrich, Joseph; McElreath, Richard (2007). "Dual-inheritance theory: the evolution of human cultural capacities and cultural evolution" (PDF). In Barrett, Louise; Dunbar, Robin. Oxford Handbook of Evolutionary Psychology. Oxford: Oxford University Press. Ch. 38. doi:10.1093/oxfordhb/9780198568308.013.0038. ISBN 9780198568308.