Adaptive bias

From Wikipedia, the free encyclopedia
Revision as of 15:15, 29 December 2006

Adaptive bias is the idea that the human brain has evolved to reason adaptively, rather than truthfully or even rationally, and that cognitive bias may have evolved as a mechanism to reduce the overall cost of cognitive errors, rather than merely their number, when decisions must be made under conditions of uncertainty.

When making decisions under conditions of uncertainty, two kinds of errors need to be taken into account: "false positives", i.e. deciding that a risk or benefit exists when it does not, and "false negatives", i.e. failing to notice a risk or benefit that does exist. False positives are also commonly called "type 1 errors", and false negatives "type 2 errors".

Where the cost or impact of a type 1 error is much greater than the cost of a type 2 error (e.g. wrongly concluding that water is safe to drink), it can be worthwhile to bias the decision-making system towards making fewer type 1 errors, i.e. making it less likely to conclude that a particular situation exists. By definition, this also increases the number of type 2 errors. Conversely, where a false positive is much less costly than a false negative (blood tests, smoke detectors), it makes sense to bias the system towards maximising the probability that a particular (very costly) situation will be recognised, even if this often leads to the (relatively cheap) error of noticing something that is not actually there.
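The cost-asymmetry reasoning above can be sketched as a simple expected-cost comparison. This is an illustrative example, not part of the article; the function name and cost figures are hypothetical:

```python
def should_act(p_event, cost_false_negative, cost_false_positive):
    """Decide whether to act (e.g. sound an alarm) under uncertainty.

    Acting when the event is absent incurs the false-positive cost;
    failing to act when the event is present incurs the false-negative
    cost. Acting is worthwhile when the expected cost of inaction
    exceeds the expected cost of action.
    """
    expected_cost_of_inaction = p_event * cost_false_negative
    expected_cost_of_action = (1 - p_event) * cost_false_positive
    return expected_cost_of_inaction > expected_cost_of_action


# Smoke-detector case: a missed fire is vastly costlier than a false
# alarm, so even a small probability of fire justifies acting.
print(should_act(0.05, cost_false_negative=1000.0, cost_false_positive=1.0))   # True

# Reversed costs: with false positives far costlier, the same 5%
# probability no longer justifies acting.
print(should_act(0.05, cost_false_negative=1.0, cost_false_positive=1000.0))   # False
```

With asymmetric costs, the decision threshold shifts away from the 50% probability a purely accuracy-maximising system would use, which is the "bias" the article describes.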

Martie G. Haselton and David M. Buss (2003) state that cognitive bias can be expected to have evolved in humans for cognitive tasks where:

  • Decision making is complicated by a significant signal-detection problem (i.e. there is uncertainty)
  • The solution to the particular kind of decision problem has had a recurrent effect on survival and fitness throughout evolutionary history
  • The costs of a "false positive" or "false negative" error dramatically outweigh the cost of the alternative type of error

References

Haselton, Martie G.; Buss, David M. (2003). "Biases in Social Judgment: Design Flaws or Design Features?"