Talk:List of cognitive biases/Archive 1

From Wikipedia, the free encyclopedia


Name/Organization of article

Can we call this something more specific than "list of biases"? There has to be a better, less confusing name for this category. - Texture 18:19, 7 Mar 2004 (UTC)

Perhaps List of psychological biases? Isomorphic 18:23, 7 Mar 2004 (UTC)
Much more descriptive. Any objection to moving this article to the new name? - Texture 18:28, 7 Mar 2004 (UTC)
If you wanna move it, fine with me. My only thought is that all statistical biases could be considered ultimately cognitive ones. 18:31, 7 Mar 2004 (UTC)
Would List of cognitive biases be better than List of psychological biases? - Texture 18:33, 7 Mar 2004 (UTC)
Better ring to it, covers what I had in mind, yes. 18:38, 7 Mar 2004 (UTC)
I'll move this article to List of cognitive biases - Texture 18:59, 7 Mar 2004 (UTC)

Maybe the list should be hierarchical. Categories include attributional biases, statistical biases, self-serving biases, group-serving biases, reporting biases (though there might be some overlap between some of them) 08:45, 7 Mar 2004 (UTC)

The list is not accessible

I think the list is not accessible. Each bias should mention the following

  1. whether this is an informal term only used by lay people or a term used by psychologists
  2. whether the bias has been empirically verified
  3. whether the bias is undisputed by scientists.
  4. what other biases are closely related, overlap or are a part of it.

I can't do it myself because I don't know enough about the subject. Thanks for helping. Andries 15:56, 16 May 2004 (UTC)

I agree with Andries. On another note, this is a really great list subject!! If only we could get Bill O'Reilly and Michael Moore to read it... -- 18:49, 21 Aug 2004 (UTC)
I agree. I'll probably be working on this over the next few days, any help is appreciated. J.S. Nelson 07:57, 25 Apr 2005 (UTC)
I've done my best to improve this list (organizing it into obvious categories and adding short descriptions). Please continue to update and improve this list as y'all see fit. Headlouse 01:33, 16 November 2005 (UTC)

Victim fallacy

People tend to assume that their problems are unique to them or a group to which they belong, when these problems are very often widespread and affect many people.

Is there a name for this bias or fallacy? I cannot find anything in the list which mentions it. I would tend to call it the "victim fallacy".

LordK 19:13, 20 Nov 2004 (UTC)

Coercion bias?

This is just an unconfirmed rumor I read somewhere. If there is a person with an opinion, people tend to agree to him more in person than not in person. So this can lead to bias in that individual. But I am not even sure if this belongs here. Samohyl Jan 23:35, 21 Dec 2004 (UTC)

Why is there a separate "Other cognitive biases:" list?

Why is there a separate "Other cognitive biases:" list? There is no indication to the reader of any reason that these are separated out. I'd suggest they either be integrated in, or an explanation given to the reader why they are separate.

I am not even convinced that things like optical illusions are cognitive biases in the same sense as the other things on the list. It might be better to remove them entirely, or put them under 'see also.' Tom harrison 00:21, 21 October 2005 (UTC)

Tunnel Vision

Just wondering if this is correctly linked? AndyJones 18:19, 9 November 2005 (UTC)

Support for merging this page with Cognitive Bias

Like previous users, I see no reason for not merging this with the cognitive bias page. I would need help to do this... any offers, please? --Rodders147 11:26, 19 March 2006 (UTC)

I don't know about the merging. It is a pretty long list, after all. --maru (talk) contribs 18:14, 19 March 2006 (UTC)
I agree with Marudubshinki. This list is too long to include on the cognitive bias page. Plus there are several "List of _________" pages on Wikipedia, so this is not abnormal. Headlouse 22:59, 30 March 2006 (UTC)

Lake Wobegon effect

Where did this list come from? A lot of these, such as Lake Wobegon effect, must have some other name in psychology. It seems like some serious merging would help make these topics more informative. -- 10:12, 21 April 2006 (UTC)

Lake Wobegon effect, egocentric bias and actor-observer bias are closely related. Distinguishing (or merging) them requires expert advice. Peace01234 (talk) 03:52, 25 November 2007 (UTC)

confusing causation with correlation

It seems as though the common phenomenon of assuming causation when all that exists is correlation or association should be on this list. Maybe it is and I could not identify it. Not my area of expertise; hope someone will address this. BTW: to those who maintain this page, it's a wonderful resource!

Correlation implies causation is mentioned in Logical fallacy article. -- Sundar \talk \contribs 16:22, 26 June 2006 (UTC)


"Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view." It seems to me that this line is self-contradictory. Not that it should be removed, but if a psychologist discovered this bias how would this psychologist know whether or not he/she was subject to this bias (the bias of a psychologist) during its discovery? Anyway, I just thought that was interesting. --Merond e 14:04, 18 March 2007 (UTC)

Reductive Bias

"A common thread running through the deficiencies in learning is oversimplification. We call this tendency the reductive bias, and we have observed its occurrence in many forms. Examples include the additivity bias, in which parts of complex entities that have been studied in isolation are assumed to retain their characteristics when the parts are reintegrated into the whole from which they were drawn; the discreteness bias, in which continuously dimensioned attributes (like length) are bifurcated to their poles and continuous processes are instead segmented into discrete steps; and the compartmentalization bias, in which conceptual elements that are in reality highly interdependent are instead treated in isolation, missing important aspects of their interaction."

Cognitive Flexibility, Constructivism, and Hypertext, R.Spiro et al. —The preceding unsigned comment was added by Difficult to pick a user name (talkcontribs) 18:59, 29 March 2007 (UTC).

Loss aversion, endowment effect, and status-quo bias

"Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias" Daniel Kahneman; Jack L. Knetsch; Richard H. Thaler The Journal of Economic Perspectives, Vol. 5, No. 1. (Winter, 1991), pp. 193-206.

The first two paragraphs of this article:

A wine-loving economist we know purchased some nice Bordeaux wines years ago at low prices. The wines have greatly appreciated in value, so that a bottle that cost only $10 when purchased would now fetch $200 at auction. This economist now drinks some of this wine occasionally, but would neither be willing to sell the wine at the auction price nor buy an additional bottle at that price.
Thaler (1980) called this pattern - the fact that people often demand much more to give up an object than they would be willing to pay to acquire it - the endowment effect. The example also illustrates what Samuelson and Zeckhauser (1988) call a status quo bias, a preference for the current state that biases the economist against both buying and selling his wine. These anomalies are a manifestation of an asymmetry of value that Kahneman and Tversky (1984) call loss aversion - the disutility of giving up an object is greater than the utility associated with acquiring it.

I think it's very clear that my edit (suggesting that the three are related) is consistent with the conventional usage in economics. Anthon.Eff 22:24, 11 April 2007 (UTC)

Look, armies of academics have worked on those "asymmetric" effects. Your citations do not support your point as they show clearly that the three phenomena are different manifestations of something more general, but more vague, an "asymmetry". How can, for example, an investor understand his/her biases if he/she sees a confusing explanation? Please, treat each bias for itself, show the analogies if you like, but also the differences. Show why they are not called by the same name, this is the way to make things clear. In asset management the differences are as follows
  • Loss aversion is clearly the reluctance to sell when the price is lower.
  • Endowment effect is clearly the idea that what somebody owns has more value than the market offers, even if its market price already got multiplied two, ten or a hundred times.
  • Status quo bias does not concern only selling but also buying. It is the reluctance to change one's price estimate (whether upwards or downwards), as well as to make arbitrages between assets.
The purpose of a list is to list things, not to merge them. I plan to eliminate again your misleading add on, to avoid confusion in the mind of readers (yes, framing, which is the way things are presented and perceived, is another cognitive bias). But I'm sure it would not be needed, as I trust you will dig deeper into those phenomena and will refine your wordings (the word "related" is seriously misleading as it hides the - crucial - differences) to avoid those confusions. --Pgreenfinch 07:10, 12 April 2007 (UTC)
I think you need to give a citation for your definitions. I provided a citation, written by three well-known economists, including Thaler (who invented the term endowment effect), and Kahneman (who with Tversky invented the term loss aversion). It would help, when you provide your citation, to also do as I did: actually extract a few paragraphs providing the definitions that support your point. Until then, I think you should avoid reverting, since you have provided no evidence that you are correct. Anthon.Eff 11:54, 12 April 2007 (UTC)
If I understood well, your only knowledge of the subject is a few old citations and you do not really want to explore the topic by yourself. Sorry, but you took the responsibility to make an add-on (which btw does not match clearly those citations) and to try to play the professor on something you do not really grasp. So you have to take that responsibility until the end and explain what your word "related" really means. Maybe you can look at Martin Sewell's treasure chest on the topic, a site I usually recommend; there you will learn a lot. You know, I respect the fact that you are an expert in philosophy, but those topics are very practical and precise ones and should be approached practically and precisely, avoiding reductionism. A philosophical concept, this word, seems to me, although here I'm not an expert and will not meddle in the article ;-). --Pgreenfinch 13:10, 12 April 2007 (UTC)
Actually, I'm an academic economist, though behavioral economics is not my primary area of expertise. But to be honest, I think you are the one trying to "play the professor", as you put it, by presenting your own definitions as if these are somehow authoritative. That's a nice approach when writing ones own papers or playing on a blog, but that's not what is expected on Wikipedia (see WP:Attribution or WP:No original research). --Anthon.Eff 13:49, 12 April 2007 (UTC)
I didn't give my page as reference for those concepts (although, by the way, various academics and various professional institutions link to that page and "play" with that forum). I just summed up here the usual definitions - which I didn't invent - of those effects. You are free to find better ones in the literature, which is why, to help you clarify (what does "related" mean?) your (original) add-on, I suggested you get more expertise via a recognised academic portal on those topics. If you do not give clear definitions of those phenomena (I don't specially ask you to follow those I gave, if you find more explicit ones) that show how close / how far their relations are, how can you write that they are related, a very vague word? --Pgreenfinch 17:43, 12 April 2007 (UTC)
I guess this is getting pretty unproductive. I did not use the word "related" in the article. I used the words "see also." And I have already provided a citation--from two of the men who coined the terms "loss aversion" and "endowment effect." Since we're having such trouble over this issue, I decided to simply quote these two men in the article for the definition of these terms. I have, of course, provided the source for these quotes. I trust that letting the authorities speak for themselves now resolves any differences we may have. --Anthon.Eff 18:47, 12 April 2007 (UTC)
I see your point: an author talked about different things which are about 10% related, and this gives an authority argument to put "see also" between those. Status quo bias has very little to do with loss aversion, which is why your citations are not really explicit about that. I find that a poor way to use a Nobel prize author. Now that you opened Pandora's box, we could add (we just need to find an author that talked about those different things) "see also" between scores of cognitive, emotional, individual, collective biases that have a few common points between them, and make a complete mess of the article. Just an example: status quo bias could be linked, by stretching the rubber band, to cognitive dissonance, cognitive overload, plain laziness, rationalization, overconfidence and many other things, nearly the whole list of biases. I have nothing against the "see also" tool, but on condition that it explains the similarities as well as the differences. With a "see also" signpost and no clear explanations, it is up to the reader to scratch his head wondering "Boy, what is this supposed to tell me? Do those concepts complement or oppose each other, and why? Do those roads converge or diverge, and where?". I think we have here now a serious dent in an article which had the potential to be an example of encyclopedic quality. --Pgreenfinch 07:09, 13 April 2007 (UTC)
The article contains 10 cases of "see also" in parentheses--all but one were there before I made my edit. Not one of the nine preexisting cases has the level of editorial detail you think is mandatory. By the way, note how this "serious dent in the article" only became salient after a dispute with another editor. What kind of bias do you think we are witnessing? My guess would be that this most closely matches Confirmation bias, in that we have seen one thing after another brought up in order to validate the initial negative reaction to the other editor's edit (first wrong definitions, then a few ad hominems, then the use of the word "related", then use of the words "see also"). But it could also be seen as an example of the Focusing effect, since you appear to think one minor edit has the potential to wreck the entire article. But then again, your commitment to the current state of the article may be an example of Status quo bias. Of course, as Pronin, Ross and Gilovich (2004: 781) tell us: "people readily detect or infer a wide variety of biases in others while denying such biases in themselves." An unnamed bias of bias attribution (someone should name this and put it on the list), which has almost certainly caused me to overlook my own biases. Any lurkers out there? What kinds of biases do you think sustain such a long and unproductive discussion over a minor edit? --Anthon.Eff 13:38, 13 April 2007 (UTC)
Pronin, Emily, Lee Ross, and Thomas Gilovich. (2004). "Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others." Psychological Review. 111(3): 781–799.
Anthon, you are becoming a real shrink; see all those biases you discovered in me, I'm impressed :-). Btw, the other "see also"s do not refer to a specific bias in the list. More generally, "see also"s are understandable in an article on a precise topic, as the relations can be seen clearly from the text. But in a list there is no real room to elaborate: either it becomes a mess by trying to explain every relation, however strong or weak, or, if those explanations are skipped, the "reductionist" risk is high. An alternative is to make smaller groupings, but here overlaps would be frequent and it would be subjective to find clear category criteria in a field that deals with human behavior. Thus we would face the same risks. Now I expect, like you, that lurkers will describe fully our respective biases. Certainly, from what I just said, they will find me overly "risk averse" ;-). --Pgreenfinch 15:14, 13 April 2007 (UTC)

Great List - Inaccurate quality/importance tag?

This article is one of the best pages on the whole of the Internet. It strikes me as bordering on presumptuous for someone to come along and lightly tag it as low-quality and unimportant. --New Thought 16:36, 12 January 2007 (UTC)

I agree. This is a fantastic list. As a researcher, it was a big help to me. What would make it even more powerful is if the article citations for each bias (if available) were noted in the footnotes on this page, rather than just on the individual pages of the biases. Zfranco 00:25, 8 February 2007 (UTC)
Don't repeat yourself. --Gwern (contribs) 03:15 8 February 2007 (GMT)
This list is just great. It is not perfect as it is, but nowhere else in my life have I seen a list of cognitive biases which affect people's decisions so much. Unfortunately most people don't think too much about them. The world would certainly be a better place if everyone was aware of cognitive biases and thought more about the rationality in their decisions. I don't know how to express in Wikipedia that an article is important to me. I guess I am going to write it on my user page. Congratulations to all editors who made this list possible. A.Z. 00:16, 12 March 2007 (UTC)
Indeed, this is a highly useful list and is more comprehensive than anything I have been able to find on the internet on the subject. Moreover, it is a great use of the Wiki approach of synthesizing information from so many different sources. As the article currently stands, the writing is sound, the topic important, and I suggest the tags be removed or at least be revised to reflect consensus. In that respect, I have moved the various comments on the quality of the page to this new section, for convenience of others in expressing their views. McIlwrath 20:50, 12 April 2007 (UTC)

Observer-expectancy effect

Do you have a source saying that Observer-expectancy effect is a cognitive bias ? I have added a {{citation needed}} in the Observer-expectancy effect article. Thank you. Akkeron 10:44, 15 April 2007 (UTC)

Déformation professionnelle

My observation is that this bias is detected by folks outside the discipline/profession, not within it. 20:50, 24 April 2007 (UTC)

Dunning-Kruger effect

I found this, which looks to me like a cognitive bias, but I'm not sure which category to put it in.

Dunning-Kruger effect Saraid 02:10, 22 July 2007 (UTC)

This was just removed, in this edit, but I don't understand why. "original research" how? CRETOG8(t/c) 23:28, 17 September 2009 (UTC)
The term "Dunning-Kruger effect" is original research by Wikipedia: it doesn't come from published sources. Hence putting it on a list alongside things like confirmation bias and overconfidence, which are well established in the literature, is misleading. The article itself needs to be renamed or merged. In the meantime, there's no reason to have the term on this list. The Dunning-Kruger experiments are mentioned in Illusory superiority. The academic literature doesn't treat their research as a separate effect, but mentions it in connection with illusory superiority/better-than-average effect/self-enhancement bias/whatever you want to call it. MartinPoulter (talk) 08:36, 18 September 2009 (UTC)
The term shows up in several places by a standard Google search and a couple (not clearly published) in a Google Scholar search. So, it's a term in use, if not in heavy use, so it's not OR. I don't know how to rule out circularity here, that quite possibly some of the search results use the term because they learned it from WP, but anyway... I'm also not sure it deserves its own article (I lean slightly toward yes), but that's a different question.
Aside from this disagreement, I appreciate the cleanup you did on this list. CRETOG8(t/c) 13:57, 18 September 2009 (UTC)
Actually, following a few of those search links, it really does look like they're getting the term from WP. It's awkward, but I think you're right. A priority then should be getting rid of the article itself and merging it into illusory superiority. CRETOG8(t/c) 14:05, 18 September 2009 (UTC)
Glad you understand, and thanks for the compliment. I don't think it justifies having its own article yet, but I'm very open to debate on that issue. MartinPoulter (talk) 11:34, 19 September 2009 (UTC)

Perceptual vs. Conceptual

Perhaps the page should say "distortion in the way humans conceive reality" instead of "distortion in the way humans perceive reality" since the perception is the same, it's the concept that's distorted. 10:12, 22 July 2007 (UTC)

More general type of Post-purchase rationalization bias?

[moved from Talk:Post-purchase_rationalization ] Hi, I've been searching List_of_cognitive_biases for a type of bias and I think Post-purchase rationalization is closest to it, but it's more general than purchasing, where a decision has been made, then vacuous justifications are made up afterwards to support the decision, and justifications for alternative outcomes are deemphasized. Is this a separate type of bias that has its own name? 12:48, 24 July 2007 (UTC)

Illusion of control

The 'Illusion of Control' entry in the list contains a negative bias. I propose 'clearly cannot' be changed to 'scientifically have no control over'. - Shax —Preceding unsigned comment added by (talk) 01:13, 2 September 2007 (UTC)

Argument from incredulity

Argument_from_ignorance (or incredulity) is listed as a "logical fallacy" but seems like it belongs in this list. Whether one makes the logical argument explicitly or not doesn't really matter; it still is a cognitive bias.

LiamH (talk) 20:17, 6 February 2008 (UTC)

Is this list (and the article on Cognitive Bias) confusing Cognitive Bias with simple Irrational Thinking?

On reading the article and the discussions here, I wonder where the "authority" comes from to classify this list as comprising genuine cognitive biases. At first I wondered, as others have, about grouping and classifying the biases, but the more I read the more the term cognitive bias seemed to be becoming used as a catch-all for any illogical thought process. There seems to be a sort of populist psychology creeping in here. I am not sure that this is correct, but IMHO a cognitive bias is the sort of phenomenon noted in Prospect theory. Whilst it is true that, using mathematical formulae, the human response is inconsistent and illogical, at the same time it is also clear, from a human perspective, why the bias exists and indeed its utility in the survival process. On the other hand simple illogical thinking can be termed "bad" thinking, e.g.: "I can jump that gap because he could and I am as good a person as he is." There is a confusion in the example between being good (at jumping) and being good (as a person). Prospect theory, on the other hand, has more in common with valuing a "bird in the hand more than two in the bush." Simple illogical thinking is clearly not useful for survival and can even at times be described as a symptom of mental illness. However cognitive bias can IMHO be seen as a "real" response to a probabilistic reality. If a coin is tossed and it has come up heads four times, it is illogical to prefer to choose tails the next toss. However, IMHO it is eminently sensible so to do. It all depends on HOW we frame the maths. Could we do some "disambiguation" here into "true" cognitive biases and examples of simple "bad"/illogical thinking? Also, I have been unable to find (quickly) any references to "lists" of cognitive biases, but only to particular studies showing an example of a cognitive bias, such as that described by Prospect Theory, priming, framing etc.

LookingGlass (talk) 16:55, 18 February 2008 (UTC)

I agree that this list is leaking beyond cognitive biases into other kinds of bias. I think the way around this is to focus on books which are unambiguously about cognitive bias (there are examples in the article), and count something as a cognitive bias if it's described in those. MartinPoulter (talk) 18:30, 4 August 2009 (UTC)


Did somebody discover the secret of time travel? Whoever marked "December 2008", please correct this. Otherwise, it will have to be removed. Montblanc2000 (talk) 18:02, 29 August 2008 (UTC)

Removed tag: This article or section has multiple issues. The page is, while perhaps not perfect, fine. Tagging it is unreasonable. Power.corrupts (talk) 07:31, 14 September 2008 (UTC)

Precision bias

I have just discovered Precision bias. Where in this list should it be? -- Wavelength (talk) 17:30, 22 July 2009 (UTC)

Good catch. I'd put it in List_of_cognitive_biases#Biases_in_probability_and_belief. CRETOG8(t/c) 17:36, 22 July 2009 (UTC)
I don't think this is a cognitive bias. Unless we can find cognitive psychology literature describing this bias, it shouldn't be in this article. MartinPoulter (talk) 18:32, 4 August 2009 (UTC)

Pro-innovation bias

I have just discovered Pro-innovation bias. Where should it be listed? -- Wavelength (talk) 20:53, 22 July 2009 (UTC)

Same objection as for "Precision bias" above. It is mentioned in academic literature, but I don't see an argument that it's a cognitive bias rather than just a fallacy. MartinPoulter (talk) 18:33, 4 August 2009 (UTC)

inequality aversion

It's difficult to make a clear line between "biases" and "preferences", but I'd say inequality aversion falls pretty clearly into the preference category. Some people don't like inequality, that isn't a bias in any standard sense--it wouldn't be classified as "irrational" by an economist or decision theorist. CRETOG8(t/c) 19:36, 23 July 2009 (UTC)

Actually, sacrificing personal gain to prevent an "unfair" advantage by others is a classic example of irrational behavior in economic study. For example, suppose you and a coworker both earn $500 a day, and you have the option to accept a $100 raise on the condition that your coworker also receives a $300 raise. According to inequality aversion, many people will refuse, even though they would themselves earn more money than before. Rational economic actors disregard economic inequality and only seek to improve their own individual condition. (talk) 13:37, 20 October 2009 (UTC)

Cleanup for unsourced/redlinks and article organization

The page needs some cleanup with regard to redlinked and related terms. Redlinks need either a source showing the acceptance of the term within the psychology community or an article that is similarly sourced. If there is no reason to believe that a particular bias is anything more than a pet project, it should be removed. Additionally, it would behoove us to group related biases (such as confirmation bias, experimenter's bias, hindsight bias, et cetera) rather than use the broad categories we currently employ. (talk) 19:03, 25 August 2009 (UTC)


This page links "black swan", but why does it not list the narrative fallacy? Cesiumfrog (talk) 04:15, 8 October 2009 (UTC)

Esperanto Version

How long should I wait for someone else to publish the Esperanto version before I try to do it myself? Please advise. -Joshua Clement Broyles —Preceding unsigned comment added by (talk) 02:21, 6 November 2009 (UTC)

What about the effect of power on moral hypocrisy?

This recent research Why Powerful People -- Many of Whom Take a Moral High Ground -- Don't Practice What They Preach (also discussed in this Economist article Absolutely: Power corrupts, but it corrupts only those who think they deserve it) demonstrates that "power and influence can cause a severe disconnect between public judgment and private behavior, and as a result, the powerful are stricter in their judgment of others while being more lenient toward their own actions."

None of the biases in the list seems to cover this one. The ones that came closest were some of the social biases, such as illusory bias. Before I added it to the list, I wanted to check to see if it was already covered by one of the listed biases. --Nick (talk) 09:25, 11 February 2010 (UTC)

Post-decision bias after irreversible elective surgery?

Hi, looking for literature regarding this effect and especially the likely bias it introduces into patient satisfaction surveys and such. I found quite a few somewhat related topics but nothing that would match the situation very well - is there anything? Richiez (talk) 23:52, 29 April 2010 (UTC)

"Early information is better" bias?


Can't find it in the list and did not do the research myself, but I'm convinced (by introspection: why do I value this political argument more than the other?) there is another bias: whatever you hear first is "more true" than what you hear later, even when the sources of information have the same credibility. The experiment would be something like: mum tells a kid that people like blue more than any other color. A month later dad says red is the most favorite color. My guess is that the kid will insist that it's blue, even when the two sources of information are just as credible and there is no obvious reason why either color should be "better". (Of course, swapping colors and parents should be part of the experiment.) The reason for this phenomenon could be that people create a point of view based on facts they heard before, and it's just more work to change your vision than to deny new facts that don't fit in it.

Joepnl (talk) 17:29, 1 August 2009 (UTC)

This would come under Confirmation bias. MartinPoulter (talk) 18:42, 1 August 2009 (UTC)
Thanks! Joepnl (talk) 00:03, 2 August 2009 (UTC)

How about this also (from the article):

Primacy effect — the tendency to weigh initial events more than subsequent events.
In my learning of psychology (I did half a degree in it), primacy effect refers to a stronger memory for early events than subsequent events, and the linked article on serial position effects seems to be about this. Joepnl's question was about opinions based on early evidence being defended against subsequent opposing evidence. That's why I recommended confirmation bias rather than primacy effect. MartinPoulter (talk) 18:25, 4 August 2009 (UTC)
Thank you both, psychology really is interesting Joepnl (talk) 01:22, 2 May 2010 (UTC)

Christian McClellan spam

This guy "Christian McClellan" is a real villain. He has spammed the entry for "Bias blind spot" several times. I suggest a permaban. (talk) 21:12, 16 June 2010 (UTC)

Realism Theory?

I believe that "realism theory" has another name. Does anyone know what it is? 9:08, 21 Nov 2005

Don't know if helpful, but I am looking strongly into developing the subject of "bias". 10-25-10 —Preceding unsigned comment added by (talk) 21:22, 25 October 2010 (UTC)

Bystander effect

This doesn't seem to me to be a cognitive bias. It is discussed in a lot of the same sources that discuss cognitive biases, but that doesn't make it one. An effect is not the same as a bias that causes an effect. MartinPoulter (talk) 14:02, 29 June 2010 (UTC)

You are quite right! (I did not think of it when I corrected the edit.) Lova Falk talk 17:41, 29 June 2010 (UTC)
A cognitive bias that can lead to the bystander effect is called pluralistic ignorance. In a helping situation, it is the bias to assume that no one thinks an event is an emergency because no one is acting like it is. The trouble is that people tend to freeze when something unexpected happens. This can cause everyone to look around, see that others aren't doing anything, and therefore choose not to act themselves. It would be a type of attribution bias because people are assuming that others' behavior can be explained by their internal states (their beliefs) when in reality it is aspects of the situation that are impacting behavior. Osubuckeyeguy (talk) 18:09, 29 June 2010 (UTC)

Social Biases are not a type of Cognitive Bias

Cognitive Biases are one thing. Social Biases are another. Social Biases should be split to make a new article, "List of Social Biases."

I disagree. Social biases stem from the way that people think (cogitate) about social interactions. At the very least, make some sort of argument for your assertion.

--NcLean 8th of October 2006

Valence effects

Optimism bias and Valence effects have separate articles, but what's the difference? Also, are there known systematic links between optimism effects (Rosy retrospective, valence effect, planning fallacy, overconfidence effect, false-consensus...)? --NcLean 8th of October 2006

Comfort and Implications effects

Where does one's desire for comfort at the expense of something he knows to be more beneficial fit into this list of biases? In other words, a major obstacle to clarity is the human predisposition to adopt and maintain beliefs which are comfortable for us as opposed to true. Perhaps related, or not, is the skewing of one's perception due to the implications of his/her choices. For instance, if I decide that the right thing to do is to help my wife deal with a sick child, then I will have to stop surfing the web, which gives me more immediate pleasure. [Helping my wife and child is a deeper pleasure, but not immediately gratifying]

Directly related to 'comfort bias' is a bias which isn't listed yet strikes me as one of the most significant of all: believing what you want to believe. Atheists argue that people believe in God and heaven because people very much want to believe in the afterlife. This bias is tremendous in the effect it has had on mankind. Where is it listed? Simon and Garfunkel wrote a song that had a line about this bias.

Bandwagon effect

I don't see a good psychology citation for this one either. Or lots of the other ones. That's why it's B-grade in my view. There are other pages with better citations - and good books (cited in the article). But this article is not a bad starting point. —The preceding unsigned comment was added by (talk) July 16, 2007

New topic

There is a recent (or recently recreated) article over at Semmelweis reflex. Do ya'll think that is a neologism for one of the biases already on this list (perhaps a version of confirmation/disconfirmation bias), or does it deserve a place of its own? - KSchutte (talk)

Does this article need renaming?

Improvement of this article is hampered by the fact that there doesn't seem to be any scholarly source which gives a definitive list of cognitive biases: a situation very different from, say, logical fallacies. If anyone knows of such a source, I'd love to know about it. What we do have is a set of books and papers that can be identified as cognitive psychology or social psychology, and which between them describe biases in judgment and decision making.

Not only is there not a canonical list, the term "cognitive bias" seems misleading. It implies a bias in judgement or decision-making that is due to heuristics or passive reasoning errors, as opposed to motivational effects such as cognitive dissonance. A few decades ago when the "heuristics and biases" and "hot cognition" areas of psychology were more separate, there was a clearer distinction between cognitive biases and some of the other biases on this list. Since the early 1990s, it has been recognised that a lot of biases have both motivational and cognitive components. Explanations of confirmation bias or hindsight bias, for example, include both cognitive and motivational effects. Hence I question whether it's useful, or fair to the sources, to attempt to demarcate the cognitive biases. This is an issue which has come up before: [1]

Something like this list needs to exist, and in fact should be a high priority given the huge amount of research on biases and the great public interest in the topic. It would be destructive to pare it down to biases that are uncontroversially "cognitive". It seems like List of biases in judgment and decision making would be a less presumptive/ less technical alternative. Look at the titles of two of the key textbooks: The Psychology of Judgment and Decision Making; Thinking and Deciding.

We still need a way to hold back the tide of unwanted additions (effects related to psychopathology or psychotherapy; logical fallacies; behavioural effects such as Bystander effect) but a well-worded lede and some in-article comments could achieve that. MartinPoulter (talk) 16:53, 30 July 2010 (UTC)

Yes. Not only are there plenty of books on the subject which use the term "judgment and decision making", there is even a Society of that name! Fainites barleyscribs 10:47, 31 July 2010 (UTC)
Just backing up Fainites' point, here are more textbooks that should be crucial sources for this list: Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making, Judgment and decision making: psychological perspectives, Blackwell handbook of judgment and decision making. On the other side (trying to avoid confirmation bias), one of the key textbooks is Rudiger Pohl's "Cognitive Illusions". However, even that book has a subtitle of "A handbook on fallacies and biases in thinking, judgement and memory." MartinPoulter (talk) 12:24, 31 July 2010 (UTC)
I agree on the renaming and I think that unwanted additions might be taken care of by explicit categorizing and introductory remarks to the list. As to the literature, these (unmentioned in the articles) might be of central interest: Hypothetical Thinking: Dual Processes in Reasoning and Judgement (Essays in Cognitive Psychology) by Jonathan Evans, and an earlier book by the same author, Bias in Human Reasoning: Causes and Consequences (Essays in Cognitive Psychology). Best, --Morton Shumway (talk) 11:20, 3 August 2010 (UTC).
It does seem that the topics of logical fallacy and cognitive bias have been mashed together with neurological biases. I'm leaning toward renaming and or splitting. Comment: Wikipedia says "Pareidolia – a vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse." However, Apophenia, the ability to pull meaningful data out of what is essentially noise, is likely the basis of Pareidolia.[2] To quote: "Apophenia is the tendency to find meaning in noise. This is how we see constellations in the stars, faces on Mars, and Kaziklu Bey's visage grinning evilly from a slice of cinnamon toast." - Dreadfullyboring (talk) 14:53, 19 December 2010 (UTC)

Hostile media effect

Although an anonymous user deleted it, this does seem to be a bias of the same kind as the rest in the list. It's a bias in human information processing, discovered by researchers who investigated other biases in the list, and published about in the same journals and books. I'm going to restore it to the article. MartinPoulter (talk) 10:24, 9 January 2011 (UTC)

Good decisions - following decision theory

I'm not sure where it belongs in the Cognitive biases contents; however, I would like to see a definition, in the Tversky/Kahneman sense and following decision theory, of what a good decision is. And, since the study focuses on the opposites or fallacies, what a 'poor decision' is. (talk) 17:20, 1 February 2011 (UTC) Fjeld

Forer Effect

Is the example given for Forer Effect, "horoscopes", possibly a little insensitive to the believers in astrology and numerology? It seems a little dismissive while there are more innocuous ways to frame the concept and this just seems to pick on a less prominent belief system. To illustrate: I wouldn't want to conclude with "...a proselytizer's sermon," which could be said to have this very effect on the searching listener, either. Similarly, consumers are moved by targeted marketing recruitment commercials for entry-level positions as though their strengths are being called upon specifically.

I was thinking perhaps "For example, a fortune cookie" would be less offensive while illustrating the concept fully. Stopblaming (talk) 21:25, 18 February 2011 (UTC)

The experiment which demonstrated the Forer effect specifically used horoscope material as its source, and the reliable sources which discuss the effect mention it in conjunction with horoscopes. Wikipedia is not censored, and we should not be removing or replacing factual, verifiable information in an article for fear that readers might be "offended" by it. A fortune cookie is not an appropriate example because it is not presented as an individual description of the reader's personality. It would be misleading to readers about what the Forer effect actually is. MartinPoulter (talk) 13:04, 19 February 2011 (UTC)
Just to emphasise, "people might be offended" is never, of itself, a reason to remove or distort information on Wikipedia. MartinPoulter (talk) 13:17, 19 February 2011 (UTC)

"More footnotes" tag - should this be removed?

There's a "more footnotes" tag at the head of the article, and some "citation needed" tags in the text. However, as this article is essentially a list and the large majority of entries are linked to their own, usually well-referenced Wikipedia articles, is it really necessary for them to have additional citations here? It seems to me these tags can be removed. (talk) 01:42, 13 March 2011 (UTC)

Capability Bias

Every single description that I've been able to locate on the web defines the capability bias as:

"the tendency to believe that the closer average performance is to a target, the tighter the distribution of the data set"

And by "every", I truly do mean every single one of them. I can find no alternate definitions or clarifications, and every instance of the capability bias listed on the web (that I could find) has been copypasta'd in the same large list of cognitive biases.

I started looking because the above definition doesn't make much sense to me - the grammar and punctuation leave it ambiguous to me. I cannot think of a reasonable instance where I would use the term "capability bias" as a label for something which has happened.

The only further information I could find was in the wikibin - this article once did exist but has since been omitted. The only additional information listed there, not found anywhere else, is:

"Long known, but recently codified bias. Based on observations by Daryl Clements, and Steve Hajec in their many years working with executives in the area of Business Excelence."

However, a search on Clements and Hajec (including a search of scholarly articles) came up empty. I can't even find any evidence of a Daryl Clements or a Steven Hajec ever having published anything scholarly - not together, and not even on their own. I'm not convinced this bias is recognized by scholars in the field, and it may even be something fake that suffered from truth-by-default due to copypasta. I don't think that "citation needed" suffices in this case. —Preceding unsigned comment added by (talk) 21:19, 8 April 2011 (UTC)

I agree. 1,040,000 results for the copypasta. 4,000 results for "capability bias" without the copypasta definition, and those are largely coincidental occurrences of the two words (and a couple mirrors of the deleted Capability Bias article). If it can't be sourced, it should be deleted. (talk) 21:25, 27 April 2011 (UTC)
This list is long enough without questionable entries, so I have simply removed it. Hans Adler 22:10, 27 April 2011 (UTC)
Thanks all for doing this. Nice detective work by the anonymous author. MartinPoulter (talk) 15:11, 28 April 2011 (UTC)

Possible clarifications/additions

I'm a relative newbie to the formal study of psychological heuristics and cognitive biases. In perusing this list, I wondered some things:

a. Would "Stereotyping" be enhanced or clarified by the addition of "(regardless of whether the stereotype may be probabilistically 'true')"? Or, further and in general, "attempting to apply general probabilities that are relevant in a situation to a particular instance"? Or is the latter a different bias altogether? (Granted, it is the misapplication of probability theory ("Neglect of probability" bias?), or simply ill-education. cf. "Misuse of Statistics.")

b. Is the "Just-world phenomenon" (Social Biases) the same as the "Just-world hypothesis" (Biases in Probability and Belief)? If so, should they be combined? If not, why are they different? Should they reference each other?

c. Would "System justification" benefit from the addition of, "a.k.a. that's-the-way-it's-always-been or we-don't-do-it-that-way-here" for a wider audience?

d. Where does the NIMBY syndrome fit? (NIMBY: "Not In My Back Yard.") Is it a cognitive bias?

Thanks. Spartan26 (talk) 02:30, 14 October 2011 (UTC)

It has been suggested that List of memory biases be merged into this article or section.

  • Support - makes perfect sense, can't think why this has been outstanding since August! Pesky (talkstalk!) 13:41, 25 November 2011 (UTC)

Yes, makes sense to merge different groups, and I will start with some of this work... first I think it is useful to merge
"Decision-making and behavioral biases: Many of these biases are studied for how they affect belief formation, business decisions, and scientific research"
"Biases in probability and belief: Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research"
which is really arbitrary... --InfoCmplx (talk) 20:04, 14 January 2012 (UTC)

Where does "repetition bias" fall?

This is a common mistake where we select for the answer that we've heard more often, regardless of how much or little sense it makes. This might also be considered a "strong meme" bias, where ideas that are better at getting repeated (are more "memetically fit") are given prevalence over others on the presumption that passing through more minds has provided a stronger filter for bad ideas.

I was taught this was a "perseverance error". I've not heard it described as a bias. Or you might be talking about the availability heuristic. I read a lot of bias literature and I've not encountered "repetition bias" as a term in itself. Willing to be pointed to new sources, though. MartinPoulter (talk) 18:21, 2 March 2012 (UTC)

Doubt avoidance?

Should "doubt avoidance" perhaps be on this list? Or is it already covered under another category? E.g. the tendency of humans to want to 'fill in' information that isn't available. Cf. "This model is taken from 'Poor Charlie's Almanack', by Charles Munger: The brain of man is programmed with a tendency to quickly remove doubt by reaching some decision. It is easy to see how evolution would make animals, over the eons, drift toward such quick elimination of doubt. After all, the one thing that is surely counterproductive for a prey animal that is threatened by a predator is to take a long time in deciding what to do. And so man's Doubt-Avoidance Tendency is quite consistent with the history of his ancient, non-human ancestors. So pronounced is the tendency in man to quickly remove doubt by reaching some decision that behavior to counter the tendency is required from judges and jurors. Here, delay before decision making is forced. And one is required to so comport himself, prior to conclusion time, so that he is wearing a 'mask' of objectivity." (talk) 17:19, 27 March 2012 (UTC)

renaming of this page

There is an existing consensus that the title of this list is inappropriate and misleading. I'm going to go ahead and rename. MartinPoulter (talk) 14:28, 13 October 2012 (UTC)

I'm concerned about the move

The new name is too long, and reflects an academic divide (and not all of it, at that) which is unimportant to the article topic. IMHO, the word cognitive as a mashup is precisely the correct, universally understood umbrella term into which the list should be poured, sieved internally. It might be important to decide which is most important for this article, and which should go in another article. We have neurologists, psychologists, and all manner of combo-disciplines employing ever finer microtomes of linguistic point-shavery just to be pointy. Should we really be playing into that game, even as some of the research and claims (as we speak) are being thrown out as fraudulent, irreproducible claptrap? Help me out here. I just don't think dragging the reader down the rathole of indistinguishable distinctions without differences is beneficial. Brevity and clarity should be more important than that. --Lexein (talk) 18:13, 13 October 2012 (UTC)

Speaking of brevity and clarity, can you set out the content of your last few sentences in a more succinct way? And what does "an academic divide which is unimportant to the article topic" refer to? I dispute that "cognitive" is either correct (it's POV) or universally understood. Why do you think the Wikipedia article should differ from the books on which it's based. Help me understand what your objections are. MartinPoulter (talk) 21:21, 13 October 2012 (UTC)
Wow. Well, no, I asked you first. You wrote "explanations of confirmation bias or hindsight bias, for example, include both cognitive and motivational effects." So, why not simply add the word "motivational" (or some other) in the title, rather than remove "cognitive"? Is removing one word, as opposed to adding another, the only solution? The so-titled journal Cognitive Psychology, doesn't seem to ban the word, and hasn't renamed itself Judgment and Decision-making Psychology or even Cognitive and Motivational Psychology, so why, precisely, is "cognitive" too POV to retain as part of the title? The name of the other Wikipedia article, upon which this list is based, is Cognitive bias - is that to be renamed? To what?
This article is narrowly about observed biases, not the entirety of the containing field. Given that of all the titles of all the books offered as rationale, all describe a field: only one of those titles includes the word "bias", and only along with "reasoning" and "cognitive". So, I assert that it is just as incorrect to entitle this list "judgment and decision-making biases" (if read too narrowly, an OR term), as it is to narrowly focus on only "cognitive biases". Some better naming solution exists, as yet unknown. --Lexein (talk) 23:11, 13 October 2012 (UTC)
The sources don't use the phrase "cognitive and motivational bias".
"This article is narrowly about observed biases" yes, but of what? Biases of electrical measurement by magnetic fields? Biases of statistical methods? What is being biased? In all these cases, the answer is either a judgment or a decision, so the present title of the article is appropriate. The journal Cognitive Psychology is about the discipline of cognitive psychology, so it's fairly named. It's fair for Wikipedia to have an article about cognitive bias, it's just that almost nothing on this list is uncontroversially a cognitive bias (and to imply they all are is unfortunately blatant POV), although it's not controversial to say they are biases in judgment and/or decision-making. If you're concerned about accessibility, I should point out that "cognitive" is not a common word that most English speakers would be expected to understand, but "judgment", "decision" and "making" should be relatively widely understood.
What does "(if read too narrowly, an OR term)" mean? Thanks for writing in a more clear and understandable way. MartinPoulter (talk) 15:04, 19 October 2012 (UTC)

question about inline citations

Since this is a list type article, does every entry really need inline citations? I was under the assumption that lists were present for the purpose of aggregating information and giving a cursory overview. If the respective linked article is properly cited, do all the entries here need to be cited as well? It seems non-productive and only really adds clutter. Darqcyde (talk) 01:38, 19 October 2012 (UTC)

I get your point, but the article has become something of a magnet for original research. Though in an ideal world I'd agree with you and we wouldn't require refs for each item, I think in the present case we need to ask for a citation that shows that each term is a bias within the scope of the article. I would hope that when fully developed, this list article will do what the best Wikipedia lists normally do which is give information about each item (e.g. when a term was first introduced): it's not just a navigational tool. If so, then there will have to be references for each fact. MartinPoulter (talk) 14:53, 19 October 2012 (UTC)

Baader-Meinhof phenomenon

Relevant discussion at Wikipedia:Articles for deletion/Baader-Meinhof phenomenon (2nd nomination)

For Frequency illusion, the phrase Sometimes called "The Baader-Meinhof phenomenon" is currently {{fact}}-tagged[3]. This is a blog-propagated term unlikely to be found in any actual study, but as a popular online factoid it gets posted to Reddit about once a month[4].

This language could be changed to something like "popularly called Baader-Meinhof phenomenon", but there is some debate on what that term really describes from a scientific perspective (examples), so linking it to a scientific term might be WP:OR. Also, the term is unlikely to be used among people who know what the actual Baader-Meinhof is (such as Germans, or people educated in recent history).

So I figure List of cognitive biases should probably stick with terms used in relevant study, and popular neologisms can be omitted entirely. Does anyone disagree here? / edg 16:18, 9 October 2012 (UTC)

I agree. We should use the scholarly terms primarily, mention popular synonyms where sources allow, and avoid neologisms. MartinPoulter (talk) 18:21, 9 October 2012 (UTC)
Well, where does the term come from? Why is it popularly referred to as the Baader-Meinhof phenomenon? There is an article about the Baader-Meinhof group, what is its relationship with that? (talk) 03:08, 25 October 2012 (UTC)

Forer vs. Barnum Effect

Barnum effect redirects to Forer effect. The first sentence of the description is identical, while the second one is only in the description of the Forer effect.

Shouldn't one of the following actions be taken?

  • Remove Barnum Effect from the list
  • Use the whole description of Forer effect on Barnum effect (talk) 15:18, 24 October 2012 (UTC)

I've listed Barnum effect as an alternative name for Forer effect.
There's an open question about Forer effect/subjective validation, too. See Talk:Forer effect#Something.27s_wrong.... —Mrwojo (talk) 20:41, 24 October 2012 (UTC)

Requested move

The following discussion is an archived discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. Editors desiring to contest the closing decision should consider a move review. No further edits should be made to this section.

The result of the move request was: Move. Cúchullain t/c 15:30, 14 February 2013 (UTC)

List of biases in judgment and decision making → List of cognitive biases – This is a shorter, and much more popular name for this. --Relisted Tyrol5 [Talk] 02:21, 18 January 2013 (UTC) Greg Bard (talk) 08:38, 10 January 2013 (UTC)

If you look further up this Talk page, you'll see why it was recently moved from "List of cognitive biases". You're right that "Cognitive biases" is more popular in the web as a whole, but it's not correct, and academic literature more often uses "in judgement and decision making". In fact, one of the concerns about the old name is that there does not seem to be any academic literature giving a "list of cognitive biases". Also, the "cognitive biases" title fails WP:NPOV, because "cognitive" is used in a lot of the literature for biases that arise from cognitive heuristics. Very few if any of the biases on this list are purely explainable in terms of heuristics. On the other hand, they are all (or nearly all) biases in judgement and/or decision-making. Another of the arguments made above is that "cognitive" is not a common word known to the typical Wikipedia reader, but "judgement" and "decision" are more likely to be. MartinPoulter (talk) 10:16, 18 January 2013 (UTC)
  • Support: Martin may be perceiving a false-consensus effect, since he seems to be the one that previously moved the page and may now be engaging in irrational escalation. There seems to be an acknowledgement here that "cognitive bias" is the WP:COMMONNAME, and the claim that it is WP:NPOV does not seem correct to me. Rather, I think the claim is that the current name is more precisely accurate in the opinion of an academic expert, rather than that "cognitive bias" expresses some opinion. I don't exactly follow the rationale for saying that "cognitive bias" is insufficiently precise, since these phenomena are biases related to cognition. However, any lack of precision in the WP:COMMONNAME can presumably be addressed by adding some clarifying remarks within the article itself. As the policy says, the article name should not be pedantic, and "the term most typically used in reliable sources is preferred to technically correct but rarer forms". The more common term "cognitive bias" seems sufficiently disambiguating for Wikipedia purposes, less artificial, and more succinct. —BarrelProof (talk) 22:01, 5 February 2013 (UTC)
  • Support - I opposed the earlier rename, which was really pushed through by aggressive and vocal tag-teaming and shouting-down by two highly biased editors, against WP:COMMONNAME and against usual Wikipedia naming sensibility. COMMONNAME should obviously prevail here, with the inclusion in the lead paragraph explaining the lengthier usage in some (though assuredly not all) literature. I just let it go previously, knowing full well that it would eventually be reverted back. If there are subtopics which don't belong under the original title, they should go in a different article, rather than resorting to an unwieldy title which is, let's face it, hostile to the readership. We don't rename things because a few books or articles use a new fad name, we wait until the majority of the field settles on a new name, which, by the way, is usually similar in length, rather than clumsily longer, than the old. I'd prefer some neologism (even Greek, Latin, or German!) as a title over the current bloviation. --Lexein (talk) 13:28, 6 February 2013 (UTC)
  • Query. Are all "cognitive biases" biases that involve judgment and (and/or?) decision[hyphen]making..? CsDix (talk) 18:04, 12 February 2013 (UTC)
    • According to the cognitive bias article, "a cognitive bias is a pattern of deviation in judgment". If that is correct, then I suppose the answer is yes. However, I notice that the list in this article includes some biases that do not seem to necessarily be about either judgments or decision-making. The list includes "belief and behavioral biases", "social biases" and "memory errors and biases". The descriptions of many of the phenomena are not really a matter of judgments or decision-making (e.g. the curse of knowledge, bizarreness effect, and childhood amnesia) – especially the items under "memory errors and biases". —BarrelProof (talk) 20:19, 12 February 2013 (UTC)
Lexein, could you please name the "two highly biased editors" who used "aggressive and vocal tag-teaming and shouting down"? Maybe it would help if you link to the diffs or talk page section where this happened.
Also to Lexein: "We don't rename things because a few books or articles use a new fad name, we wait until the majority of the field settles on a new name" You are absolutely right. This is why we should not be using the term "cognitive bias" to cover all these biases when the textbooks don't. What literature are you working from?
CsDix: the answer is yes. That means that some entries have to move out of this list, but that's pretty inevitable as the list is basically a grab-bag of different things.
To BarrelProof's "false consensus" accusation: Before this requested move, we had me proposing, User:Fainities supporting, User:Morton Shumway supporting, User:Dreadfullyboring "leaning towards renaming or splitting", User:Lexein "concerned". Not unanimity, but only I and Morton Shumway have so far actually cited literature in support of our arguments. I could of course speculate about biases that are shaping your judgment, but it would be rude and not advance the discussion, so I won't.
BarrelProof: "the claim that it is WP:NPOV does not seem correct to me". Interesting and I hope you're right. Can you spell out the argument?
Problem for BarrelProof and Lexein: where in reliable academic literature is there a list of cognitive biases on which this article can be based? Conversely, is the literature on biases in judgement and decision-making? Concrete example: does Hindsight bias belong in an NPOV list of cognitive biases? Does it belong in a list of biases in judgment and decision-making? MartinPoulter (talk) 00:05, 13 February 2013 (UTC)
Martin, I hope you understand that my references to the 'false consensus effect' and 'irrational escalation' were primarily meant to be humorous. I just thought it would be fun to use topics from the article itself in the renaming discussion. I'm not sure I understand some of your new comments. The reason that I don't see an NPOV problem in calling this a list of cognitive biases is just that I don't see any particular opinion or perspective being promoted by using that name. To me, yes, hindsight bias does seems like a cognitive bias. I'm not sure whether it really fits as a bias in judgment and decision-making or not, although I think it probably does. But I don't understand why you asked about that. To me, the term 'cognitive bias' just refers to biases related to cognition in general, which seems broader in scope than biases in judgment and decision-making. —BarrelProof (talk) 01:25, 13 February 2013 (UTC)
  • Support - This is a silly terminology quibble. Roger (talk) 03:22, 13 February 2013 (UTC)
  • Oppose - Hi all. I accept the common name argument, however, I also think that concerns over the use of the term “bias” in psychology are valid and not marginal.[1][2][3][4] I would therefore like to suggest a third option. This would be to merge this list with list of psychological effects (retaining that latter name). That list could then be broken down into sub-lists according to discipline (e.g. social psychology, cognitive psychology, clinical psychology). While I know that a taxonomy along those lines will be far from perfect, at this stage I think that it will be far less fraught than trying to distinguish between “biases” and “heuristics” or between “decision making” and “social biases”. I also feel like this will be a list that will adequately serve the purposes of a list of this nature. Of course, I am keen to hear what others think. Cheers Andrew (talk) 03:51, 14 February 2013 (UTC)
  • Support "List of cognitive biases" is a better name than "List of biases in judgment and decision making", and I fail to see the controversy. --Spannerjam (talk) 05:53, 14 February 2013 (UTC)



  1. ^ McGarty, C. (1999). Categorization in social psychology. Sage Publications: London, Thousand Oaks, New Delhi.
  2. ^ Turner, J. C.; Reynolds, K. H. (2001). Brown, R.; Gaertner, S. L., eds. "The Social Identity Perspective in Intergroup Relations: Theories, Themes, and Controversies". Blackwell Handbook of Social Psychology: Intergroup processes. 3 (1): 133–152. 
  3. ^ Oakes, P. (2001). The root of all evil in intergroup relations? Unearthing the categorization process. Blackwell handbook of social psychology: Intergroup processes, 4, 3-21.
  4. ^ Turner, J. C.; Oakes, P. J. (1997). McGarty, C.; Haslam, S. A., eds. "The socially structured mind". The message of social psychology. Cambridge, MA: Blackwell: 355–373. 
The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page or in a move review. No further edits should be made to this section.

Baader-Meinhof Phenomenon

There used to be a separate page for the Baader-Meinhof Phenomenon. What happened to it? --David G (talk) 16:38, 14 May 2013 (UTC)

Hi David G! It got deleted. See: Wikipedia:Articles for deletion/Baader-Meinhof phenomenon (2nd nomination) Lova Falk talk 14:01, 29 May 2013 (UTC)

Identified victim bias

I added this text, which was reverted by U3964057 (talk · contribs): "* Identified Victim Bias - the tendency to respond more to a single identified person at risk than to a large group of people at risk."

My citation was to, and the reversion noted that "A 2012 conference website is not sufficient substantiation that this is a commonly accepted 'cognitive bias'".

So my question is: What would be considered "sufficient"? This? Thanks, Vectro (talk) 13:23, 11 August 2013 (UTC)

Hi Vectro. As per the Wikipedia guidelines around reliable sources, there is no across-the-board principle that can be applied in terms of what is and is not a sufficient source. That is, it is "context dependent". In this case I would say that the journal article you offered will do as an appropriate source for the notability of the effect you added to the list (although a review article, or other tertiary source, would be better). I would say then by all means add in the 'identified victim effect' with the reference you suggest. Other editors are of course welcome to chime in with their thoughts. Cheers Andrew (talk) 13:31, 12 August 2013 (UTC)
OK, since there are no (other) objections, I will re-add with that source. Vectro (talk) 03:28, 15 August 2013 (UTC)

Explaining base rate fallacy

Paging @Spannerjam: about this edit. "the tendency to base judgments on specifics, ignoring general statistical information", while not perfect, seems to require less prior knowledge than " the tendency to ignore base rate probabilities", which includes two technical terms. Someone who doesn't know what the "base rate" is won't be helped by looking at that definition. MartinPoulter (talk) 10:28, 2 September 2013 (UTC)

Misattribution as a cognitive bias?

Hi all. Spannerjam felt that my removal of "source confusion" was unjustified and reinserted the item with an improved definition. Despite the improvement (which I do like), I still don't think source confusion belongs in the list. To elaborate upon my edit summary, I have always come across source confusion and memory misattribution as potential outcomes of some cognitive effect (or "bias"), not as cognitive effects unto themselves. For example, in the classic "who said what" paradigm, a salient and inclusive social category is shown to lead to source confusion as to which category member said what phrase.

To include features of human cognition like source confusion really broadens the scope of the list beyond cognitive effects and decision-making heuristics. In the long run I think we could expect a much longer list and an eventual rename to something like list of perceptual and memory phenomena. Anyway, I am keen to hear others' thoughts. Cheers Andrew (talk) 14:15, 6 November 2013 (UTC)

Should the Abilene paradox be included?

Personally, I think it fits. Luis Dantas (talk) 18:40, 21 December 2013 (UTC)

Hi Luis Dantas. I would suggest that the Abilene paradox does not really qualify as a cognitive bias. I would instead say that the Abilene paradox describes a particular type of outcome that may arise as conformity processes operate within a particular context. It is these conformity processes that one might think of as being subject to cognitive biases; not the Abilene paradox itself. Does that resonate with you? It might be illustrative to point out that pluralistic ignorance also does not make the list. Cheers Andrew (talk) 12:49, 24 December 2013 (UTC)

Ludic fallacy

I looked at the page for the Ludic fallacy, and I found it extremely wanting. I think that this is a term coined by Taleb and as far as I can tell it is only used in the popular press. While it is nominally about the "misuse of games", that is itself a bit of hyperbole, and the "bias" Taleb identifies is that people are using inductive reasoning instead of... something else? I'm not seeing any identifiable bias here. I think it should be removed from this list. 0x0077BE (talk) 00:06, 12 February 2014 (UTC)

Hi 0x0077BE. I agree. It seems to have dubious notability credentials and even more dubious credentials as one of the "decision-making, belief, and behavioral biases". I am going to go ahead with the removal. Cheers Andrew (talk) 02:57, 12 February 2014 (UTC)

difference in the way Bizarreness effect is written

I see that "Bizarreness effect" is entered in a different format than the other names in the lists, without explanation. Please provide the reason; I am curious.

'Bizarreness effect'
Choice-supportive bias
Change bias
Childhood amnesia
Conservatism or Regressive Bias

Thank you, Wordreader (talk) 15:53, 24 January 2014 (UTC)

Looking good, Dream Eater. Thank you, Wordreader (talk) 01:40, 15 March 2014 (UTC)

Frequency illusion and the Baader-Meinhof phenomenon

The page Baader-Meinhof phenomenon has been deleted, and there is no page for Frequency Illusion. I'd like to include in this page that "Frequency Illusion" is also called the Baader-Meinhof phenomenon, and make both of those pages redirect to this article. Does that seem reasonable? Havensfire (talk) 21:24, 19 April 2014 (UTC)

Secrecy Bias

Political scientists have identified a bias towards assigning more weight to information one believes to be secret. Would this be an appropriate addition to this list?

A source:

Travers, Mark, et al. "The Secrecy Heuristic: Inferring Quality from Secrecy in Foreign Policy Contexts". Political Psychology, Volume 35, Issue 1, pp. 97–111, February 2014.

link to paper

Scu83 (talk) 22:32, 12 June 2014 (UTC)

Hi Scu83. I would prefer not to include it at this stage on the grounds of 'suspect notability'. In other words, I would like to wait to see if the concept is adopted by the broader scientific community. This is because getting an article through peer review doesn't equate with scientific consensus. Others may disagree, but I think this wiki-article is more useful as a list of widely accepted cognitive biases rather than as a list of all the cognitive biases that anyone has ever thought up and got published. Keen to hear what others think though. Cheers Andrew (talk) 12:09, 13 June 2014 (UTC)

Pandering Bias

There may be a better name for this, "Pandering Bias" is the best I could come up with. I haven't found a specific reference to this, and I am hoping someone reading this has seen one.

But I definitely find - both in my personal experience (anecdotal, of course), and through history in general (admittedly not scientific) - that anything complimentary to the human species will be accepted without question, while traits that are negative will be rejected.

For example, I find that rational choices are rarely made by human beings in their personal life. In fact, the Biases page itself is a huge list of examples of this. But almost everyone feels that they usually make rational choices, and only rarely make instinctive ones.

And an example of the converse - in 1973, Jane Goodall and Richard Wrangham discovered - to their dismay - strong evidence that many primate species practice warfare, thus making it a hard-wired instinct, rather than a cultural practice that might be halted. However, that viewpoint is rejected by everyone who comes to hear about it (confirmation bias?), with the possible exception of scientists who have examined the research.

Has anyone seen this overall "pandering" phenomenon mentioned elsewhere? I obviously can't put it in the list without more support and references. (talk) 23:38, 12 August 2014 (UTC)

Just-World Hypothesis

Spartan26 asked in 2011 here the question: "Is the "Just-world phenomenon" (Social Biases) the same as the "Just-world hypothesis" (Biases in Probability and Belief)? If so, should they be combined? If not, why are they different? Should they reference each other?" In response I would point to this article on the just-world phenomenon that ends with "Also Known As: Just-world theory, just-world hypothesis, just-world fallacy or just-world effect." This suggests that these are equivalent. Kind regards, Timelezz (talk) 16:00, 16 October 2014 (UTC)

Hi Timelezz. I agree that it is not necessary to have two entries. I have made the removal and retained the one that I think had better language. I have also opted for placement in the 'social biases' list, but I am not wedded to this decision. Cheers Andrew (talk) 22:12, 16 October 2014 (UTC)

Blaming the victim

I was surprised to find this common cognitive bias omitted, despite having its own Wikipedia article. Anniepoo (talk) 02:51, 7 April 2015 (UTC)

Hi Anniepoo. To my knowledge victim blaming is not a cognitive bias in the sense covered by this article. For the most part it is too multiply-determined a phenomenon. Instead, more specific cognitive biases may reportedly contribute to victim blaming (e.g. ingroup favoritism, just world beliefs). Does that make sense to you? Cheers Andrew (talk) 03:25, 10 April 2015 (UTC)
Makes sense to me. There's a link to Cognitive bias at the end of the Victim blaming article. Is that inconsistent? Anniepoo (talk) 05:22, 10 April 2015 (UTC)
I think that should be fine. The "See also" links are usually intended along the lines of "if you're interested in this article, you may be interested in these other related articles as well," focusing on topics that were not already mentioned in the main text of the article. The list at Victim blaming is longer than usual, so it might be useful to find ways to move some of them to the text. Sunrise (talk) 06:55, 10 April 2015 (UTC)
Hi all. I actually think the 'see also' mention over there could be misleading. I am going to be bold and remove for the meantime. After all, we already have the link in there to just world theory. Cheers Andrew (talk) 11:13, 16 April 2015 (UTC)

Duplicate {{vanchor}}s

There are a number of duplicate {{vanchor}}s on this page (e.g. Regressive bias). They each need to be unique, if links to any but the first are to work. One fix would be to add some or all of the section header in parentheses (e.g. Regressive bias (memory) and Regressive bias (belief)).

Comments? Other possible fixes?Lentower (talk) 03:18, 8 June 2015 (UTC)

Another List bias

I heard this on NPR this morning: items at the top of a list are cited more often in research than items lower in the same list. It may require an article, but I thought I'd put the link here, if anyone wishes to tackle it: [1] Hires an editor (talk) 00:23, 16 July 2015 (UTC)


Assessment comment

The comment(s) below were originally left at Talk:List of cognitive biases/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

Ludic fallacy probably does not belong to this list, as it is not commonly accepted as a cognitive bias. It is used almost exclusively by its inventor Nassim Taleb. I do not believe there has been any published psychology work on "Ludic fallacy." Drhoenikker 16:30, 27 April 2007 (UTC)

Last edited at 16:30, 27 April 2007 (UTC). Substituted at 15:16, 1 May 2016 (UTC)

Notable omission

There is a bias where a thief will see everyone else as thieves, or where someone who cheats will accuse everyone else of cheating, without realizing it. What is that bias called? — Preceding unsigned comment added by (talk) 07:23, 9 October 2015 (UTC)

I think the "thief considers everyone else a thief" phenomenon may fall under the heading of Psychological projection. I personally would classify that as a cognitive bias, but experts may say otherwise. — Preceding unsigned comment added by (talk) 01:16, 22 October 2015 (UTC)

We are more likely to believe a statement if it is easy to understand.

This two-part article gives a pretty good overview of the bias (if you can overlook its political leaning) and summarises it like this:

"The less effort it takes to process a factual claim, the more accurate it seems."

Is this bias listed on the page? If not, could we add it? JoeyTwiddle (talk) 11:17, 19 January 2016 (UTC)

This bias seems especially interesting at present, with the rise of meme (image) culture. It might be related to System 1, but I find the concise summary above quite valuable, both for people hoping to influence others, and for those seeking to avoid being unduly influenced! — Preceding unsigned comment added by (talk) 07:15, 25 January 2016 (UTC)
What is needed is a citation to a name for such a bias. A name is not mentioned in the article, nor in the study mentioned. To make one up would be original research. Ultimately, acting on a cognitive bias could be considered lazy thinking to start with. If people put some effort into thinking about it, they would see the bias. That said, there is processing fluency, attribute substitution and the law of triviality. The book Consumer Behavior mentions "low-effort decision-making processes". Richard-of-Earth (talk) 22:09, 25 January 2016 (UTC)


Declinism

I saw declinism added to the list by User:Kvng and I don't think it should be there, but I thought it best to discuss here before removal. The article on Declinism describes it as a belief. Something can't be both a belief and a cognitive bias. Looking through a Google Scholar search, there are lots of mentions of declinism, but as a belief or a feature of schools of thought: there doesn't seem to be anything about it in the extensive literature on cognitive biases. The reference seemingly used to justify calling declinism a cognitive bias is a newspaper opinion column by psychologist Pete Etchells. It's written by a mainstream scientist, but it isn't part of the peer-reviewed scholarly literature, is openly speculative rather than sharing definitive results, and it attempts to explain declinism as a result of cognitive biases, which is arguably an important difference from saying declinism is itself a cognitive bias. All sorts of beliefs or preferences might be explained in terms of cognitive biases: that's partly why the topic is so interesting. It doesn't mean, though, that those beliefs or preferences all belong in this list. MartinPoulter (talk) 20:32, 16 February 2016 (UTC)

Declinism suggests that cognitive bias may fuel declinist sentiment. Declinism is definitely a belief but there is a close relationship between cognitive bias and beliefs. I don't agree that something can't be both a belief and a bias. I have added another reference to Declinism but it is along the same lines as the first: reliable source and scientist but not scholarly. I don't outright reject removal of Declinism from this list but I would like to see a third opinion before we make any changes. ~Kvng (talk) 16:11, 17 February 2016 (UTC)

Antiquity illusion

Opposite of recency illusion. The pre-dating of one's memories about when a cultural practice, word, etc. was first used. Language Log: The antiquity illusion -- (talk) 15:42, 18 April 2016 (UTC)

projection bias

Should be moved from the "social" category into one more appropriate for intertemporal choice stuff. Its closest relative is empathy gap. --Ihaveacatonmydesk (talk) 19:54, 22 May 2016 (UTC)


Illusion of validity

This description is wrong; illusion of validity is akin to overconfidence (it is basically the same, and IMO the two should be merged). See page for sources. The description used is the one for Information bias (psychology). Cheers --Ihaveacatonmydesk (talk) 19:59, 22 May 2016 (UTC)

@Ihaveacatonmydesk:, per WP:BOLD, go ahead and fix description(s)! Baking Soda (talk) 10:40, 25 May 2016 (UTC)

Cross-race effect

How is this a memory bias? Ihaveacatonmydesk (talk) 23:45, 29 May 2016 (UTC)

on "frequency illusion" being called the Baader-Meinhof Phenomenon

I'm unsure if this is an appropriate reference, but it does discuss the origin of the phrase. - Paul2520 (talk) 22:35, 25 August 2016 (UTC)

"Cheat sheet" proposes big deduplication and regrouping

This blog post claims to be the result of an effort to deduplicate and regroup this article. (talk) 12:03, 3 September 2016 (UTC)

+1 to that, I just came here and saw it was already mentioned. Samois98 16:32, 5 September 2016 (UTC) — Preceding unsigned comment added by Samois98 (talk · contribs)

I've contacted the illustrator mentioned at the bottom of the blog and he's willing to upload a CC-licensed jpeg. I think it would make a good lead/hero image for this article. ~Kvng (talk) 22:05, 12 September 2016 (UTC)
If I start seeing hero images on Wikipedia articles, I'll start shooting puppies. The illustration is great, though. Paradoctor (talk) 03:04, 4 November 2016 (UTC)
For the sake of the puppies, please don't download the Wikipedia mobile app. ~Kvng (talk) 14:13, 6 November 2016 (UTC)
I agree with Kvng. (But then, I agree with everybody about everything! You're all right!) Herostratus (talk) 17:43, 6 November 2016 (UTC)

Generating flashcards from article

I just thought I'd mention that I've written a short Python script that dumps the list into a .csv file ready to import into various spaced-repetition software (such as Anki and Mnemosyne). As I've started using this data I've found it to work pretty well, but some descriptions could use improvement. If you want to learn these in a reasonable manner and help out with writing better descriptions for some of these, check out the script! I expect to make such improvements myself over time, but I currently almost exclusively do the cards when not at a computer, which prevents me from improving descriptions as I find them. Hopefully this'll be of some use to someone! I can really recommend using flashcards to memorize these if you're on the fence. You can find the script here: (talk) 18:10, 17 March 2017 (UTC)
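The script itself isn't linked above, but for anyone curious, a minimal sketch of the same idea might look like the following. The sample entries, the output filename, and the `entries_to_csv` helper are all illustrative assumptions, not part of the actual script:

```python
import csv
import io

def entries_to_csv(entries):
    """Turn (name, description) pairs into CSV text suitable for
    importing into spaced-repetition software such as Anki.

    Each row becomes one note: the bias name as the front of the
    card, its description as the back. The csv module handles
    quoting of commas inside descriptions.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    for name, description in entries:
        writer.writerow([name, description])
    return buf.getvalue()

# Illustrative sample entries, paraphrased from the list.
sample = [
    ("Anchoring", "Relying too heavily on one piece of information when deciding."),
    ("Frequency illusion", "A recently noticed thing seems to appear everywhere."),
]

if __name__ == "__main__":
    with open("cognitive_biases.csv", "w", newline="", encoding="utf-8") as f:
        f.write(entries_to_csv(sample))
```

In practice the real script would scrape or parse the article's list entries rather than hard-code them; the CSV-writing step is the portable part.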

May I suggest Wikiversity? If you put this up as a resource or course there, including an external link per WP:ELMAYBE would certainly be justifiable. Paradoctor (talk) 20:07, 17 March 2017 (UTC)

Great work, editors

This is so useful. Well done, whoever has contributed. :-). Tony (talk) 05:38, 23 June 2017 (UTC)

Learned Helplessness

I feel like Learned Helplessness should be somewhere on this list, but I don't know enough about the subject to be comfortable adding it. Tristyn 18:17, 26 August 2017 (UTC)

A cognitive bias may be involved in the phenomenon, but learned helplessness itself is not a cognitive bias. Actually, it is rational not to expend effort that can reasonably be expected to be wasted. Paradoctor (talk) 22:53, 26 August 2017 (UTC)


@Gaia Abundance Life: This is a list class article. Notice how each point only includes a very short summary with a link to the relevant article. I suggest adding your material to Anthropocentrism instead and to only add a single sentence here if necessary. Thanks, —PaleoNeonate – 19:54, 26 October 2017 (UTC)