Illusory truth effect
The illusory truth effect (also known as the truth effect, the illusion-of-truth effect, the reiteration effect, the validity effect, and the frequency-validity relationship) is the tendency to believe information to be correct after repeated exposure. One science writer has explained it as follows: "Why are so many people convinced that we only use 10% of our brains, or that Eskimos have dozens of words for snow...?" The answer is the truth effect.
This phenomenon was first discovered in 1977 at Villanova University and Temple University. It has in recent years been equated by some researchers with the concept of "truthiness", a term coined by American comedian Stephen Colbert.
The effect was first named and defined following a 1977 study. On three occasions, each two weeks apart, Lynn Hasher, David Goldstein, and Thomas Toppino presented the same group of college students with lists of sixty plausible statements, some true and some false. Twenty statements appeared on all three lists; the other forty items on each list were unique to that list. Participants were asked to rate their confidence in the truth or falsity of each statement on a scale of one to seven; the statements concerned matters about which they were unlikely to know anything (for example, "The first air force base was launched in New Mexico" or "Basketball became an Olympic discipline in 1925"). While the participants' confidence in the truth of the non-repeated statements remained steady, their confidence in the truth of the repeated statements increased from the first to the second and from the second to the third sessions, with the average score for those items rising from 4.2 to 4.6 to 4.7. The researchers concluded that repeating a statement makes it appear more likely to be factual.
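The structure of that design can be sketched in a few lines of code. This is an illustrative toy only: the per-item ratings below are arbitrary stand-ins chosen to echo the reported session averages (4.2, 4.6, 4.7 for repeated items), not the study's actual data.

```python
# Toy sketch of the 1977 design: three sessions of 60 statements each,
# 20 of which repeat across all three sessions; ratings are on a 1-7 scale.
# All numbers are illustrative stand-ins, not the study's raw data.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical ratings for the 20 repeated items, per session,
# chosen to match the reported averages of 4.2, 4.6, and 4.7.
repeated = {
    1: [4.2] * 20,
    2: [4.6] * 20,
    3: [4.7] * 20,
}
# Hypothetical ratings for the 40 items unique to each session: steady.
unique = {session: [4.0] * 40 for session in (1, 2, 3)}

for session in (1, 2, 3):
    print(f"session {session}: "
          f"repeated={mean(repeated[session]):.1f} "
          f"unique={mean(unique[session]):.1f}")
```

The pattern the study reports is the contrast this sketch makes visible: the mean for repeated items climbs across sessions while the mean for unique items stays flat.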
In 1989, Hal R. Arkes, Catherine Hackett, and Larry Boehm essentially replicated the original study, with similar results.
At first, the truth effect was believed to occur only when individuals are highly uncertain about a given statement.
This assumption was challenged by the results of a 2015 study by Lisa K. Fazio, Nadia M. Brashier, B. Keith Payne, and Elizabeth J. Marsh. Published in the Journal of Experimental Psychology: General, the study suggested that the truth effect can have an impact on participants who actually knew the correct answer to begin with, but who were swayed to believe otherwise through the repetition of a falsehood. For example, when participants encountered on multiple occasions the statement "A sari is the name of the short pleated skirt worn by Scots," some of them were likely to come to believe it was true, even though these same people were able to correctly answer the question "What is the name of the short pleated skirt worn by Scots?"
After replicating these results in another experiment, Fazio and her team attributed this curious phenomenon to "processing fluency", a term that describes the facility with which people comprehend statements. "Repetition," the researchers explained, "makes statements easier to process (i.e. fluent) relative to new statements, leading people to the (sometimes) false conclusion that they are more truthful."
In a 1997 study, Ralph Hertwig, Gerd Gigerenzer, and Ulrich Hoffrage linked the truth effect to the phenomenon known as "hindsight bias", described as a situation in which the "recollection of confidence is systematically distorted after feedback about the actual truth or falsity has been received".
Although the truth effect has been demonstrated scientifically only in recent years, it is a phenomenon with which people have been familiar for millennia. One study notes that the Roman statesman Cato closed each of his speeches with a call to destroy Carthage ("Ceterum censeo Carthaginem esse delendam"), knowing that the repetition would breed agreement, and that Napoleon reportedly "said that there is only one figure in rhetoric of serious importance, namely, repetition", whereby a repeated affirmation fixes itself in the mind "in such a way that it is accepted in the end as a demonstrated truth". Others who have taken advantage of the truth effect have included Quintilian, Ronald Reagan, and Marcus Antonius in Shakespeare's Julius Caesar.
Hertwig, Gigerenzer, and Hoffrage have described the truth effect (which they call "the reiteration effect") as a subset of hindsight bias, and look forward to "a theoretical integration of findings in human confidence", including the truth effect and such other phenomena as overconfidence bias and the hard–easy effect.
In a 1979 study, participants were told that repeated statements were no more likely to be true than unrepeated ones. Despite this warning, the participants perceived repeated statements as being truer than unrepeated ones.
Studies in 1981 and 1983 showed that information deriving from recent experience tends to be viewed as "more fluent and familiar" than new experience. A 2011 study by Jason D. Ozubko and Jonathan Fugelsang built on this finding by demonstrating that, generally speaking, information retrieved from memory is "more fluent or familiar than when it was first learned" and thus produces "an illusion of truth". The effect grew even more pronounced when statements were repeated twice and yet more pronounced when they were repeated four times. The researchers thus concluded that "memory retrieval is a powerful method for increasing the perceived validity of statements (and subsequent illusion of truth) and that the illusion of truth is a robust effect that can be observed even without directly polling the factual statements in question."
A 1992 study by Ian Maynard Begg, Ann Anas, and Suzanne Farinacci suggested that "a statement will seem true if it expresses information that feels familiar".
A 2012 experiment by Danielle C. Polage showed that some participants exposed to false news stories would go on to have false memories. The conclusion was that "repeating false claims will not only increase their believability but may also result in source monitoring errors".
In a 2014 study, Eryn J. Newman, Mevagh Sanson, Emily K. Miller, Adele Quigley-McBride, Jeffrey L. Foster, Daniel M. Bernstein, and Maryanne Garry asked participants to judge the truth of statements attributed to various people, some of whose names were easier to pronounce than others. Consistently, statements attributed to persons with easily pronounced names were viewed as more truthful than statements attributed to persons whose names were harder to pronounce. The researchers' conclusion was that "subjective, tangential properties such as ease of processing can matter when people evaluate information attributed to a source".
The truth effect plays a significant role in various fields of activity. During election campaigns, false information about a candidate, if repeated in TV commercials, can cause the public to believe it. Similarly, advertising that repeats unfounded claims about a product may boost sales because some viewers may come to think that they heard the claims from an objective source.
Examples of the truth effect can be found everywhere. A kayaking expert has pointed out that it is an accepted fact that when kayaking on the ocean or the Great Lakes, one should use a kayak at least 16 feet long. But this is not true; the best length for a kayak depends on a variety of factors.
- "The Truth Effect and Other Processing Fluency Miracles". Science Blogs. Retrieved 30 December 2016.
- Hasher, Lynn; Goldstein, David; Toppino, Thomas (1977). "Frequency and the conference of referential validity" (PDF). Journal of Verbal Learning and Verbal Behavior. 16 (1): 107–112. doi:10.1016/S0022-5371(77)80012-1. Archived from the original on 2016-05-15.
- "People with Easier to Pronounce Names Promote Truthiness of Claims". PLOS ONE. September 6, 2014. Retrieved 29 December 2016.
- Polage, Danielle (May 31, 2012). "Making up History: False Memories of Fake News Stories". Europe's Journal of Psychology. 8 (2): 245–250. doi:10.5964/ejop.v8i2.456. Retrieved 30 December 2016.
- Fazio, Lisa K.; Brashier, Nadia M.; Payne, B. Keith; Marsh, Elizabeth J. (2015). "Knowledge does not protect against illusory truth" (PDF). Journal of Experimental Psychology: General. 144 (5): 993–1002. doi:10.1037/xge0000098. Archived from the original on 2016-05-14.
- Nason, Brian (December 8, 2015). "The Illusory Truth Effect". Vox Populi News. Retrieved 29 December 2016.
- Hertwig, Ralph; Gigerenzer, Gerd; Hoffrage, Ulrich (1997). "The Reiteration Effect in Hindsight Bias". Center for Adaptive Behavior and Cognition. Retrieved 30 December 2016.
- Ozubko, JD; Fugelsang, J (January 2011). "Remembering makes evidence compelling: retrieval from memory can give rise to the illusion of truth". Journal of Experimental Psychology: Learning, Memory, and Cognition. PMID 21058878. doi:10.1037/a0021323.
- Hansel, Bryan. "Illusory Truth Effect and Sea Kayaking (Sort of Off-Topic)". Paddling Light. Retrieved 30 December 2016.
- Gigerenzer, Gerd (1984). "External Validity of Laboratory Experiments: The Frequency-Validity Relationship". The American Journal of Psychology. 97 (2): 185–195. JSTOR 1422594. doi:10.2307/1422594.
- Zacks, Rose T.; Hasher, Lynn (2002). "Frequency processing: A twenty-five year perspective" (PDF). In Sedlmeier, Peter; Betsch, Tilmann (eds.). Frequency Processing and Cognition. pp. 21–36. ISBN 9780198508632. doi:10.1093/acprof:oso/9780198508632.003.0002.
- "The Illusion of Truth - PsyBlog". PsyBlog. Retrieved 2016-04-22.