Self-deception
Revision as of 15:49, 14 October 2011
Self-deception is a process of denying or rationalizing away the relevance, significance, or importance of opposing evidence and logical argument. Self-deception involves convincing oneself of a truth (or lack of truth) so that one does not reveal any self-knowledge of the deception.
Definitional problems
A consensus on the identification of self-deception remains elusive to contemporary philosophers, the result of the term's paradoxical elements and ambiguous paradigmatic cases. Self-deception also incorporates numerous dimensions, such as epistemology, psychological and intellectual processes, social contexts, and morality. As a result, the term is highly debated and occasionally argued to be an impossible phenomenon.
Theorization
Analysis
The traditional paradigm of self-deception is modeled on interpersonal deception, as described by the Stanford Encyclopedia of Philosophy. In this paradigm, A intentionally gets B to believe some proposition p, all the while knowing or truly believing ~p. Such deception is intentional and requires the deceiver to know or believe ~p and the deceived to believe p. On this traditional model, self-deceivers must (1) hold contradictory beliefs and (2) intentionally get themselves to hold a belief they know or truly believe to be false.[1]
The process of rationalization, however, can obscure the intent of self-deception. Brian McLaughlin illustrates that such rationalizations can, in certain circumstances, permit the phenomenon: when a person who disbelieves p intentionally tries to make himself believe, or continue believing, p by engaging in such activities and, as a result, unintentionally misleads himself into believing or continuing to believe p via biased thinking, he deceives himself in a way appropriate for self-deception. No deceitful intention is required for this.[2]
Psychology
Self-deception calls into question the nature of the individual, specifically in a psychological context and with respect to the nature of the "self". Irrationality is the foundation from which the argued paradoxes of self-deception stem, and it is argued that not everyone has the "special talents" and capacities for self-deception.[3] However, rationalization is influenced by a myriad of factors, including socialization, personal biases, fear, and cognitive repression. Such rationalization can be manipulated in both positive and negative fashions, convincing one to perceive a negative situation optimistically and vice versa. In contrast, rationalization alone cannot effectively explain the dynamics of self-deception, as reason is just one adaptive form mental processes can take.[4]
Paradoxes of self-deception
The works of philosopher Alfred R. Mele have provided insight into some of the more prominent paradoxes regarding self-deception. Two of these paradoxes include the self-deceiver's state of mind and the dynamics of self-deception, coined the "static" paradox and the "dynamic/strategic" paradox, respectively.
Mele formulates an example of the "static" paradox as the following:
If ever a person A deceives a person B into believing that something, p, is true, A knows or truly believes that p is false while causing B to believe that p is true. So when A deceives A (i.e., himself) into believing that p is true, he knows or truly believes that p is false while causing himself to believe that p is true. Thus, A must simultaneously believe that p is false and believe that p is true. But how is this possible?[5]
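The argument above can be compressed into doxastic notation (a schematic sketch only; the operator B_A for "A believes that" is standard in doxastic logic but does not appear in Mele's text):

```latex
% Interpersonal deception: A induces in B a belief that A takes to be false.
\text{Deception: } \quad B_A \neg p \;\wedge\; \big(A \text{ causes } B_B\, p\big)

% Static paradox: substituting A for B yields contradictory beliefs in one agent.
\text{Self-deception: } \quad B_A \neg p \;\wedge\; B_A\, p
```

The second line makes the "static" puzzle explicit: the same agent must simultaneously satisfy B_A ¬p and B_A p.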
Mele then describes the "dynamic/strategic" paradox:
In general, A cannot successfully employ a deceptive strategy against B if B knows A's intention and plan. This seems plausible as well when A and B are the same person. A potential self-deceiver's knowledge of his intention and strategy would seem typically to render them ineffective. On the other hand, the suggestion that self-deceivers typically successfully execute their self-deceptive strategies without knowing what they are up to may seem absurd; for an agent's effective execution of his plans seems generally to depend on his cognizance of them and their goals. So how, in general, can an agent deceive himself by employing a self-deceptive strategy?[5]
These models call into question how one can simultaneously hold contradictory beliefs (the "static" paradox) and deceive oneself without rendering one's intentions ineffective (the "dynamic/strategic" paradox). Attempts to resolve these paradoxes have created two schools of thought: one that maintains that paradigmatic cases of self-deception are intentional and one that denies the notion—Intentionalists and Non-Intentionalists, respectively.[1]
Intentionalists tend to agree that self-deception is intentional, but divide over whether it requires the holding of contradictory beliefs.[1] This school of thought incorporates elements of temporal partitioning (extended over time to benefit the self-deceiver, increasing the chance of forgetting the deception altogether) and psychological partitioning (incorporating various aspects of the "self").
Non-Intentionalists, in contrast, tend to believe that cases of self-deception are not necessarily accidental, but are motivated by desire, anxiety, or some other emotion regarding p or related to p.[1] This notion distinguishes self-deception from misunderstanding. Furthermore, "wishful thinking" is distinguished from self-deception in that self-deceivers either recognize evidence against their self-deceptive belief or possess, without recognizing it, greater counterevidence than wishful thinkers do.[1]
Numerous questions and debates regarding the paradoxes of self-deception persist; however, a consensus paradigm remains elusive.
Trivers' theory of self-deception
It has been theorized that humans are susceptible to self-deception because most people have emotional attachments to beliefs, which in some cases may be irrational. Some evolutionary biologists, such as Robert Trivers, have suggested[6] that deception plays a significant part in human behavior, and in animal behavior more generally. One deceives oneself to trust something that is not true so as to better convince others of that "truth". When a person convinces her or himself of this untrue thing, s/he better masks the signs of deception.
This notion is based on the following logic: deception is a fundamental aspect of communication in nature, both between and within species. It has evolved so that one can have an advantage over another. From alarm calls to mimicry, animals use deception to further their survival. Those who are better able to perceive deception are more likely to survive. As a result, self-deception evolved to better mask deception from those who perceive it well, as Trivers puts it: "Hiding the truth from yourself to hide it more deeply from others." In humans, awareness of the fact that one is acting deceptively often leads to tell-tale signs of deception, such as nostrils flaring, clammy skin, quality and tone of voice, eye movement, or excessive blinking. Therefore, if self-deception enables someone to believe her or his own distortions, s/he will not present such signs of deception and will therefore appear to be telling the truth.
Self-deception can be used to appear either greater or lesser than one actually is. For example, one can act overconfident to attract a mate or act under-confident to avoid a predator or threat. If a person is capable of concealing her or his true feelings and intentions well, then s/he is more likely to deceive others and succeed.
It may also be argued that the ability to deceive, or self-deceive, is not the selected trait but a by-product of a more primary trait called abstract thinking. Abstract thinking allows many evolutionary advantages such as more flexible, adaptive behaviors and innovation. Since a lie is an abstraction, the mental process of creating a lie can only occur in animals with enough brain complexity to permit abstract thinking.[citation needed] Self-deception lowers cognitive cost; that is to say, it is less complicated for one to behave or think in a certain manner that implies something is true, if one has convinced oneself that that very thing is indeed true. The mind will not have to think constantly of the true thing and then the false thing, but simply convince itself that the false thing is true.
Evolutionary implications of Trivers' theory of self-deception
Because deception exists, there is strong selection pressure to recognize when it occurs. As a result, self-deception evolved so as to better hide the signs of deception from others: humans deceive themselves in order to better deceive others and thus gain an advantage over them. In the three decades since Trivers introduced his adaptive theory of self-deception, there has been an ongoing debate over whether such behavior has a genetic basis.
The explanation of deception and self-deception as innate characteristics is perhaps true, but there are many other explanations for this pattern of behavior. It is possible that the ability to self-deceive is not innate but a learned trait, acquired through experience. For example, a person could have been caught being deceitful by revealing knowledge of information s/he was trying to hide: her or his nostrils flared, indicating the lie to the other person, and s/he thus did not get what s/he wanted. The next time, to better achieve success, the person will more actively deceive her or himself into believing s/he lacks that knowledge, the better to hide the signs of deception. People therefore could have the capacity to learn self-deception.
Examples
Though the term is difficult to define, examples of self-deception are abundant in varying degrees. Simple instances include common occurrences such as the alcoholic who is self-deceived in believing that his drinking is under control, the husband who is self-deceived in believing that his wife is not having an affair, and the jealous colleague who is self-deceived in believing that her colleague's greater professional success is due to ruthless ambition.
The implications of self-deception increase in gravity when conducted on a political scale. Both policymakers and heads of state are capable of self-deception, which can affect both internal and foreign affairs. The 2003 US invasion of Iraq, based, in part, on the mistaken belief that Saddam Hussein harbored weapons of mass destruction, serves as a prominent—and controversial—case of self-deception that is still being examined.[7]
In the same invasion, former Iraqi Information Minister Mohammed Saeed al-Sahhaf illustrated another well-known instance of self-deception. On April 7, 2003, al-Sahhaf claimed that there were no American troops in Baghdad, and that the Americans were committing suicide by the hundreds at the city's gates. At that time, American tanks were patrolling the streets only a few hundred meters from the location where the press conference was held. Despite empirical evidence contradicting al-Sahhaf's claims, he claimed that the reports were provided by "reliable sources".[8]
Further examples can be found in various despotic regimes around the world. Self-deception is typically evident in such environments, nourished by a reluctance among both citizens and members of the regime to criticize the leadership. This notion is personified by Kim Jong-il, de facto leader of the Democratic People's Republic of Korea (North Korea), through self-proclamations (such as his messianic title "Dear Leader") and his entourage of primarily Korean War veterans. With heavy restrictions placed on free speech, and Kim Jong-il's noted intolerance of criticism,[9] self-deception about the state of internal and external affairs can take root without advisory counterevidence.
See also
- Anosognosia
- Bad faith
- Cognitive dissonance
- Confabulation
- Delusion
- Denial
- Distancing language
- Doublethink
- Double-blind
- Groupthink
- Introspection illusion
- Indoctrination
- List of cognitive biases
- Point of no return
- Positive illusions
- Propaganda
- Psychology
- Rigour
- Self-fulfilling prophecy
- Self-handicapping
- Self propaganda
- Subjective validation
- True-believer syndrome
- Wishful thinking
- Mundus vult decipi, ergo decipiatur
Notes
- ^ a b c d e Stanford Encyclopedia of Philosophy, "Self-Deception", http://plato.stanford.edu/entries/self-deception/
- ^ "Exploring the Possibility of Self-Deception in Belief" by Brian P. McLaughlin
- ^ "The Deceptive Self: Liars, Layers, and Lairs" by Amélie Oksenberg Rorty
- ^ "Self-Deceptive and the Nature of Mind" by Mark Johnston
- ^ a b Two Paradoxes of Self-Deception by Alfred R. Mele
- ^ Robert Trivers (2002). Natural Selection and Social Theory: Selected Papers of Robert Trivers. Oxford University Press US. ISBN 9780195130621. Retrieved 4 December 2008.
- ^ "Anatomy of Self Deception: Judgment, Belief, and the US Decision to Invade Iraq" by Peter Zimmerman
- ^ Zucchino, David. Thunder Run. Atlantic Monthly Press. ISBN 0-87113-911-1
- ^ "Kim Jong-il" by GlobalSecurity.org, http://www.globalsecurity.org/military/world/dprk/kim-jong-il.htm
External links
- Skeptic's Dictionary entry on self-deception
- Arbinger Institute - a consulting organisation based on Terry Warner's work on self-deception
- The pattern behind self-deception
- The Philosophy Talk episode on self-deception
Books
- Leadership and Self Deception, by Arbinger Institute - talks at length about self-deception and its implications for leaders - in personal and public life. ISBN 978-1576759776
- Anatomy of Peace - Resolving the Heart of Conflict, by Arbinger Institute ISBN 978-1576753347
- McLaughlin, Brian P. & Amélie Oksenberg Rorty (eds.) (1988). Perspectives on Self-Deception. Berkeley: University of California Press.