Brainwashing (also known as thought reform or as re-education) consists of any effort aimed at instilling certain attitudes and beliefs in a person — beliefs sometimes unwelcome or in conflict with the person's prior beliefs and knowledge,[1] in order to affect that individual's value system and subsequent thought-patterns and behaviors.

In 1987 the American Psychological Association (APA) Board of Social and Ethical Responsibility for Psychology (BSERP) provisionally declined to endorse one particular approach to brainwashing as "lack[ing] the scientific rigor and evenhanded critical approach necessary for APA imprimatur". The debate amongst APA members on this subject continues.[2]

Terminology

The English words "re-educate" and "re-education", which the Oxford English Dictionary attests in general senses from 1808, began in the 1940s to express specifically political connotations. George Orwell mentioned in Animal Farm (1945) "the Wild Comrades' Re-education Committee (the object of this was to tame the rats and rabbits)"; and Arthur Koestler in The Age of Longing (1951) wrote of "revolutionary vigilance,.. and discipline, and re-education camps".

The term "brainwashing" first came into use in the English language in the 1950s. Author John Marks writes that a journalist later revealed to have worked undercover for the Central Intelligence Agency (CIA)[3] first coined the term in 1950. The OED records its earliest known English-language usage of "brain-washing" by E. Hunter in New Leader on 7 October 1950.

Earlier forms of coercive persuasion occurred for example during the Inquisition and in the course of show trials against "enemies of the state" in the Soviet Union; but no specific term emerged until the methodologies of these earlier movements became systematized during the early decades of the People's Republic of China for use in struggles against internal class enemies and foreign invaders. Until that time, presentations of the phenomenon described only concrete specific techniques.[citation needed]

The term xǐ nǎo (洗腦, the Chinese term literally translated as "to wash the brain") originally referred to methodologies of coercive persuasion used in the "reconstruction" (改造 gǎi zào) of the so-called feudal (封建 fēng jiàn) thought-patterns of Chinese citizens raised under pre-revolutionary régimes; the term punned on the Taoist custom of "cleansing/washing the heart" (洗心 xǐ xīn) prior to conducting certain ceremonies or entering certain holy places, and in Chinese, the word "心" xīn also refers to the soul or the mind, contrasting with the brain. The term first came into use in the United States in the 1950s during the Korean War (1950–1953) to describe those same methods as applied by the Chinese communists to attempt deep and permanent behavioral changes in foreign prisoners, and especially to disrupt the ability of captured United Nations troops to effectively organize and resist their imprisonment.[4]

The word brainwashing consequently came into use in the United States of America to explain why, unlike in earlier wars, a relatively high percentage of American GIs defected to the enemy side after becoming prisoners of war in Korea. Later analysis determined that some of the primary methodologies employed on them during their imprisonment included sleep-deprivation and other intense psychological manipulations designed to break down the autonomy of individuals. American alarm at the new phenomenon of substantial numbers of U.S. troops switching their allegiance to support foreign Communists lessened after the repatriation of prisoners, when it emerged that few of them retained allegiance to the Marxist and "anti-American" doctrines inculcated during their incarcerations. When rigid control of information ceased and the former prisoners' "natural" cultural methods of reality-testing could resume functioning, the superimposed values and judgments rapidly decreased[clarification needed].

Although the use of brainwashing on United Nations prisoners during the Korean War produced some propaganda-benefits to the forces opposing the United Nations, its main utility to the Chinese lay in the fact that it significantly increased the maximum number of prisoners that one guard could control, thus freeing other Chinese soldiers to go to the battlefield[citation needed].

After the Korean War the term "brainwashing" came to apply to other methods of coercive persuasion and even to the effective use of ordinary propaganda and indoctrination. Formal discourses of the Chinese Communist Party came to prefer the more clinical-sounding term sī xiǎng gǎi zào 思想改造 ("thought reform"). Metaphorical uses of "brainwashing" extended as far as the memes of fashion-following.

Political brainwashing

Studies of the Korean War (1950–1953)

The Communist Party of China used the phrase "xǐ nǎo" ("wash brain", 洗脑) to describe its methods of persuading into orthodoxy those members who did not conform to the Party message. The phrase played on xǐ xīn (洗心, "wash heart"), an admonition — found in many Daoist temples — which exhorted the faithful to cleanse their hearts of impure desires before entering.

In September 1950, the Miami Daily News published an article by Edward Hunter titled "'Brain-Washing' Tactics Force Chinese into Ranks of Communist Party". It contained the first printed use of the English-language term "brainwashing", which quickly became a stock phrase in Cold War headlines. Hunter, a CIA propaganda-operator[5] who worked undercover as a journalist, turned out a steady stream of books and articles on the theme. An additional article by Hunter on the same subject appeared in New Leader magazine in 1951.[6] In 1953 Allen Welsh Dulles, the CIA director at that time, explained that "the brain under [Communist influence] becomes a phonograph playing a disc put on its spindle by an outside genius over which it has no control."[citation needed]

In his 1956 book Brain-Washing: The Story of Men Who Defied It (Pyramid Books), Hunter described "a system of befogging the brain so a person can be seduced into acceptance of what otherwise would be abhorrent to him". According to Hunter, the process became so destructive of physical and mental health that many of his interviewees had not fully recovered after several years of freedom from Chinese captivity.

Later, two studies of the Korean War defections by Robert Lifton[7] and Edgar Schein[8] concluded that brainwashing had a transient effect when used on prisoners-of-war. Lifton and Schein found that the Chinese did not engage in any systematic re-education of prisoners, but generally used their techniques of coercive persuasion to disrupt the ability of the prisoners to organize to maintain their morale and to try to escape. The Chinese did, however, succeed in getting some of the prisoners to make anti-American statements by placing the prisoners under harsh conditions of physical and social deprivation and disruption, and then by offering them more comfortable situations such as better sleeping quarters, quality food, warmer clothes or blankets. Nevertheless, the psychiatrists noted that even these measures of coercion proved quite ineffective at changing basic attitudes for most people. In essence, the prisoners did not actually adopt Communist beliefs. Rather, many of them behaved as though they did in order to avoid the plausible threat of extreme physical abuse. Moreover, the few prisoners influenced by Communist indoctrination apparently succumbed as a result of the confluence of the coercive persuasion, and of the motives and personality characteristics of the prisoners that already existed before imprisonment. In particular, individuals with very rigid systems of belief tended to snap and realign, whereas individuals with more flexible systems of belief tended to bend under pressure and then restore themselves after the removal of external pressures.

Working individually, Lifton and Schein discussed coercive persuasion in their published analyses of the treatment of Korean War POWs. They defined coercive persuasion as a mixture of social, psychological and physical pressures applied to produce changes in an individual's beliefs, attitudes, and behaviors. Lifton and Schein both concluded that such coercive persuasion can succeed in the presence of a physical element of confinement, "forcing the individual into a situation in which he must, in order to survive physically and psychologically, expose himself to persuasive attempts". They also concluded that such coercive persuasion succeeded only on a minority of POWs, and that the end-result of such coercion remained very unstable, as most of the individuals reverted to their previous condition soon after they left the coercive environment.

Following the armistice that interrupted hostilities in the Korean War, a large group of intelligence-officers, psychiatrists, and psychologists received assignments to debrief United Nations soldiers in the process of repatriation. The government of the United States wanted to understand the unprecedented level of collaboration, the breakdown of trust among prisoners, and other such indications that the Chinese were doing something new and effective in their handling of prisoners of war. Formal studies in academic journals began to appear in the mid-1950s, as well as some first-person reports from former prisoners. In 1961, two specialists in the field published books which synthesized these studies for the non-specialists concerned with issues of national security and social policy. Edgar H. Schein wrote on Coercive Persuasion and Robert J. Lifton wrote on Thought Reform and the Psychology of Totalism. Both books focused primarily on the techniques called xǐ nǎo, or more formally sī xiǎng gǎi zào (reconstructing or remodeling thought). The following discussion largely builds on their studies.

Although the attention of Americans came to bear on thought reconstruction or brainwashing as one result of the Korean War (1950–1953), the techniques had operated on ordinary Chinese citizens after the establishment of the People's Republic of China (PRC) in October 1949. The PRC had refined and extended techniques earlier used in the Soviet Union to prepare prisoners for show-trials, and they in turn had learned much from the Inquisition[citation needed]. In the Chinese context, these techniques had multiple goals that went far beyond the simple control of subjects in the prison camps of North Korea. They aimed to produce confessions, to convince the accused that they had indeed perpetrated anti-social acts, to make them feel guilty of these crimes against the state, to make them desirous of a fundamental change in outlook toward the institutions of the new communist society, and, finally, to actually accomplish these desired changes in the recipients of the brainwashing/thought-reform. To that end, brainwashers desired techniques that would break down the psychic integrity of the individual with regard to information processing, with regard to information retained in the mind, and with regard to values. Chosen techniques included:

The ultimate goal that drove these extreme efforts consisted of the transformation of an individual with an ostensible "feudal" or capitalist mindset into a "right-thinking" member of the new social system, or, in other words, to transform what the state regarded as a criminal mind into what the state could regard as a non-criminal mind.

The methods of thought-control proved extremely useful when deployed for gaining the compliance of prisoners-of-war. Key elements in their success included tight control of the information available to the individual and tight control over the behavior of the individual. When, after repatriation, close control of information ceased and "reality"-testing could resume, former prisoners fairly quickly regained a close approximation of their original picture of the world and of the societies from which they had come. Furthermore, prisoners subject to thought-control often had simply behaved in ways that pleased their captors, without changing their fundamental beliefs.[citation needed] So the fear of brainwashed sleeper agents, such as that dramatized in the novel and in the two films called The Manchurian Candidate, never materialized.

Terrible though the process frequently seemed to individuals imprisoned by the Chinese Communist Party, these attempts at extreme coercive persuasion ended with a reassuring result: they showed that the human mind has enormous ability to adapt to stress (not a recognized term in common use with reference to psychology in the early 1950s) and also a powerful homeostatic capacity. John Clifford, S.J. gives an account of one man's adamant resistance to brainwashing in In the Presence of My Enemies[9] that substantiates the picture drawn from studies of large groups reported by Lifton and Schein. Allyn and Adele Rickett[10] wrote a more penitent account of their imprisonment (Allyn Rickett had by his own admission broken PRC laws against espionage) in "Prisoners of the Liberation",[11] but it too details techniques such as the “struggle groups” described in other accounts. Between these opposite reactions to attempts by the state to reform them, experience showed that most people would change under pressure and would change back following the removal of that pressure.[original research?] Interestingly, some individuals derived benefit from these coercive procedures due to the fact that the interactions, perhaps as an unintended side effect,[original research?] actually promoted insight into dysfunctional behaviors that the subjects then abandoned.[citation needed]

In Tibet in the 1950s the invading Chinese army arrested Robert W. Ford, a British radio-operator working there. Ford spent nearly 5 years in jail, in constant fear of execution, and experienced interrogation and thought-reform. He published a book, Captured in Tibet, about his experience in Tibet, describing and analyzing thought-reform in practice.[12]

Criticism of claims of political brainwashing

According to forensic psychologist Dick Anthony[citation needed], the CIA invented the concept of "brainwashing" as a propaganda strategy to undercut communist claims that American POWs in Korean communist camps had voluntarily expressed sympathy for communism. Anthony stated that definitive research demonstrated that fear and duress, not brainwashing, caused western POWs to collaborate.[citation needed] He argued that the books of Edward Hunter (a secret CIA "psychological warfare specialist" passing as a journalist) pushed the CIA brainwashing-theory onto the general public. He further asserted that for twenty years, starting in the early 1950s, the CIA and the Defense Department conducted secret research (notably including Project MKULTRA) in an attempt to develop practical brainwashing techniques (possibly to counteract the brainwashing efforts of the Chinese), and that their attempt failed.

Brainwashing in the context of new religious movements and cults

Frequent disputes regarding brainwashing take place in discussion of cults and of new religious movements (NRMs). The controversy about the existence of cultic brainwashing has become one of the most polarizing issues among cult-followers, academic researchers of cults, and cult-critics. Parties disagree about the existence of a social process attempting coercive influence, and also disagree about the existence of the social outcome — that people become influenced against their will.

The issue gets even more complicated due to the existence of several definitions of the term "brainwashing" (some of them almost strawman-caricature metaphors of the original Korean War era concept[13]) and due to the introduction of the similarly controversial concept of "mind control" in the 1990s. (In some usages "mind control" and "brainwashing" serve as exact synonyms; other usages differentiate the two terms.) Additionally, some authors refer to brainwashing as a recruitment method (Barker) while others refer to brainwashing as a method of retaining existing members (Kent 1997; Zablocki 2001).

Theories on brainwashing have also become the subject of discussion in legal courts, where experts have had to present their views to juries in simpler terms than those used in academic publications and where the issue has been presented in rather black-and-white terms in order to make a point in a case. The media have taken up some such cases — including their black-and-white colorings.

In 1984, the British sociologist Eileen Barker wrote in her book The Making of a Moonie: Choice or Brainwashing? (based on her first-hand studies of British members of the Unification Church) that she had found no extraordinary persuasion techniques used to recruit or retain members.

Charlotte Allen reported that

"[i]n his article in Nova Religio, Zablocki was worried less about those academics who may stretch the brainwashing concept than about those, like Bromley, who reject it altogether. And in advancing his case, he took a hard look at such scholars’ intentions and tactics. (His title is deliberately provocative: 'The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion.')"[14]

In his book Combatting Cult Mind Control Steven Hassan describes the extraordinary persuasion technique that (in his opinion) members of the Unification Church used to accomplish his own recruitment and retention.

Philip Zimbardo writes that "[m]ind control is the process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition and/or behavioral outcomes. It is neither magical nor mystical, but a process that involves a set of basic social psychological principles."(Zimbardo, 2002)

Some people have come to use the terms "brainwashing" or "mind control" to explain the otherwise intuitively puzzling success of some fast-acting episodes of religious conversion or of recruitment of inductees into groups known variously as new religious movements or as cults.[15]

One of the first published uses of the term thought reform occurred in the title of the book by Robert Jay Lifton: Thought Reform and the Psychology of Totalism: A Study of 'Brainwashing' in China (1961). (Lifton also testified on behavioral-change methodologies at the 1976 trial of Patty Hearst.) In his book Lifton used the term "thought reform" as a synonym for "brainwashing", though he preferred the first term. The elements of thought reform as published in that book sometimes serve as a basis for cult checklists, and read as follows:[16][17]

Benjamin Zablocki sees brainwashing as a "term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocializations". Zablocki states that this same concept, historically, also bore the names "thought reform" and "coercive persuasion".

Proto-brainwashing

Before the popularization of the name and concept of "brainwashing" in the 1950s, popular lore often associated the enthusiasm and commitment of recruits joining cults to witchcraft or to mesmerism/hypnotism.[18]

The APA, DIMPAC, and theories of brainwashing

In the early 1980s some mental-health professionals in the United States became prominent figures due to their involvement as expert witnesses in court-cases involving new religious movements. In their testimony they presented certain theories involving brainwashing, mind control, or coercive persuasion as concepts generally accepted within the scientific community. The American Psychological Association (APA) in 1983 asked Margaret Singer, one of the leading proponents of coercive persuasion theories, to chair a taskforce called the APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control (DIMPAC) to investigate whether brainwashing or "coercive persuasion" did indeed play a role in recruitment by such movements. Before the taskforce had submitted its final report, the APA submitted on February 10, 1987 an amicus curiæ brief in an ongoing case. The brief stated that:

[t]he methodology of Drs. Singer and Benson has been repudiated by the scientific community [... the hypotheses advanced by Singer comprised] little more than uninformed speculation, based on skewed data [...] [t]he coercive persuasion theory ... is not a meaningful scientific concept. [...] The theories of Drs. Singer and Benson are not new to the scientific community. After searching scrutiny, the scientific community has repudiated the assumptions, methodologies, and conclusions of Drs. Singer and Benson. The validity of the claim that, absent physical force or threats, "systematic manipulation of the social influences" can coercively deprive individuals of free will lacks any empirical foundation and has never been confirmed by other research. The specific methods by which Drs. Singer and Benson have arrived at their conclusions have also been rejected by all serious scholars in the field.[19]

The brief characterized the theory of brainwashing as not scientifically proven and suggested the hypothesis that cult recruitment techniques might prove coercive for certain sub-groups, while not affecting others coercively. On March 24, 1987, the APA filed a motion to withdraw its signature from this brief, as it considered the conclusion premature, in view of the ongoing work of the DIMPAC taskforce.[20] The amicus brief itself remained, as only the APA withdrew its signature; the co-signing scholars (including Jeffrey Hadden, Eileen Barker, David Bromley and J. Gordon Melton) did not. On May 11, 1987, the APA's Board of Social and Ethical Responsibility for Psychology (BSERP) rejected the DIMPAC report because the brainwashing theory espoused "lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur"[This quote needs a citation], and concluded "Finally, after much consideration, BSERP does not believe that we have sufficient information available to guide us in taking a position on this issue."[This quote needs a citation]

With the rejection-memo came two letters from external advisers to the APA who reviewed the report. One of the letters, from Professor Benjamin Beit-Hallahmi of the University of Haifa, stated amongst other comments that "lacking psychological theory, the report resorts to sensationalism in the style of certain tabloids" and that "the term 'brainwashing' is not a recognized theoretical concept, and is just a sensationalist 'explanation' more suitable to 'cultists' and revival preachers. It should not be used by psychologists, since it does not explain anything". Professor Beit-Hallahmi asked that the report not be made public. The second letter, from Professor of Psychology Jeffrey D. Fisher, Ph.D., said that the report "[...] seems to be unscientific in tone, and biased in nature. It draws conclusions, which in many cases do not mesh well with the evidence presented. At times, the reasoning seems flawed to the point of being almost ridiculous. In fact, the report sometimes seems to be characterized by the use of deceptive, indirect techniques of persuasion and control — the very thing it is investigating".[21]

When the APA's BSERP rejected her findings, Singer sued the APA in 1992 for "defamation, frauds, aiding and abetting and conspiracy", and lost in 1994.[22]

Zablocki (1997) and Amitrani (2001) cite APA boards and scholars on the subject and conclude that the APA has made no unanimous decision regarding this issue. They also write that Margaret Singer, despite the rejection of the DIMPAC report, continued her work and retained respect in the psychological community, which they corroborate by mentioning that in the 1987 edition of the peer-reviewed Merck Manual, Margaret Singer wrote the article "Group Psychodynamics and Cults" (Singer, 1987).

Benjamin Zablocki, professor of sociology and one of the reviewers of the rejected DIMPAC report, wrote in 1997:

"Many people have been misled about the true position of the APA and the ASA with regard to brainwashing. Like so many other theories in the behavioral sciences, the jury is still out on this one. The APA and the ASA acknowledge that some scholars believe that brainwashing exists but others believe that it does not exist. The ASA and the APA acknowledge that nobody is currently in a position to make a Solomonic decision as to which group is right and which group is wrong. Instead they urge scholars to do further research to throw more light on this matter. I think this is a reasonable position to take."[citation needed]

APA Division 36 (then "Psychologists Interested in Religious Issues", today "Psychology of Religion") in its 1990 annual convention approved the following resolution:

"The Executive Committee of the Division of Psychologists Interested in Religious Issues supports the conclusion that, at this time, there is no consensus that sufficient psychological research exists to scientifically equate undue non-physical persuasion (otherwise known as "coercive persuasion", "mind control", or "brainwashing") with techniques of influence as typically practiced by one or more religious groups. Further, the Executive Committee invites those with research on this topic to submit proposals to present their work at Divisional programs." (PIRI Executive Committee Adopts Position on Non-Physical Persuasion Winter, 1991, in Amitrano and Di Marzio, 2001)

In 2002, the APA's then president, Philip Zimbardo, wrote in the Monitor on Psychology:

"A body of social science evidence shows that when systematically practiced by state-sanctioned police, military or destructive cults, mind control can induce false confessions, create converts who willingly torture or kill "invented enemies," engage indoctrinated members to work tirelessly, give up their money—and even their lives—for "the cause." (Zimbardo, 2002)

Other views

Two months after her kidnapping by the Symbionese Liberation Army in 1974, Patty Hearst, an American newspaper-heiress, participated in a bank-robbery with her kidnappers. At her trial, the defense argued that a concerted program of brainwashing was central to her actions. Despite this claim, the court convicted her of bank-robbery.

In the 1990 U.S. v. Fishman case, Steven Fishman offered a "brainwashing" defense to charges of embezzlement. Margaret Singer and Richard Ofshe would have appeared as expert witnesses for him. The court disallowed the introduction of Singer and Ofshe's testimony:[23]

"The evidence before the Court, which is detailed below, shows that neither the APA nor the ASA has endorsed the views of Dr. Singer and Dr. Ofshe on thought reform ... At best, the evidence establishes that psychiatrists, psychologists and sociologists disagree as to whether or not there is agreement regarding the Singer-Ofshe thesis. The Court therefore excludes defendants' proffered testimony (U.S. vs. Fishman, 1989)."

Social scientists who study new religious movements, such as Jeffrey K. Hadden (see References), understand the general proposition that religious groups can have considerable influence over their members, and that that influence may have come about through deception and indoctrination. Indeed, many sociologists[who?] observe that "influence" occurs ubiquitously in human cultures, and some[who?] argue that the influence exerted in cults or new religious movements does not differ greatly from the influence present in practically every[citation needed] domain of human action and of human endeavor.

The Association of World Academics for Religious Education states that "... without the legitimating umbrella of brainwashing ideology, deprogramming — the practice of kidnapping members of NRMs and destroying their religious faith — cannot be justified, either legally or morally."[citation needed]

F.A.C.T.net states that "Forced deprogramming was sometimes successful and sometimes unsuccessful, but is not considered an acceptable, legal, or ethical method of rescuing a person from a cult."[24]

The American Civil Liberties Union (ACLU) published a statement in 1977 related to brainwashing and mind control. In this statement the ACLU opposed certain methods "depriving people of the free exercise of religion". The ACLU also rejected (under certain conditions) the idea that claims of the use of "brainwashing" or of "mind control" should overcome the free exercise of religion. (See quote) There have been allegations of the use of brainwashing in cults using children in rituals.[25][26][27][28][29]

In the 1960s, after coming into contact with new religious movements (NRMs, a subset of which have gained the popular designation of "cults"), some young people suddenly adopted faiths, beliefs, and behavior that differed markedly from their previous lifestyles and seemed at variance with their upbringings. In some cases, these people neglected or even broke contact with their families. Such changes appeared strange and upsetting to their families. To explain these phenomena, some postulated brainwashing on the part of new religious movements. Observers quoted practices such as isolating recruits from their family and friends (inviting them to an end-of-term camp after university for example), arranging a sleep-deprivation program (3 a.m. prayer-meetings) and exposing them to loud and repetitive chanting. Another alleged technique of religious brainwashing involved love bombing rather than torture.

James T. (Jim) Richardson, a Professor of Sociology and Judicial Studies at the University of Nevada, states[30] that if NRMs had access to powerful brainwashing techniques, one would expect them to show high growth-rates, while in fact most have not had notable success in recruitment, most adherents participate for only a short time, and such groups have limited success in retaining members. Langone has rejected this claim, comparing the figures of various movements, some of which (by common consent) do not use brainwashing and others of which some authors report as using brainwashing. (Langone, 1993)

In their Handbook of Cults and Sects in America, Bromley and Hadden present what they regard as one ideological foundation of brainwashing theories and argue that it demonstrates the theories' lack of scientific support: a simplistic perspective (one they see as inherent in the brainwashing metaphor) appeals to those attempting to locate an effective social weapon to use against disfavored groups, and any relative success of such efforts at social control should not detract from the lack of scientific basis for such opinions.

Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, writes: "Whatever any member of a cult has done, you and I could be recruited or seduced into doing — under the right or wrong conditions. The majority of 'normal, average, intelligent' individuals can be led to engage in immoral, illegal, irrational, aggressive and self destructive actions that are contrary to their values or personality — when manipulated situational conditions exert their power over individual dispositions."(Zimbardo, 1997)

Some religious groups, especially those of Hindu and Buddhist origin, openly state that they seek to improve what they call the "natural" human mind by spiritual exercises. Intense spiritual exercises have an effect on the mind, for example by leading to an altered state of consciousness. These groups also state that they do not condone the use of coercive techniques to acquire or to retain converts.[citation needed]

On the other hand, several scholars in sociology and psychology have in recent years stated that many scholars of NRMs express a bias to deny any possibility of brainwashing and to disregard actual evidence. (Zablocki 1997, Amitrani 1998, Kent 1998, Beit-Hallahmi 2001)

Steven Hassan, author of the book Combatting Cult Mind Control, has suggested that the influence of sincere but misled people can provide a significant factor in the process of thought-reform. (Many scholars[who?] in the field of new religious movements do not accept Hassan's BITE model for understanding cults.)

Brainwashing in fiction

  • In George Orwell's novel Nineteen Eighty-Four (published in 1949 before the popularization of the term "brainwashing"), the fictional totalitarian government of Oceania uses brainwashing-style techniques to erase nonconformist thought and rebellious personalities.
  • In David Karp's novel One, the "benevolent State" has devised a sophisticated system of surveillance, subtle forms of re-education and, if necessary, brainwashing. The novel describes one such instance, where the authorities find a man — who believes himself an active supporter of the system — guilty of "heresy" and accordingly hold him captive and administer the State's routine treatment for his allegedly deviant behaviour.
  • In the 1968 novel When the Enemy is Tired,[31] the Australian writer Russell Braddon attempts to detail some of the techniques used by the Chinese during the Korean War, involving "washing" the brain of content by having people endlessly rewrite personal history, taking away their written accounts and telling them to start again.
  • In Aldous Huxley's novel Brave New World, newly produced babies undergo a brainwashing-like process called "conditioning".
  • In the novel A Clockwork Orange by Anthony Burgess (film adaptation by Stanley Kubrick), the protagonist undergoes a re-education process called the "Ludovico technique" in an attempt to remove his violent tendencies.
  • Max Ehrlich's novel The Cult (1978) (Bantam Books) deals with the fictional brainwashing and attempted deprogramming (counter-brainwashing) of a cult-member that goes horribly wrong.
  • Vernor Vinge speculates on the application of technology to achieve brainwashing in Rainbows End (ISBN 0-312-85684-9), portraying separately the dangers of JITT (Just-in-time training) and the specter of YGBM (You gotta believe me). This picks up on themes of "mindrot" and controlled "Focus" in Vinge's earlier novel A Deepness in the Sky.

Video media

Video games

  • In the video game Psychonauts, Boyd Cooper, the security-guard at Thorney Towers, undergoes hypnosis and has a second personality (dubbed "The Milkman") implanted into his mind, which certain actions or commands can trigger.
  • In Half-Life 2, the Combine race uses brainwashing on humans to produce soldiers and CP units. They extract organs (brainwashed brains) from humans to create synths.
  • In Quake 4, the Strogg race "brainwashes" the humans by activating the neutrocyte (mind-control chip), thus fully "Stroggifying" them.
  • In an episode of Captain N: The Game Master in which Simon Belmont suffers temporary amnesia, Mother Brain orders King Hippo and Eggplant Wizard to brainwash Simon into becoming an enemy of the N-Team; misunderstanding the meaning of the word "brainwash", they instead start scrubbing Mother Brain's glass casing.
  • In StarCraft, a dark archon unit can use "mind control" to bring opposing units over to the player's side.
  • In BioShock, the protagonist becomes "brainwashed" into carrying out actions on hearing the phrase "Would you kindly...".
  • In Red Alert 2 (2000), the character Yuri controls the Premier of the Soviets, Romanov, forcing him into war against the U.S. Yuri also functions as a playable unit in the game with the ability to exert mind-control over his enemies.
  • In Super Paper Mario (2007), the main antagonist's secretary, Nastasia, brainwashes Princess Peach into marrying Bowser. She also brainwashes Bowser's army into serving her master.
  • Metal Gear Solid 4 reveals that Revolver Ocelot willingly submitted to hypnosis and brainwashing in order to trick himself into believing that he was Liquid Snake, rather than the previously suggested idea that Liquid Snake's personality had asserted itself through his arm.
  • In Destroy All Humans!, the player-controlled alien can brainwash townspeople and force them to divulge information.

See also

Footnotes

  1. ^ For a medical (as opposed to sociological) definition, compare: "brainwashing". Dorland's Medical Dictionary for Healthcare Consumers. Merck/Elsevier. 2007. Retrieved 2008-09-13. "[A]ny systematic effort aimed at instilling certain attitudes and beliefs against a person's will, usually beliefs in conflict with prior beliefs and knowledge. It initially referred to political indoctrination of prisoners of war and political prisoners."
  2. ^ Dittmann, Melissa, "Cults of Hatred: Panelists at a convention session on hatred asked APA to form a task force to investigate mind control among destructive cults", Monitor on Psychology, Volume 33, No. 10, November 2002, p. 30, American Psychological Association. Available online.
  3. ^ Marks, John. The Search for the "Manchurian Candidate": The CIA and Mind Control. New York: McGraw-Hill, 1980.
  4. ^ Browning, Michael (2003-03-14). "Brainwashing agitates victims into submission". Palm Beach Post. Palm Beach. ISSN 1528-5758. Retrieved 2008-07-05. "During the Korean War, captured American soldiers were subjected to prolonged interrogations and harangues by their captors, who often worked in relays and used the "good-cop, bad-cop" approach, alternating a brutal interrogator with a gentle one. It was all part of "Xi Nao," washing the brain. The Chinese and Koreans were making valiant attempts to convert the captives to the communist way of thought."
  5. ^ The Search for the Manchurian Candidate - Chapter 8
  6. ^ Zweiback, Adam J. (December 1998). ""Turncoat GIs": Nonrepatriations and the political culture of the Korean War". The Historian. 60 (2): 345–362. doi:10.1111/j.1540-6563.1998.tb01398.x. Retrieved 2008-03-30.
  7. ^ Lifton, Robert J. (April 1954). "Home by Ship: Reaction Patterns of American Prisoners of War Repatriated from North Korea". American Journal of Psychiatry. 110 (10): 732–739. doi:10.1176/appi.ajp.110.10.732. PMID 13138750. Retrieved 2008-03-30. Cited in Thought Reform and the Psychology of Totalism.
  8. ^ Schein, Edgar (May 1956). "The Chinese indoctrination program for prisoners of war: a study of attempted brainwashing". Psychiatry. 19 (2): 149–172. PMID 13323141. Retrieved 2008-03-30. Cited in Thought Reform and the Psychology of Totalism.
  9. ^ Clifford, John W, In the Presence of My Enemies. New York: Norton, 1963.
  10. ^ "Criticism and self-criticism: How a socialist society deals with its enemies". Retrieved 2008-10-25. In 1951, Allyn Rickett was an American student in revolutionary China. He was also a spy. Along with his wife Adele, he was arrested and spent four years in a Chinese prison undergoing a process of criticism and self-criticism. {{cite web}}: Cite has empty unknown parameters: |month= and |coauthors= (help)
  11. ^ W Allyn Rickett and Adele Rickett: Prisoners of liberation. New York, Cameron Associates, 1957.
  12. ^ Ford, Robert W., Captured in Tibet. Oxford University Press, September 1990. ISBN 019581570X; Wind Between the Worlds: Captured in Tibet. SLG Books. ISBN 0961706694.
  13. ^ The American Heritage Dictionary of the English Language (Fourth Edition, 2000), for example, records advertising as an example of a type of brainwashing. Online at http://www.bartleby.com/61/1/B0450100.html, retrieved 2007-09-02.
  14. ^ Charlotte Allen, "Brainwashed! Scholars of Cults Accuse Each Other of Bad Faith", Lingua Franca, December 1998. Online at http://www.rickross.com/reference/apologist/apologist29.html - retrieved 2007-03-25
  15. ^ Eileen Barker explains the attractions for observers of explaining — using the concept of "brainwashing" — the behavior of those who join new religious movements. See Barker, Eileen: New Religious Movements: A Practical Introduction. London: Her Majesty's Stationery office, 1989.
  16. ^ The REVEAL Library: Lifton's Eight Criteria
  17. ^ Thought Reform and the Psychology of Totalism
  18. ^ Jenkins, Philip (2000). "9. Cult Wars: 1969-1985". Mystics and Messiahs: Cults and New Religions in American History. New York: Oxford University Press. pp. 187–188. ISBN 0-19-512744-7. "Nor would earlier critics have been too surprised by the new theories about the underhanded means by which individuals were recruited to odd fringe sects. While in the seventeenth century such a puzzling change could be blamed on witchcraft and on Mesmerism or hypnotism in the nineteenth, the fashionable explanation was now phrased in terms of brainwashing and mind control, an idea that permitted converts to be 'deprogrammed' to what their families considered religious normality. The transition from hypnotism to brainwashing represented little more than a change in name for the same underlying concept."
  19. ^ CESNUR - APA Brief in the Molko Case
  20. ^ Motion of the American Psychological Association to Withdraw as Amicus Curiae
  21. ^ APA memo and two enclosures
  22. ^ Case No. 730012-8 Margaret Singer v. American Psychological Association
  23. ^ Brainwashed! Scholars of Cults Accuse Each Other of Bad Faith, Lingua Franca, December 1998.
  24. ^ Use of Forced Deprogramming F.A.C.T.net
  25. ^ Randall, James; Perskin, Pamela Sue (2000). Cult and Ritual Abuse: Its History, Anthropology, and Recent Discovery in Contemporary America. Greenwood Publishing Group. p. 229. ISBN 0-275-95281-9.
  26. ^ An Empirical Look at the Ritual Abuse Controversy - Randy Noblitt, PhD
  27. ^ Becker, T.; Karriker, W.; Overkamp, B.; Rutz, C. (2008). "The extreme abuse surveys: Preliminary findings regarding dissociative identity disorder". In Sachs, A.; Galton, G. (eds.), Forensic aspects of dissociative identity disorder. London: Karnac Books. pp. 32–49. ISBN 1-855-75596-3.
  28. ^ Rutz, C.; Becker, T.; Overkamp, B.; Karriker, W. (2008). "Exploring Commonalities Reported by Adult Survivors of Extreme Abuse: Preliminary Empirical Findings". pp. 31–84 in Noblitt, J.R.; Perskin, P.S. (eds.), Ritual Abuse in the Twenty-first Century: Psychological, Forensic, Social and Political Considerations. Bandor, OR: Robert Reed. p. 552. ISBN 1-934759-12-0.
  29. ^ Extreme Abuse Survey results
  30. ^ Richardson, James T. (June 1985). "The active vs. passive convert: paradigm conflict in conversion/recruitment research". Journal for the Scientific Study of Religion. 24 (2): 163–179. Retrieved 2008-07-05.
  31. ^ Braddon, Russell: When the enemy is tired. London, Joseph, 1968.

References

Further reading

  • Anthony, Dick, Brainwashing and Totalitarian Influence. An Exploration of Admissibility Criteria for Testimony in Brainwashing Trials, Ph.D. Dissertation, Berkeley (California): Graduate Theological Union, 1996, p. 165.
  • Barker, Eileen, The Making of a Moonie: Choice or Brainwashing, Oxford, UK: Blackwell Publishers, 1984 ISBN 0-631-13246-5
  • Committee on Un-American Activities (HUAC), Communist Psychological Warfare (Brainwashing), United States House of Representatives, Washington, D. C., Tuesday, March 13, 1958
  • Hassan, Steven. Releasing The Bonds: Empowering People to Think for Themselves, Somerville MA: Freedom of Mind Press, 2000. ISBN 0-9670688-0-0.
  • Hunter, Edward, Brain-Washing in Red China. The Calculated Destruction of Men’s Minds, New York: The Vanguard Press, 1951; 2nd expanded ed.: New York: The Vanguard Press, 1953
  • Lifton, Robert J., Thought reform and the psychology of totalism; a study of "brainwashing" in China. New York: Norton, 1961. ISBN 0-8078-4253-2
  • Sargant, William Walters, Battle for the Mind: A Physiology of Conversion and Brainwashing. Cambridge, MA: Malor Books, 1997. ISBN 1-883536-06-5
  • Streatfeild, Dominic, Brainwash: The Secret History of Mind Control, 2006, ISBN 0-340-92103-X
  • Taylor, Kathleen, Brainwashing: The Science of Thought Control, 2005, ISBN 0-19-280496-0
  • Zablocki, Benjamin and Robbins, Thomas (editors), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
  • Zimbardo, Philip, "Mind control: psychological reality or mindless rhetoric?" Monitor on Psychology, Volume 33, No. 10, November 2002