Cognitive traps for intelligence analysis
- This article deals with a subset of the intellectual process of intelligence analysis itself, as opposed to intelligence analysis management, which in turn is a subcomponent of intelligence cycle management. For a complete hierarchical list of articles in this series, see the intelligence cycle management hierarchy.
Intelligence analysis is plagued by many of the cognitive traps also encountered in other disciplines. The first systematic study of the specific pitfalls lying between an intelligence analyst and clear thinking was carried out by Dick Heuer. According to Heuer, these traps may be rooted either in the analyst's organizational culture or his or her own personality.
The most common personality trap, known as mirror-imaging, is the analyst's assumption that the people being studied think like the analysts themselves. An important variation is to confuse actual subjects with one's information or images about them, much as the apple one actually eats differs from the ideas and issues that apples may raise. This variation poses a dilemma for the scientific method in general, since science uses information and theory to represent complex natural systems as if theoretical constructs controlled indefinable natural processes. The inability to distinguish subjects from what one thinks about them is also examined under functional fixedness, first studied in Gestalt psychology and in relation to the subject-object problem.
Experienced analysts may recognize that they have fallen prey to mirror-imaging if they discover that they are unwilling to examine variants of what they consider most reasonable in light of their personal frame of reference. Less-perceptive analysts affected by this trap may regard legitimate objections as a personal attack, rather than looking beyond ego to the merits of the question. Peer review (especially by people from a different background) can be a wise safeguard. Organizational culture can also create traps which render individual analysts unwilling to challenge acknowledged experts in the group.
Another trap, target fixation, has an analogy in aviation: it occurs when pilots become so intent on delivering their ordnance that they lose sight of the big picture and crash into the target. This is a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence that is consistent with their preconceptions and ignoring other relevant views. The desire for rapid closure is another form of idea fixation.
"Familiarity with terrorist methods, repeated attacks against U.S. facilities overseas, combined with indications that the continental United States was at the top of the terrorist target list might have alerted us that we were in peril of a significant attack. And yet, for reasons those who study intelligence failure will find familiar, 9/11 fits very much into the norm of surprise caused by a breakdown of intelligence warning." The breakdown happened, in part, because there was poor information-sharing among analysts (in different FBI offices, for example). At a conceptual level, US intelligence knew that al-Qaida actions almost always involve multiple, near-simultaneous attacks; however, the FBI did not assimilate piecemeal information on oddly behaving foreign flight-training students into this context.
On the day of the hijackings (under tremendous time pressure), no analyst associated the multiple hijackings with al-Qaeda's multiple-attack signature. The failure to conceive that a major attack could occur within the US left the country unprepared. For example, irregularities detected by the Federal Aviation Administration and North American Aerospace Defense Command did not flow into a center where analysts could consolidate this information and (ideally) collate it with earlier reports of odd behavior among certain pilot trainees, or with the possibility of hijacked airliners being used as weapons.
Inappropriate analogies are yet another cognitive trap. Though analogies may be extremely useful, they become dangerous when forced, or when they rest on assumptions of cultural or contextual equivalence. Avoiding such analogies is difficult when analysts are merely unconscious of differences between their own context and that of others; it becomes extremely difficult when they are unaware that important knowledge is missing. Difficulty in admitting one's own ignorance is an additional barrier to avoiding such traps. Such ignorance can take the form of insufficient study: a lack of factual information or understanding; an inability to mesh new facts with old; or a simple denial of conflicting facts.
Even extremely creative thinkers may find it difficult to gain support within their organization. Often more concerned with appearances, managers may suppress conflict born of creativity in favor of the status quo. A special case of stereotyping is stovepiping, in which a group heavily invested in a particular collection technology ignores valid information from other sources (functional specialization). The Soviets tended to value HUMINT (human intelligence, gathered from espionage) above all other sources; Soviet OSINT (open-source intelligence) was forced to go outside the state intelligence organization, developing the USA (later USA-Canada) Institute of the Soviet Academy of Sciences.
Another specialization problem may come as a result of security compartmentalization. An analytic team with unique access to a source may overemphasize that source's significance. This can be a major problem with long-term HUMINT relationships, in which partners develop personal bonds.
Groups (like individual analysts) can also reject evidence which contradicts prior conclusions. When this happens it is often difficult to assess whether the inclusion of certain analysts in the group was the thoughtful application of deliberately contrarian "red teams", or the politicized insertion of ideologues to militate for a certain policy. Monopolization of the information flow (as caused by the latter) has also been termed "stovepiping", by analogy with intelligence-collection disciplines.
The "other culture"
There are many levels at which one can misunderstand another culture, be it that of an organization or a country. One frequently encountered trap is the rational-actor hypothesis, which ascribes rational behavior to the other side—according to a definition of rationality from one's own culture.
The social anthropologist Edward T. Hall illustrated one such conflict with an example from the American Southwest. "Anglo" drivers became infuriated when "Hispanic" traffic police would cite them for going one mile an hour over the speed limit, although a Hispanic judge would later dismiss the charge. "Hispanic" drivers, on the other hand, were convinced that "Anglo" judges were unfair because they would not dismiss charges due to extenuating circumstances.
Both cultures were rational with regard to law enforcement and the adjudication of charges; indeed, both believed that one of the two had to be flexible and the other had to be formal. However, in the Anglo culture it was the police who had discretion with regard to issuing speeding tickets, while the court was expected to stay within the letter of the law. In the Hispanic culture the police were expected to be strict, but the courts would balance the situation. There was a fundamental misunderstanding; both sides were ethnocentric, and both incorrectly assumed the other culture was a mirror image of itself. In this example denial of rationality was the result in both cultures, yet each was acting rationally within its own value set.
In a subsequent interview, Hall spoke widely about intercultural communication. He summed up years of study with the statement, "I spent years trying to figure out how to select people to go overseas. This is the secret. You have to know how to make a friend. And that is it!"
To make a friend, one has to understand the culture of the potential friend, one's own culture, and how things which are rational in one may not translate to the other. Key questions are:
- What is culture?
- How is an individual unique within a culture?
In Hall's words: "If we can get away from theoretical paradigms and focus more on what is really going on with people, we will be doing well. I have two models that I used originally. One is the linguistics model, that is, descriptive linguistics. And the other one is animal behavior. Both involve paying close attention to what is happening right under our nose. There is no way to get answers unless you immerse yourself in a situation and pay close attention. From this, the validity and integrity of patterns is experienced. In other words, the pattern can live and become a part of you.
"The main thing that marks my methodology is that I really do use myself as a control. I pay very close attention to myself, my feelings, because then I have a base. And it is not intellectual."
Proportionality bias is the assumption that something small in one's own culture is small in every culture. In reality, cultures prioritize differently. In Western (especially northern European) culture, time schedules are important and being late can be a major discourtesy; waiting one's turn is the cultural norm, and failing to stand in line is a cultural failing. "Honor killing" seems bizarre in some cultures but is an accepted part of others.
Even within a culture, however, individuals remain individual. Presumption of unitary action by organizations is another trap. In Japanese culture the lines of authority are very clear, but the senior individual will also seek consensus. American negotiators may push for quick decisions, but the Japanese need to build consensus first; once it exists, they may execute faster than Americans.
The "other side" is different
The analyst's country (or organization) is not identical to that of its opponent. One error is to mirror-image the opposition, assuming it will act as one's own country and culture would under the same circumstances. "It seemed inconceivable to the U.S. planners in 1941 that the Japanese would be so foolish to attack a power whose resources so exceeded those of Japan, thus virtually guaranteeing defeat".
In like manner, no analyst in US Navy force protection conceived of an Arleigh Burke-class destroyer such as the USS Cole being attacked with a small suicide boat, much like those the Japanese planned to use extensively against invasion forces during World War II.
The "other side" makes different technological assumptions
An opponent's cultural framework affects its approach to technology. This complicates the task of one's own analysts in assessing the opponent's resources and how they may be used, and in defining intelligence targets accordingly. Mirror-imaging (committing to a set of common assumptions rather than challenging them) has figured in numerous intelligence failures.
In the Pacific Theater of World War II, the Japanese seemed to believe that their language was so complex that even if their cryptosystems such as PURPLE were broken, outsiders would not really understand the content. That was not strictly true, but it was sufficiently so that there were cases where even the intended recipients did not clearly understand the writer's intent.
On the other side, the US Navy assumed that ships anchored in the shallow waters of Pearl Harbor were safe from torpedo attack, even though the British had demonstrated the feasibility of shallow-water torpedo attacks in the 1940 Battle of Taranto.
Even if intelligence services had credited the 9/11 conspirators with the organizational capacity necessary to hijack four airliners simultaneously, no one would have suspected that the hijackers' weapon of choice would be the box-cutter.
Likewise, the US Navy underestimated the danger of suicide boats in harbor and set rules of engagement that allowed an unidentified boat to sail into the USS Cole without being warned off or taken under fire. An Arleigh Burke-class destroyer is one of the most powerful warships ever built, yet US security policies did not protect the docked USS Cole.
The "other side" does not make decisions as you do
Mirror-imaging can be a major problem for policymakers as well as analysts. During the Vietnam War, Lyndon Johnson and Robert McNamara assumed that Ho Chi Minh would react to situations as they themselves would. Similarly, in the run-up to the Gulf War there was a serious misapprehension, independent of politically motivated intelligence manipulation, that Saddam Hussein would view the situation vis-à-vis Kuwait as the State Department and White House viewed it.
Opposing countries are not monolithic, even within their governments. There can be bureaucratic competition, which becomes associated with different ideas. Some dictators (such as Hitler and Stalin) were known for creating internal dissension, so that only the leader was in complete control. A current issue, which analysts understand but politicians may not (or may want to exploit by playing on domestic fears), is the actual political and power structure of Iran; one must not equate the power of Iran's president with that of the president of the United States.
Opponents are not always rational. They may have a greater risk tolerance than one's own country; maintaining the illusion of a WMD threat appears to have been one of Saddam Hussein's survival strategies. Returning again to the Iranian example, an apparently irrational statement from President Mahmoud Ahmadinejad would not carry the weight of a similar statement by Supreme Leader Ali Khamenei. Analysts sometimes assume that the opponent is all-wise and knows all of one's own side's weaknesses. Despite this danger, opponents are unlikely to act according to one's best-case scenario; they may take the worst-case approach, to which one is most vulnerable.
The "other side" may be trying to confuse you
The analysts' job is to form hypotheses; however, they should also be prepared to reexamine them repeatedly in light of new information instead of searching for evidence buttressing one favored theory. They must remember that the enemy may be deliberately deceiving them with information which seems plausible to the enemy. Donald Bacon observed that "the most successful deception stories were apparently as reasonable as the truth. Allied strategic deception, as well as Soviet deception in support of the operations at Stalingrad, Kursk, and the 1944 summer offensive, all exploited German leadership’s preexisting beliefs and were, therefore, incredibly effective." Theories that Hitler thought implausible were not accepted. Western deception staffs alternated "ambiguous" and "misleading" deceptions; the former intended simply to confuse analysts, and the latter to make one false alternative especially likely.
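One way to make this kind of disciplined re-examination concrete, offered here only as an illustrative sketch rather than a method described in the sources above, is to track several competing hypotheses probabilistically and update all of them whenever a new report arrives, instead of scoring only the favored theory. The hypothesis names, prior probabilities, and likelihoods below are hypothetical placeholders.

```python
# Illustrative sketch only: a simple Bayesian update over competing hypotheses,
# showing how one piece of new evidence shifts weight across all of them at once.

def update(priors, likelihoods):
    """Return posterior probabilities after observing one piece of evidence.

    priors      -- dict mapping hypothesis name -> prior probability
    likelihoods -- dict mapping hypothesis name -> P(evidence | hypothesis)
    """
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Hypothetical example: three competing explanations for an observed troop movement.
priors = {"attack": 0.2, "defense": 0.5, "deception": 0.3}
# How likely the new report is under each hypothesis (assumed values).
likelihoods = {"attack": 0.7, "defense": 0.2, "deception": 0.6}

posteriors = update(priors, likelihoods)
print(posteriors)  # the evidence shifts weight toward "attack" and "deception"
```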
More than most modern militaries, the Russians treat strategic deception (in their term, maskirovka, which goes beyond the English phrase to include deception, operational security, and concealment) as an integral part of all planning, in which the highest levels of command are involved.
Bacon wrote further:
The battle of Kursk was also an example of effective Soviet maskirovka. While the Germans were preparing for their Kursk offensive, the Soviets created a story that they intended to conduct only defensive operations at Kursk. The reality was the Soviets planned a large counteroffensive at Kursk once they blunted the German attack. ... German intelligence for the Russian Front assumed the Soviets would conduct only “local” attacks around Kursk to “gain a better jumping off place for the winter offensive.”
The counterattack by the Steppe Front stunned the Germans.
The opponent may try to overload one's analytical capabilities, a tactic that may suit those preparing the intelligence budget and those agencies where the fast track to promotion lies in collection; one's own side may produce so much raw data that the analyst is overwhelmed, even without enemy assistance.
- Heuer, Richards J. Jr. (1999). "Psychology of Intelligence Analysis. Chapter 2. Perception: Why Can't We See What Is There To Be Seen?". History Staff, Center for the Study of Intelligence, Central Intelligence Agency. Retrieved 2007-10-29.
- Witlin, Lauren (Winter–Spring 2008). "Of Note: Mirror-Imaging and Its Dangers" (excerpt). SAIS Review (The Johns Hopkins University Press) 28 (1): 89–90. doi:10.1353/sais.2008.0024. Retrieved 2013-03-28.
- Porch, Douglas; Wirtz, James J. (September 2002). "Surprise and Intelligence Failure". Strategic Insights (US Naval Postgraduate School) I (7).
- "The Amerikanisti". Time. July 24, 1972. Retrieved 2007-10-28.
- Hall, Edward T. (1973). The Silent Language. Anchor. ISBN 0-385-05549-8.
- Sorrells, Kathryn (Summer 1998). "Gifts of Wisdom: An Interview with Dr. Edward T. Hall". The Edge: the E-Journal of Intercultural Relations. Retrieved 2007-10-28.
- Bacon, Donald J. (December 1998). "Second World War Deception: Lessons Learned for Today’s Joint Planner, Wright Flyer Paper No. 5" (PDF). (US) Air Command and Staff College. Retrieved 2007-10-24.
- Smith, Charles L. (Spring 1988). "Soviet Maskirovko". Airpower Journal.
- Luttwak, Edward (1997). Coup D'Etat: A Practical Handbook. Harvard University Press. ISBN 0-674-17547-6.