
Red Teaming

Red Teaming (RT) is a decision-aiding practice, traditionally used by the military, of playing devil’s advocate against one’s own concepts, plans, strategies, or systems in order to test and evaluate them and thereby improve decision making.

History

Although the core concept of Red Teaming has been employed by commanders since ancient warfare, the term “red teaming” itself is relatively new: it first appeared in the US military and later spread to civilian enterprises.

- In 1997, the first dedicated red teaming journal, the “Red Team Journal”, was launched to promote red teaming and alternative analysis \cite{}.

- In 2003, the United States Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics published an unclassified document entitled “The Role and Status of DoD Red Teaming Activities” \cite{}. In the same year, the SANS Institute published a whitepaper entitled “Red Teaming: The Art of Ethical Hacking” \cite{SANS}.

- In 2008, David Longbine published an unclassified monograph entitled “Red Teaming: Past and Present” \cite{}.

- In 2009, an unclassified Canadian manuscript entitled “Red Dawn: The Emergence of a Red Teaming Capability in the Canadian Forces” was published \cite{}.

Definitions

The US Army defines:

Definition 1: Red Teaming is a structured, iterative process executed by trained, educated and practiced team members that provides commanders an independent capability to continuously challenge plans, operations, concepts, organizations and capabilities in the context of the operational environment and from our partners’ and adversaries’ perspectives \cite{USA Army}.

Australia’s Defence Science and Technology Organisation (DSTO) defines:

Definition 2: Red Teaming is the practice of critical analysis, through means of challenge and contest of argument, as conducted by an independent party, in the study of reasoning, and for the purposes of improving decision-making processes \cite{DSTO}.

Abbass defines:

Definition 3: Red Teaming is a structured approach for modeling and executing exercises by deliberately challenging the competitive reciprocal interaction among two or more players and monitoring how concepts, plans, technologies, human behavior, or any events related to the system or situation under study are unfolding from a risk lens, with the objective of understanding the space of possibilities, sometimes exploring nonconventional behaviors, testing strategies, and mitigating risk \cite{Abbass}.

Comments: while the definitions presented above may seem different from one another at first, they have more commonalities than differences. The third definition, by Abbass, describes RT in the greatest detail by emphasizing important terms and concepts such as “deliberate challenge”, “competitive reciprocal interaction”, “risk lens”, and “nonconventional behaviors”. These terms and concepts are explained as follows:

Modeling: in this context, modeling is the process of transforming a complex situation into a representation (such as a diagram) that captures the important information and relationships in that situation while ignoring the less important ones.
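
A minimal sketch of what such a model might look like in code, assuming a hypothetical supply scenario; the entity and relationship names are invented for illustration and are not part of any standard RT toolkit:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """An actor or asset judged important enough to keep in the model."""
    name: str
    capabilities: set = field(default_factory=set)

@dataclass
class Relationship:
    """A directed link capturing how one entity can affect another."""
    source: Entity
    target: Entity
    kind: str  # e.g. "supplies", "defends", "attacks"

# A deliberately simplified representation: details judged less important
# to the exercise are simply left out of the model.
port = Entity("port", {"ship_goods"})
depot = Entity("depot", {"store_goods"})
model = [Relationship(port, depot, "supplies")]
```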

Executing: an RT exercise needs to be executed; that is, the hypotheses generated while building a model need to be transformed into actionable steps. Executing is the process of unfolding events based on the interaction among different entities.
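
Continuing the same illustrative assumptions, executing can be sketched as a loop that unfolds events turn by turn while red and blue act on a shared state; the moves below are toy stand-ins, not a prescribed RT procedure:

```python
def execute_exercise(state, red_move, blue_move, rounds=5):
    """Unfold events by letting red and blue act on a shared state in turn."""
    history = [dict(state)]
    for _ in range(rounds):
        red_move(state)    # red deliberately challenges the plan
        blue_move(state)   # blue responds, revealing strengths and gaps
        history.append(dict(state))
    return history

# Toy interaction: red degrades a supply line, blue partially repairs it.
state = {"supply_level": 10}
red = lambda s: s.update(supply_level=s["supply_level"] - 3)
blue = lambda s: s.update(supply_level=min(10, s["supply_level"] + 2))
print(execute_exercise(state, red, blue))
```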

Deliberate challenge: a challenge is an exposure of a system to a situation that requires the system to perform close to, but outside, the boundary of its abilities. A challenge is deliberate if the system is put into the challenging situation intentionally, on purpose. Deliberate challenge is an essential feature of RT.
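
As a toy illustration of “close to, but outside, the boundary of its abilities”, one might deliberately probe a system with loads just past its rated capacity; the capacity figure and handler below are invented for this sketch:

```python
def handle_load(requests, capacity=100):
    """A stand-in system that fails once demand exceeds its capacity."""
    if requests > capacity:
        raise RuntimeError(f"overloaded at {requests} requests")
    return "ok"

# Deliberate challenge: step up to and just past the known boundary on purpose.
for load in (90, 100, 110):
    try:
        print(load, handle_load(load))
    except RuntimeError as err:
        print(load, "FAILED:", err)
```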

Risk lens: risk is the effect of uncertainty on objectives. When the events and situations in an RT exercise unfold under a risk lens, they are analyzed to discover blind spots that could impact the blue team’s objectives.
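
The phrase “effect of uncertainty on objectives” can be made concrete with a small Monte Carlo sketch that estimates how often an uncertain disruption pushes blue below its target; the distribution, baseline, and threshold are assumptions chosen purely for illustration:

```python
import random

def risk_of_missing_objective(trials=10_000, baseline=10, target=8):
    """Estimate the probability that uncertainty drags the outcome below target."""
    misses = 0
    for _ in range(trials):
        disruption = random.uniform(0, 4)  # uncertain event of unknown severity
        if baseline - disruption < target:
            misses += 1
    return misses / trials

print("P(objective missed):", risk_of_missing_objective())  # roughly 0.5 here
```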

Understanding the space of possibilities: the primary objective of an RT exercise is not for red to beat blue. A more valuable, and more achievable, goal is to gain a greater understanding of what is possible in a specific situation.
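
A sketch of exploring the space of possibilities: rather than searching for a single winning move, every pairing of candidate red and blue actions is enumerated and its outcome recorded; the action lists and scoring function are placeholders:

```python
from itertools import product

red_actions = ["flank", "jam_comms", "decoy"]
blue_actions = ["hold", "reposition", "counterattack"]

def outcome(red, blue):
    """Placeholder score; a real exercise would simulate the interaction."""
    return len(red) - len(blue)  # arbitrary stand-in for an observed result

# Map the whole space instead of asking who "wins".
space = {(r, b): outcome(r, b) for r, b in product(red_actions, blue_actions)}
for pair, result in sorted(space.items()):
    print(pair, "->", result)
```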

Exploring nonconventional behaviors: in a normal situation, entities behave according to certain rules or ways of thinking. Because an objective of RT is to deliberately challenge a plan, strategy, or system, it encourages entities to act in ways that differ from what they would do in a normal situation. As a result, the space of nonconventional behaviors can be explored.
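
One simple way to encourage such behavior in a simulated exercise is to occasionally sample actions from outside the doctrinal set, in the spirit of epsilon-greedy exploration; both action lists here are invented:

```python
import random

conventional = ["advance", "defend", "resupply"]
nonconventional = ["feint_retreat", "sacrifice_asset", "break_protocol"]

def choose_action(explore_prob=0.2):
    """Mostly follow the usual rules, but sometimes act outside them."""
    if random.random() < explore_prob:
        return random.choice(nonconventional)
    return random.choice(conventional)

print([choose_action() for _ in range(10)])
```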

Testing strategies: a strategy denotes the set of methods able to transform the resources to which an organization has access, or could gain access, into the organization’s goals. RT transforms strategy design from a one-off exercise into a lifelong learning exercise by continuously challenging the current strategy.
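
The lifelong-learning framing can be sketched as a loop in which the current strategy is repeatedly challenged and revised whenever red exposes a weakness; the scoring and revision steps below are placeholders, not a standard procedure:

```python
def challenge(strategy):
    """Placeholder red-team probe: returns a weakness score (0 means none found)."""
    return max(0, 5 - strategy["reserve"])

def revise(strategy):
    """Placeholder fix: commit more reserve to the exposed weak point."""
    strategy["reserve"] += 1

strategy = {"reserve": 1}
for round_no in range(1, 6):  # continuous challenge, not a one-off test
    weakness = challenge(strategy)
    print(f"round {round_no}: weakness={weakness}")
    if weakness > 0:
        revise(strategy)
```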

Mitigating risk: risk cannot be eliminated; instead, it is mitigated. A risk is mitigated if its impact is steered away from the system, or if the system is reshaped so that a negative impact is transformed into a positive one. RT helps to develop good risk-mitigation strategies by discovering and challenging the different risks within a situation.
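
A sketch of reshaping the system so that the same shock produces a smaller residual impact; the redundancy mechanism and the numbers are illustrative assumptions:

```python
def residual_impact(system, shock):
    """Illustrative impact model: redundancy absorbs part of the shock."""
    absorbed = min(shock, system["redundancy"])
    return shock - absorbed

system = {"redundancy": 1}
print("before mitigation:", residual_impact(system, shock=4))  # impact 3

# Reshape the system: add the redundancy the red-team exercise showed was missing.
system["redundancy"] = 5
print("after mitigation:", residual_impact(system, shock=4))   # impact 0
```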

Application areas

Historically, Red Teaming has been a decision-aiding tool for military commanders. For instance, the RT concept was effectively incorporated into decision making by T. E. Lawrence during the Arab Revolt against the Turks in World War I, and by Field Marshal Slim in his 1945 counteroffensive into Burma during World War II \cite{}. In the 21st century, the military continues to regard RT as an effective decision-aiding tool, especially after the events of 9/11. Since 2000, the defence ministries of the US, UK, Canada, and Australia have published and updated guides on Red Teaming \cite{}.

Although Red Teaming is traditionally a military practice, it has recently found a place in many civilian enterprises. For instance, RT is used to evaluate trading strategies \cite{}, to secure IT infrastructures \cite{}, and to improve air safety \cite{}. Several IT giants, including IBM, Dell, and Microsoft, have developed their own red teaming capabilities to challenge their own products and development strategies.

The broad application of RT stems from its ability to help decision makers:

  • Explore the space of possibilities
  • Discover vulnerabilities
  • Reveal bias
  • Learn about competitors
  • Build a database of cases for future events
  • Unlearn to learn

See also

Computational Red Teaming

References
