Human error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (see pilot error), space exploration (e.g., the Space Shuttle Challenger and Space Shuttle Columbia disasters), and medicine (see medical error).
Following Erik Hollnagel, the term "human error" refers to an activity (or the absence of an activity) by a person — typically the user of a product or the operator of a system — that is followed by undesired system behaviour.
Human Error and Performance 
Human error and performance are two sides of the same coin: the mechanisms that produce "human error" are the same mechanisms that produce "human performance". Performance is categorized as "error" only in hindsight, so actions later termed "human error" are in fact part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behaviour. While human error is firmly entrenched in the classical approaches to accident investigation and risk assessment, it has no role in newer approaches such as resilience engineering.
Categories of human error 
- exogenous versus endogenous (i.e., originating outside versus inside the individual)
- situation assessment versus response planning, and related distinctions
- by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational
Sources of human errors 
The cognitive study of human error is a very active research field, including work on the limits of memory and attention and on decision-making strategies such as the availability heuristic and other cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but that can lead to systematic patterns of error.
Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon. A focus on the variability of human performance, and on how human operators (and organizations) can manage that variability, may be a more fruitful approach. Newer approaches such as the resilience engineering mentioned above highlight the positive roles that humans can play in complex systems. In resilience engineering, successes (things that go right) and failures (things that go wrong) are seen as having the same basis, namely human performance variability. A specific account of this is the efficiency–thoroughness trade-off (ETTO) principle, which can be found at all levels of human activity, individual as well as collective.
- Erik Hollnagel home page
- Hollnagel: Why "Human Error" is a Meaningless Concept
- Reason, 1991
- Woods, 1990
- Hollnagel, E., Woods, D. D. & Leveson, N. G. (2006). Resilience engineering: Concepts and precepts. Aldershot, UK: Ashgate
- Jones, 1999
- Wallace and Ross, 2006
- Senders and Moray, 1991
- Roth et al., 1994
- Sage, 1992
- Norman, 1988
- Kirwan and Ainsworth, 1992
- search for MORT on the FAA Human Factors Workbench
- Hollnagel, E. (1983). Human error. (Position Paper for NATO Conference on Human Error, August 1983, Bellagio, Italy)
- Hollnagel, E. and Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at the 4th International Workshop on Human Error, Safety and System Development. Linköping, June 11–12, 2001.
- Hollnagel, E. (2009). The ETTO Principle: Efficiency-Thoroughness Trade-Off. Why things that go right sometimes go wrong. Aldershot, UK: Ashgate