Normal Accidents

From Wikipedia, the free encyclopedia

Normal Accidents
Author: Charles Perrow
Publisher: Basic Books
Publication date: 1984
ISBN: 978-0691004129

Normal Accidents: Living with High-Risk Technologies is a 1984 book by Charles Perrow that provides a classic analysis of complex systems from the point of view of a social scientist. It was among the first works to "propose a framework for characterizing complex technological systems such as air traffic, marine traffic, chemical plants, dams, and especially nuclear power plants according to their riskiness". Perrow argues that multiple, unexpected failures are built into society's complex systems.[1]

"Normal" accidents (sometimes called system accidents) are named that because they seem to start with something that seems ordinary or that happens all the time, almost always without causing great harm. Events which seem trivial cascade through the system in unpredictable ways to cause a large event with severe consequences:[1]

Normal Accidents contributed key concepts to a set of intellectual developments in the 1980s that revolutionized how we think about safety and risk. It made the case for examining technological failures as the product of highly interacting systems, and highlighted organizational and management factors as the main causes of failures. Technological disasters could no longer be ascribed to isolated equipment malfunction, operator error or acts of God.[2]

Perrow's book was inspired by the 1979 Three Mile Island accident, in which a nuclear accident resulted from an unanticipated interaction of multiple failures in a complex system. The Three Mile Island accident was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable".[3]

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failure no matter how well they were managed. It was inevitable that they would eventually suffer what he termed a "normal accident". Therefore, he suggested, we might do better to contemplate a radical redesign or, if that was not possible, to abandon such technology entirely.[2]

References

  1. Whitney, Daniel E. (2003). "Normal Accidents by Charles Perrow" (PDF). Massachusetts Institute of Technology.
  2. Pidgeon, Nick (22 September 2011). "In retrospect: Normal accidents". Nature, Vol. 477.
  3. Perrow, C. (1982). "The President's Commission and the Normal Accident". In Sills, D., Wolf, C. and Shelanski, V. (Eds), Accident at Three Mile Island: The Human Dimensions. Boulder: Westview, pp. 173–184.