
David Woods (safety researcher)

From Wikipedia, the free encyclopedia


David D. Woods is an American safety systems researcher who studies human coordination and automation issues in a wide range of safety-critical fields such as nuclear power, aviation, space operations, critical care medicine, and software services. He is one of the founding researchers of the fields of cognitive systems engineering[1] and resilience engineering.[2]

Biography

In 1974, Woods received his BA in psychology at Canisius College. In 1977, he received his MS in cognitive psychology at Purdue University. In 1979, he received his PhD at Purdue University in cognitive psychology, where he studied human perception and attention.[3]

From 1979 to 1988, Woods was a senior engineer at the Westinghouse Research and Development Center,[3] where he worked on improving control room equipment interfaces for power plants.[4][1]

From 1988 onwards, he served on the faculty of The Ohio State University in the Department of Integrated Systems Engineering, where he is currently a Professor Emeritus.[1]

In 2017, Woods co-founded a consulting company, Adaptive Capacity Labs, with Richard Cook and John Allspaw.[5]

Awards

Woods previously served as president of the Resilience Engineering Association (2011-2013) and of the Human Factors and Ergonomics Society (1998-1999).[6] He is a fellow of the Human Factors and Ergonomics Society.[7]

National advisory committees and testimony

Work

Resilience engineering

Woods is one of the founders of the field of resilience engineering.[2] One of his significant contributions is the theory of graceful extensibility.[14]

Cognitive systems engineering

In the wake of the Three Mile Island accident, Woods and Erik Hollnagel proposed a new approach to thinking about human-computer interaction (HCI) in the domain of supervisory control, Cognitive Systems Engineering (CSE),[15] which focuses on the interaction between people, technological artifacts, and work. In this approach, a set of interacting human and software agents is viewed as a joint cognitive system, and the overall system itself is seen as performing cognitive tasks.

Theory of graceful extensibility

The theory of graceful extensibility was proposed by Woods to explain how some systems are able to continually adapt over time to face new challenges (sustained adaptability) while other systems fail to do so.[16]

The theory asserts that any complex adaptive system can be modeled as a composition of individual units, each with some ability to adapt its behavior and communicate with other units. It is expressed as ten statements that Woods calls 'proto-theorems' (a schematic illustration follows the list):

  1. Individual units have a limit in the degree to which they are able to adapt.
  2. Units will inevitably encounter events that they have difficulty dealing with.
  3. Because units have limits, they need to identify when they are near the limit, and need a mechanism to increase their limit when this happens.
  4. Individual units will never have a high enough limit to handle everything, so units have to work together.
  5. A nearby unit can affect the saturation limit of another unit.
  6. When the pressure that is applied to a unit changes, the trade-off space changes for that unit.
  7. Units perform differently as they approach saturation.
  8. Units only have a local perspective.
  9. The local perspective of any one unit is necessarily limited.
  10. Each unit has to continually do work to adjust its model of the adaptive capacity of itself and others to match the actual adaptive capacity.
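
The proto-theorems describe networks of units with finite, mutually extensible adaptive capacity. The following minimal Python sketch is an illustrative interpretation only, not part of Woods's formal treatment; the Unit class, its capacity and load fields, and the handle method are hypothetical names chosen to show one possible reading of proto-theorems 1-5, in which a unit near saturation borrows headroom from a neighbouring unit.

# Toy reading of proto-theorems 1-5 (illustrative sketch, not Woods's formalism).
from dataclasses import dataclass, field

@dataclass
class Unit:
    name: str
    capacity: float               # finite limit on adaptation (proto-theorem 1)
    load: float = 0.0             # demand currently being absorbed
    neighbors: list = field(default_factory=list)

    def headroom(self) -> float:
        # Remaining adaptive capacity before this unit saturates.
        return self.capacity - self.load

    def handle(self, demand: float) -> bool:
        """Try to absorb a demand, asking neighbouring units for help near saturation."""
        if demand <= self.headroom():
            self.load += demand
            return True
        # Near saturation: neighbouring units can extend this unit's effective
        # limit by absorbing the overflow (proto-theorems 4-5).
        overflow = demand - self.headroom()
        self.load = self.capacity
        for other in self.neighbors:
            take = min(overflow, other.headroom())
            other.load += take
            overflow -= take
            if overflow <= 0:
                return True
        return False  # every unit is saturated: a brittle failure of the network

# Usage: two coordinating units; the second extends the first's effective limit.
a, b = Unit("a", capacity=10), Unit("b", capacity=10)
a.neighbors.append(b)
print(a.handle(8))    # True: within a's own capacity
print(a.handle(6))    # True: overflow of 4 absorbed by b
print(a.handle(10))   # False: both units are saturated

In this toy reading, graceful extensibility corresponds to the neighbouring unit absorbing the overflow; once no neighbour has headroom left, the network exhibits the kind of brittle breakdown the theory is concerned with.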

Visual momentum

Woods proposed visual momentum as a measure of how easily a person can navigate to a new screen and integrate the information they see there while performing a task.[17][18] This work was motivated by the study of event-driven tasks, in which operators (e.g., pilots, space flight controllers, nuclear plant operators, physicians) must respond to events as they occur.

Woods argued that it is easy to get lost in such user interfaces. Effective operator interfaces should help the operator figure out where to look next, and navigating a virtual space of information can be improved by leveraging what the human perceptual system is already optimized to do, such as pattern recognition.

Woods proposed a number of concepts for improving the design of such interfaces by increasing the visual momentum:

  1. Provide a long shot view that acts as a global map to assist an operator in stepping back from the specific details.
  2. Provide perceptual landmarks to help operators orient themselves within the virtual data space.
  3. Use display overlap when moving between data views: have some common subset of the data on both the current and the next view so that the transition between views is not jarring.
  4. Use spatial representation: encode information spatially to leverage the perceptual system.

Dynamic fault management

Woods studied the nature of operations work involved in identifying and mitigating faults in a supervisory context, such as controlling a power plant or operating a software service.[19] He found that this work was qualitatively different from traditional offline troubleshooting that had previously been studied.[20] In particular, because of the dynamic nature of the underlying component, the nature and severity of the problem can potentially change over time. In addition, because of the safety-critical nature of the process, the operator must work to limit possible harms in addition to addressing the underlying problem.

How complex, adaptive systems break down

Woods's research found three recurring patterns in the failure modes of complex adaptive systems:[21]

  1. Decompensation
  2. Working at cross-purposes
  3. Getting stuck in outdated behaviors

Adaptive universe

The adaptive universe is a model proposed by Woods for the constraints that all complex adaptive systems are bound by. The model contains two assumptions:[16]

  1. The resources available to a system are always finite.
  2. The environment that a system is embedded within is always dynamic: change never stops.

Selected publications

Books

  • A Tale of Two Stories: Contrasting Views of Patient Safety (1998)
  • Joint Cognitive Systems: Foundations of Cognitive Systems Engineering (2005)
  • Joint Cognitive Systems: Patterns in Cognitive Systems Engineering (2006)
  • Resilience Engineering: Concepts and Precepts (2006)
  • Behind Human Error (2012)

References

  1. ^ a b c Smith, Philip J.; Hoffman, Robert R., eds. (2018). Cognitive Systems Engineering: The Future for a Changing World. Boca Raton. ISBN 978-1-315-57252-9. OCLC 1002192481.
  2. ^ a b Dekker, Sidney (2019). Foundations of Safety Science: A Century of Understanding Accidents and Disasters. Boca Raton. ISBN 978-1-351-05977-0. OCLC 1091899791.
  3. ^ a b Woods, David D. "Curriculum Vitae: David D. Woods" (PDF). Retrieved 2022-09-10.
  4. ^ Woods, D. D.; Wise, J. A.; Hanes, L. F. (1982-02-01). "Evaluation of safety parameter display concepts. Final report". OSTI 5339665.
  5. ^ "The Career, Accomplishments, and Impact of Richard I. Cook: A Life in Many Acts – Adaptive Capacity Labs". Retrieved 2022-09-29.
  6. ^ "HFES Officers, Editors, and Committee Chairs". Retrieved 2022-09-17.
  7. ^ "HFES Fellows Program: List of Fellows". Retrieved 2022-09-17.
  8. ^ "NASA - Report of Columbia Accident Investigation Board, Volume I". www.nasa.gov. Retrieved 2022-09-18.
  9. ^ "Future of NASA". U.S. Senate Committee on Commerce, Science, & Transportation. 2003-10-29. Retrieved 2022-09-18.
  10. ^ National Research Council (2007-05-09). Software for Dependable Systems: Sufficient Evidence?. National Academies Press. ISBN 978-0-309-10394-7.
  11. ^ "Defense Science Board Task Force Report: The Role of Autonomy in DoD Systems". 2012-07-01.
  12. ^ Nakamura, Dave (2013-09-05). "Operational Use of Flight Path Management System. Final Report of the Performance-based operations Aviation Rulemaking Committee/Commercial Aviation Safety Team Flight Deck Automation Working Group" (PDF). Federal Aviation Administration. Retrieved 2022-09-17.
  13. ^ National Research Council (2014-06-05). Autonomy Research for Civil Aviation: Toward a New Era of Flight. National Academies Press. ISBN 978-0-309-30614-0.
  14. ^ Woods, David D. (December 2018). "The theory of graceful extensibility: basic rules that govern adaptive systems". Environment Systems and Decisions. 38 (4): 433–457. doi:10.1007/s10669-018-9708-3. ISSN 2194-5403. S2CID 70052983.
  15. ^ Hollnagel, Erik; Woods, David D (August 1999). "Cognitive Systems Engineering: New wine in new bottles". International Journal of Human-Computer Studies. 51 (2): 339–356. doi:10.1006/ijhc.1982.0313. PMID 11543350.
  16. ^ a b Woods, David D. (2018-12-01). "The theory of graceful extensibility: basic rules that govern adaptive systems". Environment Systems and Decisions. 38 (4): 433–457. doi:10.1007/s10669-018-9708-3. ISSN 2194-5411. S2CID 70052983.
  17. ^ Woods, David D. (September 1984). "Visual momentum: a concept to improve the cognitive coupling of person and computer". International Journal of Man-Machine Studies. 21 (3): 229–244. doi:10.1016/s0020-7373(84)80043-7. ISSN 0020-7373.
  18. ^ Woods, David D.; Watts, Jennifer C. (1997), "How Not to Have to Navigate Through Too Many Displays", Handbook of Human-Computer Interaction, Elsevier, pp. 617–650, doi:10.1016/b978-044481862-1.50092-3, ISBN 9780444818621, retrieved 2022-09-25
  19. ^ Woods, D.D. (1994-02-28). "Cognitive demands and activities in dynamic fault management: abductive reasoning and disturbance management". In Stanton, Neville A. (ed.). Human Factors in Alarm Design (0 ed.). CRC Press. doi:10.1201/9780203481714. ISBN 978-0-203-48171-4.
  20. ^ Rasmussen, J.; Jensen, A. (May 1974). "Mental Procedures in Real-Life Tasks: A Case Study of Electronic Trouble Shooting". Ergonomics. 17 (3): 293–307. doi:10.1080/00140137408931355. ISSN 0014-0139. PMID 4442376.
  21. ^ Woods, D.D.; Branlat, M (2017-05-15). "Basic Patterns in How Adaptive Systems Fail". In Hollnagel, Erik; Pariès, Jean; Woods, David; Wreathall, John (eds.). Resilience Engineering in Practice: A Guidebook (1 ed.). CRC Press. doi:10.1201/9781317065265. ISBN 978-1-315-60569-2.