SHELL model

Diagram of the SHELL model as building blocks: liveware at the centre, surrounded by software, hardware, environment and liveware. In this model the match or mismatch of the blocks (interfaces) is as important as the characteristics of the blocks themselves; a mismatch can be a source of human error.
The SHELL model

In aviation, the SHELL model (also known as the SHEL model) is a conceptual model of human factors that helps to clarify the location and cause of human error within an aviation environment.[1]: 1 [2][3]

It is named after the initial letters of its components (Software, Hardware, Environment, Liveware) and places emphasis on the human being and human interfaces with other components of the aviation system.[1]: 3 [4]

The SHELL model adopts a systems perspective that suggests the human is rarely, if ever, the sole cause of an accident.[5] The systems perspective considers a variety of contextual and task-related factors that interact with the human operator within the aviation system to affect operator performance.[5] As a result, the SHELL model considers both active and latent failures in the aviation system.


History[edit]

The model was first developed as the SHEL model by Elwyn Edwards in 1972[6][1] and later modified into a 'building block' structure by Frank Hawkins in 1975.[2]


Components[edit]

Each component of the SHELL model (software, hardware, environment, liveware) represents a building block of human factors studies within aviation.[7]

The human element, the worker of interest (liveware), is at the centre or hub of the SHELL model, which represents the modern air transport system. The human is the most critical and flexible component in the system, interacting directly with the other system components: software, hardware, environment and liveware.[2]

The edges of the central human component block are drawn uneven to represent human limitations and variations in performance. The other system component blocks must therefore be carefully adapted and matched to this central component to accommodate human limitations and avoid stress and breakdowns (incidents/accidents) in the aviation system.[2] To accomplish this matching, the general capabilities and limitations of the central human component must be understood.

Human characteristics[edit]

Physical size and shape[edit]

In the design of aviation workplaces and equipment, body measurements and movement are vital factors.[2] Dimensions differ with ethnicity, age and gender, for example. Design decisions must take into account the human dimensions and the population percentage the design is intended to satisfy.[2]

Human size and shape are relevant in the design and location of aircraft cabin equipment, emergency equipment, seats and furnishings as well as access and space requirements for cargo compartments.

Fuel requirements[edit]

Humans require food, water and oxygen to function effectively and deficiencies can affect performance and well-being.[2]

Information processing[edit]

Humans have limitations in information processing capabilities (such as working memory capacity, time and retrieval considerations) that can also be influenced by other factors such as motivation and stress or high workload.[2] Aircraft display, instrument and alerting/warning system design needs to take into account the capabilities and limitations of human information processing to prevent human error.

Input characteristics[edit]

The human senses for collecting vital task and environment-related information are subject to limitations and degradation, and cannot detect the whole range of sensory information available.[3] For example, the human eye cannot see objects at night under low light levels, which has implications for pilot performance during night flying. In addition to sight, the other senses include sound, smell, taste and touch (movement and temperature).

Output characteristics[edit]

After sensing and processing information, the output involves decisions, muscular action and communication. Design considerations include aircraft control-display movement relationship, acceptable direction of movement of controls, control resistance and coding, acceptable human forces required to operate aircraft doors, hatches and cargo equipment and speech characteristics in the design of voice communication procedures.[2]

Environmental tolerances[edit]

People function effectively only within a narrow range of environmental conditions tolerable for optimum human performance. Their performance and well-being are therefore affected by physical environmental factors such as temperature, vibration, noise, g-forces and time of day, as well as by time-zone transitions, boring or stressful working environments, heights and enclosed spaces.[2]



Software[edit]

  • Non-physical, intangible aspects of the aviation system that govern how the system operates and how information within it is organised.[2]
  • Software may be likened to the software that controls the operations of computer hardware.[4]
  • Software includes rules, instructions, aviation law and regulations, policies, norms, orders, safety procedures, standard operating procedures, customs, practices, conventions, habits, symbology, supervisor commands and computer programmes.
  • Software is often carried in documents such as charts, maps, publications, emergency operating manuals and procedural checklists.[8]


Hardware[edit]

  • Physical elements of the aviation system, such as aircraft (including controls, surfaces, displays, functional systems and seating), operator equipment, tools, materials, buildings, vehicles, computers and conveyor belts.[4][8][9]


Environment[edit]

  • The context in which aircraft and aviation system resources (software, hardware, liveware) operate: the physical, organisational, economic, regulatory, political and social variables that may impact the worker/operator.[4][8]
  • The internal air transport environment is the immediate work area, including physical factors such as cabin/cockpit temperature, air pressure, humidity, noise, vibration and ambient light levels.
  • The external air transport environment is the physical environment outside the immediate work area, such as weather (visibility/turbulence), terrain, congested airspace, and physical facilities and infrastructure including airports, as well as broad organisational, economic, regulatory, political and social factors.[7]


Liveware[edit]

  • The human element, or people in the aviation system: for example, flight crew who operate aircraft, cabin crew, ground crew, and management and administration personnel.
  • The liveware component considers human performance, capabilities and limitations.[7]

Interfaces[edit]

The four components of the SHELL model do not act in isolation; each interacts with the central human component, and these interactions provide the areas for human factors analysis and consideration.[5] The SHELL model indicates the relationships between people and other system components and therefore provides a framework for optimising the relationship between people and their activities within the aviation system, which is the primary concern of human factors. Indeed, the International Civil Aviation Organisation has described human factors as a concept of people in their living and working situations; their interactions with machines (hardware), procedures (software) and the environment about them; and their relationships with other people.[3]

According to the SHELL model, a mismatch at the interface of the blocks/components where energy and information is interchanged can be a source of human error or system vulnerability that can lead to system failure in the form of an incident/accident.[4] Aviation disasters tend to be characterised by mismatches at interfaces between system components, rather than catastrophic failures of individual components.[8]
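The four liveware-centred interfaces can be sketched as a small data structure. This is a minimal illustration only, not part of the model literature: the L-S/L-H/L-E/L-L codes follow the model, while the `mismatches` helper and all other names are hypothetical.

```python
# The four SHELL components; the model always pairs the central liveware
# with one of them, so every interface is liveware-centred.
COMPONENTS = {"software", "hardware", "environment", "liveware"}

# Derive the interface codes L-E, L-H, L-L, L-S from the component names.
INTERFACES = {f"L-{c[0].upper()}": ("liveware", c) for c in sorted(COMPONENTS)}

def mismatches(reports):
    """Group reported mismatches by SHELL interface (hypothetical helper).

    `reports` is a list of (interface_code, description) tuples, e.g.
    ("L-H", "badly located altimeter"). Unknown codes are rejected because
    the model defines only liveware-centred interfaces.
    """
    grouped = {code: [] for code in INTERFACES}
    for code, description in reports:
        if code not in grouped:
            raise ValueError(f"not a SHELL interface: {code}")
        grouped[code].append(description)
    return grouped
```

Grouping reported mismatches this way mirrors how the model localises error at an interface rather than in a single component.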


Liveware-Software (L-S)[edit]

  • Interaction between human operator and non-physical supporting systems in the workplace.[4]
  • Involves designing software to match the general characteristics of human users and ensuring that the software (e.g. rules/procedures) is capable of being implemented with ease.[2]
  • During training, flight crew members commit much of the software (e.g. procedural information) associated with flying and emergency situations to memory in the form of knowledge and skills. However, more information is obtained by referring to manuals, checklists, maps and charts. In a physical sense these documents are hardware, but their information design must pay adequate attention to numerous aspects of the L-S interface.[8]
  • For instance, by referring to cognitive ergonomics principles, the designer must consider currency and accuracy of information; user-friendliness of format and vocabulary; clarity of information; subdivision and indexing to facilitate user retrieval of information; presentation of numerical data; use of abbreviations, symbolic codes and other language devices; presentation of instructions using diagrams and/or sentences etc. The solutions adopted after consideration of these informational design factors play a crucial role in effective human performance at the L-S interface.[8]
  • Mismatches at the L-S interface may occur through:
      • insufficient or inappropriate procedures
      • misinterpretation of confusing or ambiguous symbology or checklists
      • confusing, misleading or cluttered documents, maps or charts
      • irrational indexing of an operations manual.[2]
  • For example, a number of pilots have reported confusion in trying to maintain aircraft attitude by reference to the head-up display artificial horizon and 'pitch-ladder' symbology.[3]

Liveware-Hardware (L-H)[edit]

  • Interaction between human operator and machine
  • Involves matching the physical features of the aircraft, cockpit or equipment with the general characteristics of human users while considering the task or job to be performed.[2] Examples:
      • designing passenger and crew seats to fit the sitting characteristics of the human body
      • designing cockpit displays and controls to match the sensory, information-processing and movement characteristics of human users while facilitating action sequencing, minimising workload (through location/layout) and including safeguards against incorrect or inadvertent operation.[2]
  • Mismatches at the L-H interface may occur through:
      • poorly designed equipment
      • inappropriate or missing operational material
      • badly located or coded instruments and control devices
      • warning systems that fail in their alerting, informational or guidance functions in abnormal situations.[10]
  • For example, the old three-pointer aircraft altimeter encouraged errors because it was difficult for pilots to tell which information related to which pointer.[3]

Liveware-Environment (L-E)[edit]

  • Interaction between human operator and internal and external environments.[4]
  • Involves adapting the environment to match human requirements. Examples:
      • engineering systems to protect crews and passengers from discomfort, damage, stress and distraction caused by the physical environment,[8] such as:
          • air conditioning systems to control aircraft cabin temperature
          • sound-proofing to reduce noise
          • pressurisation systems to control cabin air pressure
          • protective systems to combat ozone concentrations
      • using black-out curtains to obtain sleep during daylight hours disrupted by transmeridian travel and shift work
      • expanding infrastructure, passenger terminals and airport facilities to accommodate more people due to larger jets (e.g. the Airbus A380) and the growth in air transport.
  • Examples of mismatches at the L-E interface include:
      • reduced performance and errors resulting from disturbed biological rhythms (jet lag) caused by long-range flying and irregular work-sleep patterns
      • pilot perceptual errors induced by environmental conditions, such as visual illusions during aircraft approach and landing at night
      • flawed operator performance and errors resulting from management failure to properly address issues at the L-E interface, including:
          • operator stress due to changes in air transport demand and capacity during times of economic boom and recession[4]
          • biased crew decision making and operator short-cuts as a consequence of economic pressure brought on by airline competition and cost-cutting measures linked with deregulation[8]
          • an inadequate or unhealthy organisational environment reflecting a flawed operating philosophy, poor employee morale or negative organisational culture.[2]

Liveware-Liveware (L-L)[edit]

  • Interaction between central human operator and any other person in the aviation system during performance of tasks.[7]
  • Involves interrelationships among individuals within and between groups including maintenance personnel, engineers, designers, ground crew, flight crew, cabin crew, operations personnel, air traffic controllers, passengers, instructors, students, managers and supervisors.
  • Human-human and group interactions can positively or negatively influence behaviour and performance, including the development and implementation of behavioural norms. The L-L interface is therefore largely concerned with:
      • interpersonal relations
      • leadership
      • crew cooperation, coordination and communication
      • dynamics of social interactions
      • teamwork
      • cultural interactions
      • personality and attitude interactions.[2][4]
  • The importance of the L-L interface and the issues involved have contributed to the development of cockpit/crew resource management (CRM) programmes, which attempt to reduce error at the interfaces between aviation professionals.
  • Examples of mismatches at the L-L interface include:
      • communication errors due to misleading, ambiguous, inappropriate or poorly constructed communication between individuals. Communication errors have contributed to aviation accidents such as the double Boeing 747 disaster at Tenerife Airport in 1977.
      • reduced performance and error arising from an imbalanced authority relationship between aircraft captain and first officer.[2] For instance, an autocratic captain paired with an overly submissive first officer may cause the first officer to fail to speak up when something is wrong, or the captain to fail to listen.

The SHELL model does not consider interfaces that are outside the scope of human factors. For instance, the hardware-hardware, hardware-environment and hardware-software interfaces are not considered, as these interfaces do not involve the liveware component.

Aviation System Stability[edit]

Any change within the aviation SHELL system can have far-reaching repercussions.[8] For example, a minor equipment (hardware) change requires an assessment of its impact on operations and maintenance personnel (Liveware-Hardware) and of whether procedures or training programmes need alteration (to optimise Liveware-Software interactions). Unless all potential effects of a change in the aviation system are properly addressed, even a small system modification may produce undesirable consequences.[8] Similarly, the aviation system must be continually reviewed to adjust for changes at the Liveware-Environment interface.[8]
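The change-assessment reasoning in the paragraph above can be sketched as a tiny helper. This is a hypothetical illustration under the text's example only (a hardware change triggers review of the L-H interface and possibly L-S for procedures and training); the function name and logic are assumptions, not an established method.

```python
def interfaces_to_review(changed_component):
    """Liveware-centred SHELL interfaces to reassess after a system change.

    Any changed component must be checked against the central human via its
    own liveware interface (e.g. "hardware" -> "L-H"). Following the text's
    example, a hardware change may also require alterations to procedures or
    training programmes, i.e. a review of the L-S interface as well.
    """
    primary = "L-" + changed_component[0].upper()
    review = [primary]
    if changed_component == "hardware":
        review.append("L-S")  # procedures/training may need alteration
    return review
```

A fuller treatment would also cover knock-on effects at the L-E and L-L interfaces; this sketch only encodes the single worked example from the text.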


Uses[edit]

  1. **Safety analysis tool**: The SHELL model can be used as a framework for collecting data about human performance and contributory component mismatches during aviation incident/accident analysis or investigation, as recommended by the International Civil Aviation Organisation.[7] Similarly, the SHELL model can be used to understand systemic human factors relationships during operational audits, with the aim of reducing error, enhancing safety[10] and improving processes.[11] For example, LOSA (Line Operations Safety Audit) is founded on threat and error management (TEM), which considers SHELL interfaces.[11][12] For instance, aircraft handling errors involve liveware-hardware interactions, procedural errors involve liveware-software interactions, and communication errors involve liveware-liveware interactions.[13]
  2. **Licensing tool**: The SHELL model can be used to help clarify human performance needs, capabilities and limitations, thereby enabling competencies to be defined from a safety management perspective.[13]
  3. **Training tool**: The SHELL model can be used to help an aviation organisation improve training interventions and the effectiveness of organisational safeguards against error.[13]
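The error-to-interface correspondence mentioned under the safety analysis tool (handling errors → L-H, procedural errors → L-S, communication errors → L-L) can be sketched as a simple lookup. The category labels follow the text; the `classify` helper itself is hypothetical, not part of LOSA or TEM tooling.

```python
# Hypothetical mapping of TEM/LOSA error categories to SHELL interfaces,
# following the correspondence described in the text.
ERROR_TO_INTERFACE = {
    "aircraft handling": "L-H",  # liveware-hardware
    "procedural": "L-S",         # liveware-software
    "communication": "L-L",      # liveware-liveware
}

def classify(error_category):
    """Return the SHELL interface implicated by a TEM error category."""
    try:
        return ERROR_TO_INTERFACE[error_category]
    except KeyError:
        raise ValueError(f"unmapped error category: {error_category}") from None
```

In an audit, tallying errors per interface in this way would point the analysis toward the mismatched interface rather than toward the individual operator.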


References[edit]

  1. ^ a b c "CAP 719 Fundamental Human Factors Concepts (previously ICAO Digest No. 1, ICAO Circular 216-AN/131)" (PDF). UK Civil Aviation Authority. Retrieved 26 September 2023.
  2. ^ a b c d e f g h i j k l m n o p q r Hawkins, Frank H. (31 December 2017). Orlady, Harry W. (ed.). Human Factors in Flight (2 ed.). Routledge. ISBN 978-1-351-21858-0. Retrieved 25 September 2023.
  3. ^ a b c d e Keightley, Alan (2004). Human Factors Study Guide. Palmerston North: Massey University. 190.216.
  4. ^ a b c d e f g h i Johnston, Neil; McDonald, Nick (31 December 2017). Aviation Psychology in Practice. Routledge. ISBN 978-1-351-21882-5. Retrieved 25 September 2023.
  5. ^ a b c Wiegmann, Douglas A.; Shappell, Scott A. (31 December 2016). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Routledge. ISBN 978-1-315-26387-8. Retrieved 25 September 2023.
  6. ^ Edwards, Elwyn (14–16 November 1972). "Man and Machine - Systems for Safety". Outlook on Safety: Proceedings of the Thirteenth Annual Technical Symposium. London: British Air Line Pilots Association: 21–36. A73-34078.
  7. ^ a b c d e "ICAO Circular 240-AN/144: Human Factors Digest No 7 - Investigation of Human Factors in Accidents and Incidents". CIRCULAR 240-AN/144. Montreal, Canada: International Civil Aviation Organization. 1993. Retrieved 25 September 2023.
  8. ^ a b c d e f g h i j k Wiener, Earl L.; Nagel, David C. (1988). Human Factors in Aviation. Gulf Professional Publishing. ISBN 978-0-12-750031-7. Retrieved 25 September 2023.
  9. ^ Campbell, R. D.; Bagshaw, Michael (15 April 2008). Human Performance and Limitations in Aviation (PDF) (3 ed.). John Wiley & Sons. ISBN 978-1-4051-4734-7. Retrieved 26 September 2023.
  10. ^ a b Cacciabue, Carlo (17 April 2013). Guide to Applying Human Factors Methods: Human Error and Accident Management in Safety-Critical Systems. Springer Science & Business Media. ISBN 978-1-4471-3812-9. Retrieved 26 September 2023.
  11. ^ a b Rizzo, Antonio; Pasquini, Alberto; Nucci, Paolo Di; Bagnara, Sebastiano (2000). "SHELFS: Managing critical issues through experience feedback". Human Factors and Ergonomics in Manufacturing. 10 (1): 83–98. doi:10.1002/(SICI)1520-6564(200024)10:1<83::AID-HFM5>3.0.CO;2-D. ISSN 1090-8471.
  12. ^ Pfister, Peter (2 March 2017). Innovation and Consolidation in Aviation: Selected Contributions to the Australian Aviation Psychology Symposium 2000. Routledge. ISBN 978-1-351-92740-6. Retrieved 26 September 2023.
  13. ^ a b c Maurino, Dan (2005). "Threat and error management (TEM)". Canadian Aviation Safety Seminar (CASS), Vancouver, BC, 18–20 April 2005. Flight Safety and Human Factors Programme, ICAO. Retrieved 4 April 2016.

External links[edit]

  • AviationKnowledge - Shell Model Interface Errors: examples of aviation accidents in which errors or mismatches at SHELL interfaces contributed to or caused the accident
  • AviationKnowledge - Shell Model Variants: describes two variants of the SHELL model