Organizational safety

From Wikipedia, the free encyclopedia

Organizational safety is a contemporary discipline of study and research developed from the works of James Reason, creator of the Swiss Cheese Model, and Charles Perrow, author of Normal Accidents. These scholars demonstrated that the complexity and system coupling inherent in organizations, created by multiple processes and various people working simultaneously to achieve organizational objectives, are responsible for errors ranging from small mistakes to catastrophic system failures. The discipline crosses professions, spans industries, and involves multiple academic domains. As such, the literature is disjointed and the associated research outcomes vary by study setting. This page provides a comprehensive yet concise summary of organizational knowledge about safety and accidents, using internal links (to existing Wikipedia pages), external links (to sources outside of Wikipedia), and citations of the seminal literature.

Historical Perspective

Industrial Perspective

Disciplinary Perspective

Organizational Culture and Climate

Organizational culture emerged from organizational studies and management to describe the attitudes, perceptions, beliefs, and values of an organization. Organizational culture consists of the established underlying suppositions (Ashkanasy, Broadfoot, & Falkus, 2000; Schein, 1991; Strauss, 1987) communicated through shared, collectively supported perceptions (Schneider, Brief, & Guzzo, 1996) that ultimately manifest in organizational outputs (Ashkanasy et al., 2000; Schein, 1991; Strauss, 1987). More basically, organizational culture has been described as "the specific collection of values and norms that are shared by people and groups in an organization and that control the way they interact with each other and with stakeholders outside the organization."[1]

To take a slightly broader view, it is necessary to consider organisational safety in the context of change management. An eminent example of a book in this category, discussing organisational culture, is The Change Masters.[2] In chapter 2, Rosabeth Moss Kanter argues that change should be seen as an opportunity rather than as a threat. Seen in this way, organisations can be analysed as systems tending towards open or closed systems as those are conceived in the social sciences. Classifying systems using these two categories allows analogies to be drawn within the framework of general systems theory; the treatment of open and closed systems in social science provides further detail on communication between people, which can be analysed as a closed communication system. Kanter noted that a culture of segmentalism inhibits innovation, because segmentalism treats change as a threat, whereas integrated companies encourage innovation because they treat change as an opportunity.

Analysing an organisation or its culture must take into account the concept of a system consisting of elements. An organisation is a purposeful system that contains at least two purposeful elements sharing a common purpose. Unlike technical systems, organisations can consist of elements that display will and learn to adapt. Unlike the organisms that form part of it, however, an organisation itself cannot learn but can only adapt to changing situations. Organisms, unlike organisations, are made up of elements that are not purposeful in any way: the elements of an organism are functional in nature, and its purpose is revealed only by the whole. At least one element of an organisation must perform a system control function to guide the organisation towards increasing rather than decreasing variety (an ineffective committee being an example of the latter), since, unlike in organisms, will can be displayed by the parts or elements themselves.[3] The dynamics of decision making within an organisation determine the course of action it takes. Research results highlight the risks of ignoring the role of disequilibrium dynamics and bounded rationality in shaping competitive outcomes, and demonstrate how both can be incorporated into strategic analysis to form a dynamic behavioural theory amenable to rigorous analysis.[4]

The idea that too much reliance should not be placed on tool-supported decision making was discussed by Robert Freeman in a 1980s article, "Taking the Bugs Out of Computer Spreadsheets", in which he describes a Dallas-based oil and gas company losing millions of dollars in an acquisition deal. Freeman's study of the reliability of computer spreadsheets suggests that the logic and makeup of the models underlying spreadsheets must be thoroughly checked.[5]
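The kind of independent cross-check this argues for can be sketched in a few lines: recompute a spreadsheet's derived figure from its underlying data and compare it with the reported value, rather than trusting the tool's output. The `audit_totals` helper and the line items below are hypothetical illustrations, not details from Freeman's case.

```python
def audit_totals(line_items, reported_total, tolerance=0.01):
    """Recompute a total independently of the spreadsheet and
    compare it with the reported figure."""
    recomputed = sum(qty * price for qty, price in line_items)
    discrepancy = abs(recomputed - reported_total)
    return discrepancy <= tolerance, recomputed

# A faulty formula or transcription error in the reported total goes
# undetected unless the figure is recomputed from the underlying data.
items = [(100, 19.0), (50, 38.0)]  # (quantity, unit price) pairs
ok, recomputed = audit_totals(items, reported_total=5_700.0)
print(ok, recomputed)  # prints: False 3800.0
```

The point is not the arithmetic itself but the discipline: the check is performed outside the model being audited, so an error in the spreadsheet's own logic cannot also corrupt the verification.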

The UK HSE Research Report 367 presents a review of the safety culture and safety climate literature, undertaken to support the development of a safety culture inspection toolkit.[6]

The key issue in the organisational context is how the process of managing safety risk handles changes to existing infrastructure, processes, technology, or other elements, and how communications regarding potential accident scenarios are handled and viewed in an integrated way. Such changes may have unforeseen or adverse safety-critical impacts. Several concepts are available to guide understanding in this area: system safety is one, inherent safety is another prominent one, and embracing change management can be seen as a third.

Safety Culture, Climate, and Attitude

Safety Culture

Safety culture can be defined as the product of individual and group attitudes, perceptions, and values about workplace behaviors and processes that collectively result in safe work units and reliable organizational products (Cox & Flin, 1998; Flin et al., 2000; Hale, 2000; Williamson, Feyer, Cairns, & Biancotti, 1997; Zohar, 1980, 2003). In essence, safety culture describes the organizational attributes that reflect safe work environments (Guldenmund, 2000). This concept is deeply rooted in social systems, where comprehensive analysis of errors has exposed organizational (Reason, 1998), system (Perrow, 1984), process (Rasmussen, 1999), and human failures (Cook, Render, & Woods, 2000) responsible for most preventable adverse outcomes (Reason, 1990).

Safety Climate

Safety climate.

Safety Attitude

Other Resources

Significant Scholars

Citations and References

Citations

  1. ^ Charles W. L. Hill and Gareth R. Jones (2001). Strategic Management. Houghton Mifflin.
  2. ^ Rosabeth Moss Kanter (1984). The Change Masters. Unwin Paperbacks.
  3. ^ Russell L. Ackoff (1971). "Towards a System of Systems Concepts". Management Science.
  4. ^ John D. Sterman, Rebecca Henderson, Eric D. Beinhocker, and Lee I. Newman (2007). "Getting Big Too Fast: Strategic Dynamics with Increasing Returns and Bounded Rationality". Management Science.
  5. ^ "The Wall Street Journal on Management" (1984). Mentor.
  6. ^ UK HSE Research Report 367 (2007). Her Majesty's Stationery Office, Norwich.

References

Articles