Talk:Unexpected events

From Wikipedia, the free encyclopedia

I am new to contributing to Wikipedia, and I hope I am doing the right things. I have noticed that somebody (who?) tagged this page with the problems: essay-like, notability, original research. I am not sure whether and how I am allowed to remove these tags, so I have tried to improve the page as follows:

  • I have added references to show that this term reflects a new trend in accident investigation, and that it is not just a personal reflection or essay.
  • I have added some content, in line with the "General notability guideline" at http://en.wikipedia.org/wiki/Wikipedia:Notability
  • I hope that the added references help show that the content is not original research.

{{delrev}}

I hope this resolves these issues, and that the page will no longer be subject to deletion.

Avi Harel (talk) 16:12, 15 November 2011 (UTC)

I realize now that nobody has paid any attention to my request to review the latest version. So I will now try to follow the recommendations at http://en.wikipedia.org/wiki/Wikipedia:BOLD: I will remove the Multiple Issues tag and see what happens. Avi Harel (talk) 15:56, 19 November 2011 (UTC)

ticket # 2011112810015293

quotes covered by ticket # 2011112810015293
  • When faced with a human error problem, you may be tempted to ask 'Why didn't they watch out better? How could they not have noticed?'. You think you can solve your human error problem by telling people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure. These are all expressions of 'The Bad Apple Theory', where you believe your system is basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere.

  • Responses to incidents and accidents that are seen as unjust can impede safety investigations, promote fear rather than mindfulness in people who do safety-critical work, make organizations more bureaucratic rather than more careful, and cultivate professional secrecy, evasion, and self-protection. A just culture is critical for the creation of a safety culture. Without reporting of failures and problems, without openness and information sharing, a safety culture cannot flourish.