Outrage factor

From Wikipedia, the free encyclopedia
In public policy, the outrage factor is the portion of public opposition to a policy which does not derive from knowledge of the technical details. While policy analysis by institutional stakeholders may focus on risk-benefit analysis and cost-benefit analysis, popular risk perception is not informed by the same concerns, and so the successful implementation of a policy relying on public support and cooperation will need to address the outrage factor when informing the public about the policy.

Factors

The term "outrage factor" originates from Peter Sandman's 1993 book, Responding to community outrage: strategies for effective risk communication.[1][2]

(As of February 2012, Responding to Community Outrage is available online under a Creative Commons license and may be freely accessed and downloaded.)

He gives the formula:[3]

Risk = Hazard + Outrage

In an interview with New York Times journalist and Freakonomics author Stephen J. Dubner, Sandman emphasised "the most important truth in risk communication is the exceedingly low correlation between whether a risk is dangerous, and whether it’s upsetting."[4]

Sandman enumerates several sources of outrage:

Voluntary vs. coerced
People may object to a compulsory risk even when it is less dangerous than a risk they choose freely, such as a dangerous sport.[5]
Natural vs. industrial
A human-made source of risk provides someone to blame; household radon is less publicly feared than less carcinogenic artificial sources. Although radon-induced lung cancers cause approximately 21,000 deaths per year in the USA, the EPA struggles to persuade citizens to test their own homes for radon.[6][7]
Familiar vs. exotic
People worry less about familiar things than about things outside their range of experience. Water is so familiar that the large number of children who drown in swimming pools each year causes little alarm (the old saying "familiarity breeds contempt" applies here). A news headline reporting that scores of children die each year from excessive intake of dihydrogen monoxide (the chemical name for water) would provoke a very different reaction.[8]
Memorable or not
Memorableness may derive from personal experience, news reports, fiction, or iconic images or symbols of something bad. A memorable risk produces a kind of déjà vu: the feeling of knowing how events will turn out, even when the situation actually being experienced differs considerably from the earlier one.[9]
Dreaded or not
Dread can exaggerate perceived risk: some topics simply send a shudder through us. Shark attacks, cancer and nuclear radiation all provoke a fear that is largely independent of the specific circumstances. For example, a patient told they have a form of cancer with a 90% chance of total recovery would probably still be more frightened than one told they have a heart condition with the same chance of total recovery.[10]
Chronic vs. catastrophic
People react far more strongly to 400 deaths in a single aircraft crash than to 1,000 deaths in road crashes spread over a year. Sandman asks us to consider what would happen if everyone who died of smoking in a year died in one city on one day, and suggests that smoking would become illegal overnight.[11]
Knowable or not
People take a worst-case approach to uncertainty. Sandman speaks of "duelling PhDs": when experts disagree, the public is likely to believe things could be much worse than either expert claims. Experts who understood this would realise that agreeing on a range of values, rather than arguing, would reassure people that there are reasonable limits to what might occur.[12]
Controlled by me vs. others
[13]
Fair or not
[14]
Morally relevant or not
[15]
Can I trust you or not
[16]
Is the process responsive or not
[17]

Issues

The relevance of public outrage has been acknowledged in discussions of various policy debates, including nuclear safety,[18] terrorism,[19] public health,[20][21] and environmental management.[1][2]

Addressing outrage

The mass media often frame policy debate by focusing on outrage factors. For proponents of a policy trying to address outrage, Sandman recommends acknowledging and empathising with the underlying sentiment.

Around the turn of the century, Sandman and a small Australian risk consultancy, Qest Consulting, released the 'OUTRAGE' software to help organisations fearing stakeholder outrage avoid getting into trouble in the first place. Sandman still regards its content highly, but the product sold poorly and is now available as freeware.

References

Notes

  1. ^ a b Nebel, Bernard J.; Richard T. Wright (1993). Environmental science: the way the world works (4th ed.). Prentice Hall PTR. pp. 392–3. ISBN 0-13-285446-5. 
  2. ^ a b Hird, John A. (1994). Superfund: the political economy of environmental risk. JHU Press. p. 70. ISBN 0-8018-4807-5. 
  3. ^ Sandman, p.1
  4. ^ Stephen J. Dubner (2011-11-29). "Risk = Hazard + Outrage: A Conversation with Risk Consultant Peter Sandman". 
  5. ^ Sandman, pp.14–17
  6. ^ "A Citizen's Guide to Radon". www.epa.gov. United States Environmental Protection Agency. October 12, 2010. Retrieved January 29, 2012. 
  7. ^ Sandman, pp.17–19
  8. ^ Sandman, pp.19–23
  9. ^ Sandman, pp.23–27
  10. ^ Sandman, pp.27–29
  11. ^ Sandman, pp.29–33
  12. ^ Sandman, pp.33–37
  13. ^ Sandman, pp.37–41
  14. ^ Sandman, pp.41–44
  15. ^ Sandman, pp.44–49
  16. ^ Sandman, pp.49–62
  17. ^ Sandman, pp.62–73
  18. ^ Williams, David R. (1998). What is safe?: the risks of living in a nuclear age. Royal Society of Chemistry. p. 39. ISBN 0-85404-569-4. 
  19. ^ Kayyem, Juliette; Robyn L. Pangi (2003). First to arrive: state and local responses to terrorism. BCSIA studies in international security. MIT Press. p. 68. ISBN 0-262-61195-3. 
  20. ^ Milloy, Steven J. (1995). Science without sense: the risky business of public health research. Cato Institute. p. 8. ISBN 1-882577-34-5. 
  21. ^ Pencheon, David; Melzer, David; Guest, Charles; Gray, Muir (2006). Oxford handbook of public health practice. Oxford handbooks (2nd ed.). Oxford University Press. p. 221. ISBN 0-19-856655-7.