Science communication

From Wikipedia, the free encyclopedia
For the academic journal, see Science Communication.
An illustration of Newton's cradle in motion.

Science communication generally refers to public communication presenting science-related topics to non-experts. This often involves professional scientists (called "outreach" or "popularization"), but has also evolved into a professional field in its own right. It includes science exhibitions, journalism, policy, and media production.

Science communication can aim to generate support for scientific research or study, or to inform decision making, including political and ethical thinking. There is increasing emphasis on explaining the methods of science rather than simply its findings. This may be especially critical in addressing scientific misinformation, which spreads easily because it is not subject to the constraints of the scientific method.[1][2][3][4]

Science communicators can use entertainment and persuasion including humour, storytelling and metaphors.[3][4] Scientists can be trained in some of the techniques used by actors to improve their communication.[5]

Science communication can also simply describe communication between scientists (for instance through scientific journals), as well as between non-scientists.


Partly due to a market for professional training, science communication is also an academic discipline. Journals include Public Understanding of Science and Science Communication. Researchers in this field are often linked to Science and Technology Studies, but may also come from history of science, mainstream media studies, psychology or sociology. As a reflection of growth in this field, academic departments, such as the Department of Life Sciences Communication at the University of Wisconsin-Madison, have been established to focus on applied and theoretical communication issues. Agricultural communication is considered a subset of science communication, concerned academically and professionally with communicating agriculture-related information among agricultural and non-agricultural stakeholders. Health communication is a related discipline.

Writing in 1987, Geoffrey Thomas and John Durant advocated various reasons to increase public understanding of science, or scientific literacy. They suggested that if the public enjoyed science more, there would presumably be more funding, more progressive regulation, and more trained scientists. More trained engineers and scientists could also make a nation more economically competitive.[1]

Science can also benefit individuals. It can simply have aesthetic appeal (e.g. popular science or science fiction), and in an increasingly technological society, background scientific knowledge can help people navigate it. The science of happiness is an example of a field whose research can have direct and obvious implications for individuals.[1]

Governments and societies might also benefit from more scientific literacy, since an informed electorate promotes a more democratic society.[1] Moreover, science can inform moral decision making (e.g. answering questions about whether animals can feel pain, how human activity influences climate, or even a science of morality).

Bernard Cohen points out potential pitfalls in improving scientific literacy. He explains first that we must avoid 'scientific idolatry'. In other words, science education must allow the public to respect science without worshiping it, or expecting infallibility. Ultimately scientists are humans, and neither perfectly altruistic, nor perfectly competent. Science communicators must also appreciate the distinction between understanding science and possessing a transferable skill of scientific thinking. Indeed, even trained scientists do not always manage to transfer the skill to other areas of their life.

Cohen is critical of what has been called "Scientism": the claim that science is the best or only way to solve all problems. He also criticizes the teaching of 'miscellaneous information' and doubts that much of it will ever be of any use (e.g. the distance in light years from the Earth to various stars, or the names of minerals). Much scientific knowledge, particularly if it is not the subject of public debate and policy revision, may never really translate into practical changes in the lives of the learners.[1]

Many criticisms of academic research in public understanding of science come from scholars in Science and Technology Studies. For example, Stephen Hilgartner (1990)[2] argues that what he calls 'the dominant view' of science popularization tends to imply a tight boundary around those who can articulate true, reliable knowledge. By defining a deficient public as recipients of knowledge, scientists get to contrast their own identity as experts. Understood in this way, the process of popularization is a form of boundary work: science communication may explicitly exist to connect scientists with the rest of society, but its very existence only acts to emphasise the boundary between them, as if the scientific community invited the public to play only in order to reinforce its most powerful boundary (according to work by Massimiano Bucchi or Brian Wynne).[6][7]

Biologist Randy Olson adds that anti-science groups can often be so motivated, and so well funded, that the impartiality of science organizations in politics can lead to crises of public understanding of science. He cites examples of denialism (for instance of global warming) to support this worry.[3] Journalist Robert Krulwich likewise argues that the stories scientists tell are invariably competing with the efforts of people like Adnan Oktar. Krulwich explains that attractive, easy-to-read, and cheap creationist textbooks were sold by the thousands to schools in Turkey (despite its strong secular tradition) due to the efforts of Oktar.[4]


Walter Lewin demonstrates conservation of potential energy. It can be difficult to share good scientific thinking as well as scientifically accurate information in a captivating way. Krulwich and Olson believe scientists must rise to that challenge using metaphor and storytelling.[3][4]

Marine biologist and film-maker Randy Olson published Don't Be Such a Scientist: Talking Substance in an Age of Style. In the book he describes an unproductive neglect of communication training for scientists. Don't Be Such a Scientist is addressed to his fellow scientists, whom he tells to "lighten up". He adds that scientists are ultimately the most responsible for promoting and explaining science to the public and media. This, Olson says, should be done according to a good grasp of social science; scientists must use persuasive and effective means like storytelling. Olson acknowledges that the stories told by scientists need to be not only compelling but also accurate to modern science, and says this added challenge must simply be confronted. He points to figures like Carl Sagan as effective popularizers, partly because such figures actively cultivate a likeable image.[3]

In his commencement address to Caltech students, journalist Robert Krulwich delivered a speech entitled "Tell me a story". Krulwich says that scientists are actually given many opportunities to explain something interesting about science or their work, and that they must seize such opportunities. He says scientists must resist shunning the public, as Sir Isaac Newton did in his writing, and instead embrace metaphors the way Galileo did; Krulwich suggests that metaphors only become more important as the science gets more difficult to understand. He adds that telling stories of science in practice, of scientists' success stories and struggles, helps convey that scientists are real people. Finally, Krulwich advocates for the importance of scientific values in general, and for helping the public to understand that scientific views are not mere opinions but hard-won knowledge.[4]

Actor Alan Alda helps scientists and PhD students get more comfortable with communication with the help of drama coaches, who use the acting techniques of Viola Spolin.[5]

Imagining science’s publics[edit]

In the preface of The Selfish Gene, Richard Dawkins wrote: "Three imaginary readers looked over my shoulder while I was writing, and I now dedicate the book to them. [...] First the general reader, the layman [...] second the expert [and] third the student".
Students explain science projects to visitors. Susanna Hornig promotes the message that anyone can meaningfully engage with science, even without going as deeply into it as the researchers themselves do.[8]

Many criticisms of the public understanding of science movement have emphasized that what it called the public was something of an (unhelpful) black box. Approaches to the public changed with the move away from public understanding of science. Science communication researchers and practitioners now often show their desire to listen to non-scientists, as well as an awareness of the fluid and complex nature of (post/late) modern social identities.[9] At the very least, they use plurals: publics or audiences. As the editor of Public Understanding of Science put it in a special issue on publics:

We have clearly moved from the old days of the deficit frame and thinking of publics as monolithic to viewing publics as active, knowledgeable, playing multiple roles, receiving as well as shaping science. (Einsiedel, 2007: 5)[10]

However, Einsiedel goes on to suggest that both views of the public are "monolithic" in their own way; each chooses to declare what something called the public is. Where public understanding of science might have ridiculed publics for their ignorance, the alternative "public engagement with science and technology" romanticizes its publics for their participatory instincts, intrinsic morality or simple collective wisdom. As Susanna Hornig Priest (2009)[8] concludes in her introduction essay on science's contemporary audiences, the job of science communication might be to help non-scientists feel they are not excluded, as opposed to always included; that they can join in if they want, rather than that there is a necessity to spend their lives engaging.

The process of quantitatively surveying public opinion of science is now largely associated with the public understanding of science movement (some would say unfairly[11]). In the US, Jon Miller is the name most associated with such work, and he is well known for differentiating between identifiable 'attentive' or 'interested' publics (that is to say, science fans) and those who do not care much about science and technology. Miller's work questioned whether American publics had the following four attributes of scientific literacy:

  • knowledge of basic textbook scientific facts,
  • an understanding of scientific method,
  • an appreciation of the positive outcomes of science and technology, and
  • a rejection of superstitious beliefs, such as astrology or numerology.[12]

In some respects, John Durant's work surveying the British public[13] applied similar ideas to Miller's. However, Durant and colleagues were slightly more concerned with attitudes to science and technology, rather than just how much knowledge people had. They also looked at public confidence in their knowledge, considering issues such as the gender of those ticking "don't know" boxes. We can see aspects of this approach, as well as a more "public engagement with science and technology"-influenced one, reflected in the Eurobarometer studies of public opinion. These have been running since 1973 to monitor public opinion in the member states, with the aim of helping the preparation and evaluation of policy. They look at a host of topics, not just science and technology but also defence, the euro, enlargement of the European Union, and culture. Eurobarometer's study of Europeans' Attitudes to Climate Change[14] is a good example. It focuses on respondents' "subjective level of information", asking "personally, do you think that you are well informed or not about…?" rather than checking what people knew.

Frame analysis[edit]

Science communication can be analysed through frame analysis, a research method used to analyse how people understand situations and activities.

Some features of this analysis are listed below.

  • Public accountability: framing an issue as a matter of blame or responsibility for public actions, e.g. accusations of political gain in the climate change debate
  • Runaway technology: creating a particular view of technological advancement, e.g. photos of an exploded nuclear power plant
  • Scientific uncertainty: questioning the reliability of a scientific theory, e.g. arguing that global climate change cannot be so bad if humans are still alive[15]

Heuristics[edit]

People make an enormous number of decisions every day, and to approach all of them in a careful, methodical manner is impractical. We therefore often use mental shortcuts known as "heuristics" to quickly arrive at acceptable inferences.[16] Tversky and Kahneman originally proposed three heuristics, listed below, although there are many others that have been discussed in later research.[17]

  • Representativeness: used to make judgments about probability based on how closely something resembles a prototype, e.g. how likely item A is to be a member of category B (is Kim a chef?), or that event C resulted from process D (could the sequence of coin tosses H-H-T-T have occurred randomly?).
  • Availability: used to estimate how frequent or likely an event is based on how quickly one can conjure examples of the event. For example, if you were asked to approximate the number of people in your age group that are currently in college, your judgment would be affected by how many of your own acquaintances are in college.
  • Anchoring and adjustment: used when making judgments with uncertainties. One will start with an anchoring point, then adjust it to reach an assumption. For example, if you are asked to estimate how many people will take Dr. Smith's biology class this spring, you may recall that 38 students took the class in the fall, and adjust your estimation based on whether the class is more popular in the spring or in the fall.
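
The representativeness example above can be checked with a few lines of code. This is an illustrative sketch, not drawn from the cited sources: under a fair coin, every specific four-toss sequence has the same probability, so H-H-T-T is exactly as likely as a sequence that "looks" random.

```python
from itertools import product

# Enumerate every possible sequence of four fair coin tosses.
sequences = list(product("HT", repeat=4))  # 2**4 = 16 outcomes

# With a fair coin, each specific sequence has the same probability.
prob = 1 / len(sequences)

# The sequence people often judge "unrepresentative" of randomness...
p_hhtt = prob if ("H", "H", "T", "T") in sequences else 0.0
# ...is exactly as likely as one that "looks" random.
p_htht = prob if ("H", "T", "H", "T") in sequences else 0.0

print(len(sequences), p_hhtt, p_htht)  # 16 0.0625 0.0625
```

Judging H-H-T-T as "non-random" therefore reflects the representativeness heuristic, not the actual probabilities.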

The most effective science communication efforts take into account the role that heuristics play in everyday decision-making. Many outreach initiatives focus solely on increasing the public's knowledge, but studies (e.g. Brossard et al. 2012)[18] have found that there is little – if any – correlation between knowledge levels and attitudes towards scientific issues.[19]

Simulation heuristic[edit]

Main article: Simulation heuristic

The simulation heuristic is used to judge how likely certain outcomes are based on the ease with which one can imagine a particular ending.[16] This heuristic can be used for many tasks, including prediction (Will the Jets win this football game?) and causality (Did Jim eat the last slice of pizza?). An interesting application of this heuristic is to the case of near misses. Consider the following example from Kahneman & Tversky:[20]

Mr. Crane and Mr. Tees were scheduled to leave the airport on different flights, at the same time. They traveled from town in the same limousine, were caught in a traffic jam, and arrived at the airport thirty minutes after the scheduled departure time of their flights.

Mr. Crane is told his flight left on time.

Mr. Tees is told that his flight was delayed, and just left five minutes ago.

Who is more upset, Mr. Crane or Mr. Tees?

Almost everyone says "Mr. Tees", because they cannot imagine how Mr. Crane could have caught his flight, while Mr. Tees might have made it if not for that slow pedestrian, or the exceptionally long security line. The simulation heuristic generates these "if only" conditions, which can be used to understand the negative feelings of frustration, indignation, etc. that arise from near misses such as that of Mr. Tees.

This simulation of how events might have occurred is referred to as counterfactual thinking, and can be used to try to identify a unique or unusual circumstance that led to a dramatic outcome. For example, consider a man who is shot during a robbery while shopping at a convenience store. Subjects award more damages to a man who was shopping at a store far from his house than to a man who was shopping at a store near his home that he commonly visits.[16]

Regarding simulations of future events, simply imagining hypothetical events makes them seem more likely to occur.[21][22] This phenomenon extends to a person's own behavior, as imagining oneself performing or refusing to perform an action causes changes in expectations about one's future behavior.[23][24] Notably, simulation is "more likely to increase the perceived likelihood of a potential outcome...than to reduce perceived likelihood of a potential consequence".[16] Thus, the implications of research on the simulation heuristic are particularly intriguing when designing outreach efforts intended to change behaviors, such as increasing recycling or decreasing fast food consumption.

See also[edit]

Notes and references[edit]

  1. ^ a b c d e As summarised in Gregory, Jane & Steve Miller (1998) Science in Public: communication, culture and credibility (New York: Plenum), 11-17.
  2. ^ a b Hilgartner, Stephen (1990) ‘The Dominant View of Popularization: Conceptual Problems, Political Uses’, Social Studies of Science, vol. 20(3): 519-539.
  3. ^ a b c d e (October 23, 2009.) "Randy Olson - Don’t Be Such a Scientist." (Includes podcast). Accessed May 2012.
  4. ^ a b c d e Miller, Lulu (July 29, 2008)."Tell Me a Story." (Includes podcast). Accessed May 2012.
  5. ^ a b Grushkin, Daniel (August 5, 2010). "Try acting like a scientist" The Scientist Magazine. Accessed May 2012.
  6. ^ Massimiano Bucchi (1998) Science and the Media (London & New York: Routledge).
  7. ^ Wynne, Brian (1992) ‘Misunderstood misunderstanding: Social identities and public uptake of science’, Public Understanding of Science, vol. 1 (3): 281-304. See also Irwin, Alan & Wynne, Brian (eds) (1996) Misunderstanding Science (Cambridge & New York: Cambridge University Press).
  8. ^ a b Priest, Susanna Hornig (2009) ‘Reinterpreting the audiences for media messages about science’, in Richard Holliman et al (eds) Investigating Science Communication in the Information Age: Implications for Public Engagement and Popular Media (Oxford: Oxford University Press) 223-236.
  9. ^ for example, see Irwin, Alan & Michael, Mike (2003) Science, Social Theory and Public Knowledge (Maidenhead & Philadelphia: Open University Press). chapter 6
  10. ^ Einsiedel, Edna (2007) ‘Editorial: Of Publics and Science’, Public Understanding of Science, vol. 16(1): 5-6.
  11. ^ Martin Bauer, Nick Allum and Steve Miller, "What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda", Public Understanding of Science, volume 16, 2007, pages 79-95.
  12. ^ Martin Bauer, Nick Allum and Steve Miller, "What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda", Public Understanding of Science, volume 16, 2007, pages 80-81.
  13. ^ e.g. Durant, John, GA Evans & GP Thomas (1989) ‘The Public Understanding of Science’, Nature 340: 11–14.
  14. ^ (September 2008.) "Europeans’ attitudes towards climate change." European Parliament and European Commission (accessed in May 2012).
  15. ^ See, for example, Nisbet, Matthew C. (2009). Communicating Climate Change: Why Frames Matter for Public Engagement. Environment (Online at, retrieved 20 October 2010).
  16. ^ a b c d Fiske, S. T., & Taylor, S. E. (1991). Social Cognition (2nd ed.). New York: McGraw-Hill.
  17. ^ Tversky, Amos; Kahneman, Daniel (1974-09-27). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–1131. doi:10.1126/science.185.4157.1124. ISSN 0036-8075. PMID 17835457. 
  18. ^ Brossard, Dominique; Lewenstein, Bruce; Bonney, Rick (2005-01-01). "Scientific knowledge and attitude change: The impact of a citizen science project". International Journal of Science Education. 27 (9): 1099–1121. doi:10.1080/09500690500069483. ISSN 0950-0693. 
  19. ^ Scheufele, D. A. (2006). Messages and heuristics: How audiences form attitudes about emerging technologies. In J. Turney (Ed.), Engaging science: Thoughts, deeds, analysis and action (pp. 20-25). London: The Wellcome Trust.
  20. ^ Kahneman, D. & Tversky, A. (1981). The simulation heuristic. (Report No. 5). Retrieved from
  21. ^ Sparks, Paul; Harris, Peter R.; Raats, Monique (2003-04-01). "Imagining and Explaining Hypothetical Scenarios: Mediational Effects on the Subjective Likelihood of Health-Related Outcomes1". Journal of Applied Social Psychology. 33 (4): 869–887. doi:10.1111/j.1559-1816.2003.tb01929.x. ISSN 1559-1816. 
  22. ^ Gregory, W. Larry; Burroughs, W. Jeffrey; Ainslie, Frances M. (1985-12-01). "Self-Relevant Scenarios as an Indirect Means of Attitude Change". Personality and Social Psychology Bulletin. 11 (4): 435–444. doi:10.1177/0146167285114009. ISSN 0146-1672. 
  23. ^ Anderson, Craig A.; Godfrey, Sandra S. (1987-09-01). "Thoughts about Actions: The Effects of Specificity and Availability of Imagined Behavioral Scripts on Expectations about Oneself and Others". Social Cognition. 5 (3): 238–258. doi:10.1521/soco.1987.5.3.238. ISSN 0278-016X. 
  24. ^ Anderson, Craig A. (1983-08-01). "Imagination and expectation: The effect of imagining behavioral scripts on personal influences.". Journal of Personality and Social Psychology. 45 (2): 293–305. doi:10.1037/0022-3514.45.2.293. ISSN 1939-1315. 

Further reading[edit]

  • Bauer, M & Bucchi, M (eds) (2007) Journalism, Science and Society (London & New York: Routledge).
  • Bucchi, M & Trench, B (eds) (2008) Handbook of Public Communication of Science and Technology (London & New York: Routledge).
  • Gregory, J & Miller, S (1998) Science in Public: communication, culture and credibility (New York: Plenum).
  • Holliman, R et al. (eds) (2009) Investigating Science Communication in the Information Age: Implications for Public Engagement and popular media (Oxford: Oxford University Press).
  • Nelkin, D (1995) Selling Science: How the Press Covers Science & Technology, 2nd edition (New York: WH Freeman).
  • Saab, BJ (2010) "Engaging the Clutch of the Science Communication Continuum – Shifting Science Outreach into High Gear" (Hypothesis 9(1) e12).

External links[edit]