Science Blogging that Boomerangs


“This impedes reasonable public risk communication in the long run and creates a social group of people who harbor fears and anxieties not grounded in reality, but are immune against correction.” - A Boomerang Effect of an All-Clear Message on Radiation Risk.

Shutterstock: http://ow.ly/zDBK7

Experts have concluded in many cases that the risks of nanotechnology, electromagnetic fields and nuclear energy are acceptable in the ways we currently use these technologies. Risk assessments of nanoparticles, for example, are typically complicated and rarely lead to conclusions that would raise broad issues about the technology. They might instead caution against specific products used in specific ways. But try communicating the complexities of such risk assessments to the public – it’s not easy.

I could write a long blog post here telling you not to worry, that (theoretically) researchers have concluded in dozens of studies that titanium dioxide nanoparticles in sunscreens pose no substantial risks to your health. Or that (again theoretically) genetically modified foods cause no broadly known food allergies. Or that – not theoretically at all – vaccines have no demonstrated link to autism, and that their side effects are very mild except in very rare cases. But would these blog posts convince you not to worry about nanoparticles, GMOs or vaccine side effects if you were already worried about these things to begin with? According to a 2014 study in Human and Ecological Risk Assessment, the answer is unfortunately NO.

“[A] prominent example is nuclear energy, which a large majority of physicists considered safe while journalists, the media and the public remained worried.” - A Boomerang Effect of an All-Clear Message on Radiation Risk.

Boomerangs and Backfires

Boomerang effects are widely known to many mass communication scholars with a bent for social psychology. They are “backfire effects” whereby the effect of a message – a blog post about the actual safety of vaccines, for example – runs against the message’s or the message sender’s intention. For example, I might write a blog post to show you why nanoparticles are safe in your sunscreen. But if you came to my post already worried about nanoparticles, you’d probably leave even MORE worried. #Fail.

It turns out that, owing to a quirk in how we cognitively process messages, boomerang effects become more likely as a communicative stimulus becomes stronger. In simpler language, the more strongly and obviously I try to convince you of a particular point of view, the more I might drive you in the opposite direction. This is the boomerang effect.

In interviews with 240 participants in Switzerland, Uwe Hartung, Peter Schultz and Simone Keller asked participants if, after reading an article minimizing the health risks of electromagnetic radiation from mobile phone towers, they were more worried or more reassured about the risks. The researchers thought that readers already very worried about the dangers of radiation emitted by cell phone towers would assess the risk as higher after reading a message arguing the opposite.

According to their results, they had thought right. The more worried a participant was before reading the message – a message intended to assuage worries about mobile phone tower risks – the less likely it was that his or her worries decreased after reading. In fact, worried people became even more worried after reading the article about the negligible risks of mobile phone towers.

“The boomerang effect was shown to be stronger the stronger the threat in the message to the recipient’s freedom [to hold another position] was (Wicklund and Brehm, 1968).” - A Boomerang Effect of an All-Clear Message on Radiation Risk.

This boomerang effect happens for a variety of reasons. Let’s take my theoretical blog post about the relative safety of nanoparticles in sunscreens. The greater the opinion difference between me and you at the outset, the more likely a boomerang effect is to occur for you. If you start out with a very different opinion than me (that nanoparticles are dangerous, for instance), the information in my blog post is likely to backfire, making you more worried about the dangers of nanoparticles. The effect will be even more pronounced if you don’t see me as a credible source – “she isn’t even a scientist,” or, “she is biased because she works for a nanotechnology company.” [Which I don’t, of course]. The effect will also be more pronounced if you perceive that I'm threatening your freedom to hold a different opinion.

The effect is also worse the more overtly explicit my message is. If my blog post comes across as very obviously countering your preconceptions or challenging your views, especially in a dogmatic way, I might as well concede that you are going to leave my blog even more convinced of what you were already worried about. I’m never using sunscreen ever again.

“[S]everal studies show that (sometimes ludicrous) political beliefs were actually strengthened when strong believers were confronted with contrary evidence (Nyhan and Reifler, 2010 [PDF]).” - A Boomerang Effect of an All-Clear Message on Radiation Risk.

Boomerang effects also arise in reaction to especially emotionally manipulative messages. This is why overt fear appeals about global warming, for instance, so often backfire. Can you say ‘exaggerated’? That sinking polar bear picture is SOO photo-shopped.

So what do we do? How do we write about the complicated risks and non-risks of GMOs, nuclear energy, vaccines and nanotechnology without creating more public resistance to these technologies?

According to one view, we might in fact not be able to write about these things at all without creating some backfire effects. According to Mazur’s Quantity of Coverage theory, the higher the media attention to a technology, the stronger the public opposition to that technology, regardless of the balance or nature of the coverage.

But there might be an alternative. I might lead you through the safety of nanoparticles in sunscreens in a subtle way, through narrative and immersive storytelling as opposed to in-your-face evidence.

Story time. Shutterstock: http://ow.ly/zDCjO

“Transporting narratives can both change beliefs and motivate action, and may be particularly useful for conveying [health] information because they reduce counterarguments.” – Narratives and Cancer Information

It might be better to communicate about the aspects of controversial issues like climate change, GMOs and vaccines for which your readers don’t hold strong views or previous concerns. Jason Tetro (AKA The Germ Guy) recently wrote on his blog that the most important criterion in communicating public health messages is “the use of positive images and words to ensure the message is delivered in a way that avoids controversy.” To learn from the boomerang effects known to social psychology, we should especially avoid communicating about science and risks (or the lack of risks) in particularly exaggerated, manipulative, dogmatic or overly explicit ways.

Side note: The term dogmatic indicates messages inclined to lay down principles as incontrovertibly true. This seems to be one of the features of Ashutosh Jogalekar’s posts at Scientific American Blogs. Whatever his message intentions might have been – whether or not he was trying to assuage us that things have changed since Feynman’s days – several of his posts left those concerned about sexism in science even more so after reading. And rightly so.

But what does this all mean? What do boomerang effects mean for science journalists and science bloggers who often communicate on controversial issues and complicated risk assessments associated with science and technology? I’ll leave you with a list of suggestions that, while they require further research, follow from what we currently know about social psychology and message processing:

  • Avoid assuming that your readers agree with you.
  • Avoid strong emotional appeals to convince readers of the risks (or lack thereof) associated with a scientific technology.
  • Use immersive narrative and tell a story.
  • Avoid jargon.
  • Remember that messages explicitly countering readers’ positions or views will often backfire.
  • Avoid overt persuasion or argumentation. Stay positive.
  • Be a credible source. Don’t exaggerate, even if the ends seem to justify the means.
  • Avoid stating facts or evaluations as if they are incontrovertibly true.
  • Be patient.

Science bloggers have often found their voices in countering misconceptions and misperceptions surrounding evolution, vaccines, climate change, GMOs and “chemical free” products. But, I think, more of us should be aware going forward that boomerang effects are potentially creating unintended consequences for our well-intentioned efforts.

“Boomerang effects of all-clear messages mean there is a section of the population that cannot be reached by good news, who react to good news by denial.” – A Boomerang Effect of an All-Clear Message on Radiation Risk.


One Response to “Science Blogging that Boomerangs”

  1. Hank Roberts

    > pose no substantial risks to your health.
    Anyone who's had exposure to some statistics, epidemiology, and ecology, won't take personal risk as the important criterion in evaluating risk and benefit.

    Make a list of the concerns that pose no substantial risks to your health.
    Would they include structural failure in passenger aircraft? ebola? lead poisoning? thiamine deficiency? bioaccumulation of fat soluble pesticides? measles?

    The criterion I use for evaluating the risk of new profitable technology is public health and ecology.

    I can survive most anything, the odds are always in my favor.
    But I read __Silent Spring__
    And I read __Thinking Like A Mountain__.

    Waiting til there are bodies stacked on the sidewalk for pickup in my neighborhood isn't the rational approach to risk.
