Confirmation Bias Varies According to How Much We Think We Know

20 September 2013 by Matt Shipman


If you believe you’re already well-informed on science-related subjects, you are more likely to avoid science news stories that challenge your position on those subjects. That’s one of the findings of a recent study published online in Science Communication.

The paper, “Seeking Congruency or Incongruency Online? Examining Selective Exposure to Four Controversial Science Issues,” evaluated which science news stories people choose to read online. Specifically, it looked at the extent to which people choose to read news that is consistent with their pre-existing opinions about a specific subject (“consonant”), news that is not consistent with their opinions (“dissonant”) and news that is neutral, neither supporting nor opposing their pre-existing opinions. The paper also examined how other factors come into play, such as whether people consider themselves to be religious. The paper was authored by S. Mo Jang of the University of Michigan and was published online Sept. 16.

The Study

The study assessed the behavior of 238 adults in the United States, with varying political affiliations (28 percent Republican, 30 percent Democrat, 42 percent independent). Each study participant was asked to respond to a survey that assessed their attitude toward four scientific subjects: stem cell research, evolution, genetically modified (GM) foods and climate change. For example, participants were asked to respond on a six-point scale (1 = strongly disagree, 6 = strongly agree) to questions like “I think human beings evolved from earlier species of animals.”

Study participants were also surveyed to determine their predispositions on subjects including how much they pay attention to science in mass media outlets (TV/newspapers), the amount of scientific knowledge they thought they had (“perceived scientific knowledge”), the amount of scientific knowledge they actually had (via a basic true/false test), and their “religiosity” (or how religious they were).

Participants were then given time to explore a fake online news site that displayed 12 science news articles. There were three articles on each of the science topics: stem cells, evolution, GM foods and climate change. For each topic there was one article supporting the relevant scientific position (e.g., endorsing the reality of evolution), one article opposing it, and one neutral article. The articles were all of similar length and had headlines of similar length.

Jang used behavior tracking software to determine which news articles each participant clicked on, as well as how long the participant spent viewing the page of each article.

The Findings

If you’re like me, you’re assuming that confirmation bias won out and that the participants were more likely to read articles that were consistent with their opinions on the science topics. Wrong.

On average, 34.56 percent of the stories that study participants read were consonant, 23.64 percent were neutral and 41.8 percent were dissonant. In other words, well over half of the articles people chose to read didn’t affirm their pre-existing beliefs. The trend was even more pronounced for time spent reading: participants spent 48.1 percent of their reading time on dissonant stories and only 23.1 percent on consonant ones.

But those numbers are averages. For stem cell research and GM foods, participants were more likely to read dissonant news stories (far more likely, in the case of stem cell research). But for evolution and climate change, people did prefer stories that confirmed their existing biases (though not by much).

That’s interesting. But what really got my attention were the findings on the role that individual characteristics played on news selection.

The “I Think I’m Smart” Bias?

People who thought they knew a lot about science were far more likely to read news stories that agreed with them, and far less likely to read news stories they disagreed with. That’s not to be confused with people who actually did know a lot about science. Actual scientific knowledge (meaning how scientifically literate an individual is) didn’t predict what sort of stories people would read at all.

Religion and the extent to which people follow science news via mass media outlets also predicted the behavior of study participants, though not as much as I expected. The more religious a participant was, the less likely they were to read science news articles that challenged their views. Conversely, the more attention a participant paid to science stories in mainstream media, the more likely they were to seek out dissonant science stories. However, neither effect was as pronounced as the “perceived science knowledge” factor.

My Two Cents

One thing I took away from this paper is that confirmation bias may not affect the behavior of online news consumers as powerfully as I (and others) thought. I’m still puzzling over what that means in practical terms for science communication practice, as opposed to science communication research.

It also highlights a significant problem facing science communication efforts. The fact that someone is confident they understand something does not mean that they actually do understand it. And if they choose to read only news items that reinforce that confidence, they are increasingly unlikely to believe they are wrong. So how do you reach them? That’s a problem that I’m not sure how to solve. But it’s worth thinking about.

Note: Citation below.

“Seeking Congruency or Incongruency Online? Examining Selective Exposure to Four Controversial Science Issues,” S. Mo Jang, Science Communication, published online Sept. 16, 2013. DOI: 10.1177/1075547013502733


4 Responses to “Confirmation Bias Varies According to How Much We Think We Know”

  1. Paige Brown

    Matt, AWESOME post.

    One small thing I'm worried about: the fact that participants were exposed to the articles immediately following an attitude test. Participants, perhaps especially participants who think they know a lot about science, might be primed to be consistent with the attitudes they just expressed in survey questions as they peruse different potential articles. In other words, if they remember the attitudes they just expressed, they might take that as a cue of which article to read (why would you want to read an article talking about the reality of climate change if you just told survey researchers that you didn't believe in human-caused warming?)

    But that might be a small issue. I think it's interesting that sometimes people might want to read an article against their prior beliefs (perhaps to actively argue against the content?). It is encouraging that confirmation bias depends on other individual factors.

  2. Matt Shipman

    Good points, Paige. One reason I think this post is timely (in addition to the fact that the paper was just made publicly available) is the fact that the IPCC report is about to come out. And any discussion of climate change news is bound to include a mention of confirmation bias!

    Also: sorry for the delay in responding to your comment. I meant to schedule this post to roll out on Monday morning -- instead, it went out Friday evening. Oops! User error, all the way. :)
