Can News Media Boost Citations? Examining One (Old) Study

14 November 2012 by Matt Shipman, posted in Uncategorized

I recently raised the idea that media coverage of a research article may boost that article’s citations, and mentioned a 2003 study by Vincent Kiernan that found a correlation between news stories and citation rates. Now I’d like to talk about another, older study that makes a stronger claim regarding the link between news and citations.

The paper, “Importance of the Lay Press in the Transmission of Medical Knowledge to the Scientific Community,” was published in 1991 in the New England Journal of Medicine. While the media landscape has changed dramatically over the past two decades, the paper makes some interesting observations and is well worth a look. (Note: the paper was brought to my attention by Peter Edmonds, following an online conversation between Liz Neeley and me about the lack of papers addressing this issue. Further, when I dismissed the paper as dated, they argued that I should take a closer look. They were right.)

So, do news stories boost citations?

The paper stems from a study by researchers at the University of California, San Diego, who were trying to figure out whether medical articles covered in the New York Times received more scientific citations than articles the Times did not cover. On a more interesting note, they also “sought to discover whether coverage by the Times genuinely increased the effect of an article (the publicity hypothesis), or merely earmarked outstanding articles that would have garnered many citations without such coverage (the ‘earmark’ hypothesis).” Incidentally, I love the term “earmark hypothesis.” I had no idea that argument had a name.

What makes this paper particularly interesting is what Neeley called its “built-in experimental control,” namely the fact that it incorporated a 12-week strike at the Times in 1978 as part of its study design.

During that strike, the Times produced an “edition of record” but did not distribute issues to the public. To quote the paper: “During the strike the Times continued to earmark Journal articles it deemed worthy of coverage, but it did not publicize this information to its readership. By comparing the number of citations of Journal articles [covered during the strike with those of articles] published when the Times was not on strike, one can discover whether publicity in the popular press truly amplifies the transmission of scientific findings to the medical community.”

In their study, the researchers examined the citation rates of every Journal article covered by the Times in 1979, tracking citations for 10 years after each article’s publication. The researchers also identified “control” articles that they deemed comparable to the articles covered by the Times, and then compared the two sets of citation rates. They found that articles publicized in the Times “received consistently more scientific citations in each of the 10 calendar years after their publication than did matched control articles not reported by the Times.”

However, the Journal articles earmarked by the Times during the newspaper’s strike – which would have been publicized, but weren’t – did not receive more citations than their controls. In fact, they received fewer. In short, the study authors note, “the earmark hypothesis seems implausible.”

What now?

This paper reflects a media landscape that no longer exists. It predates the 24-hour news cycle, online news media, blogs and social media. But I think it is still relevant.

Here’s a quote from the 1991 paper: “Every medical researcher develops systematic and nonsystematic mechanisms for reducing and filtering what would otherwise be an overwhelming flow of scientific information. Our evidence suggests that a lay publication may serve as one of these filtering mechanisms, even for scientists.”

Here’s a quote from biomedical researcher and blogger Jalees Rehman, which was written earlier this month: “To avoid drowning in the information overload, researchers have developed multiple strategies to survive and navigate their way through all this published data. … In order to keep up with scientific developments outside of my area of expertise, I have begun to rely on high-quality science journalism.”

In other words, the more things change, the more they stay the same. The mechanisms for delivering information may have changed, but the challenge of keeping abreast of scientific developments has not. News articles, in print or online, likely still play a role in drawing attention to new findings within the research community.

Note: I linked to papers that may not be open access. Citations below.

“Importance of the lay press in the transmission of medical knowledge to the scientific community,” New England Journal of Medicine, David P. Phillips, et al., DOI: 10.1056/NEJM199110173251620

“Diffusion of News About Research,” Science Communication, Vincent Kiernan, DOI: 10.1177/1075547003255297


8 Responses to “Can News Media Boost Citations? Examining One (Old) Study”

  1. SciComm Matters Because…It’s Tough to Keep Up with Journals | Communication Breakdown

    [...] What all of this tells us is that science communication is more important than ever for researchers: because anything researchers can do to raise the profile of their articles will improve their citation rates, for the simple reason that more people in the research community will be aware of them. The well-known Kiernan study (2003) showed this to a certain extent, highlighting a correlation between newspaper coverage of a journal article and the number of citations it received. (Update: Since first posting this piece, I came across another relevant study, and wrote about it here.) [...]
