Time for a heartfelt confession: We're awfully fond of highlighting and discussing new academic research on this blog—from economic modeling to political science papers to the latest medical studies. After all, who doesn't love a good study showing that a peek at adorable kitten pics can amp up your productivity at work?


But now comes a brand-new paper in the Public Library of Science suggesting that... well, that we should all be wary of hot new research that's trumpeted by the press. This isn't an argument against science. The scientific process works quite well in sifting strong claims from weak ones. The problem, it turns out, is that the media tends to over-emphasize exciting new findings while rarely following up to see whether those conclusions have been re-affirmed or debunked later on.

To see this, the authors, led by François Gonon of the University of Bordeaux, identified 47 scientific publications on attention deficit hyperactivity disorder (ADHD) published during the 1990s, and then went through 347 English-language newspaper articles covering that research. From there, they narrowed the list to the ten most-covered papers and examined the coverage closely. What they found was stunning.

Those ten papers on ADHD received an outpouring of press coverage—223 of the newspaper articles in the sample. That's not surprising, since most of them were advancing new hypotheses about ADHD. But those ten studies also generated a lot of follow-up research. In all, there were at least 67 follow-up biomedical studies that ended up either debunking or weakening most of the initial findings. Yet newspapers were far less thorough in covering these subsequent papers—the debunking work generated just 57 newspaper articles.

There's a clear imbalance here, note Gonon and his colleagues. The biomedical papers that get the most attention are often the most novel, finding some unexpected effect or testing new hypotheses. But, precisely because they are so novel, they're more likely to be untrue. That would be fine, except newspapers tend to highlight the initial studies while paying less attention to the party-pooping follow-ups.

Part of the problem, too, lies with the journals themselves. Those thrilling initial studies tend to get published in the most prestigious medical journals, like The Lancet or the New England Journal of Medicine, whereas the follow-ups are often confined to less-visible outlets that few reporters read.

The study doesn't say whether this press bias occurs in other areas of coverage, such as economics or political science. We’ll need—yes—follow-ups to Gonon’s research! But let’s assume this is a widespread problem. Is there an easy solution? That's less clear.

After all, it's great when an organization like the National Academy of Sciences or the Intergovernmental Panel on Climate Change can write up a big, thorough overview of a scientific topic, sifting through many studies in order to judge claims and counterclaims. Those assessments are a good antidote to relying on a single, much-hyped new paper on a topic. But such thorough assessments don't come along very often—the IPCC, for instance, typically takes five or six years to release its updated reports on the state of climate science.

The Economist, which has a good write-up of Gonon's paper, offers this common-sense advice: "The matter goes beyond simply not believing what you read in the newspapers. Rather, it is a question of remembering that if you do not read subsequent confirmation, then the original conclusion may have fallen by the wayside." And over at the New York Times, Andrew Revkin has a fascinating post wrestling with the difficulty that science reporters face in conveying the complexity of new scientific research without making their coverage too boring. 

In any case, Gonon's study offers a good cautionary tale. Academic research is fascinating! But it's important to check up on what follow-up studies say—a tedious-sounding paper languishing in a B-list journal that finds no effect for a clever new medical treatment is probably worth a blog post, too. (And yes, we'll be sure to keep an eye out for whether anyone comes along with a study that debunks Gonon's findings...)

Related: The world's most boring journal—and why it's good for science.