This year's global press freedom ranking, released annually by the international NGO Reporters Without Borders, appears to show something alarming: The U.S. has dropped in the rankings from the world's 32nd most free for media to only 46th. The drop in rankings has attracted significant media attention here in the U.S., helped along by a Reporters Without Borders campaign warning that the "U.S. plummets in global press freedom rankings." Josh Stearns, press freedom campaign director at Free Press, a watchdog group that is not connected to Reporters Without Borders but covers similar issues, wrote at the Huffington Post that the ranking change reflected "a profound erosion of press freedom in the United States in 2013."
There are serious and important reasons to worry about press freedom in the United States, particularly in the Obama administration's treatment of whistleblowers, and Reporters Without Borders does important work around the world. But the warnings that the U.S. is "plummeting" are simply not borne out by the data, which have been deeply misread and over-interpreted in media coverage.
Most of the coverage is based on the premise that 2013 saw a sudden, alarming and perhaps unprecedented decline in press freedom because the ranking dropped from the previous year. This is just bad data journalism for two big reasons. First, it confuses relative rankings with absolute scores – more on this later. Second, it ignores the fact that Reporters Without Borders has been raising and dropping the U.S. ranking for years.
Take a look at this chart of the U.S. ranking since the report's 2002 inception and tell me if it really shows the most recent ranking as such a singular event:
What this shows to me is that the U.S. ranking has alternated for over a decade between rankings in the 20s and in the 40s. That's a remarkably wide range to have jumped back and forth so often. More to the point, it shows that this year's "plummeting ranking" and "profound erosion" are something that seems to happen every couple of years, with U.S. press freedom miraculously and immediately recovering every single time.
In fact, according to this report, press freedom in the U.S. has actually improved in the rankings from only two years ago, when it was ranked 47th. And we're much better off than we were in 2006 or 2007, when the rankings were 53rd and 48th, respectively. So either press freedom has been wildly yo-yoing between good and bad every other year for the past decade, in which case this year's drop is nothing new, or maybe the ranking actually is just not a very effective indicator of press freedom in the United States.
Now, back to the difference between absolute scores and relative rankings. Saying that the U.S. dropped in the rankings does not, in itself, affirmatively prove that U.S. press freedom worsened absolutely. It just shows the U.S. performing worse relative to other countries. So it's entirely possible that the change is driven less by press erosion in the U.S. and more by press freedom improvement in other countries; maybe the news here is just that things got a lot better in Central Europe or Latin America.
Or there's another possibility: Maybe the U.S. is one of 20 or so countries that have tightly clustered absolute scores. That would certainly help explain how the U.S. could swing so dramatically back and forth between rankings in the 20s and in the 40s. In the 2011-2012 report, for example, the U.S. had an absolute press freedom score of 19, on a range from negative 10 to 125. There are 25 other countries that score within five points of the U.S.; that's a pretty tight little cluster of data. By that metric, a relatively minor change in the U.S.'s absolute score could change its relative ranking dramatically.
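The clustering effect is easy to demonstrate with made-up numbers. The sketch below uses entirely hypothetical countries and scores, not the actual Reporters Without Borders data: 25 countries bunched within a few points of each other, and a U.S. score nudged by the same 4.8 points discussed later in this piece. The small score change moves the rank by more than 15 places.

```python
# Illustrative only: hypothetical absolute scores (lower = freer), NOT the
# real Reporters Without Borders data. Shows how a small change in absolute
# score can swing a country's rank when many scores are tightly clustered.

scores = {"Leader1": 0.0, "Leader2": 2.0, "Leader3": 4.0}
# 25 hypothetical countries clustered between 17.05 and 24.25:
scores.update({f"Cluster{i}": 17.05 + i * 0.3 for i in range(25)})
scores.update({"Laggard1": 60.0, "Laggard2": 90.0})

def rank_of(country, table):
    """1-based rank by ascending score (lower score = better rank)."""
    ordered = sorted(table, key=table.get)
    return ordered.index(country) + 1

us_before = dict(scores, US=18.2)
us_after = dict(scores, US=23.0)   # a 4.8-point change on a ~100-point scale

print(rank_of("US", us_before))   # 8
print(rank_of("US", us_after))    # 24
```

A 4.8-point move drops the hypothetical U.S. from 8th to 24th, simply because so many countries sit between the old score and the new one.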
There's a third possibility to explain the changes in the U.S. score: The Reporters Without Borders survey methodology changes from year to year. It's possible that what we're seeing here is not press freedom changing in the U.S. but the NGO just measuring it differently, asking different questions of different people and thus producing different "scores" for the same thing.
This is all why, in general, it's not a good idea to make an absolute statement ("there was a profound erosion of press freedom in the United States," as the Free Press campaign director wrote) based on a relative ranking. That's just not how data works.
It's possible that the U.S. absolute score for press freedom has changed dramatically over the last year, which actually would demonstrate an absolute change in U.S. press freedom. Unfortunately, the organization's metric and methodology appear to change from year to year, which makes comparing the score across years difficult or impossible. (You can see absolute scores for 2014 by downloading this Excel file; each country's absolute score is, I believe, the third number in the A-column. So Afghanistan's score is 37, for example; higher scores are worse.) It appears that the U.S. absolute score is 23 this year and was 18.2 the previous year. Assuming those are measured using the same methodology, that would be a change of 4.8 points on a 100-point scale. That's not good news, of course, but neither does it suggest a "profound erosion of press freedom."
To be clear, the Reporters Without Borders methodology is not arbitrary. The organization sends out questionnaires to 18 different NGOs that deal with press freedom, to 150 Reporters Without Borders "correspondents," and to an unspecified number of "journalists, researchers, jurists and human rights activists." They fill out the forms by assigning numerical scores to subjective value judgments (lots of social science research works this way), for example by scoring the transparency of TV license-granting processes on a scale from one to 10. The organization collects all the forms and puts the numerical scores through a formula, which produces an absolute score; those scores are then ranked.
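The pipeline described above – questionnaire answers aggregated by a formula into an absolute score, which is then ranked – can be sketched in miniature. The details of the actual Reporters Without Borders formula are not given here, so this stand-in simply averages the numeric answers; the country names, question names, and response values are all hypothetical.

```python
# A minimal sketch of the scoring-then-ranking pipeline, assuming a plain
# average as a stand-in for the (unspecified) Reporters Without Borders
# formula. All countries, questions, and answers below are hypothetical.

def absolute_score(questionnaires):
    """Average every numeric answer across all returned questionnaires.

    Each questionnaire maps a question (e.g. the transparency of the
    TV license-granting process, scored 1-10) to one respondent's judgment.
    """
    answers = [v for form in questionnaires for v in form.values()]
    return sum(answers) / len(answers)

responses = {
    "Country A": [{"license_transparency": 3, "source_protection": 2},
                  {"license_transparency": 4, "source_protection": 3}],
    "Country B": [{"license_transparency": 7, "source_protection": 6},
                  {"license_transparency": 8, "source_protection": 5}],
}

scores = {c: absolute_score(qs) for c, qs in responses.items()}
ranking = sorted(scores, key=scores.get)  # best (lowest) score ranked first

print(scores)   # {'Country A': 3.0, 'Country B': 6.5}
print(ranking)  # ['Country A', 'Country B']
```

The point of the sketch is that the ranking is a pure byproduct of the absolute scores: change the formula, the questions, or the respondents, and the rankings shift even if nothing on the ground has changed.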
None of this is to dispute that there are serious and ongoing issues regarding press freedom in the United States, and that Reporters Without Borders is doing laudable work in tracking and reporting those issues in the United States and around the world. Still, year-to-year ranking fluctuations in the Global Press Freedom index are a very unreliable metric for measuring changes in press freedom within individual countries.
It's just not good data, even if it does make for good copy. I should know: In 2012, the last time that the U.S. ranking "plummeted," to 47th place, I wrote up the same alarming-looking results as everyone else. Imagine my surprise when, two years later, I saw that the U.S. ranking had made the same plummet to almost exactly the same spot. I only wish I'd looked into the data sooner.
This post originally identified Josh Stearns as Campaign Director for Reporters Without Borders' Global Press Freedom campaign. In fact, he is the Press Freedom Campaign Director at Free Press, which is not connected to Reporters Without Borders. Stearns has since posted some additional thoughts on the report's metrics, which you should read here.