Media critic

President Trump in October. (Mandel Ngan/AFP/Getty Images)

The Oxford Internet Institute has been busy. In the three months leading up to President Trump’s first State of the Union address, the institute’s researchers combed through a heap of data on social-media networks, the better to understand who’s guilty of spreading bogus news out there in the world.

The verdict? Trump supporters and diehard conservatives, pretty much.

Here’s how the researchers did their work: They carefully monitored content from a wide range of news sources, culling them until they’d identified a set of “junk” sources. As the study puts it: “The 91 sources of political news and information, which we identified over the course of several years of research and monitoring, produce content that includes various forms of propaganda and ideologically extreme, hyper-partisan, and conspiratorial political information.” To land on the list, a source had to fail at least three of the five criteria the researchers laid out. (See Page 2.) Those criteria are:

  • Professionalism: Does the site adhere to journalistic standards?
  • Style: Does the site “use emotionally driven language with emotive expressions, hyperbole, ad hominem attacks, misleading headlines, excessive capitalization, unsafe generalizations and fallacies, moving images, graphic pictures and mobilizing memes”?
  • Credibility: Does the site rely on flimsy sourcing and give oxygen to conspiracy theories?
  • Bias: Is there an ideological “skew” to the site’s work?
  • Counterfeit: Is the site using trade dress to mimic real news organizations, and does it disguise commentary as news reporting?

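The classification rule described above can be reduced to a simple threshold test. Here’s a minimal sketch of it in code, purely for illustration: the five criterion names and the fail-at-least-three threshold come from the study, but the data structures and example scores are hypothetical, not anything the Oxford team published.

```python
# Sketch of the study's rule: a source is coded "junk" if it fails
# at least three of the five criteria. Criterion names and the
# threshold come from the article; everything else is hypothetical.

CRITERIA = {"professionalism", "style", "credibility", "bias", "counterfeit"}
THRESHOLD = 3  # minimum number of failed criteria for the "junk" label

def is_junk(failed_criteria):
    """Return True if a source fails at least THRESHOLD of the five criteria."""
    recognized_failures = set(failed_criteria) & CRITERIA
    return len(recognized_failures) >= THRESHOLD

# Hypothetical examples of how a coder's judgments would translate:
print(is_junk({"style", "credibility", "counterfeit"}))  # True: 3 failures
print(is_junk({"bias"}))                                 # False: only 1
```

The simplicity of the rule is worth keeping in mind for what follows: a site can be heavily opinionated and still escape the label, or stumble on three borderline calls and get swept in.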
The resulting list ranged from obscure outlets to well-established sites.

Then the team examined just who was sharing links from all these sites on Twitter and Facebook, at which point it kicked off a process of categorization. The researchers examined the allegiances and activities of the people who were sharing the junk sources. What other stuff did they care about? If some of the people sharing the bogus links “are also talking about the GOP of Tennessee and another segment is talking about the GOP of Minnesota,” then they drop into a broader GOP group, says Philip N. Howard, a professor of sociology, information and international affairs at Oxford University and a member of the research team. From all this digital drudgery, the researchers split Facebook users into 13 groups and Twitter users into 10 groups. The Twitter analysis, for example, includes a “Republican Party” group, a “Democratic Party” group, a “Resistance” group, a “Conservative Media” group and a “Trump Support” group, among others.

The conclusions? On Twitter, the “Trump Support” group shared 95 percent of the “junk” sites on the study’s list. On Facebook, the “Hard Conservative” group shared 91 percent of the “junk” sites.

Slow down for a minute, however. The “junk” source list includes a number of sites that have provided very reliable information to the Erik Wemple Blog over the years. National Review, the Washington Free Beacon, Mediaite, Crooks and Liars, among several others, bear archives that don’t appear to merit a spot in Oxford’s bewildering basket. So the Erik Wemple Blog asked Howard about a few of the choices.

National Review, for starters. What did it do wrong? “I think they lost points on commentary masking as news,” Howard said. Yikes — please don’t judge the Erik Wemple Blog on that basis! Here’s a screen shot of the offending National Review showing what Howard calls “misleading headlines”:

(Screen shot)

Here’s another, this time displaying allegedly “misleading” sponsored content:

(Screen shot)

To the eyes of this blog, all that appears like the work of a legitimate online journal. If we’re going to whistle every site that stretches a headline here or there, we’ll need a lot more referees in U.S. media. As for sponsored content, same idea.

Now, on to Mediaite. Disclosure: This site has written favorably about the Erik Wemple Blog, equipping us with an almost insurmountable conflict of interest. Also equipping us with a conflict of interest is that the Erik Wemple Blog covers a lot of the same stuff as Mediaite — and a good bit of that material is covering the execution of honest-to-goodness junk news. So our coverage of that coverage sometimes stoops to the level of that coverage, if you get the idea.

In any case, we asked Howard for the case against Mediaite. “That one was probably scooped up because the far-right uses links to those stories as if they themselves are news items,” responded Howard. Perhaps, but the study laid out five criteria for classification as “junk” news sites. “Mediaite … would have been used by someone on Twitter during the 2016 election and associated with a political hashtag—that’s how it got on the watch list. Then as the coders looked at the sites over the year, they would have evaluated the things on our list of criteria for low quality political news and information,” Howard said. Of which criteria did Mediaite run afoul? Howard sent along this screen shot, showing “opinion is a distinct section but everything is opinion.”

(Screen shot)

And this, which allegedly qualifies as “curse commentary masking as news story,” Howard noted.

(Screen shot)

Again, these seem like trifles. The Drudge Report, one of the Internet’s busiest news hubs, is also on the list. “It’s more likely to carry stories from other junk news sites. That’s how Drudge ended up coded as junk. They aggregate mostly other junk,” says Howard. Shareblue, a politics and commentary site that’s part of David Brock’s lefty empire, secured the “junk” label from Oxford as well. “We debated on that one but we kept them in,” says Howard, who says that the research team examined multiple pieces of content from each of the listed sources.

Banging out a news taxonomy is a towering challenge. There’s just way too much news, way too many sources, and way too many tweets and Facebook postings out there. Given that ocean, the Oxford folks devised a sensible plan: Reach judgments about news sources, and then examine how stories from those sources get passed around. Yet the apparent overclassification of some right-wing sites raises the possibility that the study, at least in part, merely caught conservatives sharing conservative journalism.

Proportion matters, too. The study’s statistical spreadsheet identifies 23,431 shares/likes of content; National Review clocks in at 1,290, Mediaite at 809 and Drudge at 822. “As you can tell, some sites are straightforward to code and some of them we end up talking about for hours,” says Howard. “The top 10 or top 20 we feel quite confident about.”
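To put those counts in perspective, each of the disputed sites accounts for only a sliver of the study’s total. The share counts below come from the spreadsheet figures cited above; the percentage arithmetic is ours.

```python
# Per-site share counts as a fraction of the study's 23,431 total shares/likes.
total = 23_431
counts = {"National Review": 1_290, "Mediaite": 809, "Drudge": 822}

for site, n in counts.items():
    print(f"{site}: {n / total:.1%}")
# National Review: 5.5%
# Mediaite: 3.5%
# Drudge: 3.5%
```

In other words, even if these sites were misclassified, they represent well under a tenth of the sharing activity the study measured.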

The substandard type of reporting that Oxford cites in its “junk” classification is a milder species of news atrocity than “fake news,” the term that entered into vogue in the latter stages of the 2016 presidential election. As BuzzFeed’s Craig Silverman eloquently wrote in a recent story, “fake news” describes the output of those who “consciously lie for profit and propaganda.” Howard & Co. couldn’t fact-check every article, or even pat down the editing bona fides of all the sites they’d encountered on social media. “‘Junk news’ helps capture this broader category of stuff that’s designed to prevent voters from making good decisions on Election Day,” says Howard.