Here’s the thing about Facebook’s bar of “trending” topics: It’s essentially news aggregation presenting itself as a data point. It looks like a neutral platform — and, to some extent, Facebook allows its users to interpret it as one — but humans play a role in what you see and don’t see.
That disconnect was crystallized this week by a story from Gizmodo, which said some of Facebook’s “news curators” — the humans who work on Facebook’s trends — were artificially keeping some conservative news outlets and topics out of the trending bars. Facebook issued a strongly worded denial of many of those claims. There’s now a Senate committee inquiry about it. Facebook’s trending bar was, itself, trending on Facebook as a result of Gizmodo’s reporting.
Right now, a lot of that attention is going toward the question of whether Facebook is “suppressing” conservative points of view — itself a loaded question, since conservatives level the same accusation at many mainstream news organizations. The Fix looked at the political implications of Gizmodo’s central charge in more detail, but there’s something else worth examining here: a misunderstanding about what “trending” even means.
Essentially, the common perception of how Facebook’s trends worked and how they actually work stood on two sides of a very wide canyon that most people didn’t even know was there.
Facebook’s own description of its trending bar certainly doesn’t dissuade readers from thinking that the trending process is automated. Facebook says on its site that “Trending shows you topics that have recently become popular on Facebook” and that the topics any given user sees “are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location.” The description certainly doesn’t say anything about the role of curators in this process. In August, Re/code wrote a whole piece on how Facebook’s trends work. That piece described the process as an algorithmic one.
So it’s easy to understand why, on one side of the canyon, the popular assumption is that the trending list primarily represents the results of an algorithm, with minimal human involvement. On the other side, it’s more complicated: The algorithm guides what ends up on the bar, but human beings are actually responsible for making sense of what the algorithm is saying.
Given what we know about how information — particularly bad information — can circulate among Facebook’s users, it shouldn’t be that surprising that the “trending” topics Facebook shows you every day aren’t the result of a “pure” algorithm. Tarleton Gillespie examined this disconnect in response to Gizmodo’s report, noting:
“In many ways, a trending algorithm can be an enormous liability, if allowed to be: it could generate a list of dreadful or depressing topics; it could become a playground for trolls who want to fill it with nonsense and profanity; it could reveal how little people use Facebook to talk about matters of public importance; it could reveal how depressingly little people care about matters of public importance; and it could help amplify a story critical of Facebook itself.”
Gizmodo’s report was so explosive that it ended up trending on Facebook. Good for them. But no matter how this shakes out, it’s also good for us, because it’s becoming harder to see a bar of “trends” on a site like Facebook without thinking about how they got there in the first place.