The “sweeping changes” that Facebook just announced to its trending topics module are, in fact, a whole lot less dramatic than the headlines would suggest. In a nutshell, Facebook has heard your concerns, reviewed Trending … and nixed the least important, least impactful steps in the process.
A letter from Facebook general counsel Colin Stretch to Senate Commerce Committee Chairman John Thune, published online Monday night, details the findings of an internal review the company launched after a report in Gizmodo accused it of political bias. Stretch said Facebook found no evidence of any such bias — a claim that has also been debunked, several times over, by former curators on the team — but that it would nevertheless be making some minor tweaks to trending.
Those tweaks will consist, primarily, of removing human safeguards on the complex, mathematical systems that already do the heavy lifting behind the trending module. Where curators once verified that a trending item represented a real event before approving it for the feed, for instance, they’ll no longer use external sources to do so. And if a major news story breaks across multiple major outlets, but takes longer to trend organically on Facebook, curators will no longer have the power to accelerate that process. (If you rely on trending to surface news for you — and we admittedly don’t know how many people do that — you’ll now be getting some of your news slightly later than, say, your Twitter-savvy friends.)
These were all very minor roles to begin with, mind you: The algorithm already surfaced, ranked and redistributed trending stories with minimal human input. The role of human curators, according to both former employees and Facebook itself, was somewhere between babysitter and copy editor.
And yet, we are uncomfortable with other people, who strike us as eminently corruptible (!). And so we are comforted when Facebook tells the Commerce Committee that a tiny, incremental bit of editorial power has been reassigned to the already-powerful, “incorruptible” algorithm.
This should not actually reassure anyone, of course. Algorithms are just as prone to bias as human editors, and their actions are, in many ways, even more inscrutable. (It’s no coincidence that social media algorithms are often called “black boxes,” devices so computationally complex and secretive that even Congressional inquiries could not unravel them.)
Just this Monday, a ProPublica investigation found that “risk assessment” programs, used to calculate a criminal defendant’s risk of reoffending, are clearly, if inadvertently, biased against African Americans. Similar patterns have been observed everywhere from Google to Netflix. And when Intersect invited a group of more than 100 Facebook users to track their trending topics last week, many expressed concern about the way stories were selectively targeted to them: Just check out your current “ad preferences” for a better glimpse of the racialized, gendered and socioeconomic boxes that Facebook algorithmically puts you in.
Unfortunately, as I wrote last week, no one seems particularly interested in investigating this (larger, more important!) piece of the process. In fact, in a statement, Thune lauded Facebook’s changes to trending topics, calling the company’s response “forthcoming” and “serious.”
What the senator apparently does not realize is that this is less transparency than the clever illusion of it. Facebook appears to be opening up, even as it shuffles more of its work behind an opaque algorithmic curtain.