Humans will serve a janitorial role in the process, while the algorithms take more control. The people who work on Trending will now be responsible for “confirming that a topic is tied to a current event in the real world,” the company said in a blog post. For example, the company explained that “#lunch” wouldn’t end up trending, even though a bunch of people talk about it on Facebook every day (at, duh, lunchtime).
According to the guidelines for the Trending reviewers, they’re required to accept “all” algorithmically surfaced topics, unless a topic isn’t tied to a “real-world event” or duplicates one that is already trending.
“This is something we always hoped to do but we are making these changes sooner given the feedback we got from the Facebook community earlier this year,” the company said. Though not cited by name, two Gizmodo reports detailing the control humans had in the process prompted many of Facebook’s changes to Trending. One former employee accused the company of suppressing stories that appealed to conservatives, a specific charge that Facebook — along with other former employees of the Trending team — denied.
But overall, the reporting raised serious questions about the Trending process, particularly given Facebook’s enormous role in determining what stories the public sees and reads. It pulled back the curtain a bit on the fantasy that a platform like Facebook is inherently neutral.
In May, Facebook announced that its human reviewers would no longer rely on other news outlets to vet the events surfaced by the algorithm, and it removed their power to “accelerate” newsy stories of great importance onto the Trending list if they weren’t yet trending on the social network. As the Intersect pointed out at the time, the changes were meant to ease concerns about the Trending team’s alleged bias, but didn’t really change the fact that the algorithm was already the hardest worker by far in the Trending process.
In any case, the image on the left is what Trending used to look like. On the right is what it’s become with the changes:
The stories still appear on the list due to a combination of volume and momentum: trending stories don’t just have a lot of mentions, they have a spiking number of them. Volume and momentum remain the most influential factors in determining what users see in that trending bar, and for both, we can thank the algorithm. Something else that hasn’t changed? Facebook still uses the data it collects on users to “personalize” the results.
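To make the “volume plus momentum” idea concrete, here is a minimal sketch of how such a filter could work. Facebook has not published its formula; the function name, the thresholds, and the scoring math below are all invented for illustration, not a description of the real system.

```python
# Hypothetical sketch of volume-plus-momentum trending scoring.
# All numbers and the scoring formula are assumptions for illustration.

def trending_score(prev_mentions: int, curr_mentions: int,
                   min_volume: int = 1000) -> float:
    """Score a topic by how sharply its mention count is spiking.

    Returns 0.0 for low-volume or flat topics, so perennial chatter
    like "#lunch" (high volume, but no spike) never surfaces.
    """
    if curr_mentions < min_volume:
        return 0.0  # not enough volume to matter
    momentum = curr_mentions / max(prev_mentions, 1)
    if momentum <= 1.0:
        return 0.0  # steady or declining mentions: no spike
    return curr_mentions * momentum  # volume weighted by its growth

# "#lunch": huge but steady volume, so it scores zero
print(trending_score(50_000, 50_000))  # 0.0
# A breaking story: modest volume but a sharp spike, so it scores high
print(trending_score(500, 8_000))      # 128000.0
```

The point of the sketch is the interaction of the two signals: neither raw volume nor growth alone is enough, which is exactly why a daily ritual like lunch talk stays off the list.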
The thing is, Facebook’s changes to the human role in this process might be the best way for it to address the PR nightmare that followed the Gizmodo reporting, but they don’t remove the most influential source of human bias in the whole process. Facebook’s algorithms were, after all, written by humans. And they reflect the biases and errors that any human-created thing contains.