How Facebook and Twitter control what you see about Ferguson

Social media is controlled by algorithms – mathematical formulas that dictate what you see and when. In the past week, people have noticed something curious about the way these algorithms have filtered news about protests in Ferguson, Mo., over the fatal shooting of unarmed black teenager Michael Brown.

The fundamental differences between the two platforms help explain the disparity: news from Ferguson flooded Twitter feeds in real time while it was slow to surface on Facebook.

“Because of its brevity, and the ease with which updates can be shared, Twitter is a much more rapid-fire experience than Facebook, and that makes it well suited for quick blasts of information during a breaking-news event like Ferguson,” Mathew Ingram of Gigaom pointed out. The non-newsy content that clutters Facebook, he added, makes it ill-suited for following breaking news.

Another huge difference? Algorithms. Your Twitter feed isn’t controlled by an algorithm: you see the tweets of people you follow, in real time. Facebook, by contrast, uses a complicated algorithm to determine what ends up in your news feed. The company won’t reveal exactly how it works, but it has said the ranking depends in part on what you’ve liked, clicked or shared in the past.
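To make the contrast concrete, here is a minimal sketch in Python of the two approaches. The numeric weights, field names and the `affinity` signal are invented for illustration; Facebook has never published its actual formula beyond acknowledging that past likes, clicks and shares factor in.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int = 0    # hypothetical stand-ins for the signals Facebook
    clicks: int = 0   # has acknowledged using: likes, clicks and shares
    shares: int = 0

def chronological_feed(posts):
    """Twitter-style feed (as of 2014): newest first, nothing reweighted."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def ranked_feed(posts, affinity):
    """Facebook-style ranked feed: an illustrative guess, not the real formula.

    `affinity` maps each author to how often the viewer has interacted
    with that author in the past; the numeric weights are arbitrary.
    """
    def score(p):
        engagement = 1.0 * p.likes + 0.5 * p.clicks + 2.0 * p.shares
        return affinity.get(p.author, 0.1) * engagement
    return sorted(posts, key=score, reverse=True)
```

The sketch shows why the two feeds can diverge so sharply during breaking news: under `ranked_feed`, a minutes-old post from an author the viewer rarely engages with can be outranked by an older post that has already accumulated likes and shares, which matches Tufekci’s experience, described below, of Ferguson posts surfacing on Facebook only the next morning.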

[Video: Missouri Gov. Jay Nixon called in the National Guard on Monday morning after protests in Ferguson again turned violent Sunday night, despite the curfew he had imposed on Saturday. (Sarah Parnass/The Washington Post)]

Ars Technica’s Casey Johnston suggested Facebook’s algorithm also weeds out controversial content — racially charged protests, perhaps? — from users’ news feeds: “There is a reason that the content users see tends to be agreeable to a general audience: sites like [BuzzFeed, Elite Daily, Upworthy, and their ilk] are constantly honing their ability to surface stuff with universal appeal. Content that causes dissension and tension can provide short-term rewards to Facebook in the form of heated debates, but content that creates accord and harmony is what keeps people coming back.”
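Read in ranking terms, Johnston’s argument is about the objective function: if the platform optimizes for long-term return visits rather than short-term debate, divisive posts lose out even when they generate heavy engagement. A toy model, with entirely invented numbers:

```python
def retention_aware_score(engagement: float, divisiveness: float) -> float:
    """Toy model of Johnston's argument, not Facebook's actual scoring.

    `engagement` counts short-term interactions (comments, heated replies);
    `divisiveness`, between 0 and 1, estimates how much a post splits its
    audience. Penalizing divisiveness pushes contentious content (like
    racially charged protest coverage) down the feed even when it is
    widely discussed.
    """
    short_term_reward = engagement
    long_term_cost = 3.0 * divisiveness * engagement  # invented weight
    return short_term_reward - long_term_cost
```

On this model, a heatedly debated post with engagement 100 and divisiveness 0.5 scores -50, while a blandly agreeable post with engagement 40 and divisiveness 0 scores 40 and wins the feed slot.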

Johnston backed up her theory with a Georgia Institute of Technology study of how political content affects users’ perceptions of Facebook. She summed up the findings: “The study found that, because Facebook friend networks are often composed of ‘weak ties’ where the threshold for friending someone is low, users were often negatively surprised to see their acquaintances express political opinions different from their own. This felt alienating and, overall, made everyone less likely to speak up on political matters (and therefore, create content for Facebook).”

For University of North Carolina sociologist Zeynep Tufekci, this sort of “algorithmic filtering” is more than a matter of technical differences. Last Wednesday, when there was rioting in Ferguson and journalists were being arrested, the events unfolded in real time on her Twitter feed. But on Facebook, where she follows a similar mix of people, posts about Ferguson didn’t appear in her feed until the next morning. “Would Ferguson be buried in algorithmic censorship?” she wrote on Medium.

If so, that’s bad. “How the internet is run, governed and filtered is a human rights issue,” she wrote.

 

Related: What Facebook doesn’t show you

Gail Sullivan covers business for the Morning Mix blog.