The notion that fake news on Facebook could have influenced the result of the 2016 presidential election is “a crazy idea,” Mark Zuckerberg, the social network’s co-founder and chief executive, said Thursday.
“Voters make decisions based on their lived experience,” Zuckerberg said in an interview with David Kirkpatrick at the Techonomy conference in Half Moon Bay, Calif. “I think there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news.”
There are a couple of reasons why, just days after Donald Trump’s election as president, Zuckerberg is being asked to answer for Facebook’s long-standing difficulties in dealing with the hoaxes and fake news that spread freely across its massive user network.
Facebook is a tech company, one that prides itself on neutrality and has explicitly said it is not a media company. With 1.7 billion monthly active users across the globe, Facebook is also perhaps the most influential distributor of information in the world. According to a recent Pew study, 44 percent of U.S. adults say they get news from Facebook. And 79 percent of Americans who are online use Facebook, a much larger share than for any other social network.
As Facebook has become Americans’ favorite online destination, it has also become host to a wide array of hyperpartisan content machines that publish mountains of misleading or outright fabricated stories, explicitly designed to be shared widely among readers inclined to believe them.
According to a BuzzFeed analysis of many of these hyperpartisan pages, such misinformation targets both progressives and conservatives, but false or misleading stories are more common among right-wing sites. That pattern is also reflected in The Intersect’s own nonscientific tracking of fake election stories across social media platforms over the past few weeks. Misinformation on Facebook wasn’t exclusive to the supporters of one candidate, but it was more common, and more viral, on the right.
There is also some evidence that right-wing readers of online news are being encouraged to ignore attempts to verify these stories. “Forget the press, read the Internet,” Trump told his supporters just months before his victory. He didn’t specify which parts of the Internet he liked, but we already know that he accepts and repeats information from websites and tabloids that publish false or misleading articles. For instance, he has in the past complimented the work of Alex Jones, the conspiracy theorist behind Infowars, which has argued that the massacre at Sandy Hook Elementary School was a “false flag.”
“The Internet,” even just Facebook, looks very different to different people. As Facebook gains influence in the lives of its users, experts have grown concerned about algorithmically reinforced “filter bubbles,” in which people see mostly content that matches what they already believe. People tend to click on stories that confirm their worldview.
Any good recommendation algorithm for a site like Facebook can notice that pattern and use it to start showing users more posts and stories like the ones they already clicked. Although Facebook has over the years denied that its algorithms are part of the problem, the fact remains that ideological polarization on Facebook is getting worse at a time when the site has never been more important to how people consume information. On Thursday, Zuckerberg repeated his company’s long-standing response to questions about the filter bubble’s influence: essentially, that it’s the people, not the site or its algorithms, that create the problem.
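That feedback loop is easy to demonstrate with a toy model. The Python sketch below is purely illustrative: the class `ToyFeedRanker`, the topics and every number in it are hypothetical assumptions for this article, not a description of Facebook’s actual News Feed system, which is proprietary and far more complex.

```python
from collections import defaultdict
import random

class ToyFeedRanker:
    """A hypothetical engagement-driven ranker. Every name and number
    here is illustrative, not Facebook's real system."""

    def __init__(self):
        # Per-user, per-topic affinity scores learned purely from clicks.
        self.affinity = defaultdict(lambda: defaultdict(float))

    def record_click(self, user, topic):
        # Each click nudges the user's affinity for that topic upward.
        self.affinity[user][topic] += 1.0

    def rank(self, user, stories):
        # Stories on previously clicked topics float to the top, so a
        # few early clicks compound into an increasingly one-sided feed.
        return sorted(stories,
                      key=lambda s: self.affinity[user][s["topic"]],
                      reverse=True)

ranker = ToyFeedRanker()
stories = [{"topic": "left-leaning"}, {"topic": "right-leaning"}]

# Simulate position bias: the user usually clicks whatever is shown
# first. Even with no built-in political preference, the loop locks
# onto whichever topic happens to win the first few clicks.
for _ in range(100):
    feed = ranker.rank("alice", stories)
    clicked = feed[0] if random.random() < 0.8 else feed[-1]
    ranker.record_click("alice", clicked["topic"])

# One topic typically ends up with most of the affinity mass.
print(dict(ranker.affinity["alice"]))
```

The point of the sketch is that nothing in it singles anyone out: a mild tendency to click what is shown first, fed back into the ranking, is enough to produce the self-reinforcing one-sidedness that critics call a filter bubble.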
“Even if 90 percent of your friends are Democrats, probably 10 percent are Republicans. Even if you live in some state or country, you will know some people in another state, another country,” he said. He added later that “it’s not that the diverse information isn’t there … but we haven’t got people to engage with it in higher proportions.”