Mark Zuckerberg is stepping into the new year with an apology tour. The Facebook chief executive spent a good portion of 2017 warding off lawmakers and investigators trying to figure out how much of a role his company played in Russia’s interference in the 2016 election. So naturally, Zuckerberg felt the need to announce his New Year’s resolution last week: Fix Facebook.
The problem is, Zuckerberg can’t fix Facebook. By its very nature, the social-media network may be doomed to cause further damage to the American political system — no matter how much it tinkers with its algorithms or changes the way foreign governments place ads on its site. In the end, the problem lies with Facebook’s users.
We can imagine Facebook as an endless labyrinth of tunnels and caverns connecting its billions of users. Most of those tunnels are benign, allowing people to share news, birthday wishes and videos of corgi puppies. But there are some tunnels, deep in the darkest parts of the site, where users subsist on misinformation feeding into their worst biases. And there’s no topic that illustrates this better than vaccines.
Naomi Smith, a sociologist at Federation University Australia, spent a year observing Facebook users who share, like and comment on thoroughly debunked conspiracy theories about the safety of vaccines. In a recent study analyzing hundreds of thousands of anti-vaxxer comments, she and colleague Tim Graham illustrated just how dangerous this digital world has become.
Facebook didn’t create the anti-vaccine movement. But according to Smith and Graham’s research, anti-vaxxers on Facebook exist in what sociologists call a “small world” network. Such users cluster themselves into cliques, the members of which share connections with one another. This simplifies the movement of ideas immensely. In the real world, anti-vaccine networks are sparse, and finding similarly minded people takes a lot of effort. But Facebook can connect any two anti-vaxxers in just one or two steps.
The result is a highly self-reinforcing network that moves information quickly and efficiently. If one page somehow shuts down or loses its influence, others in the network quickly pick up the slack.
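The "small world" structure Smith and Graham describe can be sketched in a few lines of code. The toy network below is purely illustrative (the page names and connections are invented, not drawn from their data): two tight cliques of anti-vaccine pages, each also linked to a shared hub page. A breadth-first search then shows what the researchers observed — any two members of the network sit only a step or two apart.

```python
from collections import deque

# Hypothetical toy network: two tight cliques ("a" pages and "b" pages),
# each member also connected to a shared hub page. This is a sketch of
# the small-world structure described in the article, not real data.
graph = {
    "hub": {"a1", "a2", "a3", "b1", "b2", "b3"},
    "a1": {"a2", "a3", "hub"},
    "a2": {"a1", "a3", "hub"},
    "a3": {"a1", "a2", "hub"},
    "b1": {"b2", "b3", "hub"},
    "b2": {"b1", "b3", "hub"},
    "b3": {"b1", "b2", "hub"},
}

def shortest_path_length(graph, start, goal):
    """Breadth-first search: number of hops between two nodes."""
    if start == goal:
        return 0
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        for neighbor in graph[node]:
            if neighbor == goal:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return None  # unreachable

# Within a clique: one step. Across cliques, via the hub: two steps.
print(shortest_path_length(graph, "a1", "a2"))  # 1
print(shortest_path_length(graph, "a1", "b3"))  # 2
```

Because every page connects to the hub, no pair of pages is more than two hops apart; and if the hub were removed, the dense within-clique links would keep each community intact — the same resilience the study observed when individual pages shut down.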
Facebook’s anti-vaccine network is also more sophisticated than a group of people simply repeating the debunked belief that vaccines cause autism. The pages usually do not identify as “anti-vaccine,” but instead as “pro-safe vaccines” or in favor of “vaccine choice.” The authors of the study argue that the online network provides anti-vaxxers with a sense of support, offering firsthand anecdotes of vaccine harm — despite a dearth of scientific evidence — and tapping into the fears of parents, especially mothers. The study found that some 70 percent of activity from the anti-vaccine movement on Facebook over a three-year period came from women.
The study also shows the power of a sense of structural oppression. One of the recurring themes in the network is that government and media are willfully downplaying or covering up the effects of vaccines. Users often compare vaccine use to the Holocaust.
“For [users], it’s more real and more true than information from official sources,” Smith said. “What we see in our day-to-day lives seems more real than what’s actually going on.”
Smith said her research can easily be translated to other special interest topics online, especially politics. Even the limited public data that Facebook provides could give us troves of information on how people are interacting and building their own political silos during election cycles.
We can’t be sure of the extent to which this is the fault of Facebook’s algorithms or the fault of how people organize themselves; answering that would require far more data than Facebook would ever divulge. But what we do know is that anti-vaxxers on Facebook often participate across several pages in the network, suggesting a level of willful activity from users that Zuckerberg just can’t change.
The frustrating reality is that we simply don’t have the tools we need to break open our echo chambers. Nor would it be right to expect Facebook to actively try to moderate people’s opinions.
So what should be done? At the very least, Facebook should keep data open for social researchers to look through. And political scientists should take up the challenge to analyze that massive trove of information. To address the epidemic of misinformation, lawmakers and the media need to think creatively to promote real information to consumers. Anti-vaxxers tend to accept personal testimony, so Smith suggests that the key is talking to them on their own terms — translating hard facts into lived experience.
But, really, the heart of the problem is human nature. We seek out information that affirms our beliefs — even if those beliefs are bunk. No algorithm could ever change that. We need to come to terms with the fact that social media makes getting bad information easy.