As social media companies wrestle with how to police dangerous health misinformation on their platforms, Pinterest has taken an extreme approach: blocking search results related to vaccinations, whether the results are medically accurate or not.
The San Francisco-based company told the Wall Street Journal it has been suppressing search results on the topic since December and will continue to do so until it finds a reliable solution to protect Pinterest users from harmful and misleading content. Pinterest searches about vaccines had been dominated by results that bucked long-standing scientific research and guidelines from the Centers for Disease Control and Prevention by claiming vaccinations were hazardous. Users of the photo-sharing site can still pin this kind of material to their personal boards, but it won’t surface in searches.
"It’s better not to serve those results than to lead people down what is like a recommendation rabbit hole,” Ifeoma Ozoma, Pinterest’s public policy and social impact manager, told the Wall Street Journal this week.
The World Health Organization recently named “vaccine hesitancy” as one of the biggest global health threats of 2019. The designation comes as the United States grapples with a sudden resurgence of measles, a disease that was declared eliminated in 2000 by the CDC as a result of extensive use of the measles, mumps and rubella vaccine. More than 100 cases have cropped up since the beginning of the year — more cases than were reported in the United States in all of 2016.
Medical professionals and lawmakers have been pressuring companies like Facebook and Google to take action against anti-vaccination content in the name of public safety. But while these companies agree they have a responsibility to protect the public, finding solutions that tread the line between safety and censorship has proved to be a complex challenge. Most companies have taken a more hands-off approach to the problem, citing free speech issues, but Pinterest’s blanket block on vaccine-related searches stands out as a radical alternative, one that makes it tougher for users to find accurate information about vaccinations even as it seeks to protect them.
Since 2017, Pinterest has barred health misinformation that might have “immediate and detrimental effects” on users and greater public safety. The company monitors which searches bring up “largely polluted results” that violate its policies and then stops serving results for those searches. Pinterest has also blocked misleading content about chronic and terminal illnesses, like bogus cancer cures. The site also actively looks for content that violates its community guidelines, evaluating it based on advice from institutions like the CDC, WHO and the American Academy of Pediatrics.
The block on all vaccination-related search results is a temporary measure, Pinterest said, while the platform searches for a sustainable way to moderate such content.
“We want Pinterest to be an inspiring place for people, and there’s nothing inspiring about misinformation,” Pinterest said in a statement to The Washington Post. “That’s why we continue to work on new ways of keeping misleading content off our platform and out of our recommendations engine.”
Social media platforms have been historically reluctant to crack down on controversial content. Until late last week, Facebook had contended that most anti-vaccination content didn’t violate its community guidelines for inciting “real-world harm.” The social media giant, which has come under fire for how its algorithms can steer users toward misinformation, said removing such content wouldn’t help raise awareness about the facts of vaccinations. Accurate counter-speech, Facebook argued, was a more productive safeguard.
But the company changed its stance after Rep. Adam B. Schiff (D-Calif.) wrote a letter to founder and chief executive Mark Zuckerberg asking how Facebook planned to protect users from misleading material about vaccinations. Schiff sent a similar letter to Sundar Pichai, chief executive of Google, which is also under scrutiny about how its search engine and subsidiary YouTube promote potentially dangerous misinformation.
“We’ve taken steps to reduce the distribution of health-related misinformation on Facebook, but we know we have more to do,” Facebook said in a statement to The Post last week. “We’re currently working on additional changes that we’ll be announcing soon.”
Specifically, Facebook is looking into cutting back or removing misleading health content from recommendations, including “Groups you should join,” according to a Facebook statement. It’s also considering demoting such content in search results.
Google’s YouTube has begun changing algorithms to try to control the spread of misinformation. Last month, YouTube said it would begin removing videos with “borderline content” that “misinform users in harmful ways.”
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to our users,” YouTube wrote in a blog post.