President Biden on Monday softened his earlier blunt criticism of Facebook, instead blaming bad actors on the site for spreading dangerous misinformation about the coronavirus and vaccines, though he still called on the social media platform to be more aggressive in combating the problem.
Biden put Facebook on the defensive last week after accusing it of “killing people” by allowing the spread of misinformation about coronavirus vaccines.
“They’re killing people,” Biden said Friday as he was departing the White House. “Look, the only pandemic we have is among the unvaccinated. And they’re killing people.”
Biden was asked to clarify those comments Monday after giving a speech about his infrastructure plan. The president said he’d recently read an article about how the majority of misinformation on Facebook came from a dozen individuals.
“Facebook isn’t killing people, these 12 people are out there giving misinformation,” Biden said. “Anyone listening to it is getting hurt by it. It’s killing people. It’s bad information.”
Biden appeared to be referring to a study about a group of accounts called the “disinformation dozen,” identified by the Center for Countering Digital Hate as spreading vaccine misinformation or hoaxes. Facebook previously said it has taken enforcement action against pages and accounts connected to these people in more than a dozen instances.
“We permanently ban Pages, Groups, and accounts that repeatedly break our rules on COVID misinformation, and this includes more than a dozen Pages, Groups, and accounts from some of the individuals referenced in the press briefing,” Facebook spokeswoman Dani Lever said in a statement Monday.
Over the weekend, the company published a long blog post titled, “Moving Past the Finger Pointing,” saying that vaccine acceptance has been rising on Facebook since January and defending itself against Biden’s accusation.
Facebook has enacted policies to crack down on misinformation. In December, it said it would ban false and misleading statements concerning coronavirus vaccines. Since the beginning of the pandemic, the company says it has removed more than 18 million pieces of coronavirus-related misinformation.
But health misinformation persists on the platform, and Facebook hasn’t shared how many people have seen vaccine misinformation on its site. Critics of the company have called for greater transparency, and a coalition of liberal groups and individuals wrote to Facebook urging it to ban the accounts of 12 people whom a study found to be responsible for the bulk of coronavirus disinformation.
Social media companies, including Facebook, Twitter and YouTube, have long struggled to keep misinformation at bay on their sites. The companies employ thousands of content moderators to enforce their policies, but critics say enforcement is often uneven or too slow. Pressuring the companies to do more to control the spread of misinformation online has been a key point for Democrats in the growing legislative crackdown on Big Tech.
White House press secretary Jen Psaki said Monday that while Biden wants Facebook executives to be more reflective about the role their company plays in spreading misinformation, the administration has made no decision to get involved.
“The administration isn’t considering any regulatory or legal moves to possibly address disinformation on social media,” she said in a briefing. “That’s up to Congress to determine how they want to proceed moving forward.”
“But let me just note that we are not in a war or a battle with Facebook,” Psaki added. “We’re in a battle with the virus and the problem we’re seeing that our surgeon general elevated just last week is that disinformation traveling through a range of mediums. Some of them are a range of social media platforms.”