“They’ve allowed people who intentionally spread misinformation — what we call disinformation — to have extraordinary reach,” Murthy said of tech companies. “They’ve designed product features such as ‘like’ buttons that reward us for sharing emotionally charged content, not accurate content. And their algorithms tend to give us more of what we click on, pulling us deeper and deeper into a well of misinformation.”
The advisory is the most high-profile action the Biden administration has taken to date to stem the tide of falsehoods spreading on social media. It is a major reversal from practice under the Trump administration, when the former president’s own baseless claims about the virus often tested the social networks’ covid-19 misinformation policies.
Murthy’s advisory calls for the tech platforms to make investments to address disinformation, including building in more suggestions and warnings to make it harder for people to spread false information about vaccines or the virus. He also recommends that the companies make greater investments in content moderation, especially in languages other than English.
Murthy also called on the platforms to prioritize the detection of “super spreaders” and repeat policy violators.
The advisory Murthy issued Thursday has a broad list of recommendations. It advises Americans to check whether a source is trustworthy before forwarding information. It also recommends that health and educational institutions work to improve information literacy and calls on media organizations not to give a platform to newsmakers who spread misinformation.
But the surgeon general’s sharpest words were directed at tech companies, which he said must operate with greater transparency and accountability. Online misinformation, Murthy said, has led some Americans to resist wearing face masks, turn down medical treatments or choose not to get vaccinated against the coronavirus, factors that have “led to avoidable illnesses and death.”
“Simply put, health misinformation has cost us lives,” Murthy said.
Even before the pandemic, researchers were warning Facebook, Google’s YouTube and Twitter about the risk of anti-vaccine messaging shared on their services. And for years, the platforms largely took a hands-off approach.
Many of the companies stepped up their policies to fight misinformation in light of the pandemic and the subsequent introduction of vaccines.
In December, Facebook said it would ban false and misleading statements concerning the coronavirus vaccine. Twitter said it would remove false claims about adverse effects of the vaccines or claims that vaccines are unnecessary. The company also started labeling tweets with misleading information about vaccines, even if they do not rise to the level of removal. YouTube says it removes coronavirus vaccine content that contradicts the World Health Organization or other health experts.
Despite these efforts, vaccine misinformation remains easy to find online. The surgeon general’s report follows warnings from Democrats in Congress, who have been calling for more action from the companies since the early days of the pandemic.
Facebook spokeswoman Dani Lever said the company has partnered with government experts and health authorities to take “aggressive action” against misinformation about the coronavirus and vaccines.
“So far we’ve removed more than 18 million pieces of covid misinformation, removed accounts that repeatedly break these rules and connected more than 2 billion people to reliable information about covid-19 and covid vaccines across our apps,” Lever said in a statement.
Twitter said in a statement that it would continue to try to elevate credible health information and enforce its policies.
“We welcome the Surgeon General’s leadership and partnership in this work,” Twitter spokeswoman Elizabeth Busby said in a statement.
YouTube spokeswoman Elena Hernandez said in a statement that the company “will continue working with health organizations, clinicians, and creators to increase access to high-quality health content on our platform and prevent the spread of misinformation.”
In March, a coalition of 12 state attorneys general sent a letter to Facebook and Twitter, pressing them to do more to ensure that online falsehoods aren’t undermining efforts to vaccinate the public against the coronavirus.
Connecticut Attorney General William Tong (D) and 11 other Democratic state attorneys general called on Facebook chief executive Mark Zuckerberg and Twitter CEO Jack Dorsey to “take immediate steps” to fully enforce their policies against vaccine misinformation.
The attorneys general said the companies have not cracked down hard enough on prominent anti-vaccine accounts that repeatedly violate the companies’ terms of service.
They also said that falsehoods about the safety of coronavirus vaccines from a small pool of individuals have reached more than 59 million followers on Facebook, YouTube, Instagram and Twitter, citing data from the Center for Countering Digital Hate, which studies online misinformation and disinformation.
A Facebook spokesperson said at the time that the company had worked with health organizations to update its policies and had removed 2 million pieces of content containing coronavirus and vaccine misinformation from Facebook and Instagram since February.
Twitter said in March that it had removed more than 22,400 tweets for violating its policies against coronavirus misinformation since the early days of the pandemic.
At Thursday’s briefing, Murthy announced that the Rockefeller Foundation has committed to spending $13.5 million to counter health misinformation.
He also revealed the toll the pandemic has taken on his extended family.
“On a personal note, it’s painful for me to know that nearly every death we are seeing now from covid-19 could have been prevented,” Murthy told reporters. “I say that as someone who has lost 10 family members to covid-19 and who wishes each and every day that they had had the opportunity to get vaccinated.”
White House press secretary Jen Psaki said Thursday that a large amount of health misinformation is being spread by a relatively small group of individuals.
“There’s about 12 people who are producing 65 percent of anti-vaccine misinformation on social media platforms,” Psaki said. “All of them remain active on Facebook, despite some even being banned on other platforms, including . . . ones that Facebook owns.”
Psaki’s comments appeared to be a reference to the “disinformation dozen,” a group of accounts identified by the Center for Countering Digital Hate as spreading vaccine misinformation or hoaxes.
Facebook has taken enforcement action against pages and accounts tied to these people in more than a dozen instances, Lever said. She said the company permanently bans accounts that repeatedly break the rules. However, she said, many of the people operate multiple accounts across multiple Facebook-owned platforms and retain active followings.