It’s a rare move for the company, which often labels and demotes false or troubling content as opposed to removing it entirely. In the past, that approach has prompted criticism from health experts, who contend the tech giant hasn’t acted aggressively enough to limit the reach of those who claim vaccines are unsafe or push fake cancer treatments. Others have urged Facebook to remove a broader array of falsehoods, including deliberately incorrect or misleading political posts, photos and videos.
With coronavirus, Facebook said it would base its decisions on advice from “leading global health organizations and local health authorities” and would steer users to more authoritative sources of information such as the World Health Organization. Facebook also said it had provided free advertising credits to help organizations run coronavirus education campaigns.
“As the global public health community works to keep people safe, Facebook is supporting their work in several ways, most especially by working to limit the spread of misinformation and harmful content about the virus and connecting people to helpful information,” Kang-Xing Jin, Facebook’s head of health, said in the post.
The tech giant’s decision to remove false and misleading information underscores the vast challenge public health officials around the globe face as they grapple not only with the rapidly spreading coronavirus but also the effects of inaccurate information proliferating quickly, and widely, on social media. That misinformation threatens to scare patients or skew their decisions on whether to seek care.
The stakes became apparent Friday, after health officials in Los Angeles County discovered a fake letter — issued under its name — that inaccurately said there had been a local coronavirus outbreak. “There is no immediate threat to the general public, no special precautions are required, and people should not be excluded from activities based on their race, country of origin, or recent travel if they do not have symptoms of respiratory illness,” the county said in a statement. A spokesperson did not immediately respond to a request for comment.
Early signs of trouble on social media surfaced over the weekend: On Facebook, Twitter and Google-owned YouTube, some users started sharing incorrect information about the coronavirus and its origins, the number of people affected and the way the illness spread. The information was shared via misleading or false posts and videos that received thousands of shares and views.
Some of the most pervasive falsehoods on Facebook wrongly claimed that the U.S. government created the coronavirus or touted cures that do not exist. Facebook began to label these posts and demote them in users’ news feeds once its fact-checkers debunked the claims.
But some of the misleading posts remained available on the social-networking site, largely in private groups created to discuss the coronavirus, The Washington Post first reported Monday. That report apparently prompted Facebook’s decision to remove suspect material entirely.
Twitter and YouTube have grappled with similar myths on their services. On Wednesday, though, Twitter said it had not seen “significant coordinated attempts to spread disinformation at scale about this issue.” The company estimated its users had sent more than 15 million tweets related to the coronavirus over the past few weeks. Users searching for coronavirus-related terms on the site are shown a link to official information from the Centers for Disease Control and Prevention.
YouTube, meanwhile, has not yet taken down some videos peddling false information, including cures that do not exist. On Thursday, the company stressed in a statement that it is prioritizing authoritative results in searches for the coronavirus. Google also said it had rolled out a special search link that directs people to the WHO.