Facebook on Wednesday announced a portal that aims to be a one-stop shop for its more than 2.5 billion users to find news and resources about the novel coronavirus, something it said was a step in an effort to combat falsehoods and provide accurate information in the face of a fast-moving pandemic.
The new coronavirus information center will roll out over the next 24 hours and will go at the top of users’ Facebook news feeds, chief executive Mark Zuckerberg said on a media call. He stressed that the most important service Facebook can provide right now is authoritative information — while removing hoaxes and other falsehoods that could cause immediate harm to public health.
“The top priority and focus for us has been making sure people can get access to good information and trusted sources during the pandemic,” he said.
Zuckerberg also said the company would allow thousands of content moderators who review banned content such as child pornography and terrorist material to work from home, even though doing so could challenge the company’s efforts to protect its members from disturbing content. For weeks, moderators and other third-party contractors have complained that while most full-time Facebook employees have been working from home for more than a month, many contractors were still required to go to the office as recently as this Monday. Content moderators circulated a petition on Facebook’s internal systems this week protesting the policy.
Since the new coronavirus began sweeping the world earlier this year, misinformation has proliferated across social media, including on the company’s other platforms, WhatsApp and Instagram. As government leaders and health officials are racing to contain the global pandemic, that onslaught of misinformation has hindered some of their efforts.
Facebook has said it has already taken steps to combat misinformation around the pandemic. The social-networking giant previously said it was working to remove content making dangerous claims, such as those suggesting that drinking bleach cures the coronavirus, which violate its policies prohibiting speech that can cause real-world harm. The decision to remove such content was made personally by Zuckerberg, according to a person familiar with the effort. Zuckerberg said on the call that the issue was fairly “black and white” for him.
On the call, Zuckerberg said that “even in the most free expression-friendly traditions, like the United States, there’s a precedent that you don’t allow people to yell fire in a crowded room.”
But Facebook is still struggling to handle misinformation about the coronavirus, as well as where to draw the line around harmful speech. Earlier this week, the company conceded it had mistakenly marked articles from legitimate news sites as spam. The company faces a particular challenge on its WhatsApp messaging service because the content is encrypted, making it harder to scan for policy violations and harmful messages.
Zuckerberg has defended the notion that free speech should be protected, no matter how ugly, and the company typically refuses to take down content that falls short of causing real-world harm.
For example, Facebook recently chose to keep up an advertisement by President Trump that directed people to incorrect census information, only to change course in a matter of hours in response to public pressure. The company has a policy banning misleading information about the U.S. Census. Zuckerberg has also previously defended Holocaust denial on Facebook.
Facebook is taking a much more muscular approach in response to covid-19. Instagram said it was removing false information associated with the covid-19 hashtag and replacing it with resources from the World Health Organization, the CDC and other authorities. It has also given those organizations free advertising and has banned the sale of medical face masks to prevent people from profiting off a pandemic.
In his capacity as head of the Chan Zuckerberg Initiative, Zuckerberg has also funded a task force that seeks to quadruple coronavirus testing in the Bay Area.
Facebook has also announced a $100 million grant for small businesses on Facebook and is giving $1,000 to each of its employees to help cover costs during the pandemic.
Zuckerberg said he was working from home and doing so was a “big change” now that his kids’ schools were closed.
He said the company would have to shift resources so content moderators who worked from home would take on less-disturbing content to review. Full-time workers would help pick up the slack. But workers in the Philippines, where Facebook and other tech giants employ thousands of moderators, will not be able to work from home, a Facebook official later confirmed.
The new work-from-home policies, Zuckerberg conceded, would cause some slowdown in moderation, particularly for posts where content is less urgently problematic. But Zuckerberg said he was still looking for ways to ramp up the company’s moderation efforts.
He said he was also concerned about loneliness, referring to a surge in the use of Facebook platforms such as Messenger since quarantines began. The use of Facebook Messenger, for example, has doubled in some locations, he said.
“I am worried about the isolation of people being at home, that it could lead to more depression and mental-health issues,” he said. “I want to make sure that we are more ahead of that during this time.”