A pair of whistleblower complaints filed with the Securities and Exchange Commission this month allege Facebook misled investors about its efforts to combat climate change and covid-19 misinformation, according to redacted copies of the documents viewed by The Washington Post.
One complaint alleges that climate change misinformation was prominently available on Facebook and that the company lacked a clear policy on the issue as recently as last year, despite Facebook executives' commitments during earnings calls to fight the "global crisis." A second, companion complaint argues that while Facebook executives were publicly touting their efforts to remove harmful covid misinformation, internal documents "paint a different story." The complaint cites internal company communications about the spread of vaccine hesitancy in comments, as well as internal surveys that showed the proliferation of covid misinformation on the service.
“Some investors simply will not want to invest in a company that fails to adequately address such misinformation and then engages in misstatements and omissions on the topic,” one complaint says.
Facebook rebranded itself as Meta last year, after Frances Haugen left the company and went public as a whistleblower. The company continues to remove false claims about vaccines and has worked to elevate "authoritative information" about climate change and public health, Meta spokesman Drew Pusateri said.
“There are no one-size-fits-all solutions to stopping the spread of misinformation, but we’re committed to building new tools and policies to combat it,” Pusateri told The Post in a statement.
For years, Democrats have criticized social networks over what they argue is a negligent approach to misinformation about public health, democracy and the environment. The White House last year pressured Facebook to do more to address vaccine misinformation, a push that culminated in President Biden telling reporters "they're killing people." But despite the public fireworks, policymakers and regulators have since taken little action to rein in the proliferation of falsehoods online, in part because many proposals targeting misinformation risk running afoul of the First Amendment.
Haugen’s lawyers have sidestepped this concern by focusing their complaints on corporate interests: whether the company has lied to investors.
Nathaniel Persily, a professor at Stanford Law School and director of the Stanford Cyber Policy Center, called the strategy a “creative” approach to the problem. “You cannot pass a law in the U.S. banning disinformation,” Persily said. “So what can you do? You can hold the platforms accountable to promises they make. Those promises could be made to users, to the government, to shareholders.”
While the SEC has not publicly commented on the status of Haugen's complaints, the agency has signaled a "very strong enforcement stance" under Democratic Chairman Gary Gensler, said Jane Norberg, who headed the SEC's whistleblower program until April and is now a partner at Arnold & Porter, a law firm that specializes in business regulation. (Haugen sought whistleblower protection from the SEC, which could shield her from retaliation by Facebook.) The agency has emphasized that companies need to make clear and accurate disclosures to investors, Norberg said.
“If the company says one thing to investors, but internal documents show that what they were saying is untrue, that could be something the SEC would look at,” she said.
The SEC did not immediately respond to a request for comment on the status of the complaints.
A congressional staffer shared redacted versions of the SEC complaints with a consortium of news organizations, including The Washington Post. The complaints cite confidential documents, originally collected by Haugen, also shared with the consortium.
The climate change complaint, filed with the SEC on Feb. 7, cites records that show employees internally grappling with the company’s perceived role in spreading climate misinformation. In a document from the first quarter of 2021, an employee said they searched for “climate change” in the social network’s Watch tab. The second result was a piece of “climate misinfo,” the employee wrote, and had been viewed more than 6.6 million times.
Another employee working on Facebook’s search integrity called for the company to do more to address climate denialism. “Can we take it a step farther and start classifying and removing climate misinformation and hoaxes from our platforms,” they wrote.
The complaint also cites internal records about the platform’s Climate Science Information Center, a much-touted hub designed to connect people with authoritative climate information. Awareness of the webpage was “very low,” even for people who had visited it.
“Climate change knowledge is generally poor,” one of the internal reports from 2021 said. “Given how many people use Facebook for information about climate change … climate science myths are a problem across all surveyed markets.”
The filings argue that it’s particularly urgent that Facebook tackle climate change misinformation, in part because of the popularity of the site. An internal company document cited in the complaint says Facebook is the second-most common source for news related to climate change, behind only television news and ahead of news aggregators, movies, online climate news sources and other social media platforms.
The company adds information labels to some posts about climate change, and it reduces distribution of posts that its fact-checking partners rate as false. But it generally does not remove those posts, as it does with certain false claims about vaccines and the coronavirus. Michael Mann, director of the Earth System Science Center at Pennsylvania State University, called the company’s approach “disturbing.”
"Unmitigated climate change is projected to lead to far greater numbers of human fatalities than covid-19," said Mann, author of "The New Climate War." "The fact that they're treating [a] greater threat with so much less urgency and care is problematic."
Pusateri, the Meta spokesman, said that misinformation makes up a small amount of climate change content in the company’s apps, and that it spikes periodically, such as during extreme weather events. He said the company has taken steps to make it easier for fact-checkers to find climate content.
In the other filing, dated Feb. 10, the whistleblower's lawyers argue that internal documents showing the spread of covid misinformation contradict the public statements the company has made. An internal Facebook document cited in the complaint shows that in April 2020, the company saw a 20 percent spike in users reporting and seeing false or misleading content; a Facebook employee cited covid as a reason. The complaint cites a May 2020 company record in which employees warned that hundreds of anti-quarantine groups were active, with many high-ranking comments linking to conspiracy theories about the coronavirus. The SEC complaint also cites an internal Facebook survey, which found that 1 in 3 people in the United States said they saw misleading or false information related to covid and voting.
The Washington Post previously reported that coronavirus misinformation was dominating small sections of Facebook’s platform, creating “echo-chamber-like effects” and reinforcing vaccine hesitancy. Other researchers documented how posts by medical authorities, like the World Health Organization, were often swarmed by anti-vaccine commenters. These documents were also cited in the complaint.
The filings are part of Haugen's team's broader legal strategy. Her lawyers filed at least eight other complaints last year with the agency based on the trove of company documents. One complaint alleges that the company misled investors about its role in "perpetuating misinformation and violent extremism relating to the 2020 election and January 6 insurrection." Others accuse the company of misleading investors about its removal of hate speech and about the negative consequences of its algorithms promoting misinformation and hate speech.
Persily, the Stanford law professor, said complaints like this could be a model for how to regulate in the area of content moderation. “It is extremely difficult to enact regulation regarding content moderation because it’s such a fast-moving area,” he said. “If you do hold the platforms to rules that they agree to, that’s a different mode of regulation.”