The Technology 202

A newsletter briefing on the intersection of technology and politics.

What Facebook knew about covid-19 misinformation — and didn’t tell Congress

Happy Thursday! If you blinked, you may have missed the White House again hoping we forgot that, as a candidate, President Biden called for repealing, not reforming, Section 230.

Below: The FTC is looking into Facebook, again, and Trump faced another legal setback in a social media lawsuit. But first, our latest Facebook Papers investigation:

What Facebook knew about covid-19 misinformation — and didn’t tell Congress

In July, two top House Democrats penned a letter to Facebook CEO Mark Zuckerberg demanding to know how many users had seen the 18 million pieces of covid-19 misinformation he had touted his company for taking down, and how much money the company made off those posts.

“It’s past time for Facebook to come clean about your role in prolonging the COVID-19 pandemic and commit to rectifying deadly mistakes that it has made in the past year,” Reps. Jan Schakowsky (D-Ill.) and Anna Eshoo (D-Calif.) wrote at the time.

The lawmakers accused the company of repeated “failures to disclose or act upon internal research that outlines Facebook’s role in the spread of misinformation and disinformation.” The missive was one of many sent by lawmakers and regulators since the pandemic began calling on Facebook to reveal more information about its handling of misleading medical information. 

Facebook curtly rebuffed the two lawmakers’ inquiries. “At this time, we have nothing to share in response to the questions you have raised, outside of what Mark has said publicly,” the company wrote a month later in a three-sentence response signed “FACEBOOK, INC.”

But documents disclosed by Facebook whistleblower Frances Haugen show that the company’s researchers had deep knowledge of how covid and vaccine misinformation moved through the company’s apps, running multiple studies and producing large internal reports on what kinds of users were most likely to share falsehoods about the deadly virus, Gerrit De Vynck, Cat Zakrzewski and I report as part of the Facebook Papers investigation.

Internally, we report, Facebook employees showed that coronavirus misinformation was dominating small sections of its platform, creating “echo-chamber-like effects” and reinforcing vaccine hesitancy. Other researchers documented how posts by medical authorities, like the World Health Organization, were often swarmed by anti-vaccine commenters, hijacking a pro-vaccine message and wrapping it in a stream of falsehoods.

The documents also demonstrate that Facebook employees were able to calculate how many views a widely shared piece of misinformation garnered, despite stonewalling requests from Democratic lawmakers to share how often covid misinformation had been viewed on Facebook.

The documents were disclosed to the U.S. Securities and Exchange Commission and Congress by whistleblower Haugen’s lawyers and were reviewed by a consortium of media organizations, including The Washington Post. The documents include research reports, slide decks and discussions between employees on the company’s internal message board. The Wall Street Journal previously reported on some of the pandemic-related revelations in the papers, including how the company struggled to police anti-vaccine comments.

Taken together, the documents underline just how extensively Facebook was studying coronavirus and vaccine misinformation on its platform as the virus tore across the world, unearthing findings that concerned its own employees.

In an emailed statement, Facebook spokesman Aaron Simpson said the company has worked to promote reliable information about the coronavirus throughout the pandemic and that vaccine hesitancy among U.S. Facebook users has gone down by 50 percent since January.

"There’s no silver bullet to fighting misinformation, which is why we take a comprehensive approach which includes removing more than 20 million pieces of content that break our covid misinformation policies, permanently banning thousands of repeat offenders from our services, connecting more than 2 billion people to reliable information about covid-19 and vaccines, and partnering with independent fact-checkers,” Simpson said.

After seeing glimpses of how much information Facebook declined to share, the lawmakers who have been hounding the tech giant for more transparency on the matter are fuming.

“For months, I’ve repeatedly requested information from Facebook about covid misinformation, including questions about which users post it, how the platform amplifies it, how Facebook decides what to remove, and much more,” Eshoo, chair of the House Energy and Commerce Committee’s health panel, told The Washington Post on Tuesday. 

She added, “It was the whistleblower documents that shed light on these issues, instead of Facebook releasing them a long time ago.”

The documents reviewed by The Post shine a light not only on Facebook’s understanding of how covid misinformation spreads, but also on who spreads it and what can be done to mitigate it.

One study noted that much of the misinformation being spread about vaccines came from “Covid super-spreaders,” who used tactics similar to those of purveyors of falsehoods about the 2020 election and the extremist QAnon ideology. 

Taking action against these people was “low hanging fruit,” the report said. “We found, like many problems at FB, that this is a head-heavy problem with a relatively few number of actors creating a large percentage of the content growth,” the study read. The writers suggested that Facebook could put limits on the amount of content people could produce, which would not affect the vast majority of users but might stymie repeat spreaders of coronavirus-related misinformation.

In another study, Facebook researchers again flagged that “problematic vaccine content” was concentrated among a small segment of users. Facebook workers found that half of the views of “problematic vaccine content” came from people in just 10 out of 638 sets of U.S. users, classified together by the company because of their similar social ties.

Facebook’s Simpson said the studies “were in no way definitive” and intended to internally guide the company’s product team.

An external research group that in March conducted an influential study on the so-called “disinformation dozen” — which concluded that just 12 individuals were responsible for up to 73 percent of the anti-vaccine content on Facebook — argued that the disclosures corroborate its work. Facebook has aggressively disputed the findings.

When Zuckerberg testified at a House hearing on disinformation in March, Rep. Mike Doyle (D-Pa.) pressed him to commit to removing the so-called disinformation dozen. Zuckerberg replied that he and his team would need to “look at the exact examples” to see if they’re breaking Facebook’s rules. Doyle said Wednesday his office never heard back. 

“That’s no surprise given Facebook’s lengthy record of dismissing transparency, avoiding accountability, and not owning up to its own mistakes,” Doyle said. “I wish I could expect better of them.”

Our top tabs

The FTC is looking into Facebook amid whistleblower reports

The Federal Trade Commission has begun looking at Facebook documents that detail what the company knew about the harms of its platforms, the Wall Street Journal’s John D. McKinnon and Brent Kendall report. The agency is examining whether the company may have violated a 2019 privacy settlement that was accompanied by a $5 billion fine, a person familiar with the matter said.

Facebook told the Wall Street Journal that it is “always ready to answer regulators’ questions and will continue to cooperate with government inquiries.” The FTC declined to comment to the outlet.

The inquiry comes as Facebook faces regulatory and legislative scrutiny around the world. The company ordered its employees to preserve documents dating back to 2016 after “a number of inquiries from governments and legislative bodies have been launched into the company’s operations,” my colleague Elizabeth Dwoskin reports.

Polish politicians warned Facebook about a “social civil war”

Polish political parties told Facebook that changes in the company’s algorithm were responsible for heightened polarization, Loveday Morris reports. The parties called the situation “unsustainable,” according to an internal Facebook report.

“Across multiple European countries, major mainstream parties complained about the structural incentive to engage in attack politics,” the report said. “They see a clear link between this and the outsize influence of radical parties on the platform.”

Poland’s far-right Confederation party has the largest presence on the platform among the country’s political parties despite holding just 2 percent of the seats in parliament. It’s a “hate algorithm,” said Tomasz Grabarczyk, who leads Confederation’s social media team. But the party’s posts generally do well on the platform. “I think we are good with emotional messages,” Grabarczyk said.

Blumenthal calls for children’s online privacy bills to be passed “within months”

Two Democratic senators say the time has come for Congress to pass legislation that would limit how major technology companies collect data on teens and ban manipulative marketing and damaging design choices, Reuters’s Diane Bartz reports. Parents can no longer trust Big Tech to protect their children, according to Sen. Richard Blumenthal (D-Conn.), who chairs the Senate Commerce Committee’s consumer protection panel, and Sen. Edward J. Markey (D-Mass.), an architect of the Children's Online Privacy Protection Act.

“We can't wait years. We need to move forward. My subcommittee will hold additional hearings. We anticipate additional revealing information,” Blumenthal said, calling for Congress to move forward “even now” with legislation that has already been announced. 

Rant and rave

The New York Times's Ryan Mac has devised a new PR strategy for Facebook after the Wall Street Journal published a letter from former president Donald Trump with baseless and debunked claims.

Agency scanner

Lina Khan isn’t worried about going too far (New York Magazine)

The US Copyright Office just struck a blow supporting the right to repair (The Verge)

Hill happenings

Amazon backs House anti-counterfeit bill (Axios)

Inside the industry

Instagram chief Adam Mosseri says an app for kids is ‘the right thing to do’ (Rachel Lerman and Heather Kelly)

Florida judge rules Trump can’t skirt Twitter’s terms just because he was president, in latest legal setback (Timothy Bella)

Anonymity no more? Age checks come to the Web. (The New York Times)

Daybook

  • The Senate Homeland Security and Governmental Affairs Committee holds a hearing on social media amplification of domestic extremism and other content today at 10:15 a.m.
  • Apple and Amazon hold investor calls today at 5 p.m. and 5:30 p.m.
  • New America’s Wireless Future Project and Public Knowledge host an event on expanding spectrum access on Nov. 2 at noon.

Before you log off

That’s all for today — thank you so much for joining us! Make sure to tell others to subscribe to The Technology 202 here. Get in touch with tips, feedback or greetings on Twitter or email.
