The House select committee on the Jan. 6 insurrection discovered untold details about social media sites’ decisions surrounding the riot — and decided to leave most of them out of its report. All the same, the public and the platforms alike should take lessons from what the committee learned.
The story of social media sites and the storming of the Capitol is, to some extent, already clear. Mainstream services such as Twitter and Facebook were too deferential to the then-president, until, suddenly, they weren’t. They stood by as Donald Trump infamously tweeted “be there, will be wild” — which arguably helped radicalize a narrow group of users to the point that some were persuaded to attempt a coup. But when the social media platforms banned Mr. Trump after the riot, they lacked the rules to justify the drastic move. (On Wednesday, Meta announced it was reinstating Mr. Trump to Facebook and Instagram; in November, Twitter’s new owner, Elon Musk, invited him to rejoin that platform.)
The committee’s 122-page memo on social media and the attack, reported last week by The Post’s Cat Zakrzewski, Cristiano Lima and Drew Harwell, augments the previous understanding of the role social media played, sometimes with alarming specifics. Twitter was dangerously disorganized, going fruitlessly back and forth on a policy against “coded incitement” to violence so that it had nothing useful in place to address the insurrection as it unfolded. Facebook implemented effective measures against misinformation during the election cycle but sat on its hands afterward — and because it had no policy against delegitimizing the election with false claims, it didn’t disrupt the Stop the Steal movement until it was too late. YouTube’s election-fraud policy didn’t apply retroactively and didn’t lead to any account suspensions until after Jan. 6, 2021.
All these platforms’ efforts were heroic compared to those of sites in the internet’s darker corners. Parler, Gab, thedonald.win and more ignored, encouraged or ultimately found themselves overwhelmed by users plotting to overturn the vote. Only days before the joint congressional session to certify the electoral count was disrupted, Parler employees emailed the FBI saying they were “worried.” Gab had a single employee responsible for policing posts by millions of users on Jan. 6, 2021. “I’d buy season tickets to watch public executions of traitorous cucks,” someone wrote on thedonald.win. “Grab your armor, rifle and combat,” someone wrote on Facebook knockoff MeWe.
But most illuminating are the ways in which the report departs from prevailing narratives. The authors emphasize that recommendation algorithms, despite being blamed for so many of social media’s ills, weren’t the primary problem. It was poor policy and poor process. Policies need to anticipate more possible situations; processes need to address what happens when, inevitably, even these expanded policies don’t cover everything. Ensuring that safety teams are independent from lobbying and business teams would also do some good. On that note, revelations about Twitter’s and Facebook’s reluctance to remove some conservative content for fear of political blowback undercut right-wing accusations of biased censorship. At the same time, evidence shows that relying too heavily on bans and removals inspires more migration to fringe sites — and depletes trust in mainstream ones.
All this means that platforms should experiment more heavily with what’s known as “soft interventions.” Facebook, the committee found, had success with tactics such as employing a score that ranked publishers based on the accuracy of their reporting to determine what took precedence in users’ feeds, or demoting a piece of content relative to the probability that it violated the site’s terms of service.
Perhaps most important, rather than merely cataloguing these sites’ failures, the committee identified the challenges they face. Facebook, the authors point out, wasn’t the source of the lies that flooded its platform — those came from, among others, cable television, lawmakers and the president of the United States himself. The same is true for Twitter, and even for Parler or Gab. These platforms confront vexing trade-offs every day. The conundrum becomes only more difficult to unravel when so many members of one of the country’s two political parties subscribe to and foment a lie about the election’s legitimacy. “The company,” the memo says of Facebook, “was in a political [vise] grip from which it could not escape without consequence.”
The answer isn’t for social media companies to bend their policies and processes under political pressure. It is to build a system that can anticipate even the unthinkable — and one that is strong enough to withstand it.
The Post’s View | About the Editorial Board
Editorials represent the views of The Post as an institution, as determined through debate among members of the Editorial Board, based in the Opinions section and separate from the newsroom.