
Facebook groups topped 10,000 daily attacks on election before Jan. 6, analysis shows

Review of millions of posts shows Facebook played a critical role in spreading false narratives that fomented violence that day


Facebook groups swelled with at least 650,000 posts attacking the legitimacy of Joe Biden’s victory between Election Day and the Jan. 6 siege of the U.S. Capitol, with many calling for executions or other political violence, an investigation by ProPublica and The Washington Post has found.

The barrage — averaging at least 10,000 posts a day, a scale not reported previously — turned the groups into incubators for the baseless claims supporters of President Donald Trump voiced as they stormed the Capitol, demanding he get a second term. Many posts portrayed Biden’s election as the result of widespread fraud that required extraordinary action — including the use of force — to prevent the nation from falling into the hands of traitors.

“LOOKS LIKE CIVIL WAR is BECOMING INEVITABLE !!!” read a post a month before the Capitol assault. “WE CANNOT ALLOW FRAUDULENT ELECTIONS TO STAND ! SILENT NO MORE MAJORITY MUST RISE UP NOW AND DEMAND BATTLEGROUND STATES NOT TO CERTIFY FRAUDULENT ELECTIONS NOW !”

Another post, made 10 days after the 2020 election, bore an avatar of a smiling woman with her arms raised in apparent triumph and read, “WE ARE AMERICANS!!! WE FOUGHT AND DIED TO START OUR COUNTRY! WE ARE GOING TO FIGHT... FIGHT LIKE HELL. WE WILL SAVE HER❤ THEN WERE GOING TO SHOOT THE TRAITORS!!!!!!!!!!!”

One post showed a Civil War-era picture of a gallows with more than two dozen nooses and hooded figures waiting to be hanged. Other posts called for arrests and executions of specific public figures — both Democrats and Republicans — depicted as betraying the nation by denying Trump a second term.

“BILL BARR WE WILL BE COMING FOR YOU,” wrote a group member after Barr announced that the Justice Department had found little evidence to support Trump’s claims of widespread vote-rigging. “WE WILL HAVE CIVIL WAR IN THE STREETS BEFORE BIDEN WILL BE PRES.”

Facebook executives have played down the company’s role in the Jan. 6 attack and have resisted calls, including from its own Oversight Board, for a comprehensive internal investigation. The company also has yet to turn over all the information requested by the congressional committee studying the Jan. 6 attack, though it says it is negotiating with the committee.

But the ProPublica-Post investigation, which analyzed millions of posts between Election Day and Jan. 6 and drew on internal company documents and interviews with former employees, provides the clearest evidence yet that Facebook played a critical role in the spread of false narratives that fomented the violence of Jan. 6.


Its efforts to police such content, the investigation also found, were ineffective and started too late to quell the surge of angry, hateful misinformation coursing through Facebook groups — some of it explicitly calling for violent confrontation with government officials, a theme that foreshadowed the storming of the Capitol on Jan. 6 amid clashes that left five people dead.

Drew Pusateri, a spokesman for Meta, Facebook’s newly renamed parent company, said that the platform was not responsible for the violence on Jan. 6. He pointed instead to Trump and others who voiced the lies that sparked the attack on the Capitol.

“The notion that the January 6 insurrection would not have happened but for Facebook is absurd,” Pusateri said in a statement. “The former President of the United States pushed a narrative that the election was stolen, including in-person a short distance from the Capitol building that day. The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.”


To determine the extent of posts attacking Biden’s victory, The Post and ProPublica obtained a unique dataset of 100,000 groups and their posts, along with metadata and images, compiled by CounterAction, a firm that studies online disinformation. The Post and ProPublica used machine learning to narrow that list to 27,000 public groups that showed clear markers of focusing on U.S. politics. Out of the more than 18 million posts in those groups between Election Day and Jan. 6, the analysis searched for words and phrases to identify attacks on the election’s integrity.

The total — more than 650,000 posts attacking the election, an average of at least 10,000 a day — is almost certainly an undercount. The ProPublica-Washington Post analysis examined posts in only a portion of all public groups, and did not include comments, posts in private groups or posts on individuals’ profiles. Only Facebook has access to all the data needed to calculate the true total — and it hasn’t done so publicly.

Facebook has heavily promoted groups since CEO Mark Zuckerberg made them a strategic priority in 2017. But the ones focused on U.S. politics have become so toxic, say former Facebook employees, that the company established a task force, whose existence has not been previously reported, specifically to police them ahead of Election Day 2020.

The task force removed hundreds of groups with violent or hateful content in the months before Nov. 3, 2020, according to the ProPublica-Post investigation.

Yet shortly after the vote, Facebook dissolved the task force and rolled back other intensive enforcement measures. The results of that decision were clear in the data ProPublica and The Post examined: During the nine increasingly tense weeks that led up to Jan. 6, the groups were inundated with posts attacking the legitimacy of Biden’s election, while the pace of removals noticeably slowed.

Removals did not pick up again until the week of Jan. 6, but even then, many of the groups and their posts remained on the site for months after, as Trump supporters continued to falsely claim election fraud and press for states to conduct audits of the vote or impose new voting restrictions.

[Chart: Fewer political groups were removed from Facebook between Election Day and Jan. 6 — removal dates for about 2,000 public U.S. political groups between August 2020 and March 2021. Annotated events: early September, Facebook’s Group Task Force begins; Oct. 6, Facebook announces complete QAnon ban; Nov. 3, Election Day; Dec. 2, Facebook disbands Civic Integrity team and Group Task Force; Jan. 6, insurrection at the U.S. Capitol. Note: Political Facebook groups were identified out of a sample of roughly 100,000. Removal dates for each group are estimates. Only groups with 10 or more posts are shown. Source: A ProPublica-Washington Post analysis of public Facebook group data collected by CounterAction.]


“Facebook took its eye off the ball in the interim time between Election Day and January 6,” said a former Integrity team employee who worked on the groups task force and, like others, spoke on the condition of anonymity to discuss sensitive internal matters. “There was a lot of violating content that did appear on the platform that wouldn’t otherwise have.”

Pusateri denied that the company had pulled back on efforts to combat violent and false postings about the election after the vote. He did not comment on the quantitative findings of the ProPublica-Post investigation.

“The idea that we deprioritized our Civic Integrity work in any way is simply not true,” he said. “We integrated it into a larger Central Integrity team to allow us to apply the work that this team pioneered for elections to other challenges like health-related issues for example. Their work continues to this day.”


The investigation also reveals a problem with the way Facebook polices its groups. Former employees say groups are essential to the company’s ability to keep a stagnant American user base as engaged as possible and boost its revenue, which reached nearly $86 billion in 2020.

But they say that as groups have grown more central to Meta’s bottom line, the company’s enforcement efforts have been weak, inconsistent and heavily reliant on the work of unpaid group administrators to do the labor-intensive job of reviewing posts and removing the ones that violate company policies. Many groups have hundreds of thousands or even millions of members, dramatically escalating the challenges of policing posts.

With the administrators themselves steeped in conspiracy theories about the election or, for example, the safety of coronavirus vaccines, reliable enforcement rarely takes place, say former employees. They say automated tools — which search for particular terms indicating policy violations — are ineffective and easily evaded by users simply misspelling key words.

“Groups are a disaster,” said Frances Haugen, a former member of Facebook’s Civic Integrity team who filed a whistleblower complaint against the company and testified before Congress warning about the damaging effects of the company on democracy worldwide, as well as other problems.

Many of the group posts identified in the analysis fell into what a March internal Facebook report, first published by Politico, defined as “harmful non-violating narratives.” The term refers to content that does not break Facebook’s rules but whose prevalence can cause people to “act in ways which are harmful to themselves, others, or society at large.”

The report warned that such harmful narratives could have had “substantial negative impacts including contributing materially to the Capitol riot and potentially reducing collective civic engagement and social cohesion in the years to come.”

Pusateri declined to comment on specific posts but said the company does not have a policy forbidding posts or comments that attack the legitimacy of the election. He said the company has a dedicated groups integrity team and an ongoing initiative to protect people who use groups from harm.

Facebook officials have noted that more-extreme content flowed through smaller social media platforms in the buildup to the Capitol attack, including detailed planning on bringing guns or building gallows that day. But Trump also used Facebook as a key platform for his lies about the election right up until he was banned on Jan. 6. And Facebook’s reliance on groups to drive engagement gave those lies unequaled reach. This combined with the sag in post-election enforcement to make Facebook a key vector for pushing the ideas that fueled violence on Jan. 6.

Critics and former employees say this also underscores a recurring issue with the platform since its founding in Zuckerberg’s Harvard University dorm room in 2004: The company recognizes the need for enforcement only after a problem has caused serious damage, often in the form of real-world mayhem and violence.


Facebook didn’t discover a campaign by the Russia-based Internet Research Agency to spread hyperpartisan content and disinformation during the 2016 presidential election until months after Americans had voted. The company’s actions were late as well when Myanmar’s military leaders used Facebook to foment rapes, murders and forced migrations of minority Rohingya people. Facebook has apologized for failings in both cases.

The response to attacks on the legitimacy of the 2020 U.S. presidential election was similarly slow, as company officials debated among themselves whether and how to block the rapidly metastasizing lies about the election. The data shows they acted aggressively and comprehensively only after Trump supporters had battered their way into the Capitol, sending lawmakers fleeing for their lives.

The ProPublica-Post investigation “is a new and very important illustration of the company’s unfortunate tendency to deal with safety problems on its platform in a reactive way,” said Paul Barrett, deputy director of the Center for Business and Human Rights at New York University’s Stern School of Business. “And that almost by definition means that the company will be less effective, because it will not be looking out into the future and preventing problems before they happen.”

The problems with policing groups

Facebook’s newly vigorous enforcement actions the week of Jan. 6 — which resulted in Trump himself being banned from the platform — marked such a stark contrast from the company’s previous approach that some Trump supporters took to Facebook to complain about the reversal.

“Facebook is Getting Real Brave and Vicious Now,” Jerry Smith, a retired police officer from Missouri who created and ran a group called United Conservatives for America, wrote the day after the Capitol attack. “They Are Removing Tons of Posts From My Groups!”

In a recent interview at his home, Smith said he could not remember writing that message or which deletions prompted his response. He said he opposed political violence and posts that called for it. But he acknowledged it was difficult for him to remove such content as United Conservatives for America’s membership swelled to more than 11,000, with the number of posts surpassing what one person could monitor. The typical group in the ProPublica-Post analysis had more than 1,000 members.

Smith, who showed a reporter that his Facebook account had received 116 notifications for breaking company rules, said he found some of Facebook’s policies reasonable but disagreed on how they should be enforced. He posted in United Conservatives for America and other groups at a frenetic pace long before Election Day. As early as the summer of 2020, he warned about alleged Democratic Party plans to steal the election and also shared false information about the pandemic, including a video from a conspiracy theorist about the origins of the virus.

“And DEMS Are Pushing For Vote By Mail. Another Way For Them To Steal The Election,” he wrote in August 2020.

In the interview, Smith said he believes that American elections often are rigged and worries that coronavirus vaccines may be tainted. He has used Facebook groups to share these beliefs with tens of thousands of people — and thinks Facebook’s enforcement of its policies is overly aggressive and a result of political bias against conservatives.

“Are you going to do away with their free speech?” Smith said. “If someone thinks it’s not a fair election … why can’t they have their opinion on whether it’s a fair election or not?”

Facebook enforcement slowed before Jan. 6

Facebook’s problems with groups had long been obvious to company employees, who gathered on a remote video conference in early September 2020 to figure out how to stop them from spreading hate, violent threats and misinformation as Election Day approached, according to former employees.

Known as the Group Task Force, the new unit they formed consisted of members of Facebook’s Civic Integrity team, the specialized unit charged with protecting elections on the platform, as well as employees from engineering and operations teams who help oversee the contract moderators who review posts flagged by users or by automated systems, former employees said. The goal of the task force was to identify political groups with large numbers of posts and comments that violated the social media giant’s rules against hate speech and calls for violence. Former employees involved in the effort said they wanted to apply the platform’s rules while respecting political debate and dialogue.

At the same time, Facebook’s Dangerous Individuals and Organizations team was identifying and removing QAnon groups ahead of the election. The results of the two teams’ actions were striking. All of the more than 300 QAnon groups identified by ProPublica and The Post had been removed by October 2020, when Facebook announced a total ban on the movement, the analysis found.

[Chart: Facebook can be effective when it chooses — the number of U.S. QAnon groups on Facebook increased in 2020, before the company cracked down. Annotated events: Aug. 19, Facebook announces removals of and restrictions for QAnon groups; Oct. 6, Facebook announces complete QAnon ban; Nov. 3, Election Day. Note: QAnon-related Facebook groups were identified out of a sample of roughly 100,000. Only groups with 10 or more posts are shown. Source: A ProPublica-Washington Post analysis of public Facebook group data collected by CounterAction.]


In the end, the Group Task Force removed nearly 400 groups whose posts had been seen nearly 1 billion times before Election Day, according to a post on Workplace, Facebook’s internal discussion tool. The document later was included in the Facebook Papers disclosed by Haugen to Congress and the Securities and Exchange Commission. Still, members of the task force told ProPublica and The Post that the existence of such a team was an indictment of Facebook’s failure to police groups as part of its normal operations.

“The whole thing of the civic team needing to come in and do the takedowns was not a good state of affairs,” said one employee involved in the task force. “You could make a good argument that this should have already been done.”


On Nov. 5, 2020, Facebook banned “Stop the Steal,” a hugely viral group created on Election Day itself that quickly attracted over 300,000 members with a message attacking the legitimacy of the election. In banning the group and all other groups using a similar name, the company cited the prevalence of posts calling for violence and using hate speech.

The next day, Nov. 6, the Group Task Force gathered virtually to celebrate its efforts, former employees said. Days later, a task force member published a Workplace post titled “Some Reflections on US2020” to bring attention to its work.

“Along with heroic efforts from other teams across the company, I truly believe the Group Task Force made the election safer and prevented possible instances of real world violence,” said the post.

But the focus on U.S. political groups and content undermining the election wouldn’t last.

A noticeable drop in policing

On Dec. 2, 2020, Facebook executives disbanded the Civic Integrity team and scattered its members to other parts of Facebook’s overall integrity team, reducing their influence. That resulted in the demise of the Group Task Force. The company also rolled back several emergency measures that had been put in place leading up to Election Day to control misbehavior in Facebook groups.

The ProPublica-Post investigation reveals the result: During the lull in enforcement, hundreds of thousands of posts questioned the legitimacy of Biden’s victory, spread lies about voter fraud and at times called for violence. Meanwhile, the company’s pace of group removals slowed to a crawl, the data analysis shows.

Among the content spreading in groups were videos in which former Trump national security adviser Michael Flynn spread false claims of electoral fraud and called for martial law. (Through a spokesperson, Flynn declined to comment.) Another frequent post was a cartoon showing Trump chasing a masked Biden, who carried a bag labeled “election theft,” with swing states depicted inside. It was posted more than 350 times in the political groups analyzed by ProPublica and The Post, attracting over 2,500 total likes.

One meme featured a photo of former congressman Trey Gowdy (R-S.C.), who rose to fame in right-wing circles by leading a congressional committee’s investigation into the deadly 2012 attack on the American diplomatic compound in Benghazi, Libya, accompanied by the text “If you are ok with rigging an election to win, I am ok with martial law to stop you…” That was posted in groups at least 97 times, garnering over 3,500 total likes. Gowdy has denied saying the phrase.

Another meme showed a photo of Trump winking, with the text “Not Only Can Martial Law Guarantee a Trump Victory, It Also Allows Trump To Arrest Anyone He Wants!” It was posted at least 70 times, generating more than 2,400 total likes. The images and their spread in groups were identified using a CounterAction image-analysis tool.

“Everyone needs to make a show of FORCE in DC on the 6th and any congress who doesnt follow the constitution or who doesnt stand up for our president (Pence included) needs to be ’corrected’ by WE the PEOPLE - on the front steps of the state house - for all the world to see!!! THIS IS HOW THE US DEALS WITH HER TRAITORS!!!” read one post from Dec. 27, 2020.

Ten days later, as rioters stormed the Capitol, the ProPublica-Post analysis shows, Facebook began taking down groups at a rate not seen since before the election. An internal Facebook spreadsheet from Jan. 6, which was included in Haugen’s disclosures, contains a section called “Action Items.” The top bullet point was a direction to conduct a “Sweep of Groups with V&I risk” — a term referring to violence and incitement.

It had been 35 days since the Civic Integrity team, and with it the Group Task Force, had been disbanded.

Groups still active long after Jan. 6

Months after the Capitol was breached, Facebook still was working to remove hundreds of political groups that violated company policies.

One of those was Smith’s United Conservatives for America, which continued to carry posts attacking the legitimacy of Biden’s election until Facebook removed it in May.

When Smith met with a reporter in his home early last month, he’d just finished a 30-day posting ban on Facebook. Despite his account’s history of violations, he was still managing at least one Facebook group — also called United Conservatives for America.

Like its predecessor, the new United Conservatives for America group was racking up strikes for violations of Facebook’s rules, according to a post Smith made to the group in September.

That post included a screenshot of an automated message from Facebook informing him that eight recent posts in the new United Conservatives for America group had been flagged by fact-checkers. As a result, the distribution of the group’s posts was being limited.

Smith remained defiant.

“I'm Not Blaming Our Members,” Smith wrote. “I’m Blaming FakeBook!”

Late last month, after being asked about Smith’s account and group, Facebook said it banned his profile and removed United Conservatives for America, citing unspecified violations of its community standards.

About this story

Read how ProPublica and The Post researched election-related posts in Facebook groups in our methodology.

Craig Timberg is a technology reporter and Jeremy B. Merrill is a data reporter for The Post. Craig Silverman is a national reporter and Jeff Kao is a computational journalist at ProPublica. Tom Hamburger contributed to this report. Graphics by Chris Alcantara and Kate Rabinowitz. Design by Irfan Uraizee.
