The Washington Post | Democracy Dies in Darkness

The Technology 202: Domestic extremists are changing their playbook as social media cracks down


with Aaron Schaffer

Tech companies have been promising to aggressively crack down on domestic extremists, especially in the wake of the Jan. 6 Capitol riots. But new research highlights how those extremists are evolving their strategies to continue to recruit and organize on social media. 

A new report from Digital Citizens Alliance and Coalition for a Safer Web, which both advocate for tougher regulation of Silicon Valley, details how posts related to QAnon, which the FBI has designated a domestic terrorist threat, and from groups associated with the Capitol riots continue to appear across various social media sites. 

As companies toughen their enforcement and crack down on certain hashtags known to be associated with extremism, researchers found such groups are getting savvier and finding new terms to attract followers or different ways to embed their messages in videos. 

Here are key findings the researchers shared with The Technology 202:

  • Across social media, extremists are co-opting seemingly innocuous phrases, including terms like “patriots,” and using them to push extremist content or conspiracy theories. Facebook restricted the hashtag “patriotsincontrol” after The Post's inquiry. Twitter removed an account called PatriotsDontSlp, which the researchers identified, for violating its policy on coordinated harmful activity.
  • TikTok earlier this year banned a series of QAnon-associated hashtags, including #WWG1WGA, which stands for the movement's rallying cry “Where We Go One We Go All.” But researchers found more than a dozen videos in which people evaded the ban by posting TikToks with the song lyrics “Where We Go One We Go All,” in some instances featuring messages claiming Donald Trump was still president. One video made baseless claims about hydroxychloroquine being a treatment for the coronavirus. TikTok removed the videos over the weekend after The Technology 202 contacted the company.
  • People also circumvented the ban on QAnon-related hashtags by putting images of tweets containing terms related to the conspiracy theory in videos. TikTok removed several examples flagged by The Post, but it's generally harder for AI systems to detect phrases in videos and photos than in text.
  • The researchers found instances where groups appeared to be booted off social media, only to return with new accounts bearing “v2,” for version 2, in their handles. Instagram removed an account called @theboisfromillinois_v2 mentioned in the report.
  • The report said YouTube has been relatively lax compared to Facebook in addressing domestic extremism. The researchers said they continued to find videos associated with groups related to the Jan. 6 attacks. YouTube pushed back on the report's findings, noting it began removing videos associated with some of these groups prior to the Jan. 6 attacks. The company removed three of nine videos mentioned in the report for violating its rules. The company has said 0.16 percent to 0.18 percent of all the video views on its platform during the fourth quarter of 2020 were on content that broke its rules. That’s down 70 percent from the same period in 2017, the year the company began tracking it.
  • The report also identifies Telegram as a platform where more alarmist and extremist messages are being shared. The company did not respond to a request for comment on the report.
The researchers' findings underscore the critical role social media is playing in the spread of domestic extremism. 

Digital Citizens Alliance and Coalition for a Safer Web decided to publish the report today because of the 26th anniversary of the Oklahoma City bombing, which led law enforcement to crack down on domestic terrorism and militia movements. They say following the events of Jan. 6, it's time for the United States to take a similar approach again — but this time with a greater focus on the Internet.

The report's findings come as FBI Director Christopher Wray sounds the alarm about the central role social media companies play in spreading domestic extremism. He compared the issue to the way foreign disinformation has spread on such platforms. 

“Social media has become, in many ways, the key amplifier to domestic violent extremism just as it has for malign foreign influence,” Wray said at a Hill hearing last week. “The same things that attract people to it for good reasons are also capable of causing all kinds of harms that we are entrusted with trying to protect the American people against.”

The report's authors say it's time for tech companies to make several key changes to address these problems:

  • Hire more people: The report's authors say domestic extremists have figured out how to trick content moderation algorithms, and that platforms need to hire more humans to spot more nuanced examples of extremism. This could allow them to tell the difference between a militia member hijacking the term “Patriots” and a New England sports fan, for example.
  • Develop industry standards to address domestic extremism: Currently, rules and enforcement are inconsistent from platform to platform, so a militia channel blocked on Facebook and Instagram might still appear on YouTube. Researchers say there needs to be a “standards board” or information-sharing platform to prevent groups from simply hopping to a different social network when one cracks down. They say this is also an area where Congress can play a role.

Tom Galvin, the executive director of the Digital Citizens Alliance, said he believes the Capitol attacks are a “key turning point” for U.S. policy toward domestic terrorists, particularly online. 

“The troubling discovery of this investigation is how domestic extremists have reemerged after a bit of a hibernation as a dangerous force in our country,” Galvin said. “That force is being enabled by the digital platforms.”

Companies say they're working hard to address the problem. 

They have removed thousands of pages and accounts in recent months associated with QAnon and other domestic extremist movements. But extremist content continues to crop up.

  • TikTok said it's expanding limits on certain hashtags following the report's findings. “Violent extremism has no place on TikTok, and we work aggressively to stop the spread of disinformation and ban accounts that attempt to co-opt our platform for violent or hateful behavior,” TikTok spokeswoman Jamie Favazza said.
  • Facebook said it's anticipating that bad actors will change tactics. “We expect this movement to change its tactics to try and evade our enforcement,” Facebook spokeswoman Stephanie Otway said in a statement. “Our teams are monitoring any changes in behavior and the types of content being shared so we can adapt our policies and enforcement as necessary.”
  • YouTube argued in some instances its systems to remove content are more comprehensive than other platforms. "We regularly report on the removal of violent extremist content and in Q4 2020 alone, removed over 13,000 channels and 72,000 videos for violating our violent extremism policy," spokeswoman Ivy Choi said in a statement.
  • Twitter has been focusing on “ban evasion” since it permanently suspended thousands of accounts in the wake of the Jan. 6 attacks. Spokeswoman Katie Rosborough said the company at times took action against banned accounts attempting to return to the platform before they were even able to send a tweet.

Eric Feinberg, vice president of the Coalition for a Safer Web, accused the companies of “gaslighting” when they say they're investing in cracking down on QAnon and militia groups. He said the tech companies are not focused enough on limiting the groups' evolving use of hashtags. 

Our top tabs

A union is seeking to overturn the results of the recent election at an Alabama Amazon warehouse. 

The Retail, Wholesale and Department Store Union, which sought to represent Amazon workers in Bessemer, Ala., filed its objections with the National Labor Relations Board on Friday, claiming that the e-commerce giant improperly pressured workers to oppose the effort to organize, Jay Greene reports. The filing will trigger a hearings process, which could result in the board calling for a new election if the union prevails. 

Workers in the Bessemer warehouse voted to reject unionization by more than a 2-to-1 margin earlier this month. 

The RWDSU said Amazon’s tactics “constitute conduct which prevented a free and uncoerced exercise of choice by the employees,” adding that they “constitute grounds to set the election aside.”

Amazon didn’t respond to a request for comment on the union’s filing. But it anticipated many of the union’s claims when the ballot count ended.

“It’s easy to predict the union will say that Amazon won this election because we intimidated employees, but that’s not true,” the company said in a statement posted to its corporate blog at the time. “We’re not perfect, but we’re proud of our team and what we offer, and will keep working to get better every day.”

(Amazon chief executive Jeff Bezos owns The Washington Post.)

China still leads the world in facial recognition, but Russia is racing to catch up.

The Kremlin is increasingly using the technology to find protesters and government critics. But surveillance cameras have been turned off or malfunctioned when state security officers are accused of murders or attacks, Robyn Dixon reports.

Law enforcement agencies worldwide — including in the United States, where Capitol rioters were identified with the technology — are increasingly relying on facial recognition, but there are unique human rights and privacy concerns about its use in authoritarian regimes. 

A facial recognition system in Moscow is now used in 70 percent of crime investigations. The city has more than 189,000 cameras with facial recognition capabilities, as well as more than 12,300 on subway cars in Moscow’s Metro. At least 10 other Russian cities have expanded their use of the technology. 

A member of Russia’s FSB security agency admitted to opposition leader Alexei Navalny, who posed as a security official, that surveillance cameras are switched off in sensitive operations, such as the one in which Navalny was poisoned.

“Instead of the system being used for the benefit of the city, it is being used as a tool of total surveillance and total control of citizens,” said Sergei Abanichev, a protester who was jailed after being arrested using facial recognition.

Law enforcement officials are racing to stop online sales of fake vaccination cards.

Officials say that the online trade of the cards is illegal and could undermine people’s safety, Dan Diamond reports. They worry that people could use them to fake their vaccination status at work, in schools or during travel, potentially putting others' health at risk and undermining the broad campaign to vaccinate the public. 

“This is a concern that is national and bipartisan,” North Carolina Attorney General Josh Stein said, adding that the spread of fake vaccination cards “will extend the pandemic, resulting in more people sick and more people dead.”

Stein recently led an effort with 47 colleagues to demand that eBay and other e-commerce platforms crack down on the scams. But a Washington Post review showed they continue to exist on eBay, where one account sold more than 100 blank vaccination cards in the past two weeks. The company removed those listings after The Post brought them to the company’s attention. “Our team has reviewed and taken appropriate action,” said eBay representative Parmita Choudhury, who declined to disclose additional details about the account.

Rant and rave

Political commentator Keith Olbermann suggested rolling out New York's Excelsior Pass as a national template.

Hill happenings

Privacy groups are showing their support for tech critic Lina Khan ahead of her nomination hearing.

Ten groups wrote to the chairwoman of the Senate Commerce Committee, Sen. Maria Cantwell (D-Wash.), and the top Republican on the committee, Sen. Roger F. Wicker (R-Miss.), to support Khan, President Biden’s nominee to fill a commissioner seat on the Federal Trade Commission. Six groups that work on online safety and privacy for children and teens also wrote to support Khan, arguing that she will “help ensure that the FTC serves the interests of young people, their parents and caregivers, including through appropriate enforcement of the Children’s Online Privacy Protection Act.”

Khan will face the committee on Wednesday.

Inside the industry

Microsoft’s president backed Biden’s infrastructure package.

Brad Smith wrote in a USA Today op-ed that the company believes that Biden’s American Jobs Plan “points in the right direction.” Smith’s pitch comes as Microsoft, the world’s third-most valuable company, faces pressure in Washington in the wake of a massive cyberattack. The company last week offered to give free security services to its federal clients, but some lawmakers say that’s not enough.

“Software, data, electronics, and biology are changing the world,” Smith wrote, “but they won’t reach every American or ensure national competitiveness without public investment. A national plan is overdue.” 

Facebook plans to go after Clubhouse — and podcasts — with a suite of new audio products (Vox)

Facebook calls for data portability laws as it expands the types of info users can transfer to other services (CNBC)

Peloton fights federal safety recall after its treadmills left one child dead, others injured (Washington Post)

Daybook

  • Cecilia Muñoz, the director of former president Barack Obama’s Domestic Policy Council, speaks at a New America CA event on gig workers today at 1 p.m.
  • The House Agriculture Committee holds a hearing on rural broadband access on Tuesday at 10 a.m.
  • Acting Federal Trade Commission chairwoman Rebecca Kelly Slaughter and the FTC’s three commissioners testify before the Senate Commerce Committee on Tuesday at 10 a.m.
  • The Senate Commerce Committee holds a nomination hearing for tech critic Lina Khan, Biden’s pick to join the Federal Trade Commission, and Bill Nelson, a former senator who represented Florida and whom Biden chose to lead NASA, on Wednesday at 10 a.m.
  • A House Energy and Commerce Committee panel holds a hearing on securing U.S. wireless network technology on Wednesday at 10:30 a.m.
  • A Senate Judiciary Committee panel holds a hearing on app stores on Wednesday at 2:30 p.m. 

Before you log off
