Neo-Nazis and far-right activists are coaching followers on how to use a new Twitter rule to persuade the social media platform to remove photos of them posted by anti-extremism researchers and journalists who specialize in identifying episodes of real-world hate.

Advocates said they worry the new policy will suppress efforts to document the activities of the far right and will prove to be a gift to members of hateful movements eager to keep their identities concealed.

“It’s going to be emboldening to the fascists,” said Gwen Snyder, an anti-fascist researcher and organizer in Philadelphia.

Snyder’s Twitter account was suspended early Thursday after someone reported a 2019 tweet of hers showing photos of a local mayoral candidate attending a public rally alongside the extremist group the Proud Boys. After The Washington Post asked about the suspension, Twitter spokesperson Trenton Kennedy said the tweet was not in violation and that “our teams took enforcement action in error.”

On Tuesday, Twitter said its new “private information policy” would allow someone whose photo or video was tweeted without their consent to request the company take it down.

Twitter said the rule would help “curb the misuse of media to harass, intimidate and reveal the identities of private individuals, which disproportionately impacts women, activists, dissidents, and members of minority communities.”

The rule, company officials said Tuesday, would not apply to photos that added “value to public discourse” or were of people involved in a large-scale protest, crisis situation or other “newsworthy event due to public interest value.”

In the days since, however, white supremacists on channels such as the encrypted chat service Telegram have urged supporters to use the new policy against activists and journalists who have shared their information or identified them in photos of hate rallies or public events.

“Due to the new privacy policy at Twitter, things now unexpectedly work more in our favor as we can take down Antifa … doxing pages more easily,” a white nationalist and Nazi sympathizer wrote to followers on Telegram on Wednesday night, referring to the anti-fascist political movement whose members often clash with far-right protesters and to the practice of publishing people’s personal information online.

He included a list of nearly 50 Twitter accounts and urged people to report them for suspension under the new rule. At least one of the accounts was suspended by Thursday. Twitter did not respond to a question about why the account had been taken down.

The Telegram post has been viewed more than 10,000 times. After it was shared on Twitter by anti-extremism researcher Kristofer Goldsmith, the Telegram user wrote, “Yeah and we’ll do it again.”

How Twitter will enforce the new policy remains contentious. A Twitter spokesman told The Post this week that the policy would help prevent the unauthorized sharing of photos of rape victims or women in authoritarian countries who could face real-world punishment for going outside without a burqa.

The company said that each report will be reviewed case-by-case and that flagged accounts can file an appeal or delete the offending posts to resolve their suspensions.

Snyder, the Philadelphia anti-fascist researcher, said she believed her reported tweet did not break the rules but deleted it anyway, worried that any appeal she filed would take too long or ultimately fail. She suspects the rule could have a “catastrophic” chilling effect on other researchers working to expose extremists.

Since the violent white-nationalist rally in Charlottesville in 2017, anti-extremism activists have used Twitter to identify previously anonymous members of far-right militias, neo-Nazis and other hate groups, sharing their photos, names and other information.

In some cases, the exposed people have lost jobs, been reported to law enforcement or faced consequences with co-workers, friends or family. Activists and researchers who have shared their information have also faced death threats and online attacks.

Goldsmith, a researcher with the Innovation Lab at Human Rights First who tracks the far right, said the rule could undermine Twitter’s front-line role in distributing critical information about online and real-world hate campaigns.

Amateur investigators known as “sedition hunters” openly used Twitter to identify rioters at the U.S. Capitol on Jan. 6. Other researchers did the same after Charlottesville, he said. A jury last week ruled that more than a dozen white supremacists and hate groups should pay more than $26 million in damages for acts of intimidation and violence during the rally that left one woman dead.

“A large portion of the evidence that has been presented in these cases came from what Twitter now says is protected or ‘private’ information,” Goldsmith said.

Anti-extremism researchers and photojournalists on Twitter have in recent days posted reports showing suspension notices they’d received related to the new rule, even for months-old tweets of people in public places to whom the rule would not appear to apply.

Far-right activists have also worked to exploit their newfound power. On Telegram, one far-right activist shared tips on how to find potentially reportable images, using Twitter search queries such as “images fascist exposed.”

On other sites, like the fringe social network Gab, far-right activists said they were aggressively hammering out reports in hopes of taking down anti-fascist Twitter accounts. One said he had filed more than 50 reports in a day, adding, “It’s time to stay on the offensive.”

Some have also attempted to organize on Twitter, with one account saying they had submitted dozens of reports under the rule against anti-fascist accounts, tweeting, “[Right-wing] Twitter, it is time. I told you yesterday and you had reservations. No more excuses. We have work to do.” The account has since been suspended.

Goldsmith said he worried that Twitter’s moderators would not be prepared for a flood of reports from bad actors who could organize on other sites in hopes of blocking or hindering researchers’ work.

“Twitter simply does not have the human power to make these judgment calls,” he said.

Oren Segal, vice president of the Center on Extremism at the Anti-Defamation League, said Twitter needs to provide more clarity about how these rules will be enforced.

“If the intention of the new rules is to help stop doxing and harassment, that is important. But exposing extremists is also important,” Segal said. “Accountability is important. And sunlight can be the best disinfectant when done responsibly.”