By Saturday afternoon, Chad Loder had seen enough.
The threatening messages directed at Omar were so severe that they prompted House Speaker Nancy Pelosi (D-Calif.) to increase protection for the congresswoman. But Loder, the CEO of cybersecurity training company Habitu8, said Twitter shares responsibility.
He told The Washington Post he’s been frustrated by the platform’s seeming inability to crack down on accounts that espouse racist, homophobic and violent rhetoric. The threats against Omar, one of the first Muslim women to serve in Congress, were just the latest example.
So, in less than an hour, he crafted his own solution.
Using Twitter’s search feature, Loder combined terms such as “Rope,” “Bullet,” “Noose” and “Hanged” with Omar’s two Twitter accounts, revealing dozens of inflammatory tweets. He said he also used phrases such as “I hope someone,” “Someone needs to” and “I wish someone” to pinpoint tweets inciting violence against her.
While the majority of his results were not direct threats, Loder said he was able to identify hundreds that were. He compiled the results into a Twitter Moment that encourages users to mass-report the tweets, in hopes of prompting Twitter to take action against the accounts behind them.
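The manual process Loder describes (pairing threat keywords and phrases with an account mention in Twitter's standard search box) can be sketched in a few lines of Python. The handles below are placeholders rather than Omar's actual accounts, and the search terms are the ones quoted above; this is an illustrative sketch, not Loder's actual tooling.

```python
# Sketch of the keyword-plus-account search Loder describes. Each query
# pairs one threat term with one target handle, then URL-encodes it into
# a link that can be opened in Twitter's search page.
from urllib.parse import quote_plus

KEYWORDS = ["rope", "bullet", "noose", "hanged"]
PHRASES = ['"I hope someone"', '"Someone needs to"', '"I wish someone"']
HANDLES = ["@example_handle_1", "@example_handle_2"]  # placeholder handles

def search_urls(handles, terms):
    """Build one Twitter search URL per (handle, term) pair."""
    urls = []
    for handle in handles:
        for term in terms:
            # e.g. the query 'rope @example_handle_1', URL-encoded
            urls.append("https://twitter.com/search?q=" + quote_plus(f"{term} {handle}"))
    return urls

urls = search_urls(HANDLES, KEYWORDS + PHRASES)
print(len(urls))  # 2 handles x 7 terms = 14 searches to review by hand
```

Nothing here automates reporting or judgment; as Loder notes, the work is simply opening each search and reading the results.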
“It’s not like I’m sitting here building this advanced system. It’s basically searching for stuff and adding it to a moment,” he said. “The more people who report on a tweet, the more attention it gets.”
Many of the tweets have since been deleted.
Loder described Trump’s Friday tweet as an example of “stochastic terrorism”: a phenomenon in which a prominent person or group stirs up hatred against a target, potentially causing someone else to carry out a violent act against them.
“Rather than directly call for specific acts of violence, hate groups can maintain plausible deniability by merely suggesting that someone is a subhuman traitor, an enemy of the people,” Loder said. “They know that some random person somewhere who’s just on the edge of a mental breakdown will see it, and they’ll take matters into their own hands.”
He added, “When it’s Donald Trump doing the tweeting, linking Ilhan Omar with pictures of the two towers exploding, it has a big impact.”
That impact, Loder said, has already played out in the myriad Twitter threats against Omar. But these messages can also lead to real-world violence, and some Democrats have cited a man who was charged earlier this month with threatening Omar, allegedly promising to “put a bullet in her . . . skull.” USA Today reports the suspect told investigators he “loves President Trump” and expressed disdain for “radical” Muslims in Congress.
Omar wrote Saturday night that she’d seen an increase in death threats directly linked to Trump’s tweet. On ABC’s “This Week,” White House press secretary Sarah Sanders said the president did not wish “ill will, and certainly not violence toward anyone,” with his tweet.
But that wasn’t reflected on Twitter, Loder said, where accounts encouraging violence sometimes have thousands of followers, amplifying their hateful messages. He said many appear to belong to white men and women in their 50s, whose profiles typically feature American flag emoji and are littered with racist, anti-immigrant and anti-Democratic invective.
The main targets of their ire are typically people of color, especially women, he added.
“It’s telling, and disturbing, that so much of their online language reflects lynching or dragging, including pictures of nooses,” Loder said. “We’re not that far removed from when lynching was a reality in the U.S.”
Some of the threats Loder uncovered have been up for weeks and even months. A source familiar with Loder’s efforts, however, told The Post that some of the tweets — which would have otherwise been immediately removed — were “temporarily maintained to enable potential law enforcement coordination.”
Capitol Police said in a statement Monday they could not detail their strategy for protecting members of Congress.
Twitter has taken numerous steps to improve the user experience on the platform, including introducing machine-learning software to monitor account behavior, and it suspends more than a million problematic accounts a day. Still, Loder said, automation is not sufficient to police hate speech and threats of violence. He suggested it would be prudent for Twitter to pay special attention to the accounts of controversial political figures, especially those who are constantly under threat.
“If it’s possible for me, as a single guy working for a couple hours over the weekend, to put a little search dashboard together, to find huge numbers of death threats against Ilhan Omar, there is no reason why Twitter couldn’t do the same thing,” he said.
A Twitter spokesman told The Post on Monday that death threats, incitement to violence and hateful conduct are “absolutely unacceptable” on the platform.
“Accounts spreading this type of material will be removed and coupled with our proactive engagement, we continue to encourage people to report this content to us,” the spokesman said in a statement. “This behavior undermines freedom of expression and the values our service is based on.”
According to Twitter’s rules, the platform penalizes accounts based on the severity of the infraction and whether it is the account’s first violation of Twitter’s policies. The graduated penalties are meant to drive longer-term behavioral change, but repeated violations can result in a permanent suspension.
But Loder said it was disheartening that some of these accounts — even after being reported for making death threats — are allowed to return to the platform and their followers.
He thinks they should be banned outright.
"Why is that person allowed back on the network when they broke the rules and broke the law, and what incentive does Twitter have to allow them to come back?” he asked. “For outright death threats, for Ilhan Omar’s daughter to be dragged by a chain, I don’t think that person should get their account back.”