The Switch

‘Twitter purge’ suspends account of far-right leader who was retweeted by Trump

By Hayley Tsukayama, Craig Timberg

December 18, 2017 at 12:09 PM

Twitter on Monday suspended the account of a top official of a far-right British group whose anti-Muslim videos President Trump retweeted last month, amid the company's move to crack down on content that promotes hate or threatens violence against people or groups.

The rollout of Twitter's new rules was the latest attempt by technology companies to curb abuses of their platforms in the aftermath of the bloody demonstration in Charlottesville in August. Though Twitter's announcement in a morning blog post did not make this connection explicit, companies have been scrambling for months to address allegations that their platforms had become breeding grounds for extremist groups.

Far-right political figures have criticized these moves as assaults on their rights to free speech, and some have called Twitter's new policy part of an effort to "purge" them. Among those whose accounts went offline Monday were three affiliated with the group Britain First, including its main account and those maintained by its leader, Paul Golding, and his deputy, Jayda Fransen. It was Fransen's anti-Muslim posts last month that were retweeted by President Trump, a move that earned him a sharp rebuke from British Prime Minister Theresa May.

The White House declined to comment on the suspension of the accounts of Fransen and others.

Britain First also did not reply to requests for comment but, according to a CNBC report, said in an email to members, "Twitter is now only available to those on the 'left' of politics. This latest attack on our movement demonstrates that 'free speech' is only available to those who do not criticize socialism or Islam."

Late last month, after Trump retweeted three of Fransen’s videos, White House press secretary Sarah Huckabee Sanders told reporters that his action was intended to “elevate the conversation” about the issues of “extreme violence and terrorism.”

The three videos Trump shared were provocatively titled “Muslim migrant beats up Dutch boy on crutches!”, “Muslim destroys a statue of Virgin Mary!” and “Islamist mob pushes teenage boy off roof and beats him to death!” The videos provided no context, and the Netherlands Embassy said that the one about a “Muslim migrant” actually featured a perpetrator born and raised in the Netherlands.

Following Trump’s retweets, Fransen touted the U.S. president’s promotion of her videos. “Donald Trump himself has retweeted these videos and has around 44 million followers!” she tweeted. “God Bless You Trump! God Bless America!”

Also suspended by Twitter on Monday were accounts belonging to American Renaissance, which the Southern Poverty Law Center has labeled an extremist group, and to its leader, Jared Taylor.

In a statement Monday, American Renaissance said: "Twitter said it has begun enforcing new rules against 'hateful and abusive content.' American Renaissance and Jared Taylor violate none of those rules. Anyone familiar with our work recognizes that we present our dissident political views with civility. We have never even hinted at anything that could be seen as condoning violence or illegality, nor do we associate with those who do. We’ve taken special care to abide by Twitter’s terms of service–old and new."

Related: [Trump’s retweets elevate a tiny fringe group of anti-Muslim activists in Britain]

Twitter on Monday declined to specify which accounts it suspended or how many.

According to its new rules, "violent extremist groups" will be prohibited from having accounts on the network. The new guidelines target accounts from groups that explicitly call for violence against individuals or against ethnic and other social groups, for example, or that make clear their main objective is to promote violence. Twitter is also cracking down on hateful imagery, such as symbols and logos that appear in users' profile pictures or cover photos. Users who break these rules will have one opportunity to remove the offending content; if they violate the rules again, their accounts will be suspended.

The company will decide which groups meet its definition of a violent extremist group by consulting nonprofit organizations, governments and other experts. Any individual account or tweet that promotes violence against someone will prompt Twitter to suspend that account — temporarily or permanently, depending on how that user has acted in the past.

Twitter critics have repeatedly called on the company to crack down more on hateful language, imagery and calls for violence spread through its site. But the issue came to a head recently as Twitter grappled with the growing presence of hate groups on its network and with criticism that it was allowing hateful rhetoric to spread unchecked. After the company verified the account of Jason Kessler, one of the main organizers of the Charlottesville rally, it reversed its decision and said it would reevaluate its process for choosing the accounts it authenticates. Kessler's account remains online.

The Anti-Defamation League issued a statement praising Twitter's crackdown on Monday. "We have long urged Twitter to push back against hateful and violent rhetoric, and these latest actions are encouraging," the group said.

Related: [Twitter attacked after it verifies account of Charlottesville rally organizer]

While Twitter has not yet announced new rules around verification, the new guidelines do more closely govern what individual users can post to their profiles.

The effect of these rules will depend on how Twitter enforces them. The company did not offer further details about how it will weigh these rules against its "newsworthiness" standard, which has been the company's explanation for why it has not removed certain tweets from Trump's account that would otherwise violate its rules for conduct.

Related: [Twitter users want Trump’s account suspended for ‘threatening violence’ against North Korea]

In addition to changing its policies, the company is revising the way it describes some of its rules to users. Its "Help Center," the main resource for Twitter users looking for information about the rules, will now include a "rationale" section that explains more about how Twitter arrived at its policies and how it enforces them.

Twitter acknowledged that these rules alone won't come close to fixing everything wrong with the social network, and it even cautioned that stricter enforcement could, in the process, discipline some people who haven't broken the rules.

“In our efforts to be more aggressive here, we may make some mistakes and are working on a robust appeals process,” the company’s blog post said, adding that it will continue evolving its rules based on user feedback.

John Wagner and David Weigel contributed to this report.


Hayley Tsukayama covers consumer technology for The Washington Post. A Minnesota native, she joined The Post in 2010 after completing her master's degree in journalism.

Craig Timberg is a national technology reporter for The Washington Post. Since joining The Post in 1998, he has been a reporter, editor and foreign correspondent, and he contributed to The Post’s Pulitzer Prize-winning coverage of the National Security Agency.
