Correction: Louis Farrakhan is an extremist leader who has espoused anti-Semitic views. An earlier version of this story and headline incorrectly included him in a list of far-right leaders.
Facebook said on Thursday it has permanently banned several far-right and anti-Semitic figures and organizations, including Nation of Islam leader Louis Farrakhan, Infowars host Alex Jones, Milo Yiannopoulos and Laura Loomer, for being “dangerous,” a sign that the social network is more aggressively enforcing its hate-speech policies at a moment when bigoted violence is on the rise around the world.
Facebook said it would remove the accounts, fan pages, and groups affiliated with these individuals on both Facebook and its sister site, Instagram, after reevaluating the content they had posted previously or examining their activities outside of Facebook. The removal also applies to at least one organization run by these people, Jones’ Infowars.
“We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology. The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today,” Facebook said in a statement.
The social network--which for years has resisted taking a more aggressive stance on extremism--is under massive pressure globally to curtail the ways in which its platform is used by hateful groups and individuals, most recently after massacres in Sri Lanka and New Zealand where the perpetrators used social media to spread their hateful messages.
“The timing is never an accident,” said Angelo Carusone, president of Media Matters, a liberal organization that has long advocated for more enforcement against white supremacy and is one of the groups Facebook briefed on the decision. “The reality is, people are getting killed. There are mass shootings and mass murders that are clearly being connected to ideas like white genocide, which are fueling radicalization. The conditions have changed. When you have these massive catalyzing moments that are connected to real-life consequences, it puts pressure on Facebook and others to look in the mirror.”
Alex Jones, speaking by phone from Austin, Texas, called Facebook’s action “authoritarian” and said he learned about it by seeing a headline on the Drudge Report. Facebook provided no direct notice, he said, and provided no evidence to him that he was “dangerous,” as the company has alleged.
“It’s a bizarre political stunt, and they’re trying to hide their censorship of conservatives by mixing in Louis Farrakhan,” Jones said.
He added, “I’m not really worried about me. I’m worried about how authoritarian this is… I guess free speech in America is dangerous. It’s comical.”
Yiannopoulos, in a text exchange with The Washington Post, said that efforts to squelch voices seen as extreme can lead to a broader crackdown on free expression. “Read Orwell,” Yiannopoulos texted, invoking George Orwell’s dystopian novel 1984. “You’re next.”
Facebook is also banning Paul Nehlen, who described himself as a “pro-White Christian Candidate” when he ran for Congress and was kicked off the Breitbart News website last year over ties to neo-Nazis and racist comments about Meghan Markle, and Paul Joseph Watson, a far-right YouTube personality and an editor of Infowars, according to the Infowars site.
Facebook has previously imposed temporary bans on extremist figures including Jones and Yiannopoulos, another right-wing social media star. Twitter acted more quickly, permanently suspending Jones, Loomer and Nehlen, and YouTube has already banned Jones and his Infowars channels. YouTube and Twitter did not respond to requests for comment.
Facebook and its counterparts have until recently largely resisted permanent bans, holding that objectionable speech is permissible, so long as it doesn’t bleed into hate. Facebook has also been wary of offending conservatives, who have become vocal about allegations that the company unfairly censors their speech.
But Facebook has recently signaled that it is willing to take a stronger stance against white nationalism and white supremacy, in particular. In March, the company said it would begin banning posts, photos and other content that reference white nationalism and white separatism, revising its rules in response to criticism that a loophole had allowed racism to thrive on its platform. Previously, the company had banned only explicit white supremacy.
Governments around the world are pushing Facebook to take down bigoted and other harmful content more quickly--or risk being blocked themselves. Facebook and other social media companies were blocked in Sri Lanka in the wake of a massacre at a Catholic church on Easter Sunday, a response to government concerns that social media could spread misinformation and fuel further violence.
The government of New Zealand is also weighing stricter regulation of social media in response to a mass shooting at a mosque in Christchurch by a gunman who appeared to be influenced by white supremacist ideas and who streamed the massacre live on Facebook. Meanwhile, Australia and the United Kingdom are considering steep penalties for social media companies that do not quickly remove and reduce the distribution of harmful material, including violent content that can spread bigoted messages.
The company is also in the crosshairs of regulators over civil rights issues. Facebook has submitted to a civil rights audit over the last year and recently announced sweeping changes to its targeted advertising system after being sued by the U.S. Department of Housing and Urban Development, which argued that its software enabled discrimination.
The other people who were banned did not respond to requests for comment.
Facebook said it began reexamining these extremist figures last year, and some of the activities and posts it reevaluated took place within the past year or two. The company said it took the individuals’ actions outside Facebook into account when deciding to ban them. Jones, for example, recently hosted Gavin McInnes, the leader of the Proud Boys, whom Facebook designated a hate figure in December. Yiannopoulos publicly praised McInnes this year, and Loomer appeared with him at a rally.
In other cases, the company reexamined long-held stances by some of the individuals. Farrakhan, Facebook said, referred to Jews as termites earlier this year, described their religion as “dirty” and called its followers “liars, cheaters, and thieves.” He has long held anti-Semitic views and has referred to Jews in derogatory terms for years, according to the Southern Poverty Law Center.
Jonathan Greenblatt, chief executive of the Anti-Defamation League, said Farrakhan has been a leading voice in spreading anti-Semitic ideas for more than 30 years and also has railed against white people and members of the LGBTQ community. “It’s all pretty despicable,” Greenblatt said. “His profile, his serial, congenital anti-Semitism, really puts him at the top of the ranks in terms of spreading these types of ideas.”
Of Facebook’s action, Greenblatt called for greater transparency. “It’s an important step, yes, but the proof is in the pudding,” he said.
The bans were welcomed by civil rights activists, who have long argued that these individuals espouse violent and hateful views and that Silicon Valley companies should not allow their platforms to become a vehicle for spreading them.
Madihha Ahussain, special counsel for anti-Muslim bigotry with the advocacy group Muslim Advocates, said that individuals like Loomer, Jones and Yiannopoulos have used social media platforms to broadcast dangerous hate speech and conspiracies targeting Muslims, Jews and others.
“We applaud Facebook for taking this positive step toward removing hate actors from the company’s platforms,” she said. “As we saw in Christchurch, New Zealand — where a white nationalist was able to live-stream the slaughter of 50 people at two mosques — online platforms like Facebook have been used to target communities and spread hate.”