FACEBOOK HAS banned Alex Jones — again. The technology company’s decision to remove several far-right and anti-Semitic figures from its platform this week may look like more of the same, but it actually signals a marked shift in attitudes toward hate speech online.
Facebook and its peers first went after Mr. Jones and his Infowars last summer. Then, the conservative conspiracy theorist’s exile prompted widespread worry over the imminent death of freedom of expression. Now, Facebook has issued a much firmer farewell to Mr. Jones and several other racist extremists, including Nation of Islam leader Louis Farrakhan, yet this time the concerns seem tempered. What changed?
Last week, a man walked into a synagogue in Poway, Calif., and started shooting — after posting about his planned attack on 8chan with a link to a Facebook live stream and a manifesto that included a nod to “meme magic.” He was copying the playbook of the white supremacist who slaughtered 51 at New Zealand mosques and made his massacre go viral. The deadliest attack on Jews in U.S. history, at Pittsburgh’s Tree of Life synagogue, was also Internet-inspired. And other threats and thwarted plots have had roots in the conspiracy machine in which each individual platform serves as a cog.
The link between what people say online and what people do offline has never been clearer, and neither has the need for companies to do something about it. Concerns about censorship still exist, and rightly so: Governments are becoming more eager to get involved in policing Internet speech, and the most aggressive proposals look especially concerning against a backdrop of authoritarian regimes cutting off access to social media altogether. Firms such as Facebook could also overreach. But so far, they are only getting a grip on the most dangerous actors.
“Dangerous” is exactly the word Facebook used in its latest enforcement action, which wipes the presence of the individuals and organizations it has designated from the platform entirely. The change is encouraging: Instead of focusing on narrow violations of its hate speech and harassment policies and then removing those posts or pages, the company is viewing its latest outcasts in the broader context of their role both on its site and in society. A rising chorus has called on countries and companies alike to recognize white supremacist terrorism as terrorism; Facebook’s note that many of those it removed praised or appeared with hate figures is a welcome recognition of the networks that sustain the far right.
There’s much enforcement work to do to ensure that Facebook will not have to ban Mr. Jones again — again. And fringe figures will always be able to find a platform somewhere. But if that platform isn’t Facebook, they will have a harder time hurting people. The company may finally be acknowledging what all those rules are supposed to be for.