THE PRIME MINISTER of New Zealand, Jacinda Ardern, has said she does not think anyone would argue that the perpetrator of the Christchurch massacre should have been able to livestream mass murder. Maybe that question elicits something close to unanimity — but in trying to make the Internet safer, she will find few other points of consensus. And for good reason.

Ms. Ardern is meeting in France this week with President Emmanuel Macron to finalize the “Christchurch Call,” a pact that asks companies and countries to confront violent and extremist content online. The end is a noble one: ridding the Web of terrorist content that puts people both on and off social media sites at risk. But what are the means? The era of an unregulated Internet is ending, but a regulated one will necessarily sacrifice some freedom of speech for safety. The question is what the terms of that trade will be.

Facebook, Google and Microsoft have said they plan to sign the nonbinding pledge, and several nations, including Britain and Canada, have also signaled support. The companies will reportedly promise to audit their algorithms, share data and enforce their existing terms of service; the countries will promise to craft laws that ban objectionable content. Ms. Ardern has stressed that she hopes to skirt the hate speech debate by focusing on violent and terrorist material alone. But figuring out what counts as violent material is itself part of that broader debate, and countries are forging ahead with legal regimes that will affect everyone who uses the Internet — not only mass murderers.


Britain has put forth a proposal threatening companies with unprecedented fines for failing to take “harmful” content off their platforms, but much of that content is not illegal in the country. Australia has imposed criminal penalties on firms that do not “expeditiously” remove “abhorrent violent material,” which could lead to companies proactively screening every post according to an overly restrictive algorithm. France has the more flexible idea of appointing a regulator to verify that companies have effective systems in place to remove illegal hate speech, but inviting a single government appointee to declare what should stay and what should go could still be chilling.

It’s easy to say murder should not be streamed live on the world’s biggest social media platforms, but it’s much harder to stop that without also stopping some of what has made the Internet invaluable. Harsh speech regulation not only offers cover to autocrats seeking not to protect citizens but to repress them; it also creates a less free environment even in democracies trying to do the right thing. The Christchurch Call asks the world to acknowledge there is a problem. It is just as important to acknowledge there are no simple solutions. Countries may have reason to clamp down on an unrestrained Internet, but they should also be thoughtful, honest — and cautious — about what they are giving up.
