FACEBOOK HAS come under fire for its speech policies from multiple players in multiple nations for manifold reasons — sometimes for taking down too much content, other times for not taking down enough. Now, Mark Zuckerberg is asking for someone to tell him what to do instead. That’s an understandable but potentially dangerous request.

Some have interpreted Mr. Zuckerberg’s calls for greater regulation of content moderation, along with comments from Facebook’s head of global affairs this past weekend, as requests for worldwide standard-setting on speech. But any international rules would be shaped in part by repressive regimes — putting the Internet’s future in the hands of autocrats. No one set of rules can work for all countries, and people’s ability to express themselves online should not become a geopolitical bargaining chip.

More likely, Facebook and its peers will be regulated country by country. That is more reasonable, though there are risks there, too. Australia has proposed throwing executives in jail if they let violent videos linger on their sites. Germany fines companies for illegal hate speech. Britain is considering hefty fines for “harmful content.” What “harmful” means is unclear — and that’s precisely the problem.

It is one thing for U.S. legislators to hold companies accountable for acting recklessly with regard to illegal content, such as incitements to violence, though that is the sort of regulation Mr. Zuckerberg surely does not want. Letting legislators tell tech companies to ban even some categories of legal speech, on the other hand, would almost certainly violate the First Amendment in the United States. Elsewhere, it would be a serious threat to free speech.

Facebook’s head of public policy clarified that Mr. Zuckerberg does not want the U.S. government to set speech rules, even if other countries might. Instead, he wants a third-party industry group to take charge. The threat is lower, but it could still be a way for Facebook to duck responsibility for its outsize role in society.

That is not the right way to go. Facebook already has rules. They’re called terms of service. The platform should own them, rather than ask someone else to. That will require rethinking what’s acceptable in response to tragedies such as the New Zealand shootings, and it will require working with academics, advocates and anyone else who has an interest in tending the world’s information landscape. It will require transparency reports and appeals processes, like those Facebook is already refining. And, of course, it will require taking public positions on touchy topics and accepting the criticism that comes with them. For platforms such as Facebook, that’s part of the job.