Facebook has declared sovereignty.
This is not a missive from a dystopian future, but rather a fair reading of the company’s recent announcement that it will move forward with creating a supreme court. Facebook will select “judges” from experts around the world with the authority to overturn decisions about what content or accounts are approved or removed from the platform. The court also may help Facebook shape the policies under which those content-moderating decisions are made.
Facebook isn’t calling its court a court anymore, as chief executive Mark Zuckerberg did when he first floated the idea — probably because a firm under scrutiny every day for abusing its colossal power over people around the world does not want to invite comparison to governments. But there is no better word to describe the structure, and the choice to set up the commission reveals some unnerving realities about Facebook’s role in society.
Facebook’s empowering of an independent group to rule on its most controversial content moderation may be in part a way to avoid responsibility. Next time an Alex Jones gets an eviction notice, Zuckerberg won’t have to go on a podcast attempting to explain his decision and accidentally end up defending Holocaust deniers. Instead, he can simply say “wasn’t me.”
But that doesn’t take away from a striking shift: A company that once protested that it was merely a platform and not a publisher is now acknowledging that its role in society is so outsize, and its decisions about who can say what so consequential, that it must establish a check on its own dominance.
Call it a court or, as Facebook now does, an oversight board: by adopting a structure of government, the company is essentially admitting it has some of the powers of a government. The question is what that means for the rest of us.
Facebook has already been legislating, as Kate Klonick writes in the Harvard Law Review, as its general standards for acceptable conduct have evolved into concrete and complicated terms of service. Violations of those terms are increasingly answered by speedy takedowns, sometimes by human moderators and other times by algorithms. This is enforcement, an executive branch function. And where those violations are disputed, or where those terms of service are unclear, higher-level actors within the company issue a ruling. There’s your judiciary.
In a real democratic government, those three functions are separate, and the judiciary is independent. This may not have mattered much at first for Facebook. After all, there’s no need to mimic the internal processes of a government if a company doesn’t have government-like power over its users' lives. But Facebook has become essential to everyday communication in countries throughout the globe. Losing access has consequences. In this context, the company’s announced court system is a move toward much-needed separation of powers, away from Zuckerberg and other head honchos.
Reassuring, in one sense — but also worrying. Facebook is still very different from the actual democratic governments we know and, on good days, love. Those differences lead to a whole host of problems an oversight board cannot solve.
For one thing, Facebook is not governing a single people. It is affecting the lives of a whole series of people in nations throughout the world, and those people have drastically disparate conceptions of appropriate public discourse. Facebook’s draft charter says it hopes to enshrine the values of “voice, safety, equity, dignity, equality and privacy.” It’s hard to argue with those. But what they mean to Zuckerberg is likely not what they mean to an activist in India, or a shop-owner in Britain, or anyone else anywhere else.
Facebook’s decisions can fundamentally alter the speech ecosystem in a nation. The company does not only end up governing individuals; it ends up governing governments, too. The norms Facebook or its court chooses for this pseudo-constitution will apply everywhere, and though the company will strive for sensitivity to local context and concerns, those norms will affect how the whole world talks to one another.
That’s a lot of control, as Facebook has implicitly conceded by creating this court. But the court alone cannot close the chasm of accountability that renders Facebook’s preeminence so unsettling. Democracy, at least in theory, allows us to change things we do not like. We can vote out legislators who pass policy we disagree with, or who fail to pass policy at all. We cannot vote out Facebook. We can only quit it.
But can we really? Facebook has grown so large and, in many countries, essential that deleting an account seems to many like an impossibility. Facebook isn’t even just Facebook anymore: It is Instagram and WhatsApp, too. To people in many less developed countries, it is the Internet. Many users may feel more like subjects than customers, in that they cannot just quit. But they are not being governed with their consent.
No court — or oversight board — can change that.