In theory, this is a smart solution to Facebook’s moderation woes. Mr. Zuckerberg has long seemed uneasy with his ability to unilaterally decide what ideas have a place on his platform. Facebook’s court would take determinations that affect users all over the world out of Menlo Park and put them under the control, ideally, of a diverse slate of informed representatives. Public rulings from the council would provide more transparency about Facebook’s choices, as well as the thoughtful rationales users deserve.
The task ahead of Facebook now is turning this potential into practice. That will require, first, figuring out the court’s structure and staffing. A larger collection of decision-makers who convene in panels, as opposed to a series of regional councils, would allow Facebook to ensure geographic expertise on each case without sacrificing consistency. Facebook should open a nomination process to fill these slots with freedom-of-expression experts across society — from lawyers to publishers to human rights advocates to academics. Ensuring true independence will be tricky; Facebook must craft contracts with periods of tenure during which members cannot be removed without public reason.
Facebook must also consider how the court will choose its cases. If all appeals of lower-level removals are referred to the court, its members will be overwhelmed sorting through spam. A petition system by which users can apply directly with a fleshed-out argument might make more sense. The court should prioritize high-profile and high-impact cases, and it should be empowered to access any necessary evidence as well as solicit outside information.
Most important will be the code the body operates by. Facebook has considered basing rulings on its community standards, but the company can amend those at any time. Better to devise a constitutional document that is less easily amended. This approach would have the added benefit of forcing Facebook to articulate at a high level the principles it believes should guide it: For instance, is the purpose of removing disinformation only to prevent real-world physical harm? What role do local societal standards play in assessing whether something qualifies as hate speech? Facebook should then clarify its terms of service according to the court’s rulings, training staff moderators to abide by any alterations.
At best, this high court of the Web would allow more people a voice in some of the most significant decisions the company makes. At worst, it would allow Facebook to disavow responsibility for controversial choices about content without giving up meaningful control.