FACEBOOK’S HIGH court was constructed by Facebook, for Facebook, which has left many wondering: Will the Silicon Valley leviathan’s oversight board hold chief executive Mark Zuckerberg’s company accountable, or will it shield it from accountability instead?

Facebook announced 20 inaugural members of its content moderation review corps this month. Critics have found little to quibble with in the lineup of pseudo-jurists; they’re respected professionals committed to human rights. Rather, skeptics believe the company has applied a fig leaf of legitimacy that lets it avoid responsibility for controversial calls without actually changing much.

Some observers have pointed out all the problems with Facebook that the oversight board doesn’t solve: the privacy harms of a business model that critics have come to call surveillance capitalism, for example, or insufficient disclosure obligations in elections messaging and advertising. These are real problems, but the responsibility for addressing them shouldn’t lie with the board, and it shouldn’t even lie with Facebook: Governments ought to set rules of their own, and this country’s legislature has fallen short.

The narrower area where the board’s involvement makes the most sense is precisely the one it has been tasked with monitoring: online speech. Facebook occupies a tenuous space between plain old platform and public square. The government cannot police speech here without encroaching on cherished liberties, yet the alternative is to allow a private business to control civic life without answering to anyone besides shareholders. The oversight board is supposed to split the difference. That’s why skeptics’ arguments in the speech realm are worth paying attention to.

One concerns scope. For the time being, the board can rule only on material that has been removed from Facebook, not material that has remained on the site despite protestations (unless Facebook makes a referral). Yet it is exactly these “leave-ups” that draw the company the most flak, such as a video of House Speaker Nancy Pelosi (D-Calif.) distorted to make her appear drunk, and hate speech in Myanmar that helped lead to genocide.

Facebook has said this limit exists “in line with Facebook’s commitment to free expression,” and there is indeed something noble in creating a body empowered only to protect speech and not to curtail it. But the company now emphasizes that the holdup is the result of technical limitations, and that it’s eager to expand the board’s ambit even to allow it to explore how algorithms boost or bring down content.

Users — and the board, too — ought to hold Facebook to this commitment. Allowing the committee at least an advisory role on these cases, similar to its advisory role on broader policy beyond individual pieces of content, would do a lot of good. The essential job for the new institution is to bring out from behind closed doors determinations about who can say what on the Internet’s largest, most influential platform — to ensure that those determinations are based on principle rather than profit. That should involve all sorts of speech on Facebook, not just the kind the company chooses.