
The Oversight Board exists to make us feel like Facebook can govern itself

Its Trump ruling aside, the Facebook Oversight Board has no real power.


When Harvard Law professor Noah Feldman pitched Mark Zuckerberg on the idea of a Facebook Oversight Board, he explicitly framed it as a Supreme Court. From the outside, that framing seemed largely apt when the board handed down its ruling on the suspension of Donald Trump this week. The process had all the appearance of an institution governing itself: Facebook's moderators removed content deemed harmful in a timely manner, and a review board with the mannerisms of a judiciary heard an appeal, issued a ruling and stipulated how Facebook must change to comply with its own policies.

But if this looked like governance, that is precisely the point: The process was little more than theater, a well-tailored spectacle of self-regulation by a company eager to avoid interference from actual governments. The audience for this performance is, in part, Congress, which has spent the first two decades of the 21st century largely abdicating its role in regulating online spaces, though the events of Jan. 6 might finally stir legislative action. It is in Facebook's interest to ensure that doesn't happen, and the best way to do so might be to pass itself off as a sovereign power in its own right. By dramatically submitting to the arbitration of the Oversight Board, it is attempting to achieve just that.

The board, however, cannot address the role Facebook itself plays in helping users like Trump spread harmful information to their sizable audiences. As such, it cannot properly contribute to the governance of Facebook; it can only cloak the company in the semblance of self-regulation.


This is not for lack of trying. The board asked Facebook to respond to 46 questions, and it reports that Facebook declined to answer questions about “how Facebook’s news feed and other features impacted the visibility of Mr. Trump’s content; whether Facebook has researched, or plans to research, those design decisions in relation to the events of January 6, 2021; and information about violating content from followers of Mr. Trump’s accounts.”

If Facebook had complied, the answers to those questions could have transformed a ruling on specific posts into a revelation about the engineering decisions that amplify misinformation. Instead, by keeping the curtain drawn, Facebook continues to obscure how choices made in its code facilitate harm. In the process, it gives away the game: Creating an Oversight Board to function as a court, only to deny that board access to company information relevant to its decisions, undermines the very notion that Facebook can responsibly govern itself.

By design, the Oversight Board can review only content Facebook has already taken down. If a public figure, elected or otherwise, continues to spread misinformation on the platform without Facebook taking action, the board is powerless to intervene. Because it functions solely as a court of appeals, the board effectively leaves all proactive decisions in Facebook’s hands. It cannot, for example, recommend limiting posts that might incite readers to violence against their neighbors, as in the Facebook-facilitated violence against the Rohingya people of Myanmar, which a U.N. official called “a textbook example of ethnic cleansing.”

“In proportion to the massive amounts of rankings, groups and advertisements, the Oversight Board is only empowered to offer its thoughts on a fraction of cases,” wrote Marietje Schaake, president of the CyberPeace Institute. “Meanwhile, as we debate the Trump ban, insurrectionists still post on the site while anti-Vax disinformation flourishes.”


To truly address the harms it facilitates, Facebook would need systems that go beyond deciding whether it was right to take down posts and ban accounts. It would also need public mechanisms for evaluating how far posts that remain up can spread and whether accounts that stay active can continue to disseminate information. This is, in effect, what Facebook's moderators already do; but without public accountability for those routine decisions, the company keeps attention off its day-to-day governance, letting the Oversight Board draw focus to high-profile appeals instead.

In the absence of external governance, such as Congress might mandate, this governance theater at least suggests what responsible regulation for Facebook might entail: Companies should be encouraged to adopt norms of responsible conduct, to enforce policies consistent with the law and the public interest, and to open themselves to external review of opaque and hasty decisions.

Nevertheless, I remain skeptical that the path the Oversight Board has adopted will yield meaningful impact beyond those directly involved in its high-profile rulings. Absent a legal obligation imposed by an actual government, it is hard to see how any decision from the board will cause Facebook to alter its behavior.

The semblance of governance offered by the board makes it all the less likely that any external authority will intervene. Until something changes, the machinery of engagement will continue to whir away invisibly, its fresh harms inscribed on people far from the Zooms where decisions are made.
