When Facebook banned Trump on Jan. 6 for social media posts encouraging the mob that stormed the U.S. Capitol, Zuckerberg turned the hard decision over to the company's newly formed independent panel, the Oversight Board, for review, hoping it would make the final determination.
But on Wednesday, the 20-member panel punted the decision back to Facebook, recommending the company decide within six months whether to permanently ban or restore Trump’s account. He is currently suspended “indefinitely,” a one-off penalty outside Facebook’s usual rules.
The board, set up to act as a “Supreme Court-like” body to police Facebook’s content decisions, also scolded the company for trying to pass the decision off to it.
“In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the members wrote in a sharply worded ruling. “The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”
The long-awaited decision by the board Wednesday was widely expected to settle Trump’s fate on the site, as well as provide guidance to other social media companies like YouTube that have also suspended Trump’s accounts. It was the first major test for the Oversight Board, which has been touted by some lawmakers and experts as a potential role model for future regulation and criticized by others as a way for Facebook to front-run the government by taking matters into its own hands.
Instead, those questions are pending. Nick Clegg, Facebook’s vice president of global affairs and communications, said in a blog post Wednesday that the company will “consider the board’s decision and determine an action that is clear and proportionate.” Because the board’s recommendation that Facebook either restore Trump or block him permanently is nonbinding, it’s unclear whether the company will follow it.
Some experts said the board made the right move by not allowing Facebook to shirk responsibility.
The board “is blowing the whistle on the company’s attempt to outsource responsibility for making this difficult and controversial decision,” said Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights. “The board refused to play that deflection game and demanded that Facebook clarify its rules and follow them when deciding Trump’s long-term status on the platforms.”
Others said it was the board refusing to do its job.
“Their role is to constrain Facebook’s, and Mark Zuckerberg’s, discretion,” wrote Evelyn Douek, a Harvard Law School lecturer, in a Lawfare article Wednesday. “The [Facebook Oversight Board] has declined to do that almost entirely, and did not even provide meaningful parameters of the policies it calls on Facebook to develop.”
Giving the final call on Trump back to Facebook is unlikely to result in the fair and even decision the board members are calling for, said Joan Donovan, a disinformation and extremism researcher at Harvard University.
“I can imagine that what ends up happening is very different from what the board imagined,” Donovan said. “The board completely dodged in the sense that if they were really on the side of society, if they were really on the side of the people, they would have suggested to Facebook that they should ban Donald Trump.”
Either way, the back-and-forth means the most complicated questions about the decision are left on the table. That includes how social media companies should draw the line between speech and violence or moderate political figures going forward.
“The Board insists that Facebook review this matter to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform,” the written decision states. “Facebook must complete its review of this matter within six months of the date of this decision. The Board also makes policy recommendations for Facebook to implement in developing clear, necessary, and proportionate policies that promote public safety and respect freedom of expression.”
The board members took issue with the characterization that they were not fulfilling their mandate to oversee Facebook.
“I don’t think we’re just passing the buck back to Facebook here,” board co-chair and former Danish prime minister Helle Thorning-Schmidt said in a media call Wednesday. “What we are saying is, it can’t be left up to Facebook to just choose their own penalty for their users. They have to follow their own rules.”
The decision that caused the rift was announced Wednesday after months of deliberation. Facebook’s Oversight Board said it was upholding the company’s ban on Trump.
In the ruling, the board agreed that Trump’s comments on the day of the Capitol insurrection “created an environment where a serious risk of violence was possible.” The board pointed to the president’s references to the rioters as “patriots” and “special,” and his instructions to “Remember this day forever.”
But it took issue with Facebook’s “indefinite” suspension of Trump, saying it was “vague and uncertain.” The board said Facebook’s rationale for the original ban was unprincipled and arbitrary, and that “indefinite” bans were not even written policy at the time the Trump decision was made.
When sending the case back to Facebook, the board recommended that the company come up with a better, clearer rationale for either banning or restoring Trump’s account.
Trump, for his part, said in a statement Wednesday that free speech was taken from him because “Radical Left Lunatics are afraid of the truth.”
Trump advisers viewed the Oversight Board decision as disappointing because it could hurt fundraising prospects going forward, said one adviser who spoke on the condition of anonymity to speak freely about internal deliberations.
News of the decision was rare cause for optimism among Facebook’s workforce, which has been demoralized by the company’s many scandals and history of giving passes to Trump. Some employees praised the decision as “thoughtful and a step forward for this governance model” in the employee chat system, according to documents viewed by The Washington Post. Others posted hand-clapping emojis.
But there was also worry about the path forward. Some wished the Oversight Board had given more guidance and expressed concern about what would happen now that the decision had been kicked back to Facebook. Others pointed out that regular people had been permanently banned from Facebook for doing far worse things.
“I really hope this happens, and that it is done honestly and in the open,” one said.
The board’s decision was also seen as a possible bellwether for other social media companies. YouTube has also suspended Trump’s account indefinitely but says it could reinstate him when it judges the threat of him inciting political violence has abated. The Google-owned video site froze Trump’s channel after it uploaded videos of him doubling down on comments he had made to supporters before the Jan. 6 Capitol attack. A first violation of YouTube’s rules usually results in a one-week suspension, but Trump’s account has been locked for about four months now.
“At YouTube, we have long-established community guidelines that govern what can stay on our platform and apply to everyone,” said YouTube spokesperson Ivy Choi. “The Donald J. Trump channel violated our incitement-to-violence policy and was suspended in accordance with our three-strikes system.”
Trump wasn’t as active on YouTube as he was on Facebook and Twitter, but the video platform was a major part of his campaign’s advertising strategy. Trump bought millions of dollars’ worth of ads on YouTube throughout the election and secured the coveted homepage banner ad during the Democratic National Convention and the week leading up to the election.
Twitter, which got out of the business of political advertising before the 2020 election, banned Trump permanently after the Capitol insurrection.
Facebook is also a critical part of many political campaigns, offering a way for candidates to connect with potential voters and to raise funds.
“Twitter was probably a greater personal loss [to Trump], but Facebook is a more important professional loss,” said social media researcher and Clemson University professor Darren Linvill. “It’s central to any campaign.”
Zuckerberg first publicly floated the idea of creating the Oversight Board in 2018, just as lawmakers around the world were mulling ways to regulate the social network. Critics said the company lacked any accountability for its moderation decisions, from handling hate speech to how it applied its newsworthiness exemption to leaders like Trump. They said there were no checks on Zuckerberg’s power to determine what could be shared with billions of people. Each major decision could have wide-reaching social consequences, the kind that are regularly handled by laws and courts around the world. So Facebook decided to make its own court, outside of any existing legal system, staffed by paid experts and with no legal authority to force Facebook to change.
“You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” Zuckerberg told Vox in a 2018 interview.
The actual board didn’t come together until last year. It started working on its first decision in the fall, its 20 members meeting over videoconference. Since late January, the board has heard 10 cases and ruled nine times, filing long legal-like documents that contain a binding ruling on whether to uphold a Facebook moderation decision, along with policy suggestions that Facebook can choose to implement or not. In six cases, it has gone against Facebook’s initial determination.
Samuel Woolley, an assistant professor and the director of a propaganda research team at the University of Texas at Austin, said Wednesday’s decision is further indication that the government needs to step in and regulate this space.
The board was right to insist on more transparency, he said.
“The board is essentially saying, ‘We can’t do our jobs if Facebook doesn’t have clear internal regulations, and we’re not going to be a scapegoat for Facebook’s decisions.’”
Gerrit De Vynck, Rachel Lerman and Josh Dawsey contributed to this report.