Facebook first suspended Trump for encouraging violence during the Capitol riot Jan. 6, before saying the next day that the ban was “indefinite.” Two weeks later, it referred the case to its 20-member Oversight Board, which is largely independent and funded by the social network. But the board on Wednesday handed the decision back to Facebook, recommending that it either permanently ban or reinstate the president within six months — and write clear rules to explain the rationale.
“They cannot invent new unwritten rules when it suits them,” board co-chair and former Danish prime minister Helle Thorning-Schmidt said in an interview.
In the ruling, the board agreed that Trump’s comments on the day of the insurrection “created an environment where a serious risk of violence was possible.” The board pointed to the president’s references to the mob members as “patriots” and “special,” and his instructions to them to “Remember this day forever.”
But it took issue with Facebook’s “indefinite” suspension of Trump, saying it was “vague and uncertain.”
The board also recommended that Facebook publish a report explaining its own role in fomenting the Jan. 6 attack.
Following the decision, Facebook emphasized that Trump would remain off the social network for the time being, in accordance with the board’s order. But the company also seemed noncommittal in its response.
“We will now consider the board’s decision and determine an action that is clear and proportionate,” Nick Clegg, Facebook’s vice president of global affairs and communication, said in a blog post Wednesday, after canceling all planned interviews. “In the meantime, Mr. Trump’s accounts remain suspended.”
Trump said in a statement that Facebook, Twitter and Google embarrassed the United States.
“Free Speech has been taken away from the President of the United States because the Radical Left Lunatics are afraid of the truth,” Trump said in the statement. “… These corrupt social media companies must pay a political price, and must never again be allowed to destroy and decimate our Electoral Process.”
Twitter and Google’s YouTube followed Facebook in suspending Trump following his comments. Twitter’s ban is permanent, while YouTube’s is indefinite. Facebook and Twitter declined to comment on Trump’s statement. YouTube did not have immediate comment.
Critics are already calling into question the legitimacy and value of the Oversight Board, an experimental entity set up by Facebook to help hold it accountable in making such calls. The board is only able to offer Facebook recommendations on its policies, which the social network can take or leave, and Facebook has a hand in selecting members. Because of the board’s limited powers, some critics see the body as a distraction from developing new laws or government oversight of social media companies.
“The practical effect of this decision will be that Facebook — and possibly other platforms that might have been watching the Oversight Board for unofficial guidance — will have to continue to grapple themselves with the problem of what to do about political leaders who abuse social media to spread lies and incite violence,” Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, said in a statement.
In Washington, Democrats have already promised to use their new powers to update existing antitrust laws, crack down on misinformation and pass federal privacy legislation. Facebook is also the target of a landmark Federal Trade Commission lawsuit, which focuses on the company’s practice of buying up rivals.
The board’s push for Facebook to create more transparent rules and consistently follow them echoes the criticisms of lawmakers on both sides of the aisle. Wednesday’s ruling renewed calls for the government to take on a greater regulatory role and to continue with efforts underway in the United States to limit the social media giant’s power. Some also called into question why the decision focused almost solely on one person, not on the powerful algorithms that spread hateful content virally.
“Policymakers ultimately must address the root of these issues, which includes pushing for oversight and effective moderation mechanisms to hold platforms accountable for a business model that spreads real-world harm,” Sen. Mark R. Warner (D-Va.) said in a statement.
Allies of Trump swiftly condemned the decision.
“Facebook’s status as a monopoly has led its leaders to believe it can silence and censor Americans’ speech with no repercussions,” said Rep. Ken Buck (Colo.), the top Republican on the House Judiciary antitrust subcommittee. “Now more than ever we need aggressive antitrust reform to break up Facebook’s monopoly.”
Critics have long argued that Facebook should have banned Trump at different points throughout his presidency, saying that his inflammatory language and frequent promotion of misinformation — about the coronavirus in particular — constituted an abuse of his office and of Facebook’s own community standards. But Facebook chief executive Mark Zuckerberg felt strongly that politicians should be given wide latitude because their speech was in the public interest.
Facebook referred its decision about Trump to the Oversight Board shortly after it banned Trump in January. The board, which is less than a year old and had yet to decide a case at the time, was first conceived by Zuckerberg in 2018 as a way to outsource the thorniest content moderation decisions without having the government intervene.
Over the past few months, members spanning time zones from Taiwan to San Francisco connected on videoconference calls to pore over more than 9,000 public comments on the matter, including from Trump himself, according to the board.
In a letter submitted to the board on Trump’s behalf, asking the board to reconsider the suspension, Trump’s allies said it was “inconceivable that either of those two posts can be viewed as a threat to public safety, or an incitement to violence.”
In its decision, the board faulted Facebook for making “arbitrary” decisions on the fly, and said that the company had no published criteria for suspending a user indefinitely. Facebook’s normal penalties are removing a comment, a time-limited suspension or disabling the user’s account permanently, the board said.
“It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored,” the board said.
Facebook currently exempts political figures from some hate-speech rules on the grounds that those comments are newsworthy.
The board took issue with that exemption, noting that “it is not always useful to draw a firm distinction between political leaders and other influential users,” and that such users have greater power than others to cause harm.
The ruling appeared to nudge Facebook in the direction of being more aggressive when making determinations of what counts as imminent harm.
“Facebook must assess posts by influential users in context according to the way they are likely to be understood, even if their incendiary message is couched in language designed to avoid responsibility,” the board wrote.
A lower threshold for harm — if Facebook were to adopt it — could translate to more-severe penalties for world leaders whose harmful statements have been met with mild penalties from Facebook.
“This doesn’t begin and end with Donald Trump,” said Nathaniel Persily, a Stanford Law School professor. “They’ve got all kinds of elections coming up around the world.”
While the board told Facebook the risk of harm should outweigh free-speech considerations, it didn’t give Facebook any new guidance on how to write new policies or treat political figures who may come right up to the line.
Critics said the ruling was unlikely to have any real impact.
“The Oversight Board is not an oversight board, it’s a PR device. They too have failed to contribute anything of even modest value. We’re back to square one, facing the void,” said Shoshana Zuboff, a member of a group of Facebook critics self-dubbed “The Real Facebook Oversight Board” and author of “The Age of Surveillance Capitalism.”
Under U.S. law, social media platforms are not held legally responsible for policing unwanted or even much illegal content on their services, with some exceptions for copyright issues and child pornography. But in recent years, Silicon Valley has dealt with a series of crises over enabling disinformation and the spread of extremism from both domestic and international forces, and the blowback has forced the companies to invest significantly in content moderation. That investment picked up in 2020, when companies including Facebook and Twitter launched stronger policies aimed at combating misinformation surrounding the election and the coronavirus.
Those forces helped prompt the creation of the Oversight Board.
“You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” Zuckerberg told Vox in a 2018 interview.
Facebook then embarked on a months-long process of collecting feedback on how to design the board and consulting more than 2,000 people in 88 countries. It released the rules and selected its first members in 2020. The board was a lightning rod for controversy during its formation, as Facebook’s critics warned its authority was too limited and that the company’s role in picking board members compromised its independence.
The board issued its first decisions in late January, a week after Facebook announced it would refer the high-profile Trump case. The initial round of decisions — which touched on alleged hate speech, coronavirus misinformation and references to dangerous organizations — signaled that the board would demand greater transparency from Facebook about its policies. Before Wednesday’s decision, the board had overturned Facebook’s decisions six times, upheld them twice, and was unable to complete a ruling once.
In the board’s nearly 12,000-word document, it said it had asked Facebook 46 questions and that the company declined to answer seven of them. Those included one about Facebook’s design and algorithms, and the role those potentially played in the spread and visibility of Trump’s posts. Facebook also declined to answer a question about whether a suspension or deletion would have an impact on its ability to target ads.
Heather Kelly and Rachel Lerman contributed to this report.