Facebook committed to replying to the board’s recommendations within 30 days. Those recommendations included asking the company to conduct a thorough review of its role in the Jan. 6 Capitol insurrection and to adopt more transparent policies around public figures’ accounts. Facebook ultimately declined to release such a report.
The company’s decisions around how it handles political leaders are being closely watched by politicians around the world, as well as social media researchers and other tech companies that similarly banned Trump in January.
Facebook was the first major social media platform to suspend Trump indefinitely in the wake of the Jan. 6 attack on the Capitol, and its decision was met with praise by many critics who believed the company had let him dodge its normal rules and policies. But others decried the decision as “censorship” and said it set a dangerous precedent for how world leaders communicate online.
“We know that any penalty we apply — or choose not to apply — will be controversial,” said Nick Clegg, Facebook’s vice president of global affairs, in a post announcing the two-year ban. “There are many people who believe it was not appropriate for a private company like Facebook to suspend an outgoing President from its platform, and many others who believe Mr. Trump should have immediately been banned for life.”
Facebook, the Oversight Board and Trump have periodically outlined their thinking in blog posts and news releases and on social media. Here’s everything you need to know about the board and its role on the ban.
Frequently Asked Questions
- What is the Facebook Oversight Board?
- Will Trump be allowed back on Facebook?
- How do the board’s decisions work?
- How did we get to this point?
- What does this mean for other tech companies?
- What does this mean for Trump?
What is the Facebook Oversight Board?
The Facebook Oversight Board is a group created by Facebook to which users can appeal important company decisions. Though it is funded by a $130 million trust created by Facebook, the board says it is an independent and neutral third party. Its goal is to review moderation decisions made by the company and decide whether they were “made in accordance with its stated values.”
First proposed in a 2018 blog post by Zuckerberg, the Oversight Board is the company’s attempt to have an outside authority handle difficult decisions. It formally started deliberating in October 2020 and has been called “Facebook’s Supreme Court,” though it has no government affiliation or legal standing. The board is currently made up of 20 people from around the world with expertise in areas such as journalism, misinformation, freedom of speech and extremism, though only 19 participated in this case. The original goal was 40 members in total, and more will continue to be added.
The board was created to appease critics who thought power over the world’s largest social network and its 3.45 billion monthly users (including Facebook, Instagram and WhatsApp) was too concentrated in a group of Facebook executives, specifically Zuckerberg. However, critics say it outsources individual decisions without creating meaningful internal change and shields Facebook from responsibility for difficult decisions.
Before this, the board had ruled on Facebook moderation decisions around blackface, threats of violence and covid-19 misinformation. As of its decision on Trump, it had overturned Facebook’s decisions six times, upheld them three times and been unable to complete a ruling once.
Will Trump be allowed back on Facebook?
Facebook says its ban will last two years, until just before the 2024 presidential election. However, the company will review his case and consult with experts to ensure “the risk to public safety has receded” before reinstating him, according to its website.
The board said that Facebook had six months to review its suspension of Trump, which would have given it until November. The company announced its decision sooner.
Facebook first referred the decision to the board on Jan. 21, saying Trump’s suspension would remain in place during deliberations. In addition to asking the board to rule on the ban, Facebook asked it for any “observations or recommendations” on how to handle other world leaders on the site. According to the rules established when the board was set up, Facebook has 30 days to reply to recommendations.
The decision on Trump’s ban was the board’s most high-profile to date. If the board had overturned Facebook’s ban, the company would have had seven days to unlock the page and return control of it to Trump. There is no way for Facebook or Trump to appeal the decision.
How do the board’s decisions work?
First, a case has to be referred to the board, either by Facebook itself or through direct submissions from users who disagree with Facebook taking down their content or leaving someone else’s up. The board selects a panel of five of its members, including at least one person from the country where the case is based. Panelists are not named publicly so they cannot be lobbied. Members have all gone through training for the job, which is not full time, and approach the decisions as precedent-setting legal cases, even though the process is not part of any legal system.
The panel meets over Zoom and considers Facebook’s own lengthy Community Standards bylaws and consults with outside experts and organizations. The affected account holder can also submit a statement, and there is a public commenting period for any regular people to weigh in. The Trump case received more than 9,000 public comments, almost as many as all the board’s past cases combined.
The panel tries to reach a unanimous decision, but technically it needs only a simple majority. It then takes its decision and presents it to the full board, which can overrule the finding if a majority of board members disagree with it.
In the Trump ruling, the board’s decision had two parts. First, it upheld Facebook’s decision and directed the company to review the length of the suspension. Its ruling on whether to uphold Facebook’s ban is binding, according to the board’s bylaws. The board also goes further than that simple ruling and makes broader policy suggestions to Facebook. Those suggestions — which can include asking the company to add policies around issues like hate speech or bullying, or to clarify whether world leaders get different treatment — are not binding, and the company does not have to follow them or take them into consideration. However, Facebook has so far been open to the suggestions. In its response to the Trump decision, Facebook said it would fully or partially implement 16 of the board’s 19 recommendations.
If Trump is allowed back on Facebook after the two-year ban, he could still run afoul of its policies and be removed again over future posts. Along with the Trump ban, Facebook changed its rules for politicians and will no longer automatically give them a pass when they break the company’s hate speech rules.
Trump said in an emailed statement the ruling was an insult to the people who voted for him last year. Facebook “shouldn’t be allowed to get away with this censoring and silencing, and ultimately, we will win,” he added. “Our Country can’t take this abuse anymore!”
How did we get to this point?
Tension had been building between Trump and Facebook for nearly six years before the company indefinitely suspended him after the Jan. 6 attack on the Capitol. In 2015, then-candidate Trump posted a video calling for a ban on Muslims entering the United States. In a controversial decision, Facebook declined to remove it. Instead, that internal decision eventually led to the company’s “newsworthiness” policy, which created an exception for some posts that violated guidelines to nevertheless remain online because they carried public-interest value.
Facebook’s policies were constantly tested throughout 2020, when Trump posted misleading information about the coronavirus and bombastic statements about protests taking place across the country. In a May 2020 post, Trump referred to protesters as “THUGS” and wrote, “Any difficulty and we will assume control but, when the looting starts, the shooting starts.”
Though Twitter was Trump’s go-to social media site, the former president also regularly used Facebook to spread messages and often cross-posted on both Twitter and Facebook.
Twitter labeled a similar tweet on its site with a public interest notice, but Facebook left the post untouched. Employees and advocates called for Facebook to take harsher action, and in June Zuckerberg announced the company would label posts that violated hate speech and other policies, even from politicians. It would also remove posts that attempted to incite violence or suppress voting, with no newsworthiness exception.
Facebook did begin labeling some of Trump’s posts, but it faced mounting pressure from critics who said it wasn’t doing enough, as well as from some conservative politicians and pundits who called its actions “censorship.”
The breaking point came Jan. 6 when Trump posted a video on Facebook and Instagram, and other social media sites, telling rioters to go home. But in the video he also said, “We love you, you’re very special.” Facebook suspended the president for 24 hours. The next day, Zuckerberg announced the suspension would be indefinite, saying, “We believe the risks of allowing the President to continue to use our service during this period are simply too great.”
Later that month, Facebook said it would refer the decision to the Oversight Board to make the final call. “Many argue private companies like Facebook shouldn’t be making these big decisions on their own,” the company wrote at the time. “We agree.”
What does this mean for other tech companies?
Twitter and YouTube took similar action on Trump’s account soon after Facebook. Trump’s account remains available on YouTube, but he’s blocked from uploading new videos. YouTube’s suspensions usually last only a week for a “first strike,” but the company will keep Trump’s in place until the “risk of violence has decreased,” CEO Susan Wojcicki said in March. YouTube’s analysts will determine when the risk is low enough by looking at government statements, whether there are police buildups and the level of violent rhetoric elsewhere on YouTube, Wojcicki said.
Trump had millions of views and followers on YouTube, but the platform wasn’t used as directly as Twitter was. Instead his campaign used the site to post official videos that were shared around the Web by supporters. The campaign also bought prime ad space on YouTube’s homepage the week of the election.
Unlike with Twitter, Trump probably did not personally control the YouTube channel.
Twitter, on the other hand, has made no bones about its plans: Trump is banned permanently, regardless of what other companies decide or whether he runs for office again.
“The way our policies work, when you’re removed from the platform, you’re removed from the platform, whether you’re a commentator, you’re a CFO, or you are a former or current public official,” Twitter CFO Ned Segal said during an interview with CNBC in February.
What does this mean for Trump?
Trump has lost much of his direct online communication with supporters since the major social media networks kicked him off in January. He has still been sending out news releases and messages to supporters, however, as well as appearing in television interviews. Since the board’s May announcement, Trump has shut down a new website he had launched just for posting updates.
In an interview on Fox News Channel last month, Trump praised his news release strategy.
“It’s better than Twitter, much more elegant than Twitter,” he said, according to the Hill. “And Twitter now is very boring.”
But being reinstated on Facebook would have been a major win for Trump to reclaim his personal brand of communication — a way to speak directly to supporters in his own, unfiltered words.
Trump senior adviser Jason Miller confirmed to Fox News that the former president is building his own social media network. Creating a social media site with broad appeal is a significant technological, social and financial undertaking. Even if the network successfully launches, Trump will have a tough time building a similar audience from scratch. The former president had more than 88 million followers on Twitter when his account was banned.
Cat Zakrzewski, Gerrit De Vynck and Elizabeth Dwoskin contributed to this report.