Nearly a quarter of former president Donald Trump’s 6,081 Facebook posts from Jan. 1, 2020, to Jan. 6, 2021, contained extremist rhetoric or misinformation about the coronavirus, the election or his critics, according to a new analysis by the left-leaning group Media Matters for America.
The research demonstrates in stark numbers just how often Trump came up to the line of Facebook’s rules — if not crossed it — yet was given a pass by the company.
Facebook spokesman Andy Stone pointed out that not all forms of misinformation related to the election or covid-19 were banned by the company, and that Facebook removed Trump’s posts in the handful of instances where executives found that they violated the social network’s policies.
Facebook appended labels to 506 of Trump’s 1,443 problematic posts. Most were generic, such as “See Election Results,” providing links to authoritative information but giving users no indication that the post was false or misleading. Of the 506, just one was labeled false and another partly false, Media Matters said.
In all of 2020, The Washington Post counted the removal by Facebook of just seven posts by Trump and his campaign, four of which were for copyright-related issues. Trump and his campaign shared an account.
Stone said in instances when Trump and others claimed election fraud, for example, the company chose to affix a label directing people to information about the election and voting methods instead of removing the content.
Trump is currently blocked on Facebook, which, along with other social media companies, suspended his account in the wake of the deadly riot at the U.S. Capitol. His status is pending a decision by the oversight board, which has the power to reverse Facebook’s decisions to remove or keep up content. The board is independent and is funded by the social media giant.
Media Matters included the research in a submission to the oversight board on Friday as an argument for why the tech platform — whose binding decision on Trump’s account will be closely watched by other tech giants — should not reinstate the former president. Currently the oversight board is considering two posts by Trump, issued on the day of the Capitol riots, in its decision-making. The posts were submitted to the board by Facebook, which asked it to determine whether it made the correct decision in banning Trump. Facebook asked the board whether those posts and subsequent actions by Trump constituted a glorification of violent events, which is prohibited by its policies. The Jan. 6 riot left four rioters and a Capitol Police officer dead.
The board has been flooded with more than 9,000 public comments on the Trump decision.
Angelo Carusone, president and chief executive of Media Matters, said the board should consider the totality and the overall danger of Trump’s rhetoric on the platform.
“In the neighborhood where I grew up, people put the stop signs in the places where there were the most incidents,” he said. “This is not about two posts. This is about a pattern of misinformation, and when you let it go unchecked, you not only get the kind of circumstances that lead to the 6th, but all the downstream effects, including the amplification of conspiracy theories and lies.”
Last year Facebook said it would ban many forms of misinformation about the coronavirus and coronavirus vaccines, as well as misinformation about the election.
Other companies, including YouTube and Twitter, adopted similar policies and have also banned Trump. In Twitter’s case, the ban is permanent and his content was wiped off the platform. YouTube said its ban was temporary, but the platform has been continually extending its ban.
Facebook has long given wide latitude to public figures, and has enshrined in its policies a “newsworthiness” exception that allows some problematic speech that would otherwise break its rules to stay up because it is in the public interest. The company says it invokes the exception when the public interest in seeing the content outweighs the risk of harm. The policy was developed during the 2016 election in response to hateful posts by Trump, The Washington Post reported last year.
In 2020, Facebook clarified that it had used the newsworthiness exception just a handful of times. In most cases in which Trump’s questionable content stayed up, Facebook’s experts determined that Trump’s posts simply did not break the company’s rules.
Among the posts by Trump that Facebook did not remove were ones that baselessly claimed that the Centers for Disease Control and Prevention was exaggerating coronavirus deaths, that the United States had more cases because it had conducted more tests, and that “the Mortality Rate for the China Virus in the U.S. is just about the LOWEST IN THE WORLD!”
Facebook said at the time that it banned false claims about “the severity of the outbreak.”
These posts are still up on Facebook without any warning label or any indication that Trump’s account has been banned.
Regarding the election, Trump posted misinformation 363 times during the period examined by Media Matters, including false claims of victory and of voter fraud, false claims about voting infrastructure company Dominion Voting Systems, and false claims that the election was stolen from him that used the phrase “Stop the Steal” — the rallying cry of the Capitol rioters.
None of these posts were removed.
For Carusone and other experts, this means that the oversight board’s upcoming decision on Trump’s comments could exclude hundreds of other instances in which, they believe, Trump may have broken the spirit, if not the letter, of the company’s policies and contributed to harm in society.
“There is a big gap between their rules and what is happening on the ground,” he said.