The civil rights auditors Facebook hired to scrutinize its record delivered on Wednesday a long-awaited and scathing indictment of the social media giant’s decisions to prioritize free speech above other values, which they called a “tremendous setback” that opened the door for abuse by politicians.

The report criticized Facebook’s choice to leave several posts by President Trump untouched, including three in May that the auditors said “clearly violated” the company’s policies prohibiting voter suppression, hate speech and incitement of violence. The report also found that Facebook provides a forum for white supremacy and white nationalism.

The conclusions by Facebook’s own auditors are likely to bolster criticism that the company has too much power and that it bends and stretches its rules for powerful people. Though Facebook frequently says it listens to experts when making judgment calls, the auditors found that is not always the case on critical matters of free expression.

“When you put free expression on top of every other consideration, I think civil rights considerations take more of a back seat,” said Laura Murphy, a civil rights lawyer and independent consultant who led the two-year audit. Murphy worked with a team from civil rights law firm Relman Colfax, led by partner Megan Cacace.

The report was prompted by years of complaints by civil rights groups that the company foments hatred, dating back to when the social network was used to organize a 2017 neo-Nazi march in Charlottesville, Va. Since then, Facebook has become more aggressive about taking down hate groups, but it has also hardened its stances on protecting free speech, setting up a tension that the auditors said was undermining Facebook's good faith efforts to improve its service.

Chief executive Mark Zuckerberg’s unwavering position on free expression is isolating Facebook and leaving it at a perilous crossroads just months before the U.S. presidential contest. He has been widely condemned for it: by thousands of employees last month who protested the decision to leave up one of Trump’s posts, and now by major advertisers boycotting the social network as part of a campaign known as “Stop Hate For Profit.”

Civil rights leaders who met with Zuckerberg on Tuesday to discuss the boycott said the company didn’t appear to be ready to change. Facebook’s counterparts in Silicon Valley — including Snapchat, Reddit and Twitch — are taking a tougher tack when policing Trump and his most extreme supporters.

The Facebook-commissioned report potentially carries more weight than other criticisms on the grounds of civil rights because the social network granted the auditors extensive access to its systems and executives, and it encompassed feedback from over 100 civil rights groups. However, it provides no guarantee that Facebook will make major changes to its policies or practices.

“Being a platform where everyone can make their voice heard is core to our mission, but that doesn’t mean it’s acceptable for people to spread hate. It’s not,” Facebook Chief Operating Officer Sheryl Sandberg wrote in a blog post in response to the report. “We have clear policies against hate — and we strive constantly to get better and faster at enforcing them.”

The report comes on the heels of a meeting Facebook held with the organizers of a fast-growing boycott of over 1,000 advertisers, who have several demands of Facebook, including hiring a top-level executive who will ensure the global platform does not fuel racism and radicalization. The timing of the publication of the long-anticipated report led the civil rights groups organizing the boycott to argue Facebook was attempting to use it to draw attention away from their demands, which also include ending exceptions for politicians. The organizers called the Tuesday meeting “disappointing.”

Facebook denied it was trying to deflect attention from the boycott.

On Wednesday the company said it had taken down accounts tied to longtime Trump friend and former campaign adviser Roger Stone, after finding that he violated Facebook’s rules by using more than 100 accounts and pages to manipulate public debate.

Facebook’s auditors faulted the social network for making policy decisions that undermine civil rights progress. They said Facebook failed to improve the experience of people of color who use the platform. They also said the company had delayed acting on calls to hire experts in civil rights to senior leadership positions, noting recent decisions over hate speech were made by senior executives who lacked specific civil rights expertise and nuanced understandings of race — and that certain decisions were made against the objections of the auditors.

In the posts about voting in May, Trump called use of mail-in ballots in Nevada and Michigan “illegal” and “substantially fraudulent.”

Because mail-in ballots were a lawful method of voting in both states, the auditors "vehemently expressed" their views to Facebook that the posts were prohibited by the company's voter interference policy, which bans false representations about methods of voting, the report said.

But senior executives at Facebook found that the posts did not break the policies, ignoring the conclusions of the auditors, their own voting rights consultant, and the broader civil rights community, the report noted. Instead, the company’s executives interpreted the posts to mean the president was accusing state officials of acting illegally, which it considers to be permissible criticism. That “constrained reading” of its own rules “was both astounding and deeply troubling,” the auditors said, “hurtling [Facebook] down a slippery slope” in which basic facts about how to vote can be freely misrepresented.

“With only months left before a major election, this is deeply troublesome as misinformation, sowing racial division and calls for violence near elections can do great damage to our democracy,” the auditors wrote.

The auditors also challenged Facebook’s decision to let stand another May post by Trump, in which he said, “when the looting starts, the shooting starts,” invoking a civil rights movement-era reference to describe the military potentially entering the protests in Minnesota.

Civil rights advocates believe the comment about shooting people for stealing or looting appeared to encourage law enforcement to use unlawful lethal force against protesters. The choice to leave the post up led to an employee uprising and helped fuel the boycott.

Twitter chose to add fact-checking and warning labels to the same posts.

Facebook has made some concessions, including copying Twitter by developing fact-checking labels of its own. The auditors praised the concessions but said they did not go far enough.

Civil rights groups began reaching out to Facebook in 2015, when self-proclaimed white nationalists were attacking black activists on the platform, said Jade Magnus Ogunnaike, deputy senior campaign director at Color of Change, one of the groups behind the ad boycott. Then, in 2017, far-right extremists created an event page on Facebook to promote the Unite the Right march in Charlottesville, Va. The march, which was attended by self-proclaimed Neo-Nazis, turned deadly when a white supremacist drove his car into protesters.

Under pressure from civil rights groups, Facebook banned terms such as white nationalism and took down various accounts of far-right leaders including Alex Jones and Milo Yiannopoulos. But extremist activity has morphed since then, and civil rights activists have argued that Facebook has been slow to react. For example, a violent far-right movement known as the boogaloo flourished on Facebook this year, despite numerous requests from civil rights advocates to remove the groups. Facebook banned these groups last week, one of the demands of the ad boycott organizers.

The auditors noted that Facebook’s decision to leave Trump’s “looting” post up has already encouraged copycat calls for violence, including political and merchandise ads that “looters” and “ANTIFA terrorists” can or should be shot by armed citizens. Facebook ultimately removed the ads, after receiving more than 200,000 clicks.

Civil rights leaders said the release of the report is by no means an “end game” in their efforts to change the social network. Vanita Gupta, president and CEO of the Leadership Conference on Civil and Human Rights, said that work is increasingly critical in light of the intense polarization sweeping the country amid the pandemic and widespread protests against racism.

“There is so much at stake in this moment for the platform to get it right, for our democracy and for our communities,” she said. “The work is going to continue. We’re going to continue to press, to push to make these changes even after the final report comes out.”

The audit noted that Facebook had made progress by creating policies against interfering in the voting process and in the census, and had made historic legal settlements over discrimination in its ad targeting systems as well as with content moderators who suffer psychological harms from the work. The auditors also praised Facebook's decision to set a goal of increasing the number of black executives by 30 percent over the next five years, and to create a team to uncover algorithmic bias.

But the report said Facebook has a long way to go to incorporate civil rights, including changing its approach so that the harms from speech are valued as highly as free speech, creating an extensive civil rights infrastructure of executives and managers within the company, and investing more resources in areas of bias and discrimination in its products and policies. The auditors also asked for a commitment from Facebook to explore how the platform foments white supremacy in a manner that goes beyond merely banning the terms "white separatism" and "white nationalism." Finally, it called on Facebook to interpret its voter suppression policies more strictly, noting the recent exceptions for Trump.

Murphy said she’s hopeful Facebook will adopt some of the audit’s recommendations, but she noted it will take continued advocacy and pressure to ensure that happens.

“I just can’t predict which issues are going to make it across the finish line,” she said.

Zuckerberg has frequently said Facebook should not be in the position to make many of the most complex judgment calls over free speech issues, and has called for a governmental regulatory body to set universal standards. The company is also funding an independent external oversight board, which will be able to make decisions about whether content should be removed from Facebook and play a key role in setting precedent about content policy at Facebook. The board is expected to launch this summer.