Around the same time Facebook whistleblower Frances Haugen was downloading tens of thousands of revealing company documents, another researcher from her division quit amid a mix of burnout and frustration. He too had lost hope that he could effect change at Facebook.

That employee, Brian Waismeyer, who like Haugen was a member of the company’s integrity unit, broadcast his displeasure in a farewell post on the company’s internal message board — a familiar venue for expressing the mounting dissent within a company whose own employees have become a formidable force for criticism. He said the social network made it “uniquely difficult” for people with jobs like his — reformers within a growth-focused social media service — “to the point that it impedes progress and burns out those who grapple with it.”

The overall picture portrayed in these scathing farewell missives, many of which have been obtained by The Washington Post, echoes accounts from dozens of current and former employees interviewed in recent days and over the past several years: Facebook is obsessed with growth, unwilling to undertake systemic reforms in the face of documented harms and ready to accommodate the politically powerful, especially President Donald Trump in the years before he was banned from the platform after the Jan. 6 siege on the U.S. Capitol.

The rising volume of such complaints has brought Facebook to a perilous moment in which it’s being pilloried on a bipartisan basis in Washington and scrutinized by regulators across the world. A company that has endured a relentless string of controversies over its 17-year history while becoming ever-bigger, ever-richer now faces a broadening consensus that the time for self-regulation is long past; only forceful governmental action — in the United States and elsewhere — can curb the company’s penchant for allowing content that undermines democracy and harms individuals.

Some saw Haugen’s testimony — and the troves of documents she has shared with reporters and lawmakers — as steps toward that goal.

“I’m glad that these documents prove what many have been saying: Facebook’s leadership prioritizes growth and political power over the public good — and anyone who tries to change that continuously hits a wall,” said Yael Eisenstat, a former Facebook executive who has visited Congress and met with lawmakers to discuss how she battled senior leadership over an unwillingness to fact-check misinformation in political ads.

The former employees interviewed by The Post described their own painful journeys, from initial optimism that they could play a role in fixing the service, to frustration at what they saw inside the company, and ultimately to resignation that Facebook was unable to transform itself from within. In a handful of cases, they spoke out publicly or before Congress, or spirited away sensitive documents; sometimes the criticism made them persona non grata in the tech industry.

But most frequently they used the company’s internal message board, called Workplace, to sound alarms to the company’s 60,000 employees, in missives that were generally not available to the public. They spoke of the demoralization that arose after believing that they were on the front lines of protecting people from hate and online harm, only to see years’ worth of internal projects cast aside by management. At least 10 members of the integrity division where Haugen worked now work in a similar division at Twitter.

“Selecting anecdotes from departing employees doesn’t tell the story of how change happens at Facebook. Our integrity team is full of incredible professionals who make our platform better,” Facebook spokesman Joe Osborne said. “Projects go through rigorous reviews and debates exactly so that we can be confident in any potential change and its impact on people. In fact, we did end up implementing variations of many of the ideas raised in this story — and we’re grateful to everyone who has worked on these complex issues.”

Some internal critics have mixed feelings about Haugen. While they welcome reforms that could come from airing so much dirty laundry, those still in the trenches feel it could become harder to execute change from within, according to three of the people. Some also felt that Haugen exaggerated her understanding of issues on which she did not work directly and research that she did not conduct; Haugen on several occasions during her testimony told senators that she could not respond to questions because she did not have direct knowledge.

Concerns that conditions could become more difficult for would-be reformers were bolstered by Facebook CEO Mark Zuckerberg's reaction to Haugen's testimony. Zuckerberg routinely apologized for the company's missteps in the past but has taken a more defiant stance this time. In a blog post published Tuesday night, he said that when corporate research is mischaracterized to the public, it dissuades companies from examining their problems. "If we attack organizations making an effort to study their impact on the world, we're effectively sending the message that it's safer not to look at all, in case you find something that is held against you," he wrote, noting that the company was still committed to its research.

Though few of these former Facebook employees have spoken as publicly and potently as Haugen has, many have sat just offstage for months, sometimes years, seething over what they saw while working at Facebook. Many contacted by The Post declined to speak, or agreed to speak only on the condition of anonymity to protect their current jobs or out of respect for former co-workers. When reached for comment, Waismeyer confirmed that he had written the note and that the description of his experience was accurate, but said he would decline to comment further "out of respect for the many wonderful people still fighting the good integrity fight" at Facebook.

When Waismeyer resigned in March, he had been one of the longest-serving researchers in the company's integrity division, which came to encompass several subunits such as civic integrity and community integrity. During his more than three years in the division, he had spent more than a year working on a project to help victims of revenge porn report such posts to the company, according to his final post and another person familiar with the situation. The project was abruptly dropped during an internal reorganization and was never executed, upsetting the entire team.

In his farewell post, Waismeyer wrote that people who work in integrity face particular burdens in their jobs. The value of their work is always weighed against potential PR and legal risks, as well as whether it would hurt the bottom line or diminish the amount of time people spend on the platform, he said. In addition, the work can be emotionally damaging because of the repeated exposure to harmful content or contact, via interviews or surveys, with people who have been harmed or who have harmed others on social media.

Another former employee who, like Waismeyer, had worked in integrity described discovering serious problems in Facebook's private groups, which operate beyond the public eye and have harbored hateful and other problematic content. This former employee hoped to help Facebook remedy such problems but repeatedly ran into barriers within the company.

“They were just zero percent interested in stuff that I brought them,” this former employee said, speaking on the condition of anonymity. “It was raised all the way up … and was consistently stopped.”

In November 2020, another integrity researcher, Lindsay Blackwell, wrote a long resignation letter on the company’s chat system. Blackwell’s project, called WoW, or the Worst of the Worst, was an effort to refine the company’s hate speech detection algorithms after internal research found that users want Facebook to police some very harmful forms of hateful speech, such as attacks on Muslims or LGBTQ people, more rigorously than others, according to a person familiar with the project, who spoke on the condition of anonymity to describe sensitive matters. After the team spent more than a year engineering a solution to reorient the algorithms, managers and senior leaders dramatically redefined the project, arguing that prioritizing the safety of marginalized groups would be too political, the person said. Blackwell declined to comment.
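Reorienting detection algorithms in the way the WoW team reportedly attempted amounts to weighting enforcement by how harmful users judge each category of attack. The Python sketch below illustrates the general idea only; the category names, weights and function are invented for illustration, as the documents do not disclose how WoW actually worked.

```python
# Hypothetical illustration of severity-weighted enforcement; the categories,
# weights and API here are invented, not the actual WoW system.

# Internal research reportedly found users rank some attacks as far more
# harmful than others; a per-category weight can encode that ranking.
SEVERITY = {
    "religion": 1.0,            # e.g. attacks on Muslims: police most rigorously
    "sexual_orientation": 1.0,  # e.g. attacks on LGBTQ people
    "occupation": 0.3,
    "generic_insult": 0.2,
}

def enforcement_priority(raw_score: float, category: str) -> float:
    """Scale a classifier's raw hate-speech score by the category's severity weight."""
    return raw_score * SEVERITY.get(category, 0.5)

# The same raw score is triaged differently depending on whom it targets:
print(enforcement_priority(0.8, "religion"))        # 0.8  -> highest priority
print(enforcement_priority(0.8, "generic_insult"))  # 0.16 -> lowest priority
```

On this framing, the dispute the person described was over the weights themselves: choosing which groups' safety to rank highest is what leaders reportedly deemed too political.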

Others described particular frustration at what they characterized as a pronounced rightward tilt of Facebook’s policy office as Trump emerged as the Republican nominee for president in 2016 and, even more so, after he won the White House. Several former employees have described how a group of leading Republicans at the company’s Washington office, led by former George W. Bush administration official Joel Kaplan, Facebook’s vice president for global public policy, consistently resisted efforts to more aggressively combat misinformation, hate speech and incitement of violence for fear of alienating Trump and his supporters. The Post previously reported that Kaplan objected to some efforts to police false news reports on the grounds that they would “disproportionately affect conservatives.”

A former data scientist said executives resisted researchers' calls to identify the Proud Boys as a hate organization for at least four months before the company actually did so, out of concern that the group had backing in the Republican Party.

“The Republicans on the public policy team, they made every single decision to avoid upsetting Trump,” one former employee said. “They would just think of ways to make him happy so he wouldn’t unload on them.”

Facebook has contended with waves of public resignations over company decisions in recent years. One wave followed the company's decision in May 2020 not to take down a racially divisive post by Trump that appeared to endorse violence against people protesting the death of George Floyd, a Black man murdered while in the custody of Minneapolis police. Still more people resigned after learning of the company's role in promoting groups and content that led to the Capitol insurrection on Jan. 6.

But in the divisions specifically designed to mitigate societal harms, particularly a subunit called civic integrity, where Haugen worked, people have left because they felt their work was undervalued or stymied.

Another person who worked in integrity said a team had discovered that it could make a significant dent in the amount of harmful speech on the platform if the algorithms meant to detect that speech were allowed to overcorrect: essentially, to remove content suspected of being hate speech without being 100 percent sure of it. But executives canceled the effort, leaving the team demoralized. Executives decided they did not want to risk censoring people unfairly, even if accepting that risk could have significantly reduced hate speech and harm overall, the person said.
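The dilemma that team confronted is the familiar precision-versus-recall tradeoff in automated moderation. The Python sketch below shows the mechanism in miniature; the posts, scores and threshold values are invented, since the documents do not describe Facebook's actual classifiers.

```python
# Hypothetical sketch of the tradeoff described above; the posts, scores
# and thresholds are invented, not Facebook's actual system.

posts = [
    ("clear slur against a protected group", 0.97),
    ("borderline harassing insult", 0.72),
    ("heated but benign political argument", 0.65),
    ("cat photo caption", 0.02),
]  # (post text, classifier's estimated probability of hate speech)

def removed(threshold):
    """Return the posts a platform would take down at a given confidence threshold."""
    return [text for text, score in posts if score >= threshold]

# Requiring near-certainty removes only the most blatant post:
print(removed(0.95))  # ['clear slur against a protected group']

# "Overcorrecting" with a lower bar catches more suspected hate speech,
# but also sweeps in the benign argument -- the unfair-censorship risk
# executives reportedly cited:
print(removed(0.60))
```

Lowering the threshold is what "allowed to overcorrect" means in practice: more hate speech is caught, but some legitimate speech is removed along with it.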

Several other people familiar with the work of the integrity unit said that teams were reorganized numerous times during a single year, and, ultimately, parts of it were disbanded, causing widespread frustration and leaving projects incomplete.

“We didn’t join Facebook because we were big fans of the company,” said one of the people. “We joined because — rather than shouting about how something sucks you think you can make it better. But over time, you realize that the leadership is either indifferent or ignorant. Sometimes they don’t prioritize the work. Other times they don’t understand the logistical complexity, so they under-resource it until it goes into the void.”

Correction

An earlier version of this story incorrectly reported that former Facebook executive Yael Eisenstat testified before Congress about her negative experiences with the company. Eisenstat has spoken publicly against Facebook, met with lawmakers, and attended a congressional hearing with another tech industry whistleblower, but did not testify herself. This version has been updated.