“It was very clear that we needed more, much more support than what we were receiving,” Elder said.
That experience led her to join a lawsuit the following year charging that Facebook, as well as the companies it contracts with to hire tens of thousands of moderators, failed to protect its workers from the trauma associated with spending eight hours a day reviewing horrific images to keep them off social media platforms.
That case resulted in a first-of-its-kind settlement this week. Facebook agreed to pay $52 million to thousands of U.S. workers who have suffered the psychological consequences of reviewing posts depicting acts of suicide, murder, child abuse and other disturbing content.
The settlement, in which Facebook did not admit or deny harms caused to these workers, applies to any U.S.-based content moderator who has ever worked for a third-party company providing services to Facebook and its subsidiaries WhatsApp and Instagram, a group that encompasses more than 10,000 people.
The moderators who brought the class-action lawsuit — Elder, along with Selena Scola and Gabriel Ramos — have not previously spoken to the press. The other two plaintiffs declined to comment.
The case could open the door to lawsuits from workers against other social media companies that hire large numbers of moderators, such as YouTube and Twitter.
“I am incredibly proud of the plaintiffs in this case, who put themselves in jeopardy in coming forward,” said Steve Williams, a partner at the Joseph Saveri Law Firm in San Francisco, one of several firms involved in the case. “No one had ever seen a case like this, and the jobs that people do were in some ways beyond description.”
Facebook spokesman Drew Pusateri said in an emailed statement: “We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.”
In September 2018, Scola, a former Facebook moderator, broke a confidentiality agreement and sued Facebook, alleging she developed post-traumatic stress disorder after working at a job reviewing disturbing content including rape, murder and animal cruelty. The complaint, joined by the two other moderators, argued that Facebook failed to provide a safe workplace or compensate them for the psychological harms that they endured.
The Verge first reported the settlement.
As a result of the case, every covered content moderator who has worked since 2015 will receive at least $1,000. In addition, any worker with a diagnosis of PTSD stemming from the job is eligible to receive up to $50,000 in damages.
Facebook and other social media giants have significantly ramped up their global workforce of content moderators in response to widespread abuse by Russian operatives, drug peddlers and other users. Silicon Valley companies employ tens of thousands of people in at least 20 countries, including Ireland, India and the Philippines. The hiring is indirect, with a network of third-party vendors contracting the workers, who are paid low hourly wages compared with full-time employees.
The growing workforce has also drawn closer scrutiny of working conditions. Workers have told The Washington Post they suffer from PTSD and other forms of psychological trauma, including paranoid ruminations, frequent nightmares and an inability to sleep. Some, earning between $16 and $18 per hour, say they work side jobs in the gig economy to make ends meet.
Within Facebook, the treatment of content moderators is a subject of debate, and moderators have used Facebook’s internal chat rooms to raise complaints. Some say they have faced retaliation for protesting.
Facebook did not respond to a question about whether the job can cause PTSD. In a company-arranged interview with Austin-based moderators last year, a trainer told The Post the job does cause PTSD. Facebook and other companies now voluntarily provide free counseling through work, as well as frequent breaks, to relieve the stress the job causes.
But such services were far less available when Elder worked as a moderator. In an interview, Elder said her manager acknowledged that workers needed more help than they were getting but was unable to provide it, because the outsourcing company they worked for, Pro Unlimited, which had offices on the Facebook campus, was not interested in doing so.
“She agreed that we should have more support, but there was no one to hold anyone accountable to get that done,” Elder said. “Pro Unlimited was my employer, and they didn’t seem to think it was important, and Facebook’s attitude was this is Pro’s responsibility.”
Pro Unlimited was not a party in the lawsuit. The company did not respond to a request for comment.
The settlement barely registers on Facebook’s balance sheet, but the ongoing financial compensation may be significant for the workers, many of whom have told The Post they suffer psychological harm long after they leave the job.
Williams said the case was hard-fought, with Facebook initially trying to use legal motions to quash the case, but that the company ultimately negotiated in good faith. Governmental workers'-compensation systems generally do not apply to third-party contractors and do not cover psychological damages, he added, leaving workers little recourse.
Overall, U.S. workers are in a better position than those in other countries. Last year, a Post investigation found workers in the Philippines are not afforded the same legal protections and tend to work longer hours with more limited psychological support.