
A content moderator says she got PTSD while reviewing images posted on Facebook


SAN FRANCISCO — A former Facebook content moderator is suing the company on the grounds that reviewing disturbing material on a daily basis caused her psychological and physical harm, according to a lawsuit filed Monday in a California superior court.

The suit by former moderator Selena Scola, who worked at Facebook from June 2017 until March, alleges that she witnessed thousands of acts of extreme and graphic violence “from her cubicle in Facebook’s Silicon Valley offices,” where Scola was charged with enforcing the social network’s extensive rules prohibiting certain types of content on its systems.

Scola, who worked at Facebook through a third-party contracting company, developed post-traumatic stress disorder “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” the suit says.

Facebook didn’t respond to a request for comment.

Facebook relies on thousands of moderators to determine whether posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation. Many objectionable categories come with their own sublists of exceptions. The company is staffing up its global workforce — hiring 20,000 content moderators and other safety specialists in places such as Dublin, Austin and the Philippines — in response to allegations that it has not done enough to combat abuse of its services, including Russian meddling, illegal drug content and fake news.

The social network says that in recent years it has been developing artificial intelligence to spot problematic posts, but the technology isn’t sophisticated enough to replace the need for significant amounts of human labor.

Facebook is under intense scrutiny from lawmakers, who have taken top executives to task in two high-profile hearings on Capitol Hill this year and are considering new regulations that would hold technology companies to a more stringent standard of responsibility for illegal content posted on their platforms.

The complaint also charges the Boca Raton, Fla.-based contracting company Pro Unlimited with violating California workplace safety standards.

Pro Unlimited didn’t respond to a request for comment.

The lawsuit does not go into further detail about Scola’s particular experience because she signed a nondisclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she were to violate it, the suit says. Her attorneys plan to dispute the NDA but are holding off on providing more details until a judge weighs in.

The suit notes that Facebook is one of the leading companies in an industrywide consortium that has developed workplace safety standards for the moderation field. The complaint alleges that, unlike its industry peers, Facebook does not uphold the standards it helped develop.

In late 2016, two former content moderators sued Microsoft, claiming that they developed PTSD and the company did not provide adequate psychological support.

Scola’s lawsuit asks that Facebook and its third-party outsourcing companies provide content moderators with proper mandatory on-site and ongoing mental health treatment and support, and establish a medical monitoring fund for testing and providing mental health treatment to former and current moderators.

Facebook has historically been tight-lipped about its moderator program. The guidelines moderators use to make decisions were secret until this year, when the company released a portion of them publicly. The company has declined to disclose where moderators work, as well as their hiring practices, performance goals and working conditions.
