The Washington Post
Democracy Dies in Darkness

TikTok content moderator sues company, alleging trauma from hours reviewing videos of rape and killings

TikTok, which said in September it has over 1 billion global users, has become one of the world's most popular social media platforms. (Brent Lewin/Bloomberg News)

A content moderator who reviewed videos for TikTok is suing the social media company, alleging that it did not protect her from suffering psychological trauma after “constant” exposure to violent videos that showed sexual assault, beheadings, suicide and other graphic scenes.

For as long as 12 hours each day, Candie Frazier and other moderators reviewed “extreme and graphic violence,” including videos of “genocide in Myanmar, mass shootings, children being raped, and animals being mutilated” in an effort to filter out such content from being viewed by TikTok users, according to the lawsuit. The legal action was filed in federal court in California last week against TikTok and its parent company, ByteDance.

Frazier developed “significant psychological trauma including anxiety, depression, and posttraumatic stress disorder” as a result of her exposure to the videos, according to the lawsuit, which is seeking class-action status. The legal challenge, which alleges that TikTok violated California labor law by failing to provide a “safe work environment,” requests compensation for moderators who were exposed to the material. It also asks that TikTok and ByteDance provide mental health support and treatment to former and current moderators.

Frazier is not a TikTok employee — she works for Telus International, a firm that provides workers to other businesses — but the lawsuit alleges that “ByteDance and TikTok control the means and manner in which content moderation occurred.”

A TikTok spokesperson declined to address the lawsuit directly, writing in a statement that the company does “not comment on ongoing litigation.” The statement said that at TikTok, “we strive to promote a caring working environment for our employees and our contractors.” It added that it will “continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”

A spokesperson for Telus International, which is not a defendant in the suit, said in a statement that the company was “proud of the valuable work our teams perform to support a positive online environment,” adding that the company has a “robust resiliency and mental health program in place.”

The lawsuit comes as content management practices at one of the world’s most popular social media platforms are under scrutiny. TikTok revealed in September that it had more than 1 billion global users.

School districts nationwide were on alert this month after authorities raised concerns over what they said were threats of violence spread on the platform, following a shooting at a high school in Michigan that left four people dead. TikTok denied that the threats had spread widely on its platform, although gun-control advocates called for the company to improve content regulation.

TikTok also said this month that it would adjust its algorithm after an investigation by the Wall Street Journal found that the technology could feed users streams of content focused on subjects such as depression and extreme dieting.

Moderators are made to view as many as 10 videos simultaneously, while being pushed by TikTok software “to review videos faster and faster,” according to Frazier’s lawsuit. During the course of a 12-hour shift, workers are allowed two 15-minute breaks and an hour for lunch, the suit says.

Telus said its employees could raise concerns through "several internal channels" but that Frazier had never done so. "Her allegations are entirely inconsistent with our policies and practices," the company said. An attorney for Frazier did not immediately respond to a request for comment.

Similar allegations have been made by content moderators for other social media companies, including Facebook. Last year, Facebook (whose parent company is now called Meta) agreed to a $52 million settlement with thousands of moderators, after a lawsuit alleged that Facebook failed to protect them from traumatic content.

If you or someone you know needs help, call the National Suicide Prevention Lifeline at 800-273-TALK (8255). You can also text a crisis counselor by messaging the Crisis Text Line at 741741.
