The mother of Selena Rodriguez, an 11-year-old from Enfield, Conn., has filed a lawsuit against two social media giants, arguing that a lack of adequate safeguards led her daughter to take her own life in July 2021.
The lawsuit alleges that Selena Rodriguez’s suicide was “caused by the defective design, negligence and unreasonably dangerous features of their products,” the Social Media Victims Law Center (SMVLC) said in a statement.
The statement also says the tech giants “knowingly and purposefully designed, manufactured, marketed, and sold social media products that were unreasonably dangerous because they were designed to be addictive to minor users.”
Neither Meta nor Snap immediately responded to a request for comment from The Washington Post, but both companies have previously said they work to protect minors on their platforms and continually build features to better safeguard young users.
Selena had struggled for more than two years with an extreme addiction to Instagram and Snapchat, the court documents said. Her mother, Tammy, who is the administrator of her estate, had confiscated her electronic devices, but Selena ran away from home so she could continue using the platforms.
In the months leading up to her death last year, the court filing said, Selena had experienced “severe sleep deprivation that was caused and aggravated by her addiction to Instagram and Snapchat, and the constant 24-hour stream of notifications and alerts.”
It said that during the coronavirus pandemic she spent even more time on the platforms, “which only worsened her depression.”
While using the apps — where millions share photos, videos, memes and life updates — the court documents said Selena “had been messaged and solicited for sexual exploitive content and acts on numerous occasions by adult male users of Instagram and Snapchat.” It said that disappearing messages and the lack of ID and age verification made such activity possible.
It said Selena succumbed to pressure to send sexually explicit images on Snapchat that were later leaked among her classmates, “increasing the ridicule and embarrassment she experienced at school.” Her social media addiction also led to multiple absences from school and an investigation by the Connecticut Department of Children and Families, the law firm said.
She was hospitalized for emergency psychiatric care and experienced poor self-esteem, eating disorders and self-harm before ultimately taking her own life. She had received mental health treatment for her addiction on multiple occasions, the filing added.
The product liability action contends that the two companies are “responsible for causing and contributing to burgeoning mental health crisis perpetrated upon the children and teenagers in the United States,” according to the court documents. “Specifically, for the wrongful death of 11-year-old Selena Rodriguez caused by Selena’s addictive use of and exposure to Defendants’ unreasonable dangerous and defective social media products.”
A spokesperson for Snap Inc. told CBS News: “We are devastated to hear of Selena’s passing and our hearts go out to her family. While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community.”
The spokesperson continued: “In fact, Snapchat helps people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it hard for strangers to contact young people.”
On Thursday, Meta launched a feature called Pledge Planets on Messenger Kids, an activity to help children learn how to use the Internet safely and practice making healthy decisions online. The company said in a news release it had developed the interactive features “in close partnership with experts in online safety, child development and children’s media.”
“Kids can explore different planets based on the tenets of the Messenger Kids Pledge: Be Kind, Be Respectful, Be Safe, and Have Fun,” the company said.
In September, Meta said it was pausing plans to build an Instagram app for children amid growing pushback from child welfare advocates, policymakers and law enforcement officials who contend that such efforts could harm a generation of users. Facebook’s own internal research has also previously found that teen girls reported Instagram had made their body image issues worse.
In October, Facebook whistleblower Frances Haugen stunned lawmakers in testimony before a Senate committee, alleging that the company systematically and repeatedly prioritized profits over the safety of its users. She painted a detailed picture of an organization whose decisions were governed by a hunger to grow — allegations the company has pushed back on.
Nonetheless, lawmakers from both parties have coalesced around regulatory efforts to rein in how tech giants’ services can affect children’s mental health. Executives from the companies have also committed to sharing internal research on how their products may affect children.