Ian Russell is the founder of the Molly Rose Foundation.
Less than two weeks after we went public with Molly’s story, Instagram chief Adam Mosseri wrote an op-ed vowing “to do everything we can to keep the most vulnerable people who use our platform safe.”
Now more than ever, I can’t help but think of this promise he made. “We are committed to publicly sharing what we learn. We deeply want to get this right and we will do everything we can to make that happen.”
It brings me no joy to say it, but it is now clear that those words have proved hollow.
My daughter Molly’s story may not be familiar to many outside the U.K., but it helped inspire the Online Safety Bill. “Facebook whistleblower” Frances Haugen testified on Monday before the Parliamentary Joint Committee that drafted it.
Haugen and the documents she has shared have exposed how Facebook, which owns Instagram, repeatedly touted its commitment to young people’s well-being even as it covered up internal research about how toxic Instagram could be for them, especially teen girls having a hard time, like my beloved daughter Molly.
It was frankly difficult to read the internal presentations that demonstrated precisely how “aspects of Instagram exacerbate each other to create a perfect storm” that sends struggling young users into a “downward spiral” in which “mental health outcomes … can be severe.” It all rang so harrowingly true.
To learn that Facebook knew its platforms caused unique trauma for teenage girls — and that it neither shared that data with policymakers nor acted on its own researchers’ recommendations to mitigate these grievous harms — just appalled me. “Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable,” Haugen has said.
She is right. It is time Facebook stopped monetizing misery.
Haugen explained in her testimony before the U.S. Senate how Facebook’s algorithms use engagement-based rankings to tailor content to each user, often showing them more and more extreme content based on what they engage with. For Molly, this meant an Instagram feed full of suicidal ideation and self-harm. And no one outside of Facebook knows how the algorithm is designed or what its effects are on users. There are no means by which governments or independent regulators can review company policies and data to ensure these products aren’t leading to harm or even death.
As long as social media companies such as Facebook are left unregulated, with no oversight of their algorithmic ranking systems and no access to their internal data, public health and safety are at risk.
I saw myself in Haugen when she said, “I came forward because I believe that every human being deserves the dignity of the truth.” I’m not a public health or tech policy expert. I’m just a bereaved father called to use my voice in the hope of preventing other parents from joining this unbearable fellowship. I was involuntarily drafted into this long, arduous movement for accountability.
Time is of the essence. Every week that companies such as Facebook are allowed to obstruct and obfuscate is another week that children like Molly will needlessly lose their lives. I have learned from firsthand experience: We can no longer afford to take Facebook’s words at face value, or ask politely for it to do better.
It’s abundantly clear that Mosseri and Mark Zuckerberg will not, in fact, “do everything we can” to keep the most vulnerable people on their platforms safe. So now it’s incumbent upon lawmakers to do it for them.
Congress and Parliament must use every tool at their disposal to obtain all internal research and documents that pertain to the health and well-being of our children, and pass legislation to protect the families they represent. It’s what Molly, and the far too many other young people who have suffered as she did, deserve.
If you or someone you know needs help, call the National Suicide Prevention Lifeline at 800-273-TALK (8255). You can also text a crisis counselor by messaging the Crisis Text Line at 741741.