The Washington Post | Democracy Dies in Darkness

Senators unveil children’s online safety bill after months of pressure on Silicon Valley

The bill would require companies to provide parents and minors with new controls and create new obligations for platforms to address self-harm, eating disorders and other content that might harm children and teens

Sen. Richard Blumenthal (D-Conn.) speaks alongside Sen. Marsha Blackburn (R-Tenn.) during a Senate hearing. They are co-sponsoring a bill to address children's safety online. (Anna Moneymaker/Pool/EPA-EFE/Shutterstock)

A bipartisan pair of senators on Wednesday unveiled a sweeping bill that aims to give parents more control over their children’s time online, following months of congressional scrutiny over the way social media platforms may harm their youngest users.

Co-sponsored by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), the Kids Online Safety Act would require online platforms to provide parents and minors younger than 16 with “easy-to-use” tools to keep them safe, limit screen time and protect their data. It would demand that companies create tools to allow parents to track how much time their kids spend on a service, or to opt out of features such as autoplay that might extend time online. Companies would also have to offer parents and minors the ability to modify tech companies’ recommendation algorithms, allowing them to limit or ban certain types of content.

“What we’ve seen is gut-wrenchingly a sense of powerlessness, a loss of control from kids themselves,” Blumenthal said during a Wednesday news conference. “What we’re doing in this bill is empowering those children and their parents to take back control and the power over their lives online.”


The bill also establishes an obligation for companies to prevent the promotion of self-harm, eating disorders, bullying and the sexual abuse of children. And it would allow the federal government to create a program giving researchers access to company data so that they can further study tech’s potential harms to children and teens.

The bill is the result of months of hearings and a congressional investigation into tech companies’ handling of children’s safety, after documents were disclosed last year by Facebook whistleblower Frances Haugen. Although the documents touched on a variety of topics, internal research examining how Instagram may affect the mental health of teen girls, first reported by the Wall Street Journal, sparked new political will to update safeguards for minors online.

The legislation appears to directly respond to some of the recommendations that Haugen made to lawmakers last year, when she warned of Facebook’s tendency to prioritize posts likely to elicit reactions from users. Haugen described the company’s algorithms as a black box and called for lawmakers to force greater transparency, in part by facilitating independent research about online platforms.


Senators from both parties appeared uniquely emboldened by the revelations, and they responded with calls for immediate action to keep children safe online. The nation’s existing children’s online privacy law, the Children’s Online Privacy Protection Act, is more than 20 years old and applies only to children younger than 13. The law — which is older than Facebook, Instagram, YouTube and many of the other services where children spend time online today — does not do enough to protect children in the age of social media, policymakers say.

In the absence of action from Congress, companies have responded to public pressure on children’s safety, as well as to regulations abroad, such as Britain’s Age-Appropriate Design Code. Instagram in December launched features to keep teens safer online, such as reminders to “take a break” and restrictions on what content is algorithmically recommended to teens. TikTok recently expanded its rules against videos that promote disordered eating, and it recently strengthened its policies on content related to suicides.

Children’s safety online is also gaining traction in state legislatures. California Assemblywoman Buffy Wicks (D) and Assemblyman Jordan Cunningham (R) on Wednesday introduced a bill called the “California Age-Appropriate Design Code Act,” which is modeled on the British regulations. Wicks said in an interview that as a mother herself, she has grown increasingly concerned about children’s safety, especially as they spend more time online during the pandemic. She said that if the bill were to pass in California, it could have ripple effects throughout the country, as the state’s broad digital privacy law has.

“If you have European standards and California standards, chances are you’re going to follow those standards even in other parts of the country,” said Wicks, who previously worked for Common Sense Media, which has advocated for more protections for children online.

Beeban Kidron, a member of Britain’s House of Lords who backed regulations to keep children safe online, applauded both the congressional and California bills.

Since enforcement of the Age-Appropriate Design Code began last year in Britain, a number of large tech companies have made major changes to their services, Kidron said. But the rule has had major implications beyond Big Tech, forcing smaller companies to turn off GPS tracking of children, for example, or to limit collection of their data.

“American lawmakers have woken up to the fact that they need to act,” she said in an interview with The Washington Post. “You cannot wait for the tech companies to do this on a voluntary basis.”

During a Wednesday news conference, Blumenthal and Blackburn directed most of their ire toward large tech companies. Blumenthal accused them of pushing toxic content to kids to “enhance their bottom line.” But the Kids Online Safety Act would have consequences far beyond the small group of tech giants, affecting any online service of any size that is “reasonably likely to be used” by a child younger than 16.

This would probably include a wide array of video games, streaming services and websites. Blackburn said that interactive games are a concern and that she wants to make certain that “children aren’t being adversely impacted in that environment.”

“Others who make the same kinds of choices and who drive content to children ought to be held responsible, as well,” Blumenthal said.