The U.S. government is in the late stages of an investigation into YouTube for allegedly violating children’s privacy, according to four people familiar with the matter, in a probe that threatens the company with a potential fine and already has prompted the tech giant to reevaluate some of its business practices.
The Federal Trade Commission launched its investigation after numerous complaints from consumer groups and privacy advocates, according to the four people, who spoke on the condition of anonymity because such probes are supposed to be confidential.
The complaints contended that YouTube, which is owned by Google, failed to protect kids who used the streaming-video service and improperly collected their data in violation of the Children’s Online Privacy Protection Act, a 1998 law known as COPPA that forbids the tracking and targeting of users younger than age 13.
The possibility of a hefty penalty against YouTube — including a settlement forcing YouTube to change its practices to better protect kids — could signal a new phase in the FTC’s enforcement of the child-privacy law, which many critics say has grown weak amid technological changes over the past two decades.
The FTC declined to comment, citing its policy against confirming or denying nonpublic investigations.
Some of the problems highlighted by the YouTube investigation are shared by many of the most popular online services, including social media sites, such as Instagram and Snapchat, and games such as Fortnite, according to consumer advocates. The companies say their services are intended for adults and that they take action when they find users who are underage. Still, they remain widely popular with children, especially preteens, according to surveys and other data, raising concerns that the companies’ efforts — and federal law — have not kept pace with the rapidly evolving online world.
FTC penalties, when they have been levied, often come after years of violations and rarely have been large enough to dent the profit margins of major technology companies. In February, the FTC issued its highest financial penalty yet for breaking federal children's privacy rules — a $5.7 million settlement with the app known as TikTok for violations that allegedly began in 2014.
“YouTube is a really high-profile target, and for obvious reasons because all of our kids are on it,” said Marc Groman, a privacy lawyer who previously worked for the FTC and the White House. “But the issues on YouTube that we’re all grappling with are elsewhere and everywhere.”
Since its founding in 2005 — and especially after its purchase by Google for $1.65 billion the following year — YouTube has become one of the Internet's most popular sites, generating extensive advertising revenue while becoming a video library for almost anyone with an online connection, almost anywhere in the world. Its videos include do-it-yourself tips, original shows, musical performances and, on the darker side, far-fetched conspiracies, disinformation and clips troublingly close to child pornography. YouTube's users upload 400 hours of new content to the platform each minute, the company has said.
As the FTC investigation of YouTube has progressed, company executives in recent months have accelerated internal discussions about broad changes to how the platform handles children’s videos, according to a person familiar with the company’s plans. That includes potential changes to its algorithm for recommending and queuing up videos for users, including kids, part of an ongoing effort at YouTube over the past year and a half to overhaul its software and policies to prevent abuse.
A spokeswoman for YouTube, Andrea Faville, declined to comment on the FTC probe. In a statement, she emphasized that not all discussions about product changes come to fruition. “We consider lots of ideas for improving YouTube and some remain just that — ideas,” she said. “Others, we develop and launch, like our restrictions to minors live-streaming or updated hate speech policy.”
The Wall Street Journal reported Wednesday that YouTube was considering moving all children’s content off the service into a separate app, YouTube Kids, to better protect younger viewers from problematic material — a change that would be difficult to implement because of the sheer volume of content on YouTube, and potentially could be costly to the company in lost advertising revenue.
A person close to the company said that option was highly improbable but that other changes were on the table. YouTube Kids gets a tiny fraction of YouTube's audience, which tops 1.9 billion users logging in each month.
The internal conversations come after years of complaints by consumer advocates and independent researchers that YouTube had become a leading conduit for political disinformation, hate speech, conspiracy theories and content threatening the well-being of children.
The prevalence of preteens and younger children on YouTube has been an open secret within the technology industry and repeatedly documented by polls even as the company insisted that the platform complied with COPPA.
A complaint last year by consumer and privacy advocates focused, in part, on the numerous channels on YouTube that appear directed toward children — they feature nursery rhymes, cartoons and kids opening packages of toys or dolls — and that are among the most popular on the site. A Pew Research Center poll last year found that even among kids 11 or younger, 81 percent had watched YouTube at least once and 34 percent did so regularly, according to their parents.
Some lawmakers, including COPPA's original author, Sen. Edward J. Markey (D-Mass.), have proposed a bill to expand the law's application to online services with demonstrably large numbers of child users. COPPA applies only to services that are directed primarily toward children or whose operators have "actual knowledge" of use by children, a narrow legal standard that advocates say has hindered enforcement.
“An FTC investigation into YouTube’s treatment of children online is long overdue,” Markey said in a statement. “But we must do much more to ensure that our children are protected from online dangers known and unknown.”
The FTC has been investigating YouTube's treatment of children based on multiple complaints it received dating to 2015, which argue that both YouTube and YouTube Kids violate federal law, according to the people familiar with the investigation. The exact nature and status of the inquiry are not known, but one of those people said it is in advanced stages, suggesting that a settlement, and possibly a fine, could be forthcoming depending on what the FTC determines.
“YouTube’s business model puts profits first, and kids’ well-being last,” said David Monahan of the Campaign for a Commercial-Free Childhood, a Boston-based advocacy group. “When we filed a COPPA complaint with the FTC a year ago, Google’s response was ridiculous — that YouTube is not a site for kids, when it’s actually the most popular children’s site on the Internet. We hope the FTC will act soon, and require YouTube to move all kids’ content to YouTube Kids with no marketing, no autoplay or recommendations, and strong protections for children’s privacy.”
Major advertisers also have pushed YouTube and others to clean up their content amid controversies over the past two years.
A report last month by PwC, a consulting group, said that Google had an internal initiative called Project Unicorn that sought to make company products comply with COPPA.
The company that commissioned the PwC report, SuperAwesome, helps technology companies provide services without violating COPPA or European child-privacy legal restrictions against the tracking of children.
“YouTube has a huge problem,” said Dylan Collins, chief executive of SuperAwesome. “They clearly have huge amounts of children using the platform, but they can’t acknowledge their presence.”
He said the steps being considered by YouTube would help, but “they’re sort of stuck in a corner here, and it’s hard to engineer their way out of the problem.”
Earlier this month, YouTube made its biggest change yet to its hate speech policies — banning direct assertions of superiority against protected groups, such as women, veterans and minorities, and banning users from denying that well-documented violent events took place. Previously, the company prohibited users from making direct calls for violence against protected groups but stopped short of banning other forms of hateful speech, including slurs. The changes were accompanied by a purge of thousands of channels featuring Holocaust denial and content by white supremacists.
The company recently disabled comments on videos featuring minors and banned minors from live-streaming video without an adult present in the video. Executives have moved to limit the company’s own algorithms from recommending content in which a minor is featured in a sexualized or violent situation, even if that content does not technically violate the company’s policies.