Imagine if a stranger parked in front of a child’s bedroom window to peep inside. You’d call the police.
Angry Birds 2 snoops when kids use it. So do Candy Crush Saga and apps for coloring and doing math homework. They’re grabbing kids’ general locations and other identifying information and sending it to companies that can track their interests, predict what they might want to buy or even sell their information to others.
Apple and Google run the app stores, so what are they doing about it? Enabling it.
Tech companies need to stop turning a blind eye when children use their products — or else we need laws to impose some responsibility on them. We the users want children’s privacy to be protected online. But parents and teachers can’t be the only line of defense.
Children’s privacy deserves special attention because kids’ data can be misused in some uniquely harmful ways. Research suggests many children can’t distinguish ads from content, and tracking tech lets marketers micro-target young minds.
This is why kids are at the center of one of America’s few privacy laws, the 1998 Children’s Online Privacy Protection Act, or COPPA. It says companies aren’t supposed to gather personal information about kids under 13 without parental permission. Sounds pretty clear, right?
But even one of the authors of COPPA, Sen. Edward J. Markey (D-Mass.), thinks it needs a do-over. “It was pretty obvious when the bill was being originally drafted that there was going to be a real opportunity for unscrupulous corporations to take advantage of young people,” he told me. “Now the problems are on steroids.”
By the time a child reaches 13, online advertising firms hold an average of 72 million data points about them, according to SuperAwesome, a London-based company that helps app developers navigate child-privacy laws.
“COPPA was passed in a world where parents would be in the room with a child using a computer,” said Stacy Feuer, a senior vice president of the Entertainment Software Rating Board, or ESRB, who worked for two decades at the Federal Trade Commission. “Mobile and everything we have in 2022 present new challenges.” ESRB is the nonprofit, self-regulatory body for the video game industry.
Pixalate said it used software and human reviewers, including teachers, to attempt something that Apple and Google have failed to do: categorize every single app that might appeal to children. Pixalate identified more than 391,000 child-directed apps across both stores — far more than the selection in the stores’ limited kids sections. Pixalate’s methodology draws on the FTC’s definitions of “child-directed,” and it was designed by a former commission staffer who was responsible for enforcing the law.
After identifying the child-directed apps, Pixalate studied how each handled personal information, most notably charting what data each sent to the ad industry. Of all the apps Pixalate identified, 7 percent sent either location or internet address data. But popular apps were much more likely to engage in tracking because they have an incentive to make money from targeted ads, it said.
Google and Apple said their app stores protect children’s privacy. Apple said it disagrees with the premise of Pixalate’s research and argued that the company has a conflict of interest because it sells services to advertisers. Google called Pixalate’s methodology for determining whether an app is child-directed “overly broad.”
A limitation of Pixalate’s study is that it didn’t check which apps seek parental permission, as COPPA would require. But my spot checks found that many do not.
This research is hardly the only indication of the problem. A recent study of 164 educational apps and websites found nearly 90 percent of them sent information to the ad-tech industry. A 2020 study found that two-thirds of the apps played by 124 preschool-aged children collected and shared identifying information. And a 2018 study of 5,855 popular free children’s apps found a majority were potentially in violation of COPPA.
“They are placing their profits over the mental health and social well-being of every child in America, because that’s the power they have today,” Markey told me.
I wanted to know: How did it become open season on kids’ data when we have a privacy law for kids in America?
What I discovered is that Big Tech and app makers found a giant loophole in the law: They claim they don’t have “actual knowledge” they’re taking data from kids.
But if we have the will, we can tighten up the loophole.
To see how apps gather kids’ data, step into the shoes of a parent. Your 12-year-old searches the iPhone app store for a coloring game and chooses one called Pixel Art: Paint by Number, made by a company called Easybrain.
Before your kid downloads the app, you glance at the listing in Apple’s app store. It says “Age: 12+” right at the top. The app preview shows pictures of a vegetable and a toucan to color. The app is free. What’s not to like?
But when your kid opens that coloring app, it sends out to the ad industry her general location, internet address and another code to potentially identify her phone, according to Pixalate.
At no point does Pixel Art ask for her age, or ask you for permission. Easybrain claims it doesn’t have to, because Pixel Art is not for children.
“We instead operate a ‘general audience’ service, and do not generally have actual knowledge that the Pixel Art App is collecting, using, or disclosing personal information from any child under 13,” emailed company spokesman Evan Roberts.
Let me translate: Many app makers say they’re only required to stop collecting data or get parental consent if they have “actual knowledge” their users are children. Without it, they can claim to be a “general-audience” product, rather than a “child-directed” one.
The coloring designs in Pixel Art include categories such as Dinosaurs, Unicorns, Cute Unicorns, Students, Ice Cream and Creamy Dessert. Those all seem like things kids could be interested in coloring, even though the app maker said it’s marketed to adults.
It doesn’t matter if adults also use an app: COPPA should apply if even just a portion of an app or website’s audience is kids. If it’s a mixed-audience product like Pixel Art, the app should either check ages and get parental permission — or just not collect personal information. In 2021, the FTC settled with a self-identified “adult” coloring app called Recolor that also had a “kids” section.
I also heard the “general audience” explanation from King, the maker of Candy Crush Saga, a game listed as “Age: 4+.” “Our game and our marketing are targeted at adult players, over the age of 18 in the U.S.,” the company emailed.
Same from Rovio, the maker of the Angry Birds app series. “Rovio carefully analyzes whether its games are subject to COPPA,” the company emailed.
I understand it can be complicated for app developers — often small businesses — to understand who their audience is or how to make money without running afoul of the law.
The maker of one app I contacted acknowledged a need to do better. The Calculator and Math Solver app marketed itself as a way to “make math homework fun” even while claiming to only target people older than 16. “We will be more mindful of clearly marketing only to our intended target audience,” emailed Frank List, the chief executive of developer Impala Studios.
Are Apple and Google okay with this happening in their app stores? I told them the results of Pixalate’s study and flagged a dozen apps that appeared to flout COPPA.
They both told me they’re doing a bang-up job protecting kids. “Apps designed for children provide additional layers of security and tools to protect young people and hold accountable those who try to exploit their data,” emailed Apple spokesman Peter Ajemian.
And “Google Play has strict protocols and unique features to help protect kids on our platform,” emailed spokeswoman Danielle Cohen.
Both companies say they require apps to follow the law, and both have special privacy rules for child-directed apps. But the question is: Which apps actually adhere to these rules? When apps self-declare that they’re not designed for children, Apple and Google too often just look the other way.
Why point the finger at the tech giants? Because they arguably have more power than the U.S. government over the app economy through their control over the two biggest app stores.
They provide age ratings on all apps, but here’s a dirty little secret: Those ratings cover only the content of the apps. They don’t indicate whether an app is COPPA-compliant. In fact, Rovio, maker of Angry Birds, warns on its website that age labels in app stores can be misleading.
Even more frustrating, neither app store gives parents a simple way to see only the apps that don’t collect kids’ data. Google’s store has a kids tab, where apps labeled “Teacher Approved” must meet stringent standards. But just 5 percent of the most popular child-directed apps that Pixalate identified are in that part of the store.
Good luck even finding Apple’s curated kids category — it’s buried at the bottom of its store. You can’t search it separately, and apps with kids’ privacy protections aren’t labeled as such. (In fact, if you tap your way to the kids section of its most-downloaded charts, the listings you see aren’t even kids apps. Apple said I discovered a bug.)
Even Apple’s parental controls are of limited help. When a parent sets up a child’s iOS account, they get the ability to approve app purchases — but Apple doesn’t limit the store to just apps designed for kids’ privacy.
On an iPhone set up with a kids account, Apple does automatically activate its “ask app not to track” privacy setting. That limits all apps’ access to one piece of personal information — but it doesn’t stop all tracking. Testing the Pixel Art app on a child iOS account, we still saw the app share personal data, including general location, internet address and what appeared to be an alternative way to track the phone across apps by the same developer. (Easybrain disagreed that the last bit could be used to track a phone.)
Bottom line: If you’re a parent who wants to make sure your kids’ apps respect their privacy, it takes work. You could read app privacy policies to see if they claim they’re not for kids, or if they share data with third parties. Or you could turn to reviews by organizations that check child-directed apps for privacy, including Common Sense Media, ESRB or Pixalate.
But until we shift this work off the shoulders of busy parents, kids’ privacy is at risk.
So how do we fix this? For starters, Apple and Google could stop looking the other way, and start labeling all the child-directed apps in their stores.
Then parents, governments and even the advertising industry could have a clearer understanding of which apps are supposed to be treating kids’ data differently — and which ones really are only for grown-ups.
“The problem can be, the devil is in the details,” said Phyllis Marcus, a partner at the law firm Hunton Andrews Kurth who used to run the FTC’s COPPA enforcement.
I’m not saying that will be easy — we’d be asking that the stores call out apps stretching the definition of “general audience.” Yet YouTube was able to start labeling child-directed videos on its service after a COPPA settlement with the FTC in 2019.
Another idea: The phone could identify when it’s being used by a child. Already, Apple asks for a child’s age when a parent sets up a kid-specific iOS account. Why not send a signal to apps when a child is using the phone to stop collecting data?
Still, many of the kids’ privacy advocates I spoke with think the industry won’t really change until there’s legal culpability. “These companies shouldn’t be able to bury their heads in the sand to avoid having to protect kids’ privacy,” Markey said.
This means we need a COPPA 2.0. Markey and Rep. Kathy Castor (D-Fla.) both have drafted bills that would update the law. Among the changes they have proposed are covering teenagers up to age 16 and outright banning behavioral and targeted advertising.
Markey would also do away with that “actual knowledge” loophole and replace it with a new standard called “constructive knowledge.” That would mean apps and websites would be responsible for reasonably trying to figure out whether children are using their services.
California is also considering creating a version of a United Kingdom law known as the Age Appropriate Design Code. It would require companies to establish the age of consumers and maintain the highest level of privacy possible for children by default.
U.S. lawmakers have been talking about privacy broadly without much action for years. But surely protecting children is one thing Democrats and Republicans could agree on?
“If we can’t do kids, then it just shows how broken our political system is,” Markey told me. “It shows how powerful the tech companies are.”