The new rules pit Apple’s privacy prerogative against concerns about an overreach of its power.
Apple says it is making the move in part to better protect users’ privacy by shielding children from data trackers, a move that has been lauded by some privacy advocates. But some developers say they fear that the new rules won’t protect kids — possibly exposing them to more adult apps — and could needlessly damage their businesses.
That’s what’s worrying Gerald Youngblood. He created Tankee, an iPhone app intended to be a safe alternative to YouTube, with the help of privacy experts and lawyers. While the app is gaining traction, ranking in the top 10 apps for kids ages 9 to 11, the new rules could limit Tankee’s ability to show ads and force Youngblood to abandon the ad-supported model that keeps the app free.
Tankee shouldn’t be lumped in with the apps that are negligent and fail to protect children, Youngblood said. “We thought they were going to shut down these apps that are ignoring privacy and targeting kids,” he said. “We were built with privacy as a foundation.”
Apple says it was simply responding to parents’ concerns. Phil Schiller, Apple’s senior vice president of worldwide marketing, said parents were complaining to Apple about inappropriate advertising shown to their kids while using iPhone apps. “Parents are really upset when that happens because they trust us,” Schiller said.
Following an inquiry from The Washington Post, Apple said Friday that it now plans to delay the rule changes. “We aren’t backing off on this important issue, but we are working to help developers get there,” Apple spokesman Fred Sainz wrote in an emailed statement. The statement said some developers had asked Apple to clarify the new rules, but that “generally we have heard from them that there is widespread support for what we are trying to do to protect kids.”
Apple’s June announcement of its new kids apps rules has triggered complaints from developers who rarely criticize the company publicly. They say that because of Apple’s dominance of the app economy, even seemingly small changes can wreak havoc on many businesses. Apple accounts for about 71 percent of consumer spending on the U.S. app market, while Google is a distant second with around 29 percent. Globally, consumers spent $25.5 billion on Apple’s App Store in the first half of 2019 alone, according to market research firm Sensor Tower, far ahead of the $14 billion spent on Google’s Android platform.
“Apple would definitely throw its weight around less if it knew all its developers could desert it for any number of alternatives,” said Christopher Sagers, a professor at the Cleveland-Marshall College of Law and author of the upcoming book “United States v. Apple” that explores the company’s allegedly anti-competitive behavior.
Apple’s App Store is already under the antitrust microscope. The company is facing a European investigation into allegations made by Swedish music app Spotify that Apple unfairly tipped the scales on the App Store in favor of Apple Music, a similar service. And the Supreme Court in May allowed a lawsuit to proceed that accuses Apple of using monopoly power to inflate app prices.
Kids apps are estimated to make up only a small portion of the millions of apps available in the store, though Apple declined to say what percentage they represent. It’s unclear exactly how many of those are collecting personally identifiable data on kids, and Apple declined to quantify how many are behaving badly.
Apple has previously contacted developers and advertising software operators to ask them to remove inappropriate ads — but that approach typically failed, Schiller said. And analytics software has gotten increasingly sophisticated in how it collects data.
Now, critics say, Apple is throwing the baby out with the bathwater by banning all external tracking and advertising on children’s apps, even when those apps comply with regulations around data privacy.
“This will simply kill the kids app category,” said Dylan Collins, chief executive of SuperAwesome, which helps app developers navigate child-privacy laws in several countries. Apple’s changes are “easy to perceive as ham-fisted,” he said, and show that Apple doesn’t understand how they will affect the sector.
Privacy advocates have been complaining for years about the problems Apple says it is trying to fight. The 1998 U.S. Children’s Online Privacy Protection Act and the newer European General Data Protection Regulation limit what data kids apps are able to track. But there have been rampant violations of the law.
A long set of rules, known as the App Store Review Guidelines, govern the way app developers are allowed to operate. How those rules are enforced is a mystery, developers say. They have little visibility into Apple’s process of approving and rejecting apps. And when they are told they are in violation of the rules, they are often not told why. Apple says on its website that its app review team makes about 1,000 calls a week to help developers diagnose the issues that led to an app’s rejection.
It’s still unclear exactly how the new kids app rules will manifest and how they may affect many developers’ businesses. Developers say that, instead of a blanket ban, Apple should mandate that all kids apps use advertising and analytics that are vetted for safety.
If the goal of Apple’s rule changes for kids apps is to satisfy parents, it may have work to do.
Ann Mongan, a scientist and mother of a 7-year-old boy in San Francisco, said that she supports the idea of apps that don’t collect data and don’t show advertisements. But that’s not what Apple is doing, she pointed out. Developers themselves are still free to collect any data they please and show ads, as long as they are not placed by automated advertising software. Banning third parties, she said, is a good step but doesn’t solve the problem.
“It’s basically just lip service and publicity so that people feel like they’re safer, when they really are not,” she said.
Under the new rules, developers of mobile apps don’t have to stop collecting data themselves. (Apple’s own analytics software is also not banned, according to the new rules.) And once they collect the data, Apple can’t see what they do with it, such as send it to a server, where it can be analyzed by outside parties. In some sense, Apple could be making the problem worse by pushing data collection into the shadows, according to developers and people who work at analytics companies.
In June, when Apple announced its new privacy guidelines for kids apps, Tankee’s Youngblood was left confused. “We just wanted clarity on how best to follow the guidelines,” he said. “I understand we want to clean up the bad actors, but what does it do to the good guys?”
In developing his app, which was inspired by concerns about watching his now 10-year-old son surf YouTube, Youngblood had tapped companies that have sprung up to create advertising and analytics software that complies with the law to help protect kids. Companies like SuperAwesome, for instance, say humans review each advertisement for kid-appropriateness and ensure that no personally identifiable information is collected from kids.
He’s now looking into hiring in-house ad sales people, which would increase costs and require resources, and focusing more on other platforms, such as Android.
Before banning third-party advertising and analytics software, Apple spoke to specific developers and described what it was planning to do, Schiller said. He declined to name any of the developers. “We gathered enough data that we’re doing the right thing,” he said.
Clark Stacey, chief executive of Utah-based WildWorks, says Apple’s new guidelines will fundamentally change his kids app company. Most of the company’s apps, like Animal Jam, Amazing Animals and Animal Jam Jump, don’t have advertising, but the company uses analytics software to determine its development strategy. To protect kids, the company employs about 30 moderators and support staff to scan for inappropriate content, and child psychologists to ensure safety.
But Apple’s new rules will almost certainly mean the end of a WildWorks game called “Tag With Ryan,” based on a 7-year-old named Ryan whose YouTube toy reviews made him $22 million last year. That game is supported by ads, Stacey said. “If that monetization model is removed, I don’t know how long it would be able to continue,” he said.
Developers of kids apps say Apple’s new restrictions will incentivize them to start developing new apps that are technically made for adults — even if many of the users end up being children, leaving those kids with fewer protections than kids apps provide.
Parents groups have raised concerns about apps such as ChatLive, a video chat service categorized for adults that is popular with children and that receives no added scrutiny from Apple. The shooter game Fortnite is also categorized as an adult app, though it’s very popular with kids.
Lindsay Wolfe, a stay-at-home mother of two in Chicago, said she will continue to sit next to her kids, ages 1 and 3, any time they use an iPad and supervise their activity, regardless of what Apple’s policies are.
“You can’t trust Apple to take care of the problem for you,” she said. “You still have to take responsibility for what your kids see. It’s called parenting.”