
When parents download a learning or gaming app from the “Designed for Families” section of the Google Play store, they likely assume that those apps keep their kids’ data safe. After all, the Children’s Online Privacy Protection Act (COPPA) prohibits website operators and app developers from tracking or collecting personal data from children under the age of 13 without verifiable parental consent.

Yet that assumption could be wrong. We examined more than 5,000 of the most popular Google Play apps targeted at children under 13, many of which have been downloaded millions of times, and found that more than half appear to fail to protect children’s data. In fact, the apps we examined appear to regularly send potentially sensitive information, including device serial numbers (which are often paired with location data), email addresses, and other personally identifiable information, to third-party advertisers. In over 90 percent of these cases, the apps transmit identifiers that cannot be changed or deleted, such as hardware serial numbers, thereby enabling long-term tracking.

To test app privacy, we created an automated test bed that allows us to download and install apps to a series of mobile devices, simulate the behavior of users (with limited additional testing by humans), and then monitor the traffic flowing in and out of the devices. By monitoring an app for just 10 minutes, we can tell whether it tracks the user’s behavior, discloses this tracking, or shares personal data directly with third parties. (Our test bed is limited to Android apps for the sole reason that the Android platform is open source.)
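To give a sense of how that last step works, here is a minimal sketch in Kotlin, not our actual test bed, of how captured traffic could be scanned for a device’s known persistent identifiers. The log file name, its format, and the identifier values are all hypothetical:

    import java.io.File

    // Hypothetical persistent identifiers harvested from the test device
    // before the run (the values here are made up for illustration).
    val knownIdentifiers = mapOf(
        "hardware serial" to "8XV7N15A29004321",
        "Android ID" to "9774d56d682e549c",
        "advertising ID" to "38400000-8cf0-11bd-b23e-10b96e40000d"
    )

    fun main() {
        // Assume each log line holds "destination-host<TAB>decrypted request payload".
        File("capture.log").forEachLine { line ->
            val parts = line.split('\t', limit = 2)
            if (parts.size < 2) return@forEachLine
            val (host, payload) = parts
            for ((name, value) in knownIdentifiers) {
                // Flag any endpoint that received a persistent identifier.
                if (payload.contains(value, ignoreCase = true)) {
                    println("$name sent to $host")
                }
            }
        }
    }

Matching of this kind, run over the traffic captured while the simulated user plays, is what lets us flag an app after only about 10 minutes of testing.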

Members of my group contacted the developer of several apps, all targeted at children under 13, whose privacy practices were particularly egregious. We observed that many of its apps were sending a wide range of persistent identifiers and location data to an advertising and analytics firm. When we reached out to the company, it thanked us and indicated that it had previously been unaware of the problem. The company said it had removed the advertising firm’s code from all of its games, and when we reanalyzed several of its apps, we confirmed that this was the case. Thus, for this developer at least, the invasive privacy practices appear to have been due to misuse of third-party code.

We suspect that most of the developers whose apps fail to protect data have no nefarious intent; rather, they fail to configure their software properly or neglect to scrutinize the practices of the third-party advertisers they rely upon to generate revenue. When building an app, developers import ready-to-use code from many different third parties, including advertising companies. While this code “reuse” saves time and reduces errors, app developers likely do not realize that they are liable for all of the code included in their apps, regardless of whether they wrote it themselves.
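As a concrete illustration of how easy this reuse is, in a typical Android project a single build-file line pulls an entire third-party advertising SDK, and whatever tracking behavior it ships with, into the app (the artifact and version shown are merely illustrative):

    // app/build.gradle.kts
    dependencies {
        // One line imports a complete advertising SDK; the developer is
        // nonetheless liable for everything that code does at runtime.
        implementation("com.google.android.gms:play-services-ads:23.0.0")
    }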

Many third-party advertising packages include explicit options for app developers to disable the types of tracking and data sharing that might cause the resulting apps to run afoul of COPPA; others have terms of service that prohibit their inclusion in apps targeted at children altogether. Thus, it is incumbent upon app developers to thoroughly review all third-party code that they include, as well as to ensure their own apps comply with applicable privacy (and other) regulations.
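For example, Google’s own Mobile Ads (AdMob) SDK exposes a flag that tags every ad request as child-directed, which disables behavioral advertising for those requests. A minimal sketch in Kotlin follows; exact method names vary across SDK versions:

    import com.google.android.gms.ads.MobileAds
    import com.google.android.gms.ads.RequestConfiguration

    fun enableChildDirectedTreatment() {
        // Tag all ad requests from this app as child-directed so the SDK
        // suppresses behavioral advertising and personalized tracking.
        val config = MobileAds.getRequestConfiguration()
            .toBuilder()
            .setTagForChildDirectedTreatment(
                RequestConfiguration.TAG_FOR_CHILD_DIRECTED_TREATMENT_TRUE
            )
            .build()
        MobileAds.setRequestConfiguration(config)
    }

A call like this belongs in the app’s initialization code. The point is that a compliant configuration is often only a few lines away, but only if the developer knows to look for it.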

But such a high rate of potential COPPA violations also reveals a systemic and troubling lack of oversight. While app developers are ultimately liable for such violations, it is clear that app stores like Google Play and Apple’s App Store, as well as agencies like the Federal Trade Commission (which is responsible for enforcing COPPA), need to play a greater role.

For its part, Google, as part of its “Designed for Families” program, already presents a set of COPPA-compliance checklists to app developers whenever they indicate that their apps belong in the “Family” category of the Play Store. However, there appears to be no verification once a developer self-certifies compliance. While Apple (unlike Google) has an app review process, that process is opaque, so it is unclear whether these policies are consistently enforced. (Upon manually testing several iOS games, I found that all of them had behavioral advertising enabled, a practice prohibited by COPPA, and that none had been caught by Apple’s review process.)

Meanwhile, the FTC, as well as state attorneys general, can and should be more proactive in identifying violators. To its credit, the FTC has brought several successful actions against COPPA violators for various reasons, including not seeking parental consent before accessing personally identifiable information and sharing persistent identifiers with third-party services. Yet until now the commission’s efforts have been limited to responding to reported violations and suspicious behavior, and its investigations have relied upon slow and laborious manual testing methods. As a result, the status of most mobile apps’ compliance remains largely unknown to the agency responsible for enforcing the law.

Part of the problem is a lack of transparency. Unlike with most products, where harms are often obvious to consumers, parents in most cases have no way of knowing whether their child’s data is being tracked or transmitted. Consumers can help by demanding more transparent disclosures from the app developers and the third-party advertisers that help them generate income, and by deleting (and reporting) apps that do not comply.

We have developed a website, AppCensus, that shows the privacy behaviors of the apps we have automatically tested. We hope that it will shine a light on these practices so that other developers take action. COPPA exists for a noble reason: protecting the privacy of children. We urge key stakeholders in government and industry to work together to ensure that this law is properly enforced.

Serge Egelman is research director of the Usable Security & Privacy group at the International Computer Science Institute and an affiliated researcher at the University of California, Berkeley Center for Long-Term Cybersecurity.