The findings raise questions about the security of our most private information in an age where employers, insurers and advertisers can use data to discriminate or target certain categories of people.
The information was shared with the social media giant via the Facebook Software Development Kit, a product that allows developers to create apps for specific operating systems, track analytics and monetize their apps through Facebook’s advertising network. Before users could even agree to the apps’ privacy policies, both Maya and MIA started sharing some data as soon as they were opened, according to Privacy International.
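To illustrate how an analytics SDK can transmit data before a user consents, here is a minimal sketch. This is not Facebook's actual SDK; the class and method names are assumptions for illustration. The key point is that the library logs an "app opened" event as a side effect of initialization, which typically runs before any consent dialog appears.

```python
class AnalyticsSDK:
    """Hypothetical analytics client -- names are illustrative, not Facebook's API."""

    def __init__(self, app_id):
        self.app_id = app_id
        self.sent_events = []           # stand-in for events sent over the network
        self._log_event("app_opened")   # fired automatically during initialization

    def _log_event(self, name, **params):
        self.sent_events.append({"event": name, "app_id": self.app_id, **params})

    def log_custom_event(self, name, **params):
        self._log_event(name, **params)


# An app's entry point typically initializes the SDK at launch, so by the
# time the consent screen is shown, an event has already been recorded.
sdk = AnalyticsSDK(app_id="example-period-tracker")
consent_given = False  # the user has not yet seen the privacy policy
print(len(sdk.sent_events))  # -> 1
```

The design choice that matters here is the side effect in the constructor: consent checks bolted on later cannot retroactively stop data that was emitted at initialization.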
Facebook spokesman Joe Osborne said advertisers did not have access to the sensitive health information shared by these apps. In a statement, he said Facebook’s ad system “does not leverage information gleaned from people’s activity across other apps or websites” when advertisers choose to target users by interest. BuzzFeed first reported the news.
Period- and pregnancy-tracking apps such as Maya and MIA have climbed in popularity as fun, friendly companions that provide insights into the often daunting world of fertility and pregnancy. They can also be used to track sexual health more generally, moods and other intimate data. But many such apps aren’t subject to the same privacy rules that govern most health data.
That has raised privacy concerns as some of the apps have come under scrutiny as powerful monitoring tools for employers and health insurers, which have aggressively pushed to gather more data about their workers’ lives than ever before under the banner of corporate wellness. Plus, it appears the data could be shared more broadly than many users recognize, as flagged by the Privacy International study.
Several period- and pregnancy-tracking apps have been called out for sharing health data with women’s employers and insurance companies, as well as for security flaws that reveal intimate information. As a result, many women say they’ve devised strategies to use the apps without revealing all of their most sensitive information. Among those strategies: using fake names, documenting only scattered details and even inputting incorrect data.
Users and experts alike worry that the data could be exposed in security breaches, or used by employers and insurance companies to discriminate against women by increasing their premiums or not offering them leadership positions.
Deborah C. Peel, a psychiatrist and founder of the nonprofit Patient Privacy Rights, said people expect that their health data will be protected by the same laws that protect their health information in a doctor’s office, but that many apps aren’t subject to the same rules.
“Most people would want to make their own decisions about what’s known about their sex life, about whether it’s shared or not,” said Peel. “Right now we have no ability to do that.”
Facebook, the world’s largest social media platform with 1.2 billion daily users, is asking users to trust it with more sensitive information than ever before. Last week, the company launched Facebook Dating in the United States, a matchmaking service that suggests potential love interests to users based on preferences, interests and Facebook activity.
At the same time, Facebook has come under fire in recent years for multiple scandals involving misinformation, fake accounts and breaches of trust. That includes the 2018 revelation from a whistleblower that Facebook had allowed political consultancy firm Cambridge Analytica to improperly access data from millions of users. In that case, the data was harvested through a third-party quiz app.
In a Facebook statement included in the report, the company said its terms of service prohibit app developers from sharing health or sensitive data, and that it has been in contact with Maya and MIA to notify them of a possible violation of those terms. Facebook also said that while it has systems in place to automatically detect and delete information like Social Security numbers and passwords from the information shared by apps, the company is “looking at ways to improve our system/products to detect and filter out more types of potentially sensitive data.”
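The kind of automated filtering Facebook describes can be sketched roughly as a scrubbing pass over incoming event payloads. This is an assumption about the general technique, not Facebook's actual system: it drops credential-like keys outright and redacts values that match a U.S. Social Security number pattern.

```python
import re

# Hypothetical scrubber -- the patterns and key list are illustrative choices,
# not Facebook's real detection rules.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
SENSITIVE_KEYS = {"password", "ssn", "secret"}

def scrub_event(payload):
    """Return a copy of an event payload with sensitive data removed."""
    clean = {}
    for key, value in payload.items():
        if key.lower() in SENSITIVE_KEYS:
            continue  # drop credential-like fields entirely
        if isinstance(value, str) and SSN_PATTERN.search(value):
            value = SSN_PATTERN.sub("[REDACTED]", value)  # mask SSN-shaped strings
        clean[key] = value
    return clean

event = {"event": "signup", "note": "ssn 123-45-6789 on file", "password": "hunter2"}
print(scrub_event(event))
# -> {'event': 'signup', 'note': 'ssn [REDACTED] on file'}
```

A limitation worth noting: pattern-based filters catch well-structured identifiers like SSNs but not free-form health details such as menstrual cycle dates or mood entries, which is consistent with the company saying it is still looking at ways to detect more types of sensitive data.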
Plackal Tech, which developed Maya, said in its statement to Privacy International that it would remove the Facebook Software Development Kit from a new version of its service. Mobapp Development, the company behind MIA, did not publish a response in the report and did not immediately comment.