When someone gets an abortion, they may decide not to share information with friends and family members. But chances are their smartphone knows.
There is precedent for prosecutors using such digital evidence, and privacy advocates say data collection could become a major liability for people seeking abortions in secret. For many women, the Supreme Court’s ruling overturning Roe v. Wade puts Americans’ lack of digital privacy in sharp relief: How can people protect information about their reproductive health when popular apps and websites collect and share clues about it thousands of times a day?
Following the leak of a draft of the ruling on Dobbs v. Jackson Women’s Health Organization, Democratic legislators introduced a bill called the My Body, My Data Act, which would add some federal protections for reproductive health data. It is unlikely to pass without support from Republicans.
“It is absolutely something to be concerned about — and something to learn about, hopefully before being in a crisis mode, where learning on the fly might be more difficult,” said Cynthia Conti-Cook, a technology fellow at the Ford Foundation.
Privacy advocates responded to the ruling by calling on tech companies to delete information related to reproductive health, or just collect less of it to start with. But that data has value for companies — much of our digital economy is built on companies tracking consumers to figure out how to sell to them. The data may change hands several times or seep into a broader marketplace run by data sellers. Such brokers can amass huge collections of information.
That data is an easy target for subpoenas and court orders, and many tech companies do not give straight answers about what information they would be willing to hand over. Google, for one, reports that it received more than 40,000 subpoenas and search warrants in the United States in the first half of 2021.
Police and private citizens alike could buy data and use it to investigate suspected abortions. Phone location information has been used by activist groups to target ads at people in abortion clinics to try to dissuade them.
Crunching all that data isn’t easy, and law enforcement agencies have plenty of “lower-hanging fruit” to pursue, said Alan Butler, the executive director and president of the Electronic Privacy Information Center. Those more traditional methods include checking credit card records, collecting data from cellphone towers, and talking to friends and family members.
Just the possibility of using phone surveillance to enforce abortion bans will hang over the heads of people seeking abortions or helping others get them, said Nikolas Guggenberger, the executive director at the Yale Information Society Project. “People want to be on the safe side, so even if the law doesn’t apply to what they’re doing, it has a chilling effect,” he said.
A number of groups have published citizen guides to avoiding surveillance while seeking an abortion or reproductive health care. Those groups include the Digital Defense Fund, the Repro Legal Helpline and the Electronic Frontier Foundation.
Here are three potential contributors to the data trail on people seeking abortions — and how they might be used.
Location data

Phones can collect precise information about your whereabouts — right down to the building — to power maps and other services. Sometimes, though, the fine print in app privacy policies gives companies the right to sell that information to other companies, which can make it available to advertisers or whoever else is willing to pay for it.
Vice’s Motherboard blog reported that for $160, it bought a week’s worth of data from a company called SafeGraph showing where people who visited more than 600 Planned Parenthood clinics came from and where they went afterward.
This kind of data could be used, for example, to identify clinics that provide abortions to people from out of state in places where that is illegal.
SafeGraph CEO Auren Hoffman told The Washington Post that his company was discussing whether to stop offering aggregated data on physical traffic to abortion providers. SafeGraph and companies like it do not usually sell the location information linked to names or phone numbers, although the company has come under fire from privacy advocates before and has changed some of its practices to make it harder to tie data to specific people.
“You can find someone to say they can de-anonymize the data, but if it could be done, someone would have written a paper by now,” Hoffman said.
But privacy watchdogs say you can learn a lot by connecting the dots on multiple places a single person has visited. For example, last year, a Catholic blog obtained location information originally generated by the dating app Grindr to out a priest as gay. Those behind the blog were able to infer that a person at a church-related location was also visiting gay bars.
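The Grindr example above illustrates a simple intersection attack: each location ping is anonymous on its own, but the set of places a single device visits can be unique to one person. A minimal sketch of the idea, using invented device IDs and place labels (real broker data would contain timestamped coordinates, not friendly names):

```python
from collections import defaultdict

# Hypothetical "anonymized" pings, as a data broker might sell them:
# (device_id, place) pairs with no names attached.
pings = [
    ("dev-17", "church_office"),
    ("dev-17", "gay_bar_a"),
    ("dev-17", "gay_bar_b"),
    ("dev-42", "gay_bar_a"),
    ("dev-99", "church_office"),
]

# Places an investigator already associates with the target
# from public context (e.g., a workplace).
known_places = {"church_office", "gay_bar_a", "gay_bar_b"}

# Group the pings by device.
visits = defaultdict(set)
for device, place in pings:
    visits[device].add(place)

# Any device seen at every known place is effectively re-identified.
matches = [d for d, places in visits.items() if known_places <= places]
print(matches)  # prints ['dev-17']
```

The point is that no single ping reveals anything; the intersection of visited locations is what narrows the dataset to one device, and from there to one person.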
Apple and Android phones offer settings to turn off location services for individual apps — or entirely for the phone. But doing so might keep certain features from working, such as maps and transportation apps.
Search and chat histories
Searching for information about clinics and medications can leave a trail of records with Google, which in some cases saves queries to a user’s profile.
In 2017, prosecutors used Internet searches for abortion drugs as evidence in a Mississippi woman’s trial for the death of her fetus. A grand jury ultimately decided not to pursue charges, according to National Advocates for Pregnant Women. And last year, the Supreme Court of Wisconsin decided that detectives did not violate the rights of the convicted murderer George Burch when, operating without a warrant, they accessed downloaded data from his phone, including his Internet search history.
Private messages also can become evidence. In 2015, text messages about getting an abortion helped convict a woman of child neglect and feticide.
A 2020 report by Upturn, a nonprofit organization focused on technology and justice, found that law enforcement agencies use “mobile device forensic tools” — which can give them access to Internet histories as well as to unencrypted emails and texts — when investigating matters as varied as marijuana possession and graffiti.
People can take some steps to keep their search and chat histories private. The Ford Foundation’s Conti-Cook said people do not have to volunteer their phones when police ask, and they can opt for encrypted messaging apps and a virtual private network, or VPN, to obscure their identities while conducting searches.
Reproductive health apps
Millions of people use apps to help track their menstrual cycles, logging and storing intimate data about their reproductive health. Because that data can reveal when periods, ovulation and pregnancies start and stop, it could become evidence in states where abortion is criminalized.
There is evidence that these companies play fast and loose with privacy. In 2019, the period tracker Ovia got pushback for sharing aggregate data on some users’ family planning with their employers.
Last year, the Federal Trade Commission settled with the period-tracking app Flo after the app promised to keep users’ data private but then shared it with marketing firms including Facebook and Google.
A recent investigation by Consumer Reports found shortcomings in the way five popular period-tracking apps handle sensitive user data, including sending it to third parties for targeted advertising.
How are the apps allowed to share such personal data? Our interactions with health-care providers are covered by a federal privacy law called the Health Insurance Portability and Accountability Act, or HIPAA. However, period-tracking apps aren’t defined as covered entities, so they can legally share data.
Joseph Menn contributed to this report.