Allie Funk is a senior research analyst for technology and democracy at Freedom House.

At first glance, Apple’s new coin-size tracking device, the $29 AirTag, seems like a convenient solution to an all-too-familiar problem: It leads you back to the items you always seem to lose. But in this case, convenience comes at a potentially dangerous cost. Dropped covertly into someone’s bag or placed under the seat of their car, the AirTag can also serve as an inexpensive form of “stalkerware,” used to digitally track an intimate partner, a journalist, a stranger or any other target of the stalker’s choosing — a frightening prospect in a country such as the United States, where more than 10 million people experience intimate-partner and gender-based violence each year.

AirTags are only the latest consumer tech product billed as offering convenience even as developers ignore, understate or underestimate the potential for harm, reflecting a larger problem of the digital age: We have welcomed into our lives an entire ecosystem of dubious tracking tools designed with minimal safeguards against abuse.

The fertility-tracking app Flo, for example, in January settled with the Federal Trade Commission over allegations it had shared intimate health data, including information about new pregnancies, with third parties such as Facebook and Google. The same month, Vice reported that the Salaat First app, which provides reminders for the Muslim call to prayer, was selling its users’ location data to a contractor that has worked with U.S. Immigration and Customs Enforcement.

The covid-19 pandemic has accelerated our march toward techno-solutionism: Intrusive contact-tracing apps monitor people’s daily movements, online-proctoring services surveil home-based students with webcams and eye-tracking software, and facial recognition systems analyze companies’ customers and workers alike.

The threats presented by this dizzying system of surveillance are staggering. Tools that promise to make the world safer, more efficient and more convenient instead undermine privacy, due process and other democratic freedoms. Certain forms of sophisticated technology can actually automate racial and religious discrimination because of their dependence on biased or inaccurate data and the inequitable contexts in which they’re used.

And there are few opportunities for recourse for those who are victimized. We have cultivated a Wild West surveillance market in which advertisers, data brokers, government authorities and other third parties repurpose the data these technologies collect to serve their commercial, political and law enforcement goals with little accountability.

Apple has made some effort to mitigate the AirTag’s risks: An iPhone will alert you if you’re traveling with an unfamiliar AirTag. But the nearly 130 million people in the United States who use Android devices, and the additional 15 percent of the population who don’t own a smartphone, will be left in the dark if someone attempts to track them. The gadget also beeps if it is separated from its owner for too long. But a report by The Post found that the alarm goes off only after three days, and that it emits a sound that can easily be muffled or missed.

Such location monitoring can increase the likelihood of violence, especially if a person is preparing to leave an abusive partner — one of the most dangerous periods for a survivor of abuse — by conducting job interviews, setting up separate bank accounts or scoping out new apartments.


The private sector must do better. Digital technology can enhance our lives without undermining human rights. Before introducing a product to consumers, companies should undertake rigorous due diligence to develop a clear view of its potential harms — and then build the necessary privacy and security protections into a tool’s architecture. Companies should begin by engaging civil society groups and the communities and people most likely to be negatively affected. In the case of AirTags, the Coalition Against Stalkerware and the National Network to End Domestic Violence would be useful organizations to consult. (Apple did not respond to The Post or Fast Company about whether it conferred with experts on intimate-partner violence in the development of AirTags.)

More fundamentally, we need federal data privacy legislation to limit how data is accessed, stored and used by companies and government actors alike. There is strong public support for this: In 2019, 81 percent of Americans said the potential risks of data collection by companies outweighed the benefits. Apple has capitalized on this sentiment by marketing itself as pro-privacy, adopting end-to-end encryption in its products and refusing Justice Department requests to unlock iPhones and weaken encryption. But the AirTag episode suggests that Apple may overlook privacy risks and other potential abuses when doing so serves its business interests.

Digital technology extends incredible opportunities for public safety and everyday convenience. But excitement over new products often outpaces efforts to identify and address privacy and other risks to human rights. We should continually ask whether this infrastructure of tracking technologies is necessary or desirable in a democratic society. Without robust safeguards, the benefits of these tools will be outweighed by the potential threats.
