FBI Director Christopher A. Wray. (Jabin Botsford/The Washington Post)
Robyn Greene is the senior counsel and government affairs lead at New America's Open Technology Institute.

I had just left a meeting at the Justice Department and stopped at a nearby coffee shop with colleagues for a debrief and to gab. As I was gathering my things to leave, I reached for my phone to make a call, only to realize: Someone at the coffee shop had taken my phone. By the time I knew my phone was gone, the thief had turned it off, so I couldn’t use the “Find My iPhone” feature to locate or erase my device.

At first I was angry. Then, I panicked. Everything was on my phone. If you want to impersonate me, hack me, physically harm me, or scam someone close to me, all you need to do is access the data on my phone. But shortly after the anger and panic took hold, a sense of deep relief washed over me because my device was protected by full disk encryption and a long passcode. All of that information was safe. That’s worth remembering in a moment when law enforcement is actively working to undermine the strength of our encryption, not least of all because doing so might put the most vulnerable among us at risk.

Encryption is a method of scrambling information to ensure that it is unintelligible to anyone who does not have the proper key — most often a passcode — to unscramble it. There are two main kinds of encryption: encryption of data at rest, which is how our phones and other mobile devices are secured; and encryption of data in motion, which secures things like our phone calls, messages as they are being sent and financial transactions. When our phones are encrypted, it not only protects the sensitive information that we store on them, it also prevents unauthorized access to any accounts that are connected to our devices, like personal and work emails, bank and credit card accounts, and other apps. Before Apple rolled out encryption by default on iPhones, smartphone theft was rampant, leaving our personal information vulnerable to criminals. But immediately after Apple instituted this built-in security feature, phone thefts significantly decreased, since encryption rendered stolen phones as valuable as paperweights.
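The key-and-scrambling idea described above can be sketched in a few lines of Python. This is a toy illustration only, not real cryptography: it stretches a passcode into a key with PBKDF2 (which real systems do use, alongside hardware-backed keys and ciphers like AES) and then "scrambles" data with a simple XOR, which is emphatically not a secure cipher. The passcodes, salt, and data are invented for the example.

```python
import hashlib

def derive_key(passcode: str, salt: bytes, length: int) -> bytes:
    # Stretch a short passcode into a longer key. Real devices pair this
    # with hardware-bound secrets and far costlier derivation settings.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000,
                               dklen=length)

def xor_scramble(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" -- illustrative only, NOT secure encryption.
    return bytes(b ^ k for b, k in zip(data, key))

salt = b"device-salt"
secret = b"contacts, email tokens, photos"

key = derive_key("123456", salt, len(secret))
ciphertext = xor_scramble(secret, key)   # unintelligible without the key

recovered = xor_scramble(ciphertext, key)                              # right passcode
garbage = xor_scramble(ciphertext, derive_key("654321", salt, len(secret)))  # wrong one

print(recovered == secret)  # True: the proper key unscrambles the data
print(garbage == secret)    # False: a wrong passcode yields gibberish
```

The point of the sketch is the asymmetry: with the passcode, unscrambling is trivial; without it, the ciphertext is just noise.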

Despite all of the revealing and personal information on my phone, the risks others face from weakened encryption are far worse than any that I am ever likely to face. I’m not a person of color, an immigrant, a Muslim or a member of another over-surveilled or over-policed community. Encryption ensures people in those communities can work, think, learn and socialize freely, without the threat of being watched by the government. I’m not in an abusive relationship where protecting communications with device encryption may be a literal lifesaver: Encryption protects against any attempts by abusive partners to access their victim’s call records, web browsing history or other information stored on or accessible through the phone that, if discovered, could lead to an attack. I also don’t handle or store sensitive materials like trade secrets or sensitive communications that, but for encryption, could be stolen by hackers or spies engaging in corporate or nation-state espionage.

But the U.S. government, now joined by its surveillance partners in a cohort dubbed the Five Eyes (with the U.K., Canada, New Zealand and Australia), persists in its demands that encrypted devices should always be capable of being unlocked by law enforcement. This capability, which the FBI and Justice Department describe as “responsible encryption,” is referred to by most security experts as an “encryption back door” because it would create a weakness in the security of the device.

What these governments are asking for would probably require tech companies either to weaken the encryption algorithm itself so that law enforcement can break it, or to weaken how that algorithm is implemented so that law enforcement can bypass it. The first is equivalent to making a lock easy to pick. The second is like installing a strong lock but leaving the screws loose. Both are dangerous ways to try to protect something.
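The "easy to pick" lock can be made concrete with a toy sketch. One way a deliberately weakened algorithm could work is by shrinking the effective key space; here that is simulated (with an invented salt, PIN, and XOR stand-in for a cipher, none of it real cryptography) by deriving the key from a four-digit PIN with a single cheap hash pass, so an attacker can simply try all 10,000 possibilities.

```python
import hashlib

def weak_key(pin: str, salt: bytes) -> bytes:
    # Deliberately tiny key space (a 4-digit PIN) and a single cheap
    # derivation pass -- a stand-in for a "weakened" algorithm.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1)

def xor_scramble(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" -- illustrative only, NOT secure encryption.
    return bytes(b ^ k for b, k in zip(data, key))

salt = b"device-salt"
plaintext = b"PLAINTEXT-HEADER: private data"
ciphertext = xor_scramble(plaintext, weak_key("4831", salt))

def brute_force(ciphertext: bytes, salt: bytes):
    # The attacker tries every PIN until a known header appears.
    for n in range(10_000):
        pin = f"{n:04d}"
        trial = xor_scramble(ciphertext, weak_key(pin, salt))
        if trial.startswith(b"PLAINTEXT-HEADER"):
            return pin
    return None

print(brute_force(ciphertext, salt))  # recovers the PIN ("4831") in moments
```

A lock that any laptop can pick in seconds protects no one; the thief at the coffee shop gets the same "lawful access" the government does.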

FBI Director Christopher A. Wray recently told an audience that there must be a way that cryptographers hadn’t thought of yet to securely guarantee that law enforcement could unlock encrypted devices. He said, “We put a man on the moon,” in trying to make the point that if mathematicians and scientists could do that, surely they could find a way to build a secure encryption back door. But after decades of research and debate, the experts overwhelmingly agree: trying to build a secure back door would be like asking NASA to safely land a human on the sun. It’s not possible.

Device encryption is already almost impossible to implement perfectly. Indeed, companies such as Cellebrite and Grayshift sell tools capable of hacking into iPhones, which law enforcement organizations regularly use. Despite these existing, unintentional vulnerabilities, current phone encryption is still very effective at securing our information. However, intentionally creating an additional means of access not only increases the complexity of building an already complex system, introducing more opportunities for error; it also establishes a guaranteed way to access devices that criminals and law enforcement alike could exploit, potentially exposing all the data on those devices. Weakened security is weakened security — for anyone, in any situation.

So, yes, my phone was stolen. That was unfortunate. But because it was protected by encryption, the biggest downside was the cost of replacing it. If it hadn’t been encrypted, or if the government had already gotten its way and weakened encryption on mobile devices, the consequences would have been much worse. And for many others, those consequences could be dire.