A sign outside the National Security Agency campus in Fort Meade, Md. (Patrick Semansky/Associated Press)

Each week, In Theory takes on a big idea in the news and explores it from a range of perspectives. This week we’re talking about Internet encryption. Need a primer? Catch up here.

Matt Blaze is an associate professor in the Computer and Information Science Department at the University of Pennsylvania, where he studies secure systems, cryptography and the impact of technology on public policy. He can be found at his website and on Twitter.

Cryptography, when used properly, is a critically important tool for securing data on the notoriously vulnerable networks that we rely on for almost every aspect of daily life. But law enforcement agents have expressed concern that cryptography might sometimes work too well, thwarting investigators from extracting useful evidence from wiretaps, smartphones and computers. They call for encryption systems to be designed with special backdoor access features that would allow the government to decrypt data when it is needed for an investigation.

The cryptography debate is often portrayed as a zero-sum game pitting law enforcement against privacy — our individual right to be free from unwarranted intrusion by the government. Put this way, reasonable people might disagree on where balances should be struck and lines should be drawn, and we rely on the political process to find compromises, however imperfect, that we can all live with. But lost in this framing is the reality that cryptography and security are not just political issues, but also deeply difficult technical ones.

[Other perspectives: 5 things tech companies don’t understand about encryption]

We learned this the hard way. Just over two decades ago, in 1993, the government proposed a new standard encryption system called “key escrow,” in which a National Security Agency-designed encryption device, called the “Clipper chip,” could be used in computers and other devices that needed to encrypt data. Clipper incorporated an encryption algorithm that was said to be stronger than the current standard. But there was a catch: Clipper-encoded data would include a copy of the key used to encrypt it, itself encrypted with a key held “in escrow” by the government. If Clipper-encrypted data were encountered during an investigation, the key could be taken out of escrow and the data decrypted.
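The escrow arrangement described above can be sketched in a few lines. This is a toy illustration of the concept only, not Clipper's actual cipher (Skipjack) or its real access field; the function names and the hash-based stream cipher here are invented for the example and are not secure:

```python
# Toy illustration of the "key escrow" idea: every message carries its
# own session key, encrypted under a key held in escrow by the
# government. NOT real cryptography -- the stream cipher below is a
# stand-in built from a hash function, purely for demonstration.
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-based keystream (toy cipher, symmetric)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

ESCROW_KEY = os.urandom(32)  # the key "in escrow" with the government

def encrypt_with_escrow(plaintext: bytes) -> tuple[bytes, bytes]:
    session_key = os.urandom(32)
    ciphertext = keystream_xor(session_key, plaintext)
    # The access field: the session key, itself encrypted under the
    # escrow key and attached to the message.
    access_field = keystream_xor(ESCROW_KEY, session_key)
    return ciphertext, access_field

def government_decrypt(ciphertext: bytes, access_field: bytes) -> bytes:
    # An investigator with the escrowed key first recovers the session
    # key from the access field, then decrypts the data with it.
    session_key = keystream_xor(ESCROW_KEY, access_field)
    return keystream_xor(session_key, ciphertext)

ct, field = encrypt_with_escrow(b"attack at dawn")
recovered = government_decrypt(ct, field)
```

The point of the sketch is structural: anyone holding the escrow key can read any message, so the escrow key itself becomes a single, enormously valuable target.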

Clipper was, to say the least, controversial. It ignited a firestorm of debate, framed, just as it often is now, on balancing national security against individual liberty. But it turns out that this was beside the point.

Although Clipper was designed at the NSA by some of the best cryptographers in the world, it had a number of undetected technical flaws, one of which made it possible for a rogue user to bypass the government access feature while still making use of the encryption algorithm. I discovered (and published, to the agency’s credit, without objection from the NSA) a practical way to do this a year after the first products incorporating the chip hit the market. The whole scheme is today remembered as an expensive, embarrassing fiasco.
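The published flaw turned on Clipper's law-enforcement access field, which the chip validated with only a 16-bit checksum — short enough that a rogue user could, by trial and error, forge a field that passed the check but contained no usable escrowed key. A simplified sketch of why a 16-bit check is forgeable (the checksum function here is a stand-in, not Clipper's actual one):

```python
# Why a 16-bit integrity check fails: with only 2**16 possible values,
# random trial finds a passing forgery in ~65,536 attempts on average.
# Illustrative only; Clipper's real checksum computation is simplified away.
import hashlib
import os

def checksum16(field: bytes) -> int:
    """Stand-in for the chip's internal 16-bit check on the access field."""
    return int.from_bytes(hashlib.sha256(field).digest()[:2], "big")

# The checksum value a legitimate access field would carry.
target = checksum16(os.urandom(16))

tries = 0
while True:
    tries += 1
    bogus = os.urandom(16)  # random field carrying no real escrowed key
    if checksum16(bogus) == target:
        break
# 'bogus' now passes the check, yet decrypting it with the escrow key
# yields garbage -- the government access feature is silently defeated.
```

A search space this small runs in well under a second on ordinary hardware, which is what made the bypass practical rather than merely theoretical.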

Clipper’s failure starkly demonstrated that cryptographic backdoors must be understood first as a technical problem, not just the political one reflected in the debate. Clipper failed not because the NSA was incompetent, but because designing a system with a backdoor was — and still is — fundamentally in conflict with basic security principles.

Despite many advances in computer science, building a secure access feature is actually harder now than it was when Clipper failed in the 1990s. This is partly because we now rely on encryption integrated deeply into systems that are more complex, and fragile, than ever.

It’s perhaps tempting to hope that technologists will find a way to build law-enforcement backdoors that don’t unduly compromise security. We built the Internet and the smartphone, after all, and surely this can’t be much harder than that! Unfortunately, it really is. For all of computer science’s great advances, we simply don’t know how to build secure, reliable software of this complexity. The problem is as old as software itself, and is one of the reasons we need cryptography in the first place.

[We think encryption allows terrorists to hide. It doesn’t.]

There is overwhelming consensus in the technical community that even ostensibly “secure” backdoors put the systems into which they are incorporated at increased risk of outside attack and compromise. At best, a backdoor greatly increases the “attack surface” of the system and creates rich new opportunities for unauthorized exploitation of hidden (and inevitable) software bugs, to say nothing of the human-scale processes that manage the access.

The stakes are higher than ever as we rely on networked systems to support almost every aspect of the economy and our critical infrastructure. It is not an exaggeration to characterize the state of software security as a national crisis. The regularity of large-scale data breaches such as the Office of Personnel Management (OPM) attack, in which the security clearance records of millions of government employees were stolen, is ample evidence that our infrastructure is already dangerously fragile, even without the burden of complex requirements to accommodate future surveillance.

Reliable, robust security — which means cryptography unencumbered by an extra key “under the doormat” — is not just a privacy nicety, but also a matter of national security and public safety. New technology certainly can create new challenges for law enforcement personnel, but they should be careful about the solutions they wish for. Our adversaries may want exactly the same thing.

Explore these other perspectives:

Nicholas Weaver: We think encryption allows terrorists to hide. It doesn’t.

Cyrus Vance Jr.: 5 things tech companies don’t understand about encryption

Mark Wallace: Our fight against the Islamic State starts online

Rita Katz: Jihadists are making their plans public. Why hasn’t the FBI caught on?