A debate over data security is brewing in Washington. On one side, law enforcement officials warn that new deployments of encryption, the technology that protects our communications and stored data from prying eyes, are leaving the government without the insight it needs to track down criminals and terrorists. On the other, privacy advocates and tech companies say efforts to build ways for law enforcement to access protected communications will leave everyone less secure.

But for many longtime techies, this isn't anything new — it's a repeat of the "Crypto Wars" of the 1990s. In fact, former Clinton tech policy official Michael Nelson said in a recent op-ed published by The Hill that it is giving him a bad case of "digital deja vu."

Nelson, who now works on public policy at CloudFlare, was the Clinton administration's point person on the Clipper chip — a government-backed piece of technology from the early 1990s designed to give authorities a way to wiretap encrypted phone calls.

Here's how he described the thinking behind it at a New York City Bar Association event in 1995:

We set about developing Clipper because we wanted to develop strong cryptography that would not undermine the ability of law enforcement to do its job. We were really faced with three choices. First choice was to adopt relatively weak cryptography, use that throughout the government, knowing that if we needed to we could break it and we could do a wiretap. Second choice was to adopt very strong cryptography, use that and just give up on the ability to do wiretaps. Clipper was the third choice. It was a technology that gave people very strong cryptography that would protect communications and files against unauthorized access, but in the event that law enforcement needed to do a wiretap, that would be possible, and that's why we chose to go that route.
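The core idea Nelson describes — strong encryption for users, plus a separate field that lets authorities holding an escrowed key recover the conversation — can be sketched in a few lines of Python. This is a toy illustration only: XOR with a random pad is not real cryptography, and the names here are invented for clarity, not taken from the actual Skipjack cipher or LEAF format that Clipper used.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR each byte with the key. NOT real cryptography."""
    stretched = key * (len(data) // len(key) + 1)
    return bytes(b ^ k for b, k in zip(data, stretched))

# Each device has a unit key; a copy is deposited with the escrow agent.
unit_key = secrets.token_bytes(16)
escrow_copy = unit_key  # the government's escrowed copy

def encrypt_call(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a fresh session key, and attach that session key
    wrapped under the unit key (a toy 'law enforcement access field')."""
    session_key = secrets.token_bytes(16)
    ciphertext = xor(plaintext, session_key)
    access_field = xor(session_key, unit_key)
    return ciphertext, access_field

def wiretap(ciphertext: bytes, access_field: bytes) -> bytes:
    """With the escrowed unit key, recover the session key and decrypt."""
    session_key = xor(access_field, escrow_copy)
    return xor(ciphertext, session_key)

ct, field = encrypt_call(b"meet at noon")
assert wiretap(ct, field) == b"meet at noon"
```

Even in this toy, the objection technologists raised is visible: anyone who obtains `escrow_copy`, by theft, leak or coercion, can decrypt every call the device ever makes.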

Technologists sounded alarm bells at the time, warning that such access would undermine the security of the whole system — after all, if the government has a secret backdoor into a technical system, what's to stop a malicious hacker from finding it? And when experts had a chance to review the device, they quickly discovered technical flaws that undermined its credibility — contributing to the Clipper chip's ultimate demise.

"After years of arguments and multiple policy proposals, we learned that intentionally weakening encryption is a bad idea," Nelson explained in The Hill.

But two decades later, the same debate is being fought again. This time, it has been touched off by the expansion of end-to-end encryption, a version of the technology so secure that only the sender and recipient can unlock a message. Big tech companies such as Apple have rolled it out in the wake of former NSA contractor Edward Snowden's revelations about the scope of the government's spying capabilities.
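The "only the sender and recipient can unlock it" property rests on key agreement: the two endpoints derive a shared secret that never travels over the wire, so no intermediary — carrier, service provider or eavesdropper — ever holds the key. A minimal Diffie-Hellman sketch in Python shows the math (the parameters here are toy-sized and insecure; real end-to-end systems use vetted protocols with far larger keys):

```python
import secrets

# Toy Diffie-Hellman parameters: a small Mersenne prime and generator.
# Far too small for real use; for illustration of the math only.
P = 2**127 - 1
G = 3

# Each side keeps a private exponent and publishes only G^x mod P.
alice_secret = secrets.randbelow(P - 2) + 2
bob_secret = secrets.randbelow(P - 2) + 2
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines its own secret with the other's public value.
# An eavesdropper sees only the two public values, from which
# recovering the shared key is computationally hard (at real sizes).
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key  # only the two endpoints hold this key
```

Because the shared key exists only on the two endpoints, there is no central copy for a company to hand over — which is precisely what makes this design attractive to users and frustrating to investigators.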

Yet the debate today is on an even grander scale: The Clipper chip was about law enforcement access to encrypted voice calls, but now there are much richer troves of personal information that rely on encryption to stay private.

For tech companies, more and stronger encryption is an important way to regain customer confidence shaken by reports of the government's ability to access users' data. And the general consensus among cryptography experts is that the sort of access some law enforcement leaders are pushing for isn't possible without potentially risking the security of all users.

In the eyes of some law enforcement officials, these expanded customer protections give a potential shield to bad guys that could put public safety at risk. Earlier this week, the New York Times reported that Apple's encryption expansion had foiled a law enforcement request for access to real-time text messages between drug suspects over the summer.

As a veteran of the earlier "Crypto Wars" clashes, Nelson hopes the Clipper chip will serve as a cautionary tale for policymakers tackling the issue today, but he doesn't sound very optimistic:

Unfortunately, the people advocating for encryption backdoors today seem to have learned few of the lessons of the Clipper Chip story. They are proposing to limit innovation and force IT companies to rely on government-approved solutions. They are not explaining how backdoors or “golden keys” could work, or why anyone should trust them. And they provide no assurance that foreign countries would not abuse their access to encryption keys.

The Obama administration is still weighing a policy response to the debate, but details on technical solutions it might propose remain scant.