An Obama administration working group has explored four possible approaches tech companies might use that would allow law enforcement to unlock encrypted communications — access that some tech firms say their systems are not set up to provide.
The group concluded that the solutions were “technically feasible,” but all had drawbacks as well.
The approaches were analyzed as part of a months-long government discussion about how to deal with the growing use of encryption in which no one but the user can see the information. Law enforcement officials have argued that armed with a warrant they should be able to obtain communications, such as e-mails and text messages, from companies in terrorism and criminal cases.
Senior officials do not intend to advance the solutions as “administration proposals” — or even want them shared outside the government, according to a draft memo obtained by The Washington Post.
They fear blowback.
“Any proposed solution almost certainly would quickly become a focal point for attacks,” said the unclassified memo, drafted this summer by officials from law enforcement, intelligence, diplomatic and economic agencies for eventual consideration by Cabinet members.
“Rather than sparking more discussion, government-proposed technical approaches would almost certainly be perceived as proposals to introduce ‘backdoors’ or vulnerabilities in technology products and services and increase tensions rather [than] build cooperation,” the memo said.
Indeed, National Security Council spokesman Mark Stroh stated in an e-mail that “these proposals are not being pursued.”
He said: “The United States government firmly supports the development and robust adoption of strong encryption, while acknowledging that use of encryption by terrorists and criminals to conceal and enable crimes and other malicious activity can pose serious challenges to public safety. The administration continues to welcome public discussion of this issue as we consider policy options.”
Instead of offering technical solutions, the working group drew up a set of principles to guide engagement with the private sector. They include: no bulk collection of information and no “golden keys” for the government to gain access to data.
The task force had solicited ideas from federal law enforcement and intelligence agencies. “We’re not promoting those as the way to go,” said one senior official, who, like others, spoke on the condition of anonymity because of the subject’s sensitivity. “We’re just saying these are things that could be done.”
The first potential solution called for providers to add a physical, encrypted port to their devices. Companies would maintain a separate set of keys to unlock devices, using that port only if law enforcement had physical access to a device and obtained a court order to compel the company’s assistance.
The necessary hardware changes could be costly for U.S. manufacturers, but the physical access required by this method could limit some of the cybersecurity risks, the memo said.
The second approach would exploit companies’ automatic software updates. Under a court order, the company could insert spyware onto targeted customers’ phones or tablets — essentially hacking the device. However, the memo warned, this could “call into question the trustworthiness of established software update channels” and might lead some users to opt out of updates, which would eventually leave their devices less secure.
A third idea described splitting up encryption keys, a possibility floated by National Security Agency director Michael S. Rogers earlier this year. That would require companies to create a way to unlock encrypted content, but divide the key into several pieces — to be combined only under court order. Exactly how this would work remains unclear, but the memo warned that such a system would be “complex to implement and maintain.”
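The memo does not specify how the key would be divided, but the basic idea can be illustrated with the simplest possible all-or-nothing split: XOR-based secret splitting, in which every share is required to reconstruct the key. (Real proposals of this kind would more likely use threshold schemes such as Shamir’s secret sharing, which tolerate missing shares; the function names below are illustrative, not from any actual proposal.)

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares; ALL n shares are needed to recover it.

    The first n-1 shares are random bytes; the last share is the key
    XOR-ed with every other share, so XOR-ing all n shares together
    yields the original key. Any subset of fewer than n shares reveals
    nothing about the key.
    """
    assert n >= 2
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares together to reconstruct the key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

# Example: a 16-byte key split among three custodians, recombined
# only when all three pieces are produced (e.g., under court order).
key = secrets.token_bytes(16)
shares = split_key(key, 3)
recovered = combine_shares(shares)
```

Even this toy version hints at the memo’s warning: the company must securely generate, distribute, store and audit the shares, and build a recombination process that itself becomes a target — hence “complex to implement and maintain.”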
Under the final approach, which officials called a “forced backup,” companies under court order would be required to upload data stored on an encrypted device to an unencrypted location. But this might put significant constraints on companies, the memo noted, saying it would require that they design new backup channels or “substantially” modify existing systems.
All four approaches amount to what most cryptography experts call a “backdoor” because they would require developers to alter their systems by adding a surreptitious mechanism for accessing encrypted content, according to Joseph Lorenzo Hall, chief technologist at the Center for Democracy & Technology.
Law enforcement officials have rejected the “backdoor” terminology. “We aren’t seeking a backdoor approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law,” FBI Director James B. Comey said at the Brookings Institution in October.
Technologists have said such approaches weaken the security of encryption by adding layers of complexity that might hide bugs and creating new potential targets for hackers.
Federal law enforcement officials themselves have noted the challenges these solutions pose. A spyware implant delivered through a software update, for instance, is good only until the next update, and it works only on that one device, they say. Nor, they note, are such solutions useful on a broad scale.
“It’s not the same as being able to go to a provider and saying, ‘here’s a warrant,’ ” one senior law enforcement official said in an interview.
The forced backup also has drawbacks, law enforcement officials said. The user has to be connected to Wi-Fi, and such backups drain battery power more quickly, which the user might notice.
In general, creating an “aftermarket solution” instead of designing a solution into the platform from the start “brings in additional vulnerabilities” that could be exploited, the law enforcement official acknowledged.
These are some of the reasons why federal officials say they want the companies themselves to craft solutions based on their own systems.
In a hearing this month, Comey argued that some major Internet companies maintain keys to unlock users’ data so they can scan the content and send related ads to users. Such a system, he suggested, is not “fundamentally insecure.”
But privacy advocates say there is a difference between those services and the level of security at the heart of the encryption debate.
“The simple fact is that data stored in the cloud is unquestionably less secure and more vulnerable to a Sony Pictures-style attack,” said Kevin Bankston, director of New America’s Open Technology Institute, referring to the hack last year of the Hollywood movie studio’s computer network.
Even if law enforcement is able to persuade major tech companies to create ways for investigators to obtain decrypted information from devices, users can still secure their communications by relying on encrypted apps, the memo said.
Also, a number of encryption solutions are built by groups of open-source developers, who make the software available for free on the Internet. The open-source nature of the code makes it harder to hide a backdoor. And because the developers are often dispersed among different countries and volunteers who are not working for any company, it is impractical for law enforcement to serve an order on one that’s enforceable on all.
“[T]hese challenges mean that inaccessible encryption will always be available to malicious actors,” the memo said.