It was a compelling set of facts for the government: a terrorist attack in California that killed 14 people, an iPhone possibly containing crucial evidence that could not be unlocked, and a warrant to search the phone.
But the phone’s contents are encrypted, and Apple, according to the Justice Department, has refused to help the FBI find a way to unlock the device. So this week, the government got a court to order Apple to help.
In a message posted online that verged on the apocalyptic, Apple chief executive Tim Cook accused the government of asking it to build a “backdoor” into an iPhone and to design software that amounts to “hack[ing] our own users.”
Apple says that from the hours after the Dec. 2 attack until just days ago, it worked with the FBI to give the agency what data it had — material backed up from the phone into the company’s iCloud service, for instance. But Apple did not want to do anything that it said would weaken the device’s security, such as creating software that would effectively let officials try to crack the phone’s password.
In the escalating fight over encryption, the U.S. government has moved to force a showdown that has been years in the making. By Wednesday morning, the Justice Department and the Silicon Valley giant had torqued up the encryption debate, raising the stakes for those who support widespread strong encryption to protect privacy and security and for those who think that courts should be able to compel tech firms to accommodate law enforcement’s need to thwart criminals and terrorist attacks.
“The government wants to lay down a marker here that companies do have to provide assistance when they can,” said Timothy Edgar, senior fellow at Brown University’s Watson Institute for International and Public Affairs and a former privacy officer with the Office of the Director of National Intelligence. “And Apple is saying, ‘We don’t want to have to hack our own customers.’ The outcome of the case is going to be hugely important for the balance between privacy and security.”
Anyone watching the encryption debate over the past year and a half knew that this day would come. “This is the ideal case for the government to challenge industry in the encryption debate,” said Michael Sussmann, a former Justice Department official and a partner at the Perkins Coie law firm. “The facts are sympathetic to the government and present the starkest example of their need to gain access to encrypted data to protect the American public.”
The device at the center of the debate, an iPhone 5C, was used by Syed Rizwan Farook, who with his wife, Tashfeen Malik, opened fire at a holiday gathering at the Inland Regional Center in San Bernardino County. The couple, who pledged loyalty to the Islamic State terrorist group, died a few hours later in a shootout with police.
What the government wants Apple to do is design software to install on the phone that would block it from automatically wiping data after 10 unsuccessful tries at entering a password. That would enable the FBI to “brute force” the phone’s password — attempting tens of millions of combinations without risking deletion of the data.
The government also wants the software to permit the FBI to send passwords to the phone electronically, rather than having someone type them in manually, and to prevent the phone from adding delays between password attempts.
The request, the Justice Department said, does not require Apple to redesign its products, disable the phone’s encryption or open its contents. The software, it said, would operate only on that one phone.
Technical experts said that all of that is possible. The question: Is it desirable?
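To see why all three changes matter, a rough back-of-the-envelope calculation helps. With the auto-wipe disabled, the delays removed and electronic entry allowed, cracking the passcode reduces to trying every possible combination. The Python sketch below is purely illustrative, not Apple’s or the FBI’s code: the rate of roughly 12.5 guesses per second is an assumption based on the approximately 80 milliseconds of hardware-paced key derivation Apple has described per attempt, and the passcode formats are hypothetical.

```python
# Hypothetical back-of-the-envelope sketch (not Apple or FBI code).
# It shows why removing the 10-try wipe and the enforced delays
# turns guessing the passcode into a simple counting problem.

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passcodes of a given length."""
    return alphabet_size ** length

def worst_case_hours(keyspace_size: int, attempts_per_second: float) -> float:
    """Worst-case time to try every passcode at a given attempt rate."""
    return keyspace_size / attempts_per_second / 3600

# Assumption: ~12.5 guesses/second, the ceiling implied by roughly
# 80 ms of hardware-paced key derivation per attempt.
RATE = 12.5

print(f"4-digit PIN: {worst_case_hours(keyspace(10, 4), RATE):.2f} hours")
# ~0.22 hours (about 13 minutes)

print(f"6-digit PIN: {worst_case_hours(keyspace(10, 6), RATE):.1f} hours")
# ~22 hours

print(f"6-char lowercase alphanumeric: "
      f"{worst_case_hours(keyspace(36, 6), RATE) / 24:,.0f} days")
# ~2,016 days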
In a note titled “A Message to Our Customers” posted on Apple’s website, Cook said, “Make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor.” Although government officials say the software would be designed just for one phone, he said that “once created, the technique could be used over and over again, on any number of devices.”
Cook said it would set a dangerous precedent. “The implications of the government’s demands are chilling.” If the government has its way, he said, it could “demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
But New York City Police Commissioner William J. Bratton said the government’s demands are reasonable and justified, especially in a case that has ties to the Islamic State, also known as ISIL. “No device, no car and no apartment should be beyond the reach of a court-ordered search warrant,” he said. “As the threats from ISIL become more divergent and complex, we cannot give those seeking to harm us additional tools to keep their activity secret.”
The polarization of the debate can be seen in the use of the term “backdoor,” which Cook accused the government of seeking. The term itself is imprecise and can be understood to mean anything intended to create a way around encryption — or, more broadly, anything that would weaken security. The government rejects the term as a description of what it is asking for.
The White House on Wednesday pushed back against Apple and its framing of the argument. “This case doesn’t require Apple . . . to create a new backdoor,” press secretary Josh Earnest said. “It’s a very specific request that the Department of Justice has made, and a judge agreed with them.”
Reaction from Capitol Hill was swift and divided.
“Court orders are not optional, and Apple should comply,” said Sen. Richard Burr (R-N.C.), chairman of the Senate Intelligence Committee.
A colleague on the committee, Sen. Ron Wyden (D-Ore.), said that “companies should comply with warrants to the extent they are able to do so, but no company should be forced to deliberately weaken its products.”
Some legal analysts said the order issued by a federal magistrate judge in Riverside, Calif., opens a Pandora’s box of unknowns. “If a court has the power to order a third party like Apple to devise software that it does not already possess [to aid in surveillance], what can’t a court order a company to do?” said Stephen Vladeck, a law professor at American University. “There’s a real search for a limiting principle here that we haven’t identified.”
Apple has five business days to respond to Tuesday’s order, which it has vowed to challenge.
Mark Berman and Greg Miller contributed to this report.