The Washington Post
Democracy Dies in Darkness

Opinion | Apple should not be forced by the government to decrypt users’ data

The Apple iPhone 5C. (Justin Sullivan/Getty Images)

UNTIL TUESDAY, Apple appeared to be winning its fight with law enforcement. President Obama announced last year that he would not pursue legislation forcing tech companies to give law enforcement access to users’ encrypted data. But on Tuesday, the FBI persuaded a judge to order Apple to create software that would help federal investigators break into the iPhone 5C that Syed Rizwan Farook used before he shot up a San Bernardino, Calif., banquet room in December. Apple immediately promised to fight the order.

In essence, the FBI is attempting to explore and establish the limits of its legal powers to combat terrorism — as well as more mundane domestic crimes — under existing laws, in the absence of action by Congress and the White House. We think that’s the wrong call. The nation should not ask the courts to strike a balance between device security and law enforcement access. The political branches of government should do that.

The FBI relied on the two-centuries-old All Writs Act, a law that helps the government execute search warrants, to compel Apple to create new hacking software for Farook’s phone. The order was nominally tailored to Farook’s specific device, but its implications are larger. To what extent is it reasonable to force companies to write new code and harm their international reputation for data security — and, therefore, their business models — in order to help the U.S. government hack into suspects’ phones? Should this be a routine investigative tool, reserved for extraordinary situations, or beyond the pale? Farook’s is an extreme case, but it is easy to foresee the government attempting to apply the All Writs Act to less important investigations. What sorts of software can the government compel tech companies to write?

The answers to these questions have major implications for online safety and security. The more government-ordered hacking techniques are developed and used, the more likely they eventually will fall into the hands of malicious actors. This risk seems small but is difficult to estimate. Even if technology companies and the government kept the techniques they developed secret, their hacking activities would still threaten the technology ecosystem. Fearful of government-mandated malware, fewer people might accept automatic updates from software companies. This would make devices more vulnerable. The anti-terrorism benefits, meanwhile, would wane over time, as high-level terrorist groups turned to software from places beyond the reach of U.S. law enforcement.

The public has reason to be frustrated that investigators cannot execute valid search warrants; this is a worrying impediment to legitimate law enforcement, and we believe Apple should help search for a workable solution. But the decisions should be made by Congress. If there is a Paris-style attack in the United States, they may instead be imposed on Apple in a far less benign atmosphere.
Meanwhile, Apple’s role as a leading exponent of data security brings special responsibilities. Whatever U.S. officials decide, the policy will be the legitimate product of a democratic government and the rule of law. That will not be true in countries such as China, where dictators would use anti-terrorism tools to crack down on dissenters. We hope that Apple will fight as hard to safeguard its users’ privacy from authoritarian abuse.