[Note to readers: I changed my views after receiving feedback to this post. For a second post, about my change in view, see here. For a subsequent post asking readers about what the limits and impact of encryption should be, see here. An additional post about an excellent essay by Julian Sanchez criticizing my initial view is here. The original post is reproduced below.]

Apple has announced that it has designed its new operating system, iOS 8, to thwart lawful search warrants. Under Apple’s old operating system, if an iPhone is protected by a passcode that the government can’t bypass, the government has to send the phone to Apple together with a search warrant. Apple will unlock at least some of the contents of the phone pursuant to the warrant. Under the new operating system, however, Apple has devised a way to defeat lawful search warrants. “Unlike our competitors,” Apple’s new privacy policy boasts, “Apple cannot bypass your passcode and therefore cannot access this data.” Warrants will go nowhere, as “it’s not technically feasible for [Apple] to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.” Anyone with a recent iPhone can download the new warrant-thwarting operating system for free, and it comes automatically with the new iPhone 6.

I find Apple’s new design very troubling. In this post, I’ll explain why I’m troubled by Apple’s new approach coded into iOS 8. I’ll then turn to some important legal issues raised by Apple’s announcement, and conclude by thinking ahead to what Congress might do in response.

Let’s begin with a really important point: In general, cryptography is an awesome thing. Cryptography protects our data from hackers, trespassers, and all sorts of wrongdoers. That’s hugely important. And under Apple’s old operating system, cryptography protects iPhones from rogue police officers, too. Thanks to the Supreme Court’s recent decision in Riley v. California, the Fourth Amendment requires a warrant to search a cell phone. Apple’s old operating system effectively enforced the warrant requirement technologically by requiring the government to serve a warrant on Apple to decrypt the phone.

Up to that point, I think it’s all good. But the design of Apple’s new operating system does something really different.

If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It’s just a nice piece of paper with a judge’s signature. Because Apple already demands a warrant before decrypting a phone when it is able to do so, the only time its new inability to decrypt makes a difference is when the government has a valid warrant. The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.

Apple’s design change is one it is legally authorized to make, to be clear. Apple can’t intentionally obstruct justice in a specific case, but it is generally up to Apple to design its operating system as it pleases. So it’s lawful on Apple’s part. But here’s the question to consider: How is the public interest served by a policy that only thwarts lawful search warrants?

The civil libertarian tradition of American privacy law, enshrined in the Fourth Amendment, has been to see the warrant protection as the Gold Standard of privacy protections. The government can’t invade our private spaces without a showing that the invasion is justified by the expectation that the search will recover evidence. And the government must go to a neutral magistrate and make that case before it conducts the search. When the government can’t make the showing to a neutral judge, the thinking runs, the public interest in privacy outweighs the public interest in solving crime. But when the government does make that showing, on the other hand, the public interest in solving crime outweighs the privacy interest. That’s the basic balance of the Fourth Amendment, most recently found in the stirring civil libertarian language in Riley just a few months ago.

Apple’s new policy seems to thumb its nose at that great tradition. It stops the government from being able to access the phone precisely when it has a lawful warrant signed by a judge. What’s the public interest in that?

One counterargument I have heard is that there are other ways the government can access the data at least some of the time. With the warrant required under Riley, agents could take a stab at guessing the passcode. Perhaps the phone’s owner used one of the most popular passcodes; according to one study, the ten most often-used passcodes will unlock about 15% of phones. Alternatively, if the phone’s owner has backed up his files using iCloud, Apple will turn over whatever has been backed up pursuant to a lawful warrant.

These possibilities may somewhat limit the impact of Apple’s new policy. But I don’t see how they answer the key question of what the public interest is in thwarting valid warrants. After all, these options also exist under the old operating system, in which Apple can comply with a warrant to unlock the phone. And while the alternatives may work in some cases, they won’t work in others. That brings us back to how it’s in the public interest to thwart search warrants in the cases where the alternatives fail. I’d be very interested in the answer to that question from defenders of Apple’s policy. And I’d especially like to hear an answer from Apple’s General Counsel, Bruce Sewell.

Let me conclude with two important legal questions raised by Apple’s new policy, together with some speculation about how Congress might respond to Apple’s change.

The first question is whether the government can lawfully compel the phone’s owner to divulge the passcode. I believe the answer is yes: a person can in fact face punishment for refusing to enter the passcode to decrypt his own phone. If the government obtains a subpoena ordering the person to enter the passcode, and the person refuses or falsely claims not to know it, he can be held in contempt for failure to comply.

Some may think that the Fifth Amendment right against self-incrimination prohibits such punishment. But I think that’s wrong because of the specific circumstances in which the issue arises. Because people must know their passcodes to use their own phones, the testimonial aspect of decrypting a person’s own phone — admitting that the phone belongs to him and that he knows the passcode — will be a “foregone conclusion” whenever the government can show that the phone belongs to that person. If the phone is in the suspect’s hand or in his pocket when the government finds it, that won’t be hard to show. Under the relevant caselaw, that makes all the difference: Entering the passcode no longer raises a Fifth Amendment problem. See, e.g., In re Boucher, 2009 WL 424718 (D. Vt. 2009).

A second question is how the new policy changes the rules for searching a cell phone incident to arrest. Under the Supreme Court’s recent decision in Riley, the government needs a warrant to search the phone. But under the new Apple policy, warrants to search the phone won’t work if the passcode is in place. If officers lawfully come into possession of a target’s unlocked phone, the data may effectively disappear as soon as the phone locks. It’s the digital equivalent of flushing the drugs down the toilet, except that it happens by default and automatically. This will create interesting questions under the exigent circumstances exception. If officers make an arrest and the phone hasn’t yet locked, does the exigent circumstances exception now allow the police to search the phone without a warrant, on the theory that the delay of waiting for a warrant will mean a locked phone that can’t be unlocked even with one? At least in some circumstances, such as when the government has probable cause and the phone appears to be running a recent version of iOS, I suspect the answer may be yes. (Incidentally, I have long argued that the Supreme Court should wait until a technology stabilizes before applying the Fourth Amendment to it, to avoid the problem of announcing a rule that doesn’t make sense over time. In light of Apple’s new iOS 8, Riley may be an interesting example.)

I’ll conclude with the interesting question of Congressional reaction. It may turn out that the government can get access to the data most of the time despite this new policy using a combination of unlocked phones, data from backups in the cloud, password-guessing, or compelling targets to unlock their phones. If the government can get to the data in other ways, then the Apple policy may not cause much outrage. The government will muddle through. Perhaps.

But imagine that the Apple policy thwarts a lot of important cases. Think of a homicide case in which the government wants to search the victim’s phone for evidence of who was behind the killing. Maybe the victim received a text message that provides the key to the case, and the cellular provider hasn’t stored the messages. Because the victim isn’t alive to share his passcode, and the phone will have locked before the body is found, the government won’t be able to search the phone to find the messages. Apple’s policy will keep the police from finding the killer. That seems bad.

If we get a lot of cases like that, I suspect Congress may look to legislation to try to restore the privacy/security balance more in the direction of the traditional Fourth Amendment warrant requirement. I can think of three paths Congress might take. To be clear, I’m not endorsing any approach, at least yet. I’m just covering the major options. They look like this:

1) The most obvious option would be to follow the example of CALEA and the E911 regulations by requiring cellular phone manufacturers to have a technical means to bypass passcodes on cellular phones. In effect, Congress could reverse Apple’s policy change by mandating that phones be designed to have this functionality. That would restore the traditional warrant requirement.

2) A second option would be to enact a new law severely punishing a target’s refusal to enter his passcode to decrypt his phone. Under current law, such a refusal could lead to civil or criminal contempt charges. But given that the Fifth Amendment isn’t implicated for the reasons discussed above, I don’t think there is a constitutional barrier to punishing it more severely. How severely is a policy question up to Congress, so Congress could theoretically impose quite high punishments. Of course, this option wouldn’t work if the owner of the phone is unavailable, as would be the case in a homicide investigation when it’s the victim’s phone.

3) A third option would be to impose data retention laws. If the key evidence lost because of Apple’s policy is communications data stored on the phone that won’t be found elsewhere, Congress could require providers to store the data. For example, Congress could require cell providers to retain specific kinds of data (such as text messages) so the government can obtain the messages from the provider with a warrant rather than from the phone.

Anyway, that’s my take, which I’m happy to open up for comments and reactions. Perhaps I’m misunderstanding Apple’s policy. If so, I’ll post a correction and apologize for wasting everyone’s time with such a long post. But at least based on my understanding of the policy, it strikes me as troubling. And if the switch ends up thwarting a lot of valid investigations, I suspect Apple may not have the last word.