Having waded through hundreds of responses to my posts on Apple’s new iOS 8, I wanted to ask two related questions to the many readers who were critical of (and in some cases, deeply hostile to) my initial post. The first question is the “where would you draw the line” question, and the second is the “what is the privacy tradeoff” question. I set them out below, and then I ask for your feedback.
Before I get to the questions, I want to add, if it’s not clear from my prior posts, that I’m asking these questions genuinely hoping to learn from the answers. I’m trying to work through the issues myself, and I’m trying to get a sense of where people are coming from to figure out what I think. Given that, it would be very helpful if readers would actually take the questions seriously rather than just express anger that the questions are being asked or speculate about what corrupt motives prompted them. If you think the answers are obvious and people who don’t see it are idiots, then take it slowly and explain your view in a way that even an idiot can understand. Not everyone thinks the issues are so simple, and it’s hard to persuade people unless you take them through your reasoning.
So here’s my first question, the “where would you draw the line” question. It runs like this:
Strong crypto has benefits and costs from the standpoint of security and public safety. For many Internet users, the benefit side is obvious: Crypto allows individuals to keep out people who should be kept out. But there’s also a cost side, because it also allows individuals to keep out people who shouldn’t be kept out. Unfortunately, sometimes people use computers and the Internet to facilitate really bad crimes. Maybe the crime is a child exploitation offense involving child pornography, or maybe it’s a conspiracy to commit murder or to inflict violence. Maybe it’s fraud or harassment or something else. To deter and punish those and other crimes, communities hire police to investigate crimes, collect evidence, and charge wrongdoers in court, the thought being that solving crimes and bringing prosecutions is critical to deterring that kind of crime in the future and to punishing the wrongful acts. If an individual can use crypto to keep anyone out, however, people using computers to commit crimes will use it to keep the police out even when the police have a warrant.
So here’s the question: In your view, can there ever be a point when there is too much encryption — and if so, what is that point? In other words, do you think there could ever be a point at which crypto is so widely used in so many contexts to protect so much data so strongly, that you would think that the marginal costs of more and better crypto begin to cause more harm than good? Some very vocal readers take the view that the government is fundamentally illegitimate, and I gather that they will say that there is no such point. From their perspective, the very idea of governments solving crimes, and criminal law in its entirety, is misguided. But for readers who don’t go that far, and who see some legitimate role for law enforcement, is there any point at which you would say, in the hypothetical future, that there is too much encryption? If so, where would that line be, in terms of the scenarios that trouble you and the government powers that you think would be cut back too far?
So that’s the first question. Here’s the second question, which is related to the first question but is more based on some themes of my academic work (and this article in particular). Unlike the first question, the second question is more for those with a legal background, and especially in the area of criminal procedure law. But I hope it will be of at least some interest to the broader readership. This is the “privacy tradeoff” question, and it goes something like this:
The history of Fourth Amendment law shows that the Supreme Court often alters Fourth Amendment rules in response to technological change. The Supreme Court tries to roughly maintain the balance of Fourth Amendment protection over time as technology shifts, so that technological change doesn’t give the government too much power (which would lead to abuses) or take too much power away from the government (which would make it too easy to commit crimes undetected). I’ve called this equilibrium-adjustment, and the basic idea is that there is a technological-legal tradeoff: If technology gives, the law takes away, and if technology takes away, the law gives. This dynamic allows the Fourth Amendment to maintain its role over time. Supreme Court opinions interpreting the Fourth Amendment in tech-related cases are a little bit like drivers trying to maintain constant speed over mountainous terrain: judges add extra gas when facing an uphill climb and ease off the pedal on the downslopes. (This is a really oversimplified version of the argument, of course; for the details, read the full article.)
If I’m correct about this history of the Fourth Amendment, then it suggests that strong crypto may incur legal tradeoffs. If the government can’t get access to contents, even with a warrant, and that ends up substantially shifting the privacy-security balance, the Supreme Court will respond by expanding government power in other ways to counteract that shift and restore the prior balance of power. We’ll get more privacy in some ways from more technological protection, but less privacy in other ways from having less constitutional protection. So here’s the question: What privacy tradeoffs might the Supreme Court make in response to strong encryption that would more or less restore the prior balance of constitutional protection? If strong crypto really changes the privacy-security balance, what currently existing legal protections will we lose in response? And normatively, to the extent we may have a choice, is the full-crypto-with-lost-protections world better or worse than the less-crypto-with-current-protections world we have now?
That’s the second question. I assume a lot of readers will want to fight the premise. In my experience, equilibrium-adjustment is widely celebrated among the Internet crowd when technological change leads to a Supreme Court decision expanding rights, as in Riley, but a lot of people are less enthusiastic when focusing on change in the other direction. But the basic idea of equilibrium-adjustment is that, for better or worse, legal change is a two-way street. So I’m really interested in getting a sense of what privacy tradeoffs strong encryption might trigger.