Imagine you've been detained at customs, waiting to cross the border. Or maybe you've been pulled over for a traffic violation. An officer waves your cellphone at you.
Without you saying a word, he has gotten everything he wanted.
That's the nightmare scenario that some privacy and security experts are raising after Apple on Tuesday
unveiled its new iPhone X (pronounced iPhone "10"), a device with powerful facial-recognition features. The technology, Apple said, replaces the home button and fingerprint reader that were so revolutionary just a few years ago with a new authentication system known as Face ID.
For all intents and purposes, Face ID works just like its predecessor, Touch ID. The only difference is that instead of scanning your fingerprint, the iPhone X
scans your face. It's an incredibly convenient feature, one that could raise security across the board by helping to curb the use of weak passcodes such as "121212."
But the introduction of Face ID also instantly led to questions among civil liberties experts, who say the technology raises the risk of abuse.
“You have to work pretty hard to get me to put my fingerprint on a reader,” said Chris Calabrese, vice president for policy at the Center for Democracy and Technology. “You have to work less hard to put a phone in front of somebody's face.”
Can the police really force you to unlock your phone with just your face?
“There is some question whether or not they could get you to scan your face or your fingerprint,” said Susan Hennessey, a fellow at the Brookings Institution and a managing editor of Lawfare, a leading national security blog. “Ultimately, this is the next development in the already existing, open legal question.”
The law is ambiguous — and the uncertainty is likely to persist until a court case establishes clearer rules, analysts say.
While you can't legally be compelled to give up your passcode, some analysts say, courts have ruled that under a standard known as "reasonable suspicion," law enforcement can compel you to provide your fingerprint. Could the same standard be applied to your facial data? That's what is unclear.
That said, Americans enjoy one additional layer of legal protection. Even if a police officer uses your biometric information to unlock a phone, he or she must still obtain a search warrant to search the phone. The warrantless searching of cellphones was ruled unconstitutional by the Supreme Court in Riley v. California in 2014.
“That's now established Supreme Court doctrine,” Calabrese said. Either way, he said, the best protection is probably to use a strong passcode.
Given how confusing the law can be on these issues, can't there be some kind of technological solution?
A partial one may be in the works. The new version of Apple's mobile operating system, iOS 11, is said to contain a fail-safe that disables not only Touch ID but potentially Face ID as well. When the power button is pressed five times in quick succession, an iPhone will stop accepting biometric data as an unlocking mechanism and require a passcode, according to the researcher who discovered the feature in a beta version of iOS 11.
It is not clear how long the fail-safe lasts before things revert to the regular mode. Apple did not respond to a request for comment.
The fail-safe could be the difference between protecting a user's sensitive data and having his or her entire digital life on display. But it won't work for everyone. In the heat of the moment, some may forget how to engage the fail-safe. Some may not even know that such an option exists. Or they may not have the opportunity to turn it on before their device gets confiscated.
This issue is going to become highly relevant to the many Americans for whom phones are a constant companion. And just as policing affects certain communities differently than others, technologies like facial recognition may put some Americans at greater risk than others.
“Responsible companies need to ask themselves, ‘Where do we stop? Have we thought through the implications?’” said Katharina Kopp, director of policy for the Center for Digital Democracy, a privacy advocacy group.