“It’s possible to make microphones respond to light as if it were sound,” Takeshi Sugawara, one of the lead researchers on the study, told Wired. “This means that anything that acts on sound commands will act on light commands.”
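In practice, the effect Sugawara describes is exploited by amplitude-modulating a laser's intensity with the audio waveform of a spoken command, so the microphone "hears" the light as if it were sound. The following is a minimal illustrative sketch of that modulation step only; the sample rate, bias current and modulation depth are hypothetical values chosen for illustration, not figures from the study.

```python
import numpy as np

# Illustrative sketch: map a voice-command waveform onto a laser's
# drive current, so the light's intensity ripples in step with the
# sound wave. All numeric parameters below are assumptions.

SAMPLE_RATE = 44_100   # audio sample rate in Hz (assumed)
BIAS_MA = 30.0         # assumed laser diode bias current, in mA
SWING_MA = 10.0        # assumed modulation depth, in mA

def laser_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a laser drive current in mA.

    The command's sound wave becomes a matching ripple in light
    intensity, which a vulnerable microphone converts back to "sound".
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_MA + SWING_MA * audio

# Example: a 440 Hz test tone standing in for a spoken command.
t = np.linspace(0, 0.01, int(SAMPLE_RATE * 0.01), endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
current = laser_drive_current(tone)
```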
Since many voice-command systems don’t require authentication, attackers wouldn’t need a password or PIN to take over a device with a light command; they just need to be in the object’s line of sight. In a paper released Monday, researchers detailed how they could easily commandeer smart speakers, tablets and phones without being in the same building, just by pointing a laser through a window. In one case, they took over a Google Home on the fourth floor of an office building from the top of a bell tower at the University of Michigan, more than 200 feet away. And they say the trick could theoretically be used to buy things online undetected, operate smart switches in homes and carry out other unsettling actions.
“Once an attacker gains control over a voice assistant a number of other systems could be open to their manipulation,” a breakdown of the study on the University of Michigan’s website says. “In the worst cases, this could mean dangerous access to e-commerce accounts, credit cards, and even any connected medical devices the user has linked to their assistant.”
Researchers spent seven months testing the trick on 17 voice-controlled devices enabled with Alexa, Siri, Facebook Portal and Google Assistant, including Google Home, Echo Dot, Fire Cube, Google Pixel, Samsung Galaxy, iPhone and iPad. They successfully mounted attacks using ordinary laser pointers, laser drivers, a telephoto lens and even a souped-up flashlight.
The researchers weren’t sure exactly why these microphones respond to light as they do to sound; they didn’t want to speculate and are leaving the physics for future study. They notified Google, Amazon, Apple, Tesla and Ford about the vulnerability.
Spokespeople for Google and Amazon said the companies are reviewing the research and its implications for the security of their products but said risk to consumers seems limited. An Amazon spokeswoman pointed out that customers could safeguard Alexa-enabled products with a PIN, or use the mute button to disconnect the microphone. (Amazon founder Jeff Bezos owns The Washington Post.)
Apple did not immediately respond to requests for comment.
Other undetectable means of exploiting voice-command devices have been revealed by researchers, but their capabilities have been more limited. In 2016, researchers at the University of California at Berkeley showed it was possible to cloak commands in white noise, music or spoken text. In 2017, researchers in China showed it was possible to give commands to smart devices at frequencies inaudible to the human ear, but the transmitter needed to be relatively close to the device for the method to work.
There are no known instances of someone using light commands to hack a device, researchers said, but eliminating the vulnerability would require a redesign of most microphones. The stealth of a light-command attack also has limits, researchers found. Except for infrared lasers, the beams are visible to the naked eye and could easily be noticed by someone near the device. Voice-command devices also generally give audible responses, though an attacker could first command the device to lower its volume and then continue operating it undetected.
For now, researchers say the only foolproof way to protect against light commands is to keep devices out of sight from windows, away from prying eyes — and prying laser beams.