Boffins hack Alexa, Google Assistant and Siri using lasers

Ruben Fields
November 6, 2019

By modulating laser light to mimic the frequencies of a human voice, the boffins were effectively able to beam commands to a selection of smart speakers, as well as an iPhone and a pair of Android devices.

But whether or not the NSA is eavesdropping while you yell at Alex Trebek may be just the beginning of our problems, as researchers in Japan and at the University of Michigan have figured out how to hack Google Home, Amazon Alexa, and Apple Siri devices "from hundreds of feet away" using laser pointers and flashlights. The researchers suggest attackers could even use an infrared laser, which isn't visible to the naked eye, to control your smart speakers.

"This opens up an entirely new class of vulnerabilities", Kevin Fu, an associate professor of electrical engineering and computer science at the University of MI, told The New York Times. The Google Home registered this laser light as a verbal input, allowing Sugarawa to recreate the tonal modulation of common voice commands, like opening a garage door. Smart speakers typically don't come with any user authentication features turned on by default; the Apple devices are among a few exceptions that required the researchers to come up with a way to work around this privacy setting.

What could they command the voice assistants to do?

According to data from tech market researcher Canalys, companies shipped 26.1 million smart speakers in the second quarter.

The speakers responded to the light as if it were voice-based sound waves, which raises serious security concerns, especially since most of these devices do not require user verification in order to be used, at least not by default.

The exploit also requires quite a sophisticated setup: a strong, tightly focused laser, plus equipment to convert audio commands into laser light modulations.
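As a rough illustration of that conversion step, here is a minimal sketch of the underlying principle, amplitude modulation, written in Python. Everything in it (the function name, parameters, and the 440 Hz test tone) is hypothetical, not the researchers' actual tooling, which drove a real laser diode through a current driver:

    import numpy as np

    def audio_to_laser_drive(audio, bias=0.5, depth=0.4):
        """Amplitude-modulate a normalized laser drive level with audio.

        The laser idles at a constant `bias` intensity, and the audio
        waveform (scaled by `depth`) rides on top of it, so the beam's
        brightness fluctuates at audio frequencies.
        """
        audio = np.clip(audio, -1.0, 1.0)   # keep the waveform in [-1, 1]
        drive = bias + depth * audio        # intensity modulation around the bias
        return np.clip(drive, 0.0, 1.0)     # stay within the driver's valid range

    # Hypothetical example: a one-second 440 Hz test tone sampled at 48 kHz.
    sample_rate = 48_000
    t = np.linspace(0, 1, sample_rate, endpoint=False)
    tone = np.sin(2 * np.pi * 440 * t)
    drive_signal = audio_to_laser_drive(tone)

The idea is that a microphone's diaphragm responds to the brightness fluctuations much as it would to sound pressure, which is why the speakers treated the light as a spoken command.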

The technique requires the laser to actually hit the target device's microphone port, which becomes significantly harder as the distance grows. The researchers say they disclosed their findings to the affected vendors before publication. "We subsequently maintained contact with the security teams of these vendors, as well as with ICS-CERT and the FDA", they said, noting that the findings were made public on "the mutually-agreed date" of November 4.

The researchers noted that they haven't found any evidence to suggest this hack has been used in the real world.

Amazon says that 85,000 smart home gadgets now integrate with Alexa, while Apple is trying to get more gadgets to work with its HomeKit system. Lead author Takeshi Sugawara said one possible way to eliminate the vulnerability would be to add an obstacle to the microphone that blocks the line of sight to its diaphragm.

When contacted for comment, Amazon and Google said they were keeping tabs on the research.
