Hackers can hijack your iPhone or smart speaker with a simple laser

Ruben Fields
November 6, 2019

Voice assistants are designed to respond to spoken commands, but researchers have determined that they can also be commanded by shining a laser at smart speakers and other gadgets.

The laser study was conducted by researchers at the University of Electro-Communications in Tokyo and the University of Michigan, who detail their work in a new paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems".

The sounds of each command were encoded in the intensity of a light beam, Daniel Genkin, a paper coauthor and assistant professor at the University of Michigan, told CNN Business on Monday. One possible defense: smart speakers typically use multiple microphones, so if only one of them receives a signal, the command could be ignored.
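A hedged sketch of what such a multi-microphone check might look like in software, assuming the device exposes per-microphone signal levels (the function name, threshold, and example values below are illustrative assumptions, not anything from the paper or a vendor's firmware):

```python
def command_plausible(mic_levels, threshold=0.1, min_active=2):
    """Reject wake events that register on too few microphones.

    A voice in the room reaches every microphone in the array at
    roughly similar strength, while a laser spot excites only the
    one diaphragm it hits. Requiring at least `min_active`
    microphones to hear the signal filters out single-aperture
    light injection.
    """
    active = sum(1 for level in mic_levels if level >= threshold)
    return active >= min_active

# A laser hitting one microphone of a four-mic array:
print(command_plausible([0.8, 0.01, 0.0, 0.02]))  # False -> ignore
# A real voice reaching all four microphones:
print(command_plausible([0.6, 0.5, 0.7, 0.55]))   # True  -> accept
```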

This means an attacker could attempt to open smart door locks and cars, and access anything a Google Home, Amazon Echo, or Apple HomePod has access to.

In the paper, the researchers show how "an attacker can inject arbitrary audio signals to the target microphone by aiming an amplitude-modulated light at the microphone's aperture".
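In signal terms, that modulation is ordinary amplitude modulation: the audio waveform rides on top of a constant laser intensity. A minimal Python sketch of the idea, assuming a laser driver that accepts a normalized 0-to-1 drive signal (the bias and modulation-depth values are illustrative, not figures from the paper):

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second

def am_light_signal(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] onto a normalized laser drive signal.

    A constant DC bias keeps the diode lit while the audio modulates the
    intensity around it; a MEMS microphone hit by the beam responds to the
    intensity fluctuations much as it would to sound pressure.
    """
    audio = np.clip(audio, -1.0, 1.0)
    drive = bias + depth * audio
    return np.clip(drive, 0.0, 1.0)  # stay within the driver's valid range

# Example: a 1 kHz test tone standing in for a spoken command.
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
command_audio = np.sin(2 * np.pi * 1_000 * t)
laser_drive = am_light_signal(command_audio)
```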

According to the researchers, these attacks can be mounted "easily and cheaply", using a simple laser pointer (under $20), a laser driver ($339), and a sound amplifier ($28).

"Digital assistants" like Siri and Alexa promise to make life more convenient by enabling users to activate and control complex systems with their voice.

It's not just smart speakers that are vulnerable to light commands, though. Shooting a beam of light at a device's microphones won't give the person holding the laser full and instant control over a smart home's devices, but the technique is effective against Alexa, Siri, and Google Assistant, researchers say, even from hundreds of feet away.

Voice assistants such as Amazon's Alexa, Apple's Siri, and Google Assistant can be hijacked by shining a laser on the devices' microphones, according to an international team of researchers. But there are limits to the stealth of a light-command attack, the researchers found.

The researchers have already notified Tesla, Ford, Amazon, Apple, and Google about the issue, an important step toward getting the problem fixed, since simply covering microphones with tape would not solve it. People aren't running around jacking supercars with Radio Shack laser pointers, but the confluence of pervasive connectivity, accessible technology, and a relentless drive for convenience above all else has managed to drag a lazy and frankly stupid plot device into the real world.

Google has taken notice of the work by lead author Takeshi Sugawara and confirms that it is "closely reviewing this research paper". Amazon did not respond to a request for comment at the time of publication.

The attack shows that someone outside the house could "hack" into these devices and carry out commands the user never gave.

Assuming a smart speaker is visible from a window, hackers could use Light Commands to unlock smart doors, garage doors, and car doors. However, that would require specialist laser equipment and an academic level of laser knowledge, something the average garden-variety burglar is unlikely to have. Altogether, such a setup could cost less than $581, with most of the items available on Amazon.
