Researchers Hack Voice-Controlled Devices with Lasers

A research paper released on Monday raises alarms about the security of voice-control devices. Researchers from the University of Electro-Communications in Tokyo and the University of Michigan were able to manipulate Siri, Alexa, and other voice assistants using “Light Commands.” Discovered in May 2019, the technique could become more dangerous as voice-control devices gain popularity.

Although the research team only tested a handful of devices, they suspect Light Commands could work on any smart speaker or phone that uses a micro-electro-mechanical systems (MEMS) microphone. These microphones contain a tiny diaphragm that converts sound waves into electrical signals. It turns out that a laser whose brightness is modulated with an audio signal can move that diaphragm much the way sound does, injecting voice commands directly into the microphone and tricking the device into executing them.

Today, consumers use voice-control devices for many applications. Users can unlock doors, make online purchases, and much more with simple voice commands. Additionally, many people don’t use voice authentication or passwords to protect devices from unauthorized use.  


Experimenting with Light Commands

In their paper, “Light Commands: Laser-Based Audio Injection on Voice-Controllable Systems,” the research team describes several methods of attack. Using inexpensive equipment such as laser pointers and infrared lasers paired with a laser driver and an audio amplifier, they encoded audio signals in beams of light aimed at the devices. In one test, the researchers issued a command from more than 230 feet away, through a glass window.
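To make the mechanism concrete, here is a minimal sketch of the amplitude-modulation idea at the heart of the attack: a voice command rides on top of the laser’s drive current, so the beam’s brightness tracks the speech waveform. This is an illustration only; the bias and modulation values, and the `audio_to_drive_current` helper, are hypothetical stand-ins rather than the researchers’ actual parameters.

```python
import numpy as np

SAMPLE_RATE = 16_000          # Hz, a typical rate for voice commands
DC_BIAS_MA = 200.0            # hypothetical laser bias current (mA)
MODULATION_DEPTH_MA = 150.0   # hypothetical peak audio swing (mA)

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a voice-command waveform to a laser drive-current waveform."""
    # Normalize the command audio to [-1, 1].
    audio = audio / np.max(np.abs(audio))
    # Ride the audio on top of a DC bias so the laser stays on
    # while its brightness rises and falls with the speech signal.
    return DC_BIAS_MA + MODULATION_DEPTH_MA * audio

# Example: a 1 kHz test tone standing in for a recorded command.
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 1000 * t)
drive_current = audio_to_drive_current(tone)
```

Pointed at a MEMS microphone, a beam modulated this way can produce roughly the same electrical signal the spoken command would.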

“We show how an attacker can use light-injected voice commands to unlock the target’s smart-lock protected front door, open garage doors…or even… start various vehicles (e.g., Tesla and Ford) if the vehicles are connected to the target’s Google account,” state the researchers. Today, the team is working with Amazon, Google, and other voice-control device manufacturers to bolster security measures. Fortunately, there are no documented instances of Light Command attacks to date. 

Light Command Limitations

Light Commands do have some limitations. First, the laser must be aimed precisely at a specific component of the microphone to transmit audio, so attackers need a direct, unobstructed line of sight to the target device.

Second, most laser beams are visible to the naked eye, which would expose an attacker. Voice-control devices also respond out loud when activated, which could alert anyone nearby to foul play.

Finally, aiming lasers with precision requires specialized equipment and experience, so long-range attacks face a high barrier to entry.

Voice-Control Devices Growing in Popularity

Even with these limitations, the researchers’ findings are noteworthy given the growing adoption of voice-control devices. Voicebot.ai estimates that smart speaker manufacturers will sell more than 90 million devices in 2019 alone. Last holiday season, voice shopping with Alexa tripled compared to 2017.

Smart speakers are also getting much better at understanding human language. Users can now ask for recipe ideas, control their home environment, and even follow voice-directed workouts. In 2020, Amazon device owners will be able to donate to presidential candidates using voice commands.

As we rely more on these devices, it’s important to take the time to secure them appropriately. Think critically about what accounts are connected to your smart speakers and enable two-factor authentication whenever possible. A few quick steps can go a long way in protecting you and your household.
