How Can Lasers Attack Voice Assistants?
Last year, university researchers discovered that they could attack the microphones in voice assistants using a laser beam. Since then, researchers from the Universities of Michigan and Florida in the US, as well as the University of Electro-Communications in Japan, have continued to investigate the phenomenon, although they are still unable to fully explain why it occurs. Laser attacks, which use what the researchers call “light commands”, exploit smart assistants’ microelectromechanical systems (MEMS) microphones. These microphones work by converting sound, i.e. people’s voice commands, into electrical signals that are then translated into actions. However, the researchers found that the microphones react to laser light shone directly on them in the same way as they react to sound. “By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers explained.
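The modulation the researchers describe can be sketched as simple amplitude modulation: the audio waveform of the spoken command is used to vary the laser's optical power around a constant bias. The sketch below is purely illustrative; the parameter names and values (bias power, modulation depth) are assumptions for the example, not figures from the research.

```python
import numpy as np

# Hypothetical parameters for illustration only -- not from the researchers' setup.
SAMPLE_RATE = 44_100          # audio samples per second
LASER_BIAS_MW = 200.0         # assumed baseline laser power in milliwatts
MODULATION_DEPTH = 0.5        # fraction of the bias power swung by the audio

def audio_to_laser_intensity(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] onto a laser intensity envelope.

    The attack encodes the voice command as amplitude modulation:
    intensity(t) = bias * (1 + depth * audio(t)).
    Keeping depth <= 1 ensures the intensity stays positive,
    since light intensity cannot go negative.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return LASER_BIAS_MW * (1.0 + MODULATION_DEPTH * audio)

# Example: a 1 kHz tone standing in for a spoken command.
t = np.arange(0, 0.01, 1.0 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 1000 * t)
intensity = audio_to_laser_intensity(tone)
```

When such a modulated beam hits the MEMS microphone port, the microphone produces an electrical signal resembling the original audio, which the assistant then interprets as a spoken command.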
Doesn’t Even Have to Be Nearby
Furthermore, the laser does not need to be near the device. The researchers found that they could send inaudible commands to the microphones from up to 110 meters away, and the attack still worked when the laser was shone through a window from the same distance. With laser attacks, malicious actors could therefore target voice assistants without any physical access to the device, and without any interaction from the owner. An attacker could cheaply and easily launch a laser attack by standing outside a house and controlling a voice assistant visible through a window. The attacker could then command the voice assistant to unlock doors, make online purchases or remotely start a vehicle.
Vulnerable Systems and Devices
After the initial findings, the researchers broadened their work from the MEMS microphones in digital devices to sensing systems more generally. They found that the sensing systems in medical devices, autonomous vehicles, industrial systems and even space systems are also susceptible to such attacks.
Steps to Protect Against Laser Attacks
The researchers have suggested steps that could be taken to protect against laser attacks. One step is implementing additional authentication on IoT and other digital devices, such as two-factor authentication, or having the device ask the owner a question that must be answered before a command is executed. “An additional layer of authentication can be effective at somewhat mitigating the attack,” said the researchers. “Alternatively, in case the attacker cannot eavesdrop on the device’s response, having the device ask the user a simple randomized question before command execution can be an effective way at preventing the attacker from obtaining successful command execution.” Another option would be to modify devices so that they only execute commands received by multiple microphones, or to put a cover over each microphone so that a laser cannot be shone directly onto it.
Laser Attack Demonstration at Black Hat
Some of the researchers involved, namely Sara Rampazzi, Assistant Professor at the University of Florida, together with Benjamin Cyr, a PhD student, and Daniel Genkin, Assistant Professor, both from the University of Michigan, will be demonstrating the laser attack at Black Hat Europe 2020. Black Hat is an annual conference that provides attendees with the latest technical and research information relating to the information security industry. This year’s conference will run from the 7th to the 10th of December and will be entirely virtual. At the conference, the researchers will show how: “