Posted on November 5, 2019 at 3:22 PM
Hackers Could Send Commands to Your Google Home and Amazon Echo via Lasers
According to recent findings by security researchers, hackers might be able to take control of people's devices using a surprisingly simple trick: commanding them with lasers. This is possible because smart devices such as Amazon Echo speakers, smartphones, Google Home speakers, and Facebook's Portal video chat devices can pick up light signals and transform them into sound.
Controlling devices via laser
The trick was originally discovered in early 2018 by cybersecurity researcher Takeshi Sugawara. Sugawara shared his findings with University of Michigan professor Kevin Fu, demonstrating that pointing a high-powered laser at an iPad's microphone made the device convert the light into an electrical signal, which produced a high-pitched tone.
After six months of work, the researchers learned how to use lasers to silently speak to a wide range of devices, essentially anything that accepts voice commands. Their experiments showed that the method works from hundreds of feet away and can be used to make online purchases, open garage doors, and much more.
The attack is not stopped by windows, meaning that devices can be activated from afar even when the owner is not at home to notice the flashing lasers. Simply put, anything with a microphone can be misused this way, provided the attacker knows which frequency to use. The researchers claim that particularly precise positioning is not necessary, and that in some cases simply flooding the device with light is enough.
The researchers then looked into modulating the laser's intensity to match the frequencies of a human voice. The results are quite disturbing: it turns out that virtually all smart speakers can be controlled this way from around 164 feet away. That may not be the maximum distance at which the method works, but it is the longest distance the researchers tested.
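To illustrate the basic idea, here is a minimal Python sketch of what such intensity modulation might look like. This is a simplified illustration, not the researchers' actual tooling: the bias and depth parameters, and the notion of a normalized intensity signal fed to a laser driver, are assumptions made for the example.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio sample rate in Hz

def am_laser_signal(audio: np.ndarray, bias: float = 0.5, depth: float = 0.4) -> np.ndarray:
    """Amplitude-modulate a normalized laser intensity with an audio waveform.

    The laser stays lit at a constant `bias` level, and the recorded voice
    command rides on top of it as small brightness variations, the optical
    analogue of a sound wave riding on still air.
    """
    audio = audio / np.max(np.abs(audio))   # normalize to [-1, 1]
    intensity = bias + depth * audio        # shift into [bias - depth, bias + depth]
    return np.clip(intensity, 0.0, 1.0)     # laser power can never be negative

# Example: a 440 Hz test tone standing in for a spoken command.
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
command_audio = np.sin(2 * np.pi * 440 * t)
drive_signal = am_laser_signal(command_audio)  # would feed a laser current driver
```

Driving a laser's current with a signal like this makes its brightness trace the audio waveform, which is all the attack needs.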
However, the results also indicate that smartphones are trickier targets. iPhones, for example, were only susceptible to the method from up to 33 feet away, while Android phones only picked up the signals if the laser was within 16 feet.
The researchers also tested a regular 5-milliwatt laser, the kind found in cheap laser pointers that anyone can buy. They tried using it from 361 feet away, and while it did not work on the majority of devices, it still managed to control the Echo Plus and the Google Home.
The troubling part is that the voice commands are completely silent. All the device's owner might notice is a flashing blue spot on it, and only if they are home and happen to look. This represents a massive security problem, as it could become a new, completely stealthy way of hacking devices. Furthermore, if hackers were to use an infrared laser, the attack would be invisible to the naked eye.
Of course, a virtual assistant would respond audibly to such commands, just as it does to regular voice commands. However, if the hacker first instructs it to turn the volume down to zero, even this obstacle can be bypassed.
How does it work?
The researchers admitted that they do not know exactly why this happens, although their leading theory is that the laser produces the vibrations that make the attack possible. The laser's light would heat the microphone's diaphragm, slightly expanding the air around it and creating an inaudible bump in pressure, much like a sound wave does.
Another theory is that the light gets past the diaphragm and hits the microphone's electronic chip directly, from which point the same process would occur. For now, these remain theories; the researchers do not have enough information to give a definitive answer. What is clear is that the potential for misuse is huge: the method could take over smart home controls, unlock doors, change thermostat settings, and more.
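Whatever the exact physics, the working assumption is that the microphone's output ends up tracking the incident light power as if it were sound pressure. Here is a toy Python sketch of that assumption, continuing the modulation example above; the linear sensitivity model and the DC-removal step are simplifications invented for illustration, not measurements from the research.

```python
import numpy as np

SAMPLE_RATE = 44_100
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
command_audio = np.sin(2 * np.pi * 440 * t)   # stand-in for a spoken command
drive_signal = 0.5 + 0.4 * command_audio      # AM laser intensity, as in the sketch above

def mic_response(intensity: np.ndarray, sensitivity: float = 1.0) -> np.ndarray:
    """Toy model of the photoacoustic theory: the diaphragm heats and cools
    with the incident light power, so the microphone's output tracks that
    power as if it were sound pressure."""
    raw = sensitivity * intensity
    return raw - raw.mean()   # the audio chain strips the constant (DC) offset

recovered = mic_response(drive_signal)
# Up to a constant scale factor, `recovered` is the original command audio:
# the assistant "hears" speech even though no sound was ever produced.
assert np.allclose(recovered / np.abs(recovered).max(), command_audio, atol=1e-6)
```

If the model holds, the recovered waveform matches the original command up to a scale factor, which is why the assistant responds to speech that was never actually spoken.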
A Google spokesperson commented that the company is reviewing the research paper and looking into the flaw. Apple declined to comment on the discovery, while Facebook's response is still pending. Amazon likewise stated that it is reviewing the research and plans to take steps to address the issue.
Meanwhile, the researchers found that some devices offer a degree of protection against the method, particularly those, such as iPads and iPhones, that require users to prove their identity before making purchases.
Furthermore, some smartphone voice assistants require so-called 'wake words' before they start taking commands, and some will only respond when those words are spoken by the owner. Even so, the threat remains serious, and manufacturers will have to find a way to secure their devices against this type of attack in the future.