Researchers show that iPhones, Android phones and smart speakers have a big security flaw that can't be fixed

The researchers, including one of the people responsible for discovering the massive Meltdown and Spectre vulnerabilities that chip makers and computer manufacturers rushed to patch in early 2018, have found a major security issue that affects virtually all smart speakers from Amazon, Google or Apple. It turns out you can aim laser beams at the speakers' microphones, which interpret the light as if it were a voice command. Using this trick, the researchers were able to perform all kinds of actions through these smart devices, and there is no real fix at the moment. All you can do is make sure your speaker is not facing a window and that it cannot access confidential data or other devices. On the other hand, there is no evidence that anyone has abused this unexpected flaw in the wild, and making it work requires a lot of effort.

Researchers in Japan and at the University of Michigan have been studying the subject for seven months. They explained to The New York Times that they could open a garage door by hitting a voice assistant with a laser beam, and that they were able to control a Google Home device on the fourth floor of a building from the top of a different building 230 feet away.

This shows that an attacker could try to open smart locks or smart cars, and access anything a Google Home, Amazon Echo or Apple HomePod has access to. The Times report only mentions products from Amazon, Apple and Google, but other smart speakers from 2019 and earlier models are susceptible to the same trick. Amazon's and Google's, of course, are the most popular. The study also lists other products that accept voice commands, including the Facebook Portal Mini, Fire TV Cube, Ecobee 4, iPhone XR, 6th-generation iPad, Galaxy S9 and Google Pixel 2.

The microphones on these devices have a small diaphragm that moves when sound hits it. But it also moves when the light from a laser or a flashlight reaches it. The device converts that movement into electrical signals all the same, and that is why the speaker can respond to light as if it were a voice.
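In practice, an attack like this means encoding the audio of a voice command in the brightness of the laser, so the diaphragm "hears" the light. A minimal sketch of that idea, in Python with NumPy: the function below amplitude-modulates a normalized laser drive signal with an audio waveform. The function name and parameters (`bias`, `depth`) are illustrative assumptions, not details from the study.

```python
import numpy as np

def modulate_laser(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's intensity with an audio waveform.

    `audio` is a float array in [-1, 1]. The result is a normalized
    drive signal: a DC bias keeps the laser on, and the audio rides
    on top of it, so the light's brightness tracks the sound.
    (Illustrative sketch only -- not the researchers' actual setup.)
    """
    audio = np.clip(audio, -1.0, 1.0)
    drive = bias + depth * audio       # brightness follows the audio
    return np.clip(drive, 0.0, 1.0)   # intensity can't go negative

# A 1 kHz test tone sampled at 16 kHz stands in for a voice command.
t = np.linspace(0, 0.01, 160, endpoint=False)
tone = np.sin(2 * np.pi * 1000 * t)
drive = modulate_laser(tone)
```

When that modulated beam lands on the microphone's diaphragm, the diaphragm's response traces the same waveform, and the device's electronics treat it like ordinary speech.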

The only way to solve the problem is to adopt a different microphone design in future smart speaker models. All current models will remain susceptible to the hack, the study suggests.

Companies such as Amazon, Apple, Google, Ford and Tesla have been notified and are analyzing the study's conclusions.

Image source: LightCommands

The fact that this huge and unexpected defect exists in smart speakers does not mean that just anyone can take advantage of it. Even if you could get all the equipment in the image above, you would still need to be able to translate voice commands into laser pulses in order to control the speakers. Simply shooting a beam of light at a device's microphone does not give a person complete and instant control over a smart home. And if someone has all that sophisticated gear at hand, complete with the know-how to use it, and is headed to your home, well, you probably have bigger things to worry about.

In the meantime, you'll want to use PINs to protect your confidential information, and maybe turn off the microphones whenever you leave home. Hacking phones and tablets from a distance with laser beams seems like the most dangerous side of this attack, and the kind of thing you'd see in spy films. However, it is probably harder to pull off against phones than against the speakers.

Still, smart speaker manufacturers are likely to take steps to prevent such attacks in future devices.