Hackers might use the Amazon Echo to play back instructions.
If you’ve ever been concerned about the security of your smart-home gadgets, the new Amazon Echo security flaw won’t help. Researchers have discovered a technique to make Amazon’s Echo speakers “hack” themselves, allowing them to give voice instructions via the devices’ built-in speakers.
Alexa speakers can be made to execute any number of tasks by playing orders over the speaker itself, according to researchers at Royal Holloway University and the University of Catania in Sicily.
The hack, dubbed “Alexa vs. Alexa,” only required a few seconds of proximity to a vulnerable Echo device in order to be successful. When researchers used voice commands to link an Echo with a Bluetooth device, attackers could utilise that device to give Echo instructions as long as the Bluetooth device remained within range.
An abused Echo could purchase merchandise, manage smart home devices, and even open doors if the command contained the wake-up phrase (Alexa or Echo). In case the Echo required a vocal confirmation before proceeding, researchers inserted a single “yes” command that would play automatically after six seconds.
When the Echo was ‘infected’ in this way, it was susceptible to voice commands from anybody who had nefarious abilities or radio stations. Finally, there was an exploit that made it possible for an Alexa skill to run in the background, intercepting orders and responding to them in the same way as Alexa would.
Text-to-speech programmes may be used to broadcast voice instructions to the Echo in certain situations. “Full Voice Vulnerability,” which prohibited the Echo from automatically dropping its volume after hearing the wake-up command, was also a key component of the attack.
The good news is that Amazon has responded to the findings of the study by releasing fixes that address a lot of the issues identified throughout the investigation. The Bluetooth assault relies on Alexa skills to self-wake devices, which are no longer feasible, as are self-issued orders sent through radio stations.
“Systems in place to regularly monitor live skills for potentially harmful conduct, including quiet re-prompts,” Amazon said. This is either banned during certification or “immediately removed” for Alexa skills.
This is not the first time an Echo or other voice assistant has been misused despite the precautions taken to avoid this. Workers have been able to listen to user audio and applications have been able to spy on users, for example, in the past.
However, it’s not always the case that researchers are able to catch the flaw before it becomes a problem and Amazon is able to remedy it.
That being the case, if you’ve got one of the greatest Alexa devices in your house, you may want to follow experts’ advise and silence it while it’s not in use. Our own list of five security measures for your Alexa device is also worth checking out. Source