Hackers
Amazon
Alexa and Google Assistant hacked to spy on users and steal passwords
Security researchers create spying and phishing software which went undetected by Amazon and Google.
Security researchers create spying and phishing software which went undetected by Amazon and Google.
A team of cybersecurity researchers created software for Alexa and Google Assistant smart speakers which was intended to spy on users and steal their passwords, all without Amazon and Google noticing.
From Berlin-based Security Research Labs (SRL), the team built a set of simple smart speaker applications, known as Amazon Echo Skills and Google Assistant Actions. Application developers have created thousands of these apps over the years, each giving Amazon Echo and Google/Nest Home users access to information and entertainment.
Read More:
But instead of reading a bedtime story or telling them about the weather for their next vacation, the skills and actions created by SRL were intended to spy on users and steal their passwords.
Before that, however, the researchers created skills and actions which contained nothing untoward, so that they would pass through Amazon and Google's vetting process. Once cleared and made available to the general public, the researchers took advantage of a simple gap in both company's security practice - updates to how the skills and actions work are not checked.
This meant the researchers were able to change how the software worked, and turn them into spying and phishing devices. The Amazon Echo Skill was adjusted so that when a user tried to launch it, Alexa would give the error message: "This skill is currently not available in your country" - a standard error message many Alexa users may well have heard before.
Alexa then pauses for a moment, as normal, but is then instructed by the malicious skill to say characters the voice assistant isn't able to pronounce. This results in a minute-long silence where the user might think their Echo speaker is trying to fix the problem with the skill no longer working. After this, Alexa says: "An important security update is available for your device. Please say start update followed by your password."
We'd hope that most readers would know not to ever give away a password like this. But phishing scams are big business and, given how they have endured over the years, clearly work on more trusting people. With the target's Amazon password and email address (also requested by a malicious Alexa skill) the hacker could log into their account and cause further damage.
A second hack let the researchers turn a Google Home smart speaker into a malicious listening device. Here, a Google Action capable of generating a random number on command is submitted to Google, approved, and made available to the public.
The researchers then changed how the action works, making it play the sound Google Assistant uses to signal the end of a conversation, but instead of shutting down the microphone, it keeps listening. According to the hackers, the Google Home device can be instructed to keep listening for up to 30 seconds after the Google Assistant has stopped talking.
But if speech is detected during that time, the count is reset and the device keeps listening for another 30 seconds. Everything it hears is transcribed and sent to the hacker.
All of these hacks were shared with Amazon and Google before disclosed to the public. Security Research Labs said in a blog post: "The privacy implications of an internet-connected microphone listening in to what you say are further reaching than previously understood...Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone."
On how changes to skills and actions are not checked by Amazon and Google, cybersecurity expert Graham Cluley said the companies are "making a serious error if they believe that a single check when an app is first submitted is enough to confirm that the app will always behave itself in future. More needs to be done to protect users of such devices from privacy-busting apps."
Google said in a statement issued to ZDnet, which first broke the story: "We are putting additional mechanisms in place to prevent these issues from occurring in the future."
Amazon said it has now "put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified."
Check out The GearBrain, our smart home compatibility checker to see the other compatible products that work with Google Home, Home Mini and Amazon Alexa enabled devices.
Introducing Echo Studio - High-fidelity smart speaker with 3D audio and Alexa
GearBrain Compatibility Find Engine
A pioneering recommendation platform where you can research,
discover, buy, and learn how to connect and optimize smart devices.
Join our community! Ask and answer questions about smart devices and save yours in My Gear.