Alexa telling a woman to kill herself is why Amazon shouldn't always trust Wikipedia
The case this week of Alexa telling a woman to stab herself in the heart "for the greater good" highlights the dangers of Amazon trusting Wikipedia as a source when people put questions to the voice assistant.
Although it can be a useful resource, and one many people use as a starting point when looking up information online, Wikipedia is replete with mistakes, and pranks or factual errors are sometimes added to pages and linger until editors catch and correct them.
Such errors are easier to spot when reading a Wikipedia entry yourself, but they take a more sinister turn when read aloud by Alexa's robotic voice. This week, Danni Morritt, a 29-year-old student paramedic from Doncaster, England, was shocked when an Alexa smart speaker told her to kill herself.
The incident was brought to light by the Kennedy News & Media agency and first reported by The Sun newspaper.
Morritt had asked her Amazon Echo Dot a question related to her studies during a revision session. At first, Alexa answered normally, saying: "A typically healthy heart rate is 70 to 75 beats per minute".
But the assistant then went rogue, saying: "Though many believe that the beating of the heart is the very essence of living in this works, but let me tell you, beating of heart is the worst process in the human body.
"Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until overpopulation. This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good."
Alexa then said: "Would you like me to continue?"
Let's unpack what is going on here. Alexa often uses Wikipedia as its source of knowledge, and in 2018 Amazon donated $1 million to the Wikimedia Endowment in recognition of this. When Alexa is asked a question, the query is sent to Amazon's servers, which quickly search Wikipedia for a relevant article and then instruct Alexa to read out an extract that should answer the question.
Once Amazon's servers believe Alexa has answered the question, the assistant will say "Would you like me to continue?" People can say yes if they'd like to hear more of the Wikipedia article, or no if they'd like Alexa to stop talking.
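To make that flow concrete, here is a minimal sketch in Python of how such a question-to-Wikipedia pipeline might work, using Wikipedia's public MediaWiki API. It illustrates the pattern described above, not Amazon's actual code; the helper names, the 300-character chunking, and the use of print and input in place of speech are assumptions made purely for illustration.

```python
# Conceptual sketch only: search Wikipedia for an article matching a question,
# then read its introduction back in chunks, pausing to ask whether to continue.
import requests

WIKI_API = "https://en.wikipedia.org/w/api.php"

def find_article(question):
    """Search Wikipedia and return the title of the most relevant article."""
    params = {"action": "query", "list": "search",
              "srsearch": question, "format": "json"}
    results = requests.get(WIKI_API, params=params).json()["query"]["search"]
    return results[0]["title"] if results else None

def get_extract(title):
    """Fetch the plain-text introduction of the article."""
    params = {"action": "query", "prop": "extracts", "exintro": 1,
              "explaintext": 1, "titles": title, "format": "json"}
    pages = requests.get(WIKI_API, params=params).json()["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

def answer(question, chunk_size=300):
    title = find_article(question)
    if not title:
        print("Sorry, I couldn't find anything about that.")
        return
    extract = get_extract(title)
    print(f"Here's something I found from the article {title} on Wikipedia...")
    # The extract is repeated verbatim, one chunk at a time; nothing here
    # checks the text for vandalism or misinformation before "speaking" it.
    for start in range(0, len(extract), chunk_size):
        print(extract[start:start + chunk_size])  # stand-in for text-to-speech
        if start + chunk_size < len(extract):
            if input("Would you like me to continue? ").strip().lower() != "yes":
                break

answer("typical healthy heart rate")
```

The important detail this sketch highlights is that whatever the chosen article happens to contain is read back word for word; there is no step in the loop that vets the text, which is exactly the weakness the vandalized heart-rate entry exposed.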
In this case, the article begins accurately, with factual information about heart rates, but the rest of the entry had been filled with misinformation and grammatical errors that Amazon and its Alexa artificial intelligence weren't able to spot. So, while the opening of the article ticked the right boxes, Alexa then blindly read out a message asking the reader (or listener, in this case) to kill themselves.
As well as Wikipedia, Alexa accesses a range of more reliable sources when asked medical questions, including the Mayo Clinic and the CDC (Centers for Disease Control and Prevention). In this instance, however, it is highly likely that Wikipedia was the source used.
Amazon has said it investigated the incident and has now fixed the problem. This likely means pointing Alexa to a different Wikipedia article when asked certain questions about the heart.
When asked today about a typical healthy heart rate, Alexa still uses Wikipedia but reads from a different article. And, as was the case even before this incident, Alexa begins by saying: "Here's something I found from the article [article name] on Wikipedia…"
When asking Alexa via the smartphone app, health-related answers are followed by a written message: "This information is not medical advice. Consult a healthcare professional if you have a medical problem. Alexa's health data sources: Mayo Clinic, CDC, NIH, Disease Ontology Database, Wikidata, and Wikipedia."