Artificial Intelligence
Amazon
Alexa to get a speed boost from Amazon's own new server chips
Apple wasn't the only company to switch to its own chips this week
In the same week that saw Apple announce the first Mac computers transitioning away from Intel and towards its own chips, Amazon said the computers powering Alexa's intelligence have also made the switch to custom processors.
Amazon says its Alexa voice assistant will respond to user commands faster now that its own custom-built chips handle the processing previously done by chips supplied by Nvidia. The new chips are the result of Amazon's $350m purchase of chip designer Annapurna Labs in 2015.
When Alexa users ask the voice assistant a question, such as "Alexa, what will the weather do today?", a complex chain of events takes place. First, the smart speaker's on-board computer recognizes the Alexa wake-word, then the microphones listen to what's said next.
This phrase is uploaded to Amazon's servers, where it is converted from sound into text, which is then analyzed to determine what the user meant by their question. In this case, the information they need can be gathered from a weather forecast service using their zip code, as found in their Amazon profile.
The forecast is then collected and turned from text into speech, ready to be sent back to the Alexa device and spoken by the assistant. All of this happens in just a couple of seconds, but with so many steps there is plenty of room for improvement – and that's where Amazon's new chip, called Inferentia, comes into play.
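The chain of steps above can be sketched as a simple pipeline. This is a minimal, hypothetical illustration: every function name here is a stand-in invented for this sketch, not one of Amazon's real APIs, and the intent-analysis step is where the heavy machine-learning inference would actually run.

```python
# Hypothetical sketch of the cloud-side Alexa pipeline described above.
# All functions are illustrative stand-ins, not Amazon's real services.

def speech_to_text(audio: bytes) -> str:
    # Stand-in for the speech-recognition step (audio -> text).
    return audio.decode("utf-8")

def analyze_intent(text: str) -> str:
    # Stand-in for intent analysis: a real system runs ML inference here,
    # the workload Amazon is moving onto its Inferentia chips.
    return "weather" if "weather" in text.lower() else "unknown"

def fetch_forecast(zip_code: str) -> str:
    # Stand-in for a call to a weather service using the profile's zip code.
    return f"sunny in {zip_code}"

def text_to_speech(reply: str) -> bytes:
    # Stand-in for speech synthesis (text -> audio for the device).
    return reply.encode("utf-8")

def handle_utterance(audio: bytes, profile: dict) -> bytes:
    """Walk one utterance through the steps the article outlines."""
    text = speech_to_text(audio)
    if analyze_intent(text) == "weather":
        reply = f"Today's forecast: {fetch_forecast(profile['zip_code'])}"
    else:
        reply = "Sorry, I don't know that yet."
    return text_to_speech(reply)

print(handle_utterance(b"what will the weather do today?", {"zip_code": "10001"}))
```

Each of the four stand-in functions adds its own latency, which is why shaving inference time on the middle step can speed up the whole round trip.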
Alexa is now smarter at predicting the intention of your questions (Image: Amazon)
According to an Amazon Web Services blog post published this week, the company's own Inferentia chip delivers 25 percent lower end-to-end latency – the total time between a request being sent and the response arriving. Amazon also says the new chip cuts costs by 30 percent compared to the previous Nvidia-based setup.
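To make those percentages concrete, here is the arithmetic with an assumed baseline. The AWS post reports only relative improvements, so the starting numbers below are made up for illustration.

```python
# Illustrative arithmetic only: the baseline figures are hypothetical,
# since Amazon reports relative gains (25% latency, 30% cost), not absolutes.
baseline_latency_ms = 200.0   # assumed end-to-end latency on the old hardware
baseline_cost = 1.00          # assumed cost per request, normalized to 1

inferentia_latency_ms = baseline_latency_ms * (1 - 0.25)  # 25% lower latency
inferentia_cost = baseline_cost * (1 - 0.30)              # 30% lower cost

print(inferentia_latency_ms)  # 150.0
print(inferentia_cost)
```

Under those assumptions, a 200 ms round trip drops to 150 ms, and every dollar of serving cost falls to about 70 cents.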
While the change is unlikely to make a huge difference to Alexa's speed for now, it opens up some headroom for Amazon to make the assistant smarter, providing more intelligent answers, without forcing the user to wait longer for Alexa's reply.
Amazon says the majority of Alexa communication will be computed using its Inferentia chips, which were first announced in 2018 and are specifically designed to handle large amounts of machine learning tasks at high speed.
Amazon also said this week that its facial-recognition system, called Rekognition, has started to use the new Inferentia chips.