Artificial intelligence is known for blurring the line between human and machine, and the latest Amazon Alexa update continues that trend. Alexa can now talk slower or faster, with a speech rate that users can customize to fit their needs.
That’s the latest from Amazon, which says the new feature makes the Alexa experience feel more natural for its customers. “We heard from customers that they would like the ability to change Alexa’s speaking rate for a variety of reasons. Some of our hard of hearing and older customers shared how they love talking to Alexa and how she has become a companion but sometimes they would like her to slow down so they can better understand her responses…We’re thrilled to introduce this feature to help customers further personalize their interactions with Alexa, and adapt the experience to best fit their individual needs,” said Sarah Caplener, head of Alexa for Everyone.
The update brings seven speech speeds to Amazon’s assistant: the standard rate, two slower rates, and four faster ones. To adjust Alexa’s speech, users need only say “Alexa, speak slower” or “Alexa, speak faster,” and they can return to the standard rate whenever an adjustment doesn’t suit them. The feature will prove especially useful for those who have trouble hearing or who simply find the default rate hard to follow.
The new Amazon Alexa update has received some positive feedback from customers.
The new speaking-rate feature is a smart addition to Amazon’s assistant. Companies that invest in building and maintaining their own voice assistants tell consumers that AI “becomes more natural over time,” and that as these assistants learn more about users, they can fetch the right information at the exact moment it’s needed. Yet part of making AI more user-friendly is making it more “human,” and that includes adjusting its speech. After all, isn’t that what people say to one another when someone is talking too fast: “slow down, I can’t understand you”?
For AI to become more of a personal companion and friend, speaking rates will have to be customizable. It’s an interesting update because so many AI companies focus on language, gender, skills (booking flights, securing movie and concert tickets, making Friday-night dinner reservations), and the kinds of information an assistant can pull for the user (sports scores, weather, recipes, and so on) that they tend to forget just how human their voice assistants are.
Consumers are finding voice assistants to be an integral part of everyday life, so much so that many now prefer to have one in their homes. And yet, alongside all this uptake of AI in and around the home, there are still concerns about whether Alexa and other voice assistants are “spying on” users and violating their privacy. For these assistants to respond at all, users must grant them permission to record what they say.
For some users, it is unsettling to think that Amazon Alexa, Google Assistant, Microsoft Cortana, or even Samsung Bixby records everything they say. Smartphone owners can attest to saying things that, surprisingly, get picked up by Google Assistant even when it wasn’t summoned with the “Hey Google” hotword.
There seems to be no limit to where Alexa’s capabilities can go, nor to what Amazon wants to know about its customers. Three months ago, Amazon was rumored to be working on a wrist wearable that can detect user emotions. Whether such a device would have Alexa on board to talk to consumers when they’re angry or sad remains to be seen.
At any rate, with her new talking speeds, Alexa just became more human-sounding than ever. And just when you think AI has reached its limit, it expands even further.
The new Amazon Alexa update is now available for Echo device customers.