LAS VEGAS (Reuters)
Amazon.com Inc wants to give customers the chance to make Alexa, the company’s voice assistant, sound just like their grandmother – or anyone else.
The online retailer is developing a system to let Alexa mimic any voice after hearing less than a minute of audio, said Rohit Prasad, an Amazon senior vice president, at a conference the company held in Las Vegas today (June 22). The goal is to “make the memories last” after “so many of us have lost someone we love” during the pandemic, Prasad said.
Amazon declined to share when it would roll out such a feature.
The work wades into an area of technology that has garnered close scrutiny for its potential benefits and abuses. For instance, Microsoft Corp recently restricted which businesses could use its voice-mimicking software. Such technology can help people with speech impairments or other problems, but some worry it could also be used to propagate political deepfakes.
100 MILLION ALEXA CUSTOMERS GLOBALLY
Amazon hopes the project will help Alexa become ubiquitous in shoppers’ lives. But public attention has already shifted elsewhere. At Alphabet Inc’s Google, an engineer made the highly contested claim that a company chatbot had advanced to sentience. Another Amazon executive said Tuesday that Alexa had 100 million customers globally, in line with figures the company has provided for device sales since January 2019.
Prasad said Amazon’s aim for Alexa is “generalisable intelligence,” or the ability to adapt to user environments and learn new concepts with little external input. He said that goal is “not to be confused with the all-knowing, all-capable, uber artificial general intelligence,” or AGI, which Alphabet’s DeepMind unit and Elon Musk-co-founded OpenAI are seeking.
Amazon shared its vision for companionship with Alexa at the conference. In a video segment, it portrayed a child who asked, “Alexa, can grandma finish reading me ‘The Wizard of Oz’?”
A moment later, Alexa affirmed the command and changed her voice. She spoke soothingly and less robotically, ostensibly sounding like the child’s real-life grandmother.