Amazon has a feature in the works that will let Alexa speak in your dead relative’s voice 

If you think it’s creepy, you are right. 

Amazon is working on a feature that will let Alexa speak in your dead relative’s voice.

When loved ones pass away, memories play a vital part in moving forward. Recently there was a trend where people used technology to animate photos of dead friends and family, giving new life to their memories. While many found this strangely comforting, others found it quite creepy. And if that wasn’t enough, there’s now a chance for you to revive the voices of those who have passed on. 

Amazon is working on a feature that will let Alexa speak in your dead relative’s voice. Creepy? You bet! At its re:MARS (Machine Learning, Automation, Robots and Space) conference, Amazon revealed that its smart speaker might soon be able to respond to your queries in a dead relative’s voice. 

The aim is to make “memories last”, as the company said. Amazon is working on a system that will allow Alexa, its voice assistant, to mimic any voice after hearing the person speak for less than a minute. 

Rohit Prasad, Senior Vice President of the Alexa team, said during the announcement that the company is using artificial intelligence (AI) to make memories last and to help ease the pain of losing the ones you love.

To showcase Amazon’s work, Prasad played a video in which a child asks Alexa, “Can grandma finish reading me The Wizard of Oz?” Alexa replies “Ok” and then starts reading the story in the child’s grandmother’s voice. 

Understandably, while some might find this comforting, many others might find it quite creepy. It is not known what stage of development the feature is at, and Amazon has not said when it plans to roll it out. 

While Amazon aims to revive memories and comfort people, a feature like this has significant security ramifications. It could be misused to imitate voices, such as a celebrity’s, without consent. This is the deepfake problem all over again.