Alexa is being taught to mimic the voices of dead relatives by Amazon
Referring to the pandemic, Rohit Prasad, who leads Amazon's Alexa team, said the goal of the project is to "make the memories last."

Alexa can imitate a voice using pre-recorded audio, which means the person doesn't have to be present, or even alive, to serve as a source. In a video segment shown this week, a child asked Alexa if his grandma could finish reading The Wizard of Oz. Sure enough, Alexa changed voices to mimic the child's grandmother and finished reading the story.

During the presentation, Prasad said that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices across 17 languages in more than 70 countries around the world.

The potential for abuse is large. For example, the tool might be used to fabricate evidence for misinformation or political propaganda. In 2020, scammers used voice cloning to trick a bank manager into transferring $35 million to fund an acquisition that did not exist.

What are your opinions on the matter? Is Amazon taking the notion of voice cloning a little too far here, or are you concerned about the idea of having a "conversation" with someone from the grave?

Image by Jan Antonin Kolar.