Alexa will be able to imitate the voice of the dead

by Lorenzo Nicolao

Amazon's new voice assistant feature, unveiled in Las Vegas, worries many. Beyond the ethical dilemma, there is the risk of a new form of deepfake.

Great potential with macabre facets. That is the new feature of Alexa, Amazon's voice assistant, which has already reached significant milestones and is now able to imitate and reproduce the voice of any person, including that of a dearly departed loved one. Presented at the re:MARS conference in Las Vegas by Rohit Prasad, senior vice president of the division responsible for updating and developing the digital assistant, the news raised more than a few ethical doubts among the public, since the very example shown was a video in which Alexa read a story to a child in the voice of his grandmother, who had died some time before. At one of the most important annual events on the progress of automation, machine learning and robotics, the age-old question is thus reopened of a technology that seems capable of keeping alive, albeit in a different way, someone who is no longer there, crossing the very border of death.

A one-minute recording is all it takes

The new Alexa skill needs only about a minute of listening to a given voice in order to reproduce it and utter practically any expression the voice assistant can process. Imitating voices is certainly the new frontier for this kind of tool, and it immediately made the front pages, thanks also to advances in the technology that now allow ambitious results to be achieved in a much shorter time. The new system is indeed able to reproduce the most disparate tones and the most diverse vocal timbres, calibrating its output very precisely. “We can call it the golden age of artificial intelligence,” explained Rohit Prasad. “This certainly cannot do anything about the pain of a loss, but today it can make memories last. So many people, especially during the pandemic, lost the people they loved. With this feature we allow them to hear the voices of missing family members and friends again.”
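Amazon has not published the implementation details of the feature. Purely as an illustration of the general class of technique described here, few-shot, speaker-adaptive text-to-speech, open-source tools such as Coqui TTS can already clone a voice from a short reference clip. The sketch below assumes that library and its XTTS v2 model; the file names are placeholders and this is not Amazon's system.

# Illustrative sketch only: few-shot voice cloning with the open-source
# Coqui TTS library (not Amazon's system; file names are placeholders).
# Requires: pip install TTS
from TTS.api import TTS

# Load a multilingual model that supports cloning from a short reference clip.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Condition synthesis on a brief reference recording of the target voice
# and generate arbitrary new speech in that voice.
tts.tts_to_file(
    text="Once upon a time, there was a little wolf who lived in the forest.",
    speaker_wav="grandmother_reference.wav",  # assumed reference recording
    language="en",
    file_path="cloned_story.wav",
)

The point of the sketch is only that conditioning on a short audio sample, rather than training a model per speaker, is what makes the "one minute of listening" claim plausible.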

Deepfake danger

The technological potential is undeniable, but the new tool raises more than a few ethical concerns, already widely debated on social platforms by the most worried users. Imitating any voice amounts to a classic example of an audio deepfake, and therefore a new and very dangerous opportunity to spread fake news that is increasingly hard to recognize. It is not, then, a problem that concerns the deceased, even though it would cause their relatives considerable difficulties in mourning, but something highly dangerous for the living, who could see their voices cloned for illegal purposes. As technology has evolved, many new forms of scam have emerged, and the latest Alexa feature could suddenly accelerate the process even further. A criminal who managed to impersonate a wealthy customer with his voice, for example, succeeded in having about 30 million euros transferred from a current account at a bank in the United Arab Emirates, and he did not even have this tool at his disposal. Not by chance, Microsoft, shortly before Amazon presented its latest news, had published new ethical rules for the use and development of artificial intelligence (and therefore not only in specific areas such as health), a regulatory framework that could in the future place many limits on the development of synthetic voices. The British newspaper The Guardian, meanwhile, reports that several experiments were carried out in the past with a chatbot (GPT-3) capable of holding a conversation as if it were a person who had died a few years earlier. In 2020 it was the case of Joshua Barbeau and his girlfriend Jessica, who had died eight years earlier, an experiment that already caused a great stir at the time. Amazon may yet significantly limit this tool in the future, if only to avoid disputes.

June 23, 2022 (updated June 23, 2022 | 19:19)

© REPRODUCTION RESERVED


