Amazon Uses Kid’s Dead Grandma in Morbid Demo of Alexa Audio Deepfake
With just a short recording, Amazon has figured out how to make its Alexa voice assistant mimic the voice of anyone, living or dead. The company leaned on the grief and loss of the ongoing pandemic to pique interest when it played the demo video on Wednesday at its re:MARS conference in Las Vegas.
With technical experts and business leaders taking the stage, Amazon's re:MARS conference focuses on robotics, artificial intelligence, and other emerging technologies. During the keynote on the second day, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, demonstrated a feature currently in development for Alexa.
In the demonstration, a child asks Alexa to finish reading The Wizard of Oz to him. "Okay," Alexa replies in her distinctively feminine computerized voice. Then the voice of the child's grandmother begins reading L. Frank Baum's story from the speaker.
Prasad did not say how much work is still needed, or when or whether the Alexa skill will be released; he merely mentioned that Amazon is working on it. He did, however, share a few technical details.
Rather than requiring hours of studio recording, "this necessitated an invention where we had to learn to generate a good quality voice in less than a minute of recording time," he said. The problem was solved by framing it as a voice-conversion task rather than a speech-generation task.
Naturally, deepfaking has developed a bad reputation. Still, some attempts have been made to use the technology as a tool rather than a creepy gimmick. As The Verge points out, audio deepfakes are frequently used in the media to cover up podcasters' mistakes or the sudden death of a project's star, as in the Anthony Bourdain documentary Roadrunner. The outlet also noted instances of people using AI to build chatbots that communicate as if they were a lost loved one.
Alexa wouldn't even be the first consumer device to use deepfake audio to stand in for a relative who can't be there in person. Gizmodo has covered Takara Tomy's smart speaker, which uses AI to recite bedtime stories to kids in a parent's voice. Parents reportedly upload their voices by reading a script for roughly 15 minutes. The crucial difference from Amazon's demo is that the product's owner chooses to supply their own voice, rather than the product using the voice of someone who likely can't give consent.
Several Disturbing Aspects
Aside from worries about deepfakes being used for scams, rip-offs, and other malicious activities, there are already several disturbing aspects of how Amazon is building the feature, which doesn't even have a release date yet. Before demonstrating it, Prasad discussed the "companionship relationship" that Alexa offers consumers.
The executive remarked, "In this companionship role, human attributes like empathy and affect are crucial to establishing trust." He added: "In these times of the ongoing pandemic, when so many of us have lost someone we love, these attributes have become even more important. AI can undoubtedly make their memories last, even though it cannot take away the grief of their loss."
The feature “enables sustained personal interactions,” according to Prasad.
It is true that a great number of people are sincerely seeking human "empathy and affect" to help them cope with the emotional suffering the COVID-19 pandemic has caused. But Amazon's AI voice assistant is not the place to meet those needs. Nor can Alexa actually enable "sustained personal interactions" with people who have passed away.
It's not difficult to believe that this feature is being developed with the best of intentions, and that hearing the voice of someone you miss can bring real comfort. In theory, we could even see ourselves having fun with such a feature, like using Alexa to make a friend sound silly. And, as noted above, other companies employ deepfake technology in ways comparable to what Amazon demonstrated.
However, framing a developing Alexa skill as a way to restore a connection with departed family members is a huge, unrealistic, and troublesome leap. Meanwhile, tugging at people's heartstrings by invoking the loneliness and anguish of the pandemic seems gratuitous. Grief counseling is one area where Amazon doesn't belong.