
Amazon Uses Kid’s Dead Grandma in Morbid Demo of Alexa Audio Deepfake

July 22, 2022

With just a short recording, Amazon has figured out how to make its Alexa voice assistant deepfake the voice of anyone, living or dead. The company leaned on the pandemic’s emotional toll of grief and loss to pique interest when it played the demo video on Wednesday at its re:MARS conference in Las Vegas.

Amazon’s re:MARS conference focuses on robotics, artificial intelligence, and other emerging technologies, with technical experts and business leaders taking the stage. During the second day’s keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, demonstrated a feature currently in development for Alexa.

In the demonstration, a child asks Alexa to finish reading him The Wizard of Oz. “Okay,” Alexa replies in her distinctive feminine computerized voice. Then the voice of the child’s grandmother begins reading L. Frank Baum’s story from the speaker.

Prasad did not say how much work remains, or when or whether the feature will be released; he mentioned only that Amazon is working on it. He did, however, share a few technical details.


Rather than requiring hours of studio recording, “this necessitated an invention where we had to learn to generate a good quality voice in less than a minute of recording time,” he said. Amazon got there by framing the problem as a voice-conversion task rather than a speech-generation task.
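To make the distinction concrete: “voice conversion” in this sense generally means conditioning a pretrained speech model on a brief reference clip, so the system only has to capture one speaker’s characteristics rather than learn to talk from scratch. Amazon has not published its method, but a rough open-source analogue is zero-shot voice cloning with Coqui’s TTS library and its YourTTS model; the sketch below is purely illustrative of that general idea, and the file names are placeholders.

```python
# Illustrative sketch only: zero-shot voice cloning with the open-source
# Coqui TTS library (YourTTS model), not Amazon's unreleased system.
# File paths are placeholders.
from TTS.api import TTS

# Load a pretrained multilingual model that supports cloning a voice from a
# short reference clip, instead of training a new voice on hours of audio.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/your_tts",
          progress_bar=False)

# Synthesize new speech in the voice captured by the reference recording,
# which can be under a minute long, the same premise as Amazon's demo.
tts.tts_to_file(
    text="Dorothy lived in the midst of the great Kansas prairies...",
    speaker_wav="reference_voice.wav",  # short sample of the target voice
    language="en",
    file_path="story_in_cloned_voice.wav",
)
```

The appeal of this framing is that the expensive step, learning what human speech sounds like in general, is done once on a large corpus; the short clip only has to pin down the traits of a single speaker.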

Naturally, deepfakes have developed a bad reputation. Nevertheless, some efforts have been made to use the technology as a tool rather than a creepy gimmick. As The Verge points out, audio deepfakes are frequently used in media production to patch over podcasters’ flubs or the sudden death of a project’s star, as in the Anthony Bourdain documentary Roadrunner. The outlet also noted instances of people using AI to build chatbots that converse as if they were a lost loved one.

Alexa would not even be the first consumer device to use deepfake audio to stand in for a relative who cannot be present in person. Gizmodo has covered Takara Tomy’s smart speaker, which uses AI to read bedtime stories to kids in a parent’s voice. Parents reportedly upload their voices by reading a script for roughly 15 minutes. The crucial difference from Amazon’s demo is that the product’s owner chooses to supply their own voice, rather than the device borrowing the voice of someone who may never have been able to give consent.


Several Disturbing Aspects

Aside from worries about deepfakes being used for scams, rip-offs, and other malicious activity, there are already several disturbing aspects of how Amazon is framing the feature, which doesn’t even have a release date yet. Before demonstrating it, Prasad spoke about the “companionship relationship” that Alexa offers consumers.

“In this companionship role, human attributes like empathy and affect are crucial to establishing trust,” Prasad remarked. “In these times of the ongoing pandemic, when so many of us have lost someone we love, these attributes have become even more important. While AI cannot take away the grief of loss, it can certainly make their memories last.”

The feature “enables lasting personal relationships,” according to Prasad.

It is true that a great many people are sincerely looking for human “empathy and affect” to help them cope with the emotional suffering the COVID-19 pandemic has caused. But Amazon’s AI voice assistant is not the place to meet those human needs. And Alexa cannot actually enable “lasting personal relationships” with people who have died.

It’s not hard to believe that this feature is being developed with the best of intentions, or that hearing the voice of someone you miss could bring real comfort. In theory, we could even imagine enjoying such a feature ourselves; getting Alexa to make a friend sound silly is a fun, harmless trick. And, as noted above, other companies already use deepfake technology in ways comparable to what Amazon demonstrated.

However, framing a nascent Alexa feature as a way to restore a connection with deceased family members is a huge, unrealistic, and troubling leap. And tugging at people’s heartstrings by invoking pandemic-era loneliness and grief feels gratuitous. Grief counseling is one area where Amazon does not belong.
