
Amazon’s plan for Alexa to mimic anyone’s voice raises fears it will be used for deepfakes and scams

By Sophie Mellor
June 23, 2022, 6:55 AM ET

Amazon Alexa’s newest feature: bringing people back from the dead.

Amazon is developing new technology for its voice assistant Alexa that will be able to mimic any human’s voice, dead or alive, using less than a minute of recorded audio.

At the company’s Re:Mars conference in Las Vegas on Wednesday, Amazon’s senior vice president and head scientist Rohit Prasad demonstrated the feature using a video of a child asking an Amazon device “Alexa, can Grandma finish reading me The Wizard of Oz?”

Alexa confirms the request with its default, robotic voice, then immediately switches to the humanlike, soft, and kind tone of the child’s grandmother.  

Prasad noted during the demonstration that the feature could be used to help memorialize a deceased family member. “So many of us have lost someone we love” during the COVID-19 pandemic, he said—a reality that has pushed the company to make artificial, companion-like conversation a key focus.

“While A.I. can’t eliminate that pain of loss, it can definitely make the memories last,” Prasad said.

But despite the uplifting, emotional framing of the presentation, the new Alexa capability drew quick pushback from some in the technology world. Rather than a means of emotional connection, they saw voice mimicry as an ideal tool for deepfakes, criminal scams, and other nefarious ends.

The technology

An Amazon spokesperson told Coins2Day that Prasad’s presentation was based on the company’s exploratory text-to-speech (TTS) research, which draws on recent advancements in the field. “We’ve learned to produce a high-quality voice with far less data versus recording in a professional studio,” the spokesperson said.

The voice mimicry feature is still in development, and the company has not disclosed when it intends to roll it out to the public.

The new voice speech pattern technology will need only “less than a minute of recorded audio” to produce a high-quality voice, Prasad said, which is possible “by framing the problem as a voice conversion task and not a speech generation path.”

The new technology might one day become ubiquitous in shoppers’ lives, and Prasad noted it could be used to build trust between users and their Amazon devices.

“One thing that surprised me the most about Alexa is the companionship relationship we have with it. In this companionship role, human attributes of empathy and affect are key for building trust,” he said.  

Fears

While the new mimicry feature may be innovative, it raises fears among some, including companies that work in the field, that it could be put to nefarious use.

Microsoft, which has also created voice mimicry technology to help people with impaired speech, restricted which segments of its business could use it over fears it would enable political deepfakes, the company’s chief responsible A.I. officer, Natasha Crampton, told Reuters.

The new feature is also stoking worries online.

“Remember when we told you deepfakes would increase the mistrust, alienation, & epistemic crisis already underway in this culture? Yeah that. That times a LOT,” said Twitter user @wolven, whose bio identifies him as Damien P. Williams, a Ph.D. researcher in algorithms, values, and bias.

Others fear how easily scammers could use the technology for their own benefit.

“Umm, so how soon will criminals be able to use it to call your family members begging them to Venmo cash? Or ask them for social security numbers? Or bank information?” asked Twitter user @bitty_in_pink.

Mike Butcher, editor of TechCrunch’s ClimateTech, noted, “Alexa mimicking a dead family member sounds like a recipe for deep psychological damage.”

Others advised people to stop buying the device completely.

https://twitter.com/bigblackjacobin/status/1539753998307233793
