
Scammers are using AI voices to steal millions by impersonating loved ones

Over 5,000 victims were conned out of their money over the phone in 2022.
Published on March 6, 2023

TL;DR
  • AI voice-generating software is allowing scammers to mimic the voices of loved ones.
  • These impersonations have led to people being scammed out of $11 million over the phone in 2022.
  • The elderly make up a majority of those who are targeted.

AI has been a central topic in the tech world for a while now, as Microsoft continues to infuse its products with ChatGPT and Google attempts to keep up by pushing out its own AI products. While AI has the potential to do some genuinely impressive stuff — like generating images based on a single line of text — we’re starting to see more of the downsides of this barely regulated technology. The latest example is AI voice generators being used to scam people out of their money.

AI voice generation software has been making plenty of headlines lately, mostly for stealing the voices of voice actors. Initially, the software needed a few sentences of audio to convincingly reproduce a speaker’s sound and tone. The technology has since evolved to the point where just a few seconds of dialogue is enough to accurately mimic someone.

In a new report from The Washington Post, thousands of victims say they have been duped by imposters pretending to be loved ones. Imposter scams have reportedly become the second most common type of fraud in America, with over 36,000 cases submitted in 2022. Of those, over 5,000 victims were conned out of their money over the phone, totaling $11 million in losses, according to FTC officials.

One story that stood out involved an elderly couple who sent over $15,000 through a bitcoin terminal to a scammer after being convinced they had spoken with their son. The AI-generated voice persuaded the couple that their son was in legal trouble for killing a U.S. diplomat in a car accident.

As with the victims in that story, these attacks appear to mostly target the elderly. That comes as no surprise, as older people are among the most vulnerable to financial scams. Unfortunately, the courts have yet to decide whether companies can be held liable for harm caused by AI voice generators or other forms of AI technology.