Robocalls that use AI-generated voices get a step closer to being outlawed
- AI is being used by scammers to mimic the voices of loved ones, people in power, and more.
- The FCC proposes that robocalls that use AI-generated voices be made fundamentally illegal.
- The move would make it easier to charge the people behind the calls.
Ever since AI became a hot topic in the tech industry, people have been finding new ways to use the technology. Unfortunately, that includes fraudsters, who use AI to scam victims out of money or personal information. For example, the number of robocall scams that use AI to mimic the voices of real people has exploded in recent years. Fortunately, there are features like Samsung Smart Call that block robocalls. But for the ones that slip through, it looks like the FCC is making a move to end the threat of robocalls that use AI-generated voices.
According to TechCrunch, the FCC is proposing to make robocalls that use AI voice cloning fundamentally illegal. The goal is to make it easier to charge the individuals behind the scams.
Under the current rules, robocalls themselves aren't illegal; a call only becomes illegal if it's found to be breaking the law in some other fashion. The FCC does have the Telephone Consumer Protection Act, which prohibits "artificial" voices, to protect consumers. However, it's not clear whether a voice emulation created by AI falls under that category.
What the FCC is attempting to do here is bring AI voice cloning under the "artificial" umbrella. That would make it clearer whether a robocall using a cloned voice is breaking the law.
Recently, AI-generated robocalls were used to imitate President Biden's voice. Scammers used the tactic in an attempt to suppress voting in New Hampshire. To help prevent instances like this and other fraud in the future, the FCC will need this ruling to pass quickly, before things get even more out of hand.