How to Avoid the Latest AI Voice Cloning Scam

AI text-to-speech tools are now being used regularly to clone individuals' voices and exploit their family members.

Do you think you could tell the difference between a voicemail from someone you love and a voicemail generated by AI? Well, scammers using AI voice cloning technology to defraud unsuspecting victims certainly hope you can’t – and they’re more active now than ever.

A survey recently conducted by McAfee found that a quarter of adults have already encountered an AI voice scam, either personally or through someone they know, with more than three-quarters of those targeted losing hard-earned money as a result. It also revealed that more than half of adults share their voice on social media platforms and other online spaces at least once a week – which gives threat actors exactly what they need to carry out this sort of scam over and over again.

So, how do AI voice cloning scams actually work, how common are they, and how can you spot one? We take a closer look.

What Is an AI Voice Cloning Scam?

An AI voice cloning scam is any scam that uses artificially generated audio files to dupe victims into thinking their loved ones are in danger, or need urgent financial assistance and have contacted them for help.

In a scam of this kind, a fraudster will run a clip of a subject speaking – often scraped from social media – through an AI voice generator. Using machine learning, the generator will analyze the cadence, tone, and pitch of the initial clip, and then allow the fraudster to produce unique, original audio that mimics the subject’s voice near perfectly.

The scammer will then send these recordings to friends and relatives of the subject via apps like WhatsApp, hoping they’re unable to distinguish between their loved one and an AI-generated version of their voice.

As with other online scams, the scammer will try to inject a sense of urgency and distress into their correspondence, in order to nudge the target into acting rashly or erratically.

In one recent case, an AI voice scammer tried to convince a mother in the US that her daughter had been kidnapped by cloning the child’s voice.

Concerningly, a budding scammer won’t find it difficult to unearth the audio files they’ll need to target a victim – in fact, they’re likely to be spoiled for choice.

A May 2023 survey published by McAfee, involving over 7,000 people from seven different countries, found that 53% of respondents share their voice online at least once a week. In India, this figure was 86%.


How Voice Cloning Actually Works

AI voice cloning is only possible with a tool called an AI voice generator. In a nutshell, these generators – often called “text-to-speech” or “TTS” tools – turn text files into speech.

AI voice generators use machine learning to teach themselves to speak in specific ways by analyzing information from audio files of people speaking. The generators then apply what they’ve learned to read text files supplied by users and generate original audio content.
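To make that concrete, here’s a minimal sketch of what the most basic kind of text-to-speech looks like in code. It uses pyttsx3, a free offline Python TTS library chosen purely for illustration – it reads text aloud in a stock synthetic voice and can’t clone anyone, but it shows the core idea of turning written text into audio.

```python
# Minimal text-to-speech sketch using pyttsx3 (pip install pyttsx3).
# Illustration only: text goes in, synthetic speech comes out.
import pyttsx3

engine = pyttsx3.init()             # start the offline speech engine
engine.setProperty("rate", 160)     # speaking speed in words per minute

# Speak a sentence out loud through the default audio device
engine.say("Hi, it's me. Please call me back when you get this.")
engine.runAndWait()

# The same text can also be saved as an audio file
engine.save_to_file("Hi, it's me. Please call me back when you get this.", "message.wav")
engine.runAndWait()
```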

Many of these generators offer a library of preset voices – including celebrity soundalikes – that you can select to read your text, but others will let you record your own voice and generate subsequent audio content from it.
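To illustrate why a few seconds of audio scraped from social media is all a scammer needs, here’s a hedged sketch of that record-your-own-voice workflow using Coqui TTS, an open-source voice generator (named here only as an example – it isn’t one of the tools discussed in this article). The file names are placeholders: the model takes a short reference recording and generates new speech that imitates the voice in it.

```python
# Illustrative sketch using the open-source Coqui TTS package (pip install TTS).
# The XTTS model conditions its output on a short reference recording, so the
# generated audio imitates whoever is speaking in that clip.
# "my_voice_sample.wav" and "cloned_message.wav" are placeholder file names.
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="This is a demonstration of speech generated from a short voice sample.",
    speaker_wav="my_voice_sample.wav",  # a few seconds of the speaker's voice
    language="en",
    file_path="cloned_message.wav",     # new audio in that speaker's voice
)
```

The point isn’t the specific tool – it’s that the entire workflow amounts to a handful of lines, which is exactly why scammers can repeat it at scale.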

Text-to-speech tool Descript.com offers a self-described “state-of-the-art voice generator that creates an ultra-realistic clone of your own voice”.

AI voice generators are now widely available for all sorts of devices. These programs deliver a lot of value to people who have difficulty reading, or simply learn better when listening to audio rather than reading the written word.

They’re also used by advertising companies that don’t have the budget to hire an expensive voiceover artist for their marketing content.

The success and widespread use of ChatGPT has put a renewed focus on AI tools of all shapes and sizes, including those that can be used for audio cloning. Despite their legitimate uses, there is now a small ecosystem of TTS AI tools that can, unfortunately, be abused for nefarious ends, including scamming people.

How Common Are AI Voice Clone Scams?

In the recently released McAfee survey, the cybersecurity giant found that 1 in 4 adults surveyed have experienced an AI voice scam: 10% have been personally targeted, while 15% know someone who has.

77% of those targeted reported that they lost money due to the scam. McAfee reports that out of that 77%, “more than a third lost over $1,000, while 7% were duped out of between $5,000 and $15,000.”

Victims in the US lose the most, the survey reveals: 11% of US victims who lost money through AI voice cloning scams lost between $5,000 and $15,000.

How to Tell If a Message Is an AI Voice Clone Scam

The McAfee survey also found that 70% of people said they were “unsure” if they’d be able to tell the difference between an AI voice and a human one.

More than a quarter (28%) of US respondents said they wouldn’t be able to tell the difference between a voicemail left by an “Artificial Imposter”, as McAfee puts it, and one left by a loved one.


Remember, scammers may be able to replicate the voice of a loved one – but taking control of your loved one’s number or WhatsApp account is a lot harder.

It can be hard to act calmly when it sounds like one of your relatives or friends is in distress. But with AI voice scams becoming increasingly common, it’s important you do. There are some signs that an AI voice message might be a scam:

  • An unusual contact method (e.g. an unknown number)
  • Immediate requests for large amounts of money
  • Requests for money to be transferred through unusual means (e.g. gift cards or crypto)
  • A demand that you don’t tell anyone about the call/incident

With that in mind, here’s what the FTC advises you to do:

  • Call the number that left you the message to verify who it is
  • Ring your loved one or friend on their personal number
  • Message family and close friends of the person in question

If you cannot make contact, it’s important that you inform law enforcement immediately. And for those of you who haven’t yet been targeted by one of these scams but want to ensure you don’t fall victim to one, establish a safe word with your family and friends.

Agreeing on a code word that you and your loved ones can use to identify yourselves to one another is one of the best ways to ensure you don’t fall victim to an AI voice scam. This will be particularly useful for elderly family members, and as long as it’s never written down, it can be kept quite simple.

It’s also important to keep up with the latest methods, techniques, and formats that AI voice scammers are using to extort victims. Along with being vigilant and treating calls from unknown numbers with extreme caution, keeping your ear to the ground is often the best thing you can do.
