Keller Williams Platinum - Greg Ouren & Patty Wilton | The Ouren Group

Fraudsters Up Their Game With AI Voice Cloning

Scammers are innovating an old scheme by tapping the power of artificial intelligence (AI) to clone voices and steal money by convincing someone that a close friend or loved one is in desperate trouble.

The deception could unfold like this:

A grandmother answers the phone and hears, “Help me!”

At the other end is her distressed grandson who’s caused an accident. He’s on the way to jail and desperate for bail money.

She recognizes his voice, is convinced it's her grandchild, and follows the instructions to get him the money.

That could entail wiring money, using a payment app, or buying gift cards and giving the card numbers and PINs to the caller.

But it turns out the caller isn’t her grandson, and he’s not in trouble. She has unwittingly handed money to a scam artist.

Old swindle, new technology
The Federal Trade Commission says these fake emergencies share several characteristics, including:

  • Scammers create a sense of urgency, maybe saying you’re the only one who can help.
  • They might tell you not to discuss the call with others and that keeping the call a secret is essential.
  • They count on your emotions and fear to act quickly without questioning or checking whether the emergency is real.

Though everyone’s voice is unique, it has gotten easier to clone people’s voices. After all, 53% of adults share their voice data online at least once a week, often on social media, says a McAfee Corporation report, The Artificial Imposter, that examines how AI is fueling more online voice scams. It found that scam artists can clone a person’s voice using just a few seconds of audio from an Instagram Live video or a TikTok post.

AARP, the Federal Trade Commission, and McAfee offer ways to protect yourself. Here are three tips.

  1. Create a code word or phrase with family and friends. If you get a distress call, ask for the secret word to confirm it’s a loved one calling. Or, ask something only your friend or loved one would know the answer to: “What’s your dog’s name?” or “Where did you spend Thanksgiving last year?”
  2. Stay calm and ask yourself some questions. Does that voice really sound like my loved one? Would they call me in this kind of emergency? Is this legitimate? Instead of panicking and complying immediately, call the supposedly endangered person or other family members to discuss the situation.
  3. Be careful about sharing your voice and life on social media. Consider who’s in your social media feeds, how well you know them, and whether you trust them. McAfee notes that your risk of exploitation increases as you expand your online social circle and disclose more personal information.