Safeguarding Our Customers


Stay Alert – Voice Cloning Scams Can Be Convincing

Cybercriminals are using artificial intelligence to clone people's voices, and these scams are leading to distressing phone calls for people around the country. All that's needed is a brief clip of someone talking, often pulled from social media, and criminals can create an eerily similar clone of that voice, a deception so convincing that a mother may believe she's hearing her own child.

Victims of voice-cloning scams are tricked into thinking they're talking to a distraught relative who desperately needs money because they've been in a car accident, robbed, or kidnapped. Readily available technology enables cybercriminals to respond in real time during these calls by typing out sentences in their voice-cloning apps. Some go as far as to research personal information about the victim's relative to make the call more believable.

Cybercriminals often ask for forms of payment that are difficult to trace or recover, such as gift cards, wire transfers, reloadable debit cards, and even cryptocurrency. As always, requests for these kinds of payments should raise a major red flag.

To protect yourself from becoming a victim of a voice-cloning scam, do the following:
  • Set a verbal codeword with kids, family members, or trusted close friends. Make sure it's one only you and those closest to you know, and agree that everyone will use it whenever they call or message asking for help.

  • Always question the source. In addition to voice-cloning tools, cybercriminals have other tools that can spoof phone numbers so that they look legitimate. Even if it's a call from a number you recognize, pause and think. Does that really sound like the person you think it is? To be safe, hang up and call the person directly.

©2023 Cornerstone Group

Trademarks: All brand names and product names used herein are trade names, service marks, trademarks, or registered trademarks of their respective owners.
