Scammers Are Using AI Voice Cloning to Mimic Your Loved Ones
Have you ever gotten one of those sketchy texts claiming you need to confirm delivery details for a package, respond to a fraud alert from your bank, or pay an unpaid toll? Typically, these texts threaten legal action, identity theft, or other scary consequences that would make you click if you weren't so aware of scams.
They've become so common that most people know not to click the suspicious links they contain, even our grandparents, who are most likely to fall for these scams.
But what if it wasn't a text? What if it was a call that sounded exactly like a friend or family member, claiming they're in trouble or need cash fast?
Anyone - not just grandma or grandpa - might fall for that.
The Scary Reality of Scams
Scam calls and texts aren't just annoying - they can be dangerous. According to FBI data, senior citizens were scammed out of around $3.4 billion in 2023. And according to the Federal Trade Commission, consumers lost about $470 million to text scams alone in 2024.
You might also find that you're getting more and more scam messages every day. In 2024, about 83% of adults in the U.S. reported receiving scam calls weekly, while 82% reported receiving at least one scam text a week.
Scam texts are getting more sophisticated, and it's no longer just texts you need to worry about.
Scammers are getting smarter, and now they're using AI-enabled voice cloning tools to mimic the voices of our loved ones to try to get money.
Chuck Herrin, the field chief information security officer for F5, a security and fraud prevention firm, said these scammers "say things that trigger a fear-based emotional response because they know when humans get afraid, we get stupid and don't exercise the best judgment."
How Voice Cloning Tools Work
There are plenty of AI voice cloning tools out there, including Augie, Resemble AI, Speechify, Descript, and more. Many of these tools have safeguards that make them very difficult, if not impossible, for scammers to abuse, but some are not as well protected.
Some tools require your explicit permission before cloning and using your voice, while others will accept any audio recording, with or without your knowledge.
And money isn't the only thing scammers are after. These voice cloning tools have also been used to scam voters. For example, in 2024, someone cloned former President Joe Biden's voice and tried to discourage people from going to the polls.
Protect Yourself With a Code Word
Since scammers are getting better every day, it's important to stay vigilant and prepared. If you get a call from someone claiming to be a loved one asking for money, don't panic. Just ask them for the code word.
This is a safe word you pre-establish with your family members, one that can't be guessed easily. Avoid street names, previous schools, pet names, or other information that could easily be found online or on social media.
Safe words are good, but safe phrases can be even better. Create a phrase about four words long that is, again, not easily guessable.
And stay alert and firm when dealing with scammers. A scammer might claim they're too upset to remember the code word, but you should never give it away.
If you suspect you're on the phone with an AI voice cloning scammer, hang up immediately. Block the number and report it to the Federal Trade Commission or your phone carrier. You can also register your number on the National Do Not Call Registry to help reduce unwanted calls.