They say technology is supposed to make our lives easier, but scammers seem to have taken that memo and twisted it for their own gain. If you thought robocalls were bad, wait until you hear how AI-powered scams are changing the game—and not in a good way.
A terrifying new scam is making the rounds, and it’s using artificial intelligence to mimic the voices of loved ones in distress. One woman in Wisconsin learned this the hard way when she received a call from what sounded exactly like her daughter—crying, panicked, and in desperate need of help.
How It Works
Scammers are now leveraging AI to create eerily accurate voice clones, then calling the cloned person’s relatives with urgent pleas for help. Here’s how they operate:
- A Call from a Loved One: The victim receives a call that sounds exactly like their child, grandchild, or spouse, claiming they’ve been in an accident or arrested.
- An Urgent Plea for Money: A second scammer then takes over the call, posing as a lawyer or police officer, and demands money immediately to avoid jail time or further legal trouble.
- Secrecy is Key: Victims are warned not to tell anyone due to a supposed "gag order" or "legal restrictions."
- Cash Pickups or Wire Transfers: The scammer arranges for a courier to physically collect cash or instructs the victim to send funds via wire transfer or cryptocurrency, channels that make the money nearly impossible to recover.
Who’s Targeted?
- The elderly are the primary targets, as they are more likely to trust a call that sounds like a loved one and may be less familiar with AI-generated scams.
- People with public social media profiles, since scammers can gather voice samples from videos or audio clips posted online.
- Anyone with family members living far away, since distance makes it harder to verify the legitimacy of a call in person.
A Real-Life Example
Loris Seibel from Chippewa County, Wisconsin, fell victim to this cruel scam. She received a call from what sounded like her daughter, crying and saying she had been in a car accident involving a politician. A supposed "lawyer" then demanded cash to prevent her daughter from going to jail.
Believing the call was real, Loris withdrew money multiple times, handing it over to couriers until her bank flagged the transactions as suspicious. By then, her life savings and retirement funds were gone. Her daughter had never been in an accident—it was all an elaborate AI-generated scam.
Why You Should Care
This isn’t just another scam; it changes the game. AI-powered scams are nearly impossible to detect through traditional means: caller ID can be spoofed, and even hearing the voice of a loved one is no longer enough to guarantee a call is legitimate. The financial and emotional toll can be devastating, wiping out savings and leaving victims traumatized.
How to Protect Yourself
- Verify the Call: If you receive a distressing call from a loved one, hang up and call them directly on a known number.
- Ask Personal Questions: Scammers may struggle to answer specific personal questions that only your real family member would know.
- Be Skeptical of Urgency: Scammers thrive on panic. If someone is pressuring you to send money immediately, take a step back and verify the story.
- Limit Personal Information Online: Avoid posting voice recordings or videos on public social media profiles, as scammers can use these to train AI models.
- Consult a Trusted Source: If in doubt, check with another family member, your bank, or even law enforcement before making any financial transactions.
Quick Tips
Did you know? AI-generated voice scams can create a convincing voice clone with as little as 3 seconds of audio.
Pro Tip: If you receive a suspicious call, ask the caller about a childhood pet’s name or a family inside joke. AI can clone a voice, but the scammer behind it can’t fake knowledge they don’t have.
Stay safe, stay informed.
Definitions
- AI-Generated Voice Fraud – A scam using artificial intelligence to clone a person's voice and deceive victims.
- Voice Cloning – The process of using AI to replicate a person’s speech patterns and tone.
- Financial Fraud – Any scheme that tricks individuals into giving away money or sensitive information.
- Social Engineering – Manipulative tactics used to exploit human psychology for fraudulent purposes.
To read more, find the source article here.