This Bank Says ‘Millions’ of People Could Be Targeted by AI Voice-Cloning Scams

If you thought your voice could only betray you after one too many karaoke nights, think again! These days, it might be AI cloning it to scam your loved ones. Yes, you heard that right, and no, this isn’t a plot from the latest sci-fi flick!

Scammers are now using artificial intelligence (AI) to mimic voices with astonishing accuracy—enough to fool your friends and family. All they need is a mere three seconds of your voice. That quick birthday shoutout video you posted? Perfect. Scammers use snippets like that to build a convincing audio clone, then call your loved ones pretending to be you, asking for emergency money transfers.


How it works:

  1. They find a short clip of your voice (from social media, a video, or anywhere you speak).
  2. The AI tool clones your voice.
  3. Scammers call your family or friends, sounding exactly like you, claiming to be in urgent need of money.


Who’s Targeted? 

Pretty much anyone with an online presence. If you post videos with your voice on social media, you could be a target. As this scam becomes more widespread, we’re seeing millions of potential victims, from tech-savvy teenagers to unsuspecting grandparents.


Real-Life Shocker:

In a survey of over 3,000 adults conducted by Starling Bank, more than 25% of respondents said they had been targeted by an AI voice-cloning scam in the past 12 months! Even scarier? Almost half didn’t know such scams existed, and 8% admitted they might send money even if they felt something was off.


Why Should You Care?

Imagine your own voice being used to trick your loved ones into sending cash to criminals. It’s personal. Beyond financial loss, this can leave a deep emotional scar—people trust voices they recognize. Plus, as AI continues to evolve, the risks of identity theft, financial scams, and even misinformation grow exponentially.


How to Protect Yourself:

Here are five practical steps to stay safe from voice-cloning scams:

  1. Set Up a Safe Phrase: Pick a random phrase with your family and friends that only you all know. If someone calls claiming to be you, ask for the phrase!
  2. Limit What You Share: Be cautious of how much personal content (especially with your voice) you post online.
  3. Double-Check: If you get a strange request, verify it. Hang up and call the person back on a number you already have for them, or start a video call—don’t just trust the number that called you.
  4. Stay Updated: Be aware of the latest scams through reliable sources. Scammers are always coming up with new tricks.
  5. Delete After Sharing: If you must share sensitive info (like your safe phrase) over text, make sure to delete the message afterward.


Quick Tips:

  • Did You Know? AI tools are now so advanced that they can replicate someone’s voice from only a few seconds of audio.
  • Pro Tip: Never act on any money requests made over a phone call without confirming it’s really your friend or family member.

Ever experienced an AI scam, or know someone who has? We want to hear your story! Sharing your experience could help protect someone else. Drop a comment or send us a message!

Stay safe, stay informed!


Key Terms Explained:

  • Artificial Intelligence (AI): AI refers to the simulation of human intelligence by machines. In this case, AI is used to clone or replicate a person’s voice by analyzing audio data and producing speech that mimics that individual.
  • Voice-Cloning Scam: A type of scam where criminals use AI technology to replicate someone’s voice. They then use the fake voice to trick victims—typically family or friends—into sending money or personal information.
  • Safe Phrase: A pre-agreed secret phrase shared between you and your trusted family or friends, which can be used to verify a person's identity during phone conversations, helping to prevent fraud.
  • Identity Theft: This occurs when someone illegally uses your personal details, like your Social Security Number, to commit fraud or other crimes, often for financial gain.
  • Misinformation: False or misleading information that is spread, often unintentionally. Scammers can use AI-generated voices to spread misinformation or impersonate people, potentially leading to harmful consequences.

To read more, see the source article here.