Finance Director Nearly Loses $670k to Scammers Using Deepfakes to Pose as Senior Execs

If your boss suddenly becomes extremely polite, well-dressed, and strangely available on Zoom… maybe double-check it's not a Hollywood-level scammer in disguise.

In this issue, we're uncovering one of the most dangerous tech-driven scams we've seen yet: one that nearly cost a global company close to half a million US dollars. Let's dive in.

Scammers are now using deepfake technology to impersonate high-level executives in live video calls, manipulating employees into transferring massive sums of money under false pretenses. These attacks are so sophisticated that even seasoned professionals are being fooled.


How It Works

Step 1: First Contact

The scam starts with a message from someone claiming to be a C-suite executive—usually through WhatsApp, email, or a company comms platform.

Step 2: Fake Meetings, Real Faces

The target is invited to a video call to discuss a “confidential” project. Deepfake tech is used to mimic the voices and faces of known executives, creating highly convincing visuals.

Step 3: NDAs & Urgency

The employee is often pressured to sign a non-disclosure agreement and is told the matter is urgent and top-secret—preventing them from seeking a second opinion.

Step 4: The Transfer Trap

After the fake meeting, the target is instructed to transfer funds to a specified account, believing it’s part of a legitimate corporate operation. But the money ends up in mule accounts controlled by scammers.


Who They Target

• Finance directors, accountants, senior managers—anyone with authority to move funds

• Typically large corporations or MNCs with complex organizational structures

• This scam has gone global, but recent high-profile cases have surfaced in Singapore and Hong Kong


Real-Life Example

On March 24, 2025, a finance director of a multinational firm received a WhatsApp message from someone posing as the CFO. He was asked to join a confidential Zoom call about business restructuring. On the call were deepfaked versions of the CEO and other executives.

Following the call and a follow-up conversation with a fake lawyer, the director transferred over US$499,000 from the company’s HSBC account—believing it was part of the restructuring plan. He only realized something was wrong when he was later asked to send another US$1.4 million.

Thanks to fast action and cross-border cooperation between Singapore’s Anti-Scam Centre and Hong Kong’s Anti-Deception Coordination Centre, the full amount was recovered. But not everyone gets that lucky.


Why You Should Care

This scam doesn’t just put money at risk—it strikes at the heart of internal trust within companies. With generative AI and deepfakes becoming more accessible, any employee could be manipulated if protocols aren't in place.

Beyond financial loss, the reputational damage and operational disruption can be devastating.


How to Protect Yourself and Your Company

• Verify Through Separate Channels

Always confirm unusual instructions—especially those involving money—via a different, verified communication method (e.g., a phone call or face-to-face check).

• Create Executive Verification Protocols

Implement mandatory secondary approvals or authentication checks for large fund transfers or executive-level requests.

• Limit Access to Sensitive Info

Restrict who has access to video footage, voice recordings, or internal images of top execs to reduce material for deepfake training.

• Train Staff on AI Scams

Regularly educate employees about deepfake tech and impersonation tactics, especially in finance, legal, and HR teams.

• Use Code Words or Callbacks

Establish internal "code words" or callback protocols that must be used during high-value or urgent requests.
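To make the approval and callback safeguards above concrete, here is a minimal sketch of how a dual-control transfer check might look in code. All names here (TransferRequest, can_execute, the threshold value) are illustrative assumptions, not a real banking or ERP API.

```python
# Illustrative sketch of a dual-control check for large fund transfers.
# The threshold, field names, and policy are hypothetical examples only.
from dataclasses import dataclass, field

LARGE_TRANSFER_THRESHOLD = 10_000  # assumed policy threshold, in dollars


@dataclass
class TransferRequest:
    requester: str
    amount: float
    destination: str
    approvals: set = field(default_factory=set)
    callback_verified: bool = False  # set only after a call to a pre-registered number


def can_execute(req: TransferRequest) -> bool:
    """Allow a transfer only when the policy checks pass."""
    if req.amount < LARGE_TRANSFER_THRESHOLD:
        return True
    # Large transfers need a second approver who is not the requester...
    has_second_approver = any(a != req.requester for a in req.approvals)
    # ...and confirmation over a separate, verified channel (e.g., a callback
    # to a phone number from the company directory, never one supplied in chat).
    return has_second_approver and req.callback_verified


req = TransferRequest(requester="finance.director", amount=499_000,
                      destination="unknown-beneficiary")
print(can_execute(req))            # False: no second approval, no callback yet
req.approvals.add("cfo")           # approval gathered via the company directory
req.callback_verified = True       # confirmed by phoning a known, trusted number
print(can_execute(req))            # True: both controls satisfied
```

The point of the sketch is that no single employee, however convinced by what they saw on a video call, can complete a large transfer alone: the system itself demands a second human and an out-of-band confirmation.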


Quick Tips & Updates

• Did You Know? A convincing deepfake can now be created from just 3–5 minutes of real audio and video, often pulled from public sources like LinkedIn or YouTube.

• Pro Tip: Set your social media to private and reduce the amount of video and voice content publicly accessible from your leadership team.

• Update Alert: With scam-related losses hitting a record S$1.1 billion in Singapore in 2024, authorities are tightening regulations on AI misuse and urging businesses to strengthen verification protocols.


Stay safe, stay informed.


Keyword Definitions

🔹 Deepfake – AI-generated audio or video that mimics real people, often used to impersonate someone convincingly.

🔹 Money Mule Account – A bank account used by scammers to receive and launder stolen funds.

🔹 Non-Disclosure Agreement (NDA) – A legal document that restricts parties from sharing confidential information.

🔹 Executive Impersonation Scam – A scam in which fraudsters pretend to be high-ranking officials to trick employees into taking action.

🔹 Anti-Scam Centre (ASC) – A task force under the Singapore Police Force focused on scam prevention and response.


To read more, you can find the source article here.

