
AI Voice Cloning Scams Target Crypto Executives in Multi-Million Dollar Vishing Attacks

It’s getting harder to know who’s really on the other end of the phone. I mean, we’ve all gotten those shady calls about a car warranty or someone from the “IRS” threatening arrest. Annoying, sure. But now it’s different. Apparently, cybercriminals are hiring professional voice actors and using AI tools to target people, especially folks in the crypto world. They’re not just reading a script anymore; they’re performing.

The goal is what’s called “vishing”—voice phishing. The idea is to trick someone into handing over money or sensitive info that gives access to accounts. It often starts with a call or a message from someone pretending to be from a place you trust. Maybe they say they’re from tech support, or that you owe money to some government agency. Sometimes, they even impersonate a co-worker or a boss, asking for login details because of some fabricated emergency.

How the Scams Actually Work

What’s really unsettling is how convincing these calls can be. According to the FTC, scammers might use bits of your personal information—like part of your Social Security number or your home address—to make the whole thing seem legitimate. It’s not just a random call; it feels specific. Personal. And that’s what makes it dangerous.

Perhaps the most jarring part is how advanced the technology has gotten. We’re not talking about robotic, text-to-speech voices anymore. These are sophisticated voice clones, sometimes built with the help of hired impersonators, that capture someone’s tone, their cadence, even the way they pause between sentences.

The Numbers Are Pretty Staggering

According to data from cybersecurity firm Right-Hand, deepfake-related vishing shot up by over 1,600% in the first part of this year compared to late 2024. In one case, a European energy company lost $25 million after criminals cloned the voice of its chief financial officer. An employee said the voice was spot-on—it matched the CFO’s mannerisms perfectly. By the time anyone realized it was a fraud, it was too late. The money was gone.

On average, individual victims are losing around $1,400 per incident. For companies, the recovery costs can run into the millions. Right-Hand also reported that 70% of organizations they surveyed had been targeted. When tested, one out of every four employees couldn’t tell a cloned voice from the real thing.

Why Crypto Execs Are a Prime Target

It makes sense, when you think about it. Crypto transactions are fast and, most importantly, permanent. Unlike a traditional bank transfer, which might be reversed if you catch it quickly, once crypto is sent, it’s pretty much gone. That’s why these voice scams are so effective—they create a sense of urgency, the money moves instantly, and there’s no undo button.
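To make the “no undo button” concrete, here’s a minimal sketch of what an on-chain transfer looks like, assuming web3.py on an Ethereum-style chain; the RPC endpoint, private key, and recipient address are placeholders, not real values.

```python
# Minimal sketch: broadcasting an Ethereum-style transfer with web3.py.
# Endpoint, key, and recipient below are placeholders for illustration.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # hypothetical RPC node
acct = w3.eth.account.from_key("0x" + "11" * 32)         # placeholder key

tx = {
    "to": "0x000000000000000000000000000000000000dEaD",  # attacker-supplied address
    "value": w3.to_wei(1, "ether"),
    "gas": 21_000,
    "gasPrice": w3.eth.gas_price,
    "nonce": w3.eth.get_transaction_count(acct.address),
    "chainId": w3.eth.chain_id,
}

signed = acct.sign_transaction(tx)
# `.rawTransaction` on older web3.py versions
tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)

# From this point on there is nothing left to call. The protocol has no
# cancel, refund, or chargeback; getting the money back means the
# recipient voluntarily sends it back.
print(f"broadcast {tx_hash.hex()} - irreversible once mined")
```

That asymmetry is the whole game: a bank wire has a fraud department behind it, while a mined transaction has no counterparty you can appeal to.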

Organized groups have noticed. Activity has ramped up in 2025, with groups like UNC6040 and North Korea’s Lazarus Group getting involved. They’ve used deepfakes in job interviews, created fake companies, and pulled off huge heists. Last year alone, attackers linked to Pyongyang stole over a billion dollars across dozens of incidents.

And it’s not just crypto. Even big tech isn’t immune. Google recently confirmed that hackers got into an internal database tied to Salesforce, stealing customer data. It feels like no one’s really safe from a well-executed impersonation.

So what do you do? I think it comes down to being skeptical. If someone calls asking for money or credentials, hang up and verify through a channel you already trust: call the person back on a number you know is theirs, not one the caller gives you. Urgency is the scam’s engine, so anything that slows the conversation down works in your favor.
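If you want to turn that habit into something a team can follow mechanically, here’s a hypothetical sketch of the callback rule expressed as code; the directory, names, and request fields are all invented for illustration, not any real product’s API.

```python
# Hypothetical "hang up and call back on a known number" policy.
# Directory contents and the request shape are invented for illustration.
from dataclasses import dataclass

# Numbers you already trust, sourced in advance (HR records, a signed
# internal directory) -- never from the call itself.
KNOWN_NUMBERS = {
    "cfo@example.com": "+1-555-0100",
}

@dataclass
class InboundRequest:
    claimed_identity: str   # who the caller says they are
    callback_number: str    # the number the caller asks you to use
    wants: str              # e.g. "wire transfer", "login credentials"

def verify_out_of_band(req: InboundRequest) -> str:
    known = KNOWN_NUMBERS.get(req.claimed_identity)
    if known is None:
        return "REJECT: identity not in the directory; escalate to security."
    # Caller ID and caller-supplied callback numbers are trivially spoofed,
    # so the rule ignores req.callback_number entirely.
    return f"HANG UP, then dial {known} yourself before acting on '{req.wants}'."

print(verify_out_of_band(
    InboundRequest("cfo@example.com", "+1-555-9999", "wire transfer")
))
```

The design choice worth noticing is that the inbound call contributes nothing to the decision: the only number that matters is the one you already had on file before the phone rang.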
