That Was Your CFO on Zoom, Wasn't It?

When most people hear "deepfake", their minds jump straight to celebrity face-swaps or the disturbing non-consensual images that make the news. Fair enough; that's all genuinely alarming. But if you're running a business, there's a whole other side to this AI-powered trickery that should have your attention.

A deepfake is AI-generated content (video, audio, or images) designed to look and sound like a real person. The tech has become scarily good, and it's no longer just Hollywood that has access. Anyone with a decent computer and some determination can create convincing fakes.


Here's the bit that matters for businesses: social engineering attacks are getting a serious upgrade.

It's already happening overseas

In early 2024, a finance worker at UK engineering firm Arup was tricked into transferring US$25 million after joining what looked like a routine video call with the company's Chief Financial Officer and several senior colleagues. Everyone else on that call was a deepfake. The criminals had used publicly available footage of the real executives to create AI-generated video and audio convincing enough to fool someone who was actively trying to verify the request.

Italian carmaker Ferrari had a close call in mid-2024 when scammers cloned CEO Benedetto Vigna's voice, right down to his southern Italian accent. The attempt failed only because an executive thought to ask the fake "CEO" about a book Vigna had recently recommended. The scammer couldn't answer and hung up.

And it's happening here in Australia too

This isn't just an overseas problem. Research from Mastercard found that 20% of Australian businesses were targeted by deepfakes in the past 12 months, and more than one in ten fell for it. CommBank's research shows one in four Australians have witnessed a deepfake scam in the past year.

Australian faces are being stolen too. In 2024, deepfake videos of Dr Karl Kruszelnicki appeared across Facebook and Instagram, hawking blood pressure pills he'd never endorsed. Users reported the ads to Facebook, only to be told they didn't violate community standards.

Professor Jonathan Shaw from Melbourne's Baker Heart and Diabetes Institute had patients calling his clinic after seeing a deepfake of him promoting diabetes supplements. Supplements he'd never heard of, let alone recommended.

What should you be watching out for?

Fake video calls are a big one. That Teams or Zoom call from your "boss" asking you to transfer funds or share credentials? It might not be them. Real-time deepfake video quality is improving fast.

Voice cloning is another. Scammers need only a few minutes of audio (from a public video, a podcast, or even your voicemail greeting) to create a realistic clone of your voice.

Then there's good old invoice fraud with an AI twist. Criminals are using deepfakes to approve transactions, redirect payments, and extract confidential data from staff who genuinely believe they're talking to leadership.

What can you do about it?

Start by making sure your team knows this technology exists. They shouldn't automatically trust what they see and hear, especially when money or sensitive information is involved.

Set up verification protocols that don't rely on recognising someone's voice or face. A code word for urgent financial requests, or a policy of calling back on a known number, could be what saves you. A Ferrari executive potentially saved the company millions simply by asking a question only the real CEO could answer.

And have a proper chat with your IT provider about your current security setup. Things are moving fast, and what worked last year might not cut it anymore.

Want to talk through your options? Get in touch!

We'll help you get your IT together!

Jamie Wilson, Founder
