Deepfakes & trust in the age of accessible AI

Deepfake frauds are the next step in social engineering attacks. 

They exploit our relationships and our trust in people – whether it’s Taylor Swift appearing to endorse Le Creuset or our CEO telling us to transfer funds to a new account – manipulating identities and relationships for financial gain. 

And with advances in AI, deepfakes are becoming both more convincing and easier to make. It’s not surprising that we’re seeing more businesses being targeted by these scams. 

Deepfakes are a problem globally and across sectors.

The 2024 Deepfake Trends report looked at several industries in the US, Germany, Singapore, the UAE and Mexico and found that: 

  • 49% of respondents experienced both audio and video deepfakes (up from 37% for audio and 29% for video in 2022). The UAE and Singapore were most severely impacted, with over half of respondents saying they had experienced a deepfake attack. 
  • IT companies, law enforcement agencies and crypto firms reported the most problems with deepfake videos, while law enforcement, crypto, financial services and aviation businesses saw the biggest problems with audio deepfake fraud. 

Deepfakes in action 

Audio deepfakes 

In 2019, fraudsters used an audio deepfake to steal $243,000 from a UK-based energy company. The cloned voice mimicked the chief executive of the firm’s German parent company. 

The message requested an urgent cash transfer to a supplier based in Hungary and assured the UK CEO that the funds would be reimbursed soon. 

The attackers even tried, and failed, to get two more payments transferred. 

Video deepfakes

Early 2024 saw some high-profile attacks using video deepfakes. 

Mark Read, CEO of ad giant WPP, sent an email to employees warning of deepfakes after his image was used as part of a scam attempt. 

Attackers set up a WhatsApp account using his publicly available photo and arranged a Teams meeting with another senior WPP executive. They then used voice-cloning technology and YouTube footage of Read to try to scam the agency leader. 

The attempt failed because the executive didn’t bite. 

Fraudsters did successfully steal money from UK engineering firm Arup. 

They scammed the business out of £20m via a video deepfake: a Hong Kong-based employee made 15 transactions to five accounts after joining a video call with what appeared to be multiple senior executives. 

Investigators believed that the attackers downloaded videos of the executives and used AI to add fake voices for use in the video call. (You can hear more about the Arup deepfake incident on the latest episode of What Just Happened?) 

The big question is how to avoid falling victim to a deepfake scam. 

Training and education around what deepfakes are and how they can be identified are important, but one of the key things to get right is trust. 

The importance of trust 

While it’s important to encourage trust in executive teams, the trust leaders have in their employees is just as crucial. 

Organisations with a culture that encourages pushing back against instructions that make no sense or seem out of character stand a better chance of uncovering a deepfake.  

By contrast, an employee in a culture that’s highly deferential to leaders, where they feel they can’t say “no”, may be less likely to raise any suspicions they have. 

Leaders who trust their employees to give their honest opinion, and employees who trust their leaders to handle that honesty with grace, are vital when defending against deepfakes. 

It also means leaders understanding that sometimes people will call, email or drop by their office to double-check something after a video or phone call. If people feel judged, or that they’ve annoyed the boss, they may be less likely to “bother” them with verifying communications in future. 

The threat from deepfakes is only going to increase, which makes it more important than ever for teams to know, understand and trust each other enough to be vulnerable and courageous in their communication. 

Featured image by Sander Sammy on Unsplash
