This is what a deepfake voice clone used in an unsuccessful fraud attempt sounds like

One of the strangest applications of deepfakes, the AI technology used to manipulate audiovisual content, is the deepfake audio scam. Hackers use machine learning to clone someone’s voice, then combine that voice clone with social engineering techniques to convince people to move money where it shouldn’t go. Such scams have been successful in the past, but how good are the voice clones used in these attacks? Until now, we had never actually heard the audio from a deepfake scam.

Security consulting firm NISOS released a report analyzing one of these fraud attempts and shared the audio with Motherboard. The clip below is part of a voicemail sent to an employee of an unnamed tech company, in which a voice that sounds like the company’s CEO asks the employee for “immediate assistance to finalize an urgent business deal.”

The quality is certainly not great. Even under the cover of a bad phone signal, the voice is a little robotic. But it’s passable. And if you were a junior employee worried after receiving a supposedly urgent message from your boss, you might not think too hard about the audio quality. “It definitely sounds human. They checked that box as far as: does it sound more robotic or more human? I would say more human,” Rob Volkert, a researcher at NISOS, told Motherboard. “But it doesn’t sound like the CEO enough.”

The attack was ultimately unsuccessful, as the employee who received the voicemail “immediately thought it suspicious” and reported it to the firm’s legal department. But such attacks will become more common as deepfake tools grow more accessible.

All you need to create a voice clone is access to lots of recordings of your target. The more data you have and the better the audio quality, the better the resulting voice clone will be. And for many executives at large companies, such recordings can be easily collected from earnings calls, interviews, and speeches. With enough time and data, the highest-quality audio deepfakes are much more convincing than the example above.
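To give a sense of just how accessible this tooling has become, here is a minimal sketch of voice cloning using the open-source Coqui TTS library and its pretrained XTTS model. This is only an illustration of the general technique, not the method used in the attack NISOS analyzed; the file names and sample text are hypothetical.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (assumes `pip install TTS`). The reference recording and output file
# names below are hypothetical placeholders.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in the target speaker's voice by
# conditioning on a short reference recording of that speaker.
tts.tts_to_file(
    text="I need your immediate assistance to finalize an urgent business deal.",
    speaker_wav="target_voice.wav",  # clean recording of the target's voice
    language="en",
    file_path="cloned_message.wav",
)
```

Even a short, clean reference clip can produce a recognizable if imperfect clone, which is consistent with the slightly robotic but passable quality described above; more and better source audio generally yields more convincing results.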

The best-known, and first reported, example of a deepfake audio scam took place in 2019, when the CEO of a UK energy company was tricked into sending €220,000 ($240,000) to a Hungarian supplier after receiving a phone call supposedly from the CEO of his firm’s German parent company. The executive was told the transfer was urgent and the funds had to be sent within the hour. He complied. The attackers were never caught.

Earlier this year, the FTC warned about the rise of these scams, but experts say there is an easy way to beat them. As Patrick Traynor of the Herbert Wertheim College of Engineering told The Verge in January, you just need to hang up the phone and call the person back. In many scams, including the one reported by NISOS, attackers use a burner VoIP account to contact their targets.

“Hang up and call them back,” Traynor said. “Unless it’s a state actor who can reroute phone calls or a very, very sophisticated hacking group, chances are that’s the best way to figure out if you were talking to who you thought you were.”
