Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Cybercriminals Deepfake Company Director’s Voice in $35 Million Bank Heist

October 19, 2021

On October 14, Forbes reported that, in early 2020, a bank manager in the United Arab Emirates received a call from a man whose voice he recognized: a director at a company he’d spoken with before. The director had good news: his company was about to make an acquisition, so he needed the bank to authorize transfers totaling $35 million. A lawyer named Martin Zelner had been hired to coordinate the process, and the bank manager could see emails in his inbox from the director and Zelner confirming what money needed to move where. Believing everything was legitimate, the bank manager began making the transfers.

The poor bank manager didn’t know that he had been fooled by “deep voice” technology (often called “deepfake” technology), which had been used to clone the director’s speech. This was according to a court document discovered by Forbes in which the U.A.E. sought American investigators’ help in tracing $400,000 of the stolen funds that went into U.S.-based accounts held by Centennial Bank. The U.A.E. believes the scheme was an elaborate one involving at least 17 individuals, who sent the stolen money to bank accounts across the globe.

Little additional detail was given in the document, and none of the victims’ names were provided. The Dubai Public Prosecution Office had not responded to requests for comment at the time the article was published. Martin Zelner, a U.S.-based lawyer, had also been contacted for comment but had not responded by publication time.

This is only the second known case of cybercriminals allegedly using voice-mimicking tools to carry out a theft, but this one appears to have been far more successful than the first, in which fraudsters used the tech to impersonate a CEO of a U.K.-based energy firm in an attempt to steal $240,000 in 2019.

“Audio and visual deep fakes represent the fascinating development of 21st century technology yet they are also potentially incredibly dangerous posing a huge threat to data, money and businesses,” says Jake Moore, a former police officer with the Dorset Police Department in the U.K. and now a cybersecurity expert at security company ESET. “We are currently on the cusp of malicious actors shifting expertise and resources into using the latest technology to manipulate people who are innocently unaware of the realms of deep fake technology and even their existence.

“Manipulating audio, which is easier to orchestrate than making deep fake videos, is only going to increase in volume and without the education and awareness of this new type of attack vector, along with better authentication methods, more businesses are likely to fall victim to very convincing conversations.”

Voice cloning is now widely available. Tech startups are working on increasingly sophisticated AI voice technologies, from London’s Aflorithmic to Ukraine’s Respeecher and Canada’s Resemble.AI. The technology caused a ruckus in recent months with the revelation that the late Anthony Bourdain had his voice synthesized for a documentary on his life. Meanwhile, recognizing the potential for malicious use of the AI, a handful of companies, including $900 million-valued security firm Pindrop, now say they can detect synthesized voices and thereby prevent frauds.

If recordings of your voice are available online, whether on social media, YouTube or on an employer’s website, you may well be a good candidate for an audio deepfake, particularly if you have the authority to wire funds for your business!

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225, Fairfax, VA 22030
Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology