Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Family Lawyers: A Real-Life Case of a Deepfake Audio

February 4, 2020

I predicted (correctly) that deepfake audios would soon surface in family law. Last week, I read in a Legal Cheek post that a deepfake audio has indeed been used.

Family law barrister Byron James says that voice forging software was used to create a fake recording of his client threatening another party to a dispute — and advises fellow practitioners to be on the alert for deepfake audios.

James, a partner at cross-border family specialists Expatriate Law, says that "it is now possible, with sufficient content, to create an audio or video file of anyone saying anything".

Audio deepfakes have been very common in the political arena – and the blog post includes a deepfake video with deepfake audio of President Obama (warning: offensive language).

Deepfakes that appear online are often quickly debunked as hoaxes. But James warns that lawyers and judges used to taking recorded evidence at face value may not be inclined to question a recording in everyday practice.

The Dubai-based barrister describes how his client was alleged to have threatened another party over the phone, but was adamant that he had never uttered the alleged threat. The matter seemed to be put beyond doubt when "an audio file was produced which included a recording using the precise words my client had been accused of". James continues:

"This is always a difficult position to be in as a lawyer, where you put corroborating contrary evidence to your client and ask them if they would like to comment. My client remained, however, adamant that it was not him despite him agreeing it sounded precisely like him, using words he might otherwise use, with his intonations and accent unmistakably him. Was my client simply lying?"

"In the end, perhaps for one of the first times in the family court, we managed to prove that that [sic] the audio file had not been a faithful recording of a conversation between the parties but rather a deepfake manufacture."

"With practice", James says, "a deepfake video can be so plausible that even an expert may not be able to readily identify it as manufactured."

Clearly, if the original audio file is available, comparing it with the disputed recording is one way to demonstrate that the authenticity of the evidence is in question. Unfortunately, the original audio file is not always available, but there are other technical ways of exposing a deepfake.
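The post doesn't describe the technique James's team actually used, so by way of illustration only, here is a minimal sketch of one common first-pass check, assuming Python with scipy installed: many voice-cloning pipelines synthesize speech at a low internal sample rate, leaving unnaturally little energy in the upper frequency band even when the file claims full bandwidth. The file name "disputed_recording.wav" and the 8 kHz cutoff are hypothetical placeholders.

    # Illustrative sketch, not the method from the case described above.
    # Measures what fraction of a WAV file's spectral energy sits above a
    # cutoff frequency; a near-zero result in a file that claims full
    # bandwidth is a red flag worth escalating to a forensic audio
    # examiner, not proof of a deepfake.
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    def high_band_energy_ratio(path, cutoff_hz=8000):
        """Fraction of spectral energy above cutoff_hz in a WAV file."""
        rate, samples = wavfile.read(path)
        if samples.ndim > 1:                 # mix stereo down to mono
            samples = samples.mean(axis=1)
        freqs, _, spec = spectrogram(samples.astype(float), fs=rate)
        power = spec.sum(axis=1)             # total power per frequency bin
        return power[freqs >= cutoff_hz].sum() / power.sum()

    # "disputed_recording.wav" is a hypothetical example file.
    ratio = high_band_energy_ratio("disputed_recording.wav")
    print(f"Energy above 8 kHz: {ratio:.1%}")

Real forensic examination goes much further – waveform splice analysis, background-noise continuity, device metadata – but even a simple spectral check like this shows why "it sounds exactly like him" is not the end of the inquiry.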

If anyone else spots the use of deepfakes in litigation, please feel free to share. It appears that John and I will be presenting on deepfakes in multiple CLEs – and my opening gambit into this CLE topic will be at ABA TECHSHOW later this month with my good friend (and technical guru) Lincoln Mead. Hope to see some RTL readers there!

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225, Fairfax, VA 22030
Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology
https://senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson