The ABA Journal featured John Simek, vice president of Sensei Enterprises, in an article titled “Courts and lawyers struggle with growing prevalence of deepfakes” by Matt Reynolds.
As a child custody battle unfolded behind the closed doors of a British courtroom, a woman said her husband was dangerous and that she had the recording to prove it.
Except, it turned out she didn’t.
The husband’s lawyer revealed that the woman, using widely available software and online tutorials, had doctored the audio to make it sound like his client, a Dubai resident, was making threats.
Byron James, an attorney with the firm Expatriate Law in Dubai, told the United Arab Emirates newspaper The National in February 2020 that his experts, by studying the recording’s metadata, revealed that the mother had manipulated it. Under U.K. law, custody proceedings are confidential, but The National reports the case took place at some point in 2019.
John Simek’s quotes:
John Simek, vice president of the digital forensics and cybersecurity firm Sensei Enterprises in Fairfax, Virginia, compares the threat of deepfakes to the “CSI effect,” in which jurors did more than simply lean on forensic science they had seen depicted on TV: they often expected the DNA evidence, crime scene reenactments and technology from the shows in actual court proceedings, and if they didn’t get them, they’d acquit. He says the burden could fall on lawyers to prove that video or audio evidence is real, which could drive up the costs of litigation if attorneys have to hire experts or use the latest AI to detect fakes.
“I could surely see jurors sitting there going, ‘The other side says that it’s fake; how come you didn’t bring an expert here to say that it’s authentic?’” Simek says.
Henry Ajder, co-author of The State of Deepfakes report from the Amsterdam-based Deeptrace Labs, says there’s another problem: there aren’t enough digital forensic experts to go around. Ajder says that underscores the need for machine learning tools, like those Deeptrace has developed to detect deepfakes.
“As the technology for generating synthetic media and deep fakes increases and becomes more accessible, the number of human experts who could rule with authority on whether a piece of media is real or not, has not,” Ajder says.
A deepfake future
The British child custody case could be a sign of what’s to come. Simek expects more cheapfake and deepfake cases in the family courts.
“It’s fairly easy to fake audio, and especially if you’re in a spousal arrangement, you probably have a lot of sample to start with, and you’re full of emotion,” Simek says. “I think that’s where it’s going to start.”