Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Women Overwhelmingly Targeted in Deepfake Videos (Which Have Doubled!)

October 23, 2019

Naked Security reported on October 9 that 96% of the deepfakes created in the first half of the year were pornographic, most of them nonconsensual and most featuring celebrities, produced without their permission or compensation.

The statistic comes from a report, titled The State of Deepfakes, which was issued last month by Deeptrace, an Amsterdam-based company that uses deep learning and computer vision for detecting and monitoring deepfakes. The company says its mission is "to protect individuals and organizations from the damaging impacts of AI-generated synthetic media."

According to Deeptrace, the number of deepfake videos almost doubled over the seven months leading up to July 2019, to 14,678. The growth is supported by the increased commodification of tools and services that enable non-experts to churn out deepfakes.

One recent example was DeepNude, an app that used a family of dueling computer programs known as generative adversarial networks (GANs): machine learning systems that pit neural networks against each other in order to generate convincing photos of people who don't exist. DeepNude not only advanced the technology, it also packaged it in an app that anybody could use to strip off (mostly women's) clothes and generate a deepfake nudie within 30 seconds.
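For readers curious what "pitting neural networks against each other" actually means, here is a minimal sketch of the GAN training loop in Python. This is a toy illustration on one-dimensional data with tiny linear models and made-up hyperparameters, not how DeepNude or any real deepfake tool is built: a generator learns to produce samples that fool a discriminator, while the discriminator learns to tell real samples from generated ones.

```python
# Toy GAN sketch: generator vs. discriminator on 1-D data.
# All models, rates, and step counts here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def real_samples(n):
    # "Real" data: draws from a Gaussian centered at 4.0
    return rng.normal(loc=4.0, scale=0.5, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator maps noise z to a sample; discriminator scores "realness".
g_w, g_b = 1.0, 0.0   # generator parameters
d_w, d_b = 0.1, 0.0   # discriminator parameters
lr = 0.05

for step in range(2000):
    z = rng.normal(size=(32, 1))
    fake = g_w * z + g_b
    real = real_samples(32)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        grad = p - label                      # dLoss/dlogit for cross-entropy
        d_w -= lr * float(np.mean(grad * x))
        d_b -= lr * float(np.mean(grad))

    # Generator step: adjust G so that D(fake) moves toward 1.
    p = sigmoid(d_w * fake + d_b)
    grad = (p - 1.0) * d_w                    # chain rule through D's logit
    g_w -= lr * float(np.mean(grad * z))
    g_b -= lr * float(np.mean(grad))

# After training, the generator's output mean (g_b) has drifted toward
# the real data's mean, i.e. its fakes have become harder to tell apart.
print(g_b)
```

Real deepfake systems apply the same adversarial idea to images with deep convolutional networks, which is why outputs improve so quickly: every gain by the detector side is immediately training signal for the forger side.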

Since February 2018, when the first deepfake porn site was registered, the top four deepfake porn sites have received more than 134 million views on videos targeting hundreds of female celebrities worldwide, the firm said. That illustrates what will surprise no one – that deepfake porn has a large audience.

As Deeptrace tells it, the term 'deepfake' was coined by the Reddit user u/deepfakes, who created a Reddit forum of the same name on 2 November 2017. This forum was dedicated to the creation and use of deep learning software for synthetically face-swapping female celebrities into pornographic videos.

Reddit banned r/deepfakes in February 2018, as did Pornhub and Twitter – but the faceswap source code, having been donated to the open-source community and uploaded to GitHub, seeded multiple project forks, with programmers continually improving the quality, efficiency, and usability of new deepfake tools.

Deeptrace says there are now also service portals for generating and selling custom deepfakes. In most cases, customers have to upload photos or videos of their chosen subjects for deepfake generation. One service portal Deeptrace identified required 250 photos of the target subject and two days of processing to generate the deepfake. The prices of the services vary, depending on the quality and duration of the deepfakes.

Deepfakes are posing a range of threats, Deeptrace concludes. Just the awareness of deepfakes alone is destabilizing political processes, given that the credibility of videos featuring politicians and public figures is slipping – even in the absence of any forensic evidence that they've been manipulated.

The tools have been commodified, which means that we'll likely see increased use of deepfakes by scammers looking to boost the credibility of their social engineering fraud, and the use of fake personas to conduct espionage on platforms such as LinkedIn.

While I fear the destabilization of political systems, the victimization of women here is godawful. I hope those companies which have taken on the challenge of identifying deepfakes are successful – and that laws which penalize this kind of conduct come swiftly and with teeth!

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225 | Fairfax, VA 22030
Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology
https://senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson