Ride the Lightning
Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.
Deepfake Videos Used by Criminals in Interviews for Remote Work
July 6, 2022
Dark Reading reported on July 1 that the FBI's Internet Crime Complaint Center (IC3) has published a new advisory warning of increased activity by criminals gaming the online interview process for remote work positions. Criminals are using deepfake videos and stolen personal data to misrepresent themselves and procure employment in a range of work-from-home positions, including information technology, computer programming, database maintenance, and software-related job functions.
Federal law-enforcement officials said in the advisory that they’ve received a rash of complaints from businesses.
“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the advisory said. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
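The advisory doesn't say how that mismatch might be measured, but the basic idea can be illustrated in a few lines of code: track how far the speaker's mouth opens in each video frame, track how loud the audio is at the same moments, and check whether the two move together. The Python sketch below is purely illustrative and is my own construction, not anything from the advisory; it assumes the open-source MediaPipe and librosa libraries and made-up file names. In a genuine recording the two signals tend to correlate strongly, while in many deepfakes they drift apart.

import cv2
import librosa
import mediapipe as mp
import numpy as np

def mouth_openness(video_path: str) -> np.ndarray:
    """Per-frame gap between the inner lips (MediaPipe face-mesh landmarks 13/14)."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False)
    cap = cv2.VideoCapture(video_path)
    openness = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            openness.append(abs(lm[13].y - lm[14].y))  # upper vs. lower inner lip
        else:
            openness.append(0.0)  # no face found in this frame
    cap.release()
    return np.array(openness)

def audio_loudness(audio_path: str, n_frames: int) -> np.ndarray:
    """Audio RMS energy, resampled to one value per video frame."""
    y, sr = librosa.load(audio_path, sr=None)
    rms = librosa.feature.rms(y=y)[0]
    # Stretch or shrink the loudness series to line up with the video frame count.
    return np.interp(np.linspace(0, len(rms) - 1, n_frames),
                     np.arange(len(rms)), rms)

mouth = mouth_openness("interview.mp4")            # hypothetical file names
voice = audio_loudness("interview.wav", len(mouth))
score = np.corrcoef(mouth, voice)[0, 1]
print(f"Audio/visual sync correlation: {score:.2f}")  # low values suggest a mismatch

A real detection product would use far more sophisticated models than this, but the correlation score captures the intuition behind the FBI's tip: when coughs, sneezes, and lip movements don't line up with what you hear, be suspicious.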
The complaints noted that criminals were using stolen personally identifiable information (PII) in conjunction with these fake videos to better impersonate applicants, with later background checks finding discrepancies between the individual who interviewed and the identity presented in the application.
While the advisory didn’t specify the motives for these attacks, it did note that the positions applied for were ones with some level of corporate access to sensitive data or systems.
Security experts believe one goal of deepfaking one's way through a remote interview is to place a criminal inside an organization, where they can pursue anything from corporate espionage to common theft.
“Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information,” the advisory said.
“A fraudster that hooks a remote job takes several giant steps toward stealing the organization’s data crown jewels or locking them up for ransomware,” says Gil Dabah, co-founder and CEO of Piiano. “Now they are an insider threat and much harder to detect.”
Short-term impersonation might also be a way for applicants with a "tainted personal profile" to get past security checks, says DJ Sampath, co-founder and CEO of Armorblox.
Previously, the most public examples of criminal use of deepfakes in corporate settings have been as a tool to support business email compromise (BEC) attacks. In 2019, an attacker used deepfake software to impersonate the voice of a German company's CEO and convince another executive at the company to urgently wire $243,000 in support of a made-up business emergency. Last fall, a criminal used deepfake audio and forged email to convince an employee of a United Arab Emirates company to transfer $35 million to an account owned by criminals, tricking the victim into believing the transfer supported a company acquisition.
This is certainly a disturbing new development. Be careful out there – and suspicious!
Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225, Fairfax, VA 22030
Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology
https://senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson