Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

AI Systems Generate 'Skeleton' Keys That Fake Out Fingerprint Scanners

November 20, 2018

As Naked Security reported on November 16th, researchers have developed a method for AI systems to create their own fingerprints. That's scary enough on its own, but the machines have also devised a way to create prints that fool fingerprint readers more than 20% of the time. The research could spell trouble for fingerprint-based biometric systems that rely on unique patterns to grant user access.

The research team, working at New York University Tandon and Michigan State University, exploited the fact that fingerprint readers don't scan a whole finger at once. Instead, they scan parts of fingerprints and match those against what's in the database. Previous research found that some of these partial prints contain features common to many other partial prints, giving them the potential to act as a kind of skeleton key for fingerprint readers. These prints are called MasterPrints.
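For the technically curious, here's a rough sketch of why partial-print matching is so forgiving. Everything below is made up for illustration (random bit strings standing in for minutiae data, and a hypothetical `reader_accepts` check); real scanners do something far more sophisticated, but the weakness is the same: a small pattern only has to turn up somewhere in some enrolled print.

```python
import random

random.seed(1)

# Hypothetical enrolled prints: random bit strings standing in for
# full fingerprint templates (real systems store minutiae features).
enrolled = ["".join(random.choice("01") for _ in range(64)) for _ in range(5)]

def reader_accepts(partial):
    # A partial print only needs to appear somewhere in SOME enrolled
    # print for access to be granted.
    return any(partial in full for full in enrolled)

# A genuine slice of an enrolled print always matches.
print(reader_accepts(enrolled[0][10:18]))  # True

# A short, generic pattern has a good chance of matching too -- this is
# what makes partial prints vulnerable to "skeleton key" inputs.
print(reader_accepts("0101"))
```

The shorter and more generic the partial pattern, the more enrolled prints it is likely to fall inside, which is exactly the property MasterPrints exploit.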

The researchers set out to train a neural network to create its own MasterPrints that could be used to fool fingerprint readers into granting access. They succeeded with a system that they call Latent Variable Evolution (LVE).

They used a common AI tool for creating realistic data, called a Generative Adversarial Network (GAN). They trained this network to recognize realistic images by feeding it lots of them, and did the same with artificially generated images so that it learned the difference between the two. The statistical model that the neural network produces as it learns is then fed to a generator, which uses the model to produce realistic images and repeats the process so that it gets better at it.

The researchers took these generated images and tested them against fingerprint-matching algorithms to see which got the best results, then used another algorithm to evolve the fingerprints to make those results even better.
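That evolution step can be sketched in a few lines too. This toy version is only loosely inspired by the paper's Latent Variable Evolution: the "matcher" is an invented `match_count` over random vectors rather than a real fingerprint algorithm, and the real LVE evolves latent inputs that are fed through the trained GAN. But the loop is the same shape: mutate a candidate, score it against the matcher, and keep it only if it fools more enrolled templates.

```python
import random

random.seed(42)

# Hypothetical enrolled templates: short random vectors standing in
# for partial fingerprint features.
ENROLLED = [[random.uniform(0, 1) for _ in range(8)] for _ in range(100)]
THRESHOLD = 0.15  # mean absolute difference below this counts as a "match"

def match_count(candidate):
    # Stand-in for a matcher's score: how many enrolled templates
    # this one candidate would unlock.
    score = 0
    for template in ENROLLED:
        diff = sum(abs(a - b) for a, b in zip(candidate, template)) / len(template)
        if diff < THRESHOLD:
            score += 1
    return score

# Simple hill-climbing evolution: mutate the candidate and keep the
# mutant only if it matches at least as many templates as before.
latent = [random.uniform(0, 1) for _ in range(8)]
best = match_count(latent)
history = [best]
for _ in range(3000):
    mutant = [min(1.0, max(0.0, v + random.gauss(0, 0.05))) for v in latent]
    score = match_count(mutant)
    if score >= best:
        latent, best = mutant, score
    history.append(best)

print(best)  # how many enrolled templates the evolved candidate matches
```

Because a mutant is only kept when its score doesn't drop, the match count can only climb, which is why this kind of search is so effective at turning a decent fake into a skeleton key.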

I'm not sure I fully understand the technology, but in essence, the AI system is using mathematical algorithms to devise human fingerprints that can outsmart biometric scanners. Read the full article to get more details. Who knew that fingerprint readers are set to different security levels by adjusting their false match rate? Not me.
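That false-match-rate knob is easier to see with numbers. Here's a made-up demonstration (random bit strings again, and a simple similarity score I invented for illustration) of the tradeoff the article describes: raising the match threshold, i.e. choosing a stricter security level, drives down the fraction of impostor attempts that get through.

```python
import random

random.seed(2)

def score(a, b):
    # Toy similarity in [0, 1]: fraction of positions that agree.
    return sum(x == y for x, y in zip(a, b)) / len(a)

# One hypothetical enrolled print, and many random impostor attempts.
enrolled = [random.choice("01") for _ in range(64)]
impostors = [[random.choice("01") for _ in range(64)] for _ in range(2000)]

# Sweep the match threshold and measure the false match rate at each
# setting: the share of impostors scoring at or above the threshold.
rates = []
for threshold in (0.55, 0.60, 0.65, 0.70):
    false_matches = sum(score(enrolled, imp) >= threshold for imp in impostors)
    rates.append(false_matches / len(impostors))
    print(threshold, rates[-1])
```

A stricter threshold shuts out more impostors, but in practice it also rejects more legitimate users, which is why vendors expose it as a tunable security level rather than pinning it at the maximum.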

Why is this such a danger? Well, if someone can spoof your fingerprints, they don't need to steal them in the first place (and even if your prints are stolen, you can't exactly change them). How far away is this from being an exploit used in the wild? Probably not far.

Hat tip to Dave Ries.

E-mail:    Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
https://www.senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson