Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

London’s Facial Recognition System Has a 98% False Positive Rate

June 14, 2018

That's quite a statistic. I am a bit late with this post, but it was so astonishing that I wanted to make sure it got a little coverage.

Last month's story from The Register had a classic British headline: "Zero arrests, 2 correct matches, no criminals: London cops' facial recog tech slammed." Indeed, that pretty much summed it up.

As of May 15th, London cops' facial recognition kit had correctly identified only two people – neither of whom was a criminal – and the UK capital's police force had made no arrests using it.

According to information released under Freedom of Information laws, the Metropolitan Police's automated facial recognition (AFR) technology has a 98 percent false positive rate. That figure is the highest of those given by UK police forces surveyed by the campaign group Big Brother Watch as part of a report that urges the police to stop using the tech immediately.

Forces use facial recognition in two ways: one is after the fact, cross-checking images against mugshots held in national databases; the other involves real-time scanning of people's faces in a crowd to compare against a "watch list" drawn up for each event. Big Brother Watch's report focused on the latter, which it said breaches human rights laws because it surveils people without their knowledge and could dissuade them from attending public events.

The police say their system works (huh?) despite the report showing an average false positive rate – where the system "identifies" someone not on the list – of 91 percent across the country. That doesn't mean nine out of ten people seen on camera are wrongly flagged – it means that 91 percent of the people flagged turned out not to be on the watch list.
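To make that arithmetic concrete, here is a quick back-of-envelope sketch in Python using the Met's own figures from above: two correct matches at roughly a 98 percent false positive rate, which implies about 104 alerts in total. The crowd size at the end is a made-up number purely for illustration, and the variable names are mine:

# Back-of-envelope check of the statistic quoted above.
# 2 correct matches at ~98% false positives implies ~104 total alerts
# (2 / 104 is about 1.9% correct).

total_alerts = 104       # total flags raised by the system (implied by the article's figures)
correct_matches = 2      # flags that really were on the watch list (per the report)

false_alerts = total_alerts - correct_matches          # 102
false_positive_rate = false_alerts / total_alerts      # share of flags that were wrong

print(f"False alerts: {false_alerts}")                      # 102
print(f"False positive rate: {false_positive_rate:.1%}")    # 98.1%

# The distinction drawn above: this 98.1% is the share of *flagged*
# people who were not on the list, not the share of everyone scanned.
# With a hypothetical crowd of 10,000 faces scanned, 102 false alerts
# touch only about 1% of the crowd, yet 98% of the alerts are wrong.
crowd_size = 10_000      # hypothetical crowd size, for illustration only
print(f"Share of crowd wrongly flagged: {false_alerts / crowd_size:.1%}")  # 1.0%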

The Met Police claimed that this figure is misleading because there is human intervention after the system flags the match. "We do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts," a spokesperson told The Register.

The system hasn't had much success in positive identifications either: The report showed there have been just two accurate matches, and neither person was a criminal.

Big Brother Watch has also reiterated its concerns about the mass storage of custody images of innocent people on the Police National Database, which has more than 12.5 million photos on it that can be scanned biometrically. Despite a 2012 High Court ruling that keeping images of presumed innocent people on file was unlawful, the government has said it isn't possible to automate removal. This means that they remain on the system unless a person asks for them to be removed.

This certainly appears to be unlawful – and yet authorities seem to say it is simply too expensive to weed out innocent people. Given what they have paid for the facial recognition system, this argument doesn't hold water. I go back to the classic British headline above – it really sums up this ludicrous situation. Sure makes me wonder what the American stats are.

E-mail:
Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
https://www.senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson