Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Faulty Facial Recognition Algorithm Leads to Arrest of Innocent Man

June 30, 2020

The New York Times reported on June 25 that, in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought it was a prank.

An hour later, he pulled into his driveway in a subdivision in Farmington Hills, Mich. A police car pulled up behind, blocking him in. Two officers handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters, who were distressed. The police wouldn't say why he was being arrested, only showing him a piece of paper with his photo and the words "felony warrant" and "larceny."

His wife Melissa asked where he was being taken. "Google it," she remembers an officer replying.

The police drove Mr. Williams to a detention center. He had his mug shot, fingerprints and DNA taken, and was held overnight. The next day, two detectives took him to an interrogation room and placed three pieces of paper on the table, face down.

"When's the last time you went to a Shinola store?" one of the detectives asked, in Mr. Williams's recollection. Shinola is an upscale boutique that sells watches, bicycles and leather goods in the trendy Midtown neighborhood of Detroit. Mr. Williams said he and his wife had checked it out when the store opened in 2014.

The detective turned over the first piece of paper. It was a still image from a surveillance video, showing a heavyset man, dressed in black and wearing a red St. Louis Cardinals cap, standing in front of a watch display. Five timepieces, worth $3,800, were shoplifted.

"Is this you?" the detective asked.

The second piece of paper was a close-up. The photo was blurry, but it was clearly not Mr. Williams.

"No, this is not me," Mr. Williams said. "You think all black men look alike?"

Mr. Williams knew that he had not committed the crime in question. What he could not have known is that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm, according to experts on technology and the law.

His case comes amid a nationwide debate about racism in law enforcement. Millions are protesting not just the actions of individual officers, but bias in the systems used to surveil communities and identify people for prosecution.

Facial recognition systems have been used by police forces for more than two decades. Recent studies by M.I.T. and the National Institute of Standards and Technology, or NIST, have found that while the technology works relatively well on white men, the results are not as accurate for other demographics, in part because of a lack of diversity in the images used to develop the databases.

In 2019, during a public hearing about the use of facial recognition in Detroit, an assistant police chief raised concerns. "On the question of false positives — that is absolutely factual, and it's well-documented," James White said. "So that concerns me as an African-American male."

This month, Amazon, Microsoft and IBM announced they would stop or pause their facial recognition offerings for law enforcement. The gestures were largely symbolic, because those companies are not major players in the police market. The technology police departments use is supplied by companies that aren't household names, such as Vigilant Solutions, Cognitec, NEC, Rank One Computing and Clearview AI.

Clare Garvie, a lawyer at Georgetown University's Center on Privacy and Technology, has written about problems with the government's use of facial recognition. She argues that low-quality search images — such as a still image from a grainy surveillance video — should be banned, and that the systems currently in use should be tested rigorously for accuracy and bias.

The Shinola shoplifting occurred in October 2018. Katherine Johnston, an investigator at Mackinac Partners, a loss prevention firm, reviewed the store's surveillance video and sent a copy to the Detroit police, according to their report.

Five months later, in March 2019, Jennifer Coulson, a digital image examiner for the Michigan State Police, uploaded a "probe image" — a still from the video, showing the man in the Cardinals cap — to the state's facial recognition database. The system would have mapped the man's face and searched for similar ones in a collection of 49 million photos.
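The Times does not describe the vendor's internals, and systems like this are proprietary, but the general technique is a nearest-neighbor search over face embeddings: the probe face is converted to a numeric vector and compared against vectors for every enrolled photo. The Python sketch below is purely illustrative; the embed() placeholder, the cosine-similarity metric and the toy gallery are assumptions, not DataWorks' actual design.

```python
# Illustrative sketch only: real vendor systems are proprietary, and the
# model, metric and gallery structure here are assumptions.
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a face-embedding model that maps a face to a vector.
    Real systems use a trained neural network; this placeholder just
    flattens and normalizes the pixels so the sketch is runnable."""
    v = face_image.astype(np.float64).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def search_gallery(probe: np.ndarray, gallery: np.ndarray, ids: list[str], k: int = 5):
    """Return the k gallery identities most similar to the probe embedding.
    gallery: (N, D) array of unit-length embeddings, one row per photo."""
    scores = gallery @ probe            # cosine similarity (rows are normalized)
    top = np.argsort(scores)[::-1][:k]  # indices of the k highest scores
    return [(ids[i], float(scores[i])) for i in top]

# Toy usage with random "photos" standing in for a 49-million-image database.
rng = np.random.default_rng(0)
ids = [f"license_{i}" for i in range(1000)]
gallery = np.stack([embed(rng.random((32, 32))) for _ in range(1000)])
probe = embed(rng.random((32, 32)))     # the grainy surveillance still
print(search_gallery(probe, gallery, ids))
```

The key point for Mr. Williams's case is that a search like this always returns its closest candidates, each with a similarity score; it never returns "no match," which is why the results are supposed to be treated as leads rather than identifications.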

The state's technology is supplied for $5.5 million by a company called DataWorks Plus. Founded in South Carolina in 2000, the company first offered mug shot management software, said Todd Pastorini, a general manager. In 2005, the firm began to expand the product, adding face recognition tools developed by outside vendors.

When one of these subcontractors develops an algorithm for recognizing faces, DataWorks attempts to judge its effectiveness by running searches using low-quality images of individuals it knows are present in a system. "We've tested a lot of garbage out there," Mr. Pastorini said. These checks, he added, are not "scientific" — DataWorks does not formally measure the systems' accuracy or bias. "We've become a pseudo-expert in the technology," he said.
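For illustration, the kind of informal spot-check Mr. Pastorini describes might look like the hypothetical sketch below, which reuses embed() and search_gallery() from the previous example. The blur-style degradation and the top-k hit rate are my assumptions, not DataWorks' procedure.

```python
# A minimal sketch of the informal spot-check described above, reusing the
# embed() and search_gallery() helpers from the previous example. The
# degradation method and metric are assumptions, not DataWorks' process.
import numpy as np

def degrade(face_image: np.ndarray, factor: int = 4) -> np.ndarray:
    """Crudely simulate a low-quality probe by downsampling then upsampling."""
    small = face_image[::factor, ::factor]
    return np.kron(small, np.ones((factor, factor)))

def top_k_hit_rate(photos: dict[str, np.ndarray], gallery, ids, k: int = 5) -> float:
    """Fraction of known identities whose degraded photo still retrieves
    the correct identity within the top k candidates."""
    hits = 0
    for true_id, photo in photos.items():
        results = search_gallery(embed(degrade(photo)), gallery, ids, k)
        hits += any(candidate == true_id for candidate, _ in results)
    return hits / len(photos)
```

Notably, a check like this only measures whether known faces are retrieved at all. It says nothing about how often the system falsely matches different people, or whether those false matches fall disproportionately on particular demographic groups.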

In Michigan, the DataWorks software used by the state police incorporates components developed by the Japanese tech giant NEC and by Rank One Computing, based in Colorado, according to Mr. Pastorini and a state police spokeswoman. In 2019, algorithms from both companies were included in a federal study of over 100 facial recognition systems that found they were biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces.
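That kind of bias finding is typically expressed as a false match rate compared across demographic groups: the share of comparisons between different people that the system nonetheless scores as a match. The toy calculation below, with invented scores and an invented threshold, shows how that per-group comparison works.

```python
# Toy illustration of a per-group false match rate (FMR), the kind of
# metric behind the federal finding above. All scores here are invented.
def false_match_rate(impostor_scores: list[float], threshold: float) -> float:
    """FMR: fraction of different-person comparisons that score at or above
    the match threshold, i.e. would be wrongly reported as the same person."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# A biased system shows a much higher FMR for some groups than others.
groups = {
    "group_A": [0.31, 0.42, 0.55, 0.28, 0.61],
    "group_B": [0.72, 0.66, 0.49, 0.81, 0.58],
}
for name, scores in groups.items():
    print(name, false_match_rate(scores, threshold=0.6))
```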

After Ms. Coulson, of the state police, ran her search of the probe image, the system would have provided a row of results generated by NEC and a row from Rank One, along with confidence scores. Mr. Williams's driver's license photo was among the matches. Ms. Coulson sent it to the Detroit police as an "Investigative Lead Report."

"This document is not a positive identification," the file says in bold capital letters at the top. "It is an investigative lead only and is not probable cause for arrest."

Technology providers and law enforcement always emphasize this when defending facial recognition: It is only supposed to be a clue in the case, not proof. Before arresting Mr. Williams, investigators might have sought other evidence that he committed the theft, such as eyewitness testimony, location data from his phone or proof that he owned the clothing that the suspect was wearing.

In this case, however, according to the Detroit police report, investigators simply included Mr. Williams's picture in a "6-pack photo lineup" they created and showed to Ms. Johnston, Shinola's loss-prevention contractor, and she identified him.

Mr. Pastorini was taken aback when the process was described to him. "It sounds thin all the way around," he said.

In Mr. Williams's recollection, after he held the surveillance video still next to his face, the two detectives leaned back in their chairs and looked at one another. One detective, seeming chagrined, said to his partner: "I guess the computer got it wrong."

They turned over a third piece of paper, which was another photo of the man from the Shinola store next to Mr. Williams's driver's license. Mr. Williams again said that they were not the same person.

He asked if he was free to go. "Unfortunately not," one detective said.

Mr. Williams was kept in custody until that evening, 30 hours after being arrested, and released on a $1,000 personal bond.

The Williams family contacted defense attorneys, most of whom, they said, assumed Mr. Williams was guilty and quoted prices of around $7,000 to represent him. Ms. Williams, a real estate marketing director and food blogger, also tweeted at the American Civil Liberties Union of Michigan, which took an immediate interest.

"We've been active in trying to sound the alarm bells around facial recognition, both as a threat to privacy when it works and a racist threat to everyone when it doesn't," said Phil Mayor, an attorney at the organization. "We know these stories are out there, but they're hard to hear about because people don't usually realize they've been the victim of a bad facial recognition search."

Two weeks after his arrest, Mr. Williams took a vacation day to appear in a Wayne County court for an arraignment. When the case was called, the prosecutor moved to dismiss, but "without prejudice," meaning Mr. Williams could be charged again.

On Wednesday, the A.C.L.U. of Michigan filed a complaint with the city, asking for an absolute dismissal of the case, an apology and the removal of Mr. Williams's information from Detroit's criminal databases.

The Detroit Police Department "should stop using facial recognition technology as an investigatory tool," Mr. Mayor wrote in the complaint, adding, "as the facts of Mr. Williams's case prove both that the technology is flawed and that DPD investigators are not competent in making use of such technology."

Until we can get facial recognition right, we have no business using it to accuse people of crimes without additional and convincing evidence. It is no accident that these algorithms are biased. They reflect a biased history and a biased society.

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225, Fairfax, VA 22030
Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology
https://senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson