Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson, Esq., President of Sensei Enterprises, Inc.

Can Algorithms Send You to Prison? Apparently, Yes.

November 1, 2017

The New York Times reported in an opinion piece last week on a fascinating and disturbing story. In 2013, police officers in Wisconsin arrested Eric Loomis, who was driving a car that had been used in a recent shooting. He pleaded guilty to attempting to flee an officer, and no contest to operating a vehicle without the owner's consent. Neither of his crimes mandated prison time.

But at Mr. Loomis's sentencing, the judge cited, among other factors, Mr. Loomis's high risk of recidivism as predicted by a computer program called COMPAS, a risk assessment algorithm used by the state of Wisconsin. The judge denied probation and imposed an 11-year sentence – six years in prison, plus five years of extended supervision.

No one knows exactly how COMPAS works; its manufacturer won't disclose the proprietary algorithm. We only know the final risk assessment score, which judges may consider at sentencing.

Loomis challenged the use of the algorithm as a violation of his due process right to be sentenced individually, without consideration of impermissible factors like gender or race. The Wisconsin Supreme Court rejected his challenge. In June, the United States Supreme Court declined to hear his case, leaving the Wisconsin ruling, and the algorithm's use, undisturbed.

This may have far-reaching effects. Why are we allowing a computer program, into which no one in the criminal justice system has any insight, to play a role in sending a man to prison? The author of the op-ed piece asked that question – and so do I. Wisconsin is one of several states using algorithms in the sentencing process.

At sentencing, it is the judge's prerogative to impose a sentence within statutory guidelines. The obvious flaw in that system is the potential for bias, whether based on gender, religion, or race.

This seems to be why states are, at least partially, giving the responsibility for sentencing to a computer. Use of a computerized risk assessment tool somewhere in the criminal justice process is widespread across the United States, and some states, such as Colorado, require it. The states seem to believe that even if they cannot themselves understand proprietary algorithms, computers will be less biased than humans.

I agree with the author of the article that partially shifting the sentencing responsibility to a computer does not necessarily eliminate bias; it delegates and often compounds it.

COMPAS's authors presumably fed historical recidivism data into their system. From that, the program ascertained what factors make a defendant a higher risk. It then applied the patterns it gleaned to defendants like Mr. Loomis to recommend sentences.
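To make that concrete, here is a minimal sketch, in Python, of how a risk tool of this general kind could work: fit a simple statistical model to historical outcomes, then score a new defendant. Everything in it, the features, the data, and the choice of model, is hypothetical; COMPAS's actual method remains undisclosed.

```python
# Hypothetical sketch of a recidivism risk model; COMPAS's real method is proprietary.
# The features and data below are invented purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "historical" records: [age, prior_offenses, months_since_last_offense]
X_history = rng.normal(loc=[30, 2, 24], scale=[8, 2, 12], size=(500, 3))
# Synthetic labels: 1 = re-offended within two years, 0 = did not
y_history = rng.integers(0, 2, size=500)

# Fit a model to the historical data, as the paragraph above describes in concept
model = LogisticRegression().fit(X_history, y_history)

# Score a new defendant on a 0-1 scale; a real tool might bin this into low/medium/high
new_defendant = np.array([[25, 3, 6]])
risk_score = model.predict_proba(new_defendant)[0, 1]
print(f"Predicted recidivism risk: {risk_score:.2f}")
```

The point of the sketch is the dependency it makes visible: whatever bias is baked into the historical labels is inherited by the fitted model, which is exactly the concern the next paragraphs raise.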

But the historical data would necessarily reflect our biases. A ProPublica study found that COMPAS predicts black defendants will have higher risks of recidivism than they actually do, while white defendants are predicted to have lower rates than they actually do. (Northpointe Inc., the company that produces the algorithm, disputes this analysis.)
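The ProPublica finding is, at bottom, a claim about unequal error rates. A minimal sketch of that kind of audit, using invented numbers rather than ProPublica's data, compares the false positive rate (non-reoffenders flagged as high risk) across groups:

```python
# Sketch of an error-rate audit of the kind ProPublica performed; the data is invented.
import numpy as np

# Hypothetical audit records: predicted "high risk" flag, actual re-offense, group label
predicted_high_risk = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
reoffended          = np.array([0, 1, 0, 0, 0, 1, 1, 0, 0, 1])
group               = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

for g in ("A", "B"):
    mask = (group == g) & (reoffended == 0)  # people in group g who did not re-offend
    # False positive rate: share of non-reoffenders the tool flagged as high risk
    fpr = predicted_high_risk[mask].mean()
    print(f"Group {g}: false positive rate = {fpr:.2f}")
```

An audit like this is only possible when the tool's scores and outcomes can be examined, which is precisely what a proprietary, undisclosed algorithm makes difficult.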

Besides receiving input that may be flawed, algorithms lack the human ability to individualize. A computer cannot look a defendant in the eye, account for a troubled childhood or disability, and recommend a rehabilitative sentence. This is precisely the argument against mandatory minimum sentences — they deprive judges of the discretion to deliver individualized justice — and that argument is equally compelling against machine sentencing.

Is it true that defendants with a higher risk of recidivism warrant longer sentences, or is it that defendants given longer sentences, kept out of their communities, unemployed, and away from their families longer, naturally face a greater risk of recidivism? A judge could and should consider these factors.

With transparency and accountability, algorithms in the criminal justice system may do good. New Jersey used a risk assessment program known as the Public Safety Assessment to reform its bail system this year, leading to a 16 percent decrease in its pre-trial jail population. The same algorithm helped Lucas County, Ohio, double the number of pre-trial releases without bail, and cut pre-trial crime in half. But that program's functionality was detailed in a published report, allowing those with subject-matter expertise to confirm that constitutionally impermissible variables — such as race and gender — were not being considered.

The only people who understand COMPAS's inner workings are its programmers, who are certainly less able than judges to deliver justice. Judges have legal training, are bound by ethical oaths, and must account not only for their decisions but also for their reasoning in published opinions.

As the author of the article notes: "Computers may be intelligent, but they are not wise. Everything they know, we taught them, and we taught them our biases. They are not going to un-learn them without transparency and corrective action by humans."

Amen.

Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
https://www.senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson