Sentencing software (2018 List)

The black-boxing of the American legal system.

An algorithm may have the uncanny ability to predict which Netflix show you’ll enjoy this evening, but where do we draw the line with big data’s rule over our lives? Turns out we’d better decide quickly, because the technology is outpacing both our awareness and our laws.

While we don’t know just how many jurisdictions are using it, we do know that sentencing software, such as that made by Equivant, is already out there.

Your fate decided by a computer? Maybe that doesn’t sound so bad when you take into account the bias of judges and the evidence that sentences are often influenced by race, class, gender, personal experience, and whether or not the judge has had lunch yet. Is it possible that our justice system would become fairer if we handed it over to AI?

Eric Loomis didn’t think so when he was sentenced to six years in prison for attempting to flee an officer and operating a motor vehicle without the owner’s consent. It didn’t help that the car had been used earlier that day in a drive-by shooting or that Loomis was a registered sex offender. At the sentencing hearing, the court mentioned that it arrived at the sentence with the help of a “COMPAS assessment,” which helped determine Loomis’ risk of recidivism. COMPAS is a program sold by Northpointe, Inc. and marketed as a means to guide courts in their sentencing. According to Northpointe, the program is “designed to incorporate key scales from several of the most informative theoretical explanations of crime and delinquency…” A judge need only plug in all of the relevant information for the case and voilà, the algorithm spits out a “more integrated and coherent interpretation of each person’s support needs.” But the issue, according to Loomis’ lawyers, is that the algorithm is designed by a private company that will not reveal how it works, leaving defendants, their lawyers, and even the judges who rely on it in the dark. The Wisconsin Supreme Court decided that Loomis had no right to examine Northpointe’s proprietary software.

You don’t have to like Loomis to see the problem here. Evaluating a person based on a black-box algorithm is a scary proposition (you need only imagine yourself on the wrong end of it to see why), but the court maintained that this was a delicate issue that needed “time for further percolation.” It’s clear now that we’re out of time. Not only are we dealing with private companies playing a role in the judicial system (a role whose inner workings they do not have to reveal to the police or the courts), but the decision of whether or not to involve AI at all seems to have whizzed by us without notice.

It’s important to remember that algorithms don’t emerge out of nothing; they’re written by coders, people who carry biases they may not even realize they have (as do judges, to be sure). It’s no surprise, then, that a ProPublica study found that COMPAS routinely gives black defendants worse scores, falsely flagging them as future criminals at nearly twice the rate of white defendants. How can we even begin to weigh the validity of this tool if we’re not allowed to see its decision-making process?
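To make that question concrete: even without access to COMPAS’s internals, an auditor can still study the tool from the outside, comparing its mistakes across groups using only its inputs and its outcomes, which is essentially what ProPublica did. The sketch below is purely illustrative (the records are invented, not real COMPAS data); it computes, for each group, the false positive rate: the share of people who did not reoffend but were flagged as high risk anyway.

    # Outside-in audit sketch: no access to the scoring algorithm itself,
    # only to who was flagged as high risk and who actually reoffended.
    # The records below are made up for illustration, not real COMPAS data.
    from collections import defaultdict

    # Each record: (group, flagged_high_risk, reoffended)
    records = [
        ("black", True,  False),
        ("black", True,  False),
        ("black", False, False),
        ("black", True,  True),
        ("white", True,  False),
        ("white", False, False),
        ("white", False, False),
        ("white", False, True),
    ]

    false_positives = defaultdict(int)  # flagged high risk but did not reoffend
    non_reoffenders = defaultdict(int)  # everyone who did not reoffend

    for group, flagged, reoffended in records:
        if not reoffended:
            non_reoffenders[group] += 1
            if flagged:
                false_positives[group] += 1

    for group in sorted(non_reoffenders):
        rate = false_positives[group] / non_reoffenders[group]
        print(f"{group}: false positive rate = {rate:.0%}")

If those error rates diverge sharply between groups, the tool’s fairness is in doubt no matter how sophisticated the hidden formula may be; and an audit like this can only be run after the fact, on people whose sentences the score has already shaped.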

Resources: