The headline was, at best, misleading.
Sent to Prison by a Software Program’s Secret Algorithms
Eric Loomis wasn’t sent to prison by a software program. And he wasn’t sent to prison by a program’s secret algorithms. He was sentenced to prison by a judge, just like anyone else sent to prison. Never forget that. But the length of time for which he was sent to prison was influenced by a program and its secret sauce.
The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes.
The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”
There are two separate issues raised by Loomis’ case, and the Supreme Court appears interested in at least one of them. The first question is whether any human being should be sentenced based upon some empirical measure, the Sentence-O-matic 1000, no matter what the numbers show, whether it’s just a proxy for racist assumptions, or just an excuse to lift the unpleasant task of putting a person in prison off a judge’s shoulders.
But many who swear by hard, cold data see this as a given, even if they don’t really have a sense of the multitude of potential considerations that might go into a sentence. And they have a point given that the alternative is judicial voodoo.
But the Loomis case deals with a collateral issue, beyond the adoration of data as a substitute for a judge taking the responsibility of her job seriously. This program, Compas, was created and sold by a company named Northpointe, and like any good company trying to make a killing, it wasn’t about to give up its secret sauce.
The Wisconsin Supreme Court ruled against Mr. Loomis. The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.
But that begs the question. Did the report add “valuable information”? Certainly Northpointe thought so, but then, they had a product to sell. Without giving up the proprietary algorithms such that their validity could be tested and challenged, there is no basis upon which the court could determine whether the information was valuable. More importantly, there was no way for the defense to argue that it wasn’t. You can’t challenge what you can’t see.
The company that markets Compas says its formula is a trade secret.
“The key to our product is the algorithms, and they’re proprietary,” one of its executives said last year. “We’ve created them, and we don’t release them because it’s certainly a core piece of our business.”
Or, the algorithms are malarkey, like the great story of the cops using a copy machine as if it was a lie detector, tricking the perp into confessing. If you don’t know how it works, there’s no way to know that it doesn’t. And even if Compas is legit, is it good enough to stake a person’s life on? But even if it isn’t, is it any worse than whatever pops into a judge’s head?
This isn’t to say that Compas is wrong, evil or malarkey, whether total or a little bit. Nor is it even to say that it doesn’t do a better job of determining, in a generic sense, whether a person is more likely to commit more crimes. It may well do the job it purports to do, despite questions raised. But the official response to not knowing is, well, remarkably unhelpful.
In urging the United States Supreme Court not to hear the case, Wisconsin’s attorney general, Brad D. Schimel, seemed to acknowledge that the questions in the case were substantial ones. But he said the justices should not move too fast.
“The use of risk assessments by sentencing courts is a novel issue, which needs time for further percolation,” Mr. Schimel wrote.
This isn’t coffee. “Further percolation” is meaningless gibberish. If Compas is going to be used, it has to be valid. The State of Wisconsin doesn’t get to enjoy the benefits of its purchased program only to figure out, or not, at some point in the future that it’s a steaming pile of feces. Either it can pass muster or not, and until it does, no person should be subject to its data-claimed conclusions.
But what of Northpointe’s pecuniary interests? It’s no crime to sell a product, and it’s no crime to want to protect the value of the product it sells. Does the Supreme Court hate capitalism?
Of course not, but that isn’t the point. Nobody forced Northpointe to put its efforts into a program to be used by the state, by a judge, to sentence human beings. If that’s where you choose to make your stand, then you don’t get to complain that it comes with strings, such as allowing the defense the opportunity to test and challenge its merit, its efficacy.
No matter how much you love capitalism, you dove into a pool of due process and can’t complain about getting drenched. The question isn’t whether Northpointe should be allowed to keep its proprietary algorithm secret. The question is whether a defendant is entitled to due process before a tool, any tool, is used to determine what will become of his life.
So, easy-peasy case for the Supremes? Not quite. There remains a kicker in the background.
Mr. Loomis would have gotten the same sentence based solely on the usual factors.
Would he? So they say. And if the sentence falls within the usual parameters, the guideline range of reasonableness as a federal circuit might opine, does it really matter whether Compas is sound or not? It’s a ledge to which Northpointe can cling when defending its moneymaker, and it’s an out for the Supreme Court to shrug off any claim of potential harm. After all, even if the algorithm turns out to be garbage, there’s nothing to do when the error is harmless.
I wonder which WI politician Northpointe bribed…*ahem*…I mean exercised its right to political speech to support…*ahem*…in order to win the contract for Compas to be used by the judiciary.
Or was this a politico trying to show how empirical he can be in reforming the system while keeping the good people safe at night? So many options.
SHG,
For what it is worth, and despite my love of empiricism, it is a very bad idea to use business-owned secret software to sentence people. As for whether the same result would have occurred absent the secret sauce, trust me, there is absolutely no way to know for certain.
All the best.
RGK
No doubt, Judge, but it offers an easy backdoor to sidestep the hard issue.
SHG,
I need to be careful so I will ask a question. Can one say the error, assuming it was error, was harmless beyond a reasonable doubt?
All the best.
RGK
Obviously just rhetorical musing, but if it remains within the statutory max, how can it be otherwise? Certainly one couldn’t presume an improvident exercise of judicial discretion.
We appreciate the certainty with which you posted that comment, your Honor.
So Judge Dredd hands out the original sentence, and the Wisconsin Supremes, wearing their Miss Cleo hats concur, while we all “know” that the Minority Report would have nailed it.
In fairness, they look damn good in those Miss Cleo hats. All appellate judges do.
The software industry has a history of “secret, proprietary algorithms” turning out to be snake-oil. If it was actually reliable enough for public use, they would probably patent it. Patents are public. Even ignoring the whole legal and constitutional issue of using secret information that the defense isn’t privy to, if they are hiding it, it’s either still under development, and thus certainly not something that people’s lives should be entrusted to, or just smoke & mirrors to get a fat government contract.
Or not.
Algorithms are not patentable.
It is my understanding that in empirical analytics, the standard is to disclose what factors go into the system, and the “black box” is how those factors are weighted. I don’t think even that would be good enough in this case to comport with due process. I think defendant needs to know both the inputs and the weighting if it is going to be used in sentencing.
Maybe call it the Apprendi Machine
Plus the underlying data upon which the weighting was based. This is a very black box.
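[To make the “black box” point concrete: a weighted risk score is, at bottom, a trivially simple computation. Everything in this sketch is hypothetical — the factors, the weights, and the scoring scale are invented for illustration and have nothing to do with Compas’s actual secret formula. The point is that the arithmetic isn’t the secret; the chosen factors, their weights, and the data behind them are, and that’s precisely what the defense can’t see.]

```python
# Toy illustration of a "black box" risk score. The factor names and
# weights below are made up; a real product's secrecy lies in which
# factors it picks, how they're weighted, and what data set those
# weights were fit to -- none of which is visible in the output number.

def risk_score(factors, weights):
    """Weighted sum of factor values, clamped to a 1-10 decile score."""
    raw = sum(weights[name] * value for name, value in factors.items())
    return max(1, min(10, round(raw)))

# Hypothetical inputs and weights for one defendant.
weights = {"prior_arrests": 0.8, "age_under_25": 2.0, "employed": -1.5}
factors = {"prior_arrests": 3, "age_under_25": 1, "employed": 1}

print(risk_score(factors, weights))  # the court sees only this number
```

Change any one weight and the score moves, yet the defendant sees only the final decile bar on a chart — which is why disclosure of inputs alone, without the weights and underlying data, tells the defense almost nothing.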
CNN: “The dreaded ‘Very Black Box’ leads to increased sentences for minorities convicted of crimes.”
One of the harsh problems with empiricism is it tends to reinforce racial stereotypes, though it’s impossible to determine whether it’s because stereotypes are empirically correct or the inputs and weights are skewed because of racial assumptions. If there were legit studies allowed to delve into these questions, we might get an answer, but since nobody is allowed to conduct any study that could possibly produce a result no one wants to know, we can’t and won’t.
But yeah, the box turns out to be pretty black.
“The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.”
The report was *so* valuable that they would have come to the exact same conclusion even without it. Makes sense.
That’s their story and they’re sticking with it.
The ultimate in black box justice:
It’s got a catchy tune, but the beat isn’t much to dance to.
It was a great show for its time, though. The British Star Trek, if you will, except with a libertarian rather than statist premise.
I have all four seasons on DVD.
But how does it stand up to the classic? Catchy tune and great to dance to.
IIRC, it predates Red Dwarf by a great many years. Nevertheless, SWMBO would be in your camp.
That said, smoke me a kipper, I’ll be back for breakfast.
I’ve seen this movie. The Intoxilyzer 5000 uses “proprietary software” in calculating and regurgitating its breath alcohol result. Move along folks, nothing to see here.