Crunchy Numbers Don’t Lie

It was going to save us. Even Jake assured me it would when he invented the Sentence-o-Matic 1000. There was a problem, and everybody “knew” it: judges were awful and racist. They were awful and racist when they set bail. They were awful and racist when they imposed sentence. They were awful and racist in between, too, but it was easier to focus on bail and sentence because both involved numbers, and we could compare numbers because they were numbers. And numbers don’t lie.

Was there ever a task in the courtroom more ripe for automation?

As a representative of the ignorant masses, I find comfort in the notion that everyone would be given sentences using the same criteria, and never again be subjected to the whimsy of some of the judges.

As other numbers informed us, black people were disproportionately arrested, their bail was higher than white people’s and, when convicted, they were given more severe sentences. This could not be possible but for racism, it was presumed, since black people were no more inclined to crime or violence than anyone else. Certainly, there couldn’t be anything about being black that could cause this, so the only remaining possibility was racism, implicit or explicit, by those in charge of the legal system. Cops, prosecutors and judges were clearly to blame.

The solution seemed obvious. Remove the discretion and the racism must go with it. And so our hope and audacity in algorithms was born. Artificial intelligence had no feelings. It couldn’t be racist because it was just code and numbers. Having our technocratic geniuses create black boxes of totally unfeeling algos based on our mass of data would finally produce a system free of racism, bias and, as Jake put it, whimsy.

I said it wouldn’t work. That was in 2014.

While faith in the utility of technology by its fanboys is understandable, given that it's what they work with, rely upon and have chosen to dedicate their careers to, they can also appreciate that there is no arguing with binary thinking. Ultimately, it must make choices based upon inputs, then crank out an answer. So while it may well produce consistency, à la the Guidelines, its answers will suffer from the same arbitrary and capricious problems as they did under the Guidelines, and the opposite (but equally whimsical) answers judges provide under § 3553(a).

As Judge Bennett noted, “Sentencing requires us to weigh that which cannot be measured.” If it can’t be measured (and it can’t), then it can’t be input into the Sentence-o-Matic 1000. And even if it were input, to the best a bunch of programmers could manage, it couldn’t be subject to the arguments that real-life human beings require to address the individualized circumstances that exist in every case and every sentence, because that would obviate the goal of consistency and return us to the whimsy of the judge.

Barely six years later, 2,000 mathematicians have demanded an end to algos, to cooperation with police and to involvement in the creation of AI “solutions” for the criminal law system.

“The data does not speak for itself, it’s not neutral,” explains Brendan McQuade, author of Pacifying the Homeland: Intelligence Fusion and Mass Supervision. Police data is “dirty data,” because it does not represent crime, but policing and arrests.

Black people not only face higher murder rates at the hands of police but disproportionately high arrest rates as well—twice that of their white counterparts. Racial data is not allowed to be used as a factor within predictive policing models, but in the US, location and socioeconomic factors are employed as an easy stand-in for racial data.

That’s somewhat true, but largely unhelpful. Black people are also disproportionately the victims of crime at the hands of other black people. You can’t fool the system when it comes to murders, as there are dead bodies to explain, and so the data does speak for itself. And nobody likes what it says because it doesn’t say what we want it to say.

As for the rest of the data, it was never going to do any more than reinforce what was already happening. The same criteria that we believed mattered when presenting an argument to the court were going to produce the data to be input into the Sentence-o-Matic 1000. Job? Family ties? Education? Number of convictions? Number of warrants? Nature of the crime? What else did they think they were going to find to input into the black box?

It had to be racist, because it could never be any better coming out than going in. GIGO. And using proxies like home address, level of education and future employment prospects to predict warrants, recidivism and future crimes was just a more benign way of getting to the same place.
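The proxy mechanic is easy to demonstrate. Below is a minimal, hypothetical sketch using toy synthetic data (not any real system, dataset or jurisdiction): a "race-blind" risk score never sees group membership, only a correlated stand-in like a zip code, yet it faithfully reproduces the disparity baked into the arrest labels it was built on.

```python
import random

random.seed(0)

# Toy illustration with made-up numbers: group membership is never
# given to the scorer, only a zip code that correlates with it.
def make_person():
    group = random.choice(["A", "B"])
    # Proxy: 90% of group A live in zip 1, 90% of group B in zip 2.
    if group == "A":
        zip_code = 1 if random.random() < 0.9 else 2
    else:
        zip_code = 2 if random.random() < 0.9 else 1
    # The historical "arrest" label reflects heavier policing of zip 2,
    # not any difference in underlying behavior between groups.
    arrested = random.random() < (0.1 if zip_code == 1 else 0.4)
    return group, zip_code, arrested

people = [make_person() for _ in range(100_000)]

# A "race-blind" score built on the dirty labels learns: zip 2 = risky.
def risk_score(zip_code):
    return 1 if zip_code == 2 else 0

flagged = {"A": 0, "B": 0}
total = {"A": 0, "B": 0}
for group, zip_code, _ in people:
    total[group] += 1
    flagged[group] += risk_score(zip_code)

for g in ("A", "B"):
    print(g, round(flagged[g] / total[g], 2))
```

With these assumed proportions, roughly 10 percent of group A gets flagged against roughly 90 percent of group B, despite identical behavior, which is the GIGO point in two dozen lines: excluding the protected attribute changes nothing when a proxy carries the same signal.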

But that doesn’t mean the numbers are wrong. After all, they’re just numbers and don’t care about your skin color or ethnicity. And this becomes even more dangerous in the hands of mathematicians, who might be good with numbers but otherwise are no less susceptible to believing utter nonsense than anyone else.

The popular use of racist predictive policing technologies is unsurprising for Aougab, who traces the history of the police, which is rooted in slave patrols and private security forces hired by the rich to break up labor strikes. Their primary motivation is not to serve and protect, Aougab said, but “preserving a social order that is put forth by the will of elites to protect property. The institution has never drifted away from that basic function over its entire existence in this country.”

But then every effort to tweak the data, when it produced unacceptably racist outcomes, failed to produce the outcome the narrative dictated.

Newer models have tried to break free from the bias of arrest data by building their algorithms on citizen reports, suggesting that 911 calls similarly correlate to where crime is happening.

While this avoids the bias of individual police officers, the bias of callers remains. Only around 40 percent of the victims of violent crime even report their assault to the police in the first place, and many marginalized people avoid doing so out of fear for their safety.

Then there’s the Karen Problem and the White Collar Crime problem, which skew attention toward black people and away from white people engaging in wage theft, fraud and false calls to their slave patrol police to protect them from black guys. And drug use by white people is just as bad as by black people, but they don’t buy and smoke weed on the street corner so they’re not as easily busted.

The problem isn’t that the numbers lie. They don’t. They’re just numbers. The problem is that the numbers were always inadequate as a way to quantify the problems inherent in people. And now that the narrative is more important than the data, and is almost entirely built of rationalizations for the disparities, we’re back to where we started, except this time relying on a litany of excuses in lieu of dirty empirical data.

It was obvious long ago that this wasn’t going to be the solution. Yet here we are, a mere six years later, demanding that simple answers to complex problems be rejected because the very solutions that were going to eliminate racism turn out to be racist. And we have new demands for new simple answers to complex problems, now based on a fantasy narrative, to replace them. And they’re going to fail too as they’re even worse than the bad data. So much for the Sentence-o-Matic 1000.

There is a reason why black people aren’t calling to Abolish Police and want them to remain in their neighborhood. They want to get home alive. And I don’t blame them. So do I. Put that into your algo and crunch it.



13 thoughts on “Crunchy Numbers Don’t Lie”

  1. Hunting Guy

    Attributed to Paul Ehrlich.

“To Err is Human; To Really Foul Things Up Requires a Computer.”

  2. Richard Kopf

    SHG,

    There is a sense in which I agree. That is, data driven sentencing schemes sweep too broadly or not broadly enough. They can be, and sometimes are, off by an order of magnitude.

On the other hand, one can throw darts blindfolded too. For fun, put into the dart competition a bunch of folks who have radically different views of the best way to throw darts. They find nothing wrong in relying upon their own mad skills when finally allowed to throw darts underhanded, overhanded, sideways, behind the back and any way they please.

Take their blindfolds off too if you please. Fuck cataracts, they can see well enough to sentence a black, brown, yellow or (pale) white gang banger slinging a lot of dope and protected by a trusty 9mm as if he were a victim of some amorphous concept like systemic racism or class-based oppression–a year and a day and off you go.

    So, here we are. Until somebody explains to me how a better system can be devised I am content to conclude that shit happens.

    All the best.

    RGK

    1. SHG Post author

      The weird thing here is how we’ve gone from “love the data” to “love the narrative,” essentially the polar opposite, at lightning speed and without so much as a moment’s cognitive dissonance. I’ve always been of the view that it’s mostly voodoo, but at least we try to cast the best spells possible now and don’t slough it off on AI or excuses.

    2. Miles

      I’m likely missing something, but what is it that you’re in a sense agreeing with, Judge? This didn’t strike me as one of those agree/disagree kind of posts.

      1. Richard Kopf

        Miles,

        I get your point. Coherence sometimes (perhaps often) escapes me. I am pretty sure that there was something profound in what I wrote, but I have forgotten what that might have been.

        Scott put it well in his reply: “[W]e’ve gone from ‘love the data’ to ‘love the narrative,’ essentially the polar opposite, at lightning speed and without so much as a moment’s cognitive dissonance.”

        All the best.

        Rich

  3. Bob

    You say it’s obvious this solution wouldn’t work to get the result they wanted, but that point isn’t so obvious to me. Quotas are verboten in university admissions, yet they come up with admissions criteria that somehow produce the racial distributions desired by the administration. Scalia articulates a rigid system of constitutional interpretation that somehow almost always produces the result he wants.

Peer review doesn’t work if everyone has the same biases. Academia gives lip service to “diversity,” but it’s worked tirelessly over the last fifty years toward ideological uniformity. Get a political consensus among the researchers and a scientific consensus will inevitably follow. Homosexuality is a disease. Now it isn’t! But we need Medicaid and insurance companies to pay for sex change operations, so it’s a disease again!

    Why would the mathematicians be stymied? The more complicated a system is to model, the more opportunities there are for the folks modeling it to put a thumb on the scale. And sentencing can be pretty darn complicated; there are lots of lawyers who make a living doing sentencing recommendations under the federal guidelines. My guess is that mathematicians lack experience in the sort of gentle, often unconscious manipulation that bends the research to the preferences of the researchers. Give them time! It’s not like sociologists haven’t been using the mathematicians’ toolkit that way for a hundred years.

    There really is something to be said for an adversarial system and lay juries.

    1. SHG Post author

      You make an excellent point. The problem isn’t the data, but whatever is preventing them from manipulating it to their desired outcome. They just lack imagination.

    2. Lee Keller King

      People say they want justice, but they really want mercy. The problem is, you can’t get mercy out of a computer.

  4. Jake

As I recall, I’ve since ceded the point on the Sentence-o-Matic, but the announcement by these mathematicians troubles me. There’s no stopping the digitization of everything. There’s too much opportunity. All that these mathematicians, as high-minded as their goals may be, will accomplish is ceding this ground to others with less noble motivations.
