Will Algorithms Be The Death of Bail?

New Jersey, of all places, is leading the nation in the elimination of bail. Bail bondsmen are screaming that criminals will be released to rape your daughter at night (not that they have a financial horse in the race), and the Attorney General has tweaked the guidance a few times, but it turns out that low-level defendants are being cut loose. So it’s working?

This may seem like an unusually technocratic approach to public defense. But it’s not so unusual anymore, at least not in New Jersey, where the state has recently undergone a holistic technological transformation of its arcane court system, all in the service of eliminating the use of bail statewide.

Jersey took the algorithm route, adopting the Public Safety Assessment (PSA), a tool that scores a defendant’s likelihood of returning to court or committing a new crime on a scale of 1 to 6. The purpose was to end the pre-trial incarceration of indigent defendants for the inability to pay bail.
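The article never spells out the tool’s actual factors or weights, so purely as a hypothetical sketch (every factor name and point value below is made up), a point-based instrument of this kind adds up weighted risk factors and clamps the total to the 1-to-6 scale:

```python
# Hypothetical sketch of a point-based pretrial risk score. The real
# PSA's factors and weights are not given in the article; these
# numbers are invented for illustration only.

def risk_score(prior_fta, pending_charge, prior_violent, age_under_23):
    """Sum made-up factor weights and clamp the result to the 1-6 scale."""
    points = 1
    points += 2 * min(prior_fta, 2)     # prior failures to appear, capped
    points += 1 if pending_charge else 0
    points += 2 if prior_violent else 0
    points += 1 if age_under_23 else 0
    return min(points, 6)

print(risk_score(prior_fta=0, pending_charge=False,
                 prior_violent=False, age_under_23=False))  # 1
print(risk_score(prior_fta=2, pending_charge=True,
                 prior_violent=True, age_under_23=True))    # 6
```

The arithmetic is trivial; the fight described below is entirely over what goes into the factors and weights.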

In 2016, the Department of Justice, under President Obama, also issued a Dear Colleague letter to state and local courts around the country, advising them that courts “must not employ bail or bond practices that cause indigent defendants to remain incarcerated solely because they cannot afford to pay for their release.”

Since pre-trial incarceration was a huge factor in obtaining guilty pleas from innocent defendants, and often resulted in longer incarceration than a defendant would receive after conviction, the problems were obvious and huge.

As it turned out, that described a large percentage of people who have spent time in New Jersey jails, according to one 2013 study by the New Jersey Drug Policy Alliance. The advocacy group found that some 75 percent of New Jersey’s jail population at any given moment was simply awaiting trial, and 40 percent of jailed people were there because they couldn’t afford $2,500 or less in bail. On average, people spent 10 months in jail before even getting to trial.

The system wasn’t viewed as perfect on the other end either.

Meanwhile, because New Jersey prohibited even the most violent criminals from being detained without bail, judges often had to set exorbitant bail amounts to keep violent offenders off the streets; sometimes, those people made bail anyway.

Mind you, whether someone is a “violent offender” is usually a determination to make after trial, a detail that Wired seems to miss in its recitation, as it presumes guilt. But ignorance aside, how are the algorithms working?

Just months in, the experiment has already made an impact. New Jersey saw a 19 percent reduction in its jail population overall between January 1 and May 31 of this year, with just eight people being held on bail throughout the entire state over that time period. Others are either being released with certain conditions or detained without bail.

That’s certainly a significant reduction, which means it’s all good? No. We’ve long been aware that the algorithms are kinda real, kinda malarkey.

Not all of these algorithms are created equal. One ProPublica investigation found that a tool called COMPAS, which was used in sentencing decisions, overwhelmingly rated black defendants higher risk than white defendants.

“Algorithms and predictive tools are only as good as the data that’s fed into them,” Ezekiel Edwards, director of the ACLU’s criminal law reform project, recently told WIRED. “Much of that data is created by man, and that data is infused with bias.”

You have a host of issues concealed behind the word “empiricism,” from the garbage-in, garbage-out (GIGO) problem, since the information input is based on human assessment, to the tweaking problem.

“An effective risk assessment must be gender and race neutral,” says Judge Caposela, one of the PSA’s early evangelists in New Jersey. “The more risk factors you have, the less likely you’ll be able to eliminate gender and racial bias.”

In other words, the more empirical it is, the harder it becomes to screw with the science to match the religion of race and gender neutrality. There could be a ton of data available, which would then be subject to human intervention to align with “neutral” factors, preserving the claim to empiricism while making it unscientific at the same time.

But there was the much sought after reduction in the numbers of people being held, so it can’t be all bad, right? Well, not quite. Before this push toward eliminating needless bail for the poor, bail was routinely imposed for the asking. Some kid prosecutor would decide to ask for some silly amount, like $1,000, high enough to preclude a poor defendant from getting out but serving no legitimate purpose, and judges, whose foremost guiding principle was to not be the guy who cut loose a defendant who later went out and raped and murdered a family of five, would go along.

In other words, there was a huge gap of people for whom bail was never appropriate or necessary, but was imposed anyway. Prosecutors asked. Judges acquiesced. Defendants stayed in jail. Lots of them, for no good reason. And the system ground away, SNAFU.

Had there been a rule that anybody charged with an offense below a mid-level felony, or any defendant who would otherwise have gotten bail of $2,500 or less, would be released without bail, chances are the outcome would have been the same. So many defendants were being needlessly held that even relatively arbitrary rules would have achieved the same good outcome. This isn’t because algorithms make the system so much better, but because the system was so bad before that any change releasing poor defendants without bail would have been a relief.

And the backlash to the Jersey system is the usual, that a defendant will be released who will go out and kill someone. Which, of course, happened.

The algorithm’s lack of transparency has become central to lawsuits surrounding the use of the PSA. Jules Black, the man accused of murdering Christian Rodgers, had been in and out of the New Jersey county jail system 28 times since 1994, according to the suit. His most recent arrest was for unlawful possession of a firearm. During a press conference about the case, Dog the Bounty Hunter questioned why a man with such a record would be released.

Even Judge Caposela acknowledges there’s some truth to that. The PSA takes what he describes as a “neutral view” of gun possession. Because it was trained on data from across the country, and because some states have far more lax gun regulations than New Jersey does, the PSA doesn’t consider mere gun possession as an outsized risk. It wasn’t until after the Rodgers murder that the state’s attorney general issued new guidance, directing New Jersey prosecutors to seek pretrial detention in any gun-related cases.

There are two concerns here, the first being that this is called empiricism but it’s really not. And even if it were, it’s no guarantee that no released defendant will commit a crime, as people are still people. The inclination to demand perfection of an imperfect system, one that was far more imperfect before, takes the eye off the ball. Should ten thousand people remain in pre-trial incarceration for fear that one might commit a crime if released?

Bad things are going to happen, and anyone who thought otherwise is insufferably naive. They happened before the change to the algorithm and they will happen again. But cutting 19% loose tells little about the efficacy of algorithms, given how carelessly bail had been imposed on poor, low-level offenders. The imposition of bail on the poor is a huge problem, but the Jersey experience is a long way from proving that algorithms are the answer.

25 thoughts on “Will Algorithms Be The Death of Bail?”



  1. Richard Kopf

    Whether one uses risk prediction instruments* or not, I agree with you that monetary bail should almost never be used. When it is used, it should be used on rich people who are fully capable of calculating the costs of running.

    So what is the solution for protecting others and assuring attendance in court but still releasing those who should not be held captive until trial?

    There is no solution.

    The state systems lack, or are unwilling to appropriate, the resources to hire a plethora of pretrial supervision officers. Even the costs of pretrial incarceration are likely to be far less than the costs of paying for the many new employees who would be required to do the supervision job properly. (We feds don’t have the vast numbers of pretrial detainees so we escape the cost problem while hiring pretrial service officers, as we don’t have to supervise a horde of pretrial folk and our employee numbers can be kept low.)

    The inconvenient truth, to employ a phrase my least favorite fat guy invented, is that if the states want to treat poor people who are accused of crimes fairly they should do away with bail for the indigent.

    Then the public should be told straight up that they must accept the risk that a small percentage of their daughters will, to use your opening lines, be raped by the recently charged but released. Also, the public should know that a small percentage of the released will be gone in the wind never to return to face our vaunted justice.

    The truth? I doubt the public can handle it.

    All the best.


    *You and I have a disagreement about empirical methods of predicting risk when it comes to people charged with or convicted of crimes. I won’t trouble your readers with my views on that issue except to state that proprietary software should never be used unless all the data and underlying algorithms are available to the public. Even if one uses risk prediction instruments, money bail is almost never the proper way to go if you want to treat most people charged with crimes fairly.

    1. SHG Post author

      Just for kicks, we’ve debated the merits of empiricism quite a bit over the years. Some great posts in there, particularly from Hercules and the umpire.

      I completely agree that there will be a certain percentage of defendants who will abscond as well as commit new crimes. I’ve written about this in the past, though I can’t remember when at the moment, and posited that this is the price of release properly performed. People are still people. It’s a test of our relative value of freedom v. security, and the only way to assure perfect security is to detain everybody, just in case. If that doesn’t make people happy, then they have to suffer the occasional released defendant doing the dirty. They can’t have it both ways.

    2. Jim Tyre

      The inconvenient truth, to employ a phrase my least favorite fat guy invented

      Thank you, Judge Kopf. After the scolding I gave you yesterday, I was worried that I was your least favorite fat guy.

  2. grberry

    Given the presumption of innocence, is it appropriate for risk of committing a new crime to be a factor? Shouldn’t the only issue be the risk of failing to show up for trial, possibly subdivided between flight and flaking out?

  3. John Neff

    I think the jail population will dip and then recover to a level slightly lower than the original level. This most likely will be the result of additional conditions being imposed as well as the judges learning how to game the new system.

    1. SHG Post author

      In short order, as has already happened with weapons possession, the exceptions will overcome the rule. As usual. But it will all be good because we’ve achieved reform.

      1. John Neff

        I would call it a small alteration by system managers. I would call a reform a large alteration imposed by system outsiders. In both cases because the system is adaptive the outcome may not be what was intended.

    2. Erik H

      The more complicated the rules get, the more that they favor people who have the ability, time, money, standing, and knowledge to game the rules. Some folks get to hire Scott; other folks get an overworked PD.

      It doesn’t seem likely to be limited to judges; the gaming takes place everywhere.

  4. Richard Kopf


    That argument was lost long ago when the Supreme Court, in United States v. Salerno (1987), held that “preventive detention,” as it was then called, was constitutional. All the best.


  5. B. McLeod

    So, the old system was a crapshoot, and now, it’s a crapshoot. But a different crapshoot. Over the years, I have noticed the randomness in the criminal system generally. It would be strange if it didn’t also afflict the pretrial phase.

  6. losingtrader

    Who can ever get enough of your stories, which are far more fascinating than anyone else’s? Some days, I sit here wondering what bit of personal or pedantic lore you will impart.

    SHG, is this revision OK for the all-purpose SHG insult list?

  7. Jake

    I’ve thought about this issue a lot, as you know. As hard as it is for my tiny IANAL brain to consider all the relevant factors, something new occurred to me. Would you be happy if the use of algorithms to decide the fate of defendants was limited to open source solutions? In other words, no black-box trickery, as was the case in the ProPublica story?

    I’m not saying I’ve flip-flopped again. You successfully swayed me on this issue, and it caused me to learn about a whole litany of related issues, such as data protection and privacy regulation. But I am wondering where the real problem lies when it comes to supplanting human decision making. As you probably are aware, this application of emerging tech is about to be making a lot more decisions in our lives, some of which could easily kill people.

    1. SHG Post author

      Transparency would be the first, and absolutely necessary, step in the process. But that wouldn’t be the last step. Next up is its efficacy: does it successfully accomplish its purpose? Part of that stems from the validity of the algorithm used, and part from the underlying data fed into the box. And finally, assuming it can make it through to this point, there is the “keep your human mitts off the science” should it turn out not to comport with the religion of social justice.

      Then, maybe.

  8. N. Freed

    Perhaps an actual, albeit old, example of the development of one of these algorithms would be of interest.

    Back in the mid-1980s I acted as data wrangler for a project done for county welfare. The goal of the project was to come up with a formula to predict likely welfare cheats from data available on every participant. The reason this was important was if independent checks at the federal level turned up high rates of fraud they’d get their budget docked. So there was considerable incentive to deploy their limited investigatory resources to maximum effect.

    The input data to the project was generated by selecting and investigating several hundred people at random. The outcome of the investigation was reduced to a yes/no result. The task, then, was to predict that result from ~150 pieces of available data.

    If memory serves, SPSS was used to do the analysis, producing what amounted to a linear formula with some table lookups.

    And lo and behold it was 90+% accurate. Amazing, really.

    The catch was the most significant variable in the formula was – you guessed it – race. It seems the investigators turned up a lot more Blacks cheating than Asians.

    When the welfare folks noticed this they got upset, and demanded that race be dropped. Which reduced the accuracy to something like 70%. Which wasn’t good enough.

    This was until someone got the bright idea of predicting race from the other variables, then using the synthetic “predicted race” variable instead of the actual race. Which immediately brought the accuracy back up to 85+%.

    The final report included both formulas. I have no idea if either one was ever actually used.

    Now, these days the algorithms used are much more sophisticated. But even back then it was possible to make the use of race a non-obvious thing. And if they’re using something like a neural net, even one that’s completely open source is going to be nearly impossible to figure out.
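The “synthetic predicted race” trick described above can be sketched in a few lines. Everything here is made up for illustration (toy records, a fictional neighborhood code as the correlated feature): drop the protected attribute, predict it from the proxy, and substitute the prediction.

```python
# Toy illustration of reconstructing a dropped protected attribute
# from a correlated proxy feature. All data is invented.

# Records: (neighborhood, protected_group, outcome)
records = [
    ("north", "A", 1), ("north", "A", 1), ("north", "B", 0),
    ("south", "B", 0), ("south", "B", 0), ("south", "A", 1),
]

def fit_proxy(rows):
    """Predict the protected attribute by majority group per neighborhood."""
    counts = {}
    for hood, group, _ in rows:
        counts.setdefault(hood, {}).setdefault(group, 0)
        counts[hood][group] += 1
    return {hood: max(groups, key=groups.get) for hood, groups in counts.items()}

proxy = fit_proxy(records)
# Wherever neighborhoods are segregated, the "synthetic" attribute
# agrees with the real one -- the bias survives the deletion.
agreement = sum(proxy[h] == g for h, g, _ in records) / len(records)
print(proxy)      # {'north': 'A', 'south': 'B'}
print(agreement)
```

The more segregated the proxy feature, the closer the synthetic variable tracks the real one, which is exactly why dropping the explicit variable restored so little neutrality in the project described.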

