The “Black Box” Beats The Constitution

For years now, the criminal justice system has embraced technology, algorithms, empiricism, in a belief that it improves fairness and gives judges and jurors the ability to do better than the historic knee-jerk voodoo that’s been hidden behind fancy phrases like “judicial discretion” and “beyond a reasonable doubt.”

What if tech can genuinely distinguish the guilty defendant from the innocent? The defendant who will kill again or flee the jurisdiction? Isn’t that good for the defendant who won’t, who will get that “break” he was denied when it was nothing more than a gut-based guess?

But then, there’s money to be made off the system. The sellers claim that they can’t disclose their source code or they’ll suffer irreparable harm. The judges want the black-box magic, and fear disclosure will kill the goose that lays the empirical eggs. They may not be sure whether it works, or how it works, because it’s science, but they know their own failings and figure the black box must certainly do better than they can.

And the defense? There’s that Sixth Amendment right to challenge the basis upon which a guy is going to be convicted, imprisoned, held forever, and it’s not possible to challenge a black box when we have no clue what goes on inside of it. There’s nothing new here. There’s no interest in changing things. And given the conflict between protecting the vendor’s trade secret and protecting the defendant’s constitutional rights, who wins?

Technological advancement is, in theory, a welcome development. But in practice, aspects of automation are making the justice system less fair for criminal defendants.

The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.

Legal Aid’s Rebecca Wexler provides examples that should be familiar to any player in the system, from Compas to TrueAllele. But these are just the best-known names, and there are others, working behind the curtain, whose names and tech are never uttered in the courtroom, but whose voodoo is quietly permeating all aspects of the system.

TrueAllele is not alone. In another case, an organization that produces cybercrime investigative software tried to invoke a trade secret evidentiary privilege to withhold its source code, despite concerns that the program violated the Fourth Amendment by surreptitiously scanning computer hard drives. In still other instances, developers of face recognition technology have refused to disclose the user manuals for their software programs, potentially obstructing defense experts’ ability to evaluate whether a program has been calibrated for certain racial groups and not for others.

Likewise, the algorithms used to generate probabilistic matches for latent fingerprint analysis, and to search ballistic information databases for firearm and cartridge matches, are treated as trade secrets and remain inaccessible to independent auditors.

The private enterprises that sell their wares to the government feel completely entitled to keep their code secret. We are a capitalist economy, after all, and if they reveal their code, some other capitalist will steal it, produce the same black box and sell it cheaper, since they didn’t have to waste their time and money on coming up with the code. From a business standpoint, this isn’t at all unreasonable. If you have something of value, it needs to be protected.

This is a new and troubling feature of the criminal justice system. Property interests do not usually shield relevant evidence from the accused. And it’s not how trade secrets law is supposed to work, either. The most common explanation for why this form of intellectual property should exist is that people will be more likely to invest in new ideas if they can stop their business competitors from free riding on the results. The law is designed to stop business competitors from stealing confidential commercial information, not to justify withholding information from the defense in criminal proceedings.

It’s a good argument, but it’s also in fundamental conflict with the defendant’s constitutional right to know how the black box works. After all, just because some coder created it and some bureaucrat bought it doesn’t mean it works. Or to be less cynical, doesn’t mean it works well enough to risk someone’s life on it.

The coders and science-minded here have argued the efficacy of these efforts to use tech to improve the system. They believe in it, which is fine but unavailing. Just because they have faith in tech, in statistics, in science, doesn’t mean they have a sufficient understanding of the myriad factors that apply to any individual defendant.

They believe “one size fits all” is better than a blind squirrel in the jury box or on the bench, which it may be. They also believe that close is good enough. So what if the algorithm fails here and there? Nothing is perfect, right? Tell that to the innocent guy who gets convicted, or the guilty guy who spends an extra decade in prison because the black box said he’ll rape again.

Defense advocacy is a keystone of due process, not a business competition. And defense attorneys are officers of the court, not would-be thieves. In civil cases, trade secrets are often disclosed to opposing parties subject to a protective order. The same solution should work for those defending life or liberty.

There is a difference, however, as the civil disclosure tends to be a one in a million deal, whereas criminal disclosure will happen tens of thousands of times. Somebody will spill the beans and the code will get out. It’s bound to happen, and nobody believes that “officer of the court” stuff will stop it.

The Supreme Court is currently considering hearing a case, Wisconsin v. Loomis, that raises similar issues. If it hears the case, the court will have the opportunity to rule on whether it violates due process to sentence someone based on a risk-assessment instrument whose workings are protected as a trade secret. If the court declines the case or rules that this is constitutional, legislatures should step in and pass laws limiting trade-secret safeguards in criminal proceedings to a protective order and nothing more.

The Supremes refused to decide the question the last time, when they denied cert in Terry v. California. They’ve got another chance in Loomis. The issue is pretty straightforward: does protection of tech vendors’ trade secrets trump the defendant’s constitutional right to challenge the black box used against him? We need a decision.

28 thoughts on “The “Black Box” Beats The Constitution”

  1. PDB

    Seems like trial judges need to be doing better gatekeeping here. Do they ever exercise discretion and keep things out if there isn’t sufficient disclosure, or is it always just: “Since the government wants it, it’s all good”? It didn’t seem like it in your earlier post; has anything changed at all?

    Are there similar programs that defendants use? How have trial judges come down on the admissibility when something like this benefits the D?

    If Judge Kopf has any additional insight, I would be interested in hearing it.

    1. SHG Post author

      Judge Kopf has, though it’s just his perspective. Judges are conflicted as gatekeepers, and that’s why they do the job so poorly. All it takes is for one to let it in, and then the floodgates for junk are open. Maybe the one lets it in because he’s kinda dumb, or kinda prosecutorially oriented. But maybe it’s just that science is hard and they’re ill-equipped to figure it out.

      1. PDB

        And are there any tools like these available for defendants? Or are they always used by the prosecution?

  2. Steve Sundell

    I’m sorry, I know this is a serious problem; but the mental picture of a black-enrobed Magic 8-Ball (TM) displaying the “Try Again” face of the floating icosahedron just won’t go away…

  3. B. McLeod

    Well, whatever is in the box, it’s “science,” right? So it’s not like we (or a defendant looking at a long stretch in the pokey) could disagree with it.

    1. SHG Post author

      The mere mention of the word science makes our eyes glaze over. Thank god there are experts to tell us what it all means.

  4. Richard G. Kopf


    As you know, I am big on using risk prediction instruments during the pretrial phase and I also push for the same at sentencing. The feds have developed two very good risk prediction instruments that are known as PCRA and PICTS.

    Those instruments are publicly available and can be scrutinized for all to see. See, e.g., Richard G. Kopf, Federal Supervised Release and Actuarial Data (including Age, Race, and Gender): The Camel’s Nose and the Use of Actuarial Data at Sentencing, Federal Sentencing Reporter, Vol. 27, No. 4 at pp. 207-215 (April 2015), available at our Court’s website, click on “Judges Information,” my name and that will lead you to the article plus the actual instruments and related information.

    With the foregoing in mind, I am appalled at the idea of using proprietary data and methodology for any purpose at the pretrial stage or at sentencing. I know just enough about this stuff to know they can be dangerous. Trust me, you can’t trust these proprietary instruments unless they are evaluated in the light of day. If that sounds like I am pimping due process, then call me a pimp.

    All the best.


    [Ed. Note: Or I could just add in the link.]

    1. SHG Post author

      As you know, I have issues with the use of “the very good risk prediction” algorithms that are transparent, but the ones that are hidden behind the curtain are not only potentially dangerous, but certainly in violation of the 6th Amendment. Love data all you want, but the Constitution still wins.

      When I was in college, there was a letter published in Penthouse (I only read it for the letters) asking if there was a place one could go to learn how to be a pimp. The response was that if you had to ask, you weren’t cut out for the vocation.

      While it’s unclear that you’ve disproved this retort, it appears that law school may be the place to learn, provided that the objective is to be a pimp for due process. Have you considered trading in your bespoke black robe for something in purple velvet with a matching wide-brimmed hat? You would look spectacular.


      Mrs. Kopf’s John Deere garden tractor, after due process modifications.

      1. REvers

        Isaac Hayes’ pimp caddy in Escape From New York is the ultimate. It’s just not possible to argue about fender-mounted chandeliers.

  5. phv3773

    Whatever these programs are based on, I don’t think it’s science. It may be some bastard child of statistics and data analysis. Some of these companies may want to keep their code secret because, like the Wizard of Oz, it loses its power when you look behind the curtain.

    1. Fubar

      My black box, a wondrous appliance,
      Should have all courts’ total reliance.
      Answers never take long,
      Always right, never wrong,
      ‘Cause it’s jam-packed with top secret science!

      It answers before you can ask,
      Using look-ahead code for the task.
      And, if you even think
      That you need a stiff drink,
      It becomes a convenient hip-flask!

  6. Agammamon

    Judges have guidelines for bail and sentencing – when they depart from them, how much leeway are they allowed? How much explaining for the departure is required?

    It seems to me that judges are black boxes whose inner workings you can’t get an order to expose. That may be why this is considered so acceptable: if judges believe they can quantify all the relevant factors to give a fair hearing, then so could an infinite number of code-monkeys banging away on keyboards. Judges are rarely called to explain their sentences; it’s all according to the Guidelines even when it’s not. The drift of what is ‘acceptable’ is slow, so if the BB is giving answers similar to what these judges would, who’s to notice or care about the problems with edge cases? Edge cases are hard and even judges get them wrong.

    This isn’t a justification for the use of these tools, and they certainly should be transparent, but anyone fighting this is ice-skating uphill, simply because these are not a massive new shift in the way business is done, just a little bit more of the camel’s nose under the tent.

    1. SHG Post author

      I realize you aren’t trying to win today’s Billy Madison award, but damn, you’re going to be hard to beat. What makes me very sad is that you’ve been around here a long time, and I’ve clearly failed you miserably. It’s my fault, not yours.

  7. LTMG

    What’s in the box is not science. It is an engineer’s interpretation of the science. Any interpretation will necessarily insert some degree of bias. A defense attorney can reasonably ask black box developers about assumptions inherent in the design and the logic flow of the processing. These are likely far enough away from IP to be permitted questions. If assumptions and logic flow are protected as IP then defendants should be worried.

    1. SHG Post author

      A defense attorney can reasonably ask black box developers about assumptions inherent in the design and the logic flow of the processing.

      We’re not quite as trusting as you.

  8. Pingback: The Deciders at the Tip of the Technological Iceberg | RHDefense
