For years now, the criminal justice system has embraced technology, algorithms and empiricism, in the belief that they improve fairness and give judges and jurors the ability to do better than the historic knee-jerk voodoo that’s been hidden behind fancy phrases like “judicial discretion” and “beyond a reasonable doubt.”
What if tech can genuinely distinguish the guilty defendant from the innocent? The defendant who will kill again or flee the jurisdiction? Isn’t that good for the defendant who won’t, who will get that “break” he was denied when it was nothing more than a gut-based guess?
But then, there’s money to be made off the system. The sellers claim that they can’t disclose their source code or they’ll suffer irreparable harm. The judges want the black-box magic, and fear disclosure will kill the goose that lays the empirical eggs. They may not be sure whether it works, or how it works, because it’s science, but they know their own failings and figure the black box must certainly do better than they can.
And the defense? There’s that Sixth Amendment right to challenge the basis upon which a guy is going to be convicted, imprisoned, held forever, and it’s not possible to challenge a black box when we have no clue what goes on inside it. There’s nothing new here, and there’s no interest in changing things. And given the conflict between protecting the vendor’s trade secret and protecting the defendant’s constitutional rights, who wins?
Technological advancement is, in theory, a welcome development. But in practice, aspects of automation are making the justice system less fair for criminal defendants.
The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.
Legal Aid’s Rebecca Wexler provides examples that should be familiar to any player in the system, from Compas to TrueAllele. But these are just the best-known names; there are others, working behind the curtain, whose names and tech are never uttered in the courtroom, but whose voodoo is quietly permeating all aspects of the system.
TrueAllele is not alone. In another case, an organization that produces cybercrime investigative software tried to invoke a trade secret evidentiary privilege to withhold its source code, despite concerns that the program violated the Fourth Amendment by surreptitiously scanning computer hard drives. In still other instances, developers of face recognition technology have refused to disclose the user manuals for their software programs, potentially obstructing defense experts’ ability to evaluate whether a program has been calibrated for certain racial groups and not for others.
Likewise, the algorithms used to generate probabilistic matches for latent fingerprint analysis, and to search ballistic information databases for firearm and cartridge matches, are treated as trade secrets and remain inaccessible to independent auditors.
The private enterprises that sell their wares to the government feel completely entitled to keep their code secret. Ours is a capitalist economy, after all, and if they reveal their code, some other capitalist will steal it, produce the same black box and sell it cheaper, since they didn’t have to spend the time and money to come up with the code. From a business standpoint, this isn’t at all unreasonable. If you have something of value, it needs to be protected.
This is a new and troubling feature of the criminal justice system. Property interests do not usually shield relevant evidence from the accused. And it’s not how trade secrets law is supposed to work, either. The most common explanation for why this form of intellectual property should exist is that people will be more likely to invest in new ideas if they can stop their business competitors from free riding on the results. The law is designed to stop business competitors from stealing confidential commercial information, not to justify withholding information from the defense in criminal proceedings.
It’s a good argument, but it’s also in fundamental conflict with the defendant’s constitutional right to know how the black box works. After all, just because some coder created it and some bureaucrat bought it doesn’t mean it works. Or to be less cynical, doesn’t mean it works well enough to risk someone’s life on it.
The coders and the science-minded here have argued for the efficacy of these efforts to use tech to improve the system. They believe, which is fine but unavailing. Just because they have faith in tech, in statistics, in science, doesn’t mean they have a sufficient understanding of the myriad factors that apply to any individual defendant.
They believe “one size fits all” is better than a blind squirrel in the jury box or on the bench, which it may be. They also believe that close is good enough. So what if the algorithm fails here and there? Nothing is perfect, right? Tell that to the innocent guy who gets convicted, or the guilty guy who spends an extra decade in prison because the black box said he’d rape again.
Defense advocacy is a keystone of due process, not a business competition. And defense attorneys are officers of the court, not would-be thieves. In civil cases, trade secrets are often disclosed to opposing parties subject to a protective order. The same solution should work for those defending life or liberty.
There is a difference, however: civil disclosure tends to be a one-in-a-million deal, whereas criminal disclosure will happen tens of thousands of times. Somebody will spill the beans and the code will get out. It’s bound to happen, and nobody believes that “officer of the court” stuff will stop it.
The Supreme Court is currently considering hearing a case, Loomis v. Wisconsin, that raises similar issues. If it hears the case, the court will have the opportunity to rule on whether it violates due process to sentence someone based on a risk-assessment instrument whose workings are protected as a trade secret. If the court declines the case or rules that this is constitutional, legislatures should step in and pass laws limiting trade-secret safeguards in criminal proceedings to a protective order and nothing more.
The Supremes refused to decide the question the last time, when they denied cert in Terry v. California. They’ve got another chance in Loomis. The issue is pretty straightforward: does the protection of tech vendors’ trade secrets trump the defendant’s constitutional right to challenge the black box used against him? We need a decision.