Has The Crime Lab Fiasco Been Fixed?

The first sentence of the title of Radley Balko’s post was heartwarming.

Forensic science reform is finally here.

Well, that’s good news, right? But the second sentence steals the warm and fuzzies back again.

But will we get it right?

He’s such a tease.

[T]he Justice Department announced this month that it will only use evidence from accredited crime labs around the country. That’s at least an incentive for labs to begin adopting best practices and minimum standards. But as with many of the criminal justice reforms that have emanated from the Obama administration, this one may be more symbolism than substance. Symbolism isn’t unimportant. But it doesn’t do much to correct injustices.

We’re suckers for words that convey a sense of officialdom, and accreditation is a really cool word that stinks of it. Colleges and law schools are accredited, and don’t we trust the imprimatur of an official-sounding group to assure quality? After all, the alternative would be for us to investigate on our own, make a reasoned decision, and accept or reject the significance and value of accreditation. That’s a lot of work. Who needs more work when we can just rely on some trusted resource to do it for us?

The new policy, announced by Deputy Attorney General Sally Q. Yates at a meeting of the National Commission on Forensic Sciences, is unlikely to precipitate any immediate or universal overhaul of the nation’s crime labs. It will only directly affect labs contracted by the DOJ that aren’t already accredited — and it will only begin doing so in 2020. 

Plus, there’s a little caveat that use of accredited labs will happen to the extent it’s “practicable,” which suggests that when the only lab in town isn’t accredited, using an accredited one becomes impracticable. They can’t use what they don’t have, and it’s not like the prosecution is going to stop putting in scientific evidence just because its lab sucks. That could result in failed prosecutions, and civilization would collapse.

And then there are the problems with accreditation being somewhat less than “rigorous,” meaning that Timmy the accreditor stops by, has a cold beverage, and gives an enthusiastic thumbs up.

Nor does accreditation solve many other common problems that crop up in labs, from error-prone or undertrained technicians to unscientific conclusions. The sensitivity and accuracy of DNA science, which began being used in courts in the late 1980s, has thrown into doubt the accuracy of many other forensic disciplines that previously commanded confidence in courts, from arson investigation to hair analysis to bite mark analysis. But in recent years, even DNA forensics have come into question. Such problems cannot be addressed by accreditation alone.

While the word “accreditation,” particularly when emanating from the mouth of such a trusted source as Deputy Attorney General Sally Q. Yates*, would suggest that we’ve finally turned the corner on crime labs fraught with incompetence, lies and deceit, there is one nagging problem: the crime lab scandals of the past few years, in which labs falsified results, all happened at accredited labs.

Maybe the word “accreditation” isn’t as meaningful a cure as it seems?

The two main pitfalls of forensic evidence are (1) the fact that we ask judges, not trained scientists, to determine what is and isn’t credible scientific evidence, and (2) cognitive bias. The former allows bad science into the courtroom, and allows practitioners of even the more credible forensics fields to overstate their findings. The latter is a problem that creeps into all forms of forensic analysis.

As Radley notes, junk science has long held sway in the law, until some wag came up with DNA, which proved that much of our evidence offered with nearly indisputable certainty was not just utter crap, but wrong. If an “expert” on bite mark evidence testified that the defendant was almost certainly the killer, there was little one could do to dispute it save for the fortuitous appearance of DNA as a new forensic tool, and then only if there was DNA evidence available for use. People seem to forget that DNA isn’t always available, whether at all or in a form that’s preserved, uncontaminated, unmixed and untainted.

And when there is no DNA with which to smack the junk expert upside the head, then what?

But it’s unlikely that anything as certain, overarching, and game-changing as DNA testing will come along again. Of course, in cases where DNA testing is dispositive of guilt, there will be no need to rely on the less credible forensic fields. But that’s only a small percentage of cases.

Well before Sally Yates’s fabulous news, there was the 2009 report of the National Academy of Sciences. It was government-issued, but not from the mouth of the DoJ. Instead, it came from neutrals, and it didn’t have nice things to say.

Forensic evidence is often offered in criminal prosecutions and civil litigation to support conclusions about individualization — in other words, to “match” a piece of evidence to a particular person, weapon, or other source. But with the exception of nuclear DNA analysis, the report says, no forensic method has been rigorously shown able to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.

Here we are, six years later, scandal upon scandal revealing false evidence used to put thousands of individuals in prison, and the DoJ offers accreditation of labs as a reform of junk science. But wasn’t this one of the prongs of the report’s fixes?

Rigorous and mandatory certification programs for forensic scientists are currently lacking, the report says, as are strong standards and protocols for analyzing and reporting on evidence. And there is a dearth of peer-reviewed, published studies establishing the scientific bases and reliability of many forensic methods. Moreover, many forensic science labs are underfunded, understaffed, and have no effective oversight.

So no, not quite. Lawyers aren’t sufficiently knowledgeable to deal with junk science. Judges, often less so. Does Sally Yates’s claim of accreditation by, well, the same people who accredited the failed labs up to now reform much of anything? It all depends on how impressed you are with the word “accreditation.” If the government knows one thing, it’s that we’re suckers for a very official word, as it relieves us of the burden of actually thinking.

*Lest some cynic forget, whenever a challenge arises to the fidelity of local prosecutors’ handling of experts, we turn to the DoJ, because they’re far more trustworthy.

8 thoughts on “Has The Crime Lab Fiasco Been Fixed?”

  1. Keith

    Why do government functionaries make things so difficult? If your goal is a fix that’s clear, simple and wrong, just declare that only guilty people can be prosecuted.

    It’s not like people have suggested hard, complex solutions, like making sure there’s money in the PD’s budget for experts to explain what problems exist, or holding criminal law experts to the same stringency as civil law experts, or any number of other fixes found on these pages.

    Nah, screw it. You’re lawyers, not doctors. Do you want Frye’s with that?
    (and seriously, fix the captcha thing)

  2. Herbert Stock

    Perhaps it is time to introduce the quality control procedures used in Metrology (a branch of Production Engineering): standard samples, unknown to all others, are prepared by a group of laboratories and sent to the laboratories seeking accreditation, which have to analyse them and report. The results are then compared with the standard report by a third, independent laboratory, and the results published. The lab seeking accreditation must then report its success (or otherwise) in any legal proceedings it is involved in, so that the judicial system can assess its reliability.
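    A minimal sketch of that kind of blind comparison, with made-up sample IDs, reference values and a tolerance (none of it taken from any real lab or accreditation scheme), might look like this:

```python
# Hypothetical blind proficiency check: reference values are known only to the
# coordinating group; the participating lab reports its own measurements.
# All names and numbers below are invented for illustration.

# Reference results prepared by the coordinating laboratories (kept blind from the lab).
reference = {
    "sample_01": 1.5183,   # e.g. refractive index of a glass standard
    "sample_02": 1.5221,
    "sample_03": 1.5179,
}

# What the lab seeking accreditation reported back.
reported = {
    "sample_01": 1.5185,
    "sample_02": 1.5240,   # outside tolerance in this made-up run
    "sample_03": 1.5180,
}

TOLERANCE = 0.001  # acceptable absolute difference, chosen arbitrarily here


def score_lab(reference, reported, tolerance):
    """Compare reported values to the blind reference and return a pass rate."""
    results = {}
    for sample_id, true_value in reference.items():
        measured = reported.get(sample_id)
        passed = measured is not None and abs(measured - true_value) <= tolerance
        results[sample_id] = passed
    pass_rate = sum(results.values()) / len(results)
    return results, pass_rate


if __name__ == "__main__":
    per_sample, rate = score_lab(reference, reported, TOLERANCE)
    for sample_id, passed in per_sample.items():
        print(f"{sample_id}: {'pass' if passed else 'fail'}")
    # The published success rate the lab would have to disclose in court.
    print(f"overall pass rate: {rate:.0%}")
```

    The published pass rate at the end, rather than the bare fact of accreditation, is the number a court could actually weigh.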

    1. Dave Ruddell

      That’s how most proficiency tests work, at least in my experience. But that’s not really what accreditation, at least under the ASCLD/LAB ISO 17025 standard, is all about. Certainly you need to be proficiency tested in any disciplines you work in, but it’s a much larger program. It basically covers all of your lab operations: testing, reporting, validation of methods, court testimony, your QA system, etc.

      1. Robert Zeh

        But accreditation, as you’ve described it, doesn’t tell you if the crime lab is giving you valid results. It just tells you that the crime lab is adhering to standards.
        Let me give an analogy from the construction industry. As you’re building with concrete, you take some of the concrete off to the side and store it. After the concrete has cured, you load-test it. You have faith in the concrete because it passed the load test, not because the supplier was ISO 9001 certified.
        From what I can tell by reading “The Advantages of Being Accredited” by ILAC, ASCLD/LAB ISO 17025 certification is an evaluation of how a crime lab does its work — not an evaluation of the lab’s results.
        For crime labs, evaluating the results is cheap and easy. When I’ve talked with lawyers and DAs about why it isn’t done, it comes down to a mentality mismatch. As an engineer, I think of crime labs as something that can be independently evaluated as black boxes, while my lawyer friends point out that in the end it is all just testimony.
        But the cynical part of me thinks that no one wants to know what the error rates are for crime labs.

        1. SHG Post author

          Remember, these are not mutually exclusive considerations, and yet none of them are being discussed. Merely uttering “accreditation” means nothing without it being real accreditation, and without that accreditation leading to provably accurate outcomes.

        2. Dave Ruddell

          I see your point, although I would point out that proficiency testing is part of accreditation. In addition, we constantly run either standards or reference samples to make sure that our instruments are working properly (in my lab, at least). For example, when analyzing for GSR, there’s always a check at the beginning and end of the run against a known GSR sample to make sure the instrument is performing correctly. This is true for every instrument we have in our lab, to the best of my knowledge (it’s a big lab, ~250 people).

          You make an interesting point about error rates. The NAS report on forensic science called for just that. We’ve been trying to do it, although it’s not easy in some cases. When I determine the refractive index of a glass fragment, I can give a range of values easily enough, based on standard deviation. On the other hand, if I identify a particle as gunshot residue, I don’t know how to even begin to estimate what the error rate is. It doesn’t lend itself well to numerical analysis (the sketch after this comment illustrates the contrast).

          In my opinion, the bigger problem is not the accuracy of the results, but their interpretation, which accreditation doesn’t really help with. I think it’s far too easy to oversell the results, and not account for alternate explanations. I’ve seen a number of examples in both the general media and more specialized venues, like the Jurisprudence section of the American Academy of Forensic Sciences.
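          A rough sketch of that contrast, with invented refractive index measurements and an invented coverage factor (none of it drawn from actual casework or any particular lab’s procedure), might read:

```python
# Hypothetical illustration of the difference between a measurement with a
# quantifiable spread (glass refractive index) and a categorical identification
# (gunshot residue) whose error rate can't be read off the data the same way.
# All numbers are invented for the example.
from statistics import mean, stdev

# Repeated refractive index measurements of one glass fragment (made up).
ri_measurements = [1.5181, 1.5183, 1.5179, 1.5182, 1.5180]

m = mean(ri_measurements)
s = stdev(ri_measurements)
k = 2  # nominal coverage factor; a lab would pick this by policy

print(f"refractive index: {m:.4f} +/- {k * s:.4f} (mean +/- {k} SD)")

# A GSR call, by contrast, is a yes/no identification. There is no standard
# deviation to report; an error rate would have to come from something external,
# such as blind proficiency trials across many known samples.
gsr_identified = True
print(f"particle identified as GSR: {gsr_identified} (no per-result error bar)")
```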

Comments are closed.