When Guilt Is Statistical

It’s hard to explain how Lucia de Berk could be convicted of anything, particularly when the very idea that a crime occurred was established theoretically rather than by actual evidence.  But in a sophisticated version of the idiot’s delight, “where there’s smoke, there’s fire,” the Dutch nurse was convicted and spent six years in prison until somebody finally figured out that sometimes it’s just smoke.

Via NRC Handelsblad,

The final ruling in the case of Lucia de Berk, who was wrongfully imprisoned as a serial killer for more than six years, is a harsh reckoning within judicial circles. The ruling will long resound with judges, prosecutors, detectives and, to a certain extent, lawyers. How could they have been this wrong collectively?

The appeals court in Arnhem, which on Wednesday finally acquitted nurse De Berk of the seven murders and three attempted murders she was convicted for in 2001, drew some conclusions for the future by pointing out all that went wrong in the legal reasoning. Unexplained deaths of very ill hospital patients in De Berk’s care were treated as inexplicable deaths. All of those involved mistakenly assumed that every death could be explained by human action or inaction. Autopsy was only carried out on two of the dead patients’ bodies, several months after they had died, and failed to prove they were murder victims.

The totality of the evidence against de Berk:

An expert witness for the state testified that the chances were one in 342 million that so many suspicious deaths would take place during the shifts of one particular nurse.

Sure, she’s a nurse.  Sure, the patients were very ill.  Sure, ill patients sometimes die.  But what are the chances?  Well, apparently not one in 342 million.
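For the mathematically curious, a toy simulation makes the flaw concrete.  Every number below is made up for illustration — the staff size, shift counts and death rate are not the actual hospital data.  The point is only this: ask how likely it is that a nurse you’ve already singled out sees a cluster of deaths, and you get a scary little number; ask how likely it is that some nurse, somewhere, sees one, and the number stops being scary at all.

```python
import random

random.seed(1)

NURSES = 250        # illustrative: nurses across the hospital's wards
SHIFTS = 1000       # illustrative: shifts each nurse works in the period
DEATH_RATE = 0.002  # illustrative chance a death falls on any one shift
CLUSTER = 7         # the "suspicious" count, echoing the seven charged deaths
TRIALS = 10_000

def deaths_on_shifts():
    """Count deaths landing, purely by chance, on one nurse's shifts."""
    return sum(random.random() < DEATH_RATE for _ in range(SHIFTS))

# Chance that one nurse, picked in advance, sees a cluster -- the kind
# of number the expert's figure resembles.
per_nurse = sum(deaths_on_shifts() >= CLUSTER for _ in range(TRIALS)) / TRIALS

# Chance that *some* nurse, somewhere in the hospital, sees a cluster --
# the question that actually matters before inferring a crime occurred.
any_nurse = 1 - (1 - per_nurse) ** NURSES

print(f"one named nurse: {per_nurse:.4f}")
print(f"some nurse, somewhere: {any_nurse:.2f}")
```

With these made-up inputs, the per-nurse chance comes out to a fraction of a percent, while the hospital-wide chance is better than a coin flip.  Same deaths, same randomness; the only thing that changed is the question being asked.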

But his data and method of calculating the odds were eventually discredited, as prosecutors finally conceded when they asked for the dismissal of all the charges against Ms. de Berk last month.

On April 14, a court granted the acquittal. In their ruling, the judges said that there was no evidence that any of the patients she had been accused of killing had died of anything other than natural causes.

The Arnhem court came to the conclusion there was no perpetrator, because none of the ten alleged crimes could be proven at all. All ‘victims’ died from natural causes, the court said. It went on to acquit the nurse of all charges.

To their credit, the Dutch prosecutors apologized to Ms. de Berk for the six years she spent in prison for a crime that never happened.  Not much chance of that happening around here, where the best she could hope for is a surly guard telling her not to let the door hit her on the way out.

The failure of this case is one of assumption and over-reliance on expertise.  Courts love experts who shift the burden off of soft evidence, like testimony, onto hard evidence which bears the imprimatur of science.  The initiation of the investigation was smoke, that a bunch of patients weren’t likely to die en masse on the shift of one nurse.  That it happened certainly justifies a very serious look-see.  It doesn’t usually happen that way, no doubt about it.

But the next step is what distinguishes those who think Nancy Grace is just brilliant from thinking people.  You ascertain the cause of death.  If there is evidence that the patients died of something other than natural causes, you figure out what that was.  Here, the deaths were characterized as “unexplained”.  Murder, on the other hand, is well-explained.  The gap between the two means that there is no crime, not that we can shut our eyes real tight and pretend that the absence of explanation proves something.

The clincher here was all in the numbers crunching.  It’s an appealing argument, particularly to those disinclined to math.  A variation of the argument is used all the time here when it comes to proving the identity of a criminal by DNA, which presents huge statistical problems complicated by significant collateral issues in collection, retention, preservation and more.  Yet all the jury hears is that the defendant is the killer because there’s a one in 27,843,256,918.37 chance that he’s not.  Fry him.
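To see why those match statistics mislead, run the arithmetic the jury never hears.  The numbers below are hypothetical — a made-up random-match probability and a made-up pool of alternative suspects — but the structure is the prosecutor’s fallacy in miniature: the jury is told the chance that an innocent person matches, when the question is the chance that a matching person is innocent.

```python
# Hypothetical numbers: a random-match probability of 1 in 10 million,
# weighed against a pool of plausible alternative sources.
match_prob_if_innocent = 1 / 10_000_000   # the figure the jury is told
pool = 5_000_000                          # people who could also be the source

# Expected number of innocent people in the pool who would also match:
expected_innocent_matches = pool * match_prob_if_innocent

# If exactly one person in the pool is the true source, a matching
# defendant is just one of (1 + expected_innocent_matches) matching people.
p_source_given_match = 1 / (1 + expected_innocent_matches)

print(f"expected coincidental matches in the pool: {expected_innocent_matches:.1f}")
print(f"chance the matching defendant is the source: {p_source_given_match:.0%}")
```

On these made-up numbers, “one in ten million” quietly becomes a two-in-three chance that the defendant is the source — leaving a one-in-three chance he isn’t.  Nothing like the certainty the raw match statistic implies.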

For reasons unexplained, someone cared to figure out whether Ms. de Berk was guilty.  Unfortunately, but typically, this interest wasn’t sparked until after the conviction.

Erik van den Emster, the chairman of the Netherlands Council for the Judiciary, said, “in a legal sense, the system worked in the end”. The judiciary corrected its own mistake, he added benevolently in a TV interview.

This statement speaks of complacency. All credit for correcting this judicial blunder should go to skilled outsiders. There were scientists, doctors, statisticians, toxicologists and others whose continuous research and civil opposition forced the trial to be reopened. Nobody doubts the conscience of the magistrates involved. But their skills and professionalism now stand corrected by the court.

“Better late than never,” is one great platitude.  “Get it right the first time,” is another.  There is no platitude that makes the harm caused by well-intended blunders hurt any less.  No doubt everyone involved in the creation of this monumental screw-up believed, in all sincerity, that they were right.  So what if the strength of their strongly-held mistaken beliefs was largely responsible for their inability to assess the faults in their own reasoning?

It would bring comfort to believe that such a speculative showing, one devoid of actual evidence and based entirely on the fire assumption, bootstrapped into certainty by faith and sincerity, could never happen in a rational court system.  When you find one, let me know.

H/T Ed at Blawg Review



7 thoughts on “When Guilt Is Statistical”

  1. Doug Cornelius

    You may also be interested in this New York Times article on the difficulty of this kind of conditional probability:

    The author points out how the defense in the OJ case used some bad conditional probability and the prosecutors failed to catch it.

  2. Stephen

    Stats is a terribly difficult area of maths and it can be extremely useful in many different walks of life. However, I don’t think it really has any place in a criminal court, particularly in terms of if a crime has been committed at all.

    Courts deal with murders all the time, yet statistically you’re very unlikely to be murdered. Should we deny that a murder happened because the stats say that the odds of it happening to any particular person are low?

    You’re not really concerned about what probably happened, you’re supposed to be looking to try and work out as best you can what actually did.

  3. SHG

    We are in a constant struggle with “common wisdom,” two words that should never be used together, born of our media-deluged world.  But the press just feeds us the distilled crap we want in ignorant little bites, suited to our palate, temperament and attention span.  They give us simple and wrong because that’s all we’re willing to absorb.

  4. Windypundit

    Here’s the part that gets me: Even if the odds of seven unexplained natural deaths happening on her shift really were one in 342 million, it only proves that all seven weren’t natural. It doesn’t prove which particular ones were unnatural, and it certainly doesn’t prove that all seven were unnatural. So even without the statistical blunder, they couldn’t claim to have proved that she killed any single victim. Yet somehow they convicted her on all of them.

  5. Lee

    I think this could actually be fashioned into a great argument in court against DNA probabilities. Thanks.
