All Because The Wrong Box Was Checked

Engineers and clerks love systems. They’re repetitive, conceptually sound and facilitate the movement of huge numbers of widgets, be they ball bearings or human beings, from the beginning to the end of the system. When the occasional bad widget comes out the end, it can be chalked up to an expected failure rate.

When the bad widget turns out to be a human being, the consequences can be devastating and nearly impossible to undo.  Systems, it turns out, are designed to move widgets only in one direction, and are not designed to take into account their own fallibility. Systems don’t like to admit to failure, and fight efforts to correct their mistakes.

The findings and conclusions of Northern District of California Judge William Haskell Alsup in the case of Stanford Ph.D. student Rahinah Ibrahim have issued, duly redacted by that other branch of government, which is in charge of making sure none of its secrets are revealed.

Mind you, Ibrahim was refused access to the United States in 2004, ten years ago. She has since been through a federal trial, all to ascertain why in the world she was on the no-fly list, which the government refused to confirm or explain because it’s a secret. A big, big secret.  And a mistake.


Posed for a magazine? Posed for a poster on people who were screwed by government incompetence? Posed a threat?  Is that really necessary to redact, because no one knows that the government perceives certain people to pose threats of terrorism?

But being merely wrong about someone happens all the time, like believing a person committed a crime when he didn’t. This isn’t about making a mistake.

In November, 2004, Dr. Ibrahim’s name was “nominated” to various federal “watchlists” by Special Agent Michael Kelley, including the VGTOF, the Violent Gang and Terrorist Organization File. It has that bad word in it, and, well, that was that. Except for one thing:


And once Kelley introduced Ibrahim into the system, the system took over, propagating her mistaken characterization as a terrorist through all the other cool systems the government has to protect and defend us.  The system worked wonderfully from that point forward, making sure that every other arm and leg of government would have the information about this newly discovered terrorist, so that she wouldn’t slip through the cracks as some contend had happened before.

Except for the GIGO issue: garbage in, garbage out.  Kelley had only to check a box to make the system function properly, but Kelley was a Special Agent. Apparently, more “special” than anyone knew, and so the instructions failed him and the wrong box was checked.  The human factor is always the weakest link in systems, because humans are weak, unpredictable and unreliable.

But these systems invariably involve humans at some point, and ultimately affect humans at another.  Indeed, even now, when Judge Alsup has found that this was just a silly ten-year, life-and-soul-crushing affair for a person who did nothing to deserve it because another person checked the wrong box, he still can’t know whether there is a viable remedy. Why? Systems.

The interconnected systems within the government are very good at putting names in and spreading them about.  They are not, apparently, very good at removing names mistakenly put in.

While this post could well end here, my attention was first drawn to the trial by Judge Alsup’s apparent outrage at the government’s refusal to allow a witness, Ibrahim’s daughter, Raihan Binti Mustafa Kamal, a United States citizen, to come to the United States to testify.  And, like me, you probably want to know the truth about what the government did: how it prevented the witness, a citizen, from getting on a plane and returning to her own country.

Well, this is how Judge Alsup explains it:


No doubt this will bring you as much comfort as it does me. After all, if you can’t rely on a system, or a government, what can you rely on?


18 thoughts on “All Because The Wrong Box Was Checked”

  1. Brett Middleton

    “He intended to nominate her to the …” Wait. What? Perhaps Ibrahim was better off on the No Fly list rather than being on some other list so secret we’re not even allowed to know the name of it. It may be hard to fight inclusion on the No Fly list, but it’s impossible to fight inclusion on a list that we don’t even know exists. That one little redacted statement seems to be giving us a peek at an abyss more frightening than we knew.

  2. John Burgess

    Am I misreading the decision? I think (always subject to correction) that come April 15, 2014, all those black lines disappear and the text is suddenly revealed. The reason they’re currently redacted is that the court is giving the gov’t one last chance to argue that they should not be revealed.

    1. SHG Post author

      I think you’re giving too much credit to rhetoric that never quite pans out. Threats don’t beget action, but more fist shaking and threats about “don’t do that again.” Other than that, the black lines will be there long after you and I are gone.

    2. Jim Tyre

      April 15 is the deadline imposed by Judge Alsup for the government to appeal and get an order from the Ninth Circuit keeping all those cute black lines in place until the Ninth Circuit is done wrangling with the issue. So, maybe we see more on April 16. But I’m not holding my breath.

  3. Josh C

    With respect, you’re dead wrong. Systems do whatever they are designed to do. If a system has no feedback loops, it’s because someone designed it that way, and that someone is responsible for the results.
    “Systems” is in no way an excuse for incompetent design.

    1. SHG Post author

      It took longer than I anticipated for someone to make this point. In theory, you’re right. A correctly designed system should be, in essence, perfect. But as long as systems are designed by humans, are of any complexity whatsoever, and involve humans at any stage of the process, they never meet theoretical perfection because we, humans, are not perfect. So you are absolutely right, and yet wrong.

      1. Josh C

        Well, yes: an ideal system would be perfect. Designing your system as though it and all its inputs will be perfect is bad practice though.

        For comparison, think of the legal system: obviously, you want trials to go right the first time, and the whole system should be designed to produce that result. However, you still sanitize your inputs (e.g. via grand juries), and have multiple, extensive feedback mechanisms (appeals, etc).

        (separately, that process was apparently often bypassed or deliberately abused, but that’s anecdotal)

        1. SHG Post author

          And after a couple hundred years and a few million tweaks, look how well that system is working. Which, I might add, is one of the reasons they want to redesign it.

  4. John Barleycorn

    Were you listening to Geoff Muldaur’s rendition of Aquarela do Brasil in the background when you wrote this post?

    Archibald Buttle ring any bells?

    The length of time it has taken to unravel the mistakenly checked box in this situation makes me wonder whether the outrage isn’t almost entirely, surrealistically, misplaced by design: all of it aimed at this little pebble that cracked loose, was exposed and then mostly redacted, while the trillion-ton boulder keeps rolling down the hill.

  5. Nigel Declan

    If the person who designed the no-fly checkbox system had designed nuclear submarines, the secret code-verification protocol and two-key launch system would have been replaced by a big ol’ red button with the words “Nuke It!” written on it.

  6. darrtl

    “Systems, it turns out, are designed to move widgets only in one direction, and are not designed to take into account their own fallibility. Systems don’t like to admit to failure, and fight efforts to correct their mistakes.”

    I think that is a bit of a simplistic view of systems, and not really reflective of real life. In real life, “systems” and the engineers who design and run them employ QA and TQM methods, take failure very seriously, seek out and identify potential mistakes, and make efforts to continuously review, improve and modify their systems to reduce the Cost of Non-Conformance (CoNC), all in pursuit of the TQM motto: “Doing the right things, right, first time, every time.”

    1. SHG Post author

      It is a bit simplistic, but then, it doesn’t pretend to be a textbook on systems. No doubt systems engineers do everything they can to avoid failure, and take it seriously. But as long as systems rely on a human component, they fail. Mottos don’t change that.

  7. EricE

    The problem with making things foolproof is fools are so darn ingenious….

    Interesting post and blog – thanks! It’s going on my reading list.
