If you get it, it’s a very funny joke:
There are 10 types of people in this world, those who understand binary and those who don’t.
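For readers who don't get it, the joke turns on base-2 arithmetic: the digits "10" read in binary denote the number two, so the sentence names exactly two types of people. A quick sketch (the Python is mine, not the post's):

```python
# "10" is a string of digits; its numeric value depends on the base
# you read it in. In decimal it is ten, in binary it is two.
assert int("10", 10) == 10  # base ten: ten types of people
assert int("10", 2) == 2    # base two: two types of people
```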
The government’s use of contempt to compel Lavabit’s owner, Ladar Levison, to provide a digital copy of its private encryption key generated huge concern, and no shortage of ideas, from the tech community; after all, it was seen as a direct attack on one of their primary goals, the maintenance of privacy from prying government eyes. This generated two levels of discussion in the “what to do about it” category, one from the perspective of lawyers and another from the perspective of hackers (well, maybe not hackers, but it’s a more fun word than technology enthusiasts).
But there’s an even bigger point in here, which I think Kerr misses entirely, and Greenfield skips over: from a technology standpoint, what the government is demanding of Lavabit is absolutely oppressive and abusive. And, for that, it helps to look at Ed Felten’s discussion of the case, in which he notes that the judge and other DOJ supporters in this case (including, it would seem, Kerr) are basically arguing that “If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access?”
Princeton’s Felten is not a lawyer, but a professor of computer science and public affairs, and so approaches the problem from the hacker’s perspective:
Commentators on the Lavabit case, including the judge himself, have criticized Lavabit for designing its system in a way that resisted court-ordered access to user data. They ask: If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access?
The answer is simple but subtle: There are good reasons to protect against insider attacks, and a court order is an insider attack.
He explains by use of the dreaded analogy:
To see why, consider two companies, which we’ll call Lavabit and Guavabit. At Lavabit, an employee, on receiving a court order, copies user data and gives it to an outside party—in this case, the government. Meanwhile, over at Guavabit, an employee, on receiving a bribe or extortion threat from a drug cartel, copies user data and gives it to an outside party—in this case, the drug cartel.
From a purely technological standpoint, these two scenarios are exactly the same: an employee copies user data and gives it to an outside party. Only two things are different: the employee’s motivation, and the destination of the data after it leaves the company. Neither of these differences is visible to the company’s technology—it can’t read the employee’s mind to learn the motivation, and it can’t tell where the data will go once it has been extracted from the company’s system. Technical measures that prevent one access scenario will unavoidably prevent the other one.
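Felten’s argument can be restated in code. The sketch below (hypothetical names, mine rather than Felten’s) models everything a company’s technology can actually observe about a data-export event; the employee’s motivation and the recipient’s true identity are simply not among the observable fields, so any technical control that blocks one scenario necessarily blocks the other.

```python
from dataclasses import dataclass

@dataclass
class ExportRequest:
    """One employee copying user data out of the system."""
    employee: str
    dataset: str
    destination: str  # an opaque address; the system can't tell who is behind it

def audit_view(request: ExportRequest) -> tuple:
    # Everything the company's technology can actually observe.
    # Motivation (court order vs. cartel bribe) is not a field: it's invisible.
    return (request.employee, request.dataset, request.destination)

# A court-ordered copy and a bribe-induced copy produce identical audit views,
# so no technical measure can permit one while preventing the other.
court_order = ExportRequest("alice", "user_mail", "198.51.100.7")
cartel_bribe = ExportRequest("alice", "user_mail", "198.51.100.7")

assert audit_view(court_order) == audit_view(cartel_bribe)
```

The design point is that access controls key on observable attributes; since the two scenarios are indistinguishable on every attribute the system sees, they are one scenario as far as the technology is concerned.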
This comports with much of the reaction from tech people here, which essentially embraced the propriety of creating a system that precluded government intrusion. In other words, they saw the Lavabit dilemma as a technological challenge rather than a legal one.
Felten explains why it’s benign, if not necessary, from a technological perspective to create an impenetrable system, in opposition to the view that such a system exists only to thwart governmental intrusion. It’s not “anti-government.” It’s to safeguard the system from attack, whether internal or external. This is meant to address the question of whether Levison and Lavabit’s customers were improperly motivated, and therefore undeserving of privacy.
Others, with less of a public policy concern, didn’t care much about motivations, and instead argued that the solution lies in building a system that the government can’t breach. The thought is that if Levison can’t comply because technology renders it impossible, problem solved.
The tech perspective on a legal problem presents some of the fundamental errors that both got us into this mess and would likely prove disastrous. First, Felten’s point (which Masnick describes as “the bigger” one) that protection of privacy isn’t properly characterized as “anti-government,” but rather as a safeguard against internal violation of privacy, is fine, but narrow.
What if Felten’s second company, Guavabit, were used by a person who kidnapped children for sexual abuse, and the government sought access to this person’s emails to locate the children? That there is a benign purpose to making privacy impenetrable suddenly fades to irrelevance; the need to access the emails immediately to save these children is paramount, and few people would assert that the availability of technological prowess justified thwarting access.
The second consideration that the hackers fail to appreciate is embodied by the maxim from Branzburg v. Hayes, that society is “entitled to every man’s evidence.” In other words, no person in our society is above being compelled to produce what a court orders him to produce, subject of course to constitutional limits. There are good reasons for this, including a criminal defendant being able to access proof of his innocence.
The technological solution of making an impenetrable system would not only thwart legitimate governmental needs to locate kidnapped children, but would create a nightmare for anyone who refused to comply with court-ordered production because they created or used a system that was effectively designed to preclude compliance, even if the purpose behind the system was benign.
Assume some guy, whom we’ll call Lavison, responded to a warrant that he would be happy to comply but, unfortunately, can’t, because he has chosen to employ technology that makes compliance impossible. He didn’t do so to avoid compliance with court mandates, but to provide a system that was secure from other internal disclosure, such as an errant employee who might reveal customers’ private content.
Judge: That’s fascinating, Mr. Lavison, but irrelevant. You made a deliberate choice to employ technology that has the effect of depriving access to evidence that I have determined to be necessary. You may have had the best of intentions, but your choice does not trump your legal obligations under the law to comply with my order. You do not deny you possess the necessary evidence, but claim you cannot access it as a result of systems you have in place designed to prevent its access. You claim it’s impossible for you to comply? That’s your problem, Mr. Lavison. If so, you made it so. You made the problem. You fix it. You are held in contempt.
As technology advances and creates hard barriers that the law can’t penetrate, the question won’t be the efficacy of technology (which I have no doubt has or will have the capacity to do the trick), but whether the law will fall in line with the limitations technology can deliver. It has never done so before, instead using its force to compel technological advances to bend to its will. To assume it will do so now is dangerous hubris.
The question isn’t whether the law meets the expectations of hackers, but whether the courts and government plan to let hackers beat them at their own game. That’s a very risky bet.