If you get it, it’s a very funny joke:
There are 10 types of people in this world, those who understand binary and those who don’t.
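For anyone who wants the punchline spelled out, it checks out in one line of Python (an illustration, not anything from the post):

```python
# "10" interpreted as a base-2 (binary) numeral is decimal 2,
# so "10 types of people" means two types.
print(int("10", 2))  # → 2
```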
The government’s use of contempt to compel Lavabit’s owner, Ladar Levison, to provide a digital copy of the private encryption key generated huge concern and ideas from the tech community; after all, it was seen as a direct attack on one of their primary goals, the maintenance of privacy from prying government eyes. This generated two levels of discussion in the “what to do about it” category, one from the perspective of lawyers and another from the perspective of hackers (well, maybe not hackers, but it’s a more fun word than technology enthusiasts).
At Techdirt, Mike Masnick sought to mash up the two approaches by addressing the arguments of the big leaguers (though he tosses a trench lawyer in for comic relief).
But there’s an even bigger point in here, which I think Kerr misses entirely, and Greenfield skips over: from a technology standpoint, what the government is demanding of Lavabit is absolutely oppressive and abusive. And, for that, it helps to look at Ed Felten’s discussion of the case, in which he notes that the judge and other DOJ supporters in this case (including, it would seem, Kerr) are basically arguing that “If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access.”
Princeton’s Felten is not a lawyer, but a professor of computer science and public affairs, and so approaches the problem from the hacker’s perspective:
Commentators on the Lavabit case, including the judge himself, have criticized Lavabit for designing its system in a way that resisted court-ordered access to user data. They ask: If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access?
The answer is simple but subtle: There are good reasons to protect against insider attacks, and a court order is an insider attack.
He explains by use of the dreaded analogy:
To see why, consider two companies, which we’ll call Lavabit and Guavabit. At Lavabit, an employee, on receiving a court order, copies user data and gives it to an outside party—in this case, the government. Meanwhile, over at Guavabit, an employee, on receiving a bribe or extortion threat from a drug cartel, copies user data and gives it to an outside party—in this case, the drug cartel.
From a purely technological standpoint, these two scenarios are exactly the same: an employee copies user data and gives it to an outside party. Only two things are different: the employee’s motivation, and the destination of the data after it leaves the company. Neither of these differences is visible to the company’s technology—it can’t read the employee’s mind to learn the motivation, and it can’t tell where the data will go once it has been extracted from the company’s system. Technical measures that prevent one access scenario will unavoidably prevent the other one.
This comports with much of the reaction from tech people here, which essentially embraced the propriety of creating a system that precluded government intrusion. In other words, they saw the Lavabit dilemma as a technological challenge rather than a legal one.
Felten explains why, from a technological perspective, creating an impenetrable system is benign, if not necessary, in opposition to the view that such a system exists only to thwart governmental intrusion. It’s not “anti-government.” It’s to safeguard the system from attack, whether internal or external. This is meant to address the issue of whether Levison and Lavabit’s customers were improperly motivated, and therefore undeserving of privacy.
Others, with less of a public policy concern, didn’t care much about motivations, and instead argued that the solution lies in building a system that the government can’t breach. The thought is that if Levison can’t comply because technology renders it impossible, problem solved.
The tech perspective on a legal problem presents some of the fundamental errors that both got us into this mess and would likely prove disastrous. First, Felten’s point (which Masnick describes as “the bigger” one) that protection of privacy isn’t properly characterized as “anti-government” but rather a safeguard from internal violation of privacy is fine, but narrow.
What if Felten’s second company, Guavabit, were used by a person who kidnapped children for sexual abuse, and the government sought access to this person’s emails to locate the children? That there is a benign purpose to making privacy impenetrable suddenly fades to irrelevance; the need to access email immediately to save these children is paramount, and few people would assert that the availability of technological prowess justified thwarting access.
The second consideration that the hackers fail to appreciate is embodied by the maxim from Branzburg v. Hayes, that society is “entitled to every man’s evidence.” In other words, no person in our society is above being compelled to produce what a court orders him to produce, subject of course to constitutional limits. There are good reasons for this, including a criminal defendant being able to access proof of his innocence.
The technological solution of making an impenetrable system would not only thwart legitimate governmental needs to locate kidnapped children, but would create a nightmare for anyone who refused to comply with court-ordered production because they created or used a system that was effectively designed to preclude compliance, even if the purpose behind the system was benign.
Assume some guy, whom we’ll call Lavison, responded to a warrant that he would be happy to comply, but unfortunately can’t, because he has chosen to employ technology that makes compliance impossible. He didn’t do so to avoid compliance with court mandates, but to provide a system that was secure from other internal disclosure, such as an errant employee who might reveal customers’ private content.
Judge: That’s fascinating, Mr. Lavison, but irrelevant. You made a deliberate choice to employ technology that has the effect of depriving access to evidence that I have determined to be necessary. You may have had the best of intentions, but your choice does not trump your legal obligations under the law to comply with my order. You do not deny you possess the evidence necessary, but claim that you cannot access it as a result of systems you have in place designed to prevent its access. You claim it’s impossible for you to comply? That’s your problem, Mr. Lavison. If so, you made it so. You made the problem. You fix it. You are held in contempt.
As technology advances and creates hard barriers that the law can’t penetrate, the question won’t be the efficacy of technology (which I have no doubt has or will have the capacity to do the trick), but whether the law will fall in line with the limitations technology can deliver. It has never done so before, and has instead used its force to compel technological advances to bend to its will. To assume it will do so now is dangerous hubris.
The question isn’t whether the law meets the expectations of hackers, but whether the courts and government plan to let hackers beat them at their own game. That’s a very risky bet.
Do it for the children.
Sometimes, do it for the children is the answer.
My problem with “do it for the children” (IANAL, a little off-topic, prepared for shitstorm to descend, definitely not a perfect analogy, probably a bit obvious, rambling a bit) is that a (the?) point of “Better 10 guilty [people] go free…” is that there are circumstances in which that person suspected of harming the children has to be allowed to return to maybe continuing that crime. It seems an emotional response…
Not trying to disagree about the legal storm that might fall upon a deliberate user of truly impenetrable technology (which I personally doubt is likely) for failure to comply.
The “children” aspect wasn’t included as a per se justification, but rather to refocus people’s perspective from a governmental purpose (to “get” Snowden, or to invade good people’s privacy) to a different purpose that most people would find justifiable. The point isn’t that it’s a good thing because of “the children,” but rather that there are instances where we will find ourselves less sympathetic toward the person whose privacy is being invaded, or that the invasion is more understandable and comports with more traditional notions of proper law enforcement.
People tend to be concrete in their thinking, and so they only see the problem in terms of the current circumstances when there are other circumstances that change the perspective significantly. The same purpose can be accomplished by replacing the abused child with a kidnapped person, or a missing kitten, whatever serves to remind us that there are times we root for the victim.
As for what will happen, nobody assumes that when they call the cops for a medical emergency, they will end up dead by gunshot. Yet it happens. This has a far higher likelihood, even to the point of being probable. If you want to be the crash dummy, then it doesn’t matter to you, but it will matter to the rest of the digital world if you end up creating law that screws everyone else. Be careful about being too cavalier about what you doubt will happen. The feds are unpleasant folks to screw with.
I am confident that technology will make some evidence inaccessible. VPNs are doing a great job of supporting private browsing, and TOR was doing a great job up until it wasn’t. I am confident there will be leaps forward on each side.
I understand why the court held Levison in contempt for turning over the encryption key in virtually indecipherable text; it would be awful if a defense lawyer got that kind of crap from a prosecutor. I think that turnabout is a binary-enough argument for most tech-heads to eventually get, at least somewhere along the line. Obviously the government will almost never be in a position to provide an encryption key to a defendant, but I hope you will pardon the analogy for the potential broad application in other areas.
There is little doubt that tech has the capacity to do so. But that doesn’t stop the judge from tossing a guy in jail for refusal to comply regardless of whatever tech he’s hiding behind. Therein lies the point that needs to be understood.
Those of us who build and deploy tech specifically intended to prevent government spying do not hesitate to say exactly that: it is the whole point (a feature, not a bug, as it were).
Anyone who can say that the American system of court orders for compelled evidence retains legitimacy in the wake of the rubber-stamp FISA court’s mass surveillance orders – without laughing as they make such a claim – is a fine actor indeed.
Yes, American thugs wearing black robes can do their best to imprison anyone who dares thwart their unchecked power to do and say and compel whatever whim or whimsy dictates to them is “right.” And the rest of us will do our best to stay out of the grasp of this rampaging police state, as it stops pretending to be anything but.
Protect the children? Please. Even as a rhetorical gambit, it’s pathetic. Nobody cares about protecting children in these debates; they’re about government power, and the panic that some people have the ability to limit it via technology. That technology used to be guns, back when white guys in wigs started this whole thing. Now it’s code. Same difference – well, code doesn’t kill people, so there’s that.
What now? American goons pass laws making it illegal to even suggest that someone take steps to protect themselves from American goons? Send out Team America SWAT assault squads to “arrest the world” and teach them to RESPECT MY AUTHORITY? Whatever.
Have fun with the police state you’ve built. The rest of us have opted out, thanks. You might abduct and imprison a free-thinking technologist now and again, just to “prove” how tough you are. It’s not going to suddenly result in code bending to your brief-writing, Order-issuing will. Although maybe the judges can issue orders compelling the fundamental laws of physics in our visible universe to change so that their powers aren’t indirectly hindered thereby.
Pathetic.
Glad to hear you’ve “opted out,” so you and the sovereign citizens can ignore the existence of anything that doesn’t suit your fancy. No doubt you will persuade many with your thoughts, even though the notion that I’ve built a “police state” is batshit crazy, but hey, you’re entitled to believe whatever you want when you’re off your meds.
This is one of the reasons why hackers do so poorly in court. It doesn’t feel much better in a cell knowing that you’re right and the judge is a flaming moron.
You realize we’re not all batshit crazy, right?
Of course. I also know that lurking in the back of many hackers’ minds is a little voice saying what pj_cryptostorm says, even though they realize it’s batshit crazy.
The scary thing is that we’ve normalised obeisance to naked displays of state power as somehow the default – and any questioning of that stance as aberrant or, in the beloved term of the rhetorically challenged, “crazy.”
The reference to meds is also telling as a tired rhetorical trope.
I’m saying – in an intentionally caustic and inflammatory way, to bring forward the underlying schism at play here – something that’s not really subject to debate. Judges don’t get to determine what is or is not possible, technologically. This pisses them off, naturally, but that’s how life goes. Grow up, and get used to it.
Lawyers, largely baffled by and thus deeply intimidated/insulted by technology, find this lack of power on the part of the black-robed ones to be unsettling at an epistemological level. Or so my lawyer friends tell me.
None of this changes the process of turning 1s into 0s.
Perhaps if you parse the structural logic of what I’ve said, above, without resorting to self-satirical assertions of psychiatric instability and lack of meds, you’ll see what I’m pointing out. Or not – it’s not much of an issue to me, either way.
Ozymandias, one suspects, would entirely understand the desire to legislate physical reality. But nobody trembles before his edifice now, do they? All top-heavy, brittle empires fall – well, every one in the history of our species has thus far. Perhaps this one is truly exceptional, American-style. We’ll see.
In the meantime, people build tech that isn’t subject to some goon deciding she should be able to pull a judicial Peeping Tom whenever she feels the itch. And it works. And, no, randomly throwing people in prison won’t change that tech. It does make people feel strong and powerful – guns and violence tend to do that – but it changes nothing about how the physical universe works.
One would think that was somewhat beyond dispute. One would, alas, be wrong in current context.
You’ve done a nice job on this blog of excoriating “hackers” for being dumb about the law. I suspect many technologists are simply disgusted by the perversion that American conceptions of “justice” have become. Sure, there’s a risk this monster will grab any given person and destroy her life in its grinding need to prove its power, over and over. That’s true. Given a limited lifespan as mortal beings, many of us choose to avoid entanglement with this abortion of a “justice system” as much as possible – for good reason.
That, indeed, seems an entirely sane proposition… but perhaps that’s only clear from the outside, looking on.
My initial reaction was TL;dr. You aren’t that fascinating and really have little to say, but I’m going to post this so all can read whatever it is you feel is so important. But not the next diatribe. Hope you got it all out of your system.
And lest we be unclear, this isn’t about excoriating hackers but saving them from themselves. Your rants are a good example of why they need saving. This evil system was in place long before you existed, and it applies to everyone, not just hackers (not that you care about anyone else). It’s the mean guys like me who have been fighting this while you were playing with your Rubik’s Cube. So cut the attitude and spare me the bullshit.
Sure, in the long run, Ozymandias was irrelevant, but while he was in power, I bet he could lop off a few heads. It would be silly to be the guy telling Ozymandias that he won’t obey just because the king will be just a crumbling statue in the future – he would soon find himself without a head.
Why do you feed into this?
There’s some related history re: printing.
First, during the Crypto Wars, it was verboten to export PGP source code. Never mind that it was already out there {See Door, Barn vs Cow, decided decades ago in the Court of Reality}; it was still illegal.
So MIT published the PGP source in a book, in an OCR-able font, grokking that the then-Administration, no matter how hard the NSA and FeeBees were pushing it, was not stoopid enough to try to export-control a book. [This inspired the “Terrorists can’t Type” line many of us recall.]
Separately, and I lack cites here, but I do recall cases where USG refused to release FOIA results in electronic form, instead giving {often poor quality} photocopies. I’m sure this was never about being obstructionist, as USG agencies would never act that way.
But isn’t the point of Lavabit that the individual user/owner of the information stored in their account gets to decide when, and if, they will share their information?
This is very different from the Government’s public demand of Lavabit: unfettered access to all users’ information, to be accessed if and when the government would like, without permission.
Thought of another way, does the government throw the manufacturer of an uncrackable safe in jail if an individual owner of that type of safe is holding evidence in it and won’t comply?
I know you are right that the government can, and will, resort to its favorite tactics of locking men in cages (or worse) when they deny… So I have a question: What’s the end game? If I am correct in the supposition that this is an onerous overreach and abuse of power… What’s next for Mr. Lavison?
Many questions. If you read the post, they will all be answered. You may not be satisfied by the answers, but that doesn’t make them less answers.
Right, well…why stop with the maker of the safe? The courts should also throw the creator of the underlying code, the creator of the servers, and the very architects of the free market system which made it possible for them all to combine their wares into a system which allows the individual owner of a piece of information to refuse to share it if he doesn’t want to. Then, while they are at it, sue God for giving men free will…If God or free will exists.
Hell, then there would be a whole new class of lawsuits possible…Making it possible for more of the deadbeats coming out of law school to have a career.
Uh oh. You’ve gone off the rails again because your grasp of the underlying legal issue is, well, inadequate for a law blog and better suited to, well, anywhere else. The problem relates to the failure of an individual to comply with court mandate, not a disconnected “class of lawsuits.” I realize you’re trying, but, well, you’re trying.
I don’t doubt that a judge could hold Lavison in contempt as you’ve described, but I doubt that it would be lawful.
I want to make sure I understand your argument before I ridicule it.
Are you saying that if I produce a system that allows a person to encrypt a message and pass it through a server that I control to another person who can decrypt it, I have some obligation—legal (i.e. the sovereign can legitimately punish me for not doing so) or moral (i.e. for the chiiiiiildren)—to build it in a way that I can be coopted by the government?
What is this “lawful” of which you speak? Who gets to decide it?
What you call “coopted by the government” is the opposite characterization of Kerr’s “anti-government” business model. The government will see it as fulfilling your responsibility to give evidence upon command of a court. You will argue that systemic encryption makes it impossible. The government will argue that the hinkydoo connects to the thingamajig and, boom, there ya go. The judge will do as he does, as Ladar Levison found out.
And just in case you suspect that the courts will suddenly become attuned to technology and embrace the right to privacy as we do, there’s no reason to anticipate Congress stepping in to outlaw systems that preclude the collection of evidence for the chiiiiildren. Because this could never happen. And maybe 20 years later, the Supreme Court will hold it unconstitutional, long after tech has developed far beyond any of this and all the issues are moot. Amirite?
Edit: Before you ridicule me, let me pre-empt the first wave by noting that I don’t disagree that it would not be a lawful order, but we’re not talking here about the legal propriety of the order, but about whether the tech arguments provide simple solutions to the problem. In other words, not about what’s right, but about what’s real.
Ultimately, the Supreme Court decides what’s lawful. Under current law, “you’re going to jail because you haven’t done what I ordered you to do even though what I ordered you to do is impossible because of things you did before I ordered you to do it” would not hold up. Cf. Maggio v. Zeitz (civil contempt).
Put differently, I don’t have any obligation to give evidence that I don’t have access to. I’m not convinced that an Article III judge would even try it.
In Levison’s system, as I understand it, the hinkydoo connected to the thingamajig. In my system, it will not.
But as you know, the nature of contempt doesn’t lend itself well to appellate review. Whether it is, in fact, an “impossibility,” is a matter to be determined sui generis, and whether impossibility by volition alters the equation, remains to be seen. Even in Maggio v. Zeitz (a bankruptcy turnover proceeding, where the funds at issue were already dissipated), this language presents an obstacle:
I can easily see a judge holding someone in contempt for deliberate defiance, even if it wasn’t in response to the mandate but put in place in anticipation of it. It would be wrong to do, but courts sometimes do wrong. A lot.
I agree with Mark.
If you design a system such that you can put information in, and the users can retrieve it, but you as an administrator cannot, then you do not have possession of the information. It is analogous to giving all the information to someone else for safekeeping, with no way to contact that person (but they can contact you). You do not have access to the information, thus no dominion or control over the information, thus do no possess it.
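A minimal sketch of that design, in Python. This is a toy, not real cryptography (the hash-based XOR stream cipher is illustrative only, and the `Server` class, `keystream`, and `xor_crypt` names are invented for the example): the operator stores only opaque blobs and never holds a key, so an order served on the operator can yield nothing readable.

```python
import hashlib

def keystream(key: bytes):
    """Deterministic byte stream derived from the key (toy construction)."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR data against the keystream; applying it twice round-trips."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

class Server:
    """The operator: holds opaque blobs, never sees a key."""
    def __init__(self):
        self.blobs = {}
    def put(self, user: str, blob: bytes) -> None:
        self.blobs[user] = blob
    def get(self, user: str) -> bytes:
        return self.blobs[user]

server = Server()
user_key = hashlib.sha256(b"passphrase known only to the user").digest()
server.put("alice", xor_crypt(user_key, b"private message"))

# All the operator can turn over is ciphertext...
assert server.get("alice") != b"private message"
# ...while the user, who holds the key, can still read it.
assert xor_crypt(user_key, server.get("alice")) == b"private message"
```

In a real deployment the same property would come from vetted end-to-end encryption, but the structural point is identical: with no key on the server, the administrator has no access, whatever the motivation of the party demanding it.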
If you are arguing that judges can be stupid about technology and abuse their power, then obviously you are correct. That problem should probably be solved from the legal end rather than the technological end of things, however.
Okay, I think I understand your argument at least well enough to ridicule it. Thanks.
I’m here for you, bro.
Mark’s point (and I apologize to Mark if I am oversimplifying it) is that it would be an abuse of authority for a court to hold someone in civil contempt for failing to comply with an order they cannot comply with, e.g., holding someone in contempt for failing to produce something they do not have possession of.
Your point appears to be “As technology advances and creates hard barriers that the law can’t penetrate, the question won’t be the efficacy of technology (which I have no doubt has or will have the capacity to do the trick), but whether the law will fall in line with the limitations technology can deliver. It has never done so before, and used its force to compel technological advances to bend to its will. To assume it will do so now is dangerous hubris.”
The law already knows how to deal with attempts to coerce someone, via civil contempt, into performing an act that is an impossibility. You argued that the court may look backwards, to a time before the person was aware of the possibility of such an order, for voluntary acts that made complying with the order impossible.
Maggio v. Zeitz states the law as I understand it. “Conduct which has put property beyond the limited reach of the turnover proceeding may be a crime, or, if it violates an order of the referee, a criminal contempt, but no such acts, however reprehensible, warrant issuance of an order which creates a duty impossible of performance, so that punishment can follow.”
If you cannot comply, then the court cannot attempt to coerce you into doing something you cannot. You cannot be held in civil contempt if technology renders you unable to comply with the court’s order. If you intentionally created a situation such that you cannot comply (e.g., via technology), then although you may be opening yourself to criminal contempt, you cannot be punished under civil contempt.
Further, if you created the situation rendering yourself unable to comply, but not in anticipation of protecting yourself from a forthcoming court order (and that is what the evidence showed) then I do not think you can be punished with criminal contempt. I suppose Congress could create some sort of strict liability offense covering such a situation, but to my knowledge this is not currently the case.
Until and unless it becomes illegal to fail to build a backdoor into any purportedly secure system, then failure to do so should not subject anyone to contempt, be it civil or criminal. Sorry if my point was unclear in my original post.
The Judge is failing my Turing test.
Pingback: Tough Guys and Buttercups | Simple Justice
The real technological barrier may be a very old one: geography. To the dismay of the FeeBees & the Fort, there *are* other countries in the world, and not all of them roll over/play dead when YKW comes banging on the door.
Maybe, or maybe not, in the Sultanate of Kinakuta; but I saw a report on how the cloud hosting business in Switzerland was growing fast. I was always thinking Iceland myself but hmm, wonder if that’s a business model Raul could use to rescue their economy.
Of course, if all tech off-shored, got no income from and delivered no services to the US, it would be free of our influence. But that’s not what we’re dealing with here.
Being offshore is hardly a barrier to delivering Lava type services…unless you are China and build a Great Firewall……
Whether being offshore is sufficient to remove one from the reach of the United States is the question.
Rather than starting with what the government can currently force people to reveal and expand, how about we work out from what they cannot.
If we start with the premise that your thoughts cannot be compelled unless you choose to share them, it follows that telepathic communication could not be compelled either, because it consists of shared thoughts. Could it not also be argued that encrypted communications between two parties who do not wish any third party to access those shared thoughts are exactly that: shared thoughts?
I think that’s an excellent argument, and in fact it comprises in large part my thoughts on why old 4th Amendment law cannot be applied to the digital world by analogy. It requires an end to the third party doctrine, and recognition that devices (such as smartphones, tablets, computers) are repositories for thoughts (akin, if I have to make an analogy, to personal and confidential diaries), not just things. Should the courts adopt this view, it would serve to address a great many problems, this included.