Government To Apple: We’re Ba-aa-ack

It was quiet. Calm. A couple of weeks ago, the government claimed it found a hack and no longer needed Apple’s help, though it deeply appreciated Mag. Sheri Pym’s time. Thanks, but no thanks. We’re good, said the government.

All those amici, the arguments, the op-eds, the TV appearances by Jim Comey and Cy Vance, explaining why terrorism was Apple’s fault; all for naught.  So we turned our attention back to what Kim Kardashian had to say. But while we were snoozing, the government was getting ready to clean up loose ends.

The government has notified EDNY Judge Margo Brodie that it will challenge Magistrate Judge James Orenstein’s adverse Apple order.  The government has obtained a new order from Boston Mag. Marianne B. Bowler requiring Apple to crack an iPhone.  Senators Richard Burr and Dianne Feinstein have released a “discussion draft” of a proposed bill to require a backdoor be built into encryption, and a new op-ed by some ex-government counterterrorism guys will appear in the New York Times explaining why Apple is hurting its customers by refusing to be a good corporate citizen.

What? You thought this was done?

Why does all this matter? Simple: The longer it takes Apple to patch this vulnerability (either because the government discloses it to Apple or because Apple figures it out on its own), the longer iPhone user data is at risk — both from the government and from criminals.

It’s unclear whether the government actually found an exploit for an old version of the iPhone. It says it did, and some people trust the government’s word implicitly, because the government would never lie. But others are skeptical.  The Farook iPhone was never considered a serious source of evidence, even post hoc, but rather a burner for making the pitch to force Apple to crack it.

The carrot at the end of the government’s stick is that if Apple plays ball this time, the government will give up its hack to Apple. One door closed, as long as they leave another door open.  Fair enough, Apple?

So privacy advocates should carefully weigh the costs and benefits of Apple’s decision for both the company and its customers. If Apple had complied with the California court’s order, it could have used a method to obtain access that was known only to the company. The vulnerability would be safe, or at least, safer (hackers could eventually re-create this method).

Mutual back-scratching makes all itches go away, right?  And it’s not like this comes from some pro-government hacks.

For those of us concerned with government overreach and privacy, there would be three clear benefits to Apple’s cooperation. First, by forcing the government to come to it in the first instance, Apple retains the ability to litigate court orders — assuming its position is not to refuse to cooperate no matter the particular facts — and thereby challenge the government in specific cases of possible overreach.

No doubt the authors of the op-ed are “concerned with government overreach and privacy.” They’re concerned the government won’t get exactly what it wants.

Jamil N. Jaffer, an adjunct professor at the George Mason University School of Law [now, #ASSLaw], was an associate counsel to President George W. Bush. Daniel J. Rosenthal, an adjunct professor at the University of Maryland, was director for counterterrorism with the National Security Council under President Obama.

And because they speak for a benign sovereign, it’s not as if Apple will get nothing out of complete capitulation.

Second, Apple can choose to protect the method for lawful access as closely as it wants, including by storing it in the same way that it stores its source code or its software signing keys. And, perhaps most important, Apple would leave it to the courts to do their job to figure out whether the F.B.I. has probable cause for a search.

Apple probably has a closet somewhere with a couple of locks on it where it stores the source code and the recipes for Coca-Cola and Kentucky Fried Chicken. And who wouldn’t trust the courts to “do their job to figure out whether the F.B.I. has probable cause,” since warrants are so carefully scrutinized and are only approved when absolutely justified?

The San Bernardino test case before Mag. Pym lit the privacy and encryption world afire.  From all corners, people lined up to fight what would likely prove to be the undoing of privacy at the end of a government’s wrench.  But can they keep it up?

This is a war of attrition.  The government will keep it going, on multiple fronts. Sure, there will be hackers, devs, coders, geeks who will make this their life’s battle, but they don’t really understand that the law isn’t binary and there are only 10 types of judges. Will the legal advocates be willing to keep changing the captions on their briefs? Will they get tired, bored, distracted by the next shiny issue?

And will the public be capable of fighting off each new public relations blitz, each new spin, about why the government is really here to help, it’s only doing this to protect us from the terrorists, the child rapists, the drug kingpins, and the terrorists (again)? After all, if feminists can convince lawmakers and the public that affirmative consent isn’t batshit crazy, certainly the government can convince the public that a backdoor to encryption will save them.

The government only needs to win the battle once. It only needs one order, one teensy weensy order, to compel Apple to give it up, to create the backdoor, and it will be there forever.  And the zeal with which it was put to the test before Mag. Pym will wane over time. The government can be patient. It’s in it for the long game.  Eventually, we’ll miss one, or just say “screw it” because we’re too busy fighting some other battle or playing Candy Crush. And boom, they’ll get their order.

Do we have the attention span to fight battle after battle with the government? They’re betting we don’t.  The government isn’t so much “ba-aa-ack” as it never went away. They have no plans to go away. They’re just waiting for us to go away.


14 thoughts on “Government To Apple: We’re Ba-aa-ack”

  1. Jim Tyre

    The government has obtained a new order from Boston Mag. Marianne B. Bowler requiring Apple to crack an iPhone.

    That’s what the article you link to (and many others) says, but it ain’t so. Unlike San Bernardino and Brooklyn, the Boston order is exactly what Apple has always agreed to. The language of the Order comes straight from Apple’s own Legal Process Guidelines for U.S. Law Enforcement. Your basic point is good, Boston just isn’t a good illustration of it.

    1. SHG Post author

      The articles state that the government is seeking to compel Apple to decrypt the phone. The original order, dated February 1st, does not require Apple to do so. The articles state that they now are. The docket in the case shows another motion and order, both dated April 8th.

Is this all wrong? Could be. Do you have information to show that you are right and they’re all wrong? Is your information so compelling that you needed to write this comment? Are you that sure that you’re right and everyone else is wrong?

      1. Jim Tyre

        As you state, the February Order does not require Apple to do so. There are two docket entries yesterday (the only ones since February), but they don’t change the Order. The first is a pro forma motion to unseal, the second is the grant of the motion. So yes, I am right. Was my comment the most compelling ever made? No. Did it add a pertinent fact to the discussion? I thought so, but you’re free to disagree, of course.

        1. SHG Post author

          You didn’t add a pertinent fact. You subtracted what you believe to be an inaccurate fact despite the myriad articles to the contrary. Of course, it would be ironic if they’re accurate and you’re wrong, but if you feel that it’s so critical to be the voice speaking on behalf of the government, who am I to disagree?

  2. Scott K

    When I first read the draft bill, I thought that it explicitly didn’t require back doors to be added due to this part:

    Nothing in this Act may be construed to authorize any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity.

    Both your reaction to it and many of the comments on techdirt were the opposite. Since IANAL, I assume I’m missing something reasonably obvious to those of you that are. Is it that while no specific design is required, you have to provide the data (however you manage it), so implicitly the manufacturer has to have a way in?

    1. SHG Post author

      I’m perpetually perplexed by a commenter’s need to write a comment like this. Techdirt explains it in brutally simple terms, including the line you quote. If you still can’t grasp why you’re wrong, then there is no hope for you.

      1. Scott K

        OK. It was a failure of imagination. Thanks for the kick in the head to get the cobwebs out.
