U.S. v. Apple: Never Mind

Didn’t see this coming.

But this was a given.

Federal court in Calif. has canceled tomorrow’s hearing on the FBI’s request to force Apple to unlock an iPhone. pic.twitter.com/IoS4y3IBOr

And Magistrate Judge Sherri Pym, rather than question what the hell was going on, gave the government what it asked for. Ex parte, mind you. Because it’s the government and what mag. doesn’t want to get a hot potato like this off her hands? Who can blame her?

So what does this mean?  Given the intense interest, the significance of the issue at stake, and the scope of involvement of amici, there is good, no, excellent, reason to wonder.

At Volokh Conspiracy, Orin Kerr offers his take.

If the government can access the phone through a third party’s alternative, however, then this legal challenge goes away without a ruling. If that happens, neither side will look good in the short term. The FBI won’t look good because it went to court and claimed it had no alternatives when an alternative existed. The whole case was for nothing, which will raise suspicions about why the government filed the case and the timing of this new discovery. But Apple won’t look good either. Apple claimed that the sky would fall if it had to create the code in light of the risk outsiders might steal it and threaten the privacy of everyone. If outsiders already have a way in without Apple’s help, then the sky has already fallen. Apple just didn’t know it.

There’s an implicit trust in the government’s allegations here.  While Orin shows some skepticism, notably by his putting the word “can” in italics, he goes on to discuss how this would reflect poorly on both sides, Apple as well as the government.

Is it possible that the government’s assertion is true? That after all that has happened in this circus, it can access the iPhone without Apple’s cooperation?  Of course. Kinda ridiculous for the government to have gone this far down the road before it came to such an epiphany, but it’s possible. Like space aliens.

But still, Orin’s skepticism is limited by an inexplicable faith in the government’s honesty, that it wouldn’t claim to Mag. Pym that it has an alternative if it didn’t. Notably, there is no commentary from the perspective of “what if the government is totally full of shit, lying through its teeth and doing this to try to salvage its dignity given that neither the law nor public opinion is backing the government up.”

In the interest of balance, I sacrifice myself on the altar of skepticism. I call bullshit.

This was never a case about what the government needed to unearth on the San Bernardino iPhone to stop the terrorists. There were other phones, trashed, that might have held secrets, but not this phone.  Way too much time has already elapsed for anything recovered (not that there is any reasonable belief that anything would be found) to be worthwhile.  Anyone who could potentially be revealed has had enough time to take a cruise around the world and still go into hiding from the government.

No, the “need” to get into this iPhone is nonsensical.

So why then start this dog and pony shitshow?  Glad you asked. This was the scenario the government was waiting for to have its showdown with technology. All tech, not just Apple.  The stars aligned, the facts were as favorable to the government as they were going to get, the appeal to fear and safety would never be stronger. Mass murder by terrorists on American soil. It wasn’t going to get any better for the government than this, and so the government decided that this was where the battle would be fought:

Personal privacy versus the government’s ability to protect Americans from terrorism.

They pulled out all the stops. They called in every handsome, sincere, serious dude in law enforcement. They went on TV to tell us how much they love us, and how they were doing this for us.  Don’t we want to survive? Don’t we fear the terrorists? We can’t let the terrorists win.  Love the government. Trust the government. Would we lie?

Except it didn’t work.  At least, not enough to be certain that they would prevail before the two courts that mattered, Mag. Pym’s and the court of public opinion.  That they blew it before Mag. Orenstein didn’t help, but the case before Mag. Pym was in the center ring, the big show, winner take all.

Is this right? Is this cynical? Is this crazy?  Maybe. Maybe the government is telling the absolute truth, that they inexplicably suddenly discovered a flaw and are now about to exploit it. After all, it’s technology, and flaws seem to be the only consistent thing that permeates all of it. Oh wait, flaws and shiny. I forgot about shiny, which is what blinds us from the flaws.

But if there is anything less trustworthy than technology, it’s the government’s claim that it’s only here to help, and we can trust the government to never abuse its power.

Will we know which it is? Will we know if there really is a flaw in Apple’s security that the government can now exploit, or whether the government is full of shit and trying to save face by taking its motion off the table?  A damn good question.

41 comments on “U.S. v. Apple: Never Mind”

  1. Turk

    I go with option 2: The gov’t thought it would lose and cried Uncle.

    And I can’t help but notice that Apple had 6 lawyers on that telephone conference call. Just to say that they weren’t opposed.

      1. Mort

        *looks at current candidates for President*

        I’m starting to be OK with them winning, at this point…

      1. SHG Post author

        That may have been the worst conference I’ve ever read. I should delete the link just to make sure no young lawyer reads it and think that’s the way competent lawyers argue. It was fucking awful.

  2. Patrick Maupin

    Yeah, the entire thing has been poor strategy combined with worse tactics from the get-go. The FBI may have just learned (OK, “seen”, not “learned”) the difference between requiring the public and congress to be asleep, and requiring the public and congress to remain asleep.

    The final “let’s tell everybody how shoddy Apple’s security is” will only work if Apple plays along and begs too hard — obviously any communications from Apple are going to be made public if that can imply any weakness.

    At the end of the day, this is only going to help Apple sell the new, improved shiny, now with somewhat improved security. Hell, if Apple can point to concrete technical improvements like hardening from differential cryptanalysis, at this point, even running John Oliver’s fake commercial could probably sell phones for them.

    1. SHG Post author

      Meh. All Apple need do is announce that the new iPhone comes in a color never before available, and it will sell like hotcakes.

    1. SHG Post author

      Maybe. Maybe not.

      Researchers at Johns Hopkins only tested their attack on older iPhones, but Mr. Green told The Post that a nation-state should be able to use a modified version to access encrypted videos and photos sent over phones running Apple’s most current operating system.

      1. Turk

        I can just see it now:

        Gov’t figures out how to hack in.
        Apple figures out where the bug is and closes the door with a software update.
        Apple accused of obstruction of justice.

    2. Mark

      That particular issue is finding a key to data (iMessage data) on iCloud. It wouldn’t help unlock specific things held only on the phone. It would undo the mess of changing the iCloud password, and thus let them see whatever is in that account.

      As usual, encryption failures come not from failures of the actual cryptography, but from the infrastructure supporting it. Both the Johns Hopkins attack and the FBI’s desired approach are basically brute-force key guessing. And the “64-digit” key is exponentially harder than guessing the phone’s PIN; there’s just no failure-count lockout on it.

      Also, considering the extraordinarily close ties between Johns Hopkins and the NSA, it’s almost guaranteed that this was first disclosed to the three letter agencies.

      It may, however, be close enough for the FBI and DOJ to use it to cover their asses and run away without the risk that Apple could get a judge to force them to publicly disclose an actual interesting vulnerability.
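Mark’s point about the gap between the two keyspaces can be made concrete with a little arithmetic. This is illustrative only; the guess rate below is an assumption, not a measured figure:

```python
# Illustrative keyspace arithmetic: a 4-digit PIN vs. a 64-hex-digit
# (256-bit) key. The guesses-per-second figure is an assumption.

pin_space = 10 ** 4                  # 10,000 possible 4-digit PINs
key_space = 16 ** 64                 # 64 hex digits = 2**256 possible keys

guesses_per_second = 1_000_000_000   # assume a generous billion guesses/sec

pin_seconds = pin_space / guesses_per_second
key_years = key_space / guesses_per_second / (60 * 60 * 24 * 365)

print(f"4-digit PIN: {pin_seconds} seconds to exhaust")   # a blink
print(f"256-bit key: {key_years:.2e} years to exhaust")   # effectively never
```

“There’s just no failure-count lockout” is the operative part: the 4-digit space is only safe because the device throttles and wipes, not because the math is hard.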

      1. Jim Tyre

        The bug the Johns Hopkins folks found has nothing to do with the specific issue in the case before Judge Pym. Consistent with ethical infosec practice, they disclosed it first to Apple and didn’t announce it until Apple had a fix. That fix was released by Apple yesterday, in iOS 9.3.

        It does illustrate that Apple doesn’t have perfect security. But then, no reasonable person should have thought that Apple did.

  3. Mark

    Is there any possibility that Apple could seek some kind of declaratory judgment against such ex-parte warrant applications?

      1. Mort

        And if they could, I’d like to see the federal judge that would do so, because Government.

  4. Keith

    After the attack in Brussels, perhaps the Government will accidentally upgrade the phone to the version released (did it fix the bug?) yesterday and go for another bite of the Apple.

    Stay tuned (and scared, very scared).

    1. SHG Post author

      Thank you for this orthogonal yet off-topic use of current unrelated events. We can never have enough of them.

      1. Keith

        I was referencing how the case started, when the Government “accidentally” ordered the password for the iCloud account changed, which created the situation where the All Writs Act came into play.

        How coincidental that despite the fact they would have had access to all the info on the phone, one simple accident made all the facts line up in a case where they could claim terrorism and use the AWA.

          1. Alex Bunin

            The government simply changed the password from ISIS to ISIL. Works on all smartphones but Blackberries. Those are impregnable.

            1. MelK

              > If Blackberries are impregnable, how do we get baby Blackberries?

              Vines. Six seconds is enough.

          2. David

            There is a slight detail that came to my attention recently. In iOS 8, after a device restart, an iPhone will not automatically join a secure or open WiFi network until the device is unlocked via the passcode.

            So, if the phone was ever restarted, drained its battery or was otherwise shut down, the iCloud avenue could have been closed that way too (since the automatic backup only occurs when on WiFi and while charging). Though this still doesn’t do much for the clearly intentional iCloud account password change.

  5. Jim Tyre

    OMFG, my tweet leads off the esteemed one’s blog post. I’ve made it, I’ve made it!

    (If I have something of actual substance to say, I’ll come back later. I just need to bask in the moment.)

  6. John

    I’m sad thinking of the questions that could have been asked instead of just granting this ex parte:

    Was the FBI previously aware of this approach?

    Has the FBI tried this approach before?

    If the FBI was previously aware, why was it not tried before filing the initial motion?

    Did any government agency the FBI contacted recommend this approach before?

    Why is this being tried now?

    Is the party with this method a private or government entity?

    If a government agency, was this one previously contacted?

    If a government agency, and not contacted, why not?

    Who initiated the contact with the other?

    When was contact first made?

    When was the offer to “hack” the iphone by the party first made?

    When was the offer to “hack” the iphone accepted?

    Tastes great or less filling?

    Oh Pym. I can’t even.

  7. Bijan

    Orin Kerr’s take on this is pretty bad.

    There is a huge difference between a vulnerability that Apple can patch and a permanent Government mandated backdoor which also sets the precedent for a permanent backdoor in all electronic devices.

    The idea that these two things are similar is either deliberately obtuse or mind-bogglingly naive.

  8. solaric

    Orin Kerr’s opinion on this seems to be remarkably poor in a number of ways, both in terms of legal arguments and technical specifics. In particular,

    But Apple won’t look good either. Apple claimed that the sky would fall if it had to create the code in light of the risk outsiders might steal it and threaten the privacy of everyone.

    First, I guess technically “Apple won’t look good” because there are lots of people like Orin Kerr who have minimal clue about the specifics involved, but it won’t necessarily be through any fault of Apple’s. The specific iPhone in this case, the 5C, used the now very old Apple A6 platform, which lacked the hardware-level crypto engine (“Secure Enclave” in Apple’s case, but there are many implementations of the same idea, including ARM’s “TrustZone,” Intel’s “Trusted Execution Technology,” etc.) and security features that devices based on the A7 and later had (starting with the iPhone 5S, released September 2013). In particular, the A6 has an unsecured AHB/AXI bus, which means a sufficiently advanced attacker can use DMA to do arbitrary code injection. Alternatively, the NAND flash chips themselves can be desoldered and their encrypted data copied off; even if the device “deletes” the data after 10 failed tries, the copy can simply be written back and the attacker can try again. There are a number of refinements to this basic idea that would make it go faster, but at any rate it’s not particularly complex, just somewhat labor- and skill-intensive. It’s trivially within reach of a nation-state-level actor, and everyone in tech knew about it from the very start, as did Apple; that’s why Apple moved to something much stronger over two and a half years ago. Jonathan Zdziarski (a forensic scientist specializing in iOS) had a short article on his blog yesterday [0] talking about some of this, if anyone is interested in more specifics, along with some other, less likely possibilities.
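The desolder-copy-retry loop described above can be sketched as a toy simulation. Everything here is a hypothetical stand-in (the class, names, and numbers are invented for illustration); a real attack operates on physical flash chips, not Python objects:

```python
# Toy simulation of NAND mirroring: back up the flash image, burn through
# failed PIN attempts, and re-flash the backup before the wipe can stick.
import copy

class ToyPhone:
    """Hypothetical stand-in for a device that wipes after 10 bad tries."""
    def __init__(self, pin, nand):
        self.pin = pin
        self.nand = nand          # encrypted storage contents
        self.failures = 0         # in this toy, the counter travels with the image

    def try_pin(self, guess):
        if guess == self.pin:
            return True
        self.failures += 1
        if self.failures >= 10:   # auto-erase threshold
            self.nand = None      # data "wiped"
        return False

def mirror_attack(phone):
    """Brute-force every 4-digit PIN, restoring the NAND image
    before the 10th failure so the wipe never triggers."""
    backup = copy.deepcopy(phone.nand)           # copy flash contents off-chip
    for candidate in range(10_000):
        if phone.failures >= 9:                  # about to hit the wipe?
            phone.nand = copy.deepcopy(backup)   # re-flash the saved copy
            phone.failures = 0                   # counter restored with the image
        if phone.try_pin(f"{candidate:04d}"):
            return f"{candidate:04d}"
    return None

phone = ToyPhone(pin="7391", nand={"photos": "..."})
print(mirror_attack(phone))  # -> 7391, with the data intact
```

The refinements solaric mentions are about making each restore cycle cheaper; the structure of the attack is just this loop.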

    Second, Apple’s claims about “the sky would fall” seemed to have much more to do with the fundamental principle of an unrelated third party being forcibly conscripted into a task against its core beliefs and interests, the FBI’s extremely broad reading of the AWA as allowing anything so long as Congress had failed to explicitly prohibit it (even if the issue had come up and Congress had merely declined to endorse it), the fact that CALEA really does seem to be Congress outright prohibiting this, as well as the Bill of Rights, the destruction of the American tech industry, the FBI’s utter bad faith and ludicrously transparent scaremongering, etc. That the FBI would of course inevitably leak any access methods it received is almost beside the point compared to the long-term precedent. Mr. Kerr takes a remarkably myopic view of the overall situation.

    I do agree with you that the FBI is ultimately full of shit here and that this is a hasty tactical retreat under absolutely overwhelming fire (worth noting that even the rest of the government, including the intelligence services and DoD, has been either conspicuously silent or outright negative on the FBI’s approach here, and that the FBI did not get anything close to the Congressional reception it seems to have expected). But I would not be at all surprised if they can in fact get the data off; the A6 and older, when protected by a simple 4-digit PIN, isn’t actually particularly good against a nation-state attacker with physical access. Where they’re full of shit is in claiming that this wasn’t known from the start and that this wasn’t a naked attempt to advance their long, long-standing and clearly stated anti-encryption agenda.

    The frustrating/scary thing in particular is that it’s also transparently obvious that they’ll be right back with the exact same thing, remixed, whenever the next horrible event happens. And it will: we’ve got ~320 million people across 3.8 million square miles, 7,514 miles of border, and tens of thousands of miles of shoreline, with over half a billion people crossing our borders every year [1], and we’re awash in guns. Statistically it is a certainty that we’re going to see incidents where a few dozen people get knocked off from time to time, and precisely because it’s a lower risk than getting hit by lightning, it’ll get saturation news coverage each time. That in many ways is the true superpower of government: patience. The slow, steady grind over years and decades; in the face of setbacks of any sort, the machine just pushes on, whereas the public must be reengaged each time, right when fears and emotions are running highest. I’m actually glad that a genuine profit motive to oppose it is developing, since that’ll help ensure a higher chance of well-financed counter-interests each time too.

    All of this is frankly a damn shame too, since it’s directly against the USA’s interests. In all seriousness, the government really should be on the side of ubiquitous encryption and as much security as possible when it comes to America, because that plays to our strengths culturally and internationally. The harm suffered from data breaches, infrastructure hacks, IP/R&D loss, and so on is immense and only slated to go up, all to the direct benefit of criminals and hostile foreign polities. The interests of government and the people should be aligned here, and it’s unfortunate that instead many government entities have been stuck in pre-electronic paradigms. Though perhaps, at least in the case of the organization founded and shaped by the likes of J. Edgar Hoover, it isn’t that surprising.

    0: [Ed. Note: Link deleted per rules.]
    1: [Ed. Note: This one too.]

    1. HFN

      No computer security has ever gone unbroken. It’s not because the world’s hackers are profoundly malicious; rather, it’s just par for the course in the tech industry. All software breaks. All security professionals, including the ones who work at Apple, know it’s just a matter of time before their products become insecure. When Apple claims that the FBI will destroy their precious security, they are lying by omitting the fact that the security will get destroyed anyway (and that, as a practical matter, doesn’t even hurt privacy, which has never been contingent on unbreakable encryption).

      It’s irresponsible to play dumb (as if we were only one Apple product away from privacy, whereas all of our personal lives up until now were public). Apple knows that the relationship between privacy and encryption isn’t cut and dried. I think that’s what Kerr meant when he said Apple has something to be embarrassed about. Their argument depended on cooperation with the FBI actually making a difference to the world’s privacy. If insecurity is inevitable or already a reality, or bore no clear relationship to people’s privacy, then the argument was empty and opportunistic.

      I’d reply more thoroughly, but you wrote too many words.

  9. Jim Tyre

    0: [Ed. Note: Link deleted per rules.]
    1: [Ed. Note: This one too.]

    Dammit Scott, you’re a lawyer, not a technologist. Lawyers begin with the far more sensible note 1; technologists, with the nonsensical note 0.

      1. Jyjon

        Second base today, third base the other day. I trust tomorrow you’ll have the sense not to go for fourth base in public.

  10. HFN

    Was the government losing the case? Were their arguments shoddy? I don’t understand. What would they have to be embarrassed about?

    Unfortunately, the claim that the FBI is embarrassed just sounds like people who already dislike the FBI projecting their sentiment onto the FBI.

    1. SHG Post author

      Notice that you’re the only person who used the word “embarrassed”? Are you deliberately trying to be deceptive or do you sincerely have no grasp of the problem?

    2. Patrick Maupin

      If the FBI were capable of true embarrassment, e.g. of doing something and realizing that disapproval might mean it should re-think things and perhaps not do them again, we probably wouldn’t be here.

  11. Drew

    Never ascribe ulterior motives when laziness is an equally plausible explanation. The FBI could easily brute-force the PIN on that iPhone with a little help from NSA computing power, they just didn’t want to wait and/or go through the bureaucratic hassle. Chances are that this ‘third party’ is just a convenient cover to enable them to back down.

    For what it’s worth, there is an important technical distinction between using government resources to brute-force a piece of software and compelling the software designer to build a tool that makes the brute-forcing easier.
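To put rough numbers on that distinction: what the FBI asked Apple to build was, in large part, a way to strip out the escalating delays and the auto-wipe, because without them a 4-digit search is trivial. The figures below are back-of-the-envelope assumptions, not anything from the filings:

```python
# Assumed figures: ~80 ms per passcode attempt is the commonly cited
# hardware key-derivation floor; iOS's escalating delays reach up to
# 1 hour per attempt after repeated failures. Neither is from the case.

pin_space = 10_000                         # every 4-digit PIN

fast_hours = pin_space * 0.080 / 3600      # no software delays, no wipe
slow_years = pin_space * 1.0 / (24 * 365)  # pessimistic 1 hour per try

print(f"tool-assisted: ~{fast_hours:.2f} hours")
print(f"stock iOS:     ~{slow_years:.1f} years, if the wipe doesn't fire first")
```

Which is Drew’s point: the government could always grind through 10,000 PINs; what it couldn’t do without conscripting Apple was make the phone sit still for it.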

  12. Pingback: Government To Apple: We’re Ba-aa-ack | Simple Justice

Comments are closed.