Two Issues with the FBI & Apple

by Jay Marshall Wolman, CIPP/US

By now, practically everyone who cares has heard that Magistrate Judge Pym has ordered Apple to help the FBI crack open an iPhone related to the San Bernardino shooting.  The order is pursuant to the All Writs Act, codified at 28 U.S.C. § 1651.  In short, it is a catch-all that lets courts issue whatever orders they feel like.  In response, Apple CEO Tim Cook sent a letter saying he opposed the order.  Notably, he wrote:

But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

There’s been a lot of discussion, but little of it focused on two issues that deserve some attention.  First, this isn’t simply asking Apple to turn over a piece of software or asking to borrow a gadget.  They are, if Mr. Cook is to be believed, asking Apple to write new software.  Software is a creative process, a means of expression; this is why it is protected by copyright.  Apple itself was instrumental in that determination.  See Apple Computer, Inc. v. Franklin Computer Corp., 714 F.2d 1240 (3d Cir. 1983).  In a nutshell, the Order is tantamount to ordering Frank Gehry to design a building featuring straight lines and right angles, or ordering Stephen King to write a Harry Potter/Game of Thrones cross-over (assuming, in theory, a criminal investigation that would make such a work desirable).  EFF briefly touched on this last year in similar circumstances.  The All Writs Act may date to 1789, but it predates the ratification of the First Amendment in 1791 and is subject to it.  The Government may not simply compel speech.  See, e.g., Knox v. SEIU, 567 U.S. 298 (2012) (“The government may not prohibit the dissemination of ideas that it disfavors, nor compel the endorsement of ideas that it approves.”).

Second, there’s a certain subtext in Mr. Cook’s message.  What he says is that it is too dangerous to create, not that it is unfeasible to create.  The issue faced by the FBI is that the iPhone at issue may erase all data after too many failed attempts at a brute-force passcode hack.  So, they want Apple to design a work-around that would enable them to guess all possible passcodes without bricking the phone.  The auto-erase function is a security feature; the iPhone is encrypted by default.  We rely on this as part of our daily security; heck, I’m sure the government relies on it.  We’ve all seen street magicians use incredible sleight of hand.  How hard would it be for a foreign spy (let’s say North Korean) to swipe the iPhone (and SIM card) of one of our diplomats, officers, or defense contractors and replace it with a counterfeit?  In that scenario, the victim would try their passcode 10 times, fail, wonder why, but feel secure in the belief that the iPhone had wiped itself.  Yet the real phone would be in the hands of the foreign government.  Maybe the FBI and Apple haven’t yet developed the tool that bypasses the 10-tries-and-erase feature, but a foreign intelligence agency might have.  Our own NSA might have it also, but just isn’t sharing with the FBI.  This tells me that no iPhone is actually secure.  Though there is pretty much no such thing as an unbreakable lock, such a tool might enable a brute-force attack that cracks your phone in as little as 12 hours.  That’s more than enough time before the subject realizes his phone was swapped rather than just suffering a glitch.  As much as we may want Apple to be able to recover our phones if we forget our own passcodes, we really should want them to make a phone they themselves cannot crack.
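
To put rough numbers on that 12-hour figure: Apple’s iOS Security Guide describes a key-derivation delay of roughly 80 milliseconds per passcode attempt, so once the 10-tries-and-erase limit is bypassed, the cost of exhausting a numeric keyspace is simple arithmetic.  A back-of-the-envelope sketch in Python (the 80 ms delay is the only real input; everything else is illustrative):

```python
# Rough cost of brute-forcing a numeric passcode once the
# 10-tries-and-erase limit is out of the way. The ~80 ms per guess
# is the key-derivation delay Apple describes in its iOS Security
# Guide; treat every number here as illustrative.

SECONDS_PER_GUESS = 0.08  # ~80 ms of key derivation per attempt

def brute_force_hours(digits: int) -> float:
    """Worst-case hours to try every passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

for digits in (4, 6):
    worst = brute_force_hours(digits)
    print(f"{digits}-digit passcode: worst case {worst:.1f} h, "
          f"expected {worst / 2:.1f} h")

# 4-digit passcode: worst case 0.2 h, expected 0.1 h
# 6-digit passcode: worst case 22.2 h, expected 11.1 h
```

On those assumptions, a 6-digit passcode falls in about 11 hours on average, which squares with the 12-hour figure; a 4-digit code falls in minutes.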

These are the issues we should be discussing, in addition to whether we generally think it right for the government to ask Apple to hand over the keys to the kingdom.

16 Responses to Two Issues with the FBI & Apple

  1. CPlatt says:

    This seems analogous to the government requiring a maker of safes to design or modify the lock so that all safes will open with a single design of master key. I tend to doubt that such a demand has ever been made–at least in the United States. As soon as one master key is out there, duplicates can be made, and no one who owns those safes would feel secure.

    • Thomas D Dial says:

      That is incorrect. Apple has been ordered to provide a piece of software to modify the operation of a single smartphone, identified by its serial number and International Mobile Station Equipment Identity (IMEI), which identify it uniquely, or are intended to. The order was issued in connection with an apparently valid search warrant for the phone, whose owner (the San Bernardino County Department of Public Health) consented to the search. A more correct analogy is an order requiring a locksmith to assist in opening a combination lock that secures a safe for which a valid search warrant has been issued.

      The government has asked for a unique solution that will not apply to any other device. It is true, as it would be in the case of a locksmith and safe, that the techniques used could facilitate searches of other devices, but that is not analogous to a master key, and the order does not require Apple to modify their operating system designs.

  2. Dante says:

    You might want to check out this post:

    Apple can comply with the FBI court order

    Seems like iPhones 6 and up are a lot safer than the iPhone 5c in question, because they have something called the Secure Enclave. It’s not clear whether Apple could comply with the demand if it were a later iPhone.

    • Jay Wolman says:

      Thank you. As I noted above, however, what sparked this post was that Apple didn’t just come out and say that they didn’t have the capability. They could have said that, and that they wouldn’t do it even if they could. But they didn’t.

      • Thomas D Dial says:

        Various technical descriptions (Ars Technica is a reasonable source) suggest that Apple can, in fact, provide the assistance demanded in the case of the phone model in question, an iPhone 5c, but not in the case of the 5s and later models (or perhaps only with far greater difficulty).

  3. The real problem here is encryption based on a short numeric PIN. My Android phone requires a strong *password* — not a PIN — in order to encrypt, and it wouldn’t matter how many tries the government makes to brute-force my encryption, because that isn’t feasible with present technology.

    Apple, on the other hand, is trying to make encryption easier, which is great, but it is less secure. The “secure enclave” model, whereby the user’s PIN is hashed with a unique code that is stored inside a chip in the phone, and the chip cannot be made to reveal that code (only the hash), is better, but is it *really* not possible to get the code out of the chip?
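
    Here’s a loose, stdlib-only Python sketch of that tangling idea; the real Secure Enclave does this in hardware with its own key-derivation function, so the class and parameters below are purely illustrative:

```python
import hashlib
import hmac
import os

class ToyEnclave:
    """Illustrative stand-in for the Secure Enclave: the device-unique
    UID never leaves this object; callers only ever see derived output."""

    def __init__(self) -> None:
        self._uid = os.urandom(32)  # fused into the real chip at manufacture

    def derive_key(self, pin: str) -> bytes:
        # Entangle the PIN with the hidden UID. Real hardware uses its own
        # KDF with tuned timing; PBKDF2 is just a familiar placeholder.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), self._uid, 100_000)

enclave = ToyEnclave()
key = enclave.derive_key("123456")
assert hmac.compare_digest(key, enclave.derive_key("123456"))
# Because the UID is unknown outside the chip, PIN guesses can't be tested
# off-device: every guess must go through hardware that can throttle them.
```

    Whether that design actually holds up turns on the commenter’s question: can the physical chip keep the UID secret under lab attack? That is a hardware guarantee, not a software one.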

    +1 to Apple for fighting the fight, but +10 to Google for eliminating the fight.

    • Nick says:

      It is *really* not possible to get the code out of the chip.

      This is Apple’s white paper on security in iOS 9. The information you’re looking for about how the Secure Enclave chip works is on page 10 (and continues on after that): https://www.apple.com/business/docs/iOS_Security_Guide.pdf

      All iOS devices that use the Secure Enclave hardware require a 6-digit passcode minimum now. The FBI’s requested method is really only relevant (from a technical perspective) to iOS devices without the Secure Enclave.

      Of course there are also a few other technical security issues at play here. The Farook phone was using iCloud for backups. It is possible for Apple to retrieve iCloud data, and in this case they did. The FBI has Farook’s iCloud data from prior to ~Oct. 15, when he turned the backups off.

      IMO, 11-character alphanumeric passwords with no iCloud backups are a minimum requirement now. Disabling TouchID is also smart in the event that a court compels you to volunteer your fingerprint to unlock the device.
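
      For a sense of scale, the gap between a 6-digit PIN and an 11-character alphanumeric password is about 45 bits of search space; quick illustrative arithmetic:

```python
import math

# Compare keyspace sizes: 10 digits vs. 62 alphanumeric characters.
for label, space in [("6-digit PIN", 10 ** 6),
                     ("11-char alphanumeric", 62 ** 11)]:
    print(f"{label}: {space:.2e} combinations (~{math.log2(space):.0f} bits)")

# 6-digit PIN: 1.00e+06 combinations (~20 bits)
# 11-char alphanumeric: 5.20e+19 combinations (~65 bits)
```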

      For secure backups of iOS devices, plug the device into a computer and use iTunes to back up the device. There’s an encryption option there that will encrypt every backup with a key you specify. And if you’ve encrypted your entire hard drive (which you should have), the backup will be doubly encrypted against a physical search. If an attacker gains access to your machine while it’s running, however, you’d still be relying on the backup encryption itself.
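
      As a rough illustration of what that passphrase-keyed backup encryption looks like under the hood (not Apple’s actual iTunes backup format; AES-GCM over a PBKDF2-derived key is just one standard construction, here via the third-party cryptography package):

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(data: bytes, passphrase: str) -> bytes:
    """Encrypt a backup blob under a key derived from a user passphrase."""
    salt, nonce = os.urandom(16), os.urandom(12)
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    return salt + nonce + AESGCM(key).encrypt(nonce, data, None)

def decrypt_backup(blob: bytes, passphrase: str) -> bytes:
    """Recover the blob; raises if the passphrase (hence key) is wrong."""
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    return AESGCM(key).decrypt(nonce, ciphertext, None)

blob = encrypt_backup(b"contacts, messages, photos", "correct horse battery")
assert decrypt_backup(blob, "correct horse battery").startswith(b"contacts")
```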

  4. Gavin says:

    More significantly, I’d say that Apple designed their product inappropriately if the OS may be upgraded while the owner has not authenticated.

    They did not have to do that. Because they designed it this way, it’s feasible for them to comply with this order. Had they used their TPM-like hardware more carefully, compliance would be impossible and the court order would be moot.

    This story isn’t as much about Apple fighting for users’ privacy as it is about Apple screwing up product design for privacy and security, and turning to the courts to cover their ass retrospectively.

  5. A Leap at the Wheel says:

    >>Maybe the FBI and Apple haven’t yet developed the tool that bypasses the 10-tries-and-erase feature, but a foreign intelligence agency might have. Our own NSA might have it also, but just isn’t sharing with the FBI. This tells me that no iPhone is actually secure.

    Close, but not exactly. To load the software specified by the court, the software must be signed with a certificate that only Apple has. That cert is just a big binary number, but only Apple has it. If a foreign intelligence agency or the NSA wrote the same software and compiled it, they could not load it onto the phone.

    The last sentence I quoted should be “This tells me that no iPhone is actually secure from whoever has the ability to sign software with Apple’s cert.”

    • Jay Wolman says:

      Technically correct, but now we’re just playing the numbers game. How many people have the dev key? Your addition is a qualifier; it doesn’t change the fact that the thesis is correct. It is not secure.

      • A Leap at the Wheel says:

        Saying “just playing the numbers game” when it comes to encryption is like saying “that’s just a semantic game” in patent law.

        Everything about cryptography is a numbers game.

      • Thomas D Dial says:

        The key Leap mentioned is not a dev key. It is a secret key Apple uses to prove to users that the software to be loaded is authentic, from Apple, and has not been altered in any way. It is retained by Apple and never would be released intentionally to anyone else. Another party that gained access to it could distribute software that Apple devices would accept as authentic and load and run, potentially destroying the security of the device and any data on it.
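
        A minimal sketch of that signature check, using Ed25519 via the third-party cryptography package (this shows the mechanics of code signing generally, not Apple’s actual boot chain):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the private signing key never leaves the vendor.
vendor_key = Ed25519PrivateKey.generate()
firmware = b"ramdisk that skips the 10-tries-and-erase counter"
signature = vendor_key.sign(firmware)

# Device side: only the matching public key is baked into the boot ROM.
public_key = vendor_key.public_key()

def device_will_boot(image: bytes, sig: bytes) -> bool:
    """Accept an image only if the vendor's signature verifies."""
    try:
        public_key.verify(sig, image)  # raises InvalidSignature on failure
        return True
    except InvalidSignature:
        return False

assert device_will_boot(firmware, signature)
# A byte-identical image compiled by anyone without the key is rejected:
assert not device_will_boot(firmware, b"\x00" * 64)
```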

    • Gavin says:

      Apple had a far richer suite of choices here, though. They decided that their certificate wins. That was by no means preordained. They could have required that devices update only when the owner has authenticated, for instance.

      They could have made the government’s order moot.

      They chose not to.

      That was a design decision. Likely one made for Apple’s convenience, together with that of their users.