On Tuesday, the United States District Court for the Central District of California issued an order requiring Apple to assist the FBI in accessing a locked iPhone (PDF)—and not just any iPhone, but the iPhone 5c used by one of the San Bernardino shooters. The order is very clear: build new firmware to enable the FBI to perform an unlimited, high-speed brute-force attack, and place that firmware on the device.
Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define the limits of our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself.
As a career security professional, I find this case has chilling implications.
I’ve been writing about Apple’s role in our digital civil rights since 2014, and specifically addressed why Apple is at the center of the battle over encryption last month on TidBITS. The short version is that Apple is one of the only companies with the technologies, high profile, and business model to both find themselves in the crosshairs and take a strong position.
To be clear, Apple has a long history of complying with court orders and assisting law enforcement. Prior to iOS 8, they could extract data off devices. Even today, data in most of their online services (iCloud, excluding iMessage and FaceTime) can be provided upon legal request.
This case is different for multiple reasons:
Apple is being asked to specifically create new software to circumvent their security controls. They aren’t being asked to use existing capabilities, since those no longer work. The FBI wants a new version of the operating system designed to allow the FBI to brute force attack the phone.
The FBI is using a highly emotional, nationally infamous terrorism case as justification for the request.
The request refers to the All Writs Act, which is itself under scrutiny in a case in New York involving Apple. Federal Magistrate Judge James Orenstein of the Eastern District of New York is currently evaluating if the Act applies in these cases.
That’s why this is about far more than a single phone. Apple does not have the existing capability to assist the FBI. The FBI engineered a case where the perpetrators are already dead, but emotions are charged. And the law cited is under active legal debate within the federal courts.
The crux of the issue is whether companies should be required to build security circumvention technologies to expose their own customers. Not “assist law enforcement with existing tools,” but “build new tools.”
The FBI Director has been clear that the government wants back doors into our devices, even though the former head of the NSA disagrees and supports strong consumer encryption. One reason Apple is likely fighting this case so publicly is that it is a small legal step from requiring new circumvention technology to requiring that such access be built into devices from the start. The FBI wants the precedent far more than it needs the evidence, and this particular case is incredibly high profile and emotional.
The results will, without question, establish precedent beyond one killer’s iPhone.
The technical details
The court order is quite specific. It applies only to one iPhone, and requests Apple create a new version of the firmware that eliminates the existing feature that erases the iPhone after 10 failed attempts at entering the passcode. It further asks Apple to allow passcode attempts to be performed as rapidly as possible.
Beginning with iOS 8, devices are encrypted using a key derived from your passcode. This is combined with a hardware key specific to the device. Apple has no way of knowing or circumventing that key. On newer devices, the hardware key is embedded in the device and is not recoverable. Thus the passcode must be combined with the device key in a chip on the phone, and that chip rate-limits passcode attempts to make a brute force attack slower.
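That design can be sketched in a few lines. This is a toy model, not Apple’s actual implementation: PBKDF2 stands in for Apple’s proprietary key-tangling function, and the UID value is invented for illustration.

```python
import hashlib

# Toy model of iOS key derivation: the encryption key comes from the
# user's passcode entangled with a device-unique hardware key (UID).
# Because the UID never leaves the device, the derivation must run on
# the phone itself; the work cannot be offloaded to faster hardware.

DEVICE_UID = bytes.fromhex("ab" * 32)  # hypothetical per-device secret

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    # PBKDF2-HMAC-SHA256 stands in for Apple's actual derivation; the
    # iteration count models the deliberate slowness of each attempt.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

# Two nearby passcodes yield unrelated keys; there is no shortcut other
# than trying passcodes one at a time, on the device.
assert derive_key("1234") != derive_key("1235")
```

The key point the sketch captures is architectural: without the UID, an attacker cannot even begin guessing, which is why the FBI cannot simply copy the encrypted data to its own computers.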
Reading through the order, it seems the FBI thinks that a modified version of the operating system would allow them to engage in high-speed attacks, if the 10-try limit were removed. The request indicates they likely can’t image the device and perform all the attacks on their own super-fast computers, due to that hardware key. With a four-character passcode the device could probably be cracked in hours. A six-character code might take days or weeks, and anything longer could take months or years.
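Those rough estimates are easy to reproduce. The sketch below assumes roughly 80 milliseconds per on-device attempt; the real per-guess cost depends on the hardware, so treat the results as orders of magnitude only.

```python
ATTEMPT_SECONDS = 0.08  # assumed cost of one on-device key derivation

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every passcode of the given alphabet and length."""
    return (alphabet_size ** length) * ATTEMPT_SECONDS

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("6-char lowercase alphanumeric", 36, 6),
]:
    hours = worst_case_seconds(alphabet, length) / 3600
    print(f"{label}: up to {hours:,.1f} hours")
```

The search space grows exponentially with both length and alphabet size, which is why the feasible-to-hopeless line sits so close to the passcode lengths people actually use.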
As many jailbreakers know, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own—the FBI does not have the secret keys that Apple uses to sign firmware.
This opens up a few questions. Could this work on newer devices with the enhanced encryption of the Secure Enclave? How can Apple pair the device and replace the firmware in the first place? Would they be using the shooter’s computer? An over-the-air update? Apple says that all devices (with or without the Secure Enclave) are vulnerable to this kind of attack, but declined to comment on the specific technical methods—a position I initially disagreed with but which, on reflection, is probably the right move, for reasons we’ll get to in a moment.
Thus the FBI wants a new version of iOS, signed by Apple and installed on the device, that removes limitations on their attempts to brute-force the password.
Why this matters
Legal precedent is like a glacier, slowly building over time until it becomes nigh unstoppable. Major issues like this are first, and sometimes ultimately, decided on a series of small steps that build on each other. It’s the reason the NRA fights any attempts at gun control, since they fear a slow build, not a single small law.
The crux of this round of the encryption debate is whether companies should be forced to build tools to circumvent their customers’ security. If the answer is “yes,” it could be a small step to “should they just build these tools into the OS from the start?”
I have no doubt the FBI deliberately chose the highest-profile domestic terrorism case in possibly a decade. We, average citizens, want the FBI to stop this sort of evil. We don’t necessarily see this one case as applying to our lives and our rights. Why the big deal? What if the FBI could find the terrorists’ contacts and stop other attacks?
But the truth is, no legal case applies in a vacuum. If this goes through, if Apple is forced to assist, it will open a floodgate of law enforcement requests. Then what about civil cases? Opening a phone to support a messy divorce and child custody battle? Or what about requests from other nations, especially places like China and the UAE that already forced BlackBerry and others to compromise the security of their customers?
And once the scale of these requests increases, as a security professional I guarantee the tools will leak, the techniques will be exploited by criminals, and our collective security will decline. It really doesn’t matter if it’s the iPhone 5c or 6s. It really doesn’t matter if this is about dead terrorists or a drug dealer. It doesn’t matter what specific circumvention Apple is being asked to create.
What matters is whether we have a right to the security and privacy of our devices and of our communications, which are also under assault, and whether we have the right to tools to defend ourselves from the government and criminals alike. Yes, these tools will sometimes be used for the worst of crimes, but they’re also fundamental to our civil rights, freedom of discourse, and our ability to protect our digital lives from the less impactful, but far more frequent, criminal attacks.
This situation was engineered by the FBI and Department of Justice for maximum impact and the best chances of success. Apple is fighting, and as a security professional, it’s my obligation to support their position, and stronger security.