You know the scenario if you’ve watched 24 or similar TV shows and movies. A bad guy has a nuclear bomb or other instrument of terror that will go off, and the only way to stop it is to decrypt information stored on a mobile device or phone. Without the ability to break that encryption quickly, lots of people will die.
Thank goodness, in the context of the show, a wily government hacker or a subverted black-hat ne’er-do-well can punch keys and get the data. (Sometimes, torture is involved, which Amnesty International would remind us both violates international law and fails to produce useful operational information.)
In reality, governments use a variety of means, sometimes de facto or later found to be illegal or unconstitutional in the country in which they’re employed, to extract the passwords, encryption keys, and other data they need. (So do criminals worldwide, although their actions are nearly uniformly against the law.)
This narrative is required to justify the erosion of privacy in the name of fighting terror and finding justice. And government officials, whether in democracies or not, perpetuate it. Some seem truly to believe it, even though the number of cases in which it has proven true approaches zero.
Most recently, Apple has reiterated its stand that it won’t change the design of its hardware and software to allow even itself to access its customers’ information; it’s fighting government efforts in America and beyond that would require such modifications; and it has stated bluntly to a court that, with iOS 8 and 9, it can’t provide the mechanism demanded.
This is all to the good, and it doesn’t put us at greater risk.
How Apple locks information away in iOS 8 and 9
Cryptographic expert and university professor Matthew Green has a very technical rundown from a year ago of what is known and not known about Apple’s methods, if you want a deep dive.

Secure Enclave made its debut in Apple’s A7 processor.
But the top-level shift in iOS 8 and 9 is that, starting with the A7 processor, Apple’s Secure Enclave coprocessor pays dividends in resisting brute-force attacks and other methods of cracking an iOS device’s passcode. (The A7 first appeared in the iPhone 5s, iPad mini 2, and iPad Air; all subsequent iOS devices and processors include this support.)
With Secure Enclave, even a relatively weak passcode or passphrase is tangled with a unique secret stored in the phone that can’t be retrieved, so that determining the correct password requires an extremely long period of time. As Green notes in his post, Secure Enclave means that every password-cracking attempt has to happen on the iOS device itself; the material that needs to be cracked can’t be exported and iterated against on another system, such as a rack of high-performance graphics cards or an NSA supercomputer.
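To make that concrete, here’s a minimal sketch in Python of what tangling a passcode with an unexportable device secret might look like. This is illustrative only: PBKDF2 is a stand-in, and the names `DEVICE_UID`, `derive_unlock_key`, and the iteration count are all assumptions, since Apple’s actual in-enclave construction isn’t public.

```python
import hashlib
import os

# Illustrative only: model a device-unique secret fused into hardware
# at manufacture, which can never be read out or exported.
DEVICE_UID = os.urandom(32)

def derive_unlock_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Tangle the passcode with the device secret via an iterated KDF.

    Because DEVICE_UID never leaves the device, every guess must run
    here, at the hardware's pace; an attacker can't copy the stored
    data and iterate against it on faster, parallel hardware.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode("utf-8"), DEVICE_UID, iterations
    )
```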
Apple switched to pushing a six-digit PIN because a four-digit PIN can now be broken on a device relatively quickly, especially if poorly chosen (something with a pattern or that matches a common sequence). A six-digit PIN requires several months of continuous brute-force effort. (I’d suggest switching, particularly if you use Touch ID: you rarely have to enter your PIN, certainly far less often than you’re prompted for an iCloud password!)
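Some rough arithmetic shows why two extra digits matter. The roughly 80-millisecond minimum per attempt comes from Apple’s published security documentation; the sketch below ignores the hardware-enforced, escalating lockout delays after failed guesses, which push real-world totals far beyond these floors.

```python
SECONDS_PER_GUESS = 0.08  # Apple's stated ~80 ms hardware-enforced floor

for digits in (4, 6):
    keyspace = 10 ** digits          # number of possible PINs
    worst_case_hours = keyspace * SECONDS_PER_GUESS / 3600
    print(f"{digits}-digit PIN: {keyspace:,} combinations, "
          f"up to ~{worst_case_hours:.1f} hours at the floor rate")
```

The keyspace grows by a factor of 100, and the escalating delays between failed attempts multiply that further, which is how a six-digit attack stretches into months.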
The operating system’s part of the equation is that, starting with iOS 8, Apple encrypts far more of the personal data stored on the phone. So even if Apple had the tools to boot an arbitrary iOS device in a way that exposed its internal flash storage, what remains unencrypted would reveal very little of use.
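One way to picture the design: each file gets its own random encryption key, which is wrapped by a class key that is only available when the passcode-derived key exists. The standard-library-only sketch below is a loose model of that idea; the XOR masking stands in for real AES key wrapping, and none of these names are Apple’s.

```python
import hashlib
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def wrap_key(file_key: bytes, class_key: bytes, file_id: bytes) -> bytes:
    # Stand-in for real AES key wrapping: mask the per-file key with a
    # keystream derived from the class key. Unwrapping is the same
    # operation, and it's impossible without the class key.
    stream = hashlib.sha256(class_key + file_id).digest()
    return xor_bytes(file_key, stream)

# The class key is itself protected by the passcode-derived key; while
# the device is locked (or the passcode unknown), it's unavailable, so
# every per-file key, and thus every file, is unreadable ciphertext.
class_key = os.urandom(32)
file_id = os.urandom(16)
file_key = os.urandom(32)   # random key used to encrypt one file
wrapped = wrap_key(file_key, class_key, file_id)
assert wrap_key(wrapped, class_key, file_id) == file_key  # unwrap
```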
Apple can access anything stored in iCloud (except the content of files you encrypt yourself and then store in iCloud Drive); it does respond to law-enforcement requests for that information, although it says, “Only a small fraction of requests from law enforcement seek content such as email, photos, and other content stored on users’ iCloud accounts.”
Apple never has your passcode, it has no way of obtaining the unique device ID stored in Secure Enclave, and it has so far made it effectively impossible (that could always change) for any other party to obtain that information, either.
That’s a very cursory explanation of why Apple can’t give information to a government to access an arbitrary phone. But should it redesign iOS to allow it?
Who watches the Apple watchmen?
The problem with a backdoor is that there’s no way to create a way into a secure system that only the “good guys” can use. First and foremost, there’s robust debate in many countries widely considered to have free and democratic elections as to whether parts of the government are “good guys.” Are officials within such a government who act extrajudicially, or in ways later found to be against the national interest or a national constitution, “good guys”?
Then there’s the easier question of whether nation-states broadly judged to lack a democratic process, ones that engage in surveillance, repression, imprisonment, and murder, would have the same access to the backdoors. If, within its own borders, such a government judges itself to be acting legally, shouldn’t it have legal access to this backdoor too?
Even if there were a way to ensure that only abstract “good guys” could obtain access, cryptographers and security experts continually explain that any weakness in an algorithm that grants access to parties who don’t directly control the encryption components (whether a single user or multiple entities in communication) can be exploited by anyone who sorts it out.
Deploy a system worldwide that’s impregnable except for a method some government has access to, and exploits will be created, secret keys stolen, and people suborned through threat or reward.
Even if you believe your government should have the unfettered right, tempered by necessity, to decrypt data on an iPhone or any other device, do you want to sign off on an approach that lets China and Russia do the same? If you’re in China or Russia and trust your government completely, how do you feel about the U.S. government or the leadership of the United Kingdom having that access?
Oh no, you say, are we wading into a political discussion? I’m sorry, but all technology is political; the lens through which we view it changes with the uses to which it’s put. My use of my iPhone is unlikely to lead to my death; opposition politicians and rebels you might recognize as fighting for democratic liberty are in a very different position.
The idea of a magic backdoor is something some law-enforcement officials and government agencies hold dear, or at least claim to, while understanding the truth. Law enforcement in the U.S. and other countries already has hosts of tools, and we don’t live in a magical world: practical police work and spycraft exist to track down those who do wrong by a government’s lights; a golden key only the righteous and just can wield is a fantasy.