At this writing, Apple’s battle with the FBI over how much it can and should help in the investigation of the San Bernardino shootings is less than a week old. But it’s already explosive, to say the least. The government has accused Apple of being more concerned with marketing than the fight against terrorism, and Apple has drawn a line in the sand, saying that complying with the FBI’s request “would undermine the very freedoms and liberty our government is meant to protect.”
This fight isn’t going to be over anytime soon, so we’ll keep this FAQ updated as events unfold. If you have more questions—or want to respectfully debate the implications this case will have on privacy and security—please chime away in the comments and we’ll do our best to make everything about this confusing case as clear as possible.
Where does it stand right now?
The United States District Court for the Central District of California issued an order on February 16, giving Apple five business days to respond. Apple posted an open letter to customers on its website explaining its side of the case, prompting government attorneys to file a motion on February 19 disagreeing with Apple’s view of the situation, and asking the court to force Apple to comply.
A hearing is scheduled to take place in Riverside, CA, on March 22. Until then, the lawyers will file more motions, while the two sides also take their case to the court of public opinion. On Sunday February 21, FBI Director James Comey posted at Lawfare that we should “take a deep breath and stop saying the world is ending.” Apple updated its open letter on Monday February 22 to add its own FAQ on privacy and security, and Tim Cook sent a memo to employees calling on the FBI to drop its request. So far, public opinion is not on Apple’s side, but this is only the beginning…
The Basics
So the FBI has an iPhone 5c that belonged to the San Bernardino shooter, and they think it has evidence inside?
The iPhone 5c in question was used by San Bernardino shooter Syed Rizwan Farook, but it was his work phone, so it technically belonged to his employer, the San Bernardino County Department of Public Health. Farook also had a personal phone and a personal computer, but he physically destroyed both before the December 2 shooting. Farook was killed in a firefight with police.
In the course of its investigation, the FBI wants to examine the iPhone 5c for evidence. The DOJ’s court filing from Friday February 19 reads:
The government has reason to believe that Farook used that iPhone to communicate with some of the very people whom he and [his also-deceased wife Tashfeen] Malik murdered. The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple.
But if it was his employer’s phone, can’t they access its data, or at least consent to the search?
The San Bernardino County Department of Public Health did consent to the search, but the iPhone is locked with a passcode (reportedly a 4-digit PIN, not something more complex), and apparently the county didn’t use good mobile device management practices: it doesn’t know the passcode and couldn’t access anything on the phone without it. From the same February 19 court filing:
The FBI obtained a warrant to search the iPhone, and the owner of the iPhone, Farook’s employer, also gave the FBI its consent to the search. Because the iPhone was locked, the government subsequently sought Apple’s help in its efforts to execute the lawfully issued search warrant. Apple refused.
Why is Apple refusing to unlock the phone?
That wasn’t what Apple was asked to do—Apple actually has no way of unlocking a locked iPhone. Apple does have a way to extract data from a device running iOS 7 or earlier, without having to unlock the phone. Apple has done this before for law enforcement with a proper court order—another filing by the government estimates at least 70 times.
But starting with iOS 8, the data on an iPhone is encrypted by default as soon as you enable the passcode feature. Since Farook’s iPhone 5c is running iOS 9, the only way to access the encrypted data it holds is to unlock the phone with the passcode. Since the owner of the phone (Farook’s employer) doesn’t know the passcode, and Apple doesn’t know the passcode, and Farook is dead, the FBI is stuck trying to crack the passcode through brute force.
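For the technically curious, here’s a conceptual sketch in Python of why that is. Apple’s iOS security documentation says the encryption key is derived from the user’s passcode entangled with a unique key fused into the device’s hardware, via a deliberately slow derivation function; the sketch below uses PBKDF2 as a stand-in, not Apple’s actual algorithm:

```python
import hashlib
import os

# Conceptual sketch only -- not Apple's actual implementation.
device_uid = os.urandom(32)  # stand-in for the key fused into the hardware

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    # A deliberately slow key-derivation function (PBKDF2 here) makes
    # each guess expensive; per Apple's docs, iOS calibrates its KDF to
    # take roughly 80 milliseconds per attempt, on the device itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

key = derive_key("1234")
# Without the passcode (and the hardware it's entangled with),
# the encrypted data is just ciphertext -- even to Apple.
```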
What does the FBI want Apple to do to help brute-force the passcode?
The best defense iOS has against a brute-force attack is the Erase Data feature, which wipes all the data on the iPhone after 10 failed passcode attempts. The iPhone has a 4-digit PIN (just 10,000 possible combinations), which shouldn’t take long to crack, but certainly more than 10 tries.
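Here’s the back-of-envelope math, assuming the roughly 80 milliseconds per guess that Apple’s security documentation cites for its on-device key derivation:

```python
combinations = 10 ** 4    # a 4-digit PIN: 0000 through 9999
seconds_per_guess = 0.08  # ~80 ms per attempt, per Apple's security docs

worst_case_minutes = combinations * seconds_per_guess / 60
print(f"Worst case, with no other safeguards: {worst_case_minutes:.1f} minutes")
# -> about 13 minutes. Trivial -- unless Erase Data wipes the phone
#    after the 10th wrong guess.
```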
So the FBI’s request, and the court’s February 16 order, is for Apple to create a sideloadable SIF (software image file) of iOS that can run from the iPhone’s RAM without touching any other data on the device. The FBI wants Apple to sign that software so the iPhone—and only this iPhone—will run it. Once installed, the software would disable the Erase Data setting.
The FBI also wants to try passcodes as quickly as possible, so it wants Apple to disable the delay between passcode attempts, plus allow passcodes to be entered by a computer, either through the iPhone’s Lightning port or wirelessly, a feature that has never existed in a publicly shipping version of iOS. That’s a big deal—as Matthew Panzarino points out at TechCrunch, it’s asking Apple to introduce a new weakness into iOS.
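In other words, the request amounts to letting a computer run a loop like this sketch, in which try_passcode is a hypothetical stand-in for the electronic-entry interface the FBI wants Apple to build:

```python
TRUE_PIN = "7391"  # stand-in value; investigators don't actually know it

def try_passcode(guess: str) -> bool:
    # Hypothetical stub for electronic passcode entry over Lightning
    # or Wi-Fi -- again, no such interface exists in shipping iOS.
    return guess == TRUE_PIN

# With Erase Data disabled and the inter-attempt delays removed,
# exhausting every 4-digit PIN becomes a trivial loop:
for n in range(10_000):
    pin = f"{n:04d}"
    if try_passcode(pin):
        print("Unlocked with", pin)
        break
```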
Does the FBI know for sure if the Erase Data feature is turned on?
It doesn’t seem like it—the FBI just doesn’t want to take any chances. From the February 19 filing, emphasis ours:
The FBI has been unable to make attempts to determine the passcode to access the subject device because Apple has written, or “coded,” its operating system with a user-enabled “auto-erase function” that would, if enabled, result in the permanent destruction of the required encryption key material after 10 failed attempts at entering the correct passcode.
What was Apple’s response?
Apple posted an open letter to customers explaining its position. It reads in part:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The main argument: Is this a backdoor to one iPhone or all of them?
Would the software the FBI is requesting be considered a “backdoor”?
That depends on whom you ask. For example, Bruce Schneier of Harvard’s Berkman Center for Internet and Society told our colleagues at NetworkWorld, “The FBI is asking Apple to reinstall a vulnerability they fixed.” He says the iPhone 5c didn’t initially have protection against brute-force attacks on the passcode; those protections were added in 2014 with iOS 8.
The government’s February 19 court filing flatly disagrees that it’s a backdoor, mostly because the order is written just for this one phone.
Apple may maintain custody of the software, destroy it after its purpose under the Order has been served, refuse to disseminate it outside of Apple, and make it clear to the world that it does not apply to other devices or users without lawful court orders. As such, compliance with the Order presents no danger for any other phone, and is not “the equivalent of a master key, capable of opening hundreds of millions of locks.”
But Apple believes that it is—that “master key” quote is right from Apple’s open letter.
Whether it’s a backdoor or not, the FBI says it only wants to use the tool once. So what’s wrong with a single-use backdoor?
It’s true that the DOJ says the FBI only wants to do this once. But the February 19 filing uses several other court cases as precedent to bolster its argument that Apple is being unreasonable to refuse this time. In both the San Bernardino investigation and a separate drug case in New York, the government argues that since Apple has helped before, it should be willing to help again.
So it’s a little strange that the FBI wants us to believe that once Apple builds this tool to help law enforcement brute-force a passcode, it wouldn’t be used again. Even if that particular software image file were never shared and promptly destroyed, the courts could use this case as precedent to order Apple to build it again.
But the government says that this software doesn’t ever have to leave the Apple campus—what’s wrong with that?
The government claims that Apple can retain total control over the software, and even the device itself. Reads the February 19 filing, “the Order permits Apple to take possession of the subject device to load the programs in its own secure location, similar to what Apple has done for years for earlier operating systems, and permit the government to make its passcode attempts via remote access.”
But since Apple is being asked to create a tool for law enforcement to use, that tool would have to stand up to scrutiny if any evidence collected with it is ever used in court. Jonathan Zdziarski’s excellent blog post “Apple, FBI, and the Burden of Forensic Methodology” explains this really well. Zdziarski has extensive experience in iOS forensics, working with law enforcement and testifying as an expert in court.
He explains that tools used by law enforcement to collect evidence are legally known as “instruments,” and for evidence collected by such tools to be admissible in court, the court as well as the defense must have confidence that the tools are accurate and their results reproducible. New instruments—a breathalyzer, a speed-detecting radar gun, or a software tool like this one—have to be tested and validated by a third party like NIST (the National Institute of Standards and Technology) or NIJ (the National Institute of Justice), and generally accepted by the scientific community. That’s why breathalyzer tests are admissible but polygraphs are not.
Zdziarski also explains that before iOS 8, when Apple could still extract unencrypted data from a locked device, this was seen as a lab service, not an instrument. In that case, Apple would have to demonstrate to the court (usually through expert testimony or an affidavit) that it had the expertise to run the test, but it could claim “trade secrets” to avoid detailing the exact methods. But when it’s law enforcement carrying out the method itself, the standard is different.
Now, just because evidence collected with this tool might not be admissible in court doesn’t make that evidence worthless. Law enforcement could learn something about Farook on his iPhone that they could then verify through other means that are admissible.

The iCloud problem
Back to Farook’s iPhone 5c, is there any other way to get the evidence the government wants? What else did they try?
The February 19 filing lists the other methods the government and Apple discussed, and why they won’t work, in a footnote on page 18, paraphrased here:
Obtain cell phone toll records: The filing says “the government has of course done this,” but it’s insufficient since there’s a lot more on the phone than call and SMS records.
Determine if any computers were paired to the phone: The government says there weren’t any.
Attempt an auto-backup of the device with the related iCloud account: This didn’t work because neither the FBI nor the “owner” (the San Bernardino County Department of Public Health) knew the iCloud password.
Obtain previous iCloud backups: The FBI did this too, but the most recent backup was from October 19, 2015; the filing says that’s not sufficient, “and also back-ups do not appear to have the same amount of information as is on the phone itself.”
But that third method (attempt an auto-backup to iCloud) is where it gets really weird. The iCloud password was reset remotely, shortly after the crime, by the owner, i.e. the county. The February 19 filing says, “that had the effect of eliminating the possibility of an auto-backup.”
As explained by Ars Technica, the way the FBI tried to force a backup was to take the iPhone to a known Wi-Fi network, plug it in, and leave it overnight—which should trigger a backup to iCloud if auto-backups are enabled. But it didn’t work, because the iCloud password had been reset and the credentials stored on the phone were no longer valid.
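A simplified model of those trigger conditions makes the failure easy to see (this is an illustration, not Apple’s actual logic):

```python
def will_auto_backup(on_known_wifi: bool, plugged_in: bool,
                     locked: bool, icloud_credentials_valid: bool) -> bool:
    # Simplified model of when an automatic iCloud backup kicks off.
    return on_known_wifi and plugged_in and locked and icloud_credentials_valid

# The FBI's overnight attempt satisfied the first three conditions, but
# the remote password reset invalidated the credentials cached on the
# phone -- so the backup never ran.
print(will_auto_backup(True, True, True, icloud_credentials_valid=False))  # False
```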
So they weren’t able to get an iCloud backup?
Not a full one. According to the February 19 filing, the FBI has Farook’s iCloud backups through October 19, about six weeks before the December 2 shooting. The filing states that the government found evidence in the iCloud account to indicate “that Farook communicated with victims who were later killed in the shootings.” (You’ll recall he killed his own co-workers.)
The filing also states:
In addition, toll records for the subject device establish that Farook communicated with Malik using the subject device between July and November 2015, but this information is not found in the backup iCloud data. Accordingly, there may be critical communications and data prior to and around the time of the shooting that thus far has not been accessed, may reside solely on the subject device, and cannot be accessed by any other means known to either the government or Apple.
Wait, they think there could be data on the phone that isn’t in the iCloud backup?
Yes, the February 19 filing says that—they have service records from Verizon that show communications occurred, but those aren’t in the iCloud backup.
The problem with that argument? There’s no way to selectively back up to iCloud—it’s all or nothing. So if communications from July, August, September, and October are not in the October 19 iCloud backup, it would be pretty surprising to find them on the phone. One logical explanation is that they were deleted by Farook before October 19.
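To illustrate the gap with made-up data: compare what the carrier logged against what the backup contains, and the leftover set is exactly what the government says it’s missing:

```python
# Hypothetical data for illustration only.
toll_records = {"call-2015-07", "call-2015-08", "call-2015-09", "call-2015-11"}
icloud_backup = {"call-2015-08"}

missing = toll_records - icloud_backup
print(missing)
# Anything here that predates the October 19 backup most likely
# wasn't on the phone when that backup ran -- i.e., it was deleted.
```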
What’s with this story about the iCloud password being changed, and who’s to blame?
It’s kind of a mess. First, the February 19 filing mentioned that the owner (again, that’s San Bernardino County) reset the password for the Apple ID tied to the iPhone—Farook’s iCloud password, in other words. “The owner…was able to reset the password remotely, but that had the effect of eliminating the possibility of an auto-backup.”
So that kind of read like the FBI thought the county had screwed up, but then the next day, February 20, the county’s Twitter account tweeted that the FBI had instructed the county to do so.
The County was working cooperatively with the FBI when it reset the iCloud password at the FBI’s request.
— CountyWire (@CountyWire) February 20, 2016
The FBI released a statement on February 21 to Ars Technica admitting that yes, it had ordered the password reset. But the FBI still maintains that the iCloud backup wouldn’t have everything the investigators would get if they could just get into the phone, which is why the court order was issued in the first place.

The New York case, and why iOS version matters
Farook’s iPhone is running iOS 9, and passcode-based encryption was added in iOS 8. But if Farook’s iPhone were running iOS 7, would Apple still help?
Apple has published a set of Legal Process Guidelines (PDF) that outline the process for law enforcement to request assistance from Apple, as well as what information Apple can provide.
They read in part:
For all devices running iOS 8.0 and later versions, Apple will not perform iOS data extractions as data extraction tools are no longer effective. The files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess. For iOS devices running iOS versions earlier than iOS 8.0, upon receipt of a valid search warrant issued upon a showing of probable cause, Apple can extract certain categories of active data from passcode-locked iOS devices.
However, the government’s February 19 court filing states in a footnote, “Apple has informed another court that it now objects to providing such assistance.”
What other court is Apple objecting to?
There’s another case pending in New York, in which an iPhone 5s belonging to a suspected meth dealer is running iOS 7, but Apple still doesn’t want to help.
Why doesn’t Apple want to help in New York?
In a response filed in the New York case, Apple argues that “social awareness of issues relating to privacy and security, and the authority of government to access data is at an all-time high. And public expectations about the obligation of companies like Apple to minimize government access within the bounds of the law have changed dramatically.” So the time is right to reexamine the authority given to the government by the All Writs Act, Apple is arguing.
From that filing, it sounds like Apple just wants out of the iPhone-data-extraction business. The filing explains how, starting with iOS 8, Apple doesn’t have the technical ability to do what it once did, and that iOS 7 devices like this one “are becoming rare as they comprise less than 10 percent of the devices in the U.S.” Apple doesn’t want its engineers spending time doing the extraction or testifying in court about it, even though the company would be able to claim expenses.
After all, as the final reason argues, you can’t claim expenses for damage to the brand. “Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand. This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue.”
The legal fight: What happens next?
What’s the deal with the All Writs Act, which Apple is objecting to?
Both this new order and the New York case use the All Writs Act of 1789. In fact, in the case going on in the Eastern District of New York, Apple is arguing that extracting data from a drug dealer’s iPhone 5s running iOS 7 is overly burdensome on manpower and resources, as well as an overly wide application of the All Writs Act. Matthew Panzarino at TechCrunch has a great explanation, and you can also read Apple’s filing questioning the AWA.
According to the February 19 filing in the California case, “The All Writs Act provides in relevant part that ‘all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.’” It’s kind of a catch-all, in other words: “As the Supreme Court explained, ‘the All Writs Act is a residual source of authority to issue writs that are not otherwise covered by statute.’”
The tests are whether the third party “is not so far removed from the underlying controversy that its assistance could not be permissibly compelled,” whether the order “does not place an undue burden” on the third party, and whether the assistance is “necessary to achieve the purposes of the warrant.” In the February 19 filing, the government argues that all three tests are met, and thus Apple should be ordered to comply.
What happens if Apple refuses?
If the February 16 order from Judge Pym stands after Apple challenges it—the next hearing is scheduled for March 22—the company could appeal it up through the courts, potentially all the way to the Supreme Court.
This case could prompt legislation in Congress too, according to California Senator Dianne Feinstein speaking on PBS NewsHour. Tim Cook and FBI Director James Comey have both been invited to appear before the bipartisan House Energy and Commerce Committee.