We live in a snooping world. It seems as though everybody, from random teenagers in their parents’ basements to government agencies to corporations to the creep who had a crush on you in high school, wants information about us. Some are criminals, some duly sworn law-enforcement officials, some trolls and griefers. The Internet obliges, to judge by the massive quantity of ads I routinely see promising more information about my neighbors and friends, not to mention the daily reports of database cracks and thefts.
Apple’s Error 53 debacle—a debacle in description, disclosure, and damage control—stems from what I will assert are the same motivations that have led Tim Cook to state categorically and repeatedly that his customers’ private communications are and will be kept secure from any interception by design. One could argue that Apple bricked those phones in which it appeared someone’s private information was at risk.
Anti-tampering and intrusion-disclosure techniques play an important role in that commitment, especially against criminal syndicates and government incursion (which covers both legal and extrajudicial action). Error 53, as a concept, fits neatly inside that framework.
But there’s a strong flip side: Apple as the nanny state, making decisions on our behalf that can, coincidentally, benefit its bottom line (however meagerly) by requiring customers to pay Apple to fix a problem.
Let’s start at the bottom.
A middle finger…print
For the basics, you can read colleague Jared Newman’s reporting for Macworld. The Guardian broke the story, and others quickly realized it explained mysteries dating back many months, including an inscrutable failure that a Daily Dot reporter wrote about.
The TL;DR appears to be this: if the Home button containing the Touch ID sensor on an iPhone 6, 6 Plus, 6s, or 6s Plus is replaced, then on installing an iOS 8 update (as the Daily Dot reported) or an iOS 9 update, the phone becomes “bricked,” or permanently unusable; the device reports an error numbered 53 only if you try to restore it via iTunes. What makes this more troubling is that some people with this problem never had a repair, or had one that didn’t include the Home button.
Apple’s reply to Jared Newman was, “When an iPhone is serviced by an unauthorized repair provider, faulty screens or other invalid components that affect the Touch ID sensor could cause the check to fail if the pairing cannot be validated. With a subsequent update or restore, additional security checks result in an ‘error 53’ being displayed.”
The Touch ID sensor apparently pairs with the Secure Enclave, an encryption module that stores fingerprint data and doesn’t pass it back out. If the two become unpaired, as when a new sensor is put into a phone, this could indicate malicious tampering.
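To make the idea concrete, here’s a deliberately simplified sketch of what such a pairing check amounts to. Apple hasn’t published its actual validation code, so every name and detail below is invented for illustration:

```swift
// A simplified, hypothetical sketch of a factory-pairing check; Apple's
// real validation is not public, so the names and logic here are invented.
struct TouchIDSensor {
    let hardwareID: String   // unique ID burned into the sensor
}

struct SecureEnclaveRecord {
    let pairedSensorID: String   // the sensor ID recorded at the factory
}

// Returns true only if the installed sensor is the one the enclave expects.
// A replaced sensor (or tampered cabling) presents a different ID, which
// the Error 53 check treats as possible tampering.
func validatePairing(sensor: TouchIDSensor, record: SecureEnclaveRecord) -> Bool {
    return sensor.hardwareID == record.pairedSensorID
}

let sensor = TouchIDSensor(hardwareID: "SN-REPLACEMENT-PART")
let record = SecureEnclaveRecord(pairedSensorID: "SN-FACTORY-ORIGINAL")
print(validatePairing(sensor: sensor, record: record))   // false: check fails
```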

Touch ID, and the Secure Enclave behind it, actually debuted with the iPhone 5s, yet we haven’t heard about iPhone 5s units affected by Error 53.
The best complexion to put on this is that Apple engineers had our best interests in mind. Many kinds of software and hardware can tell when an intruder has been at work, particularly when hardware has been tinkered with. For instance, secure flash drives, like these from Verbatim, have often featured internal mechanisms that permanently disable a drive if someone tampers with the circuitry or tries to remove storage chips. You can buy tamper-resistant hard drives, computers, and, yes, smartphones.
In some industries and for some people, buying a device that kills itself rather than give up its secrets (or even allow the potential for a third party to grab the data for later analysis without revealing themselves) is desirable or required.
The classic man-in-the-middle (MitM) attack, a constant threat for online communication, can be mitigated by giving Alice and Bob, the typical ends of a conversation, a means to detect when Eve butts in. In PGP, for instance, the fingerprints of the public keys used to encrypt data can be verified outside the interchange, to be sure nobody tampered with a message en route. The same principle underlies the third-party-verified digital certificates built into browsers and operating systems, so that when you connect to a server, you have assurance that an Eve hasn’t set up shop between you and it.
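As a minimal sketch of that out-of-band verification idea (using Apple’s CryptoKit purely for convenience; the key bytes are placeholders, not a real PGP implementation), comparing key fingerprints over a separate channel looks roughly like this:

```swift
import CryptoKit
import Foundation

// Hash a public key into a short, human-comparable fingerprint.
func fingerprint(of publicKey: Data) -> String {
    let digest = SHA256.hash(data: publicKey)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Placeholder key bytes; in real PGP these would be actual public keys.
let keyAliceSent = Data("alice-public-key".utf8)
let keyBobReceived = Data("alice-public-key".utf8)   // what arrived at Bob's end

// Alice reads her fingerprint to Bob over a separate channel (a phone call,
// in person); if Eve swapped in her own key mid-route, Bob's computed
// fingerprint won't match what Alice reads out.
let verified = fingerprint(of: keyAliceSent) == fingerprint(of: keyBobReceived)
print(verified ? "Fingerprints match: no tampering detected"
               : "Mismatch: possible MitM")
```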
In this case, Apple seems to have made a perfectly reasonable engineering choice: It’s better to brick a phone than to risk someone’s phone, payment, or other data being used by another party. In practice, it’s overkill, as many perfectly reasonable engineering choices often are.
Option-Control-freakery
Those who have long criticized Apple’s attitude towards owner and third-party repairs of its products—which have largely become more difficult over time due to components, assembly methods, and specialized connectors and screws—denounced Error 53 as a way for Apple to both make more money and deter third-party repairs.
Apple will repair or completely replace a broken iPhone that’s out of warranty for $299 (6/6s) or $329 (6 Plus/6s Plus) in the U.S., and keeps the old phone if it’s replaced. Under its one-year warranty, such a replacement should be free, unless the damage is from an accident that the warranty doesn’t cover. AppleCare+ covers such accidents, repairing or replacing the phone for $79 (iPhone 6, 6 Plus, and older) or $99 (iPhone 6s and 6s Plus), up to two incidents across the two-year contract.

AppleCare+ covers accidental damage for a small fee—and you can rest assured that the repair won’t later brick your phone.
There’s surely some profit in there, as Apple uses refurbished iPhones for replacements, and reports indicate it scavenges working parts from broken phones to use in refurbishing others.
Others, like long-time privacy and sensible-copyright-policy advocate Cory Doctorow, also (or instead) point to Apple’s tight control of its hardware even when it’s in its customers’ hands, which can make it seem like we’re renting the right to live in Apple’s world rather than acquiring equipment and software that we own and can choose how to use.
Doctorow noted at Boing Boing, “iPhone customers are finding that their investments and data are being confiscated by a distant, high-handed corporation that gets to hide behind tens of thousands of words’ worth of never-read, all-encompassing terms of service.”
I’d like to find a middle ground here, although I agree that Apple imposes unnecessary restrictions in the name of protecting users and, if we take the company at its word, often crosses into a “nanny ecosystem” that does things for our own good; there, there, don’t fuss. And these impulses often seem to intersect with Apple’s financial interests, even when those interests are modest.
For instance, the prohibition on running apps in iOS that aren’t purchased or downloaded from the App Store certainly has a strong basis in preventing malware. We can look at other mobile platforms’ less-restrictive app screening and installation policies as a cautionary tale. Yet, there’s no buried Advanced switch that would let users who know they’re taking a risk install non-App Store apps.

Apple could allow for sideloading as well as third-party repairs, as long as customers understand the risks ahead of time and can make informed decisions.
Many Android and Android-forked devices offer “sideloading,” often requiring you to check just such a box to accept your fate. Amazon offers this on its Fire line, which has its own app store, along with mostly DRM-protected books and videos. Likewise, many browsers let us connect to sites with expired or misconfigured security certificates. While this is unwise, they typically require us to jump through multiple hoops, so we know what we’re getting into and accept that burden.
What Apple could have done with Error 53 (and still could) is transform it into a notification rather than bricking the phone. The phone could still lock up and require being restored, but iTunes could offer something better. Like:
Something appears to have happened to your Touch ID sensor, and we can no longer ensure the security of your Touch ID and Apple Pay settings. Click here to permanently erase the Secure Enclave, rendering your secure data irretrievable, and proceed to use your iPhone without Touch ID and Apple Pay in the future. Or click Cancel, and contact Apple to replace your phone; your data will be securely disposed of.
Given that Apple provides Error 53 in part to explain this, it’s not revealing any additional secrets to give people a choice. Even if a third party gains access without the phone’s owner knowing, the fact that Touch ID and Apple Pay no longer work is a giveaway. Perhaps iOS could occasionally remind a user of this fact, and the Touch ID section of Settings could provide additional information, too.
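For what it’s worth, the shape of that choice is simple enough to sketch in code. This is a hypothetical outline of the flow I’m proposing, not anything Apple has built, and every name in it is invented:

```swift
// A hypothetical sketch of the proposed recovery flow; invented names,
// not Apple's actual restore logic.
enum MismatchChoice {
    case eraseAndContinue   // wipe Secure Enclave data, keep using the phone
    case replacePhone       // leave it locked and hand it to Apple
}

func resolvePairingMismatch(_ choice: MismatchChoice) -> String {
    switch choice {
    case .eraseAndContinue:
        // Fingerprint and payment data are destroyed, so a tampered sensor
        // has nothing to harvest; the phone lives on, minus Touch ID and
        // Apple Pay.
        return "Secure Enclave erased; Touch ID and Apple Pay disabled."
    case .replacePhone:
        // The conservative path: the device stays locked, and the data is
        // securely disposed of during the replacement.
        return "Device remains locked; contact Apple for a replacement."
    }
}

print(resolvePairingMismatch(.eraseAndContinue))
```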
Apple should disclose this risk in clear, upfront language when it sells a phone, since no other failure destroys an iPhone the way this bricking does. And it should reconsider the way it ends the life of a phone that its hardware determines has been tampered with.
I don’t buy that Apple implemented this feature as a profit booster, because it can’t bring in enough to make that worthwhile. But I do see where the company feels that obscurity outweighs security, even when it’s obvious to the owners of a phone that they’ve been deprived of their hardware.