Information security has never been a more sensitive subject than it is these days, so it’s little surprise that allegations from a security researcher that iOS contains a “backdoor” permitting access to users’ information provoked a strong response from Apple.
Those accusations came from security researcher Jonathan Zdziarski, who was presenting at the Hackers on Planet Earth conference earlier this week. In his talk, “Identifying Back Doors, Attack Points, and Surveillance Mechanisms in iOS Devices,” Zdziarski claimed to have found systems within iOS that could be used to access users’ information, including photos, address-book information, voicemail messages, and more.
As troubling as that might be, there are some caveats. For one thing, in order for this information to be accessible, your iOS device needs to be connected to a computer. However, since the advent of iOS 7, you need to explicitly tell that device to trust a computer when you first hook it up—meaning that a malicious party who wants to get at your information would either need physical access to your iOS device or to have compromised a computer where you’ve already established that trust. That said, Zdziarski reports that at least some of these systems bypass the encryption on your iOS device backups, which ought to give anybody pause.
Apple, as you might expect, did not take these allegations lying down.
“We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers and Apple for troubleshooting technical issues,” an Apple spokesperson told Macworld. “A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent.”
The company also reiterated its stance that it doesn’t compromise its systems for the purpose of providing those access points to the authorities: “As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.”
While such statements may be intended to assuage fears over the privacy implications of these systems, they’re hard to classify as categorical denials in this case. For one thing, Apple hasn’t yet explained why anybody needs the breadth of information that these tools seem to provide access to, nor why these services, if indeed for diagnostic use, are not presented for users to opt into. In the case of enterprise environments where devices are provided by a company, users are generally made aware of the access that IT departments have to their devices. But when we’re talking about the general public, no such warning is given—nor should it be needed.
To be clear, the risk here is not necessarily from malicious parties stealing sensitive data over the Internet, or from the government snooping on your every move. But there are privacy implications much closer to home: Given access to the services described in Zdziarski’s presentation, it wouldn’t be hard for someone with physical access to the device—say, a private investigator hired by a jealous partner—to extract that data. Granted, there’s only so much any platform can do to protect you once someone has physical access to your device.
But there remains a larger point, especially in this day and age. Apple has taken a firm stand on privacy, and it’s disappointing to see the company decline to fully and transparently explain why these services have the range of access that they do, why they circumvent security measures the company itself put in place, and why there’s no easy way for users to disable them. Openness on matters like these is exactly what we’ve come to expect from Apple, and we’d like to see the company live up to it.