Apple can share my location data—on my terms
That the iPhone (or any other modern telephony device) knows its location and stores that information in a file should come as a surprise to no one. That the file is then backed up should also come as no surprise. Heck, I’ll even say that sending the file up to the big cloud in the sky (if indeed that is the case with iOS, Android or any other smartphone) and using it for marketing purposes should not be a shocker.
So, why are people making such a big fuss about all of this? The core of the controversy is informed consent, which is a by-product of a basic reverence for the customer—something sadly lacking far too often these days.
Consumers will put up with a lot from the companies they do business with, so long as they are told what the company wants to do (in clear, consumer-friendly terms) and are given the opportunity to choose.
But there seems to have been no clear communication and no real choice offered in this case. Yes, some people may argue that consumers granted Apple et al. permission to collect the data in the end-user license agreement, but let’s get serious. Those documents are really only of concern or value to lawyers. I’m talking about putting clear language in front of the user and letting the consumer choose—including the ability to selectively enable and disable the data collection and sharing.
Clearly, at a technology level, any modern smartphone—or even moderately smart phones—needs to know where it is to communicate effectively with cell towers, Wi-Fi access points and so on. And that’s not even mentioning the obvious point that most phones today have GPS chips so they can pinpoint their locations with amazing precision. We not only expect that, but we demand it of our devices.
But that’s at an operating system level. Of course, it knows where it is. Not all applications get all that data, and the consumer can grant or deny access to location services, at least on iOS devices. But dig deeper into the operating system, and you find that it knows where it is pretty much all the time.
The question then is what does it do with that data? If the data is cached locally and used temporarily, there would be a lot less complaining right now. But store it in a file that then gets backed up onto the user’s desktop, and the conspiracy theorists are going to cry foul—and rightly so if the manufacturer didn’t provide the users with fair warning and the ability to choose. Even worse, send that data into a manufacturer’s or service provider’s cloud without getting informed consent from the user, and you deserve what you get.
Let’s hope that Apple’s promised iOS fix will adequately address this issue, but it’s a bigger issue than just Apple. As we’ve been hearing, many products collect this sort of data, and we consumers aren’t always given opportunities for informed consent.
Earlier this week, for example, a news story broke about TomTom apparently sharing traffic speed data with the Dutch police, who then used it to issue speeding tickets to some drivers. Clearly, consumers aren’t likely to consent to this type of use of location data.
But they are likely to agree to sharing their location data for some purposes, if the source of the data is appropriately masked and anonymous. For example, crowd-sourced traffic data provides a clear and tangible benefit to all who share their location data. Basically, the more location sources, the more reliable the traffic data becomes. We already see such systems, and the value to the consumer is self-evident.
So the real problem isn’t in sharing location data. It’s in obtaining consent and ensuring the anonymity—or at least the privacy—of the participants. To do this successfully, it is important that the service provider be completely clear and open about its intentions. Consumers need to be given a clear choice and the ability to opt in or out as they see fit.
What’s more, in the spirit of “trust but verify,” providers who use this sort of data should open their systems up to independent external audits. The audits should include access to the design and source code of the implementation, as well as sample data from the production system, so that the auditors can validate that the data is being used in a way that is consistent with the stated privacy criteria.
That’s a pretty tall order, but it’s the sort of thing that is necessary if companies are to build and maintain consumer trust.
[With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University’s CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.]