Over the last several years, Apple has taken a number of opportunities to present itself in stark contrast to one of its chief rivals, Google, and nowhere more so than in the firm position Cupertino takes against collecting any more data about its customers than it needs.
Privacy is obviously a major concern in the digital age, and Apple’s stance is largely applauded—and with good reason. But at the same time, that choice doesn’t come without its costs, both to Apple and to its users. By taking such a hardline stance, the company has hindered the development of some of its features, and perhaps even negated some of the advantages of its ecosystem.
There are places, it seems, where a balance is not only desirable but necessary. This isn’t to say Apple should sacrifice security and privacy in favor of capabilities, but that the company should be able to make use of its immense talent to find a middle ground that maintains users’ privacy and provides the features that people want.
Face the facts
Apple’s Photos app got one of the biggest overhauls in macOS Sierra, and chief among its new features on both that platform and iOS 10 is a machine-learning algorithm that, among other things, can identify the people in your photo library.

When Apple first announced the feature, it made a big deal about the fact that the processing of photos for faces was done locally, on the device itself, rather than transmitted to servers. It was an obvious jab against Google Photos, which had already rolled out a similar feature, but did its processing in the cloud.
Again, there’s a laudable element to this. People don’t like to feel that their personal and private photos are being pored over, even if “just” by a machine. But these local silos have, at least for the moment, made the feature less useful, because the analysis happens separately on every device running the new Photos. That means that even if all the photos on your iPhone have been scanned for faces, when you upgrade your Mac to Sierra, the Photos app there doesn’t benefit from any of the work your phone has already done—even if the two libraries contain exactly the same photos.
Not only does that seem remarkably inefficient, but it also invites inconsistencies. For example, I store my pictures in iCloud Photo Library: my MacBook Air running Sierra and my iPhone 7 running iOS 10 both have my entire 23,154-photo library synced. And yet, if I look at the People album in iOS 10, it identifies 12 people; my Mac’s People album has only 11. Moreover, the photo counts for each of those people differ significantly between the two devices. For example, my phone identified 523 pictures of me; my Mac, only 306. Those are some pretty disparate numbers, and a search for photos on one device is sure to return substantially different results than on the other. And if I add more photos to a person on one device, I have to repeat that work on every other device.
They don’t talk
Photos isn’t the only place that Apple could benefit from a more holistic approach to users’ data. Apple’s virtual assistant, Siri, also often seems condemned to live in a silo. After all, there’s Siri on your Mac, on your Apple TV, on your iPhone, on your iPad, on your Apple Watch…and though they sometimes seem to share information, much of the time they each seem to be operating in their own little vacuums.

It’s pretty frustrating, for example, when I correct the pronunciation of a friend’s name on my iPhone, and then have to make the same correction on my iPad and my Mac. It’d be great if picking up a different device and interacting with Siri there didn’t feel like you were talking to a totally different person.
What’s odd is that these are the exceptions to Apple’s general rule. The company knows its broad ecosystem is one of its biggest assets, and it usually takes full advantage of that fact. But for some reason, it’s drawn a line in the sand here—at least for now.
The happy medium is the message
I’m certainly not advocating that Apple ape Google’s business model—the two are very different companies, and what works for one often doesn’t work for the other. But I find it surprising that Apple didn’t ship a way for the results of its face analysis to be securely shared between your devices. After all, the company already stores plenty of other sensitive information in the cloud, whether it’s your credit card numbers, your contacts’ phone numbers and addresses, or, say, all of your Siri queries. If that technology is secure enough to keep such information private, why is this particular use case singled out?
In that, I think the conversation comes back to Google, and Cupertino’s interest in drawing a distinction between itself and its rival to the north. One topic Apple has spent some time talking about recently is differential privacy, in which Apple collects data to improve its machine-learning algorithms, but the data is transformed in such a way that no individual user can be tracked down from it. I wouldn’t be surprised to see that technology become a key argument down the road for why Apple can send your data—including photos—out for processing on a remote server while still maintaining ironclad privacy.
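To make that idea a little more concrete, here is a minimal sketch of randomized response, one of the classic building blocks behind differential privacy. This is purely illustrative and is not Apple's actual mechanism; the point is that each device adds random noise to its answer before anything leaves the device, so a server can estimate a trend across many users without ever learning any single user's true value.

```swift
import Foundation

/// Each device reports a "noisy" answer: half the time the truth,
/// half the time a coin flip, giving every user plausible deniability.
func randomizedResponse(truth: Bool) -> Bool {
    if Bool.random() {
        return truth        // heads: report the real answer
    }
    return Bool.random()    // tails: report a random answer instead
}

/// Server side: with this scheme, the expected fraction of "true" reports
/// is 0.25 + 0.5 * (true population rate), so the server can invert that
/// to estimate the real rate without knowing anyone's individual answer.
func estimateTrueRate(reports: [Bool]) -> Double {
    let reportedRate = Double(reports.filter { $0 }.count) / Double(reports.count)
    return (reportedRate - 0.25) / 0.5
}

// Example: 10,000 simulated users, 30 percent of whom really use a feature.
let trueAnswers = (0..<10_000).map { _ in Double.random(in: 0..<1) < 0.3 }
let noisyReports = trueAnswers.map(randomizedResponse)
print("Estimated rate:", estimateTrueRate(reports: noisyReports))
```

Apple's production system layers far more sophistication on top of this basic trick, but the estimate above lands close to 30 percent even though no single report can be trusted on its own, which is the property that makes this kind of collection palatable from a privacy standpoint.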
So it’s a decision that has its roots in technology, policy, and politics. There are valid arguments for all of those things—it’s just a shame that in the here and now it takes a potentially useful feature and turns it into, well, persona non grata.