Apple’s most important role: Keeping privacy at the forefront

The tools built to track COVID-19 need to be designed with consent and transparency in mind.


Last week, we learned that Apple and Google would be working together on a COVID-19 contact-tracing project, likely a key requirement for the post-lockdown world to come. It's surely the best news you'll ever receive about the creation of a tool that can be used to monitor the personal interactions of billions of individuals.

Yes, mass surveillance is creepy and terrifying when it isn’t saving lives. That’s why I’m glad that Apple is stepping up and putting its energy into this project—because sometimes it feels like Apple is in a unique position to steer our world away from the most negative possible uses of technology and toward a more private and transparent future.

Built with privacy in mind

The new COVID-19 tracking project seems to be rooted in technology that Apple rolled out last year as part of its revamped Find My service. It uses cryptography and randomized identifiers transmitted over low-energy Bluetooth beacons to create a system in which the network of existing Apple devices can locate any other Apple device, even if that device doesn't know where it is and isn't connected to the Internet.
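To make that concrete, here is a rough Swift sketch of how rotating Bluetooth identifiers can be derived from a secret that never leaves the device. It's illustrative only: the key name, the 10-minute rotation window, and the 16-byte payload size are assumptions for the example, not the actual Find My or Exposure Notification specification.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: the key name, the 10-minute rotation window,
// and the 16-byte payload size are assumptions, not the real Find My or
// Exposure Notification specification.

// Each device holds a secret key that never leaves the device.
let deviceKey = SymmetricKey(size: .bits256)

// Derive a short-lived identifier for the current 10-minute interval.
// Nearby devices see only random-looking values that can't be linked
// back to this device without the secret key.
func rollingIdentifier(at date: Date, using key: SymmetricKey) -> Data {
    let interval = UInt32(date.timeIntervalSince1970 / 600) // 10-minute window
    var message = Data("rolling-id".utf8)
    withUnsafeBytes(of: interval.bigEndian) { message.append(contentsOf: $0) }
    let mac = HMAC<SHA256>.authenticationCode(for: message, using: key)
    return Data(mac).prefix(16) // what would go into the Bluetooth beacon
}

let beaconPayload = rollingIdentifier(at: Date(), using: deviceKey)
print(beaconPayload.map { String(format: "%02x", $0) }.joined())
```

Because the identifier changes every few minutes and only the device knows the key, observers can collect beacon broadcasts without being able to follow any one person through them.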


Presumably this technology was designed for Apple’s rumored, but as yet unannounced, tracking-tile accessory—because such a device would transmit a beacon, but wouldn’t have access to GPS coordinates or a network connection. (But it’s great for lost iPads and MacBooks, too.)

Because Apple’s brand promise is all about privacy these days, Apple had to engineer the new Find My system so that it didn’t inadvertently become a mechanism for tracking its users. Once that kind of data exists, it’s relatively easy for a government entity to produce a legal order demanding that Apple disclose it.

Apple’s only recourse against that possibility is to design a system that simply can’t be used that way—not even by Apple itself.

Encryption warning signs

The stakes were made clear back in 2016, when the FBI demanded that Apple decrypt an iPhone used by the suspect in a mass shooting. Apple had added encryption and passcode protections to make the iPhone more secure, but the FBI wanted to compel Apple to break into the suspect’s iPhone and extract data from it.

In the end, Apple was able to provide the FBI with data that passed through Apple’s servers unencrypted—iCloud backups—but other data remained inaccessible. This was all by design. Apple has built many of its systems to be beyond its own reach, because the moment a “secure” channel has any insecure portion, it’s not secure anymore. And government and law-enforcement agencies are happy to go to court to compel even the most well-meaning companies to use that insecure portion to break so-called secure channels wide open.

The beauty of the design of many (but not all) of Apple’s products is that they’re built on public-key cryptography: if Apple doesn’t hold the private keys, it can’t decode the messages. That guarantee comes from the mathematics encryption is built on. Courts can compel companies and people to do what they want, but mathematics is immune to such demands.
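As a rough illustration of that principle, here is a minimal Swift sketch using CryptoKit, assuming a simple ephemeral key-agreement scheme rather than Apple’s actual messaging design: a message encrypted to a recipient’s public key is opaque to any relay that only ever sees ciphertext.

```swift
import CryptoKit
import Foundation

// Minimal sketch, assuming ephemeral Curve25519 key agreement plus
// ChaChaPoly -- not Apple's actual message design.
do {
    // The recipient's private key is generated on, and never leaves, the device.
    let recipientPrivate = Curve25519.KeyAgreement.PrivateKey()
    let recipientPublic = recipientPrivate.publicKey // safe to publish

    // The sender derives a one-time symmetric key from an ephemeral key pair
    // and the recipient's public key, then encrypts with it.
    let senderEphemeral = Curve25519.KeyAgreement.PrivateKey()
    let sendSecret = try senderEphemeral.sharedSecretFromKeyAgreement(with: recipientPublic)
    let sendKey = sendSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
    let sealed = try ChaChaPoly.seal(Data("see you at 6".utf8), using: sendKey)

    // A relay server sees only the ciphertext and the ephemeral public key.
    // Without the recipient's private key, the math gives it nothing to decode.
    let recvSecret = try recipientPrivate.sharedSecretFromKeyAgreement(
        with: senderEphemeral.publicKey)
    let recvKey = recvSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
    let plaintext = try ChaChaPoly.open(sealed, using: recvKey)
    print(String(decoding: plaintext, as: UTF8.self)) // "see you at 6"
} catch {
    print("crypto error: \(error)")
}
```

The relay in the middle can store, forward, or even hand over everything it sees; without the private key on the recipient’s device, the ciphertext stays ciphertext.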

Let Apple light the way

In this moment, when the interests of public health require that we build tools that track our movements and associations in order to quickly squash potential outbreaks of COVID-19, it’s vital that the tools we build can’t be easily perverted for other uses. Anyone who lived through the aftermath of 9/11 can remember that many decisions made rapidly after the attacks ended up being anything but temporary. Governments and law-enforcement agencies are always reluctant to give up any power they’re given.

That’s why I want Apple involved in building these surveillance tools today. It’s a company that’s committed to safeguarding its users’ privacy and asking permission before doing anything with user data—and it has spent years working on ways to build systems that deliver the benefits of high-tech tracking tools while protecting the people who use them.

This is not to say that this new COVID-19 contact tracing technology couldn’t be subverted in the future, or that governments that want to exert maximum control over the public couldn’t compel everyone to opt in and share their data. Totalitarian regimes are gonna do their thing. But by keeping privacy and transparency in the mix, and building those principles into the machinery itself, Apple can make it less likely that a tool built to re-open society will later be used to strip away our freedom, privacy, and other basic human rights.
