Microsoft is the latest multi-billion-dollar global corporation to weigh in on behalf of its users
with a new lawsuit that seeks to force the U.S. government to let it, and other companies that store data for customers, tell those customers when warrants have been issued for that data. Not in every case, mind you, but in vastly fewer cases than the government has gotten court approval to keep gagged, and typically for set, short periods rather than indefinitely.
Microsoft hasn’t always been known as a bastion of customer-oriented privacy, nor until relatively recently has it been praised for how it handles security. In contrast, Apple emerged several years ago as the flag bearer for privacy, revising software and hardware to back up its words, and other companies have either fallen into line behind Apple or fought separate battles. Facebook, in particular its WhatsApp division, has pursued a direction that bore fruit just last week with WhatsApp’s end-to-end encryption update.
When I started this column over a year and a half ago, I asked for my mandate to be security, encryption, and privacy. Those related elements aren’t directly comparable, but the mix of how companies offer each, and how one plays into the others, affects how much they know about us, and how much access unrelated parties, including attackers, can gain.
Teasing three terms apart
In my definition, security comprises both philosophy and implementation: how systems should be designed to limit access to only the parties who need it, whether inside an organization (such as administrators) or outside it (such as subscribers, members, or users), and, with that design in mind, how well it’s implemented.
The implementation involves a number of elements, such as two-factor authentication for logins and physical access controls to servers, as well as encryption. Encryption often gets the lion’s share of attention, because flaws in algorithms have widespread consequences. But it’s often a security flaw, a way through access controls, that lets attackers reach data, typically data that isn’t encrypted at rest or that passes back and forth to users unencrypted and can be captured.
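To make the two-factor piece concrete: most authenticator apps generate time-based one-time passwords (TOTP, standardized in RFC 6238), which can be sketched in a few lines of Python using only the standard library. This is a simplified illustration, not any particular company’s implementation:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestep=30, digits=6, now=None):
    """Compute an RFC 6238 time-based one-time password."""
    counter = int((time.time() if now is None else now) // timestep)
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Both the server and your phone hold the same secret; each computes the code independently from the current time, so nothing secret crosses the wire at login beyond the short-lived code itself.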
Privacy is effectively the control we can exercise over data we provide to or store with other parties. It can be an outgrowth of well-implemented security with proper use of encryption, but privacy is not an integral part of security, and lumping the two together provokes misunderstandings. Some companies are great about privacy, and terrible at using encryption to provide strong security. Others are fabulous at encryption, creating nearly impregnable walls against outsiders, but make seemingly free use of our private data for their own marketing and advertising ends.
If we look at Apple, Google, Microsoft, and Facebook, we can see how these distinctions play out.
Apple decided several years ago to distinguish itself from Google, as Google swallowed up ever more information about users to target advertising, among other purposes. (We can set aside, for now, the debate about Google’s tactics, business model, and its response to criticism.)
To further its strategy, Apple moved to restrict its access to many parts of its users’ information. It can’t decrypt a phone (without developing a custom OS). It encrypts iCloud Keychain items in such a way that it can’t decrypt them, even when they’re synced across its system. While it does offer iAd, you can opt out of the information gathering that tailors those ads to you. (And, frankly, iAd is small potatoes in the ad business, and not long for this world.) When it launched Newsstand and later Apple News, it made clear that publishers would, by default, get almost no information about subscribers.
But Apple has blind spots when it comes to encryption. It encrypts the synchronization of contacts, calendar entries, and other information across its iCloud service, but with the exception of Keychain entries, that information is stored in a way that Apple can access, and provide access to law enforcement. Apple could shift to a method used by other companies, including AgileBits with the cloud side of its 1Password ecosystem, in which data is always stored encrypted and client software (including Web apps) handles decryption locally. It could build this into iOS and OS X so that third-party apps could handle synced data seamlessly.
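The approach is simple to describe: the client derives a key from a passphrase, encrypts everything before it leaves the device, and uploads only ciphertext, so the service never holds anything it could decrypt or hand over. A toy Python sketch, where an HMAC-based keystream stands in for a real cipher such as AES-GCM, and all names and parameters are illustrative:

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase, salt):
    # Stretch the passphrase client-side; the server never sees it.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream(key, nonce, length):
    # Illustrative PRF-based stream cipher; real apps use AES-GCM or similar.
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_locally(passphrase, plaintext):
    salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
    key = derive_key(passphrase, salt)
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    # Only salt, nonce, and ciphertext are synced; the key never leaves the client.
    return salt, nonce, ciphertext

def decrypt_locally(passphrase, salt, nonce, ciphertext):
    key = derive_key(passphrase, salt)
    return bytes(a ^ b for a, b in zip(ciphertext, keystream(key, nonce, len(ciphertext))))
```

A warrant served on the cloud provider in this model yields only the salt, nonce, and ciphertext, which is exactly why the architecture matters to the privacy debate above.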
Apple is also behind in the methods it uses with some products to ensure total privacy. As recent research has uncovered, iMessage had a variety of vulnerabilities (now patched), but it also suffers from an outdated design: Apple hasn’t kept up with the best practices now understood to prevent outside parties from gaining access to messages and audio/video sessions.
Google has arguably superior security in some of its products and systems. For instance, it built forward secrecy into Google Talk years ago, a technique that destroys encryption keys after messages are sent, making it much more difficult, if not impossible, to unscramble old messages. It also rolled out two-factor authentication broadly and extensively a few years ago, in a way Apple is still catching up with.
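Forward secrecy can be as simple as a one-way key ratchet: each message key is derived from a chain key, the chain key is immediately advanced through a one-way function, and the old value is discarded, so compromising a device later can’t unlock earlier messages. A toy Python sketch of the idea (real protocols such as OTR and Signal’s layer Diffie-Hellman exchanges on top of this):

```python
import hashlib
import hmac

class Ratchet:
    """Toy hash ratchet: because advancing the chain key is one-way and the
    old value is overwritten, earlier message keys cannot be recovered."""

    def __init__(self, root_key):
        self._chain = root_key

    def next_message_key(self):
        # Derive this message's key from the current chain key...
        msg_key = hmac.new(self._chain, b"message", hashlib.sha256).digest()
        # ...then ratchet forward, destroying the old chain key.
        self._chain = hmac.new(self._chain, b"chain", hashlib.sha256).digest()
        return msg_key
```

Two parties seeded with the same root key derive the same sequence of message keys, yet an attacker who seizes the current chain key can only move forward, never backward.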
Given Google’s designs, it’s less likely that outside parties could gain access to its message system, messaging history, data centers, or encrypted connections of any kind. In many ways, large and small, Google has improved the overall security of Web communications and the integrity of the certificate authority system used to ensure that encrypted sessions can’t be subverted through man-in-the-middle attacks.
But Google’s obsession with examining user data for ads and other targeting means that it stores a lot of information in the cloud in a form to which it has direct access. With Apple, you can avoid using the cloud for sync, for instance, while Google’s cloud focus prevents that.
A great example? Google has offered encrypted Web connections to Gmail and Google Search for years, and has improved how it handles security over time, so you can spot whether someone is trying to hijack, or has hijacked, your email account. However, Google analyzes every single thing you do on its sites and every character you type into them.
There’s a conflict here. The battle over government access to decrypted communications will probably resolve with Google shifting to a stance more like Apple’s, improving privacy and security at the same time.
Microsoft and Facebook stepping forward
It’s not that Microsoft was a privacy invader; that kind of opprobrium is usually limited to a critique of Facebook. Rather, even this new post-Gates and post-Ballmer form of Microsoft hadn’t revealed itself as a particular advocate. The lawsuit it’s filed effectively on behalf of its customers is a strong step forward, especially given how many governments are customers of Microsoft for its operating system, applications, enterprise offerings, and cloud apps and services.
Microsoft’s reputation for security improved enormously following the disastrous years in which Windows XP let hackers run wild. But this privacy stance could change the way people regard the company. I saw a number of tweets after its lawsuit was announced with people joking about how they have a hard time praising Microsoft because of its past security inadequacies. The company’s suit is a combination of marketing and good intent, just as with Apple.
Facebook remains in a trickier position. The updates to WhatsApp have a positive impact worldwide on the protection of private thoughts, and Facebook in general has kept ratcheting up security options along with good reminders to its visitors to review settings. But the company remains problematic on privacy, trying to improve user numbers and revenue by making more and more of what we post less protected, including using photos of you in ads shown to your friends. A 2015 examination shows a steady and significant decline over Facebook’s history in how it defines what’s private in its user policies.
Even more so than Google, Facebook wants to keep everyone in the world out of your private affairs—except Facebook.
These companies and many others have to wrestle with balancing core businesses against how they protect our data, identity, software, and devices. Apple may have it easiest as a company that makes most of its revenue from high-margin hardware, because it needs the least amount of information about us to make that business thrive.
But the flourishing debate about how governments can or should force companies to release our private information is already shifting stances. The trend is toward more protection, especially as companies educate people about what’s at risk, letting all of us articulate more precisely what we want.