Down is up: Security is unsafe
Get ready to find out once again that the thing you know is actually the opposite.
OK, the Macalope gets it. Questioning assumptions is part of what drives civilization forward. But for every time some forward-thinker questions whether the sun goes around the Earth, someone else questions whether guinea pigs are adorable little furry eggplants with legs.
They’re adorable little furry eggplants with legs! Just accept it! Sometimes your eyes don’t lie to you!
Writing for the New York Times, Jamil N. Jaffer and Daniel J. Rosenthal want to tell us “Why Apple’s Stand Against the F.B.I. Hurts Its Own Customers.”
Jaffer is a law professor and Rosenthal is a former director of counterterrorism for the Obama Administration. Sadly, when their Wonder Twin powers activate, they are unable to assume the form of someone who knows a darn thing about smartphone security. Their rings can apparently only turn them into administration mouthpieces.
Apple’s decision not to help in the Farook case was ultimately bad for the company and its customers.
You see, it would have been far better for Apple to have rolled over and simply created a system for the government to unlock any iPhone they wanted because hambilty jing-jam with the hogfloffer.
Apple has lost leverage in legal cases and the average iPhone user is significantly more vulnerable — both to government access and to criminal hacking — than if Apple had assisted the government in the first place.
The F.B.I. has already found a company able to access Mr. Farook’s phone without Apple’s assistance, presumably taking advantage of a vulnerability that Apple has either not yet identified or not yet patched.
Why does all this matter? Simple: The longer it takes Apple to patch this vulnerability (either because the government discloses it to Apple or because Apple figures it out on its own), the longer iPhone user data is at risk — both from the government and from criminals.
That’s true. However, the government didn’t invent this flaw; it was already there. And now Apple knows it’s there. The Macalope hopes they find it, because the release notes for that update have the potential to be epic. At any rate, a security flaw is something Apple can fix. There is no fix for a back door.
There’s also some interesting language there: “…the longer iPhone user data is at risk — both from the government and from criminals.” Are the esteemed wads of government cheese suggesting the F.B.I. might use this exploit to take a joy ride through users’ data? Because, if so, yeah, totes would have been better if they’d just had a back door.
Ultimately, the question is this: For lawful access to material important to terrorism investigations, would we rather trust Apple itself under the close supervision of the courts, or the F.B.I. and some private company that makes money selling cellphone hacks?
This is, of course, a false dichotomy. Even if Apple had made a version of iOS for unlocking phones like the government wanted, there’d be nothing to stop the F.B.I. from employing zero-day exploits to access iPhones outside of the judicial process. Obviously, since that’s exactly what it just did.
Ultimately this comes down to one question: Should Apple make it harder or easier for the government and others to gain access to iPhone data? Apple went with harder.