We Now Return You To Your Regularly Scheduled Cyberpunk Dystopia

Apple will lose this battle with the US government. Maybe not this year, or next, but soon enough, and for the rest of our lives. It is folly to pretend otherwise. Most ordinary people, and most powerful people, don’t care about abstruse theoretical arguments against back doors and weakened security. They care about—or want to exploit—the raw visceral fear of terrorist violence.

This isn’t a case about a single phone. Rather, as Amy Davidson says in the New Yorker:

the government is attempting to circumvent the constitutionally serious character of the many questions about encryption and privacy. It is demanding, in effect, that the courts build a back door to the back-door debate.

Or as Julian Sanchez puts it in Time,

If the FBI wins, it could open the door to massive surveillance … The high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.

Of course Apple should win. I admire what Tim Cook is doing immensely. But in the long run they will lose. If they’re victorious in the courts–IANAL and make no prediction–then Congress will come after them the day after the next major terrorist attack hits America. (And there will be another major terrorist attack on America, eventually. Sorry. It will probably involve drones.)

Anyone who thinks Congress won’t bludgeon the tech industry into compliance when that happens doesn’t remember what America was like from September 12, 2001, all the way through the catastrophic invasion of Iraq and beyond. Imagine if one of the men who brought down the World Trade Center had left a locked phone behind.

Even today, opinion polls make it very clear that most Americans favor intrusive government surveillance over privacy and encryption: “56 percent of Americans favor and 28 percent oppose the ability of the government to conduct surveillance on Internet communications without needing to get a warrant. That includes such surveillance on U.S. citizens,” to quote the AP. Again, that’s warrantless surveillance. Fulfilling a warrant to get information from a known terrorist’s phone? Forget about it.

(This goes for people who should know better, too. In the aftermath of the Tsarnaev brothers’ attack on Boston, Farhad Manjoo, now of the New York Times, wrote a piece entitled “We Need More Cameras, And We Need Them Now: The Case For Surveillance.” He went on: “Abuses and slippery-slope fears could be contained by regulations that circumscribe how the government can use footage obtained from security cameras.” Oh, regulations. Whew. Problem solved!)

The government knows all this. That’s almost certainly why they’ve chosen this as a test case. Not because the contents of the phone are likely to be valuable. On the contrary. Those contents are (very likely) unavailable only because a government employee reset the phone’s associated iCloud password after the attacks, foreclosing the possibility of a fresh iCloud backup. Even so, its call and text metadata have already been strip-mined, and found useless.

Rather, this appears to be part of a deliberate — and, as I’ve written before, completely pointless, futile, and self-destructive — strategy to undermine encryption. “This is one of the worst set of facts possible for Apple. That’s why the government picked this case,” to quote University of Miami professor Michael Froomkin. Bloomberg reports:

In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices.

As a result, Apple itself faces a paralyzing paradox:

Apple cannot simultaneously treat itself (compelled by governments) as a threat, maintain its iron-fisted control of all software that runs on iOS devices, and protect its users’ security. Pick any two: you can’t have all three. If Apple itself can be compelled to be the enemy, right down to the firmware, then only third-party software can secure iOS users … but Apple forbids sideloading, tries to prevent jailbreaking and bar custom firmware, and gatekeeps all third-party software, so it could conceivably be compelled to forbid–or corrupt–any third-party encryption tools.

“If Apple, the government, or anyone else has master access to your device, to a service, or communications, that is a security flaw,” to quote the Securosis blog. If you maintain absolute control over a platform, then you are a security risk, whether you like it or not.
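
To make the Securosis point concrete, here is a minimal sketch of a single-signer update check: a toy model in Python (using the third-party cryptography package), not Apple’s actual code-signing design. The device trusts one vendor public key, so a valid signature proves only possession of the signing key. Whoever holds that key, or can compel its holder, owns every device that trusts it.

```python
# Toy model of single-signer firmware updates. Illustrative only:
# real iOS code signing is far more elaborate, but the trust shape
# is the same. Requires the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The vendor's signing key; every device ships trusting its public half.
vendor_key = Ed25519PrivateKey.generate()
DEVICE_TRUST_ANCHOR = vendor_key.public_key()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """Install any firmware bearing a valid vendor signature.

    Note what the device cannot check: whether the vendor signed
    willingly, or under a court order.
    """
    try:
        DEVICE_TRUST_ANCHOR.verify(signature, firmware)
        return True
    except InvalidSignature:
        return False

honest_update = b"iOS 9.3: bug fixes and security improvements"
compelled_update = b"govtOS: retry limits and auto-erase disabled"

# The device accepts both equally. A master signing key is therefore
# a master key to the whole fleet -- exactly the "security flaw"
# Securosis describes.
for fw in (honest_update, compelled_update):
    assert device_accepts(fw, vendor_key.sign(fw))
```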

An excellent technical blog post by David Schuetz summarizes this problem:

What is true, though, is that once Apple has built the capability, it would be trivial to re-apply it to any future device, and they could quickly find themselves needing a team to unlock devices for law enforcement from all around the world … [and even if this attack is mitigated] a normal OS update to an unlocked phone can change this at any time, restoring the attack for future use.
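
Part of what makes even a “one phone” build so potent is simple arithmetic. The court order asks Apple to disable the ten-try auto-erase and the escalating retry delays, leaving only on-device key derivation (calibrated, per Apple’s iOS Security guide, to roughly 80 milliseconds per passcode guess) between a brute-forcer and the data. A back-of-the-envelope sketch, assuming purely sequential on-device guessing:

```python
# Back-of-the-envelope brute-force times once retry limits are gone.
# Assumption: ~80 ms per guess, the key-derivation cost cited in
# Apple's iOS Security guide for this hardware generation.
PER_GUESS_SECONDS = 0.080

def worst_case_hours(digits: int) -> float:
    """Hours to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_GUESS_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")

# 4 digits: ~0.2 hours (about 13 minutes). 6 digits: ~22 hours.
# With the ten-try erase intact, the same attack is hopeless; strip
# it once and, as Schuetz notes, any later OS update can strip it again.
```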

And so, as Nicholas Weaver puts it on Lawfare:

Let us assume that the FBI wins in court and gains this precedent. This does indeed solve the “going dark” problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say “push out an update to this target” … Almost immediately, the NSA is going to secretly request the same authority through the Foreign Intelligence Surveillance Court … How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority? … Every other foreign law enforcement and intelligence agency would demand the same access, pointing to the same precedent.

Hey, at least there’s one bright side here:

https://twitter.com/JZdziarski/status/700829160403505152

But, to quote Securosis again:

The FBI, DOJ, and others are debating if secure products and services should be legal. They hide this in language around warrants and lawful access, and scream about terrorists and child pornographers. What they don’t say, what they never admit, is that it is physically impossible to build in back doors for law enforcement without creating security vulnerabilities.

…Seems like a good time to drop in this old perennial.

The godfather of cyberpunk himself once said that people who feel safer with a gun than with guaranteed medical insurance don’t yet have a fully adult concept of scary.

I submit that a variant of the same applies here; that people who are more frightened of terrorism than of grinding bureaucratic authoritarian oppression have not yet developed a fully adult concept of scary either. But from a results-oriented point of view, that doesn’t matter. What matters is that we live in a world in which people respond to the most visceral threats, not the most dangerous ones.

That disastrously bad threat modeling is, in a nutshell, why Apple–and, by extension, the tech industry–will ultimately lose its battle, and ours, against government intrusion, surveillance, and compromised security. To protect ourselves, we will need better solutions: ones that do not centralize control in any collective, corporate, or government entity, no matter how well-intentioned that entity may be.