The War On Crypto Terror

Governments are scared of software. This month, the UK announced a law which will require “Google, Facebook and other internet giants” to “give British spies access to encrypted conversations”; the Commerce Department proposed to classify “intrusion software” as dual-use civilian/military technology; and the Justice Department claimed APIs should be copyrightable.

Everything old is new again. Remember the original Crypto Wars? Those were the days: when the US government categorized important software as dual-use munitions, and tried to impose government backdoors on all mobile communications.

But that was the nineties! Practically prehistory. (Although we’re still suffering from the hangover.) Whereas, in our bold and beautiful today, the US government is … categorizing important software as dual-use munitions, and trying to impose government backdoors on all mobile communications. See how far we’ve come?

It would be (relatively) nice if we could memorialize Crypto War I, win Crypto War II, and then settle into a nice relaxing Crypto Cold War fought by proxies. Alas, conflicts no longer work in that 20th-century way. Instead, the War On Crypto is like the War On Terror. Its target is a tactic, not an actual antagonist; so the war is endless, the enemy amorphous, and the victory conditions nonexistent. We have always been at war with Crypto.

…Even if our governments are often not entirely clear on what that means, exactly. Take the new “intrusion software” proposal from the Department of Commerce’s Bureau of Industry and Security (BIS). What exactly is “intrusion software”? So glad you asked:

Software “specially designed” or modified to avoid detection by “monitoring tools,” or to defeat “protective countermeasures,” of a computer or network-capable device, and performing any of the following:

(a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or

(b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.

They go on to define “monitoring tools” and “protective countermeasures,” while explicitly excluding hypervisors, debuggers, reverse-engineering tools, DRM software (which is darkly hilarious) and asset trackers — but that’s still an extremely broad specification.

It’s really not uncommon to defeat “protective countermeasures” and “monitoring tools” for benevolent purposes. Jailbreaking your phone leaps to mind. Or SSH tunneling. What’s more, what if those monitoring tools and protective countermeasures were installed on your computer by someone else, a manufacturer or a carrier, and you’d like them removed?

And that’s without even considering the legitimate work of benevolent security researchers worldwide, including those who claim the “bug bounties” that many major tech companies cheerfully offer for finding security holes, because doing so improves their security immensely. The original intent may have been noble, to limit the sale of surveillance systems to oppressive states, but as the EFF’s Nate Cardozo and Eva Galperin put it:

the rules [BIS] proposed this month are a disaster … an unworkably-broad set of controls … On its face, it appears that BIS has just proposed prohibiting the sharing of vulnerability research without a license … This is tremendously worrisome … the UK’s implementation does not attempt to control the export of exploits or “intrusion software” itself, while a plain reading of the BIS proposal seems to do just that. Similarly, the UK implementation doesn’t affect jailbreaking, fuzzing, or vulnerability reporting, while the BIS rules could be interpreted to include them …

An analogy: imagine that, while allegedly trying to crack down on murderous militias, the government also, in passing, outlawed your local Neighborhood Watch.

Meanwhile, as one tentacle of the government (theoretically) seeks to limit state surveillance, others are trying to broaden it enormously. Both the UK and US want Apple, Google, etc., to provide them with some flavor of “secure golden key” back door, so they can read encrypted messages on those platforms. (These are, of course, the same governments that have been quietly dragnetting their own citizens’ metadata for many years, unbeknownst to almost all, until Edward Snowden came along.)

All this in the name of a futile attempt to stuff the djinn of strong cryptography back into its lamp. Open-source strong crypto has long been widely available for free, from e.g. Open Whisper Systems (a Snowden favorite). Anyone who really wants to encrypt their communications, and has even a modicum of technical ability, can do so themselves without the auspices of Apple, Google, et al. It’s math. You can’t ban math, and you shouldn’t try.
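To make the “it’s math” point concrete: here is a toy one-time pad, the textbook cipher that is provably unbreakable when the key is truly random, used once, and as long as the message. This is an illustration of how little machinery strong encryption requires (a few lines of standard-library Python), not a protocol anyone should actually deploy; real tools like Signal use far more practical constructions.

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length."""
    key = secrets.token_bytes(len(plaintext))  # cryptographically random key
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

No export control or backdoor mandate can reach something this simple; the XOR trick has been public knowledge for a century.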

Compounding the recent technical sins of the US government, the Justice Department has come out swinging in favor of the notion that APIs should be copyrightable. This is less of a dread threat to civil rights, but somehow even more irritating than the War On Crypto, because it’s so obviously wrong. It is flagrantly apparent to anyone who has ever written software, such as the judge who originally ruled on the case, that an API is firmly on the “idea” side of copyright law’s “idea/expression” dichotomy. I’m also an author, and hence a big fan of copyright, but this is absurd.

In the past I would have written this all off as simple bureaucratic fear of a changing world so prone to anti-hacker moral panics that you can be sentenced to three years in jail for incrementing a URL, or life without parole for running an online marketplace. But not any more. Now I think something more interesting is happening.

I suspect these creeping attempts to restrict software are a recognition of its increasingly crucial importance to the world order. It’s illustrative that Canada’s terrifying new anti-terror legislation has expanded the definition of “activity that undermines the security of Canada” to include “interference with the global information infrastructure.”

It seems to me the powers that be are, collectively, increasingly coming to the conclusion that the state monopoly on the use of violence must be extended to the use of certain kinds of software (such as, say, 0-days) while the remit of crony-capitalism intellectual property must be extended as far as software APIs. Are they wrong? Morally, yes. But practically, from a self-preservation standpoint, they could well be quite right.