The last two decades have seen a steady migration from analog to digital means of communicating and storing information. Following closely behind (sometimes not closely enough, as some have found to their peril) was the drive to keep that information secure. The threat of hacking and the desire to assure users of their privacy have made encrypting data, both at rest and in transit, standard practice. And, just as a bank can’t open a secure deposit box to which only you have the key, proper encryption means that even the companies providing and hosting services can’t access that data unless you authorize it.
But even the strongest safe or door will succumb to drill or explosives. Advances in cryptographic methods and increases in computing power have created encryption that cannot be reversed in a realistic timeframe. For the first time in history, people have a way of securing their communications quickly, automatically, and at will from any threat, be it hackers or government snoops.
Good for us — but for the FBI and the police, it’s a calamity. Where once they could pry open locked drawers to find incriminating letters, or force a company to reveal private records, now everything depends on the willingness of the owner to allow that information to be decrypted.
Since they can’t go through the front door, so to speak, they have asked repeatedly for a back door. But what exactly is a backdoor, and why should you care?
A unique threat
The concept of a backdoor is simple to state but not so simple to definitively pin down. Like the back door of a house, a crypto backdoor (generally written as a single word) is a way to circumvent the locks and protections of the main entrance in order to walk in unobstructed and make oneself comfortable. A backdoor could be in a phone, laptop, router, security camera — any device, really.
But backdoors are different from other means of bypassing traditional security. Security researcher Jonathan Zdziarski provides a useful framework for distinguishing a backdoor from a bug, exploit, or administrative access.
First, backdoors operate without the consent of the computer system’s owner. This excludes things like administrative access to employee emails, something people often consent to as part of a job, or Comcast maintaining a separate login for your router for troubleshooting purposes. But if Comcast adds another, secret login, that meets the standard.
Second, the actions performed by backdoors are at odds with the stated purpose of the system. Say a device claims to keep your messages safe; the manufacturer may have a way to install updates on it to keep it functional, which is perfectly compatible with its intended purpose. If, however, the device includes a way of accessing your messages without your knowledge, that’s counter to the intended purpose and qualifies.
Third, backdoors are under the control of undisclosed actors. Many viruses and worms operate more or less autonomously, harvesting information or spamming your contacts; unless a third party is directing their actions (as in ransomware or botnets), they don’t count as backdoors, since there’s no one to walk through the door.
Many Americans will have recently heard the word “backdoor” during the FBI’s high-profile dispute with Apple, which provides a useful example for defining the term. In the course of a terrorism investigation, the FBI tried to force Apple to create code that would unlock an iPhone at the request of law enforcement. Apple CEO Tim Cook wrote at the time, “The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
The FBI was asking Apple to build software that would operate without the consent of the device owner, decrypt a version of iOS that promised robust encryption, and remain under the secretive control of the FBI — meeting all three conditions set out above.
But backdoors are far from a new phenomenon, and don’t have to take the form of a piece of software installed in an otherwise unmodified device. One example of deeper integration dates back to 1992.
That year, under the direction of the NSA, the company Mykotronx made a dedicated chip for encrypting telephone communications on lines where secrecy and privacy were important, for example in R&D or at an embassy. This “Clipper Chip” was a replacement for an existing chip, and featured an important addition: a “Law Enforcement Access Field” into which a code could be entered to bypass the device’s encryption altogether.
The code, generated at manufacture, would be kept in strictest secrecy by federal agencies. Privacy advocates vociferously opposed this concept of “key escrow” carried out at the hardware level for several reasons, not least of which were the inability of the public to verify the security of this secret system or the process by which it would be employed.
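The escrow arrangement behind the Clipper Chip can be sketched in miniature. The snippet below is a toy illustration, not the actual Clipper design: it uses a simple XOR stand-in for real encryption purely to show the structure, in which the key protecting the data is itself stored in a form that a third party holding the escrow key can unwrap.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # Toy XOR "cipher" -- a stand-in for real encryption, for illustration only.
    return bytes(x ^ y for x, y in zip(a, b))

# The user's session key protects the actual message.
session_key = secrets.token_bytes(16)
message = b"meet at embassy."  # 16 bytes, matching the toy key length
ciphertext = xor_bytes(message, session_key)

# Key escrow: the session key itself is wrapped under an escrow key
# generated at manufacture and held by a third party.
escrow_key = secrets.token_bytes(16)
wrapped_session_key = xor_bytes(session_key, escrow_key)

# The user decrypts normally with their own session key...
assert xor_bytes(ciphertext, session_key) == message

# ...but whoever holds the escrow key can recover the session key from the
# wrapped copy, and with it the plaintext -- without the user's consent.
recovered_key = xor_bytes(wrapped_session_key, escrow_key)
assert xor_bytes(ciphertext, recovered_key) == message
```

The structural point is that the security of every conversation rests not on the user’s key but on the secrecy of the escrow key and the process governing its use, which is exactly what privacy advocates objected to.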
The Clipper Chip was scrapped, but the idea lives on, though perhaps not in quite such an obvious way. Various routers, wireless chips, and other components of transmission and storage devices have repeatedly been demonstrated to contain mechanisms by which the manufacturer, or of course any actor of its choosing, can gain access to the device.
It’s possible to put a backdoor even deeper. It was reported that the NSA paid $10 million to back an encryption standard that used a particular random number generation technique that the agency knew was flawed. Any product that used this standard would have been effectively backdoored by the NSA, at such a fundamental level that it would be very difficult to detect.
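The danger of a rigged random number generator can be sketched with a deliberately crippled PRNG. This is a toy, not the actual standard in question (which involved a flawed elliptic-curve construction): it simply shows that if an attacker knows or can predict the generator’s internal state, every “secret” key derived from it can be reproduced.

```python
import random

# Suppose a weakened RNG whose internal state (here, a seed) is known
# to the attacker. Hypothetical stand-in for a backdoored generator.
KNOWN_SEED = 1337

def generate_key(seed: int) -> bytes:
    # Derive a 16-byte "secret" key from the predictable generator.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))

# The victim generates what they believe is a secret key...
victim_key = generate_key(KNOWN_SEED)

# ...and the attacker, knowing the generator's state, derives the same bytes
# independently, defeating any encryption built on that key.
attacker_key = generate_key(KNOWN_SEED)
assert attacker_key == victim_key
```

Because the flaw lives below the level of any individual product, nothing about the product’s own code would look wrong, which is what makes this kind of backdoor so hard to detect.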
Trust, but verify
If you’re worried about the possibility that some gadget or app you use has a backdoor — well, you should be! But there is a silver lining: lots of very savvy people are worried about it too, and they have their eyes wide open.
When someone claims to have a secure protocol or service, they are invariably asked to make their methods public so independent investigators can look in the code for any flaws, whether inadvertent or deliberate. And standard encryption methods are strong and thorough enough now that hackers and spooks are redirecting their efforts. It’s far easier to get someone to click a shady link in a phishing email, or to trick a criminal into unlocking their phone, than to get a backdoor put into the system.
The threat of backdoors is still a clear and present one. Fortunately, an informed (and occasionally outraged) public is a powerful deterrent to their creation.