There is something about encryption that brings out the worst in journalists. Because to most of them it is magic, they are always searching desperately for the proverbial man behind the curtain, without knowing what to look for. Which may explain The Guardian’s recent bizarre attack on WhatsApp, which they accused, wrongly, of having a “backdoor.” And the security community erupted in rage.
To understand this story, why the Guardian was and is wrong, why they were forced to walk back their original “backdoor” headline, and why the security community is furious, you’ll need a little context. Sit down, my pretties, and let me tell you a little infosec fable:
Once upon a time there was PGP, which stands for Pretty Good Privacy, and it was good and strong. So good and strong that after its creator, Phil Zimmermann, released its source code 25 years ago, the US government opened a criminal investigation against him for arms trafficking. (The case was later dropped without indictment.)
For twenty years PGP was the gold standard of secure messaging. The NSA could not break it. Edward Snowden used it. But it had serious flaws. For one, it lacked forward secrecy; if your key was compromised, so was every message it had ever encrypted. For another, key exchange was, and remains, challenging at best.
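The forward-secrecy flaw is worth making concrete. PGP encrypts everything to one long-lived key, so stealing that key unlocks the entire archive. Ratcheting designs instead derive a fresh key per message from a one-way hash chain and delete the old state, so a later compromise cannot be rolled backwards. Here is a minimal illustrative sketch of a symmetric hash ratchet in Python (a deliberately simplified toy, not the actual Signal construction, which also mixes in fresh Diffie-Hellman output):

```python
import hashlib

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key and the next chain key.

    SHA-256 is one-way: once the old chain key is deleted after use,
    a later compromise of the current state cannot recover earlier
    message keys. That property is forward secrecy.
    """
    message_key = hashlib.sha256(chain_key + b"\x01").digest()
    next_chain_key = hashlib.sha256(chain_key + b"\x02").digest()
    return message_key, next_chain_key

# Simulate three messages: each gets a fresh key, then old state is discarded.
ck = b"initial shared secret"
keys = []
for _ in range(3):
    mk, ck = ratchet(ck)
    keys.append(mk)

assert len(set(keys)) == 3  # every message was encrypted under a distinct key
```

A PGP-style scheme is the degenerate case where the "chain" never advances: one key, forever, for everything.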
But the worst thing about PGP, by far, is that it is fiendishly user-hostile, so only hardcore hackers ever really used it. (The Snowden revelations were delayed by a month because he couldn’t find a way to contact Glenn Greenwald securely.)
Just as the best workout routine is not the Rock’s but, rather, one that you will actually stick to, the most secure messaging system is the one that you will actually use. Whether we like it or not, usability is an essential aspect of security. Any “secure” system that pretends otherwise will fail through disuse.
Enter Signal, a mobile (and Chrome plug-in) secure messaging system. It is fast, slick, sexy, cross-platform, and battle-tested. It implements highly secure end-to-end messaging with a “ratchet” protocol which provides perfect forward secrecy. It is the choice of technically sophisticated, security-conscious people around the world. It is not perfect. No system is perfect. Every system requires compromises. But Signal is the best available alternative.
However, most of the world does not use Signal. Most of the world uses SMS, Facebook Messenger, and, especially, WhatsApp — which, until recently, was much less secure. So the roll-out of the Signal protocol to WhatsApp, which commenced two years ago, was met with rejoicing. However, even though it used the same protocol as Signal, the implementation was different. It’s that difference which the Guardian, strangely and wrongly, called a “back door.”
For the grotty details see “A Trade-Off In Whatsapp Is Called A Backdoor” by the EFF, “There Is No Whatsapp Backdoor” by Signal head honcho Moxie Marlinspike, “WhatsApp Security Vulnerability” by Bruce Schneier, and “On the ‘WhatsApp backdoor’, Trade-Offs and Opportunistic Authentication,” by Frederic Jacobs. (edit: a previous link here was replaced because of a consensus that its characterization of Wire, a completely different private-messaging app, was unfair/inaccurate.)
The essential problem is that when the person you’re talking to gets a new phone, or re-installs the app, there’s no way to be immediately sure that the new installation is them. In theory, you should communicate with them over a different medium to verify they aren’t someone else pretending to be them; in a perfect world, you would use the tools Signal and WhatsApp provide to be mathematically certain of this. In practice, though, essentially nobody does this.
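The verification both apps offer boils down to comparing a short digest of the identity keys over a separate channel, in person or on a call, and checking that the two sides see the same value. A hedged illustrative sketch in Python (real safety numbers are computed with an iterated hash over both parties' keys and identifiers; this toy just hashes one key):

```python
import hashlib

def fingerprint(identity_key: bytes) -> str:
    """Render a public identity key as a short, human-comparable string.

    Illustrative only: actual Signal/WhatsApp safety numbers use a
    different, iterated construction, but the principle is the same.
    """
    digest = hashlib.sha256(identity_key).hexdigest()
    # Six four-character groups are easy to read aloud and compare.
    return " ".join(digest[i:i + 4] for i in range(0, 24, 4))

# Alice reads her view of Bob's key aloud; Bob checks it against his own.
alice_view_of_bob = fingerprint(b"bob-public-key-bytes")
bob_own_fingerprint = fingerprint(b"bob-public-key-bytes")
assert alice_view_of_bob == bob_own_fingerprint  # identity confirmed
```

If the strings differ, someone (or some re-install) sits between you. The point of the article stands: the mechanism exists in both apps, and almost nobody uses it.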
Signal, which was built for technically sophisticated users, refuses to send any new messages to a person whose identity seems to have changed, until and unless you explicitly tell it to do so. WhatsApp, which had an install base of roughly a billion users when it rolled out the Signal protocol, the vast majority of them anything but technically sophisticated, decided that blocking delivery would confuse its users and cause conversations to be lost. It judged that continuing to deliver messages was more important than making users explicitly confirm their security.
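The whole controversy reduces to a single policy flag: what to do with a queued message when the recipient's identity key changes. A minimal sketch of that decision, under the assumption that we model each conversation as a session holding the last known key (this is illustrative pseudologic, not either app's real code):

```python
from dataclasses import dataclass

@dataclass
class Session:
    identity_key: bytes  # last known public identity key of the contact

def deliver(session: Session, current_key: bytes, message: str,
            block_on_key_change: bool) -> str:
    """Sketch of the policy difference between the two implementations.

    block_on_key_change=True  ~ Signal: hold the message until the
                                 user explicitly re-verifies the contact.
    block_on_key_change=False ~ WhatsApp: accept the new key, deliver
                                 anyway, and (optionally) show a notice.
    """
    if current_key != session.identity_key:
        if block_on_key_change:
            return "HELD: identity key changed; verify before sending"
        session.identity_key = current_key  # trust the new key, keep talking
        return f"DELIVERED (key-change notice shown): {message}"
    return f"DELIVERED: {message}"
```

Neither branch is a backdoor; they are two answers to the same trade-off between safety-by-default and delivery-by-default.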
Whether they were right to do so is a thing about which reasonable people can disagree. Again, all messaging systems involve security compromises; and all messaging systems require that you trust somebody, sometimes. The Guardian was my newspaper of choice when I lived in the UK, and I’ve written for them myself, but it is deeply irresponsible journalism to suggest that a complex compromise with which some people disagree is a “back door” or a profound concealed vulnerability.
On one hand, WhatsApp’s implementation of the Signal protocol is less secure than Signal’s implementation. On the other, it is far more secure than their previous system — and the only entity able to use this vulnerability to hack WhatsApp messages is WhatsApp itself, or an intruder who compromises WhatsApp’s systems. Furthermore, as Schneier points out, “it’s an attack against current and future messages, and not something that would allow the government to reach into the past. In that way, it is no more troubling than the government hacking your mobile phone and reading your WhatsApp conversations that way.”
More to the point, though, WhatsApp’s users already have to trust WhatsApp. For all they actually, verifiably know, the app isn’t implementing the Signal Protocol at all. They also have to trust Apple, Google, or whoever they downloaded the app from. They have to trust that no malware on their phone is registering their keytaps and taking surreptitious screenshots. They have to trust that the operating system provides the entropy the encryption algorithms need. You always have to trust somebody. It’s inevitable. Even if you compile PGP from scratch, you can’t go over its code line-by-line to be certain it’s secure — and even if you did, what about the kernel? What about the compiler?
Real security design is about navigating the compromises between usability and security, determining the sophistication and threat model of your users, deciding who you have to trust and who you can’t afford to. Signal makes compromises too — in particular, its use of your phone number. Security design is a complex and ambiguous task not made any easier by ignorant gotcha journalism that can’t distinguish between a disputable compromise and a “backdoor.”
This is not an abstruse, theoretical issue: this hurts and endangers real people, en masse. Saying “Switch to Signal” ignores the fact that most people’s contacts won’t do so, so their de facto choice, if they need to communicate, is between WhatsApp and SMS — and if you frighten them off the former, you scare them into the incredibly vulnerable arms of the latter. Those at the Guardian responsible for this ugly mess have much to answer for. You don’t need to take my word for it — but you should take the word of this who’s who of the security world.