Can You Hear Me Now?

It’s been a pretty big week for tech + privacy, with Apple overhauling the privacy-related info it pushes out to users — sharpening its pro-privacy positioning as a marketing differentiator for its devices and services. And NSA whistleblower Edward Snowden has stepped into the public arena by joining Twitter as, well, himself — with the verified account status to prove it.

(Who knows if Snowden was lurking on the service under an assumed name prior to uncloaking as @Snowden. Someone has probably DMed him to ask but he clearly has a big backlog of messages to get through…)

On the surface the two events may not seem closely related, but pro-privacy moves by mainstream tech giants can absolutely be traced back to Snowden’s 2013 revelations about the extent of government intelligence agencies’ dragnet surveillance of the online sphere.

Snowden’s big reveal crystallized all those vague yet disconcerting digital sensations that came before it — the feeling of being tracked from web service to web service, stalked by online ads, the nagging questions about why a simple service needed so much personal data — into concrete certainty about the systematic scope and scale of an industrial surveillance complex with its fingers in all of the mainstream consumer tech platforms. And a private sector user-stalking operation in the digital business sphere to match.

The thing with such gigantic secrets is, once revealed, there’s no way they can slink back into the shadows.

It’s no surprise then that Apple’s new privacy pages have an entire section on government information requests — in which the company states categorically:

[Image: Apple’s privacy statement on government information requests]

Such public declarations are absolutely progress. While we cannot know for sure that Apple’s hardware and software lack government backdoors — these are hermetically sealed proprietary products that don’t allow an open source route for third-party audits — the company is on the public record with an anti-backdoors statement, and so has chained its corporate reputation to the digital privacy rights cause.

Apple is also making some very clear privacy commitments to its users. This is also progress.

Its privacy page states:

At Apple, your trust means everything to us. That’s why we respect your privacy and protect it with strong encryption, plus strict policies that govern how all data is handled.

Security and privacy are fundamental to the design of all our hardware, software, and services, including iCloud and new services like Apple Pay. And we continue to make improvements. Two-step verification, which we encourage all our customers to use, in addition to protecting your Apple ID account information, now also protects all of the data you store and keep up to date with iCloud.

We believe in telling you up front exactly what’s going to happen to your personal information and asking for your permission before you share it with us. And if you change your mind later, we make it easy to stop sharing with us. Every Apple product is designed around those principles. When we do ask to use your data, it’s to provide you with a better user experience.

That’s not to say that Apple’s services don’t have insecurities — pretty much any software of the modern era contains bugs and flaws that can lead to exploits and data leaks. (Remember last September’s iCloud hack?)

But the point is one of principle. Apple is taking a pro-privacy stance, which stands in stark contrast to much of the consumer tech industry’s wonted ways in recent times — where overreaching T&Cs and vaguely worded privacy policies have all too often required users to sign away any expectations of privacy for the ‘privilege’ of using a certain service (even, in some cases, when they’ve paid for the service in question — so this is not just a case of privacy being the ‘price’ of using a free service).

Apple taking a robust pro-privacy stance sets a new privacy benchmark and puts pressure on those tech business models that have been built on mining personal data in the digital shadows. Of which there are, of course, many. But perhaps things are set to change on that front. Such a high profile company shining a disinfecting spotlight on the value of personal data makes companies with less clearly worded privacy commitments seem a whole lot murkier — even if they’re not actually doing anything too outlandish with the data they gather. And when there is enough pressure, well, some pretty unexciting base materials can transform into something valuable.

Apple choosing to champion privacy is a marketing strategy that’s both timely and savvy. It aligns, of course, with the company’s premium hardware business model. And it allows Apple to put clear blue water between how it operates and its main, ad-powered competitors’ big data mining operations. It also puts the company on Snowden’s side of the fence: on a principled, public stage, championing the rights of online users not to have their every action data-mined for profit — or fed into a Kafka-esque government surveillance apparatus on a ceaseless and hopeless quest for crime-preventing omniscience (Minority Report was fiction, yo).

And while Apple’s own privacy practices should still absolutely be scrutinized — yes, it’s great that they obfuscate your mapping data so they don’t have an absolute view of your start and end points, but why are they retaining user maps data for two years? — they are effectively asking all of us to ask questions about how they operate and what they do with our data. To continually hold them to their apparently high standards. And yes, that is progress. Because it applies industry-wide pressure and works to counter the pro-surveillance narrative that claims users don’t care about privacy anyway. Bottom line: plenty of users do care — and certainly they do when you inform them exactly how much invasive snooping is going on. As Snowden has said, we need to have the debate about what’s acceptable and what’s not — and the simple fact is you can’t do that without being fully apprised of the facts.

A more cynical view of Apple’s stance might be that it’s using privacy as a shield against a relative competitive weakness vs the kinds of big data-powered services that companies with a greater overview of their users are able to launch. Google, for instance, has been using data mined from usage of multiple Google services to power its predictive Google Now feature for several years, touting the convenience of notifications that really know your habits and patterns (because, well, Google reads your emails, knows what’s in your calendar, looks at who’s in your photos, and so on…). With the rise of wearables and a growing Internet of Things, more and more personal data-points can be added to such systems to power apparently more powerful predictions. And yet there’s a gigantic trade-off in privacy. The best personal assistant in the world would literally be a mind-reader — but who would actually want to employ such a person? What cost incremental convenience?

Meanwhile, at its developer conference this summer, Apple debuted an update to its Siri voice assistant — called Proactive — which also aims to surface some Google Now-ish predictive smarts. So Apple, too, is moving towards joining more dots about its users’ lives. However, its version of this predictive assistant builds in a privacy check and balance by doing only local, on-device processing — meaning it’s not sucking your personal data into the data-mining cloud to power the feature. So the user gets incremental convenience without an eye-wateringly costly privacy price-tag.

These sorts of pro-privacy, data-obfuscating approaches perhaps take more engineering effort to develop, so they might be slower to bring to market. They might also be less compelling from a user point of view if they can’t be quite so pinpoint accurate — given they are likely working with a more partial view of the user, rather than nosing through your emails. But if users understand the value of their privacy, they will also understand the value of a personalized service that does not require them to strip entirely naked in order to use it. Apple is betting that tech users will — at the end of the day — prefer to keep their clothes on.
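To make the idea of data obfuscation a little more concrete, here’s a minimal, purely hypothetical sketch of the general technique — coarsening location data on the device before it is sent anywhere, so the service trades pinpoint accuracy for privacy. The function name and parameters are invented for illustration; this does not describe Apple’s (or anyone’s) actual implementation:

```python
import random

def obfuscate_location(lat, lon, grid=0.01, jitter=0.005):
    """Coarsen a GPS fix on-device before it ever leaves the phone.

    Snaps coordinates to a coarse grid (roughly 1 km at these
    latitudes) and adds random jitter, so a server never sees the
    exact start or end point of a trip. Purely illustrative.
    """
    snapped_lat = round(lat / grid) * grid
    snapped_lon = round(lon / grid) * grid
    return (snapped_lat + random.uniform(-jitter, jitter),
            snapped_lon + random.uniform(-jitter, jitter))

# Only the coarsened fix would leave the device:
exact = (37.33182, -122.03118)
coarse = obfuscate_location(*exact)
```

The trade-off described above falls straight out of the parameters: a larger grid means more privacy but blunter, less pinpoint-accurate personalization.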

Another thing to note here is that data protection laws vary by region. Failure to gain proper consent for how user data is processed is a recurring theme among U.S. tech giants doing business in Europe. Facebook and Google have both faced legal challenges in the region over such privacy issues. And the European Parliament is in the midst of reworking the bloc’s data protection rules — with larger penalties for privacy infringements likely coming down the pipe. That might well be another trigger pushing tech companies to clean up murky privacy practices. Lurking in the shadows to evade scrutiny no longer looks a viable strategy in the post-Snowden tech world.

Another important development triggered by the Snowden revelations is also coming to a head next week. On Tuesday Europe’s top court, the ECJ, will rule on whether the ‘Safe Harbor’ agreement that governs data sharing between Europe and the U.S. affords Europeans enough privacy protections — with the possibility that the court might invalidate the current agreement. U.S. tech companies offering consumers services in Europe but processing user data back in the U.S. rely on this agreement for continued operation of their businesses.

The agreement has, in any case, been in de facto crisis ever since Snowden revealed the extent of dragnet government surveillance programs — since the NSA was shown to be hoovering up data from consumer services that were apparently signed up to the privacy covenant of Safe Harbor. How could Europeans’ personal data shipped across the pond still be considered ‘safe’ in an era of systematic mass surveillance by the U.S. government?

European privacy campaigner Max Schrems has led a legal challenge on this front, challenging multiple U.S. tech giants in the Irish courts for sharing data with the NSA — which referred the case to the ECJ, with a decision now imminent. At the same time, the European Commission is continuing to review the Safe Harbor agreement with a view to updating the framework given the ugly fact of mass surveillance. How exactly it will do that remains to be seen. But the ECJ ruling may overtake the politicians, in any case.

In an influential opinion written earlier this month by the top advisor to the ECJ, ahead of the court’s final decision next week, advocate general Yves Bot argued that U.S. mass surveillance has indeed invalidated the Safe Harbor agreement. It’s not clear how the court will rule, but it typically leans towards following the AG’s opinion — so at the very least these are interesting times for data privacy. Some big implications for how cloud-based tech businesses operate are in the process of being determined.

One thing is amply clear: the privacy debate is here to stay. And for that we must thank Mr Snowden. Looking ahead, a digital era where users understand the value of personal data and where tech businesses compete to protect — not exploit — privacy sounds pretty exciting to me. That’s the dream.

Yes, Mr Snowden, we hear you.