There’s some solid language here that is clearly designed to allay fears about the way that Apple protects user data in the wake of the celebrity nude hacking incidents.
We believe in telling you up front exactly what’s going to happen to your personal information and asking for your permission before you share it with us. And if you change your mind later, we make it easy to stop sharing with us. Every Apple product is designed around those principles. When we do ask to use your data, it’s to provide you with a better user experience.
We’re publishing this website to explain how we handle your personal information, what we do and don’t collect, and why. We’re going to make sure you get updates here about privacy at Apple at least once a year and whenever there are significant changes to our policies.
It’s a welcome statement, exactly the kind of thing I was talking about when I said Apple needs more transparency when it comes to security. Apple is known for clearly communicating its products’ benefits, so why not security?
The addition of a privacy page that will be continuously updated with current information about the way Apple handles user data is a great step forward.
Cook also takes the opportunity to plainly state (again) that Apple has never worked ‘with governments’ to create service backdoors.
Finally, I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will.
Apple also published a new guide for law enforcement in the US stating that it can no longer decrypt user devices. Previously, law enforcement could request that Apple decrypt a device in order to retrieve evidence. Apple says it has changed its encryption in iOS 8 so that this is no longer technically possible.
For all devices running iOS 8.0 and later versions, Apple will no longer be performing iOS data extractions as the data sought will be encrypted and Apple will not possess the encryption key.
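The mechanics behind that claim are worth unpacking: if the decryption key is derived from the user’s passcode entangled with a secret that never leaves the device, then no one without both the passcode and the physical device, Apple included, can reconstruct it. Here is a minimal Python sketch of that idea; the `derive_key` helper and the in-memory `DEVICE_UID` stand-in are hypothetical simplifications for illustration, not Apple’s actual implementation.

```python
import hashlib
import os

# Stand-in for the per-device hardware-fused UID key. On a real device this
# secret lives in hardware and is never readable by software or by Apple.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    """Stretch the passcode with the device secret as salt.

    Without both the passcode and this exact device's secret, the
    resulting key cannot be reproduced, so a third party holding only
    the encrypted data (or only the passcode) can decrypt nothing.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, 100_000
    )

key = derive_key("1234")
```

The salt doubles as the device-binding secret here, which is why the same passcode on a different device would yield a completely different key.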
I’m optimistic about Apple’s ability to follow this up with more action on the security front. I’d love to see a bug bounty program, either public or selective, that gave third-party researchers a strong conduit for communication and vulnerability reporting, for instance. Until Apple embraces external researchers, it will remain vulnerable as its products swell to include millions of lines of code and multiple surfaces for malicious entities to attack. More eyeballs on the problem are a good thing.
But the letter is a good signal, and it’s more forthcoming than many of Apple’s competitors. Cook doesn’t hesitate to point out, of course, that Apple’s business model allows it to take this ‘we don’t want your information’ stance.
Still, Apple is an enormous corporation, looking to protect its interests just like any other. So I don’t doubt that many will view this as a sort of damage control (and it is) to protect Apple’s reputation.
But I also personally know many Apple employees at many levels of the company from engineers to managers — and I can tell you, for what it’s worth — that this is what they really believe. They work at Apple because they make products that will affect the lives of millions and they believe (in general) in its ethos and ethics. Many certainly don’t (just) do it for the paycheck.
I don’t know whether these sentiments are shared among Apple’s executive staff — I don’t have brunch with Tim Cook or anything — but I would hope that they are.
Apple is a corporation, let’s not forget, so it pays to be watchful and to hold it to a high standard when it comes to security and privacy. There’s no need to make concessions to it or to ‘forgive’ it for mistakes. And the Snowden revelations have shown us that sometimes even the best intentions of a company or its executives aren’t enough to protect users, inside the US or not.
Still, better to have it out in the open, so kudos.