Apple CEO Tim Cook has confirmed that the company will appeal a California judge’s order to unlock an iPhone belonging to one of the terrorists involved in the San Bernardino shooting. Following the request, Cook argued, would “threaten the security of our customers.”
The device in question — an iPhone 5c — was in the possession of Syed Farook, who, alongside his wife, carried out a mass shooting during a training event at the San Bernardino County Department of Public Health, where he worked, last December. The phone was owned by the agency and assigned to Farook. He and his wife were later killed by police in a shootout.
Authorities want access to data on the phone and are seeking Apple’s help to crack the passcode by creating software which, when loaded onto the device, would circumvent its security measures. That’s because, beyond the passcode itself, Apple’s security measures include an ‘auto-erase function’ which, if enabled by the user, will erase all data on a device after the passcode is entered incorrectly 10 times.
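The interplay between the passcode check and the auto-erase setting can be illustrated with a minimal sketch. The function name, return values, and counter handling here are hypothetical, for illustration only, not Apple’s implementation:

```python
MAX_FAILED_ATTEMPTS = 10  # the threshold described above

def try_passcode(entered: str, correct: str, failed_so_far: int):
    """Toy model of an auto-erase policy: a correct guess resets the
    failure counter; the tenth consecutive failure triggers a wipe.
    Illustrative only -- not Apple's actual code."""
    if entered == correct:
        return "unlocked", 0
    failed_so_far += 1
    if failed_so_far >= MAX_FAILED_ATTEMPTS:
        return "erase-all-data", failed_so_far
    return "locked", failed_so_far
```

Because the wipe fires on the device itself, investigators cannot simply guess passcodes at will, which is why the FBI wants this check disabled in software.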
In a letter to Apple customers, Cook said Apple has provided “data that’s in our possession” but it will not develop a “backdoor” for its software:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Cook also criticized authorities for invoking the All Writs Act rather than seeking legislation from Congress to make the request, which he labeled “a dangerous precedent” that would seriously weaken Apple’s security protections:
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
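Cook’s point about brute-force speed is easy to quantify. Assuming roughly 80 ms per guess for the device’s key derivation (a commonly cited figure for iPhones of this era; treat the exact number as an assumption), a short numeric passcode falls quickly once the software-imposed delays and the erase limit are removed:

```python
def worst_case_hours(pin_digits: int, seconds_per_guess: float) -> float:
    """Worst-case time to exhaust every numeric passcode of a given length,
    assuming guesses can be submitted back-to-back with no lockout."""
    return (10 ** pin_digits) * seconds_per_guess / 3600.0

# Under the assumed 80 ms per guess, a 4-digit PIN can be exhausted in
# well under an hour, and a 6-digit PIN in roughly a day.
```

The escalating delays and the 10-attempt erase limit, not the size of the PIN’s keyspace, are what make guessing impractical, which is exactly why the order targets those features.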
There’s been plenty of comment over whether, in making this request, the FBI is essentially asking Apple to create a backdoor that it can reuse in similar cases. Opinion is divided, however. Techdirt argues that a backdoor is the ultimate goal of the order, while research organization Errata Security contends that this reading overstates what the order actually requires.
What is certain is that the case once again highlights the tension between securing user data and providing information to assist authorities. Cook has been forthright in his belief that Apple’s products and services must be encrypted.
In a speech at EPIC’s Champions of Freedom event in Washington last June, the Apple CEO said:
There’s another attack on our civil liberties that we see heating up every day — it’s the battle over encryption. Some in Washington are hoping to undermine the ability of ordinary citizens to encrypt their data.
We think this is incredibly dangerous. We’ve been offering encryption tools in our products for years, and we’re going to stay on that path. We think it’s a critical feature for our customers who want to keep their data secure. For years we’ve offered encryption services like iMessage and FaceTime because we believe the contents of your text messages and your video chats is none of our business.
This latest news illustrates the important role that technology companies play as gatekeepers of information in matters of national security and legal proceedings.
Update: The EFF has now said it will be supporting Apple’s appeal against the court order by submitting an amicus brief. “We are supporting Apple here because the government is doing more than simply asking for Apple’s assistance,” writes the EFF’s Kurt Opsahl today.
“For the first time, the government is requesting Apple write brand new code that eliminates key features of iPhone security — security features that protect us all. Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”

Turning to the question of technical feasibility, Dan Guido of security firm Trail of Bits has suggested that, in his opinion, Apple could comply with the court order to provide access to this specific iPhone 5c.
“I believe it is technically feasible for Apple to comply with all of the FBI’s requests in this case. On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI,” he writes.
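Guido’s mitigation, locking the custom firmware to a single device, can be sketched by binding the signed image to the phone’s unique chip identifier (ECID). Everything below, including the use of an HMAC over a shared key, is a simplified stand-in for Apple’s actual personalized-signing scheme:

```python
import hashlib
import hmac

def sign_for_device(image: bytes, ecid: str, signing_key: bytes) -> bytes:
    """Sign a firmware image together with one device's unique ID, so the
    signature only validates on that device (illustrative sketch only)."""
    return hmac.new(signing_key, image + ecid.encode(), hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, my_ecid: str, signing_key: bytes) -> bool:
    """A device recomputes the signature with its own ECID; an image
    personalized for another phone fails the check."""
    expected = sign_for_device(image, my_ecid, signing_key)
    return hmac.compare_digest(sig, expected)
```

In practice Apple’s boot chain verifies asymmetric signatures against Apple’s public key rather than sharing a symmetric key with devices; the HMAC here simply makes the device-binding idea concrete.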
However, Guido’s suggestion that a backdoor could be created for the specific iPhone 5c in the case has been disputed by former Apple employee John Kelley, who spent four years as an embedded security engineer at the company. In a series of tweets earlier today discussing the issue, Kelley makes the point that Apple could in fact also be forced to modify its Secure Enclave firmware, thereby backdooring the hardware security feature (i.e. the Secure Enclave) which more modern iPhones have but which the iPhone 5c in this case lacks.
So the suggestion is that Apple is indeed correct when it says that a government request to backdoor security in the case of a single iPhone comes with “no guarantee” that such a move could be limited to just one device. Because, if Kelley’s take is correct, there is no technical blocker at the firmware level to prevent Apple being forced to build a backdoor into more modern iPhones as well, which pair the Touch ID sensor with the Secure Enclave. Hence the necessity of a principled defense in the face of government agencies trying to use the law to compel Apple to hack its own security systems.
TechCrunch’s Natasha Lomas contributed to this report
Title updated for clarity.