Apple responds to Senator Franken’s Face ID privacy concerns

Apple has now responded to a letter sent last month by U.S. Senator Al Franken, in which he asked the company to provide more information about the incoming Face ID authentication technology baked into its top-of-the-range iPhone X, due to go on sale early next month.

As we’ve previously reported, Face ID raises a range of security and privacy concerns because it encourages smartphone consumers to use a facial biometric for authenticating their identity — and specifically a sophisticated full three-dimensional model of their face.

And while the tech is limited to one flagship iPhone for now, with other new iPhones retaining the physical home button plus fingerprint Touch ID biometric combo that Apple launched in 2013, that’s likely to change in future.

After all, Touch ID arrived on a single flagship iPhone before migrating onto additional Apple hardware, including the iPad and Mac. So Face ID will surely also spread to other Apple devices in the coming years.

That means if you’re an iOS user it may be difficult to avoid the tech being baked into your devices. So the Senator is right to be asking questions on behalf of consumers. Even if most of what he’s asking has already been publicly addressed by Apple.

Last month Franken flagged what he dubbed “substantial questions” about how “Face ID will impact iPhone users’ privacy and security, and whether the technology will perform equally well on different groups of people”, asking Apple to offer “clarity to the millions of Americans who use your products”, to explain how it had weighed privacy and security issues pertaining to the tech itself, and to detail any additional steps taken to protect users.

Here’s the full list of 10 questions the Senator put to the company:

1. Apple has stated that all faceprint data will be stored locally on an individual’s device as opposed to being sent to the cloud.

a. Is it currently possible – either remotely or through physical access to the device – for either Apple or a third party to extract and obtain usable faceprint data from the iPhone X?

b. Is there any foreseeable reason why Apple would decide to begin storing such data remotely?

2. Apple has stated that it used more than one billion images in developing the Face ID algorithm. Where did these one billion face images come from?

3. What steps did Apple take to ensure its system was trained on a diverse set of faces, in terms of race, gender, and age? How is Apple protecting against racial, gender, or age bias in Face ID?

4. In the unveiling of the iPhone X, Apple made numerous assurances about the accuracy and sophistication of Face ID. Please describe again all the steps that Apple has taken to ensure that Face ID can distinguish an individual’s face from a photograph or mask, for example.

5. Apple has stated that it has no plans to allow any third party applications access to the Face ID system or its faceprint data. Can Apple assure its users that it will never share faceprint data, along with the tools or other information necessary to extract the data, with any commercial third party?

6. Can Apple confirm that it currently has no plans to use faceprint data for any purpose other than the operation of Face ID?

7. Should Apple eventually determine that there would be reason to either begin storing faceprint data remotely or use the data for a purpose other than the operation of Face ID, what steps will it take to ensure users are meaningfully informed and in control of their data?

8. In order for Face ID to function and unlock the device, is the facial recognition system “always on,” meaning does Face ID perpetually search for a face to recognize? If so:

a. Will Apple retain, even if only locally, the raw photos of faces that are used to unlock (or attempt to unlock) the device?

b. Will Apple retain, even if only locally, the faceprints of individuals other than the owner of the device?

9. What safeguards has Apple implemented to prevent the unlocking of the iPhone X when an individual other than the owner of the device holds it up to the owner’s face?

10. How will Apple respond to law enforcement requests to access Apple’s faceprint data or the Face ID system itself?

In its response letter, Apple first points the Senator to existing public info — noting it has published a Face ID security white paper and a Knowledge Base article to “explain how we protect our customers’ privacy and keep their data secure”. It adds that this “detailed information” provides answers to “all of the questions you raise”.

But also goes on to summarize how Face ID facial biometrics are stored, writing: “Face ID data, including mathematical representations of your face, is encrypted and only available to the Secure Enclave. This data never leaves the device. It is not sent to Apple, nor is it included in device backups. Face images captured during normal unlock operations aren’t saved, but are instead immediately discarded once the mathematical representation is calculated for comparison to the enrolled Face ID data.”

It further specifies in the letter that: “Face ID confirms attention by detecting the direction of your gaze, then uses neural networks for matching and anti-spoofing so you can unlock your phone with a glance.”

And reiterates its prior claim that the chance of a random person being able to unlock your phone because their face fooled Face ID is approximately 1 in 1,000,000 (vs 1 in 50,000 for the Touch ID tech). After five unsuccessful match attempts a passcode will be required to unlock the device, it further notes.

“Third-party apps can use system provided APIs to ask the user to authenticate using Face ID or a passcode, and apps that support Touch ID automatically support Face ID without any changes. When using Face ID, the app is notified only as to whether the authentication was successful; it cannot access Face ID or the data associated with the enrolled face,” it continues.
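For developers, the kind of system-provided API Apple is referring to is its LocalAuthentication framework, which hands the calling app nothing more than a pass/fail result. Below is a minimal Swift sketch of how an app would typically request a Face ID (or Touch ID) check; the function name, reason string and print statements are illustrative placeholders rather than anything specified in Apple’s letter.

```swift
import LocalAuthentication

func unlockWithBiometrics() {
    let context = LAContext()
    var availabilityError: NSError?

    // Check whether the device can perform biometric auth at all (Face ID on
    // iPhone X, Touch ID on older hardware). The same call covers both.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &availabilityError) else {
        print("Biometrics unavailable: \(String(describing: availabilityError))")
        return
    }

    // Ask the system to run the check. The reason string shown to the user is a placeholder.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your data") { success, error in
        if success {
            // All the app ever learns is that authentication succeeded; it gets
            // no access to face images or the enrolled mathematical representation.
            print("Authenticated")
        } else {
            // Failure or cancellation; the app can fall back to its own passcode flow.
            print("Authentication failed: \(String(describing: error))")
        }
    }
}
```

The design point Apple is making in the letter is visible in that callback: the app receives a boolean plus an optional error, while the biometric matching itself happens against data held in the Secure Enclave, which is also why existing Touch ID integrations carry over to Face ID without changes.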

On questions about the accessibility of Face ID technology, Apple writes: “The accessibility of the product to people of diverse races and ethnicities was very important to us. Face ID uses facial matching neural networks that we developed using over a billion images, including IR and depth images collected in studies conducted with the participants’ informed consent.”

The company had already made the “billion images” claim during its Face ID presentation last month, although it’s worth noting that it’s not saying — and has never said — it trained the neural networks on images of a billion different people.

Indeed, Apple goes on to tell the Senator that it relied on a “representative group of people” — though it does not confirm exactly how many individuals were involved, writing only that: “We worked with participants from around the world to include a representative group of people accounting for gender, age, ethnicity and other factors. We augmented the studies as needed to provide a high degree of accuracy for a diverse range of users.”

There’s obviously an element of commercial sensitivity here, in terms of Apple cloaking its development methods from competitors, so you can understand why it’s not disclosing more exact figures. But of course Face ID’s robustness in the face of diversity remains to be proven (or disproven) once iPhone X devices are out in the wild.

Apple also specifies that it has trained a neural network to “spot and resist spoofing” to defend against attempts to unlock the device with photos or masks, before concluding the letter with an offer to brief the Senator further if he has more questions.

Notably Apple hasn’t engaged with Senator Franken’s question about responding to law enforcement requests — although given that enrolled Face ID data is stored locally on a user’s device in the Secure Enclave as a mathematical model, the technical architecture of Face ID has been structured to ensure Apple never takes possession of the data — and couldn’t therefore hand over something it does not hold.

The fact Apple’s letter does not literally spell that out is likely down to the issue of law enforcement and data access being rather politically charged.

In his response to the letter, Senator Franken appears satisfied with the initial engagement, though he also says he intends to take the company up on its offer to be briefed in more detail.

“I appreciate Apple’s willingness to engage with my office on these issues, and I’m glad to see the steps that the company has taken to address consumer privacy and security concerns. I plan to follow up with Apple to find out more about how it plans to protect the data of customers who decide to use the latest generation of iPhone’s facial recognition technology,” he writes.

“As the top Democrat on the Privacy Subcommittee, I strongly believe that all Americans have a fundamental right to privacy,” he adds. “All the time, we learn about and actually experience new technologies and innovations that, just a few years back, were difficult to even imagine. While these developments are often great for families, businesses, and our economy, they also raise important questions about how we protect what I believe are among the most pressing issues facing consumers: privacy and security.”