UK privacy and security experts warn over coronavirus app mission creep

A number of UK computer security and privacy experts have signed an open letter raising transparency and mission creep concerns about the national approach to developing a coronavirus contacts tracing app.

The letter, signed by 177 academics, follows a similar letter earlier this month signed by around 300 academics from across the world, who urged caution over the use of such tech tools and called for governments that choose to deploy digital contacts tracing to use privacy-preserving techniques and systems.

“We urge that the health benefits of a digital solution be analysed in depth by specialists from all relevant academic disciplines, and sufficiently proven to be of value to justify the dangers involved,” the UK academics write now, directing their attention at NHSX, the digital arm of the National Health Service, which has been working on building a digital contacts tracing app since early March.

“It has been reported that NHSX is discussing an approach which records centrally the de-anonymised ID of someone who is infected and also the IDs of all those with whom the infected person has been in contact. This facility would enable (via mission creep) a form of surveillance.”

Yesterday the NHSX’s CEO, Matthew Gould, gave evidence to the UK parliament’s Science and Technology Committee. He defended the approach the unit is taking, claiming the forthcoming app uses only “a measure of centralization” and arguing that it’s a “false dichotomy” to say decentralized is privacy-secure and centralized isn’t.

He went on to describe a couple of scenarios he suggested show why centralizing the data is necessary in the NHSX’s view. But in the letter the UK academics cast doubt on the validity of the central claim, writing that “we have seen conflicting advice from different groups about how much data the public health teams need”.

“We hold that the usual data protection principles should apply: collect the minimum data necessary to achieve the objective of the application,” they continue. “We hold it is vital that if you are to build the necessary trust in the application the level of data being collected is justified publicly by the public health teams demonstrating why this is truly necessary rather than simply the easiest way, or a ‘nice to have’, given the dangers involved and invasive nature of the technology.”

Europe has seen fierce debate in recent weeks over the choice of app architecture for government-backed coronavirus contacts tracing apps, with rival coalitions forming to back decentralized and centralized approaches, and some governments pressuring Apple over its backing of the decentralized model in the cross-platform API for national coronavirus contacts tracing apps it’s developing with Android-maker Google.

Most of the national apps in the works in the region are being designed to use Bluetooth proximity as a proxy for calculating infection risk, with smartphone users’ devices swapping pseudonymized identifiers when near each other. However, privacy experts are concerned that centralized stores of IDs risk creating systems of state surveillance, as the data could be re-identified by the authority controlling the server.
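To make those mechanics concrete, here is a minimal illustrative sketch in Python of how rotating pseudonymized identifiers might be generated, exchanged and logged. The names, rotation interval and data structures are assumptions chosen for clarity; this is not the NHSX design or any specific national protocol.

```python
# Hypothetical sketch of Bluetooth-style proximity logging with rotating
# pseudonymous identifiers. Not the NHSX protocol; all names are illustrative.
import os
import time

ROTATION_SECONDS = 15 * 60  # assumed identifier rotation interval

def new_ephemeral_id() -> bytes:
    """Generate a fresh random identifier that cannot, on its own, be linked to a user."""
    return os.urandom(16)

class Device:
    def __init__(self) -> None:
        self.current_id = new_ephemeral_id()
        self.last_rotation = time.time()
        self.contact_log = []  # (observed_id, timestamp) pairs kept on the phone

    def broadcast_id(self) -> bytes:
        """Rotate the identifier periodically so passive observers cannot track the device."""
        if time.time() - self.last_rotation > ROTATION_SECONDS:
            self.current_id = new_ephemeral_id()
            self.last_rotation = time.time()
        return self.current_id

    def record_contact(self, observed_id: bytes) -> None:
        """Store identifiers seen nearby. In a centralized design these logs are uploaded
        to a government-run server, which is what raises re-identification concerns."""
        self.contact_log.append((observed_id, time.time()))

# Two phones near each other swap their current identifiers (Bluetooth exchange simulated).
alice, bob = Device(), Device()
alice.record_contact(bob.broadcast_id())
bob.record_contact(alice.broadcast_id())
```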

Alternative decentralized systems have been proposed that take a p2p approach, with IDs stored locally on the device. Infection risk is also calculated on device, with a relay server used only to push notifications out to devices, meaning social graph data is not systematically exposed.

This structure does, however, require the IDs of people who have been confirmed infected to be broadcast to other devices, meaning there is potential for interception and re-identification attacks at a local level.
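Continuing the hypothetical sketch above, the decentralized matching step might look something like the following, with the relay server publishing only the identifiers of confirmed cases and each phone checking them against its own locally held contact log. The function names and the simple risk rule are assumptions for illustration, not the design of any national app.

```python
# Hypothetical sketch of decentralized exposure matching; names are illustrative only.
from typing import Iterable, List, Tuple

def download_infected_ids(published_ids: Iterable[bytes]) -> set:
    """The only data a phone fetches from the relay server: identifiers of confirmed cases."""
    return set(published_ids)

def exposure_detected(local_contact_log: List[Tuple[bytes, float]],
                      infected_ids: set,
                      min_encounters: int = 1) -> bool:
    """Risk is computed on the device itself, so the social graph never leaves the phone."""
    matches = sum(1 for observed_id, _ in local_contact_log if observed_id in infected_ids)
    return matches >= min_encounters

# Example: an identifier this phone logged earlier turns up in the published list.
infected = download_infected_ids([b"id-from-confirmed-case"])
my_log = [(b"id-from-confirmed-case", 1588000000.0), (b"some-other-id", 1588000300.0)]
print(exposure_detected(my_log, infected))  # True -> the app would notify the user locally
```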

At this stage it’s fair to say that the momentum in Europe is behind decentralized approaches for the national contacts tracing apps. Notably, Germany’s government switched earlier this week from previously backing a centralized approach to a decentralized one, joining a number of others (including Estonia, Spain and Switzerland), which leaves France and the UK as the highest-profile backers of centralized systems for now.

France is also seeing expert debate over the issue. Earlier this week a number of French academics signed a letter raising concerns about both centralized and decentralized architectures — arguing that “there should be important evidence in order to justify the risks incurred” of using any such tracking tools.

In the UK, key concerns being attached to the NHSX app are not only the risk of social graph data being centralized and re-identified by the state, but also scope/function creep.

Gould said yesterday that the app will iterate, adding that future versions could ask people to voluntarily give up more data such as their location. And while the NHSX has said use of the app will be voluntary, if multiple functions get baked in that could raise questions over the quality of the consent and whether mission creep is being used as a lever to enforce public uptake.

Another concern is that a public-facing branch of the UK’s signals intelligence agency, GCHQ, has also been involved in advising on the app architecture. And yesterday Gould dodged the committee’s direct questions on whether that branch, the National Cyber Security Centre (NCSC), had been involved in the decision to select a centralized architecture.

There may be more concerns on that front, too. Today the HSJ reports that health secretary Matt Hancock recently granted new powers to the UK’s intelligence agencies which mean they can require the NHS to disclose any information that relates to “the security” of the health service’s networks and information systems during the pandemic.

Such links to database-loving spooks are unlikely to quell privacy fears.

There is also concern about how involved the UK’s data watchdog has been in the detail of the app’s design process. Last week the ICO’s executive director, Simon McDougall, was reported to have told a public forum he had not seen plans for the app, although the agency put out a statement on April 24 saying it was working with NHSX “to help them ensure a high level of transparency and governance”.

Yesterday Gould also told the committee the NHSX would publish data protection impact assessments (DPIAs) for each iteration of the app, though none has yet been published.

He also said the software would be “technically” ready to launch in a few weeks’ time — but could not confirm when the code would be published for external review.

In their letter, the UK academics call on NHSX to publish a DPIA for the app “immediately”, rather than dropping it right before deployment, to allow for public debate about the implications of its use and so that public scrutiny of the claimed security and privacy safeguards can take place.

The academics are also calling for the unit to publicly commit to no database or databases being created that would allow de-anonymization of users of the system (other than those self-reporting as infected), and which could therefore allow the data to be used for constructing users’ social graphs.

They also urge the NHSX to set out details on how the app will be phased out after the pandemic has passed — in order “to prevent mission creep”.

Asked for a commitment on the database point, an NHSX spokesman told us that’s a question for the UK’s Department of Health and Social Care and/or the NCSC, which won’t salve any privacy concerns around the government’s wider plans for app users’ data.

We also asked when the NHSX would be publishing a DPIA for the app. At the time of writing we were still waiting for a response. Update: An NHSX spokesperson has now sent the following statement: “We will publish the data protection agreements in due course, and we will close down the app once the threat from the pandemic has passed, with any data users have chosen to share deleted at that point and some retained for research purposes, subject to legal and ethical considerations, to better understand the virus.”

“Users of the app will remain anonymous up to the point where they volunteer their own details, and there will be no database that allows the de-anonymisation of users,” the spokesperson added.