A UK parliamentary committee that focuses on human rights issues has called for primary legislation to be put in place to ensure that legal protections wrap around the national coronavirus contact tracing app.
The app, called NHS COVID-19, is being fast-tracked for public use, with a test ongoing this week on the Isle of Wight. It’s set to use Bluetooth Low Energy signals to log proximity between users’ devices, with the aim of automating some contact tracing based on an algorithmic assessment of users’ infection risk.
NHSX has said the app could be ready for launch within a matter of weeks, but the committee says key choices related to the system architecture create huge risks for people’s rights that demand the safeguard of primary legislation.
“Assurances from Ministers about privacy are not enough. The Government has given assurances about protection of privacy so they should have no objection to those assurances being enshrined in law,” said committee chair, Harriet Harman MP, in a statement.
“The contact tracing app involves unprecedented data gathering. There must be robust legal protection for individuals about what that data will be used for, who will have access to it and how it will be safeguarded from hacking.
“Parliament was able quickly to agree to give the Government sweeping powers. It is perfectly possible for parliament to do the same for legislation to protect privacy.”
NHSX, the digital arm of the country’s National Health Service, is in the process of testing the app ahead of a planned national rollout.
The government has opted for a system design that will centralize large amounts of social graph data when users experiencing COVID-19 symptoms (or who have had a formal diagnosis) choose to upload their proximity logs.
Earlier this week we reported on one of the committee hearings — when it took testimony from NHSX CEO Matthew Gould and the UK’s information commissioner, Elizabeth Denham, among other witnesses.
Warning now of a lack of parliamentary scrutiny around what it describes as an unprecedented expansion of state surveillance, the committee’s report calls for primary legislation to ensure “necessary legal clarity and certainty as to how data gathered could be used, stored and disposed of”.
The committee also wants to see an independent body set up to carry out oversight monitoring and guard against ‘mission creep’ — a concern that’s also been raised by a number of UK privacy and security experts in an open letter late last month.
“A Digital Contact Tracing Human Rights Commissioner should be responsible for oversight and they should be able to deal with complaints from the Public and report to Parliament,” the committee suggests.
In a letter to the committee, dated May 4, health secretary Matt Hancock told it: “We do not consider that legislation is necessary in order to build and deliver the contact tracing app. It is consistent with the powers of, and duties imposed on, the Secretary of State at a time of national crisis in the interests of protecting public health.”
The committee’s view is that Hancock’s ‘letter of assurance’ is not enough, given the huge risks attached to the state tracking citizens’ social graph data.
“The current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system. Government’s assurances around data protection and privacy standards will not carry any weight unless the Government is prepared to enshrine these assurances in legislation,” it writes in the report, calling for a bill that it says must include a number of “provisions and protections”.
Among the protections the committee is calling for are limits on who has access to data and for what purpose.
“Data held centrally may not be accessed or processed without specific statutory authorisation, for the purpose of combatting Covid-19 and provided adequate security protections are in place for any systems on which this data may be processed,” it urges.
It also wants legal protections against data reconstruction — by different pieces of data being combined “to reconstruct information about an individual”.
The report takes a very strong line — warning that no app should be released without “strong protections and guarantees” on “efficacy and proportionality”.
“Without clear efficacy and benefits of the app, the level of data being collected will not be justifiable and it will therefore fall foul of data protection law and human rights protections,” says the committee.
The report also calls for regular reviews of the app — looking at efficacy; data safety; and “how privacy is being protected in the use of any such data”.
It also makes a blanket call for transparency, with the committee writing that the government and health authorities “must at all times be transparent about how the app, and data collected through it, is being used”.
A lack of transparency around the project was another of the concerns raised by the 177 academics who signed the open letter last month.
The government has committed to publishing data protection impact assessments for the app. But the ICO’s Denham had still not seen the document as of Monday this week.
Another call by the committee is for a time-limit to be attached to any data gathered by or generated via the app. “Any digital contact tracing (and data associated with it) must be permanently deleted when no longer required and in any event may not be kept beyond the duration of the public health emergency,” it writes.
We’ve reached out to the Department of Health and NHSX for comment on the human rights committee’s report.
There’s another element to this fast-moving story: yesterday the Financial Times reported that NHSX has inked a new contract with an IT supplier which suggests it might be looking to change the app’s architecture, moving away from a centralized database to a decentralized system for contact tracing — although NHSX has not confirmed any such switch at this point.
Some other countries have reversed course in their choice of app architecture after running into technical challenges related to Bluetooth. The need to ensure public trust in the system was also cited by Germany for switching to a decentralized model.
The human rights committee report highlights a specific app efficacy issue of relevance to the UK, which it points out is also linked to these system architecture choices, noting that: “The Republic of Ireland has elected to use a decentralised app and if a centralised app is in use in Northern Ireland, there are risks that the two systems will not be interoperable which would be most unfortunate.”
Professor Lilian Edwards, a legal expert at Newcastle University, co-authored a draft bill proposing a set of safeguards for coronavirus apps, much of which was subsequently taken up by Australia for a legal instrument that wraps protections around public health contact info during the coronavirus crisis. Edwards, who now also sits as an independent advisor on an ethics committee that’s been set up for the NHSX app, welcomed the committee’s report.
Speaking in a personal capacity she told TechCrunch: “My team and I welcome this.”
But she flagged a couple of omissions in the report. “They have left out two of the recommendations from my bill — one of which, I totally expected; that there be no compulsion to carry a phone. Because they will just be assumed within our legal system but I don’t think it would have hurt to have said it. But ok.
“The second point — which is important — is the point about there not being compulsion to install the app or to display it. And there not being, therefore, discrimination against you if you don’t. Like not being allowed to go to your workplace is an obvious example. Or not being allowed to go to a football game when they reopen. And that’s the key point where the struggle is.”
The conflict, says Edwards, is on the one hand you could argue what’s the point of doing digital contact tracing at all if you can’t make sure people are able to receive notifications that they might be a contact. But — on the other — if you allow compulsion that then “leaves it open to be very discriminatory” — meaning people could abuse the requirement to target and exclude others from a workplace, for example.
“There are people who’ve got perfectly valid reasons to not want to have this on their phone,” Edwards added. “Particularly if it’s centralized rather than decentralized.”
She also noted that the first version of her draft coronavirus safeguards bill had allowed compulsion to have the app on a phone but required this to be balanced by a proportionality analysis, meaning any such compulsion must be “proportionate to a legitimate aim”.
But after Australia opted for zero compulsion in its legal instrument she said she and her team decided to revise their bill to also strike out the provision entirely.
Edwards suggested the human rights committee may not have included this particular provision in their recommendations because parliamentary committees are only able to comment on evidence they receive during an inquiry. “So I don’t think it would have been in their remit to recommend on that,” she noted, adding: “It isn’t actually an indication that they’re not interested in these concepts; it’s just procedure I think.”
She also highlighted the issue of so-called ‘immunity passports’ — something the government has reportedly been in discussions with startups about building as part of its digital coronavirus response, but which the committee report also does not touch on.
However, without full clarity on the government’s evolving plans for its digital coronavirus response, and with, inevitably, a high degree of change and flux amid a public health emergency, it’s clearly difficult for committees to interrogate so many fast-moving pieces.
“The select committees have actually done really, really well,” added Edwards. “But it just shows how the ground has shifted so much in a week.”
This report was updated with additional comment.