Concerns have been raised about a set of late stage government amendments inserted into the UK’s draft data protection bill. The clauses deal with the processing of public sector data.
Health data privacy advocacy group MedConfidential believes ministers are trying to sneak in a data protection law carveout, in order to hand politicians the power to judge the ethics of — for example — applying AI to taxpayer-funded medical data-sets.
Thus far, the Data Protection Bill 2017, which is intended to align domestic rules with the incoming EU General Data Protection Regulation — and generally gives consumers more control over how their data is used — has attracted limited controversy, although some privacy groups argue the government is not going far enough on data redress powers.
But if complex and impactful ethical judgments about how to process sensitive public sector data are to be nakedly conjoined to political interests, that would indeed be a major cause for alarm.
MedConfidential’s view is that the government is seeking to legislate in haste, eschewing a proper public debate about the appropriate shape that data processing and AI activity should take in the public sector — at a time when rising use of automation and machine learning technologies in the commercial sphere is highlighting myriad ethical issues and challenges flowing from algorithmic decision-making.
“Ministers say they want a world leading data protection regime — they just don’t want to be bound by any of the protections they claim to offer citizens,” argues MedConfidential’s Sam Smith. “The next time someone in government loses your sensitive data, it will be a politician who gets to decide if the rules were even broken.”
Nor is Smith’s the only disquieted voice here. The UK’s data protection watchdog has its own set of concerns about the scope and intent of ministers’ late stage proposals for public sector data processing.
On this, an ICO spokesperson told us: “We have set out our concerns on these provisions with the government, especially around the ICO having to take account of Secretary of State guidance. We believe this provision is not required as we always take other statutory guidance into account when exercising our powers.”
Framework for Data Processing by Government
Clauses 175-178 of the current amended draft bill introduce a “Framework for Data Processing by Government” — ostensibly, according to a spokesman for the Department for Digital, Culture, Media and Sport, to “allow the government to set out how it processes data in the interest of transparency and trust”.
No greater degree of transparency was forthcoming when we asked why DCMS believes it’s necessary to augment an already 250+ page data protection bill with a framework for public sector data at this late stage: an amendment which, at least as currently worded, appears to give politicians an arbitrating role over data ethics by specifying only a duty to ‘consult’ the UK’s data protection watchdog.
Here’s the current wording:
(1) The Secretary of State may prepare a document, called the Framework for Data Processing by Government, which contains guidance about the processing of personal data in connection with the exercise of functions of— (a) the Crown, a Minister of the Crown or a United Kingdom government department, and (b) a person with functions of a public nature who is specified or described in regulations made by the Secretary of State.
(5) Before preparing a document or amendments under this section, the Secretary of State must consult — (a) the Commissioner, and (b) any other person the Secretary of State considers it appropriate to consult.
The scope of public sector data is of course very wide indeed — covering everything from medical records held by the UK’s National Health Service to employment, welfare and other highly sensitive data held by the Department for Work and Pensions, to asylum and immigration data on file at the Home Office, to name just a few examples.
Add to that, if data can more easily be pooled and processed in concert across public sector silos — something the government has said it wants to do — there are even greater ethical considerations and privacy risks in play.
Throw in the omnipresent pressures of Brexit — to cut costs and engineer all sorts of complex feats within ludicrously tiny timescales, such as conjuring an automagical technology-powered non-hard border between Northern Ireland and the Republic of Ireland (as has been suggested as one possible solution for that particular Brexit conundrum) — and it’s not hard to see why ministers might be in an unseemly haste to feed public sector data-sets to any technologist who claims they can engineer a solution.
Clause 176, which deals with the approval process for public sector data processing framework documents, requires DCMS’ Secretary of State to obtain parliamentary approval before issuing a document — with a period of 40 days afforded for this scrutiny process. Approved documents must then be issued and come into force within 21 days.
The DCMS spokesman did not respond explicitly when we asked whether the ICO will have full oversight powers over the framework (i.e. rather than just playing a consultant role) — saying only that: “The Secretary of State would give proper regard to all comments made by the Information Commissioner in consultation on the framework.”
But does “proper regard” equate to ‘full regulatory oversight’? At this stage it’s not entirely clear. What is clear is that the ICO is concerned that it does not.
The DCMS spokesman also pointed out that the bill is still going through the parliamentary process, adding that “any amendments will be debated through this process”.
“The Government consulted the Information Commissioner in the preparation of clauses 175 and 178 and will continue to work [with] the ICO when preparing its guidance and ensure that there is no conflict with the ICO’s statutory codes,” he added.
The ICO, meanwhile, told us it continues to have “regular discussions” with the government regarding the bill and “will be providing views as part of the Parliamentary process”.
So it’s possible — and to be hoped — that greater clarity will emerge vis-a-vis the government’s intentions before another substantial piece of data-related legislation gets cemented into UK law. (One example of current muddiness: The term ‘data processing’, which could imply all sorts of ways of manipulating data; a point that the information commissioner herself raises, noting: “The definition of this is very wide and could cover any aspect of data handling within government or other bodies to whom the measure is applied.”)
Call for greater clarity
According to the ICO’s most recent assessment of the bill, this set of amendments was introduced by DCMS “to provide government departments with a clearer legal basis for their processing activities, especially around data sharing”.
“The Commissioner understands the need for government departments and public bodies to be clear about their legal basis for undertaking their functions and this is particularly true when processing personal data,” it begins, before going on to flesh out a number of concerns — such as warning that “the provisions as drafted appear to go beyond this limited ambition and create different risks”; and raising specific worries that the current phrasing of clause 175(1)(b) could also extend to include private sector bodies providing data processing services to the public sector.
“This wording does not appear constrained to just public bodies who may have concerns about their legal basis, but to others who may be able to act privately but nevertheless undertake some limited functions of a public nature,” it writes, adding: “These provisions should just address those public bodies where there is a need for greater clarity on their legal basis for processing.”
“This regulation-making power seems unnecessarily wide to achieve the government’s objective of addressing data processing primarily within government departments and there should be a clearer, more focussed provision setting out the other bodies to which the requirement may be applied,” it further adds.
One recent example of a private sector tech company becoming embroiled in a UK public sector data ethics scandal culminated, last summer, with the ICO sanctioning a London NHS Trust that had signed an information sharing arrangement with Google DeepMind — after the watchdog judged the pair’s 2015 data-sharing arrangement to have breached domestic privacy law. In that instance 1.6 million patients’ fully identifiable medical records were shared with the Google-owned company without people’s knowledge or consent.
Even now it’s still not clear what legal basis is being claimed for the data being passed.
Having its name attached to a regulatory sanction hasn’t prevented Google DeepMind signing additional data-sharing arrangements with other UK NHS Trusts to continue rolling out the clinical task management app that makes use of the data. It also has other research partnerships with NHS Trusts that involve AI research on NHS patients’ health data. So the commercial demand for gaining access to large-scale public sector data-sets is clear.
At the same time the UK government has made it clear that supporting AI-based innovation is a strategic priority. It commissioned an independent review into ways to grow the UK’s AI industry, which was published this October — urging easier access to data in “a wider range of sectors”.
This summer another government-commissioned industrial strategy review, of the life sciences sector, also flagged up the value locked up in publicly funded data-sets as an opportunity — though its author warned against giving valuable public sector health data away for free to commercial entities, and called for a new regulatory framework to “capture for the UK the value in algorithms generated using NHS data”.
On the healthcare front, it could be argued that a Conservative policy of shrinking funding to the NHS, which puts pressure on the service to do more with less, is intended to open Trusts’ doors to ‘AI innovators’ when they come knocking and touting their ability to build cost-cutting tools — if only they’re given access to the data. (In the case of Google DeepMind, it is not charging Trusts for five years of co-development work, in exchange for access to patients’ healthcare data.)
But in the face of these conjoined commercial and political imperatives, what’s to stop data ethics getting squeezed in the middle?
“A Minister’s job is to be political — it is therefore unclear why anyone should expect a unit headed by a Minister to make ethical decisions,” argues Smith. “There are many things that are entirely lawful, but whether they are ethical is the subject of infinite debate.”
Ethics being politicized?
In 2016 the government published a 17-page ethical framework for the use of data science in the public sector — a document that’s currently being updated. It has also proposed setting up a quango for “data ethics and innovation”, whose appointed advisors would report to the ministers making data processing decisions.
But MedConfidential’s concern is exactly that: If public sector data processing is going to be given a new statutory basis — via the incoming DP bill — that takes it outside the normal data protection rules then the ethics of doing things like applying AI to the most highly intimate forms of citizen data will be in the hands of ministers, rather than left to dedicated data guardians to judge.
“Designed for the worthy goal of incentivizing innovation within and between government departments, it is an example of the worst kind of misuse of data ethics — namely, as a tool to introduce subjectivity and negotiability into governmental compliance with the law, societal expectations, and public accountability,” says Julia Powles, research fellow at Cornell Tech, giving her assessment of the proposed framework for public sector data processing.
“There is real import to ethics — this is not it,” she adds.
Smith also points out that DCMS is hardly a government department with a strong history of data processing. Yet it’s the DCMS Secretary of State who would be in charge of these public sector data processing frameworks — with the ICO seemingly relegated to playing second fiddle.
For its part the ICO says its most significant concerns relate to Clause 178(5) — which states:
In determining a question arising in connection with the carrying out of any of the Commissioner’s functions, the Commissioner must take into account a provision of a document issued under section 176(3) if— (a) the question relates to a time when the provision was in force, and (b) the provision appears to the Commissioner to be relevant to the question.
“This puts a duty on the Commissioner to take the Secretary of State’s framework guidance into account when considering any question relevant to her functions,” it writes. “Whilst [the Commissioner] understands the relevance of considering any guidance about the legal basis of government functions the provision runs a real risk of creating the impression that the Commissioner will not enjoy the full independence of action and freedom from external influence when deciding how to exercise her full range of functions as required by Article 52 of the GDPR.”
The European Union’s incoming General Data Protection Regulation comes into force in May, and is intended to strengthen the standing and powers of EU Member State DPAs like the ICO.
Yet here the UK’s data watchdog is warning that the government might be at risk of breaching the incoming data protection rules by legislating to undermine her independence.
And while it’s true that the UK is in the process of negotiating its exit from the EU, as a result of the 2016 Brexit vote, digital minister Matt Hancock has previously stated it’s the government’s intention to closely mirror EU data protection standards because — for economic reasons — it really needs to avoid a cliff-edge cut-off for digital data flows between the EU and UK.
So the ICO’s warning should carry some weight.
Or else, suggests Paul Bernal, senior lecturer in law at the University of East Anglia, the government may be calculating that, in a post-Brexit future, it might be able to carve out a little data protection divergence for the UK without risking the sought-after EU adequacy arrangement on data protection standards.
“This may be primarily for the post-Brexit scenario, when more divergence may be possible whilst still getting adequacy,” he suggests.
Though that could ultimately prove a costly miscalculation, as the EU’s top court will remain the arbiter of whether the UK is achieving adequate data protection or not. It will not be for UK ministers to make such judgement calls.
“An adequacy decision will be made by the Commission, and can be struck down by the European Court of Justice whatever our own position is,” Bernal notes. “It’s effectively an internal EU decision concerning the UK, not a joint decision. Just as the ECJ struck down the Safe Harbour with the U.S… the U.S. aren’t under the ECJ, but it doesn’t matter.”
So, as ever when it comes to data and the law, the devil is in the detail.
And time will tell whether concerns over the government’s intentions regarding data ethics and data access will dissolve — or grow horns.