DeepMind NHS health data-sharing deal faces further scrutiny

A controversial data-sharing agreement between Google-owned AI company DeepMind and the UK’s National Health Service (NHS) has caught the eye of the National Data Guardian — the government appointee who works with the Department of Health to try to ensure citizens’ confidential health data is safeguarded and used properly.

The NDG role does not have powers of enforcement, so cannot launch a formal investigation into the data-sharing arrangement, but a spokeswoman told TechCrunch the NDG is “considering how data was shared by the Royal Free [NHS Trust] with Google DeepMind”.

“Dame Fiona [Caldicott, the NDG] and her independent advisory panel have asked for and received information about the project, which is currently being considered to assess whether any guidance or advice from the National Data Guardian would be of assistance to this or similar projects,” the spokeswoman added.

The NDG will be publishing minutes of a meeting it held earlier this year with DeepMind and the Royal Free tomorrow — we’ll update this post with a link once the document is online. Update: The relevant paragraph from the minutes is as follows:

6. Google DeepMind discussion

Following the recent media attention on the Royal Free’s agreement with Google DeepMind to develop an app aimed at improving the identification and treatment of patients at risk of acute kidney injury, a number of representatives from the Royal Free and Google DeepMind attended the Panel meeting. A discussion took place around the clinical aims of the project, the data sharing and security arrangements, the information available to the public, and plans for next steps in developing the app and the underlying algorithm.

Action 6.1: NDG Office to draft a response on behalf of the Panel to the Royal Free to confirm the Panel’s understanding of the discussion and the project and to raise any additional questions.

The same Royal Free-DeepMind data-sharing agreement is already being investigated by the UK’s data protection watchdog, the ICO, which confirmed to TechCrunch today that its probe remains ongoing.

An ICO spokesperson said: “We are in contact with the National Data Guardian about this matter and we continue to make enquiries of the Royal Free in relation to the data shared with Google DeepMind.”

“Any organisation processing or using people’s sensitive personal information must do so in accordance with the Data Protection Act,” the ICO spokesperson added.

At the time of writing neither DeepMind nor the Royal Free had responded to a request for comment on the NDG’s action. Update: In a joint statement the pair said:

We were grateful for the opportunity to meet with the National Data Guardian and provide further information on the Streams app during its development. We continue to work closely with all relevant organisations and regulators who are interested in learning more about our work.

We hold ourselves to the highest legal and ethical standards, especially on matters of patient privacy, clinical safety and data security.

NHS data-sharing and due process

This is just the latest regulatory bump in the road for the data-sharing agreement between DeepMind and the Trust — which was inked last September and publicly announced in February this year, the first such collaboration between the Google-owned company and the NHS. (DeepMind has since announced another collaboration, with a different NHS Trust: Moorfields Eye Hospital.)

The Royal Free data-sharing agreement has been criticized for the amount of patient identifiable data being shared (especially as, unlike the Moorfields collaboration, the patient data is also not anonymized). Data being shared under the agreement includes real-time inpatient data from the Trust’s three hospitals across multiple departments, as well as historical in-patient data going back five years.

The Royal Free Trust and DeepMind have maintained that patient consent to the sharing of the data in this instance can be implied (rather than explicitly obtained), because they say they are using the data for so-called “direct patient care” — meaning a clinician helping an individual patient with their own care.

However the app the data is used to power — called Streams — targets a particular condition, called Acute Kidney Injury — and it is inevitable that not every patient whose data has been shared with DeepMind will go on to develop and/or be treated for AKI by a clinician at the Royal Free’s hospitals.

Streams has also only been used for three user tests so far, which TechCrunch has learned entailed it being used in parallel to other clinical decision-support processes. The app itself was not used to make any decisions about patient care during these tests.

All of which is to say that so far the NHS patient medical records shared with DeepMind for Streams have only been used to further the development of the software itself. So quite how R&D of a decision-support app also constitutes ‘direct patient care’ — the necessary threshold required for sharing data without explicit patient consent, in this instance — remains unclear. In more extensive user tests of the app’s UI, DeepMind deployed fake data. But in the three user tests in the Royal Free hospitals, real-world patient medical records were used.

A spokeswoman for the NDG said the body does not generally comment on individual cases, and further noted that in this instance it cannot comment because it is currently considering the arrangement — having contacted DeepMind and the Royal Free in May to provide details to its panel of experts.

It is possible the NDG will gain more formal advice powers in future, given the role is relatively new — set up in November 2014, following another unrelated data-sharing controversy, and with the UK government only publishing its response to a consultation on the role’s remit and powers this summer.

Once the NDG gains statutory powers, it will mean NHS organisations will need to consider any formal advice when they are drawing up data-sharing agreements with outside entities, such as DeepMind. (At present there are only a set of Caldicott principles that NHS Trusts are supposed to take into account when making decisions about whether and how to share patient data — but it’s up to individual Trusts to interpret these guidelines.)

The UK’s ICO and the CQC (Care Quality Commission) remain the two existing bodies with enforcement powers in this area. One of which, the ICO, continues to investigate the Royal Free-DeepMind data-sharing agreement, as noted above.

Earlier this year it also emerged that DeepMind and the Royal Free had not registered the Streams app as a medical device with the relevant regulatory body, the MHRA, prior to testing it in hospitals. The pair were subsequently contacted by the MHRA and told they did need to register the app. The last of the three user tests of Streams was stopped on the same day, May 11 — after only three days of testing. (In February the app was tested for a week, between February 6 and 12; the first test, in December, also lasted three days, from 12 to 14.)

The Streams app remains out of commission for now, although the pair have said they are committed to deploying it in future — i.e. once they have registered it with the MHRA.

Announcing DeepMind’s Moorfields research project last month, co-founder Mustafa Suleyman referred back to the Royal Free collaboration — tacitly acknowledging the not-so-smooth sailing of the project in an almost-but-not-quite mea culpa moment.

“Treating this data with respect really matters,” he wrote in a Medium post entitled ‘Our commitment to the NHS’. “There are different authorities that give different types of approvals and oversight for NHS data use: HSCIC, HRA, MHRA, ICO, Caldicott Guardians, and many, many more. We’re committed to working with all these groups, and making sure with their help that we get it right.”

In a recent statement provided to TechCrunch a DeepMind spokesperson again defended how NHS patient data is being shared under its agreement with the Royal Free:

Streams is clearly a tool designed to be used for direct patient care, supporting clinicians to provide the right care to patients at risk of developing AKI. For the tool to work, and to ensure doctors and nurses receive the right information about the right patient, Streams needs to process patient identifiable information sent to it by the Royal Free. It’s normal that the early on-site user testing of Streams was conducted alongside existing clinical systems, and this doesn’t change the basis on which data is being processed — which is the same basis for many other third party services active within a hospital setting. We’re now past the prototyping phase and are looking forward to putting Streams into production, so we’re working with the MHRA on that basis.

The Royal Free and DeepMind continue to refuse to disclose the exact number of clinicians involved in testing Streams thus far. Nor will they specify how many patients passed through the hands of those clinicians during the user tests of the app. So it remains impossible to quantify the reach of the app vs the amount of patient identifiable data being shared to power it.

This post was updated with additional comment and the minutes of the NDG’s meeting with DeepMind/the Royal Free.