NHS memo details Google/DeepMind’s five-year plan to bring AI to healthcare

More details have emerged about the sweeping scope of Google/DeepMind’s ambitions for pushing its algorithmic fingers deep into the healthcare sector, including a desire to apply machine learning to UK NHS data within five years.

New Scientist has obtained a Memorandum of Understanding between DeepMind and the Royal Free NHS Trust in London, which describes what the pair envisage as a “broad ranging, mutually beneficial partnership, engaging in high levels of collaborative activity and maximizing the potential to work on genuinely innovative and transformational projects”.

Envisaged benefits of the collaboration include improvements in clinical outcomes, patient safety and cost reductions — the latter being a huge ongoing pressure-point for the free-at-the-point-of-use NHS as demand for its services continues to rise yet government austerity cuts bite into public sector budgets.

The MoU sets out a long list of “areas of mutual interest” where the pair see what they dub “future potential” to work together over the five-year period of collaboration envisaged in the memorandum. The document, only parts of which are legally binding, was signed on January 28 this year.

Potential areas of future collaboration include developing hospital support systems such as bed and demand management software, financial control products and private messaging and task management for junior doctors. (On the private messaging front, NHS staff informally using messaging apps like WhatsApp to quickly share information has previously been suggested as a risk to patient data confidentiality.)

They also say they want to work together on real-time health prediction — which is where the pair’s first effort (an app called Streams) has focused — involving a range of healthcare data to try to identify the risk of patient deterioration, death and/or readmission.

Reading medical images, and even monitoring the foetal heartbeat when a pregnant woman is in labour are other listed areas of interest.

Here’s the relevant portion of the MoU:

[Image: excerpt from the DeepMind/Royal Free MoU]

The MoU begins by referencing DeepMind’s ability to build “powerful general-purpose learning algorithms”.

It goes on to state that one of DeepMind’s hopes for the collaboration with the Royal Free NHS Trust is to gain “data for machine learning research under appropriate regulatory and ethical approvals”.

The pair have said their first co-designed app, Streams, is not utilizing any AI. Nor indeed is it powered by algorithms created by DeepMind; the core software was instead written by the NHS.

But the scope of the MoU makes it clear that applying machine learning to public healthcare data is exactly where the ambitions lie here.

Criticism over personally identifiable data powering the Streams app

Back in February DeepMind announced it was working with the Royal Free Trust to “co-develop” an app targeting a particular kidney condition: acute kidney injury, or AKI. It said the app, Streams, would present “timely information that helps nurses and doctors detect cases of acute kidney injury”.

Few details about the data-sharing agreement between the Google-owned company and the Royal Free Trust were made public at that stage. But it subsequently emerged that DeepMind was being given access to a very wide range of healthcare data on the 1.6 million patients who pass through the Trust’s three London hospitals each year.

The data in question is patient identifiable (i.e. neither anonymized nor pseudonymized). Under the agreement, DeepMind is also getting access to patient data from the Trust’s three hospitals dating back five years.

Critics, such as health data privacy group MedConfidential, have questioned why so much patient identifiable data is being shared for an app targeting a single condition.

“Direct care is between a patient and a clinician. A doctor taking steps to prevent their patient having a future problem is direct care. An organisation taking steps to reduce future events of unknown patients (e.g. fluoridation) is not,” argues Sam Smith of MedConfidential.

The Royal Free Trust and DeepMind have continually maintained that access to such a wide range of data is necessary for the Streams app to perform a direct patient care function, given the difficulty in predicting which patients are at risk of developing AKI.

They have also continued to assert the app is being used purely for direct patient care, not for research. This is an important distinction, given that conducting research on patient identifiable data would likely have required additional approvals, such as explicit patient consent or Section 251 assent (neither of which they have obtained).

But because they claim the data is not being used for research they argue such approvals are not necessary, even though it is inevitable that a large proportion of the people whose data is being fed into the app will never directly benefit from it. Hence the ongoing criticism.

Even if you factor in the medical uncertainties of predicting AKI (which might suggest you need to cast your data collection net wide), the question remains: why is the data of patients who have never had a blood test at the hospitals being shared? How will that help identify risk of AKI?

And why is some of the data being sent monthly if the use-case is for immediate and direct patient care? What happens to patients who fall in the gap? Are they at risk of less effective ‘direct patient care’?
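For context on why blood tests are so central here: the NHS England AKI algorithm, which Streams is reported to implement, compares a patient’s latest serum creatinine blood result against a baseline computed from that same patient’s historical results. Below is a deliberately simplified, illustrative sketch of that ratio-based logic; the staging thresholds follow the published algorithm, but the function names are illustrative and the algorithm’s edge cases (paediatric rules, absolute creatinine cut-offs, the choice between short- and long-term baselines) are omitted.

```python
# Deliberately simplified sketch of ratio-based AKI detection, after the
# NHS England AKI algorithm that Streams is reported to implement.
# Function names are illustrative; the published algorithm's edge cases
# (paediatric rules, absolute creatinine cut-offs, the choice between
# short- and long-term baselines) are omitted.

from statistics import median

def baseline_creatinine(history_umol_l):
    """Baseline derived from the patient's own prior results (here: the median)."""
    return median(history_umol_l)

def aki_stage(current_umol_l, history_umol_l):
    """Return 0 (no alert) or AKI stage 1-3 based on the creatinine ratio."""
    ratio = current_umol_l / baseline_creatinine(history_umol_l)
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return 0

# A creatinine result double the patient's own baseline triggers a stage 2 alert:
print(aki_stage(180, [85, 90, 88]))  # -> 2
```

The sketch shows why historical blood results are genuinely necessary (no baseline, no alert), but it also underlines the question above: records belonging to patients with no blood tests at all contribute nothing to this calculation.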

Responding to some of these critical questions put to it by TechCrunch, the Royal Free Trust once again asserted the app is for direct patient care — providing the following statement to flesh out its reasoning:

The vast majority of our in-patients will have a blood test and Streams would monitor the kidney function of every one of those patients for signs of deterioration, alerting clinicians when necessary.

DeepMind only has access to data that is relevant to the detection of AKI. In addition to analysing blood test results, the app allows clinicians to see diagnostic data and historical trends that may affect treatment, and in doing so supports effective and rapid patient care.

The patient’s name, NHS Number, MRN, and date of birth must be used to allow the clinician to positively identify the patient, in accordance with the HSCIC’s interface guidelines. This will be used to allow comparison between pathology results obtained within the hospital.

Monitoring patients at risk of developing AKI for signs of AKI so they can be treated quickly and effectively falls well within the definition of direct care.

Any in-patient coming into our hospital has at least a one in six chance of developing AKI. For the app to be effective this data needs to be in storage so that it can be processed when a patient is admitted. With any clinical data processing platform it is quite normal to have data lying in storage and it is nonsense to suggest that these platforms should only hold the data of those patients being treated at that very moment.

Given the envisaged breadth of the five-year collaboration between DeepMind and the Royal Free, as set out in their MoU, the fact the Google-owned company has been afforded access to such a wide range of healthcare data looks far less surprising — owing to the similarly wide range of products the pair envisage collaborating on in future.

For example, if you’re planning on building a software system to predict bed demand across three busy hospitals then access to a wide range of in-patient data — such as admissions, discharge and transfer data, accident & emergency, pathology & radiology, and critical care — going back for multiple years would obviously be essential to building robust algorithms.

And that’s exactly the sort of data DeepMind is getting under the AKI data-sharing agreement with the Royal Free.
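As a purely hypothetical illustration of why depth of history matters for this kind of forecasting, here’s a minimal sketch of a naive bed-demand baseline: averaging historical daily admissions by day of week. Nothing here reflects any actual DeepMind or Royal Free system; the data shapes and names are invented.

```python
# Hypothetical sketch of a naive bed-demand baseline built from historical
# daily admission counts. Nothing here reflects any actual DeepMind or
# Royal Free system; it only illustrates why multi-year history is the
# raw material for this kind of forecasting.

from collections import defaultdict
from datetime import date, timedelta

def demand_baseline(daily_admissions):
    """daily_admissions: dict mapping date -> admission count.
    Returns mean admissions per day of week (0=Monday .. 6=Sunday)."""
    by_weekday = defaultdict(list)
    for day, count in daily_admissions.items():
        by_weekday[day.weekday()].append(count)
    return {dow: sum(counts) / len(counts) for dow, counts in by_weekday.items()}

def forecast(baseline, target_day):
    """Predict admissions for a future date from the day-of-week baseline."""
    return baseline[target_day.weekday()]

# Two years of (synthetic) history feed the baseline; longer history
# smooths out one-off spikes such as a bad flu season.
history = {date(2015, 1, 1) + timedelta(days=i): 100 + (i % 7) * 5
           for i in range(730)}
baseline = demand_baseline(history)
print(forecast(baseline, date(2016, 7, 4)))
```

Even this toy version cannot be built without a long run of admissions records; a production system would need correspondingly more.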

However, it would of course be necessary for DeepMind and the Royal Free to gain the correct approvals for each of the potential use-cases they are envisaging in their MoU.

So unless there are other, as yet unannounced, data-sharing agreements in place between the pair, the wide-ranging personally identifiable healthcare data to which DeepMind currently has access must be specifically for the Streams app.

The pair’s MoU also states that separate terms would be agreed to govern their collaboration on each project.

“The Parties would like to form a strategic partnership exploring the intersection of technology and healthcare,” it further notes, going on to describe their hopes for “a wide-ranging collaborative relationship for the purposes of advancing knowledge in the fields of engineering and life and medical sciences through research and associated enterprise activities”.

Sharing personally identifiable NHS patient data

The current framework for handling and sharing personally identifiable NHS patient data was created after a review conducted in 1997 by Fiona Caldicott, and updated by a second review in 2013, following concerns that increasing amounts of data sharing might be undermining patient confidentiality.

NHS Trusts are supposed to take the so-called Caldicott principles into account when making decisions about sharing personally identifiable patient data (PID). Originally there were six principles, all focused on minimizing the amount of PID being shared in an effort to allay concerns about patient confidentiality being undermined.

But a seventh principle was added in Caldicott’s second report. It seeks to actively encourage appropriate data-sharing, in what she described as an effort to re-balance the framework with the potential benefits to patients of data-sharing in mind.

The six original Caldicott principles state that: the use/transfer of personally identifiable data should be justified, clearly defined and scrutinized, as well as regularly reviewed if use continues; that personally identifiable data should not be used unless there is no alternative; that the minimum necessary personally identifiable data should be used; that access to personally identifiable data should be on a strict need-to-know basis; that everyone handling the data should be aware of their responsibilities vis-à-vis patient confidentiality; and that every use of personally identifiable data must be lawful.

The seventh principle adds to this that: “The duty to share information can be as important as the duty to protect patient confidentiality”, with Caldicott writing: “Health and social care professionals should have the confidence to share information in the best interests of their patients within the framework set out by these principles. They should be supported by the policies of their employers, regulators and professional bodies.”

While the seventh principle might appear to be opening the door to more wide-ranging data-sharing agreements — such as the one between the Royal Free and DeepMind — Caldicott’s March 2013 review of Information Governance of healthcare data does specifically note that direct patient care pertains to the care of specific individuals. 

“Only relevant information about a patient should be shared between professionals in support of their care,” she writes [emphasis mine].

Her report, meanwhile, describes “indirect patient care” as encompassing “activities that contribute to the overall provision of services to a population as a whole or a group of patients with a particular condition”.

The phrase “a group of patients with a particular condition” suggests that an app like Streams, which targets a medical condition, would more obviously be categorized as ‘indirect patient care’ under this framework.

Health services management, preventative medicine, and medical research all also fall under indirect care, according to Caldicott’s definition.

“Examples of activities would be risk prediction and stratification, service evaluation, needs assessment, financial audit,” her 2013 review adds.

Despite Caldicott’s examples of direct vs indirect care, the Royal Free’s own Caldicott Guardian, Dr Killian Hynes, who is the senior person responsible for patient confidentiality and appropriate data-sharing at the Trust, still claims to be satisfied the Streams app constitutes direct patient care.

In a statement provided to TechCrunch Hynes said:

As the senior trust clinician responsible for protecting the confidentiality of patients and ensuring that information is shared appropriately, I have extensively reviewed the arrangements between the trust and DeepMind.

I am satisfied that patient data is being processed by the Streams app for the purpose of direct patient care only and that the arrangements around the storage of encrypted patient data within the secure third-party server are in line with the Caldicott Principles and our responsibilities as data controller.

This is pioneering work that could help us identify and treat the significant number of patients who suffer acute kidney injury within our hospitals.

The Royal Free Trust has repeatedly declined to answer whether Dr Hynes reviewed the data-sharing agreement with DeepMind prior to any patient data being shared.

The Trust has only said that its data protection officer — the person who signed the data-sharing agreement with DeepMind on behalf of the Trust — did so.

If the Trust’s own Caldicott Guardian did not review such a wide-ranging data-sharing agreement prior to data being shared with DeepMind, the question must be: why not? The Caldicott principles, after all, urge Trusts to apply scrutiny at the point of sharing personally identifiable data.

The DeepMind/Royal Free data-sharing agreement is currently being investigated by the UK’s data protection watchdog, acting on a small number of public complaints.

In a statement provided to TechCrunch this week the ICO confirmed it is continuing to probe the arrangement. “We are continuing to make enquiries in relation to this matter. Any organisation processing or using people’s sensitive personal information must do so in accordance with the Data Protection Act,” it said.

Meanwhile, last month TechCrunch learned the Streams app was no longer in use by Royal Free clinicians; the Trust said it had only run a handful of “small user tests” so far.

Last month it also emerged that the UK’s medicines and healthcare regulator, the MHRA, had contacted the Trust and DeepMind to initiate discussions about whether the app should be registered as a medical device. The MHRA had not been informed about the Streams app prior to it being trialled.

It’s also worth pointing out that the NHS Information Governance Toolkit, which was completed by DeepMind last October after it signed the data-sharing agreement with the Royal Free, is a self-assessment process.

DeepMind has said it achieved the highest possible rating on this IG toolkit, which the NHS provides for third party organizations to assess their processes against its information governance standards. DeepMind’s self-graded scores on the IG Toolkit have not yet been audited by the HSCIC, according to MedConfidential.