DeepMind’s inaugural data-sharing deal with the UK’s National Health Service looks to be coming firmly unstuck.
The partnership attracted controversy last year, when the scope of the behind-the-scenes data-sharing was revealed by a New Scientist investigation. The personally identifiable health data of some 1.6 million NHS patients was used to develop an app for the early detection of a kidney condition.
Now the patient data safety advisory body to the UK government, the National Data Guardian (NDG), has weighed in with an opinion that there was — contrary to DeepMind’s repeated insistence — no lawful basis for the transfer of 1.6 million patients’ medical records to the Google-owned company during a development/piloting phase of the app.
The arrangement remains under investigation by the UK’s data protection watchdog, the ICO.
Sky News has today obtained and published a letter sent by the NDG to the Royal Free NHS Trust and to DeepMind co-founder Mustafa Suleyman. A spokeswoman for the NDG confirmed to TechCrunch that the letter is authentic.
In the letter, sent in February 2017, Dame Fiona Caldicott, the NDG, writes:
It is my view and that of my panel that the purpose for the transfer of 1.6 million identifiable patient records to Google DeepMind was for the testing of the Streams application, and not for the provision of direct care to patients. Given that Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.
She goes on to state that she is writing to the ICO to communicate her advice to feed into its ongoing investigation of the data-sharing arrangement.
The NDG also provided us with the following statement regarding the letter:
The National Data Guardian for Health and Care, Dame Fiona Caldicott, and her panel of advisors have been considering how patient data was shared by the Royal Free London NHS Foundation Trust with DeepMind under the ‘Streams’ project to improve the detection and management of acute kidney failure. In discussions with the ICO about this, the NDG agreed to provide advice on the use of implied consent for direct care as a legal basis for the sharing of data by the Royal Free with DeepMind. While the ICO investigation is ongoing the NDG will provide any further assistance to the ICO as required, but will not be commenting further on the matter at this point.
During the development of the Streams app last year, DeepMind and the Royal Free consistently claimed that patient consent was not needed to share medical records for the project because the app was being used for so-called ‘direct patient care’. Yet during 2016 the app was never actually used for direct care — it was merely being sporadically tested alongside normal clinical practices.
Critics have also disputed the ‘direct care’ argument for the app — pointing out, for example, that not all the patients whose data is shared with DeepMind for Streams will go on to develop Acute Kidney Injury, the condition the app is intended to detect. Some patients will therefore never be in a direct care relationship for this condition.
The NDG’s assessment here is not weighing in on that wider point. But the letter does make it clear that in Caldicott’s view DeepMind and the Royal Free were not justified in their use of live patient data during the testing/development phase of the Streams app.
The spokeswoman confirmed to us that the scope of the review was limited to the appropriateness of implied consent for direct care “as the legal basis when the data was shared”.
“As the letter states, this was the legal basis that the Royal Free had confirmed that they had used. The NDG has not considered or been asked to consider the appropriateness of any other legal bases,” she added.
At the time of writing neither DeepMind nor the Royal Free NHS Trust had responded to our questions or request for comment.
But responding to the NDG’s letter in a statement, health data privacy group medConfidential’s coordinator, Phil Booth, told us: “This letter shows that Google DeepMind must know it had to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.”
Despite the controversy around the data-sharing arrangement underpinning Streams, DeepMind and its partner NHS Trust for this deal, the Royal Free, pushed ahead with deploying the app in several London hospitals earlier this year — but only after inking a new data-sharing deal last year.
That second data-sharing arrangement is not currently being reviewed by the NDG, according to the spokeswoman.
At the time of writing the ICO could not be reached for comment. Update: An ICO spokesperson has now provided the following statement: “Our investigation into the sharing of patient information between the Royal Free NHS Trust and Deep Mind is close to conclusion. We continue to work with the National Data Guardian and have been in regular contact with the Royal Free and Deep Mind who have provided information about the development of the Streams app. This has been subject to detailed review as part of our investigation. It’s the responsibility of businesses and organisations to comply with data protection law.”