Implied consent ‘not an appropriate legal basis’ for sharing 1.6 million patient records with DeepMind

Written by Rebecca Hill on 16 May 2017 in News

Leaked letter from government's national data guardian Fiona Caldicott says patient records were being used to test and develop Streams app - not for direct care

Implied consent was 'not an appropriate legal basis' for sharing patient records - Photo credit: Flickr, Sebastian Wiertz, CC BY 2.0

The legal basis that Google’s DeepMind and London’s Royal Free NHS Trust used to justify granting access to 1.6 million patient records was not appropriate, according to the UK’s national data guardian for health, Fiona Caldicott.

DeepMind Health, the healthcare arm of Google’s artificial intelligence company, is working with the Royal Free to use patient data to develop an app, Streams, that will help doctors and nurses identify - and prioritise - patients at risk of acute kidney injury.

The project, launched in early 2016, came under fire when it was revealed that the data-sharing agreement gave the company access to a range of identifiable healthcare data on the 1.6 million patients passing through the trust’s hospitals each year.

Although the work was paused - later to be restarted in November 2016 under a new agreement and a commitment from DeepMind to be more open with the public - the initial agreement is under investigation by the UK’s data watchdog, the Information Commissioner’s Office.

As part of this, the ICO asked Caldicott - who has written three reports into patient data sharing and consent within the health service - to give her opinion on the agreement.


Caldicott gave her opinion to the ICO and the Royal Free in February, but it was not made public until this week, when her letter to Stephen Powis, the medical director of the Royal Free Hospital in London, was leaked to Sky News.

In it, Caldicott noted that the company and the trust had said that the basis for sharing the data was implied consent, because it was for the purposes of direct care.

But, she said, the data was being used to test and develop the Streams app, and could not be regarded as direct care, meaning it wasn’t appropriate to use the legal basis of implied consent.

“It is my view, and that of my panel, that the 1.6 million identifiable patient records transferred were only used for the testing of the Streams application, and not for the provision of direct care to patients,” she said.

Caldicott added that she “did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis”, and that it was her “considered opinion” that patients would not have reasonably expected their records to be used in this way.

“My view is that when work is taking place to develop new technology this cannot be regarded as direct care, even if the intended end result when the technology is deployed is to provide direct care,” the letter stated.

“Implied consent is only an appropriate legal basis for the disclosure of identifiable data for the purposes of direct care if it aligns with people’s reasonable expectations.”

Elsewhere in her letter, Caldicott stresses that both she and her panel “keenly appreciate the great benefits that new technologies such as Streams can offer to patients, in terms of better, safer, more timely care”.

And, although the letter does not raise any issues with the way the app - which has been in full use since January this year - is currently working, it says that the full benefits of innovative technologies will not be realised unless patient data is used in “a transparent and secure manner, which helps to build public trust”.

Royal Free: 'NHS remained in full control of patient data'

In a statement sent to PublicTechnology, a spokesperson for the Royal Free said that the partners took a “safety-first approach”, by testing Streams with real patient data to make sure it presented the information accurately before using it in a live setting.

“Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live,” the spokesperson said. “No responsible hospital would ever deploy a system that hadn't been thoroughly tested. The NHS remained in full control of all patient data throughout.”

However, they added that the hospital took Caldicott’s decision “seriously”, and acknowledged that, as the project was “one of the first of its kind in the NHS”, there were “always lessons we can learn”.

This point was also picked up on by Caldicott in her letter, which said that further guidance for organisations developing new technologies “would be useful” and that the Department of Health was “looking closely at the regulatory framework and guidance” it provides on this.

The letter was also sent to the ICO to feed into its investigation, which the watchdog said in a statement was nearing completion.

“Our investigation into the sharing of patient information between the Royal Free NHS Trust and DeepMind is close to conclusion,” the statement said.

“We continue to work with the National Data Guardian and have been in regular contact with the Royal Free and DeepMind, who have provided information about the development of the Streams app. This has been subject to detailed review as part of our investigation. It’s the responsibility of businesses and organisations to comply with data protection law.”
