When the Google DeepMind NHS partnership was launched in 2015, it was seen as an exciting new collaboration: an artificial intelligence (AI) startup working with the NHS to improve treatment for patients.
Yet it has been tarnished by controversy and criticised for failing to comply with data protection law, in effect using patient data unlawfully, according to the UK’s data watchdog.
This week, following an investigation, the UK’s Information Commissioner’s Office (ICO) ruled that the Royal Free NHS Foundation Trust in London failed to comply with the Data Protection Act when it provided patient details to DeepMind.
The personal data of around 1.6m patients was provided to the Google-owned subsidiary to use for its Streams app, which reviews test results for serious issues, such as acute kidney injury – a condition linked to 40,000 deaths in the UK every year.
Royal Free said:
Within a few weeks of being introduced, nurses who have been using Streams report that it has been saving them up to two hours every day, which means they can spend more time face-to-face with patients.
However, despite the benefits the app and the partnership have brought to patients, it should not be overlooked that patients could have been made vulnerable by the use of their data without their consent.
Elizabeth Denham, the information commissioner, said:
There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights. Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.
How did the ruling come about?
Alarm bells were first raised by a New Scientist investigation last year, which revealed the full extent of the partnership: a private company, Google, had access to the identifiable medical records of 1.6m people.
Earlier this year, Dame Fiona Caldicott, the national data guardian at the Department of Health, said she believed the legal basis for the transfer of patient information was “inappropriate”.
In a letter to Stephen Powis, the medical director of the Royal Free, Caldicott did not dispute that the app could be valuable to patients, but said the “purpose for the transfer of 1.6m patient records to Google DeepMind was for the testing of the Streams application and not for the provision of direct care to patients”.
Caldicott’s opinion was taken into account by the ICO during its own investigation.
What does this mean for other NHS partnerships?
As technology and healthcare become increasingly intertwined, the ICO’s ruling demonstrates that patients need to be protected when the NHS enters into partnerships with private companies.
Ron Chrisley, director of the Centre for Cognitive Science at the University of Sussex, told Verdict:
Information is hard to track. Once it gets out of a certain space, say leaves the NHS, it’s hard to tell where it might end up. Our laws need to catch up with these kinds of technologies. We need to rethink what the possible reach of these things might be.
As part of the ICO’s ruling, the Trust has been asked to establish a proper legal basis under the Data Protection Act for any future trials, and to set out how it will comply with its duty of confidence to patients in any trial involving personal data.
The Trust released a statement in response to the ruling, saying:
We accept the ICO’s findings and have already made good progress to address the areas where they may have concerns.
DeepMind published a blog on its website about the ruling. It said:
We welcome the ICO’s thoughtful resolution of this case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams.
The post, written by DeepMind co-founder Mustafa Suleyman and clinical lead Dominic King, went on to apologise for underestimating the complexity of the NHS’s rules around patient data, and laid out the steps the company has put in place to ensure this does not happen again.
Ultimately, if we want to build technology to support a vital social institution like the NHS, then we have to make sure we serve society’s priorities and not outrun them. There’s a fine line between finding exciting new ways to improve care, and moving ahead of patients’ expectations. We know that we fell short at this when our work in health began, and we’ll keep listening and learning about how to get better at this.