NHS trial with Google’s DeepMind violated UK law in use of patient data

Watchdog rules NHS Royal Free hospital improperly shared the records of 1.6 million patients.

A landmark medical trial involving Alphabet’s DeepMind artificial intelligence division violated British data protection laws, the U.K.’s top privacy watchdog ruled.

The Information Commissioner’s Office said Monday that the National Health Service hospital that conducted the trial with DeepMind improperly shared 1.6 million patient records with the tech company, failing to inform patients that their data would be used to test a new mobile app.

"Patients would not have reasonably expected their information to have been used in this way, and the Trust could have and should have been far more transparent with patients as to what was happening," Elizabeth Denham, the Information Commissioner, said in a statement.

The trial, which began in November 2015, was designed to help doctors diagnose acute kidney injuries and did not involve any artificial intelligence. Instead, DeepMind built software, called Streams, around an existing NHS algorithm designed to identify patients at risk. Streams was built to crunch patient data, such as blood test results, and, if a patient was at risk, to push an alert to medical staff through the app.
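Neither the article nor the ICO ruling publishes how Streams works internally, but the flow described above (take an incoming blood test result, run it through the NHS kidney-injury check, notify clinical staff if the patient looks at risk) can be illustrated with a minimal sketch. The sketch below assumes a KDIGO-style comparison of the latest serum creatinine result against a patient baseline; the class names, thresholds, and notify hook are all hypothetical, not DeepMind's actual code.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: mirrors the general shape of a creatinine-ratio
# acute kidney injury (AKI) check. Names and thresholds are hypothetical.

@dataclass
class BloodTest:
    patient_id: str
    creatinine_umol_l: float   # latest serum creatinine result
    baseline_umol_l: float     # reference value derived from prior results

def aki_stage(test: BloodTest) -> Optional[int]:
    """Return an AKI warning stage (1-3), or None if the result looks normal."""
    ratio = test.creatinine_umol_l / test.baseline_umol_l
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return None

def process_result(test: BloodTest, notify) -> None:
    """Push an alert to clinical staff if the result suggests possible AKI."""
    stage = aki_stage(test)
    if stage is not None:
        notify(f"Possible AKI stage {stage} for patient {test.patient_id}")

# Example: a result at twice the baseline triggers a stage-2 alert.
process_result(BloodTest("demo-patient", 180.0, 90.0), notify=print)
```

In practice, the NHS algorithm also specifies how the baseline is derived from a patient's prior results; the fixed baseline value here simply stands in for that step.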

DeepMind, which Google bought for over $400 million in 2014, said it needed access to partial medical records of all the Royal Free NHS Hospital Trust’s patients going back five years, even if those patients were not currently being treated at the hospital. The Royal Free shared this information with DeepMind under a legal basis called "direct care," meaning that it was being shared to improve patient treatment. Under this legal doctrine, medical professionals do not need explicit consent to share patient data.

Following a 13-month investigation, the ICO found "several shortcomings" in how the hospital handled patient data. Acting on advice from the Department of Health’s own chief adviser on patient data, it ruled that "direct care" was not a proper legal basis for sharing this information and that the hospital should have asked patients’ permission before sharing their data with DeepMind.

The ICO concluded that, during the trial, DeepMind was primarily testing whether the mobile app itself worked properly and whether medical staff liked its interface, rather than trying to improve patient outcomes.

In addition, the ICO said the hospital and DeepMind had not provided an adequate explanation of why DeepMind needed access to so many patient records to test the app.

"The price of innovation does not need to be the erosion of fundamental privacy rights," Denham said.

The ICO has asked the Royal Free to show, within three months, that it is now in compliance with the law and that it has policies in place to make sure patients give proper consent before their data is used in any further testing the hospital conducts with DeepMind.

The hospital trust has also agreed to have a third party audit its current data processing arrangements with DeepMind and report back to the regulator.

In a blog post published Monday, DeepMind acknowledged that it had made several mistakes in its work with the Royal Free and said it welcomed "the ICO’s thoughtful resolution of this case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams."

"In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health," DeepMind wrote in the blog.

The company said it had since improved its transparency and oversight, including signing a more detailed legal contract with the Royal Free and paying more attention to making sure patients and the public were aware of its work. It has also appointed a nine-member independent review panel to scrutinize its health work and publish recommendations for improvement.

The Royal Free said in a statement that it accepted the ICO’s findings and has "already made good progress to address the areas where they have concerns." The hospital said that it wanted to "reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety."

Fiona Caldicott, the National Data Guardian, who provided the ICO with advice on the "direct care" legal doctrine, said in a statement Monday that she is "afraid that a laudable aim—in this case, developing and testing life-saving technology—is not enough legally to allow the sharing of data that identifies people without asking them first."

DeepMind, which was founded in 2010, is best known for having created artificial intelligence able to beat the world’s best human players at the strategy game Go. The achievement is considered a major milestone in computer science.
