Google DeepMind patient-data sharing deal with Royal Free NHS Trust was illegal, rules ICO

Google DeepMind's Streams project with the Royal Free NHS Trust used real patient data

The deal between Google DeepMind and an NHS trust in London that saw patient data shared with the internet giant was illegal, the Information Commissioner's Office (ICO) has ruled.

The ICO said that the deal, which saw data on 1.6 million patients at the Royal Free NHS Trust passed to Google DeepMind without those patients' knowledge or consent, had "failed to comply with the Data Protection Act".

The Royal Free and Google DeepMind previously attempted to defend the deal by saying that "implied consent" was assumed because the Streams app was delivering "direct care" to patients.

As part of the deal, DeepMind was using its systems to analyse the medical data in the hope of providing quicker diagnosis and more proactive care through an app called Streams, which sends an alert to a clinician's smartphone if a patient's condition deteriorates.

However, the data-sharing partnership soon raised alarm bells, and earlier this year Dame Fiona Caldicott, national data guardian at the Department of Health and senior data protection adviser to the NHS, claimed that the collaboration went beyond the realm of the "direct care" of the patient, and that implied consent could therefore not be assumed before passing medical records to a third party.

The ICO began investigating the partnership back in May, following complaints from members of the public. One of these complaints questioned whether DeepMind would be "expected to encrypt the patient data it receives when at rest."

"Whilst the information-sharing agreement insists that personally identifiable information - such as name, address, postcode, NHS number, date of birth, telephone number, and email addresses, etc - must be encrypted whilst in transit to Google, it does not explicitly prohibit that data being unencrypted at the non-NHS location," the complaint reads.

The Information Commissioner, Elizabeth Denham, said on Monday that as a result of the ICO's investigation "the Trust has been asked to sign an undertaking committing it to changes to ensure it is acting in accordance with the law."

Denham said that the investigation revealed that the Royal Free did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data, noting that "this is not how things should work."

She also said that the ICO wasn't persuaded that it was necessary and proportionate to disclose 1.6 million patient records to test the application.

"The price of innovation didn't need to be the erosion of legally ensured fundamental privacy rights. I've every confidence the Trust can comply with the changes we've asked for and still continue its valuable work," Denham said.

In a blog post, DeepMind said that it welcomes "the ICO's thoughtful resolution of the case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams".

It goes on to admit that it "underestimated the complexity of the NHS and of the rules around patient data", adding that "we got that wrong, and we need to do better".

The findings of the investigation open the way for the Royal Free NHS Trust, as the 'data controller', to be fined up to £500,000, with a 20 per cent discount for early payment.

After 25 May next year, when the EU's General Data Protection Regulation (GDPR) comes into force, the ICO will be empowered to levy much larger fines - up to €20 million or four per cent of annual global turnover - against both parties.