You can find the current article at its original source at https://www.theguardian.com/commentisfree/2017/jul/09/giving-google-private-nhs-data-is-simply-illegal

Giving Google our private NHS data is simply illegal
The Royal Free hospital’s attempt to gloss over its transfer of more than a million health records to the AI developer DeepMind is boneheaded and dishonest
Sun 9 Jul 2017 06.59 BST
In July 2015, consultants working at the Royal Free hospital trust in London approached DeepMind, a Google-owned artificial intelligence firm that had no previous experience in healthcare, about developing software based on patient data from the trust. Four months later, the health records of 1.6 million identifiable patients were transferred to servers contracted by Google to process the data on behalf of DeepMind. The basic idea was that the company would create an app, called Streams, to help clinicians manage acute kidney injury (AKI), a serious disease that is linked to 40,000 deaths a year in the UK.
The first most people knew about this exciting new partnership was when DeepMind announced the launch of DeepMind Health on 24 February 2016. The blog post announcing this seems to have contained the first public mention of “AKI detection”. But the idea that 1.6 million identifiable health records had quietly disappeared into the maw of the biggest data-mining company in the world struck some academics and journalists as puzzling. How had the deal passed the various data-protection hurdles that any sharing of medical records has to surmount?
How indeed? An academic article published in March 2017 explored some of the mysterious aspects of the deal. “The most striking feature of the DeepMind-Royal Free arrangement,” the authors wrote, “is the conviction with which the parties have pursued a narrative that it is not actually about artificial intelligence at all and that it is all about direct care for kidney injury – but that they still need to process data on all the trust’s patients over a multi-year period. This is hardly a recipe for great trust and confidence.”
Spot on. In the end, the puzzle landed first on the desk of the national data guardian, Dame Fiona Caldicott, who concluded that the deal had an “inappropriate legal basis”. Dame Fiona’s reservations were then confirmed by Elizabeth Denham, the information commissioner, who last Monday handed down a stiff verdict. As the “data controller”, she ruled, the Royal Free “did not comply with the Data Protection Act when it turned over the sensitive medical data of around 1.6 million patients to Google DeepMind, a private sector firm, as part of a clinical safety initiative”. She demanded that the trust sign an undertaking “committing it to changes to ensure it is acting in accordance with the law” and warned that she would be working with it to make sure that happens.
The reactions of DeepMind and the Royal Free to the ruling differ in interesting ways. The company, which had been pretty aggressive up to now in defending the deal, has suddenly become contrite. “Although today’s findings are about the Royal Free,” it said in a blog post, “we need to reflect on our own actions too. In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong and we need to do better.”
This is a politically astute response, given that the company, which the commissioner regarded as a mere “data processor”, has got off scot-free. The Royal Free’s response, in contrast, seems positively boneheaded. “We have co-operated fully with the ICO’s investigation which began in May 2016,” it says, “and it is helpful to receive some guidance on the issue about how patient information can be processed to test new technology. We also welcome the decision of the Department of Health to publish updated guidance for the wider NHS in the near future.”
This is pure cant. The fact is that the trust broke the law. So to say that it has “co-operated fully” and “it is helpful to receive some guidance on the issue about how patient information can be processed” is like a burglar claiming credit for co-operating with the cops and expressing gratitude for their advice on how to break and enter legally next time.
Meanwhile, we are left with the fact that a database of 1.6 million sensitive health records that were transferred illegally is sitting on Google servers somewhere, even though DeepMind claims that it doesn’t need it. After all, it’s only building apps for healthcare providers. What, one wonders, will the information commissioner do about that?
Topics: Data protection, Opinion, Internet, Google, Alphabet, DeepMind, NHS, comment