DeepMind’s Access to U.K. Health Data Deemed ‘Inappropriate’

  • Alphabet unit used ‘inappropriate’ legal basis, adviser says
  • Opinion may complicate future AI health-research efforts

DeepMind, an artificial intelligence company owned by Alphabet Inc., accessed 1.6 million hospital patient records using an inappropriate legal justification, according to a top U.K. government adviser.

The National Data Guardian, Dame Fiona Caldicott, who advises the U.K. Department of Health on patient data privacy, gave the opinion in a letter sent to Stephen Powis, medical director of London’s Royal Free Hospital Trust, on Feb. 20.

Starting in July 2015, the Royal Free worked with DeepMind to develop an app that tells doctors when a patient is at risk of developing acute kidney injury. But in April 2016, the magazine New Scientist raised questions about whether DeepMind had obtained proper patient consent for tests of the app.

The Royal Free told Caldicott it had patients’ implied consent to share their data with DeepMind because the information was being used to improve their treatment -- a legal basis known as “direct care.” But Caldicott said in the letter that during the pilot test of the app, called Streams, the main goal was to make sure the app functioned well, not to improve patient outcomes, so this legal basis was inappropriate. The letter was first obtained and published by Sky News; the National Data Guardian confirmed its authenticity.

While the National Data Guardian has no independent regulatory power, Caldicott’s opinion will inform an investigation by the data privacy regulator, the Information Commissioner’s Office. The ICO is looking into whether the Royal Free illegally transferred patient data to DeepMind. That investigation is close to its conclusion, the ICO said.

If the ICO rules that the data transfers were illegal, it could fine the Royal Free or impose other sanctions on the hospital. That, in turn, might make it more difficult for the Royal Free, and possibly other hospitals, to work with DeepMind and other tech companies.

Either way, it’s bad news for DeepMind, the London-based Alphabet unit that has been trying to move beyond beating humans at board games to find practical uses for its AI technology. It is particularly interested in the potential for AI to improve health care -- helping doctors do everything from interpreting eye scans to making complex diagnoses.

“This project, designed to help prevent unnecessary deaths using new technology, is one of the first of its kind in the NHS and there are always lessons we can learn from pioneering work,” the Royal Free said in a statement. “We take seriously the conclusions of the NDG, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organizations taking forward this type of innovation, which is essential to the future of the NHS.”

DeepMind said in a statement that the data used by the app “has always been strictly controlled by the Royal Free and has never been used for commercial purposes or combined with Google products, services or ads -- and never will be.”

The company also said it recognized the need for more public discussion about how the NHS used new technology. “We want to become one of the most transparent companies working in NHS IT,” it said.

DeepMind has finished testing Streams, which uses an existing, static algorithm developed by the NHS, not AI. The Royal Free has since rolled the app out widely across the hospitals it administers, where it said it has helped save time and possibly lives.

“The Streams app was built in close collaboration with clinicians to help prevent unnecessary deaths by alerting them to patients in need in a matter of seconds,” the Royal Free said. “It is now in use at the Royal Free, and is helping clinicians provide better, faster care to our patients. Nurses report that it is saving them hours each day.”

Mark Wardle, a neurologist who also founded a digital health startup focused on electronic patient records, wrote in a blog post Monday night that he disagreed with Caldicott’s opinion. “I am surprised that Dame Caldicott has suggested that it might be inappropriate to test software systems designed for direct care with real patient information,” he wrote. “I would argue that such testing is necessary in the final stages of development to ensure that new technology is safely deployed in live clinical environments.”

Other experts on patient data sounded a cautious note about Caldicott’s letter. “New digital technologies, such as the DeepMind Streams app, offer real potential to provide better clinical care, but there must be appropriate governance so that everyone can have confidence that patient data is being used responsibly,” said Nicola Perrin, head of Understanding Patient Data, an independent task force on frameworks for handling patient information, housed at the Wellcome Trust, the London-based medical research charity.
