Zavara Farquhar takes a look at the privacy and transparency concerns related to Google’s recent acquisition of DeepMind Health, an Artificial Intelligence team.
Last month, Google Health took full control of DeepMind Health, a team which has worked with the NHS on an application that processes patient data. DeepMind Health belongs to the Artificial Intelligence (AI) start-up DeepMind, a company which was acquired by Google in 2014 but has been run separately since a 2015 restructure.
September’s takeover marks a change and has brought DeepMind Health under Google’s direct control. The repercussions of this are significant, not least when it comes to the future of healthcare and the privacy of our data as NHS patients.
DeepMind Health is known for its mobile application “Streams”. This app has been used by the NHS to determine whether patients are at risk of acute kidney injury. Google’s vision is for Streams to become “an AI-powered assistant for nurses and doctors everywhere – combining the best algorithms with intuitive design, all backed up by rigorous evidence.”
Google, it seems, is intent on revolutionising the healthcare industry, with AI as its core tool.
But alongside its visions of the future come serious mistakes from the past, one of which occurred during the testing of the Streams app. In 2017, the Information Commissioner’s Office (ICO) found that a DeepMind Streams trial at the Royal Free NHS Foundation Trust failed to comply with data protection law. The trial involved the data of around 1.6 million patients. The ICO concluded that “the mechanisms to inform those patients that their data would be used in the clinical safety testing of the Streams application were inadequate”.
The Royal Free NHS Foundation Trust stated that, in respect of the personal data used in the Streams app, Google Health is the data processor, while the Trust remains the data controller. This means Google processes data on the Trust’s behalf but does not determine the purposes for which, or the means by which, it is processed.
The Trust also claims that the personal data in Streams is used for two purposes: first, to detect and treat acute kidney injury; and second, to test that Streams is working properly. Furthermore, NHS patients can “opt-out” of their personal data being used in Streams by contacting the Trust.
The Trust’s clarity in communicating the purpose and legal basis for Google Health’s processing of personal data shows an effort to learn from the ICO investigation.
However, it remains unclear what the parameters are for the use of patient data in testing the application, and what mechanisms are in place to ensure NHS patient records are protected in the long term as the Streams app and Google Health’s AI technology develop. Google Health has not provided information on these matters either.
Critics and the public alike are also concerned about the lack of transparency surrounding patient data processing and protection in the recent takeover. Without the publication of contracts between Google Health and NHS Trusts, it is difficult to know what the effect is on NHS patients or their data. David Maguire of the King’s Fund think tank told New Scientist that the non-publication “creates an unnecessary uncertainty, which isn’t great for assuaging people’s fears. There’s a legitimate thing about people feeling nervous about how their data is used.”
A further concern relates to the ambiguous existence of an AI ethics board at Google Health. In April 2019, Vox reported that Google cancelled a newly formed eight-person AI ethics council less than two weeks after it was established in response to anger over the appointment of various board members.
When Google first acquired DeepMind in 2014, The Information reported that Google agreed to establish an internal Google-DeepMind ethics board to ensure AI technology is not abused. Very little is known about that board. New Scientist reports that the independent review panel DeepMind Health had established was abolished in Google’s takeover of the company’s health division.
While Google and the NHS Trusts may not be legally required to publish their contracts or explain their processes in respect of personal data, it boils down to a question of the standards we want to set. We must define how large technology companies should operate at the intersection of technological innovation and patient data.