277.This chapter is a case study on the use of artificial intelligence in the healthcare sector in the UK. Many of the issues presented by AI when deployed in healthcare are representative of wider issues with the use of artificial intelligence, such as the possible benefits to individuals and for the public good, the handling of personal data, public trust, and the need to mitigate potential risks.
278.Our witnesses were clear that healthcare was one sector where AI presented significant opportunities. The Academy of Medical Sciences said “the impact of artificial intelligence on … the healthcare system is likely to be profound” because research and development will become more efficient, new methods of healthcare delivery will become possible, clinical decision-making will be more informed, and patients will be more informed in managing their health. Others agreed with this assessment. Professor John Fox, who has worked in the field of AI for healthcare for over three decades, sounded a rare dissenting note, and suggested that many of the claims for healthcare AI may well be overblown. Professor Fox said there was a need for Parliament to commission a “dispassionate and objective study of the evidence for success in healthcare”, given the level of interest in, and the lack of critical analysis of, developments in the field.
279.Some of our witnesses told us of the specific areas within healthcare that could benefit from artificial intelligence. The Royal College of Radiologists told us “medical imaging is perfectly placed to benefit from advances in AI” because of the availability of high quality, curated data, “overcoming one of the main hurdles in AI development”. At Microsoft Research in Cambridge, we saw their work on ‘InnerEye’ technology, which is being developed to assist oncologists in the analysis of x-ray and MRI scans. Such an application has the potential to dramatically reduce the cost of analysing scans, allowing far more to be taken over the course of a treatment, thereby facilitating more accurately targeted treatment. The Royal College of Radiologists told us of the potential for more efficient breast imaging, where one of the two human breast screen readers could be replaced with AI (mammograms are conventionally double read by a radiologist, advanced practitioner or breast physician). With two million women screened every year in the UK, and with images read at a rate of 55 per hour, considerable time could be saved for the specialists involved. The Royal College told us this was of particular importance given the strain the service is under due to staffing shortages, and because many consultants were due to retire at around the same time, 30 years after the breast screening programme was established. The Medicines and Healthcare products Regulatory Agency (MHRA) gave us a detailed list of possible uses of AI in healthcare, including for genomics and personalised medicine, the detection and monitoring of pandemics or epidemics, and the development of evidence for medicines submissions.
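Taken at face value, the Royal College’s figures suggest the scale of the potential saving. The following is a rough, illustrative calculation only: the screening volume and reading rate are the College’s figures, while the arithmetic and rounding are ours.

```python
# Illustrative estimate of specialist reader-hours that could be saved if AI
# replaced one of the two human readers in the UK breast screening programme.
# Input figures are those cited by the Royal College of Radiologists.
women_screened_per_year = 2_000_000  # mammograms are double read
images_read_per_hour = 55            # reading rate per specialist

# Replacing one of the two readers saves one human read per screen.
reader_hours_saved = women_screened_per_year / images_read_per_hour
print(f"{reader_hours_saved:,.0f} specialist reader-hours per year")
# → 36,364 specialist reader-hours per year
```

On these figures, replacing one of the two readers would free roughly 36,000 specialist hours a year, before accounting for any time spent supervising or auditing the AI reader.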
280.Other witnesses pointed to the more administrative benefits AI could offer the NHS. Deloitte said “on a sector specific level, healthcare is one area in which we see enormous potential for new technologies, both on the clinical side but also in the supporting administrative roles”. Braintree, an artificial intelligence research and development company, said: “instead of being buried in administrative duties and routine medical analysis, [doctors] could concentrate more fully on patient care and higher level medical diagnosis”. When we visited DeepMind they told us of their work with Moorfields Eye Hospital, which they hope will reduce the delay between patients being seen and then treated for certain eye conditions. Touch Surgery, a platform which offers training for surgeons, said AI could also help to “inform scheduling systems, updating them with real-time sensor feeds, to better utilise resources and availability”.
281.However, the Centre for Health Economics at the University of York warned that “organisations within the NHS are under great financial pressure. Developments like the adoption of AI are investments which require resources. Diverting resources away from front-line services is increasingly difficult when resources are limited and demand for services is increasing”. The Centre suggested that a decision needed to be made as to whether devoting scarce resource to the adoption of AI was appropriate, and the Government needed to provide a clear answer to this question.
282.It is no secret that the NHS is under immense pressure. For any of the benefits outlined above to be realised, the use of AI in healthcare is dependent on a number of factors, including:
283.The NHS holds data on nearly everyone in the UK; some of it going back decades. Lord Henley recognised this intrinsic value when he told us the “advantage of having a National Health Service is the quantity and the quality of the data that we have, which other countries do not necessarily have”.
284.Our witnesses agreed that this data could be of immense value to artificial intelligence researchers. Dr Hugh Harvey, a consultant radiologist and artificial intelligence researcher, suggested that IBM’s acquisition of Merge Healthcare in the USA for $1 billion, which netted them five to six million patients’ records, might be indicative of the value of the data held by the NHS. He also pointed to the Royal Society’s report, Machine Learning: the power and promise of computers that learn by example, which cited a figure of £1.8 billion for the direct value of public sector data, and put the wider socio-economic benefits at a minimum of £6.8 billion.
285.Nicola Perrin, who leads the Understanding Patient Data initiative at the Wellcome Trust, said that the question of ascertaining the value of the data the NHS holds, and the nature of compensation that should be sought for access to any data was “a crucial one to get right because of the implications for public confidence”. She added that the public “do not like the idea of the NHS selling data, but they are even more concerned if companies are making a profit at the expense of both the NHS and patients”.
286.Some argued that compensation for access to data need not be financial: by sharing data with researchers, the NHS could fairly expect favourable (if not free) access to any AI-based products developed from patient data. DeepMind agreed “that the NHS should be recompensed when it makes data available to companies for the purposes of AI development”, but said “clearly there are many ways to recognise and return value”. In the development of Streams (although not an AI application), DeepMind have given the Royal Free London NHS Foundation Trust “five years’ free use of the system” in exchange for testing the application (see Box 8).
287.Dr Harvey said there should not be a “monetary barrier to entry to data access”, and that “we need to encourage innovation and have failure, and they need to be allowed to fail at low cost”. Professor Martin Severs, Medical Director for NHS Digital (the national information and technology partner to the health and social care system), said:
“I would not have a barrier for entry but I would have some mechanism of demonstrating societal benefit from the data as it is being used. NHS Digital is open to any of those which have a consistent buy-in by all the organisations”.
288.Professor Severs thought that the public would see the use of their data to develop an AI tool as a “fair deal” if it had a wider societal benefit. Dr David Barber told us of the “concept called the ‘golden share’ which enables companies to give money back to the contributors of the data”, which could be applied to arrangements between the NHS and AI researchers. He said that the Department for Transport was already using such an arrangement.
289.Other witnesses, however, told us it may be hard to capitalise on the value of the data. Dr Julian Huppert, Chair of the Independent Review Panel for DeepMind Health, said “the public tend to believe that the NHS is one institute which has all the data in one place”. He said there are “real problems with data storage, availability and flow throughout the NHS at pretty much every level. It is very much in silos at the moment.” DeepMind told us that “the NHS currently is not able to set aside resources to explore in full the potential that AI holds, which leaves clinicians and other healthcare professionals ill-equipped to make the most of these opportunities”. Dr Harvey told us:
“Medical data … is very chaotic at source. This comes down to a delay, specifically in the NHS but also across the world, in the technology that is available in healthcare institutions compared to the technology that is available on the high street”.
290.Dr Mercedes Bunz and Elizabeth Denham, the Information Commissioner, reminded us, as we have discussed earlier, that data is not necessarily owned but rather controlled: each time the data is processed, value can be added. Put simply, an individual’s data might be worth very little on its own. When that data is brought together into a dataset in an NHS database, its value increases. When work is done to prepare that dataset for training an algorithm, the data is worth even more. At each stage, the value grows.
291.Another complication in the assessment of the value of data, and the sort of compensation that ought to be expected, is the piecemeal arrangements being made between NHS trusts and companies, some of which may have far more experienced negotiators than trusts have access to.
DeepMind is an artificial intelligence research company, based in London, and owned by Alphabet. In 2015, DeepMind began working with the Royal Free London NHS Foundation Trust to develop an app to help with the diagnosis of acute kidney injury (AKI). Subsequently, the Streams app was developed and deployed within the Trust.
In the development of Streams, the Trust provided personal data of around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for AKI. When this came to light, the Information Commissioner’s Office (ICO) investigated, and ruled that the Trust failed to comply with the Data Protection Act 1998 when it provided patient details to DeepMind. Although the app does not use artificial intelligence or deep learning techniques, DeepMind’s involvement has highlighted some of the potential issues involved in using patient data to develop technological solutions, many of which are relevant to AI.
292.It is important, then, to bear in mind the lessons of what one witness described as the “Royal Free Hospital/DeepMind fiasco”. Doteveryone said this case exemplified “many of the major issues at stake: the lack of competence public bodies have in negotiating AI agreements with the private sector; the potential for harm to privacy rights and public trust in data transfers; and the giving away of valuable public data assets to private companies for free”. Nicola Perrin told us that it was “absolutely the situation” that NHS trusts were separately, and more or less entrepreneurially, making different arrangements with different companies to use datasets that vary widely in their worth, suitability and application. Dr Huppert said “there are lots of different providers in lots of different trusts and the system is very chaotic” and that “there has not been very much work to look at some of the providers whose standards are not very high”.
293.This lack of consistency not only risks the NHS failing to maximise the value of the data it holds, but also risks exposing intensely personal patient data. Sharing such data, even with the best of intentions, with companies which may not be equipped to handle it securely must be avoided at all costs.
A Caldicott Guardian is a senior official responsible for protecting the confidentiality of people’s health and care information and enabling appropriate information-sharing. All NHS organisations (since 1999) and local authorities which provide social services (since 2002) must have a Caldicott Guardian. The Guardian plays a role in ensuring that their organisation satisfies the highest standards for handling patient identifiable information, and advises on options for the lawful and ethical processing of information. The role has no statutory basis, and is mostly advisory in nature: however, the Guardian is accountable for any advice given.
There are seven Caldicott Principles which guide the advice of a Guardian. These are:
294.Dame Fiona Caldicott, the National Data Guardian, described the challenge of using patient data in technology, and its implications:
“What we have not done is take the public with us in these discussions, and we really need their views. What is the value? Are they happy for their data to be used when it is anonymised for the purposes we have described? We need to have the public with us on it, otherwise they will be upset that they do not know what is happening to their data and be unwilling to share it with the people to whom they turn for care. That is the last thing we want to happen in our health service”.
295.The patchwork approach is not just a challenge for the NHS. Perrin said “from a company perspective, it is very difficult for them to know how to access the NHS which is a big beast and some hospitals have much easier conversations than others”. Dr Sobia Raza, Head of Science, PHG Foundation, advocated a more joined up approach, which could help in “realising the benefits in terms of negotiations with companies and developing a dataset that could provide more opportunities for accurate tools and algorithms”. Dr Raza also spoke to the benefits of having access to data at a national level, instead of at a local one: “a huge opportunity arises when you can capture the differences in demographics and, essentially, collate a more enriched dataset which is more reflective of the wider population”. We were encouraged by Professor Martin Severs, who told us “NHS Digital would support a national, open and consistent approach”.
296.We were concerned by the NHS’s lack of organisational preparedness to embrace new technology. In April 2017, the Select Committee on the Long-Term Sustainability of the NHS concluded “there is a worrying absence of a credible strategy to encourage the uptake of innovation and technology at scale across the NHS”. In July 2017, The DeepMind Health Independent Review Panel Annual Report stated “the digital revolution has largely bypassed the NHS, which, in 2017, still retains the dubious title of being the world’s largest purchaser of fax machines”. Dr Huppert, Chair of the Panel, told us that “there is a huge amount of work that is still needed to make the NHS more digitally savvy”.
297.When asked if the NHS had the capacity to take advantage of the opportunities, and to minimise the risks, of using AI, Dr Huppert said “the short answer is no”, although “clinicians vary: some of them are very technologically savvy and very keen and eager and some of them very much are not”. Dr Raza said “there is an important need here for healthcare professionals to have knowledge about the technology, to be aware of what it is capable of and to understand its limitations and gauge an awareness of how it might change or influence clinical practice in years to come”. Dr Raza also told us it was just as important that those developing AI engaged with healthcare professionals at an early stage, to help establish where artificial intelligence could be of most use. Dr Raza said there needed to be “a continued drive towards digitisation and embedding appropriate and modern digital infrastructure” in the NHS. Nicola Perrin highlighted the establishment of the NHS Digital Academy, a virtual organisation which provides a year-long digital health training course for Chief Clinical Information Officers, Chief Information Officers, and other interested NHS staff from clinical and non-clinical backgrounds.
298.Professor Severs told us that clinicians would embrace technology if it could relieve them of time-consuming, routine tasks. Both Dame Fiona Caldicott and Dr Harvey agreed that a multidisciplinary approach was required. Dr Harvey said “the medical syllabus needs to start incorporating not just medical statistics but some basics of data science”, as the NHS could not compete with the high salaries offered by industry to dedicated medical data scientists and researchers. Dr Harvey suggested collaboration between everyone with an interest in the healthcare system and AI was needed, as “if the NHS was to try to do it on its own it would fall short of the relevant skills and funding to do so”. It is clear to us that more needs to be done to ensure that everyone in the NHS is equipped to embrace the potential of AI in healthcare, and to identify and minimise the possible risks.
299.The application of artificial intelligence in the delivery of healthcare in the UK offers significant opportunities to improve the diagnosis and treatment of the unwell, as well as to help the NHS and other healthcare providers be more efficient. Further research and innovation should be encouraged by the NHS, and by the Government, while the public must be reassured that their data will not be made available for use without appropriate safeguards in place. The NHS should look to assess where the most value can be gained from the use of AI in the delivery of its services, and where the patient experience can be most improved through its deployment.
300.Maintaining public trust in the safe and secure use of personal data is paramount to the successful widespread deployment of AI, and there is no better exemplar of this than personal health data. There must be no repeat of the controversy which arose between the Royal Free London NHS Foundation Trust and DeepMind. If there is, AI will not be adopted in the NHS, its benefits will not be realised, and innovation could be stifled.
301.The data held by the NHS could be considered a unique source of value for the nation. It should not be shared lightly, but when it is, it should be done in a manner which allows for that value to be recouped. We are concerned that the current piecemeal approach taken by NHS Trusts, whereby local deals are struck between AI developers and hospitals, risks the inadvertent under-appreciation of the data. It also risks NHS Trusts exposing themselves to inadequate data sharing arrangements.
302.We recommend that a framework for the sharing of NHS data should be prepared and published by the end of 2018 by NHS England (specifically NHS Digital) and the National Data Guardian for Health and Care. This should be prepared with the support of the ICO and the clinicians and NHS Trusts which already have experience of such arrangements (such as the Royal Free London and Moorfields Eye Hospital NHS Foundation Trusts), as well as the Caldicott Guardians. This framework should set out clearly the considerations needed when sharing patient data in an appropriately anonymised form, the precautions needed when doing so, and an awareness of the value of that data and how it is used. It must also take account of the need to ensure SME access to NHS data, and ensure that patients are made aware of the use of their data and given the option to opt out.
303.Many organisations in the United Kingdom are failing to take advantage of existing technology, and are far from ready to take advantage of new technology such as artificial intelligence. The NHS is, perhaps, the most pressing example of this. The development, and eventual deployment, of AI systems in healthcare in the UK should be seen as a collaborative effort, with both the NHS and the AI developer able to benefit. To release the value of the data it holds, we urge the NHS to digitise its current practices and records, in consistent formats, by 2022, to ensure that the data does not remain inaccessible and the possible benefits to society unrealised.
376 Written evidence from the Academy of Medical Sciences ()
377 Written evidence from the Wellcome Trust and Association of Medical Research Charities () and (Professor Dame Wendy Hall)
378 Written evidence from Professor John Fox ()
379 Written evidence from Royal College of Radiologists ()
381 Written evidence from Medicines and Healthcare products Regulatory Agency ()
382 Written evidence from Deloitte ()
383 Written evidence from Braintree ()
384 Written evidence from Touch Surgery ()
385 Written evidence from the Centre for Health Economics, University of York ()
386 (Lord Henley)
387 (Dr Hugh Harvey); The Royal Society, Machine Learning: the power and promise of computers that learn by example (April 2017): [accessed 16 January 2018]
388 (Nicola Perrin)
389 Written evidence from DeepMind ()
390 (Dr Julian Huppert)
391 (Dr Hugh Harvey)
392 (Professor Martin Severs)
394 (Dr David Barber)
395 Written evidence from DeepMind ()
396 (Dr Hugh Harvey)
397 (Dr Mercedes Bunz and Elizabeth Denham)
398 Written evidence from medConfidential ()
399 Written evidence from Doteveryone ()
400 (Nicola Perrin)
401 (Dr Julian Huppert)
402 UK Caldicott Guardian Council, A manual for Caldicott Guardians (2017): [accessed 1 February 2018]
403 (Dame Fiona Caldicott)
404 (Nicola Perrin)
405 (Dr Sobia Raza)
407 (Professor Martin Severs)
408 Select Committee on the Long-term Sustainability of the NHS, (Report of Session 2016–17, HL Paper 151)
409 DeepMind, ‘Independent Reviewers release first annual report on DeepMind Health’ (5 July 2017): [accessed 18 January 2018]
410 (Dr Julian Huppert)
412 (Dr Sobia Raza)
413 (Dr Sobia Raza) and (Dr Sobia Raza)
414 (Professor Martin Severs)
415 (Dame Fiona Caldicott; Dr Hugh Harvey)
416 (Dr Hugh Harvey)