The UK response to covid-19: use of scientific advice

Summary

The coronavirus pandemic has marked the most significant test of the way that the UK Government takes and acts on scientific advice in living memory.

The scientific community—in academia, in the public sector and in industry—has risen to that challenge in extraordinary and, in many cases, unprecedented ways.

This Committee, on behalf of the House of Commons, is deeply grateful for the tireless, expert and unstinting work of everyone who has sought to understand the threat of covid-19 from its earliest appearance, and who has brought their experience, ingenuity and judgement to bear on mitigating its impacts and seeking treatments and vaccines against it.

The high reputation of UK science is founded on openness and relentless self-challenge—looking always to test current theories and practices against new evidence and explanations, without sentiment and with a relish for discovery.

The Science and Technology Committee, in its continuing inquiry, has sought to apply that same spirit. By asking questions of expert witnesses and scrutinising written evidence, our aim has been to do two things:

In May, the Committee wrote to the Prime Minister and the Secretary of State for Health and Social Care with some recommendations drawn from the experience of the first few months of the pandemic.1

This Report considers, specifically, the ways in which the Government has obtained and made use of scientific advice during the pandemic to date.

During the weeks ahead, both as the Science and Technology Committee and in our joint “lessons learned” inquiry with the Health and Social Care Committee, we will set out further evidence and findings on areas including the test and trace system, the development of vaccines and the preparedness for this emergency.

In particular, the remarkable achievement of developing and being in a position to deploy multiple vaccines against a deadly and virulent virus that was completely unknown a little over a year ago ranks as one of the most outstanding scientific accomplishments of recent years—we will consider the lessons to be learned from the scientific, public policy and administrative contributions to this success in a subsequent Report.

This Report is structured as follows:

In Chapter two, we consider how scientific advisory and key decision-making structures evolved in the early stages of the pandemic, through evidence we gathered from Chief Medical and Scientific Advisers, as well as the Secretary of State for Health and Social Care.

Chapter three explores the initial awareness of the novel coronavirus in the UK Government as well as the activation and operation of SAGE itself. While it is apparent to us that science advisory mechanisms responded quickly, there is an open question regarding the longer-term operation of SAGE and the impacts on the independent experts who participate—and their research staff and technicians—as well as the Government officials who support SAGE.

The transparency and communication of science advice is discussed in Chapter four. While it is regrettable that there were initial delays in publishing SAGE evidence and minutes and in disclosing the identities of its expert advisers, we are pleased that a regular drumbeat of public information was eventually established. Nevertheless, we have concerns that the lessons from this experience have not been consistently applied, and we call for the Government to publish the advice it has received on the indirect effects of covid-19 (including impacts on mental health and social wellbeing, education and the economy) and to work to improve transparency around the operation of the Joint Biosecurity Centre.

In Chapter five, we discuss the breadth of expertise drawn upon by the Government through SAGE. We conclude that there was a particular reliance on epidemiological modelling expertise at the beginning of SAGE’s operation—reflecting the paucity of real world data early in the pandemic—and identify an apparent gap in the provision of independent advice on non-medical impacts. We also consider the issue of poor data flows, which have hampered the work of SAGE and other experts in understanding the pandemic.

Our final Chapter presents a number of instances that exemplify how effectively science advice was used, in different policy areas, over the course of the pandemic. In Chapter six we consider the following examples in which science advice has been a key component: testing capacity; social distancing measures, such as face coverings; and the development of potential vaccines and therapeutics.

Key findings

Our overall conclusions are that:

1. During the first part of the pandemic, the Government was serious about taking and following advice from scientists of international repute, through a structure that was designed and used during previous emergencies.

2. The length of the pandemic to date has placed extraordinary demands on the scientific advisers to Government. The Government Chief Scientific Adviser, the Chief Medical Officers, their teams, ministers and officials in Departments, the devolved administrations, the NHS, public health teams in Public Health England and local authorities, and each of the participants in SAGE and its sub-groups have worked intensively and continuously since the beginning of the pandemic. The structures for science advice in emergencies were designed around shorter-term emergencies, and the Government should consider the resilience of these arrangements for occasions when they are needed over a longer period.

3. Initially, there was a lack of transparency about which scientists served on the Government’s advisory body, SAGE, and about what evidence and scientific papers their advice drew on. This has improved following our earlier letter to the Government Chief Scientific Adviser, but there is still insufficient visibility of what advice was given to the Government, and insufficient transparency over the operation and advice of the new Joint Biosecurity Centre.

4. Although the Government was advised by many experts of distinction, and generally followed the advice that was given, the outcome during the first wave of the pandemic is not regarded as having been one of the best in the world.2

While no country’s experience is perfectly comparable with another’s, it will be important to understand the reasons for this in order to learn lessons for the future. In this Report, there are questions of how quickly scientific analysis could be translated into Government decisions; whether full advantage was taken of learning from the experience of other countries; and the extent to which scientific advice took operational constraints, such as testing capacity, as a given or sought to change them.

5. Measures taken to contain the pandemic had wider and indirect effects, such as on people’s livelihoods, educational progress and mental and emotional wellbeing. The assessment of these wider impacts was—and remains—much less transparent than the epidemiological analysis; the people conducting that analysis and giving advice are less visible than the epidemiological modelling advisers; and its role in decision making remains opaque.

6. The public has benefitted from seeing and hearing directly from scientists advising the Government, and overall trust in science has remained high, despite the inevitability that scientific advice has often been associated with restrictions on people’s activities and has sometimes been the focus of contention. As the Office for Statistics Regulation advised, in order to maintain high levels of confidence, data and statistics should be presented in ways that align with high standards of clarity and rigour—especially when they are used to support measures of great public impact.

7. A fully effective response to the pandemic has been hampered by a lack of data. For a fast-spreading, invisible but deadly infection, data is the means of understanding and acting upon the course of the virus in the population. The early shortage of testing capacity—which restricted testing to those so ill that they were admitted to hospital—limited knowledge of where covid-19 was spreading. The ONS Infection Survey did not begin until May, and the fragmentation of data across public organisations has impeded the agility and precision of the response.

8. The increase in testing capacity that took place from April was driven principally by a target set by the Secretary of State for Health and Social Care, rather than following a scientifically-based plan of what capacity was needed. While testing capacity has increased dramatically, it is still unclear what assessment has been made of the testing targets required to manage the pandemic.

In each instance, the approach we have taken is to draw on the evidence that has been presented to the Committee, orally and in writing, and to draw out lessons by way of recommendations to the Government—which is required to respond formally to the Report.

Where recommendations reflect findings that things could have been done better, we make them, in keeping with the scientific approach, not to apportion blame but—recalling the acute uncertainty and urgency with which decisions have had to be made—to provide a means continually to improve our collective response to this and future emergencies.

2 Sir Patrick Vallance told us in July it was “clear that the outcome in the UK has not been good” and that there was a “band of countries that have done less well” (Q1043). Further, Professor Neil Ferguson suggested to us in June that the UK’s position, in terms of per-capita deaths from covid-19, would not “necessarily change in a European setting” (Q942).




Published: 8 January 2021