Government transparency and accountability during Covid 19: The data underpinning decisions

2 Public communication, behaviour, and trust

The purpose of data transparency

16.The ability of Parliament and the public to understand the Government’s decisions and hold them to account is central to democracy. In the last year, we have seen Government impose some of the greatest restrictions on the people in recent history. The extent to which those restrictions were necessary or successful will be debated elsewhere and for long after the pandemic ends, but this report asks whether the data has been available for that debate to happen. Underpinning each decision is a myriad of data that sheds light on potential health, social, economic and educational outcomes. It is vital that Parliamentarians can see that data so we can understand and scrutinise these decisions.

17.Making this data available is not just a moral or democratic question, it is also central to the response. In the last year, individuals have made unprecedented changes to their lives. These changes have separated people from their families, forced businesses to close their doors, and left young people unable to go to school or attend university in person. Some of those sacrifices have been required by law and some have been based on guidance, but all rely on the co-operation and goodwill of the public. Individuals must understand the purpose of those requests if they are to be expected to abide by them, and we have heard throughout this inquiry that transparency builds trust and trust aids co-operation.

18.Ultimately, sharing the data underpinning these changes is about “gaining democratic consent”.12 This part of the report discusses the purpose of data transparency and asks whether the Government has upheld its end of this contract.

Telling the story of Covid 19 in data

19.Before delving into a discussion of transparency, it is important first to consider the mechanisms through which people receive information. As outlined in Chapter 1, much of the data and research produced by Government is now available online, including (at the time of writing) over 590 Government papers and the minutes of 77 SAGE meetings. However, it is unrealistic to expect most members of the public to engage directly with this wealth of complex research and, as the Committee heard from witnesses, people receive most of their information through secondary sources. Dr Ben Worthy, senior lecturer in politics at Birkbeck, University of London, told the Committee:

[The Covid 19 dashboard receives 300,000 hits a day and] there is even some evidence of the public directly accessing scientific journals themselves, but the primary method is the media indirectly. That is both the traditional media and, to a lesser extent, social media.13

20.Even with the growth of social media, the primary source of information on Covid 19 remains the traditional news media. As Dr Richard Fletcher, of the Reuters Institute and University of Oxford, explained:

Television and online are the two most widely used ways of getting news and information about coronavirus, and when we are talking about “online”, we are really talking about the websites, the apps, and what we might think of as traditional news brands: newspapers, broadcasters and the like … . It is important to keep in mind that few people in general describe social media as their main source of news, even though many people use it as a supplement.14

21.The mechanism through which people come into contact with data will change the way they understand and interpret it because, as Dr Worthy told us, “it is very different to see one isolated number in a tweet or on a website than it is to see data as part of a story in a news article. How they meet this can have an impact.”15 He went on to note the importance of factors such as the political context in which the reader finds themselves, or trust in the source of information. Ultimately, people do not see the numbers in isolation but instead they fit them into narratives or stories:

… we very rarely look at pieces of data or numbers in isolation. What we often do is narrativize it and fit it within a story that is already in our heads or already in circulation around us. It is not often that we are blank slates examining this, but it is our own prejudices, our own levels of trust that will shape exactly how each individual reacts to this.16

22.As Ed Conway, data journalist at Sky News, told the Committee, engagement with data-led journalism has been unusually high during the pandemic. This highlights the public interest in understanding the nature of the pandemic through the numbers, so that they can form their own judgement. He said:

The number of hits that we have had on very data-heavy stories and videos has taken us all by surprise with the amount of engagement that people have. I think there is quite a lot of curiosity about what the numbers say. That doesn’t necessarily mean that people are falling into one or other camp and that they are sceptical or passionate about lockdown. It means that there is a deep curiosity. A lot is said about data literacy.17

23.Dr Richard Fletcher noted that, while trust in news media is relatively low in the UK, it has grown during the pandemic. He told us:

… trust in the news media in the UK is relatively low compared with other comparable countries in Europe, for example. We were quite surprised to see that trust in the news media for news and information about coronavirus specifically, when we started measuring it in April, was quite high: around 60% said that they trusted the news media as a source of news and information about coronavirus at that point.18

Data presented by Government

24.The Downing Street press briefings have been the key source of information for many people during the pandemic. During the briefings, a Minister (usually the Prime Minister) announces policy and a Civil Servant (often the Chief Medical Officer) walks through the key data which explains where the country is in the fight against the pandemic. These briefings are one important way in which the Minister shows the link between the data and the decision and demonstrates that they are accountable for that decision.

25.While these briefings have been an important exercise in engaging the public and seeking democratic consent, they have not been without criticism for the way they present some of the data. Witnesses before this Committee have described these briefings as “number theatre” and, as Professor Sir David Spiegelhalter of the University of Cambridge told the Committee:

Poor examples have happened … when the data has started being used in public relations. I am on record as having complained about what I call the number theatre of briefings, in which big numbers were being thrown out.19

26.Professor Spiegelhalter shared the concern of a number of contributors to this inquiry that the Government had, at times, quoted very large numbers in these briefings without context. It is not that these large numbers were false or that they were not grounded in research but that some examples were “not trustworthy communication”.20 He gave the example of briefings which have included “reasonable worst case scenarios” that were “based on extreme assumptions—essentially [the assumption] that we just do not do anything”.21 In his written evidence, he explained that there is a “general problem with using reasonable worst-case scenarios (RWCS) for public consumption”:

The communication at the October 31st briefing announcing the second lockdown was particularly poor … and the projection data of ‘up to 4000 deaths a day’ was subsequently widely ridiculed. [The graph that was presented] was never intended for public consumption, was based on extreme assumptions, and was demonstrably out-of-date at the time it was used.

This has happened at three important occasions: first, in early March, when the ‘500,000’ deaths [figure] was highlighted; second, on September 21st, when [a graph of] red bars showed a projection of cases doubling every week, reaching 49,000 by 13th October: in fact 13,000 new cases were reported that day.22

27.The Government has also sometimes presented data in ways that are hard to access and understand or cannot be easily contextualised or compared to other data. This was referenced in many submissions to this inquiry, and Professor Spiegelhalter was not alone in saying that “the poor quality of many of the slides shown at past briefings became notorious—too crowded, lots of coloured lines (apparently ignoring guidance on accessibility for those with visual restrictions) with a legend that could not be read”.23

28.Examples of this were given by many of the contributors to this inquiry, including the President of the Royal Statistical Society, Professor Sylvia Richardson:

[numbers have been] presented that were both inaccurate and overly-precise. For example, the number of deaths (for the most part, deaths reported on a day in a hospital setting) presented [in early Downing Street briefings] was almost always an under-estimate and the precision of the number presented gave a false impression of certainty … [further examples include] the daily “tested positive” figure, which is not helpful without knowing who has been tested and why, and diagrams of test-results against date which do not specify which date the axis refers to, a clear case of poor practice; there is still no report of the positivity rate, an important indicator, nationally, regionally and locally.24

29.Criticism of the presentation of data has not been limited to the Downing Street briefings. The Health Statistics User Group noted:

Data has been published at many different levels of disaggregation ranging from large areas such as regions and counties down to small local areas below ward level. These give different messages.25

30.It is important to note that contributors to this inquiry have also acknowledged improvement in the presentation of data throughout the pandemic. As discussed in Chapter 1, the Government (including the Civil Service Departments and non-departmental bodies such as the ONS) has made great strides in its understanding of the data from an almost standing start at the beginning of 2020. The RSS said in its written evidence that “there have been continued improvements in data presentation—the RSS has held regular meetings with DHSC staff who are clearly committed to improving the reporting and DHSC staff have also engaged positively with UKSA.”26

31.In February 2021, the Chair of the UK Statistics Authority, Sir David Norgrove, wrote to the Committee about the ONS’s role in improving the presentation of data:

The presentation of data at No 10 press briefings has improved, helped by the later involvement of ONS staff, but early presentations were not always clear or well founded, and more recently a rushed presentation has undermined confidence.27

32.However, while much of the effort of Civil Servants to improve the data and its presentation is to be commended, the Committee remains concerned about the presentation of some data and that political considerations might drive the narratives within which those examples are presented.

33.The Government has made significant steps in the presentation of data throughout this pandemic, including through the Covid 19 dashboard. But it is still presenting some graphics which do not meet the basic standards that we would expect. The Committee welcomes the intervention of UKSA and the Royal Statistical Society in supporting Departments to produce clear graphics.

34.Graphics used by Government, for example in slide packs and briefings, should meet Government Statistical Service good practice guidelines on data visualisation. They should always meet the accessibility regulations, which are now law.

The politicisation of data

35.A number of contributors to this inquiry raised concerns that data presented by Ministers was sometimes framed by political considerations. As Full Fact explained in its written submission:

… Ministers seemed to choose certain numbers in order to paint a more positive picture of the situation—for instance when the Prime Minister overstated the number of schools with returning students, or when the Health Secretary used a confusing metric about the proportion of tests turned around in “24 hours” that actually included tests that were returned the next day.28

36.The example of test and trace data was widely cited in the written and oral evidence given to the Committee. Test and trace was introduced to inform people that they had been in contact with a person who had received a positive Covid 19 test, so that they could isolate and prevent further spread of the virus. The success of this programme was reliant on rolling out a large testing programme. In April 2020, the Secretary of State for Health and Social Care had promised that 100,000 tests a day would be undertaken.29 The Covid 19 daily update on 30th May claimed that “there have been 1,023,824 tests, with 122,347 tests on 30 April.”30 It later transpired that there was significant double counting in this number, prompting intervention from the Office for Statistics Regulation. Ed Humpherson, Director General of Statistics Regulation at UKSA, wrote to us explaining:

The target of 100,000 tests per day was achieved by adding tests sent out to tests completed. As predicted, there was huge double counting, to the extent of some 1.3 million tests that were eventually removed from the figures in August. The controversy over testing data seems likely to continue to undermine the credibility of statistics and the use that politicians make of them.31

37.It is not possible to know whether this was genuine human error, political motivation or (as is perhaps most likely) a combination of the two. But when UKSA intervened in July, it clearly stated concerns about the incentives facing Ministers. It outlined that the first purpose of the testing statistics was to understand the epidemic, and the second was to help manage the test programme. It concluded that:

The way the [testing] data is analysed and presented currently gives them limited value for the first purpose. The aim seems to be to show the largest possible number of tests, even at the expense of understanding. It is also hard to believe the statistics work to support the testing programme itself. The statistics and analysis serve neither purpose well.32

38.More broadly, both the President of the Royal Statistical Society and the Chair of UKSA have said that statistics appear to have been used to further political narratives. The Royal Statistical Society said:

At times it has seemed that the presentation of statistics has been impacted by political considerations.33

39.And Sir David Norgrove, Chair of UKSA told us:

it is clear that political pressures have led to some of the weaknesses in the handling of Covid 19 statistics.34

40.The Committee is very clear in its view that statistics should be used for the purpose of genuinely informing the public and, as is discussed later in this report, it feels that open and honest communication builds trust even when the Government has fallen short of its promises. It is disappointing to hear that so many people who wrote to us felt that data had often been “used as a rhetorical addition to emphasise an argument, rather than genuinely trying to inform the public”.35

Ministerial and Departmental responsibilities for statistics

41.The first principle of the UKSA Code of Practice for the use of statistics is “Trustworthiness”. This includes “honesty and integrity” and that “statistics, data and explanatory material should be presented impartially and objectively”.36 Observance of the UKSA Code of Practice is a statutory requirement on all organisations that produce official statistics, which includes all Government Departments.37 The Ministerial Code, however, only asks Ministers to be mindful of the UKSA Code of Practice.38

42.In November 2020, the Office for Statistics Regulation (an arm’s-length body of UKSA) published a transparency statement on the use of Covid 19 data which set out the following three principles:

1. where data is used publicly, the sources of these data or the data themselves should be published

2. where models are referred to publicly … outputs, methodologies and key assumptions should be published at the same time

3. where key decisions are justified by reference to statistics or management information, the underlying data should be made available.39

43.It is evident that Ministers have not always lived up to the expectations of the UKSA Code of Practice. Notably, the Office for Statistics Regulation has written to this Committee on a number of occasions highlighting instances where numbers have been quoted without underlying data being available. This has included: numbers of prisoners with Covid 19;40 management information on rough sleepers and Covid 19;41 and management information on Universal Credit.42 In many cases (as outlined in correspondence from UKSA, referenced in this paragraph), the responsible department subsequently published underlying data, but it should not take the intervention of the regulator for this to happen. And, in spite of repeated interventions, Sir David Norgrove, Chair of UKSA, wrote in February 2021 stating that this was an ongoing problem:

Ministers have sometimes quoted unpublished management information, and continue to do so, against the requirements of the Code of Practice. Such use of unpublished data leads of course to accusations of cooking the books or cherry picking the data.43

44.Statistics quoted by Ministers have not always been underpinned by published data, which goes against the UKSA Code of Practice. Publishing the underlying data is key to transparency and building trust. When the underlying data is not published, numbers may be used to make politicised points and members of the public, journalists and Parliamentarians have no way of verifying the information shared. This means constructive debate cannot happen.

45.When Ministers or senior officials quote statistics, the underlying data must be published. This is already an Office for Statistics Regulation expectation, and OSR should continue to inform this Committee—as it has throughout this inquiry—when it finds examples of statistics that are quoted without published data to back them up.

46.Going forward, Ministerial statements published on Government websites must include hyperlinks or footnotes directing to the detailed data underpinning any numbers or statistics quoted. This should apply to all areas where data is used, not just in relation to this pandemic.

47.The Ministerial Code needs to be strengthened so it is clear that Ministers are required to abide by the UKSA Code of Practice in their presentation of data. The UKSA Code includes the principle of trustworthiness that builds “confidence in the people and organisations that produce statistics and data”. Abiding by the UKSA Code of Practice is a statutory requirement for Government Departments. It is simply not enough to ask Ministers to be “mindful” of the UKSA code.

Informal advisors to the Government

48.In recent months, members of SAGE have appeared on news and discussion panels to share their views on the data used to inform the response to the pandemic. It has, at times, been questionable how helpful these interventions have been in informing the public about the pandemic, especially when different academics have differing views on the data or the response.

49.The nature of SAGE’s contribution might not always be well understood by the public. SAGE is not a standing group nor is it a decision-making body. It is formed for the specific and time-bound purpose of supporting the Government during emergencies. SAGE guidance from 2012 states “SAGE aims to ensure that coordinated, timely scientific and/or technical advice is made available to decision makers to support UK cross-government decisions in COBR.”44

50.SAGE advisors include Civil Servants (such as Professor Chris Whitty and Sir Patrick Vallance) alongside independent academics and scientists providing their advice for free. Unlike Ministers and Civil Service Officials, the independent advisors are not bound by a code of conduct.

51.Previous manifestations of SAGE have met for very short periods of time. For example, SAGE met five times between February and August 2016 to provide advice on the Zika outbreak.45 When SAGE was activated in early 2020 to discuss Covid 19 (with its first “precautionary” meeting on the 22nd January), there could not have been the expectation that it would go on to meet over 80 times46 and convene 294 named experts (at the time of writing).47

52.Arguably, it is not helpful to stop academics from commenting publicly, as this might conflict with their paid employment (writing articles, research and teaching), but given that SAGE has moved into the public discourse in an unexpected and unprecedented way, guidance must be given to members on how they should engage with the media. Indeed, the SAGE framework published by the Cabinet Office in 2012 states:

Most emergencies attract significant media interest and experts are likely to want to talk about their work, the SAGE secretariat should provide SAGE members with clear guidance on confidentiality. This should explain what can and cannot be said for security reasons and the requirement to take account of the FOI Act.48

It is unclear if any guidance has been provided to members during Covid 19.

53.When SAGE advisors speak publicly about the advice they have given to Government, it has the potential to create confusion and undermine trust. This report calls for greater transparency, including on uncertainties, but there also needs to be clarity about what has underpinned Government decisions. SAGE is made transparent through the official published records of its discussions and advice, and it is important that this is not framed or politicised by individual advisors. SAGE members, and experts from other bodies, can play a role in informing the public. However, as it stands, the public is not well informed about the role of SAGE advisors and might not be aware that differences of opinion are an inherent (even encouraged) element of discussion in that forum.

54.We are certainly not calling for SAGE advisors to be silenced, but for some expectations to be laid about the appropriate way to communicate, considering, amongst other things, the potential for the politicisation of their commentary. Civil Servants advising Government are expected to abide by a code of conduct, and there should be a similar code for SAGE advisors. The SAGE secretariat should produce guidance for members on how to engage with the media, in line with the 2012 Cabinet Office guidance. This should not be so restrictive as to prevent individual advisors from undertaking their normal work or from outlining the capacity in which they advised SAGE if required. This guidance should be made public.

Communicating uncertainty

55.No one is contesting that decisions made by Government have been difficult and have involved a significant degree of judgement. Politicians have been keen to stress that they are “following the science” but in reality science rarely produces a single correct answer. As the Royal Society put it in its submission to the inquiry, “at the frontiers of science, there is always uncertainty, and to pretend otherwise would be foolish”.49 Alongside the many connected and sometimes competing considerations, including those of public health and the economy, the data usually contains degrees of uncertainty. As Dr Ben Worthy noted in his submission:

almost all the data around Covid 19 is complex and contestable, for experts and the wider public. Even data such as death rates has provoked discussion, controversy, and revision.50

56.Ultimately, a judgement must be made and justified by Ministers and, as the National Statistician was at pains to note when he gave evidence in May:

The lockdown decisions are essentially political, but they must be informed by data.51

57.Advice produced by SAGE and its subgroups has outlined the uncertainties in the data it draws on, but as Professor Sir David Spiegelhalter noted, politicians are not always keen to admit this uncertainty:

An anxiety that many communicators have about admitting uncertainty is that, if we admit we do not quite know what the benefits of face masks are and things like that, maybe people will not want to wear them, maybe people will not obey the rules … . That can lead people to overclaim their confidence in the conclusions they are making.52

58.While these concerns are understandable, the overwhelming message we have received is that honesty and openness about the data builds trust and confidence. Professor Spiegelhalter went on to say:

nobody can expect Government or anybody else to have a crystal ball to say exactly what is going to happen … but that transparency, that honesty, that openness is what the public, purely as a duty, deserve to get. Also, pragmatically, the evidence suggests that will not lead to a negative response.53

59.And, as Dr Ben Worthy explained:

It is interesting to note that the public seem to have a quite nuanced understanding of a lot of the trade-offs that are involved here54

60.During this pandemic, it is vital that the public comply with Government guidance and laws designed to prevent the spread of the virus. The message that we heard from behavioural scientists was that, contrary to what one might think, admitting uncertainty is unlikely to undermine the public response and might have a positive impact. Professor Stephen Reicher, Professor of Social Psychology at the University of St Andrews, explained to the Committee that:

Sometimes, there is a sense that people cannot cope with uncertainty and people cannot cope with risk, so we have to phrase things in very simple and absolute ways. Actually … that is a rather problematic view, and … acknowledging uncertainty in our data… either does not undermine confidence or increases it. What really undermines confidence is where you say something such as, “This is absolutely the case” and then it proves not to be the case. Then, people stop believing anything you say.55

61.One example of this is PCR (polymerase chain reaction) testing for Covid 19. The Committee has received a number of submissions that note that PCR tests are not 100 per cent accurate. Indeed, a number of SAGE papers openly acknowledge this uncertainty in testing data, including a paper on swab testing arrivals from overseas which states “the sensitivity of the swab test (rt-PCR) is not 100 per cent, and the probability of a false negative result changes over the time since exposure (infection)”.56 While the scientific evidence received by Government has discussed uncertainty in testing data, some written submissions we received expressed a sense that Government policy has been driven by “flawed figures”.57 The resulting risk is that legitimate questions about the accuracy of data, including on testing, can expand into a generalised mistrust of government decision-making when uncertainties are not acknowledged.

Data and trust

62.Behavioural scientists told us that people with lower trust in Government and in the science of Covid 19 appeared less likely to follow rules and guidance. Professor David Halpern told us:

[We] estimate [that] about 8 per cent of people [in higher tiers] were significantly less compliant. Interestingly, they were not rich or poor—it was quite spread—and it was not particularly men or women, but they had two characteristics. First, they did not really believe in Covid 19. You might say, “Well, that’s because they don’t believe the data.” Secondly, they had low trust in government. The causality of that could go in lots of different ways, but we thought that that was very striking in the data.58

63.As Professor Reicher and Professor Halpern went on to explain, there are some significant complexities to consider when talking about trust, especially when considering the role of group or community dynamics. People are more likely to trust people who they see as “one of us” rather than “one of them”, and the Committee heard that it is often the behaviour displayed in our communities that influences our own behaviour. As Professor Reicher explained, the “social contract” with Government is central to compliance:

Compliance with Government, and authority in general, is very much a matter of the social relationship between the public and Government and whether we think of the authorities as “others” and acting “for” us. [If] you break that relationship—you break that relationship of trust, undermine common cause and undermine compliance.59

64.While behavioural scientists noted that the hard evidence was not absolutely conclusive on the question of trust and compliance (for example, they told us that trust in Government is lower in England than Scotland, but compliance is similar), there was a consensus that sharing data honestly and openly, complete with its uncertainties, was helpful. As Professor Reicher explained, trust is often based on “treating people as if they are one of us: treating them with respect”:

On many issues, we will have different people telling us different things … How do you decide between those different sources?… a lot of the time because of your social relationship to that source. How do we build the trust that leads people to accept the information? Well … a lot of it is about treating people as if they are one of us: treating them with respect, listening to them, being transparent with them. Therefore, providing information is not only the basis of science, but the basis of building up the relationship of trust that is going to be critical to people accepting the information they are given, so the answer is that not only is it important to be transparent with information; it is absolutely foundational.60

65.This view that trust was central to a social contract between the Government and the people was reflected in much of the evidence we received. Professor Sylvia Richardson of the Royal Statistical Society told us that:

The confidence of the public, and all actors in the system, is crucial in any major health protection challenge. The psychological contract of trust, goodwill, and confidence between the public and system leaders is an important component of the response to a pandemic. When this is undermined, the public may disengage from the behaviours needed.61

66.There was deep concern that this trust between the Government and the people had been undermined or broken. While many submissions cited political events, Sir David Norgrove reflected the theme of many submissions (including from the Faculty of Public Health and the Health Statistics User Group) when he referenced the misuse of data. He told us that:

Perhaps most important is the damage to trust from the mishandling of testing data … The controversy over testing data seems likely to continue to undermine the credibility of statistics and the use that politicians make of them.62

67.The Committee also received submissions citing research which outlined the decline in trust throughout the course of the pandemic. For example, Dr Ben Worthy and Stefani Langehennig wrote:

Politicians were initially aided by a ‘rally around the flag’ effect. This has now faded. There was a fall in trust in politicians after March-April and a large drop again in April-May. By September 2020 one detailed study concluded that ‘citizens granted the government considerable trust at the beginning … but that has started to fray in response to perceived confusion and mismanagement’.63

Anxiety, risk and behaviour

68.Alongside the inherent democratic good of sharing data on the pandemic, there is also a practical imperative of informing people so that they can adapt their behaviour.

69.One theme of the written submissions we received was the idea that the Government was using data in an attempt to scare the public into compliance. Referencing the use of “reasonable worst-case scenarios”, Professor Sir David Spiegelhalter stated:

I don’t want to ascribe motivations to anyone, but if someone were trying to manipulate emotions and wanting to frighten rather than inform, then this is the kind of thing they might do.64

While he is right that we cannot ascribe motivation, it is a concern shared by the Committee that large projections of infections or deaths are being used in an attempt to stoke anxiety rather than to inform the public.

70.In 2011, the (then) Department of Health published a pandemic preparedness strategy which included papers on public communications. These papers stated that higher rates of personal anxiety might be associated with people taking measures designed to protect them and their wider community.65 We tested this theory, asking statisticians whether data was a good way of communicating personal risk and behavioural scientists about how this might change individual behaviour.

71.The overall message conveyed to us by behavioural scientists was that creating a sense of anxiety alone was not sufficient to change behaviour and might even be counterproductive. Professor Halpern told the Committee:

there are a number of popular views that are sometimes misconceptions. Anxiety is one of those. It makes sense to say something frightening, because you will catch people’s attention, but, behaviourally, it is not very effective.66

72.Professor Reicher went on to explain to the Committee that:

Simply inducing fear leads people to turn away and to turn off. What is effective—it is a subtle distinction—is to get people to understand risk and to understand what they can do to mitigate that risk.67

73.In practical terms, this means informing people about the scale of the pandemic and associated risk without falling into the trap of using large numbers to induce anxiety. As Professor Halpern explained:

People were given more information about whether the level of cases in their area was high or low, as well as information about their own personal risk—if it was high or low—and [we asked] would that change what they did? The answer is that it would, really quite significantly, and in particular on social contact ... —15 to 20 percentage points, which is very large— … There is clearly a case for giving people enough information that is consequential for them, and they can do something about it, but throwing stats at people just because you want to get them worried or something is not particularly effective.68

74.Equally, the message from statisticians was that big numbers are not even helpful in understanding the scale of the pandemic or individual risk. Professor Spiegelhalter told us:

What we found is that these numbers … do not make a lot of sense to people, and so it is not the numbers alone. What helps is to give them context by comparing them with other personas. This is the risk of a healthy 25-year-old woman; this is the risk of a middle-aged Asian man with diabetes; this is the risk of an 85-year-old in a care home and something else. If you put those on a scale and say, “You are in here,” that enables people to get a much better idea of where they lie. Risk communication is a tricky business and it is certainly not a matter of just giving people the numbers. People need help.69

Role of the media

75.As noted previously, most people receive information about the pandemic from secondary sources such as media reports (including Ministers featured in the news), rather than from SAGE papers or Government websites. This means the media plays a key role in helping the public understand the pandemic and, as we also stated earlier, in narrativizing the data.

76.Some examples of the media misrepresenting data were drawn to the Committee’s attention. For example, Full Fact said:

We saw a lot of articles and commentators compare the number of deaths from “the flu” to the number of deaths from coronavirus. This is based on a misunderstanding of an ONS release reporting the number of death certificates that mentioned “influenza and pneumonia” or Covid 19. This isn’t the same as these conditions being the underlying cause of death.70

77.Of course, the “narrativizing” of data can also be a positive. As we discussed earlier, people might struggle to engage with the data directly or understand what it means in terms their own lives. As Ed Conway of Sky News explained:

Sometimes statistics can be dense; sometimes they are in need of context; and sometimes they are in need of illustration. The role that we play is to try to explain the statistics, to present them in a way that seems relatable and immediate to people so that they do feel they are relevant to their lives.71

78.The context in which people receive information is vital to how it is understood, and Ministers, Departments and Government agencies need to be mindful of this when preparing announcements. While some of this is outside the Government’s control, transparency in data releases (including notes on uncertainties and methodologies) can, at the very least, ensure the public and journalists are able to check or counter false narratives by referring back to the original source. The Committee welcomes moves by the ONS to update the releases from which the flu and Covid 19 comparisons were made. As Full Fact told us:

Given the apparent confusion [about Covid 19 and flu deaths], we also spoke to the ONS and were pleased that future releases included a clear statement explaining that a mention on a death certificate didn’t mean it was the underlying cause of death.72

Full Fact also noted that “the Sun and the Spectator added lines into their stories to clarify this.”73

79.The importance of clear data releases was also highlighted by Ed Conway, who noted that the media also had a role in checking statements made by Ministers:

My instinct with all these things is to try to go back to the data itself, the primary material, and to explain the different contexts whereby you could explain that. Clearly, there is nothing new about politicians taking pieces of information and using them as justifications to carry out their decisions… Our role in this is just to go back to the primary source material and say, “They are saying this. Is that really what the numbers say? Is there an alternative prism through which you could look at these numbers that would come up with a different view?”74

80.Ministers said that there is work underway to find out which messages are cutting through to the public. This was not specific to data or to media interpretations, but the Committee welcomes efforts to understand how messages are landing in general. The Paymaster General said:

[This work] will look at whether those messages are landing, whether they are understood. It will have disaggregated data, so it will be looking at particular audiences. It will also look at the information that has been gathered about behaviour, about where there have been breaches and where there are hotspots around the country, and it will be looking at doing some particular information analysis about why messages might not be cutting through with particular audiences. That is extremely thorough. It will always do focus groups, both to test how things are working and also in the design of those messages as well.75

81.Building trust between leaders and the public is essential to the response. The evidence the Committee has received, including from behavioural scientists, shows that people respond to open and honest information that is clear about the uncertainties within it. Some data has been communicated with the apparent intention of creating a more favourable view of the Government, and some appears to have been used to provoke anxiety rather than to help people understand risk. It is disappointing to hear that the way data has been presented might have undermined public trust.

82.Government communication needs to focus on informing the public openly and honestly. As we move into the next stage of the pandemic, with the roadmap to lifting restrictions entirely, this becomes even more pertinent. Previous recommendations cover clarity on source information and adherence to the UKSA Code of Practice.

12 Sense about Science (DTA0040)

24 Royal Statistical Society (DTA0042)

25 Health Statistics User Group (DTA0033)

26 Royal Statistical Society (DTA0042)

28 Full Fact (DTA0048)

29 HM Government, Health Secretary sets out plan to carry out 100,000 coronavirus tests a day, 2 April 2020, accessed 1 March 2021

30 Department of Health and Social Care, Daily update, 1 May 2020, accessed 1 March 2021

32 UK Statistics Authority, Sir David Norgrove response to Matt Hancock regarding the Government’s COVID-19 testing data, 2 June 2020, accessed 1 March 2021

33 Royal Statistical Society (DTA0042)

36 UK Statistics Authority, Code of Practice for Statistics, accessed 1 March 2021

38 The Cabinet Office, Ministerial Code, August 2019, accessed 1 March 2020

39 Office for Statistics Regulation, OSR Statement regarding transparency of data related to COVID-19, 5 November 2020

45 HM Government, Scientific Advisory Group on Emergencies, Minutes, Zika Virus, accessed 1 March 2020

46 HM Government, Scientific Advisory Group on Emergencies, Minutes, count accurate as of 22 February 2021

47 HM Government, Scientific Advisory Group on Emergencies, list of participants at SAGE and subgroups, accessed 22 February 2021

49 The Royal Society (DTA0039)

50 Dr Ben Worthy (Senior Lecturer at Birkbeck College) (DTA0011)

51 Public Administration and Constitutional Affairs Committee, Oral evidence: The work of the Office for National Statistics, HC 336, Q47

57 Dr Clare Craig (Consultant Pathologist) (DTA0009)

61 The Association of Directors of Public Health (DTA0046)

63 Dr Ben Worthy (Senior Lecturer at Birkbeck College) and Stefani Langehennig (DTA0011)

65 Department of Health, Principles of effective communication Scientific Evidence Base Review, Supporting Documents for UK Influenza Pandemic Preparedness Strategy, March 2011

70 Full Fact (DTA0048)

72 Full Fact (DTA0048)

73 Full Fact (DTA0048)

Published: 15 March 2021