Transforming the UK’s Evidence Base

This is a House of Commons Committee report, with recommendations to government. The Government has two months to respond.

Fifth Report of Session 2023–24



1 Introduction

1. Data are everywhere. Each day, often without knowing it, most of us are generating vast numbers of new data points about our spending patterns, commuting patterns, and much else besides.

2. Put to good use, this mass of data might transform not just the way public services are delivered in the UK, but also the lives of citizens. But how effectively are government officials collecting, analysing, and making use of novel sources of data? And how can we be sure that our privacy is being protected, and that data are being used in ethical ways? This Committee sought to find the answers to these questions and others as part of our inquiry, Transforming the UK’s Evidence Base, which we launched in July 2023.1

Inquiry scope

3. The Public Administration and Constitutional Affairs Committee (PACAC) is responsible for overseeing the work of the UK Statistics Authority (UKSA) on behalf of Parliament. Over the course of this Parliament, it has become clear to us that UKSA, and the broader public data landscape within which it operates, are transforming rapidly. Officials have won statutory access to novel sources of data, and new infrastructures - like the cross-government Integrated Data Service - are being developed which will allow government analysts and approved researchers to link data in ways that have never before been possible. Indeed, shortly before we launched our inquiry, the Office for National Statistics (ONS) consulted on proposals to end the decennial census - suggesting that emerging data sources might offer timelier, more granular, and more cost-effective insights into the UK’s population than the large-scale survey on which we have traditionally relied could ever hope to.

4. Our inquiry has sought to explore this changing landscape, and the UK’s readiness to respond to it. We have considered four questions in particular:

a) What is the current system for producing and communicating official statistics and analysis in the UK?

b) How is the UK’s public data landscape changing?

c) What does it mean to protect privacy and use data ‘ethically’? and

d) How can the public data landscape serve its users?

5. In line with PACAC’s specific responsibilities, our Terms of Reference and call for evidence2 sought evidence on these four questions in the context of official statistics and analysis specifically, two areas which fall under the purview of the UKSA. As many submissions to our inquiry also spoke to research, we expanded our focus to encompass this category of evidence too. (These terms are explored in further detail at paragraphs 12–22.) In our inquiry we have not sought to scrutinise the day-to-day work of any one government body or programme, but have instead explored the expansive framework which delivers public evidence, and considered how the UK might deliver its citizens a public evidence base that is fit for purpose for years to come.

Inquiry process

6. When we launched our inquiry in July 2023, we opened a call for written evidence, and we are grateful to all those who responded. Over the course of our inquiry, we also held five public evidence sessions, and met with representatives from France’s National Institute for Statistics and Economic Studies in Paris. We heard a range of views; but one thing that united witnesses to our inquiry was a passion for data and evidence, and a belief in its potential to transform the UK and the lives of its citizens. We thank all those who have contributed their time and their considered views to this inquiry.

7. It should also be noted that our inquiry ran in parallel with a review of the UKSA conducted by Professor Denise Lievesley, on behalf of the Government. Some of the themes in our report - particularly those that relate specifically to statistics - were explored in detail by Professor Lievesley. We found her work to be a particularly valuable piece of evidence in the conduct of our inquiry, and thank her for her contribution.

8. We hope that this Government - and those that follow - will think carefully about the issues raised in our report. The health of our democracy, our economy and our society depends on good evidence.

2 How does government collect, use and communicate evidence? Understanding a complex and fragmented landscape

9. We began our inquiry with a seemingly straightforward question: how does the UK government currently collect, use and communicate evidence? The answer, we discovered, was not straightforward at all. There is a staggering number of public bodies that collect, use and communicate an enormous variety of evidence, some governed by statutory Codes, and others by rather vaguer standards. Furthermore, the lines between different types of evidence are blurry; the same data may feed into different types of evidence, and different types of evidence can be used for different purposes. Only by exploring this messy landscape, however, can we hope to understand how well served we are by official data, and where opportunities might exist to improve the evidence base upon which we all rely.

What forms can evidence take?

10. In the broadest terms, evidence is information that allows us to better understand the world around us and it can take a huge variety of forms. Businesses might use evidence about customers’ online behaviour to inform decisions about new products to stock. Charities might depend on evidence from government on deprivation and demography to understand where best to focus their efforts. At home, people may use historical documents and DNA kits as evidence, as they map their family tree.

11. While all of these forms of evidence are important, our inquiry has sought to explore areas where government might directly influence the quality and range of evidence available, and has therefore focused on three ‘types’ of evidence: official statistics, government analysis and research. Over the following pages, this report summarises the frameworks currently in place relating to statistics, analysis and research, before delving into the masses of raw data on which all of this evidence depends.

Figure 1 describes the relationship between Government analysis, official statistics, and research. Government analysis is any analysis of data that is undertaken by officials, to inform or evaluate policy. Government analysis is not routinely published. Official statistics refers to a specific sort of analysis, required to comply with the Code of Practice for Statistics. It is always made publicly available. In the context of this inquiry, research is used to refer to work conducted by researchers and academics outside of government, which draws from data supplied by public bodies. Research may be used in government analysis. Finally, the term data is used to refer to raw data, held by government bodies. These may be collected via surveys, in the conduct of government operations, or from private sector organisations. These raw data are not routinely published.

Statistics

12. Official statistics have long sat at the heart of the UK’s public evidence base. Indeed, the roots of today’s statistical system can be traced back to 1940, when Prime Minister Winston Churchill found himself in need of good evidence on the war effort. Frustrated by the quality and consistency of data available, he wrote to his Cabinet Secretary in November 1940:

“It is essential to consolidate and make sure that agreed figures only are used. The utmost confusion is caused when people argue on different statistical data … . Pray look into this, and advise me how my wish can be most speedily and effectively achieved.”3

As a result of this exchange, the Central Statistical Office (CSO) was established in January 1941. The UK’s statistical system grew and evolved in the years that followed, and in 1996 the Office for National Statistics (ONS) was created, as the CSO merged with the Office of Population Censuses and Surveys.

13. A decade on, the Statistics and Registration Service Act 2007 reshaped the UK’s statistical system, ushering in the framework which is still in operation today.4 Crucially, the Act:

a) established ‘independence’ for statistics, by removing the Office for National Statistics from the control of Ministers, and placing it under the new, independent UK Statistics Authority (UKSA), directly accountable to the UK Parliament and the devolved legislatures;

b) maintained a de-centralised system for producing statistics - meaning that, while ONS collects and communicates many of the UK’s key statistical outputs, so too do devolved administrations, central government departments and smaller public bodies;

c) established a Code of Practice for Statistics, and a regulator (now known as the Office for Statistics Regulation (OSR)) to monitor compliance across the UK’s suite of official statistics.

Figure 2 depicts the UK’s statistical system. The UK Statistics Authority (UKSA) reports to the UK Parliament and the Devolved Legislatures. The Office for National Statistics (ONS), the Office for Statistics Regulation (OSR) and the Government Statistical Service are each overseen by the UKSA. The ONS is the UK’s largest producer of official statistics and its recognised National Statistical Institute. It is responsible for collecting and publishing statistics relating to the economy, population, and society at national, regional and local levels. It is headed by the National Statistician. The OSR is the independent regulator of statistics in the UK. Its core functions include setting the statutory Code of Practice for Statistics, assessing compliance against that Code, awarding designation to statistics that comply fully with the Code, and reporting any concerns about quality, good practice, or comprehensiveness relating to official statistics. The OSR is headed by the Director General for Regulation. The Government Statistical Service is a network of all those involved in the collection, analysis and communication of official statistics. This includes statisticians working in devolved administrations, central government departments, and other public bodies. They are jointly accountable to the National Statistician and their departmental managers.

14. Under legislation, official statistics can be produced by the ONS, any government department, the Scottish Administration, a Welsh ministerial authority, a Northern Ireland Department, or any other organisation named in secondary legislation (of which there were, at last count, 98).5 ONS estimates that there are approximately 9,000 people involved in the collection, production and communication of official statistics.6

15. The independence of statistics, as established in the Statistics and Registration Service Act 2007, was felt by many of our witnesses to be one of the system’s key strengths. Mr Chris Morris, Chief Executive of the fact-checking charity Full Fact, told us:

“ … we shouldn’t underestimate how lucky we are to have a reliable system of statistics that is politically independent. In so many countries now, the statistics have become so politicised that it’s actually quite difficult to find an independent body of evidence on which you can check facts. It is something we perhaps take for granted”.7

Professor Lievesley told us the framework was internationally respected, and that its independent regulatory function was a feature envied by other nations.8 Of course no system is perfect, and witnesses also noted several shortcomings - a perceived lack of responsiveness to user needs, long-standing data-gaps, and challenges accessing new data sources, for example - which we explore later in this report.

Government Analysis

16. Statistics are but one type of evidence produced and used by government; there are many others. Ministers might consult a variety of analyses as they take major decisions - costings prepared by economists, or information on how policies might affect different demographic groups for example - and civil servants similarly depend heavily on a variety of evidence to help them evaluate the success of policies.

17. Historically, these different types of analyses have been conducted by government analysts under the auspices of distinct professional standards, such as the Government Economic Service, the Government Operational Research Service, or the Government Geography profession. In 2018 - recognising that what Ministers and senior civil servants really need is comprehensive ‘evidence’, rather than separate economic, statistical, and operational analyses - the Government launched a new ‘analysis function’. The function was intended both to introduce common standards to the way different forms of analysis are produced across government, and to promote their use in decision making.9 Today, the analysis function brings together 16,000 analysts across seven professions (see Figure 3), and is led by the UK’s National Statistician.10

Figure 3 sets out the seven professions that make up the government analysis function: actuaries, economists, geographers, operational researchers, social researchers, statisticians and data scientists, and digital, data and technology professionals.

18. As we explore in greater detail later in this report, we have found it challenging both to understand the role analysis plays in the work of government on a day-to-day basis, and to make an assessment of the quality of analysis produced and used by officials and Ministers. This is because government analysis is largely conducted behind closed doors, and very few of the government’s analytical products ever see the light of day.

Research

19. Official statistics and analysis are examples of evidence which rely on government officials themselves analysing data. But, as submissions to this inquiry made clear, there is another powerful lever available to any government hoping to enrich its public evidence base. That lever is research.

20. By opening up government data to trusted researchers for projects that deliver ‘public good’, the UK can take advantage of the extensive expertise which exists outside the public sector, to deliver better information to citizens, to help inform its own policies, or to support businesses and third-sector organisations in delivering their own priorities. During our inquiry, we heard many examples of the powerful role research can play. For example, during the pandemic the UK’s then Chief Scientific Adviser turned to external researchers, to help decision-makers understand the societal impacts of the pandemic.11 Elsewhere, in recent years, researchers granted access to data have delivered crucial new insights on policy issues including low pay over time, ethnic disparities in sentencing, and the impact of different gestational ages at birth on the development of children.12

21. The Digital Economy Act 2017 introduced new provisions, aimed at making it simpler for accredited researchers to access and link de-identified data13 held by public authorities. Under the Act, the UKSA is the statutory accrediting body responsible for the accreditation of processors, researchers and their projects.14 There is currently a patchwork of different infrastructures through which researchers access government data under these provisions - ONS’s Secure Research Service and the Secure Anonymised Information Linkage (SAIL) Databank, to name but two. Some Departments choose to make the majority of their data available in stand-alone infrastructures, under separate legislative provisions. HM Revenue and Customs (HMRC), for example, has its own ‘data-lab’ through which government users, and representatives from academia and impartial research organisations can access HMRC data “to drive innovation and productivity across the UK, enhancing the delivery of public services to improve people’s lives”. This landscape continues to evolve, as ONS and Government attempt to develop a cross-government Integrated Data Service (IDS).15

22. Our witnesses spoke with a single voice about the importance of research to the UK’s public evidence base, but several hinted that it is becoming harder, rather than simpler, for external researchers to access the public data they need, to deliver research for the public good. Many of the challenges faced by researchers - around data access and prioritisation - mirror those raised in the context of statistics and are explored in the chapters which follow.

23. There is much to be proud of across the UK’s public data-landscape. The independence of its statisticians from the government of the day, the innovative work being undertaken by a skilled researcher community, and our unique regulatory framework all received praise from our witnesses.

24. As it stands, the UK’s public data landscape is highly fragmented. This need not be a problem (indeed the model offers several conceivable advantages) if the many actors involved in generating, analysing and communicating evidence work together effectively. Our inquiry suggests that in several areas, however, this is not yet the case.

3 An explosion of data: Navigating new data sources and technologies

25. In recent years, the amount of data available to statisticians, analysts and researchers across the UK has flourished. Our inquiry heard that new data sources offer the potential to deliver evidence more quickly, in greater detail, and in more cost-effective ways. But if the UK is to make the most of this explosion of data - if it is to harness all of the opportunities which it offers - there are challenges to be overcome, and realities to be acknowledged.

Surveys in the UK

26. Statisticians, analysts and researchers have traditionally relied heavily upon surveys in their work. The decennial census is perhaps the UK’s most well-known survey; every ten years citizens answer questions on topics such as age, sex, marital status, health, education and housing, and their collected responses provide not just a detailed snapshot of the UK at that one moment in time, but also a valuable resource against which other datasets can be reconciled.16 In 2024 surveys remain crucial to the UK’s evidence base - we rely on them to understand the very basics of our nation - the health of our labour market, levels of crime, and household finances, for example.17

27. Across the world, surveys are becoming increasingly challenging for statisticians to conduct, as it becomes harder and harder to secure responses from citizens.18 Indeed, problems are so acute in the UK that, in late 2023, the ONS suspended its regular publication of labour force estimates, having concluded that it could not rely on the small number of responses it had gathered to paint an accurate picture of the UK’s labour market, particularly for certain subgroups of the population.19 Surveys are also costly, and increasingly so, as statisticians turn to incentives to arrest declining response rates.

28. But submissions to our inquiry also highlighted the very many advantages that surveys offer curious analysts and policymakers.20 They capture information that no other data source yet can - on intentions and motivations, for example. Unlike any other data source available, surveys are designed specifically to answer questions, and they can be adjusted to gather information on emerging policy priorities (cost-of-living questions, for example, or the nation’s understanding of Covid regulations). Because they use well-established sampling methods, they can produce a more representative sample of the UK population than many other sources are able to achieve, and are less subject to bias. Finally, as we heard repeatedly over the course of our inquiry, by allowing us to harmonise the questions we ask across and between nations, surveys offer us greater opportunities for comparable data than might be possible when relying upon, for example, data gathered by different administrative authorities.

Administrative data

29. Administrative data - that is, data collected in the conduct of government operations - has been used alongside survey data in the production of evidence and analysis for many years. Statistics on births, for example, are not gathered via a survey, but rather via the information supplied when live births and still births are certified and registered as part of the legally-required civil registration process.21 In recent years, as public bodies have digitised services, government analysts have increasingly used the resulting data in more creative ways - both to supplement survey data, and to form stand-alone pieces of evidence.

30. Administrative data offer many potential advantages; they are timely, and frequently offer many more datapoints than might be possible to achieve by means of a survey. But relying on administrative data poses challenges too. Because they are designed for operational purposes, they may not answer policy questions in the way users want. For example, while some might imagine that the crimes recorded by police offer a reasonable proxy for crimes experienced by citizens, they would be forgetting that many crimes will not be reported, and that the recording practices of police might be affected by targets (for instance, a rise in knife crime might indicate that police are targeting this type of crime, rather than an increase in its incidence). Crucially, administrative data are also outside of the control of analysts. Any changes to the collection of data, driven by operational requirements, could impact the quality of associated evidence. The removal of child benefits in 2013, for example, saw a significant deterioration in the coverage of administrative datasets within HMRC and the Department for Work and Pensions.22 Finally, administrative data rarely capture a representative cross-section of the population; the National Statistician spoke to us about the poor representation of young males in GP registration lists, to illustrate this point.23

Emerging sources of data

31. Beyond survey and administrative sources, several new sources of data are now emerging which might be harnessed to improve the UK’s public evidence base. Witnesses sometimes used different terms,24 but for the purposes of this report, we use the terms online data, location data, and digital services data.

32. Online data are data publicly available on the internet. Examples might include job advertisements (from which we can glean information on our labour market); retailers’ price data (from which we can explore changes in costs over time) or social media posts. Location data are those data - whether collected via a smart-phone or a sat-nav - that tell us about location and movement. They might be used to help us understand changing patterns in travel - as they were, for example, during the Covid-19 pandemic.25 Digital services data are data derived from our interaction with digital services - supermarket loyalty cards, direct debit payments and such. During our inquiry, we heard of examples in which researchers used information on grocery spending habits, to estimate levels of deprivation across London.26

33. There is much to recommend each of these emerging sources; they are available in real-time, they are cost-effective, and the volume of data available often dwarfs that which might be gathered in a survey. However, we also heard repeatedly that these sources bring with them challenges. They are likely to contain significant biases; for example, a supermarket may attract a specific type of customer, and a social media platform a particular type of user, from which inferences about the UK’s broader population cannot easily be drawn. Our ability to draw inferences from these datasets is also vulnerable to any changes data owners might make to the way they collect their data. There are also important ethical questions around the use of these sources which the Government needs to address - which we explore in detail at Chapter 6.

Box 1: Goodbye census? Changing data sources in practice

While statisticians and analysts have been incorporating new sources of data into their work for years, a recent announcement from ONS suggests a more significant shift may soon be underway.

On 29 June 2023, ONS launched a consultation on its proposals to transform the UK’s population and migration statistics. In the proposals, the National Statistician wrote “a serious question can be asked about the role the census plays in our statistical system” and recommended that statisticians in future “primarily use administrative data like tax, benefit and border data, complemented by survey data and a wider range of data sources.”27 These proposals follow years of research within ONS about the potential of new data sources, and a suggestion made by Government in 2014 that “censuses after 2021 will be conducted using other sources of data.”28

The National Statistician told our Committee that administrative data sources offer the potential to deliver information to users more quickly and frequently than is possible using a census-model. He explained that statisticians had proven they could produce robust local population and income estimates, but noted that other areas - occupation for example - were more challenging.29

Submissions to our inquiry have suggested that while users of census data support ONS’s ambition to embrace new sources, and deliver data more frequently, concerns remain about whether data will be available at the granularity required on topics such as ethnicity, and whether data on topics such as sexual orientation will be available at all.30

The case for joining up data

34. There was a clear consensus among those we spoke to that the real power of evidence is not to be found in any single source - new or old - but rather in the bringing-together of data.31 As the producers of the UK’s Understanding Society survey put it:

“As we move beyond discrete goal-focused policy interventions and into an era of more complex challenges such as spatial disparities, public health, climate change, an ageing society, impact of technology, etc., we can expect to see demand for more multi-dimensional data, the use of more complex methods in research, and growing calls for cross-departmental data sharing and working”.

35. Certainly, the most exciting examples of evidence we came across linked sources together in just this way, allowing users of the data to consider questions from different perspectives and by different sub-groups of the population. The Longitudinal Educational Outcomes (LEO) dataset, for example, brings together de-identified education, employment, benefits and earnings data. In doing so, it allows government analysts and external researchers to explore a range of questions, such as: how does the quality of education affect long-term outcomes in the labour market? how do some education policies affect social mobility? and, which education pathways are likely to result in positive labour market outcomes?32 Elsewhere, the ONS’s analysis of ethnic contrasts in deaths involving the coronavirus (COVID-19), based on a unique linked dataset which encompassed 2011 Census records, death registrations, Hospital Episode Statistics (HES), and primary care records retrieved from the General Practice Extraction Service (GPES) Data for Pandemic Planning and Research (GDPPR), provided important information to health professionals managing Covid cases.33

36. The Government recognised the potential of linked data some years ago, introducing new legislative provisions for the sharing of information for research and statistical purposes within the Digital Economy Act 2017.34 The Act both:

a) provides the ONS (as the UKSA’s executive arm) with greater and easier access to data held within the public and private sectors in support of its statutory functions in the production of official statistics and statistical research (statistics powers); and

b) facilitates the linking and sharing of de-identified data by public authorities for accredited research purposes, to support valuable new research insights about UK society and the economy. Under these research powers, the UKSA is the statutory body responsible for the accreditation of processors, researchers and their projects.

37. There is almost universal agreement - from statisticians, researchers, and those who rely on high-quality data - that these powers are not yet being used to support the production of statistics or research as they should be.35 In her recent review of the UK’s statistical system, Professor Denise Lievesley described “widespread disappointment that [the passage of the Digital Economy Act 2017] has not resulted in the much needed step change of the quantity and frequency of data being shared.”36 ONS itself described a “complex and burdensome system, which leads to long lead-in times to agree data shares.”37 UK Research and Innovation (UKRI) described a “limited capacity [across Government] to invest the time and effort required to share data across departments, link this and create a research-ready dataset.”38

38. Some witnesses spoke positively about the way in which data had been shared quickly over the course of the Covid-19 pandemic, but we learned that over this period data were often shared on the basis of Control of Patient Information (COPI) notices, rather than under the provisions of the Digital Economy Act.39 Indeed, as it stands, section 5 of the Digital Economy Act 2017 specifically excludes the sharing of health and adult social care information with accredited researchers. A 2020 review of the Act noted some stakeholders had argued that health and social care data should be brought into the scope of legislation, but made clear that this would not be considered ahead of the completion of Dame Fiona Caldicott’s Review of Data Security, Consent, and Opt-outs.40

39. Why is it, outside of national crises, that data are not being effectively shared for statistical and research purposes? The barriers to data-sharing are not legal; sections 5 and 7 of the Digital Economy Act 2017 clearly allow for the sharing of data between public bodies, and with accredited researchers. Nor are the barriers the result of data protection - the Information Commissioner assured us he had seen no evidence that data protection regulations were an obstacle. We heard some suggestions of technical concerns,41 but most witnesses appeared to agree that the key barrier to data-sharing for statistical and research purposes is a cultural one. As the National Statistician Professor Sir Ian Diamond put it: “There is a real issue around the fact that we are a government of departments that operate almost, in many ways, as independent entities.”42 Co-Director for Analysis at the Cabinet Office, Mr Steffan Jones, suggested: “One of the issues we sometimes face is that the incentives are not always well aligned. It might be for the greater good to share data, but all the benefits lie elsewhere and all the risk lies with the person who owns the data.”43 In short, while data-collecting departments often bear the financial and technological costs associated with establishing data-shares, it is likely to be the ‘receiving’ departments that accrue benefits, whether by bolstering their own policy objectives, or simply through the publication of robust and detailed analyses.

40. There are various teams across government tasked with improving access to data for statistical and research purposes. At the highest level, the Government’s National Data Strategy is intended to define how “best to unlock the power of data for the UK”. Within this context, the Central Digital and Data Office in the Cabinet Office (CDDO) is responsible for cross-government data policy, standards and strategy. In practice, the CDDO team tell us that they have sought to address cultural barriers by establishing a Data and Technology Architecture Design Authority; launching a Data Sharing Network of Experts which specialises in unblocking complex data access requests between Government departments and agencies, and establishing a Data Maturity Assessment for bodies to use.44 Separately, the ONS tell us that they also provide an ‘acquisition service’ to data suppliers, to reduce the burden placed on them as far as possible. In some cases, this involves drafting ONS statisticians into the supplying organisation, developing a Memorandum of Understanding to address any concerns, or committing to improving the supplier’s data.45

41. Trying to understand the success of these endeavours has proven a challenge for us. Officials were able to provide almost no specifics about the datasets being sought, or where in government major problems lay. Whether this simply reflects a reluctance to offend those bodies whose data are desperately needed, or a more concerning failure to clearly articulate objectives and evaluate successes, we cannot be sure. It is telling, however, that issues with data-sharing do not appear to have changed significantly as a result of the Government’s work, since our predecessor committee hinted at early problems with the implementation of the statistics and research powers of the Digital Economy Act 2017 in their 2019 report.46

42. In her recent report, Professor Lievesley found: “It is not possible for the UKSA to resolve this issue alone … Thus the responsibility must lie with the Cabinet Office to bring departments together to remove the barriers to data sharing, to hold departments accountable when they are not sharing data as required by law, and to ensure that attention is given to the benefits of responsible, cross-government data sharing. In so far as there are financial constraints the involvement of HM Treasury at senior levels will also be required.”47

43. It is clear that the volume and variety of data generated within the UK has exploded in scale over recent years. While new sources of data have much to offer - not least in improving the timeliness of evidence, and allowing us to delve into issues in greater detail - there are things they simply cannot do. They cannot tell us about citizens’ intentions, they cannot be easily tweaked to capture the information an analyst is seeking, and they are prone to bias. The UK’s public evidence base will be best served, therefore, by bringing together old sources and new.

44. Despite the passage of relevant legislation in 2017, the UK has failed to bring its disparate datasets together to enrich its public evidence base. Instead, data wither in silos across countless government bodies. Witnesses to our inquiry were clear: the problems here are not legislative, and they do not result from challenges around data protection. The issue is much simpler: departments and public bodies choose not to share data because they lack an incentive to do so. And no central, coordinating leader has yet made a sufficiently strong case for the benefits of sharing data for statistics and research across government.

45. It is time for Government to do what it promised to do seven years ago, and to join up the UK’s evidence base. Given that the Cabinet Office’s existing initiatives for improving data sharing are self-evidently insufficient, it should in partnership with the Office for National Statistics develop a comprehensive new programme aimed at improving data-sharing for statistical and research purposes. The programme must clearly define deliverables and timelines, and must be owned by a senior responsible officer at an appropriately high level. In line with the recommendations of the Lievesley report, we also recommend that HM Treasury establish mechanisms so that the costs are not borne by individual Departments, but rather centrally. The Cabinet Office should prepare and publish an annual progress report on delivery against the programme.

46. Separately, the Office for National Statistics should publish information on the datasets it is seeking on an annual basis, setting out its rationale for seeking those data, and details on the status of the request - all of which should be made available on the ONS website.

47. While time in this Parliament now runs short, we recommend that the next Government review the exclusion of health and social care data from the Digital Economy Act 2017. There are understandable sensitivities around the sharing of health data, but it may be that the implementation of research provisions over the last seven years, and the work that has taken place in the context of the Caldicott review, offer a helpful new basis for discussion concerning their possible inclusion.

48. We support the Office for National Statistics in its ambition to deliver high-quality and timely population statistics. It is right to be considering whether new data sources might offer opportunities to improve the UK’s evidence base, and it is also right to be engaging closely with users of that evidence base, as decisions are taken on the future of the decennial census.

49. This Committee’s view - particularly in light of challenges around data-access - is that officials have not yet demonstrated that they can deliver the evidence users need, without a decennial census. We therefore recommend that the Office for National Statistics undertake further work on proposals for the future of migration and population statistics.

4 Who should evidence serve? Delivering evidence for the public good

50. Having explored the ways in which government currently collects, analyses and communicates data - as well as the impacts of a changing data landscape upon this framework - our inquiry also examined who public evidence should serve.

Evidence for all

51. There was general agreement among those we heard from that evidence produced by the UK government needs to serve many groups: academics, citizens, journalists, researchers, third-sector organisations, and - of course - Ministers and policymakers.48

Figure 4 provides examples of those users who rely upon the UK’s public evidence base. In government, elected officials and civil servants rely on data in the development and evaluation of policy. Businesses rely on official data sources to help them understand changes in the economy, and to identify and target markets. In research and academia, official data - both statistics and the detailed datasets underlying them - are analysed and linked to improve our understanding of the UK’s economy and society. Parliamentarians rely on data both in the conduct of their constituency work (where low-level geographical breakdowns are particularly important) and in the scrutiny of legislation and policy. Charities use official data to help target their activities, and to educate. The media are important intermediaries of official data, translating complex messages and identifying points of interest for their audience. Citizens themselves also rely on the public evidence base, whether gathering information about the neighbourhood they wish to move to, gauging the performance of elected officials, or simply seeking to learn more about the world around them.

52. Many of those outside government emphasised the necessity of impartial and robust evidence to a healthy democracy. Professor Denise Lievesley, in her recent review of the UK’s statistical system, suggested statistics “serve to empower, enabling citizens to call governments to account and providing a window on society. As such they are an indispensable part of a democratic society.”49 Connected by Data too spoke of the power evidence offers in “empower[ing] communities with insights into their own dynamics and … enabl[ing] them to be active stakeholders in the democratic decision-making process.”50

53. The Government itself - in its National Data Strategy51 - made the case that unlocking data (both official and private) would allow the UK to:

(1) boost productivity and trade;

(2) support new business and jobs;

(3) increase the speed, efficiency and scope of scientific research;

(4) drive a better delivery of policy and public services; and

(5) create a fairer society for all.

The strategy advocated for the unlocking of data for:

a) business (to enhance economic effectiveness through new data-enabled business models);

b) civil society (to better equip organisations to “reach the people most in need, at the time they most need it”);

c) the public sector (to drive “the better delivery of policy and public services”); and

d) scientists and researchers (to ensure that the UK remains “at the forefront of science and research”).

54. Serving the needs of so many users is no easy task. The many different parties across the UK in need of evidence hold different ideas about not just which issues evidence is needed on, but also how exactly we choose to measure and communicate that evidence. The Royal Statistical Society gives the example of inflation: while many parties want to understand how prices are changing over time, some (such as the Bank of England) primarily need a macro-economic measure which allows them to understand changes across the economy overall, while others (charities, for example) need to understand the very different experiences of inflation playing out across different subgroups of the population.52 More generally, experts and researchers will often want access to large, raw datasets, to conduct their own analyses, while citizens or time-poor employees in charities - perhaps with different skillsets - may prefer helpful commentaries and interactive data visualisations.

Who does our current framework serve?

55. If we accept that demands for evidence across the UK are practically infinite, how then do our public bodies make choices about which requests for evidence to pursue, and which to leave to one side? In an age of ever-tightening budgets, how do they decide whether to fund the creation of new internal datasets which may help the government of the day manage the economy, or instead channel that money into access to data for accredited researchers, with a view to securing medical breakthroughs?

56. The frameworks described in Chapter 2 provide limited guidance to government officials; statistics are for the ‘public good’; analysis is largely intended to serve actors internal to government; and research (at least that which relies on access to government data) should serve the public interest, whether by providing the evidence base for decisions which benefit the economy, society or quality of life of people in the UK, or by extending our understanding of trends in the UK.53 Beyond this, however, we learned that there is no framework in place to identify and prioritise demands for data in the UK. Statisticians conduct some engagement activities to understand the needs of users,54 and, while that can only be positive, we found no evidence of the information collected by statisticians across the many public bodies of the four nations of the UK subsequently being brought together, so that informed decisions might be taken in the round on what to prioritise and - crucially - fund. In the absence of such a mechanism, decisions about what demands to pursue end up being taken by many different individuals - sometimes with limited information about what their colleagues in other parts of government are doing. Senior leaders in the ONS make decisions about what they collect, while their colleagues in statistical institutes in the devolved administrations or government departments similarly take decisions about what evidence to collate in their own contexts.

57. This problem is a long-standing one. Our predecessor committee flagged it in a 2013 report on statistics,55 and, in his 2016 review of the UK’s economic statistics, Sir Charles Bean found that:

“The fact that UKSA is not always well-sighted on departmental statistical resourcing decisions and has no formal leverage to influence them, has led to a somewhat incoherent statistical landscape. Several [Heads of Profession] and senior civil servants noted that improving the quality of statistics generally came at the bottom of their departments’ list of priorities.”56

58. We also note the important role funding decisions play in determining what public evidence is available in the UK. Data can only be collected, analysed and published when public funds are made available for such work. Analysts may make bids to improve statistics or research as part of regular spending review processes, but their Ministers, and then HM Treasury, will ultimately decide which of these bids to pursue. The Government, in its own response to Professor Lievesley’s review, suggested government needs for statistics should “take precedence” over external demands, despite that sentiment contradicting the legislative reality defined within the Statistics and Registration Service Act 2007.57 Several witnesses expressed concerns to us that there is an insufficient emphasis placed on responding to the needs of users outside government.58

59. Given both the enormous breadth of demands for data in the UK, and the lack of a mechanism to identify and prioritise these demands, it is unsurprising that we uncovered examples of significant data gaps in the course of our inquiry. Full Fact highlighted gaps in the area of school absenteeism, while the Head of the British Academy, Mr Hetan Shah, pointed to gaps across a range of justice issues (from court operation and sentencing to remand and bail).59 But the issue which was highlighted most frequently as an example of the UK’s failure to deliver to its citizens the evidence that they deserve is UK-wide comparable data (see Box 2 for further detail).

Box 2: Devolution and data gaps

We heard repeatedly throughout our inquiry about challenges facing anybody hoping to use evidence to explore the different experiences of citizens across the four nations of the UK.

Health data are notoriously challenging to compare across England, Northern Ireland, Scotland and Wales. This is because each nation runs its health service differently, sets different targets, and records information in different ways. By way of example, anyone hoping to compare ambulance waiting times across the four nations would face the following issues:60

a) ambulance response categories - that is, the category (or urgency) assigned to a particular incident by the call handler - vary across the four nations. While England and Northern Ireland share a common standard, Scotland and Wales each have their own.

b) ambulance response standards - that is, the target time for responding to a call for an ambulance - also vary across each nation, even between England and Northern Ireland where the same initial categories are in place.

c) further recording differences - in addition to the above, different services will ‘start’ and ‘stop’ the clock for ambulance responses at different points.

These differences all arise from decisions devolved administrations have taken in an effort to provide the best possible care to members of the public, but they make it almost impossible to compare the performance of different ambulance services. These issues are not unique to ambulance waiting times; similar challenges face anyone hoping to compare waiting lists and emergency wait times, or indeed performance and experiences in other areas where policy responsibilities are devolved (education, for example).

We heard frequently that shifts away from surveys (which can be harmonised across different nations) and towards administrative data (by their nature drawn from the very different operational systems in place) are exacerbating comparability challenges. In her recent review, Professor Lievesley also highlighted “significant resource disparities” and differing policy demands across the statistical offices of the four nations as possible hindrances in the drive for greater comparability across the four nations of the UK.61

Professor Lievesley recommended that the UKSA lead discussions between the four nations to encourage more UK-wide data by creating common standards and improving harmonisation, and that HM Treasury ensure funding is available to support the harmonisation of key data. In its March 2024 response to her review, the Government indicated that pursuing comparable data was a priority and that it intended to publish a fuller response on its plans in this regard later in 2024.62 That fuller response is not yet available.

Possible solutions and recent developments

60. In 2019, our predecessor committee found that too little was known about the needs of users, and recommended that the UK Statistics Authority lead cross-government research to build an evidence base of how statistics are used in practice, taking into account the full breadth of stakeholders and to establish where data gaps persist. That committee also recommended that UKSA should conduct sector by sector reviews, to understand what stakeholders need or want, and to make statistics more relevant.63 This has not occurred.

61. In her recent review, Professor Denise Lievesley recommended the creation of a Statistical Assembly, to support the UKSA in identifying and prioritising demands for data:

“This Assembly should involve key organisations inside and outside Government and across the four nations, with the remit of determining the UK’s needs for statistics through a wide consultative process. This should include the private sector, government departments, academia, think tanks and media representatives. The list of organisations involved should be reviewed ahead of each Assembly to ensure it captures a broad range of user and producer views.

“The UKSA will then respond to this by producing a proposal for the statistical priorities for the next three years, thus identifying data gaps and ensuring that users can hold the statistical system to account on the delivery of the programme of work. It will also enable other producers of statistics to complement the work of the official statistical system and factor this work into annual budget allocation processes.”

62. Currently, the UK has no framework by which to identify, and then prioritise, demands for data and evidence. In the absence of such a framework, and with an ongoing need to have regard to budget constraints, Ministers and HM Treasury wield an inordinate amount of power in deciding what evidence the UK collects and communicates through the decisions they take on departmental budgets. In the context of these systemic shortcomings, significant data gaps have emerged; the UK lacks suitable evidence on the performance of its different health services, for example, and on key policy challenges like school absenteeism.

63. It is time to democratise access to data and evidence. The UK Statistics Authority should establish a framework for identifying and prioritising demands for evidence. We recommend that it use a high-level Assembly (of the kind recently recommended by Professor Denise Lievesley) to draw together information from communities across the UK about their needs for evidence and the benefits new evidence would bring, alongside research on data gaps, and public understanding. We further recommend that the UK Statistics Authority submit its findings on the nation’s demands for evidence to Parliament on a triennial basis, for scrutiny by this Committee.

64. We recommend that the OSR support this activity by preparing regular and public reports on data gaps in the UK.

65. We recommend that in its conduct of future Spending Reviews, HM Treasury uses the findings from these reports to inform the decisions it takes on the funding of activity relating to the collection, analysis and communication of public evidence.

66. We ask the Government to confirm, in its response to this report, that it supports the principle - enshrined in the Statistics and Registration Service Act 2007 - that statistics are for the public good; and that the public good includes not just assisting in the development and evaluation of policy, but also informing the public about social and economic matters.

67. It is disappointing that - despite the ever-increasing amount of data available to policy-makers - there are many areas in which it is impossible to compare the experiences of those living in each of the four nations of the UK. This is detrimental to individual citizens, who are deprived of the ability to compare public services delivered in their part of the UK with those delivered in other parts. These issues were highlighted in a recent review of the UK Statistics Authority; but three months on from the report’s publication, although the issue appears to be a priority for Government, it has not yet made its full response on the issue known.

68. We recommend that the Office for Statistics Regulation review and publish a report on the adequacy of UK-wide comparable data, by themes, before April 2025.

5 Evidence in policymaking

69. Thinking carefully about the information we need is perhaps the most significant step the UK could take to improve its public evidence base. But unless we put that evidence to good use, it will ultimately be a fruitless one.

70. As we have explored already in this report, there is a myriad of people and groups who might seek to use public evidence, and we could not hope to cover them all with the attention they deserve in this single report. Given our remit, we have chosen to explore in detail central government’s use of data and evidence. Our findings are set out below.

A new age for evidence in policymaking?

71. The Government is clear about the necessity of robust evidence to good government. It has made the case for evidence in its National Data Strategy, through the introduction of the Digital Economy Act 2017, and again in submissions to this inquiry. In establishing the analysis function (see paragraph 17), the Government has also established a mechanism by which to improve its use of evidence.

72. The overarching vision of the analysis function is “delivering better outcomes for the public by providing the best analysis to inform decision making”. Within this context, the function’s strategy defines six strategic priorities, against which it reports progress in its Annual Reports. Efforts in the early years of the function appear to have focussed on establishing common frameworks, tools, and guidance, so as to establish a single pool of analysis from which Ministers and officials can easily draw. As a result of these early efforts, there is now a single analytical career framework in place in the civil service, a common learning and development offer, and an overarching ‘Analytical Functional Standard’ which provides a common basis for decisions on assurance, risk management and capability improvement.64

73. Whether any of this has improved the quality of evidence available to policymakers, or its integration into decision-making, remains unclear. When we asked the head of the function, the National Statistician, whether the function had achieved what he hoped it might, he told us it had not yet.65 When we pushed officials for examples of where analysis had supported better decision-making in government, they struggled to provide any, beyond the use of analysis in supporting Government decisions on Covid.66 Elsewhere, external commentators expressed a range of concerns about the state of government analysis. Gemma Tetlow, Chief Economist at the Institute for Government, described Departments sharing and withholding “analysis strategically with one another in an attempt to put forward their case in opposition to another Department’s case.” We also heard from Lord Maude, architect of cross-government functions such as the analysis function, who told us of complaints from Ministers about the quality of advice they receive.67

74. We pushed repeatedly for any evaluation undertaken by the analysis function of its success in delivering better outcomes, by means of better analysis, but such evidence was in short supply. We were told that directors of analysis were asked to undertake a self-assessment against the function standard in 2023, but that detailed results could not be shared with us as those involved had been assured their responses would not be shared. The National Statistician was able to tell us only that the exercise “showed a mixed picture of strengths and weaknesses across government”.68 We were also told that, in response to a recommendation made by Lord Maude in his 2021 review of the effectiveness of cross-cutting functions, the analysis function had worked closely with the policy profession to gather information on the analytical capability of policy professionals.69 Again, no report has been made publicly available, and we were told only that the findings “led to a more robust analytical capability learning offer for all.”70 Similarly, a paper published on efficiency savings delivered by cross-Government functions in 2023 did not identify any audited efficiencies achieved by the function.71 Government officials could not point to any qualitative analysis to help us answer our two central questions: whether government uses evidence and analysis well, and whether the analysis function is improving the use of evidence in government.

Resourcing change

75. The National Statistician has previously told Parliament that he does not believe the analysis function has the resource and power it needs to drive this cultural change across government.72 In the first years following the function’s establishment, ONS diverted a very small fraction of the resources allocated to it for its core statistical functions towards a small secretariat to support the work of the analysis function. The Public Accounts Committee, in a 2022 report, recommended that ONS and HM Treasury agree on a funding model for the function which would allow it to both monitor Departments’ compliance against the analysis function Standard, and act on areas of concern identified as part of that process.73 We found no evidence in subsequent budget and business planning documentation that any such funding had been made available. For the 2024/25 financial year, ONS has introduced a ‘subscription model’ for the function, seeking contributions from Departments for the funding of the central team and related costs.74

76. The vision of the government analysis function - to deliver better outcomes for the public by providing the best analysis to inform decision-making - is a commendable one. In an age in which Ministers are required to respond to complex policy challenges - whether climate change, cost-of-living challenges, or the performance of the National Health Service - it is surely right that we equip them to do so by making available the best possible evidence.

77. It was not possible for us to form a robust conclusion about how well served our policy-makers are by evidence in 2024. Ironically, for a group of people dedicated to the cause of informed decision-making, analysts appear to have done little by way of evaluating the function’s success in delivering its vision. What we did hear anecdotally, however, suggests that there might be significant room for improvement.

78. We also identified a mismatch between the ambitious vision of the analysis function, and the very limited funding made available to deliver that vision.

79. We recommend that Government reaffirm its commitment to the analysis function, and that HM Treasury review options for its future funding. If Government truly wishes to improve its use of analysis and deliver better outcomes for the public, it clearly needs to fund that change.

80. In parallel, the National Statistician should review the analysis function’s scope and standard, with a view to defining an achievable set of next-steps, and clear plans for honest evaluations of the function’s success. This review and subsequent evaluations should be made publicly available, so that Parliament is in future better equipped to scrutinise both the Government’s use of evidence and the progress of the analysis function.

Time for transparency

81. In addition to the general sense that analysis is not yet being carried out effectively across government, many identified the lack of transparency around government analysis as a point of concern.75

82. Beyond the realm of official statistics, there is no requirement that government analysis should be made equally available to all.76 Certainly, there are good reasons why it might be prudent not to insist every piece of government analysis be published:

a) statisticians and analysts can play an important role in helping users of evidence sift good evidence from bad. Public understanding is not likely to be enhanced by dropping countless pieces of analysis into the public domain, but instead by thinking carefully about the most valuable sources and presenting those with suitable commentary and caveats; and

b) there will always be a need to protect the space for civil servants to speak truth to power, particularly in the early stages of the policy-making process. Requiring that every single piece of analysis be placed in the public domain may deter Ministers and senior officials from requesting analysis.

83. Nonetheless, our witnesses were clear that, where numbers are being used in support of major government policies, underpinning analysis should be made available as a matter of course. Mr Shah suggested in his oral evidence to this inquiry that “[w]hen Government talk about policy, they should also be publishing their chain of reasoning and evidence underneath that. Sometimes we focus just on the data piece, but actually looking at more evidence transparency across policymaking is a real opportunity.”77 Factcheckers at Full Fact agreed: “Without the full information, fact checkers, journalists, Parliamentarians and the public are kept in the dark and are unable to scrutinise or ask important questions.”78 Lord Maude recently recommended to Government that “When a policy decision is made and announced, the evidence and data that underpin the decision should be published”.79 Former Permanent Secretary and head of the Civil Service policy profession, Jonathan Slater, has also advocated putting more policy advice in the public domain, calling for a new set of rules “which make clear that departmental select committees absolutely do have the right to quiz civil servants on the analysis, evidence, and risk assessments they do.”80

84. We have been made aware of a wide range of instances in which the Government made major policy announcements either without evidence, or with reference to numbers that could not be publicly verified. For example:

a) in one of our earlier inquiries on government estates,81 we reported that despite being described by the Cabinet Office as a “flagship programme”, Places for Growth did not have a published launch document setting out its objectives and underlying research. The Cabinet Office went on to make various high-profile statements about its successes, without publishing underlying research to support its claims. This mattered, because when we looked into the claims, we found they did not tell the full story.

b) Full Fact highlighted a number of instances in which the Home Office had chosen to make claims based on unpublished operational data, rather than on publicly available statistics. Whether comments relating to the nationality of those arriving on small boats, the age of asylum seekers, or the costs of housing asylum seekers, the common thread was Ministers making claims which were unsupported by (published) official data, and which were therefore impossible for parties outside of government to scrutinise.82

85. It is not clear why the Government should choose to communicate in this manner. In some cases, we suspect, robust supporting evidence is available, has been used sensibly in the policy-making process, and might support the Government in building support for its policies. For example, Mr Humpherson described an instance in which delays in publishing an evaluation of the Troubled Families programme fuelled concerns that a bad-news story was being hidden. Releasing a subsequent evaluation in line with proper processes helped to allay some of these concerns, and build confidence in the Government’s ability to deliver honest assessments of its performance - good and bad - to citizens.83

86. In early 2022, the Office for Statistics Regulation issued new ‘Intelligent Transparency’ guidance, to encourage an “open, clear and accessible approach to the release and use of data, statistics and wider analysis.”84 The guidance requires that:

a) data used by government in the public domain should be made available to all in an accessible and timely way;

b) sources for figures should be cited, with an appropriate explanation of context, including strengths and limitations, communicated clearly alongside figures; and

c) decisions about the publication of statistics and data, such as content and timing, should be independent of political influences and policy processes.

87. This Intelligent Transparency guidance has driven the publication of several datasets which would otherwise remain hidden from members of the public, and has been welcomed by many organisations that rely on good data.85 Mr Humpherson is clear, however, that the OSR cannot hope to open up access to data alone. He suggested to us that although senior officials often sign up to this guidance, its principles are not yet embedded in day-to-day practices within departments:

“We then find cases where, in a specific moment, a particular Department concludes that it is not in its communications interests or it forgets to make the underlying data available. I would really want to see those commitments that we hear from senior officials being made much more publicly and much more embedded into their practices and processes.”86

88. Too frequently, Government communications exhibit a disregard for evidence. This is helpful neither to the Government, in building trust in our democracy and support for policies of the day, nor to citizens who rightly expect to be able to scrutinise the work of Ministers and officials.

89. Since its launch in 2022, the Office for Statistics Regulation’s Intelligent Transparency guidance has helped to unlock important evidence for Parliament, business, researchers and citizens, but there remains more to do.

90. We commend the OSR for its work on Intelligent Transparency and recommend that it publish an annual report card on departments’ compliance with its guidance, so that Parliament and external bodies might support it in holding departments to account, and making the case for well-informed policy. Recognising that this important work expands the remit of the OSR beyond official statistics, and into the larger world of government analysis, we also suggest that at the next Spending Review, it works with HM Treasury to agree a sustainable funding model for this work, given the vital role it plays.

91. We recommend that all government communications professionals are trained on the OSR’s Intelligent Transparency guidance, and that the Government Functional Standard for Communication be updated to make it clear that officials are expected to comply with that guidance.

92. We concur with Lord Maude’s recent recommendation that, when a policy decision is announced, the Government should publish the evidence and data underpinning that decision.

93. We recommend that, at a minimum, governments in future routinely publish the evidence and data underpinning their major policy announcements. Making this happen will not be a straightforward task, and we suggest that in the first instance leaders of the analysis and communications functions develop options to deliver this ambition, for the consideration of Ministers.

6 Privacy and ethics in an age of data

94. So far this report has considered how the UK might best take advantage of the opportunities afforded to it by emerging data sources, to build the evidence base its citizens deserve. But it would be disingenuous to suggest that these opportunities do not come with risks. In this final section of our report, we explore how public bodies protect privacy, and how they can ensure - as they work to enrich our public evidence base - that they do not race ahead of public appetite.

Evidence and the current data protection framework

95. In the early chapters of this report, we explored the transformational potential of new data sources to the UK’s public evidence base. Just as the traditional survey has often collected very personal information - on religion, and income for example - so too do these new sources, often on a much greater scale.

96. In the UK, those involved in the production of official statistics, government analysis and research are required to comply with various pieces of legislation, aimed at protecting the privacy of data subjects. All data processors in the UK are required to comply with both the Data Protection Act 2018 (DPA), and the UK General Data Protection Regulation (UK GDPR).87 In the context of statistics and research specifically, two other pieces of legislation are relevant: first, the Statistics and Registration Service Act 2007, which bars the ONS from disclosing personal information except to approved researchers, or as required by law; and second, the Digital Economy Act 2017, which sets specific conditions for when and how data may be shared for statistical and research purposes.88

97. The Information Commissioner’s Office (ICO) is responsible for ensuring any organisation using personal information does so in line with the UK GDPR and DPA 2018, and also has a consultative role around the statistics and research provisions of the Digital Economy Act 2017.89 In his oral evidence to us, the Information Commissioner suggested that those organisations across the UK who collected data solely for statistical purposes had generally established a high level of compliance with data protection, perhaps because “they understand better than many others that their legitimacy, their licence to operate, depends on the maintenance of a high level of trust, regardless of whether they are compliant with a legal standard.”90

98. Those responsible for collecting, analysing, and communicating evidence in the UK public sector are bound by a strict data protection framework, enforced by the Information Commissioner’s Office (ICO). We were pleased to learn that - on the whole - producers of public evidence comply with this framework, and work hard to protect the privacy of the public.

99. The ICO’s evidence to this inquiry has been an invaluable resource to us and as officials continue to explore the potential of new data sources in the production of evidence, its continued monitoring of analysts’ compliance with the current data protection framework will be crucial.

100. One area in which the Information Commission suggested there may be room for improvement is around transparency. Under the principles of the UK’s data protection framework, all data processors are expected to be “clear, open and honest with people from the start about how you will use their personal data”.91 While submissions to this inquiry generally welcomed the openness with which producers of official statistics and research communicate about how personal data is used, we heard concerns from some that government is much less transparent about how personal data is being used for the purposes of analysis.92 MedConfidential wrote: “neither ONS, nor anyone else, publish a list of all internal government analysts using data … Those are the projects that are of the highest concern, and they’re shrouded in secrecy.”93

101. Although statisticians and researchers publish a wealth of information on which data sources they hold, and how they are used, very little information is made available about how personal data are being used for the purposes of government analysis.

102. We recommend that the analysis function explore options for improving transparency around the use of personal data in official analyses, and that this work be made publicly available.

New questions of data ethics

103. Protecting the privacy of citizens as we build a strong public evidence base is crucial. But alone it is not enough. As our data landscape changes rapidly, analysts must engage on a deeper level with those on whose data they rely, about what it means to use data ethically.

104. In recent years, organisations in the public sector and beyond have begun to explore the ‘ethics’ of their data use. There is no agreed definition of data ethics, and submissions to this inquiry defined the concept in different ways:

a) “a branch of ethics that evaluates data practices with the potential to adversely impact on people and society - in data collection, sharing and use” (Open Data Institute);94

b) “to use data ethically means recognising the power inherent in collecting data and using evidence in decision-making, and orienting every stage toward socially-just outcomes” (Connected by Data);95

c) “Data ethics is an emerging branch of applied ethics that studies and evaluates moral problems and describes the value judgements relating to data … algorithms … and corresponding practices … , in order to formulate and support morally good solutions” (the Government’s Data Ethics Framework).96

105. Why is it that organisations are now beginning to explore the ethics of their data use in a way they might not have done, even a decade ago? When we put this question to our witnesses, they highlighted two issues. The first is simply the enormous change we have witnessed in our data landscape; as data expert Gavin Freeguard described to us: “the volume, the speed, and the quality of data that we now have available means that all questions about data are much more prominent in political and public discussion.”97 The second is that the ‘consent model’ upon which official evidence has for many decades relied is changing. When evidence was primarily drawn from surveys, there was a natural opportunity at which to engage members of the public in discussions about how data are used, and to seek consent for the use of their data. By choosing to respond to voluntary surveys, members of the public were implicitly signalling they were happy for their data to be used in the production of evidence. Conversely, when officials draw from administrative data and emerging sources in the conduct of analyses, they can do so without any dialogue with members of the public about the use of these data. Data ethics has therefore emerged as a solution to a potential disconnect between producers of evidence, and those on whose data they rely.

Public attitudes to data

106. Since 2021, the Centre for Data Ethics and Innovation (CDEI) has undertaken regular monitoring of public attitudes to data-driven technology and artificial intelligence. The latest results, issued earlier this year, make for interesting reading. According to the report:98

“The value the public places on data collection and analysis has increased in the last year; with 57% of the public recognising the personal benefits of data and 44% acknowledging its societal value. There is alignment between the public’s perception of the key areas where data can contribute to societal good - health, cost of living, and the economy - and public perceptions of the greatest issues facing the UK, demonstrating the public’s belief in the power of data. However, there is still scepticism around the equal distribution of these benefits across society, with only 33% in agreement that all groups reap the benefits equally.

“The greatest public concern about data use remains its security, with a particular focus on the potential for insecure storage to lead to hacking or theft, and apprehensions about data being sold for profit … Despite these concerns and a prevailing sentiment of limited control over personal data, the public increasingly believe that organisations are held to account when they misuse data, indicating a growing confidence in the accountability mechanisms in place.”

Data ethics in practice

107. In the UK, a great number of organisations have been exploring what it means to use data ethically, and how ethical considerations might be applied practically in the conduct of government work:

a) In 2018, the Cabinet Office published a data ethics framework for use across government. The framework, updated in 2020 and now the responsibility of the Cabinet Office’s Central Digital and Data Office (CDDO), identifies three key principles: transparency, accountability, and fairness.99

b) The UKSA established the National Statistician’s Data Ethics Advisory Committee (NSDEC) in 2014. The Committee considers project and policy proposals which make use of innovative and novel data, and advises the National Statistician on the ethical appropriateness of those proposals.100

c) The Geospatial Commission has undertaken work specifically on the ethics of location data.101

d) In the conduct of much of this work, public bodies have drawn heavily from work undertaken by organisations such as the Ada Lovelace Institute, Alan Turing Institute, and the Open Data Institute, who made early and important contributions to the field of data ethics.

108. We asked those we spoke to whether the abundance of data ethics frameworks in use across the public sector helped or hindered analysts in their efforts to use data in an ethical way. Views were mixed. On the one hand, witnesses noted that organisations often use data for different purposes, and in this context it may make sense to consider different sets of ethical questions.102 However, we also heard some suggestion that different frameworks offer organisations the opportunity to pick and choose. Data and ethics expert Reema Patel explained:

“The problem around these different definitions … is that there is a bit of a risk of an instrumentalisation of ethics, of organisations saying, ‘We have created our own ethics framework, so we are doing the ethical thing’ rather than demonstrating good practice and standards and processes that have been collectively agreed by society.”103

109. Witnesses to the inquiry also noted that, while there are various frameworks in place which can support officials in using data in an ethical way, there is limited accountability in the application of these frameworks. They suggested that there may be options in future to embed ethical considerations more formally in the collection, analysis and communication of evidence, and improve the accountability of those involved.104

110. It is crucial that members of the public are involved in making decisions about how the UK chooses to use personal data in the development of its public evidence base. Traditionally, statisticians have engaged with members of the public about the use of their data in the conduct of surveys, but as officials embrace new sources of data there is a risk that this important dialogue falls away.

111. The information that we do have about public attitudes suggests an openness to the use of data, where there are clear benefits to be gained, and where those benefits are seen to be shared equitably.

112. We have been impressed by the great range of early work conducted by organisations in and outside the public sector, to help policymakers understand what it means to use data ethically, and to establish mechanisms for applying those considerations to the everyday work of government.

113. It is now time to consolidate the excellent exploratory work that has been done on data ethics, and to embed it more formally into the collection, analysis, and communication of evidence in the UK. We recommend that the Cabinet Office’s Central Digital and Data Office and the Office for National Statistics jointly review the varying data ethics frameworks available to analysts across the UK, considering opportunities for greater consistency, and possible accountability mechanisms, to encourage a wider adoption of data ethics across government.

114. In parallel, the Centre for Data Ethics and Innovation should continue its excellent work in monitoring public attitudes on the Government’s use of data.

Conclusions and recommendations

How does government collect, use and communicate evidence? Understanding a complex and fragmented landscape

1. There is much to be proud of across the UK’s public data-landscape. The independence of its statisticians from the government of the day, the innovative work being undertaken by a skilled researcher community, and our unique regulatory framework all received praise from our witnesses. (Paragraph 23)

2. As it stands, the UK’s public data landscape is highly fragmented. This need not be a problem (indeed the model offers several conceivable advantages) if the many actors involved in generating, analysing and communicating evidence work together effectively. Our inquiry suggests that in several areas, however, this is not yet the case. (Paragraph 24)

An explosion of data: Navigating new data sources and technologies

3. It is clear that the volume and variety of data generated within the UK has exploded in scale over recent years. While new sources of data have much to offer - not least in improving the timeliness of evidence, and allowing us to delve into issues in greater detail - there are things they simply cannot do. They cannot tell us about citizens’ intentions, they cannot be easily tweaked to capture the information an analyst is seeking, and they are prone to bias. The UK’s public evidence base will be best served, therefore, by bringing together old sources and new. (Paragraph 43)

4. Despite the passage of relevant legislation in 2017, the UK has failed to bring its disparate datasets together to enrich its public evidence base. Instead, data wither in silos across countless government bodies. Witnesses to our inquiry were clear: the problems here are not legislative, and they do not result from challenges around data protection. The issue is much simpler: departments and public bodies choose not to share data because they do not share an incentive to do so. And no central, coordinating leader has yet made a sufficiently strong case for the benefits of sharing data for statistics and research across government. (Paragraph 44)

5. It is time for Government to do what it promised to do seven years ago, and to join up the UK’s evidence base. Given that the Cabinet Office’s existing initiatives for improving data sharing are self-evidently insufficient, it should, in partnership with the Office for National Statistics, develop a comprehensive new programme aimed at improving data-sharing for statistical and research purposes. The programme must clearly define deliverables and timelines, and must be owned by a senior responsible officer at an appropriately high level. In line with the recommendations of the Lievesley report, we also recommend that HM Treasury establish mechanisms so that the costs are not borne by individual Departments, but rather centrally. The Cabinet Office should prepare and publish an annual progress report on delivery against the programme. (Paragraph 45)

6. Separately, the Office for National Statistics should publish information on the datasets it is seeking on an annual basis, setting out its rationale for seeking those data, and details on the status of the request - all of which should be made available on the ONS website. (Paragraph 46)

7. While time in this Parliament now runs short, we recommend that the next Government review the exclusion of health and social care data from the Digital Economy Act 2017. There are understandable sensitivities around the sharing of health data, but it may be that the implementation of research provisions over the last seven years, and the work that has taken place in the context of the Caldicott review, offer a helpful new basis for discussion concerning their possible inclusion. (Paragraph 47)

8. We support the Office for National Statistics in its ambition to deliver high-quality and timely population statistics. It is right to be considering whether new data sources might offer opportunities to improve the UK’s evidence base, and it is also right to be engaging closely with users of that evidence base, as decisions are taken on the future of the decennial census. (Paragraph 48)

9. This Committee’s view - particularly in light of challenges around data-access - is that officials have not yet demonstrated that they can deliver the evidence users need, without a decennial census. We therefore recommend that the Office for National Statistics undertake further work on proposals for the future of migration and population statistics. (Paragraph 49)

Who should evidence serve? Delivering evidence for the public good

10. Currently, the UK has no framework by which to identify, and then prioritise, demands for data and evidence. In the absence of such a framework, and with an ongoing need to have regard to budget constraints, Ministers and HM Treasury wield an inordinate amount of power in deciding what evidence the UK collects and communicates through the decisions they take on departmental budgets. In the context of these systemic shortcomings, significant data gaps have emerged; the UK lacks suitable evidence on the performance of its different health services, for example, and on key policy challenges like school absenteeism. (Paragraph 62)

11. It is time to democratise access to data and evidence. The UK Statistics Authority should establish a framework for identifying and prioritising demands for evidence. We recommend that it use a high-level Assembly (of the kind recently recommended by Professor Denise Lievesley) to draw together information from communities across the UK about their needs for evidence and the benefits new evidence would bring, alongside research on data gaps, and public understanding. We further recommend that the UK Statistics Authority submit its findings on the nation’s demands for evidence to Parliament on a triennial basis, for scrutiny by this Committee. (Paragraph 63)

12. We recommend that the OSR support this activity by preparing regular and public reports on data gaps in the UK. (Paragraph 64)

13. We recommend that in its conduct of future Spending Reviews, HM Treasury uses the findings from these reports to inform the decisions it takes on the funding of activity relating to the collection, analysis and communication of public evidence. (Paragraph 65)

14. We ask the Government to confirm, in its response to this report, that it supports the principle - enshrined in the Statistics and Registration Service Act 2007 - that statistics are for the public good; and that the public good includes not just assisting in the development and evaluation of policy, but also informing the public about social and economic matters. (Paragraph 66)

15. It is disappointing that - despite the ever-increasing amount of data available to policy-makers - there are many areas in which it is impossible to compare the experiences of those living in each of the four nations of the UK. This is detrimental to individual citizens, who are deprived of the ability to compare public services delivered in their part of the UK with those delivered in other parts. These issues were highlighted in a recent review of the UK Statistics Authority; but three months on from the report’s publication, although the issue appears to be a priority for Government, it has not yet made its full response on the issue known. (Paragraph 67)

16. We recommend that the Office for Statistics Regulation review and publish a report on the adequacy of UK-wide comparable data, by themes, before April 2025. (Paragraph 68)

Evidence in policymaking

17. The vision of the government analysis function - to deliver better outcomes for the public by providing the best analysis to inform decision-making - is a commendable one. In an age in which Ministers are required to respond to complex policy challenges - whether climate change, cost-of-living challenges, or the performance of the National Health Service - it is surely right that we equip them to do so by making available the best possible evidence. (Paragraph 76)

18. It was not possible for us to form a robust conclusion about how well served our policy-makers are by evidence in 2024. Ironically, for a group of people dedicated to the cause of informed decision-making, analysts appear to have done little by way of evaluating the function’s success in delivering its vision. What we did hear anecdotally, however, suggests that there might be significant room for improvement. (Paragraph 77)

19. We also identified a mismatch between the ambitious vision of the analysis function, and the very limited funding made available to deliver that vision. (Paragraph 78)

20. We recommend that Government reaffirm its commitment to the analysis function, and that HM Treasury review options for its future funding. If Government truly wishes to improve its use of analysis and deliver better outcomes for the public, it clearly needs to fund that change. (Paragraph 79)

21. In parallel, the National Statistician should review the analysis function’s scope and standard, with a view to defining an achievable set of next-steps, and clear plans for honest evaluations of the function’s success. This review and subsequent evaluations should be made publicly available, so that Parliament is in future better equipped to scrutinise both the Government’s use of evidence and the progress of the analysis function. (Paragraph 80)

22. Too frequently, Government communications exhibit a disregard for evidence. This is helpful neither to the Government, in building trust in our democracy and support for policies of the day, nor to citizens who rightly expect to be able to scrutinise the work of Ministers and officials. (Paragraph 88)

23. Since its launch in 2022, the Office for Statistics Regulation’s Intelligent Transparency guidance has helped to unlock important evidence for Parliament, business, researchers and citizens, but there remains more to do. (Paragraph 89)

24. We commend the OSR for its work on Intelligent Transparency and recommend that it publish an annual report card on departments’ compliance with its guidance, so that Parliament and external bodies might support it in holding departments to account, and making the case for well-informed policy. Recognising that this important work expands the remit of the OSR beyond official statistics, and into the larger world of government analysis, we also suggest that at the next Spending Review, it works with HM Treasury to agree a sustainable funding model for this work, given the vital role it plays. (Paragraph 90)

25. We recommend that all government communications professionals are trained on the OSR’s Intelligent Transparency guidance, and that the Government Functional Standard for Communication be updated to make it clear that officials are expected to comply with that guidance. (Paragraph 91)

26. We concur with Lord Maude’s recent recommendation that, when a policy decision is announced, the Government should publish the evidence and data underpinning that decision. (Paragraph 92)

27. We recommend that, at a minimum, governments in future routinely publish the evidence and data underpinning their major policy announcements. Making this happen will not be a straightforward task, and we suggest that in the first instance leaders of the analysis and communications functions develop options to deliver this ambition, for the consideration of Ministers. (Paragraph 93)

Privacy and ethics in an age of data

28. Those responsible for collecting, analysing, and communicating evidence in the UK public sector are bound by a strict data protection framework, enforced by the Information Commissioner’s Office (ICO). We were pleased to learn that - on the whole - producers of public evidence comply with this framework, and work hard to protect the privacy of the public. (Paragraph 98)

29. The ICO’s evidence to this inquiry has been an invaluable resource to us and as officials continue to explore the potential of new data sources in the production of evidence, its continued monitoring of analysts’ compliance with the current data protection framework will be crucial. (Paragraph 99)

30. Although statisticians and researchers publish a wealth of information on which data sources they hold, and how they are used, very little information is made available about how personal data are being used for the purposes of government analysis. (Paragraph 101)

31. We recommend that the analysis function explore options for improving transparency around the use of personal data in official analyses, and that this work be made publicly available. (Paragraph 102)

32. It is crucial that members of the public are involved in making decisions about how the UK chooses to use personal data in the development of its public evidence base. Traditionally, statisticians have engaged with members of the public about the use of their data in the conduct of surveys, but as officials embrace new sources of data there is a risk that this important dialogue falls away. (Paragraph 110)

33. The information that we do have about public attitudes suggests an openness to the use of data, where there are clear benefits to be gained, and where those benefits are seen to be shared equitably. (Paragraph 111)

34. We have been impressed by the great range of early work conducted by organisations in and outside the public sector, to help policymakers understand what it means to use data ethically, and to establish mechanisms for applying those considerations to the everyday work of government. (Paragraph 112)

35. It is now time to consolidate the excellent exploratory work that has been done on data ethics, and to embed it more formally into the collection, analysis, and communication of evidence in the UK. We recommend that the Cabinet Office’s Central Digital and Data Office and the Office for National Statistics jointly review the varying data ethics frameworks available to analysts across the UK, considering opportunities for greater consistency, and possible accountability mechanisms, to encourage a wider adoption of data ethics across government. (Paragraph 113)

36. In parallel, the Centre for Data Ethics and Innovation should continue its excellent work in monitoring public attitudes on the Government’s use of data. (Paragraph 114)

Formal minutes

Tuesday 21 April

Members present:

Dame Jackie Doyle-Price, in the Chair

Ronnie Cowan

Jo Gideon

Mr David Jones

John McDonnell

Lloyd Russell-Moyle

John Stevenson

Transforming the UK’s evidence base

Draft Report (Transforming the UK’s evidence base), proposed by the Chair, brought up and read.

Ordered, That the draft Report be read a second time, paragraph by paragraph.

Paragraphs 1 to 114 read and agreed to.

Summary agreed to.

Resolved, That the Report be the Fifth Report of the Committee to the House.

Ordered, That the Chair make the Report to the House.

Ordered, That embargoed copies of the Report be made available, in accordance with the provisions of Standing Order 134.

Adjournment

[Adjourned till Tuesday 4 June 2024 at 9.30 am]


Witnesses

The following witnesses gave evidence. Transcripts can be viewed on the inquiry publications page of the Committee’s website.

Tuesday 5 September 2023

Professor Sir Ian Diamond, The National Statistician, Office for National Statistics Q1–47

Thursday 9 November 2023

Hetan Shah, Chief Executive, British Academy; Dr Gemma Tetlow, Chief Economist, Institute for Government; Chris Morris, Chief Executive, Full Fact Q48–100

Tuesday 5 December 2023

John Edwards, Information Commissioner, Information Commissioner’s Office (ICO) Q101–130

Reema Patel, Head of Deliberative Engagement, Ipsos UK; Gavin Freeguard, Policy Associate, Connected by Data Q131–150

Tuesday 6 February 2024

Ed Humpherson, Head, Office for Statistics Regulation Q151–180

Tuesday 12 March 2024

Professor Denise Lievesley CBE, Lead on the Review of the UK Statistics Authority and Honorary Fellow and former Principal of Green Templeton College, Oxford University Q181–201

Baroness Neville-Rolfe DBE CMG, Minister of State, Cabinet Office; Steffan Jones, Director of Joint Data and Analysis Centre, Cabinet Office Q202–229


Published written evidence

The following written evidence was received and can be viewed on the inquiry publications page of the Committee’s website.

TEB numbers are generated by the evidence processing system and so may not be complete.

1 Abubakar, Dr Aisha Mohammed (Research Fellow in Cognitive Impairment and Exploitation, University of Nottingham); Seymour, Dr Rowland (Assistant Professor in Mathematics, University of Birmingham); and Gardner, Dr Alison (Assistant Professor in Public Administration, University of Nottingham) (TEB0005)

2 Basu, Dr Subhajit (Associate Professor in Information and Technology Law, School of Law, University of Leeds) (TEB0002)

3 Botta, Dr Federico (Senior Lecturer in Data Science, University of Exeter) (TEB0009)

4 Cabinet Office (TEB0010)

5 Connected by Data (TEB0015)

6 Defend Digital Me (TEB0028)

7 Department for Science, Innovation and Technology (TEB0023)

8 Full Fact (TEB0018)

9 GeoPlace LLP (TEB0021)

10 House of Commons Library (TEB0014)

11 Information Commissioner’s Office (TEB0026)

12 King, Thomas (TEB0004)

13 Lam, Mr Si Chun (Head of Research, Intelligence, and Inclusive Growth, West Midlands Combined Authority); Heath, Professor Anthony (Emeritus Professor of Sociology, Nuffield College, University of Oxford); Zimeta, Dr Mahlet; and Collins, Dr Evelyn (TEB0025)

14 Lynn, Professor Peter (Professor of Survey Methodology, Director of the Institute for Social and Economic Research, and Director of Survey Futures, University of Essex) (TEB0001)

15 medConfidential (TEB0027)

16 Office for Statistics Regulation, UK Statistics Authority (TEB0031)

17 Open Data Institute (TEB0024)

18 Ritchie, Professor Felix (Professor of Applied Economics; Director, Data Research Access and Governance Network, University of the West of England, Bristol) (TEB0022)

19 Royal Statistical Society (TEB0008)

20 Scottish Government (TEB0019)

21 Sense about Science (TEB0013)

22 UK Research and Innovation (TEB0011)

23 UK Statistics Authority (TEB0029)

24 UK Statistics Authority (TEB0030)

25 UK Statistics Authority; Office for National Statistics; and Office for Statistics Regulation (TEB0017)

26 Understanding Society, Institute for Social and Economic Research, University of Essex (TEB0003)

27 Welsh Government (TEB0012)

28 Wheatcroft, Mr Martin (TEB0016)

29 Whittard, Damian (Associate Professor, University of the West of England); Ritchie, Professor Felix (Professor, University of the West of England); Phan, Dr Van (Research Fellow, University of the West of England); Bryson, Professor Alex (Professor, University College London); Stokes, Lucy (Principal Economist, National Institute of Economic and Social Research); Forth, Dr John (Reader, Bayes Business School - City, University of London); and Singleton, Dr Carl (Associate Professor, University of Reading) (TEB0006)

30 Wildman Inc Ltd (TEB0020)

31 Women in Mental Health Special Interest Group - Royal College of Psychiatrists (TEB0007)


List of Reports from the Committee during the current Parliament

All publications from the Committee are available on the publications page of the Committee’s website.

Session 2023–24

Number – Title – Reference

1st – The Appointment of Douglas Chalmers CB DSO OBE as Chair of the Committee on Standards in Public Life (HC 243)
2nd – Parliamentary Scrutiny of International Agreements in the 21st century (HC 204)
3rd – Parliamentary and Health Service Ombudsman Scrutiny 2022–23 (HC 198)
4th – Lobbying and Influence: post-legislative scrutiny of the Lobbying Act 2014 and related matters (HC 203)
1st Special – Civil Service People Survey: Government response to the Committee’s Ninth Report of Session 2022–23 (HC 462)
2nd Special – Parliamentary Scrutiny of International Agreements in the 21st century: Government Response to the Committee’s Second Report of Session 2023–24 (HC 685)

Session 2022–23

Number – Title – Reference

1st – Parliamentary and Health Service Ombudsman Scrutiny 2020–21 (HC 213)
2nd – The Work of the Electoral Commission (HC 462)
3rd – Governing England (HC 463)
4th – Propriety of Governance in Light of Greensill (HC 888)
5th – Governing England: Follow up to the Government’s response to the Committee’s Third Report of Session 2022–23 (HC 1139)
6th – Parliamentary and Health Service Ombudsman Scrutiny 2021–22 (HC 745)
7th – The Role of Non-Executive Directors in Government (HC 318)
8th – Where Civil Servants Work: Planning for the future of the Government’s estates (HC 793)
9th – Civil Service People Survey (HC 575)
10th – Appointment of Baroness Deech as Chair of the House of Lords Appointments Commission (HC 1906)
1st Special – Coronavirus Act 2020 Two Years On: Government response to the Committee’s Seventh Report of Session 2021–22 (HC 211)
2nd Special – The Cabinet Office Freedom of Information Clearing House: Government Response to the Committee’s Ninth Report of Session 2021–22 (HC 576)
3rd Special – Parliamentary and Health Service Ombudsman Scrutiny 2020–21: PHSO and Government responses to the Committee’s First Report (HC 616)
4th Special – The Work of the Electoral Commission: Government Response to the Committee’s Second Report (HC 1065)
5th Special – The Work of the Electoral Commission: Electoral Commission response to the Committee’s Second Report of Session 2022–23 (HC 1124)
6th Special – Parliamentary and Health Service Ombudsman Scrutiny 2021–22: Government and PHSO response (HC 1437)
7th Special – The Role of Non-Executive Directors in Government: Government response to the Committee’s Seventh Report of Session 2022–23 (HC 1805)
8th Special – Where Civil Servants Work: Planning for the Future of the Government’s Estates: Government and Office for National Statistics response to the Committee’s Eighth Report (HC 1868)

Session 2021–22

Number – Title – Reference

1st – The role and status of the Prime Minister’s Office (HC 67)
2nd – Covid-Status Certification (HC 42)
3rd – Propriety of Governance in Light of Greensill: An Interim Report (HC 59)
4th – Appointment of William Shawcross as Commissioner for Public Appointments (HC 662)
5th – The Elections Bill (HC 597)
6th – The appointment of Rt Hon the Baroness Stuart of Edgbaston as First Civil Service Commissioner (HC 984)
7th – Coronavirus Act 2020 Two Years On (HC 978)
8th – The appointment of Sir Robert Chote as Chair of the UK Statistics Authority (HC 1162)
9th – The Cabinet Office Freedom of Information Clearing House (HC 505)
1st Special – Government transparency and accountability during Covid 19: The data underpinning decisions: Government’s response to the Committee’s Eighth Report of Session 2019–21 (HC 234)
2nd Special – Covid-Status Certification: Government Response to the Committee’s Second Report (HC 670)
3rd Special – The role and status of the Prime Minister’s Office: Government Response to the Committee’s First Report (HC 710)
4th Special – The Elections Bill: Government Response to the Committee’s Fifth Report (HC 1133)

Session 2019–21

Number – Title – Reference

1st – Appointment of Rt Hon Lord Pickles as Chair of the Advisory Committee on Business Appointments (HC 168)
2nd – Parliamentary and Health Service Ombudsman Scrutiny 2018–19 (HC 117)
3rd – Delivering the Government’s infrastructure commitments through major projects (HC 125)
4th – Parliamentary Scrutiny of the Government’s handling of Covid-19 (HC 377)
5th – A Public Inquiry into the Government’s response to the Covid-19 pandemic (HC 541)
6th – The Fixed-term Parliaments Act 2011 (HC 167)
7th – Parliamentary and Health Service Ombudsman Scrutiny 2019–20 (HC 843)
8th – Government transparency and accountability during Covid 19: The data underpinning decisions (HC 803)
1st Special – Electoral law: The Urgent Need for Review: Government Response to the Committee’s First Report of Session 2019 (HC 327)
2nd Special – Parliamentary and Health Service Ombudsman Scrutiny 2018–19: Parliamentary and Health Service Ombudsman’s response to the Committee’s Second Report (HC 822)
3rd Special – Delivering the Government’s infrastructure commitments through major projects: Government Response to the Committee’s Third Report (HC 853)
4th Special – A Public Inquiry into the Government’s response to the Covid-19 pandemic: Government’s response to the Committee’s Fifth Report (HC 995)
5th Special – Parliamentary Scrutiny of the Government’s handling of Covid-19: Government Response to the Committee’s Fourth Report of Session 2019–21 (HC 1078)
6th Special – The Fixed-term Parliaments Act 2011: Government’s response to the Committee’s Sixth Report of Session 2019–21 (HC 1082)
7th Special – Parliamentary and Health Service Ombudsman Scrutiny 2019–20: Government’s and PHSO response to the Committee’s Seventh Report of Session 2019–21 (HC 1348)


Footnotes

1 How can Government harness new data to improve policies?, July 2023.

2 How can Government harness new data to improve policies?, July 2023.

3 Reg Ward and Ted Doggett, Keeping Score: The First Fifty Years of the Central Statistical Office, Central Statistical Office, 1991.

4 Statistics and Registration Service Act 2007

5 See: Statistics and Registration Service Act 2007; Official Statistics Order (2023); and Official Statistics Order (Northern Ireland) (2012 and ).

6 About the GSS, GSG, government professions and Analysis Function, accessed April 2024.

7 Q67

8 Q183

9 Government Functional Standard: GovS010: Analysis, July 2023.

10 Although the National Statistician leads the analysis function, it should be noted that his role is limited. Under the function’s standard, accountability for analysis ultimately rests with the respective Accounting Officers of each government organisation, rather than with the National Statistician. There is no enforcement mechanism for the function’s standards, and no funding is made available to the National Statistician to support the running of the function.

11 Q49

12 See for example: International Journal of Epidemiology, Gestational age at birth, chronic conditions and school outcomes: a population-based data linkage study of children born in England, February 2023; ADRUK, Ethnic Inequalities in Sentencing in the Crown Court - Evidence from the MoJ Data First Criminal Justice datasets, September 2022; and, ADRUK, How common is low pay in Britain and is it declining?, April 2023.

13 According to the Digital Economy Act Research Code of Practice and Accreditation Criteria, data can be considered ‘de-identified’ where they do not directly identify individuals and are not reasonably likely to lead to an individual’s identity being ascertained (whether on its own or taken together with other information).

14 Digital Economy Act 2017.

15 UKSA, TEB30.

16 House of Commons Library, Briefing Paper 8531: Preparing for the 2021 Census (England and Wales), March 2021.

17 See: ONS, A guide to Labour Market Statistics, June 2020; ONS, Crime in England and Wales Quality and Methodology Information, March 2024; and ONS, Wealth and Assets Survey Quality and Methodology Information, February 2022.

18 International Nonresponse Trends across Countries and Years: An analysis of 36 years of Labour Force Survey data, Survey Methods: Insights from the Field, December 2018.

19 See: ONS, Operational Note – Labour Market Statistics 24 October 2023; and, ONS, Labour market overview, UK: October 2023, 24 October 2023.

20 Professor Peter Lynn, TEB01.

21 ONS, Births in England and Wales: 2022 (refreshed populations), February 2024.

22 UKSA, TEB17.

23 Q27

24 See for example: UKRI, TEB11; Dr Federico Botta, TEB09; UKSA, TEB17.

25 ONS Data Science Campus, Case study: responding to the coronavirus pandemic using aggregated BT mobility data, May 2022.

26 Bannister, A., & Botta, F. (2021). Rapid indicators of deprivation using grocery shopping data. Royal Society Open Science, 8(12), 211069.

27 ONS, The future of population and migration statistics in England and Wales, 29 June 2023.

28 Minister for the Cabinet Office, Government’s Response to National Statistician’s Recommendations, July 2014.

29 Q27

30 See for example: House of Commons Library, TEB14; Open letter from academics to ONS regarding the future of population statistics, October 2023.

31 See for example: ONS, TEB17, para 38; UKRI, TEB11; Understanding Society, TEB03.

32 Administrative Data Research UK, The Longitudinal Education Outcomes (LEO) dataset is now available in the ONS Secure Research Service, July 2021.

33 ONS, Updating ethnic contrasts in deaths involving the coronavirus (COVID-19), England: 24 January 2020 to 31 March 2021, May 2021.

34 Digital Economy Act 2017.

35 See for example: Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024; Royal Statistical Society, TEB08; UKRI, TEB11; UKSA, TEB17.

36 Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024.

37 UKSA, TEB17.

38 UKRI, TEB11.

39 Q39.

40 GOV.UK, Mid-point report on use of the DEA powers, February 2020.

41 See for example: Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024; Cabinet Office, TEB10.

42 Q39.

43 Q217.

44 Cabinet Office, TEB10.

45 UKSA, TEB30.

46 Governance of official statistics: redefining the dual role of the UK Statistics Authority; and re-evaluating the Statistics and Registration Service Act 2007, July 2019.

47 Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024.

48 See for example: Connected by Data, TEB15; Royal Statistical Society, TEB08; ODI, TEB24.

49 Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024.

50 Connected by Data, TEB15.

51 GOV.UK, National Data Strategy, December 2020

52 Royal Statistical Society, TEB08.

53 GOV.UK, Research Code of Practice and Accreditation Criteria.

54 UKSA, TEB17.

55 Public Administration Committee, Public Trust in Government Statistics: A review of the operation of the Statistics and Registration Service Act 2007, February 2013.

56 Sir Charles Bean, Independent Review of UK Economic Statistics, March 2016.

57 GOV.UK, Government Response to the independent review of the UK Statistics Authority, March 2024.

58 See for example: Royal Statistical Society, TEB02, paragraph 1.1.2.

59 Q52.

60 Government Analysis Function, Summary of ambulance response time data in the UK, accessed October 2023.

61 Professor Denise Lievesley, Independent Review of the UK Statistics Authority, March 2024.

62 GOV.UK, Government Response to the independent review of the UK Statistics Authority, March 2024.

63 Governance of official statistics: redefining the dual role of the UK Statistics Authority; and re-evaluating the Statistics and Registration Service Act 2007, July 2019.

64 See: Analysis Function Annual Report 2022 to 2023, accessed May 2024.

65 Q11

66 Q222–223

67 Oral evidence taken before the Public Administration and Constitutional Affairs Committee (Civil Service Leadership and Reform), Q96

68 Letter from Professor Sir Ian Diamond, National Statistician, UK Statistics Authority on additional written evidence following the 5.9.23 oral evidence session, March 2024.

69 Ibid.

70 Ibid.

71 GOV.UK, Government Efficiency Savings 2021/22, July 2023.

72 Oral evidence taken before the Public Accounts Committee (Use of evaluation and financial modelling), Q62

73 Public Accounts Committee, Use of evaluation and modelling in government (HC 254), May 2022.

74 UKSA, UK Statistics Authority Board minutes and papers 29 February 2024, April 2024.

75 See for example: Open Data Institute, TEB24; Sense about Science, TEB13; Full Fact, TEB18.

76 The Analysis Functional Standard provides no guidance on when analysis should be published; it simply establishes a governance framework that leaves the decision with the producing body.

77 Q82

78 Full Fact, TEB18.

79 Lord Maude, Independent Review of Governance and Accountability in the Civil Service, November 2023.

80 See: Jonathan Slater, Fixing Whitehall’s broken policy machine, March 2022; and, Oral evidence taken by Public Administration and Constitutional Affairs Committee (Civil Service Leadership and Reform) Q128–129.

81 Public Administration and Constitutional Affairs Committee, Where Civil Servants Work: Planning for the future of the Government’s estates, July 2023.

82 Full Fact, TEB18.

83 Q154

84 UKSA, TEB17, paragraphs 45–49.

85 See for example: following public correspondence about claims made by Ministers regarding the asylum backlog, the Home Office published an ad-hoc release of the data underpinning these statements; following correspondence with the Welsh Government, data about the introduction of 20mph speed limits were made publicly available; see also Full Fact’s written evidence (TEB18), which references data published on Russian sanctions.

86 Q162

87 See: Data Protection Act 2018; and UK General Data Protection Regulations 2018.

88 See: Statistics and Registration Service Act 2007; and Digital Economy Act 2017.

89 ICO, TEB26.

90 Q116

91 Principle (a): Lawfulness, fairness and transparency | ICO

92 See for example: Q118 and 149.

93 medConfidential, TEB27.

94 ODI, TEB24.

95 Connected by Data, TEB15.

96 Cabinet Office, Data Ethics Framework: Glossary and methodology, September 2020.

97 Q133

98 GOV.UK, Public attitudes to data and AI: Tracker survey (Wave 3), February 2024.

99 GOV.UK, Data Ethics Framework, September 2020.

100 UKSA, TEB17.

101 Geospatial Commission / Science Wise, Geospatial Commission location data ethics dialogue evaluation, June 2022.

102 Q134

103 Q134

104 Q135–136. See also Royal Statistical Society, TEB08.