Select Committee on Science and Technology Seventh Report


4  Evidence Based Policy

"When the evidence changes I change my mind; what do you do?"- John Maynard Keynes

85. One of the defining characteristics of the early years of the present Labour Government was its stated commitment to evidence based policy making. In the words of the Centre for Crime and Justice Studies at King's College London:

"When Labour came to power in 1997 it made a very clear commitment to 'evidence-based' policy making across government and in particular, in criminal justice. This was summed up by Tony Blair when he declared 'what matters is what works'. The clear message sent out to academics and researchers was that the government wanted to use the application of knowledge to inform and determine policy making".[151]

However, over the years, the Government's commitment to evidence based policy has been called into question on many occasions. In the course of this inquiry, various witnesses queried whether evidence based policy making was in fact feasible, given political and other constraints. The Centre for Evidence Based Policy and Practice told us: "Although the term 'evidence based policy' has gained currency in recent years (and is reflected in the title given to our Centre by the ESRC in 2000), our experience suggests that it misrepresents the relationships between evidence and policy". According to the Centre, "'Evidence informed policy' is nearer the reality".[152] William Solesbury, Senior Visiting Research Fellow at the Centre, expanded upon this as follows:

"I think the concept that policy should be based on evidence is something that I would rail against quite fiercely. It implies first of all that it is the sole thing that you should consider. Secondly, it implies the metaphor 'base' and implies a kind of solidity, which […] is often not there, certainly in the social sciences although I think to a great degree, […] not always in the natural and biological sciences".[153]

86. Norman Glass, Director of the National Centre for Social Research, agreed, telling us: "I do not like the phrase 'evidence based'—it is not the way policy gets made".[154] Professor Tim Hope, Professor of Criminology at the University of Keele, also argued that there was "an incompatibility between the ideology of evidence-based policy and the natural inclination of the political process to want to secure the best outcomes".[155] According to Professor Hope, the "power and influence of politics tends to infect the procedures and processes of knowledge production of science, to its detriment, and […] to the detriment of the public interest".[156]

87. Various commentators have recently drawn attention to flagship Government policies that appear to have been developed in the absence of any convincing evidence that they would work. Sir John Krebs, former Chairman of the Food Standards Agency, singled out the announcement in September 2005 by the then Secretary of State for Education and Skills, Ruth Kelly, that the Government planned to ban junk food from meals and vending machines in English schools.[157] According to Sir John, this policy had been developed with: no evidence that it would work; no scientific definition of junk food; no cost benefit analysis; and no public engagement.[158] Sir John also noted that the report, Tackling Child Obesity—First Steps, published by the Audit Commission, Healthcare Commission and National Audit Office in February 2006, commented that there was "no evidence whether [the current] range of programmes or initiatives to improve children's health and nutrition generally will encourage obese children or children at risk of obesity to eat more healthily".[159] Judy Nixon, a Senior Lecturer at Sheffield Hallam University, has also argued that there is little convincing evidence to support the use of Anti-Social Behaviour Orders, commenting that "While there is a diverse and growing literature on ASBOs the absence of robust empirical research means that much of what is written is dominated by anecdote, conjecture and rhetoric".[160] Furthermore, in each of the case studies we conducted, we encountered examples of more tenuous relationships between policies and the evidence on which they were purported to be based than is suggested by the phrase 'evidence based policy'.

88. In evidence to us, both the Head of the Government Economic Service and Government CSA acknowledged the need to strengthen the Government's use of evidence. Sir Nicholas Stern told us that there are many examples from across government where evidence is really shaping policy, such as welfare to work. Nonetheless, he acknowledged that "we do have to push harder on using evidence in Government", and said that he would welcome a stronger presence of economists in health and education departments.[161] He also noted that "You would always like, as an ex-academic, more time to look into the evidence than the pace of decision making life allows you".[162] Sir David King also admitted that there was: "an enormous amount of work still to be done", telling us: "I think we have moved a long way, but this is a bit of a tanker that needs turning to get a full understanding of what the strength of scientific knowledge can bring to the evidence based system".[163]

89. We applaud Sir David King's efforts to integrate science fully into an evidence based approach. Government should also be clear when policy is not evidence-based, or when evidence represents only a weak consideration in the process, relative to other factors. There will be many situations in which policy is primarily driven by factors other than evidence, be they political commitments or judgments (eg the minimum wage), moral standpoints (eg stem cell research), or urgent responses to circumstances or policies on which there is little empirical evidence to go on. If evidence-based policy making is to retain credibility, it is essential that it is not abused: ministers should only use the phrase when appropriate and need not be too chary about acknowledging that certain policies are not based on evidence. They should certainly not seek selectively to pick pieces of evidence which support an already agreed policy, or even commission research in order to produce a justification for policy: so-called "policy-based evidence making" (see paragraphs 95-6). Where there is an absence of evidence, or even when the Government is knowingly contradicting the evidence—maybe for very good reason—this should be openly acknowledged.

90. The Secretary of State acknowledged this point. He told us: "You want to take into account all the available evidence; but, at the end of the day, a minister's job, Parliament's job, is to reach a judgment as to whether or not a particular policy ought to be pursued or not, and you can look at evidence and that will influence your judgment".[164] We agree that ministerial decisions need to take into account factors other than evidence, but this is not reflected in the Government's oft-repeated assertion that it is committed to pursuing an evidence based approach to policy making. We have detected little evidence of an appetite for open departure from the mantra of evidence based policy making. It would be more honest and accurate to acknowledge the fact that while evidence plays a key role in informing policy, decisions are ultimately based on a number of factors—including political expediency. Where policy decisions are based on other such factors and do not flow from the evidence or scientific advice, this should be made clear. We would not expect a DCSA or a GCSA to defend a policy or part of a policy which is not based on scientific advice. We return to this topic in chapter 6.

Research

91. Scientific evidence is generated as a result of research. The Government's commitment to evidence based policy therefore necessitates a concomitant commitment to proper investment in research. Most departments have now developed science and innovation strategies which identify departmental research priorities. These strategies are undoubtedly helping to bring a greater degree of rigour and transparency to the setting of research priorities and the commissioning process, but much remains to be done and there are significant gaps in the evidence base relating to key Government policy areas. For example, in our case study looking at the classification of illegal drugs we found a worrying lack of investment in addiction and drugs policy research which could only serve to hinder policy making in that area. Furthermore, our case study addressing the technologies underpinning ID cards highlighted the fact that the Home Office did not employ a clear mechanism for identifying when there was a need to commission research to support emerging priorities as a result of policy development, the objectives described in the departmental science strategy being largely static and high-level. This was particularly significant in light of the fact that the entire ID cards policy depends on the necessary science and technology being developed and available within a timescale that fits with the Government's plans. Departments need to evolve more effective mechanisms for identifying gaps in the evidence base for policy development which are capable of responding to new and emerging political priorities. We consider this further in the context of horizon scanning in paragraph 110.

92. In both the drugs and ID cards case studies we noted a lack of investment in social science research, which is critical for building an evidence base to underpin policy making and for evaluating the effectiveness of existing policies. The Institute for the Study of Science, Technology and Innovation also commented on the importance of sustained investment in social research in this area: "too often there is an illusion that applied, policy-oriented research is like turning on or turning off the tap from which all knowledge flows but, in reality, research cannot just be turned on at will to provide solutions to policy-makers".[165] However, departments only commission a limited amount of policy-oriented research and it has not been the main focus of the Research Councils either. In addition, policy-oriented research has not been rewarded by the Research Assessment Exercise, since it tends not to result in publications in prestigious journals. Our predecessors also highlighted this point in their Report on the use of science in international development policy.[166] Professor Kelly, the Department for Transport DCSA, added further weight to the argument that incentives needed to be improved to encourage the best researchers to pursue policy-oriented research, telling us: "It is difficult to get good research and I have not myself found that the lack of money is the most severe constraint […] It is attempting to define the problem so it attracts the very best academics."[167]

93. The GCSA, Sir David King, has expressed concern regarding the pressures on departmental research budgets. He told us:

"I think that the tension between the research budget and delivery in departments is a constant tension, so I feel, for example, that in the Department of Health there has been almost a tradition of R&D budgets being raided for delivery purposes and this is to the detriment of the long-term health of the department. I understand the reasons for the tension, absolutely, but at the same time these create problems in the longer term".[168]

The Environment Research Funders' Forum (ERFF) also noted the "need to improve engagement between Government departments and their non-departmental public bodies (NDPBs) and with the research institutes they support", telling us: "there is valuable knowledge that is not making its way through to the policy process".[169] ERFF further suggested that there was "scope to improve the usefulness to policy making of the directed or managed programmes of the Research Councils".[170] We are aware that many departments have entered into concordats with Research Councils, partly in recognition of this problem. We return to the need to strengthen investment in policy-oriented research in paragraph 98.

PUBLICATION OF RESEARCH FINDINGS AND EVIDENCE

94. The Freedom of Information Act 2000 has placed an additional demand for openness in the policy making process. The GCSA's guidelines advise that:

"It is good practice to publish the underpinning evidence for a new policy decision […] When publishing the evidence the analysis and judgment that went into it, and any important omissions in the data, should be clearly documented and identified as such."[171]

The guidelines further state that "departments should ensure that data relating to an issue is made available as early as possible to the scientific community, and more widely, to enable a wide range of research groups to provide a check on advice going to government".[172] It is notable that in respect of issues falling under Environmental Information Regulations, publication "will usually be obligatory rather than just good practice".[173] The designation of something as "good practice" stops well short of ensuring that it happens. The Government confirmed in evidence that it: "is committed to making the scientific evidence base public via the Freedom of Information Act, departmental publication schemes and other levers. OST has recently begun to work with departments and the Government Communications Network to ensure that evidence is presented as transparently and effectively as possible".[174] We heard in evidence that the Home Office publishes all its commissioned research, provided it is of the requisite standard and that there are no security or other legitimate reasons for non-disclosure.[175] Commissioned systematic reviews of the evidence base should usually be considered as research for the purposes of publication policy.

95. The commissioning of research from academics by Government departments is widespread so we were extremely concerned to hear allegations from certain academics that departments have been commissioning and publishing research selectively in order to 'prop up' policies. Professor Tim Hope, a criminologist from the University of Keele who has worked with the Home Office, told us: "it was with sadness and regret that I saw our work ill-used and our faith in government's use of evidence traduced".[176] Of two case studies looking at burglary reduction commissioned by the Home Office, Professor Hope told us that the department decided to only write up one: "Presumably […] because the area-wide reduction was greater here than elsewhere".[177] Professor Hope also accused the Home Office of manipulating the data so as "to capitalise on chance, producing much more favourable findings overall", despite the fact that "for individual projects, the [Home Office] method produces considerable distortion".[178] Furthermore, Professor Hope alleged that the Home Office had interfered with presentation of research findings by other researchers:

"At the British Society of Criminology conference in the University of Bangor in July 2003 there were a number of papers to be given by academics on the basis of contracted work that they were involved in, as I was, for the Home Office. A number of the researchers were advised not to present their papers at the last minute even though they had been advertised in the programme by Home Office officials".[179]

Other academics have voiced similar concerns. For example, Reece Walters of Stirling University has claimed of the Home Office's treatment of research results: "It is clear the Home Office is interested only in rubber-stamping the political priorities of the Government of the day […] To participate in Home Office research is to endorse a biased agenda".[180]

96. These are serious accusations, amounting as they do to allegations of serious scientific/publication malpractice, and should be subject to vigorous examination. We are not in a position to make a judgment about the veracity of these claims. We are pleased that Sir David King has agreed to investigate any "cases where the party raising the concerns feels a departmental CSA has not dealt with the issue adequately", although he told us that he has received no such requests to date.[181] We have heard enough on an informal basis about the selective publication of research to harbour concerns. Such allegations do nothing to encourage the research community to engage in government-sponsored research or to improve public confidence in the validity of such work. Because of the obvious problem of potential complainants being dependent on funding from those who commission research, the GCSA should not require a formal complaint from the alleged victim in order to instigate an investigation of substantive allegations of this sort of malpractice. We urge the Government CSA to investigate proactively any allegations of malpractice in the commissioning, publication and use of research by departments and to ensure that opportunities to learn lessons are fully taken advantage of. We would expect the results of any such investigations to be made public.

97. Complete openness in publication policy is the best way to dispel concerns over the independence and handling of research. We welcome the view of the Secretary of State for Trade and Industry that "in general, evidence ought to be published"[182] and note that most departments, including the Home Office, have committed to make publicly available the results of the research which they commission. This commitment needs to be borne out in practice. We recommend that the Government Chief Scientific Adviser ensures that the publication of research underpinning policy development and evidence cited in support of policies is monitored as part of the departmental science reviews.

98. The concerns raised above highlight the need for research commissioned by departments to be carried out and published without inappropriate interference from the sponsoring department. Research must, so far as is achievable, be independent and be seen to be so. We are not convinced that the current mechanisms for commissioning research deliver this objective. We have also made the case for greater investment in research to underpin policy development. We recommend the creation of a cross-departmental fund for policy related research to be held by the Government CSA in order to meet these dual aims. The fund would be in addition to, rather than instead of, existing departmental research budgets although it would be expected that over time a greater proportion of Government spend on policy-oriented research would be via the cross-departmental fund, reflecting the fact that science is becoming increasingly multidisciplinary and many key policy areas require the co-ordinated efforts of multiple departments.

METHODOLOGY

99. We received evidence suggesting that in using research results, departments were not paying sufficient attention to the methods used to generate the evidence in question. Sense About Science told us: "From the perspective of good policy making, it is also clearly important that the status of evidence is understood at all stages and by all parties".[183] Professor Hope commented that "methodology ought to matter, as it does to scientists, because it is the only way in which the validity of the evidence itself can be held to public account".[184] In addition, the Centre for Evidence Based Policy and Practice said that it had found "a far too casual approach to the use of evidence (particularly social science findings) from other countries without adequate regard to contextual differences" and argued for "the concept of 'fitness for purpose' as the appropriate measure of quality, meaning that the science should be methodologically good enough for the weight to be attached to it in informing policy".[185]

100. Norman Glass, Director of the National Centre for Social Research, warned that the "consequences of bias in evidence, which is what we social scientists are essentially hunting down day after day", were sometimes considered by policy makers to be "a kind of geeky interest". Mr Glass argued, however:

"If you are basing your evidence on unrepresentative, biased samples then you cannot believe a word. In fact, it is worse than knowing nothing. Knowing things that are not so is worse than knowing nothing at all".[186]

Professor Nancy Cartwright from the London School of Economics and Political Science also emphasised the need to take into account different types of evidence: "the best decisions are made on the basis of the total evidence [...] taking into account how secure each result is and how heavily it weighs for the proposal and also taking into account the overall pattern of the evidence".[187] There is often a temptation to justify policy by selective use of available evidence. But this, and a failure to acknowledge the implications of the methodology used and the overall balance of evidence, risk serious damage to the credibility of evidence-based policy making.

101. We recommend that where the Government describes a policy as evidence-based, it should make a statement on the department's view of the strength and nature of the evidence relied upon, and that such statements be subject to quality assurance (see paragraph 114 below).

Trials and pilots

102. Trials and pilots provide Government with an opportunity to test out policy concepts. A Cabinet Office review of Government pilots entitled Trying It Out: The Role of 'Pilots' in Policy-Making published in December 2003 described their value as follows: "Although pilots or policy trials may be costly in time and resources and may carry political risks, they should be balanced against greater risk of embedding preventable flaws into a new policy".[188] However, we uncovered evidence in our ID cards case study which demonstrated that departments do not always commission trials or pilots at the appropriate stage in policy development and may use the outcomes for purposes other than those specified at the outset of the pilot. William Solesbury, Senior Visiting Research Fellow at the Centre for Evidence Based Policy and Practice, also told us: "there are probably too few [pilots] and they are used inappropriately". He was of the view that there was a "mismatch between the research timetable and cycle and the political cycle" so that "once pilots are up and running ministers are very often keen to roll them out before results are ready".[189]

103. In July 1998, the Government launched a pilot of its high-profile Sure Start scheme, a family support programme for parents in deprived areas. There were around 200 centres in the programme at the time of launch, and by 2005 there were around 530 schemes. Norman Glass, then a Treasury civil servant, played a central role in the development of the original Sure Start programme. The Government's 10 year strategy for childcare, published in December 2004, included a pledge that by 2010 there would be 3,500 centres, in a move widely seen as shifting the emphasis of the programme towards childcare (rather than child development). Norman Glass was highly critical of this development, telling us: "we should have learned much more about the experience from those 200 before we rolled it out on any scale […] We rolled it out too much, too fast and too inadequately reviewed".[190] Anna Coote, former health policy director at the King's Fund, also argued at the time: "one bold social experiment is being transmuted into another rather different one, before anyone has a chance to learn whether the original approach was worthwhile".[191] In addition, the Education and Skills Committee stated in its March 2005 Report Every Child Matters: "We are concerned that significant changes are being made to the Sure Start programme when evidence about the effectiveness of the current system is only just beginning to emerge", further noting that this reflected "the inherent difficulties of pursuing transformative and rapid change while at the same time maintaining a commitment to evidence-based policy".[192]

104. Trying It Out stated that a pilot "should be undertaken in a spirit of experimentation" and "Once embarked upon […] must be allowed to run its course" since "the full benefits of a policy pilot will not be realised if the policy is rolled out before the results of the pilot have been absorbed and acted upon" and "Early results may give a misleading picture".[193] This does not appear to have been taken on board by the Government. The Secretary of State for Trade and Industry, Alistair Darling, admitted to us that while it was "highly desirable, in some areas, that something should be trialled and you ought to be able to walk away from something and say, 'Well, it didn't work'", this posed problems for ministers: "As you well know, in politics that sometimes can be difficult, because people say, 'Ah, you've failed and the whole thing's a disaster,' and so on".[194] It is necessary to change the political culture, including among opposition parties and the media, to ensure that a decision to change track after a trial or pilot has been evaluated is recognised as good practice, and that failure to evaluate trials and pilots or a failure to change course after evaluation where this would be appropriate is recognised as bad practice. Pilots and trials can make a valuable contribution to policy making but there is no point the Government initiating them if it is not going to use the output properly. In order to protect them from political pressures, pilots and trials should be carried out at arm's length from Government or at least be independently overseen.

Horizon scanning

105. In recent years, there has been a growing emphasis on "horizon scanning" to identify potential threats and opportunities involving science and technology that could impact on Government policy. The Science and Innovation Investment Framework 2004-2014 stated that horizon scanning was "essential to the effective governance and direction of Government policy, publicly funded research and many of the activities of the private sector, and to the interactions between them".[195] Over the last 12 years, the OSI-based Foresight programme has undertaken a significant amount of work on the identification of medium to longer term threats and opportunities posed by science and technology by bringing together scientists, industry and Government. Recent projects have included Cyber Trust and Crime Prevention, Exploiting the Electromagnetic Spectrum, and Brain Science, Addiction and Drugs. A high-level Stakeholder Group, chaired by a minister from the sponsor department, is assembled to oversee each project. The Group reconvenes one year after the publication of the project's findings to review the progress made. The GCSA's Guidelines on Scientific Analysis in Policy Making also state that "individual departments should ensure that adequate horizon scanning procedures are in place, sourcing data across all evidential areas, to provide early indications of trends, issues, or other emerging phenomena that may create significant impacts that departments need to take account of".[196]

106. In addition, as a result of a commitment in the Science and Innovation Investment Framework 2004-2014, the Government has established a centre of excellence in science and technology horizon scanning, based at OSI. The Centre aims to:

  • Inform departmental and cross-departmental decision-making;
  • Support horizon scanning carried out by others inside Government; and
  • Spot the implications of emerging science and technology and enable others to act on them.

The Secretary of State told us that the efforts to strengthen horizon scanning in relation to science and technology had already had an impact: "I have seen a change actually from the late 1990s to where we are now, in that, as a secretary of state now, I would expect to have far more knowledge and far greater awareness of the challenges facing my department, in the longer term, the science, if you like, than certainly was the case eight or nine years ago".[197] We commend the Government CSA and the Office of Science and Innovation on their work aimed at strengthening horizon scanning in relation to science and technology across Government.

107. Other parts of Government also undertake horizon scanning in respect of threats and opportunities which Government policy should take account of. The Prime Minister's Strategy Unit, for example, "provides the Prime Minister with in-depth strategy advice and policy analysis on his priority issues" and has a remit "to identify and effectively disseminate thinking on emerging issues and challenges for the UK Government".[198] Professor Martin Taylor, Vice President of the Royal Society, questioned whether horizon scanning across Government and beyond was being properly co-ordinated. He told us:

"The Royal Society has its own horizon scanning. From what I can gather most of the departments have it, certainly DEFRA has, and of course there is the OSI that runs something sometimes called the scan of scans, so there is a lot of it out there but I do not honestly believe it is terribly well joined up".[199]

The Government admitted to us that "within Departments there is not yet a common embedded view of what horizon scanning is, how and where it is applied, and what part it plays in business processes including strategy and risk management" but suggested that the OSI Horizon Scanning Centre would help to address this.[200] We also note that the Public Administration Committee has been undertaking an inquiry entitled Governing the future which will look more broadly at the role of horizon scanning in policy making.

108. Despite the Government's efforts, we heard criticism of the fact that science was not involved sufficiently early in the policy making process. The Science Council told us:

"lead government units must recognise the need to draw much earlier on many more sources of advice and expertise and they should seek to understand fully the potential impact of the wording of regulation and legislation before it is cast in stone. Too often the involvement of experts comes at a point when poorly drafted regulation has to be implemented in a way that minimises the unintended consequences".[201]

Cancer Research UK was particularly concerned about horizon scanning at the EU level:

"In two recent examples of European legislation, the EU Directives on Clinical Trials and on Physical Agents, the UK has been ineffective in horizon scanning. Both pieces of legislation had the potential to make a significant impact on medical research. The strong impression across the medical research community is that the Government and its departments were either too late entering the debate on this legislation or not adequately aware of the potential impact of these Directives".[202]

We also concluded in our Report on the case study on the EU Physical Agents (Electromagnetic Fields) Directive (the MRI case study) that "failures in the horizon-scanning activities of the Government and its agencies, the Research Councils" meant that the "Directive was well over the horizon before the medical research community, led by the MRC, reacted to its potential consequences".[203] The Government has admitted that although "There are a number of existing mechanisms for Horizon Scanning issues emerging from the EU […] there is no overarching coordinating framework that draws them all together".[204] We have already recommended in our Report on the MRI case study that the Government review its horizon scanning activities in respect of EU legislation which could impact on, or benefit from, the UK science and technology community. We hope that the Government will rectify this situation by implementing our recommendations.

109. Another criticism levelled at horizon scanning in Government is that the outputs of such activities are often not well utilised by policy makers. Professor Wiles, Home Office DCSA, expressed the problem thus: "doing horizon scanning is one thing, getting an organisation to actually lift its head from immediate problems and think ten or twenty years ahead and use that horizon scanning is sometimes a challenge".[205] He told us:

"You can imagine, particularly at the moment in the Home Office, it is difficult to get the Department to take its gaze above the immediate crises it has to deal with and say, 'Yes, all very well, but in the long run the way to do that is to be able to look further ahead, understand the kinds of risks that lay in the future and think about how you are going to manage them'".[206]

The Secretary of State for Trade and Industry was sympathetic, telling us "of course it is the case that if the department is so involved in day-to-day matters then I can quite see that, frankly, what happens in ten years' time may not be the thing that is top of the in-tray",[207] but suggested that it depended on the particular minister involved: "There are some politicians who do take very short-term positions and I am sure we can think of many examples; there are others, on the other hand, who take a far-sighted view on behalf of the whole country, of course".[208] To take account of this reality, horizon scanning needs to be firmly embedded in the policy making process across Government.

110. In the context of the electoral cycle and an era of 24 hour news coverage, it is not hard to see why politicians prioritise actions that can deliver short term benefits over those not likely to yield dividends until they have long departed from the Government. It is a major challenge for the Government to ensure that the results of horizon scanning are being used properly. The Government needs to put in place incentives to encourage departments to take a more long term view in developing policy. This will be vital if today's major policy challenges—including energy, climate change and terrorism—are to be addressed in a sustainable manner. The Secretary of State himself pointed out that "Pensions is another case in point, where, frankly, unless there is long-term agreement between the political parties it is going to be difficult".[209] In our view, more needs to be done to drive this change. We recommend that it be a requirement for departments to demonstrate in all major strategic planning documents that they are using the results of—not just conducting—horizon scanning and research.

111. Part of the problem arises from the fact that the Government's current approach to policy making is not sufficiently responsive to changing evidence, making it hard to feed in results from activities such as trials, research and horizon scanning. There needs to be a stronger culture of policy evolution whereby policies are updated and adapted as new evidence emerges. We recognise the political difficulties involved in achieving a change, but we urge the Government, as well as the opposition parties, to move towards a situation where a decision to revise a policy in the light of new evidence is welcomed, rather than being perceived as a policy failure. The Centre for Evidence Based Policy and Practice also highlighted the fact that this would require "for all policy fields, a continuous updating of the evidence base as new scientific research—commissioned by government or by others—yields results that can inform policy development and delivery in a timely way", suggesting that a key challenge was: "Managing those stocks of policy-relevant knowledge—keeping them objective and impartial, up-to-date, accessible".[210]

Quality control

112. In light of the concerns identified above, we sought to establish what processes were in place to control the quality and use of evidence in policy making. There is at present no independent or rigorous verification of Government claims that its policies are evidence based. Various departments also told us that, because they had DCSAs, science and innovation strategies and scientific advisory committees, their policies could be considered evidence based. We are unconvinced.

113. Nonetheless, we did find that some provision had been made for quality control of the evidence feeding into policy making and the processes by which it is incorporated into policy. Sir David King told us that, as GCSA: "whether it is energy review or preparations for a flu pandemic it is my job to go in and challenge the evidence, see that it is robust before it goes up to ministers".[211] He also indicated that DCSAs fulfilled similar roles within individual departments. That is welcome but does not amount to formal monitoring of the advice provided based on the evidence, or of the degree to which assertions of the evidence-based nature of a policy are valid. In addition, Sir David argued that his Guidelines on Scientific Analysis in Policy Making made an important contribution towards ensuring that the Government followed an evidence based approach to policy making. We heard much support for the Guidelines. The Biosciences Federation was one of a number of witnesses who praised them in evidence to us: "the recently updated OST [OSI] guidelines provide an excellent framework for the use of scientific expertise in formulating public policy".[212] However, questions were also raised about the extent to which departments were putting the Guidelines into practice.[213] When asked whether the Guidelines were actually making a difference, the departmental CSAs who gave evidence to us seemed less than convinced about their usefulness. The DFID DCSA Professor Sir Gordon Conway's immediate response was: "I am not sure I can answer that specifically".[214] Professor Paul Wiles, Home Office DCSA, said: "I see that as something that needs to be placed alongside the actual processes by which policy is developed".[215] Professor Frank Kelly, DCSA for the Department for Transport, was of the view that "they set a context rather like the context that a contract sets in commercial terms. Something is going wrong when you try to read the contract".[216] Sir David King expressed surprise at this view on how the Guidelines are used in practice.[217]

114. We also found in our MRI case study that the Health and Safety Executive and the National Radiological Protection Board/Health Protection Agency had failed to follow the Guidelines in their response to the EU Physical Agents (Electromagnetic Fields) Directive.[218] It is useful that the Government CSA has issued guidance on the use of scientific analysis in policy making but it is disappointing that there has been so little monitoring of its implementation. Departmental CSAs should, in future, be more proactive in ensuring that the principles defined in the Guidelines on Scientific Analysis in Policy Making are adhered to within their departments. We accept that the Science Reviews being led by OSI do examine whether departments are following the Guidelines but, since the launch of the programme in 2003, only one review has been completed (another three are underway). We consider the Science Reviews further in paragraphs 121-123. We also note that the Food Standards Agency has been developing a 'Science Checklist' "to make explicit the points to be considered in the preparation of papers dealing with science-based issues". This checklist overlaps with both guidance issued by the Government's Social Research Unit and the Guidelines on Scientific Analysis in Policy Making, suggesting either that the Guidelines are not presented in the most useful format or that individual departments would benefit from tailoring the Guidelines and associated guidance to their specific needs.

115. To increase public and scientific confidence in the way that the Government uses scientific advice and evidence, it is necessary for there to be a more formal and accountable system of monitoring the quality of the scientific advice provided and the validity of statements by departments of the evidence-based nature of policies.

PEER REVIEW

116. It is not possible within the terms of reference of this inquiry to do justice to the issues involved in peer review more generally, although, as our predecessor Committee previously indicated,[219] we intend to return to this subject at a later date. The GCSA's Guidelines highlight the importance of quality assurance and peer review:

"Quality assurance provides confidence in the evidence gathering process whilst peer review provides expert evaluation of the evidence itself. Both are important tools in ensuring advice is as up to date and robust as possible".

The Royal Society told us: "the effective use of independent peer review is a vital part of ensuring the quality of the work that Government Departments sponsor".[220] Sense About Science suggested that "Peer review is a dividing line: it indicates that work has been scrutinised for its validity, significance and originality".[221] Most departments have now set out their arrangements for peer review of evidence in their science and innovation strategies.

117. A number of witnesses argued that the Government needed to strengthen its approach to reviewing and evaluating its policies (rather than the underlying evidence). The Royal Society of Chemistry said that it was "not aware of much if any post hoc examination of decisions taken", suggesting that this might be due to the fact that "if such an analysis indicated that the original decision was incorrect this would be politically embarrassing".[222] The Environment Research Funders' Forum commented that "measuring impact and uptake was identified as important but difficult" by the people it consulted and noted that "within departments and agencies quality assurance and evaluation systems can have too narrow a focus, and need to be extended to the full science-into-policy process, including the question formulation and policy uptake stages".[223]

118. The Centre for Evidence Based Policy and Practice told us that Government needed to conduct evaluation "not just to show 'what works' but also why policies work (or not), and what we understand of current phenomena and likely future trends in shaping policies and outcomes".[224] This echoes comments made by Norman Glass: "'What works' is important, but 'how it works' […] is equally, if not more, important".[225] The same can be said about the importance of showing why a policy did or does not work as intended. Mr Glass, a former Treasury civil servant, has been especially critical of the Treasury's approach to policy evaluation, commenting that "Systematic evaluation of policies (even where it exists and the Treasury itself is a notable non-practitioner) remains, in many cases, a procedure for bayoneting the dead", and telling us "the Treasury is a notable absentee [in terms of evaluation]. They introduce all sorts of policies, tax policies, which never get evaluated because they do not have the process".[226]

119. The Sure Start programme discussed in paragraph 103 in the context of policy pilots has been put forward as an example of the Government's weakness in policy evaluation. The £20 million evaluation of the programme, being carried out by a team at Birkbeck College, London, has been criticised for its timing and approach. It has been claimed that the evaluation "did not ask participants whether they had actually used Sure Start services",[227] and has been likened by one journalist to "the under-fives pulling up recently sown radishes to see if their vegetables were growing". It was emphasised that "this was not the researchers' fault, but their commissioners".[228]

120. Some have called for more independent auditing of Government policies in terms of their relation to the evidence base. William Solesbury from the Centre for Evidence Based Policy and Practice suggested that a National Audit Office-style body could provide a useful function in reviewing Government policies and assessing their relationship with the evidence base: "There might be a case for something that might be akin to the National Audit Office, which has a position of great authority and, usually retrospectively, passes judgments of this kind".[229] However, the Secretary of State seemed sceptical as to the value of such a body. He told us that he had "doubts as to whether or not it is possible to get somebody who was so distant, so impartial", noting that different people "will look at the same evidence and come to different conclusions" about whether it is reflected by the policy.[230] We understand this scepticism. While the importance of peer review for establishing the validity of evidence underpinning policy is not in question, peer review is not necessarily the best mechanism for evaluating policies themselves. Nevertheless, review by learned societies, professional bodies and researchers of the extent to which Government policies are evidence based can play a useful role in stimulating debate and refining policy makers' thinking and should, therefore, be welcomed by the Government. We recommend that the Government commission such reviews, on a trial basis, of selected key policies after a reasonable period of time as part of the policy review process.

SCIENCE REVIEWS

121. As noted above, the Office of Science and Innovation has embarked on a rolling programme of Science Reviews looking at each government department in turn. Sir David King set up the Science Review Team in response to a recommendation in the 2002 Investing in Innovation White Paper. The aim of the Reviews is "to externally scrutinise and benchmark the quality and use of science in government departments", where science is interpreted as "physical, natural and social sciences research and data collection (monitoring and surveillance) activities".[231] The Science Reviews got off to a slow start with the review of the first department, DCMS, taking nearly two years. The reviewing function has since been outsourced and reviews are now being conducted on the HSE, DEFRA and DCLG.

122. Sir David told us that the very fact that departments knew they would be subject to a review served a useful purpose: "the existence of the science reviews begins to develop best practice in departments even before we arrive, so there are departments which might try and persuade me to delay the review because they want to put things right, and that in itself is not necessarily a bad thing".[232] He also explained that his ability to persuade departments to implement review recommendations flowed from the support of the Treasury and Prime Minister:

"the Treasury now works with my Office on each of those spending review applications from government departments where science and social sciences are included. So in other words, there is a financial factor that, as you might imagine, is quite an important factor in all of this. The Treasury is one important element, but of course the second element is that the drive comes from the Prime Minister to improve the quality of the evidence base".[233]

We look forward to seeing the results of the next wave of reviews as they emerge.

123. One potential weakness of the Science Reviews is that they fail to address cross-departmental policy making. The Royal Society told us: "The cross-departmental overview is a vital aspect of Sir David King's role"[234] but said that it was not convinced that "the Government is dealing effectively with the scientific advice on the key cross-departmental issues of energy and climate change".[235] In addition, in the case study looking at the technologies supporting ID cards, we found little evidence that the Home Office had liaised effectively with other government departments. The Institute for the Study of Science, Technology and Innovation also told us: "nasty surprises can often occur in the cracks between departments".[236] Norman Glass, Director of the National Centre for Social Research, was similarly concerned about cross-departmental working:

"departments do not—it is amazing—even compare their research programmes with one another to see whether there is overlap and whether they could do things synergistically. Getting people to work together is a problem in all these cases. Everyone signs up to it, but nobody does it".[237]

This resonates with the observation of our predecessor Committee in its 2001 Report on the scientific advisory system that "where issues cross departmental boundaries—as they do on GM foods, mobile phones and climate change, for example—there is frequently inadequate co-ordination of the research being commissioned by the different departments, and insufficient cross-fertilisation of ideas".[238] On climate change, we welcome the cross-departmental approach that the Government has now developed, for example in the review led by Sir Nicholas Stern. We recommend that issue-based reviews be introduced as a means of auditing cross-departmental policies. These could be incorporated into the Science Review of the department which has been designated as lead department for the relevant policy. For example, the DEFRA review could include examination of the Government's approach to climate change policy, for which DEFRA is the named lead department.

Conclusions

124. Evidence based policy making has been a watchword of this Government and is widely seen as representing best practice. However, in reality policies are developed on the basis of a myriad of factors, only one of which constitutes evidence in its scientific sense. We have argued that the phrase 'evidence based policy' is misleading and that the Government should therefore desist from seeking to claim that all its policies are evidence based. It is, nonetheless, important that Government invests in research in order to strengthen the evidence base available to inform its policy decisions and we have recommended the establishment of a cross-departmental fund, overseen by the GCSA, to boost government investment in policy-oriented research. It is also vital that research, trials and pilots are conducted, and the outputs published, free from political interference. We are concerned by suggestions that this is not happening in all cases and call for the GCSA to ensure that allegations of poor practice are investigated. In addition, we note that government investment in research, pilots and horizon scanning will never yield dividends unless proper mechanisms exist to incorporate the results of such activities into policy development. In this respect, the short term outlook encouraged by the electoral cycle is a major obstacle to effective policy making and we urge the Government and opposition parties to move towards a more iterative mode of policy making where refining policies in the light of new evidence is seen as a mark of good practice, rather than a sign of failure.


151   Ev 145

152   Ev 173

153   Q 991

154   Q 995

155   Ev 147

156   Q 987

157   "Junk food to be banned in schools", BBC News, 28 September 2005

158   Sir John Krebs, Scientific Advice, Impartiality and Policy, Inaugural Sense About Science lecture, 7 March 2006

159   National Audit Office, Healthcare Commission, Audit Commission, Tackling Child Obesity-First Steps, 28 February 2006, HC 801

160   Ev 146

161   Q 1050

162   Q 1034

163   Q 26

164   Q 1314

165   Ev 120

166   HC (2003-04) 133-I, paras 184-185

167   Q 1112

168   Q 1373

169   Ev 99

170   Ev 100

171   http://www.dti.gov.uk/files/file9767.pdf, para 25

172   As above, para 26

173   http://www.dti.gov.uk/files/file9767.pdf, para 25

174   Ev 89

175   Q 1131

176   Ev 147

177   Ev 148

178   As above

179   Q 993

180   "Truth about crime 'being distorted'", Metro, 13 February 2006

181   Ev 202

182   Q 1316

183   Ev 117

184   Ev 147

185   Ev 174

186   Q 1003

187   Ev 96

188   Cabinet Office, Trying It Out, The Role of 'Pilots' in Policy-Making, December 2003, recommendation 2

189   Q 1014

190   Q 997

191   "But does Sure Start work?", Anna Coote, The Guardian, 19 January 2005

192   Education and Skills Committee, Ninth Report of Session 2004-05, Every Child Matters, HC 40-I, para 39

193   Cabinet Office, Trying It Out, The Role of 'Pilots' in Policy-Making, December 2003, recommendations 5-6

194   Q 1326

195   HM Treasury, DTI, DfES, Science and innovation investment framework 2004-2014, July 2004, para 8.17

196   http://www.dti.gov.uk/files/file9767.pdf

197   Q 1303

198   http://www.strategy.gov.uk/

199   Q 975

200   Ev 138

201   Ev 128

202   Ev 134

203   HC (2005-06) 1030, para 72

204   Ev 142

205   Q 1107

206   Q 1108

207   Q 1392

208   Q 1390

209   Q 1328

210   Ev 175

211   Q 1334

212   Ev 109

213   Ev 116

214   Q 1098

215   Q 1099

216   Q 1100

217   Q 1367

218   HC (2005-06) 1030, para 60

219   Science and Technology Committee, Tenth Report of Session 2003-04, Scientific Publications: Free for All?, HC 399-I, para 95

220   Ev 107

221   Ev 118

222   Ev 127

223   Ev 99

224   Ev 174

225   "Surely some mistake?", Norman Glass, The Guardian, 5 January 2005

226   Q 1012

227   "Sure Start sets back the worst placed youngsters, study finds", The Guardian, 1 December 2005

228   "Shaky times for Sure Start", The Guardian, 13 September 2005

229   Q 1019

230   Q 1332

231   www.dti.gov.uk/science/science-in-govt/works/science-reviews/background/page25852.html

232   Q 70

233   Q 72

234   Ev 102

235   Ev 104

236   Ev 120

237   Q 1013

238   HC (2000-01) 257, para 97


 

© Parliamentary copyright 2006
Prepared 8 November 2006