APPENDIX 22
Memorandum from William Solesbury, Senior
Visiting Research Fellow, ESRC UK Centre for Evidence Based Policy
and Practice, Kings College London
INTRODUCTION
1. This memorandum draws on the work of
the ESRC UK Centre for Evidence Based Policy and Practice at Kings
College London as it relates to the questions being addressed
by the Committee's inquiry. The Centre's remit is to analyse the
diverse ways in which evidence (including scientific, social
scientific and other research, enquiry and debate) is used
in public policy. To that end it has undertaken, over the last
five years, a programme of research, training, consultancy and
resource provision (in part funded by the Economic and Social
Research Council).
Research projects have been
concerned with the public debate about GM crops and foods; the
nature and uses of evidence in the audit, inspection and scrutiny
functions of government; the conduct of research reviews; the
development of a new "realist" approach to the synthesis
of evidence; the nature and quality of knowledge within social
care; the assessment of research quality; the contribution to
the work of government departments of officials and board members
from outside the civil service; and the role of strategic thinking
in government.
Training has been provided
for clients including the National School of Government, the National
Audit Office, and the Social Care Institute for Excellence and
for researchers and postgraduate students supported by the Economic
and Social Research Council.
Consultancy clients have included
the Parliamentary Office of Science and Technology, the Learning
and Skills Development Agency, and the National Centre for Social
Research.
The Centre functions as a resource
centre through its website, which includes, inter alia, a guide
to many databases of published research and other resources for
evidence based policy and practice, and an extensive searchable
bibliography of publications on evidence based policy.
The Centre also edits the journal
Evidence and Policy, published by Policy Press.
More details of most of this work can be found
on the Centre's website: www.evidencenetwork.org. This memorandum
now addresses the questions in the Committee's inquiry brief.
SOURCES AND HANDLING OF ADVICE
2. Government departments secure scientific
(including social scientific) advice through a number of modes
Through standing or ad hoc scientific
advisory bodies.
Through commissioning new research
externally from academic or other centres.
Through searching existing published
research.
There has been a notable increase in departmental
research budgets for commissioned research over the last decade.
There has also been a growing interest in the practice of "systematic
reviews", that is, searching published research in a rigorous
way and seeking to establish the present state of scientific knowledge
relevant to a particular policy question. Some believe that this
has the potential to offer more comprehensive, speedier and more
cost-effective evidence than the other two modes. However, our
research on systematic reviewing[105] and our experience in
contributing to reviews suggest that those commissioning, undertaking
and using such reviews often underestimate the time, money and
skills they require. It also shows that policy makers can find
the results difficult to interpret and apply.
3. All these ways of securing scientific
advice depend on an in-house capability to handle it: identifying
when science can contribute to policy, seeking it out from a wide
range of sources and interpreting its relevance to policy. This
should be a prime role of departmental scientific and social research
staffs who can act as "brokers" between science and
policy. But our training and consultancy experience suggests that
this role is underdeveloped in departments. It is also important
that mainstream civil servants develop an understanding of what
science can contribute, and crucially when that contribution is
needed. The recent work on Professional Skills for Government
has recognised this in the new competency frameworks that have
been developed.[106]
Among the core skills for staff at Grade 7 and above are Analysis
and Use of Evidence (the others are Financial Management, People
Management and Programme and Project Management). This is laudable
but insufficient: skills are not the only issue; attitudes
need to change too. Our research on the experiences of outsiders
recruited into Whitehall, mostly in quite senior positions, revealed
that many of them struggled to gain recognition from their insider
colleagues of the expertise they brought with them; and some left
quite quickly.[107]
RELATIONSHIP BETWEEN SCIENTIFIC ADVICE AND POLICY DEVELOPMENT
4. Although the term "evidence based
policy" has gained currency in recent years (and is reflected
in the title given to our Centre by the ESRC in 2000), our experience
suggests that it misrepresents the relationships between evidence
and policy. "Evidence informed policy" is nearer the
reality. Our current research on the role of strategy in government[108]
reveals the continuing interplay between "facts" and
"values" that characterises policy activities. In our
training work we find that the "Four Is" framework (originated
by the US evaluator Professor Carol Weiss)[109]
helps practitioners to place evidence appropriately. It states
that policy is the outcome of the interaction of Ideologies, Interests
and Information (that is, evidence) within the context of Institutions
where:
Information is "the range of
knowledge and ideas that help people make sense of the current
state of affairs, why things happen as they do, and which new
initiatives will help or hinder";
Interests are essentially "self-interests";
Ideologies are "philosophies,
principles, values and political orientation";
Institutions are doubly important:
"first the institutional environment shapes the way in which
participants interpret their own interests, ideologies, and information.
[ . . . ] Second, organisational arrangements affect the decision
process itself, such as who is empowered to make decisions."
This framework also proved fruitful in our analysis
of policy making on GM foods.[110]
We found that, because the Four Is are in dynamic interaction
creating mutual influences that repeatedly interweave, decisions
and outcomes become progressively reshaped. This requires stakeholders
to reposition themselves in relation to each other, the evidence,
the decision process and the policy and practice outcomes.
5. Government must be concerned about the
quality of scientific evidence for policy work. The scientific
community's commitment to quality assurance is one of the distinct
strengths of scientific research as a source of evidence for
policy. But for this purpose
it is important that methodological considerations of validity
and reliability are complemented by utilitarian considerations
of relevance and accessibility. For example, our work on the practice
of research reviewing has revealed a far too casual approach to
the use of evidence (particularly social science findings) from
other countries without adequate regard to contextual differences.
More generally, from our work we would argue for the concept of
"fitness for purpose" as the appropriate measure of
quality for scientific evidence for policy, meaning that the science
should be methodologically good enough for the weight to be attached
to it in informing policy.[111],
[112]
6. Our training and consultancy work has
revealed tensions between scientists and government about the
interpretation and application of research findings. The details
of each case differ. But overall there is a need to pay adequate
prices for commissioned research, to allow researchers adequate
time to undertake the work rigorously, and to expect them to report
"as they find". If government decides, for whatever
reasons, to act in ways that contradict or are unsupported
by scientific advice, then it should do so without seeking to
misrepresent the evidence. We note with interest the proposals
from HM Treasury to safeguard the objectivity and impartiality
of National Statistics.[113]
TREATMENT OF RISK
7. Our research on strategy in politics
confirms the observation that policies often fail for one of two
reasons: unforeseen changes in context or unintended consequences
of the policy itself. Effective policy work requires an initial
assessment of the risks of both as part of policy development
and constant alertness to either occurring through policy monitoring.
This suggests to us that assuming research on "what works"
will deliver "evidence based policy" is dangerously
naive. Research is needed not just to show "what works"
but also why policies work (or not), and how current phenomena
and likely future trends shape policies and outcomes. Horizon
scanning research can help with the latter, but such research
must be concerned as much with societal as with technological
trends.
TRANSPARENCY, COMMUNICATION AND PUBLIC ENGAGEMENT
8. Most government departments have a commitment
to publish the research they commission, in print and/or
electronically. That degree of openness does not apply to the
other two modes in paragraph 2 above (advice from scientific
advisory bodies, and internal reviews of existing research).
What is, understandably, not transparent is how scientific
research has been interpreted and applied in
policy work. In these circumstances it is all too frequently the
case that suspicions arise about the misuse of scientific evidence,
maybe fuelled by the media. Only openness by government about
how it has weighed such evidence (the Information I in the "Four
Is" framework in paragraph 4 above) against other considerations
will allay such suspicions. For example, for the decision on GM
crops, the specially constructed evidence base our research examined
was put into the public domain. It was not all exactly fit for
the government's purpose. The government had a priori decided
to give "science" pre-eminence and greater weight over
other types of evidence. In the event the science was interpreted
to give the government insufficient grounds for supporting immediate
commercialisation of the three test crops. On the wider decision
about how to sustain competing choices (GM/conventional/organic)
in the face of competing interests and ideologies, the special
evidence base was mostly of little use. It arrived too late to
add anything crucial, and well after stakeholders' views had already
shifted strategically.[114]
9. Part of our training and consultancy
portfolio concerns the communication of research to lay audiences:
policy makers, parliamentarians, professionals, and the public. Scientists
are generally not very good at this and traditionally not interested
in it. (The POSTnotes produced by the Parliamentary Office of
Science and Technology are an important and deliberate attempt
to redress this.) Our experience suggests that this is largely
a consequence of inadequate skills training (not just in writing
accessibly but in making presentations, running interactive workshops
and seminars, and exploiting the potential of the Web) and of
incentives that favour communicating research to fellow scientists
more than to practitioners (notably the use of research publications
as performance measures in the Research Assessment Exercise).
Increasing emphasis is being placed on dissemination and "knowledge
transfer" by the Research Councils, the Wellcome Trust, the
Royal Society and others, but there is a long way to go.
EVALUATION AND FOLLOW UP
10. From our research and consultancy work
on the relationship between evidence and policy, we consider that
the need is not so much (as the Inquiry brief suggests) "to
re-evaluate the evidence base after the implementation of policy."
Rather the need is to maintain, for all policy fields, a continuous
updating of the evidence base as new scientific research (commissioned
by government or by others) yields results that can inform
policy development and delivery in a timely way. In our experience
most policy needs for scientific evidence could be met, at
a standard of reliability that is "fit for purpose", from
knowledge that already exists and is available. Managing those
stocks of policy-relevant knowledge (keeping them objective
and impartial, up to date, and accessible) is the big scientific
challenge for more evidence-informed policy.
May 2006
105  Annette Boaz, William Solesbury and Fay Sullivan (2004): The practice of research reviewing 1. An assessment of 28 review reports, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper No 22; available via www.evidencenetwork.org; also Annette Boaz, William Solesbury and Fay Sullivan (forthcoming): The practice of research reviewing 2. 10 case studies of reviews, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper No 24.
106  See http://psg.civilservice.gov.uk/skill_selection.asp
107  Ruth Levitt and William Solesbury (2005), Evidence-informed policy: what difference do outsiders make in Whitehall?, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper No 23; available via www.evidencenetwork.org
108  This is part of an international comparative study of Strategy and Politics funded by the German Bertelsmann Foundation; it will report later in 2006.
109  Carol Weiss (1995), "The four 'I's' of school reform: how interests, ideology, information and institution affect teachers and principals", Harvard Educational Review 65(4), pp 571-592.
110  Ruth Levitt (2003): GM Crops and Foods: Evidence, Policy and Practice in the UK; a case study, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper No 20; available via www.evidencenetwork.org
111  Annette Boaz and Deborah Ashby (2003), Fit for purpose? Assessing research quality for evidence based policy and practice, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper No 11; available via www.evidencenetwork.org
112  This argument is similar to that used by the National Audit Office in its VFM work, where it adopts the criteria that evidence should be relevant, reliable and sufficient. See National Audit Office, Value for Money Handbook: a guide for building quality into VFM examinations, page 16.
113  HM Treasury (2006), Independence for Statistics: a consultation document, available via www.hm-treasury.gov.uk/budget/budget_06/other_documents/bud_bud06_odstatistics.cfm
114  The then Secretary of State, Margaret Beckett, said "The UK is neither pro nor anti GM. We are pro consumer safety and choice and pro protection of the environment. [ . . . ] The challenge for any government is to regulate the use of this new technology in a way that safeguards the public and our planet, commands public confidence, but also ensures that our society does not unnecessarily throw away the benefits science can provide. This is no easy task . . . " DEFRA News Release 174/03, 15 May 2003; via http://www.defra.gov.uk/news/2003/indx2003.asp