APPENDIX 16
Supplementary evidence from the Government
SCIENTIFIC ADVICE, RISK AND EVIDENCE: HOW
GOVERNMENT HANDLES THEM
Answers from Sir David King, Government Chief
Scientific Adviser and Ms Sue Duncan, Chief Government Social
Researcher to Committee questions following the oral evidence
session on 15 February 2006, [HC 900-i].
QUESTION 1
In oral evidence Ms Duncan referred to the Professional
Skills for Government (PSG) initiative [Q11]. Please explain how
PSG will improve the use of scientific evidence (including social
science) by policy makers and the professional development of
scientists within the civil service. What other steps are being
taken to improve opportunities for scientists within the Civil
Service [Q14]?
1. Improving the use of science and social
science evidence requires action to address both the supply and
demand issues. The Professional Skills for Government (PSG) initiative
aims to address both. On the demand side, one of the core skills
policy makers are expected to demonstrate is "analysis and
use of evidence" (A+UE). This should significantly improve
the use of all types of evidence, including scientific evidence,
by policy makers.
2. The requirements associated with the
A+UE skill were developed by a small team that included representation
from "analytical" professions within government. The
"Scientist/Engineer" profession was represented, as
were, for example, social researchers, economists and statisticians.
Should they be required, a full list of the requirements associated
with the A+UE core skill may be found on the Cabinet Office's
PSG site[53].
3. One part of the A+UE skill at the Grade
7 "career gateway" involves "engaging with relevant
experts to gather and evaluate evidence". Since A+UE is a
core skill, all civil servants at Grade 7 and above should be
demonstrating this behaviour. Furthermore, as people progress
through the PSG career gateways, they have to demonstrate real
development in this area. They are required to move from "using"
and "understanding" evidence to "championing"
the role of analysis and evidence.
4. It is as yet unclear what the response
to developing this core A+UE skill is likely to be, or what the
appropriate balance between formal courses and on-the-job learning
will be in acquiring it.
5. PSG emphasises that everyone in the civil
service now belongs to a professional group and clearly puts the
responsibility for developing professionalism (which includes
core policy and business skills alongside analytical skills) in
the hands of the relevant Heads of Profession. Different groups
are at different stages in this process, but Heads of Profession
meet regularly to discuss progress and share best practice.
Science/Engineer Profession
6. The creation of the "Scientist/Engineer"
profession within the PSG framework clearly puts scientists and
engineers on the same footing as all other professional groups.
It should no longer appear as a "specialist" profession,
with an implicit limit on the level that its members can reach.
Integrating the profession into a civil service wide initiative
also highlights the opportunities that exist for scientists and
engineers to progress into the SCS and, furthermore, PSG clearly
defines the skills and experience that are needed to achieve this.
7. The Skills Framework (SF) for the Scientist/Engineer
profession, and the associated Learning and Development document,
will also aid the professional development of scientists and engineers.
For example, these documents highlight the importance of Continuing
Professional Development (CPD), especially that leading to a professional
qualification (eg Chartered Engineer and/or Chartered Scientist).
8. These documents also highlight the importance
of being able to provide clear advice to technical and non-technical
audiences, and of understanding the needs and constraints of stakeholder
communities. These skills should further improve the interface
between policy makers and the analytical professions.
9. As discussed above, the introduction
of PSG has improved the opportunities available to civil service
scientists and engineers. Sir David King's role as the government-wide
Head of Science and Engineering Profession (HoSEP) also supports
this aim. Working with departmental heads of profession, he aims
to publicise the breadth of work conducted by the science and
engineering community, which will help individuals identify future
opportunities. He will also be launching a web site that, amongst
other things, contains a high-level discussion of the various
career paths open to scientists and engineers. The description
of this framework is enhanced by a number of example jobs that
have recently been advertised on the Civil Service Recruitment
Gateway. Current plans are for the web site to go live during
May 2006, this date being driven by changes to the overall Department
of Trade and Industry web site, of which it forms a part.
10. The ongoing programme of Science Reviews,
which are conducted by officials in the Office of Science and
Technology (OST) on behalf of the Government Chief Scientific
Adviser, also contribute to improving the opportunities for civil
service scientists. In particular, one of the criteria they examine
is how departments "use, maintain and develop scientific
expertise".
Social Research Profession
11. To further embed the principles of PSG,
a common competence framework for members of the Government Social
Research service has been developed, setting out the requirements
for each grade. This is soon to be complemented by a Continuing
Professional Development Handbook that introduces a minimum number
of hours of CPD required of GSR members, and assists them in assessing
their current level of competence and identifying training and
other professional development activities that they can undertake
in each area. GSR has also just launched its Fast Stream programme
for existing GSR members, which is intended to identify and develop
the talent of GSR members with potential to progress to the Senior
Civil Service. Fast Stream schemes for economists and statisticians
already exist.
QUESTION 2
Ms Duncan stated that she had "no role specifically
in advising ministers" and that the Chief Economist was her
line manager [Q1-2]. Does the Government Chief Economist have
a role in advising Ministers and the Prime Minister on social
science issues, or does all responsibility for providing cross-departmental
advice on social science lie with the Government Chief Scientific
Adviser?
12. There are a number of mechanisms for
providing cross-departmental advice on social science ranging
from informal liaison by the central chiefs of social science
professions through to formal reviews of cross-departmental policy
areas. One of the reasons for establishing the Coordination of
Research and Analysis Group (CRAG) was to ensure more effective
coordination across the analytical and policy communities in both
anticipating and responding to cross-departmental challenges.
13. The Head of the Government Economic
Service, as well as being a member of CRAG, supports and guides
departmental Government Chief Economists, who do have a direct
role in advising ministers on social science issues, and who meet
regularly to discuss cross-cutting issues. He is available to
give advice to any minister, should that be requested. Sir Nicholas
Stern, as well as being Head of the GES, is, in his other capacity,
an adviser to the UK government on the Economics of Climate Change
and Development, and is currently leading the Stern Review on
the Economics of Climate Change.
QUESTION 3
Sir David indicated that the Office of Science
and Technology fulfilled the same function as the Government Social
Research Service, Government Economic Service, Government Statistical
Service and Government Operational Research Service [Q12]. Could
you please explain the rationale behind this statement and clarify
why there are Government Services for social science disciplines
but not for the natural or physical sciences, engineering and
technology?
14. The rationale behind this statement
is that the Office of Science and Technology (OST) shares the
same high-level aims as the organisations listed in the question.
For example, part of OST's work is about ensuring that the work
of scientists and engineers is integrated into the policy making
process. Its work also involves supporting and championing the
scientific and engineering community.
15. A number of the organisations identified
in this question are involved in the recruitment of members of
their profession. They are also highly active in areas like the
provision of individual training courses.
16. While OST, in common with other analytical
professions, works with other government departments, it does
not work directly with individual members of the science and engineering
community, whose professional development is managed at a Departmental
level. By helping departments write Science and Innovation (or
Evidence and Innovation) strategies, and encouraging the use of
Horizon Scanning (HS) techniques, OST helps departments to identify
the science and engineering capabilities they need to support
their business. In short, amongst other things, OST tries to help
departments better manage their own science and engineering communities.
17. The topic of Continuing Professional
Development (CPD) provides a good example of how this works in
practice. In his role as Head of Profession for scientists
and engineers, Sir David King has stated that, in most cases, CPD that leads
to a professional qualification like Chartered Scientist (CSci)
or Chartered Engineer (CEng) should be undertaken. However, he
has not dictated that this must be achieved in a particular manner.
18. At least 16 different professional bodies
can award CSci status[54].
As might be expected, these include the Institute of Physics,
the Royal Society of Chemistry and the Institute of Mathematics
and its Applications. A number of more specialist organisations
including, for example, the Oil and Colour Chemists' Association,
the Institute of Corrosion and the Institute of Professional Soil
Scientists, can also award the qualification. The choice of a
specialist, or more general, route to chartered status is clearly
best made locally, where the needs of the individual and their
employing organisation are understood much better.
QUESTION 4
Sir David stated: "the role of the Chief
Scientific Adviser is to report to the Prime Minister and the
Cabinet and yet my office is in the DTI. I think that tension
exists and I feel it many days of the week" [Q13]. What steps
have you taken to mitigate this problem and how will the new arrangements
at the OST and DTI impact on this?
19. There are benefits as well as challenges
in the current arrangements. As the Government's Chief Scientific
Adviser, Sir David King values being able to draw readily on the
work of the Science and Engineering Base and Innovation Groups
in the DTI, and to work closely with the Secretary of State for
Trade and Industry and the Minister for Science and Innovation.
Having his office in the DTI sends a clear signal internally and
externally that the DTI is indeed the "Department for Science".
At the same time he ensures that his links to the centre of Government
are strong: Sir David meets the Prime Minister and his staff,
as well as the Secretary of the Cabinet and other Cabinet Office
officials, regularly. It is well understood and accepted by DTI
colleagues, including Ministers, that Sir David advises the Government
independently. The merging of the DTI's Innovation Group into
the Office of Science and Technology to create the Office of Science
and Innovation, which Sir David King will continue to lead, will
not change that position; however it will mean that science and
innovation policies will be better integrated, and carry more
weight, within the DTI.
QUESTION 5
Sir David stated that he was not confident that
all departments had sufficient scientific expertise to respond
to unforeseen challenges [Q25]. Which are the key areas of weakness,
in terms of departments, processes and policies, which need to
be addressed?
20. One of the key recommendations of the
2002 Cross Cutting Review of Science and Research (CCR) was that
Departments publish Science and Innovation (S&I) Strategies,
which are proposals for meeting defined S&T policy objectives
and goals.
21. The Strategies should show how they
relate to and contribute to departmental priorities, objectives
and relevant Public Services Agreement (PSA) targets. They also
aim to identify and meet the strategic public policy challenges
across Government, which undoubtedly require input from a wide
range of analytical resources.
22. The CSA's Guidelines call for adequate horizon
scanning procedures to be in place in Departments, and for S&I
strategies to ensure that these procedures are exploited in support
of scientific evidence and understanding of technological innovation.
This should form part of the solution to cross-cutting public policy
issues that may affect future policies.
23. Horizon scanning is intended to help
reduce the impact of unforeseen challenges by identifying public
policy challenges; sourcing data across all evidential areas;
and identifying the opportunities, threats and key scientific
issues the department will need to address for the longer-term.
Horizon scanning should provide early indications of trends, issues,
or other emerging phenomena that may create significant impacts
that departments need to take account of.
24. While the key elements of the S&I
Strategy documents, such as horizon scanning, risk management,
skills strategy and public engagement, cover the issues of capacity
and capability, the key area of weakness tends to be the lack
of coherent systems for co-ordinating these elements.
25. The ability to retain, fund and develop
scientific expertise for unforeseen challenges is a real issue
for Departments. The need to fund and retain the expertise, and
to constantly update the skills, is an equation that is rarely
truly resolved. (This is based on Science Review experience.)
26. Most Departments are responsible for
a wide range of policy areas, each requiring specialist scientists,
and while these departments build generically on the experience
of their past responses to emergency situations, extrapolating
learning from one to inform the next crisis, they cannot fully
predict the nature of the scientific expertise required for the
next emergency.
27. Processes: the second part of
the question asks which key areas of weakness, in terms of
departments' processes and policies, need to be addressed.
The Horizon Scanning Centre (HSC) was established
in November 2004 to provide a higher-level strategic context to
horizon scanning in departments, and to feed directly into cross-government
priority setting, strategy formation and decision-making. Horizon
scanning helps a Department look beyond the day-to-day operational
priorities, to identify potential changes in science and social
science (for example, dealing with the perception of and attitude
to science) and make provision for planning responses to these.
OST's experience suggests that within Departments there is not
yet a common embedded view of what horizon scanning is, how and
where it is applied, and what part it plays in business processes
including strategy and risk management. However, the Horizon Scanning
Centre within OST is aware of these developmental issues, and
has a Strategic Futures coaching and support programme open to
all Government Departments with the aim of raising the capacity
for horizon scanning within Government Departments and spreading
good practice.
28. In the last 12 months, the coaching
and support programme has involved 78 individuals from eight Government
Departments and a wide range of agencies, devolved administrations
and other public sector bodies. The Horizon Scanning Centre has
an on-going process of engagement with the private sector to seek
lessons and good practice that can be applied in Government. However,
horizon scanning, as an activity across Government, is fairly
new and the activity is not yet well embedded in strategic processes
in all Departments.
29. The Horizon Scanning Centre also runs
a Best Practice network (the Futures Analysts Network, known as
the FAN Club) whose quarterly meetings attract around 60-70 people
from all sectors, both public and private.
30. As well as expecting a Department to
take a view of the future, OST's Science Review Team also looks
at the extent to which a Department looks beyond its own organisation
to identify unforeseen challenges, and areas where the necessary
expertise may already exist elsewhere. This includes working with,
and being aware of, expertise in academia, in other government
departments (OGDs) and internationally.
31. Departmental strategies tend to concentrate
on issues for which a department has lead-responsibility, so cross-departmental
issues may receive less attention. One response to this structural
weakness is the work of the OST's Foresight Directorate whose
criteria for prioritising and selecting potential projects include
the requirement that they should involve cross-Departmental policy
issues.
32. There are in addition network mechanisms, some
informal and others systematic, where Departments have access
to expertise. R&D programmes dedicated to Departmental policy
areas create, through R&D contracts, extensive networks
of external experts. Many Departments also have access to expertise
in their own science agencies, who in themselves are engaged in
extensive topic focussed networks. Departments are also often
invested deeply in relationships with specific research institutions
(eg research council institutes) which gives them access to external
institutional expertise. Some of these relationships are expressed
through formal memoranda of understanding. Departments may also
have access to scientific advisory committees that include leading
experts, often engaged on advisory functions of an anticipatory
nature. Departments may also have formal relationships with international
expert committees and panels. The nature of such network mechanisms
necessarily results in Departments with greater R&D budgets
and science focussed policy areas having a higher level of ready
access to both internal and external expertise.
QUESTION 6
Sir David stated that "the Treasury leads
on managing risk" and Ms Duncan noted that the Treasury had
provided guidance on risk appraisal [Q76]. Who is responsible
for monitoring the implementation of this guidance by departments
and how is this co-ordinated with the Science Reviews led by OST?
33. "Managing risks to the public:
appraisal guidance" was published on 17 June 2005 following
wide consultation. The document provides departments
with the flexibility to take judgements on the management of risk
which are related to their particular circumstances.
34. Monitoring of implementation is the
responsibility of the Treasury.
35. Monitoring is not co-ordinated with
the Science Reviews as the guidance is designed for policy makers
across all government and not specifically for those working on
scientific policy. It is nevertheless the case that the ten criteria
that underpin Science Reviews include several which address risk
at the strategic and operational level. In particular, the Reviews
test for implementation of Guidelines 2005 and the Code of Practice
for Scientific Advisory Committees.
QUESTION 7
Sir David referred to a table of relative risks
produced by the life insurance industry [Q77]. Is this table,
or an analogous framework, used by Government for risk communication?
If not, why not? If so, please provide further information on
how it is used.
36. The Government has not developed a standardised
table of risks; risk means different things to different people.
Looked at from one perspective, an individual may have a better
sense of their own personal risk than Government, since the Government
looks at risk in aggregate across the population as a whole. For
example, someone who does not drive when tired or drunk and always
drives cautiously could reasonably assume that their level of
risk is different to the aggregate average level of risk. On the
other hand people do not tend to have a very analytical approach
to personal risk but rather make judgements based on the way they
feel.
37. What Government does provide is guidance
on managing risks to the public for policy makers and professional
risk assessors. The Government has also developed guidance on
communicating risk, which is published on the UK Resilience web site.
This gives guidance to communicators and policy makers on the
types of risks people face and a set of principles for communicating
on risk-based issues: http://www.ukresilience.info/preparedness/risk/communicatingrisk.pdf
QUESTION 8
Sir David made reference to the Prime Minister's
initiative to improve media handling of risk, formerly led by
John Hutton [Q80]. Please provide information on who is now leading
this initiative, what activities have been undertaken and what
further activities are planned.
38. There have been useful discussions with
the broadcast media about handling risk and there are plans for
the Chief Scientific Adviser, Chief Medical Officer and the Permanent
Secretary for Government Communications to continue the dialogue
with the print media over the next few months.
QUESTION 9
Sir David indicated that he considered the idea
of a precautionary principle to be "unscientific", preferring
to refer to a precautionary approach [Q87]. Please define what
is meant by a precautionary approach and explain the steps that
you have taken to promote the use of a precautionary approach
(as opposed to principle) across Government.
39. The precautionary approach is consistent
with wider risk management practice. It applies where the scientific
evidence is incomplete or inconclusive, and there is the possibility
of severe and irreversible consequences. It simply allows for
a further judgment to be made, alongside a more standard cost-benefit
analysis, to enable policy makers to justify resource investment
in risk prevention/minimization in these circumstances.
40. Each individual case has to be fully
examined on its merits, taking into account relevant risk management
principles such as proportionality and consistency. It is clearly
difficult to define whether proposed action is "proportionate"
when the science is uncertain. Central advice for Departments
seeking to take precautionary action to mitigate perceived risk
is therefore provided in the Treasury Guidance "Managing
risks to the public: appraisal guidance" (p 17) which
supplements the Treasury's Green Book guidance "Appraisal
and Evaluation in Central Government". The guidance identifies
precautionary action as one of several possible options for the
management of risk, including delay and allowing more time
for investigation of alternative, less damaging ways of achieving
stated objectives. The guidance directs appraisers to draw the
matter to the attention of senior management and seek expert advice.
QUESTION 10
Ms Duncan said: "my understanding is that
[the Cabinet Office] do not any more monitor the way departments
implement the guidance" on best practice for consultations
[Q93]. Please confirm that this is the case.
41. The Cabinet Office does still undertake
some monitoring of departmental compliance with the Code of Practice
on Consultation through its annual Assessment of Performance.
(http://www.cabinetoffice.gov.uk/regulation/consultation/index.asp)
23% of the 621 consultations by departments in the
2004 calendar year lasted less than 12 weeks. Consultations that
do not meet the 12-week minimum require ministerial sign-off
explaining the reasons, and departments are expected to take extra
effort to ensure that the consultation is as effective as possible. Of those
consultations that lasted less than the 12 week period, all but
5 received Ministerial approval and were therefore compliant with
the Code. Reasons given for non-compliance have included pressure
of European timetables, stakeholder pressure for quick action,
and the need to dovetail with the parliamentary scrutiny timetable.
From 2005, departments are required to state, in the better regulation
section of their annual reports, the reason why Ministerial approval
is given for consultations lasting less than 12 weeks.
QUESTION 11
Sir David said that he speaks "on behalf
of the advisory system within Government" [Q97]. Please clarify
what the term "advisory system" refers to and which
Minister has ultimate responsibility for the scientific advisory
system in Government.
42. The term "advisory system"
refers to the analytical advice provided by Government analysts
to decision makers (usually Ministers). Sir David speaks on behalf
of the scientific community within this system. Other chief analysts
(such as the Chief Economist and Chief Social Researcher) provide
advice on behalf of their own analytical communities. The way
in which this advice reaches decision makers varies according
to discipline and department.
43. Chief analysts and senior policy makers
meet regularly through CRAG to ensure more effective coordination
across the analytical and policy communities, and to ensure that
Government is in a better position to anticipate, and respond to,
areas where such advice is likely to be needed.
44. Lord Sainsbury (Parliamentary Under-Secretary
of State for Science and Innovation, DTI) is the Minister who
has the lead on the science agenda. However, due to the high profile
and sensitive nature of science, Sir David King, as the independent
GCSA, has the lead in this area and provides advice directly to
the Prime Minister.
QUESTION 12
Are there any general criteria for the selection
of departmental chief scientific advisers? What guidance has been
issued to departments regarding the experience and skills profile
that they should be seeking in a departmental chief scientific
adviser?
45. The role of a Departmental Chief Scientific
Adviser (DCSA) was first developed in the "Cross-Cutting
Review of Science and Research", which published a final
report in March 2002.[55]
This identified the following aspects of a DCSA's role:
1. Ensure that policy is soundly based on
good science.
2. Provide strategic direction to the department's
scientific activities.
3. Be the department's scientific spokesman
to the outside world.
4. Be credible, both within and outside the
department.
5. Have direct access to the department's
board.
6. Be accountable for the level of scientific
expertise in the department.
7. Ensure that staff complete appropriate
continuing professional development.
46. Whilst the general nature of the role
is the same across government, different departments do emphasise
those parts that are most relevant to their business. They also
recruit a DCSA whose area of expertise best matches the work of
the department. For example, as Sir David King stated in his oral
evidence, Paul Wiles, the Home Office DCSA, is a social scientist.
47. To ensure that appropriate criteria
are applied across government, Sir David is involved in the appointment
of all DCSAs. This involvement often includes providing comments
on job advertisements and forming part of the interview panel.
QUESTION 13
Are the numbers of natural and physical scientists
and engineers employed by Government rising or falling (please
provide details for the last five years if possible)? What impact
has the Gershon review had on numbers of in-house scientists?
48. There are no accurate figures that can
be used to provide a quantitative answer to this question. Since
the disappearance of the scientific civil service there has been
no central record on the number of scientists and engineers employed.
Anecdotal evidence suggests that the situation has been exacerbated
by individuals in more general civil service jobs hiding their
scientific skills as they viewed them as an impediment to promotion.
49. The introduction of PSG and the formation
of the Government Skills Sector Skills Council offer at least
some hope that this situation will improve. In particular, PSG
requires individuals to select a professional grouping, and Government
Skills needs to collect workforce data. The combination of these
requirements might in the future lead to a situation where departments
(and their agencies) are asked to report on the number of staff within
each profession.
50. As Sir David mentioned in his oral evidence,
the privatisation of agencies and laboratories is reducing the
scientific expertise that is available within the civil service.
The privatisation of part of what was the Defence Evaluation and
Research Agency (to form QinetiQ) and the Laboratory of the Government
Chemist are but two recent examples. Ongoing changes to the status
of the Forensic Science Service are another case in point.
51. Whilst it is accepted that there are
"local issues" associated with the cost-effective provision
of services to a department, the ability to exploit Intellectual
Property (IP), the opportunity for commercial activities, and
the need to achieve efficiency savings, the continuing reduction
of scientists and engineers in the civil service is a concern
to Sir David.
52. It may, for example, reduce the number
of technically qualified people who are able to act as "intelligent
customers" when the government procures scientific or engineering
expertise from external organisations. This area was highlighted
in the Gershon Review,[56]
albeit the examples used were of consultancy, legal services and
financial advisory services, rather than scientific or engineering
expertise.
53. The upheaval and uncertainty associated
with these changes may also reduce the appeal of a career in the
civil service. The reduction in the number of scientists and engineers
that are employed within the civil service also reduces the size
and diversity of the talent pool from which future generations
of the SCS will be drawn.
QUESTION 14
What incentives has the Government put in place,
for both researchers and civil servants, to promote interaction
between scientists and policy-makers?
54. There are a number of activities associated
with PSG that will promote interaction between policy makers and
members of the Scientist/Engineer profession. For example, the
requirements associated with the Analysis and Use of Evidence
core skill (which is discussed in more detail in the response
to Question 1) mean that civil servants need to understand evidence
and to engage with relevant experts. Whilst this captures the
interaction between policy makers and scientists it is much wider
in its remit, covering all civil servants at Grade 7 or above
and all forms of evidence.
55. In addition, the Skills Framework associated
with the Scientist/Engineer profession requires members of the
profession to "provide clear advice to technical and non-technical
audiences", and to "understand the needs and constraints
of stakeholder communities". These skills should further
improve the interface between policy makers and members of the
Scientist/Engineer profession.
56. Furthermore, PSG requires that members
of the SCS have "broader experience" of working in government.
This means that they must have had meaningful experience of working
in at least two of the following three areas: policy delivery;
operational delivery; corporate services delivery. It is hoped
that this will increase the proportion of the SCS that have detailed,
hands-on experience of working with scientists and engineers.
57. A number of departments have introduced
local initiatives that seek to improve the interaction between
policy makers and scientists. For example, the Department for
Environment, Food and Rural Affairs has restructured so that,
in some cases, scientists work in the same organisation teams
as policy makers. The Chief Scientific Adviser's Committee (CSAC),
chaired by Sir David King, offers an opportunity for departmental
Chief Scientific Advisers to share their experiences of such initiatives,
thus facilitating the spread of "best practice".
58. The Co-ordination of Analysis and Research
Group (CRAG) was established in December 2004 to promote closer
interaction between policy experts and the full range of analytical
disciplines within government. It is chaired by Sir Brian Bender,
Head of Profession for Policy Delivery, and its members include
the central heads of science and analysis professions and high
level policy and strategy representatives. CRAG is currently facilitating
cross-cutting, evidence-based work on migration, globalisation,
ageing and public policy and service delivery with the aim of
addressing key gaps and tackling problems such as data sharing.
59. The Professional Skills for Government
Initiative (PSG) requires that all middle and senior management
officials, be they policy experts or policy analysts, are
able to demonstrate an understanding of analysis and to use evidence
effectively. As this requirement will form part of the annual
reporting and performance management cycle it should motivate
policy experts to gain a better understanding of basic research
methods and, at the same time, encourage policy analysts to learn
more about the political context in which they are working.
60. Several government departments are experimenting
with multi-disciplinary teams in which analytical specialists
and policy experts are located together and work side by side
on policy development. DfT, for example, has already evaluated
the first phase of its "flexible deployment" trial.
61. Within GSR, a large amount of research
activity is commissioned from external research contractors. Contract
managers are expected to ensure effective interaction between
external research contractors and policy sponsors for projects.
Departmental Heads of Profession for GSR have oversight of these
relationships at a strategic level.
QUESTION 15
What processes are in place for horizon scanning
of issues emerging from the EU? What steps are taken to ensure
that issues that could be of concern to UK scientists or that
could benefit from their input are identified as early as possible?
62. There are a number of existing mechanisms
for Horizon Scanning issues emerging from the EU, although there
is no overarching coordinating framework that draws them all together.
On the regulatory side, the CSA Guidelines
lay out where and how scientific expertise and advice integrates
in policy processes, including guidance on Horizon Scanning. This
integration complements the Better Regulation Executive's (BRE)
guidance on the Regulatory Impact Assessment (RIA) which is an
analysis of the full range of likely impacts of a policy change.
These sets of guidance cover approaches to EU policy areas as
well as other areas.
On anticipating issues where there
is a research need that has a European dimension, the UK negotiates
from a forward looking perspective based on intensive stakeholder
consultation on priorities for forthcoming EU R&D framework
programmes.
CSA Guidelines, BRE Guidance, and
the Professional Skills in Government programme (particularly
elements relating to evidence and analysis) define the skills
and processes required for good use of evidence (including scientific
evidence) generally, which would include consideration of UK negotiating
positions on forthcoming EU legislation. They also offer specific
guidance on S&T horizon scanning to encourage the identification
of S&T issues where early S&T inputs would be beneficial.
The CSA Guidelines give specific
advice on seeking out wider S&T inputs from external sources,
for example engaging learned societies and professional bodies
to access a wide range of specialists.
The CSA guidelines recommend stepping
through a number of processes to achieve early warning of S&T
issues, such as seeking external advice when an issue raises questions
that exceed the expertise of in-house staff; when responsibility
for a particular issue cuts across government departments; when
there is considerable uncertainty and a wide range of expert opinion
exists; when there are potentially significant implications for
sensitive areas of public policy; when independent analyses could
potentially strengthen public confidence in scientific advice
from government.
In terms of seeking EU expertise,
the CSA guidelines state that "where appropriate, consideration
should also be given to inviting experts from outside the UK,
for example those from European or international advisory mechanisms
. . . Where the issue falls within European Community competence,
or is likely to affect intra-community trade, particular attention
should be paid to encouraging an evidence-based approach for Community
decision making. This may involve contributing to Community level
scientific committees, briefing the Commission on developing expert
opinion, and exchange visits by scientific experts from other
Member States . . . ". A number of UK laboratories are Community
reference laboratories, and play a leading role in setting the
EU agenda (for example on testing protocols) in specific areas
of science. UK experts are involved in advisory roles for UK representation
on EC standing committees with strong science components, such
as the Standing Committee on the Food Chain and Animal Health.
The UK has numerous independent expert
advisory committees, many of which give advice on EU issues. Their
role, whilst not formally defined as such, has a strong
element of Horizon Scanning for S&T issues.
Within the EC there are numerous
specific Foresight and Horizon Scanning activities. DG Research
coordinate many of these activities, having set up the "Science
and Technology Foresight" unit in January 2001. Apart
from "embedded" S&T Foresight in multilateral research
infrastructures like CERN and EMBL, Foresight and supporting activities
have been developed principally by the European Parliament
and the European Parliamentary Technology Assessment Network,
EPTA. EPTA networks the Parliamentary Technology Assessment bodies
of Europe, which include the UK's Parliamentary Office of Science
and Technology (POST). Other significant S&T Foresight activities
in Europe include the Institute for Prospective Technological
Studies (IPTS) in Seville, which provides techno-economic analysis
to support European decision-makers. The European Science Foundation
(ESF) recently introduced its "Forward Look" to assist
Europe's scientific community to develop medium to long term views
and analyses of future research developments in multidisciplinary
topics, and interact with policy makers from member organisations.
DG Research's Science and Technology
Foresight unit also acts to implement the actions relevant to S&T
Foresight under the "Support for the coherent development
of S&T policies" of the 6th EU Framework Programme for
Research. The 6th Framework Programme also has a Foresight element
under the "coordination of policies" work programme.
DG Research states that "it is apparent
that Foresight activities themselves have not yet reached the
same state of integration and coherence at EU level as many other
policy fields". This is likely to be a reflection of the
level of development of Foresight and Horizon Scanning skills
across Europe generally, and is reflected in the UK too in the
relative youth of guidelines for embedding such tools and
skills, such as the CSA Guidelines and PSG.
OST's Foresight Directorate carries
out in-depth analyses looking ahead at least 10 years (and often
further) at the future implications of key areas, selected on the
basis that they are driven by science and technology, have outcomes
that can be influenced (ie the work adds value), are not covered
by work carried out elsewhere, require an inter-disciplinary
approach, and command support from departments.
The OST Horizon Scanning Centre (HSC),
which commenced activities in November 2004 (under a commitment
in the HMT/DTI/DfES 10 year science and innovation investment
framework) takes a global perspective of future trends and issues,
in the context of impacts on the UK. European issues are encompassed
in the context of the HSC's remit on Horizon Scanning evidence,
which is to provide a higher-level strategic context to the Horizon
Scanning activities of other departments.
It also has a "best practice"
workstream, which looks across Government and widely across the
private sector, in order to seek out, test, and promote good practice
in Horizon Scanning. The HSC is also piloting a coaching programme
to raise capacity to undertake Horizon Scanning across Government.
The HSC also has a programme of strategic S&T Horizon Scanning,
where issues are prioritized by impacts on the UK. This provides
a high-level strategic context to the Horizon Scanning activities
of other departments. The HSC has created an active "Futures
Analysts Network" which puts on events on specific Horizon
Scanning techniques or topics, some of which raise awareness across
Government of specific important S&T areas. The network draws
deliberately on both public and private sector expertise (local
and international) and membership in order to maximize synergies
and value.
QUESTION 16
How is the effectiveness of cross-departmental
policies that rely on or have implications for science evaluated?
63. The 2002 Cross-Cutting Review of Science
and Research (CCR) represented the first major review of the effectiveness
of government's use and management of science and research underpinning
the development of government policies.
64. Much progress has been made since the
Cross-Cutting Review, including the development of the 10-year
Framework for Science and Innovation published in 2004. OST,
SI (DTI) and DfES report annually to HM Treasury on progress against
measures and indicators contained in the 10-year Framework.
65. Most Departments have completed and
published science and innovation strategies, or are very close
to doing so. In providing their S&I Strategies, Departments
explain their systems for identifying and meeting the strategic
public policy challenges they face. Departmental Chief Scientific
Advisers (DCSAs) have been appointed to nearly all the Departments
where they are needed, and are making positive impacts on the
quality of policy-making as their roles develop.
66. Following the S&I Strategies, OST's
Science Review team conducts more detailed analysis and evaluation
of how individual departments manage their science and use science
in support of policy.
67. In response to the Ministerial Committee
on Science and Innovation (SI), in September 2005, OST led the
identification of three Grand Challenges that bring scientific
evidence to bear in cross-government policies. Each challenge
is currently being developed by lead departments, who will report
back to SI.
68. In 2005-06, HM Treasury's Comprehensive
Spending Review (CSR) has further emphasised crosscutting policy
priorities requiring input from a wide range of analytical resources
through inter-Departmental collaboration, thus reinforcing the
importance of improved and effective cross-departmental working.
There is increasing co-ordination and prioritisation of cross
government policies by the Coordination of Research and Analysis
Group (CRAG) and the Chief Scientific Advisers Committee (CSAC).
69. OST and CRAG have been working closely
with HM Treasury to develop analysis and evidence to underpin
CSR objectives that should support future evaluation of cross-departmental
policies.
QUESTION 17
What processes are in place to ensure that once
a policy has been decided on and/or implemented, it is re-evaluated
as new evidence emerges?
70. The Spending Review process led by the
Treasury requires departments to review and justify the evidence
base for their investment across policy and programmes on a regular
basis.
71. At the individual policy level, the
Better Regulation Executive's guidance on regulatory impact assessments
states that officials must be clear on how they intend to review
new policies and that a post-implementation review should:
establish a baseline and include
success criteria against which the effectiveness of the policy
in delivering the desired outcome can be assessed,
describe how and when the review
will take place and say which elements of the policy it will cover,
check whether the policy objective
has been met, whether the impacts were as expected and whether
there have been unintended consequences; and
include criteria for modifying or
replacing the policy if necessary.
72. GSR members are expected (as set out
in the GSR competency framework) to support this process by fulfilling
a research intelligence role. This involves keeping abreast of
research evidence in their policy area and ensuring policy colleagues
are briefed on emerging evidence. They also work in partnership
with other social science analysts to monitor and evaluate policy
and delivery.
QUESTION 18
What steps are taken to ensure that the results
of pilots and trials are incorporated into policy development
and what roles do the Government Chief Scientific Adviser and
Chief Government Social Researcher play in making sure that this
happens?
73. The Adding It Up report mentioned
in the Chief Government Social Researcher's oral evidence called
for more and better use of pilots to test the impact of policies
before implementation. To support this, a review of government
pilots was commissioned by the Government Social Research Unit
and was published in December 2003 (Cabinet Office 2003: Trying
it Out: the role of Pilots in Policy-Making). This report set
out a number of recommendations on the appropriate role and properties
of pilots, pre-conditions for success, appropriate methods and
practices, and the use of results. While the Government Social
Research Unit has no continued role in monitoring pilot activity,
it continues to provide advice, support and training to departments
in the design and execution of pilot evaluations.
QUESTION 19
Should research-based advice be published once
a policy-decision has been taken?
74. Section 35 of the Freedom of Information
Act (FOIA) recognises that there is public interest in ensuring
that there is a space within which Ministers and officials are
able to formulate and develop policy options freely and frankly
and that some information and advice generated in the process
should therefore be exempt from release. Information falling within
the scope of the exemption is subject to the public interest test.
It is necessary for departments to consider the balance between
ensuring greater openness and transparency of the policy making
process, to better inform the public about the way Government works,
and protecting the policy making process so as to provide for
effective government. Government departments must also pay regard
to the potential application of other exemptions, for example
those relating to national security or commercial interests and
apply the appropriate tests within the legislation before releasing
or withholding information.
75. Whilst a policy is in its formulation
or development stage this section of the Act applies to statistical
and social research information as well as to advice. However,
once a policy decision has been taken, this section of the Act
requires particular consideration be given to the public interest
in disclosing factual information. This recognises that there
is a particular public interest in making publicly available evidence
that supports government policy decision-making. However there
may be situations in which the factual and statistical information
provided is integral to the advice process and it may not be appropriate
(but this would be subject to a public interest and/or other tests)
to provide this information in response to a request under FOIA.
24 March 2006
53  http://psg.civilservice.gov.uk/.
54  http://www.sciencecouncil.org/chartered_scientist/licensed_bodies.html
55  Specifically paragraphs 249 to 258, inclusive.
56  "There is little evidence that the procurement of professional services (for example consultancy, legal services, financial advisory services) is managed to ensure value for money." Third bullet under paragraph 3.24 of "Releasing resources to the front line. Independent Review of Public Sector Efficiency", Sir Peter Gershon, CBE, July 2004.