Written evidence
submitted by the National Audit Office
SUMMARY
The Government has set out in its 2010 Local Growth
White Paper[103]
its ambition for locally-driven growth, encouraging business investment
and promoting economic development. The Department for Communities
and Local Government has said that, when at its most effective,
regeneration can be at the heart of this approach, supporting
local leaders to strengthen their communities, drive economic growth
and support people back into work.[104]
In examining the Government's approach, the Select
Committee is considering what lessons should be learnt from past
and existing regeneration projects and how success should be assessed
in future.
In a period of relative financial constraint, it
is more vital than ever that funds available to support regeneration
are targeted to deliver the maximum benefit.
From our work on regeneration over the last decade,
the National Audit Office has identified two key principles that
should help guide future decision-making on regeneration projects
and programmes at all spatial levels and inform future assessments
of success.
Regeneration programmes and projects need to be:
(a) Appraised
rigorously on the basis of robust evidence,
so that funds can be targeted on those interventions likely to
maximise benefits; and
(b) Evaluated
thoroughly so that the outcomes of interventions
are quantified and that lessons from projects, good and bad, are
fed back into the appraisal process.
TO ACHIEVE RIGOROUS APPRAISAL
Objectives
and outcomes should be stated clearly and well communicated, reducing
as far as possible the potential for confusion and ambiguity.
In turn,
there should be a clear, logical link between the objectives sought
and the interventions or projects funded.
Where
there are a number of partners involved, clarity is key. Objectives
and outcomes should be agreed or developed with partners, and
priorities should be aligned with those of partners. Roles and
responsibilities of different parties also need to be clearly
stated and agreed.
The
effectiveness of appraisal is dependent on the quality of information
used. The National Audit Office has examined a number of regeneration
projects where poor or inadequate market intelligence led to unrealistic
or over-optimistic expectations of costs and benefits.
Standardised
approaches to appraisal (and evaluation) of programmes and projects
can help develop the evidence base for future regeneration activity.
Simple and streamlined appraisal can also help to reduce delays in getting projects started, delays that may deter potential funders.
TO ACHIEVE THOROUGH EVALUATION AND LEARNING
Evaluation
processes should be embedded
within a clear framework of quantifiable
and measurable outcomes.
Evaluation
needs to be continual,
especially where projects slip or are changed significantly.
Keeping
evaluation measures clear and simple increases the likelihood
of gathering data that is replicable and therefore comparable
across interventions, contributing to the overall evidence base.
Formal
and structured mechanisms or forums can facilitate sharing of
good practice between projects and localities.
DETAILED EVIDENCE
Introduction
1. The Government has signalled a new environment
for the operation of regeneration programmes and projects, in
which local communities and businesses determine local economic
priorities and lead on commissioning the regeneration projects
that will support those priorities. In a period of relative financial
constraint there is a premium on ensuring, regardless of the spatial
level at which commissioning and delivery arrangements are organised,
that public funds are targeted to deliver the maximum benefit.
2. The National Audit Office has, through its
value for money reports and other work, including Independent
Performance Assessments and Independent Supplementary Reviews
of Regional Development Agencies, developed a significant body
of work on regeneration. Details of our past reports are shown
in Appendix 1. We have consistently found that project appraisal
and evaluation are vital. They ensure that actions are well thought
through and framed by evidence on what works and what does not,
and that lessons and knowledge from projects and programmes are
fed back systematically to improve future decision-making.
Appraisal
3. Appraisal helps decision makers understand
and challenge the business case for an intervention, and the potential
risks involved. It provides assurance that public sector investment
can be justified against two basic tests:
Are
there better ways to achieve the project objectives?
Are
there better uses for these resources?
4. The purpose of an appraisal is therefore to
provide a rigorous and thorough assessment of a project. It should assess whether a proposal is worthwhile by setting out, for example: clear identification of a market failure; a cost-benefit based assessment of whether the proposal is feasible (and better than the alternatives); and the nature and scale of benefits and beneficiaries. The principles and approach of good appraisal
are outlined in the Treasury's Green Book.
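By way of a simple illustration (a sketch of the Green Book approach rather than material drawn from the reports cited here), options are compared by discounting their expected benefits and costs to a net present value:

\[ \mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1 + r)^{t}} \]

where \(B_t\) and \(C_t\) are the benefits and costs expected in year \(t\) and \(r\) is the discount rate (the Green Book's standard real rate is 3.5%). On this basis a proposal is worthwhile only if its net present value is positive and higher than that of the alternatives, including doing nothing.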
FINDINGS ON APPRAISAL IN REGENERATION PROJECTS FROM THE NATIONAL AUDIT OFFICE'S WORK
Understanding and articulating
outcomes
5. When the overall regeneration objectives and
outcomes expected from a programme are not well defined and communicated,
there is a risk of overlap and inefficient use of funds. The National
Audit Office reported in 2009[105]
that the Department for Communities and Local Government launched
three key initiatives to regenerate former coalfield areas at
different times, each with its own aims and objectives. However,
there was no overall strategy for coordinating these three initiatives
and the benefit of the interventions was not maximised because
of poor integration between the initiatives.
6. There should be a clear and logical link between
the outcomes that are being sought from a regeneration programme
and the interventions (ie the specific projects or initiatives)
that are developed to deliver those outcomes. This is easier when
those outcomes are carefully specified. Where they are not, it
will be more difficult to arrive at transparent decisions on the
prioritisation of projects competing for limited funding. For
example, the National Audit Office reported in 2007[106]
that in the initial phase of the Housing Market Renewal programme,
many of the projects funded were "off the shelf" schemes
that had previously been identified, but had not been implemented
due to lack of funding. Many of these interventions were not clearly
linked to solving the problems that the programme was meant to
address, ie housing market renewal, and the investment consequently contributed more towards other, broader objectives such as the
Decent Homes target.
7. Clarification of priorities and objectives
can therefore help target funding on the projects that are most
likely to make a strong contribution. Regional Development Agencies
went through a reprioritisation process in response to reductions
in funding and the need to consider the relevance of their activity
in the downturn.[107]
Some Agencies developed systems to help prioritise projects in
this context. For example, the East of England Development Agency[108]
developed a set of criteria to guide its prioritisation process
that included: evidence of market failure; significant Gross Value Added returns at or above benchmark levels; the ability to attract significant leverage; and alignment with other strategic drivers such as transport or pan-regional growth agendas.
Partners' buy-in to objectives and roles
8. Successful and cost-efficient interventions
require all the bodies involved in the delivery chain to be clear
on what their roles are and what is expected of them. The National
Audit Office found in its 2005[109]
report on affordable housing, for example, that the building of
affordable housing in areas of high demand resulted in a complex
delivery chain because of the interrelationships between organisations
and processes at a local and regional level. The resulting lack of certainty
led to inconsistencies and confusion, which in turn led to challenges
and appeals by developers and costly delays.
9. Clarity on roles and expectations is also important
for communication with external bodies and potential investors.
The Thames Gateway Programme, for example, involved several forms
of partnerships at different spatial levels (regional, sub-regional
and local) to help coordinate investment. The National Audit Office
found that, whilst this allowed flexibility to adapt to local
circumstances, the complexity of decision-making and delivery
chains made it difficult for potential investors, developers and
Government to understand the programme and integrate investment
as a whole.[110]
Quality of information
10. The National Audit Office has examined a
number of regeneration projects where poor or inadequate market
intelligence led to unrealistic or over-optimistic expectations
of costs and benefits. For example:
The
National Audit Office's 2010 report on the Regional Development
Agencies' support to physical regeneration projects[111]
found that, in general, appraisals made clear how projects contributed
to regional priorities but were less robust in the technical and
financial appraisal of projects. Whilst physical regeneration
projects supported by the Regional Development Agencies had added
to regional growth, weaknesses in project appraisals meant Agencies
were unable to demonstrate that they had consistently chosen the
right projects to maximise economic growth and value for money. Figure 1 presents the weaknesses
identified by the Central Project Review Group,[112]
which was responsible for Regional Development Agency investments
above £10 million.
Figure 1
WEAKNESSES IN REGIONAL DEVELOPMENT AGENCY
APPRAISALS AS IDENTIFIED BY THE CENTRAL PROJECT REVIEW GROUP
Market Failure | In a number of cases the Agencies have "struggled" to identify market failures to justify action, or where they have, the proposed intervention bears little relation to the market failure identified.
Options Analysis | Many projects did not contain a comprehensive value for money analysis or proper adjustment of the forecast outputs to show those directly attributable to the Agency, and which represent additional jobs created. This makes it difficult to judge whether the preferred option represents value for money.
Sensitivity Analysis | More extensive sensitivity analysis is required in initial appraisals to allow for the risks from rising costs.
Optimism Bias | Little evidence of optimism bias being allowed for, despite the clear identification of many risks.
Risk Mitigation | There was not always an adequate consideration of risk, or of risk mitigation.
Source: Central Projects Review Group
The
National Audit Office's 2008 report on progress in regenerating
the Greenwich Peninsula[113]
found that the pace of residential development had been slower
than forecast. This, coupled with an assumed scenario of "residential value growth outstripping inflation in every year for almost 20 years", resulted in financial returns for the public sector
being lower than forecast.
Shared approaches to appraisal
11. The development of shared systems or approaches
can help embed the principles of robust appraisal across a number
of bodies. The Regional Development Agencies based their appraisal
processes on a single source of guidance: The Single Programme
Appraisal Guidance (which was issued in July 2003 and was
replaced by Guidance for Regional Development Agencies in Appraisal,
Delivery and Evaluation in March 2008).[114]
This guidance drew, in turn, on existing good practice, such as
the Treasury's Green Book.[115]
Evaluation
12. Evaluation is a retrospective analysis of
a programme or project to assess how successful it has been and
what lessons can be learnt for the future. Effective evaluation
is an integral part of good project and programme management and
helps build the evidence base around "what works".
FINDINGS ON EVALUATION IN REGENERATION PROJECTS FROM THE NATIONAL AUDIT OFFICE'S WORK
A framework of clear and measurable outcomes
13. In order to assess the effectiveness of a
programme, commissioners need to be able to measure how it has
delivered against the objectives they set for it. It is difficult,
however, to evaluate the effectiveness of programmes if measures
of success are ill-defined or not quantifiable. For example, the
National Audit Office's 2009 report on Regenerating the English
Coalfields[116]
found that clear and measurable objectives had not been established
for the Coalfields Enterprise Fund, nor had the contribution of
the Fund been evaluated. This posed the risk that the Fund did
not address the particular venture capital needs of the coalfield
communities.
14. A good evaluation framework will develop
a more accurate understanding of the effectiveness of an intervention
by measuring the net impact (ie disentangling benefits from other
initiatives or those that would have happened anyway without the
intervention). The National Audit Office's 2011 report into the
preparations for the London 2012 Olympic Games[117]
found that the Government Olympic Executive was developing an
evaluation framework for assessing the impact of the Games. We
recommended that it should set baselines against which to measure
whether the expected legacy benefits are achieved and that the
evaluation framework should set out how the effects of the Games
will be separated out from "business as usual" activities.
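As a simplified illustration (the adjustment factors below reflect common practice in impact evaluation rather than any specific framework cited here), net impact is typically estimated by stripping out of the gross change what would have happened anyway and what is merely moved from elsewhere:

\[ \text{Net additional impact} = \text{Gross impact} \times (1 - \text{deadweight}) \times (1 - \text{displacement}) \times (1 - \text{leakage}) \times \text{multiplier} \]

Deadweight is the share of the outcome that would have occurred without the intervention, displacement the share drawn away from elsewhere in the target area, leakage the share of benefits accruing outside the target area, and the multiplier captures knock-on effects through local supply chains and incomes. Establishing baselines before an intervention begins, as recommended for the 2012 Games evaluation, is what makes these adjustments measurable.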
A culture of evaluation
15. Poor or inadequate evaluation inhibits learning
and poses risks to the delivery of future projects. The National
Audit Office's 2003 report, Success in the Regions[118]
found that regional stakeholders believed the Regional Development
Agencies were not evaluating past projects adequately and were
concerned that experience from past projects may be lost. Better
evaluation would help to identify effective interventions and
provide early warnings of risks to be managed in future projects.
16. Embedding a culture of evaluation enables
an organisation to monitor its effectiveness and drive improvement.
The National Audit Office's 2010 Independent Supplementary Review
of Advantage West Midlands,[119]
for example, reported that the Agency placed significant emphasis
on identifying its impact and improving its return on investment.
There was clear communication of the anticipated return on investment
from projects, and evaluations were in place to assess the impact.
This was designed in turn to feed lessons learned from evaluations
into investment planning.
Continual and consistent evaluation and benchmarking
17. Projects should be monitored and evaluated
throughout their life to ensure they are on course to return the
expected benefits and that objectives are met. The National Audit
Office's 2009 report on Regenerating the English Coalfields[120]
identified that, despite the Department for Communities and Local
Government doubling the number of sites and more than doubling
the associated expenditure in the light of the remediation required,
the Department did not significantly change the target benefits,
and that achieving those benefits in full would take twice the
ten-year timescale of the original programme.
18. Using consistent evaluation measures across
a range of projects facilitates the comparison of outputs of different
projects to help prioritise future funding. The Regional Development
Agencies' guidance on evaluation, for example, stipulated that
Agencies should always report the net Gross Value Added impact
of an intervention.[121]
Similarly, the National Audit Office's report on Regional Selective
Assistance and Enterprise Grants[122]
found that the then Department for Trade and Industry undertook
a series of evaluations using a consistent methodology that allowed
the cost effectiveness of the programme to be tracked over time.
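A hypothetical example of the kind of consistent metric this allows (the figures are illustrative only, not drawn from the reports cited):

\[ \text{Cost per net additional job} = \frac{\text{public expenditure}}{\text{net additional jobs}} = \frac{£20\text{m}}{800} = £25{,}000 \]

Calculated on the same basis across projects and successive evaluations, a ratio of this kind allows commissioners to compare interventions directly and to see whether cost effectiveness is improving or deteriorating over time.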
19. If consistent metrics are in place, there
will be opportunities to use benchmarking to compare achievements and to learn from others. However, this type of activity
has not always been exploited fully in the past. The National
Audit Office's 2010 Independent Supplementary Reviews of Regional
Development Agencies[123]
found that only a small number of Agencies
had developed a systematic approach to benchmarking. The Agencies
were using inconsistent benchmarks, resulting in unreliable comparisons
between Agencies and across projects.
Sharing learning
20. Learning from others, whether through formal
benchmarking or more informal networking, can help improve performance
and promote efficiency. The National Audit Office's Independent
Supplementary Review of the East of England Development Agency,[124]
for example, reported that the Agency had identified financial
savings from sharing good practice in its review of Business Link
contracts. The Agency visited other regions and drew on the performance
data of others, which informed a new Business Link contract that
achieved a 5% efficiency saving in its back office function.
21. A lack of networking opportunities reduces
the likelihood of good practice being spread between organisations.
There is a key role for bodies with a strategic role to facilitate
such networking. The National Audit Office's 2009 report on Regenerating
the English Coalfields[125]
found that the Department for Communities and Local Government
did not play a sufficiently strong role in bringing together the
elements of the Programme. As a result, opportunities for smarter
working locally and across Whitehall to coordinate physical regeneration
with enterprise and skills initiatives had been missed.
March 2011
APPENDIX
Figure 2
EXAMPLES OF REGENERATION ACTIVITY EXAMINED
BY THE NATIONAL AUDIT OFFICE
Title of report | Subject | Year of publication
Preparations for the London 2012 Olympic and Paralympic Games: Progress report February 2011 | Examined progress of preparations for the London 2012 Olympic and Paralympic Games, looking at the Olympic Delivery Authority's construction programme, how the Government is coordinating the Olympics programme, the legacy from the Games and the cost of the Games. | 2011
Independent Supplementary Review of Regional Development Agencies | Assessment of the effectiveness of prioritisation, improvement planning, and performance management and evaluation of Regional Development Agencies outside of London. | 2010
Regenerating the English Regions: Regional Development Agencies' support to physical regeneration projects | Examined how well Regional Development Agencies support physical regeneration projects and, in particular, how well: priorities are determined; funds are targeted; projects are appraised for value for money; outcomes are evaluated; and lessons are learned. | 2010
The Decent Homes Programme | Examined the achievements of the Decent Homes Programme, in particular progress towards targets, impacts and costs, and the Department's management of the Programme. | 2010
Regenerating the English Coalfields | Examined the progress and impact of the Department for Communities and Local Government's three specific initiatives to tackle coalfields' regeneration in England. | 2009
The Regeneration of the Greenwich Peninsula: A Progress Report | Examination of the regeneration of the Greenwich Peninsula looking at progress of redevelopment, forecast financial returns to the taxpayer, and governance, accountability and risk management. | 2008
Housing Market Renewal | Examined whether the Housing Market Renewal Programme was on course to meet its objectives of addressing the problems of low demand housing markets in the pathfinder neighbourhoods. | 2007
The Thames Gateway: Laying the Foundations | Examination of whether central government had laid solid foundations for delivering its ambitions for the Gateway and, in particular, whether the risks to success have been identified by the Department for Communities and Local Government and are being actively managed. | 2007
Independent Performance Assessment of Regional Development Agencies | Examined themes including: capacity, performance management and achievement of Regional Development Agencies outside London. | 2006 and 2007
Mind the gap: tackling disparities in regional economic performance | Review of six factors that Government believes influence the economy's output and how they affect regional economic performance, related government action and relevant identifiable public expenditure. | 2007
Enhancing urban green space | Examination of the impact of urban green space initiatives including the identification of barriers and the progress made. | 2006
Building more affordable homes: Improving the delivery of affordable housing in areas of high demand | A joint study with the Audit Commission that looked at the delivery chain associated with the delivery of affordable housing. | 2005
Getting Citizens Involved: Community Participation in Neighbourhood Renewal | Examined the extent to which the Single Community Programme is helping to get deprived communities involved in neighbourhood renewal, influencing local decisions and shaping local policy making. | 2004
Success in the Regions | Considered practical measures that Regional Development Agencies and departments can take to ensure sustained success. | 2003
Regional Grants in England | Examination of the operation of the Regional Selective Assistance and Enterprise Grant schemes and their effects on identified economic problems. | 2003
NOTES
1. Reports are available from the National Audit Office's
website: www.nao.org.uk
Source: National Audit Office
103 HM Government, Local growth: realising every place's potential, Cm 7961, Department for Business, Innovation and Skills, October 2010
104 Department for Communities and Local Government, Regeneration to enable growth: What Government is doing in support of community-led regeneration, January 2011
105 Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009
106 Comptroller and Auditor General, Housing Market Renewal, HC 20 Session 2007-08, National Audit Office, November 2007
107 National Audit Office, Independent Supplementary Reviews of the Regional Development Agencies, June 2010
108 National Audit Office, Independent Supplementary Review: East of England Development Agency, May 2010
109 Comptroller and Auditor General, Building more affordable homes: Improving the delivery of affordable housing in areas of high demand, HC 459 Session 2005-06, National Audit Office and Audit Commission, December 2005
110 Comptroller and Auditor General, The Thames Gateway: Laying the Foundations, HC 526 Session 2006-07, National Audit Office, May 2007
111 Comptroller and Auditor General, Regenerating the English Regions: Regional Development Agencies' support to physical regeneration projects, HC 214 Session 2009-10, National Audit Office, March 2010
112 The Central Project Review Group is an inter-departmental panel that appraises and approves Regional Development Agency funding for projects that cost over £10 million or are considered novel or contentious
113 Comptroller and Auditor General, The Regeneration of the Greenwich Peninsula: A Progress Report, HC 338 Session 2007-08, National Audit Office, July 2008
114 Comptroller and Auditor General, Regenerating the English Regions: Regional Development Agencies' support to physical regeneration projects, HC 214 Session 2009-10, National Audit Office, March 2010
115 Department for Business Enterprise and Regulatory Reform, Guidance for Regional Development Agencies in Appraisal, Delivery and Evaluation, March 2008
116 Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009
117 Comptroller and Auditor General, Preparations for the London 2012 Olympic and Paralympic Games: Progress report February 2011, HC 756 Session 2010-11, National Audit Office, February 2011
118 Comptroller and Auditor General, Success in the Regions, HC 1268 Session 2002-03, National Audit Office, November 2003
119 National Audit Office, Independent Supplementary Review: Advantage West Midlands, May 2010
120 Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009
121 Department for Business, Innovation and Skills, Regional Development Agency Evaluation: Practical Guidance on Implementing the Evaluation Framework, December 2009
122 Comptroller and Auditor General, Regional Grants in England, HC 702 Session 2002-03, National Audit Office, June 2003
123 National Audit Office, Independent Supplementary Reviews of the Regional Development Agencies, June 2010
124 National Audit Office, Independent Supplementary Review: East of England Development Agency, May 2010
125 Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009