Session 2010-11
Regeneration

Regen 29

Written evidence submitted by the National Audit Office

The Government has set out in its 2010 Local Growth White Paper [1] its ambition for locally-driven growth, encouraging business investment and promoting economic development. The Department for Communities and Local Government has said that, at its most effective, regeneration can be at the heart of this approach, supporting local leaders to strengthen their communities, drive economic growth and support people back into work. [2]

In examining the Government’s approach, the Select Committee is considering what lessons should be learnt from past and existing regeneration projects and how success should be assessed in future. In a period of relative financial constraint, it is more vital than ever that the funds available to support regeneration are targeted to deliver the maximum benefit.

From our work on regeneration over the last decade, the National Audit Office has identified two key principles that should help guide future decision-making on regeneration projects and programmes at all spatial levels and inform future assessments of success. Regeneration programmes and projects need to be:

a. Appraised rigorously on the basis of robust evidence, so that funds can be targeted on those interventions likely to maximise benefits; and

b. Evaluated thoroughly, so that the outcomes of interventions are quantified and lessons from projects – good and bad – are fed back into the appraisal process.

To achieve rigorous appraisal

· Objectives and outcomes should be stated clearly and well communicated, reducing as far as possible the potential for confusion and ambiguity.

· There should be a clear, logical link between the objectives sought and the interventions or projects funded.

· Where a number of partners are involved, clarity is key. Objectives and outcomes should be agreed or developed with partners, and priorities should be aligned with theirs.
Roles and responsibilities of the different parties also need to be clearly stated and agreed.

· The effectiveness of appraisal depends on the quality of the information used. The National Audit Office has examined a number of regeneration projects where poor or inadequate market intelligence led to unrealistic or over-optimistic expectations of costs and benefits.

· Standardised approaches to the appraisal (and evaluation) of programmes and projects can help develop the evidence base for future regeneration activity. Simple and streamlined appraisal can also help reduce the delays to projects getting started that may deter potential funders.

To achieve thorough evaluation and learning

· Evaluation processes should be embedded within a clear framework of quantifiable and measurable outcomes.

· Evaluation needs to be continual, especially where projects slip or are changed significantly.

· Keeping evaluation measures clear and simple increases the likelihood of gathering data that is replicable and therefore comparable across interventions, contributing to the overall evidence base.

· Formal and structured mechanisms or forums can facilitate the sharing of good practice between projects and localities.

Detailed Evidence

Introduction

1 The Government has signalled a new environment for the operation of regeneration programmes and projects, in which local communities and business determine local economic priorities and lead on commissioning the regeneration projects that will support those priorities. In a period of relative financial constraint there is a premium on ensuring, regardless of the spatial level at which commissioning and delivery arrangements are organised, that public funds are targeted to deliver the maximum benefit.
2 The National Audit Office has, through its value for money reports and other work, including Independent Performance Assessments and Independent Supplementary Reviews of Regional Development Agencies, developed a significant body of work on regeneration. Details of our past reports are shown in Appendix 1. We have consistently found that project appraisal and evaluation are vital. They ensure that actions are well thought through and framed by evidence on what works and what does not, and that lessons and knowledge from projects and programmes are fed back systematically to improve future decision-making.

Appraisal

3 Appraisal helps decision makers understand and challenge the business case for an intervention, and the potential risks involved. It provides assurance that public sector investment can be justified given two basic tests:

· Are there better ways to achieve the project objectives?

· Are there better uses for these resources?

4 The purpose of an appraisal is therefore to provide a rigorous and thorough assessment of a project. It should assess whether a proposal is worthwhile by providing, for example: clear identification of a market failure; a cost-benefit based assessment of whether the proposal is feasible (and better than the alternatives); and the nature and scale of benefits and beneficiaries. The principles and approach of good appraisal are outlined in the Treasury’s Green Book.

Findings on appraisal in regeneration projects from the National Audit Office’s work

Understanding and articulating outcomes

5 When the overall regeneration objectives and outcomes expected from a programme are not well defined and communicated, there is a risk of overlap and inefficient use of funds. The National Audit Office reported in 2009 [1] that the Department for Communities and Local Government launched three key initiatives to regenerate former coalfield areas at different times, each with its own aims and objectives.
However, there was no overall strategy for coordinating the three initiatives, and the benefit of the interventions was not maximised because of poor integration between them.

6 There should be a clear and logical link between the outcomes that are being sought from a regeneration programme and the interventions (i.e. the specific projects or initiatives) that are developed to deliver those outcomes. This is easier when those outcomes are carefully specified. Where they are not, it will be more difficult to arrive at transparent decisions on the prioritisation of projects competing for limited funding. For example, the National Audit Office reported in 2007 [2] that in the initial phase of the Housing Market Renewal programme, many of the projects funded were ‘off the shelf’ schemes that had previously been identified but had not been implemented for lack of funding. Many of these interventions were not clearly linked to solving the problems that the programme was meant to address, i.e. housing market renewal, and the subsequent investment contributed more towards other, broader objectives such as the Decent Homes target.

7 Clarification of priorities and objectives can therefore help target funding on the projects that are most likely to make a strong contribution. Regional Development Agencies went through a reprioritisation process in response to reductions in funding and the need to consider the relevance of their activity in the downturn. [3] Some Agencies developed systems to help prioritise projects in this context. For example, the East of England Development Agency [4] developed a set of criteria to guide its prioritisation process that included: evidence of market failure; significant Gross Value Added returns at or above benchmark levels; the ability to attract significant leverage; and alignment with other strategic drivers such as transport or pan-regional growth agendas.
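The cost-benefit test described in paragraph 4 can be illustrated with a short sketch. The project, its costs and its benefit profile below are entirely hypothetical and are not drawn from any NAO report; the 3.5 per cent discount rate, however, is the social time preference rate set out in the Treasury’s Green Book.

```python
# Illustrative Green Book-style cost-benefit test for a hypothetical
# regeneration project. All monetary figures are invented for illustration.

STPR = 0.035  # Green Book social time preference rate (3.5 per cent)

def present_value(cashflows, rate=STPR):
    """Discount a list of annual values (year 0 first) to present value."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(cashflows))

# Hypothetical project: £10m cost up front, £1.5m of benefits a year
# for ten years starting in year 1.
costs = [10.0]
benefits = [0.0] + [1.5] * 10

pv_benefits = present_value(benefits)
pv_costs = present_value(costs)
bcr = pv_benefits / pv_costs   # benefit-cost ratio: worthwhile if > 1
npv = pv_benefits - pv_costs   # net present value: worthwhile if > 0

print(f"BCR: {bcr:.2f}, NPV: £{npv:.2f}m")
```

A ratio above 1 answers only the first of paragraph 3’s two tests; comparing the ratio against alternative uses of the same funds answers the second.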
Partners’ buy-in to objectives and roles

8 Successful and cost-efficient interventions require all the bodies involved in the delivery chain to be clear on what their roles are and what is expected of them. The National Audit Office found in its 2005 report on affordable housing [1] , for example, that the building of affordable housing in areas of high demand involved a complex delivery chain because of the interrelationships between organisations and processes at local and regional level. The resulting lack of certainty led to inconsistencies and confusion, which in turn led to challenges and appeals by developers and costly delays.

9 Clarity on roles and expectations is also important for communication with external bodies and potential investors. The Thames Gateway Programme, for example, involved several forms of partnership at different spatial levels (regional, sub-regional and local) to help coordinate investment. The National Audit Office found that, whilst this allowed flexibility to adapt to local circumstances, the complexity of the decision-making and delivery chains made it difficult for potential investors, developers and Government to understand the programme and integrate investment as a whole. [2]

Quality of information

10 The National Audit Office has examined a number of regeneration projects where poor or inadequate market intelligence led to unrealistic or over-optimistic expectations of costs and benefits. For example:

· The National Audit Office’s 2010 report on the Regional Development Agencies’ support to physical regeneration projects [1] found that, in general, appraisals made clear how projects contributed to regional priorities but were less robust in the technical and financial appraisal of projects.
Whilst the physical regeneration projects supported by the Regional Development Agencies had added to regional growth, weaknesses in project appraisals meant the Agencies were unable to demonstrate that they had consistently chosen the right projects to maximise economic growth and value for money. Figure 1 presents the weaknesses identified by the Central Project Review Group [2] , which was responsible for Regional Development Agency investments above £10 million.
· The National Audit Office’s 2008 report on progress in regenerating the Greenwich Peninsula [3] found that the pace of residential development had been slower than forecast. This, coupled with an assumed scenario of ‘residential value growth outstripping inflation in every year for almost 20 years’, resulted in financial returns for the public sector being lower than forecast.

Shared approaches to appraisal

11 The development of shared systems or approaches can help embed the principles of robust appraisal across a number of bodies. The Regional Development Agencies based their appraisal processes on a single source of guidance: the Single Programme Appraisal Guidance (issued in July 2003 and replaced by the Guidance for Regional Development Agencies in Appraisal, Delivery and Evaluation in March 2008). [1] This guidance drew, in turn, on existing good practice, such as the Treasury’s Green Book. [2]

Evaluation
12 Evaluation is a retrospective analysis of a programme or project to assess how successful it has been and what lessons can be learnt for the future. Effective evaluation is an integral part of good project and programme management and helps build the evidence base around ‘what works’.

Findings on evaluation in regeneration projects from the National Audit Office’s work

A framework of clear and measurable outcomes

13 In order to assess the effectiveness of a programme, commissioners need to be able to measure how it has delivered against the objectives they set for it. It is difficult, however, to evaluate the effectiveness of programmes if measures of success are ill-defined or not quantifiable. For example, the National Audit Office’s 2009 report on Regenerating the English Coalfields [1] found that clear and measurable objectives had not been established for the Coalfields Enterprise Fund, nor had the contribution of the Fund been evaluated. This posed the risk that the Fund would not address the particular venture capital needs of the coalfield communities.

14 A good evaluation framework will develop a more accurate understanding of the effectiveness of an intervention by measuring its net impact (i.e. disentangling its benefits from those of other initiatives, or from benefits that would have happened anyway without the intervention). The National Audit Office’s 2011 report into the preparations for the London 2012 Olympic Games [2] found that the Government Olympic Executive was developing an evaluation framework for assessing the impact of the Games. We recommended that it should set baselines against which to measure whether the expected legacy benefits are achieved, and that the evaluation framework should set out how the effects of the Games will be separated from ‘business as usual’ activities.

A culture of evaluation

15 Poor or inadequate evaluation inhibits learning and poses risks to the delivery of future projects.
The National Audit Office’s 2003 report Success in the Regions [1] found that regional stakeholders believed the Regional Development Agencies were not evaluating past projects adequately, and were concerned that experience from past projects might be lost. Better evaluation would help to identify effective interventions and provide early warning of risks to be managed in future projects.

16 Embedding a culture of evaluation enables an organisation to monitor its effectiveness and drive improvement. The National Audit Office’s 2010 Independent Supplementary Review of Advantage West Midlands [2] , for example, reported that the Agency placed significant emphasis on identifying its impact and improving its return on investment. There was clear communication of the anticipated return on investment from projects, and evaluations were in place to assess their impact. This was designed, in turn, to feed lessons learned from evaluations into investment planning.

Continual and consistent evaluation and benchmarking

17 Projects should be monitored and evaluated throughout their life to ensure they are on course to return the expected benefits and that objectives are met. The National Audit Office’s 2009 report on Regenerating the English Coalfields [1] identified that, despite the Department for Communities and Local Government doubling the number of sites and more than doubling the associated expenditure in the light of the remediation required, the Department did not significantly change the target benefits, and that achieving those benefits in full would take twice the ten-year timescale of the original programme.

18 Using consistent evaluation measures across a range of projects facilitates the comparison of the outputs of different projects, helping to prioritise future funding. The Regional Development Agencies’ guidance on evaluation, for example, stipulated that Agencies should always report the net Gross Value Added impact of an intervention. [2]
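The ‘net’ in net Gross Value Added reporting reflects the disentangling described in paragraph 14: stripping out what would have happened anyway before crediting an intervention with an impact. The sketch below is a stylised illustration of that arithmetic using adjustment factors commonly applied in UK regeneration evaluation (deadweight, displacement, leakage, multipliers); it is not the formula from the RDA guidance itself, and the function name and all figures are hypothetical.

```python
# Stylised conversion of a gross impact estimate into a net additional
# impact. The adjustment concepts are standard in regeneration evaluation;
# this particular formula and all values are illustrative only.

def net_additional_impact(gross, deadweight, displacement, leakage, multiplier=1.0):
    """Adjust a gross impact (e.g. GVA in £m) for: deadweight (what would
    have happened anyway), displacement (activity merely moved from
    elsewhere in the target area), leakage (benefits accruing outside the
    target area), and any knock-on supply-chain or income multiplier."""
    net = gross * (1 - deadweight) * (1 - displacement) * (1 - leakage)
    return net * multiplier

# Hypothetical intervention: £20m gross GVA, 30% deadweight,
# 25% displacement, 10% leakage, multiplier of 1.2.
net = net_additional_impact(20.0, 0.30, 0.25, 0.10, 1.2)
print(f"Net additional GVA: £{net:.2f}m")
```

Reporting the adjusted figure, rather than the gross one, is what makes results comparable across interventions and regions.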
Similarly, the National Audit Office’s report on Regional Selective Assistance and Enterprise Grants [3] found that the then Department for Trade and Industry undertook a series of evaluations using a consistent methodology that allowed the cost effectiveness of the programme to be tracked over time.

19 If consistent metrics are in place, there will be opportunities to use benchmarking as a way of comparing achievements and learning from others. However, this type of activity has not always been exploited fully in the past. The National Audit Office’s 2010 Independent Supplementary Reviews of Regional Development Agencies [4] found that only a small number of Agencies had developed a systematic approach to benchmarking. The Agencies were using inconsistent benchmarks, resulting in unreliable comparisons between Agencies and across projects.

Sharing learning

20 Learning from others, whether through formal benchmarking or more informal networking, can help improve performance and promote efficiency. The National Audit Office’s Independent Supplementary Review of the East of England Development Agency [1] , for example, reported that the Agency had identified financial savings from sharing good practice in its review of Business Link contracts. The Agency visited other regions and drew on the performance data of others, which informed a new Business Link contract that achieved a 5 per cent efficiency saving in its back office function.

21 A lack of networking opportunities reduces the likelihood of good practice spreading between organisations. Bodies with a strategic role have a key part to play in facilitating such networking. The National Audit Office’s 2009 report on Regenerating the English Coalfields [2] found that the Department for Communities and Local Government did not play a sufficiently strong role in bringing together the elements of the Programme.
As a result, opportunities for smarter working locally and across Whitehall to coordinate physical regeneration with enterprise and skills initiatives had been missed.

March 2011

Appendix
[1] HM Government, Local growth: realising every place’s potential, Cm 7961, Department for Business, Innovation and Skills, October 2010

[2] Department for Communities and Local Government, Regeneration to enable growth: What Government is doing in support of community-led regeneration, January 2011

[1] Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009

[2] Comptroller and Auditor General, Housing Market Renewal, HC 20 Session 2007-08, National Audit Office, November 2007

[3] National Audit Office, Independent Supplementary Reviews of the Regional Development Agencies, June 2010

[4] National Audit Office, Independent Supplementary Review: East of England Development Agency, May 2010

[1] Comptroller and Auditor General, Building more affordable homes: Improving the delivery of affordable housing in areas of high demand, HC 459 Session 2005-06, National Audit Office and Audit Commission, December 2005

[2] Comptroller and Auditor General, The Thames Gateway: Laying the Foundations, HC 526 Session 2006-07, National Audit Office, May 2007

[1] Comptroller and Auditor General, Regenerating the English Regions: Regional Development Agencies’ support to physical regeneration projects, HC 214 Session 2009-10, National Audit Office, March 2010

[2] The Central Project Review Group is an inter-departmental panel that appraises and approves Regional Development Agency funding for projects that are over £10 million or are considered novel or contentious

[3] Comptroller and Auditor General, The Regeneration of the Greenwich Peninsula: A Progress Report, HC 338 Session 2007-08, National Audit Office, July 2008

[1] Comptroller and Auditor General, Regenerating the English Regions: Regional Development Agencies’ support to physical regeneration projects, HC 214 Session 2009-10, National Audit Office, March 2010

[2] Department for Business, Enterprise and Regulatory Reform, Guidance for Regional Development Agencies in Appraisal, Delivery and Evaluation, March 2008

[1] Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009

[2] Comptroller and Auditor General, Preparation for the London 2012 Olympic and Paralympic Games: Progress report February 2011, HC 756 Session 2010-11, National Audit Office, February 2011

[1] Comptroller and Auditor General, Success in the Regions, HC 1268 Session 2002-03, National Audit Office, November 2003

[2] National Audit Office, Independent Supplementary Review: Advantage West Midlands, May 2010

[1] Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009

[2] Department for Business, Innovation and Skills, Regional Development Agency Evaluation: Practical Guidance on Implementing the Evaluation Framework, December 2009

[3] Comptroller and Auditor General, Regional Grants in England, HC 702 Session 2002-03, National Audit Office, June 2003

[4] National Audit Office, Independent Supplementary Reviews of the Regional Development Agencies, June 2010

[1] National Audit Office, Independent Supplementary Review: East of England Development Agency, May 2010

[2] Comptroller and Auditor General, Regenerating the English Coalfields, HC 84 Session 2009-10, National Audit Office, December 2009
© Parliamentary copyright | Prepared 4th April 2011