Select Committee on Education and Skills First Special Report


Government response


Public Expenditure: Government Response to the Committee's Fifth Report of Session 2005-06

The Select Committee's conclusions and recommendations are in bold text. The Government's response is in plain text.

Departmental Report

1. We expect the DfES to take our concerns about the Departmental report on board for the future and to ensure that information is presented in the Departmental Annual Report in ways which are consistent with previous years and which provide clarity about what is happening with expenditure, for example by having all tables reflecting the full period of a Government (in this case, running from 1997). Consistency and rigour will benefit us in our scrutiny work, but will also benefit the DfES. Debate should be about what information on expenditure tells us about what is happening in the education sector, not whether the information itself is reliable. Moreover, we expect to be informed, prior to preparation of the report, about significant changes to the DAR or within any other key annual sources of information on education expenditure and outcomes. (Paragraph 11)

The Department takes the Committee's concerns seriously and is committed to improving the information contained in the Departmental Annual Report (DAR). We have recently held discussions with the Clerk to the Committee and with the Committee's Special Adviser. We understand the points made by the Committee and will be addressing these concerns.

In the 2007 DAR we will reinstate the table "Education Expenditure by Central and Local Government by sector in real terms" (table 12.3 in the 2005 Report). Where possible all tables and annexes will include data for the full period from 1997-98. In a few cases it may not be possible to provide information for the earlier years and in these cases the reasons will be explained in the text.

We will be retaining the same overall layout and chapter headings in the 2007 DAR so that it will be easier to track changes from year to year. The exact format of the 2008 DAR will depend on the outcome of the 2007 Comprehensive Spending Review and the new Public Service Agreement targets. As part of its improved consultation with the Committee, the Department will send it the proposed layout of the 2007 DAR, including the tables. We expect to provide this information before the end of February.

Schools' funding

2. One of the main aims for the new schools' funding system ought to be that it is as comprehensible as possible, so that head teachers, governors and parents are able to understand how funding decisions for their schools are arrived at. (Paragraph 28)

The Government agrees that it will be important to ensure that the new school funding arrangements which emerge from the current review are as comprehensible as possible to a wide range of the Department's stakeholders and partners. Indeed, the objectives for the review set out in the terms of reference published in April 2006 include the following:

Simplicity—school funding arrangements should be transparent and easy for schools to understand, with the number of separate funding streams kept to a minimum.

We will need to achieve a careful balance between this and the other key objectives for the review: flexibility, stability, equity and value for money. While we will endeavour, therefore, to reduce complexity in the funding arrangements as far as practicable, we will also make it a priority to communicate clearly and effectively about the new arrangements with a range of key stakeholders, using language and channels of communication appropriate to each.

The review is concerned with identifying any changes to the current funding system that need to be in place for the introduction of three-year allocations to local authorities and three-year budgets for schools from 2008-09. The Government will consult widely on specific proposals for change in spring 2007, and will announce decisions in the summer in the light of the consultation and of the outcome of the Comprehensive Spending Review.

Expenditure and efficiency

3. We welcome the recyclable gains expected from the efficiency programme, but we do have doubts about whether quantifying them in cash terms is in any way helpful. Money is not being redeployed elsewhere, and it is a moot point the extent to which the gain which accrues from a teaching assistant or other non-teaching staff member taking on tasks previously undertaken by teachers, and thereby freeing teachers' time for preparation or teaching, can be given a monetary value. This does not seem to be money as it is normally understood, and once again draws the DfES, and Government more widely, into arguments about what the numbers mean, rather than putting the focus on the matter in hand, namely the quality of educational provision. (Paragraph 32)

Of the Department's gains, a very small proportion, including those from the restructuring of the Department, is cashable at Departmental level. However, some 40% is cashable at an institutional level, freeing up monetary resources to be reallocated by schools, colleges and other frontline institutions, and some 80% is recyclable according to the definitions we have supplied to the Committee. The split of gains between categories also alters over time as the programme develops and more efficiency gains are identified.

An example of a gain that is cashable at institutional but not at Departmental level is local authority procurement: the Department does not extract these gains from the organisation and move them elsewhere, but efficiencies made from better procurement deals are nevertheless immediately available for the local authority to redeploy as it sees fit. One such example is Essex County Council, where, during the summer of 2005, three procurement exercises re-tendered a proportion of Home to School Transport routes, producing efficiency savings of £1,098,000 in 2005-06.

While we appreciate the difficulties of reporting recyclable gains that you set out, we maintain that they provide a valuable tool for quantifying improvements in efficiency. Given the challenges of measuring the wide range of outcomes delivered by our sector, we have looked for alternative ways of measuring the difference our initiatives are making; attributing a monetary value to teacher time freed up from administrative tasks for preparation is just one of the ways in which we have done this.

Clearly, effective use of our frontline workforce should be a key element of our efficiency programme. As the money spent on developing the schools workforce is increasing, it is more difficult to demonstrate cashable gains, and, as your report and the work undertaken by the National Audit Office have highlighted, measuring overall productivity is extremely complex. We have therefore attributed realistically assessed monetary values to the teacher time spent more productively as a result of others carrying out administrative or clerical tasks, or of more efficient use of technology. We maintain that this is a meaningful measure of improved efficiency within schools.
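By way of illustration only, the sketch below shows the kind of arithmetic involved in attributing a monetary value to teacher time released in this way. All of the figures in it (numbers of teachers, hours freed, hourly cost) are hypothetical and are not the Department's assessed values; the actual assessment is more detailed.

    # Illustrative sketch only, with hypothetical figures; the Department's
    # actual assessment uses realistically assessed values and is more detailed.

    def value_of_freed_teacher_time(teachers: int,
                                    hours_freed_per_week: float,
                                    weeks_per_year: float,
                                    teacher_cost_per_hour: float) -> float:
        """Monetary value of teaching time released when administrative or
        clerical tasks pass to support staff or to more efficient technology."""
        return teachers * hours_freed_per_week * weeks_per_year * teacher_cost_per_hour

    # Example: 200 teachers each freed from 2 hours of administration a week
    # over a 39-week school year, costed at £30 per teacher hour.
    print(f"£{value_of_freed_teacher_time(200, 2, 39, 30):,.0f}")  # £468,000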

Focusing narrowly on efficiency gains where money is freed up to be re-apportioned elsewhere excludes a large proportion of the techniques and approaches we are employing to improve efficiency in our sector. In the context of increasing resources it is harder to point to money being freed up; instead, we point to resources being used more efficiently, and this is what we aim to do through the initiatives that generate our non-cashable, recyclable gains.

These gains are expected to have an impact on the quality of educational provision: the majority of our programme stems from activities designed to do exactly that. But, as the Committee has said, there are a number of challenges in measuring the range of outcomes delivered by the sector, and as such it is currently more practical to measure our effect on the mix of inputs. It is through this that the majority of our recyclable gains are realised.

We have always been open about the measurement of these gains and agreed the approach with the Treasury at the outset of the programme. We will continue to seek to improve our measurement systems for the 2007 Comprehensive Spending Review, but we firmly believe that the approaches we are currently employing represent the best available.

4. Given the increased level of investment that this Government has made in education, it is unfortunate that it has not yet proved possible to measure the effectiveness of that spending in providing better education and more highly qualified students. This is not to say that the investment was ineffective; but in productivity terms, we simply do not have the data to tell us one way or the other. There is a risk, in the longer term, that the inability to demonstrate a measurable link between inputs and outputs will mean that taxpayers have no way of judging whether or not public resources are being well used. Such an outcome would be bad for taxpayers and, potentially, could undermine the electorate's willingness to fund public services. (Paragraph 35)

EFFECTIVENESS AND PRODUCTIVITY OF INCREASED INVESTMENT IN EDUCATION

Developing ways of measuring the productivity of education spending is the subject of an ongoing programme of work that the Department is undertaking in collaboration with the Office for National Statistics (ONS). The UK is a world leader in developing public sector productivity measurement in education, and we have already made much progress in developing better measures of output and productivity. These measures were published last year by DfES and ONS respectively.

Measuring aggregate productivity and the effectiveness of increased investment in any education system remains difficult, however, as the nature of the relationship between spending and education outcomes is complex. We know that education is improving on a number of fronts, but we have to be careful when indicating that a specific amount of investment results in a specific amount of improvement. Not all of the extra spending will affect outcomes directly, immediately or in the same way, and although attainment is a key outcome of the education system, there are many other outcomes that are not easily measured, such as soft skills, which are increasingly well rewarded in the labour market. Nevertheless, given the difficulty and importance of the issue, we continue to strive to develop our understanding of how measurement can be improved.

Since 1997 we have met a number of significant spending commitments whose effects may not yet have fed through to headline performance measures such as attainment. For example:

  • significant investment in the secondary school infrastructure, with a sevenfold increase in the schools capital budget since 1997 as part of the Building Schools for the Future programme;
  • much of the extra spending has gone on raising teachers' pay, and average salaries are now the highest they have ever been; and
  • in 2003 Ofsted said that today's newly-qualified secondary teachers are the best and most consistently trained that we have ever had—the full impact of which is not likely to have been realised yet.

Effectiveness and productivity are different things, and the difficulty in measuring education productivity certainly does not equate to ineffectiveness. Nor does it undermine the case for that, or further, investment. It is cost-benefit analysis, not measured productivity, which determines whether increased investment provides value for money, and the Department places importance on ensuring value for money from its spending decisions.

We have said before that providing evidence of productivity is complicated by:

  • diminishing returns, which means that it becomes progressively harder (and hence more expensive) to drive up attainment as the base level of attainment continues to rise; and
  • time-lags before the increased investment shows up in attainment data. For example, investment in early years provision and in new school buildings will not immediately affect today's examination results, but will feed through into higher achievement in future years.

Of course, productivity is still a useful indicator of how efficiently the education sector is performing, even though it is possible to invest in a worthwhile way that reduces productivity, narrowly defined (such as increasing spending to reduce class sizes). ONS has estimated the productivity of spending in the maintained compulsory schooling sector, but its estimates are inconclusive and cannot capture the full range of outputs produced by that spending.

There is no single estimate of the productivity of overall education spending, because there is no single way of measuring education output or the quality of that output. ONS has therefore produced a range of estimates for annual productivity change, from plus 2% to minus 2% per year since 1998. These are narrow estimates which need setting in the context of wider evidence on education inputs and outputs.

Indicative as these estimates are, they do not capture the full range of outcomes from education. Qualifications are important, but they are not the only thing that matters. Children also leave school having been introduced to a range of sports and cultural activities and with enhanced social and communication skills. Many will have benefited from special needs education, counselling and support services, homework and other after-school clubs, which help pupils become more fulfilled, healthier and more productive adults. Schools also deliver secondary services, in particular child-care provision for pupils during school hours, which enables their parents to participate in the labour market. All these activities can also deliver wider benefits, such as lower levels of crime, a healthier population and greater participation in democratic processes. Since few of these broader outcomes can be readily quantified, they are often overlooked. Ways of capturing some of these wider outcomes and education quality dimensions, in order to estimate productivity better, will be the subject of ongoing investigation.

Given the difficulty of accurately measuring productivity, and the fact that there are various ways of measuring output and its quality, ONS is seeking outside expert views on these measures in its public consultation. The consultation began in early November 2006 and will finish in February 2007. Once the consultation has finished, both ONS and DfES will consider the findings and conclusions, and decide whether and how they can help shape further work to improve the way we measure education output and productivity.
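As a purely illustrative aid to the discussion above, the sketch below shows the basic arithmetic behind a single-year productivity change figure: growth in a measure of output relative to growth in a measure of inputs. The figures are hypothetical, not ONS data, and the ONS methodology (cost-weighted, quality-adjusted output and deflated inputs) is considerably more elaborate.

    # Illustrative sketch only: hypothetical figures, not ONS data. The ONS
    # methodology is considerably more elaborate; this shows only the basic
    # arithmetic behind an "annual productivity change" figure.

    def productivity_change(output_growth: float, input_growth: float) -> float:
        """Annual productivity change, in per cent: growth in (quality-adjusted)
        output relative to growth in the volume of inputs."""
        return ((1 + output_growth) / (1 + input_growth) - 1) * 100

    # Output (e.g. quality-adjusted pupil attainment) grows 2.5% in a year
    # while inputs (staff, goods and services, capital) grow 3.5%:
    print(round(productivity_change(0.025, 0.035), 1))  # -1.0, i.e. a 1% fall

    # The same output growth with only 1.5% input growth gives a rise instead:
    print(round(productivity_change(0.025, 0.015), 1))  # 1.0, i.e. a 1% rise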

THE LINK BETWEEN EDUCATIONAL INPUTS AND OUTPUTS

We have already acknowledged to the Committee that proving a direct link between spending and improved outcomes is difficult for a number of reasons, and it is therefore difficult to show, at an aggregate level, that a given level of investment produces a given output. Nonetheless, we continue to carry out work in this area to develop our understanding of how resources affect outcomes and of the productivity of education spending.

A wide range of evidence points to improvements in both inputs (e.g. teachers) and outputs from education, which goes some way to demonstrating what taxpayers are getting from public spending on education:

  • Attainment throughout the school system has seen dramatic and sustained improvements, and there has been a significant reduction in the number of schools with low proportions of pupils achieving the expected standards;
  • Attainment gaps between higher and lower social classes have not widened, and in some cases have narrowed slightly;
  • There are now 30,000 more teachers than in 1997-98, and numbers of support staff, including teaching assistants, have also increased;
  • We now have the best and most consistently-trained new teachers;
  • School attendance has improved; and
  • Ofsted inspections show that 60% of all schools are good or outstanding and over 90% are at least satisfactory; and the number of failing schools has fallen by half since 1997.

Research funding

5. We are planning a wide-ranging inquiry into a number of issues concerning higher education in the next parliamentary session, and research funding is one of the subjects that we shall be investigating. We expect the Government not to take any irrevocable decisions on the next steps until we have reported our findings. (Paragraph 49)

The Government has repeatedly stressed its commitment to the dual support system for public funding of research, and to the deployment of public funds to underpin a broad range of economically and socially beneficial research activity. However, it has also laid increasing emphasis on the need to consider alternative approaches to quality assessment and to allocating funding, and in particular on the scope for substantially greater dependence on metrics. The Research Assessment Exercise panels already make some use of research metrics in reaching their judgements about research quality. Research metrics are statistics that provide indicators of the success of a researcher or department: examples include the amount of income a department attracts from funders for its research, the number of postgraduate students it has, and the number of times a published piece of research is cited by other researchers.

The Science and innovation investment framework 2004-2014, published in July 2004, therefore noted an intention to examine the use of metrics as an alternative to peer review. In March 2006, Science and innovation investment framework 2004-2014: next steps committed the Government and the higher education funding bodies to consulting the sector on what those metrics should be, in the firm expectation that they would form the basis of a new system for research quality assessment and funding after the 2008 Research Assessment Exercise.

The consultation on the principles that should underpin an alternative and simpler system was launched on 13 June and closed on 13 October. Nearly 300 organisations and individuals responded, and key criteria for the new system were identified: it should continue to use expert advice; it should recognise disciplinary differences within a common framework; and it should use an indicator directly linked to research quality.

The Secretary of State announced on 6 December that the new process will use, for all subjects, a set of indicators based on research income, postgraduate numbers and a quality indicator. For subjects in science, engineering, technology and medicine (SET), the quality indicator will be a bibliometric statistic relating to research publications or citations. For other subjects, the quality indicator will continue to involve a lighter-touch expert review of research outputs, with a substantial reduction in the administrative burden. Experts will also be involved in advising on the weighting of the indicators for all subjects.
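Purely as an illustration of how indicators of this kind might be combined, the sketch below computes a weighted sum of normalised indicators for two hypothetical departments. The indicator names mirror those announced (research income, postgraduate numbers and a quality indicator), but the weights, normalisation and figures are invented for the example; the actual weightings will be advised on by experts and may well differ by subject.

    # Illustrative sketch only: the weights, normalisation and figures below are
    # hypothetical, not the announced methodology.

    from dataclasses import dataclass

    @dataclass
    class DepartmentMetrics:
        research_income: float     # external research income, £ per year
        postgrad_students: float   # research postgraduate numbers
        quality_indicator: float   # e.g. a bibliometric statistic for SET subjects

    def combined_score(m: DepartmentMetrics,
                       weights=(0.4, 0.2, 0.4),
                       norms=(1_000_000.0, 50.0, 1.0)) -> float:
        """Weighted sum of normalised indicators; a hypothetical illustration of
        a metrics-based assessment, not the actual formula."""
        values = (m.research_income, m.postgrad_students, m.quality_indicator)
        return sum(w * v / n for w, v, n in zip(weights, values, norms))

    # Two hypothetical departments compared on the same basis:
    dept_a = DepartmentMetrics(2_500_000, 60, 1.2)
    dept_b = DepartmentMetrics(1_200_000, 80, 0.9)
    print(round(combined_score(dept_a), 2))  # 1.72
    print(round(combined_score(dept_b), 2))  # 1.16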

The first assessment exercise for SET subjects will be in 2009, and will begin to inform funding in the 2010-11 academic year. Other subjects will have their first assessment under the lighter touch regime during 2013 and this will inform funding from 2014-15.

Deferring decisions on the future of research assessment and funding until the Committee has given its views was not a practical option: the delay would have left insufficient time for the necessary arrangements to be made for the new system to be introduced after the 2008 Research Assessment Exercise.

The Department will, of course, take note of whatever views the Committee expresses in taking forward the further work that follows from the 6 December announcement.


 

© Parliamentary copyright 2007
Prepared 15 January 2007