Select Committee on Education and Skills Written Evidence


Memorandum submitted by Professor Stephen Gorard, University of York

    —  The new framework for the inspection of schools creates a "lighter touch" process, less reliant on primary observation.

    —  Therefore, the grading of any school is more reliant than before on indicators already known to Ofsted, such as those used in the contextualised value-added analysis of school performance.

    —  However, while contextualised value-added (CVA) analysis is a useful research tool for dealing with school types and systems, it is not (yet) suitable for diagnostic use with individual schools.

    —  In fact, contextualised value-added analysis can give very misleading results, biased towards the high attaining (rather than high performing) schools.

    —  This has led to a number of schools being told on inspection that their final grade is constrained by their contextualised value-added analysis score.

    —  In an extreme case, a school told that lessons had been "good" or "outstanding" and its leadership "excellent" was given a "satisfactory" grade because that was the best allowable under the system constrained by a bottom-quartile CVA score. The lead inspector described CVA as "king". Such cases have been reported to me by concerned heads and deputy heads.

    —  If true, this is poor science, and patently confusing for schools.

    —  It seems no better than a decade ago, when the schools placed in "special measures" were inner-city schools with high levels of pupil deprivation and mobility—the situation that the development of CVA was presumably intended to prevent.

    —  This leads me to consider what proportion of the estimated £250 million of public money spent per annum on Ofsted, and of the almost incalculable cost in time and opportunity for the schools, might be better spent elsewhere.

    —  As an illustration, might £500,000 per annum given to each of the 500 most deprived schools (or some such breakdown; 500 schools at £500,000 each accounts for the estimated £250 million) lead to greater educational improvement than maintaining the existing system?

  I am unable to submit the examples of schools reporting to me that they have been disadvantaged by the new system, because these contacts were made in confidence. However, if these complaints are valid, there should be no difficulty in substantiating them. I append my research on the DfES secondary school value-added tables (Gorard 2006), and point out that the same flaw also appears in the primary school tables (publication pending).[8] I therefore agree with one of the originators of pupil-level regression that value-added was devised as a tool for research (Paterson 1997). Its limitations are not easily understood, and it is not yet established enough to be used directly for pupil, teacher or school assessment.

REFERENCES

Gorard, S (2006) Value-added is of little value, Journal of Education Policy, 21, 2, 233-241.

Paterson, L (1997) A commentary on methods currently being used in Scotland to evaluate schools statistically, pp 298-312 in Watson, K, Modgil, C and Modgil, S (Eds) Educational dilemmas: debate and diversity, London: Cassell.

March 2006





[8]  Not printed.


 
