Select Committee on Education and Employment Fourth Report


THE WORK OF OFSTED

REPORT FROM THE EDUCATION SUB-COMMITTEE

Inspection judgements

MONITORING INSPECTIONS

  124. OFSTED reviews approximately five per cent of all reports arising from Section 10 (school) inspections to ensure they meet the Inspection Framework's requirements. OFSTED also monitors a number of school inspections to check that the inspection team has complied with the Inspection Framework and to assess the quality of the inspection work. The monitoring programme is conducted by members of HMI, who observe the inspection team for part of the inspection. The HMI provide feedback to the Registered and Team Inspectors, and to the contracting organisation, on the quality of the inspection work observed. OFSTED has recently announced that it will increase the proportion of school inspections monitored by HMI: currently 10 per cent of inspections are monitored, but from the next academic year the figure will rise to 30 per cent.[183] We welcome OFSTED's decision to increase the proportion of Section 10 inspections which are monitored to ensure they comply with the Inspection Framework.

VALIDITY AND RELIABILITY OF INSPECTORS' JUDGEMENTS

  125. The difficulty of ensuring the validity, reliability and consistency of the judgements made by a large number of individual observers across a national system should not be underestimated. (This is a separate point from simple administrative errors in the inspection process, such as those discussed in paragraph 88 above.) Dr Janet Ouston, of the Institute of Education, noted that there were several challenges to be overcome when drawing conclusions from inspection-derived data: such judgements are based on observation of an atypical week; the week is in any case only a 'snapshot' of the life of a school; and there are questions about the quality of judgements made by inspectors. She concluded that "we know from other social science data how very difficult it is to make agreed valid and reliable judgments about something as complex as a classroom".[184] Concerns about such difficulties were expressed to us by a number of witnesses. For example, Mr Jonathan Harris, Chief Education Officer of Cornwall County Council, was critical of the possible variability of judgements between inspection teams. He argued that different inspection teams could reach different conclusions about the same school, a situation which he regarded as "unacceptable".[185] The Association of Inspection Providers told us that there was "no doubt" that human error and subjectivity could occur, even in the most rigorously controlled circumstances. The Association recognised that one of the ways in which the OFSTED inspection system sought to reduce such errors was to try to ensure judgements were corporate—i.e. reached by consensus among the whole inspection team—rather than the judgement of individuals.[186]

126. This point was considered in some detail by OFSTED in the research it undertook in 1996, in association with the Dutch inspectorate of schools, into the reliability and validity of judgements made by inspectors in their observation of lessons. 'Reliability' in this case means the degree to which two inspectors will reach the same judgement when observing the same lesson; 'validity' means whether the inspector makes the 'right' judgement.[187] The research addressed (i) the extent to which pairs of inspectors observing the same lesson agree about the grade awarded to the teacher, (ii) the extent to which pairs of inspectors base their grades on the same recorded evidence, and (iii) how well the teaching grades awarded by inspectors match their recorded evidence. The research found that in 33 per cent of cases the pairs of inspectors awarded different grades after observing the same lesson. In the majority of these cases the pairs of inspectors arrived at judgements which were one grade apart, for example, one grading a lesson '3' and the other '4'. However, in three per cent of cases the difference was two grades. The statistical correlation between the two sets of inspectors' judgements was r = 0.81, a level of correlation which the research described as "reassuringly high".[188]
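The relationship between a headline disagreement rate and a correlation coefficient can be counter-intuitive, and it is worth seeing how the two statistics interact. The sketch below uses entirely hypothetical paired grades (on a scale where, as paragraph 127 describes, grade 4 is satisfactory and lower numbers are better); it is not OFSTED's data, but it illustrates how two inspectors can disagree on a third of lessons, almost always by a single grade, while the correlation between their judgements remains high.

```python
# Illustrative sketch only: hypothetical paired lesson grades, not the data
# from OFSTED's 1996 study with the Dutch inspectorate. Lower grades are
# assumed better, with grade 4 'satisfactory', as described in the report.
from math import sqrt

inspector_a = [2, 3, 3, 4, 4, 4, 5, 5, 3, 2, 4, 6]
inspector_b = [2, 3, 4, 4, 4, 3, 5, 4, 3, 2, 5, 6]

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

pairs = list(zip(inspector_a, inspector_b))
disagreements = sum(a != b for a, b in pairs)
print(f"disagreement rate: {disagreements / len(pairs):.0%}")               # 33%
print(f"correlation:       r = {pearson_r(inspector_a, inspector_b):.2f}")  # r = 0.88
```

On these invented figures the inspectors disagree about a third of the lessons, yet r remains close to 0.9, which helps to explain how OFSTED and its critics could read the same result so differently.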

127. Professor Peter Mortimore did not support OFSTED's conclusion that this level of correlation was high. He told us that the reliability and validity of inspectors' judgements were "not there".[189] Professor Harvey Goldstein of the Institute of Education highlighted a particular finding of OFSTED's research which concerned him. He noted that at the boundary between grade 4 (satisfactory) and grade 5 (less than satisfactory)—in his words, the "failure boundary"—only two thirds of inspectors in OFSTED's research arrived at the same judgement. He concluded that "on this crucial point OFSTED's judgments were ... not very reliable".[190] The conduct of OFSTED's research itself was also criticised by Professor Carol Fitz-Gibbon of the University of Durham. She argued that the sample of inspectors used in the research was poor, as they were experienced inspectors, many of whom had previously worked in the same inspection team. She also felt that the results did not indicate a high level of agreement between different inspectors, as the "generally accepted" standard of reliability when making judgements about individuals is r = 0.90 or better.[191] It should be noted that, although one third of the inspectors had worked together often, and others had sometimes or occasionally worked together, OFSTED found that the "degree of professional acquaintance between inspectors had little bearing on the consistency of the pairs of evaluations".[192]

128. We discussed a related issue with some Chief Education Officers: the extent to which the judgements of schools' performance by OFSTED inspection teams match LEAs' perceptions of their own schools. This compares the judgements of entire inspection teams, rather than individual inspectors, with the perceptions of the LEA's officers. For example, Dr Paul Gray, Chief Education Officer of Surrey County Council, told us that in approximately 15 per cent of cases the views of OFSTED inspection teams were "out of line with our knowledge of the school". This concern was echoed by Mr Harris, who told us that a figure of approximately 10 to 15 per cent discrepancy was "probably not far wrong".[193] Interestingly, Dr Gray noted that these differences could "go both ways", i.e. the LEA having a more positive view of its school than the inspection team, or vice versa.[194]

129. We welcome the fact that OFSTED has undertaken research on the validity and reliability of inspectors' judgements. However, we note the criticisms of this research project. It is important that there is confidence about this fundamental aspect of inspection, and full and frank research must establish the level of reliability and validity of the basic elements of inspection. We wish to see research into this issue extended. It is important, to help ensure public acceptance of inspection, that such work is open to scrutiny by the academic community. Given that the OFSTED research was carried out in 1996, and the inspection system has evolved since then, it might also be timely to carry out a similar exercise using a wider sample of inspectors than that used in OFSTED's initial research.

PRE-INSPECTION CONTEXT AND STATISTICAL INDICATORS

  130. Inspectors are required to use performance data to "compare each school's standards with those found in schools nationally and in schools with pupils from similar backgrounds".[195] As we noted in paragraph 12, data on national standards of achievement and on schools in similar settings are provided to the inspection team by way of the Pre-Inspection Context and Statistical Indicators (PICSI) report, prepared by OFSTED for each inspection. Comparative judgements, particularly with schools in similar settings, are important as they help to assess the value that a school adds to its pupil intake. It is therefore important that the PICSI data fully and accurately reflect the school's situation. Concern was expressed to us during the inquiry about the accuracy of PICSI data. For instance, Mr David Thompson, Headteacher of Haydon Bridge School, told us that the PICSI information on his school's socio-economic base was "laughable", bearing only "scant relationship to the reality of this school and its enormous catchment area".[196] The development of PICSIs is, of course, at an early stage, and over time the process of compiling them should become more refined and more sophisticated. We support the principle of assessing schools' performance against schools in similar circumstances. But inspectors' judgements about schools' comparative performance will be undermined if the data they use as a starting point are inaccurate. We therefore recommend that OFSTED consider ways of improving the accuracy of data contained in Pre-Inspection Context and Statistical Indicators.

CHALLENGING INSPECTORS' JUDGEMENTS

  131. OFSTED states that its complaints procedure does not allow "second guess judgements" to be made about inspection findings.[197] However, many of our witnesses argued that the inspection framework should allow the professional judgements made by OFSTED inspectors to be challenged. For example, the NUT expressed concern that teachers were unable to challenge what they regarded as unfair gradings of their lessons.[198] The NASUWT argued that, for teachers, one of the most frustrating and unfair aspects of the OFSTED inspection process was what they referred to as the "inability of schools or individual teachers to have redress against potentially damaging and inaccurate judgements". They noted that some schools making a complaint had been offered redress in the form of another inspection; this, they argued, was nothing more than a "hollow gesture".[199]

132. Professor Robin Alexander argued that the nature of inspection meant it was likely that some errors of judgement would be made by inspection teams:

    "Given the format and context of inspections—high stakes, high stress and a relatively brief period of time in which to pack a complex procedure on which an institution's future may depend—it is inevitable that some of the evidence will be inadequate, or skewed by the dynamics of the inspection process, thus compounding the problems of criterial ambiguity and judgmental subjectivity"

Given these inbuilt weaknesses, Professor Alexander argued that the evidence for particular inspection judgements should be open to examination.[200] On the other hand, Professor Harvey Goldstein told us (in an informal seminar at which we discussed the technicalities of data collection and analysis) that as judgements were made by the whole inspection team it was likely that some 'positive' and 'negative' errors would balance out. This would, of course, depend on the inspection team being sufficiently large. Very small inspection teams were more likely to make errors than larger teams for this reason.
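Professor Goldstein's argument can be illustrated with a simple simulation. The sketch below is purely illustrative: it assumes that each inspector's grade is the 'true' grade plus an independent random error, and the team sizes, error spread and number of trials are our assumptions rather than figures drawn from OFSTED's practice.

```python
# Minimal simulation of error-balancing in team judgements: if individual
# inspectors' errors are independent, the error of the team's average grade
# shrinks roughly with the square root of the team size. All parameters
# (true grade, error spread, team sizes, trials) are illustrative assumptions.
import random
import statistics

random.seed(1)
TRUE_GRADE = 4.0   # assumed 'correct' grade for the lesson
ERROR_SD = 1.0     # assumed spread of an individual inspector's error
TRIALS = 10_000

for team_size in (1, 3, 10):
    team_errors = [
        statistics.mean(random.gauss(TRUE_GRADE, ERROR_SD) for _ in range(team_size))
        - TRUE_GRADE
        for _ in range(TRIALS)
    ]
    print(f"team of {team_size:2d}: typical error of the team judgement "
          f"is about ±{statistics.stdev(team_errors):.2f} grades")
```

On these assumptions a team of ten averages away most of the individual error, while a single inspector's judgement carries the full error, which is the intuition behind the concern about very small inspection teams.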

133. Overall, we do not think that it is feasible to develop a system whereby judgements made by inspectors can be challenged. We believe this could undermine the inspection system and lead to lengthy disputes between schools and OFSTED which could well divert attention from developing educational provision at the school. This does not mean we are complacent about the reliability of all OFSTED's judgements. But we think a better way of ensuring that inspectors' judgements are robust would be to concentrate on entrenching good practice, including further improvements to inspector training and to OFSTED's quality assurance mechanisms. Our recommendations about involving more serving teachers and headteachers in inspection teams, developing greater ownership of inspection findings among school staff and giving the governing body the option of nominating an observer will also help in this respect.

Cost of school inspection and OFSTED's value for money

  134. In 1997-98, OFSTED carried out 7,840 school inspections at a total cost of £107 million, an average of about £13,600 per school. In 1998-99, it carried out 6,290 school inspections at a total cost of £81 million, an average of about £13,000 per school.[201] We asked OFSTED for information on the costs of individual school inspections and the trend in costs over time. The Chief Inspector told us that, because the information was commercially confidential, it was not possible for him to publish details of how much inspections cost. However, he provided us with information in confidence on the average cost of inspection for primary, secondary and special schools from 1993-94 onwards. This shows a fairly flat trend in the average cost of primary inspections, and a continuing fall in the price of secondary inspections. The fall may be due in part to the gradual growth in the size of the inspection market, which would naturally lead to greater competition for inspections and a concomitant fall in the price paid. In the interest of a wider perspective, we requested information on the average cost of inspections of further education colleges and of schools in Wales and Scotland, although we recognise that these figures are not directly comparable. We were told that the average cost of a further education inspection was £27,605;[202] Ms Susan Lewis, HM Chief Inspector of Schools in Wales, told the Sub-committee that the costs of inspecting schools in Wales were £22,000-£23,000 for secondary schools, £10,000-£11,000 for primary schools and £9,000-£9,500 for special schools;[203] and Mr Graham Donaldson, HM Depute Senior Chief Inspector of Schools in Scotland, told us that the costs of inspection by HMI in Scotland were approximately £14,000 for secondary schools and £6,700 for primary schools.[204]
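The per-school averages quoted above can be verified directly from the totals given in this paragraph; the following sketch simply repeats that arithmetic using the report's rounded figures.

```python
# Check of the average inspection costs quoted above, using the rounded
# totals given in the report itself.
figures = {
    "1997-98": (107_000_000, 7_840),  # total cost in pounds, inspections carried out
    "1998-99": (81_000_000, 6_290),
}
for year, (total_cost, inspections) in figures.items():
    print(f"{year}: £{total_cost / inspections:,.0f} per inspection")
# 1997-98: £13,648 per inspection (reported as "about £13,600")
# 1998-99: £12,878 per inspection (reported as "about £13,000")
```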

135. Some witnesses expressed the view that OFSTED inspections did not represent good value for money,[205] although an exception to this was the Local Government Association, which noted that OFSTED's total budget (which covers more than school inspections) was only 0.6 per cent of the total education budget and was "rather good value for money".[206] It is important for OFSTED and the DfEE to develop a clear understanding of the total costs incurred by inspection activity. These will be important in making judgements on OFSTED's value for money. We do not, however, underestimate the difficulties of this. As Professor Christopher Hood and his colleagues pointed out, very few public sector regulators attempt to calculate the compliance costs incurred by their regulatees.[207]

  136. Although OFSTED's expenditure on school inspections is a very small fraction of the overall education budget for England, it is nevertheless a large sum of money in absolute terms. It has not been possible for us to make a reliable judgement on OFSTED's contribution to the education sector from a value for money viewpoint. We therefore recommend that the National Audit Office conduct a value for money audit of OFSTED's work.

137. A number of our recommendations are likely to have financial implications. We also note that OFSTED's overall budget will decrease in the future, due in part to the introduction of 'short' inspections for some schools and in part to the change from a four-yearly to a six-yearly inspection cycle. Our recommendation that work should be done to establish OFSTED's value for money does not stem from a general concern that inspection is too expensive an activity (indeed, some of our recommendations could increase its cost); rather, it is intended to provide a baseline from which the benefits of inspection can be measured.


183   Chris Woodhead. 'Why inspections set you free', Times Educational Supplement, 14 May 1999.

184   Q. 13.

185   Q. 153.

186   Appendix 66, p. 246.

187   Peter Matthews et al. Aspects of the reliability and validity of school inspection judgements of teaching quality. Paper presented to the British Educational Research Association annual conference, September 1997.

188   Ibid.

189   Q. 177.

190   Q. 190.

191   Professor Carol Fitz-Gibbon. 'OFSTED: Time to go?', Managing Schools Today, March 1998.

192   Peter Matthews et al. Aspects of the reliability and validity of school inspection judgements of teaching quality. Paper presented to the British Educational Research Association annual conference, September 1997.

193   Q. 154.

194   Q. 151.

195   OFSTED. Inspection '98. 1998, p. 5.

196   Appendix 70, p. 259.

197   OFSTED. Making Complaints to OFSTED. 1998.

198   Appendix 10, p. 72.

199   Appendix 9, p. 67.

200   Appendix 38, p. 148.

201   Department for Education and Employment. Annual departmental report, 1999, p. 190.

202   Submission from Jim Donaldson, Chief Inspector, Further Education Funding Council, not printed.

203   Q. 727.

204   Graham Donaldson, HM Depute Senior Chief Inspector of Schools in Scotland, not printed.

205   See, for example, Professor Carol Fitz-Gibbon, Q. 65 and Gary Yates, Q. 369.

206   Q. 325.

207   Q. 993.


 