Children, Schools and Families Committee

Memorandum submitted by the Royal Statistical Society (RSS)


1.   Summary

  The Royal Statistical Society's Education Strategy Group is pleased to have the opportunity to comment on schools' performance reporting. The key issues involved are in great measure statistical and tend to be poorly understood by policymakers.

Please see Performance Indicators: Good, Bad and Ugly, the report of the Royal Statistical Society (RSS) Working Party on Performance Monitoring in the Public Services,[8] along with our comments. The report sets out the technical issues and limitations of reporting performance indicators in general.

    — As statisticians, we are concerned with the need for statistical rigour in performance management: the technical probity of measures and the validity of the inferences that they allow us to draw.

    — Existing public accountability systems do not take into account the uncertainty built into all systems of judgement, from test scores to inspections and self-evaluation. This leads to problems of misinformation and miscommunication with all users, including the general public (and parents). The use of value-added measures, while an improvement on unadjusted test and exam scores, may also be misleading if over-interpreted, especially if the full uncertainty surrounding them is not clearly displayed and understood.

    — We do not support the current system of public reporting of existing league tables via the media. We would prefer to see a "private accountability" system built round the need to support teachers and schools rather than trying to identify failure publicly. Feedback should be supplied to the schools themselves and to the governing authorities with the aim of correcting weaknesses and building on strengths. Any subsequent publication of results should come at the end of such a process of discussion and should recognise the provisional nature of any judgements, the statistical uncertainties and, above all, the contextual factors which are likely to have influenced the results.

    — Our view is that there is a strong need for policy change in relation to performance indicators presented as league tables, including a much stronger role for government in making clear the limitations of the data in league tables. The Royal Statistical Society would, of course, be happy to lend its professional expertise to supporting the development of new policies.

2.   What aspects of a school's performance should be measured and how?

  2.1  The avowed purposes of the testing and assessment system are stated as follows[9] (in no order of priority): "to give parents the information they need to compare different schools, choose the right school for their child and then track their child's progress; provide head teachers and teachers with the information they need to assess the progress of every child and their school as a whole, without unnecessary burdens or bureaucracy; and allow the public to hold national and local government and governing bodies to account for the performance of schools." The information needed to meet each of these objectives is different.

2.2  Cognisant, therefore, that examination results, students' academic progress, and some broader measures of the wider purposes of education (eg preparedness for making a positive contribution to society) also need to be measured, the Royal Statistical Society would like to concentrate its response on the need for statistical rigour in performance management: the technical probity of measures and the validity of the inferences that they allow us to draw. We are well aware that there is a debate around the extent to which institutions such as schools should be publicly accountable, and we are also aware of the debate that surrounds the side effects or "perverse incentives" that such systems tend to generate. A detailed discussion of these issues can be found in Performance Indicators: Good, Bad and Ugly, the RSS Working Party report on Performance Monitoring in the Public Services.

  2.3  All systems of judgement, from test scores (adjusted or not) to inspections and self-evaluation, have uncertainty built in. Despite the wealth of knowledge about the uncertainty of examination results and test scores, and the evidence from Ofsted and others on the uncertainty that accompanies inspection judgements, existing public accountability systems do not take this into account, and this leads to many problems.

  2.4  "School performance" is essentially measured using a proxy: the set of measurements taken on the students attending a school. As is well established, the characteristics of a school and its teachers are only one set among the numerous factors affecting student performance, such as social and cultural background, out-of-school activities, peer groups and so on.

  2.5  The task for anyone wishing to extract a measure of a school's contribution to student performance is to find some way of measuring, and hence adjusting for, factors other than the school. This is the intention behind "value added" measures, which statistically adjust for achievement prior to school entry. There is now a large literature on this which advises, for example, taking account of previous achievement on more than one prior occasion, as well as student mobility among schools.[10]

  2.6  The existing literature also makes clear that even when such adjustments are made, there remains considerable uncertainty about any resulting rankings of schools as expressed in wide confidence intervals. This implies that simple rank differences can easily be over-interpreted.

  2.7  Whilst the Department for Children, Schools and Families (DCSF) website does provide uncertainty estimates for such rankings (or school scores),[11] we believe that the Government should be doing more to insist that uncertainty estimates are conveyed to the general public.
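The scale of this problem can be illustrated with a small simulation (a purely hypothetical sketch: the school numbers, cohort size and variance components below are illustrative assumptions, not DCSF figures). Each school's published score is a noisy estimate of its true effect, and once 95% confidence intervals are attached, only a minority of schools can be statistically distinguished from the national average:

```python
import random
import statistics

random.seed(42)

N_SCHOOLS = 100   # illustrative number of schools
N_PUPILS = 50     # illustrative cohort size per school
PUPIL_SD = 1.0    # within-school spread of pupil progress scores
SCHOOL_SD = 0.2   # spread of true school effects (small relative to pupil spread)

# True (unobservable) value-added effect for each school
true_effects = [random.gauss(0.0, SCHOOL_SD) for _ in range(N_SCHOOLS)]

# Each school's published score is the mean of a noisy pupil sample
estimates = []
for effect in true_effects:
    pupils = [random.gauss(effect, PUPIL_SD) for _ in range(N_PUPILS)]
    mean = statistics.mean(pupils)
    half_width = 1.96 * statistics.stdev(pupils) / N_PUPILS ** 0.5  # 95% CI
    estimates.append((mean, half_width))

# A school is "distinguishable" from the national average (zero) only if
# its confidence interval excludes zero
distinguishable = sum(1 for mean, hw in estimates if abs(mean) > hw)
print(f"{distinguishable} of {N_SCHOOLS} schools distinguishable from average")
```

With variance components of this order, most intervals straddle the average, so fine-grained rankings of the remaining schools carry very little information.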

3.   How should these performance measurements be reported and by whom? To whom should this information be made available?

  3.1  The provision and reporting of performance measures, the timing of that reporting (ie before or after the measures are discussed within schools), and whether or not they are placed in the public domain should all reflect the purposes to which the measures will be put: to provide feedback for the school, to support parental choice, or some other purpose.

3.2  Recent studies[12] show that the existing presentations are inappropriate for parental choice, and that a more realistic presentation would reveal even greater uncertainty in the "value added" rankings, to the extent that very few schools could reliably be distinguished from one another.

  3.3  It has often been suggested that not to report "raw" test and exam scores, and/or value-added ones, would be to deny the public information to which they have a right. While this might, on initial consideration, seem plausible, a close examination reveals its flaws. As we have pointed out, a full and honest description of the results requires expressions of uncertainty, and these would show that in fact league table rankings have very little discriminatory power.

  3.4  All published materials should recognise the provisional nature of any judgements, the statistical uncertainties and above all the factors, such as pupil deprivation, that are needed to place what is happening in context. We believe that a performance ranking should be treated as a screening device that provides preliminary evidence for possible "problems" or outstanding achievement in some institutions that can then be followed up in more detail, eg through an inspection system.

  3.5  We support a "private accountability" system. By "private" we mean a system that provides direct feedback to the schools themselves and to the governing authority, with the aim of correcting weaknesses and building on strengths. Information would be published only at the end of that process of discussion, in a form that allows a measure of accountability while providing a report that takes account of different viewpoints and explanations.

  3.6  We believe that the requirements of "accountability" can be fully achieved in this way without the need to publicise the league tables themselves. A "private accountability" system[13] would not only lead to more sensitive and more efficient decisions, it would also avoid the (usually deliberate) political distortion of performance reporting and the perverse incentives of the current "name and shame" regime, as described eg in Performance Indicators: Good, Bad and Ugly, the RSS Working Party report on Performance Monitoring in the Public Services.

  NB Other education systems take a different view about the publication of league tables: Scotland, Wales and Northern Ireland do not publish them, nor do some parts of Australia. Those decisions take into account the research evidence and the kind of arguments that we have outlined above. In our view, it is the English system that is out of line in continuing, until now, to ignore such evidence.

4.   What is the effect of the current system of public performance reporting (Achievement and Attainment Tables www.dcsf.gov.uk/performancetables/, and the online School Profile schoolsfinder.direct.gov.uk) on a school's performance, including confidence, creativity and innovation?

  4.1  A good future starting point would be for reporting on a school's performance, and subsequent inspections, to be aimed principally at supporting teachers and schools rather than at identifying failure. Every child needs to attend a good school, so a key purpose of the accountability system should be to identify what schools need to improve and what support (if any) they require. As we have said, we believe that this is best achieved through a private accountability system.

4.2  From a statistician's perspective, we would like to emphasise the damage that adherence to simple (adjusted or unadjusted) measurements can do within schools. Many (perhaps most) of the misinterpretations stem from a failure on the part of educational managers to understand the variation that lies beneath simple summaries. Measurements that may be useful indicators of trends over time, or of the progress of an age cohort, can be over-interpreted at the level of the individual pupil or school. The need for confidence intervals or error bounds in reported measurements is key. Spurious precision in baseline measurements can result in iniquitous pressure being put on pupils (in target setting) and on teachers (in analysing their examination results). The use of threshold measures, eg five GCSEs at grades A*-C including English and Mathematics (5A*-C (EM)), also creates perverse incentives, leading schools to focus their efforts on the small group of students whose results "make the difference".
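The perverse incentive created by a threshold measure can be made concrete with a small worked example (the grade points and pupil profiles below are invented purely for illustration). Concentrating a fixed amount of improvement on the three pupils just below the C threshold lifts the headline figure sharply, while spreading the same total improvement across the weakest pupils leaves it unchanged:

```python
# Hypothetical grade points: A* = 8 down to G = 1, U = 0; "good pass" = grade C = 5
THRESHOLD = 5

def pct_meeting_threshold(pupils):
    """Percentage of pupils whose average grade point meets the C threshold."""
    return 100 * sum(1 for g in pupils if g >= THRESHOLD) / len(pupils)

# Average grade points for ten pupils before any intervention
before = [8, 7, 6, 5, 4.9, 4.8, 4.7, 3, 2, 1]

# Scenario A: support targeted only at the three borderline (just-below-C) pupils
borderline_boost = [8, 7, 6, 5, 5.0, 5.0, 5.0, 3, 2, 1]

# Scenario B: the same total improvement (0.6 grade points) spread over the weakest pupils
broad_boost = [8, 7, 6, 5, 4.9, 4.8, 4.7, 3.2, 2.2, 1.2]

print(pct_meeting_threshold(before))            # 40.0
print(pct_meeting_threshold(borderline_boost))  # 70.0: headline jumps
print(pct_meeting_threshold(broad_boost))       # 40.0: headline unchanged
```

Both scenarios represent the same total gain in attainment, yet only the first is rewarded by the headline threshold measure.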

  4.3  Relatedly, school inspections rely very heavily on the same statistics as are used in performance measurement, and these statistics are interpreted by inspectors who, in many cases, have very limited statistical expertise. The guidance on interpreting statistics that is given to inspectors is inadequate. Since so much emphasis is placed on the interpretation of statistics, inspection systems should have a competent statistician on every team.

5.   What is the impact on schools of league tables published by the press?

  5.1  A school's position in a league table can have a huge impact on the ease with which it recruits staff and, of course, on the school's self-esteem and "reputation". Moving down a league table can have an immediate effect on perceptions of the school's ability to educate, and can lead, unfairly, to reduced student numbers as parents opt for other schools.

5.2  Many parents (and teachers) use the published statistics without reference to, or any understanding of, the uncertainty built into the league tables, so there is pressure on schools to do whatever it takes to improve the published figures. At the very least, the effect is that, in an effort not to miss targets, schools increasingly teach to the test rather than teach for understanding.

6.   How useful is this information to stakeholders, particularly parents?

  6.1  There is a strongly held view that, in many parts of the country, parents actually have very little choice as to which school to send their children to, and that they are swayed less by league tables than by qualitative factors such as pastoral care (eg school policy on bullying), extra-curricular opportunities and the accessibility of teachers.[14] League tables in their present form, and as they are currently reported, lead to misinformation and misunderstanding and are of little practical use in relation to, for example, school choice.

6.2  What is currently made available is only part of the story, and the full story needs to be told if an honest picture is to be presented. If this point were well understood by the public, then we do not think that there would be a great deal of support for publishing the tables. We believe that Government has so far failed to take responsibility for making this issue well understood by the media and the general public, and we would welcome a change of policy in this respect. The Royal Statistical Society would, of course, be happy to lend its expertise.

7.   School Report Card

  7.1  We would like to emphasise that the main technical issues that we have discussed above are relevant to any proposed report card, ie:

    (i) Uncertainty. As we have pointed out in connection with the reliability of inspection reports, all measures, whether made at student, teacher or school level, have a component of measurement error. Statisticians study these issues, and statistical input into the measurement and presentation of such uncertainty is essential.

    (ii) Adjustments. As in the case of value-added measures, it is important to take account of students' prior dispositions, well-being, behaviour and achievement when using student measurements to make comparisons among schools. Again, statisticians have studied ways of making such adjustments that are efficient and reliable.

    (iii) We have already referred to the fact that student performance is used as a proxy for the quality of teaching in a school. For measures such as well-being and others on the proposed report card, their proxy nature is even more pronounced. This implies that any attempts to use these for school accountability purposes should be viewed with even more care, and indeed scepticism, than test and exam scores.

    (iv) We would urge caution over any attempts to combine measures of achievement with those on the report card, into a single indicator at the school level.

    (v) If it is decided to go ahead with some form of report card, we consider it essential not only that the above issues are fully addressed but also that a proper pilot study is conducted and evaluated.

February 2009


8   Not printed. Performance Indicators: Good, Bad and Ugly, report of the RSS Working Party on Performance Monitoring in the Public Services. Downloadable from http://www.rss.org.uk/main.asp?page=1713

9   Report to the Expert Group on Assessment by Mathematics in Education and Industry: http://www.mei.org.uk/files/pdf/Expert_Group_on_Assessment_(MEI_comments).pdf

10   Eg see H Goldstein, S Burgess and B McConnell (2007). "Modelling the impact of pupil mobility on school differences in educational achievement." Journal of the Royal Statistical Society, Series A, 170: 941-954.

11   It is not clear why the DCSF does not think it necessary to provide uncertainty estimates for "unadjusted" rankings (although we do not, in any case, believe that these should be provided in the context of performance monitoring).

12   H Goldstein and G Leckie (2008). "School league tables: what can they really tell us?" Significance, June 2008: 67-69.

13   An example of such a "private" accountability system is given by H Goldstein (2001). "Using pupil performance data for judging schools and teachers: scope and limitations." British Educational Research Journal 27: 433-442.

14   This has been documented by Kirkland Rowell, the biggest provider of school surveys.


© Parliamentary copyright 2010
Prepared 7 January 2010