Appendix 2
Ofsted's response to the Third Report from the
Children, Schools and Families Committee, Session 2007-08
Your Committee published its report on Testing and
Assessment on Tuesday 13 May, which was followed by Ofsted's oral
evidence session on Wednesday 14 May. As promised, I now provide
some relevant evidence on this issue together with a written response
to those recommendations from the Report which make reference
to Ofsted.
Summary
Solid evidence of performance through test and exam
results, particularly in English and maths, is essential to learners'
future access to employment. Proficiency in these subjects is
also vital for access to a wider curriculum. Inspection evidence
shows that the most successful schools focus on national testing
and assessment without reducing creativity in the curriculum.
My Annual Report for 2006/07 states that the overall
quality of the primary curriculum has improved, although specific
weaknesses are cited in relation to both the primary and the secondary
curriculum.
However, in some schools an emphasis on tests in
English, mathematics and science limits the range of work in these
subjects in particular year groups (often Years 6 and 9), as well
as more broadly across the curriculum in some primary schools.
For example, in Year 6 mathematics there are sometimes fewer opportunities
than in other years for practical work because of the emphasis
given to practising skills and techniques in preparation for national
Key Stage 2 tests. Similar issues arise with older learners: my
Annual Report for 2006/07 comments that skills for life training
offered by colleges, learndirect and other providers, "was
often too narrowly focused on simply passing a test, rather than
on the value of the learning process."
Evidence from survey work
As stated above, the best schools can focus on tests
and exams without narrowing the curriculum. However, this is not
always the case. My Annual Report for 2005/06 said that, "For
some pupils, however, the experience of English had become narrower
in certain years as teachers focused on tests and examinations;
this affected pupils' achievement in speaking and listening in
particular," and that, "Weaker teaching (in mathematics)
was too narrowly focused on proficiency in examination techniques
at the expense of building understanding of concepts and their
relationships."
More recent evidence suggests the continuance of
these trends. For example, in Year 6 mathematics there are fewer
opportunities than in other years for practical work because of
the emphasis given to practising skills and techniques in preparation
for national Key Stage 2 tests. Similarly, in some secondary schools, routine exercises and preparation for tests impair the development of understanding as well as enjoyment of mathematics, particularly but not exclusively in Year 9. The recently published poetry report also discusses the significant impact of tests on the teaching of poetry in English, particularly in Year 9. However, the best schools found ways to continue to teach poetry.
A 'teaching to the test' effect can be observed in
some schools at GCSE and A level as well. The report "Evaluating mathematics provision for 14-19 year olds" (HMI 2611), published in May 2006, reported that factors which acted against effective achievement, motivation and participation included:
"A narrow focus on meeting examination requirements
by 'teaching to the test', so that although students are able
to pass the examinations they are not able to apply their knowledge
independently to new contexts and they are not well prepared for
further study."
Similar issues arise with older learners: my Annual Report for 2006/07 comments that skills for life training offered by colleges, learndirect and other providers,
"was often too narrowly focused on simply passing a test,
rather than on the value of the learning process. Providers still
offered insufficiently individualised learning packages. These
concentrated on dealing with the gaps in learners' skills, rather
than laying the secure foundations needed to support them effectively
in employment and their personal lives."
The following are direct references to teaching to
the test in survey reports published since April 2007. There are
more oblique references in some other reports, for example to
a narrowing of the curriculum; however, these are examples of
the most direct references.
Poetry in schools: A survey of practice, 2006/07
(070034, December 2007)
The end-of-key-stage national tests and examinations
have had a significant impact on poetry in schools. Poetry featured
less in the English curriculum in Years 6 and 9 in the schools
visited because too many teachers focused on preparing pupils
for the tests.
The Key Stage 4 curriculum: Increased flexibility
and work-related learning
(070113, May 2007)
The Key Stage 4 curriculum was good in well over
half of the schools surveyed in the second year of the survey,
a more positive picture than in the previous year. Across the
two years of the survey, curriculum development in a small minority
of the schools visited was constrained by a perception that change
would not maximise success in public examinations. These schools offered a narrow curriculum with little or no access to vocational qualifications.
History in the balance: History in English schools
2003-07
(070043, July 2007)
History currently has a limited place in the curriculum.
In primary schools, this has been because of the necessary focus
on literacy and numeracy.
Geography in schools: changing practice
(070044, January 2008)
Achievement was slightly better in Key Stage 1 than
in Key Stage 2. Achievement in Year 6 was often very limited, and pupils in many schools studied little geography until the statutory tests had finished.
Evidence from school inspection reports
The issue has not been raised frequently in individual
school inspection reports. One of the most regular adverse references
has been to the squeezing out of activities such as problem-solving, and a small proportion of letters to learners makes direct reference
to learners' concerns about a lack of interest and variety in
the curriculum. This occurs more commonly in primary than in secondary
schools.
Responses to recommendations directly relating
to the work of Ofsted
RECOMMENDATION 2
The evidence we have received strongly favours
the view that national tests do not serve all of the purposes
for which they are, in fact, used. The fact that the results of these tests are used for so many purposes, with high stakes attached
to the outcomes, creates tensions in the system leading to undesirable
consequences, including distortion of the education experience
of many children. In addition, the data derived from the testing
system do not necessarily provide an accurate or complete picture
of the performance of schools and teachers, yet they are relied
upon by the Government, the QCA and Ofsted to make important decisions
affecting the education system in general and individual schools,
teachers and pupils in particular. In short, we consider that
the current national testing system is being applied to serve
too many purposes.
Ofsted does not rely on published test data alone
to provide a complete picture of the performance of pupils, teachers
and schools. When considering learners' achievement, inspectors
consider the attainment (or standards) of the learners, and in
doing so they make use of published test data. Inspectors also
form a view about the progress learners are making. This judgement
is based on a wide range of evidence including contextual value
added (CVA) data.
The evidence taken into account during an inspection
includes the school's self-assessment, covering a wide range of
judgements about the quality of provision and information about
the tracking of pupils' progress. In addition, first-hand observations
of pupils' current progress in developing their skills, knowledge
and understanding will always be a key part of the inspection
process. Inspectors will talk to learners about their experiences,
observe them at work and draw on the views expressed in the parents'
questionnaire.
The inspector's evaluation of the school is summarised
in the Overall Effectiveness judgement; this is most closely related
to the inspection judgement on learners' progress, rather than
the judgement on standards. Although the greatest focus in inspection
is on whether the school is helping young people to make good
progress, the system is sufficiently flexible to allow special
schools, where standards are invariably very low in comparison
to those found nationally, to be graded as outstanding where appropriate
because of the very good provision they make for their pupils.
In order to make a judgement about the overall effectiveness of
a school, inspectors will also consider factors other than achievement
and standards, such as the personal development and well-being
of pupils, the quality of teaching and learning, how well the
curriculum meets individual needs, and the effectiveness of the
leadership and management of the school.
We therefore believe that Ofsted makes appropriate
use of the national testing system in its evaluation of schools.
Data are sufficiently accurate for the school-level purposes for which we use them, but the greatest emphasis is placed upon the most reliable indicators, such as progress from Key Stage 2 to Key Stage 4 rather than from Key Stage 2 to Key Stage 3.
We believe that using national test and exam data
for several purposes is a strength. One of the core principles
for efficient use of data in government should be 'collect once,
use more than once', and this is the case. We believe that the
use of test and exam data in inspection is appropriate. Set within
the context of the inspection methodology described above, we
believe the data are extremely helpful in evaluating schools'
effectiveness.
RECOMMENDATION 9
We are concerned about the underlying assumptions
on which Contextualised Value Added scores are based. Whilst it
may be true that the sub-groups adjusted for in the Contextualised
Value Added measure may statistically perform less well than other
sub-groups, we do not consider that it should be accepted that
they will always perform less well than others.
The Contextualised Value Added (CVA) data are a key
aspect of the RAISEonline data package, which is jointly managed
by Ofsted and the DCSF and provided to all maintained schools
for the purpose of self-evaluation. The package is also used by Ofsted inspectors during inspections. The weightings (coefficients) used in
the CVA model are recalculated each year to use the actual performance
data of the latest cohort of pupils. In this way they reflect
trends in performance of cohorts. The data should not be used
to predict the future performance of any particular group of learners
or to set pupils' targets. The CVA score of a pupil tells us how
s/he has performed compared to other pupils with similar characteristics
across the country. This serves to highlight where pupils have
performed much better, or much worse, than other similar pupils.
When aggregated up to the school level, this indicates where overall
performance is better, or worse, than in other schools with similar
intakes of pupils.
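By way of illustration only, the following sketch shows the general principle behind a value-added calculation of this kind: a pupil's score is the gap between the outcome actually achieved and the outcome expected of similar pupils nationally, and a school's score is the average of its pupils' gaps. The data, variable names and single-equation model below are hypothetical simplifications; the CVA model used in RAISEonline is a more elaborate national specification and should not be inferred from this sketch.

    # Simplified, hypothetical illustration of a value-added calculation.
    # The real CVA model is a national, multi-factor specification; the
    # data and variables here are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pupils = 1000

    # Hypothetical pupil-level data: prior attainment plus one contextual factor.
    prior_attainment = rng.normal(27.0, 4.0, n_pupils)   # e.g. earlier key stage points
    contextual_factor = rng.integers(0, 2, n_pupils)     # e.g. a binary context flag
    outcome = (1.5 * prior_attainment - 2.0 * contextual_factor
               + rng.normal(0.0, 5.0, n_pupils))

    # Fit a national model: the outcome expected given prior attainment and context.
    X = np.column_stack([np.ones(n_pupils), prior_attainment, contextual_factor])
    coefficients, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    expected = X @ coefficients

    # A pupil's value-added score is the gap between actual and expected outcome.
    pupil_va = outcome - expected

    # Averaging pupils' scores within each school indicates whether its pupils
    # did better or worse, on average, than similar pupils nationally.
    school_ids = rng.integers(0, 20, n_pupils)
    school_va = {int(s): float(pupil_va[school_ids == s].mean())
                 for s in np.unique(school_ids)}
    print(sorted(school_va.items(), key=lambda kv: kv[1], reverse=True)[:3])

Recalculating the coefficients against each year's national cohort, as described above, means that the expected outcomes always reflect the most recent actual performance rather than a fixed prediction for any group.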
It is a statistical fact that groups of pupils achieve,
on average, different standards. For example, the latest performance
data show that, on average, girls outperformed boys last year. However, this cannot be used to predict that girls will again outperform boys next year, and it should not be used to justify setting lower
targets for particular groups of pupils.
We therefore agree with the Select Committee that CVA scores should not be used by schools to determine expectations of particular sub-groups. They give an historic picture of
how pupils have performed, taking their prior attainment and circumstances
into account, in order to inform an evaluation of the school's
performance. CVA is a very powerful tool for this purpose and
allows us to compare the work of schools in similar contexts,
but it should not be used to set targets or expectations for individual
learners or groups into the future.
RECOMMENDATION 10
In addition to these specific recommendations
about Contextual Value Added scores, we recommend that the Government
rethinks the way it publishes the information presented in the
Achievement and Attainment Tables generally. We believe that this
information should be presented in a more accessible manner so
that parents and others can make a holistic evaluation of a school
more easily. In addition, there should be a statement with the
Achievement and Attainment Tables that they should not be read
in isolation, but in conjunction with the relevant Ofsted report
in order to get a more rounded view of a school's performance, and a link to the Ofsted site should be provided.
The Achievement and Attainment Tables now contain a wealth of detailed information, and we have become aware that, on occasions, the information in them has been misunderstood by local newspapers. A link to the Ofsted report would be helpful.
A balance needs to be found between providing enough
information to be useful and not so much detail that it becomes
confusing.
RECOMMENDATION 11
The scope of this inquiry does not extend to a
thorough examination of the way Ofsted uses data from the performance
tables under the new, lighter touch, inspection regime. However,
we would be concerned if Ofsted were, in fact, using test result
data as primary inspection evidence in a disproportionate manner
because of our view that national test data are evidence only
of a very limited amount of the important and wide-ranging work
that schools do.
National test data contribute to judgements on standards and progress, and on how effectively schools are using targets
to raise attainment for all learners. The use of CVA data means
that it is perfectly possible for a school operating in challenging
circumstances, with attainment on entry much below average, to
achieve a good inspection report because there is clear evidence
that learners are making better progress than is typical. Although
the CVA data provide an important piece of evidence for this judgement,
inspectors are asked to verify it by looking at the school's own assessment systems, the impact of teaching on progress and actual progress in lessons, and through discussions with learners. Data help
to inform the agenda for the inspection, and to set some parameters,
but they do not determine the final inspection grade.
Schools with low standards may therefore be graded good or, on rare occasions, even outstanding if the provision has enabled pupils to make good or outstanding progress compared with their starting points on entry. This is the case in those special schools, for example, where learners' attainment is well below national expectations but they nonetheless make very good progress due to the quality of education and care provided.
Conversely, schools in advantaged areas with high attainment on
entry may be judged inadequate if inspectors judge that pupils
make insufficient progress.
Nonetheless, we think it is right that standards
in national tests and examinations should continue to be prominently
reported in inspections. This is because success in tests and
exams is fundamental to the future life chances of young learners, and this is especially true in areas where low standards in education have prevailed for a number of years.
Currently, very good data are available to inspectors
about standards and progress through RAISEonline and other
data packages. We are aware, though, that the same is not true
of all the Every Child Matters outcomes. That is why we
are working to improve the range of data available for the next
round of inspections in 2009.
The proportion of complaints about school inspections which concern the way inspectors use data has fallen steadily over the last two years and now accounts for fewer than one in 10 complaints. This is in the context of about 4% of school inspections
giving rise to a complaint.
RECOMMENDATION 12
We consider that schools are being held accountable
for only a very narrow part of their essential activities and
we recommend that the Government reforms the performance tables
to include a wider range of measures, including those from the
recent Ofsted report.
Performance tables, by their nature, reflect a narrow, if very important, part of a school's work. Inspection judgements
are made across a wide range of a school's provision (including
teaching, curriculum, care, guidance and support) and the outcomes
for its pupils (including their standards and achievement, personal
development and well-being, and the Every Child Matters
outcomes). Certainly, including the school's overall effectiveness
grade from its last inspection report would help to provide a
wider perspective. However, inclusion of this information could
be misleading if the most recent inspection is several years out
of date. This risk would be lessened if the date of the inspection
were included.
I hope that this response is useful and I would be happy to discuss the subject further. Please be in touch if you have specific concerns, or if you would like to meet to discuss this issue.
Christine Gilbert, Her Majesty's Chief Inspector
of Education, Children's Services and Skills