Memorandum submitted by Cambridge Assessment
A BETTER ACCOUNTABILITY
Should Ofqual be allocated responsibility
for performance (achievement and attainment) table ratings and
equivalences?
About Cambridge Assessment
Cambridge Assessment is a department of the University
of Cambridge, and a not-for-profit organisation. Established in
1858, we are experts in assessment and are Europe's largest assessment
agency. Cambridge Assessment incorporates three exam boards, which
develop and deliver qualifications and tests for learners of all
ages across the full range of subjects, and the largest research
capability of its kind in the world.
In the UK, OCR (Oxford Cambridge and RSA) is one of
the country's leading and most respected regulated awarding bodies,
with over 13,000 schools, colleges, workplaces and other
institutions using its qualifications.
The qualifications offered by CIE (University of
Cambridge International Examinations) are recognised by universities,
education providers and employers in 150 countries. CIE qualifications
are created with an international audience in mind, making them
interesting, valuable and relevant for students around the world.
Cambridge ESOL (English for Speakers of Other
Languages) exams are the world's leading certificates for English
language learners. They are recognised and supported by universities,
employers, government agencies, immigration authorities and professional
bodies in many countries. Over two million people in 135 countries
sit them every year.
The use of qualifications data for achievement
and attainment purposes
Cambridge Assessment's awarding bodies design
and award qualifications for the purposes of recognising the achievement
of individual students. Cambridge Assessment does not believe that
data produced for these purposes are very useful for measuring
the performance of schools, as performance tables seek to do.
However, even an imperfect measure may be improved
by bringing the calculations involved in its creation into the
public domain. Proper publication of the criteria, with the opportunity
for popular, academic and statistical debate, would shine a light
on this very grey area with its huge impact on schools and colleges.
Transferring this responsibility from the DCSF to the new regulator would also remove
yet another area of suspicion from public debate, thereby increasing
public confidence.
Accumulating evidence of serious issues
There is accumulating evidence of serious structural
problems within current arrangements for compiling and managing
performance tables, recently renamed achievement and attainment
tables:
qualifications which are deemed to be
equivalent but which clearly have different societal status and
currency for progression;
schools optimising performance table position
by migrating to qualifications of lesser educational merit or
currency for progression, in which students are likely to
gain a higher grade than in notionally equivalent qualifications
of higher educational merit or currency for progression;
a divide opening up between the independent
school sector and the state sector due to the independent schools'
continuing adoption of qualifications not recognised in performance
tables, but which are highly regarded for progression purposes;
implicit suppression of qualifications
which are "different" from those already recognised
in performance tables at specific levels, reducing the capacity
of the education and training system to respond to the needs of
learners, particularly those less engaged with learning, and
to changing societal and economic requirements;
An important example of this is the two-decade
controversy over the failure to separate English Language and
English Literature. Although it is vital for school children to
engage with the study of English Literature, the combination of
Language and Literature into a single examination has compromised
adults' access to GCSE English. In the 1970s over 20,000 adults
per annum accessed Level 2 English Language through open
centres, since they desired a vital labour market qualification
which would also materially help them in their lives and work.
The inclusion of Literature and the move to coursework impacted
severely and adversely on this access. This is a serious failing of the
system: the needs of adults who wish to obtain Level 2
qualifications which are regarded as essential in the labour market
have not been met. This has social and economic consequences as
well as an impact on individuals.
Alongside this, the more recent divisive
debate over IGCSE has undermined public and international
confidence in the standing of UK qualifications.
There has also been a tendency for vocational qualifications to
be forced into alignment with academic qualifications, thus reducing
their utility (and uptake) amongst the learners for whom they
were originally designed.
anomalies in funding arrangements due
to funding being linked to notional "size" and "level"
(as determined by the rules for locating qualifications within
performance tables) rather than the genuine, specific resource
requirements of the awards and their related learning programmes;
and
extended, inefficient processes for approval
of qualifications, due to the increasingly detailed and complex
requirements for locating new qualifications within the existing
suites of recognised awards.
Current arrangements for rating qualifications
for inclusion in performance tables
Since the late 1990s, a complex process has
underpinned the rating of qualifications for performance tables, a
process of which few are aware. It contains substantial elements
of judgement; these are not subject to coherent regulation or
scrutiny.
A team in the DCSF compiles performance tables based on a flow
of data from schools. These data are conditioned by ratings for
qualifications, which are allocated by a very small team in QCA.
The decisions of this team are crucial, since they determine which
qualification is equivalent to another: the "rating" of the
qualification in the performance tables.
Ratings are awarded on the advice of officers in QCA and DCSF, on
the basis of "fit" and the avoidance of anomalies. The team
does not undertake extensive empirical work on the consequences
of ratings or on institutional behaviour in the light of the performance
tables; it is heavily driven by the "internal logic"
of previous decisions and allocations.
Significant judgements are made regarding
the equivalence of contrasting grade structures within different
qualifications (eg one qualification being graded Pass/Merit/Distinction;
another having eight grades, A*-G; another with five grades; etc).
Such decisions are of great consequence in terms of the standing
of different qualifications. The DCSF is wholly dependent on the
work of this team. If their work is not completed to schedule,
the performance tables cannot be compiled. This work, currently
done in QCA, will pass to QCDA.
The concerns in respect of this are: the lack
of transparency in the process; the fragility of arrangements
and complex dependencies between DCSF and QCDA; the tendency for
the process to be driven by internal logic rather than an understanding
and analysis of its consequences for learners and schools.
Focusing on the appropriate "unit of interest",
in order to improve learner attainment
Performance tables are driven by an assumption
that to improve individual pupil learning, the school is the correct
level at which to measure performance and to apply incentives
and pressure for improvement. At a recent Cambridge/Nuffield/NFER
seminar, the view of leading analysts was that classroom interaction,
the level of the teacher rather than the school, is the critical
level in the system on which to focus.
Performance tables impact principally on school-level
behaviours, which include "game playing" in terms of
qualifications choice. It is not at all clear that performance
tables have impacted beneficially on interaction in the classroom;
indeed, there is evidence that more superficial learning approaches
have been adopted in a misguided attempt to maximise examination
performance.
It is vital to note that the accountability
process which once focused principally on the quality of teaching,
namely formal inspection, has now been deflected towards school-level performance
as expressed through attainment of qualifications and through
national assessment results. In particular, in order to focus
on the quality of educational provision, we would suggest that
inspection be re-oriented towards classroom-level observation
and review, and towards pupil-teacher interaction.
Unnecessary pressure on standards
Awarding bodies are acutely conscious of the
full range of factors which exert upward or downward pressure
on examination standards, and use a range of mechanisms for standards
maintenance and monitoring. Performance tables exert a strong
downward pull on the system: schools actively "game
play" in order to find the easiest route to higher qualification
outcomes. This results in wasted time and resources, at all levels
of the system, in respect of standards monitoring and maintenance.
The current approach of assuming that all subjects
are, and should be, at the same level of demand compounds
the problem (this is not an assumption which is made in Australia,
where the HE admissions process weights different subjects differently).
Reducing or removing the downward pressure that emanates from
performance tables would be highly desirable, both from a technical
point of view and in terms of general public confidence in examinations.
Control of the ratings and equivalences
We suggest that the control of the ratings and
equivalences processes (which lie at the heart of performance
tables) be allocated to Ofqual. The impact of performance tables,
in terms of the full range of artefacts and unintended consequences
as well as the desirable outcomes, requires far more attention
than it is given at present. Ofqual could undertake this work
and, at the same time, institute more sensitive approaches to
issues such as differences in demand between subjects.
Far greater sophistication and transparency are necessary
in respect of performance tables, alongside recognition that focusing
on school performance may not be the correct approach (as we outline
above).
Cambridge Assessment believes that when data
are to be used in the public arena, this should be done in as
transparent a way as possible.
March 2009