

9  Service: question papers and marking

171. The quality of service offered by exam boards is a broad area, extending to logistical elements of service delivery and the administration of exams in schools. We have focused on three areas of service that featured most prominently in evidence to us: question paper errors, marking reliability and online standardisation.

Question paper errors in summer 2011

172. Public confidence in the exam system was shaken by 12 errors on GCSE and A level question papers in summer 2011. Ofqual's investigation into the errors exposed issues relating to exam boards' question paper setting procedures and checks, an area of the system largely untouched by recent innovation. Examiners told us how the system relies very heavily on a few key people. This was confirmed by the Ofqual report, which concluded that, unlike marking, question paper setting procedures have "remained virtually unchanged for years, if not decades" and recommended that exam boards "look afresh at the process".[284] Ofqual's report made no suggestion that exam boards were cutting corners as a result of competition or that having multiple exam boards contributed to the errors in summer 2011. Indeed, the impact of the errors was reduced by having multiple exam boards, as fewer candidates were affected by each error. We welcome the findings of Ofqual's investigation into the errors in summer 2011. It is vital that Ofqual acts swiftly and robustly (including, where appropriate, using its power to fine) in the event of errors in order to protect the integrity of the system and the interests of young people.

Reliability of marking

173. Exam boards told us that the quality and reliability of marking have improved in recent years. Online marking is generally credited with improving the reliability of marking, and there is research evidence to support this.[285] It has also helped to improve examiner standardisation and the monitoring and quality assurance procedures associated with the marking process.

174. Yet, as the exam boards and assessment researchers acknowledge, concerns persist among teachers and the general public about the reliability of marking.[286] Ofqual's most recent public perceptions survey found a negative shift in teachers' opinion of the reliability of GCSE grading, with fewer teachers reporting that their students achieved the right grade (77%, compared with 86% the previous year) and more teachers saying that about a quarter of their pupils got the wrong grade (20%, up from 11%). The most commonly reported concern among teachers about the A level system was incorrect marking and grading.[287] Barnaby Lenon, Chairman of the Independent Schools Council (ISC) and recently appointed to the Board of Ofqual, voiced his concerns publicly earlier this year, saying that independent schools are "anxious that there should be greater consistency between and within boards in relation to marking and grading".[288]

175. Enquiries about results and the number of resulting grade changes have increased in recent years.[289] In 2011, enquiries about results were up 38% on the previous year and the number of grade changes increased by 11%. Grade changes represented 0.45% of the total GCSE awards made and 0.48% of the total A level awards. This was a "statistically significant" increase on the previous year in the number of grade changes at GCSE.[290] The increase may be linked to the introduction of new GCSEs in summer 2011, an illustration of the destabilising impact of change on the system. According to Ofqual's latest perceptions survey, 42% of teachers said that they had to rely on challenging initial results (enquiries about results services) to get accurate results for their students.[291] On the other hand, researchers at AQA's CERP suggest that "the trend of increased enquiries about results reflects not a reduction in marking reliability, but an increase in the high-stakes nature of general qualifications".[292]

176. Ofqual has acknowledged that marking is an area that is "significantly undermining confidence".[293] It has recently announced a programme of work to review current arrangements for the marking of GCSEs and A levels.[294] Glenys Stacey suggested to us that there may be issues not so much with marking processes as with the way schools are treated by exam boards when questioning marking or grades. AQA's Andrew Hall acknowledged this, saying "ideally I would love the quality of service to be the same between each of our subjects [...] hand on heart we are better in some parts of our organisation than others [...] I think others would be the same".[295] School leaders also complained about this aspect of the process, with headteacher Robert Pritchard telling us that "the response is slow and the machine is so big".[296] Ofqual has said that it "will be working with awarding bodies to agree a common approach to the service that anyone would expect when they raise a concern about marking. We wish to promote a much greater consistency and transparency about that".[297]

177. Public confidence in the exam system is undermined significantly by recurring crises, such as the summer 2011 errors, and by allegations of improper conduct by exam boards in relation to marking and grading. A recent example is the allegation that one exam board failed to investigate the full extent of errors in the calculation of candidates' marks in the summer 2011 exams, potentially leading to candidates being awarded the wrong grades.[298] Society places considerable trust in the ability of exam boards to ensure that the results achieved by young people are an accurate and fair reflection of their attainment. Ofqual must investigate allegations of improper conduct by exam boards thoroughly, taking vigorous action if necessary, to ensure that candidates are awarded the grades they deserve and to protect the integrity of the exam system.

178. We recognise that some gap between exam boards' view of the reliability of marking and the public perception is inevitable, and we accept Andrew Hall's point that examinations have become increasingly high stakes and so attract more challenge.[299] We also note the point made to us by Dr Tony Gardner that "examining is a craft rather than a science: examination results are never wholly reliable".[300] OCR stated that "there is a philosophical point about how far we seek to design papers which elicit absolute reliability from examiners [...] mechanistic assessment may be accurate but it does not encourage deep learning".[301] Assessment researchers point out that absolute marking reliability would only be achievable using multiple choice tests, but this would limit the assessment of the full range of knowledge and skills required at GCSE and A level (for example, essay writing skills).[302] A degree of marking unreliability is therefore the price to be paid, although this is not necessarily palatable to politicians or the public. We welcome Ofqual's work to agree a common approach across exam boards to deal with concerns about marking and to ensure students are treated fairly across the system.

Online standardisation

179. The most consistent message that emerged from our consultation with examiners was a dislike of online standardisation, whereby examiners' marking is standardised at the beginning of the marking process via an online session rather than a face-to-face meeting.[303] The objection seemed particularly strong in essay-based subjects, where there is more room for interpretation of the mark scheme and examiners felt that the opportunity for face-to-face discussion was especially valuable. The issue was also raised by examiners who submitted formal written evidence, with examiner Richard Nixon telling us "having done all three types available to examiners in the last 18 months the former two (chatroom/online and online only) saves Edexcel lots of money in teacher release fees, travel costs and hotel bookings but not sure that it is the best way to prepare examiners for marking papers".[304]

180. AQA's Andrew Hall defended online standardisation, saying that "the research evidence is absolutely clear that this makes for better quality of marking [...] and the students getting the right results".[305] We looked at the research cited by the exam boards, which involved GCSE History examiners.[306] The study found that "online standardisation was as effective as face-to-face standardisation", with examiners demonstrating "a similar level of accuracy and consistency in their marking post-training".[307] The researchers commented that "gaining the acceptance of the users of new systems can be the most challenging aspect of innovation" and that "some examiners were concerned about a potential loss of their community of practice".[308]

181. We accept that there is some research evidence to show that online standardisation is as effective as (but, if our reading of the research is correct, not necessarily more effective than) face-to-face standardisation. We can also see that it brings other benefits, such as reduced costs, an accelerated marking process and real-time monitoring of marking. We believe, however, that exam boards should continue to monitor the effectiveness of online standardisation and should consider offering opportunities for face-to-face discussion between examiners.




284   Inquiry into examination errors summer 2011 final report, Ofqual, 2011

285   Ev 193, paragraph 3.3 and Ev 116, paragraph 6.5 cite the following research: Fowles, D. (2005). Literature review on effects on assessment of e-marking. AQA Internal Report. Pinot de Moira, A. (2009). Marking reliability & mark tolerances: Deriving business rules for the CMI+ marking of long answer questions. AQA report. Taylor, R. (2007). The impact of e-marking on enquiries after results. AQA Internal Report. Whitehouse, C. (2010). Reliability of on-screen marking of essays. AQA report.

286   Ev 116, paragraph 6.1; Ev 193, paragraph 3.2

287   Perceptions of A levels, GCSEs and other qualifications: Wave 10, Ofqual, 2012

288   "Top private schools head support multiple choice tests, insisting they are harder than short written questions", Daily Mail, 2 January 2012

289   Statistical Bulletin, Enquiries About Results for GCSE and GCE: Summer 2011 Examination Series, Ofqual, 2011

290   Ibid.

291   Perceptions of A levels, GCSEs and other qualifications: Wave 10, Ofqual, 2012

292   Ev 193, paragraph 3.2

293   Q300 and Q305 Glenys Stacey

294   Ofqual Corporate Plan 2012-15, p9 and p15

295   Q524

296   Q61 Robert Pritchard

297   Q305

298   http://www.channel4.com/news/exams-whistleblower-thousands-of-papers-could-be-wrong (Sunday 20 May 2012) and http://www.channel4.com/news/whistleblower-suspended-for-revealing-exam-mark-mistakes (Thursday 17 May 2012); see also Ev w51 (David Leitch)

299   Q549

300   Ev w47; see also Ev w38, paragraph 8 and Ev w59, paragraph 6

301   Ev 134, paragraph 12

302   Ev 194, paragraph 4.1

303   See annexes 1 and 2

304   Ev w119; see also Ev w84, Ev w116, paragraph 10

305   Q548

306   Chamberlain, S. and Taylor, R., "Online or face-to-face? An experimental study of examiner training", British Journal of Educational Technology, vol. 42, no. 4, 2011, pp. 665-675

307   Ibid.

308   Ibid.


 

