Getting the grades they’ve earned: Covid-19: the cancellation of exams and ‘calculated’ grades

2 Is the system fair?


4.On 31 March 2020, following the announcement that exams were cancelled, the Secretary of State for Education issued a direction instructing Ofqual, the exams and qualifications regulator, to develop a process for calculating GCSE, AS and A level grades which would ensure that “qualification standards are maintained and the distribution of grades follows a similar profile to that in previous years.”2 A further direction was issued on 9 April 2020, instructing Ofqual to develop suitable arrangements for awarding vocational and technical qualifications. We agree with the Department’s assessment that the complexity and diversity of these qualifications means that a one-size-fits-all approach is not appropriate.3

5.We consider exams to be the fairest form of assessment, and any alternative will inevitably be an imperfect replacement. Ofqual has stepped up to the immense challenge of devising these exceptional arrangements, and has engaged in numerous consultations with the sector.4 We note too that Ofqual has developed this year’s extraordinary regulatory frameworks within the parameters set out by the Department’s directions.

The calculated grade system

6.Due to the Covid-19 pandemic, the majority of national exams have been cancelled and learners will instead receive a calculated result. Certain vocational and technical qualifications—such as those required for occupational competence—will be awarded through adapted or delayed assessments.

7.For cancelled exams, including AS, A levels and GCSEs, and eligible vocational and technical qualifications, Ofqual has instructed teachers to make a judgement of the grade their pupils would have been likely to be awarded in summer 2020 exams.5 To do this, teachers will draw on evidence such as school or college records, mock exams and non-examination assessment that a pupil has completed. Teachers have also been asked to rank each pupil relative to others who they judge would have been awarded the same grade. These grades, termed ‘centre-assessed grades’, and rankings, go through a multi-level sign-off process, including sign-off from the Head of Centre. To ensure that grading is fair nationally, exam boards then apply a standardisation model devised by Ofqual, which draws on evidence including:

historical outcomes for each centre; the prior attainment (Key Stage 2 or GCSE) of this year’s students and those in previous years within each centre; and the expected national grade distribution for the subject given the prior attainment of the national entry.6

The standardisation model is intended to account for under or over-predictions from teachers, and grades may be adjusted accordingly.7
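Ofqual had not published the detail of its standardisation model at the time of this report, so the following is a purely illustrative sketch, not a description of the actual model: it shows one way a rank-order standardisation could work, in which pupils keep their teacher-assigned ranking and final grades are reallocated to match a target distribution (for example, one derived from a centre’s historical outcomes). The function name, grade scale and distribution are hypothetical.

```python
# Purely illustrative sketch, not Ofqual's published model (which had not
# been released at the time of this report). Pupils keep their teacher-
# assigned rank; final grades are reallocated to match a target
# distribution, e.g. one derived from the centre's historical outcomes.

def standardise(ranked_pupils, target_distribution):
    """ranked_pupils: pupil identifiers, strongest candidate first.
    target_distribution: maps numeric GCSE grade -> expected share of cohort."""
    n = len(ranked_pupils)
    results = {}
    position = 0
    # Walk grades from highest to lowest, filling each grade's quota by rank.
    for grade, share in sorted(target_distribution.items(), reverse=True):
        quota = round(share * n)
        for pupil in ranked_pupils[position:position + quota]:
            results[pupil] = grade
        position += quota
    # Any rounding remainder falls into the lowest grade in the distribution.
    for pupil in ranked_pupils[position:]:
        results[pupil] = min(target_distribution)
    return results
```

In a scheme of this kind, once the target distribution is fixed it is the teacher’s ranking, rather than the centre-assessed grade itself, that determines each pupil’s final result, which is why concerns about bias apply to rankings as much as to the grades themselves.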

8.Students will receive their exam results on the dates originally planned (13 August for A levels, and 20 August for GCSEs). The grades received will be indistinguishable from grades awarded in previous years, and students will be able to use them for progression to higher education, further education, and employment as normal.

The potential for bias

9.We asked Ofqual about the potential for bias in centre-assessed grades. Michelle Meadows, Ofqual’s Deputy Chief Regulator and Executive Director for Strategy, Risk and Research, told us that:

There is some evidence of bias. For example, in A-level grades, there is evidence that bias with regard to ethnic minorities interacts with the ability of the students. For the most able students, there tends to be under-prediction of the grades that students go on to get. At lower levels of ability, you get the reverse effect where there is some over-prediction.8

10.Numerous written submissions we received highlighted the abundance of academic research evidence on bias in predicted grades. These submissions outlined how particular groups, including pupils from low-income backgrounds, black, Asian and minority ethnic (BAME) pupils, and pupils with special educational needs and disabilities (SEND), could be adversely affected by this year’s grade awarding process.9

11.Acknowledging the potential for bias in calculated grades is not a comment on teachers and their professional judgement. Teachers are some of the unsung heroes of the Covid-19 pandemic, and they are doing their very best in exceptional circumstances to provide fair grades for their pupils. Rather, as Lee Elliot Major, Professor of Social Mobility at the University of Exeter, told us, “There is lots of evidence around human bias in assessment. It is nothing to do with teachers [ … ] We are all prone to bias”.10 As the 2020 extraordinary regulatory framework is an entirely new system, evidence on outcomes and fairness is not yet available. However, we believe it is reasonable to remain aware that the potential for human bias in predicted grades may be replicated in the calculated grade system.11 We note that teachers and support staff themselves appear sceptical of the fairness of this year’s system of awarding grades. A Tes survey of around 19,000 school staff, conducted in May 2020, found that just 39 per cent of respondents in England think the system will be fair to all.12

12.Research into grade prediction accuracy for university applicants has found that just 16% of applicants received the grades they were predicted.13 Pupils from low-income families are more likely to have their grades incorrectly predicted compared to their more affluent peers.14 In particular, high-attaining disadvantaged pupils are more likely to be underpredicted compared to those from more affluent backgrounds. Research by The Sutton Trust concluded that “around 1,000 high-achieving disadvantaged students have their grades underpredicted per year”.15 We asked witnesses which groups they thought would be negatively affected by centre-assessed grades. Kevin Courtney, joint General Secretary of the National Education Union, highlighted disadvantage and race as factors that can increase the likelihood of bias.16 Lee Elliot Major told us that “unintentionally, teachers will sometimes underestimate the academic potential of poorer pupils”,17 pointing out that disadvantage and race are often intersectional, and can “combine to make it even worse for those children.”18 Zubaida Haque, Interim Director of race equality thinktank the Runnymede Trust, further highlighted “higher attaining working-class students—but also particular ethnic minority students and specifically black Caribbean boys, as well as Gypsy Roma and Irish Traveller students.”19

We also asked witnesses for their views on Ofqual’s contention that the evidence on the likelihood of bias is “mixed”.20 Zubaida Haque told us that mixed evidence:

[ … ] is not the issue here. The issue is there is a public sector equality duty that all public authorities are supposed to do to ensure that there is not bias. You are supposed to check for it regardless of the evidence.21

13.We put it to Ofqual that if the evidence is indeed mixed, it should be a priority to urgently undertake further research to establish the likelihood of bias and to identify which groups are most at risk.22 Sally Collier, Chief Regulator of Ofqual, insisted that the regulator takes its “public sector equalities duty incredibly seriously” and that they have put safeguards in place to minimise bias “as far as possible”.23 These safeguards include issuing additional guidance, including “practical recommendations” to teachers to help protect against bias in judgements.24 Nevertheless, we remain unconvinced that additional guidance is a sufficiently robust solution to the potential problem of unconscious bias, which could harm the life chances of thousands of already disadvantaged pupils. We note with particular concern that high attaining students from low-income families are more likely to have their grades underpredicted than their wealthier peers,25 and we do not think enough has been done to ensure students from poorer backgrounds are not disadvantaged by this year’s process.

14.Pupils with special educational needs and disabilities (SEND) might be disadvantaged by the evidence used to inform teachers’ judgements about grades and rankings. There is no guarantee that appropriate access arrangements—for example, modified papers for those with visual impairments—were provided for internal mock exams. VIEW’s written evidence highlighted that while Ofqual has recommended that schools seek the advice of specialist advisory teachers to input into calculated grades, they are “not aware of a mechanism for ensuring that this happens”.26 Similarly, the National Deaf Children’s Society’s submission agreed that there is “little accountability in place for the type of evidence used by schools and colleges”.27

15.We raised our concerns about fairness for pupils with special educational needs to Ofqual, emphasising the importance of ensuring SEND specialists feed into calculated grades.28 We are pleased that Ofqual produced guidance on considering evidence from SEND specialists during the calculated grade process.29 We are concerned, however, that there was no accountability mechanism for ensuring this happened consistently. In our section on appeals, we give our recommendation on what action is needed in cases where pupils with SEND, or their families, believe inappropriate evidence was used in the grade calculating process.

16.We are unconvinced that safeguards—such as additional guidance and practical recommendations—put in place by Ofqual will be sufficient to protect against bias and inaccuracy in calculated grades. In particular, given research evidence on unconscious bias, we are concerned that groups including pupils from low-income families, BAME pupils, pupils with SEND, and children looked after could be disadvantaged by calculated grades.

17.Our recommendations on Ofqual’s standardisation model in the next section set out what we believe needs to happen now to identify and mitigate any bias in calculated grades.

18.Sally Collier told us that Ofqual will publish “a full programme of evaluation” in the autumn, and that this will examine how attainment gaps vary this year compared to those in previous examination years.30 We agree with written evidence submitted by the Equality and Human Rights Commission that if this evaluation reveals disparities outside usual thresholds for variability for pupils with particular protected characteristics, and for Free School Meal (FSM) eligible pupils, they should be investigated urgently by Ofqual.31

19.We welcome Ofqual’s commitment to publish a full programme of evaluation in the autumn. Until it does so, we will not have answers about the fairness of the system.

20.Ofqual’s evaluation must include comprehensive data on attainment, by characteristics including gender, ethnicity, SEND, children looked after, and FSM eligibility, providing full transparency on whether there are statistically significant differences between attainment this year compared with previous years.
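One simple form the statistical comparison we call for could take is a two-proportion test of whether the share of a given pupil group reaching a grade threshold differs significantly between this year and previous years. The sketch below uses hypothetical figures and is not a description of Ofqual’s planned evaluation.

```python
import math

# Illustrative sketch of the kind of check the evaluation could involve:
# comparing the share of a pupil group achieving a grade threshold this
# year against previous years. All figures below are hypothetical.

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z-statistic for the difference between two observed proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical example: FSM-eligible pupils achieving grade 4+ in two years.
z = two_proportion_z(4_400, 10_000, 4_100, 10_000)
# |z| > 1.96 would indicate a statistically significant difference at the
# 5% level, warranting the urgent investigation we recommend.
```

A check of this kind would need to be run separately for each characteristic (gender, ethnicity, SEND, children looked after, FSM eligibility) and at each relevant grade threshold.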

The standardisation process

21.The Department’s direction on GCSEs, AS and A levels instructed Ofqual to devise a standardisation process that would ensure “qualification standards are maintained and the distribution of grades follows a similar profile to that in previous years”.32 Michelle Meadows told us that the standardisation process “will adjust outcomes for schools and colleges to set a fair standard, a level playing field.”33 Ofqual has confirmed its standardisation model will take into account a centre’s historical outcomes, cohort prior attainment, and expected national grade distributions for subjects.34 However, little detail has yet been published on Ofqual’s model, and we agree with the Royal Statistical Society’s conclusion that “more transparency” is needed urgently.35

Who might be disadvantaged by standardisation?

22.A number of concerns with Ofqual’s standardisation model were identified in the written submissions we received. Standardisation will operate at subject level, within which the model will consider each centre individually.36 This may result in historic data derived from extremely small cohorts, making generalisation problematic. A submission from University College London’s Centre for Education Policy and Equalising Opportunities warned that the use of historic performance data for standardisation could penalise “atypical” students such as high achievers in historically low-performing schools.37 Written evidence from the New Schools Network expressed concern that newer institutions with little historic performance data would be disadvantaged. The association of national specialist colleges, Natspec, expressed reservations that use of historic data will not provide fair outcomes for small SEND specialist providers with highly variable year on year cohort attainment.38 Another submission stated that the statistical model is a “significant concern” for Alternative Providers, for whom often “historical data shows no patterns of attainment”.39 The National Association of Hospital Education’s submission expressed “huge concerns” that the model will “negatively impact on some of our learners in our cohorts this year who are gifted and talented, [ … ] their results will be out of step with our normal set of results”.40
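The small-cohort concern raised in these submissions has a straightforward statistical basis: the year-to-year noise in a centre’s observed grade shares grows as cohort size shrinks. A minimal sketch, with hypothetical cohort sizes:

```python
import math

# Illustrative: why small historic cohorts make standardisation unreliable.
# The standard error of an observed grade share shrinks with cohort size,
# so a centre entering a handful of pupils has far noisier historic
# outcomes than one entering hundreds. Figures are hypothetical.

def share_standard_error(p, n):
    """Standard error of an observed proportion p in a cohort of size n."""
    return math.sqrt(p * (1 - p) / n)

small = share_standard_error(0.5, 5)    # cohort of 5: large year-on-year swings
large = share_standard_error(0.5, 200)  # cohort of 200: far more stable
```

On these hypothetical figures the small centre’s grade share is several times noisier than the large centre’s, which illustrates why historic data from very small or highly variable cohorts is a weak basis for standardisation.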

23.Ofqual’s decision not to include trajectory in their standardisation process was criticised in the Sutton Trust’s written evidence, which suggested that the ‘turnaround’ schools disadvantaged by this decision “are likely to disproportionately serve poorer communities.”41 However, Ofqual’s consultation concluded that trajectory would be an “unacceptably unreliable” predictor of performance in 2020.42 We agree that the principles of statistical soundness and reliability must be followed, but remain concerned that schools on an upward trajectory may lose out. It is right that Ofqual has recognised that in exceptional cases there may be instances where there is “a substantive difference” between the 2020 cohort and historical cohorts, and we agree that in these cases schools and colleges should be able to make an appeal.43

24.We asked Ofqual what steps are being taken to ensure that the standardisation model is fair to all types of institution. Ofqual told us that they are testing the impact their model has on different school types to ensure standardisation does not have “an adverse consequence for a particular school type”.44 Michelle Meadows further acknowledged that Ofqual is aware of concerns from schools “for whom this year would have been the year, but trying to evidence that, of course, is incredibly difficult.”45

25.We asked Ofqual about concerns that the statistical model might drive results down.46 Michelle Meadows agreed that there is concern that some students “would have pulled it out of the bag on the day”, while others “would have had a bad day”, but argued that it is impossible reliably to identify students and schools for whom this would have been the case.47 We agree with Ofqual that it would be impossible reliably to identify individual pupils who would have performed unexpectedly well in their exams, just as it would be impossible to identify those who would have had an unexpectedly bad result.

26.Given the potential risks of bias in calculated grades, it is clear that standardisation will be a crucial part of ensuring fairness. We are extremely concerned that Ofqual’s standardisation model does not appear to include any mechanism to identify whether groups such as BAME pupils, FSM eligible pupils, children looked after, and pupils with SEND have been systematically disadvantaged by calculated grades.

27.Ofqual must identify whether there is evidence that groups such as BAME pupils, FSM eligible pupils, children looked after, and pupils with SEND have been systematically disadvantaged by calculated grades. If this is the case, Ofqual’s standardisation model must adjust the grades of the pupils affected upwards.

28.Ofqual must be completely transparent about its standardisation model and publish the model immediately to allow time for scrutiny. In addition, Ofqual must publish an explanatory memorandum on decisions and assumptions made during the model’s development. This should include clearly setting out how it has ensured fairness for schools without 3 years of historic data, and for settings with small, variable cohorts.

29.Ofqual must collect and publish anonymised data at the conclusion of the appeals process on where it received appeals from, including, as a minimum, type of school attended, region, gender, ethnicity, SEND status, children looked after (including children supported by virtual schools),48 and FSM eligibility.

Vocational and technical qualifications

30.Calculating grades for vocational and technical qualifications is more complex than for academic qualifications due to the range and purpose of such qualifications. Unlike GCSEs and A levels, there is no single approach to awarding that would work for all types of vocational and technical qualification. Ofqual, following the Secretary of State’s direction, divided vocational and technical qualifications into three categories, according to whether their purpose is progression to further study, gaining a qualification signifying occupational competence, or a combination of both.49 Depending on which category a qualification falls into, the assessment method may be a calculated grade, or an adapted or delayed assessment.50 Adapted assessments are those modified, for example by moving a paper-based test online, so that the assessment can be completed under current public health restrictions. Ofqual’s consultation decision stated that delay should be a last resort, but acknowledged that for a small group of qualifications—for example those with a health and safety critical element—delayed assessment would be the most appropriate option.51

31.We consider this to be a pragmatic approach, suitably adapted for the complexity of the landscape. However, we highlight that where adapted assessments are used to award vocational qualifications, these assessments must be fair and accessible for pupils with SEND.

32.We raised concerns from further education providers and schools about inconsistencies between awarding bodies around what work counts towards the centre-assessed grades.52 Tom Bewick, Chief Executive of the Federation of Awarding Bodies (FAB), told us that “10,000 or so [ … ] qualifications are within scope of this particular extraordinary regulatory framework”, which makes consistency difficult, but assured us that “it is an issue that has been addressed throughout this process.”53 Tom Bewick emphasised that awarding bodies have been directed to “deliver a parallel regulatory regime alongside the existing regime”, which “will incur additional costs”.54 He explained that the Department had requested an estimate of the additional costs of implementing the directions, and that the estimated cost provided by the Federation of Awarding Bodies is £16.3 million.55 We agree that it is right the Department should provide grant funding for awarding organisations where there are additional costs of implementing the extraordinary regulatory framework for vocational and technical qualifications.

33.As part of its evaluation Ofqual must publish comprehensive data on vocational and technical qualifications, by characteristics including gender, ethnicity, SEND, children looked after, and FSM eligibility, providing full transparency on whether there are statistically significant differences between attainment this year compared with previous years.

34.Where calculated grades are used to award vocational and technical qualifications this year, Ofqual must identify whether there is evidence that groups such as BAME pupils, FSM eligible pupils, children looked after, and pupils with SEND have been systematically disadvantaged by calculated grades. If this is the case, Ofqual’s standardisation model must adjust the grades of the pupils affected upwards.

9 See, for example: Teach First (CIE0034); Impetus (CIE0086); The Institute of Physics (CIE0071); The Traveller Movement (CIE0137); the Equality and Human Rights Commission (CIE0139)

11 See: Ofqual, Equality impact assessment: literature review, 15 April 2020, for a review of literature on the potential for bias in teachers’ grade predictions.

13 Wyness, G. (2016). Predicted grades: accuracy and impact. University and College Union

14 Wyness, G. (2016). Predicted grades: accuracy and impact. University and College Union. See also: Ofqual, Equality impact assessment: literature review, 15 April 2020

15 Wyness, G. (2017). Rules of the Game. Sutton Trust.

25 Wyness, G. (2017). Rules of the Game. Sutton Trust.

26 VIEW (CIE0183)

27 National Deaf Children’s Society (CIE0101)

31 Equality and Human Rights Commission (CIE0139). See also, for example, Impetus (CIE0086)

35 Royal Statistical Society (CIE0199)

37 University College London, Centre for Education Policy and Equalising Opportunities (CIE0075)

38 Natspec (CIE0162)

39 MAT CEO network, PRUSAP & NAHE (CIE0156)

40 The National Association for Hospital Education (CIE0083)

41 The Sutton Trust (CIE0194)

48 Under the Children and Families Act 2014, local authorities have a statutory obligation to establish a ‘virtual school headteacher’ (VSH) to champion the education of looked after children and care leavers. Virtual schools are a service to help these children to succeed in their education pathways. See, for example: DfE: Promoting the education of looked after children and previously looked after children, February 2018

Published: 11 July 2020