The administration of examinations for 15 to 19 year olds in England

Written evidence submitted by Jo-Anne Baird, Jannette Elwood & Tina Isaacs

Oxford University, Queen’s University, Belfast & the Institute of Education

1. Overview

1.1. In this paper, we outline the historical reasons for the current number of awarding bodies, compare this with the situation in other countries, consider the benefits and problems with multiple awarding bodies, discuss alternative models of examination provision and, finally, discuss the issue of qualification currencies.

1.2. English public examinations are respected internationally and emulated in many countries. We also know that there is a great deal of public confidence in the examination system in England. [1] Nonetheless, recent press reports will reduce confidence levels, at least temporarily.

1.3. The facts regarding competition between examination boards on standards should be established before decisions are taken to overhaul the system. Media reports, though worrying, did not constitute widespread evidence of competition. Independent research could be undertaken on the statistical screening data to provide better information on this issue. We indicate in sections 6.3 and 6.4 some of the areas that could be pursued.

1.4. Assessment policy over the last decade has been characterised by change, [2] which directs educational resources towards managing these changes rather than towards improving education per se. Re-structuring examination provision also heightens the risk of system failure. [3] Furthermore, any change needs to address the problems of the lack of curriculum innovation and the dominance of examinations over secondary education.

1.5. Justifiable methods for setting and assuring the standards of our syllabuses and qualifications are important for upholding public confidence and breaking the cycle of angst about our examination standards. Regulators are often in a weak position with respect to industry, as the relevant expertise resides in industry to a greater extent than in regulatory organisations. As such, we would encourage the establishment of an advisory body comprising strong assessment expertise in relation to content and performance standards, as well as methodology for national and international comparability studies.

1.6. Qualifications have differing currencies for different stakeholders. Students who take the examinations can use them for access to Higher Education or employment. The performance tables are another exchange rate, one that has an impact upon schools rather than directly upon students. The assignment of values to qualifications for the performance tables needs greater transparency and scrutiny. This aspect of the system has had insufficient attention, in spite of the powerful incentives created by the values assigned in the performance tables.

2. Number of exam boards

2.1. A search of Ofqual’s register of regulated qualifications [4] shows that there are currently 183 regulated providers in England, Wales and Northern Ireland and nearly 14,000 qualifications. However, there are many more unregulated providers of qualifications. This Inquiry is not concerned with the number of examination boards per se, but with the running of examinations for 15-19 year olds. However, the Department for Education’s Section 96 list, which gives the qualifications that will be government-funded in schools and colleges, has over 11,000 entries from a large number of providers. [5] Nonetheless, we tend to think only of the unitary awarding bodies: AQA, Cambridge Assessment, CCEA, Edexcel and WJEC. These bodies are able to offer general public qualifications such as GCSE and A-level in England. [6]

2.2. Historically, there was a plethora of regional examination providers, many of which had strong links with universities (e.g. the Northern Universities [7] Joint Matriculation Board). This picture has been simplified over time. The ‘Unitary Awarding Bodies’ were created following the Dearing Review [8], with the aims of rationalising the number of providers and qualifications (particularly vocational qualifications) and addressing "the century-old division between education and training". Examination boards were required to create partnerships that spanned the academic-vocational divide in the late 1990s. Edexcel, for example, was created in 1996 from the merger of the University of London Examinations and Assessment Council and the Business and Technology Education Council.

3. International comparison

3.1. Unlike the UK, most countries have a single set of exit qualifications. Some countries appear to have a reasonably simple system of examination administration, with a single organisation, typically the Ministry, responsible for examinations (e.g. Denmark, France, Hungary, Italy, Kenya, Lebanon, Malawi, Malaysia, Netherlands, New Zealand, Northern Ireland, Norway, Scotland, Sweden, Uganda, Wales, Zimbabwe). Others have regional exam boards for each state or province (e.g. Australia, Canada, China) or regional boards spanning several countries (e.g. the Caribbean Examinations Council, the West African Examinations Council). The International Baccalaureate Organisation is an international examination body designed to administer particular qualifications globally. Another model is specialisation of examination boards, e.g. by language (Singapore), region (Pakistan) or school type (Greece, South Africa). Many countries have hybrid systems that defy classification. For example, in the US there are national testing initiatives (e.g. the SAT, ACT and Advanced Placement Program) in addition to statewide tests, and public and private ownership of exam boards coexist. Russia has a federal examination system, but many aspects of it are devolved to states. Yet another complication is how exam board functions are handled: some countries have a single exam board to handle everything from setting examinations to issuing results, whilst others have different organisations involved in setting, administering and certifying examinations.

3.2. As we see above, the international position is complex. Although we have not found another country with examination boards in competition for school examinations, in many countries there are at least international examinations available in tandem with the national examinations.

4. Benefits of more than one awarding body

4.1. The benefits of competition could be reduced prices, diversity and innovation in the product offer and/or high-quality customer services. However, these benefits arise in theory in situations of open competition in a market. In practice, examination boards operate in a regulated oligopoly, which produces different characteristics. In an oligopoly, firms compete less aggressively on price, and the fewer the operators in the marketplace, the weaker that competition becomes. Financially, examination boards in England went through difficult times in the early part of this century due to the requirements to create merged organisations and to produce new qualifications frequently. In years where there has been greater stability, examination boards have been able to recoup their losses and produce a profit. Government policy can have a large impact upon the financial health of these organisations. Even in a steady state, cross-funding of small-entry qualifications from the profit-making larger-entry qualifications is necessary to enable breadth of provision.

4.2. Pricing strategy can also run counter to traditional economic theory. Few people want to buy cheap perfume, safety equipment or educational resources. A low price is not always a good sales point in an educational market – who wants a cheap qualification? That is not to say that we should not expect value for money from examination boards. The point is that the market itself will not necessarily produce prices at marginal cost.

5. What evidence is there for these benefits in practice?

5.1. Due to the regulation of the examinations market, there is little diversity or innovation in the product offer. There is a tension between the regulator upholding content standards and allowing variation in the syllabus and examination offer. Differences can be perceived as having an effect upon standards. For example, the content of the curriculum can be perceived as more or less demanding if it is allowed to vary. However, the concern for maintenance of standards has come to overshadow innovation. Presently, we have a large number of syllabuses in a given subject, but the differences between them are small. Ofqual needs to be empowered to foster more diversity in the examinations system, whilst ensuring that evidence is collated to reassure stakeholders that standards have been upheld.

5.2. Examination boards compete in terms of quality of customer service and some of the comments by Chief Executives of the Awarding Bodies indicated this in the evidence provided to the Committee on 15 December 2011. These services include production of syllabus and examination materials, quality of marking, advice and guidance through Subject Officers and teacher events, rapidity of appeals processes and so on. Gareth Pierce (CEO of WJEC) argued, for example, that their market share in GCSE English could have risen because they are a small organisation in which teachers can speak directly to a knowledgeable Subject Officer by telephone.

6. The main problem with more than one examination board

6.1. Competition between examination boards on the grounds of examination standards is a longstanding concern with the current arrangements, and this has also been raised in the recent articles in The Telegraph. [9] In a context in which falling examination standards is the prevailing media narrative, the recent allegations are clearly worrying. We note the refutation of some of the allegations in the evidence of the examiners involved to the Select Committee Inquiry and Ofqual’s decision to revoke one examination paper. A useful question to ask is how we would know if examination boards were competing on standards, in the absence of direct admissions.

6.2. Standards are set in England in a distributed fashion by individual examining committees, and these are then scrutinised on a case-by-case basis by the examination board; in particular, the Accountable Officer has a formal role in approving them. No individual is responsible for the national-level outcomes for GCSE or A-level examinations. They are simply collated across committees and awarding bodies for the purposes of a JCQ press release and later for statistical records.

6.3. A system of statistical screening was implemented by JCQ so that standards could be compared statistically after the fact. Using the statistical screening data, the following could be independently investigated:

· how many examinations have statistically significant differences in outcomes, after controlling for prior achievement of the students who entered for the examinations? [10]

· given that the statistical screening was introduced a number of years ago, has the number of examinations with statistically discrepant outcomes reduced? (i.e. have the examination boards taken action to address the findings?)

· aggregated across syllabuses, are any of the examination boards significantly more generous or harsh, or are effects subject-specific?
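The screening questions above rest on a simple idea: compare each board's examination outcomes after adjusting for the prior attainment of the candidates it attracted. A minimal sketch of that idea follows, using entirely hypothetical data and a simple ordinary least-squares adjustment; actual statistical screening draws on national datasets and more sophisticated multilevel models.

```python
from statistics import mean

# Entirely hypothetical records: (prior-attainment score, exam outcome, board).
records = [
    (5.2, 6.1, "A"), (4.8, 5.6, "A"), (6.0, 6.9, "A"), (5.5, 6.3, "A"),
    (5.1, 6.4, "B"), (4.9, 6.0, "B"), (6.1, 7.3, "B"), (5.4, 6.6, "B"),
]

def screen(records):
    """Mean residual per board after regressing outcome on prior attainment.

    A mean residual well above zero indicates outcomes higher than prior
    attainment alone would predict -- grounds for further investigation,
    not proof of generous grading.
    """
    xs = [x for x, _, _ in records]
    ys = [y for _, y, _ in records]
    mx, my = mean(xs), mean(ys)
    # Ordinary least-squares fit of outcome ~ prior attainment, pooled over boards.
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    intercept = my - slope * mx
    residuals = {}
    for x, y, board in records:
        residuals.setdefault(board, []).append(y - (intercept + slope * x))
    return {board: mean(r) for board, r in residuals.items()}
```

A statistically discrepant result from such a screen would, as described in 6.4, trigger a qualitative comparability study rather than an immediate conclusion about grading generosity.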

6.4. Examination standards cannot simply be defined by the statistical outcomes. [11] The statistical screening process catered for this by proposing that qualitative investigations (‘comparability studies’) be undertaken, where results were statistically discrepant, to establish whether the quality of students’ work justified those results. Thus, it would be useful to know:

· how many comparability studies have been undertaken by Ofqual, JCQ or the awarding bodies in response to the statistical screening findings and what were the outcomes of the studies?

6.5. A creeping ‘grade inflation’ might also be construed as evidence of competition between awarding bodies. As the Inquiry has heard, there are many potential determinants of the increased proportions of students being awarded the grades, so this source of evidence is a rather indirect indicator of competition. Due to the performance tables, many stakeholders benefit from examination results going up annually. Furthermore, there is a genuine effort on the part of teachers to raise the numbers of students attaining the requirements. Disentangling the extent to which rising pass rates are due to the different causes is impossible. Nonetheless, confidence in the system is undermined by these concerns.

6.6. Aside from annual rises in outcomes, large changes in the statistics can occur when new syllabuses are first examined. The changes could legitimately be explained by movements of schools to different examination boards, but unless proper attention is given to what is known statistically about the schools and students entering for the examinations, changes in syllabuses could lead to unwarranted changes in outcomes. We note that GCSE Science outcomes increased dramatically in 2009 when there was a syllabus revision and question the rationale for those increases.

6.7. Another indicator of competition between examination boards could be an attempt to reduce the content of syllabuses and examinations. As such, it would be helpful to know:

· how frequently does Ofqual have to reject awarding body syllabus and sample assessment material submissions on the grounds that they are not demanding enough?

6.8. Equally, the process by which the content standards of qualifications are judged by Ofqual could be more robust and transparent. A review of the methodologies, and publication of the process that emerges from that review, is warranted.

6.9. In her evidence to the Committee, Glenys Stacey, Chief Executive of Ofqual, recognised that assessment expertise is scarce and that much of it lies within the assessment industry. Indeed, much of the assessment expertise in the country is developed by the examination boards. We believe that Ofqual’s access to expertise should be strengthened and support their proposed development of an advisory committee. Membership of their Board would also be strengthened by the appointment of an expert in assessment.

7. What are the alternative models?

7.1. Nationalisation - Several countries run examinations through their education ministries, and this is effectively the case for the national curriculum tests in England (currently operated by the Standards and Testing Agency). A single examination board would remove problems of comparability of standards between awarding bodies, but not issues of comparability of standards over time, between subjects or between qualifications. As previously mentioned, a single nationalised examination board would remove at a stroke the allegations of a ‘race to the bottom’ associated with competition. Theoretically, a single examination body could offer as much choice of provision as is currently the case, but managing that volume and complexity is a recipe for disaster, and most organisations would keep their offer as simple as possible under a nationalised exam board. It would thus be likely to lead to little diversity or change, and our educational assessment could slip behind that of other countries whose structures allow them to be more agile in keeping up with the times. With such scarce assessment expertise, a single examination board could arguably make better use of resources; a regulator, for example, would not be required in this system. Examination fees would remain within the public sector, and small-entry, niche subjects could be subsidised as a matter of policy. Another advantage of nationalisation is that the government is in control of an important societal service, but there are disadvantages associated with a direct political connection to examination results, and governments often wish to keep some distance from them for reasons of impartiality and public confidence. Certainly, it is difficult to see how nationalising public examinations in England would currently be politically acceptable, or where the political will for it would come from, as examination boards have a long history and tradition of independence from government.

7.2. Outsourcing on a contract basis - Through the Qualifications and Curriculum Development Agency, national curriculum tests were operated on a contractual basis. All of the largest examination boards in England, the National Foundation for Educational Research and ETS-Europe held contracts with QCDA to deliver aspects of the national curriculum tests at some time. Advantages of this model are distance from political control, the ability to draw upon expertise in other agencies and the ability to change provider. Contracts could be drawn up for different operational functions, by subject area, by qualification type, or for the entire operation.

7.3. Outsourcing by function runs the risk that the functions will not connect well between suppliers and it is worth bearing in mind that the likely suppliers would be competitors. Problems with the delivery of the national curriculum tests have occurred regularly. These were contracted by function, which might have contributed to the problems. A more pressing problem with the contractual model for the national curriculum tests was the length of the contracts. Setting up large-scale, detailed operations is challenging and changing providers is wasteful of resources and runs risks of delivery problems. To assist with stability, contracts would need to have a minimum of five years’ duration. Even so, running the examination system on a contract basis could undermine capacity in the industry, as expertise takes time to develop and staff employment prospects would be uncertain under this model. New computer systems, personnel, logistics and so on have to be devised each time the supplier is changed.

7.4. Outsourcing by subject area would have the advantages that the entire operation would be joined up through a single organisation with responsibility and accountability, and it could foster greater development of expertise in particular subject areas. Examination boards have significant logistical operations to deliver, under tight time schedules, with extraordinarily high demands for accuracy. Over the past decade, following the Curriculum 2000 examiner shortages and the Qualifications and Curriculum Authority’s view that examinations were a ‘cottage industry’ in England, examination boards have invested heavily in their systems. As some examiners have testified, the resulting electronic systems have left examiners feeling somewhat de-professionalised and alienated. Accompanying this was an influx of staff to examination boards and QCDA with backgrounds in business rather than education. Examination boards need both kinds of experience. Outsourcing by subject area might foster greater focus and connection with subject-matter experts and create better leadership in disciplinary-embedded assessment. Expectations for comparability of standards between subjects would have to be tackled explicitly under such a system. Provision of small-entry subjects would also need to be a requirement upon awarding bodies in this model because, as already noted, such subjects must be cross-funded and are not financially viable as stand-alone propositions.

7.5. Links with universities - Historically, each of the awarding bodies had ties to universities and there has been some discussion in the current debate about the merits of this and of having senior examiners from Higher Education. One caution we would place upon this is that systems of accountability in Higher Education are now a disincentive to academics being involved with examining at secondary level, as this would not contribute to the indicators upon which individuals and institutions are measured in HE. These are predominantly: quality of research output, research funding and research impact.

8. Currency of qualifications

8.1. The Wolf Report (2011) [12] sets out a dire picture in relation to some of the vocational qualifications available and promoted to large numbers of young people, defining them as ‘sub-standard’, with little or no labour market value. Furthermore, increased competition for university places, employment and careers has created concern amongst students about the devaluation of the currency and worth of their qualifications. Research [2] shows that students are concerned about the different titles of the qualifications that are available (GCSEs, A levels, BTECs, etc.), their actual value with employers, their worth in the HE market, their equivalencies, and which qualifications best optimise their opportunities. The present need for greater transparency and common agreement around the value of qualifications should not be underestimated, and it links with discussions on standards and equivalences in qualifications from different awarding providers.

January 2012


[1] e.g. Ipsos MORI (2011) Perceptions of A levels and GCSEs. Wave 9. Ofqual/11/4834. http://www.ofqual.gov.uk/news-and-announcements/83/582

[2] Baird, J., Elwood, J., Duffy, G., Feiler, A., O’Boyle, A., Rose, J. and Stobart, G. (2011) 14-19 Centre Research Study. Qualifications and Curriculum Authority Report.

[3] Baird, J. & Coxell, A. (2009) Policy, Latent Error and Systemic Examination Failures. CADMO , XVII, 2, 105-122.

[4] Search conducted 13 December 2011 ( http://register.ofqual.gov.uk/ )

[5] We anticipate that this number will reduce with changes in policy relating to approval of vocational qualifications in the performance tables.

[6] ICAEE also offers GCSEs in ICT, Design & Technology and business subjects.

[7] The Victoria University of Manchester, University of Liverpool and the University of Leeds.

[8] Dearing, R. (1996) Review of qualifications for 16-19 year olds. London: School Curriculum and Assessment Authority. See page 11 for recommendation regarding awarding body arrangements. ( http://www.eric.ed.gov/ERICWebPortal/search/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED403388&ERICExtSearch_SearchType_0=no&accno=ED403388 )

[9] e.g. ‘Teachers giving students exam questions before they sit GCSEs and A-levels.’ By Claire Newell and Holly Watt. The Telegraph , 14 December 2011. http://www.telegraph.co.uk/education/secondaryeducation/8957499/Teachers-giving-students-exam-questions-before-they-sit-GCSEs-and-A-levels.html

[10] A study conducted by NFER (Benton, T., and Lin, Y. (2011) Investigating the relationship between A level results and prior attainment at GCSE. Ofqual/11/5037) shows that there are very few significant differences between English awarding bodies. The predominant pattern was of significant differences between CCEA and the other awarding bodies. However, it would be useful to investigate whether any differences between English awarding bodies pertain, having excluded the effect of CCEA.

[11] Baird, J. (2007) Alternative conceptions of examination standards. In Newton, P., Baird, J., Goldstein, H., Patrick, H. & Tymms, P. (Editors) (2007) Comparability of UK public examinations. QCA book publication.


[12] Wolf, A . (2011) Review of Vocational Education – The Wolf Report. DFE-00031-2011. https://www.education.gov.uk/publications/standard/publicationDetail/Page1/DFE-00031-2011

Prepared 28th February 2012