Education Committee - Minutes of Evidence HC 141


Oral Evidence

Taken before the Education Committee

on Thursday 15 December 2011

Members present:

Mr Graham Stuart (Chair)

Neil Carmichael

Pat Glass

Damian Hinds

Tessa Munt

Craig Whittaker

________________

Examination of Witnesses

Witnesses: Paul Barnes, Paul Evans and Steph Warren, senior examiners, gave evidence.

Q139 Chair: Good morning. Welcome to this meeting of the Education Committee, as part of our ongoing inquiry into the awarding bodies, and specifically in response to the stories that have appeared in The Daily Telegraph in the last week or so. I am grateful to the three of you for coming along this morning to give evidence to us. I would be grateful if each of you made a short statement, of no more than five minutes, if I can be brutal on that. I would be happy to start the session with that. Mr Barnes, may I start with you?

Paul Barnes: Yes. Good morning. First, I wish to challenge the issues raised by The Daily Telegraph article-last Thursday, I believe. It was alleged that, first of all, we gave specific information about the 2012 exam-that we actually leaked information. The second one was that advice given at the conference held in the Danubius hotel in London actually contravened the guidelines. So the issues are the breach of confidentiality and offering too much beyond the guidelines. The two allegations I refute.

A little bit of background-I have actually taught the subject for 33 years; I am into my 34th year now. I have worked with the board since 1988, when GCSE began. I moved through the ranks to senior and then became principal examiner about four years ago.

The aim of the seminar in London was to give retrospective advice on the 2011 paper, to provide sort of future guidance and to equip teachers with an understanding of the new specification. It is my belief that my comments were edited and taken out of context. I would like today to contextualise and even, perhaps, to offer the method of examining-even going as far as showing you the sort of exam papers-so you get a feel for how I can actually ask questions.

My responsibility is for three outline papers-the USA, Germany and the Middle East-the most popular being the USA, 1929-2000. The paper itself is divided into three sections-changing life in America, the race issue and the wider issue in foreign affairs. Candidates have to choose two of those sections and answer an essay as well, all in just one hour. There is a huge volume of work to cover, and I can ask only four questions, which I shall come back to.

The legacy paper, in the last specification, ran from 1929 to 1990. When the new spec came about, it was suggested that we stretched it, to make it more of an outline study, and took it to 2000. The advice given at the time was that there would be very little detail on this period of the 1990s, and that was stressed at the CPDs. The outline paper focuses more sharply on concepts: we look at causation, change, continuity, the significance of events and the contribution of significant individuals. The 1990s is very much a peripheral sort of topic; it is not a major part of the specification.

We could have a huge debate now as to whether events in the Gulf war are history or current affairs. The presidencies of Bush and Clinton are not yet significant. I would argue that Bush’s presidency was not, in fact, that significant-it did not really achieve much. It is whether these current affairs have actually become history, and that is an issue.

The inference of The Daily Telegraph was that the topic area is a major part of the spec. In fact-I have the spec here-it is half of one bullet point; a very, very small part of the work itself. The structure of the paper will exemplify this, if I can just contextualise-

Chair: Two minutes left, Mr Barnes.

Paul Barnes: Very quickly, here is the paper. Basically, I can ask just four questions in one section. Questions A and B are low tariff. Questions C and D are more challenging; they differentiate; they are harder and your better candidates score better on these two.

When I mentioned this word "never", what I actually meant is that it is never an in-depth question; it cannot be. On an eight-mark question, I cannot be asking questions on Clinton or Bush-it has got to be on the established significance of other presidents. So that is put into context.

The reference to "off the record" was a throwaway figure of speech, so to speak. We expect that all areas of the spec will be covered in about a six-year cycle, so conceivably there could be a question on the ’90s in the 2016 exam. The video then cut me. I went on to say, at the meeting, that all our papers are reviewed by a panel called QPEC consisting of a senior officer and senior examiners, as well as practising teachers. Ironically, I was tabling a question in the last meeting, and we needed a question 1B. I suggested a question on Clinton; it was rejected on the grounds that we had given advice that it would not be there-so that is continued advice.

I added, as well, that the course had been bedded in, and we would actually wait until people were confident. If a question does not perform well, we can make allowances at the award session afterwards, but you cannot make allowances for candidates who are challenged and disadvantaged on the day. It is always about the candidates, bearing in mind, of course, that we have to cater for A* to G people-we are not a tiered paper; it is a common paper.

Resources were an issue as well. Only this year did we have a textbook published. It was published in, I think, May of this year, and the Welsh edition actually came out just last month. We told centres that, because of the lack of resources, there would not be anything specific on the 1990s. Again, that was actually given to people at the meeting. I made the point that candidates had to do an essay as well and that they would have to cover the 1990s in the essay section, so you have to teach it.

The allegation that this was a breach of confidentiality is therefore totally wrong. Everything that was said in the meetings is in the public domain. I have even brought in one sheet from my examiner’s report that said last year, "It is to be noted that the paper now runs to 2000 and centres should be encouraged to touch on events in the 1990s." It is there in the public domain.

Chair: Can I bring you to a close?

Paul Barnes: Yes. Very quickly, as far as I am concerned, there has been no breach of my duties as a principal examiner. As a member of the WJEC, I have always tried to uphold the very high standards of the exam board, and I truly believe that I have been misrepresented by The Daily Telegraph.

Q140 Chair: Thank you. Mr Evans, may I continue with you?

Paul Evans: I am a chief examiner of history for the WJEC, and I set six study-in-depth papers. Those six papers count for 90 questions. I also supervise a team of team leaders of those marking the papers and assistant examiners. I must stress that this is a part-time role, for which I am appointed on an annual basis. I am a full-time teacher and a faculty leader for humanities in a school in Wales.

I expect that members of the Committee will be aware of the widespread coverage in The Daily Telegraph of a seminar for teachers conducted by the WJEC, and I feel that that needs to be put into context. The seminar was held for history teachers at schools utilising the WJEC syllabus. I understand that similar seminars are held by other examination boards throughout England and Wales. The intention was to inform teachers about the course and take them through the lessons learnt from the results of past examinations with a view to assisting their teaching of pupils for future exams.

The WJEC history course is studied in most Welsh schools and in a growing number of schools in England. There are actually three exams that pupils have to take. One is an outline development study, which was alluded to by my colleague Mr Barnes, and two are in-depth studies or thematic studies. Each exam carries a weighting of 25%, and a further 25% is made up of a controlled assessment unit-what used to be called coursework.

The in-depth or thematic studies are divided into two sections. There is a compulsory section A, which tests source evaluation skills as well as some knowledge to put the sources into context. In section B, the candidate has a choice of answering one out of two questions, which tests the application of knowledge and understanding. The current course was launched two years ago, and the first sitting of the in-depth papers was in the summer exams of 2011.

Of the available in-depth studies, one is the history of Germany 1929 to 1947 and that is the one that I was recorded talking about. It is divided into three sub-units. The first is on the rise of the Nazi party and the consolidation of power ’29 to ’34. The second sub-unit is on the changing life of the German people ’33 to ’39, and the third sub-unit is on war and its impact on life in Germany ’39 to ’47. In each GCSE history exam, the compulsory question in section A will focus on one of those sub-units. Section B will ask questions on the other two sub-units.

I have not been provided with a full copy of the transcript recorded by The Daily Telegraph journalist of the seminar delivered on 11 November 2011. I am, however, recorded as having referred to section A as going through a cycle. The choice of the unit for the compulsory section of question A and the cycle are published on the WJEC website on page 9 of the teaching guide. I have a copy of page 9 here, where we actually record the dates and the years. This has been available on the website since March 2009.

Hence, when I referred to the compulsory question for the summer of 2012 focusing on the changing life of the German people ’33 to ’39, I was reporting what was already public knowledge. Indeed, WJEC is not alone in making the subject of a compulsory question public. For example, other exam boards, such as AQA, publish the cycle for the compulsory question for their history specification, which, because it is in the specification, would have been approved by the regulator.

In the seminar, teachers asked whether they should be teaching all three sub-units of the in-depth study. As teachers are informed which of the sub-units is to be the subject of the compulsory question, and as the pupils were given a choice in the second part, they need only be taught two sub-units in order to complete the exam. Whereas it might be thought appropriate for the full course to be taught, I am aware that time allocated to schools for the teaching of history at GCSE varies between schools, some having five hours per fortnight to deliver the course, and others four. In some schools, teachers do not have enough time to finish teaching the full specification, and it was in that context that I was answering questions about missing sections out. Hence, pupils could be taught only two sub-units of the three.

As a result of the publication in The Daily Telegraph, on which I had no opportunity to comment beforehand, I have been suspended by WJEC, which is currently undertaking an investigation. I am here to assist you as best I can, but I have been advised not to answer questions on my own position. I can say this: the word "cheating" was an inappropriate term to use. What I am saying is that the identification of a compulsory question was made clear in the teaching guide, but not in the specification. The specification will have been approved by the regulator. I do not understand that the regulator has any say in terms of the teaching guide that we issued in March 2009, and in speaking to other teachers I made the off-the-cuff remark that the regulator might tell us off.

I can say that at no stage during the seminar did I reveal any specific questions that were to be asked in the 2012 exam or in any subsequent exam; nor did I breach any confidentiality regarding the examination process itself. I am obviously happy to take questions on the broad issues raised, but as I say, on advice and because I am subject to an ongoing investigation, I am not prepared to say anything further regarding the specific words attributed to me, particularly as I have not yet seen the full transcript of the seminar to put my words in the full and proper context.

Q141 Chair: Thank you, Mr Evans. Ms Warren.

Steph Warren: Good morning, everybody. I have something I would like to read out to you, please.

I have been a geography teacher for 31 years and an Edexcel examiner since 1988. I became a principal examiner in 1999 and a chief examiner in 2009. I would like to thank the Committee for giving me a chance to make a statement about the press coverage I received last week. I do not even recall the conversation that has been the subject of all the press coverage in the past few days. I do feel that my views and the training session as a whole, which the reporter attended, have been misrepresented, and I want to put this on record.

It would seem from the video clip that the journalist approached me when I was talking to a couple of colleagues, not during the main training session. I had had a really exhausting day of training on one of the most difficult training days for teachers that I have ever done in my career. The students of some of the delegates found the exam very difficult and had not got the results they were predicted. I had spent much of the day defending the questions that had been set. I do regret the comments I made, which create the impression that the content of the specification is less, and therefore the specification is easier than other geography GCSEs. I did not say that the specification was easier-I am coming off my thing now. That has been reported incorrectly. My own daughter has transcribed what they said that I said, and it is not true.

The comments were made as part of a conversation I was having with some colleagues and led from their comments. It appears that the video clip is not the whole conversation, and the comments made by the other people have been edited out. My comments were said in the heat of the moment, after a long training session. I do not think that the specification has less content. In fact, I had problems covering all of the content myself for the first cycle of the course, so there is no way that I can think that- I had a lot of problems.

I do not know why I made those comments. I have not read the other awarding bodies’ specifications or even Edexcel’s other geography specification, so I have no basis on which to compare them. It was an inappropriate comment that I deeply regret making. I am only human and we all make mistakes.

There were also a number of comments made about my integrity in the press. I have been a trainer for many years. In autumn 2010 and spring 2011, many of the training events were cancelled because there was so little demand from centres. I would like you to infer from that without me saying any more. It just shows that I do not give anything away, or else they would be queuing at the door. I have always maintained the highest integrity in my training sessions, and would never discuss the content of specific examination papers. This has been verified by the e-mails of support that I have received.

Training sessions are run to help centres improve student achievement. Therefore, comments are made in this light. In fact, one of the aims of the session is to share ideas and good practice. Any comments I made during the main session were made to share good practice and were based on the specification, not the exam papers.

I believe I was also recorded saying that I could not remember what was on specific exam papers. That is the truth, and an observation made by many principal examiners. The papers are written two years in advance and are not kept by anybody but Edexcel. If the papers are sent to us for checking, they are immediately returned. There is very tight security on those procedures.

I was also accused of saying that parts of the exam papers were easier than others. That is also a misunderstanding of an answer to a question from the journalist. I was discussing the specification, not the exam papers. I believe that the life experiences of students in a subject such as geography enable them to access the knowledge and understanding of some topics better than others. For example, a topic on coasts and, to some extent, rivers, is more familiar to students than, for example, a topic such as tectonics or glaciation, unless, of course, the students live in a seismic or glaciated area.

I would like to thank you again for giving me this opportunity to make the statement about the press coverage. I would like to add one final thing. No matter how difficult the media coverage has been for me, my family and my school, who have all been extremely supportive, the heart of examining is the children who are sitting the exams. They are working extremely hard and are very nervous about the exams they will sit in future. My concerns are for them, and how the coverage by the media affects them and their belief in themselves and the examination system. They are the most important people in all this.

Q142 Chair: Thank you. That is a good note on which to end. Is the overwhelming pressure within the school system to turn Ds into Cs leading to behaviour by schools and exam boards that is educationally counter-productive?

Paul Barnes: Of course there is pressure to raise achievement, yes, at the critical point from D to C, arguably as well from A to A* and from G to F. At the end of the day, we want to push our candidates to their potential. The fact is that there is the benchmark: the five A* to C as good passes. We take a critical look at what is needed to raise achievement from D to C. We live in a competitive world; we are judged by our results. As my colleague said, at the heart of all this is the candidate. If someone is predicted to get an E or an F and they achieve a D, that is a good achievement.

Q143 Chair: But are the teachers coming to your sessions focused on trying to turn Ds into Cs? Is that what they are hoping for?

Paul Barnes: It depends on the circumstances of their school. Perhaps at a school that does not perform well there is pressure to get to that C boundary. I know from re-marks that we re-mark lots of A grades looking for A*. I have re-marked this year G grades looking for that little bit better mark. A grade converts to a mark and those marks mean something even to lower-achieving candidates. They have achieved something. I wish years back that they had not made that line between the good pass of C and above. They virtually destroyed others back in 1988 when we first examined. I try to tell some of my candidates who I know will never get to a C, "You have done well; you have got an E or a D." I do not feel under pressure to push things on. It depends on the individual centre.

Q144 Chair: You have said that it is a competitive world, well, not least among awarding bodies. When you are working for an awarding body, how much pressure are you under to help it maintain market share in the subject area in which you work?

Paul Barnes: This conflict, if it is one, is between the educational aims and standards, and the commercial. If you believe The Daily Telegraph, there is "aggressive competition to win business". WJEC, I think, is too small in that respect, relatively, because we are a small awarding board and we do not attract the big publishers. Our entries are quite often small. Luckily for us in Wales, we have the Welsh Assembly Government, which funded the textbooks. On the other route, route B, there are six topics that had never been resourced until now, when the money was released. Teachers had to produce their own resources for that part.

Q145 Chair: But the Welsh board has been increasing its market share in particular subjects in England: English, I think, has gone from 7% to 18%. How is it doing that if it is not trading on schools’ desire to deliver results?

Paul Barnes: For the same reasons. It is a small board, it is accessible and you can pick a phone up and speak to the subject officer. You will have good contact straight away-it is not an automated system, as with some of the exam boards.

From INSET as well, what has gone down really well is that we are practising teachers. Often we teach in these CPDs. We actually talk teacher to teacher, which is welcome. We are back in the classroom tomorrow.

Q146 Chair: So is there pressure on you?

Paul Barnes: I do not think so, no.

Paul Evans: We have never been given any information that we need to go out and market the course. The CPD units are advertised, people apply and attend. We never go out actively to market because it is not part of our role. We just receive it.

Steph Warren: No, I have never ever been asked to market anything for Edexcel at all. If you want my opinion as to why people come to different specs-this is what my colleague said-it is because they like to have communication with an exam board. The exam boards that they like are the ones that have someone on the end of the phone who they can speak to if they are having a problem.

My comment on the C to D is that they are not looking at answers, questions or whatever. They are looking at interpretation of specifications so that they can improve their candidates. That is why I think they come to INSET training. The ones who speak to me at the end of it say, "Your hints on how you teach it yourself are the things that we want to hear. We want teaching ideas so that we can help our candidates understand."

I reiterate that I have nothing to do with marketing at all. It is my job to maintain standards on geography, not to market anything. So I know nothing.

Q147 Craig Whittaker: Good morning. Libby Purves in The Times on Monday said, "What is more revolting in a society than cheating and manipulating the young, exploiting their anxieties and efforts while starving them of the proper mental nourishment they need to grow, think clearly, and face a complex world?" Was she right? In what context does that apply to The Daily Telegraph report?

Steph Warren: I have not read any newspaper comments. As you can imagine, I have kept well away. I do not actually understand what they are saying, to be honest. Are they saying that we are trying to factory farm the young in some way, or whatever? I do not believe it. I do not really understand the question.

Q148 Craig Whittaker: Well, basically what Libby is saying is that by cheating, such as has been alleged in The Daily Telegraph report, you are starving the young of proper mental nourishment with which to go out and face the world. Is she right, and how does that apply to The Daily Telegraph report?

Steph Warren: As far as I am concerned, she is wrong because we are not cheating. She is wrong because we have not cheated. We have not told them anything at all.

Paul Evans: The urge of every teacher in the classroom is to ensure that every individual in front of them achieves their best potential.

Q149 Craig Whittaker: With all due respect, we are not talking about teachers in the classroom; we are talking here about examiners and an examining board teaching the exam to teachers.

Steph Warren: We are not teaching the exam to teachers. I quite take umbrage at that. We are explaining a specification, a number of words, that some teachers will immediately pick up, grasp and get a hold of while others say, "What does this particular statement mean?" That is the way. That is how I see it, anyway. I do not know about my colleagues. We are explaining the specification; we never refer to exam papers at all.

Paul Evans: I have here a specification, so if a teacher wants to teach the WJEC history course, they can run through this-other courses are outlined-and that is the course to deliver. They come to the INSET to try to find out, "Right, what style of questions are going to be asked? What was the performance of students last year? Where were the strengths and weaknesses in the questions?" They want to ensure that their children do better in the classroom. We are always talking retrospectively-this is last year’s exam, and this is how it went.

Q150 Craig Whittaker: I appreciate that, but my specific question was whether Libby Purves is right in her statement and how does that apply to The Daily Telegraph piece?

Paul Barnes: The unfortunate term "cheating" was used again, and we have now made the point that that argument was not made. To go back to your point, maybe yes and no. To a degree, I think she is right because, laying no blame, these will come up from the teaching of a two-year course, lesson by lesson, 39 weeks a year with five lessons a fortnight. We develop the skills. The end product, of course, is in my case a one-hour exam where they have a volume of information, unlike maybe in some of the other subjects. That is the end product. We develop people through a course and we test them at the end of it.

Q151 Tessa Munt: Good morning. I am aware that sometimes students at university are used to mark papers. I understand that very often they will do this online and what they are searching for are key words or phrases in the more conversational or essay-type answers. Students will gain extra marks for picking up on key words and phrases. Is that your sense of how marking is done?

Paul Evans: In terms of the WJEC history course, we don’t employ any students to mark the papers and they are not marked online. The teachers attend a conference. The teachers have to have been in the classroom for two years before they are appointed as an assistant examiner.

Q152 Tessa Munt: So you are not aware of anybody, students or people being used to mark examination papers?

Paul Evans: Not in WJEC history.

Steph Warren: No. They are not at Edexcel either. It is all practising teachers who mark the questions for Edexcel. It is marked online. There will be questions that have only one word answers but you are not referring to those; you are referring to the longer ones. The longer answers are all marked by practising teachers. If you are saying that there are certain words-in geography we teach case studies so there would be specific points about a particular case study, but there are many, many case studies out there. So there is no way any mark scheme would ever say, "Only look for this. Only look for that." It is looking for specific points. So no, not at all.

Paul Barnes: There is a sort of irony in a sense. We like to see people with two or three years’ experience teaching before they become markers. Yet in discussions with delegates in CPD quite often I have come across newly qualified teachers in their first year of teaching who, especially in a high turnover place like London, have been dropped with the job of head of department. As a newly qualified teacher in their first year of teaching they take on all the pressures of a job like that. They are so appreciative of the advice and feedback and the opportunities to liaise with other people at the CPDs. So it is ironic that we expect examiners to have two years’ experience but people with less than one year’s experience can be guiding these young people through exams.

Q153 Tessa Munt: What explicit instructions are you given by your boards about what you have to say when you are doing these training courses?

Steph Warren: I get sent an e-mail and it has in it templates that I am to use that are the official templates. I am sent the aims of the course which then tell me what I should be putting into the course.

Q154 Tessa Munt: When you say a template are you talking about-

Steph Warren: For PowerPoint and things like that and any other information. There are set templates that are sent to me.

Q155 Tessa Munt: What about guidance as to what you are meant to say and what you are not meant to say or is this implicit?

Steph Warren: I am a professional.

Q156 Tessa Munt: I absolutely understand that. I am not questioning that. I am just asking what guidance you get from your exam board, which is effectively your employer at that point, that is explicit about what you may and may not say while you are on the training course.

Steph Warren: The aims of the course are to discuss the papers and it is quite explicit on there that that is what you should be doing in your training sessions. But there is also one at the bottom that is to share ideas and good practice. That is when I bring in my own teaching experience as well.

Q157 Tessa Munt: So are there statements of what you specifically may not say and do?

Steph Warren: Not that-well, are there? There are in the contract because the contract says that I may not refer to any future examinations. So in my contract it is quite explicit that that is not to be done.

Q158 Tessa Munt: Thank you. That is good. Can I ask the same of you, Sir?

Paul Evans: Only in the sense that you must not reveal anything that would dispute the integrity of the exams in the following year, so you do not reveal anything that will come up in future. As regards what you can or cannot say, you do not have any training or information.

Paul Barnes: I can quote from the terms and duties of the principal examiner: the need to respect and maintain the confidentiality of the examining and marking process-that is paramount. Training, in many ways, is informal, with the subject officer. We have a meeting before the round of CPDs and discuss the agenda.

I think your question is: "Is there a code of practice, with a line?" The line is the professional line. As professionals, we do not breach confidentiality. In fact, BBC Wales news last night made the point that no exam paper has been compromised in the exam board. There are 18 courses and 240 potential questions, and not one question has been breached.

Q159 Tessa Munt: I appreciate that. I just wanted to clarify exactly what you are given in the way of assistance, for guidance.

Paul Barnes: Basically, when you are elevated to principal or chief examiner, it is based on experience and expertise gained over a long period. On that basis you disseminate your acquired knowledge and expertise.

Q160 Tessa Munt: Do your contracts cover a certain number of hours?

Paul Barnes: No. Technically, we are not an employee, I read somewhere. We just kick into action at critical times of the year.

Steph Warren: I think mine might be slightly different; I have a contract for every event or job I am doing. I will have had a contract for training and for chief examiner. I get a new contract-I get lots of contracts.

Q161 Tessa Munt: Do you train in pairs or on your own?

Steph Warren: On my own.

Paul Evans: We do it as a team.

Q162 Tessa Munt: Explain to me what a team means. How many of you go to the event?

Paul Evans: The session would be led by the subject officer, who is a direct employee of the board. He will then delegate sections in the day, when we will do our specifics.

Q163 Tessa Munt: I see, so there might be three, four or five of you.

Paul Evans: In this case, there will be three of us present.

Tessa Munt: Fine.

Paul Barnes: It might be worth adding that at the London conference, the subject officer was taken badly ill. It was very short notice-the day before. As historians, we are supposed to be blessed with the skill of hindsight, and on reflection, it might have been a good idea to cancel; it was too short notice. My colleague, Dr Evans, covered the subject officer’s contribution and was on his feet for two hours, facing 60 or 70 professionals, which was quite pressured. I feel strongly that had the subject officer been there, he would have dealt with the issues of the exam board, procedural protocol and so on, and we would have just made our contributions as senior examiners.

Q164 Tessa Munt: How many times does the subject officer not make it? I accept he was very ill.

Paul Barnes: Never.

Paul Evans: He slipped two discs. He could not get out of bed-couldn’t walk, and we decided to run with it.

Q165 Pat Glass: I am aware of the systemic pressures in schools on how we deliver teaching. I have held long conversations with head teachers, trying to persuade them not to put their best teachers into years 10 and 11 and suggesting that years 8 and 9 need good teachers, too. I am aware of the pressures on D to C and A to A*. Again, I have tried to persuade head teachers that kids who are going to get Bs deserve good teaching as well. Are there equal kinds of systemic pressures that lead you to deliver training in a particular way?

Paul Evans: I don’t think there is any pressure. Our remit is to give some feedback on how the exam performed last year. How did candidates perform in comparison to previous years? This was a new spec-this was the first time it had been examined. Teachers came to the CPD event to find out how the paper performed. They were looking at whether their candidates did better this year or slightly worse than in the previous year, and finding out why. What can they do to improve their teaching in the classroom-to boost their pupils’ performance?

Paul Barnes: In my session, I simulate a marking conference. We go through the mark scheme. We then hand out three live scripts, if you like, with the marks taken off. In small groups, they mark and report back, and they find that very useful.

Q166 Pat Glass: We are politicians, and we understand that you can be taken out of context and misrepresented-believe me, we’ve been there.

Steph Warren: Can I make a comment on training? I’m butting in again, sorry. I don’t think I train any differently now from how I did in 1999, except that I might be a bit better at it because I have done it for longer. I do not think that there has been any difference-obviously I have been teaching for years and years and years-in my department in how I teach an A to A* or a B to a C. Yes, we are told to do this and that, but every child is important to me, so I try to look after every single one of them.

Q167 Pat Glass: Going back to the issue of what happened on that day and the issue of the specification-I understand, Paul, that you cannot talk about the details-there was no discussion about future exam papers, because we can assume that, had anyone said something, The Daily Telegraph would have printed it.

Paul Barnes: Absolutely. The papers would have to be rewritten in light of that, and they are not being, as far as I gather.

Pat Glass: And you are clear that there are no systemic pressures on how you deliver teacher training. Thank you.

Q168 Neil Carmichael: I have a quick question to wrap this theme up. There is a perception now that pupils are being taught to pass exams, rather than taught a subject with an exam at the end of it. What steps do you think we can take to correct that perception and move back towards the public view that you go to school to learn the subject and, at the end of the day, you will get an award if you have done well?

Paul Evans: It has been alluded to earlier, but it is the pressure that the Government have put on us, and schools in particular, and the publishing of league tables. It is: "What are your results? How many A*s to C have you got? What is the benchmark?" That puts an awful lot of pressure on that school to get a particular percentage. The schools are then in bands, in band 3 or band 2, where there is pressure to move up to the next band. Subject leaders are called in by the governing body and asked, "What can you do to improve your subject? You need to get better." With all that pressure, you tend to focus on the teaching of the exam unit and the exam technique, as opposed to history. At the end of the day, they should be coming in and doing history, and not having to think that they have to pass the exam. There is then the pressure of being asked, "How many A*s to C can you get?"

Q169 Neil Carmichael: By extension, of course, that means that you are yielding yourself to that pressure by allowing teachers to come to seminars and things to probe the possibilities.

Paul Evans: Well, professionally, these teachers are coming because they want to improve their performance in the classroom. It is like a training exercise, asking, "What are you expecting our children to do?" They would be remiss not to attend, I would have thought.

Paul Barnes: The point I want to make is that I object massively when I hear the term "dumbing down". History is a gold standard subject. It demands a huge volume of knowledge, which has to be applied in a focused exam situation. If I go back to my time in school-maybe yours as well, Neil-I never got told how to improve a mark. I was given a grade for an essay. If you got 12 out of 20, you had to get 13 or 14 next time. We were never told how to do that. I went through university, getting degrees, and never got any sort of feedback.

We are helping these young people to realise the targets of the questions. Lots of young people have a real problem retaining and accessing the knowledge, so if they can combine their knowledge of history with the skills demanded and the target of the question, you have got it. That is what CPD is about.

Paul Evans: May I add to that? To obtain a GCSE in history now-for those studying WJEC in history-that child will have had to sit three one-hour papers, each carrying 25%, plus two pieces of controlled assessment within the classroom environment. When I sat history at O-level in the 1970s, you went in for one exam, wrote five essays and there was your O-level. They are publicly examined for three hours in three separate papers to get a GCSE in history. That is an immense amount of material.

Paul Barnes: And then multiply that by 10, 11 or 12 subjects.

Steph Warren: We are taught teaching now. We are teaching differently to how we used to years ago. I have seen many changes. One of the biggest changes, which I think has been for the better-I have been there a long time and I have seen them come in cycles-is assessment for learning. That is one of the things that we do in schools, which is taking a mark scheme, taking your own work and marking that work to improve it. That is one of the things: we teach differently, which I think is better for the children.

If you are talking about pressures, however, I did not notice it so much when it was just five A* to C. Since there has been all the pressure of five A* to C including English and maths, there have been major movements in a lot of schools to give less time to some subjects and more time to others. Pressures from league tables and the Government have meant that, if a volcano erupts, you cannot spend a lesson on a volcano erupting if you are not teaching that, as I used to do, because I know that I have only a certain amount of time, because my time has been clipped and clipped and clipped by pressures brought in by the league tables.

Q170 Craig Whittaker: What shocked me about the investigation, since we have been involved, is the sheer amount of e-mails and correspondence from people that indicates that the allegations made against the two of you are actually quite systemic throughout the whole process. It is not for us to judge whether that is right, but the question for me is very much, whether you guys are the scapegoats or not, what are your employers doing to put checks and balances in place to ensure that what is alleged does not happen?

Steph Warren: I know that Edexcel has put out a statement on its website to say that all training sessions from now on will be videoed or taped, there will always be a member of Edexcel staff there with the trainer-

Q171 Craig Whittaker: Does that mean that it has not had any checks and balances in the past to check?

Steph Warren: It has had checks. I have had people come in and watch me train; they were in a London session a couple of years ago. I also do online training and the online training is all taped, so there were checks in the past as well. I think that what it is doing is being more thorough-no, not more thorough; that is not the right word. I will take that back. It is putting in place things to help more; it is not more "thorough".

Q172 Craig Whittaker: So up until this point, and this blowing up, there have been very few checks and balances in place from your employer to ensure that what has been alleged does not happen.

Steph Warren: Well, there have been. I just said they have been in. I have been checked.

Q173 Craig Whittaker: You said two years ago.

Steph Warren: Well, I did not do any last year, did I? That is what I said. Do you remember that they were cancelled last year?

Q174 Craig Whittaker: So how frequently does your employer have checks and balances?

Steph Warren: I would not know for everybody.

Q175 Craig Whittaker: So you do not know.

Mr Evans, you were saying about the document.

Paul Evans: I can recall that about three years ago a regulator came to the INSET session, took detailed notes and reported back, and said that we were fine. The inspection actually took place three years ago.

Q176 Craig Whittaker: So that is three years ago?

Paul Evans: It was three years ago, yes.

Q177 Craig Whittaker: So nothing in between from your employer to do checks and balances to make sure that you guys are not open to the allegations that have been laid at your door?

Paul Evans: Not that I am aware of.

Paul Barnes: You mentioned the volume of e-mails. I would like to counter that with the mass of e-mails in support of people like ourselves, who have been almost pilloried. It really is refreshing to see so much support coming in from practising teachers, ex-pupils and the whole spectrum.

Q178 Craig Whittaker: I absolutely appreciate that, but I suppose that the question for me is that if this subject appears to be widespread and people have seemed to have known about it for a very, very long time, why are your employers not doing more to protect you guys on the frontline, to ensure that actually this rogue element is not happening?

Paul Barnes: Presumably, that will be the outcome of this Committee-a fresh look at training and delivery. It is the old expression: sometimes good can come from bad.

Chair: I thank all three of you for coming and giving evidence this morning, after what will have been a very difficult week.

Examination of Witnesses

Witnesses: Andrew Hall, Chief Executive Officer, AQA, Mark Dawe, Chief Executive, OCR, Rod Bristow, President, Pearson UK (on behalf of Edexcel) and Gareth Pierce, Chief Executive, WJEC, gave evidence.

Q179 Chair: Good morning, and thank you very much for joining us today as we continue our inquiry into awarding bodies, including the ones that you head up. Are your current GCSE and A-level procedures strong enough to bear the weight that the accountability system puts on them? Let’s start with you, Mark.

Mark Dawe: We have an enormous amount of procedures, conditions and requirements on examiners and on those running CPD events, so there is a lot in place. You are also right that an enormous amount of pressure is put on the exams, not only for student results, but because teachers and institutions are judged by results. There is an enormous amount of pressure on the system, but we believe we have the procedures in place to protect the integrity of our exams.

Q180 Chair: So is that yes, despite the weight put on the qualifications, you believe that procedures are strong enough?

Mark Dawe: Yes.

Gareth Pierce: Yes, I would agree. They have certainly been tested in periods of frequent change in specifications, for instance. There are some subject areas where we are going through possibly the third cycle of change even during the seven years that I have been in post. I think overall we have robust systems, however. As you heard earlier, we work with very experienced teams of principal and chief examiners, and the fact that we have experienced teams in those subjects helps us to bear that pressure of change and the accountability that goes with all that.

Q181 Chair: And yet there is so much, and it is long standing. Warwick Mansell’s book six years ago suggested that he had attended sessions, and he described "jaw-droppingly" bad behaviour in the sessions he witnessed. In some senses, there has been a lack of surprise at The Telegraph story and a sense that there is a complicit effort to allow schools to try to meet their A to C targets, which seems to have affected everybody in the system, from examining bodies to examiners to teachers. Are we going to get the message from you that everything is hunky-dory, procedures are good enough and The Telegraph is just barking up the wrong tree? It is quite hard for us to square that.

Gareth Pierce: I have attended sessions and been very impressed with the level of debate. Some of the sessions have formally led presentations, but within a day there would be a lot of dialogue, feedback, exchange of views and challenge from teachers. I have seen excellent debate on pedagogy, on assessment issues, and on standards and expectations. I have seen teachers being very challenging of examiners, and I have sat in there and heard examiners respond very professionally. That is the nature, in virtually every case, of our events. There are pressures, however, and there are ways of improving and tightening guidance in the light of what has come through from The Telegraph’s visits to some events.

Andrew Hall: I think, "yes, but" would be my answer, because your question is quite broad in terms of the pressure on the whole exam system. You cannot look at the last two years and say that systems and controls cannot be improved, because there have been other issues along the way, so you have to recognise that. If I turn to the point specifically of training, having listened to the previous session I have to say that we deliver and prepare for our training sessions in a very different way. You had representatives from other boards. We actually have a senior manager, who is a subject specialist and a CPD specialist, commission all the materials that are used for training purposes. They then review those materials with the presenters, and we conduct face-to-face training every year with every trainer. We have one of our senior staff attending to monitor one of each session for a subject so that we see what is going on.

I think this is about continuous improvement. Having seen what is there now, we have been investigating some of the allegations and we have written some very strong rebuttals to the regulator, which I am happy to share with you. You cannot say, however, that we found it difficult to know precisely what went on in a session when we did not have a member of staff in that particular session. There is a logical improvement that says we should start to record that. I think it is about being honest about the need for improvement.

Q182 Chair: The Telegraph only today reports a case in which an examiner briefed teachers about questions on an exam paper, which they passed on to students. Even the school agreed that a student "clearly had insider information". Mr Hall, why did you not consider that practice to be irregular and, consequently, not choose to refer it to Ofqual?

Andrew Hall: If it is the comment on law that you are referring to, which I saw in The Telegraph this morning, we did investigate that thoroughly. It is sad in a way that we have to have it, but we have a very large and experienced malpractice investigation team. We do not just have to investigate allegations such as that. Sadly we have to investigate situations where teachers act entirely inappropriately in their own school. It is very sad that the pressures create that for them.

We have to investigate allegations of students cheating. We also have to investigate unfortunate episodes such as when Parcelforce loses a series of examination papers. Only last week we had to replace a whole paper for our January series because one paper was found outside a school. We do investigate those incidents very thoroughly. If there is any suggestion about what we would do, in the past year we terminated the contract of an examiner and reported that to Ofqual.

Q183 Chair: But in this specific case, the school said that the student clearly had inside information, and you did not find that irregular.

Andrew Hall: We carried out the investigation. It was quite interesting that The Telegraph did not also carry the comment it was given by the regulator on the subject, whose opinion was, to summarise, "This looks like a case of teachers doing a good job of question spotting." We did investigate it. I do not sit here, having read the paper this morning and heard about it last night, with every detail of the investigation. But I know and am confident that our team, which is experienced and investigates many things, will thoroughly investigate. We have no hesitation in taking action when we find a case.

Rod Bristow: I am deeply concerned about the revelations that appear to have come through from The Telegraph investigation. I take full responsibility for getting into it and understanding what has happened. We do need to investigate that.

There are two issues contained within your question. One relates to standards, and whether the pressures from the system, as you put it, have had an effect on standards; pressure on the system, competition and so on. To go back to the qualifications in history, English and geography that were exposed in The Telegraph investigation, we took action. We have done our own analysis and we are having an independent analysis done of the specifications in terms of their challenge, depth, breadth and demand. Those investigations, which are still ongoing, are basically telling us that they are comparable and that they conform to the guidance laid down by Ofqual.

We have also looked at a lot of very rigorous data analysis on what grades are awarded for these qualifications. Using an independently verified and rather complicated methodology, which I will not go into, the results were analysed against those given by other awarding bodies and there are absolutely no differences. The awarding is in line. You would not see that if those qualifications were easier to obtain. Those data are available to all of the awarding bodies, and they are what give us confidence that standards between awarding bodies are the same.

On training, as you heard from Steph Warren, we have quickly put in place steps to ensure we have our own employees attending the meetings, that we are videoing the meetings, and next year we plan to make the videos publicly available. We want to enable teachers, if they wish, to view these events online free of charge. We have also issued more thorough, detailed guidance than in the past. We are reviewing our training and we are going to make some changes to it for the long term. What the investigation from The Telegraph has exposed is that the things that we had been relying on so far are probably insufficient. We have been relying on the fact that our contracts make it very clear that examiners are not to reveal anything about what may appear in future examinations. All of our examiners sign up to a code of conduct that makes very clear what standards of behaviour are expected.

We also make sure that when these training events are given, we write the materials that are used for the delivery of the events. That is not left up to the individual trainers. Nevertheless, the events and videos that we have seen show that we need to strengthen the systems and processes that we have, and we are going to do so.

Q184 Chair: You said your examinations meet the specifications set by Ofqual. The quotes we saw in The Telegraph suggested that there was a surprise at the ability to steer a particular one through those specifications, and there was a suggestion that in order to get market share-after all, you are a profit-making company-the aim is to meet the specification with the lowest possible standard while doing so, in order then to be able to suggest that there is less to learn and you are more likely to get passes. That does not look like a healthy situation, however many procedures and formalities we see as a result of the news of last week.

Rod Bristow: If that were happening, that would be true, but it is not happening. What is happening is that more support is being given to improve teaching and learning, so that students can meet these standards. There is a world of difference between that and giving away information about what is going to be in the exams. As far as I am concerned, that has not happened.

Q185 Damian Hinds: I am conscious that we have only a small amount of time, but I want to try and get to the heart of what makes your organisations tick, very briefly. Mr Dawe, Mr Pierce and Mr Hall, you are chief executives of your different organisations. What are the job titles of your direct reports? When you get together as a senior management team, what are the key performance indicators you discuss? And, does anybody have performance-related pay?

Mark Dawe: I am chief executive of OCR, and we are part of the University of Cambridge. My direct reports are director of operations and director of standards, who is the accountable officer. Actually, if that director has any concerns about pressure-

Q186 Damian Hinds: We can perhaps get into that later. We need to do this as a really rapid-fire thing. Who are your direct reports?

Mark Dawe: A director of partnerships, too, and there is a strategy director.

Q187 Damian Hinds: What about KPIs?

Mark Dawe: Our KPIs are around the general performance of the business. It is a big logistics business in particular, with millions of scripts going out and coming back in. We are looking at the awarding and where there have been issues. Obviously, that is in a cycle, depending on the time of year.

Q188 Damian Hinds: They do not sound like KPIs. Do you measure specific things? For example, the number of students you have taking your exams.

Mark Dawe: We look at the number of candidates and things like that, as well as where candidates may be going up or down in terms of entries.

Q189 Damian Hinds: What about performance-related pay?

Mark Dawe: Not at all.

Q190 Damian Hinds: Not for you or any of your reports?

Mark Dawe: Correct.

Q191 Damian Hinds: Thank you. Mr Pierce.

Gareth Pierce: My job title is chief executive and there is no performance-related pay in the organisation at all. The direct reports are director of finance and estates, director of examinations and assessment, human resources manager, operations manager, chief information officer, marketing and communications officer and assistant director, strategy and planning. KPIs cover a whole range of matters on the exams and qualifications front. There are a lot of volume-related and performance-related logistics indicators that are very high level, because of the scrutiny and the importance of performance indicators.

Q192 Damian Hinds: When you say volume and performance indicators, can you just unpack that a bit?

Gareth Pierce: Volume is of interest obviously in terms of total volume of candidates and centres that we support and service, including through all the means of achieving that-the resources we have to support different schemes and the number of events. There are many.

Q193 Damian Hinds: Thank you. Mr Hall.

Andrew Hall: I am the chief executive of our organisation, and I am also the accountable officer, so I am personally responsible for standards. My direct reports are a chief operating officer, who looks after day-to-day operations, and a director of communications, who also looks after whatever marketing we do within that remit. There is a director of general qualifications and very recently, we have appointed a new director who is responsible for our non-exam business. I am always worried I am going to miss one in doing this.

Q194 Damian Hinds: They will be very upset. What about KPIs?

Andrew Hall: KPIs vary according to the needs of what is going on in the business. We have been very heavily focused on looking at quality of marking over the last year, because that is something I have been particularly interested in. We look at our cash flows, because clearly, any organisation manages its business. We do not focus on market share. We look at where our entries are twice a year and we look at where we think they may go so that we plan appropriate resource.

Q195 Damian Hinds: When you say you focus not on market share but on where your entries are, do you mean you have a volume measure rather than a percentage measure?

Andrew Hall: It is a volume measure. What resource are we going to need to provide in the future?

Q196 Damian Hinds: And performance-related pay?

Andrew Hall: Not at all.

Q197 Damian Hinds: And Mr Bristow, I know you are not chief executive of that part of the organisation, but maybe you could talk us through it in rough terms.

Rod Bristow: Perhaps I can explain that the managing director of Edexcel reports to me. The performance of the managing director and the team at Edexcel is judged on the robustness of the delivery, on the quality of the delivery and on error-free delivery. So, essentially, it is about making sure that we really do have a strong, high-standard, high-quality delivery.

Q198 Damian Hinds: So no revenue target?

Rod Bristow: There absolutely are financial benefits for achieving financial targets, but we mix those objectives. Basically, the way that our performance-related pay works is that it is a mix of financial goals and personal objectives. The personal objectives are the things that include these sorts of issues. I will say that we have made a big point of putting standards at the centre of our strategy in the last 24 months, which is one of the issues that people are measured on.

Q199 Damian Hinds: Another quick-fire round. You may not be able to do this accurately, but if you can give us a rough guide: how much of your income at the revenue line comes from examination entries versus materials, whether that is textbooks or other, versus training and consultancy or anything else?

Andrew Hall: May I go first so that I may correct something? I did exactly as I feared and missed someone-the director of research is sitting behind me and is going to kill me on the way out. We have a research organisation that is part of the executive but actually runs independently. I wanted to correct that because I do not want to be killed for that.

The great majority, something over 90%, of our income comes directly from examination fees. We have a small amount of revenue that comes from publishing, something like £400,000 of contributions from people who publish books that we have supported. We have been looking to move into supporting primary students. We have a philosophy that we should actually help students of primary age so that they have a better chance later on. We are diversifying in that area to support more students, which accounts for 3% or 4% of our turnover.

Q200 Damian Hinds: What are the areas of growth? We know that the average maintained secondary school in 2003 spent £44,000 on exam entries and in 2010 it was £95,000, so there is huge growth, but presumably there is growth in other areas of your business as well. What are the key growing segments?

Andrew Hall: I am conscious that I have been there for a year and a half and that I have to look back. We have seen students doing two things. One, they are taking an increasing number of GCSEs, and it is an open question whether 10, 11, 12 is actually the right number-I think there is a debate about that. We have also seen students resitting exams many, many more times and being entered far too early. That is something that we as an organisation and I have publicly spoken out about. Without making a point, that goes slightly against the argument that we do this for profit and market share; actually we do it for education. So it has been students taking the same exam more times and just taking more GCSEs.

Q201 Damian Hinds: Other areas of growth? Textbooks? Licensing fees?

Andrew Hall: Not for us.

Gareth Pierce: Ballpark figures: entry fees-£30 million; events-£1 million; and resources-probably next to nothing. That is because our policy is to make as many of our resources as possible available online and free of charge.

Q202 Damian Hinds: Do those resources include textbooks?

Gareth Pierce: We do not produce textbooks. Textbooks are produced by publishers.

Q203 Damian Hinds: Do you license your brand for use? If you give an endorsement, do you take a fee for that endorsement?

Gareth Pierce: No fee. For example, here is a book from a publisher for the WJEC specification, and our logo is there. We will have given some advice and some level of scrutiny of the content.

Q204 Damian Hinds: To be clear, WJEC will receive no money whatsoever for the use of its name on the front of that book?

Gareth Pierce: Yes, that is correct.

Mark Dawe: I missed out the director of curriculum and qualifications earlier.

Over 95% of our income is from qualification fees. You talked about areas of growth. In the last few years that has particularly been vocational learning, where schools have expanded their vocational learning offer. In terms of textbooks, we take no money. The only thing we will do is charge between £100 and £250 for books that we endorse. That is just the cost of someone sitting down, going through the book, comparing it to the spec and making sure it meets requirements. There is no actual money made from that; it is just covering costs.

Rod Bristow: Within Edexcel the figures are very similar. By far the majority of the revenues come from qualifications.

Q205 Damian Hinds: But in the wider group?

Rod Bristow: In the wider group we do publish, of course. We publish resources not just relating to Edexcel qualifications; we publish for higher education, for primary schools and for secondary beyond exams. Our roots are in publishing, so consequently we have quite significant revenues in publishing overall, but the vast majority of it is not directly related to what we do in Edexcel.

Q206 Damian Hinds: If you could just give me one figure that would be great. I realise that this can only ever be an estimate and a very rough estimate at that. Of the people you employ as examiners-I realise that employ is not quite the right verb-but the people who are chief examiners and so on for you, what percentage of their income do you estimate comes from examining and being chief examiner versus doing consultancy and training, authorship and so on, Mark?

Mark Dawe: That is incredibly difficult to answer.

Q207 Damian Hinds: I appreciate that.

Mark Dawe: About 75% of our examiners are current teachers and 25% are retired. Obviously the retired have more time to work with us.

Q208 Damian Hinds: So thinking of those at the very top of the pyramid, those who are setting questions, reviewing work and so on?

Mark Dawe: It really does vary. There are a number of roles they can fulfil. Some just do one role and some will say, "Yes, I am free for Inset." Some won’t.

Q209 Damian Hinds: But at the very senior level, are we talking a quarter, a half, three quarters, 99%?

Mark Dawe: Possibly a quarter. It just depends what else they are doing with their time.

Q210 Damian Hinds: We don’t need to go down the line but can anyone else have a stab?

Andrew Hall: I could not give a more precise answer but I would argue that it is at the lower end because they do many other things.

Q211 Damian Hinds: For each of your boards, do you measure the market share that you have among, for example, schools judged outstanding by Ofsted or among private schools?

Mark Dawe: Those figures are available because they come out as part of the JCQ data. So we see that.

Q212 Damian Hinds: That is data that is done on a collective basis?

Mark Dawe: Yes and we see a fairly reasonable spread across all types of schools.

Damian Hinds: I don’t know whether we have those numbers. In that case, that is the end of my questions.

Q213 Tessa Munt: I want to know how many training seminars you run each year, how many teachers those cover, how many trainers you have at each event and, just to complicate it further, whether it is a cost or a profit and how that splits up for you.

Gareth Pierce: We run about 400 events a year. We reckon we have round about 10,000 teachers attending annually, and we suspect that may be growing, partly because in some places we have moved to running regional events on closure days, which means that larger cohorts of teachers have access. The number of participants or contributors is round about 300, but the number at an individual event would vary. We would almost always have the subject officer there, and the team with the subject officer could be anything from one person to four people, depending on how many strands of assessment there are-different units, different specialisms, moderators and examiners; it depends on the subject. We make a loss: we subsidise that programme. We get an income of about £1 million but we probably spend more like £1.3 million or £1.4 million.

Mark Dawe: We ran roughly 1,200 events last year, with just over 20,000 attendees. We had 289 examiners running those events, all on contract, so obviously not employees, and at about 25% of those events OCR staff were also present. That is done on a random basis, but if the trainer is a new trainer, staff will definitely be along to check.

Q214 Tessa Munt: Otherwise they run with one trainer?

Mark Dawe: One or sometimes two. It varies, but not always with OCR-employed people at those events. They are generally contracted individuals.

Q215 Tessa Munt: And cost or profit?

Mark Dawe: We make a loss. Over the past three years it has varied from a loss of £200,000 to £1.5 million.

Rod Bristow: We run about 1,000 bookable events, of which just under 400 are feedback events of the type that the three examiners were talking about earlier.

Q216 Tessa Munt: What are the other 600?

Rod Bristow: The others break down into two areas: "Getting Ready to Teach" events, for teachers who are new to a particular curriculum, and launch events for when a completely new curriculum is being introduced. There are about 1,000 in total. We have a very similar number of attendees; on average, we would probably expect to get about 50 teachers at each event. Our costs of running these events are about £4.3 million, and the revenues that we realise from them are about £2 million to £2.5 million, so we are running at a loss of about £2 million.

Q217 Tessa Munt: Just to clarify, who is present at those? I know you might have answered that in another life.

Rod Bristow: We would typically have one senior examiner at those events, but we have our staff attend them on a rather more ad-hoc basis.

Andrew Hall: It varies considerably according to the level of specification changes in a year. Two or three years ago there were a lot of new GCSEs, so we ran considerably more courses. I make that point because it does vary. If I take the past 12 months, we ran just under 500 courses, from which we made a loss of something like £200,000. To put it in perspective, and I want to take the chance to do this, two years ago we ran 1,200 courses and our loss on that was something like £3.2 million. It is about supporting the specification change. We have used 336 trainers, some of whom are examiners and some of whom are not. Some are teachers we bring in, so they will not always have knowledge of future papers. The number of trainers at a meeting will vary depending on the size of the course and the number of attendees.

Q218 Tessa Munt: Do you know how many you have trained over the past year, at your 500 events?

Andrew Hall: 12,000.

Q219 Tessa Munt: 12,000 at 500 events. And how many trainers at each training event?

Andrew Hall: It varies; it will be one, two or three depending on the number of delegates.

Q220 Tessa Munt: Can I ask you all why you run these things at a loss? I am not including the occasions when there is a new specification. I want to know why you just trundle along at a loss on an ordinary year when there are no changes.

Gareth Pierce: We see it as a service, and we are happy to provide it. We work for the public benefit as a charity. Also, schools and colleges face plenty of issues in getting their teachers to be able to attend, because they often have to pay teacher release costs, or cover for a teacher who is released, and travel costs. Our day rate is about £120, which I think is quite modest, but we see it as a service. There are issues in terms of teachers accessing the events anyway, so we do not want to add another problem by charging a very high fee.

Chair: No one else needs to add to that unless they have something different to say.

Q221 Tessa Munt: I want to know how many of your examiners are involved in training. What percentage of your total number of examiners are involved in training?

Andrew Hall: The easy answer for us is a third of the very senior examiners.

Rod Bristow: For us it is 205 out of 690.

Q222 Damian Hinds: Inside your organisation?

Rod Bristow: Yes.

Q223 Damian Hinds: If it were shown on a Venn diagram there could, presumably, be another whole chunk who were training outside-doing some extra work-or is that not the case?

Andrew Hall: It is possible.

Rod Bristow: In our case, our contracts prohibit our trainers from doing their own training outside.

Q224 Chair: Just on that, because you have raised it, do you prohibit training elsewhere? It has been suggested that some well-known examiners go off and run their own courses separately.

Mark Dawe: We do not prohibit that, but we issue very strict guidance about what they can and cannot say about their position, and what they are disseminating.

Andrew Hall: We do exactly the same as that.

Q225 Tessa Munt: Do your attendees do evaluation forms?

All Witnesses: Yes.

Q226 Tessa Munt: Okay, fine, and you collate those and that is some form of management measurement tool.

Gareth Pierce: Yes, indeed, and part of the guidance for our people in preparing for the following year’s events is to take firm recognition of what has been said on the evaluation forms from the previous cycle. These tend to run in annual cycles, and therefore the evaluation forms are very important as a steer to quality and to content.

Q227 Tessa Munt: Do you tie the results in the schools to who has done those training events and that sort of thing?

All Witnesses: No.

Rod Bristow: Virtually all schools do them.

Mark Dawe: We had 5,500 centres attend our events last year.

Q228 Tessa Munt: I think it was you, Mr Bristow, who said that you were going to put everything online, so everybody in the world could access it. Will you stop running these events now? If everything can go online and every teacher can just go online with their questions, there is no need for contact with examiners, which would remove the problem completely. If everything were online, it would be fair for everybody; they could all get exactly the same information and you could quality control that.

Andrew Hall: Currently, we put all the course materials that are used for this online anyway, and we have done that for some considerable time. The piece that is different, therefore, is the dialogue that you heard the two gentlemen and the lady talk about in the discussions before me. If you do not have a face-to-face meeting, one thing that we are looking at is whether you can create this interactive web discussion. It is cheaper, and we have started running some webinars-I hate the phrase, but I do not have a better one, I am afraid. We run similar courses over the web with interactive discussion. We can run those more cost-effectively and the fee for that is £80.

Q229 Tessa Munt: I put it to you that exactly the problem that we are dealing with in The Sunday Telegraph would not occur if these face-to-face events did not happen.

Mark Dawe: I would suggest the opposite, actually. We are aware of private companies running these sorts of courses. We have very strict rules for our examiners and we monitor them quite carefully. There have been examples in the press that are clearly inappropriate, but many examples that are not. We have controls in place that private companies would not have. I offered yesterday to Ofqual to stop all our training, but I know what would happen the next day; these companies would pop up and we would have far less control over that. It is in all our interests to keep those controls in place.

Q230 Tessa Munt: But you lot have the knowledge that is actually of value, don’t you? Do you think that you would have hundreds and thousands of teachers flocking to them?

Mark Dawe: We have one of the most transparent systems in the world in terms of what we publish and what we are required to publish in terms of past question papers, mark schemes and sample assessments. It is all there for the teachers, but some people like to go to training sessions. We can all sit at a computer at home and look at things, but we would all also go to training sessions. It is a blended learning approach. All we are doing is giving what is already there face-to-face. We have a lot of controls in place to ensure that it does not go beyond what it should.

Andrew Hall: I am particularly keen that we do run the sessions, and we have to control them in the right way. I talked before about what happens-more courses-when the specifications change, but other changes happen along the way. Teachers change schools, and therefore start to take another specification. We want to be there to support them. New teachers qualify and enter the teaching system, and they need the support. It helps them to get there. It creates the ability to change. Some people want that extra comfort and I think that that is really valuable.

Rod Bristow: I think that we should review all options, including the one that you have suggested, but we need to ensure that there are no unintended consequences of taking a quick decision like that.

Q231 Tessa Munt: I just think that you have answered it in a certain way, and, clearly, you are going to make everything done on your training courses available to everybody, so they will not need to pay. Why not have an even playing field where nobody pays?

Chair: Tessa, we will take evidence from the witnesses, rather than the other way round.

Q232 Craig Whittaker: I want quickly to ask a follow up to Mr Hinds’ question earlier. You all mentioned that you had volume KPIs. Can you quickly each tell me what tactics you use to drive those KPIs?

Mark Dawe: As we have all said, a lot of those KPIs are around standards and logistics.

Craig Whittaker: I am specifically talking about the one around volume.

Q233 Chair: Market share. If it drops markedly, you will do something. What will you do?

Mark Dawe: We are very proud of our standards. We are part of Cambridge university. We are proud of our rigour. We are a not-for-profit organisation. Those are the things we talk about with our customers. We talk about the available-legitimate-support that goes along with that. Our focus is on providing the best support to the teachers in the classroom so that they can prepare their students in the appropriate way. That is what we are looking at.

As a not-for-profit organisation, we might see opportunities come up that make us think, "Actually, this would help." Our recent e-books initiative has made free electronic textbooks available to all A-level students, with over 300,000 registrations. There is no cost to the students or the teachers.

Q234 Craig Whittaker: So all those things drive that specific KPI for your organisation.

Mark Dawe: Absolutely. It is all about that service.

Rod Bristow: Three things: price, service and support. These are the factors that schools take into account when they are deciding which awarding body to go for. Typically, service and support are what they care most about-the service and support they get in teaching and learning.

Andrew Hall: Mine would be exactly the same as Rod’s.

Gareth Pierce: Product-that may be the specification-service and support, but the strategy is very much driven by information. We have to convey information to our potential customers, if that’s the right word, about products, services and support.

Q235 Craig Whittaker: I said earlier, and I think the Chair mentioned it too, that it seems evident that there is a widespread belief that the practice that has been alleged in The Daily Telegraph has been widespread for years. Have you guys known about it?

Mark Dawe: I have been in this job for a year, and I was a principal of a college before. Certainly, it was not what we sent our staff to training for. From both sides of the fence, I have not been aware of this.

Q236 Craig Whittaker: You have never been aware of it in the past.

Mark Dawe: No. We have had one or two instances, but those are cases where people have reported concerns; we have dealt with them, and those examiners have been removed from our books.

Gareth Pierce: I think that because our strategy is based on a small team, which almost always includes the subject officer, we have had high confidence. However, what has emerged now will cause us to review our reliance on that people aspect of our strength and to see what systems improvements we need to make to reinforce quality in events.

Q237 Craig Whittaker: Were you aware of it before or not, or have you been aware? It seems widespread; it seems to have been going on for years. There does not seem to be an outcry about it because it appears that a lot of people in the profession have come to expect that this is the norm. My question to you, as the bodies, is: did you know it was going on?

Gareth Pierce: In terms of whether we know what is going on, we need to understand exactly what is being conveyed.

Q238 Craig Whittaker: There are allegations in The Daily Telegraph that that type of-

Gareth Pierce: My main point on that is that we are looking forward immensely to seeing the full transcript of one of these events-

Q239 Craig Whittaker: With all due respect, you are playing with me now. Please just answer the question: are you aware of this widespread allegation? Have you been aware of it in the past?

Gareth Pierce: No, because we have had confidence in our events.

Q240 Craig Whittaker: You are absolutely confident that you have not been aware of it in the past.

Gareth Pierce: No. However, it has been drawn to our attention now, so we are going to make improvements.

Andrew Hall: I must be clear that we refute the allegations about our organisation in The Daily Telegraph. I will make that point to the regulator and we will share that with anyone who wants it.

We have been aware, from time to time, as I mentioned, that the odd examiner does something in an inappropriate way. We have a malpractice team, and we investigate it. Widespread-no.

Rod Bristow: The question is what "it" is. The issue that has been raised against us is that our exams are easier, and that simply is not the case-all our rigorous analysis shows that. We utterly refute that.

Q241 Craig Whittaker: What will you do to make things better? Whether you knew it was going on or not, the evidence strongly suggests that it has been widespread for a very long period of time. You are the main guys who drive these events. What will you do to clean up your act?

Mark Dawe: Like the others, I am not sure that we can accept its being widespread and for a long time. First, we must clarify what is appropriate and not appropriate. There are actually guidelines as to what is appropriate in these sessions, and some of the examples given in the press were perfectly appropriate. In our English one, at the end of the day the examiner was saying, "Read the question." That was the guidance, and it has been going on for decades. "Read the question" was the advice, and this was supposedly some terrible insight into the next question paper.

Therefore, I think we have to be very clear about what is appropriate and what is not appropriate. If there is a decision that that line needs to move, we will all agree to move it. We certainly will, but at the moment we believe that the culture in our organisation is to give appropriate support, and that is what happens. If what is currently deemed appropriate is no longer acceptable, let us talk about what should be out there.

Gareth Pierce: We will certainly review the parameters and improve and clarify the guidance, especially for situations where examiners are not in the company of subject officers.

Andrew Hall: My position is the same. As I said, we actually believe that the courses that we run are properly run. I think my opening remark to the Chair’s first question was what we were going to do to improve things, and we talked about it. Our controls are good, but, yes, they need to be better now. That is about recording things, because we then have 100% vision, but that is actually too late to some extent. You actually need to fix it at the beginning. The point that I want to make is that we do have very rigorous training for all our people, and we have to build on that.

Rod Bristow: I do not think that it is that widespread or endemic, but nevertheless The Daily Telegraph has highlighted areas that we need to look at and where we need to improve. We have been clear, and we are being clear, in our determination to do so and the actions that we are taking. I think that is the important thing, particularly in these areas of training.

Q242 Craig Whittaker: Mr Pierce, have the actions of Mr Evans and Mr Barnes compromised the security of your summer 2012 GCSE papers?

Gareth Pierce: No, they have not. We have fully reviewed all issues to do with the GCSE history papers for the next series, which is the June series, and we are absolutely confident that there has been no compromise at all.

Q243 Craig Whittaker: What guarantees can you give to parents that that is the case?

Gareth Pierce: We have gone public on that finding. The guarantee is based on the fact that on one of the issues-namely the rotational cycle-all that information was already in the public domain, and it referred to sections of the specification, not to questions.

On the other theme, which is to do with the OSA specification, the emphasis was already in examiners’ reports, and it is in the spec, but we are issuing a note of additional guidance to centres taking our specification just to confirm our position on that.

Q244 Craig Whittaker: Just for clarity, would it be fair to surmise that The Daily Telegraph’s allegations give you absolutely no cause for concern about the security and the integrity of each of your papers?

Gareth Pierce: That is right.

Q245 Craig Whittaker: So it is fine. No worries.

Gareth Pierce: The concern for me from The Daily Telegraph’s reporting of the GCSE history event is to do with the tone and the language, which were unacceptable. The content or the message behind that communiqué was in fact correct.

Q246 Craig Whittaker: So total denial.

Gareth Pierce: No. There were two aspects of the message. One was on the rotational cycle. That was in the public domain, and therefore-

Q247 Chair: So there was no revelation.

Gareth Pierce: There were absolutely no revelations.

Q248 Chair: "We’re cheating, we’re telling you the cycle. Probably the regulator will tell us off." So basically it was one individual who was tired at the end of the day and actually it was wrong. Actually, the cycle was publicly available to anyone.

Gareth Pierce: Absolutely. So why on earth there was a need to describe that as cheating or say that the regulators would be unhappy about something that has been in the public domain for several years, I do not understand. I do not understand the need for that language or tone, but the message that was conveyed was actually correct as per the published guidance.

Andrew Hall: But there is a point that I will make if I can here. While we can look at this technically and provide the evidence to whoever wants it, this has clearly been something that is in the public domain. The earlier the feedback comes back from this inquiry and from the work that the regulator is doing, the more that will do for broader public confidence. There is a public confidence issue in this, which, as I run one of the awarding bodies, I am very conscious of. We have to convince people about this, and I think that is why the inquiries are important.

Q249 Chair: As you know, as part of the inquiry that we have been carrying out before this all broke, the Advisory Committee on Mathematics Education wrote to us and said "chief examiners...run paid-for training sessions which risk being focused on coaching participants on how to pass the examination, further encouraging ‘teaching to the test’ in schools and creating an incentive for the examiner to set questions in such a way as to reward those who attend the course". That quite neatly sums up some of the misgivings that people have and the Telegraph story simply brings that alive. Do you think that the Advisory Committee on Mathematics Education is fundamentally wrong?

Mark Dawe: The linking of seminars to questions is non-existent. There is no evidence of that whatsoever.

Q250 Pat Glass: There is an awful lot of money involved in this, isn’t there? We have heard that the average secondary school is now spending £100,000 on exams. Tessa teased out that there were about 3,500 of these training courses. All of that means release and cover. Then we have got the CPD and then we have got the resources. All of this is money that is coming out of schools, that is not going directly into teaching and learning. If the Secretary of State decides after all this-there is an issue of public confidence here-that we are going to have one exam body, which will be in-house, where we will have a separation of examinations from publications, would that not give the public, parents, employers and young people themselves greater confidence in the integrity of the system?

Rod Bristow: I would say that it is worth looking back 10 years to see what the exam system was like then, when there was huge disarray in the system, with students routinely not getting their results on time or getting incorrect grades. A tremendous amount of progress has been made since then and that is worth bearing in mind. In fact, that was the time that Pearson became involved and Edexcel became part of Pearson. We made a significant investment at that time of £35 million into an awarding body where profitability was extremely low. As a result of that investment, we no longer send exam papers round the country in cardboard boxes between examiners and the awarding body in ways that they can get lost. We are able to monitor in real time the marking that is being done. We can double mark it, because the marking is done on screen. The quality of marking has increased tremendously. That improvement has come about as a result of innovation, which has been fuelled through competition through the awarding bodies.

We need to ensure that we do not discount some of the benefits that the current system provides. Nevertheless, it is the case that a lot of questions are being asked. We can see the performance of the UK in international league tables and we would all like it to be better. Mid-table is not where we want to be. It is absolutely right that we review all of the available options and whether there are ways that the system can be improved. The Secretary of State recently put a suggestion on the table and I would say that no suggestions that can be made to improve the system should be off the table.

Andrew Hall: My response would be that we need to be very careful and think through the change we make. Before I came to AQA, my first leadership role in education was in running an organisation called QCDA, which I was asked to take over when the 2008 SATs had a challenge, which you may recall. I started to lead that. That was, in a way, a large national body delivering a large national test. There were many disbenefits of having one large national body that we could rehearse with you. I understand that we may be coming back on another occasion to do that. I do not think that it is entirely straightforward.

Gareth Pierce: My view would be that every model or system has some advantages and some disadvantages. In considering this, which is very important and has to be considered very carefully, it must be matched against the question of our key priorities. For example, whether the key priority is comparative standards within a subject, innovation, operational risk or costs, different models or systems would score better or worse against those key priorities. The debate has to be informed by the key priorities, as well as by what kind of regulatory framework, if any, would be needed for each of those models or systems. It has to be a holistic debate.

Mark Dawe: The important thing is that, when you talk about cost, it is still roughly 2% to 3% of the overall budget in a school or a college. As I have said, I came from college, where exam fees were high, but it was 2.5% of our overall cost. That was for external assessment of the success of our students and they walked away with a certificate that they could use with their employers. That seemed like a fairly reasonable fee to pay, actually.

In terms of having one board, I think that there are enormous risks with having one board. Maybe that is a future discussion, but it would be a false confidence for the public if we said that we would just create one board. There is a lot of talk around A-levels and involving higher education in A-levels, and they can bring along some of that confidence and say, "We’ve worked with the exam boards; we are comfortable with these A-levels as being robust, rigorous and progressing through to us." That is the sort of thing that should give confidence to the public.

Q251 Pat Glass: How do you respond to the allegations? At the bottom of this there is the perception that some schools-some teachers-are paying for access to privileged information, and that a school that cannot afford it or chooses to go down a different route does not get the same access. How would you respond to that, because that is what is sitting at the bottom of these allegations?

Mark Dawe: That information is freely available. As I have said, it is this blended learning approach-some people like to read it online; some people like to have it face to face. We see a churn of people coming and going, and they come and go for the same reasons. As many people come to us, saying, "You are a better exam board", as leave us. That gives me some confidence that there is some balance in the system between us all.

Andrew Hall: Likewise, we have put all those materials on our website, but only for teachers to access. They are secure-you have to register as a teacher. The one thing we do not want is young people coming in and potentially misunderstanding. You need a degree of expertise to understand the materials. So that is there, and any teacher can have it, whether they take our examinations or not.

Q252 Pat Glass: Can I move on to the competition between exam boards, because we have heard a lot about that as part of this inquiry? What internal controls do you have to ensure that you do not compete on market share?

Andrew Hall: I had the embarrassment of leaving out one of my management team when I was talking. That is a key part in doing that for me. We look at our business. We absolutely would expect to compete and deliver better service than everybody else-that is what anyone would want to do. We believe we have standards and we have rigour. We look at how we are awarding.

There are two parts to what a standard is-I know you have some of the technical experts coming to see you later on: there is the content and then there is the grading and awarding of that. The grading and awarding of that is where there is absolute scrutiny. We share data-we share it with the regulator-and we have made a great leap forward now.

AQA, for a long while, has argued about statistical predictions being a really important part of judgment and now, for the last new GCSEs, the regulator has agreed that they will form the fundamental base. That means that you cannot compete on standards. It is then about the content, and that is how that interlinks. There are things like: are the reading ages in all our mathematics papers the same? I am not sure. But that is an area where we say that the awarding takes care of that, so there is no scope for competing on standards.

Q253 Pat Glass: The concern is that schools are shopping around to come up with the easiest paper. What systems do you have in place internally to make sure that your drive for increased market share does not lower standards?

Rod Bristow: I think that part of the review that is under way at the moment really ought to include this idea of what incentives can be put into the system to raise standards. I think that that is a very fair question to ask. There are examples internationally that we can look at. For example, in the US, there is the Race to the Top programme, which I believe is very highly regarded and seems to be doing a good job, where there are specific incentives in place for schools to take actions that will raise standards. We need to look at what incentives can be put into the system to do that.

Gareth Pierce: I think, as well, it is the fact that the standards theme is the one theme that brings us together. That is the one we work on collectively, as responsible officers or as technical staff within awarding bodies working through data, so it is the standards agenda that brings us into a collective, but we compete on everything else. We compete on quality of specifications; we compete on quality of teaching and learning resources; we compete on accessibility of our staff; and we compete on value for money. The standards agenda is what brings us together. We work collectively on that and we work with the regulators on that.

Andrew Hall: There are two points. There is the internal control. I go exactly back to the standards point, because that is absolutely how we nail it. It is a matter of proven fact-we can demonstrate it. The other part is a culture in the organisation. We have just over 1,200 full-time employees, and a highly significant proportion of those come from the teaching profession; they are teachers. They are not driven-there is no reward for them-to build market share and to compete on standards. They are managed day in, day out around-

Q254 Pat Glass: In the system, there is the cost of failure-of not reaching the results the data say schools should, or not reaching the floor targets. There may not be incentives in that sense, but there are massive issues if they fail. Is that not the incentive?

Mark Dawe: Our long-term survival is based on maintaining standards. In our case, even if Ofqual told us we could drop our standards, I would have Cambridge on my shoulder, saying, "No, you won’t." We have access to a research team that does a lot of comparability work, monitoring standards across all of us within Cambridge and identifying where there may be issues. Ofqual fulfils the same role.

Q255 Chair: Where are there issues?

Mark Dawe: There have been issues in the past where an awarding body has been shown to be too hard or too easy and we are told to shift.

Q256 Chair: You are doing this work yourself and are looking across the piece, so tell us your list of the top three subject and board concerns based on that assessment. In relation to my earlier comments about looking at a budget and profit and loss report, you would look at the discrepancy. Where are the discrepancies in the past three years?

Mark Dawe: We had a sociology paper that was considered to be too hard and were told we needed to ease up on particular grade boundaries. Ofqual recently reported on English and a couple of other areas where they have identified some papers as easier.

Q257 Chair: Mr Pierce’s board has gone from 7% to 18%. It is suggested that that English GCSE might be easier. Is that borne out by your data, Mr Dawe, before we come to Mr Pierce to defend himself?

Mark Dawe: That was an Ofqual report. That suggested that that particular paper was slightly easier. I will let Mr Pierce talk about that.

Q258 Pat Glass: There are perceptions in the system that Edexcel is easier and that your board is unsuitable for children from private schools. Where do these perceptions come from if they are not real?

Mark Dawe: They are historical: they go back decades, to Oxford and Cambridge. Our spread across schools is even: the percentage of schools and our market share is the same. It is perception. Nowadays you just can’t be that different, because of the rules that are put in place.

Q259 Pat Glass: We met lots of examiners yesterday. One criticism they made was that the cost has gone up and yet the amount of face-to-face quality assurance has gone down, in some cases to virtually nothing. They saw that as a major flaw in the system.

Mark Dawe: In terms of price per exam, the increase has been below inflation over the past five years. The volume of exams taken has gone up substantially. That goes back to the wider system and the pressure to do more exams and get more results. When we talk to universities, they do not want more exams, but in the system there seems to be a pressure for more exams. The cost per exam has not gone up substantially.

Q260 Pat Glass: One of the recommendations made to us was that there needs to be more face-to-face quality assurance-what I used to call moderation-across boards and within boards. That is costly.

Andrew Hall: There is a debate around that-we call it standardisation; the terminology varies. If you take a large subject such as English, the number of people you need to standardise is immense. One challenge of having it as a face-to-face meeting is about human behaviour, about which you know more than I do, I suspect. You cannot have one person talking to 500 people and actually getting an effective message across. You therefore break it down into teams. When you break it down into teams you find, if you are not careful-and this is where we get quality of marking issues-that you get different interpretations of one man’s or woman’s voice going through the system.

Using online methods of sharing that information means you can have much more direct communication, with just the physical logistics to make it work. It also gives you the ability to test and see more effectively how the standardisation works; if you do it on a face-to-face basis, that is very difficult. The online approach makes for more rigorous, proven standardisation.

I do not know which examiners you have had in; they will have different views.

Q261 Pat Glass: They came from all the big boards and they all said the same thing.

Andrew Hall: I am not saying which board. One of the reasons some examiners do not like it-I am sure that would not be the senior people you had in-is that it finds them out. If I look at my e-mail box just after the results come out, one of the greatest criticisms is, "My students didn’t get the results I expected. Your marking is wrong." That is, I believe, our biggest issue with the quality of marking.

Gareth Pierce: I would like to comment on the point made. Yes, our growth in volume is notable in several subject areas, but it has all happened in an era when there has been very tight regulatory monitoring of standards through agreed statistical procedures. However, I would want to caution against being too comfortable as a country or countries. If we are couching our debate on standards increasingly in terms of prediction models and tolerances, that makes me uncomfortable as a statistician and as an educationist. I think standards in any nation’s education system need to be based on quality of achievement, the quality of candidates’ work, so I would caution against being too comfortable when our standards debate is couched statistically, especially in an international context where we want to make sure that our young people can have successful futures and can contribute to a successful economy.

Q262 Chair: The crude level is seeing that big increase in your market share and thinking, "First, is that because you are perceived to be easier? Secondly, are you actually easier and do the data back it up?" Are you more likely to get an A* in English from your board than you are from OCR, and do the data back that up?

Gareth Pierce: The answer is no, because we are working within this very tight-sometimes we might say too tight-statistical environment in terms of all our awarding data.

Q263 Damian Hinds: Gentlemen, I think we now better understand the business models that you run. It sounds like the vast majority of your income comes from examination fees. You do not make a lot of money out of textbooks, but you have to have a textbook to give some confidence to the schools and the teachers that it is going to guide them towards the said exam. You say you run training courses at a loss, especially when there are new specifications coming out, which I have to say sounds a little like saying that Procter & Gamble runs product launches at a loss. I put it to you that those things are marketing costs. In a business, an industry, where examination revenue has more than doubled in seven years, you do not need to grow market share in order to be very successful; you just need to hold your own. Is there anything fatally flawed in that assessment?

Rod Bristow: I would say there is because I do not believe they are marketing. I think they are actually providing a benefit-they are enabling teachers better to understand how to do their job.

Q264 Damian Hinds: I did not dispute that it was a benefit for teachers, but it is a cost of doing business, if you like. In order for people to go with you when the new specifications come out you need to have a training course.

Rod Bristow: It is a benefit for students.

Mark Dawe: It is a requirement.

Q265 Damian Hinds: You need to have a training course? It is a cost of doing business.

Mark Dawe: It is in our code of conduct from Ofqual that we should put these things on. It is a requirement.

Q266 Damian Hinds: Mr Pierce, in a previous inquiry this Committee did on the English baccalaureate, there was much discussion of your new qualification in Latin. There was some consternation among some of my colleagues that this qualification would not count towards the English baccalaureate, and some marvelling at the remarkable growth rate that you achieved for this new qualification. What was the secret of your success?

Gareth Pierce: That is a very interesting individual case. A professional group, the Cambridge School Classics Project, came to us to explain that they felt there was a gap in the type of provision available in that curriculum area. We were not sure what to do because we were not expecting that approach. It came and we decided to respond to it, and, lo and behold, there are many-

Q267 Damian Hinds: Sorry. The gap was for what?

Gareth Pierce: The gap was for a curriculum offer in the Latin area. I think there was a time when there were two or three options available from different awarding bodies, but that had come down to just one. In a way, this is an interesting microcosm of the question about a single provider. What they were saying to us was, "There is a single provider and there are schools and colleges that would like to provide but are not happy with that single offer." Therefore, they engaged with us and we developed a model offer that was approved by the regulator, and it became an accredited level 1, level 2 qualification.

Q268 Damian Hinds: But it was easier.

Gareth Pierce: No, it could not be easier because it is at level 1, level 2. It is different, but it is at level 1, level 2. The regulators have accredited it, which is what confirms to us and gives us the assurance that it is comparable.

Q269 Damian Hinds: My final question. I would like a quick answer from all four of you if possible. When we spoke to the examiners yesterday they talked to us about seeing expected results. So when they were marking papers they would have some visibility of what that cohort of students, other things being equal, should score. What is the role of that? What do those examiners actually see?

Mark Dawe: There is statistical work done on the cohort to get those expected results, but then there is an enormous amount of judgment made by the examiners in the awarding.

Q270 Damian Hinds: Why should the examiners see that, rather than you just doing it as a post-analysis exercise?

Mark Dawe: Again, there is an expectation from the regulator that, to ensure that our spread of grades fits the expectation, we do that analysis. So it is just part of the process, but it informs; it does not give the final award.

Q271 Damian Hinds: To be clear, it is a requirement from the regulator that you tell examiners, in advance of their marking the papers, that overall-not for the individual student perhaps-this is the sort of average mark that the students should be getting. Is that right?

Mark Dawe: The examiners mark separate papers without any knowledge of what should or should not happen. It is when it comes to the awarding meeting at the end that discussions are had.

Gareth Pierce: It is at the awarding stage that we operate within guidance on what kind of information should be brought to bear on those awarding decisions and that can include a whole range of what we deem relevant. Usually the regulators are aware of what kind of information we use and therefore they can also give a view on whether it is relevant. So it is information on the cohort. It is information from these predictor models. It is information about the quality of work of candidates because the examiners will be very clued up on that. So it is a bringing together of data and quality of work information. I think the trick is making sure that it is in the right balance.

Q272 Damian Hinds: What does that information get used for?

Gareth Pierce: It is used for setting what are called the judgmental grade boundaries. At GCSE, for example, the grade A boundary and the grade C boundary are among the judgmental boundaries. Examiners have to debate that, and they need to bring together their own view on the quality of work and the available background data.

Q273 Damian Hinds: Given that we are told that different cohorts can vary substantially and that is one reason why perhaps over time there may be what some people call grade inflation, have you known a time when your overall results have come in significantly below the level that you have been told to expect?

Gareth Pierce: It is quite interesting at the moment because we have some statistical data which are predictive. So for next summer’s exams we will have some predictive data about that cohort which will help the decision making at that time.

Q274 Damian Hinds: Sorry, can you explain that?

Gareth Pierce: In the lead-up to next summer’s awards there is a bringing together of data across awarding bodies which portrays the background characteristics of that cohort. The regulators favour that kind of collective work, and so it gives us some indication predictively of what each of our cohorts looks like in terms of quality.

Q275 Damian Hinds: Do you find that the end results end up coming in remarkably close to that prediction?

Gareth Pierce: Well, in a sense they have to be, because within this culture of working we have to work within certain tolerance limits, so they have to be pretty close. But then there is also what happens after the event. We do some collective comparisons-the screening stage after a summer series is awarded-when we again look at the data, and there could be recommendations from that retrospective look. It is quite complex.

Andrew Hall: We use those in our organisation. As I am personally responsible and held accountable by my trustees for awarding the grade boundaries, we set ourselves a tolerance, which will vary according to the subject to some extent. If the results do come in up or down from the guidance, I will have very detailed conversations with the chair of the examiners-I think more than 1% is my broad rule of thumb, but occasionally I will go within that. You get all the reports and you sit and read them and study them. It is a really important way of controlling standards. I cannot stress that enough. Gareth and I might not be in quite the same place on this, but that statistical prediction is crucial.

Q276 Damian Hinds: Mr Pierce is a statistician. I do not know whether any of the rest of you are. I am not. It sounds to me, however, from what you are saying that it is physically impossible to have an extreme year where the overall GCSE results would be, say, 5% or 10% down on a previous year, but within a certain tolerance level you can have variation. Lo and behold, it just so happens that over the last x number of years there seems to have been on average a small tolerance above the expectation and that is what has led to grade inflation but there have not been counterbalancing years when it has gone the other way. Is that a fair interpretation?

Mark Dawe: One of the issues is where you have continuous change in the syllabus and specifications. It is very hard to maintain standards year on year because you are starting with a new set, so some of the work that is done year on year becomes much more difficult if it is a new syllabus. But also, if the work is poor-if the examiners are sitting there saying this is just not good enough-we will put it down; we won’t be within those tolerances, and we will justify why we are not. So we have the opportunity to do that. I suppose it is exception reporting: if you come outside the tolerances, you need to explain why.

Damian Hinds: I would love to pursue this more, but I know that I can’t.

Chair: Gentlemen, thank you very much for giving evidence to us this morning.

Examination of Witnesses

Witnesses: Dennis Opposs, Director of Standards, Ofqual, and Glenys Stacey, Chief Executive, Ofqual, gave evidence.

Q277 Chair: Good morning, thank you for joining us today. Headmaster Bernard Trafford wrote in The Telegraph last Friday: "The public are outraged, but no one in education is surprised." Were you surprised?

Glenys Stacey: Yes, I was. I was surprised by the nature of the comments and by the tone and tenor of some of the material I saw in those clips. I was not unaware of concerns about what happens in seminars. Indeed, you will know that I have been sufficiently concerned about what I have heard and the noises in the system about that to declare it as a priority area for me, and did so about a month ago. I was aware that sufficient noises were there for us to look at it, but I was surprised at the exact nature of what I saw.

Q278 Chair: The awarding bodies referred to data showing that their standards are all in line, according to the statistics. Do you have that data? Do you feel that they show that? Or are there persistent differences between boards on particular subjects, be it GCSE maths, English, or whatever?

Glenys Stacey: That is a big question. I will answer it briefly and see whether you want to explore any particular aspect of it. I know, Damian Hinds, that you were asking particularly about statistical modelling and really about grade inflation.

As you know, one of our concerns at Ofqual is grade inflation. We set out this year to control it, using a comparable outcomes approach. We did contain it at A-level and will continue to do so. That is a rather technical process of moderating across subjects, grade boundaries and awarding bodies. Yes, we have that data, and we get data in on a daily basis from awarding bodies throughout the summer examination season. It is a big task; some 15 million examinations are sat and scripts marked. We have that data.

Apart from that, we do comparability studies, as between awarding bodies and between subjects, and studies over time. We have about 50 or so of those on the record so far. One or two have been referred to-I caught the last bit of the last session. For example, we have data in relation to A-level English that gave us cause for concern.

The last thing I would mention is that in September this year we published an NFER study looking at differences between awarding bodies, particularly our Northern Ireland counterparts and the WJEC. We are looking very closely at these matters.

Q279 Chair: What do the data show? Where are there significant differences? I know that Ofqual has only been going two years and that you have not been in post that long, but what is your early take?

Glenys Stacey: The early take is that we can and do moderate robustly across grade boundaries as between awarding bodies; first of all, dealing with the grade inflation point, awarding bodies co-operate in that they are increasingly using the statistical models, which we applaud. Examiner discretion and judgment is an important piece early on in the process, but inevitably, given the nature of humanity, there tends to be an element of doubt. Given the benefit of the doubt in the system and left to its own devices, that inevitably means gradual grade inflation, so statistical modelling does help you to take a different perspective and see whether there is justification across the cohort for change. I think that does work quite well.

Q280 Chair: Schools are desperate to meet the accountability measure, which is overly fixated on A to C GCSE, and there may be a desire-it seemed to be expressed by people recorded by The Telegraph-for boards to make it easier through their exams for schools to do that. Any board that stands out as easier is likely to drag everyone else up, as the others try to make sure that they are not perceived as much more difficult than the new, perceived easier one. That would itself contribute to grade inflation. Are there particular subjects where there is a link between, for instance, increase in market share by a particular board and data suggesting that you are more likely to get a higher grade?

Glenys Stacey: I would not necessarily say particular subjects. I think you need to refer to the NFER report, which looks at the broader picture across subjects and which gives us cause for concern in relation to, as I have said, CCEA in Northern Ireland and WJEC. It is early days in discussions with them about what lies behind the apparent statistical anomalies. We are certainly looking at those.

Q281 Chair: Mr Opposs, anything to add to that?

Dennis Opposs: The NFER report was primarily commissioned to test out this statistical model-this approach to trying to ensure comparable outcomes-and it gave the method we are using a pretty clean bill of health. As part of that, there is a lot of data in there about individual A-levels. As we have said, the main issue that is being followed up now is to do with the Northern Ireland board and the comparability of some of its A-levels with others.

Q282 Chair: Right. What about WJEC?

Dennis Opposs: That is in the report on the website, and all the data are there. The report covers three years-2008, 2009 and 2010-and it suggests that over those three years WJEC has come much more closely into line. We are planning to do a follow-up looking at what has happened in 2011.

Q283 Chair: So it was out of line, and that may have contributed to its growth in market share in certain subjects?

Dennis Opposs: In this A-level study, we have not looked at whether particular subjects and their grading relate to the market share.

Q284 Chair: If WJEC had gone from 7% to 18% in GCSE English, which is not exactly a minor topic, I would have thought that notwithstanding NFER reports and large-scale data assessment, I might have just picked that one and gone and had a good poke at it to try to understand what was going on. But that is just me.

Glenys Stacey: What is interesting to me as a newcomer is that at a very high level, as awarding body chief execs have said to you this morning, market shares are relatively stable. Year on year, you do not see a lot of movement at the highest level, but what is interesting is the churn underneath. We suspect that the movement underneath might possibly be telling us something about standards. We are very interested in evaluating the churn or movement data year on year between one specification and another to see what it might be suggesting to us, although not much is certain.

What I do know is that when I go out to schools and ask about particular movements-for example, noting the increase in market share in the particular qualification that you mentioned, I was at an independent selective girls’ city school recently that had chosen to move to that qualification. I asked why that move had taken place, and the teacher responsible said to me that she was very impressed with the levels of service provided with the qualification, so I gleaned something about why one teacher in one school chose to make that change. You will recognise, I know, that many of the choices made about this board or that are made at the chalk face.

Q285 Chair: Yes, but it is at the chalk face that you have schools absolutely driven to a frenzy in their desire to meet the five A to C GCSEs. The floor target is going from 30% to 50% so the system, which is already over-fixated on one particular measure, is going to be cranked up even more. That is why we need you to check whether it is leading to perverse outcomes. When the head of Ofqual arrives, I do not know how many teachers will say, "I’m choosing that board because I think it might be easier. It’s an excellent service." I do not want to be cynical, but I don’t know how often they will fess up and say, "Actually, it’s because my head teacher is going to sack me if I don’t improve the percentage of people who get a pass, because apparently we are the worst in our subject-we’re the worst in the area. I’ve changed my board so that I don’t fit that category any more."

Glenys Stacey: You touch on something that I have mentioned in correspondence-that is now public, actually-about how the wider system works. Our job in Ofqual is to secure standards in a delivery model with awarding organisations. That is our raison d’être, our determination and our intent, but that operates in a wider system, and you mention aspects of it there.

Q286 Tessa Munt: Do you think that the comments of individual examiners have in any way jeopardised future examinations, particularly next summer’s?

Glenys Stacey: Yes. Obviously, that has been a priority for me in the inquiry that we started just last week. We have had rough transcripts only of the material from The Daily Telegraph-it retains the copyright on those. We have looked through those and we do not see an immediate concern, but the evening before last, we received the audio tapes-some 54 hours of them. We are checking to make sure that there are not any more. We are having those 54 hours transcribed, and as soon as that is done, they will be passed to awarding bodies to check those examination papers that are already set and on the shelf.

We know now, I believe, that none of that material was covering examinations to be sat in the January series, so we are looking at examinations that might be set for the May or June series. We can get our awarding bodies to go through the process of looking closely at the material and evaluating the papers. Hopefully, that will give us the assurance that we need; if there is any doubt, papers will be pulled and replaced.

Q287 Tessa Munt: Is it worth changing the papers anyway?

Glenys Stacey: The paper has been set. We have seen nothing as yet that compromises it. Once we have looked at all the evidence, we will take a view on whether there is any sniff of a legitimate concern. We will also consider public confidence when we make a decision about this-rest assured we will-but we will make that decision in January.

Q288 Tessa Munt: I asked only as a matter of public confidence-the perception is that this is tainted, so I wondered what action you might take?

Glenys Stacey: The issue is that The Daily Telegraph has done an excellent job in getting to these seminars and presenting this evidence, but it has covered only a small number of seminars, on subjects that may have been selected because of particular concerns expressed. I don’t know; I need to understand that better from The Daily Telegraph, and will do so.

There is a question of scale. We have not seen evidence from all the seminars across all the subjects. We need to think about whether you simply pull all examination papers, which would be quite destabilising, or whether you are driven to take a proportionate approach based on an evaluation of the evidence, but also taking into account public confidence. I have that very much in mind.

Q289 Tessa Munt: Are you doing anything to reassure students who are due to take their exams in the summer?

Glenys Stacey: We will be reassuring students when we have made a decision in early January, so we have time to do that.

Q290 Tessa Munt: What is your opinion on whether teachers are, effectively, paying for privileged access on a face-to-face basis?

Glenys Stacey: Obviously, I need to know more. I need to be given the opportunity really to look at the evidence. You will know from the correspondence I have already shared and exchanged with Ministers that I have sufficient concern about these seminars, and training aids generally, to make it a priority area of work for my organisation now. I also make the point that it is a wider system that is working-there is an extent to which our traditional approaches, not simply as a regulator, but in the system at large, are based on assumptions about the integrity of individuals wherever they are in the system. We need to take stock.

Q291 Tessa Munt: On the idea of questions coming around in a cyclical fashion, it has been said this morning that this is all open, and it is completely clear that it will happen, so if you are into gaming, you can go on the probability and work out what you might have. What is your view on that, please?

Glenys Stacey: I am not sure it is quite as straightforward as that. I think what you are touching on there is that we have a transparent system here. It is not the same in every jurisdiction, but we have a long-standing belief in transparency-not just in the examination system, but culturally across the public sector provision of services. Examination papers are on the web and available, mark schemes are understood and so on. One can claw back from that if it is desirable to do so. Do not forget, though, that it is not simply the awarding bodies at the moment; there are private providers running similar seminars as well, and they are not regulated at the moment. So we actually need a debate about how transparent we would like our system to be.

On the issue of how frequently some things come around, I am no expert as yet, but there are some subjects-Latin, for example-where there are only so many texts one can actually examine. In other subjects, there are areas of the syllabus that are so significant in terms of their size and weight that it would be very odd if they were not examined on a regular basis. So there is a balance to be struck. You do not want it to be formulaic; you want it to cover the syllabus with the right weight applied.

Q292 Pat Glass: I accept that looking at these training sessions is a priority for you in the future, but would you therefore accept that in the past Ofqual has turned a blind eye to some of these long-standing issues?

Glenys Stacey: As the Chairman said, Ofqual has not been around that long. It did have a predecessor, and I am not responsible for that. What I can see is that, since I arrived at Ofqual in March this year, we have raised very quickly in the public domain our concerns about standards. We have encouraged people to speak openly and honestly about these issues when, even a year ago, the climate did not seem to be one where one could voice these concerns readily enough.

We have taken a firm hand on matters of priority-for example, exam errors over the summer. We have run a standards debate with those who understand assessment and those with a wider interest. We have been looking very closely at grade inflation; certainly at standards of demand in qualifications; at marking; and, yes, at what I call, loosely, the commercial behaviours of awarding organisations, because we have a market model at the moment. Like any regulator, I am prioritising my resources, and I am concentrating on those issues that I believe people are really concerned about in standards.

So far as these seminars are concerned, we have set a requirement for May 2012-a condition on all awarding bodies-that they demonstrate that such events do not compromise the integrity of the assessment or the confidentiality of the subject matter covered in examinations. We have required awarding bodies to have conflict of interest procedures that deal with and control the role of examiners there or anywhere else. We have most certainly been monitoring those seminars through spot checks but, as the Chairman says, when the regulator turns up things seem all above board.

Q293 Pat Glass: Can I ask you about the conflict of interest procedure? Would you expect to be involved in that, either in drawing it up or particularly in monitoring it?

Glenys Stacey: The requirement is that they should have a conflict of interest procedure. They need to suit the particular conflicts and the size and scale of the businesses. Bear in mind that, although you are focusing on four or five awarding bodies today, there are about 180 of them, providing a wide range of qualifications, so it is a case of horses for courses when you are designing a conflict of interest procedure. Our job is, first, to specify that it should be there; secondly, to make sure it is there; and thirdly, to monitor its effectiveness.

We are in a year of transition here, moving to much tighter regulatory arrangements, which we have published for May 2012. We start our new monitoring arrangements in earnest in January. You would expect, I hope-certainly, we will be hoping-that so far as the big players are concerned, our monitoring will be what I call close and continuous.

Q294 Pat Glass: In terms of monitoring, do you monitor how much exam boards charge for support materials, such as training and textbooks, as well as the exam entries themselves?

Glenys Stacey: We have not been explicitly monitoring that to date, but it is inevitably going to be part of our study now on textbooks, support services and seminars. There is bound to be an element of looking at whether a profit is made and, if so, to what extent. My understanding is that the seminars are not thought to be a cash cow, let’s put it that way. We will get to the root of it.

Q295 Craig Whittaker: As a follow-up to that, some people say that you lack teeth or the will to go and tackle the issue around textbooks, for example. I know that you have said that you have put some guidelines in place, but are you doing enough?

Glenys Stacey: Yes, it is interesting, isn’t it? On lacking teeth, first of all, we do not lack the appetite to deal with matters, but we do need the right sanctions and the right powers to do that. We have been making representations to the current and the past Government about that. I am pleased to say that this Government are giving us and have given us the power to fine. We are now going hell for leather to ensure that we can have those arrangements in place at the earliest possible opportunity.

Fining has its proper place and we have not had that sanction. We have got what one might call the ultimate nuclear option, which is taking an awarding body out of the market, but for the sorts of things that we are talking about today, one needs intermediate sanctions, such as fining, in order to bite, if you like. I am very keen to see the right powers for Ofqual. I will be advising the Secretary of State in the new year of any new powers we think we might need in light of this. One, for example, is that whistleblowers to Ofqual are not currently given the same protection as those who go to other public sector organisations. We need whistleblowers to have the greatest possible protection, so that we can directly hear the evidence that we need.

Q296 Chair: How sound are your systems on that? It is very anecdotal, but we had somebody say that they used it and that you never got back to them. If somebody goes to the trouble of trying to contact you and they don’t hear back from you, that does not sound very good. That is just one instance.

Glenys Stacey: Given the number of letters that I sign each day, I find that surprising. I would very much like to see the example, if that is the case. I see what we are dealing with, so it is not a concern of mine at the moment, but if there is evidence, please let me know.

Getting back to why we have not done something about this before-

Q297 Craig Whittaker: May I stop you and take you back to that last comment about the number of letters that you sign off and what you deal with? Are you therefore suggesting that it is commonly known that it is widespread?

Glenys Stacey: No. In the sort of letters that I am signing off, when people are writing to our organisation, raising queries about any aspect of examinations or qualifications, many of those inquiries will relate to a particular issue about the delivery for one person or the mark for one person. We are ensuring that the awarding bodies are dealing with those appropriately.

Q298 Craig Whittaker: How many more teeth do you need to bottom this out?

Glenys Stacey: I certainly need the fining power. I certainly need to be able to build the capacity of the organisation, which we are doing, sharing our plans with DFE and BIS. We will be working increasingly closely with other experts on assessment in the wider academic field, for example, to ensure that, when we are looking at and evaluating evidence, we are reaching the right conclusions and determining the right priorities. By building our capacity and looking afresh at our powers, we will get where we need to be.

On powers, I think that fining power is the most significant thing. I am very pleased to see that coming. I am also keen to have a fresh look, particularly at whether-I think we are okay, but I want to check with our lawyers-we are able to extract from awarding bodies all of the data that we need, in a timely and dynamic way, to ensure that we understand some of the flux, churn and change in the system that we need to.

Q299 Craig Whittaker: Do you not do that already?

Glenys Stacey: Yes, we do get a considerable amount of data. I asked for more just yesterday in relation to a particular detail on flux over the summer. I want to double-check that there is not data or information available within an awarding body or across awarding bodies that we do not have access to. I am just getting my lawyers to check out statutory provisions to ensure that that is the case.

I think you asked why we did not deal with this before. I know that Warwick Mansell wrote a book-I think it was called "Education by Numbers"-maybe five or six years ago, and there are a number of issues there. I have read that book-it has been my night-time reading-and I have met Warwick Mansell to discuss the issues that are in there. Interestingly-I look to Dennis here-I am not aware of any direct evidence being presented to Ofqual, certainly not in my time, that this is an issue, but having read that and heard one or two noises, as I said, I have been sufficiently concerned about it to put it as a priority. I have shared that with Ministers.

Q300 Craig Whittaker: I am struggling as a parent, as well as a member of this forum, to really have confidence in the system. Are the Daily Telegraph allegations just a little annoyance? Speaking to the four guys we had on before you, I was quite ready to sign up to give them knighthoods-I thought everything was hunky dory. In reality I suspect it is not-I do not know because I do not have the evidence-but what is it? How do we put that faith back in the system? How do we make students and parents really feel that, when their son or daughter is taking a GCSE or A-level, it is robust; it is setting them up for their future, because, at the end of the day, that is what it is about?

Glenys Stacey: I will try to answer that as concisely as possible, but it is a big question.

First, we do annual surveys of confidence in GCSEs and A-levels, and we look particularly at the confidence that teachers and students have in various aspects, including marking. Confidence there is remarkably sound-among teachers, I think it is about 80% confidence in A-levels and about 70% in GCSEs-that they are robust, valuable qualifications with currency. Confidence in GCSE and GCSE marking has dropped a bit over the last year, which really sharpens our interest in marking. We could get on to that, but, generally speaking, that is one objective measure, if you like.

Secondly, this year we have compared our A-levels with the equivalent qualifications internationally, and we are about to publish our study on those. Dennis can certainly talk about the detail if you are interested, but, basically, our A-levels are standing up pretty well when we compare them with their international equivalents. Because our A-levels, if you think about it, are designed as the appropriate gateway into higher education here, they look more fit for purpose here than some of the qualifications abroad, but there are some nuggets in that study that we want to reflect on and to look at whether they could be transferred into our systems here. That is very interesting, and we will progress that into GCSE comparisons in the next phase of international work.

So there are some good signs, but there was just a sufficient element of concern-this is a very emotional and emotive area for people-about grade inflation, about whether demand is sustained, about some technical issues on the construction of questions and question scaffolding, about whether you are leading and clarifying, and where the boundaries lie, for us to sit up and take notice in the summer and start discussing with awarding bodies and with other experts, "Let us get a grip on this. How significant are these issues?" If they are significant, our job is to correct any slip.

In summary, as a parent you need to know these are good qualifications that have currency. We will continue to secure it, but there is a little bit of correction to be done.

Q301 Tessa Munt: I just want to challenge you on that a little bit, because in my experience of my neck of the woods, which is the west country, colleges absolutely consistently test every student for their capacity, ability, literacy and numeracy-the whole flipping lot. It does not matter whether they have seven grade As or four grade Cs, whatever it is, every college tests every young person going in. We have also heard evidence on this Committee from employers and people involved in further and higher education who do not trust the qualifications. They are not being given what they think they need. That is slightly at variance with your statement that all is fine, mostly, but there are some bits that need to be twiddled with. Why are colleges checking every child?

Glenys Stacey: It might be at variance, or it might be just a different element of it. One has heard stories or views that the student is not actually demonstrating the abilities, skills and knowledge that one would expect, having looked at the qualifications that they have. One hears these tales now, but one also hears them from 30, 40 or 50 years ago. It is a common theme, so we need to recognise that.

Q302 Chair: So is public confidence higher now than it was? The data would seem to suggest that the general public have a lower confidence in the system than teachers and parents do.

Glenys Stacey: I would not know what the confidence was in qualifications during the era that I am talking about. Just to continue, when employers or higher education are talking about the student, they may be talking about the knowledge base, the ability to analyse or true skill. We need to get a better understanding when a qualification is constructed about the balance to be struck between assessing knowledge, assessing analytical skills and assessing true skill, so there is a conversation to be had.

There is a common view that certain core skills, such as the ability to communicate, to spell, to punctuate and to use grammar well, have been falling short, and we have made changes this year to tighten up on that. You may be aware of them, but in GCSE subjects-some of the main topics now-there is a requirement for at least five marks to be allocated to the assessment of spelling, punctuation and grammar. Where we can see common skills across qualifications that employers and higher education are telling us are missing, we want to have that dialogue with higher education and employers. Where we find a common theme that we can correct, we are doing so.

Q303 Damian Hinds: One of the many letters that you signed the day before yesterday was to me, for which thank you very much. I was asking for more details on some of the research work that your organisation had done, and it was a very full response, which I am very grateful for.

Glenys Stacey: Thank you.

Q304 Damian Hinds: One of the things I found particularly striking was that in Ofqual’s own research on the quantitative study of attitudes, the proportion of people saying that the exam system was doing a good or very good job with no need for reform was 26% of teachers, 25% of students and 18% of employers. Does that not argue for rather more than a bit of change in the examination system? Does it not actually argue for radical change?

Glenys Stacey: If we are talking there about the market model-is that the focus of your question?

Q305 Damian Hinds: No, I am being very general at the moment. I am just saying that there seems to be a big problem with the exam system in this country. Although we have had many discussions about some of the specific aspects of it, it sounds as though the undermining of confidence in the system has become so widespread that some really radical changes may be needed. That is my hypothesis to put to you.

Glenys Stacey: My perspective on that is that we are identifying facets of the system that will undermine confidence. Grade inflation is one of those facets, and we have already outlined our intent and how we would actually deal with that.

A second area of the system that I know is significantly undermining confidence with teachers is in relation to marking. Here, when I speak with teachers and teacher representatives, as I do, the issue seems to be not the generic systems now used for marking-I think the chief exec of Pearson explained the innovations there, and they have the real potential to increase the consistency and quality of marking-but how individual awarding bodies deal with issues raised about individuals’ marks or the marks of a class in particular examinations. That is an area where we will be working with awarding bodies to agree a common approach to the service that anyone would expect when they raise a concern about marking. We wish to promote a much greater consistency and transparency about that. Particularly, you will understand that we do not wish to see any student advantaged or disadvantaged unfairly as compared with the cohort.

So, for me, as the regulator, I can see areas where a determined effort on our part, as we are now doing, should make a significant difference to confidence in the system as it stands. The question as to whether fundamental reform is required is a matter for the Government to consider. My view is that I am the regulator and I will regulate.

Q306 Damian Hinds: You talked about the international comparison and some of the analysis you have done. We know that in future the Secretary of State wants examination standards to be pegged to the best in the world-not the average. How is that going to work in practice?

Glenys Stacey: I will ask Dennis to outline our approach on that.

Dennis Opposs: As you know, as we have said today, we are carrying out this study at the moment looking at A-levels alongside qualifications used in many jurisdictions around the world. We will be publishing a report on that early next year. What we then need to do, or what we are doing at the moment, actually, is to try to draw from that what we can take in particular subjects that we can build into future A-levels in this country. One of the subjects we have been looking at is history: what can we learn from what we see around the world from the way history is taught and assessed that we can build in? We are still working on the details of how we will do that, but at least at the moment we are building the evidence that we will need.

Q307 Damian Hinds: To be clear, you talk about the way it is assessed rather than saying, for example, a top grade in the United Kingdom should be in some sense equivalent to a top grade in say, Shanghai, Singapore or one of the other jurisdictions that is frequently mentioned. Is that right?

Dennis Opposs: We have had experts in the subjects looking at material. We have said that we want them to think about the sort of standard that is required to get into a selective university in this country. So we are looking at something around grade B. We have not yet looked and we may not look at students’ work but they are able to analyse the syllabuses and try to make some comparison about the expectations in these different countries.

Q308 Damian Hinds: So it does include attainment levels?

Dennis Opposs: When you link that with the exam papers and so on that we look at, we can get a handle on that.

Q309 Damian Hinds: It seems that in some of these other countries, rather inconveniently, not only is the standard higher in some cases in the best performing systems in the world, but it keeps going up. So if we adopt the comparable outcomes principle here-and we have adopted the comparable outcomes principle here-is that somewhat in conflict with pegging to the standards of the best systems in the world?

Dennis Opposs: We have just started using the comparable outcomes approach. This was only the second year with the new A-levels. One thing we are aware of is that you cannot just keep that going for ever. You have from time to time to step back and review. What you have described could be one reason for doing that. It could be there are other reasons that are coming through that mean that you need to set your targets higher in particular subjects. So when we put in place our full arrangements for this, one of the things we will have to build in is the opportunity to adjust from time to time. We cannot simply put in place the current standards and say, "And they shall be carried forward for ever and a day."

Q310 Damian Hinds: In this Committee we hear frequently from businesses complaining that the qualifications people have are not equipping them for the jobs they want to be done. They find that people do not have employability skills and so on as well as core craft skills. We also know that some universities are finding it necessary to redo A-level maths or parts of it in the first year of a science course and frequently need to do their own test on top of the A-level system. If you were a man from Mars looking at this situation, wouldn’t you say, "Universities should be in charge of the standards of academic A-level qualifications and employers should be in charge of standards in vocational subjects"? And that would be without the need for some parallel system of awarding bodies, although I suppose you would need some intermediary body to pull it together.

Glenys Stacey: Can I deal with that? I am concerned about the distance between qualifications production and higher education, and I understand the Government’s laudable desire to get higher education more closely involved in the design of A-levels. The practicalities of that are all in the detail. You will know that people in higher education have a great, diverse range of views about the subject matter within individual subjects and about the structure of the qualification, level, demand and so on.

For us, the key to success is finding a way to understand well enough-in a sufficiently granular fashion-the different views within higher education by subject type, by higher education college type and so on, so that we can see to what extent and how higher education will be properly involved. There is a danger in listening to a small cadre of voices in higher education; we need to look at it in the round.

Q311 Pat Glass: May I briefly explore something that Tessa was talking about? Throughout the system, we are checking at every point. When children go from infant to junior schools, the junior schools are saying that the key stage 1 results are not matching what they see. When kids go from primary to secondary school, every one of them is reassessed. It is the same at college and university. Universities are re-teaching maths and so on. How much of that is about the exam system? How much is it about the pressure in the system that we keep hearing about in this Committee? Schools, colleges and universities are getting their retaliation in first because there is massive pressure to get grades A to C, floor targets and so on. Is the examination system being corrupted by the massive pressures in the system-or blamed, even?

Glenys Stacey: I am not sure that I am the best person to answer that. We have a limited role in overseeing the assessment system, the SATs and so on, but the primary responsibility for those is with another agency.

Q312 Pat Glass: But you are dealing with the outcome of all this, aren’t you?

Glenys Stacey: We are responsible for qualifications and examinations. The ones that bite in the system that you are talking about would be those, first, at GCSE level-there is a fair number of qualifications alongside GCSEs-and then at A-level. We are responsible for the standard of those qualifications, and that is what we are really concerned about.

I think the issues that you are pointing to relate to how the wider system works. I am sure that the Government take an overview of that and will take a view about it. The qualifications are simply one part of it. I am afraid that it is not really for the qualifications regulator to take a view or comment on the wider system.

Q313 Pat Glass: You are telling us that there is not too much wrong with the examination system, but that is very different from the public perception. Is what is missing from this the amount of pressure in the system?

Glenys Stacey: A number of things are missing from it. My concern is to make sure that a qualification, and particularly an examination, assesses the programme of study fairly and comprehensively and comes out with a reliable and valid result. When it boils down to it, that is my concern.

I suspect that you are touching on the weight that is put on qualifications. Of course, GCSEs have been around for quite a long time. They had a stated purpose, but things change, and Governments need to look at how qualifications are used, relied on and all the rest of it. There is that aspect to it, for sure, but my job is definitely around securing the integrity of the qualification.

Q314 Pat Glass: Okay. Is there sufficient consistency across exam boards? Are universities, as has been suggested by the Secretary of State, secretly looking at a maths A-level from one board and comparing it to another?

Glenys Stacey: So far as consistency is concerned, we do our comparability studies, which I mentioned earlier. We have 50 or so, so far. We point out where we find any shortcomings or inconsistencies-I think A-level English from one of the boards was mentioned earlier-so we are looking closely at that.

We accredit qualifications in the first place. If we see any shortcomings, we refuse accreditation and if you look at our record on GCSE science a couple of years ago, it took our awarding bodies two or three goes to get that right as we hiked up the standard, because we were concerned. So where we can see, if you like, that the level playing field is not correct, we can take steps to hike standards up.

As for what happens in universities to select candidates, I do think, as you would expect, that it is different for each university-entirely different-and some universities use quite sophisticated methodologies, of which A-levels and GCSEs are just one part. I do not have evidence at the moment to suggest that they are particularly differentiating one board’s English examination, for example, from another’s. That is not the information that we get as we talk with those in higher education, but, as I say, we are looking, and we have a programme of work on at the moment to identify more closely higher education’s concerns and whether they see A-levels as good, bad or indifferent, in sufficient granularity that we can do something about it.

Q315 Pat Glass: The Secretary of State has said that some universities have begun secretly screening applicants to cherry-pick those who have passed the more rigorous test offered by some exam boards, so will you be asking the Secretary of State where the evidence for that is and looking at it?

Glenys Stacey: The evidence that I am aware of is from when I spoke recently to Professor Partington at Cambridge. He tells me that he is certainly looking at grades. He has very recent evidence that shows that where students have three or more A* grades, they do exceptionally well in their first-year examinations at Cambridge. He believes that the A* grade is a good predictor. I also know that universities such as Cambridge will use other things as well. They will look at other aspects of the student. They may conduct an interview. They may look at other work, work experience, projects or whatever that individual has done.

The point here is about the reliance that one might put on one qualification, which may have been earned through just a couple of examinations, and whether you are putting a heavy weight on the A-level or whether there are other aspects of the individual that you can properly take into account in differentiating one student from another. My interest is in making sure that the differentiation by grade is robust.

Q316 Chair: Can I ask about your comparability work? Cambridge Assessment have been fairly critical of the methodology used by Ofqual in doing that, and that comes to the nub of a lot of what we are talking about here, which is the need to ensure that we do have comparability. If you do not have sound methodology, you will not get a sound result. That is my crude interpretation of what they have said. How do you respond to that?

Glenys Stacey: Okay. I will ask Dennis to deal with the detail in a minute, but, as an immediate reaction, we recognise at Ofqual that we are not the sole centre of expertise on assessment. In fact, assessment expertise is quite a rare thing. Clearly, it sits in the awarding bodies. They do have significant expertise. There is also expertise within some of the academic institutions. Our board is reaching out to that expertise. It is creating a standards advisory group, and we want those experts working with us, so that we are not insular in our approach and so that they are able to challenge and help us develop our approach to maintaining standards.

Q317 Chair: Did that not happen before? It was understandable that, when I was trying to invite anyone who wanted to do so to dob in one of the other awarding bodies, there was a reluctance to do that. I would have thought, however, that behind closed doors people would be very happy to say, "Our data suggest that this other board is not comparable with us, and unless you want us to dumb down, you need to get them to smarten up."

Glenys Stacey: You will understand that I am not entirely sure and cannot say how that might have worked in the past. What I am sure about is how it is working now and how we are going to get yet greater collaboration, if you like, across experts on assessment to ensure that our methodologies-be they research methodologies or comparability methodologies-remain robust.

Q318 Chair: And will you be open to them? As long as you are open to challenge and they have the opportunity to come to you, then hopefully we can have more confidence.

Glenys Stacey: The danger would be if we were not.

Dennis Opposs: Where we are creating methodologies for new bits of work that we have done, like the international work, our new approach is to produce what we think is our methodology and then to offer people the opportunity to comment on it, so that we can take account of criticisms before we finalise it.

Q319 Chair: This is something that I have asked you about before in the light of the Education Act 2011, as it now is. You are supposed to maintain standards over time, but you are also supposed to benchmark against the best systems in the world, which are all investing heavily, because we are a knowledge-based, global economy. How do you get the balance right between maintaining standards over time and moving against what might, if you benchmark with Shanghai and Singapore, be a very fast rising international standard?

Dennis Opposs: There will be tensions from time to time, yes. It may be that there is a qualification in Shanghai or somewhere that is at a much higher standard, and yet we have maintained our own standards and-

Q320 Chair: So which takes precedence?

Dennis Opposs: And at that point I think Ofqual would have to judge which way we go.

Q321 Chair: So have you got basically incoherent instructions?

Glenys Stacey: We do not have instructions on how we interpret our statutory objectives-

Q322 Chair: Are they in conflict then?

Glenys Stacey: And, indeed, we do not have that amendment to our statutory objective in place just yet, but we have been thinking carefully about our approach to that and our board has been considering it. The general approach that we propose is that, first, we continue to benchmark internationally and will have regard to the range of international tables and benchmarks, because they tell us something, but they do not tell us everything. For example, we know that our position in any one table might move down a dozen places, not because of any change here, but because a dozen other very sophisticated countries have joined the table. We need to have regard to and understand what those tables tell us-

Q323 Chair: So which one takes precedence? Is it stability? We have had grade inflation over the years and now there is a desire for stability. Is it stability or is it international benchmarking? Those two could be incompatible and I want to know which one you are going to jump on.

Glenys Stacey: The answer is that we want the best, so where we pick out, for example, from our A-level-

Chair: You should be in politics.

Glenys Stacey: That is an option I will bear in mind should things go pear-shaped, but for the moment I am a regulator. You have quite put me off my stride.

Chair: I think you have answered.

Glenys Stacey: No, no; I want to get this on the record. In looking at the international comparisons, we can look at the tables, but we need to know what they tell us, and we will take them seriously. On the benchmarking piece, when we look at how things happen in different jurisdictions, we need to bring back these nuggets, share them and debate them. For example, in relation to the A-level comparisons, there are just two things to mention: first, in other jurisdictions, generally speaking, there is a much greater reliance on and faith in teacher assessment. That is culturally quite difficult in this country. We like examinations.

Secondly, in those jurisdictions that have examinations, there is greater faith in multiple choice questions and a much more sophisticated use of multiple choice in some subjects, because they can really strengthen the breadth of coverage in the assessment. Again, that is a cultural issue for us, but we have learnt something from that. We are going to pick up these things, learn, play them out for people, discuss them and then take a view.

Q324 Tessa Munt: Michael Gove has suggested having one exam board for each subject to stop the race to the bottom. Do you agree?

Glenys Stacey: I read that in The Daily Telegraph on Saturday, and immediately asked for a meeting with the Secretary of State to understand his thinking better. I had that meeting yesterday afternoon-I was thankful for that. He has explained to me that he is entirely open-minded and understands that there is no perfect system-a point that I would make. I think sober reflection is required.

My job is to regulate the system. I have, at the moment, a system that involves a range of awarding bodies. There are 180 of them, so it is not simply the top players around the table today. If there is sufficient concern about the way the system works, yes, let’s have a look at it, but I think anyone who leaps to a conclusion would be making a mistake.

Q325 Tessa Munt: We are not unique, are we? A lot of countries across the world have a single exam board, and it would allow a single comparison.

Glenys Stacey: A single exam board is not necessarily the solution either. I think we need to look at resilience. For example, in France, where there is a much more limited model, you may have noticed that there were difficulties this year. I forget which subject it was, but I think it ended up with the Minister having to decide which questions would get which marks. That is not a desirable position for a Minister or indeed, the system at large, so one exam board is not necessarily the solution. I think that however many exam boards you have, it is a market, and markets respond to incentives. Incentivising the race to the top is what I am interested in.

Q326 Tessa Munt: Yes, I presume that it would allow a national comparison of young people, in terms of their outcomes.

Glenys Stacey: It would allow a comparison against one specification. That may not meet higher education’s needs, or further education’s needs, or employers’ needs. It is a complex business.

Q327 Tessa Munt: Some evidence to the Committee suggests that, whatever happens with the organisation or reorganisation of the exam system, you need more powers. Is there anything that you would say to the Committee about that?

Chair: In addition to what you have said already.

Glenys Stacey: The Secretary of State has asked me to state whatever powers we might need. As I have said, I am reflecting on that. The whistleblowing issue is significant. I want to check that I have sufficient powers around data collection. I want the greatest range of sanctions-of course I do-and fining has a significant place in that.

Q328 Chair: We normally end our sessions by asking witnesses to send us information, but I confirm that we will send through to you the details we have on the whistleblower who was frustrated by your system.

Glenys Stacey: That would be very welcome. Thank you very much.

Chair: Thank you very much for giving evidence to us today.


[1] (Witness Addition) Hodder actually published the new GCSE textbooks. The Welsh Assembly Government met the cost of translating them into Welsh.

[2] (Witness Addition) Please add that the London meeting on November 11th 2011 was the only CPD event that the Subject Officer has missed in the ten years that he has been in post.

[3] (Witness Addition) “As part of staff personal development OCR operates a Performance Management Scheme (PMS). The objectives set focus on qualitative drivers and goals. Employees are remunerated on a sliding scale of rewards reflecting the achievement of personal objectives agreed between themselves and their line managers. The scheme covers all employees including Senior Management Teams (SMT) and all receive the same percentage rewards (ranging from 0% to 5% of base salary). There is no form of automatic incremental progression up our salary ranges as a result. The CEO’s and SMT’s PMS objectives do not include specific targets for increasing revenue or market share.”

[4] (Witness Addition) “Assessment Associates (AA) Terms and Conditions prohibit them from giving any tuition, training or guidance where they have had access to pre-published secure assessment content. This prohibition was notified to examiners in March 2011 and is effective from March 2012. It relates specifically to training aimed at students, not to training aimed at teachers.”

Prepared 2nd July 2012