Education Committee

Written evidence submitted by Dr Ian White
Subject: Clear evidence of falling standards, how to find further evidence, and what I was told on an OCR exam course.
Dear Sirs
I would like to forward you some evidence regarding the recent exam controversies.
I attended an exam feedback session run by OCR for their A-level Physics course and was astounded to discover the following things, which should be of interest to you. I have also attached clear evidence of falling standards over the past few years and explained how you can find further evidence that the constant rise in exam grades is not down to better performance in exams (point 5). I am a Physics teacher in a state school.
1. Exam writing and marking is a part-time job. Exams are written by teachers in their spare time. This means that once an examiner has written an exam THEY ARE ALLOWED TO GO BACK TO THEIR SCHOOLS AND BEGIN PREPARING STUDENTS FOR THEIR OWN EXAMS.
2. A further consequence of the part-time nature of exam writing is the complete lack of quality assurance before an exam is issued. Quality assurance appears to be limited to a couple of meetings between examiners where they look at each other’s papers and comment on them. Worryingly, it appears that no-one ever actually attempts the exam in the time allowed before it is issued. For OCR Physics this has had several consequences:
When the new specification was issued three years ago it was nearly impossible for students to finish the exam in the allotted time. Rather than a test of physics skills, this made the exam a race. It also massively disadvantaged students who have English as a second language.
It also means that ambiguously worded questions only come to light after the exam has been sat. Many questions therefore become worthless as a means of assessing students, because they do not make clear what candidates are required to do to get the marks. I would have expected all papers to be sat by around 30 independent people before release, so that the wording could be checked for possible misinterpretations. During the meeting I attended there were numerous examples of this, to which the examiners responded with “yes, I suppose that wasn’t very clear.”
In short, exams are very badly written. This also explains the spate of exam mistakes this summer.
3. On the course we were told that certain questions require certain responses (where actually there are several reasonable and physically correct responses). When I asked how candidates were supposed to know the exact response a question requires, I was told “you came on the course, so now you can tell your students”. When you talk in your Telegraph article about the scandal of teachers being told the exact wording that students should use in their answers, the real scandal is that only certain wording will gain the marks: answers that show real understanding of a subject go uncredited just because they do not exactly match the mark scheme. This means you have no option but to teach to the test rather than provide proper learning.
4. An exam marker on the course boasted that he can mark a 60-mark paper in four minutes (markers are paid per exam marked, not something that promotes accurate marking). In physics exams there is a concept called error carried forward. This means that if a candidate makes a mistake in an earlier part of a question that affects their answers to later parts, their marks for the later parts should not be affected, provided they have completed those parts correctly.
This often involves some careful checking of the student’s maths. I would consider it very difficult to take error carried forward into account and still mark a paper in four minutes.
5. Grade boundaries are set only after all the candidates’ marks have been collected. This makes a complete mockery of the constant improvement in exam grades, as exam boards decide exactly how many students get each grade, ensuring that standards appear to rise every year. Grade boundaries are now so low that 30% is enough to get a candidate a C grade in some GCSE Science exams.
Here is an example of grade creep in the OCR 21st Century Science P456 paper (unit 332/2). The attached graph shows the number of marks (out of 42) required for a C grade in each exam sitting over the past four years. There is a clear downward trend (you can even extrapolate and show that by 2015 zero marks will get you a C grade!).
The information on grade boundaries is freely available, and analysis of all the science papers shows the same downward trend. For OCR, here are the links:
June 2011 grade boundaries:
http://www.ocr.org.uk/download/admin/ocr_60649_admin_unit_level_raw_mark_grade_boundaries0611.pdf
January 2011 grade boundaries:
http://www.ocr.org.uk/download/admin/ocr_56047_admin_unit_level_raw_mark.pdf
June 2010 grade boundaries:
http://www.ocr.org.uk/download/admin/ocr_47951_admin_mk_grd_bound_jun_10.pdf
For boundaries prior to June 2010 you need to go to the specific mark schemes provided on the OCR website. The link for the 21st Century Science Physics course is:
http://www.ocr.org.uk/qualifications/type/gcse_2006/tfcss/physics_a/documents/index.aspx
Plummeting grade boundaries are a severe problem because when the boundaries are so low, only a few marks separate one grade from the next. For example, in the June 2011 sitting of the P456 paper a C was 13/42 and a D was 10/42, so just three marks out of 42 separate a whole grade. That is roughly one question, which means that missing out a single question can affect a student’s result by a whole grade.
Low grade boundaries are the result of a conflict between Ofqual, which insists exam boards set very challenging exams “to maintain standards”, and the exam boards, which need a large majority of students to pass so they can maintain market share.
If I can be of any assistance in your research into the total ineptness of exam boards then please feel free to contact me.
January 2012