Policy and delivery: the National Curriculum tests delivery failure in 2008 - Children, Schools and Families Committee


Memorandum submitted by ETS

EXECUTIVE SUMMARY

  There is no question that ETS experienced operational and technical issues early on that hampered the marking of this year's national curriculum tests in England, and for those we take full responsibility. These issues were exacerbated by programme changes required by the NAA, long delays by the NAA on key project decisions and the layering on of additional project deliverables. Despite all these challenges, marking quality was maintained.

INTRODUCTION AND OVERVIEW

  ETS welcomes the opportunity to appear before the Children, Schools and Families Select Committee. Our most important stakeholders are the millions of English pupils, teachers, schools and parents who depended on ETS and QCA to deliver a well-run testing programme. They have been let down, and we take this opportunity to repeat our apology to them.

  I am a director and the chairman of the Supervisory Board of ETS Europe, the ETS entity responsible for delivery of the 2008 national assessment tests. In May 2008, I was given a mandate by ETS President Kurt Landgraf to investigate the ETS and NAA issues affecting the safe delivery of the testing programme.

  ETS wishes to cooperate fully with this Committee and with Lord Sutherland's independent inquiry. We remain bound by confidentiality obligations under the original contract with QCA and under the August settlement agreement. We have requested, but not yet received, permission from QCA to provide full information to the Sutherland inquiry and the documentary evidence requested by this Committee. We are hopeful that both this Committee and Lord Sutherland will be able to prevail upon QCA to give its consent.

  For the same reasons, we have made only limited comment in the public domain. You should not take our silence as agreement with what has been said about us, but as a reflection of our commitment to our contractual obligations.

  ETS is a non-profit organisation that administers more than 50 million exams to exceptionally high standards in the 180 countries in which we operate. We bid for this contract because helping pupils learn and teachers teach is our mission, not profit, and we took it on because we believed our expertise could improve educational measurement in England. We invented large-scale standardised assessment 60 years ago, we pioneered computer-based testing, we originated online marking and created the largest Internet-based testing network in the world, and in all that time we had never asked for early release from a standardised achievement contract.

  We worked closely with the NAA throughout the project and, whilst we have not achieved everything we should have, together we have made real progress in the quality of the marking and the detailed database of results provided to schools and students.

QUALITY OF MARKING

  We are aware that the Committee is particularly interested in quality, and we would like to dispel any doubts about this year's marking. We can confirm that the quality of markers in 2008 was in fact even higher than in previous years. We introduced a more rigorous method of certifying markers to ensure adherence to the marking scheme and constantly monitored quality during marking. Early on we eliminated hundreds of markers who could not meet the required NAA standards.

  The training that we offered was delivered by the same senior markers as in previous years. We also used the same pool of teachers and retired teachers as in previous years. We used the same criteria for screening new applicants as in previous years, and in addition all markers were recruited according to NAA guidelines. The ongoing marker reviews are being managed by the NAA and are being carried out by the same markers we recruited to mark the tests originally.

SHARED RESPONSIBILITY WITH THE NAA

  There is no question that ETS experienced operational and technical difficulties that hindered our ability to deliver test results on time, and we have not shied away from taking responsibility for these. For example, some school allocations were split, which meant that a school was given to two separate markers; in these cases one of the markers would be unable to view their allocation online. There were also instances in which scripts were wrongly allocated, so that one marker could view a school online for which they did not hold the scripts, while another marker received scripts they could not view online and was therefore unable to enter marks.

  At the same time, the NAA also shares significant responsibility for the delivery failure. Through a combination of the NAA making changes to the contract against our advice, delaying critical decision-making and layering on additional responsibilities, we ended up with a much more complex and challenging task. I therefore cannot point to just one or two things that caused the marking not to be completed on schedule; rather, the cumulative interaction of ETS and NAA issues created a compounding effect.

  For example, the solution we presented in the bidding process, and the contract we signed, called for training about 5,000 experienced markers online instead of face-to-face. The online training was one of the innovations we were led to believe was pivotal in our being selected. In March 2008, just two months before marking was to begin, the NAA mandated that we should revert to face-to-face training for all 10,000 markers, requiring us to find meeting venues, print and ship materials to those venues, and co-ordinate marker invitations, travel schedules and costly overnight accommodation, all at the last moment. Not only did this directly affect later delivery milestones, it also prevented markers from going online and accessing training materials early. This caused frustration for markers because they had to learn how to use the entire system in a shortened period.

  Additionally, we were not supplied with critical information on operational failures experienced by previous suppliers that could have informed our decisions. For example:

    —  We were not made aware that recruitment of Key Stage 3 English markers had been a long-standing problem and that we would face difficulties identifying markers.

    —  When the NAA set milestones, it indicated that the previous supplier had met similar deadlines when in fact this was not the case. For example, we were told that 100% of results had been returned on time when, as this Committee well knows, suppliers had historically had problems achieving this.

  Our view is that the NAA's changes to the agreed programme, its long delays in reaching decisions and its layering on of additional tasks combined with ETS's operational and technical issues and compounded one another; we believe this lies at the heart of the delivery failure.

CONCLUSION

  As I stated before, we accept responsibility for all of ETS's operational and technical issues that affected the experience of markers and the return of results to schools. Notwithstanding the challenges presented by the NAA, I honestly believe we introduced more quality-improvement measures into the assessment than ever before, with the result that students, parents and schools received good-quality scores. Once more I reiterate my apology, and I welcome this opportunity to answer your questions.

Dr Philip S Tabbiner

Director and Chairman of the Supervisory Board

September 2008





 