Select Committee on Children, Schools and Families Third Report


187. In this Chapter, we look at three areas of reform which have featured as important in this inquiry: the pilot study of single-level tests; the new Diploma qualification; and the proposals for a new regulator and test development agency. This should not be taken as a full review of each area of reform, but rather as an indication of some of the testing-related issues which we believe are likely to become important as these initiatives are developed further.

Single-level tests: the Making Good Progress pilot

188. The Government introduced pilot tests, resulting from the Making Good Progress consultation document, in around 500 schools from September 2007. Although the pilot study is not due to finish until Summer 2009, the Government appears to have thrown its weight behind the scheme in the Children's Plan, which states that:

189. The new tests are known as 'single-level tests'. In contrast to the current Key Stage tests, which test students only at the end of a Key Stage and across a range of levels, resulting in a level being "awarded" as a grade, the single-level tests assess the pupil at a set, 'single' level, which the candidate either achieves or does not achieve. Teachers at Key Stage 2 and 3 in English and mathematics will be able to enter a pupil for an externally written and marked test as soon as, in the teacher's assessment, the pupil is ready to be tested. Science will continue to be tested with Key Stage tests while the Government explores "new options" for the assessment of science.[340] The principle of 'testing when ready' means that the tests will be available twice a year.

190. A new performance target for schools will be set, with children expected to make two levels of progress between each Key Stage. One-to-one tuition will be offered to pupils making "slow progress".[341] The single-level tests are intended to be confirmatory of a teacher's assessment of the level of attainment of a pupil. Professor Dylan Wiliam argues that this proposition is disingenuous: "If the teacher's judgment is that a pupil has reached a level, but the test indicates that they have not, there is no process of reconciliation between these two. It is the teacher who is wrong, and the test that is right. The role of the teacher's assessment is therefore limited to deciding when the student should take the test […]".[342]

191. Several concerns with the pilot tests have been raised in evidence to this inquiry. The availability of tests twice each year and the necessity that a child sit a separate test for each National Curriculum level means that a child potentially faces a much larger number of national tests and certainly more frequent testing than under the current end of Key Stage regime. Some witnesses have highlighted possible curriculum disruption and logistical difficulties with single-level tests if testing is to take place over a longer period of time. There may also be additional costs involved.[343]

192. Other witnesses have criticised single-level tests on the basis that the proposals pay no heed to the effects of measurement error. Candidates can keep re-sitting each test until they are successful and can then move on to the next level. As borderline students finally succeed, standards will appear to have risen over time, yet the rise will be attributable only to a statistical artefact.[344] Once a level has been achieved, the pupil is deemed to be at that level in perpetuity. This, it is argued, is not defensible, since it is possible for children to fall back in subjects if they are neglected.[345] There is suspicion that this 'one-way ratchet' mechanism will entrench teaching to the test and narrowing of the taught curriculum as schools will be held accountable on the basis of the number of levels of progress which a child makes across a Key Stage. Professor Dylan Wiliam argues that teaching to the test under the single-level test regime will "permeate the entire key stage, rather than the final year as it does now".[346]
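The mechanism of this statistical artefact can be sketched in a short simulation. This is purely illustrative and not drawn from the evidence: the noise level, number of sittings and pass threshold are arbitrary assumptions. Each sitting is modelled as a noisy measurement of a pupil's unchanged true attainment, and the one-way ratchet records the pupil at the level after a single score above the threshold.

```python
import random

random.seed(1)

def simulate(n_pupils=10_000, n_sittings=6, noise_sd=0.5):
    """Compare the proportion of pupils truly at a level with the
    proportion recorded at it under a one-way ratchet, where one
    lucky score on a noisy test fixes the level in perpetuity."""
    recorded = 0
    truly_at_level = 0
    for _ in range(n_pupils):
        ability = random.gauss(0.0, 1.0)        # true, unchanging attainment
        if ability >= 0.0:                      # threshold for the level
            truly_at_level += 1
        for _ in range(n_sittings):
            score = ability + random.gauss(0.0, noise_sd)  # measurement error
            if score >= 0.0:
                recorded += 1                   # level achieved...
                break                           # ...and never revisited
    return truly_at_level / n_pupils, recorded / n_pupils

true_rate, recorded_rate = simulate()
print(f"truly at level:    {true_rate:.1%}")
print(f"recorded at level: {recorded_rate:.1%}")
```

Because each sitting gives a below-level pupil a fresh chance of a lucky score, while a pass is never re-examined, the recorded pass rate exceeds the proportion of pupils genuinely at the level, and the gap widens as more sittings are allowed: an apparent rise in standards with no change in underlying attainment.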

193. The National Foundation for Educational Research does not believe that the single-level test model will support teaching in any direct way and further states that:

    The achievement of a level and the knowledge that it cannot be removed may act to demotivate rather than motivate. We would advise that the 'one way ratchet' is abandoned and that the system allows for re-testing of doubtful cases so that high levels of certainty are achieved and so that misclassification is minimised.[347]

194. There is some concern about the conduct of the pilot, not least because the schools taking part are not released from their Key Stage testing obligations, which they must continue to run in parallel.[348] Some teachers have complained about the burden of the pilot tests and a proportion of schools have already dropped out.[349] The pilot study caused further controversy when it was reported that the Government was delaying notification of the results of the first round of single-level tests. David Bell later stated in evidence that:

    we would not and should not be surprised that, when you pilot a new form of testing, you might need to see what actually happened. We are doing some further work, and we have asked the National Assessment Agency to do some further work. We are not ready yet to come back with the results of that analysis or to say what has happened.[350]

195. After the results had been released to schools, the Minister explained that there had been some "unexpected patterns" and "unusual outcomes" in the results.[351] The Minister expressed surprise at the number of candidates who had been entered at the wrong level.[352] He explained further that:

    the most significant unusual outcome was variations between Key Stage 2 and Key Stage 3 pupils taking the same test. So, let us say that they were taking a level 4 writing test. The Key Stage 2 students were doing significantly better when they were taking exactly the same test as the Key Stage 3 students. Now, that was a bit odd.[353]

196. The NAA's subsequent investigation found a number of factors which were likely to have combined to produce these unusual outcomes:

  • inappropriate entry of pupils not secure in the level for which their teachers had assessed them;
  • a style of test unfamiliar to both pupils and markers, with questions pitched at a single level, rather than a range of three levels;
  • less motivated pupils; research suggests that pupil motivation for new tests taken in a pilot may be lower than for National Curriculum statutory tests, and that this factor may be more marked for pupils in Key Stage 3 than in Key Stage 2;
  • a number of pupils not completing test papers, particularly on higher level papers; lack of familiarity with this type of test may have contributed to this;
  • markers unaccustomed to marking scripts at a single level from pupils in two key stages.

197. The Minister wrote to explain that:

    It is important to recognise that NAA developed these tests on a much shorter timescale than is typical for test development, and that this did not allow for the usual pre-testing that would take place.[354]

However, giving oral evidence, he reminded us that:

    We should bear in mind that it took four years for the SATs to be piloted.[355]

We have not received any evidence to indicate why the Government might be in such a hurry to roll out these single-level tests. When so much is at stake, we consider this haste inappropriate at best.

198. Our predecessors warned the Government about bringing in new tests with undue haste. We recommend that the Government allows sufficient time for a full pilot of the new single-level tests and ensures that any issues and problems arising out of that pilot are fully addressed before any formal roll-out of the new regime to schools.


The Government consultation Making Good Progress emphasises more informal teacher assessment and personalisation in teaching and the Children's Plan has now reinforced this approach.[356] Teachers' assessment skills are, therefore, ever more important with the increasing focus on personalised learning and monitoring of pupil progress. The Government has made a strong commitment to the techniques embodied in Assessment for Learning ("AfL")[357] and has committed resources for professional development of teachers in AfL techniques.[358] The Children's Plan states that "Our new approach in schools—which looks at progression across stages—means we will focus on every pupil, in every year group, not just those at the end of key stages and in the middle of the ability range."[359] The Government considers that greater personalised learning under the "new approach" will "help to identify and prioritise those pupils who are in danger of stalling in their learning at the start of secondary school".[360]

199. The QCA explains that AfL involves using assessment in the classroom to raise pupils' achievement. It is underpinned by the proposition that pupils will improve most if they understand the aim of their learning, where they are in relation to this aim and how they can achieve the aim or reduce the gap. Effective AfL is already used in classrooms by some teachers. The key characteristics of AfL are:

  • teachers using effective questioning techniques;
  • teachers using marking and feedback strategies;
  • teachers and pupils sharing learning goals; and
  • peer and self-assessment by pupils.

These characteristics of AfL are embodied in a number of processes:

  • sharing learning goals with pupils;
  • helping pupils know and recognise the standards to aim for;
  • providing feedback that helps pupils to identify how to improve;
  • believing that every pupil can improve in comparison with previous achievements;
  • both the teacher and pupils reviewing and reflecting on pupils' performance and progress;
  • pupils learning self-assessment techniques to discover areas they need to improve;
  • recognising that both motivation and self-esteem, crucial for effective learning and progress, can be increased by effective assessment techniques.[361]

The QCA states that research has shown that participation in the review process raises standards and empowers pupils to take action to improve their performance. AfL is a type of formative assessment and is different from assessment of learning, otherwise known as summative assessment, which involves judging pupils' performance against national standards (level descriptions) as is the case with Key Stage tests. However, according to the QCA, the formative use of summative data remains an important aspect of AfL.[362]

200. The NAHT states that AfL is vital and goes far beyond snapshot, national tests. Many schools have developed "sophisticated pupil tracking systems" using these methods.[363] The GTC agrees, but admits that "there remains considerable diversity in school approaches to AfL".[364] According to the ASCL:

    The use of assessment for learning has improved the quality and extent of formative assessment, encouraging students to think more about their own learning and helping teachers to mould their teaching style more effectively to the needs of the students.[365]

201. The ASCL finds the "personalised classroom" an attractive prospect, but argues that it can only be a reality if the teacher has access to necessary and reliable data. The current regime of Key Stage tests, they argue, does not provide such data because of their high-stakes nature and the "weakness" of the tests themselves. Other, diagnostic testing mechanisms are needed.[366]

202. The ASCL and others favour a move towards a stronger element of teacher assessment in the classroom, with teachers able to draw on a national bank of tests developed under the auspices of the QCA and the administration of which would be monitored by chartered assessors.[367] However, the ATL argues that there is a need to address perceptions of bias in teacher assessment.[368] The NASUWT cautions against approaches to formative assessment which are overly bureaucratic and burdensome for teachers, particularly in terms of the need to demonstrate effective performance to auditors.[369]

203. Perhaps one way of making sense of the rather diverse evidence we have received on this subject is to adopt the matrix analysis put forward by the NFER. It contrasted the formative/diagnostic and summative dimensions of assessment with informal and formal processes of assessment and presented the information in Table 3:

Table 3: The four quadrants of educational assessment

Source: NFER

204. According to the NFER, AfL is a means of informal formative assessment which can have extremely positive results but may make "formidable demands on teachers in terms of their professional knowledge and skill".[370] The NFER strongly supported the principles of AfL and argued that better support materials should be provided to teachers in order to encourage the spread of formative assessment in classrooms. It cautioned, however, that there are limitations to AfL. More research was needed in order to understand the precise aspects of AfL which lead to greater gains in pupils' knowledge and understanding. In addition, AfL data is not useful for gaining an overview of the overall level of attainment or of the curriculum as a whole, since its focus is on what has just been learned and what is about to be learned. The use of teacher assessment, self-assessment by pupils and peer-assessment by classmates gives rise to problems with reliability and bias in the data. Collating these data systematically, so as to allow for reliable and comparable overall judgements, presents additional problems and may be time-consuming. For these reasons, NFER considered that AfL neither could nor should provide summative information: this function should be served by a separate assessment system.[371]

205. Formal formative assessment is increasingly a focus of the Key Stage testing system in the sense that test results are systematically analysed (RAISEonline would be an example) in order to generate information for teaching and learning. There are problems with using such information for formative purposes, however. For example, formative information is most useful at the beginning of a programme of study, yet the Key Stage tests are, by definition, at the end of the Key Stage. The NFER suggested that e-assessment may provide a useful, unobtrusive method for formal formative assessment and is currently researching in this area.[372]

206. Informal summative assessment is already an integral part of the testing system in that the Key Stage testing system requires teacher assessment judgements alongside test results. However, the status of these teacher assessments has tended to be lower than that accorded to the test results themselves. The NFER noted that the balance has started to change as the themes of AfL have been integrated into policy and there is renewed interest in systematic, informal summative assessment, exemplified by the QCA's Assessing Pupils' Progress (secondary schools) and Monitoring Children's Progress (primary schools) initiatives. In Wales, the balance has changed more radically, Key Stage tests having been replaced by teacher assessment.

207. The NFER noted that, in order to be used summatively, teacher assessment information must be tied to the standards embodied in the National Curriculum level descriptions. However, these descriptions are broad and include imprecise judgemental terms. A consensus within the teaching profession on their meaning and application is necessary if summative teacher assessments are to be meaningful. This would, in turn, involve an extensive moderation process which would be "professionally valuable but costly and extremely time-consuming".[373] This is part of an ongoing debate about the potential for teacher assessment to replace test results as the main source of formal summative assessment. The NFER pointed out that, on the one hand, the scope of assessment and teacher involvement would be enhanced; on the other, there were serious questions about manageability and reliability which would need to be addressed. The NFER set out three conditions for its successful introduction: major investment in professional development in relation to the criteria for assessment; professional development to enhance understanding of the nature and purposes of the four quadrants of assessment as set out in Table 3; and a system of external monitoring and accountability to assure public and professional confidence.[374]

208. Finally, formal summative assessment is exemplified by Key Stage tests; the single-level tests are a new evolution in this quadrant of the NFER model. Formal summative assessment may serve many different purposes, as was discussed in Chapter 2, and we have argued that the current Key Stage tests are a compromise which attempts to meet a wide variety of these purposes, including assessing pupil attainment, school accountability and national monitoring. The NFER expressed the view that the Key Stage tests adequately serve the first two of these purposes, but serve less well the purpose of national monitoring.

209. The NFER stated that introduction of single-level tests, if they are to replace the Key Stage tests, should be accompanied by a statement of which purposes they are expected to meet, which they are not, and the extent to which they meet the requirements for validity, reliability and manageability for each of the intended purposes.[375] It is at this point that the utility of the NFER's model becomes apparent. It considers that all four of the quadrants of the model are essential to the effective education of children. Each has distinctive features and requirements, yet all are related and education professionals and policy-makers alike should accord appropriate attention to each.[376] Part of the problem with the single-level tests appears to be a confounding of the features and requirements of the four quadrants of the model. The NFER supported testing when ready and closer ties between tests, teaching and learning and considered that such notions are consistent with personalised learning and AfL. However, it doubted that the single-level tests as described in Making Good Progress would promote personalised learning and AfL, as claimed by the Government. The single-level tests would give an indication of pupil attainment, but would not simultaneously provide diagnostic information to indicate appropriate next steps in learning. The tests are likely to be too far apart to be useful in identifying the detail necessary for personalised learning: a level represents, on average, two years of teaching. This means, according to the NFER, that the single-level tests are unlikely to support teaching in any direct way. Looked at from another direction, for the reasons set out in paragraph 204, AfL techniques are unsuitable for use in generating summative data, so the Government's claims in Making Good Progress that single-level tests will serve both summative and formative/diagnostic purposes would appear to be wide of the mark.

210. Making Good Progress characterises single-level tests as integral to personalised learning and Assessment for Learning yet also the means by which to generate summative data. We agree with the National Foundation for Educational Research that this single assessment instrument cannot validly perform these functions simultaneously and, if it is attempted, there is a danger that the single-level tests will work for neither purpose. The single-level tests may be useful, however, if their purpose is carefully defined and the tests are developed to ensure they are valid and reliable specifically for those purposes.[377]

211. We recommend that, if single-level tests are introduced, they are used for summative purposes only and that Assessment for Learning and personalised learning are supported separately by enhanced professional development for teachers, backed up with a centralised bank of formative and diagnostic assessment materials on which teachers can draw as necessary on a regular basis.


212. It is a feature of Making Good Progress, reiterated in the Children's Plan, that national testing will continue to be used for the purpose of school accountability. In the context of single-level tests, the proposed target is that a child should move up two levels between Key Stages. These will be known as 'progression targets' and the results will be published in performance tables.[378] These proposals are unpopular with the many organisations who have submitted evidence taking issue with the current targets and tables. Although the progression target is that pupils will progress two levels between Key Stages, the Key Stages are not of equal length, and the progression target has been criticised as arbitrary and unfair on this basis.[379] In addition, witnesses have objected that progression targets assume that all pupils should ideally progress at the same rate, which is considered by many as a false assumption.[380]

213. Witnesses have pointed to a contradiction between the concept of personalised learning, which recognises differences in the abilities and needs of children, and systemic targets, which assume that children should ideally develop at the same rate, that is, two levels across each Key Stage.[381] The NUT warned that, in its view, the proposed progression targets linked to the single-level tests will ultimately perpetuate the current practice of diverting "resources towards children on the borderline of national target levels".[382] The NASUWT maintains that many of the problems associated with testing relate to the high-stakes environment in which it takes place and argues that associating single-level tests with accountability measures based on progression of pupils leads to a "significant danger that such an approach would result only in the replacement of one high-stakes assessment system with another".[383] The ATL similarly believes that its vision for reform, which includes the use of AfL, personalised learning and teacher assessment, cannot exist alongside league tables "which already have a pernicious effect on the current national testing system."[384]

214. The NAHT sees single-level tests as providing data of significant value to a school. However, if those data are used in the same way as they are currently for school accountability, there is no reason to assume that the new data set derived from single-level tests "would be any more accurate or less damaging than the current data set" if it is to be used in the same way and in isolation from other measures of performance.[385] Given the emphasis on personalised learning, according to the NAHT, data from single-level tests will not support comparisons of performance between different schools.[386]

215. Single-level tests may have some positive effects and we certainly approve of the Government's new emphasis on the personalised approach. However, the Government has structured the single-level testing system in such a way as to risk a transposition of existing, systemic problems into the new arrangements. Without structural modification, we foresee that the existing problems—including teaching to the test, narrowing of the taught curriculum and the focus on borderline candidates to the detriment of others—will continue under the single-level test regime.

216. We believe that true personalised learning is incompatible with a high-stakes single-level test which focuses on academic learning and does not assess a range of other skills which children might possess. Children who struggle with the core subjects may receive more targeted assistance in those subjects. However, if this means that children who are struggling with core subjects get less opportunity to access the wider curriculum, they risk being put off learning at an early age. We call upon the Government to invest in ways to help and, if necessary, train teachers to improve the basic skills of struggling pupils while enhancing their enjoyment of learning and guaranteeing their access to a broad curriculum.

217. We are concerned about the "one-way ratchet" on the attainment of test levels under the single-level testing regime and we find persuasive the evidence that this may lead to an apparent, but artificial, improvement in performance standards. We recommend that the Government consider further whether it is in children's best interests that they may be certified to have achieved a level of knowledge and understanding which they do not, in truth, possess. We suspect that this may lead to further disillusionment and children perceiving themselves as 'failures'.

218. We recommend that the Government urgently rethinks its decision to use progression targets, based on pupils' achievement in single-level tests, for the purposes of school accountability. If such high-stakes accountability measures are combined with more frequent testing of children, the negative effect on children's education experiences promises to be greater than it is at present. We urge the Government to listen to the QCA, which has already warned of the dangers of saddling the single-level tests with the same range of purposes which the Key Stage tests demonstrably cannot bear.


219. The Diploma is a new, employer-designed 14-19 qualification intended to give the student a rounded education combining theoretical and practical learning. It brings together essential skills and knowledge, practical experience and employer-based learning with functional English, mathematics and ICT and the opportunity to develop a specialism or complementary study. The first five Diplomas—Construction and the Built Environment; Creative and Media; Society, Health and Development; Information Technology; and Engineering—are available in 2008 in selected areas. Others will be added in future years, with three new Diplomas in Science, Humanities and Languages starting in 2011, leading up to an entitlement to 17 Diplomas for 16-18 year-old students from 2013.[387] The three principal components, together with their characteristics, are set out in Figure 2.

Figure 2: principal components and characteristics of Diplomas

Source: QCA, 14-19 education and skills: what is a Diploma?[388]

220. The QCA has also produced a diagram setting out how the 14-19 qualifications fit together. This is reproduced at Figure 3.

Figure 3: 11-19 progression routes

Source: QCA, 14-19 education and skills: what is a Diploma?[389]

221. The introduction of the Diploma has taken some time and the shape of the qualification has been through several evolutions. The report of the Working Group on 14-19 Reform, established in February 2003 and chaired by Mike Tomlinson, proposed that all education and training for 14-19 year-olds should be brought together in a common format of learning programmes together with a unified system of certifying achievement in those programmes.[390] Our predecessors expressed disappointment that the Government decided not to implement in full the proposals of the Working Group and create a unified, overarching Diploma to replace the current qualifications system.[391] Instead, the Diploma emerged as a further qualification alongside existing qualifications. However, as the framework has continued to evolve, we have learned that Advanced Diplomas will be the equivalent of three and a half A-levels and that the Government intends to "bring the best of existing qualifications within the Diploma framework".[392] With a full Government review of Diplomas, GCSEs, A-levels and other general qualifications announced for 2013, we are beginning to suspect that the wheel may have turned full circle and that the Government intends to adopt the Tomlinson proposals after all.

222. Greg Watson of OCR labelled the Diploma "the most complicated qualification that I have ever seen" and emphasised the urgent need to recognise its complexity and address the practical logistics of how it will be taught.[393] As well as studying for a wide variety of different components of the course, students will have the opportunity to work in different schools, colleges and work places. Indeed the QCA states that no one school or college will be able to teach the entire range of available Diplomas.[394] Schools and colleges will be required to work in collaboration with each other, and alongside work-based learning providers, and some witnesses have expressed concern that the current accountability regime, which puts schools in direct competition with each other, is incompatible with this aim.[395] The GTC, for example, considers that the current accountability structures do not sit easily with a cross-institutional, collaborative approach amongst schools and colleges and views the introduction of Diplomas as an opportunity to move away from:

    […] an assessment system dominated by the purposes of quality control and accountability and assessment of learning towards a more balanced model with a greater element of diagnostic and formative assessment for learning.[396]

223. Witnesses have generally welcomed the Diploma.[397] Jerry Jarvis of Edexcel thought that the Diploma's broad curriculum might increase the breadth of experience and learning of those entering higher education.[398] Some have highlighted the positive benefits of a balance between internal and external assessment.[399] City and Guilds, however, pointed to a tension between the general and vocational themes of the Diploma which, in its view, will be played out in the chosen methods of assessment. The vocational theme would indicate an emphasis on performance evidence, whereas the general theme would indicate an emphasis on knowledge-based evidence. City and Guilds considers it too early to say in what form the Diploma will finally emerge.[400] The ASCL is also sceptical, stating that:

    Experience of previous attempts to introduce quasi-vocational qualifications, for example GNVQ, lead ASCL members to be concerned that the assessment of the diplomas may be too much like those of GCSE and A-level. Effective vocationally-oriented courses cannot be assessed in the same way as academic courses.

224. However, the DfES memorandum referred to the need for "innovative forms of assessment to reflect the blend of practical and theoretical learning" and stated that assessment would combine locally determined and standardised external assessment that would provide both formative and summative data on students' progress (and, crucially, the performance of educational institutions).[401] In addition, the DCSF confirmed that Diplomas will combine "internal controlled assessment" with a practical focus, with theory-focused external assessment.[402]

225. We welcome the Government's stated intentions that both the vocational and the general elements of Diplomas should be reflected in the methods of assessment used. We caution the Government against any haste in shifting this delicate balance in future until the full implications of such a shift have been understood.

226. The Chartered Institute of Educational Assessors and others warned of the vulnerability of a new qualification. Steps must be taken to ensure that the Diploma is wanted by students, parents and users of qualifications, such as employers and higher education.[403] Jerry Jarvis of Edexcel has warned that "if the diploma doesn't earn its spurs as a qualification, and that means respect from employers, pupils, parents and higher education, we face a serious problem. There is a huge educational risk to this country."[404] We are concerned that in a recent survey conducted by ACS International Schools, fewer than 4 in 10 university admissions officers saw the Diploma as a "good alternative to A-Levels".[405] In fact, the CIEA finds "evidence of a real intention to make the new qualification work and of cooperation across educationalists, employers and awarding bodies".[406] Jerry Jarvis told us that Edexcel and the other Awarding Bodies were certainly extremely keen to make the Diploma work because of the "huge investment" that they have made in it.[407] Greg Watson of OCR added that there would inevitably be issues which needed to be addressed over time, but the fate of Diplomas was likely to be decided by the way they are taught in the first few years. For that reason, it was imperative that teachers received all the support that Government and the Awarding Bodies could give them.[408]

227. We are concerned that the results of a recent NUT survey showed that, within the schools introducing the Diploma this year, the majority of staff are still unfamiliar with the qualification.[409] Jerry Jarvis of Edexcel has expressed concern that teachers will receive only three days' training before the roll-out in September.[410] The NAHT has also expressed concern about the lack of training provided to teachers.[411]

228. Whilst welcoming the Diploma, the NAHT remains sceptical as to whether the opportunity for a radical and imaginative approach to assessment will actually be taken.[412] However, the NAHT considers that:

    If anything will assist the reintegration of some of the NEETs (young people not in education, employment or training) it will be the further, suitable development of modular, component assessment within the new vocational diplomas.[413]

According to OCR, a suspicion of alternative qualifications (ie other than A-level) which assess the practical application of skills may reflect the belief, in turn reflected in Government policy, that the only rigorous way to assess achievement is through formal, written examination.[414] OCR states that, in its experience, new qualifications take at least ten years to become accepted and take root.[415] On this basis, there is plenty of time for Diplomas to be altered radically from their current format. OCR has already noted that:

    In seeking parity with GCSE and GCE, the main parts of the Diplomas have increasingly adopted models which mirror the models for GCSE/GCE laid out in the regulatory codes of practice. The grading structures have also been adopted to mirror GCSE/GCE scales.[416]

However, Jon Coles, Director of 14-19 Reform at the DCSF, reassured us that Diplomas were intended to introduce a broader range of assessment methods to test the broader range of skills which had been called for by universities and employers.[417] Furthermore, the Minister told us that "The fundamental design of the Diplomas will not change". What he terms 'generic learning'—"functional skills, personal learning and thinking skills"—will remain part of the Diploma curriculum. He considers that Diplomas have had a fair lead-in time: "It is not the full OCR 10 years, but it is fair."[418]


229. The extended project forms part of the core content of a Diploma and may be used as a free-standing qualification which can be taken alongside A-levels, in which case it is expected that an extended project would be taken instead of, not in addition to, a fourth or fifth AS Level. The project will be in an area of the student's choice, to be approved by the relevant Awarding Body, and will test skills such as independent research, study and planning.[419]

230. Some witnesses have remarked on the apparent paradox that, as coursework is being scaled back considerably for GCSE and A-level, the extended project is being introduced for Diplomas (and also for some A-level courses).[420] The DCSF addressed this point in a supplementary memorandum, stating that the theoretical focus of GCSEs and A-levels makes external assessment more appropriate than coursework. Diplomas, on the other hand, focus more on practical learning, making coursework more appropriate. The Department has recharacterised coursework as "internal controlled assessment" for GCSEs, A-levels and Diplomas.

231. The NASUWT expressed concern that the extended project may prove to be overly burdensome for teachers, who must ensure that a student's learning is assessed validly, particularly in respect of reliability and comparability. Although it acknowledges extended projects as beneficial for students' learning, the NASUWT cautions that this benefit may be undermined by bureaucratic and work-intensive procedures for assessing the extended project.[421]

232. Although the reaction to the introduction of Diplomas has been one of cautious welcome, there are important caveats. The issue of accountability has arisen once again, and we consider that it is of the first importance that the Government addresses this issue once and for all. There is concern about the way the Diploma has been introduced. It is an innovative and profoundly complex qualification with serious logistical issues to be addressed, yet the programme of introduction has, according to witnesses, been too fast.[422] There is concern that teachers have had little say in how Diplomas have been developed. Professor Richard Pring, author of the Nuffield Review of 14-19 education and training, said, "We have got to return to a tradition in which teachers are much more actively involved in creating and thinking about the curriculum rather than—that awful word—'delivering' a curriculum created elsewhere."[423] There is concern about how transport will work for pupils who have to be moved between different schools. This is especially relevant in rural areas, where long journeys could potentially eat into learning time. We are also concerned about the practicalities of child protection checks on staff in industry who might be working with Diploma pupils. The NAHT has said that this may discourage businesses from wanting to participate.[424] What has the Government done to address these concerns?

233. Schools and colleges, who are required to work in collaboration with each other to provide a rounded education for Diploma students, cannot be expected to do so effectively when the accountability regime places them in direct competition with each other. We welcome the introduction of the Diploma and recognise the determination of all concerned to make it work, but we have some concerns about how it will work in a competitive environment.

234. Given its complexity, the Diploma must, in our view, be given an opportunity to settle into its operational phase without undue intervention from the Government. We consider that this is an area best left to the proposed new regulator, which we hope will approach Diplomas with a light touch and at a strategic level in the first few years, as the initial problems are ironed out over time.

235. The whole education sector would welcome greater clarity on the future direction of Diplomas. We urge the Government to make clear what its intentions are for the future of Diplomas and other 14-19 qualifications and whether it is, in fact, heading towards one, overarching framework for all 14-19 qualifications as Mike Tomlinson's Working Group on 14-19 Reform proposed in 2004.

The QCA, development and regulation

236. The Government has referred to a perceived conflict of interest inherent in the remit of the QCA. On the one hand, the QCA is responsible for monitoring and advising on the curriculum for children of school age; and for developing associated assessments, tests and examinations. On the other hand, the QCA is the regulator of qualifications offered in schools, colleges and workplaces in England. Others have noted this logical conflict, including the QCA itself, and some have cited instances of weak regulation; yet there is no serious suggestion that the QCA has acted improperly, especially since the QCA's development function was hived off to its subsidiary, the National Assessment Agency, in 2004.[425] Dr Ken Boston of the QCA told us that:

    Our private, but consistent, advice to Government has been that there is a perception that the regulatory decisions could be manipulated by Government, given the way in which we report to Ministers rather than to Parliament. [426]

    […] the Government have listened to and heard our concerns about the ambiguity present where there is a body that, among other things, is responsible for regulation and reports on the maintenance of assessment standards to a Government who are committed to driving up standards to meet particular targets.[427]

237. The Government's proposal to allocate the development and regulatory functions to two separate agencies is intended to enhance public confidence in standards in the education system.[428] The white paper Confidence in Standards: regulating and developing qualifications and assessment was published on 17 December 2007 and was jointly presented by the DCSF and the Department for Innovation, Universities and Skills. The white paper proposed two developments:

  • a new, independent regulator of tests and qualifications in England, known as the Office of the Qualifications and Examinations Regulator; and
  • a new agency to advise Ministers in the monitoring and development of curriculum, assessment and qualifications.


238. The independent regulator will report to Parliament through the Children, Schools and Families Committee. It will be the "guardian of standards across the assessment and qualifications system for children, young people and adult learners", although it will not regulate qualifications awarded by higher education institutions.[429]

239. The regulator will be responsible for the maintenance of standards over time, which we take to mean 'assessment standards' as defined in this Report. As part of this function, the regulator will be required to recognise Awarding Bodies, accredit public qualifications, and monitor and inspect Awarding Bodies.[430] It will also regulate National Curriculum tests and moderate assessment at Key Stage 1 and in the Early Years Foundation Stage.[431]

240. Finally, the regulator will oversee the qualifications market and ensure that it is delivering value for money. It will also investigate complaints and consider appeals.[432]


241. Under the Government's proposals, the QCA will develop into a new agency, responsible to Ministers, whose main objective will be to advise Ministers on the monitoring and development of curriculum, assessment and qualifications.

242. It is further proposed that the development agency, rather than the regulator, will develop the criteria for public qualifications, such as GCSE and A-levels, whereas the role of the regulator will be to scrutinise the agency's criteria. The Government intends that the agency will "support the communication of government aims for curriculum and qualifications".[433]

243. The work of the QCA to date has been praised by some witnesses.[434] The NAHT, for example, said:

    The integrity and skill of QCA officials is generally appreciated and respected by the education professionals.[435]

Others, particularly the Awarding Bodies, have been more critical, stating that regulation has been inconsistent, sometimes overly interventionist and prescriptive.[436] Referring to frequent changes to qualifications, Greg Watson of OCR said:

    I think that QCA, because of the position it has occupied very close to Government, has tended to find that its role in being a sponsor of change has far outweighed, over time, its responsibility for stability.[437]

244. Whilst the Government gives the QCA a clear remit for its work, the NAHT states that frustration can arise from the fact that, in its view, the QCA does not have "sufficient freedom in aspects of its work". The NAHT gives the example of the QCA offering "sound professional advice" which the Government has chosen not to follow. At other times, the Government has asked for further investigation to be undertaken when the QCA has recommended caution, for example in relation to the withdrawal of coursework from the GCSE curriculum.[438] The NAHT concludes:

    QCA is generally effective but there are potential dangers in that it is so strictly controlled by the DfES that all it is empowered to do is offer advice.[439]

245. Whether the independence of the new regulator will have an impact on the Government's propensity to take advice on regulatory matters remains to be seen. The Government is clear that, in its view, the regulatory functions of the QCA have always been carried out at arm's length from government and the QCA has confirmed this.[440] Clearly, the new development agency will stand in the shoes of the current QCA in terms of its relationship with Government, so that advice on the development side will be given on the same basis as before. There is, therefore, no obvious reason why Government should change its attitude towards advice on development and related matters. However, the new arrangements have broadly been welcomed by witnesses to this inquiry.[441]

246. A major rationale for the introduction of an independent regulator is the monitoring and maintenance of assessment standards over time.[442] Professor Peter Tymms told us that an independent body was essential for this task, a proposition with which Sir Michael Barber agreed.[443] Professor Tymms said that standards could not be monitored through the current national testing system due to frequent changes in the curriculum and that an independent body would need to use international standards, as well as the National Curriculum, to track change.[444] We asked Dr Boston whether there was likely to be anything different about the new regulator which would bring to a halt the drift in assessment standards which he seemed to accept had been a feature of the testing system. Dr Boston replied:

    No. The new body—the regulatory authority—will use codes of practice similar to those we have used in the past.[445]

247. OCR have expressed frustration at the annual debate on "standards" which takes place, in their view, at the low level of this year's papers, the marking of a given paper or the percentage of children awarded a given grade. OCR considers that the debate is taking place at the wrong level and that the focus should really be on the way in which assessment standards are affected by systemic change.

    The potential for standards to move and for public confidence to be shaken is greatest when there is wholesale, system-wide change or major structural changes to long-established qualifications. The acid test for looking at the move to an independent regulator is whether we will have a body that is sufficiently able to look at the macro-level changes and the effect that they may have on standards and public confidence and worry much less about the detail of which individual qualification is which.[446]

248. Although there is greater logical consistency in the separation of test development and regulation, this alone is unlikely to address the annual outcry about grade inflation in GCSEs and A-levels. We discussed this with Dr Boston, who thought the new arrangements might help, but admitted that they were unlikely to resolve the issue:

    […] if we consider one of the causes of the August debate to be that the separation of the regulator from Government is not perfectly clear, then that August debate might be diminished if the separation were made more apparent. Of course, there may be other issues in the August debate that are not resolved by that situation.[447]

    […] while the basis for [the August debate] might be diminished I am not sure that it is going to go away.[448]

249. We welcome the creation of a development agency and separate, independent regulator on the logical grounds that it is right that development and regulation should be the responsibility of two separate organisations. That assessment standards will now be overseen by a regulator demonstrably free from government control and responsible to Parliament through the Children, Schools and Families Committee is a positive step.

250. However, the Government has failed to address the issue of the standards themselves. In the context of the current testing system, with its ever-changing curriculum and endless test reforms, no regulator, however independent, can assure assessment standards as they are not capable of accurate measurement using the data available. Until the Government allows for standardised sample testing for monitoring purposes, the regulator will be left without the tools required to fulfil its primary function.

339  Children's Plan, Cm 7280, para 3.68
340  Children's Plan, Cm 7280, para 3.68
341  Children's Plan, Cm 7280, para 3.61
342  Making good progress: incorrect diagnosis and weak treatment?, Prof Dylan Wiliam, p 3
343  Ev 70-71; Ev 50-54; Ev 246; Ev 212-214
344  Ev 204
345  Ev 252
346  Making good progress: incorrect diagnosis and weak treatment?, Prof Dylan Wiliam, p 4
347  Ev 252
348  TES, "Schools back off test trial", 14 December 2007, p 1
349  TES, "Schools back off test trial", 14 December 2007, p 1
350  Q311
351  Ev 176-177
352  Q373
353  Q380
354  Ev 177
355  Q380
356  DfES consultative document, 8 January 2007; Children's Plan, p 67
357  Making Good Progress, p 11
358  Children's Plan, para 3.65; Ev 78
359  Children's Plan, para 3.54
360  Children's Plan, para 3.79
361  Source:
362  Source:
363  Ev 68-69
364  Ev 78
365  Ev 48
366  Ev 51
367  Ev 266; Ev 49
368  Ev 55-56
369  Ev 248
370  Ev 254
371  Ev 254
372  Ev 255
373  Ev 255
374  Ev 255-256
375  Ev 261
376  Ev 258
377  Ev 261
378  Children's Plan, para 3.67
379  Ev 52-53; Ev 248; written evidence from the Advisory Committee on Mathematics Education
380  Ev 247-248
381  Ev 51
382  Ev 264
383  Ev 246-247
384  Ev 56
385  Ev 70-71
386  Ev 70
387  Ev 165; DCSF, Promoting achievement, valuing success: a strategy for 14-19 qualifications, Cm 7354, March 2008
388
389
390  14-19 Curriculum and Qualifications Reform: The final report of the Working Group on 14-19 Reform, October 2004
391  House of Commons Education and Skills Committee, 14-19 Diplomas, Fifth Report of Session 2006-07, HC 246, paras 13-15
392  DCSF, Promoting achievement, valuing success: a strategy for 14-19 qualifications, Cm 7354, March 2008, p 21
393  Q197
394  QCA, 14-19 education and skills: what is a Diploma?
395  Ev 198; Ev 201
396  Ev 76
397  Ev 72-73; Ev 143; Ev 230; Q265; Q275; Q286
398  Q195
399  Ev 73; Ev 222; Ev 224
400  Ev 112; Q198
401  Ev 162
402  Ev 165
403  Ev 230; see also Jerry Jarvis, Q196
404
405
406  Ev 230; see also Prof Steve Smith, Q276
407  Q196; Q200
408  Q200
409
410  Ibid.
411
412  Ev 72
413  Ev 73
414  TA46, para 27
415  TA46, para 4
416  TA46, para 34
417  Q322
418  Q404
419  Ev 162; Q199
420  Ev 72; Ev 105
421  Ev 249
422  Q206
423
424
425  Ev 103; Q15; Q55; Q218; written evidence from The Royal Society, section 7
426  Q56
427  Q62
428  DfES & DIUS, Confidence in Standards: regulating and developing qualifications and assessment, Cm 7281, 2007; Ev 163
429  Ibid., para 3
430  Ibid., para 4
431  Ibid., para 6
432  Ibid., para 5
433  Ibid., p 6
434  Ev 110; Ev 119; Q13
435  Ev 67
436  Q219; Q220
437  Q219
438  Ev 67
439  Ev 67
440  Q55; Q62
441  Ev 103; Ev 146; Q13; Q55; Q173; Q180; Q220
442  DfES & DIUS, Confidence in Standards: regulating and developing qualifications and assessment, Cm 7281, 2007; Ev 163
443  Q16
444  Q15
445  Q68
446  Q247
447  Q62
448  Q71


© Parliamentary copyright 2008
Prepared 13 May 2008