APPENDIX 23
Memorandum from the Department for Education
and Skills
INTRODUCTION
1. This document responds to the Committee's
request for information on the processes adopted by DfES for using
research and evidence in policy making. Two policy issues were
selected for questioning by the Committee: the DfES announcement
on 20 March 2006 concerning the future role of phonics in early
reading; and the role of evidence in the development of Sure Start.
In both cases, the Committee asked how far the processes involved
were typical. The first part of this Memorandum therefore sets
out the Department's general framework for handling evidence.
The strategic context for handling evidence and
analysis within DfES
2. The Department approaches the use of
research and analysis in a strategic context. This reflects the
strategic role for the Department as a whole set out in the 2004
Five Year Strategy for Children and Learners. It involves a number
of complementary developments.
3. There is systematic identification of
the major challenges facing education and children's services,
and the analytical implications which they pose. These include
challenges set by Treasury as part of the framework for the 2007
Spending Review, such as globalisation, and overarching policy
drivers, such as the high priority given to social mobility. Analytical
efforts continue to address the Department's concern with efficiency
and effectiveness.
4. As a result, a range of strategic
issues has been identified for analysis. Some relate to individual
sectors (HE, Lifelong Learning, Children's services), while others
are system-wide. The challenges and strategic issues are being brought
together into a single framework to guide, prioritise and then
monitor the Department's programme of analysis and research. Examples
of issues which are being addressed include the longer term impacts
of early years provision, the most effective interventions to
narrow social class gaps in attainment, and the drivers for youth
engagement in positive or anti-social activities.
The prioritising and commissioning of evidence
5. Within the Department, much of the research,
data, modelling and analysis is handled by an analytical community
of social researchers, economists, statisticians and operational
researchers. Policy officials are responsible for determining
specific evaluation needs and for making use of evidence, and
for commissioning periodic independent reviews that embrace both
analytical and good practice evidence. The Department's Strategy
Unit is often involved in drawing together, interpreting and disseminating
evidence.
6. The analytical work undertaken to support
policy partly reflects the state of knowledge in any given field.
Where the knowledge base is weak, the priority is to generate
new insights to support policy intervention. This is where most
externally contracted research is focused. Across all areas, but
especially where the evidence base is richer, the goal is to marshal
evidence from a range of sources to provide a clear and coherent
picture.
7. Analytical work is conducted both internally,
which offers flexibility to respond to issues as need arises,
and externally. The external work involves a range of research
studies, reviews and statistical or econometric analysis. These
may be commissioned as self-standing projects or as part of programmes
of work at external centres (eg the Centre for the Economics of
Education; Centre for Research into the Wider Benefits of Learning).
8. Proposals for external projects are considered
by a Research Approvals Committee (RAC), comprising senior analysts
and policy officials, and chaired by the Department's Chief Scientific
Adviser (who is also the Chief Economist). RAC recommendations
go to the Secretary of State for consideration and approval for
funding, which may be from the Department's central research budget
(mainly for strategically-related research) or programme budgets
(mainly for policy evaluation). Quality assurance processes for
research studies are currently being reviewed. As part of this,
the practicalities of formal peer reviews of research, and post-project
assessment of policy impact, are being considered.
9. Independent policy and practice reviews
are commissioned periodically to bring together diverse evidence
and representations on a high-profile issue, and to make recommendations
for Government action. Commissions for such reviews are typically
developed at senior policy level, with an invitation coming directly
from Ministers. In some cases, a commission is given to a partner
organisation whose remit covers the issue, such as the 2005 HEFCE
review of risks to strategically important subjects. In other
situations, an individual or group of external figures is asked
to undertake a review. DfES typically provides staff support and
makes available existing evidence for the review.
10. The Rose Review was one of a number
of such reviews over the last two years, others being Foster (FE
sector roles and reform), Steer (behaviour in schools) and the
current Gilbert Review (personalisation of teaching and learning).
In each case, the issues required an exploration and drawing together
of a wide range of evidence and opinions, from practitioner experience
through stakeholder representation to inspection and academic
evidence. Decisions to commission such reviews draw on existing
analysis and evidence.
Evidence gathering and synthesis
11. The Department's analysis and research
programme gathers different types of evidence, through a number
of mechanisms. It includes:
longitudinal studies: DfES
fully funds the Longitudinal Study of Young People in England
and the Youth Cohort Study, and co-funds or supports the Millennium
Cohort Study and others;
research centres: seven are
currently funded; these develop programmes of research in related
areas, rather than simply conducting single studies or analyses.
Outputs may cover individual issues or topics (eg CEE on choice,
EPPI on thinking skills) or wider syntheses (eg WBL on a conceptual
framework for education's non-economic contribution);
contributions to international
studies (PIRLS, TIMSS, PISA);
policy or issue-specific evaluations:
some are medium-term programmes of work (Sure Start, Aimhigher,
Education Maintenance Allowances, Extended Schools), others are
single studies or analyses (eg the Formalised Peer Mentoring Pilot,
school efficiency measures);
reviews of existing evidence,
some adopting systematic review principles, others serving more
urgent and formative purposes.
12. The Department's work is closely linked
to the efforts of partners, and also to cross-Government developments.
On some issues, other organisations lead on evidence gathering,
with the Department drawing on their findings for policy: eg the
Food Standards Agency for nutrition in schools, and Becta (British
Educational Communications and Technology Agency) on e-learning.
There are also wider analytical developments, such as the Treasury-led
Atkinson Review of Government Output and Productivity, where DfES
analysts had a significant role.
The contribution of evidence to policy making
13. Evidence is incorporated into policy
formulation and implementation in a number of ways. There is direct
policy engagement with studies and analysis, whether as commissioners
of work or through participation in steering or review groups.
Emerging and final results feed into policy consideration on an
ongoing basis. Depending on the point reached in policy cycles,
the influence of evidence may be on policy development, on implementation
or in support of front-line practice and leadership.
14. Different types of analytical work can
make different contributions to policy. It is common for synthesis
of existing research, or exploratory analysis, to set a context
for policy or strategic thinking. The development of options for
14-19 policies, for example, was informed at various stages by
analysis of the motivations and decisions of young people and
the returns to different qualifications. Research and statistical
evidence on the poor progress made by many pupils on moving to
secondary school was an important prompt for the extension of
the Primary National Strategy into Key Stage 3.
15. Large scale evaluations typically relate
to the implementation of policies or options for delivery. The
amounts and payment mechanisms for Education Maintenance Allowances,
for example, were piloted and subject to both quantitative and
qualitative evaluation. The eventual roll-out reflected the most
favourable option from the research. Econometric analysis of the
impact of Employer Training Pilots identified high deadweight
(that is, subsidised training that employers would have provided
anyway), which was countered in the national programme (Train to Gain).
Modelling and projections work supports funding decisions and
the setting of targets and priorities for policy intervention.
PSA targets on pupil absence, for example, were based on detailed
analysis of trends for schools in different deprivation bands,
as assessed by Free School Meals. And the modelling of alternative
options for Higher Education student support was influential in
shaping the final decision on repayment thresholds and rates.
THE PHONICS ISSUE AND THE ROSE REVIEW
16. Since the introduction of the National
Literacy Strategy, now part of the Primary National Strategy (PNS),
considerable improvements have been secured in the standards achieved
in reading and writing. Nevertheless, since its inception, the
Strategy has sought to draw upon new evidence and developments
in order to further strengthen and refine its approaches. As part
of the commitment to ensuring that the PNS continues to provide
the most effective support for teaching reading (and literacy
more widely), the Department announced in 2005 an intention to
renew the cornerstone literacy document, the Framework for
teaching.
Rationale for commissioning an independent review
of early reading
17. There is wide consensus over the centrality
of learning to read in children's education, but there has been
a longstanding debate about how best to teach early reading. Since
1998, the National Curriculum has required all primary schools
to teach phonics. It did not specify how phonics should be taught
but guidance was provided in the Framework for teaching.
18. The Department was clear that the renewal
of the Framework, plus plans to introduce the new Early Years
Foundation Stage[115]
(EYFS), needed to be informed by a comprehensive examination of
the available evidence. This would include formal research findings
but also wider evidence from discussions with key stakeholders,
first-hand observation of practice and the representations and
submissions of organisations engaged in the literacy field. Any
set of recommendations for new pedagogical approaches needed to
carry credibility across many different interest groups and stakeholders.
19. Policy officials and advisers defined
the Department's requirement as credible and practice-grounded
recommendations which could be implemented in a revised Framework
and in the new EYFS. An independent review, of the type outlined
in paragraphs 9 and 10 above, was judged to be the best mechanism
to address the wide-ranging issues of practice, development and
training. The over-riding consideration was that the long-standing
debate around phonics had led to a divergent set of opinions.
For example, the Chair of the Education and Skills Committee,
referring to that Committee's report on teaching reading, noted
that ". . . When we took evidence on this particular topic
. . . passions ran very high". Against this background, the
Department concluded that securing a consensus would most likely
be achieved by an independent expert making recommendations from
a combined analysis of research and practice. Jim Rose was invited
to lead the review as he fulfilled the criteria of being independent,
credible and well-informed. The remit for the review is attached
at Annex A.
20. In conclusion, the Department considered
that an independent review would yield the greatest benefits in
terms of:
(i) building a comprehensive picture to inform
work to strengthen the quality of support for teaching early reading;
(ii) ensuring a robust process founded on
evidence from research, practice and stakeholder opinion; and
(iii) establishing broad consensus about
recommendations which would allow for successful implementation
via the Framework and revised teacher training and development
processes.
Responses to the Committee's questions
Q. In view of the uncertain conclusions
on synthetic phonics reached by the DfES-commissioned research
referred to in the Rose final report (p 19), what weight was given
to:
(i) analysis of existing relevant research,
(ii) experience and pilots in the UK and
(iii) other factors in reaching a decision
on the new policy?
21. From the late 1990s, the Department
had been aware that prior formal research did not point overwhelmingly
to a particular conclusion about synthetic phonics (see Annex
B for definitions of approaches to phonics). Accordingly, the
Rose review was established to integrate a wide set of evidence,
which included all the categories in the Committee's question.
The purpose was to identify best practice in the teaching of reading,
including the role played by synthetic phonics. The following
sources were employed in addressing the fivefold remit set out
by the Secretary of State.
Published research on literacy
in general and the teaching of reading in particular. This included
the systematic review of trials using phonics, which was commissioned
by the Department in 2005 (see paragraph 23);
The report by the Education
and Skills Committee on teaching reading;
Reports and data from Ofsted,
the Qualifications and Curriculum Authority, and the DfES;
Evidence from classroom practice:
notably, a bespoke HMI exercise to monitor the use of phonics
in a variety of schools;
Additional visits to settings
and schools, including some engaged in the Primary National Strategy's
Early Reading Development Pilot;
Attendance at practitioner and
teacher training sessions;
Discussions with practitioners,
teachers and those responsible for training them; and
Oral and written representations
from individuals and organisations in the fields of literacy and
early reading.
22. A full list of the types of evidence
considered, which appears in the published report, is at Annex
C. In addition, Jim Rose drew upon a review advisory team, comprising
researchers and an HMI, which brought expertise in different
aspects of the review's remit. One of the advisory researchers
was Greg Brooks, who also co-authored the DfES-commissioned systematic
review.
23. The DfES research referred to by the
Committee was one of a number of studies or reviews considered
by Jim Rose. It was a systematic review of published quantitative
studies on the teaching of phonics, conducted by Carole Torgerson,
Greg Brooks and Jill Hall. A systematic review gathers published
evidence according to clear criteria. In this case, the studies
included were confined to randomised control trials (RCTs) focusing
on the use of phonics instruction in English. The studies had
to compare a systematic approach to phonics delivery with either
unsystematic or non-phonics teaching, or else directly compare
synthetic and analytic phonics teaching. In all cases, studies
needed to report or allow calculation of statistical effect sizes
in order to be used for the synthesis of results. A total of 12
RCTs met the full criteria for comparing systematic phonics with
alternative reading approaches, although only one was a British
study. The primarily North American context is an issue to consider
in assessing the contribution of the RCT evidence.
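For reference, an effect size in this context is typically the standardised
mean difference between groups. A minimal sketch of the standard calculation
is given below, assuming the common Cohen's d formulation with inverse-variance
pooling; the precise weighting scheme used by Torgerson et al is not
reproduced in this Memorandum.
\[
d = \frac{\bar{x}_{T}-\bar{x}_{C}}{s_{p}},
\qquad
s_{p} = \sqrt{\frac{(n_{T}-1)s_{T}^{2}+(n_{C}-1)s_{C}^{2}}{n_{T}+n_{C}-2}},
\qquad
\bar{d} = \frac{\sum_{i} w_{i}\,d_{i}}{\sum_{i} w_{i}},
\quad
w_{i} = \frac{1}{\operatorname{Var}(d_{i})},
\]
where \(\bar{x}_{T}\) and \(\bar{x}_{C}\) are the mean reading scores of the
phonics and comparison groups, \(n\) and \(s\) are the corresponding group
sizes and standard deviations, and the pooled estimate \(\bar{d}\) weights
each trial \(i\) by the inverse of its variance, so that larger and more
precise trials contribute more to the synthesised result.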
24. Torgerson et al concluded that
the use of phonics in a systematic fashion within a broad literacy
curriculum, as a means of improving reading accuracy, was supported
by the RCT evidence included in their review. This matches Jim
Rose's view, which he drew from much wider evidence, about the
value of teaching phonics in a systematic fashion. However, evidence
from RCTs was less able to answer other questions addressed in
the systematic review. For example, few of the included trials
covered reading comprehension; as a result, the effect on comprehension
of a systematic approach to phonics was not statistically significant.
Similarly, there was insufficient trial evidence on the issue
of spelling, and on direct comparison between the synthetic and
analytic phonics approaches, to draw statistically valid conclusions
on these questions. The authors also advised caution in using
their RCT evidence because of the variation in the amount of phonics
teaching involved in different trials, from just a few hours to
well over 100 hours.
25. As described above, Jim Rose drew on
a much wider range of evidence than the mainly North American
randomised control trials in Torgerson et al. The conclusions
from UK practice, as reported by the specific HMI visits undertaken
for the review and from his own discussions and observations,
not only supported the case for using phonics but suggested that
synthetic phonics was the best route for young children learning
to read. In giving evidence to the Education and Skills Committee
in January, Jim Rose confirmed that the evidence he saw from practice
was compelling.[116] His
report states that the principles of a systematic approach using
synthetic phonics "featured consistently in the best work
seen including the visits undertaken by HMI for the review".
However, he also confirmed that phonics by itself is not sufficient
to fully equip children with all the skills they need to become
effective readers. In this respect, his findings are not inconsistent
with the Torgerson et al position that too few controlled
studies have been undertaken in relation to comprehension and
spelling to demonstrate a positive effect of phonics in these
areas.
Q. What other evidence and research,
in addition to the reports by Jim Rose, were used to inform the
decision; and was his work subject to peer review?
26. The Rose review was not the starting
point for evidence on the teaching of early reading. The National
Literacy Strategy was founded on earlier research conclusions
about reading, and its continued evolution has taken account of
evaluated best practice. As a result, the Strategy always emphasised
the importance of systematic approaches and the key role of phonics.
The review was, therefore, conducted against the background of
evidence-based principles that were already inherent within the
Strategy.
27. The review was designed to be a thorough
and coherent analysis of all types of evidence, including that
commissioned or provided by the Department (including the Torgerson
et al review). Jim Rose kept the Department informed of
his emerging assessments, enabling Ministers and officials to
weigh them against information already available to the Department.
He considered the Department's responses fully in preparing his
drafts, although he retained full responsibility for producing
an independent report. This process enabled the Department to
be confident that the published reports reflected the evidence
obtained. Ministers were therefore able to accept the recommendations
fully.
28. There are two issues to consider in
relation to peer review, which, in the scientific sense, seeks
to test the robustness of the analytical design and methods brought
to bear on the research question. First, the Rose review was not
a research study per se, but aimed to assess existing evidence
and practice. A peer review of the classic type would be less
relevant in this situation. Many of the studies examined had been
peer reviewed themselves, of course, prior to publication.
29. Second, the independent and wide consultation
undertaken by Jim Rose, including with many researchers, did allow
expert judgements to be fed into the process. His interim report
invited comments on the issues raised and the evidence included.
Over 300 written responses were received during the course of
the Review. The input from Rose's expert advisory team, whose
members had a predominantly research background, provided a further
mechanism to ensure robustness in the Review's conclusions. These
channels gave full opportunity for the research community to test
the review's emerging conclusions, which were reaffirmed in the
final report.
Q. What steps are in place to evaluate
the implementation and results of the policy?
30. The Department will monitor closely
the implementation of the recommendations from the review through
a number of mechanisms, many of which are already established.
The Primary National Strategy regional advisers will continue
their programme of monitoring implementation of the Strategy at
local level. Throughout 2006-07, the focus will be on the progress
made by early years settings, schools and local authorities in
adopting and implementing the renewed literacy framework and associated
support which will be introduced from Autumn 2006. Over time,
this will extend to include the implementation of support offered
by the new Early Years Foundation Stage when this is introduced
from 2008.
31. External agencies have also been invited
to monitor and evaluate the implementation. Ofsted's 2006-07 evaluation
of the Primary National Strategy will focus on the implementation
and impact of the Strategy's renewed support in schools and local
authorities. Ofsted has also been invited to review the quality
of training in the teaching of reading that providers of Initial
Teacher Training (ITT) offer over the next two years. The Strategy
will additionally draw upon external expert advice to inform its
development of professional development materials and programmes
for teaching early reading.
32. Finally, Jim Rose has been asked by
the Secretary of State to assess, and provide advice in early
2007, on the progress made in implementing his recommendations
for teaching and practice, based on the first stage of implementation
in Autumn 2006.
33. The effectiveness of making synthetic
phonics the prime strategy in teaching children to read will be
assessed through analysis of national and local attainment and
progress data. National results for the Foundation Stage, and
Key Stages 1 and 2 already yield a rich source of evidence about
standards achieved in reading, writing and English. We will consider
how best to interrogate the national data to make an effective
assessment of the impact of our revised approaches. The approach
will take account of the national nature of the implementation,
which does not allow for classic control trials or studies. The
emerging findings from analysis of effectiveness will also guide
future development of support for early reading.
CONCLUSION
34. The independent review of early reading
was undertaken to provide a comprehensive picture of best practice
in the teaching of early reading. It was seen as the soundest
basis on which to make recommendations for further strengthening
the quality of the support already offered to settings and schools.
In accepting the review's recommendations, the Department was
satisfied that these criteria had been fulfilled. The focus now
is on ensuring that the enhanced support is developed, introduced
and implemented in line with the review's recommendations, and
on monitoring the effects of the initiative.
THE SURE START PROGRAMME
Introduction
35. The development of the Sure Start programme
and its successor policy, Children's Centres, has been heavily
and consistently informed by research evidence. The Sure Start
programme was built on evidence from a wide range of both UK and
international studies. The National Evaluation of Sure Start (NESS),
the Department's Effective Provision of Pre-School Education study
(EPPE) and other UK and international studies continue to be instrumental
in the development of early years policy.
The genesis of the Sure Start Programme
36. The Sure Start Programme emerged from
a cross-cutting review of services for young children undertaken
by a cross-Departmental group, led by HM Treasury, as part of
the Government's 1997 Comprehensive Spending Review (CSR). A key
part of this review was an independent assessment of the research
evidence on what worked in improving outcomes for young children,
particularly those in greatest need. Conducted by Marjorie Smith
of the Thomas Coram Research Unit at the Institute of Education, this
assessment highlighted evidence from a range of studies demonstrating
a clear relationship between poverty and poor child outcomes.
It also illustrated the importance of early childhood experiences,
both positive and negative, in influencing later outcomes. The
Smith assessment concluded that early childhood intervention programmes
could have a significant impact on a wide range of child and family
outcomes. This underpinned the CSR review's conclusion that:
"the provision of a comprehensive community
based programme of early intervention and family support which
built on existing services could have positive and persistent
effects, not only on child and family development but also help
break the cycle of social exclusion and lead to significant long term
gain to the Exchequer."
37. The Government's consideration of services
for young children recognised that the UK evidence on the effectiveness
of early intervention was quite limited, with much of the evidence
for proposed policies coming from the US. It was recommended that
the new programme, which became Sure Start, should be accompanied
by a rigorous and comprehensive evaluation.
38. These recommendations culminated in
the announcement of Government funding of £452 million to
set up 250 Sure Start local programmes (SSLPs) in areas of deprivation
by 2001-02. These programmes aimed to improve services both for
young children and families, in order to improve outcomes for
children. The national programme had a dedicated suite of PSA
targets that were chosen based on their relationship to future
child outcomes. These included smoking in pregnancy, hospital
admissions, and speech and language development. These targets
ensured that all local programmes worked to common aims.
39. Within these common aims, individual
Sure Start programmes had relative freedom to develop a package
of services that met the needs of their particular community.
But they were tasked to select and adopt services and approaches
that were evidence based. A range of approaches had already been
subject to evaluation and identified in the literature as having
positive effects on children and parents, eg parenting interventions
such as Webster-Stratton or PEEP (Peers Early Education Partnership)
amongst others. The Department published a Guidance document in
1999 (A Guide to Evidence Based Practice) that drew together
much of this evidence, enabling local programmes to identify and
implement appropriate evidence-based services.
National evaluation of Sure Start
40. The Department's commitment to a robust
evaluation was clear from the outset. A clear aim of the evaluation
strategy was to inform the ongoing development and roll out of
the programme, as well as identify its impact once established.
41. After a competitive tender, a scoping
exercise was undertaken by the Institute of Education to assess
the type, level and nature of an evaluation needed for this diverse
initiative. The report was used to inform the specification of
requirements for the national level evaluation.
42. The National Evaluation of Sure Start
(NESS) was commissioned in January 2001 following an open,
competitive tender. A consortium of academics and consultants,
led by Birkbeck College, University of London, won the contract.
The first survey of Trailblazer and second round programmes took
place later that year.
43. NESS is overseen by an integrated governance
structure which allows for independent scrutiny of both methodology
and findings. An independent expert panel provides this function
to both the Department and to the researchers. An additional steering
group allows other government departments, researchers and practitioners
an opportunity to feed into the development of the evaluation
(particularly in sharing messages from their own evaluations and
research) as well as ensuring reports are policy relevant and
therefore will be useful in policy development in other Government
departments.
Response to the Committee's questions
Q. What pilot projects were undertaken
in conjunction with the Sure Start programme and how were the
results of these pilots assessed?
44. The initial phase of Sure Start was
implemented rapidly and involved a significant number of local
programmes. The initial programme design drew heavily on the evidence
gathered during the Treasury-led review. The staged roll-out of
the local programmes[117]
afforded an opportunity to use the first set of 59 SSLPs, the
"Trailblazers", as a pilot for evaluation purposes.
These initial programmes provided an opportunity to gather early
lessons, and help and support future programmes. Informal feedback
was brought together within the Department, through local programme
staff themselves, partner agencies and regionally based DfES staff.
Staff at Government Offices facilitated the sharing of experience
and networking via seminars and meetings, which allowed practitioners
to share both positive and negative lessons.
45. The Department drew together the information
from these different sources and modified its implementation accordingly.
It placed greater emphasis on working with pregnant women, for
example, not just those already with babies or young children.
These policy modifications, as well as examples of good practice,
were distilled into guidance which was quickly disseminated to
the next tranche or "Round" of programmes. Guidance
documents were issued for each new Round, utilising evidence and
learning generated by the previous SSLPs.
46. Early learning and feedback from programmes
on the ground shaped Sure Start policy more fundamentally. It
became clear, for example, that the Sure Start "model"
did not fit with rural communities, since the geographic proximity
to services which was a key feature of the design was more problematic
in rural areas. The Department became more flexible in the application
of the model and also set up a parallel programme specifically
to incorporate rural areas as well as small pockets of deprivation,
based on this evidence.
47. More formal evaluation of the early
rounds came via the national evaluation (NESS), which was designed
to cover a series of rounds of implementation. Early lessons from
the implementation of Trailblazer and Round 2 programmes were
available by early 2002 and fed into further guidance for programmes.
As well as providing an overview of early progress, the evaluation
identified a number of challenges and barriers to delivery of
Sure Start. These included: the need to engage and consult better
with families, especially fathers and those from minority ethnic
groups; to improve access to services; to build multi-professional
teams; and to improve local partnership working. In response,
the Department provided additional support in the form of advisers
based at Government Offices, as well as targeting its guidance
documents on identified issues (eg "Guidance on involving
minority ethnic children and families").
48. During 2002 the Department assessed
the range of evidence which had become available on the pilot
programmes. Whilst it was too early to identify formal impacts,
the picture that emerged from local evidence and from the early
evaluation of Trailblazers was that parents clearly liked (and
were using) Sure Start services, and that staff felt they were
already making a difference in families' lives. This evidence
fed into Spending Review processes, including a Prime Minister's
Strategy Unit review of childcare which recommended the bringing
together of DfES work on early education, childcare and SSLPs,
and a transfer of joint responsibility from DoH/DfES to DWP/DfES.
Q. What were the key lessons from the
pilots and how have they been used in the development of the Sure
Start programme?
49. The process of drawing lessons from
the early rounds of local partnerships has been described above,
with some examples already cited of modifications to policy implementation.
The reports from the national evaluation identified a number of
issues which the Department needed to consider. Three further
examples are:
it was shown that some of the
most disadvantaged families in Sure Start areas were not demonstrating
benefits from the programme. The evaluation suggested that
groups such as teenage parents, or families of children with
special needs, either did not use Sure Start services or found
that the services provided did not meet their needs and improve
outcomes. In response,
Departmental guidance "Sure Start Children's Centres Practice
Guidance" in November 2005 required that all centres worked
closely with low participating groups, through outreach, peer
support and other mechanisms. The guidance advised working closely
with partner organisations to share expertise, information and
data in order to engage such families;
the Sure Start evaluation also
found that children living in those local partnerships that were
actively engaged with their local health agencies had better outcomes
than children in other SSLPs. The guidance therefore emphasised closer
working relationships with health partners. At a policy level,
further emphasis was placed on work with the Department of Health
to ensure children's centres remained at the heart of health service
delivery to young families;
SSLPs had been set a key goal
of helping parents into work or training, which reflected prior
evidence that work is a prime route out of poverty. The Sure Start
evaluation, however, found that this objective was proving difficult
for SSLPs, since some of the activities involved (job search support,
directly provided training) fell beyond the traditional remit
of early years practitioners. DfES and DWP therefore arranged
for closer links between SSLPs and Jobcentre Plus, to ensure that
Sure Start parents had easier access both to the full range of
advice and support they needed and also to the training providers
in their area. Latest figures show fewer children aged 0 to 3
living in homes completely dependent on benefits in SSLP areas
than there were four years ago (down 3.8% to 40.4%). This was
a significantly greater drop than in England as a whole (where
the figure fell by 1.2% to 22% over the same period).
50. Evidence from the national evaluation
and other studies has contributed to the design and development
of Sure Start, both in the original rollout of local partnerships
and in developing Sure Start children's centres, the "successor"
policy to SSLPs. Examples of this broader impact of evidence are:
birth cohort studies (supported
by the Department) have consistently demonstrated the link between
early childhood experience and later outcomes. And the DfES-funded
Effective Provision of Pre-School Education (EPPE) longitudinal
study has demonstrated that children from disadvantaged backgrounds,
in particular, benefit from high quality pre-school. Therefore,
a key element of the Government's 10 year Strategy for Childcare
(HMT, 2004) and Sure Start children's centres has been to provide
high quality integrated education and childcare services in disadvantaged
areas. And, based on EPPE evidence that the presence of a qualified
teacher was crucial for high quality early years provision, the
Department has required that all children's centres have a minimum
level of teacher time on site.
EPPE also found that early years
settings that integrated their early education and childcare were
among the most successful in improving outcomes for children.
This led the Department to develop this type of provision more
widely. "One-stop shop" centres that provide both early
education, childcare, family support and a range of health and
other advice are now at the heart of early years policy. NESS
evidence confirms that parents and staff alike prefer the "one
stop shop" approach. These and other findings contributed
to Practice guidance in November 2005.
Q. What other evaluations of the Sure
Start programme or research have been undertaken and what role
have they played in informing the evolution of the programme?
51. From the outset, SSLPs were each required
to undertake their own evaluation of their programme. This was
intended to encourage reflective practice, test new and innovative
approaches and practice robustly, grow the evidence base about
what works for children and families, and inform the further development
and progress of the programme. SSLPs were encouraged and supported
by academics from the NESS team who facilitated workshops, training
events, shared information and synthesised findings.
52. Local evaluation reports have been extremely
useful in disseminating evidence about what works for families
and children. This has complemented the more overarching reports
of the national evaluation team. Examples of good practice were
heavily used in the Children's Centres practice guidance and new
evidence will be incorporated into the next set of guidance due
for release later in 2006. Mechanisms were established to share
evidence and experience between practitioners, policy makers and
researchers. This has since extended beyond SSLPs to the wider
early years sector, as well as those involved with community development
and local regeneration.
Q. What has been the total Government
expenditure to date on:
(ii) evaluation and commissioned research
to inform the programme and
(iii) the Sure Start programme as a whole?
53. The cost of the overall Sure Start programme
to date (1999-2000 to 2004-05) has been £1.3 billion.
Within this, the approximate allocation of capital and revenue
from the Department to the 59 Trailblazer local programmes over
their first three years of operation was £148 million. The
National Evaluation of Sure Start is costing £20.3 million
over the seven-year period 2001-08. This is not solely for the
independent national evaluation but includes costs of support
provided by the consortium to Sure Start local partnerships for
their local evaluation work.
June 2006
Annex A
REMIT FOR THE ROSE REVIEW
ASPECT 1
What best practice should be expected in the
teaching of early reading and synthetic phonics.
ASPECT 2
How this relates to the Early Years Foundation
Stage and the development and renewal of the National Literacy
Strategy's Framework for teaching.
ASPECT 3
What range of provision best supports children
with significant literacy difficulties and enables them to catch
up with their peers, and the relationship of such targeted intervention
programmes with synthetic phonics teaching.
ASPECT 4
How leadership and management in schools can
support the teaching of reading, as well as practitioners' subject
knowledge and skills.
ASPECT 5
The value for money or cost effectiveness of
the range of approaches covered by the review.
Annex B
GLOSSARY
Synthetic phonics: the defining characteristics
are sounding out and blending phonemes (the smallest units of
sound that make a difference to the meaning of a word) in order,
all through a word to read it; and segmenting words into their
phonemes in order to spell them.
Analytic phonics: the defining characteristics
are inferring letter-sound relationships from sets of words which
share a letter and a sound, eg pet, park, push, pen.
Systematic phonics: teaching letter-sound
relationships in an explicit, sequenced fashion, as opposed to
incidentally or on a "when-needed" basis.
Randomised control trials: where two
or more groups of children are formed randomly and each group
receives a different form of instruction. If one group makes
significantly better progress it can be inferred that the form
of teaching they received was more effective, because all other
factors which might influence the outcome are controlled for (with
the exception of chance).
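As an illustrative formulation (a sketch added for this glossary, not
taken from the review itself): writing \(\bar{Y}_{1}\) and \(\bar{Y}_{2}\)
for the mean reading outcomes of the two randomly formed groups, the quantity
\[
\widehat{\Delta} = \bar{Y}_{1} - \bar{Y}_{2}
\]
estimates the true difference in effectiveness between the two forms of
instruction. Random assignment ensures the groups differ systematically
only in the teaching received, so any difference beyond sampling error
can be attributed to the instruction itself.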
Annex C
EVIDENCE CONSIDERED BY THE REVIEW
The review took evidence from a range of sources,
the main ones being:
oral evidence, from individuals
and associations;
visits to schools and settings,
including training events and conferences; and
written evidence and submissions.
ORAL EVIDENCE: INDIVIDUALS
Professor Lesley Abbot, Institute of Education,
Manchester Metropolitan University.
Professor Robin Alexander, University of Cambridge.
Bev Atkinson, Medway LA.
Sir Michael Barber.
Ian Barren, Institute of Education, Manchester
Metropolitan University.
Alix Beleschenko, Qualifications and Curriculum
Authority (QCA).
Professor Greg Brooks, University of Sheffield.
Tom Burkard, Promethean Trust.
Professor Brian Byrne, University of New England.
Mary Charlton, Tracks Literacy.
Professor Margaret Clark, University of Birmingham.
Ian Coates, former Head of SEN and disability
division, DfES.
Kevan Collins, former Director, Primary National
Strategy.
Felicity Craig.
Shirley Cramer, Chief Executive, Dyslexia Institute.
Kate Daly, adviser, Minority Ethnic Achievement
Unit, DfES.
Edward Davey MP.
Alan Davies, THRASS.
Professor Henrietta Dombey, University of Brighton.
Marion Dowling.
Nick Gibb MP.
Professor Usha Goswami, University of Cambridge.
Marlynne Grant, Educational psychologist, South
Gloucestershire LA.
Jean Gross, Every Child A Reader.
Kate Gooding, Early Childhood Forum (ECF).
Sue Hackman, chief adviser to ministers on school
standards, DfES.
Professor Kathy Hall, Open University.
Diana Hatchett, Primary National Strategy.
Debbie Hepplewhite, Reading Reform Foundation.
Sue Horner, QCA.
Jane Hurry, University of London, Institute
of Education.
Laura Huxford.
Julie Jennings, ECF.
Professor Rhona Johnston, University of Hull.
Lesley Kelly, Cambridgeshire LA.
Penny Kenway, Islington LA.
Julie Lawes, Catch Up.
Sue Lloyd and Chris Jolly, Jolly Phonics.
Ruth Miskin, ReadWriteInc.
Sue Nally, Warwickshire LA.
Angie Nicholas, Dyslexia Institute.
Joan Norris, ECF.
Wendy Pemberton, Primary National Strategy.
Sue Pidgeon, Primary National Strategy.
Dee Reid, Catch Up.
Eva Retkin.
Dilwen Roberts, Merton LA.
Rosie Roberts.
Cheryl Robinson, Bedfordshire LA.
Lindsey Rousseau, South East Region Special
Educational Needs Partnership.
Conor Ryan.
Professor Pam Sammons, University of Nottingham.
Peter Saugman and Bruce Robinson, Mindweavers.
Professor Margaret Snowling, University of York.
Professor Jonathan Solity, University of Warwick.
Lesley Staggs, Primary National Strategy.
Professor Rhona Stainthorp, University of London,
Institute of Education.
John Stannard.
Arthur Staples, LexiaUK.
Professor Morag Stuart, University of London,
Institute of Education.
Professor Kathy Sylva, University of Oxford.
Ralph Tabberer, Training and Development Agency
for Schools (TDA).
Jude Thompson, Headteacher, Dorton House School.
Janet Townend, Dyslexia Institute.
Gail Treml, SEN professional adviser, DfES.
Paul Wagstaff, Primary National Strategy.
Trudy Wainwright, St Michael's Primary School,
South Gloucestershire LA.
Tina Wakefield, British Association of Teachers
of the Deaf.
Mick Waters, QCA.
Joyce Watson, University of St Andrews.
Lyn and Mark Wendon, Letterland.
Caroline Webber, Medway LA.
Rose Woods, Helen Arkell Dyslexia Centre.
ORAL EVIDENCE: ASSOCIATIONS
Association of Teachers and Lecturers (ATL).
Basic Skills Agency.
British Association for Early Childhood Education.
Dyslexia Institute.
Early Education Advisory Group.
Educational Publishers Council.
GMB.
I CAN.
National Association for Language Development
in the Curriculum (NALDIC).
National Association for the Teaching of English
(NATE).
National Association of Education Inspectors,
Advisers and Consultants.
National Association of Head Teachers (NAHT).
National Association of Primary Education (NAPE).
National Governors' Association.
National Association of Schoolmasters Union
of Women Teachers (NASUWT).
National Childminding Association (NCMA).
National Children's Bureau.
National Confederation of Parent Teacher Associations
(NCPT).
National Family and Parenting Institute (NFPI).
National Literacy Trust.
National Union of Teachers (NUT).
Parent Education and Support Forum (PESF).
Peers Early Education Partnership (PEEP).
Pre-School Learning Alliance (PLA).
Primary Umbrella Group.
Reading Recovery National Network.
Renaissance Learning.
UNISON.
United Kingdom Literacy Association (UKLA).
Volunteer Reading Help.
Xtraordinary People.
ORAL EVIDENCE: EDUCATION AND SKILLS COMMITTEE
In addition, oral representations were taken
from members of the Education and Skills Committee on 30 January,
2006.
Visits
In Scotland, members of the review took evidence
from the Scottish Executive Education Department, members of Clackmannanshire
council, headteachers and teachers of Clackmannanshire primary
schools.
In England, in addition to the oral evidence
listed, evidence was drawn from visits to schools and training
events, as well as discussions with practitioners during those
events. Of the schools visited by HMI, 17 included nursery-aged
pupils (aged 3-4).
Schools visited by Her Majesty's Inspectors
(HMI).
Andrews' Endowed Church of England Primary,
Hampshire LA.
Barlows Primary, Liverpool, Liverpool LA.
Blue Coat C of E Aided Infants, Walsall, Walsall
LA.
Bonner Primary, London, Tower Hamlets LA.
Brooklands Primary, London, Greenwich LA.
Byron Primary, Bradford, Bradford LA.
Christ the King RC Primary, London, Islington
LA.
Cobholm First School, Great Yarmouth, Norfolk
LA.
Coppice Infant and Nursery School, Oldham, Oldham
LA.
Elmhurst Primary, London, Newham LA.
Heaton Primary, Bradford, Bradford LA.
Holy Family Catholic Primary, Coventry, Coventry
LA.
Kings Hedges Primary, Cambridge, Cambridgeshire
LA.
Lostwithiel Primary, Lostwithiel, Cornwall LA.
St Michael's C of E Primary, Bristol, South
Gloucestershire LA.
St Sebastian's Catholic Primary School and Nursery,
Liverpool, Liverpool LA.
Stoughton Infants, Guildford, Surrey LA.
Swaythling Primary, Southampton, Southampton
LA.
Thelwall Community Infant School, Warrington,
Warrington LA.
Tyldesley Primary, Manchester, Wigan LA.
Victoria Infants, Workington, Cumbria LA.
Victoria Road Primary, Plymouth, Plymouth LA.
William Lilley Infant and Nursery, Nottingham,
Nottinghamshire LA.
Woodberry Down Primary, London, Hackney LA.
Other schools visited by members of the review
team
Greatwood Community Primary, Skipton, North
Yorkshire LA.
Ings Community Primary and Nursery, Skipton,
North Yorkshire LA.
Lyndhurst Primary, London, Southwark LA.
Millfield Preparatory School, Glastonbury.
Oliver Goldsmith Primary, London, Brent LA.
Snowsfields Primary School incorporating the
Tim Jewell Unit for Children with Autism, London, Southwark LA.
Walnut Tree Walk School, London, Lambeth LA.
Training observed and conferences attended
"ReadWriteInc"training: 5 and
6 September 2005.
Amy Johnson Primary School, Sutton LA.
"ReadWriteInc"training: 16
September 2005.
Vermont School, Southampton, Hampshire LA.
"The Death of Dyslexia?"conference:
21 October 2005.
The Friends House, London.
"Playing with sounds"training:
8 November 2005.
Cambridge Professional Development Centre, Cambridgeshire
LA.
Early Reading Development Pilotfeedback
conference for pilot LAs: 15 December 2005 Marlborough Hotel,
London.
Reading Recoverytraining: 24 January
2006.
Woodlane High School, London, Hammersmith and
Fulham LA.
Written evidence
Evidence was also drawn from sources of published
information, notably:
the House of Commons Education
and Skills Committee, particularly the report Teaching children
to read;
reports and data from Ofsted,
in particular from evaluations of the National Literacy Strategy,
the Primary National Strategy, the teaching of English and initial
teacher training;
reports and papers from other
bodies, including the Qualifications and Curriculum Authority,
the Training and Development Agency for Schools and the
Basic Skills Agency;
reports and papers from researchers
from academic establishments, professional associations, and professionals
working in the field of early reading and other aspects of literacy
from both the United Kingdom and internationally;
materials and guidance for practitioners
and teachers on supporting literacy and reading development for
0-3, the Foundation Stage, and Key Stages 1 and 2 produced by
the DfES and the Primary National Strategy;
teaching materials and guidance
produced by providers of commercial and voluntary reading schemes;
analysis by the DfES of national
test results for reading and writing at Key Stage 1 and for English
at Key Stage 2.
Further evidence was drawn from over 300 letters
and submissions to the review, including some from those who also
provided oral evidence.
115 This aims to create a single, coherent framework
for children's care, learning and development from birth until
the end of August following a child's fifth birthday.
116 House of Commons Education and Skills Committee, Teaching Children
to Read: session, 30 January 2006.
117 59 Round 1 "Trailblazer" programmes (announced November
1998); Round 2: 69 programmes (November 1999); Round 3: 64 programmes
(July 2000); Round 4: 66 programmes (January 2001).