2 Using data
13. The Department recognised that there were shortcomings
in the data it receives from local authorities about how much they
spend on foster and residential care. The Department considered
its guidance to councils on how to complete the return to be clear,
but there is no consensus among local authorities on how to report
spending or complete their returns. The Local Government Association
has asked the Chartered Institute of Public Finance and Accountancy
to investigate how the data can be improved.
The Department's and others' tools for benchmarking the costs
of care are not widely used. The Department described this as a
chicken-and-egg problem: if the data in the benchmarking tools
were poor, local authorities had little incentive to use them, and
if the tools went unused, there was little incentive to improve
the quality of the data provided.
14. The Department collects a huge amount of data
on spending and on individual children. The Association of Directors
of Children's Services estimated that local authorities make 11
major statutory returns to government containing information relating
to children. Local authorities submit data on each child in care,
for example their age or the type of placement they are in, and
on the total amount spent on their children's social care functions.
The Association told us that collecting more data with which to
monitor local authority performance was unnecessary. Instead,
a suite of about 25 to 30 indicators of children's experiences,
collected from parents, children and carers, could give a picture
of the broad health of a local authority's provision for children
in social care.
15. The Department does not use its national database
on all children in care to challenge individual local authorities
to improve their performance, hold councils to account, understand
how different patterns of care affect outcomes for children, or
assess what constitutes value for money.
Although its own Accountability System Statement, dated September
2012, clearly states that it has responsibility for managing the
performance of local authorities, the Department has no indicators
that accurately measure the efficacy of the care system. The Department
told us that Ofsted inspection reports are the best way to hold
local authorities to account but these do not fully assess all
aspects of quality.
The Care Leavers' Association reported that 'fuzzy' lines of accountability
mean that children and young people face a postcode lottery as
different local authorities provide different services and experiences.
Ofsted's opinion is that poor local accountability is a cause
of the 'shockingly wide' gap in educational attainment.
The Department relies on Ofsted inspection reports to tell it
about outcomes for children, but these do not assess value for money.
16. We heard that local people are not well-supported
in holding their councils to account for the quality of foster
and residential services. For adoption, the Department produces
user-friendly scorecards for each local authority that set out
the average time taken at different stages of the adoption process.
There is no equivalent for foster and residential care setting
out, for example, how a local authority compares to others
on how often children change placement, children's educational
outcomes and their receipt of health checks. The Department does
publish annual Microsoft Excel performance tables, covering a set
of measures for all looked-after children, on the pan-government
website, GOV.UK, but these are far from the clear and easily
accessible information that should be available to the layperson.
17. Ofsted told us that, for schools, it has helped
to improve standards by making benchmarking data available to
them in an accessible and useful form. Ofsted uses the data itself
to spot when things are going badly wrong at a school and, if so,
inspects more frequently. With children's services, Ofsted
has to wait for the Department to request an inspection outside
the three-yearly scheduled visit.
Ofsted told us that it would like the same inspection principles
that apply to under-performing schools to be applied to children's
services; namely, that the Department use its data to identify
quickly when local authority services are faltering, and instruct
Ofsted to inspect more frequently.
18. In the health sector, Monitor promptly challenges
hospitals when their performance information suggests things are
starting to drift. However, Ofsted has to rely on the Department
to intervene when it finds that a local authority is not meeting
the required standards. As an inspector, Ofsted cannot intervene
itself. The Department
told us that it does not intervene based on its analysis of data
provided by local authorities. The Department was intervening
at 21 local authorities at the time of our hearing.
19. Ofsted reported that it runs best practice programmes
with the 55% of inspected local authorities that require improvement,
and that these were proving very successful. It does not,
however, draw on the Department's expertise when doing so.
The Department's data about the services provided to children
in care cannot yet be matched to Ofsted's assessment of the quality
of care, so the Department cannot say whether children with the highest
needs are placed in the best quality care.