Performance indicators
29. Local electorates must have a mechanism which
allows them to compare the relative performance of local authorities
and to judge whether they are receiving value for money. Since
1992, the Audit Commission has been responsible for this task,
collecting and publishing indicators which allow comparisons to
be drawn between the performance of audited bodies, including local authorities and police and fire authorities.[65]
Each audited body is required to collect, record and publish performance indicators annually in a local newspaper.
30. After some initial resistance, the Audit Commission's
Performance Indicators are now accepted by local government.[66]
The Local Government Association commented that "the Audit
Commission has discharged a difficult task well ... it has won
the respect of many initially sceptical local authorities".[67]
However, there is still work to be done to improve the indicators.
This may be possible through the changes being made as part of
the introduction of Best Value.[68]
31. The key issue raised by witnesses was the importance
of developing Performance Indicators which measured outcomes rather
than simply outputs.[69]
Traditionally, performance indicators have measured what is easy to measure, which has tended to mean quantitative outputs, for example the percentage of planning applications processed in eight weeks, rather than trying to measure the quality or outcomes of a service, for example whether decisions taken on planning applications were of a high quality, took into consideration good urban design principles, and so on. This was the main point made by the London Borough of Newham, which told us:
"what matters is how
things change for the better for people in their every day lives
... we run the risk of diverting energy and resources if we do
not take an early opportunity to get rid of pointless measures
that tell us little or nothing about what is happening in the
real daily lives of people".[70]
Unfortunately, the Audit Commission's performance indicators have often been unable to capture outcomes adequately. Steve Martin
of Warwick Business School stated that "existing performance
indicators largely measure throughputs and outputs (economy and
efficiency) as opposed to outcomes and impacts (effectiveness)".[71]
One of the reasons for this may be that services are often delivered
by a wide range of agencies.[72]
32. The Audit Commission agreed that "outcomes
are the most desirable thing to be measuring"[73]
and argued strongly that significant progress had been made in
that regard. To illustrate this, Ms Thomson cited examples of
outcome-based performance indicators such as 'older people over
65 helped to live at home' and the 'proportion of pupils at 11
plus achieving three or more A-C GCSE results'.[74]
33. We accept that measuring outcomes is not a straightforward task and that there are difficulties both in developing sensible measures of outcomes and in collecting and analysing the data. However, while we were pleased to hear that the Audit
Commission has certainly made some progress, we consider that,
given the time and resources which it has committed to developing
Performance Indicators, it is reasonable to expect that more sophisticated
measures should now be in place. We recommend that the Audit
Commission and DETR continue to review their respective sets of
performance indicators to ensure that they provide meaningful,
outcome-based measures of local authorities' performance.
Joint working
34. Increasingly, local services are being organised on a joint or 'cross-cutting' basis, with several departments or agencies involved (for example, care services for elderly people, which involve social services departments and community and mental health trusts). Such working is an integral part of
the Best Value regime. Accordingly, there is a need for 'joined
up' audit and inspection.
35. In its role as auditor of local authorities and health bodies, the Audit Commission claimed that it "has a strong track record of undertaking studies that cut across boundaries"
and that effective arrangements were in place to ensure joint
working with the National Audit Office, Her Majesty's Inspectorate
of Constabulary, the Social Services Inspectorate and Ofsted.[75]
We received evidence from the National Audit Office,[76]
the Social Services Inspectorate[77]
and the Benefit Fraud Inspectorate[78]
that these arrangements work well, and that there is a high degree
of coordination between the bodies to avoid overlap and duplication.
36. However, some witnesses had concerns about the
Audit Commission's ability to audit 'cross-cutting' services.
The Local Government Association told us that "judgments
by the [Audit Commission's Best Value] Inspectorate on the corporate
health and capacity of an authority can only be assembled from
a series of partial reviews carried out over an extended period,
and conducted by several different inspectorates ... they each
work within very different traditions, with different cultures
and methodologies".[79]
We consider solutions to this problem at paragraphs 40-48 below.
37. We welcome the Audit Commission's and the other
Inspectorates' commitment to joint working but we consider that
it does not go far enough. Indeed, we were surprised to learn
that joint inspections and joint working tend to be the exception.
Given the ways in which local government is now being asked to
deliver services, we are of the view that there needs to be more
joint inspection, and that this approach should become the norm
(this is discussed further at paragraph 48). In addition, while
we agree that joint working with some agencies, for example the
Social Services Inspectorate, has been relatively frequent, we
are concerned that there has in practice been relatively little
joint working between the Audit Commission and the National Audit
Office over the last two decades. Joint studies have been rare, and we consider this to be a lost opportunity.[80]
We therefore recommend that the Public Audit Forum and the
Government's review of central government audit arrangements seek
out opportunities for increased joint work between the Audit Commission
and the National Audit Office.
13  Q86, AC10, AC13, AC17, AC18, AC20, AC25, AC27
14  AC11, paras 14 & 15
15  AC15, para 20
16  AC30, para 9
17  Q148, Q320
18  AC10, para 8.1
19  QQ305, 307, AC23
20  Q148
21  QQ309, 310
22  Q436
23  The mixed market brings a number of benefits, mainly relating to fees and the quality of audit work. First, fee increases are constrained by the presence of District Audit, upon whose costs fees are based. The existence of an in-house contractor able to deliver high quality work at the specified price discourages the private sector from trying to seek higher rates. Second, the mixed market helps to maintain professional standards and facilitates the spread of good audit practice (partners/directors from all audit providers participate in 'peer reviews' as part of the audit quality process). This in turn encourages competition, as providers wish to be seen to be delivering a good service relative to their competitors.
24  AC10, para 8.2
25  AC12, para 39
26  AC11, para 14
27  AC10, para 4.4
28  AC12, para 12; see also Q436
29  QQ305, 306, 325
30  Q130; see also AC10
31  Q437; see also Q504
32  Q439
33  Q291
34  AC14
35  Q162
36  Mr P. J. Butler reviewed the Commission's overall performance in 1995. His report found that the market testing round which took place in November 1994 cost the Audit Commission £50,000 and firms £650,000 (an average of £110,000 per audit; six audits were market tested that season).
37  AC23
38  AC23
39  AC25, para 26; see also Q287
40  AC30, para 5
41  Q211; this was also the view of Hammersmith & Fulham (Q204)
42  AC07, AC14, AC24, Q110
43  AC12, para 10
44  This is the average forecast audit fee for all English and Welsh local authorities (excluding parish councils) for 1999-2000. The average fee for health bodies in 1998-99 was £68,000.
45  AC06, para 13, AC07, AC08, AC24, Q366
46  AC23, AC25, AC30, para 12
47  AC27, para 7.6
48  Q149
49  CIPFA Finance & General Statistics 1999-2000
50  Q224
51  QQ224, 226
52  AC15
53  See AC11, Box 4 at para 18
54  AC17, AC18, AC25, paras 7, 9
55  Q196, Q243, AC16, AC20
56  AC24; see also Q95
57  AC23, AC25, Q110
58  AC11, para 58
59  AC14, AC16, Q342
60  AC03; see also AC27, para 6.6
61  AC24, AC25
62  AC10
63  Q430, AC15, para 27
64  QQ432, 433
65  AC15, para 31
66  AC14, para 7
67  AC17
68  As part of the introduction of Best Value, changes have been made to the Performance Indicators and the way in which they are used. There now exist three groups: the Government's 'Best Value Performance Indicators', of which there are 170; the Audit Commission's performance indicators, of which there are 54; and local indicators which authorities will need to develop to reflect their particular circumstances.
69  See AC22, Q70
70  AC09, paras 4.2-4.3; see also para 3.2
71  AC13
72  AC09
73  Q453
74  Q453
75  AC11, paras 25 & 26
76  AC19, para 3
77  AC31, para 15
78  AC32, para 3
79  AC17
80  Q24