Select Committee on Environment, Transport and Regional Affairs Tenth Report


AUDIT COMMISSION

THE AUDIT COMMISSION'S EXISTING ROLE

9. Our evidence indicated that the Audit Commission is a well respected body[13] which is generally regarded as executing a difficult task well. Nevertheless, there were several areas of its existing activities and policies which attracted criticism including: its use of the mixed market for public audit, the market testing process, audit fees, the approach to value for money work, performance indicators and joint working.

The mixed market

10. The Audit Commission carries out its duties in relation to the audit of local authorities and health bodies through District Audit and a number of approved private sector accountancy firms. Currently District Audit undertakes 71 per cent of audits and the remaining share is tendered to seven private firms. 'Blocks' of audits, where several authorities or health bodies constitute a block, are allocated to a particular auditor. This arrangement is called the 'mixed market'. The rationale behind the operation of the mixed market is, according to the Audit Commission, to ensure continuity of audit supply and to stimulate genuine competition and keep costs low.[14] The 71:29 split has been static since 1983, notwithstanding Government claims to keep the ratio between District Audit and the private firms under review "at regular intervals".[15]

11. There is a belief that a larger share of the audit market should be available to the private firms. Predictably this was the view of the firms themselves. They argue that since the private sector market has no prospect of growth, and the firms therefore have no scope to increase their share (other than by taking audits away from other firms), there is little incentive for them to continue investment in recruitment, career development and training.[16] We were told that two firms had withdrawn from the market last year[17] and that this was because the "risk: reward equation"[18] was no longer attractive.[19] This concern was shared by the Chartered Institute of Public Finance and Accountancy (Cipfa).[20] All the private firms submitting evidence—KPMG, PricewaterhouseCoopers and RSM Robson Rhodes—argued that District Audit's share should be reduced to 50 per cent.

12. No cogent arguments were presented in favour of the 71:29 split. There was even a distinct lack of enthusiasm on the part of District Audit about defending the status quo.[21] That the status quo has continued seemed, to us, to be a result of inertia rather than a deliberate decision by the Audit Commission to limit the private firms' share. The Audit Commission pointed out that shifts in the balance can take place as a result of market testing (discussed in paragraphs 16-21 below), but that in practice only marginal changes had occurred as a result of this.[22]

13. We are convinced of the value of a mixed market for public audit and are concerned about the withdrawal of firms from that market.[23] Cipfa told us that "the further withdrawal of firms would damage the credibility of the public audit process and in the long term could potentially undermine the Audit Commission itself".[24] We recommend that the Audit Commission explore ways both to ensure that no more firms withdraw from the market and to attract new entrants.

14. Also important in the context of the mixed audit market is the status of the relationship between District Audit and the Audit Commission. District Audit was set up as an arm's length agency of the Audit Commission in 1994, having previously been an in-house function. The relationship between the two is now defined by District Audit's Corporate Plan.[25] We agree with the Audit Commission that it is necessary to retain a dedicated public audit provider to provide "an audit presence both in remote geographic locations ... and for difficult or risky audits".[26] However, even with arm's length status, Cipfa told us that this relationship created a conflict of interest and suggested that increased separation was needed, and that District Audit should be set up as a more formal separate agency, with a distinct legal identity and its own line of reporting.[27] The difficulty arises because in addition to the nine or so per cent of audit fees it collects from all audit providers, the Audit Commission collects an 'owners' dividend' from District Audit. This amounted to around 5.5 per cent of District Audit's turnover in 1999-2000.[28] We are concerned that this creates a somewhat cosy relationship between the Audit Commission and District Audit and means that it is in the Commission's own interests to perpetuate the current arrangement and existing market share. We recommend that the Audit Commission review the status of District Audit with a view to further enhancing the separate identities of the two bodies.

15. In addition to further opening the public audit market to private firms, witnesses argued that the same principle should ultimately be applied to the inspection of Best Value. With the creation of an in-house Best Value Inspectorate (described at paragraph 39) the Audit Commission will, at least initially, be undertaking all Best Value inspections. However, we were told that there would be a number of benefits in also establishing, in time, a mixed market for the inspection function. RSM Robson Rhodes, one of the audit firms, claimed that such a move would make the overall market more attractive and reduce the risk of audit firms withdrawing from the public audit side.[29] It would also remove the potential conflict of interest which currently exists whereby 71 per cent of local authorities would have both an auditor (of their Best Value Performance Plan) and an inspector employed by the same organisation. Other witnesses agreed—for example, the Local Government Association argued that the function should be externalised "as soon as possible", primarily to ensure that fees are kept low.[30] We were therefore pleased that Dame Helena Shovelton, Chair of the Audit Commission, made it clear that the Commission is committed in principle to having a mixed market for Best Value inspection.[31] It envisages a period of 18 months to two years before it is feasible to go out to tender with inspection contracts.[32] Given the burden which the Best Value regime and the Audit Commission's inspection activities may represent for local authorities, we agree that this cautious approach is appropriate. Inspection is likely to evolve and only once the Audit Commission's approach has been tried and tested, and a preferred formula settled upon, should private firms be invited to tender for inspection contracts. However, we would be very concerned if the Audit Commission backs away from this commitment in 18 months' time. There is a danger that, having recruited and trained a body of inspectors, the Commission may be reluctant to relinquish part of its role in relation to inspection. This should not be allowed to happen.

Market testing and rotation of auditors

16. Selective market testing, whereby providers tender for individual local authority audits, was introduced in 1994 by the Audit Commission following recommendations set out in the 1993 FMPR. Around 1 per cent of local authority and health body audits are put out for competitive tender each year, the purpose being to ensure that the public audit market operates as competitively as possible.

17. District Audit supported market testing, claiming that it had been "a powerful stimulus to the improvement to our quality generally".[33] One of the authorities from which we took evidence, the London Borough of Hammersmith & Fulham, was also in favour, arguing that market testing should be "the norm".[34]

18. However, most other witnesses who expressed a view cast doubt on whether, six years after the introduction of market testing, it continued to serve a useful purpose.[35] In particular, the private sector providers were very sceptical about its value. KPMG told us that market testing was expensive for providers[36] and served no useful purpose, claiming that "fees have not decreased and other firms have not entered the market".[37] It commented that the effort and expense of competing is disproportionate to the gains that can be made (for the firm) since there are no new assignments and work put out to tender would have to be taken from an existing supplier. Furthermore, the firm argued that this burden fell disproportionately on the private audit providers since "District Audit, with 75 per cent of the market, is able to spread the costs more effectively than a firm with an average of six per cent".[38] These arguments were also put forward by RSM Robson Rhodes[39] and PricewaterhouseCoopers, the latter commenting that the most recent market testing exercise was a waste of time because "the audited bodies who participated in the test had no wish to change their auditor".[40] The costs involved in market testing may also be another barrier to potential market entrants.

19. We conclude that in practice there is little benefit to local authorities or to auditors in going through an expensive and time-consuming market testing process when often all it achieves is to confirm that the incumbent auditor is doing a good job and should stay. We therefore recommend that the Audit Commission phase out selective market testing.

20. However, we do recognise that there exists a minority of authorities which are unhappy with their auditor and wish for greater freedom over the decision to appoint. Ms Halton of West Devon Borough Council told us "I think we ought to be trusted to select our own external auditors: it can be from a list of accredited auditors who have been approved, but I feel we should be allowed to do that. And I think that will create the market, and the market will get more competitive because of that".[41] There may therefore be an argument for allowing those authorities which can demonstrate intractable differences with their auditor freedom to initiate a change of auditor inside the normal five-year appointment period. On a more positive note, such freedom could also be awarded to authorities achieving beacon status. We therefore recommend that the Audit Commission review cases where authorities can demonstrate 'special circumstances' with a view to allowing such authorities greater control over audit appointments.

21. Currently the Audit Commission appoints auditors for a period of five years. At that point the appointment is reviewed and in some cases the audit supplier may be changed. In addition, where a second term is awarded to the same audit provider, the senior personnel (ie district auditor and senior manager) must be 'rotated' every seven years. This seemed to us to be a sensible approach, balancing the need for continuity against the need to avoid the dangers of auditors becoming too familiar with an audited body. However, within this system we are worried about the turnover of staff among the audit firms. Several authorities told us that high turnover was a problem and that they had to spend large amounts of time each year re-training auditors not familiar with the authority's systems.[42] We see considerable advantages in audit staff remaining the same for the duration of the appointment. Providers should make efforts to ensure this happens.

Audit fees

22. The Audit Commission Act 1998 requires the Commission to set a scale of audit fees, which all audit providers must charge. The actual fee for an individual audit is agreed locally between the auditor and audited body and will be determined by the number of days required by the audit.[43] On average, an authority can expect to pay around £86,000.[44] We received evidence from a number of authorities claiming that fees were too high and annual increases excessive.[45] However, some of the private sector audit firms maintained that the fees were in fact too low, and told us that many of the firms remained in the market simply because it is not subject to economic cycles and therefore offers a predictable work-load.[46] Mr Bundred of the London Borough of Camden told us that "there is no evidence that private firms are making large profits from their current involvement in local government audit".[47] Cipfa argued that with respect to audit fees, "the Commission has got it right".[48]

23. On balance we do not see a strong case for fundamentally altering the basis of public audit fees or the way in which they are determined. That said, we are concerned about the problems faced by smaller authorities whose audit bills can represent a disproportionate burden as a percentage of their budgets. Budgets for local authorities in England range from £2.8 million for Teesdale District Council to £1,115.7 million for the City of Birmingham.[49] West Devon Borough Council, an authority with a budget of around £5.5 million, argued that larger authorities found it much easier to absorb the audit fees than did smaller councils.[50] It suggested that fees should be based on size of population and relate to the number of transactions an authority undertakes.[51] A similar case was put forward by Tandridge District Council which told us that "the total level of external and internal audit activity required seems out of proportion for a small authority which has not recently been nor currently is any significant cause for concern".[52] The Audit Commission should recognise the difficulties faced by small authorities in meeting audit costs. We recommend that the Commission determine a method by which smaller authorities, say those with populations under 50,000, are subject to lower audit fees.

Value for money studies

24. The Audit Commission carries out value for money (or VFM) studies, their purpose being to identify what works well in local government and disseminate best practice. Two types of studies are carried out. The Audit Commission itself undertakes national VFM studies, which compare services and themes in local government and the National Health Service. Examples of recent work include Early Retirement in Local Government and Value For Money in Emergency Ambulance Services.[53] Such studies, along with the annual publication of performance indicators (see paragraphs 29-33), often attract a fair degree of attention in the national press. In addition, 30 per cent of local auditors' time is spent on local VFM studies, where national findings are applied to audited bodies to review local performance.

25. There were mixed views among witnesses about the value of VFM studies. Some witnesses told us that they were well respected and useful,[54] others said they represented the least valuable part of the audit process.[55] The Society of Municipal Treasurers commented that "there is a feeling that National Studies are often more geared up to generating headlines (in the press) than they are to achieve real improvements in the delivery of services".[56] With respect to local studies, we were told that auditors have only limited flexibility in terms of which services they select for VFM audits each year. Both local authorities and auditors were critical of this arrangement, arguing strongly that the VFM audits undertaken locally are often not appropriate for the individual authority at that time.[57] The Audit Commission recognises this difficulty, stating that "respondents to the Commission's 1998 strategy consultation made a strong case for greater local flexibility in the application of national VFM work to increase its relevance to local agendas".[58]

26. The Local Government Act 1999, which confers on the Commission the responsibility for inspecting Best Value in local authorities, does not remove the duty to carry out VFM studies. However, there was scepticism among some witnesses about the continuing need for such assessments alongside the introduction of Best Value inspections. Several argued that they would now be superfluous and should be phased out.[59] There may also be practical difficulties in operating VFM studies alongside Best Value inspections. Hertfordshire County Council pointed out that "it is unlikely that the nationally imposed value-for-money studies programme will align with a council's five-year service review programme".[60]

27. A few witnesses argued that there remained a useful role for such studies and that they should continue, albeit in a different form.[61] Cipfa, for example, claimed that "there is a continuing role for VFM studies, particularly for cross-cutting and 'neglected' issues".[62] Similarly, the Audit Commission and DETR view was that while national Value for Money studies should be significantly reduced in scale, they should not be removed altogether.[63] The Commission saw particular value in continuing those national studies which dealt with issues which local authorities have not addressed adequately, such as early retirement.[64]

28. We cannot see that there is a strong case for retaining local VFM studies in the context of the Best Value regime, which is strongly based on principles such as economy, efficiency and effectiveness. Furthermore, we would be disappointed if local authorities' Best Value reviews and the Audit Commission inspections were not used to assess cross-cutting services, thereby significantly reducing the need for traditional VFM work. Given the high level of scrutiny that local authorities are now subject to, we recommend that local value for money studies be phased out. There remains a role for the Audit Commission to undertake some national VFM work, but only where it can be demonstrated that the issue at hand is not within the purview of Best Value.

Performance indicators

29. Local electorates must have a mechanism which allows them to compare the relative performance of local authorities and to judge whether they are receiving value for money. Since 1992, the Audit Commission has been responsible for this task, collecting and publishing indicators which allow comparisons to be drawn between the performance of audited bodies including local, police and fire authorities.[65] Each audited body is required to collect, record and publish, in a local newspaper, performance indicators on an annual basis.

30. After some initial resistance the Audit Commission's Performance Indicators are now accepted by local government.[66] The Local Government Association commented that "the Audit Commission has discharged a difficult task well ... it has won the respect of many initially sceptical local authorities".[67] However, there is still work to be done to improve the indicators. This may be possible through the changes being made as part of the introduction of Best Value.[68]

31. The key issue raised by witnesses was the importance of developing Performance Indicators which measure outcomes rather than simply outputs.[69] Traditionally, performance indicators have measured what is easy to measure, which has tended to be quantitative outputs (for example, the percentage of planning applications processed in eight weeks), rather than the quality or outcomes of a service (for example, whether decisions taken on planning applications were of a high quality, took into consideration good urban design principles and so on). This was the main point made by the London Borough of Newham which told us:

    "what matters is how things change for the better for people in their every day lives ... we run the risk of diverting energy and resources if we do not take an early opportunity to get rid of pointless measures that tell us little or nothing about what is happening in the real daily lives of people".[70]

Unfortunately, the Audit Commission's performance indicators have often been unable to capture outcomes adequately. Steve Martin of Warwick Business School stated that "existing performance indicators largely measure throughputs and outputs (economy and efficiency) as opposed to outcomes and impacts (effectiveness)".[71] One of the reasons for this may be that services are often delivered by a wide range of agencies.[72]

32. The Audit Commission agreed that "outcomes are the most desirable thing to be measuring"[73] and argued strongly that significant progress had been made in that regard. To illustrate this, Ms Thomson cited examples of outcome-based performance indicators such as 'older people over 65 helped to live at home' and the 'proportion of pupils at 11 plus achieving three or more A-C GCSE results'.[74]

33. We accept that measuring outcomes is not a straightforward task and that there are difficulties both in developing sensible measures of outcomes and in the collection and analysis of data. However, while we were pleased to hear that the Audit Commission has certainly made some progress, we consider that, given the time and resources which it has committed to developing Performance Indicators, it is reasonable to expect that more sophisticated measures should now be in place. We recommend that the Audit Commission and DETR continue to review their respective sets of performance indicators to ensure that they provide meaningful, outcome-based measures of local authorities' performance.

Joint working

34. Increasingly, local services are being organised on a joint or 'cross-cutting' basis, with several departments or agencies being involved (for example, care services for elderly people, which involve social services departments and community and mental health trusts). Such working is an integral part of the Best Value regime. Accordingly, there is a need for 'joined up' audit and inspection.

35. In its role as auditor of local authorities and health bodies, the Audit Commission claimed that "it has a strong track record of undertaking studies that cut across boundaries" and that effective arrangements were in place to ensure joint working with the National Audit Office, Her Majesty's Inspectorate of Constabulary, the Social Services Inspectorate and Ofsted.[75] We received evidence from the National Audit Office,[76] the Social Services Inspectorate[77] and the Benefit Fraud Inspectorate[78] that these arrangements work well, and that there is a high degree of coordination between the bodies to avoid overlap and duplication.

36. However, some witnesses had concerns about the Audit Commission's ability to audit 'cross-cutting' services. The Local Government Association told us that "judgments by the [Audit Commission's Best Value] Inspectorate on the corporate health and capacity of an authority can only be assembled from a series of partial reviews carried out over an extended period, and conducted by several different inspectorates ... they each work within very different traditions, with different cultures and methodologies".[79] We consider solutions to this problem at paragraphs 40-48 below.

37. We welcome the Audit Commission's and the other Inspectorates' commitment to joint working but we consider that it does not go far enough. Indeed, we were surprised to learn that joint inspections and joint working tend to be the exception. Given the ways in which local government is now being asked to deliver services, we are of the view that there needs to be more joint inspection, and that this approach should become the norm (this is discussed further at paragraph 48). In addition, while we agree that joint working with some agencies, for example the Social Services Inspectorate, has been relatively frequent, we are concerned that there has in practice been little joint working between the Audit Commission and the National Audit Office over the last two decades. Joint studies have been rare, and we consider this to be a lost opportunity.[80] We therefore recommend that the Public Audit Forum and the Government's review of central government audit arrangements seek out opportunities for increased joint work between the Audit Commission and the National Audit Office.


13   Q86, AC10, AC13, AC17, AC18, AC20, AC25, AC27 Back

14   AC11, paras 14&15 Back

15   AC15, para 20 Back

16   AC30, para 9  Back

17   Q148, Q320 Back

18   AC10, para 8.1 Back

19   QQ305, 307, AC23 Back

20   Q148 Back

21   QQ309, 310 Back

22   Q436 Back

23   The mixed market brings a number of benefits, mainly relating to fees and the quality of audit work. First, fee increases are constrained by the presence of District Audit upon whose costs fees are based. The existence of an in-house contractor able to deliver high quality work at the specified price discourages the private sector from trying to seek higher rates. Second, the mixed market helps to maintain professional standards and facilitates the spread of good audit practice (partners/directors from all audit providers participate in 'peer reviews' as part of the audit quality process). This in turn encourages competition as providers wish to be seen to be delivering a good service relative to their competitors.  Back

24   AC10, para 8.2 Back

25   AC12, para 39 Back

26   AC11, para 14 Back

27   AC10, para 4.4 Back

28   AC12, para 12; see also Q436 Back

29   QQ305, 306, 325 Back

30   Q130; see also AC10 Back

31   Q437; see also Q504 Back

32   Q439 Back

33   Q291 Back

34   AC14 Back

35   Q162 Back

36   Mr. P.J. Butler reviewed the Commission's overall performance in 1995. His report found that the market testing round which took place in November 1994 cost the Audit Commission £50,000 and firms £650,000 (an average of £110,000 per audit; six audits were market tested that season) Back

37   AC23 Back

38   AC23 Back

39   AC25, para 26, and see Q287 Back

40   AC30, para 5 Back

41   Q211; this was also the view of Hammersmith & Fulham (Q204) Back

42   AC07, AC14, AC24, Q110 Back

43   AC12, para 10 Back

44   This is the average forecast audit fee for all English and Welsh local authorities (excluding parish councils) for 1999-2000. The average fee for health bodies in 1998-99 was £68,000 Back

45   AC06, para 13, AC07, AC08, AC24, Q366 Back

46   AC23, AC25, AC30, para 12 Back

47   AC27, para 7.6 Back

48   Q149 Back

49   Cipfa Finance & General Statistics 1999-2000 Back

50   Q224 Back

51   QQ224, 226 Back

52   AC15 Back

53   See AC11, Box 4 at para 18 Back

54   AC17, AC18, AC25, paras 7, 9 Back

55   Q196, Q243, AC16, AC20  Back

56   AC24; see also Q95 Back

57   AC23, AC25, Q110 Back

58   AC11, para 58 Back

59   AC14, AC16, Q342 Back

60   AC03; see also AC27, para 6.6 Back

61   AC24, AC25 Back

62   AC10 Back

63   Q430, AC15, para 27 Back

64   QQ432, 433 Back

65   AC15, para 31 Back

66   AC14, para 7  Back

67   AC17 Back

68   As part of the introduction of Best Value, changes have been made to the Performance Indicators and the way in which they are used. There now exist three groups: the Government's 'Best Value Performance Indicators', of which there are 170; the Audit Commission's performance indicators, of which there are 54; and local indicators which authorities will need to develop to reflect their particular circumstances.  Back

69   See AC22, Q70 Back

70   AC09, paras 4.2-4.3; see also para 3.2 Back

71   AC13 Back

72   AC09 Back

73   Q453 Back

74   Q453 Back

75   AC11, paras 25 & 26 Back

76   AC19, para 3 Back

77   AC31, para 15 Back

78   AC32, para 3 Back

79   AC17 Back

80   Q24 Back


 

© Parliamentary copyright 2000
Prepared 22 June 2000