Good Governance - Effective Use of IT

Further written evidence submitted by Socitm (IT 60)

At the Public Administration Select Committee hearing on 22nd March 2011, I made the point that central government IT costs and performance were, in general, worse than those achieved in local government. This note provides further evidence to support that point. It focuses on comparisons between the practice and performance of ICT implementation in central and local government and includes:

· Introduction by Jos Creese, President of Socitm.

· ICT spend as a % of total revenue expenditure.

· Costs of desktop computers.

· Performance of websites.

· The lesson of business change in local government.

· Customer access, efficiency and channel shift.

Introduction

Introduction by Jos Creese, President of Socitm, Chair of the Local CIO Council and Head of IT at Hampshire County Council.

"Having worked in both local and central government there are a number of clear differences in approach that I have observed.

Firstly, local government has been more effective in retaining IT capacity and skills. IT professional turnover has been lower, and whilst outsourcing is common, it has not been pursued as dogmatically as has been the case in central government in the past. Building and retaining in-house skills helps with complex change management. The rate of avoidable turnover of senior responsible officers on key central government IT programmes is notable, and even encouraged. In local government, ‘seeing things through’ has a higher value.

Secondly, local government has had much less money to spend. This has forced a level of creativity and innovation, as well as tight control of programmes which tend to be smaller or more modular, again as a result of more stringent financial controls. Despite much lower salaries for equivalent roles, much lower spend on similar projects and significantly lower unit cost of commodity IT, the success rate does seem considerably higher – albeit that activity is at a smaller scale.

Most notably, and perhaps surprisingly, there is less disconnection between IT and policy-setting in local government. Perhaps this is because local government is less hierarchical, with politicians and top management more likely to involve IT professionals because they are more visible. But I suspect also it is because there is less fear of IT (not so many project failures which damage reputations?) and IT has been allowed to develop as an internal service rather than relegated to the role of mere utility.

Perhaps as a result of this, there are also far fewer ‘IT projects’. IT is more likely to be embedded in service improvement projects or change projects. This has other problems, but IT is less likely to be left ‘high and dry’ without proper business backing."

ICT spend as a % of total revenue expenditure

Sources:

Socitm Benchmarking Service

Gartner (2011) IT Metrics Data.

It is widely accepted that 3% is a benchmark of good practice in the private sector service industries for ICT spend as a percentage of total revenue expenditure.

Socitm Benchmarking in recent years has demonstrated that local government organisations spend consistently less than 3% on average on ICT as a percentage of their total revenue expenditure. The chart below shows the results from Socitm benchmarking studies across UK local authorities from 2004 to 2009 (up to 100 councils each year). The average of averages from these studies is 2.33%.

Please note that Socitm excludes ICT expenditure relating to curriculum expenditure in schools, as many local authority ICT services do not have responsibility for this activity.

A recent special Socitm benchmarking study of nine Scottish local authorities in 2010, which did include all ICT expenditure for schools, identified the following:

· Median is 2.10%

· Lower quartile is 1.82%

· Upper quartile is 3.02%

Our understanding, from talking to colleagues on the CIO Council, is that the average for the percentage of total revenue expenditure spent on ICT in central government departments is at least 5%.

These figures can be compared with the Gartner indicative global average of 3.2% for the percentage of total operating expense attributed to IT across state and local government in 2010. (We asked Gartner for data comparing state and local government separately, but they were unable to provide it.)

Benchmarking is an essential first step to service improvement, and for an ICT service cost reductions should be seen in the context of user satisfaction. Socitm’s benchmarking includes a popular user satisfaction service, which demonstrates that, despite cost reductions, ICT services have improved year on year over the past five years. Used by up to 75 local authorities and others each year, the service processes up to 50,000 user responses annually to a standard survey. No other local authority service has such a wealth of management information about its performance, and no other sector, public or private, has such a comprehensive database about ICT performance as viewed by those who use the service.

The very fact that it is so difficult to obtain comparative data about ICT costs and performance for central government speaks volumes. We believe that Parliament should have IT benchmarking data for central government departments and Non-Departmental Public Bodies, which is comparable in spread and depth to the Socitm Benchmarking data for local government.

Costs of desktop computers

Sources:

The Network for the Post Bureaucratic Age (2010) "Better for Less: How to make Government IT deliver savings".

Philip Virgo, Secretary General, The Information Society Alliance (March 2011) Eurim: Procurement in the Big Information Society, Government Computing.

Socitm (2011) IT Trends 2010-11.

The Better for Less report states (p.6): "Failing to make basic IT services a commodity has cost the British taxpayer dear. It has also reduced the effectiveness of government. Changing to commodity services - such as a user’s desktop software - can reduce the huge annual spending on IT by billions of pounds.

· The cost of running a desktop computer in a typical local government body is £345 per annum.

· The current cost of running a desktop in central government is £800 to £1600 per annum.

· There are approximately 4 million desktop computers in local and central government.

· The difference in cost cannot be explained by additional security requirements in central government.

The opportunity for savings is immense. Just in "desktop" the figure of £2bn per year is a reasonable figure to aim at."
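The £2bn figure quoted above can be sanity-checked with simple arithmetic. The sketch below is purely illustrative: the central/local split of the approximately 4 million desktops is an assumption made here for the calculation, as the report does not give an exact breakdown.

```python
# Illustrative check of the "Better for Less" desktop savings estimate.
LOCAL_COST = 345            # £ per desktop per year in a typical local government body
CENTRAL_COST = (800, 1600)  # £ per desktop per year range in central government
TOTAL_DESKTOPS = 4_000_000  # approx. desktops across local and central government
CENTRAL_SHARE = 0.5         # ASSUMPTION: share of desktops sitting in central government

central_desktops = int(TOTAL_DESKTOPS * CENTRAL_SHARE)

# Annual saving if central government desktops were run at local government cost
low = central_desktops * (CENTRAL_COST[0] - LOCAL_COST)
high = central_desktops * (CENTRAL_COST[1] - LOCAL_COST)

print(f"Potential annual saving: £{low / 1e9:.1f}bn to £{high / 1e9:.1f}bn")
# → Potential annual saving: £0.9bn to £2.5bn
```

Under this assumed split, the range brackets the report's £2bn target, which is why the figure is described as "reasonable to aim at" rather than precise.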

This evidence is reinforced by Philip Virgo, who draws on data from Socitm’s IT Trends 2010-11 report which "shows that local government now pays much less than the main private sector benchmarks (let alone central government catalogues) for commodity ICT products such as desktop PCs." He goes on to cite a number of reasons for this in the article, a copy of which is attached.

Evidence from Socitm suggests that one important factor explaining the difference is the impact of outsourcing. In our benchmarking surveys we have regularly found a cost difference of up to 20% between cheaper in-house provision and outsourced provision on a number of indicators, such as the cost of supporting a desktop. It would be very surprising if the much greater scale of outsourcing in central government were not a significant factor in explaining the differences reported here.

Performance of websites

Sources:

Socitm Insight and GovMetric (August 2010) Use of the web – local government compared with central government Customer Access Improvement Service Supplement.

One area that Socitm has researched extensively is the performance of websites. Since 2004, Socitm Insight has collected extensive information about the performance of websites as experienced by those who visit them. Only in 2008 did central government develop standards for website performance, which were very much influenced by our experience.

Central government departments were mandated by the Central Office of Information to collect this data by March 2010 against newly defined standards. In August 2010 we were able, for the first time, to compare local government performance against central government performance. The most important indicator is visit success. Visit failure is costly for the organisation (as well as frustrating for the visitor) because visitors who cannot complete the task they set out to do are forced onto much costlier channels: the phone (approximately ten times the cost) or face to face. We noted:

· Only 16 out of 47 central government departments were able to report visit success.

· The average visit success reported was 28.32% compared with an average of 53.74% for local authorities (123 councils participating).

· One example was Directgov which reported 32.00%, significantly lower than the local government average. Directgov is one of the Government’s flagship sites, reportedly costing over £35m to support.

Feedback about web visits is essential in order to improve websites and reduce avoidable contact. It is ironic that, in 2007 and 2008, the policy of the previous Government was to focus on avoidable contact by phone, completely ignoring the web, despite our lobbying for a different approach.

The lesson of business change in local government

The following material comes from Glyn Evans, Vice-president of Socitm and Corporate Director of Business Change, Birmingham City Council, and expands on the CHAMPS2 business change methodology (www.champs2.info) to which I referred during my evidence.

"I very much enjoyed attending the PASC hearing at which you gave evidence.


However, I did wonder whether the Committee was necessarily addressing the right question. It sometimes seems that every other year we have some inquiry into why IT projects fail, whereas perhaps a more meaningful question is why do we try and run business change projects as if they were IT projects? I think we will always have more or less 'pure' IT projects - a system upgrade or the implementation of a network. But these are not generally the projects that fail; rather it is those that are focused on implementing a particular policy initiative or reforming the way a specific part of the public service works.

What we have found at Birmingham - and why we developed CHAMPS2 - is that realising fundamental business change means taking a holistic approach. So we try and get every element right - redesigning organisational structures, redefining job roles, establishing new processes, trying to address embedded cultural issues, keeping all levels of the organisation engaged in the process, ensuring that the leaders do lead the change. And, yes, the IT has to be right as well, but in fact it's often the relatively easy bit. As an aside, though our change programme is IT-enabled, I've always underplayed this (we branded it as a business transformation programme, and the individual workstreams were given non-IT names), as the best way to make sure business leaders don't engage is to associate it with IT.

Whilst we try and get every element right, of course we don't always succeed - we live in the real world. The strength of our approach is that we didn't need to: we got enough of it right to succeed overall, and we are realising in excess of 95% of the benefits we predicted.

The danger of focusing, in effect, on the IT bit of change is that you don't do well any of the other elements. In such circumstances it's amazing that any IT projects actually succeed. And it has a further pernicious effect in that it drives you to look for the 'magic bullet' which will make everything right. It's been a recurring theme of my entire career, and I suspect Agile might just be the latest manifestation."

Failure to take a business change approach is a common theme running through all the Public Accounts Committee's reports on failed IT projects, dating back to the London Ambulance Service Computer Aided Dispatch system (whose failure actually cost lives) in 1992. We believe that a business-led approach to change should be adopted by the new Major Projects Authority, whose creation is a positive indication that the present government intends to do better than its predecessors.

Customer access, efficiency and channel shift

Source:

Socitm Insight (2011) Better Served: Customer access, efficiency and channel shift.

During the Committee’s session on 22nd March 2011, questions were asked about the use of digital access to public services. For local government, there exists a strong evidence base for managing the transition of service users towards cheaper, digital channels, where this is appropriate. This evidence base has been assembled by Socitm Insight from a number of sources, including its Customer Access Improvement Service.

The Better Served report details this evidence, including a number of case studies that illustrate the scope for efficiency gains and service improvement from centralising customer management and making this a corporately managed programme.

Examples include Birmingham City Council’s business case for its Customer First Programme, which anticipates £197.4m of cashable benefits over ten years, while Tameside, a much smaller metropolitan borough, is looking at savings of £1m over the next four years, just from better management of the front office. Since 2007, Surrey County Council has reduced the cost of phone and web contacts from 79p to 49p per enquiry, saving £175,000 in its contact centre and an additional £150,000 elsewhere by reducing avoidable contact. The case studies show how, with centralised management and full data about types and volumes of enquiries available for analysis, savings can be realised by reducing avoidable contacts and encouraging channel shift, principally to the web.

The report draws ten key conclusions about good practice in managing customer access and channel shift:

1. Councils can make significant cost savings through better management of customer access.

2. Cost reduction comes from reducing volumes of phone and face-to-face enquiries.

3. There are three main ways of reducing call and face-to-face volumes without reducing customer satisfaction.

4. Customer channels must be managed together to reduce volumes.

5. Full data from all channels is needed to manage customers efficiently: it is unlikely to be available where channels are managed separately.

6. Collecting customer data for analysis to identify improvement is difficult, but not impossible.

7. Benchmarking highlights variations in management of customer access and opportunities for improvement.

8. Data analysis will reveal opportunities for front- or back-office collaboration in cost-saving process improvement.

9. Maximising customer access efficiency requires an excellent website integrated with all other customer channels.

10. Customers need to be made aware of services on the web and be encouraged to use them.

We know of no equivalent evidence base or analysis for central government that can help to steer implementation of the Government’s policy of ‘digital by default’.

Conclusions

All the points in this note indicate that central government has not benefitted from the kind of network that Socitm has built up, and indeed has very much ignored what has been happening in local government.

Socitm has played an important part in evangelising the role of IT in modern public services, providing research, insight and benchmarking services specific to the public sector as well as a professional network to aid practitioners. Central government could benefit from this approach, but has always shied away from closer integration with colleagues in local government.

April 2011