Select Committee on Public Administration Fourth Special Report


Appendix 2


Letter to Dr Tony Wright MP, Chairman of the Public Administration Select Committee, from Sir David Varney, Chairman of HM Revenue & Customs, dated 20 March 2006

You wrote to me on 8 December raising some additional queries following the PASC hearing into the Ombudsman's Report into Tax Credits on 20 October 2005. Please accept my apologies for the delay in replying.

For clarity, I have listed the questions in order, followed by my response.

At Q173 you were asked "Can you tell us the number of overpayments resulting from Revenue errors?" You referred the Committee to the Standard Report on the Accounts. The Committee would like to know whether there is now any further information about the proportion of errors caused by (a) Revenue staff error, (b) software errors and (c) customer errors and omissions, and their respective financial quanta? If not, what steps are being taken to gather and report such information?

It is difficult to ascribe every error to the particular categories mentioned. In some cases—for example where there has been a particular computer problem—we can assess the size of the error and ascribe it to a particular cause. The following details provide an overview:

(a)  The accuracy of the Department's processing of tax credit awards was 96.5% in 2004-05 (78.6% in 2003-04) against a Public Service Agreement target of 90%.

(b)  The Comptroller & Auditor General's reports on the Inland Revenue accounts for 2003-04 and 2004-05 provide details of software errors. These gave rise to write-offs made centrally of £37m in 2003-04 and £1.85m in 2004-05. In addition, where software errors which resulted in overpayments were corrected, some further write-offs will be authorised in instances where it was not reasonable for claimants to have detected the error. More recently, the Department was able to quantify accurately the impact of three other software errors that had led to overpayments mainly relating to 2003-04. As a result write-off of a further £44.85m has been authorised. This write-off will be included in HMRC's 2005-06 accounts. The Department has ongoing processes in place to detect and, where necessary, quantify the financial impact of software errors.

(c)  To produce a reliable estimate of the level of claimant error and fraud the Department is examining approximately 4,700 randomly-selected 2003-04 awards to determine whether any non-compliance has occurred and whether this non-compliance was deliberate or resulted from simple negligence. This approach is in line with that used by DWP for benefits and by the IRS in the US for their Earned Income Tax Credit (EITC). Progress on this exercise will be reported when final results from it are available.

You say in your statement that the £71.25 million settlement is "commensurate" with EDS's responsibility for the IT problems surrounding the launch of the Tax Credit System. But the report to which you referred the Committee suggests that software errors resulted in overpayments of £94m in 2003-04 and £7.9m in 2004-05, in addition to "various other incorrect payments" and, presumably, to the cost of a far more extensive customer support operation than originally planned. Some press reports state that EDS was paid £168m for the project. Is this correct? If so, is it reasonable to assume that software failure and inadequacy accounted for nearly half the difficulties in the tax credit system?

After many months of detailed legal and forensic accounting analysis, we were able to identify a loss legally attributable to EDS failures of £104 million after taking account of overpayments being recovered from claimants and other acts of mitigation.

Our assessment of the maximum amount we could recover from EDS had to take into account the "limitation of liability cap" under the contract which limited damages to a maximum of £31 million "per event of default". The number of events of default would have been the subject of considerable argument in any legal proceedings, with EDS proposing a single event and HMRC arguing for three or more. Even if HMRC had succeeded with its arguments, it was clear that not every event of default would have involved the maximum amount of £31 million.

In considering the settlement offer, HMRC looked for precedents for settlements in software-related cases in the public sector and elsewhere. We could not find any situation in which an IT company had made a compensation payment to a UK government customer of a similar magnitude to that eventually obtained by HMRC from EDS.

Against this background, and given that the settlement avoided at least two years' litigation, which would have been very costly both in money and in senior management time, we believe the settlement of £71.25 million represents a very satisfactory outcome.

How far were the initial problems over the launch of the Tax Credit scheme due to bad systems design? Did this make the problems over the IT system inevitable?

The initial problems with the IT system to support Tax Credits arose from poor system architecture, poor software coding and the use of EDS staff who had not been fully trained.

We have stabilised system availability and performance and we continue to enhance its functionality in periodic software releases. We also continue to look for ways of improving the design of the system through our new IT suppliers, Capgemini.

Did the Revenue work with EDS to design the system, or simply define its requirements at the outset?

The Revenue did not work with EDS to design the system; its role was to define requirements.

What consideration was given to the users and client base of the New Tax Credit system in drawing up the systems design and procurement specification?

The system design was informed by the initial policy consultation. Pilots were also conducted to understand how customers would cope with forms, processes and portal services, and these too informed the design. Detailed walk-through exercises to test the customer experience, and revise it accordingly, were held at all major stages.

The needs of our system users were established through model office testing, as well as business testing of IT with business processes.

Were there extensive changes in the specifications of this system during the development process?

As is normal with a project of this sort, there were changes to the system requirements as the development process progressed. EDS were given the opportunity to object to any requested change on the basis that it was too late or not deliverable.

How long did it take for the IT system to become stable? Was EDS paid to supply the software fixes the revenue found necessary? How many fixes were needed as a result of initial IT faults, and how many changes in Revenue operational requirements?

The IT system was stable and performing to availability and other service levels by autumn 2003. Large numbers of fixes and process workarounds were needed because of IT problems. These were fixed by EDS with no additional payment.

When you gave evidence to us, you told us you would not be able to produce an automatic pause between the recalculation of tax credits and the recovery of overpayments until 2007. Is it usual to have an IT system which is so difficult to adjust?

The Department has agreed to look at the proposal for a pause; however, having established the system's integrity and significantly improved its performance, the priority is to ensure that progress takes place in a measured and orderly way. We continue to look for ways of improving the flexibility of the system alongside our new IT suppliers, Capgemini.

What sorts of governance arrangements were put in place to oversee delivery of the new scheme and the associated IT system? Did such arrangements include external or independent assessors?

A full programme structure was put in place in line with best practice and OGC guidance. This included board-level involvement and independent reviews undertaken by OGC through their Gateway processes. In addition, the IR commissioned an external review by Deloitte in the early months of 2003-04, when the system problems emerged; this resulted in an agreed action plan.

How important is it to have an "intelligent customer" capacity when designing and producing new business critical systems? How is this intelligent customer function being discharged at present and how was it discharged at the time of the EDS contract?

It is very important. It is a key plank in the reforms our new CIO is making: engaging experts to ensure we can articulate our needs effectively to our new IT supplier, Capgemini, and validate the IT changes they propose against the latest technology industry standards. We see ourselves as owning the IT strategy for HMRC, not being dependent on the supplier for it.

This differs from the arrangements in place in 2003, when the former IR relied heavily on EDS for this role.

Why was not enough time given to testing the new IT system? What measures have been put in place with your current partners to avoid a repetition of these problems?

Testing was undertaken according to plans put forward by EDS.

With Capgemini, we have a new testing strategy that adheres to latest industry standards. Recent software releases have been implemented successfully.




© Parliamentary copyright 2006
Prepared 3 May 2006