

FORTY-FIRST REPORT


The Committee of Public Accounts has agreed to the following Report:

MINISTRY OF DEFENCE: MAJOR PROJECTS REPORT 2001

INTRODUCTION AND LIST OF CONCLUSIONS AND RECOMMENDATIONS

1.   Each year since 1984 the Ministry of Defence (the Department) has reported to Parliament on progress in procuring major defence equipments. The Comptroller and Auditor General's (C&AG's) Major Projects Report 2001 covers the period to 31 March 2001. It details progress on the top 30 equipment procurement programmes split, in accordance with Smart Acquisition principles, between the 20 largest projects on which the Main Gate decision has been taken and the 10 largest projects in assessment and yet to reach that point (see Figure 1).[1]

Figure 1: The Smart Acquisition Cycle showing the composition of the Major Projects Report




2. For the post-Main Gate projects, the Report includes data on the cost and time performance of the projects against the Department's estimates and data on the technical performance of projects against the Customer's Key User Requirements. For the pre-Main Gate projects, the Report examines how the Department measures whether the crucial Assessment Phase is achieving its objective of assessing possible options for meeting the military requirement, selecting a technological solution and reducing risks to an acceptable level for Main Gate. The Report also examines how the Department is assessing the impacts of Smart Acquisition on cost, time and technical performance across the Department.[2]

3. The Committee took evidence on the findings of the C&AG's Report from the Chief of Defence Procurement and the Deputy Chief of Defence Staff (Equipment Capability). Our Report examines three key issues building on early progress with Smart Acquisition:

  • whether the Department is doing enough to monitor the outcome of the Assessment Phase to ensure that it is successful;

  • how the Department can secure more timely delivery of equipment, as delays are still a problem; and

  • whether the Department is clearly and coherently measuring the impact of Smart Acquisition.

4. Figure 2 shows that the Committee's recommendations for securing improved performance are already having a significant effect. But there is still a fair way to go, especially on delays. Our main conclusions are as follows:

  • The Department does not yet objectively measure the outcome of assessment, although it is seeking to introduce robust measures. Without comprehensive information on assessment outcomes, the Department's decisions are less well informed and more open to risk. The Department should move quickly to introduce robust, quantified measures and targets to assess the successful outcome of assessment work, using the spread of three-point estimates and technology readiness levels as indicators.

  • There is evidence (see Figure 2 below) that the Department is continuing to improve project cost control and meet the majority of its Customer's technical requirements, but delays, although slowing, are still occurring on some projects. There may be a trade-off between technical perfection and delivery on time: the Department should give top priority to more timely delivery of equipment and the Committee has made a number of positive recommendations to that end.

  • The Department has various performance measures which give a confused and incomplete picture of the impact of Smart Acquisition. The Department needs to ensure that through its performance measures it appropriately incentivises and clearly demonstrates continuous improvements across the Department and throughout project lifecycles.



Figure 2: Trends in project performance for the 18 post-Main Gate projects common to the 2000 and 2001 Major Projects Reports

Costs (control continuing to improve):
  2000 performance: -0.25% decrease in total cost overrun against approval
  2001 performance: -0.25% decrease in total cost overrun against approval

Technical performance (the majority of customers' key requirements continue to be met):
  2000 performance: 99% of key requirements forecast to be met
  2001 performance: 96% of key requirements forecast to be met

Delays (still occurring but the rate of project slippage is beginning to slow):
  2000 performance: +3.5 months increase in average project delay
  2001 performance: +1.5 months increase in average project delay

Source: C&AG's Major Projects Report 2001 (HC 330, Session 2001-02)


5. Our detailed conclusions and recommendations are as follows:

Measurement of the outcome of the Assessment Phase

  (i)  Three-point estimates (see Figure 3 below) are fundamental to the Department's understanding of how risk is reduced in the Assessment Phase. The spread of a three-point estimate is one indicator by which the outcome of assessment can be measured objectively. The Department should introduce such a measure rapidly, codifying how it expects estimates to narrow during the Assessment Phase.

  (ii)  The Department is introducing Technology Readiness Levels which provide a way of objectively assessing the level of technical risk remaining on a project before committing to major funding. There is a risk that if Technology Readiness Levels are not used properly they could unduly discourage innovation. It will therefore be important for the Department to review its experience with using them to ascertain whether on balance they prove to be beneficial, and to report its findings in a future Major Projects Report.

  (iii)  Projects in the Assessment Phase need to be managed against realistic time and cost parameters which enable risk to be successfully reduced. Where cost or time parameters for the Assessment Phase are exceeded the Department should explain why in the Major Projects Report, including any problems encountered or benefits expected as a result. Taking longer or spending more than planned over the Assessment Phase is not necessarily a bad thing: a better understanding of risk could lead to benefits over the life of the equipment.

Timely delivery of equipment

  (iv)  Recent experience in Afghanistan has illustrated the changing nature of modern war, accompanied by rapidly evolving technology. The Department should explore the scope for the UK to adopt a more flexible approach to contracting for development along the lines adopted in the United States, where the uncertainty inherent in harnessing high technology to meet capability needs is recognised in pricing and framing contracts.

  (v)  The problems on the Advanced Short Range Air-to-Air Missile (ASRAAM) and delays to the Swiftsure and Trafalgar Class Submarine Update stem in part from industry and the Department underestimating the technical task and failing to properly manage the associated risks. The Department should work more closely with industry as intended under Smart Acquisition, through mechanisms such as Capability Working Groups and Integrated Project Teams, to ensure that what is required is technically achievable, that risks are borne by those best able to manage them and that progress is closely and objectively monitored.

  (vi)  The Department told us that it had managed to recover two months of slippage on the Landing Platform Dock (Replacement) project, which is the first time that we have heard of slippage being recovered in this way. The Department should seek to draw lessons from this achievement and apply them to recover slippage on other programmes.

  (vii)  The problems on the ASRAAM programme mean that the Department is initially accepting less than the required capability from the contractor because it recognises that early delivery will provide operational benefits. The Department should ensure that trade-offs between timeliness and operational capability are identified and resolved in a rational manner, for example by planning incremental acquisition from the outset where appropriate.

  (viii)  The Department claimed that for weapon systems such as ASRAAM it could not be certain of the equipment's performance until it had been tested in the real world, which brings into question the robustness of the Department's forecasts of performance against Key User Requirements for projects which are early in the development cycle. The Department should indicate the level of confidence which it has in its assessments of equipment performance against Key User Requirements and include these in the Major Projects Report.

Clearer and more coherent measures

  (ix)  Smart Acquisition is about finding 'cheaper, faster and better' ways of doing things. The Department needs to be able to demonstrate that it is continuously improving its performance in line with this aspiration. Some four years into Smart Acquisition, it is still not clear how the Department's measures fit together to demonstrate performance improvements throughout the acquisition cycle, and the Department is itself concerned that they may create perverse incentives. Clearer and more coherent measures are needed to appropriately incentivise continuous improvements throughout the acquisition cycle, and provide a better assessment of the impacts of Smart Acquisition on cost, time and performance.

  (x)  The Major Projects Report for 2001 shows encouraging signs that the Department's performance is improving under Smart Acquisition with 55 per cent of projects performing the same or better in cost, time and performance terms than a year ago. But the Department still has a lot to do to achieve its ambitious Strategic Goal of delivering 90 per cent of all projects worth more than £20 million within approval by 2005. Progress towards this Goal should be reflected in continuing improvements in the reported performance of projects, especially as newer 'Smart' projects come into the Major Projects Report.

MEASUREMENT OF THE OUTCOME OF THE ASSESSMENT PHASE

6. The objective of the Assessment Phase of a project is to assess the possible options for meeting the military requirement, select a technological solution and reduce risks to an acceptable level for Main Gate (the main investment decision point). Much less is spent during the Assessment Phase than once a project has gone through Main Gate but what is spent on assessment, and how it is spent, is crucial to successful delivery of the project.[3] This part of our Report examines how the Department measures the outcome of assessment.

Three-point estimates should be better used to quantify risk and risk reduction

7. The Department does not yet have an objective measure of whether the Assessment Phase has accomplished its objectives. Three-point estimates are a fundamental tool in assessing the uncertainty remaining in a project. These are estimates of the demonstration and manufacture costs and in-service date for each project at three different confidence levels based on the probability of risks arising (see Figure 3 below). The Department told us that at the start of the Assessment Phase there should be quite a spread in three-point estimates. By Main Gate the spread should be much narrower,[4] perhaps no more than 10 per cent for cost estimates, although the Department had not firmed up on an exact range yet.[5]



Figure 3: Definition of three-point estimates

Lowest cost/earliest time (at 10 per cent confidence): Costs and in-service date assuming that risks do not materialise and everything goes well.

Most likely cost and time (at 50 per cent confidence): Costs and in-service date assuming the average position where some risks do materialise and some do not.

Maximum cost/latest time (at 90 per cent confidence): Costs and in-service date assuming that risks do materialise and things do not go well.
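To illustrate how the spread of a three-point estimate could be used as an objective indicator of risk reduction, the Python sketch below computes the gap between the 10 and 90 per cent confidence points as a percentage of the most likely estimate and compares it with a 10 per cent threshold at Main Gate. The project figures and the threshold are illustrative assumptions only; as noted in paragraph 7, the Department has not yet settled on an exact range.

    # Minimal Python sketch (illustrative only): using the spread of a three-point
    # cost estimate as an indicator of risk reduction during the Assessment Phase.
    # The project figures and the 10 per cent Main Gate threshold are assumptions.

    def spread_percent(lowest: float, most_likely: float, maximum: float) -> float:
        """Spread between the 10% and 90% confidence points, as a percentage of the most likely cost."""
        return (maximum - lowest) / most_likely * 100.0

    ASSUMED_MAIN_GATE_THRESHOLD = 10.0  # per cent; the Department has not yet fixed an exact range

    estimates = {
        # stage: (lowest cost, most likely cost, maximum cost), in £ million (hypothetical)
        "Initial Gate": (400.0, 500.0, 700.0),
        "Main Gate": (480.0, 500.0, 525.0),
    }

    for stage, (low, likely, high) in estimates.items():
        spread = spread_percent(low, likely, high)
        verdict = "within" if spread <= ASSUMED_MAIN_GATE_THRESHOLD else "wider than"
        print(f"{stage}: spread of {spread:.0f}% of the most likely cost ({verdict} the assumed 10% target)")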

8. In our Report on the Major Projects Report 2000, we recommended that all projects in the 2001 Report should have robustly generated three-point estimates.[6] In the 2001 Report, however, only four of the 10 pre-Main Gate projects had full three-point cost estimates at the report date, 31 March 2001, and only four had full three-point time estimates.[7] The Department told us that not all the projects had three-point estimates at the report date because some were at a critical point, but that all the projects that were going to go ahead now had them. The Department said that all projects should have three-point estimates at Initial Gate, that there were no circumstances in which projects in assessment should not have estimates, and that the estimates should be reviewed every time the Department took a deliverable from the contractor. All projects in the Major Projects Report for 2002 would have soundly based, analytical three-point estimates for cost and time.[8]

Technology Readiness Levels should be used to assess technical risk more rigorously

9. Technical factors continue to be a major cause of delay and cost increase and were the biggest cause in the year to 31 March 2001, accounting for 19 months' delay[9] and £88 million of cost increases.[10] The Department acknowledged that it took on too much technical risk in some of its projects, starting them with technology at an insufficiently mature level. The Department recognised that it needed to assess more rigorously the technical risk remaining in a project at the main investment point, and was doing so in the same way as the United States Department of Defense, using technology readiness levels, based on objective evidence of whether technology was mature enough. Attainment of a technology readiness level between six and eight (see Figure 4) was a pre-condition for embarking on a major new technology for a project. Technology readiness levels had been introduced on some projects and would be introduced increasingly as new projects were approved.[11]

10. There is a risk that technology readiness levels could lead the Department to use only tried and tested technology at the expense of innovation. The Department said that it had a substantial technological competence at its disposal, and those experts had to be aware of what was technically in the pipeline and consider whether it was reasonable to pursue it. The Chief of Defence Procurement and the Deputy Chief of Defence Staff (Equipment Capability) were jointly responsible for ensuring that the Department spent enough money during the Assessment Phase to give a rational rather than just an enthusiast's opinion on new technology.[12]

Figure 4: Definition of Technology Readiness Levels

Level 1: Lowest level — typically paper studies of a technology's basic properties. Scientific research begins to be translated into applied research and development.

Level 2: Practical applications begin to be invented but are still limited to paper studies and not proven or supported by detailed analysis.

Level 3: Active research and development begins, including analytical studies and laboratory studies to validate predictions of separate elements of the technology.

Level 4: Basic technological components are integrated in a laboratory to establish that the pieces will work together.

Level 5: More sophisticated laboratory integration of components with reasonably realistic supporting elements so that the technology can be tested in a simulated environment.

Level 6: Major step up in a technology's demonstrated readiness — a representative model or prototype is tested in a simulated operational environment.

Level 7: Major step up from level 6 — demonstration of an actual system prototype in an operational environment, such as in an aircraft, vehicle or space.

Level 8: Technology 'flight qualified' through successful test and evaluation of the system in its final form in the intended weapon system under expected conditions.

Level 9: System 'flight proven' through successful application of the technology in its final form under operational mission conditions.
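As a purely illustrative sketch of how a readiness gate of the kind described in paragraph 9 might be applied, the following Python fragment checks whether every major new technology on a project has reached the threshold before Main Gate. The minimum level of 6 reflects the pre-condition described above; the function and the project data are assumptions for illustration, not the Department's actual process.

    # Illustrative Python sketch of a technology readiness gate at Main Gate.
    # The minimum level of 6 reflects the pre-condition described in paragraph 9;
    # the project data below are invented and do not describe any real project.

    MIN_LEVEL_AT_MAIN_GATE = 6  # prototype demonstrated in a simulated operational environment

    def immature_technologies(levels: dict, minimum: int = MIN_LEVEL_AT_MAIN_GATE) -> list:
        """Return the technologies whose readiness level falls below the gate threshold."""
        return [name for name, level in levels.items() if level < minimum]

    project_levels = {"seeker": 7, "datalink": 6, "propulsion": 4}  # hypothetical

    shortfalls = immature_technologies(project_levels)
    if shortfalls:
        print("Not ready for Main Gate; immature technologies:", ", ".join(shortfalls))
    else:
        print("All major new technologies meet the assumed readiness threshold.")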

Cost and time should be controlled but should not constrain justifiable investment

11. The Department acknowledged that it could not yet demonstrate through objective measurement that it was spending the right amount of time and money on projects in the Assessment Phase. The Assessment Phase should be conducted to approved cost and to time. Exceptions to this needed to be explained but should not be prohibited: if the Department discovered something during assessment, it should investigate it rather than pretend it had not happened just to avoid a cost increase or delay in the Assessment Phase.[13]

TIMELY DELIVERY OF EQUIPMENT

12. The Major Projects Report 2001 showed that the Department was expecting to meet the majority of its Customer's technical requirements, but not always within time and cost. However, there was evidence that the Department was continuing to improve cost control and beginning to bring delays under control, although slippage remained a problem on some projects.[14]

Options for more flexible contracting for development work should be explored

13. The Department saw fixed price contracting as some protection against cost overruns in the wake of Main Gate decisions. Even on very advanced work, it contracted with industry on a fixed price basis and very rarely moved away from that. Industry would probably not want to start work if they did not think that it was sensible to undertake the work on a fixed price basis.[15]

14. Procurements often spanning many years need to take account of changing technology and lessons from conflicts such as Afghanistan. The Department said that flexibility did not sit well with fixed price contracting, which relied on a tight specification of what the Department wanted the contractor to achieve. Fixed price contracting meant that it was less prone and less able to change requirements during contract execution. It was therefore questionable whether fixed price contracting for development was the proper vehicle for testing the boundaries of what was technically possible. The United States would argue that there was no sensible way to set a price for something when you did not know how you were going to achieve it. The Department had more work to do to find a more flexible framework within which to deliver value for money from development work.[16]

The Department, with industry, should ensure that risks are appropriately managed

15. The Department seeks to transfer to industry, through performance-based contracts, the cost risk of resolving technical difficulties and of the delays they cause, but it still bears the operational implications of delays.[17] Responsibility for managing risks to delivery of equipment therefore rests jointly with the Department and industry. The Swiftsure and Trafalgar Class Submarine Update and the Advanced Short Range Air-to-Air Missile (ASRAAM) are two projects which provide further evidence of cost increases and delays stemming, in part, from industry and the Department underestimating the technical task and failing to properly manage the associated risks.

16. In the last year, technical factors have caused 12 months' slippage and a cost increase of £18 million on the Swiftsure and Trafalgar Class Submarine Update programme. Software testing has shown that the transfer of data between different parts of the integrated sonar set (Sonar 2076) is a far more complex engineering feat than the contractor or the Department had appreciated, and that there was more work to be accomplished than they had understood.[18]

17. ASRAAM has been delayed by six months in the last year due to technical factors, compounding earlier difficulties.[19] The Department believed that it would soon successfully resolve the dispute with the missile manufacturer that had prevented ASRAAM from entering service in April 2001 as planned. The Minister for Defence Procurement subsequently wrote to us confirming that the Department had resolved the dispute and that the arrangements agreed for bringing ASRAAM into service would not incur any additional costs to the Department. Delivery of the first batch of interim standard missiles offering a significant improvement over the current Sidewinder missile would begin in January 2002 and delivery of missiles at a higher interim standard would begin in mid-2002. Following a continuous development programme, involving further software upgrades, the Department was hoping to achieve full operational capability by the end of 2003, but certainly no later than 2005.[20]



The Department should endeavour to recover slippage wherever possible

18. Asked why there had again been no recovery of slippage on any of the projects in the Major Projects Report 2001, the Department told us that it now expected to recover two months' slippage on the Landing Platform Dock (Replacement) programme, which would bring the in-service date forward from March 2003 to January 2003. In-service dates for programmes were ambitious, and people within the Department worked hard to try to hold programmes to them. Once a project slipped, however, the resource earmarked for sustaining that project was easily absorbed in balancing the defence budget each year.[21] The Department over-programmed the defence budget by around seven per cent, so the first seven per cent of slippage was swallowed up merely by the defence budget coming back on track.[22]
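A simple worked example may help to show why the first tranche of slippage has no net budgetary effect. The budget figure used below is invented for illustration; only the seven per cent over-programming rate comes from the Department's evidence.

    # Illustrative arithmetic only: the budget figure is an assumption; the seven
    # per cent over-programming rate is the figure given in the Department's evidence.

    budget = 6_000.0              # £ million available in the year (assumed for illustration)
    over_programming_rate = 0.07  # the Department over-programmes by around 7 per cent

    planned_spend = budget * (1 + over_programming_rate)
    absorbable_slippage = planned_spend - budget  # slippage that merely brings spend back to the budget

    print(f"Planned work: £{planned_spend:,.0f}m against a budget of £{budget:,.0f}m")
    print(f"Roughly £{absorbable_slippage:,.0f}m of slippage is absorbed before any underspend appears")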

Incremental acquisition should be planned from the outset where appropriate

19. Incremental acquisition is one of the principles of Smart Acquisition by which the Department expects to be able to bring capability into service earlier and at reduced cost by reducing the risk of technological difficulties and obsolescence.[23] In the case of the ASRAAM, the Department was reverting to incremental acquisition principles because of technological difficulties encountered and consequential delays to the introduction of the capability. The Department wanted the full requirement but, given the current threat most likely to be faced, was prepared to accept a slightly lower requirement whilst developing a route map to the full requirement should it need it.[24]

Confidence levels should be shown for Key User Requirement forecasts in the Major Projects Report

20. In the 2001 Report the Department forecast that it would meet 93 per cent of the Key User Requirements for equipment agreed with the Equipment Capability Customer, compared to 98 per cent in 2000.[25] Although the Department was disappointed not to be meeting the higher figure, the lower proportion did indicate that it was taking a tougher view of whether or not it was going to deliver equipment that met Key User Requirements. Under Smart Acquisition there was more rigorous scrutiny of the evidence, and projects had to justify to the Equipment Capability Customer how they were going to meet Key User Requirements.

21. Failure on some Key User Requirements had resulted from changed budgetary priorities. The Department reviewed its priorities for the delivery of particular equipments in response to the nature of the operations it faced and lessons learned in those operations. There was not the same degree of stability about the nature of the threat as in the days of the Cold War. Technology was also developing very fast in some areas, a generation being perhaps 18 months in some of the information and communications areas.[26] There was a trade-off between costs and capabilities, provided of course that the need foreseen for the equipment would still be met.[27]

CLEARER AND MORE COHERENT MEASURES

22. Smart Acquisition is intended to enable the Department to buy equipment 'cheaper, faster and better', and over time noticeable improvements in the Department's performance against cost, time and technical parameters should be evident.[28] The Department has introduced Smart Acquisition reforms across the organisation and needs to measure their impact coherently and through the life of the equipment.[29]

Clearer and more coherent measures of the impact of Smart Acquisition are needed

23. The Department's different Smart Acquisition cost reduction targets - £2 billion over ten years in the Strategic Defence Review, £750 million over three years in the Public Service Agreement and £200 million annually in the Business Plan - are confusing and difficult to relate to one another to get a clear picture of the total reductions achieved or being aimed for under Smart Acquisition. The Department told us that the total cost reductions added up to around £3 billion, some five per cent of the equipment programme over ten years or a little more than one per cent of the defence budget.[30]

24. The Department said that the £2 billion required to meet the Strategic Defence Review target had been removed from its budget.[31] Some 70 per cent of the reductions were expected to arise on only 12 of the 80 or so largest projects.[32] In the whole population of projects there were other savings that could properly be categorised under Smart Acquisition, though they had not been classified in this way. For example, through a combination of later delivery and a number of other measures associated with optimising construction, the Astute submarine would save about £90 million, but it had not been categorised under Smart Acquisition because it was seen as normal business.[33]

25. The Department had just about achieved the £750 million Public Service Agreement target but believed that it would struggle year on year to find another £200 million as called for by the Business Plan and could not continue to do so indefinitely.[34] Most projects had very tight estimates and the Department was cautious about assuming that there must be scope for saving money.[35] More important was a determination to secure cost efficiencies by working more closely with industry and by making sure that project teams across the Department also worked more closely together.[36]

Progress towards the Department's Strategic Goal should be reflected in the Major Projects Report

26. The Defence Procurement Agency has set itself a Strategic Goal to deliver 90 per cent of major projects[37] within approved time, cost and performance parameters by 2005.[38] The rationale for the 90 per cent was that the population of major projects making up the Strategic Goal would, at any one time, be made up of some recently approved that were likely to be on track and some further advanced, of which 90 per cent might be expected to be on track as their approved limits for time, cost and performance were set at 90 per cent confidence. Thus on average, 95 per cent of projects would be on track and achieve their approved time, cost and performance parameters. There were also many legacy projects which would not have worked their way out of the system by 2005, so the best the Department could achieve if every project stayed within limits was 90 per cent of projects within approval by 2005. The Department's long-term goal was to get to 95 per cent.[39]
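The arithmetic behind the 90 and 95 per cent figures can be illustrated as follows. The split of the portfolio and the legacy adjustment in this Python sketch are illustrative assumptions, chosen only to show the shape of the calculation rather than the Department's actual modelling; only the 90 per cent confidence level for approvals comes from the evidence.

    # Illustrative Python sketch of the arithmetic behind the 90 and 95 per cent figures.
    # The portfolio split and the legacy adjustment are assumptions for illustration;
    # the 90 per cent confidence level for approvals comes from the Department's evidence.

    portfolio = [
        # (description, share of projects, proportion expected to stay within approval)
        ("recently approved", 0.5, 1.00),  # assumed still on track shortly after Main Gate
        ("further advanced", 0.5, 0.90),   # approvals set at 90 per cent confidence
    ]

    steady_state = sum(share * on_track for _, share, on_track in portfolio)
    print(f"Steady-state expectation: {steady_state:.0%} of projects within approval")

    # Legacy projects still in the system by 2005 cap what is achievable, hence the
    # interim goal of 90 per cent; the five-point adjustment below is illustrative.
    legacy_adjustment = 0.05
    print(f"Illustrative achievable level by 2005: {steady_state - legacy_adjustment:.0%}")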

27. The Strategic Goal required 90 per cent of the major projects to be within approval on all three factors, time, cost and performance, so that they could not shelter behind one another.[40] In just the last year, however, nine of the 20 projects in the 2001 Report have failed on one or more of the three factors,[41] and several of the others may fail on one of them later in their lives, so the Agency is a long way from its 90 per cent Strategic Goal. The Department said that the Agency's Strategic Goal was a tough one, but it was confident of achieving it because the project approvals against which it would be measured would have been set more analytically than for projects in the 2001 Report, which were predominantly legacy projects.[42]


1   C&AG's Report, Ministry of Defence: Major Projects Report 2001 (HC 330, Session 2001-02), Executive Summary paras 1-2

2   C&AG's Report, Parts 1-3

3   C&AG's Report, para 2.1

4   Q3

5   Q16

6   5th Report from the Committee of Public Accounts, Ministry of Defence: Major Projects Report 2000 (HC 368, Session 2001-02)

7   C&AG's Report, para 2.7

8   Qs 4-8

9   C&AG's Report, para 1.24

10   ibid, para 1.11

11   Qs 1-2

12   Qs 17-18

13   Q3

14   C&AG's Report, para 1.3

15   Q20

16   Qs 89, 102

17   5th Report from the Committee of Public Accounts, Ministry of Defence: Major Projects Report 2000 (HC 368, Session 2001-02), para 36

18   Qs 45, 64

19   C&AG's Report, Appendix 2, p38

20   Ev 18

21   Qs 65-66

22   Qs 68-69

23   C&AG's Report, para 3.7

24   Q36

25   C&AG's Report, para 1.29

26   Q32

27   Q53

28   C&AG's Report, para 3.1

29   ibid, para 3.12

30   Qs 53-56

31   Q49

32   C&AG's Report, Figure 16

33   Q10

34   Qs 50-51

35   Q10

36   Q53

37   Defined as post-Main Gate projects with a forecast spend in excess of £20 million and yet to enter service

38   C&AG's Report, para 1.3

39   Q15

40   Q83

41   C&AG's Report, Figure 1

42   Q85


 

© Parliamentary copyright 2002
Prepared 4 July 2002