Select Committee on International Development Eighth Report


Measuring The Impact and Effectiveness of DFID's Work

Geographical and Policy Focus of DFID's Programmes in its Top 30 Partner Countries

30. The publication of Country Strategy Papers by DFID has demonstrated the sophisticated spectrum of policies and responses which DFID has agreed with its partner countries. In some countries it is concentrating more on health, and in some more on education, depending (quite rightly) on its analysis of the priority areas, and on its strategy agreed with the partner government. Therefore to analyse its performance on the basis of its total expenditure in those countries is to ignore the sophistication of its programme profile. Such an aggregation is misleading, and does not provide useful information about DFID's contribution to development in the countries where it works.

31. In some of the countries where it works, DFID is involved in programmes and projects in specific geographical areas, rather than on a country-wide basis. India is perhaps the most important example: DFID has programmes in three states (Andhra Pradesh, Orissa and West Bengal), and although it is planning an expansion into other states, the population included in those states is a small proportion of the total population of the country. It is therefore inappropriate to measure progress at the level of India as a whole, rather than concentrating on the areas where DFID is actually working. Obviously other donors are working in other states and at a central government level, but if DFID has no involvement in those programmes, it is difficult to imagine how it can claim responsibility for the impact or effectiveness of programmes there.

32. Those PSA targets which are based on progress towards the IDTs tend to set a single target to be achieved, on average, across all of the top 30 partner countries. For example, DFID has a target for a 1.5 per cent rate of GNP per capita growth, a maternal mortality ratio of 240 deaths per 100,000 live births, under-five mortality rates of 70 per 1,000 live births, and 91 per cent of children in primary school, in its top 30 development partners, by 2002. The table below shows the current state of these countries in relation to three of DFID's 2002 targets.

Table 6: Development Indicators in DFID's Top 30 Partner Countries

DFID's top 30             Under-five mortality     Maternal mortality      Primary enrolment
development partners      rate (1998, deaths per   ratio (1998, deaths per (1998, % of
                          1,000 live births)       100,000 live births)    relevant age group)

TARGET                    70                       240                     91
Angola                    204                      ..                      35
Bangladesh                96                       440                     75
Cambodia                  143                      ..                      100
China                     36                       65                      100
Ivory Coast               143                      600                     58
Egypt                     59                       170                     95
Ethiopia                  173                      ..                      35
Ghana                     96                       ..                      ..
India                     83                       410                     77
Indonesia                 52                       450                     99
Jordan                    31                       41                      68
Kenya                     124                      590                     65
Malawi                    229                      620                     99
Morocco                   61                       230                     77
Mozambique                213                      ..                      40
Nepal                     107                      540                     78
Pakistan                  120                      ..                      ..
Philippines               40                       170                     100
Poland                    11                       8                       99
Romania                   25                       41                      100
Russia                    20                       50                      100
Sierra Leone              283                      ..                      ..
South Africa              83                       ..                      100
Sri Lanka                 18                       60                      100
Tanzania                  136                      530                     48
Uganda                    170                      510                     ..
Vietnam                   42                       160                     100
West Bank & Gaza Strip    26                       ..                      ..
Zambia                    192                      650                     ..
Zimbabwe                  125                      400                     93
Minimum                   11                       8                       35
Maximum                   283                      650                     100
Average                   105                      321                     81

.. = data not available
Source: World Bank, World Development Indicators 2000

Range in Country Circumstances

33. The table above shows that for these three targets there is a large range in the indicators relating to the IDTs among DFID's top 30 development partners. Maternal mortality ratios, for example, range from 8 to 650 per 100,000 live births. To set a target for the average ratio in all these countries, given this large range, is to produce arbitrary figures which do not provide any information at all about actual progress in relation to the current situation.
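The arithmetic behind this point can be sketched directly from Table 6. The short Python snippet below (a minimal illustration, using only the 21 maternal mortality figures reported in the table) reproduces the range and unweighted average shown in the table's summary rows, and counts how many countries already sit at or below the 2002 target of 240:

```python
# Maternal mortality ratios from Table 6 (1998, deaths per 100,000 live
# births) for the 21 of DFID's top 30 partner countries with data.
mmr = {
    "Bangladesh": 440, "China": 65, "Ivory Coast": 600, "Egypt": 170,
    "India": 410, "Indonesia": 450, "Jordan": 41, "Kenya": 590,
    "Malawi": 620, "Morocco": 230, "Nepal": 540, "Philippines": 170,
    "Poland": 8, "Romania": 41, "Russia": 50, "Sri Lanka": 60,
    "Tanzania": 530, "Uganda": 510, "Vietnam": 160, "Zambia": 650,
    "Zimbabwe": 400,
}

values = list(mmr.values())
mean = sum(values) / len(values)
print(f"range: {min(values)}-{max(values)}")  # range: 8-650
print(f"unweighted average: {mean:.0f}")      # unweighted average: 321

# An average target of 240 says little about individual countries:
# many are already well below it, while others exceed it severalfold.
TARGET = 240
met = [country for country, v in mmr.items() if v <= TARGET]
print(f"{len(met)} of {len(mmr)} countries already at or below the target")
```

As the output shows, roughly half of the countries with data already meet the target while others miss it by a factor of two or more, so movement in the average reveals almost nothing about where, or whether, progress has actually occurred.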

Conclusion: The International Development Targets as Tools for Performance Measurement

34. The Government's PSA states that targets should be 'SMART' — Specific, Measurable, Achievable, Relevant and Timed. To this we would add a requirement that they cover the full range of the Departments' expenditure programmes. The targets included in DFID's Departmental Report 2000 include some which are not timed and for which no measurable targets have been set (those in square brackets in Table 4). The targets also exclude elements of DFID's objective to contribute to good governance and the realisation of human rights, and the prevention and resolution of conflicts. If DFID wishes to relate its targets directly to its overall objectives, then the targets should be comprehensive, not selective of those objectives which happen to be more amenable to quantification.

35. The PSA targets are specific and relevant in that they relate to specific development outcomes and are based on the IDTs which are now the basis of all DFID's work. But they are not, as we have shown, specific to DFID's contribution to the achievement of the IDTs, and they are irrelevant to the task of holding DFID to account for its expenditure because they do not provide any insight into the effectiveness or impact of any of DFID's projects or programmes. Similarly, whilst progress towards the IDTs in the UK's top 30 development partners is, at least potentially, measurable (putting aside the difficulties of collecting such statistics in developing countries), the targets are not measurable in the sense that they do not provide any measure of DFID's own work. DFID's performance targets, as contained in the Public Service Agreement, fail to meet the Government's own 'SMART' criteria, and they fail to cover the full range of DFID's policies and Departmental Objectives.

36. We do not question the value of the International Development Targets, and we congratulate DFID on its work in mobilising support for them among other donors and development partners. Nor do we doubt that in various ways DFID's work is contributing significantly to their achievement. It is our view, however, that measuring DFID's performance as a Department by relating its expenditure at country levels to specific development outcomes in a disparate set of countries is so simplistic as to be uninformative and useless.

37. Graham Steggman told us in oral evidence that DFID, along with other Government Departments, was revising its OPA/PSA targets as part of the 2000 Spending Review process, and that the outcome would be "new Public Service Agreements which will set out rather more clearly what are the department's objectives for the three-year period covered and how they relate to resources".[32] These are expected to be published shortly.[33] Four of the new targets have been published with the results of the Spending Review.[34] We welcome DFID's commitment to revise its PSA, and we look forward to examining the revised targets.

38. We recommend that DFID revise its PSA so as to differentiate between its overall objectives (to contribute to the achievement of the IDTs) and the measurement of the effectiveness of its own programmes and projects (and those of the multilateral development organisations to which it contributes) against comprehensive, clear, specific, measurable and timed targets. The targets should be built on a strong basis of evaluation at project and programme level, the results of which may then be aggregated to show DFID's performance in each of its key policy areas.

Project and Programme Evaluations

39. At present, there is a clear gap between the specificity of project- and programme-level evaluations, and DFID's overall objectives. It is this gap which must be filled in order for DFID to be able to establish useful performance targets. Sir John Vereker said in oral evidence that "we recognise that there is potentially a gap between on the one hand our measurement of the impact of our expenditure and on the other hand our achievement of a policy outcome in a particular country".[35] Sir John's answer in oral evidence to this "attribution gap" was that DFID had invested significantly in policy analysis. There is no doubt that a large amount of policy analysis information is now available, produced by DFID and by others, perhaps most notably the World Bank. Whilst this is welcome, it is not sufficient for DFID to learn from its own successes and failures, and to be held properly to account for its expenditure. A significant investment must also be made in expenditure analysis and evaluation procedures. This is necessary in order to establish the effectiveness of individual programmes and projects in producing the desired outcomes.

40. We must be able to assess how the levels of expenditure on projects and programmes relate to each of DFID's objectives and targets. This aspect of accountability is well catered for in data recorded by the Policy Information Marker System, the Poverty Aim Marker and the Policy Objective Marker.[36] These markers provide an analysis of DFID's expenditure according to key policy areas, its objectives, and the ways in which projects are expected to contribute to poverty eradication.

41. In its 1998 and 1999 Departmental Reports, DFID included in each section a table showing information on its expenditure captured by Policy Information Marker System data, which measures expenditure according to key policy areas. In its 2000 Departmental Report, DFID has discontinued this practice. We recommend that in future Departmental Reports, in each section of its report relating to one of its policy objectives, DFID include a table showing relevant Policy Information Marker System, Policy Objective Markers and Poverty Aim Marker data for the past three years. This will provide a clear and readily-accessible picture of DFID's expenditure profile in relation to its objectives.

42. There must also be information available on the success or otherwise of individual projects and programmes in meeting their objectives. All projects over £500,000 are currently assessed on their completion by the project manager, who must fill in a project completion report.[37] The information provided in these reports is collated periodically in synthesis reports, which provide an overview of the success of projects in various sectors and countries in meeting their objectives. Our analysis of the most recent synthesis study, which covers PCRs completed during the period 1983 to 1998,[38] is that the document is useful in that it provides an overview of the areas of strength and weakness of DFID's programmes. Its limitations lie in its lack of information about or analysis of the reasons for the success or failure of projects and programmes in meeting their objectives. This undermines the utility of the study — lessons learned cannot be fed back into programmes if they are not disseminated in studies such as this.

43. DFID's project completion report forms have been revised recently to capture more information about lessons learned from successes and failures, and we look forward to this information being available on an annual basis in future project completion report synthesis studies. This will represent a significant improvement in their value as tools for learning lessons and applying them in future projects and programmes. We recommend that, in each section of future Departmental Reports, tables be included showing the success of projects in each policy area in reaching their objectives (as reported in the most recent project completion report synthesis study), along with a summary of the key lessons which have been learned.

44. The Project Completion Reports represent a snapshot of a programme or project at its completion. The PCRs are not independent (they are completed by the project manager), and they focus on the objectives of the project or programme set by the project manager. They do not include significant scope for an analysis of any wider or unexpected impact which may have been made beyond that intended by the project. The Project Completion Reports alone do not present a comprehensive picture of DFID's performance. They must be supported by evaluation studies which are independent, published, are conducted after the completion of projects and programmes, and which measure the broader and longer-term impact of projects and programmes, and their contribution to poverty eradication.

45. We asked DFID what proportion of its bilateral expenditure was independently evaluated. The response was, "only a small fraction".[39] When we questioned him in oral evidence about the level of independent evaluation of DFID's work, Sir John Vereker responded, "I am not sure that it is right to distinguish quite so violently between independent evaluations and our own. I think the head of our evaluations department ... would assert fairly vigorously that evaluations, whether they are done in-house or contracted to outsiders, are in every case done by professional evaluators according to well-established criteria, and all our evaluations are published".[40]

46. Sir John Vereker went on, however, to state that "Of course ... you have got to view quite sceptically anything which contains the element of self-assessment which is in [the Project Completion Reports]... It is quite difficult, without being massively bureaucratic and elaborate, to come up with something that is more obviously independent".[41] We disagree with Sir John's assertion that to achieve a more independent level of evaluation would be difficult in the way he suggests. The evaluation studies published by DFID demonstrate that this is not the case.

47. DFID commissions ad-hoc evaluation studies of some of its projects and programmes. These involve a full retrospective impact evaluation, responsibility for which rests with the authors of the report (who often include experts from outside DFID as well as representatives of its own evaluation department), rather than with DFID itself. These provide an important supplement to the project completion reports. In our Report on DFID's 1998 Departmental Report, we noted that evaluation studies were conducted on around 12 projects each year. Sir John Vereker told us in evidence to this inquiry, however, that there has been a shift in focus recently in DFID's evaluation studies: they now concentrate on sectors, countries or themes, rather than on individual projects and programmes, and there are now fewer studies, each looking at several projects at a time. Sir John explained that this shift in emphasis would lead to evaluation studies which provided more valuable insights into lessons which could be drawn.[42]

48. It is impossible to check which evaluation studies have been published recently, since no list of recent evaluation studies is available on the Department's website (which only includes evaluations carried out up to 1998), in its publications catalogue (published in 2000), or in the Departmental Report. We recommend that DFID revise its publications catalogue to include evaluation studies, and that it update its Internet site to include all evaluation studies on a regular basis. This will enable other organisations outside DFID to benefit from the studies more easily.

49. In the absence of readily available information, we asked DFID to provide us with a list of recent evaluation studies.[43] This showed that, in 1999, DFID published only three evaluation studies in addition to the 1999 Project Completion Report Synthesis which we discussed above. These concerned: Health Planning in Pakistan; Indonesia Police; and an Evaluation of HMG's Response to the Montserrat Volcanic Emergency. So far this year, only one evaluation study has been published: an Environment Evaluation Synthesis Study.

50. We consider that independent evaluation of a much greater proportion of DFID's projects and programmes would be a very valuable investment. The lessons which can be learned from such evaluations could prove essential to future planning. We are sure that DFID's in-house Project Completion Reports are compiled in good faith. Nevertheless we would interpret cautiously an evaluation portfolio which consisted almost entirely of self-assessment, without any supplementary independent evaluations. At the very least, there is an undeniable value in obtaining a second expert opinion on the strengths and weaknesses of a project or programme and the nature and extent of its impact, and on the lessons which may be drawn for future work. We recommend that DFID significantly increase the proportion of its projects and programmes that are subjected to evaluation studies. We recommend that DFID publish an evaluation strategy and invite suggestions from its staff and from the development community on projects and programmes which might usefully be independently reviewed. The strategy should include all aspects of its work, including budgetary support and sector-wide adjustment programmes (SWAPs).

51. In the 2000 Departmental Report, DFID stated that "In the future we will support more in-depth evaluations of our effectiveness, both at country level and internationally. This will help assess our impact in key policy areas covered by the International Development Targets and will take forward the development of systems to improve understanding of our impact and influence in the international arena. A Development Impact and Resource Centre will be set up in early 2000. This will provide staff with high quality advice on how to carry out impact assessment studies and improve in-country capacity to maintain and evaluate performance".[44] We look forward to the establishment of the new Development Impact and Resource Centre, and request that DFID keep us informed of progress in its work.

52. DFID publishes a large range of policy analysis documents, strategy papers, and evaluation studies. It has in place good evaluation systems, and has been working to improve them. All this we welcome. There is a gap, however, between these qualitative assessments of DFID's role and performance, and the need to provide summary quantitative information about the overall performance of the Department against clear and meaningful targets. We recommend that DFID produce a set of revised departmental targets, based on the results of its Project Completion Reports and Evaluation Studies, disaggregated according to its objectives and expenditure profile.



32  Q. 50

33  DFID, 2000 Departmental Report, p. 20

34  HM Treasury, July 2000, Spending Review 2000: New Public Spending Plans 2001-2004 (Cm 4807), pp. 73-75

35  Q. 79

36  See DFID, 2000 Departmental Report, p. 166

37  See DFID, December 1998, Project Completion Reports: 1998 Synthesis, Annex D

38  DFID, August 1999, Project Completion Reports: A Review of Findings from Reports Prepared on Projects Approved Between 1983 and 1998

39  Evidence, p. 55

40  Q. 131

41  Q. 132

42  Q. 127 and evidence, p. 55

43  Evidence, pp. 61-62

44  DFID, 2000 Departmental Report, p. 110


© Parliamentary copyright 2000
Prepared 8 August 2000