Science and international development

Written evidence submitted by the

Overseas Development Institute (Int Dev 31)


1. We would like to thank the committee for inviting us to prepare a formal submission to this inquiry. In it, we outline issues that have arisen since the initial terms of reference were issued and which we believe are important to the committee’s deliberations. We have had the opportunity to read through the written submissions that have already been received, and agree with many of the points made. Our focus for this submission is on how science can influence policy, with reference to the whole system of generating and using knowledge.

2. While a focus on building scientific research capacity will help address issues of market failure in the provision of science as a public good, this is not the whole picture. Science is a public good in its own right, but it is also a means to help deliver other public goods such as a cleaner environment, improved health, better education, broad-based economic development and improved trust between citizens and their government.

3. Developing capacities to ensure that scientific research is used to deliver these wider public goods via the policymaking process means looking at the system as a whole (drawing on approaches like Innovation Systems (IS)), but also paying attention to specific parts of the system, namely:

· strengthening and systematising policymakers’ demand for science by equipping them with the tools and methods to be able to procure and use science cost-effectively;

· adding value to scientific research by ensuring its implications are well understood and embedded in broader policy processes;

· recognising the central importance of intermediary organisations which facilitate policy debates and convey narratives around science.

4. Building capacity is not a straightforward process, and doing so successfully requires long-term commitment, a systemic approach, innovation and a high level of professional rigour. We elaborate on these points below.

Taking a ‘whole systems’ approach in delivering policy goals

Why this is important

5. The simple linear model, where research results are disseminated to target audiences who assimilate this new knowledge and act upon it, is too simplistic (Barnard, et al., 2006). Scientific research is clearly just one of many competing factors influencing policy decisions and changes in practice (see Court, et al., 2004 and Young and Mendizabal, 2009). Decisions about how to use science to deliver wider public goods are intimately bound up with the policymaking process in a reflexive and complex set of relationships (see Jones, 2011; Ramalingam, et al., 2008). As such there is a real need to understand and focus on the processes and drivers behind the use and uptake of new or existing knowledge. The RAPID framework [1], for instance, identifies four groups of factors that shape the science-policy interface: the political context, the nature of the evidence, the mechanisms which link evidence with policy processes, and external influences.

6. Such whole system approaches have been taken up by a number of initiatives developing research capacity. DFID has funded the Research into Use (RIU) programme, which aimed to contribute to sustainable agriculture in South Asia and sub-Saharan Africa by adopting a pro-poor ‘whole system’ or, in this case, Innovation Systems (IS) approach to getting (DFID-funded, natural resources-related) research into use and to increasing the understanding of how this is done. An innovation system is a network of organisations and individuals – comprising knowledge users, producers and intermediaries (at national, sub-national, regional and/or international level) – involved in generating, modifying and using new knowledge. The IS approach considers not only the entire research, development and extension spectrum, but also the institutions, systems of production, and social relations in which these activities take place (Clark, 2010).

7. In Indonesia, AusAID is about to launch an almost two-decade-long AUD 300 million programme to develop what it calls the country’s ‘knowledge sector’. The programme’s "knowledge to policy" model contains four inter-connected pillars, each of which will be supported through the programme: 1) research organisations that produce knowledge and evidence that influence policies; 2) policymakers who demand and use evidence in formulating policies; 3) intermediary functions and bodies that translate, package and communicate knowledge; and 4) the enabling environment – the policies, regulations and procedures that govern how the supply and demand sides operate and interact (AusAID, 2011).

Implications for DFID

8. As Jones, et al. (2009) suggest, taking a whole systems approach emphasises:

· not just the supply but also the demand for knowledge (including scientific research) and the need to strengthen this demand by amplifying the voice of knowledge users (such as farmers, small and medium sized enterprises as well as policymakers) and providing knowledge services;

· the importance of different types of knowledge (beyond scientific research to include citizen or stakeholder evidence and evidence from practice);

· that often structural factors and national context, such as the social value placed on ‘policy entrepreneurship’ or the strength of basic infrastructure, can shape the use of knowledge;

· the importance of networks and linkages as channels for increasing the role of knowledge in policy and practice, and the need to facilitate trust and interaction between a diverse range of actors; and

· the need for actors carrying out ‘intermediary functions’ to facilitate continuous exchange between the ‘supply’ and ‘demand’ for knowledge.

9. There is a widespread belief that the blockages to using science effectively in policymaking arise because of barriers within government that can be overcome with more or better science, or more or better communication. A whole systems approach demonstrates the importance of conducting early and thorough diagnoses of the system to identify the weak points in both supply and demand.

Supporting decision makers to demand and make use of scientific research

Why this is important

10. Strengthening the demand for science by policymakers has received relatively little attention internationally: while there has been a great deal of work internationally on strengthening the supply of science to policy via academia, think tanks and other organisations, considerably less has been done to help developing country policymakers understand how to use science and other forms of evidence effectively.

11. It is not enough for policymakers to use science instrumentally. Promoting better governance means helping to align policymakers’ incentives to use science with the policy goals they are charged with delivering in a set of robust and transparent decision-making processes. If an impact assessment is just a tick-box affair there is very little incentive for policymakers to seek out evidence to answer the questions. Likewise, strategy development processes may be purely about negotiating policy positions without reference to what the evidence says, and budgeting processes may be simply an accounting exercise rather than a systematic effort to underpin policy priorities with the necessary resources.

12. There is also an important set of questions relating to how cost-effectively science is procured and used to inform policy: specifically, how the budgets at the disposal of government departments are used to procure scientific research for policymaking. For the purposes of this inquiry, we include donor-funded projects in the definition, as developing country governments, particularly those with low incomes, generally have minimal budgets for procuring science themselves and often rely on external funders to deliver what they need.

13. Lessons from Whitehall show the importance of developing a clear line of sight between policy goals and the science base to ensure that government budgets spent on science are focused on delivering those goals cost-effectively. This is perhaps best exemplified by the Department for Environment, Food and Rural Affairs (Defra)’s Evidence Investment Strategy process [1].

14. Strengthening policymaking processes such as these helps make the procurement and use of science more cost-effective. It also contributes to the good governance agenda, improving the effectiveness of policymaking processes by ensuring that the business processes of government departments (strategy, planning, budgeting, options appraisal, citizen engagement and monitoring) actively encourage policymakers to seek out evidence in a structured and systematic way (see for example Hallsworth & Rutter, 2011 and Laughrin, 2011). In addition, clarifying policy’s needs for science helps create a ‘demand-pull’ on the science base, aligning incentives to supply science with demand from government.

15. This on its own is not enough, however. It is also important to build policymakers’ own capacity to articulate their science needs to researchers more effectively and systematically; to help them understand the scientific process, recognise good and bad science, and know how to deal with uncertainty in the science base.

Implications for DFID

16. More systematic and political analyses of policy processes would help identify the different actors or stakeholders involved and understand how their interests and perspectives influence the ways they use scientific research, or not. This can help inform knowledge-to-policy programming – allowing DFID to identify key entry points to help partner governments ensure that policy processes engage more widely and draw on multiple types of knowledge (see for example Datta et al, 2011; Jones et al, forthcoming 2012).

17. We believe that some of the best examples of tools to systemise demand are to be found in Whitehall, for example in the Government Chief Scientific Adviser (GCSA)’s Guidelines on the Use of Scientific and Engineering Advice in Government [1] and the work on Evidence Strategies that has been done by Defra and other departments (referenced above). While more needs to be done to determine the extent to which this Anglo-Saxon model of evidence-based policymaking is truly relevant to other countries and cultures, we believe there would be merit in:

· seek to adapt the GCSA guidelines to country-specific circumstances. While we recognise that there are substantial differences in the capacity of Whitehall and developing country departments, the GCSA principles are equally relevant. The process of producing country-specific guidelines would help identify where the particular science-related weaknesses are inside departments and help focus efforts to strengthen them;

· draw lessons from the work in Whitehall to develop evidence investment strategies, to begin the process of stimulating and systematising the demand for science and other forms of evidence by policymakers. As we note above, this would speak to the twin agendas of value for money and good governance;

· support techniques which encourage policymakers to think ahead, help them consider what evidence they need to future-proof policies and strategies, and explore future risks and opportunities; and

· consider support for programmes like the Canadian Health Services Research Foundation (CHSRF)’s Executive Training for Research Application (EXTRA) program, which develops capacity and leadership to optimize the use of research evidence in managing Canadian healthcare organizations. [2]

18. Strengthening and systematising the demand for science and other forms of evidence in policymaking is a long-term process, but we believe there are sufficient examples of good practice in the UK that could be adapted to the different realities of developing country governments.

Adding value to scientific research by strengthening science-policy dialogue

Why this is important

19. Communication is fundamental to science’s ability to make a meaningful difference to policymaking processes. But improving science communication is not simply an issue of marketing the results of scientific research by, for instance, producing policy briefs and working papers, or targeting key decision-makers with information. As indicated above, good communication can create the conditions for dialogue among, and critical engagement with, key policy actors (including citizens as well as scientists and policymakers) to improve the rigour and quality of policymaking. For instance, scientific research is more likely to be integrated into policy if policymakers (and other users) are consulted about their views and priorities at the outset of a research project, rather than presented with a completed research report over which they have little sense of ownership (Jones, et al., 2009). Such dialogue can:

· help policymakers who are not technical specialists understand the results of complex scientific processes and their relevance to current policy priorities;

· help policymakers recognise good and bad science, understand the implications of scientific uncertainty, and translate this into a better appreciation of how to deal with risk;

· alert policymakers to the possible implications of new and emerging evidence and encourage wide and open engagement about those implications, as in the upstream public engagement work on nanotechnologies (see, for instance, Datta, 2011);

· amplify the voices of knowledge users, such as farmers and owners of small and medium-sized enterprises (SMEs), and provide knowledge services to strengthen them; and

· ensure that messages from science are delivered in a timely way, anticipating policy’s likely needs for information or responding to them as rapidly as possible.

20. In Peru, for instance, the Consorcio de Investigación Económica y Social (CIES) (Economic and Social Research Consortium) – a Peruvian umbrella organisation with 44 institutional members, including think tanks, research centres, NGOs, private consultancies and public agencies – designed its research awards around the knowledge demands (i.e. a research agenda) outlined by the Public Sector Consultative Group (PSCG), made up of government, the Central Bank, regulatory entities and Parliament (Jones, et al., 2009).

21. In another example, a deliberative citizens’ jury on food and farming futures in Zimbabwe convened by a group of actors including a government agency (overseen by a panel of senior officials from the Ministry of Land and Agriculture) enabled an information exchange amongst farmers, scientists and policymakers on what stakeholders wanted to see in the smallholder agricultural sector in Zimbabwe by 2010 (Rusike, 2005).

Implications for DFID

22. DFID has led the way internationally in ensuring that sufficient attention is paid to communicating the results of scientific research to policymakers: stipulating that research projects and programmes actively consider communications from the outset, through its policy of a 10% minimum spend on communications within Research Programme Consortia (RPCs) and requiring each one to produce a strategy showing how research would be put into use; providing web portals such as Research for Development; bringing international groups of researchers together to discuss how to improve the impact of scientific research on policy (see Shaxson, 2010); and reviewing the impact of its research communication spend (see Hovland, et al., 2008).

23. While the past decade of DFID-funded work has given the international community a real appreciation of the need to think systematically and rigorously about how to communicate science to policy, more work still needs to be done to understand the impact of different research communication strategies and the wider public value they ultimately add to scientific research. The evaluation of DFID’s policy of a minimum spend on communications in RPCs recommended that DFID continue with this policy, increase the threshold to 15% for future rounds, provide more practical support to RPCs in implementing the communications policy and fund cutting-edge research on research communication (Hovland, et al., 2008). The current moratorium on communications and marketing spend by Whitehall departments has had an unfortunate impact on funding for communicating scientific research to policymakers, because research communications (which facilitate policy dialogue) have been conflated with central communications (which ‘sell’ the DFID message). Should this continue, it will significantly reduce the value that can be added to DFID-funded research.

Working with intermediaries to facilitate the interpretation and use of science

Why this is important

24. There are limits to what individual research centres can achieve on their own, particularly smaller organisations that operate outside the traditional academic environment (as noted in the STEPS Centre’s points about citizen science). The importance of creating networks and linkages as channels for increasing the uptake of knowledge, and the need to facilitate trust and interaction between a diverse range of actors, are now well recognised.

25. The print, broadcast and online media have to some extent facilitated this interaction, by publicising and critiquing research findings, promoting and widening debate, and demanding accountability. But spurred on by rapid developments in information and communication technologies over the last decade (particularly the development of web-based tools, social media and smart phones), these roles are increasingly being played by other intermediaries; ones who do more than simply transmit information. Instead they contribute to interpreting information, creating new knowledge and fostering social learning and innovation in a variety of ways, for instance by strengthening relationships and networks of actors or contributing to collective engagement around an issue.

26. These intermediaries can sit outside government (such as prominent academics and communication specialists within universities; networks, which enable collaboration beyond the usual institutional, cultural and functional boundaries of an organisation; think tanks; and civil society organisations), inside government (such as strategy units and evidence teams) or somewhere in between (high-level commissions, science advisory councils and legislative committees). Given the growing importance of legislatures in many developing countries, supporting the knowledge requirements of legislative committees in particular could help them fulfil their oversight function more effectively and drive up the quality of policymaking (see, for instance, Datta & Jones, 2011 on legislator-researcher linkages).

27. Examples include Jean Drèze, a development economist who has been influential in Indian economic policymaking, particularly on issues of hunger, gender inequality, child health and education; he helped conceptualise and draft the first version of the National Rural Employment Guarantee Act (NREGA) (Datta & Mendizabal, forthcoming 2012). The African Centre for Economic Transformation (ACET) is a think tank established in 2007 to provide policy analysis and advice to African governments. It is unique in that it champions an African perspective, harnesses African talent from within the continent and from its diaspora, and draws on a network of international experts and preeminent African professionals (Datta & Young, 2011). The Vietnam Economic Research Network (VERN) – a community of young researchers formed to address the inadequate capacities of existing research organisations – works with both government and legislative committees to develop policy options in a range of social and economic policy areas.

28. The Inter-Agency Network for Education in Emergencies was established in 2000 after it was realised that humanitarian emergencies were an obstacle to the fulfilment of the global commitment to ‘Education for All’. The network created a great deal of value through collective action in formal and self-organising groups of representatives from aid agencies engaged in producing, translating and sharing knowledge; successfully advocated for more attention to education in emergencies; and provided training and advice on minimum standards (Mendizabal & Hearn, 2009).

29. Several large-scale initiatives have aimed to fund and support intermediaries to improve policymaking processes. The ODI-managed Mwananchi programme aims to build the capacity of key interlocutors such as the media and civil society organisations, but also government departments, in improving state-citizen relations in several African countries. [1] The African Capacity Building Foundation (ACBF) helped to establish and support economics-oriented government and non-government policy think tanks in sub-Saharan Africa and established 12 national and regional knowledge networks. These helped raise awareness among policymakers of the need for more evidence-informed policymaking, and many of the organisations won a visible and credible voice in their country’s policy discourse (Daima Associates, 2006 in Datta & Mendizabal, 2008). The Think Tank Initiative (TTI), to which DFID is contributing, has provided core budget support to 51 think tanks across the developing world to help them improve their research and engagement work as well as their organisational structures. [2]

30. In an increasingly interconnected world, actors are increasingly joining forces in partnerships and networks that cut across national boundaries to generate and use knowledge more systematically to address regional and global challenges. The Climate and Development Knowledge Network (CDKN) for instance, is a large-scale global alliance of Northern and Southern private and non-governmental organizations working to support decision makers in designing and delivering climate compatible development (Datta & Young, 2011).

Implications for DFID

31. Visualising science and policy as separate communities does not help efforts to make science meaningful to developing country policymakers. DFID could perhaps do more to recognise that intermediary organisations take various forms and can play a wide range of roles. This includes support to different types of knowledge intermediary (such as policymaker or issue based networks) to work in ways that are more tailored to different contexts and would help foster wider engagement around science in general.

Building capacity for lasting solutions

Why this is important

32. An assessment of the research environment in Africa commissioned by DFID and conducted by ODI suggested that capacity building should not be viewed as a simple add-on to existing initiatives (Jones & Young, 2007). Capacity is a multidimensional concept; building it can lead to outcomes that are not initially obvious or clearly attributable. It is an inherently political task which requires long-term commitment, a systemic approach and a high degree of flexibility.

33. Approaches focused on single entities have tended to be limited in their impact, as they do not deal sufficiently well with actors and their relations with one another. Hence, capacity development needs to focus not just on the capacities needed to achieve technical results, such as knowing how to communicate research to non-specialists, but also on what it takes to build more effective and dynamic relationships between different actors – such as scientists, policymakers and intermediaries – within a system (be it an organisation, a sector or a country).

34. Promoting capacity development can be a difficult process: it needs an appreciation of many domains of knowledge and disciplines, including organisational development and management science, multi-stakeholder processes, insights from social and political science, behavioural psychology and others. As with doctors and teachers, an understanding of these issues is not necessarily brought about through formal teaching. Field experience is also crucial, through, for instance, immersion in context and learning by doing (Datta, et al., forthcoming 2012).

35. Capacity development services are often overseen by Northern-based organisations, with local capacity development providers, although growing in number, still playing a marginal role. While foreign organisations may have staff with excellent technical skills, they often lack, for instance, an in-depth understanding of the local context and cultural sensitivities; are unable to speak the local languages; or may be unfamiliar with professional, formal and informal networks. [1] Moreover, Northern consultants building the capacities of Southern organisations can, if not carefully managed, reinforce existing power and knowledge asymmetries.

Implications for DFID

36. Given the complex and multidimensional nature of human systems, building capacity effectively would involve:

· promoting ownership of strategies: locally designed and monitored, context-specific capacity building initiatives can help to ensure their sustainability;

· delivering long-term and flexible support: long-term core funding, and space for local organisations to deliver what they think is needed (drawing on both conventional and advanced approaches) when they think it is required, can help them respond to complex and changing organisational and environmental contexts;

· encouraging the growth and development of national (and sub-national) level capacity development service providers (such as civil society and consultancy organisations) and promoting South-South learning/collaboration;

· encouraging higher levels of professional rigour and innovation amongst those who manage and implement capacity development programmes through, for instance, support for more and better development and learning opportunities, communities of practice, minimum professional standards, and more information for local organisations about the kinds of solutions and support available.


37. There is a real need to take a whole systems approach to developing capacity in producing and using scientific research. Within government this means being clear about what the policy goals are and being open about what science is needed to inform the policy development and delivery processes. Outside government, it means being clear where science has something to say about policy and engaging in a conversation with multiple stakeholders. This also requires a ‘brokering’ process to decide which of those messages are currently most useful, whether they challenge received wisdom, confirm what we think we know, explain complex relationships, enrich our understanding of an issue, or scope opportunities for policy change. As policy priorities shift and as new scientific information emerges, their implications have to be set in the context of what policymakers are trying to deliver. And finally, capacity building in all these areas requires a long-term, systemic and flexible approach rooted in local ownership.

Declaration of interests

The Overseas Development Institute is a UK based think tank working on international development and humanitarian issues. Its mission is to inspire and inform policy and practice which lead to the reduction of poverty, the alleviation of suffering and the achievement of sustainable livelihoods in developing countries. The ODI does this by combining applied research, practical policy advice, and policy-focused dissemination and debate. It works with partners in the public and private sectors, in both developing and developed countries.

ODI’s Research and Policy in Development programme (RAPID) works to understand the relationship between research, policy and practice across different contexts, exploring factors that may contribute to or limit the ability of knowledge to play a role in policy and practice. RAPID then uses insights from its research, learning and practical experiences to develop new competencies and skills in those wanting to use research evidence to influence and improve policies and practices. Given our remit, we have a keen interest in the findings of the Inquiry.

Louise Shaxson, Ajoy Datta and John Young

On behalf of the RAPID Programme at the ODI

2nd February 2012



AusAID, 2011. Revitalising Indonesia's Knowledge Sector for Development Policy, Jakarta: AusAID.

Barnard, G., Carlile, L. & Basu Ray, D., 2006. Maximising the impact of development research: how can funders encourage more effective research communication, s.l.: DFID, IDS, IDRC.

Clark, N., 2010. Innovation Systems, Economic Systems, Complexity and Development Policy, s.l.: Research into Use.

Court, J., Hovland, I. & Young, J., 2004. Bridging Research and Policy in International Development: Evidence and the Change Process. London: ITDG.

Daima Associates, 2006. External Evaluation of the African Capacity Building Foundation (ACBF), s.l.: s.n.

Datta, A., 2011. Lessons from deliberative public engagement work: a scoping study, London: ODI.

Datta, A. et al., 2011. The political economy of policymaking in Indonesia: Opportunities for improving the demand for and use of knowledge, London: ODI.

Datta, A. & Jones, N., 2011. Linkages between researchers and legislators in developing countries, London: ODI.

Datta, A. & Mendizabal, E., 2008. Lessons Learnt in Think Tank Development, s.l.: s.n.

Datta, A. & Mendizabal, E., forthcoming 2012. Think tanks and hegemony: How politics, policy and power have shaped the birth and evolution of think tanks in the developing world, London: ODI.

Datta, A., Shaxson, L. & Pellini, A., forthcoming 2012. Capacity, complexity and consulting: managing large capacity development projects in complex settings, London: ODI.

Datta, A. & Young, J., 2011. Producing Home Grown Solutions: Think Tanks and Knowledge Networks in International Development, Washington: World Bank Institute.

Davies, R., 2010. Cynefin Framework versus Stacey Matrix versus network perspectives. [Online; accessed 15 January 2011].

Hallsworth, M. & Rutter, J., 2011. Making Policy Better: Improving Whitehall's Core Business, London: Institute for Government.

Hovland, I., Young, J., Mendizabal, E. & Knezovich, J., 2008. Review of research communication in DFID-funded Research Programme Consortia (RPC), London: ODI.

Jones, N. & Young, J., 2007. Setting the Scene: Situating DFID’s Research Funding Policy and Practice in an International Comparative Perspective, London: ODI.

Jones, H., 2011. Taking responsibility for complexity: How implementation can achieve results in the face of complex problems, London: ODI.

Jones, H., Jones, N., Shaxson, L. & Walker, D., forthcoming 2012. Knowledge, Policy and Power in International Development: A Practical Guide, Bristol: Policy Press.

Jones, N., Datta, A. & Jones, H., 2009. Knowledge, policy and power: Six dimensions of the knowledge–development policy interface, London: ODI.

Laughrin, D., 2011. Searching for the X-Factors: A Review of Decision-Making in Government and Business, s.l.: Ashridge.

Mendizabal, E., Datta, A. & Young, J., 2011. Developing capacities for better research uptake: the experience of ODI's Research and Policy in Development Programme, London: ODI.

Mendizabal, E. & Hearn, S., 2009. Case Study of the Inter-Agency Network for Education in Emergencies, s.l.: s.n.

Ramalingam, B., Jones, H., Reba, T. & Young, J., 2008. Exploring the science of complexity: Ideas and implications for development and humanitarian efforts, London: ODI.

Rusike, E., 2005. Exploring Food and Farming Futures in Zimbabwe: A Citizens’ Jury and Scenario Workshop Experiment. In: Science and Citizens, Globalization and the Challenge of Engagement. London/New York: Zed Books.

Shaxson, L., 2010. Improving the impact of development research through better research communications and uptake: Report of the AusAID, DFID and UKCDS funded workshop - London November 29th and 30th 2010, London: AusAID, DFID and UKCDS.

Young , J. & Mendizabal, E., 2009. Helping researchers become policy entrepreneurs, London: ODI.


[1] See for more information

[1] The original Evidence Investment Strategy can be found at and the 2011 update at



[1] See



Prepared 7th February 2012