Fifteen years ago, our organization was founded with a clear mandate:
The mission of the International Initiative for Impact Evaluation (3ie) is to contribute to the fulfilment of aspirations for wellbeing by encouraging the production and use of evidence from rigorous impact evaluations for policy decisions that improve social and economic development programs in low- and middle-income countries. (3ie Founding Document, 2008)
As a mission-driven, non-profit international organization, we still treat this mission as the north star that motivates our staff, board of directors, fellows, and partners. Occasionally it is important to pause, reflect on where we are, and celebrate what we have achieved as a community. What better occasion to do so than our 15th anniversary!
While the mission has remained focused on the generation and effective use of high-quality evidence to inform decision-making and improve the lives of people living in poverty in low- and middle-income countries, we're continuously updating the ways we work. Today we focus on recognizing where the gaps are, identifying the binding constraints to fulfilling our mission, and pushing the frontiers of impact evaluation methods, always in collaboration with our partners and the wider community. Over the years, 3ie has remained relevant by continuously adapting, learning, and responding to constraints with innovation. But don't take our word for it: let's go to the evidence and celebrate 15 contributions by 3ie, one for each year of its existence:
- As the Center for Global Development's seminal 2006 report 'When will we ever learn?' pointed out, hardly any rigorous evidence existed at that point to inform the billions spent by governments and development agencies on international development. 3ie's first 'open window,' launched early in 2009, helped change that by funding and quality-assuring a range of impact evaluations across sectors. A range of 'windows' – open, thematic and policy-driven – followed over the years.
- 3ie required all grantees to clarify theories of change, to report on equity and gender, to have letters of interest from the implementing agencies, to include southern researchers in key roles, to carry out costing analysis, to share the data, to have clear stakeholder engagement and evidence uptake plans, and to use mixed-methods to respond to questions not only of what works on average but also for whom, why and under what circumstances. These practices were mostly new and met with substantial resistance. Now, these requirements are commonly accepted and replicated by others.
- 3ie also launched the Journal of Development Effectiveness (JDeff) in 2009 with the express intention of publishing empirical applications of existing methods (not only methodological innovations) with a focus on policy relevance. The journal welcomed studies with null or negative findings to counter publication bias.
- 3ie's first systematic review window was launched in 2008, recognizing that systematic synthesis across a body of similar studies is key to avoiding the misleading conclusions that can come from individual studies and to identifying generalizable and context-specific lessons that only become apparent when looking at the totality of the evidence.
- 3ie worked with the Campbell Collaboration to establish the International Development Coordinating Group in 2010 to help incentivize the production of more systematic reviews in international development to inform policies and programs. 3ie has hosted the secretariat since.
- 3ie dedicated a special issue of JDeff to systematic reviews, which included methodologically ground-breaking work: one article on shifting from 'bare bones' systematic reviews to more policy-relevant mixed-method reviews, and another on applying narrative approaches to systematic reviews to address a broader range of research questions.
- In 2011, in response to the need to map the existing evidence on agricultural interventions and their effects on nutrition, 3ie invented Evidence Gap Maps (EGMs), which have since become a sought-after tool in the development community.
- Following that initial EGM, 3ie continued to innovate through several 'generations' of EGMs: from a static map presented in Excel, to an interactive tool, to an evidence base fully integrated with the Development Evidence Portal, each generation has become more informative than the last. The approach has been adapted by other organizations, such as the International Rescue Committee and South Africa's Department for Planning, Monitoring and Evaluation. Colombia's Department of Planning, a member of 3ie, has worked with us to develop its own guidance in Spanish: mapas de brechas en evidencia. EGMs have also been applied to different and innovative topics, such as our big data systematic map.
- In our most recent EGM innovation, living EGMs, 3ie has successfully developed an approach to producing EGMs that are regularly updated with the newest evidence. The first living EGM covers interventions that function within the food system, including nutrition programs. The map has already been cited in numerous reports and articles, and it has served as a key database in the development of two rapid evidence assessments and a systematic review.
- In 2018, 3ie pioneered the use of contribution tracing to confirm our claims of evidence-informed change. We adapted the method, which combines Bayes' theorem with process tracing, and applied it to an initial portfolio of 100 closed research projects (a minimal sketch of the underlying Bayesian updating follows this list). The exercise led us to change our initial understanding of whether and how evidence was used for dozens of studies in the first year (36, to be exact). The project improved our understanding of evidence use and helped us build a portal of evidence impact summaries containing use cases we could be confident about.
- 3ie hosts the Development Evidence Portal (DEP), the world’s largest repository of impact evaluations and systematic reviews in global development. The portal was re-launched in 2020 with a user-friendly interface and advanced searching and filtering options. DEP now contains more than 10,000 impact evaluations and nearly 1,000 systematic reviews.
- Since its founding, 3ie has championed transparent, reproducible and ethical evidence (TREE). We support transparency by ensuring that 3ie evaluations, syntheses and other knowledge products clearly document the methods and data used in each study, including by hosting our study registration portal, RIDIE, and publishing data on the 3ie Dataverse. 3ie promotes computational reproducibility through our push-button replication (PBR) protocols, which require all 3ie-supported studies to produce replicable results. Most recently, 3ie has led the charge to align best practices in this area through our TREE review framework, currently in its pilot-testing phase.
- 3ie continues to make methodological advances, recently working on big data, costing, and virtual training. We have taken stock of the current applications, opportunities and limitations of big data and data science for understanding development effectiveness, and we are developing and sharing publications, guidance, and tutorials to help judiciously advance global practice on this front. We routinely incorporate costs alongside effectiveness evidence whenever possible and have produced practical guidance to help our partners, grantees and our own staff incorporate prospective cost capture into ongoing impact evaluations. And through 3ie's Training Initiative, we have trained over 200 researchers, policymakers, and implementing partners in evidence generation and use, with an emphasis on causal inference and mixed methods to enhance inference and understanding.
- Recognizing the disparities in resources and capacity between regions, 3ie launched the WACIE program in 2018 to provide focused support for policymakers and evaluation professionals in Francophone West Africa. One part of the program, the WACIE Helpdesk, responds to demands for evidence from decision-makers with Rapid Response Briefs in French and English, which draw together rigorous evidence on pressing policy needs.
- We have expanded the menu of services we offer at various stages of an organizational program cycle to better "meet partners where they are" in terms of the use of rigorous evidence for decision making. We are also launching a new Evidence for Policy and Learning unit to lead the delivery of these new areas of work, both leveraging and enhancing our core strengths in impact evaluation and evidence synthesis.
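For readers curious what combining Bayes' theorem with process tracing looks like in practice, here is a minimal, purely illustrative sketch of the updating logic: each piece of evidence is assigned a probability of being observed if a contribution claim is true and if it is false, and confidence in the claim is updated accordingly. The helper function, prior, and probability values below are hypothetical assumptions for exposition only, not 3ie's actual tool or numbers.

```python
# Illustrative sketch only: Bayesian updating as used in contribution tracing.
# Each evidence item carries assumed likelihoods P(evidence | claim true) and
# P(evidence | claim false); Bayes' theorem is applied sequentially.

def update(prior: float, p_if_true: float, p_if_false: float) -> float:
    """Posterior probability of the claim after observing one piece of evidence."""
    numerator = p_if_true * prior
    return numerator / (numerator + p_if_false * (1.0 - prior))

# Hypothetical evidence probes: (likelihood if claim is true, likelihood if false).
evidence = [
    (0.8, 0.4),  # a policy document cites the evaluation (moderately probative)
    (0.7, 0.1),  # officials independently describe using the findings (highly probative)
]

confidence = 0.5  # start from an agnostic prior
for p_true, p_false in evidence:
    confidence = update(confidence, p_true, p_false)

print(f"Confidence in the contribution claim: {confidence:.2f}")  # about 0.93
```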
The impact of all of these contributions is evident in the number of impact evaluations and systematic reviews which are now available on the DEP. Evidence gap maps now exist on a range of topics and sectors, allowing policymakers and researchers to quickly get an overview of the evidence landscape. These resources are increasingly being consulted and used, as evidenced by testimonials and page visit data.
And through evolving practice in the evidence community, we can observe our leadership (since the definition of leadership is that others follow). Mixed methods and theories of change have become the norm in the impact evaluation community rather than the exception they were when we began requiring them. Transparency and reproducibility are no longer things we have to argue with lead authors about. Funders and development institutions are increasingly requesting our maps and systematic reviews to inform their decisions.
Are we there yet? Not by a long shot. Including costing as part of effectiveness studies is still a rarity (see our recently published JDeff special issue). New technologies and data sources have just recently become widespread, and their uses and abuses need to be managed carefully. Despite the abundance of accessible and translated evidence, good and timely use of the right evidence to inform decisions is still lagging.
3ie has realized that for evidence access to lead to increased use, we need to invest in long-term partnerships with international development organizations and governments. We need to support local capacity to produce, translate and use evidence, as well as to understand and improve institutional evidence cultures and incentives. We are updating our theory of change and strategy accordingly, and we welcome opportunities to engage with you over the course of this anniversary year!