Where next in the quest for better evidence — after this year's picks for the Nobel Prize in Economics

The buzz around this year’s Nobel Prize picks in economics continues to stimulate discussion, not least on “where next in the quest for better evidence?” The groundbreaking research by Laureates David Card, Joshua Angrist and Guido Imbens has substantially improved our ability to answer key causal questions, with great benefits to society. But much more needs to be done to move from conceptual advances to improved practice in evidence-informed decision-making by policymakers and program designers around the globe.

After the revered prize was awarded to Esther Duflo, Michael Kremer and Abhijit Banerjee in 2019 for having pioneered the use of randomized controlled trials to study the effects of development policies, this year’s award celebrates an even broader swath of approaches to empirical research now commonplace in the field of development impact evaluation.

3ie, as many readers may already know, promotes the use of rigorous methods and data to answer policy-relevant questions about the effectiveness of development programs and policies. We seek scientifically credible answers to questions about which interventions work, for whom, why, and at what cost. For example, 3ie is currently leading evaluation programs on immunization, peacebuilding, and women’s collective economic empowerment, to name a few.

Just how influential have the methodological contributions recognized by this year’s Nobel been for the field of impact evaluation? To find out, we consulted 3ie’s Development Evidence Portal (DEP), the most comprehensive and searchable repository of rigorous evidence on what works in international development.

As of November 2021, the DEP registered 9,803 impact evaluations, of which 23.7% were classified as ‘quasi-experiments’. Quasi-experimental evaluations use sources of variation in the data-generating process other than random assignment of treatment to infer a causal relationship. In the case of the study of ‘natural experiments’, as recognized by the Nobel committee, the sources of variation are outside the researcher’s control. Examples cited by the committee include differential exposure to an intervention because of a change in policies across states, or because of small variations in date of birth. Similar in spirit to these groundbreaking studies, 37.4% of quasi-experimental evaluations on the DEP used a ‘difference-in-differences’ identification strategy, and 10.5% relied on instrumental variables. The methodological contributions of this year’s Laureates spill over into the ‘experimental’ approach to impact evaluation as well. For example, estimates of Local Average Treatment Effects (LATEs), championed by Angrist and Imbens, appear in experimental evaluations whenever compliance with the original experimental assignment is imperfect.
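To make the two workhorse designs mentioned above concrete, here is a minimal sketch on invented data: a simple 2x2 difference-in-differences comparison, and a Wald (LATE) estimate under imperfect compliance. The variable names, effect sizes and compliance rate are purely illustrative assumptions, not figures from any study in the DEP.

```python
# Illustrative sketch on simulated (hypothetical) data:
# (i) a 2x2 difference-in-differences estimate, (ii) a Wald/LATE estimate.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# --- Difference-in-differences ---------------------------------------------
# Assume the treated group gains +2.0 after a policy change and both groups
# share a common time trend of +1.0.
treated = rng.integers(0, 2, n)           # group exposed to the policy change
post = rng.integers(0, 2, n)              # observed after the change
y = 1.0 * post + 2.0 * treated * post + rng.normal(0, 1, n)

did = (
    (y[(treated == 1) & (post == 1)].mean() - y[(treated == 1) & (post == 0)].mean())
    - (y[(treated == 0) & (post == 1)].mean() - y[(treated == 0) & (post == 0)].mean())
)
print(f"DiD estimate of the policy effect: {did:.2f}")  # ~2.0

# --- Instrumental variables / LATE under imperfect compliance ---------------
# Random assignment z is the instrument; only ~60% of those assigned take up
# the program (one-sided non-compliance), and take-up raises y by 3.0.
z = rng.integers(0, 2, n)                  # random assignment (instrument)
d = z * (rng.random(n) < 0.6).astype(int)  # actual take-up (compliers only)
y2 = 3.0 * d + rng.normal(0, 1, n)

itt = y2[z == 1].mean() - y2[z == 0].mean()         # intention-to-treat effect
first_stage = d[z == 1].mean() - d[z == 0].mean()   # share of compliers
late = itt / first_stage                            # Wald estimator of the LATE
print(f"ITT: {itt:.2f}, first stage: {first_stage:.2f}, LATE: {late:.2f}")  # LATE ~3.0
```

In this simple binary-instrument, binary-treatment case, the Wald ratio coincides with the two-stage least squares estimate: the intention-to-treat effect is scaled up by the share of compliers to recover the effect on those whose take-up was shifted by the assignment.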

And how about the Laureates' influence on 3ie’s own work? Since 2008, 3ie has directly funded, supported and implemented 278 impact evaluations. Of these, 27% are classified as quasi-experimental. The rest, including several co-authored by the 2019 Laureates, fall into the experimental camp.

3ie’s approach to impact evaluation draws on the full range of experimental and quasi-experimental methods, in combination with qualitative methods, process evaluations and cost analysis, to obtain a more complete picture of an intervention’s impact. We think the research question should drive the empirical approach, rather than the other way around, and we recognize the advantages and limitations of different methods. Our experience supporting hundreds of impact evaluations in lower- and middle-income countries has taught us that operational realities, budgetary constraints and ethical considerations are key drivers of the best feasible strategy for causal identification. Misperceptions around these issues persist. More work is therefore needed to unpack how the robustness of study findings, as well as the cost and timeliness of the research, depends on different combinations of (i) the approach to causal inference (experimental, quasi-experimental, observational or descriptive), (ii) the data source(s) (primary or secondary) and (iii) whether the evaluation is prospective or retrospective. The common perception that impact evaluations, and in particular experimental evaluations, are always expensive needs to be challenged: the main driver of evaluation costs is data, especially primary data collection.

The next frontiers for research and action are already being explored. A major challenge is that while methods and data sources are improving and the evidence base is growing exponentially, the understanding of how to use this evidence effectively for decision-making still lags behind. Incentives and institutional cultures need to be aligned so that critical thinking and the use of evidence become the ‘new normal’. In addition, we need to pay more attention to the inherently interdisciplinary and cross-cutting nature of real-world decision-making: policymakers have to make sense of a jumble of evidence on several interrelated questions at once. How different programs and policies interact, and whether they complement each other or work at cross purposes, is a further theme where policy-oriented research would be valuable.

But the main message at this point is that, thanks to the ingenuity of the 2021 and 2019 Nobel Laureates and the contributions of hundreds of researchers and practitioners dedicated to the study of economic development, it is now possible to answer a growing set of policy-relevant questions that have a direct bearing on the well-being of citizens in lower- and middle-income countries.

Authors

David de Ferranti, Director and Head of the Washington, DC Office
Marie Gaarder, Executive Director, 3ie
Sebastian Martinez, Director of Evaluation

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
