Dan Levy and Julie Wilson will be conducting a one-week programme on "Using evidence to improve policy and programs" at the Harvard Kennedy School, 22-27 June 2014. This programme is aimed primarily at professionals involved in designing, implementing or funding social programmes. The programme covers impact evaluation but also focuses on other types of evidence. Application deadline: 23 May 2014.
A 3ie-funded impact evaluation of the Chiranjeevi Yojana, a health insurance programme in the state of Gujarat, India, undertaken by Duke University, was cited in an article titled Gujarat experiments with public health insurance, in India Ink, New York Times, 3 April 2014.
The Community of Evaluators (CoE) South Asia is inviting applications for the documentation of innovative design, method, process, tool or technique for evaluation in South Asia. Application deadline: 25 April 2014.
Journal of Development Effectiveness, vol 6, issue 1 is out. The issue features a paper on Conditional, unconditional and everything in between: a systematic review of the effects of cash transfer programmes on schooling outcomes by Sarah Baird, Francisco H.G. Ferreira, Berk Özler & Michael Woolcock.
There's a section on book reviews as well. Howard White reviews three books -- Power: why some people have it – and others don’t, by Jeffrey Pfeffer, Hard facts, dangerous half-truths and total nonsense, by Jeffrey Pfeffer and Robert I. Sutton and The signal and the noise: the art and science of prediction, by Nate Silver.
Agricultural land tenure reforms have been less effective in Africa than in Latin America or Asia, say Steven Lawry and Cyrus Samii in a Guardian op-ed. This is based on a systematic review, The impact of land property rights interventions on investment and agricultural productivity in developing countries. 3ie provided the quality assurance support for this review.
A three-day international workshop on Evaluating Forest Conservation Initiatives was held in Barcelona in December 2013. The workshop brought together about 40 researchers, practitioners and policymakers to discuss the complexities involved in evaluating forest conservation initiatives. Philip Davies, 3ie Deputy Director - Systematic Reviews, was at the workshop. Davies has been cited in this blog by the Center for International Forestry Research.
The Economic Policy Research Institute (EPRI), together with the African Institute for Health and Development (AIHD), is offering a two-week course between 23 June and 4 July, aimed at providing participants with an in-depth understanding of the conceptual and practical issues involved in the development of social protection programmes in Africa. Participants will acquire tools required for the appropriate identification and successful design and implementation of these programmes.
The Evaluation Stories Project has launched an international call for evaluation stories. The purpose of the project is to solicit and publish a collection of stories, told from the perspective of evaluation users, describing how evaluation can be a force for social betterment. The deadline for submission of stories is 5 May 2014.
PEGNet invites papers for a conference on 'Employment strategies in the Developing World - How to create sufficient, productive and decent jobs' to be held on 18 and 19 September in Lusaka, Zambia, in cooperation with the Zambia Institute for Policy Analysis and Research (ZIPAR). Deadline for submission of abstracts: 20 April 2014.
Two papers from the Journal of Development Effectiveness are among the top 25 most read Development Studies papers of 2013 from among a variety of Routledge Development Studies journals. These are An introduction to the use of randomised control trials to evaluate development interventions by Howard White and How to do a good systematic review of effects in international development: a tool kit by Hugh Waddington, Howard White, Birte Snilstveit, Jorge Garcia Hombrados, Martina Vojtkova, Philip Davies, Ami Bhavsar, John Eyers, Tracey Perez Koehlmoos, Mark Petticrew, Jeffrey C. Valentine and Peter Tugwell.
The Center for International Development, University of Bologna, has announced its summer workshop on Planning, Monitoring and Evaluation for Complex Development Programmes, 1-12 September 2014. Register by 31 March to avail of the early bird registration discount. Registrations between 1 April and 30 June will be charged the full registration fee.
"To err is human, but to use the word 'error' in a replication study is usually not divine." Guest post on the Development Impact blog by Annette Brown, Deputy Director, Advancement and Impact Evaluation Services, 3ie, and Benjamin Wood, Evaluation Specialist for Replication, 3ie.
The special issue of The European Journal of Development Research features a paper by Howard White titled Current Challenges in Impact Evaluation.
AfriComNet call for abstracts for the Practicum on Monitoring and Evaluation of Health Communication
The African Network for Strategic Communication in Health and Development (AfriComNet) is calling for abstracts to be presented at the Practicum on Monitoring and Evaluation of Health Communication programmes, to take place 2-4 June 2014 in Accra, Ghana. Submission deadline: 31 March 2014.
Bill Savedoff of the Center for Global Development applauds the achievements of 3ie under the leadership of Howard White. He also says it's time to revisit the original vision for 3ie: that all foreign aid and multilateral agencies should contribute 0.01% of their annual disbursements to 3ie in support of impact evaluation.
3ie-supported impact evaluation of a maternal health programme in Gujarat, India, shows that the much-touted Chiranjeevi Yojana, launched in 2006 to reduce maternal and infant mortality in the state, hasn’t had any significant impact on institutional delivery rates or maternal health outcomes.
Gap maps enable policymakers and practitioners to explore the findings and quality of the existing evidence and facilitate informed judgment and evidence-based decision making in international development policy and practice. This paper by 3ie researchers provides an introduction to evidence-gap maps, outlines the gap-map methodology, and presents some examples.
The Campbell Collaboration Colloquium 2014 will be held 16-19 June at Queen's University, Belfast, Northern Ireland. Online registration is open. Special prices for early bird registrations!
An article by Prof Michael Greenstone of the Massachusetts Institute of Technology in the New York Times cites a 3ie-supported study on reforming the environmental audit system in industrial units in Gujarat.
Howard White was interviewed on the Story of Aid in the Rear Vision programme on ABC Radio.
Howard White writes about the lessons 3ie has learned through the experience of conducting and managing impact evaluations in the past decade in the Impact magazine published by Population Services International.
An article in The Hindu newspaper (dated 31/10/2013) on the 3ie-ASCI conference on Measuring Results.
Howard White, 3ie Executive Director, was part of a discussion on evaluation of social sector programmes on Rajya Sabha TV with Ajay Chhibber, Director General, Independent Evaluation Office, Government of India; Thoriq Ali Luthfee, Minister of Health, Maldives; and Biraj Patnaik, Principal Advisor, Right to Food, in the office of the Commissioners to the Supreme Court. Anchor: Girish Nikam
Two 3ie-supported studies were cited in The Economist. These are Girl power: cash transfers and adolescent welfare. Evidence from a cluster-randomized experiment in Malawi, by Sarah J. Baird, Ephraim Chirwa, Jacobus de Hoop and Berk Özler, and Relative effectiveness of conditional and unconditional cash transfers for schooling outcomes in developing countries: a systematic review, by Sarah Baird, Francisco H. G. Ferreira, Berk Özler and Michael Woolcock.
Girl power: cash transfers and adolescent welfare. Evidence from a cluster-randomized experiment in Malawi
This 3ie-supported study now has a paper in the NBER working paper series.
This study summarises evidence on the short-term impacts of a cash transfer programme on the empowerment of adolescent girls in Malawi during and immediately after the two-year intervention.
Getting children into school is only part of the education battle. We must also ensure they learn once they are there. Blog by Howard White in the Guardian
The University of East Anglia is offering a one-year MSc course in Impact Evaluation in International Development.
The course has been designed for students who are interested in designing and implementing development projects and programmes and/or in researching development effectiveness, and who need to develop and enhance their skills for undertaking high-quality rigorous impact evaluations.
3ie is offering a scholarship for this course.
This 3ie-supported study now has a paper in the NBER Working Paper Series.
This study evaluates the impact of a programme that provides pre-fabricated housing to members of extremely poor population groups in Latin America. Findings show a positive effect on the general well-being of beneficiaries. There is also evidence of improvement in children's health in two countries.
The World Bank, Innovations for Poverty Action (IPA) and 3ie are pleased to announce a research and grant-making programme, Evidence for Peace: Evaluating to Improve Results in Countries Impacted by Conflict, Fragility and Violence (E4P).
The goal of this joint initiative is to enhance the evidence base on development approaches to peace- and state-building challenges and to link this with the policy design and management process to achieve better outcomes.
The three institutions will partner on four areas of work: stocktaking and scoping to determine what we know, don't know and need to know; tailored evaluation tools and methodologies that can be used in fragile and conflict-affected areas; an E4P evaluation series, a grants programme that will implement 10-20 impact evaluations of peace- and state-building programmes; and a final phase, promoting rigorous evaluation and brokering evidence, which will include dissemination of findings to policymakers and practitioners.
Please contact Annette Brown (firstname.lastname@example.org) or Kelly Bidwell (email@example.com) if you would like to support this programme or learn more.
A 3ie study on the impact of daycare in Brazil shows that it has the potential to improve household welfare, especially for the poor. Household income went up (8%), as did the labour supply of the carer (usually the mother). Children also fared better in terms of cognitive and anthropometric outcomes.
This paper by Marie Gaarder and Jeannie Annan argues that it is both possible and important to carry out impact evaluations even in settings of violent conflict, and it presents some examples from a collection of impact evaluations of conflict prevention and peacebuilding interventions. This Policy Research Working Paper is published by the Independent Evaluation Group, World Bank.
The Campbell Collaboration training videos were recorded at the May 2013 Campbell Colloquium. The introductory videos are intended for researchers conducting systematic reviews as well as for policymakers interested in evidence-based policies. The advanced methods / applied topics videos are for researchers carrying out systematic reviews.
Do conditions moderate the effects of cash transfer programs? Preliminary findings from a systematic review
The preliminary findings from the systematic review indicate that both conditional cash transfer (CCT) and unconditional cash transfer (UCT) programmes improve school enrolment: by 23% in the case of UCT programmes and 41% for CCT programmes.
An impact evaluation of the loans offered by Compartamos, the largest microlender in Mexico, shows that loan recipients grew their business revenues and expenses, were happier, more trusting, had greater household decision-making power, and were better able to manage liquidity and risk. However, there was little evidence that the loans had an impact on wealth-building outcomes such as household income, business profits or consumption.
The study adds to the evidence that microcredit is generally beneficial, but not necessarily transformative.
The Research Department of Agence Française de Développement (AFD) has published the following papers in its Ex Post collection: Insuring health or insuring wealth? An experimental evaluation of health insurance in rural Cambodia and Sky impact evaluation, Cambodia, 2010, village monographs.
Ian Ramage and David Levine, co-authors of these studies, will be speaking at the Health Financing in Developing Countries Conference (Financement de la santé dans les pays en développement) on 27 and 28 May at AFD.
This report gathers together examples which attempt to explain how and why change happens as a result of cash transfers (CTs). It first presents a selection of theories of change about how cash transfers are expected to work in general, drawn from academic literature, and then a selection of theories as used in a few specific cash transfer programmes.
Podcast of a presentation by Jyotsna Puri at the 28th Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP) meeting.
In a video interview, Phil Davies talks about the importance of timing for researchers wanting to engage and inform policymakers. He talks about 3ie GapMap as a visual and engaging tool for understanding what is known and what isn't.
In Part 2 of the video, he stresses the importance of physical access, encouraging efforts towards ensuring that policymakers are aware of, and can get hold of, the kinds of evidence that they need to make good decisions.
A DFID guidance note on the best ways to assess evidence in international development. It offers some rules on:
- understanding different types of empirical research evidence
- appreciating the principles of high quality evidence
- considering how the context of research findings affects the way that staff may use them
- understanding how to make sense of inconsistent or conflicting evidence
Findings from a 3ie-funded study on the impact of a conditional cash transfer programme in Nicaragua are cited in this Inter-American Development Bank (IADB) blog.
Theme: What evidence-based development has to learn from evidence-based medicine
Speaker: Chris Whitty
Theme: What we have learned from 3ie's experience in evidence- based development
Speaker: Howard White
Howard White discusses different designs of randomised controlled trials and addresses criticisms of RCTs, which he argues mostly rest on misunderstandings of the approach.
The standard approach to policymaking and advice in economics implicitly or explicitly ignores politics and political economy, and maintains that, if possible, any market failure should be rapidly removed. This NBER working paper by Daron Acemoglu and James A. Robinson argues that this conclusion may be incorrect.
This CGD Working Paper by Tessa Bold, Mwangi Kimenyi, Germano Mwabu, Alice Ng'ang'a, and Justin Sandefur looks at how a Kenyan education programme proven successful in a randomised controlled trial failed to achieve similar outcomes when implemented at scale.
See the blog by Justin Sandefur, Finding what works in development: what is the what?
Esther Duflo discusses a 3ie-supported study on Gujarat pollution control in an interview to Yale Insights.
Markus Goldstein highlights the findings from two 3ie-supported studies -- a systematic review on the impact of daycare programmes and the impact evaluation of Save the Children's early childhood development programme in Mozambique.
"The purpose of this document is to advance the Foundation’s existing work so that our evaluation practices become more consistent across the organization. We hope to create more common understanding of our philosophy, purpose, and expectations regarding evaluation as well as clarify staff roles and available support."
Howard White, 3ie Executive Director, discusses “closing the evidence gap”, randomised control trials, and the value of impact evaluation with Dereck Rooken-Smith of ODE, AusAID.
Louise Shaxson, Research Fellow, ODI on how "Pressure to demonstrate concrete impacts on public policy is encouraging researchers to make grand claims about what we/they are likely to achieve."
3ie finds a mention in the article by Natasha Gilbert. "To aid such reviews, the International Initiative for Impact Evaluation (3ie), a non-profit organization based in Washington DC that funds and conducts aid-assessment research, is setting up a database in which researchers can register studies. Expected to launch later this year, the initiative aims eventually to provide a complete listing of assessments for various types of aid interventions, says Howard White, executive director of 3ie."
Rosalind Eyben, Research Fellow at IDS and Chris Roche, Associate Professor, La Trobe University kick off a discussion on the implications of evidence-based approaches.
Nina Cromeyer Dieke shares tips and lessons the development community can learn from mainstream media. The 3ie policy impact toolkit finds a mention in the story.
Berk Özler on the pros and cons of using surveys to measure impact.
"...So, evaluating a large government program using an unrelated routine government survey may be fine (although I suspect that they too will have biases depending on what the respondents think the survey is for, how large, important, and ‘in the news’ the intervention is, etc.), but evaluating your own experiment that aims to change some behavior by asking study participants whether they have changed that behavior is unacceptable."
How can systematic reviews contribute evidence for policy? Blogs on this page take up the debate on conducting and using systematic reviews.
Tracey Koehlmoos, adjunct professor at George Mason University, Washington DC, and adjunct scientist at ICDDR,B blogs on sessions at the Dhaka Colloquium on Systematic Reviews.
"...Perhaps the most controversial session that I have attended so far was provocatively named “Rapid reviews: opportunity or oxymoron?” 3ie deputy director, Phil Davies presented on “rapid evidence assessment” and their place in the pantheon of evidence synthesis efforts aimed at informing policy making. Serious questions remain about rapid reviews being biased compared to systematic reviews—and how the process would even allow the developers of these reviews to recognize any biases. However, Davies pointed out that “all evidence is probabilistic.” ..."
"This discussion paper aims to support appropriate and effective use of impact evaluations in AusAID by providing AusAID staff with information on impact evaluation. It provides staff who commission impact evaluations with a definition, guidance and minimum standards."
3ie recently participated at an IFAD learning event on impact evaluations for environmental and climate change interventions.
William Savedoff blogs on the challenge that impact evaluation poses for organisations. "...Other times, the concerns reflect an unwillingness to clearly state their goals, be explicit about their theories of change, or put their beliefs about what works to an objective test. Yet, this is exactly what is at stake with evaluation: are you willing to be proven wrong?"
"Potential biases can arise when collecting qualitative data, in deciding which questions are asked, in what order, how they are asked and how the replies of the respondents are recorded.
"There can also be biases in how the responses are interpreted and analyzed, and finally which results are chosen for presentation. Of course quantitative data and analysis is also prone to bias, such as sampling bias and selection bias. But methodologies have been developed to explicitly deal with these biases. Indeed evaluation designs are judged on precisely how well they deal with these biases."