One year ago, a group of experts on a 3ie panel agreed that simply producing evidence and data was not sufficient for learning. On Friday, 3ie’s Executive Director, Marie Gaarder, invited those experts back to her virtual table to share what they've learned about how to set up processes within development institutions so that good evidence is not just generated but incorporated into the project planning cycle.
"Between the challenges of quick timelines, rigid systems, cultures that reward approvals and disbursements over learning and ultimate results on the ground, development agencies do not always walk the talk of evidence-informed decision-making," Gaarder reminded the audience and invited the panelists to discuss possible solutions.
"We do need to kind of raise our eyes to the system level," said Alison Evans, director-general of evaluation at the World Bank Group. "[It is important] that evidence generators and evidence users are on the same page about the orientation of the organization … getting to that larger conversation about 'Where are we going? Are we headed in the right direction?'"
Setting up systems to encourage the use of development evidence, in turn, requires looking at a different type of evidence – research on how people learn, said Charlotte Watts, chief scientific adviser and director for research and evidence in the Foreign, Commonwealth & Development Office.
"There is such a firm evidence base for what it takes to actually learn, and we should invest in what it takes to actually learn from the evidence that we have," Watts said. "What it takes is time."
There are tradeoffs between deep, long-term learning processes and quick, one-shot evidence products, said Stacey Young, USAID’s agency knowledge management and organizational learning officer.
"Longer-term holistic learning approaches are more effective but are more time-consuming than discrete pieces of evidence," Young said.
USAID has had success with a team-based, long-term learning approach that helps in-country teams engage with technical evidence, build partnership skills, and navigate power dynamics, Young said.
"The downside is it takes a lot of time, and we've all acknowledged that is a constraint," she said.
The challenge is synchronicity, getting evidence to meet needs as they arise, Evans said. Right now, evidence often takes a long, winding path from an evaluation back into policy, and front-line staff are eager for shorter routes. Part of the solution is getting the timing right, part is packaging evidence in easily digestible forms, and part is giving staff direct access to evaluation experts.
"Even though demand for evidence is in a way constant in the World Bank Group, there is a certain logic to the pace and scale of the project and programming cycle," Evans said.
At Norad, the Norwegian Agency for Development Cooperation, the push to incorporate evidence into decision-making is being integrated into the organization's structure. Director of Knowledge Håvard Mokleiv Nygård has been overseeing the process of building a new division within the organization: the Division of Knowledge.
At the same time, all of the organization's aid is being reorganized into portfolios, each with its own knowledge and learning plan built around a theory of change.
"The theory of change really should give you a whole lot of information, to put it naively, on where you know what you're doing, and where you don't know what you're doing," Nygård said.
In the former case, a standard midterm review might be appropriate, whereas new pilot projects and untested interventions might call for more creative or rigorous evaluation designs.
"It's not just up to each individual case worker, but it's a question of what kind of information does the organization need at the portfolio level?" Nygård said. "Where do we best use which parts of the knowledge toolkit?"
It is also essential to acknowledge uncertainty upfront and build in processes for adaptive learning, said Mark Sundberg, the Millennium Challenge Corporation’s chief economist and the deputy vice president of its department of policy and evaluation.
"There's a lot that we simply don't know," Sundberg said. "Evidence is good, and good evidence is even better, but that's necessary and not sufficient, because there's a lot we simply don't know … these sources of uncertainty really need real-time solutions on the ground."
To win staff buy-in on all these approaches, it is important to stay grounded in the ultimate goals of evidence use, Watts said.
"Why do we care about evidence? Because it's the way that we're going to deliver impacts," Watts said. "We need the more holistic approach to learning that really connects to people's passions for doing better development … People are in development because they want to make a difference."