3ie’s new Development Evidence Portal: Our expert panel walks you through its features
While we’re not yet in a world where finding the best development evidence is as easy as asking Alexa or Siri a question, 3ie’s new Development Evidence Portal gets us one step closer to that goal. The newly revamped repository of development evidence includes powerful search and filtering functions for its 3,745 impact evaluations, 730 systematic reviews, and 20 evidence gap maps. As part of 3ie’s Virtual Evidence Weeks, our expert panel presented the portal’s features, explained how to use it, and discussed its role in increasing evidence uptake.
The new portal reflects the progress that the development community has made in producing rigorous evidence, according to Marie Gaarder, 3ie’s executive director.
“When 3ie first developed a database for all existing impact evaluations in the development field back in 2009, there were less than 300 studies to be found,” she said. “Fast forward 11 years and the world looks quite different.”
Now the portal is a “gold mine of easily-accessible and freely-available evidence,” Gaarder said. (Some full-text articles published by journals are still behind paywalls, but the summary data on the 3ie repository is free.)
For decision-makers, the portal is especially useful because it draws together studies from a wide set of journals, websites, and other sources. Searching each source on its own could be overwhelming, making it difficult to find relevant, high-quality evidence. The Development Evidence Portal helps solve that problem, said Birte Snilstveit, director of synthesis and reviews and head of 3ie’s London office.
The portal now includes detailed metadata for each evaluation, allowing people to search by sector, keyword, region, or research design. Users can also search only for studies that break out results by subgroup, such as gender.
“We want people to be able to zoom in and focus on exactly the kinds of things they’re looking for,” said 3ie evaluation specialist Mark Engelbert.
Marion Krämer, evaluator and team leader at DEval in Germany, said the portal offers a good way to introduce policymakers who are unfamiliar with impact evaluations to relevant completed studies.
“For a country like Germany that doesn’t have that much experience with rigorous impact evaluation, the usage side is a much easier entry point … than doing impact evaluations yourself,” she said. “Doing your own rigorous evaluations requires a lot of knowledge and a lot of resources.”
Members of the virtual audience seemed to agree. In a poll at the end of the panel, 40 per cent said they would use the portal for their own research, 37 per cent said they would use it to inform program design, and 23 per cent said they would use it to inform a policy discussion. Many of these audience members would be new users: in a poll at the start of the panel, nearly two thirds of respondents said they had not used the portal in the last three months.