Since 3ie was established, we have known that we need to learn from evidence to make better decisions, and that doing so means opening the ‘black box’ of project design and implementation to learn what works, what doesn’t, and why. But as the movement for more evidence generation has matured, we have increasingly understood that the research generating that evidence also requires accountability and transparency. While the initial focus was on opening the ‘black box’ of projects, the ‘black box’ of research design, implementation, analysis, and dissemination did not receive as much attention.
As both an evidence producer and consumer, 3ie has been working to change that. Drawing on our own practice and the lessons we have learned along the way, we are launching our renewed Transparent, Reproducible and Ethical Evidence (TREE) Policy. The policy has two primary objectives: 1) ensure the rigor and credibility of 3ie research activities by asking demand-driven questions and mitigating intentional and unintentional scientific misconduct, and 2) ensure 3ie research activities follow the principles of research ethics: respect for persons, beneficence, and justice. We have aimed to incorporate a set of best practices and to provide guidance on which TREE practices are applicable – and in some cases required – for 3ie research activities.
Our work joins that of others seeking to take the ‘con’ out of econometrics and to use transparency as a tool to align research with foundational scientific norms and improve how it is designed, implemented, and shared. As with any process, our work began with understanding the problems before jumping to solutions. We aimed to understand how researcher degrees of freedom and the garden of forking paths can affect our work, and how practices such as actively searching for statistical significance through p-hacking and selective reporting from the ‘file drawer’, combined with journal editors’ preferences, can lead to publication bias and failures to replicate (both broadly and specifically in international development impact evaluations). And that is before we even consider outright manipulation of data and methodology (see the BITSS MOOC for more detail on research credibility).
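To make the mechanics concrete, the short simulation below (illustrative Python only, not part of the TREE policy) shows how reporting whichever of several outcomes happens to clear the significance threshold inflates false positives even when a treatment has no true effect at all.

```python
# Illustrative simulation only: a study measures many outcomes under a true
# null effect and is "written up" if any outcome looks significant.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_studies = 1_000   # simulated studies with NO true treatment effect
n_outcomes = 10     # outcomes tested per study
n_obs = 200         # observations per arm

studies_with_a_finding = 0
for _ in range(n_studies):
    treat = rng.normal(size=(n_obs, n_outcomes))
    control = rng.normal(size=(n_obs, n_outcomes))
    pvals = [stats.ttest_ind(treat[:, k], control[:, k]).pvalue
             for k in range(n_outcomes)]
    # 'File drawer' behaviour: the study is reported if ANY outcome has p < 0.05.
    if min(pvals) < 0.05:
        studies_with_a_finding += 1

# Expected share is roughly 1 - 0.95**10 ≈ 0.40, far above the nominal 5%.
print(f"Share of null studies reporting a 'significant' result: "
      f"{studies_with_a_finding / n_studies:.2f}")
```

Pre-registration and pre-analysis plans counter this by committing, in advance, to which outcomes and specifications will be reported.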
Relatedly, 3ie and other international development organizations have directed renewed attention to research ethics (see, for example, Barnett and Munslow, 2014; DFID, 2016). This year has seen further suggestions to publish an Ethics Appendix and to think more carefully and transparently about the ethical challenges we face, such as understanding what Institutional Review Board (IRB) reviews can and cannot do, assessing appropriate compensation for research participants, and asking ourselves which evaluation methodology is not only feasible, but also ethical and appropriate for the context.
Reflecting on these discussions, we have codified several ethical principles, which can be seen here. Our 2021 TREE policy builds on these principles and is part of a larger set of actions for researchers, funders, and journal editors to not only understand the problems we face, but also apply known solutions and create new ones to improve the credibility of research. The focus for 3ie and other research institutions is on building these solutions into the research workflow – as appropriate – to address researcher-driven challenges to research credibility and rigor (funders and journal editors have other roles to play). By integrating TREE into its workflow, 3ie also aims to engage with policymakers and decision-makers on how they can use TREE to assess available evidence more critically before making decisions.
| Problems we want to solve | Tools to address the problem |
| --- | --- |
| Researcher degrees of freedom, p-hacking, publication bias and the file drawer | Study registration; pre-analysis plans; standardized reporting templates |
| Failure to reproduce analysis | Push-button replication; data sharing |
| Improved oversight of ethical risk assessment and mitigation | Training in protection of human subjects; navigating IRB review; ethics review and documentation |
| Inconsistent communication on and adherence to promises of confidentiality | Informed consent; responsible data stewardship; data de-identification |
Of course, the hard part is operationalizing these evolving principles and best practices so that they address the areas of concern above. Whether the projects we study and the related research activities are complex or simple, the available tools are similar and focus on transparency, reproducibility, and ethics.
3ie has embraced the movement toward improved research transparency since day one and published its first Research Transparency Policy in 2018. As a result, 3ie has demonstrated success where it has integrated TREE elements into its workflow. For example, 3ie committed to making its published analysis computationally reproducible (meaning that the shared data and code allow new users to replicate the published analysis). As presented in a recent working paper, 3ie continues to increase the percentage of its portfolio that is computationally reproducible by integrating push-button replication (PBR) into its workflow before data are shared.
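The idea behind a push-button replication check is simple: rerun the shared code on the shared data and confirm that the published estimates come out the other end. The sketch below illustrates this with a hypothetical project layout; the file names, paths, and tolerance are assumptions for illustration, not 3ie's actual tooling.

```python
# A minimal sketch of a push-button replication (PBR) check, under assumed
# file names: shared code at analysis/run_analysis.py writes output/estimates.csv,
# and published/estimates.csv holds the values reported in the paper.
import subprocess
import pandas as pd

def push_button_replication(tolerance: float = 1e-6) -> bool:
    # Step 1: rerun the authors' analysis exactly as shared.
    subprocess.run(["python", "analysis/run_analysis.py"], check=True)

    # Step 2: compare the reproduced estimates against the published ones.
    published = pd.read_csv("published/estimates.csv", index_col="parameter")
    reproduced = pd.read_csv("output/estimates.csv", index_col="parameter")
    diffs = (published["estimate"] - reproduced["estimate"]).abs()

    mismatches = diffs[diffs > tolerance]
    if mismatches.empty:
        print("Push-button replication succeeded: all estimates match.")
        return True
    print("Estimates that could not be reproduced:")
    print(mismatches)
    return False

if __name__ == "__main__":
    push_button_replication()
```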
But there have been challenges alongside the successes. Since 2018, 3ie has learned several lessons that we have applied to the 2021 TREE Policy revision and to our activities moving forward:
Find balance across TREE practices and research activity requirements. Sometimes being ethical means we cannot be fully transparent or reproducible. For example, making the data and code fully transparent to facilitate reproducibility of the original analysis could increase re-identification risk for research participants, which is why data are de-identified before sharing (a simple sketch of what that can involve follows below). Ultimately, integrating TREE best practices is about improving the credibility and quality of research in order to improve development effectiveness. This goal can be accomplished even if we cannot be fully transparent and/or fully reproducible, but it always requires careful consideration of ethics, responsible data stewardship, and using transparency as a tool to communicate what can be shared, what can’t, and why.
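As a minimal, purely illustrative sketch of that trade-off, the Python snippet below drops direct identifiers and coarsens quasi-identifiers before data are shared; the column names ('name', 'phone', 'age', 'village') are assumptions for the example, not a 3ie schema.

```python
# Illustrative de-identification pass before data sharing (column names assumed).
import pandas as pd

DIRECT_IDENTIFIERS = ["name", "phone", "gps_lat", "gps_lon"]

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    # Drop direct identifiers outright.
    out = df.drop(columns=DIRECT_IDENTIFIERS, errors="ignore")

    # Coarsen quasi-identifiers that could enable re-identification in combination.
    if "age" in out.columns:
        out["age_band"] = pd.cut(out["age"], bins=[0, 18, 30, 45, 60, 120],
                                 labels=["<18", "18-29", "30-44", "45-59", "60+"])
        out = out.drop(columns="age")
    if "village" in out.columns:
        # Replace small geographic units with anonymised codes.
        out["village_code"] = pd.factorize(out["village"])[0]
        out = out.drop(columns="village")
    return out
```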
Solve known problems with known solutions but be willing to experiment. Beyond study registries and data catalogs, including 3ie’s RIDIE, Development Evidence Portal, and Dataverse, the range of tools available for being more transparent and reproducible is extensive – from dynamic documents to CurateScience, DeclareDesign, AidGrade, and the Social Science Reproduction Platform. And while we believe we are narrowing in on standards and tools for transparency and reproducibility, there is still much work to be done to improve how we define, integrate, and monitor the ethical issues and data stewardship needs that arise in our research. This is an ongoing process and will take experimenting with and learning from new practices.
Build TREE into research early in its life cycle. When we released our 2018 Research Transparency Policy, we had not built TREE practices into the early stages of the research life cycle; instead, we required them late in research implementation. In addition, we did not fully integrate the activities that dovetail across transparency, reproducibility, and ethics – such as ensuring that informed consent statements clearly state how data will be de-identified and shared to facilitate reproducibility. We learned that many decisions that facilitate transparency and reproducibility must be made early in, and throughout, the research life cycle.
Build in funding for TREE. We believe that some practices – such as the development of pre-analysis plans (PAPs) – can smooth the costs of conducting research across the life cycle (for example, an interim or final analytical report can draw heavily on the content of a PAP). In addition, conducting independent ethical assessments throughout the research life cycle can mitigate risks and avoid costly course corrections. But there will be a marginal cost to integrating some new practices into how we do business. When we released our 2018 Research Transparency Policy, these practices were not built into the requirements at the funding stage and there was limited funding available to implement them. However, we believe that evidence funders and consumers are as committed as we are to ensuring these tools are integrated into the research workflow.
Build technical support for TREE. Research teams are typically built around specific methods: a strong understanding of identification strategies, statistics, questionnaire design, econometrics, and sector expertise. Depending on team composition, a research team may not have the expertise or time to prepare study registrations, de-identify data, conduct push-button replications, and so on. In addition, even when teams prepare PAPs, IRB materials, and ethics risk assessments and mitigation strategies, these documents are strengthened by independent review that can surface researcher bias or blind spots. Integrating TREE practices into workflows therefore requires dedicated technical support for the core research team.
Our end goal isn’t transparency and reproducibility alone. We aim to produce credible evidence that informs decision-making to improve development outcomes while remaining aligned with foundational principles of ethical research. We see transparency solutions – such as study registration, PAPs, standardized reporting, and data repositories – as tools to improve the rigor and credibility of evidence generation. We see reproducibility as a measurable outcome that signals that the most basic requirements of rigor and credibility have been met. We see ethical principles as guidance not just for what evidence we generate, but for how we generate it in a way that protects research participants and staff from all forms of harm, abuse, neglect, and exploitation. We have expanded our focus on 'counterfactuals' and 'measuring the causal impact of development interventions' to include 'integration of best practices in TREE' as one of the key qualities of rigorous and credible evidence generation.
For more information about the 3ie TREE program, visit our webpage or contact us at TREE@3ieimpact.org.