10 years of research transparency: lessons learned


The 1854 London cholera outbreak prompted Dr John Snow’s famous “experiment…on the grandest scale”, widely cited as one of the earliest known natural experiments. By comparing cholera deaths among households supplied with contaminated water with those receiving a cleaner supply, Snow sought to test his theory (against prevailing wisdom) that cholera is a waterborne disease. But what makes this 19th-century study particularly remarkable is that it is an early example of research transparency. Snow provided the study data (the list of cholera-related deaths) in an appendix “…as a guarantee that the water supply was inquired into and to afford any person who wishes it an opportunity of verifying the result” (Snow, 1859).


In modern times, the ‘reproducibility crisis’ has motivated a movement towards open science. Social scientists are promoting the adoption of practices that improve transparency in research processes and increase confidence in study results. These practices include pre-registering studies, providing access to study materials and data, and running replications.

Within the international development sector, this thinking has resulted in an increased focus on transparency among researchers, funders and NGOs. Reproducibility of results carries a special significance in this sector. As with the open science movement, the focus is on ensuring that studies are designed, conducted and shared transparently, and that their results are robust. In low- and middle-income countries, where data are often scarce, making study data openly accessible ensures better value for money, can inform the design and implementation of policies and interventions, and can facilitate additional research. Finally, given the limited availability of aid funds, ensuring the credibility of evaluation results is a moral imperative.

3ie has had a long-standing commitment to research transparency. Since its inception a decade ago, all 3ie-funded research (data and publications) has been made openly accessible. We have funded replications of innovative, influential and controversial impact evaluations. We maintain a registry (RIDIE) for experimental and quasi-experimental impact evaluations. Additionally, we mandate the submission of protocols and pre-analysis plans (PAPs) for 3ie-funded systematic reviews, replications and impact evaluations. Here are some of the lessons we’ve learned along the way:

  1. Open data: while we’ve always mandated the submission of de-identified evaluation data (and, more recently, code files), compliance with these requirements was not verified. Recently, as part of our quality assurance processes, we conducted a few push-button replications (PBRs) and found that the data and code did not reproduce the published results. Reasons for non-replication ranged from simple errors (incomplete code files or datasets) to more serious problems. Additionally, a few datasets contained identifiers (names and GPS coordinates, for instance) in violation of institutional review board regulations. These findings have led us to conduct PBRs on all our datasets (completed studies as well as those closing in 2018), as this is the only way of verifying results before sharing data for open access; a minimal sketch of such checks follows this list.
  2. Open publications: 3ie’s preference is to make all products (data, code and publications) available online immediately. However, we often receive requests from researchers to embargo evaluation reports or data for various reasons. In the past, these requests were managed on a case-by-case basis, but we have identified the need for guidelines that make this expectation clear and allow any exceptions to be negotiated before grant contracts are signed.
  3. Pre-registrations and pre-analysis plans: while registration of studies has been optional, a few years ago we began mandating the submission of PAPs for the impact evaluations we fund. The reasons are twofold: first, it allows researchers to see what research is ongoing, thereby encouraging collaboration or, at a minimum, avoiding duplication; and second, it limits data mining by ensuring that the proposed analyses are pre-defined. Over the years, our experience has been that the review of, and subsequent discussions on, the PAPs have helped align expectations between 3ie and the research teams and increase transparency in the evaluations and their reporting. Researchers are often apprehensive that committing to an analysis plan will mean they cannot conduct other exploratory analyses. However, our only requirement is that any post-hoc analyses are reported transparently.
  4. Building awareness: given the varying levels of awareness of research transparency among 3ie staff and researchers, we need to invest time and resources to ensure that standards and procedures are clearly defined, and to build awareness both within and outside 3ie.
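The checks described in the open data item above lend themselves to simple automation. Below is a minimal sketch, not 3ie’s actual PBR protocol: it assumes a replication package delivered as a CSV dataset plus a Python analysis script, and the file names, identifier patterns and tolerance are hypothetical illustrations.

```python
# Minimal sketch of two checks: (1) scan a submitted dataset for columns that
# look like direct identifiers, and (2) re-run the submitted analysis script
# and compare a key estimate with the published value. File names, column
# patterns and the tolerance are hypothetical, not 3ie's actual protocol.

import csv
import subprocess

# Hypothetical patterns suggesting personally identifying columns.
IDENTIFIER_PATTERNS = ("name", "phone", "address", "gps", "latitude", "longitude")

def flag_identifier_columns(csv_path):
    """Return column names in the dataset that look like direct identifiers."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        header = next(csv.reader(f))
    return [col for col in header
            if any(pat in col.lower() for pat in IDENTIFIER_PATTERNS)]

def rerun_and_compare(script_path, published_estimate, tolerance=1e-6):
    """Re-run the submitted analysis script (assumed to print a single number
    on stdout) and compare that number with the published estimate."""
    result = subprocess.run(
        ["python", script_path], capture_output=True, text=True, check=True
    )
    reproduced = float(result.stdout.strip())
    return abs(reproduced - published_estimate) <= tolerance

if __name__ == "__main__":
    # Hypothetical file names for a submitted replication package.
    flagged = flag_identifier_columns("deidentified_survey.csv")
    if flagged:
        print("Columns to review before release:", flagged)
    ok = rerun_and_compare("analysis.py", published_estimate=0.125)
    print("Matches published estimate:", ok)
```

In practice, a full push-button replication would compare every published table and figure, but even a basic re-run of this kind catches the incomplete code files and stray identifiers described above.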

In summary, we’ve learned that we need to raise the bar several notches, both in our own internal processes and in the research that we fund. Discussions across our offices crystallized into our research transparency policy. We view the policy as a living document that we will update periodically. In a nutshell, the policy mandates that:

  1. All 3ie-funded research will be pre-registered;
  2. Pre-analysis plans or protocols must be submitted and will be made public when the grant closes;
  3. Data, statistical code and research materials will be made publicly available. We will conduct PBRs prior to sharing data and publications online;
  4. Evaluation reports should provide complete and transparent descriptions of analysis and findings; and
  5. Reports must adhere to appropriate citation standards.

These are ambitious goals, which require the support of staff and researchers. To make the process easier for researchers, we are developing data submission protocols that will streamline PBRs. We are updating RIDIE to make it simpler and more user-friendly. We’ve begun including research transparency as a subject in our workshops with evaluation teams. Finally, we would also like to improve connectivity across all 3ie platforms (such as the repository, the Dataverse and RIDIE), which will help researchers, policymakers, students and others find relevant information across all our products.

In our view, research transparency is a journey and, following in Dr Snow’s footsteps, our vision is to be at its forefront. We do this because it is the right thing to do: on moral grounds, to ensure higher-quality products and to obtain better value for money. In doing so, we hope to join forces with other researchers, funders and implementers who share the same vision.

(With inputs from Benjamin DK Wood, Marie Gaarder and Sayak Khatua)

This blog is a part of a special series to commemorate 3ie’s 10th anniversary. In these blogs, our current and former staff members reflect on the lessons learned in the last decade working on some of 3ie’s major programmes.

Authors

Neeta Goel, Former Senior Evaluation Specialist

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
