Be careful what you wish for: cautionary tales on using single studies to inform policymaking

For a development evaluator, the holy grail is to have evidence from one's study be taken up and used in policy or programming decisions that improve people's lives. It's not easy. Decisions are based on many factors, and the availability of evidence is just one of them. And of course, even when evidence is taken up, that does not mean it will lead to the right decision.

A dramatic example of the complexity of take-up is the political and judicial controversy that has swirled in the Philippines recently. The events could lead to a prominent researcher and senior officials (including a former health minister) being prosecuted precisely for acting on evidence of impact. The problem is that they are alleged to have done so a bit prematurely.

Dengue is a mosquito-borne disease that threatens the health, and sometimes the lives, of millions in many countries, including here in India. In 2014, an article published in The Lancet reported the efficacy of a new vaccine, Dengvaxia, in dramatically lowering the incidence of the disease. The vaccine was approved in 19 countries, including the Philippines. In late 2016, the Philippines procured millions of doses and, over the span of 18-24 months, proceeded to inoculate some 800,000 school children.

Arguably, the original study ticked many of the boxes for success. Its findings were taken up and turned into policy that could improve lives. An added bonus was that the lead author of the article, which had passed the most rigorous of peer reviews, was from a low- and middle-income country, the Philippines. 

But life is seldom so simple. Recent articles in Science and Scientific American document why this study has stirred up such controversy in the country. Criminal charges are now being considered against the researcher and the health officials who sanctioned the programme. While the clinical trials were apparently well conducted, there is doubt about how policies were drawn from them. It appears that the vaccine, while very effective as a prophylactic for those who have already been exposed to dengue, may actually be dangerous for those who have never been infected. Prosecutors may use subsequent studies to argue that policy influence can even be deadly.

It would be a shame if this example discourages decision makers from using evidence. Especially these days, when the standards for evidence seem under attack, rigorous research cannot be neglected. It wasn't long ago that another vaccine study, now discredited, suggested a link between the measles, mumps and rubella vaccine and autism. Although the study has been debunked, it continues to fuel vaccine hesitancy across Europe. This is why it's so important to assess when and how evidence from rigorous studies should be taken up. At 3ie, our team has been using such experiences to draw lessons.

One lesson is the danger of relying on single studies to inform policy. Multiple contextual factors can influence effectiveness. To the extent possible, evidence should be synthesised through rigorous, theory-based systematic reviews that include meta-analysis. 3ie is not only supporting this work but is also facilitating access to evidence through our systematic review repository.

When there is not enough evidence to synthesise, it is important to generate more. For example, 3ie has an evidence programme supporting studies that test innovative approaches to engaging communities to improve low and stagnating immunisation rates across different contexts. Study findings are showing that in low- and middle-income countries there are important practical barriers to immunisation: a lack of information on when and where to get children immunised, inadequate information about immunisation itself, travel costs and long waits at health centres, and a dearth of health staff and services.

A final lesson is the importance of promoting research transparency and replication. The public inquiry into the Dengvaxia episode was prompted by a healthy scholarly debate that included policymakers. A critical aspect is the replication of scientific work. The 'crisis' caused by the inability to replicate some highly influential studies (including in the social sciences) is beginning to change how research is done. 3ie has been supporting this work through its replication programme and its strong commitment to improving transparency.

Hopefully, lessons like these on the use of rigorous evidence will be helpful, especially in a world where facts are so easily distorted.

With input from Radhika Menon.

Authors

Emmanuel Jimenez, Director General, Independent Evaluation Department of the Asian Development Bank (ADB)

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
