When a group of UK policymakers was surveyed about what they thought of research evidence, they said it was verbose, dense, full of jargon, untimely and irrelevant for policy. This is not surprising. Getting technical research evidence to seep into policy thinking has always been a challenge. The persistence of a culture of opinion-based policy makes it difficult for researchers to reach out to policymakers.

What, then, is the way forward for systematic reviews? This was the big question that Philip Davies, Deputy Director of Systematic Reviews at 3ie, Kent Ranson of the Alliance for Health Policy and Systems Research, WHO, and Howard White, Executive Director of 3ie, tackled at the final plenary of the Dhaka Colloquium on Systematic Reviews in International Development. The plenary brought to a close a very successful week of deliberations on the use of systematic reviews in international development among 130 researchers and policymakers from 30 countries.

An issue that policymakers often grapple with is ‘How can I get a programme to work in my environment?’ For systematic reviews to provide the answer, they need to drill down to specific contexts. According to Davies, addressing a macro question without examining the micro sub-units of analysis reduces the usefulness of systematic reviews for policymaking. For instance, WASH interventions comprise distinct and complex sub-interventions, and conducting sub-unit analysis is crucial for generating useful evidence for programme managers. “We may be saying something works when it doesn’t because we are looking at a very simple model,” Davies said.

What we need is more evidence that examines the diversity in effectiveness of development interventions. “We need to move beyond just the average treatment effect. Analysis of the variance of effects of an intervention is more important. We need to answer the question ‘For whom does the intervention work and not work?’,” he said.

For systematic reviews to take a rounded approach to effectiveness and serve as a tool for policymaking, they also need to incorporate diverse kinds of evidence on efficiency, implementation and citizens’ experiences of policies, programmes and services.

Kent Ranson recommended a consultative approach for translating issues identified by policymakers into review questions. But the consultative process can get derailed: Ranson cited an instance in which, at the end of a long and elaborate consultation, policymakers and review authors headed off in completely different directions.
The key lesson is that it is important to get the question right. There needs to be ongoing engagement with policymakers and other users throughout the review process to ensure the final product is an appropriate response to the demand for evidence.

Building demand

On the supply side, researchers can work on making systematic reviews a user-friendly and indispensable tool for policymaking. But on the demand side, it is donors and governments who can give the big push forward to the evidence-based policy movement.

Howard White said that development agencies like DFID were doing this by mandating that applications for new projects cite evidence in their proposals. Governments in Mexico and Colombia have also institutionalized evidence-based policymaking by requiring in law that social programmes be evaluated for their impact.

There is also an increasing realization that impact evaluations are not the end game. They are the building blocks for systematic reviews, which generate evidence with higher external validity. The government of Uganda and the Presidency of South Africa have evinced interest in building capacity for conducting systematic reviews, White said.

Building the demand for evidence requires training and capacity building for users and commissioners of research. By bringing researchers and policymakers together, an event like the Dhaka Colloquium has certainly contributed new and exciting ideas for spurring the evidence-based policy movement.

Published on: 19 December 2012
