Is it possible to combine capacity development with a rapid synthesis response to an evidence request?
Two of the most important and long-standing challenges for evidence synthesis in international development are: 1) the need to provide timely (i.e. rapid) responses to demands for evidence to inform decisions; 2) developing capacity to do high-quality, policy-relevant syntheses, especially in L&MICs. At present these challenges are typically addressed in isolation. In this post we discuss these challenges and describe an experiment designed to start addressing them together, inspired in part by the growth of citizen science in other fields. We hope you’ll be motivated to join our efforts by the end of it.
Need for rapid systematic reviews and response to evidence requests
A key facilitator of the use of evidence is that it is available when decision makers need it. But systematic reviews, often considered the most appropriate source for knowledge translation, typically take a year or more to produce. As awareness of the value of systematic reviews has increased, so have requests for more rapid synthesis. The ‘sweet spot’ for rapid evidence synthesis is sometimes set at 3-6 months.
One option is to ‘cut corners’ and produce a less rigorous review of the evidence. Another is to try to provide the same level of rigour, but in a significantly shorter time frame through a combination of using technology, process parallelisation and a highly experienced and relatively large team.
Working in this way certainly has potential to produce systematic reviews and respond to demand for evidence more quickly. But it leaves little room for learning by doing and developing the capacity of researchers to conduct systematic reviews.
Developing capacity to do systematic reviews
But there is also a lack of sufficient capacity to do systematic reviews. And there are few systematic review training programmes combining training with opportunities to apply skills in practice. By and large, available training is of relatively short duration, delivered either as dedicated professional development courses or as workshops run as part of an international conference. In our experience of both attending and delivering short workshops, they rarely translate into the sustained skills development necessary to conduct high-quality systematic reviews. The above-mentioned need to deliver research quickly adds to this challenge.
Calls for proposals for conferences and workshops result in internal discussions and a cost-benefit assessment of the likely outcome of joining such activities. It can be nice to travel to new places and attend stimulating talks. But is it worth the environmental, time and financial costs often associated with such events? When the What Works Global Summit (WWGS) call for abstracts came out, we decided to try something different.
In an experiment starting 17 September and culminating in a workshop at WWGS in Mexico City on 15 October, we will attempt to address both of these challenges. Starting with an evidence gap map on intimate partner violence, our goal is to provide a rapid systematic review in response to a request for evidence brokered by the Department of National Planning, Colombia.
We will host three webinars in the run-up to WWGS where we will go through key tasks involved in conducting a systematic review, including data extraction, critical appraisal and meta-analysis. Participants will get a chance to gain hands-on experience through tasks allocated between webinars, with the goal of together completing a rapid systematic review by the time we reach Mexico City.
Citizen science for rapid synthesis response and capacity development – the best of both worlds?
The idea of citizen science has been explored in other fields. Cochrane Crowd is perhaps the most relevant example, and a key inspiration for our effort. It is a global community of volunteers who help to classify research so that Cochrane and other systematic reviewers can quickly identify the evidence they need to answer pressing health questions. Just last year, a challenge organised by Cochrane Mexico brought together 455 volunteers, who identified 10,000 RCTs for Cochrane’s CENTRAL register of controlled trials in only three days.
Can a variation of this approach work for systematic reviews of international development literature? Can we produce a rapid systematic review and at the same time contribute to developing key skills for doing systematic reviews? Join our webinar and virtual crowd to help us test it out!
You can read the details about the experiment and sign up to join our effort here.