Substantial progress has been made in improving access to education in low- and middle-income countries (L&MICs). However, this progress has been uneven and several challenges remain. Millions of children of primary school age are still out of school, and studies measuring learning outcomes across L&MICs consistently find low levels of learning, with hundreds of millions of children leaving school without basic numeracy and literacy skills. While the Millennium Development Goal on education was to achieve universal primary education, the SDG on education is to ensure inclusive and equitable quality education and promote lifelong learning. In this context, evidence on the effects of education interventions is needed to inform decisions about how limited funding can best be used to achieve the ambitious goal of quality education for all children. An increasing number of systematic reviews and literature reviews have assessed one or more parts of the evidence base, but no review to date has comprehensively synthesised the education impact evaluation literature across interventions and outcomes using statistical meta-analysis.
This is an updated and corrected version of the report, which corrects the errata in the report published on 31 December 2015. The corrections include copy-editing and formatting of text and appendixes. In addition, Adelman 2015 was included in the analyses for user fee reduction in error; it should have been included in the analyses for public-private partnerships. The meta-analyses for enrolment and completion, for which this study reports outcomes, have now been corrected in both chapters. The summary of findings tables and concluding sections have also been corrected to reflect these changes.
To identify, assess and synthesise evidence on the effects of education interventions on children's access to education and learning in L&MICs. The authors also aimed to assess how education interventions affect different sub-groups of participants and to address questions related to context, process and implementation.
The authors included experimental and quasi-experimental studies assessing the effect of an education intervention directed at primary and secondary school children in mainstream education in L&MICs on a measure of school participation (enrolment, attendance, drop-out, completion) or learning (cognitive skills, maths, language arts and composite scores). They aimed to include a comprehensive range of commonly implemented education interventions designed to address one or more barriers to school participation and learning. The analysis in the review was organised by the barriers these interventions aimed to address, namely child-, household-, school-, teacher- and system-level barriers. The authors included published and unpublished literature covering the period between 1990 and June 2015.
They searched a range of electronic academic databases (including ERIC, Econlit and Web of Science), internet search engines, electronic libraries and registries of impact evaluations (including 3ie, JPAL and the World Bank) and theses collections. In addition, they screened the bibliographies of included studies and existing reviews for additional eligible studies and contacted key researchers and organisations working in the education field to identify further studies. They critically appraised all included impact evaluations using standard appraisal tools and extracted effect size data whenever possible, calculating standardised mean differences (SMDs) with standard errors and 95 per cent confidence intervals. Studies were synthesised using random effects meta-analysis, estimating average effects of different education interventions and associated heterogeneity. To identify process, implementation and contextual factors that may explain intervention effects, the authors included qualitative studies, descriptive quantitative studies, process evaluations and project documents linked to the interventions that were evaluated in the included experimental and quasi-experimental studies. For the synthesis of this evidence, they used a thematic approach, organising themes according to the intervention programme theory.
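The two statistical steps described above can be made concrete with a short sketch. The review names the methods (SMDs with standard errors, random-effects meta-analysis) but not the exact formulas; the sketch below assumes the common choices of Hedges' g for the SMD and the DerSimonian-Laird estimator for the between-study variance, which may differ from what the authors actually used.

```python
import math

def smd_hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference (Hedges' g) and its standard error,
    from group means, standard deviations and sample sizes."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Small-sample bias correction turning Cohen's d into Hedges' g
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, se

def random_effects_pool(effects, ses):
    """Random-effects pooled SMD and 95% CI (DerSimonian-Laird tau^2)."""
    w = [1 / se**2 for se in ses]
    fixed = sum(wi * gi for wi, gi in zip(w, effects)) / sum(w)
    # Cochran's Q measures observed heterogeneity around the fixed effect
    q = sum(wi * (gi - fixed)**2 for wi, gi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at 0
    # Re-weight each study by total (within + between) variance
    w_star = [1 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * gi for wi, gi in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

With equal-sized groups, a 5-point gain on a test with SD 15 gives g of roughly 0.33, the same order of magnitude as the larger effects reported below; the random-effects weights shrink toward equality as between-study heterogeneity grows, which is why a handful of inconsistent studies can widen the pooled confidence interval considerably.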
Headline Findings: a summary statement
Programmes typically improve either school participation or learning outcomes, but not both. Cash transfer programmes have the largest and most consistent positive effects on school participation outcomes, but they do not typically improve learning outcomes. Structured pedagogy programmes, on the other hand, have the largest and most consistent positive effects on learning outcomes, but the (few) studies that measure participation outcomes do not suggest a positive effect.
The authors included 420 papers corresponding to 238 different studies, assessing 216 unique programmes. The included studies cover programmes across 52 different L&MICs. These include 59 studies from Sub-Saharan Africa, 38 from East Asia & the Pacific, 87 from Latin America & the Caribbean, 51 from South Asia, two from the Middle East & North Africa and one from Europe & CIS. They estimate that the studies include data from over 16 million children. Across the entire review, 52 per cent of the included studies were cluster-RCTs, 11 per cent used a regression discontinuity design, eight per cent were randomised controlled trials, seven per cent were natural experiments, and the remaining 23 per cent used a controlled before-after study design, with estimation strategies such as difference-in-difference estimation or propensity score matching.
Implications for policy and practice
Most interventions have an overall positive effect on children who were beneficiaries compared to children who did not receive these interventions, although for some programmes average effects are relatively small.
- Cash transfer programmes have the most substantial and consistent beneficial effects on school participation. Effects range from 0.11 SMD (95% confidence interval (CI) [0.07, 0.15]) for enrolment to 0.13 SMD (95% CI [0.08, 0.18]) for attendance, with effects on dropout and completion of a similar magnitude. These are based on a relatively large number of studies. Cash transfers do not, however, appear to lead to any improvement in learning outcomes.
- Other interventions that may be promising for improving school participation outcomes include community-based monitoring, low-cost private schools, new schools and infrastructure, and school feeding.
- Structured pedagogy programmes have the largest and most consistent positive average effects on learning outcomes. These interventions typically include development of new content focused on a particular topic, materials for students and teachers, and short-term training courses for teachers in delivering the new content. The meta-analysis for language arts outcomes shows an overall effect of 0.23 (95% CI [0.13, 0.34]). The effect on maths test scores is slightly smaller in magnitude at 0.14 (95% CI [0.08, 0.20]), but it is still the largest and most consistent effect observed for maths test scores across the review.
- Other interventions that may be promising for improving learning outcomes include merit-based scholarships, school feeding, extra time in school and remedial education.
Finally, the authors find zero or small negative average effects for some interventions across outcomes, for example school-based management, de-worming, and interventions providing materials. The average effects observed for computer-assisted learning lead them to conclude that its effects on children's learning may not be beneficial in all contexts.
Implications for further research
- The authors identify various intervention areas with few studies and suggest it is worth focusing new studies in these areas, particularly those that seem promising. These include remedial education, school-based health programmes (malaria, de-worming, micronutrients), diagnostic feedback, providing information to parents, tracking students by ability, extension of the school day, and different approaches to teacher training and hiring.
- They suggest future studies should use mixed-methods designs to assess the effects of interventions, as well as the process, implementation and contextual factors that influence final outcomes, to help explain heterogeneity and inform future programming.
- Future studies should include data on costs to allow for cost-effectiveness analysis.
- The authors found that many studies did not clearly describe all main study constructs, report methods in detail or clearly report the statistical information necessary to calculate standardised effect sizes. They suggest that a requirement for researchers to follow reporting guidelines such as CONSORT could improve the value of new research and aid future synthesis.