
    ‘What works’? Systematic reviews in international development research and policy

    While promoted as a way to boost “evidence-informed” policymaking, systematic reviews require more careful consideration and may even be problematic in some cases, writes Richard Mallett of the Overseas Development Institute.

    By Devex Editor // 01 February 2012

    Although well established in the natural sciences, systematic reviews are relatively new to the world of international development research. But they are being increasingly promoted by the likes of the UK Department for International Development (DFID) and AusAID as an important step in strengthening evidence-informed policy-making amongst aid agencies.

    Recently described by The Guardian’s Ben Goldacre as the ‘cleanest form of research summary’, systematic reviews involve synthesising and assessing all available evidence in order to answer tightly focused research questions, usually on the outcomes or impacts of specific programmes. If done well, they are considered to be ‘the most reliable and comprehensive statement about what works’ in terms of programme effectiveness. This is particularly important given the current demand for donor governments to demonstrate value for money in an economic climate defined largely by austerity and cutbacks.

    The idea behind systematic reviews is hugely appealing: researchers follow a carefully designed review protocol, identify relevant studies from a broad range of sources and grade their quality against pre-determined scales, before synthesising their findings and drawing an objective conclusion about programme effectiveness. In theory, they should help decision-makers identify ‘what works’ in generating positive outcomes for beneficiaries – and therefore have the potential to shape future spending choices.

    However, a new briefing paper by the Secure Livelihoods Research Consortium (SLRC) suggests that systematic reviews may not be all they’re cracked up to be. Drawing on researchers’ shared experiences of conducting eight systematic reviews into the impacts of a range of development interventions – from cash transfers to school feeding – the paper identifies a number of ways in which this approach may become problematic, suggesting that their use within international development research demands more careful consideration than has perhaps so far been the case.

    Practical constraints

    While the theory sounds good, in practice systematic reviews are not straightforward. In addition to being heavily resource-intensive exercises – both in terms of cost and time – one of the major problems relates to researchers’ ability to objectively identify and retrieve all relevant evidence. The experiences of some SLRC and ODI researchers suggest that a great deal of the literature on intervention impact in developing countries is located beyond peer-reviewed journals, which means manually ‘hand-searching’ institutional websites – a more subjective practice – becomes just as important as, if not more important than, plugging pre-defined search terms into academic databases. Identification of relevant material is further complicated by the remarkable prominence of vague, unclear study titles and abstracts within the development studies literature (something noted by Duncan Green last year).

    Thus, although systematic reviews are considered objective and rigorous, there is no guarantee that they – or rather the individuals conducting them – will successfully identify every relevant study, meaning that subsequent conclusions may only partially reflect the true evidence base.

    What evidence counts?

    Although understandable, the desire to assess evidence against uniform quality scales (such as the one described by Lawrence Sherman and colleagues) is both concerning and problematic. Systematic reviews tend to privilege one kind of method over another, with full-blown randomised controlled trials (RCTs) often representing the ‘gold standard’ of methodology and in-depth qualitative evidence not really given the credit it deserves. In addition, considerations of political economy, social relations and institutions are essential to understanding why particular interventions work in particular places at particular times. But by privileging research that aims to measure impact by introducing laboratory-like conditions in the field – effectively abstracting the intervention from its context – systematic reviews do not necessarily help us understand these important mediating factors.

    The future

    Systematic reviews can ultimately add value to development research: in addition to reducing researcher bias and increasing rigour, they place empirics centre-stage within literature reviews. But their strengths must be balanced against a number of practical and fundamental limitations, such as those outlined here.

    Perhaps most importantly, we need to avoid turning discussions on the use of systematic reviews in international development – and indeed the social sciences more broadly – into an overly technical niche topic that only speaks to reviewers themselves. There are far broader implications of the systematic review debate which should be of interest to a much wider audience.

    For example, the growing prevalence of systematic reviews opens up a number of questions about the nature and process of evidence-building; about how we, as researchers, reach and construct narratives on programme impact, and how decision-makers engage with those narratives. Their use also reminds us that questions of impact and effectiveness should always be driven by empirical evidence – not by anecdotes and received wisdom.

    The big point in all of this is as follows: while it is now fairly well recognised that good international development policy relies, to a large extent, on good research and evidence, in order to reach a sound conclusion about ‘what works’ in terms of programme effectiveness, we must first wrestle with the question of ‘what works’ in evidence-building. Systematic reviews may well have a central role to play, but they must be handled with sense and diligence.

    Republished with permission from the Overseas Development Institute.
