Commentary

Four lessons learned from synthesising humanitarian evidence

Many humanitarians find it hard to gather evidence of what we know and how we know it. For example, what is the impact of different mental health and psychosocial support interventions on populations affected by humanitarian crises? What do we know about the impact of protection interventions on unaccompanied and separated minors during humanitarian emergencies?

The Humanitarian Evidence Programme has synthesised the available research to answer questions like these. A partnership between Oxfam and the Feinstein International Center at Tufts University, the programme published eight systematic reviews of the evidence in key areas of the humanitarian field, with the ultimate goal of improving humanitarian policy and practice.

In addition to the specific insights that emerged in each evidence synthesis, the programme revealed four overarching themes about the humanitarian evidence base and the process of synthesising existing research.

Show me the evidence

As opposed to prospective field research, which often requires crisis-affected populations to narrate their experiences of harm repeatedly for multiple audiences, systematic reviews present an opportunity to learn from the existing evidence base.

As one systematic review author said in an interview, emergency responders and decision-makers in the humanitarian field have neither the time nor necessarily the interest or ability to sift through an unsorted evidence base.

A key advantage of systematic reviews is, therefore, that they follow a rigorous process of synthesising available research, appraising the evidence base, and identifying research gaps.
Think about the politics of evidence

The systematic review process requires authors to make judgments about which bodies of evidence to search and which pieces of research to include in their synthesis based on pre-determined criteria.

Though the process of searching and appraising evidence is rigorous, systematic reviews can obscure their own subjectivity and partiality, and the fact that they still require reviewers to make judgments about what counts as evidence and rigour.

As one systematic review author told us, “people put different meanings behind what is ‘evidence-based.’” Based on pre-established, peer-reviewed criteria for which studies were eligible for inclusion, Humanitarian Evidence Programme reviews ultimately included between 0.1% and 0.7% of the studies initially identified by the search process.

This has led programme advisors to comment on a missed opportunity to learn from research that does not meet the strict criteria, but may still be of value to the field. As one author said, “through being this strict, humanitarian systematic reviews end up being at once rigorous and anemic.”

Definitions matter

The authors of Humanitarian Evidence Programme systematic reviews noted the challenge of defining the term ‘humanitarian,’ particularly in terms of delimiting humanitarian/development and emergency/non-emergency settings. Definitions matter not only because they determine which studies get included in an evidence synthesis, but also because they affect the comparability of interventions in different contexts.

Comparability was also a challenge in how different agencies defined the interventions of interest and the indicators they used to measure them. Measurements differed across programmes and over time, even within the same country or the same agency, making comparisons difficult.

Addressing this issue would require coordination among donors and implementing agencies to develop a common system of indicators, measurements, and thresholds and ensure its consistent implementation.

More thorough reporting of evaluation methods can improve humanitarian evidence

A key lesson that emerged from the Humanitarian Evidence Programme is that much of the evidence included in systematic reviews arose from programmatic evaluations, as opposed to academic studies or peer-reviewed journal articles. As one programme advisor said in an interview, “this is a quick win – if we commission, manage, and sign-off on these [evaluations] a little more strictly, we can make big improvements.” Many of these improvements relate to reporting on the methods of data collection. Evaluations and other programmatic reports should:

  • collect and document sex- and age-disaggregated data
  • report when and where the project under evaluation took place, at a level of temporal and geographic specificity appropriate to the context
  • state when and where data collection took place
  • clarify who collected the data (e.g. programme staff or an external evaluator)
  • describe the type of data collection and the instruments used (e.g. ethnography, survey, interviews)
  • provide information on the sampling strategy (how were populations identified and recruited?)
  • state how many respondents participated in the evaluation or study
  • discuss any limitations or biases that may have affected the results
  • include data on the cost-effectiveness of different interventions, where possible
  • include data on the implementation opportunities and challenges of different interventions, where possible

Humanitarian evidence syntheses not only present an opportunity to reflect on what we know and how we know it. They also require us to reckon with the limitations of existing research and with the barriers to evidence sharing and use that affect decision-making. We will soon publish a journal article on these findings and look forward to continuing the conversation about how researchers and practitioners can contribute to improving the humanitarian evidence base.

The views and opinions expressed herein are those of the author and do not necessarily represent those of Oxfam, the Feinstein International Center or the UK government.