Commentary

ALNAP Lessons papers: A case for humanitarian desk-based reviews

This blog was written by Helene Juillard at Key Aid Consulting. Juillard is lead author of the upcoming ALNAP Lessons Paper: Responding to earthquakes, which she produced alongside her colleague Joris Jourdain.

Practitioners and researchers in the humanitarian sector often face a key dilemma: as time and resources are scarce, is it worth investing effort in searching, screening, extracting and analysing secondary data to eventually feed into future programming? Judging by its 2018 Lessons Papers: A Methods Note, ALNAP’s answer would be a resounding yes. As a recent convert to desk-based reviews and author of the upcoming ALNAP Lessons Paper: Responding to earthquakes, so would mine.

For a long time, the humanitarian sector has been trying to produce knowledge and evidence that can easily be shared and transferred in order to improve the quality of aid. ALNAP published its first lessons paper 18 years ago, in 2001, and has shared a number of lessons for responses since. More recently, there have been attempts to transfer and use the systematic review approach. And yet, the journey to evidence-based decision-making remains rife with difficulties.

Key obstacles to evidence-based decision-making

First, decision-makers tend to rely on the judgement of those they trust as much as on any information provided, irrespective of its strength and accuracy.

Nepal Earthquake / IOM

Our habits of thought and inherent biases prevent new evidence from challenging the assumptions we all hold about what will work in a given crisis context. We are, somewhat sadly, path-dependent: we praise innovation and adaptive programming while, in reality, staying within the safe remit of what we think of as our organisational framework for decision-making.

Second, knowledge management is not one of the sector’s strong suits. High turnover, poor connectivity and a lack of time to dedicate to formalising information, along with the difficulty of collecting strong data, make the availability and accessibility of evidence a challenge in crisis settings.

This makes smart and rigorous literature reviews all the more important. In a sector where humanitarians are asked to do more and do better with less, building on existing documented experience may prove an effective and efficient way to improve assistance.

Harnessing lessons from previous responses

So how do we go about doing this? A lot can be learnt from the academic approach to systematic secondary data review. Whilst this is not always the most exciting method to use, it may well be the shrewdest and most effective. Rushing to collect primary data at field level risks incurring high opportunity costs for respondents and wasting limited humanitarian resources when the information may already be available in secondary sources. Yet one should be wary of the “gold standard” trap. To avoid setting yourself up for failure, you need to be flexible about how you translate an academic, originally health-focused approach into one applicable to the humanitarian sector. If not, you might find yourself concluding, 6,000 documents later, that there is no strong evidence about what works and what does not.

ALNAP’s new approach to lessons papers intends to do just that. It supports M&E specialists and researchers in using their professional judgement when deciding on a research question, ensuring it is broad enough to harness useful results from a desk-based review. For the Lessons Paper: Responding to Earthquakes, we posed the following question: Across the project cycle, what lessons can be learnt from sectoral and multi-sectoral humanitarian responses to earthquakes since 2008? This research question was specific and time-bound enough to harness lessons that could be compiled, and yet not too restrictive. By structuring the paper around the steps of the project cycle and using plain language and a field-oriented tone, we aim to encourage practitioners to use the evidence harnessed to inform their future earthquake responses.

As recommended in ALNAP’s Methods Note, quality appraisal in the humanitarian sector should not be used as a way to exclude documentation, but rather as a way to assess the strength of the evidence base. By following this process, we were able to extract data from 65 papers out of the almost 4,000 screened for this paper. We appraised the quality of each study using a quality appraisal template developed specifically for the paper. For each study, we looked at its design, its sampling, the rigour of its conduct, the credibility of its findings, the extent to which it attends to context and is reflexive, and how well it assesses the risks of bias. Unsurprisingly, some studies scored higher than others, but none would have passed the “test” of a quality appraisal conducted for a systematic review.

International Medical Surgery Response Team in Haiti / DVIDSHUB

I believe that the strength of an evidence base does not come solely from the sum or strength of each individual paper. Rather, it comes from finding consistent lessons learnt across contexts, crisis types and/or sectors. It is a way of reaching “data saturation” through a desk-based approach. The final product is not a systematic review, but a rigorous review of documented lessons learned that can inform future programming. To read more about those lessons, keep an eye out for ALNAP’s Lessons Paper: Responding to earthquakes, to be launched on 13 February 2019.