Evidence in humanitarian emergencies: What does it look like?
By Jeremy Shoham and Marie McGrath, Field Exchange co-editors
ENN was invited by Evidence Aid to share our perspective on evidence in emergencies on their online blog. Below is the comment, posted in September 2016: www.evidenceaid.org/evidence-in-humanitarian-emergencies-what-does-it-look-like/
For over 20 years, ENN has published Field Exchange to help achieve our purpose of strengthening the evidence and know-how for effective nutrition interventions in countries prone to crisis and high levels of malnutrition. It shares the experiences of those implementing nutrition programmes in acute and protracted emergencies. Most of these experiences and lessons learnt would not have been captured or disseminated without this type of non-peer-reviewed publication. Yet, for many, Field Exchange and equivalent publications are 'grey' literature, often seen as a source of findings that are 'plausible' but not evidenced. Hence the question: what is evidence and what does it look like, especially in humanitarian contexts?
The few reviews of evidence in humanitarian nutrition programming show that there is very little 'probabilistic' evidence out there. Randomised trials, held up as the 'gold standard' for assessing the effects of interventions, are particularly difficult to mount in humanitarian contexts. They require foresight and investment by donors; early collaboration between pragmatic, creative academics and operational agencies; and long-term commitment to plan, deliver and publish. And many do not come cheap. Even where randomised trials are carried out (mostly in secure settings), they don't necessarily tell us whether something will work in the complex environment and 'ever-shifting sands' of an emergency or, critically, how it works. This is a key uncertainty for programmers looking to adapt and respond to the needs of specific populations and contexts.
There are also challenges around the global coordination of 'robust' research. Research agendas are not shared between research institutions and there is competition for scarce resources, even (shamefully) amongst research groups within the same consortium. The culture around research is often 'secretive' and conflicts of interest are not always apparent.
However, the fact that there is a gap in the evidence doesn't stop programming. Agencies on the ground still need to respond, innovate and adapt, guided by what they know or suspect works, influenced by agency strengths, sometimes driven by donor interests and, sadly, often underpinned by the bureaucratic imperative that the implementing agency must itself survive. The intention, though, is always worthy: to alleviate suffering.
On the positive side, ENN has witnessed (and, through Field Exchange, has captured and disseminated) innovation, programme development and learning. Because ENN is seen as a non-governmental organisation with no vested interest other than to reflect learning, multiple agencies have shared programming experiences through Field Exchange, documenting perceived and measured successes and failures. This provides a collective memory and exchange, and evidence of sorts, of what works and what doesn't. The process of experience capture is cathartic for many; it helps them unpack what happened, enabling both personal and agency reflection and learning. It has also helped identify where urgent 'robust' research is needed. The impact of this collective experience on programming and research has been substantial, not least on ENN's own research and reviews (www.ennonline.net/ourwork).
Should we aspire to more? Absolutely. We need more randomised trials, complemented by theories of change, to help explain how and when interventions are likely to impact nutrition. We also need institutional changes that allow more of these studies to be done in challenging contexts; the current architecture makes such research very difficult. At the same time, we need to continue to capture the kind of 'evidence' provided by practitioners, which is critical to shining a light on programme performance and to identifying where greater institutional coherence and joined-up thinking is needed.
A great example is the 2014 special edition of Field Exchange on the Syria nutrition response, which documented detailed case studies of more than 60 programmes (www.ennonline.net/fex/48). This single publication has significantly influenced international guidance on infant feeding in emergencies, highlighted the need for policies on non-communicable diseases (NCDs) in emergencies, and generated considerable research focus on how to address high levels of stunting in protracted crises.
In conclusion, evidence is not generated only by academic researchers, statisticians and the like, but also by those at the sharp end of programming. Many of those at the sharp end may well need to brush up on their epidemiology, just as many professional researchers may need to familiarise themselves with the complex circumstances of humanitarian crises and with the unique insights of implementers into how the programming they are intimately involved in is playing out on the ground.