
Robust evidence for an evidence-based approach to humanitarian action

Research

By Mike Clarke, Jeroen Jansen and Claire Allen

Mike Clarke is one of the founders of Evidence Aid and is now its voluntary Research Director and Chair of the Board of Trustees. He has extensive experience in the evaluation of health and social care and is Director of the Northern Ireland Hub for Trials Methodology Research at Queen’s University, Belfast. He has worked on some of the largest randomised trials in specific areas of health and on dozens of systematic reviews in a wide range of areas, with a strong interest in increasing capacity for the conduct and use of systematic reviews. He facilitates training on systematic reviews for Evidence Aid.

Jeroen Jansen is the Director of Evidence Aid. He has worked across a wide variety of sectors, issues and working environments, mostly in management positions. He worked for Médecins Sans Frontières (MSF) in Afghanistan, Liberia, Darfur and several other contexts, has managed the deployment of humanitarian aid, served as a department head for MSF in the UK, and has also worked in the private sector. Formerly an engineer, he later obtained a Master’s in International Human Rights Law and worked for Marie Stopes International.

Claire Allen is the part-time Operations Manager for Evidence Aid. She worked for Cochrane for 18 years and has a B.A. (Hons) in Communication, Media and Culture. Claire’s background is in administration and finance. Her main tasks relate to the overall operations of Evidence Aid, including management of information to ensure that evidence is available in a readily accessible format.

The increasing demand for ‘value for money’, proof of impact and effectiveness in the provision of humanitarian aid requires that decisions and activities become more evidence-based. Reliable and robust evidence will help those making decisions and choices, and those developing policies and standards, to know which interventions work, which don’t work and which remain unproven. For those interventions that work, people need to know how effective they are and for whom, so that they can choose the most appropriate and effective intervention in a specific context. Only with reliable and robust evidence is it possible to maximise the impact, efficiency (‘value for money’) and effectiveness of humanitarian action and ensure more good than harm is done.

Evidence Aid1 engages with those guiding the humanitarian sector to inspire and enable them to apply a more evidence-based approach to their decisions and actions. Through this engagement, and by working with others who generate, disseminate and apply evidence, a large and growing audience is revealed that supports a more evidence-based approach to humanitarian action. Momentum is building, similar to what has happened in the healthcare sector since the latter decades of the twentieth century (Clarke, 2015), generating a demand for robust evidence. One of the challenges is that this audience has a wide range of interpretations of what reliable and robust evidence entails.

When it comes to identifying and using findings of research to decide what is likely to do more good than harm, the healthcare sector recognises the need for evidence to come from a synthesis of similar studies, often in a systematic review. These evidence syntheses provide users with a critically appraised summary of the relevant research on a topic, allowing the existing studies to be compared, contrasted and combined to provide the knowledge needed to resolve uncertainties. In this way, systematic reviews provide the vehicle by which evidence from earlier research can be brought together in ways that minimise bias, avoid undue emphasis on individual studies, maximise the power of research that has already been done, and minimise waste from unnecessary duplication or inadequate uptake.

Systematic reviews begin with a focused question and clear eligibility criteria, then seek out and appraise the relevant studies and compare, contrast and, where relevant and possible, combine their findings. They provide decision-makers and others making choices with a summary of the available evidence, which they can consider alongside other information, such as local values and resources, before taking action. An up-to-date systematic review allows well-informed decisions to be taken more quickly and eases the evidence-gathering burden for people who need to take these decisions. The value of systematic reviews is widely recognised in healthcare, and the concept of drawing on the totality of evidence when making decisions is neither new nor outlandish when explained to practitioners, patients, policy-makers and the public. This should be no different for disasters and other humanitarian emergencies (Gerdin et al, 2014). Just as happened in the healthcare sector several decades ago, the growing need and support for an evidence-based approach in the humanitarian sector should result in a growing need for robust evidence and an increase in the investment required to generate and disseminate this evidence.
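
To make the idea of ‘combining their findings’ concrete, the sketch below shows the standard fixed-effect, inverse-variance method for pooling effect estimates from several studies, one common statistical step in a systematic review. It is a minimal illustration only: the three study results are invented for this example and are not drawn from any real review.

    import math

    # Invented (effect, standard error) pairs for three hypothetical
    # studies of the same intervention, e.g. log risk ratios where
    # negative values favour the intervention. Illustrative data only.
    studies = [(-0.25, 0.12), (-0.10, 0.20), (-0.30, 0.15)]

    # Inverse-variance weights: more precise studies count for more.
    weights = [1 / se ** 2 for _, se in studies]

    # Fixed-effect pooled estimate and its standard error.
    pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))

    # 95% confidence interval for the pooled effect.
    low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"Pooled effect {pooled:.2f} (95% CI {low:.2f} to {high:.2f})")

Weighting by inverse variance means that larger, more precise studies contribute more to the pooled answer, which is one way systematic reviews avoid the undue emphasis on individual studies mentioned above.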

Despite growing momentum for an evidence-based approach in the humanitarian sector, progress seems limited by a strong sense of tradition, an antipathy to change and the continuing misapplication of ‘expert opinion’ or ‘best practice’. Key documents in the sector, such as guidelines and policies, continue to be based predominantly on ‘expert opinion’ and ‘best practice’, although this is changing as key influencers that straddle the humanitarian and healthcare sectors, such as the World Health Organization (WHO), emphasise the need to underpin their guidance with systematic reviews. (See, for example, the ongoing work on a guideline for major radiation emergencies (Carr et al, 2016) and a systematic review of accessibility in the home for an upcoming WHO guideline on healthy housing (Cho et al, 2016).) This does not imply an unthinking adoption of evidence synthesis as a recipe for decision-making. There are situations where the use of ‘expert opinion’ or ‘best practice’ is justified, just as an evidence-based approach, based on robust evidence, should only be applied when and where appropriate. Nevertheless, it is important that the humanitarian sector recognises the limits of ‘expert opinion’ and ‘best practice’ and the value of an approach based on evidence synthesis.

Expert opinion is a valuable tool when applied appropriately, but applied inappropriately it can cause considerable harm, getting in the way of effective action or promoting the use of ineffective or harmful actions. William J. Sutherland and Mark Burgman (2015) described the pitfalls of relying on expert opinion in the journal Nature. They assert that the “accuracy and reliability of expert opinions is … compromised by a long list of cognitive frailties (Tversky & Kahneman, 1982). Estimates are influenced by experts’ values, mood, whether they stand to gain or lose from a decision (Englich & Soder, 2009), and by the context in which their opinions are sought. Experts are typically unaware of these subjective influences. They are often highly credible, yet they vastly overestimate their own objectivity and the reliability of their peers (Burgman et al, 2011).” This does not render ‘expert opinion’ useless, but it demands care when engaging experts and an awareness of the tools available to counter these pitfalls.

One example of the appropriate use of expert opinion is in identifying gaps in the evidence base and prioritising the filling of those gaps. Identifying gaps in the evidence is important in the humanitarian sector for minimising unnecessary overlap of activities and waste of resources. Evidence Aid held a priority-setting meeting in 2013, bringing together those who influence and guide the humanitarian sector, and published the output in the journal PLOS Currents: Disasters (EAPSG, 2013).

Thirty high-priority research questions, grouped under ten themes, were identified that could be addressed by systematic reviews on planning for or responding to natural disasters, humanitarian crises or other major healthcare emergencies. Some of these gaps have already been taken up by the Humanitarian Evidence Programme2 and others, and relevant systematic reviews will appear in the coming months and years. As these new reviews are completed, they will be added to the more than 250 systematic reviews that are already freely available from the Evidence Aid resources.3

Another concept frequently applied in the context of decision-making in humanitarian response is ‘best practice’. On The Guardian Global Development Professionals Network page, an anonymous blogger asserts that “best practices are those things that we’ve somehow managed to figure out actually work, and work well”, or that a best practice “is something that we know works and is worth repeating”.4 The variety of definitions of concepts such as best practice is one of the common problems in the humanitarian sector. Potentially even more problematic, however, is that the blogger, like many others in the sector, does not reveal the criteria for something to ‘work’ or how it was established that any particular practice meets those criteria. As with ‘expert opinion’, the application of a transparent methodology to determine what ‘works’ would overcome most of these hurdles, allowing specific interventions, actions and strategies to be shown to be effective or efficient to the satisfaction of decision-makers.

Proper evaluations can provide evidence of the impact of a particular project or practice. Key to this is the application of an appropriate methodology, such as that promoted by the Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP)5. Nevertheless, an evaluation of one project only reveals something about the impact in that particular context and does not allow strong conclusions about the likely effects of a similar project in a different context. Just as the successful treatment of one patient, or the positive findings of one clinical trial, does not provide any certainty that a treatment will work for others, a synthesis of the evaluations of similar projects is needed to determine the likelihood of success in another place and time. Recognition of this in the healthcare sector helped drive the move towards evidence synthesis and systematic reviews as a means of bringing together all the available evidence.
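
Where evaluations come from very different contexts, a synthesis can allow for between-context variation explicitly. As an illustrative extension of the earlier sketch (again using invented numbers, not real evaluation data), the widely used DerSimonian-Laird random-effects method estimates that extra variation and widens the pooled confidence interval accordingly.

    import math

    # Invented (effect, standard error) results from three evaluations
    # of similar projects in different contexts. Illustrative data only.
    evals = [(-0.40, 0.12), (0.10, 0.20), (-0.30, 0.15)]

    w = [1 / se ** 2 for _, se in evals]
    fixed = sum(wi * eff for (eff, _), wi in zip(evals, w)) / sum(w)

    # Cochran's Q measures how much the results disagree beyond chance.
    q = sum(wi * (eff - fixed) ** 2 for (eff, _), wi in zip(evals, w))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(evals) - 1)) / c)  # between-context variance

    # Random-effects weights fold tau2 into each evaluation's variance.
    w_re = [1 / (se ** 2 + tau2) for _, se in evals]
    pooled = sum(wi * eff for (eff, _), wi in zip(evals, w_re)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    print(f"Random-effects pooled effect {pooled:.2f} +/- {1.96 * se_re:.2f}")

The more the contexts disagree, the larger the estimated between-context variance and the more cautious the pooled conclusion, which mirrors the point above: one evaluation in one context is rarely enough on its own.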

Evidence syntheses, or systematic reviews, of reliable evaluations should be a key source of knowledge for all decision-makers who want to answer the question: what is likely to happen if we do this, rather than something else? In the humanitarian sector, decisions, choices and policies affect the health and wellbeing of thousands, if not millions, of people, and those responsible have a duty to ensure that the evidence they use is reliable and robust. This requires the use of appropriate methodologies: first to evaluate humanitarian action, and then to bring together the findings of those evaluations. It will mean that the best possible use is made of what happened in the past to predict what will happen in the future, and will ensure that humanitarian action does what those who fund it and those who implement it want it to do: prevent and alleviate the suffering of people in need in humanitarian and disaster risk reduction contexts.

The next two courses, ‘An introduction to systematic reviews in the humanitarian sector’, led by Mike Clarke, will be held on 16 November in Washington DC, USA and on 30 November in Oxford, UK. The cost is £225 per person. For more information, contact Claire Allen, email: callen@evidenceaid.org or visit http://www.evidenceaid.org/

References

Burgman MA, McBride M, Ashton R, Speirs-Bridge A, Flander L, Wintle B, Fidler F, Rumpff L, Tweedy C. (2011) Expert status and performance. PLoS ONE 6(7): e22998. doi:10.1371/journal.pone.0022998.

Carr Z, Clarke M, Akl EA, Schneider R, Murith C, Li C, Parrish-Sprowl J, Stenke L, Cui-Ping L, Bertrand S, Miller C. (2016) Using the GRADE approach to support the development of recommendations for public health interventions in radiation emergencies. Radiation Protection Dosimetry (first published online 12 August 2016). doi: 10.1093/rpd/ncw234.

Cho HY, MacLachlan M, Clarke M, Mannan H. (2016) Accessible home environments for people with functional limitations: a systematic review. Int J Environ Res Public Health 13: 826. doi: 10.3390/ijerph13080826.

Clarke M. (2015) History of evidence synthesis to assess treatment effects: personal reflections on something that is very much alive. JLL Bulletin: Commentaries on the history of treatment evaluation. J Roy Soc Med 109: 154-63. doi: 10.1177/0141076816640243. www.jameslindlibrary.org/articles/history-of-evidence-synthesis-to-assess-treatment-effects-personal-reflections-on-something-that-is-very-much-alive/

Evidence Aid Priority Setting Group (EAPSG). (2013) Prioritization of themes and research questions for health outcomes in natural disasters, humanitarian crises or other major healthcare emergencies. PLoS Currents Disasters, 16 October 2013, Edition 1. doi: 10.1371/currents.dis.c9c4f4db9887633409182d2864b20c31.

Englich B, Soder K. (2009) Moody experts – How mood and expertise influence judgmental anchoring. Judgment and Decision Making 4(1): 41-50. journal.sjdm.org/71130/jdm71130.pdf

Gerdin M, Clarke M, Allen C, Kayabu B, Summerskill W, et al. (2014) Optimal evidence in difficult settings: improving health interventions and decision making in disasters. PLoS Med 11(4): e1001632. doi: 10.1371/journal.pmed.1001632. journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1001632

Sutherland WJ, Burgman M. (2015) Policy advice: Use experts wisely. Nature 526(7573): 317-8. doi: 10.1038/526317a. www.ncbi.nlm.nih.gov/pubmed/26469026

Tversky A, Kahneman D. (1982) In: Kahneman D, Slovic P, Tversky A (eds). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press, pp. 23-30.
