Much at stake: the importance of training and capacity building for stakeholder engagement in evidence synthesis

Abstract

Systematic reviews and maps are complex methods for synthesising evidence that involve specialist and resource-intensive activities. Systematic reviewers face challenges when attempting to communicate their methods clearly and precisely to end-users and other stakeholder groups. We propose that these challenges are likely to be a key causal factor in the generally low uptake of systematic reviews and maps by policy-makers and practitioners in environmental science and management. We argue that training and capacity building are inherently important components of systematic reviews and maps for all stakeholders: the reviewers themselves, the end-users of specific reviews, and the broader research and decision-making community. Training can help to build capacity for undertaking reviews and maps, and can help to explain complex methods to stakeholders. Training is also important for those wishing to undertake stakeholder engagement activities as part of a review, and it allows researchers and decision-makers to critique systematic reviews and maps on the basis of their methods. Finally, training may be necessary to allow reviewers to prepare visualisations and communication media for presenting the findings of systematic reviews and maps. We conclude that a broad approach, in which every stakeholder engagement activity is viewed as a potential opportunity for training and capacity building, is appropriate both within a specific review and across reviews as a community of practice in evidence synthesis. We call on systematic reviewers to improve networks across disciplines in relation to training, sharing experiences and course content, and ensuring a consistent approach to capacity building in the conduct and use of evidence syntheses.

Background

Systematic review methods were developed within the field of medicine in the 1980s and 1990s [1] in an attempt to improve the evidence base for clinical decision-making. The Cochrane Collaboration was established in 1992 to oversee the production of guidance on systematic review methods and the peer review and endorsement of systematic review protocols and reports [1]. The methods were subsequently adapted for the field of conservation and environmental management [2], and the Collaboration for Environmental Evidence (CEE) was established in 2008 to coordinate standards for environmental systematic reviews; it has endorsed a number of training courses since its establishment (see recent examples in Table 1).

Table 1 Systematic review and map training endorsed by The Collaboration for Environmental Evidence undertaken in 2017 to date

In order to fully understand or conduct a systematic review or systematic map, review authors, researchers, end-users and decision-makers (hereafter included within the term stakeholders; [3]) require detailed and comprehensive knowledge across a suite of research and communication skills. As this skillset is rare, training is a necessary part of the effort to increase adoption of systematic synthesis methods in environmental science and management. We believe that this training gap is likely a key factor in the generally low uptake of systematic reviews and maps by policy-makers and practitioners. Until now, ideas around the use of training have been rather traditional, treating training as useful purely for building capacity among those wishing to conduct a systematic review or map. Such a limited view of the role of training in increasing both the understanding and use of systematic review methods and results ignores the need to continually raise awareness of these methods across all stakeholders. To date, innovative and thoughtfully designed training has not been seen as a priority by the evidence synthesis community, and we propose that, although not traditionally thought of as part of stakeholder engagement, training and capacity building are inherently important components of systematic evidence synthesis.

Currently, guidance from CEE [4] and from the Campbell [5] and Cochrane [6] collaborations does not focus on the importance of training for effective engagement among the different stakeholder groups. This is because such guidance relates to the conduct of single systematic reviews or maps. Whilst training activities may well be linked to a specific review project, a strategic approach to training and capacity building is key to raising awareness and interest, and increasing the uptake of systematic reviews and maps as methods and as a reliable form of evidence in decision-making.

Fundamentally, training and capacity building increase direct and indirect communication among different stakeholder groups engaged with evidence syntheses. The two-way information flow that comes from effective communication can ensure that: an evidence synthesis concentrates on the issues of greatest importance; outputs can be understood by a wider audience; and benefits of evidence-based approaches are clear. These benefits include improved transparency, accountability, and accuracy, and reduced risk in decision-making. These points are all essential for helping to bridge the ‘knowing-doing gap’ that currently prevents the uptake of much applied research in environmental science and conservation [7].

Systematic review and map training challenges

Systematic review and map methods training inherently involves challenges, some of which are particularly apparent when the training is aimed at non-specialists or a non-research focused audience [3]. These challenges include:

  1. Explaining complex concepts in lay terms.

  2. Deciding between overview and methods training.

  3. Explaining relatively abstract concepts without information overload (e.g. critical appraisal and meta-analysis).

  4. Determining when systematic review/map methods are appropriate (resources, timelines, staffing, desired output).

  5. Ensuring that participants appreciate that while robust evidence syntheses require greater resources than informal and ad hoc reviews, the payoff is in the reliability of results.

  6. The need for ongoing training as methods develop and improve.

  7. Making training cost-efficient.

  8. Tailoring training media to the situation (e.g. workshops or written media).

  9. Providing continued support for people who are conducting reviews.

  10. Ensuring an appreciation of the importance of course accreditation by a coordinating body (e.g. CEE).

In the following pages, we outline several types of training courses or efforts and how they can address these challenges.

Training providers

Courses accredited by the Collaboration for Environmental Evidence [8] have been written by trainers with experience in stakeholder engagement in evidence syntheses in the environmental sector. They are designed for a non-research-focused audience and are updated with new methodological developments as they arise. The Campbell Collaboration provides and approves (primarily methods-focused) courses by affiliated trainers and maintains lists of both Campbell-approved and non-approved courses. These include training offered by the EPPI-Centre of University College London, ranging from 1-day workshops to an MSc course in systematic reviews for public policy and practice [9]. Since systematic reviews are well developed in the field of medicine, a wide range of training courses has long been advertised by the Cochrane Collaboration. These include specialised courses, for example on software to support meta-analysis [10]. Most courses are aimed at a research audience, and a stakeholder engagement component is not strongly evident. However, a 1-day course focusing on engaging stakeholders and audiences in research was offered by Cochrane Australia in June 2017 [11]. The Cochrane Collaboration offers training via Cochrane groups such as Cochrane South Asia [12], and also advertises training courses provided by affiliate or independent organisations (such as the York Health Economics Consortium) and academic institutions (such as Columbia University). Despite the wealth and breadth of experience in capacity building and training in all these fields, there has so far been no concerted effort to connect and learn from the expertise in systematic review training across disciplines.

Opportunities to improve stakeholder engagement through training

We identify five broad categories of training across the evidence synthesis process, from question formulation to communication of findings, in which training is important for effective two-way communication among the full range of stakeholder groups (Table 2). We discuss these below.

Table 2 Stakeholder training stages, beneficiaries and descriptions of the purposes of different training opportunities, along with suggestions of suitable training media

Training for stakeholders, education and outreach

Value of systematic review/map methods (point 4 in Table 2)

Many stakeholders wish to better understand the purpose and characteristics of systematic reviews and maps but do not need to be able to conduct a review themselves. In these cases, a basic understanding is likely to be sufficient (Challenge 2). Here, relevant training should provide an understanding of the benefits of systematic methods compared to informal narrative literature reviews, and of the importance of the central tenets of comprehensiveness, transparency, repeatability and objectivity [4,5,6]. There is a general appreciation of the ‘added value’ associated with reviews that label themselves as ‘systematic’, but there is also misunderstanding over what is required to make a review reliable [17, 25]. This kind of training would be suitable for potential commissioners of syntheses along with end-users (policy stakeholders and practitioners) wishing to integrate review findings into decision-making processes. Similarly, reviewers may wish to target end-users with specific training efforts in order to maximise the likelihood that a review's findings will be used. Box 2 summarises a recent training event provided to policy advisors forming part of the European Commission's Science Advisory Mechanism. Such training can help to increase awareness of the limitations of traditional reviews and the benefits of systematic review methods.

Technical critique of review methods (point 5 in Table 2)

Many syntheses call themselves systematic reviews but fail to meet the basic qualifying standards of a systematic review [17] as set out by systematic review coordinating bodies [4,5,6]. Training in how to critically appraise reviews can enable stakeholders to highlight common problems with non-systematic reviews. Tools for the critical appraisal of reviews, such as CEESAT [26], have been published for this purpose; they include assessments of limitations and susceptibility to bias, such as a lack of comprehensiveness and the presence of selection bias and vote-counting [16]. At present, stakeholders may not fully appreciate the potentially fatal characteristics of some non-systematic reviews. Having undertaken training in the technical critique of review methods, participants can recognise and appreciate reliable reviews, justify the resources needed to obtain a higher level of reliability in reviews that follow systematic principles (Challenge 5), and appreciate the value of endorsing reviews with a coordinating body such as CEE (Challenge 10).

Training students in systematic review methods (point 2 in Table 2)

Training university students (undergraduate and postgraduate) in systematic review or map methods is a vital means of raising awareness and educating future decision-makers and researchers about the benefits of systematic approaches to evidence synthesis (see Box 3). Since students may wish to incorporate systematic review methods into their work, it is important to be pragmatic and recognise that systematic reviews or maps may not be appropriate within the restricted timeframes of many students' secondary research theses (Challenge 4). Training in universities may make use of workshops, taught and self-led courses and online resources [27], and represents a mechanism by which training can be provided without the need for a direct funding source (Challenge 7).

Conclusion

Systematic review and map methods are complex and nuanced means of synthesising the available evidence to improve decision-making. Because of their complexity, training is often needed at various stages of the planning, conduct and communication of reviews. Effective stakeholder engagement is a critical component for the success of systematic reviews and maps [3, 13], but to date, stakeholder engagement and training activities have largely been undertaken independently by the evidence synthesis community, and we believe this constrained thinking has limited the uptake of systematic reviews. We propose that every occasion where reviewers engage with stakeholders should be viewed as a potential training opportunity. This would provide a range of benefits, including raising awareness, acceptance and understanding of systematic reviews. We identify five main areas where training of reviewers and other stakeholders can not only build capacity for systematic review conduct but also provide a range of other benefits from stakeholder engagement.

Finally, there are ongoing efforts to improve networking between systematic review methodologists across disciplines (e.g. the Evidence Synthesis Technology Methods Group [28]). We call for similar efforts to connect those involved with training and systematic reviews across disciplines to share knowledge and experiences, improving our collective understanding of best practices in capacity building and raising awareness of the methods and their integration into decision-making. An evidence synthesis methods group that spans disciplines, including actors from CEE, the Campbell Collaboration and the Cochrane Collaboration, is one such opportunity for networking and collaborative exchange. The increasing level of interest in training in systematic review and map methods (see recent examples in Table 1) suggests that we are at a critical time to consolidate and optimise these efforts.

References

  1. Allen C, Richmond K. The Cochrane Collaboration: international activity within Cochrane review groups in the first decade of the twenty-first century. J Evid Based Med. 2011;4(1):2–7.

  2. Pullin AS, Stewart GB. Guidelines for systematic review in conservation and environmental management. Conserv Biol. 2006;20(6):1647–56.

  3. Haddaway N, Kohl C, da Silva NR, Schiemann J, Spök A, Stewart R, Sweet J, Wilhelm R. A framework for stakeholder engagement during systematic reviews and maps in environmental management. Environ Evid. 2017;6(1):11.

  4. CEE. Guidelines for Systematic Review and Evidence Synthesis in Environmental Management. Version 4.2.: The Collaboration for Environmental Evidence; 2013.

  5. The Steering Group of the Campbell Collaboration. Campbell systematic reviews: policies and guidelines. The Campbell Collaboration; 2014.

  6. Higgins JP, Green S. Cochrane handbook for systematic reviews of interventions. New York: Wiley; 2011.

  7. Knight AT, Cowling RM, Rouget M, Balmford A, Lombard AT, Campbell BM. Knowing but not doing: selecting priority conservation areas and the research—implementation gap. Conserv Biol. 2008;22(3):610–7.

  8. CEE. The Collaboration for Environmental Evidence 2017. http://www.environmentalevidence.org/. Accessed 14 Mar 2017.

  9. EPPI-Centre. Courses and seminars: Social Science Research Unit, UCL Institute of Education; 2016. http://eppi.ioe.ac.uk/cms/Default.aspx?tabid=168. Accessed 10 June 2017.

  10. Cochrane training. Learn how to conduct, edit, and read systematic reviews: Cochrane training; 2010. http://training.cochrane.org/. Accessed 10 June 2017.

  11. Cochrane Australia. Cochrane Australia Learning Week: Cochrane Australia; 2017. http://learningweek.cochrane.org.au/. Accessed 10 June 2017.

  12. Cochrane South Asia. Training: The Cochrane Collaboration; 2017. http://southasia.cochrane.org/training. Accessed 10 June 2017.

  13. Cottrell E, Whitlock E, Kato E, Uhl S, Belinson S, Chang C, Hoomans T, Meltzer D, Noorani H, Robinson K. Defining the benefits of stakeholder engagement in systematic reviews. Report No.: 14-EHC006-EF. Rockville (MD); 2014.

  14. Bayliss HR, Beyer FR. Information retrieval for ecological syntheses. Res Synth Methods. 2015;6(2):136–48.

  15. Stewart G. Meta-analysis in applied ecology. Biol Lett. 2010;6(1):78–81.

  16. Haddaway N, Woodcock P, Macura B, Collins A. Making literature reviews more reliable through application of lessons from systematic reviews. Conserv Biol. 2015;29(6):1596–605.

  17. Haddaway NR, Watson MJ. On the benefits of systematic reviews for wildlife parasitology. Int J Parasitol Parasit Wildlife. 2016;5(2):184–91.

  18. James KL, Randall NP, Haddaway NR. A methodology for systematic mapping in environmental sciences. Environ Evid. 2016;5(1):7.

  19. McKinnon MC. Map the evidence. Nature. 2015;528(7581):185.

  20. Bernes C, Carpenter SR, Gårdmark A, Larsson P, Persson L, Skov C, Speed JDM, Donk EV. Effects of biomanipulation on water quality in eutrophic lakes. Stockholm: Mistra EviEM; 2015. Contract No.: EviEM Summary.

  21. EviEM Mistra. Removal of nitrogen and phosphorus in freshwater wetlands. Stockholm: Mistra EviEM; 2016.

  22. Hammar J. Wetlands as nutrient traps. Mistra EviEM; 2016. 4:46 min.

  23. Lankow J, Ritchie J, Crooks R. Infographics: the power of visual storytelling. New York: Wiley; 2012.

  24. Center for Public Engagement with Science & Technology. Communicating Science Workshops: American Association for the Advancement of Science (AAAS); 2017. https://www.aaas.org/pes/communicating-science-workshops. Accessed 10 June 2017.

  25. Haddaway NR, Land M, Macura B. A little learning is a dangerous thing: a call for better understanding of the term systematic review. Environ Int. 2016;99:356–60.

  26. Woodcock P, Pullin AS, Kaiser MJ. Evaluating and improving the reliability of evidence syntheses in conservation and environmental science: a methodology. Biol Conserv. 2014;176:54–62.

  27. Ryan S, Scott B, Freeman H, Patel D. The virtual university: The internet and resource-based learning: Routledge; 2013.

  28. Evidence Synthesis Technology Methods Group. Evidence Synthesis Technology Methods Group; 2017. https://www.researchgate.net/project/Evidence-Synthesis-Technology-Methods-Group. Accessed 10 June 2017.

Authors’ contributions

NRH and JE drafted the manuscript. All authors edited the draft. All authors read and approved the final manuscript.

Acknowledgements

The authors thank Mistra EviEM for covering publication fees.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

Not applicable.

Consent for publication

Not applicable.

Ethics approval and consent to participate

Not applicable.

Funding

NRH is funded by Mistra EviEM.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Corresponding author

Correspondence to Neal R. Haddaway.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Eales, J., Haddaway, N.R. & Webb, J.A. Much at stake: the importance of training and capacity building for stakeholder engagement in evidence synthesis. Environ Evid 6, 22 (2017). https://doi.org/10.1186/s13750-017-0101-3

