
How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: a study protocol

Abstract

Background

Using evidence-based interventions (EBIs) is a basic premise of contemporary social services (e.g., child and family social services). However, EBIs seldom fit seamlessly into a specific setting but often need to be adapted. Although some adaptations might be necessary, they can cause interventions to become less effective or even unsafe. The challenge of balancing adherence and adaptations when using EBIs is often referred to as the adherence and adaptation dilemma. Although the current literature identifies professionals’ management of this dilemma as problematic, it offers little practical guidance for professionals. This research aims to investigate how the adherence and adaptation dilemma is handled in social services and to explore how structured decision support can impact the management of the dilemma.

Methods

The design is a prospective, longitudinal intervention study with a focus on the feasibility and usefulness of the structured decision support. The project is a collaboration between academic researchers, embedded researchers at three research and development units, and social service organizations. Multi-method data collection will be employed. Initially, a scoping review will be performed, and the results will be used in the development of a structured decision support. The decision support will be further developed and tested during a series of workshops with social service professionals. Different forms of data—focus group interviews, questionnaires, and documentation—will be used on several occasions to evaluate the impact of the structured decision support. Qualitative and quantitative analyses will be performed, and usefulness for practice will be prioritized throughout the study.

Discussion

The study will contribute knowledge on how the adherence and adaptation dilemma is handled and experienced by social service professionals. Most importantly, the study will generate rich empirical data on how a structured decision support impacts professionals’ management of adherence and adaptations. The goal is to produce more strategic and context-sensitive implementation of EBIs in social services, which will increase value for service users.


Background

Provision of good-quality social services, which are equally distributed to all citizens in need, is a common goal for decision-makers and service providers [1]. Using evidence-based interventions (EBIs) is considered one way to achieve this [2]. Although there is an ongoing discussion within social service research and practice concerning what kind of knowledge should count as evidence and how evidence should reach social workers [1, 3], there is a clear expectation that social service practice should be based on evidence [4, 5]. In Sweden, this is mirrored in a national policy of basing social services (i.e., social support to children and families, individuals with disabilities, people with substance abuse problems, and older people) on evidence, together with professionals’ experience and clients’ preferences [6].

However, using EBIs in social services has proven to be challenging. Two interlinked problems have emerged. First, professionals in social services struggle to determine how to use their own specific expertise, as well as clients’ experiences and preferences, in relation to evidence and EBIs [7,8,9]. Although scientific models suggest how this can be done in theory (e.g., [10]), practice lacks clear, hands-on guidance on how different knowledge sources are best combined when making decisions [7, 11, 12]. Second, EBIs seldom fit seamlessly into a specific setting [13]. The context in which an EBI was developed and tested often differs in substantial ways from the context in which it is applied. For instance, workforce, provider capacity, staff training, welfare systems, resources, and service user characteristics can differ [14]. These misfits between contexts mean that EBIs often need to be adapted to fit the new context [15, 16]. Most EBIs used in social services will be adapted to various degrees [17]. Studies from other areas show that adaptation is the rule rather than the exception [18]. An accumulation of findings has shown that between 44 and 88% of users adapt the procedure, dosage, content, format, and/or target group when using EBIs (e.g., [19,20,21,22]).

Although adaptations of EBIs can have beneficial effects for clients, and might even be necessary to implement an EBI, adaptations are not unproblematic. Adaptations can cause an EBI to be less effective or even unsafe [23]. In fact, high adherence has been related to better client outcomes in a number of studies (e.g., [20]). Adaptations can also introduce unwanted variation in the services provided to clients, both relative to the original EBI and between different service users. Thus, adaptations might increase the risk of inequalities in service provision [24]. The challenge of balancing adherence and adaptations when using EBIs is often called the adherence and adaptation dilemma [25]. At heart, this dilemma concerns how evidence can be applied in a specific setting. More specifically, it concerns the extent to which an EBI needs to adhere to its original version and the extent to which purposeful changes to the EBI, in response to constraints and possibilities in the local setting, are acceptable or even desirable.

Literature on adherence and adaptations

Most of the literature on the adherence and adaptation dilemma is conceptual or theoretical, building convincing arguments either for the importance of high adherence or for the pressing need for local adaptations [23, 26]. As described above, some empirical studies also show that adaptations are common in practice (e.g., [13]). Fewer studies examine why and how adaptations are made in practice, although those that do suggest that issues of adherence and adaptation tend to be handled in an ad hoc way [19, 27].

In the literature, we identified five main ways of managing the adherence and adaptation dilemma that can be problematic for the outcomes of an EBI. First, adaptations often seem to be conducted reactively, as an impulsive response to constraints in the context (e.g., time pressure). This is problematic because research shows that proactive adaptations, based on a thoughtful analysis of the EBI, the context in which it was developed and tested, and the context in which it will be implemented, tend to give better client outcomes [19, 22]. Second, decisions about adaptations are often made in response to practical constraints (that is, to get the EBI in place) rather than to provide a better fit with the target group [13]. Third, decisions are often made by individuals rather than based on discussions in work teams [28]. This increases the risk of unwanted variation and, ultimately, inequalities in the provided services. Fourth, decisions regarding adherence and adaptations are often made without careful consideration of how the adaptation will affect the outcomes for clients [22]. Finally, adaptations are often made without consideration of how they will affect the EBI’s core components—that is, without ensuring that the active ingredients or core elements that make the EBI effective remain intact [28].

Although the literature identifies a number of problems related to how the adherence and adaptation dilemma is approached in practice, it offers little practical guidance for professionals’ management of it. A call has been made for practical tools for managing the adherence and adaptation dilemma [29], emphasizing the need for solutions in the area. The current study is a response to that call. The study makes use of Lee et al.’s [29] Planned Adaptation Model and Hasson and von Thiele Schwarz’s Useful Evidence Model [30]. The Planned Adaptation Model was developed to support public health professionals in identifying core components in an EBI and determining differences between their own setting and the setting in which the EBI was originally developed. Although this model can provide some guidance for professionals, it might not be directly applicable in a social service setting. First, it is directed toward large public health programs, focusing on understanding population characteristics rather than main concerns in social service settings, such as differences among organizations (e.g., in resources, size, and staff education levels). Second, and most importantly, the model does not give guidance on how to conduct proactive, goal-oriented adaptations instead of reactive responses to practical constraints. Hence, the model is probably insufficient to solve the five problems in managing the adherence and adaptation dilemma discussed above. The Useful Evidence Model [30] adds to Lee et al.’s model by acknowledging a broader set of contextual factors and by giving hands-on guiding principles for collaboratively managing adherence and adaptations in a work team. This model was developed in cooperation with practitioners from mental health care and pediatric care but remains to be tested in, and potentially modified to fit, social service settings [30].

Cognitive and emotional demands among professionals

The lack of practical support for social service professionals’ management of the adherence and adaptation dilemma [30] means that they are left on their own with the dilemma, which can act as both a cognitive and an emotional stressor [31]. The literature is scarce regarding cognitive and emotional demands among professionals dealing with the adherence and adaptation dilemma. It has been highlighted that this dilemma causes significant tension for professional groups with a deep respect for scientific principles [25], but we have found no empirical studies that directly address this. A related finding, however, is that rule-practice gaps (e.g., when professionals are unable to act according to legal requirements, feel forced to break rules to deliver high-quality service, or must act according to guidelines that are misaligned with their professional values) are a source of ethical distress for staff [31, 32]. In terms of cognitive demands, clinical guidelines have the potential to both off-load and increase cognitive demands, depending on how well the level of detail matches professionals’ needs [33]. Both too much and too little guidance tend to be problematic, which, in reference to the adherence and adaptation dilemma, might reflect the balance between adherence to specific protocols and leaving too much room for adaptations without sufficient guidance. In addition, practitioners who receive support to adopt an EBI (possibly including support to manage the adherence and adaptation dilemma) report better well-being than those who are not supported [34]. Thus, there is a need for studies investigating the relationship between the adherence and adaptation dilemma and emotional and cognitive demands among staff.

In sum, the adherence and adaptation dilemma is an inevitable part of the implementation of EBIs in social services, and no practical support is currently available to help social service professionals manage this dilemma. Thus, there is a need to explore this dilemma in social service settings, especially regarding how to support professionals in managing it in daily professional practice.

Aim and research questions

The aim of this research is to investigate how the adherence and adaptation dilemma is handled in social services and to explore how a structured decision support for social service professionals impacts how the dilemma is managed.

The following research questions (RQs) will be addressed:

  1. How is the adherence and adaptation dilemma managed?

     (a) How is the dilemma related to social service professionals’ experiences of cognitive and emotional demands?

  2. How does structured decision support impact how the adherence and adaptation dilemma is managed? The focus is on:

     (a) The EBI: Does it support professionals in identifying core components?

     (b) The context: Does it support professionals in identifying differences between their context and the context in which the EBI has been used?

     (c) The adaptations: Which adaptations are made, and to what extent do these align with the EBI’s goals?

     (d) The knowledge sources: How are professionals’ expertise and consideration of clients’ needs expressed in the process?

     (e) The support: Does the support decrease social service professionals’ experience of cognitive and emotional demands?

  3. How does participation in research-practice collaboration affect social service professionals’ attitudes toward research and scientific knowledge? Do they experience more possibilities to impact social services research?

Theoretical approach

The project takes as its starting point the notion of professionals’ discretion and how it is exercised when general knowledge (evidence) is used in a specific work situation and context. Social service professionals have varying degrees of autonomy and power to decide how to carry out tasks in daily practice. To understand the tension that professionals in the welfare sector often deal with, we make use of Lipsky’s theory of street-level bureaucracy [35]. On the one hand, welfare organizations provide highly scripted regulations and goals to the professionals. On the other hand, professionals also need to improvise and be responsive to individual cases. They need to navigate these tensions within the boundaries of their professional values and the available organizational resources. According to Lipsky’s theory, professionals in welfare organizations often lack the necessary resources (e.g., time, information) to provide the highest quality services to each client. Professionals manage this by creating routines and psychologically simplifying both clients’ problems and their environment. Thus, this theory will offer valuable insights into the discretion of social service professionals and how it is exercised when dealing with the adherence and adaptation dilemma.

Methods

Participatory approach

One of the study’s basic premises is that it should meet the needs and resources of the social service organizations involved, produce relevant and actionable findings, and be of good quality to add to the scientific literature [36]. Thus, the proposed design will be discussed and further developed in interaction with the local stakeholders as the project unfolds. In doing so, the project builds on a participatory research approach (e.g., [37]).

In contrast to a traditional approach, in which researchers first produce research evidence and then disseminate and implement it in practice, we use a constructive approach to knowledge development [37]. With this approach, knowledge is assumed to be developed through interactions between various stakeholders, resulting in knowledge that is more relevant in practice. It is also believed to facilitate dissemination and implementation of the findings, because the barriers to implementation have been reduced in the early stages of knowledge development rather than being addressed at a later stage [36].

More specifically, the project is a collaboration between academic researchers (HH, UvTS, HG), embedded researchers at three research and development (R&D) units (ÅHR, GA, HU), and social service organizations. In Sweden, social services are the responsibility of the local municipalities. The social services include social support to children and families, support for individuals with disabilities or substance abuse problems, and eldercare (including nursing homes and home help services). The R&D units operate in specific geographical areas and have established relationships with the local social service organizations. Thus, they have knowledge about the service providers’ needs and resources.

Study design

This is a prospective, longitudinal intervention study that will be conducted in regular social service practices rather than a controlled research context [38]. The focus is on understanding how the adherence and adaptation dilemma is handled under ordinary conditions and on exploring whether and how this can be improved with a structured decision support. Although this is an intervention study, the objective is not to conduct an effect evaluation. Instead, we follow recent calls in evaluation science to start evaluations based on the needs of the practice, focusing on usefulness first and conducting effect evaluations with a stricter set of requirements later (e.g., [38]). Thus, in addition to answering the research questions, this project might also produce an intervention that can be further tested in future research.

Literature review

A literature review on the adherence and adaptation dilemma in social services will be conducted as a first project activity. The findings of the review will be incorporated into the decision support. A scoping review methodology has been chosen, given the immaturity of the field [39]. This methodology is valuable for exploring the kinds of research that have been done in a field; unlike in systematic reviews, the question is often wider, and studies with all types of designs are included. The review will follow the steps outlined by Armstrong et al. [39], and the analysis will be descriptive (qualitative) rather than statistical. Relevant publications will be identified through searches of electronic databases, reference lists, and key journals. Inclusion criteria will be developed during the process rather than determined beforehand. Information from included studies will be extracted and organized thematically. Themes will be developed inductively from the empirical data during the analytical process.

The intervention

The intervention is a structured decision support that helps social service professionals manage adherence and adaptations in their daily practice. The structured decision support is based on the Planned Adaptation Model [29] and the Useful Evidence Model [30]. In addition, the results of the literature review will be incorporated into the decision support.

Based on these models, the structured decision support builds on four guiding principles for managing adherence and adaptations: (1) adaptations of any EBI should be carefully planned, with the value for service users as a goal (rather than, e.g., organizational constraints); (2) adherence and adaptation are not opposites but can co-occur (i.e., by explicating how flexibly the EBI’s core components can be applied); (3) the amount and type of research on the effectiveness of the EBI impact how it should be managed (e.g., clear evidence showing better outcomes when the EBI is used with high adherence indicates that adherence is motivated); and (4) adaptations to the local context might also be needed for an EBI to have a chance to function and produce positive outcomes for the service users.

As a first step, manuals and worksheets will be developed to form the structured decision support. These materials will be pilot tested with social service professionals and modified if needed. After the piloting, a number of workshops will be conducted to guide the professionals in the use of the structured decision support (see below). An electronic decision support system might also be developed and tested if interest and need among the social service organizations exist.

Participants

The three R&D units operate in different geographical areas, together covering approximately 500 social service organizations in the Stockholm area. These social service organizations are the population from which the participants will be recruited. The R&D units already have established channels to these organizations.

All of the social service organizations will receive information about the project from the R&D units, and organizations that are about to implement an EBI will continuously be invited to use the structured decision support in a series of workshops. These workshops constitute the backbone of the study and are where the data will be collected.

EBIs will be defined as methods or programs that have been shown to be effective in scientifically rigorous evaluations. Nevertheless, we will not exclude any organization based on the level of evidence for the interventions, as the level of evidence for existing interventions varies considerably between different areas of social services, and we aim to contribute to practice as it is. Moreover, the aim is that participation will build capacity for managing adherence and adaptation beyond the specific intervention at hand, thus making the knowledge reusable. Furthermore, we will not exclude any type of social services. Thus, the study covers all areas of social services in which EBIs are implemented (e.g., support for children, families, individuals with disabilities or substance abuse problems, and older people, including nursing homes and home help services). This means that we will include multiple social service organizations that differ in the types of services provided, the size of the geographical areas they serve, and the characteristics of their service users. This design will increase the chances that the results will be transferable to other organizations, thus increasing the generalizability of the findings.

The participating social service organizations will be asked to identify individuals who are likely to be involved in the management and use of the particular EBI (e.g., managers, professionals, individuals responsible for a certain EBI) for participation in the workshop. This should include those who will be working with the EBI as well as people who know the context in which it will be applied and those with the mandate to change the contextual factors, if needed.

Based on the total number of social service organizations in the area and our previous experience with conducting similar methodological workshop series, we estimate that each R&D unit will conduct two to three series of workshops, resulting in a total of 6–9 workshop series during the project (excluding pilot cases). In each series, we anticipate that 5–7 work teams will participate, with approximately 3–5 individuals on each team. This would result in 30–63 work teams and 90–315 individuals included in the analysis. This number is well above the number of teams (the level of analysis) needed for the planned multilevel analyses (i.e., around 20 teams) [40]. The final number of teams and workshops will be determined jointly by practice and academic partners, based on the demand from the social service organizations, the number of work teams participating in each workshop, and the variation in the types of EBIs that they plan to implement.
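
As a rough sanity check of these ranges (an illustration only; the final figures will be settled with the practice partners), the expected numbers of teams and individuals follow directly from the planning assumptions above:

```python
# Rough sanity check of the recruitment estimates stated above (illustrative only).
# Assumptions taken from the protocol: 3 R&D units x 2-3 workshop series each,
# 5-7 work teams per series, and 3-5 individuals per team.
series_low, series_high = 3 * 2, 3 * 3                      # 6-9 workshop series in total
teams_low, teams_high = series_low * 5, series_high * 7     # 30-63 work teams
people_low, people_high = teams_low * 3, teams_high * 5     # 90-315 individuals

print(f"Workshop series: {series_low}-{series_high}")
print(f"Work teams:      {teams_low}-{teams_high}")
print(f"Individuals:     {people_low}-{people_high}")
```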

Workshops

During the workshops, the participating work teams will be offered hands-on support in using the structured decision support, and data for the research will also be collected (see below). Each work team will be offered the chance to participate in one to three workshops; the exact number will be determined after piloting the decision support. With guidance from the structured decision support, the work teams will initially identify benefits and risks of the EBI they intend to implement. They will also identify how relevant outcomes of the EBI can be measured. In the next step, the work team will identify the core components and outline the program logic of the EBI. Moreover, they will work with identifying key contextual characteristics of both the context in which the EBI was previously used and the context in which it will be used, along with identifying previously used implementation strategies for the EBI. Once the EBI, context, and implementation strategies have been explicated, the focus will be on deciding what needs to be adhered to and determining what can be adapted in the EBI. This step can help to identify adaptations needed for the EBI to work in a specific setting. This will result in a prototype that can be tested in practice and further adapted if needed. The decision support emphasizes careful documentation and evaluation of the testing, which will also be emphasized in the workshops.

The pedagogy of the workshops will be based on the theory of experiential learning [41]. The goal is to offer a good learning environment to increase the transfer of learning from the workshops to the workplace. This implies that concrete experiences are reflected upon, which will advance the understanding of theoretical concepts relevant to personal experience. Furthermore, an advanced understanding of the theoretical concepts will be translated into new actions, leading to new personal experiences [41].

Another pedagogical aspect is peer learning across participating units, because several work teams (max. 10) will participate in one series of workshops. Because the groups can choose which EBI they will focus on, different groups might be working with different EBIs, which will further increase learning opportunities. Between the workshops, the teams will have assignments to anchor the planned work with their colleagues (e.g., informing the others at the unit or conducting small, practical tests).

Data collection

We use three main sources of data in connection with the workshop series: focus group interviews, questionnaire surveys, and participants’ documentation from workshops. These will be repeated at several occasions in conjunction with the workshop series. The workshops have a dual focus: to facilitate the management of adherence and adaptations and to provide a data source for answering the research questions. The golden rule for the data collection will be that it should provide value to the participants and researchers while placing as little of a burden on respondents as possible. We will follow Richter et al.’s methodology to evaluate the change process during the workshops with both qualitative and quantitative data [42]. The collection of qualitative data will be carried out in accordance with the COREQ checklist [43] (Additional file 1).

Focus groups

Focus group interviews will be conducted at the start and end of each workshop series. The focus group format enables multiple perspectives and is believed to spark beneficial discussions on this complex and rather abstract topic [44]. All of the workshop participants will be invited to the focus groups; consequently, different types of social service organizations and professionals will be represented. The data from the first focus groups will be used to answer RQ1 concerning how adherence and adaptations are currently managed and how the participants perceive the cognitive and emotional demands of the dilemma. The combined data from the first and last focus group interviews will be used to answer the research questions of a longitudinal nature (i.e., whether the decision support decreases the demands [RQ2e] and whether the attitudes toward research changed during participation in the project [RQ3]). During the last focus group interview, the participants will also be asked to reflect on the data generated during the first focus group and during the workshop series using a sequential explanatory mixed-method approach [45]. This approach can enrich the interpretation and understanding of the different data sources and provide reflection opportunities for the participants. The focus group interviews will be based on the methodology proposed by Kitzinger [44]. For instance, the preferred number of participants will be four to eight persons, and the length will be approximately 60–120 min.

Questionnaires

A questionnaire will be distributed to all participants at each workshop. It will be used to answer several of the research questions: The impact of the decision support (RQ2a–e) will be evaluated through the questionnaires. Furthermore, the questionnaires, together with the focus groups, will be used to answer RQ3 regarding whether participation in the project affected the participants’ attitudes toward research. In line with our previous procedure, some questions will be the same across all time points (e.g., the experience of combining different knowledge sources), whereas others will be specific to certain time points. This is to ensure that we capture the process of change at the most valid time point, without burdening the participants with lengthy questionnaires.
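
To illustrate how such a mix of repeated and time-point-specific items could be organized for the longitudinal analyses, the sketch below builds a long-format dataset in Python with pandas; the variable names are hypothetical and not part of the planned questionnaire.

```python
import pandas as pd

# Hypothetical sketch of a long-format questionnaire dataset in which some items are
# repeated at every workshop (e.g., the experience of combining knowledge sources)
# and others are asked only at specific time points.
responses = pd.DataFrame(
    {
        "participant_id": [101, 101, 101, 102, 102, 102],
        "team_id":        [1, 1, 1, 1, 1, 1],
        "workshop":       [1, 2, 3, 1, 2, 3],
        # Repeated item, asked at every time point (1-5 scale):
        "combining_knowledge_sources": [2, 3, 4, 3, 3, 5],
        # Time-point-specific item, asked only at the last workshop (missing elsewhere):
        "attitudes_toward_research":   [None, None, 4, None, None, 5],
    }
)

# Change over time on the repeated item, per participant:
print(responses.pivot(index="participant_id", columns="workshop",
                      values="combining_knowledge_sources"))
```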

The research team will start by identifying suitable constructs to be measured, which will be discussed with representatives from the social service organizations. Thereafter, the research team will select the scales to be used in the questionnaire and, if needed, develop new items. Previously validated scales will be used as much as possible and could include the Copenhagen Psychosocial Questionnaire (COPSOQ), the Intervention Process Measure (IPM), and the Evidence-Based Practice Attitude Scale (EBPAS).

Documentation

Documentation produced by the participants during the workshops will be used to answer the research questions related to the impact of the decision support (RQ2a–c). Through the participants’ documentation on the worksheets, we will be able to collect data on how the participants identified core components in the EBIs applied, contextual differences, the types of adaptations made, and the extent to which they align with the goals of the EBIs. We will photograph the notes made by the teams at the end of each workshop.
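
As a purely hypothetical illustration of how this worksheet documentation could later be coded for analysis (the fields below are not a prescribed format; the actual coding scheme will be developed within the project), each team's worksheet might be captured as a structured record:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical coding structure for one team's worksheet documentation (RQ2a-c).
# Field names are illustrative only.
@dataclass
class WorksheetRecord:
    team_id: int
    ebi_name: str
    core_components: List[str] = field(default_factory=list)          # RQ2a
    contextual_differences: List[str] = field(default_factory=list)   # RQ2b
    adaptations: List[str] = field(default_factory=list)              # RQ2c
    aligned_with_ebi_goals: List[bool] = field(default_factory=list)  # one flag per adaptation

example = WorksheetRecord(
    team_id=1,
    ebi_name="Hypothetical parenting support program",
    core_components=["weekly parent group sessions", "home assignments"],
    contextual_differences=["smaller staff group than in the original trial"],
    adaptations=["sessions delivered every second week"],
    aligned_with_ebi_goals=[True],
)
print(example)
```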

Analyses

The data will be analyzed continuously for each specific RQ. Two members of the research team (HH and HG) will be responsible for the data analysis. The remaining research group members will act as informed outsiders; they will participate in iterative debriefing sessions to support the analysis and interpretation of the findings. The focus group and documentation data will be analyzed with thematic analysis, following the steps outlined by Braun and Clarke [46]. The focus groups will be audiotaped and transcribed verbatim. NVivo will be used for the data analysis.

The quantitative data will be analyzed with descriptive analyses (e.g., frequencies, correlations) and more complex analyses, such as multilevel modeling. Multilevel modeling will be used to account for the dependence in the data created by employees being nested within work units. SPSS 23, HLM 7.1, and Mplus 7.2 will be used. To have enough statistical power for the multilevel analyses, we will follow the recommendation of Hox et al. [40] that around 20 higher-order units (e.g., work units) are sufficient.
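
The protocol specifies SPSS, HLM, and Mplus for these analyses; purely as an illustration of the nesting logic (individuals within work teams), the sketch below fits an analogous random-intercept model in Python with statsmodels, using hypothetical variable and file names.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative only: a random-intercept model with individuals nested in work teams,
# analogous to the multilevel analyses planned in SPSS/HLM/Mplus.
# 'survey.csv', 'cognitive_demands', 'workshop', and 'team_id' are hypothetical names.
data = pd.read_csv("survey.csv")

# Outcome regressed on time point, with a random intercept for each work team
# to account for the dependence among members of the same team.
model = smf.mixedlm("cognitive_demands ~ workshop", data, groups=data["team_id"])
result = model.fit()
print(result.summary())
```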

Discussion

The current study aims to investigate the adherence and adaptation dilemma in social services and to explore how a structured decision support impacts how the dilemma is managed. Thus, our ambition is to contribute novel knowledge on the adherence and adaptation dilemma in general, as well as on the management of this dilemma in social services in particular. As such, the study is a contribution to implementation science, social service research, and social service practice.

An increased understanding of the balance between adherence and adaptation when using EBIs can be essential for new EBIs that are being developed and are expected to be put to use in practice. Previous research indicates that adaptations of EBIs are common and that the dilemma tends to be managed in a non-reflective manner. In essence, improved knowledge about adherence and adaptations will increase the likelihood that service users receive services that are in line with the best available evidence. The knowledge gained from the study is valuable beyond the specific field of social service practice. Insights from the study might thus be usable in other contexts in which EBIs are implemented and the balance between adaptation and adherence is managed by professionals.

The structured decision support that will be developed and tested has the potential to guide professionals in managing the balance between adherence to and adaptations of EBIs. Current research offers limited practical guidance for professionals’ management of the dilemma, and calls for practical tools directed toward professionals have been made [29]. New knowledge will be obtained regarding whether this type of decision support can improve professionals’ ability to identify the core components of an EBI and the differences between their own context and the context in which the EBI has been used, as well as to reflect upon the necessary implementation strategies. In addition, we will test whether the decision support can help professionals reflect on how the balance between adherence and adaptations affects the likelihood of achieving the intended (and avoiding unintended) impacts of an EBI. These aspects are central prerequisites for making conscious choices regarding adherence and adaptations and for conducting adaptations that align with the EBI’s goals. Thus, the project has the opportunity to produce a tool that can contribute to more careful adaptations, which in the longer run will improve outcomes for service users. This will also be a contribution to research, because the intervention can be further tested in future studies.

In addition, the project contributes knowledge on how adherence and adaptation relate to social service professionals’ experience of cognitive and emotional demands at work. Currently, the dilemma is left to the professionals to solve, but without support, this has proven difficult to do without negative effects for professionals and service users alike: for professionals, for whom the dilemma becomes a cognitive and emotional challenge that they have to face without sufficient resources, and for service users, who risk receiving less effective, less safe, and less equal services. Therefore, the adherence and adaptation dilemma might have implications not only for the users of social services but also for the working conditions of social service professionals.

Furthermore, the project has the potential to help professionals with the fundamental principles of evidence-based practice (e.g., the use of research evidence together with the professionals’ own expertise and the service users’ needs). This is one of the burning challenges for staff in social services, yet little guidance is currently available. The project’s contributions lie in offering hands-on guidance for the professionals so that the EBIs (i.e., external research knowledge) can work better in their local context.

The project might also contribute to how social service professionals view research and scientific knowledge. It is possible that the collaborative research design, which starts from the needs and resources of the social service organizations and will be designed through interactions with the local stakeholders as the project unfolds, will impact how these stakeholders perceive research and their own opportunities to impact research conducted in social services. This would be an important contribution for advancing social service professionals’ orientation toward evidence-based practice and their own opportunities to impact the research conducted in the field.

Availability of data and materials

The datasets used will be available from the corresponding author on reasonable request.

Abbreviations

EBI:

Evidence-based intervention

R&D units:

Research and development units

References

  1. Gambrill E. Views of evidence-based practice: social workers’ code of ethics and accreditation standards as guides for choice. J Soc Work Educ. 2007;43(3):447–61.

  2. Morago P. Evidence-based practice: from medicine to social work. Eur J Soc Work. 2006;9(4):461–77.

  3. Rosen A. Evidence-based social work practice: challenges and promise. Soc Work Res. 2003;27(4):197–208.

  4. Morago P. Dissemination and implementation of evidence-based practice in the social services. J Evid Based Soc Work. 2010;7(5):452–65.

  5. Plath D. Evidence-based practice: current issues and future directions. Aust Soc Work. 2006;59(1):56–72.

  6. Socialdepartementet. Överenskommelse om stöd till en evidensbaserad praktik. 2011/986.

  7. Evans S, Upton D. Role and nature of evidence. J Soc Work. 2015;12(4):369–99.

  8. Wike TL, Bledsoe SE, Manuel JI, Despard M, Johnson LV, Bellamy JL, et al. Evidence-based practice in social work: challenges and opportunities for clinicians and organizations. Clin Soc Work J. 2014;42(2):161–70.

  9. Adams KB, Matto HC, LeCroy CW. Limitations of evidence-based practice for social work education: unpacking the complexity. J Soc Work Educ. 2009;45(2):165–86.

  10. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice. Qual Health Care. 1998;7(3):149–58.

  11. Gray M, Joy E, Plath D, Webb SA. Implementing evidence-based practice: a review of the empirical research literature. Res Soc Work Pract. 2013;23(2):157–66.

  12. Perlinski M, Blom B, Morén S. Getting a sense of the client: working methods in the personal social services. J Soc Work. 2013;13(5):508–32.

  13. Miller-Day M, Pettigrew J, Hecht ML, Shin Y, Graham J, Krieger J. How prevention curricula are taught under real-world conditions. Health Educ. 2013;113(4):324–44.

  14. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice. Adm Policy Ment Health. 2011;38(1):4–23.

  15. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations. Implement Sci. 2013;8(1):65.

  16. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7(1):32.

  17. Mosson R, Hasson H, Wallin L, von Thiele Schwarz U. Exploring the role of line managers in implementing evidence-based practice in social services. Br J Soc Work. 2016;47(2):542–60.

  18. Colby M, Hecht ML, Miller-Day M, Krieger JL, Syvertsen AK, Graham JW, et al. Adapting school-based substance use prevention curriculum through cultural grounding. Am J Community Psychol. 2013;51(1–2):51.

  19. Moore JE, Bumbarger BK, Cooper BR. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. 2013;34(3):147–61.

  20. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes. Am J Community Psychol. 2008;41(3):327–50.

  21. Pettigrew J, Graham JW, Miller-Day M, Hecht ML, Krieger JL, Shin YJ. Adherence and delivery: implementation quality and program outcomes. Prev Sci. 2015;16(1):90–9.

  22. Cooper BR, Shrestha G, Hyman L, Hill L. Adaptations in a community-based family intervention. J Prim Prev. 2016;37(1):33–52.

  23. Mihalic S. The importance of implementation fidelity. Emot Behav Disord Youth. 2004;4:83–105.

  24. Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5(1):47–53.

  25. Castro FG, Barrera M Jr, Holleran Steiker LK. Issues and challenges in the design of culturally adapted evidence-based interventions. Annu Rev Clin Psychol. 2010;6:213–39.

  26. Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.

  27. Wiltsey Stirman S, Gamarra JM, Bartlett BA, Calloway A, Gutner C. Empirical examinations of modifications and adaptations to evidence-based psychotherapies. Clin Psychol Sci Pract. 2017;24(4):396–420.

  28. Stirman SW, Gutner CA, Crits-Christoph P, Edmunds J, Evans AC, Beidas RS. Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications. Implement Sci. 2015;10(1):115.

  29. Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41(3–4):290–303.

  30. Hasson H, von Thiele Schwarz U. Användbar evidens. Stockholm: Natur och Kultur; 2017.

  31. Kälvemark S, Höglund AT, Hansson MG, Westerholm P, Arnetz B. Living with conflicts: ethical dilemmas and moral distress in the health care system. Soc Sci Med. 2004;58(6):1075–84.

  32. Burston AS, Tuckett AG. Moral distress in nursing: contributing factors, outcomes and interventions. Nurs Ethics. 2013;20(3):312–24.

  33. Bracha Y, Brottman G, Carlson A. Physicians, guidelines, and cognitive tasks. Eval Health Prof. 2011;34(3):309–35.

  34. Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover. J Consult Clin Psychol. 2009;77(2):270.

  35. Lipsky M. Street-level bureaucracy: dilemmas of the individual in public services. New York: Russell Sage Foundation; 1980.

  36. Ovretveit J, Hempel SL, Magnabosco JS, Mittman BV, Rubenstein LA, Ganz D. Guidance for research-practice partnerships and collaborative research. J Health Organ Manag. 2014;28(1):115–26.

  37. Wallerstein NB, Duran B. Using community-based participatory research to address health disparities. Health Promot Pract. 2006;7(3):312–23.

  38. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation. Eval Health Prof. 2006;29(1):126–53.

  39. Armstrong R, Hall BJ, Doyle J, Waters E. Scoping the scope of a Cochrane review. J Public Health. 2011;33(1):147–50.

  40. Hox J, Moerbeek M, Kluytmans A, van de Schoot R. Analyzing indirect effects in cluster randomized trials. Front Psychol. 2014;5:78.

  41. Kolb DA. Experiential learning: experience as the source of learning and development. Englewood Cliffs: Prentice-Hall; 1984.

  42. Richter A, von Thiele Schwarz U, Lornudd C, Lundmark R, Mosson R, Hasson H. iLead—a transformational leadership intervention. Implement Sci. 2016;11(1):108.

  43. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

  44. Kitzinger J. Introducing focus groups. BMJ. 1995;311:299–302.

  45. Creswell JW. Designing and conducting mixed methods research. Sage Publications; 2007.

  46. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

  47. Resnicow K, Soler R, Braithwaite RL, Ahluwalia JS, Butler J. Cultural sensitivity in substance use prevention. J Community Psychol. 2000;28(3):271–90.


Funding

This study has received research grant funding from the Swedish Research Council for Health, Working Life and Welfare (FORTE) (project no. 2018-01315) after a competitive peer review process. FORTE is one of the largest national research funders in Sweden, distributing around 550 million SEK every year to both basic and needs-driven research. Open access funding provided by Karolinska Institute.

Author information

Contributions

HH, UvTS, GA, ÅHR, and HU designed the project. HH secured funding for the project and was responsible for the ethical application. HG drafted the first version of the study protocol based on the original application. All authors discussed the draft, revised it, and approved the final manuscript.

Corresponding author

Correspondence to Henna Hasson.

Ethics declarations

Ethics approval and consent to participate

The project has been approved by the Regional Ethical Review Board in Stockholm (Ref no. 2019-03460). All participants will be treated in accordance with the ethical guidelines for good research practice. Informed consent will be obtained from all study participants. In the case of refusal, these individuals will not be included in the dataset used for analyses.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Consolidated criteria for reporting qualitative studies (COREQ).

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Hasson, H., Gröndal, H., Rundgren, Å.H. et al. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: a study protocol. Implement Sci Commun 1, 15 (2020). https://doi.org/10.1186/s43058-020-00005-9
