Strengthening methods for tracking adaptations and modifications to implementation strategies

Abstract

Background

Developing effective implementation strategies requires adequate tracking and reporting on their application. Guidelines exist for defining and reporting on implementation strategy characteristics, but not for describing how strategies are adapted and modified in practice. We built on existing implementation science methods to provide novel methods for tracking strategy modifications.

Methods

These methods were developed within a stepped-wedge trial of an implementation strategy package designed to help community clinics adopt social determinants of health-related activities: in brief, an ‘Implementation Support Team’ supports clinics through a multi-step process. These methods involve five components: 1) describe planned strategy; 2) track its use; 3) monitor barriers; 4) describe modifications; and 5) identify / describe new strategies. We used the Expert Recommendations for Implementing Change taxonomy to categorize strategies, Proctor et al.’s reporting framework to describe them, the Consolidated Framework for Implementation Research to code barriers / contextual factors necessitating modifications, and elements of the Framework for Reporting Adaptations and Modifications-Enhanced to describe strategy modifications.

Results

We present three examples of the use of these methods: 1) modifications made to a facilitation-focused strategy (clinics reported that certain meetings were too frequent, so their frequency was reduced in subsequent wedges); 2) a clinic-level strategy addition which involved connecting one study clinic seeking help with community health worker-related workflows to another that already had such a workflow in place; 3) a study-level strategy addition which involved providing assistance in overcoming previously encountered (rather than de novo) challenges.

Conclusions

These methods for tracking modifications made to implementation strategies build on existing methods, frameworks, and guidelines; however, as none of these was a perfect fit, we made additions to several frameworks as indicated and used certain frameworks’ components selectively. While these methods are time-intensive, and more work is needed to streamline them, they are among the first such methods presented in implementation science. As such, they may be used in research assessing effective strategy modifications and in the replication and scale-up of effective strategies. We present these methods to guide others seeking to document implementation strategies and their modifications in their own studies.

Trial registration

clinicaltrials.gov ID: NCT03607617 (first posted 31/07/2018).

Contribution to the literature

  • Tracking adaptations and modifications made to implementation strategies and key factors driving the decisions to modify can be crucial for assessing the impact of implementation strategies and replicating effective strategies.

  • Despite advances in detailed tracking methods in implementation studies, little guidance exists for tracking adaptations and modifications made to implementation strategies.

  • These methods outline a process for tracking adaptations and modifications made to implementation strategies, which build on existing tracking methods, implementation frameworks, and reporting guidelines.

Background

Implementation strategies are actions or processes used to increase interventions’ uptake and sustainment [1]. Developing generalizable knowledge about these strategies requires carefully tracking and reporting on how they are applied. Several related guidelines exist; for example, Powell et al. (2015) provide a list of standardized implementation strategy labels and definitions, and Proctor et al. (2013) provide guidelines for reporting implementation strategies in sufficient detail to ensure they can be replicated in research and practice [2, 3]. In addition, a handful of studies have developed and tested methods for tracking and reporting implementation strategies [2, 4,5,6,7,8,9], including: tracking logs completed by clinicians conducting implementation activities [9]; a system for research teams to track and code implementation strategies in alignment with Proctor et al.’s reporting recommendations [2, 7]; and logs completed by stakeholders involved in the implementation process to report on implementation strategies [10].

Despite these efforts, the implementation research and practice literature often lacks sufficient detail on how implementation strategies were operationalized, how and why they worked (or failed), and how to replicate or refine such strategies in future uses [11,12,13]. Notably, studies of implementation strategies’ effectiveness often fail to document adaptations and modifications made to these strategies. While methods exist for tracking and reporting implementation strategies, as described above, there is a dearth of methods for identifying and describing modifications made to such strategies. Yet given the dynamic nature of implementation, strategy modifications may be necessary based on implementation context [14,15,16,17,18,19]. The Framework for Reporting Adaptations and Modifications-Enhanced (FRAME) provides guidance on how to track adaptations and modifications made to clinical interventions [20], but additional work is needed to examine how this framework might be applied to implementation strategies. Based on the definitions in FRAME, we use adaptation to refer to “thoughtful or deliberate alterations” made to implementation strategies “with the goal of improving its fit or effectiveness in a given context [20].” Modification encompasses a broad range of changes to strategies, including adaptations, additions, and unanticipated, iterative changes that emerge naturally throughout the implementation process [20]. Finley et al. (2018) present one potential method: structured reflection sessions held throughout implementation show promise for documenting both modifications and associated contextual factors [21]. There is a clear need to further identify and test methods for documenting implementation strategy adaptations and modifications, as such methods are necessary to determine how and why implementation strategies deviate from plans and when such deviations are warranted. This knowledge is essential for replicating implementation studies’ results and disseminating best practices across settings.

This paper builds on existing methods for tracking implementation strategies to provide novel methods for tracking strategy adaptations and modifications [7, 9, 10, 21]. These methods include prospective tracking and coding of originally planned implementation strategies (i.e., those in the study protocol), and how those strategies were adapted and modified throughout a study. As little guidance for tracking modifications made to implementation strategies has been previously published, this paper is intended to help others hoping to track such modifications.

Methods

Study context

The methods presented here were developed in the context of a mixed methods, pragmatic, stepped-wedge, cluster-randomized trial, with a hybrid type 3 implementation-effectiveness design. The parent trial (funded in the U.S. by NIDDK 5R18DK114701) is assessing the effectiveness of an implementation strategy package designed to help community health centers (CHCs) adopt social determinants of health (SDH) screening and referral activities, called ‘SDH activities’ [22]. In each of six sequential wedges (referred to throughout this paper as ‘wedge 1,’ ‘wedge 2,’ etc.), up to five CHC clinics receive 6 months of technical assistance from a multi-disciplinary ‘Implementation Support Team’ with an electronic health record (EHR) trainer, practice coach, and SDH expert. The Implementation Support Team guides the clinics through a multi-step process called the ‘Clinic Action Plan,’ developed based on lessons learned from a pilot study [23] and refined from wedge to wedge. Implementation strategies are provided to support each step, as described in Gold et al. (2019) and summarized in Table 4 in the Appendix [22]. Per study protocol, any aspects of the planned implementation support could be modified to meet individual clinics’ needs, where feasible [22]. Modifications could be made in response to an individual clinic’s context (clinic-level), or a perceived need to change the strategies delivered to all clinics (study-level).

Tracking process

We developed the methods presented here to fully capture and describe the implementation support provided to each study clinic, by systematically tracking the implementation strategies used, and modifications made to the originally planned strategies. To develop these methods, we identified processes and data sources presumed critical to tracking implementation strategies and their adaptations using existing methods for tracking implementation strategies [7, 9, 10], guidance from implementation frameworks such as the Consolidated Framework for Implementation Research (CFIR) [24], and reporting guidelines including the Framework for Reporting Adaptations and Modifications-Enhanced (FRAME) [20] and Proctor et al.’s (2013) reporting framework [2].

These methods involve five components that are presented sequentially here, but in practice were often iterative or overlapping: 1) describe each planned strategy in detail; 2) track how the strategies are used; 3) monitor barriers and contextual factors that could impact strategy modification; 4) describe modifications made to planned strategies in response to barriers and contextual factors; and 5) identify and describe new strategies added during the study period. These are shown in Fig. 1 and described in detail below. To collect the data needed for these components, we drew on and augmented the rigorous documentation already planned as part of the parent trial. This tracking effort includes strategies provided to the parent study clinics by the research team; it is not intended to capture strategies initiated by the clinics themselves in the course of study participation.

Fig. 1 Five components tracked by these methods
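For teams that want to mirror this structure in a tracking database, the five components can be represented directly. The following is a minimal sketch (Python); all names are our illustrative assumptions, not the study’s actual tooling:

```python
from enum import Enum, auto

class TrackingComponent(Enum):
    """The five components shown in Fig. 1 (illustrative names)."""
    DESCRIBE_PLANNED_STRATEGIES = auto()  # 1) detail each planned strategy
    TRACK_STRATEGY_USE = auto()           # 2) document when and how strategies are used
    MONITOR_BARRIERS = auto()             # 3) log barriers and contextual factors
    DESCRIBE_MODIFICATIONS = auto()       # 4) record deviations from the planned strategies
    DESCRIBE_ADDED_STRATEGIES = auto()    # 5) identify and describe strategies added mid-study

# Presented sequentially, but in practice the components overlap and iterate.
```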

Describe planned strategies

We described all implementation strategies included in the study in detail to monitor deviations from their intended application. We used the Expert Recommendations for Implementing Change (ERIC) taxonomy of 73 discrete implementation strategies, and research building on ERIC, to categorize these strategies [3, 8]. We then described each strategy using Proctor et al.’s (2013) reporting framework, which recommends documenting a given strategy’s actor, action, dose, temporality, action target, and justification [2]. We drew on the parent study’s protocol and study materials to describe each strategy [22], named each strategy using ERIC, defined it based on study materials, and described each facet using Proctor’s framework. Members of the study team then verified the detailed list of planned implementation strategies.
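To make this concrete, a described strategy can be stored as one record per strategy. The sketch below is a hypothetical schema (field names are our assumptions, not the study’s instrument) pairing an ERIC label with Proctor et al.’s reporting elements; the example values paraphrase the champion-meeting strategy discussed in the Results:

```python
from dataclasses import dataclass

@dataclass
class PlannedStrategy:
    eric_name: str      # standardized label from the ERIC taxonomy
    definition: str     # operational definition drawn from study materials
    actor: str          # Proctor: who delivers the strategy
    action: str         # Proctor: what the actor does
    action_target: str  # Proctor: who or what the action aims to change
    temporality: str    # Proctor: when the strategy is delivered
    dose: str           # Proctor: frequency/intensity of delivery
    justification: str  # Proctor: rationale for including the strategy

champion_meetings = PlannedStrategy(
    eric_name="Facilitation",
    definition="Interactive support for clinics adopting SDH activities",
    actor="Implementation Support Team",
    action="Conduct virtual meetings with clinic project champions",
    action_target="Champion knowledge, self-efficacy, and readiness",
    temporality="Throughout the 6-month support period",
    dose="Monthly",
    justification="Increase champion capacity to drive implementation",
)
```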

Track strategy use

We tracked the use of implementation strategies with each CHC clinic, with details on when and how the strategies were used, to identify modifications made to the strategies and / or differences between what was originally planned and what was delivered. To do so, the Implementation Support Team closely tracked and documented each study clinic’s implementation progress on a weekly basis. The research team monitored these notes weekly for changes and synthesized the documentation quarterly using the fields shown in Table 1. These data included documentation of regularly scheduled meetings with study clinics, dates when clinics reached critical milestones, materials sent to or received from the clinics, and clinic goals. The tracking also captured any support the Implementation Support Team provided to the study clinics beyond what was planned in the original intervention.

Table 1 Data elements tracked in original plan
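Table 1’s exact fields are not reproduced here; the sketch below assembles a plausible weekly log entry from the data elements named in the text (meetings, milestones, materials, goals, and “other support”), under the assumption of a simple per-clinic, per-week record:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class StrategyUseEntry:
    clinic_id: str
    wedge: int
    week_of: date
    meetings_held: list[str] = field(default_factory=list)       # regularly scheduled meetings
    milestones_reached: list[str] = field(default_factory=list)  # critical milestones and dates
    materials_exchanged: list[str] = field(default_factory=list) # sent to / received from the clinic
    clinic_goals: list[str] = field(default_factory=list)
    other_support: list[str] = field(default_factory=list)       # support beyond the planned strategies

# Entries were reviewed weekly and synthesized quarterly by the research team.
entry = StrategyUseEntry(clinic_id="clinic-A", wedge=3, week_of=date(2020, 1, 6))
entry.other_support.append("Connected clinic with a prior-wedge peer to share a CHW job description")
```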

Track barriers and solutions

We monitored discussions of clinics’ contextual factors or barriers, and of decisions made about adapting and modifying implementation strategies in response to these factors. We drew on detailed notes and transcripts from meetings between the Implementation Support Team and each clinic, and on notes and recordings of weekly Implementation Support Team meetings. These sources enabled us to identify the rationale for modifying implementation strategies, and whether modification occurred at the clinic or study level. We then used the Consolidated Framework for Implementation Research (CFIR) to code these barriers / contextual factors [24]. The CFIR provides a comprehensive list of contextual factors that may impact implementation success, categorized as associated with the Outer Setting, Inner Setting, Characteristics of Individuals, and Characteristics of the Intervention, all with extensive sub-categories (see Additional File 3 of Damschroder et al., 2009). We did not use the Process constructs from CFIR due to their potential overlap with the ERIC taxonomy [3, 24].
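A barrier record coded this way might look like the following sketch (hypothetical schema; the domain labels follow Damschroder et al., with the Process domain deliberately omitted; the example’s coding choices are our illustrative assumptions):

```python
from dataclasses import dataclass
from enum import Enum

class CFIRDomain(Enum):
    # The four CFIR domains used; the Process domain was excluded
    # to avoid overlap with the ERIC strategy taxonomy.
    OUTER_SETTING = "Outer Setting"
    INNER_SETTING = "Inner Setting"
    INDIVIDUALS = "Characteristics of Individuals"
    INTERVENTION = "Characteristics of the Intervention"

@dataclass
class BarrierRecord:
    description: str
    domain: CFIRDomain
    construct: str  # CFIR sub-category, e.g., "available resources"
    level: str      # "clinic-level" or "study-level"
    source: str     # e.g., "clinic meeting transcript", "support team meeting notes"

barrier = BarrierRecord(
    description="Champion meetings perceived as too frequent",
    domain=CFIRDomain.INNER_SETTING,  # hypothetical coding choice
    construct="available resources",  # hypothetical coding choice
    level="study-level",
    source="clinic meeting notes",
)
```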

Describe modifications to planned strategies

We described adaptations and modifications made to strategies by documenting any deviations from the planned process. To do so, we drew on the detailed descriptions of planned strategies, strategy use, and barriers and solutions described above. We then ensured that our approach to documenting these modifications was consistent with existing methods by building on published tracking methods [7, 9] and coding taxonomies [2, 3, 20, 24], including the elements of these taxonomies we considered relevant to documenting implementation strategy modifications, as follows (Table 2).

Table 2 Data elements tracked to capture modifications to implementation strategies

We used elements of the Framework for Reporting Adaptations and Modifications-Enhanced (FRAME) [20], an expansion of Stirman et al.’s earlier coding system [25], to describe strategy modifications. FRAME describes the elements that should be considered when tracking modifications and adaptations made to interventions as they are implemented. Here, we explored FRAME as a tool for reporting modifications to implementation strategies, rather than to the intervention itself. We used many of FRAME’s reporting elements, and added elements from implementation frameworks or project-specific language as needed.

We included FRAME elements to describe the nature of the modification (e.g. adding, tweaking or refining, lengthening or shortening, reordering strategies, or removing or skipping elements), when the modification occurred (e.g. pre-implementation, or stage in the study), who participated in the decision to modify (e.g. Implementation Support Team, practice coach, clinic champion), and the reason why the modification was made (e.g. staffing, available resources, competing demands). FRAME also includes level of delivery; here, this meant whether the modification was at the clinic or study level. For strategies that were not enacted (e.g., because a given clinic did not get to the implementation support within the study period) we coded the nature of the modification as removing or skipping elements.

We used CFIR to augment the documentation of the reason for a given strategy modification. In this study, the reasons for modifications were often implementation barriers. While the FRAME categories were a useful starting point, CFIR is a more comprehensive framework for describing implementation barriers. Using CFIR for reasons also allowed greater consistency of coding, as CFIR was used to identify barriers and contextual factors earlier in the process. Strategies were often added to address common implementation barriers. For example, if a study clinic had not planned for SDH screening, the reason was coded as planning; if clinic staff had inadequate knowledge about SDH screening, it was coded as access to knowledge & information; and if limited resources were dedicated to implementing SDH screening, it was coded as available resources.
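Pulling these elements together, a modification record might combine the FRAME elements named above with a CFIR-coded reason, as in this hypothetical sketch (the example’s nature and reason codings are illustrative assumptions, not the study’s actual coding):

```python
from dataclasses import dataclass
from enum import Enum

class ModificationNature(Enum):
    # FRAME's 'nature of the modification' categories used in the study
    ADDING = "adding"
    TWEAKING = "tweaking or refining"
    LENGTH = "lengthening or shortening"
    REORDERING = "reordering strategies"
    REMOVING = "removing or skipping elements"  # also applied to strategies never enacted

@dataclass
class ModificationRecord:
    strategy: str               # ERIC name of the modified strategy
    nature: ModificationNature  # FRAME: nature of the modification
    when: str                   # FRAME: e.g., "pre-implementation", "wedge 2"
    decided_by: list[str]       # FRAME: who participated in the decision
    level: str                  # "clinic-level" or "study-level" delivery
    reason_cfir: str            # CFIR construct, e.g., "access to knowledge & information"

meeting_change = ModificationRecord(
    strategy="Facilitation",
    nature=ModificationNature.TWEAKING,  # illustrative: reduced meeting frequency
    when="between wedge 1 and wedge 2",
    decided_by=["study team"],
    level="study-level",
    reason_cfir="available resources",   # hypothetical coding of 'meetings too frequent'
)
```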

Several elements of FRAME were considered not applicable or unlikely to vary across modifications. For example, all modifications were considered content modifications (rather than contextual or evaluation modifications). We did not code for the relationship to fidelity or whether modifications should be considered cultural. Assessing fidelity to planned strategies was not appropriate because the study design intentionally allowed for modification; guidelines for fidelity-consistent modifications were not developed for the strategies in this intervention because the implementation support was designed to be adaptive and core elements were not yet known. Coding modifications as cultural was likewise not appropriate: this FRAME element captures modifications made when an intervention is implemented in a culture different from the one in which it was developed, which did not apply given our focus on modifications to strategies rather than the intervention and the limited cultural variation in the study context and population.

Identify and describe added strategies

The prior four components were used to track strategies that were planned and then revised. However, unplanned strategies may also be added throughout an implementation process, and these require slightly different tracking methods. For added strategies, we began by populating elements of FRAME to describe the addition. Once added, these strategies could also be tracked to determine whether they were used as intended. We tracked added strategies for subsequent modification by completing each component of the process: describing the added strategy, tracking its use, monitoring barriers and solutions, and describing any modifications, as with planned strategies.

We identified strategies added for a given clinic using a separate tracking tool (Table 1), and strategies added at the study level using notes from Implementation Support Team meetings and intervention materials. We then briefly described the added strategy based on FRAME (Table 2), and coded it using the ERIC taxonomy and the Proctor reporting guidelines. Study-level additions were then included in the tracking of planned strategies and monitored as such in subsequent use.
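The hand-off from ‘added’ to ‘tracked’ could be expressed as follows. This sketch (hypothetical names) registers a study-level addition, such as the baseline-survey questions in example three of the Results, so it is subsequently monitored like any planned strategy:

```python
from dataclasses import dataclass

@dataclass
class AddedStrategy:
    eric_name: str     # the addition, coded against the ERIC taxonomy
    frame_nature: str  # always "adding" for new strategies
    reason_cfir: str   # CFIR-coded reason the strategy was added
    level: str         # "clinic-level" or "study-level"

def promote_to_tracking(added: AddedStrategy, planned_registry: list[AddedStrategy]) -> None:
    """Fold a study-level addition into the planned-strategy registry so its
    use, barriers, and modifications are tracked like any original strategy."""
    planned_registry.append(added)

registry: list[AddedStrategy] = []
baseline_questions = AddedStrategy(
    eric_name="Assess for readiness and identify barriers and facilitators",  # illustrative ERIC label
    frame_nature="adding",
    reason_cfir="access to knowledge & information",  # hypothetical coding
    level="study-level",
)
promote_to_tracking(baseline_questions, registry)
```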

Results

This five-component process for tracking modifications made to implementation strategies in the context of an implementation study leveraged existing implementation frameworks, reporting guidelines, and methods for tracking implementation strategies. Clinic-level modifications were often based on clinic context and implementation needs; study-level modifications were often based on lessons learned over the course of the study, and were applied to clinics in subsequent wedges. Table 3 gives examples of the use of these methods.

Table 3 Examples of implementation strategy modifications

Example one involves a facilitation strategy. Members of the Implementation Support Team conducted virtual meetings with project champions from all clinics in a given wedge. For the first set of study clinics, these meetings took place once a month throughout the support period and were designed to improve implementation by increasing champion knowledge and self-efficacy and improving readiness. By tracking strategy use, we identified a change to this strategy between wedge 1 and wedge 2 of the parent study. To understand the reason for this change, we used process data from clinic interactions and from internal meetings of the study team to track barriers and solutions: several of these clinics reported that the meetings were too frequent, so the study team reduced the frequency of the meetings in subsequent wedges. We then used the description of the strategy, the tracking of strategy use, and the tracking of barriers and solutions to describe the modification using elements of FRAME.

Examples two and three illustrate additions made to planned strategies. We used the tracking of strategy use and the tracking of barriers and solutions to identify the added strategy. As part of tracking strategy use, the Implementation Support Team listed any “other support” provided to the clinics, including strategies that were not part of the planned study activities for that wedge. Data from this tracker showed that the practice coach connected a clinic in wedge 3 with a clinic from a prior wedge to share information. Additional process data used to track barriers and solutions showed that, as part of developing its implementation plan, the wedge 3 clinic wanted to identify and train appropriate staff to conduct screening and to develop new workflows. This clinic expressed a need to better understand the potential role of community health workers in this process, and its champion requested additional information about the job description of the community health workers at a peer clinic enrolled in the study. The practice coach contacted the peer clinic and requested that they share their community health worker job description. We described this strategy modification using FRAME, and described the strategy in detail using the ERIC taxonomy and Proctor et al.’s reporting framework.

Example three illustrates a modification made by adding a strategy between wedges. Again, we used information from an earlier component in the process to identify and understand this added strategy. Process data from Implementation Support Team meetings showed that several study clinics were not taking on SDH activities de novo; many had attempted to do so in the past, and sought assistance in overcoming previously encountered challenges. To address this, the Implementation Support Team added questions to the study’s baseline survey to assess clinics’ past experience with SDH implementation, and factors that might impact the clinic’s ability to initiate, expand, or improve such activities. This strategy addition was administered pre-wedge, to improve the fit of future implementation strategies (Table 3).

Discussion

This approach contributes to a growing body of research addressing calls for improved reporting of implementation interventions and strategies [2, 11, 19, 26]. Systematic reviews of implementation studies show that strategies are often not reported in sufficient detail to describe what was planned as part of the study design or whether strategies were executed as intended [27, 28]. This imprecise reporting hinders our ability to evaluate the impact of implementation strategies within and across studies, and to make incremental improvements or refinements that increase their impact.

These methods outline a process for tracking adaptations and modifications made to implementation strategies, which build on existing tracking methods, implementation frameworks, and reporting guidelines. Integrating existing frameworks based on study context allowed for the potential to compare across studies, and to build on previous work to further refine the application of these frameworks for future research. No framework is comprehensive for all contexts, however, and each contains elements that are not applicable in particular contexts. Several challenges arose in applying and integrating these frameworks, as described below.

Although the selected frameworks generally suited the purposes of the study, we made additions to several of them. In the parent study, developing and adapting workflows was a key implementation strategy, yet this strategy is not part of the ERIC taxonomy; we therefore used the additions to the ERIC compilation suggested by Perry and colleagues [8]. We also added two components to the Proctor framework: ‘Supporting Actor’ (any person, other than the ‘Actor,’ who needs to be involved to ensure the strategy is completed) and ‘Format of Strategy Delivery’ (to clarify the mode of delivery of strategies). Supporting Actor provided additional detail where the primary actor of a strategy was external to the organization and the purpose of the strategy was to create change within the organization; it was useful to define the roles of both internal and external actors. We added Format of Strategy Delivery to document changes from the planned mode of delivery: for example, steps to develop a clinic’s implementation plans were often completed during meetings, rather than in written format as planned. This can be a critical detail for ensuring a strategy’s replication, particularly where facilitation is a key implementation strategy.
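Under the same assumptions as the earlier strategy-record sketch, these two additions could be modeled as extra fields; the example value illustrates the meeting-versus-written-format deviation described above:

```python
from dataclasses import dataclass

@dataclass
class ExtendedStrategyDescription:
    # ...Proctor elements (actor, action, dose, etc.) would appear here...
    supporting_actor: str  # anyone besides the primary 'Actor' needed to complete the strategy
    delivery_format: str   # 'Format of Strategy Delivery': planned vs. actual mode

plan_development = ExtendedStrategyDescription(
    supporting_actor="Clinic champion",  # internal counterpart to the external practice coach
    delivery_format="Completed in meetings (planned: written format)",
)
```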

In the application of these methods, there was overlap in elements of several frameworks. We did not use the CFIR ‘Process’ domain, as it was redundant with the ERIC documentation of implementation strategies. CFIR components were applied to describe both implementation barriers and reasons for strategy modifications. We found that the reasons for strategy modification were best described using CFIR’s comprehensive overview of the multi-level implementation determinants. We augmented CFIR categories with FRAME as needed. For example, we found that CFIR provided limited detail for describing barriers related to workforce; we could only code barriers related to insufficient workforce or staff turnover using the CFIR category available resources. Here, FRAME offered additional detail, with a sub-category staffing.

FRAME also provided elements for documenting modifications made to implementation strategies for individual clinics and at the study level. This was useful given the dynamic nature of the study design. While we used FRAME’s general categories on when the modification occurred, who participated in the decision to modify, the nature of the modification, and the reason why the modification was made, we generally used either a subset of the codes within these categories, or developed new codes. Additional research is needed to explore the application of FRAME to implementation strategies.

We selected these frameworks primarily based on their usability and applicability to the parent study [29, 30]. Future users of the methods presented here should consider whether other frameworks and data sources are a better fit in their contexts. For example, CFIR represents one of many determinant frameworks [31]; alternatives include the Theoretical Domains Framework [32, 33] and the Exploration, Preparation, Implementation, and Sustainment Framework [34]. Proctor et al.’s reporting framework [2] could be augmented or replaced with the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations or the Template for Intervention Description and Replication (TIDieR) checklist and guide [35,36,37]. The behavior change technique taxonomy could be used in addition to, or as a replacement for, ERIC, where appropriate [38]. Researchers may select frameworks based on the underlying theory, change processes, analytic level, and disciplinary credibility [29, 30]. When combining frameworks, researchers may retain some elements we did not use here. Any use of frameworks to guide these methods should be flexible and responsive to context.

These methods have several limitations. Like other approaches to reporting and tracking research activities, these processes are time-intensive and may be perceived as burdensome. This study did not allow us to estimate the time required to track strategies and their modifications using these methods; future studies should consider documenting that time in order to target improvements. Additional work is needed to streamline tracking so it is more pragmatic [39, 40]. We refined these tracking methods based on feedback from the Implementation Support Team during weekly meetings and through a formal mid-project review, and made several improvements to the methods over the course of the study. The research team refined its process for prospective tracking over the course of the study to summarize a given clinic’s incremental progress and to guide weekly discussions of this progress. We believe this iterative improvement resulted in more pragmatic tracking through an appropriate balance of prospective tracking and group discussion: the data could be collected more easily by the implementation team and better used to guide planning efforts and implementation support. This adds to research examining the feasibility and acceptability of various approaches to tracking [10]. Our methods focused only on the delivery of implementation strategies by the study team, and did not include tracking within the clinics participating in the parent study. To minimize what was asked of the clinics, we did not ask study participants to complete the tracking tools presented here, as study participation already required substantial effort. Additional research is needed to refine tracking tools and improve usability for practitioners and other stakeholders, including prompts for facilitated discussions and field definitions and instructions for tracking logs [10].

Conclusions

Data collected using these methods may be used in myriad ways: for example, to describe adaptations made to the originally planned implementation strategies, or as covariates in evaluating the impact of strategies on implementation outcomes. These methods may improve assessment of implementation strategies by identifying associations between variation in strategy use and implementation and health outcomes. Data from these methods may also be used to better plan for and resource the scale-up of implementation by identifying typical patterns of variation in response to context. Additional research is needed to explore methods for assessing which strategies and strategy modifications most impact implementation outcomes; these methods could enhance that work [41]. Although this study did not code strategy modifications for their impact on fidelity, these methods could be expanded to track fidelity to implementation strategies by identifying core elements, developing fidelity thresholds prior to the study, and integrating recommendations for reporting on fidelity [41, 42]. Our goal was to track the types of modifications needed and to use the data to later evaluate the impact of those modifications. Future research may pair these methods with guidelines for fidelity-consistent and fidelity-inconsistent modification where core elements of the strategies are known before tracking begins. Defining these components is critical for tracking strategies such as implementation facilitation and developing an implementation blueprint, which are often multi-stage and vary widely in their application. Future research may further explore how to document modifications and fidelity in studies of implementation strategies’ impact.

These methods are among the first options put forth for tracking how implementation strategies are modified in implementation studies; such tracking is critical for the replication and scale-up of effective strategies. We present these methods to guide others seeking to document implementation strategies and their modifications over the course of a study. Future research is needed to validate and improve these methods.

Availability of data and materials

The data used during the current study are available from the corresponding author on reasonable request.

References

  1. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57.

  2. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  3. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10:21.

  4. Shelley DR, Ogedegbe G, Anane S, Wu WY, Goldfeld K, Gold HT, et al. Testing the use of practice facilitation in a cluster randomized stepped-wedge design trial to improve adherence to cardiovascular disease prevention guidelines: HealthyHearts NYC. Implement Sci. 2015;11(1):88.

  5. Dogherty EJ, Harrison MB, Baker C, Graham ID. Following a natural experiment of guideline adaptation and early implementation: a mixed-methods study of facilitation. Implement Sci. 2012;7(1):9.

  6. Jabbour M, Curran J, Scott SD, Guttman A, Rotter T, Ducharme FM, et al. Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial. Implement Sci. 2013;8(1):55.

  7. Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. 2018;49(4):525–37.

  8. Perry CK, Damschroder LJ, Hemler JR, Woodson TT, Ono SS, Cohen DJ. Specifying and comparing implementation strategies across seven large implementation interventions: a practical application of theory. Implement Sci. 2019;14(1):32.

  9. Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: a description of a practical approach and early findings. Health Res Policy Syst. 2017;15(1):15.

  10. Walsh-Bailey C, Jones SMW, Mettert K, Powell BJ, Wiltsey Stirman S, Lyon AR, Rohde P, Lewis CC. Pilot methods for tracking implementation strategies and treatment adaptations. Implement Res Pract. 2021; in press.

  11. Wilson PM, Sales A, Wensing M, Aarons GA, Flottorp S, Glidewell L, Hutchinson A, Presseau J, Rogers A, Sevdalis N, Squires J. Enhancing the reporting of implementation research. Implement Sci. 2017;12(1):13.

  12. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21.

  13. Lewis CC, Klasnja P, Powell B, Tuzzio L, Jones S, Walsh-Bailey C, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.

  14. Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. 2015;22(1):49–59.

  15. Quanbeck A, Brown RT, Zgierska AE, Jacobson N, Robinson JM, Johnson RA, et al. A randomized matched-pairs study of feasibility, acceptability, and effectiveness of systems consultation: a novel implementation strategy for adopting clinical guidelines for opioid prescribing in primary care. Implement Sci. 2018;13(1):21.

  16. Kilbourne AM, Almirall D, Eisenberg D, Waxmonsky J, Goodrich DE, Fortney JC, et al. Protocol: adaptive implementation of effective programs trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. 2014;9(1):132.

  17. Sanetti LMH, Collier-Meek MA. Data-driven delivery of implementation supports in a multi-tiered framework: a pilot study. Psychol Sch. 2015;52(8):815–28.

  18. Watson DP, Young J, Ahonen E, Xu H, Henderson M, Shuman V, et al. Development and testing of an implementation strategy for a complex housing intervention: protocol for a mixed methods study. Implement Sci. 2014;9(1):138.

  19. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7:3.

  20. Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.

  21. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. 2018;18(1):153.

  22. Gold R, Bunce A, Cottrell E, Marino M, Middendorf M, Cowburn S, et al. Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation. Implement Sci. 2019;14(1):9.

  23. Gold R, Bunce A, Cowburn S, Dambrun K, Dearing M, Middendorf M, et al. Adoption of social determinants of health EHR tools by community health centers. Ann Fam Med. 2018;16(5):399–407.

  24. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  25. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(1):65.

  26. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4:40.

  27. Nadeem E, Olin SS, Hill LC, Hoagwood KE, Horwitz SM. Understanding the components of quality improvement collaboratives: a systematic literature review. Milbank Q. 2013;91(2):354–94.

  28. Prior M, Guerin M, Grimmer-Somers K. The effectiveness of clinical guideline implementation strategies–a synthesis of systematic review findings. J Eval Clin Pract. 2008;14(5):888–97.

  29. Birken SA, Rohweder CL, Powell BJ, Shea CM, Scott J, Leeman J, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.

  30. Birken SA, Powell BJ, Shea CM, Haines ER, Kirk MA, Leeman J, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12(1):124.

  31. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.

  32. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. BMJ Qual Saf. 2005;14(1):26–33.

  33. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.

  34. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38(1):4–23.

  35. Hooley C, Amano T, Markovitz L, Yaeger L, Proctor E. Assessing implementation strategy reporting in the mental health literature: a narrative review. Adm Policy Ment Health Ment Health Serv Res. 2020;47(1):19–35.

  36. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M, Lamb SE. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

  37. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the workgroup for intervention development and evaluation research (WIDER) recommendations. Implement Sci. 2013;8(1):1–5.

  38. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95.

  39. Powell BJ, Stanick CF, Halko HM, Dorsey CN, Weiner BJ, Barwick MA, et al. Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping. Implement Sci. 2017;12(1):118.

  40. Stanick CF, Halko HM, Dorsey CN, Weiner BJ, Powell BJ, Palinkas LA, et al. Operationalizing the ‘pragmatic’ measures construct using a stakeholder feedback and a multi-method approach. BMC Health Serv Res. 2018;18(1):882.

  41. Kirk MA, Moore JE, Wiltsey Stirman S, Birken SA. Towards a comprehensive model for understanding adaptations’ impact: the model for adaptation design and impact (MADI). Implement Sci. 2020;15(1):56.

  42. Slaughter SE, Hill JN, Snelgrove-Clarke E. What is the extent and quality of documentation and reporting of fidelity to implementation strategies: a scoping review. Implement Sci. 2015;10(1):129.


Acknowledgements

The authors deeply appreciate the contributions of the ASCEND research team for all of their support and feedback throughout the study.

Funding

This publication was supported by grants from the National Cancer Institute through P50CA244289 and the National Institute of Diabetes and Digestive and Kidney Diseases through 5R18DK114701. BJP and ADH were also supported by the National Institute of Mental Health through K01MH113806.

Author information

Authors and Affiliations

Authors

Contributions

AH led the conception and design of the work and the drafting of the manuscript. RG made substantial contributions to the design of the work and substantial contributions to the drafting of the manuscript. BJP made substantial contributions to the design of the work and substantial revisions to the manuscript. CWB contributed to drafting the manuscript and provided substantial feedback on all drafts of the manuscript. MK and IG reviewed methods, contributed to data analysis, and contributed to improvements in the methods. CMS, LF, and KHL reviewed and provided feedback on the conception and design of the work and reviewed and provided feedback on the manuscript. AB and MM reviewed and provided substantial feedback on the manuscript. All authors provided feedback on manuscript drafts and approved the final manuscript.

Corresponding author

Correspondence to Amber D. Haley.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the University of North Carolina at Chapel Hill Institutional Review Board based on a reliance agreement with the Kaiser Permanente Northwest IRB. The Kaiser Permanente Northwest IRB granted the ASCEND study a waiver of informed consent for all data collection activities: ASCEND is a pragmatic trial, and obtaining signed consent would unnaturally restrict the study sample, diminishing the external validity of the findings. The project promotes standard clinical care and quality improvement in the CHC setting.

Consent for publication

Not applicable.

All methods were performed in accordance with the relevant guidelines and regulations (Declaration of Helsinki).

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Table 4 Originally planned implementation support

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Haley, A.D., Powell, B.J., Walsh-Bailey, C. et al. Strengthening methods for tracking adaptations and modifications to implementation strategies. BMC Med Res Methodol 21, 133 (2021). https://doi.org/10.1186/s12874-021-01326-6
