Introduction
Implementation mapping has recently been described as an organized way to develop or select implementation strategies through five specific tasks guided by an implementation science framework [1]. The process of selecting implementation strategies can be challenging for implementation scientists. Selection of appropriate strategies is guided by an implementation science theory or framework and considers contextual factors and known implementation barriers, which may differ across key stakeholders such as leaders, nurses, or providers [2]. One specific approach to the selection of implementation strategies is to map strategies to the determinants of the chosen implementation science framework, as initially described in 2019 as part of implementation mapping [1]. Since then, several researchers have reported on their application of implementation mapping. According to these reports, researchers used advisory groups (e.g., a task force or stakeholder advisory group) to select implementation strategies from potentially applicable Expert Recommendations for Implementing Change (ERIC) strategies [3, 4]. While this approach worked, the selection of strategies likely depended on the composition of these advisory groups and on the opinions of the individuals comprising them. Thus, one potential area for improvement in the application of implementation mapping is the use of a systematic, data-driven approach to reviewing and prioritizing all 73 ERIC strategies.
For this reason, we operationalized implementation mapping through a data-driven process, considering all 73 ERIC strategies and every determinant of the Tailored Implementation for Chronic Diseases (TICD) framework. We used data visualization techniques to manage the consequently large number of objectives and ERIC strategies. In this manuscript, we illustrate our data-driven approach to implementation mapping using implementation of risk-aligned bladder cancer surveillance as a case example. Our approach is intended for use by implementation scientists who seek a rigorous selection process for implementation strategies.
Case example: risk-aligned bladder cancer surveillance
Bladder cancer is one of the most prevalent cancers in the Department of Veterans Affairs (VA) [5]. The vast majority of patients with bladder cancer have early-stage cancer, which grows only superficially within the bladder [6]. Patients with early-stage bladder cancer undergo resection of the cancer from the bladder and are then at varying risk of cancer recurrence within the bladder, categorized as low, intermediate, or high according to current guidelines [7]. To detect these recurrences, patients undergo regular surveillance cystoscopy procedures, during which providers directly inspect the bladder via an endoscope. Given the broad range of cancer recurrence risks, providers should align the frequency of surveillance cystoscopy with each patient's individual risk of cancer recurrence. However, we previously found both underuse of surveillance among high-risk patients and overuse among low-risk patients, with up to three quarters of low-risk patients undergoing more procedures than recommended [8]. Thus, we embarked on selecting implementation strategies to promote risk-aligned bladder cancer surveillance using a data-driven approach to implementation mapping.
Discussion
We describe a rigorous and data-driven approach that considers every determinant of the TICD implementation science framework and every ERIC strategy during implementation mapping. We were able to interpret the large matrices by plotting the results of the implementation strategy matrix (Fig. 2) and the factors influencing strategy prioritization and selection (Fig. 3). This rigorous process allowed us to select implementation strategies primarily based on data rather than on the opinions of advisory groups alone. The implementation mapping process culminated in highly specified implementation strategies that were codified in an implementation blueprint.
Our approach is novel in that the selection of implementation strategies was driven primarily by data. Prior work using implementation mapping employed advisory panels to select implementation strategies from potential ERIC strategies [3, 4], an approach that is more subjective, or did not clearly report how the selection was handled [20]. To overcome this limitation, we created an implementation strategy matrix, cross-walking all potentially applicable ERIC strategies against all change objectives. We then developed a plot visualizing this large matrix (Fig. 2). This allowed us to evaluate the scope of each ERIC strategy based on the change objectives it addressed. The plot also visualized which TICD framework domains and determinants were addressed by each strategy, along with which employee types would be involved. This comprehensive representation of all mapping data then drove the decisions about which strategies to select.
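The cross-walk described above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration only: the strategy names are drawn from the ERIC compilation, but the objectives ("Obj1" through "Obj4") and the cell values are invented for demonstration, not taken from our actual implementation strategy matrix. It shows how ranking strategies by the number of change objectives addressed yields the "scope" measure used to compare strategies.

```python
# Minimal sketch of an implementation strategy matrix (hypothetical data):
# rows are ERIC strategies, columns are change objectives, and a 1 marks
# that the strategy addresses the objective.
import pandas as pd

matrix = pd.DataFrame(
    [[1, 1, 0, 1],
     [0, 1, 0, 0],
     [1, 1, 1, 1]],
    index=["Audit and provide feedback",
           "Distribute educational materials",
           "Facilitation"],
    columns=["Obj1", "Obj2", "Obj3", "Obj4"],  # hypothetical objectives
)

# Scope = number of change objectives each strategy addresses;
# sorting surfaces the broadest strategies for prioritization.
scope = matrix.sum(axis=1).sort_values(ascending=False)
print(scope)
```

In practice, such a matrix would have 73 rows (one per ERIC strategy) and one column per change objective, and the same row sums could feed a heatmap-style plot like the one described above.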
To our knowledge, this study is the first to apply implementation mapping as recommended by Fernandez et al. [1] to improve guideline-concordant cancer care delivery in the clinic. Prior studies used implementation mapping in oncology to implement a phone navigation program [21] and exercise clinics [4], but not yet to directly improve cancer care delivery in the clinic.
We would like to emphasize that our data-driven approach to implementation mapping is not limited to a specific implementation science framework. Whereas our change objectives were categorized by TICD domains and determinants, other frameworks that can guide systematic categorization of determinants of evidence-based practice can be used in similar fashion. For example, the initial description of implementation mapping specifically mentions the Consolidated Framework for Implementation Research [22] and the Theoretical Domains Framework [23] as other suitable framework options [1].
It is important to acknowledge issues of equity and stakeholder preferences and values in the selection of implementation strategies. In our data-driven approach, equity and stakeholder preferences were included to the extent that they were represented in the prior mixed-methods assessments of staff needs [10]. However, diversity among stakeholders recruited for interviews and participation in advisory panels was somewhat limited, with 8% African American and 2% Hispanic representation among interview participants [10] and no African American representation in our advisory panels. This could be seen as a limitation of our specific work and case example. However, our data-driven approach could easily be adapted for projects focused on diversity, equity, and inclusion. For example, one could use the Health Equity Implementation Framework [24] to incorporate equity-relevant determinants into the data-driven implementation mapping process, optimizing the scientific yield and equity of implementation efforts [25].
Despite our approach’s innovation and rigor, there are several limitations to discuss. First, opinions of the research team affected certain parts of the implementation mapping process, including the assessment of time commitment for local teams and the interpretation of the available literature when assessing the overall impact of a strategy. However, we tried to limit subjectivity as much as possible by confining it to focused questions and by including different perspectives from an implementation scientist, a urologist, an internist, and several implementation research staff members. Second, whereas our implementation mapping process was primarily driven by data, we did not formally assess its reproducibility by an independent team. Third, the data-driven approach relied mostly on the work of the research team, and a formal co-design approach was not included in the selection of the implementation strategies. Fourth, this study focused on improving cancer surveillance in the VA, so findings regarding the impact of the selected implementation strategies may not readily translate to other healthcare settings or different clinical problems. However, our data-driven approach to implementation mapping will likely be helpful to others regardless of the healthcare setting or clinical problem being addressed. Finally, implementation mapping in general is quite labor intensive. Our data-driven implementation mapping took about a year of part-time investigator and full-time research assistant effort. However, we were unable to quantify how much more effort was required for our approach compared with prior studies, as the authors of those studies did not report the amount of time, personnel, and expertise needed for their work [1, 20, 21]. We recognize that this level of rigor may not always be possible in our current era of rapid research or during routine operational activities. However, our visualization of the implementation strategy matrix (Fig. 2) could still be integrated into implementation mapping and will likely help researchers understand, interpret, and present results.
It is also quite possible that our data-driven approach yielded additional information that might otherwise have been overlooked in implementation mapping as previously applied. Future work could address the empirical question of whether our data-driven approach yielded additional information compared with an advisory panel approach, and whether this information is important enough to justify the additional time needed to complete the highly data-driven implementation mapping process.
Conclusions
In conclusion, we described a data-driven and rigorous implementation mapping process to select implementation strategies for risk-aligned bladder cancer surveillance. The implementation strategies are currently being pilot-tested across four VA sites, with the goal of measuring implementation outcomes and adapting strategies to different local preferences. Once piloting is complete, future work will likely entail testing both the strategies and the clinical innovation (i.e., risk-aligned bladder cancer surveillance) in a larger number of sites. We hope that our work will inspire other implementation scientists to use similar data-driven processes in their selection of implementation strategies, minimizing the risk of bias being introduced by heavy reliance on the opinions of advisory groups.
Acknowledgements
This study was supported using resources and facilities at the White River Junction Department of Veterans Affairs (VA) Medical Center, the Richard L. Roudebush Veterans Affairs Medical Center, and the VA Informatics and Computing Infrastructure (VINCI), VA HSR RES 13-457. We acknowledge Teresa M. Damush, PhD, for providing advice during the implementation mapping process, as well as the members of the advisory groups: Carlos Glender, Dean Walker, Ronald Goff, Suzanne B. Molloy, Dan Charland, Rachel Moses, Jacob McCoy, Muta Issa, Kevin Rice, and M. Minhaj Siddiqui.