2 Objective
As evidence-based intervention methods are increasingly applied in universities and research institutions, evidence-based medicine faces the challenge of translating high-quality evidence into clinical practice. Studies have shown that an evidence-based practice (EBP) takes an average of 17 years to be integrated into routine clinical care, and that only about half of EBP interventions are successfully disseminated and applied. Given this lengthy timeline and relatively low success rate of EBP translation, using implementation science to facilitate evidence translation has become a focal point for researchers both domestically and internationally. Implementation science is a systematic research approach aimed at moving research findings and other evidence-based practices into routine clinical practice, ultimately enhancing the quality and effectiveness of healthcare services. Given its critical role in bridging the gap between research outcomes and clinical practice, implementation science has become a widely discussed topic in scientific research.
As scholars from various countries continue to explore the concept of "implementation science," related theories and practical methods have evolved accordingly. Implementation researchers have introduced the concept of "implementation strategies" and related application methods to address the determinants of implementing clinical interventions, EBPs, or new technologies (including barriers and facilitators), with the aim of promoting the dissemination of innovations. Adapting implementation strategies to the determinants of an innovation, in combination with the clinical context, to ensure effective implementation is a key task of implementation science.
In order to familiarize researchers more quickly and comprehensively with implementation strategies, this article will provide an overview of theoretical frameworks, methodologies, implementation strategies, and applications within implementation research. This will serve as a reference for future endeavors in conducting implementation research.
3 Theoretical Frameworks of Implementation Research
Implementation refers to a series of procedures and measures aimed at promoting the use of a particular intervention within a system, representing a critical organizational decision-making process concerning the adoption and utilization of the intervention. Without guidance from theoretical frameworks, researchers often find it challenging to identify the factors that influence implementation outcomes in a specific context, making it difficult to generalize research results. A survey conducted by Birken et al. in 2015-2016 among 223 scholars from 12 countries engaged in implementation research revealed that over 100 different theoretical frameworks and models were in use, spanning disciplines such as implementation science, health behavior, organizational management, sociology, and business. Among the frequently used frameworks and models were the Consolidated Framework for Implementation Research (CFIR), the Knowledge to Action framework (KTA), the Promoting Action on Research Implementation in Health Services framework (PARIHS), and the RE-AIM framework.
The PARIHS model comprises three core elements: evidence, context, and facilitation. Sub-elements of evidence include research evidence, clinical experience, and patient preferences. Sub-elements of context encompass organizational culture, leadership, and monitoring systems, while sub-elements of facilitation include the facilitator's characteristics, role positioning, and behavior.
The KTA model divides evidence-based practice into two phases: knowledge creation and knowledge application. The knowledge creation phase emphasizes extracting and transforming research evidence so that it better matches stakeholders' needs and is presented in a concise format. The knowledge application phase emphasizes the assessment, management, and monitoring of barriers in accordance with the specific practice environment.
The CFIR framework identifies five key elements in implementation research:
1. Intervention Characteristics: the intervention plan designed for a specific practice environment.
2. Outer Setting: external factors influencing implementation, including social, economic, and political factors, patient needs, resources, peer pressure, external policies, and organizational openness.
3. Inner Setting: internal organizational factors affecting intervention implementation, such as organizational culture, structure, and networks, including structural characteristics, organizational networks, internal communication, and organizational climate.
4. Individuals Involved: the participants in the intervention, such as healthcare providers, managers, policy-makers, and patients. This element primarily covers individuals' knowledge, beliefs, self-efficacy, and attitudes at different stages of the intervention, as well as their sense of identification with the organization.
5. Implementation Process: the methods for promoting adoption of the intervention plan at the individual and organizational levels, comprising planning, engagement, execution, evaluation, and reflection. Planning involves preparing the action plan for the intervention in advance. Engagement involves strategies to attract individuals to participate in the implementation process, such as promotion, education, and exemplary demonstrations. Execution involves completing tasks according to the predetermined plan, while evaluation and reflection represent feedback on the implementation process by implementers or implementation teams and run throughout the entire implementation process.
4 Methodology of Implementation Research
The journey of an evidence-based innovation from inception to clinical translation and sustained, routine use faces numerous obstacles. Implementation science can identify and address the barriers to evidence-based practice, bridging the gap between research evidence and clinical practice and facilitating the implementation of health policies, plans, and practices. The barriers are multifaceted: obstacles arising from the intervention itself (such as high costs, lengthy timelines, and limited clinical applicability of the evidence), barriers originating in the research design (such as a lack of representative target populations), barriers stemming from the practice environment (such as healthcare institutions, communities, and schools), and interactions among these three factors. To mitigate the adverse impact of research-design obstacles on outcomes, introducing rigorous research methods is of paramount importance in implementation research.
Implementation science has matured and developed various research design methods [15-16]. Randomized controlled trials can use effectiveness-implementation hybrid designs, multiphase optimization strategies, sequential multiple assignment randomized trials, stepped-wedge designs, and stepped-wedge cluster randomized trials. Quasi-experimental studies can employ interrupted time series and regression discontinuity designs. Qualitative research can draw on theoretical frameworks such as the Consolidated Framework for Implementation Research and the Knowledge to Action framework to develop interview guides for semi-structured interviews. Mixed-methods research combines qualitative and quantitative designs.
Quantitative research design is fundamentally concerned with measuring, analyzing, and drawing conclusions about natural and social phenomena. Many implementation science designs follow the quantitative paradigm, such as effectiveness-implementation hybrid designs, multiphase optimization strategies, and interrupted time series. Because implementation science focuses primarily on the impact of implementation strategies, clinical trial methods are needed to implement and evaluate this research. Quantitative methods are frequently employed to assess the effectiveness of implementation strategies, as they are well suited to exploring the changes induced by implementation strategies and their scope. Quantitative research can be categorized into randomized and non-randomized studies. However, implementation science studies often lack clear descriptions of randomization methods in randomized designs and of follow-up control in non-randomized designs.
4.1 Randomized Controlled Trials
Randomized Controlled Trials (RCTs) represent the highest level of evidence in empirical research.
As new methodological approaches have been proposed, the design of RCTs has become increasingly diverse, including effectiveness-implementation hybrid designs, multiphase optimization strategies, sequential multiple assignment randomized trials, stepped-wedge designs, and stepped-wedge cluster randomized trials.
4.1.1 Effectiveness-Implementation Hybrid Designs (EIHD)
The implementation of evidence-based practices often encounters obstacles, such as high costs, lengthy timelines, and limited clinical applicability of interventions. Effectiveness-implementation hybrid designs can assess the effectiveness of interventions and implementation outcomes, emphasizing research efficiency to effectively address barriers associated with the intervention itself. The specific implementation process includes: (1) selecting the study subjects, (2) collecting data before and after implementation, and (3) conducting assessments before, during, and after implementation. Depending on the emphasis on effectiveness and execution results, this design is divided into three types: Type I hybrid design, Type II hybrid design, and Type III hybrid design.
Type I hybrid design is used when there is insufficient clinical evidence to support implementation. It primarily tests the effectiveness of the evidence-based innovation while collecting information on implementation, thereby exploring the innovation's feasibility.
Type II hybrid design gives equal weight to effectiveness and implementation outcomes, simultaneously assessing the effects of the evidence-based innovation and of the implementation strategy. Unlike Type I, a Type II hybrid design requires a feasible implementation plan. It is used when the intervention has been proven effective in other settings or populations but has not yet been confirmed effective in the current trial context or population.
Type III hybrid design, building on Type II, focuses primarily on the implementation strategy while observing and collecting additional information on the intervention during the trial. For example, the Vaughn team used a Type III hybrid design to compare the effectiveness of basic versus enhanced implementation of the "children's nutrition and physical activity self-assessment" program.
Hybrid designs can answer questions in implementation science regarding whether outcomes can be reasonably attributed to the intervention without being influenced by other factors and how interventions can be implemented in the future.
4.1.2 Multiphase Optimization Strategy (MOST)
The Multiphase Optimization Strategy (MOST), based on engineering principles, systematically develops and tests multi-component interventions, including screening, optimization, and testing phases. It is suitable for multi-factor, multi-domain complex behavioral intervention research. Applying MOST in implementation science can optimize evidence-based practice.
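The screening phase of MOST is typically run as a factorial experiment in which each candidate intervention component is independently switched on or off. The sketch below is a minimal illustration (not from the source; the component names are hypothetical) of how the factorial conditions for a three-component screening experiment could be enumerated and assigned:

```python
import itertools
import random

# Hypothetical candidate components for the MOST screening phase.
# Each component is either included (1) or omitted (0) in a condition.
components = ["education_module", "sms_reminders", "peer_support"]

# Enumerate the 2^3 = 8 conditions of a full factorial screening experiment.
conditions = list(itertools.product([0, 1], repeat=len(components)))

def assign(participant_id: int) -> dict:
    """Randomly assign one participant to one factorial condition."""
    condition = random.choice(conditions)
    return {name: on for name, on in zip(components, condition)}

for pid in range(4):
    print(pid, assign(pid))
```

Analyzing the resulting factorial data shows which components contribute to the outcome, so that only effective components are carried forward into the optimization and testing phases.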
4.1.3 Sequential Multiple Assignment Randomized Trial (SMART)
SMART is a multi-stage randomized trial design suited to comparing adaptive interventions. Study subjects are randomized at each stage, so that over the course of the trial each subject receives a sequence of intervention options. Subjects' outcomes are assessed at the end of each stage, and these interim outcomes inform the randomization options at the next stage, allowing the trial to determine the optimal intervention strategy. This design thoroughly analyzes intervention strategies and addresses the question of whether, and how, interventions should change among beneficiaries during implementation. Compared to MOST, SMART can identify the optimal adaptive intervention strategy, and researchers can calculate minimum sample sizes and conduct statistical analyses using software, greatly reducing the workload of the study.
Figure 1: Sequential Multiple Assignment Randomized Trial
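As a concrete illustration (not drawn from the source; the treatment names are hypothetical), one common two-stage SMART variant re-randomizes only the subjects who fail to respond to their first-stage treatment:

```python
import random

def smart_assign() -> list[str]:
    """Two-stage SMART sketch: randomize the first-stage treatment, then
    re-randomize non-responders between augmenting and switching."""
    sequence = []

    # Stage 1: randomize between two hypothetical first-line interventions.
    first = random.choice(["intervention_A", "intervention_B"])
    sequence.append(first)

    # Interim outcome: in a real trial this is a measured tailoring
    # variable; here it is simulated with a coin flip.
    responder = random.random() < 0.5

    # Stage 2: responders continue their treatment; non-responders are
    # re-randomized between augmenting it and switching to the alternative.
    if responder:
        sequence.append("continue_" + first)
    else:
        other = "intervention_B" if first == "intervention_A" else "intervention_A"
        sequence.append(random.choice(["augment_" + first, "switch_to_" + other]))
    return sequence

print(smart_assign())
```

Comparing outcomes across the resulting treatment sequences is what allows a SMART to identify the best-performing adaptive strategy.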
4.1.4 Stepped-Wedge Design (SWD)
The Stepped-Wedge Design (SWD) is suitable for cluster randomized trials in which the intervention is expected to do more good than harm. Groups (clusters) are first randomly assigned an order, and the trial is divided into stages in chronological order. The intervention is then rolled out sequentially according to that order: groups that have crossed over continue to receive the intervention, while groups still waiting remain unexposed, until all groups have received the intervention.
This design can address the question in research of whether outcomes can be reasonably attributed to the intervention without being influenced by other factors. The ideal stepped-wedge design involves observing study subjects for an extended period after receiving interventions for longitudinal comparisons.
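To make the rollout concrete, the sketch below (illustrative only; not from the source) constructs the cluster-by-period schedule matrix that defines a stepped-wedge trial, with 0 denoting the control (waiting) condition and 1 the intervention condition:

```python
import random

def stepped_wedge_schedule(n_clusters: int, n_periods: int) -> list[list[int]]:
    """Build a cluster-by-period rollout matrix for a stepped-wedge design.

    0 = control (waiting), 1 = intervention. Every cluster starts in the
    control condition and, once crossed over, stays in the intervention
    condition until the end of the trial. Assumes n_periods = n_clusters + 1,
    so one cluster crosses over per period after an all-control baseline.
    """
    order = list(range(n_clusters))
    random.shuffle(order)  # randomize the order in which clusters cross over

    schedule = [[0] * n_periods for _ in range(n_clusters)]
    for step, cluster in enumerate(order):
        for period in range(step + 1, n_periods):  # period 0 is baseline
            schedule[cluster][period] = 1
    return schedule

# Example: 4 clusters over 5 periods; by the final period every cluster
# has received the intervention.
for row in stepped_wedge_schedule(4, 5):
    print(row)
```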
4.2 Quasi-Experimental Studies
When it is not feasible to conduct a randomized controlled trial (RCT), or when there are ethical concerns surrounding an RCT, quasi-experimental studies can be considered. Although quasi-experimental studies have less statistical power for hypothesis testing than RCTs, they do not require random allocation and are generally easier to execute.
4.2.1 Interrupted Time Series Design (ITS)
Interrupted Time Series (ITS) is a quasi-experimental approach with relatively strong inferential power, often used in public health to validate the effects of interventions. ITS involves collecting data at multiple time points before and after the introduction of an intervention or exposure in order to assess its effects. By leveraging longitudinal data, ITS designs allow the effects of an intervention to be compared directly, addressing the implementation science question of whether outcomes can reasonably be attributed to the intervention rather than to other factors. Because data are collected repeatedly around the intervention time point, the risk that the apparent intervention effect is confounded by long-term secular trends is reduced.
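ITS data are commonly analyzed with segmented regression. One standard specification (added here for illustration; it is not given in the source) is

$$
Y_t = \beta_0 + \beta_1 T_t + \beta_2 X_t + \beta_3 P_t + \varepsilon_t,
$$

where $T_t$ counts periods since the start of the series, $X_t$ is an indicator equal to 1 after the intervention begins, and $P_t$ counts periods since the intervention (0 beforehand). Under this model, $\beta_1$ captures the pre-intervention trend, $\beta_2$ the immediate level change at the intervention point, and $\beta_3$ the change in slope attributable to the intervention.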
4.2.2 Regression Discontinuity Design (RDD)
When an RCT is not needed (for example, because the intervention has already been proven effective) or not feasible, Regression Discontinuity Design (RDD) can be used for causal inference. RDD leverages real-world assignment rules to estimate causal effects near a threshold. It can answer implementation science questions such as "Under what conditions or contexts should interventions be implemented? What are the current conditions influencing implementation?" Thistlethwaite and Campbell first proposed RDD in 1960. Non-randomized trials may suffer from unaccounted-for confounding, and RDD can help reduce the impact of confounders on outcomes. However, compared to RCTs, RDD requires a larger sample size, and research articles using RDD currently need further improvement in design and reporting standards.
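Formally, in a sharp RDD, treatment status changes deterministically at a cutoff $c$ of a running variable $X$ (for example, an age-based eligibility threshold), and the causal effect at the threshold is estimated as the jump in the conditional mean of the outcome (shown here for illustration; the formula is not given in the source):

$$
\tau_{\mathrm{RDD}} \;=\; \lim_{x \downarrow c} E[\,Y \mid X = x\,] \;-\; \lim_{x \uparrow c} E[\,Y \mid X = x\,].
$$

Because units just above and just below the cutoff are assumed to be otherwise comparable, this jump can be interpreted as the local causal effect of the intervention at the threshold.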
4.2.3 Difference-in-Differences (DID)
Difference-in-Differences (DID) is a widely used method in econometrics that compares control and intervention groups, commonly employed to evaluate the effects of public policies or project implementations. In recent years, DID has gained popularity among researchers, typically in policy impact evaluation studies, such as assessments of the effects of policies like "Beijing-Tianjin-Hebei Coordination" and "High-Speed Rail Opening".
Experiments in which the environment changes naturally, without researcher control, are generally referred to as natural experiments. DID is a mature analytical method for policy research whose logic resembles that of a natural experiment: it treats the implementation of a policy as a natural experiment and compares the outcomes of a group unaffected by the policy (the control group) with those of a group affected by it (the treatment group), thereby estimating the net impact of the policy.
When using DID for policy impact evaluation, the sample data must satisfy three assumptions: the linearity assumption, the individual treatment stability assumption, and the parallel trend assumption. The first two are generally satisfied and do not require separate verification; the focus is primarily on verifying the third.
Assumption 1 (Linearity): The potential outcome variable is linear in both the treatment variable and the time variable.
Assumption 2 (Individual Treatment Stability): The policy intervention affects only the treatment group and does not spill over to the control group.
Assumption 3 (Parallel Trends, most important): Before the policy intervention, the outcome trends of the treatment and control groups are the same (parallel).
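These assumptions underpin the canonical two-group, two-period DID regression (shown here for illustration; the specification is not given in the source):

$$
Y_{it} = \beta_0 + \beta_1\,\mathrm{Treat}_i + \beta_2\,\mathrm{Post}_t + \beta_3\,(\mathrm{Treat}_i \times \mathrm{Post}_t) + \varepsilon_{it},
$$

where $\mathrm{Treat}_i$ indicates membership in the group affected by the policy and $\mathrm{Post}_t$ indicates the post-intervention period. Under the parallel trend assumption, the coefficient on the interaction term, $\beta_3$, identifies the net effect of the policy: the change observed in the treatment group minus the change it would have experienced anyway, as proxied by the control group.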
Parallel trend tests can be performed in various ways, including t-tests, tests of the significance of interaction terms, F-tests, and graphical methods. Compared with other methods, the DID model and its principles are relatively easy to understand and apply. Moreover, DID can largely avoid endogeneity problems, effectively controlling for the mutual influence between the dependent and independent variables. These two factors explain its widespread application in recent years. However, DID also has limitations: it is mainly suited to panel data and cannot be applied when only cross-sectional data are available.
Additionally, in practical research, it may be challenging to find an appropriate control group for comparison, which could lead to less rigorous research. Therefore, researchers should pay attention to the assumptions required for the use of DID in practical research. If DID is not suitable, alternative methods such as PSM-DID and synthetic control methods can be considered.
5 Implementation Strategies
In implementation science, implementation strategies are defined as methods or techniques used to enhance the adoption, implementation, sustainability, and dissemination (or spread) of innovations. For example, when implementing clinical interventions for chronic disease self-management based on electronic health, implementation strategies may include "providing education and training for healthcare professionals." Through this strategy, healthcare professionals come to understand the benefits of the self-management intervention and become more proactive in delivering it through electronic health.
Leeman et al. categorize implementation strategies into five types:
1. Dissemination Strategies: These strategies target the knowledge, awareness, attitudes, and intentions of research members and stakeholders regarding the adoption of innovations. They involve creating key messages and materials and sharing them with the relevant audiences.
2. Process Implementation Strategies: These strategies involve planning and implementing an innovation across the different stages of implementation. They are refined by assessing the environment, involving key stakeholders, and monitoring the implementation process.
3. Integration Strategies: These strategies aim to integrate a specific innovation into a particular environment. For example, if members need to introduce a new technology, their roles and responsibilities may need to change and be updated. For instance, in an implementation study of electronic health interventions for chronic kidney disease self-management, healthcare professionals may need to expand their previous roles as caregivers and educators to include responsibilities for maintaining and managing patients' electronic medical record data.
4. Capacity Building Strategies: These strategies aim to enhance the motivation and capability of the audience to participate in implementation. They involve providing targeted skills training, knowledge dissemination, seminars, and similar activities.
5. Scale-Up Strategies: These strategies focus on enhancing the ability to implement and scale an innovation smoothly across settings. For example, promoting the scale-up of electronic health interventions may require training in the intervention program and the establishment of electronic health infrastructure, such as self-management information systems.
Powell et al. have compiled 73 implementation strategies, forming the Expert Recommendations for Implementing Change (ERIC). ERIC used expert consensus to create a clear classification of implementation strategies, grouping the 73 strategies into nine thematic clusters. Table 1 presents examples of implementation strategies from each of the nine thematic groups, along with their definitions.
Table 1. Examples and Definitions of Implementation Strategies in the Nine Thematic Groups of ERIC

1. Using Evaluative and Iterative Strategies
- Assess readiness for innovation implementation, identify barriers and facilitators: Assess various aspects of an organization to determine its readiness for innovation implementation, identify barriers that may hinder implementation, and recognize strengths that can be leveraged during implementation.
- Audit and provide feedback: Collect and summarize clinical performance data over specific periods and provide this information to clinicians and managers for monitoring, assessment, and modification of organizational behavior.

2. Providing Interactive Assistance Strategies
- Clinical supervision: Provide ongoing supervision focused on the innovation for clinical practitioners.

3. Tailoring and Aligning to Context Strategies
- Promote adaptability: Identify ways to adapt clinical innovations to meet the needs of implementation, and specify which elements of the innovation must be retained to maintain alignment with clinical practice.
- Tailor strategies: Modify implementation strategies to address barriers identified during early data collection and to leverage facilitators.

4. Developing Relationships Among Stakeholders
- Identify and cultivate champions: Identify and cultivate individuals committed to supporting, marketing, and driving the implementation of innovations, overcoming indifference or resistance within the organization.
- Identify early adopters: Identify local early adopters who can share their experiences with the innovation and serve as role models.

5. Training and Educating Stakeholders
- Conduct educational meetings: Hold meetings for different stakeholder groups (e.g., providers, managers, other organizational stakeholders, community, patients/consumers, and families) to inform them about the clinical innovation.

6. Supporting Providers
- Revise professional roles: Shift and modify the roles of care providers and redesign the intervention accordingly.

7. Engaging Consumers
- Intervene with consumers to increase uptake and adherence: Collaborate with consumers to develop strategies that encourage engagement and address adherence issues.

8. Using Financial Strategies
- Provide funding and contract opportunities for clinical innovations: Governments and other service payers issue requests for innovative services, specify contracts that encourage bidders to provide innovative clinical services, and establish new funding criteria that make providers more likely to offer innovative services.

9. Changing Infrastructure
- Change record systems: Modify record systems to better evaluate implementation, record clinical outcomes, and monitor the process of care delivery.

These strategies represent approaches that can be employed to facilitate the successful implementation of innovations in healthcare and other fields.
6 Applications of Implementation Research
The focus of implementation research is to identify common implementation issues, understand barriers and facilitators to knowledge translation, develop implementation strategies, and promote the dissemination and sustainable development of interventions. Theoretical models and frameworks can assist researchers in understanding the process of evidence application, guiding the implementation of evidence-based projects. This includes constructing theory-based intervention strategies, selecting appropriate outcome indicators and measurement methods, and guiding the process evaluation of evidence application. Currently, the fields of implementation research mainly include health services, HIV prevention, school health, mental health, cancer control, violence prevention, and rehabilitation, among others.
In 2016, Tavender et al. conducted a scoping review of implementation research in emergency medicine and found a continuing increase in relevant research. Topics included identifying the gap between research evidence and clinical practice, assessing the clinical effectiveness of evidence-based interventions, and identifying barriers and facilitators during implementation. The authors recommended that researchers use theoretical frameworks to guide change, employ more rigorous research designs, attend to the identification of and intervention on barriers, and describe evidence-based interventions in detail when reporting findings.
In 2017, Lourida et al. identified 88 implementation research studies in dementia care. Of these, 70 applied measures to facilitate the translation of research findings into clinical practice, including increased training and education for practitioners, enhanced stakeholder engagement, expanded assessment and monitoring processes, and financial and material support. Sixty-two studies explored barriers and facilitators during implementation; organizational-level barriers in long-term care facilities included restrictions on working hours and workloads, while managerial support was identified as a facilitator.
Also in 2017, Gwadz et al. used a multiphase optimization strategy to develop an efficient, scalable, and cost-effective intervention for optimizing the HIV care continuum among vulnerable populations in the United States, and Meurer et al. used a sequential multiple assignment randomized trial to test post-stroke reperfusion treatment strategies, evaluating outcomes such as patients' quality of life and survival, which informed improved treatment guidelines and increased the probability of reperfusion after stroke. In the same year, Van Den Heuvel's team applied a stepped-wedge design to test the impact of intranasal insulin on cognitive development in children with Phelan-McDermid syndrome.
In 2018, Jacobsen et al. proposed that implementation research could facilitate the translation and application of pain assessment and management practices for cancer patients; the National Cancer Institute suggested integrating evidence-based guidelines on cancer symptom management through implementation research to provide decision support for patients and clinicians, promoting the translation of research evidence into clinical practice. In 2020, Ndejjo's team implemented a cardiovascular disease prevention program in a Ugandan community using a stepped-wedge cluster randomized trial to identify barriers and facilitators to program implementation, and Anderson conducted an observational study using a regression discontinuity design to determine the effectiveness of influenza vaccination in reducing hospitalization and mortality among the elderly, describing the background and conditions under which vaccination achieved these effects.
7 Conclusion
References
[1]. Williams, L. S., & Vickrey, B. G. (2021). Implementation Science. Stroke, 52(12), 4054-4056. DOI: 10.1161/STROKEAHA.121.033971.
[2]. Theobald, S., Brandes, N., Gyapong, M., et al. (2018). Implementation Research: New Imperatives and Opportunities in Global Health. Lancet, 392(10160), 2214-2228. DOI: 10.1016/S0140-6736(18)32205-0.
[3]. Institute of Medicine Committee on Quality of Health Care. (2001). Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press (US).
[4]. Mittman, B. S., Weiner, B. J., Proctor, E. K., et al. (2015). Expanding D&I science capacity and activity within Clinical and Translational Science Award (CTSA) programs: guidance and successful models from national leaders. Implement Sci, 10(1), 1.
[5]. Boehm, L. M., Stolldorf, D. P., Jeffery, A. D. (2020). Implementation Science Training and Resources for Nurses and Nurse Scientists. J Nurs Scholarsh, 52(1), 47-54. DOI: 10.1111/jnu.12510.
[6]. Birken, S. A., Powell, B. J., Shea, C. M., et al. (2017). Criteria for Selecting Implementation Science Theories and Frameworks: Results from an International Survey. Implement Sci, 12(1), 124. DOI: 10.1186/s13012-017-0656-y.
[7]. Tao, F., Shahirose Premji, Wu, X., et al. (2020). Advancing the Application of Implementation Research in the Field of Public Health in China. Chinese Journal of Preventive Medicine, 54, 8-12. DOI: 10.3760/cma.j.issn.0253-9624.2020.01.004.
[8]. Xie, R., Xu, D., Li, H. (2020). Research Methods in the Field of Implementation Science in the Medical and Health Sector. Chinese Journal of Evidence-Based Medicine, 20(9), 1104-1110. DOI: 10.7507/1672-2531.202003234.
[9]. Qu, Z., Guo, S., Zhang, W., et al. (2017). Insights from Implementation Science for the Construction of China's Mental Health Service System. Journal of Beijing Normal University (Social Sciences), 2017(2), 29–36.
[10]. Zhang, J. (2018). Guidelines for the 2017 National Natural Science Foundation of China and Canadian Institutes of Health Research Collaboration Research Project on Mental Health and Dementia Implementation. [Online]. Retrieved from [URL].
[11]. Gu, Y., Hu, Y. (2015). Progress in the Application of the PARIHS Evidence-Based Practice Conceptual Framework. PLA Journal of Nursing, 32(8), 45–47.
[12]. Zhou, Y., Hu, Y., Gu, Y., et al. (2016). Application of Knowledge Translation Models in Evidence-Based Practice. Chinese Journal of Nursing, 31(2), 84–87.
[13]. Shao, H., Wang, Q., Hu, Y., et al. (2015). Interrupted Time Series Analysis and Its Application in Public Health. Chinese Journal of Epidemiology, 36(9), 1015–1017.
[14]. Zhao, F., Yang, H., Lin, Z., et al. (2012). Comparison of Outpatient Services at a Community Health Service Center and Township Health Center in Ximou Town After the Implementation of the Basic Drug System. Chinese Health Policy Research, 5(11), 19-26.
[15]. Jabbour, M., Newton, A. S., Johnson, D., et al. (2018). Defining Barriers and Enablers for Clinical Pathway Implementation in Complex Clinical Settings. Implement Sci, 13(1), 139. DOI: 10.1186/s13012-018-0832-8.
[16]. McNulty, M., Smith, J. D., Villamar, J., et al. (2019). Implementation Research Methodologies for Achieving Scientific Equity and Health Equity. Ethn Dis, 29(Suppl 1), 83-92. DOI: 10.18865/ed.29.S1.83.
[17]. Lewis, C. C., Boyd, M. R., Walsh-Bailey, C., et al. (2020). A Systematic Review of Empirical Studies Examining Mechanisms of Implementation in Health. Implement Sci, 15(1), 21. DOI: 10.1186/s13012-020-00983-3.
[18]. Banerjee, S., Taylor, R. S., Hellier, J. (2020). Randomized Controlled Trials. Chest, 158(1), S79-S87. DOI: 10.1016/j.chest.2020.03.013.
[19]. Curran, G. M., Bauer, M., Mittman, B., et al. (2012). Effectiveness-implementation Hybrid Designs: Combining Elements of Clinical Effectiveness and Implementation Research to Enhance Public Health Impact. Med Care, 50(3), 217-226. DOI: 10.1097/MLR.0b013e3182408812.
[20]. Chirwa, E., Kapito, E., Jere, D. L., et al. (2020). An Effectiveness-implementation Hybrid Type 1 Trial Assessing the Impact of Group Versus Individual Antenatal Care on Maternal and Infant Outcomes in Malawi. BMC Public Health, 20(1), 205. DOI: 10.1186/s12889-020-8276-x.
[21]. Smith, J. D., Berkel, C., Jordan, N., et al. (2018). An Individually Tailored Family-centered Intervention for Pediatric Obesity in Primary Care: Study Protocol of A Randomized Type II Hybrid Effectiveness-implementation Trial (Raising Healthy Children study). Implement Sci, 13(1), 11. DOI: 10.1186/s13012-017-0697-2.
[22]. Vaughn, A. E., Studts, C. R., Powell, B. J., et al. (2019). The impact of Basic vs. Enhanced Go NAPSACC on Child Care Centers Healthy Eating and Physical Activity Practices: Protocol for A Type 3 Hybrid Effectiveness-implementation Cluster-randomized Trial. Implement Sci, 14(1), 101. DOI: 10.1186/s13012-019-0949-4.
[23]. Landes, S. J., McBain, S. A., Curran, G. M. (2020). An Introduction to Effectiveness-implementation Hybrid Designs. Psychiatry Res, 283, 112630. DOI: 10.1016/j.psychres.2019.112513.
[24]. Collins, L. M., Murphy, S. A., Nair, V. N., et al. (2005). A Strategy for Optimizing and Evaluating Behavioral Interventions. Ann Behav Med, 30(1), 65-73. DOI: 10.1207/s15324796abm3001_8.
[25]. Gwadz, M. V., Collins, L. M., Cleland, C. M., et al. (2017). Using the Multiphase Optimization Strategy (MOST) to Optimize an HIV Care Continuum Intervention for Vulnerable Populations: A Study Protocol. BMC Public Health, 17(1), 383. DOI: 10.1186/s12889-017-4279-7.
[26]. Gallis, J. A., Bennett, G. G., Steinberg, D. M., et al. (2019). Randomization Procedures for Multicomponent Behavioral Intervention Factorial Trials in the Multiphase Optimization Strategy Framework: Challenges and Recommendations. Transl Behav Med, 9(6), 1047-1056. DOI: 10.1093/tbm/iby131.
[27]. Murphy, S. A. (2005). An Experimental Design for the Development of Adaptive Treatment Strategies. Stat Med, 24(10), 1455-1481. DOI: 10.1002/sim.2022.
[28]. Doorenbos, A. Z., Haozous, E. A., Jang, M. K., et al. (2019). Sequential Multiple Assignment Randomization Trial Designs for Nursing Research. Res Nurs Health, 42(6), 429-435. DOI: 10.1002/nur.21988.
[29]. Meurer, W. J., Seewald, N. J., Kidwell, K. (2017). Sequential Multiple Assignment Randomized Trials: An Opportunity for Improved Design of Stroke Reperfusion Trials. J Stroke Cerebrovasc Dis, 26(4), 717-724. DOI: 10.1016/j.jstrokecerebrovasdis.2016.09.010.
[30]. Candlish, J., Teare, M. D., Cohen, J., et al. (2019). Statistical Design and Analysis in Trials of Proportionate Interventions: A Systematic Review. Trials, 20(1), 151. DOI: 10.1186/s13063-019-3206-x.
[31]. Proctor, E. K., Powell, B. J., McMillen, J. C. (2013). Implementation strategies: recommendations for specifying and reporting. Implement Sci, 8, 139.
[32]. Chambers, D., Vinson, C., Norton, W. (2018). Advancing the science of implementation across the cancer continuum. New York: Oxford University Press.
[33]. Leeman, J., Birken, S. A., Powell, B. J., et al. (2017). Beyond "implementation strategies": classifying the full range of strategies used in implementation science and practice. Implement Sci, 12(1), 125.
[34]. Waltz, T. J., Powell, B. J., Chinman, M. J., et al. (2014). Expert recommendations for implementing change (ERIC): protocol for a mixed methods study. Implement Sci, 9, 39.
[35]. Powell, B. J., Waltz, T. J., Chinman, M. J., et al. (2015). A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci, 10, 21.
[36]. Tavender, E. J., Bosch, M., Fiander, M., et al. (2016). Implementation research in emergency medicine: a systematic scoping review. Emerg Med J, 33(9), 652-659.
[37]. Lourida, I., Abbott, R. A., Rogers, M., et al. (2017). Dissemination and implementation research in dementia care: a systematic scoping review and evidence map. BMC Geriatrics, 17, 147.
[38]. Jacobsen, P. B., Snyder, C. F. (2018). Improving pain assessment and management in routine oncology practice: the role of implementation research. J Clin Oncol, 36(13), 1272-1274.