Resources
-
State Capacity Assessment (SCA)
The primary purpose of the State Capacity Assessment (SCA) is to assist state agencies, regional education agencies, and school districts in implementing effective innovations that benefit students. The capacity of a state to facilitate implementation refers to the systems, activities, and resources that are necessary to successfully adopt and sustain Effective Innovations.
Download: State Capacity Assessment
-
Case Example: Statewide Implementation and Scaling Evaluation
This report documents the culmination of ten years of development of a new approach to improving human service systems, organizations, and outcomes. Based on the Active Implementation Frameworks, intensive support is provided to develop implementation and scaling infrastructures in state education systems to initiate and manage change processes, and to provide reliable supports for improved teacher instruction and student learning. Measures of capacity inform action planning and monitor progress in states, regions, districts, schools, and classrooms. Implementation capacity is a significant addition to efforts to improve education in the United States.
Download: SISEP – Implementation And Scaling Evaluation Report
-
Case Example: Changing State Education Systems
A decade ago, the U.S. Department of Education Office of Special Education Programs (OSEP) recognized the importance of implementation science, a recognition that led to the development of what is now the State Implementation and Scaling up of Evidence-Based Programs (SISEP) Center. The design of SISEP’s work in state education systems is based on the developing field of implementation science. As such, SISEP makes use of the best available evidence in implementation practice, science, and policy. In 2014, the SISEP Center completed Year 6 of its work with states. The purpose of this document is to summarize key findings.
Download: SISEP – Systems Changes In State Education Systems
-
Implementation Frameworks: An Analysis
While the importance of implementation science is increasingly recognized, the growing field finds itself fragmented across disciplines. Researchers in different disciplines, with different traditions and interests, use varied language to describe common concepts or, conversely, use common language to describe different concepts. Taking steps toward developing a “generalizable framework” is the intent of the analysis reported in this article. A unified field of implementation can contribute to improving the impact of innovations supported by evidence. This analysis uses the six components of the Active Implementation Frameworks (AIF), initially developed to reflect many disciplines, as a grounding from which other known implementation frameworks are examined. A qualitative content analysis of 23 implementation frameworks was conducted. Findings reveal more similarity than difference across frameworks, including a strong focus on the Implementation Drivers. Differences appear in the extent to which frameworks address Systemic Change or Implementation Teams. Implications for implementation practice are discussed.
Download: Implementation Frameworks: An Analysis
-
Implementation Drivers: Responsibility Analysis
In many settings, responsibility for the implementation infrastructure is shared across levels and entities in a system. Therefore, the Responsibility Analysis is essential to ensure that the relevant parties are engaged 1) to better understand the current roles, responsibilities, funding and communication links to effectively support the use of evidence-based innovations and 2) to assess the current implementation supports with regard to identifying gaps and duplication of effort.
Download: (English) Drivers Responsibility Analysis
Download: (Français) Les éléments moteurs de l’implémentation: analyse des responsabilités (révisée 2022)
-
An implementation-informed solution for COVID 19 (and COVID 20, 21, …)
The space between intention and accomplishment is the domain of Active Implementation science. In this commentary we separate current intention from accomplishment, highlight the need for active implementation, and describe the critical functions of implementation teams to establish and sustain a national health response to current and future challenges.
-
Handout 2: Community Health Teams and Implementation Teams for National Health Response
To assure effective societal action to protect public health, Implementation Teams must be established to support the development and services of Community Health Teams in neighborhoods and communities across the country. Purposeful, active, and effective implementation work (making it happen) is done by Implementation Teams. The functions of the Teams described here are required to establish or sustain a healthy, functioning society. Once in place, the teams stand ready to take on current and future challenges. COVID 19 is used here to illustrate the separate but interdependent functions of both teams.
-
OnAir: Implementation Research after 15 Years
Almost two decades ago our research network set out to describe the current state of the science of implementation and identify what it would take to improve the uptake, fidelity, and outcomes of innovative programs and practices in human services.
-
OnAir: COVID 19 and a Case for Active Implementation Teams
Hot on our minds is ending the current pandemic while continuing our community and work lives. To accomplish this, Active Implementation tells us we need effective interventions and effective implementation to realize socially significant outcomes. Implementation Teams lead the effort.
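This relationship is often summarized in Active Implementation materials as a multiplicative formula; a sketch of a commonly cited form (some versions add an Enabling Contexts term, included here) is:

Effective Interventions × Effective Implementation × Enabling Contexts = Socially Significant Outcomes

The multiplication signs convey that a weak or missing factor undermines the whole: a strong intervention that is poorly implemented yields weak outcomes.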
-
Eradication of Smallpox & Implementation
The approaches to implementation and scaling outlined in the Active Implementation Frameworks (Fixsen, Blase, & Van Dyke, 2019) have been used to conduct a post hoc analysis of the eradication of smallpox as described by William Foege in the book House on Fire (Foege, 2011).
Download: Eradicating smallpox – Implementation
-
OnAir: Do we have a science of implementation?
Research sometimes is confused with science. Are we engaged in research that will produce and grow our science?
-
The Origins of Social Validity
The term and constructs related to social validity, as an important aspect of research on human behavior, were developed by Montrose M. Wolf. Mont Wolf was a co-developer of Applied Behavior Analysis (Baer, Wolf, & Risley, 1968; Risley, 2001) and a key architect of the Teaching-Family Model (Phillips, Phillips, Fixsen, & Wolf, 1971; Wolf, 1968) that exemplified the use of Applied Behavior Analysis in practice. The Teaching-Family Model is an enduring testament to the genius of Mont Wolf (Fixsen & Blase, 2018) and his influence in government (Voit, 1995) and on thinking about social change (Wolf, Kirigin, Fixsen, Blase, & Braukmann, 1995).
Download: The Origins of Social Validity
-
Implementation Practice and Science
This book is 378 pages of the latest information on implementation research and practice to promote a more definitive and developmental implementation science. With over 500 references, the book updates and extends the summaries of research and practice that have been published since 2005. The book includes previously unpublished data regarding strong implementation variables and associated implementation outcomes in a wide variety of human services. The global focus and inclusion of implementation variables and outcomes in health provides a new view of familiar problems and solutions. The generic and practical focus on how to support high fidelity use of effective innovations is augmented by an emphasis on science as it has been defined historically and currently. We are pleased to make the first edition available to all those interested in extending and refining the implementation knowledge base.
-
Improvement Cycles Example
An example of an approach to establishing usable interventions and implementation supports is provided. The Plan, Do, Study, Act Cycle (PDSAC) logic is purposely used to develop the innovation and its implementation supports simultaneously.
Download: Improvement Cycles Example
-
Heptagon Tool
During Exploration Stage discussions, information gathering focuses on the community, relevant governance, workforce, structures and programs, infrastructure and technology, and resources. Key stakeholders generally are individuals who could have a positive or negative effect on the use of the innovation. Many of these key stakeholders may be affected by the innovation through training, changes in practice, or additional responsibilities.
Van Dyke, M., Kiser, L., and Blase, K. (2019). Heptagon Tool. Chapel Hill, NC: Active Implementation Research Network. www.activeimplementation.org/resources
-
Science and Implementation
The purpose of this article is to explore implementation science as a science. The idea of science, and the use of the scientific method to test predictions and hypotheses to advance science are discussed in the practical contexts faced by implementation scientists.
Fixsen, D. L., Blase, K. A., & Van Dyke, M. (2018). Science and implementation. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
Download: Science and Implementation
-
Assessing Implementation Stages
The Active Implementation Stages are iterative and recursive and do not have end points since they tend to overlap and flow back and forth for many years as people and circumstances change. Assessing Stages allows Implementation Teams and Specialists to adjust their activities to provide more precise and helpful support to achieve intended benefits.
Fixsen, D. L., Blase, K. A., & Van Dyke, M. (2018). Assessing implementation stages. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
Download: AssessingImplementationStages
-
Synthesis of Implementation Frameworks
As knowledge about evidence-based implementation has grown, so too has the number of theoretical frameworks. Two major reviews in 2012 produced a list of 32 unique frameworks. A parsimonious assumption is that implementation is universal. If this assumption is true, then it also is true that each unique framework emphasizes some aspects of the universal. Identifying and combining fragments of the whole contained within unique frameworks can establish an integrated implementation framework to serve the interests of all fields.
Fixsen, D. L., & Fixsen, A. A. M. (2016). An integration and synthesis of current implementation frameworks. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
Download: Fixsen-and-Fixsen-IntegrationAndSynthesisImplementationFrameworks-2016
-
OSEP Case Example
The U.S. Department of Education Office of Special Education Programs (OSEP) is a model for other government agencies seeking to support the development of implementation capacity in human service systems. In 2006 OSEP was the first federal agency to recognize the potential benefits of implementation science for improving student outcomes.
Download: OSEP Case Example
-
Implementation Capacity “Look Fors”
“Look fors” are observable indicators of more complete and complex processes related to a major goal. The indicators are what reviewers look for when visiting a state, reading documents, or discussing plans and progress with participants in SSIP activities.
Download: SISEP-LookForsInSSIP-05-2016
-
OSEP Guidance and Review Tool
The focus of OSEP is on building State capacity to support local educational agencies (LEAs) with the implementation of evidence-based practices (EBPs) that will lead to measurable improvement in the State-identified Measurable Result(s) (SIMR) for children with disabilities. The OSEP Guidance and Review Tool is based on three components: 1) Infrastructure Development; 2) Support for LEA Implementation of EBPs; and 3) Evaluation.
Download: OSEP-GuidanceAndReviewTool-09-19-2015
-
Staff Selection Process
Staff selection continues to be an overlooked part of implementation and scaling. The processes for recruiting staff (new or existing) to use innovations and implementation best practices and for interviewing candidates for key skills (e.g. judgement, coachability) are detailed with examples of a telephone screening interview, in-person scenario interview, and role plays (work samples) for skill assessment.
Fixsen, D. L., Blase, K. A., & Van Dyke, M. (2018). Staff selection processes. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
Download: StaffSelectionProcess
-
Implementation Specialist Position Description and Interview
Implementation Specialists have the expertise to identify and develop Usable Innovations, use and teach implementation best practices, and support organization and system change. The process for recruiting and interviewing candidates for this challenging role is detailed.
Fixsen, D. L., Blase, K. A., & Van Dyke, M. (2018). Implementation Specialist Position Description and Interview Protocol. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
-
Implementation Science: Fidelity Predictions
What are the foundations for considering implementation as a science? Prediction (if…then) is the heart of science, and theory is the source of predictions that advance science. Active Implementation theory predicts that if fidelity is achieved, then outcomes will improve. A summary of the evidence is presented in this paper.
Fixsen, D. L., Van Dyke, M., & Blase, K. A. (2019). Implementation science: Fidelity predictions and outcomes. Chapel Hill, NC: Active Implementation Research Network. www.activeimplementation.org/resources
Download: Implementation Science FidelityPredictionsOutcomes
-
Developing Usable Innovations
An overview of how to turn programs and practices into Usable Innovations that are teachable, learnable, doable, assessable, and scalable in practice. Usable Innovations combine evidence, improvement, implementation, and outcome. Practice profiles operationalize essential functions of a Usable Innovation.
Blase, K. A., Fixsen, D. L., & Van Dyke, M. (2018). Developing Usable Innovations. Retrieved from Chapel Hill, NC: Active Implementation Research Network: www.activeimplementation.org/resources
Download: DevelopingUsableInnovations
-
The Teaching-Family Model: The First 50 Years
The Teaching-Family Model was perhaps the first “evidence-based program” in human services. This article describes the development of the treatment model, the failure of the first attempts to replicate the treatment model, the discovery of larger units for replication, the modest success of first attempts to replicate larger units, and the eventual success of replications. The Teaching-Family Model is a testament to the sustainability (and continual improvement) of innovation and implementation methods and to the value of the Teaching-Family Association for sustaining a community of practice and for managing the practitioner fidelity and organization fidelity data systems nationally. The benefits of applied behavior analysis, and the implications of a new science of implementation for having research purposefully used in practice, are explored.
Download: TeachingFamilyModelFirst50Years
-
Assessing Drivers Best Practices
Implementation Drivers are the key components of capacity and the functional infrastructure supports for the successful use of practices or programs. The three categories of Implementation Drivers are Competency, Organization, and Leadership. The assessment tool can be used by Implementation Teams during any Implementation Stage of using an innovation. The assessment asks knowledgeable respondents to rate the extent to which implementation supports are currently in place, based on their experiences.
Download:
(English) Assessing Drivers Best Practices
(Français) Évaluation des éléments moteurs de l’implémentation (révisée 2022)
-
An Integration and Synthesis of Implementation Frameworks
Implementation science has developed to support the use of innovations in individual fields within and outside of human services. Since the 1990s, as the evidence-based program movement gained momentum and the quality chasm widened, implementation science has been recognized as a missing link in the science-to-service chain. As a result, knowledge about evidence-based implementation has grown. The depth and breadth of this expanded knowledge has been expressed in unique implementation frameworks. A qualitative analysis of 32 frameworks is summarized and the multiple frameworks are synthesized in this document.
Download: An Integration and Synthesis of Implementation Frameworks
-
Implementation Specialist Practice Profile
The work of Implementation Specialists and Implementation Teams has been going on for years; however, the designation of an ‘implementation specialist’ position or an ‘Active Implementation Practitioner’ is relatively new, and the clarification of the core competencies of the position is recent. The practice profile reflects the core competencies found to be critical across numerous projects that have successfully supported the full and effective use of innovations.
Download: AIRN-AiPracticeProfile
-
Measures of Implementation in Practice: Implementation Climate Assessment
Implementation climate has been recognized as a contributor to and reflection of the full and effective use of innovations in organizations. The Implementation Climate Assessment is intended to be used with three groups of key informants within a human service organization. The information from all key informants helps assure a complete view of implementation progress at one point in time within the organization. The measures can be repeated to assess initial progress toward full implementation and to assess changes in implementation over time.
Download: ActiveImplementationClimateAssessment
-
Core Intervention Components: Identifying and Operationalizing What Makes Programs Work
Rather than being based on hunches and best guesses, innovations, interventions, and programs increasingly are expected to be evidence-based. However, when evidence-based programs are replicated or scaled up, it is critical not only to know whether a program works, but which program elements are essential in making the program successful. To date, though, few programs have had hard data about which program features are critical (the core components) and which features can be adapted without jeopardizing outcomes (an adaptable periphery). Since issues related to the core components of innovations are relevant to producing new knowledge about what works and to moving science to practice in socially significant ways, this brief is useful for a range of professionals and stakeholders, including program developers, researchers, implementers, and policy makers.
Download: Core Intervention Components: Identifying and Operationalizing What Makes Programs Work
-
Implementation Quotient for Organizations
Many organizations and funders have come to understand that Full Implementation can be reached in 2 to 4 years with support from a competent Implementation Team or Purveyor. These organizations and funders also realize that implementation is an active process, with simultaneous work going on at many levels to help assure the full and effective use of an effective innovation. The Implementation Tracker was developed to monitor implementation progress over many years and to track the return on investing in implementation capacity.
Download: ImplementationQuotientforOrganizations
-
The Evidence Bases for the Teaching-Family Model (Bibliography)
The Teaching-Family Model provides an example of research that has been “transmitted to the field” to benefit large numbers of children and families. The Teaching-Family Model also has developed and tested a strategy for dissemination-implementation to help “move science to service successfully.” The research on the development of the Teaching-Family Model was supported by NIMH and other agencies for over 20 years. The studies outlined in this bibliography demonstrate that research can be done on the “active ingredients” of an intervention as they are taught to practitioners during training workshops and coaching on the job, used in treatment programs, and correlated with important outcomes for the youths and for the practitioners. Research has helped to develop the Teaching-Family Model and has helped to promote the effective adoption and sustainable implementation of the Model nationally.
Download: The Evidence Bases for the Teaching-Family Model (Bibliography)
-
Implementation in the Real World: Purveyors’ Craft Knowledge
In the Fall of 2004 a select group of expert program purveyors of evidence-based programs and practices was invited to a working meeting to explore the “craft knowledge” related to the successful use of evidence-based programs and practices. This document outlines meeting methods and meeting results, along with a summary of responses to questions. The results of the meeting helped to operationalize and extend the knowledge base for Active Implementation practice and research.
Download: Implementation in the Real World: Purveyors’ Craft Knowledge
-
Lessons Learned from 64 Evidence-Based Program Developers
In order to achieve positive outcomes, effective innovations and implementation strategies must be used in combination. This study examined the implementation of evidence-based programs and practices in the real world by exploring the ways in which evidence-based program developers support implementation of their programs and practices in new settings. Structured interviews were conducted with a random sample of evidence-based program developers whose programs were listed on the National Registry of Effective Programs and Practices as well as other national registries of evidence-based programs and practices.
The interview was focused on factors derived from a review and synthesis of the implementation evaluation literature. The interviews were recorded, transcribed, and coded to identify similarities and differences between responses as well as themes and patterns that emerged across the participants. Results indicated that program developers provide varying degrees of support to organizations implementing their intervention. In addition, the results describe the extent to which program developers demonstrate varying levels of responsibility for implementation components. Implications for program developers, organizations, policy-makers, and consumers are discussed.
Download: EBP Model Program Study
-
Researcher Perspectives on Implementation Research
In 2005 a meeting of expert researchers was held to develop an outline for a multi-site, multi-year (10 – 15 years) program of research to dramatically improve the practice and science of implementation. Based on findings from the monograph, “Implementation Research: A Synthesis of the Literature” (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005), the focus of the meeting was on implementation process and outcome studies that can inform the implementation of quality programs and practices across domains (e.g. mental health, substance abuse, prevention). It was concluded that the field of implementation research is still in its infancy and in need of a long-term research agenda to focus efforts on successful approaches to implementation, the various influences on implementation, and the interaction effects among implementation factors.
Download: Researcher Perspectives on Implementation Research
-
Implementation Research: A Synthesis of the Literature
The science related to developing and identifying “evidence-based practices and programs” has improved; however, the science related to implementing these programs with fidelity and good outcomes for consumers lags far behind. This monograph describes the science related to implementation and identifies what it will take to transmit innovative programs and practices to mental health, social services, juvenile justice, education, health, early childhood education, employment services, and substance abuse prevention and treatment. The monograph also summarizes findings from the review of the implementation evaluation literature and proposes frameworks for understanding effective implementation processes. The results of this literature review and synthesis confirm that systematic implementation practices are essential to any national attempt to use the products of science, such as evidence-based programs, to improve the lives of its citizens.
Download: Implementation Research: A Synthesis of the Literature
-
Understanding Purveyor and Implementer Perceptions of Implementing Evidence-Based Programs
In the Fall of 2004 a select group of successful program developers (purveyors) and experienced users of evidence-based programs and practices (implementers) were invited to a series of two working meetings to explore the “craft knowledge” related to the implementation of evidence-based programs and practices. The first of these meetings was conducted with the developers of evidence-based programs and practices; the second meeting was conducted with implementers of the same evidence-based programs and practices. The results from a concept mapping process used with the participants of these two working meetings are presented in this paper.
Download: Understanding Purveyor and Implementer Perceptions of Implementing Evidence-Based Programs
-
Evidence-Based Programs and Cultural Competence
In 2003 a meeting was convened of leaders of various cultural, racial, and ethnic professional associations; representatives of family advocacy associations; and developers of evidence-based programs (EBPs). This was an historic meeting to find common ground for using EBPs in culturally and racially diverse communities and for shaping EBPs to address issues directly related to diverse communities. This meeting provided an opportunity for mutual understanding and mutual gain as well as an opportunity to integrate EBP developer, practitioner and community paths. The overall meeting process was structured to systematically solicit information by using a modified Nominal Group Technique and the findings are summarized in this report.
Download: Meeting Summary: Evidence-Based Programs and Cultural Competence
-
Consensus Statement on Evidence-Based Programs and Cultural Competence
In March 2003, the National Implementation Research Network and the Louis de la Parte Florida Mental Health Institute convened a meeting of experts in the area of children’s mental health and cultural competence. These included the developers of evidence-based programs for children; individuals with expertise on African American, Asian American Pacific Islander, Latino, and Native American issues; as well as researchers, family members, and stakeholders. The goals of the meeting were twofold: first, to address the applicability and appropriateness of evidence-based programs for children and adolescents of different cultures and, second, to increase the capacity of systems to develop and implement culturally relevant approaches.
At the meeting, participants developed a consensus statement of what we know and what we do not know about the relationship between evidence-based programs and cultural competence. The objective of this consensus statement is to provide both a platform and a guide for discussions and decisions related to the cultural relevance of evidence-based programs for children and adolescents. Participants also developed recommendations for future action.
Download: Consensus Statement on Evidence-Based Programs and Cultural Competence
-
Family Specialist Fidelity Assessment
This sample fidelity assessment is part of the overall evaluation of Family Specialists in a Teaching-Family program. Family Specialists provide intensive, short-term, in-home treatment to families whose children have been or are about to be removed from their homes due to a variety of social or mental health problems. The fidelity instrument is designed to assess the following dimensions: treatment planning, relationship development, teaching skills, clinical judgement, intervention progress, and record keeping.
Download: Family Specialist Fidelity Evaluation
-
The Boys Town Revolution
This article describes the reorganization and restructuring of Boys Town in the 1970s. Founded in 1917, Boys Town was a 1,400-acre child-care ‘village’ in Nebraska, home to 400 adolescent boys. Boys Town had been unable to keep up with the social changes of the 1950s and 1960s or to cope with the new problems of its teenagers. When a team of behavioral psychologists applied a new method of child-care technology, a venerable institution was transformed.
Download: The Boys Town Revolution
-
ImpleMap: Exploring the Implementation Landscape
When creating implementation capacity in an organization, the first task is to map the current implementation landscape. The ImpleMap interview process provides a baseline assessment and assists implementation specialists in collecting information to inform active implementation planning and development in the organization.
Download: ImpleMapExploringImplementationLandscape
ImpleMap Segment 1 (9 min. 29 sec.)
This segment demonstrates the inquiry process related to the “what”: we want to learn which interventions, programs, frameworks, or practices they are purposefully attempting to implement well, and we want to know on what basis they selected them. In particular, we are finding out if there is a purposeful vetting process for determining the characteristics of programs that they are willing to “invest in” and what the dimensions of that vetting process might be. Our work asks leadership and implementation teams to consider the conditions under which a particular innovation is supported (e.g. need, evidence, readiness for implementation, capacity to implement, fit, resources).
ImpleMap Segment 2 (9 min. 59 sec.)
As the two or three interventions are identified and more is known about the vetting process and how it occurs, we want to understand “who” is involved. This will inform our future work with them related to membership and activities of Leadership and Implementation Teams. This segment also demonstrates how to check in on process and how to make summative statements before moving on. The end of the segment demonstrates a discussion on how to arrive at the one or two interventions they want to focus on. This is the pre-work to engaging in a reflective and guided discussion about the Implementation Drivers.
ImpleMap Segment 3 (9 min. 29 sec.)
This segment is a discussion about the Drivers, starting with selection. Notice how the interviewer followed the group as they clearly were focused on “units” and not practitioners. This still produced useful information (recast as Stages) about how they think about implementation. Notice that because they focused on “units” rather than practitioners, this was a good opportunity to offer some information about the Exploration and Installation Stages of implementation.
ImpleMap Segment 4 (9 min. 44 sec.)
After identifying target interventions and the vetting process, the questions shift to how they operationalize interventions and use the Drivers to prepare practitioners. Notice that the interviewer finds opportunities based on what the participants are saying to weave in information about any driver that they mention. Of course, they will not label it as a driver or use implementation-informed language. But these are great opportunities to recognize their current thoughtful approaches and begin to introduce new language.
ImpleMap Segment 5
(video link currently unavailable)
This segment demonstrates asking questions, in a conversational style, about current practices related to supervision, coaching, and performance assessments. The vocal members of this group are very smart and conceptual. In this segment the interviewer asks questions to get at the details without pushing too hard.
ImpleMap Segment 6 (9 min. 59 sec.)
This segment focuses primarily on coaching and performance assessments and loops back to talking about preparation (e.g. selection, training) of coaches and supervisors. Notice that even though the interviewer is trying to get more specific information about the Drivers, he continues to look for opportunities to capture more information about earlier topics (vetting, operationalizing).
ImpleMap Segment 7 (4 min.)
This particular example of ImpleMapping involves a group that discusses the intervention and implementation processes at a more conceptual level. Notice the interviewer’s attempt to “drill down” by asking specific questions to elicit more detail. Also notice that it is important not to “interrogate” people or make them uncomfortable once you are relatively sure that a process, intervention, Stage, or Driver is not yet operationalized or perhaps not known in sufficient detail by the group you are interviewing. There will be many more opportunities in the future. The ImpleMap interview is just the beginning of a long process of implementation infrastructure development.