Implementing the Improvement Plan
During Phase III, states will implement the improvement plans developed in Phase II, which include improvement strategies in two primary areas: infrastructure development and support for EIS program and/or EIS provider implementation of evidence-based practices (EBPs). The SSIP includes the activities, steps, and resources needed to implement the coherent improvement strategies with attention to the research on implementation and timelines for implementation.
Many states have established teams to support implementation of improvement activities during Phase III. These implementation teams support work at the state level and in local programs. They leverage resources across offices and agencies and address barriers to implementation as they arise. Information is shared among the teams using established feedback loops and communication protocols. Adjustments to the implementation plan are made based on progress and outcome data with input from stakeholders.
The following section addresses considerations and resources that can be used by state staff in implementing improvement strategies and associated activities. Resources and tools related to the implementation process in general, infrastructure development, and support for implementation of EBPs are included in this guide.
Tools and Resources: Implementation Process
A Guide to Implementation Process: Stages, Steps and Activities: Companion State and Local Level Self-Assessments
This guide, developed by the Early Childhood Technical Assistance Center (ECTA), is based on implementation science research and the collective experiences of federally funded technical assistance centers in conducting statewide system change initiatives. The guide includes critical implementation activities for five implementation stages: Exploration, Installation, Initial Implementation, Full Implementation, and Expansion/Scale-up. Outcomes are also provided for each of the stages.
A Pre-Test can be used to determine the status of implementation. The companion State-Level and Local-Level Self-Assessments can be used by leadership teams as they guide and evaluate the systematic implementation, expansion, and sustainability of new practices or innovations. The tools provide a way to systematically assess outcomes that have been achieved and to determine outcomes that still need to be addressed.
Source: Early Childhood Technical Assistance Center. (2014). A guide to implementation process: stages, steps and activities [PowerPoint slides]. Retrieved from https://ectacenter.org/implementprocess/
An Integrated Stage-Based Framework for Implementation of Early Childhood Programs and Systems
This brief provides an integrated stage-based implementation framework that builds on implementation science literature. This framework is based on the following: (1) implementation happens in four discernible stages, and (2) three common threads, or core elements, exist across each of these stages. The three core elements are: building and using implementation teams to actively lead implementation efforts; using data and feedback loops to drive decision-making and promote continuous improvement; and developing a sustainable implementation infrastructure that supports general capacity and innovation-specific capacity for individuals, organizations, and communities.
Source: U.S. Department of Health and Human Services, Office of Planning, Research, and Evaluation. (2015). An integrated stage-based framework for implementation of early childhood programs and systems. Retrieved from http://www.acf.hhs.gov/programs/opre/resource/an-integrated-stage-based-framework-for-implementation-of-early-childhood-programs-and-systems
Get Started: A set of quick start videos and guides developed to help you and your team get started with Active Implementation
The National Implementation Research Network’s Get Started webpage includes videos that can be used to support teams in implementing innovations including evidence-based practices. In addition, the website includes resources related to usable interventions, implementation stages, implementation drivers, implementation teams, and improvement cycles. Modules and lessons with aligned activities are also available.
Source: National Implementation Research Network. (2016). Get started: A set of quick start videos and guides developed to help you and your team get started with active implementation. Retrieved from https://nirn.fpg.unc.edu/resources/active-implementation-frameworks-overview-videos
The Basics of Implementation Science
The Basics of Implementation Science presentation includes an overview of developing an infrastructure that supports implementation, scale-up, and sustainability of effective practices, and highlights core components of implementation: implementation stages, implementation drivers, implementation teams, usable interventions, and improvement cycles.
Source: Davis, S. (2015). Basics of implementation science. Retrieved from https://ideadata.org/resources/resource/175/the-basics-of-implementation-science
Science of Improvement: How to Improve
The Model for Improvement, which was developed by Associates for Process Improvement, is designed to accelerate improvement of programs by building on existing change theories. The model's steps are: forming the team, setting aims, establishing measures, selecting changes, testing changes (which includes the Plan-Do-Study-Act [PDSA] cycle), implementing changes, and spreading changes.
Source: Institute for Healthcare Improvement. (2016). Science of improvement: how to improve. Retrieved from http://www.ihi.org/resources/pages/howtoimprove/scienceofimprovementhowtoimprove.aspx
The 90-Day Cycle Handbook
This document provides an overview of the 90-Day Cycle and information on each of its stages. The 90-Day Cycle can be used to identify barriers to implementation and to target the specific processes needed to address those barriers. Associated tools and resources related to the 90-Day Cycle are included.
Source: Park, S., and Takahashi, S. (2013). The 90-day cycle handbook. Retrieved from https://www.carnegiefoundation.org/resources/publications/90-day-cycle-handbook/
Practice Brief: Best Practice Recommendations for Building and Measuring Capacity
This document defines the essential components of capacity building and provides an at-a-glance summary of best practice recommendations for building and measuring capacity.
Source: National Center for Systemic Improvement. (2016). Practice brief: best practice recommendations for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/PracticeBriefCapacity.pdf
Tools for Building and Measuring Capacity
This document categorizes capacity tools so that teams can determine which ones may be most helpful in their efforts to build and measure capacity.
Source: National Center for Systemic Improvement. (2016). Tools for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/ResourceList-ToolsforBuildingMeasuringCapacity.pdf
During Phase III, states will be implementing improvement strategies and associated activities to enhance the state infrastructure to better support EIS programs and/or EIS providers in implementing and scaling up evidence-based practices to achieve the SIMR(s) for infants and toddlers with disabilities and their families. These strategies, which were developed with input from stakeholders during Phase II, address improvements to one or more components of the state system including: governance, fiscal, quality standards, professional development, data, technical assistance, and accountability/monitoring.
States will continue to work toward further aligning and leveraging other state improvement plans and initiatives that impact infants and toddlers with disabilities. In addition, states will continue to engage multiple offices within the state lead agency (LA), as well as other state agencies (such as the state educational agency, or SEA, if different from the LA), in implementing improvement strategies and associated activities related to improving their infrastructure.
Key considerations for states as they implement infrastructure improvements include the following:
- Ensure infrastructure improvements are connected to root causes identified in Phase I.
- Document what infrastructure changes have been made to support SSIP implementation.
- Use implementation teams to make sure infrastructure improvements are made at both the state and program level as appropriate, track progress, and modify as necessary.
- Revisit timing of implementation of identified infrastructure improvements to ensure that supports are in place for implementation of evidence-based practices (EBPs).
- Use feedback loops to address barriers and make additional modifications to the infrastructure improvements.
- Access sufficient resources to make and sustain infrastructure improvements, including fiscal and human resources.
- Ensure implementation drivers are addressed in the infrastructure improvements to support implementation of EBPs.
- Keep stakeholders informed of progress and engage them in making recommendations for modifications to the infrastructure improvements in the improvement plan.
Tools and Resources: Improvement Strategies to Support Infrastructure Development
A System Framework for Building High-Quality Early Intervention and Preschool Special Education Programs
The framework, which was developed by the Early Childhood Technical Assistance Center (ECTA), can be used by state Part C and Section 619 coordinators and their staff to evaluate their current systems; identify potential areas for improvement; and develop more effective, efficient systems that support implementation of evidence-based practices leading to improved outcomes for young children with disabilities and their families. The ECTA System Framework is organized around six interrelated components: Governance, Finance, Personnel/Workforce, Data System, Accountability and Quality Improvement, and Quality Standards. Each component contains a set of subcomponents that identify key areas of content within the component. Each subcomponent contains a set of quality indicators that specify what needs to be in place to support a high-quality Part C/Section 619 system. Each quality indicator has corresponding elements of quality that operationalize its implementation.
Source: Early Childhood Technical Assistance Center (2015). A system framework for building high-quality early intervention and preschool special education programs. Retrieved from https://ectacenter.org/sysframe/
Framework Self-Assessment Tool
The Framework Self-Assessment Tool, which was developed by ECTA and the Center for IDEA Early Childhood Data Systems (DaSy) with input from partner states, is an Excel-based tool that state staff can use to record the current status of their state system, set priorities for improvement, and measure progress over time.
Source: Early Childhood Technical Assistance Center (2015). Framework self-assessment tool. Retrieved from https://ectacenter.org/sysframe/selfassessment.asp
Applying Implementation Science to State System Change: An Example of Improving the Finance System Component: Implementation of a Family Cost Participation Program in a Hypothetical State
This document provides an example of how implementation science could be applied to improving a hypothetical state’s finance system through the implementation of a family cost participation program. Goals for each of the implementation stages are addressed, and stage-based implementation activities are provided.
Source: Lucas, A., Hurth, J., and Kelley, G. (2015). Applying implementation science to state system change: an example of improving the finance system component: Implementation of a family cost participation program in a hypothetical state. Retrieved from https://ectacenter.org/~pdfs/sysframe/implement-finance-example.pdf
Implementing Evidence-based Practices
During Phase III, states will be supporting EIS programs and/or EIS providers in implementing evidence-based practices (EBPs) to achieve the SIMR(s). During Phase II, most states took one of two approaches to selecting EBPs. One approach was to identify a model/approach with specific practices determined by that model/approach. The second was to identify a model or approach without yet specifying the practices. A few states had not yet identified either a model/approach or specific practices for implementation.
States are also using varied approaches to implementation. Some states are planning to begin with initial implementation sites and later expand or scale up to other programs/providers, while other states are planning statewide implementation. States will need to take their implementation approach (i.e., initial sites or statewide) into account as they consider how they will implement and evaluate EBPs in Phase III.
Some states may need to make adjustments to their implementation plans based on data and stakeholder input in Phase III. These adjustments may include changes in models/approaches or changes in EBPs.
- States that have not yet selected their EBPs will need to identify the EBPs that EIS programs/EIS providers will implement to achieve the SIMR. Key questions states should consider in this process include:
- Do the EBPs fit with the state’s culture, values, and service philosophy?
- Do the EBPs align with current practices/initiatives in the state?
- Which specific practices are likely to have the most direct impact on expected outcomes and the SIMR? How many specific practices can EIS programs/EIS providers reasonably implement with fidelity? (Selecting too many practices can make implementation with fidelity challenging.)
- What opportunities can be provided to engage stakeholders in the process of selecting EBPs?
- All states will need to operationalize their Phase II plans for implementing EBPs based on the activities, steps, and timelines included in their plans, using implementation science and/or improvement science concepts. Key things to consider when implementing EBPs include ensuring that:
- A communication plan is in place and implemented to build awareness and support and solicit stakeholder engagement throughout implementation;
- Necessary infrastructure and administrative supports are in place including resources (e.g., people, funding, materials) to begin implementing EBPs;
- Professional development and other content, such as practice profiles that operationalize the practices included in the model, innovation, or training, are provided or, if necessary, developed;
- Coaches and mentors are trained on the practices that will be implemented;
- Ongoing supports for practitioners, such as coaching and mentoring, are in place and implemented over time;
- Feedback loops are used with initial implementers to identify barriers and make changes to materials/processes prior to expanding or scaling up to other programs/providers;
- Tools to track practice fidelity (e.g., observation checklists, self-assessments) are identified or developed and used;
- Practitioners use data to track progress in implementing EBPs and inform what practices to target with TA, training, and coaching/mentoring;
- Fidelity of implementation of EBPs is monitored and well-documented;
- A clear process is in place to expand/scale up use of EBPs by additional providers/programs as appropriate;
- Continuous improvement cycles are used to evaluate and improve the implementation plan activities and process over time; and
- Strategies to ensure sustainability of practice fidelity are implemented.
Tools and Resources: Implementing Evidence-based Practices
Statewide Implementation Guide
The Statewide Implementation Guide presents a process for implementing evidence-based practices statewide. The guide is based on results and evidence from the multi-year Pyramid Model implementation initiative in 25 states. It includes tools, materials, and examples derived from the Pyramid Model and ECTA's DEC Recommended Practices implementation technical assistance.
Source: Early Childhood Technical Assistance Center. (2018). Statewide Implementation Guide. Retrieved from https://ectacenter.org/sig/
DEC Recommended Practices in Early Intervention/Early Childhood Special Education
This document was developed by the Council for Exceptional Children's Division for Early Childhood (DEC) to support practitioners and families in implementing research-supported practices designed to improve outcomes and promote the development of young children who have, or are at risk for, developmental delays or disabilities. The Recommended Practices, which were updated in collaboration with ECTA, consist of eight domains: leadership, assessment, environment, family, instruction, interaction, teaming and collaboration, and transition. Videos about the practices are available on DEC's website.
Source: Division for Early Childhood. (2014). DEC recommended practices in early intervention/early childhood special education. Retrieved from http://www.dec-sped.org/recommendedpractices
Practice Improvement Tools: Using the DEC Recommended Practices
The Practice Improvement Tools help practitioners implement evidence-based practices. They are based on the Division for Early Childhood (DEC) Recommended Practices. These tools guide practitioners and families in supporting young children who have, or are at risk for, developmental delays or disabilities across a variety of early childhood settings. They include performance checklists, practice guides, illustrations of the practices, and guidance materials.
The performance checklists help practitioners improve their skills, plan interventions, and self-evaluate their use of evidence-based practices. Practice guides for practitioners and families explain the practices, demonstrate how to carry them out using videos and vignettes, and describe how practitioners will know whether the practices are working. The tools also include an interactive product selection tool and professional development modules.
Source: Early Childhood Technical Assistance Center. (2018). Practice Improvement Tools: Using the DEC Recommended Practices. Retrieved from https://ectacenter.org/decrp
How to Scale Up Effective Programs Serving Children, Youth, and Families
This research brief reviews best practices for scaling up effective programs, based on a comprehensive literature review. Examples from several programs that were successfully scaled up are included.
Source: Sacks, V., Beltz, M., Beckwith, S., and Anderson Moore, K. (2015). How to scale up effective programs serving children, youth, and families. Retrieved from https://www.childtrends.org/publications/how-to-scale-up-effective-programs-serving-children-youth-and-families-2
Practice Profile Planning Tool
This planning tool can be used to identify core components or essential functions of the evidence-based practices that are being implemented. Core components of the practices can be defined or operationalized, and expected, developmental, and unacceptable practice variations can be shown. This tool can support the identification or development of fidelity measures to determine whether the practice is being implemented as intended.
Source: State Implementation and Scaling-up of Evidence-based Practices Center and National Implementation Research Network. (2014). Practice profile planning tool. Retrieved from https://nirn.fpg.unc.edu/resources/practice-profile-planning-tool