
Implementing the Improvement Plan

During Phase III, states will implement the improvement plans developed in Phase II, which include improvement strategies in two primary areas: infrastructure development and support for EIS program and/or EIS provider implementation of evidence-based practices (EBPs). The SSIP includes the activities, steps, and resources needed to implement the coherent improvement strategies with attention to the research on implementation and timelines for implementation.

Many states have established teams to support implementation of improvement activities during Phase III. These implementation teams support work at the state level and in local programs. They leverage resources across offices and agencies and address barriers to implementation as they arise. Information is shared among the teams using established feedback loops and communication protocols. Adjustments to the implementation plan are made based on progress and outcome data with input from stakeholders.

The following section addresses considerations and resources that can be used by state staff in implementing improvement strategies and associated activities. Resources and tools related to the implementation process in general, infrastructure development, and support for implementation of EBPs are included in this guide.

Tools and Resources: Implementation Process

  • This guide, developed by the Early Childhood Technical Assistance Center (ECTA), is based on implementation science research and the collective experiences of federally funded technical assistance centers in conducting statewide system change initiatives. The guide includes critical implementation activities for five implementation stages: Exploration, Installation, Initial Implementation, Full Implementation, and Expansion/Scale-up. Outcomes are also provided for each of the stages.

    A Pre-Test can be used to determine the status of implementation. The companion State-Level and Local-Level Self-Assessments can be used by leadership teams as they guide and evaluate the systematic implementation, expansion, and sustainability of new practices or innovations. The tools provide a way to systematically assess which outcomes have been achieved and which still need to be addressed.

    Source: Early Childhood Technical Assistance Center. (2014). A guide to the implementation process: stages, steps and activities. Retrieved from http://ectacenter.org/implementprocess/implementprocess.asp

  • This brief provides an integrated stage-based implementation framework that builds on implementation science literature. This framework is based on the following: (1) implementation happens in four discernible stages, and (2) three common threads, or core elements, exist across each of these stages. The three core elements are: building and using implementation teams to actively lead implementation efforts; using data and feedback loops to drive decision-making and promote continuous improvement; and developing a sustainable implementation infrastructure that supports general capacity and innovation-specific capacity for individuals, organizations, and communities.

    Source: U.S. Department of Health and Human Services: Office of Planning, Research, and Evaluation. (2015). An integrated stage-based framework for implementation of early childhood programs and systems. Retrieved from http://www.acf.hhs.gov/programs/opre/resource/an-integrated-stage-based-framework-for-implementation-of-early-childhood-programs-and-systems

  • The National Implementation Research Network’s Get Started webpage includes videos that can be used to support teams in implementing innovations, including evidence-based practices. In addition, the website includes resources related to usable interventions, implementation stages, implementation drivers, implementation teams, and improvement cycles. Modules and lessons with aligned activities are also available.

    Source: National Implementation Research Network. (2016). Get started: A set of quick start videos and guides developed to help you and your team get started with active implementation. Retrieved from http://implementation.fpg.unc.edu

  • The Basics of Implementation Science presentation includes an overview on developing an infrastructure that supports implementation, scale-up, and sustainability of effective practices and highlights core components of implementation. Highlighted components include: implementation stages, implementation drivers, implementation teams, usable interventions, and improvement cycles.

    Source: Davis, S. (2015). Basics of implementation science. Retrieved from https://ideadata.org/resource-library/55ba8132140ba05f7e8b4575/

  • The Model for Improvement, which was developed by the Associates for Process Improvement, is designed to accelerate program improvement by building on existing change theories. The steps included in this model are the following: forming the team, setting aims, establishing measures, selecting changes, testing changes (which includes the Plan-Do-Study-Act [PDSA] Cycle), implementing changes, and spreading changes.

    Source: Institute for Healthcare Improvement. (2016). Science of improvement: how to improve. Retrieved from http://www.ihi.org/resources/pages/howtoimprove/scienceofimprovementhowtoimprove.aspx

  • This document provides an overview of the 90-Day Cycle and provides information on each of the stages of the cycle. The 90-Day Cycle can be used to identify barriers to implementation and to target specific processes that are needed to address the barriers. Associated tools and resources related to the 90-Day Cycle are included.

    Source: Park, S., and Takahashi, S. (2013). The 90-day cycle handbook. Retrieved from http://cdn.carnegiefoundation.org/wp-content/uploads/2014/09/90DC_Handbook_external_10_8.pdf

  • This document defines the essential components of capacity building and provides an at-a-glance summary of best practice recommendations for building and measuring capacity.

    Source: National Center for Systemic Improvement. (2016). Practice brief: best practice recommendations for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/PracticeBriefCapacity.pdf

  • This document categorizes capacity tools so that teams can determine which ones may be most helpful in their efforts to build and measure capacity.

    Source: National Center for Systemic Improvement. (2016). Tools for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/ResourceList-ToolsforBuildingMeasuringCapacity.pdf

Infrastructure Development

During Phase III, states will be implementing improvement strategies and associated activities to enhance the state infrastructure to better support EIS programs and/or EIS providers in implementing and scaling up evidence-based practices to achieve the SIMR(s) for infants and toddlers with disabilities and their families. These strategies, which were developed with input from stakeholders during Phase II, address improvements to one or more components of the state system, including governance, fiscal, quality standards, professional development, data, technical assistance, and accountability/monitoring.

States will continue to work toward further aligning and leveraging other state improvement plans and initiatives that impact infants and toddlers with disabilities. In addition, states will continue to engage multiple offices within the state lead agency (LA), as well as other state agencies (such as the state educational agency or SEA, if different from the LA), in implementing improvement strategies and associated activities related to improving their infrastructure.


  • Ensure infrastructure improvements are connected to root causes identified in Phase I.
  • Document what infrastructure changes have been made to support SSIP implementation.
  • Use implementation teams to make sure infrastructure improvements are made at both the state and program level as appropriate, track progress, and modify as necessary.
  • Revisit timing of implementation of identified infrastructure improvements to ensure that supports are in place for implementation of evidence-based practices (EBPs).
  • Use feedback loops to address barriers and make additional modifications to the infrastructure improvements.
  • Access sufficient resources to make and sustain infrastructure improvements, including fiscal and human resources.
  • Ensure implementation drivers are addressed in the infrastructure improvements to support implementation of EBPs.
  • Keep stakeholders informed of progress and engage them in making recommendations for modifications to the infrastructure improvements in the improvement plan.

Tools and Resources: Improvement Strategies to Support Infrastructure Development

  • The framework, which was developed by the Early Childhood Technical Assistance Center (ECTA), can be used by state Part C and Section 619 coordinators and their staff to evaluate their current systems; identify potential areas for improvement; and develop more effective, efficient systems that support implementation of evidence-based practices leading to improved outcomes for young children with disabilities and their families. The ECTA System Framework is organized around six interrelated components: Governance, Finance, Personnel/Workforce, Data System, Accountability and Quality Improvement, and Quality Standards. Each component contains a set of subcomponents that identify key areas of content within the component. Each subcomponent contains a set of quality indicators that specify what needs to be in place to support a high-quality Part C/Section 619 system. Each quality indicator has corresponding elements of quality that operationalize its implementation.

    Source: Early Childhood Technical Assistance Center (2015). A system framework for building high-quality early intervention and preschool special education programs. Retrieved from http://ectacenter.org/sysframe/

  • The Framework Self-assessment Tool, which was developed by ECTA and the Center for IDEA Early Childhood Data Systems (DaSy) with input from partner states, provides an Excel-based tool that state staff can use to record the current status of their state system, set priorities for improvement, and measure progress over time.

    Source: Early Childhood Technical Assistance Center (2015). Framework self-assessment tool. Retrieved from http://ectacenter.org/sysframe/selfassessment.asp

  • This document provides an example of how implementation science could be applied to improving a hypothetical state’s finance system through the implementation of a family cost participation program. Goals for each of the implementation stages are addressed, and stage-based implementation activities are provided.

    Source: Lucas, A., Hurth, J., and Kelley, G. (2015). Applying implementation science to state system change: an example of improving the finance system component: Implementation of a family cost participation program in a hypothetical state. Retrieved from http://ectacenter.org/~pdfs/sysframe/implement-finance-example.pdf

  • This presentation provides practical suggestions for creating a financing plan for implementing and scaling-up improvement initiatives. Areas addressed include estimating costs, mapping current resources, and assessing gaps. Information is also provided on identifying and prioritizing short-term and long-term financing strategies.

    Source: Center for the Study of Social Policy. (2009). Creating a strategic financing plan to achieve results at scale. Retrieved from http://www.cssp.org/community/neighborhood-investment/other-resources/CreatingaStrategicFinancingPlantoAchieveResultsatScale.pdf

Implementing Evidence-based Practices

During Phase III, states will be supporting EIS programs and/or EIS providers in implementing evidence-based practices (EBPs) to achieve the SIMR(s). During Phase II, states generally took one of two approaches to selecting EBPs: some identified a model/approach with specific practices determined by that model/approach, while others identified a model or approach but had not yet identified specific practices. A few states had not yet identified either a model/approach or specific practices for implementation.

States are also using varied approaches to implementation. Some states are planning to begin with initial implementation sites and later expand or scale up to other programs/providers, while other states are planning statewide implementation. States will need to take their implementation approach (i.e., initial sites with later expansion, or statewide) into account as they consider how they will implement and evaluate EBPs in Phase III.

Some states may need to make adjustments to their implementation plans based on data and stakeholder input in Phase III. These adjustments may include changes in models/approaches or changes in EBPs.


  • States that have not yet selected their EBPs will need to identify the EBPs that EIS programs/EIS providers will implement to achieve the SIMR. Key questions states should consider in this process include:
    • Do the EBPs fit with the state’s culture, values, and service philosophy?
    • Do the EBPs align with current practices/initiatives in the state?
    • Which specific practices are likely to have the most direct impact on expected outcomes and the SIMR? How many specific practices can EIS programs/EIS providers reasonably implement with fidelity? (Be careful not to select so many practices that implementation with fidelity becomes challenging.)
    • What opportunities can be provided to engage stakeholders in the process of selecting EBPs?
  • All states will need to operationalize their Phase II plans for implementing EBPs based on the activities, steps, and timelines included in their plans, using implementation science and/or improvement science concepts. Some key things to consider when implementing EBPs include ensuring that:
    • A communication plan is in place and implemented to build awareness and support and solicit stakeholder engagement throughout implementation;
    • Necessary infrastructure and administrative supports are in place including resources (e.g., people, funding, materials) to begin implementing EBPs;
    • Professional development and other content, such as practice profiles that operationalize the practices included in the model, innovation, or training, are provided or developed as needed;
    • Coaches and mentors are trained on the practices that will be implemented;
    • Ongoing support for practitioners, such as coaching and mentoring, is in place and implemented over time;
    • Feedback loops are used with initial implementers to identify barriers and make changes to materials/processes prior to expanding or scaling up to other programs/providers;
    • Tools to track practice fidelity (e.g., observation checklists, self-assessments) are identified/developed and used;
    • Practitioners use data to track progress in implementing EBPs and inform what practices to target with TA, training, and coaching/mentoring;
    • Fidelity of implementation of EBPs is monitored and well-documented;
    • A clear process is in place to expand/scale up use of EBPs by additional providers/programs as appropriate;
    • Continuous improvement cycles are used to evaluate and improve the implementation plan activities and process over time; and
    • Strategies to ensure sustainability of practice fidelity are implemented.

Tools and Resources: Implementing Evidence-based Practices

  • Developed by the Early Childhood Technical Assistance Center (ECTA), this guide can be used to support widespread use of EBPs designed to improve outcomes for young children with or at risk for delays or disabilities and their families. The guide, which was developed through the Center’s Reaching Potential through Recommended Practices initiative (RP2), focuses on implementation of the Division for Early Childhood (DEC) Recommended Practices and can be used statewide or in specific regions by cross-agency teams to implement RP2 throughout the early childhood and early intervention service-delivery systems where young children with disabilities and their families are served.

    The guide includes information on the three major elements that are instrumental in the process of planning and sustaining the high-fidelity implementation of the DEC Recommended Practices. The first element is the Stages of Implementation (see http://implementation.fpg.unc.edu/module-4 and http://ectacenter.org/implementprocess/implementprocess.asp), which refers to the major steps that must be followed in any effort of full-fledged implementation. The second element is an overview of the four major structures that are needed for high-fidelity implementation of Recommended Practices: (1) the State Leadership Team, (2) the state’s Master Cadre of coaches/trainers, (3) demonstration and implementation sites, and (4) data and evaluation systems. The third element covered in this introduction is the State Benchmarks of Quality, a tool for planning and monitoring the implementation process.

    Source: Early Childhood Technical Assistance Center (2014). Planning Guide to Statewide Implementation, Scale-up, and Sustainability of Recommended Practices. Retrieved from http://ectacenter.org/~pdfs/implement_ebp/ECTA_RP_StateGuide_2-2015.pdf

  • This document was developed by the Council for Exceptional Children’s Division for Early Childhood (DEC) to support practitioners and families in implementing research-supported practices that are designed to improve outcomes and promote development of young children who have or are at risk for developmental delays or disabilities. The Recommended Practices, which were updated in collaboration with ECTA, consist of eight domains: leadership, assessment, environment, family, instruction, interaction, teaming and collaboration, and transition. Videos about the practices are available on DEC’s website.

    Source: Division for Early Childhood. (2014). DEC recommended practices in early intervention/early childhood special education. Retrieved from http://www.dec-sped.org/recommendedpractices

  • The Early Childhood Technical Assistance Center (ECTA) has developed numerous resources to support the implementation of the DEC Recommended Practices. These resources, which are available on the Center’s website, include performance checklists, illustrations (video vignettes), Practice Guides for Practitioners, and Practice Guides for Families.

    The Practice Guides and Checklists can support teams in evaluating implementation of EBPs. The Checklists and Practice Guides can support operationalizing and defining the core components of the DEC Recommended Practices, an essential task when developing fidelity tools. Specifically, the Performance Checklists are intended for practitioners (and leaders where noted) to increase their understanding and use of the DEC Recommended Practices and for self-evaluation of one's use of the practices.

    Source: Early Childhood Technical Assistance Center. (2015). Resources for Recognizing and Performing the DEC Recommended Practices. Retrieved from http://ectacenter.org/decrp

  • This research brief reviews the best practices for scaling up effective programs based on a comprehensive literature review. Examples of experiences of several programs that were successfully scaled up are included.

    Source: Sacks, V., Beltz, M., Beckwith, S., and Anderson Moore, K. (2015). How to scale up effective programs serving children, youth, and families. Retrieved from http://www.childtrends.org/wp-content/uploads/2015/11/2015-43ScaleUpPrograms.pdf

  • This planning tool can be used to identify core components or essential functions of the evidence-based practices that are being implemented. Core components of the practices can be defined or operationalized, and expected, developmental, and unacceptable practice variations can be shown. This tool can be used to support identification or development of fidelity measures to understand if the practice is being implemented as intended.

    Source: State Implementation and Scaling-up of Evidence-based Practices Center and National Implementation Research Network. (2014). Practice Profile Planning Tool. Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/NIRN-Education-PracticeProfilePlanningTool.pdf

Links on this site are verified monthly. This page content was last updated on 2016-08-29 AML

Content hosted by The Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

The contents of this guide were developed under cooperative agreement numbers #H326R140006 (DaSy), #H326P120002 (ECTA Center), #H373Y130002 (IDC) and #H326R140006 (NCSI) from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.

Project Officers: Meredith Miceli & Richelle Davis (DaSy), Julia Martin Eile (ECTA Center), Richelle Davis & Meredith Miceli (IDC), and Perry Williams & Shedeh Hajghassemali (NCSI)
