• Logo: DaSy
  • Logo: ECTA Center
  • Logo: IDC
  • Logo: NCSI
 

SSIP Phase III: Tools and Resources

The items below are resources that can be used in Phase III of the SSIP. The potential uses of each resource are provided.

Tools and Resources: Implementation Process

  • This guide, developed by the Early Childhood Technical Assistance Center (ECTA), is based on implementation science research and the collective experiences of federally funded technical assistance centers in conducting statewide system change initiatives. The guide includes critical implementation activities for five implementation stages: Exploration, Installation, Initial Implementation, Full Implementation, and Expansion/Scale-up. Outcomes are also provided for each of the stages.

    A Pre-Test can be used to determine the status of implementation. The companion State-Level and Local-Level Self-Assessments can be used by leadership teams as they guide and evaluate the systematic implementation, expansion, and sustainability of new practices or innovations. The tools provide a way to systematically assess outcomes that have been achieved and to determine outcomes that need to be addressed.

    Source: Early Childhood Technical Assistance Center (2014). A guide to implementation process: stages, steps and activities. Retrieved from http://ectacenter.org/implementprocess/implementprocess.asp

  • This brief provides an integrated stage-based implementation framework that builds on implementation science literature. This framework is based on the following: (1) implementation happens in four discernible stages, and (2) three common threads, or core elements, exist across each of these stages. The three core elements are: building and using implementation teams to actively lead implementation efforts; using data and feedback loops to drive decision-making and promote continuous improvement; and developing a sustainable implementation infrastructure that supports general capacity and innovation-specific capacity for individuals, organizations, and communities.

    Source: U.S. Department of Health and Human Services, Office of Planning, Research, and Evaluation. (2015). An integrated stage-based framework for implementation of early childhood programs and systems. Retrieved from http://www.acf.hhs.gov/programs/opre/resource/an-integrated-stage-based-framework-for-implementation-of-early-childhood-programs-and-systems

  • The National Implementation Research Network’s Get Started webpage includes videos that can be used to support teams in implementing innovations including evidence-based practices. In addition, the website includes resources related to usable interventions, implementation stages, implementation drivers, implementation teams, and improvement cycles. Modules and lessons with aligned activities are also available.

    Source: National Implementation Research Network. (2016). Get started: A set of quick start videos and guides developed to help you and your team get started with active implementation. Retrieved from http://implementation.fpg.unc.edu

  • The Basics of Implementation Science presentation includes an overview on developing an infrastructure that supports implementation, scale-up, and sustainability of effective practices and highlights core components of implementation. Highlighted components include: implementation stages, implementation drivers, implementation teams, usable interventions, and improvement cycles.

    Source: Davis, S. (2015). Basics of implementation science. Retrieved from https://ideadata.org/resource-library/55ba8132140ba05f7e8b4575/

  • The Model for Improvement, which was developed by the Associates for Process Improvement, is designed to accelerate improvement of programs by utilizing existing change theories. The model includes the following steps: forming the team, setting aims, establishing measures, selecting changes, testing changes (which includes the Plan-Do-Study-Act [PDSA] Cycle), implementing changes, and spreading changes.

    Source: Institute for Healthcare Improvement. (2016). Science of improvement: how to improve. Retrieved from http://www.ihi.org/resources/pages/howtoimprove/scienceofimprovementhowtoimprove.aspx

  • This document provides an overview of the 90-Day Cycle and provides information on each of the stages of the cycle. The 90-Day Cycle can be used to identify barriers to implementation and to target specific processes that are needed to address the barriers. Associated tools and resources related to the 90-Day Cycle are included.

    Source: Park, S., and Takahashi, S. (2013). The 90-day cycle handbook. Retrieved from http://cdn.carnegiefoundation.org/wp-content/uploads/2014/09/90DC_Handbook_external_10_8.pdf

  • This document defines the essential components of capacity building and provides an at-a-glance summary of best practice recommendations for building and measuring capacity.

    Source: National Center for Systemic Improvement. (2016). Practice brief: best practice recommendations for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/PracticeBriefCapacity.pdf

  • This document categorizes capacity tools so that teams can determine which ones may be most helpful in their efforts to build and measure capacity.

    Source: National Center for Systemic Improvement. (2016). Tools for building and measuring capacity. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/ResourceList-ToolsforBuildingMeasuringCapacity.pdf

Tools and Resources: Improvement Strategies to Support Infrastructure Development

  • The framework, which was developed by the Early Childhood Technical Assistance Center (ECTA), can be used by state Part C and Section 619 coordinators and their staff to evaluate their current systems; identify potential areas for improvement; and develop more effective, efficient systems that support implementation of evidence-based practices leading to improved outcomes for young children with disabilities and their families. The ECTA System Framework is organized around six interrelated components: Governance, Finance, Personnel/Workforce, Data System, Accountability and Quality Improvement, and Quality Standards. Each component contains a set of subcomponents that identify key areas of content within the component. Each subcomponent contains a set of quality indicators that specify what needs to be in place to support a high-quality Part C/Section 619 system. Each quality indicator has corresponding elements of quality that operationalize its implementation.

    Source: Early Childhood Technical Assistance Center (2015). A system framework for building high-quality early intervention and preschool special education programs. Retrieved from http://ectacenter.org/sysframe/

  • The Framework Self-assessment Tool, which was developed by ECTA and the Center for IDEA Early Childhood Data Systems (DaSy) with input from partner states, is an Excel-based tool that state staff can use to record the current status of their state system, set priorities for improvement, and measure progress over time.

    Source: Early Childhood Technical Assistance Center (2015). Framework self-assessment tool. Retrieved from http://ectacenter.org/sysframe/selfassessment.asp

  • This document provides an example of how implementation science could be applied to improving a hypothetical state’s finance system through the implementation of a family cost participation program. Goals for each of the implementation stages are addressed, and stage-based implementation activities are provided.

    Source: Lucas, A., Hurth, J., and Kelley, G. (2015). Applying implementation science to state system change: an example of improving the finance system component: Implementation of a family cost participation program in a hypothetical state. Retrieved from http://ectacenter.org/~pdfs/sysframe/implement-finance-example.pdf

  • This presentation provides practical suggestions for creating a financing plan for implementing and scaling-up improvement initiatives. Areas addressed include estimating costs, mapping current resources, and assessing gaps. Information is also provided on identifying and prioritizing short-term and long-term financing strategies.

    Source: Center for the Study of Social Policy. (2009). Creating a strategic financing plan to achieve results at scale. Retrieved from http://www.cssp.org/community/neighborhood-investment/other-resources/CreatingaStrategicFinancingPlantoAchieveResultsatScale.pdf

Tools and Resources: Implementing Evidence-based Practices

  • Developed by the Early Childhood Technical Assistance Center (ECTA), this guide can be used to support widespread use of evidence-based practices (EBPs) designed to improve outcomes for young children with or at risk for delays or disabilities and their families. The guide, which was developed through the Center’s Reaching Potential through Recommended Practices initiative (RP2), focuses on implementation of the Division for Early Childhood (DEC) Recommended Practices and can be used statewide or in specific regions by cross-agency teams to implement RP2 throughout the early childhood and early intervention service-delivery systems where young children with disabilities and their families are served.

    The guide includes information on the three major elements that are instrumental in the process of planning and sustaining the high-fidelity implementation of the DEC Recommended Practices. The first element is the Stages of Implementation (see http://implementation.fpg.unc.edu/module-4 and http://ectacenter.org/implementprocess/implementprocess.asp), which refers to the major steps that must be followed in any effort of full-fledged implementation. The second element is an overview of the four major structures that are needed for high-fidelity implementation of Recommended Practices: (1) the State Leadership Team, (2) the state’s Master Cadre of coaches/trainers, (3) demonstration and implementation sites, and (4) data and evaluation systems. The third element covered in this introduction is the State Benchmarks of Quality, a tool for planning and monitoring the implementation process.

    Source: Early Childhood Technical Assistance Center (2014). Planning Guide to Statewide Implementation, Scale-up, and Sustainability of Recommended Practices. Retrieved from http://ectacenter.org/~pdfs/implement_ebp/ECTA_RP_StateGuide_2-2015.pdf

  • This document was developed by the Council for Exceptional Children’s Division for Early Childhood (DEC) to support practitioners and families in implementing research-supported practices that are designed to improve outcomes and promote development of young children who have or are at risk for developmental delays or disabilities. The Recommended Practices, which were updated in collaboration with ECTA, consist of eight domains: leadership, assessment, environment, family, instruction, interaction, teaming and collaboration, and transition. Videos about the practices are available on DEC’s website.

    Source: Division for Early Childhood. (2014). DEC recommended practices in early intervention/early childhood special education. Retrieved from http://www.dec-sped.org/recommendedpractices

  • The Early Childhood Technical Assistance Center (ECTA) has developed numerous resources to support the implementation of the DEC Recommended Practices. These resources, which are available on the Center’s website, include performance checklists, illustrations (video vignettes), Practice Guides for Practitioners, and Practice Guides for Families.

    The Practice Guides and Checklists can support teams in evaluating implementation of EBPs. The Checklists and Practice Guides can support operationalizing and defining the core components of the DEC Recommended Practices, an essential task when developing fidelity tools. Specifically, the Performance Checklists are intended for practitioners (and leaders where noted) to increase their understanding and use of the DEC Recommended Practices and for self-evaluation of one's use of the practices.

    Source: Early Childhood Technical Assistance Center. (2015). Resources for Recognizing and Performing the DEC Recommended Practices. Retrieved from http://ectacenter.org/decrp

  • This research brief reviews the best practices for scaling up effective programs based on a comprehensive literature review. Examples of experiences of several programs that were successfully scaled up are included.

    Source: Sacks, V., Beltz, M., Beckwith, S., and Anderson Moore, K. (2015). How to scale up effective programs serving children, youth, and families. Retrieved from http://www.childtrends.org/wp-content/uploads/2015/11/2015-43ScaleUpPrograms.pdf

  • This planning tool can be used to identify core components or essential functions of the evidence-based practices that are being implemented. Core components of the practices can be defined or operationalized, and expected, developmental, and unacceptable practice variations can be shown. The tool can support identification or development of fidelity measures to determine whether the practice is being implemented as intended.

    Source: State Implementation and Scaling-up of Evidence-based Practices Center and National Implementation Research Network. (2014). Practice Profile Planning Tool. Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/NIRN-Education-PracticeProfilePlanningTool.pdf

Tools and Resources: Evaluating Process and Outcomes

  • This document provides a list of recommended resources to support evaluation planning for program improvement efforts including the SSIP. Resources relevant to early intervention and preschool special education are included in the list, which will be updated as new and relevant resources become available.

    Source: Winer, A., Nelson, R., Kahn, L., Derrington, T., Davies-Mercier, E., Cochenour, M., and Copa, N. (2015). Recommended resources for planning to evaluate improvement efforts. Retrieved from http://ectacenter.org/~pdfs/topics/ssip/plan_eval_program_improvement.pdf

  • This guide describes key steps for developing a well-thought-out plan for evaluating an SSIP. The guide provides considerations for how to incorporate each step into an evaluation plan, as well as a series of worksheets that correspond to each step and can be used to facilitate the planning process. The guide and its worksheets are intended primarily for use by TA providers in partnership with state staff.

    Source: IDEA Data Center. (2015). A guide to SSIP evaluation planning. Retrieved from https://ideadata.org/resource-library/5697cca3140ba0ca5c8b4599/

  • This sample action plan template was designed by DaSy, ECTA, IDC, and NCSI to provide states with a suggested format and examples of potential content for their Phase II SSIP improvement and evaluation plan. States should feel free to adapt the template or use one that best meets their needs and communicates how they will implement and evaluate their SSIP in Phase III. This template is based on a logic model approach. It links activities and steps needed to implement the improvement strategies with intended outcomes and uses the activities and outcomes as the basis for the evaluation plan.

    Source: Early Childhood Technical Assistance Center. (2015). Sample SSIP action plan template. Retrieved from http://ectacenter.org/~docs/topics/ssip/ssip_improvement_plan_template.doc

  • This resource was designed by the National Center for Systemic Improvement (NCSI) to provide states with a sample approach and tool to plan and track measures of State Systemic Improvement Plan (SSIP) implementation. This resource will assist states in addressing the SSIP requirements laid out in the State Performance Plan/Annual Performance Report (SPP/APR) Part B and Part C Indicator Measurement Tables and the SSIP Phase II OSEP Guidance and Review Tool, which call for the evaluation of implementation as well as outcomes.

    Source: National Center for Systemic Improvement. (2016). Implementation evaluation matrix. Retrieved from http://ncsi.wested.org/wp-content/uploads/2016/03/Implementation_Evaluation_Matrix-1.docx

  • This national webinar was hosted by NCSI, ECTA, and DaSy for state Part B and Part C staff and focused on strategies for assessing the impact of SSIP infrastructure improvements. Representatives from two state departments of education and two state Part C programs participated in a “virtual state panel” and shared their experiences with implementing infrastructure changes as well as their approaches to assessing the impact of those changes on their SSIP improvement strategies and ultimately, their SIMR.

    Source: National Center for Systemic Improvement. (2016). Assessing impact of infrastructure improvements. Retrieved from https://vimeo.com/169687158

  • This presentation focuses on how to use high-quality data to support effective implementation. Information is included on the use of data for decision-making and improvement and the conditions under which high-quality data can make the most difference.

    Source: Blase, K. (2015). Building implementation capacity: Data to drive change. Retrieved from https://ideadata.org/resource-library/55c8c10b140ba0a8218b4574/

  • This white paper focuses on factors that could lead Part C or Part B state agencies to propose changes in their SIMR baselines or targets. The paper addresses questions that state agency personnel should consider when establishing baselines and targets and considerations that may need to be addressed when revising targets.

    Source: Ruggiero, T. and Kahn, L. (2015). Considerations for Making Changes to SIMR Baseline and Targets. Retrieved from https://ideadata.org/resource-library/5682b8ab140ba0fb0f8b45a7/

  • This research brief, which is available from the Office of Planning, Research, and Evaluation at the U.S. Department of Health and Human Services, addresses the importance of incorporating quality measures into the implementation evaluation process. Examples are provided on how quality and quantity constructs are assessed and examined in relation to early care and education program outcomes.

    Source: Downer, J. and Yazejian, N. (2013). Measuring the quality and quantity of implementation in early childhood interventions. Retrieved from http://www.acf.hhs.gov/programs/opre/resource/measuring-the-quality-and-quantity-of-implementation-in-early-childhood

  • This planning guide addresses statewide implementation, scale-up, and sustainability of the DEC Recommended Practices, including guidance on the data and evaluation systems needed to assess implementation and outcomes.

    Source: Smith, B. J., Fox, L., Dunlap, G., Strain, P., Trivette, C. M., Perez Binder, D., Bovey, T., McCullough, K., & Blase, K. (2015). Planning guide to statewide implementation, scale-up, and sustainability of recommended practices. Retrieved from http://ectacenter.org/~pdfs/implement_ebp/ECTA_RP_StateGuide_2-2015.pdf

  • This assessment tool is for home visiting program leadership teams to use in assessing their status in the critical elements of program-wide implementation.

    Source: Trivette, C. and Jones, A. (2015). Reaching potential through recommended practices (RP2): Benchmarks of quality for home-visiting programs. Retrieved from http://ectacenter.org/~pdfs/calls/2015/decrp-2015-02-11/Benchmarks_Home%20Visiting.pdf

  • This assessment tool is for preschool special education programs’ leadership teams to use in assessing their status in the critical elements of program-wide implementation.

    Source: Trivette, C. and Jones, A. (2015). Reaching potential through recommended practices (RP2): Benchmarks of quality for classroom-based programs. Retrieved from http://ectacenter.org/~pdfs/calls/2015/decrp-2015-02-11/Benchmarks_Home%20Visiting.pdf

  • This activity supports evaluation teams in designing fidelity assessments. The Designing a Fidelity Assessment activity allows teams to identify, categorize, and discuss challenges to implementing a fidelity assessment.

    Source: National Implementation Research Network and State Implementation and Scaling-up of Evidence-based Practices Center. (2016). Designing a fidelity assessment. Retrieved from http://implementation.fpg.unc.edu/resources/activity-7-1-designing-fidelity-assessment

  • This activity supports evaluation teams in developing fidelity assessments. Once the essential components or functions of the EBPs have been identified, the Developing a Fidelity Assessment activity will support teams in brainstorming fidelity assessments.

    Source: National Implementation Research Network and State Implementation and Scaling-up of Evidence-based Practices Center. (2016). Developing a fidelity assessment. Retrieved from http://implementation.fpg.unc.edu/resources/activity-7-2-fidelity-module-7-capstone-developing-fidelity-assessment

  • This website can support teams in evaluating and planning coaching and training systems and in implementing and assessing best practices.

    Source: National Implementation Research Network and State Implementation and Scaling-up of Evidence-based Practices Center. (2016). Resource library: Evaluation and planning tools. Drivers. Retrieved from http://implementation.fpg.unc.edu/resources/results/taxonomy%3A23%2C40

  • This reference tool includes a series of steps required to collect the high-quality data needed to evaluate SSIP implementation and outcomes. These steps can help states collect high-quality data whether they are planning for data collection or already engaged in it.

    Retrieved from http://ectacenter.org/~pdfs/topics/ssip/Data_Pathway.pdf

  • This document is designed to assist states in developing or refining their SSIP performance indicators. It includes a worksheet with a series of questions based on S.M.A.R.T. criteria (Specific, Measurable, Achievable, Relevant, and Timely) that will help states write performance indicators that provide the information needed for the SSIP and articulate a rationale for making changes to existing performance indicators and their corresponding intended outcome statements.

    Retrieved from http://ectacenter.org/~pdfs/topics/ssip/Refining_SMART_Performance_Indicators.pdf

  • This document is designed for states that are currently working to refine and refocus their short- and/or long-term intended SSIP outcomes. It includes a worksheet to support states in identifying the intended outcomes that are most critical to the success of their SSIP and refining the language of those outcomes to best align with the theory of action.

    Retrieved from http://ectacenter.org/~pdfs/topics/ssip/Refining_Intended_SSIP_Outcomes.pdf

  • Developed by ECTA and DaSy, this document is designed to illustrate how a state might summarize and report data gathered through the System Framework self-assessment process to document infrastructure improvements in their Phase III SSIP or other program improvement efforts. A template for reporting progress is provided along with an example of hypothetical state data.

    Retrieved from http://ectacenter.org/~pdfs/sysframe/Reporting_Infrastructure_Improvements_2017-03-03.pdf

Links on this site are verified monthly. This page content was last updated on 2016-08-29.

Content hosted by The Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

The contents of this guide were developed under cooperative agreement numbers #H326R140006 (DaSy), #H326P120002 (ECTA Center), #H373Y130002 (IDC), and #H326R140006 (NCSI) from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.

Project Officers: Meredith Miceli & Richelle Davis (DaSy), Julia Martin Eile (ECTA Center), Richelle Davis & Meredith Miceli (IDC), and Perry Williams & Shedeh Hajghassemali (NCSI)

  • OSEP's TA&D Network:
  • IDEAs that Work