
SSIP Phase II: Tools and Resources

The items below are resources for use in Phase II of the SSIP, with potential uses noted for each.

  • Integration of Implementation Drivers, including creating information/communication pathways, is a key facet of Active Implementation. This mapping activity, which includes an Implementation Drivers diagram, will help improve and integrate feedback and feed-forward processes.

    Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/resources/AIModules-Activity-2-4a-MappingCommunicationPathways.pdf

  • An activity that highlights the importance of PDSA cycles in data-based decision making and guides participants in reflections about team strengths and weaknesses.

    Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/AIModules-Activity-5-2-PDSAWhoAmI.pdf

  • This Communication Protocol Worksheet can be used to promote system alignment and facilitate communication.

    Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/AIHub-Handout8-CommunicationProtocolWorksheet.pdf

  • Implementation teams employ improvement cycles in order to intentionally identify problems and solutions. As a result, practices improve and hospitable environments are developed to support more effective and efficient ways to work. The Plan, Do, Study, Act Cycle or PDSA Cycle underlies the different types of improvement cycles described in this active implementation framework (Deming, 1986).

    Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/AIHub-Handout14-ImprovementCycles.pdf

  • Framework 5 of this webpage outlines two different conceptualizations of improvement cycles: Plan-Do-Study-Act and Practice-Policy.

    Retrieved from http://implementation.fpg.unc.edu/module-1/improvement-cycles

  • This module located on the Active Implementation Hub provides an introduction to implementation teams, including definitions, rationale, key functions, and best practices for establishing and maintaining these teams.

    Retrieved from http://implementation.fpg.unc.edu/module-3

  • Practice-Policy Feedback Loops are PDSA cycles designed to provide organizational leaders and policy makers with information from the practice level about implementation barriers and successes so that a more aligned system can be developed.

    Retrieved from http://implementation.fpg.unc.edu/module-5/topic-3-practice-policy-feedback-loops

  • This worksheet provides benchmarks of a high-quality home-visiting program and a self-assessment tool.

    Retrieved from http://ectacenter.org/~pdfs/calls/2015/decrp-2015-02-11/Benchmarks_Home Visiting.pdf

  • Communication cycles are designed to intentionally address communication gaps as education systems work to improve student outcomes.

    Retrieved from https://www.melcrum.com/research/strategy-planning-tactics-intranets-digital-social-media/choosing-right-communication

  • This is an exercise developed to identify key partners for implementing system changes. The version linked here includes instructions for use in a specific public health initiative; the most useful part of the document may be the definition of the Circles of Involvement. Considering the full range of stakeholder types identified in the Circles of Involvement, and the benefit each brings to implementation of the SSIP, can improve the quality of stakeholder involvement. This process links to the Implementation Science recommendation that implementation team members represent different perspectives and span multiple levels of the system.

    Retrieved from http://www.naccho.org/topics/infrastructure/community-health-assessment-and-improvement-planning/upload/http___www-naccho-org_topics_infrastructure_mapp_framework_clearinghouse_loader.pdf

  • The Common Education Data Standards (CEDS) project supports state work to develop common data standards for a key set of education data elements, streamlining the exchange, comparison, and understanding of data within and across P-20W. States already implementing CEDS can use the standards to identify data elements available for analysis. For states building their data system infrastructure to support the SSIP evaluation, CEDS would be an important resource for planning data collections.

    The CEDS Alignment Tool is a Web-based tool that allows stakeholders to:

    • import or input their organizations' data dictionaries,
    • compare (or "map") their data dictionaries (element names, definitions, and option sets) to CEDS, and
    • compare their data dictionaries with those of other participating organizations.

    The CEDS Connect Tool allows stakeholders from varied educational organizations to identify policy questions and related data elements, define analytic approaches, calculate metrics and indicators, address reporting requirements, and accomplish many other data tasks.

    Retrieved from https://ceds.ed.gov
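The kind of element-by-element comparison the Alignment Tool performs can be illustrated with a minimal sketch. The element names and definitions below are hypothetical, for illustration only, and are not actual CEDS entries:

```python
# Hypothetical data dictionaries: element name -> definition
state_dictionary = {
    "ChildBirthDate": "Date of birth of the child",
    "IEPStartDate": "Date the IEP takes effect",
}
ceds_dictionary = {
    "Birthdate": "The year, month, and day a person was born",
    "IEPStartDate": "Date the IEP takes effect",
}

# A crude "map": elements whose names match exactly, plus the state
# elements that would still need a manual mapping decision.
matched = sorted(set(state_dictionary) & set(ceds_dictionary))
state_only = sorted(set(state_dictionary) - set(ceds_dictionary))

print("Aligned by name:", matched)
print("Needs manual mapping:", state_only)
```

In practice the Alignment Tool also compares definitions and option sets, not just names; the sketch only shows why a shared standard makes cross-organization comparison tractable.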

  • Facilitators and barriers encountered in practice are rapidly (at least monthly) communicated to the highest level required for a solution. For example, "not enough time for competent coaches to support teachers learning to use new instruction practices" can be taken from the school to the district leadership. If district leadership cannot find a solution, the issue is not dropped; it is taken to the State Management Team. In this way, local issues can be resolved locally and more systemic issues can be resolved at a statewide level.

    Retrieved from http://sisep.fpg.unc.edu/news/sisep-enotes-may-2015

  • Identifying stakeholders and their interests is an important aspect of the participatory process. Chapter 7 of the Community Tool Box goes into depth about the process of identifying and engaging stakeholders.

    Retrieved from http://ctb.ku.edu/en/table-of-contents/participation/encouraging-involvement/identify-stakeholders/main

  • The Community Tool Box provides tips and guidelines for developing a strategic plan and organizational structure, from defining a vision to bringing about real change.

    Retrieved from http://ctb.ku.edu/en/table-of-contents/structure/strategic-planning/develop-action-plans/main

  • This tool is designed to help programs identify evaluation questions. It is a good starting point for states working to customize their evaluation questions to align with their needs. Evaluation questions are divided into three outcome areas:

    1. child/family,
    2. practitioner, and
    3. program/agency.

    Also helpful: Each question is identified as essential ("must-have") or aspirational ("nice-to-have").

    Retrieved from http://dasycenter.org/critical-questions

  • States can use this resource in planning the infrastructure to support their evaluation and data collection activities for the SSIP. The quality elements within each component outline what needs to be in place to collect and use data effectively. The DaSy Framework subcomponent Data Governance and Management outlines activities that are helpful when looking for secondary data and developing procedures for collecting new data. Section 2 of this subcomponent (Quality and Integrity) should be carefully considered when new data are being collected. Sections 1 (Authority and Accountability) and 3 (Security and Access) must also be considered as states think about where new data will be housed and how the data can be accessed.

    Retrieved from http://dasycenter.org/resources/dasy-framework

  • This 2015 presentation focuses on breaking down and/or rephrasing evaluation questions, perhaps into several questions, so that they clearly communicate the details data analysts need in order to understand what you want to analyze and how. Clearly articulated analytical questions will help you determine what data are needed and obtain the answers you need more efficiently and effectively, using data to support success in the State Systemic Improvement Plan (SSIP).

    Retrieved from http://dasycenter.org/the-data-are-in-the-details-albuquerque

  • The DEC Recommended Practices provide guidance to practitioners and families about the most effective ways to improve the learning outcomes and promote the development of young children, birth through age 5, who have or are at risk for developmental delays or disabilities.

    Retrieved from http://www.dec-sped.org/recommendedpractices

  • This presentation includes information about how VA Part C elicited membership for planning teams, how the team meetings are conducted, and how the teams are coordinated. This can be used as a model for developing a process.

    Kicking off Phase II: Slides 11-23 are most relevant to staffing for planning.

    Developing the Improvement Plan: Slides 3-11 are most relevant.

    Retrieved from https://appam.certain.com/accounts/register123/air/events/pdconf/userfiles/0x1255388f65dDeveloping_a_High-Quality_Improv.pptx

  • This is a teaching guide that can be used in the development of a logic model. It includes definitions of the components of a logic model (inputs, activities, outputs, outcomes) and examples of some of the categories underneath the components.

    University of Wisconsin-Extension. (2010a). Developing a logic model [Slide presentation]. Madison, WI: Author. Retrieved from http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

  • This guide gives a series of steps to help an organization plan its development strategically and outline its goals and aims.

    Retrieved from http://www.diycommitteeguide.org/resource/strategic-plan-step-5-writing-your-plan

  • This annotated presentation provides an overview of getting ready for Phase II of the SSIP. States can copy the slides into presentations they are planning to use to kick off Phase II.

    Retrieved from http://ectacenter.org/~ppts/calls/2014/ssip/ssip2-1-2014-12-01.pptx

  • This guidance table is designed to help identify key issues, questions, and approaches for analyzing and interpreting data on outcomes for young children with disabilities. The tool outlines a series of steps related to defining analysis questions, clarifying expectations, analyzing data, testing inferences, and conducting data-based program improvement planning. It also includes examples of questions and approaches and sample figures to consider.

    Retrieved from http://ectacenter.org/eco/assets/pdfs/AnalyzingChildOutcomesData-GuidanceTable.pdf

  • This interactive guide on the implementation stages includes information on forming a state leadership team to plan and oversee implementation and scaling up of evidence-based practices. The content on developing state leadership teams links to the activities included in kicking off Phase II.

    Retrieved from http://ectacenter.org/implementprocess/interactive/stage2/intro.asp

  • This guide describes key steps for developing a well-thought-out plan for evaluating a SSIP. The guide provides considerations for how to incorporate each step into an evaluation plan, as well as a series of worksheets that correspond to each step and can be used to facilitate the planning process. The guide, along with its corresponding worksheets, is intended for TA providers to use in partnership with state staff.


  • The Hexagon Tool helps states, districts, and schools systematically evaluate new and existing interventions via six broad factors: needs, fit, resource availability, evidence, readiness for replication, and capacity to implement.

    Retrieved from http://implementation.fpg.unc.edu/sites/implementation.fpg.unc.edu/files/resources/NIRN-Education-TheHexagonTool.pdf

  • Although Excel is not designed as a graphical tool, it is possible, with a fair amount of formatting, to build a Gantt chart in Excel. You can do this by turning your project tables into a Gantt chart using Excel's bar graph functionality, then importing the chart into PowerPoint.

    Retrieved from https://www.officetimeline.com/gantt-chart-excel
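The same bar-chart trick works outside Excel as well. As a rough sketch, a few lines of Python with matplotlib draw each activity as an offset horizontal bar; the activities and timelines below are hypothetical placeholders:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display required
import matplotlib.pyplot as plt

# Hypothetical SSIP planning activities: (name, start week, duration in weeks)
tasks = [
    ("Convene leadership team", 0, 4),
    ("Draft improvement plan", 3, 6),
    ("Develop evaluation plan", 8, 5),
]

fig, ax = plt.subplots(figsize=(8, 3))
for row, (name, start, weeks) in enumerate(tasks):
    ax.barh(row, weeks, left=start)  # one horizontal bar per activity

ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()  # first activity at the top, Gantt-style
ax.set_xlabel("Week")
fig.tight_layout()
fig.savefig("ssip_gantt.png")
```

The `left=` offset is what turns an ordinary bar chart into a Gantt chart, which is the same idea the Excel technique relies on.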

  • This brief introduces key elements of effective implementation within an integrated, stage-based framework. This framework posits that (1) implementation happens in four discernible stages; and (2) three common "threads" or core elements exist across each of these stages.

    Kicking off Phase II: Pages 5 and 6 include a useful summary of implementation teams, covering a definition of implementation teams, their membership, relationships among teams, the rationale for using implementation teams, and the core competencies they require.

    Developing the Improvement Plan: The entire document is useful for this activity.

    Retrieved from http://www.acf.hhs.gov/sites/default/files/opre/es_cceepra_stage_based_framework_brief_508.pdf

  • Dialogue is necessary when implementing practices. This dialogue guide can be used to develop shared meaning and create effective dialogue.

    Retrieved from http://www.ideapartnership.org/building-connections/the-partnership-way.html

  • This document contains models for stakeholder interaction around issues.

    Retrieved from http://www.ideapartnership.org/documents/NovUploads/Blueprint%20USB/Meetings%20to%20Co-Create/Dialogue%20Guides.pdf

  • This document has tips for encouraging and supporting relevant participation from the full range of stakeholders, including asking at which level individuals want to be engaged early in your collaboration.

    Retrieved from http://www.ideapartnership.org/documents/NovUploads/Blueprint%20USB/Ensuring%20Relevant/Engage%20Everybody.pdf

  • These qualitative rubrics are tools that can be used to generate data on interactions. These data will help you understand the growth of critical relationships to ensure stakeholders are engaged.

    Retrieved from http://www.ideapartnership.org/documents/NovUploads/Blueprint%20USB/Bringing%20it%20All/Measuring%20Progress.pdf

  • For every issue, there are a number of groups that have deep and durable connections at the practice level. Some are very closely aligned with the issues that you are trying to influence. Others have more distant, yet still important, connections. In either case, stakeholder groups influence what practitioners know, believe, and do, and can be important allies in moving new and/or proven practices to implementation. This template will help leaders identify and reach out to potential partners in order to meet and address persistent challenges.

    Retrieved from http://www.ideapartnership.org/documents/NovUploads/Blueprint%20USB/Coalescing%20Around/Meet%20the%20Stakeholders.pdf

  • This PowerPoint presentation offers a blueprint for authentic engagement by considering different types of learning strategies, communication styles, and leadership models.

    Retrieved from http://www.ideapartnership.org/documents/NovUploads/Blueprint%20USB/Doing%20the%20Work/One-way%20and%20Two-way%20Learning.pdf

  • This presentation is geared toward helping practitioners develop useful logic models. The slides address the hallmarks of well-constructed, useful logic models and how to use them for planning, implementing, and using the findings of program evaluation. The slides break down and explain the different components and terminology associated with a logic model. Additional resources and sites are also cited. States can use this presentation as an introduction to logic models and as guidance on the key components of a meaningful logic model for their SSIP evaluation.

    Honeycutt, S. & Kegler, M.C. (2010). Retrieved from http://comm.eval.org/viewdocument/?DocumentKey=79c7da3d-6978-452c-8e8b-7056d3626966

  • The table describes the requirements for Phase II of the SSIP for Part B.

    Retrieved from https://osep.grads360.org/#communities/pdc/documents/4603

  • The Part B SSIP Phase II OSEP Guidance and Review Tool is based on the three components described in Phase II of the Measurement Table under Indicator 17 (Part B). Those components are 1) Infrastructure Development; 2) Support for LEA Implementation of EBPs; and 3) Evaluation. Phase II builds on the five components developed in Phase I. Phase II must be submitted by April 1, 2016 as part of the FFY 2014 SPP/APR. The Phase II components are in addition to Phase I content (including any updates).

    Retrieved from https://osep.grads360.org/#communities/pdc/documents/8823

  • The table describes the requirements for Phase II of the SSIP for Part C.

    Retrieved from https://osep.grads360.org/#communities/pdc/documents/4604

  • The Part C SSIP Phase II OSEP Guidance and Review Tool is based on the three components described in Phase II of the Measurement Table under Indicator 11 (Part C). Those components are 1) Infrastructure Development; 2) Support for EIS Programs and EIS Provider Implementation of EBPs; and 3) Evaluation. Phase II builds on the five components developed in Phase I. Phase II must be submitted by April 1, 2016 with the FFY 2014 SPP/APR. The Phase II components are in addition to Phase I content (including any updates).

    Retrieved from https://osep.grads360.org/#communities/pdc/documents/8824

  • This document provides information to help state staff define and limit the scope of data analysis for program improvement efforts, including the State Systemic Improvement Plan (SSIP); develop a plan for data analysis; document alternative hypotheses and additional analyses as they are generated; and summarize findings and document results.

    Retrieved from http://dasycenter.sri.com/downloads/DaSy_papers/DaSy_SSIP_DataAnalysisPlanning_20150323_FINAL_Acc.pdf

  • This guide includes TA process documents and the implementation process, structures, and tools for planning and monitoring implementation used in ECTA Intensive TA for Implementing, Sustaining, & Scaling Up Evidence-Based Practices to Improve Child Outcomes.

    Kicking off Phase II: The section on p. 6 provides an overview of the major structures of the RP2: Reaching Potentials through Recommended Practices initiative, including:

    • State Leadership Team
    • Master Cadre of Training and TA providers
    • Implementation and Demonstration Sites

    Also useful are the sections related to the Planning/Installation Stage embedded under each of the major structures.

    Developing the Improvement Plan: The entire document relates to developing the improvement plan.

    Retrieved from http://ectacenter.org/~pdfs/implement_ebp/ECTA_RP_StateGuide_2-2015.pdf

  • This guide provides strategies for involving stakeholders in setting the direction for evaluation. While it is geared toward evaluation consultants, it provides useful information for state staff to guide efforts to promote stakeholder engagement throughout the SSIP process.

    Preskill, H., & Jones, N. (2009). Robert Wood Johnson Foundation Evaluation Series. Retrieved from http://www.rwjf.org/content/dam/web-assets/2009/01/a-practical-guide-for-engaging-stakeholders-in-developing-evalua

  • This manual, written for program managers, contains tips, worksheets, and samples to help them understand each step of the evaluation process.

    Most relevant sections:

    • Outline of the basic evaluation steps (pg. 8–9)
    • Chapter 5: Outlines the steps to prepare for an evaluation (pg. 30–41)
    • Sample logic model and logic model worksheet (pg. 42–43)
    • Sample and worksheet for describing implementation objectives in measurable terms (pg. 44–45)
    • Sample and worksheet for describing participant outcome objectives (pg. 46–47)
    • Sample outline for an evaluation plan (pg. 59–61)
    • Sample data collection plan (pg. 74–75)
    • Worksheet for developing a data collection plan (pg. 76)

    Retrieved from http://www.acf.hhs.gov/programs/opre/research/project/the-program-managers-guide-to-evaluation

  • This rubric describes the key actions and behaviors that leaders need to pay attention to in order to create active engagement at the informing level, networking level, collaborating level, and transforming level.

    Retrieved from https://wested.app.box.com/s/oh0wkij7a0a7hsfyfoxrv0f0u55ljdj3

  • This document provides a list of recommended resources to support evaluation planning for program improvement efforts, including the State Systemic Improvement Plan (SSIP).

    Retrieved from http://ectacenter.org/topics/ssip/plan_eval_program_improvement.asp

  • This sample action plan template provides states with a suggested, but not required, format and examples of potential content to assist them in completing their Phase II SSIP improvement plan and evaluation plan.

    Retrieved from http://ectacenter.org/~docs/topics/ssip/ssip_improvement_plan_template.doc

  • This Gantt chart template is a tool to track state planning activities and their timelines.

    Retrieved from http://ectacenter.org/~docs/topics/ssip/ssip_phase_ii_gantt_chart.docx

  • This online module provides foundational information on effective stakeholder engagement and includes a session (Session 3) on strategies and resources for state staff to use when leading stakeholders in data system initiatives (including the SSIP).

    DaSy (2014). Retrieved from http://dasycenter.org/stakeholder-engagement-in-data-system-initiatives-an-online-module-for-part-c-and-part-b-619-state-staff-2

  • The State Benchmarks of Quality can be used by the State Leadership Team (SLT) to assess progress and plan future actions so that Recommended Practices (RPs) are available for providers and families statewide. Sections of this document, particularly pages 3–4, are useful in considering the structure and staffing of the state leadership team.

    Retrieved from http://ectacenter.org/~pdfs/implement_ebp/ECTA_RP_Benchmarks_4-2015.pdf

  • The system framework guides coordinators and staff in successfully addressing state needs, then implementing evidence-based practices, and finally bringing about positive outcomes for children and their families.

    Retrieved from http://ectacenter.org/sysframe

  • This manual is written for community-based organizations and focuses on internal evaluation conducted by program staff, which will be useful for states planning to conduct their SSIP evaluation internally. The manual provides an overview of the evaluation process and includes the basic steps of planning for and conducting internal program evaluation, including practical strategies for identifying quantitative and qualitative data.

    Most relevant sections:

    • Chapter 4: What Are You Trying to Do? Defining Goals and Objectives (pg. 15–19)
    • Evaluation Planning Chart (pg. 25)
    • Chapter 6: Finding the Evidence: Strategies for Data Collection (pg. 27–37)
    • Chart of program objectives to evaluation questions (pg. 47)
    • Roadmap for evaluation design (pg. 61–62)
    • Appendices: Example evaluation reports

    Bond, S.L., Boyd, S.E., Rapp, K.A., Raphael, J.B., and Sizemore, B.A. (1997). Horizon Research. Retrieved from http://www.horizon-research.com/taking-stock-a-practical-guide-to-evaluating-your-own-programs

  • This template provides guidance for the development of a strategic communication plan (see Step 3: Develop Messages, pg. 2).

    Using the Message Development Worksheet (see Message Worksheet, pg. 11) will help convey goals and objectives, deliver important information about the issue, and compel the targeted audience to think, feel, or act.

    Retrieved from https://www.wkkf.org/resource-directory/resource/2006/01/template-for-strategic-communications-plan

  • This checklist includes examples of steps leaders can take to help create a well-functioning and forward-thinking organization and to help practitioners feel a sense of belonging as they understand their purpose within the organization. The checklist can also be used as a self-evaluation by leaders at both state and local levels.

    Retrieved from http://ectacenter.org/~pdfs/decrp/LDR-3_Leaders_vision_direction.pdf

  • This guide is written for a broad audience to clarify the what, why, and how of logic models. It discusses what logic models are and their different types, why a program should develop one, and how a logic model can be used to guide implementation and plan for evaluation.

    The guide also includes templates and checklists that states can apply to their SSIP. This guide provides useful explanations and definitions of evaluation terminology.

    Most relevant sections:

    • Figure 2: How to read a logic model graphic (pg. 3)
    • 3 Approaches to logic models (pg. 9–10)

    Retrieved from https://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide

  • This guidebook is written for the person who has never done an evaluation before. It provides step-by-step instructions on how to design and carry out an evaluation. It could also be used as a reference by people interested in certain phases of the evaluation process, such as writing performance indicators or designing a survey. The guidebook details how to:

    1. develop outcome statements, indicators, and evaluation questions;
    2. formulate an evaluation methodology and collect, assess, and summarize data; and
    3. develop and disseminate evaluation findings and recommendations.

    The guidebook also contains a glossary, sample outcome and indicator statements, evaluation resources, and real-life stories of how community organizations used evaluation tools. This resource is lengthy but full of useful explanations of evaluation concepts. It has some wonderful worksheets and resources that states can use for the SSIP and that TA providers can use with states during sessions and workshops.

    Most relevant sections:

    • Worksheets to develop evaluation questions (pg. 35–39)
    • Implementation questions (pg. 58)
    • Worksheet for documenting strategies and activities (pg. 61)
    • Examples of documentation forms (pg. 62–66)
    • Examples and worksheets for developing an evaluation work plan (pg. 91–96)
    • Exercise for analyzing training attendance data (pg. 154–155)

    SRI International (supported by the Sierra Health Foundation) (2000). Retrieved from https://www.sierrahealth.org/pages/525

Links on this site are verified monthly. This page content was last updated on 2015-03-01.

Content hosted by The Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

The contents of this guide were developed under cooperative agreement numbers #H326R140006 (DaSy), #H326P120002 (ECTA Center), #H373Y130002 (IDC) and #H326R140006 (NCSI) from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the policy of the US Department of Education, and you should not assume endorsement by the Federal Government.

Project Officers: Meredith Miceli & Richelle Davis (DaSy), Julia Martin Eile (ECTA Center), Richelle Davis & Meredith Miceli (IDC), and Perry Williams & Shedeh Hajghassemali (NCSI)

  • OSEP's TA&D Network:
  • IDEAs that Work