Improving Systems, Practices and Outcomes

Recommended Resources for Planning to Evaluate Program Improvement Efforts (including the SSIP)

DaSy: The Center for IDEA Early Childhood Data Systems
ECTA Center: The Early Childhood Technical Assistance Center
NCSI: The National Center for Systemic Improvement

This document lists recommended existing resources that state Part C and Part B 619 staff and technical assistance (TA) providers can use to support evaluation planning for program improvement efforts, including the State Systemic Improvement Plan (SSIP). Many resources related to evaluation and evaluation planning are available; these were selected as the most relevant to and useful for early intervention and preschool special education. Additional resources can be found in the GRADS360 Phase II Evaluation Resource Library. Because program improvement and SSIP work are ongoing and evolving, this list will be updated as new and relevant resources become available.

Comprehensive Evaluation Guidebooks and Manuals

The Program Manager's Guide to Evaluation, 2nd Edition

This manual is written for program managers and contains tips, worksheets, and samples to help you understand each step of the evaluation process.

Most relevant sections:
  • Pages 8-9: Outline of the basic evaluation steps
  • Chapter 5 (pages 30-41): Outlines the steps to prepare for an evaluation
  • Pages 42-43: Sample logic model and logic model worksheet
  • Pages 44-45: Sample and worksheet for describing implementation objectives in measurable terms
  • Pages 46-47: Sample and worksheet for describing participant outcome objectives
  • Pages 59-61: Sample outline for an evaluation plan
  • Pages 74-75: Sample data collection plan
  • Page 76: Worksheet for developing a data collection plan

Source: Administration for Children and Families, Office of Planning, Research and Evaluation (ACF OPRE) (2010). Retrieved from http://www.acf.hhs.gov/programs/opre/research/project/the-program-managers-guide-to-evaluation

We Did It Ourselves: An Evaluation Guidebook

This guidebook is written for the person who has never done an evaluation before. It provides step-by-step instructions on how to design and carry out an evaluation, and it can also serve as a reference for people interested in particular phases of the evaluation process, such as writing performance indicators or designing a survey. The guidebook details how to (1) develop outcome statements, indicators, and evaluation questions; (2) formulate an evaluation methodology and collect, assess, and summarize data; and (3) develop and disseminate evaluation findings and recommendations. It also contains a glossary, sample outcome and indicator statements, evaluation resources, and real-life stories of how community organizations used evaluation tools. This resource is lengthy, but it is full of useful explanations of evaluation concepts and includes worksheets and resources that states can use for the SSIP and that TA providers can use with states during sessions and workshops.

Most relevant sections:
  • Pages 35-39: Worksheets to develop your evaluation questions
  • Page 58: Implementation questions
  • Page 61: Worksheet for documenting strategies and activities
  • Pages 62-66: Examples of documentation forms
  • Pages 91-96: Examples and worksheets for developing an evaluation work plan
  • Pages 154-155: Exercise for analyzing training attendance data

Source: SRI International (supported by the Sierra Health Foundation) (2000). Retrieved from https://www.sierrahealth.org/pages/525

Taking Stock: A practical guide to evaluating your own programs

This manual, written for community-based organizations, provides a practical guide to program evaluation with a focus on internal evaluation conducted by program staff, which will be useful for states planning to conduct their SSIP evaluation internally. The manual offers a clear overview of the evaluation process, covering the basic steps of planning for and conducting internal program evaluation, including practical strategies for identifying quantitative and qualitative data.

Most relevant sections:
  • Chapter 4 (pages 15-19): What Are You Trying to Do? Defining Goals and Objectives
  • Page 25: Evaluation Planning Chart
  • Chapter 6 (pages 27-37): Finding the Evidence: Strategies for Data Collection
  • Page 47: Chart of program objectives to evaluation questions
  • Pages 61-62: Roadmap for evaluation design
  • Appendices: Example evaluation reports

Source: Bond, S.L., Boyd, S.E., Rapp, K.A., Raphael, J.B. and Sizemore, B.A. (1997). Horizon Research. Retrieved from http://www.horizon-research.com/taking-stock-a-practical-guide-to-evaluating-your-own-programs/

Resources Specific to Evaluating Systems Change

Measuring Implementation of Early Childhood Interventions at Multiple Systems Levels

This brief is written for early childhood researchers, program developers, and funders. It introduces the importance of measuring implementation at multiple system levels and proposes tools for doing so, including a cascading logic model that connects the outcomes and resources of different systems. The brief uses two illustrative examples: (1) a state's effort to improve the quality of infant-toddler child care, and (2) a state's effort to improve child and family outcomes through the expansion of home visiting.

Most relevant sections:
  • Pages 15 & 18: Figures 1 and 2 provide example cascading logic models laying out the system level, target population, improvement strategy and desired outcome(s)
  • Pages 16-17 & 19-20: Tables 2 and 3 provide the constructs, performance measures, and data collection methods for each strategy, grouped by systems level

Source: Berghout, A.M., Lokteff, M., & Paulsell, D. (2013). OPRE. Retrieved from http://www.acf.hhs.gov/programs/opre/resource/measuring-implementation-of-early-childhood-interventions-at-multiple

Resources Specific to Developing Logic Models to Plan for Evaluation

Logic Model Development Guide

This guide is written for a broad audience to clarify the what, why, and how of logic models: what logic models are and the different types; why a program should develop a logic model; and how a logic model can be used to guide implementation and plan for evaluation. The guide also includes templates and checklists that states can apply to their specific SSIP, along with useful explanations and definitions of evaluation terminology.

Most relevant sections:
  • Page 3 (Figure 2): How to read a logic model graphic
  • Pages 9-10: Three approaches to logic models

Source: W.K. Kellogg Foundation (Updated 2004). Retrieved from https://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide

Logic Models as a Platform for Program Evaluation Planning, Implementation, and Use of Findings

This presentation is geared toward helping practitioners develop useful logic models. The slides address the hallmarks of well-constructed, useful logic models and how to use logic models for program evaluation planning, implementation, and use of findings. They do a nice job of breaking down and explaining the different components and terminology associated with a logic model, and additional resources and sites are included. States can use this presentation as an introduction to logic models and as guidance on the key components of a meaningful logic model for their SSIP evaluation.

Most relevant sections:
  • Slides 3 & 4: Components of a logic model and how to construct one
  • Slide 11: Process diagram for creating a logic model

Source: Honeycutt, S., & Kegler, M.C. (2010). AEA/CDC 2010 Summer Evaluation Institute. Retrieved from http://comm.eval.org/viewdocument/?DocumentKey=79c7da3d-6978-452c-8e8b-7056d3626966


Contributors
  • Abby Winer, ECTA Center / DaSy
  • Robin Nelson, DaSy
  • Lynne Kahn, DaSy
  • Taletha Derrington, NCSI / DaSy
  • Elizabeth Davies-Mercier, DaSy
  • Missy Cochenhour, DaSy
  • Nancy Copa, DaSy
Reviewers
  • Holly Cavender, IDC
  • Kathleen Hebbeler, DaSy
  • Christina Kasprzak, ECTA Center
  • Kristin Reedy, NCSI
  • Ellen Schiller, IDC
  • Megan Vinh, ECTA Center

The contents of this document were developed under cooperative agreement numbers #H326R140006 (DaSy), #H326P120002 (ECTA Center), and #H326R140006 (NCSI) from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.

Project Officers: Meredith Miceli & Richelle Davis (DaSy), Julia Martin Eile (ECTA Center) and Perry Williams & Shedeh Hajghassemali (NCSI).

This page content was last updated on 2015-10-30.

Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

The ECTA Center is a program of the FPG Child Development Institute of the University of North Carolina at Chapel Hill, funded through cooperative agreement number H326P120002 from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the Department of Education's position or policy.

  • FPG Child Development Institute
  • OSEP's TA&D Network: IDEAs that Work