Improving Systems, Practices and Outcomes

Outcomes Measurement: Using Data

Overview

This Using Data section provides resources to help state and local programs make regular use of outcomes data to improve programs. Key elements of the optimal use of data include:

  • Utilizing a local and/or state stakeholder process to consider the implications of child, family, and other data, and to identify needed program improvements.
  • Developing a comprehensive plan for using data based on stakeholder interpretation of the data, knowledge of effective practices, principles of systems change, evaluation of previous improvement activities, and available resources.
  • Identifying programs for targeted support and working to jointly develop action plans.
  • Identifying statewide systemic targets for improvement.
  • Developing policies or guidance addressing local program responsibilities with regard to use of data.
  • Implementing and evaluating program improvement activities on a regular basis.

Resources on these and related topics are found on the next tab, entitled "Resources and Tools." Additional resources are continually being developed and shared by both the ECO Center and states. If you are interested in specific resources or examples, or if you have materials you would like to share, please contact ectacenter@unc.edu.

Resources and Tools

SSIP Child Outcomes Templates

DOC: SSIP Child Outcomes Broad Data Analysis Template (March 2014) looks at how children in the state are performing relative to national data, across years, and across programs within the state. Uses data from current APR reporting.

DOC: SSIP Child Outcomes Subgroup Analysis Template (March 2014) provides states with table shells for subgroup analyses that have proven useful in understanding predictors of child outcomes. These shells are suggestions and should be tailored to fit the appropriate categories for your state.

SSIP Family Outcomes Templates

DOC: SSIP Family Outcomes Broad Data Analysis Template (July 2014) compares family outcomes data in the state to national data, across years, and across programs within the state. Uses data from current APR reporting.

DOC: SSIP Family Outcomes Subgroup Analysis Template (July 2014) provides states with table shells for subgroup analyses that have proven useful in understanding predictors of family outcomes. These shells are suggestions and should be tailored to fit the appropriate categories for your state.
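As a minimal sketch of the kind of subgroup analysis these table shells support: the records, subgroup categories, field names, and outcome values below are hypothetical, invented purely for illustration, not actual state or national data.

```python
# Hypothetical subgroup analysis sketch: percent of records with a positive
# outcome, broken out by a subgroup field. All data here is invented.
from collections import defaultdict

def subgroup_rates(records, group_field, outcome_field):
    """Percent of records with a positive outcome, by subgroup."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for rec in records:
        group = rec[group_field]
        totals[group] += 1
        if rec[outcome_field]:
            positives[group] += 1
    return {g: round(100 * positives[g] / totals[g], 1) for g in totals}

# Example: hypothetical exit records tagged with a "setting" subgroup and
# whether the child showed greater-than-expected growth on one outcome.
records = [
    {"setting": "home", "improved": True},
    {"setting": "home", "improved": False},
    {"setting": "center", "improved": True},
    {"setting": "center", "improved": True},
]
print(subgroup_rates(records, "setting", "improved"))
# {'home': 50.0, 'center': 100.0}
```

In practice the subgroup field would be whatever categories the template shells are tailored to for your state (e.g., region, setting, or age at entry).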

Guidance Table for Analyzing Child Outcomes Data for Program Improvement

This guidance table is designed to help identify key issues, questions, and approaches for analyzing and interpreting data on outcomes for young children with disabilities. The tool outlines a series of steps related to defining analysis questions, clarifying expectations, analyzing data, testing inferences, and conducting data-based program improvement planning. It also includes examples of questions, approaches, and sample figures to consider.

Contact ECTA Center Staff, too! TA providers are happy to work with you as you begin using the guidance table.

FOS Analysis Guide

DOC: FOS Analysis Guide (updated September 2014) is a guidance document on analyzing data collected with the original Family Outcomes Survey (FOS).

Local Contributing Factor Tool

This tool was designed to assist local programs in collecting valid and reliable data to determine contributing factors impacting performance on State Performance Plan (SPP) indicators. The latest addition to the existing tool is a section of drill-down questions focused on child outcomes (indicator C3/B7), which provides ideas for the types of questions a local team would consider in identifying factors impacting performance.

  • Streaming Presentation: WWW: Local Contributing Factor Tool: New Sections for Child Outcomes (C3/B7): This video provides a brief introduction to the tool and the new sections for child outcomes. (10 min.)
  • DOC: New Child Outcomes drill-down questions for C3/B7: This document provides ideas for the types of questions a local team would consider in identifying factors impacting performance. General questions applicable to all indicators are included, as well as questions specific to each indicator. Suggested questions are categorized into two main areas: 1) Systems/Infrastructure and 2) Providers/Practice. This is not meant to be an exhaustive list of questions. Some questions are designed to determine the adequacy of local agency management and oversight, while others are geared toward gathering information from practitioners and about practices. Data collected from this investigation should be used to identify contributing factors related to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel, and practices. Once identified, these factors can lead to the development of meaningful strategies for improvement. Based on the results of the investigation and analysis of the data, it is expected that strategies would be developed only in those areas impacting current performance.

Data Workshop Series

These are the first two in a five-part series of narrated presentations with activities on using child outcomes data. The content and activities are based on a data workshop at the 2011 Measuring and Improving Child and Family Outcomes Conference.

Handouts
Activities
  • WWW: Using Data Activity 1: Comparison of State and National Data: This is the first in the series of data workshop activities. In this activity, we compare progress category and summary statement data from a hypothetical state to national data and data from states with a similar population size or percent served.
  • WWW: Using Data Activity 2: Checking Data for Stability and Completeness: This is the second in the series of data workshop activities. In this activity, we dig a little deeper into the quality of state data by reviewing statewide missing data and trend data across years in order to answer the following questions:
    1. Do we have enough data to trust the findings?
    2. Are the data stable?
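The two checks in Activity 2 can be sketched in a few lines of code; the counts, rates, and the 10-point stability tolerance below are hypothetical values chosen for illustration, not thresholds from the workshop materials.

```python
# Hypothetical sketch of a completeness check and a year-to-year stability
# check on statewide outcomes data. All figures are invented.
def completeness(reported, expected):
    """Percent of expected records that actually have outcome data."""
    return round(100 * reported / expected, 1)

def is_stable(yearly_rates, tolerance=10.0):
    """True if no year-to-year change exceeds `tolerance` percentage points."""
    return all(abs(b - a) <= tolerance
               for a, b in zip(yearly_rates, yearly_rates[1:]))

# Hypothetical statewide figures across three reporting years.
print(completeness(reported=1840, expected=2000))   # 92.0
print(is_stable([61.5, 63.0, 59.8]))                # True
print(is_stable([61.5, 75.0, 59.8]))                # False
```

A low completeness percentage or a large unexplained swing between years would both suggest digging into data quality before trusting the findings.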

Using Data for Program Improvement: Call series sponsored by the Outcomes Priority Team

Monitoring programs and responding to federal requirements have generated lots of data. Next steps are to look for patterns in the data, figure out what they are telling us about our programs, and determine the action steps needed to improve programs. Federally funded TA providers have developed tools to help states and local programs use their data. This call series presented a number of these tools with examples from programs as to how they used the tools to plan for program improvement.

Using SPP/APR data for program improvement with Ann Bailey and Jeanna Mullins (RRCP)

The first call, held on February 22, 2011, featured two tools for using data. The first, Thinking through Improvement (IT Kit), is an improvement planning process developed by the North Central Regional Resource Center that provides information and activities on prioritizing areas for improvement, setting targets, selecting activities, evaluating process and impact, and reporting progress. The second, the Evaluating SPP/APR Improvement Activities resource document, is intended to assist State Education Agency (SEA) and Lead Agency (LA) staff and technical assistance providers in designing a meaningful evaluation of State Performance Plan (SPP)/Annual Performance Report (APR) improvement activities. It provides information about the relevance of evaluation in the context of improvement planning and strategic systems thinking; guidance on selecting an appropriate design for evaluating different types of improvement activities; and additional resources and tools that support the overall design, implementation, and evaluation of the SPP, which may serve as a state's blueprint for systems improvement.

Materials from the call:

Building local capacity for data analysis and use with Sharon Walsh and Haidee Bernstein (DAC); Mary Anne White and Beverly Crouse (Infant Toddler Connection of VA)

The second call in the series, held on March 8, 2011, showcased a set of Virginia Part C training materials developed to improve the capacity of local data teams to understand and analyze their data and to develop, implement and evaluate program improvement plans. The data analysis process involves creating hypotheses that are evaluated to determine the possible root causes of poor/low performance.

Materials from the call:

Using outcomes data for program improvement with Kathy Hebbeler and Cornelia Taylor (ECO)

The third call, held on April 12, 2011, described key concepts in using outcomes data for program improvement and highlighted TA materials developed by the Early Childhood Outcomes Center on how child outcomes data can be used at the state and local levels to improve programs. The call also reviewed the sections of the ECO state self-assessment that address using data for program improvement and showed how states and local programs can use the tool to chart their own progress toward data-based decision making.

Materials from the call:

Data-based decision making: Tools for improving practice with Anne Lucas (RRCP) and Christina Kasprzak (NECTAC)

In the fourth call of the series, held on May 10, 2011, presenters shared a tool designed to help states use data to facilitate systemic state and local improvement. The tool illustrates the use of root cause analysis and other strategies for data-based decision making that improve practices.

Materials from the call:

Ohio: Preschool Special Education Outcomes Institute on Data Analysis

ECO and NECTAC staff collaborated with Ohio's Office of Early Learning and School Readiness, Department of Education to conduct two two-day professional development opportunities for local 619 administrators and service providers. This training emphasized the reporting and use of child outcomes data.

Looking at Data

This presentation focused on using child outcomes data for program improvement. The process of (E) evidence, (I) inference, (A) action is used to guide participants through reviewing the data, interpreting the data, and deciding next steps based on the interpretation.

Data Workshops at the 2009 OSEP National Early Childhood Conference

The ECO Center hosted two data workshops focusing on current issues and challenges related to analyzing child and family outcomes data. The morning session covered the basics of quality data and target setting. The afternoon session addressed data analysis for program improvement and included opportunities for hands-on data analysis by participants using data templates. Data templates were provided for states using the Child Outcomes Summary Form (COSF) as well as for states using other assessments for outcomes reporting.

Using Data for Program Improvement

In this session, presenters discussed the use of outcomes data to drive State and local program improvement. Participants reviewed and analyzed sample State data to develop a better understanding of State and local issues and determined what types of improvement activities a State might implement, including changing policies and guidance, and providing targeted training and TA.

Reporting Data to the Public

Data reports should be routinely generated and analyzed at both the state and district/program levels. Data trends should be compared to the targets in the SPP and other state-selected monitoring indicators and targets, and reports should be designed to meet the needs of districts/programs. States are required to annually report both state performance and district/program performance data (e.g., 618 and SPP/APR data) to the public. The following resource is available regarding reporting data to the public:

Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

The ECTA Center is a program of the FPG Child Development Institute of the University of North Carolina at Chapel Hill, funded through cooperative agreement number H326P120002 from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the Department of Education's position or policy.

  • FPG Child Development Institute
  • OSEP's TA&D Network: IDEAs that Work