System Framework: Resource Search Results

  • AC1
    • a (pg. 4)
    • b (pg. 4)
    • c (pg. 4)
  • AC3
    • a (pg. 11-13)
    • c (pg. 9, 12, 14)
    • d (pg. 17)
    • e (pg. 14, 15, 17)
    • f (pg. 15, 21-22)
    • g (pg. 16)
    • h (pg. 16)
  • AC5
    • b (pg. 14, 15)
    • d (pg. 18)
  • AC6
    • b (pg. 13, 17)
    • g (pg. 16)

This document presents three views of accountability design to address states' needs:

  • a framework that provides a structure for helping states move through the process of designing a school accountability system, with questions, criteria, and comments;
  • a checklist of characteristics to help states evaluate the consistency and coherence of their programs; and
  • examples of actual state experiences, with design features that might be considered and why.

The document can be used to assist states in designing their state accountability and quality improvement systems.
  • AC1
    • a (pg. 17)
    • b (pg. 17)
    • c (pg. 33)
    • e (pg. 33)
    • g (pg. 33)
  • AC3
    • a (pg. 27)
    • c (pg. 27)
    • g (pg. 27, 30)
    • j (pg. 27)
  • AC5
    • a (pg. 49-50)
    • b (pg. 49-50)
    • c (pg. 49-50)
    • d (pg. 49-50)
  • AC6
  • AC7
    • a (pg. 61-62)
    • b (pg. 61-62)

This document summarizes the work done to date on developing a set of standards for accountability. It can be used to inform those less experienced with accountability about the essential elements of a good, valid accountability system. States can use the checklists provided to assist in designing their accountability systems. Although the examples are for Part B, they can also be applied to Part C. The document provides relevant state examples and checklists to assist states in developing an accountability system (see pg. 17-59, 98-104).

  • AC1
    • b (pg. 18)
    • c (pg. 4, 5, 12, 18)
  • AC2
    • a (pg. 3)
    • b (pg. 7-8)
    • e (pg. 9)
    • h (pg. 12)
  • AC3
    • c (pg. 9)
    • f (pg. 10, 12)
    • g (pg. 6, 9)
    • h (pg. 12)
  • AC4
    • b (pg. 5, 9, 10)
    • d (pg. 12)
  • AC5
    • a (pg. 10)
    • b (pg. 5, 9, 10)
    • c (pg. 12)
  • AC6
    • a (pg. 5)
    • c (pg. 10, 11, 13-15)
    • e (pg. 12)
    • f (pg. 14-15)
    • g (pg. 12, 15)
  • AC7
    • a (pg. 13)

This document from the National Center for Special Education Accountability Monitoring presents eight components that make up a state's general supervision system under Part B of IDEA. These components (State Performance Plan; policies, procedures and effective implementation; data on processes and results; targeted technical assistance and professional development; effective dispute resolution; integrated monitoring activities; improvement, correction, incentives and sanctions; and fiscal management) connect, interact, and articulate to form a comprehensive system. The document was designed to be used by a state in self-evaluating its general supervision system. It was developed through a collaborative effort with the RRCP, NECTAC, representatives from state agencies, NASDSE, and ITCA.

  • AC1
    • b (pg. 18)
    • c (pg. 4, 5, 12, 18)
  • AC2
    • a (pg. 3)
    • b (pg. 7-8)
    • e (pg. 9)
    • h (pg. 12)
  • AC3
    • c (pg. 9)
    • f (pg. 10, 12)
    • g (pg. 6, 9)
    • h (pg. 12)
  • AC4
    • b (pg. 5, 9, 10)
    • d (pg. 12)
  • AC5
    • a (pg. 10)
    • b (pg. 5, 9, 10)
    • c (pg. 12)
  • AC6
    • a (pg. 5)
    • c (pg. 10, 11, 13-15)
    • e (pg. 12)
    • f (pg. 14-15)
    • g (pg. 12, 15)
  • AC7
    • a (pg. 13)

This document from the National Center for Special Education Accountability Monitoring presents eight components that make up a state's general supervision system under Part C of IDEA. These components (State Performance Plan; policies, procedures and effective implementation; data on processes and results; targeted technical assistance and professional development; effective dispute resolution; integrated monitoring activities; improvement, correction, incentives and sanctions; and fiscal management) connect, interact, and articulate to form a comprehensive system. The document was designed to be used by a state in self-evaluating its general supervision system. It was developed through a collaborative effort with the RRCP, NECTAC, representatives from state agencies, NASDSE, and ITCA.

  • AC2
    • c
    • d
    • e
  • AC3
    • c
    • d (pg. 3-6)

See links to Arizona materials located at the bottom of this page. The AZ materials include an individual child record review tool and an Excel file/workbook for summarizing and calculating data from multiple child records for APR indicators and select related requirements. The summary tool can be used by state monitoring staff to create a report to show program performance on each indicator and will yield percentages related to compliance. It can be a useful tool in focused monitoring and can help identify trends and concerns. The materials can also be used as mechanisms for self-assessment and/or tracking correction when timelines are not met. Materials are easily adaptable for use by other states.

  • AC1
    • c (pg. 1)
  • AC2
    • a (pg. 1-2)
    • c (pg. 1)
    • h (pg. 1)
  • AC3
    • a (pg. 2, 3)
    • b (pg. 2)
    • c (pg. 2)
    • d (pg. 2)
    • f (pg. 2-4)
    • i (pg. 3)
    • j (pg. 2, 12)
  • AC4
    • a (pg. 4-6)
    • b (pg. 3-6)
    • c (pg. 3-4)
    • d (pg. 3, 5)
  • AC5
    • a (pg. 1)
    • b (pg. 4)
    • c (pg. 4, 7, 8, 12)
    • d (pg. 4)
  • AC6
    • a (pg. 2, 11)
    • c (pg. 2, 8, 9, 12)
    • f (pg. 2, 11)
    • g (pg. 2, 11)

This document provides a quick overview of the general supervision components for five states (NC, CT, WY, ID, NY). The information is organized according to the Six Steps of Monitoring and Program Improvement: A Framework for Streamlining and Integrating Part C General Supervision Activities, that is, the activities that describe what a general supervision system does: identify an issue; determine the extent/level; determine the cause; assign accountability; ensure and verify resolution; follow up. The document can be used to compare how different states address different aspects of general supervision. It does not include processes for fiscal monitoring.

  • AC2
    • c (pg. 4-16)
    • d (pg. 4-19)
  • AC3
    • c (pg. 20-23)
    • e (pg. 4-16)
    • g (pg. 17)
    • h (pg. 4-16)

The Missouri First Steps Part C program developed this document to rate the quality of IFSPs for accountability and monitoring purposes, specifically for measuring a performance standard in the System Point of Entry (SPOE) contract. This resource outlines indicators and performance measures related to the quality of IFSPs. It also provides examples of quality IFSP elements (e.g. outcomes, strategies, measurement criterion). It can be used by states in developing their own indicators and performance measures of IFSP quality.

  • AC2
    • c
    • d
    • e

This summary document describes the methods used to gather data on the 38 Texas Part C monitoring indicators (i.e., onsite monitoring, desk review monitoring, data monitoring, quality assurance reviews, funding application reviews, financial report desk monitoring, complaints management, and determinations). A number of the items are related to quality. States can use this document to guide their identification of both compliance and quality indicators for Part C.

  • AC2
    • a (pg. 1)
    • b (pg. 7-8)
    • e (pg. 3-7)

This document describes the Texas Early Childhood Intervention program's system of general supervision and oversight of contractors implementing Part C at the local level. It includes the oversight methods used for monitoring compliance as well as reviewing quality. This resource can be used by states to guide their process of enhancing their Part C general supervision and oversight of local programs in their states.

  • AC2
    • c
    • f

This document includes the indicators that the NJ Part C program uses for monitoring. The indicators include both federal and state requirements. The data sources and timelines for each indicator are included. States can use this document to assist in identifying their state monitoring indicators.

  • AC2
    • c
    • d
  • AC3
    • c
  • AC6
    • c

This document was developed by the District of Columbia Part C program for monitoring local EI programs/providers. The tool identifies each indicator and provides guidance on how to measure performance for each item. The indicators include items related to verification of data entry in the child's file against data in the data system, as well as requirements related to procedural safeguards, evaluation/assessment, the IFSP, and fiscal management. Each indicator also includes the corrective actions that the program must take to achieve compliance and improvement.

  • AC1
    • a (pg. 1)
    • b (pg. 1)
    • c (pg. 2)
    • d (pg. 1)
    • e (pg. 4-6)
  • AC2
    • a (pg. 1-4)
    • b (pg. 1-4)
    • c (pg. 7-9, 12-14)
    • e (pg. 10-12)
    • f (pg. 1-6, 10-12)
    • g (pg. 17-20)
    • h (pg. 1-4)
  • AC3
    • f (pg. 1, 3)
  • AC4
    • a (pg. 1, 4)
    • d (pg. 15)
  • AC5
    • a (pg. 14)
    • b (pg. 14, 15, 21, 26, 28)
    • c (pg. 14, 15, 22, 23)
    • d (pg. 15)
  • AC6
    • a (pg. 16, 21)
    • c (pg. 23, 24, 27)
  • AC7
    • a (pg. 25, 26)

This manual outlines the purpose, procedures, and process for monitoring regional programs and early intervention providers to ensure appropriate implementation of IDEA and improved results for children and their families. The manual can be used by states to understand how the general supervision components can be integrated into a streamlined system. It also provides a framework for scheduling multiple activities throughout the year.

  • AC3
    • a (pg. 2)
    • b (pg. 2)
    • c (pg. 2)
    • g (pg. 2-4, 7-9, 11-12, 14-15, 17-18, 21-22)
  • AC4
    • c (pg. 2)
    • d (pg. 2, 5, 9, 12, 15, 19, 22, 25, 29)
  • AC6
    • b (pg. 5-6, 9-10, 12-13, 15-16, 19-20, 22-23, 25-26, 29-30)
  • AC7
    • a (pg. 5-6, 9-10, 12-13, 15-16, 19-20, 22-23, 25-26, 29-30)

This document provides ideas for the types of questions a local team would consider in identifying factors contributing to noncompliance for SPP/APR Indicators C1, C7, C8, C9, B11, B12, and B15. Some questions are designed to determine the adequacy of local agency/district management and oversight, while others are geared toward gathering information from service coordinators, providers, and/or teachers about actual practices. States can use this tool to collect data to identify contributing factors that relate to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel, and provider practices. In addition, the tool provides a template for states to use in developing meaningful strategies for improvement in those areas impacting compliance.

  • AC3
    • a (pg. 2)
    • b (pg. 3)
    • c (pg. 2)
    • d (pg. 5-7)
    • g (pg. 4-10, 14, 19, 20)
  • AC4
    • b (pg. 4, 6, 15)
    • c (pg. 2)
    • d (pg. 2, 9, 17, 23)
  • AC6
    • b (pg. 11, 18, 24)
  • AC7
    • a (pg. 10, 18, 24)

This document provides ideas for the types of questions a local team would consider in identifying factors contributing to performance for SPP/APR Indicators C2, C4, C5, C6. General questions that are applicable to all indicators are included, as well as questions specific to each indicator. Some questions are designed to determine adequacy of local agency/district management and oversight while others are geared for gathering information from service coordinators, providers and/or teachers about actual practices. States can use this tool to collect data to identify contributing factors that relate to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel and provider practices. In addition, the tool provides a template for states to use in developing meaningful strategies for improvement in those areas impacting performance.

  • AC3
    • a (pg. 2)
    • b (pg. 3)
    • c (pg. 2)
    • d (pg. 7)
    • e (pg. 6)
    • g (pg. 7)
  • AC4
    • b (pg. 7)
    • c (pg. 2)
    • d (pg. 8)
  • AC5
    • a (pg. 6)
    • b (pg. 9)
  • AC7
    • a (pg. 9)

This document provides ideas for the types of questions a local team would consider in identifying factors impacting performance. General questions that are applicable to both indicators are included, as well as questions specific to each indicator. Suggested questions are categorized into two main areas:

  1. Systems/Infrastructure and
  2. Practitioner/Practice.

Some questions are designed to determine the adequacy of local agency management and oversight, while others are geared toward gathering information from service coordinators and practitioners about actual practices. States can use this tool to collect data to identify contributing factors that relate to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel, and provider practices. In addition, the tool provides a template for states to use in developing meaningful strategies for improvement in those areas impacting performance.

  • AC3
    • c (pg. 3-12)
  • AC6
    • b (pg. 3-12)
  • AC7
    • b (pg. 3-12)

This document assists states in identifying ways to improve results for children and families participating in Part C early intervention services through implementation of quality practices. The document identifies key practices, selected from the Agreed Upon Practices for Providing Early Intervention Services in Natural Environments as well as the Personnel Standards for Early Education and Early Intervention: Guidelines for Licensure in Early Childhood Special Education (DEC Recommended Practices), that have the most direct impact on each of the child and family outcomes. This resource can assist states in selecting which quality practices should be implemented to improve results for each of the three child and three family outcomes.

  • AC4
    • b (pg. 2-12)
    • c (pg. 2)
    • d (pg. 2)
  • AC5
    • b (pg. 2)
  • AC6
    • a (pg. 2)
    • b (pg. 2)

This guidance table is a tool to help states identify key issues, questions, and approaches for analyzing and interpreting data on outcomes for young children with disabilities. The tool outlines a series of steps related to defining analysis questions, clarifying expectations, analyzing data, testing inferences, and conducting data-based program improvement planning.

  • AC3
    • c (pg. 2)
    • d (pg. 2)
  • AC6
    • c (pg. 2)
    • f (pg. 2)
    • g (pg. 2)

This OSEP policy memo defines how states correct noncompliance at the child and program levels and how evidence is used to demonstrate correction and compliance with IDEA. This document can be used by states to help establish their mechanisms and procedures for correcting noncompliance as part of their accountability and quality improvement system.

Early Childhood Technical Assistance Center

  • CB 8040
  • Chapel Hill, NC 27599-8040
  • phone: 919.962.2001
  • fax: 919.966.7463
  • email: ectacenter@unc.edu

ECTA Center is a program of the FPG Child Development Institute of the University of North Carolina at Chapel Hill, funded through cooperative agreement number H326P170001 from the Office of Special Education Programs, U.S. Department of Education. Opinions expressed herein do not necessarily represent the Department of Education's position or policy.

Project Officer: Julia Martin Eile

  • UNC Frank Porter Graham Child Development Institute
  • IDEAs that Work: Office of Special Education Programs, U.S. Department of Education