Interactive Guide to Streamlining and Integrating Part C General Supervision Activities: Monitoring and Program Improvement
In the ECTA Center's annual needs assessments of Part C and Section 619 coordinators, Part C coordinators identified general supervision as their highest-priority need among all early childhood topics for several years running. In July 2009, Part C coordinators identified "streamlining and integrating the various general supervision activities with the Annual Performance Report (APR), including using a data system/database for monitoring" as the greatest specific need within the topic of general supervision.
In April 2010, a National Think Tank, "Streamlining and Integrating Part C General Supervision Activities: Monitoring and Program Improvement", was held to develop resources and materials (a tool kit) to help states identify where and how monitoring and program improvement activities could be streamlined and better integrated to meet both state needs and national reporting requirements. This online interactive guide is based on the proceedings of that Think Tank.
- FAQ Regarding Identification and Correction of Noncompliance and Reporting on Correction in the State Performance Plan (SPP)/Annual Performance Report (APR): This OSEP document provides OSEP responses to frequently asked questions related to the identification and correction of noncompliance.
- 09-02 Memorandum: This OSEP memo provides clarification to states on the identification and correction of noncompliance and on how OSEP factors evidence of correction into deciding whether a state meets substantial compliance for the purpose of making determinations.
The Think Tank refined a six-step framework originally conceptualized by the Western Regional Resource Center (WRRC) in July 2009. The purpose of this framework is to operationalize the interrelated functions of monitoring and program improvement activities and to serve as a foundation for states to improve, organize, and redesign their general supervision activities. The six steps describe what a general supervision system does and are designed to assist states in efficiently implementing the various components of general supervision to resolve issues effectively, resulting in continuous, lasting improvement. The framework addresses only monitoring and program improvement activities; it does not cover fiscal accountability, complaints/hearings, or other components of general supervision.
The remainder of this online guide presents each step separately, providing a description, the associated state challenges, and relevant resources for the step.
- Use a variety of methods and activities including on-site monitoring, state database, self-assessment, complaints, etc. to identify both state and local level issues with performance and implementation of IDEA
- To be effective and manageable, the methods and activities to identify issues should:
- be integrated with each other
- not duplicate effort
- respond to what is needed to meet requirements (State Performance Plan (SPP)/Annual Performance Report (APR), related requirements, state requirements) and support quality
- limit the amount of data collected/analyzed
- be based on the capacity of the state to carry out the activities
- For noncompliance, determine whether verification of data is needed before deciding if the issue is noncompliance and/or whether the noncompliance has already been corrected prior to issuing written notification of a finding
- Limited or no monitoring of fiscal accountability
- Some core state monitoring criteria do not correspond to SPP/APR indicators
- Collecting data on too many different "things": not prioritizing what data to collect (not all of IDEA)
- Too many redundant components of general supervision system and methods used for monitoring
- High frequency of issues identified
- State data systems that are not real time and do not capture data on all indicators and priority areas
State Monitoring Indicator Matrices
States have selected indicators for monitoring local early intervention (EI) programs and overall state performance. The monitoring indicators usually include some related requirements and/or quality measures in addition to APR indicators. The data source used to collect the data is included for each indicator. State tools include:
- Alaska Monitoring Indicators
- District of Columbia OSSE Part C Monitoring Tool
- New Jersey Early Intervention System General Supervision Data Review Matrix
State Self-Assessment Tools
- Florida Self-Assessment Instructions and Quality Assurance Probes and Criteria for Measurement: Florida uses a self-assessment process to monitor local early intervention programs and support local program improvement. The Probes and Criteria for Measurement identify what self-assessment items are required and how they will be measured. Electronic worksheets are provided for local programs to complete based upon the data sample selected by the state. The same Probes and Criteria for Measurement tool is also used with several local early intervention programs that are selected annually for a facilitated self-assessment. The facilitated self-assessment includes state and local program staff working together to jointly complete the self-assessment. Programs are selected for the facilitated self-assessment based upon performance or the need to complete on-site monitoring once every three years.
- Idaho Regional Annual Performance Report (Self-assessment): Idaho's Regional APR (R-APR) is a self-assessment completed by regional programs on an annual basis and includes monitoring indicators, targets, data sources, instructions, and measurement. Some data in the R-APR is pre-populated from the database, while other data is collected through a regional record review/self-assessment process.
- Arizona Site Review Child File: Data Sheet and Worksheet: Arizona developed a self-assessment tool for local early intervention programs to collect data on monitoring indicators and related requirements that are not collected through other means, such as the state data system.
- Once an issue is identified, the state determines:
- If it is noncompliance or if noncompliance is contributing to the issue
- The level/extent of the noncompliance by using:
- Percentages (e.g., ≥95%, 85-94%, 76-84%, ≤75%), or
- Number of instances in proportion to the N (e.g., 1 out of 5, 1 out of 50)
- Consider other factors, as appropriate, in determining the extent/level of the issue:
- Where and with whom the issue is occurring (one (1) or more service coordinators/providers; one (1) or more programs; regionally or statewide)
- Historical or trend data (e.g., repeat offender)
- Contextual factors (e.g., programs' demonstrated ability to correct prior noncompliance)
- Number of issues/findings of noncompliance
- Consider the extent/level of noncompliance when determining what is required to ensure correction and/or issue resolution
- Not having clear protocols to determine the extent/level of the issue
- Not paying due attention to "performance" issues
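The banding logic described in Step 2 can be sketched in code. This is a minimal sketch only: the band labels and the cut points (taken from the example bands ≥95%, 85-94%, 76-84%, ≤75% above) are illustrative assumptions, not prescribed values, and each state sets its own criteria.

```python
def noncompliance_level(compliant: int, total: int) -> str:
    """Classify the extent of noncompliance from record-review counts.

    The percentage bands mirror the example bands in this guide
    (>=95%, 85-94%, 76-84%, <=75%); the labels are hypothetical.
    """
    if total == 0:
        raise ValueError("no records reviewed")
    pct = 100 * compliant / total
    if pct >= 95:
        return "minimal"      # e.g., isolated instances
    elif pct >= 85:
        return "moderate"
    elif pct >= 76:
        return "significant"
    else:
        return "extensive"    # e.g., likely systemic
```

For small Ns, a state might instead compare the raw counts (e.g., 1 out of 5 vs. 1 out of 50), since a single instance can swing the percentage dramatically.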
This document includes two example flow charts that link the extent and level of noncompliance with a proportionate response for correction (Steps 2, 3, and 4). The flow charts diagram the relationship between a program's level/extent of the issue and the actions that will be required of local programs (including corrective action plans, or CAPs) to ensure correction of noncompliance/resolution of the issue, as well as the amount of data that the state will use to verify resolution or correction.
This flow chart and instructions outline the steps in the decision-making process for determining the level and extent of noncompliance and the relationship with root causes and the required corrective actions (including the amount of data needed for verifying correction) (Steps 2, 3 and 4).
This state's monitoring tool incorporates the criteria in the right column for determining the corrective action based on the level of noncompliance.
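A proportionate-response mapping of the kind these flow charts describe can be represented as a simple lookup table. The levels, actions, and verification amounts below are hypothetical placeholders in the spirit of the documents above, not any particular state's criteria:

```python
# Hypothetical proportionate-response table: the required action and the
# amount of data used to verify correction scale with the level of
# noncompliance determined in Step 2. All entries are illustrative.
RESPONSE_BY_LEVEL = {
    "minimal":     {"action": "correct individual records",
                    "verification": "review the affected records"},
    "moderate":    {"action": "targeted corrective actions",
                    "verification": "review an updated sample"},
    "significant": {"action": "corrective action plan (CAP)",
                    "verification": "review a larger updated sample"},
    "extensive":   {"action": "CAP with state oversight",
                    "verification": "review updated data for all records"},
}
```

Encoding the mapping explicitly, rather than deciding case by case, is one way to address the challenge of not having clear protocols for linking the extent of an issue to what is required for correction.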
- Actions related to determining the root cause of the issue should be related to what information/data is already available and what is known about the issue
- Other factors such as historical and trend data may also impact formality of actions needed to determine the root cause
- Analysis of the root cause(s) should be thoughtful and in sufficient detail to ensure that corrective actions/improvement efforts are meaningful and effective (change policies, procedures, practices, personnel development, administration, etc.)
- Overall, identifying root causes should ensure corrective actions are meaningful and effective to ensure that noncompliance can be verified as corrected as soon as possible, but in no case more than one year from identification
- Not investigating root cause
- Identifying the root cause(s) vs. acculturated excuses (such as personnel shortages)
- Insufficient state resources (including TA and incentives) to support local programs in determining root cause
- Early intervention personnel do not always have the skill set to analyze and drill down in data and then use data to develop effective CAPs
Local Contributing Factor Tool for SPP/APR Compliance Indicators C-1, C-7, C-8, C-9/B-15, B-11 and B-12
This document provides ideas for the types of questions a local team would consider in identifying factors contributing to noncompliance for SPP/APR Indicators C-1, C-7, C-8, C-9, B-11, B-12 and B-15. Some questions are designed to determine the adequacy of local agency/district management and oversight, while others are geared toward gathering information from service coordinators, providers, and/or teachers about actual practices. Data collected from this analysis should be used to identify contributing factors that relate to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel, and provider practices. These factors, once identified, can lead to the development of meaningful strategies for correction in those areas contributing to the noncompliance.
This document provides ideas for the types of questions a local team would consider in identifying factors contributing to performance. Some questions are designed to determine the adequacy of local agency/district management and oversight, while others are geared toward gathering information from service coordinators, providers, and/or teachers about actual practices. Data collected from this analysis should be used to identify contributing factors that relate to program infrastructure, policies and procedures, funding, training and technical assistance, supervision, data, personnel, and provider practices. These factors, once identified, can lead to the development of meaningful strategies for improvement in those areas currently impacting performance.
Florida Early Steps Root Cause Analysis: Periodic and Annual IFSP Reviews and Timely Services
Florida developed Root Cause Analysis tools for local early intervention programs with long-standing noncompliance with Periodic and Annual IFSP Reviews and with Timely Services. Other early intervention programs can use the Root Cause Analysis tools in full or in part to identify local issues with these requirements.
- Use decisions on the level/extent and the root cause of the issue (Steps 2 and 3), including whether there is noncompliance, to determine:
- At what level resolution needs to happen
- Who needs to be responsible
- What actions should be required
- What data will be used to verify correction and how verification will occur
- For improvement, develop improvement strategies and correlate with SPP/APR as appropriate
- For correction of noncompliance, issue written finding notification (data/conclusion leading to finding of noncompliance, citation, requirement to correct as soon as possible but in no case later than one year from identification)
- Powerful interest groups, local municipalities and provider agencies reluctant to change
- No clear ownership of the issue and/or its cause
- Line of authority not clearly defined
- Issuing written notification of findings
- Determining resolution that is doable and ensures correction
- Not linking what is needed for "correction" with the level/extent of issue and root cause
- Developing focused, functional and measurable corrective action plans that result in correction of noncompliance
Sample Written Notification of Noncompliance
States are required to provide written notification of noncompliance to local early intervention programs. Based on the level/extent of the noncompliance, some states' findings letters require a CAP while others do not. Examples of findings letters include:
Corrective Action Plans/Improvement Plans
States have developed corrective action plan/improvement plan (IP) templates to ensure correction of noncompliance and program improvement.
- New Jersey
- Florida Note: The Florida continuous improvement plan template is used immediately following identification of noncompliance and following early intervention programs' notification of their Status Determinations.
Colorado DASHBOARD Reports on Implementation of Plan of Corrections
These state reports are printed monthly and shared with staff to track progress and allow intervention as needed. Also, the state is planning to use this report to do progress checks when a finding is not issued.
These instructions provide guidance to local programs on which data fields should be selected to run updated data reports to determine correction of noncompliance related to transition conferences, Notification, and 45-day timelines. These reports can also be used by local programs on an ongoing basis, as a preventative measure, to monitor their performance with these requirements throughout the year.
- Improvement issues and those requiring system improvement may need to occur over several years
- New data should be reviewed to determine the effectiveness of the improvement strategies
- For noncompliance, the timeline for when correction must be verified (as soon as possible but in no case later than one year) begins on the date that the state Lead Agency notifies the EI program in writing of its finding of noncompliance
- To demonstrate that noncompliance has been corrected, the state must:
- Prong 1: Account for the correction of all child-specific instances of noncompliance (state or local agency can review a sample of the records with noncompliance to verify correction) AND
- Prong 2: Determine whether each EI Program with identified noncompliance is correctly implementing the specific regulatory requirements (achieved 100% compliance)
- Determining how much data is enough to verify correction
- Designing a system that is not overwhelming for providers (drowning in data)
- How to ensure correction since 100% compliance for 100% of the time is impossible
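OSEP's two-prong demonstration of correction, as outlined above, can be sketched as a simple check. The data shapes here (lists of per-record results) are assumptions for illustration; in practice the inputs would come from record reviews and the state data system:

```python
def correction_verified(child_instances_corrected: list[bool],
                        new_sample_compliant: list[bool]) -> bool:
    """Two-prong test for verifying correction of noncompliance.

    Prong 1: every child-specific instance of noncompliance has been
    corrected (a state or local agency may review a sample of the
    records with noncompliance).
    Prong 2: updated data show the EI program is correctly implementing
    the specific regulatory requirement (100% compliance in the new
    sample).
    """
    prong1 = len(child_instances_corrected) > 0 and all(child_instances_corrected)
    prong2 = len(new_sample_compliant) > 0 and all(new_sample_compliant)
    return prong1 and prong2
```

Note that Prong 2 requires 100% in the verification data: a single noncompliant record in the new sample means correction cannot yet be verified, which speaks to the challenge above of deciding how much data is enough.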
Tracking Correction of Findings of Noncompliance
States have developed databases and other tools to track when findings of noncompliance need to be corrected (within one year of identification) and actually are corrected.
- Connecticut (database)
- Missouri (database)
- Washington Tracking Determinations and Correcting Findings of Noncompliance
- Florida Noncompliance Correction Tracking
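A minimal deadline tracker of the kind these state tools implement might look like the following sketch. The one-year clock runs from the date of written notification, as described above; the function and field names are hypothetical:

```python
from datetime import date

def correction_due_date(notified: date) -> date:
    """Correction must be verified no later than one year after the
    date of written notification of the finding."""
    try:
        return notified.replace(year=notified.year + 1)
    except ValueError:  # notification dated Feb 29, non-leap year ahead
        return notified.replace(year=notified.year + 1, day=28)

def overdue_findings(findings: dict[str, date], today: date) -> list[str]:
    """Return the program IDs whose one-year correction window has
    passed without verified correction."""
    return [program_id for program_id, notified in findings.items()
            if today > correction_due_date(notified)]
```

A report built on such a tracker could be run monthly, in the manner of the Colorado DASHBOARD reports, to flag programs approaching or past their correction deadlines.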
- The state should have a system of incentives and enforcement actions in place
- If noncompliance is not corrected in a timely manner (within one year of identification), the state must have processes in place to continue collecting updated data until the data reflect 100% compliance and all child-specific noncompliance has been corrected
- Written notification of correction of noncompliance must be provided to local early intervention programs
- Data on correction of noncompliance is used in making status determinations for local early intervention programs
- Maintaining correction once focus is off indicator
- Using resolution results to guide state strategic plans and priorities
- Identifying appropriate sanctions/incentives
- Celebrating success in correcting noncompliance to stay in compliance
OSEP has made available several resources to assist states in developing and implementing their local determination process.
This OSEP document outlines the §616 IDEA requirements related to the U.S. Department of Education's responsibility for reviewing states' APRs on an annual basis and making determinations.
How the Department Made Determinations under Sections 616(d) and 642 of the Individuals with Disabilities Education Act in 2012: Part C
This OSEP document outlines how OSEP made determinations for states.
Foundational/Overall System Resources
Integrated System of General Supervision: Spider Web Activity and Instructions
This tool is designed to evaluate the integration of state general supervision activities.
This tool is to assist states in establishing and communicating timelines for completing general supervision activities throughout the year (includes an example for Parts B and C), integrating the various general supervision activities with the APR and other reporting requirements, and/or to define, describe or help evaluate the state's general supervision system.
This state tool is a color-coded monthly calendar for state level activities related to monitoring, training, etc.
This document summarizes five states' (NC, CT, WY, ID and NY) general supervision systems according to the six steps of monitoring.
This document includes a matrix and accompanying guidance to assist states in assessing their capacity and staffing for carrying out monitoring and program improvement activities and in adjusting their activities to match available resources.