State Systemic Improvement Plan Questions and Answers

This document is based on questions posed by State staff after OSEP presented on RDA and the SSIP at various meetings or conferences across regions.

General

QUESTION: Why are States being required to develop a State Systemic Improvement Plan (SSIP)?

RESPONSE: The U.S. Department of Education is implementing a revised accountability system under the Individuals with Disabilities Education Act (IDEA). Results-Driven Accountability (RDA) shifts the Department's accountability efforts from a primary emphasis on compliance to a framework that focuses on improved results for children with disabilities, while continuing to ensure that States meet IDEA requirements. RDA emphasizes improving child outcomes such as performance on assessments, graduation rates, and early childhood outcomes. To support this effort, States are required to develop a State Systemic Improvement Plan (SSIP) as part of their State Performance Plan/Annual Performance Report (SPP/APR). In developing, implementing, and evaluating the SSIP, we expect that a State's focus on results will drive innovation in the use of evidence-based practices in the delivery of services to children with disabilities, which will lead to improved results for children with disabilities.

Stakeholder Engagement

QUESTION: What is expected in terms of stakeholder engagement? Who should be involved, and to what extent?

RESPONSE: It is expected that stakeholders will be meaningfully involved in every phase of the SSIP, including development, implementation, and evaluation. The State needs to determine which stakeholder perspectives are needed for the development, implementation, and evaluation of the SSIP, and whether the stakeholders would change according to the task. At a minimum, we would expect to see representatives from local educational agencies (LEAs) and the State Advisory Panel for the IDEA Part B SSIP, and from early intervention service (EIS) programs and providers and the State Interagency Coordinating Council (SICC) for the IDEA Part C SSIP. We would also expect to see representatives of parents of children with disabilities, parent advocacy groups, and other State and local agencies that pay for, provide, or collaborate on IDEA services and issues. Finally, the State should include stakeholders with expertise on the issues to be discussed for both the IDEA Part B and C SSIPs. Within each phase, stakeholder involvement will be valuable.

State Identified Measurable Result (SIMR)

QUESTION: How will States know if their identified measurable result is acceptable? Is there a mechanism for some type of approval prior to submission of Phase I in April 2015? For example, is there a way for States to determine the acceptability of their SSIP throughout the process?

RESPONSE: The SSIP is an indicator in the FFYs 2013 through 2018 SPP/APR. States must submit their SPP/APRs for both Part B and Part C by February 2, 2015, and must submit Phase I of the SSIP, which is Indicator C-11 and B-17 of the SPP/APR, by April 1, 2015. OSEP is providing targeted support to States through SSIP Implementation Support visits and conference calls, during which States' proposed SIMRs can be discussed. States will also have the ability to submit draft Phase I SSIPs for OSEP to review prior to the official April 1, 2015 submission date. In addition, OSEP has developed and shared with States an Evaluation Tool that includes criteria for reviewing Phase I SSIPs. We will work with each State through these mechanisms to ensure that the State has an appropriate SIMR before the April 2015 submission date.

QUESTION: What is included in the SIMR?

RESPONSE: In the FFY 2013 SPP/APR, States must provide, as part of Phase I of the SSIP, a statement of the result(s) the State intends to achieve through implementation of the SSIP, which is referred to as the SIMR for Children with Disabilities, and include the following additional information to meet the requirements in IDEA section 616(b)(1)(A) and (b)(2)(A):

- States must provide FFY 2013 baseline data for Indicator C-11 or B-17 (the SSIP), which must be expressed as a percentage and aligned with the SIMR.
- The State must establish measurable and rigorous targets for each successive year of the SPP (FFYs 2014 through 2018). The end target (for FFY 2018) must demonstrate improvement over the FFY 2013 baseline data.
- The State must submit all other components of Phase I of the SSIP.
- If the State selects a SIMR that focuses on improving a result for a subset of districts/programs or populations, then the State must include in the SIMR section of Phase I of its SSIP an explanation of why improving that result for that subset would improve that result on a Statewide basis.

QUESTION: What is meant by child results or outcomes in Indicators C-11 and B-17?

RESPONSE: In Indicators C-11 and B-17 of the 2015 SPP/APR package, child outcomes are discussed in the context of the SIMR, which must be clearly based on the Data and State Infrastructure Analyses and must be a child-level outcome, in contrast to a process outcome. The State may select a single result (e.g., increasing the graduation rate for students with disabilities) or a cluster of results that improve child outcomes.

QUESTION: Are there results indicators that would not be considered appropriate child-level outcomes? If so, will OSEP provide guidance or a list to help guide States toward an appropriate/acceptable measurable results area?

RESPONSE: Yes. As discussed above, the SIMR must be a child-level, measurable result that improves child outcomes. The compliance indicators measure compliance but do not measure child outcomes. This includes the compliance indicators under Part C (1, 7, and 8) and Part B (9, 10, 11, 12, and 13). In addition, there are some results indicators that are not appropriate to use as a SIMR, since they do not measure improvement in child outcomes. Topics that would not be acceptable, stand-alone SIMRs include those related to the following results indicators:

For Part C:
- Indicator 2 (natural environments)
- Indicators 5 and 6 (child find)
- Indicators 9 and 10 (resolution sessions and mediation)

For Part B:
- Indicator 2 (dropout)
- Indicator 4 (suspension/expulsion)
- Indicators 5 and 6 (LRE)
- Indicator 8 (parent involvement)
- Indicators 15 and 16 (resolution sessions and mediation)

QUESTION: Can a State use a six-year graduation cohort rate, since we think this is more realistic for our populations, if the State is not using a six-year graduation cohort rate when reporting data under Indicator B-1?

RESPONSE: The most common SIMRs address child-specific results such as graduation rates (B-1), performance on assessments (B-3), and early childhood outcomes (C-3). If a State is planning to use a six-year graduation cohort rate for its SIMR when that is not the data it is reporting under Indicator B-1, the State should discuss its SIMR with the OSEP State Contact, including how the State will be able to demonstrate improvement in Statewide data.

QUESTION: What if, as a result of our infrastructure analysis, our stakeholders determine that something about our infrastructure is a problem, but our child and family outcomes do not appear to be a problem? Can we pick that (infrastructure) as a focus?

RESPONSE: The SIMR must be a result that improves a child-level outcome, as opposed to a process or compliance outcome. Addressing problems with the State's infrastructure could be one of the State's coherent improvement strategies that will lead to a measurable improvement in the SIMR. A State should select as a SIMR a result for which improvement in child outcomes is necessary.

QUESTION: Can the SIMR for the Part C SSIP be an outcome that is measured when the child is older (beyond Part C exit, e.g., at ages 4, 5, 6, 7, or 8) if the State has this longitudinal measurement capacity?

RESPONSE: For the purpose of the Part C SSIP, States must set targets for children while they are in the program. However, States are encouraged to look at multiple data sources that will assist them in improving the effectiveness of services and results for infants and toddlers with disabilities and their families.

QUESTION: If, after analysis of our Part C SPP/APR child and family outcome data, our stakeholders believe that we are doing well in all but one summary statement of one outcome, can we focus on only one outcome?

RESPONSE: A State can focus on one summary statement of one outcome as its SIMR as long as it can provide a rationale for its decision.

QUESTION: Will a State be allowed to totally change the SIMR it is focusing upon? For example, a State may choose graduation rate as its result area because performance of students with disabilities is poor and because it is part of a broader general education initiative. Then the political structure in that State changes and new initiatives are emphasized. In this case, could a State change its SIMR?

RESPONSE: It is OSEP's expectation that a State would use the same SIMR/SSIP for FFYs 2013 through 2018. We encourage States to engage key stakeholders and obtain commitment from leaders within the SEA or Lead Agency. By doing so, we believe that a State has a better likelihood of sustaining its improvement efforts and achieving its FFY 2018 improvement target.

SSIP Baseline and Targets

QUESTION: Do targets have to be agreed upon by stakeholders, or can they provide input while States make the final decision?

RESPONSE: OSEP expects that stakeholders will be involved in the development of all three phases of the SSIP. Ultimately, the selection of the SIMR and target setting is an SEA or Lead Agency decision.

QUESTION: Can targets for the SSIP be different from those for one of the indicators? For example, my State has chosen improving the graduation rate statewide as our focus. The measurement table for B-1 says that the target must be the same as the ESEA target. If we focus on graduation for the SSIP, does our target for B-17 need to be the same as for B-1 and for ESEA?

RESPONSE: The SIMR must be aligned with an SPP/APR indicator or a component of an SPP/APR indicator. The SIMR must be clearly based on the Data and State Infrastructure Analyses and must be a child-level outcome, in contrast to a compliance or process outcome. The State may select a single result (e.g., graduation rate) or a cluster of related results (e.g., increasing the graduation rate and decreasing the dropout rate). If the SIMR is the same as an indicator, or indicator component, then the baseline data and targets for the SSIP must be aligned with, and therefore generally the same as, the baseline data and targets of the SPP/APR indicator or indicator component to which the SIMR is aligned. If the SIMR is the same as an indicator, or indicator component, and the State is planning to use different baseline data or targets, the State should discuss its SIMR with its OSEP State Contact.

QUESTION: Can the State establish different baselines/targets for specific districts/programs or populations if its SIMR focuses on a specific subgroup?

RESPONSE: For Indicators B-17 and C-11, the State must provide baseline data for FFY 2013 and annual target(s) for each of the SPP/APR reporting periods from FFYs 2014 through 2018. A State may choose to focus on a subset of districts/programs or populations as its SIMR, in which case its SSIP FFY 2013 baseline data will be different from the Statewide baseline data of the comparable indicator in the SPP/APR, and its targets will also likely be different from the Statewide targets for the comparable indicator in the SPP/APR. In that case, the State's SSIP baseline data and targets must be aligned with the SIMR, but will not be based on Statewide data. If the State selects a SIMR that focuses on improving a result for a subset of districts/programs or populations, then the State must include in the SIMR section of Phase I of its SSIP an explanation of why improving that result for that subset would improve that result on a Statewide basis. For example, a State may choose to improve the graduation rate for a subpopulation of students. In that case, the State must provide FFY 2013 baseline data on the graduation rate of that subpopulation and establish targets for that subpopulation for Indicator B-17 for each of the SPP/APR reporting periods from FFYs 2014 through 2018. The State must explain how improving the graduation rate for that subpopulation will improve the Statewide graduation rate for children with disabilities.

QUESTION: In order to identify an area of focus, we need to establish a baseline for 2013-2014. This limits us to areas for which we already have a data collection system. Can we change our measurement methodology next year?

RESPONSE: If a State is committed to a particular result area, it can establish its FFY 2013 baseline data with existing data and then, if better, more valid and accurate data become available, revise its baseline in the next year's SPP/APR. If the State revises the baseline data, we would expect a full description of the process and justification for the revision in the State's APR submission.

QUESTION: When will baselines and targets need to be reported for the SSIP indicator?

RESPONSE: A State must report baseline data and set targets as part of Phase I of the SSIP, which is due on April 1, 2015. States will be required to report target data in their SPP/APRs for FFYs 2014 through 2018.

QUESTION: If in a subsequent year the target needs to change based on better data, will that be possible?

RESPONSE: As has been the case previously, a State may revise its baseline data and adjust targets in a subsequent SPP/APR with appropriate justification.

QUESTION: Could we use alternate data sources to measure outcomes, e.g., progress monitoring or benchmark data?

RESPONSE: States are encouraged to use multiple data sources, such as progress monitoring or benchmark data, to measure improvement in the SIMR, including whether the State is meeting its long- and short-term objectives. However, when reporting target data to determine whether the State met its target on the SIMR, the data must be expressed as a percentage and aligned with the SIMR.

QUESTION: Can performance data include measures other than, or in addition to, State assessment data, since we're changing from our State-developed assessment?

RESPONSE: OSEP wants States to use valid data sources for measuring proficiency on assessments, and the State's assessment data for its SSIP should align with the data reported in B-3. We recognize that during these transition years, a State may need to use multiple measures to collect quality data or may need to change its data source.

QUESTION: Will States have to report SSIP targets at the district/program level in the local APR?

RESPONSE: Indicator B-17/C-11 is a Statewide indicator. LEAs and EIS programs are not required to develop SSIPs and, for purposes of local reporting, States are not required to report on local-level performance on the SSIP.

QUESTION: Where in the SSIP does a State describe how it will address data quality?

RESPONSE: As noted in Indicators C-11 and B-17, the State must describe how it will address data quality in the Data Analysis section of Phase I of the SSIP. If the State identifies any concerns about data quality, the description must include how it will address these concerns and, if additional data are needed, what methods and timelines will be applied to collect and analyze those additional data.

Selection of Coherent Improvement Strategies

QUESTION: Is OSEP willing to accept recommended practices as strategies? Will a State need to cite the literature to support the practice(s) chosen as its strategy?

RESPONSE: OSEP is looking for coherent improvement strategies that are based on the best available research, aligned to the data analysis and the infrastructure analysis, and support systemic change. The State should provide the evidence it used to make decisions on appropriate improvement strategies to improve the SIMR. A State will need to provide evidence on why it chose specific improvement strategies and why it thinks that implementation of those strategies will lead to improvements in the SIMR. A State may, but is not required to, cite literature that supports the practice(s) chosen as its strategy.

QUESTION: Can the State initially work with one cohort of districts/programs when implementing the evidence-based practices, with plans to scale up over the life of the SSIP?

RESPONSE: Yes. The State may choose to work with a cohort of districts or programs during the first year of implementation of evidence-based practices and scale up implementation over the life of the SSIP. This is consistent with what we know about implementation science. However, when deciding on the number of districts/programs with which to work, the State should consider how many districts/programs are needed to have a positive impact on Statewide data.

QUESTION: Do States have to scale up implementation of evidence-based practices on a Statewide level by the final year of the SSIP?

RESPONSE: States do not have to scale up evidence-based practices Statewide by the final year of the SSIP, although it is expected that implementation of those evidence-based practices will ultimately impact Statewide data. The expectation is that full implementation of the SSIP will result in improvement of State data over time.

QUESTION: If measuring results around a subpopulation, then the activities must be based on moving the numbers for that subpopulation. Must the activities be applicable to other populations and be scalable?

RESPONSE: States can focus on a subgroup (such as specific districts/programs or populations) and implement evidence-based practices to improve performance in the subgroup. It is expected that the State would make needed modifications to its improvement strategies to address other populations as it scales up the initiative.

QUESTION: Is the expectation for Phase I of the SSIP to include a detailed and coherent proposal of improvement activities, or just an outline of improvement activities? What criteria will be used for measuring improvement activities and the quality of the activities? We have not received feedback on this in the past.

RESPONSE: Coherent improvement strategies are envisioned as the broad strategies that will address root causes for low performance and build local capacity to achieve the SIMR. States will be asked to provide more detailed improvement activities, timelines, and resources, and to assign responsibilities, in Phase II of the SSIP. OSEP has drafted a Phase I SSIP Evaluation Tool that includes criteria for evaluating the quality of each Phase I component as well as the entire Phase I SSIP.

QUESTION: Previous information from OSEP indicated that the SSIP would include improvement activities for all indicators and address general supervision. Is this the case?

RESPONSE: In developing, implementing, and evaluating the SSIP, the State is selecting a SIMR that will lead to improved results Statewide for children with disabilities. The SPP/APR for FFYs 2013 through 2018 was revised to no longer require reporting of separate improvement activities for all indicators; it also removed the reporting requirement in former SPP/APR Indicator C-9/B-15 regarding the correction of noncompliance. Within the compliance indicators of the SPP/APR, States must continue to report on correction of noncompliance. Although States will need to complete the necessary improvement activities to improve performance on those indicators, they are not required to submit those improvement activities as part of the SPP/APR.

Theory of Action

QUESTION: Is it appropriate to incorporate only the SIMR into the theory of action (e.g., improving the performance of children with disabilities on 3rd-grade reading assessments is the outcome), or do we need to also incorporate other longer-term results (e.g., graduation rate, post-school outcomes, grade 10 reading assessments) into the theory of action?

RESPONSE: It is permissible to include only the SIMR in the Theory of Action. The Theory of Action should be an if-then statement, graphically portrayed in the State's SSIP, that shows the rationale of how implementing the coherent set of improvement strategies selected will increase the State's capacity to lead meaningful change in LEAs or EIS programs and achieve improvement in the SIMR for children with disabilities.

QUESTION: Is there going to be a prescribed form for the Theory of Action? Will it be part of the electronic system?

RESPONSE: There will not be a prescribed form for the Theory of Action. GRADS360 will allow States to upload their Theory of Action graphic. States may want to use materials developed by the Regional Resource Centers (RRCs) or the Early Childhood Technical Assistance Center (ECTA) to assist in the design of their Theory of Action.

Submission/Format

QUESTION: Based on the recent GRADS360 presentation, will the SSIP be submitted as part of the electronic submission? If so, does OSEP have an idea of the format?

RESPONSE: The SSIP will be submitted as part of the electronic submission, but is not due until April 1, 2015. States will be able to submit narrative in GRADS360 through text boxes. In addition, States can upload the SSIP as an attachment.

QUESTION: Will we need to publish and/or submit the work behind the development of the SSIP? For example, will we need to produce a formal infrastructure analysis or just a working draft?

RESPONSE: States will not be required to submit completed tools or materials used to develop the SSIP. States may reference or submit attachments (e.g., data summary tables, foundational documents) if they feel that doing so will be beneficial in helping OSEP understand their process.

QUESTION: Will OSEP expect to see tools that were shared at RRC regional meetings attached to the APR for Indicator B-17 as evidence that the State updated its process?

RESPONSE: OSEP is not requiring that working documents, such as those provided through technical assistance centers (e.g., RRC or ECTA tools), be attached to the SPP/APR.