OMB: 1850-0879

School Improvement Status and Outcomes for Students with Disabilities Study

Part A (Justification)

OMB CLEARANCE REQUEST

For Data Collection Instruments

December 17, 2010, updated March 15, 2011


Prepared for:

United States Department of Education

Contract No. ED-04-CO-0025/0013


Prepared by:

American Institutes for Research


Contents

Introduction

School Improvement Status and Outcomes for Students with Disabilities Study

Overview

Theory of Action

Evaluation Questions

Research Question 1: Inclusion of SWDs in the School Accountability System and AYP Performance of Schools Accountable for SWDs

Research Question 2: School Practices

Research Question 3: SWD Achievement Trends

Sampling Design

Data Collection Procedures

Analytic Approach

Analytic Methods for Research Question 1 (Inclusion of SWDs in Accountability and School AYP Performance)

Analytic Methods for Research Question 2 (School Practices)

Analytic Methods for Research Question 3 (SWD Achievement Trends)

Supporting Statement for Paperwork Reduction Act Submission

Justification (Part A)

1. Circumstances Making Collection of Information Necessary

2. Purpose and Uses of the Data

3. Use of Technology to Reduce Burden

4. Efforts to Identify Duplication

5. Methods to Minimize Burden on Small Entities

6. Consequence of Not Collecting Data

7. Special Circumstances

8. Federal Register Comments and Persons Consulted Outside the Agency

9. Payment or Gifts

10. Assurance of Confidentiality

11. Justification of Sensitive Questions

12. Estimates of Hour Burden

13. Estimate of Cost Burden to Respondents

14. Estimate of Annual Cost to the Federal Government

15. Program Changes or Adjustments

16. Plans for Tabulation and Publication of Results

17. Approval to Not Display OMB Expiration Date

18. Explanation of Exceptions

References

Appendix A

Employee Agreement on Data Use and Confidentiality Procedures







Introduction

The Institute of Education Sciences (IES) of the U.S. Department of Education (ED) requests clearance for the data collection component of the School Improvement Status and Outcomes for Students with Disabilities Study. The purpose of the study is to provide policy-relevant information about the education of students with disabilities (SWDs) by examining which school practices are occurring that may affect the educational outcomes of SWDs and what the achievement trends of SWDs are in selected states. Where appropriate, the study team will compare school practices and SWD achievement trends between schools accountable and schools not accountable for SWD subgroup performance under the accountability provisions of the Elementary and Secondary Education Act (ESEA) as reauthorized in 2001. Among schools accountable for SWD subgroup performance, the study team will also compare school practices and SWD achievement trends between schools with different histories of school improvement status. Clearance is requested for the study's data collection instruments.

This document contains two major sections with multiple subsections and an appendix:

  • School Improvement Status and Outcomes for Students with Disabilities Study

      • Overview

      • Theory of action

      • Evaluation questions

      • Sampling design

      • Data collection procedures

      • Analytic approach

  • Supporting Statement for Paperwork Reduction Act Submission

      • Justification (Part A)

  • Appendix A (Employee Agreement on Data Use and Confidentiality Procedures)



A separate document contains Part B (Description of Statistical Methods) of the Supporting Statement for Paperwork Reduction Act Submission. Copies of the survey instruments for which we are requesting clearance are included as separate documents in this OMB package.


School Improvement Status and Outcomes for Students with Disabilities Study

Overview

IES has been congressionally mandated (20 USC §664(b)) to conduct a national assessment of how well the Individuals with Disabilities Education Improvement Act of 2004 (IDEA) is achieving its purposes. These purposes include some of the highest priorities for those concerned with special education reform: improved achievement, access to the regular education curriculum, grade transitions, and dropout prevention. The 2004 reauthorization of IDEA brought the nation's special education laws into closer alignment with the provisions of ESEA, including provisions regarding accountability for student progress. Therefore, the National Center for Education Evaluation (NCEE) has recommended that the National Assessment of IDEA include a study of the relationship between school improvement status and the educational outcomes of SWDs.

The first phase of the study was a feasibility study conducted between March 2008 and January 2009 by a team of researchers from the American Institutes for Research (AIR) and Northwestern University. After careful consideration of a variety of feasibility criteria, the feasibility team concluded that it was unlikely that a rigorous impact study could be carried out that would generate strong causal conclusions about the impact of school improvement status on SWDs. Therefore, the AIR research team, in consultation with NCEE, explored new directions for the study that would be feasible, meaningful, and responsive to the mandate. The original study, which was intended to be an impact study, has been reframed as a descriptive study addressing three sets of research questions that may provide useful information to both policymakers and practitioners. The first set of questions establishes the context for the study by examining the inclusion of SWDs in the school accountability system and the adequate yearly progress (AYP) performance of schools accountable for the SWD subgroup. The second set of questions concerns school practices that may affect the outcomes of SWDs, and the third set of questions examines the achievement trends of SWDs over time.

The study design emphasizes the integration and analysis of multiple sources of data. Research questions about the inclusion of SWDs in the school accountability system and the AYP performance of schools accountable for the performance of SWDs will rely primarily on extant school-level data from the EDFacts system maintained by ED, which will be supplemented with the National Adequate Yearly Progress and Identification (NAYPI) database created by AIR. School-level EDFacts data and the NAYPI data will also be used to address research questions related to the achievement trends of SWDs. Longitudinally linked student-level achievement data, requested from up to two states, will provide an additional source of extant data for SWD achievement trend analyses. Findings based on analyses of these extant data will be presented in both the interim study report and the final study report.

Data for addressing research questions about school practices related to SWDs will come primarily from a principal survey and a special education designee survey to be administered in spring 2011 to 6,638 schools supported through IDEA. The surveys will focus on school practices that occurred during the 2010–11 school year and will inform the final study report.

Theory of Action

Under the ESEA accountability provisions, schools are held accountable for the performance of defined student subgroups and are required to include the performance of each subgroup in their AYP determination if the size of the subgroup meets a threshold set by each state. Schools repeatedly failing to make AYP over time will be identified for improvement, with increasingly intensive sanctions, and will be required to develop and implement school improvement plans to improve student achievement. The fundamental assumption underlying this study is that being explicitly held accountable for the performance of SWDs, or being identified as in need of improvement for reasons including the performance of SWDs, will affect the practices that schools engage in with respect to the education of their SWDs. These practices will in turn affect a variety of student outcomes, as shown by the following chain of events and elaborated in Exhibit 1.

Accountability/Identification for improvement (IFI) → School practices → Student outcomes


Exhibit 1. Theory of Action


Evaluation Questions

The first set of questions concerns the inclusion of SWDs in the school accountability system and school-level AYP determination as it relates to the performance of the SWD subgroup. The second set of questions pertains to school practices that may affect the outcomes of SWDs, and the third set of questions examines the achievement trends of SWDs over time. These questions are stated as follows:

Research Question 1: Inclusion of SWDs in the School Accountability System and AYP Performance of Schools Accountable for SWDs

  • 1(a). What percentage of schools were accountable for the performance of the SWD subgroup between 2005–06 and 2009–10?

  • 1(b). What percentage of different types of schools were held accountable for the performance of the SWD subgroup?1

  • 1(c). What percentage of schools moved in and out of accountability for the SWD subgroup between 2005–06 and 2009–10?

  • 1(d). What percentage of schools missed AYP due to the performance of the SWD subgroup between 2005–06 and 2009–10?


Research Question 2: School Practices

  • 2(a). What regular and special education practices are occurring in 2010–11 in schools accountable for SWD performance and in schools not accountable for SWD performance, and how do these practices differ between the two school groups?

  • 2(b). Do schools’ regular or special education practices in 2010–11 vary by schools’ improvement status between 2005–06 and 2010–11 among schools accountable for SWD performance?

Research Question 3: SWD Achievement Trends

  • 3(a). What were the trends of SWD achievement in reading and mathematics between 2005–06 and 2009–10 in schools accountable for SWD performance in selected states? Did similar trends occur in other schools in these states?

  • 3(b). To what extent did SWD achievement trends between 2005–06 and 2009–10 differ between schools with different levels of poverty and minority concentration among schools accountable for SWD performance in selected states?

  • 3(c). To what extent did the achievement trends between 2005–06 and 2009–10 for SWDs differ from the corresponding trends for non-disabled students in schools accountable for SWD performance in selected states?

  • 3(d). To what extent were changes in achievement between 2005–06 and 2009–10 for SWDs associated with corresponding changes for non-disabled students in schools accountable for SWD performance in selected states?

  • 3(e). What were the trends of SWD achievement in mathematics and reading between
    2005–06 and 2009–10 in SWD-accountable schools that were identified for school improvement and SWD-accountable schools that were never identified for improvement in selected states?

  • 3(f). To what extent did the trends of SWD achievement in mathematics and reading between 2005–06 and 2009–10 in SWD-accountable schools identified for school improvement differ from corresponding trends in never-identified schools in selected states?

Data to address the research questions above will come primarily from the EDFacts data system, the NAYPI database created and maintained by AIR, and a school survey that has been developed specifically for this study. In addition, we will explore the possibility of obtaining student-level longitudinal data from up to two states for examining the trends of SWD outcomes. We will analyze the data for each state in our study sample separately and will also pool data across states where appropriate.

In the remainder of this document, we discuss our approaches to addressing the three sets of research questions guiding this study. We first define our study sample, then explain our data collection plan, and finally describe the types of data analyses that we will perform to address each research question.

Sampling Design

The main components of this study are presented in Exhibit 2 along with the proposed sample. A detailed discussion of our sampling design is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B section of this package.


Exhibit 2. Main Study Components and Proposed Sample

Study Component: School-level analyses of inclusion of SWDs in accountability system and school AYP performance (for both interim and final study reports)

Sample:

  For the interim study report:

  • All 67,596 public schools in 40 states that reported to EDFacts the relevant accountability and performance data for 2008–09

  For the final study report:

  • All public schools in states that reported to EDFacts the relevant accountability and performance data for 2009–10

Study Component: School-level achievement data analyses (for both interim and final study reports)

Sample:

  For the interim study report:

  • Elementary-school sample: A total of 2,553 accountable elementary schools in the 11 states that have at least 50 elementary schools accountable for SWD performance and that have reading achievement data and/or mathematics achievement data from 2005–06 through 2007–08 for grades 3–5

  • Middle-school sample: A total of 1,278 accountable middle schools in the 8 states that have at least 50 middle schools accountable for SWD performance and that have reading achievement data and/or mathematics achievement data from 2005–06 through 2007–08 for grades 6–8

  For the final study report:

  • A subset of the schools in the elementary-school sample for the interim report that have reading achievement data and/or mathematics achievement data from 2005–06 through 2009–10 for grades 3–5

  • A subset of the schools in the middle-school sample for the interim report that have reading achievement data and/or mathematics achievement data from 2005–06 through 2009–10 for grades 6–8

Study Component: Student-level achievement data analyses (for final study report)

Sample:

  • Grades 3–5 students in elementary schools and grades 6–8 students in middle schools from a purposive sample of up to two states

Study Component: School survey (for final study report)

Sample:

  • A sample of 4,725 elementary schools and 1,913 middle schools, with the goal of receiving 7,560 completed survey responses (two responses per school: one from the principal and one from the special education designee) from the sampled elementary schools and 3,060 completed survey responses from the sampled middle schools (a response rate of 80%)


Data Collection Procedures

This study will collect extant school-level and student-level achievement data and administer a principal survey and a special education designee survey. Only the survey instruments, which are included as separate documents in this OMB package, require OMB clearance. Exhibit 3 presents a summary of our data collection procedures. A more-detailed discussion of these procedures is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B section of this package.

Exhibit 3. Summary of Data Collection Procedures

Study Component: School-level analyses of inclusion of SWDs in accountability system and school AYP performance

Data Sources:

  • EDFacts and the NAYPI database

Data Collection Timeline: Summer 2009–Summer 2011

Study Component: School-level achievement data analyses

Data Sources:

  • EDFacts (2005–06 through 2009–10 school years)
    http://www.ed.gov/about/inits/ed/edfacts/index.html

Data Collection Timeline: Summer 2009–Summer 2011

Study Component: Student-level achievement data analyses

Data Sources:

  • Student-level longitudinally linked achievement data from a purposive sample of up to two states that have such data available

Data Collection Timeline: Fall 2011

Study Component: School survey

Data Sources:

  • Web-based survey with paper-based follow-up administered to principals and special education designees

Data Collection Timeline: Spring 2011


Analytic Approach

This section discusses how we will analyze the data collected to address the research questions for this study, focusing primarily on Research Question 2 about school practices based on survey data. The other two research questions will rely exclusively on extant data.

Analytic Methods for Research Question 1 (Inclusion of SWDs in Accountability and School AYP Performance)

The first set of research questions examines the inclusion of SWDs in the school accountability system and the school-level AYP determination as it relates to the performance of the SWD subgroup. These questions will rely exclusively on extant data from the EDFacts system and the NAYPI database and will be addressed with descriptive analyses.

Specifically, we will examine the percentage of public schools accountable for the performance of the SWD subgroup and the percentage of tested SWDs represented by these schools (Research Question 1(a)). We will also examine the percentage of different types of public schools (i.e., regular public schools, charter schools, special education schools, and vocational/alternative schools) that were held accountable for the performance of the SWD subgroup and the percentage of tested SWDs represented by these schools (Research Question 1(b)). In addition, we will assess the stability of schools’ accountability status over time by examining the distribution of schools that were accountable for the SWD subgroup for different numbers of years during the time period studied and by examining the percentage of schools accountable for SWDs in a given year that remained accountable in subsequent years (Research Question 1(c)). Finally, we will calculate the percentage of public schools that missed AYP due to the performance of the SWD subgroup (either as the sole reason or one of multiple reasons) and the percentage of tested SWDs represented by these schools (Research Question 1(d)).

For the interim study report, we will address the above research questions using extant data from EDFacts and the NAYPI database between 2005–06 and 2008–09. For the final study report, we will address the research questions using data between 2005–06 and 2009–10. Depending on data availability and the specific research question addressed, the number of schools and states included in the analytic sample for a particular analysis will differ across analyses (see Part B for details about the “school accountability sample”). We will conduct each analysis both within the individual states included in the analytic sample and across all states in the analytic sample.
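To make these tabulations concrete, the following minimal sketch (in Python, using the pandas library) illustrates how such percentages could be computed from a school-by-year analytic file. The file name and indicator variables are hypothetical stand-ins rather than actual EDFacts or NAYPI field names.

    import pandas as pd

    # One row per school per year, with hypothetical 0/1 indicator columns.
    df = pd.read_csv("edfacts_naypi_school_year.csv")

    # RQ 1(a): percentage of schools accountable for the SWD subgroup, by year.
    pct_accountable = df.groupby("year")["swd_accountable"].mean().mul(100)

    # RQ 1(c): stability of accountability status -- distribution of the
    # number of years each school was accountable for the SWD subgroup.
    years_accountable = df.groupby("school_id")["swd_accountable"].sum()
    stability = years_accountable.value_counts(normalize=True).mul(100)

    # RQ 1(d): percentage of schools that missed AYP due to the SWD subgroup
    # (as the sole reason or one of multiple reasons), by year.
    pct_missed_swd = df.groupby("year")["missed_due_to_swd"].mean().mul(100)

    print(pct_accountable, stability, pct_missed_swd, sep="\n")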

Analytic Methods for Research Question 2 (School Practices)

RQ 2(a). What regular and special education practices are occurring in 2010–11 in schools accountable for SWD performance and in schools not accountable for SWD performance, and how do these practices differ between these two school groups?

Exhibit 11, presented in the Supporting Statement for Paperwork Reduction Act Submission, Part B section of this package, lists the various types of school practices that may be associated with positive SWD outcomes based on existing research and expert opinions. Under RQ 2(a), we will examine the extent to which practices are occurring in a sample of 4,725 elementary schools and 1,913 middle schools accountable and not accountable for SWD subgroup performance, using data from the school survey described earlier.

Based on the school survey data, we will create measures for each type of school practice presented in Exhibit 11. Using exploratory and confirmatory factor analyses, we will also create composite scales of school practices based on relevant survey items. For example, we will create a scale for “Access to the Regular Education Curriculum,” which we hypothesize may affect SWD outcomes. Items from our school survey that are potentially relevant to this scale include the following:

  • Percentage of students with disabilities in the school who spend 80 percent or more of their time per week in a regular education classroom (Sped Designee Survey, Question 6)

  • Percentage of students with disabilities receiving the majority of their instruction in mathematics and English/language arts, respectively, in regular education classrooms (Sped Designee Survey, Question 14)

  • School is engaged in a deliberate strategy to move students from self-contained classrooms to the regular education classrooms (Sped Designee Survey, Question 9p)

  • School has included special education teachers in content-related professional development opportunities (Sped Designee Survey, Questions 12 and 13)

  • Special and regular education teachers collaborate through common planning time (Sped Designee Survey, Question 8k)

  • Regular and special education teachers engage in team-teaching model (Sped Designee Survey, Question 8j)

  • School has provided professional development to all teachers that focuses on strategies for instructing students with disabilities (Principal Survey, Questions 6 and 7; Sped Designee Survey, Questions 12 and 13)

  • School uses instructional and assistive technology with students with disabilities (Sped Designee Survey, Question 7h)

In addition, we will create subscales measuring constructs within the overall scale, such as placement in the least restrictive environment, curriculum and instruction, and teacher collaboration. We will rely on both formal factor analyses and substantive knowledge about school practices to create scales and subscales that measure important types of school practices that may be associated with school accountability and SWD outcomes.
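As an illustration of this scale-construction step, the sketch below fits an exploratory one-factor model to items of the kind listed above and forms a composite from the items that load on the factor. The item names, data file, and 0.40 loading cutoff are hypothetical illustrations, not study decisions.

    import pandas as pd
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    # Hypothetical stand-ins for the survey items listed above.
    items = ["pct_time_regular_class", "pct_core_in_regular_class",
             "move_from_self_contained", "sped_in_content_pd",
             "common_planning_time", "team_teaching",
             "pd_on_swd_strategies", "assistive_technology"]

    survey = pd.read_csv("sped_designee_survey.csv")
    X = StandardScaler().fit_transform(survey[items])

    # Exploratory one-factor model: inspect loadings to see whether the items
    # cohere into a single "Access to the Regular Education Curriculum" factor.
    fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
    loadings = pd.Series(fa.components_[0], index=items)
    print(loadings.sort_values(ascending=False))

    # A simple composite: the mean of the standardized items that load well.
    keep = loadings[loadings.abs() >= 0.40].index
    z = (survey[keep] - survey[keep].mean()) / survey[keep].std()
    survey["access_scale"] = z.mean(axis=1)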

Using the measures that are created on the basis of the survey data, we will perform descriptive analyses within each selected state to describe the kinds of practices that are occurring in schools accountable and not accountable for SWD subgroup performance, and we will pool results across states. To assess whether there are statistically significant differences in school practices between schools that were accountable for SWD performance and those that were not, we will perform independent-samples t tests for continuous measures of school practice and chi-square tests for categorical measures of school practice.2
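To illustrate these significance tests concretely, the sketch below applies an independent-samples t test to a continuous practice measure and a chi-square test to a categorical one. The data file and variable names are hypothetical, and the Welch variant of the t test is shown as one option rather than a study specification.

    import pandas as pd
    from scipy import stats

    schools = pd.read_csv("school_practice_measures.csv")
    acct = schools[schools["swd_accountable"] == 1]
    non = schools[schools["swd_accountable"] == 0]

    # Continuous measure: independent-samples t test (Welch's variant,
    # which does not assume equal variances).
    t_stat, p_val = stats.ttest_ind(acct["access_scale"].dropna(),
                                    non["access_scale"].dropna(),
                                    equal_var=False)

    # Categorical measure: chi-square test on the accountability-by-practice
    # contingency table.
    table = pd.crosstab(schools["swd_accountable"], schools["team_teaching"])
    chi2, p_chi, dof, expected = stats.chi2_contingency(table)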

RQ 2(b). Do schools’ regular or special education practices in 2010–11 vary by schools’ improvement status between 2005–06 and 2010–11 among schools accountable for SWD performance?

In addition to the overall description of schools’ regular and special education practices, we will examine under RQ 2(b) the extent to which school practices differ among accountable schools with different identification histories. We will conduct descriptive analyses describing school practices within two school groups defined on the basis of their school improvement status between 2005–06 and 2010–11:3

  • Group 1 (Never Identified): Schools that have never been identified for improvement (IFI) for any reason as of 2010–11

  • Group 2 (Identified): Schools that were identified for improvement between 2005–06 and 2010–11

For continuous measures of school practice, we will further perform independent-samples t tests to test whether the differences in school practice between the two school groups are statistically significant.4 For dichotomous measures of school practice, we will use chi-square tests to test for group differences. The analyses for RQ 2(b) will be conducted for elementary schools and middle schools separately.
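Because many practice measures will be tested, the multiple-comparison adjustment noted in the footnotes matters. The study does not name a specific procedure; as one possibility, the sketch below applies a Benjamini-Hochberg false discovery rate correction to an illustrative set of p-values.

    from statsmodels.stats.multitest import multipletests

    # Illustrative p-values from the t tests and chi-square tests above.
    p_values = [0.003, 0.021, 0.048, 0.19, 0.44]

    # Benjamini-Hochberg FDR adjustment; 'reject' flags results that remain
    # significant at alpha = 0.05 after the correction.
    reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05,
                                             method="fdr_bh")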

Analytic Methods for Research Question 3 (SWD Achievement Trends)

To address the third set of research questions concerning the trends of SWD achievement outcomes, we will examine changes in SWD achievement in reading and mathematics across successive grade-specific cohorts between 2005–06 and 2007–08 for the interim report and between 2005–06 and 2009–10 for the final report. We have decided to use 2005–06 as the starting year for the achievement trend analyses because we are concerned about the availability and, more important, the quality of extant achievement data from earlier years.

The primary data source for SWD achievement trend analyses will be EDFacts. It is important to note, however, that changes in SWD outcomes based on school-level EDFacts data are likely to reflect both true changes in student outcomes and shifts in student composition over time, which unfortunately cannot be separated out in analyses using school-level EDFacts data. Therefore, we plan to supplement EDFacts data with longitudinally linked student-level data from up to two states, which will allow us to assess trends in SWD outcomes adjusted for changes in student composition across successive cohorts. Below we describe in detail our approaches to trend analyses based on school-level data and student-level data, respectively.

SWD achievement analyses based on EDFacts data

The outcome measures for SWD achievement trend analyses based on EDFacts data are the percentages of SWDs in each school scoring proficient on state tests in reading and mathematics, respectively. Relying on school-level EDFacts data, we will use hierarchical linear modeling (HLM) methods to assess changes in SWD achievement over time in schools accountable for SWD performance, in SWD-accountable schools that were identified for improvement, and in SWD-accountable schools that were never identified for improvement, while taking into account the clustering of multiple year- and grade-specific cohorts within schools (Research Questions 3(a) and 3(e)). Because a substantial proportion of non-accountable schools did not report SWD performance data to EDFacts, we will rely on school-level EDFacts data for accountable schools and state-level EDFacts data for all schools to compute the proficiency rates for SWDs in schools other than the accountable schools (as defined for the purpose of this study). These analyses will be conducted separately by subject (reading and mathematics) and grade level (elementary schools and middle schools) based on data pooled across the three target grades (3–5 for elementary schools and 6–8 for middle schools). Given that achievement data and changes in achievement data are not comparable across states, analyses of the overall achievement trends of SWDs will be conducted within each selected state and not pooled across states.
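As a minimal sketch of such a two-level growth model, the example below uses the MixedLM routine in Python's statsmodels package. This is one possible implementation rather than the study's actual HLM software or specification, and the file and variable names are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per school x year x grade cohort; the outcome is the percentage
    # of SWDs scoring proficient (hypothetical column names).
    cohorts = pd.read_csv("school_cohort_proficiency.csv")
    cohorts["time"] = cohorts["year"] - 2005  # 0 = 2005-06 baseline

    # Two-level growth model: cohorts (level 1) nested within schools
    # (level 2), with a random intercept and random time slope per school.
    model = smf.mixedlm("pct_proficient ~ time",
                        data=cohorts,
                        groups="school_id",
                        re_formula="~time")
    result = model.fit()
    print(result.summary())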

Two-level HLM models (cohorts nested within schools) will also be used to compare SWD achievement trends between schools with different levels of poverty and minority concentration and between schools that were identified for improvement and schools that were never identified for improvement during the time period examined (2005–06 to 2007–08 for the interim report and 2005–06 to 2009–10 for the final report) among schools accountable for SWD performance (Research Questions 3(b) and 3(f)). Relying on HLM analysis, we will further assess the differences in changes in achievement over time between SWDs and their non-disabled peers within the same school among schools accountable for SWD performance (Research Question 3(c)). In addition, correlations will be calculated to assess the extent to which changes in SWD achievement were associated with corresponding changes for non-disabled peers in the same SWD-accountable schools (Research Question 3(d)). We will conduct all the analyses described above separately by subject and grade level within each selected state, and we will also pool the data across states. Results from the SWD achievement trend analyses will be presented in both table form and graphic form where appropriate.

SWD achievement analyses based on state longitudinal student-level data

The primary outcome measures for SWD achievement trend analyses based on state longitudinal student-level data are individual students’ scale scores on state tests in reading and mathematics. For the overall SWD achievement trends analyses, we will use a three-level HLM model (students nested within cohorts and cohorts nested within schools) that incorporates student background characteristics (e.g., gender, race, eligibility for free or reduced-price lunch, English language learner status, achievement in previous year if available, type of test taken [regular, alternate, or modified test]) as student-level covariates. Such a model will allow us to assess changes in SWD outcomes adjusted for potential shifts in student composition across successive cohorts of students.
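The sketch below suggests how such a three-level specification could be expressed, extending the two-level statsmodels example above by treating cohorts as a variance component nested within schools. The covariates mirror those listed above, but all file and variable names are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student per year (hypothetical column names).
    students = pd.read_csv("state_student_records.csv")
    students["time"] = students["year"] - 2005

    # Three-level growth model: students (level 1) within grade cohorts
    # (level 2) within schools (level 3), with student-level covariates.
    model3 = smf.mixedlm(
        "scale_score ~ time + female + frl + ell + prior_score + C(test_type)",
        data=students,
        groups="school_id",                          # level 3: schools
        re_formula="~time",                          # random intercept and slope
        vc_formula={"cohort": "0 + C(cohort_id)"},   # level 2: cohorts in schools
    )
    result3 = model3.fit()
    print(result3.summary())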

Similarly, we will use three-level HLM models that incorporate student-level covariates to compare SWD achievement trends between different types of accountable schools and to compare the achievement trends between SWDs and their non-disabled peers in schools accountable for SWD performance. We will conduct these analyses separately by subject and grade level within each selected state.


Supporting Statement for Paperwork Reduction Act Submission

Justification (Part A)

1. Circumstances Making Collection of Information Necessary

The Institute of Education Sciences (IES) has been congressionally mandated (under 20 USC §664(b)) to conduct a national assessment of how well the Individuals with Disabilities Education Improvement Act of 2004 (IDEA, P.L. 108-446) is achieving its purposes. These purposes include some of the highest priorities for those concerned with special education reform: improved achievement, access to the regular education curriculum, better management of transitions, and dropout prevention. The 2004 reauthorization of IDEA brought the nation's special education laws into closer alignment with the provisions of the 2001 reauthorization of the Elementary and Secondary Education Act (ESEA), including provisions regarding accountability for student progress. Therefore, the National Center for Education Evaluation (NCEE) has proposed that the National Assessment of IDEA include a study of school improvement status and the educational outcomes of students with disabilities (SWDs).

The School Improvement Status and Outcomes for Students with Disabilities Study will provide valuable information about the inclusion of SWDs in school accountability systems, schools' AYP determinations as they relate to SWD performance, and the performance of SWDs over time on key academic achievement outcomes in schools supported through IDEA. In addition, the study will provide a comprehensive picture of what school practices are occurring that may affect the educational outcomes of SWDs and how school accountability and improvement status under ESEA relate to such practices.

2. Purpose and Uses of the Data

Data collected by the School Improvement Status and Outcomes for Students with Disabilities Study will be of immediate interest and import for policymakers and practitioners. The U.S. Department of Education (ED) will use the information to assess how school accountability and improvement status under ESEA relate to school practices that may be associated with the key outcomes identified by IDEA 2004 as important for the educational progress of SWDs. Data collected from this study will also help policymakers and practitioners understand the extent to which SWDs are represented in the school accountability system and the extent to which the achievement outcomes of SWDs are improving over time in schools supported through IDEA. The study will thus contribute to the congressionally mandated National Assessment of IDEA and may inform the next reauthorization of the ESEA and IDEA.

3. Use of Technology to Reduce Burden

Information technologies will be used to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden the evaluation places on respondents:

  • When possible, data will be collected through ED’s and states’ websites and through extant data sources such as EDFacts, the NAYPI database, and the Common Core of Data.

  • School surveys will be offered in a web-based format to alleviate burden on the respondents. We will follow up with those who do not respond to the online survey by mailing printed surveys. Printed surveys will contain clear instructions for completing and returning the survey, together with a postage-paid envelope addressed to NORC at the University of Chicago.

  • A toll-free number will be available during the data collection process to permit respondents to contact evaluation staff with questions or to complete the survey by phone.

4. Efforts to Identify Duplication

Where possible, we will use existing data sources, including EDFacts, state accountability workbooks, the NAYPI database, and the Common Core of Data, to inform our analyses. Drawing on these sources will greatly reduce the number of questions asked on the school surveys and minimize duplication of previous data collection efforts.

5. Methods to Minimize Burden on Small Entities

No small businesses or other small entities will be involved as respondents.

6. Consequence of Not Collecting Data

As part of the congressionally mandated National Assessment of IDEA, this study will collect information on schools’ accountability status and AYP performance with respect to SWDs, the academic achievement of SWDs, and school practices concerning SWDs that has not been systematically acquired and analyzed to date. Failure to collect the data proposed for this study would prevent the Congress and ED from assessing how the ESEA accountability and improvement status of schools supported through IDEA relate to school practices that may affect the educational outcomes of SWDs. It would also prevent the Congress and ED from assessing whether key achievement outcomes of SWDs are improving in schools supported through IDEA.

7. Special Circumstances

None of the special circumstances listed apply to this data collection.

8. Federal Register Comments and Persons Consulted Outside the Agency

A 60-day notice about the study was published in the Federal Register (Volume 75, page 61137) on October 4, 2010, to provide the opportunity for public comment. No public comments were received during the 60-day period.

To assist with the study’s complex technical and substantive issues, the study team has drawn on the experience and expertise of a technical working group (TWG) that provides a diverse range of experiences and perspectives. The members of this group, their affiliations, and their areas of expertise are listed in Exhibit 4.

Exhibit 4. Technical Work Group Members

  • Thomas Cook, Professor, Northwestern University. Expertise: research methodology; program evaluation

  • Pete Goldschmidt, Associate Professor, CRESST/UCLA. Expertise: school accountability rules; state and district data systems

  • Brian Gong, Executive Director, National Center for the Improvement of Educational Assessment. Expertise: school accountability rules; state and district data systems

  • Douglas Fuchs, Professor, Vanderbilt University. Expertise: special education research

  • Larry Hedges, Professor, Northwestern University. Expertise: measurements of student achievement

  • Thomas Hehir, Professor, Harvard University. Expertise: special education; federal education policy

  • Margaret McLaughlin, Professor, University of Maryland. Expertise: special education

  • Martha Thurlow, Director, National Center on Educational Outcomes. Expertise: special education policy and practices at the national and state levels


TWG meetings for this study will be held in spring 2011 and again in spring 2012 to solicit feedback on the study reports.

9. Payment or Gifts

The principal and special education designee surveys will be the sole source of data for Research Question 2; therefore, it is important to achieve a high response rate. Studies have shown that, when used appropriately, incentives are a cost-effective means of significantly increasing response rates (e.g., Dillman, 1978, 2000). As Groves, Cialdini, and Couper (1992) note, people feel obligated to reward positive behavior (such as being provided with an incentive) with positive behavior in return; in the current context, such positive return behavior would be the completion of the survey. Surveys that use incentives can actually be less expensive than those that do not, because a greater up-front investment in incentives minimizes the need for more expensive follow-ups in later stages, resulting in cost savings and lower missing-data rates.

To boost the survey response rate and to compensate for respondents’ time, the study team plans to provide incentives for the survey of special education designees, who are likely to be special education teachers with teaching responsibilities during the months when data collection occurs. This survey will require approximately 30 minutes to complete, compared with 15 minutes for the principal survey. The planned incentive amount is $20 per respondent, for a maximum total cost of $132,760, assuming all 6,638 special education designees complete the survey. The amount of incentive planned for the survey respondents is consistent with that proposed in the NCEE memo Guidelines for Incentives for NCEE Evaluation Studies, dated March 22, 2005.







10. Assurance of Confidentiality

The study team has established procedures to ensure the confidentiality and security of the data collected. This approach will be in accordance with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183, which requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to the requirements of Subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.

In compliance with all the relevant regulations, the study team will protect the full privacy and confidentiality of the school survey respondents. No respondent names, schools, or districts will be identified in reports or findings, and if necessary, distinguishing characteristics will be masked. The study team will also ensure the confidentiality and security of all student-level data collected from the states participating in the study. To prevent the disclosure of any personally identifiable information, we will have states replace their true identifiers with a replacement set of identifiers; a crosswalk will be maintained separately in a secure location.
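The sketch below illustrates the identifier-replacement-with-crosswalk approach in minimal form. It is a conceptual illustration only: the states themselves will perform the replacement before delivering data, and the file and column names here are hypothetical.

    import secrets
    import pandas as pd

    records = pd.read_csv("state_student_records.csv")  # hypothetical input

    # Map each true identifier to a random study ID. The crosswalk is written
    # to a separate file to be stored in a secure location, apart from the
    # de-identified analysis data.
    unique_ids = records["state_student_id"].unique()
    crosswalk = pd.DataFrame({
        "state_student_id": unique_ids,
        "study_id": [secrets.token_hex(8) for _ in unique_ids],
    })
    crosswalk.to_csv("crosswalk_secure.csv", index=False)

    deidentified = (records.merge(crosswalk, on="state_student_id")
                           .drop(columns=["state_student_id"]))
    deidentified.to_csv("analysis_file.csv", index=False)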

Responses to data collection are voluntary and will be used only for broadly descriptive and statistical purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific school or individual. In no instances will data that relate to or describe identifiable characteristics of individuals or individual schools be disclosed or used in identifiable form, except as required by law.

The study team will maintain the security of the complete set of all master data files and documentation, and access to personally identifiable data will be strictly controlled. With regard to printed data, information identifying individuals will be kept separate from other research data, and all printed data will be kept in locked file cabinets during non-working hours.

All electronic data will be protected using several methods. The study team will provide secure FTP services that allow encrypted transfer of large data files with clients. This added service prevents the need to break up large files into many smaller pieces, while providing a secure connection over the Internet. To protect its internal network from unauthorized access, AIR uses defense-in-depth best practices that incorporate firewalls and intrusion detection and prevention systems. Data are backed up using high-speed backup tape drives (SDLT) and stored off site in a secure location on a weekly rotation schedule. Servers are located in separate, secured rooms with access limited to authorized staff. The network is configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the LAN. Access to our computer systems is password protected, and network passwords must be changed on a regular basis and conform to our strong password policy.

AIR assumes responsibility for ensuring that all AIR staff members (including consultants and subcontractors) who will work on the contract are in compliance with ED’s contractor security clearance process, relevant privacy laws and regulations, and AIR’s Institutional Review Board policies. AIR staff members who have access to sensitive data, including school survey data and student-level education records, will be required to sign a confidentiality agreement (see Appendix A) and obtain any clearances that may be necessary.

11. Justification of Sensitive Questions

No questions of a sensitive nature will be included in this study.

12. Estimates of Hour Burden

The estimated maximum hour burden for the data collections for the study is 4,978.5 hours. This figure includes

  • time for 100 percent of the 6,638 principals to respond to a 15-minute survey and

  • time for 100 percent of the 6,638 special education designees to respond to a 30-minute survey.

Based on average hourly wages for participants, this amounts to an estimated monetary cost of $182,545. Exhibit 5 provides further details about the estimates of respondent burden.

Exhibit 5. Summary of Estimates of Hour Burden

Task                                              Total Sample Size   Time Estimate (hours)   Number of Administrations   Total Hour Burden   Hourly Rate*   Estimated Monetary Cost of Burden

Administering principal survey                    6,638               0.25                    1                           1,659.50            $50            $82,975

Administering special education designee survey   6,638               0.50                    1                           3,319.00            $43            $99,570

Total (one year)                                  13,276              --                      --                          4,978.50            --             $182,545

Notes: *Hourly rates are based on the 2009 national data from the Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wages.


13. Estimate of Cost Burden to Respondents

There are no additional respondent costs associated with this data collection beyond the hour burden estimated in Item A12 (i.e., 15 minutes for the principal survey and 30 minutes for the special education designee survey).

14. Estimate of Annual Cost to the Federal Government

The estimated cost for this study, including development of a detailed study design, data collection instruments, justification package, data collection, data analysis, and report preparation, is $3,126,425 for the four years, or approximately $781,606 per year.

15. Program Changes or Adjustments

This request is for a new information collection.

16. Plans for Tabulation and Publication of Results

Results based on the data collected for the School Improvement Status and Outcomes for Students with Disabilities Study will be reported to ED by AIR according to the dissemination schedule, summarized in Exhibit 6.

Exhibit 6. Schedule for Dissemination of Study Results

Deliverable                       Date Due

First draft of interim report     October 19, 2010
Revised draft of interim report   December 28, 2010
Final draft of interim report     August 28, 2011
First draft of final report       April 27, 2012
Revised draft of final report     June 28, 2012
Final draft of final report       February 27, 2013


The interim report relies exclusively on extant EDFacts data; it was submitted in draft form in October 2010 and will be submitted in final form in June 2011. The interim report focuses on the inclusion of SWDs in school accountability systems and the AYP performance of schools reporting on SWDs based on EDFacts data between 2005–06 and 2008–09. It also examines SWD achievement trends between 2005–06 and 2007–08 in schools accountable for the SWD subgroup and in other schools, as well as SWD achievement trends in schools with different school identification histories in selected states. The final study report, to be submitted in draft form in April 2012 and in final form in February 2013, will be a capstone report that fully addresses all the research questions proposed for this study, integrating information from both the school surveys and extant student achievement data.

Findings from this study will be represented in both table form and graphic form as appropriate in the study reports. Exhibits 7 and 8 are two example table shells that we will use to present findings about school practices and SWD achievement trends respectively in our final study report.


Exhibit 7. Example Table Shell: Percentage of Schools Engaging in Different Types of Special Education Practices in 2010–11 Across Selected States,
by Accountability Status and Grade Level

                        Elementary Schools                                   Middle Schools

School Practice         Accountable    Non-Accountable    Difference        Accountable    Non-Accountable    Difference
                        Schools        Schools                              Schools        Schools
                        (N = xxx)      (N = xxx)                            (N = xxx)      (N = xxx)

Practice A
Practice B
Practice C
...
Note. Number of states = 11 for elementary-school analyses and 8 for middle-school analyses.


Exhibit 8. Example Table Shell: Percentage of SWDs Scoring Proficient or
Above in Accountable Schools Between 2005–06 and 2009–10,
by Year, State, and Grade Level

State        N of Schools    2005–06    2006–07    2007–08    2008–09    2009–10    Five-Year Change

Elementary Schools
  State A
  State B
  State C
  ...

Middle Schools
  State A
  State B
  State C
  ...


17. Approval to Not Display OMB Expiration Date

All data collection instruments will include the OMB expiration date.

18. Explanation of Exceptions

No exceptions are requested.

References


Dillman, D. A. (1978). Mail and telephone surveys: The total design method. New York: John Wiley.

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method. New York: John Wiley.

Groves, R. M., Cialdini, R. B., & Couper, M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56, 475–495.

Institute of Education Sciences, U.S. Department of Education. (2005, March 22). Guidelines for incentives for NCEE impact evaluations. Washington, DC: Author.




Appendix A

Employee Agreement on Data Use and Confidentiality Procedures

School Improvement Status and Outcomes for Students with Disabilities Study

Data Use and Confidentiality Procedures

Employee Agreement

 

 

I acknowledge that I have been granted access to confidential data, which include school survey data and student-level education records, to facilitate the performance of my duties on the School Improvement Status and Outcomes for Students with Disabilities Study (or “Students with Disabilities Study”). This Agreement confirms that I recognize and understand that my use of these data is restricted to the fulfillment of my duties on the Students with Disabilities Study and that it is my responsibility to safeguard and maintain the confidentiality of these data.

 

I have received a copy of the Students with Disabilities Study Data Use and Confidentiality Procedures. I certify that I have reviewed this document and agree to abide by the standards set forth therein for the duration of my employment on the Students with Disabilities Study. I understand that my e-mail and computer usage may be monitored by the company to ensure compliance with these standards. I am aware that any violations of the Data Use and Confidentiality Procedures may subject me to disciplinary action, up to and including discharge from employment.

 

 

 

___________________________________        _______________________
Employee’s Signature                       Date

 

 

___________________________________

Employee’s Printed Name

 



1 This question examines accountability by school type in the most recent year for which data are available, rather than over multiple years, due to the small number of non-regular schools.

2 We will report unadjusted p-values for all analyses in the study report. Where appropriate, we will also apply multiple comparison adjustment and note whether our results are robust to such adjustments.

3 “School improvement status in a given year” refers to school improvement status based on school adequate yearly progress (AYP) status in previous years. Schools first identified in 2010–11, for example, are schools that missed AYP for two years in a row in 2008–09 and 2009–10 and thus became identified for improvement in 2010–11.

4 We will report unadjusted p-values for all analyses in the study report. Where appropriate, we will also apply multiple comparison adjustment and note whether our results are robust to such adjustments.

