

The Individuals with Disabilities Education Act (IDEA) State and Local Implementation Study 2019

Part B: Collection of Information Employing Statistical Methods

April 8, 2019


Submitted to:

Institute of Education Sciences

550 12th Street, S.W.

Washington, DC 20202

Project Officer: Erica Johnson
Contract Number: ED-IES-17-C-0069

Submitted by:

Mathematica Policy Research

P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005

Project Director: Amy Johnson
Reference Number: 50538

CONTENTS

Overview of the Study

B.1. Respondent Universe and Sampling Methods

B.1.1. States

B.1.2. District sample

B.1.3. School sample

B.2. Procedures for the Collection of Information

B.2.1. Notification of the sample, recruitment, and data collection

State surveys

District surveys

School survey

B.2.2. Statistical methodology for stratification and sample selection

B.2.2.1. States

B.2.2.2. District sample

B.2.2.3. School sample

B.2.3. Estimation procedure

B.2.4. Degree of accuracy needed

B.2.5. Unusual problems requiring specialized sampling procedures

B.2.6. Use of periodic data collection cycles to reduce burden

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.3.1. Maximizing response rates

B.3.2. Weighting the district and school samples

B.4. Tests of Procedures or Methods to be Undertaken

B.5. Individuals Consulted on Statistical Aspects of the Design and on Collecting and/or Analyzing Data




TABLES

Table B.1. District strata

Table B.2. School strata within district sample

Table B.3. Estimated precision for district and school samples

Table B.4. Individuals consulted on study design



Overview of the Study

This Office of Management and Budget (OMB) package requests clearance for data collection activities to support the Individuals with Disabilities Education Act (IDEA) State and Local Implementation Study 2019. The Institute of Education Sciences (IES) within the U.S. Department of Education (ED) has contracted with Mathematica Policy Research and its partners the National Center for Special Education in Charter Schools, Inc. and Walsh Taylor, Inc. to conduct this study.

IDEA, which was last reauthorized in 2004, is a federal law first passed in 1975 to support the needs of children with disabilities. IDEA supports early intervention services for infants and toddlers identified as having a disability or at risk of substantial developmental delay. IDEA also supports special education and related services for children and youth ages 3 through 21 identified as having a disability, as well as coordinated early intervening services for children and youth who do not require special education, but who need additional support to succeed in a general education environment.

IDEA mandates that IES assess national activities under the law, known as the National Assessment of IDEA. The study that is the focus here—the IDEA State and Local Implementation Study 2019—is one aspect of this assessment and will develop a national picture of how states, districts, and schools are implementing IDEA. It will provide (1) ED, Congress, and other stakeholders with knowledge that can inform the next reauthorization of IDEA and, ultimately, how services are provided to children; and (2) states, districts, and schools with an understanding of how others are implementing IDEA. This study will build on an earlier IDEA National Assessment Implementation Study (IDEA-NAIS), which surveyed states and a nationally representative sample of districts in 2009.

The study will examine IDEA implementation at the state, district, and school levels. Specifically, the study will describe how state and local agencies identify infants, toddlers, children, and youth for early intervention and special education and related services. It will also examine the policies and programs these agencies have in place, and the learning supports that schools provide, to assist those identified for services. Finally, the study will investigate the extent to which state and local agencies use evidence from research on children with disabilities and how they allocate resources to support children with disabilities.

The study team will conduct surveys in Fall 2019 of a census of states and territories receiving IDEA funding and of national samples of districts and schools. The survey data will provide a snapshot of current implementation and, where possible, allow the study team to examine trends over time. This package provides a detailed discussion of the procedures for these data collection activities and the data collection forms and instruments.

The IDEA State and Local Implementation Study 2019 includes a potential follow-up data collection in the same states, districts, and schools in 2022. If IES proceeds with the second round, it will submit a separate package requesting OMB clearance for the follow-up surveys.

B.1. Respondent Universe and Sampling Methods

This study includes multiple survey data collection efforts. States, districts, and schools have important and non-overlapping roles in how IDEA is implemented, which necessitates data collection at each level. The study team will survey administrators in a census of the state-level entities receiving IDEA funding, as well as staff from nationally representative samples of school districts and schools. The school sample is nested within the selected district sample.

B.1.1. States

The study team will collect data from a census of the 61 state-level entities that receive IDEA funding; therefore, no sampling methods are required. These entities are all 50 states, the District of Columbia, eight U.S. territories, the Bureau of Indian Education, and the Department of Defense Education Activity.

B.1.2. District sample

The sample frame for the IDEA district surveys was derived from the list of public school districts provided by IES’s National Center for Education Statistics in its Common Core of Data (CCD). The study team used the CCD district files for the 2015–16 school year, as this was the latest available year with all the data necessary for sample selection. The sample frame for the IDEA district surveys is the subset of the full list of school districts that meet all of the following criteria. Applying these criteria restricts the sample frame to districts that are currently operational, enroll students in pre-kindergarten through grade 12, and are eligible for IDEA funding (a sketch of the frame construction follows the list).

  1. Districts must have reported student enrollment greater than zero. Districts with no students reported as enrolled in the 2015–16 school year were excluded from the sample frame.

  2. Districts must have low and high grades within the span of pre-kindergarten through grade 13.1 Districts with low and high grades of ungraded or missing, or with adult education as their reported low grade, were excluded from the sample frame.

  3. Districts must be categorized as regular local school districts, local school districts that are part of a supervisory union, supervisory unions, or charter education agencies. Districts categorized as (1) regional education service agencies, (2) state-operated agencies, (3) federally operated agencies, or (4) other types of agencies were excluded from the frame. The study team decided to include supervisory unions (typically administrative offices) as a district type for this study because two such local education agencies in the CCD have non-zero enrollment and appear to be regular local school districts. All other supervisory unions are effectively excluded because they report zero students enrolled.

  4. Districts must have at least one school that is eligible for selection (see school eligibility below). Districts with no schools eligible for selection were excluded from the sample frame.

  5. Charter districts must be considered a local education agency (LEA) for the purposes of IDEA funding. Charter districts that are not considered an LEA for IDEA funding cannot receive IDEA funding and were excluded from the sample frame.
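To make the frame construction concrete, the sketch below applies these five criteria to a CCD-style district file. It is illustrative only: the column names (for example, enrollment, low_grade, agency_type, is_lea_for_idea) and grade codes are hypothetical stand-ins for the corresponding CCD fields, not the actual CCD variable names, and the study team's production code may differ.

import pandas as pd

# Hypothetical grade codes spanning pre-kindergarten through grade 13.
GRADES = ["PK", "KG"] + [str(g) for g in range(1, 14)]

ELIGIBLE_AGENCY_TYPES = {
    "regular",            # regular local school district
    "union_component",    # local district that is part of a supervisory union
    "supervisory_union",  # supervisory union administrative office
    "charter_agency",     # charter education agency
}

def build_district_frame(districts: pd.DataFrame,
                         eligible_schools_per_district: pd.Series) -> pd.DataFrame:
    """Apply the five eligibility criteria to a CCD-style district file."""
    d = districts
    d = d[d["enrollment"] > 0]                                         # criterion 1
    d = d[d["low_grade"].isin(GRADES) & d["high_grade"].isin(GRADES)]  # criterion 2
    d = d[d["agency_type"].isin(ELIGIBLE_AGENCY_TYPES)]                # criterion 3
    counts = d["district_id"].map(eligible_schools_per_district).fillna(0)
    d = d[counts > 0]                                                  # criterion 4
    is_charter = d["agency_type"] == "charter_agency"
    d = d[~is_charter | d["is_lea_for_idea"]]                          # criterion 5
    return d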

The district sample frame consists of the 16,108 districts satisfying the criteria listed above. These districts serve 99.4 percent of the total district enrollment in the CCD district file. The districts in the sample frame come from 54 states and territories. Seven state-level entities have no eligible districts (American Samoa, Bureau of Indian Education, Department of Defense Education Activity, Federated States of Micronesia, Marshall Islands, Northern Mariana Islands, and Palau).

The study team selected a nationally representative sample of 665 districts. The preschool-age district survey will be administered in 602 of the 665 districts. Section B.2 discusses the district sampling approach in more detail.

B.1.3. School sample

The sample frame for the IDEA school survey was derived from the list of public schools provided by the CCD files for the 2015–16 school year. Eligible schools include the subset of schools that meet all of the following criteria. These criteria restrict the frame to currently operational schools that are eligible for IDEA funding and enroll students in pre-kindergarten through grade 12.

  1. Schools must have low and high grades within the span of pre-kindergarten through grade 13. Schools with low and high grades of ungraded or missing, or with adult education as their reported low grade, were excluded from the sample frame.

  2. Schools must be listed as currently operational in the 2015–16 CCD file, defined as a school that (1) is continuing operation from the previous year, (2) is newly opened or added, (3) changed districts, or (4) re-opened. Schools that are closed, inactive, or scheduled to be opened within the next two years were excluded from the sample frame.

  3. Schools must have at least one student enrolled in the 2015–16 school year. Schools with zero students enrolled or unreported enrollment were excluded from the sample frame.

  4. Schools must have a physical location where students receive at least part of their education and cannot be identified as virtual schools. All virtual schools were excluded from the sample frame.

  5. Eligible schools include regular schools, special education schools, vocational education schools, and other/alternative schools.

The school sample was nested within the 665 districts selected for participation in the study, as discussed further in Section B.2.

B.2. Procedures for the Collection of Information

B.2.1. Notification of the sample, recruitment, and data collection

State surveys

In Fall 2019, the study team will administer three different state surveys that focus on the Part C program for infants and toddlers, the Part B program for preschool-age children, and the Part B program for school-age children and youth, respectively. Three surveys are necessary because different state administrators are likely to oversee IDEA programs for children at those different age levels. The surveys will be administered to:

  • the IDEA Part C infants and toddlers program coordinator,

  • the IDEA Part B program for preschool-age children coordinator, and

  • the special education director, who will respond for the IDEA Part B program for school-age children.

The surveys will be administered to respondents in each state, the District of Columbia, and other territories. The expected time to complete the surveys will be 60 minutes, the same as for the IDEA-NAIS state surveys administered in 2009. IDEA is a comprehensive law, and the information collected on the surveys (state policies on identification and supports for children with disabilities and use of evidence from research, and the allocation of special education resources) is complex. Obtaining information on such issues requires surveys of this length.

Contact information for respondents will be obtained from up-to-date online lists. The National Association of State Directors of Special Education maintains a list of state special education directors, and the Early Childhood Technical Assistance Center maintains lists of state coordinators for IDEA’s Part C infants and toddlers program and Part B program for preschool-age children.

To build interest in completing the surveys, the study team will send a letter to each state to introduce the study (Appendix B.1). Prior to the start of data collection, the study team will mail a letter that will describe the study and survey (Appendix B.2). The study team will also provide additional information, answer questions about the study, and obtain feedback from state directors/coordinators at meetings sponsored by ED, such as the Office of Special Education Programs Leadership Conference.

In Fall 2019, the study team will send an email with a link to the electronic survey and instructions for completing it. Because the coordinators and directors may need to consult others in order to complete the survey, the study team will also provide a hard-copy survey for reference that respondents can share with the people they consult.

The study team will send reminder emails every two weeks and, after six weeks, will make weekly phone calls. Because states receive IDEA funds and are therefore expected to participate in the study, nearly all states are anticipated to respond.

District surveys

In Fall 2019, the study team will administer two different district surveys that focus on the IDEA Part B program for preschool-age children and the IDEA Part B program for school-age children, respectively. Two surveys are necessary because different district staff members are likely to oversee IDEA programs for students at those different age levels. The surveys will be administered to:

  • coordinators of the IDEA Part B program for preschool-age children in 602 districts, and

  • the district special education directors in 665 districts, who will respond for the IDEA Part B program for school-age children.

If a district does not have a coordinator for the IDEA Part B program for preschool-age children, the study team will work with the district to identify the most appropriate respondent for the survey, likely someone in the district’s preschool special education leadership. Study team recruiters will reach out to district special education directors and superintendents to secure their district’s participation. During this first outreach, the districts will receive a letter (Appendix B.3) and a brochure (Appendix B.4) that describe the need for the study and what is expected of the school district. The initial outreach will also include a participation letter (Appendix B.5). The study team plans to include a letter of endorsement from a key stakeholder in special education: the Council of Administrators of Special Education, a Division of the Council for Exceptional Children (CEC) (Appendix B.6). The study team will also involve the state directors in district recruitment where relevant and feasible.

The preliminary outreach will focus on securing a participation letter from each district and determining whether a formal research application is needed. The study team estimates that one-fourth of the sampled districts will require a formal research application. The participation letter will document a district’s intended cooperation with the study for both the current and the potential subsequent round of data collection (for which a second OMB package would be submitted). It will also ask district staff to identify the coordinator for the IDEA Part B program for preschool-age children. (As noted above, the district special education director will respond to the IDEA Part B program for school-age children survey, so there is no need to ask the district to identify that survey respondent.) The study team will contact the district again at the beginning of the 2019–20 school year to introduce the study to the coordinator of the IDEA Part B program for preschool-age children and to ask the district special education director to identify the most appropriate school survey respondent at each study school. The school survey respondent could be the school’s principal or lead special education staff.

In Fall 2019, the study team will administer a survey about the IDEA Part B program for preschool-age children to a nationally representative sample of 602 school districts and a survey about the IDEA Part B program for school-age children to a nationally representative sample of 665 school districts.2 Before data collection begins, the study team will again contact each district special education director and the identified IDEA Part B program for preschool-age children survey respondent to remind them of the study and alert them to the upcoming request for their participation (Appendix B.7). The study team will then send an email with instructions and a link to the survey. A hard copy of the survey will also be mailed to each district respondent. The two surveys will be administered online, and each is expected to take about 60 minutes, the same as the district surveys for IDEA-NAIS administered in 2009. As stated above, IDEA is a comprehensive law, and the information collected on the district surveys (districts’ policies and programs related to IDEA) is complex. Obtaining information on such issues requires surveys of this length.

Shortly after sending the survey links, the study team will follow up with a telephone call to each district sample member to ensure they received the email, to address questions, to troubleshoot any issues they might have, and to encourage participation. The surveys will be optimized for mobile phones and will allow respondents to start and stop as needed to accommodate schedules and gather data needed for completion. Because districts receive IDEA funds and thus their participation is expected, a response rate of at least 85 percent is expected (512 school districts for the preschool-age form and 565 school districts for the school-age form).

School survey

In Fall 2019, the study team will administer an electronic survey on Part B to a nationally representative sample of 2,750 schools from the 665 selected districts. Before the start of data collection, the study team recruiters will call each school survey respondent identified by the district special education director to secure his/her participation. This person could be the school’s principal or lead special education staff. If the identified school respondent has left the school or position, the study team will work with the appropriate administrator at the school to identify the correct respondent. The study team will also send each school a letter (Appendix B.8) and a brochure detailing why the study is needed and what is expected of the school.3 Once the school agrees to participate and the respondent has been confirmed, the study team will send an email with instructions and a link to the survey. The survey will be administered online and will take about 45 minutes to complete. An 80 percent response rate, or responses from 2,200 schools, is expected. Because district and school surveys will be fielded simultaneously, it will be possible to obtain school survey responses even if the district respondents do not respond (provided a participation letter from the district is secured).

Because school staff are busy and schools are not direct recipients of IDEA funds, obtaining cooperation with the survey from school-level staff (either principals or special education staff) is likely to be challenging. Therefore, the study team will take several additional steps to encourage participation. As described in Part A, a $30 incentive will be offered to each school respondent. In addition, a note of endorsement secured for the study from the Council of Administrators of Special Education, a Division of the Council for Exceptional Children (CEC), will be sent to schools.4 Hearing about the value of participation from a known and trusted source can greatly increase cooperation (Groves, Cialdini, & Couper, 1992). Shortly after sending the link to the survey, the study team will follow up with a telephone call to each school sample member to ensure they received the email, to address questions, to troubleshoot any issues they might have, and to encourage them to participate.

B.2.2. Statistical methodology for stratification and sample selection

B.2.2.1. States

The state sample includes the 61 state-level entities that receive IDEA funding: all 50 states, the District of Columbia, eight U.S. territories, the Bureau of Indian Education, and the Department of Defense Education Activity. The study design assumes that nearly all state entities will respond. Therefore, the study team does not plan to construct weights to account for state-level nonresponse.

B.2.2.2. District sample

The district sample frame includes 25 strata from which the study team will select a total of 665 districts for the study. The first 24 strata are based on combinations of census region (Northeast, South, Midwest, West), urbanicity (urban, suburban, town/rural area), and total enrollment (above/below median enrollment for each urbanicity type). Only districts with no charter schools are included in these 24 strata. The final stratum is for districts with at least one charter school, which includes charter school agencies and traditional districts with at least one charter school. To ensure enough schools for the school sample, the team oversampled charter school districts, traditional districts with charter schools, and larger urban and suburban districts by increasing the allocation of sample to these district strata beyond the number determined by a proportional allocation.5 Large districts were oversampled by approximately 1.4:1. The district sample from the remaining strata was proportionally allocated.

Districts were selected with probability proportional to size, with a transformation of the number of schools as the measure of size. For districts in the first 24 strata, this measure was capped at 200 schools, and the square root was then taken to reduce design effects for the district survey.6 In the 25th stratum, the square root of the number of charter schools was used as the measure of size for regular school districts, while the raw number of charter schools was used for the charter school agencies. This approach was used in the 25th stratum to increase the number of charter schools within the selected districts; using the total number of schools as the measure of size consistently resulted in too few charter schools within the selected districts. The study team plans to collect data from at least 565 of the 665 selected districts, assuming a response rate of at least 85 percent.
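The sketch below illustrates the measure-of-size transformation and a systematic probability-proportional-to-size draw within one stratum. It is a simplified rendering of the approach described above, with hypothetical field names; the study's actual selection procedure may differ in details such as frame sorting and the handling of certainty units.

import numpy as np
import pandas as pd

def measure_of_size(row: pd.Series) -> float:
    """Transformed measure of size; field names are hypothetical."""
    if row["stratum"] <= 24:
        # Cap at 200 schools, then take the square root to damp design effects.
        return float(np.sqrt(min(row["n_schools"], 200)))
    # Stratum 25: sqrt of the charter count for regular districts,
    # the raw charter count for charter school agencies.
    if row["agency_type"] == "charter_agency":
        return float(row["n_charter_schools"])
    return float(np.sqrt(row["n_charter_schools"]))

def systematic_pps(stratum_frame: pd.DataFrame, n: int,
                   rng: np.random.Generator) -> pd.DataFrame:
    """Select n districts with probability proportional to size."""
    mos = stratum_frame.apply(measure_of_size, axis=1).to_numpy()
    cum = np.cumsum(mos)
    interval = cum[-1] / n
    hits = rng.uniform(0, interval) + interval * np.arange(n)
    # Map each equally spaced hit to the district whose cumulative size
    # interval contains it. With the capped, square-rooted measure of size,
    # no single district should be large enough to be hit twice.
    idx = np.searchsorted(cum, hits)
    return stratum_frame.iloc[idx]

# Usage: selected = systematic_pps(frame, n=11, rng=np.random.default_rng(50538))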

Table B.1 presents the strata details for the district sample frame and the selected sample.

Table B.1. District strata


Stratum  Stratum definition                              Sampling rate  Districts selected  Anticipated districts responding
1        Midwest, low student enrollment, urban          2.6%           2                   1.7
2        Midwest, high student enrollment, urban         3.7%           2                   1.7
3        Midwest, low student enrollment, suburban       3.4%           11                  9.4
4        Midwest, high student enrollment, suburban      4.6%           13                  11.1
5        Midwest, low student enrollment, town/rural     3.3%           69                  58.7
6        Midwest, high student enrollment, town/rural    3.3%           60                  51.0
7        Northeast, low student enrollment, urban        5.6%           2                   1.7
8        Northeast, high student enrollment, urban       7.4%           2                   1.7
9        Northeast, low student enrollment, suburban     3.4%           19                  16.2
10       Northeast, high student enrollment, suburban    4.5%           19                  16.2
11       Northeast, low student enrollment, town/rural   3.3%           27                  23.0
12       Northeast, high student enrollment, town/rural  3.2%           29                  24.7
13       South, low student enrollment, urban            7.4%           2                   1.7
14       South, high student enrollment, urban           4.8%           3                   2.6
15       South, low student enrollment, suburban         5.9%           2                   1.7
16       South, high student enrollment, suburban        4.6%           5                   4.3
17       South, low student enrollment, town/rural       3.2%           32                  27.2
18       South, high student enrollment, town/rural      3.3%           58                  49.3
19       West, low student enrollment, urban             3.3%           2                   1.7
20       West, high student enrollment, urban            3.5%           2                   1.7
21       West, low student enrollment, suburban          4.8%           2                   1.7
22       West, high student enrollment, suburban         4.3%           6                   5.1
23       West, low student enrollment, town/rural        3.3%           41                  34.9
24       West, high student enrollment, town/rural       3.3%           21                  17.9
25       One or more charter schools                     7.4%           234                 198.9
Note: The sampling rate is the number of districts to be selected within the stratum divided by the total number of districts in that stratum.


B.2.2.3. School sample

Within each sampled district, there were up to six school strata for sampling purposes: (1) traditional elementary with pre-kindergarten, (2) traditional elementary without pre-kindergarten, (3) traditional secondary (middle and high schools), (4) charter elementary with pre-kindergarten, (5) charter elementary without pre-kindergarten, and (6) charter secondary. The study team chose these six strata to ensure adequate sample not only to provide nationally representative estimates for schools overall but also to separately examine analytic subgroups of schools that are important for IDEA implementation. The analytic subgroups include charter schools, traditional public schools, elementary schools, secondary schools, and just those elementary schools that offer pre-kindergarten.7 For instance, the school sample design allows for comparisons between charter schools and traditional public schools, providing a key source of nationally representative data about school choice for students with disabilities. Similarly, the school sample design allows for separate nationally representative estimates for elementary schools and secondary schools. This is important not only for making comparisons across grade ranges but also for examining a topic like youth transition from school that is relevant only at the secondary level. Finally, combining the two strata that include elementary schools with pre-kindergarten allows the study to examine school-level implementation of IDEA for preschool-age children with disabilities.

Charter schools and elementary schools with pre-kindergarten classes were oversampled to maximize precision for those subgroups. For the school selection, strata were defined as the combination of the sampled district and the six-level school strata, described above. Therefore, each district can have a minimum of one school stratum and a maximum of six school strata, depending on the mix of eligible schools located within each district.

The study team selected 2,750 schools to yield at least 2,200 responding schools (80 percent). To achieve precision targets for the key analytic subgroups of schools, the number of schools selected from each district was determined separately for each school type. For each school stratum, the study team allocated the school sample across districts by initially assigning a sample size of 1 for each district where the school type is located. Then, the remaining sample for each type not already assigned was proportionally allocated by the relative size of the school stratum for that type of school in each district. For example, if a given district had 5 percent of the elementary-with-pre-kindergarten schools, 5 percent of the remaining number of these schools was allocated to that district. In the selected sample, a total of 1,433 schools were allocated in order to select a minimum of one school per stratum, with the remaining 1,317 schools proportionally allocated to strata. About 95 percent of districts had three or fewer school strata, and about 1 percent had the maximum of six school strata. Schools were selected with equal probability within each district-school stratum combination.
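As an illustration of this two-step allocation (a minimum of one school per district, then proportional allocation of the remainder), the sketch below works through a single school stratum. The variable names are hypothetical, and the rule for distributing slots lost to rounding is one plausible choice rather than the study team's documented procedure.

import numpy as np

def allocate_school_sample(counts_by_district: np.ndarray,
                           total_to_select: int) -> np.ndarray:
    """Allocate one school stratum's sample across districts.

    counts_by_district[i] is the number of schools of this type in
    district i (zero if the district has none). Assumes total_to_select
    is at least the number of districts containing the type.
    """
    has_type = counts_by_district > 0
    alloc = has_type.astype(int)               # step 1: minimum of one school
    remaining = total_to_select - alloc.sum()  # step 2: proportional remainder
    shares = counts_by_district / counts_by_district.sum()
    alloc += np.floor(remaining * shares).astype(int)
    # Hand out any slots lost to rounding, starting with the largest districts.
    leftover = total_to_select - alloc.sum()
    for i in np.argsort(-counts_by_district)[:max(leftover, 0)]:
        alloc[i] += 1
    # Never allocate more schools than a district actually contains.
    return np.minimum(alloc, counts_by_district)

# Example: allocate_school_sample(np.array([10, 0, 3, 7]), 8) -> [4, 0, 1, 3]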

Table B.2 presents the strata details for the school sample frame and the selected sample.

Table B.2. School strata within district sample

Stratum                               Sampling rate  Schools selected across selected districts  Anticipated schools responding
Traditional elementary without pre-K  15.2%          323                                          258.4
Traditional elementary with pre-K     30.4%          812                                          649.6
Traditional middle and high school    25.8%          790                                          632.0
Charter elementary without pre-K      65.2%          377                                          301.6
Charter elementary with pre-K         100.0%         125                                          100.0
Charter middle and high school        55.5%          323                                          258.4


B.2.3. Estimation procedure

The study team will use descriptive statistical methods to tabulate the data. The primary method will be to report point-in-time estimates of mean values for state, district, and school data. The analysis will also compare means for subgroups of schools and, where possible, examine trends across time. The associated standard errors for estimates will account for design effects due to weighting and clustering (for schools).

  • Point-in-time estimates. For state data, the study will report numbers of states and unweighted means. For district and school data, the study will report weighted means for survey variables and unweighted means from extant sources such as EDFacts, where data exist for nearly all districts. (A minimal sketch of the weighted estimator follows this list.)

  • Point-in-time estimates for school subgroups. Subgroup analyses will provide a fuller understanding of how IDEA implementation varies across types of schools. The sampling strategy will permit the study to document (and statistically test for) differences between charter schools and traditional schools in how they identify and support children with disabilities. The sampling approach will also allow the study team to make precise estimates of means for topics applicable only to specific school subgroups, such as on preschool inclusion policies for the subgroup of elementary schools with pre-kindergartens or on post–high school transition planning for the subgroup of secondary schools.

  • Trend analyses. The study team will document trends relative to the IDEA-NAIS on measures related to topics such as the development and quality of individualized education programs (IEPs), provided that the item universes and framing in the 2009 study are comparable. Comparisons will be made across selected items in the state and district surveys. (The IDEA-NAIS did not include a school survey.)
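The sketch referenced in the first bullet appears below: a weighted mean with an approximate standard error based on Kish's effective sample size. This is a simplification for illustration; as noted in Section B.3.2, the actual analysis will use a survey analysis package that fully accounts for stratification, clustering, and the final weights.

import numpy as np

def weighted_mean(y: np.ndarray, w: np.ndarray) -> float:
    """Design-weighted point estimate."""
    return float(np.sum(w * y) / np.sum(w))

def approx_se(y: np.ndarray, w: np.ndarray) -> float:
    """Rough standard error using Kish's effective sample size
    (ignores clustering and stratification)."""
    mean = weighted_mean(y, w)
    n_eff = np.sum(w) ** 2 / np.sum(w ** 2)          # effective sample size
    var = np.sum(w * (y - mean) ** 2) / np.sum(w)    # weighted variance
    return float(np.sqrt(var / n_eff))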

B.2.4. Degree of accuracy needed

Given the sample design described in sections B.2.2.2 and B.2.2.3, the study team estimates that the targeted sample sizes will provide precise, unbiased estimates for key indicators within approximately ±5 percentage points, using 95 percent confidence intervals. This applies to school districts and schools overall, as well as to policy-relevant subgroups (Table B.3). Table B.3 shows precision estimates for an outcome proportion of 0.5 to demonstrate the precision when the outcome variance is highest, and for an outcome proportion of 0.7 to demonstrate the precision for larger or smaller proportions. These precision estimates assume an 85 percent response rate, with a 2 percent active refusal rate among districts. For schools, the study team assumes an 85 percent response rate among schools whose districts responded, a 60 percent response rate among schools whose districts did not respond but did not actively refuse, and a zero percent response rate for schools whose districts actively refused. Cumulatively, this results in an estimated 80 percent response rate among all sampled schools. The precision estimates also assume a weighting design effect due to nonresponse adjustments of 1.2, a weighting design effect of 1.2 due to oversampling based on the sample design described above, an intra-cluster correlation coefficient of 0.02 for school-level estimates, and a 5 percent Type I error rate. The design effect due to the district oversampling may change slightly in the final selected sample.
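These precision figures follow the standard half-width formula, half-width = z * sqrt(deff * p(1 - p) / n). The sketch below applies it to the district sample under the assumptions stated above; the small differences from Table B.3 (for example, 4.9 versus 5.0 for p = 0.5) presumably reflect rounding and variance conventions not spelled out here.

import math

# Illustrative precision calculation for the district sample, assuming a
# combined weighting design effect of 1.2 (nonresponse) x 1.2 (oversampling).
# School-level estimates would additionally multiply the design effect by a
# clustering factor of 1 + (m - 1) * icc, with icc = 0.02 and m the average
# number of responding schools per sampled district.

def half_width(p: float, n: int, deff: float, z: float = 1.96) -> float:
    return z * math.sqrt(deff * p * (1.0 - p) / n)

deff_weighting = 1.2 * 1.2
for p in (0.5, 0.7):
    hw = half_width(p, n=565, deff=deff_weighting)
    print(f"districts, p = {p}: +/-{100 * hw:.1f} percentage points")
# Prints roughly +/-4.9 and +/-4.5, in line with the 5.0 and 4.6 in Table B.3.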

Table B.3. Estimated precision for district and school samples

District or school group                  Approximate number responding  95% CI half-width (percentage points), p = 0.5  95% CI half-width (percentage points), p = 0.7
School districts (full sample)            565                            5.0                                             4.6
Schools (full sample)                     2,200                          3.4                                             3.1
Elementary schools with pre-kindergarten  750                            5.1                                             4.6
Elementary schools (all)                  1,310                          4.7                                             4.3
Secondary schools (all)                   890                            4.5                                             4.1
Charter schools                           660                            5.5                                             5.0


B.2.5. Unusual problems requiring specialized sampling procedures

There are no unusual problems that require specialized sampling procedures.

B.2.6. Use of periodic data collection cycles to reduce burden

The data collection will require up to one hour per survey in Fall 2019. The study might collect data again in 2022 if there are relevant judicial, legislative, and/or regulatory developments that may have implications for how states and local agencies are implementing services under IDEA. In that case, collecting a second round of data would provide up-to-date information to ED, Congress, and other stakeholders. If IES proceeds with the second round of data collection in 2022, it will submit a separate package requesting OMB clearance for the follow-up surveys.

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.3.1. Maximizing response rates

To maximize response rates, state, district, and school sample members will receive email reminders every two weeks for the first six weeks and weekly phone calls thereafter. For district-level nonrespondents, toward the end of the field period, the study team will seek assistance from the state special education directors and the state coordinators of the IDEA Part B program for preschool-age children to encourage district sample members to participate. The state coordinators will be asked to send an email (or make a phone call, if they prefer) to district special education directors or district IDEA Part B program for preschool-age children coordinators to help convince them of the value and importance of cooperating with the study. Similarly, at the end of the field period, the study team will seek support from the district-level special education directors who completed the district survey to send an email (or make a phone call) encouraging nonresponding schools in their district to participate. Finally, for those who have not completed the web survey within the final three weeks of the field period, the study team will provide a hard-copy instrument and offer to conduct the survey by telephone.

B.3.2. Weighting the district and school samples

For district and school data, the study will report weighted means for survey variables. As described above, for state data, the study will report numbers of states and unweighted means.

The sampling weight for each selected district will be the inverse of its selection probability, which will vary across strata and district size. The measure of district size is a transformation of the number of schools in the district (Section B.2.2.2). The sum of the sampling weights across selected districts will approximately equal the total measure of size for all districts (selected and unselected) within each stratum.

The sampling weight for each selected school will be the inverse of the school’s cumulative probability, defined as the product of its district selection probability and its selection probability within its school stratum within its district. The school sampling weights will approximately sum to the total number of schools of each type for districts in their district stratum.

High response rates (85 percent of districts and 80 percent of schools) are expected, and the study team will adjust the sampling weights to further reduce the potential effect of nonresponse bias due to differential response patterns. To do so, the team will first identify a set of characteristics on the CCD district and school files that are associated with the likelihood of response.8 The team will then estimate response propensities by fitting logistic regression models with response as the outcome variable and the identified characteristics as the predictors. Response propensities estimate the likelihood that districts or schools with similar characteristics responded to the survey. The response propensities will be converted into five weighting classes, where each class is defined as one quintile of the response propensity score distribution. A weighting class adjustment will be used to account for the nonrespondents’ sampling weights within each class. A school’s district does not need to respond in order for the school to respond, so the nonresponse adjustment for the districts will be independent of the nonresponse adjustment for schools, though an indicator for district response will be included in the response propensity model for schools. The study team will examine the distribution of the nonresponse-adjusted district and school weights to assess the need for weight trimming, and will implement a trimming procedure if needed.
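A minimal sketch of the propensity-quintile adjustment appears below. The logistic model and weighting-class mechanics follow the description above, but the software, the predictor matrix X, and the handling of edge cases (for example, classes with few respondents) are illustrative assumptions rather than the study team's production specification.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def nonresponse_adjust(w: np.ndarray, responded: np.ndarray,
                       X: pd.DataFrame) -> np.ndarray:
    """Weighting-class nonresponse adjustment based on propensity quintiles.

    w: sampling weights; responded: boolean response indicator;
    X: frame characteristics (e.g., from the CCD) that predict response.
    """
    model = LogisticRegression(max_iter=1000).fit(X, responded)
    propensity = model.predict_proba(X)[:, 1]       # estimated P(response)
    # Five weighting classes: quintiles of the propensity distribution.
    classes = pd.qcut(propensity, q=5, labels=False, duplicates="drop")
    adjusted = np.where(responded, w.astype(float), 0.0)
    for c in np.unique(classes):
        in_class = classes == c
        # Respondents in each class absorb the full class weight total.
        factor = w[in_class].sum() / w[in_class & responded].sum()
        adjusted[in_class & responded] *= factor
    return adjusted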

The final stage of the weighting process will be a post-stratification adjustment to align the weights with known population totals, by multiplying the nonresponse-adjusted weights by a constant within each post-stratum. For districts, the study team will post-stratify the weights within the same strata used for sampling, which include census region (Northeast, South, Midwest, West), urbanicity (urban, suburban, town/rural area), and total enrollment (above/below median enrollment for each urbanicity type). The team will also include an indicator for districts that require research applications in the post-stratification adjustment. For schools, the study team will post-stratify to the population totals for (1) traditional elementary with pre-K, (2) traditional elementary without pre-K, (3) traditional middle and high, and (4) charter. The team will also explore the use of an indicator for district-level nonresponse. To conduct the analyses, the study team will use a statistical package with survey analysis procedures to calculate accurate estimates of the standard errors that take into account the complex sample designs and the weights.
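The post-stratification step amounts to a ratio adjustment within each post-stratum, as in the sketch below. The post-stratum labels and population totals are placeholders for the CCD-based totals described above.

import pandas as pd

def post_stratify(weights: pd.Series, post_stratum: pd.Series,
                  pop_totals: dict) -> pd.Series:
    """Scale nonresponse-adjusted weights to known population totals."""
    adjusted = weights.astype(float).copy()
    for stratum, total in pop_totals.items():
        mask = post_stratum == stratum
        # Multiply by a constant so weights in this post-stratum sum to total.
        adjusted[mask] = weights[mask] * (total / weights[mask].sum())
    return adjusted

# Usage: w_final = post_stratify(w_nr, school_type, known_school_totals),
# where known_school_totals maps each post-stratum (e.g., "charter") to its
# count in the most recent CCD file.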

B.4. Tests of Procedures or Methods to be Undertaken

The study team has solicited feedback on the surveys from several groups. First, Trent Buskirk, the director of the Center for Survey Research at the University of Massachusetts Boston, reviewed and provided input on the sampling plan. Second, the technical working group (TWG) members—researchers and practitioners with substantial experience in IDEA policy—provided feedback on the survey content. Third, IES gathered feedback from staff members in other ED offices, such as the Office of Special Education Programs. Fourth, the instruments were pretested in parallel with submission of the OMB package. Details of the pretest approach are provided below.

The study team recruited three to four respondents to pretest each of the surveys by completing the survey and participating in a debriefing interview. Pretest respondents came from a range of states, districts, and school types. Two rounds of pretests were conducted to allow for retesting the revisions that resulted from the first round. As a result of pretesting, the surveys were revised to: (1) clarify text that was unclear, (2) add response options that were missing, (3) reorder questions to ensure the surveys were logically organized, (4) change question formats to reduce burden on respondents, and (5) remove survey items that were too difficult to answer or that provided lower-priority information, in order to decrease the overall burden of the survey.

B.5. Individuals Consulted on Statistical Aspects of the Design and on Collecting and/or Analyzing Data

Table B.4. Individuals consulted on study design

Name                 Title and affiliation
Barbara Carlson      Associate Director and Senior Statistician, Mathematica
Jared Coopersmith    Statistician, Mathematica
Trent Buskirk        Director, Center for Survey Research, University of Massachusetts Boston
Carl Beck            Preschool Coordinator, Pennsylvania Department of Education and Welfare
Cecelia Dodge        Project Director, WestEd
John Eisenberg       Assistant Superintendent, Virginia Department of Education
Segun Eubanks        Director, Center for Education Innovation and Improvement, University of Maryland College Park
John Hosp            Professor, Department of Student Development, University of Massachusetts
Sheila Self          Education Programs Consultant, California Department of Education
Patricia Snyder      Professor of Special Education and Early Childhood Studies and Director of the Anita Zucker Center for Excellence in Early Childhood Studies, University of Florida
David Test           Professor of Special Education, University of North Carolina at Charlotte
Gerald Tindal        Castle-McIntosh-Knight Professor of Education, University of Oregon

Laurie VanderPloeg   Director of Special Education, Kent Intermediate School District9




REFERENCES

Groves, R.M., Cialdini, R.B., & Couper, M.P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56(4), 475–495. https://doi.org/10.1086/269338









www.mathematica-mpr.com

Improving public well-being by conducting high quality,
objective research and data collection

Princeton, NJ ■ Ann Arbor, MI ■ Cambridge, MA ■ Chicago, IL ■ Oakland, CA ■ Seattle, WA ■ Tucson, AZ ■ Washington, DC ■ Woodlawn, MD





1 The CCD uses grade 13 to denote high school students who are enrolled in programs where they can earn college credit in an extended high school environment, or career and technical education (CTE) students in a high school program that continues beyond grade 12.

2 Of the 665 districts selected overall, 63 did not offer pre-kindergarten instruction and are not eligible for the preschool-age district survey.

3 Appendix B.8 displays the letter that will be used to inform and recruit schools. The brochure (Appendix B.4) is the same for districts and schools.

4 The letter of support from the Council of Administrators of Special Education (Appendix B.6) is the same for schools and districts.

5 To determine the charter school district sample size, the study team selected a sample, determined the number of charter schools in the selected districts, adjusted the number of districts to select based on the results, and repeated this process until the selected samples provided sufficient counts of charter schools for selection.

6 The design effect due to weighting is the increase in the variance of the outcome due to the complex sample design, compared to the variance of the same outcome if a simple random sample had been drawn. The study team tested various caps on the number of schools in a district. The number 200 was selected because it produced the smallest design effects.

7 The strata are used only for sampling purposes and must be mutually exclusive. The subgroups are used for analytic purposes and can overlap. For example, the three charter sampling strata will contribute schools to all of the school subgroups.

8 The study team will use the 2015–16 CCD data for sampling and the most recent available CCD data (at the time of the analysis) to construct post-stratification weights.

9 Subsequent to providing input on this study, Laurie VanderPloeg became the Director of the Office of Special Education Programs (OSEP), U.S. Department of Education.

