
Middle Grades Longitudinal Study of 2017-18 (MGLS:2017)

Main Study Base Year (MS1), Operational Field Test First Follow-up (OFT2), and Tracking and Recruitment for Main Study First Follow-up (MS2)



OMB# 1850-0911 v.18




Supporting Statement Part B







National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






July 2017

revised January 2018





B. Collection of Information Employing Statistical Methods


Part B of this submission presents information on the collection of information employing statistical methods for the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Main Study Base Year (MS1) and Operational Field Test (OFT) First Follow-up (OFT2), as well as for tracking and recruitment for the Main Study First Follow-up (MS2).

B.1 Universe, Sample Design, and Estimation

The universe and sample design for the OFT Base Year (OFT1) and for MS1 were fully described in previous clearance submissions (OMB# 1850-0911 v. 10-15), which covered all aspects of OFT1 and recruitment for MS1. This document describes the sampling universe and design for OFT2 and MS2.

MGLS:2017 MS1 will be conducted during the 2017-18 school year, with data collection scheduled to begin in January 2018. MS1 includes a nationally representative sample of schools offering grade 6 instruction and a nationally representative sample of students enrolled in grade 6, including students whose primary Individualized Education Program (IEP) classification is Autism (AUT), Emotional Disturbance (EMN), or Specific Learning Disability (SLD) who are being educated in an ungraded setting and are age-equivalent (aged 11 to 13) to grade 6 students.

MS1 employs a multi-stage sampling design with schools selected in the first stage and students selected, within schools, at the second stage. As of the date of this submission, the school sample has been selected using probability proportional to size sampling within school sampling strata. Students will be selected from school enrollment lists collected in the fall of 2017 using simple random sampling within student sampling strata within schools.

The school frame was constructed from the 2013-14 Common Core of Data (CCD 2013-14) and the 2013-14 Private School Universe Survey (PSS 2013-14). The MS1 school population excludes the following types of schools that are included in the CCD and PSS:

  • Department of Defense Education Activity schools and Bureau of Indian Education schools,

  • alternative education schools, and

  • special education schools.1

In addition, schools included in OFT1 were excluded from the sampling frame for the Main Study, and therefore were not eligible for MS1, because the OFT2 tracking activities will be conducted in parallel with MS1. The final set of schools that participated in OFT1 was not yet known at the time of initial sampling for MS1, so the number of schools in the MS1 sampling frame could not then be precisely stated (an earlier package estimated it at around 48,000). It is now known that 48,376 schools are eligible for MS1.

The sample design calls for information on sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation is AUT, EMN, or SLD to be used in the sampling process. EDFacts data were used to determine, for each school in the sampling frame, the number of students between the ages of 11 and 13 whose primary IEP designation is AUT, EMN, or SLD. In order for schools to be sampled, sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation is AUT, EMN, or SLD must have been available.

For some schools, some of the necessary information was missing but imputation was used to include them in the sampling process; 2,971 of the 48,376 schools had sixth-grade enrollment or EDFacts focal disability counts imputed.2 For other schools with missing data, imputation is not advisable because misestimating enrollment counts may give these schools a higher probability of selection than warranted. For this reason, the following schools were excluded from the sampling frame:

  • schools that reported overall sixth-grade enrollment but did not report enrollment by race and ethnicity (n=9), and

  • schools that reported no sixth-grade enrollment3 and reported having no enrolled students between the ages of 11 and 13 in the three focal disability groups4 or did not report information on students with disabilities to EDFacts (n=1,578).

The 45,528 schools with complete (non-imputed) information in the sampling frame were explicitly stratified by the cross-classification of the following characteristics:

  • school type (public, Catholic, other private),

  • region (Northeast, Midwest, South, West), and

  • prevalence of students with disabilities (high/low).5

The prevalence indicator is defined using the number of students in two of the three focal disability groups noted above. Schools were classified as having a high prevalence of students with disabilities (i.e., high prevalence schools) if the total number of students whose primary IEP designation was AUT or EMN exceeded 17. The number of SLD students was not factored into the stratification because students classified as SLD generally far outnumber students classified as either EMN or AUT; factoring them in would have produced a threshold above which schools had very few EMN or AUT students. The number of SLD students was also excluded from the high/low prevalence determination because sufficient numbers of SLD students appear likely to be included in the sample without oversampling. The threshold of 17 was determined by identifying the 95th percentile of the total number of students whose primary designation was AUT or EMN across all 45,528 schools.
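For illustration, the short sketch below shows how such a threshold and classification could be computed. The field names and data layout are our assumptions for the sketch, not the study's production code.

    # Minimal sketch of the high/low prevalence classification, assuming
    # each school is a dict with "aut" and "emn" counts from EDFacts.
    def prevalence_threshold(schools, pct=0.95):
        """95th percentile of AUT + EMN counts across schools; the value
        computed for MS1 was 17."""
        totals = sorted(s["aut"] + s["emn"] for s in schools)
        return totals[int(pct * (len(totals) - 1))]

    def prevalence_stratum(school, threshold=17):
        """Schools above the threshold are 'high' prevalence; SLD counts
        are deliberately ignored, per the design."""
        return "high" if school["aut"] + school["emn"] > threshold else "low"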

Prior to selection of the school sample, schools were sorted within each explicit school stratum by locale (city, suburban, town, rural), school configuration (PK/KG/1-06, PK/KG/1-08, PK/KG/1-12, 05-08, 06-08, other), median income of the ZIP code in which a school resides, and school size measure, so that approximate proportionality across locales, school configurations, and median ZIP code incomes was preserved. Including the school size measure in the sort enables freshening of the school sample. The school sample was freshened in the third quarter of 2017, before the start of Base Year data collection, because schools were initially selected about a year before the start of data collection to allow sufficient time for recruitment. New schools were identified through review of preliminary versions of the 2015-16 CCD and PSS files. Newly identified schools were inserted into the sorted sampling frame so as to preserve the original sort ordering. Using a half-open interval rule,6 we identified schools to be added to the initial school sample.
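The sketch below illustrates the logic of a half-open interval freshening rule under a simplified setup (identifiers and data structures are assumptions for illustration): each frame school defines the interval from itself up to the next school in the sort order, and a newly identified school joins the sample exactly when the school whose interval it falls into was sampled.

    def freshen(sorted_frame_ids, sampled_ids, insertions):
        """Simplified half-open interval rule. insertions maps a frame
        school id to the list of newly identified school ids inserted
        immediately after it in the sort order; a new school is added to
        the sample if and only if its predecessor was sampled."""
        added = []
        for frame_id in sorted_frame_ids:
            if frame_id in sampled_ids:
                added.extend(insertions.get(frame_id, []))
        return added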

Declining response rates are a concern for any research study, and some recent school-based NCES longitudinal studies achieved response rates below the desired target of 75 percent. For example, the school response rate for the High School Longitudinal Study of 2009 (HSLS:09) was 56 percent, and the school response rate for the Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011) was 63 percent. For the first MGLS:2017 field test, the Item Validation Field Test (IVFT), the overall school participation rate was approximately 25 percent. During OFT1, 45 (35%) of the 129 eligible schools sampled participated. However, we expect a higher response rate in MS1 because of numerous differences between the two field tests and MS1. For example, the recruitment window for the IVFT was greatly compressed compared with the planned recruitment window for MS1 (3 versus 12 months, respectively). Also, the burden on schools in the IVFT was substantially higher, because the IVFT included up to all students in sixth, seventh, and eighth grades, whereas MS1 will target, on average, a student sample yield of 29 students per school. Most importantly, the two field tests involved incentive experiments that may have partially suppressed participation. The MS1 recruitment effort has included a comprehensive study awareness campaign, including presentations to state/district/school officials at national conferences, webinars explaining the study available to all interested parties, and articles in association periodicals. We have also obtained study support letters from chief state school officers, and middle-grades researchers, association personnel, and NAEP state coordinators are providing further assistance. With both the IVFT and OFT1 completed, MS1 benefits from an incentive and recruitment plan informed by the results of those field tests. Additionally, during MS1, we will use more intensive refusal conversion efforts (e.g., a higher school-level incentive offer and in-person visits to district and school officials in pending refusal districts that have one or more sample schools identified as high-prevalence schools; see section B.3 for details on OFT1 results and MS1 plans).

Nevertheless, to be conservative, the MS1 sampling plan was designed to be flexible so that the study can achieve sample size targets even if eligibility and response rates are lower than anticipated. The school sampling process was designed to achieve 900 participating schools (740 public, 80 Catholic, and 80 other private) distributed over 16 school sampling strata. We selected 3,710 schools using stratified probability proportional to size sampling, from which an initial simple random sample of 1,236 schools was selected within school strata. This subset of 1,236 schools comprises the initial set of schools that were released for recruitment in January of 2017. The remaining schools comprise a reserve sample to be released if participation among schools in the initial sample is too low to meet sample size targets. The numbers of participating schools among the 1,236 released schools are being monitored by school stratum and, if the number of participating schools in a given stratum is less than the yield goals for that stratum, then additional schools may be released for that stratum from the reserve set of schools. The reserve sample is ordered randomly within each stratum and will be released in waves by strata, as necessary, until there are 900 participating schools. This procedure will enable the study to achieve within-stratum sample size targets, given expected stratum-specific variation in eligibility and participation rates. The desired numbers of participating schools by the margins of the school stratification characteristics are shown in table 1.
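As an illustration of the wave-release logic (not the study's actual monitoring system; the expected yield rate among released schools is an analyst-supplied assumption):

    import math

    def reserve_to_release(goal, participating, reserve_queue, expected_yield):
        """Reserve schools to release for one stratum. The reserve queue is
        already randomly ordered within the stratum, so releases come from
        the front; expected_yield is the assumed participation rate among
        released reserve schools."""
        shortfall = goal - participating
        if shortfall <= 0:
            return []
        return reserve_queue[:math.ceil(shortfall / expected_yield)]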

Table 1. MS1 School Participation Goals, by School Stratification Characteristics




                                            Public   Catholic   Other private   Total
Total                                          740         80              80     900
Region
  Northeast                                    122         19              16     157
  Midwest                                      162         28              15     205
  South                                        278         19              33     330
  West                                         178         14              16     208
Prevalence of students with disabilities
  High                                         128         NA              NA     128
  Low                                          612         80              80     772

NA: Not Applicable. No explicit participation goals are established for Catholic and Other private schools with these two grade configurations. Catholic and Other private schools with school grade configurations of 05-08 and 06-08 are classified as Other configuration for the purposes of sampling. Catholic and Other private schools are all classified as Low prevalence, for purposes of sampling, as no focal disability counts are available.

The 16 school strata along with the corresponding stratum-specific participation goals, frame counts, total selected school sample (n=3,710), initial school sample (n=1,236), and reserve sample (n=2,474) are shown in table 2.

Table 2. MS1 School Sample Allocation

School Type     Census Region   Prevalence   Participation   School Frame   Total Selected   Initial School   School Reserve
                                                     Goals          Count    School Sample           Sample           Sample
Public          Northeast       High                    17            245               70               23               47
Public          Northeast       Low                    105          4,965              433              144              289
Public          Midwest         High                    35            445              144               48               96
Public          Midwest         Low                    127          8,167              524              175              349
Public          South           High                    50            578              206               69              137
Public          South           Low                    228          9,569              940              313              627
Public          West            High                    26            290              107               36               71
Public          West            Low                    152          9,554              627              209              418
Catholic        Northeast       Low                     19          1,069               78               26               52
Catholic        Midwest         Low                     28          1,697              115               38               77
Catholic        South           Low                     19            970               78               26               52
Catholic        West            Low                     14            770               58               19               39
Other Private   Northeast       Low                     16          2,060               66               22               44
Other Private   Midwest         Low                     15          2,277               62               21               41
Other Private   South           Low                     33          3,768              136               45               91
Other Private   West            Low                     16          1,952               66               22               44
Total                                                  900         48,376            3,710            1,236            2,474


The size measure used for the probability proportional to size selection of 3,710 schools was constructed using the overall sampling rates for students in the following seven student categories:

  • Autism (AUT),

  • Emotional Disturbance (EMN),

  • Specific Learning Disability (SLD),

  • Asian, non-Hispanic (non-SLD, non-EMN, non-AUT),

  • Hispanic (non-SLD, non-EMN, non-AUT),

  • Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and

  • Other race, non-Hispanic (non-SLD, non-EMN, non-AUT)

combined with the total number of students in each of those seven categories at a given school. In other words, the size measure for a given school i in school stratum h may be written as follows:

MOS_{hi} = \sum_{j=1}^{7} f_{hj} \, S_{hij}

where f_{hj} is the sampling rate for the jth student category in the hth school stratum and S_{hij} is the number of students in the jth category within school i in the hth school stratum. The sampling rate, f_{hj}, equals the number of students to sample from the jth category in the hth school stratum divided by the number of students in the jth category across all schools in the hth school stratum. The sampling rates for the seven student categories listed above will vary across the school strata; for example, a rate of 0 is used for students with Autism at Catholic schools while an overall rate of .033 is used for students with Autism at public schools. The student sampling rates by school strata are provided in table 3. Because private schools do not report focal disability counts to EDFacts, the school sampling process assumes no students in the focal disability categories are enrolled in private schools. The sampling plan does not rely on sampling focal disability students from private schools in order to achieve the desired number of participating students in each of the three focal disability categories. In practice, however, students in the focal disability categories who are enrolled in sampled private schools will be sampled.
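To make the selection mechanics concrete, the sketch below computes the size measure and draws a systematic probability-proportional-to-size sample for one stratum. The rate values, field names, and routine are illustrative assumptions, not the study's production code.

    import random

    # Hypothetical rates f_hj for one public high-prevalence stratum
    # (values illustrative; the study's actual rates are in table 3).
    RATES = {"AUT": 0.038, "EMN": 0.065, "SLD": 0.004, "Asian": 0.008,
             "Hispanic": 0.004, "Black": 0.003, "Other": 0.005}

    def size_measure(school):
        """MOS_hi = sum over the seven categories of f_hj * S_hij."""
        return sum(f * school["counts"].get(cat, 0) for cat, f in RATES.items())

    def pps_systematic(frame, n_sample):
        """Systematic PPS: lay schools end to end on a line scaled by their
        size measures, then take n_sample equally spaced points from a
        random start; a school is selected once per point in its interval."""
        total = sum(size_measure(s) for s in frame)
        step = total / n_sample
        start = random.uniform(0, step)
        points = [start + k * step for k in range(n_sample)]
        selected, cum, i = [], 0.0, 0
        for school in frame:
            cum += size_measure(school)
            while i < n_sample and points[i] <= cum:
                selected.append(school["id"])
                i += 1
        return selected

Because the frame is sorted by locale, configuration, and median ZIP code income before selection, systematic PPS selection of this kind also provides implicit stratification on those characteristics.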

Table 3. Aggregate Student Sampling Rates Used for School Selection

School Type     Census Region   Prevalence   Overall     AUT     EMN     SLD   Asian   Hispanic   Black   Other
Public          Northeast       High           0.009   0.038   0.065   0.004   0.008      0.004   0.003   0.005
Public          Northeast       Low            0.006   0.031   0.031   0.005   0.009      0.004   0.002   0.005
Public          Midwest         High           0.010   0.040   0.081   0.004   0.010      0.005   0.005   0.005
Public          Midwest         Low            0.006   0.024   0.024   0.005   0.009      0.004   0.004   0.005
Public          South           High           0.009   0.045   0.109   0.005   0.007      0.004   0.003   0.006
Public          South           Low            0.006   0.034   0.034   0.005   0.009      0.005   0.003   0.005
Public          West            High           0.010   0.052   0.122   0.004   0.009      0.004   0.004   0.006
Public          West            Low            0.005   0.025   0.025   0.005   0.008      0.004   0.003   0.005
Catholic        Northeast       Low            0.017      NA      NA      NA   0.024      0.024   0.024   0.014
Catholic        Midwest         Low            0.016      NA      NA      NA   0.022      0.022   0.022   0.015
Catholic        South           Low            0.016      NA      NA      NA   0.026      0.027   0.027   0.011
Catholic        West            Low            0.017      NA      NA      NA   0.025      0.025   0.024   0.011
Other private   Northeast       Low            0.011      NA      NA      NA   0.011      0.010   0.011   0.011
Other private   Midwest         Low            0.009      NA      NA      NA   0.010      0.009   0.009   0.009
Other private   South           Low            0.012      NA      NA      NA   0.012      0.012   0.012   0.012
Other private   West            Low            0.011      NA      NA      NA   0.011      0.011   0.011   0.011

NA: Not Applicable. Sampling rates for the three focal disability groups are not defined for Catholic and Other private schools because focal disability counts are not available for private schools.

The sampling plan was designed to produce constant weights within each of the seven student domains (autism, specific learning disability, emotional disturbance, Asian non-Hispanic (non-SLD, non-EMN, non-AUT), Hispanic (non-SLD, non-EMN, non-AUT), Black non-Hispanic (non-SLD, non-EMN, non-AUT), and other non-Hispanic (non-SLD, non-EMN, non-AUT)) within each school stratum. When weights are constant within a given student domain and school stratum, there is no increase in the design effect due to unequal weights for estimates produced for the given student domain and school stratum.
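This property can be checked with Kish's design effect due to unequal weighting, deff_w = 1 + CV^2. The sketch below is a generic illustration, not study code; it returns exactly 1.0 for constant weights, confirming no variance inflation in that case.

    def unequal_weighting_deff(weights):
        """Kish's design effect from unequal weights:
        deff_w = 1 + cv^2 = n * sum(w_i^2) / (sum(w_i))^2."""
        n = len(weights)
        return n * sum(w * w for w in weights) / sum(weights) ** 2

    assert unequal_weighting_deff([2.5] * 100) == 1.0  # constant weights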

Within participating schools, students will be stratified into the seven student categories defined above, and a simple random sample of students will be selected from each student sampling stratum. Approximately 29 students will be sampled from each of the anticipated 900 participating schools. However, the number of students sampled per student stratum will vary by school because the within-school allocation of students to strata depends upon the number of students in each of the seven student sampling strata. The process of determining the student sample allocation follows the procedure outlined in section 2 of Folsom et al. (1987).7 A simplified illustration follows.
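The sketch below is a plain proportional allocation under assumed inputs; it is NOT the Folsom et al. (1987) procedure (for example, it does not redistribute the shortfall when a stratum is capped at its roster count).

    # Simplified within-school allocation for illustration only; counts and
    # rates are dicts keyed by the seven student sampling strata.
    def allocate_students(counts, rates, total=29):
        raw = {j: rates[j] * counts[j] for j in counts}  # rate-based targets
        total_raw = sum(raw.values())
        if total_raw == 0:
            return {j: 0 for j in counts}
        scale = total / total_raw
        # Scale targets to the per-school total, capping each stratum at
        # the number of students actually on the roster.
        return {j: min(counts[j], round(raw[j] * scale)) for j in counts}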

As schools agree to participate in the study, students enrolled in grade 6 will be selected from student rosters that schools will be asked to provide. The student sample sizes were determined by the requirement that at least 782 students in each of the seven student domains8 participate in the second follow-up of MGLS:2017. That requirement reflects the minimum sample size needed to detect a relative change of 20 percent in proportions between any pair of the MGLS:2017 study rounds (MS1 in 2018, the first follow-up, and the second follow-up). Several assumptions, noted below, were used to conduct this evaluation.

  • Two-tailed tests with significance of alpha = 0.05 were used to test differences between means and proportions with required power of 80 percent.

  • A proportion of p = .30 was used to calculate sample sizes for tests of proportion.

  • Design effect is 2.0.

  • Correlation between waves is 0.6.

McNemar’s test using Connor’s approximation was used to determine the minimum sample size needed to meet the precision requirement under the previously stated assumptions. The Proc Power procedure available in SAS software9 was used to determine the minimum sample size.
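As an independent check on this figure, the sketch below reproduces the sample-size target using Connor's (1987) approximation under the stated assumptions. It is an illustration in Python, not the SAS PROC POWER code the study used, and small rounding differences from PROC POWER are expected.

    from math import ceil, sqrt
    from statistics import NormalDist

    def connor_mcnemar_n(p1, p2, rho, alpha=0.05, power=0.80):
        """Minimum number of pairs for McNemar's test via Connor's (1987)
        approximation; discordant-cell probabilities are derived from the
        two marginal proportions and the between-wave correlation."""
        p11 = p1 * p2 + rho * sqrt(p1 * (1 - p1) * p2 * (1 - p2))
        p10, p01 = p1 - p11, p2 - p11          # discordant cells
        pd, d = p10 + p01, p01 - p10
        za = NormalDist().inv_cdf(1 - alpha / 2)
        zb = NormalDist().inv_cdf(power)
        return (za * sqrt(pd) + zb * sqrt(pd - d * d)) ** 2 / d ** 2

    # p = .30 with a 20 percent relative change (.30 -> .36), correlation .6:
    n_pairs = connor_mcnemar_n(0.30, 0.36, rho=0.60)   # about 390 pairs
    n_required = ceil(2.0 * n_pairs)                   # design effect 2.0 -> ~781,
    # in line with the study's target of 782 respondents per domain.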

The minimum numbers of students to sample from each of the seven student categories in the 2018 MS1, along with the assumptions used to derive those numbers, are provided in table 4.

Estimates of the minimum number of students to sample in the 2018 MS1 were derived by adjusting the 782-respondent target to account for a variety of factors, including estimated student response rates in grades 6, 7, and 8; the extent to which MS1 participating schools agree to participate in the first and second follow-up studies; and the extent to which students are expected to change schools between grades 6 and 7 and between grades 7 and 8.

Because of different assumptions regarding student response rates and mover rates, the number of grade 6 students to sample varies across the student categories. In order to achieve the required minimum of 5,474 grade 8 respondents, with 782 respondents in each of the seven student categories, a total of 13,987 students must be sampled in grade 6. Following the assumptions specified in table 4, we estimate that 10,334, or approximately 74 percent, of a sample of 13,987 students would respond in grade 6. This minimal total sample of 13,987 students, however, employs a substantial oversampling of students in two of the three focal disability categories and a substantial undersampling of Hispanic (non-SLD, non-EMN, non-AUT), Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT) students. In order to reduce the impact of disproportionate sampling on national estimates and estimates that compare or combine estimates across student categories, the sample sizes for the Hispanic (non-SLD, non-EMN, non-AUT), Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT) student categories were increased.

Therefore, for MS1, the plan is to sample 29 students, on average, within each of 900 participating schools for a total of 26,100 sample students and, assuming the grade 6 eligibility and response rates shown in table 4, to produce approximately 20,322 participating grade 6 students. The distribution of the grade 6 student sample and estimates of the number of participating students in each of grades 6, 7, and 8 are provided in Table 5.

Table 4. Minimum Sample Sizes and Associated Sample Design Assumptions for Student Sampling Categories

Assumption                               Each non-focal disability     SLD     AUT     EMN
                                         student category
Grade 6 inflated student sample size                         1,509   2,455   2,748   2,748
Grade 6 student eligibility rate                               97%     97%     97%     97%
Grade 6 student response rate                                  85%     75%     67%     67%
Grade 7 inflated student sample size                         1,244   1,786   1,786   1,786
Grade 7 school retention rate                                  96%     96%     96%     96%
Grade 6 to 7 move rate                                         30%     30%     30%     30%
Grade 7 mover follow rate                                      80%    100%    100%    100%
Grade 7 non-mover response rate                                92%     75%     75%     75%
Grade 7 mover response rate                                    60%     45%     45%     45%
Grade 8 inflated student sample size                           941   1,132   1,132   1,132
Grade 8 school retention rate                                  96%     96%     96%     96%
Grade 7 to 8 move rate                                         15%     15%     15%     15%
Grade 8 mover follow rate                                      80%    100%    100%    100%
Grade 8 non-mover response rate                                92%     75%     75%     75%
Grade 8 mover response rate                                    70%     55%     55%     55%
Grade 8 minimum number of respondents                          782     782     782     782

Table 5. Final Student Sample Sizes and Expected Minimum Student Participation by Grade

                                      ------ non-SLD, non-EMN, non-AUT ------
Assumption                            Hispanic    Asian    Black    Other      SLD      AUT      EMN    Total
Grade 6 inflated student sample size     3,786    1,509    1,868   10,986    2,455    2,748    2,748   26,100
Grade 6 student eligibility rate           97%      97%      97%      97%      97%      97%      97%
Grade 6 student response rate              85%      85%      85%      85%      75%      67%      67%
Grade 6 expected participants            3,122    1,244    1,540    9,058    1,786    1,786    1,786   20,322
Grade 7 school retention rate              96%      96%      96%      96%      96%      96%      96%
Grade 6 to 7 move rate                     30%      30%      30%      30%      30%      30%      30%
Grade 7 mover follow rate                  80%      80%      80%      80%     100%     100%     100%
Grade 7 non-mover response rate            92%      92%      92%      92%      75%      75%      75%
Grade 7 mover response rate                60%      60%      60%      60%      45%      45%      45%
Grade 7 expected participants            2,361      941    1,165    6,852    1,132    1,132    1,132   14,715
Grade 8 school retention rate              96%      96%      96%      96%      96%      96%      96%
Grade 7 to 8 move rate                     15%      15%      15%      15%      15%      15%      15%
Grade 8 mover follow rate                  80%      80%      80%      80%     100%     100%     100%
Grade 8 non-mover response rate            92%      92%      92%      92%      75%      75%      75%
Grade 8 mover response rate                70%      70%      70%      70%      55%      55%      55%
Grade 8 expected participants            1,963      782      969    5,697      782      782      782   11,757

Note: SLD = Specific Learning Disability; AUT = Autism; EMN = Emotional Disturbance. The non-focal disability columns are Hispanic (non-SLD, non-EMN, non-AUT); Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).
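To show how the expected participant counts in tables 4 and 5 follow from the assumed rates, the sketch below chains the rates forward for the AUT category; rounding at each grade reproduces the tabled values (2,748 sampled; 1,786, 1,132, and 782 expected participants in grades 6, 7, and 8). The function names are ours; the rates are taken directly from the tables.

    def grade6_participants(n_sampled, eligibility, response):
        """Grade 6 participants = sampled * eligibility rate * response rate."""
        return round(n_sampled * eligibility * response)

    def next_grade_participants(n_prev, retention, move_rate, follow_rate,
                                stayer_response, mover_response):
        """Grades 7 and 8: prior participants * school retention rate *
        [stayer share * stayer response + mover share * follow * response]."""
        stayers = (1 - move_rate) * stayer_response
        movers = move_rate * follow_rate * mover_response
        return round(n_prev * retention * (stayers + movers))

    g6 = grade6_participants(2748, 0.97, 0.67)                       # 1,786
    g7 = next_grade_participants(g6, 0.96, 0.30, 1.00, 0.75, 0.45)   # 1,132
    g8 = next_grade_participants(g7, 0.96, 0.15, 1.00, 0.75, 0.55)   # 782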

MS2 Samples

The MS2 student sample will consist of those among the estimated 26,100 students sampled for MS1 who remain eligible for the study in the 2018-19 school year. Sample members ineligible for MS2 include MS1 sample members who are deceased prior to MS2 and MS1 nonrespondents found not to have been enrolled in sixth grade as of fall 2017. As noted in table 5, we estimate that 97 percent (25,317) of the planned 26,100 MS1 student sample members will be eligible for MS2.

The MS2 school sample consists of the planned 900 MS1 participating schools combined with an estimated 450 non-base-year transfer schools at which one or more sample students will be enrolled as of MS2.

OFT2 Samples

A stratified random sample of 135 schools was selected for OFT1, and 45 schools participated. The OFT1 school sample was selected using a two-stage selection process that followed the process outlined for the MS1 sample, with some differences in the school sampling strata. The school sampling frame for OFT1 was constructed from the MS1 school sampling frame by including only schools in one of ten metropolitan statistical areas (MSAs). Schools within each MSA were stratified into high and low prevalence strata using the same methodology employed for the MS1 school stratification. The school sample size, the number of sampled schools determined to be ineligible, and the number of schools that participated are provided in table 6. The participation rate among eligible schools was 34.9 percent (45/129).

Table 6. OFT1 School Sample Disposition

School Region   Prevalence        School Frame Count   Sample Size   Ineligible Schools   Participating Schools
A               High Prevalence                   76            10                    0                       4
A               Low Prevalence                   395             7                    0                       3
B               High Prevalence                   22            12                    0                       2
B               Low Prevalence                   293             7                    2                       1
C               High Prevalence                    1             1                    0                       0
C               Low Prevalence                    87            11                    0                       5
D               High Prevalence                    5             5                    0                       0
D               Low Prevalence                   170             7                    1                       3
E               High Prevalence                    5             5                    0                       0
E               Low Prevalence                   305             8                    0                       2
F               High Prevalence                   40             7                    0                       4
F               Low Prevalence                   566             6                    0                       3
G               High Prevalence                   14             9                    0                       2
G               Low Prevalence                   572             6                    0                       3
H               High Prevalence                   12            11                    0                       1
H               Low Prevalence                   497             6                    3                       0
I               High Prevalence                    4             4                    0                       1
I               Low Prevalence                   148             6                    0                       5
J               High Prevalence                    2             2                    0                       2
J               Low Prevalence                    87             5                    0                       4
Total                                          3,301           135                    6                      45


Stratified simple random samples of students were selected within each of the 45 OFT1 participating schools. A total of 1,739 students was sampled; 1,693 were eligible, and 1,294 of those eligible participated, for a 76.4 percent participation rate. The sample size, numbers of eligible and ineligible students, and number of participating students are provided in table 7.

Table 7. OFT1 Student Sample Disposition

Student Group                  Student Sample Size   Ineligible Students   Eligible Students   Participating Students
Autism                                          98                     2                  96                       47
Emotional Disturbance                           63                     2                  61                       33
Specific Learning Disability                   200                     6                 194                      147
No Key Focal Disability                      1,378                    36               1,342                    1,067
Total                                        1,739                    46               1,693                    1,294


The OFT2 student sample will consist of approximately 1,120 students from among the 1,294 who participated in OFT1 and remain eligible for the study in the 2018-19 school year (for cost reasons some OFT1 students will not be followed). The OFT2 school sample consists of the 45 OFT1 participating schools combined with an estimated 30 non-base-year transfer schools at which one or more students from the OFT2 sample will be enrolled as of OFT2.

OFT2 will also include about 400 participating grade 8 students to calibrate the mathematics and reading assessment items for the Main Study Second Follow-up (MS3) without the need for another costly field test. Participating OFT2 schools will be asked to provide access to high-level grade 8 mathematics classes for this calibration. Classrooms will be chosen as a convenience sample to achieve a yield of about 400 of the highest-ability grade 8 students.

B.2 Procedures for the Collection of Information

MGLS:2017 will rely on a set of complementary instruments to collect data from several types of respondents, providing information on the outcomes, experiences, and perspectives of students; on their families and home lives; on their teachers, classrooms, and instruction; and on the school settings, programs, and services available to them. These instruments will be used when the students are in grades 6, 7, and 8 to allow for the analysis of change and growth across time. At each round of data collection in MS1, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked through an online survey or over the telephone about their background, family resources, and involvement with their child’s education and school. Students’ mathematics teachers will complete a two-part survey. In part 1, they will be asked about their background and classroom instruction. In part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom(s). For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and student-level questionnaire, but with questions specific to the special education experiences of and services received by the study child. School administrators will be asked to report on school programs and services, as well as on school climate. Finally, a facilities observation checklist, consisting of questions about the school buildings, classrooms, and campus security, will be completed by field data collection staff.

The OFT1 school recruitment and data collection approaches, which inform those of MS1, were fully described in the previous clearance submission (OMB# 1850-0911 v. 10-15). Below, the methodological descriptions focus on all aspects of MS1 and OFT2 (for which tracking and recruitment have been approved in OMB# 1850-0911 v. 10-15), and student tracking and school recruitment for MS2.

MS1 School Recruitment Approach

Gaining schools’ cooperation in voluntary research is increasingly challenging. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,10 whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.11 Additionally, there is evidence that response rates may be lower for schools that serve older students, as in the High School Longitudinal Study of 2009, which had a weighted school-level response rate of 56 percent.12 Therefore, effective strategies for gaining the cooperation of schools are of paramount importance. Recruitment activities began about one year prior to the start of data collection, in February 2017. Attempts were made to solicit a letter of endorsement from the respective state education agencies to include in recruitment materials sent to districts and schools. Schools will then be recruited both directly and at the district level.

State Endorsement. To encourage district and school participation in the study, their respective state education agencies have been contacted to inform them about the study and to request a letter of endorsement (appendix MS1-B). The state testing coordinator and, where applicable, the middle grades coordinator at the state level were copied on the state letter. Within 3 days of sending the letter to the state, senior recruitment staff contacted the state superintendent, state testing coordinator, and, where applicable, the middle grades coordinator to discuss and secure support for the study. Endorsement letters received from a state are included in all mailings to districts and schools within that state.

School Districts and Diocesan Recruitment. After state contacts have been completed, whether or not an endorsement letter was received, school districts of sample public schools and dioceses of sample Catholic schools (if district or diocese affiliation exists) receive a mailing about the study. The district introductory information packet includes a cover letter (appendix MS1-C), a colorful recruitment-oriented brochure (appendix MS1-H), and a sheet of Frequently Asked Questions (FAQs) about the study (appendix MS1-I). Three days after mail delivery of the packet, a recruiter calls to secure the district’s cooperation and answer any questions the superintendent or other district staff may have. The staff person working with us from the school district is asked to sign an affidavit of nondisclosure prior to receiving the list of schools sampled in the district. Once the signed nondisclosure affidavit is received, we discuss the sampled schools, confirm key information about the schools (e.g., grades served, size of enrollment, enrollment of students with disabilities), and discuss obtaining the students’ IEP information that is necessary for drawing the MS1 student sample. Information collected during this call is used to confirm which schools in the district are eligible for participation in the study, and to obtain contact and other information helpful in school recruitment.

The study staff are prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter documents all concerns listed by the district so that a strategy can be formulated for refusal conversion attempts.

In addition to obtaining permission to contact the selected schools, districts are also asked about the best way to gather student rosters to identify students in the three focal disability categories at the selected school(s) in the district to enable MGLS:2017 staff to recruit enough students in the three focal disability categories.

Recruitment of Public and Catholic Schools. Upon receipt of district or diocesan approval to contact the sample public or Catholic schools, respectively, an introductory information packet is sent via overnight express courier that includes a cover letter (appendix MS1-D) and the same colorful recruitment-oriented brochure (appendix MS1-H) and sheet of Frequently Asked Questions (FAQs) about the study (appendix MS1-I) that were sent to school districts and dioceses, with links for accessing the MGLS:2017 recruitment website. Three business days after the information packet delivery (confirmed via package tracking), a school recruiter follows up with a phone call to secure the school’s cooperation and answer any questions the school may have. During this call, the recruiter establishes who from the school’s staff will serve as the school coordinator for the study. In the fall of 2017, the MGLS:2017 study team will then work with the school coordinator to schedule MS1 activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the onsite assessments. In early communications, the recruiter will also gather information about what type of parental consent procedures need to be followed at the school; any requirements for collecting data on students’ IEP status and on students’ teacher and math course information; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact the scheduling of student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.

Private and Charter School Recruitment. If a private or charter school selected for MS1 operates under a higher-level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher-level governing body. If a private or charter school selected for MS1 does not have a higher-level governing body, the school recruitment approach outlined above will be used.

Collection of Student Rosters. Beginning in the fall of 2017, data collection staff will gather student rosters for schools that have agreed to participate in the study. These rosters will be collected from the district or directly from the school with the assistance of the school coordinator from the school. A complete roster of all students eligible for sampling will be requested, and information will be requested for each student on key student characteristics, such as: name; school or district ID number; month and year of birth; grade level; gender; race/ethnicity; and IEP status with disability code(s), when applicable. Each of these characteristics is important for sampling purposes, but we will work with schools that are unable to provide all of the information to obtain the key information available. Based on this information the student sample will be drawn. As part of the roster collection, the study will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: student’s parent and/or guardian contact information (e.g., mailing address; landline phone number; cell phone number; e-mail address); student’s math teacher (including course name and period or section number); and student’s special education teacher, when applicable. Schools and districts usually find it easiest, and therefore most efficient, to supply all of the desired information one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, the data collection team will gather that information as a second step for the sampled students only. If the school and/or district is unwilling to provide parent contact information for the sampled students, the team will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or mail the required materials to parents using the contact information they have on file).

Schools and districts will be provided with a template and secure transfer options to deliver the rosters (see appendix MS1-S and T for student rostering materials). The data quality of the student rosters will then be evaluated by (a simplified sketch of such checks follows the list):

  • reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;

  • reviewing and assessing the quality of the data on student-teacher linkages;

  • reviewing and assessing the quality of the data on IEP status;

  • addressing any incompleteness or irregularities in the roster file;

  • requesting additional information as needed from the school coordinator or designated district personnel; and

  • (re)verifying that the sampled students are currently in attendance in the school.
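The sketch below illustrates the kind of automated roster check implied by this list; the field names and rules are illustrative assumptions, not the study's actual processing system.

    # Key fields requested on each roster row (names assumed for the sketch).
    REQUIRED = ["name", "student_id", "birth_month_year", "grade",
                "gender", "race_ethnicity", "iep_status"]

    def roster_problems(roster):
        """Flag rows with missing key fields or duplicate student IDs so
        follow-up requests to the school coordinator can be targeted."""
        problems, seen = [], set()
        for i, row in enumerate(roster):
            missing = [f for f in REQUIRED if not row.get(f)]
            if missing:
                problems.append((i, "missing: " + ", ".join(missing)))
            sid = row.get("student_id")
            if sid in seen:
                problems.append((i, "duplicate student_id %s" % sid))
            seen.add(sid)
        return problems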

Parent Recruitment. Information about schools’ procedures for obtaining consent for students to participate in the study will have been gathered during school recruitment. Schools generally require one of two types of consent: implicit or explicit (appendix MS1-F). Both types of consent require that parents be notified that their children have been selected for the study. With implicit consent, the school does not require verbal or written consent for a student to participate in the study – parents are asked only to notify the appropriate person if they do not want their child to participate. With explicit consent, children may participate only if their parents provide written or oral consent for their children to do so. In MS1, proactive parent recruitment will be focused on maximizing the number of parents (1) returning signed explicit consent forms and (2) completing the parent survey. Because implicit consent does not require a verbal or written response from parents, these parents will not be contacted about consent forms.

After the student sample is drawn within a school, the initial communication with parents consisting of introductory and consent materials will be distributed to parents in a way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or trained MGLS:2017 recruitment staff contacting parents directly by mail, email, and/or phone). The initial materials will introduce the study, explain the study’s purpose and the importance of student and parent participation, describe what is involved in participation, and specify the consent procedure that is being used by their school. The materials will include a consent seeking letter to all parents plus a consent form of the type specified by the school (appendix MS1-F), a colorful recruitment-oriented brochure (appendix MS1-H), and a sheet of FAQs about the study (appendix MS1-I) with links for accessing the MGLS:2017 recruitment website (website text in appendix MS1-J). Additionally, in schools using explicit consent, the parental consent form for student’s participation, which will be included in the initial communication materials, will ask parents to provide their contact information.

Parents will also receive an invitation to participate in the parent questionnaire (appendix MS1-G). Parent data collection will entail web-based self-administration with nonresponse follow-up by computer-assisted telephone interviewing (CATI). Parents who do not participate between January and mid-June 2018 will be offered an abbreviated version of the online questionnaire which can be completed online or by CATI. As a last attempt at gaining parent participation, about three weeks prior to the end of data collection, a 1-page (front and back) paper-and-pencil mini version of the survey will be mailed to nonresponding parents with a postage-paid envelope to return the completed survey.

All printed parent communication materials will be provided to all parents in both English and Spanish (printed front/back), including parent permission letters (requesting approval or disapproval of the student’s participation in the study), data collection letters (asking parents to take part in the parent survey), and the Quick Facts sheet about MGLS:2017 (appendices MS1-F2, MS1-F4, MS1-G1a, MS1-G2a, MS1-G3a, and MS1-H1). Additionally, when parents log in to respond to the survey, they will be shown a button allowing them to switch to the Spanish version of the questionnaire. This option will be available in all three versions of the parent survey (full, abbreviated, and mini; appendices MS1-U2b and MS1-U2d). With regard to emails asking parents to take part in the parent survey, any parent who is marked as needing Spanish in CATI or who returned the permission form in Spanish will be sent the initial invitation email (appendix W.3.a) in Spanish. The reminder emails and the thank-you email will be sent to all parents in English only.

MS1 Data Collection Approach

MS1 will include student surveys and direct assessments, as well as surveys for students’ parents, math teachers, special education teachers or service providers (as applicable), and school administrators. The student surveys and direct assessments will take place in the school setting and be administered using Chromebooks (tablet-like computers with touchscreen capability and an attached keyboard) brought into the school by MGLS:2017 staff. This portion of data collection is referred to as the student session. To administer the survey and direct assessment, study staff will work with schools to identify and utilize locations for administration that minimize distractions for the student and disruption to the school routine. The parent, mathematics teacher, special education teacher, and school administrator surveys will have an internet option and a telephone option, so respondents will have the choice to complete the survey in a variety of school and non-school settings. Initially, the surveys will be released in internet-based form. To access the internet-based surveys, parents, teachers, and school administrators will receive an email with links and instructions for completing the survey (described in more detail below). To better understand the differences between participating schools and schools that have refused participation and how these differences may relate to nonresponse bias in MS1, MGLS:2017 will field the school administrator survey to both participating schools and schools that have declined participation.

Planning for the School Data Collection Visit. As noted above, during the recruitment of the school, MGLS:2017 recruitment staff will work with the school to identify a school staff person to serve as a school coordinator for MGLS:2017. This school coordinator will provide a student roster in the fall or winter before the student session to facilitate student sampling. About 4 weeks prior to the scheduled student session, the school coordinator will receive a list of students sampled for the study and copies of the parental permission forms to send home with the sampled students.

Prior to the data collection visit, a trained Session Facilitator (SF) will work with the school coordinator to verify that the students selected for the sample are still enrolled, and, if not already provided, to identify each student’s mathematics teacher and, if applicable, the student’s special education teacher or person who provides special education services to the student. The SF will also work with the school coordinator to establish the following:

  • The schedule for data collection (i.e., days the study will be collecting data in the school, start time and end time of the school day, bell schedule for the transition between classes, and window of time during which students will be assessed during the school day);

  • Any accommodations that may be needed for students, particularly those with IEPs;

  • The WIDA ACCESS™ or equivalent score for ELL students, to determine their capability of participating in English;

  • A location in the school setting to accommodate the data collection (determining the optimal space for the in-school student session);

  • A plan for distributing permission forms and tracking response;

  • The required logistical information for field staff entering the school (e.g., security and parking procedures); and

  • The school’s preferred protocol for having students arrive at and return from the study space (e.g., this may involve field staff going to classrooms to collect students, students being notified by their teacher to report to the study space, and/or students returning to a specific class on their own when finished with the assessment and survey).

The SF will visit the school prior to the student session to ensure that the logistics are arranged in preparation for a successful data collection. While at the school, the SF will complete a school facilities observation checklist (estimated to take the SF 45 minutes on average to complete), the completion of which does not require the involvement of any school staff (if required by the school, a school staff person may accompany the SF while the checklist is being completed). The SF will complete the checklist on paper, then enter the information online. This approach is less intrusive than carrying a laptop through the school for data entry of each individual element. The facilities checklist gathers information such as: the general upkeep of the school, including the presence of graffiti, trash, or broken windows (inside classrooms, entrance and hallways, and restrooms); noise level; security measures; and the general condition of the neighborhood/area around the school (Appendix MS1-V). This checklist may be completed during the pre-session visit and/or on the day of the student session.

Student Survey and Assessments. Student surveys and direct assessments will be administered in 90-minute group sessions during the school day. The SF will be responsible for administering the student session. A second SF or a session facilitator assistant (SFA) may accompany the SF to help with equipment setup and to conduct a second student session if more than one student session is scheduled concurrently. The student session will generally be carried out as follows:

  • The SF (and SFA, if applicable) will arrive at the school on the appointed day of assessments, 90 minutes prior to when the first student session begins, following all school security protocols for entering the school and seeking out the school coordinator who will be expecting the study team field staff per the arrangements made during the planning of the data collection visit;

  • The SF (and SFA, if applicable) will be escorted by school staff to the designated location for the student session;

  • The SF (and SFA, if applicable) will bring an independently functioning mobile testing lab to the school for the student session and will set up the equipment and space, verifying that the tablet computers are in working order and setting them to the appropriate start screen; and

  • Once students arrive in the designated student session space, the SF provides a brief introduction and will ask the students to put on a pair of earbuds provided by the study. The students will view a video to introduce the student session, provide information about their participation in the study, and help the students log in to begin (appendix MS1-D3). If required by the school, the students will be asked to sign an assent form (appendix MS1-D4).

  • MS1 students will complete a survey, a mathematics assessment, a reading assessment, and an executive function assessment (Hearts and Flowers, described in appendix MS1-M), and will have their height and weight collected.

Accommodations will be provided to students who need them to the greatest extent possible. As previously mentioned, the SF will work with the school coordinator to determine any accommodations that may need to be provided. Possible accommodations include, but are not limited to, small group student sessions, one-on-one-student sessions, and having an aide present during the student session. Each screen in the student session, except for the reading assessment screens, will have a button that the students may press to have the item read aloud to them. The read-aloud button will be accessible to all students, and use will be encouraged for students who require a read-aloud accommodation. Students who require read-alouds will not participate in the reading assessment.

English Language Learners who do not participate in state assessments will be excluded from the sixth-grade student session; however, data will be collected from their parents, teachers, and school administrators.

Parent Questionnaire. The parent questionnaire is expected to take an average of 40 minutes to complete and will feature a multi-mode approach, with self-administered internet-based questionnaires and a telephone interview follow-up for respondents not completing the questionnaire online. The instrument will be available in both English and Spanish.

At the time of student recruitment, parents will have been informed of their participation in the study and of any incentive they might receive (see section A.9 in Part A for information on incentives), and parent contact information will have been collected through the school or on the consent form (appendix MS1-F). The parent data collection will generally be carried out as detailed below.

  • Parent respondents will receive a letter and/or an email (appendix MS1-G) that announces the launch of the survey and provides a link to the online instrument.

  • Upon completion of the survey, parents will receive a thank you letter and incentive.

  • For nonresponding parents, follow-up prompting will include reminder emails, letters, and/or postcards with information repeating the instructions on how to access the survey.

    • Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

    • For parents of students with emotional disturbance (EMN; one of the focal disability groups for the study), telephone prompting will begin approximately 10 days after the initial contact is made to encourage participation right away. In OFT1, parents of students with EMN – an oversampled group of analytic importance – participated at a lower rate than did other OFT1 parents.

  • All other parents will begin receiving telephone prompting approximately 20 days after the initial contact. The study team interviewer placing the telephone call may offer to complete the survey with the parent over the phone at that moment or schedule a time to follow up with the parent and complete the survey later.

  • About five weeks prior to the end of the data collection period, all parents who have not yet participated will be invited to complete an abbreviated version of the parent survey, approximately 10 minutes in length, either online or by phone. The abbreviated survey includes a subset of items from the full parent survey. Items were selected based on their importance to researchers, and cover household composition, parent education level and background, household languages, student health, and employment and income information.

  • About three weeks prior to the end of the data collection period, all nonresponding parents will receive a mini parent survey, which is designed to be a one-page (front and back) paper-and-pencil survey. It will be mailed in the final three weeks of data collection to all parents who have not completed either the full or the abbreviated survey. The mini survey includes only the most critical items from the parent survey covering household composition, household income, and parents’ education level and employment status. As noted in Part C (Part C.4.2.1), the relevant parent questionnaire items were reworded or otherwise adapted to better fit the paper and pencil format. The mini parent survey is expected to take about five minutes and will come with a postage-paid envelope for returning the completed survey.

Mathematics Teacher and Special Education Teacher Surveys. The mathematics teacher and special education teacher/provider surveys are internet-based, self-administered surveys. The math teacher survey consists of questions for the teacher about: (1) her/himself; (2) the class(es) taught by the teacher which include at least one study sample member (these questions repeat for each separate class); and (3) each study child, referred to as teacher student reports (TSRs). The special education teacher/provider survey consists of questions for the teacher about: (1) her/himself and (2) each study child (TSRs). For mathematics teachers, the combination of teacher questions and class questions is expected to take approximately 20 minutes to complete, and the TSR is expected to take approximately 10 minutes to complete for each student. For special education teachers, the teacher portion is expected to take approximately 10 minutes to complete, and the TSR is expected to take approximately 25 minutes to complete for each student. The mathematics and special education teacher survey data collection will generally be carried out as follows.

  • For all students, their mathematics teacher, and for students identified as having an IEP, their special education teacher or the person who provides special education services will receive a letter and/or email (appendix MS1-E) announcing the launch of the survey, and providing them with a link to the survey.

  • Upon completion of the survey, teachers will receive a thank you letter and incentive.

  • While at the school to conduct the student sessions, SFs will leave hand-written notes in the teachers’ mailboxes reminding them to complete their survey and thanking them if they have already participated.

  • For nonresponding mathematics and special education teachers, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For teachers who have not completed their internet-based surveys after approximately 2-3 weeks, additional follow-ups will be used, including but not limited to, a telephone call encouraging teachers to complete their surveys.

School Administrator Questionnaire. The school administrator questionnaire will be web-based. It will take the administrator (generally, the principal or principal’s designee) approximately 40 minutes to complete. The school administrator data collection will generally be carried out as follows.

  • School administrators will receive a letter and/or an email announcing the launch of the survey with a link to the survey.

  • Upon completion of the survey, school administrators will receive a thank you email (appendix MS1-D2) including information about how the school will receive its incentive.

  • While at the school to conduct the student sessions, SFs will ask to meet with the school administrator to thank him or her for the school’s participation and remind the administrator to complete the survey if he or she has not done so already. SFs unable to meet with the school administrator personally will leave hand-written notes in the school administrator’s mailbox, reminding the administrator to complete the survey or thanking him or her if he or she has already participated.

  • For nonresponding school administrators, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For school administrators who have not completed their internet-based survey after approximately 2-3 weeks, additional follow-ups will be used, including but not limited to, a telephone call encouraging school administrators to complete their survey or an offer to complete an abbreviated version of the survey. All administrators who have not participated will be offered the abbreviated version 3 weeks prior to the end of data collection.

Nonparticipating School Administrator Questionnaire

To better understand the differences between participating schools and schools that have refused participation and how these differences may relate to nonresponse bias, administrators or a designee from nonresponding schools will be asked to complete a 20-minute survey about their school characteristics. This survey will be an abbreviated version of the school administrator survey for participating schools.

  • School administrators from nonparticipating schools will receive a letter and/or email inviting them to participate (appendix MS1-E3).

  • For nonresponding school administrators, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For school administrators who have not completed their internet-based survey after approximately 2-3 weeks, additional follow-ups will be used, including emails and telephone calls encouraging school administrators to complete their survey.

OFT2 Tracking and Data Collection

The tracking and recruiting protocols and materials for OFT2, which were previously approved (OMB# 1850-0911 v.13-15), are described below. The tracking and recruiting activities lead to the first follow-up data collection (OFT2) with the OFT1 sample, which is scheduled to be conducted between January and May 2018, at the same time as data collection for MS1. OFT2 data collection activities will include a 75-minute student session and a 40-minute school administrator survey. For OFT2, we anticipate students will fall into one of four broad categories of enrollment:

  • OFT2 returning schools: For schools whose grade configuration extends beyond grade 6, we anticipate that most students will continue to be enrolled at their grade 6 school the subsequent year, when most of the OFT sample will have advanced into grade 7. Base Year returning schools may include students who were held back and are still in grade 6 during the second year of the study.

  • OFT2 destination schools: We anticipate that students from schools ending in grade 6 will move en masse to schools in the district that begin at grade 7 (destination schools).

  • OFT2 transfer schools: We anticipate that some students will have transferred to schools that were not Base Year schools or the designated destination school.

  • Other: In addition, we anticipate that some students may no longer be enrolled in school (e.g., home schooling, virtual schools, other circumstances).

In-school data collection activities during OFT2, including the request for the administrator survey, will take place in returning schools, destination schools, and transfer schools attended by four or more sampled students. MGLS:2017 students not attending a school with four or more sampled students, and students no longer enrolled in school, will be asked to participate outside of school via a self-administered Web session.

This section describes tracking activities and the data collection approach for in-school sessions and out-of-school Web data collection for students, and data collection procedures for school administrators.

OFT2 Tracking. Tracking will occur for those students in the sample for whom data were collected from the student, parent, math teacher, or special education teacher during the grade 6 OFT1 collection. The planned three-tiered approach to tracking the MGLS:2017 OFT1 sample will include an enrollment status update at the school level, panel maintenance activities with parents, and database tracing.

School Enrollment Status Update. The purpose of the school enrollment status update is to check the enrollment status of the sampled students in each OFT1 school in the fall of the school year before the planned spring OFT2 data collection. We anticipate that many of the students will continue to be enrolled in the school they attended during OFT1, while others will have advanced to a destination school (if their school ended in grade 6), transferred to a new school, or moved into another circumstance, such as home schooling. OFT1 schools will be asked to provide enrollment status information in the fall of 2017 in advance of the planned follow-up data collection in the winter/spring of 2018. Collecting this information is necessary to maintain current records and to gather the information from the school while it is still available.

The schools that participated in the 2017 OFT1 will be asked to review the list of eligible sampled students from OFT1. For those who have left the school, we will ask schools to provide the students’ last date of attendance, current school status (transfer, home schooling, etc.), last known address and phone number, and, for transfer students, the name, city, and state of the student’s new school if they are known. We anticipate that it will take 20 minutes, on average, to provide this information through a secure website set up for this purpose.

To initiate this contact, the school principal from each school will receive a lead letter that explains the purpose of the planned follow-up field test and that includes a user name, password, and secure website address. Appendix OFT2-B contains the letter to be sent to sampled schools. The letter will prompt the principal or designee to log into the study website. Upon logging in, the principal or designee must confirm that he or she is the intended recipient of the letter by answering an identity verification question, such as “What is the school phone number?”, and then reset the password for the account. There is no access to any information until the password has been reset to a strong password; a test of the password’s strength is built into the password change application. The user then proceeds to a screen to verify the current enrollment of sampled students and provide any updated information he or she may have on MGLS:2017 students who are no longer enrolled. Appendix OFT2-C includes the instructions to users, and Appendix OFT2-D provides sample screenshots of the enrollment list update application.

If a user has to stop and resume the update later, he or she must log in with the new password he or she created. If the user forgets the new password, he or she must contact the MGLS:2017 help desk to have it reset.
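This document does not specify the strength criteria applied by the password change application. The sketch below is a minimal illustration of the kind of length and character-class rules such a test might apply; the function name and the specific rules are assumptions for illustration only.

```python
import re

def is_strong_password(password: str) -> bool:
    """Illustrative strength test; the actual MGLS:2017 criteria
    are not specified in this document (rules here are assumed)."""
    checks = [
        len(password) >= 8,                                # minimum length
        re.search(r"[A-Z]", password) is not None,         # an uppercase letter
        re.search(r"[a-z]", password) is not None,         # a lowercase letter
        re.search(r"\d", password) is not None,            # a digit
        re.search(r"[^A-Za-z0-9]", password) is not None,  # a special character
    ]
    return all(checks)

# A weak password is rejected; a longer, mixed-class password passes.
assert not is_strong_password("school123")
assert is_strong_password("Mgls#2017ok")
```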

A follow-up email will be sent two weeks after the lead letter to all nonrespondents. Enrollment status update nonrespondents will be categorized into two groups:

Group One: Those who have not changed their password or initiated the process at all. They will receive an email with the same study ID, password, and URL as the lead letter, prompting them to change the password and initiate the enrollment update process.

Group Two: Those who have started the update but have not “submitted” it. They will receive an email prompting them to continue and reminding them that, if they have forgotten their password, they can contact the help desk to have it reset.

After the two-week period, the recruitment team will begin to contact the school via telephone to follow up on the enrollment status update and to begin to coordinate the logistics of the in-school student data collection for the sampled students who remain at the school. The OFT2 data collection is scheduled to begin in January 2018.

As the enrollment status updates are received and processed, students who are no longer attending the Base Year school will be identified. Destination schools will be contacted if four or more MGLS:2017 students have enrolled at the school. Appendices OFT2-F through OFT2-H provide the communication materials that will be sent to the school districts and schools that are newly identified for the study. If fewer than four students transfer to a particular school, or if a student becomes homeschooled, attends a virtual school, or is otherwise not enrolled at a school, those students will be contacted individually and asked to participate via Web.
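As a minimal sketch of this routing rule, the snippet below groups sampled students by their current school and applies the four-student threshold described above; the function name, data structure, and identifiers are hypothetical.

```python
from collections import Counter

IN_SCHOOL_MIN = 4  # threshold from the text: four or more sampled students

def route_students(enrollments):
    """Split sampled students into in-school vs. out-of-school Web collection.

    `enrollments` maps a student ID to a school ID, or to None for students
    who are homeschooled, attend a virtual school, or are otherwise not enrolled.
    """
    counts = Counter(school for school in enrollments.values() if school is not None)
    in_school, via_web = [], []
    for student, school in enrollments.items():
        if school is not None and counts[school] >= IN_SCHOOL_MIN:
            in_school.append(student)
        else:
            via_web.append(student)
    return in_school, via_web

# Four students at school A are seen in school; the rest participate via Web.
sample = {"s1": "A", "s2": "A", "s3": "A", "s4": "A", "s5": "B", "s6": None}
print(route_students(sample))  # (['s1', 's2', 's3', 's4'], ['s5', 's6'])
```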

The letter to the school administrator that initiates the tracking activities (Appendix OFT2-B1) describes the upcoming OFT2 data collection, which will consist of a 75-minute student survey and assessment and a 40-minute school administrator survey.

Parent/Student Address Update. In addition to the school-level update, we plan to contact the parents of eligible sampled students directly during later rounds of the OFT to update our address database. A mailing (OFT2-I) will be sent to the parent or guardian of each sampled student asking that the parent or guardian log onto our website and update their contact information. If we have an email address for the parent, the materials will be sent via email as well (OFT2-J). For data security reasons, no personally identifiable information will be preloaded onto the website for this address update. In addition to updating contact information, parents will be asked whether their child will be at the same school that he/she attended in the spring of 2017, or whether his/her school enrollment status has changed. This provides two sources of information in preparation for the OFT2 collection. The address update will take approximately 5 minutes to complete. See appendix OFT2-K for an example of the information the website will ask the parent to update. To maximize response, a hardcopy version (OFT2-L) of the same form will be sent to nonrespondents 3 weeks after the initial mailing containing the address update website. An email reminder will be sent at that time as well.

Tracing. Batch tracing will be conducted about 30 days prior to the start of the OFT2 data collection. Batch database searches are used to confirm or update the contact information that we have for parents, maximizing resources for the data collection activities. Throughout the data collection period, for parents whom we are unable to reach due to missing or out-of-date contact information, RTI tracing specialists will use proprietary databases to conduct intensive tracing activities. A locator database will be maintained for the study, and all newly identified information will be loaded into it regularly for use in current and future data collection efforts.

OFT2 Data Collection. The OFT2 data collection will include in-school student sessions, out-of-school student sessions, and school administrator surveys.

In-school Sessions. Data collection protocols for OFT2 will closely resemble those established for MS1 data collection. After the school coordinator (SC) provides tracking information for the sampled students who attended that school in the base year, the SF will work with the SC to coordinate the data collection for those study students still enrolled at the school. The SF will also work with the school coordinator to establish the following:

  • The schedule for data collection (i.e., days the study will be collecting data in the school, start time and end time of the school day, bell schedule for the transition between classes, and window of time during which students will be assessed during the school day);

  • Any accommodations that may be needed for students, particularly those with IEPs;

  • The WIDA ACCESS™ or equivalent score for ELL students, to determine their ability to participate in English;

  • A location in the school setting to accommodate the data collection (i.e., determining the optimal space for the study);

  • A plan for distributing permission forms and tracking response;

  • The required logistical information for field staff entering the school (e.g., security and parking procedures); and

  • The school’s preferred protocol for having students arrive at and return from the study space (e.g., this may involve field staff going to classrooms to collect students, students being notified by their teacher to report to the study space, and/or students returning to a specific class on their own when finished with the assessment and survey).

For the grade 8 students added specifically for OFT2 to test the math and reading assessment items, the SF will work with the SC to identify one or more intact classrooms of students with high math ability, coordinate logistics for the session (date, time, location), and distribute and track parental permission forms.

The SF will visit the school prior to the student session to ensure that the logistics are arranged in preparation for a successful data collection. The student session will take 75 minutes and will include assessments in mathematics, reading, and executive function (two-back, described in Appendix OFT2-M), plus a brief student survey. During OFT2, we will test reducing the number of executive function items administered, from the 120 items administered in OFT1 to either 60 or 90 in OFT2. OFT2 students will be randomly assigned to receive either 60 items or 90 items during the two-back executive function assessment, as sketched below.
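A minimal sketch of that random assignment follows; the seeded coin flip, seed value, and function name are illustrative assumptions, not the study’s actual procedure.

```python
import random

def assign_ef_condition(student_ids, seed=2018):
    """Randomly assign each OFT2 student to the 60-item or 90-item
    two-back condition (seeded here so the assignment is reproducible)."""
    rng = random.Random(seed)
    return {sid: rng.choice([60, 90]) for sid in student_ids}

assignments = assign_ef_condition(["s1", "s2", "s3", "s4"])
print(assignments)  # each student is mapped to either 60 or 90 items
```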

Out-of-school Sessions. An important aspect of a longitudinal study is to follow students regardless of where they attend school after the base year. This enables examination of the trajectory and associated success of all students regardless of their path through the education system. The OFT2 data collection will test procedures for conducting data collection with middle grades students outside of school via Web. Out-of-school data collection was conducted successfully during the High School Longitudinal Study of 2009 (HSLS:09) first follow-up with students in modal grade 11. The OFT2 out-of-school data collection will be conducted only with students who were part of the OFT1 sample and not with the students in grade 8 added specifically for OFT2.

  • Because the students are minors, all communication will go through the parent or guardian. The parent or guardian of the study student will receive a letter and/or email reminding him or her about the study and inviting the study student to participate via Web (appendix OFT2-M1). As an enclosure in the parent mailing, the parent will receive an envelope addressed to the student. This envelope will contain a letter inviting the student to participate via Web, along with the information needed to log in to the session (appendix OFT2-M2). By passing this information on to the student, the parent implies permission for the student to participate.

  • The student will complete the same 75-minute session as his or her in-school counterparts.

  • Parents of nonresponding students will receive reminder emails, letters, or postcards with information repeating the instructions on how the study student may access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks. Telephone prompting may also occur.

School Administrator Survey. The OFT2 school administrator questionnaire will be web-based. It will take the administrator (generally, the principal or principal’s designee) approximately 40 minutes to complete. The school administrator data collection will generally be carried out in the same steps as described above for MS1.

  • School administrators will receive a letter and/or an email announcing the launch of the survey with a link to the survey.

  • Upon completion of the survey, school administrators will receive a thank you email (appendix OFT2-D2) including information about how the school will receive its incentive.

  • While at the school to conduct the student sessions, SFs will ask to meet with the school administrator to thank him or her for the school’s participation and remind the administrator to complete the survey if he or she has not done so already. SFs unable to meet with the school administrator personally will leave hand-written notes in the school administrator’s mailbox, reminding the administrator to complete the survey or thanking him or her if he or she has already participated.

  • For nonresponding school administrators, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For school administrators who have not completed their internet-based survey after approximately 2-3 weeks, additional follow-ups will be used, including but not limited to, a telephone call encouraging school administrators to complete their survey or an offer to complete an abbreviated version of the survey.

MS2 Tracking and Recruitment Approach

In preparation for MS2, we will track students’ enrollment status and update their parents’ contact/locating data through panel maintenance activities and database tracing, using the OFT2 tracking procedures described above with the main study sample. These procedures may be modified based on the OFT2 experience, in which case such modifications will be submitted to OMB for review as a change request.

One difference between tracking for OFT2 and MS2 is that, to retain the possibility of including all eligible sample members in future rounds, tracking will occur for all eligible students in the MS1 sample, whether or not they are classified as participants in the base year of the study. Gaining cooperation from MS1 nonrespondents has the potential to increase the precision of estimates for the first and second follow-up rounds of data collection. Additionally, if nonrespondents are not pursued in subsequent data collection rounds, the response rate and yield for those rounds are certain to be lower than those of the base year. Student participation in the base year is defined as receiving data from the student, the student’s parent, the student’s math teacher, or the student’s special education teacher/service provider.
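Expressed as a simple check, base-year participation is the logical OR of those four data sources; the dictionary keys below are hypothetical field names, not the study’s actual variables.

```python
def is_base_year_participant(case: dict) -> bool:
    """Base-year participation as defined above: data received from the
    student, the parent, the math teacher, or the special education
    teacher/service provider (keys are illustrative)."""
    sources = ("student", "parent", "math_teacher", "sped_teacher")
    return any(case.get(source) for source in sources)

# Parent data alone is sufficient for participant status.
print(is_base_year_participant({"parent": True}))  # True
print(is_base_year_participant({}))                # False
```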

The school enrollment status update, parent/student address update, and tracing procedures that will be used for MS2 are the same as those described above for OFT2, with the exception that the MS2 procedures will begin in September 2018, a little over a year later than the equivalent OFT2 procedures.

B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

MS1 Recruitment

Methods to secure cooperation, maximize response rates, and deal with nonresponse for MS1 recruitment have been described and approved in previous submissions (OMB# 1850-0911 v. 11-15). They are reiterated here because they inform these activities for OFT2 recruitment, tracking, and data collection, as well as MS2 recruitment and tracking.

Maximizing School Participation. District and school participation rates in school-based studies have been declining steadily over time. District and school personnel understand the value of the research but have many reasons for refusing to take part in these voluntary studies, which place considerable burden on them. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals all impede gaining permission to conduct research in schools. MGLS:2017 recruitment teams will be trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, parents, and students the benefits of participating in MS1 and what participation will require in terms of student and school personnel time. MGLS:2017 staff will utilize conferences to inform middle grades professionals about the study and to increase MGLS:2017 name recognition.

As part of the strategy to maximize response rates among school districts and schools during the recruitment process, we have established partnerships with organizations such as the Association for Middle Level Education (AMLE), the National Forum to Accelerate Middle-Grades Reform (the Forum), the National Center for Education, Research and Technology (NCERT), and the School Superintendents Association (AASA). These organizations will actively promote the value of the study to their constituencies, as will a number of middle-grades education researchers who will participate in the recruitment effort.

Representatives from these organizations have committed to provide outreach to the middle grades community in general via information in newsletters and related communications. These communications will include information about the importance of the study, what study participation entails, and the benefits of the study to the middle grades community.

We have initiated a communication and outreach approach to publicize MGLS:2017 in public settings, particularly conferences. This has included recent presentations at two NCERT meetings and the AASA National Conference on Education, and has enabled us to connect with local and state education officials to raise awareness and to engage their direct support of the study with their constituencies (e.g., endorsement letters; direct contact with their staff). It has also enabled us to meet one-on-one with some state and district officials whose jurisdictions are part of the MS1 sample.

As part of this broad communication, we are conducting webinars that are open to the public and publicized by AMLE, AASA, and NCERT, as well as by other interested parties. These webinars describe the importance of the study, provide information about participation, and allow attendees to ask questions. They will be conducted from April 2017 through the end of the MS1 data collection period, as needed. While open to the public, these webinars may include specific invitations to district and school personnel. The webinar materials are included in Appendix MS1-J2.

We are also implementing the following enhancements for MS1 recruitment to encourage participation:

  • The in-school data collection will begin on January 9, 2018 for schools using implicit permission and January 16, 2018 for schools requiring written consent. In many schools/districts the January dates avoid mandatory testing, among other spring term activities (roughly one-quarter of the OFT1 participating schools selected January sessions, even with OFT1 data collection starting on January 24).

  • MS1 data collection will be conducted from January through July 2018. In-school student data collection will take place from January through June 2018, and staff and parent survey collection from January through July 2018. The inclusion of June 2018 as part of the available dates for in-school sessions will enable some schools to participate after their high-stakes testing is finished. Staff and parent surveys will continue through July 2018 to allow them sufficient time to respond, given that teacher and parent lists are submitted on a flow basis throughout the in-school data collection period.

  • Based on the results of the IVFT and OFT1 school-incentive experiments, we will offer each school (if allowed by the district/school) a choice of $400 as a check, a gift card, or goods/services for participation (the IVFT and OFT1 had three treatment conditions for the experiment: $200; $400; or the $400 equivalent in goods/services). An additional $200 may be offered to schools associated with pending-refusal districts that have one or more high-prevalence schools, as described below under More Intensive Refusal Conversion.

  • To provide a tangible connection between the school’s participation and study findings and to respond to districts’/schools’ desire for data, we will also offer each school a report reflecting its aggregated MGLS:2017 assessment results as compared to national and sub-national results (where possible).

  • We will offer personnel of participating schools and districts training in analyzing and learning from MGLS:2017 data (to take place after data collection ends) as a professional development and continuing education incentive.

  • We have included in this submission minor revisions to our communication materials to emphasize more explicitly the value and uniqueness of this middle-grades study and what may be learned as a result. We also make reference to what we give back to districts and schools (e.g., school-level reports).

  • To encourage submission of parental consent forms in schools requiring explicit consent for student participation and to engender goodwill and enthusiasm with the school, we will offer the students an in-school pizza party (or other food provision per school’s preference) to motivate returning the consent forms. Such an offer has the potential to reduce burden on the school staff while increasing student participation. We found that districts and schools with explicit consent requirements are sometimes hesitant to participate, anticipating low student participation, and that an incentive to students for returning the form can boost participation and alleviate those concerns.

  • Students will be using earbuds to complete the audio portion of the student assessment. Students will be allowed to keep the earbuds after participation, in addition to the already approved token incentives (e.g., keychain, pen). These tokens are mentioned in the parent consent materials.

  • We will conduct in-person recruitment visits to a limited and targeted subset of districts and schools to encourage their participation and explain the value of the study.

Recruiters will be trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in developing instruments that ensure high-quality data focusing on middle-grade students. Early engagement of districts and school administrators will be important. Along with what is described above, our plan for maximizing district, school administrator, and parent engagement includes the following:

More Intensive Refusal Conversion. The MGLS:2017 MS1 sampling plan is designed to achieve 900 participating schools distributed across 16 sampling strata. A sample of 3,710 schools was selected for MGLS:2017, and, under the assumption that approximately 70 percent of schools would agree to participate, a subsample of 1,236 of those 3,710 schools was selected for initial recruitment. The 2,474 schools not selected for initial recruitment form a reserve from which additional schools will be selected for recruitment if the desired number of participating schools is not achieved in one or more of the 16 sampling strata.
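The reserve release arithmetic follows from the assumed participation rate. The sketch below illustrates, for a single stratum, how a shortfall against the stratum target translates into a number of reserve schools to release under the stated 70 percent assumption; the function name and the example figures are illustrative.

```python
from math import ceil

def reserve_release(target, participating, pending, expected_rate=0.70):
    """Estimate how many reserve schools to release in one sampling stratum.

    target        -- desired number of participating schools in the stratum
    participating -- schools that have already agreed to participate
    pending       -- released schools still being recruited
    expected_rate -- assumed participation rate (~70 percent per the plan)
    """
    expected_total = participating + pending * expected_rate
    shortfall = target - expected_total
    if shortfall <= 0:
        return 0
    return ceil(shortfall / expected_rate)

# A stratum targeting 56 schools, with 30 participating and 10 pending,
# would release 28 reserve schools under the 70 percent assumption.
print(reserve_release(target=56, participating=30, pending=10))  # 28
```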

A more intensive refusal conversion strategy will be employed during the recruitment of the initial set of 1,236 sampled schools to reduce the degree to which sampling of reserve schools is required and to help obtain the target number of students in the study’s focal disability groups set for the base year. The set of pending refusal schools will be reviewed and a subset of districts and schools will be selected to receive interventions designed to increase the likelihood of their participation.

During our recently completed OFT1, school-level response was lower for schools classified as having a higher prevalence of students in the focal disability groups. The study sampling design identifies schools as either high prevalence or low prevalence with respect to the number of students in the focal disability groups. High prevalence schools are public schools where the expected number of students whose primary IEP is autism or emotional disturbance exceeds the 95th percentile (or 17 students) across all schools in the MGLS:2017 MS1 sampling frame. Only 13 of 51 high prevalence public schools (25.5 percent) in our initial OFT1 sample agreed to participate. This compared with 17 of 36 low prevalence public schools (47.2 percent) in our initial OFT1 sample that agreed to participate (p < .036). Most of the refusals experienced among high prevalence schools were received at the district level, where 25 districts, representing 32 high prevalence schools, declined participation. At the district level, with our initial OFT1 sample, 11 of the 38 districts with high-prevalence schools (28.9 percent) allowed us to contact their schools about the study; 20 of 32 districts with low-prevalence schools (62.5 percent) allowed us to contact their schools (p < .005). Furthermore, OFT1 districts that declined participation for their schools have on average a higher percentage of high-prevalence schools (32 percent high-prevalence schools) than do OFT1 districts that allowed us to contact their schools for participation (20 percent high-prevalence schools).
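The document does not name the statistical test behind these p-values; a standard pooled two-proportion z-test, sketched below, reproduces values of roughly .036 for the school-level comparison and .005 for the district-level comparison.

```python
from math import sqrt, erfc

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided, pooled two-proportion z-test (one common choice;
    the test actually used for the OFT1 comparisons is not specified)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    return z, erfc(abs(z) / sqrt(2))  # two-sided p-value

# Schools: 13 of 51 high-prevalence vs. 17 of 36 low-prevalence participated.
print(two_proportion_ztest(13, 51, 17, 36))  # z ~ 2.09, p ~ .036
# Districts: 11 of 38 vs. 20 of 32 allowed contact with their schools.
print(two_proportion_ztest(11, 38, 20, 32))  # z ~ 2.82, p ~ .005
```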

Given the rarity in many schools of students in two of the three focal disability groups – namely, students whose primary IEP designation is autism or emotional disturbance – these refusal conversion strategies will be targeted to achieve higher response among those public schools classified as “high prevalence.” Of the 1,236 initially sampled schools, 1,017 are public schools, for which counts of IEP students are available (the remaining 219 sampled schools are private schools, for which counts of students with the focal disabilities are unavailable). The 1,017 public schools come from 839 public school districts. In the initial school sample, we sampled 176 high prevalence schools, which reside in 144 districts. There are an additional 56 low prevalence schools in those districts, for a total of 232 sampled schools in the 144 districts with at least one high prevalence school (out of the total of 1,017 sampled public schools in 839 districts).

In OFT1, schools were randomly assigned to a school-level participation incentive of $200, $400, or a $400 non-monetary equivalent. For high prevalence schools whose districts participated, the $400 incentive level yielded 9 participating schools of 11 (82 percent), while the $200 level yielded 4 participating schools of 9 (44 percent). Given the apparent effectiveness of the additional $200 in increasing response, we propose providing a boost of an additional $200 to pending refusal schools. For each additional high-prevalence school that agrees to participate, we estimate bringing in, on average, an additional 8 students with an autism or emotional disturbance IEP designation.

Refusals at the district level prohibit the study from contacting the school or schools in their jurisdictions to discuss participation in the study; thus a single district refusal can affect participation of multiple schools. This is even more damaging to the study when the refusing district contains high prevalence schools, meaning schools where the total number of students with primary IEP designation of AUT or EMN exceeds 17. Although the $200 boost will be provided to schools, because the district is a gatekeeper and was a significant factor in nonresponse during OFT1, we propose to offer the additional $200 in districts that contain high prevalence schools to maximize their approval of research and improve school participation rates. We will let each refusing district know about the higher incentive offer as part of our district refusal conversion efforts. This is intended to encourage the district to open the door for us to contact schools directly, where we have greater likelihood of gaining cooperation. Upon OMB approval of this plan, we will identify all pending refusal districts containing high prevalence schools on an ongoing basis. We will offer all schools in those districts, including low prevalence schools, an additional $200 monetary incentive or $200 monetary equivalent in goods or services – for a total of $600 for the targeted schools – to achieve district participation. The purpose of offering consistent levels of incentives across schools in a district is to avoid treating schools in the same district differentially.

A financial incentive does not guarantee participation, as districts and schools are increasingly protective of their instructional time and reluctant to take part in voluntary studies. However, our recent experience demonstrates that many schools facing budget reductions find an increase in the incentive attractive and that it encourages their participation in the study. The proposed increase of $200 is moderate compared with other studies, and we are concerned that a lesser offer may not be sufficient to offset the districts’ reasons for declining to participate in the study.

We will also offer site visits and tele- or video conferences (which may include school administrators in addition to district officials) to explain the details of the study, to address any concerns that the districts and schools may have, and to improve the likelihood of district participation. We will compare the districts that receive the intensified effort (including the nonmonetary measures) to districts where schools received $400 to evaluate effectiveness.

Experienced recruiters. The recruiting team will include staff with established records of successfully recruiting school districts and schools. To maximize district approval, senior staff will make the initial district telephone contacts. Their familiarity with the study and its future impact, as well as their experience in working with districts to overcome challenges to participation, will be crucial to obtaining district approval. Recruiters contacting schools will be equally adept at working with school administrators and providing solutions to overcome the many obstacles associated with student assessments, including conflicts related to scheduling and assessment space, minimizing interruption to instructional time, and obtaining teacher and parent buy-in.

Persuasive written materials. Key to the plan for maximizing participation is developing informative materials and professional and persuasive requests for participation. The importance of the study will be reflected in the initial invitations from NCES (appendices MS1-B to D), sent with a comprehensive set of FAQs (appendix MS1-I), a colorful recruitment-oriented brochure describing the study (appendix MS1-H), and a brief one-page flyer providing quick facts about the study, which also explains how MGLS:2017 differs from other assessments (appendix MS1-H). Reviewing these study materials should provide districts and school administrators with a good understanding of the study’s value, the importance of MGLS:2017, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.

Persuasive electronically accessible materials. In addition to written materials, information about the study will be available on the study website (text in appendix MS1-J). The website will draw heavily on the written materials, will present clear and concise information about the study, and will convey the critical importance of taking part in the study.

Outreach. As mentioned briefly above, AMLE and the Forum will provide an outreach service, asking for support of the study, offering updates to their constituencies on the progress of the study, and making available information on recent articles and other material relevant to education in the middle grades. In addition, project staff will attend conferences, such as the AMLE annual conference, to promote the study.

Buy-in and support at each level. During district recruitment, the study team will seek not just permission to contact schools and obtain student rosters but also to obtain support from the district. This may take the form of approval of a research application and a letter from the district’s superintendent encouraging schools to participate. Active support from a higher governing body or organization, such as a district or a diocese, encourages cooperation of schools. Similarly, when principals are interested in the research activity, they are more likely to encourage teacher participation and provide an effective school coordinator.

Avoiding refusals. MGLS:2017 recruiters will work to avoid direct refusals by focusing on strategies to solve problems or meet obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other ways to persuade school districts and schools to participate.

MS1 Data Collection

The data collection plan approaches the school as a community. We aim to establish rapport with the whole community—principals, teachers, parents, and students. The school community must be approached with respect and sensitivity to achieve high levels of participation. In addition to sound data collection approaches, the study will also offer monetary and nonmonetary incentives to schools (described in Part A, Section A.9), which have proven to increase school participation rates. Along with offering incentives, our plan for maximizing district, school administrator, and parent engagement and increasing response rates for the MS1 data collection includes various strategies, described below.

  • The data collection plan attempts to minimize the time that any one student, parent, or teacher will be asked to spend completing survey instruments. For example, the student survey and direct assessment were designed to take approximately 90 minutes per student. The parent interview was designed to take an average of 40 minutes. The mathematics teacher survey was designed to take approximately 20 minutes on average for the combination of teacher-level and class-level questions and approximately 10 minutes (per student) for the teacher-reported questions about students. The special education teacher survey was designed to take approximately 10 minutes for the teacher-level questions and approximately 25 minutes (per student) for the teacher-reported questions about students. The items considered for inclusion in the surveys were reviewed over two field tests to ensure that only items that functioned well and are needed to address important research questions were included, and that repetitious questions were excluded. For the assessments, the number of items included was kept to the minimum needed to measure knowledge and skills in each domain with sufficient precision.

  • Internet-based surveys and other computer-assisted methods will be used to collect data from parents, teachers, and school administrators, while offering alternative modes for nonrespondents so that respondents can complete the survey in the mode that is most convenient for them or with which they feel most comfortable. OFT1 provided NCES with important information on the percent of parents who choose to respond using the different modes, item completion rates in each mode, and the level of effort that is required to obtain a response. This informed MS1 instrument content and data collection methods.

  • A variety of communication materials (advance letters, email invitations, a study summary, and follow-up letters) will be sent to respondents to communicate the importance of the study and of their participation, and how their participation will inform education policy for the future. Providing multiple types of materials and sending them via multiple modes increases the chance that the intended recipient receives and reads the materials. For example, some individuals may be more likely to open an email than an envelope received via U.S. Mail, or vice versa. Some may be more likely to respond to a visually pleasing study summary as opposed to a formal letter, or vice versa. The variety of contact materials will help maximize the coverage of contact, which in turn will maximize response.

  • Contact will be maintained with respondents using various approaches and through multiple attempts. By staying in contact with reluctant respondents to determine their primary reasons for not responding, the data collection staff can be flexible and proactive. Direct contact with respondents by phone after unsuccessful email and hardcopy invitations can often break through resistance and help to increase cooperation and response rates. Experience with each of these modes during OFT1 helped to inform and refine the design and data collection protocol for MS1.

  • Use experienced session facilitators. The session facilitator (SF) team will include staff with established records of successfully conducting data collection and/or working in schools. Experienced SFs understand how to relate to school staff and students and meet their needs while still effectively and accurately maintaining the integrity of the study design. SFs will demonstrate flexibility in working with schools and assume as much of the burden as possible while conducting the student sessions. These good faith actions on the part of the SFs help to maximize response from schools, which typically have limited time and resources to coordinate such efforts.

  • An incentive will be offered to schools, parents, and teachers to encourage their participation and thank them for their time. (For more information on incentives see Supporting Statement Part A, Section A.9.)

Maximizing Parent Participation. To improve the chances of obtaining adequate parent participation in MGLS:2017, a combination of methods will be employed for MS1 based on the IVFT and OFT1 parent incentive experiments and data collection results. Results from the OFT1 parent incentive experiment in particular (details provided in section B.4 below), which involved providing a baseline incentive followed by boost incentives for nonrespondents, were used to determine the incentive strategy for MS1.

  • Parents (with the noted exception below) will be offered $20 at the start of data collection and will receive telephone prompting approximately 20 days after their first contact.

  • Parents of students with emotional disturbance (EMN) will receive a differentiated incentive and contacting structure from the remaining sample because they are a special population of interest and had depressed response rates in the prior two field tests. Parents of students with EMN will be offered $30 at the start of the data collection and telephone prompting will commence approximately 10 days after the first contact.

  • Both groups of parents will receive the offer of an additional $10 six weeks after the first contact.

  • Thus, parents of non-EMN students will receive a maximum offer of $30 and parents of EMN students will receive a maximum incentive offer of $40.

  • About five weeks prior to the end of the data collection period, all parents who have not yet participated will be invited to complete an abbreviated version of the parent survey, approximately 10 minutes in length, either online or by phone. The abbreviated survey includes a subset of items from the full parent survey. Items were selected based on their importance to researchers, and cover household composition, parent education level and background, household languages, student health, and employment and income information.

  • About three weeks prior to the end of the data collection period, all nonresponding parents will receive a mini parent survey – a one-page (front and back) paper-and-pencil survey accompanied by a postage-paid return envelope, that is approximately 5 minutes in length and includes only the most critical items from the full parent survey covering household composition and income, and parents’ education level and employment status.

  • A Spanish translation of the parent surveys will be available to facilitate participation for those respondents who prefer to respond in Spanish.

Parent cases will be released for data collection on a flow basis over the course of the January to July 2018 data collection period. If parent information is available at the start of data collection, parent mailings/emails will be sent then. Otherwise, mailings/emails will be sent after parent contact information is provided by the schools.

During OFT1, many schools did not provide parent lists until well into the data collection period. Additionally, little information is available for sampled parents beyond school and student characteristics used for sampling. Given these considerations, responsive design methods are not practical to implement for Base Year parent data collection. Instead, based on the OFT1 results, optimal baseline incentive values have been determined plus more intensive efforts for parents of students with EMN.

OFT2 Data Collection

To maximize participation in OFT2, we will track students’ enrollment status and update parents’ contact/locating data. The goal of this collection is to inform the procedures necessary for successful tracking of students and parents after the base year data collection. The OFT2 data collection, to be conducted starting in January 2018, will inform the MS2 data collection procedures to be conducted a year later, starting in January 2019.

MGLS:2017 staff will be trained to communicate with schools and parents. Additionally, the MGLS:2017 study team proposes to take several steps to increase response rates, beyond the steps described in the first four bullet points under MS1 Data Collection above:

  • Batch and intensive tracing activities will be used as a low-cost, quick-turnaround approach to ensuring current contact information to facilitate reaching parents for the OFT2 data collection.

  • Contact materials to the schools will describe both the tracking and data collection activities to fully inform school administrators about the activities to be conducted in the 2017-18 school year.

  • Students no longer enrolled at their base, destination, or transfer school may be asked to participate via Web to facilitate participation by students who otherwise would be unable to participate in school.

Additionally, we will build upon the relationships with school staff, students, and parents established in the base year and during the tracking activities to maximize participation in the OFT2 data collection.

  • The OFT2 data collection imposes a reduced burden relative to OFT1. The student session has been reduced from 90 minutes to 75 minutes, and the collection of height and weight measurements and of the teacher and parent surveys will be omitted.

  • Efforts will be made to ensure consistency among the session facilitators who visit the school to conduct the student session.

  • We will provide an out-of-school Web participation option to facilitate participation by students who participated in grade 6 but no longer attend either the school they attended during OFT1 or a destination school with four or more sampled students. Students who participate outside of school will be offered $20 to encourage their participation, given the challenge and burden of participating on their own time.

  • The administrator survey may be completed by the principal or a designee knowledgeable about the school.

  • Grade 8 students added to test more challenging math and reading assessment items will complete the same in-school session as students sampled for OFT1 but will not be asked to participate outside of school if they miss the in-school session.

MS2 Tracking

Methods for increasing response rates and dealing with nonresponse in MS2 will be similar to those used in OFT2. These methods may be modified based on the OFT2 experience, in which case any modification will be submitted to OMB for review as a change request.

B.4 Test of Methods and Procedures

The two MGLS:2017 field tests, the IVFT (conducted in winter/spring 2016) and OFT1 (conducted in winter/spring 2017), together form the basis for informing decisions about the methods and procedures for MS1. One of the main goals of the IVFT/OFT1 effort was to provide the data needed to evaluate a battery of student assessments (in the areas of mathematics and reading achievement and executive function) and to evaluate survey instruments for use in MS1. To that end, a number of analyses were performed on the IVFT and OFT1 data to determine whether assessment and questionnaire items needed revision or removal.

The properties of the survey items were examined using frequencies, mean, median, mode, standard deviation, skew, kurtosis, and histograms. Differences in response patterns were examined overall and by grade level. If the survey items were intended to be part of a scale, reliability, dimensionality, and item-to-total correlations were examined. Additionally, bivariate correlations with preliminary mathematics assessment, reading assessment, and executive function information were examined. Finally, the timing required to answer items was reviewed to remove or revise any items that needed an inordinate amount of time to complete. Based on these findings, in combination with consideration of construct importance, decisions were made to revise some items and to remove others.
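As a sketch of this type of item screening, the snippet below computes the listed distributional properties, a scale reliability coefficient (Cronbach’s alpha), and corrected item-to-total correlations, assuming item responses sit in a pandas DataFrame with one column per item; the column names, example data, and helper names are illustrative.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency reliability for a set of scale items."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

def item_summary(items: pd.DataFrame) -> pd.DataFrame:
    """Per-item distributional properties and corrected item-total correlations."""
    total = items.sum(axis=1)
    return pd.DataFrame({
        "mean": items.mean(),
        "median": items.median(),
        "sd": items.std(ddof=1),
        "skew": items.skew(),
        "kurtosis": items.kurt(),
        # corrected: correlate each item with the scale total excluding that item
        "item_total_r": pd.Series({c: items[c].corr(total - items[c]) for c in items.columns}),
    })

# Hypothetical 4-point survey items for five respondents.
df = pd.DataFrame({"q1": [1, 2, 3, 4, 4], "q2": [2, 2, 3, 4, 3], "q3": [1, 1, 4, 3, 4]})
print(item_summary(df).round(2))
print("alpha:", round(cronbach_alpha(df), 2))
```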

The purpose of the IVFT was also to provide data to establish the psychometric properties and item performance of the items in the mathematics item pool. These data were used to construct a two-stage mathematics assessment that was fielded in OFT1 and will be refined for MS1. In addition, the IVFT and OFT1 provided data on the performance of the reading assessment and the executive function tasks. These data were used to refine the reading assessment and to select and refine executive function tasks to be fielded in MS1.

The IVFT also provided an opportunity to develop study policies and procedures that could be further tested in OFT1 for eventual use in MS1. However, the two field tests are quite different. The IVFT included students in multiple grades, though not necessarily from a representative sample, and tested a large number of items to determine the item pool for the longitudinal study. A main goal of OFT1 was to better understand the recruitment strategies necessary for a large-scale nationally representative study to obtain the targeted sample yield of grade 6 general education students and students with disabilities, and the subsequent tracing and tracking strategies necessary to maintain the student sample from the base year (when sample students will be in grade 6) to the first follow-up (when most of the students will be in grade 7) and the second follow-up (when most of the students will be in grade 8). OFT1 provided an opportunity to further test the procedures that worked effectively in the IVFT and subsequently learn from OFT1 how to best implement MS1.

Two incentive experiments were conducted in OFT1 to inform decisions about the optimal baseline incentive offer for MS1. These experiments were a school-level incentive experiment and a parent-level incentive experiment.

School-level Incentive Experiment. School participation has been increasingly difficult to secure. Given the many demands and outside pressures that schools already face, it is essential that schools see that MGLS:2017 staff understand the additional burden being placed on school staff when requesting their participation. The study asks for many kinds of information and cooperation from schools, including a student roster with basic demographic information (e.g., date of birth, sex, and race/ethnicity); information on students’ IEP status, math and special education teachers, and parent contact information; permission for field staff to be in the school for up to a week; space for administering student sessions (assessments and surveys); permission for students to leave their normal classes for the duration of the sessions; and information about the students’ teachers and parents. Sample students with disabilities sometimes require accommodations and different session settings, such as individual administration and smaller group sessions, which add to the time the study spends in schools and sometimes require additional assistance from school staff to assure that these students are accommodated appropriately.

IVFT and OFT1 included a school-level incentive experiment (see Supporting Statement Part A of OMB# 1850-0911 v.9 and v.15 for details). Schools were randomly assigned to one of three incentive conditions: Condition 1 - $200, Condition 2 - $400, or Condition 3 - $400 in materials or services for the school (school coordinators also received $150, consistent across all three conditions). Table 8 displays information on the types of non-monetary materials or services offered in Condition 3.

Table 8. Non-Monetary Incentive Choices for Schools in Experimental Condition 3 (Approximate Value = $400)

  • Registration for Association for Middle Level Education (AMLE) or Regional Annual Meeting

  • Two-Year School Membership in AMLE

  • Membership in Regional ML Organization plus Subscriptions to Professional Journals

  • Professional Development Webinar

  • School Supplies

  • Library of Middle Level Publications


The original analytic plans for this incentive experiment called for combining results from the IVFT and OFT1 schools to increase the possibility of detecting differences with statistical significance. While we have provided an analysis that uses the combined results from the IVFT and OFT1, given differences between those groups that affect their predictive value regarding MS1, we have also provided individual analyses for the IVFT and OFT1. The IVFT set of schools was purposively selected, while the OFT1 schools were selected using a probability proportional to size sampling method that mimics the method to be employed in MS1. Also, the IVFT was fielded with a shortened school recruitment window, making it likely that recruitment results were lower than what we can expect in MS1. In addition, many districts refused to allow their sampled schools to be contacted for the study, which meant that sampled schools in those districts never actually received an incentive offer. To assess the degree to which the level of school incentive affected participation rates, we analyzed results both including and excluding schools in districts that refused. Table 9 presents results for schools in all districts. Due to small sample sizes of schools, tests of statistical significance may not be particularly informative because of a lack of power. However, it would appear that, for example, in OFT1 the $400 condition resulted in a higher participation rate than the $200 condition (Table 9, far right column).

Table 9. School Participation Rates by Experimental Condition (All Districts)

Experimental Condition | IVFT and OFT1 Combined Participation Rate (Number of Schools) | IVFT Participation Rate (Number of Schools) | OFT1 Participation Rate (Number of Schools)
1 ($200) | 23.1% (30 of 130) | 22.9% (20 of 87) | 23.3% (10 of 43)
2 ($400) | 27.3% (35 of 128) | 22.1% (17 of 77) | 35.3% (18 of 51)
3 ($400 non-monetary equivalent) | 29.0% (36 of 124) | 25.0% (21 of 84) | 37.5% (15 of 40)
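To make the power concern concrete, the OFT1 contrast in Table 9 ($400 versus $200; 18 of 51 versus 10 of 43 schools) can be checked with a two-proportion z-test. The following is a minimal Python sketch assuming the statsmodels package; it is illustrative only and not the study's analysis code:

    # Illustrative two-proportion z-test for the OFT1 incentive contrast
    # ($400 vs. $200, Table 9, far-right column). Counts come from Table 9;
    # this is not the study's production code.
    from statsmodels.stats.proportion import proportions_ztest

    count = [18, 10]   # participating schools: $400 condition, $200 condition
    nobs = [51, 43]    # sampled schools in each condition

    stat, pvalue = proportions_ztest(count, nobs, alternative='two-sided')
    print(f"z = {stat:.2f}, p = {pvalue:.3f}")
    # With samples this small, p is well above 0.05 even though the observed
    # rates differ by about 12 percentage points, illustrating the lack of power.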


Table 10 presents participation rates among schools in cooperating districts.

Table 10. School Participation Rates by Experimental Condition (Participating Districts)

Experimental Condition | IVFT and OFT1 Combined Participation Rate (Number of Schools) | IVFT Participation Rate (Number of Schools) | OFT1 Participation Rate (Number of Schools)
1 ($200) | 39.5% (30 of 76) | 37.7% (20 of 53) | 43.5% (10 of 23)
2 ($400) | 49.3% (35 of 71) | 37.8% (17 of 45) | 69.2% (18 of 26)
3 ($400 non-monetary equivalent) | 42.9% (36 of 84) | 34.4% (21 of 61) | 65.2% (15 of 23)


Table 11 presents participation rates for schools in all districts when the two higher incentive conditions are combined. As with Table 9, because of the small school sample sizes, tests of statistical significance may not be particularly informative due to a lack of power. Nevertheless, consistent with the earlier results, it appears that in OFT1 the $400 condition (whether monetary or non-monetary) was associated with a higher participation rate than the $200 condition (Table 11).

Table 11. School Participation Rates by Combined Experimental Condition (All Districts)

Experimental Condition | IVFT and OFT1 Combined Participation Rate (Number of Schools) | IVFT Participation Rate (Number of Schools) | OFT1 Participation Rate (Number of Schools)
1 ($200) | 23.1% (30 of 130) | 22.9% (20 of 87) | 23.3% (10 of 43)
2 and 3 ($400 or $400 non-monetary equivalent) | 28.2% (71 of 252) | 23.6% (38 of 161) | 36.3% (33 of 91)
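The power limitation can also be quantified. The sketch below (again Python with statsmodels, illustrative only and not the study's analysis code) approximates the power of a two-sided test to detect a difference of the size observed in OFT1 (23.3 versus 36.3 percent) at the observed sample sizes (43 and 91 schools):

    # Illustrative power calculation for the OFT1 contrast in Table 11.
    # Rates and sample sizes come from Table 11; not the study's production code.
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    h = proportion_effectsize(0.363, 0.233)   # Cohen's h for the two proportions
    power = NormalIndPower().solve_power(
        effect_size=h, nobs1=91, ratio=43 / 91, alpha=0.05,
        alternative='two-sided')
    print(f"power = {power:.2f}")
    # Power is well below the conventional 0.80 target, so a true difference
    # of this size could easily fail to reach statistical significance.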


Table 12 presents participation rates among schools in cooperating districts when the two higher incentive conditions are combined.

Table 12. School Participation Rates by Combined Experimental Condition (Participating Districts)

Experimental Condition | IVFT and OFT1 Combined Participation Rate (Number of Schools) | IVFT Participation Rate (Number of Schools) | OFT1 Participation Rate (Number of Schools)
1 ($200) | 39.5% (30 of 76) | 37.7% (20 of 53) | 43.5% (10 of 23)
2 and 3 ($400 or $400 non-monetary equivalent) | 45.8% (71 of 155) | 35.9% (38 of 106) | 67.4% (33 of 49)


We believe OFT1 was a truer test of what to expect in MS1 because, as in MS1, sampling was limited to grade 6 and the student sample per school was smaller than in the IVFT. Given these similarities between OFT1 and MS1, and the large difference in OFT1 response rates between condition 1 ($200) and conditions 2 ($400) and 3 ($400 non-monetary equivalent), we are offering schools in MS1 an incentive of $400, with the option of receiving it as a check or as a non-monetary equivalent (approved in OMB# 1850-0911 v.15).

Parent-level Incentive Experiment. OFT1 evaluated baseline incentive amounts and incentive boosts, differentiated between parents of students with EMN and all other parents. Parents were randomly assigned such that incentive amounts differed between schools but not within schools (with the exception of incentive amounts for parents of students with EMN). The offers proceeded in three phases (a sketch of the cumulative offer logic follows the list below):

  • For the baseline incentive (phase 1), parents of students with EMN were offered either $20 or $30.

  • Parents of non-EMN students were offered $0, $10, or $20 at baseline.

  • Phase 2 began in early March: parents of students with EMN were offered an additional $10; parents of non-EMN students were offered an additional $0 or $10 over the baseline amount.

  • Phase 3 began in early April: parents of students with EMN were offered another $10; parents of non-EMN students were offered either no additional boost or a cumulative offer of $40.
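The cumulative offer logic implied by these phases can be summarized in a short sketch. The function below is a hypothetical illustration only; the dollar amounts and phase timing come from the description above, and randomized arm assignment is represented by simple boolean flags:

    # Hypothetical sketch of the cumulative OFT1 parent incentive offers
    # implied by the phases above. Illustrative only; arm assignment was
    # randomized and is represented here by boolean flags.
    def cumulative_offer(baseline, phase, emn, phase2_boost=False, phase3_boost=False):
        """Return the cumulative dollar offer for a parent case at a given phase."""
        offer = baseline
        if emn:
            # All pending EMN cases received a $10 boost at phase 2 and
            # another $10 at phase 3.
            offer += 10 * (min(phase, 3) - 1)
        else:
            if phase >= 2 and phase2_boost:
                offer += 10      # phase 2: additional $0 or $10
            if phase >= 3 and phase3_boost:
                offer = 40       # phase 3: no boost or a cumulative $40 offer
        return offer

    # Examples: an EMN case with a $30 baseline reaching phase 3 ($50 cumulative),
    # and a non-EMN case with a $20 baseline in both boost arms ($40 cumulative).
    print(cumulative_offer(30, 3, emn=True))                                         # 50
    print(cumulative_offer(20, 3, emn=False, phase2_boost=True, phase3_boost=True))  # 40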

Parent contact information was received from schools on a rolling basis, and invitations to complete the parent survey were sent as the contact information became available. As a result, some cases had an abbreviated phase 1, some never reached phase 2, and some never reached phase 3. The parent data collection took place between February 1 and May 31, 2017. This field period was abbreviated compared with that planned for MS1, in which data collection will begin in January 2018 and continue through July 2018.

Table 13 shows OFT1 parent participation by baseline incentive offer for non-EMN cases. In OFT1, parents offered $20 at baseline had the highest participation rate, significantly higher than that of parents offered $0. Therefore, it is recommended that parents of students without an EMN IEP designation be offered $20 at baseline in MS1.

Table 13. OFT1 Parent Participation by Baseline Incentive Offer: Parents of Students Without an EMN IEP Designation

Baseline Incentive | Selected N | Completes (N) | Completes (%)
$0 | 320 | 105 | 32.81
$10 | 653 | 245 | 37.52
$20 | 660 | 287 | 43.48*

* Significantly higher than $0 (p<0.05).
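As an illustration of how the three baseline arms in Table 13 can be compared jointly before pairwise follow-ups, a chi-square test of homogeneity can be run on the table counts. A minimal Python sketch, assuming scipy (illustrative only, not the study's analysis code):

    # Illustrative chi-square test of homogeneity across the three baseline
    # incentive arms in Table 13. Counts come from Table 13; not the study's
    # production code.
    from scipy.stats import chi2_contingency

    # Rows: $0, $10, $20 arms; columns: completes, non-completes
    table = [[105, 215],    # $0:  105 of 320
             [245, 408],    # $10: 245 of 653
             [287, 373]]    # $20: 287 of 660

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    # A small p-value indicates the completion rate differs across arms;
    # pairwise follow-ups (e.g., $20 vs. $0) identify where the difference lies.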

Table 14 shows OFT1 parent participation by baseline incentive offer for EMN cases. Because of the small sample sizes, tests of statistical significance may not be particularly informative due to a lack of power. In OFT1, parents of EMN students offered $30 at baseline had a substantially higher participation rate. Therefore, it is recommended that parents of EMN students be offered $30 at baseline in MS1.

Table 14. OFT1 Parent Participation by Baseline Incentive Offer: Parents of Students With an EMN IEP Designation

Baseline Incentive | Selected N | Completes (N) | Completes (%)
$20 | 32 | 7 | 21.88
$30 | 29 | 10 | 34.48
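With samples this small (32 and 29 parents), interval estimates convey the uncertainty more clearly than the point estimates alone. A minimal Python sketch, assuming statsmodels (illustrative only, not the study's analysis code):

    # Illustrative 95% Wilson confidence intervals for the two EMN baseline
    # arms in Table 14. Counts come from Table 14; not the study's production code.
    from statsmodels.stats.proportion import proportion_confint

    for label, completes, n in [("$20", 7, 32), ("$30", 10, 29)]:
        low, high = proportion_confint(completes, n, alpha=0.05, method="wilson")
        print(f"{label}: {completes / n:.1%} (95% CI {low:.1%} to {high:.1%})")
    # The two intervals overlap heavily, so the observed gap in point
    # estimates is consistent with chance at these sample sizes.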


Regarding the boost amount that might relate to higher participation, Table 15 shows that, among non-EMN parent cases reaching phase 2 (in which a $10 boost was compared with no boost), those offered the $10 boost had a significantly higher response rate than those who were not.

Table 15. OFT1 Parent Participation by First Incentive Boost Offer: Parents of Students Without an EMN IEP Designation

Incentive Boost | Selected N | Completes (N) | Completes (%)
No boost | 633 | 198 | 31.28
$10 boost | 415 | 155 | 37.35*

* Significantly higher than no boost (p<0.05).

Table 16 shows the results of the $10 first incentive boost for parents of EMN cases. There was no experimental contrast for these cases because all pending EMN cases received a $10 boost offer at the start of phase 2 and another at the start of phase 3. Because of the small sample sizes, tests of statistical significance may not be particularly informative due to a lack of power. In OFT1, parents of EMN students offered a cumulative $40 incentive (including the $10 boost) had a substantially higher participation rate. Therefore, it is recommended that parents of EMN students in MS1 be offered $30 at baseline plus a $10 boost.

Table 16. OFT1 Parent Participation by First Incentive Boost Offer: Parents of Students With an EMN IEP Designation

Cumulative Incentive | Selected N | Completes (N) | Completes (%)
$30 ($10 boost) | 25 | 6 | 24.00
$40 ($10 boost) | 21 | 8 | 38.10


Analysis of a second boost offer (a cumulative $40 for non-EMN cases and $50 for EMN cases) showed that it did not result in higher response.

Given the OFT1 boost offer results, it is recommended that pending nonresponding non-EMN parent cases in MS1 receive a single $10 boost offer (for a cumulative offer of $30) six weeks after initial contact, and that pending nonresponding EMN parent cases likewise receive a single $10 boost offer (for a cumulative offer of $40) six weeks after initial contact.

B.5 Individuals Responsible for Study Design and Performance

The following individuals at the National Center for Education Statistics (NCES) are responsible for MGLS:2017: Carolyn Fidelman, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, and David Wilson, along with subcontractor staff: Sally Atkins-Burnett (Mathematica) and Michelle Najarian (ETS).

1 A special education school is a public elementary/secondary school that focuses on educating students with disabilities and adapts curriculum, materials, or instruction for the students served.

2 Imputation was necessary to include eligible schools with missing data in the sampling process, which helped ensure that the sampling frame better represented the population of eligible schools. Grade 6 enrollment was imputed when missing, as were focal disability counts. We note that schools in Wyoming and Iowa do not report to EDFacts, so they could not be represented in the MS1 sample without imputed focal disability counts. If both grade 6 enrollment and focal disability counts were missing, imputation was not used and the school was excluded from the frame.

3 Sixth-grade enrollment is reported as 0 or was not reported to the CCD 2013-14 or PSS 2013-14.

4 A school reports zero students or does not report the number of students in any of the three focal disability groups.

5 For sampling purposes, all private schools were classified as low prevalence schools because private schools do not report to EDFacts.

6 See, for example, Kish, L. (1965). Survey Sampling. New York: John Wiley & Sons, p. 56.

7 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.

8 The seven student domains are as follows: Autism (AUT); Emotional Disturbance (EMN); Specific Learning Disability (SLD); Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).

9 SAS Institute Inc. (2008). SAS/STAT® 9.2 User’s Guide. Cary, NC: SAS Institute Inc.

10 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

11 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

12 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
