
Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Main Study First Follow-up (MS2) Data Collection





OMB# 1850-0911 v.27





Supporting Statement Part B







National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC





August 2019

revised March 2020

second revision May 2020





Part B of this submission presents information on the collection of information employing statistical methods for the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Main Study First Follow-up (MS2) Data Collection. MS2 recruitment, which began in January 2019, was approved in December 2018, with an update approved in May 2019 (OMB# 1850-0911 v.21-23). MS2 data collection was approved in November 2019, with an update in April 2020 (OMB# 1850-0911 v.24-26); the current request is to extend the end date of the student data collection to July 31, 2020, and to add an email contact to sample members.

B.1 Universe, Sample Design, and Estimation

The universe, sample design, and all aspects of the OFT Base Year (OFT1), OFT First Follow-up (OFT2), and Main Study Base Year (MS1) studies were fully described in previous clearance submissions (OMB# 1850-0911 v.10-19). The update to this submission, approved in September 2018 (OMB# 1850-0911 v.20), added information about OFT3 tracking and about a change in the periodicity of, and oversampling in, the MS follow-up data collection. Plans for the sampling universe, tracking of the sample in spring 2019 (MS2A) and fall 2019 (MS2B), and school recruitment for the MS2 data collection were approved in previous submissions (OMB# 1850-0911 v.21-23). This submission describes the design of the MS2 data collection.

MGLS:2017 MS1 data collection was conducted from January through August 2018 on a nationally representative sample of schools offering grade 6 instruction and a nationally representative sample of students enrolled in grade 6, including students whose primary Individualized Education Program (IEP) classification was Autism (AUT), Emotional Disturbance (EMN), or Specific Learning Disability (SLD) and who were being educated in an ungraded setting and were age-equivalent (aged 11 to 13) to grade 6 students.

MS1 employed a multi-stage sampling design with schools selected in the first stage and students selected, within schools, at the second stage. The school sample was selected using probability proportional to size sampling within school sampling strata. Students were selected from school enrollment lists collected in the fall of 2017 using simple random sampling within student sampling strata within schools.

The school frame was constructed from the 2013-14 Common Core of Data (CCD 2013-14) and the 2013-14 Private School Universe Survey (PSS 2013-14). The MS1 school population excluded the following types of schools that are included in the CCD and PSS:

  • Department of Defense Education Activity schools and Bureau of Indian Education schools,

  • alternative education schools, and

  • special education schools.1

In addition, schools included in OFT1 were excluded from the sampling frame for the Main Study and, therefore, were not eligible for MS1 because the OFT2 tracking and data collection activities were conducted in parallel with MS1. The final set of schools that participated in OFT1 was not yet known at the time of initial sampling for MS1, so the number of schools in the sampling frame for MS1 could not be stated precisely at that time (an earlier package estimated it at around 48,000). It is now known that 48,376 schools were eligible for MS1.

The sample design called for information on sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation was AUT, EMN, or SLD to be used in the sampling process. EDFacts data were used to determine, for each school in the sampling frame, the number of students between the ages of 11 and 13 whose primary IEP designation was AUT, EMN, or SLD. For schools to be sampled, sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation was AUT, EMN, or SLD must have been available.

For some schools, some of the necessary information was missing, and imputation was used to include them in the sampling process: 2,971 of the 48,376 schools had sixth-grade enrollment or EDFacts focal disability counts imputed2. For other schools with missing data, imputation was not advisable because misestimating enrollment counts could give these schools a higher probability of selection than warranted. For this reason, the following schools were excluded from the sampling frame:

  • schools that reported overall sixth-grade enrollment but did not report enrollment by race and ethnicity (n=9), and

  • schools that reported no sixth-grade enrollment3 and reported having no enrolled students between the ages of 11 and 13 in the three focal disability groups4 or did not report information on students with disabilities to EDFacts (n=1,578).

The 45,528 schools with complete (non-imputed) information in the sampling frame were explicitly stratified by the cross-classification of the following characteristics:

  • school type (public, Catholic, other private),

  • region (Northeast, Midwest, South, West), and

  • prevalence of students with disabilities (high/low).5

The prevalence indicator is defined using the number of students in two of the three focal disability groups noted above. Schools were classified as having a high prevalence of students with disabilities (i.e., high prevalence schools) if the total number of students whose primary IEP designation was AUT or EMN exceeded 17. The number of SLD students was not factored into the stratification process because the number of students classified as SLD generally far exceeds the number of students classified as either EMN or AUT. Factoring in the number of SLD students would have resulted in a threshold where schools above the threshold would have had very few EMN or AUT students. The number of SLD students was also excluded in the determination of high/low prevalence schools because it appears that sufficient numbers of SLD students would be included in the sample without oversampling. The threshold of 17 was determined by identifying the 95th percentile of the total number of students whose primary designation was AUT or EMN across all 45,528 schools.
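The high/low prevalence split described above amounts to a percentile cutoff on per-school counts. The following minimal sketch illustrates the idea; the counts are simulated for illustration only (the real frame had 45,528 schools with actual AUT and EMN counts, and the resulting threshold was 17):

```python
import random

# Hypothetical per-school counts of students whose primary IEP
# designation is AUT or EMN (simulated; the real frame used EDFacts counts).
random.seed(0)
counts = [random.randint(0, 40) for _ in range(45528)]

# Threshold: the 95th percentile of the AUT + EMN count across all schools,
# so that roughly 5 percent of schools are flagged as high prevalence.
ordered = sorted(counts)
threshold = ordered[int(0.95 * len(ordered))]

# Schools with counts above the threshold form the high-prevalence stratum.
high_prevalence = [c > threshold for c in counts]
share = sum(high_prevalence) / len(counts)
```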

Prior to selection of the school sample, schools were sorted by locale (city, suburban, town, rural), school configuration (PK/KG/1-06, PK/KG/1-08, PK/KG/1-12, 05-08, 06-08, other), median income of the ZIP code in which a school resides, and school size measure within each of the explicit school strata so that approximate proportionality across locales, school configurations, and median ZIP code incomes was preserved. The school size measure was included in the sort to enable freshening of the school sample. The school sample was freshened in the third quarter of 2017, before the start of Base Year data collection, because schools were initially selected about a year before the start of data collection to allow sufficient time for recruitment. New schools were identified through review of preliminary versions of the 2015-2016 CCD and PSS files. Newly identified schools were inserted into the sorted sampling frame in such a fashion as to preserve the original sort ordering. Using a half-open interval rule,6 we identified schools to be added to the initial school sample.
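A half-open interval rule of the kind footnote 6 describes can be implemented by linking each newly identified school to the frame school whose sort interval contains it: each frame school "owns" the half-open interval from its own sort position up to, but not including, the next frame school's position, and a new school falling in a sampled school's interval joins the sample. The sketch below assumes this standard formulation; all school identifiers and sort keys are hypothetical:

```python
import bisect

# Sorted frame: (sort_key, school_id, was_sampled).
frame = [(1.0, "A", True), (2.0, "B", False), (3.0, "C", True)]
keys = [k for k, _, _ in frame]

def freshen(new_key):
    """Return True if a new school at this sort position joins the sample."""
    # The rightmost frame school at or before the new school owns the
    # half-open interval that contains the new school.
    i = bisect.bisect_right(keys, new_key) - 1
    if i < 0:          # sorts before the whole frame: conventionally
        i = 0          # linked to the first frame school
    return frame[i][2]

added = freshen(2.5)   # falls in school B's interval (B not sampled)
also = freshen(3.5)    # falls in school C's interval (C sampled)
```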

The MS1 sample was designed to account for declining response rates observed in recent school-based NCES longitudinal studies, the recruitment experience of the field tests, the goal of 900 participating schools, and a target student sample yield of 29 students per school in MS1. To be conservative, the MS1 sampling plan was designed to be flexible so that the study could achieve sample size targets even if eligibility and response rates were lower than anticipated. The school sampling process was designed to achieve 900 participating schools (740 public, 80 Catholic, and 80 other private) distributed over 16 school sampling strata. We selected 3,710 schools using stratified probability proportional to size sampling, from which an initial simple random sample of 1,236 schools was selected within school strata. This subset of 1,236 schools comprised the initial set of schools that were released for recruitment in January of 2017. The remaining schools comprised a reserve sample from which 198 schools were sampled and pursued for recruitment in October of 2017. Because the school sample was selected from a sampling frame constructed from the 2013-2014 Common Core of Data (CCD) and Private School Universe Survey (PSS) data, a sample freshening process using the 2015-2016 CCD and PSS data was employed to improve the coverage of the MGLS:2017 school sample. This process added 95 schools to the MGLS:2017 school sample. Thus, a total of 1,529 schools were sampled and pursued for recruitment for MS1.

The numbers of participating schools among the initial set of 1,236 released schools were monitored by school stratum and a decision was made in October of 2017 to select 198 schools from the reserve in order to increase the number of participating schools. The desired numbers of participating schools by the margins of the school stratification characteristics are shown in table 1.

Table 1. MS1 School Participation Goals, by School Stratification Characteristics

                                            Public   Catholic   Other private   Total
Total                                          740         80              80     900
Region
  Northeast                                    122         19              16     157
  Midwest                                      162         28              15     205
  South                                        278         19              33     330
  West                                         178         14              16     208
Prevalence of students with disabilities
  High                                         128         NA              NA     128
  Low                                          612         80              80     772

NOTE: NA: Not Applicable. Catholic and Other private schools are all classified as Low prevalence for purposes of sampling, as no focal disability counts are available for these schools; therefore, no explicit participation goals are established for High prevalence Catholic and Other private schools. Catholic and Other private schools with school grade configurations of 05-08 and 06-08 are classified as Other configuration for the purposes of sampling.

The 16 school strata along with the corresponding stratum-specific participation goals, frame counts, total selected school sample (n=3,710), initial school sample (n=1,236), and reserve sample (n=2,474) are shown in table 2.

Table 2. MS1 School Sample Allocation

School Type     Census Region   Prevalence   Participation Goals   School Frame Count   Total Selected School Sample   Initial School Sample   School Reserve Sample
Total           -               -                            900               48,376                          3,710                   1,236                   2,474
Public          Northeast       High                          17                  245                             70                      23                      47
Public          Northeast       Low                          105                4,965                            433                     144                     289
Public          Midwest         High                          35                  445                            144                      48                      96
Public          Midwest         Low                          127                8,167                            524                     175                     349
Public          South           High                          50                  578                            206                      69                     137
Public          South           Low                          228                9,569                            940                     313                     627
Public          West            High                          26                  290                            107                      36                      71
Public          West            Low                          152                9,554                            627                     209                     418
Catholic        Northeast       Low                           19                1,069                             78                      26                      52
Catholic        Midwest         Low                           28                1,697                            115                      38                      77
Catholic        South           Low                           19                  970                             78                      26                      52
Catholic        West            Low                           14                  770                             58                      19                      39
Other Private   Northeast       Low                           16                2,060                             66                      22                      44
Other Private   Midwest         Low                           15                2,277                             62                      21                      41
Other Private   South           Low                           33                3,768                            136                      45                      91
Other Private   West            Low                           16                1,952                             66                      22                      44


The size measure used for the probability proportional to size selection of 3,710 schools was constructed using the overall sampling rates for students in the following seven student categories:

  • Autism (AUT),

  • Emotional Disturbance (EMN),

  • Specific Learning Disability (SLD),

  • Asian, non-Hispanic (non-SLD, non-EMN, non-AUT),

  • Hispanic (non-SLD, non-EMN, non-AUT),

  • Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and

  • Other race, non-Hispanic (non-SLD, non-EMN, non-AUT)

combined with the total number of students in each of those seven categories at a given school. In other words, the size measure for a given school (i) in school stratum h may be written as follows:

MS_hi = f_h1 × N_hi1 + f_h2 × N_hi2 + … + f_h7 × N_hi7 = Σ_j f_hj × N_hij,

where f_hj is the sampling rate for the jth student category in the hth school stratum and N_hij is the number of students in the jth category within school i in the hth school stratum. The sampling rate, f_hj, equals the number of students to sample from the jth category in the hth school stratum divided by the number of students in the jth category across all schools in the hth school stratum. The sampling rates for the seven student categories listed above varied across the school strata; for example, a rate of 0 was used for students with Autism at Catholic schools while an overall rate of 0.033 was used for students with Autism at public schools. The designed student sampling rates by school strata are provided in table 3. Because private schools do not report focal disability counts to EDFacts, the school sampling process assumed no students in the focal disability categories were enrolled in private schools. The sampling plan did not rely on sampling focal disability students from private schools to achieve the desired number of participating students in each of the three focal disability categories.
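The size-measure computation (the sum over the seven student categories of the stratum's category sampling rate times the school's category enrollment) and a single probability-proportional-to-size draw can be sketched as follows; the rates and enrollment counts are illustrative, not the study's values:

```python
import random

# Illustrative sampling rates for one school stratum, for the seven
# student categories (AUT, EMN, SLD, Asian, Hispanic, Black, Other).
rates = [0.033, 0.040, 0.005, 0.009, 0.004, 0.003, 0.005]

# Hypothetical enrollment counts for three schools in the stratum,
# in the same category order.
schools = {
    "school_1": [2, 1, 10, 5, 20, 15, 60],
    "school_2": [0, 0, 4, 2, 8, 5, 30],
    "school_3": [5, 3, 25, 10, 40, 30, 120],
}

# Size measure for each school: sum of rate * count over the categories.
size = {s: sum(f * n for f, n in zip(rates, counts))
        for s, counts in schools.items()}

# One PPS draw: selection probability proportional to the size measure.
total = sum(size.values())
random.seed(0)
r = random.uniform(0, total)
cum = 0.0
for school, m in size.items():
    cum += m
    if r <= cum:
        selected = school
        break
```

A full PPS design would draw without replacement within each stratum (e.g., systematic PPS on the sorted frame), but the size measure enters the selection probabilities exactly as above.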

Table 3. Aggregate Student Sampling Rates Used for School Selection

School Type     Census Region  Prevalence  Overall    AUT    EMN    SLD   Asian  Hispanic  Black  Other
Public          Northeast      High          0.009  0.038  0.065  0.004  0.008     0.004  0.003  0.005
Public          Northeast      Low           0.006  0.031  0.031  0.005  0.009     0.004  0.002  0.005
Public          Midwest        High          0.010  0.040  0.081  0.004  0.010     0.005  0.005  0.005
Public          Midwest        Low           0.006  0.024  0.024  0.005  0.009     0.004  0.004  0.005
Public          South          High          0.009  0.045  0.109  0.005  0.007     0.004  0.003  0.006
Public          South          Low           0.006  0.034  0.034  0.005  0.009     0.005  0.003  0.005
Public          West           High          0.010  0.052  0.122  0.004  0.009     0.004  0.004  0.006
Public          West           Low           0.005  0.025  0.025  0.005  0.008     0.004  0.003  0.005
Catholic        Northeast      Low           0.017     NA     NA     NA  0.024     0.024  0.024  0.014
Catholic        Midwest        Low           0.016     NA     NA     NA  0.022     0.022  0.022  0.015
Catholic        South          Low           0.016     NA     NA     NA  0.026     0.027  0.027  0.011
Catholic        West           Low           0.017     NA     NA     NA  0.025     0.025  0.024  0.011
Other private   Northeast      Low           0.011     NA     NA     NA  0.011     0.010  0.011  0.011
Other private   Midwest        Low           0.009     NA     NA     NA  0.010     0.009  0.009  0.009
Other private   South          Low           0.012     NA     NA     NA  0.012     0.012  0.012  0.012
Other private   West           Low           0.011     NA     NA     NA  0.011     0.011  0.011  0.011

NOTE: NA: Not Applicable. Sampling rates for the three focal disability groups do not apply to Catholic and Other private schools because focal disability counts are not available for these schools.

The distribution of the final school sample of 1,529 schools and the counts of the number of participating schools are provided in table 4.

Table 4. MS1 Final School Sample and Participating Schools

School Type     Census Region  Prevalence  Participation Goals  Final School Sample  Participating Schools  Ineligible Schools
Total           -              -                           900                1,529                    568                  83
Public          Northeast      High                         17                   31                      7                   4
Public          Northeast      Low                         105                  180                     59                   7
Public          Midwest        High                         35                   64                     14                   4
Public          Midwest        Low                         127                  215                     86                  18
Public          South          High                         50                   84                     26                   4
Public          South          Low                         228                  380                    168                  18
Public          West           High                         26                   46                     19                   3
Public          West           Low                         152                  249                     96                   9
Catholic        Northeast      Low                          19                   32                     11                   2
Catholic        Midwest        Low                          28                   48                     22                   1
Catholic        South          Low                          19                   31                     13                   1
Catholic        West           Low                          14                   23                     11                   2
Other Private   Northeast      Low                          16                   32                      3                   5
Other Private   Midwest        Low                          15                   25                      7                   1
Other Private   South          Low                          33                   60                     17                   1
Other Private   West           Low                          16                   29                      9                   3



The sampling plan was designed to produce constant weights within each of the seven student domains (autism, specific learning disability, emotional disturbance, Asian non-Hispanic (non-SLD, non-EMN, non-AUT), Hispanic (non-SLD, non-EMN, non-AUT), Black non-Hispanic (non-SLD, non-EMN, non-AUT), and other non-Hispanic (non-SLD, non-EMN, non-AUT)) within each school stratum. When weights are constant within a given student domain and school stratum, there is no increase in the design effect due to unequal weights for estimates produced for the given student domain and school stratum.
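The unequal-weighting penalty referred to above is commonly approximated with Kish's formula, deff_w = 1 + CV², where CV is the coefficient of variation of the weights; with constant weights the penalty is exactly 1 (no increase). A minimal sketch with illustrative weights:

```python
# Kish's approximation for the design effect due to unequal weighting:
# deff_w = 1 + CV^2 = n * sum(w_i^2) / (sum(w_i))^2.

def weighting_deff(weights):
    n = len(weights)
    s1 = sum(weights)
    s2 = sum(w * w for w in weights)
    return n * s2 / (s1 * s1)

# Constant weights within a domain/stratum: no weighting penalty.
constant = [40.0] * 100
# Unequal weights: the penalty grows with the spread of the weights.
unequal = [20.0] * 50 + [60.0] * 50

deff_constant = weighting_deff(constant)   # 1.0
deff_unequal = weighting_deff(unequal)     # 1.25
```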

Within participating schools, students were stratified into the seven student categories defined above, and a simple random sample of students was selected from each student sampling stratum. An average of 30.6 students were sampled from each of the 568 participating schools. However, the number of students sampled per student stratum varied by school because the within-school student allocation to strata depended upon the number of students in each of the seven student sampling strata. The process of determining the student sample allocation followed the procedure outlined in section 2 of Folsom et al. (1987).7
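The within-school selection step (though not the Folsom et al. allocation procedure itself, which is more involved) can be sketched as stratified simple random sampling from a roster; the roster and per-stratum allocations below are hypothetical:

```python
import random

# Hypothetical roster for one school: student_id -> sampling stratum
# (only two strata shown; the study used seven).
roster = {
    f"s{i:03d}": ("SLD" if i % 10 == 0 else "Other")
    for i in range(120)
}

# Hypothetical per-stratum allocations (the study derived these with the
# Folsom et al. 1987 procedure; these numbers are arbitrary).
allocation = {"SLD": 5, "Other": 20}

random.seed(0)
sample = []
for stratum, n in allocation.items():
    members = [s for s, g in roster.items() if g == stratum]
    # Simple random sample without replacement within the stratum.
    sample.extend(random.sample(members, min(n, len(members))))
```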

As schools agreed to participate in the study, students enrolled in grade 6 were selected from student rosters that schools were asked to provide. The student sample sizes were determined by the requirement that at least 782 students in each of the seven student domains8 participate in the second follow-up of MGLS:2017. That requirement was determined by evaluating the minimum required sample size that would be able to measure a relative change of 20 percent in proportions between any pair of the MGLS:2017 study rounds (MS1 in 2018, first follow-up, and second follow-up). Several assumptions were used to conduct this evaluation, as noted below.

  • Two-tailed tests with significance of alpha = 0.05 were used to test differences between means and proportions with required power of 80 percent.

  • A proportion of p = .30 was used to calculate sample sizes for tests of proportion.

  • Design effect is 2.0.

  • Correlation between waves is 0.6.

McNemar’s test using Connor’s approximation was used to determine the minimum sample size needed to meet the precision requirement under the previously stated assumptions. The Proc Power procedure available in SAS software9 was used to determine the minimum sample size.
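Under one plausible reading of these assumptions (a 20 percent relative increase from p = .30 to p = .36, with the correlation of 0.6 converted into discordant-cell probabilities), Connor's normal approximation for the McNemar sample size, inflated by the design effect of 2.0, reproduces the stated minimum of 782. The following is a sketch of that calculation, not the study's actual SAS program:

```python
import math

# Marginal proportions: a 20 percent relative increase between waves.
p1, p2 = 0.30, 0.36
rho = 0.6                      # assumed correlation between waves
deff = 2.0                     # assumed design effect
z_a = 1.959964                 # z for two-sided alpha = 0.05
z_b = 0.841621                 # z for power = 0.80

# Joint probability of "yes" at both waves implied by the correlation.
p11 = p1 * p2 + rho * math.sqrt(p1 * (1 - p1) * p2 * (1 - p2))
p10 = p1 - p11                 # discordant: yes at wave 1 only
p01 = p2 - p11                 # discordant: yes at wave 2 only
pd = p10 + p01                 # total discordant probability
d = p01 - p10                  # difference in marginal proportions

# Connor's (1987) normal approximation for the McNemar sample size.
n_srs = (z_a * math.sqrt(pd) + z_b * math.sqrt(pd - d * d)) ** 2 / d ** 2

# Inflate the simple-random-sampling result by the design effect.
n_required = math.ceil(n_srs) * deff   # 391 * 2.0 = 782
```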

The minimum number of students to sample from each of the seven student categories in the 2018 MS1, along with the assumptions used to derive those numbers, are provided in table 5.

Estimates of the minimum number of students to sample in the 2018 MS1 were derived by adjusting the 782 to account for a variety of factors including estimates of student response in grades 6, 7, and 8 as well as other factors, including the extent to which MS1 participating schools agree to participate in the first and second follow-up studies and the extent to which students are expected to change schools between grades 6 and 7 and between grades 7 and 8.

Because of different assumptions regarding student response rates and mover rates, the number of grade 6 students to sample varies across the student categories. In order to achieve the required minimum of 5,474 grade 8 respondents, with 782 respondents in each of the seven student categories, a total of 13,987 students must be sampled in grade 6. Following the assumptions specified in table 5, we estimated that 10,334, or approximately 74 percent, of a sample of 13,987 students would respond in grade 6. This minimal total sample of 13,987 students, however, employs a substantial oversampling of students in two of the three focal disability categories and a substantial undersampling of Hispanic (non-SLD, non-EMN, non-AUT), Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT) students. In order to reduce the impact of disproportionate sampling on national estimates and estimates that compare or combine estimates across student categories, the sample sizes for the Hispanic (non-SLD, non-EMN, non-AUT), Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT) student categories were increased.

Therefore, for MS1, the plan was to sample 29 students, on average, within each of 900 participating schools for a total of 26,100 sample students and, assuming the grade 6 eligibility and response rates shown in table 5, to produce approximately 20,322 participating grade 6 students. The distribution of the grade 6 student sample and estimates of the number of participating students in each of grades 6, 7, and 8 are provided in table 6. The achieved student sample sizes and actual student participation counts for grade 6 are provided in table 7. Table 8 provides the MS1 participation rates for students, parents, and school staff.
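Table 5's AUT column, for example, can be reproduced by working backward from the grade 8 target of 782, dividing by the expected yield at each stage. This sketch assumes the unrounded value is carried through the chain, rounding up only at each reported stage:

```python
import math

# Assumed rates for the AUT category, as stated in table 5.
target_g8 = 782            # grade 8 minimum number of respondents
elig, rr6 = 0.97, 0.67     # grade 6 eligibility and response rates
retain = 0.96              # school retention rate (grades 7 and 8)
move7, move8 = 0.30, 0.15  # move rates into grades 7 and 8
follow = 1.00              # mover follow rate (focal disability groups)
rr_stay, rr_move7, rr_move8 = 0.75, 0.45, 0.55

# Overall yield per sampled student in each follow-up round:
# retention * (non-movers who respond + followed movers who respond).
yield_g8 = retain * ((1 - move8) * rr_stay + move8 * follow * rr_move8)
yield_g7 = retain * ((1 - move7) * rr_stay + move7 * follow * rr_move7)

# Work backward through the chain without intermediate rounding.
n_g8 = target_g8 / yield_g8            # grade 8 inflated sample
n_g7 = n_g8 / yield_g7                 # grade 7 inflated sample
n_g6 = n_g7 / (elig * rr6)             # grade 6 inflated sample

print(math.ceil(n_g8), math.ceil(n_g7), math.ceil(n_g6))
# prints 1132 1786 2748, matching the AUT column of table 5
```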

Table 5. Minimum Sample Sizes and Associated Sample Design Assumptions for Student Sampling Categories

Assumption                              Each non-focal disability student category    SLD     AUT     EMN
Grade 6 inflated student sample size                                         1,509  2,455   2,748   2,748
Grade 6 student eligibility rate                                               97%    97%     97%     97%
Grade 6 student response rate                                                  85%    75%     67%     67%
Grade 7 inflated student sample size                                         1,244  1,786   1,786   1,786
Grade 7 school retention rate                                                  96%    96%     96%     96%
Grade 6 to 7 move rate                                                         30%    30%     30%     30%
Grade 7 mover follow rate                                                      80%   100%    100%    100%
Grade 7 non-mover response rate                                                92%    75%     75%     75%
Grade 7 mover response rate                                                    60%    45%     45%     45%
Grade 8 inflated student sample size                                           941  1,132   1,132   1,132
Grade 8 school retention rate                                                  96%    96%     96%     96%
Grade 7 to 8 move rate                                                         15%    15%     15%     15%
Grade 8 mover follow rate                                                      80%   100%    100%    100%
Grade 8 non-mover response rate                                                92%    75%     75%     75%
Grade 8 mover response rate                                                    70%    55%     55%     55%
Grade 8 minimum number of respondents                                          782    782     782     782

Table 6. Final Student Sample Sizes and Expected Minimum Student Participation by Grade

                                      --------- non-SLD, non-EMN, non-AUT ---------
Assumption                            Hispanic   Asian   Black   Other race    SLD      AUT     EMN     Total
Grade 6 inflated student sample size     3,786   1,509   1,868       10,986   2,455   2,748   2,748    26,100
Grade 6 student eligibility rate           97%     97%     97%          97%     97%     97%     97%
Grade 6 student response rate              85%     85%     85%          85%     75%     67%     67%
Grade 6 expected participants            3,122   1,244   1,540        9,058   1,786   1,786   1,786    20,322
Grade 7 school retention rate              96%     96%     96%          96%     96%     96%     96%
Grade 6 to 7 move rate                     30%     30%     30%          30%     30%     30%     30%
Grade 7 mover follow rate                  80%     80%     80%          80%    100%    100%    100%
Grade 7 non-mover response rate            92%     92%     92%          92%     75%     75%     75%
Grade 7 mover response rate                60%     60%     60%          60%     45%     45%     45%
Grade 7 expected participants            2,361     941   1,165        6,852   1,132   1,132   1,132    14,715
Grade 8 school retention rate              96%     96%     96%          96%     96%     96%     96%
Grade 7 to 8 move rate                     15%     15%     15%          15%     15%     15%     15%
Grade 8 mover follow rate                  80%     80%     80%          80%    100%    100%    100%
Grade 8 non-mover response rate            92%     92%     92%          92%     75%     75%     75%
Grade 8 mover response rate                70%     70%     70%          70%     55%     55%     55%
Grade 8 expected participants            1,963     782     969        5,697     782     782     782    11,757

Note: SLD=Specific Learning Disability. AUT=Autism. EMN=Emotional Disturbance. The non-focal disability student categories are Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).

Table 7. Actual Student Sample Sizes and Student Participation10 for Grade 6

                                      --------- non-SLD, non-EMN, non-AUT ---------
Assumption                            Hispanic   Asian   Black   Other race    SLD      AUT     EMN     Total
Grade 6 eligible student sample size     3,719     948   1,641        7,727   1,535     786     456    16,812
Grade 6 participants                     3,188     804   1,400        6,645   1,248     608     388    14,281

Note: SLD=Specific Learning Disability. AUT=Autism. EMN=Emotional Disturbance. The non-focal disability student categories are Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).

Table 8. MS1 Student, Parent, and School Staff Actual Participation Numbers and Rates

Session/Survey Type                         Number eligible   Number participated   Participation rate
Students                                            16,468¹                13,274                  81%
Parents                                              16,812                 8,619                  51%
School Administrators                                   568                   435                  77%
Math Teachers                                         1,739                 1,046                  60%
Math Teacher Student Reports                         14,984                 8,604                  57%
Special Education Teachers                              980                   549                  56%
Special Education Teacher Student Reports             2,699                 1,318                  49%

¹ Excludes 344 students who were unable to participate in the survey and assessments due to disability or limited English proficiency.

B.2 Procedures for the Collection of Information

MS2 Samples

The MS2 student sample will consist of the estimated 16,812 students sampled in MS1 who have not withdrawn from the study, plus an estimated 6,163 additional students sampled at sample augmentation schools, for a total student sample size of 22,975. Some of the MS1 participants may be reclassified as study ineligible as part of MS2 or as part of status updates conducted between the MS1 and MS2 data collections: students who have died since the MS1 data collection will be classified as study ineligible, as will students who were previously thought to have been enrolled in sixth grade as of fall 2017 but who turn out not to have been. We estimate that 97 percent (16,307) of the 16,812 MS1 student sample members will be eligible for MS2.

The MS2 school sample will consist of the 568 MS1 participating schools, 697 MS1 non-participating schools that offer instruction in grade 8, and an estimated 1,100 non-base-year transfer schools at which one or more sample students will be enrolled as of MS2. We estimate that approximately 28 of the 697 MS1 non-participating schools that offer instruction in grade 8 will agree to participate and that approximately 767 students will be sampled from those schools, leading to participation of 488 students. In addition, we plan to recruit approximately 206 additional schools to augment the sample in order to increase the number of participating students in key domains. The augmentation sample is intended to increase the number of participating students who attended schools in towns in the 2017-2018 school year, students who attended private schools in the 2017-2018 school year, students who attended schools in the Northeast in the 2017-2018 school year, and non-Hispanic Black students.
Approximately 965 schools will be sampled from the reserve set of schools constructed for MS1; of these, an estimated 647 schools are expected to offer instruction in grade 8, to not reside in school districts that declined to participate in MS1, and therefore to be eligible to be pursued for recruitment. The reserve school sample size, estimated numbers of schools to be pursued for recruitment, and estimated number of participating schools from this augmentation sample are provided in table 9.

Table 9. MS2 School Augmentation Sample

School Type     Census Region  Prevalence Status  Schools in MS1 Reserve  Augmentation Sample  Est. Augmentation Schools with 8th Grade in Non-refusing School Districts  Est. Participating Schools Among Augmentation Sample
Total           -              -                                   2,431                  965                                                                        647                                                   206
Public          Northeast      High                                   42                    -                                                                          -                                                     -
Public          Northeast      Low                                   269                  146                                                                        103                                                    27
Public          Midwest        High                                   88                    -                                                                          -                                                     -
Public          Midwest        Low                                   348                  288                                                                        182                                                    58
Public          South          High                                  133                    -                                                                          -                                                     -
Public          South          Low                                   628                  388                                                                        262                                                    92
Public          West           High                                   63                    -                                                                          -                                                     -
Public          West           Low                                   405                   75                                                                         40                                                    12
Catholic        Northeast      Low                                    57                    -                                                                          -                                                     -
Catholic        Midwest        Low                                    79                   40                                                                         34                                                    12
Catholic        South          Low                                    54                    -                                                                          -                                                     -
Catholic        West           Low                                    44                    -                                                                          -                                                     -
Other Private   Northeast      Low                                    43                    -                                                                          -                                                     -
Other Private   Midwest        Low                                    46                    -                                                                          -                                                     -
Other Private   South          Low                                    87                   28                                                                         26                                                     5
Other Private   West           Low                                    45                    -                                                                          -                                                     -

NOTE: A dash (-) indicates that no schools from that stratum are included in the augmentation sample.


Assuming the desired number of participating schools from the augmentation sample is achieved, an estimated 3,930 additional students are expected to participate in MS2 from the augmented sample of schools. Of these 3,930 additional participating students, an estimated 496 attend schools in towns, 243 attend private schools, 490 attend schools in the Northeast, and 520 are non-Hispanic Black students.

MGLS:2017 will rely on a set of complementary instruments to collect data from several types of respondents, providing information on the outcomes, experiences, and perspectives of students; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. These instruments will be used when the students are in grades 6 and 8 to allow for the analysis of change and growth across time. At each round of data collection, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked, through an online survey or over the telephone, about their background, family resources, and involvement with their child’s education and school. Students’ mathematics teachers will complete a two-part survey: in part 1, they will be asked about their background and classroom instruction; in part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom(s). For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and a student-level questionnaire, but with questions specific to the special education experiences of and services received by the study child. School administrators will be asked to report on school programs and services, as well as on school climate. Finally, a facilities observation checklist, consisting of questions about the school buildings, classrooms, and campus security, will be completed by field data collection staff.

B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

Below, the methodological descriptions focus on MS2B student tracking and MS2 school recruitment and student and parent address update activities. Tracking and recruitment activities were approved in OMB# 1850-0911 v.21-23. The current submission requests approval for the MS2 data collection procedures.

MS2B Tracking

Obtaining high response rates, and thereby minimizing sample attrition, is always a paramount objective in longitudinal studies. MGLS:2017 employs a three-tiered tracking protocol to ensure that contact information is as current as possible prior to the next data collection. Tracking will occur for all students in the MS1 sample who did not refuse participation in MS1. The planned three-tiered approach to tracking the MGLS:2017 MS1 sample includes an enrollment status update at the school level, student/parent address update activities with parents, and database tracing. The first round of tracking activities occurred in spring 2019 (MS2A), when most of the sampled students were in grade 7. The MS2A experience (approved in OMB# 1850-0911 v.21-23) informed the procedures for the second round of tracking (MS2B), which will occur in the fall of 2019, in advance of the winter/spring 2020 data collection, when most students will be in grade 8.

School Enrollment Status Update. The purpose of the school enrollment status update is to check the enrollment status of the sampled students in each MS1 school in the fall of the school year before the planned spring MS2 data collection. We anticipate that many of the students will continue to be enrolled in the school they attended during MS1, while others will have advanced to a destination school (if their school ended in grade six or seven), transferred to a new school, or moved into another circumstance such as home schooling. MS1 schools were asked to provide enrollment status information in the winter/spring of 2018-19; this information will be used to inform the MS2B school enrollment status update activities. In the fall of 2019, enrollment status updates will be collected from MS1 schools as well as from schools to which students may have transferred or matriculated since the 2017-18 school year, based on information from the MS2A tracking activities. Collecting this information is necessary to maintain current records.

The schools that participated in MS1 will be asked to review the list of eligible sampled students from MS1. For those students who have left the school, we will ask schools to provide the students’ last date of attendance, current school status (transfer, home schooling, etc.), last known address and phone number, and, for transfer students, the name, city, and state of the student’s new school if they are known. We anticipate that it will take 20 minutes, on average, to provide this information through a secure website set up for this purpose.

To initiate this contact, the school principal from each school will receive a lead letter that explains the purpose of the planned follow-up study and that includes a user name, password, and secure website address. Appendix MS2B-A contains the letter to be sent to MS1 schools with sampled students. The letter will prompt the principal or designee to log into the study website. Upon logging in, the principal or designee must confirm that he or she is the intended recipient of the letter by answering an identity verification question, such as “What is the school phone number?”, and then reset the password for the account. No information can be accessed until the password has been reset to a strong password; a test of the password’s strength is built into the password change application. The user then proceeds to a screen where he or she verifies the current enrollment of sampled students and provides any updated information available on MGLS:2017 students who are no longer enrolled. Appendix MS2B-B includes the instructions to users and Appendix MS2B-C provides a sample of the enrollment list update application screens.

If a user stops and returns later to continue updating student enrollment status, he or she must log in with the newly created password. If the user forgets this password, he or she must contact the MGLS:2017 help desk to have it reset.
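The built-in password-strength test described above could be sketched as follows. The specific rules (minimum length, required character classes) are assumptions for illustration; the submission does not specify the actual criteria used by the MGLS:2017 website.

```python
import re

def is_strong_password(password: str) -> bool:
    """Hypothetical strength rules; the actual criteria used by the
    MGLS:2017 password change application are not documented here."""
    if len(password) < 12:           # assumed minimum length
        return False
    required_patterns = [
        r"[a-z]",      # at least one lowercase letter
        r"[A-Z]",      # at least one uppercase letter
        r"\d",         # at least one digit
        r"[^\w\s]",    # at least one special character
    ]
    return all(re.search(p, password) for p in required_patterns)
```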

A follow-up email (Appendix MS2B-D) will be sent two weeks after the lead letter to all nonrespondents. School Enrollment List Update nonrespondents will be categorized into two groups:

Group One: Those who have not changed their password or initiated the process at all. They will receive an email with the same study ID, password, and URL as the lead letter, prompting them to change the password and begin the enrollment update process.

Group Two: Those who have started the update but have not submitted it. They will receive an email prompting them to complete the process and reminding them that, if they have forgotten their password, they can contact the help desk to have it reset.
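The two-group categorization above could be derived from simple account-state flags. A minimal sketch follows; the flag names are illustrative assumptions, not fields from the actual tracking system.

```python
def classify_nonrespondent(password_changed: bool, update_submitted: bool) -> str:
    """Assign a School Enrollment List Update case to a reminder-email
    group. Flag names (password_changed, update_submitted) are hypothetical."""
    if update_submitted:
        return "respondent"   # completed the update; no reminder needed
    if not password_changed:
        return "group_one"    # never started: resend credentials, prompt to begin
    return "group_two"        # started but not submitted: prompt to finish
```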

After the two-week period, the recruitment team will begin to contact the school via telephone to follow up on the enrollment status update and to begin to coordinate the logistics of the in-school student data collection for the sampled students who remain at the school. The MS2 data collection is scheduled to begin in January 2020.

As the enrollment status updates are received and processed, students who are no longer attending the base-year school will be identified. Destination schools will be contacted if four or more MGLS:2017 students have enrolled at the school. Appendices MS2B-E through MS2B-G provide the communication materials that will be sent to the school districts and schools that are newly identified for the study. Destination schools will be encouraged to assist and support the in-school student data collection, and if they do so will be eligible to receive the same incentives that all other fully participating schools will receive.

If fewer than four students transfer to a particular school, or if a student becomes homeschooled, attends a virtual school, or is otherwise not enrolled at a school, those students will be contacted separately and invited to participate outside of school via the Web. If students are enrolled in a transfer school, the school will be contacted (Appendix MS2B-G1) to notify it that the math and special education teachers of transfer students, as well as the school administrator and the school coordinator, will be invited to participate.

The letter to the school administrator that initiates the tracking activities (Appendix MS2B-A) describes the upcoming MS2 data collection, which will consist of a 90-minute student session, a 35-minute parent survey, a 20-minute math teacher survey plus 7 minutes per student report, a 10-minute special education teacher survey plus 20 minutes per student report, and a 40-minute school administrator survey.

Parent/Student Address Update. In addition to the school-level update, we plan to directly contact the parents of eligible sampled students to update our address database. A mailing (Appendix MS2B-M) will be sent to the parent or guardian of each sampled student asking the parent or guardian to log onto our website and update their contact information. If we have an email address for the parent, the materials will be sent via email as well (Appendix MS2B-N). For data security reasons, no personally identifiable information will be preloaded onto the website for this address update. In addition to updating contact information, parents will be asked whether their child will be at the same school that he/she attended in the spring of 2019 (or in the spring of 2018 if no updated school information was received), or whether his/her school enrollment status has changed. This provides two sources of information in preparation for the MS2 data collection. The address update will take approximately 5 minutes to complete. See Appendix MS2B-O for an example of what information will be on the website for the parent to update. To maximize response, parents will be offered a $10 incentive for providing this information (approved in OMB# 1850-0911 v.23), and a hardcopy version (Appendix MS2B-P) of the same address update form will be sent to nonrespondents 3 weeks after the initial mailing containing the address update website is sent. An email reminder will be sent at this time as well.

The address update is very important for the first follow-up of MGLS:2017, because the final base-year sample size is lower than originally targeted. Considering the value that the base-year student and parent data provide, maximizing the amount of contact information we get from parents is crucial to the study.

Tracing. Batch tracing will be conducted about 30 days prior to the start of the MS2 data collection. Batch database searches are used to confirm or update the contact information that we have for parents, conserving resources for the data collection activities. Throughout the data collection period, for parents whom we are unable to reach because contact information is missing or out of date, RTI tracing specialists will use proprietary databases to conduct intensive tracing activities. A locator database will be maintained for the study, and all newly identified information will be loaded into the locator database regularly for use in current and future data collection efforts.

MS2 School Recruitment Approach

Gaining schools’ cooperation in voluntary research is increasingly challenging. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,11 whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.12 Additionally, there is evidence that response rates may be lower for schools that serve older students, as in the High School Longitudinal Study of 2009, which had a weighted school-level response rate of 56 percent.13 As previously stated, MGLS:2017 MS1 achieved participation from about 570 schools instead of the desired 900. In addition to returning to the 570 schools that participated in MS1, we will be augmenting the sample and recruiting an additional 206 schools to participate in winter/spring 2020. Effective strategies for gaining the cooperation of schools are therefore of paramount importance. Recruitment activities for the augmentation sample began about one year prior to the start of MS2 data collection, in January 2019. Recruitment strategies for the augmentation sample as well as MS1 schools are presented below and were approved in previous submissions (OMB# 1850-0911 v.23).

Recruitment of MS1 Districts and Dioceses. Some schools in the augmentation sample, or schools to which sampled students have transferred, will be in school districts or dioceses that already have schools participating in the study. We will notify these districts and dioceses that we have added schools to the MS2 sample and that we will be contacting those schools. If a district previously required a research application, an addendum to that application will be sent to the district for approval to contact the school(s).

Recruitment of New Districts or Dioceses. School districts of augmentation sample public schools and dioceses of sampled Catholic schools (where a district or diocese affiliation exists) that are new to the study, either because they contain augmentation sample schools or because one or more sampled students have transferred into their schools, will receive a mailing about the study. The district introductory information packet includes a cover letter (Appendix MS2B-H), a colorful recruitment-oriented brochure (Appendix MS2-A1), and a sheet of Frequently Asked Questions (FAQs) about the study (Appendix MS2B-K). Three days after mail delivery of the packet, a recruiter will call to secure the district’s cooperation and answer any questions the superintendent or other district staff may have. The recruiter will also discuss the sampled schools and confirm key information about them (e.g., grades served, size of enrollment). Information collected during this call is used to confirm which schools in the district are eligible for participation in the study and to obtain contact and other information helpful in school recruitment.

The study staff are prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter documents all concerns listed by the district so that a strategy can be formulated for refusal conversion attempts.

In addition to obtaining permission to contact the selected schools, districts are also asked about the best way to gather student rosters.

Recruitment of Public and Catholic Schools. Upon receipt of district or diocesan approval to contact the sample public or Catholic schools, respectively, an introductory information packet is sent via overnight express courier that includes a cover letter (Appendix MS2B-I) and the same colorful recruitment-oriented brochure (Appendix MS2-A1) and sheet of Frequently Asked Questions (FAQs) about the study (Appendix MS2B-K) that were sent to school districts and dioceses with links for accessing the MGLS:2017 recruitment website. Three business days after the information packet delivery (confirmed via package tracking), a school recruiter follows up with a phone call to secure the school’s cooperation and answer any questions the school may have. During this call, the recruiter establishes who from the school’s staff will serve as the school coordinator for the study. If schools do not respond by phone or email to recruitment efforts within one month, in-person recruitment will be considered for the school.

In the fall of 2019, the MGLS:2017 study team will work with the school coordinator to schedule MS2 activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the onsite assessments. In early communications, the recruiter will also gather information about what type of parental consent procedures need to be followed at the school; any requirements for collecting data on the IEP status of students and student’s teacher and math course information; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact the scheduling of student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.

Private and Charter School Recruitment. If a private or charter school selected for MS2 operates under a higher-level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher-level governing body. If a private or charter school selected for MS2 does not have a higher-level governing body, the school recruitment approach outlined above will be used.

Recruitment of Schools for Out-of-school Student Data Collection. As a final effort to secure the participation of schools and their students, we will offer the possibility of collecting student data outside of school. This option will be offered to schools that are unable to otherwise schedule study participation into the school calendar. Schools allowing an out-of-school student data collection will be asked to provide a student roster to select students into the sample along with contact information to invite the parent and student to participate via Web and teacher information in order to invite teachers to participate in the teacher surveys. Student data collection will be conducted using the out-of-school data collection procedures described below.

Collection of Student Rosters. Beginning in the fall of 2019, data collection staff will gather student rosters for augmentation schools that have agreed to participate in the study. These rosters will be collected from the district or directly from the school with the assistance of the school coordinator from the school. A complete roster of all students eligible for sampling will be requested, and information will be requested for each student on key student characteristics, such as: name; school or district ID number; month and year of birth; grade level; gender; and race/ethnicity14. Each of these characteristics is important for sampling purposes, but we will work with schools that are unable to provide all of the information to obtain the key information available. Based on this information the student sample will be drawn. As part of the roster collection, the study will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: student’s parent and/or guardian contact information (e.g., mailing address; landline phone number; cell phone number; e-mail address); student’s math teacher (including course name and period or section number); and student’s special education teacher, when applicable. Schools and districts usually find it easiest, and therefore most efficient, to supply all of the desired information one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, the data collection team will gather that information as a second step for the sampled students only. 
If the school and/or district is unwilling to provide parent contact information for the sampled students, the team will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or mail the required materials to parents using the contact information they have on file).

Schools and districts will be provided with a template and secure transfer options to deliver the rosters (see Appendix MS2B-J for student rostering materials). Unlike the enrollment status update, which is completed directly on the website, most schools will upload their roster to the study website in a process similar to attaching a file to an email. Once received, the data quality of the student rosters will then be evaluated by:

  • reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;

  • reviewing and assessing the quality of the data on student-teacher linkages;

  • reviewing and assessing the quality of the data on IEP status;

  • addressing any incompleteness or irregularities in the roster file;

  • requesting additional information as needed from the school coordinator or designated district personnel; and

  • (re)verifying that the sampled students are currently in attendance in the school.
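Parts of the roster quality review listed above could be automated before manual follow-up with the school coordinator. The sketch below is illustrative only: the field names are hypothetical, chosen to mirror the characteristics requested from schools, and the checks are assumptions rather than the study's actual editing rules.

```python
def check_roster_record(record: dict) -> list:
    """Return a list of quality issues found in one roster record.
    Field names are hypothetical stand-ins for the requested
    characteristics (name, ID, birth month/year, grade, gender,
    race/ethnicity, parent contact, math teacher linkage)."""
    required = ["name", "student_id", "birth_month_year", "grade",
                "gender", "race_ethnicity"]
    issues = [f"missing {field}" for field in required if not record.get(field)]

    # Parent contact: at least one means of reaching the parent is expected.
    contact_fields = ["mailing_address", "landline", "cell", "email"]
    if not any(record.get(field) for field in contact_fields):
        issues.append("no parent contact information")

    # Student-teacher linkage: a math teacher is expected for every student.
    if not record.get("math_teacher"):
        issues.append("missing math teacher linkage")
    return issues
```

Records that return an empty issue list pass the automated checks; any remaining issues would be resolved by requesting additional information from the school coordinator or designated district personnel, as described above.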

Maximizing School Participation. District and school participation rates in school-based studies have been declining steadily over time. District and school personnel understand the value of the research but have many reasons for refusing to take part in these voluntary studies, which impose considerable burden on their part. Studies increasingly experience challenges in obtaining the cooperation of districts and schools. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals impede gaining permission to conduct research in schools. MGLS:2017 recruitment teams will be trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, parents, and students the benefits of participating in MS2 and what participation will require in terms of student and school personnel time. The following strategies will be utilized to maximize response rates among school districts and schools during the recruitment process:

  • We have established partnerships with organizations such as the Association for Middle Level Education (AMLE), the National Forum to Accelerate Middle-Grades Reform (the Forum), the National Center for Education, Research and Technology (NCERT), and the School Superintendents Association (AASA). These organizations will actively promote the value of the study to their constituencies, as will a number of middle-grades education researchers who will participate in the recruitment effort. Members of the study team have attended conferences for these organizations and spoken to many schools about the value of the study. Once the augmentation sample is selected, it will be cross referenced with the list of school districts and schools we have spoken with at these conferences and, where there is a match, we will reach out directly to the person we spoke with at the conference to pave the way for the district’s or school’s participation. In addition, representatives from these organizations have committed to provide outreach to the middle grades community in general via information in newsletters and related communications. These communications will include information about the importance of the study, what study participation entails, and the benefits of the study to the middle grades community.

  • We attended the National School Boards Association annual meeting and will be attending the National Principals annual meeting. At each conference, we will have an exhibit staffed with MGLS:2017 representatives who can introduce the study to attendees and answer any questions they may have. Our goal with attendance at each conference is to generate interest in the study as well as to meet officials who serve on our sampled schools’ school boards or work in their district offices.

  • School staff are extremely busy and reaching them to discuss the study is sometimes challenging. Schools that do not respond to our communications within one month will be considered for an in-person recruitment attempt. During this in-person visit, our staff will be able to collect roster or enrollment status information as needed.

  • In-person recruitment will also be used for refusal conversion conversations as appropriate. Conference calls with senior staff at NCES and RTI may also be utilized for refusal conversion conversations. In-person recruitment has been used to initiate contact with the augmentation and refusal conversion samples. Our goal is to gain early cooperation with in-person meetings with district- and school-level decision makers.

  • We have learned that many schools prefer to participate as early as possible in the calendar year, especially in January and February, to minimize overlap with school testing preparations. Data collection will begin as early as possible to enable schools to find a date that fits in their schedule. The in-school data collection will begin in January 2020. In many schools/districts the January dates avoid mandatory testing, among other spring term activities (more than one-third of MS1 participating schools participated in January or February).

  • MS2 data collection will be conducted from January through July 2020. In-school student data collection will take place from January through June 2020, and staff and parent survey collection from January through July 2020. The inclusion of June 2020 as part of the available dates for in-school sessions will enable some schools to participate after their high-stakes testing is finished. Staff and parent surveys will continue through July 2020 to allow them sufficient time to respond, given that teacher and parent lists are submitted on a flow basis throughout the in-school data collection period.

  • Each school will be offered $400 by check or a $400 equivalent in goods or services. To provide a tangible connection between the school’s participation and study findings, and to respond to districts’/schools’ desire for data, we will also offer each school a report presenting its aggregated MGLS:2017 assessment results as compared to national and sub-national results (where possible).

  • We will offer personnel of participating schools and districts training in analyzing and learning from MGLS:2017 data (to take place after data collection ends) as a professional development and continuing education incentive.

  • We have included in this submission minor revisions to our communication materials to emphasize more explicitly the value and uniqueness of this middle-grades study and what may be learned as a result. We also make reference to what we give back to districts and schools (e.g., school-level reports).

  • To encourage submission of parental consent forms in schools requiring explicit consent for student participation and to engender goodwill and enthusiasm with the school, we will offer the students an in-school pizza party (or other food provision per school’s preference) to motivate returning the consent forms. Such an offer has the potential to reduce burden on the school staff while increasing student participation. We found that districts and schools with explicit consent requirements are sometimes hesitant to participate, anticipating low student participation, and that an incentive to students for returning the form can boost participation and alleviate those concerns.

  • Students participating in school will be using earbuds to complete the audio portion of the student assessment. Students will be allowed to keep the earbuds and pencil after participation.

  • Students participating outside of school will be offered a choice between a 45-minute session for $20 plus a certificate for 2 hours of community service from the U.S. Department of Education or a 75-minute session for $20 plus a certificate for 3 hours of community service from the U.S. Department of Education and a donation of $20 to Save the Children’s special fund to help kids affected by the COVID-19 outbreak.

  • Each student who participates will also receive a 2-hour community service certificate regardless of whether they participate in school or outside of school. For confidentiality purposes, no study-specific information will be included on the certificate.

  • We will offer an out-of-school student data collection to schools unwilling or unable to fit MGLS:2017 into the school schedule. To maximize student response, we will also contact students who miss the in-school session to participate outside of school.

  • Schools that indicate their school time is limited will be offered a partial in-school session; students who do not finish will be asked to complete the remainder outside of school (Appendix MS2B-R5).

Recruiters will be trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in contributing high-quality data focusing on middle-grade students. Early engagement of districts and school administrators will be important. Along with what is described above, our plan for maximizing district, school administrator, and parent engagement includes the following:

Experienced recruiters. The recruiting team will include staff with established records of successfully recruiting school districts and schools. To maximize district approval, senior staff will make the initial district telephone contacts. Their familiarity with the study and its future impact, as well as their experience in working with districts to overcome challenges to participation, will be crucial to obtaining district approval. Recruiters contacting schools will be equally adept at working with school administrators and providing solutions to overcome the many obstacles associated with student assessments, including conflicts related to scheduling and assessment space, minimizing interruption to instructional time, and obtaining teacher and parent buy-in.

Persuasive written materials. Key to the plan for maximizing participation is developing informative materials and professional and persuasive requests for participation. The importance of the study will be reflected in the initial invitations from NCES (Appendices MS2B-E to I) sent with a comprehensive set of FAQs (Appendix MS2B-K), a colorful recruitment-oriented brochure describing the study, and a brief one-page flyer providing quick facts about the study which also explains that MGLS:2017 is different from other assessments (Appendix MS2-A1). Reviewing these study materials should provide districts and school administrators with a good understanding of the study’s value, the importance of MGLS:2017, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.

Persuasive electronically accessible materials. In addition to written materials, information about the study will be available on the study website (text in Appendix MS2-B1). The website will draw heavily on the written materials, will present clear and concise information about the study, and will convey the critical importance of taking part in the study.

Outreach. As mentioned briefly above, AMLE and the Forum will provide an outreach service, asking for support of the study, offering updates to their constituencies on the progress of the study, and making available information on recent articles and other material relevant to education in the middle grades. In addition, project staff will reach out to contacts made at various conferences attended to promote the study.

Buy-in and support at each level. During district recruitment, the study team will seek not just permission to contact schools and obtain student rosters but also to obtain support from the district. This may take the form of approval of a research application and a letter from the district’s superintendent encouraging schools to participate. Active support from a higher governing body or organization, such as a district or a diocese, encourages cooperation of schools. Similarly, when principals are interested in the research activity, they are more likely to encourage teacher participation and provide an effective school coordinator.

Avoiding refusals. MGLS:2017 recruiters will work to avoid direct refusals by focusing on strategies to solve problems or meet obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other ways to persuade school districts and schools to participate.

MS2 Data Collection

The MS2 data collection will include in-school student sessions, out-of-school student sessions, as well as surveys for students’ parents, math teachers, special education teachers or service providers (as applicable), and school administrators. The student session consists of surveys and direct assessments and will take place primarily in the school setting, administered on Chromebooks (tablet-like computers with touchscreen capability and attached keyboards) brought into the school by MGLS:2017 staff.

Data collection protocols for MS2 will closely resemble those established for MS1 data collection. To administer the survey and direct assessments, study staff will work with schools to identify and utilize locations for administration that minimize distractions for the student and disruption to the school routine. The parent, mathematics teacher, special education teacher, and school administrator surveys will have a web option and a telephone option, so respondents will have the choice to complete the survey in a variety of school and non-school settings. Initially, the surveys will be released in web-based form. Students will also have the ability to participate outside of school if they have moved to a school with fewer than four study students, are homeschooled, or miss the in-school session. To access the web-based surveys, parents, teachers, and school administrators will receive an email with links and instructions for completing the survey (described in more detail below). Materials for the student web-based session will be sent directly to the parent for distribution to their child.

The student in-school session will take approximately 90 minutes and will include assessments in mathematics, reading, and executive function “brain games” (Hearts and Flowers and 2-Back), plus a brief student survey. Height and weight measurements will also be collected. In addition, the session facilitator (SF) will complete a facilities observation checklist to report information about the school environment.

Planning for the School Data Collection Visit. MGLS:2017 recruitment staff will work with the school to identify a school staff person to serve as a school coordinator for MGLS:2017. About 4 weeks prior to the scheduled student session, the school coordinator will receive a list of sampled students enrolled at the school and copies of the parental permission forms to send home with the sampled students.

Prior to the data collection visit, an SF will work with the school coordinator to verify that the students selected for the sample are still enrolled, and, if not already provided, to identify each student’s mathematics teacher and, if applicable, the student’s special education teacher or person who provides special education services to the student. The SF will also work with the school coordinator to establish the following:

  • The schedule for data collection (i.e., days the study will be collecting data in the school, start time and end time of the school day, bell schedule for the transition between classes, and window of time during which students will be assessed during the school day);

  • Any accommodations that may be needed for students, particularly those with IEPs;

  • The WIDA ACCESS™ (or equivalent) score for English language learner (ELL) students, to determine their capability of participating in English for MS2;

  • A location in the school setting to accommodate the data collection (determining the optimal space for the in-school student session and height/weight measurement station);

  • A plan for distributing permission forms and notification letters and tracking responses for explicit forms (implied permission will be encouraged when permissible since families are already familiar with the study);

  • The required logistical information for field staff entering the school (e.g., security and parking procedures); and

  • The school’s preferred protocol for having students arrive at and return from the study space (e.g., this may involve field staff going to classrooms to collect students, students being notified by their teacher to report to the study space, and/or students returning to a specific class on their own when finished with the assessment and survey).

The SF will visit the school prior to the student session to ensure that the logistics are arranged in preparation for a successful data collection. While at the school, the SF will complete a school facilities observation checklist (estimated to take the SF 45 minutes on average to complete), the completion of which does not require the involvement of any school staff (if required by the school, a school staff person may accompany the SF while the checklist is being completed). The SF will complete the checklist on paper, then enter the information online. This approach is less intrusive than carrying a laptop through the school for data entry of each individual element. The facilities checklist gathers information such as: the general upkeep of the school, including the presence of graffiti, trash, or broken windows (inside classrooms, entrance and hallways, and restrooms); noise level; security measures; and the general condition of the neighborhood/area around the school (Appendix MS2-V). This checklist may be completed during the pre-session visit and/or on the day of the student session.

Student Survey and Assessments (In-School). Student surveys and direct assessments will be administered in 90-minute group sessions during the school day. If a school is only willing or able to conduct a shorter in-school session, we will administer as much as possible in the school session and the remaining sections will be completed out of school at the student’s convenience. The lead SF will be responsible for administering the in-school student session. A second SF will accompany the lead SF to help with equipment setup, perform height and weight measurements, and to conduct a second student session if more than one student session is scheduled concurrently. The student session will generally be carried out as follows:

  • The SFs will arrive at the school on the appointed day of assessments, approximately 90 minutes prior to when the first student session begins, following all school security protocols for entering the school and seeking out the school coordinator who will be expecting the study team field staff per the arrangements made during the planning of the data collection visit;

  • The SFs will be escorted by school staff to the designated location for the student session;

  • The SFs will bring an independently functioning mobile testing lab to the school for the student session and will set up the equipment and space, verifying that the tablet computers are in working order and setting them to the appropriate start screen;

  • Once students arrive in the designated student session space, the lead SF will provide a brief introduction, including information about students’ participation in the study, and will then ask the students to put on a pair of earbuds provided by the study. The students will view a video that introduces the student session and helps them log in to begin (Appendix MS2B-Q7). If required by the school, the students will be asked to sign an assent form (Appendix MS2B-Q6); and

  • MS2 students will then complete a survey, mathematics assessment, reading assessment, executive function (Hearts and Flowers and 2-Back, described in Appendix MS2-C3), and will have their height and weight collected. We anticipate that a small number of schools will require a shorter in-school session, so some components may be administered outside of the in-school session.

Accommodations will be provided to the greatest extent possible to students who need them. As previously mentioned, the SF will work with the school coordinator to determine whether accommodations provided in MS1 still apply or if there are any new accommodations that may need to be provided. Possible accommodations include, but are not limited to, small group student sessions, one-on-one student sessions, and having an aide present during the student session. Most screens in the student session, except for the reading assessment screens, will have a button that the students may press to have the item read aloud to them. The read-aloud button will be accessible to all students, and its use will be encouraged for students who require a read-aloud accommodation. Students who require read-alouds will not participate in the reading assessment.

Participation in MS2 of students who are English Language Learners, who did not participate in MS1, and who continue not to participate in state assessments will be discussed with the school coordinator. Based on the state assessment eligibility criteria and the teacher’s recommendation, such a student may be included for the entire session, included in a session with only the mathematics and executive function components, or excluded from the session altogether. Data will be collected from these students’ parents, teachers, and school administrators.

If a school is unwilling to participate in the in-school data collection, we will contact sampled students from that school to participate outside of school. For each of those students, we will also attempt to collect data from the parent, teacher, and school administrator. Similarly, students who are home schooled or who miss an in-school session will also be asked to participate outside of school. Contextual data from parents, teachers, and administrators will be collected for all students, regardless of whether they participate in-school or outside of school.

Student Survey and Assessments (Out-of-school). An important aspect of a longitudinal study is to follow students regardless of where they attend school after the base year. This enables examination of the trajectory and associated success of all students regardless of their path through the education system. Out-of-school data collection was conducted successfully during the High School Longitudinal Study of 2009 (HSLS:09) first follow-up with students in modal grade 11 and was field tested in the MGLS:2017 OFT2. The out-of-school student session will take approximately 45 minutes and include only the first stage of the mathematics and reading assessments and the student survey. Students may also choose to complete a 75-minute session which would include the second stage of the mathematics and reading assessments. Height and weight measurements and executive function tasks will not be collected from students participating outside of school.

The MS2 out-of-school data collection will be conducted with students who are no longer enrolled in a base-year, destination, or transfer school with at least four study students; students whose schools are unable or unwilling to schedule an in-school session; students who are homeschooled; and students who miss the in-school session due to absence or an unreturned explicit permission form.

  • Because MS2 students are minors, initial communication with them will go through their parents or guardians. The parent or guardian of the study student will receive a letter and/or email telling him or her about the study and inviting the study student to participate via web (Appendix MS2B-R1, MS2-W.3.a), along with a set of FAQs about the out-of-school collection (Appendix MS2B-R8g). As an enclosure in the parent mailing, the parent will receive an envelope addressed to the student containing a letter that invites the student to participate via web and provides the information needed to log in to the session (Appendix MS2B-R3, MS2B-R6). By passing this information on to the student, the parent implies permission for the student to participate. If parents provide permission by phone, subsequent communication can take place with the student directly. If a student email address is provided, email reminders will be sent to students as well (Appendix MS2-W.5a through MS2-W.5e).

  • The student will complete the same survey and assessments as their in-school counterparts, except that the executive function components will not be administered and height and weight measurements will not be taken.

Parents of nonresponding students will receive reminder emails, letters, and postcards repeating the instructions on how the study student may access the survey. Emails will be sent approximately every 6-10 days and letters approximately every 2-3 weeks. Telephone prompting will also take place. If telephone prompting is unsuccessful and we have a valid mailing address, a session facilitator will visit the home and speak to the parent about prompting the student to participate. The SF will have his/her laptop and Chromebook so the student can complete the online student assessment at that time. Parents who have not completed the parent interview will also be given the opportunity to complete the online parent survey on the SF’s project laptop while he/she is there. Parents and/or students who receive an incentive as a token of appreciation will be asked to sign the receipt in Appendix MS2B-R10. If a session facilitator visits but no parent is home, he/she will leave a “Sorry I Missed You” card, like the one in Appendix MS2B-R9, at the door.

Parent Recruitment and Survey. While MGLS:2017 recruiters are following up with schools regarding enrollment status updates, recruiters will inquire about schools’ procedures for obtaining consent for students to participate in the follow-up round. Schools will be given one of three options: (1) a notification letter informing parents about their child’s participation in the study at the school; (2) an implicit permission letter, which gives parents the option of opting their child out of participation; or (3) explicit permission, which requires that parents provide written permission for their child to participate (Appendix MS2B-Q1 through MS2B-Q3). In MS2B, proactive parent recruitment will focus on maximizing the number of parents (1) returning signed explicit consent forms for the child and (2) completing the parent survey.

The initial communication with parents, consisting of introductory and consent materials, will be distributed in the way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or trained MGLS:2017 recruitment staff contacting parents directly by mail, email, and/or phone). The initial materials will introduce the study, explain the study’s purpose and the importance of student and parent participation, describe what is involved in participation, and specify the consent procedure being used by the school. If the school chooses explicit or implicit reconsenting, the materials will include a consent-seeking letter to all parents plus a consent form of the type specified by the school (Appendix MS2B-Q1 through MS2B-Q3), a colorful recruitment-oriented brochure (Appendix MS2-A1), and a sheet of FAQs about the study (Appendix MS2B-R8f) with links for accessing the MGLS:2017 recruitment website (website text in Appendix MS2-B1). Additionally, in schools using explicit consent, the parental consent form for the student’s participation, included in the initial communication materials, will ask parents to provide their contact information.

The parent survey is expected to take an average of 35 minutes to complete and will feature a multi-mode approach, with self-administered web-based questionnaires and a telephone interview follow-up for respondents not completing the questionnaire online. The instrument will be available in both English and Spanish.

Contacts with parents will be made using information received from the base year school, the MS1 parent survey, the parent address update, batch tracing, and the school-provided enrollment status update. The parent data collection will generally be carried out as detailed below.

  • Parent respondents will receive a letter and/or an email (Appendix MS2B-R1, MS2-W.3.a) that announces the launch of the survey and provides a link to the online instrument.

  • Upon completion of the survey, parents will receive a thank you letter and incentive.

  • For nonresponding parents, follow-up prompting will include reminder emails, letters, texts, and/or postcards with information repeating the instructions on how to access the survey.

    • Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • Parents will begin receiving telephone prompting and text messaging approximately 20 days after the initial contact. The study team interviewer placing the telephone call may offer to complete the survey with the parent over the phone at that moment or schedule a time to follow up with the parent and complete the survey later.

  • On or about May 1, 2020, parents will receive an incentive boost. Parents who have not responded as of this date will be offered a $30 incentive instead of the $20 incentive initially offered.

  • About five weeks prior to the end of the data collection period, all parents who have not yet participated will be invited to complete an abbreviated version of the parent survey, approximately 20 minutes in length, either online or by phone. The abbreviated survey includes a subset of items from the full parent survey. Items were selected based on their importance to researchers and cover household composition, parent education level and background, household languages, student health, and employment and income information. For nonrespondents, a follow-up reminder postcard (Appendix MS2B-R8c) will be sent to prompt the parent to complete the online survey, as well as an email reminder (MS2-W.3.e).

  • About three weeks prior to the end of the data collection period, all nonresponding parents will receive a mini parent survey, which is designed to be a one-page (front and back) paper-and-pencil survey. It will be mailed in the final three weeks of data collection to all parents who have not completed either the full or the abbreviated survey. The mini survey includes only the most critical items from the parent survey covering household composition, household income, and parents’ education level and employment status. As noted in Part C (Part C.4.2.1), the relevant parent questionnaire items were reworded or otherwise adapted to better fit the paper-and-pencil format. The mini parent survey is expected to take about five minutes and will come with a postage-paid envelope for returning the completed survey. Nonrespondents will receive a reminder letter (Appendix MS2B-R8d) that will be sent to prompt parents to complete the mini parent survey or the online survey.

Mathematics Teacher and Special Education Teacher Surveys. The mathematics teacher and special education teacher/provider surveys are web-based, self-administered surveys. The mathematics teacher survey consists of questions for the teacher about: (1) her/himself; (2) the class(es) taught by the teacher which include at least one study sample member (these questions repeat for each separate class); and (3) each study child, referred to as teacher student reports (TSRs). The special education teacher/provider survey consists of questions for the teacher about: (1) her/himself and (2) each study child (TSRs). For mathematics teachers, the combination of teacher questions and class questions is expected to take approximately 20 minutes to complete, and the TSR is expected to take approximately 7 minutes to complete for each student. For special education teachers, the teacher portion is expected to take approximately 10 minutes to complete, and the TSR is expected to take approximately 20 minutes to complete for each student. The mathematics and special education teacher survey data collection will generally be carried out as follows.

  • Each student’s mathematics teacher and, for students identified as having an IEP, the special education teacher or person who provides special education services will receive a letter and/or email (Appendix MS2B-S1b, MS2-W.2.a) announcing the launch of the survey and providing a link to the survey. While at the school for the pre-session visit or on the session day, the Session Facilitator may leave a reminder postcard (Appendix MS2B-S2) in the teacher’s mailbox in the school office to prompt them to complete the online survey.

  • Upon completion of the survey, teachers will receive a thank you letter and incentive.

  • For nonresponding mathematics and special education teachers, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For teachers who have not completed their web-based surveys after approximately 2-3 weeks, additional follow-ups will be used, including but not limited to, a telephone prompt encouraging teachers to complete their surveys.

  • In addition, SFs will prompt teachers while at the school for the student session. This may be done in-person or by a reminder postcard (Appendix MS2B-S2).

School Administrator Survey. The MS2 school administrator questionnaire will be web-based. It will take the administrator (generally, the principal or principal’s designee) approximately 40 minutes to complete. The school administrator data collection will generally be carried out as described below.

  • School administrators will receive a letter and/or an email announcing the launch of the survey with a link to the survey (Appendix MS2B-S1a, MS2-W.1a).

  • Upon completion of the survey, school administrators will receive a thank you letter (Appendix MS2B-Q8) including information about how the school will receive its incentive.

  • While at the school to conduct the student sessions, SFs will ask to meet with the school administrator to thank him or her for the school’s participation and remind the administrator to complete the survey if he or she has not done so already. SFs unable to meet with the school administrator personally will leave a hand-written note in the school administrator’s mailbox, reminding him or her to complete the survey or thanking him or her if he or she has already participated.

  • For nonresponding school administrators, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 6-10 days and letters will be sent approximately every 2-3 weeks.

  • For school administrators who have not completed their web-based survey after approximately 2-3 weeks, additional follow-ups will be used, including but not limited to, a reminder from the SF on the day of the student session or a telephone call encouraging school administrators to complete their survey.

  • About eight weeks prior to the end of the data collection period, all school administrators who have not yet participated will be invited by email to complete an abbreviated version of the survey, approximately 20 minutes in length (Appendix MS2-W.1e).

Administrative Records Data Collection

To add context and comparison to data provided in MS1 and MS2, the study hopes to collect administrative records. Any sampled student from either MS1 or MS2 (excluding study withdrawals) would be eligible for records collection. If approved, collection will likely take place in fall 2021, when most students are starting grade 10. This collection will include courses and grades from grades 6 through 8 as well as test scores from grades 8 and 9. Record collection will begin at the state and district level, with recruiters contacting schools as needed. For each student, the study may need to collect records from multiple sources. In this submission, we have revised our materials where necessary to mention the student records collection; however, there will be a separate submission for the actual administrative records collection.



B.4 Test of Methods and Procedures

Of the two MGLS:2017 field tests, the IVFT was conducted in winter/spring 2016 and the OFT1 in winter/spring 2017. Together, they were the basis for informing decisions about the methods and procedures for MS1. One of the main goals of the IVFT/OFT1 effort was to provide data needed to evaluate a battery of student assessments (in the areas of mathematics and reading achievement, and executive functions) and to evaluate survey instruments for use in MS1. To that end, a number of analyses were performed on the IVFT and OFT1 data to determine whether assessment and questionnaire items needed revision or removal.

The properties of the survey items were examined using frequencies, mean, median, mode, standard deviation, skew, kurtosis, and histograms. Differences in response patterns were examined overall and by grade level. If the survey items were intended to be part of a scale, reliability, dimensionality, and item-to-total correlations were examined. Additionally, bivariate correlations with preliminary mathematics assessment, reading assessment, and executive function information were examined. Finally, the timing required to answer items was reviewed to remove or revise any items that needed an inordinate amount of time to complete. Based on these findings, in combination with consideration of construct importance, decisions were made to revise some items and to remove others.
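The item screening described above can be illustrated with a small sketch. This is not the study's actual analysis code, and the function names are ours; it simply shows, for a respondents-by-items response matrix, the kinds of per-item distribution statistics, corrected item-total correlations, and scale reliability (Cronbach's alpha) that such a review would examine:

```python
import numpy as np

def item_statistics(responses):
    """Per-item mean, standard deviation, skew, and excess kurtosis."""
    x = np.asarray(responses, dtype=float)
    mean = x.mean(axis=0)
    sd = x.std(axis=0, ddof=1)
    z = (x - mean) / sd
    skew = (z ** 3).mean(axis=0)          # 0 for symmetric items
    kurtosis = (z ** 4).mean(axis=0) - 3  # excess kurtosis (normal = 0)
    return mean, sd, skew, kurtosis

def cronbach_alpha(responses):
    """Internal-consistency reliability of a multi-item scale."""
    x = np.asarray(responses, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

def corrected_item_total(responses):
    """Correlation of each item with the total of the remaining items."""
    x = np.asarray(responses, dtype=float)
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1]
                     for j in range(x.shape[1])])
```

Items with extreme skew or kurtosis, or with low corrected item-total correlations relative to the rest of the scale, would be flagged for revision or removal in a review of this kind.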

The purpose of the IVFT was also to provide data to establish the psychometric properties and item performance of the items in the mathematics item pool. These data were used to construct a two-stage mathematics assessment that was fielded in OFT1 and was refined for MS1. In addition, the IVFT and OFT1 provided data on the performance of the reading assessment and the executive function tasks. These data were used to refine the reading assessment and to select and refine executive function tasks that were fielded in MS1.

The IVFT also provided an opportunity to develop study policies and procedures that could be further tested in OFT1 for use in MS1. However, the two field tests are quite different. The IVFT included students in multiple grades, though not necessarily from a representative sample, and tested a large number of items to determine the item pool for the longitudinal study. A main goal of OFT1 was to better understand the recruitment strategies necessary for a large-scale nationally representative study to obtain the targeted sample yield of grade 6 general education students and students with disabilities, and the subsequent tracing and tracking strategies necessary to maintain the student sample from the base year (when sample students would be in grade 6) through the middle grade years, to when most of the students would be in grade 8. OFT1 provided an opportunity to further test the procedures that worked effectively in the IVFT and subsequently learn from OFT1 how to best implement them in MS1.

Two incentive experiments were conducted in OFT1 to inform decisions about the optimal baseline incentive offer for MS1. These experiments were a school-level incentive experiment and a parent-level incentive experiment.

School-level Incentive Experiment. School participation has been increasingly difficult to secure. Given the many demands and outside pressures that schools already face, it is essential that schools see that MGLS:2017 staff understand the additional burden being placed on school staff when requesting their participation. The study asks for many kinds of information and cooperation from schools, including a student roster with basic demographic information (e.g., date of birth, sex, and race/ethnicity); information on students’ IEP status, math and special education teachers, and parent contact information; permission for field staff to be in the school for up to a week; space for administering student sessions (assessments and surveys); permission for students to leave their normal classes for the duration of the sessions; and information about the students’ teachers and parents. Sample students with disabilities sometimes require accommodations and different session settings, such as individual administration and smaller group sessions, which add to the time the study spends in schools and sometimes require additional assistance from school staff to assure that these students are accommodated appropriately.

IVFT and OFT1 included a school-level incentive experiment (see Supporting Statement Part A of OMB# 1850-0911 v.9 and v.15 for details). Schools were randomly assigned to one of three incentive conditions: Condition 1 - $200, Condition 2 - $400, or Condition 3 - $400 in materials or services for the school (school coordinators also received $150, consistent across all three conditions). Table 10 displays information on the types of non-monetary materials or services offered in Condition 3.

Table 10. Non-Monetary Incentive Choices for Schools in Experimental Condition 3 (Approximate Value = $400)

  • Registration for Association for Middle Level Education (AMLE) or Regional Annual Meeting
  • Two-Year School Membership in AMLE
  • Membership in Regional ML Organization plus Subscriptions to Professional Journals
  • Professional Development Webinar
  • School Supplies
  • Library of Middle Level Publications


The original analytic plans for this incentive experiment called for combining results from the IVFT and OFT1 schools to increase the power to detect statistically significant differences. While we have provided an analysis that uses the combined results from the IVFT and OFT1, given differences between those groups that affect their predictive value regarding MS1, we have also provided individual analyses for the IVFT and OFT1. The IVFT set of schools was purposively selected, while the OFT1 schools were selected using a probability-proportional-to-size sampling method that mimics the method employed in MS1. Also, the IVFT was fielded with a shortened school recruitment window, making it likely that its recruitment results understate what could be expected in MS1. In addition, many districts refused to allow their sampled schools to be contacted for the study, which meant that sampled schools in those districts never actually received an incentive offer. To assess the degree to which the level of school incentive affected participation rates, we analyzed results both including and excluding schools in districts that refused. Table 11 presents results for schools in all districts. Due to the small sample sizes of schools, tests of statistical significance may not be particularly informative because of a lack of power. It would appear, however, that in OFT1 the $400 condition resulted in a higher participation rate than the $200 condition (Table 11, far right column).

Table 11. School Participation Rates by Experimental Condition (All Districts)
Each cell shows the participation rate (number of participating schools of number sampled).

Experimental Condition               IVFT and OFT1 Combined   IVFT               OFT1
1: $200                              23.1% (30 of 130)        22.9% (20 of 87)   23.3% (10 of 43)
2: $400                              27.3% (35 of 128)        22.1% (17 of 77)   35.3% (18 of 51)
3: $400 non-monetary equivalent      29.0% (36 of 124)        25.0% (21 of 84)   37.5% (15 of 40)
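The caveat about limited power can be made concrete. As a sketch (not part of the study's official analysis; the function name is ours), a standard pooled two-proportion z-test applied to the OFT1 rates in Table 11 ($200: 10 of 43 schools vs. $400: 18 of 51 schools) shows that the apparent difference does not reach conventional significance at these sample sizes:

```python
from math import sqrt, erfc

def two_proportion_z_test(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                        # pooled success rate
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p2 - p1) / se
    p_value = erfc(abs(z) / sqrt(2))                      # two-sided normal tail
    return z, p_value

# OFT1 school participation, $200 condition vs. $400 condition (Table 11)
z, p = two_proportion_z_test(10, 43, 18, 51)
print(f"z = {z:.2f}, p = {p:.3f}")   # roughly z = 1.27, p = 0.20
```

A 12-percentage-point difference with only 43 and 51 schools per condition yields a p-value near 0.20, consistent with the observation that these tests are underpowered at the field-test sample sizes.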


Table 12 presents participation rates among schools in cooperating districts.

Table 12. School Participation Rates by Experimental Condition (Participating Districts)
Each cell shows the participation rate (number of participating schools of number sampled).

Experimental Condition               IVFT and OFT1 Combined   IVFT               OFT1
1: $200                              39.5% (30 of 76)         37.7% (20 of 53)   43.5% (10 of 23)
2: $400                              49.3% (35 of 71)         37.8% (17 of 45)   69.2% (18 of 26)
3: $400 non-monetary equivalent      42.9% (36 of 84)         34.4% (21 of 61)   65.2% (15 of 23)


Table 13 presents participation rates for schools in all districts when the two higher incentive level conditions are combined. As with Table 11, due to the small sample sizes of schools, tests of statistical significance may not be particularly informative because of a lack of power. However, similar to the earlier results, it would appear that in OFT1 the $400 condition (whether monetary or non-monetary) was associated with a higher participation rate than the $200 condition (Table 13).

Table 13. School Participation Rates by Combined Experimental Condition (All Districts)
Each cell shows the participation rate (number of participating schools of number sampled).

Experimental Condition                         IVFT and OFT1 Combined   IVFT                OFT1
1: $200                                        23.1% (30 of 130)        22.9% (20 of 87)    23.3% (10 of 43)
2 and 3: $400 or non-monetary equivalent       28.2% (71 of 252)        23.6% (38 of 161)   36.3% (33 of 91)


Table 14 presents participation rates among schools in cooperating districts, when the two higher incentive level conditions are combined.

Table 14. School Participation Rates by Combined Experimental Condition (Participating Districts)
Each cell shows the participation rate (number of participating schools of number sampled).

Experimental Condition                         IVFT and OFT1 Combined   IVFT                OFT1
1: $200                                        39.5% (30 of 76)         37.7% (20 of 53)    43.5% (10 of 23)
2 and 3: $400 or non-monetary equivalent       45.8% (71 of 155)        35.9% (38 of 106)   67.4% (33 of 49)


Based on these results, we offered MS1 schools $400 in the form of a check or non-monetary equivalent (approved in OMB# 1850-0911 v.13) and, starting in June 2017, began offering an additional $200 (for a total of $600) to schools associated with districts that initially declined to participate and that had one or more sample schools designated as having “higher” counts of students in the focal disability groups (OMB# 1850-0911 v.15). Despite this approach, MS1 did not reach its participation target. School officials reported various reasons for not participating, such as the study burden; too many other assessments, research studies, or initiatives; loss of instructional time; no direct benefit to the school, staff, or students; and lack of resources at the school. While some schools appreciated the incentive offer, most that did not participate said their decision would not be affected by a higher monetary incentive. The $600 incentive was used specifically to help boost participation among schools with higher counts of students in the focal disability groups; because the study is no longer trying to achieve a representative sample of students in the focal disability groups for the follow-up, the $600 incentive is not requested for MS2.

In-person recruitment attempts were highly successful for schools that were reluctant to participate or difficult to reach by phone or email: of the 239 schools receiving an in-person recruitment visit, 78 (33 percent) participated in MS1. We plan to use this approach more proactively for MS2. We also found that schools participating in the base year are likely to participate again in the follow-up: each of the 45 participating OFT1 schools also participated in OFT2. Fewer students participated in OFT2 than in OFT1, likely because not all students were still attending their OFT1 schools; those students were contacted to participate outside of school, and student participation in out-of-school data collection is lower than in school. For MS2, we will follow 100 percent of the MS1 student sample, with the exception of study withdrawals, to maximize the number of participants in this round.

We are still analyzing results from MS1 and OFT2. We will provide additional details about MS1 and OFT2 recruitment and school debriefing survey results in the MS2 data collection request in 2019.

Parent-level Incentive Experiment. OFT1 evaluated baseline incentive amounts and incentive boosts, differentiating between parents of students with EMN and all other parents. Parents were randomly assigned such that incentive amounts differed between schools but not within schools (with the exception of incentive amounts for parents of students with EMN).

  • For the baseline incentive, parents of students with EMN were offered either $20 or $30.

  • Parents of non-EMN students were offered $0, $10, or $20 at baseline.

  • In early March, phase 2 began with an incentive boost offer: parents of students with EMN were offered an additional $10, and parents of non-EMN students were offered an additional $0 or $10 on top of the baseline amount.

  • In early April, phase 3 commenced: parents of students with EMN were offered $10 more, and parents of non-EMN students were offered either no additional boost or a boost bringing the cumulative offer to $40.

Parent contact information was received from schools on a flow basis, and invitations to complete the parent survey were sent as the contact information became available. As a result, some cases had an abbreviated phase 1, some never reached phase 2, and some never reached phase 3. The parent data collection took place between February 1 and May 31, 2017. This field period was abbreviated compared to that of MS1, for which data collection began in January 2018 and continued through August 2018.

Table 15 shows OFT1 parent participation by baseline incentive offer for non-EMN cases. In OFT1, parents offered $20 at baseline had the highest participation rate, significantly higher than that of parents offered $0. Therefore, in MS1, parents of students not having an EMN IEP designation were offered $20 at baseline.

Table 15. OFT1 Parent Participation by Baseline Incentive Offer: parents of students not having EMN IEP designation

Baseline Incentive   Selected N   Completes N   Completes Percent
$0                   320          105           32.81
$10                  653          245           37.52
$20                  660          287           43.48*

* Significantly higher than $0 (p<0.05).
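The submission does not state which significance test was used; as an illustrative plausibility check (our own sketch, not the agency's analysis), a standard two-proportion z-test on the Table 15 counts reproduces the flagged result. The function name here is hypothetical.

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with pooled variance (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Table 15: $20 baseline (287 of 660) vs. $0 baseline (105 of 320)
z, p = two_prop_ztest(287, 660, 105, 320)
print(round(z, 2), round(p, 4))  # z is about 3.20, p well below 0.05
```

The resulting z statistic of roughly 3.2 corresponds to p < 0.01, consistent with the asterisk in the table.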

Table 16 shows OFT1 parent participation by baseline incentive offer for EMN cases. Due to small sample sizes, tests of statistical significance may not be particularly informative because of a lack of power. In OFT1, parents of EMN students offered $30 at baseline had a substantively higher participation rate. Therefore, in MS1, parents of EMN students were offered $30 at baseline.

Table 16. OFT1 Parent Participation by Baseline Incentive Offer: parents of students having EMN IEP designation

Baseline Incentive   Selected N   Completes N   Completes Percent
$20                  32           7             21.88
$30                  29           10            34.48
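The "lack of power" caveat can be quantified. As an illustrative sketch (our own computation, not part of the agency's analysis; function names are hypothetical), a normal-approximation power calculation for a two-proportion z-test with the Table 16 sample sizes and observed rates shows power far below the conventional 0.80 target.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def approx_power(p1, n1, p2, n2):
    """Approximate power of a two-sided two-proportion z-test at alpha = 0.05."""
    diff = abs(p1 - p2)
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se0 = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # SE under H0
    se1 = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)    # SE under H1
    z_crit = 1.959963984540054  # two-sided critical value at alpha = 0.05
    return norm_cdf((diff - z_crit * se0) / se1)

# Table 16: $30 group at 34.48% (n = 29) vs. $20 group at 21.88% (n = 32)
power = approx_power(0.3448, 29, 0.2188, 32)
print(round(power, 2))  # roughly 0.19, far below the conventional 0.80
```

With power of roughly 0.2 to detect the observed 12.6-percentage-point difference, a non-significant test result is uninformative, which supports relying on the substantive difference in rates instead.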


In terms of what amount of boost might relate to higher participation, table 17 shows that among non-EMN parent cases reaching phase 2, in which a $10 boost was compared with no boost, those offered the $10 boost had a significantly higher response rate than those not offered a boost.

Table 17. OFT1 Parent Participation by First Incentive Boost Offer: parents of students not having EMN IEP designation

Incentive Boost   Selected N   Completes N   Completes Percent
No boost          633          198           31.28
$10 boost         415          155           37.35*

* Significantly higher than no boost (p<0.05).

Table 18 shows the results of a $10 first incentive boost for parents of EMN cases. There was no boost experimentation for EMN cases, because all pending cases received a $10 boost offer at the start of phase 2 and another $10 boost offer at the start of phase 3. Due to small sample sizes, tests of statistical significance may not be particularly informative because of a lack of power. In OFT1, parents of EMN students offered a cumulative $40 incentive (including a $10 boost) had a substantively higher participation rate. Therefore, in MS1, parents of EMN students were offered $30 at baseline and a $10 boost.

Table 18. OFT1 Parent Participation by First Incentive Boost Offer: parents of students having EMN IEP designation

Cumulative Incentive   Selected N   Completes N   Completes Percent
$30 ($10 boost)        25           6             24.00
$40 ($10 boost)        21           8             38.10


In analyzing the effectiveness of a second boost offer, the cumulative offer of $40 for non-EMN cases and $50 for EMN cases did not result in higher response rates.

Given the OFT1 boost offer results, non-EMN pending nonresponding parent cases in MS1 received a single boost offer of $10 (for a cumulative offer of $30) six weeks after initial contact, and EMN pending nonresponding parent cases also received a single $10 boost offer (for a cumulative offer of $40) six weeks after initial contact.

Because we are not pursuing the IEP oversample in MS2, we are not currently requesting a differential incentive for MS2. In MS2, as in MS1, parents will receive $20.

B.5 Individuals Responsible for Study Design and Performance

The following individuals at the National Center for Education Statistics (NCES) are responsible for MGLS:2017: Carolyn Fidelman, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, and David Wilson, along with subcontractor staff: Sally Atkins-Burnett (Mathematica) and Michelle Najarian (ETS).

1 A special education school is a public elementary/secondary school that focuses on educating students with disabilities and adapts curriculum, materials, or instruction for the students served.

2 Imputation was necessary in order to be able to include eligible schools in the sampling process, which helped ensure the sampling frame was more representative of the population of eligible schools. Imputation was used for grade 6 enrollment when grade 6 enrollment was missing. Imputation was used for focal disability counts when focal disability counts were missing. We note that schools in Wyoming and Iowa do not report to EDFacts so they would not be able to be represented in the MS1 sample without imputing focal disability counts. If both grade 6 enrollment and focal disability counts were missing, imputation was not used and these schools were excluded from the frame.

3 Sixth-grade enrollment is reported as 0 or was not reported to the CCD 2013-14 or PS 2013-14.

4 A school reports zero students or does not report the number of students in any of the three focal disability groups.

5 For sampling purposes, all private schools were classified as low prevalence schools because private schools do not report to EDFacts.

6 See, for example, Kish, L. (1965). Survey Sampling. New York: John Wiley & Sons, p. 56.

7 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.

8 The seven student domains are as follows: Autism (AUT); Emotional Disturbance (EMN); Specific Learning Disability (SLD); Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).

9 SAS Institute Inc. 2008. SAS/STAT® 9.2 User’s Guide. Cary, NC: SAS Institute Inc.

10 A student was considered a participant if the student completed the student assessment or student survey or if a parent associated with the student completed their respective survey.

11 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

12 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

13 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

14 We are no longer asking for IEP information on the student roster because we are no longer maintaining those oversamples.
