



High School and Beyond 2020 (HS&B:20)

Base-Year Full-Scale Study Recruitment and Field Test



OMB# 1850-0944 v.4





Supporting Statement Part B






Submitted by

National Center for Education Statistics

U.S. Department of Education




March 2019

revised August 2019




B. Collection of Information Employing Statistical Methods

Part B of this submission presents the statistical methods to be employed for the HS&B:20 base-year field test and for sampling and recruitment in the HS&B:20 base-year full-scale (BYFS) study.

B.1 Respondent Universe

The High School and Beyond 2020 study (HS&B:20) will follow a nationally representative sample of ninth-grade students from the start of high school in the fall of 2020 to the spring of 2024, when most will be in twelfth grade. The study sample will be freshened in 2024 to create a nationally representative sample of twelfth-graders. A high school transcript collection and additional follow-up data collections beyond high school are also planned. The sample of ninth-grade students selected in the fall of 2020 is referred to as the ninth-grade cohort, while the sample of students enrolled in twelfth grade in the spring of 2024 is referred to as the twelfth-grade cohort.

The target population for the BYFS consists of ninth-grade students in public and private schools in the 50 United States and the District of Columbia as of fall 2020. Excluded from the target universe will be special education schools, area vocational schools that do not enroll students directly, Department of Defense (DoD) schools outside of the United States, and schools associated with temporary housing, such as correctional facilities and treatment centers.

Field Test

The HS&B:20 field test will include ninth- and twelfth-grade students and will be conducted during the fall of the 2019-20 school year in six geographic locations (metropolitan statistical areas, or MSAs) selected so that sampled schools come from each of the four broad Census regions (Northeast, Midwest, South, and West). Field test recruitment is scheduled to begin in January 2019 and data collection in August 2019.

The field test will employ a two-stage sampling design, with schools selected in the first stage and students selected within schools in the second stage. Schools will be selected using simple random sampling within school sampling strata.

The primary sampling units (PSU) of schools will be selected from two databases of the U.S. Department of Education. The 2015-2016 Common Core of Data (CCD) will be used to select public schools and the 2015-2016 Private School Universe Survey (PSS) will be used to select private schools. The secondary sampling units (SSU) of students will be selected from student rosters that will be obtained from participating schools.
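To make the two-stage selection concrete, the sketch below illustrates the first stage as simple random sampling of schools within sampling strata; the frame records, stratum labels, and per-stratum sample sizes are illustrative placeholders rather than the study's actual strata or allocation.

```python
import random

def select_schools_srs(frame, allocation, seed=2019):
    """Stage 1: simple random sample of schools within each school sampling stratum.

    frame      -- list of dicts with hypothetical 'school_id' and 'stratum' fields
    allocation -- dict mapping stratum label -> number of schools to draw
    """
    rng = random.Random(seed)
    sample = []
    for stratum, n in allocation.items():
        eligible = [s for s in frame if s["stratum"] == stratum]
        sample.extend(rng.sample(eligible, min(n, len(eligible))))
    return sample

# Illustrative use with made-up frame counts for two strata in one MSA.
frame = ([{"school_id": i, "stratum": "MSA A / public, city"} for i in range(40)] +
         [{"school_id": 100 + i, "stratum": "MSA A / Catholic"} for i in range(15)])
stage1 = select_schools_srs(frame, {"MSA A / public, city": 18, "MSA A / Catholic": 0})
print(len(stage1))  # 18 schools; stage 2 would draw students from each school's roster
```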

Full-scale

BYFS will be conducted during the 2020-21 school year, with recruitment scheduled to begin in August 2019 and data collection in September 2020. The BYFS is designed to select a nationally representative sample of schools offering grade 9 instruction and a nationally representative sample of students enrolled in grade 9. The BYFS school population consists of regular public schools, including state department of education schools, that include 9th grade; Bureau of Indian Education schools that include 9th grade; and Catholic and other private schools that include 9th grade. It excludes the following types of schools:

  • DoD Education Activity schools outside of the United States,

  • Schools associated with correctional facilities, treatment facilities, hospitals, and other temporary housing facilities,

  • Area vocational schools that do not enroll students directly, and

  • Special education schools.2

The HS&B:20 BYFS will employ a multi-stage sampling design with schools selected in the first stage and students selected, within schools, at the second stage. Schools will be selected using probability proportional to size sampling within school sampling strata.

Students will be selected using simple random sampling within student sampling strata within schools. The school frame will be constructed from the 2017-18 Common Core of Data (CCD 2017-18) and the 2015-16 Private School Universe Survey (PSS 2015-16) and will include 28,688 schools that report offering ninth-grade instruction to at least 1 student. An initial sample of 1,373 schools will be selected with the goal of achieving 920 participating schools. A sample of approximately 26,000 students from the expected 920 participating schools is projected to yield 20,995 participating students enrolled in grade 9.
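For the full-scale first stage, schools are drawn with probability proportional to size within strata. The sketch below is a minimal illustration of systematic PPS selection within one stratum under generic, made-up size measures; the study's actual size measure is the composite measure described later in this section.

```python
import random

def systematic_pps(units, sizes, n, seed=2020):
    """Select n units with probability proportional to size via systematic PPS.
    A unit whose size exceeds the sampling interval would be hit more than once
    (in practice such units are taken with certainty)."""
    total = sum(sizes)
    interval = total / n
    start = random.Random(seed).uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    selected, cumulative, idx = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cumulative += size
        while idx < n and points[idx] <= cumulative:
            selected.append(unit)
            idx += 1
    return selected

# Illustrative use: 10 schools in one stratum, 3 selections.
schools = [f"school_{i}" for i in range(10)]
sizes = [12, 30, 7, 22, 18, 9, 40, 15, 25, 11]
print(systematic_pps(schools, sizes, 3))
```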

The 28,688 schools in the sampling frame will be explicitly stratified by the cross-classification of the following characteristics3:

  • school type (public, Catholic, other private),

  • region (Northeast, Midwest, South, West),

  • locale (city, suburb, town, rural), and

  • public school type (Charter, Magnet, Virtual, not Charter/Magnet/Virtual).

The distributions of the numbers of schools in the school sampling frame, the initial school sample, and school participation goals are shown by school sampling strata in table 1.

Table 1. HS&B:20 BYFS School Sample Allocation

School Type | Census Region | Locale | Participation Goals | School Frame Count | Total Selected School Sample | Initial School Sample | School Reserve Sample
Total | | | 920 | 28,688 | 2,728 | 1,373 | 1,355
Public-Charter | | | 67 | 2,466 | 198 | 99 | 99
Public-Magnet | | | 67 | 971 | 198 | 99 | 99
Public-Virtual | | | 67 | 623 | 198 | 99 | 99
Public-Not charter, magnet, or virtual | Northeast | City | 17 | 717 | 51 | 26 | 25
Public-Not charter, magnet, or virtual | Northeast | Suburb | 30 | 1,162 | 89 | 45 | 44
Public-Not charter, magnet, or virtual | Northeast | Town | 8 | 247 | 24 | 12 | 12
Public-Not charter, magnet, or virtual | Northeast | Rural | 16 | 775 | 48 | 24 | 24
Public-Not charter, magnet, or virtual | Midwest | City | 20 | 713 | 59 | 30 | 29
Public-Not charter, magnet, or virtual | Midwest | Suburb | 29 | 984 | 86 | 43 | 43
Public-Not charter, magnet, or virtual | Midwest | Town | 16 | 928 | 48 | 24 | 24
Public-Not charter, magnet, or virtual | Midwest | Rural | 28 | 2,697 | 83 | 42 | 41
Public-Not charter, magnet, or virtual | South | City | 32 | 1,097 | 95 | 48 | 47
Public-Not charter, magnet, or virtual | South | Suburb | 44 | 1,250 | 130 | 65 | 65
Public-Not charter, magnet, or virtual | South | Town | 24 | 1,004 | 71 | 36 | 35
Public-Not charter, magnet, or virtual | South | Rural | 50 | 2,857 | 148 | 74 | 74
Public-Not charter, magnet, or virtual | West | City | 36 | 960 | 107 | 54 | 53
Public-Not charter, magnet, or virtual | West | Suburb | 34 | 950 | 101 | 51 | 50
Public-Not charter, magnet, or virtual | West | Town | 24 | 632 | 71 | 36 | 35
Public-Not charter, magnet, or virtual | West | Rural | 32 | 1,311 | 95 | 48 | 47
Catholic | Northeast | | 34 | 265 | 101 | 51 | 50
Catholic | Midwest | | 32 | 336 | 95 | 48 | 47
Catholic | South | | 30 | 258 | 89 | 45 | 44
Catholic | West | | 32 | 172 | 95 | 48 | 47
Other private | Northeast | | 30 | 1,058 | 89 | 45 | 44
Other private | Midwest | | 34 | 854 | 101 | 51 | 50
Other private | South | | 55 | 2,442 | 163 | 82 | 81
Other private | West | | 32 | 959 | 95 | 48 | 47

B.2 Procedures for the Collection of Information

HS&B:20 will collect data from high school students and their parents, math teachers, guidance counselors, and school administrators. Data will be collected from ninth graders in the fall of 2020 as they begin high school and again in the spring of 2024 when most students in the sample will be seniors at the end of their high school career. The field test data collections will be conducted one year prior to their full-scale counterparts. Collecting data at these time points from students, parents, teachers, counselors, and administrators, with high school transcripts collected after high school, will culminate in a rich data set that will provide educators, policymakers, and researchers with information about transitions, outcomes, and experiences in multiple contexts.

Field Test School Sample

The base-year field test school frame will include 1,092 schools that report offering ninth- and twelfth-grade instruction to at least 35 ninth-grade and 35 twelfth-grade students and that are located within the six MSAs. The school frame will also include 53 schools that report offering ninth-grade instruction to at least 35 ninth-grade students but do not offer twelfth-grade instruction, for a total of 1,145 schools in the school sampling frame.

Schools will be stratified by MSA and, within MSA, into two groups based on the presence or absence of twelfth-grade instruction: those schools offering ninth- and twelfth-grade instruction and those schools offering instruction in ninth but not twelfth grade. Schools offering both ninth- and twelfth-grade instruction will be further stratified into the following nine groups:

  • Catholic schools

  • Other private schools

  • Non-virtual magnet schools

  • Non-virtual/non-magnet charter schools

  • Virtual schools

  • Non-magnet/non-charter/non-virtual public schools in the Northeast

  • Non-magnet/non-charter/non-virtual public schools in the Midwest

  • Non-magnet/non-charter/non-virtual public schools in the South

  • Non-magnet/non-charter/non-virtual public schools in the West

Three hundred nine schools will be sampled and randomly assigned to receive an offer of either a 90-minute student session or a single-class-period (e.g., 45-minute) student session.

The sample allocation was designed to produce 75 participating schools with approximately equal numbers of participating schools in each of six MSAs. The school strata and a sample allocation are shown in Table 2.

The first step in the sampling process involves the selection of two non-magnet/non-charter/non-virtual schools with certainty to ensure that at least two schools are sampled from public-school districts that require a research application. After selection of these schools, the school sample size associated with the school sampling strata for those selected schools will be reduced by two. The second step in the sampling process involves the selection of the single Bureau of Indian Education (BIE) school with certainty. After selection of this BIE school, the school sample size associated with the school sampling strata for that selected school will be reduced by one. The third step in the sampling process involves the use of simple random sampling to select 306 more schools. Schools will be randomly assigned to be offered a single class period (e.g., 45-minute) or a 90-minute student session in such a fashion as to ensure that schools in the same district or diocese will be assigned the same offer. For recruitment efficiency and power associated with recruitment experiments, the full sample of 309 schools is planned to be released for recruitment all at once. The sample of 309 schools is expected to yield 75 or more participating schools. The recruitment team will continue to encourage participation for the purposes of the field test in-school session length experiment (described further in B.3) even after the 75-school yield target is reached. If considerably more than 75 schools agree to participate, a subset of these schools may be informed that they will not need to take part.

Table 2. HS&B:20 Base-Year Field Test School Sample Allocation

MSA | School Frame Count | Grade Level | School Type | Public School Type | School Sample Size | School Participation Goal
Total | 1,145 | | | | 309 | 75
A | 13 | 9th but no 12th | Public | - | 6 | 1
A | 22 | 9th and 12th | Public | Charter not Virtual or Magnet | 0 | 0
A | 29 | 9th and 12th | Public | Magnet not Virtual | 0 | 0
A | 0 | 9th and 12th | Public | Virtual | 0 | 0
A | 40 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 18 | 4
A | 76 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 26 | 6
A | 3 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 0 | 0
A | 13 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 10 | 2
A | 15 | 9th and 12th | Catholic | - | 0 | 0
A | 7 | 9th and 12th | Other Private | - | 0 | 0
B | 1 | 9th but no 12th | Public | - | 0 | 0
B | 31 | 9th and 12th | Public | Charter not Virtual or Magnet | 0 | 0
B | 83 | 9th and 12th | Public | Magnet not Virtual | 10 | 2
B | 1 | 9th and 12th | Public | Virtual | 0 | 0
B | 2 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 0 | 0
B | 18 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 18 | 8
B | 0 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 0 | 0
B | 1 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 1 | 1
B | 15 | 9th and 12th | Catholic | - | 0 | 0
B | 29 | 9th and 12th | Other Private | - | 10 | 2
C | 0 | 9th but no 12th | Public | - | 0 | 0
C | 5 | 9th and 12th | Public | Charter not Virtual or Magnet | 0 | 0
C | 2 | 9th and 12th | Public | Magnet not Virtual | 0 | 0
C | 8 | 9th and 12th | Public | Virtual | 6 | 1
C | 10 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 10 | 5
C | 6 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 6 | 4
C | 2 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 0 | 0
C | 1 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 0 | 0
C | 3 | 9th and 12th | Catholic | - | 0 | 0
C | 2 | 9th and 12th | Other Private | - | 0 | 0
D | 23 | 9th but no 12th | Public | - | 6 | 1
D | 43 | 9th and 12th | Public | Charter not Virtual or Magnet | 10 | 2
D | 20 | 9th and 12th | Public | Magnet not Virtual | 0 | 0
D | 2 | 9th and 12th | Public | Virtual | 0 | 0
D | 26 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 10 | 2
D | 117 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 18 | 4
D | 3 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 3 | 1
D | 15 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 10 | 2
D | 30 | 9th and 12th | Catholic | - | 10 | 2
D | 27 | 9th and 12th | Other Private | - | 0 | 0
E | 6 | 9th but no 12th | Public | - | 6 | 1
E | 42 | 9th and 12th | Public | Charter not Virtual or Magnet | 0 | 0
E | 0 | 9th and 12th | Public | Magnet not Virtual | 0 | 0
E | 1 | 9th and 12th | Public | Virtual | 0 | 0
E | 45 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 22 | 5
E | 40 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 18 | 4
E | 5 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 5 | 1
E | 12 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 10 | 2
E | 6 | 9th and 12th | Catholic | - | 0 | 0
E | 8 | 9th and 12th | Other Private | - | 0 | 0
F | 10 | 9th but no 12th | Public | - | 6 | 1
F | 15 | 9th and 12th | Public | Charter not Virtual or Magnet | 0 | 0
F | 39 | 9th and 12th | Public | Magnet not Virtual | 0 | 0
F | 0 | 9th and 12th | Public | Virtual | 0 | 0
F | 24 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – City | 14 | 3
F | 67 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Suburb | 18 | 4
F | 9 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Town | 9 | 2
F | 28 | 9th and 12th | Public | non-magnet/non-charter/non-virtual – Rural | 13 | 2
F | 16 | 9th and 12th | Catholic | - | 0 | 0
F | 28 | 9th and 12th | Other Private | - | 0 | 0


Field Test Student Sample

Within participating schools, students will be stratified by grade (9 or 12); a simple random sample of 35 ninth-grade students will be selected and, for those schools that offer twelfth-grade instruction, a simple random sample of 35 twelfth-grade students will be selected. The desired yield for the field test is 2,120 students enrolled in grade 9 and 2,007 students enrolled in grade 12. An estimated 5 percent of sampled students are assumed to be ineligible, and 85 percent of eligible students are assumed to participate. Thus, to achieve a yield of 2,120 ninth-grade students, the parents of approximately 2,625 ninth-grade students will need to be contacted for consent (2,625*0.95*0.85 ≈ 2,120). Similarly, the parents of approximately 2,485 twelfth-grade students will need to be contacted for consent (2,485*0.95*0.85 ≈ 2,007).
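The consent-contact arithmetic above amounts to a one-line expected-yield calculation under the stated assumptions (95 percent eligibility, 85 percent response); the sketch below simply restates it.

```python
def expected_yield(n_contacted, eligibility_rate=0.95, response_rate=0.85):
    """Expected number of participating students when n_contacted students' parents are contacted."""
    return n_contacted * eligibility_rate * response_rate

print(round(expected_yield(2625)))  # about 2,120 ninth graders
print(round(expected_yield(2485)))  # about 2,007 twelfth graders
```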

Full-Scale School Sample

Prior to selection of the school sample, schools will be sorted by locale (city, suburban, town, rural), school grade configuration, and school size measure within each of the explicit school strata so that approximate proportionality across locale and school configuration is preserved. Including the school size measure in the sort also makes it possible to freshen the school sample. Because schools will be selected about a year before the start of data collection to allow sufficient time for recruitment, the school sample will be freshened in the first half of 2020, before the start of BYFS data collection, if newer versions of the CCD or PSS are released by the end of the first quarter of 2020. Newly identified schools from the more recent CCD and PSS files, if available, will be inserted into the sorted sampling frame in such a fashion as to preserve the original sort ordering. Using a half-open interval rule,5 we will identify schools to be added to the initial school sample.
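A common way to state the half-open interval rule is that each frame school "owns" the interval on the sorted frame from its own position up to, but not including, the next frame school; a newly identified school that sorts into an interval owned by a sampled school is added to the sample. The sketch below, with hypothetical school IDs, illustrates that linking logic under this simplified reading.

```python
def half_open_interval_additions(sorted_frame, sampled_ids, new_schools):
    """Frame freshening via a half-open interval rule (simplified).

    sorted_frame -- frame school IDs in the original sort order used for sampling
    sampled_ids  -- set of frame school IDs selected into the initial sample
    new_schools  -- list of (new_school_id, preceding_frame_id) pairs, where
                    preceding_frame_id is the frame school the new school sorts after
    Returns the new schools that fall in an interval owned by a sampled school.
    """
    owners = set(sampled_ids) & set(sorted_frame)
    return [new_id for new_id, prev_id in new_schools if prev_id in owners]

# Hypothetical example: frame sorted A, B, C, D; schools B and D were sampled.
print(half_open_interval_additions(
    sorted_frame=["A", "B", "C", "D"],
    sampled_ids={"B", "D"},
    new_schools=[("N1", "A"), ("N2", "B"), ("N3", "D")],
))  # ['N2', 'N3'], the new schools linked to sampled schools B and D
```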

Declining response rates are a concern for any research study, and some of the recent school-based NCES longitudinal studies achieved response rates lower than a desired target of 75 percent. For example, the school response rate for the High School Longitudinal Study of 2009 (HSLS:09) was 56 percent and the school response rate for the Early Childhood Longitudinal Study Kindergarten Class of 2010-11 (ECLS-K:2011) was 63 percent.

Given these trends, and to be conservative, the HS&B:20 BYFS sampling plan is designed to be flexible so that the study can better achieve school participation targets even if eligibility and response rates are lower than anticipated. The proposed school sampling process is designed to achieve 920 participating schools (641 public, 128 Catholic, and 151 other private) distributed over 27 school sampling strata. We plan to select 2,728 schools using stratified probability proportional to size sampling, from which an initial simple random sample of 1,373 schools will be selected within school strata. This subset of 1,373 schools will comprise the initial set of schools that will be pursued for recruitment, starting in August 2019, into the BYFS. The remaining schools will provide a reserve sample from which additional schools may be sampled and pursued for recruitment. Schools may be sampled from the reserve if participation among schools in the initial sample is estimated to fall too far short of the school participation targets. There is no predefined schedule or number of participating schools that will trigger release of schools from the reserve sample.6 Rather, the number of participating schools among the 1,373 released schools will be monitored by school stratum and, if the number of participating schools in a stratum is substantially less than the yield goal for that stratum, additional schools may be released for that stratum from the reserve set of schools. If a reserve sample of schools is warranted, a random sample of schools will be selected from the reserve and released for recruitment. This procedure will improve the ability to achieve within-stratum school participation goals in the event that stratum-specific eligibility and participation rates are lower than expected. The desired numbers of participating schools by the margins of the school stratification characteristics are shown in table 3.
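Because there is no predefined trigger, any code for the reserve release is necessarily only illustrative. The sketch below shows one way the stratum-level monitoring described above could be expressed, with a hypothetical shortfall threshold and an assumed participation rate for released reserve schools.

```python
import random

def reserve_release(goals, projected, reserve_pool, assumed_yield=0.55,
                    shortfall_threshold=0.20, seed=7):
    """Illustrative stratum-level check for releasing reserve schools.

    goals        -- dict: stratum -> school participation goal
    projected    -- dict: stratum -> projected participating schools from the initial sample
    reserve_pool -- dict: stratum -> list of reserve school IDs available for release
    """
    rng = random.Random(seed)
    releases = {}
    for stratum, goal in goals.items():
        shortfall = goal - projected.get(stratum, 0)
        if shortfall > shortfall_threshold * goal:  # hypothetical trigger, not a study rule
            n_release = min(len(reserve_pool.get(stratum, [])),
                            round(shortfall / assumed_yield))
            releases[stratum] = rng.sample(reserve_pool[stratum], n_release)
    return releases

# Hypothetical example for one stratum with a goal of 32 participating schools.
released = reserve_release({"Catholic-West": 32}, {"Catholic-West": 20},
                           {"Catholic-West": [f"reserve_{i}" for i in range(47)]})
print(len(released["Catholic-West"]))  # number of reserve schools released for that stratum
```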

Table 3. HS&B:20 BYFS School Participation Goals, by School Stratification Characteristics



Characteristic | Category | Public | Catholic | Other private | Total
Total | | 641 | 128 | 151 | 920
Region | Northeast | 71 | 34 | 30 | 135
Region | Midwest | 93 | 32 | 34 | 159
Region | South | 150 | 30 | 55 | 235
Region | West | 126 | 32 | 32 | 190
Locale | City | 105 | NA | NA | 105
Locale | Suburb | 137 | NA | NA | 137
Locale | Town | 72 | NA | NA | 72
Locale | Rural | 126 | NA | NA | 126
Public-school Type | Charter | 67 | NA | NA | 67
Public-school Type | Magnet | 67 | NA | NA | 67
Public-school Type | Virtual | 67 | NA | NA | 67
Public-school Type | Not charter, magnet, or virtual | 440 | NA | NA | 440

NA: Not Applicable. No explicit participation goals are established for Catholic and other private schools by locale. Catholic and Other private schools are all classified as Low prevalence, for purposes of sampling, as no focal disability counts are available.

The 27 school strata along with the corresponding stratum-specific participation goals, frame counts, total selected school sample (n=2,728), initial school sample (n=1,373), and reserve sample (n=1,355) are shown in table 1. The size measure used for the probability proportional to size selection of 2,728 schools will be constructed using the overall sampling rates for students in the following five student categories:

  • American Indian or Alaskan Native (AIAN), non-Hispanic

  • Asian, non-Hispanic,

  • Hispanic,

  • Black, non-Hispanic, and

  • Other race, non-Hispanic

combined with the total number of students in each of those five categories at a given school. In other words, the size measure for a given school i in school stratum h may be written as

S_hi = sum over j of (r_hj × N_hij), for j = 1, …, 5,

where r_hj is the sampling rate for the jth student category in the hth school stratum and N_hij is the number of students in the jth category within school i in the hth school stratum. The sampling rate, r_hj, equals the number of students to sample from the jth category in the hth school stratum divided by the number of students in the jth category across all schools in the hth school stratum. The sampling rates for the five student categories listed above will vary across the school strata; for example, a rate of .072 is used for AIAN students attending traditional public schools in suburban areas in the South, while an overall rate of .003 is used for students attending traditional public schools in suburban areas in the South. The student sampling rates by school strata are provided in table 4.
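Written out in code, the composite size measure for one school is the rate-weighted sum of its category counts. The sketch below uses the traditional public, South, suburban rates quoted above (see table 4) together with hypothetical ninth-grade counts for a single school.

```python
def composite_size_measure(rates, counts):
    """Composite measure of size for one school: sum over the five student categories of
    (stratum sampling rate for category j) x (school's frame count for category j)."""
    return sum(rates[cat] * counts.get(cat, 0) for cat in rates)

# Rates for traditional public schools in suburban areas of the South (table 4).
rates_south_suburb = {"AIAN": 0.072, "Asian": 0.007, "Hispanic": 0.002,
                      "Black": 0.002, "Other": 0.002}
# Hypothetical ninth-grade counts for one school on the frame.
counts = {"AIAN": 5, "Asian": 20, "Hispanic": 120, "Black": 60, "Other": 200}
print(composite_size_measure(rates_south_suburb, counts))  # approximately 1.26
```

Under probability proportional to size selection with this size measure, and with within-school sampling rates set to compensate for each school's selection probability, students in a given category receive approximately equal selection probabilities within a stratum, which is what keeps weights constant within a student domain and school stratum, as noted below.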

The sampling plan is designed to produce constant weights within each of the five student domains (AIAN non-Hispanic, Asian non-Hispanic, Hispanic, Black non-Hispanic, and other non-Hispanic) within each school stratum. When weights are constant within a given student domain and school stratum, there is no increase in the design effect due to unequal weights for estimates produced for the given student domain and school stratum.

Within participating schools, students will be stratified into the five student categories defined above and a systematic sample of students will be selected from each student sampling stratum. Approximately 28 students will be sampled from each of the anticipated 920 participating schools. However, the number of students sampled per student stratum will vary by school because the within-school student-stratum sample sizes depend upon the numbers of students in the five student sampling strata. The process of determining the student sample allocation follows the procedure outlined in section 2 of Folsom et al (1987).7

Table 4. Aggregate Student Sampling Rates Used for School Selection

School Type | Census Region | Locale | Overall | American Indian or Alaskan Native, non-Hispanic | Asian, non-Hispanic | Hispanic | Black, non-Hispanic | Other, non-Hispanic
Public-Charter | | | 0.011 | 0.103 | 0.030 | 0.007 | 0.009 | 0.011
Public-Magnet | | | 0.006 | 0.097 | 0.025 | 0.004 | 0.005 | 0.005
Public-Virtual | | | 0.048 | 0.090 | 0.075 | 0.042 | 0.053 | 0.047
Public-Not charter, magnet, or virtual | Northeast | City | 0.004 | 0.065 | 0.009 | 0.002 | 0.003 | 0.003
Public-Not charter, magnet, or virtual | Northeast | Suburb | 0.003 | 0.063 | 0.007 | 0.002 | 0.003 | 0.002
Public-Not charter, magnet, or virtual | Northeast | Town | 0.005 | 0.061 | 0.048 | 0.012 | 0.020 | 0.003
Public-Not charter, magnet, or virtual | Northeast | Rural | 0.004 | 0.062 | 0.032 | 0.002 | 0.010 | 0.002
Public-Not charter, magnet, or virtual | Midwest | City | 0.004 | 0.067 | 0.012 | 0.003 | 0.003 | 0.002
Public-Not charter, magnet, or virtual | Midwest | Suburb | 0.003 | 0.065 | 0.010 | 0.003 | 0.003 | 0.002
Public-Not charter, magnet, or virtual | Midwest | Town | 0.004 | 0.066 | 0.037 | 0.004 | 0.009 | 0.002
Public-Not charter, magnet, or virtual | Midwest | Rural | 0.005 | 0.079 | 0.031 | 0.004 | 0.007 | 0.002
Public-Not charter, magnet, or virtual | South | City | 0.003 | 0.067 | 0.011 | 0.002 | 0.002 | 0.002
Public-Not charter, magnet, or virtual | South | Suburb | 0.003 | 0.072 | 0.007 | 0.002 | 0.002 | 0.002
Public-Not charter, magnet, or virtual | South | Town | 0.004 | 0.076 | 0.037 | 0.003 | 0.003 | 0.002
Public-Not charter, magnet, or virtual | South | Rural | 0.004 | 0.085 | 0.016 | 0.002 | 0.002 | 0.002
Public-Not charter, magnet, or virtual | West | City | 0.004 | 0.077 | 0.006 | 0.002 | 0.003 | 0.002
Public-Not charter, magnet, or virtual | West | Suburb | 0.003 | 0.070 | 0.006 | 0.002 | 0.004 | 0.002
Public-Not charter, magnet, or virtual | West | Town | 0.008 | 0.082 | 0.032 | 0.003 | 0.021 | 0.003
Public-Not charter, magnet, or virtual | West | Rural | 0.009 | 0.086 | 0.025 | 0.003 | 0.013 | 0.003
Catholic | Northeast | | 0.032 | 0.064 | 0.090 | 0.043 | 0.049 | 0.023
Catholic | Midwest | | 0.028 | 0.069 | 0.082 | 0.047 | 0.050 | 0.020
Catholic | South | | 0.032 | 0.065 | 0.079 | 0.039 | 0.052 | 0.025
Catholic | West | | 0.041 | 0.073 | 0.094 | 0.035 | 0.065 | 0.030
Other private | Northeast | | 0.028 | 0.063 | 0.035 | 0.065 | 0.059 | 0.023
Other private | Midwest | | 0.039 | 0.064 | 0.085 | 0.066 | 0.057 | 0.031
Other private | South | | 0.022 | 0.070 | 0.040 | 0.039 | 0.036 | 0.017
Other private | West | | 0.033 | 0.072 | 0.028 | 0.055 | 0.067 | 0.029



Once schools are selected and recruited, students enrolled in grade 9 will be selected from student rosters that schools or school districts will be asked to provide. Whether the school or school district provides the information will be decided by the school or school district. The student sample sizes were determined by the requirement that at least 1,708 students in each of the five student domains8 participate in a second follow-up of HS&B:20. The 1,708 requirement was determined by evaluating the minimum required sample size that would be able to measure a relative change of 15 percent in proportions between the first- and second follow-up for students in the twelfth-grade as of the first follow-up. Students in twelfth grade during the first follow-up, both those who were in the sample in ninth grade and those who were added through freshening in twelfth grade, form the twelfth-grade cohort.9 Setting the minimum sample size required to achieve 1,708 participating students in the twelfth-grade cohort as of a second follow-up also ensures that the number of participating students in the second follow-up who were in the ninth-grade in the base year will exceed 1,708.10 Several assumptions were used to conduct this evaluation, as noted below.

  • Two-tailed tests with significance of alpha = 0.05 were used to test differences between means and proportions with required power of 80 percent.

  • A proportion of p = .30 was used to calculate sample sizes for tests of proportion.

  • Design effect is 2.5.

  • Correlation between waves is 0.6.

McNemar’s test using Connor’s approximation was used to determine the minimum sample size needed to meet the precision requirement under the previously stated assumptions. The Proc Power procedure available in SAS software11 was used to determine the minimum sample size.
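As a rough cross-check outside of SAS, Connor's normal approximation can be written directly. The sketch below derives the discordant-pair proportions from the stated baseline proportion, relative change, and between-wave correlation (treating the correlation as the phi coefficient of the paired 2x2 table, which is an assumption about how the inputs were specified) and inflates the result by the design effect; the official minimums come from SAS PROC POWER.

```python
from math import sqrt
from statistics import NormalDist

def mcnemar_n_connor(p1, relative_change, corr, alpha=0.05, power=0.80, deff=1.0):
    """Approximate completed cases needed to detect a relative change in a proportion
    between two waves, using Connor's approximation for McNemar's test."""
    p2 = p1 * (1 + relative_change)
    # Paired 2x2 cell probabilities from the marginals and the assumed phi correlation.
    p11 = p1 * p2 + corr * sqrt(p1 * (1 - p1) * p2 * (1 - p2))
    p10, p01 = p1 - p11, p2 - p11            # discordant cells
    pd, d = p10 + p01, p01 - p10             # total discordance and its imbalance
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    n_srs = (z_a * sqrt(pd) + z_b * sqrt(pd - d ** 2)) ** 2 / d ** 2
    return n_srs * deff                      # inflate for the assumed design effect

print(round(mcnemar_n_connor(0.30, 0.15, 0.6, deff=2.5)))  # ~1,705, near the 1,708 used above
print(round(mcnemar_n_connor(0.30, 0.20, 0.6, deff=2.5)))  # ~975, near the 978 minimum described below
```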

Sample sizes were also established for three other student domains: students attending virtual schools, students attending charter schools, and students attending magnet schools. The minimum sample sizes for these three student domains were established using all but one of the assumptions and requirements used for the five student race/ethnicity domains: the requirement to measure a relative change of 15 percent in proportions between a first and second follow-up for students in twelfth grade as of the first follow-up was relaxed to the ability to measure a 20 percent relative change in proportions. Under this modified requirement, a minimum of 978 participating students is required as of the end of a second follow-up for students in each of these three types of public schools.

The minimum numbers of students to sample from each of the five student categories and three public-school domains in the BYFS, needed to achieve 1,708 participating students in each of the five student race/ethnicity domains and 978 participating students in each of the three public-school domains for the twelfth-grade cohort as of a second follow-up, are provided in table 5, along with the assumptions used to derive those numbers. The assumptions in tables 5 and 6 are supported and motivated by previous work on HSLS:09. The numbers reported in table 5 show, for example, that if 3,500 Hispanic students are sampled in grade 9, then approximately 1,708 will respond in the base year, enter grade 12 in 2024, and respond in both the first and second follow-ups.

As can be seen in table 6, if the minimum sample sizes reported in table 5 are used in the BYFS, the estimated numbers of participating students among the BYFS ninth-grade cohort as of a second follow-up exceed the minimum numbers of required participating students of 1,708 and 978, for the five student race/ethnicity domains and three public-school type domains, respectively. The numbers reported in table 6 show, for example, that if 3,500 Hispanic students are sampled in grade 9 then approximately 1,847 will respond in the base year as well as in both the first- and second-follow-ups (regardless of what grade they enter in 2024).

Table 5. Minimum Sample Sizes and Associated Sample Design Assumptions for Student Sampling Categories for Twelfth-grade Cohort

Assumption | Each Student Race/Ethnicity Domain | Charter students | Magnet students | Virtual students
Ninth-grade inflated student sample size for each key student domain | 3,500 | 2,003 | 2,003 | 2,003
Ninth-grade student eligibility rate | 95% | 95% | 95% | 95%
Ninth-grade student response rate | 85% | 85% | 85% | 85%
Ninth-grade respondents | 2,826 | 1,617 | 1,617 | 1,617
Percentage of ninth-grade students in twelfth grade as of the first follow-up | 90% | 90% | 90% | 90%
Ninth-grade respondents in twelfth grade as of the first follow-up | 2,544 | 1,456 | 1,456 | 1,456
Percentage of twelfth-grade students whose transfer schools are known | 15% | 15% | 15% | 15%
Percentage of twelfth-grade students whose transfer schools are not known by their base-year school | 8% | 8% | 8% | 8%
Percentage of twelfth-grade students who are known to remain in base-year schools | 77% | 77% | 77% | 77%
Response rate for twelfth-grade transfer students whose schools are known | 60% | 60% | 60% | 60%
Response rate for twelfth-grade transfer students whose schools are unknown | 50% | 50% | 50% | 50%
Response rate for twelfth-grade students known to remain in their base-year schools | 90% | 90% | 90% | 90%
First follow-up respondents | 2,093 | 1,198 | 1,198 | 1,198
Second follow-up eligibility rate | 100% | 100% | 100% | 100%
Second follow-up locate rate | 96% | 96% | 96% | 96%
Second follow-up response rate | 85% | 85% | 85% | 85%
Second follow-up respondents | 1,708 | 978 | 978 | 978



Table 6. Minimum Sample Sizes and Associated Sample Design Assumptions for Student Sampling Categories for Ninth-grade Cohort

Assumption | Each Student Race/Ethnicity Domain | Charter students | Magnet students | Virtual students
Ninth-grade inflated student sample size for each key student domain | 3,500 | 2,003 | 2,003 | 2,003
Ninth-grade student eligibility rate | 95% | 95% | 95% | 95%
Ninth-grade student response rate | 85% | 85% | 85% | 85%
Ninth-grade respondents | 2,826 | 1,617 | 1,617 | 1,617
Percentage of ninth-grade students not in base-year school or whose status is indeterminate as of the first follow-up | 30% | 30% | 30% | 30%
Twelfth-grade response rate among students known to be in base-year schools | 90% | 90% | 90% | 90%
Twelfth-grade response rate among students not in base-year schools or whose status is indeterminate | 57% | 57% | 57% | 57%
First follow-up respondents | 2,264 | 1,296 | 1,296 | 1,296
Second follow-up eligibility rate | 100% | 100% | 100% | 100%
Second follow-up locate rate | 96% | 96% | 96% | 96%
Second follow-up response rate | 85% | 85% | 85% | 85%
Second follow-up respondents | 1,847 | 1,057 | 1,057 | 1,057



Estimates of the minimum number of students to sample in the BYFS were derived by adjusting the 1,708 and 978 required for the twelfth-grade cohort to account for a variety of factors, including estimated student response rates in 2020 (grade 9), in 2024 (grade 12), and at a second follow-up a few years after 2024; the extent to which BYFS participating schools agree to participate in the first and second follow-up studies; and the extent to which students are expected to move between schools between grades 9 and 12.

Using the minimum sample sizes reported in the top row of table 6, the minimum required total sample size is 17,500 (3,500*5) students; however, that allocation substantially oversamples AIAN, non-Hispanic students; Asian, non-Hispanic students; students attending virtual schools; and students attending charter schools, while substantially undersampling Other, non-Hispanic students and undersampling Hispanic students. To reduce the impact of disproportionate sampling on national estimates and on estimates that compare or combine student categories, the sample sizes for the Hispanic and Other, non-Hispanic student domains were increased. The increases were determined by specifying and solving a non-linear optimization problem that sought the total student sample size and student domain sample sizes that would produce design effects of 2.5 or less for estimates of means within each of the five key student race/ethnicity groups. The solution to this non-linear optimization problem indicated that a total student sample size of 26,000 students would be required; the resulting allocation across the five key student race/ethnicity groups is reported in table 7.
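The impact of disproportionate allocation on combined estimates can be gauged with Kish's unequal-weighting design effect, 1 plus the relative variance of the weights. The sketch below applies that formula to the table 7 allocation; the ninth-grade population shares are placeholders, not study figures, and the study's design-effect constraint of 2.5 also reflects clustering, not just unequal weighting.

```python
def kish_unequal_weighting_deff(pop_shares, sample_sizes):
    """Kish's design effect from unequal weighting: n * sum(w_i^2) / (sum(w_i))^2,
    with weights proportional to (population share / sample size) within each domain."""
    n = sum(sample_sizes.values())
    weights = []
    for domain, share in pop_shares.items():
        w = share / sample_sizes[domain]     # relative weight for each sampled student in the domain
        weights.extend([w] * sample_sizes[domain])
    return n * sum(w * w for w in weights) / sum(weights) ** 2

# Placeholder grade 9 population shares (illustrative only) with the table 7 allocation.
pop_shares = {"AIAN": 0.01, "Asian": 0.05, "Hispanic": 0.25, "Black": 0.15, "Other": 0.54}
allocation = {"AIAN": 3500, "Asian": 3500, "Hispanic": 4500, "Black": 3500, "Other": 11000}
print(round(kish_unequal_weighting_deff(pop_shares, allocation), 2))  # weighting-only design effect
```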

Table 7. Final Student Sample Sizes and Expected Minimum Student Participation

Assumption | AIAN, non-Hispanic | Asian, non-Hispanic | Hispanic | Black, non-Hispanic | Other, non-Hispanic | Total
Ninth-grade inflated student sample size for each key student domain | 3,500 | 3,500 | 4,500 | 3,500 | 11,000 | 26,000
Ninth-grade student eligibility rate | 95% | 95% | 95% | 95% | 95%
Ninth-grade student response rate | 85% | 85% | 85% | 85% | 85%
Ninth-grade respondents | 2,826 | 2,826 | 3,634 | 2,826 | 8,883 | 20,995
Percentage of ninth-grade students not in base-year school or whose status is indeterminate as of the first follow-up | 30% | 30% | 30% | 30% | 30%
Twelfth-grade response rate among students known to be in base-year schools | 90% | 90% | 90% | 90% | 90%
Twelfth-grade response rate among students not in base-year schools or whose status is indeterminate | 57% | 57% | 57% | 57% | 57%
First follow-up respondents | 2,264 | 2,264 | 2,911 | 2,264 | 7,115 | 16,818
Second follow-up eligibility rate | 100% | 100% | 100% | 100% | 100%
Second follow-up locate rate | 96% | 96% | 96% | 96% | 96%
Second follow-up response rate | 85% | 85% | 85% | 85% | 85%
Second follow-up respondents | 1,847 | 1,847 | 2,375 | 1,847 | 5,806 | 13,722



Therefore, for the BYFS the plan is to sample 28 students, on average, within each of 920 participating schools for a total of 26,000 sample students and, assuming the grade 9 eligibility and response rates shown in table 7 (an estimated 5 percent of sampled students are expected to be ineligible), to produce approximately 20,995 participating grade 9 students. Thus, to achieve a yield of 20,995 ninth-grade students, the parents of approximately 24,700 ninth-grade students will need to be contacted for consent (26,000*0.95=24,700). The distribution of the grade 9 student sample and estimates of the number of participating students in the BYFS, first follow-up, and second follow-up are provided in table 7.
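The expected-yield figures in table 7 follow mechanically from multiplying the assumed rates; the sketch below reproduces that chain for a single student domain, using the transfer/indeterminate split and follow-up rates exactly as stated in the table.

```python
def expected_respondents(n_sampled, eligibility=0.95, by_response=0.85,
                         moved_share=0.30, rr_stayers=0.90, rr_movers=0.57,
                         f2_locate=0.96, f2_response=0.85):
    """Walk the table 7 assumption chain for one student domain: base-year,
    first follow-up, and second follow-up respondent counts."""
    base_year = n_sampled * eligibility * by_response
    first_fu = base_year * ((1 - moved_share) * rr_stayers + moved_share * rr_movers)
    second_fu = first_fu * f2_locate * f2_response
    return round(base_year), round(first_fu), round(second_fu)

print(expected_respondents(3500))  # (2826, 2264, 1847), e.g., the AIAN or Asian domain
print(expected_respondents(4500))  # (3634, 2911, 2375), the Hispanic domain
```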

Roster Collection. Beginning in the fall of 2019 for the field test, a roster of all ninth- and twelfth-grade students will be requested (appendix A.10). BYFS rosters will be requested for ninth-grade students beginning in the fall of 2020.

The rosters may be provided by the district or by the school, and it will be requested that the roster be provided once enrollment for the school year has stabilized (often approximately 4 weeks into the school year) to increase accuracy. Key information needed for student sampling will be requested, such as student name; school or district student ID number; date of birth; grade level; gender; race/ethnicity; and ELL status. Each of these characteristics is important for sampling purposes, but for schools that are unable to provide all of the information, we will work with them to obtain the key information that is available. Based on this information, the student sample will be drawn. As part of the roster collection, the study will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: the student's parent and/or guardian contact information (e.g., mailing address, landline phone number, cell phone number, e-mail address) and the student's math teacher. Schools and districts often find it easier, and therefore more efficient, to supply all of the desired information at one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, the recruitment team will gather that information as a second step for the sampled students only. If the school and/or district is unwilling to provide parent contact information for the sampled students, the team will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or would mail the required materials to parents using the contact information they have on file). Parent contact information is required to conduct the out-of-school student data collection.

The roster request will include a template and secure transfer options to deliver the rosters. The data quality of the student rosters will then be evaluated by:

  • reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;

  • reviewing and assessing the quality of the data on student-teacher linkages;

  • addressing any incompleteness or irregularities in the roster file;

  • requesting additional information as needed from the school coordinator or designated district personnel; and

  • (re)verifying that the sampled students are currently in attendance in the school.

The provider of the roster, whether from a school district or school, will receive the $50 incentive for providing the roster as described in the Supporting Statement Part A, section A.9.

B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

School Recruitment

Gaining cooperation from school districts and schools is paramount to the success of this voluntary study. However, recruitment efforts in similar studies have met with increasing challenges that must be carefully mitigated to ensure adequate school participation. For example, in 1998-99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,14 whereas 12 years later, the successor ECLS-K:2011 cohort study had a weighted school-level response rate of 63 percent.15 Additionally, response rates tend to be lower for schools that serve older students (e.g., the High School Longitudinal Study of 2009 (HSLS:09) had a weighted school-level response rate of 56 percent,16 and the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) had an unweighted school-level response rate of 39 percent). Methods to secure the cooperation of school districts and schools, maximize response rates, and deal with nonresponse are described in this section.

Maximizing School Participation. The success of HS&B:20 hinges on securing the cooperation of, and maximizing response rates among, school districts and schools, and then their students, parents, and staff. Participation among school districts and schools has been declining for voluntary school-based studies. Often, district and school personnel understand the value of the research, but that understanding does not offset their reasons for not participating. Reasons cited for not participating in voluntary school-based studies such as MGLS:2017 and HSLS:09 include the high burden associated with participating, over-testing of students, loss of instructional time, lack of parent and teacher support, increased demands on school staff, and moratoriums on outside research. To mitigate these concerns, HS&B:20 has developed a comprehensive and flexible recruitment plan, with an incentive structure designed to effectively secure cooperation for the study. Strategies recommended to maximize school participation include:

Outreach. Study and NCES name recognition add validity to the study when recruiting school districts and schools. Even prior to drawing the sample, outreach activities will be conducted to announce the upcoming study and begin to garner support from states, districts, schools, and stakeholders. Outreach activities will include:

  • Contacting school districts that are typically challenging to recruit to discuss their decision-making process about participating in research studies and the benefits they would like to see from participating. This would be done generically as an exercise to help plan the recruitment effort prior to selecting the sample.

  • Distribution of a brief video to stakeholders, state, school districts, and schools to explain the importance of the study (Appendix A.6e; video is expected to become live in April 2019).

  • A webinar explaining the importance of the study and the impact of the data collected. The webinar will be live, but a recording will be made available on the study website.

  • Attendance at conferences attended by stakeholders to introduce and promote the study.

Compelling recruitment materials. School districts and school staff are busy. Materials sent to these contacts must be informative, compelling, and brief. In addition, multiple types of materials (e.g., mailings, video, website) will be made available to ensure that decision-makers have options to receive the message in the manner that works best for them. The materials will all be available on the study website for easy access. Reviewing these study materials should provide districts and school administrators with an understanding of the study’s value, the importance of HS&B:20, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.

Determining the “right” amount of time to facilitate participation. One of schools’ primary concerns about participating in outside research is the loss of instructional time. In an effort to balance the need to collect information with the desire to minimize the burden on schools and students, we are testing two student session lengths. As described in section B.4, for the field test, half of the schools will be assigned at the outset to a 45-minute session and half to a 90-minute session. If a school in the 90-minute group is unable to accommodate the full 90 minutes, it will be offered a one-class-period (45-minute) session, with the remainder of the session to be completed by the student outside of school. Schools unable to provide a 45-minute session will be asked, as a refusal conversion option, to provide roster information so that students may be contacted to participate outside of school. For the BYFS, the 90-minute session will be offered to schools during recruitment and the 45-minute session will be offered only as part of a non-response and/or refusal conversion strategy.

In the field test, we will also experiment with two survey lengths for teachers and parents to determine which is most effective in securing participation. Teachers will be offered either the full survey or an abbreviated survey. The mathematics teacher survey will consist of two sections: the teacher-level portion, which includes classroom-level information and will take either 16 or 10 minutes, and the student-level report, which will take either 4 or 3 minutes per student to complete. Because of the small school sample size, no experimentation beyond the student session length is planned for guidance counselors or administrators.

Varied communication modes. Prior to the start of recruitment, study team members will review the sample of school districts and schools to determine the appropriate mode of communication for each. Staff who may have connections to a particular area or district may be called upon to make the first contact. That staff person may remain the primary contact to the school district or school, or they may turn the school over to a recruiter for the collection of logistical information. Communication may be conducted via mail, email, phone, and in-person methods.

Leveraging participation in past studies. NCES has been conducting the longitudinal studies in schools since the early 1970s. HS&B:20 will leverage the participation information of school districts and schools from prior studies. Information gathered will include whether or not the school district or school participated, whether a research application was required, who in the district or school made the decision about participating, and the reasons for refusal, if applicable. This information will be used only to strategize who to contact and how to respond to previous concerns, if any, to encourage participation in HS&B:20.

Geospatial Modeling. The use of geospatial modeling helps overcome a serious challenge in base-year studies – a lack of information on sample members to use in predicting response. We will use geospatial modeling with data from open-access sources (e.g., NCES’s Education Demographic and Geographic Estimates, FBI’s Uniform Crime Reporting statistics) and RTI’s Enhanced Address-based Sampling Frame (Enhanced ABS Frame) to generate a superset of covariates that may help in estimating the likelihood of participation at all levels. From this covariate superset, we will identify a subset of substantive covariates that are significant predictors of response. Model coefficients from this geospatial model will then be used to predict, a priori, the likelihood of response for each unit in the HS&B:20 sample if the model provides sufficient ability to predict nonresponse. Interventions will be implemented based on the patterns of information found in the model. If we see that certain areas are underrepresented, we will focus our outreach efforts in those areas.
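As an illustration of the kind of response-propensity modeling described here, the sketch below fits a logistic regression of a participation indicator on area-level covariates and scores the sample a priori; the covariates and data are simulated placeholders, not the actual EDGE, UCR, or Enhanced ABS Frame variables.

```python
import numpy as np
import statsmodels.api as sm

# Simulated area-level covariates for previously fielded schools (placeholders only).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(size=n),          # e.g., standardized median household income of the area
    rng.normal(size=n),          # e.g., standardized local crime rate
    rng.integers(0, 2, size=n),  # e.g., urban locale indicator
])
true_logit = 0.3 - 0.4 * X[:, 1] + 0.5 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))   # simulated participation outcomes

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)  # retain only significant predictors in practice
propensities = model.predict(sm.add_constant(X))     # a priori likelihood of response
print(np.round(model.params, 2), np.round(propensities[:5], 2))
```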

Volunteer hours for students. Many high schools require volunteer hours to be completed prior to graduation. As a token of appreciation for student participation, the U.S. Department of Education will provide a certificate to each participating student to acknowledge 2 hours of volunteer service for participation in the study. For confidentiality purposes, no study-specific information will be included on the certificate.

Flexible incentive package. As observed across NCES studies, schools and sample members vary in their motivation to participate in voluntary research, and what incentivizes some does not work for others. We are thus offering incentive choices to enable schools and sample members to determine what works best for them. The incentive structure is provided in section A.9 of the Supporting Statement Part A. To encourage schools to participate, we are also exploring offering school-related incentives to students and parents, such as entry to a school event or credit towards, or an item from, the school store. Such incentives further support the schools and can be seen as a benefit to both the school and students. Furthermore, increased flexibility in the items that are offered may encourage participation. In BYFS, a school-level incentive boost from $200 to $400 will be offered to schools as a refusal conversion strategy to mitigate nonresponse both at the school-level and at the student item-level. Depending on the results of the field test experiment designed to determine the “right” student session length to facilitate school participation, a shorter student session in the school may be offered as the next refusal conversion step. This school-level incentive boost is not being offered in the field test due to sample size and its potential impact on the field test session time experiment.

Continuing education or professional development credits for teachers, counselors, and administrators. In some states, districts, and schools, the school staff are required to complete continuing education credits or participate in documented professional development activities. We are exploring the requirements in each state to be able to offer continuing education (or professional development) credits to participating school staff.

Avoiding refusals. HS&B:20 recruiters will be trained to avoid direct refusals by focusing on strategies to solve problems or meet obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other ways to persuade school districts and schools to participate. As described above, shortening the time for the student session is one tool that will be used for refusal avoidance.

When possible, HS&B:20 session facilitators will meet with students prior to the session at the school and explain the importance of participating in the study. They will emphasize that participating in the study will not affect their grades, and that none of their responses will be shared with their parents or teachers. Session facilitators may also meet with parents at a scheduled parent event at the school to generate excitement about the study, and answer questions parents may have about either their or their child’s participation. Parents that have not returned permission forms or who have initially refused to provide permission for their child to participate will be contacted by session facilitators who will attempt to alleviate any concerns about their child’s participation and answer any questions parents may have about the study. HS&B:20 session facilitators will also prompt school staff to complete the staff surveys and assist school staff with logging in to their survey or answering questions about the study.

General Recruiting. The following approach will be implemented to recruit school districts and schools for both the field test and BYFS. This approach was previously approved in December 2018 (OMB# 1850-0944 v.1.).

Endorsements. Support from leading education organizations can, at times, be influential to school districts’ and schools’ decision to participate. Prior to contacting sampled school districts or schools, we will request the endorsement and support from relevant organizations and key stakeholders in secondary education (see appendix A.1). Organizations will be able to provide a letter of support and/or electronic endorsement of the study. Logos from endorsing organizations will be included on the HS&B:20 study website. Endorsement collection will continue through spring 2019.

State Endorsement. As part of our study outreach efforts, all states will be contacted to inform them that the study will be taking place. States will be asked to provide a letter of endorsement to encourage school districts’ and schools’ participation in the study should schools in their state be selected. Letters will be sent to the state superintendent with copies to the state-level director of research and/or director of secondary education, as applicable (see appendix A.2). Once the sample schools are selected, senior HS&B:20 recruitment staff will contact state staff with schools in the sample to discuss the study and secure support. Endorsement letters received by the state are included in all mailings to districts and schools within the state. Endorsement collection will continue through spring 2019.

School District and Diocesan Notification and Recruitment. Once states have been contacted, whether an endorsement letter was received or not, school districts and dioceses that do not require a research application will be notified that schools in their district have been selected for the study. The letter to school districts will state that NCES’s contractor, RTI International, will contact the school within two weeks and that they may contact RTI with questions (see appendix A.3).

Research applications will be prepared for any school districts that require the approval of applications in order to conduct research in schools in their jurisdiction. If a school district notifies us that an application must be submitted, or some other requirement must be fulfilled, study staff will be prepared to respond to such requirements. If a district chooses not to participate, all reasons will be documented to help formulate a strategy for refusal conversion attempts. Participating districts may be asked to provide student roster information on the school’s behalf to reduce the burden on the school.

Public and Catholic School Recruitment. Two weeks after the district was notified, or after the district provides approval when required, recruitment will commence at the school-level. Schools will receive a colored folder that contains a letter (appendix A.4a), study information sheet (appendix A.6a), frequently asked questions (FAQs) about the study (appendix A.6b), and a study brochure (appendix A.6c). Materials will be sent via overnight delivery and follow-up will occur within three business days. The first contact will be intentionally assigned based on prior history of working with the schools and school or district characteristics. First contacts may include modes such as a telephone call from recruitment staff, study management staff at RTI, or NCES staff; or an in-person visit to the school. Each of these modes, as well as email communication, may be used throughout the recruitment process as needed.

Once a school agrees to participate in HS&B:20, a recruiter will work with the school to name a member of the school’s staff to serve as the school coordinator for the study. The recruiter will work with the school coordinator to schedule study activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the session logistics. Roster instructions will be sent electronically in the fall of 2019 for the field test and the fall of 2020 for BYFS (see Appendix A10) and are also available on the website (see Appendix A7a). If a school is experiencing difficulty with preparing the roster, the district may be asked to provide the roster on the school’s behalf.

In early communications, the recruiter will also gather information about the school including: what type of parental consent procedures need to be followed at the school; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact the scheduling of student sessions (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The HS&B:20 study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.

As mentioned, for the field test, half of the schools will be presented with a 90-minute student session. The other half will be presented with a 45-minute (one-class-period session) student session with the remainder of the session to be completed outside of school. As a refusal conversion effort, schools assigned to the 90-minute session will be asked to consider a 45-minute instead. If the school still declines, an out-of-school student session will be offered. Similarly, as a refusal conversion effort, if a school that is offered the 45-minute student session declines, that school will also be offered an out-of-school student session in which the entire battery is completed outside of school. For BYFS, the 90-minute session will be offered to schools during recruitment and the 45-minute session will be offered only as part of a non-response and/or refusal conversion strategy.

Private and Charter School Recruitment. If a private or charter school selected for the base-year field test operates under a higher-level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher-level governing body. If a private or charter school selected for the field test does not have a higher-level governing body, the school recruitment approach outlined above will be used.

Out-of-School Data Collection. Some schools may not permit the data collection to occur in a school-based session. To maximize participation and address school concerns about loss of instructional time, schools declining to conduct an in-school group session will be offered the possibility of having their students participate outside of school. These schools will still be asked to provide the student roster, teacher information, and parent contact information. We will ask these schools to distribute materials to sampled students and parents and to help encourage participation. This could mean sending materials to parents via mail or email or distributing materials through students; providing computer access so that students can participate at their convenience in the school; and/or following up with students and parents to encourage participation. The person designated as the school coordinator would receive the coordinator incentive as if the session were being held in school. Parent contact information would be used to contact parents directly to secure student and parent participation. Contacts outside of school would be in addition to prompting by the school coordinator, for schools willing to assist with this activity. Teachers, administrators, and counselors would be asked to participate as if the session were conducted in school.

Parent Recruitment. For schools allowing in-school student sessions, schools will be given the option of one of three types of parental permission letters: notification (appendix A.5a), implicit permission (opt out) (appendix A.5b), or explicit permission (opt in) (appendix A.5c). Each type of consent requires that parents be notified that their children have been selected for the study. With a notification letter, no permission form is sent home since no action is required on the part of the parent. For implicit consent (opt out), the school does not require verbal or written consent for a student to participate in the study; parents are asked only to notify the appropriate person if they do not want their child to participate. With explicit consent (opt in), children may participate only if their parents provide written or oral consent for them to do so. Proactive parent recruitment will be focused on maximizing the number of parents (1) returning signed explicit consent forms and (2) completing the parent survey. Because implicit consent does not require a verbal or written response from parents, these parents will not be contacted about consent forms. The letter accompanying the parent permission form will let parents know that their child will complete a survey, math and reading assessments, and a hearing and vision assessment. Parents will be told that they will receive results from the hearing and vision assessments.

The letters will be sent to the school for distribution to sampled students. For the field test, students in explicit permission schools who return the form by a designated date, regardless of whether permission is granted, will be offered a pizza party or an equivalent food event. For BYFS, students in explicit permission schools will be offered the option of a $3 voucher to the school cafeteria or a food event for returning a signed permission form by a designated date. Also, the permission letter for parents of students in explicit permission schools will include instructions for how to provide permission electronically. The school coordinator will be able to see electronic permission status via the HS&B:20 website, and the session facilitator will walk the school coordinator through that process if applicable.

For schools that only permit out-of-school data collection, all initial contacts with the student will be conducted through the parent. The parent will receive a letter along with an enclosed envelope containing study information and instructions, and will be asked to give the envelope to the student. By giving the envelope to the student, the parent implies consent to the child’s participation. The study information and instructions will include a student letter and the URL and login information for the student session. Students participating outside of school will not complete the hearing or vision assessment.

Parents will also receive an invitation to participate in the parent questionnaire (appendix A.11) at the start of data collection, with parent cases being added to the data collection process on a flow basis as parent contact information is provided by the school or parents provide such information on consent forms. Parent data collection will entail web-based self-administration with nonresponse follow-up by computer-assisted telephone interviewing (CATI). Letters inviting parents to participate in the study will contain a message on the envelope such as “Important, please open!”, “Your input is requested”, or “Help improve education! Open to find out how.” In addition to the parent questionnaire, a random sample of about 50 parents will be asked to participate in an abbreviated reinterview.

Data Collection Approach

The HS&B:20 data collection will consist of a student session (survey, math and reading assessment, and hearing and vision assessment) as well as surveys for students’ parents, math teachers, guidance counselors, and school administrators.

Once schools agree to participate, the designated school coordinator will be asked to provide a roster of all students in grades 9 and 12. The school coordinator will receive a roster template (see appendix B) with instructions to prepare and upload the roster electronically to the secure study website (see appendix A.10). Upon receipt of the roster, RTI statisticians will randomly select about 35 students each from grades 9 and 12. About a month prior to the scheduled student session, a student tracking form listing the selected students will be sent to the school along with parent permission forms to distribute to the students.
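
To illustrate the within-school selection step, the sketch below draws a simple random sample of about 35 students per eligible grade from an uploaded roster file. It is a minimal illustration only: the column names, file layout, and sample-size handling are hypothetical placeholders, and the actual roster format is defined by the template in appendix B and the study's sampling specifications.

import csv
import random

def select_students(roster_path, grades=(9, 12), n_per_grade=35, seed=None):
    """Draw a simple random sample of students within each eligible grade.

    For illustration, the roster is assumed to be a CSV file with columns
    named "student_id" and "grade"; the real layout is given in appendix B.
    """
    rng = random.Random(seed)
    by_grade = {g: [] for g in grades}
    with open(roster_path, newline="") as f:
        for row in csv.DictReader(f):
            grade = int(row["grade"])
            if grade in by_grade:
                by_grade[grade].append(row["student_id"])
    sample = {}
    for grade, ids in by_grade.items():
        # If a school enrolls fewer than n_per_grade students in a grade,
        # all of them are selected.
        sample[grade] = rng.sample(ids, min(n_per_grade, len(ids)))
    return sample

# Hypothetical usage:
# selected = select_students("school_roster.csv", seed=2020)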

Students will be asked to complete either a 90- or 45-minute session in a group administration in their school. The student surveys and direct assessments will take place in the school setting and be administered using Chromebooks (tablet-like computers with touchscreen capability and an attached keyboard) brought in to the school by HS&B:20 staff. HS&B:20 staff will also bring the necessary equipment for the hearing and vision assessments. This portion of data collection is referred to as the student session. To administer the survey and direct assessment in schools, study staff will work with schools to identify and utilize locations for administration that minimize distractions for the student and disruption to the school routine. Students will be prompted on screen to use the hearing and vision equipment that is at their desk to complete that portion of the session. This component may be turned off if schools decline to have the students’ vision and hearing tested. Students from schools completing the 45-minute student session will be asked to complete the remainder of the session outside of school.

Schools that decline to participate in the in-school session will be asked to allow students to participate outside of school. For these students, schools may be willing to distribute materials for the students to take home, send emails to parents, or mail materials directly to students’ homes. Schools may also be willing to allow the student(s) to complete the session on a school computer at their convenience. In addition, the study team will contact the family directly to obtain student and parent participation in the study. Students from these schools will be contacted through their parent using contact information provided by the school (appendix A.8a). The parent will be asked to give the sampled student an envelope containing a letter with login information for the student to complete the session online (appendix A.8b). Students who participate outside of school will not be asked to complete the reading, hearing, and vision assessments.

The parent (appendix A.11) survey will have an internet option and a telephone option, while the mathematics teacher (appendix A.12a), school counselor (appendix A.12b), and school administrator (appendix A.12c) surveys will be self-administered via the Web.

B.4 Tests of Methods and Procedures

BYFS will collect base-year data from a sample of ninth-grade students in the fall of 2020, their teachers, their parents, their guidance counselors, and their school administrators. For the field test, we plan a set of experiments to refine the full-scale data collection procedures for the school/students, teachers, and parents. Experiments are not recommended for counselors or administrators due to the small school sample size. The experiments vary the amount of information requested, the setting in which it is collected, and how participation is incentivized.

Field Test Experiments

School/Students. Student data collection largely relies on the schools’ willingness to allow in-school data collection. A student survey that averages 90 minutes may require two class periods to administer. To examine the impact of reducing participation burden on the schools, an alternative design will fit the data collection within one class period (approximately 45 minutes), potentially gaining participation from a larger proportion of sample schools.

To better understand the effect of the length of the student session on school recruitment, the 309 schools in the field test sample were randomly assigned to a 90-minute or a 45-minute in-school student administration request, with all schools in the same district assigned to the same condition. For each condition, a sequence of nonresponse follow-up procedures ultimately allows for out-of-school administration:

  • Full student session: The initial request for schools was for a 90-minute in-school student administration. Schools and districts that refused were asked to participate with a 45-minute administration. For those still refusing, a 90-minute out-of-school student administration was offered.

  • Reduced student session: In this condition, the initial request to schools and school districts was for the reduced 45-minute student session administration, which consists of demographic questions, the hearing and vision assessment, and the mathematics assessment. Students from schools that took the 45-minute session in school received a postcard asking them to complete the remainder of the session outside of school, specifically the student survey and reading assessment plus any math items they did not finish at school (see appendix A.8c). Those schools refusing to participate in the shorter session were offered a 90-minute out-of-school student administration.

Of the planned experiments, this one has the lowest statistical power because schools are the unit of analysis; that is, we should be able to detect only relatively large differences in participation. At the .05 level of significance and power of .80, we should be able to detect an approximately 13 percentage point difference in school participation (e.g., 24 percent vs. 37 percent) with a one-tailed test. However, despite our school sample size limitations, we believe that such a difference from substantially reducing the burden on schools is possible and warrants a test.
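
As a check on the general form of this calculation, the sketch below approximates the minimum detectable difference for two independent proportions using the arcsine (Cohen's h) effect size and a one-tailed normal test. With roughly 154 schools per arm (309 schools split evenly) and the 24 percent baseline used in the example above, it returns approximately 0.13, in line with the 13 percentage point figure. The per-arm count and baseline are illustrative inputs, and the sketch does not incorporate any clustering or design-effect adjustments the study team may apply.

from math import asin, sin, sqrt
from scipy.stats import norm

def min_detectable_increase(n_per_arm, baseline, alpha=0.05, power=0.80):
    """Smallest detectable increase over `baseline` for a one-tailed
    comparison of two independent, equal-sized proportions, using the
    arcsine (Cohen's h) approximation."""
    z = norm.ppf(1 - alpha) + norm.ppf(power)
    h = z * sqrt(2.0 / n_per_arm)            # detectable standardized effect
    phi_alt = 2 * asin(sqrt(baseline)) + h   # invert Cohen's h
    return sin(phi_alt / 2) ** 2 - baseline

# Roughly 154 schools per arm, 24 percent baseline participation rate:
print(round(min_detectable_increase(154, 0.24), 3))   # approximately 0.13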

If the field test recruitment effort yielded significantly higher school response rates with the 45-minute student session, we planned to implement it in BYFS. If the differences were not statistically significant, one option was to release the BYFS sample in two sample replicates, repeating the experiment in the first replicate in order to select the optimal student survey length for the second replicate. An alternative was to use the 45-minute instrument as a nonresponse follow-up strategy for school-level (and school district-level) nonresponse. If the field test recruitment effort did not yield higher school response rates with the 45-minute student session, the 90-minute session would be offered to schools during BYFS recruitment, and the 45-minute session would be used as a refusal conversion strategy.

As of July 23, 2019, 66 of the desired 75 schools have agreed to participate in the HS&B:20 BYFT. In preparation to begin school district and school recruitment in September 2019 for the fall 2020 BYFS data collection, we reviewed the results of the experiment to date to determine whether there is a difference in participation between schools offered a 90-minute session and those offered a 45-minute session. The 66 participating schools include five virtual schools whose students would necessarily participate outside of school. Thus, we looked to the remaining 61 schools to determine the outcome of the experiment, of which 33 were assigned the 90-minute session and 28 were assigned the 45-minute session. Similarly, we looked at the number of school districts that have declined to participate. Of the 17 districts that declined, eight were assigned to the 90-minute session (representing 28 schools) and nine were assigned to the 45-minute session (representing 21 schools). Though a small number of schools have expressly commented favorably on the one-class-period session time, session time did not have a large impact on a school’s decision to participate in the study. We will therefore proceed with the option to offer the 90-minute session to schools during BYFS recruitment, and to use the 45-minute session only as a refusal conversion strategy. As in the field test, students participating in the 45-minute session in school will be asked to complete the remainder of the session outside of school (they will be asked to complete the BYFS student survey and reading assessment, but will not be asked to finish any math items they did not complete at school).

Teachers. Teachers have multiple job responsibilities and demands on their time. Reducing the time demand to provide survey data has the potential to be a highly effective design feature to increase participation. Teachers within each participating school will be randomly assigned to provide the full teacher background information and the full teacher-student report, or to provide a reduced version of each set of data.

Assuming 75 participating schools, an average of 3.5 math teachers per school, and a teacher response rate of 75%, at the .05 level of significance and power of .80, we expect to detect a difference of 12 percentage points with a one-tailed test (the reduced burden is not expected to lead to a lower response rate).
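
Under the same simplified approximation used for the school experiment sketch above, the stated teacher figure can be reproduced approximately: about 262 sampled mathematics teachers (75 schools at 3.5 teachers per school) split into two arms of roughly 131 each, with the expected 75 percent response rate as the baseline, gives a one-tailed minimum detectable difference of about 12 percentage points. As before, this sketch treats teachers as independent observations and ignores any clustering of teachers within schools.

from math import asin, sin, sqrt
from scipy.stats import norm

# Teacher experiment: approximately 75 schools x 3.5 math teachers = 262
# teachers, split evenly across the full and reduced instrument conditions.
n_per_arm = round(75 * 3.5 / 2)              # about 131 teachers per arm
baseline = 0.75                              # expected teacher response rate
z = norm.ppf(0.95) + norm.ppf(0.80)          # one-tailed alpha = .05, power = .80
h = z * sqrt(2.0 / n_per_arm)                # detectable Cohen's h
p_alt = sin((2 * asin(sqrt(baseline)) + h) / 2) ** 2
print(round(p_alt - baseline, 3))            # roughly 0.12 (about 12 points)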

Parents. Although the parent survey will not be administered in an institutional setting, the length of the survey may still be an important factor in gaining participation. In addition, there are multiple ways to incentivize parent participation. We plan to offer a total of $20 to each sampled parent: a $5 prepaid incentive, which can be an effective way to encourage participation, with the remainder provided upon completion of the survey. However, we will examine the most effective way to provide the remaining $15 of the total $20 incentive, testing monetary and non-monetary options.

The parents will be the largest sample, allowing for a test with more than two conditions. We plan a 2x2 factorial design, with one treatment on survey length and one on incentive format. Sampled parents will be randomly assigned to a 30-minute survey or to a 15-minute survey. All parents will receive the $5 prepaid incentive. The survey length treatment (30 minutes versus 15 minutes) will then be crossed with assignment either to be offered $15 in cash (or check) upon completion, or to choose a different incentive with a value of approximately $15, such as tickets to a school ball game, a donation to the school in recognition of their participation, college preparation materials, two movie tickets for the family, a family board game, or a donation to a charity.

The key interest is in the main effect of each treatment. Assuming 2,625 sample parents of ninth-grade students and an expected parent response rate of 50% (the most conservative estimate), at the .05 level of significance and power of .80, we expect to detect a difference of 3.5 percentage points in the parent response rate.
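
The parent figure can be approximated in the same way, although the per-arm count needed to reproduce it is an assumption on our part: with about 2,625 parents in each level of a main-effect comparison (for example, if ninth- and twelfth-grade parents in the field test are pooled across the two levels of each factor) and a 50 percent baseline response rate, the one-tailed minimum detectable difference comes out to roughly 3.4 to 3.5 percentage points. The sketch below shows the calculation; the pooling assumption and the absence of any design-effect adjustment mean it is illustrative rather than a reproduction of the study team's computation.

from math import asin, sin, sqrt
from scipy.stats import norm

# Assumed inputs (not stated explicitly in the text): about 2,625 parents per
# arm of each main-effect comparison and a 50 percent baseline response rate.
n_per_arm = 2625
baseline = 0.50
z = norm.ppf(0.95) + norm.ppf(0.80)          # one-tailed alpha = .05, power = .80
h = z * sqrt(2.0 / n_per_arm)                # detectable Cohen's h
p_alt = sin((2 * asin(sqrt(baseline)) + h) / 2) ** 2
print(round(p_alt - baseline, 3))            # approximately 0.034 (about 3.5 points)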

B.5 Reviewing Statisticians and Individuals Responsible for Study Design and Conduct

The following individuals at the National Center for Education Statistics (NCES) are responsible for HS&B:20: Elise Christopher, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, Donna Jewell, David Wilson, and Laura Fritch, along with subcontractor staff Marc Berger and Rick Morgan (ETS).

References

Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796. Retrieved from http://ww2.amstat.org/sections/srms/Proceedings/papers/1987_141.pdf

1 While the BYFS sample will include only ninth-grade students, the base-year field test sample will include both ninth- and twelfth-grade students to anticipate the progression that will be observed when the sampled ninth-grade students are reassessed three years later, when most will be in twelfth grade.

2 A special education school is a public elementary/secondary school that focuses on educating students with disabilities and adapts curriculum, materials, or instruction for the students served.

3 The school sampling strata will be defined by a subset of the possible combinations of the characteristics.

4 One of the 1,092 schools is a Bureau of Indian Education (BIE) school and, though no enrollment counts are provided by BIE schools to the CCD, BIE schools were included in the sampling frame as long as they reported offering instruction in grade 9 and grade 12.

5 See, for example, Kish, L. (1965). Survey Sampling. John Wiley & Sons, Inc., p. 56.

6 If needed, the reserve sample will be released after the start of BYFS data collection. Should we expect to need to release any reserve sample schools, we will update our burden estimates in Part A in the next submission, for the BYFS data collection.

7 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.

8 The five student domains are as follows: AIAN, non-Hispanic; Asian, non-Hispanic; Hispanic; Black, non-Hispanic; and Other race, non-Hispanic.

9 The twelfth-grade cohort includes those ninth-grade students sampled in the base year who are enrolled in twelfth grade in the first follow-up. These students alone, however, are not representative of all students enrolled in twelfth grade in spring 2024. There will be students who enroll in U.S. schools after ninth grade, as well as students in twelfth grade in the spring of 2024 who were not in ninth grade in the fall of 2020. Additional students enrolled in twelfth grade will be added to the student sample in the first follow-up through sample freshening to produce a sample of students that is representative of all students enrolled in twelfth grade in spring 2024. Because the number of students to be added via sample freshening cannot be controlled and is difficult to estimate precisely, base-year student sample sizes were established assuming no students would be added via sample freshening.

10 Some of the students sampled in the ninth grade will be in the twelfth grade as of the first follow-up but some will not. Students sampled in ninth grade will be included in the first follow-up data collection regardless of the grade they are in, but those who are not in twelfth grade will not be part of the twelfth-grade cohort.

11 SAS Institute Inc. 2008. SAS/STAT® 9.2 User’s Guide. Cary, NC: SAS Institute Inc.

12 AIAN, non-Hispanic; Asian, non-Hispanic; Hispanic; Black, non-Hispanic; and Other race, non-Hispanic.

13 Students of indeterminate status include those whose whereabouts are not known by their base-year schools. Such students include those who may be transfer students, early graduates, or dropouts.

14 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

15 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

16 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
