School Survey on Crime and Safety (SSOCS) 2018 and 2020 Update



OMB# 1850-0761 v.16




Supporting Statement Part B





National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education





March 17, 2017

Revised April 2019







Section B. Methodology

The 2018 School Survey on Crime and Safety (SSOCS) questionnaire has only minor content changes that distinguish it from the 2016 questionnaire. However, the 2018 collection involves methodological changes, including a test of a web version of the survey for a subsample of schools and an incentive experiment for half of the sample. If these tests yield improved response rates, NCES will consider using the web-based instrument, as well as an incentive, for all respondents in the 2020 collection.

The SSOCS:2020 questionnaire has no new content compared to the 2018 and 2016 questionnaires. However, some items and subitems were removed from the questionnaire in order to reduce respondent burden, and other formatting revisions were made to improve the questionnaire's visual design (e.g., using alternating shading in subitem rows and removing grid lines). SSOCS:2020 also includes methodological changes that distinguish it from SSOCS:2018. First, given the favorable results, in terms of response rates, of the web test included in the 2018 collection, SSOCS:2020 will be collected primarily by Internet, with paper questionnaires offered in follow-up mailings rather than at the onset of collection. Second, the testing of monetary incentives will be expanded in the 2020 collection.

The information presented in this document covers both SSOCS:2018 and SSOCS:2020, with differences between the two collections noted explicitly.

B1. Respondent Universe and Sample Design and Estimation

The sampling frame for SSOCS is constructed from the public school sampling frame for the National Teacher and Principal Survey (NTPS) for the same collection year, but it excludes schools that are considered out of scope for SSOCS. The NTPS sampling frame is constructed from the Common Core of Data (CCD) public school universe file, but it excludes schools in the U.S. outlying areas1 and Puerto Rico, overseas Department of Defense schools, newly closed schools, home schools, and schools with a highest grade of kindergarten or lower. Regular public schools, charter schools, and schools that have partial or total magnet programs with students in any of grades prekindergarten through 12 are included in the NTPS sampling frame. The SSOCS sampling frame starts with the NTPS frame but excludes additional schools: schools run by the Bureau of Indian Education and the Department of Defense, schools specializing in special education or alternative education, vocational schools, virtual schools, and ungraded schools.

The sampling frame for SSOCS:2018 is constructed from the sampling frame for the 2017–18 NTPS. The 2017–18 NTPS public school sampling frame was constructed from the Public Elementary/Secondary School Universe data file of the 2014–15 Common Core of Data (CCD), which is an NCES annual collection of fiscal and nonfiscal data for all public schools, public school districts, and state education agencies in the United States.

The size of the SSOCS:2018 population is estimated to be about 84,000 schools. Tables 1 and 2 show the expected distribution of the public school sampling universe for SSOCS:2018, based on the 2014–15 CCD.

Table 1. Expected respondent universe for the SSOCS:2018 public school sample, by school level and urbanicity, based on the 2014–15 CCD

Urbanicity        Primary    Middle      High   Combined     Total
City               14,805     3,838     3,402        984    23,029
Suburb             17,339     5,617     3,927        661    27,544
Town                5,721     2,680     2,159        524    11,084
Rural              11,701     3,523     3,388      4,149    22,761
Total              49,566    15,658    12,876      6,318    84,418


Table 2. Expected respondent universe for the SSOCS:2018 public school sample, by school level and enrollment size, based on the 2014–15 CCD

Enrollment size    Primary    Middle      High   Combined     Total
Less than 300       10,495     2,837     2,420      2,835    18,587
300–499             18,122     3,515     2,018      1,537    25,192
500–999             19,911     7,454     3,094      1,492    31,951
1,000+               1,038     1,852     5,344        454     8,688
Total               49,566    15,658    12,876      6,318    84,418


The sampling frame for SSOCS:2020 will be constructed from the public school sampling frame originally planned for the 2019–20 NTPS,2 which will be constructed from the Public Elementary/Secondary School Universe data file of the 2017–18 CCD (scheduled to be released in April/May of 2019). The size of the SSOCS:2020 population is estimated to be approximately 84,400 schools.

Tables 3 and 4 show the expected distribution of the public school sampling universe for SSOCS:2020, by school level and urbanicity and by school level and enrollment size, respectively. Tables 3 and 4 reflect counts estimated from the 2014–15 CCD universe, because the 2017–18 CCD file, from which the SSOCS:2020 frame will be built, was not yet available at the time of this submission.

Table 3. Expected respondent universe for the SSOCS:2020 public school sample, by school level and urbanicity, based on the 2014–15 CCD

Urbanicity        Primary    Middle      High   Combined     Total
City               14,938     3,800     3,402      1,109    23,249
Suburb             17,410     5,596     3,909        720    27,635
Town                5,695     2,611     2,104        593    11,003
Rural              11,537     3,418     3,289      4,292    22,536
Total              49,580    15,425    12,704      6,714    84,423


Table 4. Expected respondent universe for the SSOCS:2020 public school sample, by school level and enrollment size, based on the 2014–15 CCD

Enrollment size    Primary    Middle      High   Combined     Total
Less than 300       10,371     2,757     2,254      2,871    18,253
300–499             18,193     3,467     2,029      1,652    25,341
500–999             19,934     7,322     3,047      1,640    31,943
1,000+               1,082     1,879     5,374        551     8,886
Total               49,580    15,425    12,704      6,714    84,423



Sample Selection and Response Rates

SSOCS:2016 yielded an unweighted response rate of approximately 59 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 63 percent. Both the unweighted and weighted response rates represented a significant drop from those obtained in the prior collection, SSOCS:2010, which yielded unweighted and weighted response rates of 77 and 81 percent, respectively.

SSOCS:2018 yielded an unweighted response rate of approximately 58 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 62 percent. Given the inclusion of planned experiments aimed at increasing overall response, we anticipate at least maintaining the SSOCS:2016 and SSOCS:2018 response rates in SSOCS:2020, which would yield more completed surveys than needed to meet the study's objectives.

A stratified sample design will be used to select approximately 4,800 public schools for SSOCS:2018 and SSOCS:2020 (compared to 3,553 public schools for SSOCS:2016) in order to obtain the 2,550 completed interviews needed to ensure precision in the estimates. For sample allocation purposes, strata will be defined by instructional level, locale, and enrollment size. Minority enrollment, region, and state will be used as sorting variables in the sample selection process to induce implicit stratification.

Sample Design for SSOCS:2016

A stratified sample design was used to select schools for SSOCS:2016. For sample allocation and sample selection, strata were defined by instructional level, locale, and enrollment size. Within each of four instructional level categories, the sample was allocated to each of 16 subgroups formed by the cross-classification of locale (four levels) and enrollment size (four levels) in proportion to an aggregate measure of size derived for each subgroup. The aggregate measure of size for a specific locale-by-enrollment cell within an instructional level is equal to the sum of the square roots of the enrollments of the schools in that cell.

The initial goal of SSOCS:2016 was to collect data from at least 2,550 schools, taking nonresponse into account. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the U.S. public school population. However, while the majority of U.S. public schools are primary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would, therefore, have yielded an inefficient sample design because the sample composition would have included more primary schools (where crime is an infrequent event) than middle or high schools (where crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools, with the allocation as follows: 640 primary schools, 895 middle schools, 915 high schools, and 100 combined schools. After inflating the sample size to allow for nonresponse, the resulting sample allocation by school level was 849 primary schools, 1,230 middle schools, 1,347 high schools, and 127 combined schools. The total sample size was 3,553 schools. Schools in SSOCS:2000, SSOCS:2004, SSOCS:2006, SSOCS:2008 and SSOCS:2010 were allocated to instructional levels in a similar manner.

After the allocation for each stratum was determined, percent minority, region, and state were used as implicit stratification variables by sorting the school lists in each stratum by these variables before sample selection. The formula used to calculate measure of size is given as

$MOS(h) = \sum_{i=1}^{N_h} \sqrt{E_{hi}}$

where $E_{hi}$ is the enrollment of the $i$th school in stratum $h$ and $N_h$ is the total number of schools in stratum $h$.

The measure of size for the instructional level, MOS(l), is found by summing the 16 measure-of-size values, MOS(h), that comprise the instructional level. Dividing the stratum measure of size, MOS(h), by the total measure of size for the instructional level, MOS(l), gives the proportion of the sample to be allocated to that stratum.
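For illustration only, the following is a minimal sketch (in Python) of the square-root-of-enrollment allocation described above; the school records and stratum labels are hypothetical, and the fractional allocations would be rounded to whole schools in practice.

```python
import math
from collections import defaultdict

def allocate_sample(schools, level_sample_size):
    """Allocate one instructional level's sample across its locale-by-enrollment
    strata in proportion to MOS(h) = sum of sqrt(enrollment) within stratum h."""
    mos = defaultdict(float)
    for stratum, enrollment in schools:  # schools: (stratum label, enrollment) pairs
        mos[stratum] += math.sqrt(enrollment)
    mos_level = sum(mos.values())  # MOS(l): total across the level's strata
    return {h: level_sample_size * mos_h / mos_level for h, mos_h in mos.items()}

# Hypothetical example: three strata within one instructional level
schools = [("city_small", 250), ("city_small", 300),
           ("suburb_mid", 600), ("rural_large", 1200)]
print(allocate_sample(schools, level_sample_size=100))
```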

Sample Design for SSOCS:2018 and SSOCS:2020

The same general sample design used for SSOCS:2016 will be adopted for the selection of schools in SSOCS:2018 and SSOCS:2020 with regard to stratification variables, the number of strata, the method of sample allocation, and the sorting of variables before selection.

The two main objectives of the SSOCS:2018 and SSOCS:2020 sampling design are identical to those of SSOCS:2016: (1) to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety; and (2) to maintain precise estimates of change in various characteristics relating to crime between the earliest and most recent SSOCS administrations. Adopting the same general design increases the precision of the estimate of change. For sample allocation and sample selection purposes, strata were defined in prior administrations of SSOCS by crossing instructional level, locale, and enrollment size. In addition, percent minority, region, and state were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The three explicit and three implicit stratification variables have been shown to be related to school crime and thus create meaningful strata for this survey.

SSOCS:2018

While the general sampling design for SSOCS:2018 remains the same as in prior collections, there are three notable differences. First, in an attempt to be proactive in reducing the burden on districts and their respective schools, the SSOCS team and the NTPS team are coordinating on the special district recruitment efforts (approved in March 2017; OMB# 1850-0761 v.11) and drawing each sample in tandem to minimize overlap in the schools sampled for both surveys. Second, a web experiment will be conducted during the SSOCS:2018 collection only. Approximately 1,150 cases from the sample will be randomly selected to receive the web-based instrument. This experimental group will be selected in a manner that reflects the overall sampling design, providing the ability to use responses from this group when calculating estimates. Lastly, an incentive experiment will be conducted during the SSOCS:2018 collection only. Approximately half of the entire sample will be randomly selected to receive $10 cash as part of the initial mailout. This incentive is not contingent on the respondent's completion of the survey.

SSOCS:2018 will take advantage of the lessons learned from the 2016 data collection. The response rates achieved for various strata and substrata in SSOCS:2016 have been examined in order to determine the proper size of the initial sample selection for 2018. Table 5 displays the SSOCS:2016 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region. The 2016 response rates were used to estimate the 2018 response rates to ensure a sufficient number of completed cases for analysis.

SSOCS:2020

While the general sampling design for SSOCS:2020 remains the same as in prior collections, there are three notable differences from SSOCS:2018. First, SSOCS:2020 will not coordinate with NTPS in the ways SSOCS:2018 did, because the planned collection for the 2019–20 NTPS has been delayed by one year. The special district recruitment efforts (approved in March 2017; OMB# 1850-0761 v.11) will not run in tandem with similar NTPS efforts, and the SSOCS sampling probabilities will not be adjusted based on the NTPS sample to minimize the overlap of sampled schools. Second, an incentive experiment will be included in the SSOCS:2020 collection, with approximately 2,340 schools assigned to the "early incentive" treatment, 1,230 schools assigned to the "delayed incentive" treatment, and 1,230 schools assigned to the "no incentive" (control) treatment. The schools in these experimental groups will be selected in a manner that reflects the overall sampling design, providing the ability to use their responses when calculating estimates. Lastly, a split-panel experiment will be conducted within the web instrument, designed to test a navigation menu, with approximately half of the entire sample randomly selected to receive a different version of the web instrument.

SSOCS:2020 will take advantage of the lessons learned from SSOCS:2018. The response rates achieved for the various strata and substrata in SSOCS:2018 have been examined in order to determine the proper size of the initial sample selected for 2020 to ensure a sufficient number of completed cases for analysis. Table 6 displays the SSOCS:2018 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region.

Calculation of Weights

Weights will be attached to each surveyed school so that the weighted data will represent population levels. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. As with SSOCS:2016, nonresponse weighting adjustment cells for the SSOCS:2018 and SSOCS:2020 data will be determined using a categorical search algorithm called Chi-Square Automatic Interaction Detection (CHAID). CHAID begins by identifying the school-level characteristics of interest that are the best predictors of response. It divides the dataset into groups so that the unit response rate within cells is as constant as possible and the unit response rate between cells is as different as possible. The characteristics of interest must be available for both respondents and nonrespondents in order to conduct a CHAID analysis and, in the case of SSOCS, will be available through the CCD sampling frame. Weighting adjustment cells for the SSOCS:2018 and SSOCS:2020 data will be determined based on bias analysis results from the SSOCS:2016 and SSOCS:2018 data, respectively, in order to create the adjustment for nonresponse. The final, adjusted weights will be raked so that the sum of the weights matches the number of schools derived from the latest CCD public school universe file.
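For illustration, a minimal sketch (Python) of the base-weight-times-nonresponse-adjustment step described above; the adjustment cells would in practice come from the CHAID analysis, the case records shown are hypothetical, and the final raking step is omitted.

```python
from collections import defaultdict

def nonresponse_adjusted_weights(cases):
    """cases: eligible sampled schools, each a dict with 'cell' (a CHAID-style
    adjustment cell), 'base_weight' (inverse of the selection probability),
    and 'responded' (bool). Returns final weights for respondents: base weight
    times the cell-level adjustment, i.e. (sum of eligible weights in the cell)
    / (sum of respondent weights in the cell)."""
    eligible = defaultdict(float)
    responding = defaultdict(float)
    for c in cases:
        eligible[c["cell"]] += c["base_weight"]
        if c["responded"]:
            responding[c["cell"]] += c["base_weight"]
    adjustment = {cell: eligible[cell] / responding[cell] for cell in responding}
    return [c["base_weight"] * adjustment[c["cell"]]
            for c in cases if c["responded"]]
```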

Methods for Variance Estimation

Standard errors of the estimates will be estimated using jackknife repeated replication (JRR). Replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs will be provided, as will the weights for all replicates that were formed in order to calculate variances.
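For illustration only, a minimal sketch (Python) of the JRR variance computation, assuming the replicate estimates have already been produced by re-estimating the statistic with each set of replicate weights; the exact variance multiplier depends on how the half-sample replicates are formed, and a JK2-style construction (multiplier of 1) is assumed here.

```python
def jrr_variance(full_sample_estimate, replicate_estimates):
    """Jackknife repeated replication: sum of squared deviations of the
    replicate estimates from the full-sample estimate (JK2-style design)."""
    return sum((r - full_sample_estimate) ** 2 for r in replicate_estimates)

def jrr_standard_error(full_sample_estimate, replicate_estimates):
    return jrr_variance(full_sample_estimate, replicate_estimates) ** 0.5
```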

Table 5. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2015–16

School characteristic           Initial sample   Completed Survey1   Nonrespondents2   Ineligible3   Unweighted response rate (percent)4   Weighted response rate (percent)5
Total                                    3,553               2,092             1,442            19                                 59.2                                63.0
Level6
  Primary                                  849                 516               325             8                                 61.4                                63.6
  Middle                                 1,230                 719               508             3                                 58.6                                60.4
  High school                            1,347                 774               567             6                                 57.7                                60.2
  Combined                                 127                  83                42             2                                 66.4                                70.1
Enrollment size
  Less than 300                            349                 234               107             8                                 68.6                                73.0
  300–499                                  702                 426               273             3                                 60.9                                62.3
  500–999                                1,384                 831               546             7                                 60.3                                60.2
  1,000 or more                          1,118                 601               516             1                                 53.8                                54.0
Urbanicity
  City                                   1,083                 558               517             8                                 51.9                                52.2
  Suburb                                 1,362                 781               576             5                                 57.6                                60.7
  Town                                     428                 295               130             3                                 69.4                                68.8
  Rural                                    680                 458               219             3                                 67.7                                73.9
Percent White enrollment
  More than 95 percent                     147                 108                39             0                                 73.5                                74.1
  More than 80 to 95 percent               801                 543               255             3                                 68.0                                71.6
  More than 50 to 80 percent             1,025                 606               414             5                                 59.4                                63.0
  50 percent or less                     1,580                 835               734            11                                 53.2                                56.2
Region
  Northeast                                602                 338               262             2                                 56.3                                61.6
  Midwest                                  788                 501               283             4                                 63.9                                66.4
  South                                  1,346                 765               575             6                                 57.1                                61.6
  West                                     817                 488               322             7                                 60.2                                62.5

1In SSOCS:2016, a minimum of 162 of the 296 subitems eligible for recontact (i.e., all subitems in the questionnaire except those associated with the introductory items) were required to be answered for the survey to be considered complete. This includes a minimum of 75 of the 92 critical subitems, 18 of the 30 subitems within item 26, and 6 of the 25 subitems within item 35.

2Nonrespondents include schools whose districts denied permission to NCES to conduct the survey and those eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, changed from a regular public school to an alternative school, or are not a school ("not a school" generally refers to a school record for an organization that does not provide any classroom instruction (e.g., an office overseeing a certain type of program or offering tutoring services only)).

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample − known ineligibles).

5The weighted response rate is calculated by applying the base sampling rates to the following ratio: completed cases / (total sample − known ineligibles).

6Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2015–16 School Survey on Crime and Safety (SSOCS:2016).
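The definitions in footnotes 4 and 5 can be checked against the totals in Table 5; the following is a minimal sketch (Python) of the unweighted calculation.

```python
def unweighted_response_rate(completed, initial_sample, ineligible):
    """Completed cases divided by the eligible sample
    (total sample minus known ineligibles), as a percentage."""
    return 100 * completed / (initial_sample - ineligible)

# Table 5 total row: 2,092 completes, 3,553 sampled, 19 ineligible -> 59.2
print(round(unweighted_response_rate(2092, 3553, 19), 1))
```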


Table 6. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2017–18

School characteristic           Initial sample   Completed Survey1   Nonrespondents2   Ineligible3   Unweighted response rate (percent)4   Weighted response rate (percent)5
Total                                    4,803               2,762             1,975            66                                 58.3                                61.7
Level6
  Primary                                1,170                 671               477            22                                 58.4                                60.8
  Middle                                 1,704                 975               703            26                                 58.1                                60.7
  High school                            1,748                 997               740            11                                 57.4                                61.4
  Combined                                 181                 119                55             7                                 68.4                                71.5
Enrollment size
  Less than 300                            456                 286               135            35                                 67.9                                68.4
  300–499                                  955                 605               334            16                                 64.4                                65.8
  500–999                                1,860               1,042               806            12                                 56.4                                56.8
  1,000 or more                          1,532                 829               700             3                                 54.2                                55.1
Urbanicity
  City                                   1,528                 723               769            36                                 48.5                                49.3
  Suburb                                 1,837               1,034               793            10                                 56.6                                58.2
  Town                                     563                 382               168            13                                 69.5                                68.2
  Rural                                    875                 623               245             7                                 71.8                                55.0
Percent White enrollment
  More than 95 percent                     170                 128                39             3                                 76.6                                79.2
  More than 80 to 95 percent             1,014                 675               330             9                                 67.2                                68.3
  More than 50 to 80 percent             1,390                 848               536             6                                 61.3                                62.8
  50 percent or less                     2,229               1,111             1,070            48                                 50.9                                55.0
Region
  Northeast                                819                 459               347            13                                 56.9                                61.3
  Midwest                                1,029                 636               377            16                                 62.8                                64.3
  South                                  1,845               1,042               782            21                                 57.1                                61.0
  West                                   1,110                 625               469            16                                 57.1                                60.4

1In SSOCS:2018, a minimum of 60 percent (157 subitems) of the 261 subitems eligible for recontact (i.e., all subitems in the questionnaire except the non-survey items that collect information about the respondent) were required to be answered for the survey to be considered complete. In addition, a minimum of 80 percent of the 76 critical subitems (61 of 76), 60 percent of the item 30 subitems (18 of 30), and 60 percent of the item 38, column 1, subitems (3 of 5) were required to be answered. The critical items are 11, 18, 19, 20, 22, 28, 29, 30, 31, 35, 36, 38 (column 1), 39, 40, 41, 42, 46, 47, and 48. Questionnaires that did not meet established completion criteria were considered incomplete and are excluded from the SSOCS:2018 data file.

2Nonrespondents include schools whose districts denied permission to NCES to conduct the survey and those eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, changed from a regular public school to an alternative school, or are not a school ("not a school" generally refers to a school record for an organization that does not provide any classroom instruction (e.g., an office overseeing a certain type of program or offering tutoring services only)).

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample − known ineligibles).

5The weighted response rate is calculated by applying the base sampling rates to the following ratio: completed cases / (total sample − known ineligibles).

6Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2017–18 School Survey on Crime and Safety (SSOCS:2018).


B2. Procedures for the Collection of Information

The data collection methods used in SSOCS:2018 and SSOCS:2020 include, for the first time in SSOCS, web-based survey experiments alongside the mail survey, with intensive follow-up by both phone and e-mail. The methods are described in more detail in the following sections.

Steps in the Data Collection Process

The following is a description of the main tasks in the data collection process for SSOCS. These tasks include drawing the sample; identifying special contact districts; mailing advance letters to school principals (SSOCS:2018 only), district superintendents, and Chief State School Officers (CSSOs); mailing the full package to principals; placing reminder and follow-up calls to nonresponding schools; and refusal conversion efforts using both mailings and e-mails. All communication materials sent to potential respondents are designed to avert refusals. See appendix A for letters to superintendents, CSSOs, and principals, as well as postcards to schools in special contact districts and reminder e-mails to respondents. The contact materials in appendix A are grouped separately for SSOCS:2018 and SSOCS:2020.

Drawing the Sample

The sample of schools will be drawn in the June preceding data collection, once the NTPS frame creation is complete and following the drawing of the NTPS sample (SSOCS:2018 only). However, since many larger districts (known as "certainty" districts) are always included in the various NCES sample surveys, the preliminary research and application development for these districts will begin in early spring, prior to sampling. This will ensure that these districts have the necessary information to present to their research approval boards during their scheduled annual or biannual meetings. Additional special contact district outreach will occur once the sample is drawn, for any remaining sampled districts that require approval.

Identifying Special Contact Districts and the Need for Research Applications

Special contact districts require that a research application be submitted to and reviewed by the district before they will allow schools under their jurisdiction to participate in a study. Districts are identified as “special contact districts” prior to data collection because they were flagged as such during previous cycles of SASS, NTPS, or SSOCS, or by other NCES studies. Special contact districts are also identified during data collection when districts indicate that they will not complete the survey until a research application is submitted, reviewed, and approved.

Once a district is identified as a special contact district, basic information about the district is obtained from the NCES Common Core of Data (CCD). The basic information includes the NCES LEA ID number, district name, city, and state. The next step is to search the district's website for a point of contact and any information available about the district's requirements for conducting external research. Some districts identified as special contact districts in a previous cycle may no longer require a research application, so staff will verify whether a given district has requirements for conducting external research before proceeding.

The following are examples of the type of information that will be gathered from each district’s website in order to prepare a research application for submission to this district:

  • Name and contact information for the district office or department that reviews applications to conduct external research, and the name and contact information of the person in charge of that office.

  • Information about review schedules and submission deadlines.

  • Whether application fees are required, and if so, how much.

  • Whether a district sponsor is required.

  • Whether an online application is required, and if so, the link to the application if possible.

  • Whether in-person approval is required, and if so, information about the in-person approval process.

  • Information about research topics and/or agenda on which the district is focusing.

  • The web link to the main research department or office website.

  • Research guidelines, instructions, application forms, District Action Plans, Strategic Plan or Goals, if any.

Recruitment staff will contact districts by phone and e-mail to obtain key information not listed on the district's website (e.g., requirements for the research application, research application submission deadlines).

SSOCS special district recruitment staff developed a generic research application (see appendix A) that covers the information typically requested in district research applications. Staff will customize the generic research application to each district’s specific requirements that need to be addressed or included in the research application (e.g., how the study addresses key district goals, or inclusion of a district study sponsor), or submit the generic application with minimal changes to districts that do not have specific application requirements.

Using the information obtained from the district website or phone or email exchanges, a district research request packet will be prepared. Each research application will include the following documents, where applicable:

  • District research application cover letter;

  • Research application (district-specific or generic, as required by the district);

  • Study summary;

  • Frequently Asked Questions (FAQ) document;

  • Special contact district approval form;

  • Participant informed consent form (if required by the district);

  • NCES Project Director’s resume;

  • Copy of questionnaires; and

  • Application fee (if required by the district).

Other information about the study may be required by the district and will be included with the application or provided upon request.

Approximately one week after the application is submitted to the district (either electronically or in hard copy, as required by the district), SSOCS special district recruitment staff will contact the district’s research office to confirm receipt of the package and to ask when the district expects to review the research application and when a decision will be made. If additional information is requested by the district (e.g., the list of sampled schools), recruitment staff will follow up on such requests and will be available to answer any questions the district may have throughout the data collection period.

For SSOCS:2018, to reduce burden for the special contact districts and improve operational efficiency, NCES will seek research approval simultaneously for NTPS 2017-18 and SSOCS 2018. Although NCES is minimizing overlap in the schools sampled for NTPS and SSOCS, most of the largest districts will have schools selected for both surveys. All special contact districts with schools in both surveys will receive both research applications concurrently and will be given the option to participate in NTPS only, SSOCS only, or both NTPS and SSOCS. The research request packets for the districts in both studies will contain an additional letter introducing the studies and emphasizing that SSOCS and NTPS are working together to minimize the number of schools asked to participate in both studies.

However, because resource constraints mean that NTPS will not be conducted during the 2019–20 school year as originally planned, SSOCS:2020 will not be able to seek special contact district approval simultaneously with NTPS. SSOCS:2020 will therefore conduct the special contact district operations alone, as was done in administrations of SSOCS prior to SSOCS:2018.

Some districts charge a fee (approximately $50 to $200) to process research application requests, which will be paid as necessary.

Advance Notification to Principals

For SSOCS:2018, principals will be notified of the survey through an advance letter and e-mail sent a week or two before the questionnaire, following OMB clearance. The letter will include information about the study, the date of the first mailing, and a toll-free number that principals can call if they have questions. The toll-free number will be answered by Census program staff in Suitland, Maryland, who have been explicitly trained for this study and on how to respond to calls from schools. Staffing levels will ensure that at least one staff person is available at all times during the promised hours of operation. Copies of the advance letters to principals, including principals in special contact districts, are included in appendix A.

Census conducted an expert review of all SSOCS contact materials, and the resulting recommendation was that non-actionable contact materials be removed from the collection. Therefore, for SSOCS:2020, the advance notification letter will not be sent to principals, as they will receive their initial mailout package, with instructions to complete the SSOCS, approximately one week later.

Mailing the Study Notification to District Superintendents and Chief State School Officers

In order to achieve the highest possible response rate, we will send the study notification mailing to superintendents and Chief State School Officers (CSSOs) at the same time as the advance notification to principals (SSOCS:2018 only). The purpose of this mailing is to provide districts with information about the survey and to inform them about the questionnaires being mailed to sampled schools in their district. It is not designed to ask for permission; rather, it is designed as a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters and materials sent to the superintendents/CSSOs are included in appendix A.

Mailouts

SSOCS:2018 will be conducted primarily by mail and will include a modal experiment with a web-based version of the instrument. SSOCS:2020 will be conducted primarily via the web-based survey instrument. A clerical operation prior to data collection will obtain e-mail addresses for all of the sampled principals, and these addresses will be used to contact the principals throughout data collection. Both collections will use mail and e-mail to distribute instructions on how to complete the web questionnaire, with paper questionnaires introduced in follow-up mailings. Sampled principals will receive as many as four mailings, as needed, throughout the collection period; principals who have completed their questionnaire prior to a subsequent mailing will be excluded from that mailout.

SSOCS:2018

SSOCS:2018 will include a web experiment and an incentive experiment. The web experiment will be designed to test the efficacy of offering a web response option as the initial mode of data collection, as in the 2015–16 NTPS Schools and Principals Internet Study. A subsample of schools will be randomly selected (using the same sample design as the paper sample) to be included in the experimental (web) treatment at the time of sampling. Additionally, a $10 cash incentive will be included in the initial mailout to half of the sample (half of the paper treatment and half of the web treatment) in order to determine the effectiveness of incentivizing respondents to complete the questionnaire.

The initial mailout is scheduled for late February 2018, and the second mailout is scheduled for March 2018. The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—within 2 weeks of receipt of the questionnaire. Principals who have completed their questionnaire prior to the second mailout will be excluded from the second mailout.

The mailings for the paper respondents will include a paper questionnaire and a postage-paid return envelope, as well as a personalized cover letter that will provide the toll-free number at the Census Bureau, the SSOCS e-mail address, and the hours of operation.

The mailings for the schools selected for the web experiment will include a personalized letter containing the survey URL and a unique UserID for accessing the survey online. The letter will also include the Census Bureau contact information and hours of operation. A paper version of the questionnaire will be sent to respondents in the web experiment upon their request.

The principals in the web experiment will receive an e-mail invitation that includes a clickable URL to the web survey and log-in credentials around the time of the first and second mailings.

The third and fourth mailings for all respondents, regardless of whether they are in the web or paper treatment, will include a paper questionnaire, a postage-paid return envelope, and a personalized cover letter that will include the toll-free number at the Census Bureau, the SSOCS e-mail address, and the Census Bureau’s hours of operation. The third mailing will be the first time that respondents in the web experiment receive a paper questionnaire. Principals who have completed their questionnaire prior to these mailing(s) will be excluded from the mailouts.

Principals of all schools, regardless of whether the school was assigned to the web experiment, will be sent reminder e-mails, as appropriate, throughout the data collection period. E-mails will be personalized and sent to individual respondents. For the fourth follow-up e-mail, all nonrespondents (including those in the paper treatment) will receive an e-mail including the link to the survey and the User ID. Providing the web option to the paper treatment is an effort to continue to boost the overall response rate of the survey.

SSOCS:2020

SSOCS:2020 will be conducted primarily by the web-based survey instrument, with instructions to complete the questionnaire distributed to respondents by both mail and e-mail. It will include a modal experiment to test a navigation menu within the web instrument.

SSOCS:2020 will also build on the SSOCS:2018 incentive experiment described above and will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive.

The initial mailout is scheduled for mid-February 2020, and the second mailout is scheduled for March 2020. The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—within 2 weeks of receipt. Both mailings will include a personalized letter containing the survey URL and a unique UserID to access the survey online. The letter will also include Census Bureau contact information and answers to FAQs. In addition, the mailing will include a one-page endorsement insert, which will display the names and logos of all SSOCS endorsing agencies. Finally, schools in the “early incentive” treatment will receive $10 cash adhered to a brightly colored incentive insert in their initial mailout package.

The third and fourth mailings (in March and April, respectively) will include a paper questionnaire, a postage-paid return envelope, and a personalized cover letter that will include the toll-free number at the Census Bureau and the SSOCS e-mail address. The third mailing will be the first time that respondents receive a paper questionnaire. Schools in the “delayed incentive” treatment group will also receive their $10 cash incentive adhered to a brightly colored incentive insert in the third package mailing.

Principals will receive an e-mail invitation that includes a clickable URL to the web survey and log-in credentials around the time of the first and second mailings. E-mails will be personalized and sent to individual respondents. Principals will be sent reminder e-mails, as appropriate, throughout the data collection period.

A copy of the cover letters and e-mails sent to principals throughout SSOCS:2018 and SSOCS:2020 data collection is included in appendix A.

Protocol for Follow-up Calls

For SSOCS:2018, approximately 3 weeks after the estimated delivery of the first mailing to school principals, Census will initiate phone calls to confirm that principals have received the mailing and to ask if they have any questions. About a month later, Census will initiate phone calls with nonrespondents, reminding them to complete their questionnaire. For SSOCS:2020, approximately 3 weeks after the second mailing to school principals, Census will initiate phone calls with nonrespondents, reminding them to complete their questionnaire.

Finally, during the last two months of the SSOCS:2018 and SSOCS:2020 data collections, Census will conduct nonresponse follow-up by phone. This operation is aimed at collecting SSOCS data over the phone, whenever possible.

Refusal Conversion for Schools That Will Not Participate

If a school expresses strong concerns about confidentiality at any time during data collection, these concerns will be directed to the Census Project Director (and possibly to NCES) for formal assurance. All mailed materials will include the project’s toll-free number. In addition, for SSOCS:2020, FAQs will be included on the back of the initial mailout letters and will include information about why the participation of each sampled school is important and how respondent information will be protected.

Data Retrieval of Critical Items

In terms of the collection of "critical items," the interview labor will be divided between follow-up with nonrespondents (seeking the completion of "critical items" rather than the full survey) and follow-up with respondents who have skipped items deemed critical (the retrieval of missing data). For nonrespondents, in May 2018 (May 2020 for the 2020 collection), we will offer "critical item" completion by fax or phone. The "critical items" identified by NCES for SSOCS:2018 and SSOCS:2020 include incidence data as well as data on school characteristics, consistent with SSOCS:2016. The SSOCS:2018 critical items are as follows: 11, 18, 19, 20, 22, 28, 29, 30, 31, 35, 36, 38 (column 1), 39, 40, 41, 42, 46, 47, and 48. The SSOCS:2020 critical items are analogous to the SSOCS:2018 items, with item numbers updated to match the revised SSOCS:2020 questionnaire: 9, 15, 16, 17, 19, 25, 26, 29, 30, 32, 33, 35 (column 1), 36, 37, 38, 39, 43, 44, and 45.

B3. Methods to Maximize Response Rates

NCES is committed to obtaining a high response rate in SSOCS:2018 and SSOCS:2020. In general, a key to achieving a high response rate is to track the response status of each sampled school, with telephone follow-up, as well as follow-up by mail and e-mail, of those schools that do not respond promptly. To help track response status, survey responses will be monitored through an automated receipt control system.

Several other steps will also be taken to maximize the response rate. For example, SSOCS:2018 will include two experiments—a web-based mode and an incentive—to motivate principals to respond to the survey. If the experimental groups yield higher response rates than the control groups (paper respondents, no incentive), they may be implemented in the full sample in the 2020 data collection.

The decision to move to a primarily web-based instrument for SSOCS:2020 was based on the results of these two SSOCS:2018 experiments (see section B.4 of this submission). Analyses of these experiments resulted in the recommendation to include an incentive and allow web-based responses as part of a mixed-mode methodology in future SSOCS administrations. Overall, offering an incentive was advantageous for SSOCS:2018, as it increased response rates and promoted significantly faster response times. SSOCS:2020 will build on the SSOCS:2018 incentive experiment but will include two incentive treatment groups (see section B.4 of this document for details).

In addition, SSOCS:2020 will include a modal experiment to test a navigation menu within the web instrument. If the experimental group—the group that receives the instrument with the added navigation menu functionality—yields a higher response rate than the control group (traditional web instrument), this would indicate that the navigation menu improves instrument usability and/or reduces respondent burden and may be implemented in the full sample in subsequent data collections.

SSOCS:2018 questionnaires will be mailed by Federal Express to ensure their prompt receipt and to give the survey greater importance in the eyes of the potential respondents. SSOCS:2020 will take a slightly different approach, utilizing Federal Express only during the fourth and final mailing in order to make the questionnaire package stand out to nonrespondents.

All SSOCS mailed paper questionnaires will be accompanied by a postage-paid return reply envelope and a personalized letter and include a toll-free number that respondents may call to resolve questions about the survey. The letters will also provide a means for seeking help by e‑mail. If a questionnaire is returned by the U.S. Postal Service, the Census Bureau will seek to verify the correct address and remail the questionnaire. Likewise, if outgoing e-mails sent to respondents bounce back, the Census Bureau will perform research to obtain the correct addresses and then resend the e-mails.

All completed questionnaires (both paper and web) that are received by the Census Bureau will be reviewed for consistency and completeness. If a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data for key items), telephone interviewers will seek to obtain more complete responses. Telephone interviews will be conducted only by Census Bureau interviewers who have received training in general telephone interview techniques as well as specific training for SSOCS. After data retrieval is completed, a questionnaire must have approximately 60 percent of all items and approximately 80 percent of all critical items completed to be considered valid for inclusion in the dataset. Responses of "don't know" (which apply only to item 17 in SSOCS:2018 and item 14 in SSOCS:2020) are not counted as valid responses when counting the number of items completed.
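For illustration, a minimal sketch (Python) of the completeness rule described above; the item identifiers are hypothetical, and the thresholds are the approximate 60/80 percent criteria.

```python
def is_complete(responses, critical_items, dont_know_items=("item17",)):
    """responses: dict mapping item id -> value (None if skipped).
    A case is complete if ~60% of all items and ~80% of critical items
    have valid answers; "don't know" is not a valid answer for the one
    item that allows it ("item17" is a hypothetical identifier)."""
    def valid(item, value):
        if value is None:
            return False
        return not (item in dont_know_items and value == "don't know")

    answered = [valid(i, v) for i, v in responses.items()]
    critical = [valid(i, responses.get(i)) for i in critical_items]
    return (sum(answered) / len(answered) >= 0.60
            and sum(critical) / len(critical) >= 0.80)
```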

Endorsements

To further increase the perceived legitimacy of the survey and thus improve the response rate, the following organizations have endorsed the SSOCS:2018 collection:

  • American Association of School Administrators

  • American Federation of Teachers

  • American School Counselors Association

  • Association for Middle Level Education

  • Association of American Educators

  • Council of Chief State School Officers

  • Education Northwest

  • National Association of Elementary School Principals

  • National Association of School Psychologists

  • National Association of School Resource Officers

  • National Association of Secondary School Principals

  • National Association of State Boards of Education

  • National PTA

  • National School Safety Center

  • School Safety Advocacy Council

  • School Social Work Association of America

  • UCLA Center for Mental Health in the Schools

In addition to the above agencies that endorsed SSOCS:2018, Census will solicit endorsement from the following agencies for SSOCS:2020:

  • Center for Prevention of School Violence

  • Center for School Mental Health

  • National Association of School Safety and Law Enforcement Officers

  • National School Boards Association

  • Police Executive Research Forum

  • Safe Schools Initiative Division

  • University of Arkansas Criminal Justice Institute


B4. Tests of Procedures

Experiments

Both SSOCS:2018 and SSOCS:2020 will include methodological experiments aimed at boosting response rates.

SSOCS:2018 Experiments

SSOCS:2018 will include two data collection experiments: the first tests the inclusion of a web instrument as a mode of completing the survey, and the second investigates the effect of an incentive on survey completion.

Experiment #1: Web Instrument

Among a total sample of 4,800 schools, approximately 1,150 schools will be selected at random to be included in the treatment group. As opposed to the control group, which will receive a package including an informative letter about the survey, a brochure, and the questionnaire, the treatment group will receive a letter including log-in information, as well as a brochure. Out of a total of four potential mailings to schools, this group will receive only the web option for completing the survey in the first two mailings, and will receive the paper questionnaire in the third and fourth mailings. The difference in response rate between the control and treatment groups necessary to detect a statistically significant difference has been calculated and is discussed below. The following test statistic was used in determining this difference:

$\delta = z \sqrt{\mathrm{DEFF}\left(\frac{p_t(1-p_t)}{n_t} + \frac{p_c(1-p_c)}{n_c}\right)}$

where

  • $z$ = 1.96 for a 95 percent confidence level per NCES standards

  • $p_t$ and $p_c$ are the response rates for the treatment and control groups

  • $n_t$ and $n_c$ are the sample sizes for the treatment and control groups

  • $\mathrm{DEFF}$ = 1.6, the design effect observed in the recent administrations of SSOCS

To be conservative, the standard error was maximized by setting $p_t$ and $p_c$ equal to 0.5 only in the standard error component of the equation above. The actual experiment will gain additional power as the response rates for each group deviate from 50 percent. With 1,150 schools receiving the web option and 3,650 receiving the mail questionnaire, a significant difference will be detectable if the response rates between the two groups differ by at least 4.2 percentage points.
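For illustration, a minimal sketch (Python) of this minimum-detectable-difference calculation; it reproduces the 4.2 percentage point figure above, as well as the 3.6 and 4.4 point figures cited for the SSOCS:2018 incentive experiment and the SSOCS:2020 design.

```python
import math

def min_detectable_difference(n_treatment, n_control, z=1.96, deff=1.6, p=0.5):
    """Minimum detectable difference in response rates between two groups.
    Conservative: p = 0.5 maximizes the variance term p * (1 - p)."""
    se = math.sqrt(deff * (p * (1 - p) / n_treatment + p * (1 - p) / n_control))
    return z * se

# SSOCS:2018 web experiment: 1,150 web vs. 3,650 paper -> ~0.042 (4.2 points)
print(min_detectable_difference(1150, 3650))
# SSOCS:2018 incentive experiment: 2,400 vs. 2,400 -> ~0.036 (3.6 points)
print(min_detectable_difference(2400, 2400))
# SSOCS:2020: early incentive (2,340) vs. delayed or control (1,230) -> ~0.044
print(min_detectable_difference(2340, 1230))
```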

Experiment #2: Incentive

For this experiment, half of the sample (approximately 2,400 schools) will receive $10 cash in the initial mailing. This is considered the treatment group. For respondents in the treatment group receiving the paper version of the questionnaire, this incentive will be included with the informative letter, the brochure, and the survey. For respondents in the treatment group receiving the web option of the questionnaire, this incentive will be included with the informative letter and the brochure. The incentive will be included only in the initial mailing and not in other subsequent mailings. Control group respondents will not receive any incentive throughout the data collection procedures. Using the same statistic as above, the differences in response rate between the control and treatment group necessary to detect statistically significant differences have been calculated. With 2,400 cases receiving the incentive and 2,400 not receiving the incentive, a significant difference will be detectable if the response rates between the two groups differ by at least 3.6 percentage points.

SSOCS:2018 Experiment Results

Weighted response rates for the four experimental treatment groups included in SSOCS:2018 are reported in Table 7 below. Each treatment group is listed with its response rate and its difference from the control group's response rate. The p-value for the hypothesis test of no difference is reported in the last column.3


Table 7: SSOCS:2018 Experimental Group Response Rates (Standard Errors)

Experimental Group              Sample Size   Response Rate   Difference from Control   Significance Test P-Value
Internet, No Incentive                  575      46.7 (2.5)                -1.7 (3.1)                      0.5917
Paper, Incentive                      1,825      56.1 (1.6)                 7.7 (2.7)                     0.0028*
Internet and Incentive                  575      53.2 (2.7)                 4.9 (2.9)                      0.1010
Control, Paper, No Incentive          1,825      48.4 (1.9)                       N/A                         N/A

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

Response rates were calculated as of May 7, 2018, when a web-push effort was deployed. Cases that responded after May 7 are considered "nonrespondents" for the analysis.

*Denotes significance at 0.10.


In SSOCS:2018, schools that received the incentive but not the option to respond online had a response rate 7.7 percentage points higher than the control group (a statistically significant difference, p = 0.0028). Although the web-based instrument option did not increase response rates on its own, the analyses showed that schools in both the internet (option to respond online) and incentive treatment groups had a response rate 4.9 percentage points higher than the control group; however, this difference was not statistically significant. This result may have been driven by the incentive rather than the internet option, given that the internet offer did not appear to influence response by itself.

The weighted response distribution of the final mode of data collection, by assigned mode, is presented in Table 8. Of schools assigned to both the internet and incentive treatments, 88.2 percent responded using the internet, 11.1 percent responded using paper, and 0.7 percent responded by telephone during follow-up operations. Overall, between 88 and 90 percent of schools assigned to the internet treatment responded online.

Table 8: SSOCS:2018 Final Mode Distribution as a Percent of the Assigned Mode (Standard Errors)

Assigned mode (internet/incentive)   Final mode: Internet   Final mode: Paper   Final mode: Telephone follow-up   Percent total
Internet, Incentive                            88.2 (3.0)          11.1 (3.0)                         0.7 (0.5)             100
Internet, No incentive                         89.3 (2.7)          10.0 (2.6)                         0.7 (0.5)             100
Paper, Incentive                                     N/A1         100.0 (0.0)                        0.0^ (0.0)             100
Paper, No incentive                            0.6* (0.5)          99.2 (0.5)                         0.1 (0.1)             100

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

1 Schools that were assigned to paper did not have the option to respond on the internet until a web-push effort was deployed on May 7, 2018. Cases that responded after May 7 are considered "nonrespondents" for the analysis.

^Rounds to zero due to disclosure avoidance requirements by the U.S. Census Bureau.

*A few cases responded by paper but ultimately completed more questions using the web after it became available on May 7. These cases are considered respondents because they submitted the paper questionnaire before May 7, 2018, but they are considered "internet, final mode" respondents because their last mode (and the mode used for processing) was the internet.


Response distributions for each treatment were compared to the control group across eight school characteristics and three survey items. The chi-square test results do not provide evidence that the treatment groups' response distributions across school characteristics differed from the control group's. However, there was one significant difference in the item response distributions, for the percentage of schools with a sworn law enforcement officer, displayed in Table 9. The Internet and Incentive group in the last column has a different response distribution than the control group (p = 0.0808).

Table 9: SSOCS:2018 Item Response Distribution for Schools with a Sworn Law Enforcement Officer


Percent of responding schools (of those completed, by treatment group; percent by item response)

Item Response                                         All Respondents   Control, Paper, No Incentive   Internet, No Incentive   Paper, Incentive   Internet and Incentive
No sworn law enforcement officers present                  49.5 (1.4)                     46.6 (2.4)               51.1 (4.1)         49.9 (2.7)               54.7 (3.9)
At least 1 sworn law enforcement officer                   50.5 (1.4)                     53.4 (2.4)               48.9 (4.1)         50.1 (2.7)               45.3 (3.9)
Percent Total                                                     100                            100                      100                100                      100
Rao-Scott chi-square p-value, comparison to
control group (degrees of freedom)                                N/A                            N/A               0.3006 (1)         0.3584 (1)              0.0808* (1)

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

*Denotes significance at 0.10.


Looking at the distributions of school characteristics for nonresponding schools, a few characteristics were identified as being associated with the propensity to respond. Specifically, school locale, enrollment size, percent White enrollment, and the student-to-full-time-teacher ratio do not have similar distributions between respondents and nonrespondents. These characteristics had previously been identified as correlated with nonresponse, along with the number of teachers and the percentage of students eligible for free or reduced-price lunch, and are used in the algorithm for nonresponse adjustments.

When introducing a new mode or incentives, it is often helpful to understand the effects of an intervention on producing faster response, which can save money on follow-up efforts. Therefore, the amount of time (days) that it took each respondent in the experimental groups to return the survey (even if the survey was later deemed incomplete) was calculated as part of the analyses. Table 10 displays the weighted average number of days to respond for each experimental group, with the difference in average number of days from the control group. The p-value for the hypothesis test of no difference is reported in the last column.

Both the option to respond online and the incentive were associated with significantly faster response times. The incentive, with or without the internet option, produced the fastest response times.
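
As a rough illustration of the statistic reported in Table 10, the sketch below computes a weighted average response time for one group and its difference from the control group. The variable names are hypothetical, and the standard errors and p-values in Table 10 come from the survey's complex-design variance estimation, which is not reproduced here.

    import numpy as np

    def weighted_mean_days(days, weights):
        """Weighted average number of days from mailout to response."""
        days = np.asarray(days, dtype=float)
        weights = np.asarray(weights, dtype=float)
        return np.sum(weights * days) / np.sum(weights)

    # Hypothetical inputs: days-to-respond and survey weights for one
    # treatment group and for the control group.
    # diff = (weighted_mean_days(treat_days, treat_wts)
    #         - weighted_mean_days(control_days, control_wts))
    # A negative diff (e.g., -4.4) means the treatment group responded faster.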

Table 10: SSOCS:2018 Experimental Group Response Times (Standard Errors)

Experimental Group              Response Time in Days  Difference from Control  Significance Test P-Value
Internet, No Incentive          45.2 (2.1)             -4.4 (2.4)               0.0694*
Paper, Incentive                41.1 (1.2)             -8.5 (1.8)               <0.0001*
Internet and Incentive          41.2 (2.5)             -8.4 (3.0)               0.0072*
Control, Paper, No Incentive    49.6 (1.3)             N/A                      N/A

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

Response times for respondents as of May 7, 2018.

*Denotes significance at 0.10.


Providing the option to respond online, especially when combined with an incentive, reduced response time relative to offering only a paper questionnaire with no incentive. All treatment groups responded significantly faster than the control group: on average, schools in the internet/no-incentive treatment responded 4.4 days faster, schools in the internet/incentive treatment responded 8.4 days faster, and schools in the paper/incentive treatment responded 8.5 days faster than schools in the control group (paper/no incentive). Based on these analyses, the web-based instrument option is expected to result in earlier questionnaire completions and thus cost savings in follow-up efforts.

SSOCS:2020 Experiments

SSOCS:2020 will include two data collection experiments: the first will further investigate the effect of monetary incentives on survey completion, and the second will test the inclusion of a navigation menu in the web survey.

Experiment #1: Incentive

SSOCS:2020 will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail, as was done for the SSOCS:2018 incentive treatment group. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive throughout data collection.

Among a total sample of 4,800 schools, approximately 2,340 schools will be selected at random to be included in the “early incentive” treatment group and approximately 1,230 schools will be selected at random to be included in the “delayed incentive” treatment group. The remaining 1,230 schools will be in the control group.
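
A minimal sketch of such an assignment is below; it assumes simple random assignment of the 4,800 sampled schools to the three groups, whereas the actual assignment may be stratified or otherwise controlled.

    import numpy as np

    rng = np.random.default_rng(2020)  # seed is arbitrary/hypothetical
    # Group sizes from the text: early incentive, delayed incentive, control.
    labels = np.repeat(["early", "delayed", "control"], [2340, 1230, 1230])
    rng.shuffle(labels)  # random assignment across the 4,800 sampled schools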

The goal of this experiment is to further refine the SSOCS incentive strategy by comparing response rates, indicators of nonresponse bias, and data collection costs for the early and delayed incentive strategies, relative to a no-incentive control.

The smallest subsample size needed to detect a 5 percentage point difference between treatment groups was calculated to be 1,230 schools, which is the sample allocated to each of the delayed incentive treatment group and the control group. The actual experiment will gain additional power to the extent that the response rates for each group deviate from 50 percent. With 1,230 schools receiving the delayed incentive and 1,230 schools receiving no incentive, a significant difference from the “early incentive” treatment group (2,340 schools) will be detectable if the response rates between the groups differ by at least 4.4 percentage points.
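
The minimum detectable differences cited here, and for Experiment #2 below, can be reproduced with a standard two-proportion power calculation. The sketch below assumes a one-sided test at alpha = 0.05 with 80 percent power and response rates near 50 percent; these assumptions are inferred rather than stated in the text, but they reproduce the figures reported in this section.

    from scipy.stats import norm

    def min_detectable_diff(n1, n2, p=0.5, alpha=0.05, power=0.80):
        """Smallest detectable difference in response rates between two
        groups of size n1 and n2, using the normal approximation for the
        difference of two proportions (one-sided test)."""
        z = norm.ppf(1 - alpha) + norm.ppf(power)
        return z * (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5

    # Early incentive (2,340) vs. delayed incentive or control (1,230):
    print(round(min_detectable_diff(2340, 1230), 3))  # ~0.044 -> 4.4 points
    # Delayed incentive vs. control (1,230 each):
    print(round(min_detectable_diff(1230, 1230), 3))  # ~0.050 -> 5.0 points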

Experiment #2: Navigation Menu within Web Instrument

For this experiment, half of the sample (approximately 2,400 schools) will receive an invitation to complete the SSOCS survey via a slightly different version of the web instrument that will include navigation menu functionality. This is considered the treatment group. The other half of the sample will receive an invitation to complete the SSOCS via the traditional web instrument without the navigation menu (similar to the SSOCS:2018 instrument). The version of the web instrument offered to respondents will remain constant throughout data collection.

Using the same statistic as above, the difference in response rates between the control and treatment groups needed to detect a statistically significant difference was calculated. With 2,400 cases receiving the instrument with the navigation menu and 2,400 receiving the instrument without it, a significant difference will be detectable if the response rates of the two groups differ by at least 3.6 percentage points.
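
Under the same inferred assumptions, the min_detectable_diff sketch above reproduces this figure:

    # Navigation menu experiment: 2,400 vs. 2,400 schools.
    print(round(min_detectable_diff(2400, 2400), 3))  # ~0.036 -> 3.6 points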

Cognitive Testing and Focus Groups

As part of the development of SSOCS:2018, cognitive testing was conducted with 19 administrators during the winter of 2016 (OMB# 1850-0803 v.171). The cognitive testing concentrated on new items, items that had undergone substantial revisions, and items that had proven problematic (e.g., items for which interpretability issues were identified in previous iterations of testing). Based on the results of the cognitive testing, NCES is confident in the validity of the finalized items in the questionnaire. The SSOCS:2018 cognitive interview summary of results is provided in Part C of this submission.

In addition to the cognitive testing of the items, three online focus groups involving a total of 22 elementary, middle, and high school principals were conducted prior to the SSOCS:2018 data collection. The goals were to gain a better understanding of the barriers and benefits schools tend to associate with participation in federal surveys and to identify communication strategies that will help overcome those barriers to participation. Specifically, participants reviewed and provided feedback on the SSOCS brochure, the advance letter to principals, and the 2016 questionnaire. Based on the results of these focus groups, NCES is considering an update to the overall look and feel of the brochure; Appendix A provides draft text for the brochure. The focus group summary of results is provided in Part C of this submission.

Cognitive testing was not conducted for SSOCS:2020 because there were no new items and none were substantially revised.

B5. Individuals Responsible for Study Design and Performance

Several key staff are responsible for the study design and performance of SSOCS:2018. They are:

  • Rachel Hansen, Project Director, National Center for Education Statistics

  • Jana Kemp, American Institutes for Research

  • Melissa Diliberti, American Institutes for Research

  • Steven Hummel, American Institutes for Research

  • Samantha Neiman, American Institutes for Research

  • Michael Jackson, American Institutes for Research

  • Shawna Cox, U.S. Census Bureau

  • Adam Rettig, U.S. Census Bureau


The key staff responsible for the study design and performance of SSOCS:2020 are:

  • Rachel Hansen, Project Director, National Center for Education Statistics

  • Jana Kemp, American Institutes for Research

  • Melissa Diliberti, American Institutes for Research

  • Michael Jackson, American Institutes for Research

  • Zoe Padgett, American Institutes for Research

  • Sam Correa, American Institutes for Research

  • Shawna Cox, U.S. Census Bureau

  • Walter Holmes, U.S. Census Bureau

  • Tracae McClure, U.S. Census Bureau

  • Kombo Gbondo Tugbawa, U.S. Census Bureau

  • Aaron Gilary, U.S. Census Bureau

  • Alfred Meier, U.S. Census Bureau


1 The U.S. outlying areas are American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the U.S. Virgin Islands.

2 In early 2019, NCES made the decision to delay the 2019-20 NTPS by one year, making it the 2020-21 NTPS. However, the 2019-20 NTPS frame creation work continues for use in SSOCS:2020, as outlined in this document. All references to the 2019-20 NTPS remain as is because they relate to the SSOCS:2020 frame and sampling.

3 The “Paper, Incentive” group had a different hypothesis test from the other two treatment groups. For the “Paper, Incentive” group, the last column displays the p-value for the hypothesis test that the group that received the $10 cash incentive and no internet option has the same or a lower response rate than the control group.
