SSOCS Update Memo

School Survey on Crime and Safety (SSOCS) 2018 and 2020 Update

OMB: 1850-0761


UNITED STATES DEPARTMENT OF EDUCATION

National Center for Education Statistics


April 5, 2019

(updated May 1, 2019)

MEMORANDUM

To: Robert Sivinski, OMB

From: Rachel Hansen, NCES

Through: Kashka Kubzdela, NCES

Re: School Survey on Crime and Safety (SSOCS) 2018 and 2020 Update (OMB# 1850-0761 v.16)



The School Survey on Crime and Safety (SSOCS) is a nationally representative survey of elementary and secondary school principals that serves as the primary source of school-level data on crime and safety in public schools. SSOCS is the only recurring federal survey collecting detailed information on the incidence, frequency, seriousness, and nature of violence affecting students and school personnel from the school’s perspective. Data are also collected on frequency and types of disciplinary actions taken for select offenses; perceptions of other disciplinary problems, such as bullying, verbal abuse, and disorder in the classroom; the presence and role of school security staff; parent and community involvement; staff training; mental health services available to students; and school policies and programs concerning crime and safety. Prior administrations of SSOCS were conducted in 2000, 2004, 2006, 2008, 2010, 2016, and 2018. The 2018 and 2020 SSOCS full-scale data collections were approved in July 2017, with the latest change request approved in May 2018 (OMB# 1850-0761 v.15).

This request is to make updates for the 2020 SSOCS full-scale data collection. The updates listed below reflect feedback from technical review panel members as well as a survey design expert. The removal of items is further supported by a literature review that investigated how frequently (or rarely) items are used in the field, as well as a review of the variability in responses to certain items and of changes in estimates over time; items yielding little variability across response options, or whose estimates have not changed over time, were candidates for removal. The changes discussed in this request consist of revisions to: (1) the approved incentive and web experiments, (2) communication materials, and (3) the SSOCS:2020 questionnaire (nonsubstantive changes and removal of items). This request involves a small adjustment to the estimated respondent burden and a small cost increase to the federal government due to the minor adjustments to the incentive and web experiments. It also updates the CIPSEA confidentiality law citations to reflect changes in law and discontinues the use of the affidavit of nondisclosure with districts in SSOCS 2020, as district staff will no longer be asked to sign affidavits of nondisclosure before receiving the list of schools sampled in the district.

The approved versions of the Supporting Statement Parts A, B, and C as well as Appendix A - SSOCS 2018 & 2020 Communication Materials, have been revised to reflect the updates to the 2020 SSOCS collection. Appendix B - SSOCS 2018 & 2020 Questionnaires provides the approved SSOCS:2018 and the revised SSOCS:2020 questionnaire. The noteworthy changes to the approved clearance package documents are listed below. Text added since the last approved version of each document is marked in burgundy font color, all text deleted since the last approved version is marked in crossed-out burgundy font color, and all unchanged text is shown in black font.

The following updates were made to Part A:

A. Justification:

The following revision was made on p. 1:

As in 2006, 2008, 2010, and 2016, NCES has entered into an interagency agreement with the Census Bureau to conduct the 2018 and 2020 collections of SSOCS. The 2020 administration of SSOCS is being funded and conducted by NCES as in 2017-18, but with supplemental funding from the Office of Safe and Healthy Students (OSHS).


A.1 Circumstances Making Collection of Information Necessary:

The following was added on p. 2:

In early 2019, minimal revisions were made to the SSOCS:2020 questionnaire in order to maintain trend and in anticipation of the implementation of a full redesign for the SSOCS:2022 administration. These changes are designed to reduce respondent burden (e.g., by removing some items and subitems) and improve the visual design of the questionnaire (e.g., by using alternating shading in subitem rows and removing grid lines). These revisions were based on feedback from a technical review panel (TRP) consisting of content area experts and on the recommendations of a national expert in visual design elements for self-administered surveys. TRP experts suggested a number of specific modifications to existing definitions, survey items, and content areas. The majority of these suggestions will be implemented for the SSOCS:2022 data collection, as they require more extensive research and testing. Panelists recognized both the necessity and the difficulty of shortening the questionnaire to reduce burden on respondents. Based on panelist feedback on the relevance and analytic utility of items, some items and subitems have been removed from the SSOCS:2020 questionnaire. No new content was added to the questionnaire for SSOCS:2020. Revisions to the 2020 questionnaire are detailed in Supporting Statement Part C, Section C2, of this submission.


A.2. Purposes, Use, and Availability of Information:

The following was added on p. 3:

A complete description of the differences between the 2016 and 2018 surveys is provided in the questionnaire changes and rationale section in Supporting Statement Part C, Section C2. A complete description of the differences between the 2018 and 2020 surveys is provided in Supporting Statement Part C, Section C3.


The following was added on p. 3:

The First Look report and restricted-use data file and user’s manual for the SSOCS:2018 data collection will be released in summer 2019.


A.3. Appropriate Use of Information Technology:

The following was added on p. 4:

Based on the results of the two experiments (Internet and incentive) conducted during SSOCS:2018, SSOCS:2020 will be conducted primarily via the web-based survey instrument, with instructions on how to complete the questionnaire distributed to respondents by both mail and e-mail. Paper questionnaires will be introduced to nonrespondents in follow-up mailings, in addition to follow-up efforts by both telephone and e-mail. During the SSOCS:2018 data collection, approximately 77% of responding schools in the Internet treatment group completed the questionnaire online, and a similar proportion of SSOCS:2020 responses is expected to be completed through the web instrument. SSOCS:2020 will also include a modal experiment to test a navigation menu within the web instrument.

Analyses of the SSOCS:2018 Internet and incentive experiments resulted in the recommendation to include an incentive and allow web-based responses as part of a mixed-mode methodology in future SSOCS administrations. Although the web-based instrument option did not increase response rates on its own, the analyses showed higher response rates for schools that were part of both the Internet and incentive treatment groups. The web-based instrument option will offer cost savings on mailout, processing, and keying operations compared to a paper-only methodology. It will also allow for earlier questionnaire completion, as analyses showed a reduction in response time for the Internet treatment group, which leads to cost savings on follow-up efforts. For more information on the results of the SSOCS:2018 experiments, see Part B, Section B3, of this submission. All SSOCS:2020 schools will receive assurances that all of their data will be stored on secure online servers controlled by the U.S. Census Bureau and will be given the option to instead respond by paper during follow-up mailings later in the data collection.


The following revisions were made on p. 4:

For SSOCS:2018, invitations to complete the SSOCS questionnaires via the web-based instrument will be sent to principals of the schools randomly assigned to the web test. Principals of all schools, regardless of whether the school was randomly assigned to the web test, will be sent reminder e-mails, as appropriate, throughout the data collection period. For SSOCS:2020, all school principals will receive invitations to complete the SSOCS questionnaires via the web-based instrument and will be sent reminder e-mails, as appropriate, throughout the data collection period.


A.4. Efforts to Identify Duplication: For 2018, the special district recruitment operation for SSOCS and NTPS was conducted simultaneously in an effort to reduce burden for district research committees and improve district participation approval rates. However, due to a shortage in NCES staffing, the NTPS 2019-20 collection has been postponed, so SSOCS will conduct the special district operation alone for the 2020 collection.


The following was added on p. 6:

However, because resource constraints mean that NTPS will not be conducted during the 2019–20 school year as originally planned, SSOCS:2020 will not seek special district approval simultaneously with NTPS. SSOCS:2020 will therefore conduct the special district operations alone, as was done in administrations prior to SSOCS:2018.


The following was added on p. 6:

NCES and OCR have been working together since the 2015–16 CRDC data became available to compare estimates of incident counts that are reported in both surveys. Preliminary analyses conducted by NCES’s contractor, the American Institutes for Research (AIR), have shown discrepancies in the information reported for schools that responded to the same items in both SSOCS:2016 and the 2015–16 CRDC. Thus, before considering removing the items from one of the surveys, NCES wants to develop a better idea of which source provides the more accurate data. NCES is considering conducting a validation study to learn about both SSOCS and CRDC respondents’ processes for collecting and submitting crime data as well as their understanding of the survey items. The goals of the study would be to obtain information to improve the survey items, reduce the burden of future data collections, and ensure that the resulting data are more accurate for schools, districts, policymakers, and other data users. If conducted, the validation study would compare responses from SSOCS:2018 (data collected from February to June 2018) with those from CRDC 2017–18 (data collected during the spring of 2019). The validation study is in the initial phase of design, and if conducted, its results are expected to become available by the end of 2019. They will help inform NCES’s decision on whether to retain or remove the overlapping items from SSOCS:2022.


A.5. Methods Used to Minimize Burden on Small Entities:

The following was introduced on p. 7:

The SSOCS:2020 initial invitation letter will be mailed to respondents in February 2020 and will include log-in information and instructions to complete the online questionnaire within 2 weeks. Schools that do not respond will be contacted again by mail and encouraged to complete their questionnaire online. Schools that have not responded within 6 weeks will be mailed a SSOCS:2020 paper questionnaire. Schools will also receive periodic e-mail reminders throughout the data collection period. The data collection period will remain open through mid-June 2020.


A.8. Consultants Outside the Agency:

The following were added on p. 8:

Dr. Jolene D. Smyth, Department of Sociology and Director of the Bureau of Sociological Research, University of Nebraska-Lincoln

Jon Carrier, Maryland Association of School Resource Officers

Benjamin Fisher, University of Louisville

Christine Handy, National Association of Secondary School Principals

Kimberly Kendziora, American Institutes for Research

Mary Poulin Carlton, National Institute of Justice

Jill Sharkey, University of California, Santa Barbara

Madeline Sullivan, Office of Safe and Healthy Students

(…)

Rita Foy Moss, Office of Safe and Healthy Students

Rosa Olmeda, Office for Civil Rights

Madeline Sullivan, Office of Safe and Healthy Students


The following was added on p. 8:

No cognitive interviews were conducted specifically for SSOCS:2020 development, because no new or significantly modified items will be included in the questionnaire.


A.9. Provision of Payments or Gifts to Respondents:

The following revisions were made on pp. 8-9:

In addition to the web test, SSOCS:2018 will include an incentive experiment designed to examine the effectiveness of offering principals a monetary incentive to boost the overall response rate. Schools in the experimental treatment will receive a $10 prepaid gift card incentive at the first contact by mail. This treatment will be evaluated against the control group, which will not receive any incentive.

SSOCS:2020 will build on the SSOCS:2018 incentive experiment and will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive. The goal of this experiment is to further refine the SSOCS incentive strategy by comparing response rates, indicators of nonresponse bias, and data collection costs between the early and delayed incentive strategies, relative to a no-incentive control.


A.10. Assurance of Confidentiality:

The following was revised on p. 9:

  1. Confidential Information Protection and Statistical Efficiency Act of 2002;

  2. E-Government Act of 2002, Title V, Subtitle A;

  3. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;


A.12. Estimates of Burden for Information Collection:

The following was added on pp. 12-13:

SSOCS:2020

SSOCS:2018 yielded an unweighted response rate of approximately 58 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 62 percent. As in prior collections, the objectives of the SSOCS:2020 sample design are twofold: to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety and to develop precise estimates of change in various characteristics relating to crime between SSOCS administrations. To attain these objectives, and taking into consideration the low response rates in the 2016 and 2018 collections, approximately 4,800 schools will be drawn for the sample: 2,340 schools will be assigned to the “early incentive” treatment; 1,230 schools to the “delayed incentive” treatment; and 1,230 schools to the “no incentive” (control) treatment. Given the inclusion of both the web menu and incentive experiments aimed at increasing the overall response rate, we anticipate at least maintaining the SSOCS:2016 and SSOCS:2018 response rates, which will yield more completed surveys than needed to meet the study’s objectives.

An item included in the SSOCS:2018 questionnaire asked respondents, “How long did it take you to complete this form, not counting interruptions?” Based on their answers, respondents took approximately 51 minutes, on average, to respond to the SSOCS survey in 2018. In preparation for SSOCS:2020, upon reviewing the SSOCS:2018 survey items and the results of prior cognitive testing, NCES decided to delete 11 SSOCS:2018 items/subitems. Based on these updates, we estimate that the average survey response time in SSOCS:2020 will be 49 minutes.1
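As a quick arithmetic check (a sketch, not part of the submission), the 49-minute estimate converts to the per-respondent burden-hour figure used in Table 2:

```python
# Convert the estimated 49-minute average response time to burden hours.
minutes = 49
burden_hours = round(minutes / 60, 3)
print(burden_hours)  # 0.817
```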

Districts selected for the SSOCS sample that require submission and approval of a research application before the schools under their jurisdiction can be asked to participate in a study will be contacted to seek research approval. Based on previous SSOCS administrations, we estimate that approximately 195 special contact districts will be included in the SSOCS:2020 sample. Differing from the process for SSOCS:2018, SSOCS:2020 will not seek simultaneous special district approval with NTPS because NTPS will not be conducted during the 2019–20 school year. Otherwise, the process for contacting special districts for SSOCS:2020 will follow the process described for SSOCS:2018, as outlined earlier in this document.

Principals of sampled schools will be notified of the survey through an initial mailout containing an invitation letter with log-in information for the online questionnaire. The burden per school for reading and potentially following up on the SSOCS initial letter and any follow-up letters and e-mails is estimated to average about 6 minutes total.

Table 2. Estimated hourly burden for SSOCS:2020

Activity for each administration | Sample size | Expected response rate | Number of respondents* | Number of responses | Burden hours per respondent | Total burden hours
District IRB Staff Review | 195 | 0.80 | 156 | 156 | 3 | 468
District IRB Panel Review | 195 × 6 | 0.80 | 936 | 936 | 1 | 936
State Notification | 51 | 1.0 | 51 | 51 | 0.05 | 3
District Notification | 2,800 | 1.0 | 2,800 | 2,800 | 0.05 | 140
School Recruitment | 4,800 | 1.0 | 4,800 | 4,800 | 0.1 | 480
SSOCS Questionnaire | 4,800 | 0.6** | 2,880 | 2,880 | 0.817 | 2,353
Total for SSOCS:2020 administration | - | - | 8,743 | 11,623 | - | 4,380

* Details may not sum to totals because counts are unduplicated.

** This response rate is calculated based on the results of the SSOCS:2018 data collection. The web menu and incentive experiments are being conducted with the hope of increasing or at least maintaining the 2018 overall response rates.
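The Table 2 totals can be reproduced with a short script (a sketch; the per-row hour figures are taken from the table, and the rounded per-row products agree with the printed total):

```python
# Table 2 rows: (activity, number of respondents, burden hours per respondent).
rows = [
    ("District IRB Staff Review", 156, 3),
    ("District IRB Panel Review", 936, 1),
    ("State Notification", 51, 0.05),
    ("District Notification", 2800, 0.05),
    ("School Recruitment", 4800, 0.1),
    ("SSOCS Questionnaire", 2880, 0.817),
]

# Total burden hours: sum respondents x hours per respondent, then round.
total_hours = round(sum(n * h for _, n, h in rows))
print(total_hours)  # 4380

# Responses sum across activities; the 8,743 respondent total in Table 2
# is lower because respondent counts there are unduplicated.
total_responses = sum(n for _, n, _ in rows)
print(total_responses)  # 11623
```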


Annualized Response Burden for SSOCS:2018 and SSOCS:2020

The annualized estimated response burden for SSOCS:2018 and SSOCS:2020 is provided in Table 3.

Table 3. Annualized estimated response burden for SSOCS:2018 and SSOCS:2020

Activity for each administration | Number of respondents | Number of responses | Total burden hours
Total for SSOCS:2018 administration | 8,659 | 11,538 | 4,461
Total for SSOCS:2020 administration | 8,743 | 11,623 | 4,380
Annualized Total for SSOCS:2018 and SSOCS:2020* | 5,801 | 7,721 | 2,947

* The annualized total is the sum of the total SSOCS:2018 and SSOCS:2020 burden, divided by 3.


Assuming that the respondents (district education administrators for district approvals and mostly principals for the data collection) earn on average $45.80 per hour,2 and given the 2,947 annualized total estimated burden hours, the annualized total estimated burden time cost to respondents for SSOCS:2018 and SSOCS:2020 is estimated to be $134,973.
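The annualization arithmetic can be checked directly (a sketch; the three-year divisor follows the note under Table 3):

```python
# Total burden hours from Table 3 for each administration.
burden_2018 = 4461
burden_2020 = 4380

# The clearance spans three years, so the combined burden is divided by 3.
annualized_hours = round((burden_2018 + burden_2020) / 3)
print(annualized_hours)  # 2947

# Annualized burden time cost at the assumed $45.80 average hourly wage.
annualized_cost = round(annualized_hours * 45.80)
print(annualized_cost)  # 134973
```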



A.14. Estimates of Annual Government Cost:

The following revisions were made on p. 13:

The Census Bureau will conduct the SSOCS:2020 data collection preparation, data collection, and data file development work for approximately $2,400,000 over 3 years. A task in NCES’s ESSIN contract with AIR also supports this survey at about $725,000 over 3 years. Thus, SSOCS:2020 will cost the government approximately $3,125,000 over 3 years.


Therefore, the total annualized average cost for SSOCS:2018 and SSOCS:2020 is approximately $1,021,521.


A.15. Reasons for Changes in Response Burden:

The following was added on p. 14:

The small decrease in burden from SSOCS:2018 to SSOCS:2020 is due to the omission of the principal advance letter and a reduction in the number of questionnaire items and subitems, which are somewhat balanced out by the expected increase in the number of special handling districts in the SSOCS:2020 sample.


A.16. Time Schedule:

The following was added on p. 14:

Table 5. Schedule of major project activities: SSOCS:2020

Task | Date
Contact special districts to begin approval process | June 2019–January 2020
Complete and deliver special district applications and packages | June 2019–January 2020
Draft special mailing materials for schools in special districts | June 2019–January 2020
Data collection begins | February 2020
Data collection ends | July 2020
Restricted-use data file finalized | February 2021
First Look report through NCES review | March 2021
First Look report released | July 2021
Restricted-use data file released | September 2021
Survey documentation released | September 2021
Public-use data file released | November 2021
Web tables through NCES review | March 2022
Web tables released | July 2022


The following updates were made to Part B:

B. Methodology:

The following revisions were made on p. 1:

The SSOCS:2020 questionnaire has no new content compared to the 2018 and 2016 questionnaires. However, some items and subitems were removed from the questionnaire in order to reduce respondent burden, and other formatting revisions were made to improve the questionnaire’s visual design (e.g., using alternating shading in subitem rows and removing grid lines). SSOCS:2020 also includes methodological changes that distinguish it from SSOCS:2018. First, given the favorable response rate results of the web test included in the 2018 collection, SSOCS:2020 will be collected primarily by Internet, with paper questionnaires offered in follow-up mailings rather than at the onset of collection. Second, the testing of monetary incentives will be expanded in the 2020 collection.

The information presented in this document covers both SSOCS:2018 and SSOCS:2020, with differences between the two collections noted explicitly.


B.1. Respondent Universe and Sample Design and Estimation:

The following was added on p. 2:

The sampling frame for SSOCS:2020 will be constructed from the public school sampling frame originally planned for the 2019–20 NTPS,3 which will be constructed from the Public Elementary/Secondary School Universe data file of the 2017–18 CCD (scheduled to be released in April/May of 2019). The size of the SSOCS:2020 population is estimated to be approximately 84,400 schools.

Tables 3 and 4 show the expected distribution of the public school sampling universe for SSOCS:2020, by school level and urbanicity and by school level and enrollment size, respectively. Tables 3 and 4 reflect the expected numbers as estimated from the 2014-15 CCD universe, because the 2017-18 CCD file, from which the SSOCS:2020 frame will be built, was not yet available at the time of this submission.

Table 3. Expected respondent universe for the SSOCS:2020 public school sample, by school level and urbanicity, based on the 2014-15 CCD

Urbanicity | Primary | Middle | High | Combined | Total
City | 14,938 | 3,800 | 3,402 | 1,109 | 23,249
Suburb | 17,410 | 5,596 | 3,909 | 720 | 27,635
Town | 5,695 | 2,611 | 2,104 | 593 | 11,003
Rural | 11,537 | 3,418 | 3,289 | 4,292 | 22,536
Total | 49,580 | 15,425 | 12,704 | 6,714 | 84,423


Table 4. Expected respondent universe for the SSOCS:2020 public school sample, by school level and enrollment size, based on the 2014-15 CCD

Enrollment size | Primary | Middle | High | Combined | Total
Less than 300 | 10,371 | 2,757 | 2,254 | 2,871 | 18,253
300–499 | 18,193 | 3,467 | 2,029 | 1,652 | 25,341
500–999 | 19,934 | 7,322 | 3,047 | 1,640 | 31,943
1,000+ | 1,082 | 1,879 | 5,374 | 551 | 8,886
Total | 49,580 | 15,425 | 12,704 | 6,714 | 84,423


The following was added in subsection “Sample Selection and Response Rates” on p. 3:

SSOCS:2018 yielded an unweighted response rate of approximately 58 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 62 percent. Given the inclusion of planned experiments aimed at increasing the overall response, we anticipate at least maintaining the SSOCS:2016 and SSOCS:2018 response rates in SSOCS:2020, which will yield more completed surveys than needed to meet the study’s objective.


The following was added in subsection “Sample Design for SSOCS:2018 and SSOCS:2020” on p. 4:

SSOCS:2020

While the general sampling design for SSOCS:2020 remains the same as in prior collections, there are three notable differences from SSOCS:2018. First, SSOCS:2020 will not coordinate with NTPS in the ways SSOCS:2018 did, because the planned collection for the 2019-20 NTPS has been delayed by one year. The special district recruitment efforts (approved in March 2017; OMB# 1850-0761 v.11) will not run in tandem with similar NTPS efforts, and the SSOCS sampling probabilities will not be adjusted based on the NTPS sample to minimize the overlap of sampled schools. Second, an incentive experiment will be included in the SSOCS:2020 collection, with approximately 2,340 schools assigned to the “early incentive” treatment; 1,230 schools assigned to the “delayed incentive” treatment; and 1,230 schools assigned to the “no incentive” (control) treatment. The schools in these experimental groups will be selected in a manner that reflects the overall sampling design, so that their responses can be used when calculating estimates. Lastly, a split-panel experiment will be conducted within the web instrument to test a navigation menu, with approximately half of the entire sample randomly selected to receive a different version of the web instrument.
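As an illustrative sketch of the experimental split (simple random assignment is shown here; the actual selection is stratified to reflect the sampling design, and the school IDs are placeholders):

```python
import random

random.seed(2020)  # reproducible illustration only

# Placeholder IDs for the ~4,800 sampled schools.
schools = list(range(4800))
random.shuffle(schools)

# Allocate shuffled schools to the three incentive treatment groups.
groups = {
    "early_incentive": schools[:2340],
    "delayed_incentive": schools[2340:3570],
    "no_incentive": schools[3570:],
}

# Independently, roughly half the sample is randomly selected to receive
# the web instrument version with the navigation menu.
menu_half = set(random.sample(schools, len(schools) // 2))

print({name: len(ids) for name, ids in groups.items()})
```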

SSOCS:2020 will take advantage of the lessons learned from SSOCS:2018. The response rates achieved for the various strata and substrata in SSOCS:2018 have been examined in order to determine the proper size of the initial sample selected for 2020 to ensure a sufficient number of completed cases for analysis. Table 6 displays the SSOCS:2018 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region.


The following was added on p. 6:

Table 6. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2017–18

School characteristic | Initial sample | Completed survey1 | Nonrespondents2 | Ineligible3 | Unweighted response rate (percent)4 | Weighted response rate (percent)5
Total | 4,803 | 2,762 | 1,975 | 66 | 58.3 | 61.7
Level6
  Primary | 1,170 | 671 | 477 | 22 | 58.4 | 60.8
  Middle | 1,704 | 975 | 703 | 26 | 58.1 | 60.7
  High school | 1,748 | 997 | 740 | 11 | 57.4 | 61.4
  Combined | 181 | 119 | 55 | 7 | 68.4 | 71.5
Enrollment size
  Less than 300 | 456 | 286 | 135 | 35 | 67.9 | 68.4
  300–499 | 955 | 605 | 334 | 16 | 64.4 | 65.8
  500–999 | 1,860 | 1,042 | 806 | 12 | 56.4 | 56.8
  1,000 or more | 1,532 | 829 | 700 | 3 | 54.2 | 55.1
Urbanicity
  City | 1,528 | 723 | 769 | 36 | 48.5 | 49.3
  Suburb | 1,837 | 1,034 | 793 | 10 | 56.6 | 58.2
  Town | 563 | 382 | 168 | 13 | 69.5 | 68.2
  Rural | 875 | 623 | 245 | 7 | 71.8 | 55.0
Percent White enrollment
  More than 95 percent | 170 | 128 | 39 | 3 | 76.6 | 79.2
  More than 80 to 95 percent | 1,014 | 675 | 330 | 9 | 67.2 | 68.3
  More than 50 to 80 percent | 1,390 | 848 | 536 | 6 | 61.3 | 62.8
  50 percent or less | 2,229 | 1,111 | 1,070 | 48 | 50.9 | 55.0
Region
  Northeast | 819 | 459 | 347 | 13 | 56.9 | 61.3
  Midwest | 1,029 | 636 | 377 | 16 | 62.8 | 64.3
  South | 1,845 | 1,042 | 782 | 21 | 57.1 | 61.0
  West | 1,110 | 625 | 469 | 16 | 57.1 | 60.4

1In SSOCS:2018, a minimum of 60 percent (157 subitems) of the 261 subitems eligible for recontact (i.e., all subitems in the questionnaire except the non-survey items that collect information about the respondent) were required to be answered for the survey to be considered complete. The 261 subitems eligible for recontact include a minimum of 80 percent of the 76 critical subitems (61 out of 76 total), 60 percent of item 30 subitems (18 out of 30 total), and 60 percent of item 38 subitems in column 1 (3 out of 5 total). The critical items are 11, 18, 19, 20, 22, 28, 29, 30, 31, 35, 36, 38 (column 1), 39, 40, 41, 42, 46, 47, and 48. Questionnaires that did not meet established completion criteria were considered incomplete and are excluded from the SSOCS:2018 data file.

2Nonrespondents include schools whose districts denied permission to NCES to conduct the survey and those eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, changed from a regular public school to an alternative school, or are not a school ("not a school" generally refers to a school record for an organization that does not provide any classroom instruction (e.g., an office overseeing a certain type of program or offering tutoring services only)).

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample − known ineligibles).

5The weighted response rate is calculated by applying the base sampling weights to the following ratio: completed cases / (total sample − known ineligibles).

6Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2017–18 School Survey on Crime and Safety (SSOCS:2018).
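The unweighted response rate defined in footnote 4 can be reproduced from the Table 6 totals (a sketch):

```python
# Unweighted unit response rate for SSOCS:2018, from the Table 6 totals:
# completed cases / (total sample - known ineligibles).
initial_sample = 4803
completed = 2762
ineligible = 66

unweighted_rate = 100 * completed / (initial_sample - ineligible)
print(round(unweighted_rate, 1))  # 58.3
```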


B.2. Procedures for the Collection of Information: The 2018 SSOCS sample was drawn early so that the special district recruitment operation could be conducted simultaneously with the NTPS. Because the NTPS data collection has been postponed a year, this operation will not be conducted simultaneously for the 2020 SSOCS. Additionally, an expert review of the SSOCS materials resulted in minor changes to the letters (reduced text and removal of endorsements), the addition of the SSOCS FAQs to the back of the initial letter, and the removal of the advance letter to principals. The list of endorsements will now be a separate handout included in the package, along with a separate handout for the incentive.


The following revisions were made in subsection “Steps in the Data Collection Process” on p. 6:

The following is a description of the main tasks in the data collection process for SSOCS. These tasks include drawing the sample; identifying special contact districts; mailing advance letters to school principals (SSOCS:2018 only), district superintendents, and Chief State School Officers (CSSOs); mailing the full package to principals; placing reminder and follow-up calls to nonresponding schools; and refusal conversion efforts using both mailings and e-mails.


The following bullet point was added in subsection “Identifying Special Contact Districts and the Need for Research Applications” on p. 7:

The following are examples of the type of information that will be gathered from each district’s website in order to prepare a research application for submission to this district:

  • Whether in-person approval is required, and if so, information about the in-person approval process.


The following was added in subsection “Identifying Special Contact Districts and the Need for Research Applications” on p. 8:

However, because resource constraints mean that NTPS will not be conducted during the 2019–20 school year as originally planned, SSOCS:2020 will not be able to seek special district approval simultaneously with NTPS. SSOCS:2020 will therefore conduct the special district operations alone, as was done in administrations prior to SSOCS:2018.


The following revisions were made in subsection “Advance Notification to Principals” on p. 8:

For SSOCS:2018, principals will be notified of the survey through an advance letter and e-mail sent a week or two before the questionnaire, following OMB clearance. The letter will include information about the study, the date of the first mailing, and a toll-free number that principals can call if they have questions. The toll-free number will be answered by Census program staff in Suitland, Maryland, who have been explicitly trained for this study and on how to respond to calls from schools. Staffing levels will ensure that at least one staff person is available at all times during the promised hours of operation. Copies of the advance letters to principals and to principals in special contact districts are included in appendix A.

Census conducted an expert review of all SSOCS contact material, and the resulting recommendation was that non-actionable contact materials be removed from collection. Therefore, for SSOCS:2020, the advance notification letter will not be sent to principals, as they will be receiving their initial mailout package with instructions to complete the SSOCS approximately one week later.


The following was added in subsection “Mailouts” on p. 9:

SSOCS:2018 will be conducted primarily by mail and will include a modal experiment with a web-based version of the instrument. SSOCS:2020 will be conducted primarily via the web-based survey instrument. A clerical operation prior to data collection will obtain e-mail addresses for all of the sampled principals, and these e-mails will be used to contact the principals throughout the data collection. Both collections will use both mail and e-mail to distribute instructions on how to complete the web questionnaire, with paper questionnaires introduced in follow-up mailings. Sampled principals will receive as many as four mailings, as needed, throughout the collection period, and principals who have completed their questionnaire prior to subsequent mailing(s) will be excluded from those mailouts.


The following revisions were made in subsection “Mailouts” on p. 10:

SSOCS:2020

SSOCS:2020 will be conducted primarily by the web-based survey instrument, with instructions to complete the questionnaire distributed to respondents by both mail and e-mail. It will include a modal experiment to test a navigation menu within the web instrument.

SSOCS:2020 will also build on the SSOCS:2018 incentive experiment described above and will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive.

The initial mailout is scheduled for mid-February 2020, and the second mailout is scheduled for March 2020. The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—within 2 weeks of receipt. Both mailings will include a personalized letter containing the survey URL and a unique UserID to access the survey online. The letter will also include Census Bureau contact information and answers to FAQs. In addition, the mailing will include a one-page endorsement insert, which will display the names and logos of all SSOCS endorsing agencies. Finally, schools in the “early incentive” treatment will receive $10 cash adhered to a brightly colored incentive insert in their initial mailout package.

The third and fourth mailings (in March and April, respectively) will include a paper questionnaire, a postage-paid return envelope, and a personalized cover letter that will include the toll-free number at the Census Bureau and the SSOCS e-mail address. The third mailing will be the first time that respondents receive a paper questionnaire. Schools in the “delayed incentive” treatment group will also receive their $10 cash incentive adhered to a brightly colored incentive insert in the third package mailing.

Principals will receive an e-mail invitation that includes a clickable URL to the web survey and log-in credentials around the time of the first and second mailings. E-mails will be personalized and sent to individual respondents. Principals will be sent reminder e-mails, as appropriate, throughout the data collection period.

Copies of the cover letters and e-mails sent to principals throughout the SSOCS:2018 and SSOCS:2020 data collections, and a copy of the postcard for special contact districts, are included in appendix A.


The following revisions were made in subsection “Protocol for Follow-up Calls” on p. 10:

For SSOCS:2018, approximately 3 weeks after the estimated delivery of the first mailing to school principals, Census will initiate phone calls to confirm that principals have received the mailing and to ask if they have any questions. About a month later, Census will initiate phone calls with nonrespondents, reminding them to complete their questionnaire. For SSOCS:2020, approximately 3 weeks after the second mailing to school principals, Census will initiate phone calls with nonrespondents, reminding them to complete their questionnaire.


The following revisions were made in subsection “Refusal Conversion for Schools That Will Not Participate” on pp. 10-11 (the “refusal conversion” operation for SSOCS:2018, as was originally outlined in the deleted paragraph, was not utilized and thus was removed from text):

In addition, for SSOCS:2020, FAQs will be included on the back of the initial mailout letters and will include information about why the participation of each sampled school is important and how respondent information will be protected.

The SSOCS:2018 refusal conversion will begin about one month after the start of data collection and continue throughout the rest of the field period. This lag between the start of the data collection and the beginning of refusal conversion will allow time for the development and design of the refusal conversion training and protocol, which will be based on lessons learned during the first month of data collection. Throughout the field period, we will ensure a “cooling off period” of at minimum 14 calendar days before a refusing school is called.


The following revisions were made in subsection “Data Retrieval of Critical Items” on p. 11:

In terms of the collection of “critical items,” the interview labor will be divided between follow-up with nonrespondents (seeking the completion of “critical items” rather than the full survey) and follow-up with respondents who have skipped items deemed to be critical (the retrieval of missing data). For nonrespondents, in May 2018 (May 2020 for the 2020 collection), we will offer “critical item” completion by fax or phone. The “critical items” identified by NCES for SSOCS:2018 and SSOCS:2020 include incidence data as well as data on school characteristics, consistent with SSOCS:2016. The SSOCS:2018 critical items are as follows: 11, 18, 19, 20, 22, 28, 29, 30, 31, 35, 36, 38 (column 1), 39, 40, 41, 42, 46, 47, and 48. The SSOCS:2020 critical items are analogous to the SSOCS:2018 items, with item numbers updated to match the revised SSOCS:2020 questionnaire: 9, 15, 16, 17, 19, 25, 26, 29, 30, 32, 33, 35 (column 1), 36, 37, 38, 39, 43, 44, and 45.
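For analysts working across both administrations, the item-number correspondence above can be captured as a simple crosswalk. This is a hypothetical helper, not part of the memo; it assumes the two lists are given in matching (analogous) order, as the memo indicates.

```python
# Hypothetical crosswalk between SSOCS:2018 and SSOCS:2020 critical item
# numbers, built from the two lists in the memo. Assumes the lists are
# in matching (analogous) order, as the memo states.
CRITICAL_2018 = [11, 18, 19, 20, 22, 28, 29, 30, 31, 35,
                 36, 38, 39, 40, 41, 42, 46, 47, 48]  # 38 = column 1 only
CRITICAL_2020 = [9, 15, 16, 17, 19, 25, 26, 29, 30, 32,
                 33, 35, 36, 37, 38, 39, 43, 44, 45]  # 35 = column 1 only

item_crosswalk = dict(zip(CRITICAL_2018, CRITICAL_2020))

print(item_crosswalk[11])  # prints 9: SSOCS:2018 item 11 became item 9 in 2020
print(item_crosswalk[38])  # prints 35
```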


  1. B.3. Methods to Maximize Response Rates: The 2018 incentive experiment was successful, with the experimental group yielding a significantly higher response rate than the control group. NCES proposes further experimenting with the incentive by continuing to offer the $10 cash incentive at the initial mailout for half of the sample, and also offering a $10 incentive to a subsample (1/4 of the overall sample) at the third mailout. This effort is expected to encourage late respondents or potential nonrespondents to complete the survey.


The following revisions were made on p. 11:

The decision to move to a primarily web-based instrument for SSOCS:2020 was based on the results of these two SSOCS:2018 experiments (see section B.4 of this submission). Analyses of these experiments resulted in the recommendation to include an incentive and allow web-based responses as part of a mixed-mode methodology in future SSOCS administrations. Overall, offering an incentive was advantageous for SSOCS:2018, as it increased response rates and promoted significantly faster response times. SSOCS:2020 will build on the SSOCS:2018 incentive experiment but will include two incentive treatment groups (see section B.4 of this document for details).

In addition, SSOCS:2020 will include a modal experiment to test a navigation menu within the web instrument. If the experimental group—the group that receives the instrument with the added navigation menu functionality—yields a higher response rate than the control group (traditional web instrument), this would indicate that the navigation menu improves instrument usability and/or reduces respondent burden and may be implemented in the full sample in subsequent data collections.

SSOCS:2018 questionnaires will be mailed by Federal Express to ensure their prompt receipt and to give the survey greater importance in the eyes of potential respondents. SSOCS:2020 will take a slightly different approach, utilizing Federal Express only for the fourth and final mailing in order to make the questionnaire package stand out to nonrespondents.


The following was added in subsection “Endorsements” on p. 12:

In addition to the above agencies that endorsed SSOCS:2018, Census will solicit endorsement from the following agencies for SSOCS:2020:

  • Center for Prevention of School Violence

  • Center for School Mental Health

  • National Association of School Safety and Law Enforcement Officers

  • National School Boards Association

  • Police Executive Research Forum

  • Safe Schools Initiative Division

  • University of Arkansas Criminal Justice Institute


  1. B.4. Tests of Procedures:

The following was added in subsection “Experiments” on p. 12:

Both SSOCS:2018 and SSOCS:2020 will include methodological experiments aimed at boosting response rates.


The following was added in subsection “SSOCS:2018 Experiment Results” on pp. 13-16:

SSOCS:2018 Experiment Results

Weighted response rates for the four experimental treatment groups included in SSOCS:2018 are reported in Table 7 below. Each treatment group is listed with its response rate and its difference from the control group’s response rate. The p-value for the hypothesis test of no difference is reported in the last column.

Table 7: SSOCS:2018 Experimental Group Response Rates (Standard Errors)

Experimental Group           | Sample Size | Response Rate | Difference from Control | Significance Test P-Value
Internet, No Incentive       | 575         | 46.7 (2.5)    | -1.7 (3.1)              | 0.5917
Paper, Incentive             | 1,825       | 56.1 (1.6)    | 7.7 (2.7)               | 0.0028*
Internet and Incentive       | 575         | 53.2 (2.7)    | 4.9 (2.9)               | 0.1010
Control, Paper, No Incentive | 1,825       | 48.4 (1.9)    | N/A                     | N/A

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

Response rates calculated as of May 7, 2018, when a web-push effort was deployed. We consider cases that responded after May 7 “nonrespondents” for the analysis.

*Denotes significance at 0.10.


In SSOCS:2018, schools that received the incentive but did not receive the option to respond online had a response rate 7.7 percentage points higher than the control group (a statistically significant difference, p = 0.0028). Although the web-based instrument option did not increase response rates on its own, the analyses showed that schools that were part of both the internet (option to respond online) and incentive treatment groups had a response rate 4.9 percentage points higher than the control group; however, this difference was not statistically significant. This result may have been driven by the incentive rather than the internet option, given that the internet offer did not appear to influence response by itself.

The weighted distribution of the final mode of data collection by the assigned mode is presented in Table 8. Of schools that were assigned to both the internet and incentive treatments, 88.2 percent responded using the internet; 11.1 percent responded using paper; and 0.7 percent responded by telephone during follow-up operations. Overall, between 88 and 90 percent of schools that were assigned to the internet treatment responded online.

Table 8: SSOCS:2018 Final Mode Distribution as a Percent of the Assigned Mode (Standard Errors)

The first two columns give the assigned mode; the Internet, Paper, and Telephone Follow-up columns give the final mode (of those completed, percent by each mode).

Internet Treatment | Incentive Treatment | Internet   | Paper       | Telephone Follow-up | Percent Total
Internet           | Incentive           | 88.2 (3.0) | 11.1 (3.0)  | 0.7 (0.5)           | 100
Internet           | No incentive        | 89.3 (2.7) | 10.0 (2.6)  | 0.7 (0.5)           | 100
Paper              | Incentive           | N/A1       | 100.0 (0.0) | 0.0^ (0.0)          | 100
Paper              | No incentive        | 0.6* (0.5) | 99.2 (0.5)  | 0.1 (0.1)           | 100

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

1 Schools that were assigned to paper did not have the option to respond on the internet until a web-push effort was deployed on May 7, 2018. We consider cases that responded after May 7 “nonrespondents” for the analysis.

^Rounds to zero due to disclosure avoidance requirements by the U.S. Census Bureau.

*A few cases responded by paper, but ultimately completed more questions using the web after it was available May 7. These cases are considered respondents because they submitted the paper questionnaire before May 7, 2018, but they are also considered “internet, final mode respondents” because their last mode (and mode used for processing) was the internet.


Response distributions for each treatment were compared to the control group across eight school characteristics and three survey items. The chi-square test results do not provide evidence that the treatment group response distributions across school characteristics were different from the control group. However, there was one significant difference in the item response distributions, “Percent of schools with a sworn officer,” displayed in Table 9. The Internet and Incentive group in the last column has a different response distribution than the control group (p-value 0.0808).


Table 9: SSOCS:2018 Item Response Distribution for Schools with a Sworn Law Enforcement Officer

Percent of responding schools (of those completed by school type, percent by item response)

Item Response                                                                  | All Respondents | Control, Paper, No Incentive | Internet, No Incentive | Paper, Incentive | Internet and Incentive
No sworn law enforcement officers present                                      | 49.5 (1.4)      | 46.6 (2.4)                   | 51.1 (4.1)             | 49.9 (2.7)       | 54.7 (3.9)
At least 1 sworn law enforcement officer                                       | 50.5 (1.4)      | 53.4 (2.4)                   | 48.9 (4.1)             | 50.1 (2.7)       | 45.3 (3.9)
Percent Total                                                                  | 100             | 100                          | 100                    | 100              | 100
Rao-Scott Chi-Square p-value, comparison to control group (degrees of freedom) | N/A             | N/A                          | 0.3006 (1)             | 0.3584 (1)       | 0.0808* (1)

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

*Denotes significance at 0.10.


Examining the distributions of school characteristics for nonresponding schools, a few characteristics were identified as being associated with propensity to respond. Specifically, school locale, enrollment size, the percent of white students, and the student-to-full-time-teacher ratio do not have similar distributions between the sample respondents and nonrespondents. These characteristics were previously identified as being correlated with nonresponse, in addition to the number of teachers and the percentage of students eligible for free or reduced-price lunch, and are used in the algorithm for nonresponse adjustments.

When introducing a new mode or incentives, it is often helpful to understand the effects of an intervention on producing faster response, which can save money on follow-up efforts. Therefore, the amount of time (days) that it took each respondent in the experimental groups to return the survey (even if the survey was later deemed incomplete) was calculated as part of the analyses. Table 10 displays the weighted average number of days to respond for each experimental group, with the difference in average number of days from the control group. The p-value for the hypothesis test of no difference is reported in the last column.

Both the option to respond online and the incentive produced significantly faster response times. Specifically, the incentive, regardless of the internet option, produced the fastest response times.

Table 10: SSOCS:2018 Experimental Group Response Times (Standard Errors)

Experimental Group           | Response Time in Days | Difference from Control | Significance Test P-Value
Internet, No Incentive       | 45.2 (2.1)            | -4.4 (2.4)              | 0.0694*
Paper, Incentive             | 41.1 (1.2)            | -8.5 (1.8)              | <0.0001*
Internet and Incentive       | 41.2 (2.5)            | -8.4 (3.0)              | 0.0072*
Control, Paper, No Incentive | 49.6 (1.3)            | N/A                     | N/A

Source: U.S. Census Bureau, Results from the 2018 School Survey on Crime and Safety Internet and Incentive Experiment

Response times for respondents as of May 7, 2018.

*Denotes significance at 0.10.


Providing the option to respond online, especially when combined with an incentive, resulted in decreased response times compared to offering a paper questionnaire only with no incentive. All treatment groups showed significantly faster response times than the control group. On average, schools in the internet/no-incentive treatment responded 4.4 days faster, schools in the internet/incentive treatment responded 8.4 days faster, and schools in the paper/incentive treatment responded 8.5 days faster than schools in the control group (paper/no incentive). Based on these analyses, the web-based instrument option is expected to result in earlier questionnaire completions and thus cost savings in the follow-up efforts.

SSOCS:2020 Experiments

SSOCS:2020 will include two data collection experiments: the first will further investigate the effect of monetary incentives on survey completion, and the second will test the inclusion of a navigation menu in the web survey.

Experiment #1: Incentive

SSOCS:2020 will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail, as was done for the SSOCS:2018 incentive treatment group. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive throughout data collection.

Among a total sample of 4,800 schools, approximately 2,340 schools will be selected at random to be included in the “early incentive” treatment group and approximately 1,230 schools will be selected at random to be included in the “delayed incentive” treatment group. The remaining 1,230 schools will be in the control group.

The goal of this experiment is to further refine the SSOCS incentive strategy by comparing response rates, indicators of nonresponse bias, and data collection costs for the early and delayed incentive strategies, relative to a no-incentive control.

The smallest subsample size needed to detect a 5 percent difference between treatment groups was calculated to be 1,230 schools, which is the sample size allocated to both the delayed treatment group and the control group. The experiment will gain additional power as the response rates for each group deviate from 50 percent. With 1,230 schools receiving the delayed incentive and 1,230 schools receiving no incentive, a significant difference from the “early incentive” treatment group will be detectable if the response rates between the groups differ by at least 4.4 percentage points.

Experiment #2: Navigation Menu within Web Instrument

For this experiment, half of the sample (approximately 2,400 schools) will receive an invitation to complete the SSOCS survey via a slightly different version of the web instrument that will include navigation menu functionality. This is considered the treatment group. The other half of the sample will receive an invitation to complete the SSOCS via the traditional web instrument without the navigation menu (similar to the SSOCS:2018 instrument). The version of the web instrument offered to respondents will remain constant throughout data collection.

Using the same statistic as above, the differences in response rates between the control and treatment groups necessary to detect statistically significant differences have been calculated. With 2,400 cases receiving the instrument with the navigation menu and 2,400 receiving the instrument without the navigation menu, a significant difference will be detectable if the response rates between the two groups differ by at least 3.6 percentage points.
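The memo does not show the underlying power calculation, but the stated minimum detectable differences (5 percent, 4.4 percentage points, and 3.6 percentage points) are consistent with the standard two-proportion formula. A minimal sketch, assuming a two-sided test at alpha = 0.10, 80 percent power, and a conservative 50 percent baseline response rate (the function name and these parameter choices are our assumptions, not taken from the memo):

```python
from math import sqrt

def min_detectable_diff(n1, n2, p=0.5, z_alpha=1.645, z_beta=0.8416):
    """Minimum detectable difference between two group response rates.

    Assumes a two-sided test at alpha = 0.10 (z = 1.645) with 80 percent
    power (z = 0.8416) and a baseline response rate of p = 0.5.
    """
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (z_alpha + z_beta) * se

# Delayed-incentive group vs. control (1,230 schools each)
print(round(100 * min_detectable_diff(1230, 1230), 1))  # prints 5.0
# Early incentive (2,340) vs. delayed incentive or control (1,230)
print(round(100 * min_detectable_diff(2340, 1230), 1))  # prints 4.4
# Navigation-menu treatment vs. control (2,400 schools each)
print(round(100 * min_detectable_diff(2400, 2400), 1))  # prints 3.6
```

Because the variance term p(1 − p) is largest at p = 0.5, these detectable differences shrink as group response rates move away from 50 percent, consistent with the memo's note that the experiment gains power as rates deviate from 50 percent.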


The following was added in subsection “Cognitive Testing and Focus Groups” on p. 16:

Cognitive testing was not conducted for SSOCS:2020 because there were no new items and none were substantially revised.


  1. B.5. Individuals Responsible for Study Design and Performance.

The following was added on p. 17:

The key staff responsible for the study design and performance of SSOCS:2020 are:

  • Rachel Hansen, Project Director, National Center for Education Statistics

  • Jana Kemp, American Institutes for Research

  • Melissa Diliberti, American Institutes for Research

  • Michael Jackson, American Institutes for Research

  • Zoe Padgett, American Institutes for Research

  • Sam Correa, American Institutes for Research

  • Shawna Cox, U.S. Census Bureau

  • Walter Holmes, U.S. Census Bureau

  • Tracae McClure, U.S. Census Bureau

  • Kombo Gbondo Tugbawa, U.S. Census Bureau

  • Aaron Gilary, U.S. Census Bureau

  • Alfred Meier, U.S. Census Bureau


The following updates were made to Part C-1:

  1. C.1. Item Description and Justification: SSOCS:2018 and SSOCS:2020:

Full item descriptions and justifications were added in section 1.2 on pp. 5-10. The revisions below detail the places where the items in the SSOCS:2020 questionnaire diverge from the SSOCS:2018 questionnaire.


The following revisions were made on p. 1:

At various times in the history of the School Survey on Crime and Safety (SSOCS), the survey items have been examined both for the quality of their content and the data collected in response to them and, when necessary, the questionnaire has been adjusted. To maintain consistent benchmarks over time, to the extent possible, minimal changes have been made to the questionnaire between survey administrations. Some items were removed from the SSOCS:2018 and SSOCS:2020 questionnaires based on a perceived decline in their usefulness and to reduce the burden on respondents, and some items were revised to clarify their meaning.

Information on specific editorial changes, content modifications, item additions, and item deletions is included in Sections C2 and C3 of this document.

Presented below is a complete description of the sections and the corresponding items in the SSOCS:2018 and SSOCS:2020 questionnaires (see appendix B for the full questionnaires).

The following revisions were made on p. 5:

The SSOCS:2020 questionnaire and procedures are expected to be the same as in SSOCS:2018. Due to adjustments to the questionnaire between these two administrations, some item numbers have shifted. The item numbers presented below represent the SSOCS:2020 question numbering, with the SSOCS:2018 question numbers displayed in parentheses. Further details on the changes made to the questionnaire between the 2018 and 2020 administrations of SSOCS, including rationales for the changes, can be found in Section C3.

The SSOCS:2020 questionnaire consists of the following sections:

  • School practices and programs;

  • Parent and community involvement at school;

  • School security staff;

  • School mental health services;

  • Staff training and practices;

  • Limitations on crime prevention;

  • Incidents (the section on Frequency of Crime and Violence at School was removed from the SSOCS:2020 questionnaire; items previously in that section were incorporated into the Incidents section);

  • Disciplinary problems and actions; and

  • School characteristics: 2019–20 school year.


The following revisions were made in subsection “School Practices and Programs” on p. 6:

Question 5 (SSOCS:2018 Questions 5 and 6)

This question asks whether schools have a threat assessment team. Threat assessment teams are an emerging practice in schools to identify and interrupt students who may be on a path to violent behavior. A follow-up question in the SSOCS:2018 questionnaire asked how often the threat assessment team meets; this question was removed from the SSOCS:2020 questionnaire.


The following revisions were made in subsection “Parent and Community Involvement” on p. 6:

Question 7 (SSOCS:2018 Questions 8 and 9)

This question asks about policies or practices that schools have implemented to involve parents in school procedures. An additional question in the SSOCS:2018 questionnaire asked about the percentage of parents participating in specific school events; this question was removed from the SSOCS:2020 questionnaire.

The following revisions were made in subsection “School Security Staff” on p. 6:



Questions 9 through 15 (SSOCS:2018 Questions 11 through 18)

These questions ask about the use and activities of sworn law enforcement officers (including School Resource Officers) on school grounds and at school events. One question from this section (SSOCS:2018 Question 15) was removed from the SSOCS:2020 questionnaire.

The following revisions were made in subsection “Limitations on Crime Prevention” on p. 8:

Question 24 (SSOCS:2018 Question 27)

This question asks about factors that may limit schools’ efforts to reduce crime, such as lack of adequate training for teachers, lack of support from parents or teachers, and inadequate funding. Although principals are not trained evaluators, they are the people who are the most knowledgeable about the situations at their schools and whether their own actions have been constrained by the factors listed. Four subitems from this section (SSOCS:2018 items 27j–m, on federal, state, or district policies on disciplining students) were removed from the SSOCS:2020 questionnaire.

The following revisions were made in subsection “Incidents” on p. 8:

The questions in this section ask about the frequency and types of crime and disruptions at school (other than violent deaths). Note that the section Frequency of Crime and Violence at School has been removed from the SSOCS:2020 questionnaire and its items have been incorporated into the Incidents section.

Question 25 (SSOCS:2018 Question 30)

This question specifically asks principals to provide counts of the number of recorded incidents that occurred at school and the number of incidents that were reported to the police or other law enforcement. This question will assist in identifying which types of crimes in schools are underreported to the police and will provide justification for further investigation.

Questions 26 and 27 (SSOCS:2018 Questions 31 and 32)

These questions ask about the number of hate crimes and the biases that may have motivated these hate crimes.

Question 28 (SSOCS:2018 Question 33)

This question asks whether there were any incidents of sexual misconduct between school staff members and students.

Questions 29 and 30 (SSOCS:2018 Questions 28 and 29)

These questions ask about violent deaths (specifically, homicides and shootings at school). Although violent deaths get substantial media attention, they are actually relatively rare. In fact, there is evidence that, in general, schools are much safer than students’ neighboring communities. Based on analyses of previous SSOCS data, these crimes are such rare events that the National Center for Education Statistics (NCES) is unable to report estimates per its statistical standards. Nonetheless, it is important to include these items as they are significant incidents of crime that, at the very least, independent researchers can evaluate. Furthermore, the survey is intended to represent a comprehensive picture of the types of violence that can occur in schools, and the omission of violent deaths and shootings would be questioned by respondents who may have experienced such violence. In the SSOCS:2018 questionnaire, these questions were contained in the Frequency of Crime and Violence at School section; this section was removed from the SSOCS:2020 questionnaire and its items moved to the Incidents section.

Question 31 (SSOCS:2018 Question 34)

This question asks about the number of arrests that have occurred at school. The data gained from this section can be used directly as an indicator of the degree of safety in U.S. public schools and indirectly to compare schools in terms of the number of problems they face.

The following revisions were made in subsection “Disciplinary Problems and Actions” on p. 9:

Question 33 (SSOCS:2018 Question 36)

This question asks about the frequency of three aspects of cyberbullying (including at and away from school), providing a general measure of the degree to which cyberbullying is an issue for students. Two additional subitems in the SSOCS:2018 questionnaire asked how often cyberbullying affected the school environment and how often staff resources were used to deal with cyberbullying; these subitems were removed from the SSOCS:2020 questionnaire.

  1. C.2. Changes to the Questionnaire and Rationale: SSOCS:2018:

The following revisions were added on p. 10.

Based on the results of the SSOCS:2016 data collection and cognitive interview testing, some items for SSOCS:2018 were revised to clarify their meaning. Additionally, several items were removed from the survey based on a perceived decline in their usefulness and to make room for new items that reflect emerging issues in school crime and safety.

The following was added at the beginning of section 2.2, “Editorial Changes”, on p. 11:

Throughout the questionnaire, the school year has been updated to reflect the most recent 2017–18 school year, and item skip patterns have been updated to reflect the new numbering in the questionnaire.

  C.3. Changes to the Questionnaire and Rationale: SSOCS:2020

This new section of Part C provides an overview of high-level changes. The full text of the additions, on pp. 14-18, is provided below.

Front Page: The Department of Education and SSOCS logos will be added to the front page of the questionnaire. The endorsements will be removed from the front page and put onto a separate handout to be included in the mailout.

Definitions: Based on feedback from the TRP members, some definitions were updated to reflect official definitions used elsewhere within the Department or shortened to remove unnecessary language. No new terms were added to the definitions pages.

Additional survey information: Based on feedback from the TRP, more detailed position options were added to the request for information on the primary respondent and additional position information is requested of any school personnel who assisted in completing the questionnaire.

Items: General changes throughout the paper instrument:

  • Gridlines have been removed and alternate shading is in place.

  • Repetitive instructions have been removed throughout each section.

  • Apples that were used as bullets for instructions have been removed.

  • Single-response items will have check ovals rather than boxes.

  • Multi-response (mark-all-that-apply) items will have check boxes.

Item Deletions:

  • Item 6: frequency of threat assessment team meeting

  • Item 9: parental involvement

  • Item 12a: Sworn law enforcement officer used at least once a week at any time during school hours

  • Item 15: Sworn law enforcement officer on campus for all instructional hours of the day, every day

  • Item 27 j, k, l, m: federal, state, district policies that could limit the school’s efforts to reduce crime

  • Item 36 b, c: school being affected by cyberbullying and resources used to deal with cyberbullying

Item Modifications:

  • Minor editing changes were made to some items to improve consistency across items and improve clarity of items (without changing the underlying meaning or purpose of the item).

Below is the full text of section C.3, from pp. 14-18.

The following section details the editorial changes, item deletions, and global formatting changes made between the SSOCS:2018 and SSOCS:2020 questionnaires. Based on the results of the SSOCS:2018 data collection, feedback from content area experts, and a seminar on visual design in self-administered surveys, some items for SSOCS:2020 were revised for consistency, clarity, and brevity. The section Frequency of Crime and Violence at School was removed, and the corresponding questions were incorporated into the Incidents section. Additionally, several items were removed from the survey in an effort to reduce overall questionnaire burden on the respondent. No new items were added.

The result is the proposed instrument for the SSOCS:2020 survey administration, which is provided in appendix B.



    1. Changes to Definitions

Three terms and definitions (active shooter, alternative school, and children with disabilities) have been adjusted to align with federal definitions for those terms. Eight definitions (evacuation, gender identity, hate crime, lockdown, rape, School Resource Officer (SRO), shelter-in-place, and threat assessment) have been minimally revised to increase brevity and clarity for survey respondents.

Active shooter – The definition was revised to align with the current definition used by the U.S. Department of Homeland Security.

Alternative school – The definition for alternative school (previously “specialized school”) was revised to align with other NCES and Department of Education surveys and publications.

Children with disabilities – The definition for children with disabilities (previously “special education students”) was updated to align with the Individuals with Disabilities Education Act (IDEA) definition.

Evacuation – The definition was simplified to avoid implied endorsement of a specific procedure for evacuation.

Gender identity – Detailed examples of gender expression were removed from the definition for brevity.

Hate crime – The definition was modified to include national origin or ethnicity as a hate crime bias.

Lockdown – The term was simplified, and examples were removed to avoid implied endorsement of a specific procedure for lockdown.

Rape – The bracketed item-specific instruction was removed from the definition. This information is specific to item 25 and the instructions appear within that item.

School Resource Officer (SRO) – The word “career” was removed from the definition to broaden the definition to all SROs.

Shelter-in-place – The definition was modified to clarify the purpose of the practice and examples of procedures were simplified.

Threat assessment – The word “team” was removed from the term and the definition was modified to focus on a formalized threat assessment process rather than a team.

    2. Editing Changes

Throughout the questionnaire, the school year has been updated to reflect the most recent 2019–20 school year, and item skip patterns have been updated to reflect the new numbering in the questionnaire.

Arrest – The first letter in the definition was lowercased for consistency with other definitions.

Gender identity – The word “means” was removed from the beginning of the definition for consistency with other definitions.

Hate crime – The first letter in the definition was lowercased for consistency with other definitions.

Sexual misconduct - The first letter in the definition was lowercased for consistency with the rest of the definitions.

Sexual orientation – The word “means” was removed from the beginning of the definition for consistency with other definitions.

Item 1, subitem a. The underlining and bolding of the word “and” were removed to align with consistent formatting practices across the questionnaire.

Item 1, subitem u. The underlining of the word “use” was removed to align with consistent formatting practices across the questionnaire.

Item 2, subitem f. The term “Suicide threats or incidents” was pluralized to make the item parallel with the wording used in items 2d and 2e.

Item 4, subitem d. The forward slashes in “mentoring/tutoring/coaching” were changed to commas.

Item 5. Per the changes to the term and definition as noted above, the term “threat assessment team” was changed to “threat assessment.”

Item 6, subitem c. This subitem was expanded to include student groups supporting the acceptance of religious diversity.

Item 8. The phrase “disciplined and drug-free schools” was replaced with “a safe school” to broaden the question and better reflect current Department of Education language.

Item 13. The phrase “Memorandum of Use” was changed to “Memorandum of Understanding” to better reflect current terminology.

Item 14. The term “at school” was set in bold and marked with an asterisk to indicate that it has a formal definition.

Item 14, subitem b. The subitem was reworded to distinguish examples of physical restraints from chemical aerosol sprays.

Item 15. The term “Part-time” was capitalized in the instructions to increase consistency with the response options of the item.

Item 16. The term “Part-time” was capitalized in the instructions to increase consistency with the response options of the item. The term “security guards” was changed to “security officers” to better reflect current terminology.

Item 23. The phrase “to the best of your knowledge” was removed from the item for brevity. The instruction to exclude sworn law enforcement was moved into the item stem to increase clarity.

Item 25. The underlining of the word “incidents” was removed to align with consistent formatting practices across the questionnaire. The column 2 header was changed to “Number reported to sworn law enforcement” for clarity.

Item 27, subitem a. The word “color” was removed from the item to reduce ambiguity in terminology.

Item 28. The underlining of “whether or not the incidents occurred at school or away from school” was removed to align with consistent formatting practices across the questionnaire.

Item 31. The placement of language specifying the inclusion of both students and non-students was adjusted for increased clarity.

Item 34. The word “Yes” was capitalized for consistency with the rest of the item.

Item 34, subitem c. Per the changes to the term and definition as noted above, the term “a specialized school” was changed to “an alternative school.”

Item 35. Per the changes to the term and definition as noted above, the column 3 header term “specialized schools” was changed to “alternative schools.”

Item 36, subitem b. Per the changes to the term and definition as noted above, the term “specialized schools” was changed to “alternative schools.”

Item 38, subitem c. Per the changes to the term and definition as noted above, the term “Special education students” was changed to “Children with disabilities (CWD).”

Item 44. The question was rephrased to better align with the language above the response box and clarify that the response should be a percentage of the school’s total enrollment.

    3. Changes to School/Respondent Information

In prior SSOCS collections, respondents have been asked to provide their name and title/position. For SSOCS 2020, respondents are provided more title/position response options, and similar title/position information is being requested for any other school personnel who helped to complete the questionnaire. This modification reflects feedback from the TRP and aims to gain a better understanding of all staff involved in completing the survey.

    4. Item Deletions and Rationale

2017–18 Questionnaire Item 6. This item was deleted. Following feedback from an expert panel, it was determined that how often the threat assessment team meets is not a critical piece of information. The broad response options had limited analytic use.

2017–18 Questionnaire Item 9. This item was deleted to reduce respondent burden since the item overlaps with the National Teacher and Principal Survey (NTPS).

2017–18 Questionnaire Item 12, subitem a. This subitem was deleted. Similar information is collected in SSOCS:2020 item 9 (SSOCS:2018 item 11); its deletion is intended to help reduce overall questionnaire burden on the respondent.

2017–18 Questionnaire Item 15. This item was deleted. Similar information is collected in SSOCS:2020 items 9 and 10 (SSOCS:2018 items 11 and 12); its deletion is intended to help reduce overall questionnaire burden on the respondent.

2017–18 Questionnaire Item 27, subitem j. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem k. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem l. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem m. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 36, subitem b. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable had limited analytic use.

2017–18 Questionnaire Item 36, subitem c. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable had limited analytic use.

    5. Global Changes to Formatting and Instructions

In addition to the item-specific changes described above, some global changes were made to enhance the consistency and formatting of the questionnaire in an effort to improve its visual design. A streamlined and consistent questionnaire will be easier for the respondent to follow, reduce response time and burden, and help promote an accurate understanding of survey items and response options. These revisions were based on feedback from a TRP consisting of content area experts and on the recommendations of a national expert in visual design elements for self-administered surveys.

The survey cover page has been revised to:

  • Include the Department of Education and U.S. Census Bureau logos in order to enhance the perception of the survey’s legitimacy.

  • Remove white space where the school information will be printed. White space typically indicates an area for the respondent to fill in, but in this case the information will be pre-filled by Census.

  • Remove the list of endorsements. The endorsements will be provided in a separate handout in order to reduce clutter on the cover page and allow for the incorporation of the logos of some endorsing agencies that respondents may be most familiar with.

Horizontal and vertical grid lines have been removed.

Alternative row shading has been incorporated.

Certain response field shapes have been changed to reflect best practices in questionnaire design. The following guidelines for response fields have been implemented for consistency across the SSOCS:2020 questionnaire. These changes also bring the paper questionnaire design in better alignment with the design of the SSOCS web instrument:

  • For items where respondents select only one response (out of two or more response options), response fields will appear as circles.

  • For items where respondents select all applicable responses (out of two or more response options), response fields will appear as squares.

  • For items where respondents are asked to provide numerical values (e.g., incident counts or dates) or text (e.g., names or e-mail addresses), response fields will appear as rectangles.

Instructions found at the bottom of pages referring the respondent to item definitions will now read “*A removable Definitions sheet is printed on pages 3–4.” Similar to NTPS procedures, the two definition pages will be included as a perforated sheet that can be removed from the questionnaire to facilitate easier reference when taking the survey.

All apple-style bullet points have been changed to circular bullet points.

The source code font has been lightened, and codes have been moved away from response boxes to avoid distracting the respondent.

Certain instructions in the survey were also removed to reduce redundancy and item length. The following instructions are included the first time a style of item response options is introduced, but not in subsequent questions that use the same response option style:

  • “Check ‘Yes’ or ‘No’ on each line” (appears first in Question 1).

  • “Check one response on each line” (appears first in Question 21).

  • “If none, please place an ‘X’ in the None box” (appears first in Question 15).


No changes were made to Part C-2 since its last approval in July 2017.


The following updates were made to Appendix A – SSOCS 2018 & SSOCS 2020 Communication Materials

A Census communications expert reviewed the initial letter to principals and suggested shortening the letters, moving pieces of information (e.g., the log-in box) to different parts of the letter, removing the endorsements (now in a separate handout – Attachment A), attaching the incentive to the paper letter (rather than enclosing it loose in the package), and adding FAQs to the back of the initial letter.


The following was added above the Table of Contents on p. ii.

All 2018 materials have been approved (OMB# 1850-0761 v.12-15) and all 2020 materials are newly added.

Some of the SSOCS:2020 communication materials are still being developed – their final versions will be provided to OMB as a change request in September 2019. In the currently provided SSOCS:2020 materials, all citations of findings from the 2015–16 SSOCS will be replaced with findings from the 2017–18 SSOCS, and all screenshots will be replaced with the final SSOCS:2020 screenshots, in the September 2019 change request submission.


Similar Letters but Tailored to Different Experimental Treatments in SSOCS:2018 vs. SSOCS:2020:

  • Initial Letter to Principal

  • First and Second Reminder Letter to Principal

  • Initial e-mail to Principals

  • First, Second, and Third Follow-up E-mail to Principals


Letter Deletions (used in SSOCS:2018 but not in SSOCS:2020):

  • District Research Application Cover Letter (Districts in NTPS & SSOCS)

  • Special Contact District Affidavit of Nondisclosure

  • Advance Letter to Principal

  • Advance Letter to Principal – Special District, Status Pending

  • Advance Letter to Principal – Special District, Status Approved

  • Postcard to Principal – Special District Approval

  • First Reminder Letter to Principal – Paper

  • Second Reminder Letter to Principal – Paper

  • Initial e-mail to principals - Paper, Incentive

  • Initial e-mail to principals - Paper, No incentive

  • First follow-up e-mail to principals – Paper

  • Second follow-up e-mail to principals – Paper

  • Third follow-up e-mail reminder to nonresponding schools and thank you e-mail to responding schools – Paper

  • Sixth follow-up e-mail


Letter Additions (not used in SSOCS:2018 but to be used in SSOCS:2020):

  • SSOCS Endorsements

  • SSOCS Incentive Handout (the final handout will be provided in September 2019)

  • Special District Approval Notice

  • Frequently Asked Questions (to be included on the back of initial letter to principal)


The revised and new contact materials for SSOCS:2020 are provided on pp. 58-106.


In the 2019-20 School Survey on Crime and Safety (SSOCS) Research Application the following text was revised on p. 65:

  1. Confidential Information Protection and Statistical Efficiency Act of 2002;

  2. E-Government Act of 2002, Title V, Subtitle A;

  3. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;


The following updates were made to Appendix B – SSOCS 2020 Questionnaire.


The newly added SSOCS:2020 Questionnaire begins on p. 25.

Items that have been removed or edited are listed below.


Changes to Definitions

Three terms and definitions (active shooter, alternative school, and children with disabilities) have been adjusted to align with federal definitions for those terms. Eight definitions (evacuation, gender identity, hate crime, lockdown, rape, School Resource Officer (SRO), shelter-in-place, and threat assessment) have been minimally revised to increase brevity and clarity for survey respondents. The unchanged 2018 definitions are on pp. 2-3 of the 2018 Questionnaire (pp. 3-4 of pdf), while the revised 2020 definitions are on pp. 3-4 of the 2020 Questionnaire (pp. 28-29 of pdf). The 2020 revisions are shown below:


Active shooter – one or more individuals actively engaged in killing or attempting to kill people in a confined and populated area; in most cases, active shooters use firearm(s) and there is no pattern or method to their selection of victims.

Alternative school (previously “Specialized school”) – a school that addresses the needs of students that typically cannot be met in a regular school program and is designed to meet the needs of students with academic difficulties, students with discipline problems, or both students with academic difficulties and discipline problems. [Previous definition: a school that is specifically for students who were referred for disciplinary reasons, although the school may also have students who were referred for other reasons. The school may be at the same location as your school.]

Children with disabilities (previously “Special education students”) – children having intellectual disability; hearing impairment, including deafness; serious emotional disturbance; orthopedic impairment; autism; traumatic brain injury; developmental delay; other health impairment; specific learning disability; deaf-blindness; or multiple disabilities and who, by reason thereof, receive special education and related services under the Individuals with Disabilities Education Act (IDEA) according to an Individual Education Program (IEP), Individualized Family Service Plan (IFSP), or services plan. [Previous definition: a child with a disability, defined as mental retardation, hearing impairments (including deafness), speech or language impairments, visual impairments (including blindness), serious emotional disturbance, orthopedic impairments, autism, traumatic brain injury, other health impairments, or specific learning disabilities, who needs special education and related services and receives these under the Individuals with Disabilities Education Act (IDEA).]

Evacuation – a procedure that requires all students and staff to leave the building. The evacuation plan may encompass relocation procedures and include backup buildings to serve as emergency shelters, such as nearby community centers, religious institutions, businesses, or other schools. Evacuation also includes “reverse evacuation,” a procedure for schools to return students to the building quickly if an incident occurs while students are outside.

Gender identity – one’s inner sense of one’s own gender, which may or may not match the sex assigned at birth.

Hate crime – a committed criminal offense that is motivated, in whole or in part, by the offender’s bias(es) against a race, national origin or ethnicity, religion, disability, sexual orientation, gender, or gender identity; also known as bias crime.

Lockdowna procedure that involves occupants of a school building being directed to remain confined to a room or area within a building with specific procedures to follow. A lockdown may be used when a crisis occurs outside of the school and an evacuation would be dangerous. A lockdown may also be called for when there is a crisis inside and movement within the school will put students in jeopardy. All exterior doors are locked and students and staff stay in their classrooms. a procedure that involves securing school buildings and grounds during incidents that pose an immediate threat of violence in or around the school.

Rape – forced sexual intercourse (vaginal, anal, or oral penetration). This includes sodomy and penetration with a foreign object. All students, regardless of sex or gender identity, can be victims of rape.

School Resource Officer (SRO) – a sworn law enforcement officer with arrest authority, who has specialized training and is assigned to work in collaboration with school organizations.

Shelter-in-place – a procedure that requires all students and staff to remain indoors because it is safer inside the building or a room than outside. Depending on the threat or hazard, students and staff may be required to move to rooms that can be sealed (such as in the event of a chemical or biological hazard) or without windows, or to a weather shelter (such as in the event of a tornado). [Previous definition: a procedure similar to a lockdown in that the occupants are to remain on the premises; however, shelter-in-place is designed to use a facility and its indoor atmosphere to temporarily separate people from a hazardous outdoor environment. Everyone would be brought indoors and building personnel would close all windows and doors and shut down the heating, ventilation, and air conditioning (HVAC) system. This would create a neutral pressure in the building, meaning the contaminated air would not be drawn into the building.]

Threat assessment (previously “Threat assessment team”) – a formalized process of identifying, assessing, and managing students who may pose a threat of targeted violence in schools.


Editing Changes

Throughout the questionnaire, the school year has been updated to reflect the most recent 2019–20 school year, item skip patterns have been updated to reflect new numberings in the questionnaire, repetitive instructions throughout sections were removed, and several instances of underlining/bolding were removed to align with consistent formatting practices across the questionnaire. Items described in this section reflect SSOCS:2020 questionnaire item numbering.

Item 6 subitem c. (p. 7 of 2020 Questionnaire; p. 32 of pdf) [Item 7 in 2018]:

This subitem was expanded to include student groups supporting the acceptance of religious diversity.

During the 2019–20 school year, did your school have any recognized student groups with the following purposes?

Check "Yes" or "No" on each line.

a. Acceptance of sexual orientation* and gender identity* of students (e.g., Gay-Straight Alliance)

b. Acceptance of students with disabilities (e.g., Best Buddies)

c. Acceptance of cultural diversity or religious diversity (e.g., Cultural Awareness Club)

Item 8. (p. 8 of 2020 Questionnaire; p. 33 of pdf) [Item 10 in 2018]:

The phrase “disciplined and drug-free schools” was replaced with “a safe school” to broaden the question and better reflect current Department of Education language.

During the 2019–20 school year, were any of the following community and outside groups involved in your school’s efforts to promote a safe school?

Check "Yes" or "No" on each line.

Item 13. (p. 10 of 2020 Questionnaire; p. 35 of pdf) [Item 16 in 2018]:

The phrase “Memorandum of Use” was changed to “Memorandum of Understanding” to better reflect current terminology.

During the 2019–20 school year, did your school or school district have any formalized policies or written documents (e.g., Memorandum of Understanding, Memorandum of Agreement) that outlined the roles, responsibilities, and expectations of sworn law enforcement officers (including School Resource Officers*) at school*?

Item 14 subitem b. (p. 11 of 2020 Questionnaire; p. 36 of pdf) [Item 17 in 2018]:

The subitem was reworded to distinguish examples of physical restraints from chemical aerosol sprays.

Did these formalized policies or written documents include language defining the role of sworn law enforcement officers (including School Resource Officers*) at school* in the following areas?

Check "Yes," "No," or "Don’t know" on each line.

b. Use of physical restraints (e.g., handcuffs, Tasers) or chemical aerosol sprays (e.g., Mace, pepper spray)

Item 23. (p. 13 of 2020 Questionnaire; p. 38 of pdf) [Item 26 in 2018]:

The phrase “to the best of your knowledge” was removed from the item for brevity. The instruction to exclude sworn law enforcement was moved into the item stem to increase clarity.

Aside from sworn law enforcement officers (including School Resource Officers*) or other security officers or personnel who carry firearms, during the 2019–20 school year, were there any staff at your school* who legally carried a firearm* on school property?

Item 25. (p. 15 of 2020 Questionnaire; p. 40 of pdf) [Item 30 in 2018]:

The column 2 header was changed to “Number reported to sworn law enforcement” for clarity.

Please record the number of incidents that occurred at school* during the 2019–20 school year for the offenses listed below. (NOTE: The number in column 1 should be greater than or equal to the number in column 2.)


Please provide information on:

  • The number of incidents, not the number of victims or offenders.

  • Recorded incidents, regardless of whether any disciplinary action was taken.

  • Recorded incidents, regardless of whether students or non-students were involved.

  • Incidents occurring before, during, or after normal school hours.

Column 1: Total number of recorded incidents

Column 2: Number reported to sworn law enforcement

Item 27, subitem a. (p. 16 of 2020 Questionnaire; p. 41 of pdf) [Item 32 in 2018]:

The word “color” was removed from the item to reduce ambiguity in terminology.

To the best of your knowledge, were any of these hate crimes* motivated by the offender’s bias against the following characteristics or perceived characteristics?

Check "Yes" or "No" on each line.

If a hate crime* was motivated by multiple characteristics, answer "Yes" for each that applies.

  a. Race



Item 31. (p. 16 of 2020 Questionnaire; p. 41 of pdf) [Item 34 in 2018]:

The placement of language specifying the inclusion of both students and non-students was adjusted for increased clarity.

Please record the number of arrests*, including both students and non-students, that occurred at your school* during the 2019–20 school year.


Item 34, subitem c. (p. 18 of 2020 Questionnaire; p. 43 of pdf) [Item 37 in 2018]:

Per changes to the term and definition as noted above, the term “a specialized school” was changed to “an alternative school.”

During the 2019–20 school year, did your school allow for the use of the following disciplinary actions? If "Yes," were the actions used this school year?

    c. Transfer to an alternative school* for disciplinary reasons


Item 35. (p. 19 of 2020 Questionnaire; p. 44 of pdf) [Item 38 in 2018]:

Per changes to the term and definition as noted above, the column 3 header term “specialized schools” was changed to “alternative schools.”

During the 2019–20 school year, how many students were involved in committing the following offenses, and how many of the following disciplinary actions were taken in response?

Please follow these guidelines when determining the number of offenses and disciplinary actions:

  • If more than one student was involved in an incident, please count each student separately when providing the number of disciplinary actions.

  • If a student was disciplined more than once, please count each offense separately (e.g., a student who was suspended five times would be counted as five suspensions).

  • If a student was disciplined in two different ways for a single infraction (e.g., the student was both suspended and referred to counseling), count only the most severe disciplinary action that was taken.

  • If a student was disciplined in one way for multiple infractions, record the disciplinary action for only the most serious offense.

Number of disciplinary actions taken in response to offense

(Column 3 header)

Transfers to alternative schools*


Item 36, subitem b. (p. 19 of 2020 Questionnaire; p. 44 of pdf) [Item 39 in 2018]:

Per changes to the term and definition as noted above, the term “specialized schools*” was changed to “alternative schools.”

During the 2019–20 school year, how many of the following occurred?


b. Students were transferred to alternative schools* for disciplinary reasons. (NOTE: This number should be greater than or equal to the sum of entries in item 35, column 3.)


Item 38, subitem c. (p. 20 of 2020 Questionnaire; p. 45 of pdf) [Item 41 in 2018]:

Per changes to the term and definition as noted above, the term “Special education students” was changed to “Children with disabilities (CWD).”

What percentage of your current students fit the following criteria?

  c. Children with disabilities (CWD)*

Item 44. (p. 21 of 2020 Questionnaire; p. 46 of pdf) [Item 47 in 2018]:

The question was rephrased to better align with the language above the response box and clarify that the response should be a percentage of the school’s total enrollment.

What is your school’s average daily attendance?

What percentage of your school’s total enrollment is present on an average day?

Percent of students present

None

%

Additional respondent information. (pp. 22-23 of 2020 Questionnaire; pp. 47-48 of pdf)

[pp. 22-23 of 2018 Questionnaire; pp. 23-24 of pdf]:

Based on feedback from the TRP, more detailed position options were added to the request for information on the primary respondent, and additional information is now requested of any school personnel who assisted in completing the questionnaire (p. 23 of 2020 Questionnaire; p. 48 of pdf):


[Regarding the primary person completing form:]


Title or position

Check one response.

Principal

Vice principal or disciplinarian

Disciplinarian

Counselor

Administrative or secretarial staff

Teacher or instructor

Superintendent or district staff

Security personnel

Other Please specify


Title or position(s) of other personnel who helped complete the questionnaire

Check all that apply.

Principal

Vice principal

Disciplinarian

Counselor

Administrative or secretarial staff

Teacher or instructor

Superintendent or district staff

Security personnel

Other Please specify


SSOCS:2020 Item and Subitem Deletions from the SSOCS:2018 Survey and Rationale

2017–18 Questionnaire Item 6. (p. 7 of 2018 Questionnaire; p. 8 of pdf)

Following feedback from an expert panel, it was determined that how often the threat assessment team meets is not a critical piece of information. The broad response options had limited analytic use.

During the 2017–18 school year, how often did your school’s threat assessment team* formally meet?

Check one response.

At least once a week

At least once a month

On occasion

Never

2017–18 Questionnaire Item 9. (p. 8 of 2018 Questionnaire; p. 9 of pdf)

This item was deleted to reduce respondent burden because it overlaps with the National Teacher and Principal Survey (NTPS).

What is your best estimate of the percentage of students who had at least one parent or guardian participating in the following events during the 2017–18 school year?

Check one response on each line.

Open house or back-to-school night

Regularly scheduled parent-teacher conferences


2017–18 Questionnaire Item 12, subitem a. (p. 9 of 2018 Questionnaire; p. 10 of pdf) [Item 10 in 2020]

Similar information is collected in SSOCS:2020 item 9 (SSOCS:2018 item 11); its deletion is intended to help reduce overall questionnaire burden on the respondent.

Were sworn law enforcement officers (including School Resource Officers*) used at least once a week in or around your school at the following times?

Do not include security guards or other security personnel who are not sworn law enforcement in your response to this item; information on additional security staff is gathered in item 19.

Check "Yes" or "No" on each line.

  1. At any time during school hours


2017–18 Questionnaire Item 15. (p. 10 of 2018 Questionnaire; p. 11 of pdf)

Similar information is collected in SSOCS:2020 items 9 and 10 (SSOCS:2018 items 11 and 12); its deletion is intended to help reduce overall questionnaire burden on the respondent.

During the 2017–18 school year, did your school have a sworn law enforcement officer (including School Resource Officers*) present for all instructional hours every day that school was in session?

  • Include officers who are used as temporary coverage while regularly assigned officers are performing duties external to the school (such as attending court) or during these officers’ personal leave time.

  • Check "No" if your school does not have officer coverage while regularly assigned officers are performing duties external to the school (such as attending court) or during these officers’ personal leave time.

  • Do not include security guards or other security personnel who are not sworn law enforcement in your response to this item; information on additional security staff is gathered in item 19.



2017–18 Questionnaire Item 27, subitems j, k, l, m. (p. 14 of 2018 Questionnaire; p. 15 of pdf) [Item 24 in 2020]

Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

To what extent do the following factors limit your school’s efforts to reduce or prevent crime?

Check one response on each line.

j. Fear of district or state reprisal

k. Federal, state, or district policies on disciplining special education students*

l. Federal policies on discipline and safety other than those for special education students*

m. State or district policies on discipline and safety other than those for special education students*



2017–18 Questionnaire Item 36, subitems b & c. (p. 17 of 2018 Questionnaire; p. 18 of pdf) [Item 33 in 2020]

It was determined that these variables were outdated and had limited analytic use.

To the best of your knowledge, thinking about problems that can occur anywhere (both at your school* and away from school), how often do the following occur?

Check one response on each line.

  1. School environment is affected by cyberbullying*

  2. Staff resources are used to deal with cyberbullying*

1 Each subitem in the SSOCS:2020 questionnaire was counted as an item. Assuming an average burden of 11.7 seconds per item (based on the average amount of time it took respondents to complete the 2018 questionnaire) and that the items do not differ substantially in complexity or length, the burden for the SSOCS:2020 survey is estimated to be very similar to that for the SSOCS:2018 survey.

2 The source of this estimate is the mean hourly rate of Education Administrators (data type: SOC:119030) on the BLS Occupational Employment Statistics website, http://data.bls.gov/oes/, accessed on February 25, 2019.

3 In early 2019, NCES made the decision to delay the 2019–20 NTPS by one year, making it the 2020–21 NTPS. However, the 2019–20 NTPS frame creation work continues for use in SSOCS:2020, as outlined in this document. All references to the 2019–20 NTPS remain as is because they relate to the SSOCS:2020 frame and sampling.

4 The critical items in SSOCS:2018 are items 11, 18, 19, 28, 29, 30, 31, 35, 36, 38, 39, 40, 41, 42, 46, 47, and 48 (see appendix B).

5 The “Paper, Incentive” group had a different hypothesis test from the other two treatment groups. For the “Paper, Incentive” group, the last column displays the p-value for the hypothesis test that the group that received the $10 cash incentive and no internet option has the same or lesser response rate than the control group.

6 District staff will no longer be asked to sign affidavits of nondisclosure prior to receiving the list of schools sampled in the district.

550 12th Street, S.W., Washington, DC 20202

Our mission is to ensure equal access to education and to promote educational excellence throughout the Nation.
