School Survey on Crime and Safety (SSOCS) 2018 and 2020 Update
OMB# 1850-0761 v.16
Supporting Statement Part A
National Center for Education Statistics
Institute of Education Sciences
U.S. Department of Education
March 2017
Revised April 2019
Contents
A.1. Circumstances Making Collection of Information Necessary
A.2. Purposes, Uses, and Availability of Information
A.3. Appropriate Use of Information Technology
A.4. Efforts to Identify Duplication
A.5. Methods Used to Minimize Burden on Small Entities
A.6. Frequency of Data Collection
A.7. Special Circumstances of the Data Collection
A.8. Consultants Outside the Agency
A.9. Provision of Payments or Gifts to Respondents
A.10. Assurance of Confidentiality
A.12. Estimates of Burden for Information Collection
A.13. Estimates of Cost Burden to Respondents
A.14. Estimates of Annual Government Cost
A.15. Reasons for Changes in Response Burden
A.17. Approval to Not Display Expiration Date of OMB Approval
The request to conduct the 2018 and 2020 School Survey on Crime and Safety (SSOCS) was approved in July 2017, with the latest change request approved in May 2018 (OMB# 1850-0761 v.15). This request adds updates for the 2020 SSOCS full-scale data collection involving revisions to (1) the approved incentive and web experiments, (2) the communication materials, and (3) the SSOCS:2020 questionnaire (nonsubstantive changes and removal of items). Some of the SSOCS:2020 communication materials are still being developed; their final versions will be provided to OMB as a change request in September 2019.
SSOCS was conducted in 2000, 2004, 2006, 2008, 2010, and 2016 (OMB# 1850-0761). Four years separated the first two collections of SSOCS to allow sufficient time to study the results of the first survey and to complete the necessary redesign work; the next three collections were conducted at 2-year intervals. Due to a reorganization of the sponsoring agency (the Office of Safe and Drug-Free Schools) and funding issues, the 2012 administration of SSOCS, although approved by OMB, was not fielded. With new funding available through the National Institute of Justice (NIJ), SSOCS was conducted again in the spring of the 2015–16 school year. With continued dedicated funding, SSOCS will be conducted on a biennial basis, with the next administrations scheduled to take place in the spring of the 2017–18 and 2019–20 school years.
SSOCS is a survey of public schools covering the topic of school crime and violence and is designed to produce nationally representative data on public schools. Historically, it has been conducted by mail, with telephone and e-mail follow-up; however, as an experiment, an Internet version will be fielded during the SSOCS:2018 administration. The respondent is the school principal or a member of the school staff designated by the principal as the person “the most knowledgeable about school crime and policies to provide a safe environment.”
The 2018 survey is being funded and conducted by the National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), within the U.S. Department of Education, with supplemental funding from NIJ through its Comprehensive School Safety Initiative (CSSI). The CSSI was developed in response to a 2014 congressional appropriation to conduct research about school safety, and it fully funded the 2016 collection. The responsibility for the design and conduct of the survey continues to rest with NCES. As in 2006, 2008, 2010, and 2016, NCES has entered into an interagency agreement with the Census Bureau to conduct the 2018 and 2020 collections of SSOCS. The 2020 administration of SSOCS is being funded and conducted by NCES as in 2017-18, but with supplemental funding from the Office of Safe and Healthy Students (OSHS).
A.1. Circumstances Making Collection of Information Necessary
SSOCS is the only recurring federal survey that collects detailed information on the incidence, frequency, seriousness, and nature of violence affecting students and school personnel, as well as other indicators of school safety from the schools’ perspective. As such, it fills an important gap in data collected by NCES and other agencies. It collects information on:
the frequency and types of crimes at schools, including homicide; rape; sexual assault; physical attacks with or without weapons; threats of attack with or without weapons; robbery with or without weapons; theft; possession of weapons; distribution, possession, or use of illegal drugs or alcohol; and vandalism;
the frequency and types of disciplinary actions for selected offenses, such as removals with no continuing services; transfers to specialized schools; and suspensions;
perceptions of other disciplinary problems, such as student racial or ethnic tensions; bullying; harassment; verbal abuse; disorder in the classroom; and gang activities;
school policies and programs concerning crime and safety;
student, parent, teacher, and law enforcement involvement in efforts intended to prevent or reduce school violence;
mental health services available to students at school and limitations on schools’ efforts to provide these services; and
school characteristics associated with school crime.
The predecessor to SSOCS was a one-time survey conducted through NCES’s Fast Response Survey System (FRSS) in 1996–97. Around the time the FRSS data were being released in 1997–98, a number of tragic shootings occurred at schools across the country. These events took place in Pearl, MS; West Paducah, KY; Jonesboro, AR; and Columbine, CO. When it came to light that neither the Department of Justice nor the Department of Education had a recurring survey by which to measure the frequency of crime and violence at schools, the Department of Education made a commitment to begin such a survey on a regular basis. Thus, planning for SSOCS began.
From the beginning, the purpose of SSOCS has been to provide data about school crime and safety in the nation’s public elementary and secondary schools. Within its budget allocation, SSOCS continues to meet this purpose by collecting data on regular public elementary and secondary schools. This includes magnet and charter schools and excludes public alternative, vocational, virtual, and special education schools, as well as private schools.
The original SSOCS questionnaire, used in the 2000 data collection, was developed in consultation with a technical review panel (TRP) consisting of some of the nation’s top experts on school crime and school programs relating to crime and safety. Revisions to the 2004 questionnaire were based on an analysis of responses to the 2000 questionnaire, a review of current literature in the field, feedback from a TRP and invested government agencies, and the results of extensive pretesting conducted by Abt Associates. The questionnaires used in 2006 and 2008 were essentially the same as that used in 2004. The questionnaire used in 2010 was similar to that used in 2008, but it incorporated minor revisions based on feedback from several SSOCS data users and school crime and safety experts. The questionnaire planned for use in 2012 incorporated two additional items on bullying that underwent cognitive testing and were approved in the OMB clearance update for the 2012 collection (OMB# 1850-0761 v.6).
Revisions to the full SSOCS questionnaire used in 2016 were based on several sources of information, including an analysis of responses to the SSOCS:2010 questionnaire, a review of current literature in the field, feedback from a TRP and invested government agencies, the results of extensive cognitive testing, and NIJ’s goals related to collecting information about school security personnel and mental health services. The process for revising the 2018 questionnaire content was similar to that performed for the 2016 questionnaire. Revisions to the 2018 questionnaire are detailed in Supporting Statement Part C, Section 1, of this submission.
In early 2019, minimal revisions were made to the SSOCS:2020 questionnaire in order to maintain trend and in anticipation of the implementation of a full redesign for the SSOCS:2022 administration. These changes are designed to reduce respondent burden (e.g., by removing some items and subitems) and improve the visual design of the questionnaire (e.g., by using alternative shading in subitem rows and removing grid lines). These revisions were based on feedback from a TRP consisting of content area experts and on the recommendations of a national expert in visual design elements for self-administered surveys. TRP experts suggested a number of specific modifications to existing definitions, survey items, and content areas. The majority of these suggestions will be implemented for the SSOCS:2022 data collection, as they require more extensive research and testing. Panelists recognized both the necessity and the difficulty of shortening the questionnaire to reduce burden on respondents. Based on panelist feedback on the relevance and analytic utility of items, some items and sub-items have been removed from the SSOCS:2020 questionnaire. No new content was added to the questionnaire for SSOCS:2020. Revisions to the 2020 questionnaire are detailed in Supporting Statement Part C, Section C2, of this submission.
SSOCS:2018 and SSOCS:2020 will continue to provide a valuable tool to policymakers and researchers who need to know what the level of school crime is and how it is changing, what disciplinary actions schools are taking, what policies and programs related to school crime and violence schools have in place, and what related services are available to students.
Legislative Authorization
NCES is authorized to conduct SSOCS by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543).
The reauthorization in 2002 of the Safe and Drug-Free Schools and Communities Act of 1994 and the Department of Justice Appropriations Act passed in 2014 provide additional legislative authority to conduct this study. The Safe and Drug-Free Schools and Communities Act of 1994 was reauthorized to support drug and violence prevention programs, including a data collection to be performed by NCES to collect data on the incidence and prevalence of illegal drug use and violence in elementary and secondary schools. SSOCS will address this provision by providing statistics on the frequency of school violence, the nature of the school environment, and the characteristics of school violence prevention programs.
The Department of Justice Appropriations Act passed in 2014 provided funds for NIJ to conduct research about school safety. In response, NIJ developed the CSSI (of which NCES is a federal partner) to use a variety of research and data collection efforts to learn which programs, policies, and practices are effective in making schools safer. Since understanding schools’ safety problems begins with collecting better data, part of the initiative’s goal is to improve data collection at the national level. As a part of this effort, NIJ fully funded the SSOCS:2016 data collection and has provided supplemental funding for the 2018 collection. SSOCS will continue to specifically address the priorities of the initiative by collecting more in-depth information on the roles and responsibilities of mental health professionals and law enforcement officers working in schools. For the SSOCS:2020 collection, NCES received supplemental funding from OSHS.
A.2. Purposes, Uses, and Availability of Information
SSOCS has been designed to meet the congressional mandate for NCES to provide statistics on the frequency of school violence, the nature of the school environment, and the characteristics of school violence prevention programs. Such national data are critical, given the tendency to focus on anecdotal evidence of crimes without knowing the true frequency of problems in schools. Accurate information is necessary for policymakers to make informed decisions about school policy, and to demonstrate to the public a proactive approach to school safety. Most items from prior SSOCS questionnaires will be included in the 2018 and 2020 surveys, thus allowing comparisons with previous years. A complete description of the differences between the 2016 and 2018 surveys is provided in the questionnaire changes and rationale section in Supporting Statement Part C, Section C2. A complete description of the differences between the 2018 and 2020 surveys is provided in Supporting Statement Part C, Section C3.
NCES will use the SSOCS:2018 and SSOCS:2020 data to prepare summary descriptive reports of the findings and will make the data available both as a restricted-use database (for use by researchers and policymakers on school crime and safety) and as a public-use database available on the NCES website.
Data from the previous SSOCS surveys have been released in NCES’s Condition of Education and Digest of Education Statistics, as well as in its Indicators of School Crime and Safety. Each iteration of SSOCS data has also been released in a First Look report, as listed below:
Crime, Violence, Discipline, and Safety in U.S. Public Schools, Findings From the School Survey on Crime and Safety: 2009–10 (as well as for 2007–08; 2005–06; and 2003–04); and
Violence in U.S. Public Schools: 2000 School Survey on Crime and Safety.
The Crime, Violence, Discipline, and Safety in U.S. Public Schools, Findings From the School Survey on Crime and Safety: 2015–16 First Look report will be released in summer 2017, accompanied by a restricted-use file and user’s manual. The First Look report and restricted-use data file and user’s manual for the SSOCS:2018 data collection will be released in summer 2019. All of these products are available on the NCES website. Summary statistics will also be available on the NCES website in a table library containing cross-tabulations of SSOCS variables by various school characteristics.
Data products from the previous SSOCS surveys are also available on the NCES website. Public-use data files are available on the NCES website in various software formats (with accompanying survey documentation and codebooks), while restricted-use SSOCS data files are available to users who obtain a restricted use license agreement with NCES. Additionally, some older SSOCS public-use datasets are hosted on the website of the Inter-university Consortium for Political and Social Research (ICPSR).
A.3. Appropriate Use of Information Technology
SSOCS:2018 will be mainly conducted by mail, with telephone and e-mail follow-up, and will also include a modal experiment with a web-based version of the instrument. Developing a web-based version of the instrument was in direct response to feedback received during cognitive laboratory interviews (OMB# 1850-0803 v.171) indicating respondents’ increased likelihood to respond if a web-based version was available. The web test treatment (1,150 randomly selected schools) will be evaluated against the control group, which will follow the traditional SSOCS data collection path, using paper questionnaires (3,650 randomly selected schools). The web test treatment schools will be assured that all of their data will be stored on secure online servers controlled by the U.S. Census Bureau, and will be given the option to respond by paper during follow-up mailings later in the data collection.
Based on the results of the two experiments (Internet and incentive) conducted during SSOCS:2018, SSOCS:2020 will be conducted primarily via a web-based survey instrument, with instructions on how to complete the questionnaire distributed to respondents by both mail and e-mail. Paper questionnaires will be introduced to nonrespondents in follow-up mailings, in addition to follow-up efforts by both telephone and e-mail. During the SSOCS:2018 data collection, approximately 77 percent of responding schools in the Internet treatment group completed the questionnaire online. A similar proportion of the SSOCS:2020 responses is expected to be completed through the web instrument. SSOCS:2020 will also include a modal experiment to test a navigation menu within the web instrument.
Analyses of the SSOCS:2018 Internet and incentive experiments resulted in the recommendation to include an incentive and allow web-based responses as part of a mixed-mode methodology in future SSOCS administrations. Although the web-based instrument option did not increase response rates on its own, the analyses showed higher response rates for schools that were part of both the Internet and incentive treatment groups. The web-based instrument option will offer cost savings on mailout, processing, and keying operations compared to a paper-only methodology. It will also allow for earlier questionnaire completion, as analyses showed a reduction in response time for the Internet treatment group, which leads to cost savings on follow-up efforts. For more information on the results of the SSOCS:2018 experiments, see Part B, Section B3, of this submission. All SSOCS:2020 schools will receive assurances that all of their data will be stored on secure online servers controlled by the U.S. Census Bureau and will be given the option to instead respond by paper during follow-up mailings later in the data collection.
Principals’ e-mail addresses, obtained through clerical research prior to the SSOCS:2018 and SSOCS:2020 data collections, will be utilized during data collection. For SSOCS:2018, invitations to complete the SSOCS questionnaires via the web-based instrument will be sent to principals of the schools randomly assigned to the web test. Principals of all schools, regardless of whether the school was randomly assigned to the web test, will be sent reminder e-mails, as appropriate, throughout the data collection period. For SSOCS:2020, all school principals will receive invitations to complete the SSOCS questionnaires via the web-based instrument and will be sent reminder e-mails, as appropriate, throughout the data collection period. All e-mail addresses will be “masked” so that recipients do not have access to the e-mail addresses of other recipients. An electronic database maintained by the U.S. Census Bureau will be used to track all sampled cases in order to determine where further follow-up during data collection is required.
Computer edits will be performed to verify the completeness of the questionnaire and the consistency of the data collected. For example, computer edits will verify whether a subset of responses adds to the total, whether skip patterns have been followed correctly, whether values fall outside of the range typically found for such schools, and whether some responses might be logically inconsistent.
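To make the kinds of edits described above concrete, the sketch below illustrates, in Python, how subtotal, skip-pattern, range, and logical-consistency checks might be expressed. The field names (e.g., total_incidents, has_security_staff) and the range threshold are hypothetical; the actual edit specifications are defined by NCES and the U.S. Census Bureau.

```python
# Minimal sketch of SSOCS-style completeness and consistency edits.
# Field names and thresholds are hypothetical, not the official edit specifications.

def run_edits(record: dict) -> list[str]:
    """Return a list of edit-failure messages for one school record."""
    flags = []

    # 1. Subtotal check: parts should add to the reported total.
    parts = ["violent_incidents", "theft_incidents", "other_incidents"]
    if sum(record.get(p, 0) for p in parts) != record.get("total_incidents", 0):
        flags.append("Incident subtotals do not add to the reported total.")

    # 2. Skip-pattern check: follow-up items should be blank when the gate item is 'No'.
    if record.get("has_security_staff") == "No" and record.get("num_security_staff"):
        flags.append("Security-staff count reported although gate item is 'No'.")

    # 3. Range check: values outside the range typically found for such schools.
    if not 0 <= record.get("total_incidents", 0) <= 5000:
        flags.append("Total incidents outside the expected range.")

    # 4. Logical consistency: a subset cannot exceed its parent count.
    if record.get("incidents_reported_to_police", 0) > record.get("total_incidents", 0):
        flags.append("More incidents reported to police than total incidents.")

    return flags


# Example: one hypothetical record with an inconsistent subtotal.
example = {"violent_incidents": 3, "theft_incidents": 2, "other_incidents": 1,
           "total_incidents": 7, "has_security_staff": "No",
           "incidents_reported_to_police": 2}
print(run_edits(example))
```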
A.4. Efforts to Identify Duplication
SSOCS was initially developed in consultation with the
Office of Safe and Healthy Students (OSHS), formerly known as the Office of Safe and Drug-Free Schools;
Office of Juvenile Justice and Delinquency Prevention (OJJDP);
Bureau of Justice Statistics (BJS);
Office of Special Education Programs (OSEP);
National Institute of Justice (NIJ);
Office of Community Oriented Policing Services (COPS); and
national experts on the topic of school crime.
When SSOCS was first developed, extant surveys that touch on the topic of school crime and safety were examined to determine where duplication might exist. While there were other federal surveys that collected information from principals about school crime and safety (the 2000 National Study of Delinquency Prevention in Schools and the 1999–2000 School Health Policies and Programs), they did not collect the same type of information as SSOCS. SSOCS provides more extensive coverage of the types of crime and discipline that occur in schools, as well as the efforts that schools use to combat crime.
Other surveys that have collected information similar to SSOCS are not administered repeatedly. For example, the Safe School Study of 1976 and the 1991, 1996–97,1 and 2014 FRSS surveys collected data from principals on school crime. These surveys, however, are not recurring. SSOCS’s regular and repeated administrations allow for the analysis of trends in the incidence of school crime and its correlates.
In 2016, NCES developed and administered a pilot of the ED School Climate Surveys (EDSCLS), which assessed various indicators of school climate from the perspectives of students, parents, teachers, and non-instructional staff. A small subset of the SSOCS items were included in the EDSCLS to provide a school-level picture of safety; however, these items were structured as Likert-type perception questions rather than as factual questions on school crime incidents and safety policies, as they are in SSOCS. The EDSCLS was intended to only collect nationally representative data one time during a 2016 benchmarking study. However, due to low response rates, the EDSCLS benchmark study was canceled.
The National Teacher and Principal Survey (NTPS) includes a section on school climate and safety. Within this section, two questions have subitems that directly overlap with subitems in SSOCS. When the 2015–16 data are final for both collections, and resources are available, NCES staff plan to run comparisons to examine similarities in reporting. While these items have been included in SSOCS since 2004 and their continued collection allows for trend analyses over time, including these data in the NTPS allows for a linkage to teacher responses, for example, on teacher-reported working conditions and climate. Therefore, there are different, yet important, reasons to continue this overlap in future data collections.
Districts selected for the SSOCS:2018, and later SSOCS:2020, sample that require submission and approval of a research application before the schools under their jurisdiction can be asked to participate in a study (referred to here as special contact districts) will be contacted to seek research approval. During the SSOCS:2018 cycle, to improve the efficiency of the special contact district operations, research application packages for SSOCS and NTPS will be sent simultaneously to districts that have schools sampled for both surveys. Each special contact district with schools sampled for both SSOCS:2018 and the NTPS 2017-18 has the option of allowing its schools to participate in one or both surveys. Sending the applications together allows the district to consider participation in each survey simultaneously, with the aim of reducing the burden of reviewing separate research applications for the two collections. Furthermore, to reduce the burden on typically larger districts whose schools have a higher probability of selection in various NCES sample surveys, the SSOCS:2018 and NTPS 2017-18 sampling designs are coordinated to minimize overlap, as much as possible, between the two surveys. However, because NTPS will not be conducted during the 2019–20 school year as originally planned, due to resource constraints, SSOCS:2020 will not seek special district approval simultaneously with NTPS. Instead, SSOCS:2020 will conduct the special district operations on its own, as was done in administrations of SSOCS prior to SSOCS:2018.
Other federal surveys obtain information about school crime from individuals other than those who have the school-level perspective of principals. For example, the School Crime Supplement to the National Crime Victimization Survey—administered in 1989, 1995, 1999, 2001, 2003, 2005, 2007, 2009, 2011, 2013, 2015, and 2017—collected data on perceptions of school crime and safety from students ages 12 to 18. Students also serve as the primary respondents in the Youth Risk Behavior Survey and the Monitoring the Future Survey.
The Civil Rights Data Collection (CRDC), administered by the Office for Civil Rights (OCR), collects some information on crime and discipline from local education agencies (LEAs) rather than school principals. For CRDC, each LEA completes an LEA-level survey plus one school-level survey for each of its schools. There is some overlap in topical areas between SSOCS and CRDC, specifically in the counts of incidents reported, disciplinary actions, and harassment/bullying data. However, CRDC collects these data at a disaggregated level (e.g., by student race/ethnicity), whereas SSOCS focuses on overall counts at the school level and is intended to provide a national benchmark on the status of violence and discipline in our nation’s schools. Additionally, given that SSOCS and CRDC collect data from different types of respondents, it is uncertain whether the responses received on similar items will be comparable. NCES and OCR are interested in investigating the comparability of similar items as a check on their reliability and validity. If items are found to be comparable, some could potentially be removed from either SSOCS or the CRDC in future data collections. CRDC will collect data for the 2015–16 school year during the spring of 2017. After the data collection has ended and data are available for internal analyses, NCES staff will compare the SSOCS:2016 and CRDC results and make a decision on whether to continue any overlap.
NCES and OCR have been working together since the 2015–16 CRDC data became available to compare estimates of incident counts that are reported in both surveys. Preliminary analyses conducted by NCES’s contractor, the American Institutes for Research (AIR), have shown discrepancies in the information reported for schools that responded to the same items in both SSOCS:2016 and the 2015–16 CRDC. Thus, before considering removing the items from one of the surveys, NCES wants to develop a better idea of which source provides the more accurate data. NCES is considering conducting a validation study to learn about both SSOCS and CRDC respondents’ processes for collecting and submitting crime data as well as their understanding of the survey items. The goals of the study would be to obtain information to improve the survey items, reduce the burden of future data collections, and ensure that the resulting data are more accurate for schools, districts, policymakers, and other data users. If conducted, the validation study would compare responses from SSOCS:2018 (data collected from February to June 2018) with those from CRDC 2017–18 (data collected during the spring of 2019). The validation study is in the initial phase of design, and if conducted, its results are expected to become available by the end of 2019. They will help inform NCES’s decision on whether to retain or remove the overlapping items from SSOCS:2022.
To address the priorities of the NIJ in collecting more data on mental health services in schools, several new items in this area were added to SSOCS:2016. The Centers for Disease Control and Prevention (CDC) has administered the School Health Policies and Practices Study (SHPPS), a national survey conducted periodically to assess school health policies and practices at the state, district, school, and classroom levels. The 2014 SHPPS included a questionnaire on mental health and social services that collected school-level information; however, the respondent could be any member of the school staff. SHPPS included items on the types and number of mental health professionals in schools and the services they offer. The questions included in the SSOCS questionnaire complement those in the SHPPS, but focus on student access to services and professionals as funded by the school or district. Gathering this information through SSOCS will provide an indication of whether or not schools are equipped to deal with student mental health issues that may contribute to school crime and violence. In addition, it will allow for the analysis of the incidence of crime in relation to the provision of student services.
A.5. Methods Used to Minimize Burden on Small Entities
The burden on small schools and districts is minimized during the SSOCS data collection through the sample design. The design specifies the selection of schools as a function of size, which is defined by the number of students. Small schools and districts are sampled at lower rates because they account for a smaller proportion of the total student population.
The SSOCS:2018 questionnaire will be mailed to respondents in late February 2018, with instructions to return it within 2 weeks. The schools that are randomly assigned to the web test will be mailed an invitation letter that includes the log-in information, rather than a paper questionnaire, at the same time. Schools that do not respond within 4 weeks will be contacted again and encouraged to complete their questionnaires. The data collection period will remain open through mid-June 2018.
The SSOCS:2020 initial invitation letter will be mailed to respondents in February 2020 and will include log-in information and instructions to complete the online questionnaire within 2 weeks. Schools that do not respond will be contacted again by mail and encouraged to complete their questionnaire online. Schools that have not responded within 6 weeks will be mailed a SSOCS:2020 paper questionnaire. Schools will also receive periodic e-mail reminders throughout the data collection period. The data collection period will remain open through mid-June 2020.
For a number of reasons, schools are encouraged to complete the survey in less than 30 days. One reason for this is that the data collection is designed to close at the end of the school year (and not overlap with the beginning of summer vacation). Thus, in order to achieve a high response rate, there needs to be enough time before the end of the school year to place follow-up calls and send follow-up mailings and e-mails to principals, as necessary. Most of the schools in the earlier SSOCS collections required some form of nonresponse follow-up, and this is expected for the 2018 and 2020 surveys as well.
The timing of the survey administration is also designed to avoid overburdening principals at the very end of the school year, when they have other administrative responsibilities. The survey collects counts of certain events, such as the number of crimes or disciplinary actions, which occur during the school year. In order to collect information on as much of the school year as possible, the data collection period is kept short and finishes as close to the end of the school year as possible.
A.6. Frequency of Data Collection
As indicated earlier, SSOCS is planned as a recurring survey. This request is for clearance of SSOCS:2018 and SSOCS:2020. Separate requests will be submitted for future SSOCS collections. If these data were not collected on a recurring basis, it would hamper the ability to monitor trends and to provide policymakers with timely data on school crime. If the data were not collected at all, NCES would fail to meet its legislatively required mandate to collect and report such data, and legislators, school officials, and constituents would be without timely data on the incidence and frequency of school crime, and on the characteristics of disciplinary actions, programs, and indicators of disorder in U.S. schools.
A.7. Special Circumstances of the Data Collection
There are no other special circumstances.
A.8. Consultants Outside the Agency
Since its inception, the development of SSOCS has relied on the substantive and technical review and comments of people both inside and outside the U.S. Department of Education. Outside experts who were convened to offer comments on revisions for the SSOCS 2016, 2018, and 2020 collections include
Lynn Addington, Department of Justice, Law and Society, American University
William Dikel, Consultant on School Mental Health
Elizabeth Freeman, American Institutes for Research
Denise Gottfredson, Department of Criminology and Justice, University of Maryland
Bill Modzeleski, SIGMA Threat Management
Amanda Nickerson, Educational and Counseling Psychology, University at Albany, SUNY
Dr. Jolene D. Smyth, Department of Sociology and Director of the Bureau of Sociological Research, University of Nebraska-Lincoln
Jon Carrier, Maryland Association of School Resource Officers
Benjamin Fisher, University of Louisville
Christine Handy, National Association of Secondary School Principals
Kimberly Kendziora, American Institutes for Research
Mary Poulin Carlton, National Institute of Justice
Jill Sharkey, University of California, Santa Barbara
Madeline Sullivan, Office of Safe and Healthy Students
The SSOCS instruments have benefited from consultation with the following federal experts:
Nadine Frederique, NIJ
Calvin Hodnett, NIJ (COPS)
Rachel Morgan, BJS
Michael Planty, BJS
Matthew Scheider, NIJ (COPS)
Dara Blachman-Demner (COPS)
Jenna Truman, BJS
Phelan Wyrick, NIJ
David Esquith, Director, Office of Safe and Healthy Students (OSHS)
Sarah Sisaye, Health and Human Services (formerly at OSHS)
Rita Foy Moss, Office of Safe and Healthy Students
Rosa Olmeda, Office of Civil Rights
Madeline Sullivan, Office of Safe and Healthy Students
As part of the SSOCS:2018 development, 19 administrators from public schools varying in locale, level, and district tested a portion of new and modified survey items through cognitive interviews. The purpose of the interviews was to uncover comprehension issues and to measure the participants’ overall understanding of the content surveyed. Participants were asked to think aloud as they answered items in the SSOCS questionnaire and to respond to a series of scripted questions related to the survey items that tested the clarity of terms, the appropriateness of response options, and overall ease in responding to specific survey questions. Interviews were approximately 60 minutes in length and were conducted remotely, via telephone or videoconference, or in person at schools. In response to early findings during cognitive interviews, modifications were made to item wording and design, then further tested in subsequent interviews. The SSOCS:2018 questionnaire was modified based on the results of these cognitive interviews (see Part C). No cognitive interviews were conducted specifically for SSOCS:2020 development, because no new or significantly modified items will be included in the questionnaire.
A.9. Provision of Payments or Gifts to Respondents
Some districts charge a fee (approximately $50–$200) to process research application requests, which will be paid as necessary. In addition to the web test, SSOCS:2018 will include an incentive experiment designed to examine the effectiveness of offering principals a monetary incentive to boost the overall response rate. Schools in the experimental treatment will receive a $10 prepaid incentive gift card at the first contact by mail. This treatment will be evaluated against the control group, which will not receive any incentive.
SSOCS:2020 will build on the SSOCS:2018 incentive experiment and will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive. The goal of this experiment is to further refine the SSOCS incentive strategy by comparing response rates, indicators of nonresponse bias, and data collection costs between the early and delayed incentive strategies, relative to a no-incentive control.
Further, upon completion of the data collection and report/data release, we will send an e-mail with a link to the “First Look” publication to all schools participating in SSOCS.
A.10. Assurance of Confidentiality
Data security and confidentiality protection procedures have been put in place for SSOCS:2018 and SSOCS:2020 to ensure that all contractors and agents working on SSOCS comply with all privacy requirements including, as applicable:
The Inter-agency agreement with NCES for this study and the statement of work of SSOCS contract;
Privacy Act of 1974 (5 U.S.C. §552a);
Privacy Act Regulations (34 CFR Part 5b);
Computer Security Act of 1987;
U.S.A. Patriot Act of 2001 (P.L. 107-56);
Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);
Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);
Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;
The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);
The U.S. Department of Education Incident Handling Procedures (February 2009);
The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;
NCES Statistical Standards; and
All new legislation that impacts the data collected through the inter-agency agreement and contract for this study.
The U.S. Census Bureau will collect data under an interagency agreement with NCES, and maintain the individually identifiable questionnaires per the agreement, including:
Provisions for data collection in the field;
Provisions to protect the data-coding phase required before machine processing;
Provisions to safeguard completed survey documents;
Authorization procedures to access or obtain files containing identifying information; and
Provisions to remove printouts and other outputs that contain identification information from normal operation (such materials will be maintained in secured storage areas and will be securely destroyed as soon as practical).
The U.S. Census Bureau and contractors working on SSOCS:2018 and SSOCS:2020 will comply with the Department of Education’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as with the IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) circulars, and National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, described at https://nces.ed.gov/statprog/2012/.
By law (20 U.S.C. §9573), a violation of the confidentiality restrictions is a felony, punishable by imprisonment of up to 5 years and/or a fine of up to $250,000. All government or contracted staff working on the SSOCS study and having access to the data, including SSOCS field staff, are required to sign an NCES Affidavit of Nondisclosure and have received public-trust security clearance. These requirements include the successful certification and accreditation of the system before it can be implemented. Appropriate memoranda of understanding and interconnection security agreements will be documented as part of the certification and accreditation process.
From the initial contact with the participants in this survey through all of the follow-up efforts, potential survey respondents will be informed that (a) the U.S. Census Bureau administers SSOCS on behalf of NCES; (b) NCES is authorized to conduct SSOCS by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543); (c) all of the information they provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151); and (d) that their participation is voluntary.
The following language will be included in respondent contact materials and on data collection instruments:
The National Center for Education Statistics (NCES), within the U.S. Department of Education, conducts SSOCS as authorized by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543).
All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
The following language will be included on data collection instruments:
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0761. The time required to complete this information collection is estimated to average 49 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this collection, or comments or concerns about the contents or the status of your individual submission of this questionnaire, please e-mail: [email protected], or write directly to: School Survey on Crime and Safety (SSOCS), National Center for Education Statistics, PCP, 550 12th Street SW, #4036, Washington, DC 20202.
As is clearly stated in the recruitment materials and the questionnaires, SSOCS:2018 and SSOCS:2020 are voluntary surveys. No one is required to respond to the SSOCS questionnaire or to specific questions within it. The items in the SSOCS questionnaire are not considered to be sensitive, as they collect information about schools rather than about individuals (see Supporting Statement Part C for a description and justification of the items and appendix B for the questionnaire). Items about the frequency of crime and disciplinary problems at the school could be viewed as sensitive by some respondents because schools may not want to report data associated with unusually high frequencies of problems. However, the cover letter to participants states that individually identifiable information is protected from disclosure and that responses are not in any way tied to funding. Also, the SSOCS questionnaire asks for information that is generally in the public domain (e.g., information on policies that schools communicate to their students and parents in a variety of ways).
A.12. Estimates of Burden for Information Collection
The estimated burden to respondents for all of SSOCS:2018 and SSOCS:2020 activities is presented in Table 1 and Table 2, respectively. The time required to respond to the collection is estimated based on the responses in previous SSOCS administrations. Recruitment and pre-collection activities include (a) the time to review study requirements in the districts that require research approval before contacting their schools and (b) the time involved in a school deciding to participate.
SSOCS:2018
SSOCS:2016 yielded an unweighted response rate of approximately 59 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 63 percent. Detailed analysis to determine the necessary number of completed interviews to ensure precision in key estimates and to have confidence in our ability to make comparisons over time indicated a goal of collecting data for at least 2,550 schools. Due to the low response rate in the 2016 collection, approximately 3,650 schools will be drawn in the sample to receive the paper version and approximately 1,150 schools will be drawn to receive the web-based version (for the web test) in order to meet the goal of a minimum of 2,550 completed surveys. Given the inclusion of both web and incentive experiments aimed at increasing the overall response, we anticipate at least maintaining the SSOCS:2016 response rate, which will yield more completed surveys than needed to meet the goal.
An item included in the SSOCS:2016 questionnaire asked respondents, “How long did it take you to complete this form, not counting interruptions?” Based on their answers, respondents took approximately 55 minutes, on average, to respond to the SSOCS survey in 2016. Upon reviewing the survey items, as well as the results of the cognitive testing, it was determined that 10 items/subitems would be added, 20 would be deleted, and 19 would be modified to improve respondent comprehension. Based on these updates, we estimate that the average 2018 survey response time will be 53 minutes.2
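The 53-minute estimate can be reproduced from the per-item assumption documented in footnote 2. The short calculation below is an illustrative back-of-the-envelope check under that assumption, not the official burden methodology.

```python
# Rough reconstruction of the SSOCS:2018 response-time estimate (see footnote 2).
SECONDS_PER_ITEM = 12.3          # average per-item burden observed in 2016
minutes_2016 = 55                # average reported completion time in 2016

items_2016 = minutes_2016 * 60 / SECONDS_PER_ITEM   # roughly 268 items/subitems
items_2018 = items_2016 + 10 - 20                   # 10 added, 20 deleted (net -10)

minutes_2018 = items_2018 * SECONDS_PER_ITEM / 60
print(round(minutes_2018))       # about 53 minutes
```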
Districts selected for the SSOCS sample that require submission and approval of a research application before the schools under their jurisdiction can be asked to participate in a study will be contacted to seek research approval.3 Based on previous SSOCS administrations, we estimate that approximately 180 special contact districts will be included in the SSOCS:2018 sample. To reduce the burden on these districts and improve operational efficiency, we plan to seek research approval for SSOCS:2018 simultaneously with NTPS. Contacting special districts begins with updating district information based on what can be gleaned from online sources and what is known from previous cycles of SSOCS. Individual districts will be contacted as needed to fill in gaps about where and to whom to send the completed required research application forms. The estimated number of such districts represents those with particularly detailed application forms and lengthy approval processes. The projected number of responses is based on the SSOCS:2018 sample size and takes into account eligibility and response rates from SSOCS:2016. Not all districts initially flagged as special contact districts will respond in the recruitment effort, because some may not have a formal research application process and thus are not actually special contact districts; as a result, the estimated number of responding special districts is lower than the estimated sample size for the special district operation. The total response burden estimate for special district IRB approvals is based on 360 minutes for IRB review by one staff member and 60 minutes per member for special district IRB panel review, assuming each panel is, on average, composed of six panel members. This operation began after receiving OMB approval and will continue until we receive a final response (approval or denial of the request), as long as there is sufficient time for sampled schools to respond to SSOCS.
Principals of sampled schools will be notified of the survey through an advance letter and an e-mail sent a week or two before the questionnaire. The burden per school for reading and potentially following up on the SSOCS advance, initial, and any follow-up letters and e-mails is estimated to average about 6 minutes total.
Table 1. Estimated hourly burden for SSOCS:2018
Activity for each administration | Sample size | Expected response rate | Number of respondents* | Number of responses | Burden hours per respondent | Total burden hours
District IRB Staff Review | 180 | 0.80 | 144 | 144 | 3 | 432
District IRB Panel Review | 180*6 | 0.80 | 864 | 864 | 1 | 864
State Notification | 51 | 1.0 | 51 | 51 | 0.05 | 3
District Notification | 2,800 | 1.0 | 2,800 | 2,800 | 0.05 | 140
School Recruitment | 4,800 | 1.0 | 4,800 | 4,800 | 0.1 | 480
Paper Questionnaire | 3,650 | 0.59** | 2,154 | 2,154 | 0.883 | 1,902
Web Questionnaire | 1,150 | 0.63** | 725 | 725 | 0.883 | 640
Total for SSOCS:2018 administration | - | - | 8,659 | 11,538 | - | 4,461
* Details may not sum to totals because counts are unduplicated.
** This response rate is calculated based on the results of the SSOCS:2016 data collection. The increase in the overall sample size from 2016 to 2018 is to ensure the receipt of at least 2,550 completed questionnaires (across paper and web versions). The web and incentive experiments are being conducted with the hope of increasing or at least maintaining the 2016 overall response rates.
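As an illustration of how the questionnaire rows in Table 1 are derived, the number of respondents is the sample size multiplied by the expected response rate (half counts rounded up), and total burden hours are the respondents multiplied by the burden hours per respondent. The short sketch below reproduces the two questionnaire rows from the values in the table; it is a reading of the table, not the official burden computation.

```python
from decimal import Decimal, ROUND_HALF_UP

# Reproduce the questionnaire rows of Table 1: respondents = sample size x
# expected response rate, and total burden hours = respondents x burden hours
# per respondent. Values are taken directly from the table above.
rows = {
    "Paper Questionnaire": ("3650", "0.59", "0.883"),
    "Web Questionnaire":   ("1150", "0.63", "0.883"),
}
for name, (sample, rate, hours_each) in rows.items():
    respondents = (Decimal(sample) * Decimal(rate)).quantize(Decimal("1"), ROUND_HALF_UP)
    total_hours = (respondents * Decimal(hours_each)).quantize(Decimal("1"), ROUND_HALF_UP)
    print(f"{name}: {respondents} respondents, {total_hours} burden hours")
# Paper Questionnaire: 2154 respondents, 1902 burden hours
# Web Questionnaire: 725 respondents, 640 burden hours
```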
SSOCS:2020
SSOCS:2018 yielded an unweighted response rate of approximately 58 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 62 percent. As in prior collections, the objectives of the SSOCS:2020 sample design are twofold: to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety, and to develop precise estimates of change in various characteristics relating to crime between SSOCS administrations. To attain these objectives, and taking into consideration the low response rates in the 2016 and 2018 collections, a total of approximately 4,800 schools will be sampled: 2,340 schools will be assigned to the “early incentive” treatment; 1,230 schools will be assigned to the “delayed incentive” treatment; and 1,230 schools will be assigned to the “no incentive” (control) treatment. Given the inclusion of both the web menu and incentive experiments aimed at increasing the overall response, we anticipate at least maintaining the SSOCS:2016 and SSOCS:2018 response rates, which will yield more completed surveys than needed to meet the study’s objectives.
An item included in the SSOCS:2018 questionnaire asked respondents, “How long did it take you to complete this form, not counting interruptions?” Based on their answers, respondents took approximately 51 minutes, on average, to respond to the SSOCS survey in 2018. In preparation for SSOCS:2020, upon reviewing the SSOCS:2018 survey items and the results of prior cognitive testing, NCES decided to delete 11 of the SSOCS:2018 items/subitems. Based on these updates, we estimate that the average SSOCS:2020 survey response time will be 49 minutes.4
Districts selected for the SSOCS sample that require submission and approval of a research application before the schools under their jurisdiction can be asked to participate in a study will be contacted to seek research approval. Based on previous SSOCS administrations, we estimate that approximately 195 special contact districts will be included in the SSOCS:2020 sample. Differing from the process for SSOCS:2018, SSOCS:2020 will not seek simultaneous special district approval with NTPS because NTPS will not be conducted during the 2019–20 school year. Otherwise, the process for contacting special districts for SSOCS:2020 will follow the process described for SSOCS:2018, as outlined earlier in this document.
Principals of sampled schools will be notified of the survey through an initial mailout containing an invitation letter with log-in information for the online questionnaire. The burden per school for reading and potentially following up on the SSOCS initial letter and any follow-up letters and e-mails is estimated to average about 6 minutes total.
Table 2. Estimated hourly burden for SSOCS:2020
Activity for each administration | Sample size | Expected response rate | Number of respondents* | Number of responses | Burden hours per respondent | Total burden hours
District IRB Staff Review | 195 | 0.80 | 156 | 156 | 3 | 468
District IRB Panel Review | 195*6 | 0.80 | 936 | 936 | 1 | 936
State Notification | 51 | 1.0 | 51 | 51 | 0.05 | 3
District Notification | 2,800 | 1.0 | 2,800 | 2,800 | 0.05 | 140
School Recruitment | 4,800 | 1.0 | 4,800 | 4,800 | 0.1 | 480
SSOCS Questionnaire | 4,800 | 0.6** | 2,880 | 2,880 | 0.817 | 2,353
Total for SSOCS:2020 administration | - | - | 8,743 | 11,623 | - | 4,380
* Details may not sum to totals because counts are unduplicated.
** This response rate is calculated based on the results of the SSOCS:2018 data collection. The web menu and incentive experiments are being conducted with the hope of increasing or at least maintaining the 2018 overall response rates.
Annualized Response Burden for SSOCS:2018 and SSOCS:2020
The annualized estimated response burden for SSOCS:2018 and SSOCS:2020 is provided in Table 3.
Table 3. Annualized estimated response burden for SSOCS:2018 and SSOCS:2020
Activity for each administration | Number of respondents | Number of responses | Total burden hours
Total for SSOCS:2018 administration | 8,659 | 11,538 | 4,461
Total for SSOCS:2020 administration | 8,743 | 11,623 | 4,380
Annualized Total for SSOCS:2018 and SSOCS:2020* | 5,801 | 7,721 | 2,947
* The annualized total is the sum of the total SSOCS:2018 and SSOCS:2020 burden, divided by 3.
Assuming that respondents (district education administrators for district approvals and mostly principals for the data collection) earn on average $45.80 per hour,5 and given the 2,947 annualized total estimated burden hours, the annualized total estimated burden time cost to respondents for SSOCS:2018 and SSOCS:2020 is $134,973.
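As a quick arithmetic check of the figures above, the sketch below reproduces the annualized burden hours (from the totals in Tables 1 and 2 and the divide-by-3 rule in the note to Table 3) and the corresponding respondent time cost at the footnote 5 wage rate.

```python
# Annualized burden and respondent time cost for SSOCS:2018 and SSOCS:2020.
hours_2018, hours_2020 = 4461, 4380           # total burden hours (Tables 1 and 2)
hourly_rate = 45.80                           # mean hourly wage (footnote 5)

annualized_hours = (hours_2018 + hours_2020) / 3   # three-year clearance period
annual_cost = annualized_hours * hourly_rate

print(round(annualized_hours))   # 2947 hours
print(round(annual_cost))        # about $134,973
```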
A.13. Estimates of Cost Burden to Respondents
There are no additional costs to respondents beyond those reported for the hour burden.
A.14. Estimates of Annual Government Cost
The Census Bureau will conduct the SSOCS:2018 data collection preparation, data collection, and data file development work for approximately $2,079,125 over 3 years. A task in NCES’s ESSIN contract with AIR also supports this survey at about $725,000 over 3 years. NCES has allotted an additional $200,000 for additional post-collection support tasks. Thus, SSOCS:2018 will cost the government approximately $3,004,125 over 3 years.
The Census Bureau will conduct the SSOCS:2020 data collection preparation, data collection, and data file development work for approximately $2,400,000 over 3 years. A task in NCES’s ESSIN contract with AIR also supports this survey at about $725,000 over 3 years. Thus, SSOCS:2020 will cost the government approximately $3,125,000 over 3 years.
Therefore, the total annualized average cost to the government for SSOCS:2018 and SSOCS:2020 is approximately $1,021,521.
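The annualized figure can be reproduced from the amounts above. The brief sketch below assumes the combined cost is spread over the two 3-year collection periods (six years in total), which matches the stated result.

```python
# Total and annualized government cost for SSOCS:2018 and SSOCS:2020.
cost_2018 = 2_079_125 + 725_000 + 200_000   # Census work + AIR task + post-collection support
cost_2020 = 2_400_000 + 725_000             # Census work + AIR task

total = cost_2018 + cost_2020               # $6,129,125 across the two collections
annualized = total / 6                      # six years in total (three per collection)
print(cost_2018, cost_2020, round(annualized))  # 3004125 3125000 1021521
```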
A.15. Reasons for Changes in Response Burden
The increase in burden from SSOCS:2016 to SSOCS:2018 is due to the lower than anticipated response rate in SSOCS:2016 and the resulting increase in the initial SSOCS:2018 sample needed to yield the minimum of 2,550 participating schools required to produce key statistical estimates. Moreover, SSOCS:2018 includes an incentive experiment and a web-based experiment. The increased sample size and the addition of the experiments have contributed to an increase in the overall cost of the survey to the federal government as compared to SSOCS:2016.
The small decrease in burden from SSOCS:2018 to SSOCS:2020 is due to the omission of the principal advance letter and a reduction in the number of questionnaire items and subitems, which are somewhat balanced out by the expected increase in the number of special contact districts in the SSOCS:2020 sample.
NCES will release the first publication from a data collection as soon as possible after it is completed. The ultimate goal for all NCES collections, including SSOCS:2018 and SSOCS:2020, is to release a restricted-use data file, First Look report, and supplemental data documentation within approximately 12 months of the data collection end date. Table 4 displays the time schedule for the major project activities in SSOCS:2018 and Table 5 for the activities in SSOCS:2020.
Table 4. Schedule of major project activities: SSOCS:2018
Task | Date
Contact special districts to begin approval process | February 2017–January 2018
Complete and deliver special district applications and packages | February 2017–January 2018
Draft special mailing materials for schools in special districts | February 2017–January 2018
Data collection begins | February 2018
Data collection ends | June 2018
Restricted-use data file finalized | February 2019
First Look report through NCES review | March 2019
First Look report released | June 2019
Restricted-use data file released | June 2019
Survey documentation released | June 2019
Public-use data file released | September 2019
Web tables through NCES review | December 2019
Web tables released | March 2020
Table 5. Schedule of major project activities: SSOCS:2020
Task | Date
Contact special districts to begin approval process | June 2019–January 2020
Complete and deliver special district applications and packages | June 2019–January 2020
Draft special mailing materials for schools in special districts | June 2019–January 2020
Data collection begins | February 2020
Data collection ends | July 2020
Restricted-use data file finalized | February 2021
First Look report through NCES review | March 2021
First Look report released | July 2021
Restricted-use data file released | September 2021
Survey documentation released | September 2021
Public-use data file released | November 2021
Web tables through NCES review | March 2022
Web tables released | July 2022
Analysis Tasks
First Look report
This First Look report will use data from SSOCS:2018 and SSOCS:2020 to examine a range of issues dealing with school crime and safety, such as the frequency of school crime and violence, disciplinary actions, and school practices related to the prevention and reduction of crime. This publication will largely follow the format and analysis techniques used in First Look publications released in prior years, such as those listed in section A.2.
Data files and related data documentation
All data files (in several statistical formats) and data documentation (codebooks and user’s manuals) are publicly available on the NCES website at http://nces.ed.gov/surveys/ssocs/data_products.asp.
SSOCS web tables
Data from each SSOCS administration are tabulated and released in a table library, accessible through the NCES website at http://nces.ed.gov/programs/crime/crime_tables.asp.
Analyses of the SSOCS data generally follow the research questions presented below. A goal of the data analysis is to answer these questions using various analytical techniques, including t tests and cross-tabulations.
The SSOCS instrument is divided into 10 main research objectives, each with a series of items addressing a specific research question, as presented below. See Supporting Statement Part C for a description and justification of the items.
1. What is the frequency and nature of crime at public schools?
- What is the number of incidents, by type of crime?
- What are the characteristics of those incidents?
- How many incidents were reported to police?
- What is the number of hate-crime incidents?
- What biases motivated these incidents?
- How many arrests were made at school?
- How many schools report violent deaths?
- How many schools report school shootings?
- How many schools report disruptions for violent threats?
2. What is the frequency and nature of discipline problems and disorder at public schools?
- What types of discipline problems and disorder occur at public schools?
- How serious are the problems?
3. What disciplinary actions do public schools use?
- What types of disciplinary actions were available to principals?
- How many disciplinary actions were taken, by type of action and offense?
4. What practices to prevent/reduce crime and violence do public schools use?
- How do schools monitor student behavior?
- How do schools control student behavior?
- How do schools monitor and secure the physical grounds?
- How do schools limit access to the school?
- How do schools plan and practice procedures for emergencies?
5. How do schools involve law enforcement?
- Do schools have sworn law enforcement officers present on a regular basis?
- How often are they available and at what times?
- What activities do they participate in?
- How many are present at the school?
- How are sworn law enforcement officers armed?
- Is there written documentation outlining the roles and responsibilities of law enforcement in schools?
- Do schools have security guards or personnel other than law enforcement?
6. How do schools provide access to student mental health services?
- Are mental health services, such as diagnostic assessment and treatment, available to students?
- Where are those services available?
- What factors limit a school’s efforts to provide mental health services to students?
7. What formal programs designed to prevent/reduce crime and violence do public schools use?
- Which programs target students, teachers, parents, and other community members?
- What are the characteristics of the programs?
- Do schools have threat assessment teams?
- How often do they formally meet?
- What student groups promote acceptance of student diversity?
- What training is provided to staff?
8. What efforts used by public schools to prevent/reduce crime and violence involve various stakeholders (e.g., law enforcement, parents, juvenile justice agencies, mental health agencies, social services, and the business community)?
- In what activities are stakeholders involved?
- How much are stakeholders involved?
9. What problems do principals encounter in preventing/reducing crime and violence in public schools?
10. What school characteristics might be related to the research questions above?
- What are the demographic characteristics of schools?
- What are the characteristics of the student population?
- What is the average student/teacher ratio?
- What are the general measures of school climate, such as truancy or student mobility?
A.17. Approval to Not Display Expiration Date of OMB Approval
NCES is not seeking approval to not display the expiration date of OMB approval.
There are no exceptions to the certification statement.
1 The 1996–97 FRSS survey was a predecessor to SSOCS:2000.
2 Each subitem in the SSOCS:2018 questionnaire was counted as an item. Assuming an average burden of 12.3 seconds per item (based on the average amount of time it took respondents to complete the 2016 questionnaire) and that the items do not differ substantially in complexity or length, the burden for the SSOCS:2018 survey is estimated to be very similar to that in the SSOCS:2016 survey.
3 Please note that the preliminary activities for SSOCS:2018 were approved in March 2017 (OMB# 1850-0761 v.11).
4 Each subitem in the SSOCS:2020 questionnaire was counted as an item. Assuming an average burden of 11.7 seconds per item (based on the average amount of time it took respondents to complete the 2018 questionnaire) and that the items do not differ substantially in complexity or length, the burden for the SSOCS:2020 survey is estimated to be very similar to that for the SSOCS:2018 survey.
5 The source of this estimate is the mean hourly rate of Education Administrators (data type: SOC:119030) on the BLS Occupational Employment Statistics website, http://data.bls.gov/oes/, accessed on February 25, 2019.