Supporting Statement
2015 School Crime Supplement (SCS)
A. Justification
This request is for clearance to conduct the 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS) from January through June 2015. The primary purpose of the SCS is to obtain information about school-related victimizations. This information helps policymakers, academic researchers, practitioners at the federal, state, and local levels, and special interest groups concerned with crime in schools make informed decisions about policies and programs. The Bureau of Justice Statistics (BJS) is authorized to collect statistics on victimization under Title 42, United States Code, Section 3732 of the Justice Systems Improvement Act of 1979.1 Title I of the Education Sciences Reform Act (ESRA) mandates that the National Center for Education Statistics (NCES) collect, report, analyze, and disseminate statistical data regarding education in the United States.
Addressing Terms of Clearance
In approving the 2011 and 2013 SCS administrations, OMB’s terms of clearance included a requirement that “prior to submitting this collection for its next full clearance cycle, BJS, in concert with the National Center for Education Statistics (NCES) and experts in criminology, should conduct a systematic review of the survey and its content. This review should assess purpose, burden and response rates, and if BJS decides to continue using content added in recent cycles, the review must also demonstrate that this new content is being analyzed, reported and found useful by stakeholders.”
To address the Terms of Clearance, the following occurred:
To inform discussions about increased burden, NCES contracted with Synergy Enterprises, Incorporated (SEI) and its partner Mathematica Policy Research (MPR) in June 2012 to analyze response patterns on twelve items added to the 2011 SCS with regard to utility and necessity. The primary findings are – 2
Two of the new items (questions 16c.b and 16f.a in the SCS 2013 instrument3) may not have contributed meaningful data due to (1) infrequent occurrence among respondents and (2) difficulty respondents had in answering the items.
The expanded items asking about supportive relationships with adults at school were not strongly correlated with each other and appeared to measure different constructs. It was suggested that the number of items in this question could not be reduced without losing potentially useful information.
The findings of the study suggest that some of these new items added in 2011 need to be reworded for clarity and that other sections of the survey should be explored for reducing administration time.
To examine differences between SCS estimates of bullying frequency and estimates from other national surveys, Census contracted with ICF International in June 2012 to plan and conduct cognitive interviews with middle-school students to determine how students ages 11 to 14 interpret bullying questions. The relevant findings from the report include – 4
Respondents’ own concepts of bullying did not always include all definitional elements such as repetition and power imbalance.
Responses to the bullying scenarios differed based on how bullying was defined in the questionnaire. That is, in some cases, respondents did appear to base their answers to questions about bullying on the definition presented instead of on their pre-existing definitions.
When responding to questions about bullying based on the SCS presentation of bullying behaviors, students tended to focus on the list of individual behaviors presented rather than the overall stem question about bullying.
Respondents’ answers throughout the interviews demonstrated interconnectedness between bullying and cyberbullying incidents in their experiences.
These findings suggest that the reasons for differences in bullying estimates between national surveys might include question structure as well as known differences in the sample populations. They also suggest that, in order to operationalize the common definition of bullying endorsed by the Centers for Disease Control and Prevention (CDC) and the Department of Education (ED), the SCS questions need to present all relevant components of the definition to students before asking them whether they were bullied.5
To assess how the content of the SCS is being used by stakeholders, NCES is supporting an ongoing literature review to continually update a bibliography of both government and non-government publications analyzing SCS data. The Inter-University Consortium for Political and Social Research (ICPSR) also lists the publications associated with each SCS dataset on the pages describing the datasets on its website. The number and diversity of secondary publications (those not produced by NCES, BJS, or Census) indicates that the SCS continues to be relevant to policymakers, educators, and administrators as part of ongoing efforts to understand and address bullying in schools as a construct separate from other types of victimization. Attachment 12 contains a selected bibliography. Current areas of research include the impact of school safety measures and environmental factors on incidents of victimization, the impact of bullying on academic achievement, and the growing incidence of cyberbullying as part of overall bullying.
To address response rates, Census and NCES undertook several outreach efforts in 2011 and 2013, including preparing informational materials about the SCS for Census field representatives (FRs) and for distribution to parents and students. A video was created for FRs explaining the SCS and why it is important. The 2013 brochures, which included findings from the 2009 SCS, will be updated for 2015 with findings from the 2011 SCS. These brochures for parents and students (in English and Spanish) describe the survey’s purpose, why responding to the survey is important, and the variety of topics covered in the SCS (see Attachments 17, 18, 19, and 20 for the 2013 brochures).
To make recommendations for revising and testing the 2015 SCS, a Technical Review Panel (TRP) for the SCS met on August 12–13, 2013.6 The charge to the TRP included (1) addressing emerging information needs of stakeholders, (2) decreasing the administrative burden of the complete survey, (3) revising the section on bullying to be consistent with the newly developed uniform definition of bullying,7 and (4) aligning the SCS with ED’s commitment to more fully address the needs of vulnerable student groups. After reviewing relevant materials, the TRP made a number of recommendations, including –
Eliminating questions that are no longer relevant to most students (e.g., leaving school for lunch).
Reducing the number of questions on drug availability by consolidating individual questions on specific drugs into one question on ‘illegal’ drugs.
Analyzing correlations between two series of questions about relationships with teachers and with other adults to determine if they can be combined without loss of information.
Developing a revised series of questions on bullying to (1) include all relevant dimensions of the current definition, (2) incorporate incidents of cyberbullying into overall bullying, and (3) address the issue noted in the ICF report (see above) of respondents including non-bullying incidents in their responses.
NCES, in consultation with BJS and Census, developed a revised questionnaire from the TRP recommendations. Using those recommendations, Census completed an additional cognitive lab study of the revised questions to (1) fully examine whether the proposed new questions were well understood by the target population and (2) establish the validity of the new questions (e.g., did students construct responses based on the intended information reflected in each survey item). Census conducted testing in two rounds between November 2013 and February 2014.8 Conducting two rounds allowed questions to be revised based on the results of the first round. Evidence from the study indicated that the final versions of the questions were well understood and were capturing the intended information.
One key recommendation that emerged was to continue to present the bullying questions in two different ways, using a split-sample design for the 2015 SCS administration. Too few students were involved in the cognitive lab testing to reliably estimate how overall bullying frequency would be affected by changing the bullying questions to reflect the CDC’s uniform definition. Using only the redesigned bullying questions would result in a loss of comparability with historical data, an outcome the TRP recommended against. A suggested alternative was to use the old series of questions with follow-up questions about power imbalance and repetition (components of the CDC uniform definition). From this, two frequencies could be generated and compared: one based on responses to the original series, the other based on responses modified by the follow-up questions. However, this version is cumbersome and has the potential to confuse respondents when they reach later questions about bullying location and frequency. Administering this version (the old series plus two follow-up questions) to only half of the respondents minimizes burden and the chance for confusion by reducing the number of times the longer version is used, while still generating sufficient responses under both the old and new definitions to produce two separate estimates. These estimates can be used to create a conversion factor that will allow future administrations of the SCS, which will use only the new version of the bullying question series, to be compared with historical data. Overall, using a split-sample design in the 2015 SCS data collection will allow for maintaining the trend in bullying data as well as testing the new measures of bullying.
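For illustration only, the sketch below shows one simple way such a conversion factor could be computed and applied; the ratio-based approach, the function names, and the numbers are assumptions made for this example and do not represent a procedure or results specified by BJS, NCES, or Census.

```python
# Illustrative sketch only (not the documented BJS/NCES procedure): one simple way a
# conversion factor between the old- and new-definition bullying measures could be
# derived from the split-sample estimates and applied to historical data.

def conversion_factor(prevalence_old: float, prevalence_new: float) -> float:
    """Ratio of the new-definition estimate to the old-definition estimate,
    both measured on the half sample that received the follow-up questions."""
    return prevalence_new / prevalence_old

def rescale_historical(prevalence_historical: float, factor: float) -> float:
    """Re-express a historical (old-definition) estimate on the new-definition scale."""
    return prevalence_historical * factor

# Hypothetical prevalence rates, used only to show the mechanics:
factor = conversion_factor(prevalence_old=0.28, prevalence_new=0.22)
print(rescale_historical(0.30, factor))  # a historical trend point, rescaled
```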
Based on the findings from the cognitive lab, NCES created a final version of the 2015 SCS.9 The proposed 2015 SCS items meet the needs of researchers and school-based planners and address the terms of clearance. A summary of changes appears in Exhibit 1.10 In addition to the changes listed, all items were renumbered to be in sequence for the new survey.11
Exhibit 1: Summary of changes made to 2015 SCS questions
1. Necessity of the Information Collection
In 1989, the SCS was first administered as a supplement to the NCVS. It was repeated in 1995 and 1999; since then, it has been administered biennially. To study the relationships between victimization at school and the school environment, and to monitor changes in student experiences with victimization, accurate information regarding its incidence must be collected. The SCS includes questions related to students' experiences with, and perceptions of, crime and safety at school. The questions focus on preventive measures used by schools; students' participation in after-school activities; students' perception of school rules and enforcement of these rules; the presence of weapons, drugs, alcohol, and gangs in school; student bullying; hate-related incidents; and attitudinal questions relating to the fear of victimization at school. These responses are linked to the NCVS survey instrument responses for a more complete understanding of the individual student’s circumstances.
NCVS data on school crime have shown that school crimes are underreported to the police and that victims between the ages of 12 and 18 are less likely than older victims to report victimizations to the police. In addition, police-based statistics are not organized in a manner that properly identifies crimes that occurred at school or during school hours. Therefore, police statistics on school crime are not adequate to address the nature and prevalence of school victimization. The 2015 supplement will continue to provide critical information about the overall safety environment in schools and the context in which school-related victimizations occur on a national level.
Two recent reports highlight the need for more data about bullying victimization. Questions pertaining to school bullying are a key component of the SCS. School bullying represents a unique type of victimization among youth and significantly affects education outcomes and physical and mental health. A 2012 Government Accountability Office (GAO) report on school bullying recommended that “the Secretaries of Education and HHS and the Attorney General work together to develop information in their future surveys of youths’ health and safety issues on the extent to which youths in various vulnerable demographic groups are bullied.” 12 In 2014, the CDC also advocated for more information on vulnerable groups and bullying, subsequently publishing a report on the topic. 13 The CDC report proposed using a uniform school bullying definition for all future research in this area and defines bullying as –
any unwanted aggressive behavior(s) by another youth or group of youths who are not siblings or current dating partners that involves an observed or perceived power imbalance and is repeated multiple times or is highly likely to be repeated. Bullying may inflict harm or distress on the targeted youth including physical, psychological, social, or educational harm.14
The latest revisions proposed for the 2015 SCS questionnaire represent the first attempt to operationalize all parts of the definition proposed in the CDC report and to collect nationally representative information on bullying among certain vulnerable groups. The CDC report further recommends that “those concerned about youths’ safety not limit their data collection efforts to bullying alone, but rather gather information on the broad threats to youths’ safety.”15 The SCS uniquely addresses this recommendation due to the linkage between the SCS and NCVS. This integration of the two surveys allows for a more complete understanding of individual students’ circumstances and the relationships between victimization in and out of school.
2. Needs and Uses
Title I of the Education Sciences Reform Act (ESRA) mandates that NCES collect, report, analyze, and disseminate statistical data regarding education in the United States. These data include the incidence, frequency, seriousness, and nature of violence affecting students, school personnel, and other individuals participating in school activities, as well as other indices of school safety, including information regarding the relationship between victims and perpetrators and the demographic characteristics of victims. To study the relationship between victimization at school and the school environment, and to monitor changes in student experiences with victimization, accurate information regarding its characteristics and incidence must be collected. These data serve both general users and several specific groups interested in school crime, such as school administrators, resource officers, and educators.
General Uses. Funded by the National Center for Education Statistics (NCES) within the U.S. Department of Education’s Institute of Education Sciences (IES) and jointly designed with the Bureau of Justice Statistics (BJS), the SCS collects data that address the mandates of both agencies. Since its first collection in 1989, followed by administrations in 1995 and 1999 and biennially thereafter, the SCS has been NCES’s primary data source on student victimization. In addition to collecting characteristics of various types of student victimization at school, the SCS asks students about alcohol and drug availability; fighting, bullying, and hate-related behaviors; cyber-bullying; fear and avoidance behaviors; gun and weapon carrying; and gangs at school.
To meet its obligation to Congress under the ESRA, NCES works with its diverse customer groups and relies on their feedback to determine how to meet their information demands for timely, comprehensive, and useful information that maintains high statistical standards. Specifically, NCES engaged and encouraged school practitioners, researchers, and data users of the SCS by convening a Technical Review Panel (TRP) in August 2013 to review the SCS and its content. For the 2015 SCS, the TRP made recommendations related to addressing the needs of stakeholders, reducing questions on the instrument, and aligning the SCS with ED’s commitment to more fully address the needs of vulnerable student groups. The TRP also recommended ways that NCES/BJS can update questions on the SCS questionnaire concerning bullying victimization to incorporate the recommendations in the 2014 CDC report on uniform definitions of bullying. Using a split sample design in the 2015 SCS data collection will allow for maintaining the trend in bullying data, as well as testing these new measures of bullying.
As part of the response to the 2012 GAO report on school bullying, the U.S. Department of Education (ED) has been working to collect and disseminate more information on how bullying impacts vulnerable student groups. In particular, the report noted that –
When bullying rises to the level of discrimination, federal civil rights laws may be used to provide redress to individuals in legally protected groups....However, federal agencies generally lack jurisdiction to address discrimination based on classifications not protected under federal civil rights statutes. For example, federal agencies lack authority to pursue discrimination cases based solely on sexual orientation.16
In 2011, ED hosted the first Lesbian, Gay, Bisexual, and Transgender (LGBT) Youth Summit. ED’s commitment to address issues faced by LGBT students requires additional data to inform policies and to evaluate the results of programs stemming from new initiatives. NCES and BJS developed questions for the 2015 SCS questionnaire to obtain student-level data about whether reported bullying was perceived to be due to sexual orientation or to other factors, such as race, ethnicity, and gender.
Exhibit 2 displays the types of estimates that can be drawn from the 2015 SCS.
Exhibit 2: Types of estimates that can be drawn from the 2015 SCS.
Estimates1 | Relevant questions
Percentage of students ages 12–18 who reported presence of selected security measures at school | Q10
Percentage of students ages 12–18 who reported being bullied at school during the school year, by type of bullying and by selected student and school characteristics | Q22, 23
Percentage of students ages 12–18 who reported cyber-bullying problems during the school year, by selected student and school characteristics | Q25.8
Number and percentage of students ages 12–18 who reported being bullied at school, by the frequency of bullying and whether an adult was notified, and selected student characteristics | Q22, 23, 24, 25, 27
Percentage of students ages 12–18 who reported bullying problems at school and the effect it had on them, by selected student and school characteristics | Q22, 23, 27
Percentage of students ages 12–18 who reported being targets of hate-related bullying, hearing hate-related words, and seeing hate-related graffiti at school during the school year, by selected student and school characteristics | Q30, 31, 32
Percentage of students ages 12–18 who reported being afraid of attack or harm during the school year, by location and urbanicity | Q35 a-c
Percentage of students ages 12–18 who reported that gangs were present at school during the school year | Q39 a-c
Percentage of students ages 12–18 who reported being bullied at school, by student reports of unfavorable school conditions | Q19, 22, 23, 32, 37b, 39a
Percentage of students ages 12–18 who reported being bullied at school, by presence of indicators of school attachment, performance, and future orientation | Q9, 14, 40, 41, 42
Percentage of students ages 12–18 who reported being bullied at school, by student reports of personal fear, avoidance behaviors, fighting, and weapon carrying at school, and type of bullying | Q21a, 33, 34a-c, 35a-c, 36
1 Some data on student characteristics, such as sex, race, and household income, are collected in the NCVS and not in the SCS. School characteristics for respondents’ schools of attendance are taken from NCES’s Common Core of Data (CCD) and Private School Universe Survey (PSS).
Use by Federal Stakeholders
NCES and BJS use the SCS data to meet the reporting mandates of the agencies. Together they issue a joint annual report, Indicators of School Crime and Safety. The latest report is available at http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5008. Seven of the 22 indicators in this report are based on SCS data. Indicator 2, “Incidence of Victimization at School and Away from School,” is the primary mechanism for releasing annual estimates from the NCVS for violence and theft against students ages 12 to 18.
NCES also uses these data to complement other publications, such as The Condition of Education, a congressionally mandated annual report that summarizes developments and trends in education using the latest available data. Some of the other federal stakeholders and the ways in which they use SCS data are as follows:
Congress uses these data to evaluate the prevalence and extent of school crime, to help support federal, state, and local agencies in reducing student victimization, to develop new or improved initiatives or laws aimed at ensuring the safety of America's students, and to monitor the effectiveness of school policies and programs.
The Department of Education reviews the data to meet its obligation to Congress under the Education Sciences Reform Act (ESRA) to understand current trends in school crime and disorder and their possible effects on student education and school systems. Within the Department, the Office of Elementary and Secondary Education (OESE) and the Office of Safe and Healthy Students (OSHS) use the data to communicate and understand current trends in school crime; to allocate resources that assist state and local agencies; and to help school officials, administrators, teachers, and parents assess conditions within their own schools and jurisdictions relative to those at the national level and determine needs and budget requirements.
Use by Non-Federal Stakeholders
Non-federal users include state and local officials who, in conjunction with researchers and planners, need to analyze the current trends in victimization and school safety. For example:
State and local governments use the data to assess conditions within their own jurisdictions relative to those at the national level and to determine needs and budget requirements for local school districts.
Researchers and practitioners often reanalyze the data to estimate the prevalence and impact of student victimization and to identify correlates of school crime that can inform the design of prevention efforts and programs.
The media disseminate findings from the survey to inform the public about issues related to school crime and safety.
In addition to principal, district, or state-level data sources, students' reports of victimization and perceptions of crime, violence, and school climate are important factors in providing a comprehensive picture of school crime and safety. Currently, the SCS is the only recurring national data source that provides nationally representative student-level data detailing victimization and other school characteristics related to crime and disorder.
If the SCS data were not collected, data users would have no source of nationally representative, student-level data on victimization and on school characteristics related to victimization that includes incidents both reported and not reported to police. Stakeholders would not have sufficient data to make comparative assessments that document the changing demands on schools, community mental health agencies, and law enforcement, and these entities would lack the data needed to justify resources for personnel and services to ensure school safety (e.g., security, personnel, and programmatic efforts) in competition with other demands for tax dollars.
Attachment 12 displays selected nonfederal publications that report secondary analyses of SCS data.
3. Use of Technology
The SCS will be conducted in a fully automated interviewing environment using computer-assisted personal interviewing (CAPI), whereby field representatives use a laptop computer to display questions and record answers. The use of CAPI reduces data collection costs as well as respondent and interviewer burden. Furthermore, automated instruments afford the opportunity to implement inter-item integrity constraints, which minimize data inconsistency. More consistent data, in turn, reduce the need for extensive post-collection editing and imputation, significantly reducing the time needed to release the data for public use. The use of technology results in more accurate data products delivered in a more timely fashion, giving data users access to information while it is still relevant.
The NCVS is conducted at six-month intervals at selected households for a total of seven interviews. The first interview is conducted in person, while the second through seventh interviews are conducted over the telephone. All interviews are conducted using CAPI technology.
4. Efforts to Identify Duplication
Two contemporary surveys collect information about school-related crime and safety from the students’ perspective. The Youth Risk Behavior Survey (YRBS) and Monitoring the Future (MTF) are national collections that target different populations and substantive areas. However, neither study provides a comprehensive picture of school crime from the perspective of students in both public and private schools.
Youth Risk Behavior Survey (YRBS). The Centers for Disease Control and Prevention’s (CDC) Youth Risk Behavior Survey (YRBS) collects information on risky behaviors and offending, but there is minimal overlap of YRBS content with that of the SCS. The YRBS is a school-based survey that interviews students in grades 9 through 12, and most of its questions ask about all experiences, not just those confined to school. The SCS, by contrast, is a household-based sample that interviews children ages 12 to 18 who have attended school during the previous six months (grades 6 through 12), and all of its questions are about experiences at school. The three areas of overlap are whether the student carried a weapon on school property, whether the student was in a fight on school property, and whether the student skipped (or did not attend) school because of safety concerns. In 2011, two questions on bullying and cyber-bullying were added to the YRBS. Unlike the SCS, these questions do not go into detail about the type of bullying behavior, the number of incidents, or the results (notification of adults, injuries sustained, avoidance, etc.). Additionally, because the YRBS is self-administered, its responses are not directly comparable to those of the SCS.
Monitoring the Future (MTF). The National Institute on Drug Abuse (NIDA) publishes survey results from Monitoring the Future (MTF). Like the YRBS, this survey is self-administered and school-based. The population surveyed does not completely overlap with that of the SCS: the survey is not administered to students below grade 8, uses different forms for grades 8, 10, and 12, includes college students, and is not restricted by age. More importantly, the sampling procedures are representative of schools, not the general population. MTF does not address bullying or cyber-bullying and overlaps with the SCS only in the areas of drug and alcohol use and availability. Like the YRBS, MTF does not restrict responses to experiences on school property. Thus, the SCS does not duplicate existing data collections.
5. Minimizing Burden
The SCS is part of the NCVS, which is a household-based sample. The supplement will be conducted in households scheduled to be interviewed from January through June 2015. Based on the 2013 SCS data collection, we expect the SCS to take no longer than about 17.5 minutes to administer. The 2013 SCS was administered to approximately 9,552 household members ages 12 through 18. We estimate that approximately 14,461 respondents between the ages of 12 and 18 will be eligible for the supplement in 2015, an increase of about 51% over the number eligible for the 2013 SCS. This increase is attributable to the 2013 NCVS Sample Boost in 11 states, running from July 2013 to December 2015, to test the feasibility of collecting sub-national estimates of victimization. Because the 2013 Sample Boost is a feasibility study, analysis will be conducted to determine whether these cases will be used in the final SCS datafile.
In 2015, as in 2009, 2011, and 2013, all SCS interviews will be conducted using only CAPI technology. CAPI reduces respondent and interviewer burden because the automated instrument presents the next ‘on-path’ question, eliminating the need for the interviewer to pause and determine the correct skip pattern. Fewer delays throughout the interview result in shorter interviews and a commensurate reduction in respondent and interviewer burden.
During the current cycle of review and revisions, a number of questions and sub-questions were deleted, edited or combined in order to improve information or minimize non-response. In total, comparing new questions added to those deleted, there has been a net reduction of 12 items in the questionnaire to which students are asked to respond.
6. Consequences of Less Frequent Collection
Producing a regular series of data on school crime victimization requires regular data collection. In 1999, the SCS became a biennial survey for two reasons: (1) the student perspective is important in understanding school crime, and (2) data about students must be analyzed over time to identify trends.
7. Special Circumstances
Collection is consistent with the guidelines in 5 C.F.R. 1320.9.
8. Federal Register Publication and Consultations Outside the Agency
The research under this clearance is consistent with the guidelines in 5 CFR 1320.6. Comments on this data collection effort were solicited in the Federal Register, Vol. 79, No. 115, on June 16, 2014 and in Vol. 79, No. 159, August 18, 2014. No comments were received in response to the information provided.
The U.S. Census Bureau, the BJS, and the NCES cooperated to develop the questionnaire and procedures used to collect this supplemental information. Michael Planty, Ph.D., Jennifer Truman, Ph.D., and Rachel Morgan, Ph.D., from the BJS, and Ms. Kathryn Chandler, from the NCES, were the principal consultants. Those persons consulted from the Census Bureau included Ms. Meagan Wilson, Mr. Christopher Seamands, Mr. William Samples, Mr. Edward Madrid, Ms. Joanne Pascale, and Ms. Theresa DeMaio.
As part of the 2015 survey development, members of the Technical Review Panel (TRP) were consulted about content. TRP members and their affiliations include –
Lynn Addington, American University
Jon Akers, Kentucky Center for School Safety
Catherine Bradshaw, Johns Hopkins University
Jill DeVoe, Independent Consultant
Brad Lerman, Rutgers University
William Modzeleski, Independent Consultant
Sister Dale McDonald, National Catholic Educational Association
Deborah Temkin, Robert F. Kennedy Center for Justice and Human Rights
Michele Ybarra, Center for Innovative Public Health Research
9. Paying Respondents
Payments or gifts are not provided to respondents in return for participation in the survey.
10. Assurance of Confidentiality
All NCVS information about individuals or households is confidential by law under Title 42, United States Code, Sections 3789g and 3735 (formerly Section 3771), and Title 13, United States Code, Section 9. Only Census Bureau employees sworn to preserve this confidentiality may see the survey responses. Even BJS, as the sponsor of the survey, is not authorized to see or handle the data in raw form. All unique and identifying information is scrambled or suppressed before the data are provided to BJS and NCES for analysis. Data are maintained in secure environments and in restricted-access locations within the Census Bureau. All data provided to BJS must meet the confidentiality requirements set forth by the Census Bureau’s Disclosure Review Board.
In a letter signed by the Director of the Census Bureau and sent to all survey participants, respondents are informed of this law and assured that it requires the Census Bureau to keep all information provided by the respondent confidential. The letter also informs respondents that the survey is voluntary. In addition to the legal authority and the voluntary nature of the survey, the letter informs respondents of the public reporting burden for this collection of information, the principal purposes for collecting the information, and the various uses of the data after collection, which satisfies the requirements of the Privacy Act of 1974.
11. Justification for Sensitive Questions
Sensitive questions include those related to victimization, bullying victimization, drug availability at school, gang presence at school, and students’ access to weapons; these topics are of great interest to school administrators and personnel responsible for maintaining school safety. Such questions have been included in past SCS administrations. An additional section asking respondents whether they felt that being bullied was related to personal characteristics, such as sexual orientation or religious beliefs, has been added to the questionnaire for 2015. These questions are carefully constructed to ask about victims’ perceptions rather than about actual personal characteristics. This information is necessary to meet ED’s commitment to provide better information on victimization among protected and vulnerable student groups.
12. Estimated Respondent Burden
The 2013 SCS was administered to approximately 9,552 persons. We estimate that approximately 14,461 respondents between the ages of 12 and 18 will be eligible for the supplement in 2015 (a difference of 4,909 respondents). This increase is attributable to the 2013 NCVS Sample Boost in 11 states from July 2013 to December 2015. The majority of respondents will complete the long SCS interview (the entire SCS questionnaire), which will take an estimated 0.292 hours (17.52 minutes) to complete. Based on the 2013 SCS data collection, we expect the completion rate to be 51.7% for the long interview. The remaining respondents will complete the short interview (i.e., they will be screened out for not being in school), which will take an estimated 0.047 hours (2.83 minutes) to complete. We expect the completion rate to be 8.2% for the short interview. This amounts to a total increase in response burden of approximately 760 hours ((4,909 × 0.517 × 0.292) + (4,909 × 0.082 × 0.047)). Due to the changes in the 2015 SCS instrument, we anticipate a decrease in burden of 89 hours, for a net increase of 671 (760 − 89) hours in respondent burden compared to the 2013 submitted total respondent burden estimate of 1,773 hours. This net increase is primarily due to the increase in sample noted above. The total respondent burden is approximately 2,444 (1,773 + 671) hours. Exhibit 3 presents this information.
Exhibit 3: 2015 SCS estimated burden hours
Eligible respondents in 2015 | 14,461
Number of eligible respondents in 2013 | 9,552
Difference between 2015 and 2013 | 4,909
Completion rate: Full SCS | 0.517
Completion rate: Short interview | 0.082
Response time (hours): Full SCS | 0.292
Response time (hours): Short interview | 0.047
Burden hours: Full SCS | 741
Burden hours: Short interview | 19
Total increase in 2015 | 760
Estimated decrease due to changes to SCS in 2015 | 89
Net increase of burden hours in 2015 | 671
2013 burden hours | 1,773
Total 2015 burden hours requested | 2,444
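The burden figures in Exhibit 3 can be reproduced with the short calculation below; it simply restates, in code, the rates and times given in this section, with variable names chosen here for readability.

```python
# Reproduce the 2015 SCS burden-hour arithmetic from Section 12 / Exhibit 3.
additional_respondents = 14_461 - 9_552            # 4,909 more eligible respondents than in 2013
full_rate, full_hours = 0.517, 0.292               # long-interview completion rate and length (hours)
short_rate, short_hours = 0.082, 0.047             # short-interview completion rate and length (hours)

full_burden = additional_respondents * full_rate * full_hours      # ~741 hours
short_burden = additional_respondents * short_rate * short_hours   # ~19 hours
total_increase = full_burden + short_burden                        # ~760 hours

net_increase = total_increase - 89                 # minus the 89-hour decrease from instrument changes
total_2015 = 1_773 + net_increase                  # 2013 burden plus the net increase: ~2,444 hours
print(round(full_burden), round(short_burden), round(total_increase),
      round(net_increase), round(total_2015))      # 741 19 760 671 2444
```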
13. Estimate of Cost Burden
There are no costs to respondents other than that of their time to respond.
14. Estimates of Cost to the Federal Government
The estimated annual cost to the Federal Government for the SCS is approximately $768,120. The NCES will bear all costs of data collection for the supplement incurred by the Census Bureau. Exhibit 4 details estimated costs to BJS, NCES, the total estimated Census costs, and the total estimated costs to the federal government.
Exhibit 4: Annual cost to the Federal Government for collecting and disseminating the 2015 School Crime Supplement
Data Collection: Census Bureau | $314,000
NCES Support Contractors | $300,000
Supplement Project Management – staff salaries:
GS12 – Statistician, BJS (15%) | $12,855
GS15 – Supervisory Statistician, BJS (3%) | $4,250
GS13 – Statistician, NCES (vacant) (45%) | $40,466
GS15 – Statistician, NCES (30%) | $47,130
Subtotal: Salaries | $104,701
Fringe benefits (28% of salaries) | $29,316
Subtotal: Salary and fringe | $134,017
Other administrative costs of salary and fringe (15%) | $20,103
Subtotal: Project management costs | $154,120
Total estimated costs for SCS | $768,120
NOTE: The interagency agreement with the Census Bureau is a multi-year, unseverable agreement that covers all Census Bureau work on the 2015 SCS. The current agreement, which runs from approximately August 1, 2014, through September 30, 2016, totals $680,407. This is considerably less than the interagency agreements covering earlier collections; the cost is lower because of tighter budgeting.
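Similarly, the cost components in Exhibit 4 sum to the total shown, as the brief check below illustrates using only the figures stated in the exhibit.

```python
# Arithmetic check of the Exhibit 4 cost components.
salaries = 12_855 + 4_250 + 40_466 + 47_130       # subtotal, salaries: 104,701
fringe = 29_316                                   # fringe benefits, 28% of salaries (as stated)
salary_and_fringe = salaries + fringe             # 134,017
other_admin = 20_103                              # other administrative costs, 15% of salary and fringe (as stated)
project_management = salary_and_fringe + other_admin            # 154,120
total = 314_000 + 300_000 + project_management    # data collection + support contractors + project management
print(salaries, salary_and_fringe, project_management, total)   # 104701 134017 154120 768120
```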
15. Reasons for Changes in Burden
The increase in respondent burden from 1,773 hours to 2,444 hours is attributed to the increase in the NCVS sample. The number of household members ages 12 through 18 who will be eligible for the supplement will increase by approximately 51%, from 9,552 respondents in 2013 to about 14,461 respondents in 2015. This increase is attributable to the 2013 NCVS Sample Boost in 11 states, running from July 2013 to December 2015, to test the feasibility of collecting sub-national estimates of victimization.
16. Plan for Publication and Project Schedule
The BJS and the NCES will be responsible for release of the data to the public (hereafter referred to as the “datafile”), the statistical analysis of the data, and the production of resultant web-based publications and tabulations. These microdata are made available as a public-use file (PUF) after the file has been approved by the Census Bureau’s Disclosure Review Board (DRB). The datafile is released via the Inter-University Consortium for Political and Social Research (http://www.icpsr.umich.edu/) and includes a codebook, a setup program in SAS language, a text file of the raw data, and the datafile in SPSS, SAS, and Stata formats. As an example, the 2011 SCS data release documentation and datasets can be found at http://www.icpsr.umich.edu/icpsrweb/ICPSR/studies/33081?searchSource=revise&q=SCS+2011&archive=ICPSR.
The following publications have been released using data from the SCS 2011:
Student Reports of Bullying and Cyber-Bullying: Results From the 2011 School Crime Supplement to the National Crime Victimization Survey (August 2013) can be viewed at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013329.
Indicators of School Crime and Safety: 2013 (June 2014) can be found at http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5008.
Seven of the 22 indicators in the report are based on SCS data. These include –
Indicator 3: Prevalence of Victimization at School
Indicator 8: Students’ Reports of Gangs at School
Indicator 10: Students’ Reports of Being Called Hate-Related Words and Seeing Hate-Related Graffiti
Indicator 11: Bullying at School and Cyber-Bullying Anywhere
Indicator 17: Students’ Perceptions of Personal Safety at School and Away From School
Indicator 18: Students’ Reports of Avoiding School Activities or Specific Places in School
Indicator 21: Students’ Reports of Safety and Security Measures Observed at School
SCS 2013 Data Schedule
SCS 2013 Datafile: ICPSR expects to release the datafile from the 2013 SCS on the ICPSR website in late 2014.
SCS 2013 Publications: Recurring reports from the 2013 collection, similar to those described above for the SCS 2011, will be released approximately six months after the data are approved for release.
SCS 2015 Data Schedule
SCS 2015 Datafile: Interviewing for the 2015 supplement will be conducted from January 2015 through June 2015. Processing of the survey will take place between January 2015 and December 2015. Computer-based clerical editing and coding will be completed by July 2015, and the computer processing, editing, imputation and weighting of the data will be completed by the end of November 2015. The Census Bureau will prepare a microdata user file with documentation, which is scheduled to be sent to ICPSR by January 2016. ICPSR generally takes 2-3 months to format the data and provide documentation.
SCS 2015 Publications: Recurring reports from the 2015 collection, similar to those described above for the SCS 2011, will be released approximately 6 months after the data are approved for release.
17. Display of Expiration Date
The OMB Control Number and the expiration date will be published on instructions provided to all respondents.
18. Exceptions to the Certification
N/A. There are no exceptions to the certification.
1 See attachment 1.
2 See attachment 2 for the full results of these analyses.
3 See attachment 3.
4 See attachment 4 for the full report from ICF.
5 Gladden, R.M., Vivolo-Kantor, A.M., Hamburger, M.E., & Lumpkin, C.D. Bullying Surveillance Among Youths: Uniform Definitions for Public Health and Recommended Data Elements, Version 1.0. Atlanta, GA; National Center for Injury Prevention and Control, Centers for Disease Control and Prevention and U.S. Department of Education; 2014. http://www.cdc.gov/violenceprevention/pdf/bullying-definitions-final-a.pdf.
6 See attachment 5 for the list of TRP members and attachment 6 for the agenda for their discussions.
7 Gladden, R.M., Vivolo-Kantor, A.M., Hamburger, M.E., & Lumpkin, C.D. Bullying Surveillance Among Youths: Uniform Definitions for Public Health and Recommended Data Elements, Version 1.0. Atlanta, GA; National Center for Injury Prevention and Control, Centers for Disease Control and Prevention and U.S. Department of Education; 2014. http://www.cdc.gov/violenceprevention/pdf/bullying-definitions-final-a.pdf.
8 See attachments 7 and 8.
9 Attachment 9.
10 See attachment 10 for an item by item review of changes from the 2013 to the 2015 versions.
11 See attachment 11 for a complete crosswalk of all SCS items from 2005 through 2015.
12 GAO, School Bullying: Extent of Legal Protections for Vulnerable Groups Needs to be More Fully Assessed, GAO-12-349 (Washington, D.C.: May 29, 2012) p.28. http://www.gao.gov/assets/600/591202.pdf.
13 Gladden, R.M., Vivolo-Kantor, A.M., Hamburger, M.E., & Lumpkin, C.D. Bullying Surveillance Among Youths: Uniform Definitions for Public Health and Recommended Data Elements, Version 1.0. Atlanta, GA; National Center for Injury Prevention and Control, Centers for Disease Control and Prevention and U.S. Department of Education; 2014. http://www.cdc.gov/violenceprevention/pdf/bullying-definitions-final-a.pdf.
14 Ibid., p. 7.
15 Ibid., p. 1.
16 GAO, School Bullying: Extent of Legal Protections for Vulnerable Groups Needs to be More Fully Assessed, GAO-12-349 (Washington, D.C.: May 29, 2012), p18. http://www.gao.gov/assets/600/591202.pdf.