Supporting Statement – 2017 School Crime Supplement to the National Crime Victimization Survey
B. Collection of Information Employing Statistical Methods
1. Universe and Respondent Selection
The sample universe is all children ages 12 to 18 living in NCVS-interviewed households who have attended school (grades 6 through 12) during the previous six months. The NCVS sample of households is drawn from the more than 120 million households nationwide and excludes military barracks and institutionalized populations. In 2017, the annual national sample is planned to be approximately 224,000 designated addresses located in 542 stratified Primary Sampling Units (PSUs) throughout the United States. The sample consists of seven parts, each of which is designated for interview in a given month and again at 6-month intervals.
Every 10 years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. In 2015, the 2000 sample design began to be phased out and the 2010 sample design began to be phased in. Although the PSUs did not change in 2015, some of the cases assigned to 2015 interviews were selected using 2010 design procedures from the Master Address File (MAF). The MAF contains all addresses from the most recent decennial census plus updates from the United States Postal Service, state and local address lists, and listings. The MAF is the frame used to reach the target NCVS population. Beginning in 2016, some PSUs will be removed from sample, some new PSUs will be added to the sample, and some continuing PSUs that were selected for both the 2000 and 2010 designs will remain in sample. The phase-in and phase-out of the designs will occur from January 2015 through December 2017. As part of the 2010 design, new addresses are selected each year from a master list of addresses based upon the 2010 Decennial Census of Population and Housing and addresses from the United States Postal Service. The new sample sizes are larger than in previous years to support state-level estimates in 22 states. In 2017, approximately 91% of the sample will be drawn from the 2010 design, with the remaining 9% from the 2000 design.
The NCVS uses a rotating sample that usually consists of seven groups for each month of enumeration. While the SCS is in the field (January–June 2017), two rotation groups selected as part of the 2000 sample design will remain in sample. These two rotation groups will be located only in continuing PSUs and will contain about 9% of all SCS sample units. The remaining sample will be divided into seven rotation groups that were selected as part of the 2010 sample design.
Each interview period, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screener questions (NCVS-1) of each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each crime incident identified in the screener. Following either the screener or the administration of the crime incident report, depending on whether a crime was reported, each household member ages 12 to 18 will be administered the SCS. Each household member provides the information by self-response or by proxy. For the NCVS, proxy respondents are allowed only under very limited circumstances and represent about 5% of all interviews. All forms and materials used for the NCVS screener and crime incident report have been previously approved by OMB (OMB No. 1121-0111). The 2017 SCS instrument is included in Attachment 2.
SAMPLING
Sample selection for the NCVS, and by extension the SCS, has three stages: the selection of primary sampling units (PSUs), the selection of address units within sample PSUs, and the determination of persons and households to be included in the sample.
Survey estimates are derived from a stratified, multi-stage cluster sample. The PSUs composing the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the decennial census and the American Community Survey (ACS). The larger PSUs are included in the sample with certainty and are considered self-representing (SR). The remaining PSUs, called non-self-representing (NSR) because only a subset of them is selected, are combined into strata by grouping PSUs with similar geographic and demographic characteristics. For the NCVS, decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting Program are also used to stratify the PSUs.
Stage 1. Defining and Selection of PSUs
Defining PSUs – Formation of PSUs begins with listing counties and independent cities in the target area; for the NCVS, the target area is the entire country. Counties are either grouped with one or more contiguous counties or form PSUs by themselves. The groupings are based on characteristics such as total land area, current and projected population counts, large metropolitan areas, and potential natural barriers such as rivers and mountains. The resulting county groupings are called PSUs.
After the PSUs are formed, the large PSUs and those in large urban areas are designated SR, while the smaller PSUs are designated NSR. Whether a PSU is considered small or large depends on the survey’s SR population cutoff, whether estimates are desired for the state, and the size of the metropolitan statistical area (MSA) that contains the PSU.
Stratifying PSUs – The NSR PSUs are grouped with similar NSR PSUs within states to form strata; each SR PSU forms its own stratum. The data used for grouping the PSUs consist of decennial census demographic data, ACS data, and administrative crime data. NSR PSUs are grouped to be as similar, or homogeneous, as possible. Just as the SR PSUs must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is the one that minimizes the between-PSU variance within strata.
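To illustrate the stratification objective, the following minimal Python sketch scores a candidate grouping of NSR PSUs by the variance among PSUs within each stratum. It is illustrative only and does not reproduce the Census Bureau’s production algorithm; the stratum names and PSU characteristic values are hypothetical.

```python
# Illustrative sketch only, not the Census Bureau's production method.
# Scores a candidate NSR stratification by summing the between-PSU
# variance within each stratum; lower scores mean more homogeneous strata.
from statistics import pvariance

def stratification_score(strata):
    """strata: {stratum_name: [PSU characteristic values]}."""
    return sum(pvariance(values) for values in strata.values())

# Hypothetical violent crime rates per 1,000 residents for six NSR PSUs.
candidate = {
    "stratum_A": [3.1, 3.4, 2.9],
    "stratum_B": [7.8, 8.2, 7.5],
}
print(round(stratification_score(candidate), 3))
```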
Selecting PSUs – The SR PSUs are automatically selected for sample or “selected with certainty.” One NSR PSU is selected from each stratum with probability proportional to the population size using a linear programming algorithm.
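The sketch below illustrates probability-proportional-to-size (PPS) selection of one PSU per stratum. The Census Bureau’s actual selection uses a linear programming algorithm; this cumulative-size draw is a simpler stand-in shown only to convey the idea, and the PSU population sizes are hypothetical.

```python
# Illustrative sketch only: one PSU drawn per stratum with probability
# proportional to population size (PPS). The production selection uses
# a linear programming algorithm; this cumulative-size draw is a
# simpler stand-in. PSU sizes below are hypothetical.
import random
from itertools import accumulate

def pps_select_one(psu_sizes, rng=random):
    """Return the index of one PSU, chosen PPS via cumulative sizes."""
    cumulative = list(accumulate(psu_sizes))
    r = rng.uniform(0, cumulative[-1])
    for index, upper_bound in enumerate(cumulative):
        if r <= upper_bound:
            return index

stratum_psu_populations = [120_000, 80_000, 200_000]
print(pps_select_one(stratum_psu_populations))
```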
Stage 2. Preparing Frames and Sampling within PSUs
Frame Determination – To ensure adequate coverage for the target population, the Census Bureau defines and selects sample from address lists called frames. The 2000 and 2010 sample designs use different frame systems. The 2000 sample design was selected from four frames: a unit frame, an area frame, a group quarters (GQ) frame, and a new construction or permit frame. The 2010 sample design was selected from a unit frame and a GQ frame.
In the 2000 design, each address in the country was assigned to one and only one of the four frames. Frame assignment depended on four factors:
what type of living quarters are at the address,
when the living quarters were built,
where the living quarters were built, and
how completely the street address was listed.
The main distinction between the 2000 and 2010 frames is the procedure used to obtain the sample addresses. In the 2000 design, the unit and GQ frames were static address lists from the 2000 Census, the permit frame came from building permit office updates, and the area frame required field staff to canvass and list all addresses within specific census blocks. Research has shown that the MAF, which is the source for both 2010 design frames, provides similar coverage to the 2000 design frames but with reduced costs.
In the 2010 design, each address in the country was assigned to the unit or GQ frame based on the type of living quarter. Two types of living quarters are defined in the decennial census. The first type is a housing unit (HU). An HU is a group of rooms or a single room occupied as separate living quarters or intended for occupancy as separate living quarters. An HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters.
The second type of living quarters is GQ. GQs are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 Census resided in GQs. Of those, less than half resided in non-institutionalized GQs. About 97% of the population counted in the 2010 Census lived in HUs.
Within-PSU Sampling – All of the Census Bureau’s continuing demographic surveys, such as the NCVS, are sampled together. This takes advantage of updates from the January MAF delivery and ACS data. In the 2010 sample design, about 28.6% of the HU sample is selected every year, although 57% of the cases selected for 2016 interviews were selected in 2015 to start the 2010 sample design. The GQ sample is selected every three years.
Selection of samples is done one survey at a time (sequentially). Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of under-coverage errors due to an outdated frame.
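As a rough illustration of the systematic procedure described above, the sketch below draws every k-th address from a geographically sorted frame after a random start. The frame contents and sampling interval are hypothetical, and the production system applies additional rules (such as the skeleton frame updates) not shown here.

```python
# Illustrative sketch only: systematic sampling from a frame that has
# been sorted by geography, so the selections spread across the PSU.
# Frame contents and the sampling interval are hypothetical.
import random

def systematic_sample(frame, interval):
    """Select every interval-th unit after a random start in [0, interval)."""
    start = random.uniform(0, interval)
    selections = []
    position = start
    while position < len(frame):
        selections.append(frame[int(position)])
        position += interval
    return selections

addresses = sorted(f"address_{i:04d}" for i in range(1000))
sample = systematic_sample(addresses, interval=50.0)
print(len(sample), sample[:3])  # 20 addresses, evenly spread across the frame
```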
Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey to be sampled. By leaving a clean universe for the next survey, duplication of addresses between surveys is avoided. This helps preserve response rates by ensuring that no unit falls into more than one survey sample.
Stage 3. Sample within Sample Addresses
The last stage of sampling is done during initial contact of the sample address during the data collection phase. For the SCS, if the address is a residence and the occupants agree to participate, an attempt is made to interview every person ages 12 to 18 who lives at the sample address and has completed the NCVS-1. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with names and other demographic information. If someone moves out of (or into) the household during the interviewing cycle, he or she is removed from (or added to) the roster.
State Samples
From July 2013 through December 2015, BJS and the Census Bureau boosted the existing national sample in the 10 largest states by population in order to test the feasibility, cost, and precision of state-level violent and property crime victimization estimates. Prior research conducted through the NCVS redesign had revealed that, by building on the existing sample in the most populous states, direct state-level three-year rolling estimates of violent victimization with a 10% relative standard error (RSE) could be generated for a reasonable cost.
Beginning in January 2016, BJS and the Census Bureau boosted the existing national sample in the 22 largest states: Arizona, California, Colorado, Florida, Georgia, Illinois, Indiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Jersey, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and Wisconsin. In each of the 22 states, enough sample will be selected to achieve a 10% RSE for a three-year average violent victimization rate of 0.02. BJS and NCES anticipate producing state-level estimates using the 2017 SCS, including estimates of bullying, given the prevalence of bullying during the last two survey administrations. Sample sizes in the remaining 28 states and the District of Columbia will be determined based on previous sample sizes. Unlike the 2000 sample design, no strata cross state boundaries, and all 50 states and the District of Columbia will have at least one sampled PSU.
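As a back-of-the-envelope check on this precision target, the sketch below solves for the number of interviews needed to achieve a 10% RSE on a rate of 0.02 under simple random sampling. The actual state allocations also reflect the design effect of the clustered NCVS sample and the pooling of three years of data, so the figures printed here are illustrative only.

```python
# Back-of-the-envelope sketch only. Under simple random sampling, the
# RSE of an estimated rate p from n interviews is sqrt((1 - p) / (n * p)).
# Solving for n gives the interviews needed to hit a target RSE; a
# design effect (deff) can be applied to inflate for clustering.
def required_n(rate, target_rse, design_effect=1.0):
    return design_effect * (1 - rate) / (rate * target_rse ** 2)

print(round(required_n(0.02, 0.10)))                     # 4900 under SRS
print(round(required_n(0.02, 0.10, design_effect=2.0)))  # 9800 with a hypothetical deff of 2
```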
2. Procedures for Collecting Information
The SCS is designed to produce national and 22 state-level estimates of school-related victimization for the target population – all children ages 12 to 18 living in NCVS households who have attended school during the previous six months. The SCS is administered to all age-eligible NCVS respondents during the 6-month period from January through June of 2017.
DATA COLLECTION
The SCS will be administered from January through June of 2017. Initially, each eligible person ages 12 to 18 is asked a short set of screener questions to determine whether he or she attended school, public or private, at any time during the current school year. Students are also excluded if they were home-schooled for the entire survey period, or if they were enrolled below 6th grade, in a GED program, or in college. Students who meet the school criteria are then administered the SCS core instrument.
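These screening rules can be summarized in a short sketch. The function below is illustrative only; the production CAPI instrument implements these rules through skip patterns, and the parameter names are hypothetical.

```python
# Illustrative sketch only: the SCS eligibility rules described above.
# Parameter names are hypothetical; the CAPI instrument applies these
# rules through its built-in skip patterns.
def scs_eligible(age, attended_school, home_schooled_all_period,
                 grade, in_ged_program, in_college):
    """Return True if an NCVS respondent should receive the SCS core instrument."""
    if not 12 <= age <= 18:
        return False
    if not attended_school or home_schooled_all_period:
        return False
    if in_ged_program or in_college:
        return False
    return grade >= 6  # grades 6 through 12 only

print(scs_eligible(14, True, False, 8, False, False))   # True
print(scs_eligible(17, True, False, 11, True, False))   # False: enrolled in a GED program
```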
The SCS instrument is divided into seven primary parts: 1) environment (school environment), 2) fighting, bullying, and hate behaviors, 3) avoidance, 4) fear, 5) weapons, 6) gangs, and 7) student characteristics. The environment section asks students about their school’s name, type, grade levels, access to school and building, student activities, school organizational features related to safety, academic and teaching conditions, student-teacher relations, and drug availability. Section two, fighting, bullying, and hate behaviors, asks students about the number and characteristics of physical fights, bullying, and hate-related incidents. Section three, avoidance, asks students whether they avoided certain parts of the school building or campus, skipped class, or stayed home entirely because of the threat of harm or attack. Section four, fear, follows up with questions on how afraid students feel at school and on their way to and from school. Section five, weapons, focuses on whether students carried weapons onto school grounds for protection or know of any students who have brought a gun to school. Section six, gangs, asks students about their perception of gang presence and activity at school. Finally, section seven asks students about their attendance and academic performance. Justifications for the sections and items can be found in Attachment 5.
3. Methods to Maximize Response Rates
Census Bureau staff mail an introductory letter (NCVS-572(L) or NCVS-573(L)) (Attachment 7 and Attachment 8) explaining the NCVS to the household before the interviewer's visit or call. When visiting a household, interviewers carry cards identifying them as Census Bureau employees. The Census Bureau trains interviewers to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. SCS response rate reports will be generated monthly and compared to the previous month’s average to ensure their reasonableness.
As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but obtains no interview. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer’s ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.
In addition to the above procedures used to ensure high participation rates, the Census Bureau has implemented additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1 and NCVS-2 exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and “don’t know” responses to a minimum.
The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by each Regional Office to assess whether performance and response rate standards are being met, and corrective action is taken to assist and discipline interviewers who do not meet the standards.
As was done in previous years, NCES will prepare a number of informational materials about the 2017 SCS for field representative (FR) distribution to parents and students. Designed as brochures, these materials will provide answers to frequently asked questions about the SCS and will be produced in both English and Spanish. The student brochure includes answers to such questions as “Do I have to take the survey?” and “Why are my answers to the survey important?” The parent brochure includes answers to such questions as “What is the purpose of the survey?” and “What questions are on the survey for my child?” The parent brochure will also include some illustrative survey findings from the 2013 SCS. Findings will not be included in the student brochure out of concern that they might bias student responses.
The 2017 brochures will be similar to those produced for 2015. The four 2017 brochures are as follows:
For Parents in English (Attachment 9)
For Students in English (Attachment 10)
For Parents in Spanish (Attachment 11)
For Students in Spanish (Attachment 12)
For the core NCVS, interviewers are able to obtain interviews with about 86% of household members in 82% of the occupied units in sample in any given month. Only household members ages 12 to 18 who have completed the NCVS-1 will be eligible for the SCS. The final overall response rates for the last two (2013 and 2015) administrations of the SCS were 51.2% and 47.7%, respectively (see Exhibit 5).
Based on the 2015 SCS response rates, we anticipate that about 60%, or about 8,889 persons, will complete the 2017 SCS. Exhibit 5 includes the final household completion rates, student completion rates (for NCVS respondents eligible for the SCS), and overall SCS response rates (calculated by multiplying the household completion rate by the student completion rate) from 1989 through 2015.
Exhibit 5: SCS final response rates
Year | Household Completion Rate (percent) | Student Completion Rate (percent) | Overall SCS Response Rate (percent)
1989 | 96.0 | 86.5 | 83.5
1995 | 95.1 | 77.5 | 73.7
1999 | 93.8 | 77.6 | 72.8
2001 | 93.1 | 77.0 | 71.7
2003 | 91.9 | 69.6 | 64.0
2005 | 90.6 | 61.7 | 56.0
2007 | 90.4 | 58.3 | 52.7
2009 | 91.7 | 55.9 | 51.3
2011 | 90.7 | 63.3 | 57.4
2013 | 85.5 | 59.9 | 51.2
2015 | 82.5 | 57.8 | 47.7
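The overall rates in Exhibit 5 are simply the product of the two completion rates, as the following minimal check of the 2015 row shows:

```python
# Minimal check of the Exhibit 5 arithmetic: the overall SCS response
# rate is the household completion rate times the student completion rate.
household_completion = 0.825   # 2015 household completion rate
student_completion = 0.578     # 2015 student completion rate
overall = household_completion * student_completion
print(f"{overall:.1%}")        # 47.7%, matching the 2015 row of Exhibit 5
```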
Exhibit 5 shows that 57.8% of the eligible SCS sample completed the 2015 SCS; the remaining 42.2% were noninterviews. Reasons for noninterviews include: (1) the person was an NCVS noninterview; (2) the person completed an NCVS interview but refused to complete the SCS interview; (3) the person was not available for an SCS interview and an acceptable proxy was not identified; (4) the person completed the NCVS in a language other than English or Spanish (only English and Spanish are available for the SCS); (5) the person was unable to complete the SCS for an unknown reason; or (6) an acceptable proxy respondent was identified but refused to give a proxy SCS interview. The majority (78%) of SCS noninterviews occurred because the person refused to respond to the NCVS and was therefore not eligible to complete the SCS. Approximately 7% of SCS noninterviews were parent refusals. The 2015 SCS response rate excluding the NCVS noninterviews was 86%.
OMB guidelines require a nonresponse bias analysis for all surveys with an overall unit response rate of less than 80 percent. The Census Bureau therefore conducted a unit nonresponse bias analysis for the SCS in 2005, 2007, 2009, 2011, 2013, and 2015, when unit response rates fell below that threshold. Nonresponse can greatly affect the strength and application of survey data: it increases variance by reducing the effective size of the sample, and it can produce bias if the nonrespondents differ from the respondents on characteristics of interest.
For nonresponse bias to occur, subgroups of the sample must differ both in their response rates and in their responses to particular survey variables; the magnitude of unit nonresponse bias is determined by the response rate and by the differences between respondents and nonrespondents on key survey variables. Although the bias analysis cannot measure response bias directly, because the SCS is a sample survey and it is not known how the full population would have responded, the SCS sampling frame includes four key student or school characteristic variables known for both respondents and nonrespondents: sex, race/ethnicity, household income, and urbanicity, all of which are associated with student victimization. To the extent that response rates and responses do not differ across these groups, nonresponse bias is less of a concern.
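The standard deterministic expression for unit nonresponse bias in a mean makes the dependence on both factors explicit: bias equals the nonresponse rate times the difference between respondent and nonrespondent means. A minimal sketch with hypothetical inputs:

```python
# Illustrative sketch only: the deterministic expression for unit
# nonresponse bias in an estimated mean,
#   bias = (1 - response_rate) * (respondent_mean - nonrespondent_mean).
# All input values below are hypothetical.
def nonresponse_bias(response_rate, respondent_mean, nonrespondent_mean):
    return (1 - response_rate) * (respondent_mean - nonrespondent_mean)

# Hypothetical: 58% respond; respondents report a 3% victimization rate,
# while nonrespondents (had they answered) would have reported 5%.
print(round(nonresponse_bias(0.58, 0.03, 0.05), 4))  # -0.0084: a 0.84-point understatement
```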
Given that the SCS has a relatively small sample and the subgroups represent even smaller numbers, we would expect some fluctuation in nonresponse bias estimates from year to year. Where the same subgroups show nonresponse bias in two consecutive SCS reports, there is stronger evidence that those subgroups have a nonresponse bias associated with them. This information is used in calibrating weights to mitigate nonresponse bias in the public data set.
In 2015, for both the NCVS interview and the SCS interview, the age, sex, race, Hispanic origin, urbanicity, and region groups all had response distributions significantly different from the population, meaning that nonresponse bias is a potential issue (Attachment 13). For the SCS interview, respondents represented significantly more of the age 14 group than of the other age categories. Respondent representation in the Asian race category was significantly lower than in other race categories. Response for the rural subgroup was significantly higher than for the urban subgroup, and response for the Northeast was significantly lower than for all other regions. Beginning in 2015, the Census Bureau calibrated the SCS person weights to population controls by age and Hispanic origin to mitigate potential bias identified in 2011 and 2013.
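The calibration described above can be sketched as raking (iterative proportional fitting) of person weights to population control totals on two margins. The code below is a minimal illustration under hypothetical counts and categories, not the Census Bureau’s production weighting system.

```python
# Illustrative sketch only: raking (iterative proportional fitting) of
# person weights to population controls on two margins, similar in
# spirit to the age and Hispanic-origin calibration described above.
# All categories, controls, and starting weights are hypothetical.
def rake(records, margins, iterations=10):
    """records: dicts with a 'weight' key plus one key per margin variable.
    margins: {variable: {category: population control total}}."""
    for _ in range(iterations):
        for var, controls in margins.items():
            totals = {category: 0.0 for category in controls}
            for rec in records:
                totals[rec[var]] += rec["weight"]
            for rec in records:  # scale weights to hit this margin's controls
                rec["weight"] *= controls[rec[var]] / totals[rec[var]]
    return records

sample = [
    {"age": "12-14", "hispanic": "yes", "weight": 100.0},
    {"age": "12-14", "hispanic": "no",  "weight": 120.0},
    {"age": "15-18", "hispanic": "yes", "weight": 90.0},
    {"age": "15-18", "hispanic": "no",  "weight": 110.0},
]
controls = {
    "age":      {"12-14": 250.0, "15-18": 250.0},
    "hispanic": {"yes": 150.0, "no": 350.0},
}
rake(sample, controls)
print([round(rec["weight"], 1) for rec in sample])  # weights now match both margins
```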
Item response rates for most SCS survey items in all survey years were high – typically over 95% of all eligible respondents – meaning there is little potential for item nonresponse bias for most items in the survey. Weights were developed to compensate for differential probabilities of selection and for nonresponse. The weighted data in each year permit inferences about the eligible student population enrolled in schools during that SCS data year.
4. Final Testing of Procedures
The revised 2017 SCS questionnaire underwent cognitive testing by the Center for Survey Measurement staff at the Census Bureau, under the NCES generic clearance for cognitive, pilot, and field test studies (OMB number 1850-0803), from May to July of 2016. The cognitive testing was primarily focused on the section of questions about students’ experiences with bullying in schools.
The testing was conducted in three rounds, using an iterative methodology to identify and address problematic questions at the end of each round. The iterative method allowed for assessment of whether revised question wording addressed the problems interviewers observed during the previous rounds. Evidence from the study indicated that the original questions (the questions included on the 2015 SCS) in the bullying section generally performed well. Most of the questions were easy for interviewers to administer and easy for respondents to understand and answer, and thus required no revisions. However, some questions did require revisions.
One question that required revision was intended to measure the likelihood of bullying behaviors being repeated. Repetition is a key component of the CDC uniform definition of bullying. The phrase “over and over,” used in the 2015 SCS, was revised after round one of testing because cognitive testing indicated it was open to interpretation and could contribute to measurement error. The question was revised to read “did you think the bullying would happen again?” This revised wording performed well and was easy for respondents to understand and answer, and the question is included in the final version of the questionnaire (Q24).
The frequency-of-bullying question went through multiple revisions during cognitive testing. This question, along with the question mentioned above, is used to measure repetition of bullying behaviors. Multiple versions of the question were included in each round of testing (e.g., asking how often, how many times, how many days, and when the bullying started and stopped). Respondents were not only asked questions about their own bullying experiences, but were also given a vignette and asked to answer questions about the scenario it outlined. The vignette was used to determine how well the questions worked for situations in which the bullying was limited to a particular time frame and did not occur regularly throughout the school year. After round three of testing, the decision was made to use the version of the question that a majority of the students said was easiest to understand and answer, which asks about the number of days on which students were bullied (Q23a). A follow-up question was added to determine whether students who reported experiencing bullying behaviors on only one day experienced multiple behaviors throughout that single day (Q23b). This follow-up question was not tested during the SCS cognitive testing; however, it was modeled after a question that was cognitively tested for the 2016 NCVS Supplemental Victimization Survey (SVS) on stalking.
Another revision to the 2017 SCS involved the questions examining a power imbalance between the victim and the perpetrator(s), another key component of the CDC uniform definition. For the first round of cognitive testing, the power imbalance question on the 2015 SCS was split into four separate sub-items, three measuring a type of power imbalance and the fourth serving as a catch-all asking whether the bully had more power than the respondent “in another way.” Cognitive testing indicated that the survey would benefit from a new item measuring a dimension of power imbalance not originally asked about: the bully’s ability to influence what other students think of the respondent. These questions performed well after these revisions were made and are included on the final questionnaire (Q27).
The final report outlining these and other recommendations from the cognitive testing, along with the testing protocols, is included with this package as Attachments 4, 14, 15, and 16.
By November of 2016, the Census Bureau will program the survey into an automated computer-assisted personal interviewing (CAPI) instrument. Census Bureau staff, including instrument developers and project management staff, will conduct internal testing of the CAPI instrument.
Interviewers will be provided with an SCS self-study training, which must be completed before initiating any interviews. Interviewer training is usually conducted a month before the first month of interviewing, allowing interviewers time to familiarize themselves with the survey content and any special instrument functionality specific to conducting SCS interviews.
5. Contacts for Statistical Aspects and Data Collection
BJS and NCES take responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, and questionnaires and overseeing the conduct of the studies and analysis of the data by contractors.
The Census Bureau is responsible for the testing of interview materials and the collection of all data. Ms. Meagan Meuchel is the NCVS Survey Director at the Census Bureau and manages and coordinates the NCVS and SCS. BJS, NCES, and Census Bureau staff responsible for the SCS include:
BJS Staff (all staff located at 810 7th Street, NW, Washington, DC 20531):
Jeri Mulrow, Acting Director
Michael Planty, Ph.D., Deputy Director, Statistical Collections Division
Lynn Langton, Ph.D., Chief, Victimization Statistics Unit
Rachel Morgan, Ph.D., Statistician, Victimization Statistics Unit
Jennifer Truman, Ph.D., Statistician, Victimization Statistics Unit

NCES Staff (all staff located at 550 12th Street, SW, Washington, DC 20202):
Marilyn Seastrom, Ph.D., Chief Statistician
Chris Chapman, Associate Commissioner, Sample Surveys Division
Andy Zukerberg, Chief, Cross-Sectional Surveys Branch
Rachel Hansen, Project Director, Cross-Sectional Surveys Branch
Maura Spiegelman, Associate Education Research Scientist, Cross-Sectional Surveys Branch

Census Bureau Staff (all staff located at 4600 Silver Hill Road, Suitland, MD 20746):
Meagan Meuchel, NCVS Survey Director, Associate Directorate for Demographic Programs – Survey Operations
Jill Harbison, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations
Timothy Gilbert, Survey Statistician, Associate Directorate for Demographic Programs – Survey Operations
David Hornick, Lead Scientist, Demographic Statistical Methods Division