Supporting Statement – 2022 National Crime Victimization Survey (NCVS) School Crime Supplement (SCS)
B. Collection of Information Employing Statistical Methods
Universe and Respondent Selection
The sample universe for the National Crime Victimization Survey (NCVS) School Crime Supplement (SCS) is all persons age 12 to 18 living in NCVS interviewed households who have attended public or private school during the current school year (grades 6 through 12).1 Students are eligible for the SCS if they were homeschooled for part of the school year and attended a public or private school during the other part of the school year, or attended a homeschool cooperative in person.2 Students who were homeschooled the entire school year are ineligible for the SCS.
The potential universe for the NCVS national sample is all persons age 12 or older in the more than 120 million U.S. households and persons 12 or older living in non-institutional group quarters (GQ) (except crews of vessels, military in barracks, and those at domestic violence shelters or living quarters for victims of natural disasters). In 2022, the annual NCVS national sample is planned to be approximately 254,000 designated addresses located in 542 stratified Primary Sampling Units (PSUs) throughout the United States. From January through June 2022, when the 2022 SCS is in the field, the NCVS national sample will include about 127,000 designated addresses.
Frame
The Master Address File (MAF) contains all addresses from the most recent decennial census plus updates from the United States Postal Service, state and local address lists, and other address listing operations. The MAF is the frame for the target NCVS population. Every ten years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. In general, the purpose of these redesigns is to capture population shifts measured by the most recent decennial census.
In January 2015, the Census Bureau began phasing out the 2000 sample design and phasing in the 2010 sample design, a transition that continued through December 2017. Beginning in 2016, some PSUs were removed from the sample, some new PSUs were added, and some continuing PSUs that were selected for both the 2000 and 2010 designs remained in the sample. The 2018 NCVS was the first full year of the phased-in 2010 design, in which all PSUs and addresses came from the 2010 design. The new NCVS sample sizes are larger than in previous years to support state-level estimates in 22 states.
Rotating Panel Design
The NCVS uses a rotating panel design. The sample consists of seven groups for each month of enumeration. Each of these groups stays in the sample for an initial interview and six subsequent interviews, for a total of seven interviews for the typical household. During the 6-month period when the SCS is administered, the full sample of rotation groups will be interviewed, with one-sixth of the sample interviewed each month. One rotation group enters the sample for its first interview each month.
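The rotation logic can be sketched briefly. The code below is purely illustrative (the group numbering and 6-month field-period framing are assumptions for the example, not the Census Bureau's actual rotation chart); it shows how a group entering the sample receives seven interviews at 6-month intervals.

```python
# Illustrative sketch of a rotating panel schedule. The numbering here is
# hypothetical, not the Census Bureau's actual rotation chart.

def interview_months(entry_month, interviews=7, spacing=6):
    """Months (0-indexed) in which a group entering at `entry_month` is
    interviewed: the initial interview plus six more at 6-month gaps."""
    return [entry_month + i * spacing for i in range(interviews)]

# One new group enters each month; groups entering in months 0-5 together
# cover every month of a 6-month field period such as the SCS window.
for entry in range(6):
    print(f"group entering month {entry}: interviewed in months {interview_months(entry)}")
```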
Sample Selection
The sample design for the NCVS is a stratified, multi-stage cluster sample. Sample selection for the NCVS, and by default the SCS, has three stages: the selection of PSUs, the selection of address units within sample PSUs, and the determination of persons and households from those addresses to be included in the sample.
Stage 1. Defining and Selecting PSUs
Defining PSUs – Formation of PSUs begins with listing counties and independent cities in the target area. For the NCVS, the target area is all 50 states and the District of Columbia. The PSUs comprising the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the most recent decennial census and the American Community Survey (ACS). Counties are either grouped with one or more contiguous counties to form PSUs or are PSUs all by themselves. For counties that are grouped, the groupings are based on certain characteristics such as total land area, current and projected population counts, large metropolitan areas, and potential natural barriers such as rivers and mountains. For the NCVS, decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting program are also used to stratify the PSUs. The resulting county groupings are called PSUs.
After the PSUs are formed, the large PSUs and those in large urban areas are designated self-representing (SR). The smaller PSUs are designated non-self-representing (NSR). Determining which PSUs are considered small and which are considered large depends on the survey’s SR population cutoff, whether estimates are desired for the state, and the size of the metropolitan statistical area (MSA) that contains the PSU.
Stratifying PSUs – For the 2010 design, the NSR PSUs are grouped with similar NSR PSUs within states to form strata. Each SR PSU forms its own stratum. The data used for grouping the PSUs are also based on decennial census demographic data, ACS data, and administrative crime data. NSR PSUs are grouped to be as similar, or homogeneous, as possible. Just as the SR PSUs must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is determined by minimizing the variance both between and within PSUs.
Selecting PSUs – The SR PSUs are automatically selected for sample or “selected with certainty.” NSR PSUs are sampled with probability proportional to the population size using a linear programming algorithm. One PSU is selected from each NSR stratum. The 2010 design NCVS sample includes 339 SR PSUs and 203 NSR PSUs. PSUs are defined, stratified, and selected once every ten years. The 2010 design sample PSUs were sampled using population data from the 2010 Census.
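As a simplified illustration of selecting one PSU per NSR stratum with probability proportional to size, a minimal sketch follows. The stratum and its population counts are hypothetical, and the Census Bureau's actual procedure uses a linear programming algorithm rather than a single random draw.

```python
import random

# Hypothetical NSR stratum: candidate PSUs with illustrative population counts.
stratum = {"PSU A": 250_000, "PSU B": 150_000, "PSU C": 100_000}

def select_one_pps(psus, rng):
    """Select one PSU with probability proportional to population size."""
    names, sizes = zip(*psus.items())
    return rng.choices(names, weights=sizes, k=1)[0]

rng = random.Random(2022)
print(select_one_pps(stratum, rng))
# "PSU A" has a 250,000 / 500,000 = 0.5 chance of selection.
```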
Stage 2. Preparing Frames and Sampling within PSUs
Frame Determination – The 2010 sample design selects its sample from two dynamic address-based sampling frames, one for housing units and one for group quarters. Both frames are based upon the MAF, which is a national inventory of addresses. The MAF is continually updated by various Census Bureau programs and external sources. New housing units are added to the MAF, and therefore the NCVS sampling frame, through semiannual updates from a variety of address sources, including the U.S. Postal Service Delivery Sequence File, local government files, and field listing operations.
In the 2010 design, each address in the country was assigned to the sampling frame based on the type of living quarters. Two types of living quarters are defined in the decennial census. The first type is a housing unit (HU). An HU is a group of rooms or a single room occupied as separate living quarters or intended for occupancy as separate living quarters. An HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters.
The second type of living quarters is group quarters (GQ). GQs are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 Census resided in GQs. Of those, less than half resided in non-institutionalized GQs. About 97% of the population counted in the 2010 Census lived in HUs.
Within-PSU Sampling – All of the Census Bureau’s continuing demographic surveys, including the NCVS, are sampled together. This procedure takes advantage of updates from the January MAF delivery and ACS data. This within-PSU selection occurs every year for HUs and every three years for GQs.
Selection of samples is done sequentially, one survey at a time. Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of under-coverage errors due to an outdated frame.
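A minimal sketch of systematic sampling from a sorted frame appears below. The frame records and sample size are hypothetical; the actual NCVS frames are sorted by geographic variables before selection.

```python
import random

def systematic_sample(frame, n, rng):
    """Select n units by taking every k-th record after a random start."""
    k = len(frame) / n            # sampling interval
    start = rng.uniform(0, k)     # random start within the first interval
    return [frame[int(start + i * k)] for i in range(n)]

# Hypothetical sorted frame of addresses (geographic sort keys omitted).
frame = [f"address-{i:05d}" for i in range(10_000)]
sample = systematic_sample(frame, n=25, rng=random.Random(7))
print(sample[:3])
```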
Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey that is subsequently sampled. By leaving a clean universe for the next survey, duplication of addresses across surveys is avoided. This is done to help preserve response rates by ensuring that no unit falls into more than one survey sample.
Stage 3. Persons within Sample Addresses
The last stage of sampling is done during the initial contact of the sample address during the data collection phase. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with names and other demographic information. If someone moves out of (or into) the household during the interviewing cycle, he or she is removed from (or added to) the roster. For the SCS, if the address is a residence and the occupants agree to participate, an attempt is made to interview every person ages 12 to 18 who lives at the sampled address and has completed the NCVS-1.
The expected NCVS sample size for January through June 2022 is 127,000 households. Approximately 2,332 persons a month, ages 12 to 18, in these households will be eligible to be interviewed for the supplement during the SCS administration for a total of 13,992 possible interviews. Generally, interviewers are able to obtain SCS interviews with approximately 50% of the SCS eligible household members in occupied units in sample in any given month. A total of 7,010 persons ages 12 to 18 are expected to be interviewed for the SCS during the 6-month collection period.
State Samples
Beginning in January of 2016, BJS and the Census Bureau increased and reallocated the existing national sample in the 22 most populous states. The states receiving a sample boost include Arizona, California, Colorado, Florida, Georgia, Illinois, Indiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Jersey, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and Wisconsin. In 2017, each of these 22 states had a population greater than 5 million persons and in total these 22 states comprised 79% of the U.S. population.3 The underlying assumption of the subnational sample design is that three years of data will be needed to produce precise estimates of violent crime, which is experienced by about 1% of the population. Sample sizes in the remaining 28 states and the District of Columbia were determined to ensure full representation and unbiased estimates at the national level. For the 2010 design, unlike the 2000 sample design, no strata cross state boundaries and all 50 states and the District of Columbia have at least one sampled PSU.
Weighting and Estimation
The purpose of the SCS is to make inferences about school-related victimizations for the population of students ages 12 to 18 in the U.S. Before such inferences can be drawn, it is necessary to adjust, or weight, the sample of people to ensure it is similar to the entire population in this age group. The SCS weights are a combination of household-level and person-level adjustment factors. Household and person respondents from the NCVS sample are adjusted on a bi-annual basis to represent the U.S. population age 12 or older. For the SCS, the population is restricted to students ages 12 to 18 who attend public school, private school, or a homeschool cooperative in person during the current school year.
NCVS household and person weights are first adjusted to account for any subsampling that occurs within large GQs. The NCVS nonresponse weighting adjustment then allocates the sampling weights of nonresponding households and persons to respondents with similar characteristics. Additional factors are then applied to correct for the differences between the sample distributions of age, race, Hispanic origin, and sex and the population distributions of these characteristics. The resulting weights are assigned to all interviewed households and persons in the NCVS file.
SCS weighting begins with the NCVS final person weight, which is then multiplied by an SCS noninterview adjustment factor. SCS noninterview adjustment factors are computed by distributing the weights of SCS noninterviews to the weights of the SCS interviews, with adjustment cells determined by age, race, Hispanic origin, and sex. The result is an SCS person-level weight that can be used for producing estimates from the SCS variables.
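The cell-based noninterview adjustment described above can be sketched as follows. The records and cell labels are hypothetical (actual adjustment cells are defined by age, race, Hispanic origin, and sex); the factor in each cell is the weight total for all eligible persons divided by the weight total for SCS respondents.

```python
from collections import defaultdict

# Hypothetical records: (adjustment cell, NCVS final person weight, SCS respondent?)
persons = [
    ("cell-1", 1200.0, True), ("cell-1", 1100.0, False), ("cell-1", 900.0, True),
    ("cell-2", 1500.0, True), ("cell-2", 1300.0, True),  ("cell-2", 1000.0, False),
]

eligible_wt = defaultdict(float)    # weight totals: interviews + noninterviews
respondent_wt = defaultdict(float)  # weight totals: interviews only
for cell, wt, responded in persons:
    eligible_wt[cell] += wt
    if responded:
        respondent_wt[cell] += wt

# Noninterview adjustment factor per cell, applied to respondents' weights
# to produce the SCS person-level weight.
factors = {c: eligible_wt[c] / respondent_wt[c] for c in eligible_wt}
scs_weights = [(cell, wt * factors[cell]) for cell, wt, responded in persons if responded]
print(factors)       # cell-1: 3200/2100 ≈ 1.52; cell-2: 3800/2800 ≈ 1.36
```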
Variance Estimates
The NCVS and SCS estimates come from a sample, so they may differ from figures from an enumeration of the entire population using the same questionnaires, instructions, and enumerators. For a given estimator, the average squared difference between estimates based on repeated samples and the estimate that would result if the sample were to include the entire population is known as sampling error.4 The sampling error quantifies the amount of uncertainty in an estimate as a result of selecting a sample.
Variance estimates can be derived using direct estimation or generalized variance functions (GVFs); both methods are used to produce SCS statistical estimates. Direct estimation relies on replication methods, which provide estimates of variance for a wide variety of probability sample designs even when complex estimation procedures are used. Replication requires that sample selection, data collection, and estimation procedures be carried out (i.e., replicated) several times. In addition, the Census Bureau produces parameters for GVFs that estimate the variance of any crime count estimate based on the value of the estimate. To do this, estimates and their relative variances are fit to a regression model using an iterative weighted least squares procedure, where the weight is the inverse of the square of the predicted relative variance.
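For illustration, one common GVF parameterization expresses the variance of an estimate x̂ as Var(x̂) = a·x̂² + b·x̂. The sketch below evaluates a standard error under that form; the parameters are made up for the example and are not published NCVS GVF values.

```python
import math

def gvf_standard_error(estimate, a, b):
    """Standard error from a GVF of the form Var(x) = a*x**2 + b*x,
    one common parameterization; a and b come from the fitted model."""
    return math.sqrt(a * estimate**2 + b * estimate)

a, b = -0.00002, 3500.0   # illustrative parameters only
x_hat = 250_000           # hypothetical victimization count estimate
se = gvf_standard_error(x_hat, a, b)
print(f"estimate = {x_hat:,}, SE ≈ {se:,.0f}")
print(f"95% CI ≈ ({x_hat - 1.96 * se:,.0f}, {x_hat + 1.96 * se:,.0f})")
```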
2. Procedures for Collecting Information
The SCS is designed to calculate national estimates of school-related victimization for the target population – all children ages 12 to 18 living in NCVS households who have attended public or private school during the current school year, or a homeschool cooperative in person. The SCS will be administered to all NCVS respondents ages 12 to 18 during the 6-month period from January through June 2022.
For the 6-month period, January through June 2022, the SCS will be administered to approximately 127,000 designated households. The NCVS uses a rotating sample that consists of seven groups for each month of enumeration. Each HU selected for the NCVS remains in the sample for three years, with each of seven interviews taking place at 6-month intervals.
The NCVS-500 (Control Card) is used to complete a household roster with names and other demographic information of the household members. For some demographic questions that are asked directly of respondents, flashcards are used, such as for education, race, Hispanic origin, employment, and household income.
Respondents are asked to report victimization experiences occurring in the six months preceding the month of interview. The NCVS Crime Screener instrument (NCVS-1) is asked of all respondents age 12 years or older in the household and is used to ascertain whether the respondent has experienced a personal crime victimization during the prior six months and is therefore eligible to be administered the NCVS Crime Incident Report instrument (NCVS-2). The NCVS-1 collects the basic information needed to determine whether the respondent experienced a crime victimization (rape or other sexual assault, robbery, aggravated or simple assault, personal larceny, burglary, motor vehicle theft, or other types of household theft).
When a respondent reports an eligible personal victimization, the NCVS-2 is then administered to collect detailed information about the crime incident. The NCVS-2 is administered for each incident the respondent reports. For each victimization incident, the NCVS-2 collects information about the offender (e.g., sex, race, Hispanic origin, age, and victim-offender relationship), characteristics of the crime (including time and place of occurrence, use of weapons, nature of injury, and economic consequences), whether the crime was reported to police, reasons the crime was or was not reported, and victim experiences with the criminal justice system. Clearance for the core NCVS forms and materials, including the NCVS-500, NCVS-1, and NCVS-2, is requested through a separate OMB submission (OMB Control No: 1121-0111).
At each interview, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screener questions (NCVS-1) for each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each reported crime incident identified in the crime screener. Once the NCVS interview is completed (i.e., nonvictims responded to all NCVS-1 screening questions or victims completed all necessary NCVS-2 incident reports), the interviewer administers the SCS questionnaire to persons ages 12 to 18.
If the interview occurs during the first contact with a household that is new to the sample, the interview is typically conducted in person. Households that have been previously interviewed, that is, those in their second through seventh interviews, are interviewed by telephone whenever possible. A little over half (56%) of all interviews conducted each month are by telephone.
SCS Collection
The SCS is designed to calculate national estimates of school-related victimization for the target population – all children ages 12 to 18 living in NCVS households who have attended school during the current school year.
Initially, each eligible person ages 12 to 18 is asked a short set of screener questions to determine if they attended school, either private or public, at any time during the current school year. Respondents are also asked if they were homeschooled or attended a homeschool cooperative during the current school year. Students are ineligible if they were homeschooled for the entire school year and did not attend a homeschool cooperative in person, or if they were enrolled in a grade below 6th, in a GED program, or in college. Students who meet the school criteria are then administered the SCS core instrument.
In the spring of 2020, the COVID-19 pandemic prompted many school buildings across the United States to close. As a result, school districts implemented different approaches to learning and continuing students' education. To account for changes with regard to virtual learning or remote instruction, and for whether students participated in a homeschooling cooperative (co-op), additional changes were incorporated into the 2022 SCS questionnaire. Because some homeschool co-ops resemble private schools and their students meet in person, it was appropriate to add a question about in-person co-op attendance. Homeschooled students will also be asked about the primary reason behind the decision to homeschool and whether it is related to the school environment or bullying. These changes are described in Attachment 4.
The SCS instrument is divided into eight primary parts. Specific rationale for each question can be found in Attachment 4. The sections include –
Screener questions – establishes eligibility of the respondent being interviewed.
Environmental (school environment) – asks students about their school’s name, type of school, how they attended school (in-person, virtual, etc.), grade levels, access to school and building, student activities, school organizational features related to safety, academic and teaching conditions, student-teacher relations, and drug availability.
Fighting, bullying, and hate behaviors – asks students about the number and characteristics of physical fights, bullying, and hate-related incidents.
Avoidance – asks students whether they avoided certain parts of the school building or campus, skipped class, or stayed home entirely because of the threat of harm or attack.
Fear – follows up with questions on how afraid students feel in and on their way to and from school.
Weapons – focuses on whether students carried weapons on school grounds for protection or know of any students who have brought a gun to school.
Gangs – asks students about their perception of gang presence and activity at school.
Student characteristics – asks students about their attendance and academic performance.
The 2022 SCS questionnaire is a modified version of the 2019 SCS (see Attachment 5). The 2019 SCS included a split-sample design with two versions. While there are several minor question and word-choice differences between the versions, the primary difference is the use of the word "bullying" in the series of bullying questions. Version 1 included the word "bullying" in the questionnaire, while Version 2 excluded the word entirely. The word "bullying" was omitted from Version 2 in order not to influence students' opinions on the types of behaviors that constitute bullying. After cognitive testing and discussion about preserving the historical trend data, it was decided to use only Version 1 for the 2022 SCS. Some questions were added, some were amended for word choice, and some questions from 2019 Version 2 were retained in the 2022 SCS. Below is a summary of changes to the 2022 SCS.
Screener questions were modified in order to ask about in-person and virtual schooling through a public or private school (items 1a and 1b in 2022 SCS).
A screener question was modified in order to ask about homeschooling (item 1c in 2022 SCS) and a new question was added to ask respondents who reported being homeschooled if they attended a homeschool cooperative in-person during the current school year (item 1d in 2022 SCS).
A screener question was added to determine if respondents participated in virtual learning or homeschooling because of the COVID-19 pandemic (item 1e in 2022 SCS).
A series of questions was added in order to determine the primary reason that homeschooled students were homeschooled (items 1f1-1f8).
July was added to the options for when the school year began for students (item 3 in 2022 SCS).
Questions on transportation to and from school were amended to specify “in-person” schooling (items 7-8 in 2022 SCS).
Wording for the question on student activities was taken from 2019 Version 2 (items 9a-9h in 2022 SCS).
A question was added asking students if they had observed another student who was under the influence of illegal drugs or alcohol while attending virtual school (item 20a in 2022 SCS).
Wording for the question on drugs was taken from 2019 Version 2 (items 19a-19e in 2022 SCS).
The word "bullying" is included in the gate question of the bullying section, as in 2019 Version 1. In addition, the use of technologies as a means of bullying, which was included in 2019 Version 2, is mentioned (item 22a in 2022 SCS).
The types-of-bullying questions combine items from 2019 Version 1 and 2019 Version 2 (items 22a-22h in 2022 SCS).
Response categories from 2019 Version 1 are used for the number of days the bullying behaviors occurred (item 23a in 2022 SCS).
Response categories from 2019 Version 2 are used for the number of times the bullying behaviors occurred (item 23b in 2022 SCS).
An "Other, specify" option from 2019 Version 2 was added to the section on student bullying and power (item 27f in 2022 SCS).
Questions about family and dating relationships from 2019 Version 2 are included (items 28a-28d in 2022 SCS).
Answer categories for where students were bullied at school are taken from 2019 Version 2 (item 29 in 2022 SCS).
One of the sub-questions asking the student whether the bullying was related to certain personal characteristics was amended to avoid using the term “disability” (item 32d in 2022 SCS).
A question was added asking students if they had observed hate-related words, pictures, videos, or symbols on the online school platforms (item 35b in 2022 SCS).
A question was added asking students if they skipped virtual classes because they were afraid of other students (item 36a in 2022 SCS).
Wording for the question on gangs was taken from 2019 Version 2 (intro to item 42a in 2022 SCS).
"Mostly passes" and "mostly fails" were added to the options for the types of grades that students could receive (item 43 in 2022 SCS).
As in prior years, the 2022 SCS responses will be linked to the NCVS survey instrument responses for a more complete understanding of the individual student’s experiences with victimization outside of the school environment. Demographic and household characteristics of the individual student can also be examined through this linking. This integration of the two surveys allows for a more complete understanding of individual students’ circumstances and the relationships between victimization in and out of school.
Methods to Maximize Response
Contact Strategy
The Census Bureau (Census) mails an introductory letter (NCVS-572(L)) or continuing household letter (NCVS-573(L)) explaining the NCVS to the household before the interviewer's visit or call (Attachments 6 and 7). During the SCS data collection months, Census also mails an SCS parent and student (English) brochure to each NCVS household with their NCVS letter. When they go to a household, the interviewers carry cards identifying them as Census Bureau employees. Potential respondents are assured that their answers will be held in confidence and used only for statistical purposes. For respondents who have questions about the NCVS, interviewers provide a brochure (NCVS-110) and can also reference information in their Information Card Booklet (NCVS-554), which contains information such as uses of NCVS data and frequently asked questions and answers. At the discretion of the field representative (FR), a thank-you letter is sent to the household (NCVS-593(L)). All forms and materials used for contact with the household have been previously approved by OMB (OMB Control No: 1121-0111).
The Census Bureau trains interviewers to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. The interviewer obtains demographic characteristics of noninterview persons for use in the adjustment for nonresponse. SCS response rates are monitored on a monthly basis and compared to the previous month’s average to ensure their reasonableness.
As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but obtains no interviews. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer's ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to avoid refusals. Furthermore, the Census office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as the interviewer reports a new refusal, to encourage participation and to reiterate the importance of the survey and of the household's response. For the 2019 SCS, Census regional offices sent SCS-specific letters to parents to encourage their child's participation in the survey.
If resources are available, NCES will prepare a number of informational materials about the 2022 SCS for FR distribution to parents and students. These materials have been distributed during prior SCS administrations and will provide answers to frequently asked questions about the SCS, and they will be produced in both English and Spanish. The student brochure includes the answers to such questions as “Do I have to take this survey?” and “Why are my answers to the survey important?” The parent brochure includes answers to such questions as “What is the purpose of this survey?” and “What questions are on the survey for my child?” The parent brochure will also include some illustrative survey findings from the SCS. Findings will not be included on the student brochure out of concern that they might bias student responses.
Interviewer Training
Training for NCVS interviewers consists of classroom and on-the-job training. Initial training for interviewers consists of a full day pre-classroom self-study, four-day classroom training, post-classroom self-study, and on-the-job observation and training. Initial training includes topics such as protecting respondent confidentiality, gaining respondent cooperation, answering respondent questions, proper survey administration, use of systems to collect and transmit survey data, NCVS concepts and definitions, and completing simulated practice NCVS interviews. The NCVS procedures and concepts taught in initial training are also regularly reinforced for experienced NCVS interviewers. This information is received via monthly written communications, ongoing feedback from observations of interviews by supervisors, and monthly performance and data quality feedback reports.
NCVS interviewers also receive specific training on the SCS including eligibility, the organization of the SCS interview, content of the survey questionnaire, addressing potential respondent questions, and internal check items that are in place to help the interviewer ensure that the respondent is being asked the appropriate questions and follow-up when clarification is needed. Interviewers receive a self-study training manual that they are required to read and they must complete a Final Review Exercise to verify their knowledge of the concepts presented in the self-study training manual. The SCS training materials are distributed to interviewers electronically on their Census laptop approximately one month before the supplement goes into the field.
Monitoring Interviewers
In addition to the above procedures used to ensure high participation rates, the Census Bureau implements additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1, NCVS-2, and SCS exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and “don’t know” responses to a minimum. The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by their regional office to assess whether performance and response rate standards are being met and corrective action is taken to assist and discipline interviewers who are not meeting the standards.
Reinterview is a major feature of both the quality assurance (QA) and the missed crimes estimation programs. The NCVS QA reinterview uses two approaches to validate interviewer performance: random and supplemental (supervisor discretion). The missed crimes estimation program uses the data from the QA program to estimate household- and person-level missed crimes. The random reinterview approach consists of selecting a sample of each interviewer's work to review over the data collection cycle. The supplemental approach allows supervisors to identify additional interviewers or cases for review throughout the cycle. Reinterview requires that a supervisor or experienced interviewer re-contact respondents at a sample of previously interviewed households. Reinterviewers verify that the original interviewer contacted the correct sample unit, determined the correct household composition, and classified noninterview households correctly. Reinterviewers also verify the household roster and tenure, ensure specific questions are covered, and re-ask a subset of the crime screener questions.
Another component of the data quality program is monthly feedback. In 2011, the Census Bureau implemented a series of field performance and data quality indicators. Previously, high response rates were the primary measure of interviewer performance. The data quality indicators are tracked through the Census Bureau's expanded Performance and Data Analysis (Giant PANDA) tool, and monthly reports are provided to the field. Under the revised performance structure, interviewers are monitored on the following –
response rates (household, person, and the current supplement in the field);
time stamps (the time it takes to administer the screener questions on the NCVS-1 or the crime incident questions on the NCVS-2);
overnight starts (interviews conducted very late at night or very early in the morning);
late starts (cases not started until the 15th or later in the interview month);
absence of contact history records (cases missing records of contact attempts with the household and/or persons within the household); and
quality of crime incidents (changes made to the location, presence, or theft data items on the NCVS-2 during post-processing coding operations).
Noncompliance with these indicators results in supervisor notification and follow-up with the interviewer. The follow-up activity may include simple points of clarification (e.g., the respondent works nights and is only available in the early morning for an interview), additional interviewer training, or removal of the interviewer from the survey.
Every effort has been made to make the survey materials clear and straightforward. The SCS instrument has been designed to make collection of the data as concise and easy for the respondent as possible. The SCS questions have been cognitively tested to ensure that they are easily understood by most respondents.
Nonresponse and Response Rates
Interviewers are able to obtain NCVS interviews with about 83% of household members in 71% of the occupied units in sample in a given month. The interviewers are trained to make repeated attempts at contacting respondents and to complete interviews with all eligible households. Annually, the Census Bureau conducts a complete analysis of nonresponse. Beginning with the 2018 data collection year and continuing in subsequent years, the Census Bureau plans to report nonresponse and response rates, respondent and nonrespondent distribution estimates, and proxy nonresponse bias estimates for various subgroups. Should the analyses reveal evidence of nonresponse bias, BJS will work with the Census Bureau to assess the impact on estimates and ways to adjust the weights accordingly. The interviewers obtain demographic characteristics of noninterview persons for use in the adjustment for nonresponse.
In 2019, the Census Bureau found evidence of potential bias in the SCS estimates because the overall response rate was low. Analysis indicated that respondent and nonrespondent distributions were significantly different for race and Hispanic origin and census region subgroups. However, after applying weights adjusted for person nonresponse, there was no evidence that these response differences introduced nonresponse bias in the final victimization estimates.
Test of Procedures
Two separate cognitive testing procedures were conducted prior to the 2022 SCS administration. The first iteration was conducted by the American Institutes for Research (AIR) from May through June 2020. AIR conducted nine interviews; because fewer than 10 respondents were involved, the testing did not require submission for OMB approval. This testing primarily focused on revising the bullying questions and was completed prior to the sponsors' postponement of the SCS administration for 12 months due to the COVID-19 pandemic.
The second iteration was conducted from December 2020 through June 2021 by the Center for Behavioral Science Methods (CBSM) at the Census Bureau and focused on developing questions about how students attended school during the COVID-19 pandemic. CBSM’s cognitive testing was approved under the NCES generic clearance for cognitive, pilot, and field test studies (OMB NO: 1850-0803).
First Iteration of Cognitive Testing
The cognitive testing conducted by AIR focused on the bullying questions. As previously stated, the 2019 SCS included a split-sample experiment, and the bullying question was one of the items included in this experiment. Version 1 of this item preserved the historical trend for the bullying data and included the word "bullied" in the question.5 Version 2 did not include the word "bullied" and asked respondents to think about situations that may have occurred online.6 NCES wanted to preserve the historical trend data by using Version 1 of the bullying question in the next administration of the SCS. However, they also wanted to test incorporating the online component that was in Version 2.
AIR conducted nine interviews with students ages 12 to 17. AIR noted that some of the respondents had trouble answering the bullying questions and identifying their experiences as bullying. Respondents did not have trouble understanding the new sentence asking them to think about situations that may have occurred online. Given this, and the extensive cognitive testing that CBSM had done in prior years on the historical bullying question, NCES decided to move forward with the historical bullying question and include the online component. AIR's final report, outlining these and other recommendations from the cognitive testing, and the testing protocols are included with this package as Attachment 8. Recommended changes by AIR that were agreed upon by the sponsors were the starting point for CBSM's cognitive testing in 2021.
Second Iteration of Cognitive Testing
In spring of 2020, the COVID-19 pandemic impacted schooling in the U.S. As the type of schooling students received changed as a result of the pandemic, it became clear that the 2022 SCS needed new and revised questions to accurately capture the type(s) of schooling a student receives, and phrasing throughout the survey needed to be adjusted so that references to their schooling aligned with their experiences.
The CBSM testing was conducted in three rounds, and an iterative methodology was used to identify and address problematic questions at the end of each round. The iterative method allowed for assessment of whether or not revised question wording addressed the problems interviewers observed during the previous rounds. CBSM researchers conducted 30 cognitive interviews over three iterative rounds of testing between December 2020 and March 2021.
The SCS screener questions ask whether the student attended school:
in person this school year (Q1a),
virtually (Q1b), and
if they received homeschooling instead of being enrolled in a public or private school (Q1c).
Those who said they attended school virtually were asked a follow-up question about whether they attended virtually due to the coronavirus pandemic (Q1e). Students who reported being homeschooled were asked a follow-up question about the reasons why their family decided to homeschool instead of enrolling them in a public or private school (Q1f).
In round 1, some students forgot the reference to the time frame and answered Q1a-Q1c about both the previous and current school years. Revising Q1a-Q1c to repeat the time frame in each item, as well as adding a reminder of the time frame to the introductory text that the Census field representative reads, appeared to reduce the issue in rounds 2 and 3.
Students in rounds 1 and 2 noted some confusion about the phrase “remote instruction” that was initially used in Q1b and Q1e. Although most students recognized the other term (“virtual” in round 1 and “virtual learning” in round 2) and were able to answer the questions without issue, a few students answered Q1b and Q1e incorrectly due to the term. Replacing the entire phrase with “online schooling or virtual learning” effectively addressed the issue and prevented confusion in round 3.
Students also noted confusion about homeschool cooperatives (co-ops) during testing. After the confirmation round results meeting, NCES and BJS determined that since co-ops vary widely and some do resemble the private school experience, homeschool students who attend co-ops should continue through the instrument if they report that they attended school in person. To account for this possibility, a new question was added (Q1d) as a follow-up to Q1c asking students if they attended a co-op in person during the school year.
Despite concerns that fully virtual students would find questions about in-person school activities problematic, that did not appear to be the case. Many of the questions that were only applicable to in-person students were skipped in the new path for fully virtual students. Those traditionally in-person questions that were still asked of fully virtual students (extracurricular activities, seeing others under the influence of drugs and alcohol, and bullying items) still worked well for fully virtual students. Many of the items still applied. Schools still offered activities to fully virtual students, whether they hosted them in-person (sports), or switched them to virtual meetings. Most of the bullying behaviors were still possible during virtual school, and those that were not did not seem to faze the students.
Overall, the findings from this cognitive testing indicate that the new and revised screener questions performed well by the confirmation round. While there is still uncertainty about what schooling will look like in future school years, the final recommendations for the 2022 SCS are designed to adapt to each student's situation. CBSM's final report, outlining these and other recommendations from the cognitive testing, and the testing protocols are included with this package as Attachment 9.
Consultants on Statistical Aspects of the Design
BJS and NCES take responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, and questionnaires and overseeing the conduct of the studies and analysis of the data by contractors.
The Census Bureau will collect all information. Meagan Meuchel is the NCVS Survey Director at the Census Bureau and manages and coordinates the NCVS and its supplements. David Hornick of the Demographic Statistical Methods Division of the Census Bureau oversees the statistical aspects of the supplement. BJS, NCES, and Census Bureau staff responsible for the SCS include –
BJS Staff (all staff located at 810 7th Street, NW, Washington, DC 20531):
Doris J. James, Acting Director
Heather Brotsos, Chief, Victimization Statistics Unit
Rachel Morgan, Ph.D., Statistician, Victimization Statistics Unit
Alexandra Thompson, Statistician, Victimization Statistics Unit

NCES Staff (all staff located at 550 12th Street, SW, Washington, DC 20202):
Marilyn Seastrom, Ph.D., Chief Statistician
Chris Chapman, Associate Commissioner, Sample Surveys Division
Andrew Zukerberg, Chief, Cross-Sectional Surveys Branch
Deanne Swan, Ph.D., Senior Technical Advisor

Census Bureau Staff (all staff located at 4600 Silver Hill Road, Suitland, MD 20746):
Meagan Meuchel, NCVS Survey Director, Associate Directorate for Demographic Programs – Survey Operations
Megan Ruhnke, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations
Chris Seamands, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations
Scott Raudabaugh, Chief, Crime Surveys Programming & Population Support Branch, Demographic Surveys Division
David Hornick, Lead Scientist, Demographic Statistical Methods Division
C. List of attachments
1. BJS authorizing statute; Title 34, United States Code, Section 10132 of the Justice Systems Improvement Act of 1979
2. 2022 SCS questionnaire
3. Selected nonfederal citations citing data from the NCVS SCS
4. 2022 SCS item justification and rationale
5. 2019 SCS questionnaire
6. Incoming household letter from Census (NCVS-572(L))
7. Continuing household letter from Census (NCVS-573(L))
8. 2022 NCVS SCS final AIR cognitive testing report with attachments
9. 2022 NCVS SCS final CBSM cognitive testing report with attachments
1 Public schools are identified on the Department of Education’s (ED) Common Core of Data (CCD) database (https://nces.ed.gov/ccd/). Charter schools are included in the CCD database and therefore are categorized as public schools. Private schools are identified on ED’s Private School Universe Survey (PSS) (https://nces.ed.gov/surveys/pss/).
2 Homeschooling cooperatives (co-ops) are groups of homeschooling families who work together to educate their children. They can range from informal groups to more formal programs that resemble private schools. Some co-op students attend this type of schooling in person.
3 Table 1. Annual Estimates of the Resident Population for the United States, Regions, States, and Puerto Rico: April 1, 2010 to July 1, 2017 (NST-EST2017-01). Source: U.S. Census Bureau, Population Division. Release Date: December 2017.
4 Everitt, B.S., and Skrondal, A. (2010). The Cambridge Dictionary of Statistics, Fourth Edition. Retrieved from http://www.stewartschultz.com/statistics/books/Cambridge%20Dictionary%20Statistics%204th.pdf.
5 “Now I have some questions about what students do at school that make you feel bad or are hurtful to you. We often refer to this as being bullied. You may include events you told me about already. During this school year, has any student bullied you? That is, has another student...”
6 “Now I have some questions about what students do at school that make you feel bad or are hurtful to you. These could occur in person or using technologies, such as a phone, the Internet, or social media. During this school year, has any student from your school…”