National Crime Victimization Survey
OMB Control Number 1121-0111
OMB Expiration Date: 11/30/2026
Updates from previously approved package in yellow.
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Universe and Respondent Selection
Universe
The potential universe for the NCVS national sample is all persons age 12 or older in the more than 120 million U.S. households and persons 12 or older living in non-institutional group quarters (GQ) (except crews of vessels, military in barracks, and those at domestic violence shelters or living quarters for victims of natural disasters).
Frame
The Master Address File (MAF) contains all addresses from the most recent decennial census plus updates from the U.S. Postal Service, state and local address lists, and other address listing operations. The MAF is the frame for the target NCVS population. Every ten years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. In general, the purpose of these redesigns is to capture population shifts measured by the most recent decennial census. In 2015, the 2000-based sample design began phasing out and the 2010-based sample design began phasing in. As part of the 2010-based sample design, new addresses are selected each year from the MAF, which is based upon the 2010 Decennial Census of Population and Housing and addresses from the United States Postal Service. New housing units are added to the MAF, and therefore the NCVS sampling frame, through semiannual updates.
Sampling
The sample design for the NCVS is a stratified, multi-stage cluster sample. Sample selection for the NCVS is done in three stages: the selection of primary sampling units (PSUs), the selection of sample hits within sampled PSUs, and the selection of all eligible persons and households within the sample hits.a Sample hits are clusters of four typically nearby housing units, but they can also be four housing unit equivalents within one or more group quarters.
Stage 1. Defining and Selecting PSUs
Defining PSUs – PSUs are defined, stratified, and selected once every ten years. Formation of PSUs begins with listing counties, independent cities, and other county equivalents in the target area. For the NCVS, the target area is all 50 states and the District of Columbia. The PSUs comprising the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the most recent decennial census and the American Community Survey (ACS). The counties are either PSUs by themselves or grouped with one or more contiguous counties to form PSUs. For counties that are grouped, the groupings are based on certain characteristics such as state boundaries, total land area, current and projected population counts, metropolitan area status, and potential natural barriers such as rivers and mountains. The resulting county groupings are called PSUs.
After the PSUs are formed, the large PSUs and those within large metropolitan areas (specifically, Core Based Statistical Areas (CBSAs)) are designated self-representing (SR). The remaining smaller PSUs are designated non-self-representing (NSR). Determining which PSUs are considered small and which are considered large depends on the survey’s SR population cutoff and whether estimates are desired for the state. In the 2010 design, all PSUs in the top 85 CBSAs were designated as SR. In the 22 states for which NCVS made state-level estimates, additional PSUs were designated as SR based on a formula to achieve targeted coefficients of variation. Other than the top 85 CBSAs, there is no general rule to differentiate between SR and NSR PSUs.
Stratifying PSUs – For the 2010-based sample design, each SR PSU formed its own stratum. The NSR PSUs were grouped with similar NSR PSUs within states to form strata. For the NCVS, decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting (UCR) Program are used to stratify the NSR PSUs to form strata as similar or homogeneous as possible. Just as the SR PSUs must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme was determined by minimizing the between-PSU variance within stratum and maximizing the between-stratum variance.
Selecting PSUs – In general, the SR PSUs are automatically included in the sample or selected with certainty. NSR PSUs are sampled with probability proportional to the population size. The NSR PSUs in the 2010-based sample were selected with Ohlsson’s (2000)b method of maximizing the sample overlap, independently of the 2000-based sample. This method will provide a basis for maximizing the overlap between the 2010- and 2020-based samples. One PSU was selected from each NSR stratum. The 2010-based sample design NCVS sample includes 339 SR PSUs and 203 NSR PSUs (out of 1,648 NSR PSUs).
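As a rough illustration of drawing one PSU per NSR stratum with probability proportional to population size, the sketch below selects a single PSU from a hypothetical stratum. It is a minimal sketch only: the PSU names and populations are invented, and it does not implement Ohlsson's overlap-maximization method or the Census Bureau's production selection procedures.

```python
import random

def select_nsr_psu(stratum_psus, rng):
    """Select one PSU from a stratum with probability proportional to size (PPS)."""
    total = sum(pop for _, pop in stratum_psus)
    draw = rng.uniform(0, total)
    cumulative = 0.0
    for psu_id, pop in stratum_psus:
        cumulative += pop
        if draw <= cumulative:
            return psu_id
    return stratum_psus[-1][0]  # guard against floating-point edge cases

# Hypothetical NSR stratum of three county-group PSUs and their populations
stratum = [("PSU-A", 120_000), ("PSU-B", 80_000), ("PSU-C", 50_000)]
print(select_nsr_psu(stratum, random.Random(1)))  # larger PSUs are selected more often
```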
Stage 2. Preparing Frames and Sampling within PSUs
Frame Determination – The 2010-based sample design selects its sample from two dynamic address-based sampling frames, one for housing units (HUs) and one for group quarters (GQs). Both frames are based upon the MAF, which is a national inventory of addresses. The MAF is continually updated by various Census Bureau programs and external sources. New housing units are added to the MAF, and therefore the NCVS sampling frame, through semiannual updates from a variety of address sources, including the U.S. Postal Service Delivery Sequence File, local government files, and field listing operations.
In the 2010-based sample design, each address in the country was assigned to the housing unit or GQ frame based on the type of living quarters. Two types of living quarters are defined in the decennial census. The first type is a housing unit (HU). An HU is a group of rooms or a single room occupied as separate living quarters or intended for occupancy as separate living quarters. An HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters. The second type of living quarters is GQ. GQs are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 Census resided in GQs. Of those, less than half resided in non-institutionalized GQs.
Within-PSU Sampling – All of the Census Bureau’s continuing demographic surveys, such as the NCVS, are sampled together. This procedure takes advantage of updates from the January MAF delivery and ACS data. This within-PSU selection occurs every year for housing units and every three years for GQs.
Selection of samples is done one survey at a time (sequentially). Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of undercoverage errors due to an outdated frame.
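A minimal sketch of the systematic selection step is shown below, assuming a geographically sorted frame and a fixed sampling interval. The record fields, sort keys, and interval are hypothetical; the production procedure also forms sample hits, maintains the skeleton frame, and applies survey-specific sort orders.

```python
import random

def systematic_sample(frame, interval, rng):
    """Sort the frame geographically, pick a random start, then take
    every `interval`-th address (illustrative systematic sampling)."""
    ordered = sorted(frame, key=lambda a: (a["state"], a["county"], a["tract"]))
    start = rng.randrange(interval)
    return ordered[start::interval]

# Hypothetical frame of 1,000 address records
frame = [{"id": i, "state": 24, "county": i % 5, "tract": i % 40} for i in range(1000)]
sample = systematic_sample(frame, interval=50, rng=random.Random(7))
print(len(sample))  # roughly 1000 / 50 = 20 sampled addresses
```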
Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey that is subsequently sampled. By leaving a clean universe for the next survey, duplication of addresses across surveys is avoided. This is done to help preserve response rates by ensuring that no unit falls into more than one survey sample.
Stage 3. Persons within Sample Addresses
The last stage of sampling is done during the initial contact of the sample address during the data collection phase. For the NCVS, if the address is a residence and the occupants agree to participate, then an attempt is made to interview every person age 12 or older who lives at the address. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with names and other demographic information for all persons who live there (see Attachment 1). If someone moves out of (or into) the household during the interviewing cycle, he or she is removed from (or added to) the roster.
State Samples
Beginning in January 2016, BJS and the Census Bureau increased and reallocated the existing national sample in the 22 largest states. The states receiving a sample boost include Arizona, California, Colorado, Florida, Georgia, Illinois, Indiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Jersey, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and Wisconsin. In 2019, each of these 22 states had a population greater than 5 million persons and in total, these 22 states comprised 79% of the U.S. population.c In each of the 22 states, enough sample was selected with the goal of achieving a 10% relative standard error (RSE) for a three-year average violent victimization rate of 0.02. The state estimates for 2017-2019 fall somewhat short of this precision goal while still measuring important differences among this group of states.d
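The precision target can be translated into an approximate person-level sample size with a back-of-envelope calculation. The sketch below assumes simple random sampling of a proportion plus an arbitrary design effect; it ignores clustering, weighting, and the pooling of three years of data, so it illustrates only the order of magnitude, not the actual NCVS state allocation.

```python
def required_persons(rate, target_rse, design_effect=1.0):
    """Approximate respondents needed so that RSE(p_hat) <= target_rse,
    assuming simple random sampling of a proportion:
        RSE^2 ~= deff * (1 - p) / (n * p)  =>  n = deff * (1 - p) / (p * RSE^2)
    """
    return design_effect * (1 - rate) / (rate * target_rse ** 2)

# About 4,900 respondents under simple random sampling for a 10% RSE on a
# rate of 0.02; an assumed design effect of 2 would roughly double that.
print(round(required_persons(0.02, 0.10)))
print(round(required_persons(0.02, 0.10, design_effect=2.0)))
```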
The underlying assumption of the subnational sample design is that three years of data will be needed to produce precise estimates of violent crime, which is experienced by about 1% of the population. Sample sizes in the remaining 28 states and the District of Columbia were determined to ensure full representation and unbiased estimates at the national level. Unlike the 2000 sample design, in the 2010-based sample design, no strata cross state boundaries and all 50 states and the District of Columbia have at least one sampled PSU.
Estimated Sample Size
For 2025, the estimated total number of sampled households is 256,572 with approximately 276,230 eligible persons.
Rotating Panel Design
The NCVS uses a rotating panel design. The interviewing schedule is provided in Table 1 and the rotation chart is available in the attached forms (NCVS-551; Attachment 6). The sample consists of seven groups for each month of enumeration. Each of these groups stays in the sample for an initial interview and six subsequent interviews, for a total of seven interviews for the typical household. During the course of a 6-month period, a full sample of seven rotation groups is interviewed (one-sixth each month). One rotation group enters the sample for its first interview each month.
Table 1. NCVS Interviewing Schedule
Frequency of Data Collection | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
1/6 of sample | X | | | | | | X | | | | |
1/6 of sample | | X | | | | | | X | | | |
1/6 of sample | | | X | | | | | | X | | |
1/6 of sample | | | | X | | | | | | X | |
1/6 of sample | | | | | X | | | | | | X |
1/6 of sample | | | | | | X | | | | | | X
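A small sketch of the interview timing implied by the rotation design in Table 1: a household entering the sample receives an initial interview and six subsequent interviews at 6-month intervals. The month numbering is illustrative only.

```python
def interview_months(first_month, interviews=7, interval=6):
    """Months (counting from the household's first interview month) in which
    the household is interviewed: an initial interview plus six more at
    6-month intervals."""
    return [first_month + interval * i for i in range(interviews)]

# A household first interviewed in month 1 is interviewed in months
# 1, 7, 13, 19, 25, 31, and 37 (seven interviews over three years).
print(interview_months(1))
```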
The NCVS rotating panel design offers three key benefits to the survey:
Bounded interviews, which anchor the reference period to the previous interview to help ensure that previously reported victimizations are not counted again in the current interview period.
Reduced data collection costs, because interviewers build rapport with respondents and maintain higher response rates, allowing a smaller number of sampled units; further savings come from conducting most interviews by telephone after the first in-person visit.
Longitudinal data analysis, enabled by interviewing the same household multiple times, allows analysts to monitor outcomes for the respondents in that household over time.
A potential drawback to the panel design is respondent fatigue, which can suppress reported victimization rates and increase attrition from the survey. Additionally, household turnover in the sample may reduce cost savings by requiring departing households to be replaced with new households that must be interviewed in person.e
Weighting and Estimation
Household, person, and victimization data from the NCVS sample are adjusted to give annual and bi-annual estimates of crime experienced by the U.S. population age 12 or older. Household and person weights are first adjusted to account for any subsampling that occurs within large GQs. The nonresponse weighting adjustment then allocates the sampling weights of nonresponding households and persons to respondents with similar characteristics. A ratio adjustment reduces the variance of the estimate by correcting for differences between the distribution of the sample by age, sex, and race and the distribution of the population by these characteristics. This also reduces bias due to undercoverage of various portions of the population.
Base Weights
The original NCVS base weight for each HU or GQ is the inverse of the probability of selection for that case.
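For illustration only, the base weight is the reciprocal of the unit's selection probability; the probability used below is hypothetical.

```python
def base_weight(selection_probability):
    """Base weight for a sampled HU or GQ: the inverse of its probability of selection."""
    return 1.0 / selection_probability

# A hypothetical address selected with probability 1 in 3,000 represents
# about 3,000 addresses on the frame.
print(round(base_weight(1 / 3000)))  # 3000
```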
Weighting Adjustments
If all eligible units in the sample responded to the survey and reported crimes only within the reference period, the sampling base weights would produce unbiased estimates with reasonably low variance. However, nonresponse and nonsampling errors are expected in all sample surveys, and the following post-data-collection weighting adjustments minimize their impact on the NCVS estimates. All of these adjustments are completed on six months of response data at a time:
GQ Subsampling
Some units in the GQ frame are subsampled because the observed GQ size is much larger than expected. During the estimation procedure, units within these GQs must receive a GQ subsampling adjustment (also known as the weighting control factor) to account for the change in the probability of selection.
Household Nonresponse
Nonresponse is classified into two major types: item nonresponse and unit (complete) nonresponse. Item nonresponse occurs when a cooperating household fails or refuses to provide some specific items of information. In the NCVS estimation process, the weights for all of the interviewed households are adjusted to account for occupied sample households for which no information was obtained due to unit nonresponse. To reduce estimate bias, the household nonresponse adjustment is performed within cells that are formed using the following variables: noninterview cluster, CBSA/MSA status, urbanicity, race of the household reference person, and interview number groups for the address.
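The sketch below shows a generic weighting-class nonresponse adjustment of the kind described above: within each cell, responding households' weights are inflated by the ratio of the total weight of all eligible households to the weight of respondents. The record layout and field names are hypothetical; the actual NCVS cells are defined by the variables listed in the paragraph.

```python
from collections import defaultdict

def household_nonresponse_adjust(households, cell_keys):
    """Generic weighting-class adjustment: inflate respondents' weights within
    each cell by (weight of all eligible units) / (weight of respondents)."""
    totals = defaultdict(lambda: [0.0, 0.0])  # cell -> [eligible weight, respondent weight]
    for hh in households:
        cell = tuple(hh[k] for k in cell_keys)
        totals[cell][0] += hh["weight"]
        if hh["responded"]:
            totals[cell][1] += hh["weight"]
    adjusted = []
    for hh in households:
        if hh["responded"]:
            eligible_wt, respondent_wt = totals[tuple(hh[k] for k in cell_keys)]
            adjusted.append({**hh, "weight": hh["weight"] * eligible_wt / respondent_wt})
    return adjusted

# Hypothetical example with one cell variable (urbanicity): the responding
# urban household absorbs the weight of the nonresponding urban household.
hhs = [
    {"weight": 1000, "responded": True,  "urbanicity": "urban"},
    {"weight": 1000, "responded": False, "urbanicity": "urban"},
    {"weight": 1200, "responded": True,  "urbanicity": "rural"},
]
print(household_nonresponse_adjust(hhs, ["urbanicity"]))  # urban respondent's weight becomes 2000
```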
Within-household Nonresponse
A household is considered a response if at least one person within the household completes the NCVS interview. The interviewer then attempts to interview all persons age 12 and older within the household, but some persons within the household may be unavailable or refuse to participate in the survey. This within-household nonresponse adjustment allocates the weights of nonresponding persons to respondents. The starting weight for all persons within responding households is the same household-level base weight multiplied by any GQ subsampling factor and the household nonresponse adjustment factor.
If nonrespondents’ crime victimizations are significantly different from respondents’ crime victimizations, there could be nonresponse bias in the NCVS estimates. To reduce nonresponse bias, the within-household nonresponse adjustment cells are formed by characteristics that are correlated with response and crime victimization rates. This includes: top 22 states/region, age, sex, race/ethnicity, and relationship to household reference person (self/spouse or all others). These variables are cross-classified in different ways, depending on household relationship, to create 54 cells within each state or region.
Ratio Adjustment
Distributions of the demographic characteristics derived from the NCVS sample in any month will be somewhat different from the true distributions, even for such basic characteristics as age, sex, race, and Hispanic origin. These particular population characteristics are closely correlated with victimization status and other characteristics estimated from the sample. Therefore, the variance of sample estimates based on these characteristics can be reduced when, by the use of appropriate weighting adjustments, the sample population distribution is brought as closely into agreement as possible with the known distribution of the entire population, with respect to these characteristics. This is accomplished by means of ratio adjustments. The NCVS ratio adjustment has three high-level steps: (1) person coverage, (2) person iterative raking, and (3) household coverage.
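A minimal sketch of the person-level iterative raking step follows, assuming known population control totals for each demographic dimension. The data structures and example values are hypothetical, and convergence checks, the coverage steps, and the household step are omitted.

```python
def rake(weights, categories, control_totals, iterations=10):
    """Iteratively scale weights so weighted totals match population controls
    on each dimension in turn (e.g., age group, sex, race/Hispanic origin).
    categories[d][i] is person i's category on dimension d;
    control_totals[d][c] is the known population total for category c."""
    w = list(weights)
    for _ in range(iterations):
        for d, cats in enumerate(categories):
            current = {}
            for wi, c in zip(w, cats):
                current[c] = current.get(c, 0.0) + wi
            w = [wi * control_totals[d][c] / current[c] for wi, c in zip(w, cats)]
    return w

# Hypothetical example: three persons raked to sex and age-group controls;
# the raked weights reproduce both sets of control totals.
weights = [100.0, 100.0, 100.0]
categories = [["F", "M", "F"], ["12-17", "18+", "18+"]]
controls = [{"F": 220.0, "M": 120.0}, {"12-17": 110.0, "18+": 230.0}]
print([round(x, 1) for x in rake(weights, categories, controls)])  # [110.0, 120.0, 110.0]
```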
Bounding
Telescoping occurs when respondents report events that fall outside of the period of interest. Telescoping causes over-reporting and often happens in surveys when respondents are asked to recall all events within a given period. The NCVS asks respondents to recall all incidents that occurred during the previous 6 months. Prior to 2006, the first NCVS interview was a bounding interview and was not used in estimates, to avoid potential telescoping bias. In 2006, the first of the seven NCVS interviews in new sample areas was used in estimates, in conjunction with a bounding adjustment for the first interview, to avoid telescoping bias. All of the first NCVS interviews have been included in the estimates with a bounding adjustment since 2007.
Series Victimizations
When a respondent reports a series crime, the interviewer completes one incident report for all of the incidents, with details collected on only the most recent incident. In order to count all instances of this incident, the victimization weight is multiplied by the number of times (up to 10) the incident occurred. Including series victimizations in national rates results in large increases in the level of violent victimization; however, trends in violence are generally similar regardless of whether series victimizations are included.
Multiple Victims
If every victimization had one victim, the incident weight would be the same as the victimization weight. Because incidents sometimes have more than one victim, the incident weight is the series victimization weight divided by the number of victims in the incident.
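The two rules above can be illustrated with a small sketch; the starting victimization weight, number of incidents, and number of victims are hypothetical values.

```python
def series_adjusted_weight(victimization_weight, times_occurred):
    """Series victimizations: multiply the victimization weight by the number
    of times the incident occurred, capped at 10."""
    return victimization_weight * min(times_occurred, 10)

def incident_weight(series_victimization_weight, number_of_victims):
    """Divide by the number of victims so incidents with multiple victims are
    not counted once per victim."""
    return series_victimization_weight / number_of_victims

# Hypothetical values: a victimization weight of 2,000, a series of 12 similar
# incidents, and 2 victims per incident.
v_wt = series_adjusted_weight(2000, 12)   # 20,000 (series capped at 10 incidents)
print(v_wt, incident_weight(v_wt, 2))     # 20000 10000.0
```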
Replicate Weights
The NCVS uses 160 replicate weights to produce variance estimates that reflect the complex sample design and weighting adjustments. To produce these replicate weights, the sampling base weights are multiplied by 160 different replicate factors to produce replicate base weights. Each set of replicate base weights is subjected to the same weighting adjustments described in the previous section to produce 160 sets of final replicate weights for households, persons, series victimizations, and incidents. By applying the weighting adjustments to each replicate, the final replicate weights reflect the impact of the weighting adjustments on the variance.
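The replicate weights support a replication variance estimator of the generic form sketched below. The multiplier depends on the replication method and replicate factors actually used; the value shown (4/160) and the replicate estimates are assumptions for illustration, not a statement of the production formula.

```python
def replication_variance(full_estimate, replicate_estimates, multiplier=4 / 160):
    """Generic replication variance: a multiplier times the sum of squared
    deviations of the replicate estimates from the full-sample estimate."""
    return multiplier * sum((r - full_estimate) ** 2 for r in replicate_estimates)

# Hypothetical full-sample count of 1,000,000 and 160 replicate estimates
replicates = [1_000_000 + 5_000 * ((-1) ** r) for r in range(160)]
variance = replication_variance(1_000_000, replicates)
print(variance ** 0.5)  # standard error of the hypothetical estimate
```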
Variance Estimates
The NCVS estimates come from a sample, so they may differ from figures from an enumeration of the entire population using the same questionnaires, instructions, and enumerators. For a given estimator, the average squared difference between estimates based on repeated samples and the estimate that would result if the sample were to include the entire population is known as sampling error. The sampling error quantifies the uncertainty in an estimate that results from observing a sample rather than the entire population.
Variance estimates can be derived using direct estimation or generalized variance functions (GVFs). Replication methods provide estimates of variance for a wide variety of designs using probability sampling, even when complex estimation procedures are used. This method requires the sample selection, data collection, and estimation procedures to be carried out (i.e., replicated) several times. The dispersion of the resulting estimates can then be used to measure the variance of the full sample.
In addition, the Census Bureau produces parameters for GVFs that estimate the variance of any crime count estimate based on the value of the estimate.f To do this, estimates and their relative variances are fit to a regression model using an iterative weighted least squares procedure where the weight is the inverse of the square of the predicted relative variance. GVFs are not the only means by which to estimate variance in the NCVS. Direct estimation of variance is possible as well with instructions available on the BJS website.g BJS maintains an active research program on direct variance and GVF estimation methods that seeks to improve the quality and accuracy of NCVS estimates and make technical information available to data users to support research.
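A common GVF form expresses the variance of a count estimate as a function of the estimate itself, consistent with the description above. The sketch below uses the form Var(x) = a*x^2 + b*x with placeholder parameters; published NCVS GVF parameters differ by year and estimate type and should be taken from the user's guide cited in the footnote.

```python
import math

def gvf_standard_error(estimate, a, b):
    """Evaluate a GVF of the form Var(x) = a*x**2 + b*x and return the
    implied standard error for a crime count estimate."""
    return math.sqrt(a * estimate ** 2 + b * estimate)

# Placeholder parameters and a hypothetical count of 500,000 victimizations
print(round(gvf_standard_error(500_000, a=-2.1e-05, b=3700)))
```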
Total Crime Estimates
NCVS data allow users to produce estimates of crime counts and crime rates. Point estimates of crime victimizations include all incidents reported by sample units within the domain and time period of interest, weighted appropriately. NCVS personal crime rate estimates are calculated as the number of victimizations per 1,000 persons age 12 or older. NCVS household crime rate estimates are calculated as the number of victimizations per 1,000 households.
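For illustration, a victimization rate is the weighted number of victimizations divided by the weighted base (persons for personal crimes, households for household crimes), scaled to 1,000. The totals below are hypothetical.

```python
def rate_per_1000(weighted_victimizations, weighted_base):
    """Victimization rate per 1,000 persons (or per 1,000 households)."""
    return 1000 * weighted_victimizations / weighted_base

# Hypothetical weighted totals: 6.4 million violent victimizations among
# 280 million persons age 12 or older -> about 22.9 per 1,000.
print(round(rate_per_1000(6_400_000, 280_000_000), 1))
```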
Response Rates
Overall, 63% (N = 142,028) of eligible households completed interviews in 2023 and 64% (N = 143,794) did so in 2022. The response rate among eligible persons in responding households was 82% in both 2023 (N = 226,480) and 2022 (N = 226,962). For 2025, the expected response rates are 56% (N = 142,498) for all households and 82% (N = 227,062) for all persons.
2. Procedures for Collecting Information
The NCVS is designed to produce national and state-level (for the 22 most populous states) estimates of violent and property victimization for the target population: the noninstitutionalized resident population age 12 or older. The NCVS is continuously in the field and is administered to all age-eligible respondents. The procedures for collecting these data are the same for the new instrument. All forms and materials used for the NCVS data collection are identified in Appendix A with the associated attachment number.
DATA COLLECTION
Each HU selected for the NCVS remains in the sample for three years, with each of seven interviews taking place at 6-month intervals. Both the current and new NCVS instruments collect demographic information and use a two-stage measurement approach for the screening and classification of criminal victimization. The NCVS Control Card section (Attachment 1) is used to complete a household roster with names and other demographic information of the household members. Respondents are asked to report victimization experiences occurring in the six months preceding the month of interview. The NCVS-1 victimization screener (Attachment 1) is asked of all respondents age 12 or older in the household and is used to ascertain whether the respondent has experienced a personal crime victimization during the prior six months and is therefore eligible to be administered the NCVS-2 crime incident report (Attachment 1).
The NCVS-1 screener collects the basic information needed to determine whether the respondent experienced a crime victimization (rape or other sexual assault, robbery, aggravated or simple assault, personal larceny, burglary, motor vehicle theft, other household theft, and vandalism). When a respondent reports an eligible personal victimization, the NCVS-2 crime incident report is then administered to collect detailed information about the crime incident. The NCVS-2 is administered for each incident the respondent reports. For each victimization incident, the NCVS-2 collects information about the offender (e.g., sex, race, Hispanic origin, age, and victim-offender relationship), characteristics of the crime (including time and place of occurrence, use of weapons, nature of injury, and economic consequences), whether the crime was reported to police, reasons the crime was not reported, and victim experiences with the criminal justice system. In each household, one respondent is designated as the head of the household and reports all household property crimes on behalf of the entire household.
Each interview period, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screener questions (NCVS-1) for each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each reported crime incident identified in the crime screener. Each household member provides the information by self-response. For the NCVS, proxy respondents are allowable under very limited circumstances and represent less than 6% of all interviews. All forms and materials used for the NCVS screener and crime incident report are attached and identified in Appendix A.
The first contact with a household is by personal visit and subsequent contacts may be by telephone. For the second through seventh visits, interviews are done by telephone whenever possible. Over half of all interviews conducted each month are by telephone.
3. Methods to Maximize Response
Contact Strategy
The Census Bureau mails notifications to households prior to data collection, interviewers contact households for the first time in person, and interviewers conduct nonresponse follow-up. The Census Bureau mails an introductory letter (NCVS-572(L); Attachment 3) explaining the NCVS to the household before the interviewer's visit or call. When they go to a household, interviewers carry cards identifying them as Census Bureau employees. Potential respondents are assured that their answers will be held in confidence and used for statistical purposes. For respondents who have questions about the NCVS, interviewers provide a brochure (NCVS-110; Attachment 7) that contains information such as uses of NCVS data and frequently asked questions and answers. After interviews are completed at each enumeration period, the Census Bureau mails a thank-you letter to the household (NCVS-593(L); Attachment 8). All forms and materials used for contact with the household are attached and identified in Appendix A.
The Census Bureau trains interviewers (see Interviewer Training below) to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. The interviewer obtains demographic characteristics of noninterview persons for use in the adjustment for nonresponse. NCVS response rates are monitored on a monthly basis and compared to the previous month’s average to ensure their reasonableness.
As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but obtains no interviews. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer’s ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home.
As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.
Interviewer Training
Training for NCVS interviewers consists of classroom and on-the-job training. Initial training for newly hired interviewers consists of a full day pre-classroom self-study, 4-day classroom training, post-classroom self-study, and on-the-job observation and training. Initial training includes topics such as protecting respondent confidentiality, gaining respondent cooperation, answering respondent questions, proper survey administration, use of systems to collect and transmit survey data, NCVS concepts and definitions, and completing simulated practice NCVS interviews. The NCVS procedures and concepts taught in initial training are also regularly reinforced for experienced NCVS interviewers. This information is received via monthly written communications, ongoing feedback from observations of interviews by supervisors, and monthly performance and data quality feedback reports.
All interviewers were trained on the new instrument prior to the split-sample field test in 2024. This training consisted of a full day pre-classroom self-study, 2-day classroom training, and on-the-job observation and training (see Attachments 9-11 for the new NCVS instrument training materials: NCVS-521RE, NCVS-522RE, and NCVS-523RE). Training on the new instrument focused on major changes to protocols and survey items. All forms used by interviewers are attached and identified in Appendix A.
Monitoring Interviewers
In addition to the above procedures used to ensure high participation rates, the Census Bureau implements additional performance measures for interviewers based on data quality standards (NCVS-570; Attachment 12). Interviewers are trained and assessed on administering the NCVS-1 and the NCVS-2 exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and “don’t know” responses to a minimum. The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by their Regional Office to assess whether performance and response rate standards are being met and corrective action is taken to assist and discipline interviewers who are not meeting the standards. During the 2024 split-sample design, a variety of quality indicators are being closely monitored throughout the transition to track performance and employ training interventions as needed. To monitor any presence of contamination between the two instruments, BJS and the Census Bureau identified paradata indicators and other data comparisons that can be tracked over time. This monitoring will inform any changes that may be needed to performance measures for the full-scale implementation of the new NCVS instrument in 2025.
Reinterview is a major feature of both the quality assurance (QA) and the missed crimes estimation programs (see Attachments 13-19: NCVS-541, 11-170, 11-171, 11-172, the Field Division Current Surveys Reinterviewer's Self-Study, and the CATI and CAPI Reinterview Training Memorandums). The NCVS QA reinterview uses two approaches, random and supplemental (supervisor discretion), to validate interviewer performance. The missed crimes estimation program uses the data from the QA program to estimate household- and person-level missed crimes. The random reinterview approach consists of selecting a sample of each interviewer's work to review over the data collection cycle. The supplemental approach allows supervisors to identify additional interviewers or cases for review throughout the cycle.
Reinterview requires that a supervisor or experienced interviewer re-contact respondents at a sample of previously interviewed households. Reinterviewers verify that the original interviewer contacted the correct sample unit, determined the correct household composition, and classified noninterview households correctly. Reinterviewers also verify the household roster and tenure, ensure specific questions are covered, and re-ask a subset of the crime screener questions.
Another component of the data quality program is monthly feedback. In 2011, the Census Bureau implemented a series of field performance and data quality indicators. Before that time, high response rates were the primary measure of interviewer performance. The data quality indicators are tracked through the Census Bureau's expanded Performance and Data Analysis (Giant PANDA) tool, and monthly reports are provided to the field. Under the revised performance structure, interviewers are monitored on the following:
response rates (household, person, and the current supplement in the field)
timestamps (the time it takes to administer the screener questions on the current and new NCVS-1 or the crime incident questions on the current and new NCVS-2)
overnight starts (interviews conducted very late at night or very early in the morning)
late starts (cases not started until the 15th or later in the interview month)
absence of contact history records (cases missing records of contact attempts with the household and/or persons within the household)
quality of crime incidents (changes made to the location, presence, or theft data items on the current NCVS-2 during post-processing coding operations)
Noncompliance with these indicators results in supervisor notification and follow-up with the interviewer. The follow-up activity may include simple points of clarification (e.g., the respondent works nights and is only available in the early morning for an interview), additional interviewer training, or removal of the interviewer from the survey.
Nonresponse and Response Rates
In 2023, interviewers were able to obtain interviews with about 82% of household members in 63% of the occupied units in sample in a given month. Annually, the Census Bureau conducts a complete analysis of nonresponse. Item nonresponse is generally low (under 1%) for the majority of NCVS items. For the 2025 and following data collection years, the Census Bureau plans to report nonresponse and response rates, respondent and nonrespondent distribution estimates, and proxy nonresponse bias estimates for various subgroups. Should the analyses reveal evidence of nonresponse bias, BJS will work with the Census Bureau to assess the impact on estimates and ways to adjust the weights accordingly.
4. Testing of Procedures
The 2024 NCVS data collection employed a split-sample design with half the sampled addresses receiving the currently approved NCVS instrument and the other half receiving the new instrument. Activities supporting the development of the new instrument included extensive cognitive testing, usability testing, a small-scale pilot test, and a large-scale field test. These testing activities were approved through the OMB generic clearance agreement (OMB No. 1121-0339) for Cognitive, Pilot and Field Studies for BJS Data Collection Activities. Details on the new NCVS instrument development and testing are available in NCVS Instrument Redesign Field Test Methodology (NCJ 306155, June 2023).
From July to September 2023, BJS and the Census Bureau conducted an operational pilot test (OMB No. 1121-0339). The purpose of this pilot test was twofold: (1) to assess the new survey instrument and protocols in the Census Bureau data collection environment and (2) to test all systems and operational procedures within the Census Bureau data collection environment. The pilot test informed field protocols and interviewer training needs for the larger split-sample administration in 2024 and full-scale implementation of the new NCVS instrument in 2025. Other changes over the history of the survey that were approved by OMB are detailed in Appendix B.
5. Consultants on the Statistical Aspects of the Design
The Victimization Statistics Unit at BJS takes responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, questionnaires, and overseeing the conduct of the studies and analysis of the data by contractors. Dr. Rachel Morgan is the Victimization Statistics Unit Chief.
The Census Bureau is responsible for the collection of all data. Mr. John Gloster is the NCVS Survey Director and manages and coordinates the NCVS. BJS and Census Bureau staff contacts include:
BJS Staff (all staff located at 999 N. Capitol Street, NE, Washington, DC 20531):
Kevin M. Scott, Ph.D., Principal Deputy Director, 202-532-3323
Shelley S. Hyland, Ph.D., Senior Statistical Advisor, 202-532-5523
Heather Brotsos, Deputy Director, Statistical Programs Division, 202-598-7960
Rachel Morgan, Ph.D., Chief, Victimization Statistics Unit, 202-598-9237
Emilie Coen, DrPH, Statistician, Victimization Statistics Unit, 202-598-9136
Rebecca Bielamowicz, Ph.D., Statistician, Victimization Statistics Unit, 202-305-5257
Erika Harrell, Ph.D., Statistician, Victimization Statistics Unit, 202-598-1841
Susannah Tapp, Ph.D., Statistician, Victimization Statistics Unit, 202-353-5162
Alexandra Thompson, Statistician, Victimization Statistics Unit, 202-532-5472
Erin Tinney, Ph.D., Statistician, Victimization Statistics Unit, 202-812-6594
Jennifer Truman, Ph.D., Statistician, Victimization Statistics Unit, 202-598-1931

Census Bureau Staff (all staff located at 4600 Silver Hill Road, Suitland, MD 20746):
John Gloster, NCVS Survey Director, Associate Directorate for Demographic Programs – Survey Operations, 301-763-3165
Megan Ruhnke, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations, 301-763-9842
Chris Seamands, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations, 202-809-7339
Scott Raudabaugh, Chief, Crime Surveys Programming & Population Support Branch, Demographic Systems Division, 301-763-5448
David Hornick, Lead Scientist, Demographic Statistical Methods Division, 301-763-4183
Appendix Ah
NCVS Forms
Forms Used with All Sampled Householdsi
(completed by interviewers in-person or on the phone)
Form Number | Title | Description | Frequency | Attachment
NCVS-500 (new) | Redesigned Control Card | "Control Card" Lists a roster of all persons living in the household with ages and other characteristics to help the interviewer determine who should be interviewed. Respondent questions also available in Spanish. | Monthly (2x/yr per household) | 1
NCVS-1 (new) | Redesigned Basic Screen Questionnaire | "Screener" Screens for crime incidents. Respondent questions also available in Spanish. | Monthly (2x/yr per household assigned the NCVSR instruments) | 1
NCVS-2 (new) | Redesigned Crime Incident Report | "Incident Report" Collects detailed information about each incident identified in the screener. Respondent questions also available in Spanish. | Monthly (2x/yr per household assigned the NCVSR instruments) | 1
Forms Used with Some Householdsj
(completed by interviewers in-person or on the phone)
Form Number | Title | Description | Frequency | Attachment
NCVS-541 (new) | Reinterview Questionnaire | "Reinterview" Asked of respondents to evaluate the performance of a sample of field representatives. | As needed | 13
Forms Used with Some Households
(standard forms used by interviewers upon request to provide more information)
Form Number | Title | Description | Frequency | Attachment
NCVS-110 | NCVS Fact Sheet | "Fact Sheet" This is a brochure for the field representatives to give to respondents if they have questions about the NCVS. | As needed | 7
NCVS-110 (SP) | Spanish NCVS Fact Sheet | "Spanish Fact Sheet" Spanish translation of the NCVS-110. | As needed | 7
NCVS-110 (Arabic) | Arabic NCVS Fact Sheet | "Arabic Fact Sheet" Arabic translation of the NCVS-110. | As needed | 7
NCVS-110 (Chinese) | Chinese NCVS Fact Sheet | "Chinese Fact Sheet" Chinese translation of the NCVS-110. | As needed | 7
NCVS-110 (Korean) | Korean NCVS Fact Sheet | "Korean Fact Sheet" Korean translation of the NCVS-110. | As needed | 7
NCVS-110 (Vietnamese) | Vietnamese NCVS Fact Sheet | "Vietnamese Fact Sheet" Vietnamese translation of the NCVS-110. | As needed | 7
Forms Used by the Field Representatives
(Interviewing Manuals and Training Materials)
Form Number | Title | Description | Frequency | Attachment
NCVS-521RE | Redesign NCVS Self-Study | "Self-Study Training Guide" Self-study on the new instrument for field representatives to be completed prior to attending the classroom training. | As needed | 9
NCVS-522RE | Classroom Trainer Guide | "Trainer's Guide" The classroom training guide for the new instrument used by the trainer. | As needed | 10
NCVS 523RE | FR Training Workbook | "Classroom Workbook" Workbook used by field representatives during classroom training on the new instrument. | As needed | 11
11-170 | NCVS Quality Control Reinterview CATI: Reinterviewer Training Guide | "CATI Reinterviewer Training Guide" Training guide for CATI reinterview process. | As needed | 14
11-171 | NCVS Quality Control Reinterview CATI: Reinterviewer Training Workbook | "CATI Reinterviewer Training Workbook" Training workbook for CATI reinterview process. | As needed | 15
11-172 | NCVS CATI Quality Control Reinterview: Supervisor's Manual and Self Study | "CATI Reinterview Trainer's Guide" Trainer's guide for CATI reinterviewer training. | As needed | 16
Field Division Current Surveys: Reinterviewer's Self-Study | Field Division Current Surveys: Reinterviewer's Self-Study | "CAPI Reinterview Self-Study" Self-study for the CAPI reinterview process. Generic across several Census surveys with an NCVS-specific chapter. | As needed | 17
NCVS CATI Reinterview Instrument Redesign Training Memorandum | NCVS CATI Reinterview Instrument Redesign Training Memorandum | "2024 National Crime Victimization Survey (NCVS) CATI Reinterview Instrument Redesign" Memorandum to all NCVS CATI reinterviewers with overview of redesigned reinterview instrument. | As needed | 18
NCVS CAPI Reinterview Instrument Redesign Training Memorandum | NCVS CAPI Reinterview Instrument Redesign Training Memorandum | "2024 National Crime Victimization Survey (NCVS) CAPI Reinterview Instrument Redesign" Memorandum to all NCVS reinterviewers with overview of redesigned reinterview instrument. | As needed | 19
NCVS-570 | NCVS Regional Office Manual | Regional Office manual for performance guidelines. | As needed | 12
Letters
(provided to respondents)
Form Number | Title | Description | Frequency | Attachment
NCVS-572(L) | Introductory letter | "Introductory letter" Introductory letter mailed to households prior to data collection for first time in sample. | Mailed to incoming households | 3
NCVS-572(L)SP | Spanish Introductory letter | "Spanish Introductory letter" Spanish translation of the NCVS-572(L). | As needed | 3
NCVS-572(L)AR | Arabic Introductory letter | "Arabic Introductory letter" Arabic translation of the NCVS-572(L). | As needed | 3
NCVS-572(L)CH | Chinese Introductory letter | "Chinese (Simplified) Introductory letter" Chinese translation of the NCVS-572(L). | As needed | 3
NCVS-572(L)KOR | Korean Introductory letter | "Korean Introductory letter" Korean translation of the NCVS-572(L). | As needed | 3
NCVS-572(L)VI | Vietnamese Introductory letter | "Vietnamese Introductory letter" Vietnamese translation of the NCVS-572(L). | As needed | 3
NCVS-593(L) | Thank-you letter | "Thank-you letter" Letter sent to households that completed an interview. | As needed | 8
NCVS-593(L)SP | Spanish Thank-you letter | "Spanish Thank-you letter" Spanish translation of the NCVS-593(L). | As needed | 8
Appendix B
OMB approved revisions to the NCVS
1999-2000
The NCVS has been used as the vehicle for developing questions to obtain information about a variety of initiatives related to crime and crime victimization. In 1999, a set of questions was added to the survey to obtain information about hate crime victimization. In 2000, in response to a Congressional mandate, questions were added on a test basis to collect information about the victimization of people with developmental disabilities. The Census Bureau, in conjunction with BJS, developed questions to collect this information as part of the NCVS beginning in July 2000. Also, beginning in July 2000, questions pertaining to the respondent's lifestyle and home protection were removed from the NCVS to enable adding the disability questions without increasing respondent burden.
2001
Per Executive Order 13221 signed by the President on October 16, 2001, BJS worked to develop questions designed to elicit information from NCVS respondents about the vulnerability and occurrences of computer-related crime. With the ever-expanding growth and use of the internet, including a rapid growth of internet-related commerce, there was growing concern about vulnerability of people to a variety of offenses related to its use. Such offenses include attacks by computer viruses, fraud in purchasing online, threats via email, and unrequested lewd or pornographic emails.
In addition to adding the computer crime questions to the NCVS, BJS implemented revised employment questions and expanded the victim-offender relationship answer categories on the NCVS-2, Crime Incident Report. The employment questions are used to obtain more detailed information about the industry and occupation of employed respondents who were victims of crime. The revised answer categories for the victim-offender relationship questions provide more detailed information about employee-employer type relationships of victims to their offenders.
2003
In January of 2003, BJS implemented several changes to the NCVS-500 Control Card and the NCVS-1 Basic Screen Questionnaire to comply with OMB’s 1997 guidelines for collecting data on race and ethnicity from the respondent. These changes included:
Replacing the existing single-response race question with a multiple-response race question and allowing a maximum of four categories (races) to be selected by the respondent.
Incorporating revised race answer categories for the race question.
Modifying the question wording of the ethnicity question.
Asking the ethnicity question prior to the race question, rather than after the race question.
In 2003 the NCVS replaced the education questions, “Education-highest grade” and “Education-complete that year?” with a single question that asks about “Education-highest grade completed?” This question included expanded answer categories for the 12th grade high school educational level and higher educational degrees as well.
2004
In January 2004, two new questions were added to help determine if a sample unit is located within a gated/walled or restricted access community. Also, at this time, two new questions were added to the crime incident report to collect information about the number of guns stolen and the number of other firearms stolen.
Because small sample sizes limited the utility and reliability of the computer crime data, in July 2004, the computer crime questions were removed from the survey and household identity theft questions were added. These questions, on the use or unauthorized use of credit cards, existing accounts, or personal information, were added to the NCVS-1, Basic Screen Questionnaire in an effort to measure the level and change in identity theft victimization among households over time.
2005
As research began to indicate that pregnant women might be at a higher risk of being a victim of violent crime, in July 2005 a question was added to the NCVS crime incident report to determine the pregnancy status of all female respondents age 18 to 49 at the time the incident occurred.
2007
In January 2007 BJS modified questions regarding respondent disabilities, in order to match the set of disability questions asked on the American Community Survey (ACS). BJS also modified the response category to the NCVS-2 question about the relationship of the offender to the respondent by adding the category “Teacher/School staff.” Also in 2007, the NCVS sample size was reduced due to budgetary constraints.
2008
When BJS conducted the first Identity Theft Supplement (ITS) from January-June 2008, the set of questions on identity theft from the NCVS-1 screener were removed for that period. In addition, changes were made to the set of questions regarding disabilities based on changes implemented in the ACS.
In July 2008 the set of questions on identity theft in the NCVS-1 screener question section were revised and reinserted into the NCVS-1. Additionally, a set of questions pertaining to the emotional and psychological impact of victimization and victim help-seeking behaviors was added to the NCVS-2 and asked of all violent crime victims. This set of questions was originally asked as part of the ITS.
To offset the respondent burden added by the inclusion of the emotional toll questions, the set of questions involving vandalism and hate-motivated vandalism, which was of limited utility due to small sample sizes, was removed at this same time.
2010
In October 2010, in order to restore the NCVS’s ability to measure the extent and characteristics of crime and to measure year-to-year change in the victimization rates, sample that was removed in 2007 began to be reinstated. The sample reinstatement increased the monthly sample about 26%, from about 8,500 households to about 10,700 households.
2012
In January 2012, BJS revised the set of questions collecting data on the race(s) and ethnicity of offender(s). This modification brought the race of offender questions into compliance with the Office of Management and Budget (OMB) 1997 Standards for the Classification of Federal Data on Race and Ethnicity. The revised set of questions asks first about the offender(s)’ relationship to the victim, followed by questions about the offender(s)’ gender, age, ethnicity, and race; and ends with questions about gang involvement and drug or alcohol use. There are two sections: one for crimes committed by a lone offender and one for crimes committed by multiple offenders.
In July of 2012, household questions on identity theft were removed from the NCVS-1 screener permanently due to a decision to instead administer a person-level identity theft supplement every other year.
2016
In July 2016, additional socio-demographic questions were added to the NCVS-1 including veteran status, citizenship, gender identity, and sexual orientation. Disability questions were also moved from the NCVS-2 Crime Incident Report to the NCVS-1 so they are now asked of all respondents. In addition, household income answer categories were expanded.
2019
In July 2019, questions regarding sexual orientation and gender identity were changed to be administered only to crime victims age 16 or older. In addition, these questions were asked only once of each victim during the time they remained in sample, rather than at their first, third, fifth, and seventh interviews.
2022
In January 2022, questions regarding sexual orientation and gender identity were changed to be administered to all respondents age 16 or older at their first, third, fifth, and seventh interviews or if they had not been asked the questions before.
2019-2024
In 2019-2020, the redesigned NCVS instrumentation was developed and tested. The instruments were fine-tuned using cognitive and usability testing and then administered in a small-scale pilot test. A large-scale national field test, completed in early 2020, was used to finalize the new instrument design. From July to September 2023, BJS, in coordination with the Census Bureau, conducted an operational pilot test. This pilot test informed the final field protocols and interviewer training needs for the larger split-sample administration conducted in 2024. The split-sample administration informed the final field protocol and interviewer training needs for the full-scale implementation of the new NCVS instrument in 2025.
a For a more complete description of the 2010-based sample design, see National Crime Victimization Survey, Technical Documentation, NCJ 251442.
b Ohlsson, Esbjorn (2000). Coordination of PPS Samples Over Time. In The Second International Conference on Establishment Surveys, American Statistical Association, 255-264.
c Table 1. Annual Estimates of the Resident Population for the United States, Regions, States, and Puerto Rico. July 1, 2010 to July 1, 2019 (NST-EST2019-01). Source: U.S. Census Bureau, Population Division. Release Date: December 2019.
d Criminal Victimization in the 22 Largest U.S. States, 2017-2019 (NCJ 305402, March 2023).
e For more information on the panel design see Determining the Optimal Number of Interview Waves in the National Crime Victimization Survey: Evaluation and Recommendations, NCJ 249878, November 2016.
f User's Guide to The National Crime Victimization Survey (NCVS) Generalized Variance Functions (GVF).
h All included attachments are the most recent versions of these documents.
i In July 2006, the NCVS was fully automated and, as such, paper forms are no longer used to complete the survey. In 2024, the NCVS sample will be divided so that approximately half of households will be interviewed using the new (redesigned) instrument and half will be interviewed using the current instrument.
j In July 2006, the NCVS was fully automated and, as such, paper forms are no longer used to complete the survey.