Supporting Statement –
National Crime Victimization Survey Instrument Redesign and Testing Project: Field Test (National Survey of Crime and Safety)
A. Justification
1. Necessity of the Information Collection
The Bureau of Justice Statistics (BJS), U.S. Department of Justice, requests clearance to conduct a field test of newly revised National Crime Victimization Survey (NCVS) questionnaires (OMB No. 1121-0111). The field test will use the title “National Survey of Crime and Safety” to distinguish it from the ongoing NCVS data collection. BJS is authorized to collect statistics on victimization under Title 34 (Crime Control and Law Enforcement), United States Code, Section 10132 (Attachment 1). BJS, in consultation with Westat under a cooperative agreement (Award 2013-MU-CX-K054, National Crime Victimization Survey Instrument Redesign and Testing Project), has worked to redesign the NCVS survey instruments and mode of administration. Activities supporting the development of these questionnaires, including cognitive and usability testing, have been approved through the BJS OMB generic clearance agreement for activities under the National Crime Victimization Survey Redesign Research program (OMB No. 1121-0325) and the OMB generic clearance agreement for Cognitive, Pilot, and Field Studies for Bureau of Justice Statistics Data Collection Activities (OMB No. 1121-0339).
Since 1972, the NCVS has provided national data on personal and household victimization, both reported and not reported to police. The data collection allows BJS to fulfill its mission of collecting, analyzing, publishing, and disseminating information on victims of crime. The NCVS is one of the two principal measures of crime in the United States, along with the Federal Bureau of Investigation’s (FBI) Uniform Crime Reporting Program. Involving nearly 250,000 interviews annually, the NCVS is the second-largest survey in the country, behind the Census Bureau’s American Community Survey. The NCVS captures the “dark figure of crime” (crimes not reported to police), provides important knowledge about criminal victimization, and yields relevant demographic information about both victims and offenders. Together with the FBI’s statistics on crimes reported to law enforcement agencies, the NCVS provides an understanding of the nature of, and changes in, the nation’s crime problems.
The NCVS is currently an important source of annual national data on a number of policy-relevant subjects related to criminal victimization, including intimate partner violence, hate crime, workplace violence, injury from victimization, guns and crime, the cost of crime, reporting to police, and crime against vulnerable populations, such as the elderly, juveniles, and persons with disabilities. The NCVS is also a vehicle for the implementation of routine survey supplements that provide detailed information on timely and relevant topics such as identity theft, school crime, and contacts between the police and the public.
The NCVS was last redesigned more than 25 years ago. Since then, much has changed, both in the level of public acceptance of surveys and in the nature of crime. The primary purpose of the NCVS Instrument Redesign and Testing Project (NCVSIRTP) is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident (NCVS-2) instruments in support of BJS’s efforts related to increasing the efficiency, reliability, and utility of the NCVS. The NCVSIRTP field test will take into account previous testing results and will be used to determine the feasibility of administering the revised instrument, the utility of data collected, and the impact of the instrument revisions on victimization estimates. In addition to testing new versions of NCVS instruments, the field test will also assess the mode of data collection and the effects of promised incentives on web survey completion rates and data quality.
BJS is requesting a one-year OMB clearance from September 2019 through September 2020 with data collection conducted from October 2019 through August 2020. During this period, the field test will be administered to all respondents age 12 or older in sampled households.
2. Needs and Uses
Since 1972, the NCVS and its predecessor, the National Crime Survey (NCS), have provided national data on the level of and change in nonfatal personal crimes (rape or sexual assault, robbery, aggravated and simple assault, and personal larceny) and property crimes (burglary, motor vehicle theft, and other theft), both reported and not reported to police. The NCVS is one of the two main sources of data on crime in the United States and the only source that provides detailed information on the nature and consequences of crime and on crimes not reported to police. By capturing crimes not reported to police, as well as those known to law enforcement, the NCVS serves as the primary, independent source of information on crime in the United States. Understanding unreported crime also helps to inform the appropriate allocation of criminal justice system and victim service resources and provides a better understanding of victim decision-making, responses to crime, and the resulting consequences.
Beginning in the late 2000s, BJS initiated a substantial multi-stage redesign effort to contain survey costs while enabling the NCVS to meet stakeholder needs for reliable and timely statistics on criminal victimization that are independent of police agency reports, as well as to generate subnational estimates of criminal victimization. BJS has undertaken a number of research projects to respond to recommendations from the Committee on National Statistics (CNSTAT) of the National Research Council on increasing the relevance and quality of NCVS data.1 These ongoing projects have been conducted under separate clearance packages and include efforts to conduct a low-cost, self-administered companion survey to collect local estimates of victimization (OMB No. 1121-0351); testing of various approaches to improve the measurement of rape and sexual assault (OMB No. 1121-0343); the development of a subnational program with a combination of model-based estimates and direct estimates through a boost of NCVS sample in the 22 most populous states; and a major overhaul of the NCVS survey instrument to modernize it, improve measurement of victimization and incident characteristics, increase its flexibility for measuring emerging crime types, and capture indicators of safety and security and perceptions of police that go beyond experiences with victimization. Completion of this last activity is the subject of this request.
Modernization and methodological developments to increase utility
In early 2014, BJS initiated the NCVSIRTP through a competitive award to Westat, Inc. The NCVSIRTP is a major multi-year effort to overhaul the existing survey instrument. The overarching objective of the project is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident (NCVS-2) instruments in support of BJS’s efforts related to increasing the efficiency, reliability, and utility of the NCVS. Through the project, BJS aims to evaluate and modernize the organization and content of the NCVS; improve measurement; improve the efficiency of the instruments and the current core-supplement design; and develop a procedure for introducing routine improvements to the survey in order to capture emerging crime types and time-relevant topics. The project work is expected to be completed in 2020.
One of the first steps in the project was a comprehensive assessment of the instrument to determine which survey items are being used and how, which survey items are problematic in their language and placement, and where there are gaps in the content of the instrument. The initial assessment provided a better understanding of the substantive and procedural issues with the instrument and helped to identify areas where the content could be improved to enhance current knowledge of victimization and its correlates and to enhance the measurement of these constructs. Through the initial assessment work, content and methodological areas in need of modernization became apparent. Content areas included 1) collecting data on police performance and community safety, 2) adding correlates of victimization, and 3) increasing the utility of data collected about individual incidents. The methodological enhancements included 1) improving the victimization screening, 2) improving the flow and logic of the instrument, 3) improving the measurement of rape and sexual assault, and 4) developing and testing a self-administered web mode for data collection.
Content changes
Police and community measures
One key component of the redesigned instruments is a series of questions pertaining to residents’ perceptions of police legitimacy and satisfaction with police. These questions are intended to provide indications about public perceptions of the police, how they vary by different subgroups and how they change over time. The questions on community safety and fear will provide a similar indicator of public perceptions of issues related to crime, as well as a potential correlate of victimization.
The data from these ‘noncrime’ questions will have utility for members of the law enforcement community, as well as for researchers and policy makers. The BJS Crime Indicators Working Group (CIWG), composed of members of the law enforcement community, provided knowledge and insight into contemporary challenges facing the law enforcement field. The CIWG noted that public perceptions of crime and safety are often as important as the crime rates themselves. The ‘noncrime’ questions could be used to address the expressed concerns of the CIWG and other law enforcement officials who want to be able to assess the relationship between demographic characteristics of residents and their perceptions of neighborhood safety and satisfaction with police.
A second purpose of these items is to increase the relevance of the survey for the majority of respondents, that is, those who do not experience a victimization during their time in the survey panel. This goal may be especially important for maintaining the interest of respondents after the first interview. Half of the field test sample will receive the police items and half will receive the community items.
Correlates of victimization
Since the NCVS was first designed, a great deal has been learned about the correlates of crime and victimization, that is, household and individual characteristics associated with increased risk of being victimized. However, since its inception the NCVS has not been updated with new items to track these correlates. Several questions have now been added on person and household characteristics that have been found to be correlated with victimization, including homelessness in the past 12 months, occupation, disabilities, living with a spouse/boyfriend/girlfriend, and receipt of government assistance.
Expansion and enhanced measurement of crime types
Another area of focus is on improving the measurement of victimization and increasing the crime types covered by the survey. In 2016, the National Academy of Sciences (NAS) released a report recommending that BJS focus on measuring new and emerging crimes in addition to the current street crimes already included on the NCVS.2 The current NCVS measures rape and sexual assault, robbery, physical assault, burglary, larceny, and motor vehicle theft with the core survey instrument, and uses routine supplements to collect information on other crime types like identity theft, stalking and, starting in 2017, financial fraud. The redesigned questionnaire to be used in the field test will also capture respondents’ experiences with vandalism of private property.
Increasing utility of the crime incident report (CIR)
Victim help-seeking. The redesigned instrument will include a more extensive series of questions on formal and informal help-seeking behavior. Although the federal government allocates billions of dollars a year to provide services and compensation to crime victims through the Crime Victims Fund, very little data currently exist about who receives this money, about gaps in services, and about the compensation provided. The current NCVS instrument asks only two questions related to whether the victim received victim services. The Office for Victims of Crime laid out the need for more comprehensive data in its Vision 21 report (http://ovc.ncjrs.gov/vision21/pdfs/Vision21_Report.pdf).
BJS’s redesigned instrument will enhance the capacity of the survey to measure both formal and informal victim help-seeking behaviors. The redesigned instrument expands the information collected about why victims do or do not receive formal services, the type of assistance they receive, and their level of satisfaction with the assistance received. The redesigned survey has also added questions about informal help-seeking behaviors, such as speaking to a family member, friend, or religious leader, and has sought to improve current NCVS questions about the consequences of victimization including injuries, receipt of medical and mental health care, and emotional reactions following a victimization.
Reactions to contact with the police. The current NCVS instrument includes questions on what happens when a victim contacts the police. The redesigned survey updated these questions to include modern policing methods (e.g., use of the internet and telephone contacts), as well as asking about the victim’s reactions and satisfaction with their encounters with the police.
Enhanced collection of reactions by victims. The NCVS is one of the primary sources of information on how victims react when attacked (e.g., self-protective measures and types of resistance). The redesign has modified these items to provide an expanded view of how victims of different types of attacks (e.g., physical assault and sexual violence) react during the incident.
Methodological changes
Improving the victimization screener
The NCVSIRTP has streamlined the screening questions, which ask respondents to report whether they experienced various types of crime victimizations during the last six months.3 The current screeners, first implemented in 1992, incorporate a wide range of verbal cues and examples to prompt recall of victimizations. In addition, the survey organizes the questions in a “blocked” format; in this format, all of the screening items are administered prior to any of the associated follow-up questions that gather more detail about each incident.
Over time, evidence has accumulated that the approach taken in 1992 may not be working as well as intended. For example, it is apparent from time-stamp data and from direct observation of NCVS interviews that interviewers often go through the examples in the screening questions very quickly or skip them entirely, sometimes at the insistence of the respondents. This is especially the case after the first time-in-sample (i.e., sampled households and eligible household members are interviewed every 6 months for a total of 7 interviews), after the respondent learns what is in the survey. Further, the “blocked” organization of the screening items may be less effective in a longitudinal setting than in a cross-sectional context, since respondents may learn the connection between answers to the screening items and the administration of a large number of follow-up questions. Intermixing at least some of the follow-up questions with the initial screening items (an approach called “interleaving”) may offer advantages over the blocked approach—producing a more conversational flow to the questions, improving the routing to later items, and possibly improving data quality and reliability.
Improving the measurement of rape and sexual assault
The measurement of rape and sexual assault has been improved by updating the methods used to screen and classify these incidents (e.g., using behaviorally specific language that defines what is meant by sexual contact). These improvements were based on prior research and recommendations on measuring these crimes (Kruttschnitt, C., Kalsbeek, W.D., & House, C.C. (2014). Estimating the incidence of rape and sexual assault. Washington, DC: National Academies Press). In addition to changing the screening items, the redesign has modified the CIR to improve the classification of these types of incidents.
Self-administered mode
The anticipated changes to, and improved measurement of, the types of crimes captured by the NCVS may require changes to the survey methodology to ensure that the information collected is accurate and reliable. BJS has developed questionnaires that may be either interviewer- or self-administered. Self-administration via the web has potential benefits but also some challenges. The NCVS collects sensitive information about respondents’ victimization experiences, and self-administration will increase privacy for respondents and may enhance reporting, improving data quality. In addition, allowing self-administration via the web may reduce data collection costs because there would be less interviewer contact with respondents. Challenges of self-administration via the web include the potential for lower response rates, lower item response rates, data quality issues, and selection bias. The field test will include (1) an interviewer-administered version of the current NCVS instrument (condition 1), (2) an interviewer-administered, revised questionnaire (condition 2), and (3) a self-administered, web-based version of the revised questionnaire (condition 3). This design will support assessment of the revised content and methodological changes against the current NCVS and, separately, the effect of self-administration on estimates from the revised instruments.
3. Use of Information Technology
Respondents to the NCVSIRTP field test are individuals living in households. Westat will collect the data using in-person and telephone interviews, and will ask respondents to complete questionnaires themselves on the web, using either their own device or the interviewer’s. The current NCVS instrument (condition 1) will be administered using the same computer-assisted personal interviewing (CAPI) methods that the U.S. Census Bureau uses in the ongoing NCVS. The redesigned questionnaires will be administered through a web-based program running on interviewers’ laptops that will also be compatible with a variety of devices, including smartphones and tablets.
The use of fully automated interviewing technologies, including interviewer-administered interviews and self-administered interviews, reduces data collection costs as well as respondent and interviewer burden. Furthermore, automated instruments reduce the amount of data inconsistency and the need for extensive post-data collection editing and imputation processes. The use of technology results in more accurate data products that are delivered in a timelier fashion.
4. Efforts to Identify Duplication
The administration of one version of the NCVSIRTP field test (i.e. the interviewer-administered version of the current NCVS) is technically duplicative, on a much smaller scale, of the NCVS that is currently being administered in the field; however, the comparison of the current NCVS instrument to the redesigned NCVS instrument is critical for field test design. The NCVS does not duplicate any other effort in the field. There is no other omnibus survey that can be used to generate annual national statistics on a range of crimes and victim responses to crimes regardless of whether the victimization was reported to the police. The NCVSIRTP field test is also not duplicative of any other development activity.
The FBI’s Uniform Crime Reports (UCR) data cover a similar range of crimes as the NCVS, but are limited to only those crimes known to the police. One of the central goals of the NCVS is to complement the picture the UCR provides by offering the victim’s perspective on crime.
The FBI’s National Incident-Based Reporting System (NIBRS) also includes similar crimes as the NCVS (as well as a number of additional offense types) and collects basic demographic data on the age, sex, and race of victims and offenders. Like the UCR, NIBRS includes only crimes known to police. It is also limited by a lack of information on the victim response to criminal incidents. To date, about a third of all law enforcement agencies report NIBRS data to the FBI.4 These reporting agencies cover only a portion of the population of the United States, meaning that the data are not nationally representative.
5. Efforts to Minimize Burden on Small Businesses
N/A. The NCVSIRTP field test is a household-based sample and does not impact small businesses or small entities.
6. Consequences of Less Frequent Collection
This is a one-time data collection.
7. Special Circumstances
N/A. Collection is consistent with the guidelines in 5 CFR 1320.9.
8. Adherence to 5 CFR 1320.8(d) and Outside Consultations
Outside Consultations:
Throughout the NCVSIRTP, and specifically throughout the development of the revised questionnaires, BJS has consulted with a variety of data users, as well as with federal government and outside experts with knowledge and experience in criminal justice research and survey methodology. To date, BJS has held four Technical Review Panels for the NCVSIRTP with data users and experts. Those consulted on the redesign effort include:
Dr. Bonnie Fisher, University of Cincinnati
Rachel Hansen, Statistician, National Center for Education Statistics
Dr. Dan Hartley, Coordinator for Workplace Violence Prevention Research, National Institute for Occupational Safety and Health
Dr. Allyson Holbrook, University of Illinois at Chicago
Dr. Kristy Holtfreter, Arizona State University
Aviva Kurash, International Association of Chiefs of Police
Dr. Frauke Kreuter, Survey Methodologist, Joint Program in Survey Methodology
Dr. Janet Lauritsen, University of Missouri, St. Louis
Dr. Colin Loftin, University at Albany
Dr. James Lynch, Chair, Department of Criminology, University of Maryland
Anne Menard, Chief Executive Officer, National Resource Center on Domestic Violence
Dr. Michael Reisig, Arizona State University
Dr. Wes Skogan, Northwestern University
Dr. Min Xie, University of Maryland
9. Paying Respondents
Payments or gifts to respondents are not provided in return for participation in the interviewer-administered portions of the survey. During the household roster interview for condition 3, interviewers will provide all households with a nonmonetary incentive (i.e., a magnet) to serve as a reminder of the survey when they are contacted two months after the household roster interview to complete the redesigned NCVS instrument on the web using their own devices. For the web self-administered questionnaire, an experiment will be embedded to test the effects of a promised incentive on survey completion and data quality. A portion of respondents asked to complete the web self-administered questionnaire will be promised a $20 gift card upon completion.
This incentive is critical to the planned experiment, which will provide BJS with information on the efficacy and effects of such a gift. While the effects of monetary incentives on response rates and data quality are well understood,5 less is known about their effects on web self-administration following in-person interviewer contact.6
The incentive amount of $20 was chosen to increase response rates while maintaining cost efficiency. Research has found that incentives do have positive effects on response rates, but the returns diminish as the size of the incentive increases.7 An incentive experiment conducted as part of another victimization survey found that incentives in the $20 to $30 range would likely result in the best survey participation rates and may lead to a larger and more representative sample.8
For the planned experiment, the expectation is that the promised gift card will increase the response rate and reduce the overall per-completion survey cost.
10. Assurance of Confidentiality
All information collected during the NCVSIRTP field test is confidential by law – Title 34, United States Code (USC), Sections 10134 and 10231. All respondents who participate in the interviewer-administered surveys will be provided the BJS confidentiality pledge and assurance that the identity of all participants and victims will be protected as required under Title 34, USC, Sections 10134 and 10231. The consent form assures confidentiality to all respondents and explains that their information is protected by a Privacy Certificate, that their participation is completely voluntary, that no identifying information will be released, and that information they provide during the interview is prohibited from use in any legal action (Attachments 2a and 2b). All respondents who participate in the survey by telephone will be verbally presented with this information (Attachments 3a, 3b, and 3c). Respondents completing the survey on their own on the web will be provided this information in an invitation letter, e-mail, or text, and also on the introductory screen to the survey (Attachments 4a, 4b, 5a, 5b, and 6). BJS and Westat hold in confidence any information that could identify an individual according to Title 34, United States Code, Sections 10134 and 10231.
BJS and Westat have procedures in place to guard against disclosure of personally identifiable information. As required under Title 34 USC, section 10231, BJS and its data collection agents will take all necessary steps to mask the identity of survey respondents, including suppression of demographic characteristics and other potentially identifying information, especially in situations in which cell sizes are small. The NCVSIRTP data will be maintained under the security provisions outlined in the U.S. Department of Justice regulation 28 CFR § 22.23 which can be reviewed at www.bjs.gov/content/pub/pdf/bjsmpc.pdf.
The procedures proposed for this study have been submitted to Westat’s Institutional Review Board (IRB) to ensure that the data collection procedures comply with human subjects protection protocols and confidentiality regulations. Final approval has not yet been received, and data collection will not begin until IRB approval is received.
11. Justification for Sensitive Questions
The current NCVS and the redesigned instruments ask about experiences such as rape and sexual assault, as well as other types of victimization that may be sensitive for some respondents. Given the objective of the NCVS—to estimate the amount of victimization in the nation—this is necessary as BJS would not be able to provide a complete picture of nonfatal violent victimization without asking about such experiences. The victimization and incident details that are collected are necessary for accurately classifying the types of crimes the NCVS measures. NCVSIRTP field test interviewers will receive training and guidance on how to ask sensitive questions. The importance of estimating crime levels, as well as the potential value of detailed information about victimization for designing crime prevention strategies, will be explained to any respondent with questions. All respondents have the option of refusing to answer any question. Response rates and data quality will be assessed across the modes of data collection.
12. Estimate of Respondent Burden
Table 1. Burden estimates
Instrument | Condition 1 | Condition 2 | Condition 3 | Total
Household Roster | | | |
Number of respondents | 2,080 | 3,467 | 3,752 | 9,299
Number of responses per respondent | 1 | 1 | 1 |
Average time per response (hours) | 0.15 | 0.15 | 0.15 |
Total burden (hours) | 312 | 520 | 563 | 1,395
Police and community items | | | |
Number of respondents | N/A | 5,107 | 4,122 | 9,229
Number of responses per respondent | N/A | 1 | 1 |
Average time per response (hours) | N/A | 0.07 | 0.07 |
Total burden (hours) | N/A | 357 | 289 | 646
Victimization Screener | | | |
Number of respondents | 3,064 | 5,107 | 4,122 | 12,293
Number of responses per respondent | 1 | 1 | 1 |
Average time per response (hours) | 0.15 | 0.20 | 0.15 |
Total burden (hours) | 460 | 1,021 | 618 | 2,099
Crime Incident Report | | | |
Number of respondents | 576 | 960 | 738 | 2,274
Number of responses per respondent | 1.3 | 1.3 | 1.3 |
Average time per response (hours) | 0.25 | 0.30 | 0.25 |
Total burden (hours) | 187 | 374 | 240 | 801
Grand total burden hours | 959 | 2,273 | 1,709 | 4,941
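As a minimal arithmetic sketch (using only the figures shown in Table 1, with rounding to whole hours assumed), each total-burden cell is the product of the number of respondents, the number of responses per respondent, and the average time per response in hours, and the grand totals sum the instrument totals within each condition. For example:
Crime Incident Report, Condition 2: 960 × 1.3 × 0.30 ≈ 374 hours
Condition 1 grand total: 312 + 460 + 187 = 959 hours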
13. Estimate of Respondent’s Cost Burden
14. Costs to Federal Government
The total cost to the federal government for the NCVSIRTP field test data collection is an estimated $10.2 million (Table 2). Westat will act as the data collection agent on behalf of BJS for the field test at an estimated cost of $10 million. Westat has developed, tested, and programmed the NCVSIRTP field test instruments, and will develop all data collection support and training materials, train interviewers and support staff, and collect, process, and report on the field test data. BJS costs total about $158,600, and cover overall program management, review, feedback, and discussion of deliverables from Westat, and any dissemination activities. BJS bears all costs of the survey.
Table 2. Estimated costs for NCVSIRTP field test
Estimated contractor (Westat) costs
Activity | Costs
Design and development | $550,000
Data collection and processing | $8,750,000
Reporting | $700,000
Subtotal: Estimated costs for Westat | $10,000,000

Estimated BJS costs
BJS personnel | Costs
GS-13 Statistician (40%) | $45,000
GS-14 Statistician (8%) | $10,600
GS-15 Supervisory Statistician (5%) | $7,800
GS-13 Technical Editor (3%) | $4,000
GS-12 Production Editor (2%) | $1,900
GS-13 Digital Information Specialist (2%) | $2,200
GS-15 Chief Editor (2%) | $3,100
Senior BJS management (GS-14, GS-15, SES, Director) | $33,100
Subtotal: Salaries | $107,700
Fringe benefits (28% of salaries) | $30,200
Subtotal: Salary and fringe | $137,900
Other administrative costs (15% of salary and fringe) | $20,700
Subtotal: Total BJS personnel costs | $158,600

Total estimated costs | $10,158,600
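As a minimal arithmetic check (using only the rounded figures shown in Table 2):
Salaries and fringe: $107,700 + $30,200 (28% fringe) = $137,900
Total BJS personnel costs: $137,900 + $20,700 (15% administrative) = $158,600
Total estimated costs: $10,000,000 (Westat) + $158,600 (BJS) = $10,158,600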
15. Reasons for Change in Burden
This is a new request.
16. Project Schedule and Publication Plans
Pending OMB approval, the NCVSIRTP field test data collection is scheduled to begin in October 2019. Letters will be mailed to sampled addresses in October 2019. Field work for Conditions 1 (interviewer-administered, current NCVS) and 2 (interviewer-administered, redesigned NCVS) will start in October 2019 and continue through March 2020, including any nonresponse follow-up. Field work (household roster interview) for Condition 3 (self-administered, redesigned NCVS) will start in January 2020 and continue through April 2020. Invitations for the web survey will be sent to enumerated individuals two months after the household interview, from March through June 2020. Nonresponse follow-up for the web survey will continue through August 2020.
The data collection agent (Westat) will produce a final report detailing the field test methodology and findings. The report will examine the following main research questions: 1) comparison of Condition 1 (interviewer-administered, current NCVS) and Condition 2 (interviewer-administered, redesigned instrument); 2) performance of the redesigned instrument, including the interleaved items, non-crime ask-all items, and other revised items; 3) impact of self-administration on estimates and data quality; 4) impact of incentives on estimates and response rates; and 5) potential nonresponse bias in all conditions. Analyses will examine response rates, crime rates, data quality indicators (e.g., break-off rates, item nonresponse rates, timings, number of CIRs completed), distributions of answers to questions, and respondent experience. Finally, the report will provide recommendations for full-scale implementation of revisions to maintain series continuity if changes affect victimization rates. BJS does not plan to archive the NCVSIRTP field test data; these data will be used to inform decisions about the mode of data collection and to make final changes to the instrument before national implementation. The final report is scheduled to be delivered to BJS by late 2020. BJS will review the report and plans to post it to the BJS website as a final deliverable for the NCVSIRTP.
17. Display of Expiration Date
The OMB control number and expiration date are provided to each sampled household as part of the study brochure mailed with the advance letter, and are displayed on the CAPI laptop or read during the interview describing the nature of the survey and the authority to collect the information. They are also provided on the first screen of the self-administered web interview. The brochure and screenshots are included in the attachments (see Attachments 7 and 8).
18. Exception to the Certificate Statement
N/A. There are no exceptions to Certification for Paperwork Reduction Act Submissions. Collection is consistent with the guidelines in 5 CFR 1320.9.
1 The recommendations are contained in two reports, Surveying Victims: Options for Conducting the National Crime Victimization Survey (National Research Council, 2008, https://www.nap.edu/catalog/12090/surveying-victims-options-for-conducting-the-national-crime-victimization-survey) and Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics (National Research Council, 2009, https://www.nap.edu/catalog/12671/ensuring-the-quality-credibility-and-relevance-of-us-justice-statistics).
2 National Academies of Sciences, Engineering, and Medicine. (2016). Modernizing crime statistics – report 1: Defining and classifying crime. Washington, DC: The National Academies Press.
3 Note that the field test will use a 12-month reference period to increase the number of CIRs available for analysis.
4 Details on NIBRS reporting are available through the FBI’s website: https://ucr.fbi.gov/nibrs-overview.
5 E.g., Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129.
6 See Part B. Collection of Information Employing Statistical Methods for additional information on the field test design and planned experiments.
7 Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129.
8 Krebs, C., Lindquist, C., Berzofsky, M., Shook-Sa, B., Peterson, K., Planty, M., Langton, L., and Stroop, J. (2016). Campus climate survey validation study final technical report. Bureau of Justice Statistics, U.S. Department of Justice, R&DP-2015:04, NCJ 249545.