U.S. Department of Justice


Office of Justice Programs


Bureau of Justice Statistics

Washington, D.C. 20531


MEMORANDUM


To: Shelly Wilkie Martinez

Office of Statistical and Science Policy

Office of Management and Budget


Through: Lynn Murray

Clearance Officer

Justice Management Division


William J. Sabol, Ph.D.

Director

Bureau of Justice Statistics


From: Michael Planty, Jennifer Truman, Lynn Langton

Bureau of Justice Statistics

Date: March 12, 2015


Re: BJS Request for OMB Clearance for a pilot vignette study for the National Crime Victimization Survey (NCVS) under the NCVS Redesign Generic Clearance, OMB Number 1121-0325.


The Bureau of Justice Statistics (BJS) requests clearance to conduct a pilot study under the OMB generic clearance agreement (OMB Number 1121-0325) for activities related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program. BJS, in consultation with Westat under a cooperative agreement (Award 2013-MU-CX-K054, National Crime Victimization Survey), is starting the next phase of research to redesign the NCVS for the first time since 1992. In accordance with the generic agreement, BJS is submitting to OMB for clearance the materials for a pilot vignette study assessing how individuals respond to the current NCVS screening items. The intent is to learn how respondents interpret and classify various crime incidents, so that the results can inform the selection and use of screening questions and cues.


Overview


The NCVS is based on research conducted by the Department of Justice and the Census Bureau in the 1970s (summarized in Lehnen and Skogan, 1981; Skogan, 1990; Skogan and Lehnen, 1985). A major redesign, motivated in part by a National Academy of Sciences (NAS) review (Penick and Owens, 1976), was implemented in 1992. A more recent NAS review (Groves and Cork, 2008) has motivated the current BJS redesign effort.


Since 2008, BJS has initiated a number of research projects to assess and improve upon core NCVS methodology, including redesigning the sample plan, comparing alternative modes of interview, reducing non-response bias, examining various reference period lengths, testing the effectiveness of victimization screening questions, and exploring the feasibility of producing sub-national estimates of victimization.


Now, BJS is taking the results from its various methodological studies and using them to inform a redesign of the NCVS. The initial redesign activities are formative in nature. In this request, we describe an experiment to assess how respondents classify incidents in terms of the current NCVS screening items. The same type of vignette study was used in developing the original screening questions (see Biderman, Cantor, Lynch, and Martin, 1986).



Crime Vignette Testing Experiment


The current NCVS screener items are based on research conducted more than 25 years ago (the current screener questions are in Appendix A). Societal norms have changed over the past decades, and the approach adopted in the late 1980s may not be as effective for today's respondents. The issue to be explored in the vignette experiment is whether the cues in the current screener questions prompt respondents to report the types of incidents they are supposed to report. Also, while NCVS panel households are asked to complete the survey seven times over a three-and-a-half-year span, the 1989 screener experiment was based on a single interview only. Census Bureau interviewers report that respondents who are patient with the screener items in the initial interview may stop listening to the probes in later waves of the study (with interviews 5 through 7 particularly challenging). One focus of the redesign will be to revise the current screener items, with the goal of developing a screener that maximizes data quality across all seven interviews.


This first experiment, the focus of this generic clearance request, uses vignettes to examine both the current screening items and shorter versions of them. Vignettes, or brief stories, will be used to examine how respondents decide whether a given scenario should be reported in response to a specific screening item. The experiment examines the existing NCVS screener items, observing whether a sample of adults classifies the incidents described in the vignettes as crimes to be reported. In addition, the experiment includes streamlined versions of the screener items to see whether they lead to different classifications. Later in the redesign process, we plan to use the vignette approach again to test the impact of question wording changes on interpretation and response. The current experiment offers a chance to test and fine-tune this methodology so that it is ready for question testing in later phases. In addition, the results of this initial study will help us identify the characteristics of incidents (for example, the seriousness of their impact) that lead respondents to report them.


Vignettes have often been used to understand how respondents classify events or situations (Biderman et al., 1986; Martin and Polivka, 1995). We propose to administer a set of vignettes to help us understand how respondents determine whether incidents should be reported in response to the NCVS screening items. We examine several factors that may influence whether respondents report specific events, including the seriousness of the incident, the relationship of the offender to the respondent, and, in the case of property crimes, whether the item stolen actually belonged to the respondent. If we can better understand the factors that influence what respondents include in or exclude from their reports, we can revise the screening items so that responses are more in line with the NCVS's objectives. We regard this initial study as a pilot demonstrating the value of the vignette approach. When we have revised screener items at a later stage of the project, the vignette methodology may be useful in testing whether the new items lead to more accurate answers.


Sample Design


The vignettes will be administered to a split sample of respondents, with half receiving the current screening items and half receiving streamlined versions of the items. Within each group, respondents will be asked to answer the screening items based on randomly assigned vignettes. These vignettes will experimentally vary the seriousness of the incident and additional factors, such as the respondent’s relationship with the offender.
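To make the assignment concrete, the following is a minimal sketch in Python of the two-stage randomization described above. It is illustrative only: the function name, the mechanics of the 50/50 split, and the assumed count of six vignette versions per item are our assumptions, not the actual survey software.

    import random

    def assign_respondent(n_items=7, n_vignette_versions=6, rng=random):
        # Stage 1: route the respondent to one screener version (the 50/50 split).
        version = rng.choice(["current", "streamlined"])
        # Stage 2: independently draw one vignette version for each screening item.
        vignettes = [rng.randrange(n_vignette_versions) for _ in range(n_items)]
        return {"screener_version": version, "vignette_ids": vignettes}

    print(assign_respondent())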


Because we are not attempting to generate population estimates, we plan to use a non-probability sample from the web survey vendor SurveyMonkey. SurveyMonkey allows organizations to conduct surveys of their members on the condition that the survey ends with an invitation to join the SurveyMonkey Audience Panel. This service reaches about 2 million individuals a week. As a result, the Audience Panel is continuously updated, contains a very heterogeneous group of people, and has rich member profiles. Many people who would never consider joining a panel by "opting in" through a web ad accept the invitation at the end of a survey because they feel it is safe (since an organization they trust, such as their employer or their local PTA, sponsored the survey).


We aim to collect 1,000 web completes from the SurveyMonkey Audience Panel. The survey will take about 10 minutes, during which respondents will be exposed to either the current or the streamlined versions of seven NCVS screening items. Prior to each screening item, respondents will receive a randomly assigned fictional scenario; they will be asked to answer the screening question based on that scenario. The design will allow us to explore how people classify different types of crimes under the current screening items and how they might respond to shorter items. Below we present power calculations for comparisons between the two versions of the screening items, examining the proportion of respondents classifying an incident as something to be reported. For most comparisons, the power is at least .60 for a seven percentage point difference (e.g., 15 percent saying they would report the incident with the current screener question versus 8 percent with the streamlined version; see the first row of Table 1). We believe this is adequate power for an exploratory study. In addition, we will be collecting four-point scale ratings, so the figures in Table 1 likely understate the power to some extent.


Table 1. Power of the Pilot Vignette Study for Comparing Proportions across Screener Versions

    P1      P2      n      Power
    0.15    0.08    500    0.934
    0.25    0.18    500    0.769
    0.35    0.28    500    0.664
    0.45    0.38    500    0.613
    0.55    0.48    500    0.600
    0.65    0.58    500    0.623
    0.75    0.68    500    0.689
    0.85    0.78    500    0.813
    0.95    0.88    500    0.978
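As an illustration, the values in Table 1 can be approximately reproduced with the standard normal approximation for a two-sided comparison of two independent proportions (n = 500 per screener version, alpha = .05). This is a sketch of a textbook formula, not necessarily the exact calculation behind the table, which may have used a pooled variance or a continuity correction; it agrees with Table 1 to within a few thousandths (e.g., 0.937 versus 0.934 for the first row).

    from scipy.stats import norm

    def power_two_props(p1, p2, n, alpha=0.05):
        # Unpooled standard error of the difference between sample proportions.
        se = ((p1 * (1 - p1) + p2 * (1 - p2)) / n) ** 0.5
        # Approximate power of a two-sided z-test at the given alpha.
        return norm.cdf(abs(p1 - p2) / se - norm.ppf(1 - alpha / 2))

    for p1 in (0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85, 0.95):
        p2 = p1 - 0.07
        print(f"{p1:.2f}  {p2:.2f}  {power_two_props(p1, p2, 500):.3f}")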



For each screening item, we have developed several versions of a basic scenario. The violent crime vignettes vary according to the following dimensions: the level of seriousness of the incident (low, high), and who the offender was (stranger, acquaintance, relative or close friend). The property crime vignettes vary according to the same dimensions, with an added dimension of property ownership (owned by the respondent or borrowed property). Each vignette will be randomly and independently assigned.
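Crossing these dimensions yields 2 x 3 = 6 versions of each violent crime vignette and 2 x 3 x 2 = 12 versions of each property crime vignette; the six example vignettes in the Survey Instrument section below correspond to the six violent-crime cells. A brief sketch of the design space (the dimension labels follow the text above; the enumeration itself is illustrative):

    from itertools import product

    SERIOUSNESS = ["low", "high"]
    OFFENDER = ["stranger", "acquaintance", "relative or close friend"]
    OWNERSHIP = ["respondent's own", "borrowed"]

    violent_cells = list(product(SERIOUSNESS, OFFENDER))              # 2 x 3 = 6
    property_cells = list(product(SERIOUSNESS, OFFENDER, OWNERSHIP))  # 2 x 3 x 2 = 12
    print(len(violent_cells), len(property_cells))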


Besides responses to the seven vignettes, the survey will collect a few demographic characteristics (see questions 1 through 5 in the questionnaire). SurveyMonkey will monitor data collection to ensure a balanced split by respondent sex and a mix across age and race-ethnicity groups.


Survey Instrument


The instrument is designed to assess how varying the dimensions of crime severity and the relationship to the offender (or to the property) changes how respondents interpret the screener questions. Appendix B includes a copy of the full survey instrument to be used in the SurveyMonkey web survey. An example is provided below.


SurveyMonkey respondents will be presented with the following NCVS screener question:


Has anyone attacked or threatened you in any of these ways:

(a) With any weapon, for instance, a gun or knife-

(b) With anything like a baseball bat, frying pan, scissors, or stick

(c) By something thrown, such as a rock or bottle

(d) Include any grabbing, punching, or choking

(e) Any rape, attempted rape or other type of sexual attack

(f) Any face to face threats

(g) Any attack or threat or use of force by anyone at all? Please mention it even if you are not certain it was a crime.


In tandem, the respondent will be provided with one of the following vignettes (assigned at random):


  • Last month, you were at a restaurant. You accidentally bumped into someone. He grabbed your shoulder forcefully and said “watch out!”


  • Last month, you were at a restaurant. You accidentally bumped into someone. He turned around and punched you in the face, giving you a black eye.


  • Last month, you were at a company gathering. On the way out, you accidentally bumped into a co-worker who you don’t know well. He grabbed your shoulder forcefully and said “watch out!”


  • Last month, you were at a company gathering. On the way out, you accidentally bumped into a co-worker who you don’t know well. He punched you in the face, giving you a black eye.


  • Last month, you were out at a restaurant with your friend and had been drinking a bottle of wine. On the way out, you accidentally bumped into him. He grabbed your shoulder forcefully and said “watch out!”


  • Last month, you were out at a restaurant with your friend and had been drinking a bottle of wine. On the way out, you accidentally bumped into him. He punched you in the face, giving you a black eye.


The respondent will be asked whether he or she would answer “YES” to the screener item based on the information provided in their sample vignette (with a four-point response scale ranging from “Definitely ‘yes’” to “Definitely ‘no’”).


The instrument concludes with a few questions about the respondent’s perceptions of personal safety and concern about victimization.


Burden


The estimated burden is indicated below.



                            Number of responses   Anticipated number   Avg. time per   Total time across
                            per respondent        of respondents       response        all respondents
    Crime Vignette Survey   1                     1,000                10 minutes      166.7 hours


The NCVS generic clearance allocated a predetermined combination of sample cases and burden hours that could be used for NCVS redesign efforts. The current sample size and burden hours fall within the remaining allocation.


Language


The vignettes will be administered in English only.


Reporting


Upon completion of testing, a draft report will be delivered to BJS that will include a discussion of how respondents classified each vignette and whether responses differed significantly by version of the screening question and by version of the vignette. The aims of the analysis are to 1) demonstrate the utility of this approach, 2) determine the factors that lead respondents to classify situations as "crimes" (or at least to report them in response to the screening items), and 3) identify situations that are likely to be misreported in the NCVS. This report will be one source of information guiding the effort to revise the NCVS screening items.


Protection of Human Subjects


The research poses minimal risk to subjects. None of the questions are sensitive, and SurveyMonkey will delete any identifying information before turning the data over to Westat for analysis. The survey will be restricted to persons 18 and older.


Informed Consent, Data Confidentiality and Data Security


Informed Consent


SurveyMonkey panelists will be invited by email to take part in the survey. Once panelists follow the link to the web survey, they will be presented with an introduction screen explaining that participation is voluntary and that they can skip any question they do not want to answer. In addition, the introduction will state the approximate length of the survey (10 minutes) and the topic (issues related to crime and crime victimization). The splash page of the survey will include information about how the confidentiality of the data will be protected (see below).


The invitation email and the other survey materials will be reviewed by the Westat IRB.


Data Confidentiality and Security


BJS' pledge of confidentiality is based on its governing statute, Title 42 U.S.C., Sections 3735 and 3789g, which establish the allowable uses of data collected by BJS. Under these sections, data collected by BJS shall be used only for statistical or research purposes and shall be gathered in a manner that precludes their use for law enforcement or any purpose relating to a particular individual other than statistical or research purposes (Section 3735). BJS staff, other federal employees, and Westat staff (the data collection agent) shall not use or reveal any research or statistical information identifiable to any specific private person for any purpose other than the research and statistical purposes for which it was obtained. Pursuant to 42 U.S.C. Sec. 3789g, BJS will not publish any data identifiable to a specific private person (including respondents and decedents). To protect the identity of respondents, no identifying information will be kept on the final data file, and the survey will not collect respondents' names.



REFERENCES


Biderman, A. D., Cantor, D., Lynch, J. P., and E. A. Martin (1986). Final Report of Research and Development for the Redesign of the National Crime Survey. Washington, DC: Bureau of Social Science Research.


Groves, R. and D. Cork (eds.) (2008). Surveying Victims: Options for Conducting the National Crime Victimization Survey. Panel to Review the Programs of the Bureau of Justice Statistics. Committee on National Statistics and Committee on Law and Justice, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.


Lehnen, R. and W. Skogan (eds.) (1981). The National Crime Survey Working Papers, Volume I: Current and Historical Perspectives. Washington, DC: U.S. Government Printing Office.

Martin, E. and A. E. Polivka (1995). "Diagnostics for Redesigning Survey Questionnaires: Measuring Work in the Current Population Survey." Public Opinion Quarterly, 59, 547-567.


Penick, B.K.E. and M.E.B. Owens (1976). Surveying Crime. Panel for the Evaluation of Crime Surveys. Committee on National Statistics, Academy of Mathematical and Physical Sciences. Washington, DC: National Academy of Sciences.


Skogan, W. and R. Lehnen (1985). The National Crime Survey Working Papers, Volume II: Methodological Studies. Washington, DC: U.S. Government Printing Office.


Skogan, W. (1990). "The National Crime Survey Redesign." Public Opinion Quarterly, 54, 256-272.





