Memo to OMB on Cognitive Testing for NCVS Redesign


Research to support the National Crime Victimization Survey (NCVS)


OMB: 1121-0325





U.S. Department of Justice


Office of Justice Programs


Bureau of Justice Statistics

Washington, D.C. 20531


MEMORANDUM



To: Shelly Wilkie Martinez

Office of Statistical and Science Policy

Office of Management and Budget


Through: Lynn Murray

Clearance Officer

Justice Management Division


Jeri M. Mulrow

Acting Director

Bureau of Justice Statistics


From: Michael Planty, Jennifer Truman, Lynn Langton

Bureau of Justice Statistics


Date: June 23, 2016


Re: BJS Request for OMB Clearance for Cognitive Testing under the National Crime Victimization Survey (NCVS) Redesign Generic Clearance, OMB Number 1121-0325.


The Bureau of Justice Statistics (BJS) requests clearance for cognitive interviewing tasks under the OMB generic clearance agreement (OMB Number 1121-0325) for activities related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program. BJS, in consultation with Westat under a cooperative agreement (Award 2013-MU-CX-K054, National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project), is working to redesign the NCVS survey instrument and mode of administration. The NCVS was last redesigned more than two decades ago, in 1992. Much has changed in the interim, both in the level of public acceptance of surveys and in the nature of crime. The primary purpose of the NCVS Instrument Redesign and Testing Project is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident (NCVS-2) instruments in support of BJS’s efforts related to increasing the efficiency, reliability, and utility of the NCVS.


For this clearance, cognitive interviewing will be used to develop and test the revised crime screener, as well as new questions that focus on perceptions of police. In-person cognitive testing of the instruments will be conducted iteratively with up to 90 persons over multiple rounds of testing. Additionally, an online panel of up to 400 respondents will be used to collect examples of types of crimes to help modernize the screener cues (described in more detail below). Together, these cognitive testing tasks will require up to 160 burden hours and will help identify issues with the wording or content of the instruments and estimate the burden associated with completion.


Additional cognitive testing will be requested under separate clearances and will cover crime incident questions and noncrime questions on community safety and security, as well as any additional testing needed on the crime screening questions. The results from the cognitive interviews will be used to prepare final instruments for computer-assisted interviewing (CAI) programming. We anticipate a Web-based set of instruments that may be used by interviewers or by respondents themselves. The Web instruments will undergo usability testing, then a formal pretest. Finally, we plan to conduct a large, two-wave field test. The field test will administer the survey to a representative sample of persons age 12 or older, testing aspects of the design such as mode, response rates, and administration times. OMB approval for usability testing and the pretest will be sought after completion of the cognitive testing in late 2016. We anticipate usability testing early in 2017, the pretest in mid- to late 2017, and requesting clearance for the field test in late 2017.


Description and Purpose of Overall Project


This request is for clearance to conduct cognitive interviews to develop and test the redesigned NCVS instruments, including brief interviews with online panel members to help modernize the recall cues. The cognitive interviews will inform the design and the final instruments to be tested in a larger field test. Before describing the proposed cognitive interviews, we first describe the goals and research questions for the larger NCVS Instrument Redesign and Testing Project.


The NCVS is based on research conducted by the Department of Justice and the Census Bureau in the 1970s (summarized in Lehnen and Skogan, 1981; Skogan, 1990; Skogan and Lehnen, 1985). There was a major redesign implemented in 1992, motivated in part by a National Academy of Sciences (NAS) review (Penick and Owens, 1976). A more recent review by the NAS (Groves and Cork, 2008) has motivated the current BJS redesign effort.


Since 2008, BJS has initiated a number of research projects to assess and improve upon core NCVS methodology, including redesigning the sample plan, comparing alternative modes of interview, testing methods for reducing non-response bias, experimenting with various reference period lengths, testing the effectiveness of new victimization screening questions, and exploring the feasibility of producing sub-national estimates of victimization.


The current effort is part of BJS’s work (with Westat under a cooperative agreement) to develop a new design for the NCVS. The overarching objective of the project is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident (NCVS-2) instruments in support of BJS’s efforts related to increasing the efficiency, reliability, and utility of the NCVS. Through the project, BJS aims to evaluate and modernize the organization and content of the NCVS; improve the efficiency of the instruments and the current core-supplement design; and develop a procedure for introducing routine improvements to the survey in order to capture emerging crime types and time-relevant topics.


One of the first steps in the project was a comprehensive assessment of the instrument to determine which survey items are being utilized and how, which survey items are problematic in their language and placement, and where there are gaps in the content of the instrument. The initial assessment provided a better understanding of the substantive and procedural issues with the instrument and helped to identify areas where the content could be improved to enhance current knowledge of victimization and its correlates.


Another area of focus is on improving the measurement of and increasing the range of crime types covered by the survey. The current NCVS captures rape and sexual assault, robbery, physical assault, burglary, larceny, and motor vehicle theft through the core survey instrument and uses routine supplements to collect information on other crime types, such as identity theft and stalking. However, the rates of victimization for these supplemental crimes are not incorporated into the overall victimization rates. Other growing crimes, like financial fraud, are not measured by the survey at all. One of the goals of the redesign is to expand the crime screener to incorporate a broader range of crimes, including some, like fraud, that are not typically reported through official police statistics. Additional efforts focus on modernizing the crime screener; improving the measurement of highly sensitive crimes like rape and sexual assault and intimate partner violence; and building in a series of questions on perceptions of police and community safety to be asked of all respondents (‘ask all’/‘noncrime’ items). The anticipated changes to and improvement of the types of crimes measured by the NCVS may require changes to the survey methodology to ensure that the information collected is accurate and reliable. Through this project, we will also examine the feasibility of using self-administered approaches to collect data comparable to those obtained through an interviewer-administered mode.


Overall, the NCVS Instrument Redesign and Testing Project will be used to inform improvements in the —


  • NCVS screener and crime incident instruments;

  • mode of data collection for the NCVS;

  • flexibility to measure emerging crime types; and

  • ability to capture indicators of safety and security and perceptions of police that go beyond experiences with victimization.


Current Request for Cognitive Testing


In the current request, we are asking for clearance to conduct cognitive interviewing to support this overall redesign effort. The cognitive testing will be used to develop and test the revised instrument and will focus on two main areas, which are discussed in detail below: (1) the crime screener and (2) new ‘noncrime’ questions that will be asked of all respondents (‘ask all’) and focus on perceptions of police. The cognitive testing will use two approaches: (1) in-person interviewing and (2) online data collection. In-person interviewing will be used to obtain a more nuanced understanding of how respondents are conceptualizing and answering each survey question. Online data collection will be implemented to obtain a relatively large number of responses identifying common crime characteristics (e.g., items stolen, activities typical of stalking events) that could inform the updating of victimization screening cues. The majority of cognitive testing will take place in 2016 and will be used to inform the usability testing, pretest, and field test of the full instrument. The cognitive interviews will address the structure and language of the current versions of the instruments, and specifically will address the following –


  • The wording of the new and revised screening questions, including the items on identity theft, stalking, and fraud;

  • Respondent reactions to changes in question sequence and organization (interleafing approach versus non-interleafing/blocked approach);

  • Respondent reactions to a long, cue-rich version of the crime screener;

  • Respondent reactions to a short, cue-sparse version of the crime screener;

  • A new method of bounding that asks respondents to generate personal landmarks;

  • Effective cuing for different crime types; and

  • Respondent reactions to new ‘noncrime/ask-all’ items related to perceptions of police.


Cognitive Testing of the Crime Screener


One component of cognitive testing is examining new approaches for the screening items, including coverage of a broader range of crimes (such as fraud, stalking, and identity theft), and different organizational approaches to the questions. This is expected to be an iterative cognitive testing process as these new items and approaches are revised. Data from the interviews will be analyzed after each of three rounds to identify problematic questions and assess how well each approach performs. Those questions will be revised and tested in the subsequent round.


A major focus of the redesign is on improving the screening questions, which ask respondents to report whether they experienced various types of crime victimizations during the last six months. The screener, first implemented in 1992, incorporated a wide range of verbal cues and examples in order to prompt fuller recall of victimizations. In addition, it adopted an organization of the questions known as the “blocked” format; in this format, all of the screening items are administered prior to any of the associated follow-up questions that gather more detail about each incident. Over time, evidence has accumulated that the approach taken in 1992 may not be working as well as intended. For example, it is apparent from time stamp data and from direct observation of NCVS interviews that interviewers often go through the examples in the screening questions very quickly or skip them entirely, sometimes at the insistence of the respondents. Further, the blocked organization of the screening items may be less effective in a longitudinal setting (the NCVS interviews respondents up to seven times) than in a cross-sectional context, since respondents may discover the contingency between answers to the screening items and the administration of a large number of follow-up questions. Intermixing at least some of the follow-up questions with the initial screening items (an approach called “interleafing”) may offer advantages over the blocked approach—producing a more conversational flow to the questions and improving the routing to later items. Because of the expanded range of crimes to be covered in the new screener, several different sets of follow-up items will be needed (the same follow-up questions will not work for assaults and fraud), so identifying the proper routing will be more complicated.


This component of the cognitive testing will try out four new versions of the screening questions (see Attachment A). These versions differ in several ways from the current NCVS screener (see Attachment B for comparison). First, they cover new types of crime, specifically identity theft, fraud, and stalking. Second, they incorporate wording changes to update the language of some of the items (e.g., the item on theft now includes “cell phones” as an example of something that might have been stolen). Third, some versions will use an interleafed organization of the screening items. Fourth, some versions use a new method for bounding that asks respondents to generate personal landmarks, and respondents will be cognitively probed to understand whether the landmarks aided in their recall of victimization incidents. Finally, two versions use a shorter cueing approach that is being developed as a possibility for subsequent waves of NCVS, to be tested quantitatively in the field test.


Cognitive Testing of the New Noncrime (Ask-all) Questions – Perceptions of Police


Another key component of the cognitive testing will be assessing a series of questions pertaining to citizens’ perceptions of safety, disorder, police legitimacy, and satisfaction with police; we refer to these as ‘noncrime’ questions. Recent events in places such as Ferguson, MO, and Baltimore, MD, have demonstrated the need for data on residents’ perceptions of police and for understanding the relationship between experiencing victimization, reporting crime to police, and perceived police legitimacy. In addition, the President’s Task Force on 21st Century Policing recommended developing questions to be added to the NCVS on citizen satisfaction with police.1 Questions on satisfaction with police and police legitimacy will be asked of the entire NCVS sample, not just those who experienced a victimization. Questions on fear, perceptions of community disorder, and feelings of safety will also be added, but those questions will be included for testing under a separate clearance request. These questions are intended to increase the relevance of the survey for the majority of respondents who never experience a victimization. Additionally, because the items are answered by all respondents, the estimates are expected to have stronger precision at the subnational level compared to victimization rates.


The current cognitive testing request will assess respondent reactions to these new items (see Attachment D for the police ask-all items to be tested). Cognitive interviewing will assess comprehension of the questions and concepts, difficulty with answering, and ease of administration for interviewers. To ensure the length of the cognitive interview remains within 60 minutes, the perceptions-of-police items will only be tested in the two shorter-cue versions of the instrument, not in the two longer-cue versions.


Cognitive Probing


Because the questionnaire designs are in the early stages, all of these assessments will be informal and qualitative. Cognitive interviewing will be used to identify potential problems with each version of the questions. Trained cognitive interviewers will probe participants for additional information about the response process. (See Attachment A for the specific probes to be used in testing the screening items.) Some probes will seek to determine whether the participant understands specific terms and concepts, such as “stalking” or “vandalism.” Other probes will be used to determine whether the screening items accurately captured what happened to the participant and whether the participant understood the questions easily and as intended. The cognitive interviewers will also probe respondents about their reaction to the different organizations of the screening questions (e.g., “What did you think of the overall flow of the questionnaire?”) and will assess how well the respondents processed the examples (“Do you recall any of the examples of vehicle parts that I gave you?”). Apart from scripted probes, the interviewers will ask unscripted probes if the participant shows signs of difficulty, confusion, or frustration (e.g., “You seem to be having trouble with this question. Can you tell me what the problem might be?”).


Once viable versions of the new screening questions and noncrime questions have been developed, they will be compared in field experiments in both interviewer-administered and self-administered versions. Current plans for the field test call for comparing self- and interviewer-administration of the questions, shorter versus longer versions of the crime screening questions, visual (web) versus verbal presentation of the examples, different organizations of the screening questions, and placement of the noncrime questions. However, the current request covers only the cognitive testing for the crime screening questions and the noncrime questions on perceptions of police. Additional cognitive testing will be requested under a separate clearance and will cover crime incident questions and noncrime questions on community safety and security, as well as any additional testing needed on the crime screening questions. The field test will be the subject of a separate clearance request.


Data Collection


In-person Interviewing


The target population for the in-person cognitive interviewing is persons age 18 or older living in the Washington, D.C. area and Cleveland, Ohio. Conducting interviews at multiple locations facilitates recruitment of diverse participants from different geographic regions, where experiences and terminology may vary. In addition, these locations allow Westat to reduce costs by taking advantage of local interviewers.


For the core NCVS, most respondents do not report any victimizations. As a result, the proposed cognitive interviews will be conducted primarily with respondents who have experienced at least one type of criminal victimization in the prior 12 months. The questions will also use a twelve-month reference period to increase the number of respondents who have victimizations to report. The only other eligibility criterion is that the person speak English. All in-person interviews will take place at the Westat offices or the professional interviewing facility in Cleveland, OH. Interviews will last approximately 1 hour. Participants will receive $40 to encourage participation and thank them for their time and effort. Across the multiple iterations of cognitive testing, in-person interviews will be conducted with up to 90 respondents. Prior cognitive testing experience and research suggest that $40 is a justifiable incentive amount, as it is effective in attracting a wide diversity of respondents.


Recruiters will advertise the study to solicit participation, using internal recruiting databases, as well as newspaper ads, fliers, and Craigslist ads. Recruiters will adapt recruiting strategies as needed to ensure adequate participation. Interested participants who contact study recruiters will be asked a series of screening questions to determine their eligibility. (See Attachment E for the recruitment screening questionnaire and the Craigslist ad.) Persons selected to participate will be contacted by the recruiters and scheduled for their interview session. Individuals will be selected to the extent possible to achieve diversity by age, gender, educational attainment, race, and ethnicity across the interviews. Respondents will be recruited such that most will have been a victim of at least one type of crime in the past 12 months.


All cognitive interviews will be audio-recorded with the participant’s consent. The audio recordings will only be accessible to project staff directly working on the project and no names or other personally identifying information (other than the participant’s voice itself) will be included on the audio recordings or transcripts.


Online Data Collection


In addition to in-person cognitive testing, web-based data collection will be used to gather feedback from respondents on common and less-common cueing examples for each type of crime (see Attachment F). This feedback will then be used to modify existing cueing examples or add new ones to the crime screening questions.


The online survey will identify approximately 400 respondents with targeted demographic characteristics, including U.S. residency, English-speaking ability, and a mix of sex, Hispanic origin and race, and educational attainment. The target of up to 400 respondents allows for some variation among respondents while taking into consideration affordability and schedule constraints. Westat has investigated and pilot tested a number of online panels and recommends using Survey Sampling International’s (SSI) Survey Spot Panel, a national non-probability panel. We chose the Survey Spot Panel because it cost roughly one-fifth as much as the GfK Knowledge Panel and because, relative to the other non-probability panels that provided bids for this work, the researchers (especially Roger Tourangeau) have had extensive positive experience with the Survey Spot Panel. Given the qualitative nature of the data to be collected, a probability panel did not seem necessary.


Westat, working through SSI, will recruit approximately 400 volunteers total from the Survey Spot Panel. The panel consists of about 11 million adult members (ages 18 and older), including both English and Spanish speakers; SSI will recruit approximately 300 volunteers from this group. In addition, SSI will ask its adult members with children in the target age range (ages 12 to 17) to provide consent for their children to complete the survey; approximately 100 volunteers will be recruited from this group. SSI will e-mail the invitation to participate in the survey (which uses standard text) to panelists who are United States residents, English speakers, and at least 12 years of age. Westat will not provide incentives to participants. Westat will not have access to any information about the respondents who are invited (i.e., we will not receive any information about nonrespondents or any PII). The Survey Spot Panel provides for efficient data collection with panelists who can supply the information needed to help develop improved cues for the revised NCVS instruments.


Language


The cognitive interviews and online data collection will be conducted in English.


Burden Hours for Cognitive Testing


The burden for this task consists of participants being screened for and subsequently participating in either online data collection or in-person cognitive interviews. The online data collection will be completed in two rounds to assess responses to the cueing examples for each type of crime. The in-person cognitive testing will use an iterative process. Data from the interviews will be analyzed after each of three rounds to identify problematic questions and to assess whether the interleafing approach works well. Those questions will be revised and tested in the subsequent round. Findings from the first and second rounds of online testing will also be incorporated into the second and third rounds of cognitive interviewing as needed. The burden associated with these activities is presented in the following table.



Table 1. Burden Associated with Planned NCVS-R Cognitive Testing Activities


Data Collection Type                          | Round 1 | Round 2 | Round 3 | Total # of Respondents | Average Administration Time (minutes) | Burden (hours)
----------------------------------------------|---------|---------|---------|------------------------|---------------------------------------|---------------
Online Data Collection (cueing examples only) |     200 |     200 |      -- |                    400 |                                    10 |             70
In-person Cognitive Interviewing              |      30 |      30 |      30 |                     90 |                                    60 |             90
Total                                         |     230 |     230 |      30 |                    490 |                                    -- |            160

Note: The Round 1, Round 2, and Round 3 columns show the number of respondents per round.
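
As a check on the figures above, the burden totals in Table 1 follow directly from the respondent counts and average administration times: online data collection, 400 respondents × 10 minutes = 4,000 minutes, or approximately 67 hours (reported as 70 in the table); in-person cognitive interviewing, 90 respondents × 60 minutes = 5,400 minutes, or 90 hours; combined, approximately 160 burden hours.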


Cost to the Federal Government


The total cost of conducting the cognitive interviews, including incentives for in-person interviews, will be approximately $90,000 under the cooperative agreement with Westat (Award 2013-MU-CX-K054) for the National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project.




Data Analysis


Cognitive interviewers will summarize the findings from each cognitive interview. Interviewers will prepare summary findings on each completed interview based on the completed questionnaire, notes taken during the interview, and associated audio recordings. The summaries will be analyzed using qualitative software to help identify common themes organized by overall questionnaire issues, individual questionnaire items and sections, and participants’ overall reactions to the questionnaire.


The cognitive interviewing analysis will assess and identify problems, such as comprehension issues, recall problems, difficulties understanding the task, and telescoping. In addition, it will examine whether administration of the questions (for example, interleafing some of the follow-up questions between the screening items) works smoothly. These issues will be assessed qualitatively, based on the interviewers’ assessments of their own experience.


The online data collection will be analyzed by developing a set of codes for each item and classifying each response into the coding scheme. Multiple coders will be trained and used to ensure reliability of the coding structure.
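
As one concrete illustration of the kind of reliability check described above (the memo does not prescribe a specific measure or software), the sketch below assumes two trained coders each assign a single code to every open-ended online response and computes percent agreement and Cohen's kappa between them. The code categories and assignments shown are hypothetical.

```python
# Illustrative sketch only: the memo does not specify how coder reliability
# will be quantified. Assumes two coders assign one code per online response.
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Share of responses the two coders assigned to the same code."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders (Cohen's kappa)."""
    n = len(codes_a)
    p_observed = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Agreement expected if each coder assigned codes independently at the
    # rates actually observed in their own codings.
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                     for c in set(codes_a) | set(codes_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical code assignments for ten responses about items stolen.
coder_1 = ["electronics", "cash", "electronics", "vehicle part", "cash",
           "electronics", "jewelry", "cash", "vehicle part", "electronics"]
coder_2 = ["electronics", "cash", "jewelry", "vehicle part", "cash",
           "electronics", "jewelry", "cash", "electronics", "electronics"]

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(coder_1, coder_2):.2f}")
```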


Upon completion of all cognitive testing, a draft cognitive interviewing report will be delivered to BJS that will include recommendations for final revisions to the survey for the field test. These recommendations will be based upon the findings of both the online data collection and in-person cognitive testing, and will provide detailed information on the cognitive testing methodology, basic characteristics of the respondents, average time needed to complete the survey instruments, and any issues with question comprehension. The report will also document changes made to the initial draft NCVS survey instruments that are being recommended for use in the usability testing, pretest, and field test.


Protection of Human Subjects


There is some risk of emotional distress for the respondents given the sensitive nature of the topic, particularly since the questions are of a personal nature; however, appropriate safeguards are in place and the planned cognitive testing will be reviewed by Westat’s Institutional Review Board (IRB), which has Federal-wide assurance.


Informed Consent


In-person Cognitive Interviewing


Adult participants will review the informed consent document with the aid of the interviewer and will be asked to sign the form before participating. Should an individual refuse informed consent, s/he will be excused from participation and thanked for her/his time. At the time of consent/assent, the interviewer will also ask permission to audio record the interview. (See Attachment G for the informed consent documents).


Online Data Collection


After clicking on the link displayed in the recruitment e-mail, the 400 panelists will be brought to the first page of the survey, which will explain the task. Respondents who wish to proceed will indicate their consent and then continue into the survey.


Use of Information Technology to Reduce Burden


The cognitive testing study will utilize technology to facilitate recruitment and the scheduling process while also reducing participant burden and controlling study costs. Recruitment efforts will use email communications when possible, because participants increasingly prefer to communicate via email so they can respond when it is convenient. Using email for recruitment and scheduling can help to reduce participant burden and save time and money that would otherwise be spent conducting telephone calls, leaving voice messages, and making call-backs.


Data Confidentiality and Security


BJS’s pledge of confidentiality is based on its governing statutes, Title 42 U.S.C. Sections 3735 and 3789g, which establish the allowable uses of data collected by BJS. Under these sections, data collected by BJS shall be used only for statistical or research purposes and shall be gathered in a manner that precludes their use for law enforcement or any purpose relating to a particular individual other than statistical or research purposes (Section 3735). BJS staff, other federal employees, and Westat staff (the data collection agent) shall not use or reveal any research or statistical information identifiable to any specific private person for any purpose other than the research and statistical purposes for which it was obtained. Pursuant to 42 U.S.C. Sec. 3789g, BJS will not publish any data identifiable to any specific private person (including respondents and decedents). To protect the identity of the respondents, no identifying information will be kept on the final data file.


Participation in this cognitive testing study is voluntary. Personally identifiable information (PII), including names and contact information (phone number and/or email address), will be collected by recruiting facilities for the purpose of scheduling eligible participants for interviews. These data will be securely stored in password protected files to which only project staff will have access, and will be destroyed after the study is finished. Names provided by participants on consent and incentive receipt forms will be stored in locked cabinets, separate from data. Participant PII will never be associated with data collected during the interview. PII for individuals not selected for interviews will be destroyed immediately. PII for selected participants will be destroyed per contract requirements.


1 President’s Task Force on 21st Century Policing. 2015. Final Report of the President’s Task Force on 21st Century Policing. Washington, DC: Office of Community Oriented Policing Services.
