U.S. Department of Justice


Office of Justice Programs


Bureau of Justice Statistics

Washington, D.C. 20531


MEMORANDUM



To: Robert Sivinski

Office of Statistical and Science Policy

Office of Management and Budget


Through: Lynn Murray

Clearance Officer

Justice Management Division


Jeffrey H. Anderson

Director

Bureau of Justice Statistics


From: Jennifer Truman and Lynn Langton

Bureau of Justice Statistics


Date: July 17, 2018


Re: BJS Request for OMB Generic Clearance for Usability Testing under the National Crime Victimization Survey (NCVS) Redesign Generic Clearance, OMB Number 1121-0325




The Bureau of Justice Statistics (BJS) requests clearance for usability testing tasks under generic clearance agreement OMB Number 1121-0325 (expires December 31, 2018), for activities related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program. BJS, in consultation with Westat under a cooperative agreement (Award 2013-MU-CX-K054, National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project), is working to redesign the NCVS survey instrument and its mode of administration. The NCVS was last redesigned more than two decades ago, in 1992. Much has changed in the interim, both in the public's acceptance of surveys and in the nature of crime. The primary purpose of the NCVS Instrument Redesign and Testing Project is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident report (NCVS-2) instruments in support of BJS's efforts to increase the efficiency, reliability, and utility of the NCVS.


Prior generic clearances for this project allowed for cognitive testing of different sections of the NCVS instrument with adults and youth. The results from these cognitive interviews have contributed to a completely revised NCVS instrument for web-based self-administration. At this time, clearance is requested for usability testing, which will be conducted on the revised web-based version of the core NCVS instrument. The usability testing will identify issues related to the navigation and flow of the full NCVS instrument and will be conducted with 52 victims of crime over three waves of testing, including 40 adults and 12 youth. This usability testing will require up to 78 burden hours.


Following the usability testing, we will request a separate full clearance for a pre-test of the field test procedures and for a large, two-wave field test. The field test will involve administering the survey to a representative sample of persons age 12 or older, testing aspects of the design such as mode, victimization screener approaches, response rates, and administration times. We anticipate requesting clearance for the field test in 2019.


Description and Purpose of Overall Project


This request is for generic clearance to conduct usability testing of the redesigned web-based NCVS instrument. The usability testing will inform the design and the final instruments for the larger field test. Before describing the proposed methods, the goals and research questions for the larger NCVS Instrument Redesign and Testing Project are described.


The NCVS is based on research conducted by the Department of Justice and the Census Bureau in the 1970s (summarized in Lehnen and Skogan, 1981; Skogan, 1990; Skogan and Lehnen, 1985). There was a major redesign implemented in 1992, motivated in part by a National Academy of Sciences (NAS) review (Penick and Owens, 1976). A more recent review by the NAS (Groves and Cork, 2008) provided similar motivation for the current BJS redesign effort.


Since 2008, BJS has initiated a number of research projects to assess and improve upon core NCVS methodology, including redesigning the sample plan, comparing alternative modes of interviewing, testing methods for reducing non-response bias, experimenting with various reference period lengths, testing the effectiveness of new victimization screening questions, and exploring the feasibility of producing sub-national estimates of victimization.


The current NCVS Instrument Redesign and Testing Project is part of BJS’s work (with Westat under a cooperative agreement) to develop a new design for the NCVS. The overarching objective for this project is to redesign and test the NCVS roster control card, crime screener (NCVS-1), and crime incident report (NCVS-2). Ultimately, BJS aims to evaluate and modernize the organization and content of the NCVS; improve the efficiency of the instruments and the current core-supplement design; and develop a procedure for introducing routine improvements to the survey in order to capture emerging crime types and time-relevant topics.


A first step in the NCVS Instrument Redesign and Testing Project was a comprehensive assessment of the instruments to determine which survey items are being utilized and how by BJS and NCVS data users; which survey items are problematic in their language and placement; and where there are gaps in the content of the instrument. The initial assessment, which was completed within the first year of the project, provided an understanding of the substantive and methodological issues with the instrument and helped to identify areas where improvements to the content would enhance current knowledge of victimization and its correlates.


A major focus of the NCVS Instrument Redesign and Testing Project has been on improving the screening questions, which ask respondents to report whether they experienced various types of crime victimizations during the last six months. The screener, first implemented in 1992, incorporates a wide range of verbal cues and examples to prompt recall of victimizations. In addition, it organizes the questions in a “blocked” format; in this format, all of the screening items are administered prior to any of the associated follow-up questions that gather more detail about each incident.


Over time, evidence has accumulated that the approach taken in 1992 may not be working as well as intended. For example, it is apparent from time stamp data and from direct observation of NCVS interviews that interviewers often go through the examples in the screening questions very quickly or skip them entirely, sometimes at the insistence of the respondents. Further, the “blocked” organization of the screening items may be less effective in a longitudinal setting (the NCVS interviews respondents up to seven times) than in a cross-sectional context, since respondents may learn the connection between answers to the screening items and the administration of a large number of follow-up questions. Intermixing at least some of the follow-up questions with the initial screening items (an approach called “interleafing”) may offer advantages over the blocked approach—producing a more conversational flow to the questions and improving the routing to later items.


The NCVS Instrument Redesign and Testing Project has also focused on improving the measurement of highly sensitive crimes such as rape and sexual assault and intimate partner violence; increasing the relevance and utility of the crime incident report (CIR); and building in a series of questions on perceptions of police and community safety to be asked of all respondents ('ask all' or 'noncrime' items). The anticipated changes to the types of crimes measured by the NCVS, and to the measurement process itself, may require other changes to the survey methodology to ensure that the information collected is accurate and reliable. The NCVS Instrument Redesign and Testing Project has also examined the feasibility of using self-administered approaches to collect data comparable to those obtained through interviewer administration.


To date, the NCVS Instrument Redesign and Testing Project has completed four rounds of cognitive testing. This cognitive testing focused on the redesigned screener, the redesigned crime incident report (CIR), and the series of questions on perceptions of police and community safety. In August, November, and December 2016, 50 cognitive interviews with adults were conducted in two waves of testing. These interviews addressed the structure and language of multiple versions of the NCVS screening instrument, including long cue-rich, short cue-sparse, interleaved, and non-interleaved (“blocked”) versions. These interviews also addressed respondent reactions to the new 'ask all' or 'noncrime' items related to perceptions of police.


In April and August 2017, 84 cognitive interviews with adults and youth were conducted in two waves of testing. These interviews addressed the changes to the CIR. These cognitive tests were conducted in both interviewer- and self-administered (paper-and-pencil) modes. Since this was the first time the redesigned instrument was tested with youth, there was also a focus on youth comprehension of the screening items. The interviews addressed the redesigned parts of the CIR, including the location series, presence items, “what happened” series, help-seeking items, self-protection items, and victim-offender relationship items. They also addressed respondent reactions to the new 'ask all' or 'noncrime' items related to community safety and perceptions of police (youth only).


Current Request for Usability Testing


This request covers the first formal usability test of the redesigned self-administered, web-based NCVS instrument. We are requesting clearance to conduct three waves of usability testing to support the redesign effort.


This research will examine how survey questions, instructions, and supplemental information are presented in computer instruments, especially over the Internet, and how the presentation affects users' navigation and understanding of these instruments. The instrument will primarily be tested on a laptop computer, but ten interviews will also focus on the visual layout for smartphones and tablets. Participants will be observed as they complete the questionnaire and then asked for feedback on design features: how the questions are positioned on the screen; how instructions are displayed; the amount of information presented on one screen; the use of color and other navigational tools; and so on. Research has shown that these features can have a significant effect on the time required to complete the survey questions, on the accuracy of question-reading and data entry, and on the full use of the resources available to help users complete their tasks (Couper, M. 1999. "The Application of Cognitive Science to Computer Assisted Interviewing." In Sirken, M., Hermann, D., Schechter, S., Schwarz, N., Tanur, J., and Tourangeau, R. (eds.), Cognition and Survey Research. New York: Wiley, 277–300).


Usability testing of the programmed redesigned NCVS instrument will be conducted with 40 adults and 12 youth. Testing sessions will last approximately 90 minutes and include completion of the entire questionnaire as applicable, as well as concurrent usability probes, spontaneous feedback from the respondent, and debriefing questions. In addition, a few modules that had not been included in prior cognitive testing efforts will be concurrently probed for overall comprehension, including household/person characteristics, injuries, socio-emotional consequences, and economic consequences. These items are present in the current NCVS but have been modified for the redesigned instrument. The probes are designed to make sure respondents understand the modified versions of the items. (See Attachments 1, 2, and 3 for the items to be tested, the usability testing protocol, and the usability issue log.) Please note that the attached instrument is the paper version and does not fully reflect the programmed version used for usability testing. As noted, the instrument will be administered in a web-based self-administered mode with all of the skip patterns and programming built in and optimized for all platforms. (See Attachment 11 for screenshots of the instrument.)


Testing sessions will be conducted in three waves, with revisions to the instrument and protocol as needed between the waves. In the first wave, 15 adults and 6 youth who have experienced different types of victimization will navigate the instrument on a Westat-supplied laptop and provide feedback on areas where they have difficulty. Interviewers will focus their scripted and unscripted probes on usability and design, but if cognitive issues arise, they will also probe to understand those difficulties. We will edit the programmed instrument based on wave 1 feedback and incorporate these changes for wave 2, in which we will retest the revised instrument on the Westat laptop with an additional 15 adults and 6 youth. Finally, in wave 3, we will test the instrument with 10 adults using their own tablets or smartphones.


To prepare for the Field Test experiments on improving the flow and utility of the victimization screener, we will randomly assign usability testing respondents to one of two versions of the victimization screener: long cues with interleafing of incident details, and short cues with interleafing. This allocation will allow us to test how the items perform in the version intended for the wave 1 interview (long cues) and for waves 2 and beyond (short cues).
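
As an illustration only, a minimal sketch of how such a balanced random assignment might be implemented (the respondent IDs, seed, and labels below are hypothetical, not part of the study protocol):

    import random

    # Hypothetical respondent IDs; the actual study roster is not public.
    respondents = [f"R{i:02d}" for i in range(1, 22)]  # e.g., wave 1: 15 adults + 6 youth

    # The two screener versions named above.
    versions = ["long cues with interleafing", "short cues with interleafing"]

    random.seed(2018)          # fixed seed so the assignment is reproducible
    random.shuffle(respondents)

    # Alternate versions down the shuffled list so the split stays balanced.
    assignment = {r: versions[i % 2] for i, r in enumerate(respondents)}

    for r, v in sorted(assignment.items()):
        print(r, "->", v)

Alternating down a shuffled list keeps the two groups within one respondent of equal size, which independent coin flips per respondent would not guarantee.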


Respondents will be asked to “think aloud” at key questions as they complete the web survey and will then be asked a series of debriefing questions at the end to assess overall reactions to the instrument and to gather feedback on the look and feel of the survey. As respondents work through the survey, trained interviewers will ask concurrent probes to follow up on any issues or problems encountered, including:

  • Navigation through the survey

  • Generation and understanding of error messages

  • Reasons the user required help

  • Changing answers

  • Finding necessary information


Debriefing topics will include:

  • Overall reactions to the look, feel, and usability of the web survey

  • Additional undiscussed issues or problems


We have selected four areas in which to conduct the usability interviews: Rockville, Los Angeles, Chicago, and St. Louis. Two of the areas, Rockville and Los Angeles, have been selected primarily because Westat can take advantage of local interviewers; Los Angeles also has an elevated crime rate compared with national rates. St. Louis and Chicago were chosen because they have relatively high crime rates according to the FBI Uniform Crime Reports. We will conduct interviews remotely in these two areas, with the respondent sitting at a computer at a focus group facility and using WebEx to communicate with the interviewer. The interviewer will display the web instrument on the screen and will pass control to the respondent to navigate the instrument. The interviewer will be able to watch the respondent's mouse movements and clicks as they answer the survey questions, and will also be able to see the respondent's nonverbal reactions to the questions. Westat used a similar approach in the first round of CIR cognitive testing in 2017. This approach allows us to minimize travel costs while providing broader geographic coverage, access to more varied types of crimes, and a private, controlled setting in which to conduct the interview.


All youth interviews will be conducted in person in Los Angeles and Rockville so that we can manage the parental consent and youth assent procedures.


Recruitment will be based on victimization in the past six months. Adult participants will receive $60 ($40 per hour) as compensation for their time and to offset the costs of participation, such as transportation, parking, and childcare. Youth will receive $40 (about $25 per hour).


Recruiters will use the same strategies as in the cognitive testing effort, advertising the study through internal recruiting databases, newspaper ads, fliers, and Craigslist ads. Interested participants who contact study recruiters will be asked a series of screening questions to determine their eligibility. Recruited adults will have been the victim of at least one crime in the past 6 months. (See Attachments 4 and 5 for the usability testing recruitment screening questionnaire and Craigslist ad.) Individuals will be selected, to the extent possible, to achieve diversity by age, gender, educational attainment, race, and ethnicity across the interviews.


In-person usability testing sessions will be audio-recorded, and remote WebEx sessions will be video-recorded, with the participant's consent. The recordings will be accessible only to staff working directly on the project, and no names or other personally identifying information will be included in the interviewer summaries or the recordings. The recordings will be destroyed at the end of the project.


Language


All interviews will be conducted in English.


Burden Hours for Usability Testing


The burden for this task consists of participants being screened for and subsequently participating in in-person or remote usability interviews. Usability interviews will include the NCVS screener and all relevant modules. The burden associated with these activities is presented in the following table.



Table 1. Burden Associated with Planned NCVS-R Usability Testing Activities


Data Collection Type            Wave 1   Wave 2   Wave 3   Total # of    Average Administration   Burden
                                                           Respondents   Time (minutes)           (hours)
Usability testing with adults     15       15       10         40                 90                 60
Usability testing with youth       6        6       --         12                 90                 18
TOTAL                             21       21       10         52                 90                 78


Justification of Respondent Burden


Every effort has been made to minimize respondent burden and government cost in the conduct of these proposed tests. The size of the sample needed to support analysis of the usability testing interviews was driven by the goals of the testing. The 52 usability testing interviews will be split into three waves and divided among victims of different types of crimes, including vehicle theft (adults only), theft of other items (adults and youth), assault (adults and youth), break-ins (adults only), and vandalism (adults only), to ensure that all items in the CIR are tested.


The prior rounds of cognitive testing approved by OMB took approximately 90 minutes to administer. Because the full survey will be administered in usability testing, we anticipate that approximately 90 minutes will also be required for these interviews.
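
As a check on Table 1, the burden totals follow directly from the wave counts and the 90-minute session length; a minimal calculation using only figures from Table 1:

    # Burden check using the respondent counts from Table 1.
    waves_adults = [15, 15, 10]   # adults in waves 1-3
    waves_youth = [6, 6, 0]       # youth in waves 1-3 (no youth in wave 3)
    minutes_per_interview = 90

    adult_hours = sum(waves_adults) * minutes_per_interview / 60   # 40 * 1.5 = 60.0
    youth_hours = sum(waves_youth) * minutes_per_interview / 60    # 12 * 1.5 = 18.0

    print(adult_hours, youth_hours, adult_hours + youth_hours)     # 60.0 18.0 78.0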


Cost to the Federal Government


The total cost of conducting the usability testing, including incentives for in-person interviews, will be approximately $95,000 under the cooperative agreement with Westat (Award 2013-MU-CX-K054) for the National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project.


Data Analysis


Usability testing interviewers will summarize the findings from each completed interview, based on the completed questionnaire, the notes taken in the issue log during the interview, and the associated audio recordings. The summaries will be analyzed using qualitative data analysis software to help identify common errors, points of confusion, and participants' overall reactions to the questionnaire.
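
The memo does not specify the analysis software; purely as an illustration of the kind of cross-interview tally this analysis involves, a minimal sketch (all records, screen names, and issue labels below are hypothetical):

    from collections import Counter

    # Hypothetical issue-log records; the real logs and code labels are internal.
    issue_log = [
        {"case": "R01", "screen": "S3", "issue": "navigation"},
        {"case": "R01", "screen": "S7", "issue": "error message"},
        {"case": "R02", "screen": "S3", "issue": "navigation"},
        {"case": "R03", "screen": "S5", "issue": "comprehension"},
    ]

    # Tally how often each issue type occurs, and on which screens.
    by_issue = Counter(rec["issue"] for rec in issue_log)
    by_screen = Counter((rec["screen"], rec["issue"]) for rec in issue_log)

    print(by_issue.most_common())   # which issue types dominate overall
    print(by_screen.most_common())  # which screens may need revision between waves

Sorting screen-level counts this way flags the specific screens that generate repeated problems, which is the input needed for the between-wave instrument revisions described above.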


The usability testing analysis will assess and identify problems, such as trouble with navigation, issues that generate error messages, items that elicit requests for help, and comprehension issues.


Upon completion of all usability testing, a draft usability testing report will be prepared that will include recommendations for final revisions to the programmed instrument for the pretest and field test. The report will provide detailed information on the usability testing methodology, the basic characteristics of the respondents, and any issues with using the programmed questionnaire. It will also document changes made to the programmed questionnaire between waves 1 and 2 of testing, as well as final recommendations for use in the pretest and field test.


Protection of Human Subjects


There is some risk of emotional distress for respondents given the sensitive and personal nature of the topics; however, appropriate safeguards are in place. The planned usability testing has been reviewed and approved by Westat's Institutional Review Board (IRB), which operates under a Federalwide Assurance.


Westat interviewers will be trained to recognize when respondents are becoming emotionally upset and how to respond when they do. Any respondent who appears to be in distress will be asked whether they wish to stop the interview. All respondents will be provided with a list of hotline numbers before leaving the interview session, regardless of whether the interviewer identified any distress. (See Attachments 9 and 10 for the adult and youth resources to be provided to all respondents.)


Informed Consent


Adult participants will review the informed consent document with the aid of the interviewer and will be asked to sign the form before participating. Should an individual refuse informed consent, s/he will be excused from participation and thanked for her/his time. At the time of consent, the interviewer will also ask permission to audio record the interview. (See Attachment 6 for the informed consent document.)


To interview youth participants we must receive both parent/guardian permission and youth assent. Youth who come to a concurrent interview with a parent will be interviewed in a separate, private room to ensure confidentiality. Written parental/guardian consent will be obtained before the interview begins. Because Westat is covered by BJS's confidentiality provisions under 34 U.S.C. §§ 10134 and 10231, we are unable to report any instances of abuse and must maintain strict confidentiality of the data. As such, we plan to collect verbal rather than written assent from youth; the interviewer, rather than the youth, will sign the assent form. We will also not retain the names of the children, so that the interview data cannot be linked to a child's identifying information. Should a parent or a child refuse informed consent, the child will be excused from participation and thanked for her/his time. In both the parental consent and youth assent processes, the interviewer will ask for permission to audio record the interview. (See Attachments 7 and 8 for the parental permission and youth assent documents.)



Use of Information Technology to Reduce Burden


The usability testing study will utilize technology to facilitate recruitment and the scheduling process while also reducing participant burden and controlling study costs. Recruitment efforts will use email communications when possible, because participants increasingly prefer to communicate via email so they can respond when it is convenient. Using email for recruitment and scheduling can help to reduce participant burden and save time and money that would otherwise be spent conducting telephone calls, leaving voice messages, and making call-backs.


The usability testing will be conducted using the programmed version of the instrument. In waves 1 and 2, respondents will be provided with a laptop computer, whereas in wave 3, respondents will be asked to bring their own smartphone or tablet to use to answer the questions.


The study will also take advantage of technology to conduct some of the usability interviews remotely, allowing access to a broader geographic population without incurring the costs of travel. Respondents in Chicago and St. Louis will be asked to go to local focus group facilities and log onto an online streaming tool to interact with a live researcher who will conduct the interview remotely.


Data Confidentiality and Security


The Bureau of Justice Statistics (BJS) is authorized to conduct this data collection under 34 U.S.C. § 10132. BJS will protect and maintain the confidentiality of your personally identifiable information (PII) to the fullest extent under federal law. BJS, its employees, and its contractors (Westat staff) will only use the information you provide for statistical or research purposes pursuant to 34 U.S.C. § 10134, and will not disclose your information in identifiable form to anyone outside of the BJS project team without your consent. All PII collected under BJS’s authority is protected under the confidentiality provisions of 34 U.S.C. § 10231. Any person who violates these provisions may be punished by a fine up to $10,000, in addition to any other penalties imposed by law. Further, per the Cybersecurity Enhancement Act of 2015 (6 U.S.C. § 151), federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. For more information on how BJS and its contractors will use and protect your information, go to https://www.bjs.gov/content/pub/pdf/BJS_Data_Protection_Guidelines.pdf.


Participation in the usability testing studies is voluntary. Personally identifiable information (PII), including names and contact information (phone number and/or email address), will be collected by recruiting facilities for the purpose of scheduling eligible participants for interviews. These data, including the audio recordings, will be securely stored in password protected files to which only project staff will have access, and will be destroyed after the study is finished. Names provided by adult participants on consent and incentive receipt forms will be stored in locked cabinets, separate from data. Participant PII will never be associated with data collected during the interview. PII for individuals not selected for interviews will be destroyed immediately. PII for sample participants will be destroyed per contract requirements.


