Non-Substantive Change Request Justification


The National Intimate Partner and Sexual Violence Survey (NISVS)





(OMB no. 0920-0822 exp. date 2/29/2020)


Proposed Changes: Justification and Overview

March 14, 2019


Justification

This document serves as a change request for the currently approved National Intimate Partner and Sexual Violence Survey (OMB# 0920-0822, expiration date 02/29/2020) for the first phase of developmental activity, which includes cognitive testing of the new and revised survey instruments.


In late 2017, CDC learned that the survey’s response rates had declined substantially during the 2016/17 data collection compared to the 2015 data collection period. In response to the declining response rates, as well as concerns about the NISVS methodology raised by an OMB-required methodology workgroup, CDC funded a contract beginning in September 2018 to explore the feasibility and cost of implementing alternative methods for collecting NISVS data in a manner that would increase response rates and reduce nonresponse bias. The contract involves three phases:


  1. Cognitive testing of a revised computer-assisted telephone interview (CATI) instrument (shortened to reduce respondents’ burden but not altering the core intimate partner violence [IPV], sexual violence [SV], and stalking content of the survey), as well as cognitive testing of web and paper versions of the survey.

  2. Experimentation and feasibility testing to assess a number of alternative design features, including the sample frame (address-based sample [ABS], random digit dial [RDD], web panel), mode of response (telephone, web, paper), and incentive structures that help garner participation and reduce nonresponse.

  3. Pilot testing of a new design, procedures, and a final set of survey instruments for national survey administration based on results from feasibility tests.


Questionnaire Changes


Changes have been made to the survey submitted for OMB approval in February 2018. The length of the survey has been reduced. See Attachments A and B for a detailed description of items that have been removed, added, and modified.


To reduce the burden and duration of the survey, revisions were made to shorten the questionnaire. First, the selection of violence questions to drop was guided by statistical analyses of the 2016-2017 data, which showed the percentage of respondent endorsement (i.e., “yes” responses) for specific questions (i.e., violence experiences) and how removal of the infrequently endorsed questions affected the overall prevalence estimates. The purpose of this analysis was to take a parsimonious approach to measuring the constructs within NISVS. For example, if a 4-question measure of a form of violence produced a prevalence estimate statistically similar to that of a 6-question measure, the 4-question measure was deemed preferable.


Second, the NISVS program continually seeks to improve its measures and to provide new data on emerging and important health conditions and their relation to violence victimization. The health questions were reexamined with this purpose in mind. Health problems such as traumatic brain injury are known to be connected to intimate partner violence, but national level estimates have not been available. Further, little to no literature shows the relationship between prescription drug misuse and violence victimization; thus, our goal for adding these questions is to add more context to the understanding of this health epidemic.


It is anticipated that the survey revisions will have little impact on data users. However, during the revision process, careful attention was given to the items being removed from the survey to ensure that frequent data users (e.g., state Rape Prevention and Education grantees who have previously provided feedback on the survey) would still be able to obtain the data in which they have expressed specific interest. With the exception of one module that assessed normative behaviors related to SV and IPV, all of the items in which data users have specifically expressed interest remain in the survey. Regarding the normative behaviors section, previous analyses demonstrate very little variance in responses to these questions at the national level; thus, for the sake of shortening the survey, these questions were removed. However, the NISVS team is considering collecting data on these items every few years to ensure the field has updated data on this topic.




Proposed Cognitive Testing

  • Cognitive testing is proposed to ensure that the questions included in the full NISVS survey are understood across the different modes (phone, web, paper). First, because the surveys inherently involve different administration experiences (i.e., interviewer-administered vs. self-administered), it is essential to determine whether respondents understand the questions as intended and whether the response options are clear. Interviewers will therefore use a mix of concurrent and retrospective probing to gather feedback on both item comprehension and the usability of the instruments. Second, revisions have been made to the NISVS instrument. The instrument used in the 2016-2018 survey administration periods was cognitively tested by the previous contractor; we therefore propose that the current contractor cognitively test the newly revised survey. Questions that are unchanged will not be directly tested, but respondents will have the opportunity to provide feedback on all portions of the survey.


The contractor’s recommendations for revisions to any tested items will be grounded in the data. Using notes from debriefings following the cognitive tests, interviewer summaries from each interview, and the usability logs, the contractor will produce a final report that reflects key findings from the interviews and provides final recommendations for each mode of the instrument. The recommendations report will be organized to include the following sections: executive summary; goals and objectives of the research; recruitment methods and an overview of the participants; methods for conducting the cognitive interviews; key findings; recommendations of the final wording for the questions and reasons for the recommendations; and lessons learned and conclusions. Most of the newly added items have been administered in other surveys, and thus we expect minimal changes will be needed for those items. Success of the cognitive testing will be assessed by how people respond to the way questions are asked, how well they understand the survey instructions, and whether they understand the terms used throughout the survey.


  • Three surveys have been developed for testing data collection modes, including versions for 1) CATI, 2) paper, and 3) web-based administrations (Attachments C, D, and E, respectively).


  • A cognitive testing plan has been developed for testing the revised CATI questionnaire in addition to the web-based and paper versions of the survey.

    • The following attachments describe the cognitive testing plan in detail:

      • Attachment F – Cognitive testing plan (includes overview, participant recruitment plans, interviewer training, data analysis, communications, and timeline)

      • Attachment G – Recruitment advertisement

      • Attachment H – Recruiting screener (to ensure we obtain a diverse group of respondents with and without victimization experiences)

      • Attachment I – Informed consent form

      • Attachment J – CATI cognitive testing protocol

      • Attachment K – Paper instrument cognitive testing protocol

      • Attachment L – Web-based instrument cognitive testing protocol


A subsequent OMB change request will be submitted once the second phase (i.e., experimentation and feasibility testing) plans are finalized and approved by IRB (around May 2019). This developmental work will inform the establishment of a novel data collection approach to be tested during the pilot testing phase.


Finally, an OMB revision request will be submitted around November 2019 describing the data collection approach to be used during the pilot testing, tentatively scheduled to occur beginning in March 2020. The design to be implemented during this pilot testing phase will inform the NISVS full-scale national data collection that is expected to be implemented upon completion of the currently funded contract.



Currently approved burden and costs


In February 2018, OMB approved the NISVS data collection plans for the 2018 NISVS data collection. At the same time, 2,516 burden hours for developmental testing associated with new data collection procedures were approved.


The contractor will conduct a total of 120 cognitive interviews in April-June 2019. Interviews will be conducted in two rounds of 60 interviews each, including 20 per round to test the Web instrument (40 total), 20 per round to test the paper instrument (40 total), and 20 per round to test the CATI instrument (40 total). Note that the number of CATI interviews may change slightly (i.e., the number of Web/Paper interviews may decrease accordingly) if CDC determines that saturation in comments has been reached before the 40 interviews are complete. We anticipate each interview will take approximately one hour, and respondents will be provided $40 to help defray the costs of participating, such as transportation or child care.


The total burden for the cognitive testing phase of the study is estimated at 120 hours, derived from 120 respondents each completing a 60-minute cognitive test of the survey.
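The interview counts and burden figures above are simple arithmetic. As an illustrative sanity check (the variable names below are our own, not part of the request; the total incentive cost is derived from the stated figures, not stated in the document), they can be reproduced as:

```python
# Sanity check of the burden-hour arithmetic described above.
# Figures from the text: 2 rounds x 20 interviews per mode, 3 modes,
# ~60 minutes per interview, $40 incentive per respondent.

ROUNDS = 2
INTERVIEWS_PER_MODE_PER_ROUND = 20
MODES = ("paper", "web", "CATI")
MINUTES_PER_INTERVIEW = 60
INCENTIVE_PER_RESPONDENT = 40  # dollars

interviews_per_mode = ROUNDS * INTERVIEWS_PER_MODE_PER_ROUND   # 40 per mode
total_interviews = interviews_per_mode * len(MODES)            # 120 interviews
total_burden_hours = total_interviews * MINUTES_PER_INTERVIEW / 60  # 120.0 hours
total_incentives = total_interviews * INCENTIVE_PER_RESPONDENT      # derived cost

print(interviews_per_mode, total_interviews, total_burden_hours, total_incentives)
```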



Table 1. Estimated Burden Hours for 2019-2020 Data Collection

Category of Respondent    No. of Respondents    No. of Responses per Respondent    Average Burden per Response (in hours)    Total Burden (in hours)

Cognitive testing
  Paper Questionnaire     40                    1                                  60/60                                     40
  Web Questionnaire       40                    1                                  60/60                                     40
  Phone Questionnaire     40                    1                                  60/60                                     40

Total                     120                                                                                                120


