OMB: 0920-0822

Attachment A.

National Intimate Partner and Sexual Violence Survey (NISVS)

Program Changes and Adjustments



Changes to the survey

Detailed revisions are presented in Attachment B. Proposed changes to the survey include:

  • Minor edits to reflect a change in contractor and to update CATI instructions following the renumbering of survey questions.

  • Added an interviewer instruction to end the interview if the respondent does not provide their age.

  • For the education question, revised the parenthetical examples in the graduate education response option.

  • Revised sexual orientation response option from “Straight, that is, not gay” to “Straight, that is, not lesbian or gay.”

  • Dropped 9 health condition questions.

  • Added 3 health questions to address National Center for Injury Prevention and Control research priorities (i.e., suicide, prescription drug misuse).

  • Revised and combined 5 stalking tactic questions after consultation with federal partners and dropped 2 stalking tactic questions.

  • Shortened intimate partner relationship question throughout the survey to reduce length and repetitiveness.

  • Added 2 sexual violence questions to address workplace sexual harassment.

  • Dropped questions pertaining to perpetrator age throughout the survey.

  • Dropped questions on 12-month frequency of victimization throughout survey.

  • Combined or dropped rape and made-to-penetrate items to reduce burden.

  • Added 1 question on having filed a police report after a rape or made-to-penetrate incident.

  • Reduced intimate partner descriptor in the stem and questions throughout the survey to reduce repetitiveness.

  • Revised 2 psychological aggression questions to improve them and reduce burden.

  • Added introduction and 5 reproductive coercion questions (previously asked in 2010-2015) due to high interest in the field.

  • Revised intimate partner violence (IPV) behavior items to improve clarity and reduce burden.

  • Dropped 1 IPV behavior question due to low use in analyses.

  • Replaced head injury question with concussion question to address Center priorities.

  • Shortened 2 IPV Impact items.

  • Revised IPV Impact questions about law enforcement to ask about having filed a police report.

  • Dropped 3 IPV Impact questions in order to add other priority questions.

  • Dropped all 11 questions from (formerly) Section J (Normative Behaviors) to reduce survey length.


Additionally, a paper-and-pencil version and a web-based version of the survey have been developed. The paper-and-pencil survey is a shortened version that will be used as a final effort to obtain valuable information on sexual violence, intimate partner violence, and stalking from individuals who have not responded to requests to complete the survey via other channels (i.e., the web or phone). This version includes questions that will help produce estimates for some of the more common and overarching NISVS outcomes. Because the NISVS CATI instrument contains complex skip patterns that are difficult to replicate on a paper survey, the level of detail collected via the paper survey has been reduced. Nevertheless, the information collected, when combined with data collected via other approaches (e.g., the web), should be sufficient to produce the lifetime and 12-month victimization estimates.

A web-based version of the survey has also been developed. This version mirrors the questions asked via CATI implementation, but introduces the opportunity for NISVS respondents to complete the survey on their own time and at their own pace. The web-based version implements skip patterns and incorporates technological advances such as call-out boxes with definitions, an option to have questions read aloud by the computer, and the ability to view and complete the survey on a number of different devices, including a personal computer, tablet, or smartphone.

The changes briefly described here are further detailed in Attachment B. Crosswalk of Survey Changes.



Program Changes Related to Cognitive Testing of the New Survey Instruments:

The first phase of the newly funded NISVS redesign contract involves cognitive testing of the three instruments described above. The contractor will conduct a total of 120 cognitive interviews in April-June 2019 to support NISVS redesign efforts. Interviews will be conducted in two rounds of 60 interviews each: 20 per round to test the Web instrument, 20 per round to test the Paper instrument, and 20 per round to test the CATI instrument (40 total per instrument). Each interview is expected to last approximately one hour, and respondents will be provided with $40 to help defray the costs of participating, such as transportation or child care.


In each round, half of the CATI interviews (10 per round) will be conducted by telephone for the purpose of gathering an estimated timing of the instrument. This will provide an opportunity to go through the entire instrument with respondents uninterrupted and gather their feedback on the full experience (including the effects of placing the mental health questions at the beginning of the instrument). These interviews will be conducted by one to two trained senior female telephone interviewers, will be recorded so the team can assess the flow and sensitivity of the new questions, and will include debriefing questions at the end of the interview to gather feedback on the experience.


The other 50 interviews per round (20 Web, 20 Paper, 10 CATI) will be conducted in person so that cognitive interviewers can observe respondents as they work through the instrument. Interviewers will not take notes during the interview so that they can focus on respondent reactions, non-verbal cues, and administering scripted and spontaneous probes. Interviews will be audio and video recorded. (Notes are written after the interview using the recordings, and may be written by the interviewer or by a trained note-taker. If someone other than the interviewer writes the notes, the interviewer is required to review the notes prior to finalization.) A team of 6 experienced female cognitive interviewers and 6 trained note-takers will conduct the two rounds of research.


Interviewers will use a mix of concurrent and retrospective probing to gather feedback on both item comprehension and the usability of the instruments.



Changes to data collection efforts to improve response rate and address non-response bias:

The changes included below are described only briefly for the time being. They will be elaborated upon once the final plans for the Experimentation and Feasibility Phase of the newly funded NISVS redesign contract are developed.


In compliance with the OMB’s remaining terms of clearance for 2014 and 2016, CDC has collaborated with BJS to convene a work group to obtain expert feedback and input on how to enhance the NISVS survey. Workgroup participants provided guidance on how to improve the system’s survey design (e.g., methods, sampling frame, recruitment, mode of administration, etc.) with the goals of increasing response rates, reducing non-response bias, and maximizing the collaborative opportunities across Federal surveys for covering populations of interest. Meetings with the workgroup, which included a representative from OMB, began in February of 2017 and were completed in July of 2017. Recommendations from the workgroup, provided to CDC in a written report, were used to inform the 2018 data collection efforts as well as the survey redesign beginning in September 2018 and subsequent survey administrations. CDC proposes to make the following changes to the NISVS data collection to address the recommendations provided by the workgroup:



  1. Explore different sample frame options (e.g., single vs. dual sample frame)

Recommendation:  Continue dual frame for the data collection beginning in March 2018 but continue to reassess the proportion of the cell phone frame.

Action:  The newly funded NISVS redesign contract will explore a number of alternative design features, including the sample frame. Specifically, this contract will assess the optimal sampling frame for achieving an acceptable response rate and coverage to produce national and state-level prevalence estimates for NISVS outcomes. Experiments and feasibility studies to be conducted in the second phase of the contract will examine the effectiveness of an address-based sample and web panels in comparison with a random digit dial sample.

  2. Response rate:  Caller ID and use of text exchanges

Recommendation:  (1) Consider a 2-arm (add non-800 # display) or 3-arm (add non-800 # or caller ID display) experiment in phase 2 of data collection starting March 2018. (2) Consider adding a question regarding what the respondent saw on the display/caller ID. (3) Consider a text exchange between interviewer and respondent regarding the context of the survey. (4) Consider a text message as an advance letter.

Action:  In the newly funded NISVS contract, the contractor has initially planned for one experiment involving text exchanges to help increase the response rate. The contractor proposes sending a text message to respondents in the cell phone frame to provide the opportunity for the respondent to fill out the survey on the web directly from their smartphone or to enter a link to the survey from a PC. The message will describe the incentive and note that the respondent will receive a call from the CDC if the survey is not completed within the next week. Half of the cell phone numbers will be assigned to this text message treatment, while the other half will not receive any text message. The landline frame that is receiving an advance letter will also be given the option to fill out the screener on the web, rather than wait to be called.


  3. Non-response bias:  Comparison with other surveys



Recommendation:   Identify data sources that include questions that are similar to those asked in NISVS, and review relevant findings from other systems.

Action:  For the NISVS redesign contract beginning in September 2018, some items specifically designed to serve as benchmarks have been added to the NISVS survey. One example of a benchmark item added to the new version of the survey is an item measuring depression that is assessed annually by BRFSS.


  4. Non-response bias:  Target low-responding groups



Recommendation:  Target low-responding groups with the purpose of increasing their participation and balancing the sample. 

Action:  This recommendation has been taken into consideration for the NISVS design contract funded in September 2018. The contractor has been tasked with developing plans to improve response from low-responding groups. One approach that will be assessed in this new contract is the use of an address-based sample (ABS), which will provide some level of sociodemographic information that may be used to increase recruitment of those who are most difficult to reach. Specifically, the contractor will investigate the use of geographic identifiers that are linked to telephone numbers for identifying subpopulations of interest. They will also consider using ABS data either from the Census or from third-party vendors to understand household characteristics, which could allow for stratifying the sample and oversampling groups of particular interest.


  5. Experimentation with pre-incentives

Recommendation:  Experiment with pre-incentives to increase likelihood of opening and reading the advance letter (e.g., insert incentive in envelope). Use official stationery on advance letter mailings (letterhead, envelope). Add question about whether the letter was received and read.

Action:  In the NISVS design contract funded in September 2018, the contractor has developed plans for testing a new incentive structure to be examined during Phase 2 of the contract (Experimentation and Feasibility Studies). The varied incentive structures to be assessed include a series of experiments that will help us understand the impact of: a) providing a pre-incentive (comparing a $2 to a $5 pre-incentive for completing a NISVS screener); b) providing a promised incentive to complete a screening interview (comparing a $10 vs. $20 incentive for completion via the web and $5 vs. $10 for completion via paper); and c) providing a promised incentive for completing the full NISVS questionnaire (comparing $15 vs. $25 for web completion and $5 vs. $15 for paper completion).

