OMB: 0920-0822

Attachment A.

National Intimate Partner and Sexual Violence Survey (NISVS)

Program Changes and Adjustments



Changes to the survey

Detailed revisions and the justifications supporting them are presented in Attachment B. Questions that are unchanged will not be directly tested, but respondents will have the opportunity to provide feedback on all portions of the survey. Proposed changes to the survey include the following:

  • Minor edits to reflect a change in contractor and updates to CATI instructions due to the renumbering of survey questions.

  • Added an interviewer instruction to end the interview if the respondent does not provide their age.

  • For the education question, revised the parenthetical examples in the graduate education response option.

  • Revised sexual orientation response option from “Straight, that is, not gay” to “Straight, that is, not lesbian or gay.”

  • Dropped 9 health condition questions.

  • Added 1 health question to address National Center for Injury Prevention and Control research priority (i.e., suicide) and to better understand its association with SV and IPV victimization.

    • This question has been added for several reasons. First, suicide is one of only three National Center for Injury Prevention and Control research priorities. Additionally, research suggests that both adolescent and adult victims of rape and other forms of SV and IPV experience depression and suicide attempts at a higher rate than their counterparts who have not been victimized (e.g., Chen, Murad, Paras et al., 2010). Thus, posing questions about whether respondents have been diagnosed with depression or attempted suicide, in combination with questions that assess SV, IPV, and stalking victimization, will allow NCIPC staff to better understand the connection between these violence and injury outcomes among a nationally representative sample.

  • Revised and combined 5 stalking tactic questions after consultation with federal partners and dropped 2 stalking tactic questions.

  • Shortened intimate partner relationship question throughout the survey to reduce length and repetitiveness.

  • Added 2 sexual violence questions to address workplace sexual harassment.

  • Dropped questions pertaining to perpetrator age throughout the survey.

  • Dropped questions on 12-month frequency of victimization throughout survey.

  • Combined or dropped rape and made to penetrate items to reduce burden.

    • The survey will continue to include rape and made to penetrate items; however, some items that measure the same construct (i.e., rape or made to penetrate) have been combined to reduce burden. Some items that have shown low individual prevalence have been removed; we conducted statistical analyses to examine the effect on overall prevalence (an illustrative sketch of this kind of check is provided after this list). A careful and thoughtful approach was taken to ensure that the items dropped from a scale were not those that significantly influenced the overall prevalence of the index outcome. Most questions are the same as or similar to those used in 2016-2018. Questions that did change may have a small effect on trend data; thus, if future publications examine changes over time, they will need to include a limitation specifying the survey modifications as one potential reason for any noticeable differences detected over time. Finally, we believe a break in series data is unavoidable given the new NISVS administration approach, which will result in new data collection methods and simultaneous changes to the survey.



  • Added 1 question on having filed a police report after a rape or made to penetrate incident.

  • Shortened the intimate partner descriptor in the stem and questions throughout the survey to reduce repetitiveness.

  • Revised 2 psychological aggression questions to improve them and reduce burden.

  • Added introduction and 5 reproductive coercion questions (previously asked in 2010-2015) due to high interest in the field.

    • These questions were previously asked in 2010-2012 NISVS administrations. The program has received multiple requests for recent data on this issue (reproductive and sexual coercion), and data are not available from other national surveys.

  • Revised intimate partner violence (IPV) behavior items to improve clarity and reduce burden.

  • Dropped 1 IPV behavior question due to low use in analyses.

  • Replaced head injury question with concussion question to address National Center for Injury Prevention and Control research priority related to understanding and preventing traumatic brain injuries.

  • Shortened 2 IPV Impact items.

    • Revised IPV Impact questions about law enforcement to ask about having filed a police report.

  • Dropped 3 IPV Impact questions in order to add other priority questions.

  • Dropped all 11 questions from (formerly) Section J (Normative Behaviors) to reduce survey length.

    • This module assessed normative behaviors related to SV and IPV; however, analyses demonstrate very little variance in the responses to these questions at the national level. Thus, for the sake of shortening the survey, these questions were removed. However, the NISVS team is considering collecting data on these items every few years in the future to ensure the field has updated data on this topic.
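As referenced in the rape and made to penetrate item discussion above, the following is a minimal sketch of the kind of check used to confirm that dropping a low-prevalence item does not materially change a composite (index) prevalence estimate. The item names and data are hypothetical placeholders; this is not the NISVS scoring algorithm or actual NISVS data.

```python
# Minimal sketch: effect of dropping a low-prevalence item on a composite
# ("any item endorsed") prevalence estimate. Items and data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # hypothetical number of respondents

# Simulated 0/1 victimization indicators for items in one scale.
items = {
    "item_a": rng.binomial(1, 0.060, n),
    "item_b": rng.binomial(1, 0.045, n),
    "item_c": rng.binomial(1, 0.004, n),  # low individual prevalence
}

def composite_prevalence(columns):
    """Share of respondents endorsing at least one retained item."""
    return np.column_stack(columns).any(axis=1).mean()

full_scale = composite_prevalence(list(items.values()))
reduced_scale = composite_prevalence([v for k, v in items.items() if k != "item_c"])

print(f"Full scale prevalence:       {full_scale:.3%}")
print(f"Without low-prevalence item: {reduced_scale:.3%}")
print(f"Absolute change:             {full_scale - reduced_scale:.3%}")
```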


Additionally, a paper-and-pencil version of the survey and a web-based version of the survey have been developed. The paper-and-pencil survey is a shortened version that will be used as a final effort to obtain some valuable information on sexual violence, intimate partner violence, and stalking from individuals who have not responded to requests to complete the survey via other channels (i.e., via the web or phone). This survey includes questions that will help produce estimates for some of the more common and overarching NISVS outcomes. Because the NISVS CATI instrument contains complex skip patterns that are difficult to replicate on a paper survey, the level of detail collected via the paper survey has been reduced (a simple illustration of such routing follows this paragraph). Still, the information collected should be sufficient, when combined with data collected via other approaches (e.g., the web), to produce the lifetime and 12-month estimates for victimization.
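To illustrate the kind of conditional routing referenced above (using invented question identifiers, not actual NISVS items), here is a minimal sketch of a skip pattern that is straightforward to automate in CATI or web administration but difficult to convey on paper:

```python
# Hypothetical skip pattern: follow-up questions are asked only when a gate
# question is endorsed. Question IDs are invented for this sketch.
def route(responses: dict) -> list:
    path = ["Q1_gate"]
    if responses.get("Q1_gate") == "yes":
        path += ["Q1a_detail", "Q1b_detail"]  # asked only if the gate is endorsed
    path.append("Q2_next_section")            # all respondents continue here
    return path

print(route({"Q1_gate": "yes"}))  # ['Q1_gate', 'Q1a_detail', 'Q1b_detail', 'Q2_next_section']
print(route({"Q1_gate": "no"}))   # ['Q1_gate', 'Q2_next_section']
```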

A web-based version of the survey has also been developed. This version mirrors the questions asked via the CATI instrument, but it gives NISVS respondents the opportunity to complete the survey on their own time and at their own pace. The web-based version implements skip patterns and incorporates technological advances such as call-out boxes with definitions, an option to have each question read aloud by the computer, and the ability to view and complete the survey on a number of different devices, including a personal computer, tablet, or smartphone.

The changes briefly described here are further detailed in Attachment B. Crosswalk of Survey Changes.



Program Changes Related to Cognitive Testing of the New Survey Instruments:

The first phase of the newly funded NISVS redesign contract involves cognitive testing of the three instruments described above. The contractor will conduct a total of 120 cognitive interviews in April-June 2019 to support NISVS redesign efforts. Interviews will be conducted in two rounds of 60 interviews each: in each round, 20 interviews will test the Web instrument, 20 will test the Paper instrument, and 20 will test the CATI instrument (40 total per instrument). Each interview is expected to last approximately one hour, and respondents will be provided with $40 to help defray the costs of participating, such as transportation or child care.


In each round, half of the CATI interviews (10 per round) will be conducted by telephone for the purpose of estimating the timing of the instrument. This will provide an opportunity to go through the entire instrument with respondents uninterrupted and to gather their feedback on the full experience (including the effects of placing the mental health questions at the beginning of the instrument). These interviews will be conducted by one to two trained senior female telephone interviewers, will be recorded and reviewed to assess the flow and sensitivity of the new questions, and will include debriefing questions at the end of the interview to gather feedback on the experience.


The other 50 interviews per round (20 Web, 20 Paper, 10 CATI) will be conducted in person so that cognitive interviewers can observe respondents as they work through the instrument. Interviewers do not concentrate on taking notes during the interview so that they can focus on respondent reactions, non-verbal cues, and administering scripted and spontaneous probes. Interviews will be audio and video recorded. (Note-taking is performed after the interview using notes and recordings, and the notes may be written by the interviewer or by a trained note-taker. If someone other than the interviewer writes the notes, the interviewer is required to review them prior to finalization.) A team of 6 experienced female cognitive interviewers and 6 trained note-takers will conduct the two rounds of research.
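For reference, a simple tally reconciling the interview counts described above; the figures are only those already stated in this section.

```python
# Tally of cognitive interview counts described above.
rounds = 2
per_round = {"web": 20, "paper": 20, "cati": 20}   # interviews per instrument per round

total_interviews = rounds * sum(per_round.values())                 # 120 overall
timed_cati_by_phone = per_round["cati"] // 2                         # 10 per round by telephone
in_person_per_round = sum(per_round.values()) - timed_cati_by_phone  # 50 per round in person

print(total_interviews, timed_cati_by_phone, in_person_per_round)  # 120 10 50
```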


Interviewers will use a mix of concurrent and retrospective probing to gather feedback on both item comprehension and the usability of the instruments.



Changes to data collection efforts to improve response rate and address non-response bias:

The changes included below are described only briefly for the time being. They will be elaborated upon once the final plans for the Experimentation and Feasibility Phase of the newly funded NISVS redesign contract are developed.


In compliance with OMB’s remaining terms of clearance for 2014 and 2016, CDC has collaborated with BJS to convene a workgroup to obtain expert feedback and input on how to enhance the NISVS survey. Workgroup participants provided guidance on how to improve the system’s survey design (e.g., methods, sampling frame, recruitment, mode of administration) with the goals of increasing response rates, reducing non-response bias, and maximizing collaborative opportunities across Federal surveys for covering populations of interest. Meetings with the workgroup, which included a representative from OMB, began in February 2017 and were completed in July 2017. Recommendations from the workgroup, provided to CDC in a written report, were used to inform the 2018 data collection efforts as well as the survey redesign beginning in September 2018 and subsequent survey administrations. CDC proposes to make the following changes to the NISVS data collection to address the recommendations provided by the workgroup:



  1. Explore different sample frame options (e.g., single vs. dual sample frame)

Recommendation:  Continue the dual frame for data collection beginning in March 2018, but continue to reassess the proportion of the cell phone frame.

Action:  The newly funded NISVS redesign contract will explore a number of alternative design features, including the sample frame. Specifically, this contract will assess the optimal sampling frame for achieving an acceptable response rate and coverage to produce national and state-level prevalence estimates for NISVS outcomes. Experiments and feasibility studies to be conducted in the second phase of the contract will examine the effectiveness of an address-based sample and web panels in comparison with a random digit dial sample.

  2. Response rate:  Caller ID and use of text exchanges

Recommendation:  (1) Consider a 2-arm (add non-800 # display) or 3-arm (add non-800 # or caller ID display) experiment in phase 2 of data collection starting March 2018. (2) Consider adding a question regarding what the respondent saw on the display/caller ID. (3) Consider a text exchange between interviewer and respondent regarding the context of the survey. (4) Consider a text message as an advance letter.

Action:  In the newly funded NISVS contract, the contractor has initially planned for one experiment involving text exchanges to help increase the response rate. The contractor proposes sending a text message to respondents in the cell phone frame to give them the opportunity to fill out the survey on the web directly from their smartphone or to enter a link to the survey from a PC. The message will describe the incentive and note that the respondent will receive a call from CDC if the survey is not completed within the next week. Half of the cell phone numbers will be assigned to this text message treatment, while the other half will not receive any text message at all (a brief illustrative sketch of this assignment follows). The landline frame, which receives an advance letter, will also be given the option to fill out the screener on the web rather than wait to be called.
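A minimal sketch of the random half-and-half assignment described in the Action above. The phone numbers and arm labels are hypothetical placeholders, not the contractor’s actual procedure.

```python
# Sketch: randomly assign the cell phone frame to a text-invitation treatment
# arm or a no-text control arm, half and half. Numbers are placeholders.
import random

random.seed(2019)
cell_numbers = [f"555-01{i:04d}" for i in range(1000)]  # placeholder sample

random.shuffle(cell_numbers)
half = len(cell_numbers) // 2
assignment = {number: ("text_invite" if i < half else "no_text")
              for i, number in enumerate(cell_numbers)}

print(sum(arm == "text_invite" for arm in assignment.values()))  # 500
```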


  3. Non-response bias:  Comparison with other surveys



Recommendation:   Identify data sources that include questions that are similar to those asked in NISVS, and review relevant findings from other systems.

Action:  For the NISVS redesign contract beginning in September 2018, some items specifically designed to serve as benchmarks have been added to the NISVS survey. One example of a benchmark item added to the new version of the survey is an item measuring depression that is assessed annually by BRFSS (a brief illustrative comparison follows).
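A minimal sketch of how a benchmark item can be used to check for non-response bias: the weighted survey estimate is compared with an external benchmark value. All numbers below are invented placeholders, not actual NISVS or BRFSS results.

```python
# Sketch: compare a weighted survey estimate of a benchmark item with an
# external benchmark value. All values are hypothetical placeholders.
import numpy as np

depression = np.array([1, 0, 0, 1, 0, 1, 0, 0])              # hypothetical 0/1 indicator
weights = np.array([1.2, 0.8, 1.0, 1.5, 0.9, 1.1, 1.0, 0.7])  # hypothetical survey weights

survey_estimate = np.average(depression, weights=weights)
benchmark_value = 0.19                                        # placeholder external estimate

print(f"Weighted survey estimate: {survey_estimate:.3f}")
print(f"External benchmark:       {benchmark_value:.3f}")
print(f"Difference:               {survey_estimate - benchmark_value:+.3f}")
```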


  4. Non-response bias:  Target low-responding groups



Recommendation:  Target low-responding groups with the purpose of increasing their participation and balancing the sample. 

Action:  This recommendation has been taken into consideration for the NISVS design contract funded in September 2018. The contractor has been tasked with developing plans to improve response from low-responding groups. One approach that will be assessed in this new contract is the use of an address-based sample (ABS), which will provide some level of sociodemographic information that may be used to increase recruitment of those who are most difficult to reach. Specifically, the contractor will investigate the use of geographic identifiers that are linked to telephone numbers for identifying subpopulations of interest. They will also consider using ABS data either from the Census or from third-party vendors to understand household characteristics, which could allow for stratifying the sample and oversampling groups of particular interest.
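A minimal sketch of the stratify-and-oversample idea described in the Action above, assuming a household characteristic is available for each ABS record. The frame, stratum labels, and sampling rates are hypothetical placeholders.

```python
# Sketch: stratify an address-based sample (ABS) frame by a linked household
# characteristic and oversample the harder-to-reach stratum. Values are
# hypothetical placeholders.
import random

random.seed(7)
frame = [{"address_id": i,
          "stratum": random.choice(["low_response", "other"])}
         for i in range(10_000)]

sampling_rates = {"low_response": 0.20, "other": 0.05}  # oversample low-response stratum

sample = [record for record in frame
          if random.random() < sampling_rates[record["stratum"]]]

print(len(sample),
      sum(record["stratum"] == "low_response" for record in sample))
```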


  5. Experimentation with pre-incentives

Recommendation:  Experiment with pre-incentives to increase the likelihood that the advance letter is opened and read (e.g., insert the incentive in the envelope). Use official stationery for advance letter mailings (letterhead, envelope). Add a question about whether the letter was received and read.

Action:  In the NISVS design contract funded in September 2018, the contractor has developed plans for testing a new incentive structure to be examined during Phase 2 of the contract (Experimentation and Feasibility Studies).


References


Chen, L. P., Murad, M. H., Paras, M. L., Colbenson, K. M., Sattler, A. L., Goranson, E. N., ... & Zirakzadeh, A. (2010). Sexual abuse and lifetime diagnosis of psychiatric disorders: Systematic review and meta-analysis. Mayo Clinic Proceedings, 85(7), 618-629.
