The National Intimate Partner and Sexual Violence Survey (NISVS)

OMB: 0920-0822

Attachment K.2


NISVS Workgroup Summary


Outlined below is a brief summary of the primary recommendations from the NISVS Methodology Workgroup, along with our proposed actions in response to those recommendations. Some of the proposed actions have already been carried out; others first require revision to our OMB package before implementation. Actions that require changes to our current package are bolded and underlined. The highlighted activities are reflected throughout the OMB SSA and SSB.B3.



  1. Single vs. dual frame

Recommendation: Continue the dual-frame design for the data collection beginning in March 2018, but continue to reassess the proportion allocated to the cell-phone frame.

Action: We considered the single-frame option of cell phone only but will continue to use a dual frame, at the recommendation of the Workgroup. Although the share of cell-phone-only households continues to increase, there is still some concern that we could miss respondents if we switched to an all-cell-phone (CP) frame at this time.
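For context, dual-frame telephone surveys typically combine the landline and cell-phone frames with a composite estimator that splits the overlap domain between the two frames. The Python sketch below is a minimal illustration of that logic only; the compositing factor, prevalence values, and domain shares are all assumptions for demonstration, not NISVS figures or NISVS's actual weighting method.

```python
# Minimal sketch of a dual-frame composite estimate (all values hypothetical).
# Domains: landline-only, cell-only, and the overlap reachable by both frames.

def dual_frame_estimate(p_ll_only, p_cp_only, p_overlap_ll, p_overlap_cp,
                        share_ll_only, share_cp_only, share_overlap,
                        lambda_=0.5):
    """Composite prevalence estimate across the three dual-frame domains."""
    # The overlap domain is estimated from both frames, mixed by lambda_.
    overlap = lambda_ * p_overlap_ll + (1 - lambda_) * p_overlap_cp
    return (share_ll_only * p_ll_only
            + share_cp_only * p_cp_only
            + share_overlap * overlap)

# A cell-only frame would drop the landline-only domain entirely, which is
# the coverage concern noted above. Shares and prevalences are assumptions.
print(dual_frame_estimate(p_ll_only=0.04, p_cp_only=0.07,
                          p_overlap_ll=0.05, p_overlap_cp=0.06,
                          share_ll_only=0.05, share_cp_only=0.55,
                          share_overlap=0.40))
```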


  2. Response rate: Caller ID and use of text exchanges

Recommendation: (1) Consider a 2-arm (add non-800 # display) or 3-arm (add non-800 # or caller ID display) experiment in phase 2 of data collection starting March 2018. (2) Consider adding a question regarding what the respondent saw on the display/caller ID. (3) Consider a text exchange between interviewer and respondent regarding the context of the survey. (4) Consider using a text message as an advance letter.

Action: The contractor is supportive of conducting experiments but noted that there are legal restrictions on sending text messages (e.g., the potential cost incurred by a recipient who is sent a text message as an advance letter). In the 2018 data collection, the contractor used only Atlanta-local outgoing numbers (i.e., beginning with area codes 404 or 770) and a descriptor, “CDC Health Injuries Study,” that appeared on caller ID displays to promote responsiveness to survey calls from the contractor.
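As an illustration of the recommended experiment, sampled numbers could be randomized across display conditions before dialing. The sketch below is hypothetical; the arm labels are assumptions based on the 2-arm/3-arm options described above, not the contractor's actual procedure.

```python
import random

# Hypothetical random assignment of sampled numbers to caller-ID display
# arms; arm labels are assumptions based on the 2-arm/3-arm options above.
ARMS = ["800-number display", "non-800 local number", "descriptive caller ID"]

def assign_arms(phone_numbers, seed=2018):
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
    return {number: rng.choice(ARMS) for number in phone_numbers}

sample = ["404-555-0101", "770-555-0102", "404-555-0103"]
for number, arm in assign_arms(sample).items():
    print(number, "->", arm)
```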


  3. Non-response bias: Comparison with other surveys



Recommendation:   Identify data sources that include questions that are similar to those asked in NISVS, and review relevant findings from other systems.

Action: We identified data sources with survey questions that can be used as benchmarks (e.g., comparable items on chronic conditions from BRFSS and NHIS; forced vaginal intercourse among women aged 18-44 from the National Survey of Family Growth).
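As a hypothetical illustration of how such a benchmark comparison might be run, the sketch below applies a two-sample z-test for a difference in proportions; all estimates and sample sizes are made up for demonstration.

```python
from math import sqrt

# Hypothetical benchmark check comparing a NISVS-style estimate to a
# published estimate from another system; all numbers are made up.
def two_proportion_z(p1, n1, p2, n2):
    """Two-sample z statistic for a difference in proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)  # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(p1=0.12, n1=10_000,  # survey estimate (assumed)
                     p2=0.11, n2=40_000)  # benchmark estimate (assumed)
print(f"z = {z:.2f}")  # |z| above ~1.96 flags a difference at alpha = 0.05
```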



  4. Sampling: Cell phone sharing

Recommendation:  Consider trying to understand the extent to which cell phone sharing is occurring among the sample. Input on the need to address this was mixed among the panelists.

Action: We consulted with BRFSS staff, who stated that the prevalence of cell phone sharing has decreased over time; they now treat the cell phone as a personal device and have removed their survey question about cell phone sharing. We will begin systematically treating cell phones as personal devices. Thus, if a cell phone respondent answers that they are under the age of 18, they will be considered ineligible to participate in the study.
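The resulting eligibility rule is simple to state; the sketch below is a minimal, hypothetical rendering of it (the function name and screening flow are illustrative, not the actual CATI logic).

```python
# Sketch of the rule described above: cell phones are treated as personal
# devices, so an under-18 cell phone respondent is simply ineligible.
def cell_phone_eligible(reported_age: int) -> bool:
    return reported_age >= 18

print(cell_phone_eligible(17))  # False -> screened out as ineligible
print(cell_phone_eligible(34))  # True  -> continue the interview
```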



  5. Non-response bias: Reaching late responders



Recommendation: Consider adding an experimental phase 3 to the NISVS call-back protocol for non-respondents, whereby additional efforts are made for a subset of phase 2 non-respondents, possibly at a higher incentive level.

Action: Phase 2 of data collection was lengthened so that it started earlier, which increased the phase 2 sample size. Detailed plans describing the increase in phase 2 duration were included in the February 2018 OMB SSB.B3 submission. The impact of this change is still being assessed.



  6. Non-response bias: Familiarize audience with CDC

Recommendation:  Consider altering the introductory script to add a statement explaining CDC’s mission.

Action: We revised the introductory language that recipients hear early in the call to include a brief description of CDC and its activities. The purpose of the revision was to provide additional information to call recipients who may not be familiar with CDC and its mission.



  7. Non-response bias: Total survey error

Recommendation:  Consider strategies for minimizing total survey error (TSE) and understanding its relationship to non-response bias.

Action:  We are working both in-house and with the contractor on how best to assess TSE and to understand and describe each source of error. We are looking at the various components of non-sampling error and will identify those sources that have been addressed and those that still need work.  
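As a point of reference, TSE is often summarized through the mean squared error of an estimate, MSE = variance + bias², with bias accumulating across non-sampling sources such as coverage, non-response, and measurement error. The sketch below illustrates that bookkeeping with assumed component values; none of the numbers reflect NISVS data.

```python
# Illustrative total-survey-error bookkeeping: MSE = variance + bias^2,
# with bias accumulated across non-sampling sources. All values assumed;
# summing the biases also assumes they do not offset one another.
error_sources = {
    "coverage": 0.004,     # e.g., frame undercoverage
    "nonresponse": 0.006,  # e.g., differential non-response
    "measurement": 0.003,  # e.g., under-disclosure of victimization
}
sampling_variance = 0.0001

total_bias = sum(error_sources.values())
mse = sampling_variance + total_bias ** 2
print(f"total bias = {total_bias:.4f}, MSE = {mse:.6f}")
```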



  8. Non-response bias: Target low-responding groups



Recommendation:  Target low-responding groups with the purpose of increasing their participation and balancing the sample. 

Action:  This requires having relevant information in advance. We worked with the contractor to determine the kinds of vendor data that are available (e.g., zip codes for phone numbers), and to understand the characteristics of non-responders. The contractor has indicated that the quality of this information has improved over time but is really only useful for address-based sampling. This recommendation has been taken into consideration for the NISVS design contract funded in September 2018. The contractor has been tasked with developing plans to improve response from low-responding groups. One approach that will be assessed in this new contract is the use of an address-based sample, which will provide some level of sociodemographic information that may be used to increase recruitment of those who are most difficult to reach.
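As a hypothetical illustration, once auxiliary data such as zip codes are linked to sampled cases, low-responding strata can be flagged by comparing stratum-level response rates against a threshold. The sketch below uses made-up counts and an arbitrary 10% cutoff.

```python
# Hypothetical counts: flag low-responding strata once auxiliary data
# (e.g., zip-linked vendor data) can group sampled cases.
strata = {
    # stratum: (cases attempted, interviews completed)
    "zip_group_A": (2000, 320),
    "zip_group_B": (2000, 180),
    "zip_group_C": (2000, 90),
}

THRESHOLD = 0.10  # arbitrary cutoff for "low-responding"
for name, (attempted, completed) in strata.items():
    rate = completed / attempted
    flag = " <- candidate for targeted follow-up" if rate < THRESHOLD else ""
    print(f"{name}: {rate:.1%}{flag}")
```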



  9. Effectiveness and receipt of advance letter

Recommendation: Experiment with pre-incentives to increase the likelihood of the advance letter being opened and read (e.g., insert the incentive in the envelope). Use official stationery for advance letter mailings (letterhead, envelope). Add a question about whether the letter was received and read.

Action: To increase the likelihood of the letter being opened and read during the 2018 data collection, some changes were made to the advance letter, including changes to the stationery to emphasize that it is a CDC study (e.g., letterhead, envelope). We also included a postcard mailing following the mailing of the advance letter. Postcards were mailed to current non-respondents in the landline sample for whom we had an address. The postcard included the CDC logo with “Centers for Disease Control and Prevention” spelled out, as well as a brief description of the study, the phone number to call to participate, and information about the incentive.

This was a more cost-efficient option than sending cash (e.g., a $1 pre-incentive). An analysis examining the response rate among those who were sent the advance letter and postcard will be completed.
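A minimal sketch of what that comparison could look like is shown below; the counts and the simple contacted/completed ratio are assumptions for illustration (an actual analysis would likely use AAPOR response rate definitions).

```python
# Hypothetical counts for the planned comparison; a real analysis would
# likely use AAPOR response rate definitions rather than this simple ratio.
sent_mailing = {"contacted": 5000, "completed": 450}
no_mailing = {"contacted": 5000, "completed": 360}

rate_sent = sent_mailing["completed"] / sent_mailing["contacted"]
rate_none = no_mailing["completed"] / no_mailing["contacted"]
print(f"with mailing: {rate_sent:.1%}, without: {rate_none:.1%}, "
      f"difference: {rate_sent - rate_none:+.1%}")
```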



In the NISVS design contract funded in September 2018, the contractor has developed plans for testing a new incentive structure to be examined during Phase 2 of the contract (Experimentation and Feasibility Studies). The varied incentive structures to be assessed include a series of experiments that will help us to understand the impact of: a) providing a pre-incentive (comparing a $2 to a $5 pre-incentive for completing a NISVS screener); b) providing a promised incentive to complete a screening interview (comparing a $10 vs. $20 incentive for completion via the web and $5 vs. $10 for completion via paper); and c) providing a promised incentive for completing the full NISVS questionnaire (comparing $15 vs. $25 for web completion and $5 vs. $15 for paper completion).
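For clarity, the sketch below enumerates these incentive arms, treating the three comparisons as fully crossed within each mode (an assumption; the text describes a series of experiments, which need not form a full factorial). The dollar amounts come from the text above; the "max" figure is simply the largest total a respondent in that arm could receive.

```python
from itertools import product

# Dollar amounts come from the text above; treating the three comparisons
# as fully crossed within each mode is an assumption for illustration.
pre_incentives = [2, 5]                          # screener pre-incentive ($)
screener_promised = {"web": [10, 20], "paper": [5, 10]}
full_promised = {"web": [15, 25], "paper": [5, 15]}

for mode in ("web", "paper"):
    for pre, scr, full in product(pre_incentives,
                                  screener_promised[mode],
                                  full_promised[mode]):
        total = pre + scr + full  # largest amount a respondent could receive
        print(f"{mode:5s} pre=${pre} screener=${scr} full=${full} "
              f"max=${total}")
```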

  10. Data quality, validity of responses, disclosure

Recommendation:  Consider adding question(s) on the survey about honesty of responses, discomfort disclosing victimization, whether survey content was upsetting, and whether the person felt safe.

Action: In the NISVS pilot, we included questions about participant reactions. Results indicated that over 90% agreed that a survey should ask questions like these and that they would still have participated if they had known the subject matter in advance or known what participating would be like. Fewer than 10% reported that the survey made them feel “a little” or “very” upset. For the next contract, we will consider conducting an experiment to further examine participant comfort and its influence on disclosure of victimization experiences.



