
Supporting Statement

For OMB Information Collection Request


Part B


OMB# 0920-0822





November 27, 2017



The National Intimate Partner and Sexual Violence Survey (NISVS)


Supported by:


Department of Health and Human Services

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

Division of Violence Prevention







Project Officer:

Sharon G. Smith, PhD

Behavioral Scientist

Contact Information:

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE, MS F-64

Atlanta, GA 30341-3724

phone: 770-488-1363

email: [email protected]

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Respondent Universe and Sampling Methods



This is a revision request for the currently approved National Intimate Partner and Sexual Violence Survey (NISVS), OMB# 0920-0822, expiration date 07/31/2018. Data collection for the 2018-2019 cycle is slated to begin in March 2018 and run through March 2019.

For the data collection year 2018-2019, the periodicity of the administration of the NISVS instrument remains biennial. Biennial data collection was incorporated for the 2016-2017 data collection cycle to increase the number of interviews from a minimum of 12,500 interviews collected annually to at least 25,000 interviews during a 12-month period. For the 2018-2019 data collection cycle, CDC has already allocated funding in its current contract to increase the 12,500 NISVS interviews conducted in each 6-month period by as much as 2,500, so that as many as 15,000 interviews will be collected per 6-month period and up to 30,000 per year.


The Department of Defense (DoD) subsample collected in 2016-2017 was completed in 2017; therefore, questions specific to the DoD populations will be removed. The DoD collaboration and those questions provided updated prevalence estimates for active duty females and wives of active duty males, as well as first-time NISVS prevalence estimates for active duty males.


Target Population

The target population for the civilian NISVS is English- or Spanish-speaking men and women aged 18 and older in U.S. households. Those under age 18 are excluded because they are legally considered minors and their participation would necessitate significant changes to the study protocol. Additional exclusions include adults who are: 1) residing in penal, mental, or other institutions; 2) living in other group quarters such as dormitories, convents, or boarding houses (with ten or more unrelated residents) and do not have a cell phone; 3) living in a dwelling unit without a landline telephone used for voice purposes and without a working cell phone; or 4) unable to speak English or Spanish well enough to be interviewed. Those who do not speak English or Spanish are excluded because the instrument and survey materials are currently limited to those two languages.


The targeted sample size is driven by the number of respondents needed to provide precise (+/- 1 to 2%) and unsuppressed (relative standard error < 0.30) national prevalence estimates each year and to provide stable lifetime state-level prevalence estimates within 2-3 years depending on state population size.
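For illustration only, the following sketch shows how such precision targets translate into approximate sample sizes under simplified assumptions (simple random sampling with an ad hoc design effect; the prevalence, margin of error, and design effect values are hypothetical and are not NISVS specifications):

import math

def required_n(p, moe, deff=1.0, z=1.96):
    """Sample size needed so a prevalence estimate p has margin of error
    `moe` (half-width of a 95% CI), inflated by a design effect to reflect
    a complex design. Hypothetical planning values only."""
    n_srs = (z ** 2) * p * (1 - p) / moe ** 2
    return math.ceil(n_srs * deff)

def relative_standard_error(p, n, deff=1.0):
    """RSE = SE(p)/p; estimates with RSE >= 0.30 would be suppressed."""
    return math.sqrt(deff * p * (1 - p) / n) / p

# Example: a 5% prevalence estimated within +/- 1 percentage point
print(required_n(p=0.05, moe=0.01, deff=1.5))            # 2738 respondents
print(relative_standard_error(p=0.05, n=25000) < 0.30)   # True -> not suppressed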


Blumberg and Luke (2016) estimated that 41.2% of adults were living in households with both a landline and a cell phone, and 48.3% were wireless only. There is the possibility of coverage bias if survey households are excluded based on the type of phone service (e.g., cell-phone-only, landline-only) (Blumberg & Luke, 2016). This motivates selecting 1) adults with only cell phones and 2) adults who have both cell and landline phones but are selected through their cell phones. To maximize the coverage of the target population, a dual frame design that randomly samples landline and cell phone numbers is used. The dual frame design to reduce under-coverage is discussed in more detail in section B.3.

While there is an overlap between the cell phone frame and the landline frame, the gains in coverage achieved by sampling both frames and the efficiency from not excluding any adults with both types of phone service outweigh the resulting design effect due to inclusion of adults with multiple selection probabilities, who can be selected from either frame.

From July – December 2015, only 7.2% of adults were estimated to have lived in households with only landlines (Blumberg & Luke, 2016), down from 8.6% from July – December 2012. If this trend continues, a single frame RDD survey with cell phone numbers will have near complete coverage of the population. There are great efficiencies to be gained from a single frame design, as well as new challenges. Such an alternative will be evaluated in terms of potential bias and variance reduction, to allow the survey design to be responsive to changes in the survey environment.



Table 5. Sampling Strata

Cell Phone Strata

Stratum No.    Stratum
1              Alabama
2              Alaska
...            ...
51             Wyoming

Landline Strata

Stratum No.    Stratum
52             Alabama
53             Alaska
...            ...
102            Wyoming



The national sample is allocated across 102 strata: 51 cell phone frame strata and 51 landline frame strata, as shown in Table 5. Each frame's sample is stratified by state, with one stratum for each state and the District of Columbia.



Sampling Frames

For the landline strata, the recommended sampling frame is maintained by Marketing Systems Group (MSG) using Cell-WINS. A list-assisted RDD sample is selected and stratified by state; exchanges known to be limited to cell phones are excluded from the landline strata. For the cell phone strata, defined by state, an RDD sample of cell phone numbers is selected within exchanges assigned to wireless providers.

Multi-year data collection plan

As a surveillance system, NISVS plans to collect data on a semi-annual basis within each biennial cycle to track national and state-level estimates of IPV, SV, and stalking. During data collection periods a new sample is selected every quarter, avoiding problems associated with an aging sample of telephone numbers. The quarterly sample design also allows any needed changes to be implemented in the next quarter as the need arises, rather than waiting until the following year. The main features of the sampling and data collection design for each survey period are similar to those used in prior NISVS data collection years; the biennial periodicity and planned interview volumes for 2018-2019 are described in section B.1 above.

Based on the study goal of producing national estimates, each sample is allocated to distribute the intended interviews across the national landline frame and the cell phone frame. To address the study goal of achieving state-level estimates for selected prevalence rates by cumulating data across years, the samples are further stratified by state. Sample size within states is determined primarily by proportional allocation based on the relative size of individual state populations. However, to decrease the number of years required to generate stable state estimates among smaller states, a minimum target of 225 interviews per data collection period was set; this over-allocation to smaller states is done within both the landline and cell phone frames, as illustrated in the sketch below.
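As a rough illustration of this allocation rule, the sketch below proportionally allocates a hypothetical interview total across three states and then raises the smallest state to the 225-interview floor (the state populations and total are hypothetical; the actual NISVS allocation applies the floor within each frame and incorporates cost considerations):

def allocate_interviews(state_pops, total, floor=225):
    """Proportionally allocate `total` interviews by state population, then
    raise small states to a minimum of `floor` interviews. Note that the
    floor can push the sum slightly above `total`."""
    pop_total = sum(state_pops.values())
    alloc = {s: round(total * p / pop_total) for s, p in state_pops.items()}
    return {s: max(n, floor) for s, n in alloc.items()}

# Hypothetical state populations
print(allocate_interviews({"CA": 30_000_000, "GA": 8_000_000, "WY": 450_000},
                          total=12_500))
# {'CA': 9753, 'GA': 2601, 'WY': 225}  (WY raised from 146 to the floor)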

The most recent NHIS results based on data from January-June 2017 (Blumberg & Luke, 2017) estimate that 52.0% of adults live in households with only cell phones while only 4.8% live in households with only landlines (39.6% have both types of service and 3.4% have no phone service). We use this information, together with interviewing cost data, to optimize the sample allocation to the cell and landline frames (Brick, Cervantes, Lee, & Norman, 2011). Since the cell-phone-only rate continues to increase and the landline-only rate continues to decrease, NISVS uses projections to plan sample allocation and also evaluates the point at which a dual frame design becomes suboptimal compared to a single frame RDD design.


Landline and cell phone samples perform differently for many reasons. Examples include the inability to screen cell phone samples for known nonworking and business numbers and the responsibility of the cell phone owner for airtime costs. The cost per interview in each sample will be closely monitored and the allocation between landline and cell phone numbers adjusted during data collection to achieve a more optimal allocation. Releasing the sample in replicates provides some control during the initial weeks of data collection, while more major changes can be made across quarters of data collection, if needed.


Males, particularly in the landline frame, tend to respond at a lower rate than females. While greater substantive interest may lie in the victimization of females, the proportion of male and female respondents is also monitored. The instrument provides the capacity to change the selection probability for males and females in households with adults of both sexes during data collection. If the percentage of respondents who are male drops below 40%, oversampling of males will be reconsidered. As with the allocation by telephone service, if substantial changes are needed, changes in allocation are made between quarters of data collection to avoid creating extreme weights (e.g., for males and females in such households interviewed late in the data collection period).



To address nonresponse, a nonresponse protocol has been implemented, as described in section B.3. Briefly, the nonresponse phase is a protocol implemented at some point during survey recruitment in order to decrease nonresponse and gain information from sample members who have not yet chosen to respond or participate. Importantly, the point at which the nonresponse protocol is implemented during the recruitment process has cost implications. For example, a decision could be made to move a phone number into the nonresponse recruitment protocol after 8 unsuccessful attempts or after 15 unsuccessful recruitment attempts. As described in section B.3, indicators of cost and survey measures are monitored to determine when the nonresponse protocol is implemented; this "responsive" approach maintains the most effective data collection. Based on the recommendations from the NISVS methodology workgroup and after consultation with the contractor, CDC is proposing to extend the duration of the nonresponse phase and increase the proportion of the sample included (see B.3).


Response rates are maximized, in part, by drawing on experience from previous surveillance efforts, such as the Behavioral Risk Factor Surveillance System (BRFSS) and the Injury Control and Risk Survey (ICARIS-2), and by using sophisticated methodological techniques (for example, responsive design elements). Most telephone surveys have seen a decrease in response rates in recent years. One comparison for telephone survey response rates is the National Immunization Survey (NIS), whose response rate was 87.1% in 1997 and had dropped to 62.6% by 2014 (Czajka & Beyler, 2016).



For the NISVS, the response rate is computed based on the American Association for Public Opinion Research (AAPOR) Response Rate 4 (RR4) formula (AAPOR, 2016). This is the most commonly implemented formula that 1) accounts for ineligibility among cases with unknown eligibility; and 2) treats partial interviews (by respondents who have answered all pre-identified essential questions) as interviews. The response rate for the previous NCIPC ICARIS survey conducted in 2001-2003 was 47.9% (Black, Kresnow, Simon, Arias, & Shelley, 2006). The response rate for years 2010-2012 ranged from 27.5% to 33.6% (Smith et al., 2017). The response rate for NISVS in 2012 was 20.32% for landlines and 28.04% for cell phones; the overall response rate for 2012 was 33.20%. The 2015 data report is currently in development. Estimates from the 2015 NISVS survey indicate similar frame-specific and overall response rates (AAPOR RR4, unweighted: 22.1% landline; 28.2% cell phone; 26.3% overall).
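For reference, AAPOR Response Rate 4 is RR4 = (I + P) / [(I + P) + (R + NC + O) + e(UH + UO)], where I is complete interviews, P partial interviews, R refusals and break-offs, NC non-contacts, O other eligible non-interviews, UH and UO cases of unknown eligibility, and e the estimated eligibility rate among the unknowns (AAPOR, 2016). A small illustrative implementation follows; the disposition counts shown are hypothetical, not NISVS data:

def aapor_rr4(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 4: partial interviews count as interviews, and
    cases of unknown eligibility are discounted by the estimated
    eligibility rate e."""
    return (I + P) / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical disposition counts
print(round(aapor_rr4(I=2500, P=150, R=3000, NC=2400, O=500,
                      UH=4000, UO=1000, e=0.45), 3))   # 0.245, i.e., 24.5%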


The importance of increasing and maintaining response rates is well recognized. Even if evidence shows that various survey estimates do not suffer from nonresponse bias and that response rates are not highly correlated with nonresponse bias, the response rate remains the single number that is reported and used to gauge the representativeness of survey data. One way to improve response rates is through the use of advance letters, which have been shown to positively influence response rates and cooperation rates (de Leeuw, Callegaro, Hox, Korendijk, & Lensvelt-Mulders, 2007). A second way to increase overall response rates is through interviewer training, reducing the variation of response rates by interviewer through improving techniques among lower performing interviewers (Groves & McGonagle, 2001). Third, providing incentives for completing the survey has been shown to increase response rates in RDD surveys (e.g., Holbrook, Krosnick, & Pfent, 2007; Singer & Ye, 2013). Implementing an effective incentive plan can, over the course of data collection, reduce overall costs and burden to respondents by reducing the need for additional calls to potential respondents. Furthermore, we have tried to improve the impact of incentives on increasing response rates and reducing nonresponse bias by implementing a phased design. The implementation of these methods in the NISVS is described in section B.3.


B.2. Procedures for the Collection of Information


Using a letter to inform households about a forthcoming telephone call and giving them a general description of the survey being conducted has been shown to increase survey response rates. For the purpose of mailing advance letters of introduction, released telephone numbers are address-matched to the extent possible. The sample and address matches are obtained from Genesys Sampling, Inc.


Based on the 2016-2017 data collection, an estimated 50% of address matches are found for the released landline telephone numbers. Respondents with an address match are mailed an advance letter approximately 1-2 weeks prior to the first telephone contact (Attachments I1-I2). The letter describes the purpose of the survey in both English and Spanish and: 1) informs sample members that their household has been randomly chosen to participate in the survey; 2) provides useful information regarding the survey; 3) includes a toll-free telephone number that respondents can call if they have questions; and 4) includes information regarding the incentive that is offered to eligible respondents who agree to participate.

The study identification numbers contain an indicator specifying whether or not the household was mailed a letter. The anonymity of households receiving the letter is protected because at no time is the address file linked to data collected during the telephone interview. In addition, upon completion of the study the address files are destroyed to further prevent any matching.


To maximize human subject protection, the introductory letter has been carefully written to be very general and describe the study in broad terms (Attachments I1-I2). The lack of detailed study information in the advance letter is intentional for the protection of the prospective study participant. If the prospective study participant is in a relationship where IPV is present, a more general introductory letter is less likely to raise suspicion or incite potential perpetrators.

Interviewers are highly trained female staff. The decision to use only female interviewers is based on both the survey topics and the literature regarding gender and reporting. A study conducted by Pollner (1998) indicates that interviewer gender is significantly related to respondents' reports of psychiatric symptoms. Male and female respondents interviewed by women reported more symptoms of depression, substance abuse, and conduct disorders than respondents interviewed by men. These results suggest that female interviewers may create conditions more conducive to disclosure and be perceived as more sympathetic than male interviewers (Pollner, 1998). A more recent study found that male respondents had greater engagement and consistency of responses when responding to a female voice in an audio-assisted self-interviewed interview (Dykema, Diloreto, Price, White, & Schaeffer, 2012). Furthermore, the sex of the respondent selected from a specific household is unknown until the respondent has been randomly selected. Thus, it is not feasible to match interviewer and respondent by sex.


A study of the relationship between interviewer characteristics and disclosure of physical and sexual abuse showed that matching clients and interviewers on sex, race, and age did not increase disclosures of either physical or sexual abuse. Rather, respondents were more likely to disclose sexual abuse to female interviewers than to male interviewers (Dailey & Claus, 2001). An earlier study showed that, in most cases, the socio-demographic characteristics of the interviewer did not affect the quality of participants' responses (Fowler & Mangione, 1990).


An additional consideration specifically related to interviews about IPV, SV, and stalking includes the fact that the majority of victims are female and the majority of the perpetrators are male. Thus, females may be less comfortable reporting IPV, SV, and stalking to a male interviewer. Based on the lack of evidence to suggest the need for matching interviewers and respondents by gender and because evidence suggests that female interviewers may create conditions more conducive to disclosure, only female interviewers conduct interviews for this study.


It is essential that the interviewers be comfortable with the survey because their level of comfort, in turn, impacts the quality with which they administer the interview. During the hiring process, potential English and Spanish speaking interviewers are informed about the background and purpose of the study and carefully screened to ensure that they are comfortable conducting interviews about the topics included.


Interviewers receive a minimum of 12 hours of training. Only those who successfully complete all training sessions conduct interviews. Training topics include the purpose of the study, question-by-question review of the instrument, ways to engage respondents, role-playing, and techniques to foster cooperation and completed surveys. Interviewers are briefed on the potential challenges of administering a survey on IPV, SV, and stalking.


Interviewers are trained to follow specific interviewing procedures that have been proven in previous studies. Interviewers are properly trained in the art of administering questions about IPV, SV, and stalking. For example, interviewers learn about respondent reactions to similar surveys conducted by CDC (as described in Section A.11). They learn about the need for the use of explicit language and are coached on being matter-of-fact in their delivery. Interviewers also learn about resource information that is provided for participants to learn about resources that are available to those who are coping with traumatic and violent events.


A detailed written training manual specific to this study has been developed. The content of the training focuses on the study background, project specific protocols, confidentiality procedures, questionnaire content, refusal avoidance and well-defined conversion protocols. The information is presented using a variety of methods, including lecture, demonstration, round-robin practice, paired-practice, and group and paired mock interviews. Due to the nature of the study, particular attention is paid to the distressed respondent protocol for this study.


Respondent safety is a primary concern for any data collection asking about violence, particularly IPV, SV, and stalking. The distress protocol addresses how telephone interviewers should respond to and record issues of emotional, physical, or unknown sources of distress throughout the interview process, and it is covered extensively during interviewer training. Any information entered into CATI regarding distress cases is reviewed by project staff, including the staff clinical psychologist. Project staff forward information regarding distressed respondents to the contractor's IRB and include information regarding these cases in the weekly report to CDC. Further, to ensure the safety of respondents, we provide them with a code word that they can use to end the interview at any time they feel concerned for their safety.


A clinical psychologist with prior experience working with victims of interpersonal violence participates in the training and in ongoing monitoring and supervision of interviewers. Only interviewers whose work has been reviewed and certified by the project team are permitted to conduct actual interviews. The certification process involves observation during training, completion of practice interviews, completion of paired mock interviews, successful completion of written and oral quizzes, signing of a confidentiality agreement, and demonstration of proficient NISVS survey administration with correct disposition coding. Interviewers are not allowed to begin working until all certification steps are completed and documented.


While participation in surveys is typically not distressful, it is important for researchers to anticipate potential respondent reactions to the questions being asked and to minimize any adverse impact to the fullest extent possible. Although distress is unlikely, both telephone interviewers and supervisors are trained in the distress protocol appropriate for this study.

The distress protocol includes step-by-step instructions on handling different types of distress. Interviewers are properly trained with well-established contingency plans, including early termination of the interview if the respondent becomes distressed or concerned for their safety. The protocol includes instructions on steps to follow for different types of distress: physical, emotional, and unknown.


If a respondent does display distress, either verbally or non-verbally (e.g., crying), the interviewer immediately offers to finish the interview at another time and offers the respondent the telephone numbers for the National Domestic Violence Hotline; the Rape, Abuse, and Incest National Network; the National Child Abuse Hotline; and the National Suicide Hotline so that the respondent may obtain services to help alleviate their emotional distress. Similarly, in the unlikely event that a respondent expresses thoughts or intentions of suicide, the interviewer stops the interview and encourages the respondent to call the National Suicide Hotline.


In surveys conducted by NCIPC or by the contractor there have been no instances where interviewers actually had to transfer respondents to 911. In the extremely unlikely event that a respondent is in immediate physical danger, the interviewer will advise the respondent to hang up and dial 911 for immediate police assistance. If the respondent specifically asks the interviewer to call 911, the call will be transferred directly and the interviewer will then hang up. The supervisor will then record the details of the event, and relay them to a project staff member as soon as possible. The project staff member will evaluate any events as they are reported, and relay them to the project director and CDC/NCIPC staff as soon as possible.


Resource information will also be provided for participants to access for assistance in coping with traumatic and violent events; telephone numbers are provided for the National Domestic Violence Hotline, the Rape, Abuse, and Incest National Network, and the National Child Abuse Hotline. These procedures have been recommended in the literature (Gondolf & Heckert, 2003; Johnson, 1996; Tjaden & Thoennes, 2000; Sullivan & Cain, 2004) and have been consistently used in NCIPC’s previous studies, including ICARIS-2 and the NISVS Pilot Survey.


Throughout data collection, interviewers are monitored to check the quality of their work and to identify areas needing more training or clarification. Silent audio and video monitoring of interviewers takes place throughout data collection, and approximately 10% of all interviewing time is observed. Interviewers are scored on their performance during these sessions, which are unknown to the interviewer at the time of administration, and are given written and verbal feedback on their performance. This process allows the identification of individual interviewer performance issues, as well as larger issues that might affect the data collection. The information obtained is then used as a teaching tool for other interviewers, as appropriate.


Because of the prevalence of IPV, SV, and stalking, it can be anticipated that potential or selected interviewers may have personal experience with the topics being addressed during the interview. Although disclosure of this private information is not requested, it is important for the interviewers to have support available, as needed, and opportunities to debrief (regardless of their personal history) on a regular basis during the conduct of this study. In addition to participating in the interviewer training and ongoing monitoring and supervision of interviewers, interviewers attend weekly meetings with members of project staff. The purpose of these meetings, which occur throughout data collection, is typically to discuss progress in data collection, problems in interviewing, and survey instrument changes. These meetings allow the interviewers to discuss specific experiences as well as their responses to difficult situations. The clinical psychologist is available to provide this support during regularly scheduled meetings with the interviewers.


Households in the landline frame are contacted by telephone approximately one week after the introductory letter has been sent. Interviewers introduce themselves and (when applicable) state "You may have received a letter from us" (Attachment I), then inform the potential participant about the study, select a respondent, and proceed with the introductory script. In households with multiple residents aged 18 or older, the respondent is selected using the most recent birthday method.


The letter of introduction and the survey have been translated into Spanish. To ensure the accuracy and usability of the Spanish versions of the introductory letter and survey instrument, several steps have been taken. A translator translates the documents into Spanish and another translator translates them back into English to ensure that all study materials are properly translated and that the meaning of the questions is preserved. Both the letter and survey are written in language that is commonly understood; to ensure that people of different Hispanic backgrounds can understand the Spanish versions, a third translator has reviewed the study instruments. If it is determined that the respondent speaks Spanish and not English, a bilingual interviewer continues with the introductory script, respondent selection, oral consent, and survey administration.


All estimates are weighted to account for the stratified dual-frame sample design, multiple phases, and additional post-survey adjustments for coverage and nonresponse. The latest National Health Interview Survey data and reported estimates are used to adjust selection weights in order to combine the landline and cell phone samples, informing the relative size of each sampling frame and the demographic composition within each frame. Census population totals are used to adjust the combined sample to the U.S. adult population.
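To illustrate the calibration step for readers unfamiliar with it, the sketch below implements raking (iterative proportional fitting), one standard way to align weighted sample margins with external control totals; the data and control totals are toy values, and this is not the NISVS weighting specification:

import numpy as np

def rake(weights, categories, targets, iterations=50, tol=1e-8):
    """Iterative proportional fitting: rescale weights so weighted margins
    match control totals. `categories` maps a dimension name to an array of
    per-respondent category codes; `targets` maps the same name to
    {category: population total}."""
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(iterations):
        max_change = 0.0
        for dim, codes in categories.items():
            for cat, total in targets[dim].items():
                mask = codes == cat
                current = w[mask].sum()
                if current > 0:
                    factor = total / current
                    w[mask] *= factor
                    max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:   # all margins matched
            break
    return w

# Toy example: 4 respondents raked to sex and age-group control totals
w0 = np.ones(4)
cats = {"sex": np.array([1, 1, 2, 2]), "age": np.array([1, 2, 1, 2])}
tgts = {"sex": {1: 120, 2: 130}, "age": {1: 110, 2: 140}}
print(rake(w0, cats, tgts).round(1))   # weighted margins now match the targets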


The variances of survey estimates are computed using statistical software designed for survey data analysis (e.g., SAS and SUDAAN). These procedures, such as CROSSTAB in SUDAAN, take into account the complex survey design and unequal weighting, and the Taylor Series Linearization option is used for estimating variances of proportions.
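As a simplified stand-in for those procedures, the sketch below computes the Taylor-linearized variance of a weighted proportion under a stratified, with-replacement approximation in which each respondent is treated as a PSU within its stratum, as is common for RDD telephone samples; the data are simulated and the code is illustrative only:

import numpy as np

def taylor_var_prop(y, w, stratum):
    """Taylor linearization for p = sum(w*y) / sum(w): variance is summed
    over strata from the centered, linearized score contributions."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    W = w.sum()
    p = (w * y).sum() / W
    z = w * (y - p) / W                     # linearized contributions
    var = 0.0
    for s in np.unique(stratum):
        zs = z[stratum == s]
        n_s = len(zs)
        if n_s > 1:                         # single-PSU strata add nothing here
            var += n_s / (n_s - 1) * ((zs - zs.mean()) ** 2).sum()
    return p, var

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 500)                 # simulated 0/1 outcome indicator
w = rng.uniform(0.5, 3.0, 500)              # simulated analysis weights
strata = np.repeat(np.arange(10), 50)       # 10 design strata of 50 each
p, v = taylor_var_prop(y, w, strata)
print(p, v ** 0.5)                          # estimate and its standard error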

B.3. Methods to Maximize Response Rates and Deal with Nonresponse


Historically, NISVS response rates between 2010 and 2015 ranged from 26% to 34%, and cooperation rates ranged from 80% to 90%.

Table 6. NISVS Response and Cooperation Rates by Survey Year

Survey Year    Response Rate    Cooperation Rate
2010           27% - 34%*       81%
2011           33%              83%
2012           33%              80%
2015           26%              90%

*The range reflects differences in how the proportion of the unknowns that are eligible is estimated.

NISVS uses an incentive plan structure that has been previously approved for several information collection requests (2010, 2011, 2012, 2015, and 2016-2017; OMB# 0920-0822). The data collection is divided into quarters (3.5-month periods). In each quarter, Phase 1 data collection is carried out for approximately 8 to 11 weeks. During Phase 1, all respondents are offered a $10 incentive to complete the survey. Upon completion of the first phase, a random subsample of nonrespondents who did not participate during the main data collection period is drawn (Phase 2). The subsampling rate for Phase 2 starts at 0.50 but can be adjusted as needed. The sole purpose of Phase 2 of the two-phase sample design is to measure and reduce nonresponse bias. Respondents in Phase 2 are re-contacted and offered a higher incentive of $40 to encourage their participation.

NISVS contractors have demonstrated that inclusion of Phase 2 results in relative increases in response rates ranging from 22% to 30%. For instance, in 2015 the Phase 1 response rate was 20.7%, but once Phase 2 was included, the overall response rate increased to 26.4%, approximately 22% higher than the response rate would have been had we not implemented Phase 2. Additionally, Phase 2 gives us a mechanism for examining whether increasing incentives changes the composition of the sample, with a potential reduction of nonresponse bias. We conducted within-sample comparisons of Phase 1 respondents and Phase 2 respondents, under the assumption that individuals who complete the survey in Phase 2, once the higher incentive is offered, are more similar to nonrespondents. A limitation of this assumption is that it is unknown what the final response rate of Phase 1 would have been if additional calls had been made to survey nonrespondents. However, the NISVS protocol indicates that potential respondents should be called only up to 15 times, so without Phase 2 there would have been no more attempts to reach Phase 1 nonrespondents. We believe that the increased incentive is likely what allows for the continued success in converting nonrespondents after the initial 15 calls; thus, Phase 2 is instrumental in increasing overall response rates.

Table 7 describes results from an analysis of 2016-2017 data, which revealed that respondent sex, age group, and race/ethnicity (all p<0.001) and education (p=0.002) were significantly associated with phase (Wald chi-square test). Phase 1 respondents were more likely than those recruited during Phase 2 to be White non-Hispanic, older (age 45+), and more highly educated, while Phase 2 (higher incentive) respondents were more likely to be Hispanic or Black non-Hispanic, younger (less than 45 years of age), and less educated (high school education or less). Given that racial minorities, younger individuals, and individuals with lower educational attainment are at increased risk for many of the outcomes measured by NISVS (Smith, Chen, Basile, Gilbert, Merrick, et al., 2017), it is crucial to implement Phase 2 to help ensure the representativeness of the sample.


Table 7. Weighted1 Percent Distribution and Association between Selected Demographic Characteristics2 by Phase, NISVS 2016-2017, Base Year

Trait [P-Value]3                 Phase 1    Phase 2

Sex [p <= 0.004]
   Male*                         45.3       41.3
   Female                        54.7       58.7

Age Group [p < 0.001]
   18-24                         10.2       9.2
   25-34*                        15.0       19.1
   35-44*                        13.1       15.7
   45-64*                        38.6       35.0
   65+                           23.2       21.1

Race/Ethnicity [p < 0.001]
   Hispanic*                     12.2       15.4
   White NH*                     67.5       60.6
   Black NH*                     12.8       15.3
   Other NH                      7.5        8.7

Education [p = 0.002]
   < High School*                28.7       33.4
   Tech/Some College4            29.9       27.9
   College Grad+                 41.4       38.7

1 Weights PRIOR to calibration to population demographic characteristics.

2 There was no association between marital status and phase (p-value = 0.235, Wald chi-square test).

3 P-Value, Wald chi-square test for the association between each demographic characteristic and phase.

4 Includes Associates Degree.

* Phase 2 significantly different from Phase 1 (p < 0.05) for that particular level of the demographic trait.



We also posit that Phase 2 respondents may be more similar to nonrespondents, though we do not have the data to test such a hypothesis. If funds are available in the future, we will submit a nonsubstantive change request to collect a small amount of data from nonrespondents in an effort to learn more about this group, so that we can better understand the potential benefit of Phase 2 data collection with respect to reducing nonresponse bias.

The incentive plan structure proposed in this request is exactly the same as the one used in previously approved information collection requests for 2010, 2011, 2012, 2015, and 2016-2017 (OMB# 0920-0822). Maintaining the two-phase survey design with the current incentive plan structure will allow for consistency across years of data collection, permitting tracking of changes over time.
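To illustrate how the Phase 2 subsampling enters the analysis weights, consider the minimal sketch below (illustrative only; the full NISVS weighting also includes coverage, nonresponse, and calibration adjustments). A Phase 1 nonrespondent subsampled for Phase 2 at rate 0.50 stands in for two such nonrespondents, so the base weight is divided by the subsampling rate:

def phase2_weight(base_weight, selected_for_phase2, subsampling_rate=0.50):
    """Two-phase weight adjustment: Phase 1 completes keep their base
    weight; Phase 2 completes carry the weight of the Phase 1
    nonrespondents they represent."""
    if selected_for_phase2:
        return base_weight / subsampling_rate
    return base_weight

print(phase2_weight(100.0, selected_for_phase2=False))   # Phase 1 complete: 100.0
print(phase2_weight(100.0, selected_for_phase2=True))    # Phase 2 complete: 200.0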

Response rates vary greatly across interviewers (e.g., O'Muircheartaigh & Campanelli, 1999). Improving interviewer training has been found effective in increasing response rates, particularly among interviewers with lower response rates (Groves & McGonagle, 2001). For this reason, extensive interviewer training is a key aspect of the success of this data collection effort. In addition to the training protocols implemented in previous NISVS data collection periods (e.g., review of challenges related to administering a survey on IPV, SV, and stalking; having conversion staff recontact nonrespondents; providing additional information about the survey to reluctant respondents; and paying special attention to callback and refusal procedures), one way to further improve response rates may be to provide training sessions throughout the data collection period in which more experienced interviewers who consistently complete a large number of interviews train less experienced interviewers to help them learn how to do the same.

Contract managers select a team of refusal avoidance specialists from among the interviewers who demonstrate special talents for obtaining cooperation and avoiding initial refusals, but all interviewers are provided refusal avoidance training as part of the initial project training and a more specialized refusal avoidance training prior to calling back refusals. To best prepare new interviewers, the contractor is now implementing a floor management structure to provide additional support and coaching to interviewers during their transition from training to live calling. Interviewers are assigned to sit in a specific area for their first three weeks of dialing on the project, with experienced, high-performing staff mixed in. This allows less experienced staff to witness first-hand how experienced staff convert nonrespondents to respondents, how they manage challenging calls, and the general tone and manner they use to achieve completed interviews. Additional supervisory staff work the assigned area to ensure the staff receive coaching and follow-up during dialing.

The monitoring staff conduct both live and recorded monitoring of staff new to the project and provide immediate feedback. As part of this feedback, interviewers are given the opportunity to listen to their recordings. Throughout data collection, interviewers are also given the opportunity to listen to higher-performing interviewers to determine what techniques they are using. Finally, all interviews are recorded; thus, more challenging interviews and conversions can be played for less experienced interviewers to learn from during ongoing training sessions throughout the data collection period.


As briefly described in the sampling plan, approximately 52.0% of adults in the U.S. have a cell phone and do not have a landline in the household (Blumberg & Luke, 2017). This substantial rate, coupled with its continuous increase, necessitates that a surveillance system such as NISVS incorporate the cell-phone-only population, which would be missing from a landline telephone frame. To address this growing under-coverage problem, a dual-frame approach has been implemented with RDD samples of landline and cell phone numbers. Gaining cooperation on cell phones can be at least as challenging as on landlines; the intensive methods to increase response rates and reduce nonresponse bias described in section B.3 have been implemented for both landline and cell phone samples.


Despite the dual-frame approach, additional bias may result from the differential likelihood of reaching respondents with both types of telephone service, depending on which service they are contacted through. If individuals with both types of service are selected only through the landline frame, and adults from the cell phone frame are screened for having only cell phones, a bias may result because the dual-service adults reached this way tend to be those who mostly use their landlines. To alleviate this potential problem and to increase the efficiency of data collection, adults with both types of service are interviewed from each frame. Those with both cell phones and landlines who predominantly use their cell phones (and are therefore unlikely to be interviewed on a landline) are more likely to be interviewed than if such procedures were not followed. The resulting increased complexity in identifying selection probabilities is addressed through weighting, using the individual- and household-level telephone service questions asked during the interview (Attachments E1-E3).
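Conceptually, the multiplicity adjustment can be sketched as follows (an illustrative simplification, not the exact NISVS weighting specification): an adult reachable through both frames has a combined selection probability of roughly the sum of the two frame probabilities, and the base weight is its reciprocal:

def dual_frame_base_weight(p_landline, p_cell):
    """Base weight for a dual-frame design treated as a single sample:
    assuming independent frame samples and negligible joint selection, an
    adult's overall selection probability is p_landline + p_cell (one of
    which is zero for single-service adults)."""
    p_total = p_landline + p_cell
    if p_total <= 0:
        raise ValueError("adult must be reachable through at least one frame")
    return 1.0 / p_total

# Hypothetical selection probabilities
print(dual_frame_base_weight(p_landline=0.0, p_cell=1e-4))     # cell-only: 10000.0
print(dual_frame_base_weight(p_landline=2e-4, p_cell=1e-4))    # dual user: ~3333.3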


Finally, in compliance with OMB's remaining terms of clearance for 2014 and 2016, CDC has collaborated with BJS to convene a workgroup to obtain expert feedback and input on how to enhance the NISVS survey. Workgroup participants provided guidance on how to improve the system's survey design (e.g., methods, sampling frame, recruitment, mode of administration) with the goals of increasing response rates, reducing nonresponse bias, and maximizing collaborative opportunities across Federal surveys for covering populations of interest. Four meetings with the workgroup, which included a representative from OMB and a representative from CDC's Board of Scientific Counselors, began in February of 2017 and were completed in July of 2017. Recommendations from the workgroup have been used to inform both the 2018-2019 efforts and issues to consider for the system moving forward after 2019. Based on the suggestions from the workgroup, consultations with federal and nonfederal partners, and negotiations with the contractor, CDC proposes to integrate the activities described below into the 2018-2019 data collection (further described in Attachment J.1). Additionally, more costly and substantive changes to data collection procedures recommended by the Methodology Workgroup (e.g., testing address-based sampling and multi-modal data collection efforts) will be implemented in a new design contract to be funded in the future.


1. COMPARISONS WITH OTHER SURVEYS

One method to evaluate nonresponse bias in NISVS is to use benchmark estimates from other national surveys that have substantially higher response rates and, therefore, reduced risk of nonresponse bias in the survey estimates. Weighted estimates are directly compared and differences interpreted as potential bias, considering differences in the designs of the surveys. By comparing how well NISVS data correspond with data from national surveys that achieved higher response rates, sometimes using other data collection methods (e.g., in-person data collection), we can better understand if and how nonresponse bias is influencing NISVS estimates. Following the NISVS methodology panel, CDC worked with partners to identify several survey questions comparable with NISVS questions that could be used as benchmarks. Three specific health conditions measured both in NISVS and in other large-scale national surveys were identified to serve as benchmarks: hypertension, asthma, and diabetes. The National Health and Nutrition Examination Survey (NHANES), the National Health Interview Survey (NHIS), and the household component of the Medical Expenditure Panel Survey (MEPS; a 50% sample from the NHIS in which data are collected on every member of the household) are in-person national surveys with higher coverage rates and response rates than RDD telephone surveys. Weighted estimates for the three health conditions were produced from NISVS and compared to those from the three in-person health surveys. Because NISVS asks behaviorally specific questions within a health context, the group found it difficult to identify a violence-related benchmark with which to compare NISVS; one such benchmark, forced vaginal intercourse among women aged 18-44 from the National Survey of Family Growth, was nevertheless identified for comparison. The same sort of comparison can be done as was done with the health conditions, with any differences observed interpreted as potential bias, considering differences in the designs of the surveys. This approach will be used to compare estimates in current and future NISVS data collections to external data collections.
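A simple way to operationalize such a benchmark comparison is a two-sample z statistic on the weighted estimates, as sketched below with hypothetical numbers; in practice, design-based standard errors and the surveys' design and measurement differences must be considered before attributing a difference to nonresponse bias:

import math

def benchmark_z(p_study, se_study, p_bench, se_bench):
    """z statistic comparing a study estimate with an external benchmark,
    treating the two surveys as independent; a large |z| flags a difference
    worth investigating as potential bias."""
    return (p_study - p_bench) / math.sqrt(se_study ** 2 + se_bench ** 2)

# Hypothetical: study prevalence 9.1% (SE 0.4 pts) vs. benchmark 10.2% (SE 0.2 pts)
print(round(benchmark_z(0.091, 0.004, 0.102, 0.002), 2))   # -2.46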


2. ADD ATLANTA LOCAL NUMBERS FOR OUTBOUND CALLS

The use of a local telephone number (i.e., a non-800 number) may increase survey participation; an 800 number may imply a corporate solicitation to some. A local phone number (i.e., a 404 or 770 number, local to both CDC Atlanta offices) will therefore be used to reach participants who are dissuaded from answering calls from an 800 number. Additionally, an online survey of over 2,300 randomly selected Internet users within the United States found that only 7% of people are likely to answer an unknown caller from a toll-free number, but more than twice as many (about 15%) would answer a call from a local number, even if out-of-state (Borowski, 2017).


3. INCREASE THE NUMBER OF TELEPHONE NUMBERS OF EACH TYPE (800 AND 404/770) FOR OUTBOUND CALLS

To date, an 800 number has been used for outbound calls to NISVS sample members (potentially appearing on sample members' Caller ID) during both Phase 1 and Phase 2. Recent years have seen the development and increased use of phone applications that block repeated calls from 800 numbers; thus, NISVS will no longer use an outbound 800 number. However, because of the potential use of call-blocking software, additional local (770/404) numbers for outbound calls are needed so that outbound phone numbers can be changed more frequently. This may reduce the problem of erroneous flagging and blocking of the study phone number as spam by cell phone carrier applications and potentially increase the number of people who answer the phone. Note that since landline users can also block numbers, additional numbers for that frame are also needed. We recognize that it will be difficult or impossible to evaluate the effectiveness of this specific method. However, a notice released by AAPOR in mid-2017, highlighting the impact of new software designed to block numbers flagged as spam on RDD survey response rates, underlines the need to proactively address the problem to avoid potential negative implications in the 2018 data collection period.


Care will need to be taken to determine when a number should or should not be changed. This is of particular concern as it relates to item 7 below (TEXT MESSAGE FOLLOW-UP): once the answerer gives permission to text, they may recognize the number initially dialed as legitimate and be more likely to pick up the next time it is dialed. This may also be important to consider for landline calls where a message is left on an answering machine or where someone in the household answers and does not have time to talk. It is important to avoid a situation where a potentially interested participant sees a different phone number and does not realize that the call is about the study.


4. CUSTOMIZED TEXT TO CALLER ID

Potential respondents might be reluctant to answer the phone if they think that a solicitor is calling. A brief description (instead of a phone number) on the caller ID might increase survey participation. The customized text, “Health and Injuries Study”, will be displayed on caller IDs for landlines. This text display does not apply to the cell phone frame since cell phone caller IDs only receive the phone number. This proposed change is not experimental in nature. We anticipate that potential respondents may be more likely to respond to our request if the caller ID text indicates that it is related to a health study and is unlikely to be related to a solicitation.


5. EXPAND PHASE 2 DATA COLLECTION

NISVS uses a two-phase design. In the first phase, sampled individuals are offered a $10 incentive. In Phase 2, a subset of non-respondents fully worked during Phase 1 are offered a $40 incentive and additional attempts are made to contact these individuals. Phase 2 may help reduce nonresponse bias by recruiting individuals who may have characteristics similar to non-respondents. To further improve response rates and reduce nonresponse bias, we will start Phase 2 earlier in the data collection period and increase the sample size for Phase 2. CDC is requesting that RTI increase the proportion of individuals followed up within Phase 2 over the prior reporting period by sampling an additional 10-15% of non-respondents fully worked during Phase 1. CDC will work with RTI to monitor the Phase 1 response rate, hours per complete, and the number of current nonrespondents to identify the point at which the sample has been most fully worked. This will allow CDC and RTI to determine the most opportune time for Phase 1 to end and Phase 2 to begin in order to reach an optimal number of nonrespondents and offer them the increased incentive to complete the survey. The rate at which Phase 1 nonrespondents are contacted to complete the survey in Phase 2 will be tracked, as will their response rate. The data obtained from the expanded Phase 2 in the 2018 data collection period will be compared to Phase 2 contact and response rates from previous data collection periods to examine whether expanding this nonresponse follow-up period is an effective method for improving response rates and reducing nonresponse bias.


6. REMINDER POSTCARD

An advance letter is currently sent to potential landline respondents introducing the study, the incentive, and contact information should they have questions (Attachment I). Additionally, the envelope and letterhead used for the advance letter were modified slightly so they now feature the CDC logo more prominently. This could help increase potential respondents' recognition of NISVS as a federal survey sponsored by CDC and may help improve the response rate. A reminder will be sent to non-respondents approximately 2 weeks after the advance letter. This reminder will re-introduce the study and provide information about the incentive, but in an altered format (a postcard, which does not have to be opened, rather than a letter).


A second postcard reminder highlighting the increased incentive will be sent before the start of Phase 2 to the subset of non-respondents selected for Phase 2. This information may increase response from selected sample households that received the initial advance letter but have not yet been reached or that initially declined participation.


7. TEXT MESSAGE FOLLOW-UP

For those in the cell phone frame who answer their phone but are not willing or able to participate at the time, interviewers will ask permission to send them some information by text message. If permission is granted, a text will be sent to the individual's cell phone (similar to the advance mailing for those in the landline frame) indicating that they have been selected for the survey, along with details about the incentive amount and a number they can call to participate.


Sampled individuals who indicate that they are younger than 18 years old will be classified as age-ineligible (screened out), and no further call attempts will be made to that number. This could reduce costs and participant burden and improve the response rate. The number of text messages sent and the phone numbers to which they are sent will be tracked, as will the rate at which respondents respond to the text message, so that we can assess the extent to which this new method results in increased contact, response, and cooperation rates.


B.4. Tests of Procedures or Methods to be Undertaken



To ensure that all skip patterns and data collection procedures are operating correctly, the first several months of data collection are closely monitored, and any necessary adjustments to the CATI instrument or survey protocols are made during the initial weeks of data collection.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Individuals who have participated in designing the data collection:

CDC staff:

Sharon Smith, Ph.D. 770-488-1368 [email protected]

Kathleen Basile, Ph.D. 770-488-4224 [email protected]

Jieru Chen, Ph.D. 770-488-1288 [email protected]

Marcie-jo Kresnow–Sedacca, M.S. 770-488-4753 [email protected]

Melissa Merrick, Ph.D. 770-488-7464 [email protected]

Thomas Simon, Ph.D. 770-488-1654 [email protected]

Kevin Webb 770-488-1559 [email protected]

Mikel Walters, Ph.D. 404-639-0913 [email protected]

Jeff Hall, Ph.D. 770-488-4648 [email protected]

RTI International Staff:

Lisa Carley-Baxter, M.A. 919-485-2616 [email protected]

Kim Aspinwall, M.A. 919-316-3348 [email protected]

Andy Peytchev, Ph.D. 919-541-6648 [email protected]

Lilia Filippenko, M.A. 919-541-6717 [email protected]

Jessica Williams, M.A. 919-541-6558 [email protected]

Amang Sukasih, Ph.D. 202-728-2469 [email protected]

Christopher Krebs, Ph.D. 919-485-5714 [email protected]


The following individuals from the contractor participate in the collection of data:

Lisa Carley-Baxter, M.A. 919-485-2616 [email protected]

Kim Aspinwall, M.A. 919-316-3348 [email protected]

Andy Peytchev, Ph.D. 919-541-6648 [email protected]

Lilia Filippenko, M.A. 919-541-6717 [email protected]

Jessica Williams, M.A. 919-541-6558 [email protected]

Amang Sukasih, Ph.D. 202-728-2469 [email protected]

The following individuals participate in data analysis:


CDC Staff:

Jieru Chen, Ph.D. 770-488-1288 [email protected]

Marcie-jo Kresnow–Sedacca, M.S. 770-488-4753 [email protected]

Xinjian Zhang, Ph.D. 770-488-1558 [email protected]

Robert Thomas, M.S. 770-488-1978 [email protected]

Ann Smalls, B.S. 770-488-1653 [email protected]


RTI International Staff:

Lisa Carley-Baxter, M.A. 919-485-2616 [email protected]

Kim Aspinwall, M.A. 919-316-3348 [email protected]

Andy Peytchev, Ph.D. 919-541-6648 [email protected]

Jennifer Iriondo-Perez, M.A. 919-541-6140 [email protected]

JD Bunker, B.S. 919-316-3747 [email protected]

Amang Sukasih, Ph.D. 202-728-2469 [email protected]

Kevin Wang, Ph.D. 770-407-4950 [email protected]








REFERENCES


American Association for Public Opinion Research (2016). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 9th edition. Lenexa, Kansas: AAPOR.

Black MC, Kresnow M, Simon TR, Arias I, & Shelley G. (2006). Telephone survey respondents' reactions to questions regarding interpersonal violence. Violence and Victims, 21, 445-459.

Blumberg, S. J., & Luke, J. V. (2017). Wireless Substitution: Early Release of Estimates Based on Data from the National Health Interview Survey. Retrieved July 11, 2017, from https://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201712.pdf


Borowski, C. (2017) Survey: Local Presence Dialing—Foot in the Door, or Door-Slammer? Retrieved January 17, 2018 from https://www.softwareadvice.com/resources/local-presence-dialing-survey/.


Brick JM, Cervantes IF, Lee S, & Norman G. (2011). Nonsampling errors in dual frame telephone surveys. Survey Methodology, 37(1), 1-12.


Czajka, J. L., & Beyler, A. (2016). Declining Response Rates in Federal Surveys: Trends and Implications (Background Paper). Mathematica Policy Research. Retrieved from https://aspe.hhs.gov/system/files/pdf/255531/Decliningresponserates.pdf


Dailey R & Claus RE. (2001). The relationship between interviewer characteristics and physical and sexual abuse disclosures among substance users: A multilevel analysis. Journal of Drug Issues, 31, 867-88.


de Leeuw E, Callegaro M, Hox J, Korendijk E, & Lensvelt-Mulders G. (2007). The influence of advance letters on response in telephone surveys. Public Opinion Quarterly, 71(3), 413-443.


Deming WE. (1953). On a Probability Mechanism to Attain an Economic Balance between the Resultant Error of Nonresponse and the Bias of Nonresponse. Journal of the American Statistical Association, 48, 743-772.


Dykema J, Diloreto K, Price JL, White E, & Schaeffer NC. (2012). ACASI gender-of-interviewer voice effects on reports to questions about sensitive behaviors among young adults. Public Opinion Quarterly, 76(2), 311-325.


Fowler Jr FJ & Mangione TW. (1990). Standardized Survey Interviewing. Newbury Park: Sage Publications.


Gondolf EW & Heckert DA. (2003). Determinants of women's perceptions of risk in battering relationships. Violence & Victims, 18, 371-386.


Groves RM, Couper MP, Presser S, Singer E, Tourangeau R, Acosta GP, & Nelson L. (2006). Experiments in Producing Nonresponse Bias. Public Opinion Quarterly, 70, 720-736.


Groves RM & McGonagle KA. (2001). A Theory-Guided Interviewer Training Protocol Regarding Survey Participation. Journal of Official Statistics, 17, 249-265.


Groves RM, Singer E & Corning A. (2000). Leverage-Saliency Theory of Survey Participation - Description and an Illustration. Public Opinion Quarterly, 64, 299-308.


Holbrook A, Krosnick JA, & Pfent A. (2007). The causes and consequences of response rates in surveys by the news media and government contractor survey research firms. In J.M. Lepkowski, C. Tucker, J.M. Brick, E. de Leeuw, L. Japec, P.J. Lavrakas, M.W. Link, & R.L. Sangster (Eds.), Advances in telephone survey methodology (pp. 499-528). Hoboken, NJ: John Wiley & Sons, Inc.


Johnson H. (1996). Dangerous Domains: Violence Against Women in Canada. Scarborough, ON: Nelson Canada.


O'Muircheartaigh C & Campanelli P. (1999). A Multilevel Exploration of the Role of Interviewers in Survey Non-Response. Journal of the Royal Statistical Society, 162, 437-446.

Pollner M. (1998). The effects of interviewer gender in mental health interviews. Journal of Nervous & Mental Disease, 186, 369-73.


Singer E & Ye C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645, 112-141.


Smith SG, Chen J, Basile KC, Gilbert L, Merrick MT, Patel N, Walling M, & Jain A. (2017). The National Intimate Partner and Sexual Violence Survey (NISVS): 2010-2012 state report. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.


Sullivan CM & Cain D. (2004). Ethical and safety considerations when obtaining information from or about battered women for research purposes. Journal of Interpersonal Violence, 19, 603-18.


Tjaden P & Thoennes N. (2000). Full Report on the Prevalence, Incidence, and Consequences of Violence Against Women (NCJ Publication No. 183781). Washington, DC: National Institute of Justice.




