

SUPPORTING STATEMENT: PART A





OMB# 0920-0822


The National Intimate Partner and Sexual Violence Survey (NISVS)




December 7, 2022







Point of Contact:

Sharon G. Smith, PhD

Behavioral Scientist


Contact Information:

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE MS S106-10

Atlanta, GA 30341-3724

phone: 770.488.1363

email: [email protected]





CONTENTS

Summary Table

A. Justification

A.1. Circumstances Making the Collection of Information Necessary

A.2. Purpose and Use of Information Collection

A.3. Use of Improved Information Technology and Burden Reduction

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses or Other Small Entities

A.6. Consequences of Collecting the Information Less Frequently

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.9. Explanation of Any Payment or Gift to Respondents

A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents

A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions

A.12. Estimates of Annualized Burden Hours and Costs

A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

A.14. Annualized Cost to the Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions


REFERENCES



Attachments

A. Authorizing Legislation: Public Health Service Act

B. Published 60-Day Federal Register Notice

C. Public Comments to the 60-Day Federal Register Notice

D. Response to the Public Comments to the 60-Day Federal Register Notice

E. Feasibility Testing Report

F. Non-Response Follow-up Report

G. Pilot Testing Report

H. Cognitive Testing Report

I. Description of Survey Revisions

J. NCIPC-NCHS Methodology Study Abstract

K. Web/Phone Screeners and Survey Instrument, English

L. Web/Phone Screeners and Survey Instrument, Spanish

M. Paper Screener, English

N. Paper Screener, Spanish

O. Consultation on the Initial Development of NISVS

P. NISVS Workgroup Participants

Q. NISVS Workgroup Summary

R. Privacy Impact Assessment (PIA)

S. Advance Letters, English

T. Advance Letters, Spanish

U. Research/Non-research Determination STARS


SUMMARY TABLE


Goals of the current revision request: Restart annual, nationally representative NISVS data collection using the redesigned methods recently tested under the currently approved package. NISVS collects information about individuals' experiences of sexual violence (SV), stalking, and intimate partner violence (IPV).


Intended use of the resulting data: NISVS is a surveillance system used to monitor the magnitude of SV, stalking, and IPV victimization among adults in the U.S. Data are used by the federal government, states, partner organizations, and stakeholders to inform prevention programs and policies related to SV, stalking, and IPV.


Methods to be used to collect data: NISVS data will be collected using address-based sampling with a push-to-web design, whereby respondents are encouraged to complete the survey on the internet. A call-in telephone option will be available for those who prefer to complete the survey by phone.


The subpopulation to be studied: Non-institutionalized, English- and Spanish-speaking women and men aged 18 years or older in the United States.

How data will be analyzed: Data are analyzed using statistical software that accounts for the complex survey design, producing weighted counts, percentages, and confidence intervals for SV, stalking, and IPV prevalence and impacts at the national and state levels.
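To illustrate the type of computation involved, the sketch below computes a weighted prevalence estimate with an approximate confidence interval based on the Kish effective sample size. This is a simplified illustration only; production NISVS analyses use survey software with full design information (e.g., strata and replicate weights), and all data and names shown are hypothetical.

    import numpy as np

    def weighted_prevalence(y, w, z=1.96):
        # Weighted prevalence with an approximate confidence interval
        # based on the Kish effective sample size -- a simplified
        # stand-in for design-based variance estimation.
        # y: 0/1 victimization indicator per respondent
        # w: final survey weight per respondent
        y, w = np.asarray(y, float), np.asarray(w, float)
        p = np.sum(w * y) / np.sum(w)              # weighted proportion
        n_eff = np.sum(w) ** 2 / np.sum(w ** 2)    # Kish effective n
        se = np.sqrt(p * (1 - p) / n_eff)          # approximate SE
        return p, (p - z * se, p + z * se)

    # Example with made-up data: five respondents, unequal weights
    p, ci = weighted_prevalence([1, 0, 0, 1, 0], [1200, 800, 950, 400, 1100])
    print(f"prevalence = {p:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")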




A. JUSTIFICATION


A.1. Circumstances Making the Collection of Information Necessary


Background


The Centers for Disease Control and Prevention (CDC) National Center for Injury Prevention and Control (NCIPC) requests approval, for a 3-year period, of this revision request for the currently approved "The National Intimate Partner and Sexual Violence Survey (NISVS)" – OMB# 0920-0822, expiration date 03/31/2023. Sexual violence (SV), intimate partner violence (IPV), and stalking are significant public health issues that impact the health and well-being of women and men across the United States. The most recent data from NISVS (2016/2017) indicate that nearly 1 in 3 women and 1 in 6 men were stalked in their lifetime (Smith et al., 2022), 1 in 2 women and about 1 in 3 men experienced some form of contact sexual violence in their lifetime (Basile et al., 2022), and about 1 in 3 women and 1 in 4 men experienced severe physical violence by an intimate partner during their lifetime (Leemis et al., 2022).


An extensive field of research has demonstrated that SV, IPV, and stalking can have serious long-term health consequences and significant social and public health costs (Basile et al., 2022; Leemis et al., 2022; Peterson, DeGue, Florence, & Lokey, 2017; Peterson et al., 2018; Smith et al., 2022).


For example, the literature indicates that SV, IPV, and stalking victims are more likely than non-victims to experience a range of negative physical health consequences, including HIV and other sexually transmitted infections, gastrointestinal and neurological disorders, and chronic pain (e.g., Basile et al., 2022; Bonomi et al., 2013; Jina & Thomas, 2013), as well as negative mental health outcomes, including depression, chronic mental illness, and post-traumatic stress disorder (e.g., Basile et al., 2022; Leemis et al., 2022; Jordan, Campbell, & Follingstad, 2010; Smith et al., 2022). Additionally, NISVS data indicate that violence victimization is associated with activity limitations, such as difficulty dressing or bathing, cognitive difficulties, and difficulty doing errands alone (Basile et al., 2022; Leemis et al., 2022; Smith et al., 2022). Finally, NISVS data have been used to increase understanding of victimization among sexual minorities (Chen, 2020). A more recent analysis shows that bisexual and lesbian women bear a large burden of the forms of violence captured in NISVS (Chen et al., in press).


The Centers for Disease Control and Prevention (CDC) leads federal efforts to reduce injury and violence at a population level. The Healthy People 2030 report (Healthy People, 2030) lists several objectives that pertain directly to SV, IPV, and stalking. Applicable objectives include IVP-D04, to "reduce intimate partner violence (i.e., contact sexual violence, physical violence, and stalking) across the lifespan," and IVP-D05, to "reduce contact sexual violence by anyone across the lifespan." Authority for CDC's National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A). The Public Health Service Act gives CDC broad authority to collect data and carry out other public health activities, including conducting a survey on SV, IPV, and stalking.



A.2. Purpose and Use of Information Collection


The purpose of The National Intimate Partner and Sexual Violence Survey (NISVS) is to collect nationally representative data on the lifetime and previous 12-month prevalence and burden of intimate partner violence (IPV), sexual violence (SV), and stalking at both the state and national levels for men and women in the United States. The objective is to understand and describe the characteristics of the violence, who is most likely to experience it, and the health conditions and impacts associated with it, for both lifetime and 12-month victimization. NISVS is the only source of these estimates, and the data are used in many ways: by state and federal partners to inform policy; by prevention partners, local and state health departments, coalitions, and universities; and in training programs. Data are also used in peer-reviewed journals, technical reports, factsheets, and other media, and datasets are made public for external researchers to use. Data have been used to understand and advance health equity by examining prevalence by race and ethnicity, sexual orientation, and disability status. In 2010, NISVS also collected data for the National Institute of Justice (NIJ) to examine IPV, SV, and stalking among American Indian and Alaska Native people. NISVS collected data in 2010 and 2016/17 for the Department of Defense to understand the prevalence of violence among active-duty women, active-duty men, and the wives of active-duty men.

Continuing to document and monitor the prevalence of IPV, SV, and stalking is a critical step to improving the health of individuals, making communities safer, and reducing the social and healthcare costs currently burdening state and federal governments and programs. NISVS data can be used to inform public policies and prevention strategies and help guide and evaluate progress towards reducing the substantial health and social burden associated with IPV, SV, and stalking.


NISVS has historically been administered through random-digit-dial (RDD) sampling. In response to declining response rates in NISVS and other RDD survey systems, CDC embarked upon a methodology study to explore the feasibility and cost of implementing alternative methods for collecting NISVS data. Following an independent peer review in 2017, CDC contracted with Westat from 2018 to 2021 to conduct feasibility testing and pilot testing aimed at developing methods for transitioning NISVS to an address-based sampling design that would reduce reliance on cell and landline phone participation and improve response rates.


This revision request is to fully implement the redesigned methodology and refreshed questionnaire for a production-level national sample. The redesigned NISVS will use an address-based sampling frame with push-to-web data collection and a call-in telephone option, intended to increase the response rate and reduce respondent burden.


Activities Leading to the Current Request


Cognitive Testing

Beginning in 2018, the NISVS program undertook a multi-phased study to redesign the data collection procedures, consistent with the advice of the independent external peer review conducted in 2017. In the first step of the methodological work, cognitive testing of the survey instrument was conducted (N = 120; two rounds of 60). Although the instrument was largely similar to the one used in 2016-2018 (OMB# 0920-0822, approved 7/25/2016), that instrument was designed for RDD sampling. In this cognitive testing, we aimed to ensure that the survey questions were understood across different modes of administration (i.e., phone, web, paper). Additionally, the instrument was shortened to reduce burden on respondents without altering the core content (SV, stalking, and IPV victimization). Select survey questions were revised based on results from previous cognitive testing (non-substantive change request, OMB# 0920-0822, approved 6/19/2019), and a revised instrument was used in the subsequent phases of the redesign study.


Feasibility Testing

In the second step of the methodological work, feasibility testing was conducted using two different designs for collecting NISVS data (revision request, OMB# 0920-0822, approved 3/20/2020). Feasibility testing sought to answer the following research questions: (1) What are the response rates by sample frame? How well does each sample frame represent key population groups? Are there differences in response and coverage of key population groups between the different experimental conditions? (2) Are there differences in key outcomes (e.g., violence victimization) by sample frame, mode of interview and experimental conditions? (3) Are there differences between frames with respect to the reaction to, perceived confidentiality of, and privacy of the interview? (4) What are differences in costs associated with each frame/mode? (5) What are the recommended designs for the national survey?


Two alternative designs were tested for NISVS. The first design used a random-digit-dial (RDD) sampling frame with computer-assisted telephone interviewing (CATI) as the mode of interviewing (N = 1,461). The second design used an address-based sample (ABS) that pushed respondents to the web (N = 3,521) and followed up with multimode alternatives: (a) respondent's choice between the web survey and a call-in telephone interview; (b) respondent's choice between the web survey and an abbreviated paper survey. Details about the study procedures can be found in the feasibility testing report (Attachment E).


Feasibility testing results indicated an improved response rate for ABS (33.1%) compared to RDD sampling (10.8%). Furthermore, ABS substantially reduced costs. For the RDD survey, costs included interviewer labor hours for training and conducting the interviews, labor hours for supervision, and other direct costs related to these two activities (e.g., telephone charges, mailings for pre-notification letters, incentives). The ABS costs included the labor for the different mailings, costs associated with the mailings (e.g., printing, postage), and the respondent incentives.


Additionally, one ABS experiment compared the Rizzo-Brick-Park probability method to the Youngest-Male-Oldest-Female approach. A second ABS experiment offered an alternative mode to nonrespondents to complete the survey (i.e., paper vs. call-in telephone). Detailed information about the results is described in Attachment E.


In previously approved official NISVS data collections (OMB# 0920-0822), incentives were given upon completing the survey (i.e., $10 for the primary data collection—Phase 1, and $40 for the non-response follow-up—Phase 2). During redesign testing, we adapted the incentive procedure for ABS. In the feasibility testing (OMB# 0920-0822, approved 3/20/2020), incentives were offered at each stage. A $5 cash pre-incentive was included in the first mailing. Respondents were offered a $10 Amazon gift code (promised incentive) to complete the screener on the web, and a $5 cash pre-incentive accompanied the paper screener. For the full survey, respondents were promised a $15 Amazon gift code (promised incentive) for completing the survey on either the web or the phone.

Based on the results of feasibility testing, the following methodological procedures were recommended for future NISVS studies (see details in Attachment E):


  1. Use address-based sampling with push-to-web methodology.

  2. Use additional mailings.

  3. Use a web/CATI option group, whereby respondents can choose to complete the survey by web or call-in CATI.

  4. Use a probability method for within-household respondent selection; the Rizzo-Brick-Park (RBP) method is recommended (an illustrative sketch follows this list).

  5. Include items on the questionnaire that allow assessment of representation and bias, such as items from the American Community Survey and National Health Interview Survey.

  6. Include items to measure the attention of the respondent on the web survey, to ensure that respondents are reading the questions carefully and not skipping critical instructions.
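As context for recommendation 4, the following is a minimal sketch of the RBP selection logic as described in the survey-methods literature (Rizzo, Brick, and Park's minimally intrusive selection method). It is an illustration under simplified assumptions, not the contractor's implementation, and all names are hypothetical.

    import random

    def rbp_select(informant, other_adults):
        # Rizzo-Brick-Park selection: with N eligible adults in the
        # household, select the screener informant with probability 1/N.
        # Otherwise, select one of the remaining adults at random, so
        # each adult's overall selection probability is
        # (1 - 1/N) * 1/(N - 1) = 1/N. With N = 2 no full household
        # roster is needed.
        n = 1 + len(other_adults)
        if n == 1 or random.random() < 1.0 / n:
            return informant
        return random.choice(other_adults)

    # Hypothetical household: the screener informant plus two other adults
    print(rbp_select("adult_A", ["adult_B", "adult_C"]))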


Pilot Testing

The third step of the methodological work consisted of pilot testing the data collection method (change request, OMB# 0920-0822, approved 9/2/2021) among a small sample (N = 285) using the recommended approach. The purpose of the pilot test was to test the implementation of the recommended design for NISVS; the pilot used an ABS sampling frame that pushed respondents to the web to complete the screener and full survey. Households were mailed a request to fill out a screening survey on the web in order to select a respondent. If there was no response to this request, the household was sent a package in the mail that included a paper screener. The person who completed the web screener could go directly to the full NISVS survey on the web (or call-in telephone) if they were selected as the respondent. If the person was not selected, they were asked to pass the information to the selected individual. If the screener was completed on paper, a separate request to complete the extended survey was mailed to the selected respondent.


Consistent with ABS procedures used in feasibility testing, incentives were offered at each stage during pilot testing. A $5 cash pre-incentive was included in the very first mailing. Respondents were offered a $10 Amazon gift code to complete the screener on the web and a $5 cash pre-incentive for the paper screener. For the full survey, respondents were promised a $15 Amazon gift code for completing the survey on either the web or phone.


Testing of the ABS methodology yielded a final response rate of 21.8% in the pilot testing step, compared to 33.1% in the feasibility testing step. The difference in response rates between the feasibility and pilot testing periods may have been related to the COVID-19 environment. The feasibility testing was conducted in May-October 2020, when pandemic-related shutdowns and quarantines were common; respondents may have been spending more time at home, potentially increasing the time they had available to complete the survey. CDC also received much publicity during that time, which may have prompted greater attention to the survey request. In contrast, pilot testing occurred during September-December 2021, when the pandemic shutdowns had lifted to a greater extent and people may have returned to normal routines, potentially resulting in decreased availability to complete the survey.


Furthermore, differences in prevalence estimates were observed between the feasibility and pilot tests. However, the pilot test used a much smaller sample and did not include a non-response follow-up phase. These factors along with the response rate may have influenced the differences in prevalence.


More details are described in the feasibility and pilot study reports, Attachments E and G.


Supplemental Studies


Non-response Follow-up Study

NCIPC collaborated with the NISVS redesign study team to evaluate additional aspects of the nonresponse conversion phase (Attachment F, the non-response follow-up study report). This study used data collected from the feasibility study (revision request, OMB# 0920-0822, approved 3/20/2020) and included an analysis of representation of the study population (e.g., demographic and general health indicators), effects on prevalence estimates, and cost.


In this non-response follow-up (NRFU) study (Attachment F), initial non-respondents were resampled, and a new data collection protocol was implemented. The NRFU protocol entailed a higher monetary incentive ($40) relative to the amount used in the initial data collection phase ($15). The increased incentive contributed to enhancing the overall sample's representativeness and reducing non-response bias in several respects:


  1. The NRFU phase increased the overall response rate. While response rate is not the only measure that can be used to assess survey data quality, it is one of the most visible indicators of data quality to those reviewing and using the survey. The NRFU study shows that using the higher incentive to reach initial non-respondents raised the response rate by 5 absolute percentage points.



  2. The NRFU phase resulted in a sample more closely aligned with the population the survey is intended to represent (i.e., consistent with the population represented by the American Community Survey [ACS]). Results show that the NRFU respondents differ markedly from the respondents recruited in the initial data collection phase with respect to several important characteristics, including age, marital status, race-ethnicity, and education. Specifically, the NRFU group has a higher proportion of respondents from groups that are under-represented in the sample collected through the initial data collection phase. For example, those aged 18-29, those not married, Hispanic persons, and those with lower education are all under-represented when comparing the total sample to the ACS benchmarks. In all these cases, therefore, the NRFU brought the sample closer to the general population represented in the ACS.


  3. The NRFU phase improved the overall representation of the combined sample with respect to selected health indicators. None of the differences between the initial and NRFU groups were statistically significant. For two indicators – being hospitalized and having asthma – the NRFU aligned the study population more closely with the national benchmark. For two other measures – physical/mental/emotional problems and depression – the NRFU diverged to a greater extent from the national benchmarks. Despite the latter result, the NRFU does, overall, seem to improve the representation of the combined sample on these selected indicators.


  4. The NRFU phase increased the design effect by only a small amount. An examination of the design effects from the main sample and from the sample combining both the main data collection and the NRFU phases shows that the NRFU increased the design effect by only a very small amount (most increases are less than 1.05, with only a few as high as 1.10). For example, the design effect for males for lifetime prevalence of contact sexual violence was 1.75 for the initial sample and 1.96 when including the NRFU group. One reason the NRFU did not have larger effects on the overall sample design effect is that it brought the sample closer to the national population for many of the characteristics used for the weighting.
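For reference, the design effect is the ratio of the variance of an estimate under the complex design to the variance under a simple random sample of the same size, and the component attributable to unequal weighting is commonly approximated by Kish's factor (a standard survey-methods formula, not taken from the study report):

    \[
    \mathrm{deff} = \frac{\operatorname{Var}_{\text{design}}(\hat{p})}{\operatorname{Var}_{\text{SRS}}(\hat{p})},
    \qquad
    \mathrm{deff}_{w} \approx 1 + \mathrm{cv}^{2}(w) = \frac{n \sum_{i=1}^{n} w_{i}^{2}}{\left(\sum_{i=1}^{n} w_{i}\right)^{2}}
    \]

On this scale, the reported change from 1.75 to 1.96 corresponds to an increase in effective variance of roughly 12 percent.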


In summary, the effect of the Phase 2 (feasibility non-response follow-up) respondents on the final prevalence estimates is limited because the Phase 2 sample made up only a small percentage of the overall sample, but findings suggest that Phase 2 contributed to bringing in respondents who were under-represented (with respect to age, marital status, race-ethnicity, and education) in Phase 1 (the primary feasibility data collection phase; revision request, OMB# 0920-0822, approved 3/20/2020).



Additional Cognitive Testing

More recently, the program conducted additional cognitive testing. The purpose was to improve and update the survey, address remaining issues with select survey questions (based on earlier cognitive testing through the redesign study), and test a few new questions (e.g., technology-facilitated sexual violence). The NISVS program worked with the Collaborating Center for Questionnaire Design and Evaluation Research in the National Center for Health Statistics to conduct cognitive testing, which was completed in June 2022. Results from the first round of testing revealed that some participants were confused about which perpetrator and incident to reference in their responses to follow-up questions (see report in Attachment H). We revised the questions to provide more instruction on which person and incident participants should reference in their follow-up responses. The second round of testing showed improvement in participants' understanding of the questions and greater consistency in their responses. These results were used to inform the final survey instrument. See Attachment I for a description of the survey changes.


Test of Survey Design

Additionally, to collect better-quality data on violence victimization and injury, the National Center for Injury Prevention and Control collaborated with the National Center for Health Statistics to conduct a methodology study in 2022. The primary methodological goal of this study was to examine potential survey design approaches for the future NISVS. Specifically, we studied the relative performance of an interleafed rostering approach and a simplified grouped approach with limited follow-up questions. In an interleafed approach, individual screener items are directly followed by follow-up questions, whereas in a grouped approach, groups of screener items are administered together, followed by a set of follow-up questions. In addition, we designed the study to assess survey section order effects. Survey participants were a stratified sample of adults (18+) from the AmeriSpeak panel, a probability-based, recruited panel developed and maintained by the National Opinion Research Center at the University of Chicago. The study used a randomized design to assign participants to three experimental conditions: the first condition used an interleafed format, the second used a simplified grouped format, and the third used a simplified grouped format with an alternative survey section order. Data collection modes included both web (self-report) and telephone interviews (computer-assisted and interviewer-administered). After data collection, sample data from each experimental condition were weighted to the national population and calibrated using the 2020 NHIS sample adult file. Preliminary analyses show evidence of a survey instrument structure effect, confirming what survey research has learned in mostly non-violence-related surveys about the difference between an interleafed format and a simplified grouped format. Preliminary analyses also show that data quality does not appear to be linearly associated with an increase in response time. See Attachment J for the study abstract.
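To illustrate what the calibration step involves, the sketch below implements basic raking (iterative proportional fitting), which rescales survey weights until weighted totals match known population totals for each calibration variable. This is a simplified illustration under assumed margins; the study's actual weighting and calibration were performed with production survey software, and all data and names shown are hypothetical.

    import numpy as np

    def rake(weights, sample_codes, population_totals, max_iter=100, tol=1e-8):
        # Iterative proportional fitting: for each calibration variable
        # in turn, rescale the weights of each category so the weighted
        # total matches the population total; repeat until the
        # adjustments become negligible.
        w = np.asarray(weights, float).copy()
        for _ in range(max_iter):
            biggest = 0.0
            for var, targets in population_totals.items():
                codes = np.asarray(sample_codes[var])
                for level, total in targets.items():
                    mask = codes == level
                    current = w[mask].sum()
                    if current > 0:
                        ratio = total / current
                        w[mask] *= ratio
                        biggest = max(biggest, abs(ratio - 1.0))
            if biggest < tol:
                break
        return w

    # Hypothetical example: calibrate six respondents to sex and age totals
    codes = {"sex": ["F", "F", "M", "M", "F", "M"],
             "age": ["18-44", "45+", "18-44", "45+", "45+", "18-44"]}
    totals = {"sex": {"F": 520, "M": 480}, "age": {"18-44": 450, "45+": 550}}
    print(rake(np.ones(6), codes, totals).round(1))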



Conclusions from the Redesign and Supplemental Studies


During feasibility testing, we observed differences in the prevalence estimates between ABS-web and RDD-phone interviews using the same survey instrument. This was not surprising given previous research showing that changes in survey mode can impact prevalence estimates (Hoebel, von der Lippe, Lange, & Ziese, 2014; Messeri et al., 2019; Zuwallack et al., 2022). Therefore, it is possible that we may see changes in prevalence estimates due to the change in mode in the upcoming data collection. The question remains whether victimization estimates are truly increasing or whether recent changes in estimates are methodological in nature. We will continue to investigate this issue. For example, we will conduct an analysis of nonresponse bias that includes (1) benchmarking health questions embedded within the survey against external data sources (e.g., BRFSS, NHIS), and (2) within-sample comparisons; specifically, we will examine nonresponse bias by comparing selected weighted estimates from the nonrespondent sample to weighted estimates from the main data collection phase. The nonresponse analysis may also be used to improve our understanding of health equity issues and set a foundation for future investigation in this area. Furthermore, the program may elect to conduct experimental and/or cognitive testing for program improvement purposes, as needed. Examples of such testing may include testing of paper screeners, the design and aesthetics of mailing materials and web screens, and survey questions. Additionally, for future data collection years, the program may seek approval to take additional measures to address non-response, as needed. Options may include an abbreviated non-response follow-up survey or offering a higher incentive amount to complete the main survey.
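As a simplified illustration of the planned within-sample comparison (hypothetical data and names; the actual analysis will use design-based variance estimation rather than the Kish approximation shown here):

    import numpy as np

    def weighted_prop(y, w):
        # Weighted proportion and an approximate standard error based
        # on the Kish effective sample size.
        y, w = np.asarray(y, float), np.asarray(w, float)
        p = np.sum(w * y) / np.sum(w)
        n_eff = np.sum(w) ** 2 / np.sum(w ** 2)
        return p, np.sqrt(p * (1 - p) / n_eff)

    def phase_difference(y_main, w_main, y_nrfu, w_nrfu):
        # Difference between main-phase and nonrespondent-sample
        # weighted estimates for one embedded item, with an
        # approximate z statistic.
        p1, se1 = weighted_prop(y_main, w_main)
        p2, se2 = weighted_prop(y_nrfu, w_nrfu)
        diff = p1 - p2
        return diff, diff / np.hypot(se1, se2)

    # Hypothetical data for one embedded health item
    d, z = phase_difference([1, 0, 1, 0], [800, 950, 700, 1200],
                            [0, 0, 1], [1500, 900, 1100])
    print(f"difference = {d:+.3f}, z = {z:.2f}")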


Regarding results from the test of survey design, we have considered the trade-offs of using an interleafed format versus a grouped format, along with the results from the recent round of cognitive testing. For NISVS-5 we plan to use a hybrid survey format that does not inquire about perpetrator initials (as the 2010-2012 and 2015 instruments did), but instead includes prompts referring to a specific perpetrator category (e.g., spouse, supervisor, first date) when asking follow-up questions. By anchoring respondents' recall of experiences to specific perpetrator types, the hybrid format seeks to lessen respondent burden and facilitate respondents' memory of events and their details.


The results of these studies inform the current data collection approach described in this requested revision. Based on results from the feasibility study and pilot testing, it was determined that the ABS method with push-to-web would be the most cost-efficient, realize the best possible response rates, and be least burdensome to respondents. This method will allow respondents the option to complete the survey by telephone if they wish. This approach will also provide respondents with incentives to complete both the screener and the questionnaire.


Finally, for the upcoming data collection, our NISVS reports, presentations, and other communications will include information about the methodology and its implications, and will discourage comparisons to previous data years.



Terms of Clearance


Previous terms continue (based on the revision request, OMB# 0920-0822, approved 3/20/2020, and change request, OMB# 0920-0822, approved 9/2/2021). Previous terms required that the study results include a discussion of how the COVID-19 response environment may have influenced the generalizability of the findings. Because data collection for this study will occur on the web and by phone, COVID-19 is unlikely to present challenges to data collection. If applicable, study results will include a discussion of how the COVID-19 response environment may have influenced the generalizability of the findings.


A.3. Use of Improved Information Technology and Burden Reduction

Traditionally, NISVS has used random digit dialing (RDD) as a sampling frame, with computer-assisted telephone interviewing (CATI) software used to contact survey respondents. However, consistent with other national surveys that have used RDD, response rates have been progressively declining. To improve the response rate, reduce respondent burden, and explore improved technology, the NISVS program contracted with Westat to conduct testing to redesign the NISVS data collection methodology. The feasibility study compared RDD to address-based sampling (ABS), with the ABS design offering options to complete a web survey, a call-in CATI survey, or an abbreviated paper survey. The study results showed that ABS had a higher response rate (33.1%) compared to RDD (10.8%). In addition, the ABS approach is less burdensome in terms of survey completion time. For respondents with no victimization, the median time to complete the survey was 12 minutes for ABS/web vs. 30 minutes for RDD. For those with 3 or more different types of victimization, the completion time was 25 minutes for ABS/web vs. 50 minutes for RDD. See the feasibility testing report (Attachment E) for additional information.


In the current request, the ABS/web-based screener and instrument (Attachments K-L) utilize improved information technology that will be implemented for NISVS. Paper screeners will be available for those who prefer that option over using a web screener. After completing the screener and selection process, selected adults will be asked to complete the web survey using a personal computer, laptop, tablet, or smartphone. The web instrument includes skip patterns, rotations, range checks, and other online consistency checks and procedures during the survey, ensuring that only relevant and applicable questions are asked of each respondent. The survey will be available in both English and Spanish versions (Attachments K-L) and can be completed on the web or by phone with a live interviewer.


Selected respondents who cannot (or prefer not to) complete the web screener and survey will have the option to complete the screener and survey by phone, or to complete a paper screener. The phone survey will use the same instrument as the web survey (Attachments K-L). Data collection and data entry into the system will occur simultaneously with the phone interview. The contractor will maintain consistency in the contents and formats of both web and phone survey data; therefore, combining the datasets will be seamless and will yield high-quality data for assessing SV, IPV, and stalking victimization. Finally, data can be extracted and analyzed using existing statistical packages directly from the system, which significantly decreases the amount of time required to process, analyze, and report the data.


A.4. Efforts to Identify Duplication and Use of Similar Information


Prior to NISVS, the most recent national health survey on SV, IPV, and stalking (National Violence Against Women Survey, NVAWS), jointly sponsored by NIJ and CDC (conducted by Schulman, Ronca, Bucuvalas, Inc. (SRBI)), was a one-time data collection completed in 1995-1996 (Tjaden and Thoennes, 1998). Prior to NVAWS, there had been no similar national health survey with a specific focus on SV, IPV, and stalking.


When NISVS was originally designed, CDC consulted with other federal agencies (e.g., the National Institute of Justice (NIJ) and the Department of Defense (DoD)) and other leading experts and stakeholders in the fields of IPV, SV, and stalking (see Attachment O). NCIPC convened a workshop, "Building Data Systems for Monitoring and Responding to Violence Against Women" (CDC, 2000). Recommendations provided by those in attendance are reflected in the design of NISVS. As discussed in the Data Systems workshop, surveys that ask behaviorally specific questions and that are couched in a public health context have much higher levels of disclosure than those couched within a crime context (as in the National Crime Victimization Survey (NCVS) conducted by the Bureau of Justice Statistics (BJS)). The types of victimization measured in NISVS (i.e., stalking, SV, IPV) are also among the types of outcomes that are unlikely to be disclosed in crime surveys; thus, NISVS fills an important gap.

Although NISVS and NCVS collect some similar information, they are complementary in nature (see Table 1). Key characteristics of both systems are listed below; additional information can be found in Basile, Langton, and Gilbert (2018). In our ongoing assessment of NISVS, CDC worked with BJS to discuss the complementary nature of NISVS and NCVS. This included demonstrating the ways that these systems provide unique data on victimization and its impacts, exploring options for collaboration, and continuing enhancement of both systems. CDC and BJS participated in regular meetings to discuss lessons learned and implications for continued improvement of the systems, including survey administration and data collection methodology. CDC and BJS also collaborated to develop a summary document that explains the unique and complementary nature of these and other systems for measuring sexual violence. The summary may help users of the data to better understand the survey options that are available and to make an informed decision about which data source to use to address specific questions. The document is available on the CDC website (Basile, Langton & Gilbert, 2018).


Additionally, CDC has met with scientists working on the NCVS redesign to stay informed about changes to the NCVS instrument and to identify questions that may be appropriate to use as benchmarking items within the NISVS survey. Examination of the methods, victimization framing, and survey content confirms that the NCVS and NISVS serve complementary purposes (see Table 1 below). To our knowledge, the redesigned NCVS survey is planned for fielding in 2025 (Truman & Brotsos, 2022).


Table 1. Feature Comparison of NISVS and NCVS Surveys

Context
  NISVS (new design): Public health
  NCVS: Crime-based

Eligibility
  NISVS (new design): Non-institutionalized adults aged 18 and older
  NCVS: All members age 12 or older of U.S. households and non-institutional group living facilities

Sampling
  NISVS (new design): Respondents are selected using address-based sampling
  NCVS: Respondents are selected using a stratified, multi-stage cluster sample

Interview modes
  NISVS (new design): Conducted by web, with a telephone call-in option
  NCVS: Conducted in person and by telephone

Question approach
  NISVS (new design): Employs behaviorally specific language, as recommended by the National Research Council (National Research Council, 2014)
  NCVS: Employs criminal justice terminology, with some items using behaviorally specific language

Violence content
  NISVS (new design): Focused on sexual violence, intimate partner violence, and stalking
  NCVS: Focused on nonfatal violent and property crime

Characteristics of violence
  NISVS (new design): Data provide information on the characteristics of victims and perpetrators
  NCVS: Data provide information on the characteristics of victims and perpetrators

Time frame of victimization
  NISVS (new design): Lifetime and the 12 months preceding the survey
  NCVS: Past 6 months

Type of data
  NISVS (new design): Data provide lifetime and 12-month prevalence estimates that can be used to generate national and state-specific estimates
  NCVS: Data provide counts and rates of victims, incidents, and victimizations

Associations with health
  NISVS (new design): Data are used to describe associations between victimization and health conditions
  NCVS: General health questions are not included

First-time victimization
  NISVS (new design): Data provide the age at first victimization, which can be used to understand and guide prevention efforts among children and adolescents
  NCVS: First-time victimization is not assessed

Trend data
  NISVS (new design): Currently unable to provide trends, but trends will be possible after three continuous data collection cycles
  NCVS: Data can be used to measure trends over time

State-level estimates
  NISVS (new design): Available
  NCVS: Not available


Although the Behavioral Risk Factor Surveillance System (BRFSS) included optional IPV and SV modules in 2005, 2006, and most recently in 2007, fewer than half of the states administered a module in any one year (2005: IPV, 12 states; SV, 20 states; 2006: IPV, 8 states; SV, 12 states; 2007: IPV, 3 states; SV, 6 states). Furthermore, the information collected in the optional modules was limited to a small number of relatively simple questions (IPV: 7 items; SV: 8 items) covering only physical and sexual violence. Because financial support from CDC's Division of Violence Prevention no longer exists for the optional modules, few (if any) states continue to collect IPV or SV data.


Finally, the National Survey of Family Growth also includes limited questions about SV (i.e., forced intercourse). Those questions were used as benchmarks in the recent feasibility and pilot testing of the NISVS methodology study. However, the NISVS survey is more focused on violence victimization and assesses a broad range of sexual victimization, as well as intimate partner violence and stalking. Thus, NISVS fills a major gap in the field by providing a comprehensive picture of SV, IPV, and stalking victimization that other current survey systems do not capture.


A.5. Impact on Small Businesses or Other Small Entities

No small businesses will be involved in this data collection.


A.6. Consequences of Collecting the Information Less Frequently


The primary consequence of collecting these data less frequently is that stakeholders would have less timely access to national and state-level prevalence estimates and other data on SV, IPV, and stalking victimization. NISVS is the only data system that can provide these estimates, and their availability is valued and needed in the fields of research and practice. NISVS data are used by state and federal partners to inform policy; by prevention partners, local and state health departments, coalitions, and universities; and in training programs. A lack of state-specific prevalence data would limit the ability of national and state public health officials to understand the prevalence of IPV, SV, and stalking in individual states and to inform ongoing state-level prevention efforts. To generate reportable state-level estimates, data from across data collection years may need to be combined, emphasizing the need for frequent and regular data collection. More detailed and frequent information will inform public policies, and intervention and prevention strategies, at both national and state levels.


A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)


This request fully complies with the guidelines of 5 CFR 1320.5(d)(2).


A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


A.8.a) Federal Register Notice


A 60-day Federal Register Notice was published in the Federal Register on March 7, 2022, Vol. 87, No. 44, p. 12705 (Attachment B).


A.8.b) Efforts to Consult Outside the Agency


In the past, CDC participated in discussions involving federal researchers involved in the study of violence against women (documentation included in Attachment O). NCIPC convened a workshop “Building Data Systems for Monitoring and Responding to Violence Against Women” (CDC, 2000). Recommendations provided by those in attendance are reflected in the design of NISVS.


When NISVS was originally designed in 2007, NCIPC consulted with other federal agencies (e.g., NIJ, DoD) and other leading experts and stakeholders in the fields of SV, IPV, and stalking. Additionally, NCIPC invited a panel of experts to attend a meeting in November 2007 to discuss preliminary findings from the 2007 methodological study (referred to as the NISVS Pilot, although it was not a pilot test of the NISVS survey itself) and to discuss the planned directions for NISVS. The review panel consisted of federal and non-federal subject matter experts with expertise in SV, IPV, and stalking.


In 2008, staff within DOJ and DoD served as technical reviewers for the proposals submitted in response to CDC's Funding Opportunity Announcement for NISVS. As part of the review team, they participated in the selection of the contractor to do the work and approved the proposed statement of work. DOJ and DoD were also integrally involved in the design of the interview instrument, as described below. As described in Section A.4, CDC worked closely with DoD, NIJ, and other federal agencies in the development of NISVS. Numerous presentations were made in 2008, 2009, and 2010 to vet the proposed NISVS among a range of interested stakeholders, including victim advocates, family advocacy programs, the Title IX Task Force authorized under the 2005 VAWA, and several other conferences and public meetings. Further, CDC staff remain engaged in ongoing discussions with federal colleagues from DoD related to the collection of special population data from military personnel. In 2015 and 2016, staff within DoD collaborated with CDC in the development, review, and approval of the proposed statement of work for the 2016-2017 data collection contract. Data collection for DoD was conducted from February 2017 through August 2017; collaboration between CDC and DoD was initiated to facilitate collection of military subpopulation data during 2017.

NCIPC recruited a panel of experts to attend a meeting in February 2017 to begin discussions regarding the NISVS study design and to discuss the planned directions for current and future NISVS surveys. The review panel consisted of federal and non-federal subject matter experts with expertise in survey methodology, statistics, IPV and SV research, survey question design, and respondent safety concerns. Attachment P provides a list of the individuals who participated in the meeting and provided suggestions regarding survey design and administration during three webinars and one 2-day in-person meeting between February and July 2017. A summary of the suggestions is presented in Attachment Q. Following the 2016/2017 data collection, we undertook a thorough process to explore variations observed in multiple prevalence estimates relative to prior years. Analyses were conducted in close consultation with RTI, which assisted in examining potential reasons for the observed changes in prevalence estimates.


For the 2016-2018 survey, CDC staff actively engaged NCIPC’s Rape Prevention and Education (RPE) and Domestic Violence Prevention Enhancements and Leadership Through Alliances (DELTA) Impact program grantees and other stakeholders to obtain feedback regarding processes implemented to enhance the ability of NISVS to provide timely data that are more easily accessed and used by those groups that have the greatest potential to take actions that can prevent SV, IPV, and stalking, particularly grantees and state-level prevention partners.


In compliance with OMB guidance, NISVS staff have been engaged in the OMB Sexual Orientation and Gender Identity Working Group to ensure that NISVS is using appropriate measures to identify sexual minority populations.


Moreover, in response to recommendations of the workgroup to maximize collaborative opportunities across federal surveys, NCIPC has engaged a number of federal partners to learn about ongoing experiments being conducted in federal surveys to improve response rates, to assess the feasibility of partnering to conduct mutually beneficial experiments, and to learn from methods being implemented by other federal surveys. Since July 2017, CDC/NCIPC has consulted with or referred to publications and work from other federal and non-federal partners (including BJS; CDC–BRFSS; CDC–National Survey of Family Growth (NSFG); CDC–National Health Interview Survey (NHIS); the National Highway Traffic Safety Administration (NHTSA); the National Science Foundation (NSF); the Census Bureau; the National Center for Health Statistics (NCHS); the American Association for Public Opinion Research (AAPOR); the Office of Juvenile Justice and Delinquency Prevention's redesign of the National Survey of Children's Exposure to Violence; and the Research Triangle Institute (RTI)) to learn more about studies that are currently in the field or pending and that could have implications for NISVS. For instance, CDC has engaged BRFSS staff to gain a better understanding of BRFSS RDD calling methods (e.g., how many follow-up calls BRFSS conducts before considering a phone number "fully worked," and treating cell phones as personal devices, thereby immediately excluding minors under the age of 18 who answer a cell phone), methods for calculating response rates (e.g., determining whether other federal survey statisticians are using survival methods to calculate response rates), and to discuss experiments involving address-based sampling methods and efforts to push potential survey respondents to a web-based survey, to return a phone call, or to reply by mail.


Additionally, NCIPC engaged with a number of federal agencies that were conducting research on methods to enhance participation and reduce nonresponse. For instance, NCIPC engaged with NCHS staff working on the National Health and Nutrition Examination Survey to understand results from recent experiments related to optimal incentive structures to garner participation, and with BJS staff to understand results related to the redesign of the National Survey of Children's Exposure to Violence.


Further, NCIPC engaged a number of partners, including AAPOR members, RTI, NHTSA, and NHIS staff, in discussions regarding novel technologies that may be greatly impacting response rates. For example, at the 2017 annual AAPOR meeting, survey methodologists discussed advancements in technology that have allowed for a proliferation of phone applications that block repeated calls from 800 numbers. Thus, after discussions with RTI, AAPOR scientists, CDC staff, and NCIPC's Board of Scientific Counselors (BSC), for the 2018 survey CDC added numbers local to the Atlanta CDC area (770/404 area codes) for outbound calls, which allowed outbound phone numbers to be changed more frequently to avoid being inadvertently blocked by phone applications designed to block repeated calls from numbers suspected of belonging to marketers. This may have reduced the problem of erroneous flagging and blocking of the study phone number as spam by cell phone carrier applications and increased the number of survey participants.


CDC also engaged Federal partners to learn more about incentives offered to survey respondents and how a range of incentive types and reminder letters, postcards, and other materials may be used to improve response rates. For instance, CDC engaged in conversations with NHIS, NHTSA, BRFSS, and RTI to learn about relatively inexpensive options that could be mailed with an advance letter to potential respondents, which would serve as a reminder to participate in the survey.


The suggestions from the methodology panel and CDC’s efforts to consult with Federal and non-Federal partners outside the agency resulted in a number of ideas for activities that were integrated into the 2018 data collection period, including consultation with Westat, a leader in developing sound methods of data collection that have evolved with the introduction of new technologies. Consultation with outside entities strengthened our partnerships and improved our ability to call on our partners to discuss opportunities for collaboration and to learn from one another’s research and investments. These discussions further strengthened our ability to develop the redesign contract (conducted in 2018-2021) aimed at determining an optimal data collection approach for NISVS moving forward.


NCIPC staff have met with the Gender Policy Council and White House staff to discuss our measurement of technology-facilitated stalking and reproductive coercion. These discussions led to updates of some stalking items and further consultation with an expert in the field to update the reproductive coercion questions in the current instrument.


A.9. Explanation of Any Payment or Gift to Respondents


For this collection, NISVS is building on the incentive plan structure that was approved for several previous survey administrations (2010, 2011, 2012, 2015, and 2016-2018), making changes consistent with testing in the recent 2020-2021 methodology studies (OMB# 0920-0822). Some research indicates that the use of incentives may increase response rates (Guo, Kopec, Cibere, Li, & Goldsmith, 2016; Singer and Ye, 2013). Additionally, over the course of data collection, this could reduce costs and burden to respondents by reducing the need for additional contacts with potential respondents. A literature review on incentives by Singer and Ye (2013) concluded that incentives increase response rates across all survey modes, including mail, telephone, face-to-face, and web. For telephone and mail-paper surveys, a small pre-incentive of between $1 and $5 has been shown to significantly increase response rates (Cantor et al., 2008; Mercer et al., 2015; Smith et al., 2019). A promised incentive has also shown effectiveness, but a larger amount is generally needed (e.g., $10 or greater). There is less research on promised incentives for ABS web surveys, but a few studies have found both pre- and promised incentives to be effective when asking respondents to complete a web survey (Messer and Dillman, 2011; Biemer et al., 2017). Less is known about the most effective amounts, with promised incentives ranging between $10 and $40. In addition, a randomized controlled trial found that monetary incentives improved completion of an online survey (Hall et al., 2019).


Since 2010, NISVS has employed a two-phase survey design: Phase 1 is the primary data collection period, and Phase 2 (the non-response follow-up) focuses specifically on improving response rates and reducing nonresponse bias. The recent NISVS methodological study discussed above found that this non-response follow-up procedure continued to reduce survey error and bias in the ABS web survey.

In previously approved NISVS data collections (OMB # 0920-0822), incentives were given upon completing the survey (i.e., $10 for the primary data collection—Phase 1, and $40 for the non-response follow-up—Phase 2).


To be consistent with the incentive levels used in the NISVS feasibility and pilot studies, incentives are modestly increased (i.e., up to $25 for Phase 1; Phase 2 remains at $40) compared to previous NISVS data collections. Also, the incentives will be offered in sequential stages of survey completion rather than only at survey completion. Specifically, in Phase 1 a $5 cash pre-incentive will be included in the first mailing. Next, respondents will be offered a $10 promised incentive to complete the screener on the web, or a $5 pre-incentive if the paper screener is filled out. Finally, respondents will be given a $15 promised incentive for completing the survey on the web or by call-in telephone. After Phase 1, a random subsample of non-respondents will be drawn (Phase 2, non-response follow-up). Respondents in Phase 2 will be recontacted and offered a higher incentive of $40 to encourage them to participate and complete the survey.


A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents


The CDC Office of the Chief Information Officer has determined that the Privacy Act applies. The applicable System of Records Notice (SORN) is 09-20-0136, Epidemiologic Studies and Surveillance of Disease Problems, published in the Federal Register on December 31, 1992 (Vol. 57, No. 252, pp. 62812-62813). The Privacy Impact Assessment (PIA) is shown in Attachment R.


A number of procedures will be used to maintain the privacy of the respondent. An advance letter (Attachments S-T) will be mailed to selected households, and subsequent information provided to respondents will describe the study as being about health and injuries. For the web survey, the selected respondent will be required to change their password once they log on to the survey. The selected respondent will be provided with instructions on how to delete the browsing history from the computer.

Participation in the study is voluntary. Personally identifiable information (PII) will be collected for the ABS samples. Examples include addresses for those who request an incentive by check; names or initials for distinguishing adults on the screener's household roster; and contact information for follow-up with selected respondents. The main NISVS survey will collect only email addresses for delivering the gift code and mailing addresses for those who prefer to receive their incentive check by mail. Names, addresses, phone numbers, and e-mail addresses will never be associated or directly linked with the survey data. PII will be securely stored in password-protected files, separate from the survey data, to which only project staff will have access. The PII will be destroyed at the conclusion of the contract.


The contract will be covered by a Certificate of Confidentiality from CDC. The Certificate indicates that contractor employees working on the NISVS contract cannot disclose information or documents pertaining to NISVS to anyone who is not connected with the project. Such information may not be disclosed in any federal, state, or local civil, criminal, administrative, legislative, or other action, suit, or proceeding. The only exceptions are if a federal, state, or local law requires disclosure (such as to report child abuse or communicable diseases) or if a respondent indicates plans to harm him- or herself or others.


All data will be maintained in a secure manner throughout the data collection and data processing phases in accordance with NIST standards and OCISO requirements. Only contractor personnel who are conducting the study will have study-specific access to the temporary information that could potentially be used to identify a respondent (i.e., the telephone number and address). While under review, data will reside on directories to which only the project director can grant access. All computers will reside in a building with electronic security protections. Data cleaning processes will take place to ensure high data quality and to remove any data elements that could potentially be used to identify individuals (e.g., mailing address, email address).



Informed Consent

Both web and phone surveys will use a graduated consent procedure. The advance letter (Attachments S-T) will describe the survey in a generic way (e.g., "health and injuries"). For those who are selected to take the full survey, the questionnaire will begin with questions about the household and demographics, then move to health questions. Prior to beginning the first set of victimization questions (i.e., stalking), respondents will be given more information about the content of the remaining questions in a generic phrase (e.g., "physical injuries, harassing behaviors, and unwanted sexual activity") and given instructions for completing the survey, such as taking the survey in a private location. Additionally, participants are informed that they can skip any question and/or quit the survey at any time. Prior to each of the remaining sections, additional descriptions are given that are appropriate to the content of the items (e.g., use of explicit language). The instrument is presented in Attachments K-L.

A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions

IRB Approval


Recent changes to the Common Rule specify three criteria for designation as public health surveillance: the activity must (1) be surveillance addressing a public health objective; (2) be conducted pursuant to a public health authority; and (3) be limited to activities necessary to achieve the public health objective. This data collection is a public health surveillance effort (Attachment U). The CDC National Center for Injury Prevention and Control's OMB and human subjects liaison has determined that the activity is not research and that IRB approval is not needed.


Justification for Sensitive Questions


Experiences of SV, IPV, and stalking, and the impacts that result from them, are often underreported to officials and health care providers. As such, survey data provide the best source of information on the prevalence of SV, IPV, and stalking and the impacts of such violence, which has the potential to inform prevention and response activities. Black et al. (2006) published a study that assessed survey respondents' reactions to questions about experiences of violence. Results showed that respondents not only believed the violence questions should be asked, but were also willing to answer such questions (Black et al., 2006). Several studies published since then demonstrate that participants asked about these potentially sensitive areas (such as topics related to trauma or violence) are not negatively affected by participation. Other survey studies support these findings, with most respondents reporting that they did not regret participating (e.g., McClinton Appollis et al., 2015; Newman et al., 2006) and some respondents reporting a benefit from participating (Kaasa et al., 2016). Some studies have even demonstrated short- or long-term benefits to participants' healing and well-being from participating in such surveys or studies (Cook et al., 2015; Hamberger et al., 2020; Kirkner et al., 2019; Larsen & Berenbaum, 2014). Finally, in results from feasibility testing of the NISVS survey (OMB # 0920-0822), a majority of respondents reported at the conclusion of the survey that they would strongly agree or agree to make the same choice to participate, with a higher percentage of victims than non-victims agreeing that they would make the same choice. This finding suggests that those with victimization experiences are likely to find the survey meaningful and important, perhaps because it allows their voices to be heard and their experiences to be captured, and because it contributes to reducing the problem.


Web surveys, specifically, are increasingly used for sensitive topics because the mode allows for anonymous communication that minimizes the need for disclosure to an interviewer. Web surveys have been used to collect details on topics of sexual victimization, such as sexual aggression and rape (Buday & Peterson, 2015; Fagerlund & Ellonen, 2016; Griggs et al., 2018; Littleton et al., 2019), sexual harassment among adolescents across sexual orientations (Mitchell, Ybarra, & Korchmaros, 2014), recall of extra-familial childhood sexual abuse (Langeland et al., 2015), and incidence of polyvictimization among LGBTQ adolescents (Sterzing, Gartner, & McGeough, 2018).


Attachments K-L contain the full NISVS survey instrument. Questions included in the current NISVS are modeled after earlier NISVS instruments, questions used in the 1995-96 National Violence Against Women Survey (NVAWS), and other studies measuring SV, IPV, and stalking.


A.12.a) Estimates of Annualized Burden Hours and Costs


Due to the change in sampling and survey methods, the total estimated burden for this study is 17,949 hours (see Table 2). The redesign and pilot testing studies sampled fewer respondents; therefore, the estimated burden in this revision is higher than the previously approved burden of 1,189 hours, but lower than the annualized hours from the previous full data collection (27,106 hours, OMB # 0920-0822, approved 7/25/2016).

The overall response rate is expected to be between 26% and 31%, which is higher than in the previous NISVS data collection under the RDD design. An estimated 97% of respondents will complete the survey via the web and 3% will use the telephone call-in option; the same survey instrument will be used in both modes. Burden was calculated based on a final sample size of 20,000 per data collection period, with an estimated 1 minute to read the advance letter, 5 minutes to complete the web or paper screener, 25 minutes for the web questionnaire, and 40 minutes for the call-in telephone survey option. The total is derived from the burden hours for non-participating and eligible households, based on an average of 5 minutes for screened households and 25 minutes for respondents who complete the survey.


Table 2. Estimated Annualized Burden Hours

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden (in hours)
Individuals and Households | Advance Letter (Att. S-T) | 134,933 | 1 | 1/60 | 2,249
Individuals and Households | Screener (Att. K-N) | 48,667 | 1 | 5/60 | 4,056
Individuals and Households | Questionnaire, web (Att. K-L) | 26,667 | 1 | 25/60 | 11,111
Individuals and Households | Questionnaire, phone (Att. K-L) | 800 | 1 | 40/60 | 533
Total Burden | | | | | 17,949
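
As an informal check (not part of the official burden methodology), the arithmetic behind Table 2 can be reproduced in a few lines of Python. This is an illustrative sketch; the names used are ours rather than anything from the study materials:

    # Reproduce the Table 2 burden-hour arithmetic (illustrative sketch).
    # Each entry: form name -> (number of respondents, burden per response in hours).
    rows = {
        "Advance Letter (Att. S-T)": (134_933, 1 / 60),
        "Screener (Att. K-N)": (48_667, 5 / 60),
        "Questionnaire, web (Att. K-L)": (26_667, 25 / 60),
        "Questionnaire, phone (Att. K-L)": (800, 40 / 60),
    }

    total = 0
    for form, (n, hours_each) in rows.items():
        burden = round(n * hours_each)  # whole hours, as reported in Table 2
        total += burden
        print(f"{form}: {burden:,} hours")
    print(f"Total burden: {total:,} hours")  # 17,949

Note that the reported total (17,949 hours) is the sum of the rounded row values.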


A.12.b) Estimated Annualized Respondent Burden Costs


For the general population, the estimated annualized respondent burden cost is $589,087 (see Table 3). It is estimated that approximately 97% of respondents will complete the survey via the web and 3% will use the telephone call-in option. Costs were calculated based on a final sample size of 20,000 per data collection period, with an estimated 1 minute to read the advance letter, 5 minutes to complete the web, phone, or paper screener, 25 minutes for the web questionnaire, and 40 minutes for the call-in telephone survey option. The mean hourly wage for private non-farm positions from the Bureau of Labor Statistics for November 2022 is $32.82 (Table B-3, U.S. Department of Labor, 2022).


Table 3. Estimated Annualized Burden Costs

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Average Hourly Wage* | Total Cost
Individuals and Households | Advance Letter (Att. S-T) | 134,933 | 1 | 1/60 | $32.82 | $73,808
Individuals and Households | Screener (Att. K-N) | 48,667 | 1 | 5/60 | $32.82 | $133,104
Individuals and Households | Questionnaire, web (Att. K-L) | 26,667 | 1 | 25/60 | $32.82 | $364,671
Individuals and Households | Questionnaire, phone (Att. K-L) | 800 | 1 | 40/60 | $32.82 | $17,504
Total Burden Cost | | | | | | $589,087

*Mean hourly wage for private non-farm positions, November 2022 (Table B-3, U.S. Department of Labor, 2022).
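
Similarly, the Table 3 costs follow from applying the $32.82 mean hourly wage to each row's unrounded burden hours and rounding to whole dollars. A minimal sketch (illustrative only, with names of our choosing):

    # Reproduce the Table 3 burden-cost arithmetic (illustrative sketch).
    WAGE = 32.82  # mean hourly wage, BLS Table B-3, November 2022

    rows = {
        "Advance Letter (Att. S-T)": (134_933, 1 / 60),
        "Screener (Att. K-N)": (48_667, 5 / 60),
        "Questionnaire, web (Att. K-L)": (26_667, 25 / 60),
        "Questionnaire, phone (Att. K-L)": (800, 40 / 60),
    }

    total_cost = 0
    for form, (n, hours_each) in rows.items():
        cost = round(n * hours_each * WAGE)  # each row rounded to whole dollars
        total_cost += cost
        print(f"{form}: ${cost:,}")
    print(f"Total burden cost: ${total_cost:,}")  # $589,087

As with Table 2, the reported total is the sum of the rounded row values.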


A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers


This data collection activity does not include any other annual cost burden to respondents, nor to any record keepers.

A.14. Annualized Cost to the Government


The contract to conduct the study was awarded to RTI International (RTI) through competitive bid in September 2022. The total cost for the 2022-2023 data collection activities is $9,092,501, including $8,194,963 in contractor costs and $897,538 in annual costs incurred directly by the federal government (Table 4).


Costs for this study include personnel for designing the data collection protocol; developing, programming, and testing the web and phone survey instruments; designing the web system used in survey data collection; drawing the address-based sample; advance letter and screener mailings and postage; training interviewers for the call-in telephone option; collecting and analyzing the data; and reporting the study results. Government costs include personnel costs for federal staff involved in oversight, study design, and analysis, as presented in detail in Table 4.


Table 4. Estimated Annualized Cost to the Government

Government Statistician (2 FTEs). Annual cost: $296,283
Project oversight, study and survey design, sample selection, data analysis, and consultation. Provides review of and input into all statistical aspects of the study design and conduct, including but not limited to study design, sample selection, weighting, total survey error, non-response bias, and response rate. Survey instrument testing, data analysis and consultation, and oversight of the QA process.

Government Computer Programmer (.5 FTE). Annual cost: $73,948
Processes data and produces code for complex quality assurance checks.

Government Data Manager (.5 FTE). Annual cost: $37,607
Data storage, documentation, and quality assurance checking and reporting. Suggests timetables associated with the data collection and analysis plan. Collaborates with investigators to write plans pertaining to the design of data collection and analysis. Develops plans to ensure quality control of data collection and analysis processes.

Government Behavioral Scientist (2.5 FTEs). Annual cost: $289,600
Project oversight, study and survey design, sample selection, data analysis, and consultation. Discusses different data collection methods and statistical approaches. Applies theories of psychology, sociology, and other behavioral sciences to the development of data collection instruments and methodological approaches. Designs tools and materials for data collection. Communicates research findings to professional audiences and agency staff using appropriate methods (e.g., manuscripts, peer-reviewed journals, conferences).

Government Epidemiologist (1.2 FTE). Annual cost: $134,100
Describes sources, quality, and limitations of surveillance data. Defines and monitors surveillance system parameters (e.g., timeliness, frequency). Defines the functional requirements of the supporting information system. Tests data collection, data storage, and analytical methods. Evaluates surveillance systems using national guidance and methods. Recommends and implements modifications to surveillance systems on the basis of an evaluation. Communicates research findings to professional audiences and agency staff using appropriate methods (e.g., reports, manuscripts, peer-reviewed journals, conferences).

Government Public Health Analyst (.6 FTE). Annual cost: $66,000
Project management, including oversight of budget and administration. Applies knowledge of the acquisition and grants lifecycle. Manages and monitors the implementation of interagency agreements and contracts. Applies methods and procedures for funding acquisitions.

Subtotal, Government Personnel: $897,538

Contracted Personnel and Services1. Annual cost: $8,194,963
Study design, web system programming, information technology, phone interviewer training, respondent incentives, data collection, cleaning, and analysis.

TOTAL COST: $9,092,501

1Contracted personnel and services cost estimates are based on estimated funds available during the base period (September 2022 – May 2024). The contract is funded for multiple years, with data collected on an annual basis. The total contract amount for the general population data collection (20-month base period plus three 12-month option years, for a total of 56 months) is anticipated to be $14,987,202, an annualized amount of $4,995,734 over 3 years. The government expects that this task order will be incrementally funded; based upon satisfactory performance and availability of funds, the contract may be renewed for the third option year.



A.15. Explanation for Program Changes or Adjustments


The estimated burden in this revision is higher than the previously approved burden of 1,189 hours. The total burden for this study, estimated at 17,949 hours, is higher due to the change in sampling and survey methods. CDC requests a revision to complete the 2022-2023 full-scale data collection using the redesigned methodology and the current survey instrument (Attachments K-L). The redesigned methodology is based on recommendations resulting from experimental studies (Attachment E) conducted in 2018-2021 and approved by OMB on 6/19/2019, 3/20/2020, and 9/2/2021. This revision request incorporates methodological design changes to improve the response rate, reduce cost, and reduce non-response bias. Additionally, survey questions and their formatting were revised to improve clarity and reduce respondent burden, and to update content to address more recent concerns (e.g., stalking with technology, technology-facilitated sexual violence) (Attachment I). The survey question revisions are informed by results from cognitive testing conducted by NCHS (CCQDER) in 2021-2022 (Attachment H) and consultation with the current contractor (RTI). See the detailed description of survey revisions in Attachment I.


A.16. Plans for Tabulation and Publication, and Project Time Schedule

The schedule for data collection, analysis, and reporting is shown in Table 5 below. Data from each phase of data collection will be stored in password-protected files. Data analyses will be conducted and results will be prepared for publication and dissemination.

Table 5. Data Collection & Report Generation Time Schedule

Activities | Time Schedule
Prepare system for data collection (web system design, phone survey design, programming, testing, phone interviewer training) | Concurrent with OMB review
Data collection | Begins after OMB approval and continues for 3 years
Clean and edit Base Period dataset | Begins immediately after data collection is completed
Conduct data analyses | Begins six months after initiation of cleaning and editing of dataset
Prepare reports | Begins six months after initiation of data analyses



A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

The display of the OMB expiration date is not inappropriate.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.


REFERENCES


Basile KC, Langton L, Gilbert LK. (2018). Sexual violence: United States health and justice measures of sexual victimization. [Accessed October 20, 2022]. Retrieved from: https://www.cdc.gov/violenceprevention/sexualviolence/sexualvictimization.html#print


Basile KC, Smith SG, Kresnow M, Khatiwada S, & Leemis RW. (2022). The National Intimate Partner and Sexual Violence Survey: 2016/2017 Report on Sexual Violence. Atlanta (GA): Centers for Disease Control and Prevention, National Center for Injury Prevention and Control.


Biemer PP, Murphy J, Zimmer S, Berry C, Deng G, & Lewis K. (2017). Using bonus monetary incentives to encourage web response in mixed-mode household surveys. Journal of Survey Statistics and Methodology, 6(2), 240–261.


Black MC, Kresnow M, Simon TR, Arias I, & Shelley G. (2006). Telephone survey respondents' reactions to questions regarding interpersonal violence. Violence and Victims, 21(4), 445–459.


Bonomi AE, Anderson ML, Nemeth J, Rivara FP, & Buettner C. (2013). History of dating violence and the association with late adolescent health. BMC Public Health, 13, 821. https://doi.org/10.1186/1471-2458-13-821


Buday SK, & Peterson ZD. (2015). Men's and women's interpretation and endorsement of items measuring self-reported heterosexual aggression. Journal of Sex Research, 52(9), 1042–1053.


Cantor D, Han D, & Sigman R. (2008). Pilot of a mail survey for the Health Information National Trends Survey. Annual Meeting of the American Association for Public Opinion Research, New Orleans, LA.


Chen J, Khatiwada S, Chen MS, Smith SG, Leemis RW, Friar N, Basile KC, and Kresnow M. (In press). The National Intimate Partner and Sexual Violence Survey (NISVS) 2016/2017: Report on Victimization by Sexual Orientation. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.


Chen J, Walters ML, Gilbert LK, & Patel N. (2020). Sexual violence, stalking, and intimate partner violence by sexual orientation, United States. Psychology of Violence, 10(1), http://dx.doi.org/10.1037/vio0000252


Cook SL, Swartout KM, Goodnight BL, Hipp TN, & Bellis AL. (2015). Impact of violence research on participants over time: helpful, harmful, or neither? Psychology of Violence, 5(3), 314–324.


Fagerlund M, & Ellonen N. (2016). Children's experiences of completing a computer-based violence survey: Finnish Child Victim Survey revisited. Journal of Child Sexual Abuse, 25(5), 556–576.


Griggs AK, Berzofsky ME, Shook-Sa BE, Lindquist CH, Enders KP, Krebs CP, Planty M, & Langton L. (2018). The impact of greeting personalization on prevalence estimates in a survey of sexual assault victimization. Public Opinion Quarterly, 82(2), 366–378.


Guo Y, Kopec JA, Cibere J, Li LC, & Goldsmith CH. (2016). Population survey features and response rates: a randomized experiment. American Journal of Public Health, 106(8), 1422–1426.


Hall E, Sanchez T, Stephenson R, Stein AD, Sineath RC, Zlotorzynska M, & Sullivan P. (2019). Randomised controlled trial of incentives to improve online survey completion among internet-using men who have sex with men. Journal of Epidemiology and Community Health, 73, 156–161.


Hamberger LK, Larsen S, & Ambuel B. (2020). "It helped a lot to go over it": intimate partner violence research risks and benefits from participating in an 18-month longitudinal study. Journal of Family Violence, 35(1), 43–52.


Healthy People 2030 [internet]. Washington, DC: U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. [Accessed February 03, 2022]. Available at: https://health.gov/healthypeople/objectives-and-data/browse-objectives/violence-prevention


Hoebel J, von der Lippe E, Lange C, & Ziese T. (2014). Mode differences in a mixed-mode health interview survey among adults. Archives of Public Health, 72, 46. https://doi.org/10.1186/2049-3258-72-46


Jina R, & Thomas LS. (2013). Health consequences of sexual violence against women. Best Practice & Research Clinical Obstetrics and Gynaecology, 27, 15–26.


Jordan CE, Campbell R, & Follingstad D. (2010). Violence and women’s mental health: the impact of physical, sexual, and psychological aggression. Annual Review of Clinical Psychology, 6, 607–628.


Kaasa SO, Heaton L, McAloon R, & Cantor D. (2016, July). Harm, benefit, and regret: How respondent characteristics affect reactions to a sexual assault survey (paper presentation). International Family Violence and Child Victimization Research Conference, Portsmouth, NH.


Kirkner A, Relyea M, & Ullman SE. (2019). Predicting the effects of sexual assault research participation: reactions, perceived insight, and help-seeking. Journal of Interpersonal Violence, 34(17), 3592–3613.


Langeland W, Smit JH, Merckelbach H, de Vries G, Hoogendoorn AW, & Draijer N. (2015). Inconsistent retrospective self-reports of childhood sexual abuse and their correlates in the general population. Social Psychiatry and Psychiatric Epidemiology, 50(4), 603–612.


Larsen SE, & Berenbaum H. (2014). The effect of participating in a trauma- and stressful event-focused study. Journal of Clinical Psychology, 70(4), 333–340.

Leemis RW, Friar N, Khatiwada S, Chen MS, Kresnow M, Smith SG, Caslin S, & Basile KC. (2022). The National Intimate Partner and Sexual Violence Survey, 2016/2017: Report on Intimate Partner Violence. Atlanta (GA): National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.


Littleton H, Layh M, Rudolph K, & Haney L. (2019). Evaluation of the Sexual Experiences Survey-Revised as a screening measure for sexual assault victimization among college students. Psychology of Violence, 9(5), 555–563.


McClinton Appollis T, Lund C, de Vries PJ, & Matthews C. (2015). Adolescents' and adults' experiences of being surveyed about violence and abuse: a systematic review of harms, benefits, and regrets. American Journal of Public Health, 105(2), e31–45.


Mercer A, Caporaso A, Cantor D, & Townsend R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105–129.


Messer BL, & Dillman DA. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75(3), 429–457.

Messeri P, Cantrell J, Mowery P, Bennett M, Hair E, & Vallone D. (2019). Examining differences in cigarette smoking prevalence among young adults across national surveillance surveys. PLoS ONE, 14(12), e0225312, https://doi.org/10.1371/journal.pone.0225312

Mitchell KJ, Ybarra ML, & Korchmaros JD. (2014). Sexual harassment among adolescents of different sexual orientations and gender identities. Child Abuse & Neglect, 38(2), 280–295.


National Research Council. (2014). Estimating the Incidence of Rape and Sexual Assault. Washington, DC: The National Academies Press. https://doi.org/10.17226/18605


Newman E, Risch E, & Kassam-Adams K. (2006). Ethical issues in trauma-related research: a review. Journal of Empirical Research on Human Research Ethics, 1(3), 29–46.


Peterson C, DeGue S, Florence C, & Lokey CN. (2017). Lifetime economic burden of rape among US adults. American Journal of Preventive Medicine, 52(6), 691–701.


Peterson C, Kearns MC, McIntosh WL, Estefan LF, Nicolaidis C, McCollister KE, Gordon A, Florence C. (2018). Lifetime economic burden of intimate partner violence among US adults. American Journal of Preventive Medicine, 55(4), 433–444.


Singer E, & Ye C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112–141.


Smith MG, Witte M, Rocha S, Basner M. (2019). Effectiveness of incentives and follow-up on increasing survey response rates and participation in field studies. BMC Medical Research Methodology, https://doi.org/10.1186/s12874-019-0868-8

Smith SG, Basile KC, & Kresnow M. (2022). The National Intimate Partner and Sexual Violence Survey: 2016/2017 Report on Stalking — Updated Release. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.


Sterzing PR, Gartner RE, & McGeough BL. (2018). Conducting anonymous, incentivized, online surveys with sexual and gender minority adolescents: lessons learned from a national polyvictimization study. Journal of Interpersonal Violence, 33(5), 740–761.


Tjaden P, & Thoennes N. (1998). Prevalence, incidence, and consequences of violence against women: findings from the National Violence Against Women Survey. (NCJ Publication No. 172837). U.S. Department of Justice, Office of Justice Programs, Washington, DC.


Truman JL, & Brotsos H. (2022). Update on the NCVS instrument redesign (NCJ Publication No. 304055). U.S. Department of Justice Bureau of Justice Statistics, Washington, DC. [Accessed October 17, 2022]. Retrieved from: https://bjs.ojp.gov/content/pub/pdf/uncvsir_sum.pdf


U.S. Department of Labor. Bureau of Labor Statistics. (2022). Table B-3. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted [Accessed December 7, 2022]. Retrieved from: https://www.bls.gov/news.release/empsit.t19.htm


Zuwallack R, Jans M, Brassell T, Bailly K, Dayton J, Martinez P, Patterson D, Greenfield TK, & Karriker-Jaffe KJ. (2022). Estimating web survey mode and panel effects in a nationwide survey of alcohol use. Journal of Survey Statistics and Methodology, https://doi.org/10.1093/jssam/smac028


