

SUPPORTING STATEMENT: PART A






OMB# 0920-0822


The National Intimate Partner and Sexual Violence Survey (NISVS)







April 20, 2016












Point of Contact:

Sharon G. Smith, PhD

Behavioral Scientist

Contact Information:

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE MS F-64

Atlanta, GA 30341-3724

phone: 770.488.1363

email: [email protected]





CONTENTS

Section Page


Summary table 3

  1. Justification 5


A.1. Circumstances Making the Collection of Information Necessary 5

A.2. Purpose and Use of Information Collection 8

A.3. Use of Improved Information Technology and Burden Reduction 8

A.4. Efforts to Identify Duplication and Use of Similar Information 9

A.5. Impact on Small Businesses or Other Small Entities 10

A.6. Consequences of Collecting the Information Less Frequently 10

A.7. Special Circumstances Relating to the Guidelines of

5 CFR 1320.5(d)(2) 10

A.8. Comments in Response to the Federal Register Notice and

Efforts to Consult Outside the Agency 11

A.9. Explanation of Any Payment or Gift to Respondents 15

A.10. Protection of the Privacy and Confidentiality of Information

Provided by Respondents 15

A.11. Institutional Review Board (IRB) and Justification for Sensitive

Questions 18

A.12. Estimates of Annualized Burden Hours and Costs 19

A.13. Estimates of Other Total Annual Cost Burden to Respondents

or Record Keepers 21

A.14. Annualized Cost to the Government 21

A.15. Explanation for Program Changes or Adjustments 22

A.16. Plans for Tabulation and Publication and Project Time Schedule 22

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate 23

A.18. Exceptions to Certification for Paperwork Reduction Act

Submissions 23


Attachments

A Authorizing Legislation: Public Health Service Act

B Published 60-Day Federal Register Notice

C Documentation Regarding Consultation with Other Federal Agencies

D Institutional Review Board (IRB) Approval

E Survey - National Intimate Partner and Sexual Violence Survey (NISVS)

F Security Agreement

G Privacy Impact Assessment (PIA)

H NISVS Questionnaire - Spanish Version

I Lead Letters

J Program Changes

K 2008 Meeting Participants

L NISVS Work Group Timeline 070816


SUMMARY TABLE


  • Goal of the study.

The National Intimate Partner and Sexual Violence Survey (NISVS) collects information about individuals’ experiences of sexual violence, stalking and intimate partner violence. NISVS produces national and state level prevalence estimates of these types of violence.

  • Intended use of the resulting data.

These data are used by local, state and national governments and organizations to inform prevention programs and policy making related to intimate partner violence, sexual violence and stalking.

  • Methods to be used to collect data.

NISVS is a dual-frame (landline and cell phone) random digit dial (RDD) telephone survey.

  • The subpopulation to be studied.

Non-institutionalized, English- and Spanish-speaking men and women aged 18 years or older in the United States; active duty women and men serving in the military (Army, Navy, Air Force, Marines); and wives of active duty men serving in the military (Army, Navy, Air Force, Marines).

  • How data will be analyzed.

Data will be analyzed using statistical software that accounts for the complexity of the survey design to compute weighted counts, percentages, and confidence intervals using both national- and state-level data.
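For illustration only, the sketch below (in Python, using hypothetical respondent-level values and hypothetical analysis weights) shows the kind of weighted tabulation described above; production analyses use statistical procedures that fully account for the complex dual-frame design.

```python
# Minimal illustrative sketch, not the production NISVS analysis code; production
# analyses use survey procedures that account for the complex dual-frame design.
import numpy as np

def weighted_prevalence(indicator, weights):
    """Return the weighted count and weighted percentage for a binary indicator."""
    indicator = np.asarray(indicator, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weighted_count = float(np.sum(weights * indicator))
    prevalence = weighted_count / float(np.sum(weights))
    return weighted_count, prevalence

# Hypothetical respondent-level data: 1 = reported lifetime IPV, 0 = did not.
y = [1, 0, 0, 1, 0]
w = [1200.5, 980.2, 1100.0, 1500.7, 875.3]   # hypothetical analysis weights
count, prev = weighted_prevalence(y, w)
print(f"Weighted count: {count:,.0f}; weighted prevalence: {prev:.1%}")
```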




A. JUSTIFICATION


A.1. Circumstances Making the Collection of Information Necessary


This is a revision request for the currently approved National Intimate Partner and Sexual Violence Survey (OMB# 0920-0822, expiration date 6/30/2016) for 2 years. This survey has been conducted annually since 2010. Data collection for the 2016-2017 cycle is slated to begin in September 2016 and run through September 2017.

This request for revision is multi-faceted. It includes the following elements, with the changes elaborated in further detail in Attachment J.

  • We are requesting a continuation of data collection among non-institutionalized adult men and women aged 18 years or older in the United States assessing lifetime experiences of intimate partner violence (IPV), sexual violence (SV) and stalking with a new and improved data collection tool. In this revised instrument, CDC:

    • added a small number of new questions that state health departments, state IPV/SV coalitions, and grantees have indicated will be useful (i.e., questions on child exposure to physical or psychological IPV, normative beliefs about IPV, SV, and bystander intervention, and on barriers to bystander intervention) and included one item on HIV status to address goals outlined in the National HIV/AIDS Strategy for the United States;

    • streamlined and improved the flow of the NISVS data collection tool;

    • dropped perpetrator initials and now collects data on the victim-perpetrator relationship and perpetrator sex for each set of behaviors of interest as the questions are asked;

    • now identifies the first intimate relationship in which victimization occurred for a given set of behaviors, and expands the number of questions related to psychological aggression (adding back questions from 2010-2012 that were removed due to space limitations);

    • revised the stalking section to more specifically measure technology-based stalking, including the addition of one item to measure stalking via GPS tracking devices;

    • revised the introductory script and the language of some items to more clearly capture behaviors corresponding to the sexual violence experiences of rape and being made to penetrate a perpetrator for both forcible and alcohol/drug-facilitated contexts;

  • The revisions to the survey are aimed at reducing the time and complexity of the instrument.

  • The simplified structure of the instrument will also reduce the complexity of the data set, making it available to the public sooner and, we anticipate, easier to use.

  • Beginning with the 2016-2017 data collection year, the periodicity of the administration of the NISVS instrument is being changed from annual to biennial. This change is proposed to increase the number of interviews from 12,500 collected annually to 25,000 collected during a 12-month period. In addition, CDC has secured funding to increase the number of NISVS interviews conducted in each data collection cycle by as much as 7,500 initially during 2016-2017, and by as many as 15,000 over the next three to four years.

  • In addition, in collaboration with the Department of Defense (DoD), NISVS (using the same newly revised survey described above) will collect information regarding the experiences of IPV, SV, and stalking among active duty women and men in the military and wives of active duty men. The collection of data on behalf of DoD will take place during the first six months of data collection in the 2016-2017 cycle. The NISVS survey was last administered to active duty females and wives of active duty males in 2010.


CDC initiated the process that led to the above changes in 2015. The overarching goal of this effort was to enhance the ability of NISVS to provide timely data that are more easily accessed and used by those groups that have the greatest potential to take actions that can prevent IPV, SV, and stalking, particularly grantees and state-level prevention partners. To achieve this goal, CDC, in close collaboration with its partners and stakeholders, completed work to:

  1. Revise the content of the NISVS data collection tool to provide information that is useful for guiding action at the state level.

  2. Enhance the system's data collection methods to allow for increased precision, sensitivity, and representativeness.

  3. Ensure that NISVS data are collected and managed in a way that allows for timely analysis and dissemination.

Examples of actions taken in pursuit of these objectives include but are not limited to:

  1. Pooling of data from NISVS 2010, 2011, and 2012 data years to produce state-specific estimates that will be presented in a NISVS State Report in 2016

  2. Increasing the number and diversifying the skill mix of program and analytic staff assigned to assist with NISVS operations.

  3. Providing funding to increase the total number of completed interviews to be acquired via the NISVS contract.

  4. Transitioning the system to a format in which data collection occurs every other year, which enables substantial increases in the sample size during data collection years and creates more time for generating data sets for public use and data reports for use by prevention stakeholders.

  5. Collaborating with the Bureau of Justice Statistics to initiate a series of expert panel meetings throughout 2016 to obtain guidance on how to improve survey design (methods, sampling frame, recruitment, mode of administration, etc.) to increase response rates, reduce non-response bias, and maximize opportunities across Federal surveys for covering populations of interest.


NCIPC has worked to improve the performance of the NISVS data collection tool (without altering its core content on IPV, SV, and stalking prevalence), decrease the level of burden on respondents, and reduce the time required to complete data processing, validation, and packaging for public release. In addition, our inclusion of new questions on child exposure to physical or psychological IPV, normative beliefs about IPV, SV, and bystander intervention, and on barriers to bystander intervention in the NISVS data collection tool further aligns NISVS surveillance approaches with stakeholder needs and demonstrates responsiveness to their expressed recommendations for surveillance improvement.


The revised NISVS data collection tool is slated for deployment in 2016. However, before this tool could be implemented, it was critical that cognitive testing be completed to characterize its performance in real interview situations and to identify potential sources of response error. Therefore, in February 2016 the contractor for NISVS conducted interviews with both victims of intimate partner violence, sexual violence, and stalking and with non-victims to gather feedback related to the modifications of existing questions and the addition of new questions in the NISVS survey. The goal of gathering this feedback was to ensure that the terms and concepts used are universally understood by respondents and that the process of answering the survey questions is not overwhelming from a cognitive, time, or emotional burden perspective. In particular, we wanted to understand and address any sources of confusion related to the revisions, including edits to introductions, the formatting and sequencing of questions, and the transition to the new questions. Cognitive interviews were conducted with 30 participants. The information collected was used to further refine and improve the NISVS survey to help ensure that the instrument is effectively and efficiently measuring the types of victimization of central interest in the surveillance system.


To comply with OMB's remaining terms of clearance for 2014, CDC continues its collaboration with BJS and its progress toward convening a work group to obtain expert feedback and input on how to enhance NISVS. A revised timeline is included (Attachment L). It is anticipated that the work group will meet several times between October 2016 and April 2017. The participants in this work group will provide guidance on how to improve the system's survey design (methods, sampling frame, recruitment, mode of administration, etc.) with the goals of increasing response rates, reducing non-response bias, and maximizing the opportunities across Federal surveys for covering populations of interest. We will continue to report to OMB on the progress in convening this group.



Background

Intimate partner violence, sexual violence, and stalking endanger the health and well-being of women and men across the United States. As described below, more than two decades of research demonstrate that IPV, SV, and stalking are major public health problems with serious long-term health consequences and significant social and public health costs (Basile, Black, Simon, Arias, Brener & Saltzman, 2006; Black and Breiding, 2008; Breiding, Black, & Ryan, 2008; CDC, 2003; Tjaden and Thoennes, 1998). Extensive literature provides evidence indicating IPV, SV, and stalking substantially contribute to negative mental health outcomes, including depression, chronic mental illness, and post-traumatic stress disorder (e.g., Breiding, Black, & Ryan, 2008, Bonomi, Thompson, Anderson, Reid, Carrell, et al., 2006; Vos, Astbury, Piers, Magnus, Heenan, et al., 2006).

Intimate Partner Violence. IPV is violence committed by a spouse, ex-spouse, or current or former boyfriend or girlfriend; it includes physical violence, sexual violence, and emotional abuse, and has an estimated annual cost of $5.8 billion for medical care and lost productivity (National Center for Injury Prevention and Control, 2003). Both men and women are victims of IPV; it can occur among heterosexual and same-sex couples. In 2011, the National Intimate Partner and Sexual Violence Survey (NISVS) estimated that 1 in 3 women and 1 in 4 men reported experiencing IPV (rape, physical violence, and/or stalking) during their lifetime (Black, Basile, Breiding, Smith, Walters, Merrick, Chen & Stevens, 2011). This translates into approximately 42.4 million women and 32.2 million men who experienced rape, physical violence, and/or stalking by an intimate partner during their lifetime in the United States. In addition, approximately 7 million women and 5.7 million men experienced these types of violence by an intimate partner within the 12 months prior to the survey. Both women and men have increased risk for long-term health problems (Black and Breiding, 2008). However, women are much more likely than men to suffer physical injuries or psychological trauma from IPV (Brush 1990; Gelles, 1997). Women are also significantly more likely than men to be killed by an intimate partner (Puzone et al. 2000).

Studies have also shown that abused women experience more physical and functional health problems and have a higher occurrence of depression, drug and alcohol abuse, and suicide attempts than do women who are not abused (Campbell, et al., 1995; Golding, 1996; Kaslow et al., 1998; Kessler et al., 1994; Krug et al., 2002). Psychological consequences include posttraumatic stress disorder, depression, substance abuse, and suicidal behaviors and ideation (Caetano and Cunradi 2003; Campbell 2002; Coker et al. 2000; Kaslow et al. 1998, 2002; Koss et al. 2003; Mechanic et al. 2000.)

Sexual Violence. SV has a profound and long-term impact on the physical and mental health of the victim. In addition to injury, SV is associated with an immediate and long-term increased risk of sexual and reproductive problems (Krug et al., 2002). The annual cost of rape committed by intimate partners alone exceeds $319 million (Max, Rice, Finkelstein, Bardwell, & Leadbetter, 2004). According to the Bureau of Justice Statistics, rape is one of the most underreported crimes (Bachar and Koss, 2001), due in large part to the high level of social stigma and shame associated with rape. Approximately 84% of rapes and sexual assaults are not reported to police (Kilpatrick et al., 1992).

Stalking. In 2010, the National Intimate Partner and Sexual Violence Survey found that 16.2% of women and 5.2% of men in the United States had experienced stalking during their lifetime in which they felt very fearful or believed that they or someone close to them would be harmed or killed (Black, et al., 2011). This translates into approximately 19.3 million women and 5.8 million men in the United States. Stalking can result in severe and even fatal outcomes for victims because it often occurs with other kinds of partner violence; 81% of women who were stalked by a current or former intimate partner were also physically assaulted by that partner, and 31% were sexually assaulted by that partner (Tjaden & Thoennes, 1998). Evidence also suggests that women who are stalked by ex-partners may be at high risk for being killed (Crowell and Burgess, 1996). The estimated economic cost of stalking of women in 1995 was $342 million (Max, et al., 2004). Adjusted for inflation, this cost was $438 million in 2005 (Sahr, 2006).


The need for an ongoing surveillance system is evident in the fact that, prior to NISVS, the lack of regular, ongoing surveillance using uniform definitions and consistent survey methods over time made it nearly impossible to evaluate trends in IPV, SV, and stalking. The lack of comparable state-specific prevalence data has limited the ability of national and state public health officials to measure the impact of IPV, SV, and stalking in individual states. Improved surveillance helps guide the most effective use of limited prevention resources. More detailed and frequent information informs intervention and prevention strategies at both the national and state levels. Documenting and monitoring the incidence and prevalence of IPV, SV, and stalking is critical to improving the health status of individuals, making communities safer, and reducing the social and healthcare costs currently burdening state and federal governments and programs. NISVS data helps inform public policies and prevention strategies and helps to guide and evaluate progress towards reducing the substantial health and social burden associated with IPV, SV, and stalking.



The CDC is the lead federal agency for public health objectives related to injury and violence. The Healthy People 2020 report (U.S. DHHS, 2010) lists several objectives that pertain directly to IPV, SV, and stalking. Applicable objectives include IVP-39: "reduce the rate of physical assault by current or former intimate partners"; "reduce sexual violence by a current or former intimate partner"; "reduce psychological violence by a current or former intimate partner"; "reduce stalking by a current or former intimate partner." Also applicable is objective IVP-40: "reduce the annual rate of rape or attempted rape"; "reduce sexual assault other than rape." Authority for CDC's National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A). This act gives Federal health agencies, such as CDC, broad authority to collect data and carry out other public health activities, including this type of study.





A.2. Purpose and Use of Information Collection


The specific aims of NISVS are to collect consistent and reliable data on the incidence, prevalence, and nature of IPV, SV, and stalking at the state and national levels among U.S. women and men on an ongoing basis. These data have previously been used by CDC, the National Institute of Justice, and the Department of Defense to understand the prevalence of these types of violence in the general population as well as in the American Indian/Alaska Native population and the military population. In addition to federal use of these data, public use data sets are developed to promote the use of these data by external researchers.


Ongoing surveillance is critical in the further development of prevention and intervention programs to reduce the prevalence and incidence of IPV, SV, and stalking. Stable and precise annual prevalence estimates were produced at the national level in 2011 from the 2010 data. Stable and precise state-level prevalence estimates were also produced in 2011 using the 2010 data and will be available in subsequent years as interviews accrue over time. Currently, for the vast majority of states, the data provided by NISVS is the only population-based information regarding the prevalence of IPV, SV, or stalking.


The need for an ongoing surveillance system is reflected in the fact that, prior to NISVS, the lack of regular, ongoing surveillance using uniform definitions and consistent survey methods over time made it nearly impossible to evaluate trends in IPV, SV, and stalking. The lack of comparable state-specific prevalence data has limited the ability of national and state public health officials to measure the impact of IPV, SV, and stalking in individual states. Improved surveillance helps guide the most effective use of limited prevention resources. More detailed and frequent information informs intervention and prevention strategies at both the national and state levels.


Documenting and monitoring the incidence and prevalence of IPV, SV, and stalking is a critical first step to improving the health status of individuals, making communities safer, and reducing the social and healthcare costs currently burdening state and federal governments and programs. NISVS data helps inform public policies and prevention strategies and helps to guide and evaluate progress towards reducing the substantial health and social burden associated with IPV, SV, and stalking.


The change in this request is to fully implement the streamlined, revised NISVS instrument for full national-level data collection. The recently developed instrument, evaluated via cognitive testing, is used for this full implementation.


A.3. Use of Improved Information Technology and Burden Reduction


All interviews have been conducted over the telephone, using computer-assisted telephone interviewing (CATI) software. The use of CATI reduces respondent burden, reduces coding errors, and increases efficiency and data quality. The CATI program involves a computer-based sample management and reporting system that incorporates sample information, creates an automatic record of all dialings, tracks the outcome of each interviewing attempt, documents sources of ineligibility, records the reasons for refusals, and records the point of any mid-questionnaire termination.


The CATI system also includes the actual interview program (including the question text, response options, interviewer instructions, and interviewer probes). The CATI’s data quality and control program includes skip patterns, rotations, range checks and other on-line consistency checks and procedures during the interview, assuring that only relevant and applicable questions are asked of each respondent. Data collection and data entry occur simultaneously with the CATI data entry system. The quality of the data is also improved because the CATI system automatically detects errors and ensures that there is no variation in the order in which questions are asked. Data can be extracted and analyzed using existing statistical packages directly from the system, which significantly decreases the amount of time required to process, analyze, and report the data.
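For illustration only, the sketch below shows the general kind of range check and skip-pattern logic a CATI program enforces; the question names, codes, and routing are hypothetical and do not reproduce the actual NISVS program.

```python
# Illustrative sketch of CATI-style range checks and skip patterns; question
# names and response codes are hypothetical, not the actual NISVS instrument.
def range_check(value, low, high):
    """Reject out-of-range entries at keying time."""
    return low <= value <= high

def next_question(responses):
    """Skip pattern: route the interview so only applicable items are asked."""
    if not range_check(responses.get("age", -1), 18, 120):
        return "END_INELIGIBLE"          # screener ends for under-18 or invalid age
    if responses.get("screener_item") == "NO":
        return "NEXT_MODULE"             # skip follow-up detail items
    return "FOLLOW_UP_DETAIL"

# Example: an eligible respondent who endorsed a screening item.
print(next_question({"age": 34, "screener_item": "YES"}))   # FOLLOW_UP_DETAIL
```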


A.4. Efforts to Identify Duplication and Use of Similar Information


Prior to NISVS, the most recent national health survey on IPV, SV, and stalking (the National Violence Against Women Survey, NVAWS) was completed in 1995, more than a decade ago (Tjaden and Thoennes, 1998). Prior to NVAWS, there had been no similar national health surveys with a specific focus on IPV, SV, and stalking (which are also the types of outcomes that are least likely to be disclosed in crime surveys).


When NISVS was originally designed, CDC consulted with other federal agencies (e.g., National Institute of Justice, Department of Defense) and other leading experts and stakeholders in the fields of IPV, SV, and stalking. NCIPC convened a workshop “Building Data Systems for Monitoring and Responding to Violence Against Women” (CDC, 2000). Recommendations provided by those in attendance are reflected in the design of NISVS.


As discussed in the Data Systems workshop, surveys that ask behaviorally specific questions couched in a public health context have much higher levels of disclosure than those couched within a crime context (as in the National Crime Victimization Survey (NCVS) conducted by the Bureau of Justice Statistics). With its public health focus, NISVS also examines the health impacts of victimization and collects information about the consequences and needs of victims that can help guide violence prevention efforts. NISVS interviews are initiated by asking basic health and lifestyle questions to establish rapport with the interviewee and set a health context rather than a context of crimes and criminal events. In addition, NISVS increases disclosure through the use of multiple behaviorally specific questions (e.g., not asking about rape, but asking about unwanted or forced sex). NISVS also gathers more detailed information (compared to the NCVS or other surveys) on the full range of behaviors that victims of intimate partner violence, sexual violence, and stalking experience, including forced sex, coercive sex, alcohol- or drug-facilitated sex, being made to sexually penetrate another person, non-contact sexual violence, physical violence by intimates, and technology-assisted stalking (e.g., cell phone, Facebook). Information is also gathered with respect to frequency, time frame, age at first victimization, relationship to perpetrator(s), impact of abuse, and service use. Unlike NCVS, NISVS provides both 12-month and lifetime prevalence estimates and can be used to generate national and state-specific estimates. Respondents to NCVS are recruited and first interviewed in their homes, and all members of the household age 12 or older are recruited and asked the victimization questions. In NISVS, only a single randomly selected adult respondent is aware of the violence content of the interview. This is done to avoid a situation in which the respondent could be asked about their answers by other members of the household who could be the perpetrators of violence. The NISVS interviewers work to reduce the risk of retaliation from perpetrators and to enhance respondents' comfort with disclosing victimization by ensuring that the respondent feels safe before asking the violence questions.


Despite its numerous strengths relative to other systems, NISVS also has limitations. The system uses a random digit dial (RDD) telephone survey methodology, and response rates for RDD surveys have been declining. However, the cooperation rate among those who are reached in NISVS is consistently high. Also, although NISVS captures a broad range of self-reported victimization experiences and the estimates are considerably higher than those from crime surveys, it is likely that the results still underestimate the true prevalence of sexual violence, stalking, and intimate partner violence. Victims who are involved in violent relationships or who have recently experienced severe forms of violence might be less likely to participate in surveys or might not be willing to disclose their experiences because of unresolved emotional trauma or concern for their safety, among other reasons. This is one of the reasons why NISVS also collects data on lifetime victimization. Victims are often more willing to disclose victimization that happened years ago than victimization that is more recent or ongoing. In addition, telephone surveys such as NISVS may be less likely to include some populations that could be at higher risk for victimization (e.g., persons living in nursing homes, prisons, or shelters, or those who are homeless).


In our ongoing assessment of NISVS, CDC is working closely with the Bureau of Justice Statistics to discuss the fit between NISVS and NCVS, including demonstrating the ways that these systems provide unique yet complementary data on victimization, and exploring options for collaborative, continued enhancement of both systems. CDC and BJS participate in regular meetings to discuss lessons learned and implications for continued improvement of the systems.




Although the Behavioral Risk Factor Surveillance System (BRFSS) included optional IPV and SV modules in 2005, 2006, and 2007, fewer than half of the states administered the modules during any one year. Furthermore, the information collected in the optional modules was limited to a small number of relatively simple IPV (n=7) and SV (n=8) questions covering only physical and sexual violence. Because of time constraints, no information was collected on stalking or psychological abuse by an intimate partner. In addition, only one question provided information on the impact of the violence that occurred: "were you injured during the most recent event?"


The BRFSS SV and IPV modules have provided useful, albeit limited, information to participating states regarding their prevalence of IPV and SV. Because consistent survey methods were used, participating states were able to make comparisons between their state and other states that administered the module (Breiding, Black, & Ryan, 2008). Except for NISVS, no other consistently collected state level data using similar questions and survey methods currently exist. An additional concern is that neither all states nor a statistically representative set of states collected IPV or SV data during the years that funding was available (2005, 2006, and 2007). Only three states have SV data across all three years and only five states have IPV data across all three years in which the optional module was offered. Because financial support from the Division of Violence Prevention no longer exists for the optional modules, few (if any) states continue to collect IPV or SV data. Thus, the BRFSS does not provide national estimates of IPV or SV. Furthermore, to adequately monitor and evaluate trends, data must be collected more frequently, across all states, using consistent surveillance methods.




Currently, in efforts to comply with OMB's terms of clearance for 2014, CDC is preparing to convene a work group composed of experts in survey methodology and representatives from other federal agencies such as NCHS and BJS. OMB will also be invited to attend all work group meetings. This work group will provide feedback and input on how to improve both survey design (methods, sampling frame, recruitment, mode of administration) and content/question wording, with the goal of increasing response rates, reducing non-response bias, and maximizing the opportunities across Federal surveys for covering populations of interest. The work group will begin meeting in 2016, with the first meeting in October and the second in November. Subsequent meetings will follow each month through April 2017 (Attachment L). CDC has collaborated with BJS on agenda topics and invited participants.




A.5. Impact on Small Businesses or Other Small Entities


No small businesses will be involved in this data collection.


A.6. Consequences of Collecting the Information Less Frequently


The proposed reduction in the frequency of data collection (from annual to biennial) is designed to provide a far larger sample size in a shorter period of time while allowing more time to produce and release data products that will further inform prevention. In addition, the compressed biennial schedule will increase the statistical precision of the IPV, SV, and stalking prevalence estimates provided by NISVS and provide more statistical power to detect and characterize rare but pivotal experiences. This change in frequency will continue to allow us to evaluate the effectiveness of prevention programs directed at these types of violence on a national scale; such evaluation is contingent upon obtaining data that can provide solid information about changes in trends over time.
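As a rough, back-of-envelope illustration of the precision gain (assuming simple random sampling; the realized gain depends on the design effect of the dual-frame RDD sample):

```python
# Back-of-envelope precision gain from doubling the number of completed
# interviews (simple random sampling approximation; the realized gain depends
# on the design effect of the dual-frame RDD sample).
import math
se_ratio = math.sqrt(12_500 / 25_000)   # SE_new / SE_old
print(f"Standard errors shrink to about {se_ratio:.0%} of their former size")   # ~71%
```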


A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)


The request fully complies with the regulation 5 CFR 1320.5.


A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


A.8.a) Federal Register Notice

A 60-day Federal Register Notice was published in the Federal Register on May 19, 2015 (Vol. 80, No. 96, pp. 28618-28619) (Attachment B). No comments were received in response to the 60-day Federal Register Notice.


A.8.b) Efforts to Consult Outside the Agency

In the past, CDC participated in a monthly conference call involving federal researchers involved in the study of violence against women (documentation included in Attachment C).



NCIPC invited a panel of experts to attend a meeting in November 2007 to discuss preliminary findings from the 2007 methodological study and to discuss the planned directions for NISVS. The review panel consisted of federal and non-federal subject matter experts with expertise in IPV, SV, and stalking. Attachment K provides a list of those individuals who participated in the meeting and provided input to the redevelopment of the survey during monthly conference calls in 2008.



In 2008, staff within the Departments of Justice and Defense served as technical reviewers for the proposals submitted in response to CDC’s Funding Opportunity Announcement for NISVS. As part of the review team, they participated in the selection of the contractor to do the work and approved the proposed statement of work. DOJ and DoD were also integrally involved in the design of the interview instrument as described below (and see interagency agreement included in Attachment C). As described in Section A.4, CDC worked closely with DoD, NIJ, and other federal agencies in the development of the survey (NISVS). Documentation providing an example of the consultations between CDC, DoD, and DOJ/NIJ regarding NISVS is also included in Attachment C. In addition, CDC staff remain engaged in ongoing discussions with Federal colleagues from DOD related to the collection of special population data from military personnel. Numerous presentations were made in 2008, 2009 and 2010 to vet the proposed NISVS among a range of interested stakeholders, including victim advocates, family advocacy programs, Title IX Task Force authorized under the 2005 VAWA, and a number of other conferences and public meetings.



For the current revision, throughout 2015 NCIPC staff actively engaged NCIPC's RPE and DELTA program grantees and other stakeholders to obtain feedback regarding the processes implemented to enhance the ability of NISVS to provide timely data that are more easily accessed and used by those groups that have the greatest potential to take actions that can prevent IPV, SV, and stalking, particularly grantees and state-level prevention partners.


Lastly, in 2014 and 2015, staff within the Department of Defense collaborated with CDC in the development, review and approval of the proposed statement of work for the currently active data collection contract. The DoD participated in the kick-off meeting with the new contractor and collaboration between CDC and DoD will continue throughout the duration of the Interagency Agreement (IAA) initiated to facilitate collection of military subpopulation data during 2016.



A.9. Explanation of Any Payment or Gift to Respondents


Financial incentives can help gain cooperation with fewer calls, which can make their use cost effective (Armstrong, 1975; Yu and Cooper, 1983; Church, 1993; Singer, 2002; Cantor, O'Hare, and O'Connor, 2007). Incentives have also been found to be effective in increasing response rates in random digit dial (RDD) telephone surveys (e.g., Cantor, Wang, and Abi-Habib 2003), as well as in reducing nonresponse bias by gaining cooperation from those less interested in the topic (e.g., Groves et al. 2006; Groves, Singer, and Corning 2000). Increasing the response rate also increases the likelihood that the information provided by survey participants is representative of the sample and maximizes the utility of all information provided by study participants.


Thus, implementing an incentive plan can be a cost-effective way for surveys to improve response rates and lower refusal rates, and could, over the course of data collection, actually reduce costs and burden to respondents by reducing the need for additional calls to potential respondents. In addition, NISVS contains a series of sensitive questions regarding respondents' lifetime experiences of sexual violence, intimate partner violence, and stalking victimization. Given the sensitive nature of these topics and the difficulty of obtaining acceptable response rates in RDD telephone surveys, a substantially higher incentive is required in an attempt to reduce non-response bias and to increase the response rate. The incentive structure proposed in this request is exactly the same as the one used in previously approved information collection requests (OMB# 0920-0822) for 2010, 2011, 2012, and 2015, with the exception that respondents will no longer be allowed to donate their incentives to charity.


Since its origin, NISVS has employed a two-phase survey design with Phase 1 being the main data collection period and Phase 2 specifically targeted at increasing response rates and reducing nonresponse bias. During Phase 1, all respondents are offered a $10 incentive to complete the survey.


Upon completion of the first phase, a random subsample of non-respondents from the main data collection period is drawn (Phase 2). The subsampling rate of non-respondents for Phase 2 is approximately 0.40. Respondents in Phase 2 are re-contacted and offered a higher incentive of $40 to encourage their participation.
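For illustration only, the arithmetic below sketches the two-phase subsampling; the counts are hypothetical, and weighting Phase 2 completes by the inverse of the subsampling rate is a standard double-sampling assumption rather than the documented NISVS weighting specification.

```python
# Hypothetical two-phase counts; the 0.40 subsampling rate comes from the text,
# while weighting Phase 2 completes by 1 / rate is a standard double-sampling
# assumption, not the documented NISVS weighting specification.
phase1_nonrespondents = 10_000                                  # hypothetical count
subsampling_rate = 0.40                                         # Phase 2 subsample of non-respondents
phase2_recontacted = phase1_nonrespondents * subsampling_rate   # 4,000 re-contacted
phase2_weight_factor = 1 / subsampling_rate                     # each Phase 2 complete represents ~2.5 non-respondents
print(int(phase2_recontacted), phase2_weight_factor)            # 4000 2.5
```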


In a previous NISVS data collection cycle, respondents in Phase 2 were randomly assigned to receive incentive amounts of either $25 or $40 in order to determine the impact the lower amount could have on the response rate. It was found that decreasing the amount from $40 to $25 during Phase 2 decreased the response rate by 17% for landlines and 7% for cell phones. It appears that a decrease in the amount offered would negatively impact the response rate.



Maintaining the two-phase survey design with the current incentive structure will allow for consistency across years of data collection. Such consistency will permit tracking of changes in these types of violence over time. Methodological changes that impact the sample could call into question our ability to make comparisons with earlier national and state-level prevalence estimates.

Active duty military members participating in the NISVS survey will not receive any incentive for their participation due to DoD's policies on the use of incentives in government-funded information collections. However, wives of active duty men will be eligible for the same incentives as the general population.


A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents


The CDC Office of the Chief Information Officer has determined that the Privacy Act does apply. The applicable System of Records Notice (SORN) is 0920-0136, Epidemiologic Studies and Surveillance of Disease Problems, published in the Federal Register on December 31, 1992 (Volume 57, Number 252, Pages 62812-62813). The Privacy Impact Assessment (PIA) is attached (Attachment G).


At no time will CDC have access to or receive potentially identifiable information. During data collection, the contractor collects names and addresses of those respondents who wish to be mailed a promised incentive. At no time is this information linked or linkable to survey information. Only limited demographic information is requested (e.g., race, zip code, year of birth). Once an interview is completed, the telephone number is eliminated from the database in an overnight batch process.


The data are collected anonymously. The measures used to ensure confidentiality in the approved IRB protocol (Attachment D) closely follow those of the previously IRB- and OMB-approved National Intimate Partner and Sexual Violence Survey (NISVS) (OMB# 0920-0822).


During the verbal informed consent process and throughout the interviews the respondents are informed that their participation is completely voluntary and reminded that they can stop the interview at any time. They are also informed and reminded that they can skip any question that they do not want to answer (Attachment E).


Following recommended guidelines (Sullivan & Cain, 2004; WHO, 2001) a graduated verbal informed consent protocol is used. Specifically, to ensure respondent safety and privacy, the initial person who answers the telephone is provided general non-specific information about the survey topic. The specific topic of the survey is only revealed to the individual respondent selected. After a single adult respondent in the household is randomly selected to participate, the interviewer administers the IRB-approved verbal informed consent, which provides information on the voluntary and confidential nature of the survey, the benefits and risks of participation, the survey topic and the telephone numbers to speak with staff from the CDC or project staff from the contractor (Attachment E). Potential respondents are informed 1) of the purpose for the data collection; 2) that their data will be treated in a secure manner and will not be disclosed; and 3) that all information collected will be pooled with responses from other participants. Literature regarding the ethical and safe collection of research data on IPV offers many reasons for obtaining verbal informed consent in a graduated manner (WHO, 1993; Sullivan & Cain, 2004). In addition to safety and ethical considerations, a graduated consent process allows the interviewer to build rapport and increases the likelihood of gaining the participant's trust, the key to minimizing non-participation and under-reporting. Carefully conducted studies with well-trained interviewers who are able to build rapport and trust with potential participants are essential both to the collection of valid data and the well-being of respondents.


All data will be maintained in a secure manner throughout the data collection and data processing phases in accordance with NIST standards and OCISO requirements. Only contractor personnel who are conducting the study will have study-specific access to the temporary information that could potentially be used to identify a respondent (i.e., the telephone number and address). All project staff have signed the project-specific security agreement (Attachment F). While under review, data will reside in directories to which only the project director can grant access. All computers will reside in a building with electronic security and are ID and password protected.

Although some sensitive questions on social behaviors and victimization are asked using an RDD telephone survey, only respondents' first name or initials are used for the interview process. The name "resident" is used to send the advance informational letter prior to the interview, and the incentive check is addressed as the respondent specifies after his/her participation. To maximize human subject protection, the letter has been carefully written to provide only general information about the survey. The lack of detailed study information in the advance letter is intentional for the protection of the prospective study participant. If the prospective study participant is in a relationship where IPV is present, we do not want the advance letter to raise suspicion or incite potential perpetrators.

Upon completion of the survey, respondents may choose to receive or waive receipt of an incentive check. If the respondent does choose to receive the incentive, it is sent to their specified mailing address. Following survey completion, the interviewer asks for the respondent’s name and mailing address. The respondent is informed that this information is being collected for the sole purpose of sending the incentive and that it will not be stored with their survey responses (Attachment E). If the respondent is not comfortable giving this information to the interviewer, the interviewer then offers to have the respondent give the information to her supervisor. If the interviewer thinks that further reassurance is needed, she can offer that her supervisor will not know how the respondent answered any of the questions. If the respondent is still not comfortable with giving their contact information to a call center supervisor, the interviewer will offer to transfer the respondent to a voice mail box to leave their information. The toll-free project hotline number is also offered to respondents so they can call if they experience problems leaving their information.

The mailing contact information is initially recorded in the case management database, a database separate from the survey data. The phone number, address, and name information are subsequently removed from the database during an overnight batch process. By utilizing a two-step process, identifying information that is potentially linkable is removed quickly and respondent privacy is maintained.

The contractor has procedures in place to protect against data loss and down time in the event of equipment failure. These include regularly scheduled backup of data, redundant services in case of server failure, and uninterruptible power supplies to bridge a temporary loss of power. Under normal operating conditions, a complete backup of all files on every disk is written to tape weekly. Every business day, a differential backup is performed of all files created or modified since the last complete backup. In the event of a hardware or software failure, files can be restored to their status as of the time of the last differential backup, usually the evening of the previous business day. Tapes from complete backups are kept for approximately 3 months. Tapes or CD-R drives are used for long-term data archiving. Several additional measures have been implemented to ensure data security. The CATI system includes a compartmentalized data structure, in which personally identifying information is maintained separately from the actual questionnaire responses. Once an individual has completed his/her survey, all identifying information, including first name and telephone number, is transferred to an Excel file, stripped from the data files, and destroyed in an overnight batch process. These measures safeguard the privacy of participants: once an interview has been completed, it does not have any personal identifiers.


Before any data are released (e.g., in disseminated reports), all demographic information that could potentially lead to identification of an individual is stripped and destroyed. The database is configured so that it is not possible to retrieve individual responses or potentially identifying information.



A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions


IRB Approval


CDC’s IRB has deferred to the contractor’s IRB. The IRB approval obtained through the study contractor is presented in Attachment D. CDC will not have contact with study participants, nor will CDC have access to PII.


Justification for Sensitive Questions


Because very few people report IPV, SV, or stalking to officials and very few injuries are reported to health care providers, survey data provide the best source of information regarding the prevalence of IPV, SV, and stalking. Until recently, questions about IPV, SV, and stalking were considered by some to be "too sensitive" to ask in an RDD telephone survey. However, CDC evaluated respondent reactions to questions about violence in three large telephone surveys: 1) National and State Surveys on Violence Against Women and the Evaluation of Measurement Tools for IPV (OMB # 0990-0115); 2) Injury Control and Risk Survey (ICARIS-2 Phase 2) (OMB # 0920-0513); and 3) National Intimate Partner and Sexual Violence Survey (NISVS) (OMB # 0920-0724). Findings from these evaluations are published in the Violence and Victims article by Black, Kresnow, Simon, Arias, and Shelley (2006), "Telephone Survey Respondents' Reactions to Questions Regarding Interpersonal Violence."



In all three surveys, it was consistently found that between 88.0% and 98.4% of participants felt such questions should be asked, regardless of their experience with or their history of interpersonal violence. Victims were as likely as non-victims to believe that such questions should be asked. In addition, responses were consistent, regardless of the respondent’s victimization experience; those with different types of victimizations, those victimized within the past 12 months, and those victimized by an intimate partner all reported that the questions should be asked. Importantly, even among victims who reported that being asked these questions made them feel upset or afraid, the majority felt that such questions should be asked in a telephone survey.


These findings provide important information for researchers and offer some assurance to those concerned with the ethical collection of data on victimization (Black and Black, 2007).


Still, it is critical that respondent safety remains the primary concern for any data collection asking about violence, particularly IPV, SV, and stalking. Such measures have been well described (Sullivan & Cain, 2004) and are addressed in the interviewer training.


Additional information regarding the potential benefits of participation was gathered in the National Intimate Partner and Sexual Violence Survey (NISVS) conducted in early 2007 (OMB # 0920-0724). The overall purpose of the 2007 study was to evaluate several methodological issues and to inform the design of NISVS. One of the issues evaluated was the degree to which respondents reported experiencing benefits as a result of participation. More than 70% of respondents reported that they gained something positive from participating (National Intimate Partner and Sexual Violence Survey (NISVS), unpublished data). Nearly 70% reported that they felt someone cared about issues that were important to them, and over 90% reported the perceived benefit of helping others (National Intimate Partner and Sexual Violence Survey (NISVS), unpublished data). When researchers focus solely on the potential for negative impact, such perceived positive responses to participation by respondents may often be overlooked.


Attachment E contains the NISVS survey instrument and associated supporting materials. Questions included in NISVS are closely modeled after questions that were used in the NVAWS, the National Intimate Partner and Sexual Violence Survey (NISVS), or other studies regarding IPV, SV, and stalking.


A.12.a) Estimated Annualized Burden Hours


There are two types of households included in the burden table: the non-participating households that are screened and are not eligible or do not wish to participate, and the households that are eligible and agree to participate. The estimated number of non-participating screened households is 170,000. It will take approximately 3 minutes to determine their eligibility and participation status. The total burden for this group is estimated to be 8,500 hours.


The number of participating households will be 25,000 over a 12-month data collection period. It is anticipated that most respondents will take approximately 25 minutes to complete the survey, including reviewing instructions. We estimate the total burden for this group to be 10,416 hours.


The total burden for the general population data collection is estimated at 18,916 hours. This is derived from the total burden hours for non-participating households and eligible households, based on an average response of 3 minutes for screened households and 25 minutes for respondents who complete the survey.


In addition to collecting data for the general population, CDC, in collaboration with DoD, will collect 10,800 completed interviews from active duty men and women and wives of active duty men over a 6-month data collection period in 2016. The estimated number of non-participating screened DoD households is 73,800. It will take approximately 3 minutes to determine their eligibility and participation status.


For the non-participating (screened) DoD households, the total burden is estimated to be 3,690 hours.

It is anticipated that most participating DoD respondents will take approximately 25 minutes to complete the survey, including reviewing instructions. We estimate the total burden for this group to be 4,500 hours.



Overall, the annual burden hours for the survey increased by 17,648 hours, from 9,458 hours in 2015 to 27,106 hours for the 2016-2017 data collection period. This increase is due to differences in the number of respondents sought in a data collection cycle (12,500 in 2015 versus 25,000 in 2016-2017) and the inclusion of military respondents (0 in 2015 versus 10,800 in 2016-2017). The revisions to the survey may reduce the average time per response, but to provide a conservative estimate (in case the new questions offset the time savings from the reduced complexity), we have continued to use the same average time per response that we used in previous clearance years.
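The burden-hour figures in Table 1 can be reproduced with the simple arithmetic check below (minutes converted to hours and truncated, matching the table's rounding); the sketch is illustrative only and is not part of the official estimate.

```python
# Arithmetic check of the burden-hour figures in Table 1 (minutes converted to
# hours and truncated, matching the table's rounding).
rows = {
    "Non-participating household (screened)": (170_000, 3),
    "Eligible household (completes survey)": (25_000, 25),
    "Non-participating DoD household (screened)": (73_800, 3),
    "Eligible DoD household (completes survey)": (10_800, 25),
}
total = 0
for label, (respondents, minutes) in rows.items():
    hours = respondents * minutes // 60
    total += hours
    print(f"{label}: {hours:,} hours")
print(f"Total: {total:,} hours")   # 27,106
```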







Table 1. Estimated Annualized Burden Hours


Type of Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden (in hours)

Non-Participating Household (Screened) | NISVS Survey Instrument, first section, non-participating (Att. E - for respondents screened) | 170,000 | 1 | 3/60 | 8,500

Eligible Household (Completes Survey) | NISVS Survey Instrument, section for participating (Att. E - for respondents completing survey) | 25,000 | 1 | 25/60 | 10,416

Non-Participating DoD Household (Screened) | NISVS Survey Instrument, first section, non-participating (Att. E - for military respondents screened) | 73,800 | 1 | 3/60 | 3,690

Eligible DoD Household (Completes Survey) | NISVS Survey Instrument, section for participating (Att. E - for military respondents completing survey) | 10,800 | 1 | 25/60 | 4,500

Total | | | | | 27,106



A.12.b) Estimated Annualized Respondent Burden Costs


For the general population, the annual burden cost is estimated to be $401,601 for 25,000 completed interviews. This estimate is based on 170,000 households being screened for an eligible respondent aged 18 or older and 25,000 eligible households completing the survey.


For the DoD population, the annual burden cost is estimated to be $173,874 for 10,800 completed interviews. This estimate is based on 73,800 military households being screened for an eligible respondent aged 18 or older and 10,800 eligible DoD households completing the survey.


The estimates of individual annualized costs are based on the number of respondents interviewed and the amount of time required from individuals who were reached by telephone and agreed to the one-time interview. The average hourly wage was obtained from 2015 U.S. Bureau of Labor Statistics data. It takes up to 3 minutes to determine whether a household is eligible and to complete the verbal informed consent. For those who agree to participate, the total time required is approximately 25 minutes, on average, including screening and verbal informed consent. The average hourly earnings for those in private, non-farm positions are $21.23 (http://www.bls.gov/news.release/empsit.t24.htm).
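The cost figures in Table 2 below follow from multiplying each group's burden hours by the $21.23 average hourly wage; the arithmetic check below is illustrative only.

```python
# Arithmetic check of the respondent cost figures in Table 2
# (burden hours for each group multiplied by the $21.23 average hourly wage).
wage = 21.23
burden_hours = {
    "Non-participating household (screened)": 170_000 * 3 / 60,     # 8,500 hours
    "Eligible household (completes survey)": 25_000 * 25 / 60,      # ~10,417 hours
    "Non-participating DoD household (screened)": 73_800 * 3 / 60,  # 3,690 hours
    "Eligible DoD household (completes survey)": 10_800 * 25 / 60,  # 4,500 hours
}
total = 0.0
for label, hours in burden_hours.items():
    cost = hours * wage
    total += cost
    print(f"{label}: ${cost:,.0f}")
print(f"Total: ${total:,.0f}")   # ~$575,475
```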


Table 2. Estimated Annualized Burden Costs



Type of Respondent | No. of Respondents | Hourly Wage Rate (in dollars) | Total Respondent Cost

Non-Participating Household (Screened) | 170,000 | $21.23 | $180,455

Eligible Household (Completes Survey) | 25,000 | $21.23 | $221,146

Non-Participating DoD Household (Screened) | 73,800 | $21.23 | $78,339

Eligible DoD Household (Completes Survey) | 10,800 | $21.23 | $95,535

Total | | | $575,475

























A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers


This data collection activity does not include any other annual cost burden to respondents, nor to any record keepers.


A.14. Annualized Cost to the Government


The contract to conduct the survey was awarded to RTI International through competitive bid in October of 2015. The total cost for the base year of the contract is $4,895,472, including $4,430,619 in contractor costs and $464,853 in annual costs incurred directly by the federal government (Table 3).


Costs for this study include personnel for designing the study; developing, programming, and testing the survey instrument; drawing the sample; training the recruiters/interviewers; collecting and analyzing the data; and reporting the study results. The government costs include personnel costs for federal staff involved in the oversight, study design, and analysis, as presented in detail in Table 3.


Table 3. Estimated Annualized Cost to the Government

Government Statistician (2 FTEs), Annual Cost: $290,473
• Project oversight, study and survey design, sample selection, data analysis, and consultation.
• Review of and input into all statistical aspects of the study design and conduct, including but not limited to study design, sample selection, weighting, total survey error, non-response bias, and response rate.
• Survey instrument testing, data analysis and consultation; oversight of the QA process.

Government Computer Programmer (.5 FTE), Annual Cost: $72,498
• Process data; produce code for complex quality assurance checks.

Government Data Manager (.5 FTE), Annual Cost: $36,869
• Data storage, documentation, quality assurance checking, and reporting.
• Suggests timetables associated with the data collection and analysis plan.
• Collaborates with investigators to write plans pertaining to the design of data collection and analysis.
• Develops plans to ensure quality control of data collection and analysis processes.

Government Behavioral Scientist (1.6 FTEs), Annual Cost: $254,500
• Project oversight, study and survey design, sample selection, data analysis, and consultation.
• Discusses different data collection methods and statistical approaches.
• Applies theories of psychology, sociology, and other behavioral sciences to the development of data collection instruments and methodological approaches.
• Designs tools and materials for data collection.
• Communicates research findings to professional audiences and agency staff using appropriate methods (e.g., manuscripts, peer-reviewed journals, conferences).

Government Epidemiologist (.9 FTE), Annual Cost: $110,600
• Describes sources, quality, and limitations of surveillance data.
• Defines and monitors surveillance system parameters (e.g., timeliness, frequency).
• Defines the functional requirements of the supporting information system.
• Tests data collection, data storage, and analytical methods.
• Evaluates surveillance systems using national guidance and methods.
• Recommends and implements modifications to surveillance systems on the basis of an evaluation.
• Communicates research findings to professional audiences and agency staff using appropriate methods (e.g., reports, manuscripts, peer-reviewed journals, conferences).

Government Public Health Advisor (.4 FTE), Annual Cost: $65,013
• Project management, including oversight of budget and administration.
• Applies knowledge of the acquisition and grants lifecycle.
• Manages and monitors the implementation of interagency agreements and contracts.
• Applies methods and procedures for funding acquisitions.

Subtotal, Government Personnel: $464,853

Contracted Personnel and Services1 (study design, interviewer/recruiter training, data collection and analysis): $4,430,619

Total: $4,895,472


1Contracted personnel and services cost estimates are based on estimated funds available during the base year (18 months, October 2015 – December 2016). The contract is funded for multiple years, with data collected on a biennial basis. The total contract amount is anticipated to be $24,878,242. The government expects that this task order will be incrementally funded; based upon satisfactory performance and availability of funds, the contract may be renewed for the third option year. Funds in the amount of $1,510,316 were transferred to the CDC budget from the DoD for the base year.


A.15. Explanation for Program Changes or Adjustments


CDC requests a Revision for an additional 2 years to implement use of a newly revised instrument in the NISVS data collection cycle, to accommodate collection of military population data on behalf of the Department of Defense, and to fully enact changes in the administration of the NISVS instrument. These changes are explained in Attachment J. The same survey instrument is used for the general population and the military population; the only differences are that the military-specific survey contains a small number of additional questions collecting data on aspects of military deployment for active duty respondents and does not include the item on HIV/AIDS status.



A.16. Plans for Tabulation and Publication, and Project Time Schedule

Table 4. Data Collection & Report Generation Time Schedule


Data Collection Period | Activities | Time Schedule
One | Letters sent to respondents | Beginning 5 weeks after OMB approval
One | Initiate telephone contact | Beginning 5 weeks after OMB approval
One | Clean and edit 1st period data set | Beginning six months after telephone contacts are initiated
Two | Initiate telephone contact and data collection | Beginning six months after the start of data collection period one
Two | Clean and edit 2nd period data set | Beginning six months after initiation of data collection period two
Two | Conduct analyses | Beginning six months after initiation of cleaning and editing of the period two data set
Two | Prepare and distribute reports | Beginning one year after initiation of analyses



To determine the prevalence of IPV, SV, and stalking among women and men, bivariate analyses are conducted using SUDAAN, version 9.0. Weighted estimates of 12-month and lifetime victimization prevalence are calculated annually. Separate estimates have been produced for population subgroups (e.g., by sex, race/ethnicity, sexual orientation, and age group) and will continue to be produced on a regular basis. Chi-square tests on weighted percentages have been, and will continue to be, performed to formally test for statistically significant differences between proportions. Additional multivariable logistic regression analyses have been used to adjust the data and to further evaluate associations between the outcomes and potential risk factors.
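
As a simplified illustration of the weighted-prevalence calculation described above, the Python sketch below computes a weighted proportion from hypothetical victimization indicators and survey weights. The production estimates are generated in SUDAAN, version 9.0, with design-based variance estimation; this sketch shows only the basic weighted-proportion step, and all names and values in it are hypothetical.

def weighted_prevalence(indicators, weights):
    """Weighted proportion: weighted count of cases divided by the sum of all weights."""
    weighted_cases = sum(w for y, w in zip(indicators, weights) if y == 1)
    return weighted_cases / sum(weights)

# Hypothetical respondents: 1 = reported 12-month victimization, 0 = did not.
indicators = [1, 0, 0, 1, 0, 0, 0, 1]
# Hypothetical analysis weights reflecting the dual-frame RDD design.
weights = [1200.0, 950.0, 1100.0, 875.0, 1300.0, 990.0, 1050.0, 1025.0]

print(f"Weighted 12-month prevalence: {weighted_prevalence(indicators, weights):.1%}")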


Data from each biennial data collection will be stored in password-protected files. Various summary and special topic reports will be distributed to stakeholders. Public use data sets will also be made available to state and national researchers and practitioners.



A.17. Reason(s) Display of OMB Expiration Date is Inappropriate


The display of the OMB expiration date is not inappropriate.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions


There are no exceptions to the certification.


REFERENCES


American Association for Public Opinion Research (2008). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 5th edition. Lenexa, Kansas: AAPOR.

Armstrong, J.S. (1975). Monetary Incentives in Mail Surveys. Public Opinion Quarterly, 39, 111-116.


Bachar K, Koss MP. (2001). From prevalence to prevention: Closing the gap between what we know about rape and what we do. In: Renzetti C, Edleson J, Bergen RK, editors. Sourcebook on Violence Against Women. Thousand Oaks (CA): Sage Publications.


Basile KC, Black MC, Simon TR, Arias I, Brener ND & Saltzman LE. (2006). The Association between self reported lifetime history of forced sexual intercourse and recent health risk behaviors: findings from the 2003 National Youth Risk Behavior Survey. Journal of Adolescent Health, 39, 752.


Basile KC, Chen J, Black MC, & Saltzman LE. (2007). Prevalence and characteristics of sexual violence victimization among U.S. Adults 2001-2003. Violence and Victims, 22, 437-448.


Basile KC & Saltzman LE. (2002). Sexual violence surveillance: uniform definitions and recommended data elements. Version 1.0. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control.


Basile KC, Swahn MH, Chen J & Saltzman LE. (2006). Stalking in the United States: Recent National Prevalence Estimates. American Journal of Preventive Medicine, 31, 172-175.


Behavioral Risk Factor Surveillance System Summary Data Quality Report: http://www.cdc.gov/brfss/technical_infodata/pdf/2002SummaryDataQualityReport.pdf


Black, M C, Kresnow, M J, Simon, T R. Arias, I, & Shelley, G. (2006). Telephone survey respondents' reactions to questions regarding interpersonal violence. Violence and Victims, 21, 445-459.


Black MC & Black RS. (2007). A public health perspective on the ethics of asking and not asking about abuse. American Psychologist, 62, 328.


Black MC & Breiding, MJ. (2008) Adverse health conditions and health risk behaviors associated with intimate partner violence – United States, 2005. MMWR, 57, 113-117.



Blumberg S J & Luke JV. (2008). Wireless Substitution: Early Release of Estimates Based on Data from the National Health Interview Survey, July-December 2007. Retrieved May 13, 2008, from http://www.cdc.gov/nchs/nhis.htm.


Blumberg, S. J., & Luke, J. V. (2012). Wireless Substitution: Early Release of Estimates Based on Data from the National Health Interview Survey, July-December 2011. Retrieved June 28, 2012, from http://www.cdc.gov/nchs/nhis.htm



Bonomi AE, Thompson RS & Anderson ML. (2006). Intimate partner violence and women’s physical, mental, and social functioning. Am J Prev Med, 30, 458-466.


Breiding MJ, Black MC & Ryan GW. (2008). Prevalence and risk factors of intimate partner violence in Eighteen U.S. States/Territories, 2005. American Journal of Preventive Medicine, 34, 112-118.


Brick, J. M., Cervantes, I. F., Lee, S., & Norman, G. (2011). Nonsampling errors in dual frame telephone surveys. Survey Methodology, 37(1), 1-12.


Brush LD. (1990). Violent acts and injurious outcomes in married couples: methodological issues in the National Survey of Families and Households. Gender and Society, 4, 56-67.


Caetano R & Cunradi C. (2003). Intimate partner violence and depression among whites, blacks, and Hispanics. Annals of Epidemiology, 13, 661–5.


Campbell J, Sullivan CM & Davidson WD. (1995). Women who use domestic violence shelters: changes in depression over time. Psychology of Women Quarterly 19, 237-55.


Campbell JC. (2002). Health consequences of intimate partner violence. Lancet, 359, 1331–6.


Cantor D, O’Hare, BC & O’Connor KS. (2007). The Use of Monetary Incentives to Reduce Non-Response in Random Digit Dial Telephone Surveys. Pp. 471-498 in Advances in Telephone Survey Methodology, edited by J.M. Lepkowski, C. Tucker, J.M. Brick, E. de Leeuw, L. Japec, P.J. Lavrakas, M.W. Link, and R.L. Sangester. New York: Wiley.


Cantor D, Wang K & Abi-Habib N. (2003). Comparing Promised and Pre-Paid Incentives for an Extended Interview on a Random Digit Dial Survey. Proceedings of the Survey Research Methods Section of the ASA.


Centers for Disease Control and Prevention (CDC). (2000). Building data systems for monitoring and responding to violence against women: recommendations from a workshop. MMWR, 49(RR-11).


Church AH. (1993). Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly, 57, 62-79.


Coker AL, Smith PH, Bethea L, King MR & McKeown RE. (2000). Physical health consequences of physical and psychological intimate partner violence. Archives of Family Medicine, 9, 451-7.


Corso PS, Mercy JA, Simon TR, Finkelstein EA & Miller TR. (2007). Medical Costs and Productivity Losses Due to Interpersonal and Self-Directed Violence in the United States. American Journal of Preventive Medicine, 32, 474-482.

Crowell NA, Burgess AW, eds. (1996). Understanding Violence Against Women. Washington, DC: National Academy Press.


Dailey R & Claus RE. (2001). The relationship between interviewer characteristics and physical and sexual abuse disclosures among substance users: A multilevel analysis. Journal of Drug Issues, 31, 867-88.


Defense Manpower Data Center. (2008). “August 2007 Status of Services Survey of Active Duty Members: Tabulations and Responses.” DMDC Report No. 2007–049.


Deming W E. (1953). On a Probability Mechanism to Attain an Economic Balance between the Resultant Error of Nonresponse and the Bias of Nonresponse. Journal of the American Statistical Association, 48, 743-772.


Dillman D. (2000) Mail and Internet Surveys. New York, NY: John Wiley & Sons, Inc.


Evans-Campbell T, Lindhorst T, Huang B & Walters KL. (2006). Interpersonal Violence in the Lives of Urban American Indian and Alaska Native Women: Implications for Health, Mental Health, and Help-Seeking. American Journal of Public Health, 96, 1416-1422.

Fahimi M, Kulp D, & Brick JM. (2008). Bias in List-Assisted 100-Series RDD Sampling. Survey Practice. September 2008.

Fisher BJ. (2004). Measuring Rape Against Women: The Significance of Survey Questions. U.S. Department of Justice.


Fowler Jr FJ & Mangione TW. (1990). Standardized Survey Interviewing. Newbury Park: Sage Publications.


Gelles RJ. (1997). Intimate Violence in Families. 3rd ed. Thousand Oaks (CA): Sage Publications.


Golding JM. (1996). Sexual assault history and limitations in physical functioning in two general population samples. Research in Nursing and Health, 9, 33-44.


Gondolf EW & Heckert DA. (2003). Determinants of women's perceptions of risk in battering relationships. Violence & Victims, 18, 371-386.


Grossman, S. F., & Lundy, M. (2003). Use of domestic violence services across race and ethnicity by women aged 55 and older. Violence Against Women, 9(12), 2003.


Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646-675.


Groves R M, Couper MP, Presser S, Singer E, Tourangeau R, Acosta GP & Nelson L. (2006). Experiments in Producing Nonresponse Bias. Public Opinion Quarterly 70, 720-736.


Groves RM & Heeringa S. (2006). Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs. Journal of the Royal Statistical Society Series A: Statistics in Society 169, 439-457.


Groves R M & McGonagle KA. (2001). A Theory-Guided Interviewer Training Protocol Regarding Survey Participation. Journal of Official Statistics 17, 249-265.


Groves R M, Singer E & Corning A.(2000). Leverage-Saliency Theory of Survey Participation - Description and an Illustration. Public Opinion Quarterly, 64, 299-308.


Heyman RE, Schaffer R, Gimbel C & Kemer-Hoeg S. (1996). A Comparison of the Prevalence of Army and Civilian Spouse Violence. Prepared by Caliber Associates and Behavioral Science Associates for U.S. Army Community and Family Support Center, September, 1996.

Health Information National Trends Study. (http://cancercontrol.cancer.gov/hints/docs/HINTS_refusal_incentive_abstract.pdf ).

Johnson H. (1996). Dangerous Domains: Violence Against Women in Canada. Scarborough, ON: Nelson Canada.


Kaslow N, Thompson MP, Meadows L, Jacobs D, Chance S & Gibb B. (1998). Factors that mediate or moderate the link between partner abuse and suicidal behavior in African American women. Journal of Consulting and Clinical Psychology, 66, 533-40.


Kennedy C. (2007). Constructing Weights for Landline and Cell Phone RDD Surveys. Paper presented at the Annual Meeting of the American Association for Public Opinion Research, May 17-20, Anaheim, CA.


Kessler RC, McGonagle KA, Zhao S, Nelson CB, Hughes M, & Eshleman S. (1994). Lifetime and 12-month prevalence of DSM-III-R psychiatric disorders in the United States: results from the National Comorbidity Survey. Archives of General Psychiatry, 51, 8-19.


Kilpatrick DG, Edmunds CN, Seymour AK. (1992). Rape in America: A Report to the Nation. Arlington, VA: National Victim Center & Medical University of South Carolina.


Kish L. Survey Sampling. John Wiley and Sons, Inc. New York; 1965.


Koss MP, Bailey JA, Yuan NP, Herrera VM & Lichter EL. (2003). Depression and PTSD in survivors of male violence: research and training initiatives to facilitate recovery. Psychology of Women Quarterly, 27, 130–42.


Krug et al., eds. (2002). World Report on Violence and Health. Geneva, World Health Organization; 2002.


Lundy M & Grossman SF. (2004). Elder abuse: spouse/intimate partner abuse and family abuse among elders. Journal of Elder Abuse & Neglect, 16, 85-102.


Malcoe LH, Duran BM & Montgomery JM. (2004). Socioeconomic Disparities in Intimate Partner Violence Against Native American Women: A Cross-Sectional Study. BMC Medicine, 2, 20.

Marshall A, Panuzioa J & Taft CT. (2005). Intimate Partner Violence Among Military Veterans and Active Duty Servicemen. Clinical Psychology Review, 25, 862-876.

Martin SL, Gibbs DA, Johnson RE, Rentz ED, Clinton-Sherrod AM & Hardison J. (In Press). Spouse Abuse and Child Abuse by Army Soldiers. Journal of Family Violence.

Max W, Rice DP, Finkelstein E, Bardwell RA, Leadbetter S. The economic toll of intimate partner violence against women in the United States. Violence Vict. 2004;19(3):259-72.



McCarroll JE, Newby JH, Thayer LE, Norwood AE, Fullerton CS & Ursano RJ. (1999). Reports of Spouse Abuse in the U.S. Army Central Registry (1989-1997). Military Medicine, 164, 77–84.

McCarty C. (2003) Differences in Response Rates Using Most Recent Versus Final Dispositions in Telephone Surveys. Public Opinion Quarterly, 67, 396-406.

Mechanic MB, Uhlmansiek MH, Weaver TL & Resick PA. (2000). The impact of severe stalking experienced by acutely battered women: an examination of violence, psychological symptoms and strategic responding. Violence and Victims, 15, 443–58.

Merrill LL, Newell CE, Milner JS, Koss MP, Hervig LK, Gold SR, Rosswork SG & Thornton SR. (1998). Prevalence of premilitary adult sexual victimization and aggression in a Navy recruit sample. Military Medicine, 163, 209-212.

Mouton CP, Rovi S, Furniss K & Lasser NL. (1999). The associations between health and domestic violence in older women: results of a pilot study. Journal of Women’s Health & Gender-Based Medicine, 8, 1173-1179.


National Center for Injury Prevention and Control. (2008). CDC Injury Research Agenda, 2009–2018. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention. Available at: http://www.cdc.gov/ncipc.


National Center for Injury Prevention and Control (NCIPC). (2003). Costs of Intimate Partner Violence Against Women in the United States. Atlanta (GA): Centers for Disease Control and Prevention.


National Household Education Survey. (http://www.amstat.org/sections/srms/Proceedings/papers/1997_181.pdf).


National Research Council. (2003). Elder Mistreatment: Abuse, Neglect, and Exploitation in an Aging America. Panel to Review Risk and Prevalence of Elder Abuse and Neglect. Richard J. Bonnie and Robert B. Wallace, Editors. Committee on National Statistics and Committee on Law and Justice, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.



Oetzel J & Duran B. (2004). Intimate Partner Violence in American Indian and/or Alaska Native Communities: A Social Ecological Framework of Determinants and Interventions. American Indian and Alaska Native Mental Health Research, 11, 49-68.

O'Muircheartaigh C & Campanelli P. (1999). A Multilevel Exploration of the Role of Interviewers in Survey Non-Response. Journal of the Royal Statistical Society, 162, 437-446.

Peytchev, A., R. Baxter and L. R. Carley-Baxter (in press). Not All Survey Effort is Equal: Reduction of Nonresponse Bias and Nonresponse Error. Public Opinion Quarterly.

Pollner M. (1998). The effects of interviewer gender in mental health interviews. Journal of Nervous & Mental Disease, 186, 369-73.


Puzone CA, Saltzman LE, Kresnow MJ, Thompson MP & Mercy JA. (2000). National trends in intimate partner homicide. Violence Against Women, 6, 409–26.


Rennison C & Rand M. (2003). Non-lethal intimate partner violence: women age 55 or older. Violence Against Women, 12, 1417-1428.


Robin RW, Chester B, Rasmussen JK, Jaranson JM & Goldman JK. (1997). Prevalence and Characteristics of Trauma and Post-Traumatic Stress Disorder in a Southwestern American Indian Community. American Journal of Psychiatry, 154, 1582-1588.

Sadler AG, Booth BM & Doebbeling BN. (2005). Gang and Multiple Rapes During Military Service: Health Consequences and Health Care. Journal of the American Medical Women’s Association, 60, 33-41

Sahr, R. Consumer Price Index (CPI) Conversion Factors 1800 to Estimated 2015 to Convert Dollars of 2005. (Revised January 18, 2006). Available: http://oregonstate.edu/Dept/pol_sci/fac/sahr/cv2005.xls (Accessibility Verified January 23, 2006).


Singer E. (2002). The Use of Incentives to Reduce Nonresponse in Household Surveys. Pp. 163-178 in Survey Nonresponse, edited by R.M. Groves, D.A. Dillman, J.L. Eltinge, and R. J.A. Little. New York: Wiley.


Singer E & Bossarte RM. (2006). Incentives for survey participation: when are they coercive? Am J Prev Med 31, 411-418.


Sullivan CM & Cain D. (2004). Ethical and safety considerations when obtaining information from or about battered women for research purposes. Journal of Interpersonal Violence, 19, 603-18.


Teaster, P.A. (2002). A response to the abuse of vulnerable adults: the 2000 survey of state adult protective services. Washington, D.C.: National Center on Elder Abuse.


Thompson M, Arias I, Basile KC, & Desai S. (2002). The Association Between Childhood Physical and Sexual Victimization and Health Problems in Adulthood in a Nationally Representative Sample of Women. Journal of Interpersonal Violence, 17, 1115-1129.


Thornberry O, Massey J. (1998). Trends in United States Telephone Coverage Across Time and Subgroups. In R.M. Groves, P.P. Biemer, L.E. Lyberg, J.T. Massey, W.L. Nicholls, II, & J. Wakesberg (Eds.), Telephone Survey Methodology. New York: Wiley.


Tjaden P & Thoennes N. (1998). Prevalence, Incidence, and Consequences of Violence against Women: Findings from the National Violence Against Women Survey. U.S. Department of Justice, Office of Justice Programs, Washington, DC, Report No. NCJ 172837.

Tjaden P & Thoennes N. (1998). Stalking in America: Findings from the National Violence Against Women Survey: research brief. U.S. Department of Justice; 1998.


Tjaden P & Thoennes N. (2000). Full Report on the Prevalence, Incidence, and Consequences of Violence Against Women. NCJ Publication # 183781, Washington, DC: National Institute of Justice.

Tjaden P & Thoennes N. (2006). Extent, Nature, and Consequences of Rape Victimization: Findings From the National Violence Against Women Survey. U.S. Department of Justice, Office of Justice Programs, Washington, DC, Report No. NCJ 210346.

Traugott MW, Groves RM & Lepkowski J. (1987). Using Dual Frame Designs to Reduce Nonresponse in Telephone Surveys. Public Opinion Quarterly, 51, 522-539.


Tucker C, Brick JM, Meekins B, Morganstein D. (2004). Household Telephone Service and Usage Patterns in the U.S. in 2004. Proceedings of the Section on Survey Research Methods, American Statistical Association, pp. 4528-4534.


U.S. Bureau of Labor Statistics. http://www.dol.gov/dol/topic/statistics/index.htm


U.S. Census. http://www.census.gov/popest/national/asrh/NC-EST2004/NC-EST2004-01.xls


U.S. Department of Health and Human Services (DHHS). Healthy People 2010. 2nd ed. With Understanding and Improving Health and Objectives for Improving Health 2 vols. Washington, DC: U.S. Government Printing Office; 2000.


U.S. Department of Health and Human Services. Report from the Secretary's Task Force on Elder Abuse. Feb 1992. http://aspe.hhs.gov/daltcp/reports/elderab.htm


Vos T, Astbury J, Piers LS, Magnus A, Heenan M, Stanley L, Walker L & Webster K. (2006). Measuring the Impact of Intimate Partner Violence on the Health of Women in Victoria, Australia. Bulletin of the World Health Organization, 84, 9.

Waksberg J (1978). Sampling Methods for Random Digit Dialing. Journal of the American Statistical Association, 73, 40-46.


Watts C, Heise L, Ellsberg M & Moreno, G. (2001). Putting women first: ethical and safety recommendations for research on domestic violence against women. (Document WHO/EIP/GPE/01.1). Geneva: World Health Organization, Global Programme on Evidence for Health Policy.


Yu J & Cooper H. (1983). Quantitative Review of Research Design Effects on Response Rates to Questionnaires. Journal of Marketing Research, 20, 36-44.







