Responding to Intimate Violence in Relationship Programs (RIViR)

OMB: 0970-0503


OMB No. 0970-XXXX Expiration XX/XX/20XX


Responding to Intimate Violence in Relationship Programs (RIViR)


Supporting Statement A


New Collection






November 2016







Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

Mary Switzer Building

330 C Street, SW

Washington, DC 20201


Table of Contents


LIST OF ATTACHMENTS

A.1 Lead letter for parents

A.2 Recruitment script for adults

A.3 Recruitment script for youth 18 years and older

B.1 Adult consent form

B.2 Adult consent form script

B.3 Parent permission form

B.4 Parent permission form script

B.5 Youth assent form

B.6 Youth assent script

B.7 Youth 18 years and older consent

B.8 Youth 18 years and older script

C.1 Locator section for adults

C.2 Contact information form for parents of youth younger than 18

C.3 Post-screener questions

D.1 Existing validated IPV and TDV screening tools

E.1 60 Day Federal Register Notice

F.1 IRB Approval Notice


LIST OF INSTRUMENTS


Instrument #1.1: IPV Screener #1

Instrument #1.2: IPV Screener #2

Instrument #1.3: IPV Screener #3

Instrument #2.1: TDV Screener #1

Instrument #2.2: TDV Screener #2

Instrument #2.3: TDV Screener #3

A. Justification

A.1 Necessity for the Data Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for data collection as a component of the Responding to Intimate Violence in Relationship Programs (RIViR) project. The goals of this research project are to develop improved theoretical frameworks, screening tools, and surrounding protocols for recognizing and addressing intimate partner violence (IPV) and teen dating violence (TDV) in healthy marriage and relationship education (HMRE) programs with the diverse, middle- to low-income populations they serve. Because we are studying HMRE programs, we refer to them as “healthy relationship” (HR) programs throughout this project and this document. The project has already conducted several activities using existing data sources and materials to synthesize information about the current state of the field and to identify potential IPV and TDV screeners and protocols.

The project is now planning to test these IPV and TDV screeners and surrounding protocols in HR programs. This application seeks approval for this testing component of the RIViR project. We plan to test standardized quantitative tools, as well as open-ended scripts that create opportunities for disclosure of IPV or TDV among adults and youth who participate in HR programs. This work will include data collection from approximately 1,200 HR program participants and will be completed in collaboration with approximately four HR grantee organizations funded by ACF’s Office of Family Assistance (OFA) to implement HR programs.

A.1.1 Study Background

Gaps the Information Collection is Designed to Address

Intimate partner violence (IPV) and teen dating violence (TDV) have long-lasting and deleterious effects (Coker et al., 2002). Unfortunately, IPV and TDV are both prevalent in the U.S. (Breiding et al., 2014; Black et al., 2011; Taylor et al., 2014), and research (McKay et al., 2015) and practice-based knowledge (Menard & Williams, 2005) indicate that IPV and TDV are common among healthy relationship (HR) program participants. HR programs, which typically involve healthy relationship education and skill-building and can include explicit information on IPV or TDV, provide a natural environment for individuals to discuss their personal relationships, including unhealthy relationship issues (Krieger et al., 2016), and can be associated with reductions in IPV among participants (Antle et al., 2011; Lundquist et al., 2014). Conversely, domestic violence advocates have raised the concern that HR programs could inadvertently discourage individuals from leaving abusive relationships (Leiwant, 2003) and thereby contribute to increased IPV. For these reasons, experts in both the IPV/TDV and HR fields agree that it is crucial that HR programs be prepared to recognize and address IPV and TDV in their programming (Menard & Williams, 2006; Derrington et al., 2010; Ooms et al., 2006).

Federal programs have acknowledged the importance of HR programs’ ability to recognize and respond to IPV and TDV. The Administration for Children and Families (ACF) Office of Family Assistance (OFA) has funded three cohorts of HR programs since 2006, and each cohort has had requirements related to addressing IPV and TDV in their programs. For example, current ACF HR grantees, funded in 2015, were required to show evidence in their grant applications of consultation with a local domestic violence program or coalition and were encouraged to take a “comprehensive approach to addressing domestic violence” (ACF, 2015).

Very little research is available to guide such approaches, however (Clinton-Sherrod et al., 2016; McKay et al., 2016). Research-based publications in this area have been descriptive in nature and have not included assessment of guidelines for recognizing or addressing IPV or TDV in HR programs. The two available works in this area describe the implementation of IPV screening and surrounding protocols in one community-based HR program (Whiting et al., 2009) and in a set of twelve HR programs serving incarcerated and reentering men and their families (McKay et al., 2013).

In the absence of research-based guidance, practice-based recommendations have been developed to provide guidance to HR programs on how to recognize and respond to IPV and TDV. In collaboration with the National Healthy Marriage Resource Center (NHMRC), the National Resource Center on Domestic Violence developed a five-part resource packet for HR practitioners and administrators that includes sections on understanding domestic violence, building effective partnerships with local domestic violence programs, developing domestic violence protocols, screening and assessment for domestic violence, and responding to domestic violence disclosure (NHMRC, 2011, updated 2015). This work has also been informed by practitioner discussions that occurred during two inter-agency and inter-organizational meetings focused on addressing IPV/TDV in HR programs (Ooms et al., 2006; Derrington et al., 2010).

A commonality among these practice-based recommendations is that they all suggest that HR programs provide all program participants with information and education on IPV and TDV and safe opportunities for disclosure of abusive relationships, and that HR staff be prepared to support participants in making decisions about safe program participation and in seeking follow-up services from local domestic violence organizations. Many HR programs offer opportunities for participants to disclose IPV or TDV through one-on-one open-ended conversations or through screening instruments with closed-ended questions administered during program intake. However, HR programs vary in the types of IPV and TDV education and screening approaches they use and in the manner in which they use them (e.g., at what point during the program, by whom) (Krieger et al., 2016), and no empirical information is available to guide these decisions. To date, no IPV or TDV screening approaches or surrounding protocols have been empirically tested in HR settings and among HR populations.

While no IPV or TDV closed-ended screening tools have been tested among HR populations, there is evidence that these types of tools may effectively provide HR participants with opportunities for IPV or TDV disclosure. Many instruments with closed-ended questions have been empirically validated for screening adults for IPV in other populations and settings. Most validated IPV screeners have been tested among heterosexual women within medical settings, but some have been tested and validated in social service settings (with some similarities to HR programs), including parents participating in court-ordered family mediation (Pokman et al., 2014); heterosexual women in mental health, social service, and medical agencies (Jory, 2004); individuals in substance abuse treatment (Kraanen et al., 2013); women in crisis shelters (Sherin et al., 1998; Brown et al., 1996); and women seeking legal help for IPV (Bonomi et al., 2005). Likewise, some IPV screeners have been validated with populations that include sub-groups similar to those in HR program populations, including men (Goetz et al., 2006; Shakil et al., 2005), youth (Emelianchik-Key, 2011; Datner et al., 2007; Goetz et al., 2006), Spanish speakers (Goetz et al., 2006; Paranjape et al., 2006; Bonomi et al., 2005), parents (Jones et al., 2005; Pokman et al., 2014; Eliason et al., 2005; Williams, 2012), individuals involved in the criminal justice system (Eliason et al., 2005; Williams, 2012), and individuals in same-sex relationships (Chan & Cavacuiti, 2008). Only one standardized TDV measurement instrument has been validated (Emelianchik-Key, 2011); it was tested among a primarily white, heterosexual, and female youth population aged 13 to 21, and its length makes it unsuitable for use as a screening tool.

Literature indicates that universal education on IPV and TDV, paired with open-ended conversations that provide individuals with opportunities to talk about their relationships, may be promising. Several intervention studies that included a universal education component suggest that it is perceived as important by those who receive it (Thompson et al., 1998), can lead to improved knowledge and self-efficacy regarding accessing IPV/TDV resources (Miller et al., 2016; Thompson et al., 1998), and could help to address barriers to disclosure (Othman et al., 2013). In a qualitative analysis of audiotaped conversations between patients and emergency health care providers, researchers found that IPV disclosure was more likely when providers probed about IPV experience, created open-ended opportunities for discussion, and were generally responsive or expressed empathy when a patient mentioned a psychosocial issue (for example, “stress”) (Rhodes et al., 2007). This available research on IPV and TDV closed-ended screeners, universal education, and open-ended conversations indicates that these strategies could be appropriate for recognizing and addressing (through referral) IPV and TDV among HR participants. The research and data collection proposed by OPRE in this application are needed to empirically test IPV and TDV screening tools and protocols, so that HR programs can be provided with evidence-based recommendations on how to recognize and address IPV and TDV in their programs.

How the Information from this Study Will Further ACF Goals

This study is needed to achieve a central goal of the RIViR project for ACF: to identify, prioritize, and test IPV and TDV screeners and surrounding protocols in HR programs. More broadly, the study furthers one of ACF’s agency goals, which is to “promote safety and well-being of children, youth, and families,” as outlined in ACF’s 2015-2016 strategic plan (ACF, 2015). In line with this goal, ACF prioritizes the safety and well-being of HR program participants. Thus, the information that this study will produce will be used to help HR programs better serve (and avoid causing inadvertent harm to) individuals who have been or are currently in abusive relationships.

A.1.2 Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.

A.2 Purpose of Survey and Data Collection Procedures

A.2.1 Overview of Purpose and Approach

The purpose of this component of the RIViR study is to test IPV and TDV screening tools among HR program participant populations in HR program settings. The study will assess the psychometric properties of the screening tools when used with HR program populations and compare how well each differentiates HR program participants who are experiencing IPV or TDV from those who are not (for purposes of guiding HR program staff in offering referrals to their local domestic violence program partners for full assessment and possible services). We will test a total of six screening tools: four standardized closed-ended tools (two for IPV and two for TDV) and two open-ended universal education scripts (one each for IPV and TDV) that create opportunities for disclosure of IPV or TDV among adults and youth who participate in federally funded HR programs. We will test the IPV and TDV screeners among no more than 600 adults and 600 youth, respectively. Each study participant will be asked to complete three IPV or TDV screeners.

This work will be completed in collaboration with approximately four grantee organizations that have been funded by ACF’s OFA to implement HR programs. Data collection will commence after OMB approval (anticipated by February 2017), and we anticipate that it will take place over the course of up to 24 months (see table in attachment A.16 for more details). This study has a single phase, and there are no previously OMB-approved collections related to it. No other ACF studies address the same or similar research questions. With participants’ permission, this data collection will be supplemented with administrative data on HR program participant demographics available from OFA’s administrative data collection system (nFORM).

A.2.2 Research Questions

The research questions are included in Table A.2.1. As described in the study background section (see A.1.1), little evidence exists regarding the effectiveness of IPV and TDV screeners and surrounding protocols among HR program populations. Our research will examine the psychometric properties of selected IPV and TDV screeners in HR program populations and compare the open- and closed-ended IPV and TDV screeners with regard to their ability to differentiate program participants who are experiencing IPV and TDV from those who are not. We are specifically examining these two research questions because they address vitally important research gaps that will inform how HR programs recognize and address IPV and TDV among their program populations. Our findings are intended to inform future HR programs in their efforts to address IPV and TDV in their programs, which requires collecting data on the use of IPV and TDV screeners and protocols within HR program settings and populations.

First, we will address the research question, “What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs?” Psychometric testing examines a screener’s “reliability” (the extent to which an instrument produces consistent results) and “validity” (how accurately an instrument measures what it is intended to measure). Determining reliability and validity is an essential step toward understanding if and how these IPV and TDV screeners function among HR populations, and ultimately toward providing evidence-based recommendations to HR programs and practitioners on strategies for recognizing IPV and TDV in HR program populations. As mentioned in A.1.1, several studies have tested the psychometrics of IPV and TDV screeners among populations similar to HR populations, but no studies have examined the psychometrics of IPV or TDV screeners among HR program participants in the HR program setting.
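The reliability component of this kind of testing is often summarized with an internal-consistency statistic such as Cronbach’s alpha. As a rough illustration only (hypothetical response data; this is not the study’s actual analysis code), alpha can be computed as:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Internal-consistency reliability (Cronbach's alpha).

    item_scores: list of k lists, each holding one item's scores
    across the same n respondents.
    """
    k = len(item_scores)
    respondents = list(zip(*item_scores))       # n rows of k item scores
    totals = [sum(row) for row in respondents]  # each respondent's total score
    item_var = sum(pvariance(item) for item in item_scores)
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical responses to a 3-item screener (0-3 scale), 5 respondents
items = [
    [0, 1, 2, 3, 1],
    [0, 1, 3, 2, 1],
    [1, 0, 2, 3, 2],
]
alpha = cronbach_alpha(items)  # higher alpha = more consistent items
```

Values near 1 indicate that the screener’s items behave consistently; the study’s actual psychometric analysis will also include validity assessments that this sketch does not cover.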

Second, we will address the research question, “How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?” Addressing this question will allow us to understand how well standardized IPV/TDV tools and open-ended approaches each differentiate participants who may need IPV/TDV-related help from those who do not. As noted in A.1.1, there is an absence of evidence of the effectiveness of open-ended IPV/TDV screening approaches in identifying HR participants who may need help.

Table A.2.1. Research Questions

Research Questions

  1. What are the psychometrics of two common closed-ended IPV and TDV screeners as implemented in HR programs?

  2. How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?

A.2.3 Study Design

Our research approach and methodology were chosen to answer the project’s research questions and generate usable results that can be applied in HR programs, while placing the least burden on HR programs and participants. Grantee staff will recruit approximately 600 adult and 600 youth participants, for a total of 1,200 participants recruited from approximately four sites. Adult participants will be recruited to test the three IPV screeners, and youth participants will be recruited to test the three TDV screeners. The same participants will complete the three IPV or TDV screeners at three different time points. This number of proposed research participants is essential to the utility of our research; we conducted power calculations to determine how many individuals we would need to interview in order to have adequate power to detect differences in the screeners’ effectiveness with the least amount of program and participant burden. (Additional details on the power analysis and other aspects of our study design are included in Supporting Statement B.)
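To illustrate the kind of calculation involved (this is not the study’s actual power analysis, and the disclosure rates shown are hypothetical), a normal-approximation power estimate for detecting a difference between two proportions can be sketched as:

```python
from math import sqrt, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def power_two_proportions(p1, p2, n_per_group, z_crit=1.959964):
    """Approximate power of a two-sided z-test for a difference in two
    proportions (e.g., disclosure rates under two screeners), with
    n_per_group respondents per group. z_crit defaults to the critical
    value for alpha = 0.05, two-sided."""
    p_bar = (p1 + p2) / 2
    num = abs(p1 - p2) * sqrt(n_per_group) - z_crit * sqrt(2 * p_bar * (1 - p_bar))
    den = sqrt(p1 * (1 - p1) + p2 * (1 - p2))
    return norm_cdf(num / den)

# Hypothetical: screeners eliciting disclosure at 15% vs. 25%
power_300 = power_two_proportions(0.15, 0.25, 300)
```

The general pattern this illustrates: power rises with the sample size and with the size of the difference to be detected, which is why the target of roughly 600 respondents per screener type matters for the study’s comparisons.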

We are aware of two limitations of this study design. First, although all participants will be invited to participate in the study, there may be differences between individuals who choose to participate (and/or whose parents give them permission to participate) and the general population of HR participants to whom our findings should be generalizable. Second, as with all research on IPV or TDV screening, we must rely on individuals’ self-reports of IPV and TDV-related experiences, and some individuals may not disclose IPV and TDV or may provide incorrect information on their IPV and TDV experiences. However, our research design will allow us to model 1) study selection and attrition biases, and 2) any differences in how well a particular screener elicits disclosures of IPV or TDV experiences compared to the other screeners.

Data Collection

OPRE has contracted with RTI to conduct this study. RTI will work with four HR grantee organizations that meet criteria indicating they have the capacity to successfully participate in the study. All program participants who are enrolled at the four HR grantee program sites during the study enrollment period will be invited to participate in the study. Recruitment will begin after IRB and OMB approval (anticipated February 2017) and will continue until target sample sizes are reached. Recruitment scripts for parents of minor youth, adults, and youth 18 years and older are located in Attachments A.1, A.2, and A.3, respectively.

Prior to and during study involvement, grantee project staff will emphasize to participants that participation in the study is voluntary and that participants may withdraw at any time. They will also be reminded that participation in this study will not have any bearing on the services that they receive, nor will declining to participate result in any punitive measures (particularly important to clarify for high school aged youth). The voluntary nature of the study, as well as other key information about the study, will be explained in writing and verbally through informed consent forms and accompanying scripts. All information will be kept private. All potential adult participants will receive an informed consent form (Attachment B.1) that research staff will review with them (Attachment B.2). Before minor participants are recruited, parents will need to provide permission for their child to participate (Attachments B.3–B.4). All potential minor youth participants who have received parental permission will receive an assent form (Attachment B.5) that research staff will explain to them (Attachment B.6). Youth who are 18 years or older will receive an informed consent form (Attachment B.7), which will be reviewed with them by staff (Attachment B.8).

Adult participants will complete the IPV screeners (Instruments 1.1, 1.2, and 1.3) one-on-one with HR staff during intake or program participation. Youth participants will complete the closed-ended TDV screeners using tablets during HR high school programming, and the open-ended TDV screener one-on-one with program staff (Instruments 2.1, 2.2, and 2.3). The first screener will be administered immediately after participant consent or assent is obtained. The second screener will be administered between two days and one month after the first, and the third screener will be similarly spaced. The survey system will be programmed to administer the instruments in a random order for each participant. All instruments will be Web-based and completed via tablet by the respondent or by the HR program staff member administering the screener. For screeners administered out loud, participants will participate in a private space such as the project office or (for youth) an office at their school.
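The per-participant random ordering described above can be sketched as follows (a minimal illustration with the study’s instrument labels; the actual ordering logic will live in the programmed survey system, and the per-ID seeding shown here is one hypothetical way to keep assignments reproducible):

```python
import random

IPV_INSTRUMENTS = ["1.1: IPV Screener 1", "1.2: IPV Screener 2", "1.3: IPV Screener 3"]

def assign_order(participant_id, instruments=IPV_INSTRUMENTS):
    """Return a random administration order of the three screeners for one
    participant, one screener per data collection time point. Seeding the
    RNG with the participant ID makes the assignment repeatable."""
    rng = random.Random(participant_id)
    order = list(instruments)
    rng.shuffle(order)
    return order
```

Randomizing the order across participants helps separate screener effects from order effects (e.g., fatigue or repeated-question sensitization) when the screeners’ results are later compared.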

A locator section (Attachment C.1) will be administered to adults at the end of the first instrument to collect participant contact information. Parents of youth will be asked to complete and return a contact information form (Attachment C.2), which will be distributed to them along with the permission form. Post-screener items will collect gender identity and sexual orientation (Attachment C.3). The contact information will be needed for RTI to re-contact a small number of participants (fewer than 10 each of adults and youth) after this part of the study ends, to recruit them for an interview to collect additional feedback about the screening process. We will not select follow-up participants by personally identifiable information. Rather, we will invite a diverse sub-set of participants based on the answers they provided (i.e., we will randomly select a small sub-set of participants who indicated IPV or TDV in one or more screeners, as well as a sub-set of participants who did not indicate IPV or TDV in the screeners).

A.2.4 Universe of Data Collection Efforts

Table A.2.2 lists all data collection instruments by the title used to describe the instrument throughout the entire package (which matches the file name of the instrument document) and in the same order as they are listed in the burden table in A.12. All instruments can be found in Instruments 1.1 through 2.3 and Attachments C.1 through C.3.

Table A.2.2. Data Collection Instruments and Description

Instrument

Description

Total Number of Respondents

1.1: IPV Screener 1

Standardized IPV tool 1: Intimate Justice Scale (15 items) for physical violence and coercive control

600

1.2: IPV Screener 2

Standardized IPV tool 2: Universal Violence Prevention Screen (5 items) for physical violence and Women’s Experience with Battering (10 items) for coercive control

600

1.3: IPV Screener 3

Open-ended IPV tool based on existing and widely accepted IPV universal education guidelines and expert consultant input

600

2.1: TDV Screener 1

Standardized TDV tool 1: Safe Dates tool (32 items) for physical violence and coercive control, with expanded/revised monitoring items

600

2.2: TDV Screener 2

Standardized TDV tool 2: Conflict in Adolescent Dating Relationships Inventory (25 items) for physical violence and coercive control, with expanded/revised monitoring items

600

2.3: TDV Screener 3

Open-ended TDV tool based on existing and widely accepted TDV universal education guidelines and expert consultant input

600

Locator section for adults (C.1)

Form that includes brief questions on whether or not the study can recontact the participant and, if so, questions to collect their contact information (e.g., phone number, address, email address)

600

Contact Information Form for Parents of Youth Younger Than 18 (C.2)

Form that includes brief questions on whether or not a parent gives permission for the study to recontact their minor child, and if so, questions to collect their contact information (e.g., phone number, address, email address)

600

Demographic questions (C.3)

Form that asks brief questions about gender identity and sexual orientation

600


Table A.2.3 directly connects each instrument back to the research questions. Our first research question, “What are the psychometrics of two common closed-ended IPV and TDV screeners as implemented in HR programs?” requires testing two IPV and two TDV closed-ended screeners in HR program settings. Our second research question, “How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches in their ability to classify program participants into the categories that are most useful in guiding program responses, particularly whether to refer to the local domestic violence partner?” requires testing (and comparing results of) both closed-ended and open-ended screeners. We also will ask participants (or parents of minor participants) for permission to potentially recontact them to develop a qualitative understanding of their experiences with any domestic violence referrals or services received.

Table A.2.3 Cross-walk of Research Questions and Instruments

Research Questions

Instruments Used to Answer Research Questions

What are the psychometrics of two common closed-ended IPV and TDV screeners as implemented in HR programs?

  • 1.1: IPV Screener 1

  • 1.2: IPV Screener 2

  • 2.1: TDV Screener 1

  • 2.2: TDV Screener 2

How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?

  • 1.1: IPV Screener 1

  • 1.2: IPV Screener 2

  • 1.3: IPV Screener 3

  • 2.1: TDV Screener 1

  • 2.2: TDV Screener 2

  • 2.3: TDV Screener 3

  • Locator section for adults

  • Contact Information Form for Parents of Youth Younger than 18


To inform the selection of the IPV and TDV screener instruments to test for this study, we first conducted a systematic literature review of existing validated IPV and TDV screening tools. Empirically validated tools, defined as those with a published measure of accuracy or validity (e.g., correlation with another known measure) or sensitivity greater than or equal to 50%, were included for further synthesis and review (see Attachment D.1 for a summary of the empirically validated IPV and TDV disclosure tools that resulted from the literature review). We also conducted a search for protocols for universal education and open-ended IPV/TDV disclosure opportunities. While we did not identify any universal education and open-ended protocols that had been empirically tested, we did find research indicating that these types of approaches are promising (as summarized in section A.1.1). We consulted with expert panelists and academic partners (see section A.8), as well as federal partners, to identify any existing validated IPV or TDV closed- or open-ended tools missing from our literature review. From this consultation, we added one closed-ended tool to our list and identified two open-ended tools on which to base the open-ended tools to be tested in this study.
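For reference, the sensitivity criterion in the inclusion rule above is the share of true cases that a screener correctly flags relative to a reference measure. A small illustrative computation (entirely hypothetical data, not drawn from any reviewed study):

```python
def sensitivity(screener_flags, reference_status):
    """Sensitivity = true positives / (true positives + false negatives):
    the share of people who truly experienced IPV/TDV (per a reference
    measure) whom the screener correctly flags."""
    tp = sum(1 for flag, truth in zip(screener_flags, reference_status) if flag and truth)
    fn = sum(1 for flag, truth in zip(screener_flags, reference_status) if not flag and truth)
    return tp / (tp + fn)

# Hypothetical example: 6 respondents, 4 with IPV per the reference measure,
# of whom the screener flags 3 -> sensitivity 3/4 = 0.75, which would meet
# the >= 50% inclusion threshold
flags = [1, 1, 0, 1, 0, 0]
truth = [1, 1, 1, 1, 0, 0]
```

A 50% floor is a deliberately permissive bar: it excludes tools that miss most true cases while keeping candidates whose performance in HR settings is what the study itself will establish.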

After categorizing the existing validated IPV and TDV closed-ended tools, we assessed the extent to which each validated tool would be appropriate for use among HR programs and populations. We looked at the focus of each screener (victimization or perpetration), the forms of IPV measured (e.g., emotional, physical, sexual abuse), the population(s) with which it was tested and validated, the settings in which it was validated, and the length of each screener (e.g., number of items). We prioritized IPV and TDV screeners that were short enough to feasibly be used in HR program settings, yet complete enough to cover essential IPV and TDV constructs, and that had been validated among sub-populations similar to HR program sub-populations and tested in similar service-delivery-oriented settings. Using these criteria, our review resulted in a short list of standardized tools to vet with our panel of experts. The tools on the IPV screener short list included the Partner Violence Screen (Mills et al., 2006), Universal Violence Prevention Screen (Heron et al., 2003), Datner Measure (Datner et al., 2007), Intimate Justice Scale (Jory, 2004; Whiting et al., 2009), Women’s Experience with Battering (Smith et al., 1994), Psychological Maltreatment of Women Inventory (Tolman, 1999), and Intimate Partner Violence Control Scale (Bledsoe & Sar, 2011). The tools on the TDV screener short list included the Teen Screen for Dating Violence (Emelianchik-Key, 2011), Conflict in Adolescent Dating Relationships Inventory (Wolfe et al., 2001), and the Safe Dates Evaluation Tools (Foshee et al., 2005). Concurrently, we developed two sample protocols representing universal education and open-ended IPV and TDV screening approaches for testing in youth- and adult-serving healthy relationship programs. We based these sample protocols on available practice-based guidance (e.g., Chamberlain & Levenson, 2013), including input from practitioner experts.

To move forward with selecting the four closed-ended tools (two IPV and two TDV) to be included in this study, we shared these narrowed lists of standardized tools recommended for consideration with our expert panel, federal partners, and our academic partners. We asked the panelists and partners to help us to prioritize a total of two standardized tools for IPV and TDV each to test in the adult and youth HR populations, respectively. We also shared our universal education and open-ended IPV and TDV screening approaches with the experts and partners to garner feedback on the appropriateness of the language, questions, and guidance to users.

The expert panelists, federal partners, and academic partners provided feedback on the literature review table and tool recommendations. From this expert feedback, we selected the final set of standardized tools to be tested for this current study:

  • 1.1: IPV Screener 1: Intimate Justice Scale (15 items) for physical violence and coercive control. (Instrument List 1.1)


  • 1.2: IPV Screener 2: Universal Violence Prevention Screen (5 items) for physical violence, combined with the Women’s Experience with Battering (10 items) for coercive control. (Instrument List 1.2)


  • 2.1: TDV Screener 1: Safe Dates tool (32 items) for physical violence and coercive control, with expanded/revised monitoring items. (Instrument List 2.1)

  • 2.2: TDV Screener 2: Conflict in Adolescent Dating Relationships Inventory (25 items) for physical violence and coercive control, with expanded/revised monitoring items. (Instrument List 2.2)


Experts also provided suggestions to improve the open-ended screeners, including revisions to improve their flow, accuracy, clarity, and accessibility to HR program participants. The final open-ended IPV and TDV screeners are Instruments 1.3 and 2.3.

A.3 Improved Information Technology to Reduce Burden

Our data collection will utilize a web-based platform (a Voxco-programmed online survey) to collect data from participants for the four closed-ended screeners in real time. All screener data will be collected using technology (e.g., tablets) that participating HR programs are already using for their local evaluation and federally required data collection. The open-ended screeners require face-to-face conversation and therefore are not suitable for self-administration via a tablet or laptop. To reduce burden and avoid the privacy risks associated with paper-based data collection, however, data resulting from these open-ended screeners will also be entered electronically. All data will be stored online, which will improve privacy protections (as opposed to paper forms that HR programs would have to scan or mail to RTI).

A.4 Efforts to Identify Duplication

As described in section A.2.4, the RIViR team has conducted a systematic review of validated closed-ended IPV and TDV screeners (Attachment D.1), as well as a literature search for evidence-based IPV and TDV universal education and open-ended screeners. This review, conducted in 2015 and updated in 2016, underscored key shortcomings in existing research:

  • Most formally validated IPV/TDV screening tools have been tested in health care settings and only a few have been validated in non-health care delivery settings.

  • No available IPV/TDV screening tools have been validated with HR program populations, although some have been validated with populations that include similar sub-populations.

  • Little empirical information is available on open-ended screeners (in any setting or with any population), and none is available on the use of open-ended screeners in HR programs.

  • To date, no studies have established the psychometric properties of IPV or TDV screeners among HR program populations or compared the use of open- and closed-ended screeners in these populations.

Whiting and colleagues (2009) documented one HR program’s use of a previously validated closed-ended IPV screener, the Intimate Justice Scale (Jory, 2004). Although the Intimate Justice Scale was used with HR program participants, Whiting et al. were not able to establish the psychometric properties of this closed-ended IPV screener (Whiting et al., 2009).

In addition to reviewing published works, the RIViR project team contacted researchers at Brigham Young University who have compiled local evaluation data from several cohorts of previously funded HR program sites, including collecting information on whether and how IPV screening was conducted and what data resulted (Hawkins and Erickson, unpublished data). This consultation confirmed that prior HR programs (or their local evaluators) have not collected any data that could be used to answer this study’s research questions.

Finally, as noted in section A.1.1, HR programs currently have access to practice-based guidance on IPV and TDV screening, but research-based guidance has not been available.

A.5 Involvement of Small Organizations

Some or all of the HR programs included in this study will be small community-based organizations. To minimize any burden on these organizations resulting from the data collection process, the study team will develop site-specific data collection protocols that work within each site’s staffing work flow and existing program activities such that this data collection will not impact the organizations’ operations or ability to serve clients.

A.6 Consequences of Less-Frequent Data Collection

This data collection request covers the administration of three screeners to HR program participants at three time points; none of the tools is repeated at more than one time point. Testing of more than one screener is necessary to be able to address our first research question (“What are the psychometrics of (2) common closed-ended IPV and TDV screeners as implemented in HR programs?”) and our second research question (“How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches in their ability to classify program participants into the categories that are most useful in guiding program responses, particularly whether to refer to the local domestic violence partner?”). As detailed in SSB, Section B.2.4, the analyses to answer each of these research questions require that the same individuals complete multiple screeners.

To reduce participant burden (and to provide findings that are likely more feasible to implement in HR programs), we have selected IPV and TDV screeners that are brief and should take between 10 and 15 minutes each. While some IPV and TDV screening tools have over 100 items, our selected IPV and TDV screeners for testing range from 15 to 32 items. Our IPV and TDV open-ended screeners are also brief, and only include approximately four open-ended questions each.

A.7 Special Circumstances

There are no special circumstances for this data collection.

A.8 Federal Register Notice and Consultation

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13 and OMB regulations at 5 CFR Part 1320 [60 FR 44978, August 29, 1995]), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on April 12, 2016 in Federal Register Vol. 81, No. 70, pages 21553-21554, and provided a 60-day period for public comment. A copy of this notice is included in Attachment E.1. No substantive comments were received during the 60-day notice period.

RTI also consulted with an expert panel with expertise in areas including IPV services and program implementation, IPV screening, and HR program implementation on the design of the study and the selection of screeners.

A.9 Tokens of Appreciation for Respondents

We propose to offer a $10 token of appreciation per participant per screening tool in order to proactively minimize non-response bias. It is widely recognized that, unless survey non-response occurs at random, low response rates in survey-based research lead to non-response bias (e.g., Rubin, 1976). The most effective way to minimize non-response bias is to design survey field approaches that maximize response rates (Massey & Tourangeau, 2013). According to “leverage-salience theory,” based on a wide body of research on survey incentives, tokens of appreciation can help to minimize non-response bias because they specifically increase participation among those who are less intrinsically motivated to participate in a survey on a particular topic or who have heavier competing obligations (Singer & Ye, 2013). In a study in which participants will be asked questions about IPV victimization, it is reasonable to assume that the strength of sample members’ intrinsic motivation to participate and the extent of their competing obligations could each be independently related to the victimization experiences captured by the IPV screening tools this study will test. Proactively minimizing the potential for differential non-response according to these characteristics (such as between IPV victims and non-victims) is a major validity issue for the RIViR study.

Multiple empirical assessments, including experimental studies, have found that the provision of tokens of appreciation reduces non-response and non-response bias in both interviewer-administered and web-based surveys (such as those proposed for the RIViR study) with a variety of populations, and that monetary tokens of appreciation are more effective than other tokens or gifts at preventing survey non-response (e.g., Abreu & Winters, 1999; Greenbaum, 2000; Goldenberg, McGrath, & Tan, 2009; Goritz, 2006b; SAMHSA, 2014; Shettle & Mooney, 1999; Singer et al., 1999; Singer & Kulka, 2002). Evidence across studies indicates that tokens of appreciation minimize non-response primarily by decreasing refusals (Singer & Ye, 2013) and that they can help to minimize attrition bias in studies involving multiple waves of data collection (Singer et al., 1998), such as the RIViR study. Research has shown that tokens of appreciation can increase response rates for populations that include racially and socioeconomically diverse participants (similar to the RIViR study sample), including a few studies specifically with low-income and racially diverse populations (James & Bolstein, 1990; Singer & Kulka, 2002) and with individuals with lower education levels (Berlin et al., 1992).

RTI and others have provided tokens of appreciation to similar populations (including low-income and racially diverse populations) in other studies. In the Assets for Independence (AFI) Evaluation conducted by RTI (OMB control number 0970-0414), participants were paid $20 for completing baseline and follow-up hour-long surveys. To combat potential non-response bias associated with low initial response rates, the study increased the token of appreciation for the follow-up surveys to $50. In the Child Support Noncustodial Parent Employment Demonstration study (OMB control number 0970-0439), program participants receive $10 for participation in a baseline survey (about 30 minutes) and $25 for participation in a follow-up survey (about 45 minutes). In an Office of the Assistant Secretary for Planning and Evaluation (ASPE) study on the effectiveness of HR programs among reentering men and their families conducted by RTI (OMB control number 0990-0331), study participants were provided a $35-70 token of appreciation for completing each of four 90-minute surveys. (When the $35 token of appreciation proved insufficient for obtaining adequate survey response rates in particular sub-groups of the HR program study population, such as women in the New York site, the amount was increased to help prevent differential attrition among members of this sub-group who were less stably housed or less stably attached to their study partners.) A recent experiment conducted by RTI on incentives in the Residential Energy Consumption Study (OMB control number 1905-0092), a nationally representative household survey, found that offering a $20 token of appreciation produced a uniform, statistically significant decrease in non-response across study modes and protocols (Murphy, 2016).

Although limited empirical evidence exists to guide precise incentive amounts relative to study type and population (Singer & Ye, 2013), our proposed token of appreciation of $10 per screening tool is set somewhat lower than the examples provided because the estimated time burden for the three screening tools is less (10 minutes for each of the two closed-ended screening tools and 15 minutes for the open-ended screening tool).

In reviewing OMB’s guidance on the factors that may justify provision of tokens of appreciation to research participants,[1] we have determined that the following principles apply:

  1. Minimizing non-response and attrition bias. Nonresponse and attrition bias could be a serious threat to the RIViR study’s validity and must be proactively addressed, given that our analytic objectives require that participants complete three different screening tools over the course of their study participation. Without a token of appreciation, differential nonresponse and differential attrition are likely based on differences in intrinsic motivation to participate in a survey on the topic of IPV (which could reasonably be related to a prospective respondent’s history of IPV victimization) and based on differences in competing obligations (which could be associated with economic strain and therefore also associated with IPV [Cunradi, Caetano, & Schafer, 2002]). Minimizing nonresponse and attrition among less-intrinsically-motivated or more-strained respondent groups is crucial, since their characteristics could be independently related to our analytic outcomes, and offering tokens of appreciation is a strong, empirically tested approach to achieving that.

  2. Improving data quality: If we are unable to secure participation from a significant proportion of our sampling pool, which is not expansive, we will be unable to achieve our target sample sizes. If we secure participation but participants are not sufficiently motivated to spend the time needed to complete all items on the screening tools, the quality of the data will be compromised by missingness. Either issue could prevent the study from having the statistical power to address its research questions.

  3. Reducing survey costs: We anticipate that without the tokens of appreciation, more potential respondents will need to be recruited to achieve sufficient numbers to conduct our analysis and answer our research questions. HR program staff will be in charge of recruiting and following up with study participants. If few potential study participants enroll in the study or if more participants drop out of the study, HR program staff will need to spend more time recruiting and following up with participants.

A.10 Privacy of Respondents

The proposed information collection was reviewed and received contingent approval from RTI’s Office of Research Protection on August 16, 2016. Upon receipt of required revisions, the IRB provided formal approval on September 9, 2016. The IRB approval letter appears as Attachment F.1.

We will take the utmost measures to maximize participant privacy, particularly because of the sensitive nature of the questions and to protect the safety of all research participants. Before enrolling participants and administering the screeners, grantee project staff will participate in an in-person training with the RTI project team about participant privacy guidelines and screener administration procedures. RTI staff will serve as liaisons to the grantee projects throughout data collection to provide support and to ensure that data collection protocols are carefully followed.

Efforts will be made to ensure that no one other than the project staff administering the screener and the participant can view or hear responses during screener administration (i.e., the staff will be instructed to administer the verbal screeners in a private room with the door closed). During group administration with youth, we will try to ensure that no other youth or teachers can view survey responses; respondents will be spaced out around the room and privacy screens will be used to protect students’ tablets from the view of surrounding students as needed.

Screener responses will only be accessible to authorized RTI and HR program staff, and identifying information will be kept separate from screener responses by grantee project staff and RTI staff. Each participant will be given a unique identifier prior to data collection; identifying information collected in the locator section will be automatically separated from screener responses when transmitted and will be stored separately at RTI. All electronic data will be transmitted securely using an encrypted protocol (HTTPS) immediately upon completion of each screener and will be stored in RTI’s Enhanced Security Network and on RTI’s secure project shared drive. A summary of each participant’s responses to each screener will be uploaded to a secure website for sharing with grantee project staff, to be used for making referral decisions. No personally identifying data or participant information will be stored on the computers used to collect the data. HR program staff will store completed consent, assent, and parent permission forms, as well as token of appreciation receipt forms, in a locked filing cabinet in their agency office. They will scan these forms and upload them at least weekly to a secure website hosted by RTI.

Information will not be maintained in a paper or electronic system from which records are actually or directly retrieved by an individual’s personal identifier.

A.11 Justification for Sensitive Questions

The goals of this study necessitate collecting data via IPV and TDV screeners that ask questions about emotional abuse, physical violence, and/or coercive control. Such questions are necessary for identifying IPV and TDV, appear in some form in all IPV and TDV screeners, and therefore must be included in this data collection effort in order to assess the psychometric properties of the IPV and TDV screening tools and compare their effectiveness.

In an effort to minimize the participant burden associated with answering sensitive questions, we chose closed-ended IPV and TDV screening tools (1.1, 1.2, 2.1, and 2.2) that have been previously validated and have fewer questions (15-32 items) than other IPV and TDV screeners. Based on expert consultant recommendations, the open-ended IPV and TDV screening instruments include less direct and explicit questions about IPV and TDV. Instead, the open-ended screeners provide universal education, in which participants are read information about IPV or TDV and then asked questions such as, “What are your thoughts on the information?” or “Does this sound like your relationship?”

Research indicates that women who have experienced IPV are amenable to and generally supportive of being asked questions about IPV in medical settings (Bacchus et al., 2002; Gielen et al., 2000) and in surveys (Black, Kresnow, Simon, Arias, & Shelley, 2006). Nonetheless, during the informed consent process, study participants will be notified that the screeners include sensitive questions, reminded that their participation is voluntary, and assured that their responses will be kept private to the extent permitted by law. Participants will be told that their decision to participate or not will have no effect on any services they receive from their HR program.

A.12 Estimates of Information Collection Burden

These forms will be used for data collection for up to two years (24 months), depending on how long it takes to reach target data collection numbers. Respondents will be youth or adult participants in OFA-funded HR programs at four HR program sites across the nation. Table A.12.1 provides the annual burden associated with this effort.

Six hundred adult participants will complete the three IPV screeners, the locator section for adults, and the post-screener questions. Generally, HR programs serve low- to lower-middle-income men and women (e.g., most participants in the Building Strong Families and Supporting Healthy Marriage demonstrations reported incomes of less than $25,000 per year). We estimate that adult participants will be primarily low income and that their hourly pay rates will stay the same throughout their study participation (up to 3 months). Therefore, the wage estimate for adult participants is based on an annual income of $40,180, which is 200% of the 2016 federal poverty level for a three-person household; assuming 2,080 work hours per year, this translates to an hourly rate of $19.32.

Six hundred youth will complete the three TDV screeners. We have used the federal minimum wage of $7.25 to estimate the hourly wage for youth participants because youth typically hold minimum wage jobs (if they are employed). Six hundred parents of youth will complete the contact information form for parents of youth younger than 18. Because HR programs typically target youth in low-income areas, we estimate that the average hourly rate for parents of youth participants will be the same as for adult HR program participants: $19.32 an hour, based on a three-person household.

Staff from four sites will be involved in recruiting participants, administering surveys, and maintaining the data collection protocol and record-keeping. We base our estimate of site staff’s average hourly wage on the U.S. Bureau of Labor Statistics National Compensation Survey, 2010, which indicates that the average hourly wage for “social and community service managers” is $27.86.


Table A.12.1. Estimated Annualized Burden Costs and Total for 2-Year Data Collection

| Activity | Total No. of Respondents | Annual No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in Hours) | Total Annual Burden Hours | Hourly Wage Rate1 | Total Annual Respondent Costs |
|---|---|---|---|---|---|---|---|
| 1.1: IPV Screener 1 | 600 | 300 | 1 | .167 | 50 | $19.32 | $967.93 |
| 1.2: IPV Screener 2 | 600 | 300 | 1 | .167 | 50 | $19.32 | $967.93 |
| 1.3: IPV Screener 3 | 600 | 300 | 1 | .25 | 75 | $19.32 | $1,449.00 |
| 2.1: TDV Screener 1 | 600 | 300 | 1 | .167 | 50 | $7.25 | $363.23 |
| 2.2: TDV Screener 2 | 600 | 300 | 1 | .167 | 50 | $7.25 | $363.23 |
| 2.3: TDV Screener 3 | 600 | 300 | 1 | .25 | 75 | $7.25 | $543.75 |
| Locator section for adults (C.1) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60 |
| Contact information form for parents of youth younger than 18 (C.2) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60 |
| Post-screener questions for adults (C.3) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60 |
| Post-screener questions for youth (C.3) | 600 | 300 | 1 | .1 | 30 | $7.25 | $217.50 |
| Participant recruitment | 600 | 300 | 1 | .1 | 30 | $27.86 | $835.80 |
| Administration of data collection protocol and record-keeping | 600 | 300 | 1 | .167 | 50 | $27.86 | $1,395.79 |
| Total | | | | | 550 | N/A | $8,842.96 |

N/A = Not applicable.
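The arithmetic behind Table A.12.1 can be reproduced with a short script. This is an illustrative cross-check only, not part of the official submission; the rounding convention (annual hours displayed as whole numbers, row costs computed from unrounded hours such as 300 × 0.167 = 50.1, and the total cost summed from the rounded row costs) is inferred from the table's own figures.

```python
from decimal import Decimal, ROUND_HALF_UP

# Cross-check of Table A.12.1. Rounding convention inferred from the table:
# displayed annual hours are rounded to whole hours, each row's cost uses the
# unrounded hours (e.g., 300 x 0.167 = 50.1, then 50.1 x $19.32 = $967.93),
# and the total cost sums the rounded row costs.
rows = [
    # (annual respondents, hours per response, hourly wage)
    (300, "0.167", "19.32"),  # 1.1: IPV Screener 1
    (300, "0.167", "19.32"),  # 1.2: IPV Screener 2
    (300, "0.25",  "19.32"),  # 1.3: IPV Screener 3
    (300, "0.167", "7.25"),   # 2.1: TDV Screener 1
    (300, "0.167", "7.25"),   # 2.2: TDV Screener 2
    (300, "0.25",  "7.25"),   # 2.3: TDV Screener 3
    (300, "0.1",   "19.32"),  # Locator section for adults (C.1)
    (300, "0.1",   "19.32"),  # Contact information form for parents (C.2)
    (300, "0.1",   "19.32"),  # Post-screener questions for adults (C.3)
    (300, "0.1",   "7.25"),   # Post-screener questions for youth (C.3)
    (300, "0.1",   "27.86"),  # Participant recruitment
    (300, "0.167", "27.86"),  # Protocol administration and record-keeping
]

cent = Decimal("0.01")

# Adult wage rate (Section A.12): $40,180 / 2,080 work hours = $19.32/hour.
assert (Decimal("40180") / Decimal("2080")).quantize(cent, ROUND_HALF_UP) == Decimal("19.32")

total_hours = sum((n * Decimal(h)).quantize(Decimal("1"), ROUND_HALF_UP)
                  for n, h, _ in rows)
total_cost = sum((n * Decimal(h) * Decimal(w)).quantize(cent, ROUND_HALF_UP)
                 for n, h, w in rows)

print(total_hours)  # 550 annual burden hours, matching the table total
print(total_cost)   # 8842.96, matching the table's total annual respondent cost
```

Note that the Total No. of Respondents column (600 per activity) reflects the full 2-year collection period, while the burden hours and costs shown are annualized (based on 300 respondents per year).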

A.13 Cost Burden to Respondents or Record Keepers

ACF does not anticipate additional costs to respondents other than time spent (captured in A.12.1, above).

A.14 Estimate of Cost to the Federal Government

The estimated cost to the federal government for the proposed data collection and analysis is $860,000. This figure includes labor hours and other direct costs (travel, photocopying, mailing, etc.) for both years of data collection. The annual cost is $430,000.

A.15 Change in Burden

This is a new information collection.

A.16 Plan and Time Schedule for Information Collection, Tabulation, and Publication

The table below provides a timeline based on OMB approval in February 2017 with data collection beginning upon approval. Data collection will continue until sites have enrolled and administered screeners to sufficient numbers of respondents to provide adequate power for the planned analyses (see Supporting Statement B), but no longer than two years (through February 2019 assuming approval is received in February 2017). All dates are dependent on OMB approval.

A.16.1 Project Timeline for Information Collection

| Activities | Due Date |
|---|---|
| Data collection by HR program staff | February 2017 through February 2019 |
| Analysis | March 2019 through June 2019 |
| Final brief | August 2019 |



A.17 Reasons Not to Display OMB Expiration Date

The OMB expiration date will be displayed on all necessary materials and documents.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.

1 We define the term “universal education” as information on IPV or TDV that is proactively provided to all program participants. “Universal education” typically includes information about the warning signs of IPV and TDV, how to access help, and phone numbers for local and national hotlines.

2 These tools will be referred to throughout this OMB application as “screeners.”


