Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Strategies To Reduce Intimate Violence Effectively (STRIVE) Information Collection Request



Formative Data Collections for Program Support


0970 - 0531





Supporting Statement

Part B

February 2025


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer: Jesse Coe


Part B


B1. Objectives

Study Objectives

The purpose of this data collection is to formatively test the feasibility and preliminary outcomes associated with implementing a short educational intervention (called CUES, which stands for Confidentiality, Universal Education + Empowerment, Support) aimed at preventing and addressing intimate partner violence (IPV) within HMRF programs. Findings will help ACF determine the types of activities that HMRF programs can implement to help prevent and address IPV. Findings will also allow ACF to identify the processes and training and technical assistance (TTA) that future grant recipients may need to implement these services. Although findings from this study will not be generalizable, results from quantitative and qualitative data collection efforts will be disseminated broadly to support HMRF and similar programs in increasing their capacity to address IPV.


Generalizability of Results

This study is intended to produce estimates of the intervention's preliminary outcomes in the selected sites, not to support statistical generalization to other sites or service populations.


Appropriateness of Study Design and Methods for Planned Uses

The study team plans to:

  • Use web-based surveys to directly collect information from HMRF program participants (Instrument 1 and Instrument 2). Most survey items will be multiple-choice questions that provide descriptive data collected in a consistent way across programs. Providing structured, closed-ended response options will minimize survey burden while also supplying sufficient detail to answer the study’s research questions (see Supporting Statement A2 for the study’s specific research questions). The surveys will be completed electronically and are of relatively low burden to respondents given their anticipated duration of less than 10 minutes. The survey data will capture the perspectives of a variety of participants across programs and inform the development of effective methods for addressing IPV in HMRF program settings.

  • Collect participant contact information via a web-based form (Instrument 3). This form is very brief and will automatically appear when a participant has completed their pre- and post-test surveys (although, importantly, information provided on the contact information form will not be linked to survey data). Information collected through the contact information forms will allow the study team to distribute tokens of appreciation and survey reminders to participants.

  • Conduct semi-structured interviews with HMRF staff from study sites to gather insights into implementation considerations for the CUES intervention and staff capacity to address IPV among program participants (Instrument 5). Interviews are particularly well-suited for this purpose, as they allow for an in-depth exploration of participants’ experiences and perspectives. This method offers the flexibility to probe into unexpected themes that may emerge during the conversation and provides an opportunity to gather valuable contextual information that could have influenced the implementation of the intervention or shaped staff perspectives.

  • Use implementation logs to contextualize findings from the surveys and interviews by collecting data on implementation fidelity, participant IPV disclosures, and provision of resources (Instrument 4). The implementation logs are relatively low burden to staff given their anticipated duration of around 2 minutes per log, but they will provide additional data not captured by the surveys or interviews that will be used to better understand the study’s qualitative and quantitative findings.

As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.  

The data collected through this study is also not intended to be representative or generalizable. Key limitations of this data collection will be included in written products associated with the study.


B2. Methods and Design

Target Population

The target population for this study includes:

  • HMRF program participants. We plan to recruit participants who enroll in Healthy Marriage and Relationship Education (HMRE) or Responsible Fatherhood (RF) programming at the four study sites between March 2025 and August 2025.

  • HMRF staff who were involved in implementing or overseeing the CUES approach at each of the four study sites.



Sampling and Site Selection

The study team will be conducting a multi-site study across four HMRF programs. To select the four study sites, the study team collaborated with liaisons from the Office of Family Assistance who recommended potentially eligible HMRF programs. To be considered eligible for participation, programs needed to:

  • Serve individual, adult participants

  • Plan to enroll at least 100 participants during the data collection window (March 2025-August 2025)

  • Be willing to incorporate one-on-one meetings between participants and case managers/facilitators

  • Have a partnership with a local domestic violence agency

  • Serve primarily English and/or Spanish speaking participants

  • Not exclude participants who indicate potential IPV victimization or perpetration during the program’s IPV screening procedures at intake

  • Be able to accommodate electronic data collection efforts


The study team reached out to 12 sites and held initial informational calls with nine sites. Ultimately, four sites (two HMRE and two RF programs) had the capacity and were willing to participate in the proposed study.


Program participants will be recruited using a non-probability, convenience sampling approach. Staff at the four study sites will provide all participants enrolling in their HMRE or RF program with information about the study during participant intake (Attachment F).


The study team will use non-probability, purposive sampling to recruit HMRF program staff at the four study sites. Child Trends staff will ask site contacts at each of the four study sites to recommend staff who implemented the CUES approach to participate in an interview.



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The instruments were developed by a team of researchers who reviewed relevant existing instruments, applied their substantive expertise, and consulted with participants, practitioners, advocates, and other researchers. Specifically, the project surveys and implementation log were informed by the Responding to Intimate Violence in Relationship Programs (RIViR) study (OMB #0970-0503), as well as a targeted literature scan of previously used and validated measures. The study team met with the RIViR study leadership to solicit input on recommended survey items based on their experiences during the RIViR study. Members of the study team who did not develop the surveys also reviewed the instruments, and four members of the STRIVE Advisory Group and three HMRF program alumni provided feedback. The advisory group’s review helped minimize measurement error by suggesting wording changes to improve the clarity and readability of the instruments, drawing on members’ familiarity with, or experience as, HMRF participants. Following this process of instrument refinement, the study team also conducted a crosswalk of the survey items against the study’s identified outcomes of interest to ensure minimally burdensome and streamlined instruments. All study instruments were also reviewed by the project officers to ensure quality and alignment with study objectives.



B4. Collection of Data and Quality Control

Child Trends, a contractor to OPRE, will be responsible for overseeing data collection and quality control efforts.


Who will be collecting the data (e.g., agency, contractor, local health departments)?

Program staff at the study sites will be tasked with recruiting participants, administering the pre- and post-surveys to program participants, and completing the implementation logs following each CUES conversation. Child Trends will be responsible for monitoring and troubleshooting any issues that arise during survey administration, conducting semi-structured staff interviews, and carrying out all data monitoring activities.


What is the recruitment protocol?

Survey recruitment:

Staff at the four study sites will inform potential participants about the study during program intake procedures (Attachment F). If program participants express interest in joining the study, a staff person will provide them with the study consent form via tablet or QR code (Attachment A). If the HMRF program participant decides to join the study and signs the consent form, they will automatically be redirected to the pre-survey.


Interview recruitment:

Child Trends will work with site contacts at the four participating study sites to identify staff members who implemented or are familiar with the implementation of the CUES approach for interviews. With the staff members’ permission, site contacts will share staff members’ contact information with Child Trends. Child Trends will then email these staff and invite them to participate in an interview (Attachment F).


What is the mode of data collection?

Pre- and post-surveys, contact information forms, and implementation logs will be collected through a secure, web-based data collection platform. Semi-structured interviews will be conducted virtually with HMRF staff using a secure meeting platform.


How are the data collection activities monitored for quality and consistency (e.g., interviewer training)?

Child Trends will train staff at study sites on how to best recruit and enroll participants and administer the pre- and post-surveys. Training topics will include voluntary participation, participant privacy, participant ID linking, and recruitment and survey administration procedures. Staff will have study team contact information to reach out regarding any questions or challenges that arise during implementation.


The surveys, contact information form, and implementation logs will all be programmed into a secure online data collection platform in a way that safeguards against major data quality issues. For example, for multiple choice questions where only one response should be selected, the survey will not allow participants to select multiple responses. To the extent possible, the study team will also program validation checks on open-ended items such that, for example, ID numbers must include the correct number of characters and age can only be entered in digits.
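
The validation checks described above can be sketched in code. The following Python sketch is illustrative only: the field names, the six-digit ID format, and the function itself are assumptions for demonstration, not the study's actual survey specification or platform logic.

```python
import re

# Assumed ID format for illustration: exactly six digits.
ID_PATTERN = re.compile(r"\d{6}")

def validate_response(participant_id: str, age: str) -> list[str]:
    """Return a list of validation errors for one hypothetical submission."""
    errors = []
    # ID numbers must include the correct number of characters.
    if not ID_PATTERN.fullmatch(participant_id):
        errors.append("ID must be exactly 6 digits")
    # Age can only be entered in digits.
    if not age.isdigit():
        errors.append("Age must be entered in digits only")
    return errors
```

In a web survey platform, equivalent rules would typically be configured as built-in validation settings rather than custom code; the sketch simply makes the checks concrete.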


Child Trends study team members will lead the semi-structured interviews and each interview will have both a lead interviewer and a notetaker present. Child Trends will conduct an internal interview training, which will focus on interview goals, protocol review, question intent, and best practices for semi-structured interviewing.


What data evaluation activities are planned as part of monitoring for quality and consistency in this collection, such as re-interviews?

Most issues with data quality can be addressed using statistical techniques during the analysis stage, but for data that appears incorrect (e.g., a participant enters an age below 18 years), Child Trends staff will reach out to site contacts to try to obtain correct information. However, because of the automated program features within data collection platforms for data quality monitoring, we anticipate minimal need for this step.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The interviews and surveys are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates for study enrollment will not be calculated. The study team will calculate response rates for receipt of intervention and completion of the post-test survey. These numbers will be included in study reporting to contextualize findings.


Nonresponse

As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with this data collection effort and we will assess attrition rates between enrollment, pre-survey completion, and post-survey completion.
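
The planned attrition assessment amounts to simple retention ratios across the three stages named above. A minimal Python sketch, using made-up placeholder counts rather than study data:

```python
def attrition_rates(enrolled: int, pre_complete: int, post_complete: int) -> dict:
    """Compute retention at each study stage relative to the prior stages."""
    return {
        "pre_survey_retention": pre_complete / enrolled,
        "post_survey_retention": post_complete / enrolled,
        "pre_to_post_retention": post_complete / pre_complete,
    }

# Hypothetical example: 100 enrolled, 80 pre-surveys, 60 post-surveys.
rates = attrition_rates(100, 80, 60)
```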



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination. The data collected will only be used for program improvement and to inform future evaluation efforts.



B7. Data Handling and Analysis

Data Handling

All surveys, contact information forms, and implementation logs will be collected using a secure, online platform. The survey will be programmed with validation checks to reduce errors (e.g., for questions where respondents should select only one answer, the survey will be programmed such that only one response can be selected). Data from these instruments will be downloaded directly from the data collection platform to a secure drive. As described in SSA (A10.Privacy: Procedures to protect privacy of information, while maximizing data sharing), we are in the process of obtaining an Authority to Operate (ATO) for this work. All analysis will occur in the statistical analysis program Stata on a secure drive and data will only be accessible to approved study team members. When working with the survey data, one study team member will write code to clean data and a second study team member will review the code for errors.


All interviews will be conducted virtually and recorded via a secure meeting platform. Recordings will be automatically transcribed and saved to the secure drive immediately following the interviews. Qualitative analysis will take place within the secure drive and only approved project team members will have access to the data.


Data Analysis

Below we describe our data analysis plans by study research question.

  1. Research question 1. To what extent do participants’ IPV-related outcomes show signs of improvement after implementing the universal education (UE) intervention? To understand changes in participants’ IPV-related outcomes, we will use a one-group pretest-posttest approach that compares participants’ survey responses at Time 1 (pre-survey) to their responses at Time 2 (post-survey). This set of analyses will use statistical techniques such as t-tests and chi-square tests. Responses will be pooled by program type (HMRE and RF), but if sample sizes allow, we may examine differences between program sites. We may also examine differences in outcomes associated with demographic characteristics of interest. The purpose of these analyses is to understand preliminary IPV-related outcomes associated with the CUES intervention to help support HMRF and similar programs in building their capacity to address IPV, including informing TTA materials and resources.


  2. Research question 2. To what extent does implementing UE support staff capacity to address IPV? To answer whether and how the intervention supports staff capacity to address IPV, we will analyze interview transcripts using a Rapid Qualitative Analysis approach shown to produce timely, reliable, and actionable findings within programmatic settings.1 Specifically, we will summarize interview transcripts using a structured template, consolidate these summaries in matrices organized by participant type (HMRE staff and RF staff), and examine the matrices to identify themes and patterns related to questions of interest.


  3. Research question 3. What are implementation considerations for the UE interventions within HMRF programs? To increase our understanding of the implementation considerations for using CUES in HMRF programs, we will analyze interview transcripts using the Rapid Qualitative Analysis approach described above. We will supplement this analysis with information from the implementation logs, which will contain information about the number of participants who received the CUES intervention and the number of referrals provided. Participant post-surveys will also provide insights on participants’ perceptions of CUES’ usefulness and their comfort with it.
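
The one-group pretest-posttest comparison described under research question 1 can be illustrated with a paired-samples t statistic computed from pre-to-post differences. The scores below are hypothetical placeholders, not study data, and the Supporting Statement specifies Stata for the actual analysis; Python is used here only for illustration.

```python
import statistics

# Hypothetical paired pre/post scores for one outcome measure (placeholders).
pre = [2.1, 3.0, 2.5, 3.2, 2.8, 2.4, 3.1, 2.9]
post = [2.6, 3.4, 2.7, 3.5, 3.0, 2.9, 3.3, 3.2]

# Paired-samples t statistic: mean of the within-person differences
# divided by the standard error of those differences.
diffs = [b - a for a, b in zip(pre, post)]
mean_diff = statistics.mean(diffs)
se_diff = statistics.stdev(diffs) / len(diffs) ** 0.5
t_stat = mean_diff / se_diff  # compared against a t distribution with n-1 df
print(f"mean change = {mean_diff:.3f}, t = {t_stat:.2f}")
```

A statistical package would also report the p-value for this statistic; the sketch shows only the mechanics of the pre-post comparison.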


Data Use

The study findings will be used to inform:

  • A guidance document identifying specific practical strategies for addressing and preventing IPV in HMRF programs that will be shared with HMRF grant recipients.

  • An internal memo summarizing HMRF programs’ TTA needs related to addressing and preventing IPV that will be shared with ACF.

  • A public-facing report that includes descriptive statistics and key themes from qualitative data. The purpose of the report will be to support the capacity of HMRF and similar programs in addressing IPV. The report will include guidance for interpreting study findings and clearly describe limitations of the data collection.


These plans align with the potential sharing described in the umbrella generic. Specifically, the umbrella states that “Under this umbrella generic IC, information is meant to inform ACF activities and may be incorporated into documents or presentations that are made public such as through conference presentations, websites, or social media.” The following examples provided in the umbrella generic align with the plans for this study’s data:

  • Technical assistance plans

  • Project specific reports


B8. Contact Persons

Samantha Ciaravino

Research Scientist

Child Trends

[email protected]


Sydney Briggs

Research Scientist

Child Trends

[email protected]


Jesse Coe

Social Science Research Analyst

Office of Planning, Research, and Evaluation

[email protected]


Christine Kim

Social Science Research Analyst

Office of Planning, Research, and Evaluation

[email protected]



Attachments

Attachment A. Study consent form for HMRF participants

Attachment B. Study consent form for HMRF staff interviews

Attachment C. CUES safety card for HMRE programs

Attachment D. CUES safety card for RF programs

Attachment E. CUES safety card for AIAN participants at HMRE programs

Attachment F. Information and outreach scripts

Instruments

Instrument 1. HMRE pre- and post-survey

Instrument 2. RF pre- and post-survey

Instrument 3. Contact information form

Instrument 4. Practitioner implementation log

Instrument 5. Interview protocol for HMRF staff



1 Hamilton, A. (2013). Qualitative methods in rapid turn-around health services research. Health services research & development cyberseminar, 2023-03. Available: https://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/780-notes.pdf


