
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Fathers and Continuous Learning in Child Welfare Project



OMB Information Collection Request

New Collection





Supporting Statement

Part B



December 2021











Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Pooja Curtin

Katie Pahigiannis


Part B


B1. Objectives

Study Objectives

The Fathers and Continuous Learning in Child Welfare project (FCL), conducted by Mathematica and the University of Denver on behalf of the Administration for Children and Families (ACF) Office of Planning, Research, and Evaluation, seeks to test the use of the Breakthrough Series Collaborative (BSC) methodology to strengthen fathers’ and paternal relatives’ engagement with children involved in child welfare and to add to the evidence base on engagement strategies for fathers and paternal relatives in child welfare.

To build on the findings of a pilot study conducted under ACF umbrella generics for formative data collections,1 this descriptive process evaluation will focus on organizational changes and network supports for engaging fathers and paternal relatives, changes in staff attitudes and skills for this engagement, and the outcomes of that engagement. The evaluation will explore the implementation of strategies and approaches for engaging fathers and paternal relatives. By examining process outcomes, the evaluation is designed to indicate the likelihood that strategies and approaches developed in the BSC will lead to placement stability and permanency outcomes. To assess process outcomes, the evaluation will also document how teams implemented the BSC and the specific father and paternal relative engagement strategies and approaches they developed to align with the Collaborative Change Framework.2 To the extent possible, the evaluation will draw on data already collected in the pilot study and add detail about how strategies changed over time. As described in Section A2, ACF’s three aims for the evaluation are to (1) describe potentially promising strategies and approaches for engaging fathers and paternal relatives in the child welfare system, (2) assess the promise of the BSC as a continuous quality improvement framework for addressing challenges in the child welfare system, and (3) assess the extent to which the agencies experienced a shift in organizational culture over time.

Generalizability of Results

The goal of this study is to describe the development, implementation, and spread of strategies in five child welfare agencies that participated in the FCL BSC. These agencies were purposively selected for this study based on three criteria: prior efforts to engage fathers and paternal relatives, prior experience participating in a BSC, and prior experience participating in FCL. Each agency developed site-specific strategies based on its setting and the needs of the fathers and paternal relatives it serves. This study intends to show how five child welfare agencies in different settings designed and implemented strategies to engage fathers and paternal relatives. It is not intended to provide generalizable findings or instructions on how to successfully implement strategies in other agencies.



While the findings are not intended to be generalizable, this study could help demonstrate opportunities and examples of promising practices for engaging fathers and paternal relatives in a variety of child welfare settings. These sites reflect the diversity of child welfare agencies. They vary in size, authority (state- or county-administered), urbanicity, population served, and collaboration with system partners (Fung et al. 2021). Other agencies interested in deepening their engagement of fathers and paternal relatives may be interested in lessons and potentially promising practices drawn from the sites’ experiences.

Appropriateness of Study Design and Methods for Planned Uses

The study is a descriptive evaluation. Qualitative and quantitative data sources will capture the experiences of participating stakeholders (for example, agency leaders, frontline staff, agency supervisors, agency partners, community stakeholders, and fathers and paternal relatives):

  • Interview topic guide for staff (Instrument 1)

  • Father and paternal relative focus group protocol (Instrument 2)

  • Staff survey (Instrument 3)


This study design and the methods used will allow us to answer the research questions (see Section A2) and will provide insight into the overarching study objective: documenting and evaluating the implementation of strategies for engaging fathers and paternal relatives using the BSC methodology. Study reports and briefs will be made available to the public.


This evaluation does not assess impact and should not be used to assess participants’ outcomes. Written products associated with the study will include key limitations. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.

B2. Methods and Design

Target Population

The universe of programs eligible for the evaluation includes local and state child welfare agencies that participated in the pilot study and their key partners. Target respondents include staff and fathers and paternal relatives of children involved in the child welfare system from five child welfare agencies (as described in Section A2) and the agencies’ key partners. We estimate that up to 869 respondents will participate across the five agencies: about 181 respondents each in Los Angeles and Connecticut and about 169 respondents in each of the other three agencies (2 × 181 + 3 × 169 = 869), though the sizes of the agencies vary widely.3 For example, multiple offices in Los Angeles and Connecticut are participating in the evaluation, while Prowers, a small, rural county in Colorado, has only one office (see A12, “explanation of burden estimates,” and “Sampling and Site Selection” below).


Sampling and Site Selection

The FCL project team worked with federal partners, experts, and stakeholders familiar with child welfare agencies to select six Improvement Teams representing five agencies for the pilot study. As described in Section A2, teams comprised child welfare agency staff (such as managers, supervisors, and workers) and community partner staff (such as staff from father engagement organizations) and were responsible for monitoring progress and improvements throughout the pilot study.

All the agencies that participated in the pilot study have agreed to participate in the descriptive evaluation. Participating agencies are not intended to represent child welfare agencies nationally.



The team will select key staff and stakeholders to participate in the data collection based on their roles in implementing the BSC and the strategies and approaches to engage fathers and paternal relatives, recruiting respondents to capture a range of experiences and perspectives. Staff will include those who participated in the pilot study, as well as additional staff who were not involved in the pilot study but who are involved in implementing father and paternal relative engagement strategies. Community stakeholders will include those whose roles have intersected with the child welfare agency and who have an interest in father and paternal relative engagement in the child welfare system, such as a local parent education provider. Community stakeholders will only be asked to participate in interviews. Agency and partner staff will be asked to participate in interviews and/or complete the staff survey. As described in A2, some staff involved in direct services to clients will complete the staff survey, while others (determined in collaboration with the study sites) will respond to both instruments. Similarly, the project team will recruit fathers and paternal relatives with relatively recent experience with the focal child welfare agencies, as well as those who engaged with the agencies longer and served on the Improvement Teams (and thus participated in data collection for the pilot study).



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

We developed the data collection instruments based on the essential data required to answer the priority research questions. We developed the interview topic guide and focus group protocol based on the protocols used in the pilot study.

The staff survey uses established measures when possible. Sources for individual survey items are available in the instrument. The team closely examined all instruments to confirm that they were streamlined and did not collect duplicative data.


Table 1 presents a crosswalk between the data collection instruments and the study aims.


Table 1. Crosswalk between data collection instruments and study aims

Objective | Instrument 1: Interview topic guide | Instrument 2: Father and paternal relative focus group protocol | Instrument 3: Staff survey
Objective 1: Describe potentially promising strategies and approaches for engaging fathers and paternal relatives in the child welfare system | X | X | X
Objective 2: Assess the promise of the BSC as a continuous quality improvement framework for addressing challenges in the child welfare system | X | | X
Objective 3: Assess the extent to which agencies experienced a shift in organizational culture | X | | X



B4. Collection of Data and Quality Control

Site visits

A team of at least two researchers will visit each site. Additional study team members may participate in a site visit depending on the interview schedule. At least two study team members will visit a smaller site, like Prowers County, Colorado, while up to four may visit a larger site, like Connecticut, that has multiple offices participating and where multiple staff interviews may need to be scheduled simultaneously. While we are currently planning for in-person site visits, we may need to administer Instruments 1 and 2 virtually given ongoing uncertainty about the COVID-19 pandemic or other extenuating circumstances. We have designed these instruments to be administered virtually. During the site visits, the team will interview key agency staff involved with planning, implementing, and spreading the strategies and approaches: high-level child welfare administrators and managers (senior leaders/Improvement Team); frontline workers; and leaders and frontline staff from community partners and stakeholders, such as fatherhood programs, that might have helped implement strategies and approaches or have an interest and role to play in father and paternal relative engagement. We will conduct semistructured interviews in small groups organized by staff level and function. To identify and recruit agency and partner staff for site visit interviews, we will request organizational charts and information about staff roles at the site and its partner organizations participating in the engagement strategies and approaches to ensure that interview participants have at least a minimal level of involvement in designing or implementing the strategies and approaches.

During the site visit, we will also conduct focus groups or interviews with fathers and paternal relatives who have relatively recent experience with the child welfare agencies. The focus group protocol (Instrument 2) is designed to be easily adapted for use with individual participants as a semistructured interview protocol if it is not possible to hold focus groups or if the agency feels the topics are more suitable to an individual conversation. We will work with the key site contact to identify and recruit potential focus group/interview participants. To speak with 12 fathers and paternal relatives, we will ask the key site contact to recruit 20 people to account for no-shows. We have prepared a focus group flyer for the site contact to distribute to potential participants (Appendix B). Once the site contact has identified and recruited focus group participants, the contact will send each participant a reminder email we have prepared, and a member of the study team will send a confirmation email. Text for the confirmation and reminder emails is included in Appendix C.

We will work with the key site contact to schedule the interviews and focus groups and to contact participants. We will also work with the key contact to discuss potential barriers to participation and how to overcome them. For example, we may hold a participant focus group at a central location and/or after regular working hours to make it easier for working fathers and fathers without easy access to transportation to attend. If there are employee union contract agreements that are barriers to research, such as those that stipulate how often staff can be surveyed or whether they can participate in interviews or other research activities outside specified working hours, we will provide flexibility in the timing of interviews, ensure that staff know that their participation is voluntary, and work with partner agencies to find other amenable solutions, as necessary. Two study team members will attend each interview and focus group.

We designed the instruments to accommodate virtual administration, if necessary. In these cases, we will still work with the key site contact to identify and recruit participants and schedule the interviews and focus groups. The project team will initiate invitations to attend the virtual interviews and focus groups to allow for recording of the sessions.

The interview and focus group topic guides include a consent statement at the beginning. Project staff will read the consent statement to respondents before beginning interviews and focus groups. The consent statement provides assurances that the information shared will be kept private and reported in a manner that will not identify individual respondents. Staff will ask participants to provide verbal consent after reading the consent statement.

To ensure data quality, site visit staff will receive training that includes a careful review of the following:

  • The objectives of the evaluation and the research questions that the interviews and focus groups will address

  • Each topic guide, with an emphasis on the intent of each question, to ensure interviewers know how to probe appropriately, which is critical given the open-ended nature of interviews and focus groups

We will create templates for the interview and focus group summaries to ensure that site visitors capture information consistently. The training will also include a review of these templates.

Staff survey

Using a low-burden, web-based platform, we will administer a short survey to staff who play a role in implementing strategies and approaches to engage fathers and paternal relatives at the child welfare and partner agencies. We will email staff an invitation to complete the survey, along with a secure web link, and will send a reminder email about one week after the initial invitation. These communications are included in Appendix D. Respondents will include, but not be limited to, the staff members who participated in the semistructured interviews. The survey will ask staff to assess their organization’s culture and their own practices related to engaging fathers and paternal relatives. We will administer the survey at the beginning of data collection and again near the end to observe changes over time. The project team will monitor survey data for quality and completeness throughout the data collection period.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The interviews, focus groups, and staff surveys are not designed to produce statistically generalizable findings and participation is wholly at the respondent’s discretion. For monitoring purposes, we will calculate response rates for the staff surveys only. We expect a high rate of response because of existing staff engagement with the BSC and pilot study. These response rates will not be reported.


Nonresponse

As participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection.


B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.


B7. Data Handling and Analysis

Data Handling

We will perform quality assurance checks for reliability on coded qualitative data. We will review quantitative data for completeness and accuracy of data entry, and we will program edit checks into the web-based staff survey to minimize entry errors.


Data Analysis

After data collection concludes, we will analyze the data according to the procedures described below for each type of data (qualitative or quantitative). We will analyze each data source separately and then combine the descriptive themes that emerge to identify findings supported by multiple data sources.

As described in Section A2, we will use the data collected through the evaluation along with any existing program-collected data to address the research questions. We will first seek to address Research Questions 1, 2, and 3. After identifying evidence of cultural shifts, plausible connections between strategies and approaches and outcomes, and whether the BSC contributed to the launch and sustainability of engagement strategies and approaches, we will then examine Research Question 4: whether the BSC is a promising and useful tool for addressing challenges facing child welfare agencies.

Qualitative data

For each qualitative data collection activity, we will use standardized templates to organize and document the data and then apply codes. We will search the coded text to gauge consistency and triangulate across participants and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, or categories (Coffey and Atkinson 1996; Yin 1994) that can then be analyzed to address the research questions.

To code the qualitative data for key themes and subtopics, we will first develop a coding scheme in accordance with the constructs of interest covered in the interview and focus group questions. Questions and codes will align with the National Implementation Research Network’s Active Implementation Frameworks constructs. In the first stage of coding, we will code interview and focus group responses to applicable constructs. In the second stage, we will review all data coded within a specific construct to identify broad themes, triangulating across respondents and data sources. In the third stage, we will create narrower codes within the broad themes. In addition, coders will use a coding scheme to document key information, such as descriptions of the strategies and approaches implemented and the names of partners involved in implementing each strategy.

Team members will code the data using a qualitative analysis software package such as NVivo. To ensure reliability across coders, all team members will code an initial document and compare codes to identify and resolve discrepancies. In addition, as coding proceeds, the evaluation task lead will review samples of coded data to check reliability.

After coding the data, we can look across staff and organizations (within agencies and among partnering organizations) by searching on codes. Using a program such as NVivo will also enable us to retrieve data on codes by the type of study participant (for example, organizational leadership or Improvement Team member). To compare information, we might retrieve data for subsets of agencies, such as those that implement the same strategy for engaging fathers and paternal relatives, serve similar target populations, or have similar staffing structures.
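
To illustrate the logic of this staged coding and retrieval, a minimal sketch in Python follows. The construct, theme, and code labels are hypothetical placeholders rather than the study’s actual coding scheme, and in practice the team will use a qualitative analysis package such as NVivo rather than custom code; the sketch only shows how excerpts coded to constructs can be validated against a coding scheme and filtered by theme and respondent type.

from dataclasses import dataclass

@dataclass
class CodedExcerpt:
    """One coded segment of interview or focus group text."""
    respondent_type: str  # for example, "organizational leadership" or "Improvement Team member"
    agency: str
    text: str
    construct: str        # stage 1: broad implementation construct
    theme: str = ""       # stage 2: broad theme within the construct
    code: str = ""        # stage 3: narrower code within the theme

# Hypothetical constructs, themes, and codes (placeholders only, loosely
# inspired by Active Implementation Frameworks drivers).
CODING_SCHEME = {
    "competency_drivers": {"staff_training": ["coaching", "skill_practice"]},
    "organization_drivers": {"leadership_support": ["data_use", "policy_change"]},
}

def is_valid(excerpt):
    """Check that an excerpt's theme and code exist in the coding scheme."""
    themes = CODING_SCHEME.get(excerpt.construct, {})
    return excerpt.theme in themes and excerpt.code in themes[excerpt.theme]

def retrieve(excerpts, construct, theme=None, respondent_type=None):
    """Return excerpts coded to a construct, optionally filtered by theme and respondent type."""
    results = [e for e in excerpts if e.construct == construct]
    if theme is not None:
        results = [e for e in results if e.theme == theme]
    if respondent_type is not None:
        results = [e for e in results if e.respondent_type == respondent_type]
    return results

if __name__ == "__main__":
    data = [
        CodedExcerpt(
            respondent_type="Improvement Team member",
            agency="Agency A",
            text="Leadership reviewed father engagement data monthly.",
            construct="organization_drivers",
            theme="leadership_support",
            code="data_use",
        ),
    ]
    assert all(is_valid(e) for e in data)
    for excerpt in retrieve(data, "organization_drivers", theme="leadership_support"):
        print(excerpt.agency, "-", excerpt.text)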

Quantitative data

For the staff surveys, we will report descriptive statistics and aggregate responses at the team level. We will report means for scales used in the survey but do not anticipate conducting psychometric analysis of these data. We plan to administer the survey at two points and will compare descriptive statistics from each point using standard statistical techniques, such as a t-test to compare differences in means at the beginning and the end of the evaluation. We will also explore responses by staff position, but our ability to report on or draw inferences from position-level responses will depend on the sample size and response rate.
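
As a concrete illustration of this comparison, a minimal sketch in Python using SciPy follows. The scores are hypothetical, and the independent-samples t-test shown is an assumption; a paired test would be more appropriate if individual responses can be linked across the two survey administrations.

from scipy import stats

# Hypothetical mean scores on an organizational-culture scale (for example,
# a 1-5 Likert scale) from the two survey administrations. Real values would
# come from the collected staff survey data.
baseline = [2.8, 3.1, 3.4, 2.9, 3.0, 3.3, 2.7, 3.2]
follow_up = [3.4, 3.6, 3.1, 3.8, 3.5, 3.3, 3.7, 3.2]

# Independent-samples t-test comparing means at the two time points.
t_stat, p_value = stats.ttest_ind(baseline, follow_up)

mean_change = sum(follow_up) / len(follow_up) - sum(baseline) / len(baseline)
print(f"mean change: {mean_change:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")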

For any program data that teams can provide, we will report descriptive statistics. When possible, we will assess changes over time. Though this analysis will not be causal, it might suggest whether certain strategies and approaches are promising. For example, if an agency can provide data on participants from periodic case reviews or data from quality assurance reviews that reflect quality ratings of caseworkers’ engagement with fathers, we could examine whether reported paternal engagement increases the longer strategies are implemented.

Data Use

The FCL project team will use the information collected to provide insight into the overarching research questions. The team will also use the data to develop a report on the evaluation that includes options for integrating promising strategies into other child welfare settings. The final report will be used internally by ACF and made public. We will also produce short, engaging products for different populations, such as child welfare agency staff, fathers and paternal relatives, and the general public.

General limitations of the resulting study data will be noted in written materials and presentations. Written products will clearly state that this study aims to present a high-quality description of the implementation of the BSC in participating child welfare agencies, not to support statistical generalization to other agencies or service populations.

Data collected through this evaluation do not lend themselves to secondary analysis, and no data sets will be shared publicly. We will not generate a document for public use on how to properly interpret, analyze, and evaluate information from this collection because we are not creating a public-use file.

B8. Contact Persons

Mathematica is conducting this project under contract number HHSP233201500035I/HHSP23337025T. Mathematica developed the plans for this data collection. Experts in their respective fields from the ACF Office of Planning, Research, and Evaluation and Mathematica helped develop the design, data collection plan, and instruments for which we request clearance.

The following people can answer questions about this data collection effort:


Staff from Mathematica and the University of Denver will be responsible for collecting, processing, and analyzing the information for the Office of Planning, Research, and Evaluation.


Attachments

Instruments

Instrument 1. Interview topic guide

Instrument 2. Father and paternal relative focus group protocol

Instrument 3. Staff survey



Appendices

Appendix A. Informed consent form

Appendix B. Focus group recruitment flyer

Appendix C. Focus group reminders

Appendix D. Staff survey notifications

References

Coffey, Amanda Jane, and Paul A. Atkinson. Making Sense of Qualitative Data: Complementary Research Strategies. Thousand Oaks, CA: Sage, 1996.

Fung, Nickie, Jennifer Bellamy, Eliza Abendroth, Diletta Mittone, Roseana Bess, and Matthew Stagner. A Seat at the Table: Piloting Continuous Learning to Engage Fathers and Paternal Relatives in Child Welfare. OPRE Report #2021-62. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2021.

Yin, Robert K. “Discovering the Future of the Case Study Method in Evaluation Research.” Evaluation Practice, vol. 15, no. 3, October 1994, pp. 283–290. https://doi.org/10.1016/0886-1633(94)90023-X.





1 Formative Data Collections for ACF Research (OMB Control No. 0970-0356; GenIC approved 3/26/2018, with updates incorporated 8/31/2018) and Formative Data Collections for ACF Program Support (OMB Control No. 0970-0531; GenIC approved 11/5/2019).

2 As described in Section A2, the FCL project team convened an expert group to develop a Collaborative Change Framework. Improvement Teams tracked and reported on specific measures related to the framework to monitor progress and improvements at multiple points throughout the pilot study.


3 The total estimated number of respondents is greater in Los Angeles and Connecticut because we anticipate conducting additional focus groups at those two agencies. The number of focus group respondents at each of those agencies is estimated to be 24, compared with 12 at each of the other three agencies.


