
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes





Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program



OMB Information Collection Request

0970-0489





Supporting Statement

Part B


MARCH 2021





Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Maria Woolverton, Kelly Jedd McKenzie


Part B


B1. Objectives

Study Objectives

The study objectives are to identify promising programs and services in select areas of interest to ACF (e.g., education, housing, and employment, among others) for youth transitioning to adulthood from foster care, to document the present state of interventions in these areas, and to determine different programs’ readiness for potential rigorous evaluation in the future. The project will achieve its objectives through formative evaluation, including interviews and focus groups with program leaders, partners, stakeholders, staff, and participants.


Generalizability of Results

This study is intended to present internally valid descriptions of select programs and services for youth transitioning to adulthood from foster care, not to promote statistical generalization to other programs in different locations or with different service populations. The study has involved an environmental scan of programs across the country in order to select a range that captures some, but not all, program diversity. The goal is not to achieve generalizability but to illustrate the commonalities and differences among program models and approaches.


Appropriateness of Study Design and Methods for Planned Uses

The study design, involving formative evaluation, is the appropriate method for the project’s planned uses. Those uses, as described in Supporting Statement A, include informing ACF and the child welfare field about the present state of interventions and different programs’ readiness for future summative and impact evaluation. The methods include interviewing program leaders, partners, stakeholders, and staff to better understand the range of program models and to assess how well program activities, procedures, and outcomes for participants align with the programs’ stated goals, logic models, and/or theories of change. This study approach helps the project team evaluate whether the programs operate with fidelity to their models. The current study will not produce impact findings, track participant outcomes, or be generalizable to all similar programs. The project team will acknowledge these and other study limitations in written products associated with this project. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.




B2. Methods and Design

Target Population

The target population for this study and for each program selected includes young adults ages 18 and older transitioning to adulthood from foster care and practitioners from state or local agencies, private organizations, and other program partners who are knowledgeable about each program. The sampling frame for each program will consist of the roster of staff, partners, and participants. The unit of analysis is the individual, and the project team will use a nonrepresentative sample and interview up to:

  • 23 program leaders

  • 14 program partners

  • 66 program front-line staff

  • 240 program participants (i.e., young adults transitioning to adulthood from foster care; roughly 24 focus groups with 10 participants each)


Sampling and Site Selection

The respondent recruitment strategy is purposive, focused on individuals most knowledgeable about the programs and on a sample of program participants who are interested in and willing to discuss their experiences, positive and negative, in focus groups with peers.


Recruiting program staff: The project team will contact program leaders, front-line staff, and program partners affiliated with the programs the team selected for formative evaluations via email, using contact information from program websites. The team will use an email outreach script (Appendix A) and attach a one-page project description (Appendix B) to the email. If necessary, the team will follow up with emails or phone calls to answer any questions respondents may have.


Recruiting program participants: The project team will ask program leadership and staff at the selected programs to recruit participants for the focus groups. In the case of in-person focus groups, which the project is not currently conducting due to COVID-19, program leadership and staff will provide the physical space for the discussions. As directed by the project team, program staff will recruit young adult participants ages 18 and older who are currently participating or who have participated in the selected programs. The project team will ask program staff to recruit participants who can speak knowledgeably about the program based on their own experience, including both those who have experienced success in the program and those who have faced challenges. The project team will inform program staff of the purpose of the focus groups so staff are prepared to answer questions posed by prospective focus group participants, and will provide program staff with an informational fact sheet to aid with recruitment that describes the purpose of the study and addresses other logistical questions. For focus groups with young adults who received Education and Training Vouchers (ETVs) through the Chafee ETV program, the project team will engage organizations that serve foster care alumni to aid with recruitment (Instrument 5). The project team has developed specialized materials, including a recruitment flyer, an outreach email to young adults/program participants, and a project overview and frequently asked questions document, which it will provide to staff at these organizations to aid with recruitment (see Instrument 5 and Appendix B). The project team will update the language in these materials to reflect the local ETV program name [e.g., some states refer to the ETV as the Education and Training Grant (ETG)]. These materials currently include a placeholder for this update.


The project team selected the programs as part of activities completed to date for this study.1 The selection criteria for programs considered for formative evaluation included:

  • Programs in select areas of interest to ACF, including education, housing, and employment.

  • Programs that demonstrated a clear program model and that emphasized fidelity to that model.

  • Programs with program features identified as promising in the research literature.

  • Programs that demonstrated a clear understanding of the target population(s), had a clear process for identifying the target population(s), were deploying a model that was appropriate for achieving the program’s stated objectives, and had a robust referral process.

  • Programs with available data, or the potential for gathering data, for estimating sample size, including current program participation and potential program sizes.

  • Programs with available data, or the capacity to gather data, that demonstrate the intensity of the treatment.

  • Programs interested in taking part in formative evaluation and in the possibility of participating in future rigorous evaluation.



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The project team is using the same information collection instruments previously approved by OMB for this study [e.g., see a recent report (Dworsky, 2020)2; OMB #0970-0489], which are consistent with topics covered in formative evaluation research3 [e.g., see guide (OPRE, 2010)]. The instruments are designed to engage program leaders, program staff, program partners, and program participants in discussion about the structural, process, and content components of the programs, as well as contextual factors (e.g., policy changes), staffing issues (e.g., burden), and participant considerations (e.g., number of participants served) that may affect the feasibility of future rigorous impact evaluation. The project team does not plan to pilot the discussion guides. The team used similar guides in prior exploratory interviews during Phase I of the project and found them to be effective. Team members have also used similar questions in other studies.


The questions were streamlined to focus only on key research questions, so that individual interviews do not extend beyond 60 minutes and focus groups do not extend beyond 90 to 120 minutes.



B4. Collection of Data and Quality Control

Data collection will take place in the form of site visits (all virtual pending changes to COVID-19 travel restrictions and safety concerns). Tables A1 and A4 in Supporting Statement A summarize all the data collection that will be conducted for the evaluation. The team will reach out to program leaders, front-line staff, and partners via email requesting their participation in an interview (Appendix A). We will ask front-line staff who agree to participate in interviews to assist with the recruitment of program participants and will provide them with recruitment materials to do so (Instrument 5). All recruited staff and program participants will be assured that their participation is voluntary and that they are free to choose not to participate without consequence.


All project team members who participate in site visits will be trained on consent and interview procedures prior to entering the field. Project team members will sign the Urban Institute Confidentiality Forms prior to data collection and will store all data in locked cabinets or on a secure drive.


At least one senior researcher from the project team will attend each site visit, which will be virtual pending changes to COVID-19 travel restrictions and safety concerns. Virtual site visits will be scheduled based on interview and focus group participants’ availability to reduce burden and work disruption. If in-person site visits become feasible, the visits will generally last two days. The discussion guide questions for interviews and focus groups are designed to elicit nuanced responses, and the project team will need to probe with individualized follow-ups when answers are vague or ambiguous, or when the team wants to obtain more specific or in-depth information. At the start of the interviews and focus groups with staff, the project team will ask the respondents for verbal consent to participate and permission to record the conversation, using the informed consent form attached to each interview and focus group guide (Instruments 1-5). Verbal consent will also be requested during program participant focus groups. The team will ask program participants to sign consent forms for focus groups conducted in person, if feasible. The team will cover the following during the consent process: the study’s purpose and funder, the nature of the information that will be collected, how the information will be used, the potential benefits and risks of participating, and assurance that participation in the study is voluntary. The team will also inform study participants that they may choose to skip any questions or stop participating in the interview at any time. Additionally, the team will ask program participants for consent to audio record the focus group. The team will use this audio recording only to supplement notes taken during the focus group and will not record if any program participant does not consent to being recorded. If at any time a study participant becomes distressed, the study team members conducting the interview will stop the interview.


The project team will rely on leaders and staff at each program to help recruit program participants for focus groups. As directed by the project team, program staff will recruit young adult participants ages 18 and older. For focus groups conducted virtually, program staff will share the contact information for program participants with the project team. This information will be kept on a secure server and used for scheduling purposes as well as for tracking the distribution of gift cards to program participants. Once receipt of the gift cards is confirmed, all correspondence containing program participant contact information will be permanently deleted.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The interviews and focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported, though ACF anticipates a high level of response. The project team is identifying promising programs and expects that program leaders and other staff will be interested in sharing their insights with ACF. ACF anticipates that programs agreeing to the formative evaluations will be willing to participate fully in the study. Additionally, to make participating as easy as possible, the project team will conduct interviews on site or virtually and will work collaboratively with respondents to schedule the interviews on dates and at times that are most convenient for them.


Nonresponse


As study respondents will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated.


Program leaders and staff: In the event that staff are not available to participate in interviews, the project team will work closely with program leaders either to identify other staff with similar knowledge or to schedule telephone interviews or follow-up conversations. Because the evaluation is voluntary, any member of the program may choose not to participate. Any substantial nonresponse from members of a program will be reported as a study limitation and may result in excluding the program from the analysis.


Program participants: In the event that the project team does not have sufficient participation in focus groups, the team may consider working with staff to reschedule. If participants choose not to participate or not to contribute to the focus group discussion, the project team will respect their decision. The team has experience engaging young adults and others in focus groups and has generally found that individuals tend to be willing and interested in speaking and sharing their opinions. Although ACF expects high participation, the interviews and focus groups are voluntary and respondents may choose not to participate.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

For qualitative data coding, the project team will ensure inter-rater reliability by having multiple coders code several transcripts once the coding scheme is established, running coding comparison queries in NVivo, and recoding until a kappa coefficient above 0.80 is achieved, which is considered a high level of agreement between coders (McHugh, 2012). If the initial level of agreement is below 0.80, the coders will meet to discuss the definitions of each code before returning to recode the transcripts.
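
To illustrate the agreement threshold described above, the following is a minimal sketch of how Cohen's kappa could be computed for two coders' code assignments on the same excerpts. It is written in Python for illustration only (the study's actual comparison queries will be run in NVivo), and the code labels and example data are hypothetical.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Compute Cohen's kappa for two coders' labels on the same excerpts."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of excerpts assigned the same code by both coders.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, based on each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical code assignments for ten transcript excerpts.
coder_1 = ["context", "staffing", "context", "outcomes", "context",
           "staffing", "outcomes", "context", "staffing", "outcomes"]
coder_2 = ["context", "staffing", "context", "outcomes", "staffing",
           "staffing", "outcomes", "context", "staffing", "outcomes"]

kappa = cohens_kappa(coder_1, coder_2)
print(f"kappa = {kappa:.2f}")  # values above 0.80 indicate a high level of agreement
if kappa < 0.80:
    print("Agreement below threshold: coders should reconcile code definitions and recode.")

In this hypothetical example the coders disagree on one of ten excerpts, yielding a kappa of roughly 0.85, which would meet the 0.80 threshold; a lower value would trigger the reconciliation and recoding step described above.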


Data Analysis

Qualitative data analysis will combine information from the various data sources. The semi-structured interview guides and focus group protocols the project team developed to guide qualitative data collection include discussion topics and questions that reflect key implementation study research questions. The project team will transcribe audio recordings of interviews and focus groups and will clean notes in cases where participants did not consent to being recorded. The team will develop a coding scheme to organize the data into themes or topic areas. Transcripts will be coded (tagged based on the theme or topic to which they are relevant) and analyzed using the qualitative analysis software package NVivo.
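
As a conceptual illustration of this coding and querying step, the sketch below assumes a simple hypothetical data structure in which each coded excerpt records its source, program, and theme, and shows how coded excerpts could be grouped by theme across data sources. It is illustrative only: the actual tagging and queries will be performed in NVivo, and all names and excerpts here are invented.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class CodedExcerpt:
    source: str   # e.g., "leader interview", "participant focus group"
    program: str  # program with which the respondent is affiliated
    theme: str    # code from the coding scheme
    text: str     # excerpt from the cleaned transcript or notes

# Hypothetical coded excerpts drawn from several data sources.
excerpts = [
    CodedExcerpt("leader interview", "Program A", "program context",
                 "A recent state policy change expanded eligibility."),
    CodedExcerpt("staff interview", "Program A", "implementation",
                 "Caseloads make weekly check-ins hard to sustain."),
    CodedExcerpt("participant focus group", "Program A", "implementation",
                 "My coach met with me every week during the first semester."),
]

# Group excerpts by theme so findings can be compared across data sources and programs.
by_theme = defaultdict(list)
for ex in excerpts:
    by_theme[ex.theme].append(ex)

for theme, items in by_theme.items():
    sources = sorted({ex.source for ex in items})
    print(f"{theme}: {len(items)} excerpt(s) from {', '.join(sources)}")

Grouping excerpts by theme in this way mirrors how the team will triangulate perspectives from leaders, staff, partners, and participants on the same topic area.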


Data Use

The project team will produce final products for ACF, including reports about the programs that may be released to the public. The products will inform ACF and the child welfare field about the present state of interventions and different programs’ readiness for future summative and impact evaluation. The individual formative evaluation reports, and the analyses on which the reports are based, will be organized around assessing the falsifiability of key assumptions in each program’s logic model. The project team will customize each report to the selected program and the assumptions the team tested. In general, however, each report will assess assumptions about 1) program context (e.g., key relations and partnerships with other organizations; availability of competing services), 2) target population, 3) target population participation/engagement, 4) implementation of key program elements, and 5) participant outcomes. Each report will conclude with a description of the project team’s assessment of the steps necessary to bring the program to the point of readiness for experimental evaluation and the likelihood that each step could be undertaken within the time frame of Phase II activities.



B8. Contact Persons

The Administration for Children and Families (ACF), Office of Planning, Research, and Evaluation (OPRE) is funding and overseeing this project. The Federal project officers for this project are Maria Woolverton and Kelly Jedd McKenzie.


The information for this study is being collected by the Urban Institute and Chapin Hall at the University of Chicago on behalf of ACF. Principal Investigator Michael Pergamit led development of the study design plan and data collection protocols, and will oversee collection and analysis of data gathered through on-site interviews and telephone interviews.



Attachments

Appendix A: Outreach email for discussion with Program Administrators and Staff

Appendix B: Additional Project Materials

Instrument 1: Discussion Guide for program leaders

Instrument 2: Discussion Guide for program partners and stakeholders

Instrument 3: Discussion Guide for program front-line staff

Instrument 4: Focus Group Guide for program participants

Instrument 5: Program Staff Recruitment for Focus Group Participants



1 Activities approved under this OMB number (0970-0489). For information about previously approved materials and activities, see https://www.reginfo.gov/public/do/PRAOMBHistory?ombControlNumber=0970-0489

2 Amy Dworsky, “Supporting College Students Transitioning Out of Foster Care,” Urban Institute, August 28, 2020, https://www.urban.org/research/publication/supporting-college-students-transitioning-out-foster-care

3 “The Program Manager’s Guide to Evaluation, Second Edition,” OPRE, January 15, 2010, https://www.acf.hhs.gov/opre/report/program-managers-guide-evaluation-second-edition


