Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
The John H. Chafee Foster Care Program for Successful Transition to Adulthood
Strengthening Outcomes for Transition to Adulthood (Chafee SOTA) Project
Formative Data Collections for ACF Research
0970 - 0356
Supporting Statement
Part B
AUGUST 2022
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers:
Kelly Jedd McKenzie
Harmanpreet Bhatti
Part B
B1. Objectives
Study Objectives
The objective of the proposed information collection is to identify 5-6 programs serving youth transitioning from foster care that demonstrate readiness to participate in future evaluation activities that will build the evidence base for what works to promote positive outcomes for this population of youth. A minimum of 36 candidate programs will be identified and then methodically screened to select a final 5-6 programs to participate in the primary evaluation that constitutes the larger study. Onsite evaluability assessments will be conducted with nine selected programs to ensure the final 5-6 selected programs meet the study criteria and demonstrate readiness to participate in an evaluation of their services.
Generalizability of Results
The proposed iterative screening process is intended to help the study team identify 5-6 programs that could participate in the larger study (evaluation). The nomination process, as well as the proposed interviews and focus groups, is not designed to produce statistically generalizable findings. Consequently, no generalizable results are expected.
Appropriateness of Study Design and Methods for Planned Uses
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
Our model for identifying programs to participate in this study will be the Systematic Screening and Assessment (SSA) Method. SSA combines components of traditional screening and assessment methodology with planned input and feedback from stakeholders who have direct experience and perspective to inform the process (e.g., federal staff, individuals with lived experience). The SSA Method is an approach aimed at identifying practice-based interventions that are ready for evaluation. In a sequential process, the SSA Method involves a call for nominations of promising practices, expert input and feedback, and evaluability assessments (EAs) (Leviton and Gutman, 2010). For the purposes of this collection, we will implement four steps associated with the SSA Method. These are: (1) solicit nominations (activity 1); (2) conduct interviews with directors of the 36 programs (activity 2); (3) conduct EAs with nine programs (activity 3); and (4) with experts and in consideration of EA findings, make recommendations for 5-6 programs to participate in the larger study. These sequential activities, with each step directly related to the next, allow us to move from a broad set of potential programs to the 5-6 needed for the larger study. In addition, for each step, we will document the methods and decision points by which programs are selected to move to the next step, thereby providing justification for each one. In the end, we will have gathered only the information necessary to select the final program sample.
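The four SSA steps form a sequential screening funnel (nominations, then 36 interviews, then 9 EAs, then 5-6 recommendations). The sketch below illustrates that funnel logic only; the program records, screening criteria, and field names are hypothetical placeholders, not part of the SSA Method or this study's actual instruments.

```python
# Illustrative sketch of the four-step SSA screening funnel described above.
# All data and criteria here are hypothetical; only the funnel counts
# (36 interviews -> 9 EAs -> up to 6 recommendations) come from the text.

def screen(programs, criterion, keep):
    """Apply one screening criterion and retain at most `keep` programs."""
    passed = [p for p in programs if criterion(p)]
    return passed[:keep]

# Step 1: nominations received through the open call (hypothetical data).
nominations = [{"name": f"Program {i}", "complete": True, "ready": i % 2 == 0}
               for i in range(50)]

# Step 2: select up to 36 complete nominations for director interviews.
interviewed = screen(nominations, lambda p: p["complete"], keep=36)

# Step 3: select nine programs for evaluability assessments (EAs).
ea_sites = screen(interviewed, lambda p: p["ready"], keep=9)

# Step 4: with expert input and EA findings, recommend 5-6 programs
# for the larger study (here, simply the first six EA sites).
recommended = ea_sites[:6]

print(len(interviewed), len(ea_sites), len(recommended))  # → 36 9 6
```

Each step consumes only the output of the previous one, which mirrors the documentation requirement in the text: the decision rule (`criterion`) applied at each stage is explicit and can be recorded as the justification for advancing a program.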
B2. Methods and Design
Target Population
Activity 1. Activity 1 includes a “call” for program nominations, broadcast via relevant listservs to a variety of audiences (potential respondents) and posted on relevant public sites (such as the ACF or contractor website, or on relevant social media outlets).
Activity 2. Activity 2 includes telephone interviews with up to 36 executive directors identified through activity 1.
Activity 3. Activity 3 includes several data collection efforts with different target populations. These are:
Interviews
Onsite (or virtual) interviews with executive directors (n=9) and partner agency directors (n=3)
Focus groups
Onsite (or virtual) focus groups with program staff (n=144) and partner agency staff (n=144)
Onsite (or virtual) focus groups with youth participants (n=72)
Sampling and Site Selection
Activity 1
Information on submitted nomination forms will be reviewed for completeness and to classify each nominated program according to the prioritized information needs and inclusion/exclusion criteria. For example, we will assess whether the target population is defined and identifiable, whether mechanisms are in place for recruiting and engaging the target population into the program, and, with regard to implementation, whether activities and program components are well-defined, measurable, and routinely implemented. Next, the pool of candidate programs will be reviewed and up to 36 programs will be selected to participate in the next stage of the identification process (activity 2).
Activity 2
We will contact 36 executive directors selected from activity 1 (the nomination process); selected programs will meet one or more critical information needs as identified by the guiding questions and inclusion/exclusion criteria associated with activity 1. Directors will be contacted initially via email or telephone for an introduction to the project and to ascertain their interest in being considered for participation. If executive directors are interested, we will schedule an in-depth telephone interview at their convenience to gather further information on operations, how practices and procedures support the program’s design, and to discuss any gaps between the program goals and service model. We will synthesize the information gathered in a profile for each program.
Activity 3
Based on the information from activity 2, we will consider which programs are most likely to provide opportunities for further assessment and will select nine programs for EAs. Once selected, the sampling frame for each of the nine sites1 will be: (1) the director of each program agency for individual interviews; (2) the director of three partner agencies for individual interviews; (3) program staff identified by agency directors for focus groups; (4) partner agency staff identified by partner agency directors for focus groups; and (5) youth identified by agency directors or program staff for focus groups. The research team will work with executive directors to implement non-probability, purposive sampling to identify potential respondents who can provide information on the study’s key constructs. Because participants will be purposively selected, they will not be representative of the population of program or partner agency staff or youth participants. All EA activities will be scheduled at the convenience of the participants.
B3. Design of Data Collection Instruments
Development of Data Collection Instruments
Activity 1. The introduction to the nomination form includes basic information about the Chafee SOTA project, the call for nominations, and frequently asked questions. The submission form itself requests the name and contact information for the candidate program, as well as five questions about the program. The questions, based on qualifying criteria, were developed by senior staff with extensive experience in survey development.
Activities 2 and 3. These activities are being undertaken to identify nine programs for an evaluability assessment, and then from those nine programs, 5-6 to participate in the larger study. As such, the data collection instruments (telephone and onsite program director and partner agency director interview guides; and onsite focus group guides for groups with program and partner agency staff and youth participants) were developed specifically for the purposes of this effort and are tied directly to the guiding questions for each key activity (see Supporting Statement A). In addition, each instrument was carefully vetted to include only those questions necessary to achieve the objectives of this data collection effort. The instruments were designed by senior Chafee SOTA team members with expertise in the design and implementation of interview and focus group protocols.
B4. Collection of Data and Quality Control
Exhibit B-1: Data Collection and Quality Control
| Data Collection Activity | Who is collecting the data? | Mode of data collection | Recruitment protocol | Data collection quality and consistency monitoring |
|---|---|---|---|---|
| Activity 1: Call for program nominations | Senior members of the Chafee SOTA project team with experience in multimode process and survey development | Web and email | | |
| Activity 2: Follow-up interviews with 36 executive directors | Senior members of the Chafee SOTA project team with experience in qualitative and semi-structured interviewing methods | Telephone or virtual platform (Zoom or Teams) | | |
| Activity 3: Evaluability assessments with 9 programs (interviews and focus groups) | Senior members of the Chafee SOTA project team with experience in qualitative and semi-structured interviewing methods, with support from research assistants (who will take notes) who are also members of the team with experience in qualitative methods | Onsite or virtual | | |
B5. Response Rates and Potential Nonresponse Bias
Response Rates
The nomination process as well as proposed interviews and focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.
Nonresponse
As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. The only respondent demographics that will be documented and reported in written materials associated with the data collection are those that allow us to understand what respondent category the individual is in (e.g., youth participant, program agency staff). Other demographic information is not needed to analyze results and interpret findings.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, either for internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
Activity 1. Data collected from nomination forms will be reviewed for completeness. Staff will conduct a brief internet search to glean additional information and, if appropriate, attempt to resolve any unanswered questions before moving on to activity 2. Programs will be classified according to the prioritized information needs and services provided.
Activities 2 & 3. The procedures for monitoring and handling data collection for activities 2 & 3 are outlined below.
Data Monitoring. We will implement a process for routine handling of data and monitoring of data collection, establishing procedures that facilitate prompt submission and review of collected data. We will also establish a schedule for routine data submission (e.g., data will be submitted for transcription and review promptly after collection) to ensure there are no lags between collection and submission. To ensure the data are of high quality, the task lead will implement a process for immediate review of the data for quality, completeness, and integrity. As part of this process, task leads will meet with data collection staff on a regular basis to review the data, discuss any issues, and problem-solve challenges or barriers staff may face in the accurate collection of the data. If collected data are not of sufficient quality to meet the needs of this effort, task leads will implement corrective actions through additional training and monitoring activities.
Prepare Data for Analysis. We will prepare the data for analysis. We will routinely run quality assurance checks as described above to ensure the data are of high quality and meet the needs of this information collection.
Data Analysis
Activity 1. All submitted nominations (received through the website and via email) will be received, acknowledged, and reviewed on a rolling basis within 48 hours and assigned to senior staff for additional, in-depth examination and follow-up. Information on submitted nomination forms will be reviewed for completeness and to classify each nominated program according to the prioritized information needs and inclusion/exclusion criteria.
Activities 2 & 3. Information collected from the interviews and focus groups conducted with program and partner agency directors and staff, as well as youth participants, will be guided by the study protocols. The purpose of the data collection is to obtain the data needed for site identification and evaluability assessments. As such, we will use descriptive analyses to determine the extent to which data are aligned with the inclusion and exclusion criteria used to select programs for site identification and evaluability assessment. For example, it is important that a selected program has been operational for a period of time sufficient to enroll enough youth to test the program model. Analysts will look to data collected from executive directors to determine the length of time the program has been in operation and the number of youth it has served (or is serving). Similarly, data will be used to describe the extent to which youth are engaged in program planning, implementation, or feedback, another important factor for participation. For questions related to challenges and facilitators, our analysts will use content analysis to identify common themes both within and across potential program sites; however, these data, too, will be used for site identification purposes and not to inform any cross-site conclusions about program effectiveness. For example, program staff in one site might report that collecting consistent data across program participants and facilitators is a challenge resulting in poor quality data, while another site reports that data collection is a strength and their data are of high quality. This information, while useful on its own, will be used in this context only as a factor in site selection.
Data Use
All data collected will be used to screen nominated programs, using inclusion/exclusion criteria, for potential participation in the larger study evaluation.
B8. Contact Persons
Kelly Jedd McKenzie, Senior Social Science Research Analyst, Office of Planning, Research and Evaluation, [email protected]
Harmanpreet Bhatti, Social Science Analyst, Office of Planning, Research and Evaluation, [email protected]
Susan H. Chibnall, Ph.D., Chafee SOTA Project Director, [email protected]
Gail M.L. Thomas, Chafee SOTA Team Task Lead, [email protected]
Attachments
Instrument 1 – Call for Nominations
Instrument 2 – Follow-up Questions for Nominated Program Executive Directors
Instrument 3 – EA Interview Guide for Nominated Program Executive Directors
Instrument 4 – EA Interview Guide for Partner Agency Executive Directors
Instrument 5 – EA Focus Group Guide for Nominated Program Staff
Instrument 6 – EA Focus Group Guide for Partner Agency Staff
Instrument 7 – EA Focus Group Guide for Youth Participants
Instrument 8 – EA Discussion Guide for Data Administrators
Appendix A – Advance Emails
Appendix B – Interview Emails
Appendix C – Focus Group Emails
Appendix D – Consent Forms
1 “Site” refers to the larger context in which the program exists – either a county or city or other locale. The site also includes partner agencies and entities from which youth are recruited to participate in the program (e.g., schools, community centers, community mental health settings, agencies).