
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program




OMB Information Collection Request

0970-0489




Supporting Statement

Part A






MARCH 2021





Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Maria Woolverton, Kelly Jedd McKenzie



Part A


Executive Summary


  • Type of Request: This Information Collection Request is for an extension with no changes to a previously approved information collection. We are requesting 2 years of approval.

  • Progress to Date: Project activities for this information collection have not changed since the project’s most recent OMB non-substantive change approval on 1/19/2021. The project team has identified promising programs and services for youth transitioning to adulthood from foster care and has conducted formative evaluation activities through interviews and focus groups with program leaders, program partners, frontline staff, and participants, and through analysis of program administrative data.

  • Timeline: The project’s original timeline has been affected both directly and indirectly by the COVID-19 pandemic. The direct effects included pausing data collection activities in early 2020, when the programs under evaluation were adjusting to new conditions and circumstances and were less available for interviews. As social distancing measures and travel restrictions took hold, the indirect effects included revising the original research design, stopping in-person interviews, and developing virtual-interview procedures, which required OMB review and approval. Arranging virtual focus groups with program participants has also affected the timeline, as the project team has needed to build in more time and enlist additional program staff and other stakeholders to help recruit program participants for the study. The timeline has further been extended by ACF’s interest in increasing the number of program participants and staff included in the study, which lengthens the time needed to collect the information. The project requests a two-year extension of its current OMB approval to complete all planned data collection activities and analyses.

  • Description of Request: The project is seeking clearance to extend the data collection period, to recruit and interview additional program leaders, staff, and program partners, and to conduct several focus groups with program participants. Additional respondents are proposed in order to gather perspectives from a wider range of program participants and staff. We do not intend for this information to be used as the principal basis for public policy decisions.






A1. Necessity for Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) requests extension of a currently approved information collection with minor updates to the number of respondents. The purpose of the extension is to continue data collection for the Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program. These formative evaluations include, but are not limited to, programs in the following areas: employment, housing, and postsecondary education for youth transitioning to adulthood from foster care.


Although considerable research over the past several decades documents the challenging early adult outcomes of youth who have aged out of foster care, very little is known about the effectiveness of interventions aimed at improving young adults’ outcomes. To understand the current program and practice landscape, the U.S. Department of Health and Human Services, Office of Planning, Research, and Evaluation (OPRE) contracted with the Urban Institute (Urban) and its partner Chapin Hall at the University of Chicago (Chapin Hall) to identify promising programs and services for youth transitioning to adulthood from foster care. ACF is engaging in this collection at the agency’s discretion, and these activities help to fulfill the requirement for evaluations of promising programs under the legislation for the Chafee program.


To date, the project team has held 5 focus groups and conducted interviews with approximately 38 program leaders and 31 program frontline staff. To complete data collection, this request is to continue using the currently approved materials to conduct no more than 24 additional focus groups with up to 10 participants each, and to interview an additional 23 program leaders, 14 program partners, and 66 program frontline staff.

A2. Purpose

Purpose and Use

The purpose of the interviews and focus groups is to gather detailed information about program sites’ services from program leaders, staff, and partners and from young adults who are currently participating or have participated in the programs. This information collection adds to the data the project team has been collecting about program models, target populations, program inputs, outputs, and anticipated outcomes. The additional respondents expand the sample and strengthen the study by widening the range of perspectives from which to draw conclusions.

The project team will use the information and themes gathered from the interviews and focus groups to better understand the range of program models and to further assess how well program activities, procedures, and outcomes for participants align with the programs’ stated goals, logic models, or theories of change (i.e., whether programs operate with fidelity). The intended use of the results is to inform ACF and the child welfare field about the present state of interventions in the selected domains and to determine different programs’ readiness for future summative and impact evaluation.

The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.


Research Questions or Tests

Our core study questions for understanding program site models and implementation include:

  • Who does the program target?

  • How many participants are served?

  • What is the outreach and referral process?

  • What are the eligibility requirements?

  • What is the program’s logic model?

  • What are the primary inputs, outputs, and intended outcomes?

  • What is the context of the program? What other similar or related services or programs serve the same population?


Our core study questions for assessing program sites’ readiness for rigorous evaluation include:

  • How well does the program’s logic model reflect actual program practice?

  • How valid or reliable are the data and measures programs currently use to evaluate outcomes?

  • What would the burden on program staff be to implement new measures if necessary?

  • Is the program large enough to make rigorous evaluation feasible?

  • What further program development would be required for a rigorous impact evaluation?

  • Is a rigorous impact evaluation feasible?



Study Design

The study design involves formative evaluations that, among other objectives, detail each program’s logic model and assess how well that logic conforms to each program’s actual operations, using readily available data. The data collection activities include one virtual site visit each with up to 7 housing or employment programs, and between 17 and 24 focus groups with program participants in housing, employment, and Education and Training Voucher (ETV) programs. The data collection will focus on obtaining information about structural program components (e.g., hours of operation or training, class sizes if applicable, intake process, process for identifying target participants, eligibility criteria), as well as process and content components. Process components include, for example, matching internships to young adult interests in an employment program, or mentor-student interactions in a college success program. Content components could include, for example, curricula or class schedule offerings, which might involve interviews with staff or reviewing course schedule materials.



Interviews with program staff and partners: The study includes interviews with program leaders, front-line staff, and program partners to gather and synthesize each stakeholder’s individual vantage point on the program.



Focus groups with program participants: The study also includes focus groups with program participants. Young adults ages 18 and older who are participating or have participated in the selected programs have a special vantage point on how the programs fit within their lives, meet their needs, and could be improved. The design uses focus groups rather than one-on-one interviews to create opportunities for group-level insights as participants reflect together on services, challenges, and recommendations; focus groups are also an efficient way to gather multiple perspectives at one time.



See Supporting Statement B for recruitment procedures for all participants.



The project is also making use of administrative data collected from programs. The current request no longer includes the previously approved compilation and submission of administrative data files, since that work has already been completed.



The planned study design, with these qualitative interviews and focus groups, is the best approach for obtaining the information we need to (a) better understand the services currently being delivered and (b) assess whether programs are being operated with fidelity.


Table A1.

Data collection activity: Interviews with program leaders
Instrument: Discussion Guide for program leaders (Instrument 1)
Respondents: Program leaders
Content: Program leaders’ perspectives about the programs
Purpose: To understand program leaders’ roles in the program, the main components of the program, the program’s goals, successes, and challenges, and who the program serves
Mode: Virtual (phone or video), or in person if feasible
Duration: 1 hour

Data collection activity: Interviews with program partners and stakeholders
Instrument: Discussion Guide for program partners and stakeholders (Instrument 2)
Respondents: Program partners and stakeholders
Content: Program partners’ and stakeholders’ perspectives about the programs
Purpose: To understand the relationship between the program and program partners/stakeholders
Mode: Virtual (phone or video), or in person if feasible
Duration: 1 hour

Data collection activity: Interviews with program front-line staff
Instrument: Discussion Guide for program front-line staff (Instrument 3)
Respondents: Program front-line staff
Content: Front-line staff’s perspectives about the programs
Purpose: To understand front-line staff roles in the program, the main components of the program, the program’s goals, successes, and challenges, and who the program serves
Mode: Virtual (phone or video), or in person if feasible
Duration: 1 hour

Data collection activity: Focus groups with program participants
Instrument: Focus Group Guide for program participants (Instrument 4)
Respondents: Young adult program participants
Content: Participant perspectives about the programs
Purpose: To understand program goals, successes, and challenges, and who is served, from participants’ firsthand perspectives
Mode: Virtual (video), or in person if feasible
Duration: Up to 2 hours


Other Data Sources and Uses of Information

These data from interviews and focus groups will be used in concert with the information already collected for this study through completed interviews with program leaders, program partners, other stakeholders, and frontline staff. These data will also be used in concert with other available information, such as published program materials (e.g., program manuals and annual reports), to minimize collecting information that has already been compiled.



A3. Use of Information Technology to Reduce Burden

The study will use semi-structured interviews and focus groups. While some technology, such as computer-assisted instruments, may reduce burden on respondents (e.g., enabling respondents to answer questions on their own time), this study requires direct person-to-person communication. The discussion questions are designed to elicit nuanced responses, and the project team will need to probe appropriately. A computer-assisted survey method would not allow the interview flexibility the project requires.

To reduce respondent burden, the project team will hold small-group interviews and focus groups virtually over Zoom or a conference line, or in person, depending on the ongoing public health situation. Small-group interviews and focus groups will reduce the overall time that a single organization spends on the study. The project team will try to schedule the interviews and focus groups so that input from multiple respondents with comparable roles in the same organization (e.g., case workers, participants) increases the efficiency and the amount of information the project team can gather in a single session. With respondents’ permission, the project team will audio record the focus groups with program participants to minimize the time needed for potential follow-up for clarification.


A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

The information collected will not duplicate information that is already available. The project team will review written documents and organizational materials in advance and use the interviews to fill in missing information, which will make the interviews more efficient. The project is designed to gather details about programs and services that will allow ACF to assess whether a program or service would be a strong candidate for future rigorous evaluation. No other studies are exploring these programs or services with these goals in mind or have collected the information the project team needs to make these assessments.


A5. Impact on Small Businesses

It is possible that some of the organizations recruited into this study for site visits and concomitant interviews will be small businesses. The team will minimize the burden on program staff by keeping the interviews and focus groups as short as possible, by scheduling the interviews and focus groups at a time most convenient for respondents, and by not requesting written responses.




A6. Consequences of Less Frequent Collection

Potential negative consequences of less frequent data collection would be outdated or inaccurate findings. The study design calls for the minimum number of staff and participant hours necessary for successful and complete data collection. To reduce the time burden on program staff and participants, the project team will conduct the interviews and focus groups as efficiently as possible and will work with program leaders and staff to determine the most appropriate respondents for each interview and focus group.


A7. Now subsumed under 2(b) above and 10 (below)



A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this request to continue the information collection activity. This notice was published on January 19, 2021 (Volume 86, Number 11, page 5199) and provided a sixty-day period for public comment. No comments were received during the notice and comment period.


Consultation with Experts Outside of the Study

The project has consulted with Children’s Bureau staff for input on program selection and the agency’s priority program areas.


A9. Tokens of Appreciation

Program leaders and staff will not receive a token of appreciation for participating in interviews.

As previously approved, the project team will give program participants who attend the focus groups $25 in order to encourage participation among those young adults who might not otherwise take part in the research. By encouraging otherwise reluctant young adults, the study reduces the risks associated with nonresponse bias – namely the risk that the project team draws inaccurate or biased conclusions about the program. For focus groups conducted virtually, participants will be given a $25 Visa e-gift card.

Respondent participation is voluntary. The $25 for focus group participants is intended to assist with transportation costs, child care, or other expenses that might prevent some in our target population from participating (i.e., those with the greatest financial challenges or other barriers, and whose absence could contribute to nonresponse bias). Based on the focus groups conducted for the project to date, the tokens of appreciation were sufficient for encouraging participation. As noted in section A1 above, since the project’s most recent approved OMB extension on 3/15/2019, the project team has held 5 focus groups with program participants from five programs.

To prevent the token of appreciation from being coercive, the project team will provide participants who show up for the focus group with a $25 Visa e-gift card, regardless of whether an individual ultimately chooses to stay and participate.



A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

We will obtain names and email addresses in order to schedule interviews with program staff and partners during the site visits. Program staff will provide the project team with a list of contact information for program participants who have expressed interest in participating in focus groups. This information will be used for scheduling purposes, and program participant email addresses will also be used to send the $25 e-gift card token of appreciation to those who participate in a virtual focus group. When we send participants the token of appreciation, we will ask for an email or text message confirmation that they received the gift card. Once confirmation is received, we will permanently delete any correspondence with the participant that contains contact information, including emails and texts. If in-person focus groups with program participants become feasible, agency staff will lead recruitment and maintain all program participant contact information.



Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information. Urban has also obtained Institutional Review Board (IRB) approval for all data collection under this contract. All researchers working with the data will read and sign the Urban Institute’s Confidentiality Pledge, agreeing to adhere to the data security procedures laid out in the approved IRB submission. The contractor will safeguard all data, and only authorized users will have access to them. Information gathered for this study will be made available only to researchers authorized to work on the study.


For interviews with program leaders, staff, and program partners, Urban will use the informed consent documents attached to each interview guide (Instruments 1-3) to obtain consent for participation in the study. These forms detail the risks and benefits of participating and the expected privacy for each participant. These respondents are not in categories designated as vulnerable populations, and the information the evaluation team will collect is not highly sensitive. Because some study participants will be program leaders, staff members, or program partner staff, and because the team will name the programs in its reports, individuals reading the reports may be able to attribute particular information or comments to a respondent. The evaluation team will inform respondents about this potential risk.


For focus groups with program participants, Urban will use the informed consent for participants included with the focus group guide for young adults (Instrument 4). The consent statement details the risks and benefits of participating and the level of expected privacy for each participant. The questions primarily revolve around the young adults’ experience with the program and do not include sensitive questions (see section A11). Program participants will be informed that they may choose not to answer any or all questions during the interview.


The project team will rely on leaders and staff at each program to recruit young adults for the focus groups and to provide the physical space for the discussions if they are in person. As directed by the project team, agency staff will recruit young adult participants ages 18 and older. Should focus groups be conducted virtually, program staff will share the contact information for program participants with the project team. This information will be kept on a secure server and used for scheduling purposes as well as tracking the distribution of gift cards to the program participants. If focus groups are conducted in person, agency staff will collect and maintain any contact information necessary for recruitment and coordinate with young adult focus group participants.


To maintain participants’ privacy, the project team will request verbal consent at the start of each discussion. Participants will be provided a physical copy of the consent form before the interview if it is in-person or presented with the consent form via video or email if the visit is virtual. Program staff who helped with the recruitment may be physically present at the location if these discussions are conducted in-person but will not be permitted in the focus group itself. If conducted virtually, program staff will not be permitted on the Zoom or phone call during the discussion. Similar to an in-person setting, program staff may be present at the start of the meeting to help participants connect with the project team (e.g., set up the technology, ensure participants are in the right place) but will exit before the discussion begins. If the program cannot recruit enough participants for the focus groups, we will ask to interview participants individually and follow the same privacy, informed consent, and interview procedures as a virtual or in-person focus group.


Data Security and Monitoring

As specified in the contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ PII. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor, who perform work under this contract/subcontract, are trained on data privacy issues and comply with the above requirements. 


As specified in the evaluator’s contract, the Contractor shall use Federal Information Processing Standard (FIPS) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. The Contractor shall ensure that this standard is incorporated into the Contractor’s property management/control system and shall establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for protecting any paper records, field notes, or other documents that contain sensitive information or PII, ensuring secure storage and limits on access.









A11. Sensitive Information [1]

There are no sensitive questions in this data collection. As noted above, the project team will inform respondents that participation is voluntary.



A12. Burden

Explanation of Burden Estimates

Table A2 below shows the estimated burden of the information collection, which will continue for a period of approximately 2 more years. The previously approved information collection included compilation and submission of administrative data files, which is not included here since that work has already been completed.

We expect that respondents during this timeframe will include:

  • 23 Program leaders for 1 hour each (Instrument 1)

  • 14 Program partners and stakeholders for 1 hour each (Instrument 2)

  • 66 Program front-line staff for 1 hour each (Instrument 3)

  • 240 Program participants for 2 hours each (Instrument 4)

  • 96 Program front-line staff to assist with program participant focus group recruitment for 8 hours each (Instrument 5)

The total annual cost burden to respondents is approximately $12,634.96. For program leaders and program partners, the figure ($35.05/hr) is based on the mean wage for “Social and Community Service Managers,” job code 11-9151, as reported in the May 2019 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates.[2] For program front-line staff, the figure ($24.53/hr) is based on the mean wage for “Child, Family, and School Social Workers,” job code 21-1021, as reported in the May 2019 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates.[3] Wage data for focus group participants are based on the federal minimum wage of $7.25/hr as set by the U.S. Department of Labor.

Estimated Annualized Cost to Respondents

The estimated total annualized burden for this effort is 676 hours, and the estimated annualized total respondent cost is $12,634.96. A breakdown of the estimated annualized burden and cost is shown in Table A2 below.


Table A2.

| Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total Burden (in hours) | Annual Burden (in hours) | Average Hourly Wage Rate | Total Annual Respondent Cost |
|---|---|---|---|---|---|---|---|
| Program Staff Recruitment for Focus Group Participants | 96 | 1 | 8 | 768 | 384 | $24.53 | $9,419.52 |
| Discussion Guide for program leaders | 23 | 1 | 1 | 23 | 12 | $35.05 | $420.60 |
| Discussion Guide for program partners and stakeholders | 14 | 1 | 1 | 14 | 7 | $35.05 | $245.35 |
| Discussion Guide for program front-line staff | 66 | 1 | 1 | 66 | 33 | $24.53 | $809.49 |
| Focus Group Guide for program participants | 240 | 1 | 2 | 480 | 240 | $7.25 | $1,740.00 |
| Total | 439 | | | 1,351 | 676 | | $12,634.96 |
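
For readers who want to trace the arithmetic, the short sketch below (illustrative only; the variable names and the rounding step are ours and are not part of the approved collection) reproduces the annual burden and annual respondent cost figures in Table A2, assuming the two-year request period described above.

```python
# Illustrative recomputation of the Table A2 annual burden and cost figures.
# Row format: (instrument, respondents, responses per respondent, hours per response, hourly wage)
rows = [
    ("Program Staff Recruitment for Focus Group Participants", 96, 1, 8, 24.53),
    ("Discussion Guide for program leaders", 23, 1, 1, 35.05),
    ("Discussion Guide for program partners and stakeholders", 14, 1, 1, 35.05),
    ("Discussion Guide for program front-line staff", 66, 1, 1, 24.53),
    ("Focus Group Guide for program participants", 240, 1, 2, 7.25),
]

REQUEST_PERIOD_YEARS = 2  # two-year approval period

total_annual_hours = 0
total_annual_cost = 0.0
for name, respondents, responses, hours_per_response, wage in rows:
    total_hours = respondents * responses * hours_per_response
    # Annual burden spreads the total over the request period and rounds to
    # whole hours, matching Table A2 (e.g., 23 total hours -> 12 per year).
    annual_hours = round(total_hours / REQUEST_PERIOD_YEARS)
    annual_cost = annual_hours * wage
    total_annual_hours += annual_hours
    total_annual_cost += annual_cost
    print(f"{name}: {annual_hours} hours/year, ${annual_cost:,.2f}/year")

print(f"Total annual burden: {total_annual_hours} hours")           # 676
print(f"Total annual respondent cost: ${total_annual_cost:,.2f}")   # $12,634.96
```

This is only a cross-check of the arithmetic; the figures of record are those shown in Table A2.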



A13. Costs

There are no additional costs to respondents.






A14. Estimated Annualized Costs to the Federal Government

The total cost for the data collection activities under this current request will be $800,000. The annualized cost is $400,000. The estimate includes the costs of project staff time to draft the discussion guides, collect the information, analyze the responses, and write up the results. Table A3 below shows estimated costs to the federal government by cost category.

Table A3.

| Cost Category | Estimated Costs |
|---|---|
| Instrument Development and OMB Clearance | $20,000 |
| Field Work | $430,000 |
| Analysis | $200,000 |
| Publications/Dissemination | $150,000 |
| Total costs over the request period | $800,000 |
| Annual costs | $400,000 |
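
As a quick illustrative cross-check (not part of the approved materials), the category subtotals in Table A3 sum to the $800,000 total, which annualizes to $400,000 over the two-year request period:

```python
# Illustrative cross-check of the Table A3 cost categories.
costs = {
    "Instrument Development and OMB Clearance": 20_000,
    "Field Work": 430_000,
    "Analysis": 200_000,
    "Publications/Dissemination": 150_000,
}
total = sum(costs.values())   # $800,000 over the request period
annual = total / 2            # $400,000 per year for the 2-year period
print(f"Total: ${total:,}; Annual: ${annual:,.0f}")
```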



A15. Reasons for changes in burden

The change in burden is due to a few factors:

  1. Increasing the number of respondents as part of the current data collection activities.

  2. Removing burden for completed activities (the work to compile and submit administrative data files has been completed).

  3. Removing separate burden for an outreach email that does not actually impose additional burden on respondents. (The referenced email is included as Appendix A).



A16. Timeline

Table A4 below provides a data collection schedule over the following two years. The project team will prepare a final report and/or briefs for public dissemination following the completion of data collection. See Supporting Statement B, section B7 for additional information about plans for dissemination.

Table A4.

| Task | Description | Timeframe (after OMB approval) |
|---|---|---|
| Site visits (including interviews and focus groups) | Interviews with program leaders, staff, and partners; focus groups with program participants | Months 1-14 |
| Reporting and disseminating findings | Individual formative evaluation reports | Months 14-24 |




A17. Exceptions

No exceptions are necessary for this information collection.




Attachments

Appendix A: Outreach email for discussion with Program Administrators and Staff

Appendix B: Additional Project Materials

Instrument 1: Discussion Guide for program leaders

Instrument 2: Discussion Guide for program partners and stakeholders

Instrument 3: Discussion Guide for program front-line staff

Instrument 4: Focus Group Guide for program participants

Instrument 5: Program Staff Recruitment for Focus Group Participants



[1] Examples of sensitive topics include (but are not limited to): Social Security number; sex behavior and attitudes; illegal, anti-social, self-incriminating, and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians, and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment, WIC, or SNAP); immigration/citizenship status.

[2] “Occupational Employment Statistics: Occupational Employment and Wages, May 2019,” Bureau of Labor Statistics, accessed March 8, 2021, https://www.bls.gov/oes/current/oes119151.htm.

[3] “Occupational Employment Statistics: Occupational Employment and Wages, May 2019,” Bureau of Labor Statistics, accessed March 8, 2021, https://www.bls.gov/oes/current/oes211021.htm.


