Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program


Full OMB Information Collection Request

0970-0489



Supporting Statement

Part A

November 2018

Updated January 2021


Submitted By:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C St, SW

Washington, DC 20201


Project Officer: Maria Woolverton








A. Justification


Summary of Nonsubstantive Change Request – January 2021


All currently approved qualitative interviews were intended to take place over the phone or in person, and all focus groups were intended to take place in person. Given the COVID-19 pandemic, we now need the option to conduct our interviews and focus groups virtually or in person, depending on how the pandemic develops.


We also plan to include focus group participants who are youth with a history of foster care who have received Education and Training Vouchers (ETVs) in the past two years and propose adding procedures to engage staff from organizations serving foster care alumni to help with recruiting these youth. This collection falls under currently approved data collection from participants in college success programs. However, due to the nature of ETV programs, which have diverse administrative structures and often minimal consistent staff contact with youth, the addition of new recruitment procedures is necessary to reach the target population.


This nonsubstantive change request is to update materials to reflect these changes to the information collection. For additional information, see the January 2021 Nonsubstantive Change Justification Memo.


A1. Necessity for the Data Collection


The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) requests extension of a currently approved information collection. The purpose of the extension is to continue data collection for the Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program. We request permission to continue conducting formative evaluations of programs that include, but are not limited to, employment and postsecondary education for youth transitioning to adulthood from foster care.


Although considerable research over the past several decades documents the challenging early adult outcomes of youth who have aged out of foster care, very little is known about the effectiveness of interventions aimed at improving youth outcomes. To understand the current program and practice landscape, the U.S. Department of Health and Human Services, Office of Planning, Research, and Evaluation (OPRE) has contracted with the Urban Institute (Urban) and its partner Chapin Hall at the University of Chicago (Chapin Hall) to identify promising programs and services for youth transitioning to adulthood from foster care. ACF will use the information the project team collects to plan future evaluation activities. ACF is engaging in this collection at the agency’s discretion, and these activities help to fulfill the requirement for evaluations of promising programs under the legislation for the Chafee program.


We are seeking OMB approval to continue to engage in five data collection activities to be conducted in association with site visits to programs serving youth transitioning to adulthood from foster care. These activities will support ACF’s work developing an evaluation agenda as part of the project, Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Independence Program. The evaluations will help ACF document the current state of interventions in the stated domain areas of interest and determine different programs’ readiness for potential rigorous evaluation in the future.


Data collection activities include interviews with program leaders, partners, stakeholders, and front-line staff; focus groups with program participants; and the collection of administrative data.



Study Background


The John H. Chafee Foster Care Independence Program (CFCIP) was created following the passage of the Foster Care Independence Act (FCIA) of 1999 (Public Law 106-169). The program helps youth currently and formerly in foster care achieve self-sufficiency by providing grants to States and eligible Tribes that submit an approved plan. Activities and programs allowable under the CFCIP include, but are not limited to, help with education, employment, financial management, housing, emotional support, and assured connections to caring adults for older youth in foster care. The FCIA also required ACF to establish a data collection and performance measurement system for the CFCIP.


The Multi-Site Evaluation of Foster Youth Programs

The Foster Care Independence Act of 1999 mandates that a portion of Chafee Program funding be used for rigorous evaluation of independent living programs that are “innovative or of potential national significance.” Under this mandate, the current project team conducted the first rigorous evaluation, the Multi-Site Evaluation of Foster Youth Programs, between 2003 and 2011 and found few statistically significant effects on youth outcomes. Another important lesson from the long-term study was that many potential programs were not in a position to be evaluated rigorously for reasons such as their small size, poorly defined logic models, or poor targeting of youth participants, among other challenges. An earlier review of non-experimental studies reached similar conclusions (Montgomery et al., 2006).


Implementing a “Next Generation” Evaluation Agenda

In response to critical limitations in our knowledge about independent living programs, ACF, through this current project, is developing a plan and strategy for future evaluation activities before launching its next rigorous evaluation. In this work, we are continuing and expanding on activities the project team began in 2012 during Phase I of the project.

During Phase I the project team reviewed current research on youth in foster care and developed a conceptual framework outlining core developmental assets youth need for success in adulthood. The team also created a typology of existing independent living programs, identifying 10 different domains and describing available research evidence on each. With ACF guidance, the team consulted with researchers, federal staff, and program experts in three areas of special interest to ACF (education, employment, and financial literacy and asset-building programs) to further aid research planning. Finally, the team outlined potential design options and next steps for narrowing and identifying promising programs for future evaluation.

For the current Phase II activities the team is selecting promising programs, documenting the state of those programs for the field, and assessing as well as developing the programs’ readiness for summative evaluation, including formalizing their logic models. Strong logic models explicitly define the population a program intends to affect, make clear how and why particular program “inputs” (e.g., financial resources and personnel) are expected to produce certain “outputs” (program activities and program participation), and explain how and why those outputs will lead to clearly defined short-, medium-, and long-term outcomes and impacts. With OMB approval we will continue the formative process and outcome evaluation in new sites. The evaluations will assess each program’s readiness for rigorous impact evaluation and, in some cases, assist programs in becoming ready for impact evaluation.
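To make these elements concrete, below is a minimal illustrative sketch, in Python, of how the logic model components described above might be organized for review. The program name and entries are hypothetical examples, not details of any study site.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """Container for the logic model elements described above."""
    target_population: str
    inputs: list            # e.g., financial resources and personnel
    outputs: list           # program activities and program participation
    short_term_outcomes: list
    medium_term_outcomes: list
    long_term_outcomes: list

# Hypothetical example for a generic college success program
example = LogicModel(
    target_population="Youth ages 18-21 with foster care histories enrolled in college",
    inputs=["Chafee/ETV funding", "Campus-based coaches"],
    outputs=["Monthly coaching sessions", "Financial aid workshops"],
    short_term_outcomes=["Increased financial aid uptake"],
    medium_term_outcomes=["Term-to-term persistence"],
    long_term_outcomes=["Degree or credential completion"],
)
```

A structure like this makes it easy to check, element by element, whether each assumed link from inputs to outputs to outcomes is reflected in actual program practice.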


A2. Purpose of Data Collection Procedures

The purpose of the data collection procedures, which include site visits, one-on-one interviews (in person, by phone, or virtually), and focus groups, is to gather detailed information about program sites’ service models, target populations, inputs, outputs, intended outcomes, and program implementation. The data collection will also gather detail about program sites’ readiness for rigorous evaluation.


Research Questions

Our core study questions for understanding program site models and implementation include:

  • Who does the program target?

  • How many youth are served?

  • What is the outreach and referral process?

  • What are the eligibility requirements?

  • What is the program’s logic model?

  • What are the primary inputs, outputs, and intended outcomes?

  • What is the context of the program? What other similar or related services or programs serve the same population?


Our core study questions for assessing program sites’ readiness for rigorous evaluation include:

  • How well does the program’s logic model reflect actual program practice?

  • How valid or reliable are the data and measures programs currently use to evaluate outcomes?

  • What would the burden on program staff be to implement new measures if necessary?

  • Is the program large enough to make rigorous evaluation feasible? (An illustrative calculation follows this list.)

  • What further program development would be required for a rigorous impact evaluation?

  • Is a rigorous impact evaluation feasible?
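On the question of program size flagged above, one common way to gauge whether a program is large enough is a minimum detectable effect size (MDES) calculation. The sketch below, a hypothetical illustration in Python using only the standard library, applies the standard two-group formula under assumed parameters (two-sided test at alpha = 0.05, 80 percent power, equal treatment and control groups, simple random assignment); the enrollment figure is invented.

```python
from math import sqrt
from statistics import NormalDist

def mdes(n_treatment: int, n_control: int,
         alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable effect size, in standard deviation units,
    for comparing two independent group means."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_power = z.inv_cdf(power)          # shift needed to reach target power
    return (z_alpha + z_power) * sqrt(1 / n_treatment + 1 / n_control)

# Hypothetical program enrolling 100 youth per year, split evenly:
print(round(mdes(50, 50), 2))  # 0.56 -- only large effects detectable
```

Under these assumptions, only effects larger than roughly half a standard deviation would be detectable, which illustrates why small program size was one of the barriers to rigorous evaluation noted in the Multi-Site Evaluation.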



Study Design

The study design involves formative evaluations that, among other objectives, detail each program’s logic model to assess how well the ‘logic’ conforms to each program’s actual operations, using readily available data. The data collection activities, including up to 4 site visits, will focus on obtaining information about structural program components (e.g., hours of operation or training, class sizes if applicable, intake process, process for identifying target participants, eligibility criteria), as well as process and content components. Process components include matching internships to youth interests in an employment program or mentor-student interactions in a college success program. Content components could include curricula or class schedule offerings, which might involve interviews with staff or reviewing course schedule materials. The team will use these data to detail each program’s logic model and to assess how well the program’s processes, as currently implemented, are likely to achieve intended objectives. The data collection will also be used to document and assess changes to the logic model and procedures resulting from the project team’s review of the program model, procedures, and data.

The data collection, which involves speaking with all staff and stakeholders knowledgeable about the programs, is the best approach for obtaining the information we need to a) better understand the services currently being delivered and document the current state of the field in relation to these services; and b) assess whether programs are ready for summative evaluation and/or what programs would need in order to be ready for rigorous evaluation.



Universe of data collection efforts

The five data collection activities will be the following:

  • Interviews with program leaders. Program leaders include practitioners from state and local agencies and private organizations. We will recruit participants for these interviews via email (see Appendix A). The interviews will be semi-structured and will include informed consent (Appendix C). The goals of the interviews are to understand the program features and history, recruitment and referral processes, administrative program data, logic model, target population, intended outcomes and potential rigorous evaluation designs. Interviews will also cover the plan for the formative evaluation and what will be needed and expected from staff (Appendix C).


  • Interviews with program partners and other stakeholders. Program partners and other stakeholders include practitioners from state and local agencies, private organizations, providers, or employers (for employment programs), educators (for educational programs), or others who work with the selected program or program participants in some capacity (e.g., referring, hiring, or training participants; providing funding or other support; etc.). We will recruit participants for these interviews via email (see Appendix A). The interviews will be semi-structured and will include informed consent (Appendix D). The goal of the interviews is to understand partners’ and other stakeholders’ relationship to the program, the type and frequency of their interactions with the program, and their perceptions of program goals and priorities (Appendix D).


  • Interviews with front-line program staff. Front-line program staff include staff who work directly with program participants and front-line staff from partnering agencies or organizations. We will recruit participants for these interviews via email (see Appendix A). The interviews will be semi-structured and will include informed consent (Appendix E). The goals of the interviews are to understand staff training and background, history and features of the program, major roles and responsibilities, program data elements and uses, and perceptions of participants and participants’ experiences in the program (Appendix E).


  • Focus groups with program participants. Program participants will include young adults 18 and older who are currently or formerly involved in foster care and eligible for services under the Chafee Foster Care Independence Program. We will ask for program staff assistance in recruiting participants for these focus groups (Appendix B). Participants need to have taken part in the program and be able to speak knowledgeably about their experiences – including successes and challenges. The goals of the semi-structured focus groups, which will include informed consent (Appendix F), are to understand how participants learn about the program, their motivation for participating, their understanding of the program components and goals, their perceptions of staff, their perspectives on what additional help the staff or program could provide, and whether and how the program influences how participants think about themselves or the future (Appendix F).


  • Compiling and accessing program administrative data files (electronically or hard copies). This activity includes working with the program data staff most familiar with administrative data and procedures for collecting and tracking information about program participants and services. The goal of this activity is to assess which data elements are best suited for evaluating the logic model and program inputs, outputs, and outcomes, and to identify data elements or data collection procedures programs may need to adopt or track in order to improve their readiness for rigorous evaluation.



Due to COVID-19, the project team may conduct interviews and focus groups virtually, depending on how the pandemic develops. The team will also access data from program staff, who will compile and submit it to the project team. The team’s communication with program staff will include discussing the types of indicators available in the administrative data, how often and how well the data are collected, and whether the data are appropriate for estimating short and intermediate outcomes that conform to the program’s logic model.


Site Visits (interviews and focus groups):

The project team will visit each site approximately 4 times and will follow discussion topic guides for the interviews and focus groups (see Appendices C, D, E, and F for previously approved data collection instruments that we will continue to use). The team will use the first visit to interview program leaders and program partners or other stakeholders about the program’s history and components, and program oversight processes. The team’s second site visit will involve interviewing front-line staff and will focus on capturing the referral and service procedures and intended outputs and outcomes. We will also use this visit to conduct follow-up interviews with program partners and other stakeholders about the program and their organization’s role and responsibilities.

The second site visit will also include focus groups with program participants. The project team will conduct two focus groups with up to 10 participants at each program. The focus groups will last approximately 2 hours and will be conducted by two members of the project team.

The third visit will focus on interviewing front-line staff about their roles and responsibilities. During this visit the project team will speak with the program’s data staff to learn about the administrative data. The project team will also work with the data staff to develop procedures for compiling and securely sending any information the project team would need to analyze the data.

The team’s fourth site visit will focus on any changes to the program model or program procedures resulting from the research team’s review of the logic model and data collection activities from the second and third site visits. The visit will involve interviewing program leaders, front-line workers, program partners, and other stakeholders about the program’s capacity to sustain or implement procedures that would support rigorous evaluation.

The team will also present findings from the formative evaluation to program leadership and seek their feedback. Most data collection efforts will occur at the program site, but the team may conduct some interviews by telephone if staff and other stakeholders are not available during the time of the visits.


Recruiting focus group participants: The study procedures include working with program leaders and staff at the selected programs to recruit participants for the focus groups. (In the case of in-person focus groups, which the project is not currently conducting due to COVID-19, program leaders and staff will also provide the physical space for the discussions.) As directed by the project team, program staff will recruit young adult participants age 18 and older who are participating or who have participated in the selected programs. The research team will inform program staff of the purpose of the focus groups so staff are prepared to answer questions posed by prospective focus group participants, and will provide program staff with an informational fact sheet to aid with recruitment that describes the purpose of the study and addresses other logistical questions. For focus groups with youth who received ETVs, the project team will engage organizations that serve foster care alumni to aid with recruitment (Appendix B). The research team has developed specialized materials, including a recruitment flyer, youth outreach email, project overview, and frequently asked questions document, which it will provide to staff at these organizations to aid with recruitment (see Appendix B and Additional Project Materials).


Accessing program administrative data: In addition to the site visits, data collection will also involve working with program staff to compile and send administrative data. The communication with program staff will include determining appropriate short and intermediate outcomes that conform to the logic model, and discussing how often and how well the data are collected. The communication will also include establishing a process for program staff to send the project team data, including outcome indicators.
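As one concrete illustration of assessing “how often and how well” data are collected, the sketch below uses the pandas library to summarize completeness and update frequency for candidate outcome indicators. This is a hypothetical example, not the project’s actual procedure; the file name and column names are invented.

```python
import pandas as pd

# Hypothetical administrative extract; file and column names are invented.
records = pd.read_csv("program_extract.csv", parse_dates=["service_date"])

# "How well": share of records with a non-missing value for each indicator
indicators = ["employment_status", "school_enrollment", "housing_status"]
print(records[indicators].notna().mean())

# "How often": updates recorded per participant per quarter
updates = records.groupby(
    ["participant_id", records["service_date"].dt.to_period("Q")]
).size()
print("Median updates per participant per quarter:", updates.median())
```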


A3. Improved Information Technology to Reduce Burden


The study will use semi-structured interviews and focus groups. While some technology, such as computer-assisted instruments, may reduce burden on respondents (e.g., enabling respondents to answer questions on their own time), this study requires direct person-to-person communication. The discussion questions are designed to elicit nuanced responses, and the project team will need to probe appropriately. A computer-assisted survey method would not allow the interview flexibility the project requires.


To reduce respondent burden, the project team will hold small-group interviews and focus groups virtually (over Zoom or a conference line) or in person, depending on the ongoing public health situation. Small-group interviews and focus groups will reduce the overall time that a single organization spends on the study. The project team will try to schedule the interviews and focus groups so that input from multiple respondents with comparable roles in the same organization (e.g., case workers, participants) increases the efficiency of each session and the amount of information the project team can gather. With respondents’ permission, the project team will audio-record the interviews and focus groups to minimize the time needed for potential follow-up for clarification.



A4. Efforts to Identify Duplication


The information collected will not duplicate information that is already available. The project team will review written documents and organizational materials in advance and use the interviews to fill in missing information – which will make the interviews more efficient. The project is designed to gather details about programs and services that will allow ACF to assess whether the program or services would be a strong candidate for future rigorous evaluation. No other studies are exploring these programs or services with these goals in mind or have collected the information the project team needs to make their assessments.


A5. Involvement of Small Organizations


It is possible that some of the organizations recruited into this study for site visits and concomitant interviews will be small organizations. The team will minimize the burden on program staff by keeping the interviews and focus groups as short as possible, scheduling them at times most convenient for respondents, holding them on-site or virtually, and not requesting written responses.


A6. Consequences of Less Frequent Data Collection


Potential negative consequences of less frequent data collection would be outdated or inaccurate findings. The study design calls for the minimum number of staff and participant hours necessary for successful and complete data collection. To reduce the time burden on program staff and participants, the project team will conduct the interviews and focus groups as efficiently as possible and will work with program leaders and staff to determine the most appropriate respondents for each interview and focus group.


A7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.


A8. Federal Register Notice and Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on Friday, September 21, 2018 (Volume 83, Number 184, page 47922) and provided a sixty-day period for public comment. A copy of this notice is attached as Appendix G. During the notice and comment period, no requests for copies of the information collection instruments and no comments were received.


A9. Incentives for Respondents


Program leaders and staff will not receive a cash incentive for participating in interviews.


The project team will give program participants who attend the focus groups $25 in order to encourage participation among those young adults who might not otherwise take part in the research. By encouraging otherwise reluctant young adults, the study reduces the risks associated with nonresponse bias – namely the risk that the research team draws inaccurate or biased conclusions about the program. For focus groups conducted virtually, participants will be given a $25 Visa e-gift card.



Respondent participation is voluntary. The $25 for focus group participants is intended to assist with transportation costs, child care, or other expenses that might prevent some in our target population from participating – i.e., those with the greatest financial challenges or other barriers, and whose absence could contribute to nonresponse bias. Research has shown that such incentives are effective at increasing participation from populations with lower education levels (Berlin et al., 1992) as well as low-income and nonwhite populations (James and Bolstein, 1990). The $25 defrays transportation and other costs associated with attending. Based on the focus groups we have conducted for the project to date, the incentives were sufficient for encouraging participation. To date, we have held seven focus groups plus five one-on-one interviews with program participants from four programs. The focus groups drew between 5 and 12 participants, and all were low-income.


Generally, research designs involving focus groups use incentives to offset the requirements that the design places on participants to appear at a group meeting at a pre-set location and time (Liamputtong, 2011). While research on nonresponse bias most often concerns household and other surveys, these largely experimental studies conclude that incentives provide an added boost to participation among those with less inherent interest in the topic (Groves, Singer, and Corning, 2000; Groves et al., 2006), those who are more vulnerable because of lower income or education (Knoll et al., 2012), and, potentially, participants who feel less strongly – whether positively or negatively – about the program and are less motivated to participate without some incentive (Groves, Singer, and Corning, 2000).


  • Groves, R., Couper, M., Presser, S., Singer, E., Tourangeau, R., Acosta, G., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720-736.

  • Groves, R., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: Description and an illustration. Public Opinion Quarterly, 64(3), 299-308.

  • Knoll et al. (2012). The use of incentives in vulnerable populations for a telephone survey: A randomized controlled trial. BMC Research Notes, 5, 572.

  • Liamputtong, Pranee (2011). Focus group methodology: Principle and practice. Sage Publications.


Based on the project team’s prior experience with studies of similar young adult populations, $25 is high enough to support participation, but not so high that it is overly generous or that participants would feel the incentive is excessive or coercive. The amount is based on what was previously used in OMB-approved studies for focus groups with similar low-income, hard-to-reach populations, such as the Descriptive Study of Tribal TANF programs ($25, OMB #0970-0411, expiration date October 31, 2013), the Goal-Oriented Adult Learning in Self-Sufficiency (GOALS) Study ($25, OMB #0970-0472, expiration date January 31, 2018), and the study on Same-sex relationships: Updates to Healthy Marriage and Relationship Education ($25, OMB #0970-0479, expiration date April 30, 2017).


To prevent the incentive from being coercive, the project team will give participants who show up to the focus group the incentive at the time of the group, regardless of whether an individual ultimately chooses to stay and participate.



A10. Privacy of Respondents

Program Participants

The project team will rely on program leaders and staff at each site to recruit participants for the focus groups. Focus groups will take place in person or virtually over Zoom or phone calls. If taking place in person, the Urban and Chapin Hall team will rely on leadership and staff at the selected programs to provide the physical space for the discussions. As directed by the research team, program staff will recruit young adult participants age 18 and older who are currently participating or who have participated in the selected programs. The research team will inform program staff of the purpose of the focus groups so staff are prepared to answer questions posed by prospective focus group participants, and will provide program staff with an informational fact sheet to aid with recruitment that describes the purpose of the study and addresses other logistical questions (see Additional Project Materials). If the program cannot recruit enough participants for a virtual focus group, we will ask to interview participants individually and follow the same privacy, informed consent, and interview procedures as for a focus group.


Program staff will collect and maintain any contact information necessary for recruitment and to coordinate with focus group participants. For Zoom focus groups, the project team members will receive names, emails, and phone numbers of the participants. We will ask program staff for a list of names and numbers of anyone calling in, to ensure the expected participants and correct number of participants are on the call. We will also see names and numbers through the Zoom platform during the Zoom session. We will also collect participants’ emails from recruiting program staff to send the $25 Visa e-gift card. When we send participants the e-gift card, we will ask for an email confirmation of receipt. After receiving confirmation, we will immediately and permanently delete any correspondence containing participant contact information and emails with participants.

To maintain participants’ privacy, the research team will request verbal, not written, consent at the start of the focus group. If conducted via video, the research team will share the consent on the screen so that program participants can see it. If conducted in person, the program staff who help with recruitment may be physically present at the sites of the focus groups when the groups are held to help with access to the building and other logistics, but they will NOT be permitted in the focus group itself. If held virtually, the program staff will NOT be permitted on the Zoom or phone call during the focus group itself, although, as in an in-person setting, program staff may help bring participants into the “room” and help them set up the technology, then exit before the meeting starts.




Program Leaders, Staff, Program Partners, and other Stakeholders


Program leaders, staff, program partners, and other stakeholders are categories of respondents not designated as vulnerable populations, and the information the project team will collect is not highly sensitive. The team will ask respondents for factual information about their programs (e.g., what the programs do, the number of people they serve, who is eligible, the outreach and referral process). Since some study participants will be leaders, administrators, or staff members, and because the team will name the programs in our reports, individuals reading the reports may be able to attribute particular information or comments to a specific respondent. The project team will tell respondents about this potential risk.

Information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual’s personal identifier.


A11. Sensitive Questions

There are no sensitive questions in this data collection. As noted above, the project team will inform respondents that participation is voluntary.


A12. Estimation of Information Collection Burden


Table 1 below shows the estimated burden of the information collection, which will take place over two years. The project team will interview approximately 48 program leaders (roughly 6 leaders across 8 program sites), 60 program partners (up to 7 or 8 representatives from partner organizations or agencies in each of the 8 program study sites), 104 program front-line staff, and 160 program participants (up to 16 focus groups with 10 participants each). The estimated burden also includes 16 program administrators and staff who will review information about the study and respond to an outreach email for interviews, and 12 front-line staff who will assist with recruiting program participants for focus groups. The project team will interview most program leaders approximately 4 times and direct staff twice. The estimated burden also includes time for approximately 48 program data staff (roughly 6 staff across 8 program sites) to spend 24 hours each over two years reviewing and submitting data files to the project team. The total annual number of burden hours for this effort is 1,056.

The total annual cost burden to respondents is approximately $24,541.80, as shown in the last column of Table 1. For program leaders and program partners, the figure ($33.91/hr) is based on the mean wages for “Social and Community Service Managers,” as reported in the May 2017 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates. For program front-line staff, the figure ($23.28/hr) is based on the mean wages for “Child, Family, and School Social Workers,” as reported in the 2015 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates. For the compilation and submission of data files, the figure ($23.60/hr) is based on the mean wages for “Statistical Assistants,” as reported in the 2017 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wage Estimates.


Wage data for focus group participants is based on the federal minimum wage of $7.25/hr as set by the U.S. Department of Labor.



TABLE 1: Total Burden Requested Under this Information Collection


| Instrument | Total no. of respondents | Annual no. of respondents | No. of responses per respondent | Average burden hours per response | Annual burden hours | Hourly wage rate | Annual respondent costs |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Appendix A: Outreach email for discussion with Program Administrators and Staff | 16 | 8 | 1 | 8 | 64 | $33.91 | $2,170.24 |
| Appendix B: Outreach emails for Focus Group Recruiters | 12 | 6 | 1 | 8 | 48 | $23.28 | $1,117.44 |
| Appendix C: Discussion Guide for program leaders | 48 | 24 | 4 | 1 | 96 | $33.91 | $3,255.36 |
| Appendix D: Discussion Guide for program partners and stakeholders | 60 | 30 | 2 | 1 | 60 | $33.91 | $2,034.60 |
| Appendix E: Discussion Guide for program front-line staff | 104 | 52 | 1 | 1 | 52 | $23.28 | $1,210.56 |
| Appendix F: Focus Group Guide for program participants | 160 | 80 | 1 | 2 | 160 | $7.25 | $1,160.00 |
| Compilation and Submission of Administrative Data Files | 48 | 24 | 2 | 12 | 576 | $23.60 | $13,593.60 |
| TOTALS | | 224 | | | 1,056 | | $24,541.80 |
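Each row of Table 1 follows the same arithmetic: annual burden hours equal the annual number of respondents times responses per respondent times average burden hours per response, and annual cost equals annual burden hours times the hourly wage rate. The short Python sketch below reproduces the table totals as a check:

```python
# (annual respondents, responses per respondent, hours per response, hourly wage)
rows = {
    "Appendix A": (8, 1, 8, 33.91),
    "Appendix B": (6, 1, 8, 23.28),
    "Appendix C": (24, 4, 1, 33.91),
    "Appendix D": (30, 2, 1, 33.91),
    "Appendix E": (52, 1, 1, 23.28),
    "Appendix F": (80, 1, 2, 7.25),
    "Admin data files": (24, 2, 12, 23.60),
}

hours = sum(n * r * h for n, r, h, _ in rows.values())
cost = sum(n * r * h * w for n, r, h, w in rows.values())
print(hours)           # 1056 annual burden hours
print(round(cost, 2))  # 24541.8, i.e., $24,541.80
```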




A13. Cost Burden to Respondents or Record Keepers

There are no additional costs to respondents.


A14. Estimate of Cost to the Federal Government

The total cost for the data collection activities under this current request will be $800,000. The annualized cost is $400,000. The estimate includes the costs of project staff time to draft the discussion guides, collect the information, analyze the responses, and write up the results.


A15. Change in Burden

No changes have been made to the burden estimates since the last OMB approval.


A16. Plan and Time Schedule for Information Collection, Tabulation and Publication

The formative study will occur over two years. The project team will take notes during each interview and prepare a summary write-up of each program or service. The project team will draft a final report and/or briefs that ACF will release publicly, since part of the goal of these data collection activities is to document the current state of interventions in the stated domain areas of interest for dissemination to the field.


| Task | Description | Timeframe (after OMB approval) |
| --- | --- | --- |
| Site Visit 1 | Program leader interviews and introduction to the evaluation process | Month 1 |
| Site Visit 2 | Participant focus groups; front-line staff, leaders, partners, and other stakeholder interviews | Month 5 |
| Site Visit 3, Data Access and Year 1 Analysis | Determining data indicators, assessing quality and conformity to logic model; providing feedback to sites about data | Months 8-11 |
| Site Visit 4, Data Access and Year 2 Analysis | Follow-up interviews with program leaders, front-line staff, program partners, and other stakeholders; outcome analysis of program data | Months 12-18 |
| Reporting and Disseminating Findings (including site visit 5) | Individual formative evaluation reports | Months 18-24 |



The individual formative evaluation reports, and the analyses on which the reports are based, will be organized around assessing the falsifiability of key assumptions in each program’s logic model. We will customize each report to the selected program and the assumptions the project team tests. But generally, each report will assess assumptions about 1) program context (e.g., key relations and partnerships with other organizations; availability of competing services), 2) target population, 3) target population participation/engagement, 4) implementation of key program elements, and 5) participant outcomes. Each report will conclude with a description of the project team’s assessment of the steps necessary to bring the program to the point of readiness for experimental evaluation and the likelihood that each step could be undertaken within the time frame of Phase II activities.



A17. Reasons Not to Display OMB Expiration Date

All instruments will display the expiration date for OMB approval.


A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.


