
REGIONAL PARTNERSHIP GRANTS (RPG) NATIONAL CROSS-SITE EVALUATION AND EVALUATION TECHNICAL ASSISTANCE



OMB Information Collection Request

0970 - 0527




Supporting Statement Part A - Justification

April 2025

Type of Request: Revision














Submitted By:

Children’s Bureau

Administration for Children and Families

U.S. Department of Health and Human Services










Summary

The Children’s Bureau (CB), Administration for Children and Families (ACF), Administration for Children, Youth and Families (ACYF), U.S. Department of Health and Human Services (HHS), is requesting revisions to the approved information collection: Regional Partnership Grants (RPG) National Cross-Site Evaluation and Evaluation Technical Assistance (OMB #0970-0527). The proposed information collection will be used in a national cross-site evaluation of the seventh cohort of Children’s Bureau’s Regional Partnership Grants (RPG7). 


Specifically, this request is to extend approval of the majority of the currently approved instruments, remove two information collections, and add two information collections. This will allow CB to continue to conduct site visits, administer a web-based sustainability survey, and collect data on participants’ enrollment, service use, and outcomes. The addition of individual interviews and focus groups with adult participants enrolled in RPG services will allow the cross-site evaluation, for the first time, to include participants’ own voices describing their experiences receiving RPG services. CB is removing the partnership survey, which will no longer be administered to grantees, and the semiannual progress reports (SAPRs), which are now approved under a separate OMB number (0970-0490).


  1. Circumstances Making the Collection of Information Necessary

Substance use is a common risk factor for families’ involvement in the child welfare system. In 2022, of the 558,899 children who experienced maltreatment, 24 percent had a caregiver who misused drugs, and about 15 percent had a caregiver who misused alcohol.1,2 Some children had more than one of these risk factors. More than 36,000 infants were referred to child protective services in 2022 for prenatal substance exposure and screened in for investigation.3 In addition, higher rates of drug overdose deaths and drug-related hospitalizations correspond to higher child welfare caseloads.4 Higher rates of serious drug-related issues may make it more difficult for child welfare systems to support and strengthen families, keep children at home, or return them quickly from out-of-home care.


The Children’s Bureau (CB) within the HHS Administration for Children and Families (ACF) seeks approval to continue to collect information for the Regional Partnership Grants to Increase the Well-being of and to Improve Permanency Outcomes for Children Affected by Substance Abuse (known as the Regional Partnership Grants Program or “RPG”) National Cross-Site Evaluation and Evaluation-Related Technical Assistance project. The Child and Family Services Improvement Act of 2006 (Pub. L. 109-288) includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of HHS to reserve a specified portion of the appropriation for these RPGs, to be used to improve the well-being of children affected by parental or caregiver substance use.


Through six prior rounds of RPG, CB has issued 109 grants to organizations such as child welfare or substance use disorder (SUD) treatment providers or family court systems to develop interagency collaborations and integration of programs, activities, and services designed to increase well-being, improve permanency, and enhance the safety of children who are in an out-of-home placement or at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance use. In 2022, CB awarded 18 grants to a seventh cohort (RPG7). The current information collection request (ICR) is for continued data collection activities associated with these 18 RPG7 grantees.

The first three RPG cohorts were included in previous ICRs (OMB Control Numbers 0970-0353 and 0970-0444), and data collection for the fourth, fifth, sixth, and seventh cohorts has been covered under this OMB number (0970-0527).

The ongoing RPG cross-site evaluation will extend our understanding of the grantee partnerships formed to coordinate services for children and families, the experiences of participants enrolled in RPG programs, grantees’ plans to sustain their programs after the end of the grant, the types of programs and services grantees provided to participants, and the outcomes for children and families enrolled in RPG programs. Specifically, this ICR includes (1) conducting site visits to interview program staff, (2) administering a web-based sustainability survey, (3) conducting individual interviews and focus groups with adult participants enrolled in RPG services, (4) collecting data on participants’ enrollment and service use, and (5) collecting outcomes on children and families enrolled in RPG programs from administrative data and standardized instruments.

This ICR makes four changes to the currently approved collection. First, the semiannual progress reports (SAPRs) are removed from this request because they are now approved under a separate OMB number (0970-0490). Second, this request removes the partnership survey, which will no longer be administered to grantees. Third, this request adds individual interviews and focus groups with adult participants enrolled in RPG services. The individual interviews and focus groups were pretested with two grantees during the cross-site evaluation of the sixth RPG grantee cohort. These new data collection activities will be completed with a subset of the RPG7 grantees. Fourth, the collection of program enrollment and service information was updated with minor wording changes and the addition of one question.

Approval of this request will allow CB to complete data collection, as needed, through the end of the RPG7 grant period. The evaluation is being conducted by CB, its contractor Mathematica, and Mathematica’s subcontractor, WRMA Inc.

Legal or administrative requirements that necessitate the collection

Authorization. The Child and Family Services Improvement Act of 2006 (Pub. L. 109-288) created the competitive RPG program. The legislation required HHS to select performance indicators; required grantees to report the indicators to HHS; and required HHS to report to Congress on (1) the services provided and activities conducted, (2) the progress made in addressing the needs of families with methamphetamine or other substance use disorders who come to the attention of the child welfare system, and (3) grantees’ progress achieving the goals of child safety, permanence, and well-being.

The first reauthorization. The September 30, 2011, passage of the Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) extended funding for the RPG program from federal fiscal year (FFY) 2012 to FFY 2016. The legislation removed the specific focus on methamphetamine use. It specified that grantees could apply for and be awarded multiple grants. In addition to the statutorily required reports for grantees and HHS, it required HHS to evaluate and report on the effectiveness of the grants by specified dates.

The second reauthorization. In 2018, the president signed the Bipartisan Budget Act of 2018 (Pub. L. 115-123) into law, reauthorizing the RPG program through FFY 2021 and adding a focus on opioid use. As part of the reauthorization, several changes were made to the RPG program, with the primary ones being a change in the required mandatory partners and a newly required planning phase that was not to exceed 2 years or a funding disbursement of $250,000. The Bipartisan Budget Act of 2018 (Pub. L. 115-123) is included in Appendix A.



  2. Purpose and Use of the Information Collection

Purpose and Use

The purpose of the RPG cross-site evaluation is to learn about RPG programs and services and their potential effect on improving outcomes for children and families in the key areas of increased child well-being, family functioning and stability, adult recovery, improved permanency, and enhanced child safety. By analyzing data on RPG partnerships and participants’ experiences, CB seeks to understand how the proximal and distal outcomes are influenced by the partnership between the child welfare and substance use treatment agencies. In addition, CB seeks to understand how the partnership plans to sustain services and programs after grant funding ends, how the partnership influences service delivery, and how the services provided influence outcomes. An outcomes and impacts analysis is also being conducted. The inclusion of a rigorously designed impact study using a subset of grantees will provide CB, Congress, grantees, providers, and researchers with information about the effectiveness of RPG programs.

The findings from the RPG cross-site evaluation will be used by policymakers and funders to consider what strategies and programs they should support to meet the needs of these families. Providers can use the findings to select and implement strategies and program models suited to the specific families and communities they serve. Evaluation findings can fill research gaps by rigorously testing program models that have prior evidence of effectiveness with some target populations but not the RPG focal populations, or when provided in combination with other services and programs. Congress will also use information provided through the evaluation to examine the performance of the grantees and the grant program. This could be helpful in the development of future policy.

Details on the purpose and use of the information collected through each instrument used to support the cross-site evaluation follow the research questions.

Note: For all data collection materials described below, except the individual interviews and focus groups with participants enrolled in RPG services, OMB has previously approved the materials and we have made only minor revisions to Appendix F, including editing one item and making wording changes to comply with the recent Executive Order 14168 (Defending Women). For the individual interviews and focus groups with participants, we have added data collection protocols to the list of instruments.

Research Questions

Taking these goals into consideration, the cross-site evaluation aims to address the following research questions:

Partnerships analysis

  1. What partners were involved in each RPG project and how did they work together?

  2. How much progress did the RPG projects make toward interagency collaboration and service coordination?

  3. How do the child welfare and substance use treatment agencies work together?

Participant experiences analysis

  1. For participants enrolled in RPG services, how do their past experiences and circumstances factor into their current involvement with the child welfare system?

  2. How do participants enrolled in RPG services describe their experiences participating in the program services?

Sustainability analysis

  1. What plans and activities did RPG projects undertake to maintain the implementation infrastructure and processes during and after the grant period?

  2. What plans and activities did RPG projects undertake to maintain the organizational infrastructure and processes after the grant period?

  3. To what extent were RPG projects prepared to sustain services after the grant period?

  4. What plans and activities did RPG projects undertake to develop funding strategies and secure resources needed after the grant period?

  5. How did the federal, state, and local context affect RPG projects and their efforts to sustain RPG services?

Enrollment and services analysis

  1. What referral sources did projects use?

  2. Who enrolled in RPG?

  3. To what extent did RPG projects reach their target populations?

  4. What core services5 were provided and to whom?

  5. Were core services that families received different from the services that were proposed in the RPG project applications? If so, what led to the changes in planned services?

  6. How engaged were participants with the services provided?

  7. How did grantees and their partners collaborate to provide services?

Outcomes analysis

  1. What are the well-being, family functioning, recovery, permanency, and safety outcomes of children and adults who received services from RPG projects?

Impact analysis

  1. What are the impacts of RPG projects on children and adults who enrolled in RPG?



Study Components Overview

Partnerships analysis

The partnerships analysis will assess the collaboration and coordination of services the RPG projects provided for families. The analysis will draw on data collected through the grantee and partner staff site visit topic guide (Appendix B). The topic guide covers what partner agencies are involved in each project, the roles they play, and the extent of collaboration among partners, ranging from sharing a vision and goals to integrating assessment and treatment. In addition, the topic guide explores the interagency collaboration and coordination of the child welfare and substance use treatment agencies, specifically examining topics such as competing priorities within each agency, conflicting timelines of recovery and permanency decisions, and conflicting and limited sharing of data between agencies. Advancing the collaboration and coordination of these two agencies is critical to the success of the RPG partnerships because they aim to serve the same families and support their well-being.

Participant experiences analysis

The participant experiences analysis will examine the experiences and circumstances of participants enrolled in RPG services, as well as participants’ perceptions of the RPG services and partnerships.

The analysis will draw on two data sources:

  • Individual interviews with participants enrolled in RPG services (Appendix C). The individual interview protocol collects information about the underlying conditions and circumstances that led participants to enroll in RPG services and about their experiences in those services. The topic guide also covers participants’ reflections on their significant experiences and events across the lifespan, from childhood through adulthood. This will include participants’ interpretations of how these experiences and events factored into their substance use and child welfare involvement and the skills they learned in the RPG program to cope with difficult life challenges.

  • Focus groups with participants enrolled in RPG services (Appendix D). The focus group protocol collects information on participant experiences in RPG services such as motivation for enrollment and how the services they received improved their outcomes related to recovery and child welfare involvement. As part of the focus group, we will also administer a brief questionnaire to obtain participant demographic data; this questionnaire follows the Statistical Policy Directive 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity (SPD 15; 89 FR 22182) guidance for race and ethnicity questions, with minimum categories.

Sustainability analysis

The sustainability analysis will describe grantees’ efforts to sustain their RPG services after grant funding ends. This analysis will rely on a web-based sustainability survey (Appendix E), which will be provided to grantees and selected partners. This survey will gather information about organizations’ involvement in plans and activities to improve services during and after the grant period, and to sustain the RPG project after the grant ends.

Enrollment and services analysis

The enrollment and services analysis will describe who was enrolled in the RPG projects and what RPG services they received. The analysis will examine how grantees defined and refined their target populations over the course of their projects and why those changes occurred. It will provide an expanded picture of all core services provided to families enrolled in RPG. Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners. The analysis also seeks to describe how engagement varied across participants and services, and how grantees and their partners collaborated to provide the services. For this ICR, we edited one item related to participant engagement with services to attempt to improve the response variability and data quality for these questions as compared to prior cohorts.

The enrollment and services analysis will use information from the SAPRs (collected under OMB 0970-0490) and enrollment and services data (Appendix F). The enrollment and services data describe participants’ characteristics at enrollment and the services they receive. Grantees record the enrollment date for each RPG family or household and demographic information on each family member including date of birth, ethnicity, race, primary language spoken at home, type of current residence (children only), income (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data is collected. Grantees also record service contact information for core services and dates they exit RPG.

Outcomes analysis

The outcomes analysis will describe the characteristics of participating families and their outcomes in the five domains: (1) child well-being, (2) family functioning and stability, (3) adult recovery, (4) child permanency, and (5) child safety.

Grantees administer five instruments at project entry and exit to obtain data on child well-being for a focal child identified in each RPG case, and for the family functioning/stability and recovery domains, as follows (also in Appendix G):

  1. Child well-being (one of the following age-appropriate instruments depending on the age of the focal child)

    • Child Behavior Checklist-Preschool Form (Achenbach and Rescorla 2000)

    • Child Behavior Checklist-School-Age Form (Achenbach and Rescorla 2001)

    • Infant-Toddler Sensory Profile (Dunn 2002)

  2. Family functioning and stability (both)

    • Adult-Adolescent Parenting Inventory (Bavolek and Keene 1999)

    • Center for Epidemiologic Studies-Depression Scale, 12-Item Short Form (Radloff 1977)

  3. Adult recovery (both)

    • Addiction Severity Index, Self-Report Form (drug and alcohol scales only) (McLellan et al. 1992)

    • Trauma Symptoms Checklist-40 (Briere and Runtz 1989)

Grantees also obtain data from administrative records maintained by local or state child welfare, foster care, and substance use treatment agencies for their local evaluations, and provide a core set of records to the cross-site evaluator. These records are used to create measures of child safety and permanency, and adult receipt of substance use treatment services and their recovery. Grantees receive a list and specifications of the core set of records needed (Appendix H).

Impacts analysis

The impacts analysis aims to provide pooled estimates of the effectiveness of RPG projects among grantees with rigorous local evaluation designs. All grantees who have a well-specified quasi-experimental or randomized controlled trial design and a non-administrative data comparison group will be part of the impacts analysis. Grantees in the impacts analysis will collect data using the same set of standardized instruments and obtain the same administrative data on the comparison group as described above for the outcomes analysis (Appendices G and H).

Universe of Data Collection Activities

The RPG cross-site evaluation includes the following data collection activities to support the partnerships, participant experiences, sustainability, enrollment and services, and outcomes and impacts analyses:

  1. Site visits and key informant interviews. To understand the design and implementation of RPG projects, the cross-site evaluation team will conduct 18 RPG7 site visits to better understand the partnership and coordination between the child welfare and SUD treatment agencies. The site visits focus on the RPG planning process; how and why particular services were selected; the ability of the child welfare, substance use treatment, and other service systems to collaborate and support quality implementation of the RPG services; challenges experienced; and the potential for sustaining the collaborations and services after RPG funding ends. For flexibility, the cross-site team may offer a virtual site visit to accommodate participation by grantees that prefer not to or are unable to host an in-person visit.

  2. Individual interviews and focus groups with participants enrolled in RPG services. To describe the experiences of participants enrolled in RPG services, the cross-site evaluation team will conduct up to 16 individual interviews and 8 focus groups with participants enrolled in RPG services. The individual interview and focus group data collection will happen one time during the grant period. These data will provide complementary information to the site visit data. For flexibility, the cross-site team may offer a virtual interview or focus group to respondents if needed to reach the intended sample size for the interviews and focus groups or, if resources allow, to accommodate a few respondents who were unable to attend in person.

  3. Sustainability survey. To describe projects’ use of data for continuous improvement and their sustainability planning activities, all RPG grantees (and selected knowledgeable partners) will participate in an online survey one time during the grant period. Seven people from each grantee site (where each site includes the grantee organization and their partners), who are knowledgeable about the RPG project, will be invited to participate in the survey. The survey will collect information about supports within the partnership that can help improve and sustain RPG services, such as continuous use of data for service improvement, identification of a lead organization, and policies needed after grant funding ends. In addition, the survey will collect information about funding sources and resources needed after the end of the grant.

  4. Enrollment and services data. To document participants’ characteristics and their enrollment in RPG services, all grantees provide data on enrollment of and services provided to RPG families. These data include demographic information on family members, dates of entry into and exit from RPG services, and information on RPG service dosage. These data are submitted regularly by staff at the grantee organizations into an information system developed by the cross-site evaluation contractor and subcontractor.

  5. Outcome and impact data. To measure participant outcomes, all grantees use self-administered standardized instruments to collect data from RPG adults. The standardized instruments used in RPG collect information on child well-being, adult and family functioning, and adult substance use. Grantees also obtain administrative data on a common set of child welfare and SUD treatment data elements. Grantees share the responses on these self-report instruments and the administrative data with the cross-site evaluation team through an information system developed by the cross-site contractor and subcontractor.


  3. Use of Improved Information Technology and Burden Reduction

The RPG cross-site evaluation uses technology to collect study information. The only exceptions are for the semi-structured interviews conducted during site visits, and the individual interviews and focus groups with participants enrolled in RPG. The cross-site evaluation uses technology to improve the user experience and reduce burden in the following ways:

        • Web-based sustainability survey. The survey of grantee staff and partners is administered via the web. Compared with other survey modes, web-based surveys offer ease and efficiency to respondents and help ensure data quality. The survey is programmed to automatically skip questions not relevant to the respondent, thus reducing cognitive and time burden. The instrument also allows respondents to complete the survey at a time convenient to them. If respondents are unable to complete the survey in one sitting, they can save their place in the survey and return to the questionnaire later. Validation checks and data ranges are built into appropriate items to ensure data quality.

        • Data entry system to collect data from grantees. The evaluation contractor and its subcontractor operate a seamless and transparent web-based data reporting system, known as the RPG-Evaluation Data System (EDS). RPG-EDS has a user interface accessible from any computer, allowing for ease of entry, while all data are housed on secure servers behind the contractors’ firewalls, thereby maintaining data security. The system has been modeled after the data systems used with prior cohorts of RPG grantees. It includes two applications, each designed to facilitate efficient reporting of (1) enrollment and services data and (2) outcomes data. The system can be used by multiple users at each organization and provide varying levels of access depending on users’ needs. For example, administrators or supervisors have the greatest rights within the system, being able to create new users, assign program participants to staff members, and review all activity from the organization. Staff providing direct services to study participants are only able to record and review information about participants assigned to their caseload. The various levels of system access allow for streamlining of information. Limiting full system access to a small set of staff members increases data security, reduces respondent confusion, and supports the collection of higher quality information.

  • Enrollment and services data. On a rolling basis, grantee staff use the enrollment and services data application to provide demographic information on each RPG case at enrollment, as well as enrollment and exit dates for the RPG project and information on each service in which case members enroll. The design of the RPG-EDS enrollment and services data entry application is based on web-based case management systems that Mathematica has developed and implemented successfully for multiple projects, including evaluations of prior RPG cohorts that involved collecting similar data from similar types of providers. For example, the enrollment and services data entry application is flexible and easy to use, and includes navigational links to relevant fields for each type of entry to minimize burden on grantee staff and increase the quality and quantity of data collected.

  • Outcomes data. Each grantee reports data from standardized instruments and a list of data elements they draw from administrative records. Grantees develop their own project or agency databases to store these data. The grantee database includes all data the grantee collects from clients or on behalf of clients. The contractor provides format specifications to the grantees to use when uploading outcomes data to RPG-EDS. These are in easy-to-use PDF and Microsoft Excel formats. Twice a year, grantees upload these data to RPG-EDS. This application in RPG-EDS is modeled on the system that was used to obtain these types of data from RPG grantees during previous rounds of grants. All 18 RPG7 grantees are currently using RPG-EDS; thus, they are already familiar with the data system. Importantly, the application in RPG-EDS incorporates advances in technology and software, and improved programming approaches. These improvements enhance the experience of providing outcomes data for this RPG cohort, including reducing the time to prepare and upload data to the system.
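The sketch below is a purely illustrative representation of the tiered access model described above for RPG-EDS; it is not the actual system implementation, and the role names, fields, and functions are hypothetical. It simply shows the idea that administrators can manage users and view all organization records, while direct service staff can see only participants assigned to their own caseload.

```python
# Illustrative sketch only; not the actual RPG-EDS implementation.
# Administrators can manage users and view all records for their organization,
# while direct service staff can view only participants on their own caseload.

from dataclasses import dataclass, field


@dataclass
class User:
    username: str
    role: str                                         # "administrator" or "direct_service" (hypothetical labels)
    caseload: set[str] = field(default_factory=set)   # participant IDs assigned to this user


def can_view_participant(user: User, participant_id: str) -> bool:
    """Administrators see all records; direct service staff see only their caseload."""
    if user.role == "administrator":
        return True
    return participant_id in user.caseload


def can_create_users(user: User) -> bool:
    """Only administrators may create new system users."""
    return user.role == "administrator"


# Example: a frontline worker can record and review data only for assigned participants.
worker = User("jdoe", "direct_service", caseload={"RPG-0001", "RPG-0002"})
assert can_view_participant(worker, "RPG-0001")
assert not can_view_participant(worker, "RPG-0099")
assert not can_create_users(worker)
```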


  4. Efforts to Identify Duplication and Use of Similar Information

The RPG cross-site evaluation is specifically designed to minimize the duplication of data collection efforts. First, grantees are legislatively required to report performance indicators aligned with their proposed program strategies and activities. A key strategy of the RPG cross-site evaluation is to minimize burden on the grantees by ensuring that the data grantees share with the cross-site evaluation fully meet the need for performance reporting. Thus, rather than collecting separate evaluation and performance indicator data, the grantees need only participate in the cross-site evaluation.

Second, data shared by grantees or provided through direct collection from grantees, staff members, and partners for the cross-site evaluation also serve to describe grantee performance. That is, to reduce duplication of efforts for grantees to comply with both CB’s local and cross-site evaluation requirements and legislatively mandated performance indicators, the cross-site evaluation data must completely overlap with data needed for performance indicators. Because no existing reporting systems collect the data required for reporting to Congress or for the cross-site evaluation, this data collection plan does not duplicate any current efforts.

Furthermore, the design of the cross-site evaluation instruments prevents duplication of data collected through each instrument. For example, during the semi-structured interviews conducted during site visits, partner representatives are not asked any questions included in the sustainability survey. In creating the instruments for the outcomes and impacts analysis, the contractor reviewed and performed a crosswalk of all items to identify duplication across instruments. Any duplicate items not needed for scoring the instruments were removed from the versions of the standardized instruments used for the outcomes and impacts analysis. This not only reduces burden on RPG participants providing data for grantees’ local evaluations, but also reduces the burden on grantee staff preparing and uploading outcomes data to the cross-site evaluation.


  5. Impact on Small Businesses or Other Small Entities

The potential exists to affect small entities within a grantee site, depending on the local community partners and funders with which grantees engage. RPG grantees and partners are included in the site visit interviews and the sustainability survey. Additionally, grantee agencies and possibly partners enter data into the RPG-EDS. Proposed data collection for these efforts aims to minimize the burden on all organizations involved, including small businesses and entities, and is consistent with the aims of the legislation establishing RPG and CB’s need for valid, reliable, and rigorous evaluations of federally funded programs.


  6. Consequences of Collecting the Information Less Frequently

Not collecting information for the RPG cross-site evaluation would limit the government’s ability to document the performance of its grantees, as legislatively mandated, and to assess the extent to which these federal grants successfully achieve their purpose. Furthermore, the RPG cross-site evaluation is a valuable opportunity for CB, practitioners, and researchers to learn about the implementation and effectiveness of coordinated strategies and services for meeting the needs of families in the child welfare and substance use treatment systems. The study will examine whether the government’s strategy of funding collaborative, cross-system partnerships is a productive one that is likely to be sustained after the grant period ends, and will help CB understand how well partnerships are collaborating, the characteristics of the participants enrolling in RPG, the services provided to participants, and the outcomes and impacts on children and adults enrolled in RPG.

The information collection proposed is necessary for a successful cross-site evaluation. The consequences of not collecting this information or collecting the information less frequently are discussed as follows for each data collection element:

        • Grantee and partner staff topic guide. Without the information being collected through interviews with grantee and partner staff, the cross-site evaluation would have to rely entirely on information reported by a single source: the RPG project directors through the semiannual progress reports. Thus, the study would lack the broader perspectives of other key participants, and it would not be possible to conduct any in-depth analysis of critical program challenges and successes or implementation issues.

        • Individual interviews and focus groups with participants enrolled in RPG services. The individual interviews and focus groups with participants enrolled in RPG services allow for a better understanding of the participants’ experiences receiving RPG services. Their insights and suggestions can help inform improvements to the RPG program over time. The cross-site evaluation has not previously collected detailed qualitative data directly from participants; these new data collection activities fill a gap in knowledge of the RPG program and the partnerships that provide services to families.

        • Sustainability survey. Without the sustainability survey, CB would not be able to collect information to understand planning for the continued implementation of services or programs after the period of performance for the grants, which would make it difficult to assess the programs’ return on CB’s investment. Thus, collecting these data increases knowledge on whether funding, staffing, and other resources are in place to continue the partnerships after the grants end.

        • Enrollment and services data. The enrollment and services data are important for describing actual service delivery to cases and for tracking all activities completed with the participants, including assessments, referrals, education, and support. Data are collected when participants enroll, as they receive services, and at exit. Without these data, the study would have no information on the services recipients actually receive, including their duration and dosage. The evaluation would be unable to link participant outcomes to the levels or combinations of specific services or understand whether and how participants engaged in the selected services. If data were collected less frequently, providers would have to store services data or try to recall them weeks or months after delivery. Regular collection also enables data quality checks to address missing data, errors, or other problems in a timely way.

        • Outcomes data. It is CB’s mission to ensure child well-being, safety, and permanency for children who are at risk of or experience maltreatment. The outcomes instruments provide detailed information on these outcomes and the participants who receive services. Grantees upload data from the outcomes instruments twice each year. Without this information, evaluators would be unable to describe the outcomes of RPG program participants or analyze the extent to which grants have affected the outcomes of or addressed the needs of families co-involved with substance use treatment and child welfare systems. Further, it would be impossible to conduct an impact study (described next). During each upload, RPG-EDS performs automatic validation checks of the quality and completeness of the data. Mathematica then reviews submissions to address any remaining data quality issues, and works with grantees to resolve problems. If data were uploaded less often, it would be more cumbersome and challenging for grantees to search through older records to correct or provide missing data.

        • Impacts analysis. Grantees participating in the impacts analysis also upload outcomes data for participants in their comparison group (that is, those who do not receive RPG services or receive only a subset of RPG services). Without this information, it would not be possible to rigorously analyze the effectiveness of the interventions by comparing outcomes for individuals with access to RPG services with those in comparison groups. Uploading the data every six months provides the same benefits with respect to data quality described above.



  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

Race and ethnicity information is collected as part of the enrollment and services data. This information collection was initiated prior to the publication of SPD-15, so the race and ethnicity items were not developed in alignment with the current standards. This request maintains the original items for race and ethnicity for several reasons. First, continuing with the current items simplifies continued operations of this information collection for our grant recipients. In addition, this ensures reporting across the RPG7 cohort is consistent. If CB funds future cohorts of RPG projects, CB will implement a race and ethnicity item that complies with the SPD-15 reporting standards in a future request to continue the cross-site evaluation data collection.



  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on December 31, 2024 (89 FR 107144) and provided a sixty-day period for public comment. During the notice and comment period, we did not receive any comments.




  9. Explanation of Any Payment or Gift to Respondents

Tokens of appreciation, which are associated with the new data collection proposed as part of this request, will be used for the individual interviews and focus groups with participants enrolled in RPG services. Specifically, the respondents for the individual interviews will receive a $75 prepaid debit card for a 2-hour interview, and focus group respondents will receive a $50 prepaid debit card for a 90-minute focus group. The respondents who participate in an interview or focus group will not be statistically representative, in that they will not be used to make statements about the prevalence of experiences for the entire service population. However, to secure respondents with a range of background characteristics and a variety of possible experiences with RPG programs, offsetting the direct costs incurred by respondents is important. For example, participating in the data collection may require arranging child care, transportation, or time off from paid work, thus running the risk that only those individuals able to overcome the financial obstacles will participate, which would reduce the overall quality of the qualitative data collection. Research indicates that offering monetary tokens of appreciation both improves response rates and mitigates nonresponse bias across different respondent populations, particularly among low-income respondents, those residing in rural areas, and those receiving federal nutrition assistance benefits.6 The prepaid debit card amounts proposed in this study are consistent with the literature.7,8


In addition, the amounts are consistent with other OMB-approved information collections, such as $60 for 2-hour interviews on the Next Generation of Enhanced Employment Strategies Project (OMB Control Number 0970-0545, Expiration Date April 30, 2023). The respondents for those interviews were a hard-to-reach population similar to RPG7’s intended respondents. With the amount offered, the Next Generation of Enhanced Employment Strategies Project was able to reach the intended sample size.


No payments or gifts are provided to respondents as part of any other data collections in this study.



  10. Assurance of Confidentiality Provided to Respondents

This study is being conducted in accordance with all relevant regulations and requirements, including meeting the Federal Confidentiality Protection Requirements, under 42 CFR Part 2, and Human Subjects Protection Requirements, under 45 CFR Part 46, with respect to the data collected, analyzed, and reported upon. This information will be kept private to the extent permitted by law.

Several specific measures are taken to protect respondent privacy:

        • Adopting strict security measures and web security best practices to protect data collected through RPG-EDS. Data collected through RPG-EDS (which include outcomes data as well as enrollment and services data) are housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. The data portal employs strict security measures and web security best practices to ensure the data are submitted, stored, maintained, and disseminated securely and safely. Strict security measures are employed to protect the privacy of participant information stored in the system, including data authentication, monitoring, auditing, and encryption. Specific security procedures include, but are not limited to, the following:

  • The system underwent the HHS security authorization process and obtained a renewal for the Authority to Operate in 2023.

  • All data are encrypted in transit and at rest and reside behind firewalls.

  • Access to RPG-EDS is restricted to approved staff members who are assigned a password only with permission from the study director. Each user has a unique user ID/password combination and is enrolled in the project’s multifactor authentication system.

  • Database access requires special system accounts. RPG-EDS users are not able to access the database directly.

  • RPG-EDS users can access the system only within the scope of their assigned roles and responsibilities.

  • Security procedures are integrated into the design, implementation, and day-to-day operations of RPG-EDS.

  • All data files on multi-user systems are under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. To further ensure data security, project personnel must adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment.

        • Training cross-site evaluation interviewers in privacy procedures. All site visit interviewers, as well as staff conducting individual interviews and focus groups with participants enrolled in RPG services, are trained on privacy procedures and are prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview and focus group, respondents are told that none of the information they provide is used for monitoring or accountability purposes and that the results of the study are presented in aggregate form only. All data collection procedures and data collection instruments have been approved by the Health Media Lab (HML) Institutional Review Board (#713MPR20).

        • Using web-based sustainability surveys. Administering the sustainability survey via web over secured servers eliminates security risks related to shipping hard-copy forms containing personally identifiable information (PII) to the evaluator.

        • Assignment of content-free case and participant identification numbers to replace PII associated with all participant outcomes data provided by grantees to the cross-site evaluation. The cross-site evaluation develops and works with grantees to implement standard procedures for assigning identification numbers to all participant-level data. Case- and individual-level numbers are content-free. For example, they do not include special codes to indicate enrollment dates, participant location, sex, age, or other characteristics.
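As an illustration of what a content-free identifier can look like, the short sketch below generates random identifiers that encode nothing about a participant. It is hypothetical; it does not describe the actual assignment procedure the cross-site evaluation uses with grantees.

```python
# Illustrative sketch only: one way to generate content-free identifiers of the
# kind described above, carrying no embedded information about enrollment dates,
# participant location, sex, age, or other characteristics. This is not the
# actual procedure used by the cross-site evaluation.

import secrets


def content_free_id(prefix: str = "RPG") -> str:
    """Return a random identifier that encodes nothing about the participant."""
    return f"{prefix}-{secrets.token_hex(4).upper()}"  # e.g., "RPG-9F3A61B2"


# Example: assign identifiers as new cases are added.
case_ids = [content_free_id() for _ in range(3)]
print(case_ids)
```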



  11. Justification for Sensitive Questions

There are no sensitive questions in the instruments that the contractor uses to collect data for the grantee and partner staff interviews during site visits, the sustainability survey, and enrollment and services data.

Some of the specified standardized instruments that grantees use to collect data for the outcomes and impacts analysis do include sensitive questions. For example, in the case of parents who are endangering their children as a result of their substance use, grantees must measure the parents’ pattern of substance use as a critical indicator of recovery. Grantees share de-identified data with the evaluation team; no identifiable information is shared with the cross-site evaluation team. In recognition of the need for grantees to collect this information, and to ensure confidentiality and other protections to their clients, as a condition of their RPG funding, all grantees must obtain IRB clearance for their data collection. As part of their IRB submissions, grantees explain the process through which they share de-identified data from these standardized instruments with the cross-site evaluation.

The individual interviews and focus groups with program participants enrolled in RPG services cover sensitive topics. To understand how participants came to enroll in RPG services, it is necessary to understand the factors that influenced parental substance use and involvement with the child welfare system. The topics were carefully selected based on the findings from the cross-site evaluation of prior grantee cohorts, as well as other studies that use individual interviews to understand participants’ experiences across the life span and focus groups that ask about participants’ experiences with a program. The cross-site evaluation team obtained feedback on the questions from participants during a pretest of the instruments with participants from two grantees in the sixth RPG cohort. In the pretest, participants were asked to consider whether any questions were too sensitive. Participants in the pretest reported that although the questions involved sensitive topics, they were comfortable answering them, and through the consent process they understood they did not have to talk about any topics that made them uncomfortable.


The sensitive questions in the individual interview with participants enrolled in RPG services instrument include the following:

  • Substance use and misuse. Substance use and substance use treatment may be the focus of services provided by RPG programs. It is important to talk with participants about their substance use history to understand better ways to help them access and engage in treatment.

  • Involvement with the child welfare system. RPG is focused on parents involved with or at risk of involvement with the child welfare system. It is important to ask questions about participants’ child welfare involvement as a parent, and as a child if applicable, to examine how the child welfare system can better serve families.

  • Economic and financial stability. Economic hardship is common in families served by RPG programs. It is important to examine the extent to which participants experienced economic and financial stability across the life span, and whether they think it influenced their substance use and involvement with the child welfare system.

  • Physical and mental health. The pretest highlighted that many participants experienced health conditions throughout childhood and in adulthood, and that these sometimes influenced their substance use. It is important to include these questions in the study to further understand the extent to which health conditions influence substance use in a larger sample of participants.

The focus group instrument does not ask participants to discuss personal sensitive topics in the group setting, but because RPG is focused on substance use and child welfare involvement, questions about the grant program’s services may raise discussion of sensitive topics.


To minimize risk related to sensitive topics, the consent forms state that participants may skip any questions they do not want to answer. Participants will be notified that their responses will not affect any services or benefits they receive and that their responses will be grouped with others when reporting the findings. Participants will be informed that, to the extent permitted by law, individual identifying information will not be released or published; rather, data will be published only in summary form with no identifying information at the individual level. All study staff will be trained to detect participant discomfort and to allow participants to discontinue participation if they wish.



  12. Estimates of Annualized Burden Hours and Costs

The estimated reporting burden and cost for the data collection instruments included in this ICR are presented in Table A.1.

All instruments except the individual interviews and focus groups with participants enrolled in RPG services have been previously approved. This request is to continue use of those approved instruments through the end of the seventh cohort of RPG grants and to implement the new data collection activities with the seventh cohort. The seventh cohort grants end in 2027; we are requesting clearance to collect data within a three-year period.

We estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Services Manager” ($40.10), that of grantee staff to be the average hourly wage of “Counselors, Social Workers, and Other Community and Social Service Specialists” ($28.32), that of data managers to be the average hourly wage of “Database Administrators” ($50.39), that of data entry specialists to be the average hourly wage of “Data Entry Keyers” ($19.20), that for partners to be the average hourly wage of “General and Operations Manager” ($62.18), and that for participants enrolled in RPG services to be the average hourly wage of “Community and Social Service Occupations,” taken from the U.S. Bureau of Labor Statistics, Occupational Employment Statistics survey, May 2023. The hourly wage rate for each category has been multiplied by two to account for fringe benefits and overhead. Table A.1 summarizes the proposed burden and cost estimates for the use of the instruments and products associated with the partnerships, enrollment and services, and outcomes and impacts analyses.
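The sketch below restates the cost assumptions in the preceding paragraph as a small calculation: each BLS average hourly wage is doubled to account for fringe benefits and overhead, producing the loaded rate used to monetize burden hours. It is illustrative only; the participant wage category is omitted because only its occupational title is quoted above.

```python
# Illustrative sketch of the loaded hourly rates used for cost estimates.
# Wage figures are the May 2023 BLS averages quoted in the text; the factor
# of 2 reflects the fringe-benefit and overhead adjustment described above.

BLS_HOURLY_WAGE = {
    "Program director/manager": 40.10,  # Social and Community Services Manager
    "Grantee staff": 28.32,             # Counselors, Social Workers, and Other Specialists
    "Data manager": 50.39,              # Database Administrators
    "Data entry specialist": 19.20,     # Data Entry Keyers
    "Partner": 62.18,                   # General and Operations Manager
}

FRINGE_AND_OVERHEAD_FACTOR = 2  # hourly wage is doubled for fringe benefits and overhead


def loaded_hourly_rate(category: str) -> float:
    """Return the fully loaded hourly rate used to monetize burden hours."""
    return BLS_HOURLY_WAGE[category] * FRINGE_AND_OVERHEAD_FACTOR


if __name__ == "__main__":
    for category in BLS_HOURLY_WAGE:
        print(f"{category}: ${loaded_hourly_rate(category):.2f} per hour")
```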

For each burden estimate, annualized burden has been calculated by dividing the estimated total burden hours by the three years covered by this submission for RPG7 grantees. Figures are estimated as follows:

Site visit and key informant data collection

        • Individual interview with program director. We expect to interview 18 RPG7 program directors (1 per grantee) once during the evaluation period. These interviews will take 2 hours. The total burden for individual interviews with program directors is 36 hours, and the total annualized burden is 12 hours.

        • Individual interview with program manager or supervisor. We expect to conduct individual, semi-structured interviews with 18 program managers or supervisors (1 staff per 18 grantees) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with program managers is 18 hours, and the total annualized burden is 6 hours.

        • Individual interview with frontline staff. We expect to conduct individual, semi-structured interviews with 36 frontline staff (2 staff per 18 grantees) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with frontline staff is 36 hours, and the total annualized burden is 12 hours.

        • Partner representative interviews. We expect to conduct individual, semi-structured interviews with 54 partner representatives (3 partners for 18 grantees), once during the evaluation. These interviews will take 1 hour. The total burden for the individual interviews with partner representatives is 54 hours, and the total annualized burden is 18 hours.

        • Individual interviews with participants enrolled in RPG services. We expect to conduct individual, semi-structured interviews with 16 participants enrolled in RPG services, once during the evaluation. These interviews will take 2 hours, including the time needed to complete the permission to contact form (Appendix I). The total burden for the individual interviews with participants enrolled in RPG services is 32 hours, and the total annualized burden is 11 hours.

        • Focus groups with participants enrolled in RPG services. We expect to conduct focus groups with six participants per group. We expect to hold eight focus groups total, for a total of 48 respondents. The focus groups will be held once during the evaluation. These focus groups will take 90 minutes, including the time needed to complete the permission to contact form (Appendix I). The total burden for the focus groups with participants enrolled in RPG services is 72 hours, and the total annualized burden is 24 hours.

        • Sustainability survey. We expect to administer the web-based survey once to 126 grantee key staff and partners (7 per site across the 18 RPG grantees). The survey will take approximately 20 minutes to complete. The total burden for the sustainability survey is 42 hours, and the total annualized burden is 14 hours.
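The bullets above all follow the same arithmetic: the number of respondents times hours per response gives total burden, which is divided by the three years of requested clearance to produce the annualized figure. The sketch below restates that calculation with the numbers quoted above; it is illustrative only.

```python
# Illustrative burden arithmetic for the site visit, interview, focus group, and
# survey rows described above. Total burden = respondents x hours per response;
# annualized burden = total burden / 3 clearance years.

CLEARANCE_YEARS = 3


def burden(respondents: int, hours_per_response: float) -> tuple[float, float]:
    total = respondents * hours_per_response
    return total, total / CLEARANCE_YEARS


rows = {
    "Program director interviews": (18, 2.0),         # 36 total, 12 annualized
    "Program manager interviews": (18, 1.0),           # 18 total, 6 annualized
    "Frontline staff interviews": (36, 1.0),            # 36 total, 12 annualized
    "Partner representative interviews": (54, 1.0),     # 54 total, 18 annualized
    "Participant individual interviews": (16, 2.0),     # 32 total, about 11 annualized
    "Participant focus groups": (48, 1.5),               # 72 total, 24 annualized
    "Sustainability survey": (126, 20 / 60),              # 42 total, 14 annualized
}

for activity, (n, hours) in rows.items():
    total, annual = burden(n, hours)
    print(f"{activity}: {total:.0f} total hours, {annual:.0f} annualized hours")
```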

Enrollment and services data9

        • Case enrollment. Based on grantee estimates, we assume enrollment of 100 families per year per grantee. We assume that 3 staff per grantee will conduct enrollment, or 54 staff total. Each staff person will enroll about 33 families per year. It will take 15 minutes to enroll each family using RPG-EDS. Thus, the total burden for enrolling families across all staff members is 1,337 hours, and the total annualized burden is 446 hours.

        • Case closure. Based on grantee estimates, we assume 100 cases will close each year, per grantee. We assume that 3 staff per grantee will conduct case closures in RPG-EDS, or 54 staff total. Each staff will close 33 cases per year. It will take 1 minute to close a case. Thus, the total burden for case closure across all staff members is 107 hours, and the total annualized burden is 36 hours.

        • Case closure – prenatal cases. We assume one-tenth of cases, or 10 families per grantee per year, will include pregnant women. We assume 1 staff per grantee will conduct case closures for prenatal cases, which will require additional time at closure. It will take 1 additional minute to close a prenatal case in RPG-EDS. Thus, the total burden for prenatal case closures across all staff members is 11 hours, and the total annualized burden is 4 hours.

        • Service log entries. Based on the expected participation of families in specific RPG services, we assume there will be two service log entries each week for each family (104 entries per family per year) in RPG-EDS. We assume that 6 staff per grantee will enter services data (108 staff total), with a caseload size of 15 families each. Each weekly entry will take 2 minutes. Thus, the total annualized burden is 5,054 hours.
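The enrollment burden figure above can be reproduced with the arithmetic in the following illustrative sketch, using the stated assumptions of 3 staff per grantee, roughly 33 enrollments per staff member per year, and 15 minutes per enrollment; the reported totals round 445.5 and 1,336.5 hours up to 446 and 1,337.

```python
# Illustrative sketch of the case enrollment burden estimate described above,
# under the stated assumptions: 3 staff per grantee, about 33 families enrolled
# per staff member per year, and 15 minutes per enrollment in RPG-EDS.

GRANTEES = 18
STAFF_PER_GRANTEE = 3
FAMILIES_PER_STAFF_PER_YEAR = 33
MINUTES_PER_ENROLLMENT = 15
YEARS = 3

staff_total = GRANTEES * STAFF_PER_GRANTEE                # 54 staff
annual_hours = staff_total * FAMILIES_PER_STAFF_PER_YEAR * MINUTES_PER_ENROLLMENT / 60
total_hours = annual_hours * YEARS

print(f"Annualized enrollment burden: {annual_hours:.1f} hours")   # 445.5, reported as 446
print(f"Total enrollment burden over 3 years: {total_hours:.1f} hours")  # 1336.5, reported as 1,337
```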

Outcomes and impacts data

Administrative data

        • Obtain access to administrative data. During the cross-site evaluation, grantees will review all data submission instructions, and grantee agency personnel will develop a data management plan and the necessary administrative agreements (such as memoranda of understanding) with agencies that house the administrative records to obtain the requested records. They will implement data protocols, including mapping their data fields to the fields in RPG-EDS. Finally, they will pilot the data request and receipt process. Nine grantees in the seventh cohort still need to obtain initial access to the administrative data (the other nine grantees have already obtained this initial access). It will take 220 hours to obtain initial access. Thus, the total burden for obtaining initial access across 9 grantees is 1,980 hours. Grantees will use these data for their local evaluations as well; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps might be necessary. Therefore, we have assumed that half of the burden of obtaining the administrative data (990 hours) should be allocated to the cross-site evaluation. The annualized burden is 330 hours. We assume 1 data manager per grantee (or 9 data managers) will complete these processes.

        • Report administrative data. Grantees will upload the administrative data they have obtained to RPG-EDS twice per year throughout the evaluation period. For each upload, each grantee will require 72 hours to prepare and upload its administrative data, including correcting any data validation problems, as well as 9 hours to update administrative agreements with the agencies that house the administrative records. The total burden for reporting administrative data is thus 8,748 hours for all 18 grantees combined, and the total annualized burden is 2,916 hours. We assume that 1 data entry operator per grantee (or 18 data entry operators) will upload the data. An illustrative sketch of this kind of data preparation follows this list.
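To give a concrete sense of the field-mapping and pre-upload validation work described in the bullets above, the sketch below shows how a grantee data manager might script part of it in Python. This is a minimal illustration under stated assumptions: the local field names, the RPG-EDS-style target names, and the checks are hypothetical, not actual RPG-EDS specifications.

    import pandas as pd

    # Hypothetical mapping from a grantee's local administrative extract to
    # RPG-EDS-style field names (illustrative only, not actual RPG-EDS fields).
    FIELD_MAP = {
        "local_case_id": "rpg_case_id",
        "removal_dt": "child_removal_date",
        "tx_admit_dt": "sud_treatment_admission_date",
    }

    def prepare_administrative_upload(local_extract: pd.DataFrame) -> pd.DataFrame:
        """Rename local fields, coerce dates, and run simple checks before a semiannual upload."""
        data = local_extract.rename(columns=FIELD_MAP)[list(FIELD_MAP.values())].copy()
        for col in ("child_removal_date", "sud_treatment_admission_date"):
            data[col] = pd.to_datetime(data[col], errors="coerce")  # unparseable dates become missing
        if data["rpg_case_id"].isna().any():
            raise ValueError("Missing case identifiers; resolve before uploading.")
        return data

In practice, each grantee's validation rules would follow the cross-site evaluation's data submission instructions rather than the simple checks shown here.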

Standardized instruments

        • Data entry for standardized instruments. Over the course of the study period, each grantee will enroll 100 cases each year (a total of 5,400 cases across the 18 grantees over the study period). For every case, five standardized instruments will be administered at baseline and again at program completion. Grantees will enter data from the completed instruments into their local databases; data entry will take 15 minutes (0.25 hours) per instrument, or 1.25 hours for the set of five. RPG grantees will use these data for their local evaluations; however, additional steps to enter these data into their local databases might be necessary to comply with the procedures for providing the data to the cross-site evaluation. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 3,375 hours, and the total annualized burden is 1,125 hours. We assume that 18 data entry operators (1 operator in each site) will enter the data.

        • Review records and submit electronically. Grantees will review records to ensure that all data have been entered and upload the data to RPG-EDS twice per year for each year of the evaluation period. It will take 3 hours to review and submit data for each of the five instruments twice per year. Grantees will then validate and resubmit data when errors are identified. It will take 2 hours to validate data for each of the five instruments, including time for obtaining responses to validation questions and resubmitting the data. Thus, the total burden is 2,700 hours, and the annualized burden is 900 hours. We assume that 18 data entry operators (1 operator in each site) will review and submit the data.

        • Data entry for comparison study sites. The 14 grantees participating in the impact study will also enter data for comparison group members. We assume comparison group enrollment equal to program enrollment (100 cases per grantee per year). For every comparison group member, five standardized instruments will be administered at baseline and follow-up, and grantees will enter data from the completed instruments into their local databases. Data entry will take 0.25 hours per instrument, or 1.25 hours for the set of five. RPG grantees will use these data for their local evaluations as well; however, additional steps to enter these data into their local databases might be necessary to comply with the procedures for providing the data to the cross-site evaluation. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 2,625 hours, and the total annualized burden is 875 hours. We assume that 14 data entry operators (1 operator at each grantee) will enter the data.

Table A.1. Estimate of burden and cost for the RPG evaluation

Data collection activity | Total number of respondents | Number of responses per respondent (each year) | Average burden hours per response | Total annual burden hours | Average hourly wage | Total annualized cost

Site Visit and Key Informant Data Collection

Program director individual interview | 18 | 0.33 | 2 | 12 | $80.20 | $952.78
Program manager/supervisor individual interviews | 18 | 0.33 | 1 | 6 | $80.20 | $476.39
Frontline staff interviews | 36 | 0.33 | 1 | 12 | $56.64 | $672.88
Partner representative interviews | 54 | 0.33 | 1 | 18 | $80.20 | $1,429.16
Individual interviews with participants enrolled in RPG services | 16 | 0.33 | 2 | 11 | $56.72 | $598.96
Focus groups with participants enrolled in RPG services | 48 | 0.33 | 1.5 | 24 | $56.72 | $1,347.67
Sustainability survey | 126 | 0.33 | 0.33 | 14 | $124.36 | $1,706.39

Enrollment, client, and service data

Case enrollment data | 54 | 33 | 0.25 | 446 | $56.64 | $25,233.12
Case closure | 54 | 33 | 0.02 | 36 | $56.64 | $2,018.65
Case closure – prenatal | 18 | 10 | 0.02 | 4 | $56.64 | $203.90
Service log entries | 108 | 1,560 | 0.03 | 5,054 | $56.64 | $286,281.22

Outcome and impact data

Administrative data
Obtain access to administrative data a | 9 | 0.33 | 220 | 330 | $100.78 | $32,924.83
Report administrative data | 18 | 2 | 81 | 2,916 | $38.58 | $112,499.28

Standardized instruments
Enter data into local database a | 18 | 100 | 1.25 | 1,125 | $38.58 | $43,402.50
Review records and submit | 18 | 2 | 25 | 900 | $38.58 | $34,722.00
Data entry for comparison study sites (14 sites) a | 14 | 100 | 1.25 | 875 | $38.58 | $33,757.50

Estimated Totals |  |  |  | 11,783 |  | $578,227.23

a Data are used for site-level evaluations conducted by the grantees. To account for added data preparation steps needed to share data with the cross-site evaluation, burden hour estimates assume that only half of this burden is part of the cross-site evaluation.
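As an informal cross-check on the arithmetic in Table A.1 (not part of the approved package), the short Python sketch below recomputes two rows using the table's own formula: total annual burden hours = total respondents × responses per respondent × average burden hours per response, and total annualized cost = total annual burden hours × average hourly wage.

    rows = {
        # activity: (respondents, responses per respondent, hours per response, hourly wage)
        "Service log entries": (108, 1_560, 0.03, 56.64),
        "Report administrative data": (18, 2, 81, 38.58),
    }

    for activity, (respondents, responses, hours, wage) in rows.items():
        annual_hours = respondents * responses * hours   # e.g., 108 * 1,560 * 0.03 = 5,054.4
        annual_cost = annual_hours * wage                # e.g., 5,054.4 * $56.64 = $286,281.22
        print(f"{activity}: {round(annual_hours):,} hours, ${annual_cost:,.2f}")

The same formula reproduces the other rows; small discrepancies reflect rounding of burden hours in the table.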

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional costs on respondents or record keepers.



14. Annualized Cost to the Federal Government

The estimated cost for completing the RPG cross-site evaluation data collection over the three years of the requested clearance is $762,238. The annualized cost to the federal government is one-third of that total ($254,079).


15. Explanation for Program Changes or Adjustments

This request is to continue data collection under OMB #0970-0527 using the previously approved instruments and to add two new data collection activities: individual interviews and focus groups with participants enrolled in RPG services. This new data collection allows us to add participants' descriptions of their experiences receiving services to the cross-site evaluation findings. We have also edited one item in Appendix F related to participant engagement with services to improve response variability and data quality for these questions relative to prior cohorts. We have also made minor wording changes to Appendix F to comply with the recent Executive Order 14168 (Defending Women), such as updating all relevant data collection fields to include the term “sex.”

We plan to drop the previously approved partnership survey because we have learned from prior cohorts that the grantee and partner interviews conducted during site visits sufficiently answer the partnership analysis research questions about cross-sector collaboration and the make-up of RPG partnerships.

16. Plans for Tabulation and Publication and Project Time Schedule

Plans for tabulation

The information from the RPG cross-site evaluation—with a focus on partnerships, services, and outcomes for families—will be useful to funders, practitioners, and other parties interested in targeting resources to effective approaches to address the needs of children at risk of maltreatment due to adult substance use. Identifying what has worked well allows subsequent efforts of program operators and funders to home in on evidence-based practices and strategies.

Partnerships, participant experiences, sustainability, enrollment and services, and outcomes analyses

Data from the instruments included in this OMB package will be analyzed using qualitative and quantitative methods to describe the target populations’ characteristics and outcomes; program services, dosage, and participant engagement; program sustainability; the structure, quality, and goals of partnerships; and the experiences of participants in RPG programs. An enrollment and services analysis of participants will provide a snapshot of child, adult, and family characteristics at entry, and their outcomes. Thoroughly documenting program services and partnerships will expand understanding of the breadth of programs, practices, and services being offered through RPG to families and will describe successes in achieving goals and difficulties encountered. A greater understanding of how programs can be implemented with a network of partners might inform future efforts in this area.

Mathematica will use standard qualitative procedures to analyze and summarize information from project staff and partner interviews conducted using the semi-structured staff interview topic guide. Mathematica will separately analyze the qualitative data from individual interviews and focus groups with participants enrolled in RPG services. For both types of qualitative data, the analysis procedures include organization, coding, and theme identification. Standardized templates will be used to organize and document the information and then code interview and focus group data. Coded text will be searched to gauge consistency and consolidate data across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, or categories (Yin, 1994; Coffey and Atkinson, 1996), which can then be analyzed to address the study’s research questions.

Quantitative data will be summarized using basic descriptive methods. For the outcomes analysis, data from the standardized instruments will be tabulated and used to create scales and scores appropriate for each instrument, applying established norms when appropriate for the RPG target populations. Administrative records will be examined to determine whether incidents of child maltreatment and child removal from the home have occurred and whether adults have received substance use treatment, the frequency of that treatment, and its resolution. These data will capture information at baseline and program exit for families who participate in services. For the partnerships, enrollment and services, and sustainability analyses, sources of quantitative data include the sustainability survey and the enrollment and services data. Data from each source will undergo a common set of steps: cleaning the data, constructing variables, and computing descriptive statistics. To facilitate analysis of each data source, we will create variables to address the study’s research questions. How these analytic variables are constructed will depend on each variable’s purpose and the data source being used. Variables might combine several survey responses into a scale or a score, aggregate attendance data over a set period, or compare responses to identify a level of agreement.
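As a minimal sketch of the variable construction and descriptive steps just described, the Python code below combines several survey responses into a simple mean-based scale and summarizes it. The item names and the mean-scoring rule are illustrative assumptions; in practice, each standardized instrument's published scoring rules and norms would be applied.

    import pandas as pd

    def construct_scale(df: pd.DataFrame, items=("item_1", "item_2", "item_3")) -> pd.Series:
        # Combine several survey responses into a single scale score (a simple mean here;
        # instrument-specific scoring rules and norms would be used in practice).
        return df[list(items)].mean(axis=1, skipna=True)

    def describe_scale(scores: pd.Series) -> pd.Series:
        # Basic descriptive statistics: count, mean, standard deviation, quartiles, min, and max.
        return scores.describe()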

Enrollment and services data, which grantees enter into RPG-EDS, will also be used for the enrollment and services analysis. The study will provide summary statistics for key program features:

        • Enrollment. For example, the average number of new cases each month

        • Services provided by grantees. For example, the services in which clients typically participate (including any common combinations of services); distribution of location of services (such as home, treatment facility, or other site); the average number of selected services (such as workshops) offered each month; and common topics covered during services

        • Participation. For example, the average length of time participants are served by the program, the average number of hours of services program participants receive, and the average duration between enrollment and start of services

We will analyze data from RPG-EDS for each grantee for the reports to Congress and annual reports identified in Table A.2. The reports to Congress will include topics such as enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses might describe how patterns changed over time, such as from the early to late implementation period.
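As one illustration of the enrollment statistics listed above, the sketch below computes the average number of new cases per month for each grantee from a hypothetical enrollment extract; the column names (grantee_id, enrollment_date) are assumptions, not actual RPG-EDS field names.

    import pandas as pd

    def avg_new_cases_per_month(enrollments: pd.DataFrame,
                                grantee_col: str = "grantee_id",
                                date_col: str = "enrollment_date") -> pd.Series:
        # Count new enrollments by grantee and calendar month, then average across months.
        months = pd.to_datetime(enrollments[date_col]).dt.to_period("M")
        monthly_counts = enrollments.groupby([enrollments[grantee_col], months]).size()
        return monthly_counts.groupby(level=0).mean()  # average new cases per month, by grantee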

Impacts analysis

The impacts analysis will complement other components of the evaluation by examining program effectiveness in the areas of child well-being, safety, and permanency; adult recovery; and family functioning. It will include the 14 grantees who have proposed rigorous local evaluations, either using random assignment or a strong matched comparison group. To be considered a strong matched comparison group, the local evaluation must include baseline data on key characteristics, such as family functioning and parental substance use, on which to establish equivalence with those enrolled in RPG programs. As noted above, all grantees will provide data on the program groups as part of the outcomes study. Those involved in the impact study will also collect data on comparison group members who are not enrolled in RPG projects at baseline and program exit.

The impacts analyses will be conducted for three groups of studies. First, we will pool the grantees’ projects that used well-implemented randomized controlled trials (RCTs) in their local evaluations. RCTs have excellent internal validity—ability to determine whether the program caused the outcomes—because the treatment and comparison groups are initially equivalent on all observed and unobserved characteristics, on average. Any observed differences in outcomes between the program and control group of families can therefore be attributed to the program with a known degree of precision. Second, we will pool grantees with RCTs with some issues (such as high attrition) and those with strong quasi-experimental designs (QEDs), in which program and comparison groups were matched on key factors, such as baseline history of substance use and family functioning. The internal validity of QEDs is weaker than that of RCTs, because differences on unobservable characteristics cannot be determined. However, a design with well-matched program participants and comparison group members provides useful information on program effects. Third, we will pool the studies in groups 1 and 2 that include well-implemented RCTs, RCTs with issues, and QEDs. Combining the QED results with RCTs will increase the statistical power of the overall analysis, enabling us to detect smaller effects. Because of the serious consequences of child maltreatment, even relatively small effect sizes might be clinically meaningful.

Grantees and their local evaluators will collect baseline data for use in the cross-site evaluation. First, baseline data will serve to describe the characteristics of RPG program participants. We will present tables of frequencies and means for key participant characteristics, including demographic and family information for the three groups of impacts analyses: the grantees with well-implemented RCTs, the combined group of RCTs with issues and QEDs, and the group that combines well-implemented RCTs, RCTs with issues, and QEDs.

A key use of baseline data is to test for baseline equivalence for both the RCT and the RCT-QED samples. Although random assignment ensures that families participating in the program and those in comparison groups do not initially differ in any systematic way, chance differences might exist between groups. Establishing baseline equivalence for the QEDs is critical for determining whether the comparison group serves as a reasonable counterfactual, representing what would have happened to the program group had it not received treatment. To assess whether the program and comparison groups differed at the study’s onset, we will statistically compare key characteristics between the groups. In addition, because the standardized instruments will be administered twice (once at program entry and again at program exit), we will also compare baseline measures of outcomes at program entry between the two groups. In particular, to establish baseline equivalence, we will conduct t-tests for differences between the two groups, both overall and separately by grantee. In these comparisons, we will use the analytic sample, which includes respondents to both the baseline and follow-up instruments.
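A minimal sketch of these equivalence tests is shown below, assuming an analytic file with hypothetical columns for group assignment ("program" or "comparison"), grantee, and one baseline measure.

    import pandas as pd
    from scipy import stats

    def baseline_difference(df: pd.DataFrame, measure: str, group_col: str = "group") -> dict:
        # Two-sample t-test comparing program and comparison group members on one baseline measure.
        program = df.loc[df[group_col] == "program", measure].dropna()
        comparison = df.loc[df[group_col] == "comparison", measure].dropna()
        result = stats.ttest_ind(program, comparison)
        return {"difference": program.mean() - comparison.mean(),
                "t": result.statistic, "p": result.pvalue}

    # Overall and separately by grantee (column names are hypothetical):
    # overall = baseline_difference(analytic_sample, "baseline_family_functioning")
    # by_grantee = {g: baseline_difference(d, "baseline_family_functioning")
    #               for g, d in analytic_sample.groupby("grantee_id")}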

A key use of follow-up data is to estimate program impacts. We will use baseline data to improve the statistical precision of impact estimates and control for any remaining differences between groups. The average impact estimate will be the weighted average of each site-specific impact, where the weight of each site-specific impact is the inverse of the squared standard error of the impact. As such, sites with more precise impact estimates (for example, sites with larger sample sizes or baseline variables that are highly correlated with the outcomes) will receive greater weight in the average impact estimate. We will compare the results using the sites with well-implemented RCT evaluations with those obtained from the RCT with issues and QED sample, noting that the former is most rigorous, whereas the latter should be considered suggestive or promising evidence of effectiveness.
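Expressed as a formula, the weighting rule described above gives the average impact estimate as

\[ \hat{\beta} = \frac{\sum_{s} w_s \hat{\beta}_s}{\sum_{s} w_s}, \qquad w_s = \frac{1}{\left[\mathrm{SE}(\hat{\beta}_s)\right]^{2}}, \]

where \(\hat{\beta}_s\) is the impact estimate for site \(s\) and \(\mathrm{SE}(\hat{\beta}_s)\) is its standard error, so sites with smaller standard errors contribute more to the pooled estimate.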

Overall performance

To inform Congress on the performance and progress of the RPG sites, we will produce two reports that estimate and report on performance measures for the 18 sites. The reporting will include selected measures collected and calculated for the (1) partnerships analysis, including partnership goals and collaboration; (2) participant experiences analysis, including feedback on RPG services from participants enrolled in RPG services and the participants’ experiences before enrolling in RPG services; (3) sustainability analysis; (4) enrollment and services analysis, including information about program operations, enrollment, and participation; (5) outcomes analysis, including detailed descriptions of the characteristics and outcomes associated with participating families; and (6) impacts analysis. To reduce the burden on grantees and local evaluators, we have designed the performance measures to overlap completely with the measures used in the other cross-site evaluation components, so no additional data are needed.

Time schedule and publications

This ICR is to continue to collect data for the cross-site evaluation for an additional three years. Once data collection is complete, reporting will continue through September 2027.

We will continue to produce three types of reports that summarize the progress and findings of the cross-site evaluation: annual reports, reports to Congress, and a final evaluation report (Table A.2). Each year, we will develop reports describing cross-site evaluation progress. Annual reports, starting in October 2025, will be designed for accessibility to a broad audience of policymakers and practitioners. For the overall performance component, we will produce two additional reports to Congress beginning in September 2026. A final evaluation report will provide a comprehensive synthesis of all aspects of the study over the entire contract, including integration and interpretation of both qualitative and quantitative data. Previous reports to Congress include the third, fourth, fifth, sixth, and seventh reports to Congress; the sixth and seventh reports are currently undergoing the clearance process at HHS.

Table A.2. Schedule for the RPG cross-site evaluation

Activity | Date
Data collection | May 2025–September 2027
Reports to Congress | Two reports (September 2026 and September 2027)
Annual reports | Annually, beginning October 2025
Ad-hoc reports or research briefs | As requested by the Children’s Bureau
Final evaluation report | September 2027



In addition to planned reports on the findings, RPG will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The cross-site evaluation team will produce up to two ad hoc reports or special topics briefs each year at the request of CB. Topics for these briefs will emerge as the evaluation progresses but could, for example, provide background on selected services offered by grantees, summarize program activities in tribal grantee sites, discuss impact or subgroup findings, or describe the grantees. Examples of additional reports previously created include “Continuous Quality Improvement: How Regional Partnership Grantees Can Use Data from the RPG Cross-Site Evaluation to Learn About Project Implementation,” “Offering Data Collection Incentives to Adults at Risk for Substance Use Disorder,” and “Collecting Sensitive Information and Encouraging Reluctant Respondents.”



A17. Reason(s) Display of OMB Expiration Date is Inappropriate

Approval not to display the expiration date for OMB approval is not requested.



A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this data collection.

1 Children’s Bureau. (2024). Child Maltreatment. U.S. Department of Health & Human Services, Administration for Children and Families, Administration on Children, Youth and Families. https://www.acf.hhs.gov/cb/data-research/child-maltreatment

2 The state determined maltreatment was substantiated or indicated (that is, maltreatment could not be substantiated under state law or policy, but there was reason to suspect that at least one child may have been maltreated or was at risk of maltreatment) (see footnote 1). The drug misuse risk factor was categorized as “drug abuse,” which was defined as “the compulsive use of drugs that is not of a temporary nature” (see footnote 1). These results were limited to 39 states that reported data on “drug abuse” as a possible risk factor. The “alcohol abuse” risk factor, which was defined as “the compulsive use of alcohol that is not of a temporary nature,” was reported by 33 states (see footnote 1).

3 Data on prenatal substance exposure were reported by 50 states.

4 Radel, L., Baldwin, M., Crouse, G., Ghertner, R., & Waters, A. Substance use, the opioid epidemic, and the child welfare system: Key findings from a mixed methods study. U.S. Department of Health and Human Services; 2018. https://aspe.hhs.gov/system/files/pdf/258836/SubstanceUseChildWelfareOverview.pdf

5 Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners.

6 Abdelazeem B, Abbas KS, Amin MA, El-Shahat NA, Malik B, Kalantary A, Eltobgy M. “The effectiveness of incentives for research participation: A systematic review and meta-analysis of randomized controlled trials.” PLoS One 17:4, 2022.

7 Krueger, Richard A. Focus Groups: A Practical Guide for Applied Research. Sage Publications, 2014.

8 Kelly, B., Margolis, M., McCormack, L., LeBaron, P. A., & Chowdhury, D. “What Affects People’s Willingness to Participate in Qualitative Research? An Experimental Comparison of Five Incentives.” Field Methods, 29(4), 2017: 333-350. https://doi.org/10.1177/1525822X17698958


9 The prior clearance for this OMB package included the semiannual progress reports. These reports are now covered under OMB package 0970-0490, Expiration date 3/31/2026.
