
REGIONAL PARTNERSHIP GRANTS (RPG) NATIONAL CROSS-SITE EVALUATION AND EVALUATION TECHNICAL ASSISTANCE





OMB Information Collection Request

0970-0527





Supporting Statement Part A - Justification

February 2022



Submitted By:

Children’s Bureau

Administration for Children and Families

U.S. Department of Health and Human Services



Summary

  • Type of Request: This Information Collection Request (ICR) is for a renewal of a currently approved information collection (OMB #: 0970-0527). The extension is to allow more time for the information collection and includes a revision to add a new data collection instrument.

  • Progress to Date: This ICR is a continuation of the national cross-site evaluation of Children’s Bureau’s Regional Partnership Grants (RPG), which began in 2017. These data collection activities are associated with the 18 grantees in the fifth and sixth cohorts. Data collection for these cohorts began in 2019 and 2020, respectively, and is currently underway.

  • Description of Request: This request includes all previously approved instruments and one new instrument, a survey of programs. Information collection will continue to be used in the national cross-site evaluation of the fifth and sixth RPG cohorts. The cross-site evaluation includes surveys, interviews, progress reports, and data on participant enrollment, services, and outcomes.




A1. Circumstances Making the Collection of Information Necessary

Substance misuse by parents and caregivers is linked to significant developmental delays for children and disruptions to healthy parenting. As defined by the Surgeon General (HHS, 2016), substance use disorder (SUD) is a medical illness caused by repeated misuse of a substance and is characterized by clinically significant impairments in health and social function, and impaired control over substance use. A SUD may limit parents’ ability to meet their children’s emotional and physical needs, affect healthy parental behavior, and inhibit parents’ ability to support child well-being (Lander et al. 2013). Consequently, children of parents with substance use issues are more likely to be placed in out-of-home care and more likely to stay in care longer than other children (Barth et al. 2006; Neger and Prinz 2015). Data from the U.S. Department of Health and Human Services (HHS, 2016) indicate that the number of children in foster care is growing and that parental substance misuse is increasingly a factor contributing to a child’s removal from the home.

The Children’s Bureau (CB) within the HHS Administration for Children and Families (ACF) seeks approval to continue to collect information for the Regional Partnership Grants to Increase the Well-being of and to Improve Permanency Outcomes for Children Affected by Substance Abuse (known as the Regional Partnership Grants Program or “RPG”) Cross-Site Evaluation and Evaluation-Related Technical Assistance project. The Child and Family Services Improvement Act of 2006 (Pub. L. 109-288) includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of HHS to reserve a specified portion of the appropriation for these RPGs, to be used to improve the well-being of children affected by substance abuse. Under four prior rounds of RPG, CB has issued 91 grants to organizations such as child welfare or substance abuse treatment providers or family court systems to develop interagency collaborations and integration of programs, activities, and services designed to increase well-being, improve permanency, and enhance the safety of children who are in an out-of-home placement or at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. In 2018 CB awarded 10 grants to a fifth cohort, and in 2019 CB awarded 8 grants to a sixth cohort. The current information collection request (ICR) is for continued data collection activities associated with these 18 grantees. The first three cohorts were included in previous ICRs (OMB Control Numbers 0970-0353 and 0970-0444), and data collection for the fourth cohort was covered in the previous three-year clearance under this OMB number (0970-0527).

The ongoing RPG cross-site evaluation will extend our understanding of the types of programs and services grantees provided to participants, how grantees leveraged their partnerships to coordinate services for children and families, and the outcomes for children and families enrolled in RPG programs. First, the cross-site evaluation will assess the coordination of partners’ service systems to better understand how partners’ collaborative effort affected the services offered to families. The cross-site evaluation will also focus on the partnership between the child welfare and SUD treatment agencies, to add to the research base about how these agencies can collaborate to address the needs of children and families affected by SUD. Second, the evaluation will describe the characteristics of participants served by RPG programs, the types of services provided to families, the dosage of each type of service received by families, and the level of participant engagement with the services provided. Finally, the evaluation will assess the outcomes of children and adults served through the RPG program, such as child behavioral problems, adult depressive symptoms, or adult substance use issues. 

This ICR is to continue the previously approved data collection activities that are currently ongoing and to add a sustainability survey as a new data collection activity. Approval of this request will allow CB to complete data collection, as needed, through the sixth cohort of RPG grantees. These data collection activities include the following: (1) site visits with grantees, (2) a web-based survey about grantee partnerships, (3) a web-based survey about sustainability planning, (4) semiannual progress reports, (5) enrollment and services data provided by grantees, and (6) outcomes and impacts data provided by grantees. The evaluation is being undertaken by CB through its contractor, Mathematica, and its subcontractor, WRMA, Inc.

Legal or administrative requirements that necessitate the collection

Authorization. The Child and Family Services Improvement Act of 2006 (Pub. L. 109-288) created the competitive RPG program. The legislation required HHS to select performance indicators; required grantees to report the indicators to HHS; and required HHS to report to Congress on (1) the services provided and activities conducted, (2) the progress made in addressing the needs of families with methamphetamine or other substance use disorders who come to the attention of the child welfare system, and (3) grantees’ progress achieving the goals of child safety, permanence, and well-being.

The first reauthorization. The September 30, 2011, passage of the Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) extended funding for the RPG program from federal fiscal year (FFY) 2012 to FFY 2016. The legislation removed the specific focus on methamphetamine abuse. It specified that grantees could apply for and be awarded multiple grants. In addition to the statutorily required reports for grantees and HHS, it required HHS to evaluate and report on the effectiveness of the grants by specified dates.

The second reauthorization. In 2018, the president signed the Bipartisan Budget Act of 2018 (Pub. L. 115-123) into law, reauthorizing the RPG program through FFY 2021 and adding a focus on opioid abuse. As part of the reauthorization, several changes were made to the RPG program, with the primary ones being a change in the required mandatory partners and a newly required planning phase that was not to exceed 2 years or a funding disbursement of $250,000. The Bipartisan Budget Act of 2018 (Pub. L. 115-123) is included in appendix A.


A2. Purpose and Use of the Information Collection

Purpose and Use

The purpose of the RPG cross-site evaluation is to learn about RPG programs and services and their potential effect on improving outcomes for children and families in the key areas of increased child well-being, family functioning and stability, adult recovery, improved permanency, and enhanced child safety. By analyzing data on RPG partnerships, services, and sustainability, CB seeks to understand how the proximal and distal outcomes are influenced by the partnership between the child welfare and substance use treatment agencies and the level of integration among the partners (partnerships analysis). In addition, CB seeks to understand how the partnership influences service delivery and how the services provided influence outcomes (enrollment and services analysis), and how the partnership plans to sustain services and programs after grant funding ends. An outcomes and impacts analysis will also be conducted. The inclusion of a rigorously designed impact study using a subset of grantees will also provide CB, Congress, grantees, providers, and researchers with information about the effectiveness of RPG programs.

The findings from the RPG cross-site evaluation will be used by policymakers and funders to consider what strategies and programs they should support to meet the needs of these families. Providers can use the findings to select and implement strategies and program models suited to the specific families and communities they serve. Evaluation findings can fill research gaps by rigorously testing program models that have prior evidence of effectiveness with some target populations but not the RPG target populations, or when provided in combination with other services and programs. Congress will also use information provided through the evaluation to examine the performance of the grantees and the grant program. This could be helpful in the development of future policy.

Details on the purpose and use of the information collected through each instrument used to support the partnerships, enrollment and services, sustainability efforts, and outcomes and impacts analyses follow the research questions. Note: This request includes one new instrument, the sustainability survey (Appendix D). All other data collection materials described below have been previously approved and there are no changes proposed.

Research Questions

Taking these goals into consideration, the cross-site evaluation aims to address the following research questions:

Partnerships analysis

  1. What partners were involved in each RPG project and how did they work together?

  2. How much progress did the RPG projects make toward interagency collaboration and service coordination?

  3. How do the child welfare and substance use treatment agencies work together?

Enrollment and services analysis

  4. What referral sources did projects use?

  5. Who enrolled in RPG?

  6. To what extent did RPG projects reach their target populations?

  7. What core services were provided and to whom?

  8. Were core services that families received different from the services that were proposed in the RPG project applications? If so, what led to the changes in planned services?

  9. How engaged were participants with the services provided?

  10. How did grantees and their partners collaborate to provide services?

Outcomes analysis

  11. What are the well-being, family functioning, recovery, permanency, and safety outcomes of children and adults who received services from RPG projects?

Impact analysis

  12. What are the impacts of RPG projects on children and adults who enrolled in RPG?

Sustainability analysis

  13. What plans and activities did RPG projects undertake to maintain the implementation infrastructure and processes during and after the grant period?

  14. What plans and activities did RPG projects undertake to maintain the organizational infrastructure and processes after the grant period?

  15. To what extent were RPG projects prepared to sustain services after the grant period?

  16. What plans and activities did RPG projects undertake to develop funding strategies and secure resources needed after the grant period?

  17. How did the federal, state, and local context affect RPG projects and their efforts to sustain RPG services?



Study Components Overview

Partnerships analysis

The partnerships analysis will assess the collaboration and coordination of services the RPG projects provided for families. The analysis will examine what partner agencies are involved in each project, the roles they play, and the extent of collaboration among partners, ranging from sharing a vision and goals to integrating assessment and treatment. In addition, the analysis will explore the interagency collaboration and coordination of the child welfare and substance use treatment agencies, specifically examining topics such as competing priorities within each agency, conflicting timelines of recovery and permanency decisions, and limited sharing of data between agencies. Advancing the collaboration and coordination of these two agencies is critical to the success of the RPG partnerships because they aim to serve the same families and support their well-being. The analysis will primarily draw on two data sources:

  • Grantee and partner staff site visit topic guide (Appendix B). This topic guide collects detailed information from selected project and grantee staff and partners about the RPG planning process, how and why particular partners were selected, how the partnerships developed, changes in partnerships and the rationale for those changes, the project director’s perceptions of partnership quality, partnership challenges, and lessons learned. In addition, site visitors interview representatives from the child welfare provider and substance use treatment agency (if those differ from the grantee) to understand their role in RPG planning, their roles and responsibilities in the program, views on the goals of RPG, agency goals and priorities, reconciling competing priorities, and any policy or process changes within the agencies resulting from collaboration on RPG.

  • Partner survey (Appendix C). The partner survey is administered to grantees and their partners. This survey gathers information on the characteristics of the partner organizations, how partners communicate and collaborate, goals of the partnership, and the types of organizations and roles within the partnership.

Sustainability analysis

The sustainability analysis will describe grantees’ efforts to sustain their RPG services after grant funding ends. This analysis will rely on a web-based sustainability survey (Appendix D), which will be provided to grantees and selected partners. This survey will gather information about organizations’ involvement in plans and activities to improve services during and after the grant period, and to sustain the RPG project after the grant ends.

Enrollment and services analysis

The enrollment and services analysis will describe who was enrolled in the RPG projects and what RPG services they received. The analysis will examine how grantees defined and refined their target populations over the course of their projects and why those changes occurred. It will provide an expanded picture of all core services provided to families enrolled in RPG. Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners. The analysis also seeks to describe how engagement varied across participants and services, and how grantees and their partners collaborated to provide the services. The enrollment and services analysis will use the following data sources:

  • Semiannual progress reports (Appendix E). Grantee project directors complete progress reports twice each year with updated information about their projects, including any changes from prior periods. CB has tailored the semiannual progress reports to collect information on grantees’ services, the target population for the RPG program, project operations, partnerships, and grantees’ perceived successes and challenges to implementation.

  • Enrollment and services data (Appendix F). These data describe participants’ characteristics at enrollment and the services they receive. Grantees record the enrollment date for each RPG family or household and demographic information on each family member including date of birth, ethnicity, race, primary language spoken at home, type of current residence (children only), income (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data is collected. Grantees also record service contact information for core services and dates they exit RPG.

Outcomes analysis

The outcomes analysis will describe the characteristics of participating families and their outcomes in the five domains: (1) child well-being, (2) family functioning and stability, (3) adult recovery, (4) child permanency, and (5) child safety.

Grantees administer five instruments at project entry and exit to obtain data on child well-being for a focal child identified in each RPG case, and for the family functioning/stability and recovery domains, as follows (also in Appendix G):

  1. Child well-being (one of the following age-appropriate instruments depending on the age of the focal child)

    • Child Behavior Checklist-Preschool Form (Achenbach and Rescorla 2000)

    • Child Behavior Checklist-School-Age Form (Achenbach and Rescorla 2001)

    • Infant-Toddler Sensory Profile (Dunn 2002)

  2. Family functioning and stability (both)

    • Adult-Adolescent Parenting Inventory (Bavolek and Keene 1999)

    • Center for Epidemiologic Studies-Depression Scale, 12-Item Short Form (Radloff 1977)

  3. Adult recovery (both)

    • Addiction Severity Index, Self-Report Form (drug and alcohol scales only) (McLellan et al. 1992)

    • Trauma Symptoms Checklist-40 (Briere and Runtz 1989)

Grantees also obtain data from administrative records maintained by local or state child welfare, foster care, and substance use treatment agencies for their local evaluations, and provide a core set of records to the cross-site evaluator. These records are used to create measures of child safety and permanency, and adult receipt of substance use treatment services and their recovery. Grantees receive a list and specifications of the core set of records needed (see Appendix H).

Impacts analysis

The impacts analysis aims to provide pooled estimates of the effectiveness of RPG projects among grantees with rigorous local evaluation designs. All grantees that have a well-specified quasi-experimental or randomized controlled trial design and a non-administrative data comparison group will be part of the impacts analysis. Grantees in the impacts analysis will collect data using the same set of standardized instruments and obtain the same administrative data on the comparison group as described above for the outcomes analysis (see Appendices G and I).

Universe of Data Collection Activities

The RPG cross-site evaluation includes the following data collection activities to support the partnerships, enrollment and services, sustainability efforts, and outcomes and impacts analyses:

  1. Site visits and key informant interviews. To understand the design and implementation of RPG projects, the cross-site evaluation team visits the 8 RPG6 sites, which were funded in FY2019, to better understand the partnership and coordination between the child welfare and SUD treatment agencies. The site visits focus on the RPG planning process; how and why particular services were selected; the ability of the child welfare, substance use treatment, and other service systems to collaborate and support quality implementation of the RPG services; challenges experienced; and the potential for sustaining the collaborations and services after RPG funding ends.

  2. Partner survey. To describe the interagency collaboration within RPG sites, all RPG6 grantees and their partners participate in an online survey one time during the grant period. One person from each organization knowledgeable about the RPG project is invited to participate in the survey. The survey collects information about communication and service coordination among partners. The survey also collects information on characteristics of strong partnerships (such as data-sharing agreements, co-location of staff, referral procedures, and cross-staff training).

  3. Sustainability survey. To describe projects’ use of data for continuous improvement and their sustainability planning activities, all RPG grantees (and selected knowledgeable partners) will participate in an online survey one time during the grant period. Seven people from each grantee site (where each site includes the grantee organization and their partners), who are knowledgeable about the RPG project, will be invited to participate in the survey. The survey will collect information about supports within the partnership that can help improve and sustain RPG services, such as continuous use of data for service improvement, identification of a lead organization, and policies needed after grant funding ends. In addition, the survey will collect information about funding sources and resources needed after the end of the grant.

  4. Semiannual progress reports. All grantee project directors complete semiannual progress reports with updated information about their projects, including any changes from prior periods. CB has tailored the semiannual progress reports to collect information on grantees’ services, the target population for the RPG program, project operations, partnerships, and grantees’ perceived successes and challenges to implementation.

  5. Enrollment and services data. To document participants’ characteristics and their enrollment in RPG services, all grantees provide data on enrollment of and services provided to RPG families. These data include demographic information on family members, dates of entry into and exit from RPG services, and information on RPG service dosage. These data are submitted regularly by staff at the grantee organizations into an information system developed by the cross-site evaluation contractor and subcontractor.

  6. Outcome and impact data. To measure participant outcomes, all grantees use self-administered standardized instruments to collect data from RPG adults. The standardized instruments used in RPG collect information on child well-being, adult and family functioning, and adult substance use. Grantees also obtain administrative data on a common set of child welfare and SUD treatment data elements. Grantees share the responses on these self-report instruments and the administrative data with the cross-site evaluation team through an information system developed by the cross-site contractor and subcontractor.



A3. Use of Improved Information Technology and Burden Reduction

The RPG cross-site evaluation uses technology to collect study information. The only exceptions are for the semi-structured in-person interviews conducted during site visits, and the written semiannual progress reports. The cross-site evaluation uses technology to improve the user experience and reduce burden in the following ways:

        • Web-based partner and sustainability surveys. The surveys of grantee staff and partners are administered via the web. Compared with other survey modes, web-based surveys offer ease and efficiency to respondents and help ensure data quality. The surveys are programmed to automatically skip questions not relevant to the respondent, thus reducing cognitive and time burden. The instruments also allow respondents to complete the surveys at a time convenient to them. If respondents are unable to complete the survey in one sitting, they can save their place in the survey and return to the questionnaire later. Validation checks and data ranges are built into appropriate items to ensure data quality.

        • Data entry system to collect data from grantees. The evaluation contractor and its subcontractor operate a web-based data reporting system, known as the RPG-Evaluation Data System (RPG-EDS). RPG-EDS has a user interface accessible from any computer, allowing for ease of entry, while all data are housed on secure servers behind the contractors’ firewalls, thereby maintaining data security. The system has been modeled after the data systems used with prior cohorts of RPG grantees. It includes two applications, each designed to facilitate efficient reporting of (1) enrollment and services data and (2) outcomes data. The system can be used by multiple users at each organization and provides varying levels of access depending on users’ needs. For example, administrators or supervisors have the greatest rights within the system, being able to create new users, assign program participants to staff members, and review all activity from the organization. Staff providing direct services to study participants are only able to record and review information about participants assigned to their caseload. These tiered levels of access streamline data entry: limiting full system access to a small set of staff members increases data security, reduces respondent confusion, and supports the collection of higher quality information.

  • Enrollment and services data. On a rolling basis, grantee staff use the enrollment and services data application to provide demographic information on each RPG case at enrollment, as well as enrollment and exit dates for the RPG project and information on each service in which case members enroll. The design of the RPG-EDS enrollment and services data entry application is based on web-based case management systems that Mathematica has developed and implemented successfully for multiple projects, including evaluations of prior RPG cohorts that involved collecting similar data from similar types of providers. For example, the enrollment and services data entry application is flexible and easy to use, and includes navigational links to relevant fields for each type of entry to minimize burden on grantee staff and increase the quality and quantity of data collected.

  • Outcomes data. Each grantee reports data from standardized instruments and a list of data elements they draw from administrative records. Grantees develop their own project or agency databases to store these data. The grantee database includes all data the grantee collects from clients or on behalf of clients. The contractor provides format specifications to the grantees to use when uploading outcomes data to RPG-EDS. These are in easy-to-use PDF and Microsoft Excel formats. Twice a year, grantees upload these data to RPG-EDS. This application in RPG-EDS is modeled on the system that was used to obtain these types of data from RPG recipients during previous rounds of grants. Eleven of the current grantees were also in prior RPG cohorts and reported data through the existing or prior data systems; thus, they are well prepared to use this type of application. Importantly, the application in RPG-EDS incorporates advances in technology and software, and improved programming approaches. These improvements enhance the experience of providing outcomes data for this RPG cohort, including reducing the time to prepare and upload data to the system.



A4. Efforts to Identify Duplication and Use of Similar Information

The RPG cross-site evaluation is specifically designed to minimize duplication of data collection efforts. First, grantees are legislatively required to report performance indicators aligned with their proposed program strategies and activities. A key strategy of the RPG cross-site evaluation is to minimize burden on the grantees by ensuring that the data grantees share with the cross-site evaluation fully meet the need for performance reporting. Thus, rather than collecting separate evaluation and performance indicator data, the grantees need only participate in the cross-site evaluation.

Second, data shared by grantees or provided through direct collection from grantees, staff members, and partners for the cross-site evaluation also serve to describe grantee performance. That is, to reduce duplication of efforts for grantees to comply with both CB’s local and cross-site evaluation requirements and legislatively mandated performance indicators, the cross-site evaluation data must completely overlap with data needed for performance indicators. Because no existing reporting systems collect the data required for reporting to Congress or for the cross-site evaluation, this data collection plan does not duplicate any current efforts.

Furthermore, the design of the cross-site evaluation instruments prevents duplication of data collected through each instrument. For example, during the semi-structured interviews conducted during site visits, partner representatives are not asked any questions included in the partner survey or sustainability survey. In creating the instruments for the outcomes and impacts analysis, the contractor reviewed and performed a crosswalk of all items to identify duplication across instruments. Any duplicate items not needed for scoring the instruments were removed from the versions of the standardized instruments provided in the outcomes and impacts instruments. This not only reduces burden on RPG participants providing data for grantees’ local evaluations, but also reduces the burden on grantee staff preparing and uploading outcomes data to the cross-site evaluation.


A5. Impact on Small Businesses or Other Small Entities

The potential exists to affect small entities within a grantee site, depending on the local community partners and funders with which grantees engage. RPG grantees and partners are included in the site visit interviews, the partner survey, and the sustainability survey. Additionally, grantee agencies and possibly partners enter data into the RPG-EDS. Proposed data collection for these efforts aims to minimize the burden on all organizations involved, including small businesses and entities, and is consistent with the aims of the legislation establishing RPG and CB’s need for valid, reliable, and rigorous evaluations of federally funded programs.



A6. Consequences of Collecting the Information Less Frequently

Not collecting information for the RPG cross-site evaluation would limit the government’s ability to document the performance of its grantees, as legislatively mandated, and to assess the extent to which these federal grants successfully achieve their purpose. Furthermore, the RPG cross-site evaluation is a valuable opportunity for CB, practitioners, and researchers to learn about the implementation and effectiveness of coordinated strategies and services for meeting the needs of families in the child welfare and substance use treatment systems. The study will examine whether the government’s strategy of funding collaborative, cross-system partnerships is a productive one that is likely to be sustained after the grant period ends. It will also describe how well partnerships are collaborating, the characteristics of the participants enrolling in RPG, the services provided to participants, and the outcomes and impacts on children and adults enrolled in RPG.

The information collection proposed is necessary for a successful cross-site evaluation. The consequences of not collecting this information or collecting the information less frequently are discussed as follows for each data collection element:

        • Grantee and partner staff topic guide. Without the information being collected through interviews with grantee and partner staff, the cross-site evaluation would have to rely entirely on information reported by a single source: the RPG project directors through the semiannual progress reports. Thus, the study would lack the broader perspectives of other key participants, and it would not be possible to conduct any in-depth analysis of critical program challenges and successes or implementation issues.

        • Partner survey. Without the partner survey, CB would not be able to collect information to understand the roles that partners play in RPG, the communications and working relationships among partners, the quality of their collaboration, and their goals for the RPG project in their region. Partnerships are a key element of the RPG program, but the literature shows that collaboration and service integration between child welfare agencies, substance use treatment providers, and other key systems such as the courts has been rare or challenging in the past. In addition, many federal initiatives require grantees to establish collaborations. Thus, collecting these data helps fill important gaps in knowledge for RPG and potentially other grant programs.

        • Sustainability survey. Without the sustainability survey, CB would not be able to collect information to understand planning for the continued implementation of services or programs after the period of performance for the grants, which would make it difficult to assess the programs’ return on CB’s investment. Thus, collecting these data increases knowledge on whether funding, staffing, and other resources are in place to continue the partnerships after the grants end.

        • Semiannual progress reports. Without obtaining information from the semiannual progress reports, the study would not have detailed information about grantee operations; changes to planned interventions, target population and eligibility criteria, or target outcomes; or planned and unplanned changes to services provided to participants. The progress reports provide timely information about the infrastructure that grantees put in place to support implementation as well as features of the community context that have influenced grantees’ implementation plans. Collecting this information less often than twice a year would violate the federal requirements for grantee progress reporting and would place larger burdens on respondents to remember or store information about events, changes in direction, or challenges and successes over a longer period. Because aggregate information from the reports is extracted and shared with grantees for program improvement and peer learning, less frequent reporting would also limit grantees’ ability to consider adjustments or reach out to one another. The data also provide a critical supplement to other data being collected and provide information for designing evaluation-related and programmatic technical assistance in response to emerging issues and grantee needs.

        • Enrollment and services data. The enrollment and services data are important for describing actual service delivery to cases and for tracking all activities completed with the participants, including assessments, referrals, education, and support. Data are collected when participants enroll, as they receive services, and at exit. Without these data, the study would have no information on the services recipients actually receive, including their duration and dosage. The evaluation would be unable to link participant outcomes to the levels or combinations of specific services or understand whether and how participants engaged in the selected services. If data were collected less frequently, providers would have to store services data or try to recall them weeks or months after delivery. Regular collection also enables data quality checks to address missing data, errors, or other problems in a timely way.

        • Outcomes data. It is CB’s mission to ensure child well-being, safety, and permanency for children who are at risk of or experience maltreatment. The outcomes instruments provide detailed information on these outcomes and the participants who receive services. Grantees upload data from the outcomes instruments twice each year. Without this information, evaluators would be unable to describe the outcomes of RPG program participants or analyze the extent to which grants have affected the outcomes of or addressed the needs of families co-involved with substance use treatment and child welfare systems. Further, it would be impossible to conduct an impact study (described next). During each upload, RPG-EDS performs automatic validation checks of the quality and completeness of the data. Mathematica then reviews submissions to address any remaining data quality issues and works with grantees to resolve problems. If data were uploaded less often, it would be more cumbersome and challenging for grantees to search through older records to correct or provide missing data.

        • Impacts analysis. Grantees participating in the impacts analysis also upload outcomes data for participants in their comparison group (that is, those who do not receive RPG services or receive only a subset of RPG services). Without this information, it would not be possible to rigorously analyze the effectiveness of the interventions by comparing outcomes for individuals with access to RPG services with those in comparison groups. Uploading the data every six months provides the same benefits with respect to data quality described above.



A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances requiring deviation from these guidelines.



A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on December 13, 2021, Volume 86, Number 236, page 70844, and provided a sixty-day period for public comment. We did not receive comments.



A9. Explanation of Any Payment or Gift to Respondents

No payments or gifts are provided to respondents as part of data collection.



A10. Assurance of Confidentiality Provided to Respondents

This study is being conducted in accordance with all relevant regulations and requirements, including meeting the Federal Confidentiality Protection Requirements, under 42 CFR Part 2, and Human Subjects Protection Requirements, under 45 CFR Part 46, with respect to the data collected, analyzed, and reported upon. Several specific measures are taken to protect respondent privacy.

        • Adopting strict security measures and web security best practices to protect data collected through RPG-EDS. Data collected through RPG-EDS (which include outcomes data as well as enrollment and services data) are housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. The data portal employs strict security measures and web security best practices to ensure the data are submitted, stored, maintained, and disseminated securely and safely. These measures protect the privacy of participant information stored in the system and include data authentication, monitoring, auditing, and encryption. Specific security procedures include, but are not limited to, the following:

  • The system underwent the HHS security authorization process and obtained an Authority to Operate in 2019.

  • All data are encrypted in transit and at rest and reside behind firewalls.

  • Access to RPG-EDS is restricted to approved staff members who are assigned a password only with permission from the study director. Each user has a unique user ID/password combination and is enrolled in the project’s multifactor authentication system.

  • Database access requires special system accounts. RPG-EDS users are not able to access the database directly.

  • RPG-EDS users can access the system only within the scope of their assigned roles and responsibilities.

  • Security procedures are integrated into the design, implementation, and day-to-day operations of RPG-EDS.

  • All data files on multi-user systems are under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. To further ensure data security, project personnel must adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment.

        • Training cross-site evaluation interviewers in privacy procedures. All site visit interviewers are trained on privacy procedures and are prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview, site visit informants are told that none of the information they provide is used for monitoring or accountability purposes and that the results of the study are presented in aggregate form only.

        • Using web-based partner and sustainability surveys. Administering the partner and sustainability surveys via web over secured servers eliminates security risks related to shipping hard-copy forms containing personally identifiable information (PII) to the evaluator.

        • Assignment of content-free case and participant identification numbers to replace PII associated with all participant outcomes data provided by grantees to the cross-site evaluation. The cross-site evaluation team develops standard procedures for assigning identification numbers to all participant-level data and works with grantees to implement them. Case- and individual-level numbers are content-free; for example, they do not include special codes to indicate enrollment dates, participant location, gender, age, or other characteristics.



A11. Justification for Sensitive Questions

There are no sensitive questions in the instruments that the contractor uses to collect data.

Some of the specified standardized instruments that grantees use to collect data do include sensitive questions. For example, in the case of parents who are endangering their children as a result of their substance use, grantees must measure the parents’ pattern of substance use as a critical indicator of recovery. Grantees share de-identified data with the evaluation team; no identifiable information is shared with the cross-site evaluation team. In recognition of the need for grantees to collect this information, and to ensure confidentiality and other protections for their clients, all grantees must obtain IRB clearance for their data collection as a condition of their RPG awards. As part of their IRB submissions, grantees explain the process through which they share de-identified data from these standardized instruments with the cross-site evaluation.



A12. Estimates of Annualized Burden Hours and Costs

The estimated reporting burden and cost for the data collection instruments included in this ICR are presented in Table A.1.

The sustainability survey is the only new instrument proposed with this request; all other instruments have been previously approved, and this request is to continue use of those approved instruments through the end of the sixth cohort of RPG grants. The fifth cohort grants end in 2023, and the sixth cohort grants end in 2024; we are requesting clearance to collect data within a three-year period.

We estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Services Manager” ($36.13), that of grantee staff to be the average hourly wage of “Counselors, Social Workers, and Other Community and Social Service Specialists” ($25.09), that of data managers to be the average hourly wage of “Database Administrators and Architects” ($48.60), that of data entry specialists to be the average hourly wage of “Data Entry Keyers” ($17.24), and that for partners to be the average hourly wage of “General and Operations Manager” ($60.45), taken from the U.S. Bureau of Labor Statistics, Occupational Employment Statistics survey, 2020. Table A.1 summarizes the proposed burden and cost estimates for the use of the instruments and products associated with the partnerships, enrollment and services, and outcomes and impacts analyses.

For each burden estimate, annualized burden has been calculated by dividing the estimated total burden hours by the three years covered by this submission for RPG6 grantees and two years for RPG5 grantees. Figures are estimated as follows:

Site visit and key informant data collection

        • Individual interview with program director. We expect to interview 8 RPG program directors (1 per grantee across 8 grantees) once during the evaluation period. These interviews will take 2 hours. The total burden for individual interviews with program directors is 16 hours, and the total annualized burden is 5 hours.

        • Individual interview with program manager or supervisor. We expect to conduct individual, semi-structured interviews with 8 program managers or supervisors (1 per grantee across 8 sites) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with program managers is 8 hours, and the total annualized burden is 3 hours.

        • Individual interview with frontline staff. We expect to conduct individual, semi-structured interviews with 16 frontline staff (2 staff members per grantee across 8 sites) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with frontline staff is 16 hours, and the total annualized burden is 5 hours.

        • Partner representative interviews. We expect to conduct individual, semi-structured interviews with 24 partner representatives (3 partners per grantee across 8 grantees) once during the evaluation. These interviews will take 1 hour. The total burden for the individual interviews with partner representatives is 24 hours, and the total annualized burden is 8 hours.

        • Partner survey. We expect to administer the web-based survey once to 40 grantee partners (5 per site across 8 RPG6 sites). The survey will take 25 minutes to complete. The total burden for the partner survey is 16.67 hours, and the total annualized burden is 6 hours.

        • Sustainability survey. We expect to administer the web-based survey once to 126 grantee key staff and partners (7 per site across the 18 RPG sites). The survey will take approximately 25 minutes to complete. The total burden for the sustainability survey is 52.5 hours, and the total annualized burden is 18 hours.
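As a check on the annualization arithmetic above, the following is a minimal sketch that computes total and annualized burden with exact fractions, rounding only at the end to match the rounded figures in this section:

```python
from fractions import Fraction

def burden(respondents, minutes_per_response, years=3):
    """Total burden hours (respondents x hours per response) and
    annualized burden (total divided by the years of clearance)."""
    total = Fraction(respondents * minutes_per_response, 60)
    return total, total / years

# Partner survey: 40 respondents x 25 minutes = 16.67 total hours, ~6 per year.
survey_total, survey_annual = burden(40, 25)

# Program director interviews: 8 respondents x 120 minutes = 16 total hours, ~5 per year.
interview_total, interview_annual = burden(8, 120)
```

The same pattern reproduces the other single-administration site-visit figures listed above.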

Enrollment and services data

        • Semiannual progress report. Grantees will submit two progress reports per year for each year of the evaluation period. We assume that 8 RPG6 project directors (1 per grantee) will prepare and submit the semiannual progress reports six times during the evaluation period, and 10 RPG5 project directors (1 per grantee) will submit the semiannual progress reports four times. It will take 16.5 hours to prepare and submit each one. The total burden for submitting the semiannual progress report is 1,452 hours, and the total annualized burden is 594 hours.

        • Case enrollment. Based on grantee estimates, we assume enrollment of 100 families per year per grantee. We assume that 3 staff per grantee will conduct enrollment, or 54 staff total. Each staff person will enroll about 33 families per year. It will take 15 minutes to enroll each family using RPG-EDS. Thus, the total burden for enrolling families across all staff members is 1,089 hours, and the total annualized burden is 446 hours.

        • Case closure. Based on grantee estimates, we assume 100 cases will close each year, per grantee. We assume that 3 staff per grantee will conduct case closures in RPG-EDS, or 54 staff total. Each staff member will close 33 cases per year. It will take 1 minute to close a case. Thus, the total burden for case closure across all staff members is 73 hours, and the total annualized burden is 30 hours.

        • Case closure – prenatal cases. We assume one-tenth of cases, or 10 families per grantee per year, will include pregnant women. We assume 1 staff member per grantee will conduct case closures for prenatal cases, which will require additional time at closure. It will take 1 additional minute to close a case in RPG-EDS. Thus, the total burden for closing prenatal cases across all staff members is 7 hours, and the total annualized burden is 3 hours.

        • Service log entries. Based on the expected participation of families in specific RPG services, we assume there will be two service log entries each week for each family (104 entries per family per year) in RPG-EDS. We assume that 6 staff per grantee will enter services data (108 staff total), with a caseload size of 15 families each. Each weekly entry will take 2 minutes. Thus, the total annualized burden is 5,560 hours.
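The semiannual progress report figures above follow the annualization rule described earlier in this section (totals divided by three years for the 8 RPG6 grantees and two years for the 10 RPG5 grantees). A short arithmetic sketch:

```python
HOURS_PER_REPORT = 16.5

# 8 RPG6 project directors submit 6 reports each during the evaluation period;
# 10 RPG5 project directors submit 4 reports each.
rpg6_hours = 8 * 6 * HOURS_PER_REPORT   # 792 total burden hours
rpg5_hours = 10 * 4 * HOURS_PER_REPORT  # 660 total burden hours

total_burden = rpg6_hours + rpg5_hours        # 1,452 total burden hours
annualized = rpg6_hours / 3 + rpg5_hours / 2  # 264 + 330 = 594 annualized hours
```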

Outcomes and impacts data

Administrative data

        • Obtain access to administrative data. During the cross-site evaluation, grantees will review all data submission instructions, and grantee agency personnel will develop a data management plan and the necessary administrative agreements (such as memoranda of understanding) with agencies that house the administrative records to obtain the requested records. They will implement data protocols, including mapping their data fields to the fields in RPG-EDS. Finally, they will pilot the data request and receipt process. It will take 220 hours to obtain initial access. Thus, the total burden for obtaining initial access across all 18 grantees is 3,960 hours. Grantees will then update administrative agreements with agencies that house the administrative records once each subsequent year of the study. It will take 18 hours to update each agreement. Thus, the total burden for the 18 grantees to update agreements (once for RPG5 grantees and twice for RPG6 grantees) is 468 hours. The combined burden for obtaining initial and ongoing access to administrative data is 4,428 hours. Grantees will use these data for their local evaluations as well; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps might be necessary. Therefore, we have assumed that half of the burden of obtaining the administrative data (2,214 hours) should be allocated to the cross-site evaluation. The annualized burden is 738 hours. We assume 1 data manager per grantee (or 18 data managers) will complete these processes.

        • Report administrative data. Grantees will upload administrative data they have obtained to RPG-EDS twice per year for the evaluation period (two years for RPG5 and three years for RPG6). For each upload, each grantee will require 144 hours to prepare and upload their administrative data, including correcting any data validation problems. The total burden for reporting administrative data is thus 12,672 hours for all 18 grantees combined, and the total annualized burden is 5,184 hours. We assume that 1 data entry operator per grantee (or 18 data entry operators) will upload the data.

Standardized instruments

        • Data entry for standardized instruments. Over the course of the study period, each grantee will enroll 100 cases each year (for a total of 2,000 cases for RPG5 and 2,400 cases for RPG6). For every case, five standardized instruments will be administered at baseline and again at program completion. Grantees will enter data from the completed instruments into their local databases, and data entry for each instrument will take 15 minutes (0.25 hours) per administration (1.25 hours total). RPG grantees will use these data for their local evaluations; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps to enter these data into their local databases might be necessary. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 2,750 hours, and the total annualized burden is 1,125 hours. We assume that 18 data entry operators (1 operator in each site) will enter the data.

        • Review records and submit electronically. Grantees will review records to ensure that all data have been entered and upload the data to RPG-EDS twice per year for each year of the evaluation period. It will take 3 hours to review and submit data for each of the five instruments twice per year. Grantees will then validate and resubmit data when errors are identified. It will take 2 hours to validate data for each of the five instruments, including time for obtaining responses to validation questions and resubmitting the data. Thus, the total burden is 2,200 hours, and the annualized burden is 900 hours. We assume that 18 data entry operators (1 operator in each site) will review and submit the data.

  • Data entry for comparison study sites. Sixteen grantees (9 RPG5 grantees and 7 RPG6 grantees) participating in the impact study will also enter data for control group members. For every member, five standardized instruments will be administered at baseline and follow-up. Grantees will enter data from the completed instruments into their local databases. It will take 0.25 hours for each administration (1.25 hours total). RPG grantees will use these data for their local evaluations as well; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps to enter these data into their local databases might be necessary. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 2,438 hours, and the total annualized burden is 1,000 hours. We assume the same enrollment size as grantees (100 cases per year) and that 16 data entry operators (1 operator in each of the 16 sites) will enter the data.

Table A.1. Estimate of burden and cost for the RPG evaluation

| Data collection activity | Total number of respondents | Number of responses per respondent (each year) | Average burden per response (in hours) | Total annual burden hours | Average hourly wage | Total annualized cost |
| --- | --- | --- | --- | --- | --- | --- |
| Site visit and key informant data collection | | | | | | |
| Program director individual interview | 8 | 0.33 | 2 | 5 | $36.13 | $192.69 |
| Program manager/supervisor individual interviews | 8 | 0.33 | 1 | 3 | $36.13 | $96.35 |
| Frontline staff interviews | 16 | 0.33 | 1 | 5 | $25.09 | $133.81 |
| Partner representative interviews | 24 | 0.33 | 1 | 8 | $36.13 | $289.04 |
| Partner survey | 40 | 0.33 | 0.42 | 6 | $60.45 | $335.83 |
| Sustainability survey | 126 | 0.33 | 0.42 | 18 | $60.45 | $1,057.88 |
| Enrollment, client, and service data | | | | | | |
| Semiannual progress reports | 18 | 2 | 16.5 | 594 | $36.13 | $21,461.22 |
| Case enrollment data | 54 | 33 | 0.25 | 446 | $25.09 | $11,177.60 |
| Case closure | 54 | 33 | 0.0167 | 30 | $25.09 | $745.17 |
| Case closure – prenatal | 18 | 10 | 0.0167 | 3 | $25.09 | $75.27 |
| Service log entries | 108 | 1,560 | 0.033 | 5,560 | $25.09 | $139,500.40 |
| Outcome and impact data: administrative data | | | | | | |
| Obtain access to administrative data | 18 | 1 | 41 | 738 | $48.60 | $35,866.80 |
| Report administrative data | 18 | 2 | 144 | 5,184 | $17.24 | $89,372.16 |
| Outcome and impact data: standardized instruments | | | | | | |
| Enter data into local database a | 18 | 100 | 0.625 | 1,125 | $17.24 | $19,395.00 |
| Review records and submit | 18 | 2 | 25 | 900 | $17.24 | $15,516.00 |
| Data entry for comparison study sites (16 grantees) a | 16 | 100 | 0.625 | 1,000 | $17.24 | $17,240.00 |
| Estimated totals | | | | 15,625 | | $352,455.22 |

a Burden hour estimates assume that only half of this burden is part of the cross-site evaluation.

A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional costs on respondents or record keepers.



A14. Annualized Cost to the Federal Government

The estimated cost for completing the RPG cross-site evaluation data collection over the three years of the requested clearance is $397,081. The annualized cost to the federal government is one-third of that total ($132,360).



A15. Explanation for Program Changes or Adjustments

This request is to continue data collection under OMB #0970-0527 using the previously approved instruments and to add one new survey of grantee sustainability efforts. No changes are proposed to previously approved instruments, but we have updated Appendix labeling and expiration date information within the documents.



A16. Plans for Tabulation and Publication and Project Time Schedule

Plans for tabulation

The information from the RPG cross-site evaluation—with a focus on partnerships, services, and outcomes for families—will be useful to funders, practitioners, and other stakeholders interested in targeting resources to effective approaches to address the needs of children at risk of maltreatment due to adult substance misuse. Identifying what has worked well allows subsequent efforts of program operators and funders to home in on evidence-based practices and strategies.

Partnerships, enrollment and services, sustainability, and outcomes analyses

Data from the instruments included in this OMB package will be analyzed using qualitative and quantitative methods to describe the target populations’ characteristics and outcomes; program services, dosage, and participant engagement; program sustainability; and the structure, quality, and goals of partnerships. An enrollment and services analysis of participants will provide a snapshot of child, adult, and family characteristics at entry, and their outcomes. Thoroughly documenting program services and partnerships will expand understanding of the breadth of programs, practices, and services being offered through RPG to vulnerable families and will describe successes in achieving goals and barriers encountered. A greater understanding of how programs can be implemented with a network of partners might inform future efforts in this area.

Mathematica will use standard qualitative procedures to analyze and summarize information from project staff and partner interviews conducted using the semi-structured staff interview topic guide. These procedures include organization, coding, and theme identification. Standardized templates will be used to organize and document the information and then code interview data. Coded text will be searched to gauge consistency and consolidate data across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, or categories (Yin 1994; Coffey and Atkinson 1996), which can then be analyzed to address the study’s research questions.

Quantitative data will be summarized using basic descriptive methods. For the outcomes analysis, data from the standardized instruments will be tabulated and used to create scales and scores appropriate for each instrument and will use established norms when appropriate for the RPG target populations. Administrative records will be examined to determine whether incidents of child maltreatment and child removal from the home have occurred and whether adults have received substance use treatment, the frequency of treatment, and resolution. These data will capture information at baseline and program exit for families who participate in services. For the partnerships, enrollment and services, and sustainability analyses, sources of quantitative data include a partner survey, sustainability survey, and the enrollment and services data. Data from each source will undergo a common set of steps involving cleaning data, constructing variables, and computing descriptive statistics. To facilitate analysis of each data source, we will create variables to address the study’s research questions. Constructing these analytic variables will depend on a variable’s purpose and the data source being used. Variables might combine several survey responses into a scale or a score, aggregate attendance data from a set period, or compare responses to identify a level of agreement.
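Combining several survey responses into a scale, as described above, might look like the following sketch. The item names, response values, and minimum-items rule are hypothetical illustrations, not drawn from the actual RPG instruments:

```python
def scale_score(response, items, min_valid=3):
    """Average a set of survey items into a scale score, requiring a
    minimum number of non-missing items before scoring (a common
    scoring convention; the actual rule would depend on the instrument)."""
    values = [response[item] for item in items if response.get(item) is not None]
    if len(values) < min_valid:
        return None  # too much missing data to produce a score
    return sum(values) / len(values)

# Hypothetical collaboration items on a 1-5 response scale; q4 is missing.
collab_items = ["q1", "q2", "q3", "q4"]
answers = {"q1": 4, "q2": 5, "q3": 3, "q4": None}
score = scale_score(answers, collab_items)  # (4 + 5 + 3) / 3 = 4.0
```

Analogous helpers would aggregate attendance data over a set period or compare responses to compute a level of agreement.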

Enrollment and services data, which grantees enter into RPG-EDS, will also be used for the enrollment and services analysis. The study will provide summary statistics for key program features:

        • Enrollment. For example, the average number of new cases each month

        • Services provided by grantees. For example, the services in which clients typically participate (including any common combinations of services); distribution of location of services (such as home, treatment facility, or other site); the average number of selected services (such as workshops) offered each month; and common topics covered during services

        • Participation. For example, the average length of time participants are served by the program, the average number of hours of services program participants receive, and the average duration between enrollment and start of services

We will analyze data from RPG-EDS for each grantee for the reports to Congress and annual reports identified in Table A.2. The reports to Congress will include topics such as enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses might describe how patterns changed over time, such as from the early to late implementation period.

Impacts analysis

The impacts analysis will complement other components of the evaluation by examining program effectiveness in the areas of child well-being, safety, and permanency; adult recovery; and family functioning. It will include the 16 grantees who have proposed rigorous local evaluations, either using random assignment or a strong matched comparison group. To be considered a strong matched comparison group, the local evaluation must include baseline data on key characteristics, such as family functioning and parental substance use, on which to establish equivalence with those enrolled in RPG programs. As noted above, all grantees will provide data on the program groups as part of the outcomes study. Those involved in the impact study will also collect data on comparison group members who are not enrolled in RPG projects at baseline and program exit.

The impacts analyses will be conducted for three groups of studies. First, we will pool the grantees’ projects that used well-implemented randomized controlled trials (RCTs) in their local evaluations. RCTs have excellent internal validity—ability to determine whether the program caused the outcomes—because the treatment and comparison groups are initially equivalent on all observed and unobserved characteristics, on average. Any observed differences in outcomes between the program and control group of families can therefore be attributed to the program with a known degree of precision. Second, we will pool grantees with RCTs with some issues (such as high attrition) and those with strong quasi-experimental designs (QEDs), in which program and comparison groups were matched on key factors, such as baseline history of substance use and family functioning. The internal validity of QEDs is weaker than that of RCTs, because differences on unobservable characteristics cannot be determined. However, a design with well-matched program participants and comparison group members provides useful information on program effects. Third, we will pool the studies in groups 1 and 2 that include well-implemented RCTs, RCTs with issues, and QEDs. Combining the QED results with RCTs will increase the statistical power of the overall analysis, enabling us to detect smaller effects. Because of the serious consequences of child maltreatment, even relatively small effect sizes might be clinically meaningful.

Grantees and their local evaluators will collect baseline data for use in the cross-site evaluation. First, baseline data will serve to describe the characteristics of RPG program participants. We will present tables of frequencies and means for key participant characteristics, including demographic and family information for the three groups of impacts analyses: the grantees with well-implemented RCTs, the combined group of RCTs with issues and QEDs, and the group that combines well-implemented RCTs, RCTs with issues, and QEDs.

A key use of baseline data is to test for baseline equivalence in both the RCT and the RCT-QED samples. Though random assignment ensures that families participating in the program and those in comparison groups do not initially differ in any systematic way, chance differences might exist between groups. Establishing baseline equivalence for the QEDs is critical for determining whether the comparison group serves as a reasonable counterfactual, representing what would have happened to the program group had it not received treatment. To confirm that there were no differences between the program and comparison groups at the study's onset, we will statistically compare key characteristics between the groups. In addition, because the standardized instruments will be administered twice (once at program entry and again at program exit), we will also compare baseline measures of outcomes at program entry between the two groups. In particular, to establish baseline equivalence, we will conduct t-tests for differences between the two groups, both overall and separately by grantee. In these comparisons, we will use the analytic sample, which includes respondents to both the baseline and follow-up instruments.
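As a minimal illustrative sketch of the kind of baseline-equivalence t-test described above (the scores and group sizes below are hypothetical, not drawn from the RPG evaluation):

```python
# Illustrative baseline-equivalence check using a Welch two-sample t statistic.
# All data are invented for illustration; they are not RPG evaluation data.
import statistics

def welch_t(a, b):
    """Welch's t statistic for the difference in means between two groups."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = (var_a / len(a) + var_b / len(b)) ** 0.5  # standard error of the difference
    return (mean_a - mean_b) / se

# Hypothetical baseline scores for the analytic sample in one site
program    = [52, 48, 50, 55, 47, 51, 49, 53, 50, 46]
comparison = [49, 51, 48, 54, 46, 50, 52, 47, 51, 49]

t = welch_t(program, comparison)
# A small |t| (for large samples, below about 1.96) suggests no statistically
# significant baseline difference at the 5 percent level.
print(round(t, 3))
```

In practice, such tests would be run for each key baseline characteristic, overall and by grantee, with appropriate degrees of freedom for the sample sizes involved.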

A key use of follow-up data is to estimate program impacts. We will use baseline data to improve the statistical precision of impact estimates and control for any remaining differences between groups. The average impact estimate will be the weighted average of each site-specific impact, where the weight of each site-specific impact is the inverse of the squared standard error of the impact. As such, sites with more precise impact estimates (for example, sites with larger sample sizes or baseline variables that are highly correlated with the outcomes) will receive greater weight in the average impact estimate. We will compare the results using the sites with well-implemented RCT evaluations with those obtained from the RCT with issues and QED sample, noting that the former is most rigorous, whereas the latter should be considered suggestive or promising evidence of effectiveness.
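In formula terms (notation ours, but consistent with the description above), the average impact is an inverse-variance weighted average of the site-specific impacts:

```latex
\hat{\Delta} \;=\; \frac{\sum_{s} w_s \, \hat{\Delta}_s}{\sum_{s} w_s},
\qquad
w_s \;=\; \frac{1}{\mathrm{SE}\!\left(\hat{\Delta}_s\right)^{2}},
```

where $\hat{\Delta}_s$ is the impact estimate for site $s$ and $\mathrm{SE}(\hat{\Delta}_s)$ is its standard error, so sites with smaller standard errors (more precise estimates) receive proportionally greater weight.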

Overall performance

To inform Congress on the performance and progress of the RPG sites, we will produce two reports that estimate and report on performance measures for the 18 sites. The reporting will include selected measures collected and calculated for the (1) partnerships analysis, including partnership goals and collaboration; (2) enrollment and services analysis, including information about program operations, enrollment, and participation; (3) sustainability analysis; (4) outcomes analysis, including detailed descriptions of the characteristics and outcomes associated with participating families; and (5) impacts analysis. To reduce the burden for the grantees and local evaluators, we have designed the performance measures to overlap completely with those of the other cross-site evaluation components, so no additional data are needed.

B. Time schedule and publications

This ICR is to continue to collect data for the cross-site evaluation for an additional three years. Once data collection is complete, reporting will continue through September 2025.

We will continue to produce three types of reports summarizing the progress and findings of the cross-site evaluation: annual reports, reports to Congress, and a final evaluation report (Table A.2). Each year, we will develop reports describing cross-site evaluation progress. Annual reports, starting in October 2022, will be designed for accessibility for a broad audience of policymakers and practitioners. For the overall performance component, we will produce two additional reports to Congress beginning in September 2022. A final evaluation report will provide a comprehensive synthesis of all aspects of the study over the entire contract, including integration and interpretation of both qualitative and quantitative data. Previous reports to Congress include the fifth Annual Report to Congress, the fourth Annual Report to Congress, and the third Annual Report to Congress. The sixth and seventh reports to Congress are currently undergoing the clearance process at HHS.

Table A.2. Schedule for the RPG cross-site evaluation

Activity                               Date

Data collection                        May 2022–September 2024

Reports to Congress                    Two reports (every other year) beginning September 2022

Annual reports                         Annually beginning October 2022

Ad-hoc reports or research briefs      As requested by Children's Bureau

Final evaluation report                September 2024



In addition to planned reports on the findings, RPG will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The cross-site evaluation team will produce up to two ad hoc reports or special topics briefs each year at the request of CB. Topics for these briefs will emerge as the evaluation progresses but could, for example, provide background on selected services offered by grantees, summarize program activities in tribal grantee sites, discuss impact or subgroup findings, or describe the grantees. Examples of additional reports previously created include “Offering Data Collection Incentives to Adults at Risk for Substance Use Disorder,” and “Collecting Sensitive Information and Encouraging Reluctant Respondents.”



A17. Reason(s) Display of OMB Expiration Date is Inappropriate

Approval not to display the expiration date for OMB approval is not requested.



A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this data collection.

1 Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners.

File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Claire Smither Wulsin
File Created: 2022-02-19
