CB Evaluation: Regional Partnership Grants (RPG) National Cross-Site Evaluation and Evaluation Technical Assistance [Implementation/descriptive study and Outcomes/Impact analyses]

OMB: 0970-0527


REGIONAL PARTNERSHIP GRANTS (RPG) NATIONAL CROSS-SITE EVALUATION AND EVALUATION TECHNICAL ASSISTANCE



OMB Information Collection Request

0970-0527





Supporting Statement Part B –

Statistical Methods

February 2022





























Submitted By:

Children’s Bureau

Administration for Children and Families

U.S. Department of Health and Human Services





B1. Respondent Universe and Sampling Methods

The overall objective of the Regional Partnership Grants (RPG) cross-site evaluation is to describe and document the performance of the RPG projects, the outcomes for participants enrolled in RPG, and the effectiveness of the grantees’ approaches, as stated in the legislation. To meet these evaluation goals, the RPG cross-site evaluation includes five study components: (1) a partnerships analysis, (2) an enrollment and services analysis, (3) a sustainability analysis, (4) an outcomes analysis, and (5) an impacts analysis. The partnerships analysis will assess the coordination and collaboration of partners’ service systems. The enrollment and services analysis will assess data on RPG participant characteristics and the types of, dosage of, and engagement with services. The sustainability analysis will assess projects’ use of data for continuous improvement and their activities to sustain RPG programs once their grants end. The outcomes analysis will describe the characteristics of and changes over time in children, adults, and families who participate in the RPG programs.

The impacts analysis includes a subset of 16 of the 18 RPG grantees that are implementing rigorous local evaluation designs and can provide outcomes data on both treatment and comparison group members.

Partnerships analysis

The partnerships analysis assesses the collaboration and coordination of services the RPG projects provided for families. The analysis examines which partners are involved in each project, the roles they play, and the extent of collaboration among partners, ranging from sharing a vision and goals to integrating assessment and treatment. In addition, the analysis explores the interagency collaboration and coordination of the child welfare and substance use treatment agencies, specifically examining topics such as competing priorities within each agency, conflicting timelines for recovery and permanency decisions, and conflicting or limited sharing of data between agencies. Advancing the collaboration and coordination of these two agencies is critical to the success of the RPG partnerships because they aim to serve the same families and support their well-being.

This component of the evaluation includes the following instruments and respondents:

  • Grantee and partner staff topic guide (Appendix B). During site visits to the 8 RPG6 grantees, semi-structured individual interviews will be conducted with the RPG project director, two managers or supervisors for the RPG project, and two frontline staff who are providing services to participants. Individual interviews will also take place with three grantee partners, who may represent the child welfare agency or a substance use treatment provider.

  • Partner survey (Appendix C). Lead staff of the grantee and partner organizations are asked to complete a web-based survey. Partner organizations are defined as organizations other than the grantee that provide RPG services to families enrolled in the program or coordinate their services with the grantee. We estimate that five people will meet this criterion for each of the 8 RPG6 grantees.

Enrollment and services analysis

The ongoing enrollment and services analysis describes who was served in the RPG projects and how. The analysis examines how grantees defined and refined their target populations over the course of their projects and why those changes occurred. It provides an expanded picture of all core services provided to families enrolled in RPG. Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners. The analysis also describes how engagement varies across participants and services, and how grantees and their partners collaborate to provide the services.

This component of the evaluation includes the following instruments and respondents:

  • Semiannual progress reports (Appendix E). Grantee project directors complete semiannual progress reports with updated information about their projects, including any changes from prior periods. CB has tailored the semiannual progress reports to collect information on grantees’ services, the target population for the RPG program, project operations, partnerships, and grantees’ perceived successes and challenges to implementation.

  • Enrollment and services data (Appendix F). These data describe participants’ characteristics at enrollment and the services they receive. Grantees record the enrollment date for each RPG family or household and demographic information on each family member including date of birth, ethnicity, race, primary language spoken at home, type of current residence (children only), income (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data will be collected. Grantees also record the enrollment date for families or individual family members into RPG services, service contact information for core services, and exit dates for RPG.
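As a purely illustrative sketch, the enrollment record described above could be represented as a pair of simple data structures. The field names and types here are hypothetical and do not reflect the actual RPG-EDS schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical sketch only; field names and types are illustrative,
# not the actual RPG-EDS schema.
@dataclass
class FamilyMemberRecord:
    date_of_birth: date
    ethnicity: str
    race: str
    primary_language: str                   # primary language spoken at home
    relationship_to_focal_child: str
    residence_type: Optional[str] = None    # children only
    income: Optional[float] = None          # adults only
    education_level: Optional[str] = None   # adults only

@dataclass
class EnrollmentRecord:
    family_id: str
    enrollment_date: date                   # RPG enrollment date for the family
    members: List[FamilyMemberRecord] = field(default_factory=list)
    exit_date: Optional[date] = None        # recorded when the family exits RPG
```

The optional fields mirror the text's distinction between child-only elements (residence type) and adult-only elements (income, education), with service contact records kept separately on a rolling basis.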

Sustainability analysis

The sustainability analysis will assess projects’ use of data for continuous improvement and their activities to sustain RPG programs once their grants end. The analysis will examine information about supports within the partnership that can help improve and sustain RPG services, such as continuous use of data for service improvement, identification of a lead organization, and policies needed after grant funding ends. It will also provide information about funding sources and resources needed after the end of the grants.

This component of the evaluation includes the following instrument and respondents:

  • Sustainability survey (Appendix D). Lead staff of grantee and partner organizations with knowledge about sustainability planning will be asked to complete a web-based survey on this topic. We expect to administer the web-based survey once to 126 grantee key staff and partners (seven per site across 18 sites).

Outcomes analysis

The ongoing outcomes analysis includes all 18 grantees. This analysis describes the characteristics of participating families and their outcomes in the five domains: (1) child well-being, (2) family functioning and stability, (3) adult recovery, (4) child permanency, and (5) child safety.

This component of the evaluation includes five measures and administrative data elements associated with the cross-site evaluation’s outcomes analysis (Appendices G and H). Grantees collect outcomes data on child well-being, functioning and stability, safety, and permanency for one focal child in each participating family, reported by the child’s primary caregiver. However, if the child is in an out-of-home placement, then grantees collect child well-being data from the current caregiver. Each grantee selects a focal child at enrollment based on its target population and planned services (for example, some grantees plan to serve families with infants or toddlers, and others plan to serve adolescents or teens). Grantees administer the instruments collecting data on adult substance use disorder (SUD) recovery to the same adult, unless he or she is not the adult with an SUD or in recovery from an SUD. In those cases, grantees collect recovery data from a separate adult receiving RPG services who has or had an SUD.

Impacts analysis

The impacts analysis includes the 16 grantees that are implementing rigorous local evaluation designs with comparison groups and can provide outcomes data on both treatment and comparison group members. This component of the evaluation collects the same elements as the outcomes analysis instruments, but from participants in the comparison groups.



B2. Procedures for the Collection of Information

Partnerships analysis

Descriptions of the data collection procedures for the two instruments associated with the partnerships analysis follow:

  • Grantee and partner staff topic guide. Two members of the RPG cross-site evaluation team will conduct one site visit to 8 grantees (in Year 4 of the grant program for RPG6). While on-site, they will conduct in-person, individual interviews with grantee, partner, and frontline staff. Evaluators will obtain verbal consent from each interviewee, including permission to audio record the interviews for later transcription. One team member will moderate the interview. If interviewees do not consent to audio recording, the second team member will use a laptop computer to take detailed notes.

  • Partner survey. This survey is web-based. The cross-site evaluation team will obtain from the grantees contact information for desired respondents and notify respondents in advance about the survey with an email. Personalized links to the survey (along with both an email address and telephone number where respondents can ask any questions about the survey) will then be distributed to each respondent via email. By clicking on the link or pasting it into their browser, respondents will go directly to the 25-minute survey. If they are unable to complete the survey in one sitting, they can return to it as needed.

Information for the partnerships analysis will be descriptive. In general, it will not involve formal hypothesis testing.

Enrollment and services analysis

Descriptions of the data collection procedures for the two instruments associated with the enrollment and services analysis follow:

  • Semiannual progress reports. CB provides a template for the semiannual progress reports, which grantees submit every six months. Grantees can enter narrative information directly into the template, or they can respond to the questions in other electronic file formats of their choosing. Grantee project directors will submit their reports to www.GrantSolutions.com.

  • Enrollment and services data. Intake workers enter demographic characteristics and RPG program enrollment and exit dates for each RPG case into the case management system known as the RPG-Evaluation Data System (RPG-EDS). Staff delivering services enter individual service contact information on a rolling basis for the duration of participation in RPG services.

Information for the enrollment and services analysis will be descriptive. In general, it does not involve formal hypothesis testing.

Outcomes analysis

  • Outcomes analysis data. Each grantee is expected to maintain outcomes data from the case-specific standardized instruments and administrative records for all RPG participants in its project or agency database(s). Grantees upload these data to the RPG-EDS every six months, using file formats specified or provided by the cross-site evaluation. To maximize data quality, automatic data validation checks occur during the upload, and error messages will indicate any corrections needed before the submission can be accepted.

Information for the outcomes analysis will be descriptive. In general, it does not involve formal hypothesis testing.
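The upload-time validation pattern described above (run every check, report all error messages, and accept the submission only when none remain) can be sketched as follows. This is an illustrative sketch under assumed rules; the required columns and checks are hypothetical, not the actual RPG-EDS logic:

```python
# Illustrative upload-validation sketch: a submission is accepted only when
# every row passes all checks; otherwise all error messages are returned so
# the grantee can correct the file and resubmit. Not the real RPG-EDS rules.
REQUIRED_COLUMNS = {"case_id", "enrollment_date"}  # hypothetical column names

def validate_submission(rows):
    """Return (accepted, errors) for a list of row dicts from an uploaded file."""
    errors = []
    for i, row in enumerate(rows, start=1):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            errors.append(f"row {i}: missing columns {sorted(missing)}")
        if not row.get("case_id"):
            errors.append(f"row {i}: empty case_id")
    return (len(errors) == 0), errors
```

Collecting every error before rejecting the file, rather than stopping at the first one, lets a grantee fix all problems in a single correction pass.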

Sustainability analysis

In this request, we are seeking clearance for one new instrument associated with the sustainability analysis.

  • Sustainability survey. This survey is web-based. The cross-site evaluation team will obtain from the grantees contact information for desired respondents and notify respondents in advance about the survey with an email. Personalized links to the survey (along with both an email address and telephone number where respondents can ask any questions about the survey) will then be distributed to each respondent via email. By clicking on the link or pasting it into their browser, respondents will go directly to the 20-minute survey. If they are unable to complete the survey in one sitting, they can return to it as needed.

Impacts analysis

  • Impacts analysis data. Each of the 16 grantees participating in the impacts study is expected to maintain the case-specific outcomes data for comparison group members from standardized instruments and administrative records in its project or agency database(s). Grantees upload these data to RPG-EDS every six months using file formats specified or provided by the cross-site evaluation. To maximize data quality, automatic data validation checks occur during the upload, and error messages will indicate any corrections needed before the submission can be accepted.



B3. Methods to Maximize Response Rates and Deal with Nonresponse

Based on our experience with the second, third, and fourth RPG cohorts, the cross-site evaluation team expects to obtain a high response rate of 80 percent or more for surveys and close to total participation in other data collection activities, such as site visits and enrollment, services, and outcomes data submissions. A grantee liaison is assigned to each site and serves as a link to work with grantees, if needed, to address nonresponse. Descriptions of the strategies for maximizing response in the data collection efforts follow.

Partnerships data

  • Conduct interviews with key grantee staff, supervisors/managers, partners, and frontline staff during site visits. All interviews with key grantee staff, supervisors/managers, partners, and staff will occur during site visits. We anticipate that all grantees will agree to participate in these visits. Our experience with previous RPG cohorts indicates that participation rates of the desired interviewees are typically close to 100 percent. To help ensure high participation, we coordinate with the grantees, supervisors/managers, partners, and staff to determine convenient dates and schedules for these visits.

  • Design partner survey in a manner that minimizes respondent burden. To minimize burden on respondents, the surveys are brief, web-based, and structured such that respondents do not have to pay attention to routing and skip logic or view questions that do not apply to them.

  • Send advance and reminder emails to respondents (Appendix I and J). We send advance emails and an FAQ document to grantee and partner staff requesting their participation. If respondents have not completed the survey within a certain amount of time, we send reminder emails requesting them to complete the surveys.

  • Solicit the help of grantees to encourage completion of the partner surveys. If response rates for individual grantees lag, the cross-site evaluation team works with lead grantee staff to identify additional strategies for increasing completed surveys without compromising respondent confidentiality. For instance, lead grantee staff may be asked to send an email to all the survey participants they had identified in their site, encouraging everyone’s response. In past rounds, this approach helped boost response rates, because lead grantee staff had personal relationships with their partners and used their proximity to encourage responses. This approach of combining follow-up requests from the evaluator to people who have not completed the survey with general requests from the grantee to all desired respondents has proved effective with previous RPG cohorts.

  • Conduct telephone follow-up with nonrespondents on the partner survey. If email reminders and requests from the grantee prove ineffective, the cross-site evaluation team deploys survey staff with expertise in obtaining responses to conduct one round of telephone follow-up with nonrespondents. This approach of following up via telephone when email requests have not been effective has increased response rates with previous RPG cohorts.

Enrollment and services data

  • Provide an easy-to-use data entry system for enrollment and services data. The design of the enrollment and service data application of RPG-EDS is based on web-based case management systems that Mathematica has developed and successfully implemented for multiple projects that collect these data from similar types of providers. RPG-EDS can be accessed from any computer, allowing for ease of entry, while the data are encrypted in transit and at rest and reside behind firewalls, thereby maintaining data security.

  • Use multiple sources to check enrollment activity and completion of the enrollment and services data entry. Information on the number of people enrolled in the RPG program every six months will be obtained in the semiannual progress reports. If the number does not match the number of new entries to the enrollment and services data, the cross-site evaluation team contacts the grantee to reconcile the numbers and request they add any missing enrollees to RPG-EDS.

  • Conduct regular data completion and quality checks. The cross-site evaluation contractor examines each grantee’s enrollment and services data at regular intervals to identify any potential problems. If problems are identified, contractor staff notifies the grantee and works with the grantee and providers as needed to obtain missing data or remedy other potential problems quickly.

Sustainability data

  • Design sustainability survey in a manner that minimizes respondent burden. To minimize burden on respondents, the survey is brief, web-based, and structured such that respondents do not have to pay attention to routing and skip logic or view questions that do not apply to them.

  • Send advance and reminder emails to respondents (Appendix K and L). We send advance emails and an FAQ document to grantee and partner sample members requesting their participation. If respondents have not completed the survey within a certain amount of time, we send reminder emails requesting them to complete the surveys.

  • Solicit the help of grantees to encourage completion of the sustainability surveys. If response rates for individual grantees lag, the cross-site evaluation team works with lead grantee staff to identify additional strategies for increasing completed surveys without compromising respondent confidentiality. For instance, lead grantee staff may be asked to send an email to all the survey participants they had identified in their site, encouraging everyone’s response. In past rounds of staff and partner surveys, this approach helped boost response rates, because lead grantee staff had personal relationships with their partners and used their proximity to encourage responses. This approach of combining follow-up requests from the evaluator to people who have not completed the survey with general requests from the grantee to all desired respondents has proven effective with previous RPG cohorts.

  • Conduct telephone follow-up with nonrespondents on the sustainability survey. If email reminders and requests from the grantee prove ineffective, the cross-site evaluation team deploys survey staff with expertise in obtaining responses to conduct one round of telephone follow-up with nonrespondents. This approach of following up via telephone when email requests have not been effective has increased response rates with previous RPG cohorts.

Outcomes and impacts data

  • Design the outcomes and impacts instruments in a manner that reduces burden. The outcomes data that grantees must report comprise standardized instruments that often ask for similar information, such as demographic information about the respondent. To avoid such duplication, the outcomes instruments exclude redundant items. This reduces burden on grantee staff responsible for uploading these data to RPG-EDS.

  • Develop a user-friendly, flexible upload process that has already proven successful. RPG-EDS, to which grantees upload data, provides easy access while maintaining the security of outcomes data. The system, designed with access by grantee staff in mind, is based on successful experiences in prior studies collecting similar types of data from similar types of service providers. The component of the system for managing outcomes data, to which grantees upload data from the outcomes instruments, is modeled on the data reporting system that was used in RPG projects from 2012 through 2017. Compared with the former RPG systems, RPG-EDS includes updated features and improved technology to simplify the upload process.

  • Provide training and technical assistance to grantee staff. We provide to grantees documentation on, training in, and technical assistance in collecting data from participants, uploading outcomes data to RPG-EDS, and using the web-based enrollment and services data entry application in RPG-EDS.

  • Include data quality checks in the data system. RPG-EDS also improves data reliability with automatic data quality checks. For example, if grantee staff enter out-of-range values in a particular field, the system prompts users to check the value. For some fields, response values are restricted; for others, grantee site staff can override the check. We also monitor the data entered by grantee sites and provide feedback to grantees on their data quality.

  • Optimize the frequency of data collection. Grantees upload outcomes data once every six months, rather than waiting until their evaluation data collection is complete. This enables the cross-site evaluation team to regularly identify and troubleshoot problems grantees experience in collecting data from respondents or uploading data.
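The field-level quality checks described above (out-of-range values prompt the user; some fields are strictly restricted, while others allow grantee staff to override the warning) can be sketched minimally as follows. The field names, ranges, and override rules here are hypothetical, not the actual RPG-EDS checks:

```python
# Hypothetical field-level validation sketch; the fields, ranges, and
# override rules are illustrative, not the actual RPG-EDS checks.
RULES = {
    # field: (minimum, maximum, overridable)
    "age_years": (0, 120, False),        # restricted: cannot be overridden
    "service_contacts": (0, 50, True),   # warning only: staff may override
}

def check_value(field_name, value, override=False):
    """Return (accepted, message) for a single entered value."""
    lo, hi, overridable = RULES[field_name]
    if lo <= value <= hi:
        return True, "ok"
    if overridable and override:
        return True, f"{field_name}={value} outside [{lo}, {hi}]; accepted with override"
    return False, f"{field_name}={value} outside [{lo}, {hi}]; please check the value"
```

Splitting rules into restricted versus overridable fields matches the distinction in the text: some entries are hard-blocked, while others merely prompt the user to confirm an unusual value.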



B4. Test of Procedures or Methods to be Undertaken

Most of the instruments used in the RPG cross-site evaluation build on existing measures and experience from other studies completed by the cross-site evaluation team.

Grantee and partner staff topic guide. The interview topic guide for RPG grantee and partner staff has been modeled after interview guides used in similar studies, such as the Early Head Start Enhanced Home Visiting Pilot Evaluation, the Evidence-Based Home Visiting (EBHV) cross-site evaluation, and previous RPG cross-site evaluations. All site visitors receive training on the topic guides. After the first site visit, the cross-site evaluation team meets to discuss the instruments and whether they require any modifications to enhance data quality and reduce burden, such as eliminating or refining any questions that were unnecessary or redundant.

Partner surveys. The partner survey is identical to the one developed and pretested for RPG2, RPG3, and RPG4. A pretest of the instrument occurred under RPG2 and was covered under OMB #0970-0444.

Before deploying the partner survey on the web, Mathematica staff rigorously compared the web surveys to the hard-copy instrument for accuracy. Web testers checked question wording and response option formats, and used different scenarios to test the skip patterns.

Sustainability survey. The sustainability survey includes adapted items from the partner and staff surveys that were used in the RPG2 and RPG3 projects. The staff survey had adapted items used previously in other Mathematica projects including the Evidence-Based Home Visiting (EBHV) cross-site evaluation, the Child Support Noncustodial Parent Employment Demonstration (CSPED), and Parents and Children Together (PACT). It also included several standardized scales. The partner survey was modeled after the ones used on the EBHV cross-site evaluation, the Integration Initiative cross-site survey, and the survey instrument (Collaborative Capacity Inventory) used in RPG1. In addition to the previous staff and partner surveys, the sustainability survey also adapted items from the RPG-EDS service logs and semi-annual progress reports (SAPRs).

In the summer of 2019, Mathematica conducted a pretest of the survey to assess the clarity and completeness of content and to estimate respondent burden. To conduct the pretest, Mathematica reached out to project directors and evaluators from RPG3 grantees. Five participants completed and returned (via email or fax) a hard-copy version of the questionnaire and provided feedback during a debriefing interview.

On average, it took respondents 19.4 minutes to complete the survey. Participants agreed the survey was comprehensive and generally easy to complete, and recommended a few changes to facilitate their understanding of the questions and survey goals. Based on pretest participant feedback, Mathematica made some minor changes, summarized below. Burden for the final version of the survey is estimated to be 20 minutes.

  • Participants asked for clarification of what is meant by “data” in relevant questions: for example, whether data includes qualitative data collected through informal conversations with participants and staff, data from outside sources (such as administrative data), or data they had collected for their local evaluation. We revised relevant questions to ask about “information” instead of just “data”.

  • We included more examples in relevant questions, including examples related to activities to improve services and examples of how and when data might be shared.

  • We added “Not applicable” to some questions, since some services might not be offered by some grantees.

  • We clarified some terms used in the survey, such as “in kind” resources.

  • We modified some questions to ask about whether staff turnover or leadership changes, as well as the local policy context, affect sustainability planning.

  • We made some formatting changes to how questions were presented, such as reformatting questions with complex grids and adding categories with ranges.

RPG-EDS. The component of RPG-EDS for managing outcomes data is modeled after the data systems used with previous cohorts of RPG projects. The development team has rigorously tested and evaluated the functionality of RPG-EDS.



B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

We received preliminary input on statistical methods from Mathematica staff, including:



Dr. Sarah Avellar

Mathematica Policy Research

1100 1st Street, NE, 12th Floor

Washington, DC 20002


Dr. Angela D’Angelo

Mathematica Policy Research

111 East Wacker Drive, Suite 3000

Chicago, IL 60601

Dr. Yange Xue

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543
