Part B Justification for Pathway Home Grant Program Evaluation, OMB No. 1290-0NEW

Part B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

Summary: The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) is conducting an evaluation of Round 2 grants under the Pathway Home program. The evaluation is designed to assess interventions to help individuals with justice-system involvement find meaningful employment and avoid recidivism. The Evaluation of the Pathway Home Grant Program (Pathway Home Evaluation) includes both implementation and impact studies. This information collection request seeks Office of Management and Budget (OMB) clearance for two new information collections for the evaluation, in addition to the six approved under OMB No. 1290-0039: the follow-up survey of impact study participants and the protocol for structured discussions with grantee staff. These data collection instruments will be submitted to the Health Media Lab Institutional Review Board for approval.

B.1. Respondent universe and sampling methods

For the follow-up survey of study participants, the universe of potential respondents includes all 2,500 participants enrolled in the impact study at the six impact sites (everyone who enrolled and completed a baseline survey, including comparison group members). The contractor will attempt the follow-up survey with all 2,500 participants. As described under OMB No. 1290-0039, the impact evaluation will use a quasi-experimental design to estimate the impact of participation relative to a group of individuals who were similar to the program group but were not eligible for services.

For the structured discussions with grantee staff, the contractor will hold both group calls and one-on-one or two-on-one calls. The universe of potential respondents for the group discussions includes key staff from the 16 sites that did not participate in a site visit. For the one-on-one or two-on-one calls, the universe of potential respondents includes key staff from the six impact sites. We will purposively select an average of two key staff from each of the 22 grantees (6 impact sites and 16 non-impact sites), for a total of 44 staff.

See Table B.1 for estimates of universe and sample sizes for each data collection instrument.

Study participant follow-up survey. As part of the impact study, the evaluation team will field a web and phone survey to approximately 2,500 study participants 15 months after their enrollment into the study, including participants in both the program and comparison groups. This survey will collect information on skill and credential attainment, employment and economic well-being, criminal justice involvement, and health and stability after reentering their communities. We estimate each respondent will spend about 25 minutes on the survey. We expect 64 percent of study participants, or about 1,600, to complete the survey.

Structured discussions with grantee staff. As part of the implementation study, the evaluation team will conduct virtual structured discussions with approximately 32 grantee staff from the 16 grantees that did not participate in a site visit. Discussions will focus on understanding the sustainability of the Pathway Home grantee programs. The group discussions will be conducted over Zoom or Webex and are expected to take about 90 minutes, with five or six grantees per group discussion. The evaluation team will also conduct six one-on-one or two-on-one calls with the six impact sites, which are also expected to take about 90 minutes. These calls will cover the same questions about sustainability, as well as specific issues that emerged during the site visits. We expect 90 percent of the 44 invited grantee staff to participate, or approximately 40 grantee staff.

Table B.1. Sampling and response rate assumptions, by data collection instrument and respondent type

Respondent type | Sampling method | Universe of potential respondents | Estimated selected sample | Estimated responses per respondent | Estimated response rate | Estimated responses

Follow-up survey for program participants
Study participants^a | Census | 2,500 | 2,500 | 1 | 64% | 1,600

Structured discussions with grantee staff
Grantee staff^b | Census | 110 | 44 | 1 | 90% | 40

a The follow-up survey will be attempted with the census of the 2,500 impact study participants enrolled at baseline. Study design considerations, as well as resources, will influence the number of sites included in the impact study. Based on previous experience administering follow-up surveys with similar populations, such as the Evaluation of the SNAP Employment & Training Pilot (OMB #0584-0604), the evaluation team anticipates that 64 percent of the study participants will complete a follow-up survey.

b We estimated the universe of potential respondents by assuming there are, on average, 5 staff per grantee with the knowledge needed to participate in the discussions. We anticipate inviting an average of 2 grantee staff per site. We expect a 90 percent response rate, meaning approximately 40 staff will participate in the structured discussions. This is a conservative estimate based on similar data collection efforts.

B.2. Procedures for collection of information

Follow-up survey of study participants. The follow-up survey will be conducted with the same participants who completed the baseline survey. It will be administered in one of two ways: as a self-administered web survey or by phone with a contractor interviewer who will enter the responses into the web-based survey platform. We will first invite participants to complete the survey by web, sending a survey invitation to the email address they provided during the baseline survey, followed by weekly reminder emails. If participants have not completed the survey after those reminders, experienced interviewers will call them and attempt to complete the survey over the phone. Finally, if we are not able to reach a participant by phone, we will attempt to reach the participant in person: field staff will try to locate the participant and, once located, will have the participant use a cell phone to dial into the call center to complete the survey by telephone. We estimate that approximately 20 percent of participants will complete the self-administered web survey and that the remainder will complete it over the phone. To achieve a response rate that will enable us to obtain reliable impact estimates, respondents who complete the study participant follow-up survey will receive a $25 gift card as a 'thank you' for their time.

Structured discussions with grantee staff. As part of the implementation evaluation, the evaluation team will hold virtual structured discussions with grantees near the end of their grant period, in fall 2023. We will invite up to two staff members from each of the grantees in the impact study for discussions focused on changes since the site visits were conducted as well as plans for sustainability of program services. Similar discussions focused on sustainability will be held with staff from grantees not involved in the impact study (up to two staff from each grantee will be invited) and will involve five or six grantees in the same discussion. The evaluation team will use a semi-structured interview protocol to gather information from grantees. The discussions will be video-optional calls taking place over Zoom or Webex and will be recorded.

1. Statistical methodology, estimation, and degree of accuracy

Follow-up survey of study participants. Data collected using the follow-up survey of study participants will be used to describe the outcomes of study participants and to estimate the impact of program participation. The follow-up survey data will be used to describe Pathway Home participant outcomes that are not available in administrative data sources. These outcomes will be reported as simple means or percentages, weighted to account for survey non-response. Additionally, we will estimate the impact of participation by comparing outcomes for the group of Pathway Home participants in our sample to a comparison group of individuals who were not eligible for Pathway Home but were otherwise similar to program participants. To compare these groups, we will run a regression of outcomes on program participation as well as a set of covariates describing individual and program characteristics. It is possible that individuals who were not eligible for the Pathway Home program at study enrollment became eligible over the course of the study. If we find that a sizable share of the comparison group did enroll in the Pathway Home program, we will use two-stage least squares to account for imperfect compliance with treatment group status. In this case, our impact estimates will be interpretable as a local average treatment effect.
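For concreteness, a minimal sketch of the estimating equations described above, using our own illustrative notation rather than notation from the approved design documents (Y_i is an outcome, D_i indicates Pathway Home participation, X_i is a vector of baseline covariates, and Z_i is eligibility status at study enrollment, used as the instrument):

```latex
\begin{align}
  % Primary specification: regression-adjusted comparison of the
  % program and comparison groups; \beta is the impact estimate.
  Y_i &= \alpha + \beta D_i + X_i'\gamma + \varepsilon_i \\
  % Two-stage least squares under imperfect compliance: instrument
  % realized participation D_i with eligibility at enrollment Z_i.
  D_i &= \pi_0 + \pi_1 Z_i + X_i'\delta + \nu_i
        && \text{(first stage)} \\
  Y_i &= \alpha + \beta_{\mathrm{LATE}} \hat{D}_i + X_i'\gamma + u_i
        && \text{(second stage)}
\end{align}
```

Here \beta_{LATE} would be interpretable as the local average treatment effect for those whose participation is shifted by eligibility.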

All analyses of the follow-up survey data will be adjusted for survey non-response. To assess whether there was systematic survey non-response, we will compare the observable characteristics of responders and non-responders using data from the baseline survey. If there is evidence of differences between responders and non-responders, we will develop survey weights to account for non-response. Each weight will be the inverse of the estimated probability that an individual responded to the follow-up survey, modeled using baseline characteristics collected in the baseline survey approved under clearance request 1290-0039.
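As an illustration of that weighting step, the sketch below fits a response-propensity model and forms inverse-probability weights. This is a minimal sketch assuming the baseline data sit in a pandas DataFrame; the column names and the choice of a logistic model in scikit-learn are illustrative assumptions, not project specifications.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical baseline covariates; the actual characteristics come from
# the baseline survey approved under OMB No. 1290-0039.
COVARIATES = ["age", "education_level", "employed_at_baseline", "months_incarcerated"]

def nonresponse_weights(df: pd.DataFrame) -> pd.Series:
    """Return inverse-probability weights for follow-up respondents.

    df must contain the COVARIATES columns plus `responded`
    (1 = completed the follow-up survey, 0 = did not).
    """
    # Model Pr(respond | baseline characteristics).
    model = LogisticRegression(max_iter=1000)
    model.fit(df[COVARIATES], df["responded"])
    p_respond = model.predict_proba(df[COVARIATES])[:, 1]

    # Weight = 1 / estimated response probability; weights apply only to
    # respondents, so non-respondents are set to missing.
    weights = pd.Series(1.0 / p_respond, index=df.index)
    return weights.where(df["responded"] == 1)
```

In practice, such weights would typically be trimmed or normalized before entering the impact regressions.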

Structured discussions with grantee staff. The main type of data collected from the discussions with grantees will be qualitative information about staff's experiences planning to sustain their programs after the grant period ends. Thus, no statistical methodology (such as sample stratification) or estimation will be necessary to analyze the data. We will not make any declarative statements about the efficacy of strategies or practices implemented by programs. We will qualitatively describe these programs to inform DOL and the broader field about pre- and post-release employment-focused programs.

We will use NVivo, a qualitative data analysis software package, to analyze the qualitative data collected through structured discussions. To extract data on key themes and topics, the evaluation team will develop a coding scheme, which will be organized according to key research questions and topics and guided by the conceptual framework, as well as more general constructs from the Consolidated Framework for Implementation Research and Community Coalition Action Theory framework on factors that affect implementation.1,2 To ensure reliability across the research team, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. We will also analyze data across grantees for each theme and determine trends in the data that suggest differences between types of grantees, program models, facility types, or other important aspects of the program. Because the implementation study is examining grant implementation, study findings will apply only to the Pathway Home program grantees and will not be more broadly generalizable.
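One way to make the coder-calibration step concrete is an agreement statistic such as Cohen's kappa computed on the initial double-coded documents. The protocol above describes comparing codes and resolving discrepancies but does not name a statistic, so the sketch below, including the parallel-list export from NVivo that it assumes, is purely illustrative.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical: the code each coder applied to the same sequence of
# transcript excerpts during the calibration round, exported from NVivo.
coder_a = ["sustainability", "partnerships", "staffing", "funding", "staffing"]
coder_b = ["sustainability", "partnerships", "funding", "funding", "staffing"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
# Low agreement would prompt the team to discuss discrepancies and refine
# code definitions before coding the full corpus.
```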

2. Unusual problems requiring specialized sampling procedures

Specialized sampling procedures are not required for administering the follow-up survey or structured discussions with grantee staff. Both data collection activities will include all potential sample members.

3. Periodic data collection cycles to reduce burden

The data collection instruments for the impact and implementation studies will be administered only once to any individual respondent. To further minimize burden on respondents in the structured discussions, the evaluation team will review pertinent data available from Pathway Home grantee applications, grantee staffing and implementation plans, and any other reporting information whenever possible, so the discussions can focus on topics for which information is not otherwise available.

B.3. Methods to maximize response rates and minimize non-response

Follow-up survey of study participants. The study team’s prior experience collecting follow-up survey data from similar populations will guide the approach to data collection for the Pathway Home Evaluation. The plan incorporates strategies that have proven effective on projects involving people with incarceration histories. It includes the following approaches to retaining sample members and ensuring high response rates on the follow-up survey:

  • Conduct proactive outreach to local incarceration facilities. Some sample members might be reincarcerated at the time of the survey. Early contact with these facilities will help the study team establish relationships, learn how to gain approval to interview inmates, and complete any required paperwork before data collection begins.

  • Use a multimode approach. The study will begin on the web and then move to the more labor-intensive and costly mode of collection: telephone interviewing.

  • Web administration. Respondents will have the option to complete the survey online, which allows them to complete it on their own schedule and at their own pace, as well as over multiple sessions. The web survey will be accessible on any device (tablet, smartphone, laptop, or desktop computer) and will be compatible with assistive technologies respondents might need to participate online.

  • Technology to reduce burden. The survey will employ drop-down response categories so respondents can quickly select from a list; dynamic questions and automated skip patterns so respondents see only the questions that apply to them (including those based on answers provided previously in the survey); and logical response rules so respondents' answers are restricted to those intended by the question. These features should minimize data entry burden among participants and facilitate high-quality responses.

  • Conduct interim and field locating to find hard-to-reach respondents. Collecting detailed contact information at the time of program enrollment will facilitate interim locating, such as database searches and outreach to the respondents’ contacts. Field locators will attempt to locate hard-to-reach respondents by going to the addresses provided by the respondent in the baseline survey and addresses found by our in-house locating team. Once located, field locators will have respondents use a cell phone to dial into the call center to complete the survey by phone.

  • Use tested questionnaires. The collection of survey data has been tailored to the specific circumstances of this evaluation, yet it is based closely on prior surveys. These include the America’s Promise Job Driven Grant Program Evaluation (OMB 201801-1290-001) and the Reentry Employment Opportunities (REO) Evaluations (OMB 1290-0NEW). These prior instruments were extensively tested using cognitive interviews or debrief sessions under each of these evaluations with populations that are similar to this study’s. These populations include active participants in employment and training services and individuals returning from incarceration.

  • Interim mailings. To maintain contact with sample members, mailings such as letters and postcards may be sent before and throughout data collection. Around six months before the start of data collection, a letter will be sent with an enclosed pre-paid monetary incentive (a $2 bill). A growing body of research suggests that pre-paying sample members with as little as $2 remains an effective way to increase response rates. Beebe et al. (2004) found that a $2 pre-paid incentive increased response rates by close to 10 percent on a mail survey of over 9,000 Medicaid beneficiaries. The study included an oversample of racial and ethnic minorities and found that the incentive had a comparable effect across racial and ethnic strata.3 Other mailings before the start of data collection may request updated contact information. During data collection, we might include small tokens in some mailings to capture respondents' attention and encourage responsiveness. We have successfully used similar approaches to increase response rates in other studies with hard-to-reach populations, such as the Evaluation of YouthBuild (Mory, Stein, Goble, and Hurwitz 2017).4

Using the above approaches, the study team will strive to achieve a 64 percent response rate on the follow-up survey.

Structured discussions with grantee staff. As their grant agreements indicate, Pathway Home grantees included in the evaluation are expected to participate in the data collection activities as a condition of their grant awards. The evaluation team expects to achieve a response rate of 90 percent for structured discussions with grantee staff. To ensure full cooperation, the evaluation team will be flexible in scheduling these conversations to accommodate the particular needs of respondents. By holding the discussions virtually, respondents will not need to travel or incur additional burden to participate.

Methods to ensure data reliability.

Follow-up survey. We will use several well-proven strategies to ensure the reliability of data collected from the follow-up survey. The evaluation team will review the survey extensively, and it will be thoroughly tested in a pretest involving individuals from nonparticipating sites. To ensure we capture complete and accurate data, the web-based platform will flag missing data and data outside a valid range. Furthermore, to ensure we collect the most critical pieces of information from all respondents, the evaluation team will program the web-based system to not allow missing answers for critical items. In the analysis of survey data, responses will be weighted to account for survey non-response. In addition, staff will be trained on project security, including safeguards to protect personally identifiable information while collecting and storing information on sample members.

Structured discussions with grantee staff. We will use several well-proven strategies to ensure the reliability of data collected from the structured discussions. First, qualitative data collectors, all of whom already have extensive experience with this data collection method, will be thoroughly trained in aspects particular to this study, including how to probe for additional details to help interpret responses to questions. Second, this training and the use of the protocol will ensure that data collection is standardized across groups. Third, structured discussions will be recorded to ensure documentation of responses is accurate. Finally, all respondents will be assured that their responses will remain anonymous; reports will never identify respondents by name, and any quotes will be devoid of identifying information, including site name.

B.4. Tests of procedures or methods to be undertaken

Follow-up survey of study participants. We plan to pretest the follow-up survey with up to 9 individuals, some of whom have received services through a previous Pathway Home grant and some of whom have not. This mix will allow us to simulate the program and comparison groups.

Structured discussions with grantee staff. After the first discussion is completed, the study team will conduct a debrief to discuss if any adjustments need to be made to the discussion protocol or procedures used. The procedures and protocol are also informed by previous structured discussions with subsets of grantees.

B.5. Individuals consulted on statistical aspects of design and on collecting and/or analyzing data

The evaluation team has convened a technical working group (TWG) to provide substantive feedback throughout the project period, particularly on the impact evaluation design. The TWG members have expertise in research methodology as well as in programs and populations similar to those served by the Pathway Home grant programs. The evaluation team has also convened an expert panel of people with lived experience in the justice system to ensure the research design, instruments, and findings are grounded in those direct experiences. Table B.2 lists the people who will oversee data collection, analysis, and the design for the Pathway Home Evaluation.



Table B.2. People who will oversee data collection, analysis, and the design for the Pathway Home Evaluation

Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
(609) 799-3535

  • Ms. Samina Sattar, Project director, (609) 945-3358
  • Dr. Jillian Berk, Principal investigator, (202) 264-3449
  • Dr. Jillian Stein, Deputy project director, (609) 716-4395
  • Ms. Jeanne Bellotti, Director, Employment Research, (609) 275-2243
  • Ms. Betsy Santos, Survey director, (609) 750-2018
  • Dr. Ariella Spitzer, Impact study lead, (617) 872-8798

Social Policy Research Associates
1330 Broadway, Suite 1426
Oakland, CA 94612
(510) 763-1499

  • Dr. Andrew Wiegand, President, CEO, and senior advisor, (510) 763-1499, ext. 636
  • Mr. Christian Geckeler, Senior associate, (510) 788-2461



1 Keith, Rosalind E., Jesse C. Crosson, Ann S. O’Malley, DeAnn Cromp, and Erin Fries Taylor. “Using the Consolidated Framework for Implementation Research (CFIR) to Produce Actionable Findings: A Rapid-Cycle Evaluation Approach To Improving Implementation.” Implementation Science, vol. 12, no. 1, 2017, pp. 1–12.

2 Butterfoss, F. D., and M. C. Kegler. “Toward a Comprehensive Understanding of Community Coalitions: Moving from Practice to Theory.” In Emerging Theories in Health Promotion Practice and Research, edited by R. DiClemente, L. Crosby, and M. C. Kegler (pp. 157–193). San Francisco, CA: Jossey-Bass, 2002.

3 Beebe, Timothy, Michael Davern, Todd Rockwood, Donna McAlpine, and Kathleen Call. "The Effects of a Prepaid Monetary Incentive Among Low Income and Minority Populations." Paper presented at the annual meeting of the American Association for Public Opinion Research, Phoenix, AZ, May 11, 2004. http://www.allacademic.com/meta/p_mla_apa_research_citation/1/1/5/9/6/p115965_index.html

4 Mory, Bevin, Jillian Stein, Lisbeth Goble, and Felicia Hurwitz. “Getting to Know You: Strategies to Engage Hard-to-Reach Respondents.” Paper presented at the American Association for Public Opinion Research Conference, New Orleans, May 19, 2017.

