SUPPORTING STATEMENT FOR INFORMATION COLLECTIONS: PATHWAY HOME GRANT PROGRAM EVALUATION
OMB CONTROL NO.: 1290-XXXX
The U.S. Department of Labor (Department) submits this information collection request (ICR) as a new collection.
A: JUSTIFICATION
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) is conducting an evaluation of Round 2 grants under the Pathway Home program, which is designed to test interventions that help individuals with justice-system involvement find meaningful employment and avoid recidivism. The Evaluation of the Pathway Home Grant Program (Pathway Home Evaluation) includes both implementation and impact studies. This information collection request seeks Office of Management and Budget (OMB) clearance for two new information collections for the evaluation: the follow-up survey for impact study participants and the protocol for structured discussions with grantee staff.1
At the end of 2018, about 6.7 million adults were under some form of supervision by the U.S. correctional system.2 People released from incarceration face substantial obstacles to successful reentry. Not surprisingly given these challenges, 45 percent of those released from state prisons are without employment one year after release.3 Within this context, since July 2020, DOL has awarded approximately $113 million to two cohorts of Pathway Home grantees to expand the availability of employment-focused reentry services for individuals incarcerated in U.S. jails, prisons, and community correctional facilities. The program is designed to link pre-release services in local jails and state correctional facilities to post-release services available in the community. Using a continuity-of-care model, Pathway Home aims to allow participants to keep the same case manager before and after release.
The Pathway Home Evaluation is intended to provide DOL, grantees, and other stakeholders with practical information about the implementation of the grants that can build the knowledge base about reentry employment services, including those delivered before and after release. The impact evaluation will enable DOL to assess the overall impact of access to and participation in Pathway Home services.
The Pathway Home Evaluation includes two components: (1) an implementation study to understand program implementation and partnership development and (2) an impact study to measure the effects of Pathway Home program services on participant outcomes. Both components will take place over four years (2021 to 2025).
The implementation study will compare the planning and implementation of services under the Pathway Home grants across different sites, including the six grantees that are in the impact study and the 16 grantees that are not part of that study. The implementation study will address three key research questions:
How did grantees implement the program, including facility and community-based services, hiring and training staff, data tracking and sharing systems, participant recruitment and enrollment processes, and transitioning participants from pre- to post-release case management? How did implementation vary by grantee and facility characteristics?
What types of partners did grantees engage in the program, and what roles did the partners play in program planning, implementation, and reporting? What factors did respondents perceive influenced the creation, maintenance, success, and sustainability of these partnerships?
Who was served by the Pathway Home grant program, and how were they identified, recruited, and enrolled? How do participant outcomes and program experiences vary by participant and program characteristics?
The implementation study will use data from the information collections that were approved under OMB No. 1290–0039 (a web-based survey of grant administrators, a web-based survey of correctional facility administrators, interviews with program and partner administrators and staff, focus groups with participants during site visits, and telephone interviews with program participants from each of those sites), as well as from the structured discussions with grantee staff covered by this request. This new collection of structured discussions with grantee staff will be conducted virtually.
The impact study will compare the outcomes of Pathway Home participants (program group) to those of a comparison group, and will address three key research questions:
To what extent did receiving Pathway Home services affect participant outcomes such as employment, earnings, and avoidance of repeat involvement in the criminal justice system?
To what extent do impacts vary across selected populations, including those based on age, type of offense, type of institution, veteran status, gender, and race and ethnicity?
How does program effectiveness vary by grantee characteristics, such as institution type, population served, and services offered? Are there core components that are common to successful models of comprehensive reentry programs (for example, service delivery pre- and post-release, supportive services offered)?
The impact study will include baseline and follow-up surveys of all study participants (program and comparison group members) and will use administrative data to address the impact study research questions. The baseline survey of study participants was approved under OMB No. 1290–0039; this request seeks review and approval of the follow-up survey.
This is a new collection request associated with the Pathway Home Evaluation. This package requests clearance for two data collection activities: structured discussions with grantee staff and the participant follow-up survey. Structured discussions with grantee staff are scheduled for Fall 2023. Administration of the participant follow-up survey is planned for early 2024.
Citation of statutes and regulations mandating or authorizing the collection of information:
This evaluation is being conducted under Section 169 of the Workforce Innovation and Opportunity Act (WIOA),4 which authorizes research and evaluations to improve the management and effectiveness of workforce programs and activities under WIOA and other employment and training programs. CEO undertakes a learning agenda process each year to identify departmental priorities for program evaluations, and this DOL-funded study resulted from that annual process to determine the Department's research priorities for the upcoming year. The study contributes to the labor evidence base that informs employment and training programs and policies and addresses departmental strategic goals and priorities.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
DOL will use the information collected through this request in two ways. First, the data from the structured discussions with staff will add critical, detailed information on implementation of selected Pathway Home grants, including the nature of the partnerships with other organizations, the training and supportive services provided, the populations served, and common implementation successes and challenges.
Second, the follow-up participant survey will provide new information to assess the activities, experiences, and outcomes of the program and comparison group members included in the impact evaluation of selected Pathway Home grants. Information from this survey will add to that from the baseline surveys and from administrative data. These data are thus critical for the descriptive and impact analyses that will be conducted under the evaluation and that will provide DOL and other policymakers with information for management decisions and planning for future grant programs, as well as for building the knowledge base on effective services provided to individuals before and after their release from incarceration.
The data collection instruments in the previously approved ICR, OMB No. 1290–0039, include (1) a baseline survey of study participants, (2) a survey of grant administrators, (3) a survey of correctional facility administrators, (4) an interview guide for program and partner administrators and staff, (5) a focus group guide for pre-release program participants, and (6) an interview guide for post-release program participants. More information on these data collection instruments and their uses can be found in the supporting statement for OMB No. 1290–0039. The new data collection instruments for approval in this package are (1) the study participant follow-up survey and (2) the protocol for structured discussions with grantee staff.
Study participant follow-up survey. As part of the impact study, the evaluation team will field a web and phone survey to approximately 2,500 individuals in the study's program and comparison groups. The program group received services under a Pathway Home grant; the comparison group was not eligible for services under the grant but enrolled in the study. Surveys will be administered 15 months after participants' initial enrollment in the study. The survey will collect information on respondents' skills and credential attainment, employment and economic well-being, criminal justice involvement, and health and stability after reentering their communities. The survey will take 25 minutes, on average, to complete.
Structured group and individual discussions with grantee staff. As part of the implementation study, the evaluation team will conduct virtual group discussions with approximately 32 staff from the 16 grantees that did not participate in a site visit. Discussions will focus on understanding the sustainability of the Pathway Home grantee programs. The group discussions will be conducted over WebEx or Zoom and are expected to take about 90 minutes, with five or six grantees per group discussion. The evaluation will also involve six one-on-one or two-on-one calls with staff from the six impact study sites, which are also expected to take about 90 minutes. These calls will cover the same questions about sustainability, as well as specific issues that emerged during site visits.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
This project will use multiple applications of information technology to reduce burden. The participant follow-up survey will be hosted on the internet via a live, secure web link. To reduce burden, the survey will employ: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see questions that apply to them (including those based on answers provided earlier in the survey), and (4) logical rules for responses so respondents' answers are restricted to those intended by the question.
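To make the burden-reduction logic concrete, the following is a minimal, hypothetical sketch of how a skip pattern and a logical response rule might work; it is not the evaluation's actual survey platform, which this package does not name, and the question names and range limits are invented for illustration.

```python
# Hypothetical illustration of two burden-reducing features described above:
# (1) an automated skip pattern and (2) a logical rule restricting responses.
# Question names and range limits are invented, not taken from the instrument.
from typing import Optional

def employment_module(currently_employed: bool,
                      hours_per_week: Optional[int] = None) -> dict:
    """Return only the follow-up items that apply to this respondent."""
    responses = {"currently_employed": currently_employed}

    # Skip pattern: respondents with no current job never see the hours item.
    if currently_employed:
        # Logical rule: hours must fall within a plausible range.
        if hours_per_week is None or not 1 <= hours_per_week <= 100:
            raise ValueError("Hours per week must be between 1 and 100.")
        responses["hours_per_week"] = hours_per_week

    return responses

print(employment_module(True, 35))  # answers the applicable follow-up item
print(employment_module(False))     # skips the follow-up item entirely
```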
Video conferencing will be used to host the structured discussions with grantee staff. These programs reduce burden for participants because they do not have to travel to participate in the discussion, and they allow respondents to call into the meeting if they do not wish to, or are not able to, join by video.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item A.2 above.
The evaluation of the Pathway Home grant program will not require collecting information that is available through alternate sources. For example, the evaluation will use available information from grantee applications and existing administrative data sets to ensure that data collected through the follow-up survey and structured discussions with grantee staff are not available elsewhere.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
Neither of the data collection activities included in this clearance request will impact small businesses or other small entities.
6. Describe the consequence to federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The information collections submitted for approval under this request are critical to building evidence that will add to the growing body of knowledge about what works in providing justice-involved people with linked, employment-focused pre- and post-release services.
Without collecting data on the Pathway Home grants through structured discussions with grantee staff, DOL will not be able to develop a comprehensive understanding of the services provided, the context in which the grants and partnerships operated, the operational challenges grantees and partners faced, or how the services and partnerships evolved over time.
Similarly, failure to collect follow-up information for the impact study, from participants in the program group who received services and from comparison group members who did not, would preclude DOL from determining the impact of the Pathway Home grants that were part of the impact study. Policymakers and the field thus would not have information about the effectiveness of grantees' services or what types of services or service enhancements should be implemented in future programs.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
No special circumstances apply to this data collection.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
The 60-day notice (88 FR 11455) to solicit public comments was published in the Federal Register on May 31, 2023. No comments were received.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years, even if the collection-of-information activity is the same as in prior periods. Circumstances that preclude consultation in a specific situation should be explained.
The contractor’s evaluation team is coordinating consultation on the research design and data needs. The process involves convening experts in a technical working group (TWG), convening an expert panel of people with lived experience in the justice system, and conducting discussions with site-level program staff. A list of evaluation team members is in Table A.1.
The TWG will provide substantive feedback throughout the project period, particularly on the impact study design. The TWG members have expertise in research methodology as well as in programs and populations similar to those being served in the Pathway Home grant program.
The contractor also consulted an expert panel of people with lived experience in the justice system and will continue to consult the expert panel through the project period to ensure the research design, instruments, and findings are grounded in the experiences of people directly affected by the justice system.
The contractor will also consult program staff in the relevant office in the Employment and Training Administration to better understand the feasibility of the research design within the regional context of grantees.
Table A.1. Pathway Home Contractor Evaluation Team

Organization | Individuals
Mathematica | Ms. Samina Sattar; Dr. Jillian Berk; Dr. Jillian Stein; Ms. Jeanne Bellotti; Ms. Betsy Santos
Social Policy Research Associates | Dr. Andrew Wiegand
9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.
Program or partner staff will not receive any payments or gifts because activities will be carried out in the course of their employment, with no additional compensation outside of their normal pay. To achieve a response rate that will enable us to obtain reliable impact estimates, respondents who complete the study participant follow-up survey will receive a $25 gift card as a 'thank you' for their time. In addition to physical gift cards, respondents will have the option of selecting an electronic gift card through the digital incentive platform BHN Rewards. Both physical and electronic gift cards will be provided by the study team from evaluation resources. Decades of research indicate that monetary incentives increase response rates without compromising data quality (Grauenhorst et al., 2016; Mercer et al., 2015; Sundstrom et al., 2016; Singer and Ye, 2013; Laguilles et al., 2011; de Leeuw and de Heer, 2002; Singer and Kulka, 2002).
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
Information collected will be kept private to the extent permitted by law. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
Evaluating the Pathway Home grant program using impact study methodology requires asking sensitive questions about Social Security numbers,5 wage rates and earnings, economic hardships, and involvement in the criminal justice system. Past evaluations have included similar questions without any evidence of significant harm.6 As described earlier, all sample members will be assured of the privacy of their responses before being asked to complete the follow-up survey and will be informed that they can skip any questions they do not wish to answer. All data will be reported in aggregate, summary format only, eliminating the possibility of individual identification and ensuring that individual responses are private.
The evaluation team will seek institutional review board approval for final, OMB-approved instruments.
12. Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Table A.2 includes assumptions about the annual number of respondents expected, the average number of responses per respondent, the average hours of burden per response, the annual burden hours estimated, the time value assumed for respondents, and the total annualized monetary burden hours for the following activities:
Study participant follow-up survey. As part of the impact study, the evaluation team will administer this survey to about 2,500 study participants. It is estimated that each respondent will spend about 25 minutes (0.42 hours) on the survey. Given the very transient nature of this population, the response rate is expected to be 64 percent, meaning about 1,600 individuals (533 annualized over three years) will complete the survey. This is lower than response rates in older studies involving formerly incarcerated populations, but consistent with response rates in more recent studies of other hard-to-reach populations that used a similar design. The Evaluation of Seven Second Chance Act Adult Demonstration Programs (D'Amico et al. 2017), which used a web survey with phone and in-person field follow-up from 2013 to 2014 with formerly incarcerated individuals, achieved an 82.3 percent response rate; however, some individuals were enrolled in the study after release from incarceration and may have been able to provide more accurate contact information. The Evaluation of the SNAP Employment & Training Pilots (OMB No. 0584-0604), which also used a web survey with phone and in-person field follow-up of SNAP participants, achieved a response rate of only 60 percent for the 12-month follow-up survey in 2018. The annualized burden is approximately 222 hours.
Structured discussions with grantee staff. As part of the implementation study, the contractor evaluation team will conduct virtual discussions with grantee staff. For group discussions, the contractor will invite staff from 16 grantees, for a total of about 32 staff members. About 90 percent of invited staff, or approximately 29 grantee staff, are expected to participate. Discussion groups will average 90 minutes, for an annualized burden of approximately 14.5 hours. The contractor will also conduct one-on-one or two-on-one calls with staff from the six impact sites. The contractor expects about 11 people (90 percent) to participate in these calls, which are also estimated to average 90 minutes, for an annualized burden of approximately 5.5 hours. Together, the group discussions and calls will include a total of approximately 40 grantee staff (13 annualized over three years) and 20 hours of annualized burden.
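The annualized figures above and in Table A.2 follow from simple arithmetic. The sketch below reproduces them from the numbers stated in this section; the only inferred input is that roughly 12 staff are invited across the six impact-site calls, which is implied by the expected 11 participants at a 90 percent participation rate.

```python
# Reproduces the annualized burden estimates stated above and in Table A.2.
# Staff counts are rounded before computing hours, matching the reported totals.

# Study participant follow-up survey
completes = round(2500 * 0.64)           # 64% response rate -> 1,600 completes
annual_completes = round(completes / 3)  # annualized over three years -> 533
annual_survey_hours = round(annual_completes * 25 / 60)  # 25 min each -> ~222

# Structured discussions with grantee staff
group_staff = round(32 * 0.90)  # ~29 of 32 invited group-discussion staff
call_staff = round(12 * 0.90)   # ~11 participants; ~12 invitees across the six
                                # impact-site calls is inferred from the text
annual_staff = round((group_staff + call_staff) / 3)            # ~13
annual_discussion_hours = (group_staff + call_staff) * 1.5 / 3  # 90 min each -> 20

# Monetized value of time, using the wage rates shown in Table A.2
print(annual_survey_hours * 7.25)       # ~$1,610 (222 x $7.25 = $1,609.50)
print(annual_discussion_hours * 36.92)  # ~$738  (20 x $36.92 = $738.40)
```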
Table A.2. Estimated Annualized Respondent Cost and Hour Burden

Activity | No. of respondentsa | No. of responses per respondent | Total responses | Average burden (hours) | Total burden (hours) | Hourly wage rateb | Monetized value of time
Study participant follow-up survey | 533 | 1 | 533 | 0.42 | 222 | $7.25 | $1,610
Structured discussions with grantee staff | 13 | 1 | 13 | 1.5 | 20 | $36.92 | $738
Total | 546 | — | 546 | — | 242 | — | $2,348
Note: Numbers are rounded to the nearest whole number for all columns other than the "Average burden (hours)" and "Hourly wage rate" columns.
aAll annual totals reflect a three-year clearance and study data collection period. Estimates are rounded to the nearest whole number.
bThe average hourly wages were obtained from the U.S. Bureau of Labor Statistics, National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates, May 2021 (accessed at https://www.bls.gov/oes/2021/may/oes_nat.htm on December 5, 2022). Estimates of administrators' wages are based on the average wages for "social and community service managers" ($36.92). Monetized estimates for participants were assumed to be the federal minimum wage of $7.25.
13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
The cost estimate should be split into two components: (a) a total capital and start up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of service component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
There will be no direct costs to respondents for the two information collections identified in this request for the Pathway Home Evaluation.
14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.
The total cost to the federal government is $2,214,049. Therefore, the annualized cost is $2,214,049 / 3 = $738,016.
Costs result from the following categories:
The estimated cost to the federal government for the contractor to carry out the data collection activities included in this package is $2,077,123. Annualized over three years of data collection, this comes to $692,374 per year.
The annual cost DOL will bear for federal technical staff to oversee the contract is estimated to be $45,642. The expected annual level of effort to perform these duties will require 200 hours for one Washington, DC–based Federal GS 14 Step 2 employee earning $65.54 per hour, and 200 hours for one Washington, DC–based Federal GS 15 Step 2 employee earning $77.09 per hour. (See Office of Personnel Management 2023 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/23Tables/html/DCB_h.aspx.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus, [(200 hours × $65.54) + (200 hours × $77.09)] × 1.6 = $45,642. Over three years, the total cost is $45,642 × 3 = $136,926.
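As a check on the arithmetic above, the sketch below reproduces the annualized federal costs from the stated hours, hourly rates, overhead multiplier, and contractor total; all inputs come directly from this section.

```python
# Verifies the federal cost figures stated above, using only inputs from the text.
gs14 = 200 * 65.54  # 200 hours of GS-14 step 2 time
gs15 = 200 * 77.09  # 200 hours of GS-15 step 2 time
staff_annual = (gs14 + gs15) * 1.6  # 1.6 fringe/overhead factor -> $45,641.60

contractor_total = 2_077_123         # contractor data collection cost
contractor_annual = contractor_total / 3  # -> $692,374.33

print(round(staff_annual))                        # 45,642
print(round(contractor_annual))                   # 692,374
print(round(staff_annual + contractor_annual))    # 738,016 annualized total
print(contractor_total + round(staff_annual) * 3) # 2,214,049 total over 3 years
```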
15. Explain the reasons for any program changes or adjustments.
This is a new information collection.
16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Analysis plan
The follow-up survey data will be used to provide simple descriptive measures about the outcomes of Pathway Home participants such as averages, medians, and percentiles of the outcomes. In addition, the contractor will use data from the follow-up survey to estimate the impact of the Pathway Home program on participants.
As described in OMB No. 1290–0039, the impact evaluation will use a quasi-experimental design to estimate the impact of participation relative to a group of individuals who were similar to the program group but were not eligible for services. The impact study will involve approximately six grantees that were selected based on their program activities and the feasibility of implementing the study design at their facilities. Study participants are enrolled either as program group members (those who enrolled in the Pathway Home program) or as comparison group members. To estimate the impact of the Pathway Home program on participants, responses from members of the program group will be compared to responses from members of the comparison group. Responses to open-ended questions will be transformed into discrete outcomes by coding to identify key themes across respondents.
Because the surveys will be collected more than a year after enrollment in the study, participant contact information may be outdated. Additionally, some participants may not provide contact information at baseline, further limiting the ability to contact them when their surveys are available. Finally, some participants may choose not to participate in the follow-up survey. To address nonresponse, all analyses using the follow-up survey data will include a nonresponse adjustment. This will involve creating balance tables to assess whether there is systematic nonresponse to the follow-up survey within the sample of study participants. If systematic nonresponse is likely to be an issue, nonresponse weights will be added. This will be done by estimating p_i, the probability that individual i responded to the follow-up survey given their set of baseline characteristics (X_i), using a logistic regression based on covariates from the baseline information form.7 The contractor will then calculate weights for responders to the follow-up survey as w_i = 1/p̂_i, where p̂_i is the estimated response probability. In the event of large outliers, the contractor will make an outlier adjustment.
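A minimal sketch of this nonresponse-weighting step follows, using simulated data; the covariates, sample size, and the weight-capping rule for outliers are illustrative assumptions, not the evaluation's actual specification.

```python
# Illustrative sketch of the nonresponse adjustment described above: estimate
# each participant's probability of responding to the follow-up survey from
# baseline characteristics, then weight responders by the inverse probability.
# Data, covariates, and the outlier cap are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2500
X = np.column_stack([
    rng.integers(18, 65, n),  # age at baseline (invented covariate)
    rng.integers(0, 2, n),    # any prior employment, 0/1 (invented covariate)
])
# Simulated response indicator: older, previously employed people respond more.
p_true = 1 / (1 + np.exp(-(-1.2 + 0.02 * X[:, 0] + 0.5 * X[:, 1])))
responded = rng.binomial(1, p_true)

# Logistic regression of the response indicator on baseline covariates
logit = sm.Logit(responded, sm.add_constant(X)).fit(disp=0)
p_hat = logit.predict(sm.add_constant(X))

# Inverse-probability-of-response weights, defined for responders only
weights = np.where(responded == 1, 1 / p_hat, 0.0)

# One possible outlier adjustment (the package does not specify the rule):
# cap weights at the 99th percentile among responders.
cap = np.quantile(weights[responded == 1], 0.99)
weights = np.minimum(weights, cap)
print(weights[responded == 1][:5])
```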
The analysis will also involve estimating the impact of the Pathway Home program using ordinary least squares regression analyses (continuous outcomes) and logistic regression analyses (binary outcomes), controlling for individuals' baseline characteristics. Given the large number of possible covariates, the contractor will consider using a machine learning method such as LASSO to select which covariates to include in the analysis. Prior to this analysis, the contractor will create balance tables to assess whether there are systematic differences in baseline characteristics between the program and comparison groups. If likely systematic differences are identified, the contractor will adjust analyses using inverse probability weights. In the case that the analyses need to be adjusted for both selection into the program group and selection into survey response, combined weights will be estimated as w_i = (1/p̂_i) × (1/q̂_i) for program group members and w_i = (1/p̂_i) × (1/(1 − q̂_i)) for comparison group members, where q̂_i is the estimated probability of individual i being in the program group conditional on their baseline characteristics (X_i).
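As an illustration of the weighted impact estimation described above, the sketch below regresses a continuous outcome on a program-group indicator and baseline covariates using weighted least squares. The data, effect size, and weights are invented, and the LASSO covariate-selection step mentioned above is assumed to have already produced the covariate set.

```python
# Illustrative sketch of the impact estimate for a continuous outcome:
# weighted least squares of the outcome on a program-group indicator and
# baseline covariates, using combined nonresponse/selection weights.
# All data and parameter values are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1600                            # approximate number of survey responders
program = rng.integers(0, 2, n)     # 1 = program group, 0 = comparison group
x = rng.normal(size=(n, 3))         # baseline covariates (e.g., LASSO-selected)
earnings = (20000 + 1500 * program  # invented true impact of $1,500
            + x @ np.array([800.0, -300.0, 500.0])
            + rng.normal(0, 4000, n))

weights = rng.uniform(0.8, 1.5, n)  # placeholder combined weights (see text)

exog = sm.add_constant(np.column_stack([program, x]))
wls = sm.WLS(earnings, exog, weights=weights).fit()
print(wls.params[1])                # estimated program impact on earnings
```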
The contractor will also consider estimating grantee-specific effects by using Bayesian analysis to bring together information on the grantee-specific estimate of the program's impact and the estimated effects of other partnerships' programs. To generate Bayesian impact estimates for each of the approximately six partnerships in the impact study, the contractor will first estimate grantee-specific estimates using regression analysis. The contractor will then use the impact estimates for the other partnerships to fit a prior distribution of treatment effects.
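The package does not specify the exact Bayesian model. One standard implementation consistent with the description above is empirical Bayes shrinkage under a normal-normal model, sketched below with invented grantee-level impact estimates and standard errors; a leave-one-out prior (fit only on the other grantees, as the text suggests) would follow the same logic.

```python
# Illustrative empirical Bayes sketch for grantee-specific impacts: each
# grantee's regression estimate is shrunk toward a prior fitted from the
# cross-grantee distribution. All numbers are invented for illustration.
import numpy as np

estimates = np.array([1200.0, 800.0, 2500.0, -300.0, 1500.0, 600.0])  # impacts ($)
se = np.array([900.0, 700.0, 1100.0, 1000.0, 800.0, 950.0])           # std. errors

mu = estimates.mean()                                  # prior mean across grantees
tau2 = max(estimates.var(ddof=1) - (se**2).mean(), 0)  # prior variance (method of moments)

# Posterior mean: precision-weighted average of each estimate and the prior mean;
# noisier estimates are pulled more strongly toward the cross-grantee mean.
shrink = tau2 / (tau2 + se**2)
posterior = shrink * estimates + (1 - shrink) * mu
print(np.round(posterior))
```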
Analysis of data from the structured discussions with grantee staff will involve coding the qualitative data into key topics. The contractor evaluation team will begin by writing up detailed notes from virtual discussion groups in a structured format. To code the qualitative data for key themes and topics, a coding scheme will be developed and organized according to key research questions and topics as well as constructs from the CCAT and CFIR frameworks. Each segment of coded data will be assigned a negative or positive flag to identify barriers to and facilitators of implementation. This process will reduce the data into a manageable number of topics and themes for analysis. The evaluation team will then code the data using qualitative analysis software. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies.
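As a minimal illustration of the coded-data structure described above, the sketch below shows how a segment might carry a topic code and a barrier/facilitator flag and how coded segments can be reduced to counts for cross-grantee analysis; the codes, grantee names, and excerpts are invented, and the actual coding will use qualitative analysis software rather than this ad hoc structure.

```python
# Hypothetical data structure for the qualitative coding described above:
# each coded segment carries a topic code and a barrier/facilitator flag.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedSegment:
    grantee: str
    topic: str    # e.g., a research-question or framework construct code
    flag: str     # "facilitator" (positive) or "barrier" (negative)
    excerpt: str

segments = [
    CodedSegment("Grantee A", "partnerships", "facilitator",
                 "The workforce board referred participants weekly."),
    CodedSegment("Grantee B", "staffing", "barrier",
                 "Case manager turnover delayed post-release handoffs."),
]

# Reduce coded segments to counts of barriers/facilitators by topic.
print(Counter((s.topic, s.flag) for s in segments))
```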
Publications
The evaluation includes an implementation study and an impact study. In 2022, a public-facing design report was released describing the implementation study. In 2023, a public-facing design report will be submitted for the impact study. Data collection for the implementation and impact studies will begin in 2023 and will end in 2025. Based on the data collection activities described in the previous OMB package and this clearance request, the following products will be developed:
Implementation study briefs. The first implementation study brief was published in August 2022. Additional briefs are expected to be published in Fall 2023 and Spring 2024. The briefs will focus on special topics of interest to DOL that may include:
An overview of who was initially served by the grantees and what strategies the grantees used to recruit participants to the program, as well as recruitment and enrollment challenges and lessons learned.
An overview of the program models used across the two cohorts of grantees and an overview of services and innovative practices based on responses to the grantee and facility surveys and virtual discussions with grantee staff.
Partnerships developed under the program focusing on how the grantees identified different partners to support Pathway Home program planning and implementation and the trends across grantees in the types of partnerships that showed promise. The paper will also discuss the challenges, successes, and lessons learned around forming these partnerships.
Implementation study report. The evaluation team will complete a report describing the findings from the implementation study. This report will document how sites were selected for the evaluation, as well as the characteristics of sites and correctional facility partners that participated. The report will also discuss participants' experiences with the program, the coordination and delivery of services, any challenges to serving participants, and lessons learned. It is estimated that the report will be published in Fall 2024.
Impact study final report. The evaluation team also will complete a final report documenting how accessing Pathway Home grant services affected participants' outcomes. Likely outcomes will include employment, earnings, and criminal justice involvement. This report will also examine the effects for key subgroups and present an analysis of the association between program components and participant outcomes. It is estimated that the report will be published in Summer 2026.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The OMB approval number and expiration date will be displayed or cited on all forms that are part of the data collection.
18. Explain each exception to the certification statement.
No exceptions are necessary for this information collection.
REFERENCES
D'Amico, Ronald, Christian Geckeler, and Hui Kim. 2017. "Evaluation of Second Chance Act Adult Demonstration Programs: Impact Findings at 18 Months." Report to the U.S. Department of Justice, National Institute of Justice, grant number 2010-RY-BX-0003.
de Leeuw, E. D., and W. de Heer. 2002. "Trends in Household Survey Nonresponse." In Survey Nonresponse, edited by R. M. Groves, D. A. Dillman, J. L. Eltinge, and R. J. A. Little, 41–54. New York: Wiley.
Grauenhorst, T., M. Blohm, and A. Koch. 2016. "Respondent Incentives in a National Face-to-Face Survey: Do They Affect Response Quality?" Field Methods 28 (3): 266–283.
Laguilles, J. S., E. A. Williams, and D. B. Saunders. 2011. "Can Lottery Incentives Boost Web Survey Response Rates? Findings from Four Experiments." Research in Higher Education 52 (5): 537–553. https://doi.org/10.1007/s11162-010-9203-2
Mercer, Andrew, Andrea Caporaso, David Cantor, and Reanne Townsend. 2015. "How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys." Public Opinion Quarterly 79 (1): 105–129. https://doi.org/10.1093/poq/nfu059
Singer, Eleanor, and Richard A. Kulka. 2002. "Paying Respondents for Survey Participation." In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, 105–128. Washington, DC: National Academy Press.
Singer, Eleanor, and C. Ye. 2013. "The Use and Effects of Incentives in Surveys." Annals of the American Academy of Political and Social Science 645 (1): 112–141.
Sundstrom, E. D., E. E. Hardin, and M. J. Shaffer. 2016. "Extra Credit Micro-Incentives and Response Rates for Online Course Evaluations: Two Quasi-Experiments." Teaching of Psychology 43 (4): 276–284.
1 This project has received clearance under OMB No. 1290-0039 for other data collection activities.
2 Jones, Alexi. “Correctional Control 2018: Incarceration and Supervision by State.” Prison Policy Initiative, 2018. Available at https://www.prisonpolicy.org/reports/correctionalcontrol2018.html.
3 Looney, Adam, and Nicholas Turner. “Work and Opportunity Before and After Incarceration.” Washington, DC: Brookings Institution, 2018. Available at https://pdfs.semanticscholar.org/399b/5d1747e721fdb63a5837296619528d361de6.pdf.
4 See 20 CFR Part 683, Administrative Provisions Under Title I of the Workforce Innovation and Opportunity Act.
5 Social Security numbers are collected primarily to link administrative data for outcomes.
6 For example, the Impact Evaluation of the YouthBuild Program (OMB Control No 1205-0488) included similar questions without any evidence of significant harm.
7 If inverse probability weights are needed to adjust the comparison group to more closely resemble the program group, the logistic regression will be run incorporating these weights.