The Evaluation of the Pathway Home Grant Program (Pathway Home Evaluation)

OMB: 1290-0039


PART A: JUSTIFICATION for Pathway Home Grant Program Evaluation

OMB CONTROL No. XXXX-0NEW

OMB Expiration Date: TBD



Part A: Justification

The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) has commissioned an evaluation of the Pathway Home grant program. The program aims to improve the ability of people in the justice system to find meaningful employment and avoid repeat involvement in the criminal justice system. The Evaluation of the Pathway Home Grant Program (Pathway Home Evaluation) offers a unique opportunity to build knowledge about the implementation and effectiveness of these programs. CEO contracted with Mathematica and its subcontractors, Social Policy Research Associates and the Council of State Governments Justice Center, to conduct an implementation and impact study. This information collection request seeks Office of Management and Budget (OMB) clearance for new data collection for the Pathway Home Evaluation. This package requests clearance for six data collection instruments as part of the evaluation:


  1. Baseline survey of study participants

  2. Survey of grant administrators

  3. Survey of correctional facility administrators

  4. Interview guide for program and partner administrators and staff

  5. Focus group guide for pre-release program participants

  6. Interview guide for post-release program participants


The first data collection instrument will be used for the impact study. The remaining data collection instruments will be used for the implementation study. A future information collection request will include protocols for a participant follow-up survey and virtual group discussions with grantee staff.

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

At the end of 2018, about 6.7 million adults were under some form of supervision by the U.S. correctional system.1 People released from incarceration face substantial obstacles to successful reentry. Not surprisingly given these challenges, 45 percent of those released from state prisons are without employment one year after release.2 Within this context, DOL funded Pathway Home, an ambitious effort to strengthen ties between pre-release services for those in local jails and state correctional facilities and post-release services available in the community. Using a continuity-of-care model, Pathway Home allows participants to keep the same case manager before and after release. The Pathway Home Evaluation will provide DOL, grantees, and other stakeholders with practical information about the implementation of the Pathway Home grant program that can build the knowledge base about reentry employment programs, including services delivered before and after release. The evaluation will also enable DOL to assess the value of its investment in the Pathway Home grants by estimating the overall impact of access to and participation in Pathway Home services. Since July 2020, DOL has awarded approximately $113 million to two cohorts of Pathway Home grantees to expand the availability of employment-focused reentry services for individuals incarcerated in U.S. jails, prisons, and community correctional facilities.

Citation of sections of laws that justify this information collection: This evaluation is authorized by Section 169 of the Workforce Innovation and Opportunity Act (WIOA), which authorizes research and evaluations to improve the management and effectiveness of workforce programs and activities such as the Pathway Home grant program. CEO undertakes a learning agenda process each year to identify departmental priorities for program evaluations. This DOL-funded study resulted from that authority and from the annual process to determine the Department’s research priorities for the upcoming year. It contributes to the labor evidence base that informs employment and training programs and policies and addresses departmental strategic goals and priorities.

This package requests clearance for six data collection instruments, administration of which must start in August 2022 to ensure a sufficient sample size across the six sites selected for the impact study.3 This timeline also gives the evaluation team enough time to select, recruit, and train study sites.

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

DOL will use the data collected through the instruments summarized in this request to comprehensively describe implementation of the Pathway Home grant program, including its partnerships, training and supportive services provided, population served, and common implementation successes and challenges. The evaluation will also assess the impacts of Pathway Home on participant outcomes. This analysis will draw on the baseline survey as well as existing administrative data sets, for which collection does not require OMB approval. These data and the evaluation team’s descriptive and impact analyses will provide DOL and other policymakers with important information to guide management decisions, support future planning efforts about such grant programs, and share evidence of the effectiveness of providing services to people before and after their release from incarceration.

Overview of the evaluation

The Pathway Home Evaluation includes two components: (1) an implementation study to understand program implementation and partnership development and (2) an impact study to measure the effects of Pathway Home program services on participant outcomes. Both components will take place over four years (2021 to 2025).

The Pathway Home impact study will compare the outcomes of Pathway Home participants with those of a comparison group. The impact study will address three key research questions (see below). It will include a baseline survey and a follow-up survey of all study participants (treatment and comparison group members) and will use administrative data to address the impact study research questions.

  1. To what extent did accessing Pathway Home services affect participant outcomes such as employment, earnings, and avoidance of repeat involvement in the criminal justice system?

  2. To what extent do impacts vary across selected populations, including those based on age, type of offense, type of institution, veteran status, gender, and race and ethnicity?

  3. How does program effectiveness vary by grantee characteristics, such as institution type, population served, and services offered? Are there core components that are common to successful models of comprehensive reentry programs (for example, service delivery pre- and post-release, supportive services offered)?

The implementation study component of the evaluation will include a web-based survey of approximately 70 grant administrators; a web-based survey of approximately 160 correctional facility administrators; interviews with program and partner administrators and staff and focus groups with participants during site visits to approximately 16 sites; and telephone interviews with program participants from each of those sites. This information collection request includes the surveys of grant administrators and correctional facility administrators as well as the instruments that will be used during the on-site and telephone interviews and focus groups with staff and participants.

The Pathway Home Evaluation implementation study will address three key research questions:

  1. How did grantees implement the program, including facility and community-based services, hiring and training staff, data tracking and sharing systems, participant recruitment and enrollment processes, and transitioning participants from pre- to post-release case management? How did implementation vary by grantee and facility characteristics?

  2. What types of partners did grantees engage in the program, and what roles did the partners play in program planning, implementation, and reporting? What factors did respondents perceive influenced the creation, maintenance, success, and sustainability of these partnerships?

  3. Who was served by the Pathway Home grant program, and how were they identified, recruited, and enrolled? How do participant outcomes and program experiences vary by participant and program characteristics?

We will prioritize sites selected for the impact study when selecting sites for in-person visits. These sites will be selected based on key criteria of interest to DOL, including the structure and maturity of program models, strength of partnerships, type of correctional facility, population served, type of training, urbanicity, and region. Additionally, site selection will depend on whether an impact evaluation is feasible and whether the grantees and their facility partners agree to participate in the impact study. Additional grantees not included in the impact study might be selected for visits based on whether they are (1) implementing activities similar to those of the impact study sites or (2) implementing a unique program model. For example, sites implementing a unique program might focus on target groups of interest (e.g., rural areas, women) or on particular topics of interest (e.g., partnerships that allow for the delivery of specific services, or differences in experiences between community-based organizations providing services directly to participants and intermediary grantees whose subgrantees provide services to participants).

Overview of the data collection

Understanding the implementation and effectiveness of the Pathway Home grant program requires collecting data from multiple sources. For the implementation study, data collection will include key informant interviews with grant program administrators, partner administrators, and frontline staff during site visits; a survey of grant administrators; a survey of correctional facility administrators; focus groups and interviews with participants during site visits; and existing administrative data sets. For the impact study, the evaluation team will collect outcome data from administrative earnings records and criminal justice system records for all impact study participants. The data collection instruments in this clearance request include (1) baseline survey of study participants, (2) survey of grant administrators, (3) survey of correctional facility administrators, (4) interview guide for program and partner administrators and staff, (5) focus group guide for pre-release program participants, and (6) an interview guide for post-release program participants (see Table A.1 for details on how the evaluation team will use data from these instruments).

Baseline survey of study participants. As part of the impact study, the evaluation team will field a web-based survey to about 2,500 impact study participants at study enrollment. This survey will collect basic demographic information, employment history, and criminal justice history. The baseline survey will also be used to register consent and assign individuals to the treatment or comparison group, either through random assignment or a quasi-experimental design. Pre-release program staff will administer the survey, which will take 15 minutes, on average, to complete. The evaluation team is also prepared to offer the baseline survey on paper if Internet access is not available within the correctional facilities.
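The assignment step described above could be operationalized in several ways. As a minimal illustrative sketch only (the function name, the 50/50 allocation ratio, and the participant IDs are hypothetical assumptions, not the evaluation's specified procedure), simple random assignment at enrollment in a site using an experimental design might look like the following:

```python
import random

def assign_study_group(participant_id: str, seed: int = 2022) -> str:
    """Illustrative 50/50 random assignment for one consenting enrollee.

    Folding the participant ID into the seed makes the assignment
    reproducible for auditing; the evaluation's actual allocation ratio,
    stratification, and tooling may differ.
    """
    rng = random.Random(f"{seed}-{participant_id}")
    return "treatment" if rng.random() < 0.5 else "comparison"

# Example: assign a small batch of newly consented enrollees (hypothetical IDs)
for pid in ["site01-0001", "site01-0002", "site01-0003"]:
    print(pid, assign_study_group(pid))
```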

Survey of grant administrators. As part of the implementation study, the evaluation team will field a web survey to approximately 70 grant administrators to obtain information about the structure and main features of their Pathway Home grant programs, their partnerships, and the challenges and successes they encountered. The survey will take 30 minutes, on average, to complete.

Survey of correctional facility administrators. As part of the implementation study, the evaluation team will field a web survey to approximately 160 correctional facility administrators, the partners overseeing the correctional institutions where participants are recruited and enrolled and where pre-release services are delivered. The survey includes questions about the characteristics of the facilities and the people in custody, the facilities’ experiences planning for and implementing the Pathway Home grant program, and the services offered at the facilities. The survey will take 20 minutes, on average, to complete.

Interview guide for program and partner administrators and staff. As part of the implementation study, the evaluation team will conduct semistructured interviews with grant administrators, intermediary grant administrators, frontline staff, and partner staff administrators to understand how the program has been developed, managed, and delivered. These site visits will occur in approximately 16 sites in 2023. If grantees are operating Pathway Home programs at multiple correctional facility partner locations, we will select a subset of those facilities to visit. Selecting those sites will depend on the type of facility, the populations in custody, and distance from the grantee site. Similarly, for intermediaries, we will visit a subset of their grantees based on the characteristics of their program and distance from one another. We will use a master protocol to conduct the in-person interviews, which will ask about program structure, correctional facility and community context, recruitment, service overview, alternative services available, participant characteristics and outcomes, and early sustainability efforts. We expect the in-person interviews to take between 30 and 90 minutes, depending on respondent type. In the event an in-person interview cannot be conducted during the site visit, the interview will be conducted via telephone using the in-person interview protocol to ensure that similar topics are discussed with all respondents.

Focus group guide for pre-release program participants. As part of the implementation study, the evaluation team will conduct a participant focus group at each of the approximately 16 correctional facilities we visit to gather information from program participants. We will collect information about participants’ reasons for enrolling, their impressions of the program, and the extent to which the program has helped them prepare for reentry to their communities and employment. All focus group participants must provide consent to participate in the research study. To ensure informed consent, the evaluation team will collect written consent from all participants at the start of each focus group. Written consent forms will describe the purpose of the study; outline the information that will be collected; explain the risks, benefits, and voluntary nature of participation; and collect participants’ consent to participate in the focus groups. These groups will be conducted in person and are expected to take about 90 minutes to complete. Grantee program staff will invite approximately eight participants to each focus group. The evaluation team will work with the staff to ensure a mix of participants across gender, race, and level of engagement in the program.

Interview guide for post-release program participants. As part of the implementation study, the evaluation team will invite approximately 256 Pathway Home program participants to participate in in-person and telephone interviews approximately 12 months after their enrollment in the program. We will collect information about participants’ experiences transitioning from pre-release to post-release services and the extent to which the program has helped them reenter their communities and obtain employment. The interviews are expected to take about 60 minutes to complete. The evaluation team will work with program staff to ensure that a mix of participants across gender, race, and level of engagement in the program is selected to participate in the interviews. The evaluation team anticipates interviewing participants from approximately 16 sites.

Table A.1. Data collection instruments and uses of data

Data collection instrument | How evaluation team will use the data
1. Baseline survey of study participants | This survey will serve to gather information about the characteristics of Pathway Home grant program participants who enrolled during 2022 and will facilitate assignment to the treatment or comparison group in the impact evaluation. The information will be used to ensure the treatment and comparison groups are equivalent and to describe the study sample.
2. Survey of grant administrators | This survey will serve to gather common information about organizational settings and intervention characteristics for Pathway Home grantees that received grant awards during 2021. The information will be used to draw insights about variations across grantee program models, partnerships, and services offered to Pathway Home participants.
3. Survey of correctional facility administrators | This survey will serve to gather common information about the corrections partners of Pathway Home grantees that received grant awards during 2021. The information will be used to gather insights about variations in the characteristics of correctional facility partners.
4. Interview guide for program and partner administrators and staff | The implementation study site visits will serve to collect information about the implementation of Pathway Home grantee programs to help people with justice system involvement reenter the community by connecting them to education, training, and work both pre- and post-release.
5. Focus group guide for pre-release program participants | The in-person focus groups with a subset of pre-release participants will serve to describe participants’ reasons for enrolling, impressions of the program, and the extent to which the program has helped them prepare for reentering their communities and obtaining employment.
6. Interview guide for post-release program participants | The in-person and telephone interviews with a subset of post-release participants will serve to describe their experiences transitioning from pre- to post-release services and the extent to which the program has helped them reenter their communities and obtain employment.



3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.

This project will use multiple applications of information technology to reduce burden. The baseline, grant administrator, and correctional facility administrator surveys will have the capability to be hosted on the Internet via a live secure web link. To reduce burden, the surveys will employ the following: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question.
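As a minimal sketch of the skip-pattern and response-validation logic described above (the item names and the wage range are hypothetical examples, not questions from the actual instruments), the web survey software might behave as follows:

```python
def next_questions(responses: dict) -> list:
    """Return the remaining applicable questions, given answers so far.

    A hypothetical gating item, 'employed_before', controls whether the
    follow-up wage item is shown, mirroring the dynamic skip patterns
    described above.
    """
    questions = ["employed_before"]
    if responses.get("employed_before") == "yes":
        questions.append("hourly_wage")
    return [q for q in questions if q not in responses]

def validate_hourly_wage(value: str) -> float:
    """Restrict the wage answer to a plausible numeric range."""
    wage = float(value)  # raises ValueError for non-numeric input
    if not 0 <= wage <= 500:
        raise ValueError("Hourly wage must be between $0 and $500.")
    return wage

# Example: the respondent reports prior employment, so the wage item appears next
print(next_questions({"employed_before": "yes"}))  # ['hourly_wage']
print(validate_hourly_wage("17.50"))               # 17.5
```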

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item A.2 above.

The evaluation of the Pathway Home grant program will not require collecting information that is available through alternate sources. For example, the evaluation will use available information from grantee applications and existing administrative data sets to ensure that data collected through interviews and focus groups are not available elsewhere.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

Employer partners will participate in interviews as part of the implementation study site visits. Some of these employers might be from small businesses. To minimize burden on any small businesses that participate, we will only request information required for the intended use and minimize burden by restricting the length of interviews to the minimum required time. We will also consider the employers’ schedules when making decisions about the timing and locations of the interviews. As with all data collection activities, we will remind participants that their participation is completely voluntary.

6. Describe the consequence to federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The evaluation represents an important opportunity for DOL to add to the growing body of knowledge about what works in providing justice-involved people with employment-focused pre- and post-release services. Without collecting data on the Pathway Home grant program through the surveys of grant administrators and correctional facility administrators, implementation study site visit interviews, focus groups, and participant interviews, DOL will not be able to conduct a comprehensive evaluation of the grant program. Policymakers would thus not have information about the context in which the partnerships and programs operated, any operational challenges grantees and partners faced, or how the partnerships and services evolved over time. Similarly, failure to collect baseline information from impact study participants would preclude DOL from evaluating whether the treatment and comparison groups were equivalent and from developing methods to account for any differences, limiting the ability to determine the impact of the Pathway Home grant services. Policymakers and the field thus would not have high-quality information about the effectiveness of grantees’ approaches.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


No special circumstances apply to this data collection.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

The 60-day notice (FR Doc. 2021–24057) to solicit public comments was published in the Federal Register on November 4, 2021. No comments were received.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection-of-information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

The evaluation team is coordinating consultation on the research design and data needs. The process involves convening experts in a technical working group (TWG), convening an advisory group of people with lived experience in the justice system, and conducting discussions with site-level program staff. A list of evaluation team members is in Table A.2.

  • The TWG will provide substantive feedback throughout the project period, particularly on the impact study design. The TWG members have expertise in research methodology as well as on programs and populations similar to those being served in the Pathway Home grant program.

  • We consulted an advisory group of people with lived experience in the justice system and will continue to consult the advisory group through the project period to ensure the research design, instruments, and findings are grounded in the experiences of people directly affected by the justice system.

  • We will consult program staff to better understand the feasibility of the research design within the regional context of grantees.

Table A.2. Pathway Home Evaluation Team

Mathematica, P.O. Box 2393, Princeton, NJ 08543-2393, (609) 799-3535

  • Ms. Samina Sattar, Project director, (609) 945-3358

  • Dr. Jillian Berk, Principal investigator, (202) 264-3449

  • Ms. Jeanne Bellotti, Director, Employment Research, (609) 275-2243

  • Dr. Jillian Stein, Deputy project director, (609) 716-4395

  • Ms. Betsy Santos, Senior survey researcher, (609) 750-2018

Social Policy Research Associates, 1330 Broadway, Suite 1426, Oakland, CA 94612, (510) 763-1499

  • Dr. Andrew Wiegand, President, CEO, and senior advisor, (510) 763-1499, ext. 636

  • Mr. Christian Geckeler, Senior associate, (510) 788-2461

Council of State Governments Justice Center, 22 Cortlandt Street, 22nd floor, New York, NY 10007, (212) 482-2320

  • Dr. Nicole Jarrett, Director, Corrections and Reentry, (929) 314-5889

9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.

Program or partner staff will not receive any payments or gifts because activities will be carried out in the course of their employment, with no additional compensation outside of their normal pay. Depending on the correctional facility’s policies for payment of people in custody, we plan to provide respondents who complete the baseline survey a $15 incentive and respondents who participate in the pre-release focus groups a $20 incentive. When facilities allow payment, we will offer either to deposit the incentive into a commissary account or to send a gift card to the facility to hold for the individual until their release. If a facility does not allow payment of people in custody, we can offer to send the incentive to a friend or family member in the community. Respondents who complete the post-release interviews will receive a $50 gift card as compensation for their time.

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Information collected will be kept private to the extent permitted by law. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

Evaluating the Pathway Home grant program using impact study methodology requires asking sensitive questions about Social Security numbers, wage rates and earnings, economic hardships, and involvement in the criminal justice system. Past evaluations have included similar questions without any evidence of significant harm. As described earlier, all sample members will be assured of the privacy of their responses before being asked to complete the baseline survey and will be informed that they can skip any questions they do not wish to answer. All data will be reported in aggregate, summary format only, eliminating the possibility of individual identification and ensuring that individual responses are private.

The evaluation team will seek institutional review board approval for final, OMB-approved instruments.

12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.

Table A.3 includes assumptions about the annual number of respondents expected, the average number of responses per respondent, the average hours of burden per response, the estimated annual burden hours, the time value assumed for respondents, and the total annualized monetized burden hours for each of the data collection activities in this request. All the activities this request covers are annualized over three years. Here, we summarize the burden estimates, rounded to the nearest whole number, for each of the data collection activities; a small worked example of the annualization arithmetic follows the list.


  1. Baseline survey. The evaluation team will administer this survey to about 2,500 impact study participants. We estimate each respondent will spend about 15 minutes on the survey. The annualized burden is approximately 208 hours.

  2. Survey of grant administrators. The evaluation team will administer this survey to about 70 grant administrators. We estimate each respondent will spend about 30 minutes on the survey. We expect 95 percent, or about 67 grant administrators will complete the survey. The annualized burden is approximately 11 hours.

  3. Survey of correctional facility administrators. The evaluation team will administer this survey to about 160 correctional facility administrators. We estimate each respondent will spend about 20 minutes on the survey. We expect 80 percent, or about 128 correctional facility administrators will complete the survey. The annualized burden is approximately 14 hours.

  4. Site visit interviews. As part of the implementation study, which will be conducted in approximately 16 sites, the evaluation team will conduct semistructured interviews with grant administrators, frontline staff, and partners. We will invite about 32 stakeholders from each site to participate in the interviews and we expect 29 to participate. An additional three administrators will participate in visits to each of the six intermediary sites ((29 stakeholders × 16 sites) + (3 intermediary stakeholders × 6 sites) = 482 stakeholders). Interviews will range from 60 to 90 minutes based on respondent type. The annualized burden for the site visit interviews is approximately 179 hours.

  5. Pre-release participant focus groups. As part of the implementation study, the evaluation team will conduct a focus group at each of the 16 sites to gather information from participants receiving pre-release program services. In each of the 16 sites, we will invite approximately eight participants with the expectation that about 90 percent will participate, for a total of about 115 respondents. Focus groups will average 90 minutes. The annualized burden is approximately 57 hours.

  6. Post-release participant interviews. As part of the implementation study, the evaluation team will conduct interviews with participants who have engaged in post-release services. We will invite 16 participants from each of the 16 sites and expect 50 percent to participate, or approximately eight participants from each site. Interviews will average 60 minutes. The annualized burden is approximately 43 hours.
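As a check on the estimates above, the short sketch below reproduces the annualization arithmetic for the single-duration activities (expected respondents divided by the three-year clearance period, then multiplied by hours per response). The invitation counts, response rates, and durations come directly from the list above; the site visit rows in Table A.3 follow the same logic with durations that vary by respondent type.

```python
# Annualized burden: annual respondents are computed first, then multiplied
# by the hours per response, matching the rounding used in Table A.3.
activities = {
    # activity: (number invited, expected response rate, hours per response)
    "Baseline survey":                                (2500,    1.00, 0.25),
    "Survey of grant administrators":                 (70,      0.95, 0.50),
    "Survey of correctional facility administrators": (160,     0.80, 1 / 3),
    "Pre-release participant focus groups":           (8 * 16,  0.90, 1.50),
    "Post-release participant interviews":            (16 * 16, 0.50, 1.00),
}

for name, (invited, rate, hours) in activities.items():
    annual_respondents = round(invited * rate / 3)   # three-year clearance period
    annual_burden_hours = round(annual_respondents * hours)
    print(f"{name}: {annual_respondents} respondents/year, "
          f"{annual_burden_hours} burden hours/year")
```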

Table A.3. Estimated annualized respondent cost and hour burden

Activity | No. of respondents (a) | No. of responses per respondent | Total responses | Average burden (hours) | Total burden (hours) | Hourly wage rate (b) | Annual monetized burden hours
Baseline survey | 833 | 1 | 833 | 0.25 | 208 | $7.25 | $1,508
Survey of grant administrators | 22 | 1 | 22 | 0.5 | 11 | $36.13 | $397
Survey of correctional facility administrators | 43 | 1 | 43 | 0.33 | 14 | $32.50 | $455
Site visit interviews with program and partner administrators: Grant administrators (c) | 39 | 1 | 39 | 1.5 | 59 | $36.13 | $2,132
Site visit interviews with program and partner administrators: Partner staff administrators (c) | 48 | 1 | 48 | 1 | 48 | $36.13 | $1,734
Site visit interviews with program and partner administrators: Frontline staff (c) | 72 | 1 | 72 | 1 | 72 | $21.82 | $1,571
Pre-release participant focus groups (d) | 38 | 1 | 38 | 1.5 | 57 | $7.25 | $413
Post-release participant interviews (e) | 43 | 1 | 43 | 1 | 43 | $7.25 | $312
Total | 1,138 | | 1,138 | | 512 | | $8,530

Note: Numbers are rounded to the nearest whole number for all columns other than the “average burden hours” and “Hourly wage rate” columns.

(a) All annual totals reflect a three-year clearance and study data collection period. Estimates are rounded to the nearest whole number.

(b) The average hourly wages were obtained from the U.S. Bureau of Labor Statistics, National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates, May 2020 (accessed at https://www.bls.gov/oes/current/oes_nat.htm on June 16, 2021). Estimates of administrators’ and managers’ wages are based on the average wages for “social and community service managers” ($36.13). Estimates of correctional facility staff wages are based on the average wages for “first-line supervisors of correctional officers” ($32.50). Estimates of wages for frontline staff are based on the average wages for “miscellaneous community and social service specialists” ($21.82). Monetized estimates for participants were assumed to be the federal minimum wage of $7.25. “Frontline staff” refers to both Pathway Home program staff and partner staff.

(c) Assumes each visit will, on average, involve individual or group interviews. We will plan to invite approximately seven grant administrators, 10 partner staff administrators, and 15 frontline staff across grantees and partners, and we expect 90 percent of staff will participate. The team anticipates completing about 16 visits in total. The average burden time per response for the grant administrator interviews will be 1.5 hours. The average burden time per response for the frontline staff interviews will be 1 hour. The average burden time per response for the partner staff administrator interviews will be 1 hour. Additionally, the team anticipates conducting interviews with about three intermediary grant administrators in about six of the 16 visits. The intermediary grant administrator interviews will be about 1.5 hours, on average. For all types of staff, some meetings will be shorter, and some will be longer than the averages.

(d) Assumes eight participants from each of the 16 sites will be invited to participate in a focus group and 90 percent will participate. The average burden time per response will be 1.5 hours.

(e) Assumes 16 participants from each of the 16 sites will be invited to do an interview and 50 percent will participate. The average burden time per response will be 1 hour.

13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

  • The cost estimate should be split into two components: (a) a total capital and start up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of service component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There will be no direct costs to respondents for the Pathway Home Evaluation other than their time.

14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.

The total annualized cost is $292,035. Costs result from the following categories:

  1. The estimated cost to the federal government for the contractor to carry out the data collection activities included in this package is $736,041. Annualized over three years of data collection, this comes to $245,347 per year.

  2. The annual cost DOL will bear for federal technical staff to oversee the contract is estimated to be $46,688. We expect the annual level of effort to perform these duties will require 200 hours for one Washington, DC–based Federal GS 14 Step 2 employee earning $65.94 per hour, and 200 hours for one Washington, DC–based Federal GS 15 Step 2 employee earning $79.96 per hour. (See Office of Personnel Management 2021 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2021/DCB.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus, [(200 hours × $65.94) + (200 hours × $79.96)] × 1.6 = $46,688.



Summary: $245,347 (contractor data collection) + $46,688 (federal oversight of the contract) = $292,035 total annualized cost.



15. Explain the reasons for any program changes or adjustments.

This is a new information collection.

16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Analysis plan

The analysis of survey data will involve using simple descriptive measures to generate aggregated counts of responses and to show variation in responses by type of grantee, program component or model, or participant characteristics, as applicable. Responses to open-ended questions will be coded to identify key themes across respondents. When analyzing the information from the survey of correctional facility administrators, we may break down descriptive measures by facility type, because some grantees are working with multiple facilities, including prisons, jails, and halfway houses.

To analyze service delivery and coordination across partners, we will draw on two implementation research frameworks when coding and analyzing data from interviews and focus groups. The Community Coalition Action Theory (CCAT) framework describes stages of partnership development and implementation and identifies core components of effective partnerships that can foster change, making it well-aligned with our research questions and Pathway Home’s objectives.4 The Consolidated Framework for Implementation Research (CFIR) will be used to analyze the implementation of direct services interventions.5 This framework was developed to systematically assess the implementation context to reveal the factors that influence implementation, common implementation challenges, and promising strategies for replication.

Analysis of interview data will involve coding and triangulating across data sources. The evaluation team will begin by writing up detailed field notes from in-person and telephone interviews in a structured format. To code the qualitative data for key themes and topics, a coding scheme will be developed and organized according to key research questions and topics as well as constructs from the CCAT and CFIR frameworks. Each segment of coded data will be assigned a negative or positive flag to identify barriers to and facilitators of implementation. This process will reduce the data into a manageable number of topics and themes for analysis.6 The evaluation team will then code the data using qualitative analysis software. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. These data will be used to describe the nuances of how and why partnerships developed as they did and to explore implementation challenges and promising practices.
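To illustrate the reliability check described above (a minimal sketch; the excerpt IDs, code labels, and use of simple percent agreement are hypothetical assumptions rather than the evaluation's specified procedure), two coders' codes for the same excerpts can be compared to flag discrepancies for resolution:

```python
# Hypothetical double-coding of the same interview excerpts by two coders.
coder_a = {"ex01": "partnerships", "ex02": "recruitment", "ex03": "services"}
coder_b = {"ex01": "partnerships", "ex02": "services",    "ex03": "services"}

disagreements = [ex for ex in coder_a if coder_a[ex] != coder_b[ex]]
agreement = 1 - len(disagreements) / len(coder_a)

print(f"Percent agreement: {agreement:.0%}")        # Percent agreement: 67%
print("Excerpts to reconcile:", disagreements)      # Excerpts to reconcile: ['ex02']
```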

Publications

The evaluation includes an implementation study and an impact study. In 2022, a public-facing design report will be submitted for the impact study. Data collection for the implementation and impact studies will begin in 2022 and will end in 2024. The following products will be developed:

Implementation study briefs. Implementation study briefs will be submitted in 2022 and 2023. The briefs will focus on special topics of interest to DOL that may include:

  • An overview of who was initially served by the grantees and what strategies the grantees used to recruit participants to the program, as well as recruitment and enrollment challenges and lessons learned.

  • An overview of the program models used across the two cohorts of grantees and an overview of services and innovative practices, based on responses to the grantee and facility surveys and virtual discussions with staff.

  • Partnerships developed under the program, focusing on how the grantees identified different partners to support Pathway Home program planning and implementation, and the trends across grantees in the types of partnerships that showed promise. This brief would also discuss the challenges, successes, and lessons learned around forming these partnerships.


Implementation study report. The evaluation team will complete a report describing the findings from the implementation study. This report will document how sites were selected for the evaluation, as well as the characteristics of sites and correctional facility partners that participated. The report will also discuss participants’ experiences with the program, the coordination and delivery of services, any challenges to serving participants, and lessons learned.

Impact study final report. The evaluation team also will complete a final report documenting how accessing Pathway Home grant services affected participants’ outcomes. Likely outcomes will include employment, earnings, and criminal justice involvement. This report will also examine the effects for key subgroups and present an analysis of the association between program components and participant outcomes.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The OMB approval number and expiration date will be displayed or cited on all forms that are part of the data collection.

18. Explain each exception to the certification statement.

No exceptions are necessary for this information collection.




1 Jones, Alexi. “Correctional Control 2018: Incarceration and Supervision by State.” Prison Policy Initiative, 2018. Available at https://www.prisonpolicy.org/reports/correctionalcontrol2018.html.

2 Looney, Adam, and Nicholas Turner. “Work and Opportunity Before and After Incarceration.” Washington, DC: Brookings Institution, 2018. Available at https://pdfs.semanticscholar.org/399b/5d1747e721fdb63a5837296619528d361de6.pdf.

3 A subset of these Pathway Home grants was awarded to intermediary organizations that funded subgrantees for delivering Pathway Home services to participants, while other grants were awarded to community-based organizations that are providing Pathway Home services directly to participants. We use the term “site” to refer to either a direct community-based organization grantee, an intermediary grantee, or a subgrantee of an intermediary organization.

4 Butterfoss, F. D., and M. C. Kegler. “Toward a Comprehensive Understanding of Community Coalitions: Moving from Practice to Theory.” In Emerging Theories in Health Promotion Practice and Research, edited by R. DiClemente, L. Crosby, and M. C. Kegler (pp. 157–193). San Francisco: Jossey-Bass, 2002.

5 Damschroder, L. A., D. C. Aron, R. E. Keith, S. R. Kirsh, J. A. Alexander, and J. C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science, vol. 4, no. 7, 2009. doi:10.1186/1748-5908-4-50.

6 Ritchie, J., and L. Spencer. “Qualitative Data Analysis for Applied Policy Research.” In The Qualitative Researcher’s Companion, edited by M. Huberman and B. Miles. London: Sage, 2002.

