Part B: Reentry Employment Opportunities (REO) Evaluation
OMB No. 1290-0NEW
June 2021
OFFICE OF MANAGEMENT AND BUDGET SUPPORTING STATEMENT PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS FOR THE REENTRY EMPLOYMENT OPPORTUNITIES (REO) EVALUATION
The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) is undertaking the Reentry Employment Opportunities (REO) Evaluation. The overall aim of the evaluation is to determine whether the REO programs improve employment outcomes and workforce readiness for young adults and adults with previous involvement in the criminal justice system. CEO contracted with Mathematica and its subcontractor, Social Policy Research Associates (SPR), to conduct this evaluation. The evaluation will include an implementation study and an impact study. This package requests clearance for four data collection instruments, which we include as supporting documents:
Grantee survey1
Master site visit interview protocol2
Participant focus group protocol
Employer focus group protocol
The four data collection instruments are for the implementation study, with the second through fourth to be used during implementation study site visits. The Health Media Lab Institutional Review Board has reviewed and approved these data collection instruments.
B.1. Respondent universe and sampling methods
This OMB package requests clearance for data collection activities that we will conduct with the REO grantees that received grants in 2018 and 2019 (Table B.1). A subset of these REO grants were awarded to intermediary organizations that provided funding to subgrantees for the delivery of REO services to participants, while other grants were awarded to community-based organizations (CBOs) that are providing REO services directly to participants. We use the term “site” to refer to either a direct CBO grantee or a subgrantee of an intermediary organization for a particular grant award.
Although the implementation study will include data collection from grantees that received their grant awards during prior calendar years, which OMB has already approved, this request for approval of additional data collection pertains only to the 86 grantees that received their grant awards in 2018 and 2019. Up to 28 sites will participate in the implementation study site visits. All impact study sites will participate in the implementation study site visits, and additional grantees not included in the impact study might be selected for visits because they are (1) implementing activities similar to those of the impact study sites or (2) implementing a unique program model. For example, sites implementing a unique program might focus on target groups of interest (e.g., rural areas, women) or on particular topics of interest (e.g., services tailored to youth versus adults, partnerships that allow for the delivery of specific services, differences in experiences between CBOs and intermediaries).
The impact study will include purposively chosen CBO grantees and intermediary grantee organizations and their subgrantees. The grantees will be purposively selected based on the strength of their program models, consistency of implementation across subgrantees, locations in states with data availability, large sample sizes, and tenure of their programs.
This section outlines the respondent universe, proposed methods for respondent selection, and the expected response rates for each of the data collection activities included in this submission.
Table B.1. Respondent universe, by grant year and type
Year | Grant type | Grantees | Sites^a
2018 | CBO | 32 | 32
2018 | Intermediary | 9 | 38
2019 | CBO | 38 | 38
2019 | Intermediary | 7 | 31
Totals |  | 86 | 139
a Some sites are operating more than one grant. For the purposes of this presentation of the respondent universe, we consider each grant as a distinct site and each intermediary subgrantee as a distinct site.
CBO = community-based organization.
Grantee survey. The goal of the implementation study grantee survey is to gather common information about organizational settings and intervention characteristics for REO grantees. DOL received clearance to collect the grantee survey from the 97 grantees that were awarded grants during 2016 through 2018 in response to an initial request to OMB (see OMB Approval No. 1290-0026). The current package requests approval to collect the grantee survey from additional grantees. The survey will be distributed to an executive director at each of the 45 grantees that received REO grants in 2019. We will not use any statistical methods to select grantee survey respondents because all grantees that received grant awards in 2019 will be included in the survey fielding effort (a census of that cohort). The study team anticipates that the survey will take about 20 minutes to complete. We anticipate a 100 percent response rate for the survey because participation in the evaluation is a condition of grant award from DOL.
Master site visit interview protocol. As part of the implementation study, the study team will conduct implementation study site visits to up to 28 sites selected from grantees that were awarded grants during 2018 and 2019. We will not use any statistical methods to select interviewees. Interview participants will be purposively selected based on their engagement with REO program activities. Participants will include grant administrators, intermediary grant administrators, frontline staff, and partners, so that the study team can understand how the programs have been developed, managed, and delivered. We anticipate a 100 percent response rate for the interviews because participation in the evaluation is a condition of grant award from DOL.
Participant focus group protocol. As part of the implementation study site visits, the study team will conduct a focus group at each of the sites to gather information from participants about their experiences receiving program services. Focus group participants will be purposively selected in coordination with the grantees to identify engaged participants who can speak about their experiences with the program from start to end. The study team anticipates an average of eight participants at each of the focus group discussions. The data collected from focus group participants will not be generalized to the broader universe of REO program participants. These focus groups will provide insights to help answer research questions about the types and combinations of services that participants received and participants’ experiences accessing those services. We anticipate a 50 percent response rate for the participant focus groups. This response rate is a conservative estimate based on literature regarding similar data collection efforts.3,5
Employer focus group protocol. As part of the implementation study site visits, the study team will conduct a focus group at each site to gather information from employers. The study team anticipates that each site will be working with between 5 and 25 employer partners. We will work with each grantee to identify up to 10 employers to invite to the focus group. Focus group participants will be purposively selected to include the employers who have been most involved in REO, such as those who have hired numerous participants or those who were involved in shaping program activities.4 Because we will purposively select the most engaged employers, findings from the focus groups will not be representative of all employer partners. Speaking with the most engaged employers will be valuable in helping the study team understand how partnerships were developed and maintained and will highlight the potential challenges involved in the cooperative endeavor, which a less engaged partner might not be able to speak to. We are intentionally keeping this group small because we want to include only the most involved employer partners in the discussion, and because past experience suggests that keeping employer-partner focus groups small helps with scheduling (employers' schedules are often less flexible) and with facilitating the discussion so that we hear about each employer partner's experience. Of the employers recruited from each site, we anticipate that up to 5 will attend, for an anticipated response rate of 50 percent. This response rate aligns with the literature, which recommends over-recruiting by 20 to 50 percent for focus groups.5
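To make the over-recruitment arithmetic explicit, the short sketch below computes how many employers to invite per site given a target number of attendees and an assumed response rate; the 50 percent rate and the target of 5 attendees come from this section, and the invitations_needed helper is illustrative only, not part of the study's instruments or procedures.

```python
import math

def invitations_needed(target_attendees: int, expected_response_rate: float) -> int:
    """Number of invitations needed so that, at the assumed response rate,
    the expected attendance meets the target."""
    return math.ceil(target_attendees / expected_response_rate)

# Assumptions from this section: a target of up to 5 employer attendees per
# focus group and an anticipated 50 percent response rate.
print(invitations_needed(target_attendees=5, expected_response_rate=0.5))  # 10 invitations per site
```

This mirrors the literature's guidance to over-recruit: a 50 percent response rate implies inviting roughly twice the number of attendees sought.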
Table B.2 summarizes this information, including the sampling method, estimated sample sizes, and assumptions about response rates for each data collection activity and respondent type, by site.
Table B.2. Sampling and response rate assumptions, by data collection activity and respondent type
Respondent type | Sampling method | Universe of potential respondents | Estimated selected sample | Average responses (per site) | Estimated responses per respondent | Expected response rate (%) | Estimated responses (across sites)
Implementation study grantee survey
Grantee survey respondents^a | Census | 45 | 45 | 1 | 1 | 100 | 45
Implementation study site visits
Master site visit interview protocol – grant administrators | Purposive | 278 | 56 | 2 | 1 | 100 | 56
Master site visit interview protocol – frontline staff^b | Purposive | 3,892 | 784 | 28 | 1 | 100 | 784
Master site visit interview protocol – partner staff administrators^b | Purposive | 695 | 140 | 5 | 1 | 100 | 140
Master site visit interview protocol – intermediary grant administrators^c | Purposive | 48 | 18 | 1 | 1 | 100 | 18
Participant focus groups^d | Purposive | 9,233 | 448 | 8 | 1 | 50 | 224
Employer focus groups^e | Purposive | 700 | 280 | 5 | 1 | 50 | 140
a The grantee survey will be administered to up to 45 grantees because 45 grants were awarded in 2019. DOL received clearance to collect the grantee survey from 97 grantees that were awarded grants during 2016 through 2018 in response to an initial request to OMB (see OMB Approval No. 1290-0026). We anticipate a 100 percent response rate.
b The frontline staff row includes both REO program staff and partner staff. It assumes an average of 28 frontline staff per site and 5 partner administrators per site. We anticipate a 100 percent response rate.
c We assume that we will conduct the 28 implementation study site visits at sites that stem from 6 intermediary grantees and 4 CBO grantees. We also assume that three intermediary staff will be available to interview from each of these 6 intermediary grantees. These staff will not be affiliated with a single site, so, on average, fewer than one intermediary staff member will be interviewed per site during the implementation study site visit data collection effort. We anticipate a 100 percent response rate.
d Based on grantee target enrollment numbers, we anticipate inviting an average of 16 participants per focus group. We expect a 50 percent response rate, meaning approximately 8 participants will attend each focus group. This assumption is a conservative estimate based on similar data collection efforts and best practices highlighted in the literature.
e Assumes that the universe of employer partners is up to 25 employers per site. We plan to invite an average of 10 employers to each focus group and anticipate a 50 percent response rate. Therefore, each employer focus group will, on average, involve up to 5 employers, and the focus groups will be conducted in up to 28 sites. The average burden time per response will be 1 hour.
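As a cross-check on the figures in Table B.2, the following sketch (a simple illustration, not a required analysis step) reproduces the "Estimated responses (across sites)" column from the per-site averages, the 28 planned site visits, and the expected response rates stated in the footnotes; the intermediary grant administrator figure (18) is excluded because, per footnote c, it is derived from the 6 intermediary grantees rather than on a per-site basis.

```python
# Assumptions drawn from Table B.2 and its footnotes: 28 site visits,
# per-site interview or invitation counts, and expected response rates.
SITES = 28

rows = {
    "grant administrators (interviewed per site)": (2, 1.0),
    "frontline staff (interviewed per site)": (28, 1.0),
    "partner staff administrators (interviewed per site)": (5, 1.0),
    "focus group participants (invited per site)": (16, 0.5),
    "employer partners (invited per site)": (10, 0.5),
}

for respondent_type, (per_site, response_rate) in rows.items():
    estimated = round(SITES * per_site * response_rate)
    print(f"{respondent_type}: {estimated} estimated responses")
# Output matches Table B.2: 56, 784, 140, 224, and 140.
```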
B.2. Procedures for the collection of information
Understanding the effectiveness of the REO grants requires collecting data from multiple sources. For the implementation study, data collection will include a grantee survey, semistructured interviews, individual-level program data, and focus groups with employer partners and program participants. The impact study will include outcome data gathered through administrative earnings data and criminal justice system data for all impact study participants. The data collection activities this clearance covers include the grantee survey and the implementation study master site visit protocols, which include semistructured interviews with grant administrators, frontline staff, partner staff administrators, and intermediary grant administrators, as well as employer and participant focus groups. We describe each of these data collection activities in more detail below.
Grantee survey. As part of the implementation study, we will field an electronic survey with REO grantees to obtain information on their approaches to project management, recruitment and outreach, and service delivery. The grantee survey will include a set of common questions to lead to insights about variations across grantees and grant programs and to put the impact and implementation study data in context. DOL received clearance to collect the grantee survey from 97 grantees through an initial request to OMB (see OMB Approval No. 1290-0026). The current package requests approval to collect the grantee survey from an additional 45 grantees, which received their grant awards during year 2019.
Implementation study site visits. The study team will conduct two- to three-day implementation study site visits to up to 28 sites. Visits will include semistructured interviews with program staff and partners, a focus group with employers, a focus group with participants, and observations of program services, both to learn about respondents' experiences and about variations in implementation across grantees and grant programs, and to contextualize the impact study findings. Participation in interviews and focus group discussions is voluntary. The study team will collect verbal assent from all program staff and partners who participate in the implementation study site visit interviews and focus groups. The study team will collect written consent from any REO program participants who take part in the focus groups. The study team might need to conduct interviews by telephone if, for any reason, it cannot complete all of the interviews on-site (for example, if issues arising in advance of the visit require a virtual site visit, or if a respondent is unexpectedly out of the office while the study team is on-site). Lastly, during the implementation study site visits, the study team will work with program staff to identify activities to observe and will record observations using a rubric.
2. Statistical methodology, estimation, and degree of accuracy
Grantee survey. The main type of data collected from the grantee survey will be contextual information about the program, the services offered, any unique approaches used or populations served, and how those components might relate to positive participant outcomes. No complex statistical methodology (such as sample stratification) or estimation will be necessary to analyze the grantee survey data. We will analyze the data using simple descriptive measures to generate aggregated counts of responses. Responses to open-ended questions will be coded to identify key themes across grantees and enable the study team to describe the REO grantees' characteristics and experiences.
Implementation study site visits. The main type of data collected from the interview and focus group respondents will be qualitative information about staff’s experiences and insights implementing the REO grant or, in the case of participants, their motivations for participating in REO and their experiences while doing so. Thus, no statistical methodology (such as sample stratification) or estimation will be necessary to analyze the interview or focus group data. We are planning to conduct 28 visits to include each of the intermediary subgrantees included in the impact study (n = 25) and allow for a few additional visits to other Reentry Project sites implementing promising or innovative strategies of interest to DOL (n = 3). We will not make any declarative statements about the efficacy of such strategies or practices. We will qualitatively describe these programs to inform DOL and the broader field about innovative program practices. We are proposing to conduct focus groups with up to 5 employer partners and up to 8 program participants during each site visit to qualitatively learn about the experiences of the individuals with whom we speak. This information will provide contextual information to help understand implementation in the sites selected for the implementation study. We will triangulate this information with data collected through the grantee survey and the analysis of the PIRL data to answer the research questions.
The qualitative data collected through interviews and focus groups will be analyzed using qualitative coding software such as NVivo or ATLAS.ti. To code the qualitative data for key themes and topics, the study team will develop a coding scheme organized according to key research questions and topics and guided by the conceptual framework, as well as by constructs from the Consolidated Framework for Implementation Research on factors that affect implementation. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. These data will serve to describe the nuances of how partnerships developed as they did and to explore perceptions of implementation challenges and promising practices. Because the implementation study is examining grant implementation, study findings will apply only to the REO program grantees and will not be more broadly generalizable.
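Although the study's reliability check is described qualitatively (coders compare an initial set of documents and resolve discrepancies), that comparison could be summarized with a simple agreement statistic. The sketch below is a hypothetical illustration: the two coders' code assignments and the code labels are invented, and the use of scikit-learn's cohen_kappa_score is one possible way to quantify agreement, not a procedure specified by the study.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by two coders for the same ten excerpts;
# the labels are placeholders, not the study's actual coding scheme.
coder_a = ["partnerships", "recruitment", "services", "partnerships", "challenges",
           "services", "recruitment", "partnerships", "challenges", "services"]
coder_b = ["partnerships", "recruitment", "services", "challenges", "challenges",
           "services", "recruitment", "partnerships", "partnerships", "services"]

# Raw percent agreement and a chance-corrected agreement statistic.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Percent agreement: {agreement:.0%}; Cohen's kappa: {kappa:.2f}")
```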
3. Unusual problems requiring specialized sampling procedures
As mentioned previously, we will use data from the semistructured interviews and focus groups to describe the REO grants, including the perspectives of frontline staff, partners, and participants. Data collection from the interviews with administrators and frontline staff will include the universe of respondents. Using a census for these two sets of respondents will ensure that the study captures the diversity of grantees' experiences and the perspectives of multiple respondents in each site. Without talking to all the key administrators and frontline staff, the study team might miss important information on the implementation of the REO grant. The study team plans to use purposive sampling methods, in coordination with the grantee sites, to select employer and participant focus group attendees. For the employer focus groups, the study team will ask site administrators in advance of the visits to identify those employers that have been closely involved with the REO grant (for example, by providing input on the development of the grant plans or the provision of services, or by hiring a notable number of participants during or after their service receipt). Although the study team will aim to include all employers that were closely involved with the program in these discussions, the focus group discussions might not be representative of all employer partners. For the participant focus groups, the study team will ask the site administrator to recommend and help invite participants who were engaged in program services and can provide a range of perspectives about program services; however, the actual invitees might not represent the participant diversity present at each program site.
4. Periodic data collection cycles to reduce burden
We will conduct each distinctive data collection effort only once per grant award. Specifically, we will administer the grantee survey for the implementation study only once per grant award. The study team will only conduct one implementation study site visit per selected site. For each site, each distinctive data collection effort will be spaced out over a short period to minimize respondent burden. To further minimize burden on respondents, the study team will review pertinent data available from REO grantee proposals, grantee staffing and implementation plans, and any other reporting information to reduce the burden on site respondents whenever possible. Thus, the study team can focus the discussions with respondents on topics for which information is not otherwise available.
B.3. Methods to maximize response rates and minimize nonresponse
As their grant agreements indicate, REO grantees and subgrantees included in the evaluation are expected to participate in the data collection activities as a condition of their grant awards.
Grantee survey. The study team expects to achieve a response rate of 100 percent for the grantee survey. As described above, an agreement to participate in the evaluation is a condition of receipt of REO grant funds. In addition, the study team will work with DOL to obtain its help in promoting full engagement of the grantees. The survey has been designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed-ended questions, with a few open-ended questions). An official advance letter with log-in information will be mailed to grantee sample members to legitimize the study and encourage participation. Finally, the data collection plan will include a two-stage outreach strategy for reluctant responders. Beginning in week 2 of data collection, the study team will send an email reminder to those who have not responded. Beginning in week 4, the study team staff will conduct follow-up telephone calls (in addition to email reminders) to nonresponders. If a respondent leaves some questions blank, we plan to use pairwise deletion, meaning we will analyze all available responses for each question, excluding cases that skipped the question. When reporting the data, we will always indicate the number of responses and/or the number of missing data points.
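To illustrate the item-level (pairwise deletion) approach described above, the sketch below tabulates each survey item using only the cases that answered it and reports the item-level response count alongside the total sample; the column names and responses are hypothetical, not items from the actual grantee survey.

```python
import pandas as pd

# Hypothetical grantee survey extract; None marks a skipped question.
survey = pd.DataFrame({
    "grantee_id": [1, 2, 3, 4, 5],
    "offers_job_training": ["Yes", "No", "Yes", None, "Yes"],
    "partners_with_employers": ["Yes", None, None, "Yes", "No"],
})

for item in ["offers_job_training", "partners_with_employers"]:
    answered = survey[item].dropna()  # keep only cases that answered this item
    print(f"{item}: {len(answered)} of {len(survey)} responded")
    print(answered.value_counts().to_string())
```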
Semistructured interviews. The study team plans to interview the universe of grant administrators, frontline staff, partner staff administrators, and intermediary grant administrators. To ensure full cooperation, the study team will be flexible in scheduling interviews and activities to accommodate the needs of respondents. Furthermore, data collectors will meet with in-person interview respondents in a central location that is well known and accessible.
Although the study team will try to arrange interviews that accommodate respondents’ scheduling needs, a respondent might be unable to meet while the team is on-site; when this happens, a member of the study team will request to meet with the respondent’s designee or schedule a follow-up call at a more convenient time. With these approaches, the study team anticipates a 100 percent response rate for administrator interviews and for each type of frontline staff who provides direct services to REO program participants. We have achieved comparable response rates on similar qualitative data collection efforts, such as those for the Workforce Investment Act Adult and Dislocated Worker Programs Gold Standard Evaluation, the Evaluation of the Summer Youth Employment Initiative under the Recovery Act, and the Impact Evaluation of the Trade Adjustment Assistance Program.
Focus groups. The study team plans to conduct focus groups with employer partners and program participants. To encourage participation in the focus groups, the study team will use methods that have been successful for numerous other Mathematica studies, such as the Evaluation of Grants Serving Youth, the Evaluation of the Linking to Employment Activities Pre-Release, and the Evaluation of Youth Career Connect, including providing easy-to-understand outreach materials and strategically scheduling focus groups at convenient times and locations.
Outreach materials will serve to help sites recruit participants for the focus groups. These materials will (1) describe the study, its purpose, and how the data collected will be used; (2) highlight DOL as the study sponsor; (3) explain the voluntary nature of participation in the focus groups; and (4) provide a telephone number and email address for participants to contact with any questions. Outreach materials for staff, partners, and participants will be clear and succinct and convey the importance of the focus group data collection.
For each cohort of sites that are included in the visits (where a cohort is a group of sites that are providing REO services through grant awards provided in a particular calendar year, such as 2018 or 2019), the implementation study site visits will take place over a few months, which will allow flexibility in timing and scheduling to maximize respondents’ ability to attend the focus groups. The study team will consider respondents’ schedules and availability when scheduling the focus groups to maximize response. In addition, data collectors will meet with respondents in convenient locations.
In addition to these strategies, which we will use for all focus groups, we also plan to offer incentives for the participant focus groups. Because program participants might be more difficult to locate and might face additional barriers to participation such as transportation costs, the study team will offer attendees a $20 gift card incentive. Although this is a nominal amount that is not likely large enough to be coercive for participants, the payment will facilitate focus group participation and indicate to participants that we value their time and participation.
Methods to ensure data reliability. We will use several well-proven strategies to ensure the reliability of data collected from the grantee surveys, as well as the interviews and focus groups.
Grantee survey. The study team used a systematic approach to develop the instrument and ensure the quality of the data. All respondents will be assured that reports will never identify them by name.
Implementation study site visits. First, site visitors, all of whom already have had experience with this data collection method, will be thoroughly trained in aspects particular to this study, including how to probe for additional details to help interpret responses to interview and focus group questions. Second, this training and the use of the protocols (Attachments X) will ensure that the data are collected in a standardized way across sites. Finally, all interview and focus group respondents will be assured that their responses will be reported anonymously; reports will never identify respondents by name, and any quotes will be devoid of personally identifiable information.
B.4. Tests of procedures or methods to be undertaken
Grantee survey. During formal pre-testing, the study team tested the grantee survey with nine respondents from nonparticipating grantees. The study team sent respondents the survey via email and asked them to complete it, scan it, and return it to Mathematica by email. Respondents were instructed to keep a copy of their completed survey to reference during the respondent debrief. The debriefs were conducted by telephone using a master site visit interview protocol. Feedback from the pre-tests was used to clarify the wording of the instructions and questions, and to eliminate questions that respondents found overly burdensome. The updated grantee survey, which OMB approved for use with other grantees (see OMB Approval No. 1290-0026), is included as a supporting document.
Implementation study site visits. To ensure that the REO data collection protocols effectively yield comprehensive and comparable data across the study sites, senior study team members will conduct the first implementation study site visit to confirm that the protocols include appropriate probes and all relevant topics of inquiry. Furthermore, during this first visit, the senior study team staff will assess the implementation study site visit agenda developed by the research team, including how data collection activities should be structured during each site visit, and ensure it is practical, given the amount of data to collect and the amount of time allotted for each data collection activity. Although the first visit might lead to very minor refinements to question ordering or terminology, the guides and protocols included in this package are final. Based on its experience with the first implementation study site visit, the study team will train all site visitors on the site visit data collection protocols to ensure a common understanding of the key objectives and concepts as well as fidelity to the protocols. The training session will cover topics such as the study purposes and research questions, data collection protocols, procedures for scheduling visits and conducting on-site activities (including a review of interview and focus group facilitation techniques and procedures for protecting human subjects), and post-visit files and summaries.
B.5. Individuals consulted on statistical aspects of design and on collecting and/or analyzing data
Consultations on the statistical methods used in this study were part of the study design phase to ensure technical soundness. The people listed in Table B.3 were consulted on the statistical methods discussed in this submission to OMB.
Table B.3. People consulted for the REO evaluation
Mathematica
Project director | Ms. Jeanne Bellotti | (609) 799-3535
Survey director | Dr. Jillian Stein
Impact study task leader | Dr. Ankita Patnaik
Quality assurance reviewer | Dr. Jillian Berk
Social Policy Research Associates
Principal investigator | Dr. Andrew Wiegand | (510) 763-1499
Implementation study and administrative data collection task leader | Mr. Christian Geckeler
Table B.4 lists staff responsible for overseeing the collection and analysis of data.
Table B.4. People who will oversee the collection and analysis of data for the REO evaluation
Mathematica
Project director | Ms. Jeanne Bellotti | (609) 799-3535
Survey director | Dr. Jillian Stein
Impact study task leader | Dr. Jonah Deutsch
Quality assurance reviewer | Dr. Jillian Berk
Social Policy Research Associates
Principal investigator | Dr. Andrew Wiegand | (510) 763-1499
Implementation study and administrative data collection task leader | Mr. Christian Geckeler
1 DOL received clearance to collect the grantee survey from 97 grantees through an initial request to OMB. This package requests approval to collect the grantee survey from additional grantees.
2 Semistructured interviews will be conducted with grant administrators, intermediary grant administrators, frontline staff, and partner staff administrators.
3 Kerry Levin, Jennifer Anderson, and Jocelyn Newsome. Comparing Recruitment for Focus Groups and Friendship Groups: Which Methodology Makes Recruitment Easier? Poster presented by Westat at the annual American Association for Public Opinion Research conference, 2016. Paper available at http://ewriteonline.com/wp-content/uploads/2016/03/Jen-Anderson_2016-AAPOR-Poster-Content_Hmwk1_no-comments.pdf
4 According to best practices cited in the literature, the optimal group size for a focus group is between 6 and 8 participants, with smaller groups appropriate for participants with more specialized skills or experience (see Onwuegbuzie, et al. 2009).
5 Onwuegbuzie, A. J., Dickinson, W. B., Leech, N. L., & Zoran, A. G. (2009). A qualitative framework for collecting and analyzing data in focus group research. International Journal of Qualitative Methods, 8(3), 1–21.