Reemployment Services and Eligibility Assessments (RESEA) Implementation Study
ICR REFERENCE NUMBER 201904-1290-001
April 2019
PART A: Justification
A.1. Circumstances Making the Collection of Information Necessary
The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) has contracted with Abt Associates and its partners—the Urban Institute, Capital Research Corporation (CRC), and the National Association of State Workforce Agencies (NASWA)—to conduct a three-year evaluation to develop strategies to support the evidence requirements for the Reemployment Services and Eligibility Assessment (RESEA) program that were enacted as part of the Bipartisan Budget Act of 2018 (Public Law 115-123; hereafter the “BBA”). This Evaluation to Advance RESEA Program Evidence will help DOL establish rigorous and informative evidence standards and also support DOL and states in conducting evaluations to expand the evidence base. The evidence standards and new evidence will each help states develop and implement RESEA programs that more quickly return Unemployment Insurance (UI) claimants to work.
An implementation study component of the Evaluation to Advance RESEA Program Evidence will support the evaluation’s aims noted above. This package requests clearance for three data collection activities as part of the RESEA implementation study:
key informant interviews during site visits with approximately 10 state workforce agencies and approximately 20 local workforce areas (i.e., two local workforce areas within each state);
telephone interviews with approximately 24 state workforce agencies; and
a web-based survey of all state workforce agencies.
1. RESEA Background
States and territories use RESEA funds “to address the reemployment service needs of UI claimants and ... to prevent and detect UI improper payments” (Unemployment Insurance Program Letter 8-18). Since 2005, DOL and participating state UI agencies have been addressing individual reemployment needs of UI claimants and working to prevent and detect UI improper payments through the voluntary UI Reemployment and Eligibility Assessment (REA) program and, beginning in fiscal year (FY) 2015, through the voluntary RESEA program. At the end of FY 2018, a total of 51 states and jurisdictions were operating an RESEA program (Unemployment Insurance Program Letter 7-19).
On February 9, 2018, the President signed the Bipartisan Budget Act of 2018, Public Law 115-123, which included amendments to the Social Security Act (SSA) that create a permanent authorization for the RESEA program.1 The permanently authorized RESEA program in Section 306 of the SSA provides for a phased implementation of new program requirements over several years.
Specifically, the BBA substantially increases funding for RESEA, introduces a requirement that the funding be used for evidence-based interventions, and requires evaluation of interventions that are not evidence based. The BBA requires that RESEA funds be allocated “only for interventions demonstrated to reduce the number of weeks for which program participants receive unemployment compensation by improving employment outcomes for program participants.” States implementing interventions not “demonstrated effective” with a “high or moderate causal evidence rating” must “be under evaluation at the time of use.” Starting in 2023, states are required to use at least a certain proportion of their RESEA grants to fund evidence-based interventions. At that point, being “under evaluation” is no longer an alternative for states to satisfy requirements to receive an RESEA grant. The required proportions increase through 2027 as follows:
In FY 2023 and FY 2024, states will be required to use no less than 25 percent of the grant funds for interventions or service delivery strategies with a high or moderate causal evidence rating that show a demonstrated capacity to improve employment and earnings outcomes for program participants.
For FY 2025 and 2026, states must use no less than 40 percent of funds for interventions or service delivery strategies with a high or moderate causal evidence rating.
For FY 2027 and beyond, states must use no less than 50 percent of RESEA funds for interventions or service delivery strategies with a high or moderate causal evidence rating.
To satisfy the BBA requirements, DOL must (1) review State Plans to determine whether the proposed RESEA interventions are evidence based and whether the required evaluation plans are acceptable. To do that, DOL must (2) identify which reemployment interventions are evidence based. Both steps in turn require (3) standards for evidence-based interventions and for acceptable evaluations.
2. Role of the Implementation Study in the Evaluation
Under its contract with CEO, the Abt Associates team is developing options to support DOL in developing standards to rate program effectiveness and in generating new evaluations to expand the base of evidence of effective approaches. The contract includes an implementation study component to produce an up-to-date understanding of states’ current RESEA programs, their evaluation capacity, and plans for RESEA program changes subsequent to implementation of BBA requirements.2 That information will help DOL better understand existing interventions used by states, determine the extent to which existing evidence is able to demonstrate the effectiveness of interventions currently used, and identify interventions to evaluate further.
A.2. Purposes and Use of the Information
The information to be collected will be used by DOL and state/local workforce agencies to understand and analyze current RESEA program structure and operations at the state and local levels (including variability); state evaluation efforts; and state/local plans for making changes to RESEA programs in response to BBA requirements and for conducting evaluations. DOL will use the results of the data collection and analysis effort to develop future guidance, design evaluation projects, and provide a range of technical assistance services to help states meet evidence-based and other requirements of the BBA. Additionally, the results of the data collection effort will be published in a report that will help states and local workforce areas plan, implement, and evaluate changes to RESEA program services in order to meet BBA evidence-based requirements.
Overview of the Implementation Study Component of the RESEA Evaluation
The implementation study component of the three-year Evaluation to Advance RESEA Program Evidence effort will provide an in-depth understanding of state RESEA programs and how they vary. The key objectives of the implementation study are to provide up-to-date knowledge on the following questions:
How are RESEA programs designed and how do they operate in the field?
How do states decide which claimants to select for RESEA participation?
How do states schedule meetings and promote attendance by claimants?
What types of employment services do they offer and how are they offered?
How do RESEA programs coordinate with American Job Centers (AJCs)?
How are eligibility enforcement mechanisms applied?
What challenges or benefits have states experienced in responding to the new RESEA provisions, and what further plans do they have to respond to them?
Which practices are seen as promising?
What are the employment and UI claim outcomes of RESEA participants?
How do states use evidence to inform the design of RESEA programs?
What plans and capacity do states have to conduct evaluations?
Overview of Data Collection for the Implementation Study Component
Understanding the implementation of state RESEA programs requires data from multiple sources. To the extent possible, we will use data from available documents. In particular, we will (a) review and extract information from RESEA state plans and (b) analyze the ETA-9128 (RESEA Workload Report) and ETA-9129 (RESEA Outcomes Report) reports. The study will also conduct the following primary data collection activities:
Key informant interviews during site visits to 10 states. These include visits to the state workforce agency and two local workforce areas in each state to conduct interviews with state and local RESEA administrators and staff.
Telephone interviews with state UI/RESEA administrators in 24 states.
A web-based survey of all states.
A.3. Use of Technology to Reduce Burden
The interviews with key informants and state UI/RESEA administrators will not involve the use of technology. The survey data collection effort will prioritize the use of online technology to collect survey responses. The Abt team will use SurveyGizmo to program and administer the survey. SurveyGizmo offers a user interface that is easy for respondents to navigate. We will use the platform’s support for automated skip patterns, so that respondents are only asked to respond to items that are relevant to them. Administering the survey by web will also allow participants to respond to the survey at a time that they find most convenient and in stages. After respondents begin answering questions, they will be able to save their progress, leave the survey, and complete the remaining questions at their convenience. The software will also facilitate tracking response rates as states complete surveys.
Abt and its partners have successfully administered surveys of similar size and complexity using SurveyGizmo. For example, Abt recently used SurveyGizmo to develop and field a similarly sized web-based survey of approximately 160 TANF State, Territory, and County administrators.
Respondents who are unable to complete the survey online will be able to complete it by phone.
A.4. Efforts to Avoid Duplication
We will use the site visits and calls to collect only data that are not available from any other source. We will examine state RESEA plans, state RESEA websites, and state-level data submitted to DOL (i.e., ETA-9128, RESEA Workload Report) to gather all information possible before conducting the visits and calls in order to avoid asking interviewees for information available elsewhere.
We will use the survey to collect data that are not systematically collected from any other pre-existing survey effort. Some survey respondents may also participate in the site visit interviews. These respondents may be asked questions on topics similar to those covered in the survey. However, the survey questions will generally have closed-ended response options tailored for quantitative analyses, while site visit questions are open-ended and will be used for qualitative analysis of program implementation.
A.5. Methods to Minimize Burden on Small Entities
The data collection does not involve small businesses or other small entities.
A.6. Consequences of Not Collecting the Data
If the site visits, telephone interviews, and survey are not conducted, DOL will not have the information needed to develop appropriate guidance about the relevance of existing evidence to states’ programs, understand what evidence needs to be generated to fill gaps, or develop plans to generate new evidence to fill those gaps. All of these steps are needed so that states’ RESEA programs can be positioned to use evidence to improve the effectiveness of their interventions and to meet the statutory requirement that the interventions they use be demonstrated effective.
A.7. Special Circumstances
The data collection effort does not involve any special circumstances.
A.8. Federal Register Notice and Consultation
Federal Register Announcement
A 60-day notice to solicit comments on the site visit and telephone interviews was published in the Federal Register, 83 FR 63188, 12/07/2018, pp. 63188-63189. No comments were received.
A 60-day notice to solicit comments on the survey was published in the Federal Register, 84 FR 17434, 04/25/2019, pp. 17434-17435. Two comments were received. One requested confirmation of the estimated burden hours, and the other requested additional information on the survey questions. We responded to both, confirming the estimated burden hours and notifying the commenter that the survey instrument would be available for review and public comment before final approval.
Consultation Outside the Agency
Consultation on the research design and data needs has been coordinated by the study team. It has involved discussions with multiple DOL agencies and offices (CEO, the Employment and Training Administration (ETA), and the Office of Unemployment Insurance (OUI)) to obtain a variety of perspectives from both program and research experts. The study team includes staff from NASWA, with whom we have consulted for expertise to ensure that the data being collected are relevant to states and comprehensively cover important state RESEA program elements. Consultations have not involved experts outside of DOL or the study team.
A.9. Payment or Gifts
There are no payments or gifts to respondents. Tasks and activities conducted by administrators and staff to be interviewed are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay.
A.10. Assurance of Privacy
Information collected will be kept private to the extent permitted by law.
The administrators and staff interviewed and surveyed by the research team will be informed that their responses will be combined with those from other states and local areas for analysis. Respondents will not be identified by name in any reports, nor will interview notes be shared with DOL or anyone other than the Abt research team. Only the evaluation team will be able to identify individual responses. All evaluation team members are required to sign non-disclosure agreements (NDAs) with DOL in order to be permitted to work on the project or receive any restricted-access project materials. This includes Abt Associates, the prime contractor, and all subcontractors. Furthermore, the contractor complies with all DOL data security requirements.
To protect respondents’ privacy, all data will be stored on a password-protected drive established at the contractor site. Access to this drive will be limited to research staff members who are working on the project and have signed the non-disclosure agreement. To preserve privacy, paper copies of interview notes will be secured in a locked file cabinet. All interview notes and survey data will be destroyed when they have been fully analyzed and synthesized for reporting purposes, no later than the end of the period of performance for the Abt Associates’ evaluation contract.
A.11. Justification for Sensitive Questions
No sensitive questions will be asked during the interviews.
A.12. Estimates of Burden Hours
Table A.1 provides annual burden estimates for each of the data collection activities for which this package requests clearance. All of the activities covered by this request will take place within about a three-year period. To calculate the estimated cost burden for respondents, average hourly wages from the U.S. Bureau of Labor Statistics May 2018 National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates were multiplied by the number of hours per respondent type. The following summarizes the annual burden estimates for each of the data collection activities, by type of respondent (state or local):
State program administrators and staff.
Site Visits to approximately 10 State Workforce Agencies and Telephone Interviews with approximately 24 State Workforce Agencies. The research team will conduct interviews with state program administrators in two different settings: in-person site visits and telephone interviews. During the site visits, the research team will conduct in-person interviews with a total of 51 state UI/RESEA administrators (an annualized 17 administrators). Each interview will take 1 hour to complete. For the telephone interviews, the research team will speak with approximately 24 state UI/RESEA administrators (an annualized 8 administrators). Each interview will take 2 hours to complete. Combined, the estimated annualized burden for the site visit and telephone interviews is 33 hours; the total burden (across the three years) is 99 hours.
Survey of all State Workforce Agencies. The research team will survey 53 administrators/staff (an annualized 18 administrators/staff, rounded up from 17.67). The survey will take up to 2 hours to complete. The estimated annualized burden for the survey is 36 hours; the total estimated burden (across the three years) is 108 hours.
Local program administrators and staff. Site Visits to approximately 20 Local Workforce Areas. During the site visits, the research team will conduct (a) in-person interviews with approximately 39 local workforce administrators (an annualized 13 administrators); and (b) in-person interviews with approximately 120 American Job Center (AJC) administrators/staff (an annualized 40 administrators/staff). Each interview will take 1 hour to complete. The estimated annualized burden across these administrators/staff will be 53 hours; the total burden (across the three years) will be 159 hours.
The interviews during site visits and calls to state respondents will both use the same data collection instrument. The interviews with local respondents will use a separate instrument.
Across the three major types of data collection activities, the annual number of respondents is 96 administrators/staff, with an annual estimated burden of 122 hours. Across the three years of data collection, the total number of respondents is 287 administrators/staff, with a total estimated burden of 366 hours. The annual monetized value of burden hours is $5,286; the total monetized value across the three years of data collection is $15,858.
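For reference, the annual burden and cost figures above combine as follows:
State-level interviews: (17 administrators × 1 hour) + (8 administrators × 2 hours) = 33 hours
Local-level interviews: (13 + 40) administrators/staff × 1 hour = 53 hours
State survey: 18 administrators/staff × 2 hours = 36 hours
Annual total: 33 + 53 + 36 = 122 hours; 122 hours × $43.33 ≈ $5,286.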
Table A.1. Annualized Burden Estimates for Data Collection Activities
Data Collection Activity | Annual number of respondents | Number of responses per respondent | Total number of responses | Average burden per response (in hours) | Annual burden hours | Average hourly wage a | Annual monetized burden hours
Semi-structured in-person and telephone interviews with state UI/RESEA administrators | 25 | 1 | 25 | 1.32 | 33 | $43.33 | $1,430
Semi-structured in-person interviews with local WDB administrators and AJC administrators/staff | 53 | 1 | 53 | 1 | 53 | $43.33 | $2,296
Web-based survey instrument for State RESEA administrators | 18 | 1 | 18 | 2 | 36 | $43.33 | $1,560
Unduplicated Total (Annual) | 96 | -- | 96 | -- | 122 | -- | $5,286
a The hourly wage of $43.33 is the May 2018 median wage for Management Occupations – Other Management Occupations (see http://www.bls.gov/oes/current/oes_nat.htm).
A.13. Estimated Total Annual Cost Burden to Respondents and Record Keepers
There is no burden on the respondents other than their time.
A.14. Estimated Annualized Cost to the Federal Government
The total annualized cost to the government is $194,548, which results from the following cost categories:
Contractor Costs to Conduct the Study. The cost of the three data collection activities is estimated at $465,049, including contract staff salaries. Spread over three years, the annualized cost is $155,016.
Cost of Federal Technical Staff. The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $39,532. It is anticipated that the annual level of effort to perform these duties will require 400 hours for one federal GS-14, step 4 employee based in Washington, D.C., earning $61.77 per hour (see the Office of Personnel Management 2019 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2019/DCB_h.pdf). To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6:
400 hours × $61.77 × 1.6 = $39,532.
Thus, the total annualized federal cost is $155,016 + $39,532 = $194,548. The total cost to the federal government over the three years of the evaluation is $583,644.
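For reference, these totals are derived as follows:
Contractor costs: $465,049 ÷ 3 years ≈ $155,016 per year
Federal technical staff: $39,532 per year (calculated above)
Annualized total: $155,016 + $39,532 = $194,548; over three years, $194,548 × 3 = $583,644.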
A.15. Reasons for Program Changes or Adjustments
This is a new data collection.
A.16. Plans for Tabulation and Publication of Results
Analysis Plan. The study team will summarize quantitative and qualitative data collected through the site visits, telephone interviews, and web-based survey using basic descriptive methods. Analysis of data from each source will follow a common set of steps involving data cleaning, variable construction, and computation of descriptive statistics. To facilitate analysis of each data source, the team will create variables to address the study’s research questions. The research team plans two different types of analyses: (1) a descriptive analysis cutting across all state RESEA programs based on the web-based survey and review of all state plans (supplemented by ETA-9128 and ETA-9129 data); and (2) an analysis of the subset of state RESEA programs that are the focus of the in-depth site visits to approximately 10 states (and approximately 20 local workforce areas) and subsequent follow-up telephone interviews with approximately 24 states. The analysis effort will yield both assessments of aggregate trends across all state RESEA programs and considerable detail about how RESEA programs are structured and operated within individual states and localities.
The research team plans to provide descriptive analyses, including tabulations and cross-state comparisons, that will document and assess RESEA programs at the national level and compare key program features and characteristics across states. This descriptive analysis will document and synthesize survey and fieldwork data for a considerable range of RESEA program characteristics and features, including:
the types and packages of services provided under current RESEA programs;
common RESEA program models or components;
characteristics of the UI claimants served (and those not served, such as claimants who fail to appear for RESEA sessions and are sanctioned);
state selection criteria and processes and their potential effects on populations served;
state views and understanding of DOL’s guidance on RESEA and how it might affect state RESEA programs and components;
state views on anticipated changes and their likely effects on RESEA program operations, services, and outcomes;
issues and challenges associated with implementing and operating RESEA, including views on the use and effects of evidence-based standards and practices; and
promising approaches and plans that states say they will introduce in response to RESEA requirements, and what states anticipate achieving with such changes.
Publications. In spring 2020, the Abt research team will produce an Interim Report on the Results of the Implementation Study that will synthesize findings across all sources of data collected as part of the implementation study. Additionally, the results of the implementation study data collection will be synthesized with results from other major components of the RESEA study into a Final Report of the RESEA Evaluation, submitted in fall 2021 (i.e., at the end of the three-year evaluation project).
A.17. Approval Not to Display the Expiration Date for OMB Approval
The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.
A.18. Exception to the Certification Statement
No exceptions are necessary for this information collection.
1 The RESEA provisions are contained in Section 30206 of the BBA, enacting new Section 306 of the SSA.
2 This evaluation effort also calls for evaluation technical assistance to help states to develop appropriate research and compliant State Plans.