
Reemployment Services and Eligibility Assessments (RESEA) Implementation Study

ICR REFERENCE NUMBER 201904-1290-001

April 2019



PART B: Statistical Methods


Overview


The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) has contracted with Abt Associates and its partners—the Urban Institute, Capital Research Corporation (CRC), and the National Association of State Workforce Agencies (NASWA)—to conduct a three-year evaluation to develop strategies to support the evidence requirements for the Reemployment Services and Eligibility Assessment (RESEA) program that were enacted as part of the Bipartisan Budget Act of 2018 (Public Law 115-123; hereafter the “BBA”). This Evaluation to Advance RESEA Program Evidence will help DOL establish rigorous and informative evidence standards and support DOL and states in conducting evaluations to expand the evidence base. The evidence standards and new evidence will each help states develop and implement RESEA programs that return Unemployment Insurance (UI) claimants to work more quickly.


One component of the evaluation is an implementation study of states’ current RESEA programs. There are three key data collection activities to be completed for the Implementation Study:

  1. Site visits with 10 state workforce agencies and 20 local workforce areas (i.e., two local workforce areas within each state);

  2. Telephone interviews with RESEA program administrators from 24 states; and

  3. A web-based survey of all states and territories operating RESEA programs.

We describe the statistical methods used for all three data collection activities below.


B.1. Respondent Universe and Samples


As of FY 2019, a total of 51 states and jurisdictions are operating an RESEA program (Unemployment Insurance Program Letter [UIPL] 07-19). Up to two additional states may apply to begin operating RESEA programs in FY 2020. To support the implementation study, an online survey will be administered to the universe of all current RESEA grantees, plus any new FY 2020 grant applicants, to systematically document program design and operations across the nation. The expected response rate is 80 percent, comparable to the rate we achieved in a prior survey of state workforce agencies.1

No statistical sampling will be used. The list of primary respondent names and email addresses for each grantee will be developed in collaboration with DOL staff. Only state RESEA program administrators (one in each state) will complete the survey.

Though this will be the first survey data collection activity for the RESEA evaluation, the evaluation team has extensive experience administering similar data collection instruments and has achieved high response rates with them. In addition, DOL's operating guidance for RESEA grants (UIPL 07-19) affirms that the department is committed to supporting evaluations of state RESEA programs.


The evaluation team will also conduct site visits to 10 state grantees, during which we will interview representatives from the state workforce agency and local RESEA program staff. The selected sample is intended to reflect a range of program sizes and geographic locations, though it is not intended to be strictly nationally representative. The universe of potential respondents for the site visits includes all states and jurisdictions operating RESEA programs, except Hawaii, Alaska, and the U.S. Virgin Islands. The sampling unit is the RESEA grantee. To select grantees for site visits, we will use implicit stratification by DOL region2 and randomly select states, weighting each state's probability of selection by the square root of the number of claimants served by the program in FY 2019.


The stratification by DOL region aims to ensure geographic diversity of programs included. The weighting by the square root of program size aims to balance the study’s empirical interests in understanding programs that are experienced by larger numbers of claimants (i.e., larger programs), while not weighting size so heavily that the study would be at risk of not having representation from smaller states that face their own unique program and evaluation challenges.
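To make the selection procedure concrete, the sketch below illustrates one way to implement the stratified, size-weighted draw as a systematic probability-proportional-to-size (PPS) sample. This is a minimal illustration rather than the study's specified tooling; the function name and input fields ('name', 'region', 'claimants') are assumptions.

```python
import math
import random

def select_states(frame, n=10, seed=None):
    """Systematic PPS selection with implicit stratification by region.

    `frame` is a list of dicts with keys 'name', 'region', and
    'claimants' (claimants served in FY 2019). Sorting by DOL region
    before drawing a systematic sample provides the implicit
    stratification; using the square root of program size as the
    measure of size gives larger programs a higher, but not
    overwhelming, probability of selection.
    """
    rng = random.Random(seed)
    ordered = sorted(frame, key=lambda s: s["region"])
    sizes = [math.sqrt(s["claimants"]) for s in ordered]
    step = sum(sizes) / n            # one selection per interval of this width
    point = rng.uniform(0, step)     # random start on the cumulative scale
    selected, cum = [], 0.0
    for state, size in zip(ordered, sizes):
        cum += size
        # Select this state for each selection point that falls in its
        # cumulative interval (a state spanning more than one interval
        # would in practice be taken with certainty).
        while point < cum and len(selected) < n:
            selected.append(state)
            point += step
    return selected
```

Under this scheme, each state's selection probability is approximately n times its share of the total square-root size, matching the weighting described above.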


Within those 10 states, we will randomly select one Workforce Development Board (WDB) area to visit from among WDBs located in predominantly metropolitan areas of the state. We will then purposively choose a second WDB area that is within driving distance of the first but has a higher proportion of rural residents. This approach minimizes costs to the government while ensuring that diverse types of local contexts are represented in the sample.
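A minimal sketch of this two-step local selection follows, assuming each WDB record carries a location and a rural-resident share; those fields are illustrative, driving distance is approximated here by straight-line (great-circle) distance, and taking the most rural nearby candidate is one plausible reading of the purposive second step.

```python
import math
import random

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def select_wdb_pair(wdbs, seed=None, max_km=160):
    """`wdbs`: list of dicts with 'name', 'latlon', 'metro' (bool), and
    'rural_share'. Returns (randomly selected metro WDB, purposively
    chosen nearby WDB with a higher rural share, or None if no
    candidate lies within range)."""
    rng = random.Random(seed)
    # Step 1: random draw from the state's predominantly metropolitan WDBs.
    first = rng.choice([w for w in wdbs if w["metro"]])
    # Step 2: among WDBs within rough driving range that are more rural
    # than the first pick, choose the most rural.
    candidates = [w for w in wdbs
                  if w is not first
                  and w["rural_share"] > first["rural_share"]
                  and haversine_km(w["latlon"], first["latlon"]) <= max_km]
    second = max(candidates, key=lambda w: w["rural_share"]) if candidates else None
    return first, second
```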


Following the site visits, we will conduct follow-up phone interviews with representatives from 24 state RESEA grantee agencies. The universe of potential respondents includes all states and jurisdictions operating RESEA programs (currently 51). The sampling unit is the RESEA grantee. We will select states for calls using the same stratified, weighted selection procedure used to select states for site visits.


B.2. Procedures for Collection of Information

This section describes the data collection procedures for the RESEA Implementation Study. The implementation study includes site visits with 10 state workforce agencies and 20 local workforce areas, telephone interviews with RESEA program administrators from 24 states, analysis of existing data sources and RESEA grant proposals, and a web-based survey of all states and territories operating RESEA programs.

Procedures with special subpopulations: With the RESEA grant program, DOL is interested in helping UI claimants return to work more quickly and, in turn, reduce the duration of their receipt of UI benefits. However, the proposed data collection efforts do not collect data from UI claimants participating in RESEA, but rather from staff of the state agencies that are RESEA grantees. Those staff do not constitute a special population.

Use of Periodic Data Collection Cycles to Reduce Burden: The grantee survey, site visits, and follow-up phone calls are each one-time data collection efforts.


Statistical Methods for Sample Selection and Degree of Accuracy Needed: The evaluation team will administer the survey to the universe of up to 53 RESEA grantees.3 Thus, no statistical methods will be used to select the survey sample.


The procedures for selection of states for site visits and telephone interviews are described above. The analyses of these data will be descriptive and will not involve any statistical tests of hypotheses. We therefore do not anticipate statistical challenges associated with selection of sample members or analyses of the data for either the site visits or interviews.


Using data collected from the survey, site visits, and phone interviews, the research team will perform descriptive analyses, including tabulations and cross-state comparisons, to document and assess RESEA programs and examine how key program features vary across states. These analyses will document and synthesize survey and fieldwork data on a wide range of RESEA program characteristics and features, including:

  • the types and packages of services provided under current RESEA programs;

  • common RESEA program models or components;

  • characteristics of the UI claimants served (and those not served, such as claimants who fail to appear for RESEA sessions and are sanctioned);

  • state selection criteria and processes, and their potential effects on the populations served;

  • states' views and understanding of DOL's guidance on RESEA and how those views might affect state RESEA programs and components;

  • states' views on anticipated changes and their likely effects on RESEA program operations, services, and outcomes;

  • issues and challenges associated with implementing and operating RESEA, including views on the use and effects of evidence-based standards and practices; and

  • promising approaches that states plan to introduce in response to RESEA requirements, and what states anticipate achieving through such changes.


We have designed each data collection instrument to ensure that the total information collected across all instruments will support the study's goals and address the key research questions. Exhibit B1 maps the key study questions for the Implementation Study to the data sources that will be used to address them.




Exhibit B1: Implementation Study Matrix of Key Study Questions and Data Sources

Exhibit B1 crosses the key implementation study research questions, organized by topic (A–H) below, with five data sources:

  • Review of state RESEA grant proposals;

  • Analysis of RESEA/UI claimant data (ETA 9128/9129 and UI program data);

  • Telephone and site visit interviews with a sample of state workforce agency administrators/staff;

  • Site visit interviews with a sample of local workforce agency and AJC administrators/staff; and

  • Web-based survey of all state workforce agencies.

A. Claimants Selected for RESEA

  • Who among, and what fraction of, the unemployed access UI benefits, and what are the characteristics of those who become eligible for UI benefits? For how long are benefits available and under what conditions? In what ways might these variations among states affect RESEA selection, participation, and program structure?

  • Who among UI claimants does the selection process target for the RESEA program to serve?

  • Subsequent to the BBA provisions (as described in UIPL 07-19), in what ways have states changed how they select participants for RESEA services?

  • Are there plans to change who among the UI claimant population the state targets for RESEA in the future? How? When?

  • To what extent is it perceived that the RESEA provisions of the BBA increased state flexibility with regard to targeting of claimants for RESEA?

  • Which claimants do staff perceive benefit most from RESEA?

B. RESEA Participation Levels

  • To what extent is there state-to-state variation in the percentage of initial UI claims (for which a payment was made) scheduled for a first RESEA session? Within states, to what extent is there year-to-year or local area variation?

  • What are the characteristics of RESEA participants? How do they vary across states? How much of the variation derives from broader UI, rather than RESEA, policies?

  • To what extent do states expect RESEA provisions of the BBA to impact RESEA participation levels and/or participant characteristics in FY 2019 and beyond?

C. RESEA Funding

  • What effect do states report that the 25% increase in the funding award between FY 2018 and FY 2019 had on their RESEA programs? What effect is it expected to have in the future (as funding is expected to increase further)?

  • In what specific ways have states and local areas used the increase in funding (e.g., evidence reviews, program planning and upgrades, program evaluation, added staff, expanding the RESEA program to other areas of the state)? What other changes are expected?

  • Subsequent to the funding increases, have states increased the numbers of UI claimants scheduled for RESEAs? If so, how?

  • Subsequent to the funding increases, how have states' targeting strategies changed regarding whom they select to receive RESEAs?

  • Subsequent to the funding increases (which states can use to deliver reemployment services), how have states changed the number of referrals to additional reemployment services in AJCs?

D. Initial RESEA Session

  • How do states notify selected UI claimants that they are required to participate in their initial RESEA session (e.g., online notification through the UI system, email, telephone, U.S. mail)? How much variation is there within and across states?

  • How many weeks into the claim does the first RESEA typically occur? How much variation is there within and across states?

  • How is the initial meeting scheduled? How much variation is there within and across states?

  • Where is the initial RESEA session conducted?

  • What is the basic format/content of initial RESEA sessions? To what extent does format/content vary across states and, within states, across localities?

  • Can any portion of the initial RESEA session(s) be conducted virtually or by telephone? If so, how is it conducted and under what circumstances?

  • What specific reemployment/career services are reviewed with the claimant? To which types of services are claimants referred?

  • Other than initial and subsequent RESEA sessions, are the services available to RESEA participants through the AJCs different from the standard array of services provided to other job seekers using the AJCs? If so, how?

  • What types of staff conduct the initial RESEA session (i.e., UI, Wagner-Peyser, other AJC staff, or some combination)? How does that vary across states?

  • How are referrals to additional reemployment services managed and tracked?

  • What is the feedback loop with UI on eligibility issues that are identified? How does that vary across states?

  • To what extent are initial RESEAs conducted in a group setting versus with individual attention? Which activities are group and which are individual?

E. Subsequent RESEA Session

  • To what extent are RESEA participants scheduled for subsequent RESEA sessions? How much variation is there across states (and localities) in whether and how many subsequent RESEA sessions are conducted?

  • In states where subsequent sessions are conducted, are all RESEA participants who attend the initial RESEA session required to attend a subsequent RESEA session? If not, who is and is not scheduled for a subsequent session?

  • What is the typical timing of the subsequent RESEA session(s) (i.e., during what payment week are RESEA participants typically scheduled)?

  • Are the locations at which subsequent RESEA sessions are conducted the same as those where initial RESEA sessions are conducted?

  • Can any portion of the subsequent RESEA session(s) be conducted virtually or by telephone? If so, how is it conducted and under what circumstances?

  • What is the basic format/content of subsequent RESEA sessions? To what extent does format/content vary across states and local areas?

  • What specific reemployment/career services are reviewed with the claimant? To which types of services are claimants referred?

F. Other Features of RESEA Programs

  • Across states, what is the relative balance between the emphasis on enforcement of eligibility requirements (e.g., work search, able and available) and reemployment assistance?

  • Where in the RESEA process does failure to report (FTR) most often occur? For what reasons do FTRs occur?

  • What are the consequences of FTR for the initial RESEA session (e.g., if a claimant fails to report, are benefits stopped just for that week until the person is able to report, or is the claimant sanctioned for a longer period)?

  • After an FTR, what does a claimant need to do to resume benefits?

  • What, if anything, has been done to reduce FTR in the state and in local areas?

  • Typically, what types of staff are involved in administering FTRs (e.g., Wagner-Peyser, WIOA, UI)?

  • What have states done to more fully integrate their RESEA programs with WIOA?

  • What are states' and localities' biggest challenges in operating the RESEA program?

G. Perceived Effective or Promising Practices

  • Which program elements do staff believe are most important for helping UI claimants return to work quickly?

  • What program elements, if any, do staff believe are ineffective?

  • What evidence have staff seen that informs their views of what does and does not work in effectively serving UI claimants?

  • What changes do staff believe would improve their programs?

H. Evidence-Based Strategies and Evaluation Requirements; Technical Assistance and Guidance

  • To what extent are states' current RESEA program models, or individual elements of them, based on research evidence? What is the evidence on which states base their current RESEA program models? To what extent are those models, or individual elements of them, being implemented with fidelity to those described in the evidence? Which models or elements that states are implementing are not evidence-based and could be studied in the future?

  • Has the state engaged in any efforts since the BBA to understand the evidence base and begin planning for the RESEA program evaluation requirements? If so, what is the nature of those efforts and who has been involved (UI, WIOA, LMI, etc.)?

  • Have states been part of any DOL-funded, state-funded, foundation-funded, or other national evaluations of their REA and/or RESEA programs in the past 10 years?

  • Are states planning to use current RESEA grant funds to conduct evaluation(s) of their RESEA program? If so, what types of studies are planned and what interventions will be assessed?

  • Are states planning to use grant funds in future years to conduct evaluation(s) of their RESEA programs? Will states use the full 10% of their grants to fund evaluation(s)? What types of studies are planned and what interventions will be assessed?

  • To what extent are states interested in and/or planning on being part of larger evaluations across state lines (i.e., pooling grant funds with other states to conduct evaluations with larger sample sizes)?

  • Would states be willing to participate in future DOL-sponsored studies of RESEA programs?

  • What do states think of BBA requirements that tie funding to evidence-based interventions?

  • How prepared do states feel they are to conduct impact evaluations (i.e., evaluations that typically involve a treatment and control group)? What concerns do they have about evaluation?

  • Moving forward, do states need additional technical guidance (e.g., in the form of UIPLs or webinars) to effectively and efficiently implement the BBA requirements with regard to RESEA? If yes, what types of guidance are needed and for which provisions of the BBA?

  • Moving forward, do states need evaluation technical assistance to effectively and efficiently implement the BBA requirements with regard to evidence-based practices? If yes, in what specific areas is this assistance needed and how should DOL provide it (e.g., webinars, written guidance)?


B.3. Maximizing Response Rates and Addressing Nonresponse


This section describes the methods to maximize response rates for the RESEA web-based survey, site visits, and follow-up telephone interviews. The RESEA Implementation Study data collection efforts are heavily dependent on gaining grantees’ cooperation, buy-in, and collaboration. The evaluation team believes that grantees are interested in supporting DOL efforts to expand understanding of the RESEA program and thus are willing to participate in an evaluation to build the evidence base around this program. Further, the evaluation team is committed to providing the support and guidance to ensure minimal burden.


Survey


DOL’s Office of Unemployment Insurance (OUI) is committed to working with states to maximize the number of grantees that respond to the survey. The evaluators thus expect an 80 percent response rate on the web-based survey described in this package. To achieve a high response rate on the grantee survey, the evaluation team will take the steps outlined below:


  • Upon receipt of the list of RESEA program administrators from DOL, we will verify the accuracy of the contact information provided. We will check publicly available information or attempt to contact each prospective respondent to confirm that the survey will reach the intended recipient.

  • OUI (either the federal or regional office) will send advance letters or emails to all state RESEA administrators to notify them of the forthcoming survey. The letter will specify the survey’s purpose, general instructions on how to access and complete the survey, the completion deadline, and a point of contact should the respondent have questions about the survey.

  • On the scheduled date for the release of the survey, the Abt team will email the state UI or RESEA contact provided by DOL and include a link to the online survey and instructions for how to complete it.

  • The survey will be designed to minimize burden. In particular, it will include only questions that the state's RESEA administrator would be expected to be able to answer. That said, respondents may choose to share the survey with others involved in their RESEA programs if they feel that input from others would help complete the instrument. Respondents will be able to save responses to specific questions at any time and return later to complete remaining items. Respondents who are unable to complete the survey online may complete it by phone.

  • Respondents will have six weeks to complete the survey. States will submit one online survey response. The Abt team will track which states have started the survey and monitor their progress. With three weeks remaining in the survey period, the Abt team will send an electronic reminder of the due date for the survey and an offer of technical support for any states having difficulty completing it.

As surveys are completed, the research team will check responses to ensure that they are complete. If any submitted survey responses appear systematically flawed, the team will follow up with respondents for confirmation or clarification. We will also ask for responses to any unanswered items. Any necessary revisions will be made via email or telephone contact with the state’s survey respondent. The Abt team will follow up with each state that has not submitted a survey instrument by the due date. NASWA will assist in this effort to ensure a high response rate from states.

The grantee survey will be used to systematically gather data on current RESEA program operations and plans for future program development for the full universe of current RESEA grantees and applicants. Estimation procedures will be, for the most part, very simple. The survey items are predominantly simple multiple-choice items and do not cover any sensitive topics for which refusals would be a concern, nor do items rely on recall for which measurement error would be an issue. Nonetheless, a small amount of item nonresponse may occur. To address any item nonresponse, we will use logical imputation or imputation based on existing knowledge wherever feasible. Where that is not possible, we will fill in missing survey data elements using multiple imputation routines available in standard statistical software, such as Stata's mi command. Such imputation uses statistical relationships among items, estimated from respondents for whom the items are not missing, to estimate values for respondents who are missing some items but have answered others.
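The plan above names Stata's mi command. Purely as a hedged illustration of the same multiple-imputation logic in another environment, the sketch below uses scikit-learn's IterativeImputer (chained equations) on a numeric item matrix; the function name and parameters are assumptions, not the study's specified procedure.

```python
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates IterativeImputer)
from sklearn.impute import IterativeImputer

def multiply_impute(responses: pd.DataFrame, m: int = 20, seed: int = 0):
    """Return `m` completed copies of a numeric survey-item matrix.

    Each copy is imputed via chained equations; sampling from the
    posterior predictive distribution makes the copies differ, which is
    what lets downstream estimates reflect imputation uncertainty.
    """
    completed = []
    for i in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=seed + i)
        filled = imputer.fit_transform(responses)
        completed.append(pd.DataFrame(filled, columns=responses.columns,
                                      index=responses.index))
    return completed
```

Analyses would then be run on each completed copy and the results combined (e.g., with Rubin's rules), which is what Stata's mi estimate automates.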

One survey will be administered per grantee, and we anticipate an 80% response rate. To ensure that survey responses are nationally representative of all programs, we will weight respondents to account for any unit nonresponse. Weights will incorporate key dimensions along which states differ that may be consequential for program operations and claimants' outcomes, including region, program size, urbanicity, and unemployment rates. We will finalize weighting options in discussion with DOL during the analysis phase.
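As a sketch of the weighting step, the code below implements a simple cell-based nonresponse adjustment, one common approach; the package does not fix the exact estimator, and the cell variables (here region and an assumed size class) are illustrative.

```python
import pandas as pd

def nonresponse_weights(universe: pd.DataFrame, respondents: pd.DataFrame,
                        cells=("region", "size_class")):
    """Weight each respondent by (grantees in its cell in the universe) /
    (respondents in its cell), so weighted tabulations reproduce the
    universe's composition on the weighting cells."""
    keys = list(cells)
    universe_n = universe.groupby(keys).size()
    respondent_n = respondents.groupby(keys).size()
    weight = (universe_n / respondent_n).rename("weight")
    # Attach each respondent's cell weight; cells with no respondents
    # would need to be collapsed with neighboring cells beforehand.
    return respondents.join(weight, on=keys)
```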

Site Visits and Calls

We are also committed to ensuring full participation in the site visits and follow-up phone interviews among states selected for those data collection efforts. We again anticipate that grantees are interested in supporting DOL efforts to expand understanding of the RESEA program and thus are willing to participate in the visits and phone interviews. Consistent with those expectations and Abt's prior experience, we expect a 100 percent response rate. Abt recently achieved a 100 percent response rate with grantee sites selected for visits for the American Apprenticeship Initiative Evaluation, several of which were state agencies. Similarly, between January and February 2019 we held exploratory calls with a set of nine state workforce agencies and successfully conducted calls with all agencies that had been selected.

To facilitate participation, the evaluation team will take the steps outlined below:

  • DOL’s OUI will send a letter and email to selected states, requesting their participation in the site visits and phone interviews, as appropriate.

  • Each state’s DOL Federal Project Officer will be available to help answer questions and address concerns about participation.

  • The Abt evaluation team will follow up on the contact from OUI to explain the details of states’ participation. We will conduct the site visits and phone calls to minimize the burden on states, scheduling interviews and visits based on states’ availability and staffing.

  • NASWA will also encourage states to participate and will help address concerns. The grantee organizations are members of NASWA and have high levels of mutual trust and support.

To ensure that we interview the target number of respondents and maintain the intended diversity of states, in the event that any state refuses or is unable to participate, we will replace it with a similar backup state. When replacing a selected grantee, we will select from within the same DOL region, choosing the state in that region that is most similar in program size (i.e., number of claimants served) to the state being replaced, as sketched below. Note that while tabulations of findings from site visits and calls are meant to reflect a diversity of program designs and contexts, they do not aim to be strictly nationally representative. Nor will statistical tests be conducted. Thus, no weights will be used in the analyses, nor will variances be estimated.
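A minimal sketch of this replacement rule follows, with illustrative field names (the declining state's region and claimant count are matched against the pool of unselected states).

```python
def replacement_state(declined, pool):
    """Backup for a state that cannot participate: same DOL region,
    closest in program size (claimants served). Assumes at least one
    unselected state remains in the region."""
    same_region = [s for s in pool if s["region"] == declined["region"]]
    return min(same_region,
               key=lambda s: abs(s["claimants"] - declined["claimants"]))
```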



B.4. Test Procedures


This section describes tests of the data collection instruments included in this submission.

The grantee survey was developed and reviewed by DOL staff and evaluation team members. The evaluation team pilot-tested the survey with three state RESEA program administrators to ensure that the instrument as a whole and each question it contains are clearly written and comprehensible, and that response options are logical and exhaustive. Pilot respondents were selected in consultation with DOL and NASWA. Pilot respondents provided feedback on the experience of completing the survey in telephone conversations with the evaluation team member who administered it: they commented on the clarity of the questions, suggested additional areas of inquiry or questions to add, identified questions for deletion or revision, and rated the ease of completion. The pilot also allowed us to confirm that the instrument could be completed within an acceptable length of time. Following the pilot test, the evaluation team made any needed changes to the survey instrument, web-based programming, and survey procedures. DOL reviewed and approved the final survey instrument. To reduce burden while ensuring that all responses are accurate and up to date, grantees that completed the survey during the pilot will be given their completed surveys to review and update when the full survey is fielded.

The interview protocols for the site visits and telephone interviews were developed following the completion of nine exploratory calls with state workforce agency representatives in January and February 2019. During these calls, evaluation team members asked questions similar to those included in the implementation study protocols, through which the team received feedback on question topics and wording. We will train interviewers to probe respondents for their understanding of the interview questions. If respondents need clarification, interviewers will provide additional information so as to minimize nonresponse.

B.5. Individuals Consulted on Statistical Aspects of the Design


With DOL oversight, Abt and its partners are responsible for conducting the RESEA Evaluation. The individuals listed in Exhibit B2 below contributed to the design of the implementation study. Because the survey is being administered to the full universe of state RESEA programs, the study will not require complex statistical methods for sample selection; the straightforward procedures for addressing survey nonresponse (weighting and imputation) are described in Section B.3. The data collected for the implementation study will be analyzed under the direction of John Trutko of Capital Research Corporation. Dr. Andrew Clarkwest, the Project Director for the RESEA evaluation, will have oversight of all sub-studies and data collection efforts.

Exhibit B2: Individuals Consulted on Data Collection


Name | Affiliation | Telephone Number | Role in Study
Megan Lizik | DOL Chief Evaluation Office | (202) 693-5911 | DOL COR
Gay Gilbert | DOL Office of Unemployment Insurance | (202) 693-3428 | Administrator of the program office for RESEA (DOL/OUI); subject matter expert
Larry Burns | DOL Office of Unemployment Insurance | (202) 693-3141 | Reemployment Coordinator in the program office for RESEA (DOL/OUI); subject matter expert
Gloria Salas-Kos | DOL Employment and Training Administration | (202) 693-3596 | Subject matter expert
Andrew Clarkwest | Abt Associates | (301) 347-5065 | Project Director
Jacob Klerman | Abt Associates | (617) 520-2613 | Co-Principal Investigator
Demetra Nightingale | Urban Institute | (202) 261-5384 | Co-Principal Investigator
Glen Schneider | Abt Associates | (617) 349-2471 | Project Quality Advisor
John Trutko | Capital Research Corporation | (703) 522-0885 | Implementation Study Lead
Teresa Doksum | Abt Associates | (617) 349-2896 | IRB Reviewer


Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:


Andrew Clarkwest

Project Director

(301) 347-5065

Megan Lizik

Contracting Officer’s Representative, Chief Evaluation Office

(202) 693-5911

1 Gan, K. N., G. Schneider, E. Harvill, and N. Brooke. (2013). Technology-Based Learning in the Workforce Investment System. Prepared for the U.S. Department of Labor. Cambridge, MA: Abt Associates Inc.

2 We will allocate states consistent with the regions administered by the Employment and Training Administration: https://www.doleta.gov/regions/

3 Currently only 51 grantees exist. It is possible that up to two more states and jurisdictions will become grantees by the time the survey is administered.
