OMB: 1225-0089

Evaluation of the Unemployment Compensation Provisions of the American Recovery and Reinvestment Act of 2009

OMB Supporting Statement:
Part A

October 12, 2012

Authors (in alphabetical order):

Heinrich Hock

Brandon Kyler

Annalisa Mastri

Julita Milliner-Waddell

Karen Needels

Patricia Nemeth

Walter Nicholson

Frank Potter

Grace Roemer

Linda Rosenberg

Wayne Vroman*

*The Urban Institute







Contract Number:

GS10F0050L/DOLF109631341

Mathematica Reference Number:

06863.450

Submitted to:

U.S. Department of Labor

Office of the Chief Evaluation Officer

200 Constitution Avenue NW

Washington, DC 20210

Project Officer: Jonathan A. Simonetta



Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Karen Needels




CONTENTS

PART A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

1. Circumstances Necessitating the Data Collection

2. How, by Whom, and for What Purpose the Information Is to Be Used

3. Uses of Improved Technology to Reduce Burden

4. Efforts to Identify Duplication

5. Methods to Minimize Burden on Small Businesses or Entities

6. Consequences of Not Collecting the Data

7. Special Data Collection Circumstances

8. Federal Register Notice

9. Respondent Payments

10. Privacy

11. Questions of a Sensitive Nature

12. Hour Burden of the Collection of Information

13. Estimated Total Annual Cost Burden to Respondents and Record Keepers

14. Estimated Annualized Cost to the Federal Government

15. Changes in Burden

16. Publication Plans and Project Schedule

17. Reasons for Not Displaying Expiration Date of OMB Approval

18. Exceptions to the Certification Statement

REFERENCES

Appendix A: UI Recipient Survey

Appendix B: Survey of UI Administrators

Appendix C: Master Site Visit Protocol

Appendix D: Data Systems Survey

Appendix E: 60-Day Federal Register Notice

Appendix F: 30-Day Federal Register Notice

Appendix G: Respondent Mailings







TABLES

A.1 Summary of Topics to Be Covered by the Evaluation

A.2 Data Elements in the UC Recipient Survey, by Purpose

A.3 Survey of UI Administrators

A.4 Site Visit Topics by Respondent

A.5 Burden Estimates for Data Collection Efforts

A.6 Study Task by Cost

A.7 Schedule for Project Tasks





PART A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

The U.S. Department of Labor (DOL) contracted with Mathematica Policy Research (Mathematica) to conduct an evaluation of the unemployment compensation (UC) provisions of the American Recovery and Reinvestment Act (ARRA) of 2009. The evaluation is designed to provide insights into five topics: (1) states’ decisions to adopt certain UC-related reforms encouraged by ARRA, (2) states’ implementation experiences with these ARRA UC provisions, (3) the characteristics of recipients of different types of unemployment benefits during the time ARRA-related UC benefits were available, (4) the impact of ARRA UC provisions on recipients’ outcomes, and (5) additional research questions about the influence of the UC provisions of ARRA on macroeconomic issues and state unemployment insurance (UI) trust funds.

This package requests clearance for three data collection efforts conducted as part of the evaluation:

  1. A Survey of UI Recipients. This survey will yield data from a nationally representative sample of 2,400 UI recipients in 20 randomly selected UI jurisdictions from among the 50 states and the District of Columbia; topics to be covered include the recipients’ employment and financial characteristics prior to their period of unemployment and their experiences during and after receipt of benefits. The UI recipient survey is presented in Appendix A.

  2. A Survey of UI Administrators. This survey will yield data about the decision-making and implementation experiences of UI administrators in all 50 states and the District of Columbia. The survey of UI administrators is presented in Appendix B.

  3. Site Visit Data Collection. In-person visits to 20 purposively selected states and a data systems survey to be provided to state-level staff prior to those in-person visits will provide qualitative and in-depth information about the states’ experiences deciding whether to adopt the UC-related provisions of ARRA as well as their experiences with implementation of these and other provisions. A master protocol for the visits and the data systems survey are included in Appendixes C and D, respectively.

1. Circumstances Necessitating the Data Collection

The recession that began in late 2007 posed major challenges for the UC system. Although the unemployment rate, which exceeded 10 percent, captures one dimension of the severity of the recession, perhaps the most significant indicator of the challenges was the steep increase in unemployment duration. The median duration of unemployment rose from a relatively normal 8.5 weeks in 2007 to 23 weeks by mid-2010. Similarly, the percentage of the unemployed who experienced spells longer than 26 weeks rose from 18 percent to 46 percent. More generally, the recession raised anew questions about whether a system designed in the 1930s continues to meet the needs of today’s unemployed workers.

The policy response to the recession, including passage of ARRA, was timely and extensive. The overarching objective of the evaluation being conducted for DOL is to assess the efficiency and effectiveness of the UC-related provisions of ARRA and other actions by the federal government. The remainder of this section provides information about those provisions (Section a) and an overview of the evaluation and its data needs (Section b).

a. The UC-Related Provisions of ARRA

The major UC-related provisions of ARRA and related legislation can be grouped into three categories: (1) provisions to extend the number of weeks of unemployment benefits available to workers who exhausted their entitlement to state-financed benefits (known as “exhaustees”); (2) provisions intended to encourage states to modernize their programs in response to certain changes over time in the labor market and technology; and (3) other provisions intended to help states or unemployed workers weather the recession. Each type of provision is discussed in turn.

1. Provisions to Extend Additional Benefits to the Long-Term Unemployed

On June 30, 2008, then-President George W. Bush signed Public Law 110-252 (henceforth referred to as the Emergency Unemployment Compensation Act of 2008, or EUC08), which provided up to 13 weeks of additional UC benefits to workers who exhausted their entitlements under regular state UI programs. In late 2008, benefits available under this “first tier” of emergency benefits were extended to 20 weeks. This was ultimately followed by three more tiers of benefits enacted throughout 2009, providing 14, 13, and 6 extra weeks of benefits, respectively. Additional changes were made to expand the availability of benefits through the Extended Benefits (EB) program, a long-standing program that provides additional weeks of benefits to unemployed workers in states with unemployment rates above certain thresholds. In contrast to the EUC08 program, EB benefits can be triggered automatically once a state surpasses an insured unemployment rate (IUR) or a total unemployment rate (TUR) threshold. By late 2009, unemployed workers who met program eligibility requirements could collect up to 99 weeks of unemployment benefits—26 from the regular UI program, 53 from the EUC08 program, and 20 from the EB program. The program’s expiration date was extended several times by legislation throughout 2010 as labor markets remained weak and, on several occasions, gaps in coverage that arose after expiration of the program were averted through retroactive implementation of an extension of the program. Between July 2008 and March 2011, more than $136 billion in benefits was paid through the EUC08 program, and more than $19 billion through the EB program. Currently, UC recipients may receive EUC08 benefits as late as January 2, 2013; after that point, any remaining EUC08 benefits to which a recipient would be entitled will be lost.
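The 99-week maximum decomposes as the following worked sum; all figures are restated from this paragraph:

```latex
\[
\underbrace{26}_{\text{regular UI}}
+ \underbrace{(20 + 14 + 13 + 6)}_{\text{EUC08 tiers I-IV}\;=\;53}
+ \underbrace{20}_{\text{EB}}
= 99 \text{ weeks}
\]
```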

2. Provisions to Encourage UI Modernization

The federal government apportioned $7 billion in incentive funds across states for the adoption of specific policies designed to increase access to benefits or the generosity of benefits for certain types of unemployed workers, given changes in the labor market and technological capabilities over time. The incentive program began upon passage of ARRA, and states had until August 22, 2011, to apply for the funds. Upon approval of the states’ applications, the modernization money is deposited into the state trust fund accounts maintained at the U.S. Treasury; however, unlike UI taxes deposited in these trust funds—which can be used only to pay benefits—the modernization funds can also be used to support administrative activities in the UI and Employment Service programs or worker adjustment activities such as job search assistance and counseling.

The incentives were structured such that a state had to adopt (or already have in place) an alternate base period (ABP; described further below) in order to receive one-third of the state’s total allocation of these incentive funds. Then, by adopting (or already having in place) two of the four remaining policies, the state could receive the remaining two-thirds of its share. The five provisions related to the incentives are described below.
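Before turning to those provisions, the following minimal sketch makes the share structure concrete. The provision keys and the `incentive_share` function are hypothetical labels invented for this illustration, not statutory terms; the one-third/two-thirds structure comes from the paragraph above.

```python
from fractions import Fraction

# Hypothetical keys for the four modernization policies other than the ABP.
REMAINING_PROVISIONS = {
    "part_time_work",
    "compelling_family_reasons",
    "dependents_allowance",
    "training_extension",
}

def incentive_share(has_abp: bool, other_provisions: set) -> Fraction:
    """Fraction of a state's total incentive allocation it qualifies for."""
    if not has_abp:
        return Fraction(0)  # the ABP is the gateway to any incentive funds
    share = Fraction(1, 3)  # one-third for adopting (or already having) an ABP
    if len(other_provisions & REMAINING_PROVISIONS) >= 2:
        share += Fraction(2, 3)  # remaining two-thirds for two of the four others
    return share

# A state with an ABP plus two of the other provisions receives its full share.
print(incentive_share(True, {"part_time_work", "training_extension"}))  # -> 1
```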

Alternate Base Period. Traditionally, UI eligibility is based on a base period, which includes the unemployed worker’s earnings in the first four of the last five completed calendar quarters. This time frame has been used because of lags in the processing of paper-copy data provided to the state by employers about their employees’ earnings. However, the use of the first four of the last five completed calendar quarters can result in a gap of up to six months between the end of a base period and the time a worker applies for UI benefits. With increased use of electronic data processing, the length of the time lag between the end of a calendar quarter and the availability to the state of data to use in determining a UI claimant’s eligibility has diminished. Under an ABP, the benefit amount is usually calculated using the four most recent completed quarters of earnings, rather than the traditional base period. Prior to ARRA, some states already included ABP provisions in their laws (Vroman 1995).

ARRA incentives aimed to encourage more states to adopt ABPs, a method to expand UC system coverage to additional workers. A state could still qualify for incentive funds even if it specified that the ABP would be used only for claimants who did not qualify for benefits under a traditional base period; all states that implemented a new ABP have used this restriction.

Part-Time Work Provision. Under this provision, individuals seeking part-time work (as defined by state UI law) are eligible for UI benefits. Historically, workers seeking part-time work were not eligible for UI benefits in about half the states.

Compelling Family Reasons Provision. Traditionally, eligibility for UI benefits hinged upon whether or not a worker lost a job through no fault of his or her own. Thus, historically, workers who quit their jobs were not eligible for UI benefits; however, the reasons for quitting a job that states deemed allowable for UI purposes have varied. For example, a worker who quit his or her job after being subject to sexual harassment on the job might be allowed by the state to collect benefits. This ARRA modernization provision expands the definition of what constitutes an acceptable reason for voluntarily quitting a job to include “compelling family reasons,” thereby limiting disqualifications for benefits. For instance, individuals who quit their jobs to take care of a sick family member or follow a spouse who is relocating are not disqualified from receiving benefits under this expanded definition.

Dependents’ Allowance Provision. Under this provision, eligible recipients may collect a dependents’ allowance of at least $15 per week per dependent, in addition to regular UI benefits; states may impose a cap on the dependents’ allowances of $50 per week or 50 percent of the individual’s weekly benefit amount (the amount of benefits to which an individual is entitled if he or she has neither earnings nor other causes of deductions in benefits for the week). When a state has a dependents’ allowance provision, whether or not the provision was in existence prior to ARRA, the dependents’ allowance is paid with EB and EUC08 benefits as well as regular UI benefits.
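As an illustration of the allowance arithmetic just described, the sketch below applies the $15-per-dependent floor and a cap. Because the text leaves open which cap a given state imposes, the sketch assumes, purely for illustration, that the tighter of the two applies; the function name and example amounts are invented.

```python
def weekly_dependents_allowance(num_dependents: int, weekly_benefit_amount: float) -> float:
    """Allowance of at least $15 per week per dependent, subject to a state cap."""
    uncapped = 15.0 * num_dependents
    # Cap options named in the text: $50 per week or 50 percent of the weekly
    # benefit amount. For illustration, apply whichever is tighter.
    cap = min(50.0, 0.5 * weekly_benefit_amount)
    return min(uncapped, cap)

# Example: four dependents yield $60 uncapped, but with a $300 weekly benefit
# amount the $50 flat cap binds (50% of WBA = $150 > $50).
print(weekly_dependents_allowance(4, 300.0))  # -> 50.0
```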

Training Provision. Under this provision, benefits are extended for 26 weeks for UI exhaustees who are enrolled in and making satisfactory progress in certain training programs, such as state-approved programs and those authorized by the Workforce Investment Act.

3. Other UC-Related Provisions

In addition to these provisions aimed at providing additional benefits to the long-term unemployed and encouraging states to modernize their UI programs, ARRA and related legislation contained several other UC-related provisions. Generally speaking, they were intended to provide additional assistance to unemployed workers or states to help them weather the recession. The provisions that are within the scope of the evaluation include (1) the establishment of Federal Additional Compensation (FAC), which added $25 per week to UC weekly benefit amounts until it expired on December 7, 2010; (2) a reduction in federal taxation of UC benefits by making the first $2,400 received during calendar year 2009 exempt from the federal income tax; and (3) suspension of interest payments on all state trust fund loans in 2009 and 2010. The net result of these changes and other UC-related provisions of ARRA was that the federal government came to play a much larger role in the UC system than had been the case in previous recessions.

b. Overview of the Evaluation and Its Data Needs

Because of the wide range of UC-related provisions of ARRA, the questions that the evaluation is designed to answer are numerous. Questions related to the additional weeks of benefits include: How well did these expanded benefits meet the needs of unemployed workers during the recent recession? How did these expansions affect workers’ labor supply and other decisions? What administrative difficulties did states encounter in providing EUC08 benefits and related enhancements to recipients? To what extent were extended benefits timed to mitigate the effects of the economic downturn? Questions related to the modernization provisions include: What factors led states to adopt specific modernization features? What factors provided the greatest deterrents to their adoption? How difficult was it for states to adjust their existing UI laws and procedures? In the end, how much change actually resulted from the modernizations? How did the various modernization initiatives affect the pool of eligible workers? What were the characteristics of newly eligible workers and what were their experiences with the UC program? Finally, questions related to the other UC-related provisions of ARRA include: How did other provisions in ARRA, such as the FAC and the waiver of taxation, influence the ability of recipients to maintain household income? What were the implications of the temporary waiver of interest payments on outstanding loans to states for states’ trust funds and the administration of benefits?

To address these and the other research questions, the evaluation will include questions within five broad topics. Table A.1 summarizes the five topics and the relevant data sources and analytic methods that will be used to address questions within each.

As shown in Table A.1, this package contains a request for clearance for three types of data:

  • Survey of UC Recipients. This questionnaire will be administered to 2,400 UC recipients to collect detailed information on their demographic characteristics; UC program experiences; labor market experiences before, during, and after receipt of UC benefits; and household income, measures of financial well-being, receipt of other government benefits, and participation in training. A two-stage selection process (described in Section A.2) will be used to produce nationally representative estimates in a cost-effective manner. The UC recipients to be included in the survey began receiving UI program benefits between January 1, 2008, and September 30, 2009. Key outcomes for analysis of the survey data will include the duration that recipients received benefits, the amount of benefits received, the duration of initial unemployment, reemployment earnings, and postclaim financial hardships.





Table A.1. Summary of Topics to Be Covered by the Evaluation

Topic 1: States’ decisions to adopt UC-related reforms encouraged by ARRA

Illustrative subtopics:

  • State contextual factors associated with decision to enact the provision and timing of that decision

  • Processes by which states selected modernization provisions to adopt

  • Effects of EUC08 program on state EB policies

Data sources included as part of this clearance request (note a): survey of UI administrators; site visits/data systems survey

Main analytic methods: descriptive analysis; cross-state regression analysis; qualitative analysis of contextual influences on state decisions

Topic 2: States’ implementation experiences with the ARRA UC provisions

Illustrative subtopics:

  • Duration, costs, and challenges associated with implementation

  • Impacts of greater benefit use on program administration

  • States’ responses to incentive payments and interest-free loan period

Data sources: survey of UI administrators; site visits/data systems survey

Main analytic methods: numeric counts of interview respondents who report specified implementation experiences; qualitative implementation analysis

Topic 3: Characteristics of recipients of different types of unemployment benefits

Illustrative subtopics:

  • Unemployment duration, demographic characteristics, and post-UC labor market outcomes of recipients who did and did not receive extensions of benefits

  • Access to and distribution of benefits associated with state UI policies

Data sources: UI recipient survey

Main analytic methods: cross-tabular analysis; propensity score matching; hazard analysis; benefits simulation

Topic 4: Impacts of the ARRA UC provisions on recipients’ outcomes

Illustrative subtopics:

  • Effect of UC benefit receipt on unemployment duration and reemployment earnings

  • Effects of replacement rate, potential UC duration, and modernization reforms on recipients’ outcomes

Data sources: UI recipient survey

Main analytic methods: differences-in-differences estimation; regression discontinuity designs; instrumental variables estimation

Topic 5: Influence of the UC provisions of ARRA on macroeconomic issues and state UI trust funds

Illustrative subtopics:

  • Extent to which the timing of EB and EUC08 program triggers and benefit dollars were countercyclical

  • Contributions of benefit enhancements to stabilization or exacerbation of macroeconomic conditions or both

Data sources: none

Main analytic methods: aggregate panel data and time series analysis

a To address research questions in the five topics, the evaluation also will use other types of data that are not part of this clearance request. They include publicly available state and national UI program data, economic data, and data on states’ UI laws as well as state-provided administrative data on UI recipients and wage-earners in states that are part of the individual-level data analysis.

ARRA = American Recovery and Reinvestment Act of 2009; EB = Extended Benefits program; EUC08 = Emergency Unemployment Compensation Act of 2008; TUR = total unemployment rate; UC = unemployment compensation; UI = unemployment insurance.



  • Survey of UI Administrators. UI administrators from all 50 states and the District of Columbia will be surveyed to collect uniform information on states’ decisions to adopt various UC features and their experiences implementing the ARRA UC provisions. This survey will yield information on the economic and political determinants of states’ decisions, the timing and duration with which states implemented new provisions, and plans to modify or repeal new provisions. Most of the information collected through this survey will be responses to closed-ended questions to facilitate statistical analysis.

  • Site Visit Data. On-site interviews and a data systems survey with UC stakeholders from 20 purposively selected states will be conducted. The states will be selected after information from the survey of UI administrators is available; they will be selected to represent a broad range of experiences, including states that adopted all of the optional UC provisions of ARRA, states that adopted some of them, and states that adopted none. Other characteristics of the states, such as features of their UI programs and their economic and political landscapes, also will be taken into account to ensure diversity. The on-site interviews will be conducted with state UI administrators, other state UI staff, and staff of UI call centers and One-Stop Career Centers. With a focus on the state-level perspective, the interviews will provide in-depth, qualitative data on states’ decision-making and implementation experiences to complement the data gleaned from the survey of UI administrators. As part of the site visit data collection, state staff will be asked to complete a brief survey, the data systems survey, about the influences of the ARRA provisions on their data systems. This survey will be provided to state UI staff shortly before the on-site visit, and the answers will be discussed with staff during the visit.

Complementary quantitative and qualitative methods will be used to address study questions. For instance, qualitative information on the political and social context that shaped states’ decision making will supplement cross-state regression analysis to assess the determinants of states’ adoption of key UC provisions. In some cases, the same data sources—for example, site interviews conducted as part of the implementation study—will be subject to both quantitative analysis (including numeric counts of respondents who report specified implementation experiences) and qualitative analysis (descriptions of common patterns across sites). The impact analysis will use several methods—including differences-in-differences estimation and regression discontinuity designs—to assess impacts on recipients’ outcomes. (Section B.2 provides additional description of the analytic methods.)
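To make the impact logic concrete, here is a minimal sketch of the differences-in-differences comparison named above. The group means are invented solely for illustration and are not study findings.

```python
def diff_in_diff(treated_post: float, treated_pre: float,
                 comparison_post: float, comparison_pre: float) -> float:
    """DiD estimate: the change in an outcome for recipients exposed to a
    benefit expansion, net of the contemporaneous change for unexposed
    comparison recipients."""
    return (treated_post - treated_pre) - (comparison_post - comparison_pre)

# Hypothetical means (average weeks of initial unemployment): exposed
# recipients' mean spell rose from 24 to 30 weeks while comparison
# recipients' rose from 24 to 28, implying a 2-week impact.
print(diff_in_diff(30.0, 24.0, 28.0, 24.0))  # -> 2.0
```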

The evaluation will convey findings in three reports: (1) a modernization report, (2) an emergency benefits report, and (3) an impacts report. The modernization report will contain analysis of states’ decisions about the UI modernization provisions and their experiences implementing these and other UC-related provisions of ARRA. The emergency benefits report will contain analysis of states’ experiences regarding EB and emergency benefits extensions; it also will include an examination of the characteristics of recipients affected by the extensions of benefits. The impacts report will cover estimates of the impacts of the ARRA UC provisions on recipients’ outcomes. Although the focus of each report is distinctive, the second and third reports will build upon earlier analyses and findings.

2. How, by Whom, and for What Purpose the Information Is to Be Used

Clearance is being requested for three data collection efforts: (1) the UI recipient survey, (2) the survey of UI administrators, and (3) the site visit data collection, which includes a master site visit protocol and a data systems survey. Each data collection effort is described in a subsection below.

a. UI Recipient Survey

The individual-level analyses conducted for this study were commissioned by DOL to determine how the experiences of job losers were affected by the expansions to the UC system enacted by the federal government in response to the recent recession. The study’s impact evaluation seeks to measure the effects of EUC08 benefits and other ARRA-based changes to the UC system on labor market, training, and financial outcomes of UI recipients. To put the impact estimates in context, descriptive analyses will also provide DOL with an understanding of the socioeconomic and demographic characteristics of unemployed workers served by the UC system during the recent recession. Because most of these characteristics and outcomes are either imperfectly measured or not measured at all in administrative and extant survey data, Mathematica will conduct a survey of UI recipients to gather the unique data needed for this evaluation.

1. Selection of the Interview Sample

The survey will be administered to a nationally representative sample of UI recipients identified from administrative claims records using a two-stage cluster sampling strategy. In the first stage, a sample of 20 out of the 51 major UI jurisdictions (the 50 states and the District of Columbia) will be randomly selected from which to gather the administrative data to locate recipients (the sampling frame). In the second stage, 3,000 recipients from the jurisdictions selected in the first-stage sample will be randomly selected to be interviewed. Achieving a target response rate of 80 percent will yield a nationally representative sample of 2,400 recipients completing surveys. Although the two-stage sampling design will result in less precise estimates than would be obtained if recipients were interviewed from every UI jurisdiction, it substantially reduces the burden that UI jurisdictions will face in extracting the administrative files while still providing data to meet the study objectives.
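The two-stage selection can be sketched as follows, assuming simple random sampling at each stage; the jurisdiction and recipient identifiers are placeholders, and the study's actual selection probabilities and any stratification are specified in Part B.

```python
import random

rng = random.Random(0)  # fixed seed so the illustration is reproducible

jurisdictions = [f"jurisdiction_{i:02d}" for i in range(51)]  # 50 states + DC

# Stage 1: randomly select 20 of the 51 UI jurisdictions.
sampled_jurisdictions = rng.sample(jurisdictions, k=20)

# Stage 2: pool the administrative frames of the sampled jurisdictions and
# randomly select 3,000 recipients to field the survey (frame sizes invented).
frame = {j: [f"{j}_recipient_{n}" for n in range(4000)] for j in sampled_jurisdictions}
pooled_frame = [r for recipients in frame.values() for r in recipients]
fielded_sample = rng.sample(pooled_frame, k=3000)

# An 80 percent response rate yields the 2,400 completed interviews targeted.
print(int(0.80 * len(fielded_sample)))  # -> 2400
```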

The target population for the evaluation consists of individuals who were potentially eligible for additional unemployment benefits through the EUC08 legislation. Thus, recipients with benefit-year-begin (BYB) dates ranging from May 1, 2006, through late 2011 (given current legislation at the time this clearance package was prepared) could potentially be included in the analysis. The survey will concentrate on a study population with BYB dates between January 1, 2008, and September 30, 2009. This range of BYB dates includes recipients with a variety of experiences with ARRA-related policy and program changes. Concentrating the survey sample on this date range, rather than a broader range, will result in more precise estimates of the impact of UC-related provisions of ARRA, such as the higher tiers of EUC08 benefits, on recipients’ outcomes because it focuses the sample on recipients who began collecting benefits at points in time that will allow for impact estimation.1 It also allows the full UC benefit collection history to be characterized for most survey respondents using administrative data, reducing the need to ask for this information in the survey or to use statistical techniques to account for incomplete information. Finally, post-UC outcomes will be observed for most recipients in the survey, which will increase the capacity of the evaluation to detect impacts.

2. Content and Purposes of the Survey

The UI recipient survey includes basic screening and tracking questions and detailed modules that obtain information on recipient characteristics and outcomes. The data collected in the survey will serve four major purposes: (1) validating or updating information from administrative data and the sample locating process; (2) providing descriptive measures of the recipient population; (3) serving as control variables in statistical analyses; and (4) measuring postclaim outcomes to determine the impact of the availability of upper-tier EUC08 benefits and other ARRA-based changes to UC policies. The major content areas of the survey and the purposes of the data are described below; a copy of the survey questionnaire is included as Appendix A. Additional details on the specific items included in the survey are given in Table A.2.

Personal Information and UC Collection History. The survey will start with screening questions to ensure that the sample locating process has identified the correct individual. Respondents will be asked to confirm or update the start and stop dates of their UC collection, which will help them to focus on the benefit collection period of interest for the analyses. The sample will be stratified by start date in the descriptive analysis because many of the ARRA-based changes to UC policies, for example the availability of EUC08 benefits, affected recipients differently based on the date at which they exhausted regular UI. Together, the start and stop dates will also be used to calculate the duration of UC benefit receipt, which will be an outcome in the impact analysis. Respondents will also be asked to confirm or update the basic contact information gathered from the sample locating process so that incentive payments (discussed in Section A.9) can be delivered.

Employment History. Information on the characteristics of the job held prior to the claim will be used to describe the sample of recipients and to construct control or stratification variables for the statistical analyses. Respondents will be asked to provide basic stop and start date information for up to 10 postclaim jobs, with more detailed information collected on up to three jobs: (1) the first job held after the claim; (2) the job that served as the main source of earnings in the postclaim period, if different from the first; and (3) the main current job, if different from either the first or second job. The starting date of the first postclaim job will be used in conjunction with the date of first UI payment to calculate the duration of the initial unemployment spell, which is one of the primary study outcomes considered in the impact analysis. The amount of earnings in the postclaim period is another primary outcome. The impact analysis will also consider the effects of expansions of UC benefits under EUC08 and other ARRA-based changes to UC policies on other characteristics of the postclaim employment experience, such as hours and availability of fringe benefits, which serve as measures of job quality. In addition to serving as a study outcome for the impact analysis, current employment status will be used to provide a descriptive understanding of the labor market activities of recipients at the time of the survey.
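As a concrete illustration of the duration outcomes described above, the sketch below derives them from survey dates; the function name and the example dates are hypothetical.

```python
from datetime import date

def weeks_between(start: date, end: date) -> float:
    """Elapsed duration in weeks between two dates."""
    return (end - start).days / 7.0

# Duration of the initial unemployment spell: date of first UI payment to the
# start of the first postclaim job (survey items B3-B5 and F8-F9).
print(weeks_between(date(2009, 2, 6), date(2009, 10, 30)))  # -> 38.0

# Duration of UC benefit receipt: confirmed start and stop dates of
# benefit collection (survey items B1-B4).
print(weeks_between(date(2009, 2, 6), date(2009, 8, 7)))  # -> 26.0
```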

Table A.2. Data Elements in the UC Recipient Survey, by Purpose

Purpose codes: V = validation/tracking; D = descriptive measure; C = control variable; O = outcome measure; Oa = main outcome in the impact analysis (see note a).

Personal Information and UC Collection History

  • Personal information: verify name, date of birth, and last four digits of Social Security number (Section A). Purpose: V.

  • UC collection: confirm or update start and stop dates (Section B). Purposes: V, D, C.

  • Duration of UC benefit receipt (Items B1-B4). Purpose: Oa.

  • Contact information: address and telephone number (Section M). Purpose: V.

Employment History

  • Employment before job loss: industry, occupation, union representation, job tenure, layoff history, hours worked, earnings, fringe benefits, reason for separation, recall status (Section C). Purposes: D, C.

  • Postclaim employment: number of postclaim jobs, full-time status, desire for full-time work, start and stop dates, industry, occupation, union representation, hours worked, earnings, fringe benefits (Section F). Purpose: O.

  • Duration of initial unemployment spell (Items B3-B5, F8-F9). Purpose: Oa.

  • Reemployment earnings (Item F18). Purpose: Oa.

  • Current labor force participation status: major activity in the week before the survey, work search status, reason for not working, recall status, underemployment (Items F1-F4). Purposes: D, O.

Work Search, Education, and Training

  • Work search activity after job loss: looked for work, hours per week searching, methods used, reason for not looking, whether services led to a referral and job offer (Section D). Purpose: Oa.

  • Postclaim education and training activities: number of programs, start and stop dates, hours per week, location of program, whether collected UC benefits while in training, sources of financial support for training, program completion status, receipt of license or degree, reason for stopping participation, whether led to employment (Section E). Purpose: Oa.

Economic Well-Being

  • Preclaim finances: savings to cover 3 and 6 months of living expenses, types of investments held, home ownership (Items G4-G7, H8). Purposes: D, C.

  • Preclaim income: sources of income including state and federal support, total household income (Items H2, H4-H7). Purposes: D, C.

  • Postclaim financial hardships: ever been late or missed payment on mortgage, rent, or other credit; defaulted on mortgage; experienced foreclosure or eviction; declared personal bankruptcy; postponed major purchases; change in work by other household members (Items H9-H12, K8). Purpose: Oa.

  • Health insurance coverage after job loss: availability of insurance and COBRA through former employer, utilization of ARRA COBRA subsidy (Items I1-I6). Purposes: D, C.

  • Postclaim health vulnerability: months since UI initial claim without health insurance coverage; delayed or deferred medical care after UI claim (Items I7-I8). Purpose: O.

  • Postclaim sources of income, including state and federal support (Items H1, H3). Purpose: O.

  • Current total household income (Items H4-H7). Purposes: D, O.

Demographic and Socioeconomic Characteristics

  • Preclaim and current family structure: marital status and number of dependents (Items G1-G3, K1-K7). Purposes: D, C.

  • Preclaim educational attainment (Item E1). Purposes: D, C.

  • Demographic characteristics: date of birth, ethnicity, race, and gender (Items A4, J1-J3). Purposes: D, C.

  • Postclaim mobility: states in which recipients worked during and after claim spell, time periods for each state (Section L). Purposes: V, O.

Notes: Data elements marked V are survey questions in which respondents confirm or update information from the administrative data or locating process. Items marked D will be used to provide a context for understanding the characteristics of the UI recipient population and interpreting the impact of the ARRA-based changes. The C code indicates factors determined at or before the time of job loss that may be correlated with postclaim labor market experiences; these may be used to define subgroups of interest in the descriptive analyses and used as covariates in the impact analysis. Data elements marked O are measured after the UI initial claim; the distributions of these variables will be compared among groups of individuals who became eligible for additional compensation through EUC08 at different points in their unemployment spell.

a The duration of the initial unemployment spell, the duration of UC benefit receipt, reemployment earnings, financial hardship measures, work search intensity near the start of the benefit spell, and the likelihood of participation in education or training programs will serve as the main outcomes in the impact analysis.


Work Search, Education, and Training. To shed light on the mechanisms that ultimately may connect unemployed workers to jobs and may affect the quality of jobs obtained, the survey will gather detailed information about respondents’ job search activities and participation in education and training programs. Respondents will be asked for information on how they searched for work, the amount of time they looked for employment, their reasons for not looking (if applicable), and whether they received referrals that led to employment. These questions will focus on the period shortly after loss of the preclaim job. The survey will also identify the number of education and training programs recipients participated in, asking detailed questions about up to two of them: (1) the longest program in which a recipient is currently enrolled, and (2) the longest other training program (current or non-current) in which the recipient was enrolled during the postclaim period. The impact analysis will include work search and education and training participation outcomes based on these questions when considering the effects of changes to UC policies under ARRA.

Economic Well-Being. Because of the role that the financial and real estate markets played in the recent recession, DOL is particularly interested in assessing recipients’ economic well-being and how it was affected by the expansion of UC benefits under ARRA. Thus, the survey will collect baseline information on household income, sources of federal and state income support, and the types of assets held before the job loss. These measures will be used to describe the characteristics of the sample and will serve as stratification and control variables in the descriptive and impact analyses. The survey will also gather data on indicators of financial distress, such as whether recipients experienced delinquencies on credit, mortgage, and rent payments; foreclosures and evictions; personal bankruptcy; and whether other household members increased their labor supply since the start of the claim. The impact analysis will consider how these financial distress outcomes were affected by the EUC08 legislation. Respondents will be asked about their health insurance coverage immediately following the job loss and use of the COBRA subsidy available under ARRA so that the survey sample can be aligned with the sample of recipients being interviewed for a concurrent DOL-sponsored evaluation of that subsidy. In addition, respondents will be asked whether they experienced periods without health care coverage or if they delayed getting important medical care, both of which may be examined as outcomes in the impact analysis. The survey will include questions about sources of income in the postclaim period in order to determine whether more generous UC benefits altered recipients’ reliance on government support. Finally, the survey will include questions about total household income in the year prior to the interview, which will be used to describe the sample and will be considered as an outcome measure for the impact analysis.

Demographic and Socioeconomic Characteristics. Items such as age, gender, race and ethnicity, education, marital status, and household composition and size will be used to provide a description of the characteristics of the UI recipient population. In addition to describing the sample, these factors are strongly correlated with labor market outcomes and will therefore be controlled for in the impact analysis to improve the precision of the estimates. Respondents will also be asked about the states in which they worked after filing for UI benefits to estimate the impact of benefit extensions and other changes to UC policy on interstate mobility. This geographic information will also shed light on the extent to which postclaim earnings data collected for this survey might be supplemented by administrative wage records from the 20 UI jurisdictions included in the sample.

b. Survey of UI Administrators

The survey of UI administrators will provide information on the decision to adopt UC-related ARRA provisions for all 50 states and the District of Columbia. The timing and content of the survey provide several analytical advantages. First, it will be deployed after the deadline for applying for modernization incentive funds, which means that the decision of every state about adoption of each type of provision will be known. Second, the survey primarily contains closed-ended questions, which will facilitate quantitative analysis of the responses, including tabulations and frequencies of responses. Third, the survey will be deployed in time to use the responses to inform the purposive selection of states for site visits. In particular, responses that characterize the debate surrounding adoption will enable the selection of states that ultimately adopted one or more provisions but had to overcome challenges to adoption; these states might provide lessons for the future about how best to structure federal incentive programs.

The study’s survey of UI administrators will add to related work being conducted by the National Association of State Workforce Agencies (NASWA). (In developing the UI administrator survey, the study team drew upon the NASWA questionnaire and other literature on UC provisions.) In late 2009, NASWA administered a survey to the UI administrators in all states about their responses to the various UC-related ARRA provisions. More recently, NASWA has conducted in-depth phone interviews with 20 selected states. This evaluation’s survey of UI administrators will provide new information, unavailable from the NASWA study, to answer this study’s research questions. First, as mentioned previously, the survey will be mailed after all the decisions to adopt modernization funds have been made; this will allow the capturing of experiences of late-adopting states not covered by either NASWA effort. This is important because the experiences of late-adopting states might differ systematically from those of early-adopting states. Second, the focus of this survey is the decision-making process, which has not been a focus of NASWA’s work but which is critical for answering research questions about the factors that led some states to adopt provisions and others not to do so. Third, the survey will contain closed-ended questions to facilitate more extensive quantitative analyses, which, unlike qualitative analyses, can simultaneously take into account more than one explanatory factor affecting a decision-making outcome.

The survey of UI administrators will focus on two main study topics (see Appendix B for the survey):

  • The Decision to Adopt. The survey includes questions about states’ decisions about adopting the TUR trigger for EB, the ABP, and the other modernization provisions. In particular, respondents will be asked about the key factors states considered when deciding whether or not to adopt each provision. Respondents will also be asked to report whether the state estimated the costs of adopting the provisions and what factors were considered in estimating those costs.

  • Implementation Issues. For states that adopted particular provisions, the survey asks about the main challenges they encountered in implementation as well as whether and why their actual costs have differed from their projections of costs.

The survey contains three content modules that cover: (1) the TUR trigger for EB, (2) the ABP, and (3) the other modernization provisions. The study team will use the responses to the survey of UI administrators to (1) tailor the master site visit protocol for states to be visited in person; (2) conduct a descriptive analysis of states’ decisions to adopt, including regression analysis; and, as feasible, (3) generate variables to aid in the selection of states for the site visits. Table A.3 summarizes the content of each section of the survey and the rationale for and planned usage of the data items.

Table A.3. Survey of UI Administrators

Introduction. Glossary of Terms: definitions of terms and a brief background on each policy being addressed. Rationale/planned use: ensures consistent understanding of terms throughout the survey.

Section A. Confirming Information: the state’s existing provisions and when legislation putting the provisions into place (if applicable) was passed. Rationale/planned use: this information will be used to tailor the master site visit protocol to each site, saving time and decreasing the burden for respondents in the site visit data collection effort.

Section B. TUR Trigger for EB: key factors favoring or hindering adoption of the TUR trigger and, for those states that did adopt it, their implementation experiences. Rationale/planned use: these data items will be used in descriptive analyses; in addition, the implementation experiences will be used in selecting states for site visits.

Section C. Alternate Base Period (ABP): key factors favoring or hindering adoption of the ABP; whether the state considered cost estimates in its decision-making process; and, for those states that adopted the ABP, their implementation experiences and the likelihood of repeal. Rationale/planned use: these data items will be used in descriptive analyses; in addition, the implementation experiences will be used in selecting states for site visits.

Section D. Other UI Modernization Provisions: key factors favoring or hindering adoption of two of the four modernization provisions; whether the state considered cost estimates in its decision-making process; and, for states that adopted modernization provisions, their implementation experiences and the likelihood of repeal. Rationale/planned use: these data items will be used in descriptive analyses; in addition, the implementation experiences will be used in selecting states for site visits.

Section E. Contact information for respondent(s). Rationale/planned use: this information will be used to follow up with the survey respondent(s) as needed.


The questionnaire will be self-administered. The first page will contain fields that will be populated with publicly available state-specific information about the status of the ARRA UC-related provisions, including which provisions were adopted and when. The rest of the questionnaire will include a series of closed-ended questions in order to limit the burden on state staff. Use of closed-ended questions will (1) ensure that the collected data on certain topics will be uniform across states and (2) enable the evaluation team to easily quantify the data across states.

After OMB clearance is received, the study team will send an initial email to each state’s UI administrator introducing the study and its components. The team will then email the survey questionnaire to the UI administrator; it can be printed out, completed, and returned by either mail or fax. In addition, the study team will mail a paper copy of the questionnaire, along with a prepaid business reply envelope for returning either the paper questionnaire or a printout of the electronic questionnaire. (Electronic copies can also be returned via email.) Mathematica will email reminders to non-responding administrators to encourage participation.

The survey instructions will ask the UI administrators or individuals they designate to respond to verify the publicly available information on the first page. They will also ask the sample members to respond to the closed-ended questions either based on their own knowledge or in consultation with other UI staff. The questionnaire will include a section for respondents to identify themselves and any colleagues with whom they collaborated to complete the questionnaire. It is expected to take an average of about 40 minutes to complete. When the completed questionnaire is returned to Mathematica, the evaluation team staff will review it. If necessary, the staff will follow up with the main respondent for clarification or to request responses to any uncompleted items. The study team will contact any states that do not respond to encourage them to do so. The study team anticipates a 100 percent response rate. (Strategies used to help achieve this response rate are described in Section B.3.)

c. Site Visit: Protocol and Data Systems Survey

An in-depth examination of how states responded to and implemented the ARRA-related UC provisions is a critical component of the evaluation. The visits conducted in 20 purposively selected states will allow for learning about a broad range of approaches and experiences, including states that made significant changes to qualify for the incentive funds, ones that qualified for the funds but did not need to make significant changes, and ones that did not apply for incentive funds. The study will document the factors that influenced states’ decisions about adopting certain provisions, the states’ administrative and programmatic experiences implementing provisions, and the lessons for future extensions of UC programs.2 As part of the site visits, a data systems survey will examine the extent of changes to information systems made by states in response to the ARRA UC provisions and whether states are able to accurately capture, track, and report on the UC ARRA requirements. Because of the technical nature of the topics, it will be useful to allow staff to respond, and the site visitors to review the information, in advance of the visit.

Visits to selected states will be timed to fully capture their decisions and implementation experiences, and to take advantage of data collected through the survey of UI administrators. States had through August 22, 2011, to submit their applications for UI modernization incentive funds and must enact the corresponding legislation within 12 months of the Secretary of Labor’s certification of the application. Site visits will begin in 2013. This start date is far enough from the application deadline that it is likely that the study will include implementation experiences of late-implementing states. Furthermore, the start of the visits is scheduled to allow the study team to use the information from the survey of UI administrators to select a set of states with diverse decision-making and implementation experiences and will also provide important background information for the states chosen for visits.

Several weeks before visiting a state, the site visitor will send the data systems survey to the UI benefits chief (see Appendix D for the data systems survey). The site visitor will request that the survey be completed and returned at least one week before the scheduled visit so that the survey responses can inform the site visitor’s questions about the effects of the ARRA-related UC provisions on the state’s data systems.

The study team will visit each of the 20 selected sites. Depending on the size of each state’s UC program, one or two researchers will visit the state for an average of two days. They will interview state UI administrative staff and other critical stakeholders. These visits will gather respondents’ unique perspectives on their states’ reasons for adopting or not adopting the optional provisions and their experiences implementing the ARRA-related UC provisions. The emphasis of the implementation study will be to identify the challenges states faced in implementing the changes and the successful strategies they used to overcome those challenges.

1. Site Visit Topics

The study team developed a comprehensive interview protocol to guide site visit discussions (see Appendix C for the master interview protocol and Appendix D for the data system survey that will guide on-site discussions about data systems changes). Interviews will be tailored to each state and respondent, but overall the site visits will cover the following key topics from a state-level perspective:

  • Decision Making. Why did states decide to adopt or not adopt the various optional UC provisions? How did they come to these decisions? Who was involved in the decision-making process? Were incentives effective at enticing states to adopt the optional provisions?

  • Implementation. What challenges did UI administrators and staff face in implementing the ARRA provisions? How long did it take to implement them? How have the UC provisions affected the workload of state staff? How has the state advertised the provisions to claimants? How has the state used the additional modernization incentive funds? Have the benefits of the ARRA-related UC policies outweighed the challenges and costs?

  • Data Systems Changes. What information systems changes were made by states in response to the ARRA-related UC provisions? Are states able to accurately capture, track, and report on the ARRA-related UC requirements? How did the systems changes affect benefits payments? What were the challenges (such as data quality and resource limitations), facilitators (such as strong political leadership), and costs?

  • Lessons for Future Policy. What lessons can be drawn from states’ experiences with the ARRA provisions? Is the state planning additions to or a repeal of the ARRA-related legislation? Why or why not? How would different amounts or types of incentives have affected states’ decisions to adopt the optional provisions?

Although some of these topics will be touched on in the survey of UI administrators, the site visits will collect more in-depth information and input from multiple respondents. Indeed, the site visits may provide useful information for interpreting the survey responses of those states not included in the visits.

2. Site Visit Respondents

To gather data for a complete analysis of each research topic, on-site data collection requires the input of multiple respondents with specific expertise. Table A.4 connects the key topics to the appropriate respondents. The site visit will include a two-hour interview with each state’s UI administrator in order to capture high-level information about all topics. In addition to the UI administrator, the site visit will include meetings with state-level stakeholders who have contributed to the state’s decisions about implementing ARRA-related UC provisions and have been involved in implementing them. The job titles of respondents will vary across states, but will likely include other state-level UI staff (such as the UI benefits chief and the UI trust fund manager) and a technology officer.

Table A.4. Site Visit Topics by Respondent

Respondent types: UI administrator; other state-level UI staff; UI call center administrator; One-Stop Career Center administrator; advisory council; staff with particular expertise; technology officer. Each topic is covered only with the respondent types that have relevant knowledge.

Module 1: Introduction and Background. Topics: provisions adopted; economic and political climate; changes to UI laws pre-ARRA; claims filing pre-ARRA.

Module 2: Extended Benefits/TUR Trigger and EUC08. Topics: decision to adopt trigger; TUR trigger implementation; EB implementation; EUC08 implementation; relationship between EB and EUC08.

Module 3: UI Modernization Provisions. Topics: decision to adopt ABP; ABP implementation; decision to adopt other provisions; modernization provision implementation; administrative or benefit costs of enacting provisions; modernization incentive payments.

Module 4: Federal Additional Compensation (FAC). Topics: timing and implementation.

Module 5: First $2,400 Free of Federal Income Taxation. Topics: general information pre-ARRA; implementation.

Module 6: Suspension of Interest Payments on State Trust Fund Advances. Topics: decision to apply for an advance; effect on UI trust fund solvency; administrative, accounting, and IT issues.

Module 7: Concluding Questions. Topics: overall assessment of ARRA provisions.

Notes: UI staff includes the staff responsible for benefits administration, the trust fund manager, and any other staff the UI administrator indicates would have substantive knowledge of the indicated topics.

ABP = alternate base period; ARRA = American Recovery and Reinvestment Act; EB = Extended Benefits; EUC08 = Emergency Unemployment Compensation Act of 2008; IT = information technology; TUR = total unemployment rate; UI = unemployment insurance.

In addition, the site visitor will interview a UI call center administrator and the administrator of a One-Stop Career Center in each state to capture the experiences of staff well-versed in UI operations on the ground and in workforce programs. Furthermore, the on-site data collection will include interviews with representatives from the state UC advisory council who played an important role in the public debate about the adoption of provisions.3

During their visits to each state, site visitors will spend about one day interviewing respondents at the state UI offices and about one day interviewing other stakeholders and call center and workforce staff. If necessary, the site visitors may adjust the protocol to conduct interviews with some individuals by phone, after pilot-testing the interview protocol in person.

In general, site visitors are expected to meet with the following individuals in each state:

  • UI Administrator (and Deputy if Appropriate). Interviews with state UI administrators will cover all key aspects of the decision-making processes surrounding the ARRA-related UC provisions, including the key issues involved in deciding which provisions to adopt. In addition, the UI administrators will provide a high-level perspective on the state’s implementation experiences, including modifications to data systems, staff retraining, and informing claimants of expanded eligibility or benefits.

  • Other State UI Staff. These respondents, either individually or in groups depending on the state office structure, will address questions related to implementation of the policies for which they have expertise. For example, the site visitor will ask the individual with particular expertise in issues related to trust fund management about topics related to benefits payment, state loans from the federal government, and the use of incentive funds. The benefits chief or technology officer will have knowledge of changes made to the state’s data systems and will be the target recipient of the data systems survey.

  • Call Center Administrator. The site visitor will interview one or two call center administrators in each state to understand how the provisions affected claimants and their interactions with the call centers, what administrative and implementation issues arose for staff working directly with the claimants, how administrators handled the additional flow of customers that probably resulted from these provisions, and how staff interacted with workforce staff in One-Stop Career Centers.

  • One-Stop Career Center Administrator. To complement understanding of the implementation of various provisions within the UC system, the site visitor will interview the administrator at one of the state’s One-Stop Career Centers to discuss the implications of the UC provisions for the workforce system. This interview will focus on how the increased number of UC claimants has affected the number of customers using the centers for reemployment and training services in the state.



In addition, the study team plans to gain the unique perspectives of other individuals who were involved in policy discussions related to the provisions. Thus, site visits will also include interviews with:

  • Members of the State UC Advisory Council. In some states, members of the state advisory council may have played a role in the decision to adopt the optional provisions. They may also have knowledge of issues surrounding the adoption of various ARRA-related provisions. Site visitors will identify two to three members of the council, including at least one employer representative, for interviews.

Because the types of respondents are expected to vary across states depending on the UI organizational structure and optional provisions adopted, the study team has developed a master site visit protocol covering all the key topics. For each state, the site visitors will tailor that protocol so that each respondent addresses only the modules about which he or she has knowledge. For instance, council members might have extensive knowledge about the decision-making process regarding adopting the optional provisions, but little or no knowledge of states’ experiences implementing the provisions. Site visitors would ask these respondents the questions about decision making and not those about implementation.

3. Uses of Improved Technology to Reduce Burden

Advanced technology will be used in the data collection efforts to reduce the burden on respondents to the UI recipient survey and the survey of UI administrators.

a. UI Recipient Survey

The UI recipient survey will utilize two data collection approaches. Sample members will be able to complete the survey either through interviewer-administered, computer-assisted telephone interviewing (CATI) or through self-administration via the web. Both data collection methods reduce respondent burden and costs compared to conducting in-person or paper-and-pencil interviews.

CATI is a logical choice as a method of administration for telephone interviews with large numbers of respondents. With CATI, information about sample members, such as their UI initial claim date and the name of their employer prior to unemployment, can be preloaded to improve question flow and data accuracy. CATI programs are efficient and accept only valid responses, based on preprogrammed checks for logical consistency across answers. Interviewers are thus able to correct errors during the interview, eliminating the need for costly callbacks to respondents. Also, dialing errors will be almost completely eliminated by making calls through a preview dialer. The preview dialer allows interviewers to review case history notes and the history of dispositions; the interviewer then presses one button to dial the number after reviewing the case (akin to one-touch or speed dialing). An automated call scheduler will simplify scheduling and rescheduling of calls to respondents and can assign cases to specific interviewers, such as those who are trained in refusal conversion techniques or those who are fluent in Spanish. Further, CATI’s flexibility allows for the scheduling of interview times that are convenient for the sample member.
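To illustrate the kinds of preprogrammed edit checks described above, the following sketch shows a range check and a cross-item consistency check of the sort a CATI instrument enforces at entry time. It is illustrative only; the field names, limits, and values are hypothetical and are not drawn from the actual survey instrument.

```python
# Hypothetical sketch of CATI-style edit checks; field names and
# limits are illustrative, not taken from the actual instrument.

from datetime import date

def check_response(record: dict) -> list:
    """Return a list of edit-check failures for one interview record."""
    errors = []

    # Range check: weekly benefit amounts outside a plausible window
    # are rejected so the interviewer can re-ask the question.
    wba = record.get("weekly_benefit_amount")
    if wba is not None and not (0 < wba <= 1000):
        errors.append("weekly_benefit_amount out of allowed range")

    # Consistency check across answers: a reemployment date cannot
    # precede the preloaded UI initial claim date.
    claim_date = record["initial_claim_date"]   # preloaded from UI records
    reemp_date = record.get("reemployment_date")
    if reemp_date is not None and reemp_date < claim_date:
        errors.append("reemployment_date precedes initial_claim_date")

    return errors

# Example: the consistency check fires, prompting an immediate correction.
print(check_response({
    "initial_claim_date": date(2009, 3, 2),
    "weekly_benefit_amount": 310,
    "reemployment_date": date(2009, 2, 1),
}))
```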

The web survey option offers even more cost efficiency because it is self-administered, meaning that interviewers are not required. The web survey programming also includes skip-pattern logic, response code validity checks, specification of acceptable ranges, and consistency checks. Information from UI claim records will be preloaded into the web survey, as it will be in the CATI survey. The web interface will be easy to navigate, to encourage sample members who open the web survey to continue through completion.

Both versions of the survey are expected to take approximately 30 minutes to complete and will be available in both English and Spanish. Except for language necessary to accommodate self-administration versus being asked by an interviewer, the content of both survey versions will be identical.

b. Survey of UI Administrators

Because of the limited number of sample members for this survey (N = 51), the survey of UI administrators will not be computer programmed. Instead, a letter of invitation and survey booklet will be mailed to UI administrators. An electronic version of the questionnaire, which can be completed on a computer or printed out, will also be emailed. Completed surveys can be emailed to Mathematica, faxed, or sent via regular mail using the prepaid business reply envelope that will be included with the initial mailing packet.

The survey of UI administrators will include a state-specific fact sheet with information collected from the public domain about the state’s adoption of the various ARRA provisions covered in the survey. Respondents will be asked to confirm or correct this information. Use of prefilled data will lessen the respondent’s burden for completing the survey, although some questions will include options for open-ended responses to allow respondents to provide additional information as needed.

4. Efforts to Identify Duplication

Strategies to identify and avoid duplication are discussed in two subsections. The first covers the UI recipient survey and the second covers both the survey of UI administrators and the site visit data collection effort.

a. UI Recipient Survey

The UI recipient survey data will be used for an impact analysis of the effects of ARRA-based changes to the UC system on recipients’ outcomes and for a descriptive analysis of the characteristics of the study population. Neither type of analysis can be feasibly conducted using currently available data.

The sample of UI recipients interviewed for this evaluation will cover the study population in a manner that cannot be achieved by ongoing surveys sponsored by the federal government. The basic monthly Current Population Survey (CPS) does not contain information on receipt of UI benefits. The March supplement to the CPS allows respondents who reported income from the UC system in the previous year to be identified. However, it is not possible to use the March supplement alone to distinguish among recipients according to their BYB date or duration of benefit receipt. Combining data from the March CPS and the basic monthly survey could identify some recipients with job separations that occur in January through June and in December of each year, but doing so would not identify any recipients with job separations occurring between July and November. Because it is fielded on a biennial basis, the Panel Study of Income Dynamics cannot be used to identify UI recipients with UI first payments in 2007 and 2009. Finally, the survey sample interviewed for Mathematica’s DOL-funded evaluation of the COBRA subsidy available under ARRA will be limited to those UI recipients who lost their jobs between February 17, 2009, and March 31, 2011, and were eligible for COBRA at the time of separation. By relying on state administrative records to locate recipients, the UI recipient survey will efficiently yield a nationally representative sample covering the full study population of UI recipients with BYB dates between January 1, 2008, and September 30, 2009.

The UI recipient survey will result in measures of key study outcomes that cannot be reliably measured using administrative records. For example, the survey will gather information on financial hardships, an outcome of substantial interest to DOL, but for which UI administrative data provide no information.

In addition, the recipient survey will provide more precise and accurate measures of the other two primary outcomes for this study—unemployment duration and reemployment earnings. Following the approach of other studies using state administrative UI data (such as Jacobson, LaLonde, and Sullivan 1993), one might assume that a recipient identified from the claims records is unemployed until he or she is observed to reappear in the wage records. However, because this measure of employment status would only be available on a quarterly basis, relying on wage records would not allow the precise duration of unemployment to be calculated as will be possible using the UI recipient survey. This approach would also fail to detect cases where recipients become self-employed or migrate to a state not included in the study sample, resulting in biased estimates of unemployment duration. Such bias could be problematic for the impact analysis if the extent to which individuals migrate or transition to self-employment is related to UC policy parameters. Estimates of reemployment earnings might be problematic as well because administrative data only provide information on earnings that are insurable under the UI system. By contrast, the UI recipient survey will measure self-employment; measure employment in states not included in the sample; and yield a fuller measure of earnings that includes bonuses, tips, commissions, overtime, and fringe benefits.
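A simple worked example helps make the quarterly-censoring point concrete. The sketch below, with purely hypothetical values, shows that a recipient who first reappears in the wage records in a given quarter could have returned to work in any week of that quarter, so administrative data bound the duration of unemployment only to within roughly 13 weeks.

```python
# Illustrative sketch of why quarterly wage records only bound, rather
# than pinpoint, unemployment duration. All values are hypothetical.

def duration_bounds_weeks(claim_week: int, first_quarter_with_earnings: int):
    """Weeks of unemployment implied by the first quarter with positive
    earnings, counting quarters as 13-week blocks from the claim week."""
    earliest_return = (first_quarter_with_earnings - 1) * 13  # start of quarter
    latest_return = first_quarter_with_earnings * 13 - 1      # end of quarter
    return max(earliest_return - claim_week, 0), latest_return - claim_week

# A claim in week 0 with earnings first observed two quarters later is
# consistent with anywhere from 13 to 25 weeks of unemployment.
print(duration_bounds_weeks(claim_week=0, first_quarter_with_earnings=2))
```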

Finally, the survey conducted for this study also will yield a much richer descriptive understanding of the characteristics of UI recipients than what could be produced using state administrative data. In addition to the descriptive value of the data, the data can be used to create explanatory variables in the estimation of program impacts. The UI claims records maintained by states may provide information on a limited number of demographic and preclaim employment characteristics of recipients, such as age, gender, race and ethnicity, and base period earnings, but they do not provide information about other types of preclaim information (such as household structure and certain measures of job quality, including layoff history and the availability of fringe benefits) and postclaim information (such as about job search behavior, participation in reemployment services and training, postclaim job quality measures, and financial well-being and distress).

b. UI Administrators Survey and Site Visits

As described in Section A.2, NASWA has been conducting research on the ARRA-related UC provisions. The team’s efforts to avoid duplicate data collection have involved coordinating with the NASWA study team and tailoring this study’s research questions and goals to be complementary without redundancy. Dr. Wayne Vroman, a co–principal investigator on the evaluation of the UC provisions for ARRA, is also a member of NASWA’s research team, and Rich Hobbie, the project director of NASWA’s study, is a member of the evaluation’s TWG. Discussions between the teams, further enhanced by these crossover staff, have provided both teams with an understanding of each other’s work and enabled this coordination. Currently, NASWA has collected information about 20 states through telephone interviews with UI administrators, the chief of benefits, and IT staff. In contrast, the survey of UI administrators will collect new data on the 51 UI jurisdictions’ decision-making processes for adopting the UI modernization provisions. In addition, the site visits to 20 selected states will collect multiple perspectives on the states’ experiences implementing the ARRA-related UC provisions.

5. Methods to Minimize Burden on Small Businesses or Entities

No small businesses or entities will be surveyed as part of the UCP evaluation.

6. Consequences of Not Collecting the Data

Each of the three data collection efforts in this data collection request is designed to provide unique information to answer questions of interest to policymakers. The consequences of not collecting these data are described in three subsections, one addressing each data collection effort.



a. UI Recipient Survey

The survey of UI recipients conducted for this study will provide the only source of reliable and nationally representative estimates of the characteristics and outcomes of UI recipients who began receiving benefits during the timeframe of interest to DOL. Ongoing surveys sponsored by the federal government do not adequately cover the full span of BYB dates that define the study population. Relying on data from administrative UI records would result in incomplete, imprecise, and potentially inaccurate measures of key study outcomes, such as financial hardship. Thus, not conducting the UI recipient survey will severely limit the capacity of DOL to determine the impact of EUC08 and other ARRA-based changes to UC policy on recipients’ postclaim experiences and to understand the characteristics of recipients affected by those policies.

b. Survey of UI Administrators

The study’s survey of UI administrators will be the only source of information available for all 50 states and the District of Columbia about the process involved in deciding whether or not to adopt the ABP and two of the four modernization provisions in response to incentives provided by DOL, as well as the TUR trigger for EB benefits. It also will be the only source of information about states’ cost estimates for the adoption of the modernization provisions.

If the data in the survey of UI administrators cannot be collected, then the study would be unable to answer questions about the decision-making and implementation experiences of all 50 states and the District of Columbia with the UC-related provisions of ARRA. Although some information about the early experiences of states is available through related work being conducted by NASWA, the NASWA survey of state UI directors on implementation of the Recovery Act does not reflect the experiences of states that made decisions to adopt and implement the optional UC-related provisions of ARRA after that data collection effort (the online survey was sent to states in November 2009). The proposed data collection effort will provide a more comprehensive picture, covering the experiences of states that adopted the provisions relatively late in the time period (which extended through August 2011) for which applications for incentive funds were available; these states might differ considerably from those that adopted the provisions quickly. Furthermore, if the data were not collected at all or were collected from only a subset of states (such as the late-adopter states), the study would lose the ability to conduct quantitative analyses that are feasible only through the collection of uniform answers from all states to closed-ended questions like those in the survey of UI administrators.

Finally, if the survey of UI administrators were not conducted, the information from it could not be used as part of the purposive selection of states for the site visit data collection effort. The selection process would need to rely on publicly available information only. Thus, it would be more likely that the site visit data collection effort would exclude states that had distinctive decision-making and implementation experiences. Although some of the information to be collected through the survey of UI administrators could be collected during onsite interviews, doing so would require additional time for the onsite visits, and the information would be available in a less uniform way and for only 20 states.

c. Site Visit Data Collection

The site visit data collection effort, including in-person visits and the data systems survey, will provide comprehensive information about the decision-making and implementation experiences of 20 purposively selected states with the UC-related provisions of ARRA. If the site visit data collection effort did not occur, this type of rich information would not be available. Policymakers would not have detailed information about the political and economic contexts in which states made their decisions; states’ expectations about the effects of adopting provisions on the UI claims-taking process, the administration of reemployment services, and other UI program functions; and the actual influence of the provisions. Furthermore, policymakers would not know about the changes made to information technology and data systems to accommodate the UC-related ARRA provisions.

Although the survey of UI administrators provides some information about states’ decision-making and implementation experiences, that survey alone cannot yield a comprehensive picture about these issues because it is brief and focuses on the optional UC-related provisions (adoption of the TUR trigger for EB benefits and the modernization provisions). In contrast, the site visit data collection effort includes both the optional provisions and other provisions that were uniformly implemented across states (the EUC08 program, FAC benefits, exemption from federal taxation of a portion of benefits, and temporary suspension of interest payments on trust fund advances). Without collecting the site visit data, policymakers would not learn about states’ experiences with this latter group of provisions, and they would be unable to apply the findings to future policy.

7. Special Data Collection Circumstances

No special circumstances apply to this data collection. In all respects, the data will be collected in a manner consistent with federal guidelines.

8. Federal Register Notice

a. Federal Register Notice and Comments

As required by 5 CFR 1320.8 (d), a Federal Register Notice, published on December 12, 2011 (FR, Vol. 76, No. 238, pp. 77260-77263), announced the Evaluation of the Unemployment Compensation Provisions of the American Recovery and Reinvestment Act of 2009—the UCP evaluation. The Federal Register announcement provided the public an opportunity to review and comment on the planned data collection and evaluation within 60 days of the publication, in accordance with the Paperwork Reduction Act of 1995. A copy of this 60-day notice is included as Appendix E to this data collection clearance request. No comments were received from the public during the initial 60-day posting. The second Federal Register Notice was published on April 12, 2012 (FR, Vol. 77, No. 71, p. 22001), for a 30-day period, providing the public a second opportunity to respond. No comments were received from the public during this period. A copy of this 30-day notice is included as Appendix F.

b. Consultations Outside of the Agency

Consultations on the research design, sample design, and data needs are part of the study design phase of the UCP evaluation. The purposes of these consultations are to ensure the technical soundness of the study and the relevance of its findings, and to verify the importance and accessibility of the information sought in the study.

The members of the TWG listed below are experts in their respective fields and were consulted in developing the design, the data collection plan, the questionnaires, and the site visit protocol for the UCP evaluation.

Members of the TWG

Dr. Rich Hobbie, National Association of State Workforce Agencies (202) 434-8020

Dr. Douglas Holmes, Strategic Services on Unemployment
and Workers’ Compensation (202) 223-8904

Dr. Till von Wachter, Russell Sage Foundation and
Columbia University (212) 355-3406

Dr. George Wentworth, National Employment Law Project (860) 257-8894

Dr. Stephen Woodbury, Michigan State University and
W. E. Upjohn Institute for Employment Research (269) 385-0408

9. Respondent Payments

Respondent payments for the UI recipient survey are discussed in the first subsection; payments for the survey of UI administrators and site visit respondents are discussed in the second.



a. UI Recipient Survey

In conjunction with other methods to achieve the targeted 80 percent response rate for the UI recipient survey, an incentive will be offered to all survey respondents. To encourage completion of the survey via the web and respondent-initiated telephone contact, a higher incentive will be offered for those methods of responding. Web completers and those who call in will be given $40. Sample members who do not initiate contact will be called and offered a $30 incentive. This differential incentive offer is justified by the lower cost of web administration, since the cost of interviewing staff is eliminated for web surveys and interviewing time is minimal when respondents call in to complete an interview. Materials sent to sample members will explain the differential in the incentive offers. (Appendix G contains example mailings to respondents.)

The offer of incentives is critical to efforts to gain cooperation from sample members and increase response rates, ensuring the representativeness of the sample and providing data that are complete, valid, reliable, and unbiased. Given the importance of the UCP evaluation for DOL, the data collection must be held to high standards on these criteria, and offering incentives can help achieve that goal. Because response rates to telephone surveys have been declining and the costs associated with achieving high response have been increasing, the use of incentives has become a more common practice for survey studies (Curtin, Presser, and Singer 2005). Substantial evidence on the benefits of offering incentives has become available. Incentives can help achieve high response rates by increasing sample members’ propensity to respond (Singer, Van Hoewyk, and Maher 2000). Studies offering incentives show decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refuse to participate, incentives increase refusal-conversion rates. By increasing sample members’ propensity to respond, incentive payments have been found to significantly reduce the number of calls required to resolve a case and to significantly reduce the number of interim refusals. Thus, incentive payments can help contain costs, directing some of the cost of conducting the survey to participants as a gain rather than into additional survey operations.

While incentives help gain cooperation to increase the overall response rate, they also increase the likelihood of participation from subgroups with a lower propensity to cooperate with the survey request, helping to ensure the representativeness of the respondents and the quality of the data being collected. For example, Jäckle and Lynn (2007) find that incentives increase the participation of sample members more likely to be unemployed. There is also evidence that incentives bolster participation among those with lower interest in the survey topic (Jäckle and Lynn 2007; Kay 2001; Schwartz, Goble, and English 2006), resulting in data that are more complete. Furthermore, paying incentives does not impair the quality of the data obtained (such as item nonresponse or the distribution of responses) from groups who would otherwise be underrepresented in a survey (Singer, Van Hoewyk, and Maher 2000).

Offering incentives is a critical addition to intensive efforts to establish contact with prospective respondents and gain their cooperation with the planned data collection. To leverage fully the benefits of offering incentives, the advance letter to the UI study participants will mention the incentive. Interviewers will also mention the incentive when they establish contact with the participants and attempt to gain their cooperation.

The planned incentive amount is generally consistent with the amount that was proposed, approved by OMB, and found to be effective for the National Evaluation of the Trade Adjustment Assistance (TAA) Program. Initially, the baseline survey for the TAA evaluation included an incentive payment of approximately $25 (some sample members received a $2 prepayment plus $25 for survey completion; others received only $25). In an August 2008 memo to OMB, DOL reported a lower than expected response rate at this incentive level. In September 2008, OMB approved a revised strategy to increase response rates for the TAA survey. This plan included changes in operational procedures (principally the use of DOL letterhead for mailings, which will be used for this survey as well) and an incentive experiment where sample members could receive increased payments for survey completion. The contractor implemented these plans on September 20, 2008.

The incentive experiment tested three incentive levels ($25, $50, and $75) in a split-ballot design. Those receiving the $25 offer achieved a 12 percent response rate, whereas the response rate for re-contacted nonresponders who received the $50 incentive offer was 22 percent. Those offered a $75 incentive payment responded at 25 percent, a significantly higher rate than those offered $25. The results of the incentive experiment were similar among TAA participant sample members and their comparison group sample, drawn from the general UI population. Respondents called in sooner and in greater numbers when offered $50 rather than $25. When offered $75 rather than $50, they responded even faster, but response was not significantly higher than for those offered $50. The $50 offer translated to fewer per-case telephone interviewer hours, locator hours, and clerical expenses for future mailings to sample members. A determination was made that the extra response rate points that the $75 offer yielded were not worth the cost of the extra $25 per person over the $50 incentive; the $50 incentive, however, was cost-effective compared to the $25 incentive. For this survey of a similar population, the contractor will use the slightly lower dollar amounts of $40 for respondent-initiated completions and $30 for contractor-initiated completions, per the recommendation of OMB.
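The cost-effectiveness logic behind that determination can be illustrated with a short worked calculation. The response rates below are those reported for the TAA experiment; the sample size is arbitrary, and interviewer-time savings are omitted, so the sketch captures only the incentive side of the tradeoff.

```python
# Worked sketch of the incremental cost-per-complete comparison implied
# by the TAA incentive experiment. Response rates come from the text;
# the case count is arbitrary.

cases = 1000
rates = {25: 0.12, 50: 0.22, 75: 0.25}

completes = {amt: cases * rate for amt, rate in rates.items()}

# Moving from $50 to $75 buys 30 extra completes per 1,000 cases but
# raises the incentive bill on every complete.
extra_completes = completes[75] - completes[50]
extra_cost = 75 * completes[75] - 50 * completes[50]
print(f"incremental cost per additional complete: "
      f"${extra_cost / extra_completes:,.0f}")   # about $258
```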

b. The Survey of UI Administrators and the Site Visits

State administrators and other state-level staff will not be compensated for completing the survey of UI administrators or for participating in interviews conducted during the site visits. Although compensating these individuals for their time could improve relations with the states, supplying the information is considered part of the work-related responsibilities of the administrators and other staff who will be included in the data collection. Through industry contacts and by keeping the burden to a minimum, a 100 percent response rate is expected.

10. Privacy

This section contains a discussion of the evaluation team’s general procedures to protect the data that are part of this clearance request. It also contains a separate discussion of the distinctive privacy issues that pertain to the survey of UI administrators.

a. Procedures to Protect the Privacy of the Data Collected as Part of the Evaluation

All respondent materials will include assurances of privacy protection. These include letters sent to sample members and information posted on the website for the UI recipient survey. In addition, as part of the interviewer’s introductory comments, sample members will be told that their responses are private and will have the opportunity to have their questions answered. Interviewers will be trained in privacy procedures and will be prepared to describe them in full detail, if needed, or to answer any related questions raised by participants. For example, the interviewer will explain that the individual’s answers will be combined with those of others and presented in summary form only.

All data items that identify sample members will be kept only by the contractor, Mathematica, for use in assembling records data and in conducting the interviews. Any data received by DOL will not contain personal identifiers, thus precluding individual identification.

It is the policy of Mathematica to efficiently protect private information and data in whatever medium it exists, in accordance with applicable federal and state laws and contractual requirements. In conjunction with this policy, all Mathematica staff shall:

  1. Comply with a Mathematica pledge that is signed by all Mathematica full-time, part-time, and hourly staff, and with the Mathematica Security Manual procedures to prevent the improper disclosure, use, or alteration of private information. Staff may be subjected to disciplinary, civil, or criminal actions, or a combination of these, for knowingly and willfully allowing the improper disclosure or unauthorized use of private information.

  2. Only access private and proprietary information in performance of assigned duties.

  3. Notify their supervisor, the project director, and the Mathematica security officer if private information has been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner. All attempts to contact Mathematica staff about any study or evaluation by individuals who are not authorized access to the private information will be reported immediately to both the cognizant Mathematica project director and the Mathematica security officer.

To allow external verification and replication of the study findings, as well as additional research, public use data files containing key analysis variables created for the UCP evaluation will be produced at the end of the study and formatted to data.gov specifications. These public use files will follow the current relevant OMB checklist to ensure that they can be distributed to the general public for analysis without restrictions. Steps will be taken to ensure that sample members cannot be identified in indirect ways. For example, categories of a variable will be combined to remove the possibility of identification due to a respondent being one of a small group of people with a specific attribute. Variables that will be carefully scrutinized include age, race and ethnicity, household composition and location, dates pertaining to employment, household income, household assets, and others as appropriate. Variables will also be combined to provide summary measures that mask what otherwise would be identifiable information. Although it cannot be predicted which variables will have too few respondents in a category, the study researchers plan not to report categories or responses that are based on cell sizes of fewer than five. If necessary, statistical methods will be used to add random variation within variables that would otherwise be impossible to mask. Finally, variables that could be linked to identifiers by secondary users will be removed or masked.
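As an illustration of the minimum-cell-size rule described above, the following sketch suppresses categories held by fewer than five respondents before release. The threshold of five comes from the text; the variable and data are hypothetical, and in practice small categories would often be collapsed into broader ones rather than set to missing.

```python
# Minimal sketch of the cell-size rule: categories with fewer than
# five respondents are suppressed before public release. The threshold
# comes from the text; the data are made up.

from collections import Counter

MIN_CELL_SIZE = 5

def suppress_small_cells(values):
    """Replace values whose category has fewer than 5 cases with None."""
    counts = Counter(values)
    return [v if counts[v] >= MIN_CELL_SIZE else None for v in values]

household_sizes = [1] * 12 + [2] * 20 + [3] * 9 + [9] * 2  # two outliers
print(suppress_small_cells(household_sizes)[-4:])          # [3, 3, None, None]
```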

1. Systems Security

Mathematica’s computer facilities include state-of-the-art hardware and software. The hardware and software configurations have been designed to facilitate the secure processing and management of both small- and large-scale data sets.

Facility. The doors to Mathematica’s office space and Survey Operations Center (SOC) are always locked, and all SOC staff are required to display current photo identification while on the premises. Visitors are required to sign in and out and must wear temporary ID badges while on the premises. Any network server containing private data is located in a controlled, limited-access area. All authorized external access is through a server under strict password control.

Network. Sensitive data are stored in secure folders that reside on a Windows Server 2008 volume using NT File System (NTFS). BitLocker encryption software, configured to use a 256-bit AES key, encrypts data on the volume as they are stored, and the encryption persists for the life of the volume. NTFS/BitLocker makes the data accessible only to users with authorized access, and makes data inaccessible to software that circumvents normal access control in case the media are stolen. NTFS/BitLocker stores user data in an encrypted format on the volume, but it works transparently with most applications and backup utilities. All the rules of file system trustee assignments, trustee rights, ownership, sharing, visibility, locking, transactions, and space restrictions remain the same on the encrypted volume. Data in the “Secure_Data” folders are backed up using ArcServe 11.5, which encrypts the contents using the 3DES algorithm. These separate backups are overwritten every two months by backups of newer secure data, a process that enables compliance with secure data destruction requirements.

Access to all network features, such as software, files, printers, Internet, email, and peripherals, is controlled by userid and password. Mathematica staff are required to change their password for computer access at least every three months, and passwords must adhere to the following standards: be at least eight characters long, contain at least one letter (upper or lower case), and contain at least one numeric or special character. All userids, passwords, and network access privileges are revoked within one working day for departing staff and immediately for terminated staff. All staff are required to log off the network before leaving for the day.

Printers. Printer access is granted to all staff with a valid userid and password. The physical hard disks on which the printer queues reside are subject to the same security and crash procedures that apply to the file servers. Printer queues are write-only for all staff; no staff have read-access, so they cannot browse the contents of the queues. Printer stations are appropriately monitored according to the sensitivity of the printed output produced. No private or proprietary data or information can be directed to a printer outside Mathematica’s offices.

Electronic Communication. Each of Mathematica’s locations has a site-specific LAN. A combination of T1 and Ethernet Private Line (EPL) lines links the site-specific LANs into a Wide Area Network (WAN) and supports cross-office communications. Traffic on the Mathematica internal network is not encrypted, but it is secured by these links, all of which are private, point-to-point communication lines dedicated to Mathematica traffic and completely contained within Mathematica’s firewalls. Because each office is connected to the others solely by these private lines and not through the Internet, no WAN traffic is routed through the Internet.

2. Treatment of Data with Personal Identifying Information

All data containing personal identifying information (PII)—including Social Security number (SSN), name, home address, date of birth, and telephone number—are considered to be sensitive, or private, data. The UCP evaluation is in compliance with the aforementioned company security policies. Listed below are specific details regarding the handling and processing of private information in this evaluation.

Access. Electronic files with private data are stored in restricted-access network directories. Access to restricted directories is limited through access control permissions, on a need-to-know basis to staff who have been assigned to and are currently working on the project. When temporarily away from their work area, project staff are instructed to close files and applications and to lock their workstations using the CTRL-ALT-DEL command. Workstations automatically lock within a set number of minutes, and a password must be used to regain access through the protected screen saver.

Electronic Communication. For internal emails, staff are forbidden to transmit sensitive study information as a regular file attachment; they are instructed instead to use the “insert hyperlink” feature in Outlook to include a shortcut to the file. This allows the receiver to go to the file directly but will not allow access to unauthorized individuals. In addition, staff are instructed to avoid including sample member names or other PII in internal emails, so that there is no potential for these to be viewed by others.

Emails sent outside Mathematica are not automatically encrypted, and therefore neither the text nor attachments are secure. Before sending an email containing sensitive information, the sender is obligated to ensure that the recipient is approved to receive such data. When files must be sent as attachments outside Mathematica, staff are instructed to use WinZip 14.5 (256-bit AES encryption) to password-protect the file and transmit the password to the recipient using a separate form of communication, preferably via phone. When a sample member’s name and contact information are sent outside Mathematica, the information is included in a secure attachment rather than in the text of the email.
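For illustration, the sketch below performs the equivalent AES-256 encrypted-archive step using the open-source pyzipper package; this substitution is made only so the sketch is runnable, as the document specifies WinZip 14.5, and the file contents shown are hypothetical. As described above, the password would be transmitted through a separate channel, preferably by phone.

```python
# Illustrative equivalent of the AES-256 encrypted attachment workflow,
# using the open-source pyzipper package in place of WinZip 14.5.
# File name and contents are hypothetical.

import pyzipper

def write_encrypted_zip(path: str, filename: str, payload: bytes, password: bytes):
    """Write a password-protected, AES-encrypted zip archive."""
    with pyzipper.AESZipFile(path, "w",
                             compression=pyzipper.ZIP_DEFLATED,
                             encryption=pyzipper.WZ_AES) as zf:
        zf.setpassword(password)
        zf.writestr(filename, payload)

write_encrypted_zip("contact_update.zip", "contacts.csv",
                    b"sample_id,phone\n1001,555-0100\n",
                    password=b"transmitted-by-phone")
```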

UCP Evaluation Databases. Project databases containing private information are password-protected and accessible only to staff currently working on the project. To access the project’s database, users must first log onto their workstations and then, upon starting the database, log in again using a separate prompt. Project databases will be removed from the company servers and securely archived at the end of the data-processing period.

Telephone Interviewing. Telephone interviewers for the UI recipient survey will be seated in a common supervised area. As part of the process to verify that the correct sample members have been reached, interviewers will have access to respondents’ names and birthdates, as well as the last four digits of their SSN. Birth date and the last four SSN digits will be displayed on the computer screen only temporarily, at the beginning of the survey, so that the interviewer can verify the sample member’s identity. Interviewing staff for this project receive training that includes general SOC security and privacy procedures, as well as project-specific training that includes explanation of the highly private nature of this information, instructions to not share it or any PII with anyone not on the project team, and warnings about the consequences of any violations. Telephone interviews are recorded for educational and training purposes only, to aid SOC staff in improving their interviewing skills.

Locating. Staff who work on updating sample member contact information when the original contact is not successful must have access to key identifying information for short periods. These staff members receive training that includes general SOC security and privacy procedures, as well as project-specific training that includes clear instructions on what data and databases can be accessed and what data are required and can be recorded.

Locators may talk to a sample member’s family, relatives, or other references to obtain updated contact information. To protect the sample member, locators are given scripts on what they can and cannot say when using these sources to obtain information. For example, they will be instructed not to tell anyone that the sample member has been selected to participate in a study of the unemployed. Rather, they will indicate that Mathematica is trying to reach the sample member for an important study sponsored by DOL. Postcards will describe the need to speak to the person who once filed for UI benefits.

Locating and Calling Contact Sheets. Project team members keep only the minimum amount of printed private information needed to perform assigned duties. Hard-copy materials (such as locating or calling contact sheets) containing data with any individual identifiers (e.g., name, street address) are stored in a locked cabinet or desk when not being used. When in use, such materials are carefully monitored by a project supervisor and are never left unattended. At the conclusion of the project, a final disposition of all remaining sample will be made, and contact sheets and other associated materials will be destroyed.

Hard-Copy Printouts. Sensitive temporary work files, used to create hard-copy printouts and stored on local hard drives, are deleted on a periodic basis. Hard-copy output with private information is shredded or stored securely once no longer needed. Test printouts of data records carrying personal identifiers that are generated during file construction are shredded.

Data Files. When possible, electronic files for everyday use are created without personal identifiers. Data and sample files that must contain sensitive data are stored and analyzed on one of Mathematica’s “Secure_Data” drives. Specifically, staff working on this project will be instructed to maintain all files with private data in project-specific, encrypted folders on the Mathematica network. Access control lists restrict access on a need-to-know basis and only to project staff who are specifically authorized to view the sample data (as designated by the project director or survey director) to select and process the sample or to process the data files. Sensitive data that are no longer needed in the performance of the project will be magnetically erased or overwritten using Hard Disk Scrubber or equivalent software, or otherwise destroyed.

b. Survey of UI Administrators

The evaluation team also will provide to DOL a public use data file and documentation for the data collected as part of the survey of UI administrators. For this set of data, it is not expected that the identity of respondents will be kept private, because the target respondents are UI administrators (whose identities are publicly known) or individuals they designate. Furthermore, much information is publicly available about states’ decisions whether or not to adopt the UC-related provisions of ARRA. Therefore, for at least some states, it is likely that the public could identify a state from the records in the public use data file. However, the public use data file and documentation will exclude the respondents’ names, titles, and contact information.

Making available information about the experiences of specific states in the implementation of the UC-related provisions of ARRA, including the identities of the states, is consistent with DOL practice in other similar research, such as a recent NASWA study of states’ early implementation experiences with the workforce development and UI provisions of ARRA.

11. Questions of a Sensitive Nature

a. UI Recipient Survey

The UI recipient survey contains some questions that may be considered sensitive. These questions are related to earnings, income, participation in transfer programs, the need for health care, household savings, missed or late payments on financial obligations, and other measures of financial distress (Section H). However, depending on an individual’s particular circumstances, any question could be perceived as sensitive. Mathematica’s interviewers are well trained to show sensitivity while remaining impartial. Also, if a respondent refuses or shows resistance to answering a financial question, alternate versions of the question that accept a range of values are generally provided. Finally, to encourage reporting, reluctant respondents are also reminded that their answers will be treated as private.

All questions in the UI recipient survey, including those deemed potentially sensitive, have been pretested and many have been used extensively in prior surveys with no evidence of harm. Questions about income, household savings, indicators of financial distress, and receipt of public assistance are necessary to measure the economic well-being of study participants. Obtaining information about these potentially delicate topics is integral to addressing the research questions posed by the study, in order to describe the characteristics of UI recipients, describe their outcomes, and assess the impact of the UC ARRA provisions.

b. Survey of UI Administrators and Site Visits

There are no questions of a sensitive nature in the survey of UI administrators or in the site visit interviews.

12. Hour Burden of the Collection of Information

The hour burden estimate for the collection of information that is part of this clearance request consists of the burden from the UI recipient survey, the survey of UI administrators, and the site visit data collection (Table A.5).

The hour burden for the UI recipient survey is estimated at 30 minutes for each respondent. Hence, the total time for respondents to complete the questionnaire is 2,400 x (30/60) hours, which is equal to 1,200 hours.

The hour burden of the survey of UI administrators is expected to be 34 hours. The number of respondents and the average response times are based on an assumption that (1) 34 UI jurisdictions will take 50 minutes to respond (involving 1 respondent for 30 minutes and 1 respondent for 20) and (2) 17 UI jurisdictions will take 20 minutes to respond (1 respondent for 20 minutes). This expected variation in survey completion time is because large portions of the survey will be skipped for jurisdictions that did not implement the full set of UC-related modernization provisions in response to ARRA or had implemented the relevant provision prior to ARRA.

The hour burden for the site visit data collection is expected to be 485 hours. For each of the 20 jurisdictions that will be part of this data collection effort, an average of two hours of previsit planning and coordination with the evaluation team (by 4 staff per state for 30 minutes each) is expected. The onsite interviews are expected to include averages of (1) 9 state UI office staff, (2) 1.5 call center administrators (1 administrator in half of the states and 2 administrators in the other half), (3) 1 local One-Stop Career Center administrator, and (4) 3 other stakeholders, such as individuals on the UC advisory council. Each interview is expected to last an average of 90 minutes. Each UI jurisdiction that is part of the site visit data collection effort also will be asked to have a staff person complete the data systems survey before the visit; the time to complete this survey is expected to be 30 minutes.

The estimated total burden for the data collection included in this request for clearance is 1,719 hours, which equals the sum of the estimated burden for the survey of UI recipients, the survey of UI administrators, and the site visit data collection effort.
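The arithmetic behind these totals, using only the counts and per-response times given above, can be verified with a short calculation:

```python
# The burden arithmetic from Table A.5, reproduced as a check. All
# counts and per-response times come from the text above.

recipient_survey = 2400 * 30 / 60                       # 1,200 hours

administrators = 51 * 20 / 60 + 34 * 30 / 60            # 17 + 17 = 34 hours

site_visits = sum([
    80 * 30 / 60,    # previsit planning: 4 staff x 20 states x 30 min
    180 * 90 / 60,   # state UI office staff interviews
    30 * 90 / 60,    # call center administrators
    20 * 90 / 60,    # One-Stop Career Center administrators
    60 * 90 / 60,    # other stakeholders
    20 * 30 / 60,    # data systems survey
])                                                      # 485 hours

print(recipient_survey + administrators + site_visits)  # 1719.0
```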

13. Estimated Total Annual Cost Burden to Respondents and Record Keepers

There will be no start-up or ongoing financial costs incurred by respondents.

14. Estimated Annualized Cost to the Federal Government

The total estimated cost to the federal government of conducting the UCP evaluation is $4,288,407, which is the total contractor cost of conducting the evaluation over a three-year period. The annualized cost to the government is $1,429,469. This cost includes the study tasks shown in Table A.6.



Table A.5. Burden Estimates for Data Collection Efforts

| Respondents | Number of Respondents/Instances of Collection | Frequency of Collection | Average Time per Response | Burden (Hours) |
|---|---|---|---|---|
| UI Recipient Survey | | | | |
| UI recipients | 2,400 | Once | 30 minutes | 1,200 |
| Survey of UI Administratorsa | | | | |
| State staff | 51 | Once | 20 minutes | 17 |
| State staff | 34 | Once | 30 minutes | 17 |
| Total for survey of UI administrators | 85 | -- | -- | 34 |
| Site Visit Data Collection | | | | |
| Planning for the site visits | 80 | Once | 30 minutes | 40 |
| On-site interviews: | | | | |
| State UI office staff | 180 | Once | 90 minutes | 270 |
| Call center administrator | 30 | Once | 90 minutes | 45 |
| Local One-Stop Career Center administrator | 20 | Once | 90 minutes | 30 |
| Other stakeholders | 60 | Once | 90 minutes | 90 |
| Data systems survey: | | | | |
| State staff | 20 | Once | 30 minutes | 10 |
| Total for site visits | 390 | -- | -- | 485 |
| Grand Total for All Three Data Collection Efforts | 2,875 | -- | -- | 1,719 |

Note: Other stakeholders = lobbyists, legislature, council member.

a The number of respondents and average time per response for the survey of UI administrators are based on an assumption that (1) 34 UI jurisdictions will take 50 minutes to respond (involving 1 respondent for 30 minutes and 1 respondent for 20) and (2) 17 UI jurisdictions will take 20 minutes to respond (1 respondent for 20 minutes).



Table A.6. Study Task by Cost

| Study Task | Cost |
|---|---|
| Evaluation Design, Technical Working Group Meetings, Reports, and Review | $531,001 |
| Collection of UI Administrative Data | 1,051,694a |
| Sampling for the UI Recipient Survey | 178,809 |
| CATI Programming and Database Design for the UI Recipient Survey | 193,850 |
| UI Recipient Survey Management | 215,796 |
| UI Recipient Survey Questionnaire Development and Training | 91,114 |
| UI Recipient Survey Locating | 196,615 |
| UI Recipient Survey Data Collection | 442,664 |
| Conduct Survey of UI Administrators | 17,757 |
| Conduct Site Visits | 312,334 |
| Prepare OMB Package | 38,364 |
| Create UI Modernization Report | 213,082 |
| Create Emergency Benefits Report | 365,015 |
| Create Impact on Claimants Report | 341,298 |
| Conduct Briefings on Study Findings | 77,152 |
| Create Public Use Data File | 21,862 |
| Total | $4,288,407 |

TWG = technical working group. UI = unemployment insurance.

a Includes funds, provided under a task order separate from the main evaluation contract, to compensate states for the provision of administrative data to be provided for the evaluation.

15. Changes in Burden

The data collection efforts for the evaluation of the UC provisions of the ARRA of 2009 will count as 1,719 hours toward DOL’s information collection burden.

16. Publication Plans and Project Schedule

The evaluation will convey findings in three reports: (1) a modernization report, (2) an emergency benefits report, and (3) an impacts report. The modernization report will contain analysis of states’ decisions about the UI modernization provisions and their experiences implementing these and other UC-related provisions of ARRA. The emergency benefits report will contain analysis of states’ experiences regarding EB and emergency benefits extensions; it also will include an examination of the characteristics of recipients affected by the extensions of benefits. The impacts report will cover estimates of the impacts of the ARRA UC provisions on recipients. Additional detail about the approaches to be used for the analyses presented in these reports is provided in Section B.2.

The schedule for the fielding of the data collection efforts and the delivery of the reports is provided in Table A.7.



Table A.7. Schedule for Project Tasks

| Tasks | Schedule (pending OMB approval) |
|---|---|
| Fielding of the UI Recipient Survey | 9/1/2012 to 3/31/2013 |
| Fielding of the Survey of UI Administrators | 6/15/2012 to 10/15/2012 |
| Site Visits | 11/15/2012 to 3/15/2013 |
| Modernization Report | 10/15/2013 |
| Emergency Benefits Report | 1/15/2014 |
| Impacts Report | 3/15/2014 |
| Public Use Data Files | 3/29/2014 |

17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all respondent materials developed for the study.

18. Exceptions to the Certification Statement

No exceptions to the certification statement are requested for this data collection.



REFERENCES

Curtin, Richard, Stanley Presser, and Eleanor Singer. “Changes in Telephone Survey Nonresponse Over the Past Quarter Century.” Public Opinion Quarterly, vol. 69, no. 1, spring 2005, pp. 87–98.

Jäckle, Annette, and Peter Lynn. “Respondent Incentives in a Multi-Mode Panel Survey: Cumulative Effects on Nonresponse and Bias.” Working paper presented to the Institute for Social and Economic Research, University of Essex, Colchester, United Kingdom, 2007.

Jacobson, Louis S., Robert J. LaLonde, and Daniel G. Sullivan. “Earnings Losses of Displaced Workers.” American Economic Review, vol. 83, no. 4, September 1993, pp. 685–709.

Kay, Ward R. “The Use of Targeted Incentives to Reluctant Respondents on Response Rates and Data Quality.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2001.

Schwartz, Lisa K., Lisbeth Goble, and Edward M. English. “Counterbalancing Topic Interest with Cell Quotas and Incentives: Examining Leverage-Salience Theory in the Context of the Poetry in America Survey.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2006.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, summer 2000, pp. 171–188.

Vroman, Wayne. “The Alternative Base Period in Unemployment Insurance: Final Report.” Unemployment Insurance Occasional Papers 1995-3. Washington, DC: U.S. Department of Labor, Employment and Training Administration, 1995.










1 When this clearance package was prepared, the survey sample was intended to include UI recipients with BYB dates ranging from October 1, 2007, through September 30, 2009. Subsequently, DOL and the contractor decided to remove from the sample the recipients with BYB dates in 2007. Such recipients would face the longest recall periods and the most challenges in providing information for data items tied to the calendar year (for example, household income). In addition, eliminating those UC recipients allows a shorter time frame to be covered by the administrative data extracts. A consequence of this decision is that the study will not be able to fully analyze the impacts of the first tier of EUC08. DOL determined that this was an acceptable reduction in information, given the advantages of starting the data coverage in 2008, particularly since estimating an impact of the first tier of EUC08 would require an interrupted time series design, an approach that is less rigorous than the other methods described in Section B.2.

2 As described later in this section, the 20 states in the site visit data collection effort might not be the same states as those included in the UI recipient survey.

3 The data collection plans do not include interviews or focus groups with UI recipients because the focus of the data collection effort is on states’ perspectives. Furthermore, conducting these types of interviews or focus groups in a way that would provide high-quality data would be very resource-intensive. However, some information about the experiences of UI recipients will be available through the UI recipient survey.


