
Work Opportunity Tax Credit (WOTC) Implementation Evaluation

OMB Control Number 1290-0NEW

OMB Expiration Date: TBD




SUPPORTING STATEMENT FOR

Work Opportunity Tax Credit (WOTC) Implementation Evaluation

OMB CONTROL NO 1290-0NEW

This is a new information collection request.

  1. JUSTIFICATION



  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

This implementation evaluation of WOTC has been authorized by the Department of Labor’s (DOL) Chief Evaluation Office (CEO), in partnership with the Employment and Training Administration (ETA) and the Office of Disability Employment Policy. This implementation evaluation is authorized to fulfill the requirements of the Foundations for Evidence-Based Policymaking Act of 2018 and associated Office of Management and Budget (OMB) guidance.

In fiscal year (FY) 2023, State Workforce Agencies (SWAs) received a total of 7,939,913 certification requests from employers. At the end of the fourth quarter, 25.0 percent of certification requests had been approved, 47.9 percent had been denied, and 27.1 percent remained in backlog. WOTC performance varies widely across states in approval rate (from less than one percent to 51 percent), denial rate (from less than one percent to 81 percent), and pending rate (from no pending requests to as high as 98.5 percent). In addition, two target groups account for nearly three-fourths of the certifications: Supplemental Nutrition Assistance Program (SNAP) participants account for 65.1 percent of certified WOTC employees, and long-term unemployed individuals, the next largest group, account for 6.8 percent (71.9 percent combined). Given the large variation in performance across states, the approval of only one in four certification requests, and the concentration of certifications in just two target groups (albeit SNAP is the largest assistance program), it is important to understand how implementation practices could contribute to these lopsided program results. DOL’s objective for this evaluation is to better understand the WOTC program, how it is administered among state workforce agencies, how it serves job seekers and employers, the effectiveness and efficiency of its current design, potential improvements in structure and operations, and potential future research in this area. This study addresses the implementation circumstances and activities of:

  • WOTC employees’ experiences with American Job Centers (AJCs) and other partner organizations in preparing for employment.

  • WOTC employees’ experiences in obtaining employment.

  • Employers’ and employer representatives’ experiences in hiring WOTC candidates and obtaining certifications.

  • American Job Centers’ and other partner organizations’ experience in assisting potential WOTC employees in obtaining employment.

  • SWAs’ experiences with employers and with agencies that provide evidence of eligibility for WOTC certifications.

An outcome and impact evaluation will be considered at a later date.

The WOTC implementation data collection overview

The study will leverage existing data gathered at the Federal and State levels, combined with primary data collected through the surveys for which clearance is sought. The extant Federal data comprise aggregate performance data that SWAs report on ETA Form 9058. Existing State data include the employer- and employee-level data reported on the forms that employers submit to seek certification of employees for WOTC. These data will be combined with extant data from Dun & Bradstreet, which describe the establishments where WOTC employees work, including their size, type, and major industries. In addition, the Occupational Information Network (O*NET) will be queried for information on wages earned in occupations in the ZIP codes where WOTC employees work or worked, to assess whether WOTC employees are receiving wages consistent with local norms. The 2022 Current Population Survey (CPS) supplement will provide information on the average wages in the occupations in which WOTC employees work.



These extant data will be analyzed independently and later merged with survey data (for which OMB clearance is sought) from WOTC participants. Survey responses will provide an understanding of the current implementation practices and challenges faced by different entities involved in WOTC. Therefore, included in this clearance request are web-based surveys for:

  1. SWA WOTC Administrators

  2. AJCs/Partner Organizations that pre-certify individuals in WOTC Target Groups

  3. WOTC-participating Businesses and Business Representatives Acting on Behalf of a WOTC-participating Business (also labeled employer survey)

  4. Individuals Certified in a WOTC Target Group (also labeled employee survey).



  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

The information collected in this new collection, combined with data from extant sources, will enable DOL’s CEO and ETA to assess how the program is being carried out. Implementation evaluation is formative, not causal, and may provide insights into improvements that could support the success of the program. WOTC, like many Federal programs, does not have a highly standardized delivery process; it is not a “standardized treatment” like that found in experimental research. SWAs design and implement their procedures under general guidance from DOL. Consequently, there are opportunities to examine how these variations are associated with implementation outcomes. This study will enable an assessment of the reach of the program. It will reveal the types of employers that participate, the proportions of target groups served who obtain various types of jobs at various wage levels, and (from O*NET) the availability of jobs in the ZIP codes where current employers operate. Data collected from multiple sources will also enable DOL to understand how WOTC is administered among state workforce agencies, how it serves job seekers and employers, and potential future research issues. The following is a list of the general research questions this study seeks to answer.

  1. What are the demographics of WOTC-certified employees and how do they compare to the demographics of the general population?

  2. What is the degree of WOTC utilization among employers and what are the characteristics of these employers?

  3. What are the characteristics of jobs of WOTC-certified employees?

  4. How is WOTC currently operating?

The study team will synthesize data across all sources to answer the research questions.



  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.



Primary data collection will use technology to minimize the burden on study participants and on staff at participating agencies. Electronic versions of the proposed surveys will be used so respondents can complete the questionnaires on their own schedule, in multiple sittings, and without needing to return any forms by mail. Respondents will need Internet access to respond, either through a cell phone with internet capability or a computer. The surveys will be Section 508 compliant to ensure that individuals with disabilities are able to complete them as well.



As of 2021, the Pew Research Center reported that 98 percent of people under the age of 64 use the internet.1 In addition, the FCC sponsors Lifeline phones for low-income individuals that include voice and broadband service. Lifeline phones are currently available to low-income individuals in all States except Montana.2 These statistics indicate that WOTC-certified individuals, who were employed through WOTC, are likely to have internet access, including cell phones with broadband. Some WOTC employee respondents may have limited reading skills and could be hampered in their responses for that reason; however, most computers and phones offer text-to-speech features that can read the questions aloud. Hence, we believe this data collection technology is feasible for all survey respondents, including WOTC employees.



  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.



Data sufficient to conduct the WOTC implementation evaluation are not otherwise available from existing sources. National data on WOTC are limited to annual and quarterly outputs (e.g., number of certifications, denials, and backlogged requests; number of WOTC certifications by two-digit O*NET occupation category; initial wages; and number of certifications by target group). No information is available on the implementation process at the different nodes: employee, employer, SWAs, AJC/partner organizations, and IRS. Relevant extant data will be combined with the data collected through surveys for the analysis in the final report. The study team will request new information only through the surveys, and each respondent will be surveyed only once.



  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.



ETA collects no data on the size of businesses that employ people with WOTC certifications. Until SWA records are obtained, we will not be able to determine the number of small businesses that participate in WOTC. To minimize the burden associated with the survey, we will conduct an Internet-based survey that allows employers to respond at the time most convenient for them. In all data collection activities, participants will be reminded that their participation is completely voluntary.



  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

This project is predicated on the expectation that adequate implementation of WOTC is essential to achieving WOTC program outcomes and impacts. ETA provides guidance to State Workforce Agencies in implementing WOTC, but it has limited information on how WOTC implementation is operationalized among the 53 SWAs (the 50 states plus Washington, D.C., Puerto Rico, and the U.S. Virgin Islands), the American Job Centers, the employers, and the employees. Given that the program has existed since 1996, this implementation evaluation will describe how WOTC is carried out and how variations in implementation are logically related to these implementation outcomes: reaching the target audiences, adopting processes that logically support the WOTC program goals, completing certifications on a timely basis, and achieving participant satisfaction. These are viewed as necessary precursors to program outcomes and impacts. This evaluation is limited to describing the implementation of WOTC and its logical connection to implementation outcomes; it does not seek to measure WOTC program outcomes and impacts on employees and employers.

Without surveying employees, employers, SWAs, and AJCs/partner organizations, ETA would have no evidence base for determining whether the WOTC delivery system supports achievement of program goals. If the surveys were not conducted, ETA would not learn what processes are associated with the timeliness of certifications or understand the factors that produce the substantial backlogs in WOTC certifications. It would not learn about the issues associated with collecting and verifying eligibility information for each of the 10 target groups, and how those issues differ. ETA would have no basis for policy or procedural guidance to remove barriers in the implementation processes that logically limit WOTC from reaching the eight low-participating target groups. Without surveying employees, no information beyond the target group is available about the characteristics of employees reached by WOTC, their employment experience, or features of their jobs such as the benefits they receive or their satisfaction with the job. Inability to collect data from employers would lead to an incomplete understanding of what motivates employers to adopt WOTC into their recruitment practices and how differences in hiring practices and verification processes are associated with reaching various WOTC target groups, including hiring and retaining WOTC employees. Without the AJC survey, no evidence would be available about what actions AJCs take and how these actions are associated with identifying and preparing WOTC candidates, performing pre-certifications, and helping WOTC candidates leverage their pre-certifications to obtain employment.


  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

    • requiring respondents to report information to the agency more often than quarterly;

    • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

    • requiring respondents to submit more than an original and two copies of any document;

    • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

    • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

    • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

    • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

    • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.



The collection of demographic data is needed in order to answer specific research questions. However, an exemption from using the full question developed in OMB's Statistical Policy Directive No. 15 is requested. The full question would impose additional burden on the employees completing the survey. Given that these employees are generally not aware of the WOTC program, asking for this level of detail may not yield usable answers. Additionally, given the smaller number of responses expected from employees, statistically valid analysis at the level of detail collected by the full question would not be possible. Therefore, use of the race and ethnicity question with the minimum categories, rather than the detailed version, is preferred. The demographic question appears in the Employee survey as follows:



What is your race and/or ethnicity? Select all that apply.

  1. American Indian or Alaska Native – For example, Navajo Nation, Blackfeet Tribe of the Blackfeet Indian Reservation of Montana, Native Village of Barrow Inupiat Traditional Government, Nome Eskimo Community, Aztec, Maya, etc.

  2. Asian – For example, Chinese, Asian Indian, Filipino, Vietnamese, Korean, Japanese, etc.

  3. Black or African American – For example, African American, Jamaican, Haitian, Nigerian, Ethiopian, Somali, etc.

  4. Hispanic or Latino – For example, Mexican, Cuban, Puerto Rican, Salvadoran, Dominican, Guatemalan, etc.

  5. Middle Eastern or North African – For example, Lebanese, Iranian, Egyptian, Syrian, Iraqi, Israeli, etc.

  6. Native Hawaiian or Pacific Islander – For example, Native Hawaiian, Samoan, Chamorro, Tongan, Fijian, Marshallese, etc.

  7. White – For example, English, German, Irish, Italian, Polish, Scottish, etc.

  8. Prefer not to answer



  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.



Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.



Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.



A 60-day notice was published in the Federal Register on December 12, 2023 (Volume 88, Number 238, pages 86384-86385), soliciting comments on DOL’s intent to request OMB approval for data collection on the WOTC Implementation Evaluation. One organization, the National Employment Law Project (NELP), provided comments after reviewing the survey instruments and study plan. Its thorough review of the plan and survey documents provided helpful insight into the employer and employee surveys as well as the overall study plan. For the study plan, NELP asked that the contractor focus more on job security, skills and advancement (including training), and diversity. NELP also included edits to roughly 20 questions on the employee survey. Highlighted changes that were accepted include adding:

  • A question on how employees commute to work daily.

  • Question options on why employees no longer work in the WOTC job and, if they left voluntarily, why they left.

  • A question on how the employee’s hours and days of work fluctuate.

In addition to these changes, NELP provided suggestions on modifying several Likert scale options and adding variables to other questions.



The study team also consulted with Methodological and Content Technical Working Groups (TWGs) on the research design and on identifying important data needs. The TWGs convened on February 22, 2024 (Methodological) and February 29, 2024 (Content) to discuss the overall evaluation plan and to provide a thorough review of the survey instruments and communication documents. Input from the TWGs aided in finalizing the survey instruments, survey administration methodology, and analysis plan for this project. Table 1 lists the people the study team has consulted in preparing this OMB package.



Table 1. List of Experts on the Technical Work Group Panel

Name | Title and Affiliation

Sarah Hamersma, Ph.D. | Associate Professor and Director of Doctoral Studies, Public Administration and International Affairs; Senior Research Associate, Center for Policy Research; Maxwell School of Citizenship and Public Affairs, Syracuse University

Peter Cappelli, Ph.D. | George W. Taylor Professor of Management; Director, Center for Human Resources; The Wharton School, University of Pennsylvania

Shunta Williams, B.J. Knutson-Cruset, Loi Dang, Carrie Thomas, and Russel Hunter | National Association of State Workforce Agencies (NASWA)



In addition to the TWGs, DOL used a third-party contractor to conduct a technical review of the evaluation design and data collection instruments for this engagement. Input from the third-party contractor was also incorporated into revisions to the study design and data collection instruments.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.



Each respondent to the Employee survey will receive a $10 gift certificate, provided either by mail or email. Because the employees being surveyed will not be fully aware of their participation in the WOTC program, it has been deemed necessary to provide an incentive to respond. Based on available funds, $10 was determined to be an acceptable amount.



  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If the collection requires a systems of records notice (SORN) or privacy impact assessment (PIA), those should be cited and described here.



The study team will use personal information (name, address, and phone number) for the purpose of contacting survey respondents to notify or remind them of the study and to request that they participate in a survey. Each respondent will be assigned a non-personal study ID on their survey form, and all collected data will be identified by the study ID. The contact information files will be kept separate from the data files, securely stored, and accessible only on a need-to-know basis. The study team ensures compliance with DOL data security requirements through the implementation of security controls within the standard processes routinely used in projects involving sensitive data. Data security begins at the time of data collection, when the study team makes efforts to ensure that respondents understand the extent to which their information will be kept private.



For each survey, a Data Collection Instrument Disclosure statement will be provided in the email to the participant as well as in the introduction to the survey to ensure participants understand their rights and what the data will be used for. A sample of this statement is below.



Data Collection Instrument Disclosure Statement:

OMB Control Number: 1290-0NEW

Expires: TBD

Public reporting burden for this survey is estimated to average [enter hour burden per response] per response. The burden estimate includes the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and submitting the survey. This collection of information is voluntary. You are not required to respond to this collection of information unless it displays a valid OMB control number. Please send comments regarding the burden estimate or any other aspect of this collection of information to EconSys at [email protected] and reference OMB control number [1290-0NEW].



In addition, every survey will provide email addresses for both the Contractor and DOL, in case participants have additional questions about this data collection effort.



  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

The study team will not ask study participants sensitive questions, such as questions about sexual behavior and attitudes, religious beliefs, or other matters that are commonly considered private. All respondents will be provided with an assurance of privacy before beginning the surveys and interviews.



  12. Provide estimates of the hour burden of the collection of information. The statement should:

    1. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

    2. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

    3. Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under ‘Annual Cost to Federal Government.’



These surveys are a one-time data collection effort. Table 2 below provides burden estimates based on our sampling plan and testing procedures discussed further in Supporting Statement B.



Table 2. Estimated Annualized Respondent Cost and Hour Burden

Form/Activity/Section | No. of Respondents | No. of Responses per Respondent | Total Responses | Average Burden per Response (in minutes) | Total Burden (in hours) | Hourly Wage Rate | Monetized Value of Time

State Workforce Agency | 50 | 1 | 50 | 45 | 38 | $47.16 | $1,792

WOTC Employers | 400 | 1 | 400 | 25 | 167 | $62.50 | $10,438

AJC/Partner Organizations | 400 | 1 | 400 | 15 | 100 | $47.16 | $4,716

Certified Employees | 600 | 1 | 600 | 20 | 200 | $18.59 | $3,718

TOTAL | 1,450 | | 1,450 | | 505 | | $20,664

1 The hourly wage for both SWA and AJC employees was determined using O*NET, based on the occupation General and Operations Managers (11-1021.00).

2 The hourly wage for employers was determined using O*NET, based on the occupation Human Resources Managers (11-3121.00).

3 The hourly wage for employees was determined using O*NET, based on the occupation Office and Administrative Support Workers, All Other (43-9199.00). Given the wide range of jobs these employees hold, a general occupation was judged to fit most employees.



Overall, this data collection effort will include a total of 505 burden hours with a monetized value of time of $20,664.
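As a worked example of how the figures in Table 2 combine, the State Workforce Agency row is calculated as 50 respondents × 1 response each × 45 minutes per response = 2,250 minutes, or approximately 38 hours, and 38 hours × $47.16 per hour ≈ $1,792. The remaining rows follow the same calculation and sum to the 505-hour and $20,664 totals.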



  13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

    1. The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

    2. If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

    3. Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.

There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.



  14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

The total cost to the Federal Government is $237,202 for the two years of the study, which equates to an annualized cost of $118,601. Costs incurred for the study are as follows (the annualization arithmetic is summarized after the list):

    1. Tasks associated with this collection of information total $198,766. This amount includes a $6,000 survey incentive for respondents to the Employee survey. Annualized over the two years of tasks (study design, survey incentive, data collection, analyses, and delivery of final reports to DOL), this comes to $99,383 per year.

    2. The annual cost to DOL for Federal technical staff to oversee the contract is estimated at $19,218, or $38,436 for the two-year study. This estimate is based on an expected total of 600 hours of DOL staff time at an hourly rate of $64.06 (GS-13, Step 5).
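As a check on the annualization, the figures above combine as follows: 600 hours × $64.06 per hour = $38,436 in Federal staff costs over two years, or $19,218 per year; $198,766 for contract tasks plus $38,436 for Federal staff equals $237,202 in total, or $118,601 when annualized over the two-year study.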



  15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.

This is a new information collection. Therefore, there are no changes or adjustments.



  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

The analytic techniques used in this study will consist of descriptive statistics, tabulations, and relational analysis using regression techniques. The focus of the analysis will be to illustrate commonalities and differences in implementation practices and their relationship to certifications, denials, backlog, and the employment and tenure of WOTC employees. No complex analytical techniques are planned.

Because the study samples of employers, employees, and partner organizations are derived from the SWA sample, and SWAs will be selected based on their ability to provide electronic files, the findings will be representative of those States and their WOTC employees and employers. They will not be representative of all States or of all WOTC employers, employees, and partner organizations. States without electronic systems, presumably smaller states, will be underrepresented. However, these states can still benefit from the study’s findings, which will reveal which implementation procedures are associated with quicker certifications, lower backlogs, and greater participation of WOTC target groups.

The evaluation plan includes an implementation study final report in the fall of 2025. The study team will prepare a report describing the findings from the implementation evaluation and deliver it to the DOL CEO. The report will document the background of the WOTC program, the design of the implementation evaluation, the analysis approach used in the statistical analyses, and the findings, followed by recommendations. It will describe how employees go through the certification and hiring process under the WOTC program, which involves employers, SWAs, and AJC/partner organizations. It will also document feedback, including challenges and satisfaction levels, from employees, employers, SWAs, and AJC/partner organizations. In addition, using qualitative and quantitative methods, the report will describe findings on the characteristics of WOTC employees and employers, the retention rates of WOTC survey respondents relative to all employees in the same occupations, their career advancement, and any challenges faced by employees, employers, AJC/partner organizations, and SWAs.

Table 3 contains the detailed project schedule.

Table 3. Work Breakdown Schedule (WBS)

WBS | Task | Start Date | End Date

1 | Provide Overall Project Management | 10/30/23 | 11/27/23
1.1 | Orientation Meeting | 10/30/23 | 10/30/23
1.1 | Summary of Meeting | 10/31/23 | 11/06/23
1.2 | Develop Work Plan | 10/31/23 | 11/27/23
2 | Study Design | 09/30/23 | 03/20/24
2.1 | Refine Research Questions and Design Options | 09/30/23 | 12/27/23
2.2 | Evaluation Design Report | 09/30/23 | 01/24/24
2.2.1 | Draft Evaluation Design Report | 09/30/23 | 12/27/23
2.2.2 | Receive Feedback from DOL | 12/28/23 | 01/10/24
2.2.3 | Evaluation Design Report | 01/11/24 | 01/24/24
2.3 | Expert Panels and Other SMEs | 01/25/24 | 03/20/24
2.3.1 | Agendas for Panels (2 weeks prior to discussions) | 01/25/24 | 03/20/24
2.3.2 | Meeting Summary (1 week after each meeting) | 01/25/24 | 03/20/24
3 | Data Collection | 09/27/23 | 08/30/25
3.1 | Detailed Security Plan | 12/28/23 | 02/29/24
3.2 | Data Collection Instruments & Instructions | 09/27/23 | 12/27/23
3.3 | OMB PRA Clearance | 09/27/23 | 12/31/24
3.3.1 | Develop Materials | 09/27/23 | 01/26/24
3.3.2 | Submit Draft Clearance Package to DOL | 01/29/24 | 01/29/24
3.3.3 | Finalize and Submit Final PRA to DOL | 01/30/24 | 06/12/24
3.3.4 | Submit Clearance Package | 02/20/24 | 09/09/24
3.3.5 | Clearance Package Accepted | 02/21/24 | 03/21/25
3.4 | Data Collection | 03/21/24 | 06/27/25
3.4.1 | Extant Data Collection | 03/21/24 | 11/10/24
3.4.2 | Primary Data Collection (Interviews/Site Visits) | 03/21/24 | 12/02/24
3.4.3 | Survey Data Collection | 03/27/25 | 05/27/25
3.5 | Data Analysis Plan | 01/25/24 | 03/20/24
3.6 | Data Analysis | 03/21/24 | 07/30/25
4 | Conduct Implementation Evaluation | 06/27/24 | 09/27/25
4.1 | Draft Interim Report | 06/27/24 | 06/27/24
4.2 | Final Interim Report | 09/27/24 | 09/27/24
4.3 | Draft Final Report | 05/27/25 | 09/27/25
4.4 | Final Report | 12/27/25 | 12/27/25
5 | Evaluability Assessment | TBD | TBD
5.1 | Draft Evaluability Report | TBD | TBD
5.2 | Final Evaluability Report | TBD | TBD
6 | Dissemination | TBD | TBD
6.1 | Restricted Use Data File | TBD | TBD
6.2 | Disclosure Avoidance Review | TBD | TBD
6.3 | Dissemination Plan | TBD | TBD
6.4 | Final Briefing | TBD | TBD
7 | Optional Task | TBD | TBD
8 | Optional Task | TBD | TBD



  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.



  18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”

No exceptions to the certification statement are necessary for this information collection.

1 Pew Research Center. Source: Pew Research Center surveys of U.S. adults conducted 2000-2021. Data for each year are based on a pooled analysis of all surveys conducted during that year. Accessed January 8, 2024.
