Work Opportunity Tax Credit (WOTC) Implementation Evaluation
OMB Control Number 1219-0NEW
OMB Expiration Date: TBD
Part B: Collection of Information Employing Statistical Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
Note: response rate means: Of those in your respondent sample, from what percentage do you expect to get the required information (if this is not a mandatory collection). The nonrespondents would include those you could not contact, as well as those you contacted but who refused to participate.
The universe of potential respondents for the data collection efforts for which this request seeks approval from the Office of Management and Budget (OMB) includes all coordinators of the Work Opportunity Tax Credit (WOTC) program in all 53 State Workforce Agencies (SWAs) (all 50 states plus Washington, D.C., Puerto Rico, and the U.S. Virgin Islands). The universe for all other respondents will be based on a subset of the SWAs that have the capacity and willingness to provide electronic data needed for sampling and analysis. Once this “operational universe” is established, random samples will be drawn of American Job Center (AJC) and partner organization staff, employers that requested certification of WOTC eligibility for hired employees, representatives that support some employers in their interactions with SWAs, and SWA-certified WOTC employees. Table 1 provides the study universe and expected response rates for each survey respondent group.
Table 1. Respondent Universe by Type
Type | Sampling method | Universe size | Number of expected respondents | Estimated responses per respondent | Expected response rate
State | Census | 53 | 50 | 1 | 95%¹
Employer | Random sample | Unknown; will be obtained from SWA data | 400 | 1 | 10%²
Employee | Random sample | 1,984,978 in 2023 | 600 | 1 | 5%³
American Job Center/Partner organizations | Random sample | All AJC/partner organization staff in selected states; there are 2,300 AJCs nationwide (https://www.dol.gov/agencies/eta/american-job-centers). The sample will be selected from the subset of SWAs that have the capacity and willingness to provide AJC data. | 400 | 1 | 33%⁴
¹ Given that SWAs are DOL grantees, they are required to participate in evaluation activities such as this survey. While it is expected that all will participate, a 95% response rate allows for some states and territories not participating due to factors beyond DOL’s control.
² External surveys achieve lower response rates than internal surveys. Given that the contractor will have direct email access to an individual with knowledge of the WOTC program and can conduct multiple follow-up emails, a 10% response rate appears achievable based on prior research.
³ Achieving a high response rate for the employee survey will be difficult, given that only postcards and no follow-up will be sent. That employees may not know they are part of the WOTC program also presents challenges. The contractor is including a $10 incentive per survey response to increase response rates. The contractor previously conducted a survey using only one postcard and one paper survey follow-up, which achieved a 3% response rate. Given the inclusion of incentives, a 5% response rate is reasonable.
⁴ Similar to the employer survey, external surveys vary in response rates. Given that AJCs receive grant funds similar to SWAs, a higher response rate is expected because they are likely required to participate in evaluations. However, DOL may not be able to enforce participation as strictly as it can for SWAs.
Surveys
The web-based surveys will collect information to understand the processes used by administrators in SWAs, AJCs, and employers to recruit and hire candidates, determine WOTC eligibility, and certify individuals for employment. The WOTC employee survey will focus on the employee’s WOTC application process and employment experience. Overall, survey questions focus on key elements of these processes and on respondents’ satisfaction with them.
Study participant selection
The WOTC certification data show variation in WOTC workload across states. According to 2023 DOL WOTC performance statements, 10 states (CA, TX, MD, OH, FL, GA, IN, NY, IL, TN) account for half of all WOTC workload. It is expected that the largest SWAs participating in WOTC are automated and will be able to provide data that can be linked with extant data. However, even among these SWAs, some may have other priorities that prevent them from participating. The “operational universe” for this study will consist of as many of the large states as cooperate with the study. Beyond that, a few smaller states will be selected in consultation with the DOL Regional offices that oversee the SWAs. Hence, the samples of survey respondents (aside from the SWAs, where we will send surveys to the universe) will represent a specific portion of the WOTC population based on the operationally defined universe.
Within the “operational universe,” statistically representative samples of employers, employees, and AJCs will be selected. The number of respondents in each group and the statistical calculations behind those numbers are shown in Table 1. We calculate the representative sample size for the employee respondent universe using the methodology shown in equation (1). Our calculation shows that 400 respondents are sufficient to represent each of the employee, employer, and AJC/partner organization respondent universes.
Survey of SWA administrators. The goal of the survey of SWA administrators is to understand the ways in which SWAs engage in the WOTC program. As such, it collects information from all 53 SWAs about how they implement WOTC. With or without automation, all SWAs will be able to respond to questions about the procedures they use to administer WOTC. The expected response rate for this survey is 95 percent, based on the following requirement for DOL/ETA grantees:
A.12 Evaluation, Data, and Implementation
Grant and cooperative award recipients must cooperate during the implementation of a third-party evaluation. This means providing DOL/ETA or its authorized contractor with the appropriate data and access to program operating personnel and participants in a timely manner.
We anticipate that a few SWAs may not respond. This aligns with similar surveys that have been conducted in this area.
Survey of American Job Centers. This survey will provide information about AJCs’ approaches to identifying, pre-screening, pre-certifying, and verifying WOTC candidates; the ways SWAs are involved with AJCs in WOTC activities; the specific WOTC target group(s) AJCs serve; how AJCs work with employers; and other challenges AJCs face in WOTC program implementation. The study team will collect information from 400 AJCs and partner organizations. Since AJCs also receive DOL grant funds, similar to the SWAs, we anticipate a high rate of response.
Survey of employers. The goal of surveying sampled employers is to collect information about the employer or employer representative’s involvement in WOTC implementation (identifying, verifying, and certifying WOTC candidates), the employer’s interactions with the SWA in obtaining certifications, and the typical duration of WOTC employees’ employment. The study aims to survey 400 employers. The employer tax credit for each WOTC hire can motivate employers to respond to the survey. In addition, the survey covers topics of interest to employers, such as issues in receiving certification determinations and difficulties in obtaining documentation, which can also motivate responses. However, there is no penalty for nonresponse by employers.
Survey of employees. The goal of surveying employees is to understand the characteristics of candidates employed as WOTC employees and their employment experience, including hours worked, whether they sustained the WOTC job or obtained another job, and their satisfaction. A random sample will be drawn to collect responses from 600 employees. WOTC employees may not be aware that they are participants in the WOTC program, given that employers may embed WOTC form requirements in their recruiting and hiring forms. Given that, we anticipate difficulty in surveying these individuals, and the contractor will provide an incentive to complete the survey. With the incentive, the expected response rate is roughly 5 percent, calculated based on the information provided above.
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Statistical methodology, estimation, and degree of accuracy
Given our experience collecting survey data, we estimate an approximate response rate of 5% from employees and 10% from employers. We note and assume that the response rate will likely vary by target group. However, these differences are unknown to us at this time and therefore cannot be directly accounted for in the sample development.
As our only requirement is to draw a representative sample, we have used the following formula and assumptions to estimate the sample size of employees:
N = Population Size (N = 1,984,978)
Z = Z-score associated with normal distribution (95/5% confidence, associated Z-score = 1.96)
e = Margin of Error (e = 5%)
p = Estimated population proportion (p = 0.5; the true value is unknown and is conservatively assumed to be 0.5, which maximizes the required sample size)
Required sample size formula (with finite-population correction):
n = [Z² × p(1 − p) / e²] / [1 + Z² × p(1 − p) / (e² × N)]     (1)
Applying the formula with the noted assumptions leads to a sample size of 384. For simplicity’s sake we are rounding the value to 400. (Because we will not know the number of employers until we obtain data from the SWAs, we cannot provide a calculation for the employer sample. We assume that the universe will be sufficiently large to apply the formula described above.) We plan to sample all the AJCs in the sampled SWAs. Each AJC will be asked to identify another partner organization in their community, and we will survey those as well.
Achieving 600 completed employee surveys at a response rate of five percent requires mailing 12,000 survey requests.
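For verification only, the sample size and mailing figures above can be reproduced with a short script. The following is a minimal Python sketch, assuming the finite-population-corrected formula in equation (1) and the stated inputs (N = 1,984,978; Z = 1.96; e = 0.05; p = 0.5; 5% expected employee response rate); it is illustrative and not part of the approved collection.

```python
import math

# Assumptions stated in the text above
N = 1_984_978   # employee universe (2023 certifications)
Z = 1.96        # z-score for 95% confidence
e = 0.05        # margin of error
p = 0.5         # conservatively assumed proportion

# Equation (1): required sample size with finite-population correction
n0 = (Z ** 2) * p * (1 - p) / (e ** 2)   # ~384.16 for an unlimited population
n = n0 / (1 + n0 / N)                    # ~384 after the finite-population correction
print(round(n))                          # 384, rounded to 400 in the text for simplicity

# Mailings needed for 600 completed employee surveys at a 5% response rate
completes_needed = 600
response_rate = 0.05
print(math.ceil(completes_needed / response_rate))   # 12,000
```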
Analyzing survey data
The data gathered through the surveys will be tabulated using descriptive methods (including simple frequencies, cross-tabulations, and means, when appropriate) to provide contextual information about the characteristics of respondents, the strategies used by different implementing entities, challenges, and feedback. When relevant, important findings will also be presented through visualization techniques to communicate them to readers.
The analysis plan for the survey results is purely descriptive in nature. Each main segment (Employers, Employees, SWAs, and AJCs) performs its own separate function within the WOTC program. There is no need to compare within or between these separate segments or to adjust for clustering within groups; such criteria are not applicable in the context of the WOTC implementation evaluation.
Response data will be summarized within the different segments and by survey topic within each segment. The topics are not directly relatable or comparable between segments. As such, statistical tests will likely not be employed, and sample size will not have an impact on any test employed. The sample size was chosen to meet minimal sample size requirements for reporting proportional results and was limited by the available project budget. The Employee survey will be used to answer basic questions about outcomes and demographics. For the SWA, Employer, and AJC surveys, most questions are procedural in nature to provide a better understanding of the processes used to implement WOTC across the different entities. Statistical tests will not be conducted; results will be tabulated and reported in simple summary tables.
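For illustration, the descriptive tabulations could be produced along the following lines. This is a minimal Python sketch assuming a flat survey export with hypothetical column names (segment, target_group, still_employed, satisfaction); the actual variable names and tabulations will follow the final instruments.

```python
import pandas as pd

# Hypothetical survey export; real variable names will come from the final instruments
responses = pd.read_csv("wotc_survey_responses.csv")
employee = responses[responses["segment"] == "Employee"]

# Simple frequencies: distribution of WOTC target groups among employee respondents
target_group_freq = employee["target_group"].value_counts(dropna=False)

# Cross-tabulation: target group by continued employment, shown as row proportions
employment_by_group = pd.crosstab(
    employee["target_group"], employee["still_employed"], normalize="index"
)

# Mean of a numeric satisfaction scale, where appropriate
mean_satisfaction = employee["satisfaction"].mean()

print(target_group_freq, employment_by_group, mean_satisfaction, sep="\n\n")
```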
Unusual problems requiring specialized sampling procedures
To administer the surveys, no specialized sampling procedures are required. The study team will attempt to collect data from every node of WOTC implementation based on random selection from the operational universe.
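As a sketch of the selection step, simple random sampling from the operational universe could be implemented as shown below. The frame file names and field layout are assumptions, and the contact counts are back-calculated from the Table 1 targets and assumed response rates; the actual draw will use the SWA-provided records.

```python
import pandas as pd

# Hypothetical sampling frames assembled from records provided by the cooperating SWAs
employers = pd.read_csv("swa_employer_frame.csv")   # one row per employer requesting certification
employees = pd.read_csv("swa_employee_frame.csv")   # one row per certified WOTC employee

SEED = 20250101  # fixed seed so the draw is reproducible and documentable

# Simple random samples sized to yield the targeted completes at the assumed response rates
employer_sample = employers.sample(n=4_000, random_state=SEED)    # 400 completes at 10%
employee_sample = employees.sample(n=12_000, random_state=SEED)   # 600 completes at 5%

employer_sample.to_csv("employer_sample.csv", index=False)
employee_sample.to_csv("employee_sample.csv", index=False)
```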
Any use of periodic (less frequent than annual) data collection cycles to reduce burden
The study team will use the data collection instruments only once for any respondent. To minimize burden on respondents, the study team will review extant data available from Census, SWAs and any other sources to avoid asking questions for which data are available. For example, we are accessing data from the DOL Occupational Information Network (O*NET) to obtain the typical wage and educational levels associated with occupations in which WOTC employees work instead of asking AJCs to provide that information.
Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Nonresponse is mitigated for the WOTC surveys in five primary ways: (1) using a web-based survey to allow participants to save responses and return at a later date; (2) multiple contact attempts; (3) minimizing the instrument length; (4) providing an incentive to employees to complete the survey; and (5) ensuring survey questions are clear and succinct.
The Contractor’s web-based survey system provides participants with a confidential and accessible method to provide responses. Users can enter data into the surveys and save their progress and return at any time using a secure, individualized link. Additionally, for the employee survey, data for the specific job that is being addressed will be pre-populated (from SWA records) into the survey, along with the name of the employer and the date of hire. The web-based survey will also be formatted for mobile to ensure ease of access for anyone who receives a request to complete the survey.
For surveys where email addresses are available, periodic reminders will be sent to potential participants after the initial email. Typically, reminders are sent weekly to nonrespondents throughout the survey administration period. The only survey in this data collection that will not include emails to participants is the employee survey because email addresses are not collected by state agencies. For this group, postcards will be mailed to their addresses prompting them to take the survey online using a PIN that tracks each employee.
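As an illustration of how individualized PINs might be generated for the postcard mailing, a minimal Python sketch follows. The survey URL, PIN length, and file names are assumptions for illustration and do not describe the Contractor's actual system.

```python
import secrets
import string

import pandas as pd

# Hypothetical mail-merge preparation for the employee postcards
sample = pd.read_csv("employee_sample.csv")      # sampled employees with mailing addresses
BASE_URL = "https://survey.example.gov/wotc"     # placeholder survey URL
ALPHABET = string.ascii_uppercase + string.digits

def make_pin(length: int = 8) -> str:
    """Return a random, hard-to-guess PIN; collisions are negligible at this sample size."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One unique PIN per sampled employee links each response back to the sampled case
sample["pin"] = [make_pin() for _ in range(len(sample))]
sample["survey_link"] = BASE_URL + "/?pin=" + sample["pin"]

sample.to_csv("employee_postcard_mailmerge.csv", index=False)
```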
The Contractor has also streamlined each survey to include only questions for which data are not available through extant sources. This limits the number of questions asked of each participant, which will reduce incomplete and inaccurate responses. Each survey has also been tested (see item 4 in this document) to ensure all questions are succinct and can be answered easily without having to look up additional information. This decreases the survey burden for respondents.
Despite implementing the best methods within available resources, achieving a high response rate is unlikely for employers and employees and has been difficult to achieve for even the largest federal surveys. Given that this survey will describe the range of implementation processes and not quantify program outcomes, a non-response bias analysis will not be conducted. Additionally, for the employee survey, no demographic information other than the target group will be available to conduct such an analysis, since it is not consistently collected on WOTC forms. The responses will simply be used to better understand the program, what employees understand about it, and whether they are continuing to work.
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Prior to pretesting, all surveys were reviewed by SWA representatives serving in a Technical Work Group as well as two evaluation experts in the Statistical Working Group. The Contractor also received edits via the 60-Day Notice from an employment group. Following revisions from the content experts and DOL stakeholders, each survey was tested using a web survey questionnaire and optional detailed interviews. Referrals for the testing subjects were made by the content experts.
For each survey, pretests were conducted with two (2) members of the public, consistent with OMB regulations stating that testing shall not exceed nine (9) members of the public without applying for a generic clearance. These interviews were conducted from July 15, 2024, to August 1, 2024. The survey pretests were conducted online. Each test subject was sent a link to the online version of the survey, which included all instructions, questions, and a list of reflection questions. The reflection questions asked test respondents to provide feedback on the length of time needed to complete the survey and on whether the design of the survey allowed the questions to be easily understood, and invited feedback on specific questions. Additionally, each participant was given the opportunity to schedule a telephone call to further discuss any areas of improvement for each survey.
Feedback from the survey pretests was reviewed by the Contractor and, in consultation with DOL, changes were made to the survey questions and communications as needed.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Table 3: People who will oversee data collection and analysis for the WOTC Implementation Evaluation
Organization | Individuals
Economic Systems Inc., 3120 Fairview Park Dr., Suite 500, Falls Church, VA 22042 | Dr. George Kettner; JoAnn Kuchak; Karin DeLaitsch, BPA Project Manager, 608-358-0588; Thomas Schultz, Senior Statistician, 412-600-9996; Jacob Denne, Senior Analyst, 703-333-2197
DOL Chief Evaluation Office | Austin Knipper, Contracting Officer’s Representative, 202-693-3063; Savi Swick, Director of Research, 202-693-7915
Statistical Working Group Members | Sarah Hamersma; Peter Cappelli, Professor of Management, University of Pennsylvania, 215-898-2722