Evaluation of the American Apprenticeship Initiative - Employer Survey and Participant Survey

OMB: 1290-0028


Evaluation of the American Apprenticeship Initiative

ICR Reference Number: 201903-1290-003

May 2019



OMB Supporting Statement Part B: Collection of Information Involving Statistical Methods



The Employment and Training Administration (ETA), in collaboration with the Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL), is undertaking the Evaluation of the American Apprenticeship Initiative (hereafter, the AAI evaluation). The AAI evaluation aims to address the following six broad research questions:

  1. What are AAI grantees doing to generate apprenticeship slots?

  2. What strategies do AAI grantees and employers use to identify strong candidates for apprenticeships?

  3. What are the characteristics of AAI apprenticeships?

  4. What are the in-program experiences and post-program outcomes for apprentices?

  5. What innovations and lessons form the basis for broader change and sustainability that encourage employers to adopt apprenticeships?

  6. Do benefits accrue to employers from apprenticeship?

To answer these questions, the evaluation is implementing four sub-studies: an implementation study, an outcomes study, a return on investment (ROI) study, and an impact study. The AAI evaluation is seeking approval for two data collection instruments: an Employer Survey that will support the ROI study, and a Participant Survey that will support the outcomes study.

B1. Respondent Universe and Sampling Methods

This section outlines the respondent universe and sampling methods for the two surveys.

Employer Survey

The Employer Survey will collect (1) information on employer-experienced costs and benefits related to hiring apprentices through AAI-sponsored programs, and (2) descriptive details about each employer and its AAI apprenticeship program. The population frame is the set of all employers participating in AAI-sponsored apprenticeships (approximately 1,300 employers as of December 2018).

The evaluator will conduct the survey by phone with approximately 100 employers, selected across a range of employer attributes, including size, industry, and geographic location. The evaluator will first ask grantees to complete the sections of the survey for which they have information, and will then call the 100 employers to complete the remainder of the survey. This approach keeps the burden on grantees manageable while still providing a sufficient sample of employers to conduct the ROI study.

The list of participating employers and their contact information will be extracted from two sources: (1) DOL’s Quarterly Performance Results (QPR) system, and (2) grantee responses to a program office inquiry regarding employer contact information. More information on each data source is provided below.

  • Quarterly Performance Results (QPR) system. The QPR is DOL’s performance reporting system for the AAI grant initiative. Through the QPR, AAI grantees record information on all registered apprentices and pre-apprentices. In addition, grantees report the name and location of employers who hire apprentices in AAI-sponsored programs. The evaluator will use this list of AAI employers as the population universe for the Employer Survey. However, while the QPR contains the name and location of AAI employers, it does not include a contact person’s name or phone number. Therefore, the evaluator will use alternative sources to obtain the contact information necessary to field the survey.

  • Program Office Inquiry. The evaluator will work directly with the DOL Office of Apprenticeship to obtain a point of contact at each AAI-affiliated employer as well as this person’s email address and phone number. To facilitate this process, the evaluator will create templates that list the names of all employers associated with each AAI grantee for grantees to fill out. The Office of Apprenticeship will send these customized templates to each grantee as a part of a regular program inquiry and will share the completed templates with the evaluator using a secure transfer website.

The evaluator will aim for as high a response rate as possible, recognizing that employers are not required to participate in data collection activities for the AAI evaluation. Based on feedback from other research teams, the estimated response rate for the phone survey is 80 percent. Other evaluators who have conducted phone surveys of employers using grantee contacts report that employers are much more likely to respond when grantees, with whom they have a prior relationship, request their participation.1 The evaluator will work with grantees to effectively market the survey to employers, which may further improve the response rate.

Participant Survey

The Participant Survey will gather information on apprenticeship experiences directly from AAI apprentices. The survey is critical for obtaining information about apprentices that is not available elsewhere, including: (1) their backgrounds prior to beginning their apprenticeships, (2) specific types of training received and other in-program experiences, (3) subjective assessments of their experiences, such as the value of particular aspects of training or satisfaction with mentors, and (4) their personal circumstances and motivations, including their reasons for not completing the apprenticeship program (if applicable). The Participant Survey will be conducted online and by phone, from January 2020 through June 2020.

The population frame is the set of all AAI apprentices (approximately 14,000 as of early December 2018). The Participant Survey sample will be 2,500 apprentices. To be included in the sample, apprentices must meet the following criteria:

  1. Started their apprenticeship by December 2018. The survey focuses heavily on experiences throughout different points of the apprenticeship, so it is important that apprentices have participated for long enough to be able to provide meaningful answers to most or all of the questions.

  2. Have a Social Security Number (SSN) in the QPR. It is important to be able to analyze relationships between information gathered in the survey and the employment and earnings outcomes reported in the National Directory of New Hires,2 which is only possible for individuals for whom the study has an SSN.

  3. Have a mailing address, email address, and phone number, obtained either from the QPR or through matching name and SSN with a commercial locating service such as v12data.

The evaluator will first send an advance letter notifying participants that they were selected for the survey and will then attempt to administer the survey via web. The evaluator will follow up by phone with participants who do not respond to the web survey. Apprentices whose missing contact information is successfully obtained through commercial locating (see criterion 3 above) will be included in the sampling frame.

As of early December 2018, there were approximately 6,700 apprentices who met these three criteria, comprising the universe of eligible respondents. This universe is expected to grow before the sample is drawn as (1) more apprentices are enrolled by the grantees and (2) missing contact information is obtained for some apprentices through commercial locating.

The sample will be drawn using a three-dimensional stratified design with disproportionate allocation, crafted to provide useful sample sizes for several policy-relevant subgroups in addition to the AAI participant population as a whole. Specifically, sample selection will stratify to produce the following representation in the (unweighted) survey sample:

  • Occupation Type. 15 percent from construction (the most typical apprenticing occupation), 35 percent from other production and maintenance occupations (production/manufacturing, installation, repair, and maintenance), and 50 percent from occupations that AAI targeted for expansion, such as information technology and healthcare. As of December 2018, 32 percent of AAI apprentices were in construction occupations (SOC code 47), 33 percent were in those other production and maintenance occupations (SOC codes of 49 or 51), and 35 percent of apprentices were in other occupations.

  • Sex. 35 percent women and 65 percent men. As of December 2018, 19 percent of AAI apprentices were women.

  • Race/ethnicity. 30 percent Hispanic, 30 percent white non-Hispanic, 30 percent black non-Hispanic, and 10 percent “other.” As of December 2018, 14 percent of AAI apprentices were Hispanic, 56 percent white non-Hispanic, 17 percent black non-Hispanic, and 13 percent other. The “other” category includes individuals identifying as Asian, Native American, Pacific Islander, or more than one race, as well as those who declined to answer.

The evaluator anticipates an 80 percent response rate to the survey, resulting in 2,000 completed surveys.3

Exhibit B1 summarizes the estimated sample size and target response rates for each data collection instrument.

Exhibit B1. Sample Size and Response Rate Assumptions

Employer Survey (conducted by phone)

  • Respondent universe: Employers participating in AAI-sponsored apprenticeships

  • Sampling methods: Employers selected from each grantee to represent a range of employer attributes

  • Population size: All AAI employers (approximately 1,300 as of December 2018)

  • Sample size: 100 employers

  • Target response rate: 80 percent

Employer Survey, grantee component (conducted by phone)

  • Respondent universe: AAI grantees

  • Sampling methods: N/A; all grantees will be contacted to provide information on the selected employers (approximately two per grantee)

  • Population size: All AAI grantees (46)

  • Sample size: 46 grantees

  • Target response rate: 100 percent1

Participant Survey (conducted online and by phone)

  • Respondent universe: AAI apprentices that (1) started their apprenticeship by December 2018, (2) have an SSN in the QPR, and (3) have a valid mailing address, phone number, and email address

  • Sampling methods: Select a sample of 2,500 through explicit stratification. Strata include occupation type (15 percent from construction occupations, 35 percent from other production and maintenance occupations, and 50 percent from other occupations); race/ethnicity (30 percent Hispanic, 30 percent white non-Hispanic, 30 percent black non-Hispanic, and 10 percent “other”); and sex (35 percent women and 65 percent men)

  • Population size: All AAI apprentices (approximately 14,000 as of December 2018)

  • Sample size: 2,500 apprentices

  • Target response rate: 80 percent

1 The evaluator anticipates that all grantees will provide background information on employers, as grantees are required to participate in the evaluation by the terms of their grant agreement.





B2. Procedures for Collection of Information

For each survey, the evaluator will employ appropriate methods to select the sample, and will ensure that the plan for analysis is appropriate given the samples.

Employer Survey

The Employer Survey will be fielded by phone to approximately 100 employers identified as participating in an AAI-sponsored apprenticeship program and for whom contact information is available. All grantees will also be interviewed regarding the employers who are part of the phone survey sample; no sampling of grantees is planned.

Responses to the Employer Survey will be used to measure the return on investment to employers hiring AAI-linked apprentices. Key measures that will be derived from the survey include:

  • Costs:

    • Cost of setting up and administering the apprenticeship program

    • Total compensation paid to apprentices

    • Total training cost, including instruction, space, and materials

  • Benefits:

    • Value of apprentice production

    • Post-program benefits from reduced cost of hiring trained workers

    • Reduced turnover

These measures are descriptive in nature and will summarize the costs and benefits of registered apprenticeship to employers; one common way of combining them into a single figure is sketched below. No statistical tests for differences in measures across groups of employers are planned for measures based on the Employer Survey. Because the Employer Survey is a one-time data collection, there will be no use of periodic data collection to reduce respondent burden.
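To make the combination of these measures concrete, the sketch below shows one common way to roll them up into a per-employer return-on-investment figure, defined here as net benefits divided by costs. The field names and the specific formula are illustrative assumptions for exposition, not the evaluator’s specified method.

```python
# A minimal sketch combining the cost and benefit measures listed above into a
# per-employer ROI figure. Field names and the ROI definition (net benefits
# divided by costs) are illustrative assumptions.

def employer_roi(costs: dict, benefits: dict) -> float:
    """Return (total benefits - total costs) / total costs for one employer."""
    total_costs = (
        costs["program_setup_admin"]        # setting up and administering the program
        + costs["apprentice_compensation"]  # total compensation paid to apprentices
        + costs["training"]                 # instruction, space, and materials
    )
    total_benefits = (
        benefits["apprentice_production"]   # value of apprentice production
        + benefits["reduced_hiring_cost"]   # post-program savings on hiring trained workers
        + benefits["reduced_turnover"]      # savings from reduced turnover
    )
    return (total_benefits - total_costs) / total_costs

# Example: $50,000 in costs and $60,000 in benefits gives
# (60000 - 50000) / 50000 = 0.20, i.e., a 20 percent return.
print(employer_roi(
    {"program_setup_admin": 10_000, "apprentice_compensation": 30_000, "training": 10_000},
    {"apprentice_production": 45_000, "reduced_hiring_cost": 10_000, "reduced_turnover": 5_000},
))
```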

Participant Survey

Respondents for the Participant Survey will be selected through explicit stratification. As described above, the set of eligible participants will be stratified across three dimensions: occupation (construction, production and maintenance, or other), sex (male or female), and race/ethnicity (Hispanic, white non-Hispanic, black non-Hispanic, and other). This produces 24 stratification cells (three occupation x two sex x four race/ethnicity). The sampling probability will vary by cell and will be calculated such that random sampling from each cell produces the intended distribution across occupation, sex, and race/ethnicity described in section B1 above.
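The sketch below illustrates one way the per-cell sampling probabilities could be computed. For simplicity it assumes each cell’s target share of the 2,500-person sample is the product of the marginal targets from section B1; the evaluator’s actual cell-level allocation may differ, and the population counts shown are hypothetical placeholders.

```python
# A minimal sketch of per-cell sampling probabilities for the 24 strata,
# assuming (for illustration only) that cell targets are the product of the
# marginal targets described in section B1.
from itertools import product

TARGETS = {
    "occupation": {"construction": 0.15, "production_maintenance": 0.35, "other": 0.50},
    "sex": {"female": 0.35, "male": 0.65},
    "race_ethnicity": {"hispanic": 0.30, "white_nh": 0.30, "black_nh": 0.30, "other": 0.10},
}
SAMPLE_SIZE = 2_500

# pop_counts[(occupation, sex, race)] = eligible apprentices in that cell.
# Hypothetical placeholder counts; in practice these come from the QPR-based frame.
pop_counts = {cell: 280 for cell in product(
    TARGETS["occupation"], TARGETS["sex"], TARGETS["race_ethnicity"])}

sampling_prob = {}
for occ, sex, race in pop_counts:
    target_n = (SAMPLE_SIZE * TARGETS["occupation"][occ]
                * TARGETS["sex"][sex] * TARGETS["race_ethnicity"][race])
    # Selection probability for each person in the cell, capped at 1 in case a
    # small cell cannot supply its full target allocation.
    sampling_prob[(occ, sex, race)] = min(1.0, target_n / pop_counts[(occ, sex, race)])

# The base design weight for later analyses is the inverse selection probability.
design_weight = {cell: 1.0 / p for cell, p in sampling_prob.items() if p > 0}
```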

The Participant Survey will be used to calculate key outcome measures. These include:

  • Program retention, completion, placement, and duration

  • Skills and credentials acquired during apprenticeship, including

    • Receipt of related technical instruction

    • Receipt of on-the-job training

    • Perceived quality and value of instruction and on-the-job training

    • Receipt of certificates or licenses

  • Employment outcomes

    • Hourly wage and hours worked during and after apprenticeship

    • Job occupation

The evaluator will produce simple tabulations of means, as well as regression-adjusted means that permit direct comparisons between some subgroups. For example, regression-adjusted means would be used to compare outcomes such as completion rates or wage growth for participants apprenticing in different occupations, where differences in outcomes may be strongly influenced by differences in the backgrounds of individuals entering those occupations. The regression-adjusted means will be estimated using logistic regression for binary outcomes and ordinary least squares (OLS) for continuous outcomes.
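A minimal sketch of this kind of regression adjustment, using Python’s statsmodels package and simulated data, appears below. The variable names are hypothetical, and the predictive-margins approach shown (averaging predictions over the full sample with each occupation assigned in turn) is one standard way to compute regression-adjusted means; the evaluator’s exact specification may differ.

```python
# A minimal sketch: logistic regression for a binary outcome (completion) and
# OLS for a continuous outcome (hourly wage), followed by regression-adjusted
# means by occupation. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "occupation": rng.choice(["construction", "production", "other"], n),
    "age": rng.integers(18, 55, n),
    "female": rng.integers(0, 2, n),
})
df["completed"] = (rng.random(n) < 0.6 + 0.02 * df["female"]).astype(int)
df["hourly_wage"] = 15 + 0.2 * df["age"] + rng.normal(0, 3, n)

# Fit models with occupation indicators plus background covariates.
logit = smf.logit("completed ~ C(occupation) + age + female", data=df).fit()
ols = smf.ols("hourly_wage ~ C(occupation) + age + female", data=df).fit()

# Regression-adjusted mean for each occupation: assign everyone to that
# occupation, keep their own covariates, and average the predictions.
for occ in ["construction", "production", "other"]:
    counterfactual = df.assign(occupation=occ)
    print(occ,
          round(logit.predict(counterfactual).mean(), 3),  # adjusted completion rate
          round(ols.predict(counterfactual).mean(), 2))    # adjusted mean wage
```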

Because some of the subgroups of interest are small, the evaluator will disproportionately sample from these smaller subgroups to increase their sample size (as described in section B1), reducing the variance of both simple and regression-adjusted means for these subgroups and enabling the evaluator to better detect differences in means between subgroups. Where feasible, the evaluator will set the sampling probabilities to produce equally sized samples of the subgroups of interest (e.g., the three largest race/ethnic groups), which minimizes the standard errors of difference-in-means estimators. Women constitute too small a fraction of AAI participants to permit a survey sample split evenly between women and men, but the evaluator doubled the sampling rate for women to decrease the variance of subgroup estimates of their outcomes as much as possible.

Because the Participant Survey is a one-time data collection, there will be no use of periodic data collection to reduce respondent burden.

B3. Methods to Maximize Response Rates and Minimize Nonresponse

Employer Survey

As noted in section B1, the evaluator will aim to achieve the highest response rate possible. A telephone survey mode was selected because of its potential to achieve a higher response rate than online surveys of employers have achieved in prior research. The evaluators have a goal of an 80 percent response rate. The following steps will be taken to maximize the response rate:

  • Host a grantee webinar explaining the Employer Survey’s purpose and requesting their assistance in messaging the survey to their employers and participating in data collection for the subsample of phone-interview employers

  • Develop an advance message for grantees to send to employers, notifying them of the upcoming survey

  • Collect data first from grantees to reduce employers’ response burden

  • Work with grantees to contact employers, through targeted joint emails that describe the importance of the data collection and the benefit to the employer in the form of an individualized ROI estimate

  • Send weekly email reminders to non-responders for four weeks, using different subject lines each week, focusing on different potential employer motivations for completing the survey

  • Offer an individualized return-on-investment estimate to employers that complete the survey, which would present a range of minimum, median, and maximum values for what the employer’s average return on investment across apprentices might be



Prior to analyses, the evaluation team will use QPR data to test for differences between employer respondents and the full set of AAI employers. The two groups may differ because of survey nonresponse or because the sampling approach is not purely random, so sampled employers, though intentionally diverse, may not be representative of all AAI employers. The characteristics that will be tested include: region of the country, whether the employer is also a program sponsor, the occupation(s) for which the employer hires apprentices, and the grantee with which they are affiliated.

To correct for nonresponse, the evaluator will estimate and apply nonresponse weights. The evaluator will follow the standard procedure of first estimating predicted probabilities of response as a function of the characteristics observed for all employers. The weights will be calculated as the inverse of these estimated probabilities. In analyses, the weights will be applied to each observation using standard weighting routines in statistical software. Applying that weight in analyses makes the analysis sample more closely reflect the full sample of employers on observed characteristics. Because the intent of the analyses of employer data is to reflect the experiences of the full set of participating employers, the weights will correct for both non-response among employers in the survey sample and differences between surveyed employers and the full population of AAI employers.
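The sketch below illustrates this inverse-probability weighting procedure using simulated data and hypothetical variable names; the participant-survey nonresponse weights described later in this section follow the same logic.

```python
# A minimal sketch of nonresponse weighting: model each employer's probability
# of responding from characteristics observed for the full frame, then weight
# respondents by the inverse of that probability. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1300  # approximate size of the AAI employer universe
frame = pd.DataFrame({
    "region": rng.choice(["northeast", "south", "midwest", "west"], n),
    "is_sponsor": rng.integers(0, 2, n),
})
# In practice 'responded' is observed after fielding; here it is simulated.
frame["responded"] = (rng.random(n) < 0.75 + 0.1 * frame["is_sponsor"]).astype(int)

# Step 1: predict response from characteristics available for all employers.
model = smf.logit("responded ~ C(region) + is_sponsor", data=frame).fit()
frame["p_response"] = model.predict(frame)

# Step 2: weight respondents by the inverse of the estimated probability.
respondents = frame[frame["responded"] == 1].copy()
respondents["nr_weight"] = 1.0 / respondents["p_response"]

# A weighted mean of any survey measure then reflects the full employer frame
# on these observed characteristics.
```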

The analyses will include only descriptive results. They will not include formal statistical tests on measures derived from the Employer Survey.

Participant Survey

As noted above, the evaluator estimates an 80 percent response rate to the Participant Survey. To obtain that response rate the evaluator will:

  • Send an advance letter notifying participants of the upcoming survey

  • Offer an incentive payment for completing the survey. The evaluator proposes an experiment to assess the effects of respondent payments on survey response rates, randomly assigning apprentices to one of three payment options:

    • Option 1: Constant $25 payment regardless of when the survey is completed and which mode is used (online or phone)

    • Option 2: Constant $40 payment regardless of when the survey is completed and which mode is used (online or phone)

    • Option 3: Two-tier payment: $40 if the survey is completed during the first four weeks of administration (the online mode); $20 if the survey is completed after the first four weeks (phone mode)

The survey will be released across several waves. In early waves, sample members will be divided evenly across the three payment options. In later waves, more sample members will be offered the payment option that is producing the highest response rate (see the sketch after this list).

  • Send the email invitation to all sample members inviting them to complete the web survey

  • Send weekly email reminders to non-responders for four weeks, varying the subject lines each time to appeal to different potential motivations to complete the survey

  • Attempt to administer the survey by phone to those who do not complete the web survey

  • Periodically perform Accurint searches for new contact information for non-responders
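The following sketch illustrates the wave-by-wave incentive allocation referenced above. The early-wave even split follows the text; the specific later-wave rule shown (60 percent of members to the best-performing option, 20 percent to each of the others) is an illustrative assumption, as the text does not specify an allocation formula.

```python
# A minimal sketch of adaptive incentive allocation across survey waves.
# Early waves split members evenly; later waves favor the payment option with
# the highest observed response rate. The 60/20/20 rule is an assumption.
import random

OPTIONS = ["flat_25", "flat_40", "tiered_40_20"]

def assign_wave(members, response_counts=None, offer_counts=None):
    """Assign each sample member in a wave to a payment option."""
    if not response_counts:
        # Early waves: even split across the three options.
        weights = [1 / 3] * 3
    else:
        # Later waves: favor the option with the highest response rate so far.
        rates = [response_counts[o] / max(offer_counts[o], 1) for o in OPTIONS]
        best = rates.index(max(rates))
        weights = [0.6 if i == best else 0.2 for i in range(3)]
    return {m: random.choices(OPTIONS, weights=weights)[0] for m in members}

# Example: after early waves, the flat $40 option is performing best, so most
# of the next wave is assigned to it.
offers = {"flat_25": 200, "flat_40": 200, "tiered_40_20": 200}
responses = {"flat_25": 140, "flat_40": 170, "tiered_40_20": 150}
wave3 = assign_wave(range(300), responses, offers)
```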

For weighting purposes, non-respondents include not only those who decline to participate in the survey and those in the survey sample whom the evaluator is unable to contact, but also apprentices who were not eligible to be surveyed due to missing contact information. In order to evaluate the full set of AAI apprentices, the evaluator will weight the analysis sample to reflect the composition of all apprentices who started their apprenticeship on or before December 2018, based on their observed characteristics.

Prior to conducting analyses, the evaluator will use QPR data to test for differences between survey respondents and non-respondents in their demographic characteristics (sex, age, race/ethnicity), apprenticing occupation, region, months since their apprenticeship began, apprenticeship completion status, entry wage, and the grantee whose program they are affiliated with. To correct for nonresponse bias, the evaluator will estimate and apply nonresponse weights: first estimating predicted probabilities of response as a function of the characteristics observed for all apprentices, then calculating the inverse of these estimated probabilities and weighting each observation by this amount using standard weighting routines in statistical software. The weights will be calibrated to reflect the composition of all apprentices who started their apprenticeship by December 2018. In doing so, the weights will adjust not only for selective nonresponse among the survey sample but also for differences in characteristics between apprentices who did and did not have SSNs available in the QPR (the latter being ineligible to be surveyed).
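The calibration step could be implemented with iterative proportional fitting (raking), as sketched below. The margins and variable names are hypothetical, and raking is one standard calibration method rather than the evaluator’s stated routine.

```python
# A minimal sketch of calibrating (raking) respondent weights so that weighted
# shares match population margins, e.g., the December 2018 apprentice
# composition. Margins and data here are hypothetical placeholders.
import pandas as pd

def rake(df, weight_col, margins, n_iter=25):
    """Iteratively adjust weights to match each variable's population margins."""
    w = df[weight_col].astype(float).copy()
    for _ in range(n_iter):
        for var, targets in margins.items():
            weighted_shares = w.groupby(df[var]).sum() / w.sum()
            factors = {k: targets[k] / weighted_shares[k] for k in targets}
            w = w * df[var].map(factors)
    return w

resp = pd.DataFrame({
    "sex": ["female", "male", "male", "female", "male"],
    "occupation": ["construction", "other", "other", "production", "construction"],
    "base_weight": [1.2, 0.9, 1.1, 1.3, 1.0],
})
population_margins = {
    "sex": {"female": 0.19, "male": 0.81},
    "occupation": {"construction": 0.32, "production": 0.33, "other": 0.35},
}
resp["calibrated_weight"] = rake(resp, "base_weight", population_margins)
```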

To address item nonresponse, the evaluator will first use logical imputation or imputation based on existing knowledge wherever feasible. Where that is not possible, the evaluator will fill in missing survey data elements using multiple imputation routines available in standard statistical software, such as Stata’s mi command. Such imputation uses statistical relationships among items, estimated from sample members for whom those items are observed, to estimate values for sample members who are missing data on some items but not others.
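As an open-source analogue to Stata’s mi command, the sketch below uses the chained-equations (MICE) routines in Python’s statsmodels. The data and variable names are simulated placeholders, and the two items shown stand in for the full set of survey variables that would be registered for imputation.

```python
# A minimal sketch of multiple imputation by chained equations (MICE) for item
# nonresponse, producing several completed datasets whose results would then be
# combined so imputation uncertainty is reflected in standard errors.
import numpy as np
import pandas as pd
from statsmodels.imputation import mice

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({"age": rng.integers(18, 55, n).astype(float)})
df["hourly_wage"] = 15 + 0.2 * df["age"] + rng.normal(0, 3, n)
df.loc[rng.random(n) < 0.15, "hourly_wage"] = np.nan  # simulated item nonresponse

imp = mice.MICEData(df)
completed_datasets = []
for _ in range(5):
    imp.update_all()                      # one pass of chained-equation imputation
    completed_datasets.append(imp.data.copy())

# Each completed dataset has no missing values among the registered items;
# compare, e.g., the mean imputed wage across the five datasets.
print([round(d["hourly_wage"].mean(), 2) for d in completed_datasets])
```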

The combination of nonresponse weighting and multiple imputation will aim to enhance the representativeness and accuracy of outcomes derived from the Participant Survey. Because the Participant Survey will be used to measure outcomes, not impacts, there will be no calculation of minimum detectable effects.

B4. Tests of Procedures

The evaluator conducted tests of both the Employer Survey and Participant Survey instruments. The results of these tests are described below. The revised instruments are included in this clearance request for review.

Employer Survey

The evaluator planned to pretest the survey with nine AAI-affiliated employers. To do this, the evaluator first contacted five AAI grantees and requested that they connect the evaluator with three to five of their AAI employers. Grantees were selected based on the diversity of their employers (e.g., geography, size, industries of focus). Only one grantee provided names and contact information for employers. The evaluator conducted pretests with two of these employers, both via phone. To learn more about the challenges of recruiting employers for the pretest, the evaluator interviewed one grantee director.

Each pretest took 30 minutes, in line with the OMB burden estimate. Feedback from the pretest employers included suggestions to simplify the wording of questions that were deemed too technical. These employers also had difficulty estimating the exact monetary value of “spillover” benefits. The language of apprenticeship did not resonate with one employer representing an industry that has not traditionally had apprenticeship programs.

The evaluator made the following revisions to the Employer Survey in response to testers’ comments:

  • Rewrote technical questions in plain, easier-to-understand language

  • Changed response options around firm size, hiring costs, and promotion costs from open-text fields requiring a specific numeric value to category intervals

  • Removed questions asking employers to estimate the monetary value of specific spillover benefits

Feedback from a grantee indicated that AAI grantees can enter much of the information from early sections of the survey for their employers. This will substantially reduce burden for these employers and perhaps improve those employers’ response rate to the employer survey.

Participant Survey

The evaluator pretested the survey with nine apprentices drawn from one non-AAI employer and two AAI grantees. These apprentices were selected to reflect a range of occupations and program experiences. The pretests were conducted via phone.

Overall, respondents reported that the survey was clear and that they mostly understood the questions. Respondents did report difficulty responding to some items, due to uncertainty about the question wording or the response options. In addition, the length of the survey exceeded the burden estimate of 30 minutes. Consequently, the evaluator revised the survey to clarify questions that were unclear to respondents and to reduce the length to conform to the burden estimate.

The evaluator made the following revisions:

  • Eliminated questions that are not essential to the study or that collect information available elsewhere (such as demographic information, which is available in the QPR)

  • Eliminated questions that were too long for the interviewer to read or not feasible to manually code (such as detailed occupation name)

  • Clarified and consolidated response options to several questions (such as reducing the list of skills gained during the apprenticeship and subjects covered in training classes)



B5. Individuals Consulted on Statistical Aspects of the Design

With DOL oversight, Abt Associates and its partners are responsible for conducting the AAI evaluation. The individuals listed in Exhibit B2 below contributed to the design of the ROI study and the outcomes study. Andrew Clarkwest at Abt Associates is the director of analysis for the AAI evaluation and provides overall management and direction for analysis activities. Each of the sub-studies also has a dedicated director. The data collected for the ROI study will be analyzed under the direction of Kevin Hollenbeck at the Upjohn Institute. For the outcomes study, data collection and analysis will be directed by Burt Barnow of George Washington University. Both have been consulted on the statistical methods used in this evaluation. Karen Gardiner, the project director for the AAI evaluation, will have oversight of all sub-studies and data collection efforts.

Exhibit B2: Individuals Consulted on Data Collection

  • Karen Gardiner, Project Director, (301) 347-5547

  • Andrew Clarkwest, Director of Analysis, (301) 347-5065

  • Robert Lerman, Co-Principal Investigator, (202) 261-5676

  • Karin Martinson, Co-Principal Investigator, (301) 347-5726

  • Austin Nichols, Senior Advisor, (301) 347-5679

  • Jacob Klerman, Senior Advisor, (617) 520-2613

  • Julie Pacer, Survey Director, (312) 529-9708

  • David Judkins, Project Quality Advisor, (301) 347-5952

  • Kevin Hollenbeck, ROI Study Lead, (269) 343-5541

  • Burt Barnow, Outcomes Study Lead, (202) 994-6379



Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:

Karen Gardiner

Project Director

(301) 347-5116

Michelle Ennis

Contracting Officer’s Representative, Employment and Training Administration

(202) 693-3636



1 The current employer survey results are not yet publicly available and, therefore, cannot be formally cited.

2 The National Directory of New Hires (NDNH) is a database of earnings and new hires information, maintained by the Office of Child Support Enforcement (OCSE) in the Administration for Children and Families (ACF). The AAI evaluation will use data from the NDNH to measure earnings and employment outcomes for AAI apprentices.

3 Several prior evaluations of workforce training programs achieved participant survey response rates close to 80 percent. For example, the Green Jobs and Health Care evaluation conducted an 18-month follow-up survey, which achieved response rates between 69 and 79 percent across grantees. The Pathways for Advancing Careers and Education evaluation achieved a response rate of 77 percent for its 18-month survey. See https://wdr.doleta.gov/research/FullText_Documents/ETAOP-2017-7%20Findings%20from%20the%20Impact%20Study.pdf and https://www.acf.hhs.gov/sites/default/files/opre/pace_three_yearanalysisplan_mainreport_508.pdf.


