Evaluation of Strategies Used in the TechHire and Strengthening Working Families Initiative Grant Programs
OMB Control No. 1290-XXXX
December 2018
OMB SUPPORTING STATEMENT PRA PART A
The U.S. Department of Labor’s Chief Evaluation Office (CEO) is undertaking the Evaluation of Strategies Used in the TechHire and Strengthening Working Families Initiative (SWFI) Grant Programs. The evaluation includes both implementation and impact components. The purpose of the evaluation is to identify whether the grants help low-wage workers obtain and advance in employment in H-1B industries and occupations and, if so, which strategies are most helpful.
This supporting statement is the second in a series of OMB submissions that correspond to an array of data collection activities for the evaluation. In January 2018, OMB approved the baseline data collection for this evaluation (OMB 1290-0014), which includes a baseline information form (BIF), a 6-month follow-up participant survey, a participant tracking form, and a first round of site visit interviews.
CEO is seeking clearance in this second submission for a survey of grantees, semi-structured telephone interviews with grantees, a partner contact information template, a survey of partners, semi-structured telephone interviews with partners, and a second round of site visit interviews.
A final OMB submission will seek clearance for an 18-month participant follow-up survey.
The evaluation team is submitting the full package for the study in parts for several reasons, including: (1) the study schedule required random assignment to begin before the other instruments were developed and tested and (2) the first round of site visits needed to take place early during random assignment. Thus, CEO is now requesting OMB approval of additional instruments so that the evaluation can continue on schedule. The reason for submitting the 18-month follow-up survey in a third and final package is that the content of the survey will be determined in part based on review of findings from the 6-month follow-up survey and first round of implementation site visits.
A.1 Circumstances Necessitating the Information Collection
A user fee paid by employers to bring foreign workers into the United States under the H-1B nonimmigrant visa program provides funding for the TechHire and SWFI grants, as authorized by Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998 (ACWIA), as amended (codified at 29 USC 3224a). In September 2016, the Employment and Training Administration (ETA) competitively awarded 39 TechHire grants and 14 SWFI grants. These programs attempt to help U.S. residents access middle- and high-skill, high-growth jobs in H-1B industries. Broadly, the goals of TechHire and SWFI are to identify innovative training strategies and best practices for populations that have barriers to participating in skills training. Both programs emphasize demand-driven training strategies, including employer involvement in training, use of labor market data, work-based learning, and sectoral partnerships, among other priorities. A key goal of both programs is to bring the training system into better alignment with employer needs. To understand the extent to which the programs are meeting their goals, DOL is implementing an impact and implementation evaluation. This evaluation will contribute knowledge about the effectiveness of the approaches used under these grant programs.
29 USC 3224a(1) authorizes the Secretary of Labor to conduct ongoing evaluations of programs and activities in order to improve their management and effectiveness.
Overview of Evaluation
The evaluation research questions can be topically summarized as follows:
Grantee Program Descriptions:
What are the types and combinations of programs, approaches or services provided under the TechHire and SWFI grant programs?
What are the characteristics of the target populations served?
How are services for the target population implemented?
What are the issues and challenges associated with implementing and operating the programs, approaches, and/or services studied?
Implementation Procedures and Issues:
How were the programs implemented?
What factors influenced implementation?
What challenges did programs face in implementation and how were those challenges overcome?
What implementation practices appear promising for replication?
Partnerships and Systems:
How were systems and partnerships built and maintained?
What factors influenced the development and maintenance of the systems and partnerships over the lifecycle of the grant?
What challenges did programs face in partnership and systems building and how were those challenges overcome?
How did partnership development and maintenance strategies evolve over the lifecycle of the grant?
Outputs and Outcomes and Effective Strategies for Overcoming Barriers:
How and to what extent did the customized supportive services and education/training tracks expand participant access to targeted employment, improve program completion rates, connect participants to employment opportunities, and promote innovative and sustainable program designs?
What strategies and approaches were implemented and/or appear promising for addressing systematic barriers individuals may face in accessing or completing training and education programs and gaining employment in H-1B industries?
Removal of Barriers and Coordination at the Systems Level:
How and to what extent did the programs both remove childcare barriers and address the individual job training needs of participants?
What were the changes in the coordination of program-level supports (training and support services) as well as the leveraging, connecting, and integrating at the systems level?
What was the reach and interaction of this program to parents who receive other federal program supports?
To address each of the five research areas, the evaluation includes both implementation and impact components. The implementation study will focus on all 53 TechHire and SWFI grantees and serve several purposes: providing a thorough description of all of the TechHire and SWFI programs; documenting implementation barriers and facilitators; describing partnerships and systems change; and providing descriptive data on program outputs and outcomes. The impact study will include both a randomized controlled trial (RCT) study and a quasi-experimental design (QED) study. The RCT study will include approximately 5 grantees, whereas the QED study will include all 53 grantees.
Overview of Data Collection
To address the research questions listed above, the evaluation will include the following data collection activities:
Baseline Information Form (BIF) (clearance already obtained under OMB No. 1290-0014)
6-month follow-up survey (clearance already obtained under OMB No. 1290-0014)
Round 1 site visit interviews with grantee staff (clearance already obtained under OMB No. 1290-0014)
Round 1 site visit interviews with grantee partners (clearance already obtained under OMB No. 1290-0014)
Participant tracking form (clearance already obtained under OMB No. 1290-0014)
Grantee Survey (clearance requested in this package)
Semi-structured telephone interviews with grantees (clearance requested in this package)
Partner contact information template (clearance requested in this package)
Partner Survey (clearance requested in this package)
Semi-structured telephone interviews with partners (clearance requested in this package)
Round 2 site visit interviews (clearance requested in this package)
18-month follow-up survey (clearance will be requested in a future package)
With the submission of this justification, CEO requests clearance for the sixth through eleventh data collection components listed above (i.e., the grantee survey, semi-structured telephone interviews with grantees, partner contact information template, partner survey, semi-structured telephone interviews with partners, and second round of site visit interviews with RCT grantees). The Department of Labor (DOL) anticipates submitting a future OMB package to request permission to conduct the twelfth component.
A.2 Purpose and Use of the Information
This section discusses how information obtained through the data collection will be used to assess how TechHire and SWFI are implemented and to assess variation in impacts attributable to program characteristics. The data collected will be used in the implementation and impact components of the study.
A.2.1 Grantee Survey
A grantee survey will be administered to all 53 grantees. The purpose of the grantee survey is to provide uniform data on program organization and processes for the implementation study. The grantee survey will be the primary source of data on variation in how grantees implement their programs, including information on partnerships, screening and assessment, education and training activities, job development and retention and advancement services, childcare and other supportive services, implementation challenges and successes, and lessons learned. In addition to providing descriptive implementation data, the grantee survey will be a key source of data for both the RCT and QED impact studies. The grantee survey will also provide information to understand and describe the program interventions in the five RCT sites.
A.2.2 Telephone Interviews with Grantees
Semi-structured telephone interviews with grantees will include one to three staff members and will obtain more detailed information on implementation activities. Project directors typically oversee grant programs at a high level, and they can provide insight into a program’s overall strategy, partnerships, and sustainability. The day-to-day operations of TechHire and SWFI programs, however, are usually the responsibility of project managers and other staff members. The questions asked in the interviews will vary to some degree across grantees, following up on issues identified in each grantee’s responses to the grantee survey.
A.2.3 Partner Information Template
Fielding of the partner survey will be preceded by a template that will be used to collect information from grantees to assemble the sampling frame. The study team reviewed grantee proposals and identified organizations listed as partners of grantees. Approximately 1,200 partners were identified. The average number of partners per grantee was 23, with a low of 7 and a high of 53. The study team will ask grantees to review and update the list, as well as to provide current contact information and the involvement level (not involved, low, medium, or high) of each partner.
A.2.4 Partner Survey
TechHire grantees must be part of a primary partnership that includes members of the workforce investment system, education and training providers, and business-related organizations. Grantees are also required to have at least three employer partners or an industry association. Partnership requirements for SWFI are similar with the additional requirement of a partner that administers or funds child care services. In addition to required partners, most grantees have included optional partners in their applications. Grantee partners are critical to the success of TechHire and SWFI grantees. The survey will be administered to all grantee partners.
The partner survey will explore the role that partners have in the TechHire and SWFI programs, and their satisfaction with the program and participants. It will collect information about partners’ perspectives on the partnership with the grantee, partner roles, sustainability, and employability and performance of program participants. Data collected from the partner survey will be used for descriptive implementation analysis. In addition, the partner survey data will be used to understand program implementation in the five RCT sites.
A.2.5 Telephone Interviews with Partners
Semi-structured telephone interviews will be conducted with partners from a subset of TechHire and SWFI grantees. These interviews will explore how partnerships were either built or expanded; how different categories of grant activities were coordinated across partners; and what worked well (or not) during implementation.
A.2.6 Second Round of Site Visits (Interviews with Grantee Staff and Partners)
A second round of site visits will be conducted in the five RCT sites. The purpose of the second round of site visits is to provide more detailed information on program implementation during the latter part of random assignment, when the programs are more mature. Each visit will be 2-3 days in length and will include one-hour interviews with grantee and partner staff and observations of program operations. The site visit interviews will use a modular interview guide in which respondents will be asked questions based on their knowledge and roles. Combining the site visit data with all other data collected about the sites’ programs will permit an analysis of why impacts may or may not occur for particular RCT sites. The visits will provide more nuanced information about implementation challenges encountered and lessons learned about implementation and operations.
A.3 Use of Information Technology
The grantee and partner surveys will be administered as web surveys. DOL anticipates that web surveys will be less burdensome, as they will offer easy access and submission, while also allowing respondents to complete the survey at a time convenient to them and at their own pace. A web survey has the additional advantages of reducing the potential for errors by checking for logical consistency across answers, accepting only valid responses and enforcing automated skip patterns. Respondents will be provided the URL and a unique PIN to access the survey. To reduce burden, the web surveys will use drop-down menus so that respondents can quickly select answers from a list and employ automated skip patterns so respondents are only shown those questions that apply to them. To increase the response rates, automated weekly email reminders will be sent to all non-respondents.
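To illustrate the kind of logic described above, the minimal Python sketch below shows how an automated skip pattern and response validation might work in a web instrument. The question wording and the simple console interface are hypothetical illustrations, not drawn from the actual grantee or partner survey instruments or software.

```python
# Illustrative sketch of automated skip-pattern and validation logic.
# The question wording below is hypothetical and not taken from the surveys.

def ask(prompt: str, valid: set[str]) -> str:
    """Accept only valid responses, re-prompting on anything else."""
    answer = input(f"{prompt} {sorted(valid)}: ").strip().lower()
    while answer not in valid:
        answer = input(f"Please enter one of {sorted(valid)}: ").strip().lower()
    return answer

def run_module() -> dict:
    responses = {}
    responses["has_employer_partners"] = ask(
        "Does your program have employer partners?", {"yes", "no"})
    # Automated skip pattern: the follow-up question is shown only if it applies.
    if responses["has_employer_partners"] == "yes":
        responses["employers_help_design_training"] = ask(
            "Do employer partners help design training curricula?", {"yes", "no"})
    return responses

if __name__ == "__main__":
    print(run_module())
```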
The telephone interviews with grantees and partners will be semi-structured. The interviewers will take notes during the interviews and, with each respondent’s approval, record the interviews. Recording the interviews reduces the burden of contacting respondents afterward to verify the accuracy of the notes.
A.4 Identification of Duplication of Information Collection Efforts
DOL is not aware of any previous or planned effort to collect similar information concerning TechHire and SWFI program impacts or implementation. The data collection is needed to gather the information necessary to address the research questions of the evaluation. The information collected in the surveys and interviews is not available elsewhere. Existing sources of information, including the Workforce Integrated Performance System (WIPS), grantee applications, and performance progress reports will be used where possible, and the data collection efforts described here will not ask for this same information.
A.5 Impacts on Small Businesses or Other Small Entities
The data collection will not have an adverse impact on small entities. Grantees are community colleges, workforce development agencies, and community-based organizations that operate occupational training programs and provide related services. Some employer partners may be small businesses. Burden for these entities will be minimized by using web surveys so that they can respond at their convenience and collecting only the minimum amount of information necessary to answer the research questions. The site visits will be scheduled with grantees in advance at a time that is convenient, and the study team will ensure that the visit is efficient and productive.
A.6 Consequence to Federal Program or Policy if Collection is not Conducted
The evaluation will contribute to the body of literature about strategies to help low-wage workers obtain and advance in employment. Moreover, since DOL is funding other H-1B skills training programs, it is important to have rigorous information about the implementation and impact of the programs. If the information collection is not conducted, DOL will not be able to determine whether the grantee programs are effective and which strategies are effective.
If the grantee and partner surveys and semi-structured telephone interviews are not conducted, there will be no information regarding the context, design, implementation, operation, replicability, and sustainability of the grant programs. As such, DOL/ETA will not be able to determine whether the grant programs are effective.
Finally, without the site visit interviews, the evaluation team would not have information about how the programs are implemented, making it difficult to interpret the impact findings.
A.7 Special Data Collection Circumstances
There are no special circumstances relating to the guidelines of 5 CFR 1320.5. This request fully complies with 5 CFR 1320.5.
A.8 Federal Register Notice
DOL published a notice on February 23, 2018 in the Federal Register, Volume 83, Number 37, pages 8108-8110 (83 FR 8108) and provided a 60-day period for public comments. A copy of this notice is included in this package. DOL did not receive any public comments.
The following people were consulted in developing the study design.
Technical Working Group
Kevin M. Hollenbeck, Ph.D., Vice President, Senior Economist, W.E. Upjohn Institute
Jeffrey Smith, Professor of Economics, Professor of Public Policy, University of Michigan
Gina Adams, Senior Fellow, Center on Labor, Human Services, and Population at The Urban Institute
David S. Berman, MPA, MPH, Director of Programs and Evaluation for the NYC Center for Economic Opportunity, in the Office of the Mayor
Mindy Feldbaum, Principal at the Collaboratory
A.9 Payments/Gifts to Respondents
No payments or gifts will be provided to respondents.
A.10 Assurance of Privacy
Information collected will be kept private to the extent permitted by law.
Westat and MDRC are very cognizant of federal, state, and DOL data security requirements. All Westat and MDRC study staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. No personally identifiable information (PII) is collected as part of this request. The evaluation team will take the following precautions to ensure the privacy and anonymity of all data collected:
All project staff, including research analysts, and systems analysts, will be instructed in the privacy requirements of the data and will be required to sign statements affirming their obligation to maintain privacy;
Only evaluation team members who are authorized to work on the project will have access to respondent contact information, completed survey instruments, and data files;
Data files that are delivered will contain no personal identifiers for respondents;
All data will be transferred via a secure file transfer protocol (FTP); and
Analysis and publication of project findings will be in terms of aggregated statistics only.
All respondents will be informed that the information collected will be reported in aggregate form only and no reported information will identify any individuals or organizations.
Access to the online surveys will require a unique PIN provided to the respondent. Survey data collection will use secure sockets layer encryption technology to ensure that information is secure and protected.
Recordings will be made of the interviews, subject to respondent approval. Interviews conducted during site visits will take place in a private meeting space. Written materials and analyses from the interviews to be used as part of study reports will be prepared in such a way as to protect the identity of individuals. Only study team staff present at the interviews, the principal investigator, the project director, and selected staff helping transcribe the recordings will have access to the notes. Notes will be securely stored in protected electronic files or locked cabinets. Only the staff members present at the interviews or transcribing the recordings will have access to the recordings. All staff conducting interviews, project leadership, and transcribing staff will sign privacy agreements before the interviews are conducted or before working with the data.
When not in use, all completed hardcopy documents will be stored in locked file cabinets or locked storage rooms. Unless otherwise required by DOL, these documents will be destroyed when no longer needed for the project. Evaluation team members working with the collected data will have previously undergone background checks that may include filling out an SF-85 or SF-85P form, authorizing credit checks, or being fingerprinted.
A.11 Justification of Questions of a Sensitive Nature
Respondents will not be asked questions of a sensitive nature.
A.12 Estimate of Annualized Burden Hours and Costs
Table A.12 presents the estimated respondent hour and cost burden. Burden estimates are annualized over a three-year period. Burden estimates are based on the contractor’s experience conducting similar data collections and on hourly wage rates from the Bureau of Labor Statistics (BLS).
Grantee Survey: The annual number of respondents for the grantee survey is (53 / 3) = 17.67. The grantee survey will take respondents 1 hour on average to complete. The annual burden hours for the grantee survey are (17.67 x 1) = 18 hours (rounded).
Grantee Telephone Interview: The grantee telephone interviews will include up to 3 grantee staff—the project director, program manager, and one other staff member as appropriate. The total number of respondents for the grantee telephone interview is (53 * 3) = 159. The annual number of respondents is (159 / 3) = 53. The interviews will take one hour to complete. Therefore, the annual burden hours for the grantee telephone interviews are (53 x 1) = 53.
Grantee Partner Survey: Based on grantee applications, we estimate that there are 1,200 partners. We expect an 80 percent response rate to the partner survey. The 80 percent response rate equates to (.80 x 1,200) = 960 total partner respondents. The annual number of respondents is (960 / 3) = 320. Completion of the partner survey will take approximately 30 minutes. The annual burden hours are 320 x 30/60 = 160 hours.
Partner Telephone Interview: The partner telephone interviews will be conducted with 53 partners. The partner telephone interviews will include up to 2 staff from each organization. The total number of respondents is (53 x 2) = 106. The annual number of respondents is (106 / 3) = 35.33. The interviews will take one hour to complete. Therefore, the annual burden hours for the partner telephone interviews are (35.33 x 1) = 35 hours (rounded).
Key Informant Interview during site visit: The evaluation team will interview approximately 10 grantee staff and 8 partner staff in each of the five grantees included in the RCT. The number of grantee staff is 10 x 5 = 50 and partner staff is 8 x 5 = 40 and the total is 50 + 40 = 90. The number of annual respondents is (90 / 3) = 30. The interviews will take approximately one hour to complete. Therefore, the annual number of burden hours is 30 x 1 hour = 30 hours.
The cost represents the sum across the data collections when the average hourly wage rate for each respondent category is multiplied by the corresponding number of hours, as shown in Table A.12. The average hourly wage rates were obtained using the latest Occupational Employment Statistics data on wages, adjusted for inflation.1 The annual cost to respondents for these data collections is $15,216.
Table A.12 Estimated Annual Respondent Hour and Cost Burdens
Instruments | Number of Respondents | Number of Responses per Respondent | Total Number of Responses | Avg. Burden per Response (in Hrs.) | Total Hour Burden (Rounded) | Average Wage Rate a | Annual Cost Burden
Grantee survey | 18 | 1 | 18 | 1 | 18 | $48.46 | $872
Grantee telephone interview | 53 | 1 | 53 | 1 | 53 | $48.46 | $2,568
Partner contact information template | 18 | 1 | 18 | 1 | 18 | $48.46 | $872
Partner survey | 320 | 1 | 320 | 30/60 | 160 | $48.46 | $7,754
Partner telephone interview | 35 | 1 | 35 | 1 | 35 | $48.46 | $1,696
Key informant interview during site visit | 30 | 1 | 30 | 1 | 30 | $48.46 | $1,454
Total | 474 | | | | 314 | | $15,216
a The hourly wage rate for grantee and partner staff was taken from Bureau of Labor Statistics, “Occupational Employment Statistics—May 2016 National Occupational Employment and Wage Estimates” found at: https://www.bls.gov/oes/current/oes_stru.htm#00-0000: Management Occupations (SOC code 11-0000).
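As an illustrative check of the arithmetic, the Python sketch below reproduces the annual burden-hour and cost figures in Table A.12. The total respondent counts, three-year annualization, average burden per response, and the $48.46 wage rate are all taken from the table and the calculations above; the sketch is not part of the information collection.

```python
# Illustrative recomputation of the annual burden figures in Table A.12.
# All inputs come from Section A.12: total respondents over the three-year
# collection period, hours per response, and the BLS-based wage rate.

WAGE_RATE = 48.46  # average hourly wage rate (BLS Management Occupations, SOC 11-0000)

# (instrument, total respondents over 3 years, average hours per response)
instruments = [
    ("Grantee survey",                        53,  1.0),
    ("Grantee telephone interview",          159,  1.0),   # 53 grantees x 3 staff
    ("Partner contact information template",  53,  1.0),
    ("Partner survey",                       960,  0.5),   # 80% of 1,200 partners
    ("Partner telephone interview",          106,  1.0),   # 53 partners x 2 staff
    ("Key informant interview (site visit)",  90,  1.0),   # (10 + 8) staff x 5 sites
]

total_hours = 0
total_cost = 0
for name, respondents, hours_per_response in instruments:
    annual_respondents = round(respondents / 3)                  # annualized count
    annual_hours = round(annual_respondents * hours_per_response)
    annual_cost = round(annual_hours * WAGE_RATE)
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{name}: {annual_respondents} respondents, "
          f"{annual_hours} hours, ${annual_cost:,}")

print(f"Total: {total_hours} hours, ${total_cost:,}")  # 314 hours, $15,216
```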
A.13 Estimates of Annualized Respondent Capital and Maintenance Costs
There are no additional costs to respondents other than their time.
A.14 Estimates of Annualized Cost to the Government
The total cost to conduct the information collected in this request is $784,984. The annualized cost is $784,984 / 3 = $261,661.
The estimated cost to the federal government for the contractor to carry out this study, based on a detailed budget of contractor labor and other costs, is $727,000 for survey development, data collection, and analysis.
In addition, DOL expects the annual level of effort for federal government technical staff to oversee the contract will require 200 hours for one Washington, D.C.-based GS-14, Step 4 employee earning $60.40 per hour.2 To account for fringe benefits and other overhead costs, the agency applies a multiplication factor of 1.6. The annual cost is $19,328 ($60.40 x 1.6 x 200 = $19,328). The data collection period covered by this justification is three years, so the estimated total cost is $57,984 ($19,328 x 3 = $57,984). The total cost is $727,000 + $57,984 = $784,984.
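For transparency, the short sketch below simply traces the cost arithmetic stated above, using only the figures given in this section (contractor budget, GS-14 hourly rate, overhead multiplier, oversight hours, and the three-year period).

```python
# Illustrative trace of the Section A.14 cost arithmetic; all inputs are from the text.

CONTRACTOR_COST = 727_000   # contractor budget for development, collection, analysis
HOURLY_RATE = 60.40         # GS-14, Step 4, Washington, D.C. locality rate
OVERHEAD_FACTOR = 1.6       # multiplier for fringe benefits and other overhead
OVERSIGHT_HOURS = 200       # annual federal technical staff hours
YEARS = 3                   # data collection period covered by this request

annual_federal_cost = HOURLY_RATE * OVERHEAD_FACTOR * OVERSIGHT_HOURS  # $19,328
total_federal_cost = annual_federal_cost * YEARS                       # $57,984
total_cost = CONTRACTOR_COST + total_federal_cost                      # $784,984
annualized_cost = total_cost / YEARS                                   # ~$261,661

print(f"Total: ${total_cost:,.0f}  Annualized: ${annualized_cost:,.0f}")
```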
A.15 Changes in Hour Burden
This is a new data collection.
A.16 Plans for Tabulation and Publication
The data collection activities in this request will support the following major deliverables for the Evaluation of Strategies Used in the TechHire and SWFI Grant Programs:
Short Paper 1. The first of two short papers will focus on interim implementation lessons, grantee strategies, and program participation. This short paper will include data gathered from the site visits to RCT grantees.
Short Paper 2. The second short paper will document short-term impacts based on the 6-month survey. The main short-term outcomes of interest will include training completion or continued enrollment in training and educational progress. It will also consider impacts on employment and childcare arrangements.
Issue Brief 1. An issue brief describing interim implementation and impact findings based on the two short papers will be submitted to DOL.
Final Report. A final report documenting the impact and implementation findings will be submitted to DOL. The report will document the effects of participation on employment and earnings using the National Directory of New Hires (NDNH) data, and on employment, wages, and hours worked using the 18-month survey. The report will include analysis of impacts for key subgroups, pooling across sites.
Issue Brief 2. The final issue brief, to be delivered in June 2021, will focus on final report highlights.
A.17 Approval to Not Display the Expiration Date
The OMB expiration date will be displayed on all written instruments used for the interview and survey data collections.
A.18 Exceptions to the Certification Statement
There are no exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9) for this study.
1 Bureau of Labor Statistics, “Occupational Employment Statistics—May 2016 National Occupational Employment and Wage Estimates” https://www.bls.gov/oes/current/oes_nat.htm
2 See Office of Personnel Management 2018 Hourly Salary Table: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2018/DCB_h.pdf