
PART A: JUSTIFICATION FOR APPRENTICESHIP EVIDENCE-BUILDING PORTFOLIO EVALUATION

OMB NO. 1290-0NEW

SEPTEMBER 2020


PART A: JUSTIFICATION

The Chief Evaluation Office of the U.S. Department of Labor (DOL) commissioned the high-priority Apprenticeship Evidence-Building Portfolio evaluation contract to build evidence on apprenticeship, including apprenticeship models, practices, and partnership strategies in high-growth occupations and industries. DOL’s initiatives to expand access to apprenticeship opportunities support the Presidential Executive Order “Expanding Apprenticeships in America.” The portfolio of initiatives includes the Scaling Apprenticeship Through Sector-Based Strategies grants, the Closing the Skills Gap grants, the Veterans’ Employment and Training Service (VETS) Apprenticeship pilot, and other DOL investments. The Urban Institute and its partners, Mathematica Policy Research and Capital Research Corporation, were contracted to conduct the study of these efforts.


This package requests clearance for seven data collection instruments as part of the study:


  1. A baseline survey and consent form for program participants

  2. A baseline survey and consent form for program staff

  3. An interview guide for program staff

  4. An interview guide for program partners

  5. A focus group guide for program participants

  6. An interview guide for military apprenticeship placement counselors

  7. An interview guide for military participants


DOL will submit additional information collection requests (ICRs) for future data collection under this overall study.


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The Department of Labor and industry have invested billions of dollars over the past decade to encourage, develop, and expand industry-driven apprenticeship training nationwide. Much of the federal investment is through program grants and technical assistance. The breadth of apprenticeship investments has resulted in a diverse sectoral, geographic, and institutional mix of apprenticeship programs and projects. This project will build the evidence base on apprenticeship in three ways: a careful review of existing evidence and information; a rigorous implementation study to specify apprenticeship typologies and models, including a range of work-based training; and the development of rigorous impact evaluation design options to analyze the impacts of various models and strategies.


The Scaling Apprenticeship Through Sector-Based Strategies grants ($183.8 million) and the Closing the Skills Gap grants ($100 million) are the two largest recent federal apprenticeship investments and a primary focus of the proposed project. The Scaling Apprenticeship grant awards, announced in June 2019, focus on accelerating expansion of apprenticeships to more sectors with high demand for skilled workers, namely occupations and industries applying for H-1B worker visas. Closing the Skills Gap awards, announced in fall of 2019, are intended to promote apprenticeship as a method for closing the gap between employer skill demands and the skills of the workforce. The source of funding for both grant programs is fee revenue from Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998, and a substantial portion of grant funds are required to be spent on training activities. In addition, starting in early 2020, the Transitioning Service Member Apprenticeship Demonstration will be rolled out to eight military installations.


Although the evidence base on apprenticeship in the U.S. is growing, there are still several key knowledge gaps that are ripe for rigorous evaluations and evidence-building. Policymakers, researchers, evaluators, and practitioners are generally persuaded that apprenticeship has positive net benefits, but more evidence is needed on which models work in specific occupational contexts and for particular subgroups of apprentices. Impact analysis is needed to better understand what apprenticeship models and components are most effective for apprentices in various industries and occupations.


Citation of sections of laws that justify this information collection: The Scaling Apprenticeship Through Sector-Based Strategies grants, Closing the Skills Gap grants, and VETS Apprenticeship pilot and subsequent evaluation are funded by a portion of H-1B visa fees, which are authorized under Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998, which states that “the Secretary of Labor shall . . . award grants to eligible entities to provide job training and related activities for workers to assist them in obtaining or upgrading employment in industries and economic sectors . . . projected to experience significant growth and ensure that job training and related activities funded by such grants are coordinated with the public workforce investment system (29 USC 3224(a)).”


This is a new collection request associated with the Apprenticeship Evidence-Building Portfolio.

This package requests clearance for seven data collection activities, which need to begin in September 2020. Because the grantees are beginning to enroll participants, a timely start to the information collection is critical for conducting the evaluations.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The data collected through the activities summarized in this request will be used to conduct: (1) an implementation study of the Scaling Apprenticeship, Closing the Skills Gap, and other similar DOL initiatives to develop typologies of apprenticeship models and practices, identify promising strategies across the portfolio, and better understand the implementation of models to help interpret impact evaluation findings; (2) an impact evaluation to examine the effectiveness of the models on participants’ outcomes, such as employment, earnings, and career advancement; and (3) an implementation study of the VETS Apprenticeship pilot to understand service delivery design and implementation, challenges, and promising practices.


The overall study will address the following research questions:


  1. What are promising strategies for enhancing existing apprenticeship models or building new models to better serve, recruit, and retain individuals typically underrepresented in apprenticeship, such as those with disabilities, women, people of color, ex-offenders, and veterans and transitioning service members? (implementation and impact evaluation)

  2. Which industry sectors, occupations, and types of companies appeared to be the most promising for expanding apprenticeships, and why? Were they registered or unregistered apprenticeship programs? (implementation evaluation)

  3. What types of program components, or combinations of components, were designed and implemented in the apprenticeship programs? What challenges did programs face in implementation, and how were those challenges overcome? What implementation practices appear promising for replication? What types of strategies and approaches were implemented or taken to scale, and what policy changes were developed and implemented that led to systems change? (implementation evaluation)

  4. What stakeholders were involved in the design or implementation of the apprenticeship program? What roles do sponsors and third parties play in engaging employers and apprentices? How were partnerships built and maintained? What factors influenced the development and maintenance of the partnerships? Did partnerships change or evolve over time, and if so, how and why? (implementation evaluation)

  5. What type of assistance was provided to increase employer engagement? How did implementation vary by employer characteristics, such as industry, type, and size? What were the reasons employers chose to either invest in a new apprenticeship program or expand their existing apprenticeship program? What types of outreach were used to engage employers, and did outreach differ by industry? (implementation evaluation)

  6. What are the characteristics of program infrastructure, quality assurance, data management, and technical assistance? What metrics and data are used by different stakeholders to define and measure success of the apprenticeship program? (implementation evaluation)

  7. What is the role of apprenticeship placement counselors in assisting transitioning service members to learn about, search for, secure, and complete apprenticeships? How and to what extent do placement counselors conduct outreach to employers, group sponsors, local workforce boards, and other local stakeholders to identify apprenticeship opportunities? How do placement counselors assess, match, and place transitioning service members into apprenticeships? (implementation evaluation)

  8. What are the impacts of apprenticeship models, components, and strategies on apprentices’ employment, earnings, and career advancement? (implementation and impact evaluation)

  9. What are the proximate impacts of intervening strategies that may be related to employment outcomes? (implementation and impact evaluation)

  10. What are promising strategies for improving individuals’ recruitment, retention, and completion of pre-apprenticeship and apprenticeship programs? Do they differ for underrepresented populations? (implementation and impact evaluation)

  11. What are promising strategies for improving individuals’ employment outcomes? Do they differ for underrepresented populations? (implementation and impact evaluation)


The evidence generated by the evaluation will be relevant not only to the sites and their partners participating in the DOL initiatives, but also to DOL policymakers and administrators assessing current and future apprenticeship initiatives, and to employers, training institutions, and workforce development partners seeking knowledge and evidence about effective models, practices, partnerships, and strategies to improve and scale their systems.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


This project will use multiple applications of information technology to reduce burden. As described below, information technology will be used to collect baseline data and participant identifying and contact information, and to administer the consent process.


RAPTER® is a secure, web-based system that program staff will use to administer consent to participants, collect their identifying and contact information, and conduct random assignment. The use of check boxes, drop-down menus, and predefined response categories will minimize data entry burden. Participants completing the baseline survey via the web, or program staff entering baseline survey information on behalf of participants, will use the RAPTER® interface to complete baseline information. RAPTER® uses a secure log-in and password.
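To make the random assignment step concrete, the sketch below shows one way a web-based intake system could deterministically assign consented participants to study arms. It is a generic, hypothetical illustration only; it is not RAPTER®’s actual implementation or interface, and the function name, seed, and participant IDs are invented for this example.

```python
import hashlib

# Hypothetical sketch of lottery-style random assignment after consent is
# recorded. NOT RAPTER(R)'s actual implementation or API.

def assign_condition(participant_id: str, seed: int = 20200820) -> str:
    """Deterministically assign a consented participant to a study arm.

    Hashing the participant ID with a fixed seed makes the assignment
    reproducible and auditable without a separate lookup table.
    """
    digest = hashlib.sha256(f"{seed}:{participant_id}".encode()).hexdigest()
    # Treat the hash as a uniform draw on [0, 1); split 50/50.
    draw = int(digest, 16) / 16 ** len(digest)
    return "treatment" if draw < 0.5 else "control"

if __name__ == "__main__":
    for pid in ["P-0001", "P-0002", "P-0003"]:
        print(pid, assign_condition(pid))
```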


The baseline survey will have the capability to be hosted on the Internet via a live secure web-link. To reduce burden, the survey will employ the following: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question.
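To illustrate features (3) and (4) concretely, the minimal sketch below shows a dynamic skip pattern and a logical response rule of the kind described above. The question names and the 0-100 hours range are hypothetical; this is not the actual survey instrument or platform.

```python
# Hypothetical sketch of a skip pattern and a response-validation rule;
# question names and ranges are illustrative only.

def should_show_employment_detail(answers: dict) -> bool:
    """Dynamic question: show follow-up items only to employed respondents."""
    return answers.get("currently_employed") == "yes"

def validate_hours_worked(value) -> bool:
    """Logical rule: restrict weekly hours to a plausible range (0-100)."""
    try:
        hours = float(value)
    except (TypeError, ValueError):
        return False
    return 0 <= hours <= 100

answers = {"currently_employed": "no"}
assert not should_show_employment_detail(answers)  # follow-ups are skipped
assert validate_hours_worked("40") and not validate_hours_worked("168")
```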


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The apprenticeship evaluations will not require collection of information that is available through alternate sources. The participant baseline survey and consent forms; program staff baseline survey and consent forms; program staff interviews; partner interviews; participant focus groups; apprenticeship placement counselor interviews; and military participant interviews will collect new data that are not available elsewhere.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The baseline survey will be administered only to study participants, not to small businesses or other small entities. However, the evaluation grantees conducting intake could be small organizations, such as businesses or nonprofit organizations. If small businesses are involved, only the minimal amount of data needed for this study will be collected.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


If these one-time data are not collected, DOL will not be able to determine the effectiveness of its apprenticeship investments and the various models, programs, components, and strategies being used. An implementation study will provide important information on ways to improve apprenticeship models and approaches. An impact study is needed to better understand what apprenticeship models and components are most effective for apprentices in various industries and occupations. The evidence generated by the study will benefit DOL and its apprenticeship grantees, as well as federal policymakers and administrators assessing current and future apprenticeship initiatives, and employers, training institutions, and workforce development partners seeking knowledge and evidence about effective models, practices, partnerships, and strategies to improve and scale their systems.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* Requiring respondents to report information to the agency more often than quarterly;

* Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* Requiring respondents to submit more than an original and two copies of any document;

* Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances for the proposed data collection.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


1. Federal Register announcement

The 60-day notice to solicit public comments was published in the Federal Register on December 19, 2019 (84 FR 69778). We received comments from the public on the draft Information Collection Request (ICR). (See Attachment H, which provides a summary of the comments and responses by the contractor team.)


2. Consultation outside of the agency


The project includes a Technical Working Group (TWG) to provide substantive feedback throughout the project period, particularly on the impact evaluation design. Members of the TWG are listed in Table A.1. They have expertise in research methodology as well as on programs and populations similar to those being served in the apprenticeship grant initiatives.


Table A.1. Technical Working Group Members

| Member | Title and Affiliation |
| --- | --- |
| Carolyn Heinrich | Patricia and Rodes Hart Professor of Public Policy, Education, and Economics, Vanderbilt University |
| Susan Helper | Frank Tracy Carlton Professor of Economics, Weatherhead School of Management, Case Western Reserve University |
| Chris Magyar | Chief Apprenticeship Officer, Techtonic Inc. |
| Mary Alice McCarthy | Director of the Center on Education & Skills, New America |
| Jeffrey Smith | Paul T. Heyne Distinguished Chair in Economics and Richard Meese Chair in Applied Econometrics, University of Wisconsin-Madison |


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to program and partner staff, as activities are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay. Respondents participating in the participant focus groups will receive a $25 gift card or cash.

10. Describe any assurance of privacy provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


All respondents taking part in data collection activities are assured that the information collected will be kept private to the extent permitted by law. Depending on the data collection activity, respondents will sign a privacy form (Attachments A-B for the impact study individual data collection), be read a privacy statement at the start of phone or in-person interviews (see Attachments C-D and F-G), or sign an informed consent sheet at the start of in-person focus groups (see Attachment E). In each activity, participants are informed that all data will be used for research purposes only, that data will be kept securely, and that individually identifiable data will not be shared with program staff or the Department of Labor. They are also assured that no one will ever publish their name in connection with the information collected, but that their information will be combined with data from other individuals in the study so researchers can describe overall program effects, participants’ experiences, and program implementation. Further, all respondents are assured that participation is completely voluntary and are given the option of not answering any individual question. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects involving sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements, including those set out by the Urban Institute Institutional Review Board (IRB).

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Evaluating the apprenticeship grants using an impact study methodology requires asking sensitive questions related to Social Security numbers, wage rates and earnings, economic hardships, and involvement in the criminal justice system. The project team will seek IRB approval for the final, OMB-approved instruments.


12. Provide estimates of the hour burden of the collection of information.

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


Table A.2 provides the annualized respondent hour and cost burden estimates for the data collection activities for which this package requests clearance. The evaluation is requesting clearance for a period of three years. Burden estimates are based on the study team’s experience conducting similar data collections. The table reflects estimated total respondent numbers that have been annualized over the 3 years of the study.


Table A.2. Estimated annualized respondent hour and cost burden

| Type of Instrument | Number of Respondents | Number of Responses Per Respondent | Total Number of Responses | Average Burden Per Response (in hours) | Estimated Burden Hours | Average Hourly Wage [1] | Annual Burden Costs |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Baseline survey and consent – program participants | 5,000 [2] | 1 | 5,000 | 0.33 | 1,667 | $7.25 | $12,083.33 |
| Baseline survey and consent – program staff [3] | 200 [4] | 25 | 5,000 | 0.33 | 1,667 | $35.05 | $58,416.67 |
| Interview guide – program staff | 28 [5] | 1 | 28 | 1 | 28 | $35.05 | $981.40 |
| Interview guide – program partners | 42 [5] | 1 | 42 | 1 | 42 | $35.05 | $1,472.10 |
| Focus group guide – program participants | 70 [6] | 1 | 70 | 1.5 | 105 | $7.25 | $761.25 |
| Interview guide – military apprenticeship placement counselors | 6 | 1 | 6 | 1 | 6 | $35.05 | $210.30 |
| Interview guide – military participants | 8 [7] | 1 | 8 | 1 | 8 | $7.25 | $58.00 |
| Total | 5,354 | | 10,154 | | 3,523 | | $73,983.05 |

[1] Hourly wage for program staff and partners reflects the May 2019 mean hourly wage estimate for “social and community service managers” as reported by the U.S. Department of Labor, Bureau of Labor Statistics, “May 2019 National Occupational Employment and Wage Estimates United States” (accessed June 28, 2020, from https://www.bls.gov/oes/current/oes_nat.htm).

[2] Assumes 5,000 participants randomized every year.

[3] The burden estimate for program staff assumes they will help collect the baseline survey and consent form information from program participants.

[4] Assumes 200 staff assist in participant randomization every year, each serving 25 participants.

[5] Assumes 7 sites visited per year, with 4 program staff interviews and 6 partner interviews per site.

[6] Assumes 1 focus group with 10 participants at each site visited.

[7] Assumes interviews with 5 participants in each of 3 sites, spread over two years.
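As an arithmetic check, the first row of Table A.2 follows directly from the footnoted assumptions (the 0.33-hour average burden is a rounded one third of an hour, i.e., 20 minutes):

```latex
\[
5{,}000\ \text{responses} \times \tfrac{1}{3}\ \text{hour} \approx 1{,}667\ \text{hours},
\qquad
1{,}666.67\ \text{hours} \times \$7.25/\text{hour} \approx \$12{,}083.33 .
\]
```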


13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no direct costs to respondents other than their time.


14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


The total cost to the Federal government over three years is $1,134,804, and the annualized cost to the federal government is $378,268. Costs result from the following two categories:


  1. The annualized cost to the federal government for the evaluation contractor, the Urban Institute and its partners Mathematica and Capital Research Corporation (Contract Number: DOL-1605DC-19-F-00312), to carry out this evaluation is $365,480.[1] The total cost of the data collection is $698,147 for the base contract and $398,294 for the VETS apprenticeship program data collection over three years. Therefore, the annualized cost is ($698,147 + $398,294) / 3 = $365,480.


  2. The annualized cost for federal technical staff to oversee the evaluation is $12,788. This is calculated as follows: an annual level of effort of 200 hours for one Washington, DC-based federal GS-14, step 4 employee earning $63.94 per hour. (See the Office of Personnel Management 2020 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2020/DCB_h.pdf.) Therefore, the annualized cost is 200 hours x $63.94 = $12,788.


The total annualized cost to the federal government is $378,268 ($365,480 + $12,788 = $378,268).
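In equation form, the cost components reconcile as follows:

```latex
\begin{aligned}
\text{Contractor (annualized)} &= (\$698{,}147 + \$398{,}294) / 3 \approx \$365{,}480\\
\text{Federal staff (annualized)} &= 200\ \text{hours} \times \$63.94/\text{hour} = \$12{,}788\\
\text{Annualized total} &= \$365{,}480 + \$12{,}788 = \$378{,}268\\
\text{Three-year total} &= 3 \times \$378{,}268 = \$1{,}134{,}804
\end{aligned}
```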


15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a new information collection.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Data collection will begin in September 2020 and will end in September 2023. After data collection, data will be presented in summary formats, tables, charts, and graphs to illustrate the results. Interim briefs will be submitted in 2021. A final report will be submitted in 2024.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.


18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


No exceptions are necessary for this information collection.

[1] The total contractor cost includes the cost of the $25 gift cards paid to focus group participants.
