
Apprenticeship Evidence-Building Portfolio Evaluation

OMB: 1290-0041


OMB Supporting Statement

PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

In this document, the Department of Labor (DOL) requests clearance from the Office of Management and Budget (OMB) under the Paperwork Reduction Act (PRA) for a new collection associated with the Apprenticeship Evidence-Building Portfolio. DOL's Chief Evaluation Office commissioned the Apprenticeship Evidence-Building Portfolio evaluation contract to build evidence on apprenticeship, including apprenticeship models, practices, and partnership strategies in high-growth occupations and industries.


We discuss here nine instruments that are part of three studies: 1) an implementation evaluation of the Scaling Apprenticeship and Closing the Skills Gap grant programs to develop typologies of apprenticeship models and practices, identify perceived promising strategies across the portfolio, and better understand the implementation of models to help interpret impact evaluation findings; 2) an assessment of registered apprenticeship state systems and partnerships to provide important information on their capacity to develop, design, modify, implement, replicate, sustain, expand/scale up, and evaluate apprenticeship strategies and models; and 3) an implementation evaluation of the Youth Apprenticeship Readiness grant program to understand service delivery design and implementation and perceived challenges and promising practices.


  1. Scaling Apprenticeship and Closing the Skills Gap Grants survey of grantee staff

  2. State System Capacity Assessment semi-structured interview protocol for state staff

  3. State System Capacity Assessment semi-structured interview protocol for local lead organization staff

  4. State System Capacity Assessment semi-structured interview protocol for local partner staff

  5. State System Capacity Assessment semi-structured interview protocol for employer partner staff

  6. Youth Apprenticeship Readiness Grant survey for program staff

  7. Youth Apprenticeship Readiness Grant semi-structured interview protocol for program staff

  8. Youth Apprenticeship Readiness Grant semi-structured interview protocol for program partners

  9. Youth Apprenticeship Readiness Grant semi-structured interview protocol for follow-up with program staff



B.1. Respondent Universe and Sampling

In this section, we describe the respondent universe and sampling for each study in turn.

Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation. The universe for the Scaling Apprenticeship and Closing the Skills Gap grants survey includes all Scaling Apprenticeship grantees and Closing the Skills Gap grantees. The survey is designed to provide the breadth of knowledge needed to systematically understand how grantees have structured and implemented their apprenticeship initiatives. The survey will be administered to all grantees and therefore does not require statistical methods for sampling.

State System Capacity Assessment. The universe for this study is all 50 states, of which a subset of approximately 15 states will participate in the study. The study team will use a purposive sampling strategy to select states based on information learned from other activities under the Apprenticeship Evidence-Building Portfolio and on information collected from the Registered Apprenticeship Partners Information Database System (RAPIDS) and quarterly performance reports for state apprenticeship grants. The goal of the selection process is to include states that can address the greatest number of topics included in the study. No statistical methods will be used to select the states for the semi-structured interviews, as the sample is intended to be neither random nor representative.

Youth Apprenticeship Readiness Grant Evaluation. This study includes a survey and semi-structured interviews. The universe for the survey includes all 14 Youth Apprenticeship Readiness grantees. The survey is designed to collect information on grant experiences and program development and registration progress. It will be administered to all grantees and therefore does not require statistical methods for sampling.

For the semi-structured interviews, the study team will use a purposive sampling strategy to select nine of the 14 Youth Apprenticeship Readiness grantees based on information collected from grantee applications, work plans and quarterly reports, and notes from clarification calls. The goal of the selection is to include grantees with a range of characteristics and experiences implementing the grant. Criteria for selection include organization type, region of the country, industry, number of youth to be served, partners, and high-level strategies. No statistical methods will be used to select the grantees for the semi-structured interviews, as the sample is intended to be neither random nor representative.

Table B.1 below presents, for each instrument and respondent type, the number of entities in the population, the number of respondents in the population (estimated as noted), the number of respondents in the sample, the expected response rate, and the final (maximum) number of respondents.

Table B.1. Population, Sample, and Expected Response Rate by Respondent Group

Instrument | Entities in population | Respondents in population | Respondents in sample | Expected response rate | Final number of respondents
Scaling Apprenticeship and Closing the Skills Gap Grants survey – grantee staff | 51 grantees | 51 (a) | 51 | 100% (a) | 51
State System Capacity Assessment interview protocol – state staff | 50 states | 250 (b) | 75 | 95% (b) | 71
State System Capacity Assessment interview protocol – local lead organization staff | 50 states | 300 (c) | 90 | 90% (c) | 81
State System Capacity Assessment interview protocol – local partner staff | 50 states | 500 (d) | 150 | 90% (d) | 135
State System Capacity Assessment interview protocol – employer partner staff | 50 states | 100 (e) | 30 | 64% (e) | 19
Youth Apprenticeship Readiness Grant survey – program staff | 14 grantees | 14 (f) | 14 | 100% (f) | 14
Youth Apprenticeship Readiness Grant interview protocol – program staff | 14 grantees | 56 (g) | 36 | 100% (g) | 36
Youth Apprenticeship Readiness Grant interview protocol – program partners | 14 grantees | 84 (h) | 60 | 90% (h) | 54
Youth Apprenticeship Readiness Grant interview protocol – follow-up with program staff | 14 grantees | 28 (i) | 18 | 100% (i) | 18

a. This is the number of grantees. Because Scaling Apprenticeship and Closing the Skills Gap grantees are required to participate in evaluation activities as a condition of the grants, we allow for maximum response here.

b. See https://www.mathematica.org/publications/state-experiences-expanding-registered-apprenticeship-findings-from-a-federal-grant-program

c. See https://www.mathematica.org/publications/state-experiences-expanding-registered-apprenticeship-findings-from-a-federal-grant-program

d. See https://www.mathematica.org/publications/state-experiences-expanding-registered-apprenticeship-findings-from-a-federal-grant-program

e. See https://www.dol.gov/sites/dolgov/files/OASP/legacy/files/Employer-Perspectives-Study-Report-Round-Final.pdf

f. This is the number of Youth Apprenticeship Readiness Grant (YARG) grantees (14). Because YARG grantees are required to participate in evaluation activities as a condition of the grant, we allow for maximum response here.

g. This is the number of YARG grantees (14) multiplied by the number of program staff (4) to be interviewed for each grantee. Because grantees are required to participate in evaluation activities as a condition of the grant, we allow for maximum response here.

h. See https://www.acf.hhs.gov/opre/report/health-profession-opportunity-grants-hpog-20-program-operator-and-partner-perspectives

i. This is the number of YARG grantees (14) multiplied by the number of program staff (2) to be interviewed for each grantee in the follow-up. Because grantees are required to participate in evaluation activities as a condition of the grant, we allow for maximum response here.
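For readers verifying Table B.1, the final number of respondents is the sample size scaled by the expected response rate, rounded to the nearest whole respondent. A minimal sketch of that arithmetic for the State System Capacity Assessment rows (group labels shortened for brevity):

```python
# Worked check of Table B.1: final respondents = sample x expected rate,
# rounded to the nearest whole respondent (e.g., 75 x 0.95 = 71.25 -> 71).
rows = {
    "state staff": (75, 0.95),
    "local lead organization staff": (90, 0.90),
    "local partner staff": (150, 0.90),
    "employer partner staff": (30, 0.64),
}
for group, (sample, rate) in rows.items():
    print(f"{group}: {round(sample * rate)}")  # 71, 81, 135, 19
```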




B.2. Procedures for the collection of information

Data for the three studies will be collected through online surveys, semi-structured interviews, and phone interviews; all are one-time data collection efforts.

The surveys will be programmed and administered using Qualtrics. This survey software offers a user interface that is modern, secure, and easy to navigate for respondents. The software will also facilitate generation of tabulations of responses as surveys are completed by grantees and processed. The survey will be hosted on the Internet via a live secure web-link. To reduce respondent burden, it will employ the following: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions; (2) drop-down response categories so respondents can quickly select from a list; (3) dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey); and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question. The surveys are provided in Attachment A for the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation and Attachment F for the Youth Apprenticeship Readiness Grant Evaluation.
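For illustration only, the sketch below shows the kind of skip-pattern and logical-rule behavior described in items (3) and (4). It is a simplified Python sketch, not Qualtrics code, and the question names ("num_partners", "percent_youth") are hypothetical.

```python
# Simplified illustration (not Qualtrics code) of a dynamic skip pattern
# and a logical response rule; question names are hypothetical.

def show_partner_module(answers: dict) -> bool:
    """Skip pattern: show partner questions only if partners were reported."""
    return answers.get("num_partners", 0) > 0

def valid_percent(value: float) -> bool:
    """Logical rule: a reported share must be a valid percentage."""
    return 0 <= value <= 100

answers = {"num_partners": 3, "percent_youth": 85}
print(show_partner_module(answers))             # True: show partner module
print(valid_percent(answers["percent_youth"]))  # True: answer accepted
print(valid_percent(120))                       # False: answer rejected
```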

The interviews will be semi-structured and conducted virtually or, if possible, in-person, depending on the study. We describe the procedures for conducting the interviews for each study in more detail below.

State System Capacity Assessment. The study team will conduct virtual semi-structured interviews with key state staff and partners and representatives of local/regional apprenticeship initiatives/programs. For each of the sites selected, the member of the two-person team primarily responsible for logistics will make initial contact by phone with the individual listed as the primary contact in the Office of Apprenticeship records. The team will then send an e-mail to inform the grantee organization of the study and request its cooperation. The initial telephone contact will provide background about the project and seek additional information on organizations and partners in order to identify key respondents. Based on this information, the team will contact respondents and determine the best timing for the interview in order to accommodate the schedule of local respondents. The interview protocols are provided in Attachments B, C, D, and E for the state staff, local lead organization staff, local partner staff, and employer staff.

Youth Apprenticeship Readiness Grant Evaluation. The study team will conduct three-day visits (in-person or virtually) to interview program staff and partners. The interviews will inquire about grantee experiences, apprenticeship models, partnerships, barriers, successes, and plans for sustainability. For each of the grantees selected, the member of the two-person site visit team primarily responsible for logistics will make initial contact by phone with the individual listed as the primary contact in the Office of Apprenticeship records. The site visit team will then send an e-mail to inform the grantee organization of the study and request its cooperation. The initial telephone contact will provide background about the project and seek additional information on organizations and partners in order to identify key respondents. Based on this information, the site visit team will contact respondents and determine the best timing for the visit in order to accommodate the schedule of local respondents. The study team will also conduct follow-up calls with the nine grantees participating in the semi-structured interviews described above to understand progress made on their program. These follow-up calls will be conducted with program staff. The interview protocols are provided in Attachments G, H, and I for the program staff, program partners, and follow-up with program staff.

Analysis plans across the studies. Data collection will generate a considerable volume of data (the surveys, site visit data such as interview notes, and grant documents and performance data on participant characteristics, activities, and outcomes) that the study team will analyze to address the research questions. We anticipate using two analytical techniques: a descriptive analysis and a thematic analysis.

Descriptive analysis. The descriptive analysis will provide a comprehensive picture of the components, models, partnerships, and strategies implemented by grantees. It will use data from the web-based surveys, supplemented where possible by other grantee performance data on participants, to create an analysis file. The team will first develop descriptive univariate tabulations of the survey data. They will then produce selected cross-tabulations to look at variation across, for example, populations served, geography, and program size. The analysis will also be aligned with the research questions and topics discussed earlier.
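As a simplified illustration of these tabulations, the sketch below uses pandas with hypothetical variables ("industry", "program_size") standing in for actual survey items.

```python
# Hypothetical sketch of the descriptive analysis; column names and values
# are illustrative, not actual survey items.
import pandas as pd

survey = pd.DataFrame({
    "industry": ["IT", "Healthcare", "IT", "Manufacturing"],
    "program_size": ["large", "small", "small", "large"],
})

# Univariate tabulation: distribution of grantees across industries.
print(survey["industry"].value_counts(normalize=True))

# Cross-tabulation: industry by program size.
print(pd.crosstab(survey["industry"], survey["program_size"]))
```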

Thematic analysis. The team will also conduct a thematic analysis of the interview data to provide an in-depth understanding of the implementation of components, models, partnerships, and strategies and to highlight promising approaches to apprenticeships. The coding and analysis approach will use applied thematic analysis.1 Applied thematic analysis uses textual data to “focus on identifying and describing both implicit and explicit ideas within the data, that is, themes” rather than only counting use of words or phrases within the text.2 The team will employ this inductive approach through team coding and analysis meetings in which site visit team members and coders discuss emerging themes to provide multiple perspectives. The coding will also be aligned with the detailed research questions and topics discussed earlier. The team will draw out themes and patterns across the grantees and types of respondents to build an understanding of the programs and services from all perspectives. The team will construct tables of data to organize the qualitative data in a way that these patterns can be observed. The report will highlight examples of the perceived promising program components, partnerships, and strategies.
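One common way to construct such tables is a grantee-by-theme matrix of coded excerpts. The sketch below is a simplified illustration with invented theme codes, not the study's actual coding scheme.

```python
# Hypothetical sketch: organize coded interview excerpts into a
# grantee-by-theme count matrix so cross-grantee patterns can be observed.
import pandas as pd

coded = pd.DataFrame({
    "grantee": ["A", "A", "B", "B", "B"],
    "respondent_type": ["staff", "partner", "staff", "employer", "staff"],
    "theme": ["recruitment", "employer engagement",
              "recruitment", "recruitment", "sustainability"],
})

# Counts of coded excerpts by grantee and theme; analysts review these
# alongside the underlying text during team coding meetings.
matrix = pd.crosstab(coded["grantee"], coded["theme"])
print(matrix)
```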

B.3. Methods to maximize response rates and minimize nonresponse

Survey response. The estimated response rate for the surveys is 100 percent, as participation in evaluation activities is required as a condition of the grant award and the universe is small, allowing for concentrated reminders and follow-up. The study team will make use of best practices to encourage high response rates while minimizing burden and nonresponse. For the surveys, these methods include:


Web administration. The study team will primarily administer the surveys online using Qualtrics. This allows respondents to complete the survey on their own schedule and at their own pace, including over multiple sessions. The web survey system used by the data collection team also supports mobile browsers, such as those on tablets or cellular phones.


Multiple modes of administration. To comply with Section 508 of the Rehabilitation Act, participants who may have difficulty completing a web survey will be offered the option of completing the surveys by telephone.


As this study is descriptive in nature, and given our experience with grantee surveys, we do not expect item nonresponse to be large and do not plan to use statistical methods to impute missing data. In cases where there are missing responses, we will use pairwise deletion. However, if there are respondents with large amounts of missing data (defined as missing most of one or more survey sections), we will compare and report on whether the basic characteristics of nonrespondents (size, percent of participants “most-in-need”, urbanicity) differ from those of respondents.
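A minimal sketch of that comparison, assuming hypothetical grantee characteristics ("size", "urban") and a flag for surveys that are mostly missing:

```python
# Hypothetical sketch of the nonresponse check: compare characteristics of
# grantees with mostly missing surveys against the rest (no imputation).
import pandas as pd

grantees = pd.DataFrame({
    "size": [250, 80, 400, 120],
    "urban": [1, 0, 1, 1],          # 1 = urban, 0 = non-urban
    "mostly_missing": [False, False, True, False],
})

# Mean characteristics by group, reported alongside the findings.
print(grantees.groupby("mostly_missing")[["size", "urban"]].mean())
```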


Interview response. For the interviews, we expect that all program staff identified will agree to participate. For the State System Capacity Assessment, we will work with the lead state staff person to identify state staff and local lead organizations. We will then work with the local lead organization contact to identify organization staff, partner staff, and employer staff. For the Youth Apprenticeship Readiness Grant Evaluation, we will work with the primary contact person for the grant to identify appropriate staff and partners and to schedule interviews. We will ask program staff to provide introductions to partners and to discuss the importance of the study to increase partner participation in the interviews.


B.4. Test procedures

Below, we discuss how the study team has sought to reduce, and to test, the time burden of the surveys.


Technology to reduce burden. To reduce burden, the survey will employ drop-down response categories so respondents can quickly select from a list, dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and logical rules for responses so respondents’ answers are restricted to those intended by the question. These features should minimize data entry burden by participants and facilitate high quality responses.



Testing questionnaire. For the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation, the study team tested the survey for clarity and length. The survey also draws on pre-tested questions from prior questionnaires used to study apprenticeship programs and U.S. Department of Labor grantees and grant partners, most notably questions from the American Apprenticeship Initiative (AAI) Evaluation questionnaire. After programming the survey, the study team tested the instrument with two grantees and experts at Urban.


For the Youth Apprenticeship Readiness Grant Evaluation, the study team internally tested the survey for length and to ensure the online prompts work as expected. Because many of the survey questions, and the survey's organization, are derived from the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation survey (which has been pre-tested) and the AAI grantee survey (which was pre-tested and fielded), we minimized the burden on the 14 grantees by not testing the survey with them.




Testing interview instruments. Interview instruments for the three studies were based on previously tested and reviewed instruments from the Scaling Apprenticeship and Closing the Skills Gap Grants Implementation Evaluation (cleared by OMB in a previous PRA package), the Evaluation of the State Apprenticeship Expansion Grants, and the AAI evaluation.



B.5. Individuals consulted on statistical aspects of design and on collecting and/or analyzing data

Staff responsible for overseeing the collection and analysis of data are listed in Table B.2, and individuals consulting on these efforts are listed in Table B.3.


Table B.2. Individuals overseeing the collection and analysis of data for the Apprenticeship Evidence-Building Portfolio

The Urban Institute
Barbara Butrica and Pamela Loprest – Project Directors
Daniel Kuehn – Deputy Project Director and Task Director
Lauren Eyster – Task Director

Mathematica
Samina Sattar – Project Director

Capital Research Corporation
John Trutko – Project Director


Table B.3. Individuals consulting on the collection and analysis of data for the Apprenticeship Evidence-Building Portfolio

The Urban Institute
Barbara Butrica and Pamela Loprest – Project Directors
Demetra Nightingale – Co-Principal Investigator
Daniel Kuehn – Deputy Project Director
William Congdon – Senior Advisor, Methods
Robert Lerman – Senior Advisor, Apprenticeship

Mathematica
Peter Schochet – Co-Principal Investigator
Samina Sattar – Project Director
Annalisa Mastri – Quality Control Advisor

Capital Research Corporation
John Trutko – Project Director

Technical Work Group Members
Carolyn Heinrich – Patricia and Rodes Hart Professor of Public Policy, Education, and Economics, Vanderbilt University
Susan Helper – Frank Tracy Carlton Professor of Economics at the Weatherhead School of Management, Case Western Reserve University
Chris Magyar – Chief Apprenticeship Officer, Techtonic Inc.
Mary Alice McCarthy – Director of the Center on Education & Skills, New America
Jeffrey Smith – Paul T. Heyne Distinguished Chair in Economics and Richard Meese Chair in Applied Econometrics, University of Wisconsin-Madison


1 Guest, MacQueen, and Namey 2012 (p. 9).

2 Guest, Greg, Kathleen M. MacQueen & Emily E. Namey. 2012. Applied Thematic Analysis. Thousand Oaks, CA: SAGE Publications.


