Supporting Statement Part B

Formative Data Collections for DOL Research

OMB: 1290-0043

PART B: Statistical methods for Formative Study of Trade Adjustment Assistance (TAA) Navigators

OMB NO. 1290 - 0043

MAY 2023


B1. Respondent Universe and Sampling Methods

The study will not use formal statistical sampling.

  • Data collection for four instruments in this request—(1) discussion guides for Navigators, administrators, and partners; (2) the TAA participant interview guide; (3) the discussion guide for employers; and (4) the Navigator observation rubric—will take place during site visits to a purposively selected group of nine states confirmed to employ TAA Navigators.

  • The fifth data collection instrument (the state TAA Coordinator questionnaire) will focus on TAA Coordinators from all of the jurisdictions that offer a TAA program, which include all 50 states, the District of Columbia, and the territory of Puerto Rico.

  • The sixth instrument (a discussion guide for non-participating states) will focus on eight states identified by DOL as having expressed interest in TAA Navigators but that confirmed via email that they have not moved forward.

See Table B.1 for estimates of universe and sample sizes for each data collection instrument. 

Table B.1. Sampling and response rate assumptions, by data collection instrument and respondent type

| Data collection instrument | Respondent type | Sampling method | Universe of potential respondents | Estimated selected sample | Average responses (per site) | Estimated responses per respondent | Estimated response rate | Estimated responses (across sites) |
|---|---|---|---|---|---|---|---|---|
| Discussion guides for Navigators, administrators, and partners (site visit) | Navigators, administrators, and partners (a) | Purposive | 150 | 150 | 15 | 1 | 90% | 135 |
| Interview guide (site visit) | TAA participants (b) | Purposive | 3,681 | 54 | 3 | 1 | 50% | 27 |
| Discussion guide for employers (site visit) | Employers (c) | Purposive | 135 | 34 | 3 | 1 | 80% | 27 |
| Navigator observation rubric (site visit) | Navigators and navigation recipients (d) | Purposive | 9 | 9 | 1 | 1 | 100% | 9 |
| State TAA Coordinator questionnaire | TAA Coordinators (e) | Universe | 52 | 52 | 1 | 1 | 90% | 47 |
| Discussion guide for non-participating states | State TAA administrators (f) | Purposive | 16 | 16 | 1 | 1 | 100% | 16 |


(a) We will attempt discussions with, on average, 15 Navigators, administrators, or partners per state, although the number of each type of staff can vary across the nine selected states. We expect a 90 percent response rate.

(b) On average, TAA programs have 409 participants per state based on the FY2021 annual report (https://www.dol.gov/sites/dolgov/files/ETA/tradeact/pdfs/AnnualReport21.pdf). We will attempt six interviews per state and expect a 50 percent response rate.

(c) We will attempt discussions with, on average, four employers per state. We expect an 80 percent response rate.

(d) We will directly observe one interaction between Navigators and TAA participants, potentially eligible workers, employers, or partners in each state as part of the site visit.

(e) We will electronically field the questionnaire with TAA Coordinators in all states as well as the District of Columbia and Puerto Rico. We expect a 90 percent response rate.

(f) We will have discussions with TAA administrators in states that had expressed interest in implementing a Navigator model but later decided not to do so. We expect to have discussions with, on average, two staff per state. We expect a 100 percent response rate.
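The final column of Table B.1 follows directly from two inputs: estimated responses across sites equal the estimated selected sample multiplied by the assumed response rate, rounded to the nearest whole respondent. A minimal sketch (the figures are taken from the table; the rounding convention is our inference):

```python
# Reproduce the "Estimated responses (across sites)" column of Table B.1.
# Each entry: (estimated selected sample, assumed response rate).
assumptions = {
    "Navigators, administrators, and partners": (150, 0.90),
    "TAA participants": (54, 0.50),
    "Employers": (34, 0.80),
    "Navigator observations": (9, 1.00),
    "State TAA Coordinators": (52, 0.90),
    "Non-participating state administrators": (16, 1.00),
}

# Rounding to the nearest whole respondent matches every figure in the table.
expected = {group: round(sample * rate) for group, (sample, rate) in assumptions.items()}
```

Applied to the table's inputs, this reproduces 135, 27, 27, 9, 47, and 16, the values in the final column.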

Discussion guides for Navigators, administrators, and partners. The goal of discussions with workforce Navigators, administrators, and partners is to understand the history of Navigator models and understand the barriers Navigators face. The study team anticipates an average of 15 participants from each of the nine TAA Navigator states during a site visit, for a total of 135 responses. The study team anticipates a 90 percent response rate. Similar studies, America’s Promise and the State Apprenticeship Expansion, have recently had similar response rates.1

TAA participant interview guide. The goal of the in-person interviews with program participants is to understand their experience with TAA Navigators—including access to services and barriers faced within the program. We will purposively select interview participants who have recently worked with Navigators to receive TAA services in coordination with state staff. The study team anticipates an average of three participants from each of the TAA Navigator states during a site visit, for a total of 27 participants. The study team anticipates a 50 percent response rate.

Discussion guide for employers. The goal of the semi-structured discussions with employers is to understand their experiences with TAA Navigators. We will purposively select, in coordination with state staff, employers who have worked with Navigators. The study team anticipates an average of three employers from each of the TAA Navigator states during a site visit, for a total of 27 employers. The study team anticipates an 80 percent response rate.

Navigator observation rubric. The goal of the Navigator observation is to document the content of the interaction between Navigators and navigation recipients and how services are delivered. We will select activities to directly observe in coordination with state staff. The study team anticipates an average of one direct observation from each of the TAA states during a site visit, for a total of nine direct observations. The study team anticipates a 100 percent response rate.

State TAA Coordinator questionnaire. The goal of the state TAA Coordinator questionnaire is to document compositional information about the Navigators in the TAA program, such as the number of Navigators, where they work within the TAA program, and their role and activities within the TAA program. The questionnaire will be fielded electronically among the 52 state TAA Coordinators. The study team anticipates a 90 percent response rate, with a single respondent for each questionnaire, for a total of 47 responses. A similar effort on the State Apprenticeship Expansion Grant Research Study yielded a 93 percent response rate among state apprenticeship agency directors.

Discussion guide for non-participating states. The goal of the semi-structured conference calls with these states is to learn why these states, which had expressed interest in implementing a Navigator model, later decided not to do so. We will purposively select eight states from among those identified by DOL as having expressed interest in TAA Navigators but that confirmed via email that they have not moved forward. The study team anticipates an average of two TAA administrators from each state who will jointly respond on behalf of the state, for a total of 16 responses. The study team anticipates a 100 percent response rate.

B2. Procedures for Collection of Information

Understanding the current implementation of TAA Navigator models and assessing the potential for a future impact study of Navigators requires collecting data from multiple sources. For the study, data collection will include discussions with Navigators, administrators, and partners; TAA participant interviews; discussions with employers; and Navigator observations. In addition, we will administer a questionnaire to state TAA Coordinators. This clearance includes six data collection instruments: discussion guides for Navigators, administrators, and partners (Appendix A); a TAA participant interview guide (Appendix B); a discussion guide for employers (Appendix C); a Navigator observation rubric (Appendix D); the state TAA Coordinator questionnaire (Appendix E); and a discussion guide for non-participating states (Appendix F), each described below.



Table B.2. Formative Study of TAA Navigators: Data collection activities

| Data collection instrument | Total participants | Timing | Mode | Time |
|---|---|---|---|---|
| Discussion guides to use with Navigators, administrators, and partners | 135 | Late spring or summer 2023 | In-person during site visit | 90 minutes |
| TAA participant interview guide | 27 | Late spring or summer 2023 | In-person during site visit | 60 minutes |
| Discussion guide for employers | 27 | Late spring or summer 2023 | In-person during site visit | 30 minutes |
| Navigator observation rubric | 9 | Late spring or summer 2023 | In-person during site visit | 60 minutes |
| State TAA Coordinator questionnaire | 47 | Spring 2023 | Web | 15 minutes |
| Discussion guide for non-participating states | 16 | Late spring or summer 2023 | Telephone | 90 minutes |

Discussion guides for Navigators, administrators, and partners. The study team will conduct 135 semi-structured discussions with state-level administrators, program administrators, and Navigators during site visits. In preparation for the site visits, the discussion guides for Navigators, administrators, and partners will be used to create individual discussion guides tailored to each respondent type.

TAA participant interview guide. The study team will conduct up to 27 in-person interviews with TAA participants who have interacted with a TAA Navigator. These interviews will allow participants to tell their own stories about barriers to access and their experiences with the Navigators. Interviews will be conducted during site visits or virtually. Interviews will be conducted in English or Spanish, and a live translator will be available if a worker’s preferred language is not Spanish. 

Discussion guide for employers. The study team will conduct up to 27 brief semi-structured discussions with employers about their experiences with a TAA Navigator—including how the Navigator assisted with petition filing or approval and with development of work-based learning opportunities. These discussions will take place during site visits and will last no more than 30 minutes.

Navigator observation rubric. The study team will conduct nine direct observations with TAA Navigators and navigation recipients (participants, potentially eligible workers, employers, or partners). Study team observers will use a structured rubric to document the content of the interaction and how services are delivered.

State TAA Coordinator questionnaire. We will field an electronic questionnaire to all 52 state TAA Coordinators, including the TAA Coordinators for the District of Columbia and Puerto Rico, to understand the prevalence of TAA Navigators across all states and other jurisdictions that offer a TAA program. Questionnaires will be collected via the web.

Discussion guides for non-participating states. The study team will conduct eight semi-structured conference calls with state TAA staff over the phone using the discussion guides. The conference calls will focus on why states did not adopt a Navigator model after initially expressing interest in the Navigator model.

  1. Statistical methodology, estimation, and degree of accuracy  

Discussions, interviews, and conference calls with Navigators, administrators, partners, TAA participants, and employers, and Navigator observations.

The main type of data collected from the discussions, interviews, observations, and conference calls will be qualitative information about staff and participant experiences and insights on the Navigator model or, in the case of TAA participants, their experiences working with Navigators. Thus, no statistical methodology (such as sample stratification) or estimation will be necessary to analyze the discussion, interview, observational, or conference call data. We will not make any declarative statements about the efficacy of strategies or practices implemented by programs. We will qualitatively describe these programs to inform DOL and the broader field about TAA Navigator models.

We will use NVivo, a qualitative data analysis software package, to analyze the qualitative data collected through discussions, interviews, observations, and conference calls, with thematic analysis informed by our conceptual framework. To extract data on key themes and topics, the study team will develop a coding scheme based on a directed approach to content analysis that attends both to themes predetermined by program goals and research questions and to emergent themes. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies.
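The double-coding step above is often summarized with a chance-corrected agreement statistic before discrepancies are resolved. As an illustrative sketch only (the study does not prescribe a specific statistic, and the code labels below are hypothetical), Cohen's kappa for two coders' assignments over the same excerpts can be computed as:

```python
def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' code assignments."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: share of excerpts given the same code.
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement if coders assigned codes independently at their own rates.
    labels = set(coder_a) | set(coder_b)
    p_exp = sum((coder_a.count(l) / n) * (coder_b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical codes two reviewers assigned to the same eight excerpts.
a = ["barrier", "barrier", "service", "outreach", "service", "barrier", "outreach", "service"]
b = ["barrier", "service", "service", "outreach", "service", "barrier", "outreach", "service"]
kappa = cohens_kappa(a, b)
```

For this example the statistic is about 0.81; in practice, low values would trigger the discrepancy-resolution discussions described above.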

State TAA Coordinator questionnaire. The main type of data collected from the questionnaire is descriptive information about whether the state has TAA Navigators, the number of Navigators, where they work within the workforce system, and their role and activities within the TAA program. No complex statistical methodology (such as sample stratification) or estimation will be necessary to analyze data from the state TAA Coordinator questionnaire. We will analyze the data using simple descriptive measures.

  2. Unusual problems requiring specialized sampling procedures 

Specialized sampling procedures are not required for administering interviews and observations during site visits and the state TAA Coordinator questionnaire.

  3. Periodic data collection cycles to reduce burden 

The data collection instruments for the study will be administered only once for any individual respondent. To further minimize burden on site visit discussion, interview, and conference call respondents, the study team will review pertinent data available on Navigator activities from states—such as TAA manuals, any outreach materials developed by Navigators, and state TAA data. Thus, the study team can focus the discussions with respondents on topics for which information is not otherwise available.    

B3. Methods to Maximize Response Rates and Deal with Nonresponse

Expected Response Rates

We expect to achieve a 90 percent response rate for discussions and interviews and a 100 percent response rate for direct observations and conference calls with non-participating states, due to the assistance of state and local TAA program staff members. We expect to achieve a 90 percent response rate for the state TAA Coordinator questionnaire due to the expected assistance from DOL staff members in the Employment and Training Administration office of TAA. We expect a 50 percent response rate for TAA participant interviews and an 80 percent response rate for discussions with employers. Experience on recent studies such as the America's Promise Job-Driven Grant Program Evaluation indicates that these populations are often more difficult to schedule for discussions, so we expect to increase the number sampled to reach the intended targets.

Dealing with Nonresponse

For discussions, interviews, and observations during site visits as well as conference calls with non-participating states, participants will not be randomly sampled, and findings are not intended to be representative. Consequently, we will not calculate nonresponse bias. Respondent demographics will be documented and reported in written materials associated with the data collection.

The state TAA Coordinator questionnaire will be fielded to all state TAA Coordinators as well as the TAA Coordinators for the District of Columbia and Puerto Rico. Because the anticipated response rate for the questionnaire is 90 percent, a plan for the assessment and correction of nonresponse bias will not be necessary. Instead, the study team will focus efforts on establishing and maintaining contact with sample members to achieve sufficient levels of response. To prepare data for analysis, a series of data checks will be run, allowing for an examination of frequencies and means and an assessment of the extent of missing data. Because the states and organizations in the population are not representative of one another, weighting or imputation to account for missing data will not be possible. We anticipate that, in most cases, missing responses will be dropped and the sample size noted for each question with missing data.
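The data checks described above (per-question frequencies and means, with missing responses dropped and the item-level sample size noted) can be sketched as follows; the question names and records are hypothetical, not drawn from the actual instrument:

```python
def item_summary(records):
    """Per-question n, missing count, and mean, dropping missing responses."""
    questions = sorted({q for record in records for q in record})
    summary = {}
    for q in questions:
        values = [r[q] for r in records if r.get(q) is not None]
        summary[q] = {
            "n": len(values),                       # sample size noted per question
            "missing": len(records) - len(values),  # extent of missing data
            "mean": sum(values) / len(values) if values else None,
        }
    return summary

# Hypothetical questionnaire records; None marks an item nonresponse.
responses = [
    {"num_navigators": 4, "uses_navigators": 1},
    {"num_navigators": None, "uses_navigators": 1},
    {"num_navigators": 2, "uses_navigators": 0},
]
summary = item_summary(responses)
```

Each question is summarized only over the respondents who answered it, matching the plan to note the per-question sample size rather than weight or impute.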

Maximizing Response Rates

Discussions with Navigators, administrators, and partners. The study team will be flexible in scheduling discussions to accommodate the particular needs of respondents. Although the study team will try to arrange discussions that accommodate respondents’ scheduling needs, there might be instances when a respondent is unable to meet while the team is on-site; when this happens, a member of the study team will request to schedule a follow-up conference call at a more convenient time.

TAA participant interviews. To encourage participation in the participant interviews, the study team will use methods that have been successful for numerous other Mathematica studies, including enlisting program staff in outreach to participants, providing easy-to-understand outreach materials, strategically scheduling interviews at convenient times and locations, and offering incentives to encourage participants to respond. 

Discussions with employers. The study team will be flexible in scheduling discussions with employers to accommodate the particular needs of respondents, including conducting discussions as conference calls. Although the study team will try to arrange discussions that accommodate respondents’ scheduling needs, there might be instances when a respondent is unable to meet while the team is on-site; when this happens, a member of the study team will request to schedule a follow-up conference call at a more convenient time.

Navigator observations. The study team will be flexible in scheduling observations to accommodate the schedules and needs of both Navigators and navigation recipients.

State TAA Coordinator questionnaire. The questionnaire will be designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed-ended questions, with a few open-ended questions). For the questionnaire, the study team will use best practices to encourage high response rates while minimizing burden and nonresponse. These methods include the following:  

  • Web administration. We anticipate most respondents will prefer to complete the questionnaire online. This choice allows respondents to complete the questionnaire on their own schedule and at their own pace, and to complete the questionnaire over multiple sessions. The web questionnaire system the data collection team uses also supports mobile browsers, such as tablets or cellular phones.     

  • Technology to reduce burden. To reduce burden, the questionnaire will employ drop-down response categories so respondents can quickly select from a list; dynamic questions; automated skip patterns so respondents see only questions that apply to them (including those based on answers provided previously in the questionnaire); and logical rules for responses so respondents’ answers are restricted to those intended by the question. These features should minimize data entry burden among participants and facilitate high quality responses. 

Conference calls with non-participating states. The study team will be flexible in scheduling conference calls to accommodate the particular needs of respondents.

B4. Tests of Procedures or Methods to be Undertaken

Discussion and interview guides for Navigators, administrators, partners, TAA participants, employers, and states, and Navigator observation rubric. To ensure the interview and discussion guides as well as the observation rubric are used effectively and yield comprehensive, comparable data across the study sites, senior research team members will use the discussions, interviews, and observations from the first data collection site visit to check that the guides and rubric include appropriate probes and all relevant topics of inquiry. This visit will also be used to assess the site visit agenda—including how data collection activities should be structured during each site visit—and ensure it is practical, accounting for the amount of data to be collected and the amount of time allotted for each data collection activity. The study team will train all site visitors on the data collection instruments to ensure a common understanding of the key objectives and concepts as well as fidelity to the guides and rubric. After the first data collection visit, the senior researchers will meet with all trained site visitors to summarize any changes based on feedback from the initial site visit. The discussion and interview guides along with the observation rubric are included as supporting documents.

State TAA Coordinator questionnaire. The state TAA Coordinator questionnaire will be tested with up to nine state TAA Coordinators drawn from the 52-state TAA Coordinator universe. Participants in the pre-test will not receive the questionnaire during data collection. The study team will use the pre-test data in the study analysis, following up with participants for responses to any revised items. The questionnaire pre-tests will be conducted by phone, with forms sent to participants to complete before the call. Feedback from the pre-tests will be used to clarify question text and improve the overall flow of the questionnaire.

B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The evaluation team is coordinating consultation on the research design and data needs. The process included convening an expert panel of workforce Navigators to ensure the research design, instruments, and findings are grounded in the experiences of people with direct experience delivering navigation services. This formative evaluation does not require any statistical methodology or estimation, so we did not seek consultations on statistical methodology outside of the evaluation team. Table B.3 lists the people who will oversee data collection and analysis for the Formative Study of TAA Navigators.

Table B.3. People who will oversee data collection and analysis for the Formative Study of TAA Navigators

| Organization | Individuals |
|---|---|
| Mathematica, P.O. Box 2393, Princeton, NJ 08543-2393, (609) 799-3535 | Dr. Jillian Berk, Project Director and Principal Investigator, (202) 264-3449; Ms. Jeanne Bellotti, Director, Employment Research, (609) 275-2243; Ms. Kristen Joyce, Deputy Project Director, (617) 715-6963; Ms. Alicia Harrington, Senior Survey Researcher, (609) 945-3350 |
| Social Policy Research Associates, 1330 Broadway, Suite 1426, Oakland, CA 94612, (510) 763-1499 | Ms. Kate Dunham, TAA Navigator Principal Investigator, (510) 788-2475 |



1 English, Brittany, Lindsay Ochoa, Andrew Krantz, Linda Rosenberg, Samantha Zelenack, Ellen Bart, Jeanne Bellotti, Skye Allmang, and Kate Dunham. “Creating and Expanding Regional Workforce Partnerships for Skilled H-1B Industries and Occupations: Implementation of America’s Promise Job-Driven Training Grants.” Report submitted to the U.S. Department of Labor. Washington, DC: Mathematica, 2022.

Sattar, Samina, Jacqueline Kauff, Daniel Kuehn, Veronica Sotelo Munoz, Amanda Reiter, and Kristin Wolff. “State Experiences Expanding Registered Apprenticeship: Findings from a Federal Grant Program.” Report submitted to the U.S. Department of Labor. Princeton, NJ: Mathematica, September 2020.

