
PART B: JUSTIFICATION for Evaluation of Transition Assistance Program Employment Navigator and Partnership Pilot

OMB No. 1290-0NEW

August 2021

PART B: DATA COLLECTION ACTIVITIES

The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) has commissioned an evaluation of the Transition Assistance Program Employment Navigator and Partnership Pilot (TAP ENPP, or ENPP). By establishing the ENPP, DOL aims to provide individualized career counseling and guidance to transitioning service members (TSMs) and military spouses. Under the ENPP model, Employment Navigators will provide individualized career services including self-assessment, interest and aptitude testing, career exploration, and detailed labor market information (referred to as the Assist-Explore-Plan [AEP] model), as well as warm handovers and connections to governmental and nongovernmental partners for additional services. ICF has been contracted by the CEO to conduct a formative evaluation of this 12-month pilot program (April 1, 2021 to March 31, 2022). This evaluation examines how the ENPP was implemented, describes the extent to which it was implemented as planned, and explores variation in implementation across the 13 pilot sites. This document describes the proposed data collection process that will inform the formative evaluation.



B.1. Respondent universe and sampling

The universe of sites for this evaluation is the 13 United States military installations where DOL is piloting Employment Navigator services. The formative evaluation includes interviews and focus groups with pilot stakeholders associated with specific sites: Program Employment Navigator staff, TAP managers, and participants (TSMs and spouses). It also includes focus groups with program partners who are not tied to any individual site but rather serve veterans and spouses coming from any military installation. These program partners include both governmental partners (specifically, American Job Centers [AJCs] located across the United States) and nongovernmental partners (national veterans’ services organizations). The interviews and focus groups for this formative study are not designed to produce statistically generalizable findings, and participation is at the respondent’s discretion. Response rates will not be calculated or reported.



Table B.1. Sampling and response rate assumptions, by respondent type, for site-based respondents (over 2 years of study)

Type of respondent | Sampling method | Number of sites | Estimated universe across all sites | Expected sample (per site)¹ | Estimated response rate (percent) | Estimated responses (across sites)
Pilot site TAP managers | Purposeful | 13 | 13 | 1-2 | 80 | 30
Military spouses | Purposeful | 13 | 315² | 2-4 | 50 | 42
TSM (post-Navigator) participants | Purposeful | 13 | 2,170³ | 2-5 | 50 | 58
TSM (post-partner) participants | Purposeful | 13 | 2,170³ | 2-4 | 50 | 42
Employment Navigators | Purposeful | 13 | 54 | 2-3 | 90 | 34



Table B.2. Sampling and response rate assumptions, by respondent type, for non-site-based (national) respondents (over 2 years of study)

Type of respondent | Sampling method | Number of sites | Estimated universe | Expected sample | Estimated response rate (percent) | Estimated responses⁴
Program partners – non-governmental | Purposeful | n/a | 20⁵ | 35 | 70 | 24
Program partners – governmental | Purposeful | n/a | 2,400⁶ | 35 | 70 | 24



1. Site selection

All 13 pilot sites will be included in data collection for this study, and all 13 will be involved in a similar level of data collection activities: TAP manager focus groups, participant focus groups, and Employment Navigator focus groups (all virtual).

2. Focus group participant selection

The number of participants in each focus group will vary by respondent type, and we will aim for four to eight participants per group. Because virtual focus groups have higher rates of “no-shows” than in-person focus groups,7 we will overrecruit by 20 to 50 percent above the target group size to ensure an adequate focus group sample, as recommended in the research literature.8 Where there are more individuals of a respondent type than we can include in a focus group, we will use purposive selection to create groups that reflect a variety of viewpoints, setting quotas for participant types to match the general characteristics of the population. No statistical methods will be used in selecting interviewees and focus group participants.
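As a rough illustration of the overrecruitment arithmetic, the sketch below computes the number of invitations implied by a target group size and an overrecruitment rate. The specific group sizes and rates are drawn from the ranges above for illustration only and are not a fixed ENPP specification.

```python
import math

def invitations_needed(target_group_size: int, overrecruit_rate: float) -> int:
    """Number of invitations to issue so that, after expected no-shows,
    roughly target_group_size participants attend. overrecruit_rate is the
    fraction of extra invitees (0.20-0.50 per the cited literature)."""
    return math.ceil(target_group_size * (1 + overrecruit_rate))

# Illustrative values only: group sizes of 4-8 and 20-50 percent overrecruitment.
for size in (4, 8):
    for rate in (0.20, 0.50):
        print(f"target {size}, overrecruit {rate:.0%}: invite {invitations_needed(size, rate)}")
```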

Program Employment Navigators. For the Employment Navigator focus groups, we will invite one Navigator per site (the one with the longest tenure), with seven to eight participants per group. Groups will mix Employment Navigators representing sites of different sizes, military branches, and locations (e.g., CONUS versus OCONUS) to draw out different perspectives. We will follow a similar process for the selection of TAP managers if there is more than one TAP manager at a site. We anticipate a 90 percent response rate for the Employment Navigator participants because they are professionals employed to work with TSMs, a rate consistent with the research literature on professional respondents.9

TAP Manager. We anticipate that there will be only one TAP Manager at each pilot site.

Program partners. We will invite the partner organizations that have received the highest volume of ENPP connections. The staff member respondent should be someone involved with the project at either the management or the service delivery level. Focusing on these partners allows us to capture the perspectives of partners who have served a higher volume of participants and thus seen a wider variety of cases. We anticipate a 70 percent response rate from program partners due to their existing relationships with DOL or the Veterans’ Employment and Training Service (VETS). Prior research has shown that stakeholders can be helpful for recruitment.10 However, we acknowledge that response rates in the literature on service providers can also be low.11

Military spouse participants. We will use randomized (blind) selection among all eligible individuals at each site (military spouses who have had at least one meeting with an Employment Navigator). We will request that DOL send ICF non-re-identifiable ID numbers for each eligible individual. ICF will select two individuals to invite (and two alternates) per site, using a random numbers table. Where there are more eligible individuals than we can include in a focus group, we will use purposive selection to create groups that reflect a variety of viewpoints, setting quotas for participant types to match the general characteristics of the population (e.g., gender, military paygrade, length of military service). We anticipate a 50 percent response rate for the military spouse participants, a conservative estimate that is consistent with the research literature.12

TSM (post-Navigator) participants. We will use randomized (blind) selection among all eligible individuals at each site (enlisted TSMs who have had at least one meeting with an Employment Navigator). We will request that DOL send ICF non-re-identifiable ID numbers for each eligible individual. ICF will select two individuals to invite (and two alternates) per site, using a random numbers table. Where there are more eligible individuals than we can include in a focus group, we will use purposive selection to create groups that reflect a variety of viewpoints, setting quotas for participant types to match the general characteristics of the population (e.g., gender, military paygrade, length of military service). We anticipate a 50 percent response rate for the TSM participants, a conservative estimate, since the research literature indicates difficulty with recruitment when participation is not directed by the commanding officer.13,14

TSM (post-Partner) participants. We will use randomized (blind) selection among all eligible individuals at each site (enlisted TSMs who have had at least one meeting with an ENPP partner organization). We will request that DOL send ICF non-re-identifiable ID numbers for each eligible individual. ICF will select two individuals to invite (and two alternates) per site, using a random numbers table. Where there are more eligible individuals than we can include in a focus group, we will use purposive selection to create groups that reflect a variety of viewpoints, setting quotas for participant types to match the general characteristics of the population (e.g., gender, military paygrade, length of military service). We anticipate a 50 percent response rate for the TSM participants, a conservative estimate, since the research literature indicates difficulty with recruitment when participation is not directed by the commanding officer.15,16 The selection step shared by these three participant groups is sketched below.
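As a minimal sketch of that shared selection step, the example below assumes the non-re-identifiable ID numbers arrive as a simple list per site; a seeded pseudorandom shuffle stands in for the random numbers table named above, and the purposive, quota-based adjustments described in the text are not shown. All site names and ID values are hypothetical.

```python
import random

def select_invitees(site_ids, n_invite=2, n_alternate=2, seed=2021):
    """For each site, randomly order the eligible de-identified IDs and
    split the first draws into invitees and alternates."""
    rng = random.Random(seed)  # fixed seed so the draw can be documented and reproduced
    selections = {}
    for site, ids in site_ids.items():
        shuffled = list(ids)
        rng.shuffle(shuffled)
        selections[site] = {
            "invitees": shuffled[:n_invite],
            "alternates": shuffled[n_invite:n_invite + n_alternate],
        }
    return selections

# Hypothetical de-identified IDs for two sites, for illustration only.
eligible = {
    "Site A": ["A-01", "A-02", "A-03", "A-04", "A-05"],
    "Site B": ["B-01", "B-02", "B-03", "B-04"],
}
print(select_invitees(eligible))
```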

B.2. Procedures for the collection of information

1. Data collection procedures

To supplement the information found in the program documents and ENPP performance data, ICF will collect information from key program stakeholders through focus groups between October 2021 (following PRA clearance of the study) and March 2022. By this point, the second half of the ENPP pilot, operations should have stabilized and any variation among sites should be apparent.

We will use structured focus group protocols, allowing data collection staff to home in on domains of interest while also following the course of the conversation. Prior to data collection, we will conduct limited cognitive interviewing on the focus group protocols to establish face validity and support valid and reliable instrumentation.

Due to the COVID-19 pandemic, onsite data collection will likely not be feasible during the study period. As a result, ICF has structured all data collection activities for virtual administration. ICF staff have extensive experience working in an online environment and use Microsoft Teams, the platform we will use for virtual data collection, daily for internal and external meetings. Furthermore, project staff are expert facilitators who are skilled in applying a culturally competent approach (including military culture) to the facilitation of dialogue, which ensures that participants are engaged and trust that the information they provide will be safeguarded, even in a virtual environment.

2. Statistical methodology, estimation, and degree of accuracy

Due to the qualitative and narrative nature of the data collected through the focus groups, no statistical methodology or estimation will be needed in the analysis of focus group data. As mentioned above, for focus group participant selection, we will use a random numbers table to identify invitees among the eligible universe at each site. In the instances where there are more eligible individuals than we can include in a focus group, we will use purposive selection to create groups that have a variety of viewpoints and set quotas for participant types to match the general characteristics of ENPP participants as a whole (e.g., gender, military paygrade, length of military service).

Prior to each focus group, we will review demographic, military service, and ENPP participation information about the individual participants provided by DOL. In the event of missing data, we may request additional information from participants via email; responding should take less than five minutes. If the information is not received from the participant, the data will be identified as missing in the demographic and background characteristics tables in the final report. We will not impute information or delete cases with missing data. We will present aggregated information about the respondents (by respondent group) in the final report. Because this formative study examines pilot implementation and does not require a representative sample of ENPP participants, the interviews and focus groups will not be presented as statistically generalizable, and response rates will not be calculated or reported.
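As a minimal sketch of how a background characteristic could be aggregated by respondent group with missing values flagged rather than imputed, the example below uses hypothetical paygrade values; the field and values are assumptions for illustration only and do not reflect actual ENPP data.

```python
from collections import Counter

def summarize_characteristic(values):
    """Aggregate one background characteristic for a respondent group.
    Missing values are counted as 'missing' rather than imputed or dropped."""
    return Counter("missing" if v is None else v for v in values)

# Hypothetical paygrades for one respondent group, for illustration only.
paygrades = ["E-4", "E-5", None, "E-6", "E-5", None]
print(summarize_characteristic(paygrades))
# Counter({'E-5': 2, 'missing': 2, 'E-4': 1, 'E-6': 1})
```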

Driven by the research questions and themes from the data summarization meetings, ICF will construct a framework and codebook for the qualitative data analysis. This inductive framework will allow ICF to ensure that data accurately depicts the underlying themes of the study indicated in the program’s logic model,17 as well as to code and capture unexpected implementation successes and challenges. As themes become definable throughout the data collection process, ICF will conduct interpretive analyses that test the proposed research questions and look closely at the relationships among the elements of the ENPP.18

3. Unusual problems requiring specialized sampling procedures

No unusual problems are anticipated.

4. Use of periodic (less frequent than annual) data collection cycles

Interviews and focus groups will be conducted with the respondent groups once during the time period of the pilot program.

B.3. Methods to maximize response rates and data reliability

We will use the following strategies to maximize response rates:

  1. We will provide template language so that ENPP staff, as trusted messengers, can introduce ICF and the study to potential respondents. (The language will make clear that participation is voluntary and that respondents will not lose any benefits or services if they decline to participate.)

  2. When introducing the focus group opportunity to participants, we will explain the goals of the study, how the findings will be used (for internal DOL use only, not for publication), the precautions taken to protect respondent privacy, and the consent form. We will provide a phone number and email address for any questions respondents might have.

  3. For TSMs and spouses, we will offer a $25 gift card as compensation for their time and effort. This is a nominal amount that is not large enough to be considered coercive for participants. It will serve to offset the costs of participation (e.g., child care) and acknowledge that participants’ time is valuable.

  4. We will be flexible in scheduling the focus groups. We will use group scheduling software, such as Doodle, to identify a time that works for everyone. We will offer time zone-appropriate options for scheduling, and evening options if requested.

  5. We will provide clear step-by-step instructions for joining the focus group platform and have staff on hand to address technical challenges.

With these strategies in place, ICF conservatively anticipates a 50 percent response rate for TSM and military spouse focus group participants, given the virtual nature of the focus groups.19 We anticipate a 90 percent response rate for the Employment Navigator participants because they are professionals employed to work with TSMs, a rate consistent with the research literature on professional respondents.20

We will use the following strategies to protect data validity:

  1. Employ experienced focus group facilitators and note takers who have completed project-specific training on virtual focus group facilitation, data collection with military stakeholders, and the purpose and goals of this study. The training includes a review of the protocols, consent forms, and note-taking tools.

  2. Use structured focus group protocols and standardized note-taking tools.

  3. Video record all focus groups as a backup to written notes and as a source for confirming speakers, quotes, and the context of responses.

  4. Have focus group facilitators review notes as they are completed to check for accuracy and completeness and provide feedback to note takers as appropriate.

B.4. Tests of procedures or methods

All protocols for this study have been reviewed by content and methodological experts to ensure clarity, accuracy, and optimal ordering of the questions. In addition, ICF will use the protocols for small-scale data collection with no more than nine respondents early in the pilot, which will serve to validate question clarity, protocol timing, feasibility of the online meeting platform for data collection, and note-taking procedures.

B.5. Individuals consulted on statistical methods

The following individuals have been consulted on the use of statistical methods for the study design:

Peter Mueser, PhD

Professor, Department of Economics and Truman School of Public Affairs

University of Missouri

Columbia, MO 65211


Meredith Kleykamp, PhD

Associate Professor and Associate Chair, Department of Sociology

Director, Center for Research on Military Organization

University of Maryland

College Park, MD 20742



The following individuals consulted on the use of statistical methods for the study design and will also be primarily responsible for actually collecting and analyzing the data for the agency:

ICF

Dr. Rosemarie O’Conner (703) 251-0361

Ms. Emily Appel-Newby (703) 225-2409

Dr. Shelley Osborn (714) 357-5667




1 The same TAP manager may be contacted twice over two years because it is anticipated to be the same person in the role both years; the other respondent groups will be recruited uniquely each year. We also anticipate a need and opportunity to do follow-up data collection for clarification with a small number of each respondent type.

2 DOL reported 14-15 new spouse cases in May and June 2021. To be conservative, these estimates assume that the rate of new enrollments doubles for the remaining 9 months of the pilot (90 per quarter), reaching a total of 315.

3 DOL reported 620 total cases as of June 30, 2021 (3 months into the pilot), half of which have had a verified connection/warm handover to a partner. To be conservative, these estimates assume that the rate of new enrollments doubles for the remaining 9 months of the pilot (1,240 per quarter), reaching a total of 4,340.

4 Program partner staff may be contacted twice over two years because it is anticipated to be the same person in the role both years.

5 To date VETS has engaged nine nongovernmental partners and anticipates engaging more. We use 20 for estimation purposes.

6 According to DOL CareerOneStop, there are approximately 2,400 AJCs. Potentially all AJCs could receive referrals.

7 Daniels, N., Gillen, P., Casson, K., & Wilson, I. (2019). STEER: Factors to consider when designing online focus groups using audiovisual technology in health research. International Journal of Qualitative Methods, 18, 1-11. https://doi.org/10.1177/1609406919885786

8 Onwuegbuzie, A. J., Dickinson, W. B., Leech, N. L., & Zoran, A. G. (2009). A qualitative framework for collecting and analyzing data in focus group research. International Journal of Qualitative Methods, 8(3), 1-21.

9 Snape, S. (2021). The value of Conceptual Encounter methodology in exploring women’s experience of identity work in career choices and transitions. International Journal of Evidence Based Coaching and Mentoring, 15, 270-282. https://doi.org/10.24384/9z87-qr07

10 Bonisteel, I., Shulman, R., Newhook, L. A., Guttmann, A., Smith, S., & Chafe, R. (2021). Reconceptualizing recruitment in qualitative research. International Journal of Qualitative Methods, 1–12. https://doi.org/10.1177/16094069211042493

11 Wolf, M. R., Eliseo-Arras, R. K., Brenner, M., & Nochajski, T. H. (2017). “This will help your children”: Service providers’ experiences with military families during cycles of deployment. Journal of Family Social Work, 20(1), 26–40.

12 Mailey, E., Mershon, C., Joyce, J., & Irwin, B. (2018). “Everything else comes first”: a mixed-methods analysis of barriers to health behaviors among military spouses. BMC Public Health, 18(1), 1–11.

13 Kay, S. S., Lagana-Riordan, C., Pecko, J., Bender, A. A., & Millikan, A. M. (2015). Conducting focus groups with military populations: Lessons learned from the field. Journal of Ethnographic & Qualitative Research, 9(3), 209–220.

14 Besse, K., Toomey, T. L., Hunt, S., Lenk, K. M., Widome, R., & Nelson, T. F. (2018). How soldiers perceive the drinking environment in communities near military installations. Journal of Alcohol and Drug Education, 62(1), 71–90.

15 Kay, S. S., Lagana-Riordan, C., Pecko, J., Bender, A. A., & Millikan, A. M. (2015). Conducting focus groups with military populations: Lessons learned from the field. Journal of Ethnographic & Qualitative Research, 9(3), 209–220.

16 Besse, K., Toomey, T. L., Hunt, S., Lenk, K. M., Widome, R., & Nelson, T. F. (2018). How soldiers perceive the drinking environment in communities near military installations. Journal of Alcohol and Drug Education, 62(1), 71–90.

17 LeCompte, M., & Schensul, J. (2010). Designing and conducting ethnographic research (2nd ed.). Lanham, MD: Alta Mira Press.

18 LeCompte, M., & Schensul, J. (2010). Designing and conducting ethnographic research (2nd ed.). Lanham, MD: Alta Mira Press.

19 Daniels, N., Gillen, P., Casson, K., & Wilson, I. (2019). STEER: Factors to consider when designing online focus groups using audiovisual technology in health research. International Journal of Qualitative Methods, 18, 1-11. https://doi.org/10.1177/1609406919885786

20 Snape, S. (2021). The value of Conceptual Encounter methodology in exploring women’s experience of identity work in career choices and transitions. International Journal of Evidence Based Coaching and Mentoring, 15, 270-282. https://doi.org/10.24384/9z87-qr07
