OMB Control Number: 1290-0019

The National Guard Youth ChalleNGe Job ChalleNGe Evaluation

Office of Management and Budget SUPPORTING STATEMENT PART B

The Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) is funding three National Guard Youth ChalleNGe programs to expand the target population to include court-involved youth and add a five-month residential occupational training component called Job ChalleNGe. The goal of Youth ChalleNGe, a program that was established in the 1990s, is to build confidence and maturity, teach practical life skills, and help youth obtain a high school diploma or GED. The program has a quasi-military aspect in which, for about 20 weeks, participants live in a disciplined environment and participate in numerous activities to strengthen their skills in a wide range of areas.

The addition of the Job ChalleNGe component to the existing Youth ChalleNGe model has the potential to bolster the program’s effectiveness by adding intensive occupational training. Job ChalleNGe expands the residential time by five months for cadets who are interested in staying and are identified by staff as having the potential to benefit from the program. It offers the following activities: (1) occupational skills training, (2) individualized career and academic counseling, (3) work-based learning opportunities, and (4) leadership development activities. In addition, the program is engaging employers to ensure participants’ skills address employers’ needs.

The National Guard Youth ChalleNGe Job ChalleNGe Evaluation, sponsored by DOL’s Chief Evaluation Office (CEO), is designed to gain an understanding of the implementation of the DOL Job ChalleNGe grant and the experiences and outcomes of participants in the three grantee sites that were awarded Job ChalleNGe grants in 2015. The CEO has contracted with Mathematica Policy Research, in conjunction with its subcontractors MDRC and Social Policy Research Associates (SPR), to conduct this evaluation. OMB clearance (control number 1291-0008) was received on March 31, 2016, for four data collection instruments used as part of the evaluation: (1) baseline information form (BIF) for youth, (2) site visit master staff protocol, (3) site visit employer protocol, and (4) site visit youth focus group protocol.

This request seeks clearance for two types of follow-up data collection from study sample members who participated in the Job ChalleNGe program.

1. Respondent universe and sampling methods

Follow-up data collection

The evaluation design for the data collection request specified in this clearance package is an implementation study and an outcomes study of Job ChalleNGe participants. The universe and sample of Job ChalleNGe participants to be included in the data collection are youth in the three National Guard Youth ChalleNGe Job ChalleNGe grantee sites who (1) have consented to be in the study, (2) start Job ChalleNGe, and (3) are in cohorts 4, 5, and 6 of grant enrollment.1 Job ChalleNGe enrolls youth in cohorts shortly after they complete the Youth ChalleNGe program, which also works on a cohort-based model.

Based on OMB clearance provided for the evaluation’s baseline data collection (OMB control number 1291-0008), most Job ChalleNGe participants to be included in the follow-up data collection efforts will have already provided consent to participate in the evaluation. During enrollment of each cohort of Youth ChalleNGe, the evaluation team will request parental consent for the study (as described in the evaluation’s original OMB package). The evaluation team will check the participant list at the start of Job ChalleNGe and confirm that all youth and parents have been given an opportunity to consent to participate in the study; the consent process will be repeated for any youth and/or parents missed during Youth ChalleNGe. It is expected that about 90 percent of youth who begin Job ChalleNGe will have consented to participate in the study (Table B.1).

The follow-up data collection activities will be administered to consenting youth in cohorts 4, 5, and 6. These cohorts of grant enrollment for the Job ChalleNGe program are expected to start around July 2017, January 2018, and July 2018, respectively. Including cohorts 4, 5, and 6 allows the evaluation team to collect both text and follow-up survey data for a consistent sample of Job ChalleNGe participants.

Of the youth who consent to participate in the study, the evaluation team expects to achieve an 85 percent response rate on the two data collection efforts for which OMB approval is requested in this package: the monthly text message survey and the follow-up survey. The monthly text message survey will be administered to consenting youth in cohorts 4, 5, and 6 each month for eight months. For these three cohorts, about 351 youth are expected to complete each round of the monthly text message survey; the expected sample for each round would include 174 court-involved youth and 177 non-court-involved youth. Because the monthly text message survey will be administered in eight rounds, the evaluation team expects about 2,808 survey responses over the eight-month period. Similarly, a total of 414 youth (207 court-involved and 207 non-court-involved) are expected to complete the follow-up survey, which will be administered only once.

Table B.1. Estimated sample sizes

                                      Expected      Consent        Consent to receive study     Complete
                                      total         to the         texts and complete monthly   follow-up
Cohort                                enrollment    evaluation     text message survey each     survey
                                                                   month^a
Cohort 4
  Total starting Job ChalleNGe           180           162                  117                    138
  Court-involved                          90            81                   58                     69
  Non-court-involved                      90            81                   59                     69
Cohort 5
  Total starting Job ChalleNGe           180           162                  117                    138
  Court-involved                          90            81                   58                     69
  Non-court-involved                      90            81                   59                     69
Cohort 6
  Total starting Job ChalleNGe           180           162                  117                    138
  Court-involved                          90            81                   58                     69
  Non-court-involved                      90            81                   59                     69
Totals for cohorts 4–6
  Total starting Job ChalleNGe           540           486                  351                    414
  Total court-involved                   270           243                  174                    207
  Total non-court-involved               270           243                  177                    207

Notes: The cohort numbering refers to the progression of Job ChalleNGe cohorts of youth for whom services are funded through the Youth ChalleNGe Job ChalleNGe grants awarded to three Youth ChalleNGe sites. The first three cohorts, which began receiving services between January 2016 and January 2017, will not be included in the data collection efforts for which clearance is being requested. Cohorts 4–6 will be included; they are expected to begin receiving Job ChalleNGe services between approximately July 2017 and July 2018.

The numbers in the table are based on the following assumptions: (1) a consent rate of 90 percent to the study and a consent rate of 85 percent to be contacted via text, (2) 85 percent response rates for the monthly text message survey and the follow-up survey, and (3) a court-involvement rate of 50 percent.

a The monthly text message survey will involve eight rounds of data collection: once monthly for each of eight months. Thus, in total, the evaluation team expects about 2,808 survey responses (about 351 respondents per round times eight rounds). To address missing data, the evaluation team will limit the sample for the analysis of the monthly text survey data to respondents who completed the monthly text survey in at least five of the eight months of survey administration. The number of respondents who respond to the survey fewer than five times is assumed to be quite small, but excluding them will result in a somewhat smaller analysis sample for the monthly text survey than estimated above. In longitudinal surveys, respondent samples can become increasingly composed of those who are most willing to participate in each subsequent survey, though re-interview response rates for many longitudinal studies show little evidence of selective attrition over time.2 However, most information about declines in response rates over time is based on longitudinal studies with annual interviews, in which the length of time between interviews contributed to lower re-interview response rates; strategies such as incentive payments and communication with respondents between waves help maintain response rates.



The expected response rates (approximately 85 percent) for both the brief monthly text message survey and the follow-up survey are based on the evaluator’s prior experience conducting an evaluation of YouthBuild, a program serving a similar youth population, which obtained comparable response rates for a 12-month follow-up period.
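The expected sample sizes in Table B.1 follow directly from these assumptions. A minimal sketch of the arithmetic, rounding to whole youth within each cohort (the rates are those stated in the notes to Table B.1):

```python
# Reproduce the Table B.1 sample flow from its stated assumptions.
ENROLLMENT_PER_COHORT = 180   # expected Job ChalleNGe starts per cohort
STUDY_CONSENT = 0.90          # consent rate to the evaluation
TEXT_CONSENT = 0.85           # consent rate to be contacted via text
RESPONSE_RATE = 0.85          # expected response rate, both instruments
COHORTS = 3                   # cohorts 4, 5, and 6

consented = round(ENROLLMENT_PER_COHORT * STUDY_CONSENT)           # 162 per cohort
text_completes = round(consented * TEXT_CONSENT * RESPONSE_RATE)   # 117 per cohort
followup_completes = round(consented * RESPONSE_RATE)              # 138 per cohort

print(f"Consented per cohort:       {consented}")
print(f"Text completes per round:   {text_completes} per cohort -> {text_completes * COHORTS} total")
print(f"Follow-up completes:        {followup_completes} per cohort -> {followup_completes * COHORTS} total")
print(f"Text responses, 8 rounds:   {text_completes * COHORTS * 8}")  # about 2,808
```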

2. Procedures for the collection of information

The study will include all youth enrolled in Job ChalleNGe cohorts 4, 5, and 6 who consented to participate in the evaluation (approximately 486 youth). As mentioned above, this is a follow-up data collection for a sample previously enrolled in the study. Prior to the collection of baseline data, sample members (and their parents/guardians, when needed) were asked to provide consent to participate in the evaluation. (OMB already provided clearance for the evaluation’s baseline data collection under OMB control number 1291-0008.) The initial consent form secured assent from the youth (and consent from their parents or guardians, when needed) for the Job ChalleNGe study, including follow-up data collection. At each point of contact for the monthly text message survey and at the follow-up survey, sample members will be given the option to agree or decline to participate in the data collection effort.

a. Analysis methods for outcomes study

The analysis will summarize outcomes for Job ChalleNGe participants up to 19 months after the start of the program. Descriptive and statistical methods will be used to estimate mean outcomes of the program participants over time. (See the section below on “Estimating outcomes for the full sample” for a fuller discussion.) Outcomes will be measured as either binary (0/1) variables (for example, whether or not the youth received a high school diploma) or continuous variables (for example, earnings). Outcomes will be estimated not only for the full sample, but also for policy-relevant subgroups, such as court-involved youth. The analysis will be conducted using the SAS and Stata software programs.

Assessing baseline characteristics. To describe the characteristics of the Job ChalleNGe participants prior to participation in the program, mean baseline measures of the participants will be calculated using data from the BIFs. (The BIF data collection effort has already been approved by OMB; see control number 1291-0008.)

Estimating outcomes for the full sample. An outcomes study design focuses solely on the outcomes of participants in the program under study. The planned approach for the Job ChalleNGe outcomes study uses data collection from the monthly text message survey and the follow-up survey to generate descriptive means of the outcome measures for Job ChalleNGe participants.

Outcomes study analysis approach. The analysis sample includes youth who responded to the surveys. Imputation will be used to address item nonresponse among the covariates.3 The primary analytical method will estimate average outcomes for Job ChalleNGe participants pooled across the three grantee sites. The outcomes will be constructed from both survey and administrative data for the Job ChalleNGe outcomes study. Outcomes from the survey will include measures of (1) education success (such as receipt of any vocational certificates or credentials), (2) employment success (such as employment status, military enlistment, and having any fringe benefits), and (3) delinquency and criminal justice involvement (such as new arrests and convictions) (Table B.2).
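The subgroup analysis below refers to equation (1), which does not appear in this version of the package; a plausible form of the pooled estimation model, with notation that is assumed here rather than drawn from the source, is

$$ y_i = \alpha + X_i'\beta + \varepsilon_i \qquad (1) $$

where $y_i$ is the outcome for Job ChalleNGe participant $i$, $X_i$ is a vector of baseline characteristics from the BIF, and $\varepsilon_i$ is a residual. The regression-adjusted mean outcome is then recovered as $\hat{\alpha} + \bar{X}'\hat{\beta}$, the model’s prediction at the sample mean of the covariates.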

Table B.2. Outcome measures from survey data collection

Category                 Measure                                          Mode
Employment               Ever employed since the end of Job ChalleNGe     Text survey, Follow-up survey
                         Average hours worked per week                    Text survey, Follow-up survey
                         Enlisted in military                             Text survey, Follow-up survey
                         Currently have more than one job                 Follow-up survey
                         Time to first job after Job ChalleNGe            Follow-up survey
                         Length of time in current job                    Follow-up survey
                         Employed in training field                       Follow-up survey
Earnings                 Weekly earnings                                  Text survey, Follow-up survey
                         Receive benefits through job                     Follow-up survey
Education                Current enrollment in classes                    Text survey, Follow-up survey
                         Completed education during Job ChalleNGe         Follow-up survey
                         Completed job training                           Follow-up survey
                         Received credential                              Follow-up survey
                         Current enrollment in job training               Follow-up survey
Justice system contact   New arrest                                       Follow-up survey
                         Arrest by type (drug, property, violent)         Follow-up survey
                         Convicted of or found delinquent for a crime     Follow-up survey



Analysis of patterns in outcomes over time. The monthly text survey data will also allow for the exploration of patterns of engagement in employment, education, and other prosocial activities following participation in Job ChalleNGe, using panel data methods. To enhance the interpretation of estimated changes in activities and outcomes over time, the analysis of the text survey data will focus on Job ChalleNGe participants who responded to at least five of the eight surveys. The evaluation team will also explore imputing values for observations that are missing throughout the data collection period. The goal of limiting the analysis sample to Job ChalleNGe participants who responded at least five times is to preserve as large a sample as possible without unduly influencing the analysis through imputation of values for missing observations.

Descriptive analyses will be used to understand how Job ChalleNGe participants fare in the months following the end of the program, including the stability or instability of their employment experiences. Using the panel data of observations over time, the evaluation team will present the percentage of participants who are employed at each point in time, and describe the average time to the first job, persistence in employment, and the degree of churn in employment status over the period.
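A minimal sketch of how the panel restriction and the over-time employment description could be computed, assuming a long-format response file; the file name and column names here are hypothetical placeholders:

```python
import pandas as pd

# Long-format text-survey responses: one row per youth per completed round.
df = pd.read_csv("text_survey_responses.csv")  # columns: youth_id, month (1-8), employed (0/1)

# Restrict the analysis sample to youth who responded in at least 5 of the 8 rounds.
rounds_answered = df.groupby("youth_id")["month"].nunique()
analysis_ids = rounds_answered[rounds_answered >= 5].index
panel = df[df["youth_id"].isin(analysis_ids)].sort_values(["youth_id", "month"])

# Percentage of participants employed at each point in time.
employment_rate = panel.groupby("month")["employed"].mean().mul(100)

# A simple churn measure: number of employment-status changes per youth
# across the months that youth was observed.
status_changes = panel.groupby("youth_id")["employed"].apply(
    lambda s: s.diff().abs().sum()
)

print(employment_rate.round(1))
print(f"Mean employment-status changes per youth: {status_changes.mean():.2f}")
```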

Estimating outcomes for subgroups. To understand how outcomes for Job ChalleNGe participants vary by youth characteristics, subgroup outcomes will be estimated using a “split-sample” approach in which the full sample is divided into two subgroups. Outcomes for subgroups will be estimated using a straightforward modification to equation (1), stratifying the model by subgroup status and using F-tests to assess whether differences in outcomes across subgroup levels are statistically significant (a sketch follows below). Although the evaluation team will work with CEO to determine the types of subgroups for which a subgroup analysis should be conducted, potential subgroups to be analyzed include court-involved youth versus non-court-involved youth, as well as subgroups defined by age and race/ethnicity, given that the National Job Corps Study found different impacts for subgroups defined by these demographic characteristics.4 Other potentially important subgroups could be defined by the youth’s level of economic disadvantage and the youth’s program, with the caveat that outcomes for small subgroups are likely to be imprecisely estimated.
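A minimal sketch of the subgroup comparison, assuming the statsmodels package and hypothetical file and variable names; the F-test jointly tests whether mean outcomes differ across subgroup levels:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per follow-up survey respondent,
# with an outcome (employed, 0/1) and a subgroup indicator.
df = pd.read_csv("followup_analysis.csv")  # columns: employed, court_involved, ...

# Outcome model stratified by subgroup status; baseline covariates
# from the BIF could be added to the right-hand side.
model = smf.ols("employed ~ C(court_involved)", data=df).fit()

# F-test of whether mean outcomes differ across subgroup levels.
anova_table = sm.stats.anova_lm(model, typ=2)
print(model.params)
print(anova_table)
```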

b. Precision calculations of the outcome estimates

The study sample will include data from a subset of Job ChalleNGe cohorts. Therefore, the estimates will not be generalizable to other cohorts of Job ChalleNGe participants. However, the information the study provides on average outcomes for Job ChalleNGe participants, and in particular for the subsample of court-involved youth, will be informative to policymakers as they seek to describe the outcomes of the new program and consider expanding it to other Youth ChalleNGe locations.

Table B.3. Sample sizes and precision estimates

                                                        Precision for an outcome measured as a
                                                        percentage and with a mean of:
Survey sample                             Sample size   25 or 75 percent   33 or 67 percent   50 percent
Monthly text message survey (cohorts 4–6)
  Responses to text message survey            351             4.5%               4.9%             5.2%
  Subpopulation of court-involved youth       174             7.4%               8.6%            10.5%
Follow-up survey (cohorts 4–6)
  Responses to follow-up survey               413             4.2%               4.5%             4.8%
  Subpopulation of court-involved youth       207             5.9%               6.4%             6.8%

Notes: The cohort numbering refers to the progression of Job ChalleNGe cohorts of youth for whom services are funded through the Youth ChalleNGe Job ChalleNGe grants awarded to three Youth ChalleNGe sites. The first three cohorts, which began receiving services between January 2016 and January 2017, will not be included in the data collection efforts for which clearance is being requested. Cohorts 4–6 will be included; they are expected to begin receiving Job ChalleNGe services between approximately July 2017 and July 2018.

The sample sizes are based on the following assumptions: (1) consent rate of 90 percent to the evaluation and consent rate of 85 percent to be contacted via text, (2) 85 percent response rates for the monthly text survey and the follow-up survey, and (3) a court-involvement rate of 50 percent. For the monthly text survey, we will restrict the sample to respondents who answer the survey for at least five of the eight months; therefore, the actual response rate may differ from the 85 percent used in the calculations.


As shown in Table B.3, precision will vary by survey instrument and by the size of the analysis sample. For instance, for a monthly text survey including three cohorts, a binary outcome that has a prevalence of 50 percent (that is, each of its two possible values is true for 50 percent of the population) has a precision of plus or minus 5.2 percentage points. The estimates become somewhat more precise as the prevalence of the outcome moves toward 100 percent or zero percent (for example, a 25 or 75 percent prevalence). The estimates for the subpopulation of court-involved youth are less precise than those for the full sample due to the smaller sample size. For example, a binary outcome that has a prevalence of 50 percent has a precision of plus or minus 10.5 percentage points for the subpopulation of court-involved youth in the monthly text survey. The follow-up survey, based on a slightly larger sample size, offers greater precision: a binary outcome with a prevalence of 50 percent has a precision of plus or minus 4.8 percentage points for the full sample and 6.8 percentage points for the court-involved subsample.
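“Precision” here is the half-width of a 95 percent confidence interval around an estimated percentage. A minimal sketch of the calculation, which reproduces the full-sample entries in Table B.3 (assuming simple random sampling with no design effect):

```python
from math import sqrt

Z_95 = 1.96  # critical value for a 95 percent confidence interval

def half_width(p: float, n: int) -> float:
    """95% CI half-width, in percentage points, for a proportion p with n respondents."""
    return 100 * Z_95 * sqrt(p * (1 - p) / n)

# Full text-survey sample (n = 351), as in Table B.3.
for p in (0.25, 0.33, 0.50):
    print(f"p = {p:.2f}: +/- {half_width(p, 351):.1f} percentage points")
# p = 0.25: +/- 4.5 ; p = 0.33: +/- 4.9 ; p = 0.50: +/- 5.2
```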

3. Methods to maximize response rates and deal with nonresponse

This clearance package requests approval for two types of data collection. The first is a monthly text message survey, which will start 8 months after Job ChalleNGe participants begin receiving services in the program (and 2 months after they could complete the program). The second is a follow-up survey, which will be fielded 16 months after Job ChalleNGe participants begin the program (and 11 months after they could complete the program). This section discusses the methods to maximize response rates and data reliability, first for the monthly text message survey and then for the follow-up survey.

a. Monthly text message survey

Response rates. The evaluation team will use the following procedures to maximize response rates for the monthly text message survey. Consent to participate in the study, as well as willingness to be contacted via text, is collected during the baseline data collection process as part of the consent form and BIF. Two weeks prior to the start of the text message survey, an advance letter will be sent to all Job ChalleNGe participants who provided study consent. The advance letter will explain the process, describe the types of questions to be asked, and reinforce the understanding of the privacy of the respondents’ information. A series of questions will be administered via text messaging, starting eight months after the start of the Job ChalleNGe program (and two months after the end of it). Upon completion of each short series of texts, respondents will receive $3 via an Amazon code.

Data reliability. The text message survey is unique to the current evaluation and will be used across all three Job ChalleNGe grantees on a monthly basis, ensuring consistency in the collected data. The questions have been carefully developed by evaluation staff and extensively reviewed by both evaluation staff and staff at DOL. In addition, they will be thoroughly tested in a pre-test. These efforts will allow the evaluation team to document respondent burden time and are intended to ensure that study participants understand and interpret the questions in a way that will yield reliable information about the topics of interest for the study. The monthly text surveys will be administered by Signal Vine, an organization that specializes in text-based data collection. The evaluation team will determine the schedule for the text messages to ensure that the surveys are administered at regular intervals for all sample members.

b. Follow-up survey

Response rates. The evaluation team will use the following procedures to maximize response rates on the follow-up survey. Consent to participate in the study is collected during the baseline data collection process as part of the initial consent form, which is completed by youth prior to the start of the Youth ChalleNGe program. Two weeks prior to the start of the follow-up survey data collection, an advance letter will be sent to all Job ChalleNGe participants who agreed to be part of the study. Similar to the advance letter sent prior to the start of the monthly text message survey, the advance letter for the follow-up survey will explain the process, describe the types of questions to be asked, and reinforce the understanding of the privacy of the respondents’ information. Youth will receive a link to the web-based survey via text (if a cell phone number was provided on the BIF and permission to text was given) and email (if an email address was provided on the BIF). During the 16-week data collection period, Job ChalleNGe participants will receive email reminders and at least one mail reminder. Those who complete the survey in the first 6 weeks of data collection will receive $30; those who complete it in the remaining 10 weeks will receive $20. Payment will be sent through an Amazon code.

Data reliability. The follow-up survey is unique to the current evaluation and will be used across all three Job ChalleNGe grantees 16 months after the start of Job ChalleNGe, ensuring consistency in the collected data. As with the monthly text message survey questions, the follow-up survey questions were carefully developed by the evaluation team and then extensively reviewed by evaluation staff and staff at DOL. They also will be thoroughly tested in a pre-test. Several of the survey questions are based on questions with response scales that have been used in previous surveys of similar populations, such as the YouthBuild evaluation. The web-based follow-up survey will be administered by the evaluation team using ConfirmIt data collection software.

4. Tests of procedures or methods

The evaluation team will pre-test the brief monthly text message survey and the follow-up survey with up to nine youth participants from the third cohort of the Job ChalleNGe program. (This cohort will not be included in the data collection efforts for which this package is seeking OMB approval.) Using information from the previously collected BIFs, which were covered by the OMB clearance for the baseline data collection (OMB control number 1291-0008), the evaluation team will attempt to recruit pre-test participants who cover a range of characteristics, including level of court involvement, gender, and age. Efforts will be made to use the same pre-test participants for both instruments.

The evaluation team will mail a hardcopy version of the follow-up survey, along with debriefing forms for both the follow-up survey and the monthly text message survey. Youth will be asked to send a text message to a member of the evaluation team on a project-specific phone upon receipt of the package. The text message survey pre-test will begin once the confirmation text is received. Once youth complete the monthly text survey questions, they will be sent a thank-you text with a $10 Amazon code in appreciation for their participation in the text survey pre-test. The thank-you text will be followed by a text reminding them to complete and return the text debriefing form as well as the completed follow-up survey and corresponding debriefing form. Once the completed follow-up survey and debriefing forms are received, the youth will be sent a $50 Amazon code via text.

Upon receipt of the completed follow-up survey and both debriefing forms, a member of the evaluation team will conduct a phone debriefing with the participant using a standard debriefing protocol to determine whether any words or questions are difficult to understand or answer. The instruments will be revised to incorporate the lessons learned from the pre-tests.

5. Individuals consulted on statistical methods

Consultations on the statistical methods used in this study were conducted to ensure its technical soundness. The following individuals were consulted on statistical aspects of the design and will also be primarily responsible for collecting and analyzing the data for the agency:

Mathematica Policy Research

  • Ms. Jeanne Bellotti (609) 275-2243

  • Dr. Jillian Berk (202) 264-3449

  • Dr. Johanna Lacoe (510) 285-4618

  • Dr. Karen Needels (609) 750-4043

Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:

  • Dr. Jillian Berk (202) 264-3449

  • Ms. Jessica Lohmann (202) 693-5087

1 As a condition of receiving the grant, grantees are required to participate in the evaluation.

2 Schoeni, R.F., F. Stafford, K.A. McGonagle, and P. Andreski. “Response Rates in National Panel Surveys.” The ANNALS of the American Academy of Political and Social Science, vol. 645, no. 1, 2013, pp. 60–87.

3 The evaluation team anticipates achieving an 85 percent response rate for the monthly text survey and follow-up survey data collection. Given this, the team does not anticipate the need for a nonresponse bias analysis or for the use of survey nonresponse weights. However, should the response rate be less than 80 percent, a nonresponse bias analysis will be conducted and weights that adjust for nonresponse will be used.

4 Schochet, P., J. Burghardt, and S. McConnell. “Does Job Corps Work? Impact Findings from the National Job Corps Study.” American Economic Review, vol. 98, no. 5, 2008, pp. 1864–1886.

