
National Guard Youth ChalleNGe Job ChalleNGe Evaluation

OMB: 1291-0008


The National Guard Youth ChalleNGe Job ChalleNGe Evaluation

OMB SUPPORTING STATEMENT PART A

The Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) is funding three National Guard Youth ChalleNGe programs to expand the program’s target population to include court-involved youth and add a five-month residential occupational training component called Job ChalleNGe.

The goal of Youth ChalleNGe is to build confidence and maturity, teach practical life skills, and help youth obtain a high school diploma or GED. The program’s numerous activities all address its eight core pillars: leadership/followership, responsible citizenship, service to community, life-coping skills, physical fitness, health and hygiene, job skills, and academic excellence. It has a quasi-military aspect in which participants, known as “Cadets,” live in barracks-style housing in a disciplined environment for about 20 weeks—the residential phase. Cadets wear their hair short and dress in military uniforms. Upon completing the residential phase of the program, participants receive one year of structured mentoring designed to help them successfully transition back to their communities.

The addition of the Job ChalleNGe component to the existing Youth ChalleNGe model has the potential to bolster the program’s effectiveness by incorporating occupational training. Job ChalleNGe will extend the residential period by five months for Cadets who are interested in staying and are randomly selected to do so, and will offer the following activities: (1) occupational skills training, (2) individualized career and academic counseling, (3) work-based learning opportunities, and (4) leadership development activities. In addition, the program will engage employers to ensure Cadets’ skills address employers’ needs.

The National Guard Youth ChalleNGe Job ChalleNGe Evaluation, sponsored by DOL’s Chief Evaluation Office (CEO), will use (1) a set of interviews with staff and youth to learn how these program enhancements are implemented, and (2) a randomized controlled trial (RCT) to measure the effectiveness of Job ChalleNGe. The RCT will compare Youth ChalleNGe graduates who attend Job ChalleNGe to graduates who were on track for Job ChalleNGe but were not randomly selected to attend. Expanding eligibility for Youth ChalleNGe and Job ChalleNGe to court-involved youth, who for the most part are not currently eligible to participate in Youth ChalleNGe, could make a difference in the lives of those youth who can be the hardest to serve. The evaluation will take place at sites awarded Job ChalleNGe grants in 2015: Fort Stewart, Georgia; Battle Creek, Michigan; and Aiken, South Carolina.



The CEO has contracted with Mathematica Policy Research, in conjunction with its subcontractors MDRC and Social Policy Research Associates (SPR), to conduct this evaluation. With this package, clearance is requested for four data collection instruments related to the impact and implementation studies to be conducted as part of the evaluation:

  1. Baseline information form (BIF) for youth

  2. Site visit master staff protocol

  3. Site visit employer protocol

  4. Site visit youth focus group protocol

The site visit interview protocols will include semi-structured interviews with grantee administrators and staff, partners, and employers. The site visit protocols and youth focus group protocol will be used in all three program sites. No statistical methods will be used in the implementation analysis, and discussions of the results will be carefully phrased to make clear that no generalization is intended.

An addendum to this package, to be submitted at a later date, will request clearance for the follow-up data collection from study participants, including communication tools (like advance letters and email text for non-response follow-up) and the survey instrument. We are submitting the full package for the study in two parts because the study schedule requires random assignment to take place and the implementation study to begin before the follow-up instrument and related tools are developed and tested.

A. Justification

1. Circumstances necessitating collection of information

Youth who are “disconnected” from school face profound challenges in the increasingly skills-based U.S. labor market. These youth are more likely to have academic difficulties and mental health and substance abuse issues. They may also have lower long-term earnings and find it harder to find and keep a job (Hair et al. 2009). These challenges are compounded for youth who have been involved with the criminal justice system. According to one study, only 12 percent of former youth offenders received their high school diploma or General Educational Development (GED) certificate; this is 74 percent lower than the national average (Steinberg et al. 2004). In many cases, schools are reluctant to enroll former offenders because of the perceived risk that they pose. To help dropouts resume their education and have a better chance at labor market success, governments and foundations have funded numerous programs, including alternative high schools, GED preparation programs, community service projects, and programs combining occupational training and soft skills development.

Programs for at-risk youth have been developed and implemented for decades, but rigorous evaluations have found few of those programs to be effective. Youth programs under the Job Training Partnership Act were found to have negative impacts on out-of-school youth (Bloom et al. 1997), and JobStart increased the likelihood of receiving a GED but generated few lasting impacts on labor market outcomes (Cave et al. 1993).

More recently, three programs for at-risk youth have been evaluated and shown to be effective. Notably, the 10-site, random assignment evaluation of National Guard Youth ChalleNGe found that, three years after enrollment, participants had higher levels of GED receipt, employment, earnings, and college enrollment than control group members (Millenky et al. 2011). Another evaluation, the National Job Corps Study, found that Job Corps—which includes individualized academic education, vocational training, counseling, and job placement—increased GED attainment, certificate receipt, and earnings, and lowered arrest rates over the four-year follow-up period (Schochet et al. 2008). The evaluation of Year Up, which provides training and internships in financial operations and information technology, found that the program increased the earnings of participants by about 30 percent by the third follow-up year (Roder and Elliott 2014). These programs share a key component: they are intensive, requiring a substantial commitment on the part of participating youth. Youth ChalleNGe is designed to be a 17-month program; the average length of participation found in the National Job Corps Study was eight months, and Year Up is a year-long program. In addition to the time commitment required, both Youth ChalleNGe and Job Corps have a residential component.

a. The National Guard Youth ChalleNGe program model

The goal of Youth ChalleNGe is to build confidence and maturity, teach practical life skills, and help youth obtain a high school diploma or GED. The program’s numerous activities all address its eight core pillars: leadership/followership, responsible citizenship, service to community, life-coping skills, physical fitness, health and hygiene, job skills, and academic excellence. It has a quasi-military aspect in which participants, known as “cadets,” live in barracks-style housing in a disciplined environment for about 20 weeks—the residential phase. Cadets wear their hair short and dress in military uniforms. Upon completing the residential phase of the program, participants receive structured mentoring for a year; this mentoring is designed to help them successfully transition back to their communities.

To build on the success of Youth ChalleNGe, the ETA issued $12 million in grants in early 2015 for three Youth ChalleNGe programs to (1) expand the program’s target population to include youth who have been involved with the courts and (2) add an occupational training component, known as Job ChalleNGe. Job ChalleNGe will extend the residential period by five months for randomly selected cadets who are eligible and interested in staying, and will offer the following activities: (1) occupational skills training, (2) individualized career and academic counseling, (3) work-based learning opportunities, and (4) leadership development activities. In addition, the program will actively recruit court-involved youth and engage employers to ensure that the cadets’ skills address employers’ needs. Expanding eligibility for Youth ChalleNGe and Job ChalleNGe to court-involved youth, who for the most part are not currently eligible to participate in Youth ChalleNGe, could make a difference in the lives of those youth who can be the hardest to serve. Substantial research suggests that, to reduce the likelihood of recidivism and increase their chances for success, youth involved in the justice system need specific supports and interventions, such as enrollment in schooling or in job-training programs, as well as access to housing and adult mentors (Beale-Spencer and Jones-Walker 2004). These supports, which are to be part of Youth ChalleNGe and Job ChalleNGe, can make a difference in ensuring that youth find and keep jobs. Both the Youth ChalleNGe and Job Corps evaluations found that program results for youth who had been arrested before enrollment were generally similar to those for other youth, so the potential is there. However, some research has shown the potential risks of “peer contagion” in programs that put high-risk youth together with other youth (Dishion and Dodge 2005). The Job ChalleNGe program, with its tight structure, may be well positioned to avoid those deleterious effects.

The conceptual model for the Youth ChalleNGe and Job ChalleNGe programs will guide the study design (Figure A.1). The upper section of the figure depicts the core Youth ChalleNGe program, beginning on the far left with the inputs: the experienced staff, many with military backgrounds; the residential structure; funding and oversight from the National Guard Bureau; and so forth. Moving to the right, the next section shows the program activities—including the core components, the program’s daily structure and quasi-military atmosphere, and the mentoring phase—and the outputs resulting from those activities. These outputs then lead to the anticipated shorter- and longer-term outcomes for participants, including stable employment, continued education, and desistance from crime. The lower section depicts the Job ChalleNGe component. Its inputs include the new DOL funding and new partnerships to be developed with employers and training providers. The Job ChalleNGe activities and outputs will supplement the core program. Short-term outcomes will include credential attainment and work-readiness skills, which the core program does not address. Ultimately, it is hoped that the combination of Youth ChalleNGe and Job ChalleNGe will have a synergistic beneficial impact on job placement, earnings, and self-sufficiency. The bottom of the figure shows the contextual factors that will shape the program’s operations and outcomes: participant characteristics, site characteristics, and community context (such as labor market conditions).

b. Overview of evaluation

Measuring the effectiveness of the DOL-funded Job ChalleNGe programs requires a rigorous evaluation that can avoid potential biases resulting from fundamental differences between program participants and nonparticipants. The evaluation to be conducted by Mathematica and its subcontractors, MDRC and SPR, includes (1) a random assignment evaluation to measure the impact of the Job ChalleNGe program and (2) an implementation study to understand program implementation and help interpret the impact study results.

The evaluation will address three main sets of research questions:

  1. How were the Youth ChalleNGe and Job ChalleNGe programs implemented? What outreach was done to connect with court-involved youth? Were service models modified for these new participants? What factors influenced implementation? What challenges were faced in implementation and how were those challenges overcome? Which implementation practices are promising?

  2. To what extent did the addition of Job ChalleNGe change the outcomes of participants who completed the Youth ChalleNGe program? Compared with youth who participated in Youth ChalleNGe only, did Job ChalleNGe participants have higher rates of credential attainment, more work readiness skills, and better employment and criminal justice outcomes?

  3. To what extent did impacts vary for selected subpopulations? In particular, were the programs effective for both court-involved youth and youth who have not been involved with the courts? Were the programs equally effective across other subgroups?



Figure A.1. Conceptual model for the Youth ChalleNGe and Job ChalleNGe programs




The first set of research questions will be addressed through an implementation study of the three grantee demonstrations. The implementation study will include (1) semi-structured interviews with administrators, program staff, partners, and employers; (2) focus groups with youth; (3) observations of program activities; and (4) youth case file reviews. A thorough implementation analysis will provide the context for impact estimates, explain differences in impact estimates by target population or grantee, provide lessons for improving programs, and lay the groundwork for replication of the program innovations at other Youth ChalleNGe locations.

To address the next two questions, the evaluation team will conduct a rigorous impact study of Job ChalleNGe. To measure the effectiveness of the additional job training component, a lottery will be conducted among Youth ChalleNGe participants who express interest in enrolling in Job ChalleNGe and who have graduated or are expected to graduate from Youth ChalleNGe. Two-thirds of those eligible for the lottery will be offered a slot in the residential Job ChalleNGe program. The remaining one-third will return home to start the offsite 12-month mentoring component of Youth ChalleNGe. Comparing outcomes for the Job ChalleNGe treatment and control groups will yield the impacts of being offered the Job ChalleNGe program in addition to the standard Youth ChalleNGe program. The random assignment process for Job ChalleNGe will be stratified by court-involvement status to ensure an equal balance of court-involved and non-court-involved youth in the treatment and control groups. Impacts will be estimated separately for each subgroup.
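The stratified lottery described above can be sketched as a random draw within each court-involvement stratum. The function and data layout below are purely illustrative, not the study's actual assignment system; names such as `assign_lottery` and the 2:1 treatment share are assumptions drawn from the text:

```python
import random

def assign_lottery(youth, treatment_share=2/3, seed=2015):
    """Randomly assign eligible youth to Job ChalleNGe, stratified by
    court-involvement status so each stratum gets (roughly) the same
    two-thirds/one-third split described in the design.

    `youth` is a list of (youth_id, court_involved) tuples -- an
    illustrative layout, not the study's actual data structure.
    """
    rng = random.Random(seed)  # fixed seed so assignment is reproducible
    assignments = {}
    # Split the lottery pool into strata by court-involvement status.
    strata = {True: [], False: []}
    for youth_id, court_involved in youth:
        strata[court_involved].append(youth_id)
    for members in strata.values():
        rng.shuffle(members)
        n_treatment = round(len(members) * treatment_share)
        # Roughly two-thirds of each stratum are offered a Job ChalleNGe
        # slot; the rest form the control group and begin the offsite
        # post-residential mentoring phase.
        for i, youth_id in enumerate(members):
            assignments[youth_id] = "treatment" if i < n_treatment else "control"
    return assignments

# Example: a pool of 12 youth, half of them court-involved.
pool = [(f"Y{i:02d}", i % 2 == 0) for i in range(12)]
groups = assign_lottery(pool)
```

Stratifying before drawing guarantees the treatment and control groups contain the same share of court-involved youth, which is what makes the separate subgroup impact estimates possible.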

c. Overview of data collection

The evaluation requires collection of three primary types of data: (1) background and contact information, (2) program implementation details, and (3) follow-up youth outcomes (to be submitted in a separate package for Office of Management and Budget [OMB] clearance). The data covered by this clearance include the collection of background and contact information for youth from the BIF and program implementation data.

Background and contact data. After obtaining parental/guardian consent forms and/or youth assent forms, the study team will collect a rich set of background data on treatment and control group members to support the impact study. These background data will enable the team to describe the characteristics of study participants at the time of random assignment, ensure that random assignment was conducted properly, and conduct baseline equivalency analyses. Background data will also be used to create subgroups for the analysis, match students to school records data, improve the precision of the impact estimates, and assess and correct for survey nonresponse. The contact information will be used to locate individuals for a follow-up survey. Youth under the age of 18 will be asked to complete an assent form and their parents a consent form; youth aged 18 and older will be asked to complete a consent form. The completed consent/assent forms will cover both the Youth ChalleNGe and Job ChalleNGe components of the study.

Implementation data. The implementation analysis data covered by this clearance will be collected during two rounds of visits to the three grantees; information will be collected from interviews with administrators, staff, partners, and employers, focus groups with youth, case reviews, and observations of program activities. Together, these data will be used to describe the Youth ChalleNGe and Job ChalleNGe programs and the successes and challenges they face; highlight promising or best practices found in specific programs; and identify a set of lessons learned with regard to implementing, supporting, and funding the programs. Data from the implementation analysis will enable the study team to better interpret findings from the impact analysis.

Follow-up outcome data. Data on participant outcomes will be part of a future package for which OMB approval will be requested. Key follow-up outcomes that will come from a youth survey conducted 12 months following random assignment will include measures of education success (such as high school completion, GED attainment, postsecondary credits, and vocational certificates or credentials); employment success (such as work-readiness skills, work experience in paid and unpaid jobs, and earnings); and delinquency and criminal justice involvement (such as drug use, arrests, and juvenile detention or incarceration).

2. How, by whom, and for what purpose the information is to be used

Clearance is currently being requested for data collection that will be used to perform and monitor random assignment and to conduct the implementation study. Each form is described below, along with how, by whom, and for what purpose the collected information will be used. A subsequent addendum to this package will include a request for clearance for a follow-up survey and communication tools for follow-up data collection on sample members.

a. Background information form

Contact and demographic information will be collected on all youth who consent to be part of the study. Background data and contact information are needed for five purposes: (1) conducting and monitoring random assignment, (2) locating participants for follow-up data collection, (3) defining policy-relevant subgroups for impact estimates, (4) increasing the precision of impact estimates by including baseline covariates in the regression models that are predictive of the outcome measures, and (5) adjusting for nonresponse to the follow-up data collection using baseline data that are predictive of survey nonresponse.
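The precision gain described in purpose (4) comes from including baseline covariates in the impact regression. A standard specification of this kind (illustrative; the study's actual model may differ) is:

```latex
Y_i = \alpha + \beta T_i + \gamma' X_i + \varepsilon_i
```

where \(Y_i\) is a follow-up outcome for youth \(i\), \(T_i\) indicates random assignment to Job ChalleNGe, \(X_i\) is a vector of baseline measures from the BIF, and \(\beta\) is the estimated impact. Because the covariates absorb outcome variation unrelated to treatment, the standard error of \(\beta\) shrinks relative to a simple difference in means.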

The BIF will provide a snapshot of information about the youth—their characteristics, educational attainment, delinquency and criminal justice involvement—prior to their going through a random assignment lottery. The BIF will collect multiple forms of contact information that will be critical to the success of the follow-up data collection.

Youth will be asked to complete the BIF during Youth ChalleNGe. Background data elements to be collected include the following:

  1. Identifying information. These data include the youth’s name, date of birth, Social Security number, and the parents’ names. This information will be sent to Mathematica staff for entry into the data management system to ensure that each individual is randomly assigned only once and that control group members do not receive program services to which they are not entitled.

  2. Contact information. These data include the youth’s and parents’ addresses; telephone numbers (home, cell, or other); and email addresses. The BIF will also collect information on social media participation. These data will be necessary for tracking and locating youth for the follow-up data collection.

  3. Youth characteristics. These data items include gender, race and ethnicity, housing status, foster care involvement, primary language spoken at home, free and reduced price lunch status, health status, and any diagnosed learning disabilities. The BIF will also include information on whether the youth is married or has children.

  4. Educational attainment. Included in this category is the last grade completed in school, school suspensions, and high school graduation or GED attainment.

  5. Employment information. The BIF includes questions about employment status prior to Youth ChalleNGe and whether the youth has had a paid job lasting three months or longer.

  6. Delinquency and criminal justice involvement. The BIF collects information on drug use; whether the youth has ever been arrested, adjudicated delinquent, convicted of a crime, or detained in a juvenile facility; and whether the youth is currently on probation or parole.

b. Implementation analysis

To assess the implementation of Youth ChalleNGe and Job ChalleNGe, the evaluation team will conduct two rounds of in-person visits to each of the three grantees, with each visit lasting approximately two days. During each round of visits, site visitors will conduct interviews with administrators, grantee staff, their partners, and employers; hold focus groups with youth; conduct case reviews; and record observations of program activities. The first visit will occur soon after the grantees have started serving Job ChalleNGe participants. During the first visit, site visitors will investigate the current structure of the Youth ChalleNGe program, inquire about grantees’ plans for implementing Job ChalleNGe, and assess the status of those implementation plans to date. A follow-up visit will occur after the Job ChalleNGe program is more established, to learn whether and how grantees’ plans for serving youth through Youth ChalleNGe and Job ChalleNGe have changed since the first visit. This package requests approval for the master staff protocol, the employer protocol, and the youth focus group protocol.

Master staff protocol. The master staff protocol will guide the content of the site visit interviews with staff. The purpose of the grantee administrators, staff, and partner interviews is to learn about the program context, organizational structure, specific program components, and any implementation challenges associated with the different elements of the program. Table A.1 maps the main interview topics from the master staff protocol to key respondents: administrators, grantee staff, and partners.

Table A.1. Implementation analysis topics and staff respondents

Topics of Interest (covered with administrators, program staff, and/or partners, depending on the topic)

  Local Context and Alternative Services: local economic conditions, juvenile justice system, alternative services

  Organizational and Administrative Structure of Youth and Job ChalleNGe Programs: program goals and objectives, staff training, key program partners, budget

  Recruitment and Enrollment for Youth and Job ChalleNGe Programs: eligibility, recruitment, selection, random assignment and enrollment, participant characteristics

  Youth ChalleNGe Pre-ChalleNGe Phase: day one, team building, readiness assessment, support and retention

  Youth ChalleNGe Residential Phase: eight core components, assessments and certifications, discipline

  Youth ChalleNGe Post-Residential Programming: mentoring, post-residential planning, placement assessments, education and employment services

  Job ChalleNGe Program Elements: orientation, individualized counseling, skills training, leadership development, work readiness, employer engagement, work-based learning

  Outcomes and Lessons Learned: participant outcomes, implementation, adaptations to serving court-involved youth, sustainability


The interview protocols will be customized to individual respondents and site visit activities based on the structure of each grantee and whether the protocols are for the first or second round of site visits. Site visitors will then use these individual protocols to guide the semi-structured interviews. These protocols will help site visitors capture the perspectives of a wide variety of stakeholders to document differences and similarities between the grantees and to understand the unique contexts in which each program operates. The protocols will also ensure that site visitors systematically collect the information needed to address the implementation study research questions, although they will be flexible enough that site visitors can pursue more open-ended discussions when needed.

Employer protocol. The purpose of the employer interviews is to learn directly from employers about their experiences with and perceptions of youth who have participated in the Youth ChalleNGe and Job ChalleNGe programs and whether the programs are meeting the employers’ needs for skilled workers. The protocol will be customized based on the employer’s involvement with the Youth ChalleNGe and Job ChalleNGe programs.

Youth focus group protocol. The youth focus group protocol will guide focus groups with youth participants during the site visits. The purpose of the focus groups is to learn about youth experiences in the program, and in particular, youth perspectives on the new elements of the program – expanding eligibility to include court-involved youth and adding Job ChalleNGe. The protocol will be used for focus groups with court-involved and non-court-involved youth. The focus group conversations will be guided by seven main topic areas: background and local context; recruitment and enrollment; Pre-ChalleNGe phase; Youth ChalleNGe residential phase; mentoring; Job ChalleNGe; and an overall program assessment.

The additional activities planned for the site visits – program activity observations and case file reviews – will also be guided by structured protocols, to ensure that information is systematically collected to address the implementation study research questions.

3. Use of technology to reduce burden

The data collection efforts are designed to minimize burden on program participants and staff at participating agencies. The evaluation team will supply sites with hard copies of the consent forms and BIFs so that sites and youth can complete the forms in one sitting. Paper copies of the consent forms and BIFs will be sent to the evaluator, which will use trained data entry staff to compile the data and create electronic databases.

4. Efforts to avoid duplication of effort

To minimize duplicate data collection, the BIF has been limited to include only items necessary to the evaluation. The BIF does not request information that is already available uniformly from the grantees as part of the data they collect from youth during their normal intake procedures. The data grantees already collect do not contain all the background characteristics needed for the evaluation, nor are they consistent across grantees.

The site visits will serve to collect information about implementation of the program at the three sites. Site visitors will review grant applications prior to the visit to reduce the burden of data collection about basic program characteristics during the first visit. At the first site visit, the evaluation team will collect information about the context in which the Youth ChalleNGe programs have been operating (prior to the start of the grant activities), grantee staff’s plans for and initial experiences with the implementation of the program, and youth experiences in the program. The second site visit will focus on how the program was implemented in practice, how implementation deviated from plans, and any challenges encountered and the strategies tried to address them. During each site visit, information will be collected from grantees that is not available through any other mechanism or source. Furthermore, the information collected during the site visits and used to conduct the implementation evaluation will be distinct from any agency monitoring activities that may occur during the grant period.

5. Methods of minimizing burden on small entities

The implementation study data collection effort involves interviews with employers during the site visits. It is expected that, at each of the three grantees, one employer will be interviewed during the first site visit and two will be interviewed during the second visit—yielding a total of nine employers across all grantees and site visits. It is possible that some of these employers might be small businesses. Each interview is expected to last no more than 60 minutes, although it is possible that some will be shorter (such as 30 minutes). To minimize burden on any small businesses that participate in the interviews, the interviews will be scheduled at the employer’s convenience. If the employer prefers, the site visitors will conduct the interview by phone on dates other than during the site visits to grantees—again to minimize the burden on employers. The employer interviews, including those with small businesses, will be conducted only with employers who are willing to participate; as with other interviews, their participation will be completely voluntary.

6. Consequences of not collecting data

Without collecting background information on study participants, the evaluation team’s ability to implement random assignment correctly and monitor adherence to random assignment procedures would be severely limited. The lack of background information would limit the ability to describe the population of Youth ChalleNGe and Job ChalleNGe participants and would limit the analysis of impacts of the program on subgroups, such as youth with juvenile court histories, hence limiting the ability to determine whether each program is effective for these groups. Without background data, impact estimates would be less precise (so that small impacts would be less likely to be detected), and adjustments for nonresponse to the follow-up surveys (which will be part of a clearance package to follow this one) would have to be based on less-detailed administrative data.

Without collecting detailed contact information for study participants, the evaluation team’s ability to track participants over the follow-up period would be limited. This would likely lead to a higher nonresponse rate and, thus, a greater risk that the survey data are biased by systematic differences in response rates that are correlated with youth characteristics. If not controlled for in the model, these differences may bias the impact estimates.

Without collecting the information specified in the site visit protocols, an implementation analysis of the Youth ChalleNGe and Job ChalleNGe programs could not occur. This would prevent information being provided to policymakers about the context in which programs operate, any operational challenges faced by programs in implementing their grant plans to include court-involved youth and to provide Job ChalleNGe services, and how the programs evolve over time.

7. Special circumstances

No special circumstances are involved with the collection of information.

8. Federal Register announcement and consultation

a. Federal Register announcement

The 60-day notice to solicit public comments was published in vol. 80, no. 141, page 43796 of the Federal Register on July 23, 2015.

b. Consultations outside the agency

The evaluation team has not consulted any experts who are not directly involved in the study regarding the subject of this clearance. The evaluation team expects to consult with additional experts for other aspects of the implementation and impact evaluations.

c. Unresolved issues

There are no unresolved issues.

9. Payment to respondents

There are no payments to respondents. Tasks and activities conducted by program and partner staff are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay. Youth will not be compensated for completing the consent form or BIF, or for participating in the interviews or focus groups.

10. Privacy of the data

The study is being conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a); the Privacy Act Regulations (34 CFR Part 5b); and the Freedom of Information Act (5 U.S.C. 552) and related regulations (41 CFR Part 1-1, 45 CFR Part 5b, and 40 CFR 44502).

Before random assignment, participants and their parents/guardians (if a youth is under the age of 18) will receive information about the study’s intent to keep information private to the extent permitted by law; the consent form that participants will be asked to read and sign before being randomly assigned to a research group will include this privacy information. The information will introduce the evaluators, explain random assignment and the research groups, clarify that the study participants will be asked to complete a BIF, and inform participants that administrative records about the youth will be released to the research team. Participants will be told that all information provided will be kept private and used for research purposes only. Further, they will be assured that they will not be identified by name or in any way that could identify them in reports or communications with the DOL or with Youth ChalleNGe or Job ChalleNGe administration or instructors.

11. Additional justification for sensitive questions

The BIF will collect background information on youth who have consented to participate in this evaluation. Information on date of birth, Social Security number, address, and telephone numbers is needed to identify and contact sample members and to ensure that random assignment is conducted correctly. The BIF will also collect information on characteristics of sample members, such as marital status, number of children, and housing status, which will be used to enhance the impact estimates. This type of information is routinely collected as part of enrollment in most programs and is, therefore, not considered sensitive.

The BIF includes questions that some respondents might find sensitive. These questions ask about delinquent activities, including arrests and drug use. Collection of this information, although sensitive in nature, is critical for the evaluation given that a unique component of the grants under evaluation is the expansion of eligibility for the traditional Youth ChalleNGe program to include court-involved youth. In addition, the information cannot be obtained through other sources. The extent of prior involvement with the juvenile and criminal justice systems will be an important characteristic for describing the sample members and will serve as a means to form key subgroups for the impact analysis. The evaluation team has included similar questions in past studies without any evidence of significant harm.

As described earlier, all sample members will be provided with assurances of confidentiality before they complete the BIF and random assignment is conducted. Not all data items have to be completed. All data will be held in the strictest confidence and reported in aggregate, summary format, eliminating the possibility of individual identification.

12. Estimates of hours burden

In Table A.2, we describe our assumptions about the total number of responses expected, the average hours of burden per respondent, and the total burden hours estimated for the BIF and two rounds of site visits. The estimated burden on youth is 666 hours for the BIF, with a monetized burden of $4,829, and 84 hours for site visits (42 hours per visit), with a monetized burden of $610; the burden on grantee staff is 165 hours ($7,011) and on employers is 9 hours ($248). In total, the burden is 924 hours ($12,698). Annualized over three years, the burden on youth is 222 hours for the BIF ($1,610 per year) and 28 hours for the site visits (14 hours per visit; $204); the annual burden on grantee staff is 55 hours (38 hours for site visit 1 and 17 hours for site visit 2; $2,336) and on employers is 3 hours (1 hour for site visit 1 and 2 hours for site visit 2; $83). In total, the annual burden over three years is 308 hours ($4,233).

Table A.2. Burden associated with the background information forms and site visits

Respondents       Total number of    Annual number    Responses per  Average burden  Total burden     Annual burden  Time      Monetized burden   Annual monetized
                  respondents over   of responses^a   respondent     time per        hours over       hours^a        value     hours (rounded)    burden hours^a
                  entire evaluation                                  response        entire evaluation                         over evaluation

Background information forms
  Youth           4,994^b            1,665            1              8 minutes       666              222            $7.25^c   $4,829             $1,610

Grantee site visits
Site visit 1
  Staff           113^d              38               1              60 minutes      113              38             $42.49^c  $4,801             $1,600
  Employers       3                  1                1              60 minutes      3                1              $27.60^c  $83                $28
  Youth           42                 14               1              60 minutes      42               14             $7.25^c   $305               $102
Site visit 2
  Staff           104^d              35               1              30 minutes      52               17             $42.49^c  $2,209             $736
  Employers       6                  2                1              60 minutes      6                2              $27.60^c  $166               $55
  Youth           42                 14               1              60 minutes      42               14             $7.25^c   $305               $102

Total             5,304              1,768                                           924              308                      $12,698            $4,233

^a The figures for the annual number of responses, annual burden hours, and annual monetized burden hours are based on three years of data collection.

^b The background information form will be collected from all Youth ChalleNGe participants enrolled in the study.

^c The hourly wage of $7.25 is the federal minimum wage (effective July 24, 2009), available at http://www.dol.gov/dol/topic/wages/minimumwage.htm; $42.49 is based on the May 2014 median wage of "Education Administrators: Postsecondary," available at http://www.bls.gov/oes/current/oes119033.htm; and $27.60 is based on the May 2014 median wage of "Human Resources Specialist," available at http://www.bls.gov/oes/current/oes131071.htm.

^d Some respondents will participate in interviews that last more than 60 minutes, but the total number of hours is limited to 113 for site visit 1 and 52 for site visit 2. The table therefore provides an upper estimate of the number of separate respondents.
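The totals in Table A.2 follow directly from the per-row figures. As an illustrative check only (not part of the information collection), the arithmetic can be sketched in Python, rounding each row's hours and dollars half up as the table does:

```python
# Illustrative check of Table A.2 totals. Each row:
# (responses over the evaluation, minutes per response, hourly wage).
rows = {
    "BIF youth":     (4994, 8,  7.25),
    "SV1 staff":     (113, 60, 42.49),
    "SV1 employers": (3,   60, 27.60),
    "SV1 youth":     (42,  60,  7.25),
    "SV2 staff":     (104, 30, 42.49),  # 30-minute interviews, so 52 total hours
    "SV2 employers": (6,   60, 27.60),
    "SV2 youth":     (42,  60,  7.25),
}

def half_up(x):
    """Round half up, as the table does (Python's round() rounds half to even)."""
    return int(x + 0.5)

total_hours = sum(half_up(n * m / 60) for n, m, _ in rows.values())
total_cost = sum(half_up(half_up(n * m / 60) * w) for n, m, w in rows.values())

print(total_hours)               # 924 burden hours over the entire evaluation
print(total_cost)                # $12,698 monetized burden
print(half_up(total_hours / 3))  # 308 annual burden hours over three years
```

Note that the monetized total sums the rounded per-row dollar figures, matching the $12,698 reported in the table.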

13. Estimate of total annual cost burden to respondents or record keepers

There are no direct costs to respondents, and they will incur no start-up or ongoing financial costs. The cost to respondents solely involves the time required for the interviews and completing the BIFs. The costs are captured in the burden estimates in Table A.2.

14. Estimates of annualized cost to the federal government

The total annualized cost to the federal government is $125,907 ($107,805 for the contractor plus $18,102 for federal oversight). Costs result from the following two categories:

  • The estimated cost to the federal government for the contractor to carry out this study is $170,971 for background data collection and $152,443 for two rounds of site visits. Annualized, this comes to $107,805 over three years.

  • The annual cost borne by the DOL for federal technical staff to oversee the contract is estimated to be $18,102. The annual level of effort expected to perform these duties will require 200 hours for one Washington, D.C., based federal GS-14 step 4 employee earning $56.57 per hour. (See Office of Personnel Management 2015 Hourly Salary Table, available at http://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2015/DCB_h.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6 (200 hours x $56.57 x 1.6 = $18,102).
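As a simple illustrative check, the annualized figures above can be reproduced from the component costs stated in the two bullets:

```python
# Check of the annualized federal cost figures in item 14 (illustrative only).
contractor_total = 170_971 + 152_443      # background data collection + two rounds of site visits
contractor_annual = contractor_total / 3  # annualized over three years
print(round(contractor_annual))           # 107805

# DOL oversight: 200 hours of GS-14 step 4 time at $56.57/hour,
# times 1.6 for fringe benefits and overhead.
oversight_annual = 200 * 56.57 * 1.6
print(round(oversight_annual))            # 18102

# Total annualized cost to the federal government.
print(round(contractor_annual + oversight_annual))  # 125907
```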

15. Reasons for program changes or adjustments

This is a new information collection.

16. Tabulation, publication plans, and time schedules

This data collection will contribute to a final report describing findings from the impact and implementation analyses. In the final report, all study research questions will be answered by integrating the findings from the implementation and impact analyses. The report will include analysis of the data that are part of this request, as well as survey data that will be part of a subsequent request for clearance, the cadet tracking system data, and administrative records. The results will be clearly and concisely presented, with appendices providing an appropriate level of technical information to document the rigor of the analyses. The final report will be available in spring or summer 2020.

The data collection efforts included in this request also might be used as part of an issue brief that will explore a specific topic in depth, such as the challenges of recruiting and serving court-involved youth or approaches to work-based learning opportunities included in Job ChalleNGe. The final topic of the issue brief, which will use nontechnical language and infographics, will be determined by the CEO at a later point.

17. Approval not to display the expiration date for OMB approval

The expiration date for OMB approval will be displayed.

18. Exception to the certification statement

No exceptions to the certification statement are requested or required.



