National Guard Youth ChalleNGe Job ChalleNGe Evaluation

OMB: 1290-0019


Part A: National Guard Youth ChalleNGe Job ChalleNGe Evaluation

OMB NO. 1290-0NEW

APRIL 2018

The National Guard Youth ChalleNGe Job ChalleNGe Evaluation

Office of Management and Budget SUPPORTING STATEMENT PART A

The Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) is funding three National Guard Youth ChalleNGe programs to expand the target population to include court-involved youth and add a five-month residential occupational training component called Job ChalleNGe. The goal of Youth ChalleNGe, a program that was established in the 1990s, is to build confidence and maturity, teach practical life skills, and help youth obtain a high school diploma or GED. The program has a quasi-military aspect in which, for about 20 weeks, participants live in a disciplined environment and participate in numerous activities to strengthen their skills in a wide range of areas.

The addition of the Job ChalleNGe component to the existing Youth ChalleNGe model has the potential to bolster the program’s effectiveness by adding intensive occupational training. Job ChalleNGe expands the residential time by five months for cadets who are interested in staying and are identified by staff as having the potential to benefit from the program. It offers the following activities: (1) occupational skills training, (2) individualized career and academic counseling, (3) work-based learning opportunities, and (4) leadership development activities. In addition, the program is engaging employers to ensure participants’ skills address employers’ needs.

The National Guard Youth ChalleNGe Job ChalleNGe Evaluation, sponsored by DOL’s Chief Evaluation Office (CEO), is designed to gain an understanding of the implementation of the DOL Job ChalleNGe grant and the experiences and outcomes of participants in the three grantee sites that were awarded Job ChalleNGe grants in 2015. The CEO has contracted with Mathematica Policy Research, in conjunction with its subcontractors MDRC and Social Policy Research Associates (SPR), to conduct this evaluation. OMB clearance (control number 1291-0008) was received on March 31, 2016, for four data collection instruments used as part of the evaluation: (1) baseline information form (BIF) for youth, (2) site visit master staff protocol, (3) site visit employer protocol, and (4) site visit youth focus group protocol.

This request seeks clearance for two types of follow-up data collection from study sample members who participated in the Job ChalleNGe program.

- A monthly text message survey to be administered eight times, once per month during the 8th through 15th months after participants start Job ChalleNGe. The survey is expected to take respondents about 4 minutes to complete each time it is administered.

- A follow-up survey to be administered 16 months after Job ChalleNGe participants start the program. The survey is expected to take about 15 minutes for respondents to complete.

A. Justification

1. Circumstances necessitating collection of information

As was described in the initial request for clearance for data collection for this study (ICR reference number 201509-1291-002), youth who are “disconnected” from school face profound challenges in the increasingly skills-based U.S. labor market, including academic difficulties, lower long-term earnings, and employment instability. To help dropouts resume their education and have a better chance at labor market success, governments and foundations have funded numerous programs, including alternative high schools, GED preparation programs, community service projects, and programs combining occupational training and soft skills development.

The National Guard Youth ChalleNGe Job ChalleNGe grant program and subsequent evaluation are authorized by Section 171, Pilot and Demonstration Projects, of the Workforce Investment Act.

a. The National Guard Youth ChalleNGe Job ChalleNGe program

One program for at-risk youth that was recently found to be effective is National Guard Youth ChalleNGe. A study of this program found that, three years after enrolling in Youth ChalleNGe, participants had higher levels of GED receipt, employment, earnings, and college enrollment.1 The goal of Youth ChalleNGe is to build confidence and maturity, teach practical life skills, and help youth obtain a high school diploma or GED. The program includes activities to help youth develop valuable skills in areas such as leadership/followership, responsible citizenship, physical fitness, job skills, and academic excellence. During the 20-week quasi-military residential phase, participants, known as “cadets,” live in barracks-style housing, wear their hair short, and dress in military uniforms. Upon completing the residential phase of the program, cadets receive structured mentoring for a year, which is designed to help them successfully transition back to their communities. It is common for each site that administers a Youth ChalleNGe program to start the program with a new cohort of youth every six months—for example, in January and July of each calendar year.

To build on the success of Youth ChalleNGe, the ETA issued $12 million in grants in early 2015 for three Youth ChalleNGe programs—in Fort Stewart, Georgia; Battle Creek, Michigan; and Fort Jackson, South Carolina—to (1) expand the program’s target population to include youth who have been involved with the court system and (2) add an occupational training component, known as Job ChalleNGe. Job ChalleNGe is available to both youth who have been involved in the court system and those who have not. For cadets who are eligible and interested in staying for Job ChalleNGe, the component expands the residential time by five months and offers the following activities: (1) occupational skills training, (2) individualized career and academic counseling, (3) work-based learning opportunities, and (4) leadership development activities. Expected outcomes for Job ChalleNGe include credential attainment and work-readiness skills. Ultimately, it is hoped that the combination of Youth ChalleNGe and Job ChalleNGe will have a synergistic beneficial effect on participants’ post-program outcomes, such as job placement, earnings, and avoidance of criminal justice system involvement.

The grantees began serving youth with grant-funded Youth ChalleNGe and Job ChalleNGe activities in 2015 and are expected to continue to do so through the remainder of the grant period. The first cohort of youth served by the grant began Job ChalleNGe in January 2016; the final cohort will begin it in July 2018. In total, the grantees are expected to serve six cohorts of youth, with the sixth cohort finishing up Job ChalleNGe near the end of 2018.

b. Overview of the evaluation

The evaluation, sponsored by the Chief Evaluation Office (CEO) in DOL and conducted by Mathematica and its subcontractors MDRC and SPR, includes (1) an implementation study to understand program implementation and (2) an outcomes study to learn about the experiences and outcomes of participants in the three grantee sites. The study is examining how the Youth ChalleNGe program serves court-involved youth, as well as the implementation of Job ChalleNGe for all youth it serves. The research questions also include an exploration of both (1) the post-residential phase of Youth ChalleNGe, since the Job ChalleNGe grants provide an alternative for this important transition period, and (2) the outcomes of Youth ChalleNGe and Job ChalleNGe participants. Although CEO and the evaluation team explored the possibility of conducting an impact evaluation to assess the impacts of Job ChalleNGe on youth’s outcomes, they determined that such a study was not feasible because the study sites could not recruit, and determine eligibility for, enough youth to support this type of evaluation design.

The implementation and outcomes studies seek to answer the following research questions:

  1. How was the Youth ChalleNGe program implemented under the Job ChalleNGe grant?

  2. How was the Job ChalleNGe program implemented?

  3. How did the programs recruit and select youth for Job ChalleNGe?

  4. How did youth experience the post-residential phase?

  5. What were the employment, education, and criminal justice outcomes of Youth ChalleNGe and Job ChalleNGe participants?

  6. What expectations do youth have for the future?

  7. What can we learn from these grants about possible program models to serve court-involved and other at-risk youth?

The first three research questions will be addressed through the implementation study of the three grantee sites. The implementation study primarily draws upon data collected in instruments for which OMB approval has already been granted (control number 1291-0008); these data stem from the evaluation’s (1) semi-structured interviews with administrators, program staff, partners, and employers; (2) focus groups with youth; (3) observations of program activities; and (4) youth case file reviews.

Research questions 4–6 will be addressed through the outcomes study, which will draw upon survey data for which OMB approval is requested in this package. These data will be obtained through a short survey administered by text message once a month for eight months (during the 8th through 15th months after participants start Job ChalleNGe), as well as a one-time follow-up survey (conducted 16 months after the start of Job ChalleNGe).2

The seventh research question will be addressed through a synthesis and assessment of the information learned from all evaluation data collection activities.

c. Overview of data collection

Below is an overview of the two data collection instruments for which clearance is requested.

Monthly Text Message Survey. Job ChalleNGe participants in grantee cohorts 4, 5, and 63 who give consent to participate in the evaluation and permission to be contacted via text message (and whose parents/guardians have done so, when necessary) will be asked to complete a brief survey administered by text message once a month for eight months, during months 8 through 15 after the youth began Job ChalleNGe. The brief survey is designed to provide snapshots of the progression respondents make in their employment, earnings, and education between the end of the Job ChalleNGe program and the follow-up survey. In each round of monthly text message data collection, each participant will be asked three to five questions; the specific questions will be tailored to his or her circumstances. For example, participants who report in response to the first question that they are not currently working will not receive the follow-up questions about hours worked and hourly pay rate at a job.

Periodically, the survey will also collect updated contact information from the respondents. The contact information will be used to support the follow-up survey data collection effort.

Although the length of time it takes respondents to complete each round of the monthly text message survey will vary, it is expected that, on average, survey completion will require about four minutes each month.

Follow-up Survey. The sample for the follow-up survey includes Job ChalleNGe participants who gave consent to participate in the evaluation (and whose parents/guardians did so, when necessary). The follow-up survey, to be conducted 16 months after the start of Job ChalleNGe for each cohort, covers five broad topics: (1) the participant’s experiences during Job ChalleNGe, such as the services he or she received; (2) employment and earnings, such as the characteristics of a current job or the participant’s recent work search efforts; (3) educational experiences, including attainment and future plans; (4) involvement in the court system, such as whether or not the participant was arrested and convicted of a crime; and (5) views about the value of different aspects of the Job ChalleNGe program. The survey is expected to take participants about 15 minutes to complete. As with the brief text message surveys, respondents will be skipped past questions that are not pertinent to them.

2. How, by whom, and for what purpose the information is to be used

Data collected through the instruments approved in this package will be used in the following ways:

a. Monthly Text Message Surveys

The text message surveys will provide information so the evaluation team can develop measures about the progression of Job ChalleNGe participants during the follow-up period. Although each respondent will be asked no more than five questions during any one administration of the survey, the questions will vary based on respondents’ answers to earlier questions and the specific month in which the survey is conducted. In total, across the eight months of this data collection effort, there are 10 distinct questions that respondents might be asked. Six questions pertain to employment, two to participation in education, and two to contact information.

Employment information. The text message survey includes three questions about the respondent’s employment status, hours worked, and wage rate. The survey also asks about enlistment in the military to ensure that enlistment status is not inadvertently missed, given (1) its distinctive nature as employment and (2) the close connection between the Youth ChalleNGe program and the military. Two additional questions ask whether and how Job ChalleNGe assisted respondents in obtaining their jobs.

Educational information. The text message survey includes two questions about respondents’ (1) current educational status and (2) educational history.

Descriptive analysis of the responses to the questions about employment and education will be used to document the progression of Job ChalleNGe participants in these areas during the first eight months after the end of the Job ChalleNGe program (while they are in the community). It also will be used to describe the extent to which the Job ChalleNGe program assisted participants. In addition, information on employment status (including military enlistment) and educational status will be used to construct a measure of participants’ engagement in a productive activity during each month covered by this data collection effort.

Contact information. During the sixth month of the text message survey (that is, during the 13th month after the start of Job ChalleNGe), Job ChalleNGe participants will be asked to confirm or update their contact information. The evaluation team will use this information to locate participants for the follow-up survey data collection.

b. Follow-up survey

The one-time follow-up survey will collect data from each participant in cohorts 4, 5, and 6 of the Job ChalleNGe program from whom study consent was obtained. The survey contains five sections. The first asks respondents about their experiences during Job ChalleNGe, including the services and activities they participated in, whether or not they obtained an industry-recognized credential, and what they think they would have done had participation in Job ChalleNGe not been an option. The evaluation will use the information in this section to describe the activities in which the participants reported being involved, as well as perceptions of what they would have done had the grant-funded program not been available to them.

The second section of the follow-up survey asks respondents about their employment and earnings. Specific topics include work search effort and strategies used to look for work, current employment status and the characteristics of the main current job, and the extent to which Job ChalleNGe helped participants with employment. The evaluation team will couple this information with information from the brief text message survey to develop a rich descriptive picture of Job ChalleNGe participants’ employment during the first 10 months after the end of the Job ChalleNGe program (about 16 months after they started Job ChalleNGe).

The third section of the follow-up survey asks respondents about their educational experiences since Job ChalleNGe. Topics include educational attainment, current participation status in educational activities, expectations for participation in the future, reasons for participating, and expectations about the value of additional education for career advancement. As with the employment-related information, the evaluation team will couple the education information from the follow-up survey with education information from the text message survey to learn—at a descriptive level—about these types of outcomes during the follow-up period.

The fourth section includes questions about the respondents’ involvement in the criminal or juvenile justice systems, such as whether or not they have been arrested, convicted, and/or sentenced since starting Job ChalleNGe. This information will be used to gain insights about the extent of their involvement in the justice system and to examine differences in these outcomes between those who were identified as being court-involved prior to the start of Youth ChalleNGe and those who were not.

The final section of the survey includes questions about the respondents’ contact with Job ChalleNGe staff after leaving the program and their perceptions of the quality of the services received during the program. The evaluation team will analyze these questions to generate insights about the participants’ assessment of the program.

3. Use of technology to reduce burden

The data collection efforts will use technology to reduce burden on program participants. The evaluation team will use text messaging and a web-based survey to make it easier for youth to respond to questions. While the text message survey will be conducted via mobile device, the 15-minute follow-up survey can be completed on either a mobile device or a desktop computer. These technologies will provide respondents with around-the-clock access to the surveys and will enable them to respond at any time that is convenient. The web-based platform will enable respondents to complete the follow-up survey in parts, if necessary, and resume at the point of break-off. For the follow-up survey, we will also offer completion over the telephone for study participants who prefer this method.

4. Efforts to avoid duplication of effort

To minimize duplicate data collection, the brief text message survey and the follow-up survey have been designed to include only items necessary to the evaluation and for which no other source is suitable.

Although the topics asked about in the brief text survey and follow-up survey overlap, there is no duplication of effort, because the time periods covered by the data collection efforts and/or the content of the questions differ. For example, the text message survey focuses on current employment or education status in the eight months following the end of Job ChalleNGe. Thus, the different rounds of this survey collect distinct information, even though the questions might initially appear to be repeated across the months. This frequency is necessary to support the evaluation’s plans to measure and assess participants’ progression in especially relevant outcomes during the follow-up period without risk of recall bias.

Furthermore, the follow-up data collection efforts do not include requests for information that is already available from other sources. Many of the questions to be asked pertain to outcomes that occur after the respondents have returned from Job ChalleNGe to the community; therefore, the information is not available from grantees.

Although the evaluation team plans to collect and analyze information about some of the sample members’ outcomes through administrative records data, given the limitations of these data, we will use them to supplement, not substitute for, the data available through the brief text message and follow-up surveys. Thus, there is no duplication of effort. CEO and the evaluation team determined that administrative records on employment and earnings are not appropriate for use in the evaluation, given the young ages of the Job ChalleNGe participants and, therefore, the types of job transitions they are likely to experience. Furthermore, these administrative data are available only in quarterly time intervals and do not include important measures such as hours worked, hourly wages, or occupation. Further, the administrative data do not include information about whether or not Job ChalleNGe helped the participants obtain a job, the fringe benefits available through the job, or whether participants found employment in the occupation for which they trained during Job ChalleNGe.

Administrative data about education, to be collected from the National Student Clearinghouse, also do not provide the same information as will be collected through the text message and follow-up surveys. For instance, the survey does not include many questions on enrollment in educational programs or credits earned, since that information may be gathered through the administrative data sources. Instead, the survey focuses on individual motivation for taking specific courses and how those choices relate to their experience in Job ChalleNGe. Furthermore, there is no other source besides the Job ChalleNGe participants themselves to ask about their expectations of future participation in education and the value of additional education for their career advancement.

Finally, the criminal justice administrative data are not expected to provide comprehensive information about certain types of justice involvement in a consistent way across the grantee sites. This is due in part to the age of the Job ChalleNGe participants and in part to state-specific restrictions on the content of the data that can be released publicly. For example, some states do not release to the public information on arrests unless there has been a conviction. In addition, some state criminal justice data repositories—the agencies from which the evaluation team plans to collect administrative data—do not consistently and comprehensively receive information from all criminal justice agencies (police, courts, jails and prison systems, and probation and parole systems) within the state boundaries. Use of a follow-up survey, therefore, is the best method for obtaining consistent data about sample members’ involvement in the court system to facilitate an analysis that pools outcomes across the three study sites.

In addition, there is no other source for information about sample members’ attitudes about whether and how Job ChalleNGe helped them.

5. Methods of minimizing burden on small entities

No small businesses or entities will be involved as respondents.

6. Consequences of not collecting data

Without the information from the brief text message survey, the evaluation team would not be able to assess the monthly engagement and progression of Job ChalleNGe participants in certain types of prosocial activities (employment, education, and the military) during the eight-month period after the end of the Job ChalleNGe program.

The administration of the text survey in month 13 (in the 6th of eight monthly text data collections) also requests confirmation or update of the Job ChalleNGe participants’ contact information to foster the evaluation team’s ability to request participation in the follow-up survey. Without this information, the evaluation team might need to rely on outdated contact information and, hence, might encounter a higher nonresponse rate and greater difficulty obtaining the target number of completed follow-up interviews.

Without collecting the data requested in the follow-up survey instrument, the evaluation team would not be able to gain a rich understanding of the Job ChalleNGe participants’ outcomes at the time the survey is fielded, including their employment and educational experiences and their criminal justice outcomes. The evaluation team also would not be able to gain insights from the participants’ perceptions about the quality of the Job ChalleNGe program.

7. Special circumstances

No special circumstances are involved with the collection of information.

8. Federal Register announcement and consultation

a. Federal Register announcement

The 60-day notice to solicit public comments was published in the Federal Register (82 FR 512999) on November 3, 2017. No comments were received.

b. Consultations outside the agency

The evaluation team consulted experts internal to their organization who are not directly involved in the study regarding the subject of this clearance.

c. Unresolved issues

There are no unresolved issues.

9. Payment to respondents

Respondents will be paid $3 for each round of the text message survey they complete. Payments will be made via Amazon code; respondents will receive a code once they submit the final text exchange each month. Follow-up survey respondents will receive an Amazon code worth $30 for completion in the first six weeks of the survey fielding period and $20 for completion thereafter. The decision to use a differential payment is based on findings from an incentive experiment conducted by research staff on the impact evaluation of the YouthBuild program.4 Their findings showed that offering a larger payment amount in the first four weeks of data collection (1) increased the odds of respondents completing the survey early, (2) reduced the need for an additional mode of follow-up, and (3) led to greater potential representativeness among respondents (although this finding was not statistically significant). Using Amazon codes rather than traditional gift cards eliminates the lag time between completion and receipt.

10. Privacy of the data

The study is being conducted in accordance with all relevant regulations and requirements. A Certificate of Confidentiality was approved on March 11, 2016 under certificate number CC-HD-16-121.

During the baseline data collection, Job ChalleNGe participants and their parents/guardians (for youth under age 18) received information about the study’s intent to keep information private to the extent permitted by law: the consent form that participants were asked to read and sign included this privacy information. The information introduced the evaluators, clarified that the study participants could be asked to complete a series of surveys, and informed participants that administrative records about them will be released to the research team. Participants were told that all information provided will be kept private and used for research purposes only. Further, they were assured that they will not be identified by name or in any way that could identify them in reports or communications with the DOL or with Youth ChalleNGe or Job ChalleNGe administration or instructors.

An advance letter will be mailed to each Job ChalleNGe participant in the study prior to the text message survey and follow-up survey data collections. Advance letters will include similar information pertaining to privacy and use of data.

11. Additional justification for sensitive questions

The text message data collection and follow-up survey include questions that some respondents might find sensitive. These questions ask about court involvement, including arrests, types of charges, convictions, and sentencing. The evaluation team has included similar questions about court involvement in past studies without any evidence of significant harm.

Collection of the information about court involvement, although sensitive in nature, is critical for the evaluation, given that a distinctive innovation of the grants under evaluation is the availability of Job ChalleNGe (and Youth ChalleNGe) services to court-involved youth. Thus, CEO is especially interested in learning about whether the outcomes of court-involved youth differ from those of non-court-involved youth. The information cannot be obtained in a standardized way through state-maintained criminal justice administrative data because the content and comprehensiveness of such data vary across the three study sites, given that they are located in different states. The information also cannot be obtained from any other source.

Prior to the baseline data collection, the evaluation team obtained a Certificate of Confidentiality from the National Institutes of Health. The certificate was explained on the parent and youth consent forms presented to all prospective study participants. In addition, all sample members will be provided with assurances of privacy before they complete the text data collection and follow-up survey. They will not be required to complete all data items. All data will be held in the strictest confidence and reported in aggregate summary format, eliminating the possibility of individual identification.

12. Estimates of hours burden

Table A.1 presents the estimated total annual number of responses, the average hours of burden per response, and the total annual burden hours for the monthly text message and follow-up surveys. Annualized over the three-year data collection period, the burden on study sample members is 62 hours ($450) for the monthly text message data collection and 35 hours ($254) for the follow-up survey, for a total annual burden of 97 hours ($704).

Table A.1. Estimated Annualized Respondent Hour and Cost Burden

Type of Instrument (Form/Activity) | Number of Respondents | Responses per Respondent | Number of Responsesa | Average Burden per Response (Hours) | Estimated Burden Hours | Average Hourly Wage | Total Burden Cost
Monthly text message survey | 117b | 8 | 936 | 4/60c | 62 | $7.25d | $450
Follow-up survey | 138e | 1 | 138 | 15/60 | 35 | $7.25d | $254
Total | 255 | -- | 1,074 | -- | 97 | -- | $704

Note. Most youth participating in the monthly text message survey will also participate in the follow-up data collection. Therefore, the table provides an upper estimate of the number of separate respondents.

a The figures for the annual number of responses, annual burden hours, and annual monetized burden hours are based on three years of data collection.

b The monthly text message survey will be collected from all Job ChalleNGe participants who consented to participate in the study, who provided a cell phone number to the study team, and who consented to be contacted via text messaging. The number of respondents is based on a 90% study consent rate x an 85% consent-to-text rate x an 85% response rate / 3 years.

c The average amount of time per response is based on calculating the average number of minutes expected for respondents to complete the text exchange each month. The number of texts in each month’s exchange will vary depending on the category of questions asked as well as introduction and closing texts. Depending on the responses to particular questions, the minimum number of text exchanges is 5 and the maximum is 11.

d The hourly wage of $7.25 is the federal minimum wage (effective July 24, 2009), available at http://www.dol.gov/dol/topic/wages/minimumwage.htm.

e The follow-up survey will be collected from all Job ChalleNGe participants who consented to participate in the study. The number of respondents is based on a 90% study participation consent rate x an 85% response rate / 3 years.
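The figures in Table A.1 can be reproduced with simple arithmetic. The sketch below is an illustrative check only, not part of the official estimate: the total enrollment of roughly 540 youth across cohorts 4 through 6 is an assumption inferred from the published respondent counts (it is not stated in this document), and conventional half-up rounding is assumed at each step.

```python
import math

def half_up(x):
    # Round half away from zero for non-negative values (conventional rounding).
    return math.floor(x + 0.5)

TOTAL_ENROLLEES = 540  # ASSUMPTION: total youth in cohorts 4-6 over 3 years
MIN_WAGE = 7.25        # federal minimum wage, per footnote d

# Footnote b: 90% study consent x 85% consent-to-text x 85% response / 3 years
text_respondents = half_up(TOTAL_ENROLLEES * 0.90 * 0.85 * 0.85 / 3)  # 117
# Footnote e: 90% study consent x 85% response / 3 years
followup_respondents = half_up(TOTAL_ENROLLEES * 0.90 * 0.85 / 3)     # 138

text_responses = text_respondents * 8           # 8 monthly rounds -> 936/year
text_hours = half_up(text_responses * 4 / 60)   # 4 minutes each -> 62 hours
text_cost = half_up(text_hours * MIN_WAGE)      # $450

followup_responses = followup_respondents * 1           # one-time survey -> 138
followup_hours = half_up(followup_responses * 15 / 60)  # 15 minutes -> 35 hours
followup_cost = half_up(followup_hours * MIN_WAGE)      # $254

total_hours = text_hours + followup_hours  # 97
total_cost = text_cost + followup_cost     # $704
```

Under these assumptions the script reproduces every cell of Table A.1, which suggests the table's internal arithmetic is consistent.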

13. Estimate of total annual cost burden to respondents or record keepers

There are no direct costs to respondents and they will incur no start-up or ongoing financial costs.

14. Estimates of annualized cost to the federal government

The total annualized cost to the federal government is $104,328. Costs result from the following two categories:

  • The estimated cost to the federal government for the contractor to carry out this study is $255,000 for survey data collection for three cohorts. Annualized over three years, this is $85,000 per year.

  • The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $19,328. We expect the annual level of effort to perform these duties will require 200 hours for one Washington, DC–based Federal GS 14 step 4 employee earning $60.40 per hour. (See Office of Personnel Management 2018 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2018/DCB_h.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus 200 hours x $60.40 x 1.6 = $19,328.
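The two cost components above follow directly from the stated inputs; a minimal arithmetic check:

```python
# Contractor cost: $255,000 for three cohorts, annualized over 3 years.
contractor_annual = 255_000 / 3  # $85,000 per year

# Federal oversight: 200 hours x $60.40/hour (GS-14 step 4, DC locality),
# with a 1.6 multiplier for fringe benefits and overhead.
oversight_annual = round(200 * 60.40 * 1.6)  # $19,328 per year

total_annual = contractor_annual + oversight_annual  # $104,328
```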

15. Reasons for program changes or adjustments

This is a new information collection.

16. Tabulation, publication plans, and time schedules

An evaluation final report will present findings from the data collection efforts for which clearance is requested (that is, the text message survey and the follow-up survey), as well as the data collection efforts already approved by OMB. In the final report, all study research questions will be answered by integrating the findings from the implementation and outcomes analyses. The report will include analysis of the cadet tracking system data and administrative records. The results will be clearly and concisely presented, with appendices providing an appropriate level of technical information to document the rigor of the analyses. The final report will be available in spring 2020.

The data collection efforts included in this request (as well as the data collection efforts already approved by OMB) also might be used as part of one or more issue briefs or other types of dissemination products designed to share findings from the evaluation. These product(s) might either provide an overview of the evaluation findings or focus on a specific topic in greater detail. The details of the product(s) will be determined by the CEO at a later point; any such products that are produced will use nontechnical language and infographics.

17. Approval not to display the expiration date for OMB approval

The expiration date for OMB approval will be displayed.

18. Exception to the certification statement

No exceptions to the certification statement are requested or required.


1 Millenky, M., D. Bloom, S. Muller-Ravett, and J. Broadus. “Staying on Course: Three-Year Results of the National Guard Youth ChalleNGe Evaluation.” New York: MDRC, 2011.

2 The evaluation team also plans to collect (1) management information systems data from the three study sites and (2) administrative data from the National Student Clearinghouse and from state agencies for education-related outcomes and criminal justice system outcomes, respectively. Section A.4 explains how the follow-up data collection efforts for which clearance is requested do not duplicate effort given the availability of these other data sources.

3 The three National Guard Youth ChalleNGe Job ChalleNGe grantees began enrolling cohorts of participants in the Job ChalleNGe program starting with a cohort in January 2016. The second cohort began in July 2016 and the third in January 2017.



4 Goble, Lisbeth, Jillian Stein, and Lisa K. Schwartz. “Approaches to Increase Survey Participation and Data Quality in an At-Risk Youth Population.” Presented at the FedCASIC Conference, Washington, DC, March 2014.
