YCC_OMB Part A Justification_DOL CLEAN_CEO Updated_05302018

Youth CareerConnect Evaluation

OMB: 1290-0016


Evaluation of Youth CareerConnect


Office of Management and Budget SUPPORTING STATEMENT PART A


The Employment and Training Administration (ETA), in collaboration with the Chief Evaluation Office of the U.S. Department of Labor (DOL), is undertaking the Evaluation of Youth CareerConnect (YCC). The overall aims of the evaluation are to determine the extent to which the YCC program improves high school students’ education and employment outcomes and to assess whether program effectiveness varies by student and program characteristics. The evaluation will include impact and implementation studies. The impact study has two components: (1) a rigorous randomized controlled trial (RCT), in which program applicants were randomly assigned to a treatment group (who were able to receive YCC program services) or a control group (who were not); and (2) a quasi-experimental design (QED) component based on administrative school records.

With this package, clearance is requested for the follow-up survey data collection of study participants in the RCT. Prior clearance was received on March 20, 2015, for the YCC Participant Tracking System (PTS) under OMB Control No. 1291-0002 and on April 15, 2015, under OMB Control No. 1291-0003 for four data collection instruments related to the impact and implementation studies to be conducted as part of the evaluation.

A. Justification

1. Circumstances necessitating collection of information

a. The YCC program model


To provide high school students with the skills necessary to succeed in the labor market, in spring 2014, DOL awarded a total of $107 million to 24 grantees to implement the YCC program. The program is a high-school–based initiative aimed at improving students’ college and career readiness in particular employment sectors. Through partnerships with colleges and employers, the programs are redesigning the high school experience to provide skill-developing and work-based learning opportunities that will help students prepare for jobs in high-demand occupations.

YCC programs blend promising features of both career academy and sector-based models. Career academies embrace three core components: (1) a small learning community that links students and teachers in a structured environment; (2) a college preparatory curriculum based on a career theme that applies academic subjects to labor market contexts and includes work-based learning; and (3) employer, higher education, and community partners.1 Both experimental and quasi-experimental evaluations have found that career academies improved academic achievement and reduced high school dropout rates for disadvantaged students.2 In addition, career academies have been found to increase preparation for and graduation rates from college,3 as well as wages, hours worked, and employment stability.4 Qualitative evidence suggests that work-based learning helps students clarify their career goals.5

The YCC program model also draws from the success of more recent sector-based initiatives that align occupational training with employers’ needs,6 often within the workforce investment system. Using labor market statistics and information collected directly from employers, the programs identify the skills that employers need. Training providers and employers work collaboratively to develop training curricula tailored to specific job opportunities. Evaluations of sector-based programs have yielded promising findings. An experimental study of three relatively mature, sector-based programs estimated that adult participants in the programs earned about $4,500 (18 percent) more during the two-year period after they had enrolled in the study than did similar adults who did not participate in the program.7

DOL requires YCC students to have at least two years of high school remaining, so programs begin in grades 9 to 11. Figure A.1 presents a program logic model that guided the design of the evaluation. The program has five critical components: (1) integrated academic and career-focused curricula around one or more industry themes, (2) demonstrated strong engagement with employers, (3) individualized career and academic counseling that builds career and postsecondary awareness and exploration of opportunities beyond the high school experience, (4) work-based learning opportunities, and (5) a small learning community. These components are designed to increase program outputs, such as exposure to postsecondary education options, career-focused coursework, and internships. Ultimate outcomes include short- and long-term measures of student success in education, employment, and life stability.

Figure A.1. Logic model for YCC programs

GED = general equivalency degree.

b. Evaluation overview

DOL’s Chief Evaluation Office (CEO) undertakes a learning agenda process each year to identify Departmental priorities for program evaluations. This evaluation was prioritized as part of that process in FY2014. ETA, in collaboration with CEO, contracted with Mathematica and its subcontractor, SPR, to conduct this evaluation.

The evaluation will address three main research questions.

  1. What was the impact of the YCC programs on students’ short-term outcomes? How did participation in a YCC program affect students’ receipt of services and their educational experiences? Did YCC participation improve education, employment, and other outcomes in the three years after the students applied to the program?

  2. How were the YCC programs implemented? How were they designed? Did the grantees implement the YCC core components? What factors influenced implementation? What challenges did programs face in implementation, and how were those challenges overcome? Which implementation practices appear promising for replication?

  3. Did the effectiveness of YCC programs vary by student and grantee characteristics? What were the characteristics of the more effective YCC programs?

Both the rigorous RCT component (in which program applicants were randomly assigned to a treatment or control group in a subset of four districts, of which three will participate in the follow-up survey) and the QED component, which involves up to 16 districts, will address the first set of questions (baseline information collection was approved for this component under OMB Control No. 1291-0002). The QED component will also use administrative school records and propensity score methods to generate a comparison group and estimate the impact of YCC in up to 16 districts. This analysis will compare the outcomes of YCC students and similar non-YCC students in the same district. An implementation study of all 24 grantees will address the second set of questions (data collection for this study was approved under OMB Control No. 1291-0003). Integrating aspects of the impact and implementation studies will address the third set of questions and, in the process, will likely yield important insights into ways to improve programs.

c. Data collection overview

Understanding the effectiveness of the YCC program requires data collection from multiple sources. DOL has already received OMB approval to collect much of these data (see OMB Control Nos. 1291-0002 and 1291-0003). This current request is for follow-up survey data collection to support the RCT impact study.

Follow-up survey data. Self-reported follow-up survey data on YCC-related services (collected consistently for both treatment and control groups) will be used to estimate the impact of YCC on the number of career-focused courses completed, the receipt of work-based activities in high school (such as job shadowing, mentoring, internships, participation in training programs, and apprenticeships), and the receipt of career skill-building services (such as resume writing, interviewing, team building, and information on postsecondary schools). This information will be used to assess the intensity of services received by the treatment group and to document the study counterfactual as measured by the services received by the control group.

Key follow-up outcomes will be constructed using student follow-up survey data. These outcomes will include measures of (1) education success (such as school and program retention, attendance and behavior, school engagement and satisfaction, test scores and proficiency, postsecondary credits and the number of Advanced Placement classes, and educational expectations), (2) employment success (such as work-readiness skills, work experience in paid and unpaid jobs, employment expectations, and knowledge of career options), and (3) life stability (such as drug use and involvement with the criminal justice system).

Study participants will be asked to provide assent. During the baseline data collection, assent was collected via hard copy; however, assent has been programmed into the follow-up survey to reduce security concerns associated with hard copy form returns.

2. Purposes and use of the information

Clearance is being requested for the follow-up survey data collection. The section below describes how, by whom, and for what purpose the information will be collected.

Study outreach materials

Throughout the follow-up data collection process, study participants will receive correspondence about their participation via both mail and email. An advance letter will be sent to all study participants at the onset of data collection. The letter will include the survey link and the participant’s username and password for logging into the instrument. The letter will reiterate basic information about the study (which participants received when they enrolled), including details about study sponsorship, the optional and protected nature of their responses, and the incentive payment. Also, the advance letter will provide contact information in case participants have questions about the study or the survey. Approximately one week after the advance letter has been mailed, an invitation email containing similar text will be sent to all participants whose email addresses are on file.

Reminder letters will be sent to all study participants periodically throughout the field period. These letters will include text similar to the advance letter’s, including the survey link and the participant’s username and password, study information, incentive payment information, and study team contact information.

Student follow-up survey


Student-reported data from the follow-up survey will be used to construct key outcomes for the study. The survey will include some of the same items as the student baseline survey but will reference the current school year to capture changes in responses over time. Also, it will include questions that focus on experiences in high school. The follow-up survey will provide a snapshot of information about students’ school experiences, school engagement, expectations for education, and activities in and outside of school approximately two years after random assignment in the study.

The sample for the follow-up survey will consist of approximately 540 study participants (treatment and control group members) at the three RCT districts included in this data collection effort. The survey will be conducted in 2018, approximately 24 months after study enrollment in YCC, for a period of approximately 16 weeks. We expect to achieve an 80 percent response rate. The approach used to achieve this response rate will be similar to that used for Mathematica’s YouthBuild Evaluation, conducted for DOL. That evaluation included a sample of youth similar to the YCC samples, and it achieved 81 percent and 80 percent response rates on the 12-month and 30-month surveys, respectively.8

To help achieve this target response rate, the planned approach for collecting YCC follow-up survey data includes strategies to foster flexibility and maximize efficiency. A multimode approach will use three phases of data collection, supported by technology as described below. This approach will maximize efficiency by encouraging early survey completion using the modes that require the fewest resources (web and computer-assisted telephone interviewing (CATI) call-ins) before moving to modes that require increased interviewer labor (CATI call-outs and field locating). Also, offering completion options via multiple modes provides study participants with the flexibility to complete the web survey or call the survey operations center (SOC) at their convenience.

A second strategy employed to reach the targeted response rate will be to communicate with study participants frequently and clearly. For example, an advance letter on DOL letterhead will explain and legitimize the study, emphasize its importance, and encourage participation. Additionally, the outreach strategy includes multiple rounds of reminder letters, emails, and calls to respond to participant concerns and persuade them to complete the survey. Outreach materials are included in the supporting documents.

Finally, a critical component of the strategy to maximize response rates is to design a clear, streamlined survey instrument. Simplicity is important to help respondents understand the data being collected and facilitate a timely response. The survey is expected to take 30 minutes to complete.

The follow-up survey data elements to be collected include the following:

  1. Student assent. The first screen of the follow-up survey will include a statement of assent for the student. The text on this screen is the exact text distributed in hardcopy during the sample intake period (approved under OMB No. 1291-0003).

  2. Identifying information. These data items include student name and date of birth, which are necessary for verification of identity for the follow-up survey.

  3. Academics and career preparation. These data items include the student’s high school enrollment status, satisfaction with school, participation in school-based activities related to preparation for postsecondary education, participation in school-based activities related to workforce preparation, participation in school-based supportive services, and measures of the student’s participation in school-organized extracurricular activities, school behavior, and hours spent doing homework.

  4. Work experience. This section of the survey collects information on work experience, including whether the students ever worked for pay, and, if they did, when the work occurred, whether the jobs were arranged through school, and what type of work was performed.

  5. Student opinions and experiences. These questions capture other student outcomes, including measures of student motivation, maternity/paternity, arrests, and history of alcohol and drug use.

  6. Contact information. The final section collects contact information for use in mailing the incentive payment to the participant.

3. Use of technology to reduce burden

The data collection efforts will use advanced technology to reduce the burden on participants. The follow-up survey will be conducted using a multimode approach. This process will begin primarily on the web and involve more intensive strategies (CATI and in-person locating) as part of nonresponse follow-up. The web and CATI versions of the surveys will be identical and will include identity verification questions to confirm that the correct study participant completes the survey. Surveys will also include skip patterns to ensure that respondents answer only the questions that pertain to their experiences. Using web surveys reduces the amount of interviewer labor necessary to complete data collection and gives respondents flexibility in when they complete the survey. Furthermore, respondents can complete the survey incrementally, in more than one sitting, if necessary. Study participants who do not have Internet access will be directed to complete the survey by phone using CATI. For participants who cannot be located by telephone, the research team’s specialized locating staff will use contact information collected during the baseline effort, searchable databases, directory assistance services, and reverse directories to obtain current contact information. If these electronic locating efforts are unsuccessful, trained field locators will attempt to locate study participants using their most recently known residential addresses. Field locators, equipped with cell phones, will dial into the research team’s SOC to allow the participants they locate to complete a CATI survey.

4. Efforts to avoid duplication of effort

To minimize duplication of effort in data collection, the collection activities focus on the key information to be gained from each data source and include only items that are necessary for the evaluation and that are not readily available through other sources. For example, the follow-up survey includes required items that are not available through the YCC program’s Participant Tracking System (PTS) or administrative school records. Although data collected through the PTS (approved under OMB No. 1291-0002) contains information such as demographics, service receipt, and work experience, it is collected for program participants only; as a result, this information must be obtained in the follow-up survey as well, to ensure that equivalent information is gathered for both treatment and control students.

5. Methods of minimizing burden on small entities

The data collection effort does not involve small businesses or other small entities.

6. Consequences of not collecting data

The study’s ability to evaluate YCC program effectiveness would be limited if follow-up survey data were not collected on study participants. The follow-up surveys measure various outcomes for both treatment and control groups (for example, expectations for educational attainment, attitudes about school, behavior, and receipt of YCC-like services such as workforce preparation activities) that are unavailable in other data sources. Without these self-reported data, the study would not be able to estimate YCC’s effects on these important outcomes.

7. Special circumstances

No special circumstances are involved with the collection of information.

8. Federal Register announcement and consultation

a. Federal Register announcement

A 60-day notice to solicit public comments was published in the Federal Register, Volume 82, No. 190, page 46090, on October 3, 2017. No comments were received.

b. Consultations outside the agency

No experts who are not directly involved in the study were consulted regarding the subject of this clearance. Experts were consulted for other aspects of the evaluation design and impact evaluation.

c. Unresolved issues

There are no unresolved issues.

9. Payment or gift to respondents

Study participants will receive a respondent payment for completing the follow-up survey. These payments are essential to maximize the response rate. Encouraging participants to complete their survey early helps to limit the amount of evaluation team labor required during the later stages of data collection (CATI outgoing calls and field locating). Thus, study participants who complete their survey online or by calling into the SOC within the first four weeks will receive $40, and those who complete the survey thereafter will receive $25, irrespective of mode. The YouthBuild Evaluation, conducted for DOL, tested the effectiveness of this incentive model and found that study participants who were offered the $40 incentive had a 38 percent higher probability of completing the survey within the first four weeks than those who were offered the $25 incentive.9

10. Assurances of privacy

The contractor complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the study is being conducted in accordance with all relevant regulations and requirements. The contractor secures personally identifiable information and other sensitive project information and strictly controls access on a need-to-know basis.

At the time of their enrollment in the study, all study participants received information about the study’s intent to keep information private to the extent permitted by law. Before the students were randomly assigned to the treatment or control group, all parents or guardians signed a consent form (approved in the prior package) that included this privacy information and explained that the study participants would be asked to participate in voluntary surveys. Participants were told that all information provided would be kept private and used for research purposes only. Further, participants were assured that they would not be identified in any way in reports or communications with DOL.

11. Additional justification for sensitive questions

The follow-up survey will include verification questions, such as the student’s date of birth. The responses to these questions will be validated against data collected during study enrollment to ensure that the survey is being administered to the intended participant.

The follow-up survey does include questions on topics that some respondents might find sensitive, such as delinquent activities, including arrests and drug use. Collecting this information is critical for the evaluation, and it cannot be obtained readily through other sources. Because they are predictors of dropping out of high school, involvement with the criminal justice system and drug use are key outcomes in the impact study. Similar questions were included in the baseline information forms previously collected and in past studies without any evidence of significant harm.

As described earlier, all study participants were assured that their information would be kept private before random assignment began and study enrollment forms were completed. Participants were informed that not all data items had to be completed and that all data would be held in the strictest confidence and reported in aggregate, summary format, eliminating the possibility of individual identification. This same language will be included in communications related to the follow-up survey. Also, the evaluation has obtained a Certificate of Confidentiality from the National Institutes of Health for the enrollment and baseline process, which will be amended to cover follow-up survey activities before they begin. The certificate assures participants that their information will be kept private to the fullest extent permitted by law.

12. Estimates of hours burden

Table A.1 describes the assumptions made about the annual number of responses expected, the average hours of burden per respondent, and the burden costs estimated for the follow-up survey. All estimates assume a three-year clearance.

Table A.1. Estimated respondent hour and cost burden associated with follow-up data collection

| Type of instrument | Annual number of respondents^a | Number of responses per respondent | Annual number of responses | Average burden hours per response | Annual estimated burden hours | Average hourly rate^b | Total cost burden |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Student follow-up survey (including assent) | 144 | 1 | 144 | 0.5 | 72 | $7.25 | $522.00 |
| Total | 144 | 1 | 144 | 0.5 | 72 | $7.25 | $522.00 |

a The figure corresponds to 80 percent of the total treatment and control group students in the three RCT districts.

b The hourly wage of $7.25 is the federal minimum wage (effective July 24, 2009): http://www.dol.gov/dol/topic/wages/minimumwage.htm.

13. Estimate of total annual cost burden to respondents or record keepers

There are no direct financial costs to respondents, and they will incur no start-up or ongoing financial costs. The cost to respondents involves solely the time required for the follow-up survey and consent form (one district) and the district staff time associated with data requests for the study. To limit the burden on district staff, we will work within the confines of their existing data system structure; also, payment will be provided to districts that requested financial compensation. The costs are captured in the burden estimates in Table A.1.

14. Estimates of annualized cost to the federal government

The total annualized cost to the federal government is $52,071.33. Costs result from the following categories:

  1. The estimated cost to the federal government for the contractor to carry out this study is $98,230 for follow-up survey data collection and student records collection. Annualized over three years of data collection, this comes to $32,743.33 ($98,230/3= $32,743.33).

  2. The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $19,328. We expect the annual level of effort to perform these duties will require 200 hours for one Washington, DC–based federal GS-14, step 4, employee earning $60.40 per hour. (See the Office of Personnel Management 2018 hourly salary table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2018/DCB_h.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus, 200 hours x $60.40 x 1.6 = $19,328.
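As a cross-check, the burden and cost arithmetic in sections 12 and 14 can be reproduced in a few lines. Every input below is a figure stated in this supporting statement; the script is only a verification aid, not part of the clearance request.

```python
# Check of the burden and cost arithmetic reported in sections 12 and 14.
# All inputs are figures stated in this supporting statement.

# Section 12 (Table A.1): respondent hour and cost burden
respondents = 144            # annual respondents (80 percent of the RCT sample, annualized)
hours_per_response = 0.5     # the survey is expected to take 30 minutes
hourly_rate = 7.25           # federal minimum wage

burden_hours = respondents * hours_per_response   # 72.0 hours per year
cost_burden = burden_hours * hourly_rate          # $522.00 per year

# Section 14: annualized cost to the federal government
contractor_total = 98_230                  # contractor cost over three years
contractor_annual = contractor_total / 3   # about $32,743.33 per year
staff_annual = 200 * 60.40 * 1.6           # 200 hours, GS-14 step 4, 1.6 overhead factor

total_annual = contractor_annual + staff_annual

print(burden_hours, round(cost_burden, 2))   # 72.0 522.0
print(round(contractor_annual, 2))           # 32743.33
print(round(staff_annual, 2))                # 19328.0
print(round(total_annual, 2))                # 52071.33
```

The computed totals match the $52,071.33 annualized figure reported at the start of this section.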

15. Reasons for program changes or adjustments

This is a new information collection.

16. Tabulation, publication plans, and time schedules

This data collection will contribute to the final report and short reports. The final report, which will cover all findings from the impact and implementation study portions of the evaluation, will be available in December 2019.

17. Approval not to display the expiration date for OMB approval

The expiration date for OMB approval will be displayed.

18. Exception to the certification statement

No exceptions to the certification statement are requested or required.

1 National Career Academy Coalition. “National Standards of Practice for Career Academies.” National Career Academy Coalition, April 2013. Available at http://www.ncacinc.com/sites/default/files/media/documents/nsop_with_cover.pdf. Accessed September 1, 2017.

Brand, Betsy. “High School Career Academies: A 40-Year Proven Model for Improving College and Career Readiness.” American Youth Policy Forum, November 2009. Available at http://www.aypf.org/documents/092409CareerAcademiesPolicyPaper.pdf. Accessed September 1, 2017.

Stern, David, Charles Dayton, and Marilyn Raby. “Career Academies: Building Blocks for Reconstructing American High Schools.” Career Academy Support Network, October 2000. Available at http://casn.berkeley.edu/resource_files/bulding_blocks.pdf. Accessed September 1, 2017.

2 Kemple, James J. “Career Academies: Impacts on Labor Market Outcomes and Educational Attainment.” MDRC, March 2004. Available at http://www.mdrc.org/sites/default/files/full_49.pdf.

Kemple, James J., and Jason Snipes. “Impacts on Students’ Engagement and Performance in High School.” New York: MDRC, March 2000. Available at http://www.mdrc.org/sites/default/files/full_45.pdf. Accessed September 1, 2017.

Maxwell, Nan L., and Victor Rubin. High School Career Academies: Pathways to Educational Reform in Urban Schools? Kalamazoo, MI: W.E. Upjohn Institute for Employment Research, 2000.

Stern, David, Marilyn Raby, and Charles Dayton. Career Academies: Partnerships for Reconstructing American High Schools. San Francisco, CA: Jossey-Bass, 1992.

3 Maxwell, Nan L. “Step-to-College: Moving from the High School Career Academy Through the Four-Year University.” Evaluation Review, vol. 25, no. 6, December 1, 2001, pp. 619–654.

4 Ibid.

Maxwell, Nan L., and Victor Rubin. “High School Career Academies and Post-Secondary Outcomes.” Economics of Education Review, vol. 21, no. 2, April 2002, pp. 137–152.

5 Haimson, Joshua, and Jeanne Bellotti. “Schooling in the Workplace: Increasing the Scale and Quality of Work-Based Learning.” Princeton, NJ: Mathematica Policy Research, January 22, 2001. Available at https://www.mathematica-mpr.com/~/media/publications/PDFs/schooling.pdf.

6 Greenstone, Michael, and Adam Looney. “Building America’s Job Skills with Effective Workforce Programs: A Training Strategy to Raise Wages and Increase Work Opportunities.” Washington, DC: Brookings Institution, November 30, 2011. Available at https://www.brookings.edu/research/building-americas-job-skills-with-effective-workforce-programs-a-training-strategy-to-raise-wages-and-increase-work-opportunities/.

Maguire, Sheila, Joshua Freely, Carol Clymer, Maureen Conway, and Deena Schwartz. “Tuning In to Local Labor Markets: Findings from the Sectoral Employment Impact Study.” Philadelphia, PA: Public/Private Ventures, 2010. Available at http://ppv.issuelab.org/resources/5101/5101.pdf.

Woolsey, Lindsey, and Garrett Groves. “State Sector Strategies Coming of Age: Implications for State Workforce Policymakers.” Washington, DC: National Governors Association, 2013. Available at https://www.nga.org/files/live/sites/NGA/files/pdf/2013/1301NGASSSReport.pdf.

7 Ibid.

8 Miller, Cynthia, Megan Millenky, Lisa Schwartz, Lisbeth Goble, and Jillian Stein. “Building a Future: Interim Impact Findings from the YouthBuild Evaluation.” Washington, DC: MDRC in partnership with Mathematica Policy Research, November 2016. Available at https://www.mdrc.org/sites/default/files/YouthBuild_Interim_Report_2016_FR.pdf. Accessed September 1, 2017.

9 Goble, Lisbeth, Jillian Stein, and Lisa K. Schwartz. “Approaches to Increase Survey Participation and Data Quality in an At-Risk Youth Population.” Presented at the FedCASIC Conference, Washington, DC, March 19, 2014. Available at https://www.census.gov/fedcasic/fc2014/ppt/02_goble.pdf. Accessed September 1, 2017.



