National Evaluation of Round 4 of the Trade Adjustment Assistance Community College and Career Training Grant Program

OMB: 1291-0011

Part B: Statistical Methods

Part B of the Supporting Statement for the National Evaluation of Round 4 of the Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grants Program – sponsored by the Department of Labor (DOL) Chief Evaluation Office (CEO) in partnership with the Employment and Training Administration (ETA) – considers the issues pertaining to Collections of Information Employing Statistical Methods. CEO and ETA contracted with Abt Associates, Inc. to conduct the study. The study will evaluate the national TAACCCT grant program, using a multi-pronged approach including (1) an Outcomes Study of selected Round 4 grantees, (2) an implementation analysis, (3) a study of employer relationships, and (4) a synthesis of the independent third-party evaluations that all Round 4 grantees are required to procure.

The TAACCCT grant program provides community colleges and other eligible institutions of higher education with funds to expand and improve their ability to deliver education and career training programs that can be completed in two years or less and are suited for workers who are eligible for training under the Trade Adjustment Assistance for Workers (TAA) Program. DOL awarded 49 grants in Round 1 in FY 2011, 72 in Round 2 in FY 2012, 57 grants in Round 3 in FY 2013, and 71 grants in Round 4 in FY 2014.

For the Outcomes Study, the evaluation team recommended and DOL selected a purposive sample of nine Round 4 grantees. Within these grantees, the evaluation team further identified and recommended 20 programs for the study that comprise all TAACCCT-supported programs that meet the study selection criteria described below. All students who enroll in these programs during the 2016-17 academic year will be invited to participate in the Outcomes Study. Baseline data will be collected on students who consent to participate. Administrative and follow-up survey data will be collected to measure participant outcomes. To support the implementation analysis, the evaluation team will administer a survey to all colleges involved in the 71 Round 4 grants. To learn more about how grantees develop strong partnerships with employers, the evaluation team will interview 50 purposively sampled employers.

Data collection efforts previously approved for the TAACCCT Round 4 Evaluation under OMB Control Number 1291-0004 include a baseline information form completed by study participants in the grantees selected for the Outcomes Study and semi-structured discussion guides for use on implementation research visits to the nine Outcomes Study sites. These data collection activities will continue under the previously approved request.1

This submission seeks clearance for four data collection instruments:

  • Follow-up Survey (Outcomes Study)

  • Participant Tracking Form (Outcomes Study)

  • College Survey (implementation study)

  • Employer Interview (employer relationship study)

The College Survey is very similar to one previously approved by OMB for colleges participating in Rounds 1 through 3 of the TAACCCT Grants Program (OMB Control No. 1291-007, expiration date 10/31/2018). Questions regarding employer relationships were added to ensure that the instrument covers priorities outlined in the Round 4 Solicitation for Grants Announcement (SGA). Data from the college survey will contribute to the implementation analysis and provide information for selecting employers for interviews as a part of the employer relationships study.

The Employer Interviews will be one-hour semi-structured phone interviews with Human Resources managers at companies that partner closely with Round 4 grantees.

B.1 Respondent Universe and Sampling Methods

Exhibit B.1 shows the respondent universe and target response rate for each data collection item (the follow-up survey, participant tracking form, college survey, and employer interviews). As shown, DOL’s expected response rates range from 80 percent for the follow-up survey and employer interviews to 90 percent for the college survey. The 12-month follow-up survey was not administered to previous rounds of TAACCCT participants. The college survey was administered to previous rounds of TAACCCT grantees; the response rate for Round 2 was 97 percent. Like the 12-month survey, the employer interviews are new to the evaluation of Round 4 TAACCCT grantees. The expected 80 percent response rate is based on previous experience recruiting interviewees for similar initiatives (e.g., Pathways for Advancing Careers and Education, Green Jobs and Health Care Impact Evaluation, Health Profession Opportunity Grants Impact Evaluation).

Below is additional information about the respondent universe and sampling methods.

Exhibit B.1: Sample Sizes and Response Rates

Data Collection     | In Sample | Completed | Response Rate
--------------------|-----------|-----------|--------------
Follow-up survey    | 5,000     | 4,000     | 80%
Tracking form       | 5,000     | N/A       | N/A
College survey      | 272       | 245       | 90%
Employer interviews | 50        | 40        | 80%



Follow-Up Survey

The potential respondent universe for the 12-month follow-up survey is participants who enroll in TAACCCT-supported programs at nine grantees and who consent to be in the study. The evaluator selected 20 programs using the following criteria:

  • Programs anticipate enrolling at least 50 students during the 2016-2017 academic year;

  • Programs are at grantees where total anticipated enrollment across all of the grantee’s eligible programs is at least 200 students during the 2016-2017 academic year;

  • Programs are designed to be completed within one academic year of enrollment, with a preference for those designed to be completed within 6 months;

  • TAACCCT funding represents a large share of total funding for the program design, redesign, or enhancement; and

  • The program results in a credential that meets the criteria developed by ETA.2

Participant Tracking Form

The respondent universe is identical to that for the follow-up survey. The purpose of the tracking form is to boost the response rate for the follow-up survey; as such, there will be no subsampling and no target response rate.

College Survey

The respondent universe for the online college survey comprises all 272 colleges that are part of the 71 Round 4 grantees. As noted, the expected response rate is 90 percent. The list of primary (and secondary, if available) respondent names and emails from each college will be developed in collaboration with DOL staff.

Employer Interviews

The purpose of the employer study is to better understand how strong employer relationships work and what lessons can be gleaned to encourage and support successful employer relationships through DOL programs. Using data on employers collected in the college survey, the evaluation team will identify a sample of 50 employers that have strong relationships with their TAACCCT colleges and that represent a range of industries and occupations of training, new versus incumbent worker training, employer sizes, and levels of previous involvement with the college.

B.2 Procedures for Collection of Information

B.2.1 Sample Design

Follow-Up Survey

The evaluation team expects the Outcomes Study sample to include approximately 5,000 individuals who will have enrolled in one of the 20 TAACCCT-funded programs at one of the nine study TAACCCT grantees. As noted in B.1, the evaluation team recommended and DOL selected nine grantees and 20 programs for inclusion in the Outcomes Study. All individuals who enroll in one of these programs during the 2016-2017 academic year and consent to be in the study will be in the sample. A year after study enrollment, the evaluation team will contact all participants for a follow-up interview. No probability sampling will be conducted.

Participant Tracking Form

The sample will be all study participants.


College Survey

All 272 colleges that are part of Round 4 TAACCCT grants will be surveyed.


Employer Interviews

The evaluation team will use the college survey to draw the sample. Survey respondents will be asked to name up to five employers that they see as their strongest partners. They will then be asked to provide additional information on each of these employers, including:

  • Previous relationships with the employer;

  • Whether they are sending their current employees for TAACCCT-supported training;

  • Whether they are hiring graduates from TAACCCT-supported training;

  • Types of activities employers have participated in; and

  • The roles of the employers.

The evaluator will determine the strength of employer relationships by using the list of employer activities in the Round 4 SGA to identify employers that participate in multiple activities, then further refining the list using a scale developed by Jobs for the Future to identify strong employer partnerships.3 The evaluator will categorize potential employer respondents by the key characteristics for which a range of respondent types is desired, as sketched below.
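To make the planned selection logic concrete, the sketch below scores employers by the number of SGA activities in which they participate and then groups strong partners by the key characteristics listed above. It is purely illustrative: the activity names, the three-activity threshold, and the Employer fields are hypothetical stand-ins for the actual SGA activity list and the Jobs for the Future scale.

```python
from dataclasses import dataclass, field

# Hypothetical activity names; the real list comes from the Round 4 SGA.
SGA_ACTIVITIES = {
    "curriculum_design", "equipment_donation", "work_based_learning",
    "instructor_support", "hiring_graduates", "incumbent_worker_training",
}

@dataclass
class Employer:
    name: str
    industry: str
    size: str                   # e.g., "small", "medium", "large"
    trains_incumbents: bool     # incumbent vs. new worker training
    activities: set = field(default_factory=set)

def shortlist(employers, min_activities=3):
    """Keep employers engaged in multiple SGA activities, then group them by
    the characteristics across which a range of respondents is desired."""
    strong = [e for e in employers
              if len(e.activities & SGA_ACTIVITIES) >= min_activities]
    strata = {}
    for e in strong:
        strata.setdefault((e.industry, e.size, e.trains_incumbents), []).append(e)
    return strata

pool = [
    Employer("Acme Mfg", "manufacturing", "large", True,
             {"curriculum_design", "hiring_graduates", "work_based_learning"}),
    Employer("CareWell", "healthcare", "small", False, {"hiring_graduates"}),
]
print(shortlist(pool))  # only Acme Mfg clears the three-activity threshold
```

In practice the threshold and strata would follow the SGA activity list and the JFF scale rather than the placeholder values shown here.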

B.2.2 Estimation Procedures

Follow-Up Survey

The Outcomes Study will address the following research questions:4

  1. What are the characteristics of TAACCCT participants in our study sample? How do they vary across different TAACCCT-supported programs and colleges in the sample?

  2. In which features of training programs and services do TAACCCT participants engage? What college and partner (e.g., workforce agency) services do they receive? What are their assessments of the services they receive? How does this differ across grants?

  3. What educational outcomes do TAACCCT participants in short-term occupational training programs achieve? Do they see themselves on a career pathway? What are their plans for future training? What is the expected timeline?

  4. Do participants obtain employment? If so, is this employment in the occupation for which they trained? What are their earnings, and how do they compare to earnings prior to entering TAACCCT-supported programs? What are the characteristics of their jobs?

  5. Which subgroups of participants are most likely to complete TAACCCT-funded programs and begin jobs in their occupation of training?

  6. Which types of TAACCCT programs and program features are associated with improved educational and employment outcomes for participants?5

  7. Did household income change? Are participants less likely to receive public assistance at follow-up than at baseline?

These research questions will be answered primarily with simple weighted tabulations of means and percentages along with corresponding estimated standard errors, with the exception of participant earnings, which will be measured using quarterly wage data from the National Directory of New Hires (NDNH). All non-earnings outcomes will be measured with follow-up survey data; weights will reflect modeled probabilities of response to the survey. No weights will be used for the NDNH outcomes. The evaluator will use standard hypothesis tests to determine whether changes over time and differences in observed outcomes for different groups are statistically significant.
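As a concrete illustration of these tabulations, the short sketch below computes a nonresponse-weighted mean with a linearization-style standard error and a two-sided z-test of the difference between two subgroups. It is a minimal sketch, not the evaluator’s production code, and the column names ("employed", "weight", "female") are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy.stats import norm

def weighted_mean_se(y, w):
    """Weighted mean with a with-replacement (linearization) standard error."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    mean = np.average(y, weights=w)
    n = len(y)
    var = n / (n - 1) * np.sum((w * (y - mean)) ** 2) / np.sum(w) ** 2
    return mean, np.sqrt(var)

# Hypothetical follow-up records: outcome, nonresponse-adjusted weight, subgroup.
df = pd.DataFrame({
    "employed": [1, 0, 1, 1, 0, 1, 1, 0, 1, 1],
    "weight":   [1.2, 0.9, 1.0, 1.4, 1.1, 0.8, 1.3, 1.0, 0.9, 1.2],
    "female":   [1, 1, 0, 1, 0, 0, 1, 0, 1, 1],
})

g1, g0 = df[df.female == 1], df[df.female == 0]
m1, se1 = weighted_mean_se(g1.employed, g1.weight)
m0, se0 = weighted_mean_se(g0.employed, g0.weight)
z = (m1 - m0) / np.hypot(se1, se0)   # standard two-group z statistic
p = 2 * norm.sf(abs(z))              # two-sided p-value
print(f"difference = {m1 - m0:+.3f}, z = {z:.2f}, p = {p:.3f}")
```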

Because the evaluator will make simple comparisons of means, there are no planned covariates. The key outcome variables are defined by the research questions above; stated briefly, they are educational progress and earnings. The subgroups are not known at this time; they will be defined in terms of variables measured in the Baseline Information Form administered at study intake, and will be determined empirically in the course of answering research question 5 above.

Item nonresponse will be imputed using the SAS MI procedure with fully conditional specification (FCS) methods,6 where each outcome is imputed based on models defined in terms of the subgroups of interest. Nonresponse-adjusted weights will be used to compensate for unit nonresponse.
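The sketch below illustrates the flavor of FCS imputation in Python. The study itself will use the SAS MI procedure; scikit-learn’s IterativeImputer is a chained-equations imputer shown here only as a rough analogue, and the variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Hypothetical data: a baseline subgroup flag and two outcomes with item
# nonresponse (NaN). Including the subgroup flag as a predictor lets imputed
# values vary across the subgroups of interest, as the plan describes.
df = pd.DataFrame({
    "female":      [1, 0, 1, 1, 0, 0, 1, 0],
    "credential":  [1, 0, np.nan, 1, np.nan, 0, 1, 1],
    "earnings_q4": [3100, 2500, 2800, np.nan, 2200, np.nan, 3600, 2700],
})

imputer = IterativeImputer(sample_posterior=True, max_iter=10, random_state=2016)
imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
# A binary outcome such as "credential" would in practice be rounded or imputed
# with a classification model; SAS's FCS methods handle such variables directly.
print(imputed.round(2))
```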

Participant Tracking Form

The data will not be analyzed.

College Survey

The college survey will be used to develop an inventory of grantee goals, activities, project context, and future project plans, not to make statistical inferences about these efforts. The data analysis will be descriptive.


Employer Interviews

The employer interviews are intended to develop a better understanding of how strong employer relationships work, not to make statistical inferences about these efforts. Qualitative data will be collected, and the analysis will be descriptive.


B.2.3 Degree of Accuracy Required

The report will include a number of comparisons between subgroups. Each comparison involves a target group and a reference group, and questions will be of the form: is the average outcome better in the target group than in the reference group? “Better” will be defined in terms of making greater progress in completing training and achieving higher post-training earnings. The target and reference groups will be defined in terms of gender, prior education, race, or other variables measured in the Baseline Information Form (BIF). They could also be defined in terms of time, as in the comparison of pre- and post-training earnings.

The minimum detectable difference (MDD) is the smallest true difference between subgroups that a study will be able to detect at specified levels of power and statistical significance. Power refers to the probability of detecting a statistically significant difference of a given size when it exists (i.e., avoiding a Type II error); it is typically set to 80 percent. The statistical significance level of a hypothesis test is the probability of rejecting the hypothesis of no difference when that hypothesis is correct (i.e., making a Type I error). The standard for statistical significance in the TAACCCT Outcomes Study will be 0.05. Two-sided tests will be used because there is no prior information about which programs lead to more educational progress and higher post-training earnings, or about which types of students will benefit more from them.

Exhibit B.2 shows that the Outcomes Study will be able to detect a difference between a target group and a reference group in the percentage making substantial educational progress (e.g., receiving specified credentials) as small as 8.9 percentage points if the two groups each have a sample size of 500, or 13.9 percentage points if each has a sample size of 200. The corresponding MDDs for quarterly earnings are $620 and $980. Statements that outcomes are better or worse in the target group than in the reference group will be made only if the estimated difference in outcomes between the two groups is statistically significant.

Differences of this magnitude would likely be of interest to policy makers and practitioners. In examining risk factors for college success, Fein (2016) found many ways to split the population of ever-enrolled students into groups with very different rates of receiving credentials. For example, Fein found a 21-point difference by self-reported high school grade average (As versus Cs or lower); a 9-point difference by expected need (at enrollment) to work while attending school (fewer than 20 hours per week versus 35 or more hours per week); and a 13-point difference by self-assessed academic discipline (top quartile versus bottom quartile). Exhibit B.3 shows non-experimental estimates of returns to education of various sizes for similar populations. One study shows a return to short-term credentials greater than $620 per quarter, and several show similar or higher returns to longer-term credentials (the Outcomes Study includes both short-term and longer-term (one-year) programs).

Exhibit B.2: Minimum Detectable Differences (MDDs) for Comparing a Target Group to a Reference Group at each Grantee (with assumptions)

Statistic                                                          | Percent with Substantial Educational Progress | Average Quarterly Earnings
-------------------------------------------------------------------|-----------------------------------------------|---------------------------
MDD given two groups with n=500 each                               | 8.9 p.p.                                      | $620
MDD given two groups with n=200 each                               | 13.9 p.p.                                     | $980
Mean in reference group                                            | 50.0                                          | $2,600
Standard deviation in reference group                              | 50.0                                          | $3,500
Threshold p-value for statistical significance with two-sided test | 0.05                                          | 0.05
Power                                                              | 0.80                                          | 0.80

Note: The assumptions about earnings are based on the fourth quarter following randomization in the control group for the evaluation of HPOG, a grants program of the Administration for Children and Families (Harvill et al., 2015). The evaluation team determined this was the most relevant information available on the variance of post-training earnings of low-skill adults.

The MDD on quarterly earnings for two groups with $n = 500$ each was calculated as

$$\mathrm{MDD} = \left(z_{0.975} + z_{0.80}\right)\,\sigma\,\sqrt{\frac{1}{n_1} + \frac{1}{n_2}} = 2.80 \times 3{,}500 \times \sqrt{\frac{2}{500}} \approx \$620.$$

The MDD for substantial educational progress was defined as the smallest value of $\delta$ such that

$$\delta \ge \left(z_{0.975} + z_{0.80}\right)\sqrt{\frac{p(1-p)}{n_1} + \frac{(p+\delta)(1-p-\delta)}{n_2}},$$

where $p = 0.50$ is the assumed rate in the reference group.
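As a check on Exhibit B.2, the short script below applies the standard two-sample normal approximation (alpha = .05 two-sided, power = .80), using the simpler equal-variance form for the binary outcome. It is an illustration rather than the evaluator’s exact calculation, so it reproduces the exhibit only to within rounding.

```python
from scipy.stats import norm

Z = norm.ppf(0.975) + norm.ppf(0.80)  # approximately 2.80

def mdd(sd, n_per_group):
    """Minimum detectable difference for two independent groups of equal size."""
    return Z * sd * (2 / n_per_group) ** 0.5

for n in (500, 200):
    print(f"n={n}: education {100 * mdd(0.5, n):.1f} p.p., "
          f"earnings ${mdd(3500, n):,.0f}")
# n=500: education 8.9 p.p., earnings $620
# n=200: education 14.0 p.p., earnings $981  (exhibit: 13.9 p.p., $980)
```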

Participant Tracking Form

These data will not be analyzed.

College Survey

For the college survey, no statistical techniques will be used to draw inferences.


Employer Interviews

For the employer interviews, no statistical techniques will be used to draw inferences.



Exhibit B.3: Recent Non-Experimental Estimates of Returns to Training, by Length of Training


Publication                                     | Short-term certificate/certificate | Long-term certificate/diploma | Associate degree
------------------------------------------------|------------------------------------|-------------------------------|------------------
Bahr et al., 2015 (MI) – Males                  | 1,345** (572.6)                    | 918.0*** (276.5)              | 1,441*** (162.8)
Bahr et al., 2015 (MI) – Females                | 267.6 (232.2)                      | 619.6** (262.1)               | 2,346*** (139.0)
Dadgar and Trimble, 2015 (WA) – Males           | -3,861 (2,397)                     | 2,963** (1,096)               | 3,667*** (743.6)
Dadgar and Trimble, 2015 (WA) – Females         | -519.8 (1,112)                     | 6,069*** (1,216)              | 4,207*** (425.4)
Jepsen, Troske, and Coomes, 2014 (KY) – Males   | 297* (160)                         | 1,265*** (183)                | 1,484*** (149)
Jepsen, Troske, and Coomes, 2014 (KY) – Females | 299*** (73)                        | 1,914*** (110)                | 2,363*** (81)
Belfield, Liu and Trimble, 2014 (NC) – Males    | 476*** (54.54)                     | 564*** (85.91)                | 1,133*** (43.44)
Belfield, Liu and Trimble, 2014 (NC) – Females  | 157*** (36.47)                     | 1,565*** (50.05)              | 1,907*** (30.28)

Notes: Columns show the earnings return by credential earned; standard errors are in parentheses. Earnings returns are quarterly for all studies reported except Dadgar and Trimble (2015), which reports annual returns.
Bahr et al. (2015) describe a short-term certificate as a program requiring fewer than 15 credit hours and a long-term certificate as a program requiring 15 or more credit hours to complete. Dadgar and Trimble (2015) describe a short-term certificate as a program requiring less than one year of full-time study and a long-term certificate as a program requiring more than one year of full-time study to complete. Jepsen, Troske, and Coomes (2014) describe a certificate as a program lasting one or two semesters of study and a diploma as a program lasting more than one year of study to complete. Belfield, Liu and Trimble (2014) describe a certificate as a program requiring 12-18 credit hours that can be completed in under one year and a diploma as a program requiring 36-48 credit hours to complete. An associate degree is typically a two-year program requiring a set number of credit hours to complete.

*p<.10; **p<.05; ***p<.01





B.2.4 Who Will Collect the Information and How It Will Be Done

Follow-Up Survey

The evaluation team will use three modes to administer the 12-month follow-up survey: self-administered web, telephone, and in person. First, participants will be invited to complete a web-based survey. Those who have not completed the survey after 6 weeks will be contacted by the phone center. Phone interviews are conducted by professional interviewers working in a centralized computer-assisted telephone interview (CATI) system that allows real-time error checking and observation by supervisors. If a person cannot be contacted by phone after 4 weeks, the evaluator will attempt to conduct the survey in person. In-person interviewing efforts are expected to last 8 to 10 weeks.

Participant Tracking Form

The evaluation team will send each participant a welcome packet that describes the study approximately one month after enrollment. The evaluation team will then contact study participants for updated information through a letter or email at three points in time: 3 months, 6 months, and 9 months after enrollment.

College Survey

The evaluation team will administer the web-based college survey.

Employer Interviews

The evaluator will conduct one-hour semi-structured interviews with employers by phone. Because the interviews are semi-structured, questions may not be asked in the order listed in the guide, and interviewers may spend more time on questions that are more relevant to the particular interviewee. A two-person team – one senior interviewer and one note taker – will conduct each interview. With the interviewee’s agreement, interviewers may record the interview to ensure all details are captured.

B.2.5 Procedures with Special Populations

There are no special populations included in the planned data collection efforts.


B.2.6 Use of Periodic Data Collection Cycles to Reduce Burden

The 12-month follow-up survey will be administered once. Building on experience conducting follow-up surveys with similar populations, the evaluator is implementing proactive tracking of study participants between the time of study intake and the follow-up survey. These efforts are intended to update study participant contact information. The evaluator will send participants an initial welcome packet and then tracking notifications 3 months, 6 months and 9 months after study enrollment.


The college survey and employer interviews are one-time data collection efforts.

B.3 Methods to Maximize Response Rates and Deal with Non-response

B.3.1 Follow-Up Survey

The methods to maximize response rates are discussed below, first with regard to participant tracking and locating, and then with regard to monetary tokens of appreciation.

Participant Tracking and Locating

Participant tracking to maximize response to the 12-month follow-up survey includes both active outreach to study participants and passive tracking. Active tracking begins with a welcome packet, sent to all study participants approximately one month after enrollment, that includes a welcome letter, a study brochure, and the address of a website with Frequently Asked Questions about the survey and associated tracking efforts. Additionally, the evaluator will send tracking letters 3 months and 9 months after enrollment that ask study participants to update their contact information.7 A text message or email will also be sent approximately 6 months after study enrollment to remind participants about the study and the upcoming survey.

In terms of passive tracking, the evaluator will conduct several database searches to obtain additional contact information.

Tokens of Appreciation

Offering appropriate monetary gifts to study participants in appreciation for their time can help ensure a high response rate, which reduces the danger of nonresponse bias. Those who complete the follow-up survey will receive a $25 check as a token of appreciation for their time. Tracking letters at 3 months and 9 months post study enrollment will include a $2 token of appreciation.

Sample Control during the Data Collection Period for Follow-up Survey

During the data collection period, the evaluation team will minimize non-response levels and the risk of non-response bias in the following ways:

  • Using an advance letter and email that clearly convey the purpose of the survey, describe the monetary gift, and provide reassurances about privacy to ease participants’ concerns.

  • Starting with a self-administered web survey, which is expected to be a convenient option for the study population.

  • Using updated contact information captured through tracking efforts to help the evaluation team locate participants for the survey.

  • Taking additional tracking and locating steps, as needed, when the evaluation team does not find sample members at the phone numbers or addresses previously collected.

  • Using trained interviewers (in the phone center and field) who are skilled at working with the sample population and maintaining rapport with respondents, to minimize the number of break-offs, and thus the incidence of non-response bias.

  • Employing a rigorous telephone process to ensure that all available contact information is utilized to make contact with participants.

  • Administering the survey in person in instances where the participant cannot be surveyed by phone.

  • Requiring the survey supervisors to manage the sample in a manner that helps to ensure that response rates achieved are relatively equal across sites.

B.3.2 College Survey

The evaluation team will take the following steps to achieve a high response rate.

  • Reminding grantees and their partner colleges, in the documentation accompanying the survey, that participation in evaluation activities is a requirement of their grant.

  • Having DOL send advance letters to all grant directors one month before the survey (see Appendix D). The letter will specify the date on which the survey is scheduled to be sent, the formats in which it will be available (online or, if needed, in a Microsoft Word version), the time expected to complete the survey, and the survey’s originator (the Urban Institute).

  • E-mailing all primary college contacts, on the scheduled date, the link to the online survey and instructions for completion. Respondents will be given a contact for any problems or questions they encounter while completing the survey.

  • Using Qualtrics software, the evaluator will track who has started the survey and monitor their progress and follow up with those grantees that have not started or completed the survey. Follow-up with the grantee respondents will be done through periodic e-mail reminders.

  • Using a PC-based tracking system, the evaluator will monitor the receipt of surveys, status of follow-up reminders, attachments provided by respondents, completion of data entry, and need for further clarification. As each survey is reviewed, follow-up e-mails and telephone calls will be made to those respondents whose surveys contain errors, unclear responses, or missing information.

B.3.3 Employer Interviews

The team will work closely with the TAACCCT colleges to identify the key contact(s) at each selected employer and obtain their contact information. The team will then reach out to each employer, introducing the evaluation team and the study and requesting a one-hour phone interview. The college lead will be copied on all communications to ensure the employer is comfortable with the request. Should an employer not respond to the first request, the evaluator will follow up with another email and then call the employer directly. If these methods fail, the team will ask the college to encourage the employer to participate and to address any concerns the employer may have about participating.

B.4 Tests of Procedures

Follow-Up Survey

As noted in yellow boxes throughout the draft instrument (Appendix B.1), most of the questions in the follow-up survey instrument have been used in other data collection efforts:

  • The Baseline Information Form for this evaluation (OMB No. 1291-0004);

  • The Health Profession Opportunity Grants (HPOG) (OMB No. 0970-0394 )/Pathways for Advancing Careers and Education (PACE) (OMB No. 0970-0397) 36-Month Survey currently being conducted by Abt Associates for the Department of Health and Human Services;

  • The Green Jobs and Health Care Impact Survey, recently conducted by Abt Associates for the Department of Labor (OMB No. 1205-0506);

  • The PACE 15-month follow-up survey recently conducted by Abt Associates for the Department of Health and Human Services (OMB No. 0970-0397);

  • The HPOG 15-month follow-up survey being conducted by Abt Associates for the Department of Health and Human Services (OMB No. 0970-0394);

  • The WIA Adult and Dislocated Worker Programs Gold Standard Evaluation 15-month survey conducted by Mathematica Policy Research for the Department of Labor (OMB No. 1205-0504);

  • The Job Search Assistance (JSA) Strategies Evaluation Six-Month Follow-Up Survey conducted by Abt Associates for the Department of Health and Human Services (OMB No.0970-0440); and

  • The TAA Follow-Up for Baseline Completers, conducted by Mathematica Policy Research for the Department of Labor (OMB No. 1205-0460).

The evaluator pretested the survey with nine respondents from a TAACCCT grantee college whose characteristics were comparable to those of study participants. Following the pretests, respondents were debriefed about the clarity of the questions and any potential problems with the instrument. The pretests provided a reliable estimate of the survey’s average length (18.56 minutes) and of the clarity of its components. Based on the pretest findings, the evaluator improved the introduction scripts and the wording of specific questions. The instrument submitted for OMB review is final and ready for use by respondents.

Participant Tracking Form

This form has been used for other studies; accordingly, it will not be pretested.

College Survey

The college survey was pretested as a part of the Rounds 1-3 TAACCCT OMB package (Control No. 1291-007). The eight new questions were reviewed by three TAACCCT grantee colleges. Comments were incorporated into the final instrument.

Employer Interviews

The guide contains questions that are similar to those included in guides from other grant-funded efforts; accordingly it will not be pretested.

B.5 Individuals Consulted on Statistical Aspects of the Design

The individuals listed in Exhibit B5.1 contributed to the design of the evaluation. Follow-up data collection for the Outcomes Study will be conducted by Abt SRBI, a subsidiary of Abt Associates, under the general direction of Ms. Gardiner, Project Director. Administration of the college survey and employer interviews will be conducted by the Urban Institute, under the general direction of Ms. Eyster, Co-Principal Investigator. The data collected for the Outcomes Study will be analyzed under the direction of Mr. Judkins. Both the collection and analysis of data for the implementation and employer studies will be under the direction of Ms. Gardiner and Ms. Eyster.

Exhibit B5.1: Individuals Consulted

Name           | Telephone Number | Role in Study
---------------|------------------|---------------------------
Karen Gardiner | (301) 347-5116   | Project Director
David Judkins  | (301) 347-5952   | Co-Principal Investigator
Lauren Eyster  | (202) 261-5621   | Co-Principal Investigator



Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:

David Judkins

Co-Principal Investigator

301-347-5952

Dr. Molly Irwin

Senior Evaluation Specialist, Chief Evaluation Office, DOL

202-693-5091


References


Bahr, Peter Riley, Susan Dynarski, Brian Jacob, Daniel Kreisman, Alfredo Sosa, and Mark Wiederspan. 2015. Labor Market Returns to Community College Awards: Evidence from Michigan. (A CAPSEE Working Paper). New York: Center for Analysis of Postsecondary Education and Employment, Teachers College, Columbia University. http://capseecenter.org/wp-content/uploads/2015/03/labor-market-returns-michigan.pdf

Belfield, Clive, Yuen Ting Liu, and Madeline Joy Trimble. 2014. The Medium-Term Labor Market Returns to Community College Awards: Evidence from North Carolina. (A CAPSEE Working Paper). New York: Center for Analysis of Postsecondary Education and Employment, Teachers College, Columbia University. http://capseecenter.org/medium-term-labor-market-return-to-community-college-awards/

Dadgar, Mina, and Madeline Joy Trimble. 2015. “Labor Market Returns to Sub-Baccalaureate Credentials: How Much Does a Community College Degree or Certificate Pay?” Educational Evaluation and Policy Analysis 37(4): 399–418.

Fein, D. (2016). Risk Factors for College Success: Insights from Adults in Nine Career Pathways Programs. OPRE Brief #2016-36. Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. http://www.career-pathways.org/wp-content/uploads/2016/06/PACE-Risk-Factors-for-College-Success_FINAL_05-16-2016.pdf

Harvill, E.L., Sahni, S.D., Peck, L.R., & Strawn, J. (2015). Evaluation and System Design for Career Pathways Programs: 2nd Generation of HPOG Evaluation Design Recommendation Report. Bethesda, MD: Abt Associates Inc. Draft available as appendix at https://www.fbo.gov/index?s=opportunity&mode=form&id=5bd90aebdd2132ef54c1bb0c9e474baf&tab=core&_cview=1

Jepsen, Christopher, Kenneth Troske, and Paul Coomes. 2014. “The Labor-Market Returns to Community College Degrees, Diplomas, and Certificates.” Journal of Labor Economics 32(1): 95-121.


1 The planned impact evaluation was replaced with an Outcomes Study that will use one of the data collection instruments planned for the impact study and cleared under OMB Control No. 1291-0004 (the Baseline Information Form). The Outcomes Study will have a similar sample size (5,000) and will document the progress and outcomes of students who start TAACCCT-funded programs.

2 A Training and Employment Guidance Letter (Number 15-10) described the criteria. Specifically, the TEGL states, “A credential is awarded in recognition of an individual’s attainment of measurable technical or occupational skills necessary to obtain employment or advance within an occupation” (p. 6). See https://wdr.doleta.gov/directives/attach/TEGL15-10.pdf

3 See http://www.jff.org/sites/default/files/publications/materials/A-Resource-Guide-to-Employer-Engagement-011315.pdf, page 5.

4 Because grantees selected for the Outcomes Study will not be a random selection of all grantees, the evaluator will need to exercise caution about making generalizations to all TAACCCT participants. This caveat applies to all of the research questions.

5 This research question is less clearly answerable than the others. Whether we will be able to answer it successfully depends on the number of distinct programs and on the success of our implementation analysis in categorizing and quantifying different approaches to spending TAACCCT grant funds.

6 https://support.sas.com/documentation/cdl/en/statug/63962/HTML/default/viewer.htm#statug_mi_sect008.htm

7 Students enrolled in the study in the fall of 2016, prior to OMB approval for tracking, will be sent their first tracking letter as soon as OMB approval is obtained, which is expected to be somewhat more than 3 months after enrollment.
