
Cascades Job Corps College and Career Academy Pilot Evaluation

OMB: 1290-0012


Part B: Collection of Information Involving Statistical Methods

Part B of the Supporting Statement for the Cascades Job Corps College and Career Academy (CCCA) Pilot Evaluation, sponsored by the Chief Evaluation Office (CEO) within the U.S. Department of Labor (DOL), addresses the collection of information employing statistical methods. CEO has contracted with Abt Associates Inc. and MDRC to conduct the evaluation. The evaluation team will conduct (1) an implementation analysis and (2) an impact study.

During the Job Corps application process, eligible applicants will be given randomized access to the CCCA pilot. Baseline data will be collected on all applicants who give informed consent to participate in the study, and only those who give consent will be randomized (i.e., those who refuse to give consent will be denied access to the CCCA pilot). Administrative and follow-up survey data will be collected to measure the impact of the pilot program.

This submission seeks clearance for three data collection activities:

  • Baseline information form (BIF);

  • Follow-up tracking form (to increase follow-up survey response rates);

  • Semi-structured interview guides for site visits and phone interviews.

Subsequent OMB submissions will seek clearance for follow-up data collection activities (in particular, a follow-up survey, tentatively planned for 18 months after randomization).

B.1: Respondent Universe and Sampling Methods

Baseline Data Collection for Impact Study

The potential respondent universe for the evaluation’s baseline data collection is all individuals who apply to the CCCA Job Corps program, are deemed eligible for the program, and consent to participate in the study. There is no formal probability sampling or subsampling: all eligible applicants will be administered the BIF as a condition of randomization and the chance to participate in the CCCA pilot. The evaluators estimate that 2,200 individuals will participate in the study (i.e., will be randomly assigned at a 1:1 ratio to either the treatment or control group). All individuals who consent to participate in the study will be asked to complete the BIF, and all eligible, consenting applicants will be subject to random assignment in order to gain admission to CCCA. Those randomized into the treatment group will be offered the opportunity to enroll in the CCCA pilot program, while those randomized into the control group will not. These study parameters are summarized in Exhibit B.1.

Exhibit B.1: Sample Sizes and Response Rates, by Study Group


                               In Study    Completed BIF    BIF Response Rate
Treatment (offered CCCA)          1,100            1,100                 100%
Control (not offered CCCA)        1,100            1,100                 100%
Total                             2,200            2,200                 100%


The response rate for all study participants will be 100 percent on the consent form and the BIF, since completing them is a condition of eligibility for the study.1

Follow-Up Data Collection for Impact Study

As noted above, subsequent OMB submissions will seek clearance for a follow-up survey. Although we project randomizing 2,200 individuals in total, the tentative plan is to survey only those randomized between July 2017 and December 2018, a period during which we project 1,000 individuals will be randomized. We project an 80 percent response rate for that survey sample. A subsequent OMB submission (to be submitted approximately January 2018) will include more detail regarding follow-up data collection. This submission does not request clearance for that follow-up data collection effort; this information is included only to provide context for the discussion of the random assignment impact analysis in Section B.2.2 (Estimation Procedures) of this submission.

Follow-Up Tracking Form

The evaluation team will send the tracking form to all study participants (n=2,200) six times. For each tracking attempt, the evaluation team will attempt to contact every study participant, unless the participant has withdrawn from the study or requested not to be contacted again.

Site Visits for Implementation Study

The two site visits for the implementation analysis will be conducted at the CCCA pilot Job Corps center. Evaluation team members will interview up to 16 staff in total across the CCCA center and nearby Job Corps centers; the 10 CCCA staff members will be chosen from approximately 138 staff based on their job titles (e.g., Center Director). The evaluation team will hold focus groups with small groups of study participants. There is no statistical sampling. The evaluation team will also hold phone interviews with leadership (usually the Center Director) of nearby Job Corps centers. No statistical methods will be used in the implementation analysis, and discussions of the results will be carefully phrased to make clear that no generalization is intended.

B.2: Procedures for Collection of Information

B.2.1: Sample Design

        1. Baseline Data Collection for Impact Study

The evaluation team expects the study sample to include approximately 2,200 individuals who apply to participate in the CCCA pilot program. This section describes how the sample will be recruited. No probability sampling will be conducted.

The Cascades Job Corps center was selected to host the pilot program that will be the subject of the evaluation. All eligible individuals who apply to the CCCA program and who give informed consent to participate in the impact study will be randomly assigned to one of two study arms (the treatment or control group). Applicants who do not consent to study participation will not be able to participate in the CCCA pilot but will be able to use other services available in the community, including other Job Corps centers.

The BIF will be administered after applicants give informed consent but prior to randomization. Study participants will complete the BIF by entering data directly into the custom-designed online Participant Data System (PDS). The goal is to have study intake and randomization operating by January 1, 2017. The study intake period is scheduled to end no later than December 31, 2019, or three years after OMB approval is received, if later.

The evaluation team anticipates a baseline sample of 2,200, all of whom will complete the informed consent form and BIF, which are included in this package.

        2. Follow-Up Tracking Forms

Study participants will respond to the follow-up tracking forms through one of three methods, whichever is most convenient for the participant: (1) a hard-copy form mailed back to the evaluation team in the addressed, stamped envelope provided; (2) an online, web-based form hosted on a secure server; or (3) by phone (respondents call a toll-free number; no calls are made directly to study members at this stage). The follow-up tracking form is included in this package.

        3. Site Visits for Implementation Study

The data will be collected through semi-structured interviews held at the CCCA center and over the phone with non-CCCA Job Corps leadership. The interview protocols are included in this package.

          4. Statistical Methodology for Stratification and Sample Selection

No statistical methods will be used to select study participants for the impact study, or for the implementation study site visits.

B.2.2: Estimation Procedures

        1. Baseline Data Collection for Impact Study

No statistical estimation on a sample will be performed. We will tabulate responses to the BIF (unweighted), reporting means for continuous variables and the percentage giving each discrete response, overall and by subgroup (e.g., males).
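For illustration, such tabulations could be produced along the following lines. This is a minimal sketch assuming the BIF responses sit in a pandas DataFrame; all column names are hypothetical, not the actual PDS schema.

    import pandas as pd

    # Hypothetical BIF extract; column names are illustrative only.
    bif = pd.DataFrame({
        "age": [17, 18, 19, 20, 18, 17],
        "male": [1, 0, 1, 0, 0, 1],
        "has_hs_diploma": [0, 1, 0, 1, 1, 0],
    })

    # Means for continuous variables, overall and by subgroup (unweighted).
    print(bif["age"].mean())
    print(bif.groupby("male")["age"].mean())

    # Percentage of respondents giving each discrete response.
    print(bif["has_hs_diploma"].value_counts(normalize=True) * 100)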

        2. Site Visits for Implementation Study

The site visits and phone interviews are designed to provide in-depth qualitative information about the CCCA pilot and nearby Job Corps programs; no estimation procedures will be used. The data analysis will be descriptive.

        3. Follow-On Impact Analyses

A major objective of the evaluation is to estimate program impacts, that is, observed outcomes for the treatment group relative to what those outcomes would have been in the absence of the program. Specifically, the study will estimate whether and by how much the CCCA program changes outcomes, including education received, services received, educational attainment, and labor market success (including employment and earnings). This section sketches our analytic approach to estimating program impacts using the follow-up survey data and administrative data, and projects the statistical precision of the estimates (specifically, minimum detectable impacts).

As noted earlier, this OMB submission covers the Baseline Information Form (BIF) to be administered immediately prior to random assignment, the follow-up tracking form, and the site visit interviews. The BIF and random assignment will be used to estimate the impact of the CCCA program. Outcomes will be collected via a follow-up survey (OMB clearance for which will be requested in a subsequent submission) and by linking to administrative data, including the National Directory of New Hires (for labor market outcomes) and the National Student Clearinghouse (for outcomes at degree-granting institutions). We project a survey analysis sample of 800 (an 80 percent response rate among the 1,000 individuals in the survey sample) and an administrative data sample of 2,200 (i.e., everyone randomized, with no loss to non-response).

The basic impact estimates can be computed using simple subtraction: the difference in average outcomes between treatment group members and control group members is an unbiased measure of the impact of having access to the intervention. The resulting estimate for each outcome is unbiased because the individuals who comprise the treatment and control groups were selected at random from a common pool and hence are statistically equivalent on all baseline factors, in expectation. As a result, any statistically significant difference in outcomes between the groups can be attributed to the effects of the intervention. In other words, the test of an intervention’s impact on some outcome, y (for example, earnings), compares the average value of y in the treatment group with the average value of y in the control group. If the difference between these two averages is statistically significantly different from zero, chance is ruled out as the explanation and we can conclude that the program has an impact on the measured outcome.

Thus, random assignment properly carried out eliminates threats to internal validity due to selection into the treatment group and other factors. This stands in contrast to a non-experimental comparison group analysis, that is, one using naturally occurring program non-participants instead of a randomly assigned control group, in which underlying differences between the two groups remain even after statistical adjustment, threatening the internal validity of the estimates.
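To make the logic concrete, the sketch below computes the simple treatment-control difference for a binary employment outcome and tests whether it differs from zero. The data are simulated purely for illustration; real outcomes would come from the follow-up survey or administrative data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Simulated 0/1 employment outcomes; sizes mirror the 1,100/1,100 design.
    treatment = rng.binomial(1, 0.70, size=1100)  # y for treatment group
    control = rng.binomial(1, 0.65, size=1100)    # y for control group

    impact = treatment.mean() - control.mean()    # simple difference in averages
    t_stat, p_value = stats.ttest_ind(treatment, control)

    # If p_value < 0.05, chance is ruled out (at the 5 percent level) as the
    # explanation for the treatment-control difference.
    print(f"estimated impact = {impact:.3f}, p-value = {p_value:.3f}")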

With survey non-response, even estimates based on random assignment may be biased. To minimize the risk of such non-response bias, we will construct weights to account for non-response and use these weights in our analysis. Specifically, we will create the weights as follows. Within each group (treatment and control), we will conduct a non-response analysis using baseline information collected on all study participants in order to determine which characteristics are both associated with the propensity to respond and correlated with the key outcomes being measured. We will then fit response propensity models using these predictive variables and use the resulting propensity scores to form weighting cells within each group. We will use the resulting weights in all descriptive analyses and all impact analyses.
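A minimal sketch of this weighting procedure follows, assuming a logistic response-propensity model and quintile weighting cells; the function and column names are hypothetical illustrations, not the study’s actual specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def nonresponse_weights(df, covariates, n_cells=5):
        """Propensity-cell non-response weights, computed within one study arm.

        `df` must contain a 0/1 `responded` column; `covariates` are baseline
        variables that predict both response and the key outcomes.
        """
        X = sm.add_constant(df[covariates])
        propensity = sm.Logit(df["responded"], X).fit(disp=0).predict(X)

        # Form weighting cells from quantiles of the estimated propensity.
        cells = pd.qcut(propensity, q=n_cells, labels=False, duplicates="drop")

        # Respondents in each cell are weighted by the inverse of the cell's
        # observed response rate; non-respondents get zero weight.
        cell_response_rate = df.groupby(cells)["responded"].transform("mean")
        return np.where(df["responded"] == 1, 1.0 / cell_response_rate, 0.0)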

While simple treatment-control differences are unbiased, random differences in the characteristics of the treatment and control groups will exist, increasing the variance of the impact estimates. Following conventional practice, we will estimate impacts using linear regression (for both continuous and binary outcomes). Such regression analysis controls for chance variation in measured background characteristics between individuals and improves the statistical precision of the impact estimates. For survey outcomes, we will use weighted linear regression (where the weights adjust for non-response to the follow-up survey using variables observed at baseline). Robust standard errors will be used to adjust for heteroscedasticity (including the heteroscedasticity induced by applying linear regression to binary outcomes). In the event of missing baseline covariate data, we plan to implement the dummy-variable adjustment procedure described in Puma et al. (2009).

We will estimate subgroup impacts by including an interaction between treatment and subgroup membership (defined at baseline) in our regression models, as illustrated in the sketch below. We will test for homogeneity of impacts across subgroups; our standard procedure is to discuss subgroup impacts only when we can reject homogeneity.
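The sketch below illustrates both the regression adjustment described above (the Puma et al. dummy-variable treatment of missing covariates and robust standard errors) and the subgroup interaction test. The data are simulated and all variable names are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2200

    # Simulated analysis file; in practice y comes from administrative or
    # survey data, covariates from the BIF, and weights from the non-response
    # adjustment sketched earlier.
    df = pd.DataFrame({
        "treat": rng.integers(0, 2, n),
        "age": rng.normal(18.0, 1.5, n),
        "female": rng.integers(0, 2, n),
        "x1": rng.normal(0.0, 1.0, n),
        "weight": np.ones(n),  # all 1s here; survey outcomes would use weights
    })
    df.loc[rng.random(n) < 0.10, "x1"] = np.nan   # some missing baseline data
    df["y"] = (rng.random(n) < 0.50 + 0.05 * df["treat"]).astype(float)

    # Dummy-variable adjustment for missing baseline covariates (Puma et al.,
    # 2009): recode missing values to a constant, add a missing-data indicator.
    df["x1_missing"] = df["x1"].isna().astype(int)
    df["x1"] = df["x1"].fillna(0.0)

    # Weighted linear regression with heteroscedasticity-robust (HC2) standard
    # errors, which also cover the heteroscedasticity from a binary outcome.
    X = sm.add_constant(df[["treat", "age", "female", "x1", "x1_missing"]])
    fit = sm.WLS(df["y"], X, weights=df["weight"]).fit(cov_type="HC2")
    print(fit.params["treat"], fit.bse["treat"])  # impact estimate and SE

    # Subgroup analysis: add a treatment-by-subgroup interaction. The test of
    # the interaction coefficient is the test of homogeneity of impacts.
    df["treat_x_female"] = df["treat"] * df["female"]
    Xs = sm.add_constant(df[["treat", "female", "treat_x_female",
                             "age", "x1", "x1_missing"]])
    fit_s = sm.WLS(df["y"], Xs, weights=df["weight"]).fit(cov_type="HC2")
    print(fit_s.pvalues["treat_x_female"])        # homogeneity test p-value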

        4. Degree of Accuracy

Minimum Detectable Impacts (MDIs) are the smallest true impacts that the study has at least an 80-percent probability of detecting as statistically significant; for a given level of power, the greater the sample size, the smaller the MDI that can be detected. It is important to calculate MDIs before beginning an evaluation to ensure that the study will be able to detect impacts of magnitudes that are relevant to policymakers.

Exhibit B.2 reports minimum detectable impacts (MDIs) for binary outcomes (with various baseline rates) from administrative data outcomes (N=2,200), survey outcomes (N=800, with a design factor for survey non-response) and for subgroup impacts.

Exhibit B.2: Minimum Detectable Impacts (MDIs)

                          Baseline Probability                       Subgroup
N=T+C       10%/90%     25%/75%     33%/67%         50%              33%/67%
200       10.2 p.p.   14.7 p.p.   16.0 p.p.   17.0 p.p.            31.9 p.p.
400        7.2 p.p.   10.4 p.p.   11.3 p.p.   12.0 p.p.            22.6 p.p.
600        5.9 p.p.    8.5 p.p.    9.2 p.p.    9.8 p.p.            18.4 p.p.
800        5.1 p.p.    7.4 p.p.    8.0 p.p.    8.5 p.p.            16.0 p.p.
1,000      4.6 p.p.    6.6 p.p.    7.1 p.p.    7.6 p.p.            14.3 p.p.
1,200      4.2 p.p.    6.0 p.p.    6.5 p.p.    6.9 p.p.            13.0 p.p.
1,400      3.8 p.p.    5.6 p.p.    6.0 p.p.    6.4 p.p.            12.1 p.p.
1,600      3.6 p.p.    5.2 p.p.    5.6 p.p.    6.0 p.p.            11.3 p.p.
1,800      3.4 p.p.    4.9 p.p.    5.3 p.p.    5.7 p.p.            10.6 p.p.
2,000      3.2 p.p.    4.6 p.p.    5.0 p.p.    5.4 p.p.            10.1 p.p.
2,200      3.0 p.p.    4.3 p.p.    4.7 p.p.    5.1 p.p.             9.8 p.p.

Notes: p.p. = percentage points. Assumes alpha = 0.05, power (1 − beta) = 80%, two-sided tests; R² = 30%; design effect (DEFF, for survey non-response) = 1.05.




The MDIs are computed using the standard formula for the minimum detectable impact of a two-arm randomized trial with a binary outcome:

    MDI = factor × √DEFF × √[ (1 − R²) × p(1 − p) / (P(1 − P) × N) ]

where p is the baseline (control group) rate, P is the proportion of the sample assigned to treatment, N is the total analysis sample size, R² is the share of outcome variance explained by baseline covariates, DEFF is the design effect for survey non-response, and “factor” is the multiplier determined by the significance level and power.

All the MDI calculations are based on a number of assumptions, some of which vary by the outcome measure involved. These assumptions are as follows:

• A 1:1 treatment-control ratio, implying a value of P of 0.5.

• An 80 percent follow-up survey response rate.

• Two-tailed statistical tests.

• Conventional power parameters (alpha = 0.05; power = 1 − beta = 0.80), implying a value of the factor of approximately 2.80.

See the notes to Exhibit B.2 for the assumed baseline rates, R², and design effect.
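As a check on the table, the helper below implements the formula above. It reproduces the Exhibit B.2 entries to within about 0.1 percentage point; the exact rounding conventions of the original calculations are not documented, so small discrepancies are expected.

    from scipy.stats import norm

    def mdi_binary(n, p, share_treated=0.5, r2=0.30, deff=1.05,
                   alpha=0.05, power=0.80):
        """Minimum detectable impact, in percentage points, binary outcome."""
        factor = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # approx. 2.80
        variance = p * (1 - p) * (1 - r2) * deff
        se = (variance / (share_treated * (1 - share_treated) * n)) ** 0.5
        return 100 * factor * se

    print(round(mdi_binary(2200, 0.50), 1))  # ~5.1 p.p., administrative sample
    print(round(mdi_binary(800, 0.50), 1))   # ~8.5 p.p., survey sample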

Suppose that two-thirds of the control group is working two years after randomization. Then, using administrative data (e.g., the NDNH) and a sample size of 2,200, we could detect an impact of 4.7 percentage points. If CCCA is successful, its impact on employment should be considerably larger than 4.7 percentage points (i.e., raising the employment rate from 66.7 percent to more than 71.4 percent). This MDI of 4.7 percentage points is one-third the size of the 15.0 percentage point impact on receipt of a GED detected in the National Job Corps Study (Schochet, Burghardt, and McConnell, 2008).

Similarly, suppose that half of the control group obtains some certificate by 18 months after randomization. Then, using the survey data (i.e., a sample size of 800), we could detect an impact of 8.5 percentage points. Again, if CCCA is successful, its impact on certificate receipt should be considerably larger than 8.5 percentage points (i.e., raising the rate from 50.0 percent to more than 58.5 percent). This MDI of 8.5 percentage points is less than half the size of the 22.3 percentage point impact on receipt of vocational, technical, or trade certificates detected in the National Job Corps Study (Schochet, Burghardt, and McConnell, 2008).

B.2.3: Who Will Collect the Information and How It Will Be Done

        1. Baseline Data Collection for Impact Study

To enroll the sample in the impact study, Job Corps staff, including contractors, will:

  • Introduce Applicants to the Study. The staff (CCCA and Job Corps program contractors) who conduct intake and hold one-on-one interviews with applicants will introduce eligible applicants to the study. Participants will be given an opportunity to ask questions so that they will understand what study participation entails. The evaluation team will develop training materials for staff involved in the data collection process to assist in this explanation, so that program staff understand and are able to explain all aspects of the study clearly.

  • Obtain Informed Consent. Adult eligible applicants to the CCCA pilot program will be asked if they would be willing to participate in the study and if so, to sign the study’s Informed Consent Form. For those applicants who are minors, the study will obtain parental consent and the minor’s assent. Those who do not consent will not be included in the study, and cannot gain access to the CCCA program. Eligible applicants for whom the study obtains consent will be considered study participants after completion of the BIF and random assignment. The consent forms, which are included in this attachment, explain that:

      ◦ Study participants will be required to give permission for the study team to request administrative data from the Job Corps national Management Information System (MIS) on activities within Job Corps, from the National Directory of New Hires (NDNH) on earnings and receipt of Unemployment Insurance, and from the National Student Clearinghouse (NSC) on college enrollment and degree receipt.

      ◦ Study participants will be asked to fill out a Baseline Information Form, although participants will be informed that they may skip any questions other than a handful of required elements necessary for implementing the randomization process (name, gender, date of birth, and Social Security Number).

      ◦ The online system will use random assignment to determine which eligible applicants will be invited to participate in the CCCA pilot program.

  • Collect Baseline Information. Once consent has been obtained, participants will complete the BIF, entering baseline information directly into the online system. After the BIF is completed, Job Corps staff will use the online system to conduct random assignment. Applicants who choose not to participate in the study or who do not complete the BIF (as defined in footnote 1) will not be randomly assigned and will not be able to access the CCCA pilot program. Job Corps intake staff will inform individuals of their random assignment status immediately after assignment is completed.
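Conceptually, the online system’s 1:1 random assignment could work along the lines sketched below. This is purely illustrative, since the actual Participant Data System algorithm is not described in this document; permuted blocks are one common way to hold the ratio at exactly 1:1.

    import random

    def blocked_assignment(block_size=4, seed=None):
        """Yield 'treatment'/'control' labels in randomly permuted blocks,
        keeping the overall ratio at exactly 1:1. Illustrative only."""
        rng = random.Random(seed)
        while True:
            block = ["treatment", "control"] * (block_size // 2)
            rng.shuffle(block)
            yield from block

    assigner = blocked_assignment(seed=42)
    print([next(assigner) for _ in range(8)])  # e.g., 4 treatment, 4 control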

        2. Follow-Up Tracking Form for Impact Study

The evaluators will contact study participants six times, through four methods, to request completion of the tracking form. The evaluation team will (1) mail a hard copy of the form twice to each study participant via the U.S. Postal Service (with an addressed, stamped return envelope enclosed); (2) send a postcard once via the U.S. Postal Service providing a web link and toll-free phone number for completing the form; (3) send one text message containing a link to the form to those study participants who opted on the BIF to be contacted via text; and (4) at two different times, send e-mail messages containing a link to the form to those study participants who opted on the BIF to be contacted via e-mail.

        3. Site Visits for Implementation Study

The data will be collected through semi-structured interviews held at the CCCA center or over the phone. Two-person teams of experienced researchers will conduct two site visits.

B.2.4: Procedures with Special Populations

        1. Baseline Data Collection for Impact Study

To ensure that participants can understand each of the documents, the Informed Consent Form and BIF are written at an 8th-grade readability level. While the CCCA pilot will not offer programming in Spanish, the evaluation team will work with Job Corps staff on ways to assist with translation of the Informed Consent Form where needed for Spanish-speaking parents of minors.

For those study participants who are minors at the time of enrollment, their parent or legal guardian will be required to give consent to participate, and the minor will be asked to complete a youth assent form, which is also included in this package. All study participants who are not legal minors will themselves be required to give consent.

B.2.5: Use of Periodic Data Collection Cycles to Reduce Burden

The BIF is a one-time data collection effort and will not require periodic data collection cycles. Using a variety of methods to reduce respondent burden, the evaluation team will contact study participants with the tracking form once per calendar quarter during the 18-month follow-up period. The evaluation team will conduct two site visits and will schedule them with sufficient time between visits to reduce the burden on respondents.

B.3: Methods to Maximize Response Rates and Deal with Non-response

Baseline Data Collection for Impact Study

All individuals who agree to participate in the evaluation must complete the BIF in order to have the opportunity to be randomly assigned to the CCCA pilot program.2 Therefore, a response rate of 100 percent is expected on the BIF.

Site Visits for Implementation Study

CCCA’s participation in the evaluation, including site visits, was a condition of the pilot contract. It is anticipated that all nearby Job Corps centers selected for telephone interviews with their director (or another senior staff member) will agree to participate, since the national Job Corps office will request their cooperation.

Site visitors will work closely with the primary contact for the CCCA pilot program to schedule the site visits. One member of the two-person site visit team will be responsible for working with the primary contact to handle scheduling and logistics (e.g., identifying appropriate interview respondents). Dates for site visits will be set at least one month in advance to allow ample time to schedule interviews. Interview appointments will be confirmed by e-mail the week before the visit. Should a potential respondent be unavailable during the visit, the research team will follow up to arrange a phone interview.

Follow-up Postcard and Tracking Form

While completion of the follow-up tracking forms is not a requirement of the study, the evaluation team will use several approaches to maximize the collection of updates to participants’ contact information. We have planned quarterly outreach activities, alternating hard-copy and electronic outreach. For example, in one quarter we will send participants an e-mail invitation to update their information on a secure web portal; in the next, we will send either a postcard or a tracking form, both of which invite them to provide updated contact information. We also plan to send a text message (to participants who permitted text contact) with a link to the same web portal.

B.4: Tests of Procedures

Most of the items in the BIF are either identical or similar to questions used in previous DOL studies (including those conducted by Abt Associates and MDRC) or national surveys.3 (Many are updated to lower the reading level to an 8th-grade level.) The few other items in the BIF are drawn verbatim from well-validated studies.4 As such, all items have been thoroughly tested on large samples. Still, we will pilot-test these materials on a convenience sample of 9 young adults with basic skills similar to those of Job Corps applicants (i.e., approximately 8th-grade reading skills).

B.5: Individuals Consulted on Statistical Aspects of the Design

The individuals listed in Exhibit B.5 below contributed to the design of the evaluation. Baseline data collection forms will be administered by Job Corps staff, under the direction of Abt Associates (and overseen by Ms. Williams as Project Director). The data collected for the impact study will be analyzed under the direction of Mr. Klerman. Both the conduct and the analysis of data for the implementation study will be under the direction of Dr. Grossman.

Exhibit B.5: Individuals Consulted

Name                  Telephone Number    Role in Study
Ms. Julie Williams    (301) 634-1782      Project Director
Mr. Jacob Klerman     (617) 520-2613      Co-Principal Investigator
Dr. Jean Grossman     (609) 258-6974      Co-Principal Investigator


Inquiries regarding the study’s planned analysis should be directed to:

Mr. Jacob Klerman, Co-Principal Investigator, 301-347-5953

Dr. Molly Irwin, Senior Evaluation Specialist, Chief Evaluation Office, DOL, 202-693-5091



References

Puma, Michael J., Robert B. Olsen, Stephen H. Bell, and Cristofer Price. 2009. What to Do When Data Are Missing in Group Randomized Controlled Trials (NCEE 2009-0049). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. http://ies.ed.gov/ncee/pdf/20090049.pdf.

Schochet, Peter Z., John Burghardt, and Sheena McConnell. 2008. “Does Job Corps Work? Impact Findings from the National Job Corps Study.” American Economic Review 98(5): 1864-1886.


1 Sample members will have the option of skipping all but a few key items on the BIF (name, SSN, gender, and date of birth). Completion of the BIF therefore refers to completing the few key items that must be answered. However, Job Corps staff will be trained to encourage participants to fill out as much of the BIF as possible, especially the contact information necessary for locating participants for the follow-up survey. Completion of the consent refers to signing the form.

2 Sample members will have the option of skipping any item except a few key items on the BIF. Those are name, SSN, gender, and date of birth. Completion of the BIF therefore refers to completing these few key items that must be answered.

3 Studies referenced for the BIF include: the National Guard Youth Challenge Job Challenge Evaluation, conducted by Mathematica Policy Research for DOL; the Ready to Work (RTW) Evaluation, conducted by Abt Associates for DOL; the National Evaluation of Youth Corps conducted by Abt Associates for the Corporation for National and Community Service; and the Evaluation and System Design for Career Pathways Programs: 2nd Generation of HPOG evaluation, conducted by Abt Associates for the U.S. Department of Health and Human Services.

4 The reaction-to-challenge scale was taken from Connell, James, Jean Baldwin Grossman, and Nancy Resch. 1995. The Urban Corps Assessment Package (URCAP) Manual. Philadelphia, PA: Public/Private Ventures, Technical Report, September 1995. ERIC No. ED391868.
