Baseline Data Collection Request for ARRA-Funded Grants; Job Training Evaluation, OMB 1205-0481
The Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) is undertaking the Green Jobs and Health Care Impact Evaluation of the Pathways Out of Poverty and Health Care and High Growth Training grant initiatives. The overall aim of this evaluation is to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in the training provided by Pathways and Health Care grantees and to identify promising best practices and strategies for replication. The evaluation uses a random assignment research design to measure the educational and economic impacts of programs operated by selected grantees and will include a process study to examine program implementation and operations. The ETA has contracted with Abt Associates and Mathematica Policy Research to conduct this evaluation. The full evaluation involves three data collection efforts requiring OMB approval:
Baseline data collection using a web-based Participant Tracking System (PTS), the focus of this request for routine extension
Follow-up interview 18 months after baseline collection
Follow-up interview 36 months after baseline collection
This submission requests extension of the approval for baseline data collection.
DOL is selecting six sites from the universe of Health Care and Pathways grantees to participate in the evaluation based on their likely numbers of applicants, quality of implementation, early placement information, program service strategies, targeted industries, and appropriateness of implementing a random assignment design. These six sites will be the universe for the impact study; results will not be generalized to all Health Care and Pathways grantees.
All individuals who consent to participate in the study will be included in the data collection, and no sampling will be used. Sites will use their existing eligibility criteria to identify people who qualify to receive program services. No attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort. Sample size requirements are summarized in Table B.1.
Table B.1. Sample Size Requirements

| Sample characteristic | Expected value |
| --- | --- |
| Number of participating sites | 6 |
| Number of participating individuals | 6,000 (average of 1,000 per site) |
| Ratio of treatment to control group members | Varies by site (e.g., 1:1, 2:1) |
| Anticipated response rate | 100 percent (baseline data collection) |
| Anticipated number of respondents | 6,000 (baseline data collection) |
Statistical methods will not be used to select the sample. Sites will be selected based on their likely numbers of applicants, quality of implementation, early placement information, program service strategies, targeted industries, and appropriateness of implementing a random assignment design. The universe of individuals admitted by the selected sites into their grant programs during the study intake period will be included in the study. Statistical analysis of the baseline data will consist solely of descriptive tabulations to profile the population participating in the grant programs examined. Because there will be no sampling variation in these data, the analysis will not involve statistical estimation.
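To make the assignment mechanics concrete, the following minimal sketch shows one way a site-specific treatment-to-control ratio (such as the 1:1 or 2:1 ratios noted in Table B.1) could drive random assignment of consenting applicants. The site names, ratios, and function below are illustrative assumptions, not the evaluation's actual randomization procedure.

```python
# Illustrative sketch only: random assignment with a site-specific
# treatment:control ratio. Site names and ratios are hypothetical.
import random

SITE_RATIOS = {
    "site_a": (1, 1),   # 1:1 assignment
    "site_b": (2, 1),   # 2:1 assignment
}

def assign(site_id, rng=random):
    """Assign one consenting applicant to treatment or control, with
    probability proportional to the site's treatment:control ratio."""
    t, c = SITE_RATIOS[site_id]
    return "treatment" if rng.random() < t / (t + c) else "control"

# Example: assign 1,000 applicants at a 2:1 site and tabulate the result.
random.seed(12345)
counts = {"treatment": 0, "control": 0}
for _ in range(1000):
    counts[assign("site_b")] += 1
print(counts)  # roughly two-thirds treatment, one-third control
```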
This package requests approval for the use of two forms (the informed consent form and the Baseline Information Form, or BIF) to be administered to all eligible individuals at the selected sites as they go through the intake process, with responses entered into the PTS.
The methods to maximize response for the consent form and BIF will be based on approaches that have been used successfully in many other random assignment studies to ensure that the study is clearly explained to both study participants and staff and that the forms are easy to understand and complete. Care has been taken to explain the study accurately and simply to potential participants. The approaches taken will be fully reviewed and approved by the institutional review board (IRB) of Abt Associates (the lead research firm). The forms and procedures should minimize refusal rates and maximize voluntary participation in the study. Staff will be thoroughly trained on how to address study participants’ questions about the forms and the study. Grantee staff will also be provided with a site-specific operational procedures manual prepared by the research team, contact information for members of the research team, and detailed information about the study.
Furthermore, the forms are designed to be easy to complete. They are written in clear and straightforward language, at the sixth-grade reading level, with closed response categories. The time required for participants to complete both forms is estimated to be 13 minutes, on average. In addition, the forms will be available in Spanish to accommodate Spanish-speaking customers. Grantee staff will administer the forms orally to participants with low literacy.
Data Reliability. Both forms required at intake are unique to the current evaluation and will be used across all program sites. Using the same forms across all sites will ensure consistency in the collected data. The forms will have been reviewed extensively by project staff and staff at ETA and will be thoroughly tested in a pretest involving nine or fewer individuals from nonparticipating sites. Staff will receive training covering each item on the BIF to ensure that they understand the items and record the information accurately. In addition, each participating site will be provided with access to a web-based system, the PTS, for entering the information from the BIF. To ensure complete and accurate data capture, this platform will flag missing data and data outside a valid range.
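As an illustration of the kind of completeness and range checks such a system can apply, the sketch below validates a single BIF record. The field names and valid ranges are assumptions for illustration only, not the actual BIF specification or PTS logic.

```python
# Illustrative sketch only: completeness and range checks of the kind a
# web-based entry system might apply to a BIF record. Field names and
# ranges below are hypothetical placeholders.
REQUIRED_FIELDS = ["participant_id", "date_of_birth", "highest_grade_completed"]
VALID_RANGES = {"highest_grade_completed": (0, 20)}  # assumed valid range

def validate_record(record):
    """Return a list of data-entry flags for one record (empty if clean)."""
    flags = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            flags.append(f"missing: {field}")
    for field, (lo, hi) in VALID_RANGES.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            flags.append(f"out of range: {field}={value}")
    return flags

# Example: a record missing a date of birth and with an implausible grade.
print(validate_record({"participant_id": "A-001", "highest_grade_completed": 35}))
# -> ['missing: date_of_birth', 'out of range: highest_grade_completed=35']
```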
The forms planned as part of the intake process will be thoroughly tested with nonparticipating staff and participants at a grantee site. As mentioned, we will pilot test the forms on no more than nine people. After the forms are completed, project staff will debrief each participant using a standard debriefing protocol to determine whether any words or questions are difficult to understand or answer. We do not expect any significant problems to be uncovered in the pilot test, but we do expect some minor formatting and wording changes to be made as a result of the test. Since the full-scale evaluation will not be conducted at the pilot test site, and pretest respondents will not have an opportunity to be part of the demonstration, we will thank them with a small incentive of $25 for their time. No monetary incentive is planned for actual study participants during the baseline data collection effort.
Consultations on the statistical methods used in this study have been undertaken to ensure the technical soundness of the research. The following people were consulted in preparing this submission to OMB:
Abt Associates
Dr. Stephen Bell (301) 634-1700
Ms. Karin Martinson (301) 634-1700
Mr. Jacob Klerman (617) 520-2613
Mathematica Policy Research
Ms. Anne Ciemnecki (609) 275-2323
Dr. Karen Needels (541) 753-0201
We will also assemble a peer review panel consisting of three to five experts in the following areas: (1) training low-skill/low-income people; (2) the specific labor markets (e.g., health, “green” jobs); (3) random assignment evaluation; and (4) survey methods. These experts will review and comment on the evaluation study design and data collection procedures.