Post Enrollment Data Collection for Job Corps Participants

OMB Control No. 1205-0426

October 2019


B. Collections of Information Employing Statistical Methods


This information collection does employ statistical methods.


B1. Description of the Population and Sampling to Be Used


Goals of the Data Collection


No statistical methods leading to inferences of larger populations are used. In all reports and other publications and statements resulting from this work, no attempt is made to draw inferences to any population other than the set of participants or employers and educational institutions responding to the data collection effort.


There are four primary goals of the data collected through the instruments and methods described in this request:


  1. to fulfill WIOA-specified performance measurement and reporting requirements for the Job Corps program, specifically regarding reporting on the employment and education outcomes of program participants to determine the performance of each center and the larger program and to provide one part of an employer effectiveness measure;

  2. to determine the long-term placement and wage results for the Outcome Measurement System (OMS), which is used to manage many aspects of the program, including center-level performance;

  3. to independently verify, as specified by the Office of the Inspector General, the initial placement and wage reports for individual participants reportedly placed by contractors to ensure the integrity and quality of placement services provided by Job Corps centers and career transition service (CTS) agencies; and

  4. to enhance the quality of the Job Corps program by collecting information on customer satisfaction from former participants, employers, and educational institutions.


Populations to be surveyed


In addition to the employers and educational institutions taking the employment verification and satisfaction survey, all Job Corps students, including those who separate before qualifying as participants, are given the satisfaction portion of the participant survey (module 6). Job Corps participants are contacted following the second and fourth quarters after exit from the Job Corps program. With a few exceptions, Job Corps participants consist of graduates and former enrollees.


A graduate is defined as an individual who has voluntarily applied for, been selected for, and enrolled in the Job Corps program and who, as a result of participation in the Job Corps program, has received a secondary school diploma or recognized equivalent, or completed the requirements of a career and technical education and training program that prepares individuals for employment leading to economic self-sufficiency or entrance into postsecondary education or training.


A former enrollee is defined in Section 142 of WIOA as an individual who has voluntarily applied for, been selected for, and enrolled in the Job Corps program, but who left the program prior to becoming a graduate.

Participants are defined under WIOA as individuals who meet Job Corps eligibility criteria, have been accepted and enrolled into the program, and have demonstrated a commitment to the program by either completing the Career Preparation Program (CPP) or having at least 60 days of continuous enrollment. For the purposes of this collection, the surveys conducted of participants following the second and fourth quarters after their exit from Job Corps are termed participant surveys.


Participant Surveys

This collection projects to survey 43,720 participants following the second and fourth quarters after exit and 44,680 students on initial separation and at 13 weeks (see Table 1). The estimated response rate for the exiter surveys is 50%, yielding 44,680 responses, since all 44,680 exiters will receive two surveys (one on exit and one at 13 weeks). Note that only module 6 of the participant survey is administered to exiters upon initial separation and at 13 weeks.


The current surveys under OMB 1205-0426 have been used since they were last approved in August 2016. The response rate estimates used in this revision take into account Job Corps participant response rates under the existing survey and methodology, but also consider improvements in the revised instrument and survey processes that are expected to raise future response rates above current levels. Response rates for graduates have exceeded those of former enrollees. Based on these factors, the combined response rate for the participant surveys is estimated at 50%, for 43,720 responses. See Supporting Statement A, Table 3.

Employers and Educational Institution Surveys

The population of employers and educational institutions consists of those organizations where Job Corps participants were reportedly placed. This collection projects to survey 10,000 employers or educational institutions (see Table 1). The historical response rate for the Employer/Educational Institution Survey has been 50%. This population is stable and relatively easy to reach compared with the participant surveys. Based on these factors, we project 5,000 responses. See supporting statement A, Table 3.
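The response projections above reduce to simple census arithmetic. The following is an illustrative sketch only, using the counts and 50% response rate stated in this section; it is not part of the official burden computation:

```python
# Illustrative sketch of the annual response projections described above.
# Counts are the Program Year 2017-based figures from Table 1; the 50%
# response rate is the combined estimate stated in this section.

RESPONSE_RATE = 0.50

# Exiter (module 6) surveys: every exiter receives two surveys,
# one on initial separation and one at 13 weeks.
exiters = 44_680
exiter_responses = exiters * 2 * RESPONSE_RATE          # expected responses

# Participant surveys: each participant is surveyed following the
# second and fourth quarters after exit.
participants = 43_720
participant_responses = participants * 2 * RESPONSE_RATE

# Employer/educational institution surveys: one survey per organization.
employers = 10_000
employer_responses = employers * RESPONSE_RATE

print(int(exiter_responses), int(participant_responses), int(employer_responses))
```

Because each exiter and each participant receives two surveys, a 50% response rate reproduces the population count as the expected number of responses.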


Table 1: Participant Survey and Employer/Educational Institution Sample Sizes (Annual Estimates)*

Survey Status | Annual Totals
Population: Job Corps exiters. | 44,680
Sample: All Job Corps exiters who are assigned online surveys on initial separation and at 13 weeks. | 44,680
Population: Job Corps participants (second and fourth quarter). | 43,720
Sample: Participants who are assigned for survey attempts (second and fourth quarter). | 43,720
Population: Employers and educational institutions of placed students. | 10,000
Sample: Employers who are assigned for survey attempts. | 10,000

* Based on Program Year 2017 data



Uses of the Collected Data


Participant Surveys

The data collected with the participant surveys serves four primary purposes.


  • First, Job Corps uses the survey data to fulfill part of the performance measurement and reporting requirements for the Job Corps program specified under WIOA. Specifically, the survey results produce performance outcomes for five of the six primary WIOA reporting measures. The survey results require precision at the level of the center and the career transition services (CTS) provider. Job Corps uses performance-based contracting to select center and CTS contractors and, as a result, bases incentives and award fees, in part, on the results from these surveys. Job Corps' ranking of center and CTS contractor performance uses data collected from the surveys. The relative ranking of centers and CTS agencies provides one of the major criteria used by Job Corps to determine whether a contract option is renewed and to evaluate contractor past performance for new contracts.


  • Second, Job Corps uses the survey data to determine the results of long-term placement metrics in the Outcome Measurement System (OMS). The OMS provides Job Corps with the long-term placement and wage results for participants. This information is essential for Job Corps to manage the center operators and career transitional services providers and evaluate their outcomes.


  • Third, Job Corps uses survey data for independent verification of contractor reported outcomes regarding initial placement and wages. Survey results collected by the survey contractor serve as a third party verification of initial placement outcomes reported by career transition contractors.


  • Finally, the data collected support the continuous program improvement activities regularly conducted by Job Corps and program operators. The questions in module 6 of the participant surveys are used to assess former students’ satisfaction with the Job Corps program and, if applicable, to determine their reasons for leaving prior to graduation. This information is essential to Job Corps’ continuous efforts for program enhancement and renovation.


Employers and Educational Institution Surveys

The data collected from the employers and educational institutions support continuous improvement efforts and are designed to assess employers’ satisfaction with the Job Corps participants they hired, how well those participants were prepared to meet the requirements of the positions for which they were hired, and how employers rate the participants’ problem-solving skills. The results collected by this survey provide qualitative information about the relevance and effectiveness of the education and training services delivered by Job Corps center operators.



B2. Statistical Methodology for Stratification and Sample Selection


As indicated above, all groups are a census of the population. No sample selection or stratification is applicable.


Participant Surveys

The key variables to be collected in the participant surveys are presented in the following diagram. The questionnaire will begin with questions designed to re-verify placement information obtained from JCDC (Job Corps Data Center). For those who are not placed, the questionnaire will skip the re-verification questions and begin with questions about their current job. Once information on the current job is captured, the questionnaire is designed to capture information on additional jobs held during the quarter. For each job, we will ask for information on the employer name, start and end dates, weekly hours worked (including overtime), hourly wage rate, any additional payments (such as tips, bonuses, or commissions), and other information needed to calculate total earnings over the quarter.

[Diagram: key variables collected in the participant surveys]


Using the data collected in these surveys, the contractor will compare post-program outcomes, including employment rates and earnings, for Job Corps participants who completed the program (“completers”) and participants who left the program prior to completion (“non-completers”). Other analyses will examine the employment rates of male and female Job Corps participants, of older and younger participants, and of participants in different occupational training programs. In addition to providing supplemental data for WIOA reporting, the detailed data in the participant surveys serve to inform program management.


Questions are also asked about other placement and post-program activities, and about safety on center and satisfaction with the program.


Employers and Educational Institution Surveys

The key variables to be collected in the employer and educational institution surveys are presented in the following diagram.

[Diagram: key variables collected in the employer and educational institution surveys]

The data collected from the employers and educational institutions support continuous improvement efforts and are designed to assess employers’ satisfaction with the participants they hired, how well those participants were prepared to meet the requirements of the positions for which they were hired, and how employers rate the participants’ problem-solving skills. The results collected by this survey provide qualitative information about the relevance and effectiveness of the education and training services delivered by Job Corps center operators.


B3. Methods to Maximize Response Rates and Address Non-Response


Participant Surveys

Strategies used to maximize response rates for the participant survey:

  • The data collection instrument consists primarily of well-tested items.

  • The survey contractors work closely with Job Corps placement staff and JCDC to obtain adequate tracking and locating information about all Job Corps graduates and former enrollees.

  • An online web version of the survey can be accessed through computers, tablets, and smartphones, modalities that have been shown to be more popular with this age group (see Supporting Statement A, Question 3 response).

  • The survey contractors will use multiple contact sources, such as United States Postal Service (USPS) address, email address, and cell phone number, to maximize the opportunity for contacting the respondent through a variety of modes, including standard mail, electronic contacts, and consented text messaging. Maintaining contact via multiple modes during the first three months after exit, including through the initial separation and 13-week surveys, will improve response rates during the data collection periods for the participant surveys.

  • Regular contact between students and career transition specialists following separation, and the enhancements to the Career Information System will provide access to updated student locating information.

  • The survey contractor will employ sample-locating and refusal-avoidance techniques that have been proven to maximize locating and enlisting the cooperation of youth populations. Graduates also receive monetary incentives from Job Corps during the 12-month service eligibility period that will likely enhance their cooperation with the data collection effort.


Research suggests that reminders can be the single most important technique for producing high response rates. Providing periodic reminders via a variety of contact methods including hard copy and electronic means helps track graduates and former enrollees who have moved since their separation from Job Corps. Using electronic outreach in addition to hardcopy is particularly beneficial for respondents for whom traditional contact information such as phone numbers and USPS mailing addresses is inadequate. Job Corps employs additional searching techniques that have been proven to maximize the location of respondents and elicit their cooperation, including:


  • contacting parents, relatives, and neighbors for military placements and to obtain current contact information;

  • sending address-correction letters;

  • searching online nationwide databases (for example, Accurint, White Pages, Directory Assistance, LexisNexis, and reverse lookups);

  • requesting, where possible, information from public agencies (for example, motor vehicle departments and corrections departments);

  • providing a toll-free line for respondents to call;

  • providing information on the surveys and how to participate on the contractor’s website; and

  • establishing a social media presence for the surveys.


Job Corps uses online survey techniques and telephone procedures through a support contract that maximizes response rates after respondents have been located. Telephone interviewers are trained to carefully follow these procedures. The contractors carefully monitor and retrain interviewers to correct any weaknesses in their contact and interviewing techniques. Two to three weeks after the first contact with a respondent who initially refused to participate, a senior interviewer will contact the respondent and address the respondent's concerns about completing the interview. The data collection contractors will maintain databases to track survey data, will generate regular reports to identify non-responders, and will support follow-up efforts. These procedures have been used successfully in Job Corps’ data collection efforts to achieve response rates consistent with those projected for these follow-up surveys.


Despite extensive efforts to maximize the response rate, there will inevitably be non-respondents. The non-respondents to the survey create a potential for non-response bias; that is, the respondent sample may not be representative of the population. Job Corps will conduct a thorough non-response bias analysis by comparing the characteristics of non-respondents as they enter the program with the characteristics of survey respondents who complete the second and fourth quarter surveys. Fortunately, as students enter Job Corps, there is a great deal of information on their baseline characteristics: age, gender, location, prior education level, in-program accomplishments, length of stay in the program, and other important characteristics. Job Corps will compare these characteristics for respondents and non-respondents to determine if the respondent sample is systematically different from the non-respondent sample.
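The comparison described above amounts to profiling the same baseline characteristics for respondents and non-respondents and checking for systematic gaps. The following is an illustrative sketch only; the records, field names, and values are hypothetical, not actual Job Corps data or the contractor's analysis code:

```python
# Illustrative sketch of a non-response bias check: compare baseline
# characteristics of survey respondents and non-respondents.
# All records and field names below are hypothetical examples.
from statistics import mean

exiters = [
    # (responded, age at entry, is_female, days enrolled, is_graduate)
    (True,  18, True,  310, True),
    (True,  17, False, 250, True),
    (False, 19, False,  90, False),
    (True,  20, True,  400, True),
    (False, 16, True,  120, False),
    (False, 21, False, 200, True),
]

def profile(records):
    """Mean of each baseline characteristic for a group of records."""
    return {
        "age": mean(r[1] for r in records),
        "pct_female": mean(1.0 if r[2] else 0.0 for r in records),
        "days_enrolled": mean(r[3] for r in records),
        "pct_graduate": mean(1.0 if r[4] else 0.0 for r in records),
    }

respondents = [r for r in exiters if r[0]]
nonrespondents = [r for r in exiters if not r[0]]

# Large gaps between the two profiles signal potential non-response bias
# (e.g., graduates or placed students responding at higher rates).
for key, value in profile(respondents).items():
    gap = value - profile(nonrespondents)[key]
    print(f"{key}: respondents minus non-respondents = {gap:+.2f}")
```

In practice the same comparison would be run on administrative intake data for the full exiting population, with statistical tests on each characteristic rather than raw mean differences.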


In past non-response bias analyses, Job Corps found that both male and female response rates exhibit the same pattern over time. Female graduates on average exhibited a slightly higher response rate than males. Similarly, response rates were stable across age groups, with a minor uptick at the older end of the spectrum. Overall, in past surveys, Job Corps found little non-response bias, except that students who were placed in jobs were more likely to respond. For that reason, Job Corps will continue to conduct regular non-response analyses to ensure that any non-response bias is identified and accounted for with the new survey results.


Employers and Educational Institution Surveys

The historical response rate for the Employer/Educational Institution Survey is 50%. This population is stable and relatively easy to reach compared with the participant surveys. However, Job Corps will conduct a thorough non-response bias analysis by comparing the characteristics of non-respondents with the characteristics of survey respondents who complete the surveys. Fortunately, there is a great deal of information on the baseline characteristics of employers in the survey population obtained from placement data such as organizational size, type, NAICS code, and familiarity with the placed participant. Job Corps will compare these characteristics for respondents and non-respondents to determine if the respondent sample is systematically different from the non-respondent sample.


Strategies used to maximize response rates for the employers and educational institutions survey:

  • The data collection instrument consists primarily of well-tested items administered via telephone surveys.

  • The survey contractors work closely with Job Corps placement staff and JCDC to obtain adequate tracking and locating information about employers of all placed Job Corps participants.

  • A minimum of three contact attempts will be made with employers unless the contact information is found to be invalid. Regular contact between students and career transition specialists following separation, and the enhancements to the Career Information System will provide access to updated employer and educational institution locating information.

  • Findings on business-survey response factors will inform survey techniques, including varying call times to increase the chances of reaching respondents during business hours and training staff on reaching the correct person, especially in larger firms.1

  • The survey staff will provide a toll-free line for respondents to call; provide information on the surveys and how to participate on the contractor’s website; and establish a social media presence for the surveys to reduce refusals.


Missing Data


Participant Surveys

Throughout the survey, Job Corps makes a concerted effort to avoid missing data. For example, when the questionnaire includes a question on earnings, the response may be “Don’t Know” or “Refused.” The interviewers are trained to probe further and obtain useful information by providing the respondent opportunities to respond in categories (e.g., ranges) rather than an exact earnings amount. This approach is usually effective in obtaining a response and avoiding missing data. Similarly, if the respondent refuses to provide information, interviewers are directed to a probe that reminds the respondent of the private nature of the survey, that information provided by the respondent will only be used for analysis of the Job Corps program, and that individual information will not be shared.


Further, the proposed revisions to the second and fourth quarter participant surveys that are in this package were in part designed to improve the quality of the survey data and reduce item non-response to key employment-related questions.

To date, item non-response has not been a significant problem in using the participant survey data to meet reporting requirements. In some cases, Job Corps has been able to use answers to questions in one part of the survey to solve missing data problems in another part (e.g., using answers about the wage or hours of the initial job placement for students who report working at that same job elsewhere in the survey). When faced with missing data issues, Job Corps has not used any statistical methods for data imputation (e.g., random or regression-based techniques) to assign values when key individual data items needed in a calculation are missing. Instead, Job Corps has adopted a conservative approach of not counting a student as placed in a quarter unless all the underlying questions needed to make that determination have been answered and indicate the placement criteria have been met.

To minimize this issue, follow-up probes with responses expressed in relatively narrow ranges have been added to the wage, hours, and earnings questions to help obtain more accurate and complete data. In other cases, questions related to specific criteria (e.g., wage at least the minimum wage, hours at least 20 a week) have been added and/or modified to minimize the extent of missing data.
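The conservative rule described above (a student is counted as placed only when every underlying item is answered and each answer meets the criteria) can be sketched as a simple decision function. The field names and specific thresholds below are hypothetical examples, not the program's actual coding rules:

```python
# Illustrative sketch of the conservative placement rule described above:
# a student counts as placed in a quarter only if every underlying survey
# item is answered AND each answer meets the placement criteria.
# Field names and thresholds are hypothetical examples.
from typing import Optional

MIN_HOURLY_WAGE = 7.25   # example: federal minimum wage
MIN_WEEKLY_HOURS = 20    # example threshold mentioned in the text

def counts_as_placed(hourly_wage: Optional[float],
                     weekly_hours: Optional[float],
                     employed_in_quarter: Optional[bool]) -> bool:
    # Any missing item ("Don't Know" / "Refused", coded here as None)
    # means the student is NOT counted as placed; no value is imputed.
    if hourly_wage is None or weekly_hours is None or employed_in_quarter is None:
        return False
    return (employed_in_quarter
            and hourly_wage >= MIN_HOURLY_WAGE
            and weekly_hours >= MIN_WEEKLY_HOURS)

print(counts_as_placed(9.00, 35, True))    # meets all criteria
print(counts_as_placed(9.00, None, True))  # missing hours: not counted
```

The key design choice is that missing data fails the determination outright rather than being imputed, which biases placement counts downward rather than risking overstatement.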

Employers and Educational Institution Surveys

The employer survey is a short 12-minute interview that rarely has missing data; therefore, to date, item non-response has not been a significant problem in using the survey data to provide qualitative information about the services provided by Job Corps center operators. Information about the job placements of students who are still working at that same job during the second and fourth quarters after exit is provided by respondents to the participant surveys. Job Corps has not used any statistical methods for data imputation (e.g., random or regression-based techniques) to assign values when key individual data items needed in a calculation are missing.


Weighting


Participant Surveys

The contractor is highly experienced in a variety of weighting processes. For this project, the contractor will explore a variety of weighting approaches, including (1) non-response adjustment for failure to obtain a completed interview from some respondents and (2) non-response adjustment for respondents in demographic groups with differential response rates (e.g., male, female). This weighting process, if necessary, will adjust survey results to reflect the demographic characteristics of the target population. These adjustments will help to make the sample representative of the target population by mitigating non-sampling errors and bias.2,3
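A standard form of the adjustment described above weights each respondent in a demographic cell by the inverse of that cell's response rate. The following is an illustrative sketch with made-up cell counts, not the contractor's actual weighting procedure:

```python
# Illustrative sketch of a cell-based non-response weighting adjustment:
# within each demographic cell, respondents are weighted up by the inverse
# of the cell's response rate so weighted totals reflect the full target
# population. Cell counts below are hypothetical examples.

population_counts = {"male": 24_000, "female": 19_720}   # target population
respondent_counts = {"male": 11_000, "female": 10_860}   # completed surveys

# Weight per respondent in a cell = population count / respondent count,
# i.e., the inverse of the cell's response rate.
weights = {cell: population_counts[cell] / respondent_counts[cell]
           for cell in population_counts}

# By construction, weighted respondent totals reproduce the population
# distribution, mitigating bias from differential response rates.
weighted_totals = {cell: weights[cell] * respondent_counts[cell]
                   for cell in weights}
print(weights)
print(weighted_totals)
```

Here the cell with the lower response rate receives the larger weight, which is how the adjustment offsets differential non-response across demographic groups.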


Information collected in the survey is self-reported. To mitigate reporting errors, at the conclusion of collecting information on each job, the interviewer reviews the information provided and confirms with the respondent that the information recorded is accurate.


For verification jobs, information on wages is provided by JCDC, and the purpose of the survey questions is to verify the employment and earnings on that job. For these jobs, therefore, Job Corps has two sources of information.


Employers and Educational Institution Surveys

The weighting processes described above will be utilized for the Employers and Educational Institutions survey as appropriate.



B4. Test Procedures


Participant Surveys

This revised collection builds upon the currently approved survey instrument to provide an updated and more streamlined question flow. It is also being developed to allow for a self-administered, web-based methodology. The revised question flow continues to use questions from the currently approved instrument, but the order of some questions has been changed and others have been eliminated to make the interview flow more smoothly. No additional testing, except for testing of programming changes, was conducted, since valid and reliable data have been obtained from the current survey using those same questions.


Employers and Educational Institution Surveys

The employer survey is a short 12-minute interview of employers of placed participants. No additional testing has been conducted, since valid and reliable data have been obtained from the current survey instrument.


B5. Contact Information & Privacy


No individuals outside of the contractor were consulted on statistical aspects of the design.


Contact information for the contractor that will collect and analyze the survey information is:


Decision Information Resources, Inc.
3900 Essex Lane, Suite 900
Houston, Texas 77027
Phone: 713-650-1425

1 Fisher, S., Bosley, J., Goldenberg, K., Mockovak, W., & Tucker, C. (2003). A qualitative study of nonresponse factors affecting BLS establishment surveys: results. In 163rd Annual Joint Statistical Meetings.

2 Groves, R. M. 2006. “Nonresponse Rates and Nonresponse Bias in Household Surveys.” Public Opinion Quarterly 70(5):646-75.

3 Little, Roderick. 2003. “Bayesian Methods for Unit and Item Nonresponse.” In R.L. Chambers and C.J. Skinner (eds). Analysis of Survey Data. Wiley Series in Survey Methodology.


