Student Experience Assessment of Job Corps Centers
OMB Control No. 1205-0NEW
February 2020
This information collection does employ statistical methods.
An objective of collecting data through the Student Experience Assessment (SEA) is to determine students' satisfaction with the components of the Department of Labor's Job Corps program. The survey will become one of Job Corps' most important management tools: because Job Corps is a residential program run mainly by contractors, first-hand accounts from students are immensely important. Without this data collection instrument providing information on student satisfaction, critical issues regarding insufficient or missing services in the program may go undisclosed and unaddressed. This information is critical for federal oversight and center management. The survey will be administered to currently enrolled Job Corps students.
The population, or universe of potential respondents, for the Student Experience Assessment (SEA) is Job Corps participants who are active at the time the SEA is administered. Additionally, the sampling frame is restricted to students who have been on center for at least two weeks. The two-week minimum enrollment period is necessary to ensure that students have received their email log-in information and have had training on accessing their email. The survey sample is a census at the time of the survey: all students who have been on center for at least two weeks will be surveyed.
Center enrollment size determines the number of students surveyed. Because Job Corps is an open-entry/open-exit program, the number of students enrolled can vary. The sample size is based on center enrollment at the time the SEA is administered. Table 1 shows the maximum national sample size using the contracted on-board strength (OBS) of all currently open centers (n=116). There are 121 Job Corps centers; five of these centers were not included in this estimate because they are in a transitional period. The final sample will be determined by each center's current enrollment, or OBS, at the time of the survey.
Table 1. Maximum Sample Size and Estimated Burden

Timeframe | Number of Centers | Contracted On Board Strength | Hours for Maximum Response Rate (100%) | Expected Number of Respondents (80%) | Burden Based on Expected Response Rate
Quarterly | 116 | 37,417 | 12,472 hours | 29,934 | 9,977 hours
Annually | 116 each time | 149,668 | 49,889 hours | 119,734 | 39,911 hours
* Respondents can take the survey more than once during their enrollment based on sampling.
Note: Details may not sum to total due to rounding.
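For illustration, the Table 1 figures follow from the contracted on-board strength, the 80 percent expected response rate, and the 20-minute average completion time described later in this statement. A minimal sketch of the arithmetic, with illustrative variable names:

```python
# Sketch of the Table 1 burden arithmetic; names are illustrative.
AVG_MINUTES = 20        # average completion time per survey
RESPONSE_RATE = 0.80    # expected response rate

def burden_hours(respondents: float, minutes: float = AVG_MINUTES) -> float:
    """Total respondent burden in hours."""
    return respondents * minutes / 60

quarterly_obs = 37_417                          # contracted on-board strength
print(burden_hours(quarterly_obs))              # ~12,472 hours at 100% response
expected = quarterly_obs * RESPONSE_RATE        # ~29,934 expected respondents
print(round(expected), burden_hours(expected))  # ~9,977.9 hours; Table 1 reports 9,977
```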
This is a new survey, so there are no past national data collection efforts to use for response rate estimates. The projected response rates are approximations based on a similar survey, the Student Satisfaction Survey (SSS), which will be phased out once the new safety and satisfaction surveys are approved for deployment. The SSS consistently had response rates over 80 percent. One likely reason the SSS response rate was high is the residential setting of Job Corps.1 The SSS has a similar respondent population and some of the same content as the SEA; therefore, administering the survey in a residential setting continues to be a benefit. Staff will know when all students have received the satisfaction survey and can encourage them to log into a computer and complete it. As the literature shows, reminders are a proven method to increase response rates.2 Nationally, the SEA sample with an 80 percent response rate would result in an estimated 29,934 completed surveys.
The sample is a census of all students enrolled for at least two weeks. Job Corps is an open-entry and open-exit program, meaning that students can enroll or separate at any time, so some students have short tenures. The Office of Job Corps wants all students to have an opportunity to voice their satisfaction with services throughout their tenure. The survey will be administered quarterly. Because students receive survey modules based on their progress in the program, students will take different modules each time the survey is administered. A census allows students who remain in the program for multiple quarters to comment on program satisfaction throughout their tenure, while giving all students who stay in the program for a quarter at least one opportunity to provide feedback. The sampling frame closely mirrors the intended population. The sample is drawn no more than one week before the survey is administered. Some students may separate from the program during the week the sample is drawn or the week of survey administration, but the number of such students is small.
The primary purpose of the SEA data collection is to determine student satisfaction with the centers and with the program as a whole. Survey responses will be used to determine the number of responses that indicate satisfaction, and each center will be given the percentage of its responses indicating program satisfaction. These results will be provided by department (e.g., academics and career technical training) and overall. A secondary purpose is to use SEA results to examine the program further and provide information that leads to informed decisions, for example, examining whether students' dissatisfaction with dorm living is associated with lower center success regardless of overall center satisfaction.
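As a rough illustration of the scoring approach described above (the field names, satisfaction coding, and department labels below are hypothetical, not the SEA's actual specification):

```python
# Hypothetical sketch: percent of responses indicating satisfaction,
# by department and overall. Field names and coding are illustrative.
from collections import defaultdict

# Each record: (department, satisfied) where satisfied is True/False.
responses = [
    ("academics", True), ("academics", False),
    ("career technical training", True), ("dorm living", True),
]

def satisfaction_rates(records):
    """Return {department: percent satisfied} plus an 'overall' rate."""
    counts = defaultdict(lambda: [0, 0])          # dept -> [satisfied, total]
    for dept, satisfied in records:
        counts[dept][0] += satisfied
        counts[dept][1] += 1
    rates = {d: 100 * s / t for d, (s, t) in counts.items()}
    rates["overall"] = 100 * sum(s for _, s in records) / len(records)
    return rates

print(satisfaction_rates(responses))
```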
The SEA will not use statistical testing. However, the accuracy of the survey is still an important factor. Survey procedures ensure that students can complete the survey privately, and individual students' results will not be shared with centers. This eliminates any pressure on students to inaccurately report the level of program satisfaction at a center. Further analysis of the instrument will be conducted after the survey is implemented nationally to ensure its validity and internal reliability. The SEA will be conducted quarterly, allowing for repeated measurements. Analyzing the results over time will minimize the influence of anomalies due to occasionally low response rates or an unusual event at a center. The analysis of results over time will produce an improved understanding of centers' overall level of student satisfaction.
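As one illustration of analyzing results over time, a simple rolling average across quarters damps the influence of a single anomalous quarter; the sketch below uses hypothetical rates:

```python
# Hypothetical sketch: smooth quarterly satisfaction rates with a
# rolling average so one anomalous quarter has less influence.
def rolling_average(rates, window=4):
    """Average each quarter with up to `window - 1` preceding quarters."""
    smoothed = []
    for i in range(len(rates)):
        span = rates[max(0, i - window + 1): i + 1]
        smoothed.append(sum(span) / len(span))
    return smoothed

quarterly = [82.0, 84.5, 61.0, 83.5, 85.0]   # 61.0 is an anomalous quarter
print(rolling_average(quarterly))
```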
There are no unusual problems. Job Corps has several phases that correspond to different services, and the SEA has been designed to show only the questions related to a student's current phase in the program. The data needed to ensure that students are not overburdened by unnecessary questions is available in administrative data maintained by Job Corps. When the sample is developed each quarter, the administrative data will be used to adjust the survey to each student's progress in the program, thereby minimizing the number of questions and the burden of the survey. Data processing and analysis will use SAS software. The web-based surveys will use Voxco. The Voxco software and results are housed in the Job Corps Data Center to ensure the security of the survey and its results. Voxco is an automated system; once programmed with the survey, the software automatically provides students with the correct questions based on previous responses. Additionally, the system provides automated scheduling, reminders, and online data storage, which make it easier to control the sample, monitor the study, reduce data entry costs, and ensure the consistency and quality of the data. Automated online surveying reduces burden by speeding up the collection and analysis of results. Additionally, each student will receive an individual link to the survey with a unique user name and password, ensuring that it is that student who completes the survey. Once the survey link is used and the survey is completed, the link cannot be used again.
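As a hypothetical illustration of this phase-based routing (the actual phase names and module assignments are defined in Job Corps administrative data and the Voxco programming, not here):

```python
# Hypothetical sketch: select survey modules by program phase so students
# see only questions relevant to their current stage. Phase names and
# module assignments are illustrative, not the SEA's actual routing.
PHASE_MODULES = {
    "career preparation": ["arrival", "academics"],
    "career development": ["academics", "career technical training", "dorm living"],
    "career transition":  ["career transition", "dorm living"],
}

def modules_for(student_phase: str) -> list[str]:
    """Return the survey modules shown to a student in the given phase."""
    return PHASE_MODULES.get(student_phase, [])

print(modules_for("career development"))
```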
The Student Experience Assessment (SEA) will be conducted at Job Corps centers for 10 days every quarter. Job Corps students will receive an email with a link to the web-based survey. The survey can be completed on any internet-enabled device, at any time during the survey period, and over multiple sessions.
During survey development, several methods were used to improve response rates. The goals included ensuring that the language is accessible, that the survey accommodates students with disabilities, and that the survey is available at times and places convenient to the students. Job Corps employed the following criteria to meet these goals:
The questions were written at a fifth-grade reading level to ensure that most Job Corps students can complete the survey even before receiving academic improvement courses at Job Corps.
The questions use terminology that is familiar to Job Corps students at all centers.
The survey is available in Spanish. Special care was taken to ensure the translation maintained a lower reading level.
The survey can be taken on any internet-enabled device either on or off a Job Corps campus.
The survey has an audio function: each webpage has a play button that, when pressed by the participant, reads aloud all question and answer-choice wording on the page. This can be done as many times as needed. The function is available in Spanish and English.
The survey can be completed over the course of multiple sessions. Once completed, the link cannot be used again.
The survey is password protected.
The length of the survey was considered. On average, the survey takes 20 minutes to complete.
The survey procedures were designed to improve response rates while remaining mindful of students' right to refuse to complete the survey. Center operators will encourage all Job Corps students to take the survey during the survey period. Center staff will not know which students have decided not to complete the survey. Materials to remind and encourage Job Corps students to complete the survey have been developed and will be distributed to centers. Additionally, students will be sent an initial participation email and follow-up reminder emails during the survey period. The reminder emails will be sent to students' personal email addresses as well as to their Job Corps email addresses. Each center will receive daily response rate emails to encourage the center's staff to, in turn, encourage student participation.
The Office of Job Corps and the center operators will have an opportunity to promote the survey and encourage completion. A nonresponse study will be conducted that examines survey response rates and results by several demographic characteristics: gender, race/ethnicity, and age group. The study will identify nonresponse bias through differences in response rates and in survey results across these demographic groups. After the nonresponse study is completed, it will be determined whether post-survey adjustments, including weighting, are necessary to ensure the results accurately measure participants' level of program satisfaction.
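If weighting is found to be necessary, one common approach is post-stratification, sketched below with hypothetical demographic groups and counts; the SEA's actual adjustment method would be chosen after the nonresponse study:

```python
# Hypothetical sketch of post-stratification weights for nonresponse:
# each group's weight is its population share divided by its respondent
# share. Group labels and counts are illustrative only.
population = {"16-17": 12_000, "18-20": 18_000, "21-24": 7_417}
respondents = {"16-17": 8_500, "18-20": 15_000, "21-24": 6_434}

pop_total = sum(population.values())
resp_total = sum(respondents.values())

weights = {
    group: (population[group] / pop_total) / (respondents[group] / resp_total)
    for group in population
}
print(weights)   # groups that under-responded get weights above 1.0
```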
Item nonresponse is unlikely unless a student fails to complete the survey, because the survey is designed to prompt students who skip questions. A survey must be at least 90 percent complete for its results to be used; otherwise, the student's responses will be deleted. The nonresponse study will include an examination of incomplete surveys to determine whether the groups completing surveys differ systematically by gender, age, and race/ethnicity.
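The 90 percent completeness rule amounts to a simple filter; a minimal sketch, with hypothetical names:

```python
# Hypothetical sketch of the 90 percent completeness rule: keep a survey
# only if at least 90% of the questions shown to the student were answered.
def is_usable(answered: int, shown: int, threshold: float = 0.90) -> bool:
    """True if the survey is complete enough for its results to be used."""
    return shown > 0 and answered / shown >= threshold

print(is_usable(answered=27, shown=30))   # True  (90% complete)
print(is_usable(answered=26, shown=30))   # False (~87% complete)
```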
B4. Test Procedures
The SEA places minimal burden on Job Corps center staff and students. The survey is web-based, accessible on any internet-enabled device, and can be completed in approximately 20 minutes.
Job Corps students reviewed the SEA during cognitive testing (OMB control number 1205-0436). Cognitive testing was completed with 220 students. Of these, 142 students took the web survey with an interviewer asking questions when they completed sections or hesitated on a question. The other 78 students completed a paper version of the survey in which the interviewer read and discussed each question with the student.
The purposes of their review were to confirm that:
the questions were understood as intended,
the questions were age and reading-level appropriate,
the possible answer responses were clear and the students could find an appropriate response, and
the program-specific language was correctly used and understood as intended.
The cognitive testing produced improvements to the survey, which have been incorporated. Upon approval, the survey will be used nationally, and several aspects of it will be reviewed: response rates will be examined, internal reliability will be tested, and a nonresponse bias study will be conducted on the survey results. Non-substantive changes will be made if needed; for example, post-survey weighting will be applied if the nonresponse bias study determines it is necessary. If substantive changes are required to refine and improve the survey or the survey methods, a change-request package will be submitted.
The contact information for the contractors who were consulted on the design of the Student Experience Assessment is found below:
Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425
Battelle Memorial Institute
505 King Avenue
Columbus, OH 43201
Phone: 800-201-2011 or 614-424-6424
Contact information for the support contractor that will collect and analyze the survey information is:
Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425
1 Carini, R. M., Hayek, J. C., Kuh, G. D., Kennedy, J. M., & Ouimet, J. A. (2003). College student responses to web and paper surveys: Does mode matter? Research in Higher Education, 44(1), 1-19.
2 Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132-139.