
Student Experience Assessment of Job Corps Centers

OMB Control No. 1205-0543

June 30, 2023


B. Collections of Information Employing Statistical Methods


This information collection does employ statistical methods.


Project Objectives


The objective of collecting data through the Student Experience Assessment (SEA) is to determine students’ satisfaction with the components of the Department of Labor’s Job Corps program. The assessment is administered to students currently enrolled in Job Corps and is one of the program’s most important management tools. Given the residential structure of the program, it is integral to obtain first-hand accounts of the student experience, and this information is critical for federal oversight and center management. Without the information obtained via the SEA, critical issues regarding insufficient or lacking services could remain undisclosed and unaddressed.

B1. Respondent Universe and Samples



Sampling Design

The potential respondent universe for the SEA consists of enrolled Job Corps students who have been on center for at least two weeks. The two-week minimum enrollment period ensures that students have received their email log-in information and have been trained on accessing their email. The survey sample is a census at the time of the survey: all students who have been on center for at least two weeks will be surveyed.

Sample Size

The center enrollment size determines the number of students surveyed. Because Job Corps is an open-entry/open-exit program, the number of students enrolled can vary, so the sample size is based on center enrollment at the time the SEA is administered. Table 1 shows the maximum national sample size using the contracted on-board strength (OBS) of all currently open centers (n = 121). The final sample will be determined by each center’s current enrollment or OBS at the time of the survey.



Table 1. National estimates of the population, number surveyed by strata, and expected completions*

Timeframe | Number of Centers | Contracted On-Board Strength | Hours at Maximum Response Rate (100%) | Expected Number of Respondents (80%) | Burden at Expected Response Rate
Quarterly | 121 | 37,417 | 12,472 hours | 29,934 | 9,977 hours
Annually | 121 per quarter | 149,668 | 49,889 hours | 119,734 | 39,911 hours

* Respondents can take the survey more than once during their enrollment based on sampling.

Note: Details may not sum to totals due to rounding.
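
As a minimal illustration of how the Table 1 figures are derived, the sketch below recomputes them from the contracted OBS, the 20-minute average completion time cited later in this statement, and the 80 percent target response rate; small differences from the published numbers reflect rounding.

    # Illustrative recomputation of the Table 1 burden figures (Python).
    # Assumes the 20-minute average completion time and the 80 percent
    # target response rate stated in this supporting statement.
    MINUTES_PER_SURVEY = 20

    def burden_hours(respondents):
        # Convert a respondent count into burden hours.
        return respondents * MINUTES_PER_SURVEY / 60

    obs_quarterly = 37_417                      # contracted OBS, 121 centers
    print(burden_hours(obs_quarterly))          # ~12,472 hours at 100% response
    print(obs_quarterly * 0.80)                 # ~29,934 expected respondents
    print(burden_hours(obs_quarterly * 0.80))   # ~9,978 hours (published: 9,977)

    obs_annual = obs_quarterly * 4              # 149,668 across four quarters
    print(burden_hours(obs_annual))             # ~49,889 hours at 100% response
    print(obs_annual * 0.80)                    # ~119,734 expected respondents
    print(burden_hours(obs_annual * 0.80))      # ~39,911 hours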



Response Rates



The SEA demonstrated a 77 percent response rate in the October 2022 administration. Because of the COVID-19 pandemic, centers were not operating at full capacity: there were 14,787 students in the pool, and the completed surveys resulted in a burden of about 3,811 hours. The goal is an 80 percent response rate at full center capacity. Nationally, if an 80 percent response rate were achieved, the SEA would yield an estimated 29,934 completed surveys per quarter.



B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed



Sampling Method



The sample is a census of all students enrolled for at least two weeks. Job Corps is an open-entry and open-exit program, meaning that students can enroll or separate at any time; as a result, some students have short tenures. The Office of Job Corps wants all students to have an opportunity to voice their satisfaction with services throughout their tenure. The survey will be administered quarterly. Because students receive only the survey modules that match their progress in the program, students will take different modules each time the survey is administered. A census allows students who remain in the program for multiple quarters to comment on program satisfaction throughout their tenure, while giving every student who stays for at least a quarter at least one opportunity to provide feedback. The sampling frame closely mirrors the intended population. The sample is determined no more than one week before the survey starts; the few sampled students who separate from the program before the survey begins will be excluded from response-rate reports.
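
A minimal sketch of this frame construction follows, assuming a simple hypothetical record layout (student_id, enrollment_date, separation_date); the actual frame is built from Job Corps administrative data.

    # Minimal sketch of the census frame construction (Python).
    from datetime import date, timedelta

    def build_frame(students, survey_start):
        """Return students on center for at least two weeks who have
        not separated before the survey starts."""
        cutoff = survey_start - timedelta(weeks=2)
        return [
            s for s in students
            if s["enrollment_date"] <= cutoff
            and (s["separation_date"] is None
                 or s["separation_date"] >= survey_start)
        ]

    # Example: one eligible student, one enrolled too recently.
    students = [
        {"student_id": 1, "enrollment_date": date(2023, 5, 1), "separation_date": None},
        {"student_id": 2, "enrollment_date": date(2023, 6, 25), "separation_date": None},
    ]
    print(build_frame(students, survey_start=date(2023, 7, 1)))  # student 1 only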



Statistical Tests



The primary purpose of the SEA data collection is to determine student satisfaction with the centers and with the program. The survey responses will be used to determine the average number of questions indicating satisfaction, and each center will be given the percentage of responses that indicate program satisfaction. These results will be provided by department (e.g., academics and career technical training) and overall. A secondary purpose is to use SEA results to examine the program further and provide information that supports informed decisions, for example, examining whether students’ dissatisfaction with dorm living is associated with lower center success regardless of overall center satisfaction.
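
The sketch below illustrates this scoring approach. The mapping of questions to departments and the coding of which answers indicate satisfaction are hypothetical placeholders for the actual SEA scoring rules.

    # Minimal sketch of department-level satisfaction scoring (Python).
    from collections import defaultdict

    def satisfaction_rates(responses, question_dept, satisfied_codes):
        """responses: (question_id, answer_code) pairs for one center.
        Returns the percentage of answered questions indicating
        satisfaction, by department and overall."""
        hits = defaultdict(int)
        totals = defaultdict(int)
        for qid, answer in responses:
            for key in (question_dept[qid], "overall"):
                totals[key] += 1
                if answer in satisfied_codes[qid]:
                    hits[key] += 1
        return {key: 100.0 * hits[key] / totals[key] for key in totals}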



Accuracy



The SEA will not use statistical testing; however, the accuracy of the survey is still an important factor. Survey procedures ensure that students can complete the survey privately, and individual students’ results will not be shared with centers. This removes any pressure on students to misreport their level of satisfaction with a center. Further analysis of the instrument will be conducted after the survey is implemented nationally to confirm its validity and internal reliability. Because the SEA will be conducted quarterly, it allows repeated measurements; analyzing the results over time will minimize the influence of anomalies due to occasional low response rates or an unusual event at a center, and will produce an improved understanding of centers’ overall level of student satisfaction.
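
The statement does not name a specific reliability statistic; one common measure of internal reliability is Cronbach’s alpha, sketched below under that assumption.

    # Cronbach's alpha, a common internal-reliability measure (Python).
    def cronbach_alpha(scores):
        """scores: one row per respondent, one column per survey item."""
        k = len(scores[0])                 # number of items
        def var(xs):                       # sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        item_vars = [var([row[i] for row in scores]) for i in range(k)]
        total_var = var([sum(row) for row in scores])
        return (k / (k - 1)) * (1 - sum(item_vars) / total_var)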



Software and Unusual Problems



There are no unusual problems. Job Corps has several phases that correspond to different services, and the SEA is designed to show only the questions related to a student’s current phase in the program. The data needed to ensure that students are not overburdened by unnecessary questions come from administrative data maintained by Job Corps. When the sample is developed each quarter, the administrative data will be used to adjust the survey to each student’s progress in the program, thereby minimizing the number of questions and the burden of the survey.

Data processing and analysis will use SAS software. The Web-based surveys will use Voxco. The Voxco software and results reside in a private cloud environment run by Rackspace to ensure the security of the survey and its results. Voxco is an automated system: once the survey is programmed, the software automatically presents students with the correct questions based on previous responses. The system also provides automated scheduling, reminders, and online data storage, which make it easier to control the sample, monitor the study, reduce data-entry costs, and ensure the consistency and quality of the data. Automated online surveying reduces burden by speeding up the collection and analysis of results. Additionally, each student will receive an individual link to the survey with a unique username and password, ensuring that the intended student completes the survey. Once the survey is completed, the password cannot be used again.
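
In production this routing is programmed in Voxco; the sketch below only illustrates the idea of phase-based module selection, with hypothetical phase names and module assignments.

    # Illustrative phase-based module routing (Python).
    PHASE_MODULES = {
        "career_preparation": ["intake", "academics"],
        "career_development": ["academics", "career_technical_training", "dorm_life"],
        "career_transition":  ["dorm_life", "transition_services"],
    }

    def modules_for(student_phase):
        """Return only the modules relevant to the student's current
        phase, so students are not shown unnecessary questions."""
        return PHASE_MODULES[student_phase]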



B3. Maximizing Response Rates and Addressing Nonresponse



Maximizing Response Rates



The SEA is conducted at Job Corps centers for nine days every quarter. Job Corps students receive an email with a link to the web-based survey. Any internet-enabled device can be used to complete the survey. The survey can be completed at any time during the survey period, and participants can complete it over multiple sessions.



During survey development, several methods were used to improve response rates. The goals included ensuring that the language is accessible, that the survey accommodates students with disabilities, and that the survey is available at times and places convenient to the students. Job Corps employed the following criteria to meet these goals:



  • The questions are written at a reading level accessible to most Job Corps students before they receive academic improvement courses at Job Corps.

  • The questions use terminology that is familiar to Job Corps students at all centers.

  • The survey is available in Spanish. Special care was taken to ensure the translation is readable and accessible across many Spanish dialects.

  • The survey can be taken on any internet-enabled device either on or off a Job Corps campus.

  • The survey has an audio function: each Web page has a play button that, when pressed by the participant, reads aloud all question and answer-choice wording on the page. The audio can be replayed as many times as the respondent needs. This function is available in both English and Spanish.

  • The survey can be completed over the course of multiple sessions. Once completed, the link cannot be used again.

  • The survey is password protected.

  • The length of the survey was considered; on average, it takes 20 minutes to complete.

The survey procedures were developed to improve response rates while remaining mindful of students’ right to refuse to complete the survey. Center operators will encourage all Job Corps students to take the survey during the survey period, but center staff will not know which students have decided not to complete it. Job Corps provided centers with materials to remind and encourage students to complete the survey. In addition, students are sent an initial participation email and follow-up reminder emails during the survey period; the reminders go to students’ personal email as well as their Job Corps email, and text messages are sent to students who have provided consent. Each center will receive daily response-rate reports to encourage center staff, in turn, to encourage student participation.
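
A minimal sketch of such a daily center-level response-rate report follows; the field names are hypothetical, and students who separated before the survey began are excluded from the denominator, consistent with section B2.

    # Minimal sketch of the daily center-level response-rate report (Python).
    def daily_report(sampled, completed_ids):
        report = {}
        for s in sampled:
            if s["separated_before_start"]:
                continue                       # excluded per section B2
            c = report.setdefault(s["center"], {"eligible": 0, "completed": 0})
            c["eligible"] += 1
            if s["student_id"] in completed_ids:
                c["completed"] += 1
        for stats in report.values():
            stats["rate_pct"] = 100.0 * stats["completed"] / stats["eligible"]
        return report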



Nonresponse



The Office of Job Corps and the center operators will have an opportunity to promote the survey and encourage completion. A nonresponse study was conducted with the January 2022 pilot administration and the July 2022 administration. The response rates did not differ significantly by sex or race/ethnicity; as a result, post-survey adjustments were not necessary.



Because the survey prompts students who skip questions, item nonresponse is unlikely unless a student fails to finish the survey; fewer than five percent of students failed to complete the survey after beginning it. Pairwise deletion will be used for item nonresponse. A study of results over several administrations will be conducted to determine whether specific questions show item nonresponse and whether there is any evidence of response bias based on race/ethnicity, sex, and gender. Item-specific biases will be addressed by weighting or by adjusting the question.
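
If weighting is needed, a standard approach is a cell-based nonresponse adjustment; the sketch below illustrates it with hypothetical demographic cells (e.g., sex by race/ethnicity). Per the text above, weights would be applied only if bias is found.

    # Sketch of a cell-based nonresponse weighting adjustment (Python).
    def nonresponse_weights(frame_counts, respondent_counts):
        """Weight = (cell share of the frame) / (cell share of respondents);
        respondents in under-represented cells receive weights above 1."""
        total_frame = sum(frame_counts.values())
        total_resp = sum(respondent_counts.values())
        return {
            cell: (frame_counts[cell] / total_frame)
                  / (respondent_counts[cell] / total_resp)
            for cell in frame_counts
            if respondent_counts.get(cell, 0) > 0
        }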



The survey nonresponse will continue to be analyzed and adjustments will be made as needed.



B4. Test Procedures

Methods to Minimize Burden and Improve Utility



The SEA imposes minimal burden on Job Corps center staff and students. The SEA is Web-based and accessible on any internet-enabled device, and the survey takes approximately 20 minutes to complete.



Pre-Test Data Collection



Job Corps students reviewed the SEA during cognitive testing (OMB control number 1205-0436). Cognitive testing was completed with 220 students. About two-thirds of these students (142) took the web survey with an interviewer who asked questions when students completed sections or hesitated on a question. The remaining 78 students completed a paper version of the survey in which the interviewer read and discussed each question with the student.



The purposes of their review were to verify that:

  • the questions were understood as intended;

  • the questions were age and reading level appropriate;

  • the possible answer responses were clear, and the students could find an appropriate response; and

  • the program-specific language was correctly used and understood as intended.

The cognitive testing produced improvements to the survey, which have been incorporated. Upon approval, the survey will be used nationally, and several aspects of it will be reviewed: response rates will be monitored, internal reliability will be tested, and a nonresponse bias study will be conducted on the survey results. Non-substantive changes will be made as needed; for example, post-survey weighting for nonresponse bias will be applied if it is determined to be necessary. If substantive changes to the survey are required to refine and improve the survey or the survey methods, a change-request package will be submitted.



B5. Contact Information



Contact Information



The contact information for the consulting contractors on the design of the Student Experience Assessment is found below:



Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425



Battelle Memorial Institute

505 King Avenue
Columbus, OH 43201

Phone: 800-201-2011 or 614-424-6424



Contact information for the support contractor that will collect and analyze the survey information is:


Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425



