
Student Safety Assessment of Job Corps Centers

OMB Control No. 1205-0NEW

February 2020


B. Collections of Information Employing Statistical Methods


This information collection does employ statistical methods.


Project Objectives


The Student Safety Assessment (SSA) survey will be administered to currently enrolled Job Corps students. The two main objectives of collecting data through the SSA are: (a) to determine the safety of Job Corps centers based on their students' perceptions; and (b) to provide actionable safety information to the centers, including details on where unsafe activities occur and what types occur. The SSA is intended to serve as an important management tool for the Job Corps program in ensuring the safety of the students participating in the program. Without these data, critical safety issues may go undisclosed and unaddressed. As such, this information is essential for federal oversight and center management.

B1. Respondent Universe and Samples



Sampling Design

The population, or universe, of potential respondents for the Student Safety Assessment (SSA) is Job Corps students who are active at the time the SSA is administered, which will typically be the second week of every month. Additionally, the sampling frame is restricted to students who have been on center for at least two weeks. The two-week minimum enrollment period is necessary to ensure that students have received their email log-in information and have had training on accessing their email. Because the purpose of the survey is to compare centers, accuracy is determined at the center level, with a planned margin of error of 5.5 percent based on each center's size. The survey sample is a stratified random sample within each center, with gender as the stratification variable. Stratifying proportionally by gender after determining the overall center sample size is important because some centers have a heavily imbalanced gender composition, typically a much larger percentage of male students. For example, six centers have a planned On-Board Strength (OBS) that is more than 80 percent male. A completely random sample could result in months in which no females are surveyed. The stratification by gender will be proportional, ensuring that the sample has the same percentage of males and females as the center does at the time the SSA is administered.

Students may be selected and take the survey more than once during their enrollment in Job Corps. Many of the questions are about students' experiences in the last 30 days; therefore, a student's responses could vary over time.

Sample Size

The sample size is determined by center. Since Job Corps is an open entry/open exit program, the number of students enrolled can vary, so the sample size is based on the center's enrollment at the time the SSA is administered. For planning purposes, the maximum sample size was determined using the contracted size of the centers and the planned number of students by gender at each center. Appendix A shows the maximum monthly sample size nationally and provides these estimates by center. There are 121 centers; however, 116 were used for the sample-size estimate because several centers are currently closed or in a transitional period. The 116 centers have a total contracted OBS of 37,417 students. The sample allows for repeated selection; therefore, a student may take the survey more than once while enrolled in Job Corps. There are 68 small centers with 100 to 299 contracted OBS, 32 medium centers with 300 to 499 contracted OBS, and 16 large centers with 500 or more contracted OBS. The sample size based on the highest estimate of the student population will be 14,579 students, with 8,380 males and 6,198 females. The sample size per center ranges from 78 to 182 students. Small centers will have a higher proportion of students selected to minimize the margin of error. For example, a small center, Wolf Creek Job Corps Center, has a contracted OBS of 213 students; its sample size is 115 students, with 92 expected responses. By comparison, a large center, Earle C. Clements Job Corps Center, with a contracted OBS of 1,022 students, will have a sample size of 175 students, with 140 expected survey completions. This ensures the required power by center.

Table 1. Sample size based on On-Board Strength (OBS) at centers

  Number of centers    Total contracted OBS    Total sample size based on OBS
  116                  37,417                  14,579

Source: Contracted on-board strength data, Tracking of OBS report dated 10/10/2019

An added complexity is that many centers have imbalanced enrollments by gender. If a completely random sample were used, some centers could have very few or no females surveyed. Therefore, we stratify by gender to ensure the sample is proportional to the center population by gender. Using the examples noted above, Wolf Creek Job Corps Center has a planned male population of 165, or 77.5 percent of its contracted OBS; therefore, its sample of 115 will include 89 males and 26 females, selected completely at random within their gender strata. Similarly, Earle C. Clements Job Corps Center has a planned male OBS of 672, or 65.8 percent; therefore, of the 175 students in its sample, 115 will be male. Appendix A provides this level of detail for each center based on the contracted OBS. Each month, the sample size will be determined based on each center's current OBS.
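As a minimal illustration of this allocation step, the Python sketch below (with a hypothetical helper name) splits a center's sample size proportionally across gender strata; it reproduces the Wolf Creek and Earle C. Clements figures above.

```python
def allocate_by_gender(sample_size, male_share):
    """Proportionally allocate a center's sample across gender strata.

    sample_size: total students to select at the center this month.
    male_share: fraction of the center's current OBS that is male.
    """
    males = round(sample_size * male_share)   # proportional male stratum
    females = sample_size - males             # remainder forms the female stratum
    return males, females

# Examples from the text (male shares taken from the planned OBS):
print(allocate_by_gender(115, 0.775))  # Wolf Creek: (89, 26)
print(allocate_by_gender(175, 0.658))  # Earle C. Clements: (115, 60)
```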

Response Rates



This is a new survey, and there are no past national data collection efforts to use for response rate estimates. The projected response rates are approximations based on a similar survey, the Student Satisfaction Survey (SSS), which will be phased out once the new SSA is approved for deployment. The SSS has a similar respondent population and some of the same content, and it consistently had response rates over 80 percent. One likely reason the SSS response rate was high is the residential setting of Job Corps.1 This continues to be a benefit; staff will be aware of the survey and can encourage all students to participate. As the literature shows, reminders are a proven method of increasing response rates.2 The SSA response rate of 80 percent is an approximation based on the experience with the SSS and the controlled educational setting of the program. Nationally, the SSA sample with an 80 percent response rate would result in an estimated 9,317 completed surveys. Additionally, the response rate by stratum (gender) is estimated at 80 percent, which would result in an estimated 5,352 male and 3,965 female responses each month nationally.

B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed



Sampling Method



The sampling method is a proportional stratified random sample, stratified by gender. The current enrollment at each center at the time of the survey will determine the sample size needed for each center and the proportion of males and females surveyed. At this time, post-survey adjustments are not anticipated. The national results will be used to study differences across demographic groups to determine whether non-response bias exists. If the study of non-respondents shows that there is a systematic difference in response rates for a demographic group, and that group is shown to respond differently to the survey questions, then post-survey adjustments will be determined at that time. The purpose of any adjustment would be to ensure the results accurately represent the centers and the program.
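As a minimal sketch of the selection step, the Python fragment below (hypothetical roster data and function name) draws a simple random sample within each gender stratum, given per-stratum sizes from the allocation described above.

```python
import random

def draw_stratified_sample(roster, sample_by_gender, seed=None):
    """Draw a proportional stratified random sample from a center roster.

    roster: list of (student_id, gender) tuples for eligible enrolled students.
    sample_by_gender: dict mapping gender -> number of students to select.
    """
    rng = random.Random(seed)
    selected = []
    for gender, n in sample_by_gender.items():
        stratum = [sid for sid, g in roster if g == gender]
        selected.extend(rng.sample(stratum, n))  # simple random sample within stratum
    return selected

# Hypothetical roster of 10 students; select 3 males and 2 females.
roster = [(i, "M" if i < 6 else "F") for i in range(10)]
print(draw_stratified_sample(roster, {"M": 3, "F": 2}, seed=1))
```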



The sampling frame closely mirrors the intended population. The sample size is determined based on each center's current on-board strength, and the sample is drawn no more than one week before the survey is administered. Some students may separate from the program during the week the sample is drawn or during the survey week, but the number of such students is small.



Statistical Tests



The primary purpose of the SSA data collection is to determine the level of safety at the centers and across the program as a whole. The survey responses will be converted to points, and the mean will be used to determine safety scores for the different topic areas covered in the survey. The means of the different topic areas will determine the overall center score. Confidence intervals will be examined to analyze the results. A secondary purpose is to use SSA results to further examine the program and provide information that leads to informed decisions. Periodically, analyses will be conducted that examine trends and the relationship between perceptions of safety, long-term program engagement, and programmatic outcomes.
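As a minimal illustration of the scoring step, the sketch below (in Python, with a hypothetical 0-100 point scale and hypothetical data) computes a topic-area mean and a normal-approximation 95 percent confidence interval; the finite population correction is omitted for simplicity.

```python
import math
import statistics

def mean_with_ci(scores, z=1.96):
    """Mean safety score with a normal-approximation 95% confidence interval."""
    n = len(scores)
    mean = statistics.mean(scores)
    half_width = z * statistics.stdev(scores) / math.sqrt(n)
    return mean, (mean - half_width, mean + half_width)

# Hypothetical topic-area scores converted from survey responses (0-100 points).
scores = [72, 85, 90, 68, 77, 95, 81, 88, 74, 79]
print(mean_with_ci(scores))
```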





Accuracy



The accuracy of the survey and of the results by center are important factors. Survey procedures ensure that students can complete the survey privately and that the privacy of their responses will be maintained. Job Corps students' individual results will not be shared with centers. This eliminates any pressure on students to inaccurately report the level of safety at a center. Further analysis of the instrument will be conducted after the survey is implemented nationally to confirm the survey's validity and internal reliability. Power analyses were completed to determine the sample size per center. The power analyses use a formula for a proportion and assumptions based on the SSS results. Since the population is finite, the formula uses the finite population correction. Therefore, the following formula is used:



n = N * X / (X + N - 1),

where X = Z_{α/2}^2 * p * (1 - p) / MOE^2



and n is the number of survey responses needed, Z_{α/2} is the critical value of the Normal distribution, MOE is the margin of error (5.5 percent when comparing center results), p is the sample proportion (estimated to be 0.15), and N is the population size at the time of the survey.3 The sample proportion was determined based on the results of the prior Student Satisfaction Survey (SSS) and national safety surveys. The national average SSS Safety Rating (the percent of students who felt safe at Job Corps) ranged from 86.6 to 88.3 in the last four administrations. The U.S. Department of Justice, Bureau of Justice Statistics, School Crime Supplement to the National Crime Victimization Survey was the main reference survey in the development of the SSA. A comparison with similar questions in the School Crime Supplement found that between 4 and 6 percent of students reported being afraid of being attacked at school and 12 to 18 percent of male students reported being in a fight.4 Another national survey, the Youth Risk Behavior Survey (YRBS), found that between 2 and 6 percent of students carry a weapon on school grounds.5 Therefore, 15 percent is a conservative estimate of the proportion to use in the power analysis.

The final step in determining the needed sample size is to account for the anticipated 80 percent response rate: the required number of responses is divided by 0.80. The number of males needed in the sample will be determined by multiplying the sample size by the percentage of males on center at the time of the sample; the female sample size will be determined similarly. Stratification by gender is necessary to ensure that both genders are represented in the random sample, especially since some centers have a heavily imbalanced gender composition. A worked sketch of the full calculation appears below.
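The Python sketch below (hypothetical function name, assuming z = 1.96 for a 95 percent confidence level and rounding to the nearest whole response) applies the finite population correction formula above and then inflates for the 80 percent response rate; it reproduces the Wolf Creek and Earle C. Clements figures cited earlier.

```python
import math

def required_sample(N, moe=0.055, p=0.15, z=1.96, response_rate=0.80):
    """Per-center sample size using the finite population correction.

    N: the center's current on-board strength (population size).
    Returns (responses_needed, students_to_sample).
    """
    X = z**2 * p * (1 - p) / moe**2        # infinite-population requirement
    n = N * X / (X + N - 1)                # finite population correction
    responses = round(n)                   # survey completions needed
    sampled = math.ceil(responses / response_rate)  # inflate for nonresponse
    return responses, sampled

print(required_sample(213))    # Wolf Creek: (92, 115)
print(required_sample(1022))   # Earle C. Clements: (140, 175)
```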



The SSA will be conducted monthly, allowing for repeated measurements. This will ensure that the SSA results are accurate when viewed over time. Analyzing the results over time will minimize the influence of anomalies due to occasional low response rates or an unusual event at a center, and will produce an improved understanding of centers' overall level of safety.

Software and Unusual Problems



There are no unusual problems. The sample size calculations are not complex and can be completed using Excel. Data processing and analysis will use SAS software. The Web-based surveys will use Voxco. The Voxco software and results are housed in the Job Corps Data Center to ensure the security of the survey and its results. Voxco is an automated system; once programmed with the survey, the software automatically presents students with the correct questions based on their previous responses. Additionally, the system provides automated scheduling, reminders, and online data storage, which make it easier to control the sample, monitor the study, reduce data entry costs, and ensure the consistency and quality of the data. Automated online surveying reduces burden by speeding up the collection and analysis of results. Additionally, each student will receive an individual link to the survey with a unique user name and password, ensuring that only the selected student completes the survey. Once the survey link is used and the survey is completed, the link cannot be used again.

B3. Maximizing Response Rates and Addressing Nonresponse



Maximizing Response Rates



The Student Safety Assessment (SSA) will be conducted at Job Corps centers over a one-week period every month. Job Corps students will receive an email with a link to the web-based survey. Any internet-enabled device can be used to complete the survey. The survey can be completed at any time during the week, and participants can complete it over multiple sessions.



During survey development, response rates were considered and addressed through several methods. There were several goals for improving response rates: ensuring the language is accessible, ensuring the survey accommodates students with disabilities, and ensuring the survey is available at times and places convenient to the students. Job Corps employed the following criteria to meet these goals:



  • The questions were written at a sixth-grade level to ensure that most Job Corps students can complete the survey before receiving academic improvement courses at Job Corps.

  • The questions use terminology that is familiar to Job Corps students at all centers.

  • The survey is available in Spanish. Special care was taken to ensure the translation maintained a lower reading level.

  • The survey can be taken on any internet-enabled device either on or off a Job Corps campus.

  • The survey has an audio function: each webpage has a play button that, when pressed by the participant, reads all question and answer-choice wording on the page aloud. This can be repeated as many times as needed. The function is available in both English and Spanish.

  • The survey can be completed over the course of multiple sessions. Once completed, the link cannot be used again.

  • The survey is password protected.

  • The length of the survey was considered. On average, the survey takes approximately 15 minutes to complete.

The survey procedures were developed to improve response rates while remaining mindful of Job Corps students' right to refuse to complete the survey. The procedure empowers students while allowing center staff to encourage, but not force, participation, since the identities of the selected students are not known to staff. The center operators will encourage all Job Corps students to take the survey during the survey week; center staff will not know which students were selected. Materials to remind and encourage Job Corps students to complete the survey have been developed and will be distributed to centers. Additionally, the selected students will be sent an initial participation email and follow-up reminder emails during the survey week. Reminder emails will be sent to the students' personal email and Job Corps email accounts. Each center will receive daily response-rate emails to prompt staff to encourage student participation.

Nonresponse



The Office of Job Corps and the center operators will have an opportunity to promote the survey and encourage its completion. A nonresponse study will be conducted that examines the survey response rates and results by several demographics: gender, race/ethnicity, and age group. The study will identify nonresponse bias by examining differences in response rates across demographic groups and differences in survey results by demographic group, as sketched below. After the nonresponse study is completed, it will be determined whether post-survey adjustments are necessary to ensure the results accurately measure the level of safety.
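One simple way to flag a systematic difference in response rates, shown here as a sketch with hypothetical counts, is a chi-square test of independence on a respondents-versus-nonrespondents table by demographic group.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are demographic groups (male, female),
# columns are responded vs. did not respond.
table = [
    [420, 105],   # male
    [310,  65],   # female
]
chi2, p_value, dof, expected = chi2_contingency(table)
# A small p-value would suggest response rates differ by group,
# warranting a closer look at possible nonresponse bias.
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```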

Because the survey is designed to prompt students who skip questions, item nonresponse is unlikely unless a student fails to complete the survey. A survey will need to be 90 percent complete for its results to be used; otherwise, all of that student's responses will be deleted. The nonresponse study will include an examination of the incomplete surveys to determine whether the groups completing surveys differ systematically by gender, age, and race/ethnicity.
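The completeness rule is straightforward; a minimal sketch follows (hypothetical function name, with unanswered items represented as None).

```python
def survey_is_usable(responses, threshold=0.90):
    """Keep a survey only if at least 90 percent of items were answered."""
    answered = sum(1 for r in responses if r is not None)
    return answered / len(responses) >= threshold

# Hypothetical 10-item survey with one skipped item: 90% complete, so usable.
print(survey_is_usable([3, 4, 2, None, 5, 4, 4, 3, 5, 4]))  # True
```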



Reliability and Validity



The SSA was developed based on national surveys of similar populations. In particular, the U.S. Department of Justice, Bureau of Justice Statistics, School Crime Supplement to the National Crime Victimization Survey served as a model; that survey is used in an educational setting with similar age groups for the same purpose. The internal reliability of the SSA will be determined after initial survey results are collected. A Cronbach's alpha of 0.8 will be considered acceptable question reliability.
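As a minimal sketch of this reliability check (in Python with NumPy, using hypothetical item responses), Cronbach's alpha can be computed from the item variances and the variance of the total score.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha: rows are respondents, columns are survey items."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                              # number of items
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses: 5 students x 4 items on a 1-5 point scale.
responses = [[4, 5, 4, 5], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 5, 5]]
print(cronbach_alpha(responses))  # 0.8 or above indicates acceptable reliability
```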



B4. Test Procedures

Methods to Minimize Burden and Improve Utility



The SSA imposes minimal burden on Job Corps center staff and students. The SSA is Web-based and accessible on any internet-enabled device, and the time burden is minimal: pre-testing shows that the survey can be completed in approximately 15 minutes. Additionally, a random sample is used to decrease the number of Job Corps students asked to take the survey each month.

Survey Refinement Activities



Survey refinement activities were conducted for the SSA to determine:

  • Whether the questions were understood as intended,

  • Whether the questions were age and reading level appropriate,

  • Whether the possible answer responses were clear and the students could find an appropriate response, and

  • Whether the program specific language was correctly used and understood as intended.

The survey refinement activities produced improvements to the survey, which have been incorporated. No further changes to the survey instrument are planned.



Implementation

The SSA data collection instrument will be implemented nationally. The response rate results and any identified response bias will be used to adjust the results through post-survey weighting. If a substantive change to the methods or the instrument becomes necessary, a material change request will be submitted.

B5. Contact Information




The contact information for the contractors who were consulted on the design of the Student Safety Assessment is found below:



Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425



Battelle Memorial Institute

505 King Avenue
Columbus, OH 43201

Phone: 800-201-2011 or 614-424-6424



Contact information for the support contractor that will collect and analyze the survey information is:


Decision Information Resources, Inc.
3900 Essex Ln, Suite 900
Houston, Texas 77027
Main Telephone: 713-650-1425

1 Carini, R. M., Hayek, J. C., Kuh, G. D., Kennedy, J. M., & Ouimet, J. A. (2003). College student responses to web and paper surveys: Does mode matter? Research in Higher Education, 44(1), 1-19.

2 Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132-139.

3 Daniel, W. W. (1999). Biostatistics: A Foundation for Analysis in the Health Sciences (7th ed.). New York: John Wiley & Sons.

4 Musu, L., et al. (2019). Indicators of School Crime and Safety: 2018 (NCES 2019-047). Retrieved from https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2019047.

5 Ibid.
