MEMORANDUM OMB # 1850-0923 v.4
DATE: May 10, 2016
TO: Robert Sivinski
Office of Information and Regulatory Affairs, Office of Management and Budget
FROM: Isaiah O'Rear
National Center for Education Statistics
THROUGH: Kashka Kubzdela
National Center for Education Statistics
SUBJECT: ED School Climate Surveys (EDSCLS) Benchmark Study 2017 Updated
In April 2016, the National Center for Education Statistics (NCES) received approval for a revised EDSCLS 2017 Benchmark data collection plan and materials (OMB# 1850-0923 v.3), primarily allowing NCES to begin applying in April 2016 for research permits from special-handling districts so that it could recruit schools for the study. During the originally planned EDSCLS 2016 recruitment, NCES encountered a low school response rate that made it difficult to reach the target benchmark sample of 500 schools. In that change request, NCES proposed revisions to improve the recruitment schedule for data collection in 2017; in this submission, NCES proposes additional revisions to give sampled schools the option to administer the EDSCLS to a sample, rather than a universe, of their staff and grade 5-12 students.
Offering schools the choice to administer the 2017 benchmarking EDSCLS questionnaires to a sample, rather than to the originally planned universe, of students and staff is designed to encourage a larger proportion of sampled schools to participate in the EDSCLS benchmark study. Schools will be able to administer the EDSCLS questionnaires to the principal and to all teachers, noninstructional staff, and grade 5-12 students, as originally planned; or they can opt to administer the noninstructional staff questionnaire to the principal only,1 the student questionnaire to one randomly selected class of students from each of no more than four eligible grades (grades 5-12), and the instructional staff questionnaire to two randomly selected teachers per grade. This reduced-burden option is designed to address one of the major concerns voiced by schools that declined to participate in the 2016 national benchmark study.
In addition, a small nonresponse follow-up study of schools sampled for the EDSCLS 2016 will be conducted to learn more about schools' reasons for not participating in the survey. This information can be used to improve NCES's efforts to recruit sampled schools for the EDSCLS 2017. The study will interview personnel from ten schools that agreed to participate in the EDSCLS 2016, from twenty schools that declined to participate, and from thirty schools that never responded to the survey request.
The following revisions were made to the EDSCLS clearance package materials in this submission:
Part A:
Added the option that schools can choose either the originally proposed universe or the newly proposed sampling-within-school data collection (as described above).
Explained that tablets will be provided as incentives to all participating schools, and that schools conducting a universe survey will also receive a report summarizing the school's results in comparison to the national benchmark.
Updated response burden estimates to show both the minimum burden, if all participating schools choose the within-school sample option, and the maximum burden, if 700 schools participate and all choose a universe survey. Updated the reasoning for the changes to the response burden and to the cost to the federal government for the benchmark study.
Explained the need for a Nonresponse Follow-up Study using sample units from the EDSCLS 2016.
Added information on the:
frequency of data collection for the Nonresponse Follow-up Study.
payment and costs for the Nonresponse Follow-up Study.
sensitive nature of the questions for the Nonresponse Follow-up Study.
respondent burden associated with the Nonresponse Follow-up Study.
costs to respondents and to the federal government of the Nonresponse Follow-up Study, and explained the reason for the change in response burden and costs.
timing of the Nonresponse Follow-up Study.
Part B:
Added the option that schools can choose either the originally proposed universe or the newly proposed sampling-within-school data collection (as described above).
Explained the plan to prioritize recruitment for 714 schools.
Explained exclusions from the universe and a plan to oversample schools with a relatively large percentage of American Indian/Alaska Native non-Hispanic student enrollment to ensure this population is well represented in the sample.
Explicitly stated that state, district, and school IDs will be used to sort the schools within each stratum, in addition to the percentage of White non-Hispanic student enrollment and the percentage of student enrollment eligible for free or reduced-price lunch.
Explained the sample size, the expected response rate, and the option to conduct within-school sampling.
Updated the precision section to reflect the updated sampling plan.
Added details and explanations on the mail, email, and phone recruitment process. Added recruitment timelines and procedures. Indicated the contents of recruitment packages.
Added a handwritten Post-it note to the recruitment package and explained how it will increase response rates.
Added a section explaining tracking procedures, and a section on possible in-person follow-ups as a strategy to understand potential nonresponse issues.
Explained reasoning for new recruitment timeline.
Added information on the sample design, procedures for collection of information, and methods to maximize response for the nonresponse follow-up study.
Appendix A:
Revised language in the three letters to districts and the first two letters to schools.
Added a follow-up letter/email informing schools of the within-school sampling option and the universe survey option.
Explained the universe survey option and the within-school sampling option in the FAQs.
Added one more step to the school coordinator duties in the Summary of Activities for School Coordinators: providing rosters for schools that select the within-school sampling option.
Added the interview protocols for the nonresponse follow-up study.
Appendix C:
Updated the respondent eligibility criteria section to reflect the new within-school sampling option.
1 NCES does not plan to produce national benchmark scores for noninstructional staff. Based on the 28.1 percent survey completion rate among noninstructional staff in the 2015 pilot test (see Appendix D for the pilot test report), there is a concern that the responses would not be representative of the noninstructional staff population.