
ED School Climate Surveys (EDSCLS) Benchmark Study 2016

Response to 30-Day Public Comments

OMB: 1850-0923


Public Comments Received During the 30-day Comment Period and NCES Responses

Docket: ED-2015-ICCD-0081, School Climate Surveys (SCLS) Benchmark Study 2016

Comments On: ED-2015-ICCD-0081-0021, ED-2015-ICCD-0081-0022 (duplicate)


Comments related to Item Wording, Survey Accessibility, Study Sample, and other topics
Comment Number: 2, 3


Document: ED-2015-ICCD-0081-0021
Name: Melissa Tooley

Address: Washington, DC, 20036

Organization: New America


Re: ED School Climate Surveys (EDSCLS) Benchmark 2016


Dear Ms. Mullan:

We are writing in response to the request for comments on the "ED School Climate Surveys (EDSCLS) Benchmark 2016," as published in the Federal Register on August 25, 2015, docket number ED-2015-ICCD-0081.


New America is a nonprofit, nonpartisan public policy institute that works to address the next generation of challenges facing the United States. New America's Education Policy Program uses original research and policy analysis to solve the nation's critical education problems, serving as a trusted source of objective analysis and innovative ideas for policymakers, educators, and the public at large. We combine a steadfast concern for low-income and historically disadvantaged people with a belief that better information about education can vastly improve both the policies that govern educational institutions and the quality of learning itself.


As our research makes evident, we believe that a positive and productive school climate contributes to a quality education. This effort by the U.S. Department of Education (ED) to provide benchmark data on school climate is a step toward strengthening the academic experience of students across the nation.

According to the abstract, these data are intended to create a "basis of comparison between data collected by schools and school systems and the national school climate." Providing schools with a ready-to-use survey and the opportunity to compare their results to others will pave the way for schools to engage in self-reflection on their climates and focus on the climate improvements needed to best serve their students.


The comments that follow are grouped by the five topics outlined in the docket’s supplementary information: (1) the collection’s necessity to the proper functions of the ED; (2) processing and using the information in a timely manner; (3) the accuracy of the estimate of burden; (4) enhancing the quality, utility, and clarity of the information collected; (5) minimizing the burden of collection on the respondents.


(1) Is this collection necessary to the proper functions of the Department?

For decades, education experts have acknowledged the critical role school climate plays in influencing student outcomes and driving teacher satisfaction and retention.1,2 Despite this knowledge, many schools have not focused on improving climate, in part because of a lack of knowledge and/or capacity to do so in a meaningful way. The Department has the means and the responsibility to collect and disseminate information vital to the development of successful and equitable schools: providing a survey and benchmarks to help schools improve the climate for staff and students can be a key element of these efforts.

As highlighted in New America's report, Skills for Success, studies have found that school climate (i.e., school environments, policies, and practices) can influence the behavior, academic performance, and socio-emotional wellness of students. A positive school climate can encourage collaboration, cohesion, and feelings of safety and trust, all of which promote a better teaching and learning environment. In the pursuit of equitable educational outcomes, assessing and improving school climate should be considered as one strategy for generating better academic and life outcomes for all students.

However, it is difficult for schools and local educational agencies (LEAs) to undertake this work without a clear idea of what types of school climate information are important to collect and reflect upon, or what "good" results might look like. Thus, providing a no-cost survey tool, collecting nationally representative data using this survey, and providing benchmarks for comparison will aid schools and LEAs in meeting their objectives for students.


(2) Will this information be processed and used in a timely manner?


Aside from the expectation that schools complete the survey before April 2017, we did not find a proposed timeline for when the Department expects to complete data collection and publish results. We explain below how clearly articulating the time expectations placed on schools will improve their timeliness in completing the surveys. Expected dates for survey data publication should also be shared with participants. This survey asks participating schools to invest their time and resources: having clarity about the timeline for returns on those investments may increase schools' buy-in.


(3) Is the estimate of burden accurate?

In an age of "data-driven" education, more schools have established systems for conducting large-scale surveys. However, several minor additions might ease implementation and reduce the burden on schools administering the survey:

  • Create a concise suggested timeline to accompany the Survey Administration Procedures. Schools that feel supported are more likely to commit to the survey process and fully participate. We recommend providing a one-page survey preparation and completion timeline for distribution to school leaders and staff. For example, in its appendix, the California Climate Survey includes a one-page checklist of school tasks leading up to the survey administration. The list identifies how many weeks before survey distribution each task must be completed. It also includes space for school-determined due dates and the person responsible for each task. This proposed template articulates the necessary prior preparation (i.e., administrator training, permission slip distribution, staff investment, make-up surveys, etc.) and minimizes the possibility of schools underestimating their time investment. Prepared schools are more likely to complete the survey thoroughly and on time, and less likely to rush, which can lead to reduced participation and/or missed deadlines.

(4) How might the Department enhance the quality, utility, and clarity of the information to be collected?

Any information's utility depends on whether it is perceived to be of high quality and significance. With regard to the school climate survey, some schools may disregard information that seems irrelevant or outdated. The recommendations below are intended to promote schools' understanding of the survey results' benefits, relevance, and use.

  • Update the benchmark on an annual basis. Schools' climates are not static: they are influenced by changes among school staff, policies, and practices, not to mention changes among the students and communities they serve. Additionally, if schools are successful in using data from this and related surveys to improve climate, as is the objective of this survey, then a national benchmark should trend upward over time. Thus, data from one year, while a critical starting place, are not sufficient to promote continued efforts to improve school climate. Moreover, for researchers to study the impact of improving school climate on staff and student outcomes, longitudinal analysis over the course of several school years will be required. As a result, we recommend updating survey results on an annual or biennial basis.

  • Disaggregate national survey results by various school characteristics. Schools exist in many varied contexts and environments, and hence may not feel a "national benchmark" is fully applicable to them and their circumstances. For the tool to be seen as useful by a variety of schools and LEAs, the Department should disaggregate results by different types of school characteristics (e.g., urban vs. suburban vs. rural, size, student demographics, middle vs. high schools, etc.), wherever base sizes allow. This information will also be helpful to researchers attempting to understand which aspects of school climate may be more or less correlated with certain types of schools, and how those aspects relate to school performance.

To improve the clarity of the survey questions, particularly for students, we provide the following suggested edits to the phrasing used in the survey and/or the content of specific questions. By ensuring that respondents understand the questions and are only required to answer questions that are relevant to them, these edits will improve the quality of the survey results obtained.

  • Clarify question intentions. We identified several questions whose content and phrasing may yield different results than intended. The recommendations focus on slight rephrasing to improve student understanding of these questions. Although the alterations are minor, misinterpretation of important topics like student health and safety can have serious consequences.

  • The questions on pages IS7 to IS10 of the survey begin with "Students at this school…" but do not clarify whether respondents should think about any students, most students, or all students when answering. As a result, the questions may yield invalid, inconsistent results, as different students may answer based on their own assumptions about what the survey is asking. The Department should consider changing the questions to improve clarity and consistency, perhaps by changing the scale to be based on the proportion of students engaging in the behaviors outlined (e.g., All, Most, Some, None).

  • Question 15 concerns student perceptions of options after experiencing sexual assault or dating violence. As currently written, it is unclear whether the question is asking if there is an official school resource for students or if students feel comfortable talking to an adult if they experience sexual violence. The question should either clearly state that it refers to a resource (i.e., "At this school, there is a designated teacher or some other adult…") or focus on students' comfort with using a staff member as a resource (i.e., "At this school, there is a teacher or some other adult students are comfortable going to…").

  • As currently written, question 52 asks students whether trying drugs is socially acceptable. However, questions 48 and 49 use the phrase "use/try." While "try" tends to imply a one-time experience, "use" can refer to singular or repetitive use. Students' perceptions of these two scenarios could differ significantly and have potentially different implications for school climate. We recommend that the language be consistent.

  • Alter language to replace "parent" with "guardian." The general language of the survey, including the title ("Parent Survey"), refers to all guardians collectively as "parents." Given the shifting landscape of the American family, we recommend altering the language to read "Parent/Guardian" or "Family." While "parent" may not be confusing for school staff, it might elicit an emotional response from students or guardians who do not identify with that term. This word choice may inadvertently reduce participation among students or guardians who feel excluded by the language.

  • Ensure a nationally representative sample of schools serving students with disabilities. Number 78 in the Noninstructional Staff Survey, number 68 in the Instructional Staff Survey, and number 36 in the Parent Survey all ask about the services provided for students with disabilities. As written, the questions assume knowledge of the services required by and provided to these students, which many participants may not possess. We recommend adding a "Not Applicable" or "Don't Know" response to the list of options. In addition, the schools surveyed should constitute an appropriate national representation of schools serving students with different learning abilities and needs.

(5) How might the Department minimize burden of this collection on the respondents, including through the use of information technology?

The Department plans for the survey to be accessible on both computers and tablets. To be attentive to the varied needs of student, family, and staff participants who may not have access to these devices, and to minimize the burden of data collection, the Department should ensure that the survey can also be completed on mobile phones. To ensure holistic and inclusive data, we offer one additional recommendation.

  • Create accommodation materials. We recommend creating versions of the survey that are accessible to all students, including Braille and Text-to-Speech Technology (TTST) versions. In Appendix C, the Platform Data Collection Instructions state that "schools should provide the same accommodations for students [with disabilities] as are usually provided for student testing." For students with severe sight impairment or who struggle with reading comprehension, TTST is a useful tool. TTST materials eliminate the need for separate testing times or facilities where teachers read the tests aloud, and schools will not have to spend additional resources creating audio-recorded versions of the surveys.

Without proper support, some schools may exclude those students from the survey entirely. Students with disabilities are an integral part of school communities, and their opinions help create a truly holistic view of school climate, especially in the mission for equitable education opportunities.


We appreciate the opportunity to offer our recommendations to inform this important work. Any questions regarding these comments can be addressed to Melissa Tooley at [email protected].


Sincerely,


Education Policy Program, New America



RESPONSE:

Dear Ms. Tooley,

Thank you for your comments submitted during the 30-day public comment period for the ED School Climate Surveys (EDSCLS) Benchmark Study 2016. The National Center for Education Statistics (NCES) appreciates your interest in EDSCLS. We would like to respond on the following topics:

Proposed timeline:

As stated in the Supporting Statement Part A (A.16), the data collection for the national benchmark study will end in May 2016 and results from the study will be incorporated into the fall 2016 release of the updated EDSCLS platform. NCES will also publish a report with national benchmark results in fall 2016.

Preparation checklist for school coordinators:

A preparation checklist for school coordinators in the national benchmark study has already been included in Appendix A. In the national benchmark, the survey will be administered by NCES, which will carry out much of the preparation work.

The goal of the EDSCLS national benchmark data collection is to provide benchmark data for any school, school district, or state that chooses to use the EDSCLS platform in its jurisdiction. An administration and technical guide has been developed that will accompany the release of the EDSCLS platform this fall. The guide instructs survey administrators on how to prepare for their own data collection using the platform. Instead of a checklist, a preparation flow chart provides guidance to institutions with various needs and technological capacities.

Update the benchmark on an annual basis:

NCES was provided funding to develop the EDSCLS and to conduct one benchmark data collection. In addition to this funding constraint, NCES is not staffed to make this a regular data collection.

Disaggregate national survey results by various school characteristics:

In fall 2016, NCES will publish a report presenting the results of the EDSCLS national benchmark survey. The report will disaggregate the results by school characteristics where base sizes allow.

Clarify question intentions:

EDSCLS survey instruments went through several rounds of testing, evaluation, and refinement to ensure that the data collection tool produces reliable and valid measures. The development started in 2013 with a school climate content position paper – a review of the existing school climate literature and existing survey items. A Technical Review Panel (TRP) met in early 2014 to recommend items to be included in the EDSCLS. In the summer of 2014, cognitive interviews were conducted on the draft SCLS items in one-on-one settings with 78 individual participants: students, parents, teachers, principals, and noninstructional staff from the District of Columbia, Texas, and California. The draft items were retained, revised, or dropped based on the testing results.

The resulting set of items was then pilot tested in 2015. A convenience sample of 50 public schools that varied across key characteristics (region, locale, and racial composition) participated in the pilot test. The data from the pilot test were used to develop the final EDSCLS survey instruments and to construct scales for topic areas and domains based on confirmatory factor analyses and Rasch modeling (see Appendix D). Pilot test sites were also asked to report questions respondents had during the pilot test. About a third of the items, those that proved problematic or redundant, were removed from the instruments after the pilot test.

The items included in the final version of the EDSCLS instrument, which will be used in the national benchmark study, performed well in both cognitive laboratory testing and the pilot test. Any suggested wording change would need to be fully tested before it could be made to the current instruments. Please note that survey respondents can skip any EDSCLS item if they do not know the answer or feel that it is not applicable to them.

Use of information technology:

The EDSCLS survey can be completed on smartphones in addition to computers and tablets.

Creating versions of the survey that are accessible to all students:

The EDSCLS survey is web-based. Respondents will access the survey as a website. The website will be compliant with Section 508 of the Rehabilitation Act of 1973 (29 U.S.C. § 794d). This compliance ensures that the website will function correctly when used with assistive technology, such as screen-readers.

For the data collection, NCES will ask schools to use any accommodation policies and procedures already in place at the school. This approach was used in the EDSCLS pilot test, in which schools were asked to provide accommodations to their students and did not report any problems during the data collection or at the debriefing meetings after the pilot test. NCES will continue collecting feedback from schools in the national benchmark study.

The EDSCLS national benchmark uses school data from the Common Core of Data (CCD) as a sampling frame. Due to constraints on resources and sample size, the sample for the EDSCLS national benchmark study was restricted to public schools categorized as "regular" in the CCD. Other school types, including schools focused on special education, are excluded from the national benchmark sample frame, and there are no school-level data on student disabilities in the CCD. However, students with disabilities are well represented in regular schools, as 95 percent of those served under IDEA attend regular public schools at least part time (http://nces.ed.gov/programs/digest/d13/tables/dt13_204.60.asp). This is why the instructions about accommodations mentioned above are important.

Thank you again for your interest in EDSCLS.

1 New America. (2014). Skills for Success. Washington, DC: Melissa Tooley and Laura Bornfreund.

2 Education Trust. (2012). Building and Sustaining Talent. Washington, DC: Sarah Almy and Melissa Tooley.

