Volume 1: SCLS Pilot Test 2015

NCES Cognitive, Pilot, and Field Test Studies System

OMB: 1850-0803


Attachments:

Attachment 1 – Questionnaires

Attachment 2 – Justification Table

Attachment 3 – Communication Materials

Attachment 4 – Platform Instructions and Draft Guides

Attachment 5 – Cognitive Interviews and Usability Testing Report



Justification

The School Climate Surveys (SCLS) are a suite of survey instruments being developed for schools, districts, and states by the U.S. Department of Education’s National Center for Education Statistics (NCES). This national effort extends current activities that measure school climate, including the state-level efforts of the Safe and Supportive Schools (S3) grantees, which were awarded funds in 2010 by the Department of Education’s Office of Safe and Healthy Students (OSHS) to improve school climate. Through the SCLS, schools nationwide will have access to survey instruments and a survey platform that will allow for the collection and reporting of school climate data across stakeholders at the local level. The surveys can be used to produce school-, district-, and state-level scores on various indicators of school climate from the perspectives of students, teachers, noninstructional staff and principals, and parents and guardians. NCES will also provide benchmark data, collected from a nationally representative sample of schools across the United States, to facilitate comparisons of school climate scores at the local and national levels.

Background and Key Respondents

Although work on the SCLS began in response to a 2013 White House initiative, formal research on school climate, rooted in studies of organizational environments, can be traced back to the 1950s (Halpin and Croft 1962; Wilson 1959). The focus of the research expanded over the years to include many different aspects of school climate, including the physical condition of schools, a sense of community, and setting high expectations for academic achievement (Insel and Moos 1974; Rutter et al. 1979; Bandura 2001, 2007; Hoy, Hannum, and Tschannen-Moran 1998; Evans 1997; Hoy and Sabo 1998; Newmann 1992; Stockard and Mayberry 1992). More recently, the conceptualization of school climate has evolved to include the characteristics of school administration; student participation; students’ social and emotional competencies; staff cohesion; and staff relationships with other staff, students, and school leaders (McLoughlin, Kubrick, and Lewis 2002; Wynn, Carboni, and Patall 2007; Harper 2010; Osher and Kendziora 2010).

School climate has been recognized as a potential lever in education policy since at least 1908, described by Perry (1908) in his book The Management of a City School as the “esprit de corps.” Positive school climates are conducive to learning, whereas negative school climates are a barrier. A meta-analysis by Wang, Haertel, and Walberg (1997) that examined over 11,000 statistical findings to identify the most salient factors in student learning determined that “the different kinds of instruction and climate had nearly as much impact on learning as the student aptitude categories” (p. 205). Freiberg (1999) arrived at the conclusion that school climate is not only measurable and material to school stakeholders, but also malleable. The purpose of the SCLS is to provide school, district, and state leaders with reliable, actionable data that will afford them the levers needed to foster positive school climates.

The SCLS instruments will allow education leaders to seek the viewpoints of multiple respondent groups. A multi-perspective approach is important because each of the stakeholders experiences school climate differently. Students, as the focus of education, are the consumers. Teachers, as the agents of instruction, are the producers. School administrators and other staff, as the policymakers, are the leaders and implementers. Parents and guardians, as their children’s supporters and advocates, are invested participants in fostering their children’s academic, social, emotional, and physical development.

As a result, the SCLS is a suite of four survey questionnaires that each measures the perspective of a different group of stakeholders. The first instrument—the “student survey”—is for middle and high school students. The second—the “teacher and instructional staff survey”—is for middle and high school teachers and other instructional staff. The third—the “principal and noninstructional staff survey”—is for middle and high school noninstructional staff (e.g., administrators, counselors, coaches, librarians); most of the items in this survey will be asked of all noninstructional staff, but some will be asked of principals only. The final instrument—the “parent survey”—is for parents of middle and high school students. Attachment 1 includes the four questionnaires (including Spanish versions of the student and parent surveys), and Attachment 2 includes a justification table for all questionnaire items.

Purpose and Uses of the Data

The pilot test of the SCLS will take place from February to May of 2015. It will be an operational test, under “live” conditions, of all components of the survey system (e.g., survey instruments and the data collection, processing, and reporting tools) prior to the national administration. For the parent survey, the pilot test will also serve as a feasibility study to determine whether it is possible to achieve a satisfactory survey response rate.

There are 131 items on the student survey, 117 items on the teacher/instructional staff survey, 137 items on the principal/noninstructional staff survey, and 47 items on the parent survey planned for the pilot test. Within each survey, NCES aims to have a similar number of items in each school climate topical area. Per the Technical Review Panel’s recommendations, the aim for the student, instructional staff, and noninstructional staff surveys is to measure most of the topical areas with scales. For each topical area that will be measured with a scale, NCES has proposed 7-10 items, with the aim of ending up with 5-6 items per topical area in the final SCLS instruments. The parent survey contains a smaller number of items, so the aim there is instead to create domain scores, if the data permit; NCES has proposed 10-16 items per domain, with the aim of ending up with 6-10 items per domain in the final SCLS instruments.

The data from the pilot test will be used primarily to refine the SCLS survey items. Our target number of items is 70-80 for the final student, instructional staff, and noninstructional staff surveys, and 20-30 for the parent survey. The pilot test is designed to answer questions such as the following:

  • Do the survey questions contribute to target scales as expected?

  • Are there unusual patterns of responses to any survey questions?

  • Are there major problems with the data collection, processing, and reporting tools?

  • Are there major implementation issues we need to modify for a national administration of the surveys?

Internal structure analyses for target scales will be conducted. These analyses will include internal consistency reliability analyses (Cronbach’s alpha) and both exploratory and confirmatory factor analyses to evaluate the extent to which items measure the same construct, as well as whether they measure the constructs they were designed to measure. A flag variable will be created to identify potentially problematic items that deserve in-depth review. The values of the flag variable will indicate problems based on the following criteria, where applicable (a minimal sketch of this flag logic appears below):

  • Item nonresponse rate (e.g., 50 percent nonresponse)

  • Response variance (e.g., 90 percent of the respondents selected the same response option)

  • Item to total scale score correlation

  • Loading of the item to the target scale (based on factor analysis)

  • Item fit statistics (based on IRT analysis)

Items that are not flagged as problematic based on these criteria will be candidates to be included in the final item pool. Not all of the potentially problematic items identified by these criteria will necessarily be dropped from the item pool; the in-depth review also will take into consideration other aspects of an item’s contribution to the instrument, such as the priority of its topical area or whether the item provides actionable information.
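
To make these checks concrete, the following sketch computes Cronbach’s alpha and three of the flag criteria for a single target scale. It is an illustration only, assuming responses are available as a pandas DataFrame with one column per item (numeric response codes, NaN for nonresponse); the thresholds, item names, and helper functions are hypothetical and are not part of the SCLS system.

```python
# Illustrative only: a minimal sketch of the item-flagging logic, not SCLS code.
import pandas as pd

def cronbach_alpha(scale: pd.DataFrame) -> float:
    """Internal consistency reliability of one scale (complete cases only)."""
    items = scale.dropna()
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

def flag_items(scale: pd.DataFrame,
               max_nonresponse=0.50,    # e.g., 50 percent nonresponse
               max_modal_share=0.90,    # e.g., 90 percent chose one option
               min_item_total_r=0.30):  # assumed cutoff for item-total r
    """Mark potentially problematic items per the criteria listed above."""
    total = scale.sum(axis=1, skipna=False)  # NaN total if any item missing
    flags = {}
    for col in scale.columns:
        item = scale[col]
        # Corrected item-total correlation: item vs. sum of the other items.
        item_total_r = item.corr(total - item)
        flags[col] = {
            "high_nonresponse": item.isna().mean() > max_nonresponse,
            "low_variance": item.value_counts(normalize=True).max() > max_modal_share,
            "weak_item_total": item_total_r < min_item_total_r,
        }
    return pd.DataFrame(flags).T  # one row per item, one column per check

# Usage (hypothetical item names): flagged items go to in-depth review,
# not automatic deletion, as noted above.
# scale = responses[["eng_01", "eng_02", "eng_03"]]
# print(cronbach_alpha(scale))
# print(flag_items(scale))
```

The factor-loading and IRT fit criteria would be computed with dedicated psychometric tooling and are omitted from this sketch.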

Another goal of the pilot test is to examine the ease of use of the SCLS system and the supporting materials, including administration guides and technical manuals. Any technical difficulties or issues reported by the districts or schools will be addressed in the revision and updates of the system and/or the supporting materials after the pilot test.

Design

The goal of developing the SCLS platform is to provide local education agencies with a tool that is free of charge, collects data, and produces user-friendly school climate reports. Local education agencies administering the surveys will be able to store the data locally in their own data systems. Therefore, during the SCLS Pilot Test, districts and schools, rather than NCES,[1] will administer the surveys and collect the data on their own servers. A convenience sample of 50 public schools (25 middle schools and 25 high schools) will participate in the pilot test, which will be conducted from February to May 2015. At least one district will be recruited so that all schools in that district are included. In each school, all eligible students, their parents, teachers, and staff will be asked to participate in the respective surveys. All survey questionnaires will be administered online through the SCLS system.

Sample

For the pilot test, NCES plans to include all middle and high schools serving students in grades 5-12 in one or more school districts, for a total of 50 schools. Selecting all of the schools in a district will allow us to test the production of climate scores for districts. This approach will also provide an opportunity to test operational procedures at the district level. The outreach to districts or schools will be purposive, with a goal of representing the range of characteristics that may affect schools’ ability to self-administer the surveys. For example, American Institutes for Research (AIR), which will carry out the pilot test on behalf of NCES, will identify districts that span rural, urban, and suburban schools with differing technology infrastructure and experience, enrollment size, and percentage of students eligible for free or reduced-price lunch (FRPL). After the districts and schools are recruited, AIR will provide information packages and training to school coordinators through a variety of media, including print materials and telephone conference calls.

Student Survey Block Design

For students, it is important for the survey to be completed in a typical class period, while also leaving time for the other aspects of survey administration (e.g., settling into the computer lab, providing log-in instructions). Due to the large number of student items in the pilot test, a balanced incomplete block (BIB) design will be used for students. This method allows SCLS to sample enough students to obtain precise results for each survey item while generally consuming no more than an hour of each student's time. Based on the three domain areas of the SCLS—Engagement, Safety, and Environment—three blocks will be created, and each student will answer only two of the three blocks. In this design, each survey block appears twice in each of the two possible positions, and each ordered pair of blocks appears exactly once. Therefore, for the pilot test, there will be six versions of the student surveys, and they will be assigned randomly to student usernames (table 1; a minimal sketch of this assignment follows the table). Although we do not have concerns about order effects of the domain areas in general, this method will also allow us to examine order effects. If a significant order effect is found in the pilot test, NCES will consider adding a feature to the SCLS platform that randomizes the order of domain areas during survey administration, and will also consider randomizing domain areas in the national benchmark study.


Table 1. Six versions of student surveys based on BIB design

Student survey version | Position 1 survey block | Position 2 survey block
1 | Engagement | Safety
2 | Safety | Environment
3 | Environment | Engagement
4 | Engagement | Environment
5 | Environment | Safety
6 | Safety | Engagement
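
As referenced above, here is a minimal sketch of generating the six versions in table 1 and assigning them randomly to student usernames. It assumes simple uniform random assignment; the function and variable names are illustrative and do not describe the SCLS platform's actual implementation.

```python
# Illustrative only: generating and assigning the six BIB survey versions.
import itertools
import random

DOMAINS = ["Engagement", "Safety", "Environment"]

# All ordered pairs of distinct blocks: the same six versions as table 1.
# Each block appears twice in each position, and each ordered pair occurs once.
VERSIONS = list(itertools.permutations(DOMAINS, 2))
assert len(VERSIONS) == 6

def assign_versions(usernames, seed=None):
    """Map each student username to one of the six two-block versions."""
    rng = random.Random(seed)
    return {user: rng.choice(VERSIONS) for user in usernames}

if __name__ == "__main__":
    demo = assign_versions([f"student{i:04d}" for i in range(5)], seed=1)
    for user, (first, second) in demo.items():
        print(f"{user}: {first} block first, then {second}")
```

Because each domain block appears in four of the six versions, under uniform random assignment each item is expected to be seen by roughly two-thirds of students.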


Data Collection and Support

The pilot test data will be collected through the SCLS system. If a district is participating in the test, a district administrator can set up data collections for all schools the district would like to include and generate usernames for each school. Schools in the district will log in using the usernames produced by the SCLS system to answer the surveys. The data will be collected and stored on the district server before they are exported and sent to AIR. The district administrator will be able to view the survey submission reports and item response frequency reports for the entire district and for each school. If a school chooses to participate in the test independently, the school can set up the survey system on its own server to collect data from the school. The school will then follow the same steps as districts to export and send the data to AIR when the data collection is closed. The school will be able to see the survey submission reports and item response frequency reports for its data collections.

Previous experience with similar administrations indicates that, in most cases, student, staff, and school surveys can be completed by respondents without technical support. Nonetheless, leading up to and throughout the February-May 2015 pilot study period, AIR will field and respond to inquiries about the study through a Help Desk, via a toll-free telephone number and/or a study-branded email address. The Help Desk staff will respond to all inquiries within one business day and will record all inquiries, along with their resolution, in a pilot test Help Desk log. Should any technical issues with the survey platform arise during the pilot test, AIR staff will work immediately to resolve them. Each issue will be brought to NCES’s attention and logged, along with any corrective action taken.

Recruitment and Payments to Respondents

For the pilot test, a convenience sample will be used. AIR will reach out to districts and schools that AIR has worked with before or that have expressed interest in SCLS work during the development stage. The communication materials and sample parental consent forms for districts and schools are included in Attachment 3.

There is no payment to respondents in the pilot test. To encourage district and/or school participation and help with the data collection, NCES will provide each participating school a laptop computer valued at up to $400, preloaded with all of the recruitment materials on the background and goals of the SCLS system and instructions on how to take the surveys. The default homepage of the browser will be set to the SCLS data collection log-in page for the pilot test. Each participating school or district will also receive an external hard drive valued at up to $100 that will contain the SCLS system and technical manuals on how to set it up. After the final revisions to the system are completed based on the pilot test results, participants will receive the updated system for their future administrations of the SCLS. In addition to the reports generated by the SCLS at the close of data collection, participating districts will receive scale score reports for the district and each participating school, and participating schools that are not part of a participating district will receive school scale score reports after the pilot data analyses are completed.

Assurance of Confidentiality

School or district administrators will receive materials that both describe the study and clearly indicate that participation is voluntary. The data collection will be hosted on the districts’ or schools’ own servers, and the survey responses that will be exported to AIR will not include any directly identifying personally identifiable information (PII). The demographic and school information that will be part of the initial files will not be maintained after the data analyses are completed.

A key feature of the SCLS platform is that there is no link between respondent identifiers and log-in credentials within the system. Each potential respondent will be provided with a unique, randomly generated log-in credential that will allow one-time-only access. To allow respondents to exit the survey and return to complete it at a later time, the platform will generate a new log-in credential, available only to the respondents, that will allow access to the survey at the point where they previously exited; to help ensure data confidentiality, respondents will have no access to any responses entered during a prior session.

To allow maximum utility from the student data,[2] the initial student log-in credentials will be maintained in the database and exported data files to allow linkage to other student data if districts or schools choose to assign individual usernames to specific students, thereby allowing identification of individual student responses. For teachers, staff, and parents, the system does not retain the initial log-in credentials: as soon as the data collection is closed and the data export function becomes available, all log-in credentials are deleted and replaced with random IDs in the database, and all emails used to disseminate usernames are also deleted. The teacher, staff, and parent data can thus be exported, but individuals cannot be identified. Because of these features, the confidentiality language on the informed consent page for students at the beginning of the survey is left blank, with instructions for districts or schools administering the surveys to fill in text that is consistent with FERPA and their state laws. The confidentiality language for other respondents is prefilled on the informed consent page at the beginning of the respective surveys, since the responses will not be associated with directly identifying PII (see Attachment 4 for the informed consent page language).
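
The export-time credential handling described above can be pictured with the short sketch below. It assumes responses are keyed by log-in credential in a simple dict; the schema, field names, and random ID format are hypothetical, and the SCLS platform's actual storage layer may differ.

```python
# Illustrative only: replacing log-in credentials with unlinkable random IDs.
import secrets

def anonymize_for_export(responses: dict, retain_credentials: bool) -> dict:
    """Prepare collected responses for export.

    Student exports keep their credentials (retain_credentials=True) so that
    districts that assigned usernames to specific students can link results
    to other local data; teacher, staff, and parent credentials are replaced
    with random IDs that cannot be traced back to the original log-ins.
    """
    if retain_credentials:
        return dict(responses)
    return {f"R{secrets.token_hex(8)}": answers
            for answers in responses.values()}

# Hypothetical usage: the original credential is unrecoverable after export.
staff_responses = {"staff-login-123": {"q1": 4, "q2": 2}}
print(anonymize_for_export(staff_responses, retain_credentials=False))
```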

For use in this pilot test, the following text is also shown on the informed consent pages to provide the full confidentiality clause for the data that will be transferred to NCES/AIR:

To analyze and refine the questionnaires, the National Center for Education Statistics (NCES) will receive individual-level responses from participating schools and districts without the names or other direct personal identifiers of the respondents. All information received by NCES that relates to or describes identifiable characteristics of individuals is protected from disclosure by federal statute; it may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573).


All instructions that respondents will see when accessing the data collection system, including the above text, and the guides for survey administrators are provided in Attachment 4.[3]

Sensitive Questions

The SCLS Pilot Test is a voluntary survey. No one is required to respond to it, and respondents may decline to answer any question in the survey. All survey items focus on respondents’ perceptions of various aspects of school climate. Respondents are not asked to report any personal incidents or behaviors.

Estimated Response Burden

Table 2 shows the expected burden for the pilot test, which will be conducted in 50 schools. The total number of respondents is calculated from 2011-12 Common Core of Data (CCD) figures: an average school size of 517 students, a student/teacher ratio of 16 to 1, and a ratio of students to administrative and all other support staff of 32 to 1. Based on reported response rates of similar surveys,[4] NCES estimates 80 percent response rates for the student, teacher/instructional staff, and principal/noninstructional staff surveys and a 40 percent response rate for the parent survey. The response burden for each student, teacher, and staff member participating in the survey is expected to be about 60 minutes; the response burden for each participating parent is expected to be about 30 minutes. A worked check of the participation figures follows table 2.

Table 2. Estimate of hourly burden for recruitment and participation in SCLS Pilot Test

Type | Hours per respondent | Number of respondents | Total hours
Recruitment
School or district administrator - initial contact | 0.05 | 100 | 5
Follow-up via phone or e-mail | 0.15 | 60 | 9
Confirmation | 0.05 | 50 | 3
Subtotal (recruitment) | | 100 | 17
Participation
Students | 1 | 20,520 | 20,520
Parents | 0.5 | 10,260 | 5,130
Teachers/Instructional staff | 1 | 1,280 | 1,280
Principal/Noninstructional staff | 1 | 640 | 640
Subtotal (participation) | | 32,700 | 27,570
Total burden (32,910 responses) | | 32,800 | 27,587
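
As referenced above, the participation rows of table 2 can be reproduced from the stated ratios and response rates. The sketch below is a back-of-the-envelope check, not part of any SCLS deliverable; note that a per-school enrollment of about 513 closely reproduces the published counts, so the 517 average quoted above evidently entered the calculation with some rounding.

```python
# Back-of-the-envelope check of the participation rows in table 2.
SCHOOLS = 50
AVG_ENROLLMENT = 513        # assumption: reproduces the table; text cites 517
students = SCHOOLS * AVG_ENROLLMENT          # 25,650 eligible students

rows = {
    # group: (hours per respondent, expected respondents)
    "Students": (1.0, round(students * 0.80)),                    # 20,520
    "Parents": (0.5, round(students * 0.40)),   # one per student # 10,260
    "Teachers/Instructional": (1.0, int(round(students / 16 * 0.80, -1))),      # 1,280
    "Principal/Noninstructional": (1.0, int(round(students / 32 * 0.80, -1))),  # 640
}
for group, (hours, n) in rows.items():
    print(f"{group:<28} {n:>7,} respondents  {n * hours:>9,.0f} hours")
print(f"Subtotal: {sum(n for _, n in rows.values()):,} respondents, "
      f"{sum(h * n for h, n in rows.values()):,.0f} hours")  # 32,700 / 27,570
```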

Cost to Federal Government

The total cost of the SCLS Pilot Test data collection to the government is approximately $450,000. This includes all direct and indirect costs of the data collection.

Project Schedule

February-May 2015: Administer the pilot test in 25 middle schools and 25 high schools to test items, develop scales for topics and domains, and refine platform technology.

September 2015: School climate platform made available for education agency download.

March 2016: National benchmark data collection in 250 middle schools and 250 high schools.

September 2016: A revised school climate platform with national benchmark reporting made available for education agency download.

Cognitive Laboratory Testing

The SCLS Pilot Test was preceded by cognitive interview testing of the survey items and usability testing of the SCLS system during the summer of 2014. Both the cognitive interviews and usability tests were conducted one-on-one with individual participants: students, parents, teachers, principals, and noninstructional staff from the District of Columbia, Texas, and California. Changes to the survey items and system were made based on these interviews and tests. The cognitive interview and usability testing report is included in Attachment 5.



References

Bandura, A. (2001). Social Cognitive Theory: An Agentic Perspective. Annual Review of Psychology, 52(1): 1–26.

Bandura, A. (2007). Much Ado Over a Faulty Conception of Perceived Self-Efficacy Grounded in Faulty Experimentation. Journal of Social and Clinical Psychology, 26(6): 641–658.

Evans, L. (1997). Understanding Teacher Morale and Job Satisfaction. Teaching and Teacher Education, 13(8): 831–845.

Freiberg, H.J. (1999). School Climate: Measuring, Improving and Sustaining Healthy Learning Environments. Philadelphia: Routledge.

Halpin, A.W., and Croft, D.B. (1962). The Organizational Climate of Schools. Midwest Administration Center, 11(7). Chicago, IL: University of Chicago.

Harper, K. (2010). Measuring School Climate. Paper presented at the Safe and Supportive Schools Grantee Meeting, Washington, DC.

Hoy, W.K., and Sabo, D.J. (1998). Quality Middle Schools: Open and Healthy. Thousand Oaks, CA: Corwin Press.

Hoy, W.K., Hannum, J., and Tschannen-Moran, M. (1998). Organizational Climate and Student Achievement: A Parsimonious and Longitudinal View. Journal of School Leadership, 8(4): 336–359.

Insel, P.M., and Moos, R.H. (1974). Psychological Environments: Expanding the Scope of Human Ecology. American Psychologist, 29(3): 179–188. doi:10.1037/h0035994

McLoughlin, C.S., Kubrick, R.J., Jr., and Lewis, M. (2002). Best Practices in Promoting Safe Schools. In Best Practices in School Psychology (4th ed., pp. 1181–1194). Bethesda, MD: National Association of School Psychologists.

Newmann, F. (1992). Student Engagement and Achievement in American Secondary Schools. New York: Teachers College Press.

Osher, D., and Kendziora, K. (2010). Building Conditions for Learning and Healthy Adolescent Development: Strategic Approaches. In B. Doll, W. Pfohl, and J. Yoon (Eds.), Handbook of Youth Prevention Science. New York: Routledge.

Perry, A. (1908). The Management of a City School. New York: Macmillan.

Rutter, M., Maughan, B., Mortimore, P., and Ouston, J. (1979). Fifteen Thousand Hours: Secondary Schools and Their Effects on Children. Cambridge, MA: Harvard University Press.

Stockard, J., and Mayberry, M. (1992). Effective Educational Environments. Newbury Park, CA: Corwin Press.

Wang, M.C., Haertel, G.D., and Walberg, H.J. (1997). What Do We Know: Widely Implemented School Improvement Programs. Philadelphia: Laboratory for Student Success.

Wilson, A.B. (1959). Residential Segregation of Social Classes and Aspirations of High School Boys. American Sociological Review, 24(6): 836–845.

Wynn, S.R., Carboni, L.W., and Patall, E.A. (2007). Beginning Teachers’ Perceptions of Mentoring, Climate and Leadership: Promoting Retention Through a Learning Communities Perspective. Leadership and Policy in Schools, 6(3): 209–229.

[1] NCES will collect data directly in the 2016 national benchmark study, where the goal will be to produce nationally representative benchmark data. In contrast, among other purposes, this pilot is designed to test the data collection tool as it is intended to be used by schools and districts in the future.

[2] AIR conducted 9 interviews with district and state personnel during November 2013 on the survey system features that would be useful for districts and schools. Almost all key informants (8 of 9) expressed interest in being able to produce results disaggregated by subgroups of students beyond what SCLS offers (grade, gender, and race/ethnicity). Therefore, decisions were made to allow potential linkage of the student responses to external data files and to leave the assurance of proper protections of individual student data to the local education agencies that are responsible for the data.

[3] We have now included in Attachment 4 a partial draft of the technical system administrator instructions that will be provided to schools and districts. In January 2015, we will submit to OMB, for a separate 1850-0803 clearance, the finalized system administrator instructions and FAQs.

[4] These surveys include the Conditions for Learning Survey in Cleveland, the Safe and Supportive Schools Survey in Iowa, the Maryland Youth Risk Behavior Survey, and the Albuquerque Public Schools Staff School Climate Survey.

