
School Survey on Crime and Safety (SSOCS) 2018 and 2020 Update



OMB# 1850-0761 v.17




Supporting Statement Part C-1

Item Justification



National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education




May 2019
Revised September 2019







C1. Item Description and Justification: SSOCS:2018 and SSOCS:2020

Throughout the history of the School Survey on Crime and Safety (SSOCS), the survey items have been examined for both the quality of their content and the quality of the data collected in response to them, and the questionnaire has been adjusted when necessary. To maintain consistent benchmarks over time, changes to the questionnaire between survey administrations have been kept to a minimum. Some items were removed from the SSOCS:2018 and SSOCS:2020 questionnaires because of a perceived decline in their usefulness and to reduce the burden on respondents, and some items were revised to clarify their meaning.

Information on specific editorial changes, content modifications, item additions, and item deletions is included in Sections C2 and C3 of this document.

Presented below is a complete description of the sections and the corresponding items in the SSOCS:2018 and SSOCS:2020 questionnaires (see appendix B for the full questionnaires).

1.1 SSOCS:2018

The SSOCS:2018 questionnaire consists of the following sections:

  • School practices and programs;

  • Parent and community involvement at school;

  • School security staff;

  • School mental health services;

  • Staff training and practices;

  • Limitations on crime prevention;

  • Frequency of crime and violence at school;

  • Incidents;

  • Disciplinary problems and actions; and

  • School characteristics: 2017–18 school year.

1.1.1 School Practices and Programs

This section collects data on current school policies and programs relating to crime and safety. These data are important in helping schools know where they stand in relation to other schools, and in helping policymakers know what actions are already being taken in schools and what actions schools might be encouraged to take in the future. These data can also benefit researchers interested in evaluating the success of certain school policies. Although SSOCS is not designed as an evaluation, the presence of school policies can be correlated with the rates of crime provided elsewhere in the questionnaire, with appropriate controls for school characteristics.
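
To make the intended use of these data concrete, the sketch below shows one way an analyst might relate a policy indicator to school crime rates while controlling for school characteristics. This is an illustrative sketch only, not part of the SSOCS instrument or NCES methodology; the file and column names (incident_rate, has_policy, enrollment, urbanicity) are hypothetical.

    # Illustrative sketch only: relate a policy indicator to school crime,
    # controlling for school characteristics. All names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("ssocs_analysis_file.csv")  # hypothetical analysis extract

    # OLS of incidents per 1,000 students on policy presence, controlling
    # for enrollment and urbanicity (treated as categorical).
    model = smf.ols(
        "incident_rate ~ has_policy + enrollment + C(urbanicity)",
        data=df,
    ).fit()
    print(model.summary())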

Question 1 asks specifically about the various school policies and practices that are in place, including those that restrict access to school grounds, monitor student behavior to prevent crime, impact the school’s ability to recognize an outsider, and enable communication in the event of a school-wide emergency. These policies and practices are important because they influence the control that administrators have over the school environment as well as the potential for students to bring weapons or drugs onto school grounds. Such actions can directly affect crime because students may be more reluctant to engage in inappropriate activities for fear of being caught. The school climate may also be affected because students may feel more secure knowing that violators of school policies are likely to be caught.

Question 2 asks about the existence of written plans for dealing with various crisis scenarios, and Question 3 asks whether schools drill students on the use of specific emergency procedures. When emergencies occur, there may not be time or an appropriate environment for making critical decisions, and key school leaders may not be immediately available to provide guidance. Thus, having a written plan for crises and drilling students on emergency procedures is important in preparing schools to deal with crises effectively.

Question 4 asks about various activities schools have in place that may directly or indirectly prevent or reduce violence. The presence of such activities is a sign that schools are being proactive by seeking to prevent violence before it occurs rather than reacting to it.

Questions 5 and 6 ask whether schools have a threat assessment team and, if so, how often the team meets. Threat assessment teams are an emerging practice in schools for identifying and intervening with students who may be on a path toward violent behavior.

Question 7 asks about the presence of recognized student groups that promote inclusiveness and acceptance in schools. The presence of such groups is important in creating a climate in which students are respectful of peers from all backgrounds and may help to reduce conflict and violence.

1.1.2 Parent and Community Involvement at School

This section asks about the involvement of parents and community groups in schools. Parent and community involvement in schools can affect the school culture and may impact the level of crime in a school.

Questions 8 and 9 ask about policies or practices that schools have implemented to involve parents in school procedures and the percentage of parents participating in specific school events.

Question 10 asks if specific community organizations are involved in promoting a safe school environment to determine the extent to which the school involves outside groups.

1.1.3 School Security Staff

Questions 11 through 18 ask about the use and activities of sworn law enforcement officers (including School Resource Officers) on school grounds and at school events. Question 19 asks about the presence of other security personnel who are not sworn law enforcement officers. In addition to directly affecting school crime, the use of security staff can also affect the school environment. Security staff may help prevent illegal actions, reduce the amount of crime, and contribute to feelings of security or freedom on school grounds. Thus, the times that law enforcement personnel are present, their visibility, their roles and responsibilities, and their carrying of weapons are all important.

1.1.4 School Mental Health Services

Questions 20 and 21 ask whether diagnostic mental health assessments were provided to students by a licensed mental health professional and whether these diagnostic assessments were provided at school or outside of school (school-employed or contracted mental health professionals may provide diagnostic assessment services in either or both locations). Questions 22 and 23 ask whether treatment for mental health disorders was provided to students by a licensed mental health professional and whether treatment was provided at school or outside of school (school-employed or contracted mental health professionals may provide treatment in either or both locations). Assessing the types of mental health services provided by schools as well as the location of these services demonstrates how well equipped schools are to deal with students who have mental disorders. Schools’ ability to attend to students who have mental health disorders may influence the frequency and severity of delinquency and behavioral problems within the school.

Question 24 asks for principals’ perceptions of the factors that limit their schools’ efforts to provide mental health services to students. The question asks about factors such as inadequate access to licensed mental health professionals, inadequate funding, concerns about parents’ reactions, and the legal responsibilities of the school. Schools that face issues relating to inadequate resources or support may have limited effectiveness in providing mental health services to students. Schools’ financial obligation to pay for mental health services may also make them reluctant to identify students who require these services.

1.1.5 Staff Training and Practices

Question 25 asks whether schools or districts provide training for classroom teachers or aides on topics such as classroom management; school-wide policies and practices related to violence, bullying, and cyberbullying; alcohol and/or drug use; and safety procedures. Other types of training include recognizing potentially violent students; recognizing signs of suicidal tendencies; recognizing signs of substance abuse; intervention and referral strategies for students who display signs of mental health disorders; recognizing physical, social, and verbal bullying; positive behavioral intervention strategies; and crisis prevention and intervention. Schools can now obtain profiles of early warning signs to identify potentially violent students, and their use of such profiles may affect both general levels of discipline and the potential for crises. The type of training provided to teachers is important because teachers collectively spend the most time with students and observe them closely. Moreover, recent research provides evidence of a substantial discrepancy between the percentage of schools that have these types of policies and the percentage of teachers trained in them. Collecting data on teacher training will inform efforts to combat violence and discipline problems in schools.

Question 26 asks if there are any school staff who legally carry a firearm on school property. While many school districts and states have policies that prohibit carrying firearms on school property, some state and district policies allow school staff to legally carry (concealed) firearms at school. While not all policies require those who carry a firearm on campus to divulge that information, principals may be aware of some instances in which staff members have brought firearms on school property. The presence of firearms in schools may be an indicator of the school climate.

1.1.6 Limitations on Crime Prevention

This section asks for principals’ perceptions of the factors that limit their schools’ efforts to reduce or prevent crime. Question 27 asks about factors such as lack of adequate training for teachers, lack of support from parents or teachers, inadequate funding, and federal, state, or district policies on disciplining students. Although principals are not trained evaluators, they are the people who are the most knowledgeable about the situations at their schools and whether their own actions have been constrained by the factors listed.

Schools that face issues relating to inadequate resources or support may have limited effectiveness in responding to disciplinary issues and reducing or preventing crime. Identifying principals’ perceptions of the factors that limit their ability to prevent crime in school can inform efforts to minimize obstructions to schools’ crime prevention measures.

1.1.7 Frequency of Crime and Violence at School

Questions 28 and 29 ask about violent deaths, specifically homicides and shootings at school. Violent deaths receive substantial media attention but are actually relatively rare; there is evidence that, in general, schools are much safer than students’ neighboring communities. Based on analyses of previous SSOCS data, these crimes are such rare events that the National Center for Education Statistics (NCES) is unable to report estimates per its statistical standards. Nonetheless, it is important to include these items because they are significant incidents of crime that, at the very least, independent researchers can evaluate. Furthermore, the survey is intended to present a comprehensive picture of the types of violence that can occur in schools, and the omission of violent deaths and shootings would be questioned by respondents who may have experienced such violence.

1.1.8 Incidents

The questions in this section ask about the frequency and types of crime and disruptions at school (other than violent deaths). Question 30 specifically asks principals to provide counts of the number of recorded incidents that occurred at school and the number of incidents that were reported to the police or other law enforcement. Question 30 will assist in identifying which types of crimes in schools are underreported to the police and will provide justification for further investigation. Questions 31 and 32 ask about the number of hate crimes and the biases that may have motivated these hate crimes. Question 33 asks whether there were any incidents of sexual misconduct between school staff members and students. Question 34 asks about the number of arrests that have occurred at school. The data gained from this section can be used directly as an indicator of the degree of safety in U.S. public schools and indirectly to compare schools in terms of the number of problems they face.
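
As a hedged illustration of how the Question 30 counts might be analyzed (the figures and column names below are invented; this is not an NCES procedure), the rate of reporting to law enforcement can be computed by incident type:

    # Illustrative sketch only: share of recorded incidents reported to
    # law enforcement, by incident type. All figures are invented.
    import pandas as pd

    df = pd.DataFrame({
        "incident_type": ["theft", "vandalism", "physical attack"],
        "recorded": [40, 25, 12],          # incidents recorded by the school
        "reported_to_police": [10, 5, 8],  # subset reported to law enforcement
    })
    df["reporting_rate"] = df["reported_to_police"] / df["recorded"]
    print(df)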

1.1.9 Disciplinary Problems and Actions

This section asks about the degree to which schools face various disciplinary problems and how schools respond to them. The data gathered in questions 35 and 36 can help to provide an overall measure of the types of problems schools encounter on a regular basis. There is evidence that schools’ ability to control crime is affected by their control of lesser violations, and that, when lesser violations are controlled, students do not progress to more serious disciplinary problems. The data gathered in this section will be helpful in confirming or denying the importance of schools’ control of lesser violations and provide another measure of the disciplinary situation in U.S. schools. The data may also be helpful in multivariate models of school crime by providing a way of grouping schools that are similar in their general disciplinary situation but different in their school policies or programs.

Question 35 asks principals to report, to the best of their knowledge, how often certain disciplinary problems occur at school. Problems of interest include student racial/ethnic tensions; bullying; sexual harassment; harassment based on sexual orientation, gender identity, religion, or disability; widespread disorder in classrooms; student disrespect of teachers; and gang activities. This question provides a general measure of the degree to which there are disciplinary problems at each school.

Question 36 asks about the frequency of three aspects of cyberbullying, providing a general measure of the degree to which cyberbullying is an issue for students and how often staff resources are used to deal with cyberbullying.

Question 37 asks what kinds of disciplinary actions were available to each school and whether each action was actually used during the school year. This item is not intended to be comprehensive; instead, it focuses on some of the most important disciplinary strategies. These data will help policymakers to know what options and what constraints principals face. For example, if an action is allowed in principle but not used in practice, then policymakers would need to act in a different way than if the action is not allowed.

Question 38 asks about the number of various types of offenses committed by students and the resulting disciplinary actions taken by schools. Question 39 asks how many students were removed or transferred from school for disciplinary reasons. These items provide valuable information about how school policies are actually implemented (rather than simply what policies are in place), with a particular emphasis on how many different kinds of actions are taken with regard to a particular offense as well as how many times no actions are taken.

1.1.10 School Characteristics: 2017–18 School Year

This section asks for a variety of information about the characteristics of the schools responding to the survey. The information provided in this section is necessary for understanding the degree to which different schools face different situations. For example, one school might have highly effective programs and policies yet still have high crime rates because of high crime rates in the area where the school is located. Another school might appear, based on its crime rates alone, to have effective policies, yet actually have higher crime rates than schools in similar circumstances.

Question 40 asks for the school’s total enrollment.

Question 41 requests information on the school’s student population, including the percentage of students who receive free or reduced-price lunches (a measure of poverty), are English language learners (a measure of the cultural environment), are in special education (a measure of the academic environment), and are male (most crimes are committed by males, so the percentage who are male can affect the overall crime rate).

Question 42 addresses various levels of academic proficiency and interest, which are factors that have been shown to be associated with crime rates.

Question 43 asks for the number of classroom changes made in a typical day. This is important because it affects schools’ ability to control the student environment. When students are in hallways, there are more opportunities for problems. Also, a school with fewer classroom changes is likely to be more personal and to have closer relationships between the students and teachers.

Questions 44 and 45 ask about the crime levels in the neighborhoods where students live and in the area where the school is located. This is an important distinction, since some students may travel a great distance to their school, and their home community may have a significantly different level of crime than their school community.

Question 46 asks for the school type. Schools that target particular groups of students (such as magnet schools) have more control over who is in the student body and may have more motivated students because the students have chosen a particular program. Charter schools have more freedom than regular schools in their school policies, may have more control over who is admitted into the student body, and may have more motivated students because the students chose to attend the school.

Question 47 asks for the school’s average daily attendance. This is a measure of truancy and thus a measure of the level of disciplinary problems at the school. It also is a measure of the academic environment.

Question 48 asks for the number of transfers. When students transfer after the school year has started, schools have less control over whether and how the students are assimilated into the school. These students are likely to have less attachment to the school as well as to the other students, thus increasing the risk of disciplinary problems.

1.2 SSOCS:2020

The SSOCS:2020 questionnaire and procedures are expected to be the same as in SSOCS:2018. Due to adjustments to the questionnaire between these two administrations, some item numbers have shifted. The item numbers presented below represent the SSOCS:2020 question numbering, with the SSOCS:2018 question numbers displayed in parentheses. Further details on the changes made to the questionnaire between the 2018 and 2020 administrations of SSOCS, including rationales for the changes, can be found in Section C3.

The SSOCS:2020 questionnaire consists of the following sections:

  • School practices and programs;

  • Parent and community involvement at school;

  • School security staff;

  • School mental health services;

  • Staff training and practices;

  • Limitations on crime prevention;

  • Incidents (the section on Frequency of Crime and Violence at School was removed from the SSOCS:2020 questionnaire; items previously in that section were incorporated into the Incidents section);

  • Disciplinary problems and actions; and

  • School characteristics: 2019–20 school year.

1.2.1 School Practices and Programs

This section collects data on current school policies and programs relating to crime and safety. These data are important in helping schools know where they stand in relation to other schools, and in helping policymakers know what actions are already being taken in schools and what actions schools might be encouraged to take in the future. These data can also benefit researchers interested in evaluating the success of certain school policies. Although SSOCS is not designed as an evaluation, the presence of school policies can be correlated with the rates of crime provided elsewhere in the questionnaire, with appropriate controls for school characteristics.

Question 1 (SSOCS:2018 Question 1)

This question asks specifically about the various school policies and practices that are in place, including those that restrict access to school grounds, monitor student behavior to prevent crime, impact the school’s ability to recognize an outsider, and enable communication in the event of a school-wide emergency. These policies and practices are important because they influence the control that administrators have over the school environment as well as the potential for students to bring weapons or drugs onto school grounds. Such actions can directly affect crime because students may be more reluctant to engage in inappropriate activities for fear of being caught. The school climate may also be affected because students may feel more secure knowing that violators of school policies are likely to be caught.

Questions 2 and 3 (SSOCS:2018 Questions 2 and 3)

These questions ask about the existence of written plans for dealing with various crisis scenarios, and whether schools drill students on the use of specific emergency procedures. When emergencies occur, there may not be time or an appropriate environment for making critical decisions, and key school leaders may not be immediately available to provide guidance. Thus, having a written plan for crises and drilling students on emergency procedures is important in preparing schools to deal with crises effectively.

Question 4 (SSOCS:2018 Question 4)

This question asks about various activities schools have in place that may directly or indirectly prevent or reduce violence. The presence of such activities is a sign that schools are being proactive by seeking to prevent violence before it occurs rather than reacting to it.

Question 5 (SSOCS:2018 Questions 5 and 6)

This question asks whether schools have a threat assessment team. Threat assessment teams are an emerging practice in schools for identifying and intervening with students who may be on a path toward violent behavior. A follow-up question in the SSOCS:2018 questionnaire asked how often the threat assessment team meets; this question was removed from the SSOCS:2020 questionnaire.

Question 6 (SSOCS:2018 Question 7)

This question asks about the presence of recognized student groups that promote inclusiveness and acceptance in schools. The presence of such groups is important in creating a climate in which students are respectful of peers from all backgrounds and may help to reduce conflict and violence.

1.2.2 Parent and Community Involvement at School

This section asks about the involvement of parents and community groups in schools. Parent and community involvement in schools can affect the school culture and may impact the level of crime in a school.

Question 7 (SSOCS:2018 Questions 8 and 9)

This question asks about policies or practices that schools have implemented to involve parents in school procedures. An additional question in the SSOCS:2018 questionnaire asked about the percentage of parents participating in specific school events; this question was removed from the SSOCS:2020 questionnaire.

Question 8 (SSOCS:2018 Question 10)

This question asks if specific community organizations are involved in promoting a safe school environment to determine the extent to which the school involves outside groups.

1.2.3 School Security Staff

Questions 9 through 15 (SSOCS:2018 Questions 11 through 18)

These questions ask about the use and activities of sworn law enforcement officers (including School Resource Officers) on school grounds and at school events. One question from this section (SSOCS:2018 Question 15) was removed from the SSOCS:2020 questionnaire.

Question 16 (SSOCS:2018 Question 19)

This question asks about the presence of other security personnel who are not sworn law enforcement officers. In addition to directly affecting school crime, the use of security staff can also affect the school environment. Security staff may help prevent illegal actions, reduce the amount of crime, and contribute to feelings of security or freedom on school grounds. Thus, the times that security personnel are present, their visibility, their roles and responsibilities, and their carrying of weapons are all important.

1.2.4 School Mental Health Services

Questions 17 and 18 (SSOCS:2018 Questions 20 and 21)

These questions ask whether diagnostic mental health assessments were provided to students by a licensed mental health professional and whether these diagnostic assessments were provided at school or outside of school (school-employed or contracted mental health professionals may provide diagnostic assessment services in either or both locations).

Questions 19 and 20 (SSOCS:2018 Questions 22 and 23)

These questions ask whether treatment for mental health disorders was provided to students by a licensed mental health professional and whether treatment was provided at school or outside of school (school-employed or contracted mental health professionals may provide treatment in either or both locations). Assessing the types of mental health services provided by schools as well as the location of these services demonstrates how well equipped schools are to deal with students who have mental disorders. Schools’ ability to attend to students who have mental health disorders may influence the frequency and severity of delinquency and behavioral problems within the school.

Question 21 (SSOCS:2018 Question 24)

This question asks for principals’ perceptions of the factors that limit their schools’ efforts to provide mental health services to students. The question asks about factors such as inadequate access to licensed mental health professionals, inadequate funding, concerns about parents’ reactions, and the legal responsibilities of the school. Schools that face issues relating to inadequate resources or support may have limited effectiveness in providing mental health services to students. Schools’ financial obligation to pay for mental health services may also make them reluctant to identify students who require these services.

1.2.5 Staff Training and Practices

Question 22 (SSOCS:2018 Question 25)

This question asks whether schools or districts provide training for classroom teachers or aides on topics such as classroom management; school-wide policies and practices related to violence, bullying, and cyberbullying; alcohol and/or drug use; and safety procedures. Other types of training include recognizing potentially violent students; recognizing signs of suicidal tendencies; recognizing signs of substance abuse; intervention and referral strategies for students who display signs of mental health disorders; recognizing physical, social, and verbal bullying; positive behavioral intervention strategies; and crisis prevention and intervention. Schools can now obtain profiles of early warning signs to identify potentially violent students, and their use of such profiles may affect both general levels of discipline and the potential for crises. The type of training provided to teachers is important because teachers collectively spend the most time with students and observe them closely. Moreover, recent research provides evidence of a substantial discrepancy between the percentage of schools that have these types of policies and the percentage of teachers trained in them. Collecting data on teacher training will inform efforts to combat violence and discipline problems in schools.

Question 23 (SSOCS:2018 Question 26)

This question asks if there are any school staff who legally carry a firearm on school property. While many school districts and states have policies that prohibit carrying firearms on school property, some state and district policies allow school staff to legally carry (concealed) firearms at school. While not all policies require those who carry a firearm on campus to divulge that information, principals may be aware of some instances in which staff members have brought firearms on school property. The presence of firearms in schools may be an indicator of the school climate.

1.2.6 Limitations on Crime Prevention

This section asks for principals’ perceptions of the factors that limit their schools’ efforts to reduce or prevent crime.

Question 24 (SSOCS:2018 Question 27)

This question asks about factors such as lack of adequate training for teachers, lack of support from parents or teachers, and inadequate funding. Although principals are not trained evaluators, they are the people who are the most knowledgeable about the situations at their schools and whether their own actions have been constrained by the factors listed. Four subitems from this section (SSOCS:2018 items 27j–m) were removed from the SSOCS:2020 questionnaire.

Schools that face issues relating to inadequate resources or support may have limited effectiveness in responding to disciplinary issues and reducing or preventing crime. Identifying principals’ perceptions of the factors that limit their ability to prevent crime in school can inform efforts to minimize obstructions to schools’ crime prevention measures.

1.2.7 Incidents

The questions in this section ask about the frequency and types of crime and disruptions at school (other than violent deaths). Note that the section Frequency of Crime and Violence at School has been removed from the SSOCS:2020 questionnaire and its items have been incorporated into the Incidents section.

Question 25 (SSOCS:2018 Question 30)

This question specifically asks principals to provide counts of the number of recorded incidents that occurred at school and the number of incidents that were reported to the police or other law enforcement. This question will assist in identifying which types of crimes in schools are underreported to the police and will provide justification for further investigation.

Questions 26 and 27 (SSOCS:2018 Questions 31 and 32)

These questions ask about the number of hate crimes and the biases that may have motivated these hate crimes.

Question 28 (SSOCS:2018 Question 33)

This question asks whether there were any incidents of sexual misconduct between school staff members and students.

Questions 29 and 30 (SSOCS:2018 Questions 28 and 29)

These questions ask about violent deaths (specifically, homicides and shootings at school). Although violent deaths receive substantial media attention, they are actually relatively rare; in fact, there is evidence that, in general, schools are much safer than students’ neighboring communities. Based on analyses of previous SSOCS data, these crimes are such rare events that the National Center for Education Statistics (NCES) is unable to report estimates per its statistical standards. Nonetheless, it is important to include these items because they are significant incidents of crime that, at the very least, independent researchers can evaluate. Furthermore, the survey is intended to present a comprehensive picture of the types of violence that can occur in schools, and the omission of violent deaths and shootings would be questioned by respondents who may have experienced such violence. In the SSOCS:2018 questionnaire, these questions were contained in the Frequency of Crime and Violence at School section; that section was removed from the SSOCS:2020 questionnaire and its items moved to the Incidents section.

Question 31 (SSOCS:2018 Question 34)

This question asks about the number of arrests that have occurred at school. The data gained from this section can be used directly as an indicator of the degree of safety in U.S. public schools and indirectly to compare schools in terms of the number of problems they face.

1.2.8 Disciplinary Problems and Actions

This section asks about the degree to which schools face various disciplinary problems and how schools respond to them. The data gathered in questions 32 and 33 (SSOCS:2018 questions 35 and 36) can help to provide an overall measure of the types of problems schools encounter on a regular basis. There is evidence that schools’ ability to control crime is affected by their control of lesser violations, and that, when lesser violations are controlled, students do not progress to more serious disciplinary problems. The data gathered in this section will be helpful in confirming or denying the importance of schools’ control of lesser violations and provide another measure of the disciplinary situation in U.S. schools. The data may also be helpful in multivariate models of school crime by providing a way of grouping schools that are similar in their general disciplinary situation but different in their school policies or programs.

Question 32 (SSOCS:2018 Question 35)

This question asks principals to report, to the best of their knowledge, how often certain disciplinary problems occur at school. Problems of interest include student racial/ethnic tensions; bullying; sexual harassment; harassment based on sexual orientation, gender identity, religion, or disability; widespread disorder in classrooms; student disrespect of teachers; and gang activities. This question provides a general measure of the degree to which there are disciplinary problems at each school.

Question 33 (SSOCS:2018 Question 36)

This question asks about the frequency of cyberbullying (including at and away from school), providing a general measure of the degree to which cyberbullying is an issue for students. Two additional subitems were included in the SSOCS:2018 questionnaire asking about how often cyberbullying affected the school environment and how often staff resources were used to deal with cyberbullying; these subitems were removed from the SSOCS:2020 questionnaire.

Question 34 (SSOCS:2018 Question 37)

This question asks what kinds of disciplinary actions were available to each school and whether each action was actually used during the school year. This item is not intended to be comprehensive; instead, it focuses on some of the most important disciplinary strategies. These data will help policymakers to know what options and what constraints principals face. For example, if an action is allowed in principle but not used in practice, then policymakers would need to act in a different way than if the action is not allowed.

Question 35 (SSOCS:2018 Question 38)

This question asks about the number of various types of offenses committed by students and the resulting disciplinary actions taken by schools.

Question 36 (SSOCS:2018 Question 39)

This question asks how many students were removed or transferred from school for disciplinary reasons. These items provide valuable information about how school policies are actually implemented (rather than simply what policies are in place), with a particular emphasis on how many different kinds of actions are taken with regard to a particular offense as well as how many times no actions are taken.

1.2.9 School Characteristics: 2019–20 School Year (2017–18 School Year in SSOCS:2018)

This section asks for a variety of information about the characteristics of the schools responding to the survey. The information provided in this section is necessary for understanding the degree to which different schools face different situations. For example, one school might have highly effective programs and policies yet still have high crime rates because of high crime rates in the area where the school is located. Another school might appear, based on its crime rates alone, to have effective policies, yet actually have higher crime rates than schools in similar circumstances.

Question 37 (SSOCS:2018 Question 40)

This question asks for the school’s total enrollment.

Question 38 (SSOCS:2018 Question 41)

This question requests information on the school’s student population, including the percentage of students who receive free or reduced-price lunches (a measure of poverty), are English language learners (a measure of the cultural environment), are in special education (a measure of the academic environment), and are male (most crimes are committed by males, so the percentage who are male can affect the overall crime rate).

Question 39 (SSOCS:2018 Question 42)

This question addresses various levels of academic proficiency and interest, which are factors that have been shown to be associated with crime rates.

Question 40 (SSOCS:2018 Question 43)

This question asks for the number of classroom changes made in a typical day. This is important because it affects schools’ ability to control the student environment. When students are in hallways, there are more opportunities for problems. Also, a school with fewer classroom changes is likely to be more personal and to have closer relationships between the students and teachers.

Questions 41 and 42 (SSOCS:2018 Questions 44 and 45)

These questions ask about the crime levels in the neighborhoods where students live and in the area where the school is located. This is an important distinction, since some students may travel a great distance to their school, and their home community may have a significantly different level of crime than their school community.

Question 43 (SSOCS:2018 Question 46)

This question asks for the school type. Schools that target particular groups of students (such as magnet schools) have more control over who is in the student body and may have more motivated students because the students have chosen a particular program. Charter schools have more freedom than regular schools in their school policies, may have more control over who is admitted into the student body, and may have more motivated students because the students chose to attend the school.

Question 44 (SSOCS:2018 Question 47)

This question asks for the school’s average daily attendance. This is a measure of truancy and thus a measure of the level of disciplinary problems at the school. It also is a measure of the academic environment.

Question 45 (SSOCS:2018 Question 48)

This question asks for the number of transfers. When students transfer after the school year has started, schools have less control over whether and how the students are assimilated into the school. These students are likely to have less attachment to the school as well as to the other students, thus increasing the risk of disciplinary problems.

C2. Changes to the Questionnaire and Rationale: SSOCS:2018

The following section details the editorial changes, deletions, and additions made to the SSOCS:2016 questionnaire in developing the SSOCS:2018 questionnaire. Based on the results of the SSOCS:2016 data collection and cognitive interview testing, some items for SSOCS:2018 were revised to clarify their meaning. Additionally, several items were removed from the survey based on a perceived decline in their usefulness and to make room for new items that reflect emerging issues in school crime and safety.

The result is the proposed instrument for the SSOCS:2018 survey administration, which is located in appendix B. For additional information on the rationales for item revisions, please see the findings and resulting recommendations from cognitive testing, which are located in Part C4.


2.1 Changes to Definitions

One definition (sexual misconduct) has been added to clarify a term used in a new survey item on the 2018 questionnaire. Three definitions (arrest, harassment, and school resource officer) have been added to clarify terms already used in previous questionnaires. Eight definitions (bullying, cyberbullying, diagnostic mental health assessment, mental health professional, rape, sexual assault, sexual harassment, and treatment) were revised to increase clarity for survey respondents.

Arrest – A formal definition has been added to increase consistency in the interpretation of an “arrest.” The definition aligns with the definition used by the Bureau of Justice Statistics.

Bullying – The three key components for bullying (an observed or perceived power imbalance, repetition, and the exclusion of siblings or current dating partners) have been re-ordered in the definition to increase readability.

Cyberbullying – The definition for cyberbullying has been revised to explicitly identify cyberbullying as a form of bullying.

Diagnostic mental health assessment – The definition for diagnostic mental health assessment (previously called “diagnostic assessment”) was modified to remove references to general medical professionals and medical diagnoses other than mental health. The revisions will help respondents to distinguish diagnostic assessments for mental health disorders from assessments that may be administered to identify other medical or educational issues.

Harassment – A formal definition has been added to clarify “harassment.” The definition aligns closely with the definition used in the Civil Rights Data Collection conducted by the Department of Education’s Office for Civil Rights.

Mental health professional – The definition for mental health professional has been revised to specify that mental health professionals are licensed.

Rape – The definition of rape has been modified to specify that all students, regardless of sex or gender identity, can be victims of rape.

School resource officer – As this term is used in the instructions for many survey items, the definition for school resource officer is now included in the list of formal definitions. Previously, the definition for this term was included directly in subitem 18a.

Sexual assault – The definition for sexual assault has been modified to specify that all students, regardless of sex or gender identity, can be victims of sexual assault.

Sexual harassment – The definition for sexual harassment has been modified to specify that all students, regardless of sex or gender identity, can be victims of sexual harassment and to include additional examples of forms of harassment. Additionally, as the corresponding survey item asks only about sexual harassment of students by students, examples of other perpetrators (e.g., school employees, non-school employees) were removed from the definition.

Sexual misconduct – This definition was added to the questionnaire in accordance with a new item on incidents of sexual misconduct. The definition aligns with language used in legislation in several states, such as the Commonwealth of Pennsylvania’s “Educator Discipline Act.”

Treatment – In consultation with mental health experts, the wording of this definition was modified to clarify that “treatment” refers to clinical interventions to address mental health disorders.


2.2 Editorial Changes

Throughout the questionnaire, the school year has been updated to reflect the most recent 2017–18 school year, and item skip patterns have been updated to reflect the new numbering in the questionnaire.

Item 1, subitem b. “Loading docks” was added as an example to this item in a parenthetical notation.

Item 1, subitem h. This item has been modified to combine the two items on random sweeps, subitems 1h and 1i, on the 2016 questionnaire. The resulting item does not distinguish between random sweeps conducted using dog sniffs and those that do not use dog sniffs, because it is more important to identify how many schools are conducting sweeps for contraband as opposed to the method used for such sweeps.

Item 1, subitem i. This item has been modified to combine the two items on drug testing, subitems 1j and 1k, on the 2016 questionnaire. The resulting item does not distinguish between drug testing for student athletes and drug testing for students in extracurricular activities other than athletics, because it is more important to identify how many schools are conducting random drug testing as opposed to the specific population that is being drug tested.

Item 1, subitem u. This item has been modified to specify prohibition of “non-academic” use of cell phones or smartphones since many schools now permit students to use cell phones during school hours for academic purposes. “Text messaging devices” has also been changed to “smartphones.”

Item 2, subitem g. The wording of this item was changed to “pandemic disease” to broaden its scope, since schools’ emergency plans may cover infectious diseases other than or in addition to the flu.

Item 4, subitem b. The word “training” was removed from this item, and the item was moved to follow subitem 4a in an effort to put similar items closer together.

Item 4, subitem d. The word “attention” has been removed from this item since individual mentoring, tutoring, and coaching all imply individual attention.

Item 13, subitem a. The wording of this item was revised to increase consistency between subitems 13a and 17b.

Item 14, subitem c. The wording of this item was revised to increase consistency between subitems 14c and 17a.

Item 17, subitem b. The wording of this item was revised to increase consistency between subitems 13a and 17b.

Item 18, subitem a. The definition for School Resource Officer was removed from this item as the definition is now included in the list of formal definitions.

Item 19. The wording of this item stem was re-ordered to read “sworn law enforcement officers (including School Resource Officers)” to increase consistency with the wording used in other items in the School Security Staff section.

Item 24, subitem c. “Confidentiality” was added as an example to this item in a parenthetical notation.

Item 24, subitem f. Per changes to the term and definition as noted above, “diagnostic assessment” was changed to “diagnostic mental health assessment” in this item. Additionally, “diagnostic mental health assessment” and “treatment” were bolded and marked with an asterisk as an indication that these terms have a formal definition.

Item 32, subitem c. The word “gender” was changed to “sex” to align the terminology in the questionnaire with the terminology used in other NCES data collections.

Item 32, subitem e. A parenthetical was added to clarify the meaning of “disability.” The examples given in the parenthetical (physical, mental, and learning disabilities) align with those used in subitem 35g.

Item 34. Previously, this item asked respondents to record the number of arrests that occurred at school. Since respondents are sometimes unable to record the exact number of arrests, the format of this item has been changed to use the following response categories: 0, 1–5, 6–10, and 11 or more. These data will be used to benchmark against estimates collected in other federal data collections. Due to the addition and removal of items from the “Incidents” section of the survey, this item was moved to the end of the section to establish a better flow in the survey items.
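
For illustration only, exact arrest counts could be recoded into these response categories as shown below; the data values are hypothetical, and only the category boundaries (0, 1–5, 6–10, 11 or more) come from the item itself.

    # Illustrative sketch only: recode exact arrest counts into the item 34
    # response categories (0, 1–5, 6–10, 11 or more).
    import pandas as pd

    counts = pd.Series([0, 2, 7, 15])  # hypothetical arrest counts
    categories = pd.cut(
        counts,
        bins=[-1, 0, 5, 10, float("inf")],
        labels=["0", "1–5", "6–10", "11 or more"],
    )
    print(categories)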

Item 37, subitem b. “At-home instruction” was changed to “home instruction” to align the terminology in the questionnaire with the terminology used by schools.

Item 41, subitem b. “Limited English Proficient” was changed to “English language learner (ELL)” to align the terminology in the questionnaire with the terminology used in other NCES data collections.

2.3 Item Deletions and Rationale

2015–16 Questionnaire, Item 1, subitem h. This item was deleted. The two items on random sweeps (1h and 1i in the 2016 questionnaire) were combined because it is more important to identify how many schools are conducting sweeps for contraband than whether or not schools are using dog sniffs during these sweeps.

2015–16 Questionnaire, Item 1, subitem k. This item was deleted. The two items on required drug testing (1j and 1k in the 2016 questionnaire) were combined because it is more important to identify how many schools are conducting random drug testing as opposed to the specific population that is being tested.

2015–16 Questionnaire, Item 1, subitem v. This item was deleted. This variable was determined to be outdated and to have limited analytic use.

2015–16 Questionnaire, Item 1, subitem x. This item was deleted. This variable was determined to be outdated and to have limited analytic use.

2015–16 Questionnaire, Item 4, subitem c. This item was deleted. This variable was shown to have little variance and to have limited analytic use.

2015–16 Questionnaire, Item 4, subitem d. This item was deleted. This variable was determined to have limited analytic use and was dropped to increase the focus on other, more-specific programs included in item 4.

2015–16 Questionnaire, Item 4, subitem f. This item was deleted. This variable was determined to have limited analytic use and was dropped to increase the focus on other, more-specific programs included in item 4.

2015–16 Questionnaire, Item 8, subitem c. This item was deleted. This variable was determined to have limited analytic use.

2015–16 Questionnaire, Item 9, subitem c. This item was deleted. Similar information is collected in other NCES surveys, such as the National Household Education Surveys.

2015–16 Questionnaire, Item 9, subitem d. This item was deleted. Similar information is collected in other NCES surveys, such as the National Household Education Surveys.

2015–16 Questionnaire, Item 14, subitem d. This item was deleted. This variable was shown to have little variance and limited analytic use.

2015–16 Questionnaire, Item 20. This item was deleted. To address limitations in the format of the item in the 2016 questionnaire and to improve comprehension, information on the types of mental health services available and the location of these services will now be collected in four separate items (items 20, 21, 22, and 23 in the 2017–18 questionnaire).

2015–16 Questionnaire, Item 21, subitem d. This item was deleted. To collect information on limitations related to the broader scope of parental concerns regarding schools’ efforts to provide mental health services to students, it was replaced with a new item on “concerns about reactions from parents.”

2015–16 Questionnaire, Item 30. This item was deleted. This item was determined to have limited analytic use, and its deletion is intended to help reduce overall questionnaire burden on the respondent.

2015–16 Questionnaire, Item 31. This item was deleted. This item was determined to have limited analytic use, and its deletion is intended to help reduce overall questionnaire burden on the respondent.

2.4 Content Modifications, Item Additions, and Rationale

Item 4. The stem of this item was revised. Specifically, “programs” was changed to “activities” to encompass the wide range of programs, trainings, and interventions that schools may implement in an attempt to prevent or reduce violence. The specification of “formal” was removed from the item, since both formal and informal activities are important in schools’ attempts to prevent or reduce violence. The specification that activities must be “intended to prevent or reduce violence” was also removed, since it is assumed that all activities in this item have an explicit or implicit intent to prevent or reduce violence. The instruction to answer “Yes” for all that apply was also removed.

Item 20. This item has been added to assess the percentage of schools that provide diagnostic mental health assessments to evaluate students for mental health disorders. Adequate assessment of mental health disorders in students may help to prevent future violent acts, and research supports that school mental health programs can have an impact on reducing behavioral problems.

Item 21. As a follow-up to item 20, this item has been added to assess whether schools are providing diagnostic mental health services at school or outside of school (school-employed or contracted mental health professionals may be providing services in either or both locations).

Item 22. This item has been added to assess the percentage of schools that provide treatment to students for mental health disorders. Adequate treatment of mental health disorders in students may help to prevent future violent acts, and research supports that school mental health programs can have an impact on reducing behavioral problems.

Item 23. As a follow-up to item 22, this item has been added to assess whether schools are providing treatment for mental health disorders at school or outside of school (school-employed or contracted mental health professionals may be providing services in either or both locations).

Item 24, subitem d. This subitem was added to assess whether principals perceive that their schools’ efforts to provide mental health services to students are limited by concerns about how parents may react.

Item 25, subitem h. This item will gather information on whether teachers/aides have been trained in what steps to take once they have recognized the signs of suicidal tendencies or self-harm. Such training is critical for intervening with students who may be at risk of self-harm.

Item 26. This item will gather information on the number of schools that had a staff member who legally carried a firearm on school property. While many schools, districts, and states have laws or policies that prevent the carrying of firearms in public schools, a growing number of state and district policies allow school staff to legally carry (concealed) firearms at school, an indication that this item is particularly policy relevant.

Item 33. This item will identify the percentage of schools that had an incident of sexual misconduct between a staff member and a student during the 2017–18 school year. Adding this item is in direct response to a GAO recommendation for the Department of Education to collect data and have the ability to respond to the prevalence of sexual misconduct by school personnel.

Item 35, subitem f. This item will gather information on the frequency of student harassment of other students based on religion. Harassment based on other biases is already included on the survey, and this item will separately identify harassment based on religion.

Item 35, subitem g. This item will gather information on the frequency of student harassment of other students based on disability. Harassment based on other biases is already included in the survey, and this item will separately identify harassment based on student disabilities (including physical, mental, learning, and other disabilities).

C3. Changes to the Questionnaire and Rationale: SSOCS:2020

The following section details the editorial changes, item deletions, and global formatting changes made between the SSOCS:2018 and SSOCS:2020 questionnaires. Based on the results of the SSOCS:2018 data collection, feedback from content area experts, and a seminar on visual design in self-administered surveys, some items for SSOCS:2020 were revised for consistency, clarity, and brevity. The section Frequency of Crime and Violence at School was removed, and the corresponding questions were incorporated into the Incidents section. Additionally, several items were removed from the survey in an effort to reduce overall questionnaire burden on the respondent. No new items were added.

The result is the proposed instrument for the SSOCS:2020 survey administration, which is provided in appendix B.


3.1 Changes to Definitions

Three terms and definitions (active shooter, alternative school, and children with disabilities) have been adjusted to align with federal definitions for those terms. Eight definitions (evacuation, gender identity, hate crime, lockdown, rape, School Resource Officer (SRO), shelter-in-place, and threat assessment) have been minimally revised to increase brevity and clarity for survey respondents.

Active shooter – The definition was revised to align with the current definition used by the U.S. Department of Homeland Security.

Alternative school – The definition for alternative school (previously “specialized school”) was revised to align with other NCES and Department of Education surveys and publications.

Children with disabilities – The definition for children with disabilities (previously “special education students”) was updated to align with the Individuals with Disabilities Education Act (IDEA) definition.

Evacuation – The definition was simplified to avoid implied endorsement of a specific procedure for evacuation.

Gender identity – Detailed examples of gender expression were removed from the definition for brevity.

Hate crime – The definition was modified to include national origin or ethnicity as a hate crime bias.

Lockdown – The term was simplified, and examples were removed to avoid implied endorsement of a specific procedure for lockdown.

Rape – The bracketed item-specific instruction was removed from the definition. This information is specific to item 25 and the instructions appear within that item.

School Resource Officer (SRO) – The word “career” was removed to broaden the definition to include all SROs.

Shelter-in-place – The definition was modified to clarify the purpose of the practice and examples of procedures were simplified.

Threat assessment – The word “team” was removed from the term and the definition was modified to focus on a formalized threat assessment process rather than a team.

    2. Editing Changes

Throughout the questionnaire, the school year has been updated to reflect the most recent 2019–20 school year, and item skip patterns have been updated to reflect the new numbering in the questionnaire.

Arrest – The first letter in the definition was lowercased for consistency with other definitions.

Gender identity – The word “means” was removed from the beginning of the definition for consistency with other definitions.

Hate crime – The first letter in the definition was lowercased for consistency with other definitions.

Sexual misconduct – The first letter in the definition was lowercased for consistency with other definitions.

Sexual orientation – The word “means” was removed from the beginning of the definition for consistency with other definitions.

Item 1, subitem a. The underlining and bolding of the word “and” were removed to align with consistent formatting practices across the questionnaire.

Item 1, subitem u. The underlining of the word “use” was removed to align with consistent formatting practices across the questionnaire.

Item 2, subitem f. The term “Suicide threats or incidents” was pluralized to make the item parallel with the wording used in items 2d and 2e.

Item 4, subitem d. The forward slashes in “mentoring/tutoring/coaching” were changed to commas.

Item 5. Per the changes to the term and definition as noted above, the term “threat assessment team” was changed to “threat assessment.”

Item 6, subitem c. This subitem was expanded to include student groups supporting the acceptance of religious diversity.

Item 8. The phrase “disciplined and drug-free schools” was replaced with “a safe school” to broaden the question and better reflect current Department of Education language.

Item 13. The phrase “Memorandum of Use” was changed to “Memorandum of Understanding” to better reflect current terminology.

Item 14. The term “at school” was set in bold and marked with an asterisk to indicate that it has a formal definition.

Item 14, subitem b. The subitem was reworded to distinguish examples of physical restraints from chemical aerosol sprays.

Item 15. The term “Part-time” was capitalized in the instructions to increase consistency with the response options of the item.

Item 16. The term “Part-time” was capitalized in the instructions to increase consistency with the response options of the item. The term “security guards” was changed to “security officers” to better reflect current terminology.

Item 23. The phrase “to the best of your knowledge” was removed from the item for brevity. The instruction to exclude sworn law enforcement was moved into the item stem to increase clarity.

Item 25. The underlining of the word “incidents” was removed to align with consistent formatting practices across the questionnaire. The column 2 header was changed to “Number reported to sworn law enforcement” for clarity.

Item 27, subitem a. The word “color” was removed from the item to reduce ambiguity in terminology.

Item 28. The underlining of “whether or not the incidents occurred at school or away from school” was removed to align with consistent formatting practices across the questionnaire.

Item 31. The placement of language specifying the inclusion of both students and non-students was adjusted for increased clarity.

Item 34. The word “Yes” was capitalized for consistency with the rest of the item.

Item 34, subitem c. Per the changes to the term and definition as noted above, the term “a specialized school” was changed to “an alternative school.”

Item 35. Per the changes to the term and definition as noted above, the column 3 header term “specialized schools” was changed to “alternative schools.”

Item 36, subitem b. Per the changes to the term and definition as noted above, the term “specialized schools” was changed to “alternative schools.”

Item 38, subitem c. Per the changes to the term and definition as noted above, the term “Special education students” was changed to “Children with disabilities (CWD).”

Item 44. The question was rephrased to better align with the language above the response box and clarify that the response should be a percentage of the school’s total enrollment.

    3. Changes to School/Respondent Information

In prior SSOCS collections, respondents have been asked to provide their name and title/position. For SSOCS:2020, respondents are provided more title/position response options, and similar title/position information is being requested for any other school personnel who helped to complete the questionnaire. This modification reflects feedback from the Technical Review Panel (TRP) and aims to gain a better understanding of all staff involved in completing the survey.

    4. Item Deletions and Rationale

2017–18 Questionnaire Item 6. This item was deleted. Following feedback from an expert panel, it was determined that how often the threat assessment team meets is not a critical piece of information, and the broad response options had limited analytic use.

2017–18 Questionnaire Item 9. This item was deleted to reduce respondent burden since the item overlaps with the National Teacher and Principal Survey (NTPS).

2017–18 Questionnaire Item 12, subitem a. This subitem was deleted. Similar information is collected in SSOCS:2020 item 9 (SSOCS:2018 item 11); its deletion is intended to help reduce overall questionnaire burden on the respondent.

2017–18 Questionnaire Item 15. This item was deleted. Similar information is collected in SSOCS:2020 items 9 and 10 (SSOCS:2018 items 11 and 12); its deletion is intended to help reduce overall questionnaire burden on the respondent.

2017–18 Questionnaire Item 27, subitem j. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem k. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem l. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 27, subitem m. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable was outdated and had limited analytic use.

2017–18 Questionnaire Item 36, subitem b. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable had limited analytic use.

2017–18 Questionnaire Item 36, subitem c. This subitem was deleted. Following feedback from an expert panel, it was determined that this variable had limited analytic use.

    5. Global Changes to Formatting and Instructions

In addition to the item-specific changes described above, some global changes were made to enhance the consistency and formatting of the questionnaire in an effort to improve its visual design. A streamlined and consistent questionnaire will be easier for the respondent to follow, reduce response time and burden, and help promote an accurate understanding of survey items and response options. These revisions were based on feedback from a TRP consisting of content area experts and on the recommendations of a national expert in visual design elements for self-administered surveys.

The survey cover page has been revised to:

  • Include the Department of Education and U.S. Census Bureau logos in order to enhance the perception of the survey’s legitimacy.

  • Remove white space where the school information will be printed. White space typically indicates an area for the respondent to fill in, but in this case the information will be pre-filled by Census.

  • Remove the list of endorsements. The endorsements will be provided in a separate handout in order to reduce clutter on the cover page and allow for the incorporation of the logos of some endorsing agencies that respondents may be most familiar with.

Horizontal and vertical grid lines have been removed.

Alternating row shading has been incorporated.

Certain response field shapes have been changed to reflect best practices in questionnaire design. The following guidelines for response fields have been implemented for consistency across the SSOCS:2020 questionnaire (a simple illustration of the mapping follows the list below). These changes also bring the paper questionnaire design into better alignment with the design of the SSOCS web instrument:

  • For items where respondents select only one response (out of two or more response options), response fields will appear as circles.

  • For items where respondents select all applicable responses (out of two or more response options), response fields will appear as squares.

  • For items where respondents are asked to provide numerical values (e.g., incident counts or dates) or text (e.g., names or e-mail addresses), response fields will appear as rectangles.
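For illustration only, these shape conventions can be expressed as a simple lookup that also shows the analogous web form control for each response-option style. This is a hypothetical sketch, not the actual SSOCS web instrument code; the names used here are invented.

    # Hypothetical sketch of the response-field conventions described above;
    # not the actual SSOCS web instrument code.
    FIELD_STYLES = {
        "select_one": {"paper_shape": "circle", "web_control": "radio button"},
        "select_all": {"paper_shape": "square", "web_control": "checkbox"},
        "open_entry": {"paper_shape": "rectangle", "web_control": "text or number box"},
    }

    def field_style(response_type: str) -> dict:
        """Look up the paper field shape and analogous web control for a style."""
        return FIELD_STYLES[response_type]

    # Example: an incident-count item uses open entry, so it gets a rectangle.
    print(field_style("open_entry")["paper_shape"])  # rectangle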

Instructions found at the bottom of pages referring the respondent to item definitions will now read “*A removable Definitions sheet is printed on pages 3–4.” Similar to NTPS procedures, the two definition pages will be included as a perforated sheet that can be removed from the questionnaire to facilitate easier reference when taking the survey.

All apple-style bullet points have been changed to circular bullet points.

The source code font has been lightened, and codes have been moved away from response boxes to avoid distracting the respondent.

Certain instructions in the survey were also removed to reduce redundancy and item length. The following instructions are included the first time a style of item response options is introduced, but not in subsequent questions that use the same response option style:

  • “Check “Yes” or “No” on each line” (appears first in Question 1).

  • “Check one response on each line” (appears first in Question 21).

  • “If none, please place an “X” in the None box” (appears first in Question 15).


C4. Changes to the Questionnaire and Rationale: SSOCS:2020 Change Request September 2019

The following section details the editorial changes, item deletions, and global formatting changes made between the SSOCS:2020 questionnaire approved by OMB in May 2019 and the SSOCS:2020 questionnaire submitted as part of a change request memo in September 2019.

The result is the proposed instrument for the SSOCS:2020 survey administration, which is provided in appendix B.


    1. Changes to Definitions

No changes were made to the definitions themselves, although one term was updated to more accurately reflect its definition. The Definitions page will be perforated so that the respondent can remove it to assist with responding to items. The instruction on the Definitions sheet was updated to remind the respondent that the page can be detached, and an additional instruction was added alongside the perforation as a reminder to remove the page. The ordering of “active shooter” and “alternative school” was switched so that the terms are listed in alphabetical order.

Instruction on Definitions page: The following words are bolded and marked by an asterisk (*) wherever they appear in the questionnaire. Please detach and use these definitions as you respond.


New instruction alongside perforation: Please tear off this “definitions” sheet to use while completing the survey.


New order of terms:

Active shooter – The definition was revised to align with the current definition used by the U.S. Department of Homeland Security.

Alternative school – The definition for alternative school (previously “specialized school”) was revised to align with other NCES and Department of Education surveys and publications.

Change in term:

The term “restorative circle” was updated to “restorative practices” to more accurately reflect the entire process rather than one example (i.e., a “circle”) of a practice.

    2. Editing Changes

Throughout the instrument, item skip pattern instructions have been updated to reflect the new numbering and item positioning in the questionnaire. Item numbering and the estimated number of minutes to complete the questionnaire have been updated to reflect item deletions.


Instructions page: The following bullet was updated to reflect the updated page numbers:

  • Defined terms are bolded and marked with an asterisk (*) throughout the survey. A removable “definitions” sheet is printed on pages 2 and 3 to use as a reference while filling out the questionnaire. (The page reference was previously “pages 3–4.”)



Item 1. Several subitems in this section were moved to group related subitems together.

Item 4. The subitem was updated to replace “restorative circles” with “restorative practices,” so that it now reads: “Student involvement in restorative practices* (e.g., peace or conflict circles).”

Item 9. The instructions for item 9 were updated so that the item range now reads “items 9–15” (previously “items 9–12”): “Do not include security officers or other security personnel who are not sworn law enforcement in response to items 9–15; information on additional security staff is gathered in item 16.”

Item 10, subitem c. The slash was removed so that the text now reads: “When school or school activities were not occurring.”

Item 28. The instruction was updated so that the item references now read “items 25a and 25b” (previously “items 23a and 23b”): “Sexual assault* and rape* are both forms of sexual misconduct. Therefore, some incidents of staff–student behavior may be reported in response to items 25a and 25b as well as item 28.”

Item 31 (previously item 33). The formatting of this item has changed because the item no longer has multiple subitems.

Previous wording: To the best of your knowledge, thinking about problems that can occur anywhere (both at your school* and away from school), how often do the following occur? Cyberbullying* among students who attend your school

Current wording: To the best of your knowledge, thinking about problems that can occur anywhere (both at your school* and away from school), how often does cyberbullying* among students who attend your school occur?



    3. Changes to School/Respondent Information

The ordering of items in the School Characteristics and Respondent Information sections was updated to better group related items together.


    4. Item Deletions and Rationale

Questionnaire Items 29 & 30. For the 2018 SSOCS, it was known that some schools in the sample had experienced shootings and opted out of completing the survey. This raises concerns about bias in the data and suggests the need to rely on mandatory, universe collections for information on school shootings and homicides. Therefore, items 29 (shootings) and 30 (homicides) have been removed from the 2020 SSOCS instrument. These deletions result in a renumbering of items, as well as a change in the estimated number of minutes to complete the survey.
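As a purely illustrative sketch (hypothetical helper code, not part of any SSOCS processing system; the total item count used below is arbitrary), the renumbering implied by such deletions can be computed mechanically so that skip-pattern references stay consistent:

    # Hypothetical sketch: recompute item numbers after deletions so that
    # skip-pattern references can be updated consistently. Not SSOCS code.
    def renumber(items: list[int], deleted: set[int]) -> dict[int, int]:
        """Map each retained old item number to its new sequential number."""
        mapping = {}
        new_number = 0
        for old in items:
            if old in deleted:
                continue
            new_number += 1
            mapping[old] = new_number
        return mapping

    # Items 29 (shootings) and 30 (homicides) are removed; the 44-item
    # range is illustrative only.
    mapping = renumber(list(range(1, 45)), deleted={29, 30})
    print(mapping[33])  # 31 -- matches the cyberbullying item moving from 33 to 31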

    5. Global Changes to Formatting and Instructions

Instructions found at the bottom of pages referring the respondent to item definitions will now read “*A removable “Definitions” sheet is printed on pages 2 & 3.”



C5. School Survey on Crime and Safety (SSOCS) 2018 Cognitive Interview Report


List of Charts


List of Tables

Executive Summary

This report summarizes the findings from and decisions following cognitive interviews conducted by American Institutes for Research (AIR) to test and revise items on the School Survey on Crime and Safety (SSOCS) 2018 questionnaire. Nineteen remote and in-person cognitive interviews were conducted across the United States during November and December of 2016.


In the cognitive interviews, AIR interviewers followed a structured protocol in an hour-long one-on-one interview session, in which participants were encouraged to think aloud as they answered, with probes from the interviewer as needed. The objective of the cognitive interview testing was to identify problems of ambiguity or misunderstanding in item wording and response options. The testing allowed the wording to be revised and refined for the next administration of SSOCS in spring 2018. The intended result of cognitive interviews is a questionnaire that is easier to understand, interpreted consistently across respondents, and aligned with the concepts being measured.


Key Findings and Recommendations


Across the cognitive interviews, respondents rarely referred back to the instructions or definition page, even if they were stuck or found a concept confusing. Small edits to draw the respondent’s attention to definitions will be implemented for the SSOCS:2018 paper questionnaire. Upon approval, SSOCS:2018 will also include an experimental web administration; NCES, AIR, and Census intend to incorporate the definitions and skip patterns directly into items for the web instrument.


Item 3, which asked participants to identify components of programs intended to prevent or reduce violence, was the tested item that produced the most respondent error and confusion. Specifically, most respondents had difficulty defining “formal” programs and determining the “intent” of programs. Though revisions made to this item and tested with the latter portion of participants appeared to improve respondents’ understanding of the item’s scope, continued comprehension issues led NCES and AIR to further revise the item to remove both the specification of “formal” programs and the intent to prevent or reduce violence. By their nature, all of the listed components have either an implicit or explicit influence on reducing or preventing violence.


Respondent feedback on two new items – item 4 on staff carrying firearms at school and item 12 on sexual misconduct between staff and students – yielded no significant comprehension issues. NCES will include these items on SSOCS:2018 to gather information that no other federal survey currently requests. The testing of one new item – item 9 on school-based probation officers – indicated that respondents were confused about the scope of school-based probation officers. Because of these comprehension issues, and because the survey included only a general definition of probation officer rather than one specific to school-based probation officers, NCES and AIR felt the item should undergo further refinement and testing before being included on the questionnaire; the item will not appear on SSOCS:2018.


Respondents generally experienced few issues with the other items tested during cognitive interviews. NCES and AIR made minor revisions to these items to add context or clarification.


Participants who reviewed and answered questions about the survey materials and delivery cited the time burden and length of the questionnaire as response deterrents. Respondents noted that the relevance of the topic and a small gift would be incentives to complete the questionnaire. Based on this feedback, NCES and AIR will make efforts to shorten the length of future questionnaires, streamline and target the communication materials, develop a pilot web instrument and electronic materials, and test the inclusion of a monetary incentive to increase response rates.

Introduction


The School Survey on Crime and Safety (SSOCS), a nationally representative survey of elementary and secondary public schools, is one of the nation’s primary sources of school-level data on crime and safety. Managed by the U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics (NCES), SSOCS has been administered six times, covering the 1999–2000, 2003–04, 2005–06, 2007–08, 2009–10, and 2015–16 school years, and will next be administered in the spring of the 2017–18 school year.


SSOCS is the only recurring federal survey collecting detailed information on the incidence, frequency, seriousness, and nature of violence affecting students and school personnel, as well as other indices of school safety from the schools’ perspective. As such, SSOCS fills an important gap in data collected by NCES and other federal agencies. Topics covered by SSOCS include school practices and programs, parent and community involvement, school security staff, school mental health services, staff training, limitations on crime prevention, the type and frequency of crime and violence, and the types of disciplinary problems and actions used in schools. Principals or other school personnel designated by the principal as the person who is “the most knowledgeable about school crime and policies to provide a safe environment” are asked to complete SSOCS questionnaires.


Background

At multiple points in its history, the quality of SSOCS survey items has been examined. To the greatest extent possible, minimal changes have been made to existing survey items to maintain consistent benchmarks over time. However, items have periodically been modified to ensure relevancy, and new items have been added to address emerging issues of interest. The 2016 questionnaire, which included approximately 40 new and modified items or sub-items on school practices and programs, school security staff, school mental health services, staff training, and incidents of crime, received OMB clearance in August 2015 (OMB# 1850-0761 v.8). The 2015–16 data collection was fielded in spring 2016.


SSOCS will be conducted again in spring 2018. NCES and its contractor, the American Institutes for Research (AIR), held a series of meetings in spring and summer 2016 to discuss the proposed content of the SSOCS:2018 questionnaire. NCES and AIR also requested feedback on proposed and modified survey items from top experts in school crime and safety who previously served as Technical Review Panel (TRP) members during development of prior SSOCS questionnaires. As a result of these discussions, 8 items or sub-items were recommended for addition, 15 items or sub-items were recommended to be modified, and 10 items or sub-items were removed from the questionnaire. Additionally, recommendations were made to add or modify several definitions in accordance with changes made to survey items.


As a final step in the item development process, a portion of the new and modified survey items were tested on target participants through cognitive interviews in fall 2016 (the questionnaire that contains all of the items that underwent cognitive testing can be found in appendix A-1 of this report). In addition to the selected survey items, questions about the communication materials and physical survey package contents were also tested to assess factors that may contribute to the propensity to complete the survey (the communication materials that were tested can be found in appendix A-2). This document describes the types of cognitive testing conducted, the sample and recruitment plans, the data collection process, and the data analysis process. It also provides detailed findings from the cognitive interviews and summarizes the discussion of results and final decisions on additional modifications to survey items and materials included in the 2018 questionnaire.


Study Purpose


The objective of cognitive interviews is to uncover participants’ specific comprehension issues when responding to survey items. Cognitive interviews also measure participants’ overall understanding of the content surveyed. In a cognitive interview, an interviewer uses a structured protocol in a one-on-one interview, drawing on methods from cognitive science. Interviews are intended to identify problems of ambiguity or misunderstanding in question wording, with the goal of ensuring that final survey items are easily understood, interpreted consistently across respondents, and aligned with the concepts being measured. Cognitive interviews should result in a questionnaire that is easier to understand, less burdensome for respondents, and likely to yield more accurate information.

Methodology


Sample and Recruitment Plan


In order to meet a target number of approximately 20 completed cognitive interviews, interview respondents were recruited via a sub-contracted recruitment firm, Nichols Research. Nichols Research is a full-service marketing research firm operating in the San Francisco Bay Area and Central California. During the last month of recruitment, AIR staff also recruited potential participants through e-mail and phone outreach to school administrators who have relationships with AIR through other research projects.


Recruited participants were targeted from a variety of locations, as trained AIR interviewers were available to conduct in-person interviews at school sites near five AIR offices – in the District of Columbia, Chicago, San Mateo, Boston, and Austin metropolitan areas – or could conduct remote interviews with participants anywhere in the United States. Given the diversity of locations across the country in which the cognitive interviews were held, NCES and AIR expected that the participants would better represent the target population of schools for SSOCS than would participants sampled from a single region or city. In order to adequately test the survey instrument with minimal selection bias, an attempt was made to distribute interviews across schools representing a diverse cross section of the general population in terms of socio-demographic characteristics (e.g., percent White enrollment, total enrollment size).


Interested individuals were screened to ensure they were eligible for participation in the interviews. Eligible schools included regular public schools, charter schools, and schools that have partial or total magnet programs with students in any of grades prekindergarten through 12. Eligible participants included school principals or persons designated by principals as “the person most knowledgeable about school crime and policies to provide a safe environment.”
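As a minimal sketch of these screening rules (hypothetical code for illustration only; the actual screening was performed by recruiters, not software, and the names below are invented):

    # Hypothetical illustration of the eligibility rules described above.
    ELIGIBLE_SCHOOL_TYPES = {"regular public", "charter", "partial magnet", "total magnet"}
    GRADES = ["PK", "K"] + [str(g) for g in range(1, 13)]  # prekindergarten through 12

    def school_eligible(school_type: str, grades_served: list[str]) -> bool:
        """A school qualifies if it is an eligible type serving any grade PK-12."""
        return (school_type in ELIGIBLE_SCHOOL_TYPES
                and any(g in GRADES for g in grades_served))

    def participant_eligible(is_principal: bool, is_designee: bool) -> bool:
        """The principal, or the person the principal designates as most
        knowledgeable about school crime and safety policies."""
        return is_principal or is_designee

    print(school_eligible("charter", ["K", "1", "2", "3", "4", "5"]))  # True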


Upon successful screening, e-mails and phone calls were used to contact eligible participants and schedule interviews with AIR staff. Participants were asked whether they preferred in-person interviews or interviews conducted remotely through video conference or teleconference. The availability of remote interviews served to encourage participation by offering flexible options. Remote interviews also reduce costs compared with strictly in-person interviews, which are more costly overall (due primarily to transit costs and additional staff hours for travel time). Outreach materials used for recruitment can be found in appendix C-2 of this report.


In total, 19 principals were successfully recruited from across the nation to participate in the testing.

Chart 1 shows the states and cities of the participants’ schools.


Table 1 shows the distribution of these participants across interview modes and waves. Note that the school names have been removed and replaced with “School A, B, C, etc.” to ensure participant confidentiality.

Chart 1. Distribution of cognitive interview participants, by school location




Table 1. Distribution of cognitive interview participants, by school and interview mode and wave

Interview Mode
  In-person (3 schools): B, H, I
  Remote teleconference (10 schools): A, D, F, G, K, M, N, O, R, S
  Remote video conference (6 schools): C, E, J, L, P, Q

Interview Wave
  Wave 1 (11 schools): A, B, C, D, E, F, G, H, I, J, K
  Wave 2 (8 schools): L, M, N, O, P, Q, R, S

Total: 19 schools


The questionnaire was tested in two waves; changes to the questionnaire were implemented as a result of a preliminary review of findings from the first 11 interviews. Only one item was revised between wave 1 and wave 2. After cognitive interviews had been conducted with 11 of the 19 participants, AIR reviewed the interview notes and discovered that 10 respondents reported issues understanding item 3. Participant response errors stemmed from comprehension of terms, phrasing, and overall item structure, warranting revision of the item before conducting further interviews. After consultation with NCES, AIR revised item 3 for testing with the remaining 8 participants in a second wave. More details on this item are discussed in the Key Findings section.


Although multiple versions of the questionnaire were used during the actual cognitive interviews, the complete Wave 1 version of the questionnaire that contains all of the items that underwent cognitive testing, as well as the survey instructions and definitions, can be found in appendix A-1, for ease of presentation in this report.


The participants represented schools with a range of characteristics (by grade level, locale, enrollment size, and percent White enrollment). The participants included elementary, middle, high, and combined school principals. Table 2 shows descriptive statistics of the participants’ schools. While the sample included a variety of characteristics, the results of these interviews do not explicitly measure differences by these characteristics.


Table 2. Distribution of cognitive interview participants, by school and school characteristics

Level
  Primary (5 schools): C, D, G, M, S
  Middle (5 schools): A, B, F, O, P
  High school (4 schools): E, J, Q, R
  Combined (5 schools): H, I, K, L, N

Enrollment
  Less than 300 (8 schools): B, E, F, J, L, N, O, P
  300–499 (4 schools): A, K, M, S
  500–999 (6 schools): C, D, G, H, I, Q
  1,000 or more (1 school): R

Urbanicity
  City (12 schools): A, B, C, E, G, H, I, J, N, O, Q, S
  Suburb (4 schools): F, K, P, R
  Town (1 school): D
  Rural (2 schools): L, M

Percent White Enrollment
  More than 95% (0 schools): none
  More than 80 to 95% (1 school): M
  More than 50 to 80% (3 schools): G, L, Q
  50% or less (15 schools): A, B, C, D, E, F, H, I, J, K, N, O, P, R, S

Total: 19 schools


The interviews took place between November 9, 2016, and December 23, 2016, and principals were asked to report items for the 2016–17 school year.


Data Collection Procedures


Each cognitive interview lasted approximately one hour and was conducted using a structured protocol. The methodology was developed by AIR researchers in consultation with NCES and drew on best practices and methods from cognitive science. The interviews were conducted using two key components of cognitive interview methods: think-aloud interviewing and verbal probing techniques (known, respectively, as concurrent and retrospective recall probing). With think-aloud interviewing, participants are explicitly instructed to think aloud (i.e., describe what they are thinking) as they work through items. With verbal probing techniques, the interviewer asks probing questions as necessary to (1) clarify points that are not evident from the “think-aloud” process and (2) explore additional issues that have previously been identified as of particular interest. Cognitive interview studies produce qualitative data in the form of verbalizations made by participants during the think-aloud phase and in response to the interviewer probes.


AIR staff who had prior training and experience conducting cognitive interviews were chosen as interviewers. Interviewers received training from a senior researcher at AIR on the SSOCS-specific interview protocol. The interview team consisted of 10 trained staff, located throughout the District of Columbia, Chicago, San Mateo, Boston, and Austin AIR offices. Interviewers were assigned to cognitive interview sessions according to interview location, interview mode, and staff availability (updated weekly using internal polls). Staff from NCES observed a subset of interviews.


During each interview, the study participant was welcomed by the interviewer and informed that their participation was voluntary and that their answers would be used only for research purposes and not disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002, 20 U.S.C. §9573). All participants provided their consent at the time of the interview and, as a thank-you for their time and participation, received a $50 AMEX gift card for remote interviews or a $75 AMEX gift card for in-person interviews.


SSOCS is a paper-based survey; for the cognitive testing, participants were provided with a copy of the survey and other SSOCS materials (advance letter, cover letter, and brochure) consistent with an actual administration of SSOCS. Hard copies were provided for in-person interviews while electronic copies were e-mailed to the participant once a remote interview was scheduled. During the interview session, participants were asked to complete items in sets (broken out by the topical sections of the survey). They were asked to read the questions out loud and use the think-aloud process to describe how they understood the question and chose their response. Following the think-aloud portion for each section, the interviewer followed up with a set of pre-established probing questions, as necessary, to gather additional feedback and clarification.1


As the respondent filled out the questionnaire, the interviewer took handwritten notes in a protocol that contained the items and probes for the cognitive interview (see appendix B of this report for in-person and remote versions of the protocol). The interviewer kept a record of the respondent’s responses to the items using a separate version of the questionnaire. The interviews were also audio- or video-recorded to allow the interviewer to review the recording for additional detail for notes and analysis. Once the interview was completed, the interviewer entered notes in an electronic note template. The interviewer then saved the notes and the recording of the interview on a secure project server for analysis.


Data Analysis Procedures


Cognitive interviewing is a qualitative research method in which the overall purpose of the analysis is to capture and render respondents’ views within the boundaries of each item and associated probes. For the SSOCS cognitive interviews, after the interview notes were checked for quality, the study team compiled all notes using qualitative analysis software (NVivo) to determine the item-by-item issues across all interviews as well as themes apparent across items. The software was used to understand the qualitative data from the interviews and condense that information into specific findings.


Research staff from AIR employed a qualitative coding process to examine each set of notes from all cognitive interviews. Following a coding scheme allowed the analysis to be systematic and consistent over time and across analysts. AIR staff coded items across all interviews to identify where respondents expressed issues with question structure, wording, response options, terminology, or concepts.


In the analysis phase, AIR staff looked for patterns, themes, and categories to determine the most important findings under each construct of the questionnaire. The process of analysis involved noting the prevalence of different responses and grouping differences in responses and associations among responses. Counts of these groupings were used to identify prevalent themes as well as minority views. Common themes and problems arising in the interviews were used to recommend changes to the SSOCS questionnaire.
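The tallying step can be sketched as follows (a hypothetical illustration; the actual analysis was performed in NVivo, and the issue codes and notes shown here are invented examples):

    # Hypothetical sketch of tallying coded issues to surface prevalent themes.
    from collections import Counter

    # Each coded note: (item, respondent, issue_code) -- invented examples.
    coded_notes = [
        ("item 3", "R-A", "terminology"),
        ("item 3", "R-B", "terminology"),
        ("item 3", "R-C", "question structure"),
        ("item 5", "R-E", "terminology"),
        ("item 9", "R-O", "concept scope"),
    ]

    # Count how often each issue code was raised for each item, across interviews.
    issue_counts = Counter((item, code) for item, _, code in coded_notes)
    for (item, code), n in issue_counts.most_common():
        print(f"{item}: {code} ({n} respondent(s))")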

Limitations

A qualitative research methodology such as the cognitive interviews conducted for this research is not intended to develop quantitatively precise or absolute measures. Though the study recruitment attempted to distribute interviews across schools that represent a diverse cross-section of the general population given their socio-demographic characteristics, the non-stratified, non-random sample means the results cannot be generalized to the population under study with any known level of precision.

While the SSOCS cognitive interviews employed a choice of mode (in-person or remote) to promote greater potential response rates and improved coverage of the target population, factors related to the mode, such as the physical presence of an interviewer and whether information is communicated orally or visually, can influence how people respond. This study did not measure differences by interview mode.

Cognitive Testing Findings and Recommendations

The following section summarizes the findings from the cognitive interviews by providing a synopsis of principal feedback for each tested item, a summary of discussions between NCES and AIR to address respondent feedback, and final decisions on the tested items.


Note that the numbering for tested items in this report corresponds with the numbering in the cognitive testing questionnaire found in appendix A-1 of this report. Numbering for the revised items corresponds with the numbering in the final draft SSOCS:2018 questionnaire found in appendix B of the full OMB package. Definitions for many terms were provided to respondents at the front of the questionnaire (see page 2 of appendix A-1 of this report). Within items, defined terms are set in bold type and are marked with an asterisk (*).


Some tested items included sub-items that were not new or heavily modified for SSOCS:2018 but were included to provide context for the sub-items chosen for testing. Though respondents may have provided feedback on these contextual sub-items, it was the general practice of NCES and AIR not to recommend changes to them, given the preference to maintain trends across survey administrations.

Key Findings

General Survey Feedback


Throughout the interview sessions, interviewers made general observations of respondent confusion and comprehension issues; additionally, similar probes were utilized across multiple items. For example, interviewers frequently asked respondents to define terms in their own words to identify whether respondents’ interpretations of terms aligned with the definitions provided.


Issues Identified

Respondents rarely referred back to the instructions or definition page even if they were stuck or found a concept confusing.

  • However, interviewees who did take the time to refer to the definitions found them helpful for clarifying what information survey items intended to gather.

  • Six respondents specifically mentioned referring to the definitions sheet to answer questions. Definitions most frequently referenced were “harassment” and “trauma sensitivity.”

  • One interviewee mentioned wanting the definitions at the end of each page rather than having to flip back to the beginning each time to read through.

Six respondents did not follow the skip patterns for the mental health items. Interviewees who missed the skip pattern mainly attributed it to their own error.


For some items where not all sub-items were tested, or where related survey sections were not tested, some respondents expressed confusion over the scope of concepts. See, for example, specific feedback on item 1, related to the inclusion of metal detector checks in sweeps, and item 5, related to the inclusion of school resource officers and other security staff in the definition of probation officers.



Discussion

NCES and AIR considered the feedback from respondents on item definitions and instructions to be very helpful; however, given that the timeline for the SSOCS:2018 administration did not allow for additional testing, it was recommended that any major changes to the structure and location of definitions on the paper questionnaire be deferred until the SSOCS:2020 questionnaire. Small edits to draw the respondent’s attention to definitions and skip patterns were recommended, including bolding the reference to the definitions pages that appears at the bottom of each questionnaire page and underlining instructions for skip patterns. Upon approval, SSOCS:2018 will include an experimental web administration, and NCES, AIR, and Census intend to incorporate the definitions and skip patterns directly into items for the web instrument (see the sketch following the final recommendation below).

FINAL RECOMMENDATION: For the SSOCS:2018 paper questionnaire, bold the reference to the definitions pages that appears at the bottom of each questionnaire page, and underline instructions for skip patterns.
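As a minimal sketch of how skip logic could be built directly into a web item (hypothetical routing; the item numbers and answer wording here are illustrative, not the actual SSOCS web specification):

    # Hypothetical sketch of web-instrument skip logic; illustrative only.
    def next_item(current_item: int, answer: str) -> int:
        """Route the respondent automatically instead of relying on a printed
        skip instruction (which several respondents missed on paper)."""
        # e.g., if a lead-in mental health item is answered "No," skip its
        # follow-up items.
        if current_item == 5 and answer == "No":
            return 8  # jump past the follow-ups
        return current_item + 1

    print(next_item(5, "No"))   # 8
    print(next_item(5, "Yes"))  # 6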



Item 1


Issues Identified

Item stem and response categories

Generally, respondents understood what this question was asking. One respondent, R-P (i.e., the respondent from school ‘P’), reported having issues understanding this question. The respondent re-read the directions because she thought that the first question was asking two separate questions: (1) was it a practice, and (2) did the practice change. She said that this format is slightly confusing and that she was trying to figure out which question to answer.

Sub-items
    a. Responses to this question varied from large-scale, exhaustive searches (e.g., the whole building, all lockers) to small-scale spot checks (e.g., 10 random backpacks). Eight interviewees had some difficulty or expressed confusion when answering the question about sweeps for contraband. Three respondents (R-D, R-E, and R-I) seemed to focus only on the “dog sniffs” part of the question and, if no dogs were used, answered “no” regardless of whether they conducted other means of sweeping. Two respondents (R-B and R-J) wondered whether a wand or “weapon abatement machine” counted as a sweep. Three respondents (R-I, R-L, and R-G) mentioned general locker searches as sweeps. Respondents also defined contraband as anything they do not allow on campus, including cell phones, stolen goods, etc. Respondents thus answered this question with varying definitions of scope: (1) a very broad definition of contraband or (2) a very limited definition of contraband (i.e., drugs and weapons only).


    b. In general, respondents did not have an issue understanding this item. While most respondents referred to specific restrictions they had implemented on their own campuses, R-B and R-R referenced district-wide policies concerning cell phones. Most respondents said their cell phone policies were strict (e.g., cell phones placed in lockers at the beginning of the day; no use even during after-school activities). Respondents mentioned a few specific situations in which students are allowed to use cell phones: during lunch, after school, or in the event of an emergency (e.g., a student needs to call home due to illness).



Discussion

While NCES and AIR debated whether language on “current” practice should be included in the item stem and the instruction on change of practices removed, it was decided that no change should be made at this time, given the desire to retain the school year in the item stem and the fact that only one participant expressed difficulty with the current instructions. To address respondent feedback on sub-item 1a, it was recommended that the structure be modified to clarify that dog sniffs are only one example of contraband sweeps.

FINAL RECOMMENDATION: Revise text for sub-item 1a.


FINAL ITEM:



Item 2


Issues Identified

Item stem and response categories

Generally, respondents understood this question. Respondents who answered “no” to sub-items commonly said they did not have “written plans” for those instances but felt their staff knew what to do in those situations. Two respondents (R-J and R-O) said this question forced them to think about the types of documentation they have and said that they would be taking next steps to develop written procedures to fill in the gaps.

Sub-items
    a. No issues reported with this item.


    b. No issues reported with this item.


    c. There was some variation in responses. A few respondents mentioned having on-site plans but not off-site plans. One respondent (R-Q) mentioned that the district “probably” has written plans for these situations but that the school would need to call the district in order to respond if something were to occur.


Most respondents mentioned one or more examples of other types of events they would include when responding to this question: fire, falls, any event requiring first aid (medical emergencies), and community issues (fires, spills, air quality).


One respondent (R-P) reported issues responding to this question. She felt the item made it sound as if these accidents would happen outside of the school or on its periphery, as opposed to accidents that would happen in class (e.g., a student slips on some water and gets injured). She suggested broadening this item to include accidents that occur during the school day and on school grounds.


    d. No issues reported with this question.



Discussion

NCES and AIR discussed whether “Accidents” was an appropriate category for this item compared with the other sub-items. Based on respondent feedback, it was decided that the scope of “accidents” in the item was unclear. NCES and AIR felt that the item should be revised to distinguish between school- and district-level policies, as well as between accidents at school and outside of school. Revisions of this nature would need to undergo further cognitive testing, so it was recommended that the item not be included in its current structure on the SSOCS:2018 questionnaire.

FINAL RECOMMENDATION: Remove sub-item 2c from SSOCS:2018 and consider revising and retesting for SSOCS:2020.

Item 3 (WAVE 1 – tested on respondents A through K)


Issues Identified

Item stem and response categories

Ten of eleven respondents reported issues understanding this question. Response errors included defining “formal” and determining the “intent” of programs.

Issues around defining “formal” came up in two ways: (1) format and (2) frequency. One respondent (R-M) asked if “formal” meant the program needed to be “off-the-shelf,” practiced often and consistently, with staff trained to facilitate it. Other respondents asked if they could mark “yes” if they had activities that fell within those categories but were not necessarily practiced regularly or might not be considered a full program.

It became apparent to interviewers, when respondents were probed about the components of programs, that respondents were not paying close attention to the “intent to prevent or reduce violence” part of the question. Respondents often marked “yes” if they had the programs or activities in general, without considering whether the programs specifically had the intent to prevent or reduce violence. A few participants were candid enough to say they skipped the actual question, went straight to the sub-items, and noted “yes” if they had them and “no” if they did not.

Sub-items
    a. No issues reported with this question.


    b. No issues reported with this question.


    c. No issues reported with this question.


    d. No issues reported with this question.


    e. One respondent (R-B) had difficulty defining “peer mediation.” She said she thought “restorative circles” could be included in the definition of peer mediation and answered “yes” to the question, but was unsure if she should be responding based on her understanding of the question.


    f. No issues reported with this question.


    g. Two respondents (R-I and R-R) had issues with this question. R-I mentioned referring to the glossary for the definition of the concept. R-R answered the question by aligning it with general conflict resolution.


    h. No issues reported with this question.


Changes implemented to item 3 for Wave 2, respondents L through S (see findings below):

  • Removed the double-barreled stem of this item. Instead, the answer column for each sub-item was split into two Yes/No columns: the first to indicate whether such a component exists, and the second to indicate whether it is part of a program with the intent to prevent or reduce violence.


SEE ITEM 3 WAVE 2 DISCUSSION AND FINAL ITEM RECOMMENDATIONS BELOW


Item 3 (WAVE 2 – tested on respondents L through S)

Issues Identified

Item stem and response categories

While the revision made the question more text-heavy, it seemed to reduce respondents’ tendency to skip over reading the full question and helped them pay attention to what they were marking in each column. Respondents were more likely to specifically consider the intention to reduce or prevent violence with this structure.


“Formal” may still need to be defined in order to capture accurate responses concerning format and frequency; there still appears to be a need for clarification.


One respondent (R-M) read the question a few times. Afterwards, she said that she had been looking at both columns and thinking through the difference between them. She said that she liked this setup better than the setup used in item 1 because it was very clear which question she was answering. R-M answered “no” to all subcomponents. During the think-aloud, the respondent said that she was thinking about the phrase “formal programs.” For 3h, she noted that they do have ways of building community, but she would not call them formal. For example, they have morning meetings to promote a sense of community, but there is no program in writing that people can refer back to. The respondent said that she would answer “no” even though they have programs that build a sense of community.


One respondent from Wave 2 (R-Q) noted that “violence” is a very extreme word and that in his mind it means “extreme harm.” According to the respondent, if the word “violence” were replaced with “harm,” he would have answered the questions differently. He also said that, for all of the interventions the school had in place, one could argue that they all reduce violence indirectly even if that was not the specific intention.

  • Respondent answer: “[For] very few of them [programs on the checked list] that I have selected am I thinking about the program specifically having the “intention to reduce or prevent violence.” I’m just seeing them, checking whether we have the program – or not.”

Sub-items

See Wave 1 above. No additional sub-item issues were identified during Wave 2.



Discussion

NCES and AIR decided to revert to the original format of the item (removing the additional columns), since feedback indicated more comprehension issues with the item stem than with the format of the item itself. Thus, it was decided to remove both the specification of “formal” programs and the intent to prevent or reduce violence; by their nature, all of the listed components have either an implicit or explicit influence on reducing or preventing violence. Additionally, NCES and AIR recommended replacing “programs” with “activities” to better align with the broad scope of the sub-items included.

FINAL RECOMMENDATION: Revise item stem and structure.


FINAL ITEM:



Item 4


Issues Identified

Item stem and response categories

No issues reported with this question. All respondents answered “no” to this question.


Two respondents (R-G and R-M) noted that they have signs all over the school saying that firearms are not allowed. However, they were unsure how this applied to concealed carry laws, and noted that there were no policies (that they knew of) requiring staff with concealed carry licenses to notify the principal if they carried a gun to school.


Two respondents (R-M and R-P) noted that there was a policy (school/district or state level) prohibiting the carrying of firearms on school property.


One respondent (R-G) said that staff had not told her whether they carried a gun onto campus and that they are not required to do so.


Discussion

Respondent feedback yielded no comprehension issues with the wording of this item. Since no other federal surveys ask about this topic and testing indicated that the item did not add a significant amount of time burden for respondents, NCES and AIR recommended retaining the item for the SSOCS:2018 questionnaire.

FINAL RECOMMENDATION: Retain item (no change).


Item 5


Issues Identified

Item stem and response categories

Four respondents (R-E, R-J, R-M, and R-S) had issues defining “diagnostic assessment.” Only one of the four respondents (R-E) referred back to the definitions before answering the question. Most respondents referred to a diagnostic assessment as a tool used by a psychiatrist or mental health professional to determine whether a student has a disorder or mental health issue. One respondent (R-K) specifically mentioned the tool being used to determine a student’s eligibility for an IEP or 504 plan.


Five respondents (R-I, R-J, R-K, R-L, and R-O) had issues defining “mental health disorder,” and respondents’ answers varied across interviews as to the range of issues included in the definition. R-O said a mental disorder could really be anything and was unsure what to include, from depression to anxiety to bipolar disorder. One respondent (R-S) mentioned that she had to be careful to note the “mental health” part of the question and not answer based on an “educational assessment.” A majority of respondents reported a broad, comprehensive definition.



Discussion

After discussion between NCES and AIR, including consultation with mental health experts, it was suggested that the item stem be altered to remove “ability” and focus on schools that are providing these services (it was determined that if a school had the ability, it would be providing such services). Experts also suggested additional clarifications to address respondent confusion; specifically, using the term “diagnostic mental health assessments” to help respondents understand that educational assessments should not be included. An additional instruction was also suggested to emphasize that the assessments should be those conducted by a licensed mental health professional.


FINAL RECOMMENDATION: Revise item stem and instructions.


FINAL ITEM:


Item 6


Issues Identified

Item stem and response categories

Refer to respondent issues with defining diagnostic assessment under Item 5.

Because most respondents answered “no” to the previous item, respondents answered this question infrequently. Respondents typically said the difference between “at-school” and “outside-of-school” mental health professionals was the extent to which these professionals could diagnose students. Diagnosis was often defined as an assessment that required the school to refer out to specialist services. Three respondents (R-K, R-L, and R-P) spoke about contractual relationships with mental health professionals. The same professional titles (psychiatrist, psychologist, and social worker) were used to describe both at-school and outside mental health professionals, with the exception of school counselor (at school only). As an example of respondent confusion, R-J had trouble answering this question and originally answered “no” to both sub-items. She then explained that the situation could be both yes and no when a caseworker or psychologist comes in. She said they do have a psychologist, but they do not have a psych ward or anything like that. She was confused as to what the question was “looking for.” She did mention that some of the students have an IEP on file and have a behavioral disorder, but not a mental health disorder. Ultimately, she decided to answer “no.”

Sub-items
  a. No specific issues with this question.


  b. No specific issues with this question.



Discussion

After discussion between NCES and AIR, including consultation with mental health experts, it was recommended that “available” be changed to “provided” in this item and that the term “diagnostic mental health assessment” be used to align with the revisions made to the previous lead-in item.


FINAL RECOMMENDATION: Revise item stem.


FINAL ITEM:

Item 7


Issues Identified

Item stem and response categories

Most respondents did not have an issue with this question. Respondents defined treatment as methods to help students deal with various types of mental health disorders. Treatment was described as a regular plan delivered through a variety of means, including medication, psychotherapy, other care resources, and counseling.


One respondent (R-P) reported having an issue with the wording of this question. The respondent specifically drew attention to the word “ability.” The respondent said that schools always have the “ability,” but that the question might be trying to get at whether or not they actually provided treatment. The respondent also wondered if the word “ability” referred to whether or not her school has the financial resources, the emotional wherewithal, or the desire/will to have mental health services. Her answer to the question would vary depending on the definition of “ability.” She responded as if the word “ability” meant having sufficient financial resources.



Discussion

After discussion between NCES and AIR, including consultation with mental health experts, it was suggested that the item stem be altered to remove “ability” and focus on schools that are providing these services (it was determined that if a school had the ability, it would be providing such services). An additional instruction was also suggested to emphasize that the assessments should be those conducted by a licensed mental health professional.


FINAL RECOMMENDATION: Revise item stem and instructions.


FINAL ITEM:



Item 8


Issues Identified

Item stem and response categories

Generally, respondents understood this question. One respondent (R-R) said he was not sure if a school counselor should be considered a mental health professional and did not know if he should include it as part of his answer. He said he could argue for it either way.

Sub-items
  1. No issues reported with this question.

  2. No issues reported with this question.



Discussion

After discussion between NCES and AIR, including consultation with mental health experts, it was recommended that “available” be changed to “provided” in this item to align with the revisions made to the previous lead-in item.

FINAL RECOMMENDATION: Revise item stem.


FINAL ITEM:



Item 9

Issues Identified

Item stem and response categories

Seven respondents (R-D, R-F, R-I, R-K, R-O, R-P, and R-S) had issues with this question.



Several respondents had issues with the definition of “probation officer”; a few did not know whether to include school resource officers or other types of community security officers. R-O said that because they are an elementary school, they are not given a security guard; this respondent did not understand the proper definition of a school-based probation officer as it relates to the item and thought they were security guards. R-F listed “safety and security officer,” indicating that she did not fully understand the question as it relates to school-based probation officers. R-S thought that a probation officer was a security guard who may or may not be actual law enforcement and did not initially realize that they were different concepts.


One respondent (R-O) said a probation officer, in his mind, is someone who monitors people who have broken the law. However, upon further reflection, he said he was not sure if the question was asking about that type of probation officer and wondered if the survey was asking about academic probation instead. He did not refer to the glossary.


One respondent (R-K) said the question was asking something he was not familiar with and thought it was odd that schools would even have probation officers located on school property.


One respondent (R-O) said his school is small and does not have a need for one, and that because it is an elementary school, the role of a probation officer would not be relevant.


Two respondents (R-D and R-P) said they had an officer that can be called specifically for truancy but not for anything else.


There were differences in the understanding of “probation officer” by school level. High school respondents did not report any issues, whereas respondents from primary, middle, and combined schools had difficulty with the term and concept.



Discussion

Based on respondent confusion over school-based probation officers in cognitive testing, and the fact that the definition included was a general definition of probation officer and not specific to school-based probation officers, NCES and AIR felt this item should undergo further refinements and testing before being included on the questionnaire.

FINAL RECOMMENDATION: Drop item from SSOCS:18 and consider additional testing for SSOCS:20.


Item 10


Issues Identified

Item stem and response categories

Generally, this question did not pose any issues for respondents; however, sub-item 10e caused some confusion.

Sub-items
  1. No issues reported with this question.

  2. No issues reported with this question.

  3. No issues reported with this question.

  4. No issues reported with this question.

  5. There was some general confusion about the definition of “trauma sensitivity” and, again, there was very little evidence of respondents referring back to the glossary. Five respondents (R-A, R-F, R-M, R-O, and R-R) were uncertain about the definition. R-O noted that he did not know how to define trauma sensitivity; he thought it could mean sensitivity to others in the workplace, and because his school does not offer specific training on that, he answered “no.” He said it might be a term that he does not know. R-M said that when he thinks about the word trauma, he thinks about a “medical” issue, not general safety.



Discussion

Based on respondent confusion over “trauma sensitivity,” NCES and AIR felt this sub-item should undergo further refinements (such as sub-item examples and a better definition) and testing before being included on the questionnaire.

FINAL RECOMMENDATION: Drop sub-item 10e from SSOCS:18 and consider additional testing for SSOCS:20.


Item 11

Item stem and response categories

Generally, respondents understood this question; however, the interpretation of “arrest” varied. Most respondents said an arrest involves law enforcement and the arrestee being taken off the school premises in handcuffs due to illegal activity. Very few respondents reported an arrest at their school, and most said they would definitely know if an arrest had occurred at their school, as they would be notified immediately. A few respondents also reported having a formal system of logging these instances (e.g., a discipline database). Two respondents (R-B and R-E) said they did not know how many arrests had occurred at their school; R-B said he would have to ask his school security, and the other respondent said they have no means of recording that data.


R-E said that he was familiar with the database in which incident records are kept; however, because arrests are made by an outside agency at its discretion, incidents at school may not result in an immediate arrest. Even when the school knows a detention has occurred, if there is no evidence there is no formal arrest, even though from the school’s view an arrest happened. The respondent thought that reporting on this would lead to meaningless statistics for the school and suggested changing the question from an open-ended format to meaningful response categories (e.g., 0-49, 50-100, etc.). The respondent knew he could not answer “none” because arrests have occurred, but he could not give an accurate answer based on how the incidents are recorded.



Discussion

NCES and AIR agreed to use response categories for SSOCS:2018 instead of an open-ended response, based on feedback gathered from respondents. Categories recommended for inclusion were based on responses to the arrest count item that was included on the SSOCS:2016 questionnaire. NCES also recommended that information such as 2015–16 Civil Rights Data Collection (CRDC) arrest data and further input from principals on how they keep administrative records should be taken into consideration for potential revisions to this item for SSOCS:2020.

FINAL RECOMMENDATION: Revise item response categories.


FINAL ITEM:


Item 12


Issues Identified

Item stem and response categories

Respondents generally did not have issues with this question. Most respondents defined sexual misconduct as inappropriate contact (physical or verbal).

One respondent (R-G) reported “No” initially, then recalled an incident involving texting between a bus driver and a student. However, the respondent said they would not count that incident because the question asked about a “staff member at their school” and the bus driver was a district employee, not a school employee.

Though all respondents answered “No” to this question, a number of respondents were adamant that they would absolutely know if this happened because there would be records, an investigation, a hearing, and/or someone getting fired. Several respondents framed their response as “Not to my knowledge,” “Not that I’m aware of,” “Not that has been brought to my attention,” etc.



Discussion

Respondent feedback yielded no major comprehension issues with the wording of this item. Because no other federal surveys ask about this topic, respondents indicated they would absolutely know if these incidents had occurred, and testing indicated that the item did not add a significant amount of time burden for respondents, NCES and AIR recommended retaining the item for the SSOCS:2018 questionnaire. To address respondent confusion about whether the item’s scope included staff employed by the district, rather than only those employed by the school, NCES and AIR recommended adjusting the language. The revised language attempts to clarify that respondents should report on sexual misconduct involving any staff affiliated with the school, rather than only school-employed staff members.

FINAL RECOMMENDATION: Revise stem.


FINAL ITEM:




Item 13


Issues Identified

Item stem and response categories

Three respondents (R-A, R-P, and R-R) mentioned having issues distinguishing between frequency options. Two respondents (R-A and R-P) had difficulty defining and choosing between “happens on occasion” and “never happens” for question 13.

The majority of respondents noted that they were reporting based on what they thought and perceived, rather than on formal reports. They did not feel there had to be “evidence” of the problems for them to know the problems exist. Respondents noted that students may not often report these instances formally (they may think it is “normal behavior”), but respondents knew these problems were happening. However, because respondents do not know as well as the students when these problems happen, it was somewhat difficult to pick frequency categories with a high degree of certainty, especially for problems that do not happen very often.

R-P mentioned that all of these areas (a-f) would be hard to calculate unless they ended in disciplinary action. The respondent was not sure whether a principal would have a good gauge of this on a daily basis; specifically, the respondent was not sure how accurately a principal would be able to assess the frequency, especially in a very large school.

Sub-items
    a. Three respondents (R-K, R-O, and R-P) mentioned issues quantifying racial/ethnic tensions, reporting that such tension does not seem “measurable.”


R-K said that “tension” is very hard to quantify and needed to loop back around to that sub-item. She said it is possible to answer about how often a student expresses racial/ethnic tensions, but she is unable to know the frequency if it is not directly communicated. The respondent decided that she would answer this as “happens at least once a month” and said that she was thinking about comments that are racially or ethnically based, though she did not know whether these count as tensions. If it is a first-time incident for a student, the student might just get a call home or a warning, and the principal would not know about it.


    b. No issues were reported with this question.


    c. No issues were reported with this question.


    d. One respondent (R-B) reported having difficulty defining gender identity.


The respondent did flip to the glossary page but did not seem to see the gender identity entry, since she continued to say she would want a definition; a student may not explicitly identify as another gender but may still be harassed for it. The respondent mentioned a student who likes hanging out with the girls and joining the fashion club but has not come out to say he identifies a different way, yet he still gets ragged on.


    e. No issues were reported with this question.


    f. One respondent (R-S) did not know whether to include mental disabilities (e.g., ADHD or ADD) in his understanding of this question. Three respondents specifically noted that they were not limiting their answers to physical disabilities but were also including learning disabilities.




Discussion

NCES and AIR recommended no change to sub-item 13a, as this item has a long history on the SSOCS questionnaire and there is a preference to retain it as is for trend comparisons. Based on respondent feedback on sub-item 13f, NCES and AIR recommended adding parenthetical examples to clarify the scope of “disability” in that sub-item.


FINAL RECOMMENDATION: Add examples to sub-item 13f.

FINAL ITEM:





SSOCS Survey Materials


Beyond testing on new and modified survey items, the SSOCS cognitive interviews included an additional set of questions to measure participant responses to select communication materials and physical components of the survey package — the advance letter, the survey cover letter, the brochure, the FedEx envelope, and the free pen.


Note: Due to time constraints, 4 of 19 participants did not review and answer questions regarding the survey materials during their interviews. A total of 15 participants responded to the survey materials questions.

Survey materials
Willingness to complete the survey
    • Twelve respondents reported that time issues/burden would affect their decision to participate or not.

      • Two respondents specifically mentioned 10-15 minutes, at most, as the appropriate amount of time.

    • Eight respondents mentioned an incentive or small gift as helpful.

    • Six respondents mentioned the topic (crime and safety) as important to their participation. They said it was relevant to their school.

    • Three respondents mentioned the length of the survey as a deterrent to completing it.



  1. Advance and cover letters

    • Six respondents expressed negative perceptions concerning the letters.

      • Two respondents mentioned wanting to know from the letter if there was an incentive.

      • Two respondents mentioned the letters being too long.

      • One respondent mentioned the letters as too “busy.”

      • One respondent mentioned wanting a deadline or timeline for when to have it completed.


  2. SSOCS brochure

    • Six respondents shared positive perceptions concerning the brochure.

      • Two participants said they were more likely to read the brochure than the letters.

    • Five respondents recommended improvements to the brochure.

      • Four respondents (R-B, R-E, R-K, and R-R) mentioned wanting more meaningful (district-based) statistics.

      • One respondent mentioned it being too “busy.”


  3. Physical mailing/incentive

    • How participants would like to receive items:

      • Seven respondents mentioned preferring email or receiving the survey in digital form.

      • Three respondents mentioned addressing the package to a specific person, not just to the school.

      • Three respondents mentioned that having the package signed for would heighten its importance.

      • Three participants mentioned a follow-up call about the survey as being helpful.

      • Two respondents mentioned that the survey being sent by the district or their supervisor would make it a priority.

      • Respondents mentioned that they would prefer the questionnaire be sent via FedEx (over another carrier like USPS); they would be more likely to open and complete the survey.



  4. Usefulness of data collected

      • Seven respondents mentioned wanting examples of how the data has influenced policy in the past.

        • For example, one respondent said he wants to know about how information from this survey changes law somewhere or how it was used; this would make him more likely to fill it out. He said that principals are in a field where they want to make a difference and “you should show us” how the survey results actually get translated into action.



Discussion

NCES and AIR discussed a number of recommendations for potential changes to address respondent feedback on the survey materials. Recommendations included:

  • Consider revising, shortening, and streamlining the advance and cover letters to seem less burdensome to participants.

  • Consider adding a specific “due date” to the cover letter. Current language asks schools to return the questionnaire within two weeks; however, principals may not remember when they received the package or may be more likely to mark or remember a specific date.

  • Consider revising the brochure to be more targeted to school characteristics.

  • Consider including some results from the prior survey collection in the letter included in the questionnaire package – we provide this information in a reminder letter, but it may pique the principal’s interest more to have it up front.

  • Consider ways to reduce time burden. Consider modules or matrix sampling so each respondent receives a shorter questionnaire (see the sketch following this list).

  • Consider enhancements to the delivery method and reminders to increase response. Suggestions from respondents include several things SSOCS already does, such as sending the survey addressed to the principal by FedEx and making follow-up calls. While principals are contacted by email in advance of the survey mailing, historically the questionnaire has only been available on paper, and there has not been an option to fill out the form electronically. NCES is already planning to test a web instrument, which could facilitate this switch to electronic outreach and response. For paper surveys, consider using a delivery method that requires a signature to signify importance.

  • Consider if there is a way for superintendents (who have already been notified that schools in their district will receive SSOCS) to pass along the survey or reach out to schools to let them know the survey is coming and that the district supports voluntary participation. This may be difficult, as an added burden to superintendents, but even if some were willing to do so it could increase schools’ perceptions that they should complete the survey.

  • Consider retaining the SSOCS free pen or another small incentive, as respondents indicated this may have some effect on their willingness to complete the survey.
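
To illustrate the matrix-sampling idea mentioned above, the sketch below randomly assigns each sampled school a small core module plus one rotating module, so no single respondent answers the full questionnaire. This is a minimal, hypothetical Python illustration: the module names, module groupings, and function name are invented for this example and do not reflect the actual SSOCS instrument design or any NCES sampling code.

import random

# Hypothetical matrix-sampling sketch: every school answers a core module
# plus one randomly assigned rotating module. Module names are invented.
CORE = ["incidents", "disciplinary_actions"]
ROTATING = [
    ["school_practices", "security_staff"],
    ["mental_health_services", "staff_training"],
    ["parent_involvement", "limitations_on_prevention"],
]

def assign_modules(school_ids, seed=2018):
    """Assign each sampled school the core module plus one rotating module."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
    return {sid: CORE + rng.choice(ROTATING) for sid in school_ids}

# Example: print assignments for a small illustrative sample.
for school, modules in assign_modules(["school_%04d" % i for i in range(6)]).items():
    print(school, modules)

In a design like this sketch, each respondent answers only a fraction of the full item set, while every item is still administered to a random subset of schools across the sample.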


FINAL RECOMMENDATION: Based on feedback from SSOCS:18 cognitive testing and focus groups with principals that were conducted separately, NCES and AIR made several modifications to the outreach materials to better target schools and increase response rates. These changes included:

  • Streamlined the text of outreach letters by moving language on legislation authorizing NCES to collect the data to the footer;

  • Added additional findings from the 2015–16 data collection to select letters;

  • Added language discussing how SSOCS data may inform the creation of programs and policies within federal agencies;

  • Moved references to NCES and the Department of Education to more prominent placement within letters and emails.

Additionally, the SSOCS:2018 data collection will include two experiments aimed at improving response rates (a hypothetical assignment sketch follows this list):

  • Approximately 1,150 cases from the sample will be randomly selected to receive a pilot web-based instrument;

  • Approximately half of the entire sample will be randomly selected to receive a $10 incentive as part of the initial mailout.
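
The sketch below shows one way such a split-sample assignment could be drawn. It is a hypothetical Python illustration, not the actual SSOCS sampling code: the web-pilot count follows the text above, while the total sample size, function name, and independence of the two draws are assumptions.

import random

def assign_experiments(school_ids, n_web_pilot=1150, seed=42):
    """Draw the web-pilot and incentive groups from the full sample."""
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    web_pilot = set(rng.sample(school_ids, n_web_pilot))
    incentive = set(rng.sample(school_ids, len(school_ids) // 2))
    # The two treatments are drawn independently here, so a school may
    # receive the web pilot, the $10 incentive, both, or neither.
    return web_pilot, incentive

# Example with an assumed total sample of 4,800 schools.
sample = ["school_%04d" % i for i in range(4800)]
web, incentive = assign_experiments(sample)
print(len(web), "web-pilot cases;", len(incentive), "incentive cases")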


Appendices – please see PDF “Part C SSOCS 2018 Cognitive Interview Report Appendices”



C4. School Survey on Crime and Safety (SSOCS) Report of Focus Groups Among Elementary, Middle School, and High School Principals



O V E R V I E W



The School Survey on Crime and Safety (SSOCS) is a national survey of elementary and secondary public school principals. SSOCS collects information on school safety, including the frequency of school crime and violence, disciplinary actions, and school practices related to the prevention and reduction of crime. SSOCS is one of the nation’s primary sources of school-level data on crime and safety. Sponsored by the National Center for Education Statistics (NCES), it has been administered six times since 2000 to nationally representative samples of schools, most recently during the 2015–16 school year, and it will be conducted again in the spring of the 2017–18 school year. For the SSOCS 2016 collection, completed questionnaires were received from about 2,100 public schools, for an unweighted unit response rate of 60%. NCES hopes to improve upon this response rate for the 2018 SSOCS by better understanding factors that motivate schools to complete (or decline to complete) the SSOCS questionnaire.


O B J E C T I V E S



On behalf of NCES, Hager Sharp conducted focus groups with school principals to better understand both the barriers and benefits schools tend to associate with participation in surveys like SSOCS and to identify communication strategies that will help overcome those barriers to participation. The information from the focus groups will guide recruitment strategies and materials development for SSOCS 2018.


M E T H O D O L O G Y



Hager Sharp conducted three online focus groups with public school principals in various states and school districts to understand their perceptions of SSOCS and how they would respond if invited to take the survey. A recruitment vendor selected a randomized sample of principals to ensure diversity of geography, level (e.g., elementary, secondary), school size, and the percentage of students eligible for the National School Lunch Program (NSLP). We conducted the groups using the WebEx platform, and all principals received SSOCS background materials in advance, including a brochure about SSOCS, a letter from the Department of Education, and a sample survey questionnaire. We discussed these materials as well as general perceptions about SSOCS and other education studies. Hager Sharp analyzed the discussions for themes, including areas of similarities and differences among participants.


Executive Summary


H I G H L I G H T S


P R I O R K N O W L E D G E O F S S O C S


None of the elementary school principals had heard of SSOCS before receiving the materials we sent them. Only one of the middle school principals had heard of SSOCS before receiving our materials. He was a security officer in a school prior to becoming a principal. Among high school principals, a quarter of the participants had heard of SSOCS; the rest had not heard of the survey before receiving materials.


I N I T I A L R E A C T I O N T O S S O C S


Elementary School Principals

  • After briefly looking at materials about SSOCS, a majority of participants said they would be willing to take the survey, citing the value of information for researchers, policy makers, and education decision makers. They said they would be particularly interested in participating if they perceived a benefit for students and schools.

  • Participants reiterated a desire to receive the results of the survey after it is completed and analyzed, particularly if it provides comparative data between their school and others across the nation.

  • Several participants cited the importance of the Department of Education brand as a motivating factor for completing the survey. They indicated they would be more likely to pay closer attention to materials with the Department of Education seal on them.

  • Participants cited ease of completing the survey as a motivating factor, with most voicing their desire to complete the survey electronically rather than through a printed questionnaire.

  • Several participants indicated that they dislike questions that ask them to estimate the percentage of school participants or time spent on activities and programs. They prefer yes/no questions.

  • Other participants - particularly those who identified themselves as principals of smaller schools - indicated having no challenge in estimating the percentage of time spent on activities or the percentage of participation from the school community.

  • One participant suggested a monetary incentive for completing the survey would be motivating.

  • One participant noted that the decision to complete surveys is sometimes made at the district level rather than at the level of an individual school. The school is much more likely to complete the survey if the district requests or endorses it.


Middle School Principals

  • A majority of participants indicated they would probably not participate in the survey because of time constraints and the perceived lack of relevance to their specific school. A majority of participants expressed concern over the length of time it would take to complete the survey and review accompanying materials.

  • In considering whether or not they would participate in SSOCS, principals would want to know who would see the data, how the data would be shared with others, and whether or not they would receive the data or a results report after participating. Several participants said they would want to see the results of the survey, including a comparison between similar schools and themselves.

  • Many participants emphasized that the timing of a survey in spring is the worst because of testing. They would be more likely to complete a survey during down time in early summer, if they completed it at all.


High School Principals

  • Many principals cited their willingness to participate in SSOCS because it requires only their time to complete and potentially some clerical hours. In contrast, surveys that require participation from students and/or teachers are more difficult to conduct and usually require approval from the district.

  • Some participants mentioned they would be willing to participate because the survey comes from NCES.

  • Several participants mentioned they would be more willing to take SSOCS if the materials indicated clearly and initially the amount of time required to complete the survey.

  • A majority of participants noted that early summer is an ideal time to receive the survey and ask for participation from principals, as it is the least busy period of the year. Spring is not ideal because of sports activities, testing, and other activities.


R E A C T I O N S   T O   S S O C S   B R O C H U R E


Elementary School Principals

  • While principals found the brochure generally useful, several suggested it needs more information about how the data will be used and if they will receive a report of survey results.

  • Participants also noted that the brochure could be rearranged to feature more pertinent information about what the survey is, how it will be used, and how much time it will take to complete at the beginning of the brochure or in the first two sections.

  • Several participants pointed out that the brochure is very wordy and they prefer shorter, more concise materials that use bullets so they can quickly skim and get the key points they need.

  • One participant noted that the images on the brochure don’t necessarily indicate that the survey is on school safety. She suggested the visuals should be more focused on students to remind principals of who they are trying to protect.


Middle School Principals

  • Participants reiterated that materials would need to quickly and clearly explain why the survey is important to them and what the benefit would be for their school in order to encourage their participation.

  • Participants emphasized the need to see the value of the survey and its data for their school to motivate them to participate. Others mentioned that an incentive would motivate them.

  • Some participants indicated they would probably not look through the brochure or materials and would be unlikely to participate in the survey.

  • Participants would be more likely to complete the survey if it contained shorter, simpler yes/no questions and didn’t require time spent gathering data.

  • One participant noted that he would be unlikely to believe that the data would be used by federal policymakers to inform education policy.


High School Principals

  • Many participants liked the brochure and thought it gave many examples and reasons to participate in SSOCS.

  • Participants would like to know more about how results from the survey would be disaggregated down to school size so they could compare themselves to similar schools across the country. Comparison is not meaningful for them if it’s not “apples to apples.”

  • Some participants recommended tailoring the brochure to different sizes of schools. Several noted that the images in the brochure call to mind a large, urban school. Rural school principals may see the brochure and think that their data would not be relevant to the study because of this perception.


R E A C T I O N S   T O   S S O C S   L E T T E R   F R O M   T H E   C O M M I S S I O N E R



Elementary School Principals

  • Nearly all participants emphasized that shorter, concise, bulleted letters are more helpful and more likely to be read by them than a longer letter.

  • Participants want to know why the survey is being conducted and how the information and results will be used.

  • Several participants emphasized the need to explain within the letter how long the survey will take them to complete, as this will encourage them and make it more likely for them to complete the survey.

  • Participants cited the list of endorsements as helpful and valuable to them; it reassures them of the survey’s validity and value, particularly if they are members of the endorsing organizations.

  • Participants noted again that it is important to have the Department of Education logo and branding on the letter and envelope, as it would capture their attention, make it less likely for them to throw the survey away, and increase the likelihood that they would complete the survey.

Middle School Principals

  • Several participants said the letter is too long and wordy. They would prefer it say up front and concisely what the survey is, how long it would take to complete, and how it would benefit their school.

  • Participants indicated that the letter addressed from the Department of Education on official letterhead would not persuade them to complete the survey.

  • Participants did not feel the endorsements would encourage them to participate, unless the survey is endorsed and mandated by their district or state officials.

  • A majority of participants indicated that a monetary incentive would improve the chances that they would complete the survey; principals from rural schools especially noted that financial incentives would greatly increase the chances of their participation.


High School Principals

  • Several participants noted that the letter made participating in the survey sound very optional and they would be less likely to participate in SSOCS after reading it.

  • A majority of participants wanted to know how the data would be used and how they could use the data in their schools. Understanding the “what’s in it for me?” would encourage them to participate. Participants emphasized the importance of having a clear, concise message at the beginning of the letter that explains why the survey is important and what the benefits are to their students and school for participating.

  • Some participants noted that they did not notice the list of endorsers on the letter. Other participants mentioned that certain endorsers like AFT and NEA actually discourage them from participating in SSOCS. They suggested it would be better to customize the letter for different principal levels and include endorsers that are most pertinent to the principal at the top.

  • One principal noted many of the questions in the survey are questions that they also report for state studies, and wanted to know why the state and federal government aren’t sharing this data to reduce redundancy.

F O R M A T P R E F E R E N C E S


Across all groups of principals, all but one principal would prefer to receive and complete the survey online.

P E R C E I V E D V A L U E O F O T H E R I N F O R M A T I O N A L M A T E R I A L S


Across all groups,

  • Several respondents suggested it would be valuable and interesting to watch a video that shared or explained survey results rather than a video that gave an overview of SSOCS. Participants felt that a letter and brochure were sufficient to provide them with overview information about the survey.

  • Participants generally did not see much use for a promotional or overview video, although some indicated that they would share a video with staff at professional development meetings or email it to them along with information about the survey.

  • Participants said they would be interested in and willing to view a video only if it came to them electronically. They would not be willing to go online and watch a video if they received the link on paper materials through the mail. Participants also indicated that they would not want to watch a video longer than 2-3 minutes.


P R E F E R R E D   S O U R C E S   O F   I N F O R M A T I O N   O N   E D U C A T I O N



Elementary School Principals

To keep up with trends in education, participants cited trade outlets like Education Week, Principals magazine, and Edutopia as sources for news. Many participants mentioned they use Twitter and Facebook groups as sources of information and news. The principals also indicated they receive information about studies, legislation, and programs through their district’s main office or their state’s Department of Education listservs and email newsletters.

Several participants also cited professional associations such as the National Association of Elementary School Principals or local chapters of professional organizations as their source for education news. One referenced “a local principals group that meets in [my state] as well as the School Administrators of [my state].” Several principals listed national news outlets such as the Washington Post, New York Times, and National Public Radio as sources of news and information on education and national, state, and local policies and legislation that may affect them.

Middle School Principals

A majority of participants indicated receiving weekly or daily newsletters, updates, and digests from their district-level administrators or state superintendents. Most also indicated receiving news from education associations, including the National Association of Secondary School Principals, the National Education Association, and others. Participants cited using outlets including Education Week, SmartBrief, and National Public Radio for broader information on education issues. Some participants noted finding information through Twitter.

High School Principals

Many participants cited newsletters from professional organizations, including ASCD and NASSP national and state chapters, and school administrator associations as their sources for education news outside of their district. A number of participants also mentioned daily or weekly emails from their district or state’s Department of Education as a source of news for them. They also mentioned publications including Educational Leadership and School District Leader magazine as news sources.

C O N C L U S I O N S A N D R E C O M M E N D A T I O N S



Conclusion: Responses in all three groups suggest a low awareness of SSOCS among principals. Raising awareness about SSOCS in the broader education community may increase perceived importance and perceived usefulness of the survey, and in turn, increase response rates.

Recommendation: Raise awareness of and support for SSOCS using the channels identified by the principals as their trusted sources for news and information about education, including trade publications, professional associations, state- and district-level departments, and relevant social media platforms. The endorsing organizations listed in the Commissioner’s letter could use their in-house channels to educate the community about SSOCS and its importance.

Conclusion: Across all groups, principals expressed a desire to receive the results of the survey, particularly if it provides comparative data between their school and others across the nation. Some suggested releasing the results through a video in addition to a report.

Recommendation: Consider releasing the results of SSOCS in the form of a report and accompanying video, with broad dissemination through the education community using the trusted news sources that principals identified.

Conclusion: Most participants cited the importance of the Department of Education brand as a motivating factor for completing the survey. They indicated they would be more likely to pay closer attention to materials with the Department of Education seal on them.

Recommendation: Make the Department of Education seal more prominent on the outside of SSOCS packages going to principals, and maintain the prominence of the seal in the Commissioner’s letter. Consider making the seal more prominent in the brochure.

Conclusion: Participants cited ease of completing the survey as a motivating factor, with most voicing their desire to complete the survey electronically rather than through a printed questionnaire.

Recommendation: Consider switching to an online format to replace the printed questionnaire.

Conclusion: Many participants emphasized that the timing of a survey in spring is the worst because of testing. They would be more likely to complete a survey during down time in early summer.

Recommendation: Consider switching the timing of the survey to late May/early June, when principals are less busy.

Conclusion: Participants suggested the SSOCS brochure could be rearranged to feature more pertinent information first, including what the survey is, how it will be used, and how much time it will take to complete. They also want information about how they can get a copy of the results of the survey.

Recommendation: Reorganize the SSOCS brochure to feature the most pertinent information more prominently, including what the survey is, how it will be used, and how much time it will take to complete.

Conclusion: Some participants recommended tailoring the brochure to different sizes of schools. Several noted that the images in the brochure call to mind a large, urban school. Rural school principals may see the brochure and think that their data would not be relevant to the study because of this perception.

Recommendation: Include images and information that are relevant to a variety of types of schools, including rural schools, to reassure them that the survey is relevant for them. Include messages about the representativeness of SSOCS to reassure rural schools that they will be able to compare their school to others that are similar in size and environment.

Conclusion: Regarding the letter from the Commissioner, several participants said the letter is too long and wordy. They would prefer it say up front and concisely what the survey is, how long it would take to complete, and how it would benefit their school.

Recommendation: Streamline the letter to include the essential points in the first paragraph.

Conclusion: Participants emphasized the importance of having a clear, concise message at the beginning of the letter that explains why the survey is important and what the benefits are to their students and school for participating. Many thought the letter lacks urgency; it seems “too optional” and doesn’t make a strong case for why principals should take the time to complete it.

Recommendation: Add urgency to the letter by describing why the survey is important and what the benefits are for schools.



Elementary Principals Focus Group: Tuesday, January 31, 2017, 8:00–9:30 p.m.


Characteristics of Participants



Seven elementary school principals participated in the group. A summary of their school characteristics is as follows:


Gender of Principal | Public or Private School | School Size | School Community | Percent Eligible for NSLP
Female | Public | Small | Rural | 0-25%
Female | Public | Medium | Urban | 0-25%
Male | Public | Medium | Suburban | 51-75%
Male | Public | Medium | Rural | 26-50%
Male | Public | Medium | Suburban | 51-75%
Male | Public | Small | Rural | 26-50%
Female | Public | Small | Rural | 26-50%


Detailed Responses



Awareness of Education Surveys and Studies in General


Let’s start by talking broadly about education studies involving samples of primary or secondary schools, staff, and students. Are you aware of any surveys or studies like that? Do any studies come to mind?


When asked about their awareness of and experience with education surveys and studies in general, all principals had some experience with education-related surveys or studies. Several mentioned they had received surveys from PhD candidates and university programs and had taken the time to participate. Most complained that they never see the results of surveys and studies after they complete them. Verbatim comments on this topic include:

    • “I get a lot of University PhD candidates who send out surveys. I also get some through listservs and different college prep programs. As far as something like [SSOCS], I have not participated in anything close to that. But I am accustomed to doing a lot of surveys for colleges.”

    • “We get a lot from graduate students. I have seen some surveys specifically about the state that I live in, about outdoor service centers and afterschool programs. But I do not get to see the feedback from these surveys, which would be nice.”

    • “I receive surveys from universities or national groups that ask a lot of questions about parent involvement rather than information about principals or teachers. We don’t get feedback or see the results; you fill out the survey and never know what comes from it.”


Others commented that they had been asked to take surveys about various topics, including parent engagement, satisfaction with textbooks and curriculum materials, and principal effectiveness. Verbatim comments on this topic include:

    • “We get a lot of surveys about principal effectiveness, the duties of the principal. I have participated in these.”





    • “I have done 9-10 of them in the past 18 months, surveys from curriculum programs or textbooks. We have also all given surveys to our students, like the Gallup survey, which is also given to school staff.”

Most participants indicated a willingness to complete surveys if they are “short and easy to participate in,” particularly online surveys. One expressed a dislike of “vague scales,” including some five-point Likert-like scales. Most indicated they would like to see the results of the surveys they participate in.


Prior Knowledge of SSOCS


Let’s talk more specifically about a study called the School Survey on Crime and Safety, or SSOCS. Prior to receiving the materials we sent you, what, if anything, had you heard or did you know about SSOCS?


None of the principals had heard of SSOCS before receiving the materials we sent them.

Initial Reaction to SSOCS


For the following discussion, I’d like you to refer to the packet of information about SSOCS. Please open it up at this time. [If you don’t have your packet, please refer to the electronic versions on your screen.]

At this point, what are your thoughts on SSOCS? Would you take this survey if it came to your school? What would motivate you to take the survey?


After briefly looking at materials about SSOCS, a majority of participants said they would be willing to take the survey, citing the value of information for researchers, policy makers, and education decision makers. They said they would be particularly interested in participating if they perceived a benefit for students and schools. Verbatim comments include:

    • “Yes, I would take it. I would like to see how we compare to other schools around the nation and how this information will be used by people who make decisions that improve situations for kids. I want to see the data coming from the survey.”

    • “Yes, I agree, and it’s a good prompt to do some reflecting as to how we are doing as a school. School safety is one of the most important things we do for kids, and if this will help legislation-wise or help people who are making decisions about education policy then it’s great and I would help.”

    • “Yes, if someone is taking the time to send a survey, then it’s usually for a good reason - a valid reason. Having data is important for the system as a whole.”



Participants reiterated a desire to receive feedback or the results of the survey after it is completed and analyzed, particularly if it provides comparative data between their school and others across the nation. Verbatim comments include:

    • “Offering some of the results once the data has been compiled [would motivate me]; if I knew I would get the results, I would be more likely to complete this.”


Several participants cited the importance of the Department of Education brand as a motivating factor for completing the survey. They indicated they would be more likely to pay closer attention to materials with the Department of Education logo on them. Verbatim comments include:

    • “I might take it. If it looked like a piece of junk mail, I’m probably throwing it in the garbage. If it’s optional and I open it and it’s several pages, it’s very likely to go in the garbage. If it looks like something that’s federal or from NCES - that I’m probably required to do - then I am likely to fill it out. It helps if the packaging looks official. NCES is considered a credible source. Most of the time I am not going to take seriously a lot of things I get in the mail.”

    • “Because it’s coming from the U.S. Department of Education and it’s probably not that long, I would complete it.”


Participants cited ease of completing the survey as a motivating factor, with a majority voicing their desire to complete the survey electronically rather than through a printed questionnaire. Verbatim comments include:

    • “I get a lot of surveys through my email. I’m a lot more likely to complete a survey if it’s electronic, and more apt to ignore mail.”

    • “If it was sent electronically, there’s a 90% chance that I would fill it out. It is easier to respond to that than a paper document.”

    • “Whether or not it looks official, how much time will it take to complete the survey [is a key factor]; if it’s only 5-10 minutes, I’ll probably take the time to do it. If it’s lengthy, then I’m less likely to. If it were electronic this would help make it more likely that I would complete it. The one thing is that we do submit a lot of the same information; the state collects some of the same information. On some of these I think, gosh, don’t they already have this somewhere?”


One participant suggested a monetary incentive for completing the survey would be motivating: “Money is nice too.”

Reactions to SSOCS Brochure


Let’s talk specifically about the brochure about SSOCS. Does this look like it has the information you would want to know about SSOCS? Do you think there is anything that might be missing? Does the brochure answer the questions that you might have?


While principals found the brochure generally useful, several suggested it needs more information about how the data will be used and if they will receive a report of survey results. Verbatim comments include:

    • “It covers most of the things I would want to know about, but I would want more information about how the data will be used and if I will receive feedback or results from it.”

Participants also noted that the brochure could be rearranged to feature more pertinent information about what the survey is, how it will be used, and how much time it will take to complete at the beginning of the brochure or in the first two sections. Verbatim comments include:

    • “I’m going to want to know first why my school was selected - why am I getting this is the very first thing I want to see. The stuff on the right-hand side of the brochure is most important - I would flip it around. The least important stuff to me is about the $250,000 fine; I’m going to assume if it’s the federal government that you aren’t going to give my information away. I would put that info at the very end, in the last pane.”



Several participants pointed out that the brochure is very wordy and they prefer shorter, more concise materials that use bullets so they can quickly skim and get the key points they need. Verbatim comments include:

    • “This brochure is very wordy. I’m not going to take the time to read this entirely; I need to skim it quickly. I like bullets and big headings so I can quickly see what I am getting. And I’m not sure if the pictures are appealing for what you are getting at with the topic.”

    • “I think the topics that are covered in the questionnaire are helpful. I have no issue with the brochure - it’s a little wordy; there’s a lot of text in there. But that’s probably better, given the complexity of the survey you’re about to take. I might not read it during the day but I’d probably take it home to read and review.”

    • “I think the brochure looks fine; it gives enough information and I like the resources listed. It looks like a lot of words, but it’s really quick reading. I would look at it – I can’t say I’d read every word, but I’d definitely take a look at it. I like the visual layout; I like some of the red background pulling my eyes to places.”

One participant noted that the images on the brochure don’t necessarily indicate that the survey is on school safety. She suggested the visuals should be more focused on students to remind principals of who they are trying to protect. Verbatim comments include:

    • “What types of visuals would be a better fit… I guess I’m confused by what would be better. The cameras on the school building - this rubs me wrong. I would rather see more pictures of kids themselves. When I think of crime and safety, I’m thinking of the students first. When I see these visuals, I think of data remaining confidential and school camera data, not answers in a survey.”

    • “I think the brochure looks fine. It’s hard to make a brochure when you’re talking about crime and safety in schools - I think that’s a tough one to find good visuals for.”


Reactions to SSOCS Letter from the Commissioner


What about the information included in the letter? Does the letter give you all the information that you need in a letter about SSOCS? Are the logos prominent enough to convey this is from the Department of Education? What do you think of the endorsers?


Nearly all participants emphasized that shorter, concise, bulleted letters are more helpful and more likely to be read by them than a longer letter. Verbatim comments include:

    • “As far as purpose, it sounds clear. Sometimes simple is better; if it’s bulleted and streamlined, it’s helpful. Short is always better.”

    • “Bold face is really good and gets attention.”


Participants want to know why the survey is being conducted and how the information and results will be used. Verbatim comments include:

    • “This contains the information I would need in a letter about a survey or a study. It gives me the who, what, why, and the reason for doing it. It would be nice to know more about how it will be used - is it for legislation, policymaking, etc.”


Several participants emphasized the need to explain within the letter how long the survey will take them to complete, as this will encourage them and make it more likely for them to complete the survey. Verbatim comments include:

    • “It should say in the letter or somewhere in the brochure how long the questionnaire/survey will take to complete. I quit surveys halfway through if it seems too lengthy.”

    • “I think it’s good to have a deadline for the survey; educators love a deadline. I would not do it if it were more open-ended; I would put it in a stack to never see again. The two-week time frame seems reasonable - without looking through the questionnaire in-depth, I think anything is probably possible in two weeks.”

Participants cited the list of endorsements as helpful and valuable to them; it reassures them of the survey’s validity and value, particularly if they are members of the endorsing organizations. Verbatim comments include:

    • “The different groups backing it on the side is helpful; I could do a little research on them and ease my mind a bit on who is supporting this.”

    • “I like the ‘endorsed by’ list on the side of the letter.”

    • “I think it’s helpful to have a list of endorsers, but I don’t think it would cause me to take it or not to take it. It’s nice to have, but not going to impact my decision.”


Participants noted again that it is important to have the Department of Education logo and branding on the letter and envelope, as it would capture their attention, make it less likely for them to throw the survey away, and increase the likelihood that they would complete the survey. Verbatim comments include:

    • “I like the fact that it’s got the Department of Education logo at top as well as organizations endorsing it on the side that I am a member of.”

    • “I would pick up the letter with the logo of the Department of Education first, and I would be more likely to read it than the brochure.”

    • “I have never heard of SSOCS before, and having the Department of Education emblem there always catches my attention. I’ve never received anything from the Department of Education in the mail before.”


Feedback on SSOCS Sample Questionnaire


Let’s look at the questionnaire. Does it seem reasonable? Do you think it would be easy or difficult to fill out?


Several participants indicated that they dislike questions that ask them to estimate the percentage of school participants or time spent on activities and programs. They prefer yes or no questions. Verbatim comments include:

    • “I strongly dislike questions that ask me to estimate time spent on specific activities. The percentage questions are hard. I would not be very interested if I had to fill it out that way. Yes/no questions would be easier. Percentages are something you’d have to sit and figure out versus yes or no questions. Things you have to go and search for or figure out statistically would turn people away. If you gave them a range to come from, that might work, but it’s very hard to have percentages of time or activities.”

    • “I have [several] schools I oversee. I could get a rough estimate of these percentages, and not have any hard data necessarily. It would be a huge guesstimate as to how many parents participate in open house - that’s organized chaos where we don’t track attendance closely. We do track parent/teacher conferences. I’m not sure that the school would have that data. It would be difficult to quantify certain things being asked in the survey.”


Other participants - particularly those who identified themselves as principals of smaller schools - indicated having no challenge in estimating the percentage of time spent on activities or the percentage of participation from the school community. Verbatim comments include:

    • “I am used to filling things out like this, in percentages, to get an estimate. What are you trying to find out and what are you doing with it? I don’t have a problem with giving estimates or percentages for things. As principal of a smaller school, I know what’s going on there - I’m involved in all of it.”

    • “I think it looks okay; I could estimate easily.”

    • “I agree - I think the questions are pretty straightforward and I could give percentages and good estimates offhand.”


Perceived Value of Other Informational Materials


Would it be helpful to have a video or PowerPoint presentation available about SSOCS? Would you view these? Would you use them to show others what SSOCS is about?


Participants said they would be interested in and willing to view a video only if it came to them electronically. They would not be willing to go online and watch a video if they received the link on paper materials through the mail. Participants also indicated that they would not want to watch a video longer than 2-3 minutes. Verbatim comments include:

    • “I would probably watch a video - a short one. But if you’re mailing me a package through mail, I’m not going to then go online and watch a video. If I’m getting mail, I want a letter from someone who is important that gives me a reason not to toss the mail in the garbage. I would only click on a link to a video in email.”

    • “I would only watch a video in email. If the letter is good, then I would be more likely to read that. I would love a nice letter. If a letter does the topic justice, I probably wouldn’t watch the video. However, the way the world is, everyone has a video. If you didn’t have one, someone would probably want to know where it is.”


Several respondents suggested it would be valuable and interesting to watch a video that shared or explained survey results rather than a video that gave an overview of SSOCS. Participants felt that a letter and brochure were sufficient to provide them with overview information about the survey. Verbatim comments include:

    • “I would go with a letter explaining purpose versus watching a video. Having the amount of time the survey is going to take clearly listed on materials - letter or brochure - would help. I wouldn’t need to watch a video if the letter was informative enough.”

    • “I would like the video, but I don’t want to watch an infomercial. I just need the stuff I need. It would be a great way to display results and explain how the information is being used. I could also use that video to share with staff, like a video news release about the results of the survey.”

    • “I feel like between the letter and brochure, that’s enough. I don’t need a video or PowerPoint to explain it to me.”

    • “I’m not sure what I would use the video or PowerPoint for - to share results or convince me that I should complete the survey? The letter is perfectly fine and sufficient. It looks to me like something else has already warned us this is coming and asked us to participate; then when you get this letter it provides enough information that I’m either going to do the survey or not. Once I’ve read the letter, I’ve probably made my mind up about completing it or not. I’m not going to go somewhere else or watch a video to consider completing it.”


Participants generally did not see much use for a promotional or overview video, although some indicated that they would share a video with staff at professional development meetings or email it to them along with information about the survey. Verbatim comments include:

    • “I would show it to staff or send it to staff, a hundred percent. A lot of staff are younger, so they are digital natives. They would prefer to watch a video rather than read an email I send out about it. It could be useful as a training/recruitment tool to explain to others.”

    • “For me, a video wouldn’t be all that helpful if I’m the only one taking the survey. Endorsements and the Department of Education logo on the top of the letter are helpful.”


One participant noted that the decision to complete surveys is sometimes made at the district level rather than at the level of an individual school. The school is much more likely to complete the survey if the district requests or endorses it. Verbatim comments include:

    • “It sounds like surveys are sent to individual schools. But in my school system, if a letter is coming from our central office with a directive to complete a survey, it’s more likely to get completed.”


Format Preferences



When asked if they would prefer to take the survey online or in print, all principals said they would prefer to take the survey online.




Preferred Sources of Information on Education


What resources do you rely on to keep up with education trends? What resources do you rely on for news of your local school system?


To keep up with trends in education, participants cited trade outlets like Education Week, Principals magazine, and Edutopia as sources for news. Many participants mentioned they use Twitter and Facebook groups as sources of information and news. The principals also indicated they receive information about studies, legislation, and programs through their district’s main office or their state’s Department of Education listservs and email newsletters.


Several participants also cited professional associations such as the National Association of Elementary School Principals or local chapters of professional organizations as their source for education news. One referenced a local principals’ group that meets in [my state] as well as the School Administrators of [my state]. Several principals listed [national] news outlets such as the Washington Post, New York Times, and National Public Radio as sources of news and information on education and national, state, and local policies and legislation that may affect them.

Additional Comments



When asked if they had any final comments about SSOCS, one principal emphasized the importance of the Department of Education logo and branding on survey materials and correspondence about the survey. Verbatim comments include:

    • “Could you make sure if you are going to send a letter that it’s got the US Department of Education seal on the outside of the envelope, because if it’s a plain envelope, I’m going to toss it. Especially if it’s electronic - if it says clearly Department of Education, it will be taken more seriously.”


Middle School Principals Focus Group, Wednesday, February 1, 2017, 8:00–9:30 p.m.


Characteristics of Participants



Six middle school principals participated in the group. A summary of their school characteristics is as follows:


Gender of Principal | Public or Private School | School Size | School Community | Percent Eligible for SLP
Female | Public | Large | Urban | 76-100%
Female | Public | Medium | Suburban | 0-25%
Male | Public | Large | Urban | 76-100%
Male | Public | Small | Rural | 51-75%
Female | Public | Small | Rural | 51-75%
Female | Public | Medium | Urban | 76-100%


Awareness of Education Surveys and Studies in General


Let’s start by talking broadly about education studies involving samples of primary or secondary schools, staff, and students. Are you aware of any surveys or studies like that? Do any studies come to mind?


Several participants indicated their school is a Positive Behavioral Interventions and Supports (PBIS) school and that they complete and conduct a number of surveys and questionnaires related to school climate as part of that program. Several principals also cited the Gallup poll and the Healthy Kids survey as studies their schools had participated in. Verbatim comments include:

    • “My school is a positive behavior support school, and any of the surveys we use regularly are regarding school climate, a balance of ones that are already created and ones we create. We don’t participate in very many outside surveys unless it’s mandated by the state. Any time [a research team] or someone wants to come in and survey our kids, we have to get permission.”

    • “We’re a PBIS school too, so we do the PBIS surveys. There’s a few we do. We also do a Healthy Kids survey and we also do the Gallup poll survey, as a district. It’s a district decision to participate in the Gallup survey.”

    • “We are also a PBIS school, so there are surveys we do along with that for climate purposes. We work with two teaching colleges that do research, school SROs and national association on SROs, and do surveys on safety, alcohol and substance abuse, through ASDA. Our guidance counselor does anti-bullying surveys. There’s a lot of participation in surveys.”


Some participants mentioned state- and district-wide surveys to measure incidents in schools that can have a negative impact on school reputations, such as being listed as a “dangerous school.” Verbatim comments include:

    • “We fill out the ones the county tells us to fill out and send back. I’m always making sure the data is pertinent. In the state, will you be considered a [good] school? Or with ‘seven crimes’ that are part of the survey that will get you on the list in the state as being a ‘dangerous school’? There is a state-level survey for this. It goes into the data portal that is publicly accessible.”


Participants noted that many universities and graduate students contact them to ask for participation in surveys. Principals participate in surveys and studies that are required of them by their district and/or state, and several said it is at their discretion to participate in voluntary surveys. Verbatim comments include:

    • “Sometimes districts make decisions on what surveys schools will participate in, but something like this we would do on our own. Graduate students send emails all the time asking to complete a quick survey or answer questions, with no direction from the district. If we are talking about a survey kids take, then we have to get permission from the district. For principals, they don’t give us guidance; they tell us only do it if we want.”


Prior Knowledge of SSOCS


Let’s talk more specifically about a study called the School Survey on Crime and Safety, or SSOCS. Prior to receiving the materials we sent you, what, if anything, had you heard or did you know about SSOCS?


All but one of the participants had not heard of SSOCS before receiving our materials about it. The one who had was familiar with the survey because he had been a security officer in a school before becoming a principal.


Initial Reaction to SSOCS


For the following discussion, I’d like you to refer to the packet of information about SSOCS. Please open it up at this time. [If you don’t have your packet, please refer to the electronic versions on your screen.]

At this point, what are your thoughts on SSOCS? Would you take this survey if it came to your school?

What would motivate you to take the survey?


A majority of participants indicated they would probably not participate in the survey because of time constraints and a perceived lack of relevance to their specific school. A majority also expressed concern over the length of time it would take to complete the survey and review the accompanying materials. Verbatim comments include:

    • “I wouldn’t go through the pamphlet - it’s a matter of time. Our time is so limited in the day, to go through a pamphlet about a potential survey I might do… I wouldn’t do the survey because of the time that it requires. I don’t have the time. I would want to know how much time it takes to do the survey. The topic is interesting to me, but on a quick scan, it wouldn’t have gotten past the email for me. We get at least 200 emails a day to go through. This wouldn’t make it to the top of my list.”

    • “I would be less likely to do the survey if it was in the mail, but also, in email, it won’t get past my secretary. Because of how it’s been received - it wasn’t mandated [by local officials] - I wouldn’t have time to complete it. Too many things are a priority. Also, the survey is a very long survey to boot.”

    • “I have the same concerns as others - this would take too much time to complete during the regular school year.”

    • “As far as surveys go, it’s pretty extensive. You’re asking for a lot of data that has to be retrieved, and that takes time. At least an hour, if not longer.”


In considering whether or not they would participate in SSOCS, principals would want to know who would see the data, how the data would be shared with others, and whether or not they would receive the data or a results report after participating. Several participants said they would want to see the results of the survey, including a comparison between similar schools and themselves. Verbatim comments include:

    • “Who is going to see the data? Am I going to see the data before it goes to a bigger group? Are we going to be compared to other schools like us? How is the information going to be shared across the country?”

    • “I would like to see the results, and I’m also curious to see the types of schools and how they compare to us.”

    • “One of my secretaries would do it; it wouldn’t be me. I would want to know if data is being segregated to schools my size, rural. Urban data doesn’t mean anything to me. My secretary would take it on herself if she thought it needed to be done. If it came to my desk first it would go in the trash can.”


Many participants emphasized that spring is the worst time for a survey because of testing. They would be more likely to complete a survey during down time in early summer, if they completed it at all. Verbatim comments include:

    • “The spring is one of the worst times to administer, as that’s testing time. The summer would be the best time to administer this. I would actually have time to do it. This is a twenty-page survey. There’s no way - even if I wanted to do it for the good of the order, there’s just no way. In the summer, I might actually take the time to do it.”


Reactions to SSOCS Letter from the Commissioner


What about the information included in the letter? Does the letter give you all the information that you need in a letter about SSOCS? Are the logos prominent enough to convey this is from the Department of Education? What do you think of the endorsers?


Several participants said the letter is too long and wordy. They would prefer that it state up front and concisely what the survey is, how long it would take to complete, and how it would benefit their school. Verbatim comments include:

    • “The letter itself is lengthy; again, I wouldn’t have time to do this. I would read the first paragraph, then you’re losing me. Time and relevance should be on this letter - this is a survey, it takes approximately x time to complete it, etc. What about this survey is useful to me or my school? If not, then it’s going in the trash.”

    • “Too much text. The endorser at the top would possibly get it to my desk to look at, but as soon as I start reading you’ve lost me.”

    • “It would be more likely to be completed if it was during the down time in the summer. We’re becoming a PBIS school, and we have to do a lot of surveys and data collection, and this would be another survey. If this would provide me a data point that isn’t being provided by another collection tool that my district or state has offered, then I would complete this, but otherwise no.”


Participants indicated that a letter from the Department of Education on official letterhead would not persuade them to complete the survey. Verbatim comments include:

    • “It doesn’t change whether I would do it - it’s always a time and relevance issue.”

    • “If it’s endorsed and mandated to be completed, yes. But if it’s just gathering information and not going to supply me with any data that’s actionable, then I’m not going to waste my time on it. I need to see a direct benefit for the school.”


Participants did not feel the endorsements would encourage them to participate, unless the survey is endorsed and mandated by their district or state officials. Verbatim comments include:



    • “The endorsers just add a lot more text on the page. It’s very busy; it starts to sound like the Charlie Brown teacher after a while.”


A majority of participants indicated that a monetary incentive would improve the chances that they would complete the survey; principals from rural schools especially noted that financial incentives would greatly increase the chances of their participation. Verbatim comments include:

    • “If you said free $25 Visa gift card for your time, then I might look at it. I can’t put a dollar amount on it, but anytime the incentive is there - money for your school - then that would be attractive. I’m in a very poor rural area, so it’s attractive to me; it gets my attention.”


Reactions to SSOCS Brochure


Let’s talk specifically about the brochure about SSOCS. Does this look like it has the information you would want to know about SSOCS? Do you think there is anything that might be missing? Does the brochure answer the questions that you might have?


Participants reiterated that materials would need to quickly and clearly explain why the survey is important to them and what the benefit would be for their school in order to encourage their participation. Verbatim comments include:

    • “It might be the right information, but I would still be working to find out why is this important to me. Why do I need to fill this out? How is it going to help me do my job better? I don’t think that it’s something I would put on my priority list. Definitely don’t send it in the spring - that’s testing, that’s lockdown time. The summer might work… [before summer school starts]. That’s a small window of time to send this out and have the time to complete it.”

    • “The benefit of PBIS is that we use it to identify kids that need support. We can show, using the data, that it helps to create a positive school climate. I can get buy-in, and I can get support. This is 20 pages of something that I am not connected with. The day goes by so quickly, and there are only so many hours in the day. If my boss is telling me that we have to do it, then it’s getting done. Otherwise it’s not.”


Participants emphasized the need to see the value of the survey and its data for their school to motivate them to participate. Others mentioned that an incentive would motivate them. Verbatim comments include:

    • “In rural [areas], I have no comparison to urban anything. So it would not be helpful to me to see comparisons to that. What am I going to get from it? I know from looking I’m not going to get much. Is there money attached to it? Then, by golly, we’re going to go after it as a district. The questionnaire doesn’t seem relevant to our school environment and the data would not be comparable to my school.”

    • “I’m very rural as well. If you want to make the brochure eye-catching and you had pictures of Sandy Hook and Columbine on it, then it might be noticeable. If there was an incentive attached to this, my poor school would be all over it.”

    • “Maybe if the data had an impact on the state level or local level I would be more likely to take it.”

    • “An incentive is always appealing - in our [district], the PBIS is used to support data that we can use to write small grants. If the survey was shorter… some sections could be taken out if you want more people to complete it. I can’t see doing anything more than four pages.”


Some participants indicated they would probably not look through the brochure or materials and would be unlikely to participate in the survey. Verbatim comments include:

    • “I wouldn’t even look at the brochure. I don’t see the value of the data to me. The stuff you are looking for, PBIS collects - unless I’m told I have to do it, I’m not doing it, and a brochure isn’t going to help. The survey overlaps with one I’m already doing. I’m only doing it so I can help the greater good, and I don’t see how this data is going to help an organization improve the quality of life at my school. It doesn’t connect back to my school. I would have to see how my participation directly relates to improving the culture for schools.”

    • “As much junk mail as I get every day, I don’t even know that I would open it. I throw so many things away that come in folders. I would have to have a reason to open it and not trash it.”


Participants would be more likely to complete the survey if it contained shorter, simpler “yes/no” questions and didn’t require time to gather data. Verbatim comments include:

    • “If the survey was more yes/no questions or simplified, then a secretary or sub-staff could fill it out.”

    • “If it’s shorter it may get my attention and I’ll fill it out quickly and send it back.”

One participant said he would not believe that the data would be used by federal policymakers to inform education policy. Verbatim comments include:

    • “In the current climate, I don’t think the boss or Trump could care less what I think. I wouldn’t believe that survey data would be used by federal policymakers.”


Format Preferences


Participants would prefer to receive and complete the survey electronically but pointed out that it is easier to miss an email because they receive, on average, more than 200 emails a day. Verbatim comments include:

    • “It’s easier to fill out, but less likely to grab my attention because it’s super easy to delete.”

    • “Depending on who sends it, it might not even get opened. It would go straight to the junk pile or trash. If it came from the state level via email, then it will get opened.”

    • “If we could have it electronic and yes/no answers, so we wouldn’t have to get specific numbers for data, then it’s likely to get done. Of course, an incentive would make that even better.”




Preferred Sources of Information on Education


What resources do you rely on to keep up with education trends? What resources do you rely on for news of your local school system?


A majority of participants indicated receiving weekly or daily newsletters, updates, and digests from their district-level administrators or state superintendents. Most also indicated receiving news from education associations, including the National Association of Secondary School Principals, the National Education Association, and others.


Participants cited using outlets including Education Week, SmartBrief, and National Public Radio for broader information on education issues. Some participants noted finding information through Twitter.


Additional Comments

When asked if they had any final comments about SSOCS, a majority of participants expressed appreciation for being consulted on this topic and noted its importance in their schools. Verbatim comments include:

    • “The motivation to participate in this group was the opportunity to share feedback and be asked. Information is important to get on a national level for sure - but how it’s asked, I wouldn’t have time to do it. If I can see some actionable insights from data, it would be more appealing to me.”

    • “Crime and safety is an important topic, and I appreciate the opportunity to discuss it and meet with you all. But, the issue is very different for districts across the country, especially for me in a very rural district.”


High School Principals Focus Group, Thursday, February 2, 2017, 8:00–9:30 p.m.


Characteristics of Participants



Nine high school principals participated in the group. A summary of their school characteristics is as follows:


Gender of Principal | Public or Private School | School Size | School Community | Percent Eligible for SLP
Male | Public | Large | Suburban | 0-25%
Male | Public | Small | Rural | 0-25%
Male | Public | Small | Urban | 76-100%
Female | Public | Large | Suburban | 0-25%
Female | Public | Small | Rural | 51-75%
Male | Public | Small | Rural | 51-75%
Male | Public | Small | Rural | 26-50%
Female | Public | Small | Urban | 51-75%
Female | Public | Small | Suburban | 26-50%





Awareness of Education Surveys and Studies in General


Let’s start by talking broadly about education studies involving samples of primary or secondary schools, staff, and students. Are you aware of any surveys or studies like that? Do any studies come to mind?


Participants said they are aware of some surveys conducted by the Department of Education and NCES. Many of the surveys principals participate in come from their districts and are required. Several principals mentioned receiving surveys from higher education institutions and graduate students. Participants mentioned additional surveys on school climate and on resources. They also mentioned surveys from the College Board and from professional associations such as the National Association of Secondary School Principals (NASSP). Verbatim comments include:

    • “We have some surveys that come through the state … regarding technology. That has been pretty regular and we’re required to do [them]. A lot come from the district. I haven’t participated in a lot of national surveys. I’m not aware of opportunities to participate in national ones.”

    • “Everything we do comes through our district and then is farmed out to high schools. Some research drug and alcohol use; some are clearinghouse things related to college and career readiness; and we do … related to technology.”

    • “We have district-level surveys; I’m doing an employee engagement survey right now. Teachers participate in an insight survey that goes into evaluation for principals.”

    • “At my level, in a very small rural district, I make decisions on surveys. They come from the [state association of principals], or they come from my district or state. We are looking at school climate and culture of the school district in recent surveys.”


Prior Knowledge of SSOCS


Let’s talk more specifically about a study called the School Survey on Crime and Safety, or SSOCS. Prior to receiving the materials we sent you, what, if anything, had you heard or did you know about SSOCS?


About a quarter of the participants had heard of SSOCS; the rest had not heard of the survey before receiving materials.


Initial Reaction to SSOCS


For the following discussion, I’d like you to refer to the packet of information about SSOCS. Please open it up at this time. [If you don’t have your packet, please refer to the electronic versions on your screen.]

At this point, what are your thoughts on SSOCS? Would you take this survey if it came to your school? What would motivate you to take the survey?


Many principals said they would be willing to participate in SSOCS because it requires only their own time and potentially some clerical hours to complete. Surveys that require participation from students and/or teachers are more difficult to conduct and usually require approval from the district. Verbatim comments include:

    • “Since it just has to be completed by me, it’s no problem at all to fill that out. So, yes, we would complete it.”

    • “It’s highly likely that I would complete it if it falls on my shoulders in terms of handling it between myself and a secretary.”

    • “Absolutely, I would fill it out. I find it of great value and would take the time to complete it.”


Some participants mentioned they would be willing to participate because the survey comes from NCES. Verbatim comments include:

    • “Largely because it is coming from NCES and it involves only one person to compile the information and complete - just me and some clerical time to get the data together.”


Several participants mentioned they would be more willing to take SSOCS if the materials indicated clearly and initially the amount of time required to complete the survey. Verbatim comments include:

    • “One of my questions - how long would I have to take it? I’m a middle school principal, high school principal, and athletic director. Spring time is busy with sports. If I had enough time to complete it, I would be more than happy to fill it out.”

    • “If it takes a little bit of time, I would do it myself. It appears to be valuable.”

    • “Yes, if I have a little time and a two-week window, I should be able to do it as well.”


A majority of participants noted that early summer is an ideal time to receive the survey and to ask principals to participate, as it is the least busy period of the year. Spring is not ideal because of sports, testing, and other activities. Verbatim comments include:

    • “I would definitely do it. The participation would be improved if I knew a purpose, and also if it were in early summer that would be better for me, because of all the spring activities.”

    • “Early summer is a better time frame and would give a more complete picture - a year’s worth of data for us to pull from. I completely agree that summer is the best time to do it, as it coincides well with other reports and data we gather at the end of the year.”

    • “It wouldn’t matter so much if it were later in the year or summer, but you would get more complete information at the end of the year.”

    • “Summer is actually busiest for me, but timing isn’t that much of an issue, and as long as my district approves it, I would make the time.”

    • “Summer, being end of June, would be great.”

    • “January is my worst month - as long as it’s not then, I could do it. I [have a lot of other duties].”


Reactions to SSOCS Letter from the Commissioner


What about the information included in the letter? Does the letter give you all the information that you need in a letter about SSOCS? Are the logos prominent enough to convey this is from the Department of Education? What do you think of the endorsers?


Several participants noted that the letter made participating in the survey sound very optional and they would be less likely to participate in SSOCS after reading it. Verbatim comments include:

    • “The letter sounds very optional to me. I would probably be less inclined to participate. It doesn’t sound urgent or pressing.”

    • “If it’s just voluntary, I often don’t get past the first few lines. If it’s not clear what it’s for and why I should pay attention, then it gets tossed to the side.”

    • “I would come at it more from a plea - we need your help and know you’re busy, help us make a difference in schools. Looking at the three pieces overall, the letter was the least that popped for me. The brochure first, then I went to the survey to look through. The letter was what I looked at last.”

    • “The letter starts off with a lot of government-ese.”


A majority of participants wanted to know how the data would be used and how they could use the data in their schools. Understanding the “what’s in it for me?” would encourage them to participate. Participants emphasized the importance of having a clear, concise message at the beginning of the letter that explains why the survey is important and what the benefits are to their students and school for participating. Verbatim comments include:

    • “I think if there was a more compelling argument as to how this affects me as a school principal, a sense of urgency and more about the data, that would be better. Why should I do this?”

    • “What kind of laws or reforms will this type of data affect? If you mentioned how this data affects federal policy, then I would be more inclined to participate.”

    • “I look at the survey as a checklist for proactivity; you should be self-analyzing safety on a campus all the time. Keeping your kids safe, having limited entrances, etc. It’s that proactive piece, looking at your own processes, and the data would be helpful as well.”

    • “It would be interesting to look at this data across the nation, as this is an issue we are all addressing.”

    • “I don’t really see anything in the letter that talks about the benefit to me directly, my students, my district or school, my parents. This is why you should do this, because you can get xyz out of it. I don’t really care what gets reported to the government. I need the compelling reason why right at the top.”


Some participants noted that they did not notice the list of endorsers on the letter. Other participants mentioned that certain endorsers like AFT and NEA actually discourage them from participating in SSOCS. They suggested it would be better to customize the letter for different principal levels and include endorsers that are most pertinent to the principal at the top (for example, NAESP for elementary school principals and NASSP for high school principals). Verbatim comments include:

    • “I think endorsers add to the credibility, but I didn’t really notice the list on the letter. I would rather understand what data would be made available to me and how it would be used - that would make it more compelling.”

    • “Right at the top, when I see AFT or NEA, that’s a real big turn-off. I would pay attention to NASSP - at least our professional organization has said it’s credible. It should be the individual professional organizations that match up to the type of principal targeted - NASSP for high school, NAESP for elementary school, etc. I want to see the organizations that are relevant to me. Be aware of regional sensitivities to certain groups.”

    • “I would urge you to keep alternative education and school associations in mind for endorsements.”


One principal noted that many of the questions in the survey cover data they also report for state studies, and wanted to know why the state and federal governments aren’t sharing these data to reduce redundancy. Verbatim comments include:

    • “We have to report a lot of this data to the state, so it makes me wonder: why aren’t these two agencies talking to each other? Why the redundancy?”



Reactions to SSOCS Brochure


Let’s talk specifically about the brochure about SSOCS. Does this look like it has the information you would want to know about SSOCS? Do you think there is anything that might be missing? Does the brochure answer the questions that you might have?


Many participants liked the brochure and thought it gave many examples and reasons to participate in SSOCS. Verbatim comments include:

    • “I believe this brochure makes a very great case; it gives a lot of examples. This is very helpful.”

    • “I agree - this is very helpful. The supporting data aspect and the reminder that we are doing this to improve student and school safety.”

    • “The brochure is put together rather well, and it looks pretty important. It comes off pretty nice. We would participate just to help ensure that the US Department of Education had a good sample of data. It doesn’t influence anything we do. I would be curious to see results and it would be nice to see it broken down by school size and by state. I know those results will help influence some federal policy decisions possibly, but it won’t influence anything we do locally.”


Participants would like to know more about how results from the survey would be disaggregated by school size so they could compare themselves to similar schools across the country. Comparison is not meaningful for them if it’s not “apples to apples.” Verbatim comments include:

    • “Yes, it’s very well-written. My question is how does this data parse out to rural districts as opposed to urban and large? … I would like to see how our school compares to other similar schools. It might be interesting to mention that the information provided is across all levels and can be seen at all levels. I want to know how am I going to be able to see the results? When I read this, it looks like [it’s for a] large school. That is important because it’s where most of our kids are, but why would you be asking me in my small rural school? My information might skew the results.”

    • “This looks like it’s for a big school and I would think that my small school data is not valuable. It looks like it’s more urban schools, not rural. I would need to see that my participation would help, not hinder or confuse the cause.”


Some participants recommended tailoring the brochure to different sizes of schools. Several noted that the images in the brochure call to mind a large, urban school. Rural school principals may see the brochure and think that their data would not be relevant to the study because of this perception. Verbatim comments include:

    • “Perhaps send a different brochure to different types of schools. I’m a small school and would need to see how my information can be relevant to the cause. The brochure seems to depict or bring to mind large urban school environments. School size and also rural versus urban - those are big differentiators.”

    • “I would want results that are more tailored to our district and others like it.”


Perceived Value of Other Informational Materials


Would it be helpful to have a video or PowerPoint presentation available about SSOCS? Would you view these? Would you use them to show others what SSOCS is about?


A majority of participants indicated that a video or PowerPoint would not be very helpful in learning more about SSOCS or encouraging their participation. Some principals who are part of larger districts indicated that it might be helpful to share a video or presentation at group staff meetings with other principals. Verbatim comments include:

    • “If it was a 2-3 minute video, or a 5-7 slide deck, that could be of value to our stakeholders, but outside of that, probably not.”

    • “The survey speaks for itself; a video or PowerPoint is not necessary.”

    • “When sent to a larger district, a video or PowerPoint would be helpful.”


Format Preferences



When asked if they would prefer to take the survey online or in print, most participants said they would prefer electronic delivery of the survey. One had no preference.

Preferred Sources of Information on Education


What resources do you rely on to keep up with education trends? What resources do you rely on for news of your local school system?


Many participants cited newsletters from professional organizations, including ASCD and NASSP national and state chapters, and school administrator associations as their sources for education news outside of their district.

A number of participants also mentioned daily or weekly emails from their district or state’s Department of Education as a source of news. They also mentioned publications including Educational Leadership and School District Leader magazine as news sources.

1 Due to the nature of the paper-based administration of SSOCS, principals would have time to check records and consult with other school staff before answering items during the operational survey; however, the structure of the cognitive testing did not allow for this consultation.


