BJS Response to Public Comments on SLEPS


2019 Survey of Law Enforcement Personnel in Schools (SLEPS)


OMB: 1121-0367


BJS Response to Public Comments Received Under the 60-Day FR Notice


During the 60-day comment period, BJS received comments from three organizations: Campaign for Youth Justice (CFYJ), Rights4Girls, and Southern Poverty Law Center (SPLC). Some of the proposed items were incorporated, either into existing questions or as a new question, because they were straightforward concepts that complemented the current survey content. Other proposed items were not incorporated because they reference complex topics that would require testing new items, which would significantly delay the fielding of SLEPS. The following is a brief summary of the feedback BJS received during the 60-day comment period and how it was addressed, organized by the extent to which changes were incorporated into the survey instruments.


Feedback incorporated into the SLEPS instruments


Per the recommendation by CFYJ, BJS added a new question to the law enforcement agency (LEA) survey to determine if School Resource Officers (SROs) are allowed to conduct interviews of students outside the presence of a parent or guardian without first obtaining permission from a parent or guardian (Question 14). A complementary addition was made to an existing question on the SRO survey about SRO activities: two new response options ask SROs whether, in the past 30 days, they have conducted a student interview outside the presence of a parent or guardian or conducted a student interview in the presence of a parent or guardian (Question 16).


Per the recommendation by Rights4Girls, BJS added ‘trauma-informed practices’ to the list of social/behavioral training topics on both the LEA and SRO surveys (LEA Question 26, SRO Question 15).


Per the recommendation by SPLC, BJS added ‘conducted video surveillance/monitoring’ and ‘participated on a threat assessment team’ as response options to the question about the law enforcement activities in which SROs have engaged in the past 30 days (Question 16).


Feedback partially incorporated into the SLEPS instruments


SPLC suggested that BJS ask about limitations or prohibitions, in agreements with schools or in internal departmental policy, surrounding SRO arrest powers, SRO involvement in school disciplinary matters, SRO use of force, and SRO use of weapons. BJS determined that the existing question on the LEA survey about SRO program characteristics (Question 12) sufficiently covers arrest powers and involvement in school disciplinary matters. No addition was made for SRO use of force because BJS’s Law Enforcement Management and Administrative Statistics (LEMAS) survey already asks LEAs whether they have written policies or procedural directives on the use of deadly force/firearm discharge and the use of less-lethal force, and LEMAS goes to the same types of LEAs as SLEPS will, with one exception: SLEPS will include school-based police departments and LEMAS does not. Because use of force policies would not be SRO-specific, adding a question about use of force policy would largely duplicate what is collected through LEMAS. Because LEMAS does not include school-based police departments, BJS does not have data on use of force policies in these types of LEAs; however, BJS concluded that the small number of school-based police departments does not warrant adding this question to the LEA survey. Regarding SRO use of weapons, BJS added two response options to Question 12: ‘use of firearms’ and ‘use of less-lethal equipment.’


SPLC commented that BJS should ask which training topics are required for SROs and how often SROs must be retrained. The training questions on the LEA survey originally asked about topics on which training was offered to SROs; following review of this comment, these questions were revised to ask about required training (Questions 24, 25, 26). No changes were made to incorporate frequency of training. An earlier version of the SRO survey tested a question that asked whether training on specified topics was received annually, and cognitive testing demonstrated that respondents had difficulty answering it (see Attachment 30, pages 38-41 of the original OMB package). Because of this difficulty, BJS revised the SRO survey to ask only whether training was received, and BJS anticipates the same difficulty would arise if a training frequency question were added to the LEA survey.


Feedback not incorporated into the SLEPS instruments


CFYJ and SPLC provided similar comments that the LEA survey should collect data on student arrests and referrals to law enforcement disaggregated by characteristics such as race, ethnicity, gender, disability status, and offense. The Department of Education’s Civil Rights Data Collection (CRDC) already collects many of the elements identified by CFYJ and SPLC and does so on a much larger scale than SLEPS could. CRDC is a mandatory data collection that gathered data from the universe of over 96,000 schools in 2015-2016; it collects the number of students who received a school-related arrest and the number of students referred to a law enforcement agency, by sex, race/Hispanic origin, and disability status. CRDC also collects the number of full-time equivalent (FTE) sworn law enforcement officers and the number of FTE security guards present at school at least once a week. By comparison, SLEPS is a voluntary data collection that will sample only approximately 2,000 LEAs that employ SROs. Adding the proposed items to SLEPS would duplicate some of CRDC’s effort and, given the scale and target population of SLEPS compared to CRDC, ultimately would not provide better estimates.

In addition to duplicating effort, BJS is concerned about the ability of LEAs to provide the proposed elements accurately and without undue burden. Student arrests and citations may not be easily retrievable from LEA records management systems (RMS); an RMS may not have a location code for “school” that would allow LEAs to easily identify arrests that take place at school. Even if LEAs can identify arrests that occur at school, there are additional complications, such as determining whether the arrest was of a student and whether the school’s SRO made the arrest. An earlier version of the LEA survey that was cognitively tested asked LEAs if they collected data on measures such as the number and type of arrests made by SROs and the number and type of citations issued by SROs. Some respondents indicated that they had access to these data but that it would be extremely burdensome to quantify these instances; other respondents indicated they did not have the data but could get the information from the school district (see Attachment 30, page 31 of the original OMB submission).

BJS also has concerns about adding complicated items such as arrest counts and the other proposed elements without the ability to test new questions to ensure clarity and the ability of LEAs to answer them. The recommendation to collect data on use of force incidents falls under this concern as well, as it remains unclear how best to capture this type of information from LEAs. CFYJ and SPLC also suggested that BJS ask LEAs how many arrests resulted in a delinquency finding or conviction; LEAs would not be able to report this because these data fall under the courts’ domain.

Other suggestions were not incorporated into the SLEPS instruments because BJS determined that the topics were sufficiently captured by existing items. CFYJ and SPLC both recommended adding bias-free policing as a social/behavioral training topic on the LEA and SRO instruments. CFYJ also recommended adding training on the civil or constitutional rights of students to the list of law enforcement training topics. BJS determined that these topics are not sufficiently different from the existing topics of ‘cultural sensitivity and/or cultural competency’ and ‘procedures for handling juvenile offenders’ to warrant inclusion.

No changes were made for some proposed items due to uncertainty about how to measure them, concerns about fielding untested questions, and the inability to provide context for the proposed items. CFYJ and Rights4Girls made similar suggestions to add questions about complaints filed against SROs. BJS has not tested any questions about complaints, and doing so would result in significant delays. Additionally, the LEA survey does not collect contextual measures, such as the size of schools, that would allow for meaningful comparisons of these data. Rights4Girls proposed asking SROs how frequently they apply certain tools learned in training and how often they employ responses less punitive than arrest; BJS did not make any changes in response due to concerns about context, the inability to test questions, and the previously mentioned difficulty of assessing frequency. Another suggestion was to ask whether SROs have access to data on students identified as a threat and whether LEAs use any integrated databases of student information. Uncertainty surrounding the methods and sources through which students are identified as threats, and the means through which this information is tracked, would warrant item testing, which would significantly delay the fielding of SLEPS.

The value of some proposed items was unclear to BJS, and therefore no changes were made. One suggestion was to ask LEAs whether they analyze and review data on student arrests, citations, and use of force, disaggregated by student demographic characteristics, and if so, how often. Without a tested follow-up question to determine whether such a review has any impact, such as a policy change, it is unclear what value knowing this would hold. Another suggestion was to ask SROs how many students they referred to school administration for suspension or expulsion in the previous 12 months; BJS determined this is outside the scope of SRO responsibility and therefore did not add it.






