





School Survey of Crime and Safety (SSOCS) Incident Count Check Cognitive Interviews




Volume I

Supporting Statement





OMB #1850-0803 v.250







Submitted by

National Center for Education Statistics (NCES)

U.S. Department of Education






June 2019






Appendixes:

Appendix A: Recruitment Materials

Appendix B: Protocols

Appendix C: Questionnaire Handouts

Submittal-Related Information

The following material is being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB# 1850-0803), which provides NCES the capability to improve data collection instruments by conducting testing—such as usability tests, focus groups, and cognitive interviews—to improve methodologies, survey questions, and/or delivery methods.

1. Background and Study Rationale

The School Survey on Crime and Safety (SSOCS) is a nationally representative survey of public elementary and secondary schools and is one of the nation’s primary sources of school-level data on crime and safety in public schools. Conducted by NCES within the U.S. Department of Education (ED), SSOCS has been administered seven times, beginning with the 1999–2000 school year. The next administration is planned for the 2019–20 school year.


SSOCS is designed to produce nationally representative data on violence and safety practices in public schools. SSOCS questionnaires are completed by school principals or other school personnel designated by the principal as the person who is “most knowledgeable about school crime and policies to provide a safe environment.” SSOCS is the only recurring federal survey that collects detailed information on the incidence, frequency, seriousness, and nature of violence affecting students and school personnel, as well as on other indices of school safety, directly from school-level respondents.


Topics covered by SSOCS include school programs and practices, parent and community involvement, school security staff, school mental health services, staff training, limitations on crime prevention, the type and frequency of crime and violence, and the types of disciplinary problems and actions used in schools. Other federal surveys obtain information about school crime from respondents who do not have the school-level perspective of principals.


One such survey is the Civil Rights Data Collection (CRDC). The CRDC is administered by the U.S. Department of Education’s Office for Civil Rights (OCR) and has collected data on education and civil rights issues in U.S. public schools since 1968. The CRDC collects a variety of information from superintendents and staff at local education agencies (LEAs), most of which is disaggregated by race/ethnicity, sex, limited English proficiency, and disability. Information collected by the CRDC is used by ED offices as well as by policymakers and researchers outside of ED. The most recent collections were census collections of all LEAs, and the CRDC is currently completing its collection of 2017–18 data.


Between SSOCS and the CRDC, there is overlap in the data collected on crime incident counts, disciplinary actions, and harassment/bullying. When SSOCS experienced a lapse in funding after the 2009–10 data collection, ED deemed it crucial to continue collecting the key metrics traditionally gathered in SSOCS, and those items were incorporated into the CRDC. When SSOCS resumed in 2015–16, incident counts remained in both SSOCS and the CRDC. Consequently, incident counts exist in both SSOCS and the CRDC for the 2015–16 and 2017–18 school years. This overlap was intentional, to allow an assessment of how the counts obtained through the two data collections compare.


Initial analyses show discrepancies in the information reported for schools that participated in both SSOCS and the CRDC. To better understand the context around the differences in reporting between the two surveys, this request is to conduct a two-phase cognitive interview study exploring respondents’ understanding of the incident count questions, in particular what respondents include in, and exclude from, their calculations and what records they reference to answer the questions, in order to determine whether respondents are providing the intended information. The goals of the study are to obtain information to improve the survey items, reduce the burden of future data collections, and ensure that the resulting data are more accurate for schools, districts, policymakers, and other data users. NCES has contracted with the American Institutes for Research (AIR) to carry out the cognitive interviews.


2. Study Design and Sample Characteristics

Taking into consideration data collection timelines, as well as the need to test and refine interview procedures and materials to ensure that the data being collected will answer the research questions, this study will be conducted in two phases. Phase 1 will consist of a small number of interviews in which the recruitment methods and study materials will be tested; the results of these interviews will inform the conditions for Phase 2, which will be similar to Phase 1 but on a larger scale. With participants’ permission, each interview will be audio-recorded to aid in data analysis and report writing.


The sample will be composed of public schools and districts; interview respondents will be the staff who were responsible for filling out the school crime incident counts in the SSOCS:2018 and CRDC 2017–18 questionnaires (per the contact information provided in each data submission). SSOCS respondents will include elementary, middle, and high school principals or other school personnel designated by the principal as the person who is “most knowledgeable about school crime and policies to provide a safe environment.” The CRDC respondents will be the staff at the respective LEA for that school. For each school in the sample, the SSOCS and CRDC respondents will form an “interview pair,” but each respondent will be interviewed individually. On behalf of NCES, AIR will recruit respondents, administer interviews, and provide NCES with reports and recommendations based on analyses of findings for both phases of this study. A brief description of the sample characteristics and objectives is provided below.


Phase 1

Phase 1 will consist of approximately 10 interview pairs (one SSOCS respondent and one CRDC respondent each, for a total of approximately 20 interviews), conducted in July and August of 2019. These pairs will represent a range of (a) incident types, (b) the data collection(s) in which the incidents were reported, and (c) school levels. A purposive sample will be drawn to ensure appropriate coverage of possible scenarios and school characteristics.


The interviews in Phase 1 will serve to evaluate the feasibility of scaling up the study methods and materials for Phase 2. The Phase 1 interviews will allow us to assess the success of the recruitment tactics and to test whether the interview protocols produce the information necessary to identify validation issues and discrepancies across the two surveys. The findings from Phase 1 will inform recommendations for revisions to the study materials for Phase 2. A summary of the results from Phase 1 and recommendations for Phase 2 will be developed by late August 2019. If revisions are made to the materials for Phase 2, a revised request with the revised materials and a memo listing the changes will be submitted to OMB at that time for approval before Phase 2 begins.


Phase 2

Phase 2 will consist of up to 100 interview pairs (for a total of approximately 200 interviews) in fall 2019. These pairs will represent a range of school characteristics, including schools in different locales, schools serving different grade levels, and schools with varying enrollment sizes. Note that while the sample will include a mix of characteristics, the results will not explicitly measure differences by these characteristics.


The Phase 2 interviews will be used to assess the validity of the SSOCS and CRDC incident items and will focus on the following research questions:

  • Who is the respondent (principal, superintendent, other staff member, etc.) who completed the incident counts for SSOCS and the CRDC?

  • How did the respondent collect the incident count data? For example, did the respondent estimate the counts or use records? If the respondent used records, what records?

  • Are there differences by respondent type in the interpretation of the incident count items in SSOCS and the CRDC?

  • Did the respondent verify the incident count data? If so, how did the respondent do this?

  • How are the procedures for providing incident counts for SSOCS and the CRDC similar? How are they different? In what ways do the similarities and differences vary by whether the SSOCS and CRDC counts match/do not match?

Upon completion of the Phase 2 interviews, the interview notes will be analyzed to identify trends in the interview data and a summary report presenting the results of the study will be prepared. The report will include a description of key findings on schools’ and districts’ interpretations of the incident items in SSOCS and the CRDC, staff members responsible for completing the survey items, data storage procedures, and methods for providing and validating incident counts for the surveys. The summary report will provide recommendations for future administrations of SSOCS, specifically pertaining to the questions collecting incident count data.

3. Data Collection

3.1 Recruitment

In both phases of the study, recruitment is anticipated to last approximately 4 weeks, and the recruitment process is expected to take 5 to 20 minutes per entity. A central goal of Phase 1 is to determine how easily the correct contacts for both surveys can be identified and to gain a better sense of the time frame required to successfully recruit both contacts. AIR will develop and maintain a detailed tracking sheet of recruitment efforts for each sampled district and school; this will aid in understanding the true level of effort required for recruitment prior to Phase 2 of the study.


Prior to the start of recruitment, OCR will provide AIR with a list of the points of contact for the 2017–18 CRDC. Recruitment will begin once a sample of linked SSOCS schools and CRDC districts has been drawn. After the sample of schools has been selected, a notification letter from NCES will be mailed to the schools and their districts informing them of the study activities and inviting them to participate. Each pair of school and district contacts will be assigned an AIR staff member who will serve as their main point of contact and who will administer the interviews with the respondents.


Recruitment will begin by contacting the staff member listed as the SSOCS:2018 point of contact for each school to determine whether that person filled in the incident data and is the appropriate person for the interview. Once the SSOCS respondent has agreed to participate in an interview, they will be asked to sign a consent form that requests permission for the following:


  1. to share information from the school’s participation in SSOCS with the district contact.

  2. to participate in the interview.


Once the SSOCS respondent has agreed to participate in an interview, a similar outreach process will be used to contact the listed CRDC 2017–18 point of contact for that school’s district. If the school has consented to share its responses on SSOCS with its district, the CRDC respondent will be asked to sign a non-disclosure affidavit (NDA) and a consent form indicating agreement to participate in the interview. If the school has not consented to share its responses on SSOCS, the interview will be adjusted so that this information is not shared with the district, and a non-disclosure affidavit will not be required. For more information about consent and non-disclosure procedures, see the “Assurance of Confidentiality” section below.


If a SSOCS respondent agrees to an interview but AIR is unable to schedule an interview with the CRDC district respondent (either because AIR is unable to reach the correct respondent or because the respondent declines to be interviewed), AIR will proceed with interviewing the school only.


During recruitment, if the SSOCS or CRDC contact indicates they were not responsible for filling out the school’s incident count data, AIR will request contact information for the correct school or district staff member and then reach out accordingly. Once an interview has been scheduled, AIR staff will send a meeting invite via e-mail to confirm the interview date and time. A consent form and, for district administrators only, a non-disclosure affidavit will be attached for signature.


AIR will use multiple outreach methods and resources to recruit participants. SSOCS and CRDC respondents will be contacted by e-mail and phone during recruitment, and AIR will confirm that interested individuals are eligible to participate. Our recruitment experience with cognitive interviews for SSOCS:2018 and the 2017–18 CRDC indicated that these respondents are a hard-to-reach population, and increased time and effort will be dedicated to meeting recruitment targets.


Appendix A contains the full set of recruitment materials, including a visual representation of the recruitment process (see figure 1 in Appendix A). Screening and recruiting participants for qualitative studies such as this one is a dynamic process. The recruitment email templates and scripts will be used in Phase 1.¹ If we adapt the materials for Phase 2 based on feedback received during Phase 1, a revised request with the revised materials and a memo listing the changes will be submitted to OMB at that time for approval before Phase 2 begins.

3.2 Qualitative Interview Methods

Qualitative interview practices will be used to gather information from SSOCS and CRDC respondents to learn about their experiences reporting information for the school crime incident count items that are included in each survey. In each qualitative interview, trained interviewers will provide an overview of the study goals and procedures and then ask participants to answer questions about their experiences responding to either SSOCS or the CRDC. The interviewers will use a semi-structured protocol (see Appendix B) drawing on methods from cognitive science to investigate the process that respondents use to answer the incident count questions in each survey. The main goal of this approach is to explore respondents’ understanding of the incident count questions—in particular, what they include in and exclude from their calculations when answering the questions—in order to determine whether respondents are providing the intended information.


The semi-structured interviewing methods will consist of two key components: think-aloud interviewing and verbal probing techniques (also known as concurrent and retrospective recall probing, respectively). With think-aloud interviewing, respondents are explicitly instructed to think aloud (i.e., describe what they are thinking) as they work through items. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the think-aloud process or to explore additional issues that were identified a priori as being of particular interest.


Cognitive interview studies produce qualitative data in the form of verbalizations made by participants during the think-aloud interviewing and in response to interviewer probes. Both the think-aloud approach and probing techniques will be applied to all participants during the qualitative interviews. Interviewers will refer to the protocol to guide the content of the interviews but will also be free to deviate from the guide should participants have difficulty answering questions that do not have scripted probes or prompts.


The questions in the protocol will directly address the study’s research questions by gathering data on: how districts receive incident data from schools to report for the CRDC (and what processes they use to verify these data); what formats districts and schools use to store these data; and how district and school respondents understand and respond to the incident items in the two surveys, including any cognitive issues with definitions or question format. In order to compare the reporting processes between schools and districts for the same incident count items on the two surveys, the protocols include some questions that refer to estimates for specific incident count items that the school or district respondent provided in their respective survey (SSOCS or CRDC). For example, questions in the protocol for school interviews will discuss the incident counts from SSOCS and whether certain counts were higher, lower, or equal to the counts provided on the CRDC. Given time constraints, questions comparing reporting on SSOCS and the CRDC will only focus on the most serious incident for which the school or district reported data. No exact counts reported by the school on SSOCS will be shared with the district respondent. See the “Assurance of Confidentiality” section below for additional details.


Following each interview, the digital audio recording will be archived for qualitative analysis. AIR staff will organize their observations and summarize the common themes, insights, and ideas emerging from each of the interviews into a report.


For the complete set of questions that are provided in the protocols, see Appendix B. Excerpts of the incident items taken directly from the data collection questionnaires (and the way in which items will be clustered for comparison) are provided in Appendix C.

3.3 Interview Logistics

The qualitative interviews will last 60 minutes to give participants sufficient time to reflect on the more open-ended questions included in the protocol and allow time for interviewers to probe for additional information as necessary. The qualitative interviews will be conducted remotely via video conference (using the web conferencing platform GoToMeeting). If a respondent would prefer that the interview be conducted over the phone, that will be accommodated. Interviews will be conducted primarily during regular work hours (9 am–6 pm EDT) but will be scheduled after work hours, as needed.

4. Consultation Outside the Agency

NCES has consulted with some of the nation’s top content experts in school crime, school safety, and student support services in schools, as well as experts in methodology related to establishment and administrative surveys. In September and October of 2018, two technical review panels (TRPs) were convened with external experts and representatives from federal agencies to provide feedback on the content and methodological approaches of SSOCS. The first TRP focused on whether SSOCS currently covers the most appropriate content areas and items, discussed how to address critical gaps in the questionnaire’s content, and considered the usefulness and prioritization of current SSOCS items. The second TRP focused on potential improvements to SSOCS sampling and data collection procedures, including how SSOCS could be better integrated with or supplemented by extant administrative data sources such as the CRDC.

Feedback from the TRPs included the recommendation to conduct a thorough study to validate the crime incident data for the SSOCS and CRDC. The TRPs proposed that such a study include cognitive interviews with school and district officials (for schools that responded to both surveys) to determine how they responded to the incident items and uncover potential reasons for the matches or mismatches in incident counts reported across the two surveys.

5. Assurance of Confidentiality

All participants will be assured that their participation is voluntary, and materials shared with participants will include the following language:


All the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).


A consent form that explains the purpose and duration of the interview will be sent via e-mail to both SSOCS and CRDC participants, to be signed and returned prior to their interview.


At the time of recruiting SSOCS participants, AIR staff will also obtain consent from the school to be able to share minimal information regarding the school’s participation in SSOCS during the interview with the school’s district. This would include identifying the school that responded to the survey and providing limited information about the school’s responses to select incident items. No exact counts reported by the school on SSOCS will be shared with the district respondent. Consent to share the school’s responses on SSOCS will be collected at the time of recruitment and prior to the interview with the CRDC respondent. If written consent is not received from the SSOCS respondent prior to the interview with the CRDC respondent, information on the school’s responses on SSOCS will not be shared during the district interview.


Additionally, CRDC district respondents will be asked to sign a non-disclosure affidavit prior to the interview to affirm that they will not reveal any identifiable information or responses from the SSOCS school(s) in their district that may be shared with them as part of the interview. If the district respondent does not sign the non-disclosure affidavit, information on the school’s responses on SSOCS will not be shared during the district interview. See Appendix A for recruitment materials, including the consent form and the non-disclosure affidavit.


Participants will be assigned a unique identifier (participant ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant’s name except in a file used to manage recruitment and interviews. The consent and non-disclosure agreement forms, which include the participant’s name, will be separated from the participant interview files and secured for the duration of the study. The interviews will be audio-recorded. All files will be secured for the duration of the study—with access limited to key AIR project staff—and destroyed at the conclusion of the study.

6. Estimate of Costs for Recruiting and Paying Respondents

District- and school-level participation is vital to the success of this study, and prior experience suggests that it will be challenging to gain district- and school-level administrators’ cooperation. Similar projects have found that incentives are an effective approach to recruiting district- and school-level administrators: they communicate appreciation of a respondent’s time and participation, which may be especially important for districts and schools at the very busy time of the school year during which the study will be fielded. To encourage participation in the interviews and to thank participants for their time and effort, we will offer each participant a $50 gift card, which will be sent with a thank-you note via mail within 10 business days of completion of the interview. We offered a $50 gift card for the SSOCS:2018 cognitive interviews conducted remotely with this population (OMB# 1850-0803 v.171) and successfully reached acceptable levels of participation.

7. Estimate of Hourly Burden

Up to 10 interview pairs (approximately 20 participants) will be recruited for the qualitative interviews in Phase 1, and up to 100 interview pairs (approximately 200 participants) will be recruited for the qualitative interviews in Phase 2. Each interview will take approximately 60 minutes. The initial contact and screening of potential participants is estimated at an average of 10 minutes, or 0.16 hours. On average, three recruiting attempts are expected to be needed for each SSOCS participant (thus, an estimated 30 attempts to yield up to 10 SSOCS participants for Phase 1 and 300 attempts to yield up to 100 SSOCS participants for Phase 2) and five recruiting attempts for each CRDC participant (thus, an estimated 50 attempts to yield up to 10 CRDC participants for Phase 1 and 500 attempts to yield up to 100 CRDC participants for Phase 2). Table 1 shows burden estimates for (a) recruiting participants (the initial contact and screening) and (b) conducting the qualitative interviews, for each phase of the study.
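As a worked check of the figures in Table 1, the burden-hour subtotals can be reproduced from the assumptions above (the 5-hour entry reflects 30 × 0.16 = 4.8, rounded up to a whole hour):

  Phase 1: (30 × 0.16) + (10 × 1) + (50 × 0.16) + (10 × 1) ≈ 5 + 10 + 8 + 10 = 33 burden hours
  Phase 2: (300 × 0.16) + (100 × 1) + (500 × 0.16) + (100 × 1) = 48 + 100 + 80 + 100 = 328 burden hours
  Total: 33 + 328 = 361 burden hours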

Table 1. Estimate of hourly burden for SSOCS qualitative interviews: Phase 1 and Phase 2

Activity | Number of Respondents* | Number of Responses | Burden Hours per Respondent | Total Burden Hours

Phase 1
SSOCS recruitment (initial contact and screening) | 30 | 30 | 0.16 | 5
SSOCS qualitative interviews | 10 | 10 | 1 | 10
CRDC recruitment (initial contact and screening) | 50 | 50 | 0.16 | 8
CRDC qualitative interviews | 10 | 10 | 1 | 10
Phase 1 Subtotal | 80 | 100 | - | 33

Phase 2
SSOCS recruitment (initial contact and screening) | 300 | 300 | 0.16 | 48
SSOCS qualitative interviews | 100 | 100 | 1 | 100
CRDC recruitment (initial contact and screening) | 500 | 500 | 0.16 | 80
CRDC qualitative interviews | 100 | 100 | 1 | 100
Phase 2 Subtotal | 800 | 1,000 | - | 328

Total | 880 | 1,100 | - | 361

Note: The subtotals for the estimated Number of Respondents do not include duplicated counts of respondents.

8. Schedule

Recruitment for the cognitive interview study will begin in June 2019, pending the availability of CRDC 2017–18 data for matching analyses and sampling. Phase 1 is scheduled to begin in early July 2019, with a summary of results and recommendations for Phase 2 developed by late August 2019. Phase 2 is scheduled to begin in October 2019. Table 2 below provides the overall schedule for Phase 1 and Phase 2.

Table 2. Schedule of high-level activities for SSOCS:2018 cognitive interviews study

Activity | Start Date | End Date

Phase 1
Training | June 24 | June 26
SSOCS Recruitment | June 26 | August 9
CRDC Recruitment | July 10 | August 21
Interviews | July 11 | August 22
Analysis | August 19 | September 13
Reporting | August 26 | September 29

Phase 2
SSOCS Recruitment | October 14 | November 8
CRDC Recruitment | October 28 | November 22
Interviews | October 21 | November 29
Analysis | November 4 | December 6
Reporting | November 21 | December 20

9. Cost to the Federal Government

The estimated cost to prepare for, administer, and report the results of these cognitive interviews is approximately $364,000. The cost includes salaried labor for contractor staff and other direct costs associated with the organization of the interviews.

¹ In all phases of the study, when an email or a phone call follows a prior interaction with the addressee, short personalized statements may be interwoven into the email and/or phone script, adjusted to the tone and subjects of prior communications with the respondent.

