
National Center for Education Statistics





Volume I

Supporting Statement





2022 School Crime Supplement to

the National Crime Victimization Survey (SCS:22/NCVS)

Cognitive Interviews



OMB# 1850-0803 v.282



Attachments:

Volume II – Cognitive Interview Protocol

Attachment I – Communication Materials, Screener, and Consent Forms

Attachment II – Survey Questions



November 2020

  1. Submittal-Related Information

The following materials are being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB# 1850-0803), which provides for NCES to improve methodologies, question types, and/or delivery methods of its survey and assessment instruments by conducting testing such as pilot tests, focus groups, and cognitive interviews.

This request is to conduct recruitment and cognitive interviews designed to evaluate new and revised questionnaire items for the 2022 School Crime Supplement to the National Crime Victimization Survey (SCS:22/NCVS). This package presents the question wording to be tested and describes plans and procedures for conducting the cognitive interviews. NCES is authorized to conduct this study by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543).

  2. Background and Study Rationale

The School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS) was co-designed by the National Center for Education Statistics (NCES) and the Bureau of Justice Statistics (BJS). The SCS collects data on school-related topics, including alcohol and drug availability, fighting, bullying and hate-related behaviors, and fear and avoidance behaviors, from students ages 12 to 18 in U.S. public and private elementary, middle, and high schools. To date, the SCS has been conducted in 1989, 1995, 1999, 2001, 2003, 2005, 2007, 2009, 2011, 2013, 2015, 2017, and 2019.

The Center for Behavioral Science Methods (CBSM) has conducted cognitive pretesting on the 2015, 2017, and 2019 iterations of the School Crime Supplement. Previous pretesting focused on updating and refining the series of questions about respondents’ experiences with bullying in an effort to be consistent with the Centers for Disease Control and Prevention’s (CDC) uniform definition of bullying. Estimates of bullying produced by the 2019 data aligned with expected rates of bullying based on estimates from previous years, suggesting that the items were performing well.

The coronavirus (COVID-19) pandemic has introduced unique challenges and changes to the way that schools and classrooms are structured. In addition to traditional in-person teaching, many schools began offering all-virtual instruction or hybrid learning, in which students attend in person for part of the week and virtually for the rest. Some schools have switched between all-in-person and all-virtual learning mid-year in response to changing COVID-19 rates in their area. As school systems started to make decisions about in-person versus virtual learning, it became clear that the traditional SCS questionnaire would not function well in its current form in this climate. NCES decided to delay the SCS data collection by one year (to January-June 2022). However, it is difficult to predict how school systems will be operating at that time. Therefore, the decision was made to revise the SCS questionnaire to account for the possibilities of in-person, virtual, and hybrid learning in 2022.

NCES is also interested in asking additional questions of homeschooled students to ascertain their reasons for homeschooling, particularly to determine whether bullying was one of them. CBSM will recruit a small number of homeschooled students to test the new questions for this population. Because previous pretesting focused on refining the bullying questions, CBSM will also attempt to recruit some students who have had experiences this school year that they consider to be bullying. This will allow us to assess whether the questions continue to perform well for students receiving different types of schooling due to COVID-19.

This request is to conduct three rounds of cognitive interviews with middle school and high school students to evaluate the new and revised SCS items that were added to address new modes of schooling in response to the coronavirus pandemic. (See Attachment II for the full 2022 SCS questionnaire to be tested.) While the testing focuses on revisions that address different types of schooling, probing questions will also be asked about other 2022 SCS items that may benefit from revision. The cognitive interviews will enable the team to identify problems with question wording and suggest revisions to problematic questions. The interviews and feedback will be used to develop more effective study materials.

  3. Recruitment and Data Collection

To test the 2022 SCS instrument, we will recruit a total of 30 students ages 12-18 who currently attend public or private middle schools and high schools or are homeschooled. We will strive to recruit a sample that is diverse in demographic characteristics, geography, and school level (middle school and high school). In addition to covering a spectrum of demographic and school-level characteristics, we aim to recruit a sample of students who received different types of schooling this school year (in-person, virtual, hybrid, and homeschooling). Efforts will also be made to recruit some students who have experienced negative interactions with other students from their school that may count as bullying, as well as students who have not. Table 1 below shows the targets for student recruitment by round.

Table 1. Recruitment Targets by Round

Response Type                          Respondents per Round
                                       Round 1    Round 2    Round 3
Student Recruitment                       27         27         27
SCS Questionnaire
   Middle School Students                  5          5          5
   High School Students                    5          5          5



Cognitive interview participants will be recruited through CBSM partnerships with student organizations across the U.S., as well as through Craigslist, listservs, social media, and personal networks. See Attachment I for the language to be used in recruitment advertisements and for the eligibility screener. Respondents will receive materials by email to remind them of their interview time and to provide any other directions or information they need.

We will conduct remote interview sessions using video conferencing software (Skype for Business) with up to 30 English-speaking participants. Instructions on how to install Skype for Business will be sent as an attachment to the confirmation email once an interview has been scheduled (see Attachment I). Interviewers will use the scripted protocol in Volume II. Participants who are 18 will electronically sign a consent form through a link emailed to them before the interview; for participants under the age of 18, a parent or guardian will be sent a consent form (see Attachment I). Participants will also give oral consent at the start of the interview.

Each interview will be 60 minutes in length and will allow time to complete the SCS survey and administer a set of in-depth retrospective probes about a subset of questions. Based on prior experience, we anticipate needing to contact approximately 81 individuals to yield the 30 desired interviews, accounting for ineligible respondents and cancellations.
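
As a rough illustration of the recruitment arithmetic above (a hypothetical sketch, not part of the study materials; the variable names and the derived yield rate are ours, not documented study parameters), the stated figures imply a screener-to-interview yield of roughly 37 percent, with screener contacts spread evenly across the three rounds shown in Table 1:

```python
# Hypothetical sketch of the recruitment arithmetic stated above; the yield rate is
# simply derived from the two figures in the text, not a documented study parameter.
target_interviews = 30   # completed cognitive interviews sought across all three rounds
expected_contacts = 81   # individuals expected to complete the recruitment screener

implied_yield = target_interviews / expected_contacts
print(f"Implied screener-to-interview yield: {implied_yield:.0%}")   # ~37%
print(f"Screener contacts per round: {expected_contacts // 3}")      # 27, matching Table 1
```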

Exhibit 1 summarizes the number of interviews to be conducted for all three rounds, as well as the attendant burden. We will only test one version of the survey in each round. The cognitive testing will use an iterative process. Data from the interviews will be analyzed after each round to identify problematic questions. Those questions will be revised and tested in the subsequent round.

Researchers from the Center for Behavioral Science Methods (CBSM) at the U.S. Census Bureau will carry out the cognitive interviews remotely. Between four and seven staff members will be trained to conduct the interviews to allow flexibility in interview scheduling. All interviewers will be required to demonstrate a strong understanding of the interview protocol before beginning interviews. With respondent permission, the interviews will be audio-recorded so that the responses may be reviewed as needed during analysis. The CBSM project leader will conduct interviewer training, supervise staff, and monitor data quality throughout the data collection period.

During each interview, regardless of round, interviewers will administer the survey in its entirety. The interviewer will observe the respondent as they answer the survey items, noting any questions or problems the respondent has with regard to particular items. After the survey is complete, respondents will be asked a set of probing questions about some of the survey items and about any questions or problems they had when answering the main survey questions. After all probing questions, interviewers will ask a set of debriefing questions. The survey questions will be iteratively tested across the three rounds, with the protocol and question wording being adapted based on the results from the previous round. See Volume II of this submission for the full cognitive interview protocol.

Cognitive interviewing techniques allow researchers to evaluate potential response error and to help ensure that the survey provides valid data. In general, the goal of cognitive testing is to assess respondents' comprehension of the questionnaire items, including question intent and the meaning of specific words and phrases in each item. Data from cognitive interviews can identify potentially problematic questions that are not understood as intended. This testing can also examine respondents' retrieval of relevant information from memory, the decision processes involved in answering a question, and questions that are difficult to answer due to cognitive burden. The cognitive interviews will assess issues such as:

  • The respondent's understanding of terms in the survey

  • How confident the respondent is in their response

  • How the respondent remembered the information they provided for factual questions

  • Whether the respondent found a response choice that fit their answer

  • How easy or difficult it is to answer a question

  • Issues with sensitive questions

  • Consistency of answers within the questionnaire and in comparison with the expected range of answers

  4. Consultations Outside the Agency

The Census Bureau will prepare the OMB package for cognitive testing, conduct the cognitive testing, summarize the interviews, and analyze the results. It will use these results to make final recommendations to NCES and BJS on the wording of the questions. NCES and BJS will make the final decisions about question wording.

  5. Justification for Sensitive Questions

While the main focus of this cognitive testing is on the new and revised questions that adapt the questionnaire to in-person, virtual, and hybrid learning scenarios in 2022, there is still a need to administer the entire questionnaire. This will allow the sponsors to make sure there are no unforeseen issues with how questions perform for students in each of these scenarios.

The series of bullying questions may be considered sensitive for students who are experiencing bullying. Because the revisions made to the questionnaire prior to fielding in 2015, 2017, and 2019 focused on refining the bullying questions, it is important to assess whether these questions, as currently written, continue to perform well for students receiving different types of schooling due to COVID-19. In the three previous phases of pretesting the bullying questions, CBSM saw evidence that respondents were ultimately comfortable being asked both the survey questions and the probing questions about their experiences with bullying.

  6. Paying Respondents

We expect it will be difficult to recruit participants for this study due to the coronavirus pandemic's disruptions to education this school year. New modes of instruction have introduced scheduling and logistical challenges for parents/guardians and children. Additionally, in previous years, cognitive interviews testing the SCS were conducted in person. Advertisements listed the U.S. Census Bureau headquarters as a possible location for the interview, and parents bringing students met interviewers in person when they arrived. Remote interviews make it more difficult for parents to confirm the legitimacy of the opportunity for their child to participate in an interview with an adult. We expect this to make recruiting difficult, as some parents may be reluctant to allow their child to participate in an online activity with a stranger. We anticipate it would be even more difficult to assuage these concerns or to convey that the opportunity is legitimate if no incentive, or only a low incentive, were offered for this interview. Offering the incentive may make parents more likely to make the initial call to inquire about the study, at which point we can provide them with more information to establish the study's legitimacy.

To ensure that we can recruit participants from all desired populations, and to thank them for completing the interview, each student and parent/guardian will be offered an incentive. Incentives will offset the costs of participation in this research, such as internet or phone use. We will offer $25 to the respondent and $25 to the parent or guardian who assists them with setting up the interview and installing the video conferencing software. Two standard envelopes, each containing $25 cash, will be mailed via USPS in a single priority mail flat rate envelope. The priority mail flat rate envelopes have tracking and delivery confirmation.

  7. Assurance of Confidentiality

Cognitive interview participants will be informed that their participation is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). Participants will also be advised that the interview will be recorded and that the recording will only be reviewed for the purposes of this study. Lastly, participants will be advised that direct quotes may be used in research papers and professional presentations, but names will never be attributed to anything a respondent says.

Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all materials together. The respondent ID will not be linked to the respondent's identity in any way. If respondents are under the age of 18, their parents will be provided with a parental consent form; respondents who are 18 will be provided with the standard consent form used for adults (see Attachment I). The signed consent forms will be kept separately from the interview files in a locked cabinet for the duration of the study.

  8. Estimate of Hourly Burden

Exhibit 1 summarizes the number of interviews to be conducted for all three rounds of this study, as well as the recruitment and interview burden. Each interview will be 60 minutes in length and will allow time to complete the SCS survey and administer a set of in-depth retrospective probes about a subset of questions. Based on prior experience, we anticipate needing to contact approximately 81 individuals to yield the 30 desired interviews, accounting for ineligible respondents and cancellations.



Exhibit 1. Estimated Respondent Burden of Cognitive Interviews

Response Type                                  Number of      Number of     Burden Hours      Total Burden
                                               Respondents    Responses     per Respondent    Hours
Recruitment Screener                               81             81             .17               14
Consent Procedure/Installation of Software         30*            30             .17                5
Cognitive Interview                                30*            30              1                30
Total                                              81            141                              49

*A subset of all recruited; does not contribute to the total number of respondents. Some numbers have been rounded.
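
To make the rounding behind Exhibit 1 explicit, the short sketch below (a hypothetical illustration, not part of the clearance materials) recomputes each row's total burden hours and the overall totals from the per-response figures shown in the exhibit:

```python
# Hypothetical sketch reproducing the burden-hour arithmetic in Exhibit 1; per-response
# burden values are taken directly from the exhibit, and row totals are rounded to
# whole hours as noted in the table footnote.
burden_rows = [
    # (response type, number of responses, burden hours per response)
    ("Recruitment Screener",                       81, 0.17),
    ("Consent Procedure/Installation of Software", 30, 0.17),
    ("Cognitive Interview",                        30, 1.00),
]

total_responses = 0
total_hours = 0
for label, responses, hours_each in burden_rows:
    row_hours = round(responses * hours_each)   # 14, 5, and 30 hours, respectively
    total_responses += responses
    total_hours += row_hours
    print(f"{label}: {responses} x {hours_each} hr = {row_hours} hr")

print(f"Total responses:    {total_responses}")   # 141
print(f"Total burden hours: {total_hours}")       # 49
```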

  9. Cost to the Federal Government

The cost of conducting the cognitive interviews will be $116,000, under the NCES contract to CBSM at the U.S. Census Bureau.

  10. Schedule

Recruit participants through networks and advertisements      December 21, 2020 – March 25, 2021
Conduct Round 1 cognitive testing                             December 28, 2020 – January 22, 2021
Iterative revisions to item wording                           January 25 – February 5, 2021
Conduct Round 2 cognitive testing                             February 8 – February 19, 2021
Iterative revisions to item wording                           February 22 – March 12, 2021
Conduct Round 3 cognitive testing                             March 15 – March 26, 2021
Analysis and Final Recommendations                            March 29 – April 9, 2021
Final Wording for Questionnaire                               April 9, 2021
Finalized Report                                              June 11, 2021




