2019 School Crime Supplement to the National Crime Victimization Survey

OMB: 1121-0184





Supporting Statement – 2019 National Crime Victimization Survey (NCVS) School Crime Supplement (SCS)


B. Collection of Information Employing Statistical Methods


  1. Universe and Respondent Selection

The sample universe for the NCVS School Crime Supplement (SCS) is all persons ages 12 to 18 living in NCVS-interviewed households who have attended public or private school (grades 6 through 12) during the current school year.1 Students who were homeschooled the entire school year are ineligible for the SCS; students who were homeschooled for part of the school year and attended a public or private school during the other part remain eligible. The NCVS sample is drawn from more than 120 million U.S. households and excludes military barracks and institutionalized populations. In 2019, the annual national sample is planned to be approximately 240,000 designated addresses located in 542 stratified Primary Sampling Units (PSUs) throughout the United States.


Frame

The Master Address File (MAF) contains all addresses from the most recent decennial census plus updates from the United States Postal Service, state and local address lists, and other address listing operations. The MAF is the frame for the target NCVS population. Every ten years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. Beginning in 2015, the 2000 sample design was phased out and the 2010 sample design was phased in. Beginning in 2016, some PSUs were removed from the sample, some new PSUs were added to the sample, and some continuing PSUs that were selected for both the 2000 and 2010 designs remained in the sample. The phase-in and phase-out of the sample designs started in January 2015 and continued through December 2017. The new sample sizes are larger than in previous years to support state-level estimates in 22 states.


Rotating Panel Design

The NCVS uses a rotating panel design. The sample for each month of enumeration consists of seven rotation groups. Each group stays in the sample for an initial interview and six subsequent interviews at 6-month intervals, for a total of seven interviews for the typical household. During the course of a 6-month period, a full sample of seven rotation groups is interviewed (one-sixth of each group each month), and one rotation group enters the sample for its first interview each month.
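
For illustration only (this sketch is not part of the survey documentation), the schedule implied by this design can be enumerated in a few lines of code: a household entering the panel in a given month receives an initial interview plus six subsequent interviews at 6-month intervals.

```python
# Illustrative sketch of the rotating panel interview schedule.

def interview_months(entry_month: int) -> list[int]:
    """Return the months (1 = first month of enumeration) in which a
    household entering in `entry_month` is interviewed: the initial
    interview plus six subsequent interviews at 6-month intervals."""
    return [entry_month + 6 * k for k in range(7)]

# A household entering in month 1 is interviewed in months
# 1, 7, 13, 19, 25, 31, and 37 -- three years from first to last interview.
print(interview_months(1))
```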


Sample Selection

The sample design for the NCVS is a stratified, multi-stage cluster sample. Sample selection for the NCVS, and by extension the SCS, has three stages: the selection of primary sampling units (PSUs), the selection of address units within sample PSUs, and the selection of households and persons from those addresses to be included in the sample.

Stage 1. Defining and Selecting PSUs

Defining PSUs – Formation of PSUs begins with listing counties and independent cities in the target area, which for the NCVS is all 50 states and the District of Columbia. The PSUs comprising the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the decennial census and the American Community Survey (ACS). Counties are either grouped with one or more contiguous counties to form a PSU or stand alone as a PSU. The groupings are based on characteristics such as total land area, current and projected population counts, large metropolitan areas, and natural barriers such as rivers and mountains. Decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting (UCR) Program are also used to stratify the PSUs.


After the PSUs are formed, the larger PSUs are included in the sample with certainty and are considered to be self-representing (SR). The remaining PSUs, called non-self-representing (NSR) because only a subset of them is selected, are combined into strata by grouping PSUs with similar geographic and demographic characteristics.

Stratifying PSUs – For the 2010 design, the NSR PSUs are grouped with similar NSR PSUs within states to form strata. Each SR PSU forms its own stratum. The data used for grouping the PSUs are decennial census demographic data, ACS data, and administrative crime data. NSR PSUs are grouped to be as homogeneous as possible. Just as each SR PSU must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is determined by minimizing the between-PSU and within-PSU variances.


Selecting PSUs – The SR PSUs are automatically selected for the sample, or “selected with certainty.” NSR PSUs are sampled with probability proportional to population size using a linear programming algorithm, with one PSU selected from each NSR stratum. The 2010 design NCVS sample includes 339 SR PSUs and 203 NSR PSUs. PSUs are defined, stratified, and selected once every ten years; the 2010 design PSUs were sampled using population data from the 2010 census.
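
As a minimal sketch of probability-proportional-to-size selection (the Census Bureau uses a linear programming algorithm; the simple weighted draw and the strata below are hypothetical illustrations, not the production procedure):

```python
# Sketch: select one PSU per NSR stratum with probability proportional
# to population size. Strata and populations are made up for illustration.
import random

random.seed(2019)

strata = {
    "stratum_A": {"psu_1": 250_000, "psu_2": 400_000, "psu_3": 350_000},
    "stratum_B": {"psu_4": 600_000, "psu_5": 300_000},
}

selected = {}
for stratum, psus in strata.items():
    names = list(psus)
    pops = [psus[name] for name in names]
    # random.choices draws with probability proportional to the weights,
    # here the PSU population counts.
    selected[stratum] = random.choices(names, weights=pops, k=1)[0]

print(selected)  # one PSU selected per stratum
```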

Stage 2. Preparing Frames and Sampling within PSUs

Frame Determination – The 2010 sample design selects its sample from two dynamic address-based sampling frames, one for housing units and one for group quarters (GQs). Both frames are based upon the MAF, which is a national inventory of addresses. The MAF is continually updated by various Census Bureau programs and external sources. New housing units are added to the MAF, and therefore the NCVS sampling frame, through semiannual updates from a variety of address sources, including the U.S. Postal Service Delivery Sequence File, local government files, and field listing operations.


In the 2010 design, each address in the country was assigned to the housing unit or GQ frame based on the type of living quarters. Two types of living quarters are defined in the decennial census. The first type is the housing unit (HU): a group of rooms, or a single room, occupied or intended for occupancy as separate living quarters. An HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters. The second type is the GQ: living quarters where residents share common facilities or receive formally authorized care. About 97% of the population counted in the 2010 Census lived in HUs, and about 3% resided in GQs; of those in GQs, less than half resided in non-institutionalized GQs.


Within-PSU Sampling – All of the Census Bureau’s continuing demographic surveys, such as the NCVS, are sampled together. This procedure takes advantage of updates from the January MAF delivery and ACS data. This within-PSU selection occurs every year for housing units and every three years for GQs.


Selection of samples is done one survey at a time (sequentially). Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of under-coverage errors due to an outdated frame.
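
A minimal sketch of the systematic selection step described above (the frame, sort key, and sample size are hypothetical; the production systems differ):

```python
# Sketch: systematic sampling from a geographically sorted address frame.
# Compute the sampling interval, draw a random start, and take every
# interval-th address thereafter.
import random

random.seed(42)

# Hypothetical frame of addresses with geographic sort keys.
frame = [{"address_id": i, "geo_key": random.random()} for i in range(10_000)]
frame.sort(key=lambda rec: rec["geo_key"])  # sort by geography before sampling

n_sample = 250
interval = len(frame) / n_sample          # sampling interval (40.0 here)
start = random.uniform(0, interval)       # random start within the first interval

sample = [frame[int(start + k * interval)] for k in range(n_sample)]
print(len(sample))
```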


Addresses selected for a survey are removed from the frames, leaving an unbiased, or clean, universe behind for the next survey that is subsequently sampled. By leaving a clean universe for the next survey, duplication of addresses across surveys is avoided. This helps preserve response rates by ensuring that no unit falls into more than one survey sample.


Stage 3. Persons within Sample Addresses

The last stage of sampling is done during the initial contact of the sample address during the data collection phase. For the SCS, if the address is a residence and the occupants agree to participate, then an attempt is made to interview every person ages 12 to 18 who lives at the sample address and completes the NCVS-1. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with the names and other demographic information of all persons who live there. If someone moves out of (or into) the household during the interviewing cycle, he or she is removed from (or added to) the roster.


Approximately 2,688 persons ages 12 to 18 in these households will be eligible to be interviewed for the supplement each month from January to June 2019, for a total of 16,133 possible interviews. Generally, interviewers are able to obtain SCS interviews with approximately 53% of the SCS-eligible household members in occupied units in sample in any given month. A total of 8,567 persons ages 12 to 18 are therefore expected to be interviewed for the SCS during the 6-month collection period.


State Samples

Beginning in January 2016, BJS and the Census Bureau increased and reallocated the existing national sample in the 22 largest states. The states receiving a sample boost are Arizona, California, Colorado, Florida, Georgia, Illinois, Indiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Jersey, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and Wisconsin. In 2017, each of these 22 states had a population greater than 5 million persons, and in total they comprised 79% of the U.S. population.2 In each of the 22 states, enough sample was selected to achieve a 10% relative standard error (RSE) for a three-year average violent victimization rate of 0.02. The underlying assumption of the subnational sample design is that three years of data will be needed to produce precise estimates of violent crime, which is experienced by about 1% of the population. Sample sizes in the remaining 28 states and the District of Columbia were determined to ensure full representation and unbiased estimates at the national level. Unlike the 2000 sample design, in the 2010 design no strata cross state boundaries, and all 50 states and the District of Columbia have at least one sampled PSU.
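
For intuition, the precision target can be translated into an approximate pooled sample size using the standard simple-random-sampling formula inflated by a design effect. In the sketch below, the design effect value is an assumed illustration, not a published NCVS parameter.

```python
# Sketch: person-interviews needed (pooled over three years) for a 10% RSE
# on a violent victimization rate of 0.02, under an assumed design effect.
p = 0.02            # three-year average violent victimization rate
target_rse = 0.10   # 10% relative standard error
deff = 2.0          # assumed design effect (illustrative only)

target_se = target_rse * p                  # 0.002
n_srs = p * (1 - p) / target_se ** 2        # SRS requirement: 4,900
n_required = deff * n_srs                   # inflated for the complex design

print(round(n_srs), round(n_required))      # 4900 9800
```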


BJS and NCES anticipate producing state-level estimates using the 2019 SCS, including estimates of bullying, given the prevalence of bullying during the last few survey administrations.


Weighting and Estimation

The purpose of the SCS is to make inferences about school-related victimization for the population of students ages 12 to 18 in the United States. Before such inferences can be drawn, it is necessary to adjust, or weight, the sample of persons to ensure that it is similar to the entire population in this age group. The SCS weights are a combination of household-level and person-level adjustment factors. Household and person respondents from the NCVS sample are adjusted on a biannual basis to represent the U.S. population age 12 or older. For the SCS, the population is restricted to students ages 12 to 18 who attend public or private school during the current school year.


NCVS household and person weights are first adjusted to account for any subsampling that occurs within large GQs. The NCVS nonresponse weighting adjustment then allocates the sampling weights of nonresponding households and persons to respondents with similar characteristics. Additional factors are then applied to correct for differences between the sample distributions of age, race and Hispanic origin, and sex and the population distributions of these characteristics. The resulting weights are assigned to all interviewed households and persons in the NCVS file.


SCS weighting begins with the NCVS final person weight, which is then multiplied by an SCS noninterview adjustment factor. SCS noninterview adjustment factors are computed by distributing the weights of SCS noninterviews to the weights of the SCS interviews, with adjustment cells determined by age, race and Hispanic origin, and sex. The result is an SCS person-level weight that can be used for producing estimates from the SCS variables.
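
A minimal sketch of this adjustment, using hypothetical records and cell labels (the actual cells are defined by age, race and Hispanic origin, and sex):

```python
# Sketch: SCS noninterview adjustment. Within each adjustment cell, the
# factor is the ratio of the summed NCVS final person weights of all
# SCS-eligible persons to the summed weights of SCS respondents, so
# respondents carry the weight of nonrespondents in their cell.
from collections import defaultdict

# Hypothetical records: (adjustment_cell, ncvs_final_weight, scs_respondent)
persons = [
    ("cell_1", 1200.0, True), ("cell_1", 1100.0, False), ("cell_1", 900.0, True),
    ("cell_2", 1500.0, True), ("cell_2", 1300.0, True), ("cell_2", 1000.0, False),
]

eligible_wgt = defaultdict(float)
respondent_wgt = defaultdict(float)
for cell, wgt, responded in persons:
    eligible_wgt[cell] += wgt
    if responded:
        respondent_wgt[cell] += wgt

factors = {c: eligible_wgt[c] / respondent_wgt[c] for c in eligible_wgt}

# SCS person weight = NCVS final person weight x noninterview factor,
# computed for respondents only.
scs_weights = [(cell, round(wgt * factors[cell], 1))
               for cell, wgt, responded in persons if responded]
print(factors)
print(scs_weights)
```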


Variance Estimates

The NCVS and SCS estimates come from a sample, so they may differ from figures that would be obtained from an enumeration of the entire population using the same questionnaires, instructions, and enumerators. The difference between a sample estimate and the true population parameter is known as sampling error.3 The sampling error quantifies the uncertainty in an estimate that results from observing a sample rather than the entire population.


Variance estimates can be derived using direct estimation or generalized variance functions (GVFs). GVFs for the NCVS are created by the Census Bureau for BJS. The Census Bureau produces GVF parameters that allow the variance of any crime count estimate to be approximated from the value of the estimate itself. To fit the parameters, estimates and their relative variances are fit to a regression model using an iterative weighted least squares procedure in which the weight is the inverse of the square of the predicted relative variance.
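
For illustration, once GVF parameters have been fit, they are applied as a simple function of the estimate. A common GVF specification for a count estimate x is Var(x) = a·x² + b·x; the parameter values below are invented for the example and are not published NCVS parameters.

```python
# Sketch: applying a fitted generalized variance function to approximate
# the standard error of a crime count estimate from the estimate alone.
import math

def gvf_variance(x: float, a: float, b: float) -> float:
    """Approximate variance of a count estimate x under Var(x) = a*x^2 + b*x."""
    return a * x ** 2 + b * x

a, b = -0.00001, 2500.0    # hypothetical fitted GVF parameters
estimate = 1_000_000       # hypothetical crime count estimate

se = math.sqrt(gvf_variance(estimate, a, b))
print(f"SE ~ {se:,.0f}; RSE ~ {se / estimate:.1%}")
```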



2. Procedures for Collecting Information

The SCS is designed to produce national and state-level (for the 22 most populous states) estimates of school-related victimization for the target population – all children ages 12 to 18 living in NCVS households who have attended public or private school during the current school year. The SCS is administered to all NCVS respondents ages 12 to 18 during the 6-month period from January through June 2019.

DATA COLLECTION

For the six-month period from January through June 2019, the SCS will be administered to approximately 126,635 designated households. Each housing unit selected for the NCVS remains in the sample for three years, with each of seven interviews taking place at 6-month intervals.


The NCVS-500 (Control Card) is used to complete a household roster with the names and other demographic information of the household members. For some demographic questions that are asked directly of respondents, flashcards for education, race, Hispanic origin, sexual orientation, gender identity, employment, and household income are used. Respondents are asked to report victimization experiences occurring in the six months preceding the month of interview. The NCVS Crime Screener instrument (NCVS-1) is asked of all respondents age 12 or older in the household and is used to ascertain whether the respondent has experienced a personal crime victimization during the prior six months and is therefore eligible to be administered the NCVS Crime Incident Report instrument (NCVS-2). The NCVS-1 collects the basic information needed to determine whether the respondent experienced a crime victimization (rape or sexual assault, robbery, aggravated or simple assault, personal larceny, burglary, motor vehicle theft, or other household theft). When a respondent reports an eligible personal victimization, the NCVS-2 is then administered to collect detailed information about the crime incident; the NCVS-2 is administered once for each incident the respondent reports. For each victimization incident, the NCVS-2 collects information about the offender (e.g., sex, race, Hispanic origin, age, and victim-offender relationship), characteristics of the crime (including time and place of occurrence, use of weapons, nature of injury, and economic consequences), whether the crime was reported to police, reasons the crime was or was not reported, and victim experiences with the criminal justice system. Clearance for the core NCVS forms and materials, including the NCVS-500, NCVS-1, and NCVS-2, is requested through a separate OMB request and number (OMB NO: 1121-0111).


Each interview period, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screener questions (NCVS-1) of each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each crime incident identified in the crime screener. Once the NCVS interview is completed (i.e., nonvictims have responded to all NCVS-1 screening questions or victims have completed all necessary NCVS-2 incident reports), the interviewer administers the SCS questionnaire to children ages 12 to 18.


The first contact with a household is made by personal visit; for the second through seventh interviews, contacts are made by telephone whenever possible. Approximately half of all interviews conducted each month are by telephone.


SCS Collection

The SCS is designed to calculate national and 22 state-level estimates of school-related victimization for the target population – all children ages 12 to 18 living in NCVS households who have attended school during the current school year.


Initially, each eligible person ages 12 to 18 is asked a short set of screener questions to determine whether they attended school, either public or private, at any time during the current school year. Students are ineligible if they were homeschooled the entire survey period, or if they were enrolled in a grade below 6th, in a GED program, or in college. Students who meet the school criteria are then administered the SCS core instrument.
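
The screener logic can be summarized as a simple rule, sketched below (the function and field names are illustrative, not the actual instrument variables):

```python
# Sketch: SCS eligibility screener. A person is eligible if they are ages
# 12 to 18, attended public or private school at some point in the current
# school year, were not homeschooled the entire period, and were enrolled
# in grades 6-12 (not a lower grade, a GED program, or college).

def scs_eligible(age: int, attended_school: bool,
                 homeschooled_all_year: bool, grade: str) -> bool:
    if not 12 <= age <= 18:
        return False
    if homeschooled_all_year or not attended_school:
        return False
    return grade in {"6", "7", "8", "9", "10", "11", "12"}

print(scs_eligible(14, True, False, "8"))    # True: regular 8th grader
print(scs_eligible(17, True, False, "GED"))  # False: enrolled in a GED program
```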


The SCS instrument is divided into seven primary parts. Specific rationale for each question can be found in Attachment 4. The sections include –

  1. Environmental (school environment) – asks students about their school’s name, type of school, grade levels, access to school and building, student activities, school organizational features related to safety, academic and teaching conditions, student-teacher relations, and drug availability.

  2. Fighting, bullying, and hate behaviors – asks students about the number and characteristics of physical fights, bullying, and hate-related incidents.

  3. Avoidance – asks students whether they avoided certain parts of the school building or campus, skipped class, or stayed home entirely because of the threat of harm or attack.

  4. Fear – follows up with questions on how afraid students feel at school and on their way to and from school.

  5. Weapons – focuses on whether students carried weapons on school grounds for protection or know of any students who have brought a gun to school.

  6. Gangs – asks students about their perception of gang presence and activity at school.

  7. Student characteristics – asks students about their attendance and academic performance.


A split sample design is proposed for the 2019 SCS. The purpose of the split sample is to measure the effects of differences in questionnaire wording while maintaining the historical data trends. After cognitive testing, discussed in further detail in the Test of Procedures section below, it was determined that a number of questions should be revised to improve respondent comprehension. The questions being tested in the split sample design include student participation in school activities, availability of drugs and alcohol at school, student experiences with bullying, and gang presence at school.


Version 1 includes questions from the 2017 SCS, providing a bridge to historical data trends. Version 2 includes revised questions based on cognitive testing. The revisions to the Version 2 questions include –

  • The student participation in school activities question is reordered so the sub-item “spirit groups, for example, Cheerleading, Dance Team, or Pep Club” comes before “Athletic teams at school.” During cognitive testing, some respondents considered cheerleading an athletic team and were not sure how to classify it. To reduce confusion, the ordering of these sub-items is switched.

  • The alcohol and drug availability question includes a sub-item on the availability of opioids. NCES received a request from ED’s Office of Safe and Healthy Students (OSHS) to collect data on opioids as part of a response to the President’s Commission on Combating Drug Addiction and the Opioid Crisis.

  • The bullying series uses revised questions that do not include the word bully in the text of any question. These questions also remind respondents to think about experiences that occurred electronically, as cognitive testing determined that respondents were not thinking of electronic bullying when answering this series of questions.

  • The introduction to the gangs question removes the phrase “we are interested in all gangs, whether or not they are involved in violent or illegal activity.” Feedback from Census field representatives (FRs) and cognitive interviewers indicated confusion about this part of the definition. Because NCES and BJS are primarily interested in measuring the presence of illegal and violent gangs at schools, the statement was removed in Version 2.


BJS and NCES consulted with the Demographic Statistical Methods Division (DSMD) at the Census Bureau to determine whether a split sample would be appropriate. DSMD evaluated a 60/40 split and a 50/50 split. They estimated that the 60/40 split could detect a significant difference (alpha = 0.05) in bullying rates of 4.6%, and the 50/50 split a difference of 4.5%. Although the minimum detectable differences do not vary much between the two options, BJS and NCES propose a 60/40 split for the 2019 SCS for two reasons. First, the coefficients of variation (CV) for the control group (Version 1) are lower under the 60/40 split than under the 50/50 split. Second, the 60/40 split boosts the sample for the control group, which may be advantageous in preserving the historical trend lines.
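
The minimum detectable differences cited above can be approximated with a standard two-proportion power calculation. In the sketch below, the baseline bullying rate, power, and design effect are assumed values chosen for illustration (the submission does not state the inputs DSMD used), though with these particular assumptions the results land near the cited 4.6% and 4.5%.

```python
# Sketch: minimum detectable difference between two proportions for a
# split-sample design, inflated by an assumed design effect.
import math

def min_detectable_diff(n_total, share1, p, z_alpha=1.96, z_power=0.84, deff=3.5):
    """Two-sided MDD at alpha = 0.05 with 80% power for a split of
    n_total into share1 / (1 - share1); deff is an assumed design effect."""
    n1, n2 = n_total * share1, n_total * (1 - share1)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2) * deff)
    return (z_alpha + z_power) * se

n = 8_567   # expected SCS interviews (see Section 1)
p = 0.20    # assumed baseline bullying prevalence

print(f"60/40: {min_detectable_diff(n, 0.60, p):.3f}")  # ~0.046
print(f"50/50: {min_detectable_diff(n, 0.50, p):.3f}")  # ~0.045
```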


  3. Methods to Maximize Response Rates

Contact Strategy

Contact materials focus on the NCVS in general and do not specifically reference the SCS or other supplemental surveys. The Census Bureau mails notifications to households prior to data collection, interviewers contact households for the first time in person, and interviewers conduct nonresponse follow-up. The Census Bureau mails an introductory letter (NCVS-572(L)) or continuing household letter (NCVS-573(L)) explaining the NCVS to the household before the interviewer's visit or call (Attachments 6 and 7). The introductory letters are sent to households before their first NCVS interview, or time-in-sample 1 (TIS-1) interview, and the continuing household letters are sent to households before their time-in-sample 2 through 7 interviews. When they go to a household, interviewers carry cards identifying them as Census Bureau employees. Potential respondents are assured that their answers will be held in confidence and used only for statistical purposes. For respondents who have questions about the NCVS, interviewers provide a brochure (NCVS-110) and can also reference information in their Information Card Booklet (NCVS-554), which contains information such as uses of NCVS data and frequently asked questions and answers. At the FR's discretion, thank-you letters are sent to the household (NCVS-593(L) or NCVS-594(L)). All forms and materials used for contact with the household have been previously approved by OMB (OMB NO: 1121-0111).


The Census Bureau trains interviewers to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. The interviewer obtains demographic characteristics of noninterview persons for use in the adjustment for nonresponse. SCS response rates are monitored on a monthly basis and compared to prior months' averages to ensure their reasonableness.


As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but obtains no interviews. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer’s ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.


As was done in previous years, in 2019 NCES will prepare informational materials about the SCS for FR distribution to parents and students. Designed as brochures, these materials provide answers to frequently asked questions about the SCS and will be produced in both English and Spanish. The student brochure includes answers to such questions as “Do I have to take this survey?” and “Why are my answers to the survey important?” The parent brochure includes answers to such questions as “What is the purpose of this survey?” and “What questions are on the survey for my child?” The parent brochure will also include some illustrative survey findings from the 2015 SCS. Findings will not be included in the student brochure out of concern that they might bias student responses.


The 2019 brochures will be similar to those produced for 2017. The four 2019 brochures are as follows:

  • For parents in English (Attachment 8)

  • For students in English (Attachment 9)

  • For parents in Spanish (Attachment 10)

  • For students in Spanish (Attachment 11)


Interviewer Training

Training for NCVS interviewers consists of classroom and on-the-job training. Initial training consists of a full day of pre-classroom self-study, four days of classroom training, post-classroom self-study, and on-the-job observation and training. Initial training includes topics such as protecting respondent confidentiality, gaining respondent cooperation, answering respondent questions, proper survey administration, use of systems to collect and transmit survey data, and NCVS concepts and definitions, along with completing simulated practice NCVS interviews. The NCVS procedures and concepts taught in initial training are also regularly reinforced for experienced NCVS interviewers through monthly written communications, ongoing feedback from observations of interviews by supervisors, and monthly performance and data quality feedback reports.


NCVS interviewers also receive specific training on the SCS, including eligibility rules, the organization of the SCS interview, the content of the survey questionnaire, addressing potential respondent questions, and the internal check items that are in place to help the interviewer ensure that the respondent is asked the appropriate questions and to follow up when clarification is needed. The SCS training materials are distributed to interviewers approximately a month before the supplement goes into the field.


Monitoring Interviewers

In addition to the above procedures used to ensure high participation rates, the Census Bureau implements additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1, NCVS-2, and SCS exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and “don’t know” responses to a minimum. The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by their Regional Office to assess whether performance and response rate standards are being met and corrective action is taken to assist and discipline interviewers who are not meeting the standards.


Another component of the data quality program is monthly feedback. In 2011, the Census Bureau implemented a series of field performance and data quality indicators; previously, high response rates were the primary measure of interviewer performance. The data quality indicators are tracked through the Census Bureau’s expanded Performance and Data Analysis (Giant PANDA) tool, and monthly reports are provided to the field. Under the revised performance structure, interviewers are monitored on the following –

  • response rates (household, person, and the current supplement in the field);

  • time stamps (the time it takes to administer the screener questions on the NCVS-1 or the crime incident questions on the NCVS-2);

  • overnight starts (interviews conducted very late at night or very early in the morning);

  • late starts (cases not started until the 15th or later in the interview month);

  • absence of contact history records (cases missing records of contact attempts with the household and/or persons within the household); and

  • quality of crime incidents (changes made to the location, presence, or theft data items on the NCVS-2 during post-processing coding operations).

Noncompliance with these indicators results in supervisor notification and follow-up with the interviewer. The follow-up activity may include simple points of clarification (e.g., the respondent works nights and is only available in the early morning for an interview), additional interviewer training, or removal of the interviewer from the survey.


Every effort has been made to make the survey materials clear and straightforward. The SCS instrument has been designed to make collection of the data as concise and easy for the respondent as possible. The SCS questions have been cognitively tested to ensure that they are easily understood by most respondents.


Nonresponse and Response Rates

Interviewers are able to obtain interviews with about 84% of household members in the 78% of occupied sample units that respond in a given month. Beginning with the 2018 data collection year, the Census Bureau plans to report nonresponse and response rates, respondent and nonrespondent distribution estimates, and proxy nonresponse bias estimates for various subgroups. Should the analyses reveal evidence of nonresponse bias, BJS will work with the Census Bureau to assess the impact on estimates and ways to adjust the weights accordingly. Interviewers obtain demographic characteristics of noninterview persons for use in the adjustment for nonresponse.


In 2017, the Census Bureau examined the potential for bias in the SCS estimates because the overall response rate was low. Analysis indicated that respondent and nonrespondent distributions differed significantly for race and Hispanic origin and census region subgroups. However, after applying weights adjusted for person nonresponse, there was no evidence that these response differences introduced nonresponse bias into the final victimization estimates.


  4. Test of Procedures

The revised 2019 SCS questionnaire underwent cognitive testing by Center for Survey Measurement (CSM) staff at the Census Bureau, under the NCES generic clearance for cognitive, pilot, and field test studies (OMB NO: 1850-0803), from December 2017 through June 2018. The cognitive testing focused primarily on how the new bullying questions and their revised order heightened respondents' awareness of what constituted bullying, including the actions taken by the individuals involved, the frequency of those actions, and the relationships between victims and perpetrators.


In general, the 2017 version of the bullying questions performed well, producing estimates of bullying close to those produced from prior years of SCS data. However, research has shown that including a term like bullying, which has a variety of colloquial meanings, in the question wording has the potential to introduce measurement error.4 The next step in improving the questions was to remove the terms bullying and bullied and use a set of behaviorally specific questions that measure the different components included in the bullying definition. The 2017 items were used as a starting point.


The testing was conducted in four rounds, and an iterative methodology was used to identify and address problematic questions at the end of each round. The iterative method allowed for assessment of whether or not revised question wording addressed the problems interviewers observed during the previous rounds. Most of the questions on the instrument performed well and were easy for interviewers to administer, easy for respondents to understand and answer, and thus required no revisions. However, feedback from CSM staff indicated some questions did require revisions.


Overall, respondents reacted favorably to the exclusion of the word bully from the items. While some respondents indicated that students could respond differently if the term were used, others indicated that excluding the word makes it easier to respond because less emotion is involved and students may feel more comfortable discussing their experiences. Results of this approach suggest that not including the term is an appropriate method of collecting information on student experiences of peer victimization.

Feedback from Census field staff administering the 2017 instrument indicated that respondents expressed confusion about when to include incidents of cyberbullying. To address this concern, it was determined that respondents should be reminded to think of incidents that may have occurred electronically, specifically: “Now I have some questions about what students from your school do that make you feel bad or are hurtful to you. These could occur in person or using technologies, such as a phone, the Internet, or social media. During this school year, has any student from your school…” Additionally, a new sub-item focused on rumor-related incidents occurring electronically was added: “Purposely shared your private information, photos, or videos in a hurtful way?” Lastly, this issue was addressed in the social exclusion sub-item: “Excluded you from activities, social media, or other communications to hurt you?” While many respondents had not experienced cyberbullying, these additions performed well during cognitive testing and are proposed for inclusion on the 2019 instrument.


Further refinement of the repetition and power imbalance follow-up questions was necessary to ensure that all situations are covered and that these two components are accurately collected. For example, respondents were asked how many days they experienced hurtful things, and if a respondent indicated only one day, a follow-up question asked how many times the incidents occurred during that one day. NCES and BJS agreed this approach is a more accurate way to collect repetition than in previous administrations. Further refinement of the power imbalance items includes asking whether a student perpetrator did hurtful things to the respondent more than once and directing the respondent to think of those people when responding to the various attributes that constitute a power imbalance (i.e., stronger, popular, money, etc.). Respondents were also asked to specify any other power imbalance not included in the list. This is to ensure that the provided list is a comprehensive list of potential power differentials and will allow sub-items to be added in future administrations if any themes present themselves.


Two new items were added to the bullying section. One asks about the relationship between the respondent and the perpetrator, in order to screen out incidents that involve siblings or current dating partners. The CDC definition states that incidents involving siblings and current dating partners are not to be considered bullying, as these fall under other constructs, such as domestic violence and dating violence.5 To date, these two exclusions have not been considered on the SCS, and given the other substantial changes being made to the bullying items, it seemed appropriate to address this part of the CDC definition as well. The other new item, placed at the end of the section, asks whether the student felt that their experiences were bullying. This item performed well during cognitive testing, and BJS and NCES anticipate this information being valuable for stakeholders, as it will provide insight into how students perceive their school experiences.


Lastly, there were two other noteworthy changes to the instrument. First, per a request from ED’s Office of Safe and Healthy Students (OSHS) and in direct response to the President’s Commission on Combating Drug Addiction and the Opioid Crisis, the drug availability at school question was revised to break out the sub-item on prescription drug availability. This sub-item is now split into two, with one asking specifically about opioids and the other asking about other illegally obtained prescription drugs. The opioid sub-item required minor tweaks between rounds to address comprehension issues, but the final wording – “Heroin or prescription painkillers illegally obtained without a prescription, such as Codeine, Percocet, or fentanyl? These are also known as opioids” – performed well during the last round of testing and meets the needs of OSHS.


The other change is to the introduction of the gang presence question. In prior administrations, the introduction has included confusing language indicating respondents should think about “all gangs, whether or not they are involved in violent or illegal activity.” Census field staff have reported issues when reading this introduction, with students expressing confusion about what the item is asking. In most contexts, a gang is perceived as being involved in nefarious activities, and BJS and NCES agreed that gangs involved in violent and illegal activities are of primary interest. It is proposed to remove this part of the introduction for the 2019 instrument.


The final report outlining these and other recommendations from the cognitive testing, along with the testing protocols, is included with this package as Attachments 12-14.


  5. Consultants on Statistical Aspects of the Design

BJS and NCES take responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, and questionnaires, and overseeing the conduct of the studies and analysis of the data by contractors.


The Census Bureau will collect all information. Ms. Meagan Meuchel is the NCVS Survey Director at the Census Bureau and manages and coordinates the NCVS and its supplements. Mr. David Hornick of the Demographic Statistical Methods Division of the Census Bureau oversees the statistical aspects of the supplement. BJS, NCES, and Census Bureau staff responsible for the SCS include –

BJS Staff (all staff located at 810 7th Street, NW, Washington, DC 20531):

  • Jeffrey H. Anderson, Ph.D., Director

  • Jeri Mulrow, Principal Deputy Director

  • Grace Kena, Acting Chief, Victimization Statistics Unit

  • Rachel Morgan, Ph.D., Statistician, Victimization Statistics Unit

NCES Staff (all staff located at 550 12th Street, SW, Washington, DC 20202):

  • Marilyn Seastrom, Ph.D., Chief Statistician

  • Chris Chapman, Associate Commissioner, Sample Surveys Division

  • Andy Zukerberg, Chief, Cross-Sectional Surveys Branch

  • Rachel Hansen, Project Director, Cross-Sectional Surveys Branch

Census Bureau Staff (all staff located at 4600 Silver Hill Road, Suitland, MD 20746):

  • Meagan Meuchel, NCVS Survey Director, Associate Directorate for Demographic Programs – Survey Operations

  • Jill Harbison, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations

  • Megan Ruhnke, NCVS Assistant Survey Director, Associate Directorate for Demographic Programs – Survey Operations

  • Scott Raudabaugh, Chief, Crime Surveys Programming & Population Support Branch, Demographic Surveys Division

  • David Hornick, Lead Scientist, Demographic Statistical Methods Division


1 Public schools are identified on the Department of Education’s (ED) Common Core of Data (CCD) database (https://nces.ed.gov/ccd/). Charter schools are included in the CCD database and therefore are categorized as public schools. Private schools are identified on ED’s Private School Universe Survey (PSS) (https://nces.ed.gov/surveys/pss/).

2 Table 1. Annual Estimates of the Resident Population for the United States, Regions, States, and Puerto Rico: April 1, 2010 to July 1, 2017 (NST-EST2017-01). Source: U.S. Census Bureau, Population Division. Release Date: December 2017.

3 Everitt, B.S., and Skrondal, A. (2010). The Cambridge Dictionary of Statistics, Fourth Edition. Retrieved from http://www.stewartschultz.com/statistics/books/Cambridge%20Dictionary%20Statistics%204th.pdf.

4 Vaillancourt, T., et al. (2008). Bullying: Are researchers and children/youth talking about the same thing? International Journal of Behavioral Development, 32(6): 486-495.

5 Gladden, R.M., Vivolo-Kantor, A.M., Hamburger, M.E., & Lumpkin, C.D. Bullying Surveillance Among Youths: Uniform Definitions for Public Health and Recommended Data Elements, Version 1.0. Atlanta, GA; National Center for Injury Prevention and Control, Centers for Disease Control and Prevention and U.S. Department of Education; 2014. http://www.cdc.gov/violenceprevention/pdf/bullying-definitions-final-a.pdf.


