2015 SCS OMB Supporting Statement B FINAL


School Crime Supplement to the National Crime Victimization Survey

OMB: 1121-0184





B. Description of Statistical Methodology


1. Respondent Universe


The sample universe is all children ages 12 to 18 living in households who have attended school (grade 12 or below) during the previous six months. We expect that approximately 14,461 persons between the ages of 12 and 18 will complete the core NCVS and be eligible for the SCS from January through June of 2015. Based on the 2013 student completion rate of 59.9%, we anticipate that about 8,662 of these persons will complete the 2015 SCS. Exhibit 5 presents the household completion rates, student completion rates (among NCVS respondents eligible for the SCS), and overall SCS response rates (calculated by multiplying the household completion rate by the student completion rate) from 1989 through 2013.


Exhibit 5: SCS response rates (percent)

Year    Household Completion Rate    Student Completion Rate    Overall SCS Response Rate
1989    96.5                         86.5                       83.5
1995    95.1                         77.5                       73.7
1999    93.8                         77.6                       72.8
2001    93.1                         77.0                       71.7
2003    91.9                         69.6                       64.0
2005    90.6                         61.7                       56.0
2007    90.4                         58.3                       52.7
2009    91.7                         55.9                       51.3
2011    90.7                         63.3                       57.4
2013    85.5                         59.9                       51.2

Exhibit 5 shows that 59.9% of the eligible SCS sample completed the 2013 SCS. The remaining 40.1% were noninterviews. Reasons for noninterviews include: (1) the person was an NCVS noninterview; (2) the person completed the NCVS interview but refused or was unavailable for the SCS interview; (3) the person was physically or mentally unable to answer the questions and no proxy was available; or (4) an acceptable proxy respondent refused to give a proxy SCS interview. The majority (76%) of SCS noninterviews occurred because the person refused to respond to the NCVS and was therefore not eligible to complete the SCS. Approximately 11% of SCS noninterviews were due to parental refusal. Excluding NCVS noninterviews, the 2013 SCS response rate was 86%.


OMB guidelines require a nonresponse bias analysis for all surveys with an overall unit response rate of less than 80 percent. Because the SCS unit response rate fell below this threshold in 2005, 2007, 2009, 2011, and 2013, the Census Bureau conducted a unit nonresponse bias analysis for each of those years. Nonresponse can substantially weaken survey data: it increases variance by reducing the effective sample size, and it produces bias if nonrespondents differ from respondents on characteristics of interest.


Unit nonresponse bias arises when respondents and nonrespondents have different response propensities and differ on the survey variables of interest. Its magnitude is determined by the response rate and by the size of the differences between respondents and nonrespondents on key survey variables. The analysis cannot measure nonresponse bias directly, because the SCS is a sample survey and it is not known how nonrespondents would have answered. However, the SCS sampling frame includes four key student or school characteristics that are known for both respondents and nonrespondents: sex, race/ethnicity, household income, and urbanicity, all of which are associated with student victimization. To the extent that response rates differ across these groups, nonresponse bias is a concern.
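
As a rough illustration, the standard approximation expresses the bias of a respondent mean as the nonresponse rate multiplied by the difference between respondents and nonrespondents. The values in the sketch below are hypothetical, since these quantities are not observed for the SCS.

```python
def nonresponse_bias(response_rate: float,
                     mean_respondents: float,
                     mean_nonrespondents: float) -> float:
    """Approximate bias of a respondent mean relative to the full sample:
    (1 - response rate) * (respondent mean - nonrespondent mean)."""
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Hypothetical example: a 59.9% response rate with a 4.0% victimization
# rate among respondents and 5.0% among nonrespondents implies a bias of
# about -0.4 percentage points.
print(nonresponse_bias(0.599, 0.040, 0.050))
```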


In 2005, the analysis of unit nonresponse bias found evidence of bias for the race, household income, and urbanicity variables. White (non-Hispanic) respondents and respondents of other (non-Hispanic) races had higher response rates than Black (non-Hispanic) and Hispanic respondents. Respondents from households with incomes of $35,000–$49,999 or $50,000 or more had higher response rates than those from households with incomes of less than $7,500, $7,500–$14,999, $15,000–$24,999, or $25,000–$34,999. Respondents living in urban areas had lower response rates than those living in rural or suburban areas. Although the extent of nonresponse bias cannot be determined, weighting adjustments, which corrected for differential response rates, should have reduced the problem.


In 2007, the analysis of unit nonresponse bias found evidence of bias for the race/ethnicity and household income variables. Hispanic respondents had lower response rates than respondents of other races/ethnicities. Respondents from households with an income of $25,000 or more had higher response rates than those from households with incomes of less than $25,000. However, when responding students were compared to the eligible NCVS sample, there were no measurable differences between the two groups, suggesting that the nonresponse bias had little impact on the overall estimates.


In 2009, the analysis of unit nonresponse bias found evidence of potential bias for the race/ethnicity and urbanicity variables. White students and students of other races/ethnicities had higher response rates than did Black and Hispanic respondents. Respondents from households located in rural areas had higher response rates than those from households located in urban areas. However, when responding students were compared to the eligible NCVS sample, there were no measurable differences between the two groups, suggesting that the nonresponse bias had little impact on the overall estimates.


In 2011, the analysis of unit nonresponse bias found evidence of potential bias for the age variable. Respondents 12 to 17 years old had higher response rates than did 18-year-old respondents in the NCVS and SCS interviews. The nonresponse bias analysis of the 2013 SCS found nonresponse bias in many of the same subgroups as in the 2011 SCS.1 Weighting the data adjusts for unequal selection probabilities and for the effects of nonresponse. The weighting adjustments that correct for differential response rates are created by region, age, race, and sex, and should have reduced the effect of nonresponse.
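
The general form of such a weighting-class adjustment can be sketched as follows; the adjustment cells and weights shown are hypothetical, not the Census Bureau's production values.

```python
from collections import defaultdict

def adjust_weights(cases):
    """Within each adjustment cell, multiply each respondent's base weight
    by (total weight in cell / respondent weight in cell) so respondents
    carry the weight of nonrespondents in the same cell."""
    total = defaultdict(float)
    responded = defaultdict(float)
    for case in cases:
        total[case["cell"]] += case["weight"]
        if case["responded"]:
            responded[case["cell"]] += case["weight"]
    for case in cases:
        if case["responded"]:
            factor = total[case["cell"]] / responded[case["cell"]]
            case["adj_weight"] = case["weight"] * factor
    return cases

# Hypothetical cell defined by region, age group, race, and sex.
sample = [
    {"cell": ("South", "12-14", "White", "F"), "weight": 120.0, "responded": True},
    {"cell": ("South", "12-14", "White", "F"), "weight": 110.0, "responded": False},
]
print(adjust_weights(sample)[0]["adj_weight"])  # 120 * (230 / 120) = 230.0
```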


Response rates for most SCS survey items were high in all survey years (typically over 97% of eligible respondents answered a given item), so there is little potential for item nonresponse bias for most items in the survey. Weights were developed to compensate for differential probabilities of selection and nonresponse. The weighted data permit inferences about the eligible student population enrolled in school in all SCS data years.


2. Statistical Methodology


The NCVS sample of living quarters is drawn from the more than 130 million housing units and group quarters nationwide. In general, military barracks and institutionalized group quarters are excluded from the NCVS frame. The national sample of housing units consists of approximately 69,000 designated addresses located in 320 stratified primary sampling units (PSUs) throughout the United States. The sample consists of six parts, each of which is designated for interview in a given month and again at six-month intervals. Beginning in 2005, new sample addresses were introduced based upon the 2000 Decennial Census of Population and Housing. Housing constructed since that census is also sampled. In 2015, some of the sample addresses selected from the 2000 Census will be phased out, and new sample addresses updated from the 2010 Census and recent United States Postal Service files will be phased in. The introduction of the new sample will not affect the 2015 SCS, because the new sample will be introduced in late 2015, after SCS data collection is complete.


The NCVS uses a rotating sample. The sample consists of six groups for each month of enumeration. Each of these groups stays in the sample for an initial interview and six subsequent interviews. Over the course of a six-month period (the length of time the SCS will be in the field), a full sample of six rotation groups will be interviewed, one-sixth each month. In addition, one rotation group enters the sample for its first interview each month.
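
A simplified sketch of this rotation schedule follows; the month numbering is illustrative only and does not reflect the actual NCVS rotation chart notation.

```python
def interview_months(first_month: int, interviews: int = 7, interval: int = 6):
    """Months (numbered from 1) in which a rotation group is interviewed:
    an initial interview plus six subsequent interviews at six-month
    intervals."""
    return [first_month + i * interval for i in range(interviews)]

# A rotation group entering the sample in month 3 is interviewed in:
print(interview_months(3))  # [3, 9, 15, 21, 27, 33, 39]
```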


The NCVS sample has also undergone a 26% sample reinstatement. Beginning in October 2010, BJS began to reinstate the sample that was removed in 2007, increasing the monthly sample from about 8,500 households to about 10,700 households. The reinstated sample was completed by June of 2011. The sample reinstatement is an important component of BJS’s effort to restore the capabilities of the NCVS to measure the impact of crime in the United States. It reduces the standard errors associated with violent crime by about 5%, and enables more robust analyses of annual estimates of crime victimization and the characteristics of crimes and crime victims.


Beginning in July 2013, additional sample was also added in 11 states to study the feasibility of making state-level estimates. Enough sample was added to each state to achieve a coefficient of variation of 0.10 on the three-year average violent victimization rate. The 2013 NCVS Sample Boost increased the NCVS monthly sample by 2,200 designated sample units. Because the 2013 Sample Boost is a feasibility study, SCS interviews conducted for these cases are excluded from SCS analyses and reports.
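
As a rough illustration of how a CV target translates into sample size, the following sketch assumes simple random sampling, so the standard error shrinks in proportion to 1/sqrt(n); the state-level figures shown are hypothetical, and actual NCVS design effects would alter the result.

```python
def required_n(current_n: float, current_cv: float,
               target_cv: float = 0.10) -> float:
    """Sample size needed to reach target_cv, where CV = SE / estimate and
    SE is assumed to shrink in proportion to 1 / sqrt(n)."""
    return current_n * (current_cv / target_cv) ** 2

# Hypothetical state: 1,000 sample units with a current CV of 0.18 on the
# violent victimization rate would need roughly 3,240 units.
print(round(required_n(1_000, 0.18)))
```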


During each interview period, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screen questions (NCVS-1) for each household member 12 years old or older. The interviewer then completes a crime incident report (NCVS-2) for each reported crime incident identified in the crime screener. Following either the screener or the administration of the crime incident report, depending on whether a crime was reported, each household member 12 to 18 years of age will be administered the SCS. Each household member provides the information by self-response. Proxy respondents are permitted under very limited circumstances and represent 5.1% of all interviews. All forms and materials used for the NCVS screener and crime incident report have been previously approved by OMB (OMB No. 1121-0111). The 2015 SCS instrument is attached as Attachment 9.


The first contact with a household is by personal visit and subsequent contacts may be by telephone. For the second through seventh visits, interviews are done by telephone whenever possible. Approximately 52% of the interviews conducted each month are by telephone.


SAMPLING


The SCS is designed to produce national estimates of school crime and safety for the target population: the noninstitutional resident population ages 12 to 18. The SCS will be administered to all age-eligible NCVS respondents during the six-month period from January through June 2015. The frame used to reach the target NCVS population is the list of addresses of all living quarters in the U.S., compiled from the most recent decennial census and from lists of housing units constructed since that census. Sample selection for the NCVS, and by extension the SCS, has three stages: the selection of primary sampling units or areas known as PSUs, the selection of address units within sample PSUs, and the determination of persons and households to be included in the sample.


Survey estimates are derived from a stratified, multistage cluster sample. The PSUs composing the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the decennial census. The larger PSUs are included in the sample automatically and are considered self-representing (SR), since all of them are selected with certainty. The remaining PSUs, called non self-representing (NSR) because only a subset of them is selected, are combined into strata by grouping PSUs with similar geographic and demographic characteristics, as collected in the decennial census from which the sample is drawn. For the NCVS, administrative crime data drawn from the FBI's Uniform Crime Reporting Program are also used to stratify the PSUs.


Stage 1. Defining and Selecting PSUs


Defining PSUs - Formation of PSUs begins with listing counties and independent cities in the target area. For the NCVS, the target area is the entire country. The counties are either grouped with one or more contiguous counties to form PSUs or are PSUs all by themselves. The groupings are based on certain characteristics such as total land area, current and projected population counts, large metropolitan areas, and potential natural barriers such as rivers and mountains. The resulting county groupings are called PSUs.


After the PSUs are formed, the large PSUs and those in large urban areas are designated self-representing (SR). The smaller PSUs are designated non self-representing (NSR). Determining which PSUs are considered small and which are large depends on the survey’s SR population cutoff. An SR PSU must be large enough in population to support at least one field representative with a full workload, never cross state boundaries, and compact enough to minimize the amount of travel required by the field representatives.


Stratifying PSUs – The NSR PSUs are grouped with similar NSR PSUs within census divisions (New England, Middle Atlantic, East North Central, West North Central, South Atlantic, East South Central, West South Central, Mountain, and Pacific) to form strata. Each SR PSU forms its own stratum. The data used for grouping the PSUs consist of decennial census demographic data and administrative crime data. As stated earlier, NSR PSUs are grouped to be as similar, or homogeneous, as possible. Just as each SR PSU must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is the one that minimizes the between-PSU and within-PSU variance.


Selecting PSUs – The SR PSUs are automatically selected for sample or “selected with certainty.” One NSR PSU is selected from each grouped stratum. The NSR PSUs are sampled with probability proportional to the population size.
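
The following sketch illustrates probability-proportional-to-size selection of one PSU per stratum, as described above; the PSU names and population counts are hypothetical.

```python
import random

def select_pps(stratum):
    """Select one PSU from a stratum with probability proportional to its
    population share of the stratum total."""
    total = sum(pop for _, pop in stratum)
    point = random.uniform(0, total)
    cumulative = 0.0
    for psu, pop in stratum:
        cumulative += pop
        if point <= cumulative:
            return psu
    return stratum[-1][0]  # guard against floating-point edge cases

# Hypothetical NSR stratum: PSU A is selected with probability 0.5.
stratum = [("PSU A", 250_000), ("PSU B", 150_000), ("PSU C", 100_000)]
print(select_pps(stratum))
```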


Stage 2. Preparing Frames and Sampling within PSUs


Frame Determination: 2000 Design – To ensure adequate coverage for the target population, the Census Bureau defines and selects sample from four address lists called frames: the unit frame, the area frame, the group quarters frame, and the new construction or permit frame. Each address in the country is assigned to one and only one of these frames. Which frame an address is assigned to depends on four factors: (1) what type of living quarters are at the address, (2) when the living quarters were built, (3) where the living quarters were built, and (4) how completely the street address was listed. The main distinction between the frames is the procedures used to obtain the sample addresses.


Frame Determination: 2010 Design – In 2015, some of the NCVS sample will be selected from the Census Bureau's Master Address File (MAF). The MAF contains addresses for housing units and group quarters that were collected in the 1990, 2000, and 2010 censuses. The MAF is further updated twice a year with new addresses from the United States Postal Service. Although most of the 2015 NCVS sample was selected from the four frames developed as part of the 2000 sample design, some of the 2015 NCVS sample addresses were selected from the housing unit and group quarters addresses on the MAF.


Two types of living quarters are defined in the decennial census. The first type is a housing unit. A housing unit (HU) is a group of rooms or a single room occupied as separate living quarters or intended for occupancy as separate living quarters. A housing unit may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters. Before the 2000 decennial census, separate living quarters were defined as a space in which the occupants live and eat separately from all the other persons on the property and have direct access to their living quarters from the outside or through a common hall or lobby as found in apartment buildings. Beginning with the 2000 decennial census, the criteria for separate living quarters are that the occupants must live separately from any other individuals in the building and have direct access from outside the building or through a common hall or entry. Eating separately is no longer a criterion.


The second type of living quarters is group quarters (GQ). Group quarters are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 census resided in group quarters. Of those, about half resided in non-institutionalized group quarters. About 97% of the population counted in the 2010 census lived in housing units.


Within-PSU Sampling – All the Census Bureau's continuing demographic surveys, such as the NCVS, are sampled together shortly after the most recent decennial census. This takes advantage of newly available census data showing population growth and demographic changes, as well as updated unit address lists. Roughly a decade's worth of sample is selected at that time. Selection of samples is done one survey at a time (sequentially) and one frame at a time (independently). Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select housing units from each frame. For the unit and the GQ frames, actual unit addresses are selected and reserved for the NCVS. In the area frame, a specified number of living quarters in a specific geographic location is promised to the NCVS, and after the address listing operation in that geographic area, the specific unit addresses are assigned. Similarly, in the permit frame, empty placeholders are selected for the NCVS within the PSU. Then, over time, as new permits are issued, the placeholders are replaced with the addresses of newly built housing units.
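
A minimal sketch of the systematic sampling step appears below, assuming a frame already sorted as described above; the addresses are hypothetical.

```python
import random

def systematic_sample(frame, n):
    """Take every k-th unit from a sorted frame after a random start,
    where k = len(frame) / n."""
    k = len(frame) / n
    start = random.uniform(0, k)
    return [frame[int(start + i * k)] for i in range(n)]

# Hypothetical frame of 1,000 addresses, already sorted geographically.
frame = [f"address-{i:04d}" for i in range(1_000)]
print(systematic_sample(frame, 5))
```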


Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey to be sampled. Leaving a clean universe for the next survey avoids duplication of addresses between surveys and helps preserve response rates by ensuring that no unit falls into more than one survey sample.


Beginning in 2015, NCVS will phase in some new addresses in PSUs that are common to both the 2000 and 2010 designs.


Stage 3. Sampling within Sample Addresses


The last stage of sampling is done during the initial contact of the sample address in the data collection phase. For the SCS, if the address is a residence and the occupants agree to participate, an attempt is made to interview every person age 12 to 18 who lives at the sample address and completes the NCVS-1. There are procedures to determine who lives in the sample unit, and a household roster is completed with each person's name and other demographic information. If someone moves out (or in) during the interviewing cycle, he or she is removed from (or added to) the roster.



DATA COLLECTION


The SCS will be administered from January through June 2015. Initially, each eligible person age 12 to 18 is asked a short set of screener questions to determine whether he or she attended school, either public or private, at any time during the current school year. Students who were home-schooled for the entire survey period are screened out. Students who did attend school are then administered the SCS core instrument.


The SCS instrument is divided into seven primary parts: (1) school environment, (2) fighting, bullying, and hate behaviors, (3) avoidance, (4) fear, (5) weapons, (6) gangs, and (7) student characteristics. The school environment section asks students about their school's name, type, grade levels, access to the school and building, student activities, school organizational features related to safety, academic and teaching conditions, student-teacher relations, and drug availability. Section two, fighting, bullying, and hate behaviors, asks students about the number and characteristics of physical fights, physical and cyber bullying, and hate-related incidents. Section three, avoidance, asks students whether they avoided certain parts of the school building or campus, skipped class, or stayed home entirely because of the threat of harm or attack. Section four, fear, follows up with questions on how afraid students feel in school and on their way to and from school. Section five, weapons, focuses on whether students carried weapons onto school grounds for protection or know of any students who have brought a gun to school. Section six, gangs, asks students about their perception of gang presence and activity at school. Finally, section seven asks students about their attendance and academic performance. Justifications for the sections and items can be found in Attachment 14.


For the 2015 SCS, extensive work was completed to evaluate the information collected on bullying during past SCS administrations. As a result, a new format for collecting detailed information on bullying was developed. These changes were made to align the results with the updated definition of what constitutes bullying and to respond to information on students' changing views of bullying and cyber-bullying documented through cognitive testing and the input of researchers on the TRP. However, because the SCS is an ongoing survey with many years of national trend data on bullying, the goal is to maintain comparability over time.


Consequently, a split sample design is proposed for the bullying questions in the 2015 SCS administration. The purpose of the split sample is to compare the effect of two different measures on the reporting of bullying in school. The first measure of bullying is the old series of questions (from the 2013 SCS and earlier) with follow-up questions about power imbalance and repetition (part of the CDC's uniform definition).2 This series of questions will maintain the trend in the bullying data. The second measure of bullying encompasses all parts of the CDC's uniform definition. Researchers can use these estimates to create a conversion factor that will allow future administrations of the SCS, using only the new version of the bullying question series, to be compared to historical data.


BJS and NCES consulted with the Demographic Statistical Methods Division (DSMD) at the Census Bureau to determine whether a split sample would be appropriate.3 DSMD evaluated a 50 percent/50 percent split and a 25 percent/75 percent split, estimating that the 50/50 split could detect a difference in bullying rates of 10 percent as significant, while the 25/75 split could detect a difference of 11.5 percent. Based on these results, NCES decided to move forward with the 50/50 split sample for the 2015 SCS. Attachment 13 is the Census Bureau memo containing additional details on the split sample evaluation.
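
The relationship between the two figures can be verified under a simplifying assumption: for a fixed total sample, the minimum detectable difference between two groups scales with sqrt(1/n1 + 1/n2). The sketch below, which ignores design effects and is not the DSMD calculation itself, reproduces the 10 percent and 11.5 percent figures.

```python
import math

def mde_factor(share1: float, share2: float) -> float:
    """Relative minimum detectable difference for a two-group comparison
    with group shares share1 and share2 of a fixed total sample."""
    return math.sqrt(1 / share1 + 1 / share2)

# The 25/75 split's detectable difference relative to the 50/50 split:
ratio = mde_factor(0.25, 0.75) / mde_factor(0.50, 0.50)  # about 1.155
print(f"{10 * ratio:.1f} percent")  # 10 percent scales to 11.5 percent
```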


Once the conversion factor is calculated from the 2015 SCS data, it is anticipated that future survey administrations will require only the second version of the bullying question series. NCES will produce a technical report examining both bullying estimates from the 2015 SCS.
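
As an illustration only (the actual method will be detailed in the NCES technical report), one simple form such a conversion factor could take is the ratio of the prevalence estimates produced by the two question series in the split sample; all numbers below are hypothetical.

```python
def conversion_factor(p_old: float, p_new: float) -> float:
    """Ratio linking estimates from the new question series back to the
    historical (old) series."""
    return p_old / p_new

# Hypothetical split-sample results: 21.5% bullying prevalence under the
# old series and 19.0% under the new series.
factor = conversion_factor(0.215, 0.190)

# A hypothetical future estimate of 18.0% from the new series, expressed
# on the historical scale:
print(f"{0.180 * factor:.3f}")  # about 0.204
```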


3. Maximizing Response Rates


The Census Bureau mails an introductory letter (NCVS-572(L) or NCVS-573(L); see Attachments 15 and 16) explaining the NCVS to the household before the interviewer's visit or call. When they visit a house, interviewers carry cards and portfolios identifying them as Census Bureau employees. The Census Bureau trains interviewers to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. Interviewers record the demographic characteristics of noninterviewed persons and the race of noninterviewed households for use in the nonresponse adjustment. SCS response rate reports will be generated monthly and compared to the previous month's average to ensure their reasonableness.


As part of their job, interviewers are instructed to keep noninterviews to a minimum. Maintaining a low nonresponse rate involves the interviewer’s ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers reduce their noninterviews by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.


In addition to the above procedures used to ensure high participation rates, beginning in 2011, interviewers were required to complete a two-day refresher training course designed to reinforce standards for collecting accurate data. Following the refresher training, the Census Bureau implemented additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1 and NCVS-2 exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and "don't know" responses to a minimum.


The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by each Regional Office to assess whether performance and response rate standards are being met, and corrective action is taken to assist or discipline interviewers who do not meet the standards.


As was done in 2013, NCES will prepare a number of informational materials about the SCS in 2015 for field representative (FR) distribution to parents and students. Designed as brochures, these materials provide answers to frequently asked questions about the SCS and will be produced in both English and Spanish. The student brochure includes answers to such questions as "Do I have to take the survey?" and "Why are my answers to the survey important?" The parent brochure includes answers to such questions as "What is the purpose of the survey?" and "What questions are on the survey for my child?" The parent brochure will also include some illustrative survey findings from the 2011 SCS. Findings will not be included in the student brochure out of concern that they might bias student responses.


The 2015 brochures will be similar to those produced for 2013. The four 2013 brochures are as follows:

  • For Parents in English (see: Attachment 17)

  • For Students in English (see: Attachment 18)

  • For Parents in Spanish (see: Attachment 19)

  • For Students in Spanish (see: Attachment 20)


For the core NCVS, interviewers are able to obtain interviews with about 88% of household members in 83% of the occupied units in sample in any given month. Only household members who have completed the NCVS-1 will be eligible for the SCS. The response rates for the last two administrations of the School Crime Supplement (2011 and 2013) were 57.4% and 51.2%, respectively (see Exhibit 5).

4. Test of Procedures

The revised 2015 SCS questionnaire underwent cognitive testing by Census Bureau staff in November 2013 and February 2014. The cognitive testing focused primarily on the current (2013 SCS) and proposed bullying questions. The purpose of the cognitive testing was to (1) examine whether the proposed new questions were well understood by the target population, and (2) establish the validity of the new questions (e.g., whether students constructed responses based on the intended information reflected in each survey item). Conducting multiple rounds of cognitive testing allowed some questions to be revised based on the results of the first round. Evidence from the study indicated that the final versions of the questions were well understood and captured the intended information.


One key recommendation that emerged was to present the bullying questions in two different ways using a split sample design for the 2015 SCS administration. There were too few students involved in the cognitive lab testing to reliably estimate how overall bullying frequency would be affected by a change to the bullying questions on the instrument to reflect the CDC’s uniform definition. Using a split sample design in the 2015 SCS data collection will allow for maintaining the trend in bullying data, as well as testing the new measures of bullying.

Following OMB approval of the revised SCS, the Census Bureau will program the survey instrument into an automated computer-assisted personal interviewing (CAPI) instrument. Census Bureau staff, including instrument developers and project management staff, will conduct internal testing of the CAPI instrument.


Interviewers will be provided with an SCS self-study guide, which they must complete before conducting any interviews. Interviewer training is usually conducted a month before the first month of interviewing. This allows interviewers time to familiarize themselves with the survey content and any special instrument functionality specific to conducting SCS interviews.


5. Consultants on Statistical Aspects of the Design

The Census Bureau will collect all information. Dr. Timothy Kennel and Mr. Joseph Croos of the Census Bureau's Demographic Statistical Methods Division provided consultation on the statistical aspects of the supplement. Ms. Meagan Wilson heads the NCVS team under the Associate Director for Demographic Programs, which manages and coordinates the NCVS and the SCS supplement. Dr. Marilyn Seastrom, NCES's Chief Statistician, has attended and will continue to attend a Census Bureau-organized meeting of statisticians to consult on the statistical aspects of the survey.

1 See Attachment 21.

2 Gladden, R.M., Vivolo-Kantor, A.M., Hamburger, M.E., & Lumpkin, C.D. Bullying Surveillance Among Youths: Uniform Definitions for Public Health and Recommended Data Elements, Version 1.0. Atlanta, GA; National Center for Injury Prevention and Control, Centers for Disease Control and Prevention and U.S. Department of Education; 2014. http://www.cdc.gov/violenceprevention/pdf/bullying-definitions-final-a.pdf.

3 See Attachment 13.


