School Crime Supplement to the National Crime Victimization Survey

1st Passback Memo

OMB: 1121-0184


MEMORANDUM


To: Rochelle Wilkie Martinez


From: Katrina Baum


Cc: Michael Rand, Jeremy K. Shimer, Kathryn Chandler, Jill DeVoe, Lynn Bauer


Date: December 19, 2008


Re: BJS Responses to OMB Passback for the 2009 School Crime Supplement



NCVS

Passback Item: Please provide a list, by year, of all of the NCVS supplements for the past 3 years and projected for the next 3 years.

Response: NCVS supplements, by year:

2006 – Supplemental Victimization Survey (stalking), January to June
2007 – School Crime Supplement, January to June
2008 – Identity Theft Supplement, January to June; Police Public Contact Survey, July to December
2009 – School Crime Supplement, January to June
2010 – n/a
2011 – School Crime Supplement, January to June
2012 – n/a

Passback Item: Please clarify whether households are eligible for a supplement during their first month in sample for the NCVS. If so, has BJS or Census analyzed any attrition impacts of introducing a supplement at this time?

Response: Yes, households that are in the sample for the first time are eligible for the supplements. No analyses have been conducted examining whether introducing a supplement at this point affects attrition.

NCVS School Crime Supplement

Passback Item: Please update the SS to remove outdated references (e.g., A1 indicates that the final questionnaire is not yet available; Part B contains references to the bounding interview and to cognitive testing yet to be conducted).

Response: Please see the revised supporting statement.

Passback Item: A4 seemed incomplete and perhaps dated. It should at least discuss SSOCS in addition to, or in lieu of, older one-time collections.

Response: We have revised the language in A4 to discuss more recent efforts and the SSOCS collection. Please see the revised supporting statement.

Passback Item: A14 appears to have inconsistent cost figures. Please reconcile and correct.

Response: The inconsistent cost estimate for Census has been updated to $860,000 total. This revised figure appears in the narrative and the table in item A14.



Passback Item: A15 references “an increase in the person level nonresponse for the 12-18 year old age group.” Please provide more than 2 data points and indicate if you anticipate a further drop in 2009.

Response: The nonresponse rates for 1999, 2001, and 2003 are 18.2%, 18.4%, and 23.6%, respectively. We do not know whether there will be a further drop in response rates in 2009, but the Field Division at the Census Bureau will implement the following tactics to minimize nonresponse:


1. Census Field Division will stress to each regional office (RO) the importance of minimizing "Z" rates among eligible respondents 12-18 years of age before any interviewing begins in January.

2. Census Field Division will produce weekly reports containing "A" and "Z" rates. The reports will also contain messages stressing the importance of the supplement. RO NCVS supervisors will be asked to closely monitor the performance of each field representative (FR) and supervisory FR with 12-18 year old eligible SCS respondents.

3. We will also ask supervisors and office assistants to carefully review the notes for any case returned to the RO as a partial (outcome code 203). They will then be encouraged to attempt to complete interviews with eligible SCS respondents.

4. The Field Division liaison will also review case notes and recommend to the respective ROs any necessary follow-up with eligible SCS respondents.


Field Division staff will also meet with Census program managers, as needed, to review response rates and implement suggestions that may improve overall performance.

Passback Item: A16 should cover tabulation/publication plans as well as schedule. Please provide a paragraph or more about the planned publication(s) and the type of statistics to be developed from this supplement.

Response: Please see the revised supporting statement.

Passback Item: What are the specific impacts on estimates from the reduced sample size? Are any estimates unlikely to be produced as a result?

Response: Although variances will increase, it is too early to tell whether particular estimates will be affected by the reduced sample size.
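
As a first-order sketch of the variance effect (assuming simple random sampling; the NCVS complex design would add a design effect on top of this), the standard error of an estimate scales inversely with the square root of the sample size, so cutting the sample from $n$ to $n'$ interviews inflates standard errors by roughly $\sqrt{n/n'}$:

\[
\mathrm{SE}(\bar{y}) \approx \frac{s}{\sqrt{n}}, \qquad \frac{\mathrm{SE}_{n'}}{\mathrm{SE}_{n}} \approx \sqrt{\frac{n}{n'}}
\]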

Passback Item: Does it typically take 3½ years for data from this supplement to be released (as is indicated for the 2005 data)?

Response: No, it takes about 2 years to release the SCS data file. The SCS data file is produced after the main NCVS data file for that year, so any delay with the main NCVS also delays the SCS. Data year 2005 was an anomaly: the file was originally issued in December 2006 and rereleased in April 2008 because of errors in the weighting. See the collection and release schedules for previous collections, and more realistic anticipated release dates for future data files, pasted below.




Passback Item: When is the 2007 data scheduled to be released?

Response: January 2009. A publication entitled Student Victimization in U.S. Schools: Results From the 2007 School Crime Supplement to the National Crime Victimization Survey is currently in NCES’ publication plan and will be published in June 2009.

Passback Item: Given that the 2005 and 2007 data are not yet released, how is it possible that the 2009 data will be released in 2009?

Response: The 2005 data were released on April 30, 2008. The 2007 data will be released in January 2010, and we anticipate a similar processing time for the 2009 data, with a release date of January 2011.

Passback Item: B2 – Please provide an update on whether the 2007 data indicate a break in series.

Response: Currently, estimates at the urban, suburban, and rural levels constitute a break in series. Geographic data are being examined on an ongoing basis to determine whether estimates at this level can be released. Data at the national level do not constitute a break in series.

Passback Item: Has BJS or NCES looked at seasonality effects (e.g., winter versus spring) on responses for this supplement?

Response: Although neither BJS nor NCES has done a seasonality study as it relates to the data collection period, the supplement has consistently been collected during this time frame since its inception in 1989 and in the five subsequent administrations. One reference suggesting that there are no monthly differences in school crime is Parker, Smith, Smith, and Toby (1991).

Passback Item: B3 – Please provide your nonresponse bias analysis plan.

Response: NCES’ bias analysis is conducted in accordance with its standards, which can be found on page 85 of the following document: http://nces.ed.gov/pubs2003/2003601.pdf


A summary of the unit- and item-level bias analyses performed in 2005 is attached for your reference. The methodology for the 2007 SCS bias analyses, which are currently being conducted, mimics the 2005 analyses with the exception of the CHAID analysis at the unit level. CHAID is typically performed to create cell adjustments for use in weighting. Because we do not use the SCS bias analysis to adjust the survey weights, there is no need, per se, to perform this particular analysis.


In short, the 2009 bias analyses will follow the methodology of the 2007 analyses, which are the same as the 2005 analyses except for the omission of the CHAID analysis (see the attached unit and item nonresponse memos).
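
For reference, the unit-level nonresponse bias statistic typically examined in analyses of this kind (a standard textbook formulation, not quoted from the attached memos) compares the respondent mean with the full-sample mean, which reduces to the nonresponse rate times the gap between respondent and nonrespondent means:

\[
\mathrm{bias}(\bar{y}_R) = \bar{y}_R - \bar{y} = \frac{n_{NR}}{n}\left(\bar{y}_R - \bar{y}_{NR}\right)
\]

where $\bar{y}_R$ is the respondent mean, $\bar{y}_{NR}$ the nonrespondent mean, and $n_{NR}/n$ the unit nonresponse rate.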

Questionnaire

Passback Item: Please provide a final version of the questionnaire that doesn’t say “pretesting” on it.

Response: The SCS script from the CAPI instrument was provided as an attachment on 12/16/08. Should this also be uploaded into ROCIS?

Passback Item: Please clarify what is read to the respondent (versus the PRA statement at the top of the questionnaire, which is not seen by respondents).

Response: The following statement is read aloud to respondents:


"Now I have some additional questions about your school. These answers will be kept confidential, by law."

Passback Item: What is known about how well the “school environment” measures collected from the students in the NCVS supplement compare with similar measures collected by principals or other adults in schools?

Response: We do not know how well student responses align with those of principals or other adults on school environment measures. NCES intends to perform a methodological investigation comparing student responses with principal responses (from the CCD) on items for which we have data from both respondents, such as the highest and lowest grades in the school, public/private status, and the church affiliation of the school. Regardless of the consistency or accuracy of student responses relative to those of adults, it is the students’ awareness and perception of the school environment that is uniquely captured in the SCS.



Passback Item: Please summarize the literature that led to your operational definitions of bullying and cyberbullying. Are there gradations or progressions that the questions are trying to capture? We are trying to understand the grouping of behaviors on such a continuum (e.g., spreading rumors and excluding someone from a “friends” group seem significantly different from destroying property or threatening someone).

Response:

Bullying

According to current conceptualizations, bullying is defined as repeated and intentional harmdoing, inflicted on another, in a relationship characterized by an imbalance of power. Olweus (1993, pp. 9-10) suggests a student is being bullied when three criteria are met: (1) the behavior is aggressive or intentional harmdoing, (2) it is carried out repeatedly and over time, and (3) it occurs in an interpersonal relationship characterized by an imbalance of power. Bullying does not include the occasional act of aggression, nor does it include friendly teasing (where there is no intent to harm).


Further, the literature that attempts to define bullying suggests it can take several forms, including overt and covert actions. Olweus draws the distinction between “direct bullying,” or open and physical attacks on the victim, and “indirect bullying,” in the form of social isolation and exclusion from groups, and argues that both types are important to understanding and defining bullying. While these labels of direct and indirect have since been detailed and redefined in the bullying literature (Feshbach 1969; Crick and Grotpeter 1995; Cairns, Cairns, Neckerman, Ferguson, and Gariepy 1989), they still serve an important purpose in characterizing the types of bullying a victim may experience.


There are several characteristics of the bullying question on the supplement that deserve note:

1. The introduction includes the word “bullied,” meaning that the respondent must consider him or herself bullied, in addition to having been victimized in one of the ways listed, in order to respond affirmatively to the behavior.

2. The list provided is considered an exhaustive list of the ways in which one can be bullied; analysts can consolidate the items into a single “bullied” measure or subdivide them into distinct categories of bullying victimization.

3. The follow-up question relating to the frequency of bullying victimization also allows analysts to set their own “threshold” for determining bullying victimization.


Cyberbullying

In September 2006, the Centers for Disease Control and Prevention identified electronic aggression as an emerging public health problem in need of additional prevalence and etiological research to support the development and evaluation of effective prevention programs (David-Ferdon and Hertz 2007). As Williams and Guerra summarize:


“To date, information on the prevalence of Internet bullying comes primarily from anecdotal reports and a limited number of youth surveys. Findings suggest that Internet bullying and victimization rates are around 25 percent. … Even fewer studies have examined the correlates and predictors of Internet bullying and whether these are similar or distinct from factors linked to bullying in schools.”


Cyberbullying questions were introduced on the 2005 supplement, making it one of the first student-level national surveys to collect data on this emerging topic.



Passback Item: Why are the questions about notifying adults at school written in the passive voice?

Response: Questions 19d and 20c ask: “Was a teacher or some other adult at school notified about the bullying?” This mimics the question on the main NCVS incident report, which asks: “Were the police informed of this incident in any way?” In addition, the passive voice is used because we are not specifically asking whether it was the respondent who made the notification.

Passback Item: In addition to victimization questions, why are “perpetrator” type gun questions asked of students? If this is a good idea, why aren’t analogous questions asked about bullying?

Response: The weapon-carrying items were introduced at the supplement’s inception in 1989. The questions have changed since then, but their spirit is consistent. The supplement was the first student-level national survey to ask about weapon carrying at school. In the interest of questionnaire time and respondent burden, NCES and BJS jointly decided to limit the bullying questions to those focusing on bullying victimization.

NCES collections


Passback Item: Why is the NCVS supplement approach preferred by NCES over a student supplement to the SSOCS?

Response: If NCES were to implement a representative student survey at the school level, there would be several obstacles. As OMB is aware, obtaining school/district-level and student/parent-level permissions to administer student surveys in the school setting is difficult and time-consuming. As required by FERPA, NCES must obtain informed consent from the parents of school-based respondents. Informed consent involves providing a written account of why personal information is requested and how it will be used. In general, parents should have the option, without penalty, of agreeing or declining to provide the information an education agency or school requests. FERPA regulations require that parents give prior consent for the disclosure of information to persons other than school officials. Obtaining such consent from parents is a difficult, costly, and time-consuming process. In the NCVS/SCS design, parent permission is obtained as part of the household’s and parent’s acceptance of the survey.

Another obstacle is cost. NCES currently funds the SCS at about $900K over 2 years. The cost of a school-based student collection using one-on-one CAPI procedures to obtain equivalent data (NCVS and SCS) at the same sample size would be prohibitive. Information contained in the supplement is also richer because of its link to the data in the larger NCVS.

Passback Item: If the NCVS were not able to collect this supplement in the future (i.e., after its redesign scheduled for implementation in 2013), what alternatives would be available to NCES?

Response: No alternatives would be available to NCES. The value of the supplement lies in the victimization data appended to the file from the main NCVS collection. Without the victimization data, the data on the various contexts for student victimization are moot. More general data, such as those on school safety and security measures, fearfulness, bullying, and gangs, might be appended to NCES’ National Household Education Survey (NHES), but, again, the strength of the supplement lies in the appended criminal victimization data from the NCVS.

Passback Item: Please explain how results from the supplement are used in concert with the SSOCS or other NCES information (e.g., do they indicate trends in the same direction on schools’ use of security guards, etc.?).

Response: While no report or tabulation directly compares results from the two data sources, data on security measures from both SSOCS and the supplement are presented back to back in the Indicators of School Crime and Safety report: Indicator 20 presents data on school security measures taken from the SSOCS, and Indicator 21 reports similar measures taken from the supplement. For example, SSOCS data show that between the 1999–2000 and 2005–06 school years, the percentage of schools using one or more security cameras to monitor the school increased from 19 to 43 percent; School Crime Supplement data show that the percentage of students who observed the use of security cameras at their schools increased from 39 to 58 percent between 2001 and 2005.



