


Volume I



Fast Response Survey System (FRSS) 106:

School Safety and Discipline: 2013–14



OMB# 1850-0733 v.29


November 27, 2013

National Center for Education Statistics

U.S. Department of Education

Justification

The National Center for Education Statistics (NCES), U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for the Quick Response Information System (QRIS) (OMB# 1850-0733) to conduct data collection for the Fast Response Survey System (FRSS) survey #106 on public school safety and discipline. The survey will provide nationally representative data, with a First Look report on the results to be released in the summer of 2015. The survey will be modeled after the NCES School Survey on Crime and Safety (SSOCS), modified for FRSS length and format constraints. The SSOCS survey was conducted in 2000, 2004, 2006, 2008, and 2010. It was also approved by OMB for a 2012 data collection (OMB# 1850-0761) but was not conducted due to budget constraints.


The FRSS survey is authorized under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. 9573), which authorizes NCES to collect and report statistical data related to education in the United States. NCES has contracted with Westat to carry out all stages of this survey.


Design

Overview of Survey Development


The FRSS 106 questionnaire consists of selected questions that were approved for the SSOCS:2010 and SSOCS:2012 questionnaires (SSOCS:2010 was fielded, while SSOCS:2012 was not, due to budgetary constraints). Because FRSS 106 consists entirely of questions included in SSOCS, the development of SSOCS is discussed below.


Prior to the 2004 SSOCS, nine administrators from schools varying in locale, level, and district were recruited to identify potential issues with wording, formatting, and content. These nine participants responded to a series of scripted questions related to the survey items that tested the clarity of terms, the appropriateness of response options, and overall ease in responding to specific survey questions. Interviews were conducted at the schools and varied in length from 1 to 2 hours.


After the questionnaire was modified based on the results of the cognitive interviews, seven site visits were completed to determine how schools record crime data (i.e., the format and layout of the data) and the amount of time it takes to obtain the appropriate data. As with the cognitive interviews, administrators were recruited from schools varying in locale, level, and district and were asked to complete a shortened version of the questionnaire. Interviews were conducted at the schools and varied in length from 1 to 3 hours.


To test the wording and format of the questionnaire and to find out how long it took for respondents to complete the SSOCS:2004 instrument in its entirety, a total of eight debriefing interviews were completed. Unlike the cognitive interviews and site visits, the respondents were administrators from public schools only. Principals were asked to complete the survey as if they had received the survey request in the mail, recording the total amount of time it took them to complete the survey. Telephone interviewers then contacted these principals and asked about the amount of time it took to complete the questionnaire, who and what information was needed to respond to the items, whether the questions were clear, and the usability and clarity of the instrument provided.


Due to the complexity and cost of this undertaking for SSOCS:2004, and the minimal amount of change to the questionnaire since then, subsequent consultations with principals about SSOCS survey items have focused on cognitive testing of specific items that are new to the survey rather than of the entire questionnaire. This was done for SSOCS:2006, 2008, and 2010.


The FRSS 106 questionnaire was prepared using selected questions from SSOCS. One round of pretesting was conducted with eight respondents during November 2013. During the pretest, respondents were asked to complete the questionnaire and participate in a telephone debriefing with Westat to provide feedback on the questionnaire. Completed questionnaires were collected by fax prior to the debriefing with each respondent. During the pretest, all questions on the questionnaire were tested and estimates of the respondent time required to complete the survey were obtained. The purpose of the pretest was to verify that the questions and corresponding instructions were clear and unambiguous, and to determine the response burden for the survey. The final questionnaire (Attachment 1) is being submitted with this request for OMB clearance.


NCES Review and Consultations Outside of Agency


As indicated above, the proposed FRSS 106 questionnaire contains selected questions from the OMB-approved SSOCS questionnaire. The SSOCS survey was originally developed in consultation with a Technical Review Panel (TRP) that was created to review crime-related surveys sponsored by NCES. Panel members and their affiliations are as follows:


  • Lynn Addington, Department of Justice, Law and Society, American University

  • Bill Bond, National Association of Secondary School Principals

  • Margaret Evans, National Association of Elementary School Principals

  • Denise Gottfredson, Department of Criminology and Justice, University of Maryland

  • Gary Gottfredson, Gottfredson Associates, Inc.

  • Kristen Hayes, Office of Safe and Drug Free Schools

  • William Lassiter, Center for Prevention of School Violence

  • Colin Loftin, School of Criminal Justice, State University of New York, Albany

  • Sister Dale McDonald, National Catholic Education Association

  • Shannon Means, Kentucky Center for School Safety

  • Michael Rand, Bureau of Justice Statistics

  • Bill Smith, Instructional Support Services, Sioux Falls School District


In addition, feedback on the SSOCS:2008 questionnaire, used to inform revisions for SSOCS:2010, was provided by the following outside experts:


  • Lynn Addington, Department of Justice, Law and Society, American University

  • Amanda Nickerson, School Psychology Department, State University of New York, Albany

  • Teresa Sarmiento Brooks, School of Social Work, University of New England


Assurance of Confidentiality


Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured that their participation in the survey is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose unless otherwise compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).


Description of Sample and Burden


NCES is considering two different sample sizes for the FRSS 106 survey, with the larger sample size contingent upon additional funding being available prior to the start of data collection. The proposed sample design is a nationally representative sample of either 1,600 or 3,500 regular public schools from the 2010–11 (or most recent) NCES Common Core of Data (CCD) Public Elementary/Secondary School Universe File. The questionnaire is limited to three pages of items readily available to respondents and can be completed by most respondents in about 20 minutes.


Prior to contacting schools for the FRSS 106 survey, a courtesy information packet consisting of a cover letter (Attachment 2) and a copy of the questionnaire will be mailed to the superintendent of each district with sampled schools. The packet will also include a list of the sampled schools within the district. Any special requirements that districts have for approval of surveys will be met before schools in those districts are contacted. Each of the districts that require special approval has unique requirements for obtaining approval. The materials sent to special districts will be tailored to meet the specific requirements of each district, consistent with the materials included in this OMB package. For example, most districts request information on survey justification, confidentiality, sample size, and survey collection procedures, which will be drawn from the appropriate sections of this OMB package once it is approved.


Questionnaire packages, including information needed to access the Web survey, will be mailed to the principal of sampled schools in February 2014. The cover letter and questionnaire will include a description of the most appropriate respondent. Follow-up for nonresponse will be conducted both by mail and telephone and will begin about 3 weeks after the questionnaires have been mailed to the schools. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. Telephone nonresponse follow-up is used to prompt respondents to complete the survey by web, mail, or fax and is expected to take about 5 minutes.


Option 1: Sample Size of 1,600 Schools


The sample of 1,600 schools is estimated to be located in 1,360 districts, based on previous FRSS studies. Notification to the estimated 1,360 districts is expected to take approximately 5 minutes per district for a total of 113 respondent burden hours (table 1). Based on previous FRSS studies, it is anticipated that approximately 70 districts with special clearance procedures will be contacted. The respondent burden will be approximately 2 hours per special district for a total of 140 respondent burden hours. The estimated burden time for 1,600 schools to review the introductory letter requesting their participation (initial contact) is 5 minutes per school for a total of 133 respondent burden hours. The initial sample of 1,600 schools will yield about 1,360 completed questionnaires, assuming a response rate of 85 percent. Based on a response burden of approximately 20 minutes per completed questionnaire, the estimated response burden to complete the survey is estimated to be about 453 hours.1 It is anticipated that about 75 percent of the schools will receive a nonresponse follow-up call that will take about 5 minutes. The total estimated burden time for nonresponse follow-up is about 100 hours. The total number of burden hours for data collection and nonresponse follow-up is about 939 hours.


Table 1. Estimated burden for data collection and nonresponse follow-up for 1,600 schools: FRSS 106

Type of collection                   | Sample size | Estimated response rate | Estimated respondents | Estimated responses | Burden hours per respondent | Respondent burden hours
Special clearance district review    | 70          | 100%                    | 70                    | 70                  | 2.00                        | 140
District notification                | 1,360       | 100%                    | 1,360                 | 1,360               | 0.083                       | 113
Initial school contact               | 1,600       | 100%                    | 1,600                 | 1,600               | 0.083                       | 133
Questionnaire                        | 1,600       | 85%                     | 1,360                 | 1,360               | 0.333                       | 453
Nonresponse follow-up call to school | 1,600       | 75%                     | 1,200                 | 1,200               | 0.083                       | 100
Total burden                         | -           | -                       | 2,960                 | 5,590               | -                           | 939
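
The burden totals above follow directly from the counts and per-activity times in the text. As a rough check, the short Python sketch below (our own illustration; the truncation of each activity to whole hours is inferred from the table entries) reproduces the option 1 total of 939 hours. Substituting the option 2 counts (110 special districts, 2,975 districts, 3,500 schools, 2,975 questionnaires, and 2,625 follow-up calls) yields the 1,967-hour total shown in table 2 below.

```python
# Check of the option 1 burden totals in table 1. Counts and unit times are
# taken from the text; each activity's hours are truncated to whole hours
# before summing, which is how the table arrives at 939.
activities = [
    # (activity, estimated responses, hours per response)
    ("Special clearance district review", 70, 2.0),
    ("District notification", 1_360, 5 / 60),
    ("Initial school contact", 1_600, 5 / 60),
    ("Questionnaire", 1_360, 20 / 60),
    ("Nonresponse follow-up call", 1_200, 5 / 60),
]

total = 0
for activity, responses, hours_each in activities:
    hours = int(responses * hours_each)  # truncate, matching the table entries
    total += hours
    print(f"{activity:<35} {hours:>4} hours")

print(f"{'Total burden':<35} {total:>4} hours")  # 939
```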


Option 2: Sample Size of 3,500 Schools


The sample of 3,500 schools is estimated to be located in 2,975 districts, based on previous FRSS studies. Notification to the estimated 2,975 districts is expected to take approximately 5 minutes per district for a total of 247 respondent burden hours (table 2). It is anticipated that close to all (approximately 110) districts with special clearance procedures will have schools sampled and need to be contacted. The respondent burden will be approximately 2 hours per special district for a total of 220 respondent burden hours. The estimated burden time for 3,500 schools to review the introductory letter requesting their participation (initial contact) is 5 minutes per school for a total of 291 respondent burden hours. The initial sample of 3,500 schools will yield about 2,625 completed questionnaires, assuming a response rate of 85 percent. Based on a response burden of approximately 20 minutes per completed questionnaire, the estimated response burden to complete the survey is estimated to be about 991 hours.2 It is anticipated that about 75 percent of the schools will receive a nonresponse follow-up call that will take about 5 minutes. The total estimated burden time for nonresponse follow-up is about 218 hours. The total number of burden hours for data collection and nonresponse follow-up is about 1,967 hours.


Table 2. Estimated burden for data collection and nonresponse follow-up for 3,500 schools: FRSS 106

Type of collection                   | Sample size | Estimated response rate | Estimated respondents | Estimated responses | Burden hours per respondent | Respondent burden hours
Special clearance district review    | 110         | 100%                    | 110                   | 110                 | 2.00                        | 220
District notification                | 2,975       | 100%                    | 2,975                 | 2,975               | 0.083                       | 247
Initial school contact               | 3,500       | 100%                    | 3,500                 | 3,500               | 0.083                       | 291
Questionnaire                        | 3,500       | 85%                     | 2,975                 | 2,975               | 0.333                       | 991
Nonresponse follow-up call to school | 3,500       | 75%                     | 2,625                 | 2,625               | 0.083                       | 218
Total burden                         | -           | -                       | 6,475                 | 12,185              | -                           | 1,967


Because funding to exercise option 2 may become available before data collection begins, we are requesting approval for the higher, option 2 burden estimate.


Procedures and Data Collection Instrument


A questionnaire, cover letter (Attachment 3), and web information sheet (Attachment 4) will be mailed to each sampled school. The cover letter requests the participation of the school and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person most knowledgeable about safety and discipline at the school. The cover letter also includes instructions on how to complete and return the survey, as well as contact information in case of questions. The web information sheet is included in the mailing to provide information about the option to complete a web version of the survey. On the cover of the survey and in the cover letter, respondents are assured that their participation is voluntary and their answers may not be disclosed, or used, in identifiable form for any other purpose unless otherwise compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).


If a completed survey is not received for a sampled school within 3 weeks after the initial mailing, the school will receive a nonresponse follow-up letter (Attachment 5), another copy of the school’s web information sheet, and a brief, scripted telephone call (Attachment 6) prompting the respondent to return a completed survey via the web, fax, or mail.


Questionnaire


The questionnaire is designed to collect information on specific safety and discipline plans and practices, training for teachers and aides related to school safety and discipline issues, security personnel, frequency of specific discipline problems, and number of incidents of various crimes.


Question 1 asks about several kinds of school policies and practices. These data are important in helping schools know where they stand in relation to other schools, and to help policymakers know what actions are already being taken and those that might be encouraged in the future.

  • Items 1a through 1c and 1f ask about access to the school building(s) and grounds. The ability of students and outsiders to enter and leave the campus throughout the school day affects the amount of control that administrators have over the school environment, and the potential for bringing weapons or drugs onto school grounds.

  • Items 1d–1e, 1g–1k, 1n–1o, 1q, 1r, and 1t ask about ways that students are monitored to prevent crime. Such actions can directly affect crime because students may be more reluctant to engage in inappropriate activities for fear of being caught. The school climate also may be affected because students may feel more secure knowing that violators of school policies are likely to be caught.

  • Items 1l and 1m ask about dress code.

  • Item 1p provides information about a system to notify parents in the event of an emergency.

  • Item 1s provides information about the school environment (e.g., are students and outsiders able to identify staff who might help with a problem?) and about the school’s ability to monitor the grounds and identify outsiders.

  • Items 1u and 1v ask about the availability of telephones or two-way radios to staff. Either device gives staff and teachers the opportunity to obtain help without leaving the specific location or classroom, and affects the administration’s ability to communicate with teachers.

  • Items 1w and 1x ask about ways the students’ use of cell phones, text messaging devices, and social networking websites is controlled. These items were included based on a body of literature suggesting that cyberbullying, harassment via electronic means, and aggression via student social networking sites are areas of concern in schools. These items will collect information on the prevalence of policies that may limit students’ ability to bully one another via electronic means.


Question 2 asks about training provided by schools or districts for classroom teachers or aides. The types of training include classroom management; school-wide discipline policies and practices related to violence, bullying, and alcohol/drug use; safety procedures; recognizing signs of potentially violent students, bullying behavior, and substance abuse; positive behavioral intervention strategies; and crisis prevention and intervention. If schools obtain early warning signs of behavior issues, they may be able to reduce the likelihood of problems or crises (such as multiple shootings). The type of training provided to teachers is important because teachers collectively spend the most time with students and observe students closely.


Question 3 asks about the existence of written plans for dealing with crises and if students have been drilled on these plans. When crises occur, there may not be time or an appropriate environment for making critical decisions, and key school leaders may not be available to immediately provide guidance. Thus, having a written plan is considered important in preparing schools to deal with crises effectively.


Questions 4 through 6 ask about the use of law enforcement or security personnel on the school grounds or at school events. Besides directly affecting school safety, the use of law enforcement personnel also affects the school environment. It may help to prevent illegal actions, and reduce the amount of crime. It also may affect the feeling of security or freedom on school grounds. Thus, the amount of time the law enforcement personnel are present and their visibility are important.


Question 7 asks about the frequency of several kinds of disciplinary problems, providing a measure of the degree to which there are disciplinary problems at the school. There is evidence that when these types of violations are controlled, students do not progress to more serious disciplinary problems. This question asks about the degree to which schools face such disciplinary problems and provides a measure of the disciplinary situation in U.S. public schools.


Question 8 asks about violent deaths, specifically homicides. Violent deaths receive substantial attention from the media but are actually relatively rare, and there is evidence that (in general) schools are much safer than students’ neighboring communities. Analyses of previous data collected by the SSOCS surveys show that these crimes are such rare events that NCES is unable to report estimates under the Center’s statistical standards. Nonetheless, it is important to include the item in the questionnaire because these are significant incidents of crime that, at the very least, independent researchers can evaluate. Furthermore, the survey represents a comprehensive picture of the types of violence that can occur in schools, and the omission of violent deaths would be questioned by respondents who may have experienced such violence.


Question 9 asks about the frequency of various kinds of crimes, other than violent deaths, that occurred at school. The data can be used as an indicator of the degree of safety in U.S. public schools.


Survey Cost and Time Schedule


For a sample size of 1,600, the survey is estimated to cost the federal government about $800,000, including about $750,000 for contractual costs and $50,000 for salaries and expenses. For a sample size of about 3,500, the survey is estimated to cost the federal government about $1.2 million, including about $1.1 million for contractual costs and $75,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation.


Mailing of the survey is planned for February 2014. About 3 weeks after mail out of the survey, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 20 weeks after initial mailout.


Plan for Tabulation and Publication


The First Look report will be released on the NCES website in the summer of 2015 and include explanatory text and tables. Participating schools will be notified when NCES releases the report. A public use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Cross-tabulations of data items will be made with selected classification variables, such as instructional level of the school, school enrollment size, urbanicity (locale), and percent White enrollment. These are the same classification variables used for the SSOCS reports.
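
The weighting procedure is not spelled out in this document. As a rough illustration of the standard approach we would expect (a base weight equal to the inverse of each school's selection probability, followed by a nonresponse adjustment within weighting cells), consider the sketch below; the cell structure and numbers are hypothetical, not NCES specifications.

```python
# Illustrative weighting sketch (an assumed standard design-based approach,
# not a stated NCES procedure): base weight = 1 / selection probability,
# then inflate respondents' weights so the responding schools in each
# weighting cell carry the cell's full weight.
def base_weight(stratum_population: int, stratum_sample: int) -> float:
    return stratum_population / stratum_sample  # inverse selection probability

def nonresponse_adjust(weights: list[float], responded: list[bool]) -> list[float]:
    """Return adjusted weights for the responding units only."""
    factor = sum(weights) / sum(w for w, r in zip(weights, responded) if r)
    return [w * factor for w, r in zip(weights, responded) if r]

# A hypothetical stratum with 500 schools, 10 sampled, 8 responding:
weights = [base_weight(500, 10)] * 10   # each sampled school represents 50 schools
final = nonresponse_adjust(weights, [True] * 8 + [False] * 2)
print(final[0])                         # 62.5 schools per responding school
```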



Statistical Methodology

Reviewing Statisticians


Chris Chapman, of NCES, is the Project Officer for this survey. Kathryn Chandler, NCES Project Director of the SSOCS, was consulted about both survey content and design. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design. Westat is the contractor currently conducting the QRIS surveys for NCES.


Sample Design


The sample design for the proposed FRSS survey School Safety and Discipline: 2013–14 is similar to the design used in previous iterations of NCES’s School Survey on Crime and Safety (SSOCS). The general goals of the sample design are to obtain nationally representative samples of public schools for each of three instructional levels (elementary, middle, and secondary/combined) and to provide estimates of key variables that are subject to a coefficient of variation (CV) of approximately 5 to 7 percent. Compared to prior SSOCS surveys, the proposed FRSS survey is much shorter, focusing mainly on school practices and programs, with collection of a limited number of crime statistics. The level of funding for the proposed survey is currently under review. Therefore, two options are presented: a basic option with an initial sample size of 1,600 schools, and an enhanced option with an initial sample size of 3,500 schools. Except for the difference in sample size, the general approach to sampling will be the same for both options. Assuming an 85 percent response rate, the proposed sample sizes will yield an estimated 1,360 responding schools under the basic option, and 2,975 responding schools under the enhanced option. In the following sections, we describe the respondent universe and corresponding sampling frame, the proposed allocation of the sample to the three instructional levels, the method of stratification and sample selection, and the expected levels of precision under the specified design.


Respondent Universe


The respondent universe for the FRSS survey on school safety includes all regular public schools operating in the United States. The sampling frame will be developed from the most recent NCES Common Core of Data (CCD) Public School Universe File. As indicated in Table 3, there are 86,753 regular schools included in the 2010–2011 CCD universe file, of which 50,990 are elementary schools, 16,577 are middle schools, and 19,186 are high schools or schools with combined elementary/secondary grades. Table 4 summarizes how the three instructional levels will be defined for sampling purposes based on the grades taught in the school. Note that the counts of schools in Table 3 pertain only to the 86,753 “regular” schools in the CCD file, and exclude special education, vocational, and alternative/other non-regular schools. Schools with a high grade of kindergarten or lower, ungraded schools, and schools in the outlying U.S. territories are also ineligible for the survey and are excluded from the counts in the table.


Sample Allocation


In general, allocating the total sample to the three instructional levels in proportion to the numbers of schools in the population would be inefficient for the proposed survey because it would lead to a very large sample of elementary schools and relatively few middle schools (e.g., see Table 3). An alternative approach that would better support the analytic goals of the study would therefore be to “undersample” elementary schools (where the prevalence of school safety and discipline problems may be relatively low) and to “oversample” middle and secondary/combined schools (where problems are expected to be higher). For example, in the 2010 SSOCS, approximately 25 percent of the total sample was allocated to the elementary school stratum, 35 percent to the middle school stratum, and 40 percent to the secondary/combined school stratum. While such an allocation ensures that relatively robust samples of middle and secondary/combined schools are obtained for analysis purposes, it results in relatively large overall design effects due to the disproportionate allocation of the sample to strata. The design effect reflects the increase in sampling variance as compared with a self-weighting (equal probability) sample of the same size. For example, under the SSOCS allocation, design effects would range from 1.08 to 1.27 for the three instructional levels and up to 1.75 for the total (overall) sample and subgroups that include all three instructional levels (e.g., subgroups defined by urbanicity or minority status).


To mitigate the design effects expected under the 25-35-40 percent allocation used in SSOCS, the allocation to the three instructional levels can be modified somewhat to improve the precision of overall estimates while still maintaining reasonably good precision for the three levels. We propose to allocate the samples to the three levels in roughly the following percentages: 35-30-35. Compared with the SSOCS design, the proposed allocation increases the share of elementary schools to about 35 percent of the total sample, while moderately reducing the shares of middle and secondary/combined schools. The proposed allocation is expected to result in an appreciable reduction in the overall design effect (1.37 under the proposed design compared with 1.74 under the SSOCS allocation).
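
To make the allocation comparison concrete: if stratum h contains population share W_h of the schools and receives share a_h of the sample, the design effect attributable to the disproportionate allocation alone is approximately the sum over strata of W_h^2 / a_h. The sketch below applies this simplified formula to the level counts in Table 3; it yields somewhat lower values (about 1.61 and 1.25) than those quoted above, which also reflect disproportionate allocation within each level.

```python
# Approximate design effect of a disproportionate allocation across the
# three instructional levels: deff ≈ sum over strata of W_h**2 / a_h,
# where W_h is the stratum's population share and a_h its sample share.
# School counts are from Table 3. This simplified figure ignores the
# within-level disproportionality, so it runs below the 1.74 and 1.37
# values quoted in the text.
schools = {"elementary": 50_990, "middle": 16_577, "secondary/combined": 19_186}
total = sum(schools.values())
pop_share = {level: count / total for level, count in schools.items()}

def deff(allocation: dict) -> float:
    return sum(pop_share[h] ** 2 / a for h, a in allocation.items())

ssocs = {"elementary": 0.25, "middle": 0.35, "secondary/combined": 0.40}
frss = {"elementary": 0.35, "middle": 0.30, "secondary/combined": 0.35}

print(f"SSOCS 25-35-40 allocation: deff = {deff(ssocs):.2f}")  # 1.61
print(f"FRSS 35-30-35 allocation:  deff = {deff(frss):.2f}")   # 1.25
```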


Sample Stratification and Selection


For sampling purposes, strata will be defined by crossing instructional level (elementary, middle, and secondary/combined) with four enrollment size classes (less than 300, 300 to 499, 500 to 999, and 1,000+) and four urbanicity categories (city, suburb, town, rural). Within each sampling stratum, schools will be sorted by region and categories of percent White enrollment prior to sample selection to induce additional implicit stratification. For each instructional level, the specified total sample size will be allocated to strata in rough proportion to the aggregate square root of the enrollment of the schools in the stratum. The use of the square root of enrollment for sample allocation purposes is a compromise between proportional allocation (which is approximately optimal for estimating the number or proportion of schools with a specified characteristic), and allocation in proportion to enrollment (which is approximately optimal for numeric variables that are correlated with school size; e.g., the number of students in schools with controlled access or other practices). After the stratum sample sizes have been determined, the required numbers of schools will be selected systematically from the sorted file using independent random starts and a fixed sampling rate within each stratum. In addition to analyses by instructional level, the sample design will permit separate analysis (along a single dimension) by enrollment size class (four categories) and urbanicity (four categories). Subgroup analysis of categories based on percent White enrollment is also possible, although the number of schools in the highest percent White category (more than 95 percent White) will be relatively small under the proposed sample allocation.
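
The two mechanical steps described above, square-root allocation and systematic selection from a sorted file, can be sketched as follows. The frame records, stratum labels, and sample size are invented for illustration; the actual frame is the CCD universe file, with strata crossing enrollment size class and urbanicity within each instructional level.

```python
import math
import random

# Invented frame records for illustration: (stratum, school id, enrollment).
frame = [("city_small", f"school{i}", random.randint(100, 299)) for i in range(200)]
frame += [("rural_large", f"school{i}", random.randint(1_000, 2_500)) for i in range(200, 300)]

def allocate_sqrt(frame, total_n: int) -> dict:
    """Allocate total_n across strata in proportion to the aggregate square
    root of enrollment, the compromise allocation described in the text.
    (Rounding may shift the total by a school or two.)"""
    agg = {}
    for stratum, _, enrollment in frame:
        agg[stratum] = agg.get(stratum, 0.0) + math.sqrt(enrollment)
    grand = sum(agg.values())
    return {s: round(total_n * w / grand) for s, w in agg.items()}

def systematic_sample(units: list, n: int) -> list:
    """Systematic selection with a random start and a fixed interval from a
    sorted list; the sort order supplies the implicit stratification."""
    interval = len(units) / n
    start = random.uniform(0, interval)
    return [units[int(start + k * interval)] for k in range(n)]

sizes = allocate_sqrt(frame, total_n=60)
for stratum, n in sizes.items():
    # The real sort keys are region and percent White enrollment; sorting
    # by the record tuple stands in for that here.
    units = sorted(u for u in frame if u[0] == stratum)
    print(stratum, n, systematic_sample(units, n)[:2])
```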


Expected Levels of Precision


Assuming an overall response rate of 85 percent, an initial sample size of 1,600 schools under the basic option will yield approximately 1,360 completed questionnaires, while an initial sample size of 3,500 schools under the enhanced option will yield approximately 2,975 completed questionnaires. The approximate sample sizes and the corresponding coefficients of variation (CV) to be expected under the two proposed design options are summarized in Tables 5A and 5B. Since the sample sizes in these tables are derived from preliminary tabulations of the 2010–2011 CCD file,3 the actual sample sizes may differ somewhat from those shown. Also, note that the sample sizes represent the expected numbers of completed questionnaires (respondents), and not the numbers of schools to be selected. The coefficient of variation represents the expected relative standard error of an estimated proportion, and can be converted to approximate 95 percent relative confidence bounds by multiplying the entries by 2. Cells for which the CV is greater than 0.075 (i.e., greater than 7.5 percent) are flagged with an asterisk in Tables 5A and 5B. Under the basic option (Table 5A), a CV under 0.07 will be difficult to achieve for underlying population proportions that are smaller than 0.50 (i.e., 50 percent prevalence characteristics). In contrast, the sampling errors under the enhanced option (Table 5B) will be considerably smaller. For example, it will generally be possible to attain a CV under 0.07 with the enhanced option unless the underlying population proportions are smaller than 0.33 (i.e., 33 percent prevalence characteristics). CVs under 0.07 are difficult to achieve for small proportions because the proportion itself appears in the denominator of the CV, so small values of P yield large CVs.
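
The entries in Tables 5A and 5B can be approximated from the expected number of completed questionnaires, the design effect, and the underlying proportion: CV(P) ≈ sqrt(deff × (1 - P) / (n × P)). A short sketch using the overall design effect of 1.37 quoted earlier reproduces the total-sample row of Table 5A:

```python
import math

def cv(p: float, n: int, deff: float = 1.37) -> float:
    """Coefficient of variation of an estimated proportion p based on n
    completed questionnaires under design effect deff."""
    return math.sqrt(deff * (1 - p) / (n * p))

# Total-sample row of Table 5A (1,360 completes under the basic option):
for p in (0.20, 0.33, 0.50, 0.67, 0.80):
    print(f"P = {p:.2f}: CV = {cv(p, 1_360):.3f}")
# Prints 0.063, 0.045, 0.032, 0.022, 0.016, matching the table. Doubling a
# CV gives the approximate 95 percent relative confidence bound noted above.
```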


Table 3. Number of regular public schools and enrollment in the 2010–2011 CCD public school universe file by instructional level and size class

Instructional level*   | Enrollment size class | Number of schools | Enrollment
Elementary             | Less than 300         | 12,202            | 2,277,412
                       | 300 to 499            | 18,807            | 7,548,403
                       | 500 to 999            | 19,052            | 12,483,990
                       | 1,000 or more         | 929               | 1,090,531
Middle                 | Less than 300         | 3,779             | 644,923
                       | 300 to 499            | 3,711             | 1,481,017
                       | 500 to 999            | 7,313             | 5,234,656
                       | 1,000 or more         | 1,774             | 2,171,786
Secondary/combined     | Less than 300         | 6,014             | 913,196
                       | 300 to 499            | 3,297             | 1,291,958
                       | 500 to 999            | 4,157             | 2,961,578
                       | 1,000 or more         | 5,718             | 9,883,458
Total**                |                       | 86,753            | 47,982,908

* See Table 4 for definitions.

** The counts in this table are based on data in the 2010–2011 CCD public school universe file, and exclude special education, vocational, and alternative/other schools, schools with a high grade of kindergarten or lower, ungraded schools, and schools in the outlying U.S. territories.



Table 4. Definition of instructional level categories for the school safety survey

Low grade | High grade
          | 1   2   3   4   5   6   7   8   9   10  11  12
PK        | E   E   E   E   E   E   E   E   C   C   C   C
K         | E   E   E   E   E   E   E   E   C   C   C   C
1         | E   E   E   E   E   E   E   E   C   C   C   C
2         | -   E   E   E   E   E   E   E   C   C   C   C
3         | -   -   E   E   E   E   E   E   C   C   C   C
4         | -   -   -   M   M   M   M   M   M   C   C   C
5         | -   -   -   -   M   M   M   M   M   C   C   C
6         | -   -   -   -   -   M   M   M   M   C   C   C
7         | -   -   -   -   -   -   M   M   M   C   C   C
8         | -   -   -   -   -   -   -   M   M   C   C   C
9         | -   -   -   -   -   -   -   -   M   S   S   S
10        | -   -   -   -   -   -   -   -   -   S   S   S
11        | -   -   -   -   -   -   -   -   -   -   S   S
12        | -   -   -   -   -   -   -   -   -   -   -   S

E = Elementary
M = Middle/junior high
S, C = Senior high/combined
- = not applicable (high grade below low grade)
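
Table 4's grade-span rules are mechanical, so they can be stated compactly in code. The following Python rendering is our own illustration of the table; the function name and grade encoding are not from the survey materials.

```python
def instructional_level(low: str, high: str) -> str:
    """Classify a school by grade span per Table 4. Returns 'E' (elementary),
    'M' (middle/junior high), 'S' (senior high), or 'C' (combined); 'S' and
    'C' together form the secondary/combined analysis level. Grades are
    'PK', 'K', or '1' through '12'. An illustrative rendering of the table,
    not NCES code."""
    order = {"PK": -1, "K": 0, **{str(g): g for g in range(1, 13)}}
    lo, hi = order[low], order[high]
    if hi < lo or hi < 1:
        raise ValueError("high grade must be 1-12 and not below the low grade")
    if lo <= 3:                 # low grade of PK, K, or 1-3
        return "E" if hi <= 8 else "C"
    if lo <= 8:                 # low grade of 4-8
        return "M" if hi <= 9 else "C"
    if lo == 9:                 # low grade of 9
        return "M" if hi == 9 else "S"
    return "S"                  # low grade of 10-12

# Spot checks against the table:
assert instructional_level("K", "5") == "E"
assert instructional_level("6", "8") == "M"
assert instructional_level("9", "12") == "S"
assert instructional_level("PK", "12") == "C"
```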


Table 5A. Expected sample sizes (number of completed interviews) and coefficients of variation, by selected analytic domains under the basic option (n = 1,600)

Sample subgroup              | Expected sample size | P = 0.20 | P = 0.33 | P = 0.50 | P = 0.67 | P = 0.80
Total sample                 | 1,360                | 0.063    | 0.045    | 0.032    | 0.022    | 0.016
Instructional level          |                      |          |          |          |          |
  Elementary                 | 476                  | 0.095*   | 0.068    | 0.048    | 0.033    | 0.024
  Middle                     | 408                  | 0.105*   | 0.075    | 0.052    | 0.037    | 0.026
  Sec/combined               | 476                  | 0.103*   | 0.074    | 0.052    | 0.036    | 0.026
Size of school               |                      |          |          |          |          |
  Less than 300              | 194                  | 0.154*   | 0.110*   | 0.077*   | 0.054    | 0.039
  300 to 499                 | 314                  | 0.121*   | 0.086*   | 0.060    | 0.042    | 0.030
  500 to 999                 | 539                  | 0.094*   | 0.067    | 0.047    | 0.033    | 0.024
  1,000 +                    | 314                  | 0.118*   | 0.084*   | 0.059    | 0.042    | 0.030
Urbanicity                   |                      |          |          |          |          |
  City                       | 360                  | 0.123*   | 0.087*   | 0.061    | 0.043    | 0.031
  Suburb                     | 422                  | 0.114*   | 0.082*   | 0.057    | 0.040    | 0.029
  Town                       | 184                  | 0.168*   | 0.120*   | 0.084*   | 0.059    | 0.042
  Rural                      | 395                  | 0.118*   | 0.084*   | 0.059    | 0.041    | 0.030
Percent White enrollment     |                      |          |          |          |          |
  More than 95 percent       | 111                  | 0.216*   | 0.154*   | 0.108*   | 0.076*   | 0.054
  More than 80 to 95 percent | 341                  | 0.126*   | 0.090*   | 0.063    | 0.044    | 0.032
  More than 50 to 80 percent | 375                  | 0.121*   | 0.086*   | 0.060    | 0.042    | 0.030
  50 percent or less         | 533                  | 0.100*   | 0.071    | 0.050    | 0.035    | 0.025

NOTE: The column headings give the underlying proportion (P) in the population. Cells for which the CV is greater than 0.075 (i.e., greater than 7.5 percent) are flagged with an asterisk (*).

Table 5B. Expected sample sizes (number of completed interviews) and coefficients of variation, by selected analytic domains under the enhanced option (n = 3,500)

Sample subgroup              | Expected sample size | P = 0.20 | P = 0.33 | P = 0.50 | P = 0.67 | P = 0.80
Total sample                 | 2,975                | 0.043    | 0.031    | 0.021    | 0.015    | 0.011
Instructional level          |                      |          |          |          |          |
  Elementary                 | 1,041                | 0.064    | 0.046    | 0.032    | 0.023    | 0.016
  Middle                     | 893                  | 0.071    | 0.050    | 0.035    | 0.025    | 0.018
  Sec/combined               | 1,041                | 0.070    | 0.050    | 0.035    | 0.025    | 0.017
Size of school               |                      |          |          |          |          |
  Less than 300              | 424                  | 0.104*   | 0.074    | 0.052    | 0.036    | 0.026
  300 to 499                 | 687                  | 0.081*   | 0.058    | 0.041    | 0.029    | 0.020
  500 to 999                 | 1,179                | 0.064    | 0.045    | 0.032    | 0.022    | 0.016
  1,000 +                    | 685                  | 0.080*   | 0.057    | 0.040    | 0.028    | 0.020
Urbanicity                   |                      |          |          |          |          |
  City                       | 786                  | 0.083*   | 0.059    | 0.041    | 0.029    | 0.021
  Suburb                     | 924                  | 0.077*   | 0.055    | 0.039    | 0.027    | 0.019
  Town                       | 400                  | 0.114*   | 0.081*   | 0.057    | 0.040    | 0.028
  Rural                      | 864                  | 0.080*   | 0.057    | 0.040    | 0.028    | 0.020
Percent White enrollment     |                      |          |          |          |          |
  More than 95 percent       | 242                  | 0.146*   | 0.104*   | 0.073    | 0.051    | 0.036
  More than 80 to 95 percent | 746                  | 0.085*   | 0.061    | 0.043    | 0.030    | 0.021
  More than 50 to 80 percent | 824                  | 0.081*   | 0.058    | 0.041    | 0.029    | 0.020
  50 percent or less         | 1,163                | 0.067    | 0.048    | 0.034    | 0.024    | 0.017

NOTE: The column headings give the underlying proportion (P) in the population. Cells for which the CV is greater than 0.075 (i.e., greater than 7.5 percent) are flagged with an asterisk (*).



1 This estimate is the average amount of time school staff respondents reported the questionnaire took to complete during the pretest.

2 This estimate is the average amount of time school staff respondents reported the questionnaire took to complete during the pretest.

3 The most current 2011–2012 CCD school universe will be used for sample selection if it is available.
