
OMB: 1850-0761







School Survey on Crime and Safety (SSOCS)

2010 and 2012


OMB Supporting Statement B

May 14, 2009



Contents

Section B. Methodology
  B1. Respondent Universe
  B2. Sample Design and Estimation
  B3. Methods to Maximize Response Rates
  B4. Tests of Procedures
  B5. Individuals Responsible for Study Design and Performance


List of Tables

Table 1. Expected respondent universe by school level and urbanicity for the proposed sample, based on the SSOCS:2008 universe (created from the 2005–06 CCD)

Table 2. Expected respondent universe by school level and enrollment size for the proposed sample, based on the SSOCS:2008 universe (created from the 2005–06 CCD)

Table 3. Unweighted and weighted unit response rates, by selected school characteristics: School year 2007–08



Section B. Methodology

NCES has entered into an interagency agreement with the Census Bureau to conduct the 2010 collection of SSOCS. If this interagency agreement continues to go well, the Census Bureau will also provide data collection services for the 2012 SSOCS. The Census Bureau was the data collection agency for both the 2006 and 2008 SSOCS collections; Westat, Inc., and Abt Associates, Inc., conducted the earlier collections in 2000 and 2004, respectively. Because the questionnaire and many of the procedures are well defined, the major portion of the work on this survey is data collection, and the Census Bureau is expected to achieve relatively high response rates. The information below reflects plans for 2010; the 2012 SSOCS will be conducted using the same methods and procedures.


B1. Respondent Universe


The potential respondent universe is all regular public schools with students in any of grades 1 through 12. This excludes schools run by the Bureau of Indian Education, overseas Department of Defense schools, and schools specializing in special education or alternative education. Vocational schools, newly closed schools, home schools, and ungraded schools are also excluded. The size of this population is estimated to be about 84,000 schools (based on information from the Department of Education's Common Core of Data).1 Tables 1 and 2 show the expected universe by school level crossed with urbanicity and enrollment size, respectively.


Table 1. Expected respondent universe by school level and urbanicity for the proposed public school sample, based on the 2005–06 CCD

Urbanicity      Primary    Middle      High    Combined     Total
City             14,531     3,826     2,943         939    22,239
Suburb           15,541     4,935     3,216         547    24,239
Town              6,349     2,920     2,108         613    11,990
Rural            13,666     4,010     3,851       4,282    25,809
Total            50,087    15,691    12,118       6,381    84,277


Table 2. Expected respondent universe by school level and enrollment size for the proposed public school sample, based on the 2005–06 CCD

Enrollment size   Primary    Middle      High    Combined     Total
Less than 300      12,655     2,660     2,099       3,033    20,447
300 to 499         17,998     3,443     1,658       1,509    24,608
500 to 999         18,240     7,374     2,964       1,399    29,977
1,000 or more       1,194     2,214     5,397         440     9,245
Total              50,087    15,691    12,118       6,381    84,277


Sample Selection and Response Rates

A stratified sample design will be used to select approximately 3,500 public schools for the SSOCS to be conducted in 2010. For sample allocation purposes, strata will be defined by instructional level, type of locale, and enrollment size. Both minority enrollment and region will be used as sorting variables in the sample selection process to induce implicit stratification. The 2008 SSOCS yielded an unweighted response rate of approximately 75 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 77 percent.



B2. Sample Design and Estimation

Sample Design for SSOCS:2008


A stratified sampling design was used to select schools for the SSOCS:2008. For sample allocation and sample selection, strata were defined by instructional level, type of locale, and enrollment size. Within each of the four instructional-level categories, the sample was allocated to each of 16 subgroups formed by the cross-classification of enrollment size (four levels) and locale (four levels), in proportion to an aggregate measure of size derived for each subgroup. The aggregate measure of size for a given locale-by-size cell within an instructional level is the sum of the square roots of enrollment across the schools in that cell.


The initial goal of SSOCS:2008 was to collect data from at least 2,550 schools, taking nonresponse into account. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the U.S. public school population. However, while the majority of U.S. public schools are primary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would, therefore, have yielded an inefficient sample design because the sample composition would have included more primary schools (where crime is an infrequent event) than middle or high schools (where crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools. Based on the aggregate measure of size, the desired sample of 2,550 schools was allocated to the four instructional levels as follows: 640 primary schools, 895 middle schools, 915 high schools, and 100 combined schools. Within instructional level, the overall sample of schools was then allocated to each stratum in proportion to the measure of size. Schools in the 1999–2000 SSOCS (SSOCS:2000), SSOCS:2004, and SSOCS:2006 were allocated to instructional levels in a similar manner.


After the allocation for each stratum was determined, percent minority and region were used as implicit stratification variables by sorting the school lists in each stratum by these variables before sample selection. The formula used for calculating the measure of size is provided below.


The formula is given as:

$$\mathrm{MOS}(h) = \sum_{i=1}^{N_h} \sqrt{E_{hi}}$$

where $E_{hi}$ is the enrollment of the $i$th school in stratum $h$ and $N_h$ is the total number of schools in stratum $h$.


The measure of size for an instructional level, MOS(l), is found by summing the 16 stratum measures of size, MOS(h), that comprise the level. The proportion of the level's sample allocated to a given stratum is then the ratio of the stratum's measure of size to the level total, MOS(h)/MOS(l). A short sketch of this allocation is shown below.
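To make the allocation mechanics concrete, the following minimal Python sketch computes MOS(h) for each stratum and allocates a level's sample proportionally. The stratum labels and enrollment data are hypothetical; this is an illustration of the formula above, not the production sampling code.

    import math

    def stratum_mos(enrollments):
        # MOS(h): sum of the square roots of enrollment over schools in stratum h.
        return sum(math.sqrt(e) for e in enrollments)

    def allocate(level_sample, strata):
        # Allocate an instructional level's sample to its strata in
        # proportion to each stratum's measure of size.
        mos = {h: stratum_mos(e) for h, e in strata.items()}
        mos_level = sum(mos.values())  # MOS(l): total for the level
        return {h: round(level_sample * m / mos_level) for h, m in mos.items()}

    # Toy example: two locale-by-size strata within one instructional level.
    print(allocate(100, {"city_small": [250, 310, 420], "rural_large": [1200, 1500]}))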


Sample Design for SSOCS:2010


It is recommended that the same general sampling design used for SSOCS:2008 be adopted for the selection of schools in SSOCS:2010 and SSOCS:2012 with regard to the stratification variables, the number of strata, the method of sample allocation, and the sorting variables used before selection.


The two main objectives for the sampling design of SSOCS:2010 and SSOCS:2012 are identical to those for SSOCS:2008: (1) to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety, and (2) to obtain precise estimates of change in various characteristics relating to crime between the 2003–04, 2005–06, 2007–08, 2009–10, and later the 2011–12 SSOCS administrations. Adopting the same basic design increases the precision of the estimates of change. For sample allocation and sample selection purposes, strata were defined in prior administrations of SSOCS by crossing instructional level, type of locale, and enrollment size. In addition, minority status and region were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The three explicit and two implicit stratification variables have been shown to be related to school crime and thus create meaningful strata for this survey.


As in SSOCS:2008, the sample design will not account for other NCES surveys in the field at the same time as SSOCS. Selecting a sample that avoids or minimizes overlap would unnecessarily complicate the sampling design and would require complex computations of probabilities prior to sample selection, as well as complex weighting after data collection.


SSOCS:2010 will take advantage of the lessons learned from the 2008 data collection. Response rates achieved for various strata and substrata in SSOCS:2008 will be examined in order to determine the proper size of the initial sample for 2010; response rates in SSOCS:2010 will likewise be examined to determine the proper size of the initial sample for 2012. Table 3 contains SSOCS:2008 response rates by school level, enrollment size, locale, minority status, and region. When using 2008 response rates to estimate 2010 response rates, the goal will be to ensure a sufficient number of completed cases for analysis.


The base-weighted response rate in the 2008 SSOCS was 77 percent. The 2010 sample design is built on the same expectation, so as to ensure that a sufficient number of completed interviews is obtained.
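As a rough arithmetic check (a sketch, not the formal sample-size calculation), the initial sample needed to yield a target number of completed interviews can be approximated by dividing the target by the expected response rate:

$$n_{\text{initial}} \approx n_{\text{target}} / r = 2{,}550 / 0.75 = 3{,}400$$

which is consistent with the approximately 3,500 schools to be selected.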


Table 3. Unweighted and weighted unit response rates, by selected school characteristics: School year 2007–08

                                 Initial   Completed    Non-                          Unweighted       Weighted
School characteristic            sample    survey1      respondents2   Ineligible3    response rate    response rate
                                                                                      (percent)4       (percent)5

Total                             3,484      2,560          872            52              75             77.16

Level
  Primary                           833        618          200            15              76             76.96
  Middle                          1,214        897          297            20              75             76.96
  High school                     1,295        936          347            12              73             76.22
  Combined                          142        109           28             5              80             80.82

Enrollment size
  Less than 300                     371        285           60            26              83             83.33
  300–499                           630        486          131            13              79             76.74
  500–999                         1,318        992          315            11              76             76.22
  1,000 or more                   1,165        797          366             2              69             68.60

Urbanicity
  City                            1,046        679          335            32              67             69.44
  Suburb                          1,151        814          329             8              71             73.10
  Town                              469        390           70             9              85             84.61
  Rural                             818        677          138             3              83             83.85

Percent minority enrollment
  Less than 5 percent               427        353           70             4              83             84.32
  5 to less than 20 percent         892        707          181             4              80             80.77
  20 to less than 50 percent        895        656          231             8              74             76.66
  50 percent or more              1,270        844          390            36              68             71.38

Region
  Northeast                         597        399          189             9              68             69.51
  Midwest                           832        648          168            16              79             80.78
  South                           1,274        950          304            20              76             79.71
  West                              781        563          211             7              73             74.60

1In SSOCS:2008, a minimum of 60 percent of the 241 subitems eligible for recontact (i.e., all subitems in the questionnaire except for the seven introductory items) were required to have been answered for a survey to be considered complete, including a minimum of 80 percent of the 103 critical subitems.

2Nonrespondents include those eligible schools that did not answer the minimum number of items required for a survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, or changed from a regular public school to an alternative school.

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample - known ineligibles).

5The weighted response rate is calculated by applying the base sampling rates to the following ratio: completed cases / (total sample - known ineligibles).

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS:2008).
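As a worked example of the formula in footnote 4, the Total row gives an unweighted response rate of

$$2{,}560 / (3{,}484 - 52) \approx 74.6\ \text{percent}$$

which rounds to the 75 percent shown in the table.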




Calculation of Weights

Weights will be attached to each surveyed school so that the weighted data will represent population totals. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. As with the 2008 SSOCS, the nonresponse adjustment cells for the 2010 and 2012 collections will each be determined based on bias analysis results from that year's data. The final, adjusted weight will be raked so that the sum of the weights matches the number of schools derived from the latest CCD public school universe file.
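A minimal Python sketch of these steps follows, under simplifying assumptions: the field names are hypothetical, every adjustment cell is assumed to contain at least one respondent, and the multi-dimensional raking step is collapsed to a single scaling to the CCD total. It is not the production weighting system.

    def final_weights(schools, ccd_total):
        # schools: list of dicts with 'p_select' (sampling probability),
        # 'cell' (nonresponse adjustment cell), and 'responded' (bool).

        # 1. Base weight: inverse of the selection probability.
        for s in schools:
            s["base_w"] = 1.0 / s["p_select"]

        # 2. Nonresponse adjustment: within each cell, inflate respondents'
        #    weights so they carry the base weight of the full sampled cell.
        for c in {s["cell"] for s in schools}:
            in_cell = [s for s in schools if s["cell"] == c]
            total = sum(s["base_w"] for s in in_cell)
            resp = sum(s["base_w"] for s in in_cell if s["responded"])
            for s in in_cell:
                s["adj_w"] = s["base_w"] * total / resp if s["responded"] else 0.0

        # 3. Scale so weights sum to the CCD universe count (a one-dimensional
        #    stand-in for the raking step described in the text).
        adj_total = sum(s["adj_w"] for s in schools)
        for s in schools:
            s["final_w"] = s["adj_w"] * ccd_total / adj_total
        return schools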


Methods for Variance Estimation

Standard errors of the estimates will be computed using jackknife repeated replication (JRR). Replicate codes indicating the computing strata and the half-sample to which each sample unit belongs will be provided, along with the replicate weights for all replicates formed to calculate variances.
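The sketch below illustrates how an analyst might apply the provided replicate weights; the function and variable names are hypothetical, and it assumes the common JRR convention in which replicate weights are formed by dropping one half-sample and doubling its pair-mate, so the variance is the sum of squared deviations of the replicate estimates from the full-sample estimate.

    def jrr_variance(values, full_weights, replicate_weights):
        # Variance of a weighted total via jackknife repeated replication.
        # values: one y value per responding school.
        # replicate_weights: one list of weights per replicate.
        def wtotal(weights):
            return sum(w * y for w, y in zip(weights, values))

        theta = wtotal(full_weights)  # full-sample estimate
        return sum((wtotal(rw) - theta) ** 2 for rw in replicate_weights)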


Procedures for Collection of Information


The data collection methods for the 2009–10 and 2011–12 SSOCS will be based on those used in the 2007–08 SSOCS. The methodology for SSOCS:2010 and SSOCS:2012 entails a mail survey with intensive phone and e-mail follow-up. Census's strategy is to e-mail respondents within one week of the expected arrival of the questionnaire and to call respondents within three weeks of its expected arrival. We will then mail a second questionnaire to all eligible nonresponders and continue intensive phone follow-up for the remainder of the field period, until the interview is completed by mail, fax, or phone.


Steps in the Data Collection Process


The following is a description of the main tasks in the data collection process for the SSOCS. These tasks include: drawing the sample; identifying special permission districts; mailings to school principals, district superintendents, and CSSOs; phone follow-up to nonresponding schools; and refusal aversion and conversion efforts (see Supporting Statement C, Appendices D–K, for letters sent to superintendents and principals, as well as postcards to schools in special permission districts and reminder e-mails and voicemails to respondents).



Drawing the Sample


Census staff will draw the sample of schools in September because the special permission school districts must be identified early in the survey cycle. This ensures that these schools have the necessary information to present to their approval boards in the fall preceding the 2010 and 2012 collections.



Identifying Special Permission Districts and the Need for Research Applications


NCES has a list of special permission districts that require research applications. The districts that require special permission prior to data collection will be contacted by NCES or contractor support staff to obtain any materials necessary for district approval of SSOCS.


Advance Notification to Principals


Principals will be notified of the survey through an advance letter sent a week or two before the questionnaire, following OMB clearance. The letter will include the date of the first mailing as well as an 800 number that principals can call if they have questions. We are opting to send an advance letter rather than a postcard notification because a letter can provide more information and generally looks more "official." The toll-free line will be answered by Census program staff in Suitland, MD, who have been explicitly trained on this study and on how to respond to this mailing. Staffing levels will ensure that at least one staff person is available at all times during the promised hours of operation. Copies of the advance letter to principals and to principals in special permission districts are included in Supporting Statement C, Appendices F and G.


Mailing to District Superintendents and Chief State School Officers


In order to achieve the highest possible response rate, we will send the study notification mailing to superintendents and CSSOs at the same time as the advance notification to principals. The purpose of this mailing is to provide districts with information about the survey and to inform them about the questionnaires being mailed to sampled schools in their district. It is not designed to ask for permission; rather, it is a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters to the superintendents/CSSOs are included in Supporting Statement C, Appendices D and E.


Mailing the Questionnaire to Principals


We will begin mailing questionnaires to school principals in late February 2010 (February 2012 for the 2012 collection) via Federal Express (FEDEX), because the price is similar to USPS Priority Mail and FEDEX guarantees delivery. The mailing will include a postage-paid return envelope. The cover letter will be tailored to the schools and will include the toll-free number at Census, along with the hours of operation and the return address. The principal will be asked to complete the questionnaire by the end of March 2010 (March 2012 for the 2012 collection), or to have it completed by the person in the school who is most knowledgeable about school crime and safety. A copy of the cover letter to principals is included in Supporting Statement C, Appendix H, and a copy of the postcard included for special permission districts is in Supporting Statement C, Appendix I.

Protocol for Follow-up Calls


Approximately two weeks after the estimated delivery of the questionnaire, we will call school principals to confirm that they have received the mailing and to ask whether they have any questions. Approximately one week after the first follow-up call, the NCES Project Officer will send a first reminder e-mail to all respondents, followed by a second reminder e-mail to nonresponders. E-mails will be personalized and sent to individual respondents. If requested, we will FEDEX another questionnaire to the school and call within 2 days to confirm receipt.


Second Mailing of Questionnaire


We will send a second mailing of the questionnaire to nonresponders in late March or early April via FEDEX.


Data Retrieval of Critical Items



For collection of "critical items," interview labor can be divided between follow-up with nonresponders (with the remaining weeks seeking critical-item completes as an alternative to the full survey) and follow-up with responders who have skipped items deemed critical (retrieval of missing data). For nonresponders, after May 10, 2010 (May 10, 2012 for the 2012 collection), we will offer "critical item" completes by fax or phone. The critical items identified by NCES for SSOCS:2010 and SSOCS:2012 will be the same as those defined for SSOCS:2006 and SSOCS:2008,2 and include the incidence data as well as school attributes.


Refusal Conversion for Schools that Will not Participate


At any time during data collection, if a school expresses strong concerns about confidentiality, the call will be directed to the Census Project Director (and possibly on to NCES) for formal assurance. All mailed refusal conversion materials would include the project's toll-free number as well as the number for the Project Director. The Project Director's direct number would be included in the interviewer FAQs as well.


The conversion letters are viewed as a second conversion attempt, after the interviewers have attempted conversion. Information learned during the refusal conversion interviews would be used to inform the content of the refusal conversion letters, if it is decided that these letters have potential for increasing response rates.


We propose that refusal conversion for the 2010 and 2012 SSOCS start about 1 month after the start of data collection and continue throughout the rest of the field period. This lag allows time to develop and design the refusal conversion training and protocol based on lessons learned during the first month of data collection. Throughout the field period, we will ensure a cooling-off period of at least 2 weeks (14 calendar days) before a refusing school is called again.



B3. Methods to Maximize Response Rates

NCES is committed to obtaining a high response rate in this survey. A key to achieving that response rate is tracking the response status of each sampled school, with telephone follow-up of those schools that do not respond promptly. Survey responses will be monitored through an automated receipt control system. Several other steps will also be taken to maximize the response rate. For example, the package containing the questionnaire will include a specially designed brochure describing the purpose of the study as well as highlights from the 2008 SSOCS. The mailed questionnaire will be accompanied by a postage-paid return envelope and will provide a toll-free 800 number that respondents may call to resolve questions about the survey; it will also provide a means for seeking help by e-mail. If a questionnaire is returned by the postal service, Census will seek to verify the correct address and remail the questionnaire. Re-mails will be sent by Federal Express to assure prompt receipt of the questionnaire and to give the survey greater importance in the eyes of potential respondents.


All questionnaires that are received will be reviewed for consistency and completeness; if a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data on key items), telephone interviewers will seek to obtain more complete responses. Interviewers who have received training in telephone interview techniques and specific training on the SSOCS survey will conduct all of the telephone interviews. After data retrieval is completed, a questionnaire must have at least 60 percent of all items and at least 80 percent of all critical items completed in order to be considered valid for inclusion in the dataset. Responses of “don’t know” will not be considered as valid responses when counting the number of items completed.
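A minimal sketch of this completeness rule follows; the item identifiers, response coding ("DK" for don't-know, None for unanswered), and function names are hypothetical illustrations of the 60/80 percent thresholds described above.

    def is_complete(responses, all_items, critical_items):
        # responses: dict mapping item id -> answer; None means unanswered,
        # and "DK" (don't know) does not count as a valid response.
        def valid_rate(items):
            answered = sum(
                1 for i in items
                if responses.get(i) is not None and responses.get(i) != "DK"
            )
            return answered / len(items)

        # Valid if >= 60% of all items and >= 80% of critical items answered.
        return valid_rate(all_items) >= 0.60 and valid_rate(critical_items) >= 0.80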


Endorsements


To further increase the perceived legitimacy of the survey and thus improve the response rate, NCES will seek endorsements from several organizations for the 2010 SSOCS. Each of the organizations below endorsed the 2006 and 2008 SSOCS questionnaires, and their endorsement will be sought for future iterations. These include:


1. American Association of School Administrators (AASA)

2. American Federation of Teachers (AFT)

3. American School Counselors Association (ASCA)

4. Association of American Educators (AAE)

5. Center for the Prevention of School Violence (CPSV)

6. Council of Chief State School Officers (CCSSO)

7. National Association of Elementary School Principals (NAESP)

8. National Association of School Resource Officers (NASRO)

9. National Association of School Safety and Law Enforcement Officers (NASSLEO)

10. National Association of Secondary School Principals (NASSP)

11. National Association of State Boards of Education (NASBE)

12. National Education Association (NEA)

13. National Middle School Association (NMSA)

14. National PTA

15. National School Boards Association (NSBA)

16. National School Safety Center (NSSC)

17. Northwest Regional Educational Laboratory (NWREL)

18. Police Executive Research Forum (PERF)

19. School Safety Advocacy Council (SSAC)

20. School Violence Resource Center (SVRC)



B4. Tests of Procedures

Project staff completed several pretest activities prior to the 2004 survey. Cognitive testing with nine principals is scheduled for May 2009; this round of testing will concentrate on new items and on items that have been problematic (e.g., items with low response rates). NCES is satisfied with the questionnaire and with the procedures.


B5. Individuals Responsible for Study Design and Performance

Several key staff are responsible for the study design and performance. They include:


Kathryn Chandler, Project Officer, National Center for Education Statistics (202.502.7486)

Lynn Bauer, Education Statistics Services Institute (202.403.6159)

Samantha Neiman, Education Statistics Services Institute (202.403.6554)

Jill DeVoe, Education Statistics Services Institute (202.403.6409)

Sally Ruddy, Education Statistics Services Institute (651.698.2581)

Steven Tourkin, Project Director and Education Surveys Branch Chief, Demographic Surveys Division, Census (301.763.3791)

Randall Parmer, Demographic Surveys Methods Division, Census (301.763.3567)




1 The public universe figures will be updated in August 2009 from the 2007–08 Common Core of Data. The figures shown here are the sampling universe figures for the 2007–08 SSOCS public schools, which were derived from the 2005–06 CCD.

2 The critical items for SSOCS:2010 are: 7, 8, 14, 15, 16, 17, 20, 23, 24, 25, 26, 27, 31, 32, and 33.


