
School Survey on Crime and Safety (SSOCS)

2016 and 2018


OMB #1850-0761 v.7



Supporting Statement Part B


National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education







March 13, 2015

Contents


Section B. Methodology
B1. Respondent Universe and Sample Design and Estimation
B2. Procedures for Collection of Information
B3. Methods to Maximize Response Rates
B4. Tests of Procedures
B6. Individuals Responsible for Study Design and Performance

List of tables


Table 1. Expected respondent universe for the proposed SSOCS 2011–12 public school sample, by school level and urbanicity, based on the 2009–10 CCD

Table 2. Expected respondent universe for the proposed SSOCS 2011–12 public school sample, by school level and enrollment size, based on the 2009–10 CCD

Table 3. Unweighted and weighted unit response rates, by selected school characteristics: School year 2009–10

Table 4. Unweighted and weighted unit response rates, by selected school characteristics: School year 2007–08




Section B. Methodology

The School Survey on Crime and Safety (SSOCS) questionnaire and many of the procedures were used in the 2006, 2008, and 2010 SSOCS data collections and are therefore well defined. The information below reflects plans for SSOCS:2016 and SSOCS:2018.


B1. Respondent Universe and Sample Design and Estimation

The sampling frame for SSOCS:2016 is the same as the 2015–16 National Teacher and Principal Survey (NTPS) sampling frame, with additional out-of-scope schools excluded. The NTPS sampling frame was constructed from the Public Elementary/Secondary School Universe file of the 2013–14 Common Core of Data (CCD), which is an NCES annual collection of fiscal and nonfiscal data for all public schools, public school districts, and state education agencies in the United States.


To create the NTPS sampling frame, certain types of schools were excluded from the CCD public school universe file, including schools in the U.S. outlying areas1 and Puerto Rico, overseas Department of Defense schools, newly closed schools, home schools, and schools with a high grade of kindergarten or lower (regular public schools, charter schools, and schools that have partial or total magnet programs with students in any of grades prekindergarten through 12 are included in the frame). The SSOCS sampling frame starts with the NTPS frame, but excludes schools run by the Bureau of Indian Education, schools specializing in special education or alternative education, vocational schools, and ungraded schools.


The size of the SSOCS population is estimated to be about 85,000 schools. Tables 1 and 2 show the distribution of the public school sampling universe for the 2011–12 SSOCS, which was not fielded; this universe was based on the 2009–10 CCD. The 2015–16 SSOCS sampling universe, which is based on the 2013–14 CCD, is expected to have a similar distribution. Tables 1 and 2 will be updated in August 2015 with data from the 2013–14 CCD.


Table 1. Expected respondent universe for the proposed SSOCS 2011–12 public school sample, by school level and urbanicity, based on the 2009–10 CCD

Urbanicity    Primary    Middle     High      Combined    Total
City           14,484     3,797     3,244        894      22,419
Suburb         15,349     4,919     3,332        529      24,129
Town            6,078     2,874     2,140        518      11,610
Rural          14,198     4,375     4,285      4,213      27,071
Total          50,109    15,965    13,001      6,154      85,229


Table 2. Expected respondent universe for the proposed SSOCS 2011–12 public school sample, by school level and enrollment size, based on the 2009–10 CCD

Enrollment size    Primary    Middle     High      Combined    Total
Less than 300       11,553     2,999     2,477      3,041      20,070
300–499             18,603     3,737     1,962      1,544      25,846
500–999             19,007     7,396     3,108      1,258      30,769
1,000+                 946     1,833     5,454        311       8,544
Total               50,109    15,965    13,001      6,154      85,229


Sample Selection and Response Rates


A stratified sample design will be used to select approximately 3,230 public schools for SSOCS:2016. For sample allocation purposes, strata will be defined by instructional level, type of locale, and enrollment size. Both minority enrollment and region will be used as sorting variables in the sample selection process to induce implicit stratification. SSOCS:2010 yielded an unweighted response rate of approximately 77 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 81 percent. SSOCS:2008 yielded an unweighted response rate of approximately 75 percent and a weighted response rate of approximately 77 percent. Based on the average weighted response rate of the two prior administrations of SSOCS, a response rate of approximately 79 percent is anticipated for SSOCS:2016 and is reflected in the sample size.


Sample Design for SSOCS:2010


A stratified sampling design was used to select schools for SSOCS:2010.2 For sample allocation and sample selection, strata were defined by instructional level, type of locale, and enrollment size. Within each of four instructional level categories, the sample was allocated to each of 16 subgroups formed by the cross-classification of locale (four levels) and enrollment size (four levels) in proportion to an aggregate measure of size derived for each subgroup. The aggregate measure of size for a specific locale-by-size cell within an instructional level is equal to the sum of the square roots of the enrollments of the schools in that cell.


The initial goal of SSOCS:2010 was to collect data from at least 2,550 schools, taking nonresponse into account. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the U.S. public school population. However, while the majority of U.S. public schools are primary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would, therefore, have yielded an inefficient sample design because the sample composition would have included more primary schools (where crime is an infrequent event) than middle or high schools (where crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools. Based on the aggregate measure of size, the desired sample of 2,550 schools was allocated to the four instructional levels as follows: 640 primary schools, 895 middle schools, 915 high schools, and 100 combined schools. Within instructional level, the overall sample of schools was then allocated to each stratum in proportion to the measure of size. Schools in SSOCS:2000, SSOCS:2004, SSOCS:2006, and SSOCS:2008 were allocated to instructional levels in a similar manner.


After the allocation for each stratum was determined, percent minority and region were used as implicit stratification variables by sorting the school lists in each stratum by these variables before sample selection. The formula used to calculate measure of size is given as:


MOS(h) = \sum_{i=1}^{N_h} \sqrt{E_{hi}}


where E_{hi} is the enrollment of the ith school in stratum h and N_h is the total number of schools in stratum h.


The measure of size for an instructional level, MOS(l), is found by summing the 16 stratum measures of size, MOS(h), that comprise that level. The proportion of the level's sample allocated to stratum h is then the ratio MOS(h) / MOS(l).
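To make the allocation mechanics concrete, the following is a minimal sketch of the square-root-of-enrollment allocation, assuming a simple list of school records; the record layout, function name, and figures are illustrative assumptions, not the Census Bureau's production sampling code:

```python
import math
from collections import defaultdict

def allocate_sample(schools, level_sample_size):
    """Allocate one instructional level's target sample across its
    locale-by-enrollment strata in proportion to
    MOS(h) = sum of sqrt(enrollment) over schools in stratum h."""
    mos = defaultdict(float)
    for s in schools:
        # each record: {"stratum": (locale, size_class), "enrollment": int}
        mos[s["stratum"]] += math.sqrt(s["enrollment"])
    mos_level = sum(mos.values())  # MOS(l) for the instructional level
    # n_h = n_l * MOS(h) / MOS(l); simple rounding stands in for the
    # integer allocation procedure actually used
    return {h: round(level_sample_size * m / mos_level)
            for h, m in mos.items()}

# Illustrative use with made-up schools in two strata of one level:
schools = [
    {"stratum": ("city", "500-999"), "enrollment": 640},
    {"stratum": ("city", "500-999"), "enrollment": 810},
    {"stratum": ("rural", "less than 300"), "enrollment": 250},
]
print(allocate_sample(schools, level_sample_size=100))
```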


Sample Design for SSOCS:2016 and SSOCS:2018


The same general sampling design used for SSOCS:2010 will be adopted for the selection of schools in SSOCS:2016 and SSOCS:2018 with regard to the stratification variables, the number of strata, the method of sample allocation, and the sorting variables used before selection.


The two main objectives of the SSOCS:2016 and SSOCS:2018 sampling design are identical to those of SSOCS:2010: (1) to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety, and (2) to maintain precise estimates of change in various characteristics relating to crime between the 2003–04, 2005–06, 2007–08, 2009–10, 2015–16, and later SSOCS administrations.3 Adopting the same general design increases the precision of the estimate of change. For sample allocation and sample selection purposes, strata were defined in prior administrations of SSOCS by crossing instructional level, type of locale, and enrollment size. In addition, minority status and region were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The three explicit and two implicit stratification variables have been shown to be related to school crime and thus create meaningful strata for this survey.


A study was conducted to determine what value might be gained from selecting the SSOCS:2016 sample in tandem with the 2015–16 National Teacher and Principal Survey (NTPS). This research suggested that SSOCS:2016 should continue to be sampled independently from NTPS. This is consistent with the way SSOCS:2008 was sampled with regard to the Schools and Staffing Survey (SASS; the predecessor to NTPS) sample. The chief advantage of the independent sampling approach is that an unbiased sample can be selected in a very simple and straightforward manner that aligns with the sample selection of previous SSOCS administrations.


SSOCS:2016 will take advantage of the lessons learned from the 2010 and 2008 data collections. Response rates achieved for various strata and substrata in SSOCS:2010 and SSOCS:2008 have been examined in order to determine the proper size of the initial sample selection for 2016. Table 3 contains SSOCS:2010 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region. Table 4 contains the response rates for SSOCS:2008. When using 2010 and 2008 response rates to estimate 2016 response rates, the goal is to ensure a sufficient number of completed cases for analysis.


The base-weighted response rate was 81 percent in SSOCS:2010 and 77 percent in SSOCS:2008. The sample design for SSOCS:2016 was built on the expectation of a response rate similar to the average response rate of the two prior administrations of SSOCS (79 percent) to ensure that a sufficient number of completed interviews would be obtained.


Calculation of Weights

Weights will be attached to each surveyed school so that the weighted data will represent population levels. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. As with SSOCS:2010, nonresponse weighting adjustment cells for the SSOCS:2016 data will be determined using a categorical search algorithm called Chi-Square Automatic Interaction Detection (CHAID). CHAID begins by identifying the school-level characteristics of interest that are the best predictors of response. It divides the dataset into groups so that the unit response rate within cells is as constant as possible and the unit response rate between cells is as different as possible. The characteristics of interest as predictors of response must be available for both respondents and nonrespondents in order to conduct a CHAID analysis, and, in the case of SSOCS, will be available through the CCD sampling frame. Weighting adjustment cells for 2018 SSOCS data will be determined based on bias analysis results from 2016 SSOCS data in order to create the adjustment for nonresponse. The final, adjusted weight will be raked so that the sum of the weights matches the number of schools derived from the latest CCD public school universe file.
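The sketch below illustrates the two weighting steps that follow cell formation, assuming the adjustment cells have already been produced by the CHAID analysis; the record layout is hypothetical, and the final one-dimensional scaling is a simplified stand-in for full raking, not the production weighting system:

```python
from collections import defaultdict

def nonresponse_adjusted_weights(records, frame_total):
    """Apply a within-cell nonresponse adjustment to base weights, then
    scale so the adjusted weights sum to the frame count of schools.
    records: list of dicts with "cell", "base_weight", and "respondent"."""
    base_all = defaultdict(float)   # sum of base weights, all sampled schools
    base_resp = defaultdict(float)  # sum of base weights, respondents only
    for r in records:
        base_all[r["cell"]] += r["base_weight"]
        if r["respondent"]:
            base_resp[r["cell"]] += r["base_weight"]
    adjusted = {}
    for i, r in enumerate(records):
        if r["respondent"]:
            # Inflate by the inverse of the cell's weighted response rate
            adjusted[i] = (r["base_weight"]
                           * base_all[r["cell"]] / base_resp[r["cell"]])
    # One-dimensional stand-in for raking: scale to the CCD frame count
    scale = frame_total / sum(adjusted.values())
    return {i: w * scale for i, w in adjusted.items()}
```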

Table 3. Unweighted and weighted unit response rates, by selected school characteristics: School year 2009–10

School characteristic          Initial    Completed   Non-            Ineligible3   Unweighted response   Weighted response
                               sample     survey1     respondents2                  rate (percent)4       rate (percent)5

Total                          3,476      2,648       779             49            77.3                  80.8

Level6
  Primary                        863        684       168             11            80.3                  81.4
  Middle                       1,208        909       280             19            76.5                  78.0
  High school                  1,273        948       314             11            75.1                  78.1
  Combined                       132        107        17              8            86.3                  87.6

Enrollment size
  Less than 300                  372        304        48             20            86.4                  85.8
  300–499                        673        526       136             11            79.5                  81.4
  500–999                      1,310      1,009       287             14            77.9                  79.4
  1,000 or more                1,121        809       308              4            72.4                  73.0

Urbanicity
  City                         1,031        703       303             25            69.9                  73.0
  Suburb                       1,185        881       290             14            75.2                  76.7
  Town                           455        391        59              5            86.9                  87.2
  Rural                          805        673       127              5            84.1                  88.1

Percent White enrollment
  More than 95 percent           373        336        36              1            90.3                  88.4
  More than 80 to 95 percent     868        715       145              8            83.1                  86.3
  More than 50 to 80 percent     914        703       198             13            78.0                  81.9
  50 percent or less           1,321        894       400             27            69.1                  72.9

Region
  Northeast                      595        444       149              2            74.9                  78.3
  Midwest                        822        646       163             13            79.9                  81.3
  South                        1,282        965       296             21            76.5                  82.1
  West                           777        593       171             13            77.6                  79.9

1In SSOCS:2010, a minimum of 60 percent of the 231 subitems eligible for recontact (i.e., all subitems in the questionnaire except those associated with the introductory items) were required to be answered for the survey to be considered complete. This includes a minimum of 80 percent of the 89 critical subitems (72 of 89), 60 percent of item 16 subitems (18 of 30), 93 percent of item 23 subitems in columns 2, 3, and 4 (14 of 15), and 60 percent of item 23 subitems in columns 1 and 5 (6 of 10).

2Nonrespondents include 80 schools whose districts denied NCES permission to survey them, as well as eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, changed from a regular public school to an alternative school, or were not a school ("not a school" generally refers to a school record for an organization that does not provide classroom instruction, e.g., an office overseeing a certain type of program or offering tutoring services only).

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample - known ineligibles).

5The weighted response rate is calculated by applying the base sampling rates to the following ratio: completed cases / (total sample - known ineligibles).

6Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS:2010).
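As a worked example of the formula in footnote 4, the total row of Table 3 gives

\[
\frac{2{,}648}{3{,}476 - 49} = \frac{2{,}648}{3{,}427} \approx 0.773,
\]

that is, the 77.3 percent unweighted response rate shown above.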


Table 4. Unweighted and weighted unit response rates, by selected school characteristics: School year 2007–08

School characteristic          Initial    Completed   Non-            Ineligible3   Unweighted response   Weighted response
                               sample     survey1     respondents2                  rate (percent)4       rate (percent)5

Total                          3,484      2,560       872             52            75                    77.16

Level
  Primary                        833        618       200             15            76                    76.96
  Middle                       1,214        897       297             20            75                    76.96
  High school                  1,295        936       347             12            73                    76.22
  Combined                       142        109        28              5            80                    80.82

Enrollment size
  Less than 300                  371        285        60             26            83                    83.33
  300–499                        630        486       131             13            79                    76.74
  500–999                      1,318        992       315             11            76                    76.22
  1,000 or more                1,165        797       366              2            69                    68.60

Urbanicity
  City                         1,046        679       335             32            67                    69.44
  Suburb                       1,151        814       329              8            71                    73.10
  Town                           469        390        70              9            85                    84.61
  Rural                          818        677       138              3            83                    83.85

Percent minority enrollment
  Less than 5 percent            427        353        70              4            83                    84.32
  5 to less than 20 percent      892        707       181              4            80                    80.77
  20 to less than 50 percent     895        656       231              8            74                    76.66
  50 percent or more           1,270        844       390             36            68                    71.38

Region
  Northeast                      597        399       189              9            68                    69.51
  Midwest                        832        648       168             16            79                    80.78
  South                        1,274        950       304             20            76                    79.71
  West                           781        563       211              7            73                    74.60


1In SSOCS:2008, a minimum of 60 percent of the 241 subitems eligible for recontact (i.e., all subitems in the questionnaire except for the seven introductory items) were required to have been answered for a survey to be considered complete, including a minimum of 80 percent of the 103 critical subitems.

2Nonrespondents include those eligible schools that did not answer the minimum number of items required for a survey to be considered complete.

3Ineligible schools include those that had closed, merged with another school at a new location, or changed from a regular public school to an alternative school.

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample - known ineligibles).

5The weighted response rate is calculated by applying the base sampling rates to the following ratio: completed cases / (total sample - known ineligibles).

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS:2008).



Methods for Variance Estimation

Standard errors of the estimates will be estimated using jackknife repeated replication (JRR). Replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs will be provided, as will the weights for all replicates that were formed in order to calculate variances.
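For illustration, the following is a minimal sketch of how a JRR standard error for a weighted total could be computed from the delivered full-sample and replicate weights; the variable names and the paired (JK2) replicate convention are assumptions, not the published SSOCS variance specification:

```python
import numpy as np

def jrr_standard_error(y, full_weights, replicate_weights):
    """Jackknife repeated replication for a weighted total.
    y: (n,) response values; full_weights: (n,) full-sample weights;
    replicate_weights: (n, R) matrix with one column per replicate.
    Assumes paired (JK2) replicates, for which the variance is the sum
    of squared deviations of replicate estimates from the full-sample
    estimate."""
    theta = float(np.sum(y * full_weights))   # full-sample estimate
    theta_r = replicate_weights.T @ y         # one estimate per replicate
    variance = float(np.sum((theta_r - theta) ** 2))
    return variance ** 0.5
```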

B2. Procedures for Collection of Information

The data collection methods used in SSOCS:2016 and SSOCS:2018 will be based on those used in SSOCS:2010, including the use of a mail survey with intensive telephone and e-mail follow-up. The methods are described in more detail in the following sections.

Steps in the Data Collection Process

The following is a description of the main tasks in the SSOCS data collection process: drawing the sample; identifying special-permission districts; mailing advance letters to school principals, district superintendents, and Chief State School Officers (CSSOs); mailing the full survey package to principals; following up by phone with nonresponding schools; and conducting refusal conversion efforts. All communication materials sent to potential respondents are designed for refusal aversion (see Appendix A for letters to superintendents and principals, as well as postcards to schools in special-permission districts and reminder e-mails and voicemails to respondents).

Drawing the Sample

The sample of schools will be drawn in September preceding data collection in order to identify the special-permission school districts early in the survey cycle. This will ensure that these districts have the necessary information to present to their approval board in the fall of 2015 and 2017.

Identifying Special-Permission Districts and the Need for Research Applications

NCES has a list of special-permission districts that require research applications. These districts will be contacted by the Census Bureau to obtain any materials necessary for their approval of SSOCS. The special contact districts are those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Based on an initial assessment of previous SSOCS data collections, it is estimated that 181 special contact districts will be in the sample.

Approximately 68 percent of the special districts contacted for the 2009–10 SSOCS responded to the request for external research: 55.2 percent approved and 12.7 percent refused. Several districts that have refused multiple rounds of SSOCS have been identified for special outreach. We plan to engage senior NCES staff and others with personal contacts at these districts in our efforts to gain approval for this cycle of SSOCS.

The special district contact operations for the 2015–16 SSOCS will be based on those used in the 2011–12 Schools and Staffing Survey (SASS) data collection. Districts are identified as 'special districts' prior to data collection either because they were flagged as such during previous cycles of SSOCS or other NCES studies, or because they were identified while district contact information was being updated from online sources. The application requirements for each district are obtained either through direct contact via phone or e-mail or through the district website. Most districts require that the following documents be provided in the research request packet:

  • Study proposal with a timeline of the study

  • Study Abstract and/or Executive Summary

  • IRB approval (SSOCS is exempt from seeking IRB approval)

  • Consent form

  • Project Director’s resume

  • Copy of any communications that would be sent to participants

  • Copy of questionnaires

Some districts require a processing fee (approximately $50–$200) before the research proposal can be evaluated.

Advance Notification to Principals

Principals will be notified of the survey through an advance letter and e-mail sent a week or two before the questionnaire mailing, following OMB clearance. The letter will include information about the study, the date of the first mailing, and a toll-free number that principals can call if they have questions. The toll-free number will be answered by Census Bureau program staff in Suitland, Maryland, who have been trained specifically for this study and in how to respond to calls from schools. Staffing levels will ensure that at least one staff person is available at all times during the promised hours of operation. Copies of the advance letters to principals, including those in special-permission districts, are included in Appendix A.

Mailing the Study Notification to District Superintendents and Chief State School Officers

In order to achieve the highest possible response rate, we will send the study notification mailing to superintendents and CSSOs at the same time as the advance notification to principals. The purpose of this mailing is to provide districts with information about the survey and to inform them about the questionnaires being mailed to sampled schools in their district. It is not designed to ask for permission; rather, it is designed as a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters to the superintendents/CSSOs are included in Appendix A.

Mailing the Questionnaire to Principals

We will begin mailing questionnaires to school principals in late February 2016 (February 2018 for the 2018 collection). The mailing will include a postage-paid return envelope. The cover letter will be personalized to the school principal and will include the toll-free number at the Census Bureau, along with the hours of operation and the return address. The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—by the end of March 2016 (March 2018 for the 2018 collection). A copy of the cover letter to principals and a copy of the postcard for special-permission districts are included in Appendix A.

Protocol for Follow-up Calls

Approximately 2 weeks after the estimated delivery of the questionnaire to school principals, the Census Bureau will initiate phone calls to confirm that principals have received the mailing and to ask whether they have any questions. Approximately 1 week after the first follow-up call, the first reminder e-mail will be sent to all respondents from the NCES Project Director. A second reminder e-mail will be sent to nonrespondents from the NCES Project Director. E-mails will be personalized and sent to individual respondents. If requested, another questionnaire will be sent to the school, followed by a call within 2 days to confirm receipt.

Second Mailing of Questionnaire

A second mailing of questionnaires to nonrespondents will be done in late March or early April.

Data Retrieval of Critical Items

Interview labor for the collection of "critical items" can be divided between follow-up with nonrespondents (in the remaining weeks of data collection, seeking "critical item" completes as an alternative to the full survey) and follow-up with respondents who skipped items deemed critical (retrieval of missing data). For nonrespondents, after May 11, 2016 (May 11, 2018 for the 2018 collection), we will offer "critical item" completes by fax or phone. The "critical items" identified by NCES for SSOCS:2016 and SSOCS:2018 will be the same as those defined for SSOCS:2010,4 which include the incidence data as well as school attributes.



Refusal Conversion for Schools That Will Not Participate

At any time during data collection, if a school expresses strong concerns about confidentiality, these concerns will be directed to the Census Project Director (and possibly to NCES) for formal assurance. All mailed refusal conversion materials will include the project’s toll-free number as well as the Project Director’s direct number.

The refusal conversion letters would be viewed as a second conversion attempt, after the interviewers have attempted conversion. Information learned during the refusal conversion interviews would be used to inform the content of the refusal conversion letters, if it is decided that these letters have the potential to increase response rates.

The 2016 and 2018 SSOCS refusal conversion will begin about one month after the start of data collection and continue throughout the rest of the field period. This lag will allow time to develop the refusal conversion training and protocol based on lessons learned during the first month of data collection. Throughout the field period, we will ensure a "cooling off" period of at least 14 calendar days before a refusing school is called again.

B3. Methods to Maximize Response Rates

NCES is committed to obtaining a high response rate in SSOCS:2016 and SSOCS:2018. A key to achieving a high response rate is to track the response status of each sampled school, with telephone follow-ups of those schools that do not respond promptly. To help track response status, survey responses will be monitored through an automated receipt control system.

Several other steps will also be taken to maximize the response rate. For example, the package containing the questionnaire will include a specially designed brochure describing the purpose of the study, as well as highlights from SSOCS:2010. Further, a pen with the SSOCS logo and website address will be included in the package to help remind the respondent to complete the questionnaire. The mailed questionnaire will be accompanied by a postage-paid return reply envelope and will provide a toll-free 800 number that people may call to resolve questions about the survey. It also will provide a means for seeking help by e‑mail. If a questionnaire is returned by the U.S. Postal Service, the Census Bureau will seek to verify the correct address and remail the questionnaire. Questionnaires will be remailed by Federal Express to ensure their prompt receipt and to give the survey greater importance in the eyes of the potential respondents.

All questionnaires that are received will be reviewed for consistency and completeness. If a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data for key items), telephone interviewers will seek to obtain more complete responses. Interviewers who have received training in telephone interview techniques and specific training in SSOCS will conduct all of the telephone interviews. After data retrieval is completed, a questionnaire must have at least 60 percent of all items and at least 80 percent of all critical items completed to be considered valid for inclusion in the dataset. Responses of "don't know" (which apply only to item 17) will not be counted as valid responses when tallying the number of items completed.
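A minimal sketch of this completeness rule, assuming item responses keyed by item number; the data layout and response codes are illustrative assumptions, not the actual SSOCS processing specification:

```python
def is_complete(responses, all_items, critical_items, dont_know_items={"17"}):
    """responses: dict mapping item id to value (None if unanswered).
    An item counts as answered unless it is blank, or is a "don't know"
    response on an item where that code applies (item 17 in SSOCS)."""
    def answered(item):
        v = responses.get(item)
        if v is None:
            return False
        return not (item in dont_know_items and v == "don't know")
    n_all = sum(answered(i) for i in all_items)
    n_crit = sum(answered(i) for i in critical_items)
    # At least 60 percent of all items and 80 percent of critical items
    return (n_all >= 0.60 * len(all_items)
            and n_crit >= 0.80 * len(critical_items))
```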

Endorsements

To further increase the perceived legitimacy of the survey and thus improve the response rate, the Census Bureau will seek endorsements from relevant education and school safety organizations on behalf of NCES. Each of the organizations below endorsed SSOCS:2010, and their endorsement will be sought again for SSOCS:2016 and SSOCS:2018:

  • American Association of School Administrators

  • American Federation of Teachers

  • American School Counselors Association

  • Association of American Educators

  • Center for the Prevention of School Violence

  • Council of Chief State School Officers

  • National Association of Elementary School Principals

  • National Association of School Resource Officers

  • National Association of School Safety and Law Enforcement Officers

  • National Association of Secondary School Principals

  • National Association of State Boards of Education

  • National Education Association

  • National Middle School Association

  • National PTA

  • National School Boards Association

  • National School Safety Center

  • Northwest Regional Educational Laboratory

  • Police Executive Research Forum

  • School Safety Advocacy Council

  • School Violence Resource Center

B4. Tests of Procedures

Project staff completed several pretest activities during the initial development of SSOCS and prior to several additional iterations of the survey. As part of the development of the 2015–16 SSOCS, AIR conducted cognitive testing with 17 administrators in the winter of 2014–15 (OMB# 1850-0803). The cognitive testing concentrated on new items, items that had undergone substantial revisions, and items that have proven to be problematic (e.g., because of low response rates). Based on the results of the cognitive testing, NCES is confident in the validity of the finalized items on the questionnaire.

B6. Individuals Responsible for Study Design and Performance

Several key staff are responsible for the study design and performance. They are:

  • Rachel Sutcliffe, Project Director, National Center for Education Statistics, (202) 502-7684

  • Jana Kemp, American Institutes for Research, (202) 403-6566

  • Samantha Neiman, American Institutes for Research, (312) 588-7345

  • Sally Ruddy, American Institutes for Research, (651) 698-2581

  • Carolyn Pickering, Education Surveys Branch Survey Director, Associate Director for Demographic Programs, Census Bureau, (301) 763-3873

  • Randall Parmer, Demographic Surveys Methods Division, Census Bureau, (301) 763-3567

1 The U.S. outlying areas are American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the U.S. Virgin Islands.

2 Note that SSOCS data were last collected during the 2009–10 school year (that is, in SSOCS:2010).

3 Again, note that SSOCS data were last collected during the 2009–10 school year.

4 The critical items for SSOCS:2016 are 11, 18, 19, 24, 25, 26, 28, 32, 33, 35, 36, 37, 38, 39, 43, 44, and 45 (see Appendix B).
