School Survey on Crime and Safety (SSOCS)

2018 and 2020


OMB #1850-0761 v.12



Supporting Statement Part B






National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education







March 2017

revised May 2017




Contents

Section B. Methodology
B1. Respondent Universe and Sample Design and Estimation
B2. Procedures for the Collection of Information
B3. Methods to Maximize Response Rates
B4. Tests of Procedures
B5. Individuals Responsible for Study Design and Performance

List of tables


Table 1. Expected respondent universe for the SSOCS:2018 public school sample, by school level and urbanicity, based on the 2014–15 CCD

Table 2. Expected respondent universe for the SSOCS:2018 public school sample, by school level and enrollment size, based on the 2014–15 CCD

Table 3. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2015–16




Section B. Methodology

The 2018 School Survey on Crime and Safety (SSOCS) questionnaire has only minor content changes that distinguish it from the 2016 questionnaire. However, the 2018 collection involves two methodological changes: a test of a web version of the survey for a subsample of schools and an incentive experiment for half of the sample. If these tests yield higher response rates, NCES will consider using the web-based instrument, as well as an incentive, for all respondents in the 2020 collection. The information presented in this document for SSOCS:2018 also applies to SSOCS:2020, except where noted otherwise.

B1. Respondent Universe and Sample Design and Estimation

The sampling frame for SSOCS:2018 is constructed from the public school sampling frame for the 2017–18 National Teacher and Principal Survey (NTPS), but it excludes schools that are considered out-of-scope for SSOCS. The NTPS public school sampling frame was constructed from the Public Elementary/Secondary School Universe data file of the 2014–15 Common Core of Data (CCD), which is an annual NCES collection of fiscal and nonfiscal data for all public schools, public school districts, and state education agencies in the United States.

To create the NTPS sampling frame, certain types of schools were excluded from the CCD public school universe file: schools in the U.S. outlying areas1 and Puerto Rico, overseas Department of Defense schools, newly closed schools, home schools, and schools with a high grade of kindergarten or lower (regular public schools, charter schools, and schools that have partial or total magnet programs with students in any of grades prekindergarten through 12 are included in the frame). The SSOCS sampling frame starts with the NTPS frame, but excludes schools run by the Bureau of Indian Education or Department of Defense, schools specializing in special education or alternative education, vocational schools, virtual schools, and ungraded schools.
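The frame construction described above amounts to a sequence of filters applied to a CCD-derived school list. The following Python sketch illustrates the SSOCS-specific exclusions; the column names and type codes are hypothetical stand-ins, not the actual CCD file layout.

    # Illustrative sketch only: "school_type" and "ungraded" are
    # hypothetical columns, not actual CCD field names.
    import pandas as pd

    def build_ssocs_frame(ntps_frame: pd.DataFrame) -> pd.DataFrame:
        """Apply the SSOCS-specific exclusions to an NTPS-style frame."""
        excluded_types = {
            "BIE",          # Bureau of Indian Education schools
            "DOD",          # Department of Defense schools
            "SPECIAL_ED",   # schools specializing in special education
            "ALTERNATIVE",  # alternative-education schools
            "VOCATIONAL",   # vocational schools
            "VIRTUAL",      # virtual schools
        }
        in_scope = (~ntps_frame["school_type"].isin(excluded_types)
                    & ~ntps_frame["ungraded"])  # drop ungraded schools
        return ntps_frame.loc[in_scope].copy()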

The size of the SSOCS population is estimated to be about 84,000 schools. Tables 1 and 2 show the expected distribution of the public school sampling universe for SSOCS:2018, based on the 2014–15 CCD.

Table 1. Expected respondent universe for the SSOCS:2018 public school sample, by school level and urbanicity, based on the 2014–15 CCD

Urbanicity    Primary    Middle    High      Combined    Total
City          14,805     3,838     3,402     984         23,029
Suburb        17,339     5,617     3,927     661         27,544
Town          5,721      2,680     2,159     524         11,084
Rural         11,701     3,523     3,388     4,149       22,761
Total         49,566     15,658    12,876    6,318       84,418


Table 2. Expected respondent universe for the SSOCS:2018 public school sample, by school level and enrollment size, based on the 2014–15 CCD

Enrollment size    Primary    Middle    High      Combined    Total
Less than 300      10,495     2,837     2,420     2,835       18,587
300–499            18,122     3,515     2,018     1,537       25,192
500–999            19,911     7,454     3,094     1,492       31,951
1,000+             1,038      1,852     5,344     454         8,688
Total              49,566     15,658    12,876    6,318       84,418


Sample Selection and Response Rates

A stratified sample design will be used to select approximately 4,800 public schools for SSOCS:2018 (compared to 3,553 public schools in SSOCS:2016) in order to obtain the 2,761 completed interviews needed to ensure precision in the estimates. For sample allocation purposes, strata will be defined by instructional level, locale, and enrollment size. Minority enrollment, region, and state will be used as sorting variables in the sample selection process to induce implicit stratification.

SSOCS:2016 yielded an unweighted response rate of approximately 59 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 63 percent. Both the unweighted and weighted response rates represent a significant drop from the prior collection, SSOCS:2010, which yielded unweighted and weighted response rates of 77 and 81 percent, respectively.

Sample Design for SSOCS:2016

A stratified sample design was used to select schools for SSOCS:2016. For sample allocation and sample selection, strata were defined by instructional level, locale, and enrollment size. Within each of four instructional level categories, the sample was allocated to each of 16 subgroups formed by the cross-classification of locale (four levels) and enrollment size (four levels) in proportion to an aggregate measure of size derived for each subgroup. The aggregate measure of size for a specific locale-by-enrollment cell within an instructional level is equal to the sum of the square roots of the enrollments of the schools in that cell.

The goal of SSOCS:2016 was to collect data from at least 2,550 schools. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the U.S. public school population. However, while the majority of U.S. public schools are primary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would therefore have yielded an inefficient sample design, because the sample would have included more primary schools (where crime is an infrequent event) than middle or high schools (where crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools, as follows: 640 primary schools, 895 middle schools, 915 high schools, and 100 combined schools. After inflating the sample size to allow for nonresponse, the resulting sample allocation by school level was 849 primary schools, 1,230 middle schools, 1,347 high schools, and 127 combined schools, for a total sample size of 3,553 schools. Schools in SSOCS:2000, SSOCS:2004, SSOCS:2006, SSOCS:2008, and SSOCS:2010 were allocated to instructional levels in a similar manner.

After the allocation for each stratum was determined, percent minority, region, and state were used as implicit stratification variables by sorting the school lists in each stratum by these variables before sample selection. The formula used to calculate measure of size is given as

MOS(h) = \sum_{i=1}^{N_h} \sqrt{E_{hi}}

where E_{hi} is the enrollment of the ith school in stratum h, and N_h is the total number of schools in stratum h.

The measure of size for an instructional level, MOS(l), is found by summing the 16 stratum measures of size, MOS(h), that comprise the level. The ratio of a stratum's measure of size to the overall measure of size for the level, MOS(h)/MOS(l), gives the proportion of the level's sample to be allocated to that stratum.
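As a sketch of this allocation, the following Python fragment computes MOS(h) for each stratum within a level and allocates the level's sample in proportion to MOS(h)/MOS(l). The function and variable names are illustrative, and simple rounding stands in for the controlled rounding a production allocation would use.

    import math
    from collections import defaultdict

    def allocate_sample(schools, level_sample_size):
        """Allocate one instructional level's sample across its 16
        locale-by-enrollment strata in proportion to
        MOS(h) = sum of sqrt(enrollment) over the schools in stratum h.

        `schools` is an iterable of (stratum_id, enrollment) pairs for
        the schools in this instructional level.
        """
        mos = defaultdict(float)
        for stratum, enrollment in schools:
            mos[stratum] += math.sqrt(enrollment)      # MOS(h)
        mos_level = sum(mos.values())                  # MOS(l)
        # Each stratum receives the share MOS(h) / MOS(l) of the sample.
        return {h: round(level_sample_size * m / mos_level)
                for h, m in mos.items()}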

Sample Design for SSOCS:2018 and SSOCS:2020

The same general sample design used for SSOCS:2016 will be adopted for the selection of schools in SSOCS:2018 and SSOCS:2020 with regard to the stratification variables, the number of strata, the method of sample allocation, and the sorting variables used before selection.

The two main objectives of the SSOCS:2018 and SSOCS:2020 sampling design are identical to those of SSOCS:2016: (1) to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety; and (2) to maintain precise estimates of change in various characteristics relating to crime between the earliest and most recent SSOCS administrations. Adopting the same general design increases the precision of the estimate of change. For sample allocation and sample selection purposes, strata were defined in prior administrations of SSOCS by crossing instructional level, locale, and enrollment size. In addition, percent minority, region, and state were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The three explicit and three implicit stratification variables have been shown to be related to school crime and thus create meaningful strata for this survey.

While the general sampling design for SSOCS:2018 remains the same as in prior collections, there are three notable differences. First, to reduce the burden on districts and their schools, the SSOCS and NTPS teams are coordinating their special district recruitment efforts (approved in March 2017; OMB# 1850-0761 v.11) and drawing each sample in tandem to minimize overlap in the schools sampled for both surveys. Second, a web experiment will be conducted during the SSOCS:2018 collection only. Approximately 1,150 cases from the sample will be randomly selected to receive the web-based instrument. This experimental group will be selected in a manner that reflects the overall sampling design, so that responses from this group can be used when calculating estimates. Third, an incentive experiment will be conducted during the SSOCS:2018 collection only. Approximately half of the entire sample will be randomly selected to receive a $10 prepaid gift card as part of the initial mailout. The incentive is not contingent on the respondent's completion of the survey.

SSOCS:2018 will take advantage of the lessons learned from the 2016 data collection. The response rates achieved for various strata and substrata in SSOCS:2016 have been examined in order to determine the proper size of the initial sample selection for 2018. Table 3 displays the SSOCS:2016 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region. The 2016 response rates were used to estimate the 2018 response rates to ensure a sufficient number of completed cases for analysis.

Calculation of Weights

Weights will be attached to each surveyed school so that the weighted data will represent population levels. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. As with SSOCS:2016, nonresponse weighting adjustment cells for the SSOCS:2018 data will be determined using a categorical search algorithm called Chi-Square Automatic Interaction Detection (CHAID). CHAID begins by identifying the school-level characteristics of interest that are the best predictors of response. It divides the dataset into groups so that the unit response rate within cells is as constant as possible and the unit response rate between cells is as different as possible. The characteristics of interest as predictors of response must be available for both respondents and nonrespondents in order to conduct a CHAID analysis, and, in the case of SSOCS, will be available through the CCD sampling frame. Weighting adjustment cells for SSOCS:2018 data will be determined based on bias analysis results from the SSOCS:2016 data in order to create the adjustment for nonresponse. The final, adjusted weights will be raked so that the sum of the weights matches the number of schools derived from the latest CCD public school universe file.
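The weighting steps can be illustrated with a simplified Python sketch. Here the CHAID-derived adjustment cells are taken as given rather than fitted, and a single ratio adjustment to the frame count stands in for full raking; the record keys and function name are hypothetical.

    from collections import defaultdict

    def final_weights(cases, frame_total):
        """Simplified SSOCS-style weighting sketch.

        `cases` is a list of dicts with keys 'base_weight' (sampling
        base weight), 'cell' (nonresponse adjustment cell), and
        'respondent' (bool). Returns final weights for respondents only.
        """
        sampled, responded = defaultdict(float), defaultdict(float)
        for c in cases:
            sampled[c["cell"]] += c["base_weight"]
            if c["respondent"]:
                responded[c["cell"]] += c["base_weight"]
        # Within each cell, inflate respondents' weights to account for
        # the nonrespondents in that cell.
        adjusted = [c["base_weight"] * sampled[c["cell"]] / responded[c["cell"]]
                    for c in cases if c["respondent"]]
        # Scale so the weights sum to the frame count (a one-dimensional
        # stand-in for raking).
        scale = frame_total / sum(adjusted)
        return [w * scale for w in adjusted]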

Methods for Variance Estimation

Standard errors of the estimates will be estimated using jackknife repeated replication (JRR). Replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs will be provided, as will the weights for all replicates that were formed in order to calculate variances.
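A minimal sketch of the variance computation, assuming the common paired half-sample (JK2) construction in which each replicate estimate is recomputed with one half-sample's weights adjusted:

    def jrr_variance(full_estimate, replicate_estimates):
        """Jackknife repeated replication variance under the paired
        half-sample (JK2) form: the sum of squared deviations of the
        replicate estimates from the full-sample estimate. Other
        replicate constructions require different scaling factors.
        """
        return sum((r - full_estimate) ** 2 for r in replicate_estimates)

    # The standard error of an estimate is the square root of this sum,
    # with each replicate estimate computed using its replicate weights.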

Table 3. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2015–16

School characteristic          Initial    Completed    Non-            Ineligible3    Unweighted       Weighted
                               sample     survey1      respondents2                   response rate    response rate
                                                                                      (percent)4       (percent)5
Total                          3,553      2,092        1,442           19             59.2             63.0

Level6
  Primary                      849        516          325             8              61.4             63.6
  Middle                       1,230      719          508             3              58.6             60.4
  High school                  1,347      774          567             6              57.7             60.2
  Combined                     127        83           42              2              66.4             70.1

Enrollment size
  Less than 300                349        234          107             8              68.6             73.0
  300–499                      702        426          273             3              60.9             62.3
  500–999                      1,384      831          546             7              60.3             60.2
  1,000 or more                1,118      601          516             1              53.8             54.0

Urbanicity
  City                         1,083      558          517             8              51.9             52.2
  Suburb                       1,362      781          576             5              57.6             60.7
  Town                         428        295          130             3              69.4             68.8
  Rural                        680        458          219             3              67.7             73.9

Percent White enrollment
  More than 95 percent         147        108          39              0              73.5             74.1
  More than 80 to 95 percent   801        543          255             3              68.0             71.6
  More than 50 to 80 percent   1,025      606          414             5              59.4             63.0
  50 percent or less           1,580      835          734             11             53.2             56.2

Region
  Northeast                    602        338          262             2              56.3             61.6
  Midwest                      788        501          283             4              63.9             66.4
  South                        1,346      765          575             6              57.1             61.6
  West                         817        488          322             7              60.2             62.5

1In SSOCS:2016, a minimum of 162 of the 296 subitems eligible for recontact (i.e., all subitems in the questionnaire except those associated with the introductory items) were required to be answered for the survey to be considered complete. This includes a minimum of 75 of the 92 critical subitems, 18 of the 30 subitems within item 26, and 6 of the 25 subitems within item 35.

2Nonrespondents include schools whose districts denied permission to NCES to conduct the survey and those eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

3Ineligible schools include those that had closed, had merged with another school at a new location, had changed from a regular public school to an alternative school, or were not a school ("not a school" generally refers to a school record for an organization that does not provide any classroom instruction, e.g., an office overseeing a certain type of program or offering tutoring services only).

4The unweighted response rate is calculated as the following ratio: completed cases / (total sample − known ineligibles). A worked example follows the table notes.

5The weighted response rate is calculated by applying the base sampling weights to the following ratio: completed cases / (total sample − known ineligibles).

6Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2015–16 School Survey on Crime and Safety (SSOCS:2016).
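As a worked illustration of the definition in note 4, the total unweighted response rate follows directly from the first row of the table:

\frac{2{,}092}{3{,}553 - 19} = \frac{2{,}092}{3{,}534} \approx 59.2\ \text{percent},

consistent with the value shown in the Total row.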


B2. Procedures for the Collection of Information

The data collection methods used in SSOCS:2018 include a mail survey and, unique to this collection, a web-based survey experiment, with intensive follow-up by both telephone and e-mail. The methods are described in more detail in the following sections.

Steps in the Data Collection Process

The following is a description of the main tasks in the data collection process for SSOCS. These tasks include drawing the sample; identifying special contact districts; mailing letters to school principals, district superintendents, and Chief State School Officers (CSSOs); mailing the full survey package to principals; placing reminder and follow-up calls to nonresponding schools; and refusal conversion efforts using both mailings and e-mails. All communication materials sent to potential respondents are designed for refusal aversion. See appendix A for letters to superintendents, CSSOs, and principals, as well as postcards to schools in special contact districts and reminder e-mails to respondents. (Once SSOCS:2016 results become available, example findings will be added to the communication materials provided in appendix A, and the finalized communication materials will be submitted to OMB for approval as a change request in August 2017.)

Drawing the Sample

The sample of schools will be drawn in the June preceding data collection, following the drawing of the NTPS sample. However, because many larger districts (known as "certainty" districts) are always included in the various NCES sample surveys, preliminary research and application development for these districts will begin in early spring, prior to sampling. This will ensure that these districts have the necessary information to present to their research approval boards during their scheduled annual or biannual meetings. Additional special contact district outreach will occur once the sample is drawn for any remaining sampled districts that require approval.

Identifying Special Contact Districts and the Need for Research Applications

Special contact districts require that a research application be submitted to and reviewed by the district before they will allow schools under their jurisdiction to participate in a study. Districts are identified as “special contact districts” prior to data collection because they were flagged as such during previous cycles of SASS, NTPS, or SSOCS, or by other NCES studies. Special contact districts are also identified during data collection when districts indicate that they will not complete the survey until a research application is submitted, reviewed, and approved.

Once a district is identified as a special contact district, basic information about the district is obtained from the NCES Common Core of Data (CCD), including the NCES LEA ID number, district name, city, and state. The next step is to search the district's website for a point of contact and any information available about the district's requirements for conducting external research. Some districts flagged as special contact districts in a previous cycle may no longer have such requirements, so staff will verify whether a given district has requirements for conducting external research before proceeding.

The following are examples of the types of information that will be gathered from each district's website in order to prepare a research application for submission to the district:

  • Name and contact information for the district office or department that reviews applications to conduct external research, and the name and contact information of the person in charge of that office.

  • Information about review schedules and submission deadlines.

  • Whether application fees are required, and if so, how much.

  • Whether a district sponsor is required.

  • Whether an online application is required, and if so, the link to the application if possible.

  • Information about research topics and/or agenda on which the district is focusing.

  • The web link to the main research department or office website.

  • Research guidelines, instructions, application forms, District Action Plans, Strategic Plan or Goals, if any.

Recruitment staff will contact districts by phone and e-mail to obtain key information not listed on the district's website (e.g., requirements for the research application, research application submission deadlines).

SSOCS/NTPS staff developed a generic research application that covers the information typically requested in district research applications. Staff will customize the generic application to address each district's specific requirements (e.g., how the study addresses key district goals, or the inclusion of a district study sponsor), or will submit the generic application with minimal changes to districts that have no specific application requirements.

Using the information obtained from the district's website or from phone and e-mail exchanges, a district research request packet will be prepared. Each research application will include the following documents, where applicable:

  • District research application cover letter;

  • Research application (district-specific or generic, as required by the district);

  • Study summary;

  • FAQ document;

  • Special contact district approval form;

  • Participant informed consent form (if required by the district);

  • SSOCS/NTPS Project Director’s resume;

  • Copy of questionnaires; and

  • Application fee (if required by the district).

Other information about the study may be required by the district and will be included with the application or provided upon request.

Approximately one week after the application is submitted to the district (either electronically or in hard copy, as required by the district), SSOCS/NTPS district recruitment staff will contact the district’s research office to confirm receipt of the package and to ask when the district expects to review the research application and when a decision will be made. If additional information is requested by the district (e.g., the list of sampled schools), recruitment staff will follow up on such requests and will be available to answer any questions the district may have throughout the data collection period.

To reduce burden for the special contact districts and improve operational efficiency, NCES is planning to seek research approval simultaneously for the 2017–18 NTPS and SSOCS:2018. Although NCES plans to minimize overlap in the schools sampled for NTPS and SSOCS, most of the largest districts will have schools selected for both surveys. All special contact districts with schools in both surveys will receive both research applications concurrently and will be given the option to participate in NTPS only, SSOCS only, or both NTPS and SSOCS. The research request packets for the districts in both studies will contain an additional letter introducing the studies and emphasizing that SSOCS and NTPS are working together to minimize the number of schools asked to participate in both studies.

Some districts charge a fee (approximately $50 to $200) to process research applications; these fees will be paid as necessary.

Advance Notification to Principals

Principals will be notified of the survey through an advance letter and e-mail sent a week or two before the questionnaire, following OMB clearance. The letter will include information about the study, the date of the first mailing, and a toll-free number that principals can call if they have questions. The toll-free number will be answered by Census Bureau program staff in Suitland, Maryland, who have been trained specifically for this study and in how to respond to calls from schools. Staffing levels will ensure that at least one staff person is available at all times during the stated hours of operation. Copies of the advance letters to principals, including those in special contact districts, are included in appendix A.

Mailing the Study Notification to District Superintendents and Chief State School Officers

In order to achieve the highest possible response rate, we will send the study notification mailing to superintendents and Chief State School Officers (CSSOs) at the same time as the advance notification to principals. The purpose of this mailing is to provide districts with information about the survey and to inform them about the questionnaires being mailed to sampled schools in their district. It is not designed to ask for permission; rather, it is designed as a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters to the superintendents/CSSOs are included in appendix A.

Initial and Second Mailouts

The web experiment and the incentive experiment will be included in SSOCS:2018 only. The web experiment is designed to test the efficacy of offering a web response option as the initial mode of data collection, as in the 2015–16 NTPS Schools and Principals Internet Study. A subsample of schools will be randomly selected (using the same sample design as the paper sample) for the experimental (web) treatment at the time of sampling. Additionally, a $10 prepaid gift card incentive will be included in the initial mailout to half of the sample (half of the paper treatment and half of the web treatment) to determine the effectiveness of incentivizing respondents to complete the questionnaire.

The initial mailout is scheduled for late February 2018 (February 2020 for the 2020 collection), and the second mailout is scheduled for March 2018 (March 2020 for the 2020 collection). The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—within 2 weeks of receipt of the questionnaire. Principals who have completed their questionnaire prior to the second mailout will be excluded from the second mailout.

The mailings for the paper respondents will include a paper questionnaire and a postage-paid return envelope, as well as a personalized cover letter that will provide the toll-free number at the Census Bureau, the SSOCS e-mail address, and the hours of operation.

The mailings for the schools selected for the web experiment will include a personalized letter containing the survey URL and a unique UserID for accessing the survey online. The letter will also include the Census Bureau contact information and hours of operation. A paper version of the questionnaire will be sent to respondents in the web experiment upon their request.

A clerical operation prior to data collection will obtain e-mail addresses for all sampled principals, and these addresses will be used to contact the principals throughout data collection. Around the time of the first and second mailings, principals in the web experiment will receive an e-mail invitation that includes a clickable URL for the web survey and log-in credentials. Principals of all schools, regardless of whether the school was assigned to the web experiment, will be sent reminder e-mails, as appropriate, throughout the data collection period. E-mails will be personalized and sent to individual respondents.

A copy of the cover letters to principals used in the first and second mailings and a copy of the postcard for special contact districts are included in appendix A.

Protocol for Follow-up Calls

Approximately 2 weeks after the estimated delivery of the first mailing to school principals, the Census Bureau will place calls to confirm that schools have received the mailing and to answer any questions. About a month later, the Census Bureau will call nonrespondents, reminding them to complete their questionnaires. Finally, during the last month of data collection, the Census Bureau will conduct nonresponse follow-up by phone, aimed at collecting SSOCS data over the telephone whenever possible.

Additional Mailouts

Two additional mailings to nonrespondents will follow the first and second mailings, as needed. These will be scheduled for April and May.

The third and fourth mailings for all respondents, regardless of whether they are in the web or paper treatment, will include a paper questionnaire, a postage-paid return envelope, and a personalized cover letter that will include the toll-free number at the Census Bureau, the SSOCS e-mail address, and the Census Bureau’s hours of operation. The third mailing will be the first time that respondents in the web experiment receive a paper questionnaire. Principals who have completed their questionnaire prior to these mailing(s) will be excluded from the mailouts.

Refusal Conversion for Schools That Will Not Participate

If a school expresses strong concerns about confidentiality at any time during data collection, these concerns will be directed to the Census Project Director (and possibly to NCES) for formal assurance. All mailed materials will include the project’s toll-free number.

The SSOCS:2018 refusal conversion will begin about one month after the start of data collection and continue throughout the rest of the field period. This lag between the start of data collection and the beginning of refusal conversion will allow time for the development and design of the refusal conversion training and protocol, which will be based on lessons learned during the first month of data collection. Throughout the field period, we will ensure a "cooling off" period of at least 14 calendar days before a refusing school is contacted again.

Data Retrieval of Critical Items

For the collection of "critical items," interviewer effort will be divided between follow-up with nonrespondents (seeking the completion of critical items rather than the full survey) and follow-up with respondents who skipped items deemed critical (the retrieval of missing data). For nonrespondents, in May 2018 (May 2020 for the 2020 collection), we will offer critical-item completion by fax or phone. The critical items identified by NCES for SSOCS:2018 and SSOCS:2020 are the same as in SSOCS:20162 and include incidence data as well as school attributes.

B3. Methods to Maximize Response Rates

NCES is committed to obtaining a high response rate in SSOCS:2018 and SSOCS:2020. In general, a key to achieving a high response rate is to track the response status of each sampled school, with telephone follow-up, as well as follow-up by mail and e-mail, of those schools that do not respond promptly. To help track response status, survey responses will be monitored through an automated receipt control system.

Several other steps will also be taken to maximize the response rate. For example, SSOCS:2018 will include two experiments (a web-based mode and an incentive) intended to motivate principals to respond to the survey. If the experimental groups yield higher response rates than the control groups (paper questionnaire, no incentive), the corresponding treatments will be implemented for the full sample in the 2020 data collection.

Questionnaires will be mailed by Federal Express to ensure their prompt receipt and to lend the survey greater importance in the eyes of potential respondents. Mailed paper questionnaires will be accompanied by a postage-paid return envelope and a personalized letter that provides a toll-free number respondents may call to resolve questions about the survey. The letters will also provide a means of seeking help by e-mail. If a questionnaire is returned by the U.S. Postal Service, the Census Bureau will seek to verify the correct address and remail the questionnaire. Likewise, if outgoing e-mails sent to respondents bounce back, the Census Bureau will research the correct addresses and then resend the e-mails.

All completed questionnaires that are received by the Census Bureau will be reviewed for consistency and completeness. If a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data for key items), telephone interviewers will seek to obtain more complete responses. Telephone interviews will be conducted only by Census Bureau interviewers who have received training in general telephone interview techniques as well as specific training for SSOCS. After data retrieval is completed, a questionnaire must have approximately 60 percent of all items and approximately 80 percent of all critical items completed to be considered valid for inclusion in the dataset. Responses of “don’t know” (which only apply to item 17) will not be considered as valid responses when counting the number of items completed.
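The completeness rule just described can be expressed as a simple check. The sketch below uses the approximate thresholds stated above; the exact operational cutoffs (such as the subitem counts in note 1 to table 3) are set by NCES, and the function name is hypothetical.

    def counts_as_complete(n_items_answered, n_items_total,
                           n_critical_answered, n_critical_total):
        """Approximate completeness rule: roughly 60 percent of all
        items and roughly 80 percent of critical items must be answered.
        "Don't know" responses (applicable only to item 17) should be
        excluded from the answered counts before this check.
        """
        return (n_items_answered / n_items_total >= 0.60 and
                n_critical_answered / n_critical_total >= 0.80)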

Endorsements

To further increase the perceived legitimacy of the survey and thus improve the response rate, the following organizations have endorsed the SSOCS collection:

  • American Association of School Administrators

  • American Federation of Teachers

  • American School Counselors Association

  • Association of American Educators

  • Association for Middle Level Educators

  • Council of Chief State School Officers

  • Education Northwest

  • National Association of Elementary School Principals

  • National Association of School Psychologists

  • National Association of School Resource Officers

  • National Association of Secondary School Principals

  • National Association of State Boards of Education

  • National PTA

  • National School Safety Center

  • School Safety Advocacy Council

  • School Social Work Association of America

  • UCLA Center for Mental Health in the Schools

B4. Tests of Procedures

Experiments

SSOCS:2018 will include two data collection experiments: the first tests the inclusion of a web instrument as a mode of completing the survey, and the second investigates the inclusion of a prepaid gift card as an incentive for completing the survey.

Experiment #1: Web Instrument

Among a total sample of 4,800 schools, approximately 1,150 schools will be selected at random to be included in the treatment group. Whereas the control group will receive a package including an informative letter about the survey, a brochure, and the paper questionnaire, the treatment group will receive a letter with log-in information, as well as a brochure. Of the four potential mailings to schools, the treatment group will receive only the web option in the first two mailings and will receive the paper questionnaire in the third and fourth mailings. The minimum difference in response rates between the control and treatment groups that is detectable as statistically significant has been calculated and is discussed below, using the following formula:

\delta = z \sqrt{\mathrm{DEFF}\left(\frac{p_t(1-p_t)}{n_t} + \frac{p_c(1-p_c)}{n_c}\right)}

where

  • z = 1.96 for a 95 percent confidence level per NCES standards

  • p_t and p_c are the response rates for the treatment and control groups

  • n_t and n_c are the sample sizes for the treatment and control groups

  • DEFF = 1.6, the design effect observed in recent administrations of SSOCS

To be conservative, the standard error was maximized by setting p_t and p_c equal to 0.5 only in the standard error component of the equation above. The actual experiment will gain additional power as the response rates for each group deviate from 50 percent. With 1,150 schools receiving the web option and 3,650 receiving the mail questionnaire, a significant difference will be detectable if the response rates between the two groups differ by at least 4.2 percentage points.

Experiment #2: Incentive

For this experiment, half of the sample (approximately 2,400 schools) will receive a $10 prepaid gift card in the initial mailing. This is considered the treatment group. For respondents in the treatment group receiving the paper version of the questionnaire, this gift card will be included with the informative letter, the brochure, and the survey. For respondents in the treatment group receiving the web option of the questionnaire, this gift card will be included with the informative letter and the brochure. The gift card will be included only in the initial mailing and not in other subsequent mailings. Control group respondents will not receive any monetary incentive throughout the data collection procedures. Using the same statistic as above, the differences in response rate between the control and treatment group necessary to detect statistically significant differences have been calculated. With 2,400 cases receiving the incentive and 2,400 not receiving the incentive, a significant difference will be detectable if the response rates between the two groups differ by at least 3.6 percentage points.
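Both detectable-difference figures can be reproduced from the formula above. The following Python sketch (a hypothetical helper, not part of the SSOCS documentation) maximizes the standard error at p = 0.5 and applies the design effect of 1.6.

    import math

    def min_detectable_difference(n_treatment, n_control,
                                  deff=1.6, z=1.96, p=0.5):
        """Minimum detectable difference in response rates between two
        groups, with the standard error maximized at p = 0.5 and the
        variance inflated by the design effect."""
        se = math.sqrt(deff * (p * (1 - p) / n_treatment
                               + p * (1 - p) / n_control))
        return z * se

    # Web experiment: 1,150 web cases vs. 3,650 mail cases.
    print(round(100 * min_detectable_difference(1150, 3650), 1))  # 4.2
    # Incentive experiment: 2,400 treatment vs. 2,400 control cases.
    print(round(100 * min_detectable_difference(2400, 2400), 1))  # 3.6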

Cognitive Testing and Focus Groups

As part of the development of SSOCS:2018, cognitive testing was conducted with 19 administrators during the winter of 2016 (OMB# 1850-0803 v.171). The cognitive testing concentrated on new items, items that had undergone substantial revisions, and items that have proven to be problematic (e.g., items where interpretability issues were identified in previous iterations of testing). Based on the results of the cognitive testing, NCES is confident in the validity of the finalized items in the questionnaire. The cognitive interviews summary of results is provided in Part C of this submission.

In addition to the cognitive testing of the items, three online focus groups were conducted involving a total of 22 elementary, middle, and high school principals in order to gain a better understanding of the barriers and benefits schools tend to associate with participation in federal surveys, and to identify communication strategies that will help overcome those barriers to participation. Specifically, participants reviewed and provided feedback on the SSOCS brochure, advance letter to principals, and the 2016 questionnaire. Based on the results of these focus groups, NCES is considering an update to the overall look and feel of the brochure. Appendix A provides draft text for the brochure. The focus group summary of results is provided in Part C of this submission.

B5. Individuals Responsible for Study Design and Performance

Several key staff are responsible for the study design and performance. They are

  • Rachel Hansen, Project Director, National Center for Education Statistics

  • Jana Kemp, American Institutes for Research

  • Melissa Diliberti, American Institutes for Research

  • Steven Hummel, American Institutes for Research

  • Samantha Neiman, American Institutes for Research

  • Michael Jackson, American Institutes for Research

  • Shawna Cox, U.S. Census Bureau

  • Adam Rettig, U.S. Census Bureau

1 The U.S. outlying areas are American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the U.S. Virgin Islands.

2 The critical items in SSOCS:2018 are items 11, 18, 19, 28, 29, 30, 31, 35, 36, 38, 39, 40, 41, 42, 46, 47, and 48 (see appendix B).
