2022 School Survey on Crime and Safety (SSOCS:2022)



OMB# 1850-0761 v.21




Supporting Statement Part B





National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education





January 2021






Contents

Section B. Methodology
B1. Respondent Universe and Sample Design and Estimation
B2. Procedures for the Collection of Information
B3. Methods to Maximize Response Rates
B4. Tests of Procedures
B5. Individuals Responsible for Study Design and Performance
Section B. Methodology



Compared to the SSOCS:2020 questionnaire, the SSOCS:2022 questionnaire adds only a few new items concerning the COVID-19 pandemic. SSOCS:2022 will be collected primarily by a web-based survey, with paper questionnaires offered in follow-up mailings rather than at the onset of collection. In addition, the testing of monetary incentives will be expanded in the 2022 collection. Both of these changes were planned for the SSOCS:2020 data collection; however, the collection strategy was altered during data collection due to the emergent situation related to the COVID-19 pandemic, which resulted in the closure of many school buildings across the nation along with the closure of the Census Bureau National Processing Center (NPC).

B1. Respondent Universe and Sample Design and Estimation

The SSOCS sampling frame is constructed from the Common Core of Data (CCD) public school universe file, but it excludes schools in the U.S. outlying areas and Puerto Rico, overseas Department of Defense schools, schools run by the Bureau of Indian Education, schools specializing in special education or alternative education, vocational schools, virtual schools, ungraded schools, newly closed schools, home schools, and schools that have no students above the grade of kindergarten. Regular public schools, charter schools, and schools that have partial or total magnet programs with students in any of grades prekindergarten through 12 are included in the sampling frame.

The SSOCS:2022 frame will be constructed from the Public Elementary/Secondary School Universe data file of the 2017–18 CCD with new schools being added and variables being updated with 2019–20 CCD data when it becomes available (scheduled to be released in April/May of 2021). The size of the SSOCS:2022 population is estimated to be approximately 83,900 schools.

Tables 1 and 2 show the expected distribution of the public school sampling universe for SSOCS:2022, by school level and urbanicity and by school level and enrollment size, respectively. The numbers in both tables are estimates based on the 2017–18 CCD universe file.

Table 1. Expected respondent universe for the SSOCS:2022 public school sample, by school level and urbanicity, based on the 2017–18 CCD

Urbanicity | Elementary | Middle | High/Grade 9-11 | Combined/Other | Total
City | 14,925 | 3,694 | 4,098 | 429 | 23,146
Suburb | 17,478 | 5,366 | 4,359 | 286 | 27,489
Town | 5,623 | 2,310 | 2,464 | 200 | 10,597
Rural | 11,563 | 3,389 | 5,239 | 2,429 | 22,620
Total | 49,589 | 14,759 | 16,160 | 3,344 | 83,852


Table 2. Expected respondent universe for the SSOCS:2022 public school sample, by school level and enrollment size, based on the 2017–18 CCD

Enrollment size | Elementary | Middle | High/Grade 9-11 | Combined/Other | Total
Less than 300 | 10,461 | 2,392 | 3,619 | 1,458 | 17,930
300–499 | 18,643 | 3,405 | 2,976 | 699 | 25,723
500–999 | 19,459 | 7,038 | 3,941 | 820 | 31,258
1,000+ | 1,026 | 1,924 | 5,624 | 367 | 8,941
Total | 49,589 | 14,759 | 16,160 | 3,344 | 83,852



Sample Selection and Response Rates

A stratified sample design will be used to select approximately 4,800 public schools for SSOCS:2022 in order to obtain the 2,550 completed interviews needed to ensure precision in the estimates. For sample allocation purposes, strata will be defined by instructional level, locale, and enrollment size. Minority enrollment, region, and state will be used as sorting variables in the sample selection process to induce implicit stratification.
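
As an illustration of how sorting induces implicit stratification, the following is a minimal sketch in Python of systematic selection within one explicit stratum. The School fields and the toy data are illustrative, not the actual SSOCS frame variables:

```python
import random
from dataclasses import dataclass

@dataclass
class School:
    pct_minority: float
    region: str
    state: str

def systematic_sample(frame, n, rng=random):
    """Equal-probability systematic selection with a random start."""
    k = len(frame) / n            # sampling interval
    start = rng.uniform(0, k)     # random start in [0, k)
    return [frame[int(start + i * k)] for i in range(n)]

# Sorting by the implicit stratification variables before selection
# spreads the sample across their values within the explicit stratum.
stratum = [School(random.random(), region, state)
           for region in ("NE", "MW", "S", "W")
           for state in ("A", "B")
           for _ in range(50)]
stratum.sort(key=lambda s: (s.pct_minority, s.region, s.state))
sample = systematic_sample(stratum, 20)
print(len(sample))  # 20
```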

SSOCS:2018 yielded an unweighted response rate of approximately 58 percent. When the responding schools were weighted to account for their original sampling probabilities, the response rate increased to approximately 62 percent. SSOCS:2020 yielded an unweighted response rate of approximately 51 percent. However, given the strain that the COVID-19 pandemic put on the SSOCS:2020 data collection, particularly the closure of many schools across the nation and the closure of the Census Bureau's NPC, SSOCS:2018 is likely the more accurate predictor of SSOCS:2022 response. Given the inclusion of a planned incentive experiment aimed at increasing overall response, we anticipate at least maintaining the SSOCS:2018 response rates in SSOCS:2022, which would yield more completed surveys than needed to meet the study's objectives.

Sample Design for SSOCS:2018 and SSOCS:2020

A stratified sample design was used to select schools for SSOCS:2018 and SSOCS:2020.

The two main objectives of the SSOCS:2018 and SSOCS:2020 sampling design were: (1) to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety; and (2) to maintain precise estimates of change in various characteristics relating to crime between the earliest and most recent SSOCS administrations. Adopting the same general design increases the precision of the estimates of change. For sample allocation and sample selection purposes, strata were defined in prior administrations of SSOCS by crossing instructional level, locale, and enrollment size. In addition, percent minority, region, and state were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The three explicit and three implicit stratification variables have been shown to be related to school crime and thus create meaningful strata for this survey. Within each of four instructional level categories, the sample was allocated to each of 16 subgroups formed by the cross-classification of locale (four levels) and enrollment size (four levels) in proportion to an aggregate measure of size derived for each subgroup. The aggregate measure of size for a specific locale-by-enrollment cell within an instructional level is equal to the sum of the square roots of the enrollments of the schools in that cell.

The initial goal of SSOCS:2018 and SSOCS:2020 was to collect data from at least 2,550 schools, taking nonresponse into account. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the U.S. public school population. However, while the majority of U.S. public schools are elementary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would, therefore, have yielded an inefficient sample design because the sample composition would have included more elementary schools (where crime is an infrequent event) than middle or high schools (where crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools.

The formula used to calculate the measure of size is given as

$MOS(h) = \sum_{i=1}^{N_h} \sqrt{E_{hi}}$

where $E_{hi}$ is the enrollment of the $i$th school in stratum $h$ and $N_h$ is the total number of schools in stratum $h$.

The measure of size for an instructional level, MOS(l), is found by summing the 16 measure-of-size values, MOS(h), that make up that level. The proportion of the level's sample allocated to a given stratum is then the ratio of the stratum's measure of size, MOS(h), to the level's total measure of size, MOS(l), as sketched below.
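
A minimal sketch of this allocation in Python, assuming a list of (instructional level, stratum, enrollment) records; all names and figures are illustrative:

```python
import math
from collections import defaultdict

def allocate_sample(schools, sample_per_level):
    """Allocate each instructional level's sample across its strata in
    proportion to MOS(h) = sum over schools of sqrt(enrollment)."""
    mos = defaultdict(float)            # MOS(h) per (level, stratum)
    for level, stratum, enrollment in schools:
        mos[(level, stratum)] += math.sqrt(enrollment)

    mos_level = defaultdict(float)      # MOS(l) per instructional level
    for (level, _), value in mos.items():
        mos_level[level] += value

    # Stratum allocation = n_l * MOS(h) / MOS(l).
    return {key: sample_per_level[key[0]] * value / mos_level[key[0]]
            for key, value in mos.items()}

# Toy example: two strata within one instructional level.
schools = [("middle", "city/300-499", 400), ("middle", "city/300-499", 900),
           ("middle", "rural/<300", 250)]
print(allocate_sample(schools, {"middle": 100}))  # ~76 and ~24 schools
```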

The general sampling design for SSOCS:2020 remained the same as in prior collections; however, an incentive experiment was included in which approximately 2,340 schools were assigned to the "early incentive" treatment, 1,230 schools were assigned to the "delayed incentive" treatment, and 1,230 schools were assigned to the "no incentive" (control) treatment. These experimental groups were selected in a manner that reflects the overall sampling design, making it possible to use responses from each group when calculating estimates. Lastly, a split-panel experiment was conducted during SSOCS:2020 to test a navigation menu within the web instrument. Approximately half of the entire sample was randomly selected to receive a different version of the web instrument that included a navigation menu.

Sample Design for SSOCS:2022

While the general sampling design for SSOCS:2022 remains the same as in prior collections, there is one notable difference from SSOCS:2020. The split-panel experiment within the web instrument, designed to test a navigation menu, will not be conducted for SSOCS:2022; therefore, there is no need to randomly select any proportion of the sample to receive a different version of the web instrument. Note that, because the COVID-19 pandemic altered the data collection plan during production, the incentive experiment described above for SSOCS:2020 will instead be conducted in SSOCS:2022, so the treatment assignment described above applies to the SSOCS:2022 sample design.

SSOCS:2022 will take advantage of the lessons learned from SSOCS:2018, rather than SSOCS:2020, due to the unique challenge the pandemic presented to the SSOCS:2020 cycle. The response rates achieved for the various strata and substrata in SSOCS:2018 have been examined in order to determine the proper size of the initial sample selected for 2022 and to ensure a sufficient number of completed cases for analysis. Table 3 displays the SSOCS:2018 response rates by school level, enrollment size, urbanicity, percent White enrollment, and region.

Calculation of Weights

Weights will be attached to each surveyed school so that the weighted data represent population levels. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. As with SSOCS:2018 and SSOCS:2020, nonresponse weighting adjustment cells for the SSOCS:2022 data will be determined using a categorical search algorithm called Chi-Square Automatic Interaction Detection (CHAID). CHAID begins by identifying the school-level characteristics of interest that are the best predictors of response. It divides the dataset into groups so that the unit response rate within cells is as constant as possible and the unit response rate between cells is as different as possible. The characteristics used as predictors of response must be available for both respondents and nonrespondents in order to conduct a CHAID analysis; in the case of SSOCS, they will be available through the CCD sampling frame. Weighting adjustment cells for SSOCS:2022 will be determined based on bias analysis results from SSOCS:2018 and SSOCS:2020 in order to create the adjustment for nonresponse. The final, adjusted weights will be raked so that the sum of the weights matches the number of schools derived from the latest CCD public school universe file. A sketch of the cell-level adjustment appears below.
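
The CHAID cell formation itself is beyond the scope of a short example, but the subsequent cell-level adjustment is straightforward. Below is a minimal sketch in Python, assuming the adjustment cells have already been formed; all field names and values are illustrative:

```python
from collections import defaultdict

def nonresponse_adjust(schools):
    """Within each (pre-formed) adjustment cell, inflate respondents' base
    weights by the inverse of the cell's weighted response rate so that
    respondents carry the cell's full weighted total."""
    total = defaultdict(float)   # sum of base weights per cell
    resp = defaultdict(float)    # sum of responding base weights per cell
    for s in schools:
        total[s["cell"]] += s["base_weight"]
        if s["responded"]:
            resp[s["cell"]] += s["base_weight"]
    for s in schools:
        factor = total[s["cell"]] / resp[s["cell"]]
        s["nr_weight"] = s["base_weight"] * factor if s["responded"] else 0.0
    return schools

schools = nonresponse_adjust([
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": False},
    {"cell": "B", "base_weight": 20.0, "responded": True},
    {"cell": "B", "base_weight": 20.0, "responded": False},
])
# Cell A respondents now carry 15.0 each (30.0 total); cell B's carries 40.0.
# A final raking step (not shown) would scale these weights to CCD frame counts.
```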

Methods for Variance Estimation

Standard errors of the estimates will be estimated using jackknife repeated replication (JRR). Replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs will be provided, as will the weights for all replicates that were formed in order to calculate variances.
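
As an illustration of how the replicate estimates are used, here is a minimal sketch, assuming the replicate estimates have already been computed with the provided replicate weights. The exact estimator depends on how the replicates are formed; this shows the common paired half-sample (JK2) form, in which the variance is the sum of squared deviations of the replicate estimates from the full-sample estimate:

```python
import numpy as np

def jrr_variance(full_estimate, replicate_estimates):
    """JK2-style jackknife variance: sum of squared deviations of the
    replicate estimates from the full-sample estimate."""
    reps = np.asarray(replicate_estimates, dtype=float)
    return float(np.sum((reps - full_estimate) ** 2))

# Hypothetical replicate estimates of a weighted total.
theta_full = 83852.0
theta_reps = [83650.0, 84010.0, 83700.0, 84100.0]
se = jrr_variance(theta_full, theta_reps) ** 0.5
print(f"standard error: {se:.1f}")
```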


Table 3. Unweighted and weighted SSOCS unit response rates, by selected school characteristics: School year 2017–18

School characteristic | Initial sample | Completed survey¹ | Nonrespondents² | Ineligible³ | Unweighted response rate (percent)⁴ | Weighted response rate (percent)⁵
Total | 4,803 | 2,762 | 1,975 | 66 | 58.3 | 61.7
Level⁶
  Primary | 1,170 | 671 | 477 | 22 | 58.4 | 60.8
  Middle | 1,704 | 975 | 703 | 26 | 58.1 | 60.7
  High school | 1,748 | 997 | 740 | 11 | 57.4 | 61.4
  Combined | 181 | 119 | 55 | 7 | 68.4 | 71.5
Enrollment size
  Less than 300 | 456 | 286 | 135 | 35 | 67.9 | 68.4
  300–499 | 955 | 605 | 334 | 16 | 64.4 | 65.8
  500–999 | 1,860 | 1,042 | 806 | 12 | 56.4 | 56.8
  1,000 or more | 1,532 | 829 | 700 | 3 | 54.2 | 55.1
Urbanicity
  City | 1,528 | 723 | 769 | 36 | 48.5 | 49.3
  Suburb | 1,837 | 1,034 | 793 | 10 | 56.6 | 58.2
  Town | 563 | 382 | 168 | 13 | 69.5 | 68.2
  Rural | 875 | 623 | 245 | 7 | 71.8 | 55.0
Percent White enrollment
  More than 95 percent | 170 | 128 | 39 | 3 | 76.6 | 79.2
  More than 80 to 95 percent | 1,014 | 675 | 330 | 9 | 67.2 | 68.3
  More than 50 to 80 percent | 1,390 | 848 | 536 | 6 | 61.3 | 62.8
  50 percent or less | 2,229 | 1,111 | 1,070 | 48 | 50.9 | 55.0
Region
  Northeast | 819 | 459 | 347 | 13 | 56.9 | 61.3
  Midwest | 1,029 | 636 | 377 | 16 | 62.8 | 64.3
  South | 1,845 | 1,042 | 782 | 21 | 57.1 | 61.0
  West | 1,110 | 625 | 469 | 16 | 57.1 | 60.4

¹ In SSOCS:2018, a minimum of 60 percent (157 subitems) of the 261 subitems eligible for recontact (i.e., all subitems in the questionnaire except the non-survey items that collect information about the respondent) were required to be answered for the survey to be considered complete. These answered subitems were required to include a minimum of 80 percent of the 76 critical subitems (61 of 76 total), 60 percent of the item 30 subitems (18 of 30 total), and 60 percent of the item 38 column 1 subitems (3 of 5 total). The critical items were 11, 18, 19, 20, 22, 28, 29, 30, 31, 35, 36, 38 (column 1), 39, 40, 41, 42, 46, 47, and 48. Questionnaires that did not meet the established completion criteria were considered incomplete and are excluded from the SSOCS:2018 data file.

² Nonrespondents include schools whose districts denied permission to NCES to conduct the survey and those eligible schools that either did not respond or responded but did not answer the minimum number of items required for the survey to be considered complete.

³ Ineligible schools include those that had closed, merged with another school at a new location, changed from a regular public school to an alternative school, or were not a school ("not a school" generally refers to a school record for an organization that does not provide any classroom instruction, e.g., an office overseeing a certain type of program or offering only tutoring services).

⁴ The unweighted response rate is calculated as the following ratio: completed cases / (total sample − known ineligibles).

⁵ The weighted response rate is calculated by applying the base sampling weights to the following ratio: completed cases / (total sample − known ineligibles).

⁶ Primary schools are defined as schools in which the lowest grade is not higher than grade 3 and the highest grade is not higher than grade 8. Middle schools are defined as schools in which the lowest grade is not lower than grade 4 and the highest grade is not higher than grade 9. High schools are defined as schools in which the lowest grade is not lower than grade 9 and the highest grade is not higher than grade 12. Combined schools include all other combinations of grades, including K–12 schools. (Note that the school level categories were changed for the 2019–20 cycle.)

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2017–18 School Survey on Crime and Safety (SSOCS:2018).
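
As a quick consistency check on the formulas in footnotes 4 and 5, the Total row of table 3 can be reproduced from its own counts (the weighted rate additionally requires the base sampling weights, which are not published in the table):

```python
completed, total_sample, known_ineligible = 2_762, 4_803, 66
unweighted_rate = completed / (total_sample - known_ineligible)
print(f"{100 * unweighted_rate:.1f} percent")  # 58.3, matching the Total row
```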


B2. Procedures for the Collection of Information

The data collection methods used in SSOCS:2022 include a web-based survey and a mail survey, with intensive follow-up by both phone and e-mail. The methods are described in more detail in the following sections.

Steps in the Data Collection Process

The following is a description of the main tasks in the data collection process for SSOCS. These tasks include drawing the sample; identifying special contact districts; mailing advance letters to school district superintendents and Chief State School Officers (CSSOs); mailing full packages to principals; placing reminder and follow-up calls to nonresponding schools; and refusal conversion efforts using both mailings and e-mails. All communication materials to potential respondents are designed for refusal aversion. See appendix A for letters to superintendents, CSSOs, and principals, as well as postcards to schools in special contact districts and reminder e-mails to respondents.

Drawing the Sample

The sample of schools will be drawn in the summer preceding data collection (Summer 2021) once the SSOCS frame creation is complete. However, since many larger districts (known as “certainty” districts) are always included in the various NCES sample surveys, the preliminary research and application development for these districts will begin in Spring 2021, prior to sampling. This will ensure that these districts have the necessary information to present to their research approval board during their scheduled annual or bi-annual meetings. Additional special contact district outreach will occur once the sample is drawn for any remaining sampled districts that require approval.

Identifying Special Contact Districts and the Need for Research Applications

Special contact districts require that a research application be submitted to and reviewed by the district before they will allow schools under their jurisdiction to participate in a study. Districts are identified as “special contact districts” prior to data collection because they were flagged as such during previous cycles of SASS, NTPS, or SSOCS, or by other NCES studies. Special contact districts are also identified during data collection when districts indicate that they will not complete the survey until a research application is submitted, reviewed, and approved.

Once a district is identified as a special contact district, basic information about the district is obtained from the NCES Common Core of Data (CCD). The basic information includes the NCES LEA ID number, district name, city, and state. The next step is to search the district's website for a point of contact and any information available about the district's requirements for conducting external research. Some districts may have been incorrectly flagged as special contact districts in a previous cycle, so staff will verify whether a given district has requirements for conducting external research before proceeding.

The following are examples of the type of information that will be gathered from each district’s website in order to prepare a research application for submission to this district:

  • Name and contact information for the district office or department that reviews applications to conduct external research, and the name and contact information of the person in charge of that office.

  • Information about review schedules and submission deadlines.

  • Whether application fees are required, and if so, how much.

  • Whether a district sponsor is required.

  • Whether an online application is required, and if so, the link to the application if possible.

  • Whether in-person approval is required, and if so, information about the in-person approval process.

  • Information about research topics and/or agenda on which the district is focusing.

  • The web link to the main research department or office website.

  • Research guidelines, instructions, application forms, District Action Plans, Strategic Plan or Goals, if any.

Recruitment staff will contact districts by phone and e-mail to obtain key information not listed on the district's website (e.g., requirements for the research application, research application submission deadlines).

SSOCS special district recruitment staff developed a generic research application (see appendix A) that covers the information typically requested in district research applications. Staff will customize the generic research application to address each district's specific requirements (e.g., how the study addresses key district goals, or the inclusion of a district study sponsor), or will submit the generic application with minimal changes to districts that do not have specific application requirements.

Using the information obtained from the district's website or from phone and e-mail exchanges, a district research request packet will be prepared. Each research application will include the following documents, where applicable:

  • District research application cover letter;

  • Research application (district-specific or generic, as required by the district);

  • Study summary;

  • Frequently Asked Questions (FAQ) document;

  • Special contact district approval form;

  • Participant informed consent form (if required by the district);

  • NCES Project Director’s resume;

  • Copy of questionnaires; and

  • Application fee (if required by the district).

Other information about the study may be required by the district and will be included with the application or provided upon request.

Approximately one week after the application is submitted to the district (either electronically or in hard copy, as required by the district), SSOCS special district recruitment staff will contact the district’s research office to confirm receipt of the package and to ask when the district expects to review the research application and when a decision will be made. If additional information is requested by the district (e.g., the list of sampled schools), recruitment staff will follow up on such requests and will be available to answer any questions the district may have throughout the data collection period.

Some districts charge a fee (approximately $50 to $200) to process research application requests; these fees will be paid as necessary.

Mailing the Study Notification to District Superintendents and Chief State School Officers

In order to achieve the highest possible response rate, we will send the study notification mailing to superintendents and Chief State School Officers (CSSOs) before SSOCS:2022 data collection begins with the sampled schools. The purpose of this mailing is to provide districts with information about the survey and to inform them about the questionnaires being mailed to sampled schools in their district. It is not designed to ask for permission; rather, it is designed as a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters and materials sent to the superintendents/CSSOs are included in appendix A.

Mailouts

SSOCS:2022 will be conducted primarily via the web-based survey instrument. A clerical operation prior to data collection will obtain e-mail addresses for all of the sampled principals, and these addresses will be used to contact the principals throughout data collection. Both mail and e-mail will be used to distribute instructions on how to complete the web questionnaire, with a paper questionnaire introduced in follow-up mailings. Sampled principals will receive as many as four mailings, as needed, throughout the collection period, and principals who have completed their questionnaire prior to subsequent mailings will be excluded from those mailouts.

SSOCS:2022 will build on the SSOCS:2018 incentive experiment and will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive.

Note that this experiment was originally planned for SSOCS:2020, but changes were made to the collection strategy during data collection due to the emergent situation related to the COVID-19 pandemic, which resulted in the closure of many school buildings across the nation along with the closure of the Census Bureau NPC. As such, the incentive experiment planned for the SSOCS:2020 will instead be conducted as part of SSOCS:2022.

The initial mailout is scheduled for mid-February 2022, and the second mailout is scheduled for March 2022. The principal will be asked to complete the questionnaire—or to have it completed by the person at the school who is the most knowledgeable about school crime and safety—within 2 weeks of receipt. Both mailings will include a personalized letter containing the survey URL and a unique UserID to access the survey online. The letter will also include Census Bureau contact information and answers to FAQs. In addition, the mailing will include a one-page endorsement insert, which will display the names and logos of all SSOCS endorsing agencies. Finally, schools in the “early incentive” treatment will receive $10 cash adhered to a brightly colored incentive insert in their initial mailout package.

The third and fourth mailings (in March and April, respectively) will include a paper questionnaire, a postage-paid return envelope, and a personalized cover letter that will include the toll-free number at the Census Bureau and the SSOCS e-mail address. The third mailing will be the first time that respondents receive a paper questionnaire. Schools in the “delayed incentive” treatment group will also receive their $10 cash incentive adhered to a brightly colored incentive insert in the third package mailing.

Principals will receive an e-mail invitation that includes a clickable URL to the web survey and log-in credentials around the time of the first and second mailings. E-mails will be personalized and sent to individual respondents. Principals will be sent reminder e-mails, as appropriate, throughout the data collection period.

A copy of the cover letters and e-mails sent to principals throughout SSOCS:2022 data collection is included in appendix A.

Protocol for Follow-up Calls

Approximately 3 weeks after the second mailing to school principals, Census will initiate phone calls with nonrespondents, reminding them to complete their questionnaire.

Finally, during the last two months of the SSOCS:2022 data collection, Census will conduct nonresponse follow-up by phone. This operation is aimed at collecting SSOCS data over the phone, whenever possible.

Refusal Conversion for Schools That Will Not Participate

If a school expresses strong concerns about confidentiality at any time during data collection, these concerns will be directed to the Census Project Director (and possibly to NCES) for formal assurance. All mailed materials will include the project’s toll-free number. In addition, FAQs will be included on the back of the initial mailout letters and will include information about why the participation of each sampled school is important and how respondent information will be protected.

Data Retrieval of Critical Items

For the collection of "critical items," interviewer labor will be divided between follow-up with nonrespondents (seeking the completion of critical items rather than the full survey) and follow-up with respondents who have skipped items deemed critical (the retrieval of missing data). For nonrespondents, in May 2022, we will offer "critical item" completion by fax or phone. The critical items identified by NCES for SSOCS:2022 include incidence data as well as data on school characteristics, consistent with SSOCS:2018 and SSOCS:2020. The SSOCS:2022 critical items are analogous to the SSOCS:2020 items, with item numbers updated to match the revised SSOCS:2022 questionnaire: 15, 21, 22, 23, 25, 31, 32, 36, 37, 39 (column 1), 40, 41, 44, 45, 46, 48, and 49.

B3. Methods to Maximize Response Rates

NCES is committed to obtaining a high response rate in SSOCS:2022. In general, a key to achieving a high response rate is to track the response status of each sampled school, with telephone follow-up, as well as follow-up by mail and e-mail, of those schools that do not respond promptly. To help track response status, survey responses will be monitored through an automated receipt control system.

The decision to move to a primarily web-based instrument starting with SSOCS:2020 was based on the results of the two SSOCS:2018 experiments, which tested a web-based response mode and an incentive to motivate principals to respond to the survey. Analyses of these experiments resulted in the recommendation to include an incentive and to allow web-based responses as part of a mixed-mode methodology in future SSOCS administrations. Overall, offering an incentive was advantageous for SSOCS:2018, as it increased response rates and promoted significantly faster response times. SSOCS:2022 will build on the SSOCS:2018 incentive experiment but will include two incentive treatment groups (see section B4 of this document for details).

SSOCS:2020 included a split-panel experiment to test a navigation menu within the web instrument. The goal of adding the navigation menu was to improve the instrument's usability and reduce respondent burden. While the results of the experiment have not yet been fully analyzed, preliminary analyses indicate that the navigation menu did not negatively affect survey response. SSOCS:2022 will include the navigation menu for all sampled schools.

SSOCS:2022 will mail questionnaire packages to nonrespondents’ school addresses using Federal Express during the fourth and final mailing in order to make the questionnaire package stand out to nonrespondents.

All mailed SSOCS paper questionnaires will be accompanied by a postage-paid return envelope and a personalized letter that includes a toll-free number respondents may call to resolve questions about the survey. The letters will also provide a means for seeking help by e-mail. If a questionnaire is returned by the U.S. Postal Service, the Census Bureau will seek to verify the correct address and remail the questionnaire. Likewise, if outgoing e-mails sent to respondents bounce back, the Census Bureau will perform research to obtain the correct addresses and then resend the e-mails.

All completed questionnaires (both paper and web) that are received by the Census Bureau will be reviewed for consistency and completeness. If a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data for key items), telephone interviewers will seek to obtain more complete responses. Telephone interviews will be conducted only by Census Bureau interviewers who have received training in general telephone interview techniques as well as specific training for SSOCS. After data retrieval is completed, a questionnaire must have approximately 60 percent of all items and approximately 76 percent of all critical items completed to be considered valid for inclusion in the dataset (a sketch of this check appears below). Responses of "don't know" (which apply only to items 5 and 20 in SSOCS:2022) will not be counted as valid responses when tallying the number of items completed.
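
A minimal sketch of the completeness check described above, assuming simple counts of answered items; the function and counts are illustrative, and footnote 1 to table 3 describes the somewhat more detailed SSOCS:2018 criteria:

```python
def is_complete(answered_all, total_all, answered_critical, total_critical,
                all_threshold=0.60, critical_threshold=0.76):
    """Return True if a questionnaire meets both completion thresholds."""
    return (answered_all / total_all >= all_threshold
            and answered_critical / total_critical >= critical_threshold)

# Hypothetical counts: 17 critical items are listed in section B2.
print(is_complete(answered_all=160, total_all=261,
                  answered_critical=13, total_critical=17))  # True
```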

Endorsements

To further increase the perceived legitimacy of the survey and thus improve the response rate, Census will solicit endorsement from the following agencies and organizations:

  • American Association of School Administrators

  • American Federation of Teachers

  • American School Counselors Association

  • Association for Middle Level Education

  • Association of American Educators

  • Center for Prevention of School Violence

  • Center for School Mental Health

  • Council of Chief State School Officers

  • Education Northwest

  • National Association of Elementary School Principals

  • National Association of School Psychologists

  • National Association of School Resource Officers

  • National Association of School Safety and Law Enforcement Officers

  • National Association of Secondary School Principals

  • National Association of State Boards of Education

  • National PTA

  • National School Boards Association

  • National School Safety Center

  • Police Executive Research Forum

  • Safe Schools Initiative Division

  • School Safety Advocacy Council

  • School Social Work Association of America

  • UCLA Center for Mental Health in the Schools

  • University of Arkansas Criminal Justice Institute


B4. Tests of Procedures

Experiments

SSOCS:2022 will include two incentive treatment groups. Schools in the “early incentive” treatment group will receive a $10 cash incentive at the first contact by mail, as was done for the SSOCS:2018 incentive treatment group. Schools in the “delayed incentive” treatment group will not receive an incentive in the first two mail contacts but will receive a $10 cash incentive during the third mail contact. Both treatment groups will be evaluated against the control group, which will not receive any incentive throughout data collection.

Among a total sample of 4,800 schools, approximately 2,340 schools will be selected at random to be included in the “early incentive” treatment group and approximately 1,230 schools will be selected at random to be included in the “delayed incentive” treatment group. The remaining 1,230 schools will be in the control group.

The goal of this experiment is to further refine the SSOCS incentive strategy by comparing response rates, indicators of nonresponse bias, and data collection costs for the early and delayed incentive strategies, relative to a no-incentive control.

The smallest subsample size needed to detect a 5 percentage point difference between treatment groups was calculated to be 1,230 schools, which is the sample allocated to each of the delayed treatment group and the control group. The experiment will gain additional power as the response rates for each group deviate from 50 percent. With 1,230 schools receiving the delayed incentive and 1,230 schools receiving no incentive, a significant difference from the "early incentive" treatment will be detectable if the response rates between the groups differ by at least 4.4 percentage points.
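
These thresholds can be approximately reproduced with a standard two-proportion minimum-detectable-difference calculation. The document does not state its power assumptions; assuming a one-sided test at α = .05 with 80 percent power and response rates near 50 percent yields roughly the 5 and 4.4 percentage point figures cited above:

```python
from math import sqrt
from statistics import NormalDist

def mde_two_proportions(n1, n2, p=0.5, alpha=0.05, power=0.80):
    """Minimum detectable difference between two response rates
    (one-sided z-test; assumptions noted in the text above)."""
    z = NormalDist().inv_cdf(1 - alpha) + NormalDist().inv_cdf(power)
    return z * sqrt(p * (1 - p) * (1 / n1 + 1 / n2))

print(f"{100 * mde_two_proportions(1230, 1230):.1f} pp")  # ~5.0
print(f"{100 * mde_two_proportions(2340, 1230):.1f} pp")  # ~4.4
```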

Cognitive Testing

As part of the development of SSOCS:2022, cognitive testing will be conducted with school administrators during the winter and spring of 2021. The cognitive testing will concentrate on new items pertaining to the COVID-19 pandemic, which caused widespread school closures, significant changes to school policies, and disruptions to the delivery of instruction to students in 2020 and 2021. The new COVID-19 pandemic questions included in attachment B of this submission may be revised in a later change request depending on the findings from the forthcoming cognitive testing.

B5. Individuals Responsible for Study Design and Performance

Several key staff members are responsible for the study design and performance of SSOCS:2022. They are:

  • Rachel Hansen, Project Director, National Center for Education Statistics

  • Jana Kemp, American Institutes for Research

  • Michael Jackson, American Institutes for Research

  • Riley Burr, American Institutes for Research

  • Talia Kaatz, American Institutes for Research

  • Ke Wang, American Institutes for Research

  • Shawna Cox, U.S. Census Bureau

  • Walter Holmes, U.S. Census Bureau

  • Kombo Gbondo Tugbawa, U.S. Census Bureau

  • Aaron Gilary, U.S. Census Bureau

  • Alfred Meier, U.S. Census Bureau

  • Dillon Simon, U.S. Census Bureau

