School Pulse Panel
Preliminary Activities
OMB# 1850-0963 v.3
Supporting Statement Part B
National Center for Education Statistics
Institute of Education Sciences
U.S. Department of Education
June 2021
revised August 2021
Contents
B1. Respondent Universe and Sample Design and Estimation
B2. Procedures for the Collection of Information
B3. Methods to Maximize Response Rates
B5. Individuals Responsible for Study Design and Performance
The U.S. Census Bureau will collect the School Pulse Panel (SPP) data on behalf of NCES. Data collection will be a self-administered, online survey. There will be two components to the online survey: 1) a district-level component focused on administrative data and 2) a school-level component focused on attitudinal questions and questions best asked of staff with more direct knowledge of day-to-day operations. It is estimated that the survey (both components) will require a total of about 4 hours of district and school staff time each month. The sampled school and its district will be offered a reimbursement totaling $5,000 for their participation in the study over the course of 12 months. To streamline the delivery of the reimbursement, a debit card will be sent to a designated district contact after the first quarter of collection has been completed. This initial debit card will have $1,250 loaded onto it. Additional money will be loaded onto the same card after each subsequent quarter of data collection. Principals, or the school staff most knowledgeable about COVID-19 impacts on the school environment and instructional offerings, can complete the school-level component survey. No classroom time is involved in the completion of this survey.
The resulting data will provide aggregate estimates for public schools across the nation. A stratified sample design will be used to select approximately 1,200 U.S. public schools. The sample is designed to provide national estimates of primary, middle, and high schools taking into account the type of locale (urbanicity) and racial/ethnic student enrollment.
The sampling frame for the School Pulse Panel is derived from the National Teacher and Principal Survey (NTPS) 2020-21 frame, which itself is largely derived from the 2018-19 Common Core of Data (CCD), the file of public schools supplied annually by State educational agencies to NCES. Only public schools in the 50 states and the District of Columbia will be included in the School Pulse Panel sampling frame, though the School Pulse Panel may sample Puerto Rico separately. Data will also be collected from the outlying areas (American Samoa, Guam, Commonwealth of Northern Mariana Islands, and U.S. Virgin Islands) at the district-level. Certain types of schools are excluded, including newly closed schools, home schools, virtual schools, ungraded schools, correctional facilities, and schools with a high grade of kindergarten or lower. Regular public schools, charter schools, alternative schools, special education schools, vocational schools, and schools that have partial or total magnet programs are included in the frame. For sample allocation purposes, strata are defined by instructional level, and the sample is sorted primarily on the type of locale (urbanicity), percent minority enrollment, and geographic region.
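For illustration only, the inclusion and exclusion rules above can be expressed as a filter over CCD-style school records. The sketch below is not the actual frame-construction program; the column names (in_50_states_or_dc, operational_status, school_category, highest_grade_offered) and their codings are hypothetical assumptions.

```python
# Minimal sketch of the SPP frame-inclusion rules described above, applied to
# hypothetical CCD-style records. Column names and category codes are
# illustrative; the real NTPS/CCD files use their own layouts.
import pandas as pd

EXCLUDED_CATEGORIES = {"home school", "virtual", "ungraded", "correctional facility"}

def build_spp_frame(ccd: pd.DataFrame) -> pd.DataFrame:
    """Keep public schools in the 50 states and DC that meet the SPP frame rules."""
    keep = (
        ccd["in_50_states_or_dc"]                             # hypothetical inclusion flag
        & (ccd["operational_status"] != "closed")             # drop newly closed schools
        & ~ccd["school_category"].isin(EXCLUDED_CATEGORIES)   # drop excluded school types
        & (ccd["highest_grade_offered"] >= 1)                 # drop schools topping out at K or lower
    )
    return ccd.loc[keep].copy()
```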
Tables 1 through 3 show the expected distribution of the public-school sampling universe for the School Pulse Panel, by school level and region, by school level and urbanicity, and by school level and percent minority enrollment, respectively.
Table 1. Expected respondent universe for the SPP sample, by school level and region, based on the 2020-21 NTPS Frame

Region | Elementary | Middle/Combined/Other | High/Grade 9-11 | Total
Northeast | 8,450 | 2,969 | 3,443 | 14,862
Midwest | 12,468 | 4,461 | 6,051 | 22,980
South | 18,075 | 7,480 | 8,157 | 33,712
West | 13,306 | 4,805 | 5,813 | 23,924
Total | 52,299 | 19,715 | 23,464 | 95,478
Table 2. Expected respondent universe for the SPP sample, by school level and urbanicity, based on the 2020-21 NTPS Frame

Urbanicity | Elementary | Middle/Combined/Other | High/Grade 9-11 | Total
City | 15,650 | 5,031 | 6,332 | 27,013
Suburb | 18,007 | 6,471 | 6,355 | 30,833
Town | 5,823 | 2,942 | 3,687 | 12,452
Rural | 12,819 | 5,271 | 7,090 | 25,180
Total | 52,299 | 19,715 | 23,464 | 95,478
Table 3. Expected respondent universe for the SPP school sample, by school level and percent minority enrollment, based on the 2020-21 NTPS Frame

Percent minority enrollment | Elementary | Middle/Combined/Other | High/Grade 9-11 | Total
0 to less than 25 | 15,487 | 5,975 | 7,423 | 28,885
25 to less than 50 | 11,220 | 4,508 | 4,435 | 20,163
50 to less than 75 | 8,884 | 3,529 | 3,576 | 15,989
75 or more | 16,708 | 5,703 | 8,030 | 30,441
Total | 52,299 | 19,715 | 23,464 | 95,478
Sample Selection and Response Rates
A stratified sample design will be used to select approximately 1,000-1,200 U.S. public schools. The sample is designed to provide national estimates of primary, middle/combined, and high schools taking into account the type of locale (urbanicity), racial/ethnic student enrollment, and region. Note that combined schools will be grouped with middle schools for the purposes of measurement and estimation.
There are two stages of sample selection. We will draw a base sample of 1,200 schools as an initial stage and then reach out to the districts of those schools for inclusion in the panel. If enough districts agree to participate that we obtain at least 1,000 schools for our panel, we will use those 1,000 to 1,200 schools as our panel. We will also draw a reserve sample with characteristics similar to those of the base sample. If we do not get the necessary number of participating districts from the base sample, we will reach out to the districts of the reserve-sample schools to complete the panel.
Once data collection starts, we anticipate some nonresponse from schools, and some schools may drop out of the sample entirely. We anticipate month-to-month response rates of about 70 to 80 percent for the panel, but nonparticipating schools may drive response rates below what is needed to produce our estimates. As a result, for each school in our sample we identify a replacement school with similar characteristics from the other schools in its district. If needed, we will reach out to those schools, which may constitute a second reserve sample.
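The document does not specify how "similar characteristics" are scored when identifying a replacement school. The sketch below illustrates one plausible matching rule, with hypothetical field names and illustrative weights, that selects the within-district school closest on instructional level, urbanicity, and percent minority enrollment.

```python
# Minimal sketch of picking a within-district replacement school with similar
# characteristics. The scoring rule, weights, and field names are hypothetical;
# the document does not specify the actual matching criteria.
from dataclasses import dataclass

@dataclass
class School:
    school_id: str
    district_id: str
    level: str           # e.g., "elementary", "middle/combined", "high"
    urbanicity: str      # e.g., "city", "suburb", "town", "rural"
    pct_minority: float  # 0-100

def pick_replacement(sampled: School, frame: list[School]) -> School | None:
    """Return the most similar other school in the same district, if any."""
    candidates = [s for s in frame
                  if s.district_id == sampled.district_id
                  and s.school_id != sampled.school_id]
    if not candidates:
        return None

    def distance(s: School) -> float:
        # Penalize mismatches on level and urbanicity more heavily than
        # differences in percent minority enrollment (illustrative weights).
        return (10.0 * (s.level != sampled.level)
                + 5.0 * (s.urbanicity != sampled.urbanicity)
                + abs(s.pct_minority - sampled.pct_minority) / 100.0)

    return min(candidates, key=distance)
```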
Note that there are no previous administrations of the School Pulse Panel, so response rate estimates cannot be derived from prior collections.
Sample Design for the School Pulse Panel
The main objective of the School Pulse Panel sampling design is to obtain overall subgroup estimates broken out by various school characteristics. For sample allocation and sample selection purposes, strata were defined by instructional level. In addition, region, locale, percent minority enrollment, enrollment size, and charter status were used as implicit stratification variables by sorting schools by these variables within each stratum before sample selection. The explicit stratification and the first three implicit stratification variables (region, locale, and percent minority enrollment) are priorities for evaluation for this panel.
The goal of the School Pulse Panel is to collect data from 1,000 to 1,200 schools. Schools will be allocated to the sampling strata in proportion to their distribution in the U.S. public school population.
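As a rough illustration of proportional allocation combined with the implicit stratification (sorting) described above, the sketch below allocates a 1,200-school sample to instructional-level strata in proportion to frame counts and then selects schools systematically within each sorted stratum. The column names are hypothetical, rounding is simplified (allocations may not sum exactly to the target), a fixed rather than random start is used, and the actual selection procedure may differ.

```python
# Minimal sketch of proportional allocation to instructional-level strata,
# followed by sorted (implicitly stratified) systematic selection.
# Column names are hypothetical; this is not the production sampling code.
import pandas as pd

def allocate_proportionally(frame: pd.DataFrame, n_total: int) -> dict[str, int]:
    """Allocate n_total schools to strata in proportion to frame counts."""
    counts = frame["instructional_level"].value_counts()
    shares = counts / counts.sum()
    return (shares * n_total).round().astype(int).to_dict()

def systematic_select(stratum: pd.DataFrame, n_h: int) -> pd.DataFrame:
    """Sort by the implicit stratification variables, then take every k-th school."""
    if n_h <= 0 or stratum.empty:
        return stratum.iloc[0:0]
    ordered = stratum.sort_values(
        ["region", "urbanicity", "pct_minority", "enrollment", "charter"]
    ).reset_index(drop=True)
    k = len(ordered) / n_h          # sampling interval
    start = k / 2                   # a random start in [0, k) would be used in practice
    picks = [int(start + i * k) for i in range(n_h)]
    return ordered.iloc[picks]

def draw_sample(frame: pd.DataFrame, n_total: int = 1200) -> pd.DataFrame:
    alloc = allocate_proportionally(frame, n_total)
    samples = [
        systematic_select(frame[frame["instructional_level"] == level], n_h)
        for level, n_h in alloc.items()
    ]
    return pd.concat(samples, ignore_index=True)
```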
Note that no experiments are planned within the sample in which different schools in the sample would be given different content or pathways.
Weights will be attached to each surveyed school so that the weighted data will represent population levels. The final weight for completed cases will be composed of a sampling base weight and an adjustment for nonresponse. Nonresponse weighting adjustment cells for the SPP data will be determined using a categorical search algorithm called Chi-Square Automatic Interaction Detection (CHAID). CHAID begins by identifying the school-level characteristics of interest that are the best predictors of response. It divides the dataset into groups so that the unit response rate within cells is as constant as possible and the unit response rate between cells is as different as possible. The characteristics of interest as predictors of response must be available for both respondents and nonrespondents in order to conduct a CHAID analysis, and, in the case of SPP, will be available through the CCD sampling frame. The final, adjusted weights will be raked so that the sum of the weights matches the number of schools derived from the School Pulse frame.
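Assuming the adjustment cells have already been formed (for example, by a CHAID-style partition of the frame characteristics), the nonresponse adjustment itself can be sketched as below. The column names (cell, base_weight, responded) are hypothetical, and the subsequent raking of the adjusted weights to frame totals is not shown.

```python
# Minimal sketch of a cell-based nonresponse weighting adjustment. The
# adjustment cells are assumed to have been formed already (e.g., via a
# CHAID-style partition); column names are hypothetical.
import pandas as pd

def adjust_for_nonresponse(schools: pd.DataFrame) -> pd.DataFrame:
    """Inflate respondents' base weights so each cell's weighted total is preserved.

    Expects columns: 'cell' (adjustment cell label), 'base_weight',
    and 'responded' (boolean).
    """
    out = schools.copy()
    # Total base weight in each cell, and total base weight of respondents only.
    cell_totals = out.groupby("cell")["base_weight"].transform("sum")
    resp_totals = (out["base_weight"] * out["responded"]).groupby(out["cell"]).transform("sum")
    factor = cell_totals / resp_totals                 # >= 1 within each cell
    # Respondents carry the adjusted weight; nonrespondents receive zero.
    out["nr_adjusted_weight"] = out["base_weight"] * factor * out["responded"]
    return out
```

After a step of this kind, the adjusted respondent weights would be raked so that their sum matches the school count derived from the School Pulse frame, as described above.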
Standard errors of the estimates will be estimated using jackknife replication. Replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs will be provided, as will the weights for all replicates that were formed in order to calculate variances.
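For reference, variance estimation from replicate weights of the kind described above can be sketched as follows. This assumes a JK2-type jackknife in which each replicate enters with a factor of 1; the data layout, column names, and example estimator are hypothetical placeholders.

```python
# Minimal sketch of variance estimation from jackknife replicate weights,
# assuming a JK2-type estimator where each replicate contributes with factor 1.
# The estimator function and data layout are generic placeholders.
from typing import Callable, Sequence
import pandas as pd

def jackknife_variance(
    data: pd.DataFrame,
    full_weight_col: str,
    replicate_weight_cols: Sequence[str],
    estimator: Callable[[pd.DataFrame, str], float],
) -> float:
    """Sum of squared deviations of replicate estimates from the full-sample estimate."""
    theta_full = estimator(data, full_weight_col)
    return sum(
        (estimator(data, rep_col) - theta_full) ** 2
        for rep_col in replicate_weight_cols
    )

# Example estimator: a weighted proportion of schools reporting some
# characteristic (the 'reported_yes' column is hypothetical).
def weighted_proportion(df: pd.DataFrame, weight_col: str) -> float:
    return (df[weight_col] * df["reported_yes"]).sum() / df[weight_col].sum()
```

The standard error of an estimate is then the square root of the returned variance.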
The School Pulse Panel data collection will begin during the recruitment process, with full data collection beginning in September of 2021.¹ The U.S. Census Bureau, acting as a contractor for NCES, will handle the data collection. Sampled schools and their districts will receive emails notifying them of the survey as soon as agreements are in place to conduct the data collection. Starting in September, each will receive an email notifying them of the full survey, which will include the log-on information to complete the online questionnaire. Respondents will have a two-week window to respond to the survey. Reminder emails will be sent during the data collection window. Data will be formally reviewed for disclosure prevention and released a few weeks after data collection ends for that month. This will be repeated monthly through August of 2022.
The sample of schools will be drawn in the summer preceding data collection once the SPP frame creation is complete. However, since many larger districts (known as “certainty” districts) are always included in the various NCES sample surveys, the preliminary research and application development for these districts will begin in the spring, prior to sampling. This will ensure that these districts have the necessary information to present to their research approval board during their scheduled annual or bi-annual meetings. Additional special contact district outreach will occur once the sample is drawn for any remaining sampled districts that require approval.
Special contact districts require that a research application be submitted to and reviewed by the district before they will allow schools under their jurisdiction to participate in a study. Districts are identified as “special contact districts” prior to data collection because they were flagged as such during previous cycles of SASS, NTPS, or SSOCS, or by other NCES studies. Special contact districts are also identified during data collection when districts indicate that they will not complete the survey until a research application is submitted, reviewed, and approved.
Once a district is identified as a special contact district, basic information about the district is obtained from the NCES Common Core of Data (CCD). The basic information includes the NCES LEA ID number, district name, city, and state. The next step is to search the district's website for a point of contact and any information available about the district's requirements for conducting external research. Some districts identified as special contact districts in a previous cycle may be flagged incorrectly, so staff will verify whether a given district has requirements for conducting external research before proceeding.
The following are examples of the type of information that will be gathered from each district’s website in order to prepare a research application for submission to this district:
Name and contact information for the district office or department that reviews applications to conduct external research, and the name and contact information of the person in charge of that office.
Information about review schedules and submission deadlines.
Whether application fees are required, and if so, how much.
Whether a district sponsor is required.
Whether an online application is required, and if so, the link to the application if possible.
Whether in-person approval is required, and if so, information about the in-person approval process.
Information about research topics and/or agenda on which the district is focusing.
The web link to the main research department or office website.
Research guidelines, instructions, application forms, District Action Plans, Strategic Plan or Goals, if any.
Recruitment staff will contact districts by phone and email to obtain key information not listed on the district's website (e.g., requirements for the research application and research application submission deadlines).
NCES staff developed a generic research application (see appendix A) that covers the information typically requested in district research applications. Recruitment staff will customize the generic research application to each district’s specific requirements that need to be addressed or included in the research application (e.g., how the study addresses key district goals, or inclusion of a district study sponsor), or submit the generic application with minimal changes to districts that do not have specific application requirements.
Using the information obtained from the district website or phone or email exchanges, a district research request packet will be prepared. Each research application will include the following documents, where applicable:
District research application cover letter;
Research application (district-specific or generic, as required by the district);
Study summary;
Special contact district approval form;
Participant informed consent form (if required by the district);
NCES Project Director’s resume;
Copy of questionnaires; and
Application fee (if required by the district).
Other information about the study may be required by the district and will be included with the application or provided upon request.
Given the short timeframe for recruitment, districts will be asked to make their decision within one week after the application is submitted. If additional information is requested by the district (e.g., the list of sampled schools), recruitment staff will follow up on such requests and will be available to answer any questions the district may have throughout the data collection period.
Some districts charge a fee (approximately $50 to $200) to process research application requests, which will be paid as necessary.
To achieve the highest possible response rate, we will send the study notification mailing to superintendents and Chief State School Officers (CSSOs) via e-mail prior to the start of the SPP data collection with the sampled schools. The purpose of this mailing is to provide districts and states with information about the survey and to inform them about the survey being sent to sampled schools in their district. It is not designed to ask for permission; rather, it is designed as a vehicle to help enhance participation. All materials sent to the CSSOs will be personalized using contact information from the CSSO website. Copies of the letters and materials sent to the superintendents/CSSOs are included in appendices A & B.
School and District Communication
The School Pulse Panel will be conducted via a self-administered web-based survey instrument. A clerical operation prior to data collection will obtain e-mail addresses for all the sampled school district contacts and school principals, and these e-mails will be used to contact the school districts and principals throughout the data collection. During the recruitment process, an initial letter will be sent via e-mail in July 2021 to notify sampled districts and schools of their selection for the survey, to verify contact information, and to inform schools and districts about reimbursements of up to $5,000 offered to districts over the course of 12 months for their participation in the study. The letters to school districts will also include a link to a short preliminary summer questionnaire, primarily about learning loss mitigation strategies (appendix C). A second e-mail to be sent in August 2021 will include an overview of the study’s topic areas, the research questions the study will examine, a copy of the study questionnaire for reference, information about the offered reimbursement, and a reminder to complete the preliminary summer questionnaire. An invitation e-mail will be sent in September, and each month thereafter, to distribute instructions on how to complete the monthly web questionnaire, including the survey URL and a unique UserID and password to access the survey online.
At the end of each quarter of data collection, districts that have completed each month of data collection will receive a debit card loaded with $1,250. The debit cards and their associated pins, along with instructions on how to use the debit card, will be delivered to a pre-identified district employee with fiscal responsibility.
A copy of the cover letters and e-mails sent to school districts and principals throughout the SPP data collection is included in appendices A & B.
Approximately 2 weeks after the second mailing to school principals, Census will initiate phone calls with nonrespondents, reminding them to complete their learning loss questionnaire.
Finally, during the last week of each month of SPP data collection, Census will conduct nonresponse follow-up by phone. This operation is aimed at reminding respondents to complete the online survey or at collecting data over the phone, whenever possible.
Refusal Conversion for Schools That Will Not Participate
If a school expresses strong concerns about confidentiality at any time during data collection, these concerns will be directed to the Census Project Director (and possibly to NCES) for formal assurance. All materials will include the project’s toll-free number. In addition, initial emails will include information about why the participation of each sampled school is important and how respondent information will be protected.
NCES is committed to obtaining a high response rate in the SPP. In general, a key to achieving a high response rate is to track the response status of each sampled district and school, with telephone follow-up, as well as follow-up by e-mail, of those schools that do not respond promptly. To help track response status, survey responses will be monitored through an automated receipt control system.
It is estimated that the survey (both components) will require a total of about 2-3 hours of combined district and school staff time each month. To encourage study participation, the district of each sampled school will be offered a reimbursement of a total of $5,000 for their participation in the study over the course of 12 months.
All completed questionnaires that are received by the Census Bureau will be reviewed for consistency and completeness. If a questionnaire has too few items completed to be counted as a response (or if it has missing or conflicting data for key items), telephone interviewers will seek to obtain more complete responses. Telephone interviews will be conducted only by Census Bureau interviewers who have received training in general telephone interview techniques as well as specific training for SPP.
As part of the development of the SPP, the learning loss questionnaire to be collected in July and August will go through expert review performed by the Census Bureau's Center for Behavioral Science Methods team. For items included in the full collection starting in September, cognitive testing will be conducted with school administrators during the summer of 2021. The cognitive testing will concentrate on items pertaining to the COVID-19 pandemic, which caused widespread school closures, significant changes to school policies, and disruptions to the delivery of instruction to students in 2020, 2021, and 2022. These items will be included in the subsequent emergency clearance package in late summer of 2021.
Several key staff are responsible for the study design and performance of the School Pulse Panel. They are:
Rachel Hansen, Project Director, National Center for Education Statistics
Michelle McNamara, National Center for Education Statistics
Cassandra Logan, U.S. Census Bureau
Elke McLaren, U.S. Census Bureau
Aaron Gilary, U.S. Census Bureau
Alfred Meier, U.S. Census Bureau
Kathleen Kephart, U.S. Census Bureau
Jessica Holzberg, U.S. Census Bureau
¹ Details of the full data collection for the 2021-22 school year will be provided in a subsequent package upon clearance of this package for the preliminary activities.