
National Teacher and Principal Survey

of 2019-2020 (NTPS 2019-20)

Preliminary Field Activities



OMB# 1850-0598 v.24



Supporting Statement

Part B





National Center for Education Statistics (NCES)

U.S. Department of Education





June 2018







Part B Collection of Information Employing Statistical Methods

This request is to contact districts and schools in order to begin preliminary activities for the NTPS 2019-20 collection, including: (a) contacting and seeking research approvals from special handling districts, where applicable, and (b) notifying sampled schools of their selection for the survey and verifying their mailing addresses. This document describes the preliminary plans for the NTPS 2019-20 sample design, estimation details, and recruitment and data collection procedures, based on the NTPS 2017-18 design. The NTPS 2019-20 Main Study clearance request, which will be published for public comment in December 2018, will describe the final sample design, recruitment, and data collection plans.

B.1 Universe, Sample Design, and Estimation

Section B.1.1 includes information on the study universe of interest and sample design planned for NTPS 2019-20. Section B.1.2 describes the precision requirements and target sample sizes set out for the study.

B.1.1 Universe and Sample Design: Respondent Universe

B.1.1.1 Schools

The respondent universe for NTPS 2019-20 data collection consists of approximately 94,000 public schools and 25,000 private schools in the 50 U.S. states and the District of Columbia (DC) that offer instruction in any of grades 1-12 or the ungraded equivalent. To be eligible for inclusion in the sample, schools must: provide classroom instruction to students; have one or more teachers who provide instruction; serve students in at least one of grades 1-12 or the ungraded equivalent; be located in one or more buildings; and be located in the continental United States.

The most recent final Common Core of Data (CCD) file available from NCES at the time of sampling in spring 2019 will be used to construct the public school frame. The respondent universe for charter schools will be identified as those public charter schools on the CCD that meet the NTPS definition of an eligible school. The universe has been adjusted to remove kindergarten-terminal schools, which are not eligible for NTPS. Table 1 presents the number of public schools on the 2017-18 NTPS public school universe, which is based on the 2014-15 CCD, by urbanicity and school level. The CCD file that will be used to construct the sample for NTPS 2019-20 is not yet available at the time of this submittal.

Table 1. Respondent universe by school level and urbanicity for the proposed public school sample, based on the 2017-18 NTPS Public School Universe

                           School level
Urbanicity        Primary    Middle      High  Combined     Total
Central City       15,308     3,699     5,407     1,727    26,141
Suburban           17,933     5,136     5,901     1,220    30,190
Town                6,138     2,340     3,481       876    12,835
Rural              12,221     3,189     6,014     3,538    24,962
Total              51,600    14,364    20,803     7,361    94,128

SOURCE: 2017-18 NTPS; 2014-15 CCD.

The private school frame will be drawn from the 2017-18 Private School Survey (PSS) frame. Preschools and schools with kindergarten as the highest grade will be excluded. Table 2 presents the number of private schools on the 2015-16 PSS universe by urbanicity and school level.

Table 2. Respondent universe by school level and urbanicity for the proposed private school sample, based on the 2015-16 PSS

                             School level
Urbanicity        Elementary   Secondary   Combined     Total
Central City           4,975       1,121      2,702     8,798
Suburban               5,005         871      2,938     8,814
Town                   1,315         145        766     2,226
Rural                  2,735         472      1,939     5,146
Total                 14,030       2,609      8,345    24,984

SOURCE: 2015-16 PSS.

B.1.1.2 Teachers

Teachers will be randomly sampled in the second design stage from (a) the roster information provided by each participating sampled school on a Teacher Listing Form (TLF), (b) a roster compiled through a clerical look-up operation, or (c) a roster purchased from a vendor. Teachers within a sampled school are classified as ineligible for NTPS if they are a short-term substitute teacher, a student teacher, or a teacher's aide, or if they do not teach any of grades K-12 or comparable ungraded levels. The information that classifies teachers as ineligible is obtained from the Teacher Questionnaire. Details of the second-stage teacher sample design are provided in section B.1.2.

B.1.2 Precision Requirements and Sample Sizes

This section details the school sample sizes and precision requirements for the NTPS 2019-20 public and private school samples. The sample for NTPS 2019-20 is expected to include approximately the same sample sizes for both public and private schools and teachers as NTPS 2017-18. However, after the 2017-18 NTPS data collection ends in July 2018, its results will inform the final NTPS 2019-20 study design – including sample sizes, precision requirements, and sampling methodologies – all of which will be fully specified in the NTPS 2019-20 Main Study submission in December 2018.

The final NTPS 2017-18 public sample included:

  • 10,600 schools and school principals (9,100 traditional public and 1,500 public charter), with the goal of at least 6,800 completed interviews; and

  • 47,000 teachers (42,100 traditional public and 4,900 public charter), with the goal of at least 35,000 interviews.

The final NTPS 2017-18 private school sample included:

  • 4,000 schools and school principals, with the goal of at least 2,300 completed interviews; and

  • 9,000 teachers, with the goal of at least 6,000 interviews.

Sampling – Public Schools

The level of precision achieved by NTPS 2017-18 will be evaluated to inform the sample design decisions for NTPS 2019-20. In particular, publishability and bias indicators (described in Section B.3.2) will be reviewed in order to improve the NTPS 2019-20 school sample design. The NTPS 2019-20 oversampling stratification will be based preliminarily on the following domains:

      • Charter/Non-charter;

      • School Level (primary, middle, high, combined);

      • Urbanicity (city, suburb, town, rural);

      • School enrollment (four levels: schools with enrollment less than 100; schools with enrollment between 100 and 199; schools with enrollment 200 to 499; schools with enrollment 500 or more);

      • State tier and state.

The NCES standards for publishability indicate that the coefficient of variation (CV) must be no larger than 50%, and that estimates with a CV between 30% and 50% must be published with a caveat. For a population proportion of 20%, a CV of 30% corresponds to a standard error of 6%. To ensure that we do not exceed the 30% CV threshold, given uncertainties about response rates and about the exact values of the design effects, we set a target CV of 25%, which corresponds to an expected standard error of 5%. This considerably reduces the chance of crossing the 30% boundary (if we set 30% itself as the target, we would exceed it roughly half the time). Our goal for each state, then, is to ensure that the expected standard error is no larger than 5% for a population proportion of 20% (a CV of 25%), at both the school and teacher levels.
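To make the arithmetic explicit, the CV of an estimated proportion is its standard error divided by the proportion, so the two thresholds above follow directly:

\[ \mathrm{CV}(\hat{p}) = \frac{SE(\hat{p})}{\hat{p}} \;\Rightarrow\; SE = \mathrm{CV} \times \hat{p}; \qquad 0.30 \times 0.20 = 0.06, \qquad 0.25 \times 0.20 = 0.05. \]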

Table 3 presents a portion of the analysis for public schools by school type, grade level, urbanicity, and poverty status. Presented are the anticipated number of responding schools or principals for the NTPS design and the expected precision, based on analyses using the NTPS 2015-16 final response rates and a 25% CV target. The analysis using the NTPS 2017-18 final response rates with the 25% CV target will be completed in June 2019, at which time NCES will submit a change request with the final analysis results in a revised Table 3.

Table 3. NTPS 2017-18 school-domain expected interviews, standard errors, and design effects with state oversampling to achieve 25% CV or less

Domain                       Frame     Expected Sample Size     Expected        Design
                             Schools   (completed interviews)   Standard Error  Effect
All                          94,128    6,700                    0.63%           1.680
Charter                       6,530      774                    1.69%           1.375
Non-charter                  87,598    5,926                    0.67%           1.658
Primary                      51,600    3,028                    0.89%           1.489
Middle                       14,364    1,122                    1.43%           1.431
High                         20,803    1,739                    1.40%           2.125
Combined                      7,361      810                    1.89%           1.814
City                         26,141    1,941                    1.17%           1.673
Suburban                     30,190    1,972                    1.13%           1.581
Town                         12,835    1,047                    1.61%           1.696
Rural                        24,962    1,740                    1.28%           1.775
Enrollment < 100              8,208      332                    3.44%           2.464
100 <= Enrollment < 300       7,618      490                    2.30%           1.621
300 <= Enrollment < 500      36,116    2,376                    1.00%           1.489
500 <= Enrollment < 750      23,552    1,653                    1.15%           1.377
750 <= Enrollment < 1,000     9,395      789                    1.65%           1.343
1,000 <= Enrollment           9,239    1,060                    1.38%           1.255
Percent FRPL < 35%           26,066    1,928                    1.27%           1.947
35% <= Percent FRPL < 50%    15,561    1,194                    1.46%           1.590
50% <= Percent FRPL < 75%    26,182    1,828                    1.17%           1.574
75% <= Percent FRPL          24,417    1,601                    1.23%           1.507
Not Participating FRPL        1,902      148                    5.30%           2.603


Table 4 provides the analogous precision analysis for public school teachers. The expected standard errors were calculated based on analyses using the NTPS 2015-16 final response rates and a 25% CV target. The analysis using the NTPS 2017-18 final response rates with the 25% CV target will be completed in June 2019, at which time NCES will submit a change request with the final analysis results in a revised Table 4.

Table 4. NTPS 2017-18 major domain expected teacher interviews, standard errors, and design effects with state oversampling to achieve 25% CV or less

Domain                       Frame Full-Time       Expected Teacher       Expected        Design
                             Equivalent Teachers   Completed Interviews   Standard Error  Effect
                             (in 1,000s)
All                          3,127.9               34,722                 0.44%           4.25
Charter                        144.9                3,394                 1.24%           3.25
Non-charter                  2,983.0               31,329                 0.46%           4.14
Primary                      1,473.6               13,507                 0.67%           3.80
Middle                         552.6                6,368                 1.01%           4.09
High                           924.3               11,154                 0.82%           4.72
Combined                       177.4                3,694                 1.23%           3.47
City                           920.6               10,328                 0.82%           4.36
Suburban                     1,202.1               11,377                 0.77%           4.19
Town                           368.0                5,126                 1.10%           3.85
Rural                          637.2                7,891                 0.90%           3.99
Enrollment < 100                40.9                  712                 2.49%           2.75
100 <= Enrollment < 300         94.3                1,519                 1.85%           3.26
300 <= Enrollment < 500        862.9                9,999                 0.77%           3.73
500 <= Enrollment < 750        865.9                9,544                 0.84%           4.22
750 <= Enrollment < 1,000      474.1                4,909                 1.20%           4.38
1,000 <= Enrollment            789.8                8,039                 0.98%           4.81
Percent FRPL < 35%             943.9               10,524                 0.82%           4.46
35% <= Percent FRPL < 50%      530.3                6,253                 1.05%           4.34
50% <= Percent FRPL < 75%      839.9                9,287                 0.84%           4.11
75% <= Percent FRPL            755.2                7,829                 0.91%           4.06
Not Participating FRPL          58.6                  831                 3.00%           4.68


Sampling – Private Schools

To inform the sample design for NTPS 2019-20 private schools, NCES will evaluate the level of precision achieved in NTPS 2017-18. The precision analysis will be based on key analysis variables and on population proportions for important school characteristics. The following variables will be evaluated:

  • School type (Religious – Catholic, Religious – Other, Non-Religious);

  • Grade Level (Elementary, Secondary, Combined); and

  • Region (Northeast, Midwest, South, West).

The sample design for private schools will be broadly consistent with the design used for the NTPS 2017-18 private school test. The goal is to achieve a CV of less than 30 percent for a population proportion of 20% in order to meet NCES standards for reporting.

In order to better equalize precision across major school domains for private schools, as was done in NTPS 2017-18, NCES plans to oversample for NTPS 2019-20 as follows (a sketch of the resulting selection probabilities follows this list):

  • Secondary schools will be sampled at a rate proportional to 3.33 times the measure of size (as determined by number of FTE teachers);

  • Non-Religious schools will be sampled at a rate proportional to 1.43 times the measure of size (except for secondary non-religious schools, which are sampled at the 3.33 rate); and

  • Other strata will be sampled at a rate proportional to 1.0 times the measure of size.
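The following is a minimal sketch, not the production algorithm, of how these factors might translate into probability-proportional-to-size (PPS) selection probabilities; the field names and helper function are illustrative assumptions:

    # Illustrative sketch: PPS selection probabilities using the private
    # school oversampling factors described above. Field names are assumptions.

    def oversample_factor(level: str, religious: bool) -> float:
        if level == "secondary":
            return 3.33            # secondary rule applies regardless of affiliation
        return 1.00 if religious else 1.43

    def selection_probabilities(schools, n):
        """schools: dicts with 'fte', 'level', and 'religious' keys.
        Measure of size (MOS) = oversampling factor x FTE teachers; the
        selection probability is n * MOS_i / total MOS, capped at 1
        (certainty selections)."""
        mos = [oversample_factor(s["level"], s["religious"]) * s["fte"]
               for s in schools]
        total = sum(mos)
        return [min(1.0, n * m / total) for m in mos]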

For teachers, the expected number of completed interviews is estimated to be proportional to the product of the final school sampling factor and the number of full-time equivalent (FTE) teachers across schools in the domain. The overall target number of completed interviews is 6,000. Assuming the attrition rate for NTPS 2019-20 will be similar to the rate for NTPS 2017-18, the sample size needs to be 9,000 in order to yield the expected number of completed teacher interviews. The teacher sample size for a sampled school should be proportional to the product of the final teacher multiplier (based on the expected attrition adjustment factors), the final school oversampling factor, and the measure of size for the school. Teachers will be sampled from roster information provided by each participating sampled school. The target numbers of completed teacher interviews are designed to be proportional to the square root of the number of full-time teachers at each school and assume an attrition rate due to nonresponse.
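A minimal sketch of the within-school teacher allocation just described, assuming the 6,000-of-9,000 completion ratio implied above; the constant k and the function itself are illustrative assumptions:

    import math

    COMPLETION_RATE = 6000 / 9000   # expected completes per sampled teacher

    def teacher_sample_size(full_time_teachers: float, k: float) -> int:
        """Target completed interviews proportional to the square root of the
        school's full-time teacher count, inflated for expected attrition.
        k is a hypothetical constant tuned so sampled counts total ~9,000."""
        target_completes = k * math.sqrt(full_time_teachers)
        return math.ceil(target_completes / COMPLETION_RATE)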

NTPS 2019-20 will have an implicit stratification based on the proposed systematic sampling sort order, which uses a hierarchy of the following domains, as was done for NTPS 2017-18 (see the sketch after this list):

    • Three-level affiliation (Catholic, non-Catholic religious, nonreligious);

    • Three-level school span (elementary, secondary, combined);

    • Four-level Census region (Northeast, Midwest, South, West);

    • Four-level urbanicity (city, suburb, town, rural);

    • Eleven-level affiliation;

    • Five-level school size (enrollment <100, 100-199, 200-499, 500-749, 750+);

    • State;

    • Highest grade;

    • Twelve-level urbanicity (large city, medium-sized city, small city, etc.);

    • Zip code;

    • School enrollment;

    • PIN number.
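A minimal sketch of how implicit stratification of this kind can operate: sort the frame hierarchically on the domains above, then take a systematic PPS pass over the sorted list. The field names are illustrative assumptions, and a production design would handle certainty selections separately:

    import random

    def sort_key(s):
        # Hierarchy mirrors the list above; all field names are assumptions.
        return (s["affiliation3"], s["span3"], s["region4"], s["urbanicity4"],
                s["affiliation11"], s["size5"], s["state"], s["highest_grade"],
                s["urbanicity12"], s["zip"], s["enrollment"], s["pin"])

    def systematic_pps(schools, n):
        """Systematic PPS over the sorted frame: random start in [0, step),
        then a selection at every 'step' units of cumulative measure of size."""
        frame = sorted(schools, key=sort_key)
        total = sum(s["mos"] for s in frame)
        step = total / n
        point = random.uniform(0, step)
        sample, cum = [], 0.0
        for s in frame:
            cum += s["mos"]
            while point < cum:      # a school with MOS > step is hit repeatedly
                sample.append(s)
                point += step
        return sample

Because consecutive selections fall in adjacent sort positions, the sample is spread across every level of the hierarchy, which is what makes the sort an implicit stratification.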

Tables 5 and 6 show expected sample sizes, standard errors, and CVs for population percentages of 20% by key domains of school type, grade level, and region.

Table 5 presents a portion of the analysis for private schools by affiliation, grade level, and region.

Table 5. School-domain expected interviews, standard errors, and design effects for the NTPS 2017-18 private school sample

Domain            Frame     Expected Sample Size     Expected        Design
                  Schools   (completed interviews)   Standard Error  Effect
All               24,861    2,266                    1.08%           1.65
Catholic           6,407      742                    1.83%           1.55
Other religious   11,600      774                    1.80%           1.57
Nonsectarian       6,854      750                    1.77%           1.46
Elementary        13,216      826                    1.61%           1.34
Secondary          2,426      654                    1.69%           1.17
Combined           9,219      786                    1.72%           1.45
Northeast          5,787      602                    2.26%           1.92
Midwest            6,105      512                    2.24%           1.61
South              8,025      706                    1.86%           1.53
West               4,944      446                    2.38%           1.58


Table 6 provides the analogous precision analysis for private school teachers.

Table 6. Teacher-domain expected interviews, standard errors, and design effects for the NTPS 2017-18 private school sample

Domain            Frame Full-Time       Expected Teacher       Expected        Design
                  Equivalent Teachers   Completed Interviews   Standard Error  Effect
All               431,588               5,827                  0.99%           3.58
Catholic          135,265               2,078                  1.75%           3.98
Other religious   164,122               1,756                  1.72%           3.24
Nonsectarian      132,201               1,993                  1.65%           3.40
Elementary        163,523               1,644                  1.65%           2.81
Secondary          62,614               1,933                  1.66%           3.32
Combined          205,451               2,250                  1.53%           3.31
Northeast         112,558               1,661                  1.92%           3.84
Midwest            91,178               1,233                  2.13%           3.50
South             149,772               1,848                  1.74%           3.48
West               78,081               1,084                  2.26%           3.47

Sampling – Principals within All Schools

For each sampled traditional public, public charter, and private school, the principal will be included in the survey as a result of the school's selection.

Survey Weights

Schools, principals, and teachers will be weighted by the inverse of the probability of selection. The final weight will contain adjustments for nonresponse and any other sampling or field considerations that arise after the sample has been drawn.
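In symbols, a standard two-step formulation (a sketch only; the production weighting involves the additional adjustment and raking steps described in section B.3.2):

\[ w_i = \frac{1}{\pi_i} \times a_{c(i)}, \]

where \(\pi_i\) is the selection probability of unit \(i\) and \(a_{c(i)}\) is the adjustment factor (for nonresponse and other considerations) applied to its adjustment cell \(c(i)\).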

Response Rates

We expect the NTPS 2019-20 response rates to approximate those of NTPS 2017-18 (for public and private schools) or to be somewhat lower, given the long-term decline in response rates for federal surveys. Table 7 provides the base-weighted response rates for NTPS 2015-16, given that the final base-weighted response rates have not yet been calculated for NTPS 2017-18. Table 8 provides the preliminary unweighted response rates for NTPS 2017-18 as of mid-May 2018. Note that as of mid-May 2018, data collection was still ongoing, especially for teachers; for this reason, teacher response rates are not reported in Table 8. The final NTPS 2017-18 response rates will be included in the NTPS 2019-20 Main Study submission in December 2018.

Table 7. Base-weighted response rates for NTPS 2015-16 by respondent and school type

                       Unit of Observation
School Type            Teacher   Principal   School
Traditional Public        67.9        71.8     72.5
Charter                   66.2        71.9     73.2



Table 8. Preliminary unweighted response rates for NTPS 2017-18 by respondent and school type

                       Unit of Observation
School Type            Principal   School
Traditional Public          69.6     71.9
Charter                     62.2     65.9
Private                     62.3     65.9


B.2 Procedures for the Collection of Information

Section B.2.1 describes the operations for the preliminary field activities for NTPS 2019-20, with Section B.2.1.1 describing the special contact district operation and Section B.2.1.2 the school pre-contact letter. Section B.2.2 describes school-level data collection procedures for the school-level questionnaires (i.e., the Teacher Listing Form, School Questionnaire, and Principal Questionnaire), with Section B.2.2.1 describing the procedures to be used with priority schools and Section B.2.2.2 those used with non-priority schools. Section B.2.3 describes data collection procedures for the Teacher Questionnaire.

B.2.1 Preliminary Field Activities

B.2.1.1 Special Contact District Operation

Special contact districts require that a research application be submitted to and reviewed by the district before they will allow schools under their jurisdiction to participate in a study. Districts are identified as "special contact districts" prior to data collection if they were flagged as such during previous cycles of SASS or NTPS, or by other NCES studies. Special contact districts are also identified during data collection when districts indicate that they will not complete the survey until a research application is submitted, reviewed, and approved.

Once a district is identified as a special contact district, basic information about the district is obtained from the NCES Common Core of Data (CCD). The basic information includes the NCES LEA ID number, district name, city, and state. The next step is to search the district's website for a point of contact and any information available about the district's requirements for conducting external research. Because some districts may have been incorrectly identified as special contact districts in a previous cycle, staff will verify whether a given district has requirements for conducting external research before proceeding.

The following are examples of the type of information that will be gathered from each district’s website in order to prepare a research application for submission to this district:

  • Name and contact information for the district office or department that reviews applications to conduct external research, and the name and contact information of the person in charge of that office.

  • Information about review schedules and submission deadlines.

  • Whether application fees are required, and if so, how much.

  • Whether a district sponsor is required.

  • Whether an online application is required, and if so, the link to the application if possible.

  • Information about research topics and/or agenda on which the district is focusing.

  • The web link to the main research department or office website.

  • Research guidelines, instructions, application forms, District Action Plans, Strategic Plan or Goals, if any.

Recruitment staff will contact districts by phone and email to obtain key information not listed on the district's website (e.g., requirements for the research application or research application submission deadlines).

NTPS staff developed a generic research application that covers the information typically requested in district research applications. Staff will customize the generic research application to each district’s specific requirements that need to be addressed or included in the research application (e.g., how the study addresses key district goals, or inclusion of a district study sponsor), or submit the generic application with minimal changes to districts that do not have specific application requirements.

Using the information obtained from the district website or phone or email exchanges, a district research request packet will be prepared. Each research application will include the following documents, where applicable:

  • District research application cover letter;

  • Research application (district-specific or generic, as required by the district);

  • Study summary;

  • FAQ document;

  • Special contact district approval form;

  • Participant informed consent form (if required by the district);

  • NTPS Project Director’s resume;

  • Copy of questionnaires; and

  • Application fee (if required by the district).

Where required or requested, applications will include the draft 2019-20 NTPS questionnaires, which are the 2015-16 questionnaires included in Appendix B of this submission. The 2017-18 NTPS questionnaires will be provided to districts that request them. Other information about the study may be required by the district and will be included with the application or provided upon request.

Approximately one week after the application is submitted to the district (either electronically or in hard copy, as required by the district), NTPS district recruitment staff will contact the district’s research office to confirm receipt of the package and to ask when the district expects to review the research application and when a decision will be made. If additional information is requested by the district (e.g., the list of sampled schools), recruitment staff will follow up on such requests and will be available to answer any questions the district may have throughout the data collection period.

Some districts charge a fee (approximately $50 to $200) to process research applications; these fees will be paid as necessary.

B.2.1.2 School Pre-Contact Letters

The purpose of the school pre-contact letter is to verify school mailing addresses and to inform schools about the upcoming data collection. A letter is sent to each sampled school informing it of its selection for the study. About 4% of all school addresses are corrected by the U.S. Postal Service in response to the pre-contact letter, saving time and effort during the actual data collection period.

B.2.2 School-level Data Collection Procedures

This section describes the data collection procedures used for the NTPS 2017-18 full-scale data collection including the Teacher Listing Form (TLF), School Questionnaire (SQ), Principal Questionnaire (PQ), and Teacher Questionnaire (TQ). The final data collection procedures for NTPS 2019-20 are under development and will be fully specified in the NTPS 2019-20 Main Study submission in December 2018.

School-level data collection procedures for NTPS 2017-18 are summarized in Exhibit 1.

In July 2017, all schools received an advance letter addressed to the principal at the school address. The letter included instructions for completing a brief screener interview online using the NTPS Respondent Portal. The purpose of the screener interview was to determine the school's eligibility for the NTPS and to establish a survey coordinator. The survey coordinator was asked to facilitate the completion of NTPS questionnaires within his or her school, and materials were mailed to him or her throughout data collection. Principals who did not self-screen were contacted by telephone to complete the screener. A reminder email was sent to nonresponding school principals in August 2017.

After the advance letter and screener interview, schools entered one of two data collection paths. The data collection methodology employed depended on whether the school had been identified as a "priority school." Priority schools were identified using a propensity model developed for the 2015-16 NTPS data collection; the same model, with updated information, was used for the 2017-18 NTPS data collection and will be used again for the 2019-20 NTPS data collection.

Prior to the start of NTPS 2017-18 data collection, a propensity model was run to identify "priority" schools. These "priority" schools have characteristics of schools from which it has been historically difficult to collect data and which have a potentially high impact on weighting. The priority flag takes into account both the response propensity and the base weight of a school to create a measure of a school's potential effect on nonresponse weighting adjustments and final estimates. Schools with either an extremely high weight or an extremely low response propensity have a large response influence, meaning their nonresponse will disproportionately affect the nonresponse adjustment cell in which they are located. Thus, efforts are made to prioritize field operations in these schools early in data collection.

Between late February and early March 2018, an additional reminder email was sent to principals and/or survey coordinators of nonresponding schools, and between late February and early June 2018, an email reminder was sent to teachers who were not eligible for the contingency plan incentive experiment. Each of these emails included a link to an informational NTPS video and to the relevant survey, the respondent's User ID, and selected findings from the 2015-16 NTPS.

Exhibit 1: 2017-18 National Teacher and Principal Survey – School-Level Data Collection Operations

[Flowchart of the school-level data collection operations. Callouts: reminder email to principals/schools (late February to early March 2018); reminder email to teachers (late February to early June 2018).]

B.2.2.1 Priority Schools

In early September 2017, principals or survey coordinators at priority schools were mailed a letter, at the school address, informing them that their school might receive a personal visit from Census Bureau staff in the coming weeks. About ten days later, data collection began with a personal visit from a Census Bureau Field Representative. The expectation for the personal visit was that the Census Bureau Field Representative would complete the school's Teacher Listing Form (TLF). In most cases, the TLF was pre-populated with vendor or clerically-researched data, and the Field Representative only needed to verify that the teacher information was complete and accurate. The Field Representative also distributed sealed letters containing login information for the school and principal questionnaires. If the Field Representative noted that the school had shown reluctance or had initially refused to participate in the study, the Field Representative's Regional Office sent a "letter of better understanding" to help encourage participation.

Schools for which the personal visit was unsuccessful received an initial package in late October 2017 addressed to the survey coordinator at the school address. If a survey coordinator was not established during the screener interview, the package was addressed to the principal at the school address. The mailed package contained a letter to the survey coordinator or principal and three individual sealed envelopes containing login information for completing the TLF, Principal Questionnaire, and School Questionnaire. Around the same time the initial packages were mailed, principals and survey coordinators were also contacted by email; the emails included the appropriate hyperlinks and User IDs to complete the NTPS questionnaires online. A reminder email was sent to principals and survey coordinators in mid-November.

In late November 2017, a second package was mailed to the survey coordinator or principal of nonresponding priority schools, at the school address. The package included a reminder letter; a pre-populated paper TLF and a return envelope (if applicable); and/or replacement materials for completing the principal and/or school questionnaires online. Principal and survey coordinator email addresses were used as a means of reminding nonresponding school staff to complete their questionnaires.

In early January 2018, priority schools that had not yet provided or verified their TLF had their teachers sampled from the vendor or clerically-researched list of teachers. If outstanding school-level forms remained, a third package was mailed to the survey coordinator or principal at the school address. This package included a reminder letter, paper versions of the principal and/or school questionnaire(s), and postage-paid addressed return envelopes. Principal and survey coordinator email addresses were used as a means of reminding nonresponding schools to complete their questionnaires.

Beginning in late January 2018, priority schools that had not yet completed their school and/or principal questionnaires were sent to a telephone reminder operation aimed at prompting the survey coordinator or school principal to complete the questionnaires. If outstanding school-level forms remained after the telephone reminder operation, one more attempt by mail, email, and telephone was made to remind the school to complete its outstanding questionnaire(s).

B.2.2.2 Non-priority Schools

In September 2017, all non-priority schools received an initial school package addressed to the survey coordinator at the school address. If a survey coordinator was not established during the screener interview, the package was addressed to the principal at the school address. The package contained a letter to the survey coordinator or principal, and three individual sealed envelopes that contained login information for completing the TLF, Principal Questionnaire, and School Questionnaire. Principals and survey coordinators were also contacted by email around the same time the initial packages were mailed to the sampled schools. The emails contained the appropriate hyperlinks and User IDs to complete the NTPS questionnaires online.

About three weeks later, a second package was mailed to nonresponding schools. The package included a reminder letter to the survey coordinator or principal and replacement materials for completing the outstanding questionnaires online. Principal and survey coordinator email addresses were used as a means of reminding nonresponding school staff to complete their questionnaires.

Beginning in November 2017, non-priority schools that had not yet completed their TLF electronically were sent to a telephone reminder operation aimed at prompting the survey coordinator or school principal to complete the TLF online. Non-priority schools that completed their TLF but had not yet returned either the Principal Questionnaire or the School Questionnaire received a reminder letter and email during this time.

In late November 2017, non-priority schools with outstanding school-level questionnaires were mailed a third package. The package included a reminder letter to the survey coordinator or principal, paper versions of the questionnaires that were still outstanding, and postage-paid return envelopes. If the TLF was one of the outstanding questionnaires, the version included in this third mailout was pre-populated with teacher list data from the vendor or clerical research. Principal and survey coordinator email addresses were used as a means of reminding nonresponding school staff to complete their questionnaires.

In early January 2018, non-priority schools that had not yet completed their TLF were sent to a Field operation, where sampled schools received an in-person visit from a Field Representative. The expectation for the personal visit was that the Census Bureau Field Representative would: (a) verify the school’s TLF, which was pre-populated with vendor or clerically-researched data when such data were available, and (b) distribute paper school and/or principal questionnaires as needed. After the Field operation, non-priority schools that had not provided or verified their TLF had their teachers sampled from the vendor or clerically-researched list of teachers.

Beginning in early January 2018, principals and/or survey coordinators in non-priority schools that completed their TLF but had not completed their school and/or principal questionnaire were sent a reminder email and were contacted by telephone.

If outstanding school and/or principal questionnaires remained after the field or telephone operation, contacts by mail, email, telephone, and in-person visit (if the school had not previously been visited) were made to remind the school to complete its outstanding questionnaire(s).

B.2.3 Teacher Data Collection

Teachers were sampled weekly from completed or verified TLFs throughout data collection. As teachers were sampled, they were mailed an initial teacher package containing a letter that introduced the survey and provided the login information to complete their survey online. Around the same time, teachers for whom an email address was available were also sent an email including the hyperlink and User ID to complete their teacher questionnaire online. If the school had a survey coordinator established, the individually-sealed teacher packages were sent to the survey coordinator, at the school address, with a cover letter. If the school did not have a survey coordinator established, the teacher packages were in most cases mailed individually to the sampled teachers at the school address. Exceptions were made for late-sampled teachers, whose materials were mailed directly to their school's principal to distribute.

If the school's teachers were sampled from a vendor or clerical list (where the school did not complete or verify a TLF), materials for the sampled teachers to complete their teacher questionnaires were mailed directly to the teachers at their school address, regardless of whether a survey coordinator was established. Exceptions were made for late-sampled teachers, whose materials were mailed directly to their school's survey coordinator (when one was established) or principal to distribute.

Teachers with a valid email address were sent an email containing the hyperlink to the online Teacher Questionnaire and their User ID a few days after their initial mailout.

Each sampled teacher could have received as many as three reminder packages to complete their outstanding Teacher Questionnaire. Each teacher mailing was accompanied by an email to the teacher a few days after the mailing. The first reminder letter contained the login information for the Teacher Questionnaire (URL and User ID) and was sent to the survey coordinator (if applicable). The second and third reminder packages included a letter and a paper questionnaire and were addressed directly to the sampled teachers at the school address, regardless of whether the school had a survey coordinator established.

Beginning in late January 2018, telephone interviewers contacted survey coordinators to ask them to remind their schools’ sampled teachers to complete their questionnaires. Telephone interviewers and/or Field Representatives contacted nonresponding teachers by phone or during an in-person visit from late February through May 2018.

B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

This section describes the methods that NCES will use to secure cooperation, maximize response rates, and deal with nonresponse for NTPS 2019-20. Section B.3.1 details how NTPS plans to secure cooperation by leveraging its status as the primary source of information on K-12 schools and staffing in the United States. Section B.3.2 describes the methods used in NTPS 2017-18 to minimize nonresponse, including those added as change requests to try to boost response rates. The final methods selected for NTPS 2019-20 will be specified in the NTPS 2019-20 Main Study submission in December 2018.

B.3.1 Methods to Secure Cooperation and Maximize Response Rates

The entire survey process, starting with securing research cooperation from key public school groups and individual sample members and continuing throughout the distribution and collection of individual questionnaires, is designed to increase survey response rates. In addition, the following elements of the data collection plan will contribute to the overall success of the survey and enhance response rates.

  1. Visible support from top-level Federal, State, and local education officials. Without the support of high-level officials in the U.S. Department of Education, State Education Agencies, and the sampled local school districts, surveys of public school principals and teachers cannot be successfully implemented. Obtaining endorsements from these officials is a critical factor in the success of the data collection procedures. Top-level Education Department officials will need to fully support the data collection by endorsing the survey in writing and sending advance letters and notices to sampled districts that require prior research applications and to individual survey participants (principals and teachers) to encourage participation.

  2. Endorsements from key public school groups. The level of interest and cooperation demonstrated by key groups can often greatly influence the degree of participation of survey respondents. Endorsements are viewed as a critical factor in soliciting cooperation from state and local education officials. NCES obtained endorsements for NTPS 2017-18 and will again seek endorsements for NTPS 2019-20 from the following organizations or agencies:

American Association of School Administrators

American Association of School Librarians

American Federation of Teachers

American Montessori Society

American School Counselors Association

Association for Middle Level Education (formerly National Middle School Association)

Association for Supervision and Curriculum Development

Association of American Educators

Council of Chief State School Officers

Council of the Great City Schools

National Association of Elementary School Principals

National Association of Secondary School Principals

National Parent Teacher Association

  3. Endorsements from key private school groups. In addition to the endorsements from key public school groups, NCES also obtained endorsements for NTPS 2017-18 and will again seek endorsements for NTPS 2019-20 from the following private school organizations or agencies:

American Association of School Administrators

Association of Christian Teachers and Schools

Association of Military Colleges and Schools

Christian Schools International

Council for American Private Education

Council of Islamic Schools of North America

Evangelical Lutheran Church in America

Islamic School League of America

Jesuit Schools Network (formerly Jesuit Secondary Education Association)

Lutheran Church-Missouri Synod

National Association of Episcopal Schools

National Association of Independent Schools

National Association of Private Special Education Centers

National Catholic Educational Association

National Christian School Association

National Council for Private School Accreditation

Office of Education, General Conference of Seventh Day Adventists

Oral Roberts University Educational Fellowship

United States Conference of Catholic Bishops

National Parent Teacher Association


  4. Stressing the importance of the survey and the respondents' participation. Official letters will be used to motivate respondents to return surveys. NTPS 2019-20 respondent letters will be sent by the U.S. Census Bureau and signed by the NCES Commissioner. Communications, in the form of both letters and emails, will be personalized for the principal and survey coordinators, which is expected to have a positive effect on survey response rates.

B.3.2 Methods to Minimize Nonresponse

A major challenge in any survey is obtaining high response rates, and this is even more important today when response rates have been falling among federal surveys in general, and in NTPS in particular.

The main problem associated with nonresponse is the potential for nonresponse bias in the estimates produced using data collected from nonrespondents. Bias can occur when respondents are systematically different from nonrespondents. Two approaches that will be used to reduce the potential for bias are designing the data collection procedures and methods wisely to reduce nonresponse (e.g., establishing survey coordinators) and using statistical methods of sampling and weighting to reduce the effect of nonresponse on the estimates. While the statistical approaches are important in controlling biases and costs, the data collection procedures and methods are at the heart of a successful study.

Methods selected to minimize nonresponse in NTPS 2019-20 will build upon those used in NTPS 2017-18, including actions that were taken late in the data collection to boost principal and teacher response rates.

Data Collection Strategies to Minimize Nonresponse

  1. Minimize survey burden on schools. NTPS survey procedures are designed to minimize burden on schools and sampled individuals (principals and teachers), and the survey instruments have been designed to be completed as quickly and easily as possible.

To reduce burden on schools, whenever possible, the TLF (both the electronic version in the NTPS Respondent Portal and the paper TLF) will be pre-populated with vendor teacher roster data, and the school will be asked to verify the teacher information rather than provide it from scratch. Results from NTPS 2017-18 confirmed that providing pre-populated TLFs was successful in reducing burden on sampled schools.

Good questionnaire design techniques have been employed to minimize item nonresponse. Questionnaires from previous rounds of SASS and NTPS were carefully analyzed to determine which items had the highest levels of item nonresponse. This information guided NCES in reviewing the clarity of item wording, definitions, and instructions. Items that were not considered to be effective or useful were removed from the survey so as to streamline the questionnaires and ease the response burden.

A key design feature of NTPS is the ability to link to other NCES collections such as EDFacts and the Civil Rights Data Collection (CRDC). Information from these sources will be incorporated into final datasets to allow researchers and policymakers to analyze those data together. This will further reduce the need to collect from schools data that have already been collected from state or district education agencies.

  2. Recruit survey coordinators. Successive administrations of SASS and NTPS have shown that an important procedure to help maximize response rates is to establish a school-based "survey coordinator" to serve as a primary point of contact for NTPS staff. The use of a survey coordinator is expected to help keep response rates high, provide some minimal data quality checks, and simplify the follow-up process by having one point of contact.

  3. Tailor nonresponse follow-up strategies. In an effort to maximize response rates and minimize the potential for bias, NCES took a number of steps prior to the 2017-18 NTPS to identify high-priority schools to be targeted differently during data collection. The schools identified as high priority had the lowest propensity to respond (based on 2017-18 and 2015-16 NTPS data, as well as SASS data, as described below) and the highest potential impact on estimates.

As in NTPS 2017-18, schools sampled for NTPS 2019-20 will be assigned a “priority” flag based on the weighted response influence of the case. The weighted response influence takes into account both the response propensity and the base weight of a school to create a measure of a school’s potential effect on nonresponse weighting adjustments and final estimates. The weighted response influence can be calculated as:

\[ I_i = \frac{\ln(BW_i)}{\hat{p}_i} \]

where: \(I_i\) is the final weighted response influence for a school,

\(BW_i\) is the base weight for a school, and

\(\hat{p}_i\) is the estimated response propensity for a school.

As the formula shows, a case with either an extremely high weight or an extremely low response propensity has a large response influence, meaning that its nonresponse will disproportionately affect the nonresponse adjustment cell in which it is located. Missing that particular school's information may result in biased estimates (if variables in the propensity model are related to outcomes of interest) and will certainly result in increased variance in the estimates (due to more variable final weights). In order to avoid having extreme weights drive the value of the weighted response influence, the formula takes the natural log of the base weight.
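A minimal sketch of this computation, assuming the formula as reconstructed above and a purely hypothetical priority cutoff:

    import math

    def response_influence(base_weight: float, propensity: float) -> float:
        # The natural log keeps extreme base weights from dominating the value.
        return math.log(base_weight) / propensity

    def flag_priority(schools, cutoff: float):
        """schools: dicts with 'bw' (base weight) and 'phat' (estimated
        response propensity). The cutoff is hypothetical; the actual NTPS
        threshold is not specified in this document."""
        return [s for s in schools
                if response_influence(s["bw"], s["phat"]) > cutoff]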

The weighted response propensity model for NTPS 2017-18 was developed using data from NTPS 2015-16 and SASS 2011-12. Specific categories of variables available for evaluation include geography, urbanicity, racial/ethnic makeup, enrollment, grade levels, percent of free lunch recipients, and type of school. These variables are available in the SASS 2011-12, NTPS 2015-16, and NTPS 2017-18 sample files, enabling us to leverage past experience in creating the response propensity models. The NTPS 2017-18 data collection plan employed propensity modeling to identify high-priority schools and modified collection strategies in order to increase response rates for those schools. Results from the NTPS 2017-18 data showed that the model and strategies used helped counteract declining response rates among those schools, and thus the same propensity model and similar collection strategies will be used in NTPS 2019-20.

The priority flag was assigned at the school level in NTPS 2017-18 and the same will be done for NTPS 2019-20. During data collection, the priority flag was used to move high priority schools and schools without a survey coordinator into field follow-up operations earlier in collection in an effort to boost response rates. Schools in the high priority group generally do not respond until later in the data collection process and ultimately require field intervention.

NTPS 2017-18 data collection for priority schools began with a personal visit from a Census Bureau Field Representative rather than with a series of mailouts and telephone operations. Contacting school staff in person at the beginning of data collection was expected to reduce costs by omitting the mailout and telephone operations that typically precede field operations. In addition, this approach was expected to raise the probability of response by providing the field staff more time to secure the completed questionnaires. The primary focus of the operation was to obtain a complete TLF; however, the Field Representative also delivered the invitations to complete the school and principal questionnaires online. Throughout data collection, NTPS staff reviewed the cases assigned to the field on a daily basis.

NTPS focuses on obtaining cooperation and improving response rates at the school level for a number of reasons. Past administrations of NTPS have shown that when cooperation is obtained at the school level, teachers and principals are more likely to respond. Additionally, evaluations of schools' response propensities have shown that nonresponse in past administrations was driven primarily at the school level. Results showed that schools in special contact districts are the primary driving force behind low response propensity. Special contact districts are those that require additional applications or documentation before data can be collected in their schools. Nearly 80% of the schools with a high propensity for nonresponse reside in these special contact districts. For this reason, resources will continue to be allocated to focus heavily on obtaining approvals from special contact districts in order to boost response rates for this group.

  4. Use vendor lists for teacher sampling. NTPS teacher-level response rates are calculated by multiplying school-level response to the TLF by teacher-level response. In the past, this has meant that if a school did not complete the TLF, teachers from that school could not be sampled, ultimately lowering the teacher response rate. The goal in NTPS 2015-16 and 2017-18 was to improve the overall teacher response rate by allowing NTPS to sample teachers from schools that had not submitted a TLF; therefore, TLFs received from sampled schools were supplemented with vendor-purchased teacher lists. When a vendor-purchased list was unavailable, a clerical operation was conducted to look up teacher information on school and/or district websites. Whenever possible, the TLF was pre-populated with vendor teacher roster data, and the school was asked to verify the teacher information rather than provide it from scratch. The vendor and clerically-researched lists were evaluated in NTPS 2017-18, NTPS 2015-16, and the NTPS 2014 pilot test and showed high levels of comparability to lists obtained directly from schools.

In NTPS 2019-20, whenever possible, TLFs will once again be pre-populated with vendor-purchased teacher lists or lists obtained through a clerical look-up operation using school websites, and schools will be asked to verify the teacher information rather than provide it from scratch. This approach is expected to help improve the overall teacher response rate and, as a final effort to collect data from schools that have not submitted a TLF, allow teacher sampling in those schools.
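To make the rate arithmetic concrete (the numbers here are hypothetical, chosen only for illustration): a school-level TLF response rate of 80% combined with a teacher-level rate of 85% yields

\[ RR_{\mathrm{teacher}} = RR_{\mathrm{TLF}} \times RR_{\mathrm{TQ}} = 0.80 \times 0.85 = 0.68. \]

A school that never provides a list contributes zero to the first factor, so sampling from vendor or clerical lists raises the achievable overall teacher response rate.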

  5. Monitor publishability and bias measures. For NTPS 2017-18, NCES monitored data collection progress throughout survey operations in order to identify and potentially minimize problems with nonresponse. The Census Bureau created weekly "publishability" reports from its data collection tracking system that showed whether key analysis cells were large enough to provide publishable estimates as of that point in time. By monitoring this publishability metric, NCES was able to identify populations of schools for which nonresponse hampered reporting. These results will be considered in designing the sample and nonresponse follow-up strategies for NTPS 2019-20. NCES also monitored R-indicators, a measure of representativeness (or lack of bias) in the respondent population, on a weekly basis. The closer the R-indicator is to 1, the more balanced the respondent population. Toward the end of data collection in 2017-18, the R-indicator for the full sample indicated that the respondent population was fairly well balanced. NCES plans to continue to monitor these two indicators in NTPS 2019-20.
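The R-indicator is commonly defined as one minus twice the standard deviation of the estimated response propensities; a minimal unweighted sketch follows (the production version would use survey-weighted propensities):

    import statistics

    def r_indicator(propensities):
        """R = 1 - 2 * SD of estimated response propensities. Values near 1
        indicate a balanced (representative) respondent pool."""
        return 1.0 - 2.0 * statistics.pstdev(propensities)

    # Identical propensities give perfect balance:
    assert r_indicator([0.7, 0.7, 0.7]) == 1.0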

  6. Personalize principal contact materials. As was done in NTPS 2017-18, to maximize the chances that all mailed NTPS 2019-20 materials intended for the school principal successfully make it to the principal, all principal contact materials will be personalized with the principal's name. Principals' names are obtained from vendor-purchased school staff lists. If a principal's name is not available from the vendor, clerical staff research this information using school and district websites.

  7. Use of email to target principals, survey coordinators, and teachers. NTPS 2017-18 demonstrated that email was an effective tool to drive participation in both the NTPS teacher and principal surveys. It showed that teacher email addresses could be effectively collected on the TLF, from school websites, and from vendor lists of teachers; that principal email addresses could be effectively collected from school websites and from vendor-purchased school data; and that survey coordinator email addresses could be effectively collected during the screener interview. Because personalized emails carry no cost and may help boost response, teachers, principals, and survey coordinators will be contacted via email throughout NTPS 2019-20 data collection. The emails will include login information to access the NTPS online survey instruments, in addition to text inviting and subsequently reminding these respondents to complete their survey online.

  8. Use of additional reminder emails to teachers. Previous NTPS cycles showed that response rates for late-sample-wave teachers leveled off and even appeared to be lower than those for earlier waves of teachers. This may have been a product of the timing of school testing and late-school-year activities, because late-sample-wave teachers received an invitation to complete the survey during a period with a heavy school workload. It may also have been because the late-sampled teachers were in schools that were either late responders to the TLF or TLF nonrespondents (in instances where teachers were sampled from a teacher roster obtained from clerical research or the vendor data) and therefore may have had less support and encouragement from their principals and/or survey coordinators to complete their questionnaires. Given that additional reminder emails carry no cost and may help response rates, as in NTPS 2017-18, three (or more) reminder emails will be sent to nonresponding teachers during NTPS 2019-20 data collection. The maximum number of reminder emails will be specified in the NTPS 2019-20 Main Study data collection submission in December 2018.

  9. Send a "letter of better understanding" to principals and teachers. After the 2015-16 NTPS collection, field representatives and the regional offices recommended sending "letters of better understanding" to principals and teachers who may be hesitant to complete the survey, to help them gain a better understanding of the study by providing information about how the data are used and referencing some of the published data from NTPS First Look reports. These letters will be sent to principals and teachers in priority schools, which tend to exhibit high nonresponse.

  10. Telephone and field follow-up operations for late-sampled teachers. NTPS 2017-18 included two additional follow-up operations aimed at collecting completed questionnaires from nonresponding teachers sampled in the later data collection waves (17-20). In previous NTPS cycles, late-sampled teachers were not eligible for inclusion in telephone follow-up and/or field follow-up. During the phase 2 telephone follow-up operation, telephone center staff called late-sampled teachers to remind them to complete their questionnaire and, whenever possible, collected the interview over the phone. During the phase 4 field operation, Field Representatives made personal visits to the schools to drop off the paper form(s) and schedule a time to pick up the completed forms. Additionally, both of these operations targeted domains with publishability risks (e.g., teachers in city and charter schools).

  11. Consider new methods of minimizing nonresponse. NCES is considering a number of additional methods to minimize nonresponse in NTPS 2019-20, including the continued use of incentives. Previously, debit cards and cash were the main forms of incentives used to minimize nonresponse. For NTPS 2019-20, additional non-monetary incentives are being considered as a tool to further increase response rates. NTPS 2017-18 included an incentive experiment for teachers and survey coordinators and also included a contingency plan incentive experiment that targeted domains at risk of not meeting NCES publishability standards. This was one of a few experiments designed to examine the effectiveness of offering teachers a monetary incentive to boost overall teacher response. Further information about incentives, as well as experiments related to mailed materials and messaging, is provided in section B.4.2 of this document.

Statistical Approaches to Nonresponse

One of the methods employed to reduce the potential for nonresponse bias is adjustment of the sample weights to account for nonresponse. If schools or teachers with certain characteristics are systematically less likely than others to respond to a survey, the collected data may not accurately reflect the characteristics and experiences of the nonrespondents, which can lead to bias. To adjust for this, respondents are assigned weights that, when applied, result in them representing their own characteristics and experiences as well as those of nonrespondents with similar attributes. The school weights are also raked to sample-based control totals in order to maintain the background characteristics of the sample; this is another method used to reduce the potential for nonresponse bias in the estimates produced from the data.
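A minimal sketch of a weighting-class nonresponse adjustment of the kind described above; the cell definitions and field names are illustrative assumptions:

    from collections import defaultdict

    def nonresponse_adjust(cases):
        """Within each adjustment cell, inflate respondent weights by
        (sum of weights, all cases) / (sum of weights, respondents), so that
        respondents also represent similar nonrespondents.
        cases: dicts with 'cell', 'w' (base weight), and 'resp' (bool)."""
        total_w, resp_w = defaultdict(float), defaultdict(float)
        for c in cases:
            total_w[c["cell"]] += c["w"]
            if c["resp"]:
                resp_w[c["cell"]] += c["w"]
        return [dict(c, w_adj=c["w"] * total_w[c["cell"]] / resp_w[c["cell"]])
                for c in cases if c["resp"]]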

Response rates will be computed for the TLF, the School Questionnaire, the Principal Questionnaire, and the Teacher Questionnaire. Data collected through any instrument with a response rate of less than 85 percent will be evaluated for nonresponse bias. In addition to comparing the characteristics of respondents and nonrespondents using data that are available from the sampling frames (for example, school type and school locale from the school frame), we will also compare NTPS 2019-20 estimates to estimates from previous rounds of NTPS and SASS. A methodology report covering NTPS 2019-20 will be developed and released, and will describe the methods and results of the nonresponse bias analysis.
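
As a simple illustration of this evaluation step, the sketch below computes an unweighted unit response rate and, where it falls below the 85 percent threshold, compares a frame characteristic (school locale) between respondents and nonrespondents. All records and values are invented for illustration.

```python
import pandas as pd

# Hypothetical sample file: one row per sampled school, with a frame
# variable (locale) and a response indicator. Values are invented.
sample = pd.DataFrame({
    "locale":    ["city"] * 4 + ["suburban"] * 3 + ["rural"] * 3,
    "responded": [1, 0, 1, 0, 1, 1, 1, 0, 1, 1],
})

rate = sample["responded"].mean()
print(f"Unweighted unit response rate: {rate:.0%}")

# NCES standards call for a nonresponse bias analysis below 85 percent;
# one basic check compares frame characteristics by response status.
if rate < 0.85:
    print(sample.groupby("responded")["locale"]
                .value_counts(normalize=True))
```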

B.4 Tests of Methods and Procedures

The SASS/NTPS series of studies has a long history of testing materials, methods, and procedures to improve the quality of its data. Section B.4.1 describes those tests that have most influenced the NTPS design, beginning with the 2014-15 NTPS Pilot Test and continuing through NTPS 2017-18. Section B.4.2 describes experiments proposed for NTPS 2019-20.

B.4.1 Tests Influencing the Design of NTPS 2019-20

2014-15 NTPS Pilot Test

Five experiments designed to optimize the design of the 2015-16 NTPS were conducted as part of the 2014-15 NTPS Pilot Test: 1) the Questionnaire Mode Experiment, 2) the TLF Email Experiment, 3) the Invitation Mode Experiment, 4) the Teacher Questionnaire Instruction Experiment, and 5) the Vendor Analysis. Each of these experiments is briefly described below, along with its results and implications for successor NTPS data collections.

  1. Questionnaire Mode Experiment. This experiment was designed to determine whether paper questionnaires or Internet survey instruments (i.e., mail-only versus internet sequential modes) constituted the most effective mode of collecting the TLF, School Questionnaire, and Principal Questionnaire. For all three survey instruments, the schools assigned to the paper mode had higher response rates than the schools assigned to the internet mode.

Some known issues with data collection could have impacted these response rates. First, the pilot test did not use survey coordinators, a method shown to boost response rates in SASS. Second, there were problems related to the contact materials for the internet treatment groups. As a result of this experiment, NTPS 2015-16 was primarily paper based; used improved contact materials and login procedures; and included an experimental sample of 1,000 schools, outside the main study, which were offered the internet survey at the onset of data collection and which followed standard production NTPS procedures, including the establishment of a survey coordinator.

  2. Teacher Listing Form (TLF) Email Experiment. This experiment was designed to assess the feasibility of collecting teacher email addresses on the TLF and the quality of those collected. The pilot test design included a split-panel experiment, with half of sampled schools randomly assigned to receive a TLF that included a request for teachers’ email addresses and the other half to receive a TLF that did not request email addresses. At the end of data collection, response rates were comparable between the schools that received the TLF with the email address field and the schools that received the TLF without the email address field. As a result of this experiment and the Invitation Mode Experiment described below, NCES used the TLF with the email address field in NTPS 2015-16 and plans to continue to do so for NTPS 2017-18.

  3. Invitation Mode Experiment. The purpose of this experiment was to identify which of three methods of inviting teachers to complete the Teacher Questionnaire yielded the best response rates. Schools were randomly assigned to the following invitation modes: 1) both email and mailed paper invitation letters to complete the internet instrument (treatment A), 2) a mailed paper invitation letter to complete the internet instrument only (treatment B), and 3) a mailed package that included a letter and a paper questionnaire (treatment C). The results of the experiment indicated that a strategy using a combination of email and paper invitations (treatment A) is best for inviting teachers to complete the internet questionnaire; the response rate for treatment group A was comparable to that of treatment group C, which received only mailed paper materials. As a result of this experiment, teachers sampled for NTPS 2015-16 for whom we had a valid email address were sent both email and paper invitations as the initial request to fill out the Teacher Questionnaire. Teachers without valid email addresses were sent their initial invitation as part of a mailed package that included a paper copy of the survey. For the 2017-18 NTPS, NCES plans to push for web response through both mailed and emailed correspondence, switching to a paper questionnaire at the third mailing.

  4. Teacher Questionnaire Instruction Experiment. This experiment was designed to determine (1) whether including instructions in the NTPS questionnaire impacts response rates for questionnaire items and data quality, and (2) whether the position, format, and presence or absence of a preface in the instruction impacts response rates for questionnaire items. NCES is currently analyzing the results from this experiment and plans to incorporate these findings in a future NTPS administration.

  5. Vendor Analysis. The purpose of this experiment was to evaluate both the feasibility of collecting teacher lists from a vendor and the reliability of the purchased information to see whether it could be used to supplement or replace school-collected TLFs. NCES purchased teacher lists from a vendor for schools sampled for the 2014-15 NTPS pilot test. The vendor teacher lists were compared with information collected from the TLFs. The results suggested that the vendor list information was comprehensive and reliable at a relatively low cost. NCES used vendor lists to sample teachers from a subset of schools that did not respond to the TLF in NTPS 2015-16 and plans to use vendor lists for the 2017-18 NTPS.

NTPS 2015-16 Full-Scale Collection

  1. Schools and Principals Internet Test. The 2015-16 NTPS included an Internet experiment for schools and principals, which was designed to test the efficacy of offering an internet response option as the initial mode of data collection, as done previously in the Questionnaire Mode Experiment included in the 2014-15 NTPS Pilot Study, described earlier.

Key differences exist between the 2014-15 and 2015-16 NTPS internet experiments, the most notable being that the 2015-16 experiment included the use of a survey coordinator at the school as well as improved respondent contact materials and mailout packaging. In the 2015-16 NTPS, an independent sample of 1,000 public schools was selected for this experiment, which invited schools and principals to complete the NTPS school-level questionnaires using the internet at the first and second contacts by mail. A clerical operation prior to data collection obtained email addresses for sampled principals assigned to the internet treatment. Principals were sent emails as an initial invitation to complete the NTPS questionnaires, as well as reminder emails; these emails were sent a few days after the corresponding mailings.

Paper questionnaires were offered at the third and final mailout. Data collection for the internet treatment concluded after the third mailing, so the schools in the experimental treatment did not receive a fourth mailing and were not included in the telephone follow-up or field follow-up operations. When comparing the response rates for all three survey instruments at the end of the reminder telephone operation – the most reasonable time to make the comparison – and removing the cases that would have qualified for the early field operation, the response rates for schools assigned to the internet treatment were five to six percent higher than those for the paper treatment. Therefore, during the 2017-18 NTPS data collection, the initial mailout will invite respondents to complete online questionnaires for all questionnaire types. Paper questionnaires will be introduced during the third mailing. Principal email addresses (purchased from the vendor) and school-based survey coordinator email addresses (collected at the time the survey coordinator is established) will be utilized during data collection. Invitations to complete the principal and school questionnaires via the internet response option will be sent to the principal and school-based survey coordinator by email in conjunction with the various mailings.

  2. Contact Time Tailoring Experiment. This test was designed to determine the optimal contact time for teachers. During the telephone nonresponse follow-up (NRFU) operation, interviewers contacted nonresponding principals and teachers to remind them to complete their questionnaires. Teachers tend to be difficult to reach during the school day due to their teaching schedules. NCES staff hypothesized that teachers may be easier to reach by phone in the late afternoon, after school had been dismissed. To test this hypothesis, an experiment was embedded in the telephone nonresponse follow-up operation. A portion of the NRFU teacher workload received an experimental treatment in which teachers were intended to be contacted only in the afternoon, between 2:00 p.m. and 5:00 p.m. (respondent time). The remainder of the NRFU teacher workload functioned as the control group; these teachers were intended to receive contacts throughout the school day, per typical telephone follow-up procedures. The research questions this test was designed to answer were as follows:

  1. Are afternoons more productive for calling teachers?

  2. If not afternoons, are there more productive times than others for calling teachers?

  3. Do productive contact times for teachers hold globally, or do different types of schools have different productive call time frames?

  4. Can we use school-level frame information (e.g. urbanicity, school size, grade level) to help tailor call times in future rounds of data collection?

  5. If the calls are being made at “productive times,” are fewer call attempts required to successfully make contact with the teacher?

  6. If the calls are being made at “productive times,” are fewer call attempts and total contacts required to obtain a completed interview?

Operational challenges were encountered in conducting the call time experiment. Early in the telephone nonresponse follow-up operation, telephone interviewers reported that school staff members were complaining about receiving multiple calls to reach the sampled teachers. School staff members indicated that they would prefer to know the names of the teachers the interviewer needed to reach so that they could assist the interviewer in as few phone calls as possible. As a result, the experiment could not be evaluated as intended. Instead of comparing the success of reaching the sampled teachers by treatment group, staff compared the success rates of the actual call times. Call times were categorized as ‘early’ (before 2:00 p.m.) or ‘late’ (between 2:00 p.m. and 5:00 p.m.). There was not a noticeable difference in the success rates of contacting teachers by call time. Additional analyses of the data may be conducted to help inform future administrations of NTPS.
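
The post-hoc comparison described above reduces to comparing contact success rates across the two call-time categories. Below is a minimal sketch of such a comparison using a two-proportion z-test; the attempt and contact counts are invented and do not reflect the actual operation's data.

```python
# Sketch of the post-hoc comparison of contact success by actual call time.
# The attempt and contact counts below are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

contacts = [96, 88]     # successful contacts: early (before 2 p.m.), late
attempts = [400, 350]   # call attempts in each category

for label, c, n in zip(["early", "late"], contacts, attempts):
    print(f"{label}: {c / n:.1%} contact success rate")

# Two-proportion z-test for a difference in success rates.
stat, pval = proportions_ztest(contacts, attempts)
print(f"z = {stat:.2f}, p = {pval:.3f}")
```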

NTPS 2017-18 Full-Scale Collection

To address declining response rates among teachers in NTPS 2015-16, NCES tested the use of incentives to increase response in NTPS 2017-18. In addition, NTPS 2017-18 included a private school test that was designed to (a) provide accurate estimates for teachers and principals in private schools in the U.S. and (b) to examine the effects of strategies to improve response in this population. The results of these experiments are still being evaluated and will be included in the NTPS 2019-20 Main Study submission in December 2018.

  1. Testing the use of teacher incentives. The 2017-18 NTPS included an incentive experiment designed to examine the effectiveness of offering teachers a monetary incentive to boost overall teacher response. Teachers were incentivized during the first 12 waves of teacher sampling; during the remaining waves, a combination of teachers, school coordinators, and/or principals was incentivized. During the first 12 waves of teacher sampling, teachers were sampled only from returned TLFs. However, beginning in wave 13, teachers could be sampled from returned TLFs, vendor lists, or internet look-ups. This change in the teacher sampling procedure provided a natural breakpoint between the two phases of the experiment and allowed us to target the most challenging cases with an additional incentive for the school coordinator or principal.

  2. Testing the use of incentives as part of a contingency plan. NTPS 2017-18 experimented with offering an incentive to teachers who belonged to a domain that was determined to be ‘at risk’ of not meeting NCES reporting, or publishability, standards toward the end of data collection (by February 12, 2018). NCES monitored actual and expected response in each of the key domains on a weekly basis (a hypothetical sketch of this type of monitoring appears after this list). The contingency plan was to be activated in the experimental group only if needed; based on publishability reports, it was deemed needed and was activated. The control group was not eligible to receive the contingency incentive. Although the plan was aimed at improving teacher response rates, teachers within a school were likely to discuss the study, so schools were selected based on meeting the criteria of the at-risk domain and all teachers within a selected school were subject to the same treatment (experimental or control). This approach was based on the assumption that if some teachers in a school received an incentive and others did not, it would negatively impact current and future response from that school. At the time the incentive was activated, some teachers at a selected school had already responded to NTPS; such teachers, if assigned to the contingency incentive treatment, were provided the incentive as a “thank you” for their participation. For all other teachers in the school, the same incentive was prepaid and not conditional on their response. Because schools were selected for the contingency plan incentive based on the number of teachers in the at-risk domain, selection for this incentive was independent of the main NTPS incentive experiment. Consistent with the other NTPS 2017-18 procedures, the incentive amount varied between priority and non-priority schools: teachers in selected non-priority schools received $10 with their third mail-out or thank-you letter, and teachers in selected priority schools received $20 with their third mail-out or thank-you letter.

  3. Private School Test. In NTPS 2017-18, NCES conducted an embedded test with private schools both to determine whether sufficient response could be achieved to provide reliable estimates for private schools and to evaluate specific methods for improving response rates. The private schools selected for this test experienced data collection procedures that were generally similar to those used with the NTPS 2017-18 public school sample. Some procedures were adjusted to accommodate differences specific to this sector (e.g., religious holidays and schedules). Preliminary results indicate that the private school data collected during NTPS 2017-18 will yield publishable estimates; however, our evaluation of the data will not be completed until after the NTPS 2017-18 data collection period ends. The final results will be provided in the NTPS 2019-20 Main Study data collection submission in December 2018.
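
As a hypothetical sketch of the weekly monitoring described in item 2 above, the following compares each key domain's observed cumulative response rate to its projection for that week and flags domains whose shortfall exceeds a tolerance. The domain names, rates, and tolerance are illustrative assumptions, not NCES's actual publishability rules.

```python
# Hypothetical weekly check for domains "at risk" of missing a reporting
# standard. Domains, rates, and the tolerance are illustrative only.
expected = {"city charter": 0.62, "rural public": 0.70}  # projected by now
actual   = {"city charter": 0.48, "rural public": 0.69}  # observed to date

TOLERANCE = 0.05  # assumed allowable shortfall before activating the plan

for domain, projected in expected.items():
    shortfall = projected - actual[domain]
    if shortfall > TOLERANCE:
        print(f"{domain}: {shortfall:.0%} behind projection -> "
              "activate contingency incentive")
```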

Coordinated special district operations. NCES conducts several school-based studies within the NCES legislative mandate to report on the condition of education, including, among others, NTPS, the School Survey on Crime and Safety (SSOCS), and the National Assessment of Educational Progress (NAEP). A critical step for data collection is to obtain approval from public school districts that require it before a study can be conducted with students, teachers, and/or staff. The number of such special contact districts is steadily increasing. This poses a barrier to successful data collection because many districts and schools have complex and lengthy approval processes, reject all outside research, or review applications for outside research only once a year. This has contributed to lower response rates for non-mandatory NCES surveys. NCES continues to examine how different program areas, both within NCES and in other federal agencies, seek approval from PreK-12 public districts and schools in order to identify best practices and make recommendations for current and future operations, including those for NTPS 2019-20.

B.4.2 Tests Included in the Design of NTPS 2019-20

NCES is currently considering options for tests of methods, materials, and procedures to be conducted as part of NTPS 2019-20. NTPS 2019-20 is still in the planning stages, and a description of all data collection operations and tests, including those listed below, will be provided in the NTPS 2019-20 Main Study data collection submission in December 2018.

  1. Further testing the use of teacher incentives.

For NTPS 2019-20, both non-monetary and monetary incentives are being considered as a tool to increase response rates. NTPS 2017-18 included an incentive experiment for teachers and survey coordinators and also included a contingency plan incentive experiment that targeted domains ‘at risk’ of not meeting NCES publishability standards. NTPS 2019-20 will incorporate further testing of monetary incentives as well as testing of non-monetary incentives for the first time.

  2. Testing new mailed package contents and packaging.

In an effort to both increase response rates and lower mailing costs, NTPS 2019-20 will explore whether new types of mailed materials will yield higher response rates. In previous NTPS administrations, teacher questionnaires and instructions to complete them online were sent to sampled teachers in standard business envelopes. In NTPS 2019-20, a randomized experiment will compare the effects of using business envelopes vs. pressure-seal mailing materials.

  3. Tailored Materials.

Respondents sampled for NTPS receive letters and e-mails that emphasize the importance of their participation in the survey, but these materials have not emphasized the ways in which NTPS data inform researchers and policymakers. In NTPS 2017-18, the statement “Public school teachers provided an average of 27 hours of instruction to students during a typical week in the 2015-16 school year. What about you?” was added to the outside of Third Reminder Teacher Letter envelopes for the final wave of sampled public school teachers. In NTPS 2019-20, qualitative research will explore which statistics are most salient to different types of respondents, and similar statements will be placed on materials sent to respondents, such as the outside of envelopes or enclosed letters, to determine whether targeted, persuasive messaging can increase response rates.

B.5 Individuals Responsible for Study Design and Performance

The following individuals are responsible for the NTPS 2019-20 study design, data collection, and analysis: Maura Spiegelman, Deanne Swan, and Andy Zukerberg at NCES; Shawna Cox, Walter Holmes, Mary Davis, and Aaron Gilary at U.S. Census Bureau; and David Marker, Lou Rizzo, and Minsun Riddles at Westat.
