Part B NTPS 2017-18 Preliminary Field Activities


National Teacher and Principal Survey of 2017-2018 (NTPS 2017-18) Preliminary Field Activities

OMB: 1850-0598


National Teacher and Principal Survey

of 2017-2018 (NTPS 2017-18)

Preliminary Field Activities



OMB# 1850-0598 v.16



Supporting Statement

Part B





National Center for Education Statistics

U.S. Department of Education



July 2016

Revised October 2016






Part B. Collection of Information Employing Statistical Methods

This request is to contact districts and schools in order to begin preliminary activities for the NTPS 2017-18 collection, which include (a) contacting and seeking research approvals from special handling districts, where applicable, and (b) notifying sampled schools of their selection for the survey and verifying their mailing addresses. This is the initial clearance request that will be submitted for NTPS 2017-18. The sampling design and plan, the special contact district recruitment protocol and materials (including the use of incentives at the district level to encourage district approval and participation), and the possibility of NTPS 2017-18 collaborating with TALIS and NAEP are currently being researched. If they prove desirable, they will be submitted to OMB as a change request in late 2016. The full NTPS 2017-18 OMB clearance request, addressing all remaining data collection activities, will begin its public comment period in early 2017.

Section B.1 describes the universe, sample design, and estimation details for NTPS 2017-18. Section B.2 describes the data collection procedures for the preliminary field activities. Section B.2 also includes a description of the full-scale data collection procedures for NTPS 2015-16, because the operations for NTPS 2017-18 are not yet fully specified. Details of NTPS 2017-18 operations will be included in the data collection clearance request in early 2017. Section B.3 discusses methods to secure cooperation and mitigate nonresponse. In particular, it describes methods used to improve response rates in NTPS 2015-16 and how those methods will be used in NTPS 2017-18. Section B.4 describes recent developments in a long history of tests of methods and procedures to improve data quality. It also includes a description of a test to include private schools in NTPS. Section B.5 lists the names and phone numbers of those involved in the design of the study and the development of these materials.

B.1 Universe, Sample Design, and Estimation

Section B.1.1 includes information on the study universe of interest and sample design planned for NTPS 2017-18. Section B.1.2 describes the precision requirements and target sample sizes set out for the study.

B.1.1 Universe and Sample Design: Respondent Universe

B.1.1.1 Schools

The respondent universe for NTPS 2017-18 data collection consists of approximately 95,750 public schools in the 50 U.S. states and the District of Columbia (DC) that offer instruction in any of grades K-12. To be eligible for inclusion in the sample, a school must provide classroom instruction to students, have one or more teachers to provide instruction, serve students in at least one of grades 1-12 or the ungraded equivalent, be located in one or more buildings, and be located in the United States rather than in the outlying areas or U.S. territories.
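These eligibility rules amount to a simple record filter over the school frame. The sketch below applies them to illustrative CCD-style records; the field names are hypothetical, not the actual CCD layout.

```python
# Illustrative frame-eligibility screen for the rules in Section B.1.1.1.
# Field names are invented for this sketch, not the real CCD variables.

def is_ntps_eligible(school):
    """Apply the NTPS school-eligibility rules described above."""
    return (
        school["provides_instruction"]           # classroom instruction to students
        and school["teacher_count"] >= 1         # one or more teachers
        and school["serves_grades_1_12"]         # grades 1-12 or ungraded equivalent
        and not school["kindergarten_terminal"]  # K-terminal schools are excluded
        and school["in_50_states_or_dc"]         # no outlying areas or territories
    )

frame = [
    {"provides_instruction": True, "teacher_count": 12, "serves_grades_1_12": True,
     "kindergarten_terminal": False, "in_50_states_or_dc": True},
    {"provides_instruction": True, "teacher_count": 3, "serves_grades_1_12": False,
     "kindergarten_terminal": True, "in_50_states_or_dc": True},
]
eligible = [s for s in frame if is_ntps_eligible(s)]
print(len(eligible))  # 1
```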

NCES’ 2015-16 Common Core of Data (CCD) will be used to construct the public school frame. The respondent universe for charter schools consists of those public charter schools on the CCD that meet the NTPS definition of an eligible school. The universe has been adjusted to remove kindergarten-terminal schools, which are not eligible for NTPS. Because the 2015-16 CCD was not available at the time this package was submitted, Table 1 presents the number of public schools on the 2008-09 CCD by region and school level. Details of the first-stage sample design of schools are provided in section 2.

Table 1. Respondent universe by school level and region for the proposed public school sample, based on the 2008-09 CCD

 

                           School level
Region        Primary    Middle      High   Combined      Total
Northeast       8,590     2,689     2,924        796     14,999
Midwest        13,497     4,404     6,205      1,565     25,671
South          17,368     6,207     6,381      2,582     32,538
West           12,526     3,492     4,871      1,653     22,542
Total          51,981    16,792    20,381      6,596     95,750

SOURCE: 2008-09 CCD.


B.1.1.2 Teachers

Teachers will be randomly sampled within the second design stage from roster information provided by each participating sampled school. Teachers within the sampled school are classified as ineligible for NTPS if they are a short-term substitute teacher, student teacher, a teacher’s aide, or do not teach any of grades K-12 or comparable ungraded levels. This information is obtained from the Teacher Questionnaire. Details of the second-stage sample design of teachers are provided in section 2.

B.1.2 Precision Requirements and Sample Sizes

This section details the school sample sizes and precision requirements for the NTPS 2017-18 public school sample. Details about the teacher sample will be provided in the NTPS 2017-18 full-scale OMB clearance package in early 2017.

The final NTPS 2017-18 sample will include approximately:

  • 9,300 schools and school principals (7,900 traditional public and 1,500 public charter); and

  • 43,000 teachers (38,000 traditional public and 3,500 public charter).

Sampling – Public Schools

The level of precision achieved by NTPS 2015-16 was evaluated to inform the sample design decisions for NTPS 2017-18. In particular, publishability and bias indicators (described in Section B.3.2) were reviewed in order to improve the school sample design for the 2017-18 NTPS. A key change in NTPS 2017-18 from NTPS 2015-16 is the inclusion of additional sample to permit the publication of state estimates. The 2017-18 NTPS oversampling stratification will be based preliminarily on the following domains:

      • Charter/Non-charter;

      • School Level (primary, middle, high, combined);

      • Urbanicity (city, suburb, town, rural);

      • School enrollment (four levels: schools with enrollment less than 100; schools with enrollment between 100 and 199; schools with enrollment 200 to 499; schools with enrollment 500 or more);

      • State Tier, state.

The NCES standards for publishability indicate that the coefficient of variation (CV) must be no larger than 50%, and if the CV is between 30% and 50%, the estimates are published with a caveat. For a population proportion of 20%, a CV of 30% corresponds to a standard error of 6%. Our minimal goal for each state is to make sure that the expected standard error is no larger than 6% for a population proportion of 20% (a CV of 30%), at both the school and teacher level.1 Table 2 presents a portion of the analysis for public schools by school type, grade level, urbanicity, and poverty status. Presented are the anticipated number of responding schools or principals for the NTPS design and the expected precision based on an analysis of NTPS 2015-16.
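The CV-to-standard-error conversion used above is straightforward arithmetic (CV = SE / p, so SE_max = CV × p), sketched here:

```python
# The publishability arithmetic described above: for an estimated proportion p,
# the coefficient of variation is CV = SE / p, so the largest standard error
# that stays within a given CV threshold is SE_max = CV * p.

def max_standard_error(p, cv_threshold):
    """Largest standard error keeping an estimated proportion p within the CV threshold."""
    return cv_threshold * p

# NCES publishability bounds for a population proportion of 20%:
caveat_bound = max_standard_error(0.20, 0.30)       # publish with a caveat above this SE
publishable_bound = max_standard_error(0.20, 0.50)  # not publishable above this SE
print(round(caveat_bound, 4), round(publishable_bound, 4))  # 0.06 0.1
```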

Table 2. School-domain expected interviews, standard errors, and design effects with state oversampling to achieve 30% CV or less


Domain         Frame schools   Total sample size   Expected completes   Expected standard error   Design effect
All                   95,464               9,281                5,957                     0.65%            1.58
Charter                6,254               1,414                  786                     1.71%            1.44
Non-charter           89,210               7,867                5,171                     0.69%            1.52
Primary               52,868               4,060                2,719                     0.91%            1.40
Middle                14,912               1,528                1,049                     1.45%            1.38
High                  21,199               2,409                1,505                     1.42%            1.89
Combined               6,485               1,284                  683                     2.06%            1.82
City                  25,818               3,341                1,757                     1.20%            1.59
Suburban              29,900               2,802                1,785                     1.16%            1.49
Town                  12,785               1,165                  908                     1.66%            1.57
Rural                 26,961               1,973                1,507                     1.31%            1.61
High poverty          23,604               2,551                1,417                     1.29%            1.48
Low/med pov           71,860               6,730                4,540                     0.75%            1.61


Table 2.1b provides the analogous precision analysis for public school teachers. The expected standard errors were calculated based on NTPS 2015-16 and scaled to the expected NTPS 2017-18 number of respondents.

Table 2.1b. Major domain expected teacher interviews, standard errors, and design effects with state oversampling to achieve 30% CV or less


Domain         Frame FTE (in 1,000s)   Total sample size   Expected completes   Expected standard error   Design effect
All                          3,088.3              41,652               30,721                     0.42%            3.31
Charter                        132.7               4,674                3,388                     1.13%            2.69
Non-charter                  2,955.5              36,978               27,333                     0.43%            3.18
Primary                      1,490.6              16,351               12,257                     0.62%            2.91
Middle                         543.2               7,883                5,828                     0.94%            3.20
High                           908.4              13,191                9,517                     0.79%            3.69
Combined                       146.0               4,227                3,118                     1.21%            2.86
City                           904.8              12,994                9,259                     0.77%            3.44
Suburban                     1,187.4              13,945               10,254                     0.72%            3.30
Town                           364.2               5,813                4,428                     1.04%            2.99
Rural                          631.8               8,900                6,780                     0.85%            3.08
High poverty                   721.1               9,660                6,889                     0.86%            3.20
Low/med pov                  2,367.2              31,992               23,831                     0.47%            3.35


Sampling – Teachers within All Schools

Teachers will be sampled from roster information provided by each participating sampled school. Details on the teacher sample will be provided in the NTPS 2017-18 full-scale OMB clearance package submitted in early 2017.

Sampling – Principals within All Schools

For each sampled traditional public and public charter school, the principal will be included in the survey as a result of the school being selected.

Survey Weights

Schools, principals, and teachers will be weighted by the inverse of the probability of selection. The final weight will contain adjustments for nonresponse and any other sampling or field considerations that arise after the sample has been drawn.
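As a rough illustration of the two weighting steps just named (base weight as the inverse of the selection probability, then a cell-level nonresponse adjustment), consider the sketch below; the adjustment-cell structure and values are invented, and real NTPS weighting involves additional adjustments.

```python
# Minimal sketch of base weighting and a cell-level nonresponse adjustment.
# The cell and its values are illustrative, not NTPS data.

def base_weight(selection_probability):
    """Base weight is the inverse of the probability of selection."""
    return 1.0 / selection_probability

def nonresponse_adjusted_weights(schools):
    """Within an adjustment cell, respondents absorb the base weight of nonrespondents."""
    total = sum(base_weight(s["prob"]) for s in schools)
    responding = sum(base_weight(s["prob"]) for s in schools if s["responded"])
    factor = total / responding
    return {s["id"]: base_weight(s["prob"]) * factor for s in schools if s["responded"]}

cell = [
    {"id": "A", "prob": 0.10, "responded": True},   # base weight 10
    {"id": "B", "prob": 0.10, "responded": False},  # base weight 10, nonrespondent
    {"id": "C", "prob": 0.05, "responded": True},   # base weight 20
]
print(nonresponse_adjusted_weights(cell))  # adjusted weights sum to the cell total of 40
```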

Response Rates

We expect the NTPS 2017-18 response rates to approximate those of NTPS 2015-16 and SASS 2011-12, or to be lower given the long-term decline in response rates for federal surveys. Table 3 provides the base-weighted response rates for SASS 2011-12, as the final base-weighted response rates for NTPS 2015-16 have not yet been calculated.

Table 3. Base-weighted response rates for SASS 2011-12 by respondent and school type

                       Unit of observation
School Type           Teacher   Principal    School
Traditional Public     77.92%      72.90%    72.68%
Charter                70.36%      69.67%    69.15%



B.2 Procedures for the Collection of Information

Section B.2.1 describes the procedures for the preliminary field activities for NTPS 2017-18, comprising the special districts operation and the school pre-contact letter. The remaining sections describe the data collection procedures used in NTPS 2015-16. The final data collection procedures for NTPS 2017-18 are currently under development. They will be fully specified in the data collection OMB clearance submission in early 2017. Section B.2.2 describes data collection procedures for the Teacher Listing Form. Section B.2.3 describes procedures for the School Questionnaire, Principal Questionnaire, and Teacher Questionnaire.

B.2.1 Preliminary Field Activities

Special Contact District Operation

To customize the special district contact operations for the NTPS 2017-18 data collection, NCES recently contracted with Avar Consulting and its subcontractor, Westat, to implement a more individualized approach to the research applications required by those districts. Because the contract for this work was awarded in September 2016, changes may be made to the materials and protocol involved in the special district contact operations described here; such changes will be submitted to OMB for clearance in fall 2016 as a change request. The NTPS 2017-18 special district operations will also differ from those implemented in NTPS 2015-16 in that NCES will simultaneously apply for approval to conduct the 2017-18 NTPS and the 2018 SSOCS in districts where schools have been sampled for both studies.

However, contacted districts will be able to approve or deny each study independently. Previously, the special district contact operations were conducted independently for NTPS and SSOCS, resulting in two contacts for districts selected for both surveys. Districts are identified as ‘special districts’ prior to data collection either because they were flagged as such during previous cycles of SASS, NTPS, SSOCS, or other NCES studies, or because they were identified while updating district information from online sources. The application requirements for each individual district are obtained either through direct contact via phone or e-mail or through the district website. Most districts require that the following documents be provided in the research request packet:

  • Study proposal with a timeline of the study

  • Study Abstract and/or Executive Summary

  • IRB approval (NTPS is exempt from seeking IRB approval)

  • Consent form

  • Project Director’s resume

  • Copy of any communications that would be sent to participants

  • Copy of questionnaires

Some districts require a processing fee (approximately $50-$200) before the research proposal can be evaluated. Other information about the study may be required by the district and will be provided upon request.

School Pre-Contact Letters

The purpose of the school pre-contact letter is to verify school mailing addresses and to inform schools about the upcoming data collection. A letter is sent to each sampled school informing it of its selection for the study. About 4% of all school addresses are corrected by the U.S. Postal Service in response to the pre-contact letter, saving time and effort during the actual data collection period.

B.2.2 Main Data Collection

This section describes the data collection procedures used for the main data collection in NTPS 2015-16 including the Teacher Listing Form (TLF), School Questionnaire (SQ), Principal Questionnaire (PQ), and Teacher Questionnaire (TQ). The final data collection procedures for NTPS 2017-18 are under development and will be fully specified in the full-scale OMB clearance package submitted in early 2017.

B.2.2.1 Priority Schools

A priority flag was assigned at the school level in NTPS 2015-16. During data collection, the priority flag was used to move high priority schools and schools without a coordinator into field follow-up operations earlier in collection in an effort to boost response rates. Schools in the high priority group generally had not responded in previous SASS/NTPS administrations until later in the data collection process, and they ultimately required field intervention. For NTPS 2015-16, these schools were moved to field contact immediately after the first two mailouts of the initial school package to help reduce costs.

As described in section B.3.2 below, schools identified as high priority either 1) did not have an identified school coordinator or 2) had the lowest propensity to respond (based on SASS 2011-12 data) and the highest potential impact on estimates. The NTPS 2015-16 design sent these high priority schools into a Phase I field follow-up operation early in collection in an effort to boost response.

The priority flag takes into account both the response propensity and the base weight of a school to create a measure of a school’s potential effect on nonresponse weighting adjustments and final estimates. Schools with either an extremely high weight or an extremely low response propensity have a large response influence, meaning their nonresponse will disproportionately affect the nonresponse adjustment cell in which they are located.

B.2.2.2 School-level Data Collection Procedures

School-level data collection procedures for NTPS 2015-16 are summarized in Exhibit 1.

Beginning in August 2015, all schools received an initial school package addressed to the principal at the school address. The package contained a letter to the principal, a letter to the school coordinator (including instructions for completing a brief screener interview online using the NTPS Respondent Status Center), a Teacher Listing Form, a School Questionnaire, and a Principal Questionnaire. For the 1,000 schools in the Schools and Principals Internet Test (described in section B.4.1 below), the initial school package included login information for the TLF, SQ, and PQ. Prior to the beginning of data collection, a clerical operation was conducted to find email addresses for all principals in the Schools and Principals Internet Test. In late August, those principals were also sent an initial email inviting them to participate in the study.

The school coordinator was tasked with facilitating completion of the questionnaires. The internet-based Respondent Status Center allowed schools to complete screener items ensuring the school’s eligibility for NTPS, upload or manually enter the Teacher Listing Form, and check the status of all of the school’s questionnaires.

From September through October of 2015, Census Bureau telephone operations interviewers contacted each school that had not completed its TLF to screen the school over the telephone, determine its eligibility for NTPS, and try to establish a school survey coordinator. The interviewer also reminded the school to complete its TLF. In the same timeframe, a second initial school package was mailed to nonresponding schools and a reminder letter was sent to school coordinators.

Beginning in November of 2015, priority schools and schools without a survey coordinator were moved directly into field follow-up. Census Bureau field office staff began to conduct personal visits (Phase 1 Field Follow-up) to the schools to drop off additional survey materials and encourage survey completion. During the same timeframe, non-priority schools with a survey coordinator received a third initial school package and Census Bureau telephone operations interviewers contacted survey coordinators to remind them to encourage school-level sample members to complete their surveys. Additionally, principals in the Schools and Principals Internet Test were sent a reminder email.

From November through January, nonresponding non-priority schools with an established survey coordinator received a third and a fourth mailing of the initial school package. Schools in the Schools and Principals Internet Test were sent paper questionnaires in this mailing and principals were sent a second reminder email. Census Bureau telephone staff began making reminder phone calls to school survey coordinators.

Based upon the results of the Vendor Analysis conducted as part of the 2014 NTPS Pilot Test (described in section B.4.1 below), NCES decided to collect teacher lists from a vendor if sampled schools had not completed the TLF by mid-December 2015. If the sampled school was on the vendor-purchased data file, the vendor teacher roster was used for teacher sampling. If the sampled school was not on the vendor file, the Census Bureau conducted a clerical operation to locate teacher rosters on school or district web sites. If found, these rosters were used for teacher sampling.

A telephone follow-up operation was conducted with all schools in February and March of 2016. From late March through May 2016, the Census Bureau conducted a final in-person operation to try to collect missing survey forms (Phase 2 Field Follow-up).

B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

This section describes the methods that NCES will use to secure cooperation, maximize response, and deal with nonresponse for NTPS 2017-18. Section B.3.1 details how NTPS plans to secure cooperation by leveraging its status as the primary source of information on K-12 schools and staffing in the United States. Section B.3.2 describes the methods used in NTPS 2015-16 to minimize nonresponse, including those added as change requests during data collection to try to boost response rates. The final methods selected for NTPS 2017-18 will be fully specified in the full-scale OMB clearance package scheduled to be submitted in early 2017.

B.3.1 Methods to Secure Cooperation and Maximize Response Rates

The entire survey process, starting with securing research cooperation from key public school groups and individual sample members and continuing throughout the distribution and collection of individual questionnaires, is designed to increase survey response rates. In addition, the following elements of the data collection plan, in particular, will contribute to overall success of the survey and will enhance the survey response rates.

  1. Visible support from top-level Federal, State, and local education officials. Without the support of high-level officials in the U.S. Department of Education, State Education Agencies, and the sampled local school districts, surveys of public school principals and teachers cannot be successfully implemented. Obtaining endorsements from these officials is a critical factor in the success of the data collection procedures. Top-level Education Department officials will need to fully support the data collection by endorsing the survey in writing and sending advance letters and notices to sampled districts' Superintendents, and individual survey participants (principals and teachers) to encourage participation.

  2. Endorsements from key public school groups. The level of interest and cooperation demonstrated by key groups can often greatly influence the degree of participation of survey respondents. Endorsements are viewed as a critical factor in soliciting cooperation from state and local education officials. NCES is seeking endorsement for NTPS by the following organizations or agencies:

American Association of School Administrators

Association of American Educators

American School Counselors Association

Association for Supervision and Curriculum Development

American Federation of Teachers

American Counseling Association

Association for Middle Level Education

Council of Chief State School Officers

Council of the Great City Schools

National Association of Elementary School Principals

National Association of Secondary School Principals

National Education Association

American Association of School Librarians

American Montessori Society

National Parent Teacher Association

As more endorsements are received, they will be added to the questionnaires’ cover pages.

  3. Stressing the importance of the survey and the respondents' participation. Official letters will be used to motivate respondents to return surveys. NTPS 2017-18 respondent letters will include those sent from and signed by the Director of the Census Bureau, the NCES Commissioner, and/or NCES Sample Division’s Associate Commissioner. A late effort to boost principal response rates by personalizing communications materials in 2015-16 yielded additional responses at no cost, so NCES plans to continue this practice for NTPS 2017-18.

B.3.2 Methods to Deal with Nonresponse

A major challenge in any survey is obtaining high response rates, and this is even more important today when response rates have been falling among federal surveys in general, and in the SASS/NTPS series of studies in particular.

The main problem associated with nonresponse is the potential for nonresponse bias in the estimates produced using data collected from nonrespondents. Bias can occur when respondents are systematically different from nonrespondents. Two approaches that will be used to reduce the potential for bias are designing the data collection procedures and methods wisely to reduce nonresponse (e.g., using school coordinators) and using statistical methods of sampling and weighting to reduce the effect of nonresponse on the estimates. While the statistical approaches are important in controlling biases and costs, the data collection procedures and methods are at the heart of a successful study.

Methods selected to minimize nonresponse in NTPS 2017-18 will build upon those used in NTPS 2015-16, including actions that in NTPS 2015-16 were taken late in that data collection to boost principal and teacher response rates.

Data Collection Strategies to Minimize Nonresponse

  1. Minimize survey burden on schools. NTPS survey procedures are designed to minimize burden on schools and sampled individuals (principals and teachers) and the survey instruments have been designed to be completed as quickly and easily as possible.

Good questionnaire design techniques have been employed to minimize item nonresponse. Completed questionnaires from prior rounds of SASS/NTPS were carefully analyzed to determine which items had the highest levels of item nonresponse. This information guided NCES in reviewing the clarity of item wording, definitions, and instructions. Items that were not considered to be effective or useful were deleted to streamline the questionnaires and ease the response burden.

A key design feature of NTPS is the ability to link to other NCES collections such as EDFacts and the Civil Rights Data Collection (CRDC). Information from these sources will be incorporated into final datasets to allow researchers and policymakers to analyze those data together. This will further reduce the need to collect from schools data that have already been collected from state or district education agencies.

  2. Recruit school coordinators. Successive administrations of SASS/NTPS have shown that an important procedure to help maximize response rates is establishing a school coordinator to serve as a primary point of contact for NTPS staff. The continued use of a school coordinator is expected to help keep response rates high, provide some minimal data quality checks, and simplify the follow-up process by having one point of contact.

  3. Tailor nonresponse follow-up strategies in priority schools. In an effort to maximize response rates and minimize the potential for bias, NCES took a number of steps prior to NTPS 2015-16 to identify high priority schools for special handling. The schools identified as high priority either 1) did not have an identified school coordinator or 2) had the lowest propensity to respond (based on SASS 2011-12 data as described below) and the highest potential impact on estimates. The NTPS 2015-16 design sent these high priority schools into a Phase I field follow-up operation early in collection in an effort to boost response.

NTPS 2015-16 successfully implemented in production the use of a “priority flag” to identify these priority schools. The weighted response influence takes into account both the response propensity and the base weight of a school to create a measure of a school’s potential effect on nonresponse weighting adjustments and final estimates. The weighted response influence can be calculated as:

I_i = ln(BW_i) / p_i

where: I_i is the final weighted response influence for school i,

BW_i is the base weight for school i, and

p_i is the estimated response propensity for school i.

As the formula shows, a case with either an extremely high weight or an extremely low response propensity has a large response influence, meaning their nonresponse will disproportionately affect the nonresponse adjustment cell in which they are located. Missing that particular school’s information may result in biased estimates (if variables in the propensity model are related to outcomes of interest), and will certainly result in increased variance in the estimates (due to more variable final weights). In order to avoid having extreme weights drive the value of weighted response influence, the formula takes the natural log of the base weight.

The weighted response propensity model for NTPS 2015-16 was developed using data from SASS 2011-12 and the 2014 NTPS Pilot Test. Specific categories of variables available for evaluation included geography, urbanicity, racial/ethnic makeup, enrollment, grade levels, percent free lunch, and type of school. These variables were available in the SASS 2011-12 sample files, the 2014 NTPS Pilot Test sample files, and the 2015-16 NTPS sample file, thus leveraging past experience in creating the response propensity models. For NTPS 2017-18, the model will be updated using variables from the full-scale NTPS 2015-16.

The priority flag was assigned at the school level in NTPS 2015-16. During data collection, the priority flag was used to move high priority schools and schools without a coordinator into field follow-up operations earlier in collection in an effort to boost response rates. Schools in the high priority group generally do not respond until later in the data collection process and ultimately require field intervention. Moving those schools to field contact after the first two mailouts reduces costs.

NTPS focuses on obtaining cooperation and improving response rates at the school level for a number of reasons. Past administrations of SASS and NTPS have shown that if cooperation is obtained at the school level, teachers and principals are highly likely to respond. Evaluations of schools’ response propensities have shown that nonresponse in past administrations was driven primarily at the school level. Results showed that schools in special contact districts are the primary driving force behind low response propensity. Nearly 80% of the schools with a high propensity for nonresponse reside in these special districts. For this reason, resources will continue to be allocated to focus heavily on obtaining approvals from special contact districts in order to boost response rates for this group.
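The priority-flagging logic described above can be sketched as follows. The influence measure is written here in one plausible form consistent with the description (log base weight scaled up as estimated response propensity falls); the functional form and the cutoff are illustrative assumptions, not the production NTPS values.

```python
import math

# Illustrative priority-flagging pass using a weighted response influence
# that rises with the (log) base weight and falls with response propensity.
# The functional form and cutoff are assumptions for this sketch.

def response_influence(base_weight, propensity):
    """ln(base weight) divided by estimated response propensity."""
    return math.log(base_weight) / propensity

def flag_priority(schools, cutoff):
    """Flag schools whose influence exceeds the cutoff, or that lack a coordinator."""
    return [
        s["id"] for s in schools
        if s["coordinator"] is None
        or response_influence(s["weight"], s["propensity"]) > cutoff
    ]

schools = [
    {"id": "S1", "weight": 50.0, "propensity": 0.20, "coordinator": "Ms. A"},  # high influence
    {"id": "S2", "weight": 5.0,  "propensity": 0.80, "coordinator": "Mr. B"},  # low influence
    {"id": "S3", "weight": 5.0,  "propensity": 0.80, "coordinator": None},     # no coordinator
]
print(flag_priority(schools, cutoff=10.0))  # ['S1', 'S3']
```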

  4. Send a tailored flyer to city schools. Past administrations of SASS demonstrated that response rates tend to be very low for urban schools. NTPS 2015-16 included a flyer that was sent specifically to urban schools (as determined from frame data on the CCD). This flyer was roughly the size of one half of a standard sheet of paper and was customized with statistics about city schools and why their participation is important. The outcomes of including the flyer in mailings for urban schools are still being evaluated, and final details about possibly using the flyer again in NTPS 2017-18 will be included in the full-scale OMB clearance package in early 2017.

  5. Use vendor lists for teacher sampling. NTPS teacher-level response rates are calculated by multiplying response at the school level to the Teacher Listing Form (TLF) by response at the teacher level. This means that if a school does not complete the TLF, teachers from that school cannot be sampled, which ultimately lowers the teacher response rate. In NTPS 2015-16, TLFs received from sample schools were supplemented with vendor-purchased teacher lists and a clerical operation looking up teacher information on school websites. In the 2014 NTPS Pilot Test, these methods showed high levels of comparability to lists obtained directly from schools. The goal in NTPS 2015-16 was to improve the overall teacher response rate by allowing NTPS to sample teachers from schools that had not submitted a TLF. The results of the vendor list operation are being evaluated and their implications for NTPS 2017-18 are being considered.
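The two-component structure of the teacher response rate described above is a simple product, which shows why missing TLFs depress the overall rate. The example rates below are made up for illustration, not NTPS results.

```python
# Overall teacher response rate as described above: the school-level TLF
# response rate multiplied by the teacher-level response rate among sampled
# teachers. Rates here are invented for illustration.

def overall_teacher_response_rate(tlf_rate, teacher_rate):
    """Product of the school-side (TLF) and teacher-side response-rate components."""
    return tlf_rate * teacher_rate

# If 70% of schools return a TLF and 80% of sampled teachers respond,
# the overall teacher-level response rate is only 56%:
print(round(overall_teacher_response_rate(0.70, 0.80), 2))  # 0.56

# Sampling from vendor lists effectively raises the school-side component:
print(round(overall_teacher_response_rate(0.90, 0.80), 2))  # 0.72
```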

  6. Monitor publishability and bias measures. For NTPS 2015-16, NCES monitored data collection progress throughout survey operations in order to identify and potentially minimize problems with nonresponse. The Census Bureau created weekly “publishability” reports from their data collection tracking system that showed whether key analysis cells were large enough to provide publishable estimates as of that point in time. By monitoring this publishability metric, NCES was able to identify populations of schools for which nonresponse hampered reporting. These results will be considered in designing the sample and nonresponse follow-up strategies for NTPS 2017-18. NCES also monitored R-indicators, a measure of the bias of the sample, on a weekly basis. The closer the R-indicator is to 1, the more balanced is the respondent population. Towards the end of data collection, the R-indicator for the full sample indicated that the respondent population was fairly well balanced. NCES plans to continue to monitor these two indicators in NTPS 2017-18.
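The R-indicator monitoring described above can be sketched as follows. A common definition (due to Schouten and colleagues) is R = 1 − 2S, where S is the standard deviation of the estimated response propensities; the propensity values below are invented.

```python
import math

# Sketch of R-indicator monitoring: R = 1 - 2 * S, where S is the standard
# deviation of estimated response propensities. R near 1 means the respondent
# population is well balanced; lower R signals potential nonresponse bias.

def r_indicator(propensities):
    """Compute R = 1 - 2 * population standard deviation of propensities."""
    n = len(propensities)
    mean = sum(propensities) / n
    variance = sum((p - mean) ** 2 for p in propensities) / n
    return 1.0 - 2.0 * math.sqrt(variance)

# Identical propensities -> perfectly balanced respondent set:
print(r_indicator([0.6, 0.6, 0.6, 0.6]))  # 1.0

# Widely varying propensities -> much lower R:
print(round(r_indicator([0.2, 0.9, 0.3, 0.8]), 3))
```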

  4. Personalize principal letters and emails. The National Teacher and Principal Survey (NTPS) 2015-16 Refusal Conversion Change Request (OMB# 1850-0598 v.13) was approved in January 2016 in response to significant response problems among high-priority schools despite early field collection efforts. Anecdotal evidence indicated that the initial survey packages addressed to “School Administrator/Principal” were being overlooked or thrown in the trash. Therefore, NTPS 2015-16 requested and received clearance to send personalized letters to nonresponding principals in charter schools, schools in towns, and schools with enrollment less than 100; to unproductive schools from the phase one field follow-up operation; and to recoded “refusals” from the phase one field follow-up operation. NCES is considering the feasibility of personalizing principal contact materials in NTPS 2017-18 and will provide details in the full-scale OMB clearance package.

  5. Target emails to nonresponding principals. The 2015-16 National Teacher and Principal Survey (NTPS) Targeted Data Collection Emails Change Request (OMB# 1850-0598 v.14) was approved in April 2016 and requested the use of personalized emails to nonresponding principals in targeted school groups. The 2014 NTPS pilot test had demonstrated that email was an effective tool to drive participation in the NTPS teacher survey. The pilot test also showed that email addresses and teacher lists could be collected from school websites. The 2015-16 NTPS included clerical operations to look up principal email addresses and lists of teachers as well as their email addresses during data collection. Using this information, NCES sent targeted, personalized emails to individual principals encouraging their school’s participation in the survey. To add an element of personalization, each email was sent directly to the principal from the NCES/NTPS email box and was signed by the Associate Commissioner for Sample Surveys of NCES. Because personalized emails to nonresponding principals in target schools carry no cost and may help response rates, NCES plans to incorporate this into its procedures for NTPS 2017-18 (the details of which will be specified in the full-scale OMB clearance package submitted in early 2017).

  6. Send a third reminder email. The 2015-16 National Teacher and Principal Survey (NTPS) Teacher Reminder Email Change Request (OMB# 1850-0598 v.15) was approved in June 2016 and requested the ability to send a third reminder email (fourth email in total) to late waves of teachers in NTPS 2015-16 to give them a final reminder and opportunity to complete the survey before close-out. The response rates for late-wave teachers in the NTPS had been leveling off and appeared to be lower than those for earlier waves of teachers. This may have been a product of the timing of school testing and late school-year activities, because late-wave teachers received an invitation to complete the survey during a period with a heavy school workload. Given that this additional reminder email carried no cost and may help response rates, NCES plans to incorporate a third reminder email into data collection procedures for NTPS 2017-18.

  7. Consider new methods of minimizing nonresponse. NCES is considering a number of other methods to minimize nonresponse in NTPS 2017-18, including the possible use of incentives and establishment of a district coordinator. Full details of any additional proposed methods of minimizing nonresponse will be included in the full-scale OMB clearance package submitted in early 2017.
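The R-indicator monitored in item 3 above has a simple closed form: R = 1 − 2·S(ρ̂), where S(ρ̂) is the standard deviation of the estimated response propensities (Schouten, Cobben, and Bethlehem, 2009). The sketch below illustrates the calculation with invented propensity values; in practice the propensities would be estimated from a model of response on frame variables such as school type and locale.

```python
from statistics import pstdev

def r_indicator(propensities):
    """Sample-based R-indicator: R = 1 - 2 * S(rho).
    R near 1 means response propensities are nearly uniform, so the
    respondent pool is well balanced; R near 0 signals a strongly
    selective (potentially biased) response."""
    return 1 - 2 * pstdev(propensities)

# Uniform propensities: perfectly balanced response.
print(r_indicator([0.6, 0.6, 0.6, 0.6]))  # -> 1.0

# Propensities that vary across school groups: less representative.
print(r_indicator([0.2, 0.4, 0.6, 0.8]))
```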

Statistical Approaches to Nonresponse

One of the methods employed to reduce the potential for nonresponse bias is adjustment of the sample weights to account for nonresponse. If schools or teachers with certain characteristics are systematically less likely than others to respond to a survey, the collected data may not accurately reflect the characteristics and experiences of the nonrespondents, which can lead to bias. To adjust for this, respondents are assigned weights that, when applied, result in their representing their own characteristics and experiences as well as those of nonrespondents with similar attributes. The school weights are also raked to sample-based control totals in order to maintain the background characteristics of the sample; this is another method used to reduce the potential for nonresponse bias in the estimates produced from the data.
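The weighting-class adjustment described above can be sketched in a few lines. The cells and weights below are invented for illustration; in production, adjustment cells would be defined by frame characteristics such as school type, locale, and enrollment size.

```python
from collections import defaultdict

def nonresponse_adjust(cases):
    """Weighting-class nonresponse adjustment: within each cell,
    respondent base weights are inflated so that respondents also
    represent nonrespondents with similar characteristics."""
    total = defaultdict(float)   # sum of base weights, all sampled cases
    resp = defaultdict(float)    # sum of base weights, respondents only
    for c in cases:
        total[c['cell']] += c['base_weight']
        if c['responded']:
            resp[c['cell']] += c['base_weight']
    return [
        {**c, 'weight': c['base_weight'] * total[c['cell']] / resp[c['cell']]}
        for c in cases if c['responded']
    ]

sample = [
    {'cell': 'city', 'base_weight': 10.0, 'responded': True},
    {'cell': 'city', 'base_weight': 10.0, 'responded': False},
    {'cell': 'rural', 'base_weight': 5.0, 'responded': True},
]
adjusted = nonresponse_adjust(sample)
# The responding city school now also represents the nonrespondent, so
# its adjusted weight doubles, and the total weight (25.0) is preserved.
print([r['weight'] for r in adjusted])  # -> [20.0, 5.0]
```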

Response rates will be computed for the Teacher Listing Form, the School Questionnaire, the Principal Questionnaire, and the Teacher Questionnaire. Data collected through any instrument with a response rate of less than 85 percent will be evaluated for nonresponse bias. In addition to comparing the characteristics of respondents and nonrespondents using data that are available from the sampling frames (for example, school type and school locale from the school frame), we will also compare study estimates to estimates from previous rounds of NTPS and SASS. The nonresponse bias analysis will be similar to that conducted for SASS as reported in study methodology documentation (for the most recently released SASS methodology report, see http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010332). A methodology report covering NTPS 2017-18 will be developed and released, and will describe the methods and results of the nonresponse bias analysis.

B.4 Tests of Methods and Procedures

The SASS/NTPS series of studies has a long history of testing materials, methods, and procedures to improve the quality of its data. Section B.4.1 describes those tests that have most influenced the NTPS design, beginning with the 2014-15 NTPS Pilot Test and continuing through a new research project aimed at improving procedures for requesting research approval from special contact districts. Section B.4.2 describes a proposed experiment to include private schools, principals, and teachers in NTPS.

B.4.1 Tests Influencing the Design of NTPS 2017-18

2014-15 NTPS Pilot Test

Five experiments designed to optimize the design of the 2015-16 NTPS were conducted as part of the 2014-15 NTPS Pilot Test: 1) the Questionnaire Mode Experiment, 2) the Teacher Listing Form (TLF) Email Experiment, 3) the Invitation Mode Experiment, 4) the Teacher Questionnaire Instruction Experiment, and 5) the Vendor Analysis. Each of these experiments is briefly described below, along with its results and implications for successor NTPS data collections.

  1. Questionnaire Mode Experiment. This experiment was designed to determine whether paper questionnaires or internet survey instruments (i.e., mail-only versus internet-sequential modes) constituted the more effective mode of collecting the TLF, School Questionnaire, and Principal Questionnaire. For all three survey instruments, the schools assigned to the paper mode had higher response rates than the schools assigned to the internet mode. Some known issues with data collection could have affected this difference. First, the pilot test did not use school coordinators, a method shown to boost response rates in SASS. Second, there were problems with the contact materials for the internet treatment groups. As a result of this experiment, NTPS 2015-16 was primarily paper based. It also included an experimental sample of 1,000 schools outside the main study that were offered an internet response option at the onset of data collection using standard production NTPS procedures, such as a school coordinator and improved contact materials and login procedures. NCES is evaluating the implications of this experiment for NTPS 2017-18, but currently plans to use internet survey instruments as the primary data collection mode for teacher respondents.

  2. Teacher Listing Form (TLF) Email Experiment. This experiment was designed to assess the feasibility and quality of collecting teacher email addresses on the Teacher Listing Form. The pilot test design included a 50% split-panel experiment with schools randomly assigned to receive either a TLF that included a request for teachers’ email addresses or one that did not. At the end of data collection, response rates were comparable between the schools that received the TLF with the email address field and the schools that received the TLF without the email address field. As a result of this experiment and the Invitation Mode Experiment described below, NCES used the TLF with the email address field in NTPS 2015-16 and plans to continue to do so for NTPS 2017-18.

  3. Invitation Mode Experiment. The purpose of this experiment was to identify which of three methods of inviting teachers to complete the Teacher Questionnaire yielded the best response rates. Schools were randomly assigned to the following invitation modes: 1) both email and mailed paper invitation letters to complete the internet instrument (treatment A), 2) a mailed paper invitation letter to complete the internet instrument only (treatment B), and 3) a mailed package that included a letter and paper questionnaire (treatment C). The results of the experiment indicated that a strategy using a combination of email and paper invitations (treatment A) is best for inviting teachers to complete the internet questionnaire. The response rate for treatment group A was comparable to that of treatment group C, which received only mailed paper materials. As a result of this experiment, teachers sampled for NTPS 2015-16 for whom we had a valid email address were sent both email and paper invitations as the initial request to fill out the Teacher Questionnaire. Teachers without valid email addresses were sent their initial invitation as part of a mailed package that included a paper copy of the survey. NCES plans to continue using this procedure in NTPS 2017-18.

  4. Teacher Questionnaire Instruction Experiment. This experiment was designed to determine (1) whether including instructions in the NTPS questionnaire makes a difference, and (2) whether the position, format, and presence or absence of a preface in the instructions make a difference. NCES is currently analyzing the results from this experiment and plans to incorporate the findings in a future NTPS administration.

  5. Vendor Analysis. The purpose of this effort was to evaluate both the feasibility of collecting teacher lists from a vendor and the reliability of their information to see whether they could be used to supplement or replace school-collected TLFs. NCES purchased teacher lists from a vendor for schools sampled for the 2014-15 NTPS Pilot Test. The vendor teacher lists were compared with information collected from the TLFs. The results suggested that the vendor list information was comprehensive and reliable at a relatively low cost. NCES used vendor lists to sample teachers from a subset of schools that did not respond to the TLF in NTPS 2015-16. Based on its evaluation of that process, NCES will specify how it plans to use vendor lists in the full-scale NTPS 2017-18 clearance package to be submitted in early 2017.

NTPS 2015-16 Full-Scale Collection

  1. Schools and Principals Internet Test. NTPS 2015-16 included a Schools and Principals Internet Experiment designed to test the efficacy of offering an internet response option as the initial mode of data collection. This experiment built on the Questionnaire Mode Experiment from the 2014-15 NTPS Pilot Test, described above. Most importantly, the 2015-16 Schools and Principals Internet Test included the use of the school coordinator and corrected problems with respondent materials and packaging in the earlier test. A separate sample of 1,000 public schools was selected for this experiment, which invited schools and principals to complete the NTPS questionnaires over the internet. A clerical operation prior to data collection obtained email addresses for sampled principals assigned to the internet test. Principals were sent emails as an initial mode of invitation to complete the NTPS questionnaires. Results from the 2015-16 NTPS Schools and Principals Internet Test will influence the final 2017-18 NTPS design. Those results and the final design will be described in the full-scale 2017-18 NTPS clearance package to be submitted in early 2017.

  2. Contact Time Tailoring Experiment. This test was designed to determine the optimal contact time for two respondent groups: school coordinators and teachers. The ideal contact time for each of these individuals could easily be different. In fact, there is anecdotal evidence that interviewers have more success contacting school coordinators in the morning, while there is a greater likelihood of making contact with teachers in the afternoon. This experiment varied the contact times used in the telephone calls conducted to nonrespondent schools and teachers. The research questions this test was designed to answer are:

  1. Are afternoons more productive for calling teachers?

  2. If not afternoons, are there more productive times than others for calling teachers?

  3. Do productive contact times for teachers hold globally, or do different types of schools have different productive call time frames?

  4. Can we use frame information (urbanicity, school size, grade level) to help tailor call times in future rounds of data collection?

  5. If we are calling at productive times, do we have to make fewer overall attempts to make contact with the teacher?

  6. If we are calling at productive times, do we have to make fewer total attempts/total contacts to obtain a completed interview?

Results from the 2015-16 NTPS Contact Time Tailoring Experiment are being evaluated and will influence future NTPS data collection procedures.

NTPS 2017-18 Full-Scale Collection

NCES is currently considering options for tests of methods, materials, and procedures to be conducted as part of the full-scale NTPS 2017-18.

  1. Testing rotating content modules. The School Questionnaire, Principal Questionnaire, and Teacher Questionnaire each have core sections with content that is included in every survey administration. In addition, these instruments include rotating content modules that are fielded in selected administrations of NTPS. NCES is currently conducting cognitive interviews to evaluate new items for rotating NTPS modules on topics of educator evaluation, professional development, classroom organization, and instructional time (OMB #1850-0803 v.147). These cognitive interviews are designed to improve question wording, organization, and the order of the questions in the instruments. Respondents include a mix of teachers and principals from urban, suburban, and rural schools, teachers who are both full and part time, and teachers with alternative certification. The final questionnaires, including rotating content modules revised as a result of the cognitive interviews, will be included in the full-scale NTPS 2017-18 clearance package to be submitted in early 2017.

Special Contact Districts Research Project

NCES conducts several school-based studies within the NCES legislative mandate to report on the condition of education, including NTPS, the School Survey on Crime and Safety (SSOCS), and the National Assessment of Educational Progress (NAEP). A critical step for data collection is obtaining approval from public school districts that require prior approval to conduct studies with students, teachers, and staff. The number of these special contact districts is steadily increasing, which poses a barrier to successful data collection: many districts and schools have complex and lengthy approval processes, reject all outside research, or review applications for outside research only once a year. This has contributed to lower response rates for non-mandatory NCES surveys. A research project currently underway at NCES is exploring how different program areas, both within NCES and in other federal agencies, seek approval from PreK-12 public districts and schools, in order to identify best practices and make recommendations for future operations. The results of this research project will inform the design of the special districts operation for NTPS 2019-20.

B.4.2 NTPS 2017-18 Private School Test

This section describes the NTPS 2017-18 Private School Test, designed to determine whether NCES can achieve response rates for this population that meet NCES publishability standards. In SASS 2010-11, the response rates for private schools, particularly in specific strata, were too low to meet NCES reporting standards. Because of this, NTPS 2015-16 collected data from public schools but not from private schools. In NTPS 2017-18, NCES plans to conduct an embedded test with private schools to evaluate methods for improving response rates. The private schools selected for this test will undergo data collection procedures generally similar to those used with the main (public) NTPS 2017-18 school sample, adjusted to accommodate differences specific to the sector (e.g., holidays and schedules). The sample sizes detailed in Section B.4.2.2 are being proposed for clearance for the school pre-contact letter operation. The descriptions of the private school test in Sections B.4.2.1, B.4.2.3, and B.4.2.4 are preliminary and will be fully specified in the full-scale NTPS 2017-18 clearance package to be submitted in early 2017.

B.4.2.1 Proposed Treatments

The design of the proposed treatments for the NTPS 2017-18 Private School Test is still preliminary and will be fully specified in the NTPS 2017-18 full-scale OMB clearance package in early 2017. NCES anticipates that the proposed treatments will include the use of incentives and a follow-up contact schedule that accelerates in-person follow-up by field representatives. NCES plans to use propensity score modeling (combining likelihood of response with risk of nonresponse bias) to identify and segment schools among the treatment groups.

B.4.2.2 Universe and Sample Design

The respondent universe for the NTPS 2017-18 Private School Test consists of 24,862² private schools in the 50 U.S. states and the District of Columbia (DC) that meet eligibility criteria. To be eligible for inclusion in the sample, schools must provide classroom instruction to students, have one or more teachers to provide instruction, serve students in at least one of grades 1-12 or the ungraded equivalent, be located in one or more buildings, and be located in the U.S. rather than in the outlying areas or U.S. territories.

Because of timing, NCES used the 2013-14 Private School Universe Survey (PSS) to construct the preliminary private school sampling frame; the schools themselves will be sampled from the 2015-16 PSS. Table 4 below presents the number of private schools on the 2013-14 PSS by region and school level. The universe has been adjusted to remove kindergarten-terminal schools, which are not eligible for NTPS.

Table 4. Respondent universe by school level and region for the proposed private school sample, based on the 2013-14 PSS

Region       Total     Elementary   Secondary   Combined
Total        28,364    17,254       2,700       8,410
Northeast     6,038     3,462         861       1,715
Midwest       8,540     6,687         568       1,285
South         8,750     4,104         660       3,987
West          5,037     3,002         611       1,423

SOURCE: Private School Universe Survey (PSS), 2013–14.

B.4.2.3 Precision Requirements and Sample Sizes

This section details the school sample sizes and precision requirements for the NTPS 2017-18 Private School Test.³ Details about the teacher sample will be provided in the full-scale NTPS 2017-18 clearance package to be submitted in early 2017.

The final NTPS 2017-18 Private School Test samples will include approximately 4,000 schools and school principals. To inform the sample design for the NTPS 2017-18 Private School Test, NCES evaluated the level of precision achieved by SASS 2011-12. The precision analysis was based on estimated proportions for analysis variables that address important school characteristics. The following variables were evaluated:

  • School type (Catholic, Other Religious, Nonsectarian);

  • Grade Level (Elementary, Secondary, Combined); and

  • Region (Northeast, Midwest, South, West).

The desired level of precision for NTPS private school estimates was defined in terms of a 95% confidence interval half-width (corresponding to 1.96 times the standard error), with a goal of 3.0%. NCES also evaluated whether the sampling plan could achieve a coefficient of variation (CV) of less than 30 percent, the threshold required to meet NCES reporting standards. Table 5 shows this information by key domains of school type, grade level, and region.
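The precision figures in Table 5 below can be reproduced (to rounding) from the expected number of completed interviews and the design effect, assuming an estimated proportion of 20 percent; that proportion is inferred here from the table values and is not stated in the source. A sketch of the calculation:

```python
import math

def precision(n, deff, p=0.20):
    """Design-adjusted precision measures for an estimated proportion p
    based on n completed interviews and design effect deff."""
    se = math.sqrt(deff * p * (1 - p) / n)   # design-adjusted standard error
    half_width = 1.96 * se                   # 95% confidence interval half-width
    cv = se / p                              # coefficient of variation
    # Smallest population proportion reportable with CV < 30%:
    # solve sqrt(deff * p * (1 - p) / n) / p = 0.30 for p.
    min_p = deff / (0.30 ** 2 * n + deff)
    return se, half_width, cv, min_p

# "All" row of Table 5: 1,750 expected completes, design effect 1.62.
se, hw, cv, min_p = precision(n=1750, deff=1.62)
# SE ~ 1.22%, half-width ~ 2.4%, CV ~ 6.1%, minimum reportable
# population proportion ~ 1.02%, consistent with the tabled values.
print(se, hw, cv, min_p)
```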

In order to improve on the precision achieved in SASS 2011-12, NCES plans to oversample as follows:

  • Secondary schools will be sampled at a rate proportional to 3.33 times the measure of size (as determined by number of FTE teachers);

  • Nonsectarian schools will be sampled at a rate proportional to 1.43 times the measure of size; and

  • All other strata will be sampled at a rate proportional to 1.0 times the measure of size.
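The oversampling factors above fold naturally into probability-proportional-to-size (PPS) selection: each school's measure of size is scaled by its stratum factor before selection probabilities are computed. The sketch below is illustrative only; the field names and the two-school frame are invented.

```python
def pps_probabilities(frame, n_sample):
    """Selection probabilities for PPS sampling with stratum
    oversampling factors applied to the measure of size (MOS)."""
    total = sum(s['factor'] * s['mos'] for s in frame)
    return [min(1.0, n_sample * s['factor'] * s['mos'] / total) for s in frame]

frame = [
    {'mos': 20.0, 'factor': 1.0},    # e.g. an elementary school, 20 FTE teachers
    {'mos': 20.0, 'factor': 3.33},   # e.g. a secondary school, oversampled
]
probs = pps_probabilities(frame, n_sample=1)
# With equal measures of size, the secondary school is 3.33 times as
# likely to be drawn as the elementary school.
print(probs)
```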

Table 5. Anticipated Level of Precision by Key Domains for NTPS 2017-18 Private School Test

Domain            Frame schools   Expected completed interviews   Expected SE   95% CI half-width   Design effect   CV        Min pop % for CV < 30%
All               24,861          1,750                           1.22%         2.38%               1.62             6.08%    1.02%
Catholic           6,407            573                           2.07%         4.06%               1.54            10.37%    2.90%
Other religious   11,600            598                           2.01%         3.94%               1.51            10.06%    2.73%
Nonsectarian       6,854            579                           2.02%         3.96%               1.48            10.11%    2.76%
Elementary        13,216            638                           1.80%         3.53%               1.29             8.99%    2.20%
Secondary          2,426            505                           2.11%         4.13%               1.40            10.55%    3.00%
Combined           9,219            607                           1.95%         3.82%               1.44             9.74%    2.57%
Northeast          5,787            465                           2.55%         4.99%               1.88            12.73%    4.31%
Midwest            6,105            395                           2.52%         4.93%               1.56            12.58%    4.21%
South              8,025            545                           2.10%         4.12%               1.50            10.51%    2.98%
West               4,944            345                           2.68%         5.26%               1.55            13.42%    4.77%


B.4.2.4 Procedures for the Collection of Information

The NTPS 2017-18 Private School Test will generally follow the data collection procedures used for the main sample (public schools). The test may include treatments that vary the data collection procedures to improve response rates or to accommodate the needs of this population (e.g., scheduling around religious holidays for religious private schools). Full details on the data collection procedures for NTPS 2017-18 and on procedures unique to the private school test will be specified in the NTPS full-scale clearance package in early 2017.


B.5 Individuals Responsible for Study Design and Performance

The following individuals are responsible for the study design, data collection, and analysis for NTPS 2017-18: Amy Ho, Chelsea Owens, Deanne Swan, Andy Zukerberg, and Marilyn Seastrom from NCES; Carolyn Pickering, Shawna Cox, Mary Davis, and James Farber from Census; and David Marker and Lou Rizzo from Westat.

1 Please note that work on the NTPS 2017-18 sample design continues. Should any change be necessary to what is proposed in this package, NCES will submit a change request to OMB in fall 2016.

2 This is based on the PSS 2013-14 data file. There were 29,639 private schools with valid values for the NTPS frame variables, such as affiliation, school grade level, and number of full-time equivalent teachers.

3 Please note that work on the NTPS 2017-18 sample design continues. Should any change be necessary to what is proposed in this package, NCES will submit a change request to OMB in fall 2016.
