Principal Follow-Up Survey (PFS 2016-17) to the National Teacher and Principal Survey (NTPS 2015-16)

OMB: 1850-0934


Principal Follow-Up Survey (PFS 2016-17) to the National Teacher and Principal

Survey (NTPS 2015-16)



Supporting Statement

Parts B & C



OMB# 1850-new v.1



October 2016


National Center for Education Statistics (NCES)

U.S. Department of Education


TABLE OF CONTENTS


B. Collection of Information Employing Statistical Methods


1. Respondent Universe

1a. Background on the NTPS 2015-16 sample

1b. Principal Follow-up Survey (PFS) Sample Design

2. Procedures for Collection of Information

3. Methods for Maximizing Response Rates

3a. Endorsements

3b. Stressing the survey's importance

3c. Extensive follow-up of nonrespondents

4. Tests of Procedures and Methods

5. Reviewing Statisticians

C. Item Justification


PART B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe

1.a Background on the NTPS 2015-16 Sample

The 2015–16 National Teacher and Principal Survey (NTPS 2015-16) used a two-stage sample design, with stratification at the teacher stage: schools were sampled first, and teachers were then selected within each sampled school.

The NTPS school sample is a systematic probability proportionate to size (PPS) sample, where size is defined as the square root of the full-time-equivalent number of teachers at the school. Unlike its predecessor, the Schools and Staffing Survey (SASS), the NTPS did not stratify schools prior to sampling. However, some schools were oversampled based on their assigned domains. The domains were defined by charter status (charter/not charter), grade level (primary/middle/high/combined), urbanicity (city/suburban/town/rural), and poverty status (more than 75% of enrollment eligible for free or reduced-price lunch/otherwise).

Public charter schools were oversampled at a rate of 3.1 times their proportional sample allocation. Proportional allocation in this instance is defined as the domain's sum of schools' measures of size relative to the sum of measures of size of all schools in the sampling frame. Further oversampling was applied to non-charter schools by grade level: combined schools were oversampled at 2.4 times the proportional allocation, middle schools at 1.17 times, and high schools at 1.12 times, while primary schools were undersampled at 0.9 times their proportional allocation. Once the grade-level sampling differential was applied, non-charter schools had their probabilities further adjusted by urbanicity: rural schools were oversampled at 1.05 times their previous rate, town schools at 1.27 times, and suburban schools were undersampled at 0.95 times their previous rate.
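The measure-of-size adjustments described above can be sketched as follows. This is an illustrative simplification, not NCES production code: in the actual design the factors adjust domain allocations relative to proportional allocation, whereas here they are applied multiplicatively to a single school's measure of size. All function and variable names are hypothetical.

```python
import math

# Domain oversampling factors quoted in the text above.
GRADE_FACTOR = {"primary": 0.90, "middle": 1.17, "high": 1.12, "combined": 2.40}
URBANICITY_FACTOR = {"city": 1.00, "suburban": 0.95, "town": 1.27, "rural": 1.05}
CHARTER_FACTOR = 3.1  # applied to public charter schools

def adjusted_measure_of_size(fte_teachers, charter, grade_level, urbanicity):
    """Return an illustrative size measure for systematic PPS selection.

    The base measure of size is the square root of the school's
    full-time-equivalent (FTE) teacher count; domain factors then
    inflate or deflate it.
    """
    mos = math.sqrt(fte_teachers)
    if charter:
        return mos * CHARTER_FACTOR
    return mos * GRADE_FACTOR[grade_level] * URBANICITY_FACTOR[urbanicity]

# A rural non-charter combined school with 25 FTE teachers:
# sqrt(25) * 2.40 * 1.05 = 12.6
print(adjusted_measure_of_size(25, False, "combined", "rural"))
```

Under this sketch, a school's selection probability in a systematic PPS draw would be proportional to its adjusted measure of size relative to the frame total.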

In addition to oversampling, the sample sizes of schools in six small states (Alaska, District of Columbia, Hawaii, Rhode Island, Vermont, and Wyoming) were inflated to realize lower bounds on precision for these states and to enable the calculation of reliable key state estimates by aggregating data from multiple cycles of NTPS.

Along with the names and email addresses of their teachers, sampled schools were asked to provide the following descriptive characteristics for each teacher:

  1. teaching status: part-time or full-time; and

  2. subject matter taught: teachers were classified as special education, general elementary, math, science, English/language arts, social studies, vocational/technical, or other.

The above information for each teacher in a selected NTPS school comprised the initial teacher sampling frame. The frame was also supplemented with information from vendor-purchased teacher roster files and from a clerical look-up operation conducted as part of NTPS, in which teacher lists were located on school and/or district websites. Within each sampled school, teachers were stratified by subject; the strata were math, science, English/language arts, social studies, and all other subjects. No oversampling was done for the teacher sample. Teacher records within a school domain and teacher stratum were sorted by teacher subject and teacher line number code, a unique number assigned to identify each teacher within the list of keyed teachers. Within each teacher stratum in each school, teachers were selected systematically with equal probability. The following table summarizes the NTPS 2015-16 allocated sample sizes:
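The within-stratum selection step described above can be sketched as a standard systematic equal-probability draw. This is an illustrative sketch only; the function and variable names are hypothetical, and `teachers` stands in for the sorted stratum frame (sorted by subject and line number code).

```python
import random

def systematic_sample(teachers, n_sample, rng=random):
    """Select n_sample units with equal probability using a random
    start and a fixed skip interval over the sorted frame."""
    interval = len(teachers) / n_sample        # skip interval
    start = rng.uniform(0, interval)           # random start point
    return [teachers[int(start + k * interval)] for k in range(n_sample)]

# Example: select 3 teachers from a sorted roster of 12.
roster = [f"teacher_{i:02d}" for i in range(12)]
print(systematic_sample(roster, 3, random.Random(0)))
```

Because every unit falls under exactly one skip interval, each teacher in the stratum has the same selection probability, n_sample divided by the stratum size.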

Type of School                                    Number of Schools    Number of Teachers*
Traditional Public                                7,127                43,674
Public Charter                                    1,173                5,313
Internet Experiment for Schools and Principals    1,000                —
Total                                             9,300                48,987

Note: Internet experiment data were experimental and will not be published; no teachers were sampled from these experimental schools.


1.b Principal Follow-up Survey (PFS) Sample Design

The sampling frame for PFS 2016-17 consists of all the traditional public and public charter school principals who completed a Principal Questionnaire during NTPS 2015-16. Any sampled NTPS principal who did not complete their questionnaire or was otherwise found to be out of scope for NTPS will not be included in the PFS frame. The PFS 2016-17 sample will include approximately 5,700 principals.

PFS 2012-13 had a sample size of 7,512 principals from public and charter schools. The smaller 2016-17 sample reflects the fact that NTPS 2015-16 included fewer sampled schools than SASS 2011-12, because the NTPS 2015-16 sample was selected to produce national rather than state-level estimates.

2. Procedures for Collection of Information

Paper Questionnaires

Two separate paper questionnaires will be developed for the 2016-17 PFS: one for schools (PFS-1A) and one for principals (PFS-1C). The PFS-1A questionnaire will be mailed to each sampled principal's 2015-16 school in order to gather occupational status information about that principal for the 2016-17 school year. The PFS-1C questionnaire will be mailed directly to the sampled principal's home address, which was collected on the 2015-16 NTPS Principal Questionnaire, in order to (1) collect occupational status information about the principal when the school fails to complete its PFS-1A; or (2) verify the school's response when the school has indicated that the principal has left his or her 2015-16 school (i.e., is a non-stayer).

Paper PFS questionnaires will be checked in and processed at the U.S. Census Bureau's National Processing Center using the key-from-paper data capture methodology.

Weighted unit and overall response rates of PFS principals, using initial base weight, by sector: 2012–13

Sector              Weighted unit response rate    Weighted overall unit response rate1
Public schools      99.7                           72.7
Charter schools     99.1                           69.0
Private schools     97.6                           62.3


3. Methods for Maximizing Response Rates

A variety of procedures will be employed to maximize response rates at both the level of the responding unit (i.e., sample member) and at the level of the individual survey items in each survey questionnaire.

The entire survey process, starting with securing research cooperation from key public school groups and individual sample members and continuing throughout the distribution and collection of individual questionnaires, is designed to increase survey response rates. In addition, NCES believes that endorsements (see below), stressing the survey’s importance in communication materials, and extensive follow-up of nonrespondents will further facilitate the overall success of the survey and enhance response rates.

3.a Endorsements

The level of interest and cooperation demonstrated by key groups can often greatly influence the degree of participation of survey respondents. Endorsements are viewed as a critical factor in soliciting cooperation from state and local education officials for obtaining high participation rates. The PFS is seeking endorsement from the following organizations or agencies:

American Association of School Administrators

American Federation of Teachers

National Association of Elementary School Principals

National Association of Secondary School Principals

3.b Stressing the survey’s importance

Official letters (initial and follow-up) from the NCES Commissioner will be used to motivate respondents to return their surveys. These letters will address the principal by name. The use of personalization of survey materials (e.g., cover letters and survey packets with names) in NTPS 2015-16 demonstrated a positive effect on the response rates. Both the Commissioner’s signature and the endorsements from respected affiliations and organizations are intended to increase the respondent’s perception of the importance of the survey.

3.c Extensive follow-up of nonrespondents

For PFS 2016-17, mixed survey modes will be used to maximize response levels: self-administered mail instruments, with telephone follow-up as needed. Nonrespondents will receive at least two reminders. Given the brevity of the instrument, we anticipate response rates at or above 90% with this approach.

4. Tests of Procedures and Methods

The most recent administration of the PFS was during the 2012-13 school year and provided attrition rates for principals in K-12 public and private schools. The PFS sample included all principals interviewed in SASS 2011-12. Schools that had returned a completed SASS 2011-12 principal questionnaire were mailed the PFS form in the spring of 2013. The form collected from the school information about the current occupational status of the principal who was that school’s principal a year earlier, during SASS 2011-12.

Additionally, a validation study was conducted concurrently with, yet separately from, PFS 2012-13. This study analyzed the validity of schools' responses to the PFS by re-interviewing the principals directly about their occupational status. In general, more variation in response was found than was expected. However, the ability to draw definitive conclusions was limited by the small sample size captured by the study and the difficulty in reaching a substantial portion of the original validation study sample.

NCES decided not to include a large-scale validation study in the PFS 2016-17 data collection, but instead to follow up directly with all principals who were identified on the PFS form mailed to schools as no longer serving as principal in their 2015-16 school ("non-stayers"). The goal of following up directly with principals is to assess how accurately schools report principal occupational status for non-stayers, and to obtain accurate attrition data for them.

5. Reviewing Statisticians

Andy Zukerberg, Isaiah O’Rear, and Chelsea Owens of NCES; Shawna Cox of the Census Bureau; and Rebecca Goldring of Westat reviewed and approved the PFS data collection plan and related matters for statistical quality, feasibility, and suitability to the overall objectives of the survey.

PART C. ITEM JUSTIFICATION

There are four versions of the PFS Questionnaire:

  • PFS-1A is sent to public schools in the initial mail-out;

  • PFS-1A(T) is used to record responses from telephone follow-up to nonresponding schools that were sent PFS-1A;

  • PFS-1C is sent directly to NTPS respondents who were public school principals when the PFS-1A indicates that the principal is a non-stayer or when the PFS-1A is a nonresponse; and

  • PFS-1C(T) is used to record responses from telephone follow-up with nonresponding public school principals that were sent PFS-1C.

PFS questionnaires have not been modified from the PFS 2012-13 collection.

1 The weighted overall unit response rate is a function of the 2011-12 SASS Principal response rate multiplied by the 2012-13 Principal Follow-up Survey response rate.
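The footnote's formula can be checked arithmetically against the public-school row of the response rate table. The sketch below is illustrative only: it uses the published PFS rate (99.7) and overall rate (72.7) to back out the implied SASS 2011-12 principal component rate, which is not stated directly in this document.

```python
# Weighted overall unit response rate = SASS 2011-12 principal response
# rate x PFS 2012-13 response rate. Figures are percentages from the
# public-school row of the table above.
pfs_rate = 99.7      # PFS 2012-13 weighted unit response rate
overall_rate = 72.7  # weighted overall unit response rate

# Implied SASS 2011-12 principal response rate (an inference, not a
# published figure):
implied_sass_rate = overall_rate / (pfs_rate / 100)
print(round(implied_sass_rate, 1))  # ≈ 72.9
```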
