


Volume I:



Fast Response Survey System (FRSS) 105: Condition of Public School Facilities



OMB# 1850-0733 v. 27


October 22, 2012

National Center for Education Statistics

U.S. Department of Education

Justification


The National Center for Education Statistics (NCES), U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for the Quick Response Information System (QRIS) (OMB #1850-0733) to conduct data collection for the Fast Response Survey System (FRSS) survey #105 on the condition of public school facilities. Congress has appropriated funds for NCES to conduct an FRSS survey on the condition of public school facilities, with a First Look report on the results to be released in late 2013. FRSS previously conducted a survey on this topic in 1999.


The FRSS survey is authorized under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. § 9573), which authorizes NCES to collect and report statistical data related to education in the United States. NCES has contracted with Westat to conduct this survey.


Design

Overview of Survey Development

The 2012-13 FRSS survey will cover many of the same topics as the 1999 survey, but will use a revised questionnaire. The current survey reflects lessons learned from the 1999 survey and topics and issues identified through a literature review, with modifications based on two rounds of feasibility calls and two rounds of pretest calls (OMB# 1850-0803) with the public school district personnel most knowledgeable about school facilities. A few items from the 1999 survey are included on the 2012-13 questionnaire for comparison. As in 1999, schools will be sampled, but surveys will be sent to districts, where facilities personnel and records are located. Two rounds of feasibility calls, each with nine respondents, were conducted in May and June 2012 (OMB# 1850-0803 v.67). The feasibility calls were used to explore potential new survey items and to identify and correct issues with the content and format of the survey before conducting the pretest. Respondents were asked to review but not complete the questionnaire and then participate in a short telephone interview with Westat to provide feedback on the questionnaire. The resulting draft of the questionnaire was then reviewed by the NCES Quality Review Board (QRB) and revised accordingly to prepare it for the pretest.


Two rounds of pretest calls, one with ten respondents and one with six respondents, were conducted in September and October 2012 (OMB# 1850-0803 v.70). The second pretest was conducted to test minor changes to the wording and format of question 14 (now question 13) to reduce the number of respondents who inadvertently skipped Part B of the question. In both rounds of pretest calls, respondents were asked to complete the questionnaire and participate in a telephone debriefing with Westat to provide feedback on the questionnaire. Completed questionnaires were collected by fax prior to the debriefing with each respondent. The purpose of the pretests was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily accessible to respondents, and to determine whether the burden on respondents could be reduced further. Changes to the questionnaires were made based on the feedback received from the pretests, and documented in memorandums summarizing the pretest results. After the first pretest, minor changes were made to the wording and format of question 14 (now question 13). After the second pretest, the only change was to drop questions 12 and 16 (from the pretest version) to reduce the length of the questionnaire to FRSS limits. The revised questionnaire (Attachment 1) is being submitted with this request for OMB clearance.


NCES Review and Consultations Outside of Agency


The NCES QRB reviewed study materials on two occasions prior to the submission for the feasibility calls. QRB members first reviewed a descriptive paragraph about the study, and later reviewed a draft questionnaire. The questionnaire was also reviewed by the Office of Innovation and Improvement (OII) in the U.S. Department of Education. In addition, the questionnaire was sent to the Environmental Protection Agency for review. Revisions were made to the instrument, and a few new items were added based on input from the reviewers. The questionnaire was tested with respondents during feasibility calls. The questionnaire was then revised and again reviewed by the NCES QRB and OII prior to the pretest calls.


Assurance of Confidentiality


Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured that their participation in the survey is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose unless otherwise compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).


Description of Sample and Burden


The proposed sample design is a nationally representative sample of 1,800 regular public schools from the 2010-11 NCES Common Core of Data (CCD) Public Elementary/Secondary School Universe File. Since facilities personnel and records are generally located at the district level, the survey will be sent to the district in which each sampled school is located, with instructions that it is to be completed for the sampled school only, by the person in the district who is most familiar with the district's school facilities. This is the same procedure that was followed in the 1999 FRSS survey, and it was also used in the feasibility and pretest calls for this FRSS survey. The questionnaire is limited to three pages of items readily available to respondents and can be completed by most respondents in 30 minutes or less.


Any special requirements that districts have for approval of surveys will be met before survey materials are mailed to those districts. Each of the approximately 10 districts that require special approval has unique requirements for obtaining approval. The materials sent to special districts will be tailored to meet the specific requirements of each district, consistent with the materials included in this OMB package. For example, most districts request information on survey justification, confidentiality, sample size, and survey collection procedures, which will be copied from the appropriate sections of the OMB package after its approval.


Questionnaire packages, including information needed to access the Web survey and a list of sampled schools in the district, will be mailed to the superintendents of districts with sampled schools in January 2013. The cover letter and questionnaire will include a description of the most appropriate district-level respondent. Follow-up for nonresponse will be conducted by both mail and telephone and will begin about 3 weeks after the questionnaires have been mailed to the districts. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. The telephone nonresponse follow-up call, used to prompt respondents to complete the survey by Web, mail, or fax, is expected to take about 5 minutes.


The sample of 1,800 schools is estimated to be located in 1,530 districts, based on previous FRSS studies. The estimated burden for districts to review the introductory letter requesting their participation (initial contact) is 5 minutes per district, for a total of 127 respondent burden hours (table 1). The initial sample of 1,800 schools will yield about 1,620 completed questionnaires, assuming a response rate of 90 percent. Based on a response burden of approximately 30 minutes per completed questionnaire, the estimated burden to complete the survey is about 810 hours.1 It is anticipated that about 75 percent of the districts (i.e., 1,148 districts) will receive a nonresponse follow-up call taking about 5 minutes, for a total of about 95 hours. Including the 20 hours estimated for special clearance district review, the total burden for data collection and nonresponse follow-up is about 1,052 hours.
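The burden totals above follow directly from the response counts and per-response times shown in table 1. A minimal arithmetic sketch in Python, using the figures from the table:

    # Burden arithmetic behind table 1: hours = responses x hours per response.
    rows = [
        ("Special clearance district review",   10, 2.00),
        ("Initial district contact",          1530, 5 / 60),   # 5 minutes
        ("Questionnaire",                     1620, 30 / 60),  # 30 minutes
        ("Nonresponse follow-up call",        1148, 5 / 60),   # 5 minutes
    ]
    total = 0.0
    for name, responses, hours_each in rows:
        hours = responses * hours_each
        total += hours
        print(f"{name:38s}{hours:8.1f} hours")
    print(f"{'Total burden':38s}{total:8.1f} hours")  # about 1,052 hours after rounding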



Table 1. Estimated burden for data collection and nonresponse follow-up: FRSS 105

Type of collection                       Sample   Estimated   Estimated     Estimated   Total burden   Respondent
                                         size     response    number of     number of   hours per      burden
                                                  rate (%)    respondents   responses   respondent     hours
Special clearance district review            10       100%             10          10        2.00             20
Initial district contact                  1,530       100%          1,530       1,530         .083           127
Questionnaire                             1,800        90%          1,620       1,620         .50            810
Nonresponse follow-up call to district    1,530        75%          1,148       1,148         .083            95
Total burden                                  -          -          1,620       4,308           -          1,052



Procedures and Data Collection Instrument


A survey package will be sent to each district with at least one sampled school. The package will include a questionnaire (Attachment 1), a cover letter addressed to the district superintendent (Attachment 2), a cover letter addressed to the district facilities coordinator (Attachment 3), and a Web information sheet (Attachment 4) for each sampled school. The superintendent's cover letter includes instructions to forward the questionnaire package to the person at the district who is most familiar with the school facilities in the district. The facilities coordinator cover letter includes contact information in case of questions and provides guidelines on how to complete and return surveys, including the option to complete a Web version for each sampled school. The public law is cited in the cover letter and on the front page of the survey, assuring respondents that their participation is voluntary and that their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).


If a completed survey is not received for a sampled school within 3 weeks after the initial mailing, the district will be sent a nonresponse follow-up letter (Attachment 5) and another copy of the school's Web information sheet, followed by a brief, scripted telephone call (Attachment 6) prompting the facilities coordinator to return a completed survey via the Web, fax, or mail.


Questionnaire


The questionnaire is designed to collect information on the condition of building systems/features in schools' permanent and portable (temporary) buildings, and on satisfaction with the building environmental factors that those systems produce. Respondents will be asked (as they were in 1999) for their estimate of the total cost of repairs/renovations/modernizations to put the school's buildings in good overall condition, and for the sources (e.g., facilities inspections, capital improvement master plans) on which this estimate is based. They will be asked about plans for major repair, renovation, or replacement of building features and systems, and about plans for construction at the school in the next few years. Additional items ask about the school's long-range educational facilities plan and steps taken to improve energy efficiency. The instrument is discussed below.


Question 1 asks whether the school has two types of onsite buildings: permanent and portable (temporary) buildings. Responses to the question indicate which parts of questions 2 and 7 should be completed.


Question 2 lists 17 building systems/features and asks about the condition of each in the school’s permanent and portable (temporary) onsite buildings. The question includes a 4-point rating scale (excellent, good, fair, poor) and a “school does not have system/feature” option. Building features include things such as roofs, plumbing/lavatories, heating and air conditioning systems, electrical system, and life safety features. Part A asks about the condition of the various systems/features in the school’s permanent buildings and part B asks about the condition of the same systems/features in the school’s portable (temporary) buildings. Question 2 is a modified version of an item that was included in the 1999 survey.


Question 3 asks about the condition of various outdoor features at the school: school parking lots and roadways, bus lanes and drop-off areas, sidewalks and walkways, outdoor play areas/playgrounds, outdoor athletic facilities, covered walkways, and fencing. The question includes a 4-point rating scale (excellent, good, fair, poor) and a “school does not have feature” option.


Question 4 asks for an overall rating of the condition of the permanent and portable (temporary) onsite buildings at the sampled school. The question includes a 4-point rating scale (excellent, good, fair, poor) and a “school does not have building type” option. This is a modified version of an item that was included in the 1999 survey.


Question 5 asks for the best estimate of the total cost of all repairs/renovations/modernizations required to put the school’s onsite buildings in good overall condition. If the school’s onsite buildings are already in good or excellent overall condition, respondents are instructed to enter zero. This item was included in the 1999 survey.


Question 6 asks about the sources on which the cost estimate given in question 5 is based. This item was included in the 1999 survey.


Question 7 asks how satisfactory various environmental factors are in the school’s onsite buildings. Environmental factors included are artificial and natural lighting, heating, air conditioning, ventilation, indoor air quality, water quality, and acoustics or noise control. Satisfaction is rated separately for permanent and portable (temporary) buildings. This is a modified version of an item that was included in the 1999 survey.


Question 8 asks in what year the school’s main instructional building was constructed, and question 9 asks in what year the last major renovation of the main instructional building took place. Both items were included in the 1999 survey. Question 10 asks in what year the last major building replacement or addition was made to the school. These items provide information about the functional age of the school.


Question 11 asks whether any major repair/renovation/modernization work is currently being performed at the school.


Question 12 asks which kinds of construction projects, if any, are planned for the school in the next 2 years. The construction projects include building new permanent buildings or permanent additions to buildings, and major repairs, renovations, or modernization of existing permanent buildings. This is a modified version of an item that was included in the 1999 survey.


Question 13 lists the same 17 building systems/features used in Question 2, and asks which, if any, have major repairs, renovations, or replacements planned for the next 2 years. If major repairs, renovations or replacements are planned, part B asks for the main reason for this work.


Question 14 asks if there is a written long-range educational facilities plan for the school. This item was included in the 1999 survey.


Question 15 asks about the use of qualified professionals within the last 5 years to perform inspection of the condition of the physical features of the facility, evaluation of energy use, and evaluation of indoor environmental hazards.


Question 16 asks about actions undertaken within the last 5 years to improve energy efficiency at the school. Actions included are replacing lighting fixtures, lighting ballasts, or bulbs; installing motion-sensors for lighting; upgrading insulation, outer walls, and/or siding; replacing windows and/or doors; installing or upgrading reflective roof coating; installing more efficient HVAC systems; and installing or upgrading an energy management system.


Question 17 asks whether there are significant problems with the facilities at the school that are not covered in this survey. If the response is “yes,” space is provided to describe the problems.


Survey Cost and Time Schedule


The survey is estimated to cost the federal government about $800,000, including about $750,000 for contractual costs and $50,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation.


Mailing of the survey is planned for January 2013. About 3 weeks after mail out of the survey, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 18 weeks after initial mail out.


Plan for Tabulation and Publication

The First Look report will be released on the NCES website in late 2013 and include explanatory text and tables. Districts with participating schools will be notified when NCES releases the report. A public use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Cross tabulations of data items will be made with selected classification variables, such as instructional level of the school, school enrollment size, community type (locale), geographic region, percent minority enrollment in the school, and percent of students eligible for free and reduced-price lunch.
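As an illustration of the planned weighted tabulations, the sketch below computes a weighted national estimate and a weighted cross tabulation with pandas. The data and column names are hypothetical; the public use file's actual layout may differ.

    import pandas as pd

    # Hypothetical extract of the public use file: one row per responding school.
    df = pd.DataFrame({
        "level":  ["Elementary", "Middle", "Elementary", "Secondary"],
        "rating": ["Good", "Fair", "Excellent", "Good"],  # overall condition (question 4)
        "weight": [52.1, 48.7, 55.3, 39.9],               # sampling weight
    })

    # Weighted national estimate of the percent of schools in each rating category.
    pct = df.groupby("rating")["weight"].sum() / df["weight"].sum() * 100
    print(pct.round(1))

    # Weighted cross tabulation of overall rating by instructional level.
    xtab = pd.pivot_table(df, values="weight", index="level",
                          columns="rating", aggfunc="sum", fill_value=0)
    print((xtab.div(xtab.sum(axis=1), axis=0) * 100).round(1))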



Statistical Methodology

Reviewing Statisticians

John Ralph, of NCES, is the Acting Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design. Westat is the contractor currently conducting the QRIS surveys for NCES.


Respondent Universe

The respondent universe for the FRSS survey on the condition of public school facilities will include all regular public schools in the United States. A stratified sample of 1,800 regular schools will be selected from the 2010-2011 NCES Common Core of Data (CCD) Public School Universe File. As indicated in Table 2, 86,753 regular schools are included in the CCD universe file, of which 50,990 are elementary schools, 16,577 are middle schools, and 19,186 are high schools or schools with combined elementary/secondary grades. Table 3 summarizes how the three instructional levels will be defined for sampling purposes based on the grades taught in the school. Note that the counts of schools in Table 2 pertain only to the 86,753 "regular" schools in the CCD file, and exclude special education, vocational, and alternative/other non-regular schools. Schools with a high grade of kindergarten or lower, ungraded schools, and schools in the outlying U.S. territories are also ineligible for the survey and are excluded from the counts in the table.



Sample Design

A stratified sample of 1,800 public schools will be selected for the survey, including 720 elementary schools, 540 middle schools, and 540 high schools. The proposed allocation is designed to permit separate analysis of the three instructional levels, without unduly increasing design effects for overall statistics involving all levels. For sampling purposes, strata will be defined by crossing instructional level (elementary, middle, and secondary) with five enrollment size classes (less than 300, 300 to 499, 500 to 999, 1,000 to 1,499, and 1,500+) and four type-of-locale categories (city, suburban, town, rural). Within each sampling stratum, schools will be sorted by region and categories of percent minority enrollment prior to sample selection to induce additional implicit stratification. For each instructional level, the specified total sample size will be allocated to strata in rough proportion to the aggregate square root of the enrollment of the schools in the stratum. The use of the square root of enrollment for sample allocation purposes is a compromise between proportional allocation (which is approximately optimum for estimating the number or proportion of schools with a specified characteristic), and allocation in proportion to enrollment (which is approximately optimum for estimating aggregates that are correlated with school size, e.g., building square footage). After the stratum sample sizes have been determined, the required numbers of schools will be selected systematically from the sorted file using independent random starts. In addition to analyses by instructional level, the proposed sample design will permit separate analysis (along a single dimension) by locale, broad enrollment size class, OE region, and minority enrollment. Assuming a response rate of 90 percent, the initial sample of 1,800 will yield approximately 1,620 completed questionnaires.
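A minimal sketch of the square-root allocation and systematic selection described above, shown for a single instructional level. The strata and enrollments below are hypothetical stand-ins; the actual frame is the sorted CCD file.

    import math
    import random

    # Hypothetical strata: stratum id -> list of school enrollments.
    strata = {
        "city, 500 to 999": [620, 750, 830] * 60,
        "rural, under 300": [140, 210, 260] * 40,
    }
    total_sample = 100  # illustrative; the real design allocates 720/540/540 by level

    # Allocate the sample in proportion to the aggregate square root of enrollment.
    sqrt_sums = {s: sum(math.sqrt(e) for e in schools) for s, schools in strata.items()}
    grand = sum(sqrt_sums.values())
    alloc = {s: round(total_sample * v / grand) for s, v in sqrt_sums.items()}
    # (Rounding may require a small adjustment so the allocations sum exactly.)

    def systematic_sample(frame, n):
        """Select n units systematically from the sorted frame with a random start."""
        interval = len(frame) / n
        start = random.uniform(0, interval)
        return [frame[int(start + i * interval)] for i in range(n)]

    for s, schools in strata.items():
        frame = sorted(schools)  # in practice, sorted by region and percent minority
        picked = systematic_sample(frame, alloc[s])
        print(s, "allocated:", alloc[s], "selected:", len(picked))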


The approximate sample sizes and the corresponding estimates of sampling precision to be expected under the proposed design are summarized in Table 4 for selected subgroups. Since the results in Table 4 are based on preliminary tabulations of the CCD file, the actual sample sizes may differ somewhat from those shown. Also, note that the sample sizes in Table 4 represent the expected numbers of completed questionnaires, not the numbers of schools to be selected. The standard errors in Table 4 can be converted to 95 percent confidence bounds by multiplying the entries by 2. Thus, for example, an estimated proportion on the order of 20 percent (P = 0.20) for elementary schools would be subject to a margin of error of ±3.2 percent (at the 95 percent confidence level).
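The entries in Table 4 are consistent with the usual simple random sampling formula for the standard error of a proportion, inflated by a design effect. A sketch (the design effect range of 1.1 to 1.3 is taken from the table's footnote):

    import math

    def se_proportion(p, n, deff=1.3):
        """Approximate standard error of an estimated proportion under the design."""
        return math.sqrt(deff * p * (1 - p) / n)

    # Table 4 entry for the total sample at P = 0.50:
    se = se_proportion(0.50, 1620)
    print(round(se, 3), "95% margin of error:", round(2 * se, 3))  # 0.014, about 0.028

    # Elementary schools at P = 0.20 (design effect closer to 1.1 for this subgroup):
    print(round(se_proportion(0.20, 648, deff=1.1), 3))  # about 0.016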


Estimation and Calculation of Sampling Errors

For estimation purposes, sampling weights reflecting the overall probabilities of selection under the proposed design will be attached to each data record. These weights will include upward adjustments for nonresponse. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the proposed jackknife replication approach, 100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as "replicate weights") will then be generated for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the 100 jackknife replicates. The mean square deviation of the replicate estimates around the full-sample estimate then provides a measure of the variance of the survey statistic, from which its standard error is obtained. Previous surveys using similar sample designs have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
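A minimal sketch of the jackknife replication variance estimator described above. The data, weights, and replicate groups are simplified stand-ins; the production replicates are formed to mirror the full sample design.

    import numpy as np

    rng = np.random.default_rng(7)
    n, R = 1620, 100
    y = rng.binomial(1, 0.3, size=n).astype(float)  # hypothetical 0/1 survey item
    w = rng.uniform(40.0, 70.0, size=n)             # full-sample weights
    groups = np.arange(n) % R                       # simplified replicate groups

    full_est = np.sum(w * y) / np.sum(w)            # full-sample weighted estimate

    rep_ests = np.empty(R)
    for r in range(R):
        wr = w.copy()
        drop = groups == r
        wr[drop] = 0.0
        # Reweight remaining cases so each replicate preserves the weight total.
        wr[~drop] *= np.sum(w) / np.sum(wr[~drop])
        rep_ests[r] = np.sum(wr * y) / np.sum(wr)

    # Delete-a-group jackknife: variance is the scaled sum of squared deviations
    # of the replicate estimates from the full-sample estimate.
    variance = (R - 1) / R * np.sum((rep_ests - full_est) ** 2)
    print("estimate:", full_est, "standard error:", np.sqrt(variance))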



Table 2. Number of regular public schools and enrollment in the 2010-2011 CCD public school universe file, by instructional level and size class

Instructional level*   Enrollment size class   Number of schools   Enrollment
Elementary             Less than 300                      12,202    1,830,300
                       300 to 499                         18,807    7,522,800
                       500 to 999                         19,052   14,289,000
                       1,000 or more                         929    1,192,600
Middle                 Less than 300                       3,779      566,850
                       300 to 499                          3,711    1,484,400
                       500 to 999                          7,313    5,484,750
                       1,000 or more                       1,774    2,315,950
Secondary/combined     Less than 300                       6,014      902,100
                       300 to 499                          3,297    1,318,800
                       500 to 999                          4,157    3,117,750
                       1,000 or more                       5,718    9,540,000
Total**                                                   86,753   49,565,300


* See Table 3 for definitions.


** The counts in this table are based on data in the 2010-2011 CCD public school universe file, and exclude special education, vocational, and alternative/other schools, schools with a high grade of kindergarten or lower, ungraded schools, and schools in the outlying U.S. territories.





Table 3. Definition of instructional level categories for the school facilities survey

Low                             High grade
grade    1    2    3    4    5    6    7    8    9   10   11   12
PK       E    E    E    E    E    E    E    E    C    C    C    C
K        E    E    E    E    E    E    E    E    C    C    C    C
1        E    E    E    E    E    E    E    E    C    C    C    C
2             E    E    E    E    E    E    E    C    C    C    C
3                  E    E    E    E    E    E    C    C    C    C
4                       M    M    M    M    M    M    C    C    C
5                            M    M    M    M    M    C    C    C
6                                 M    M    M    M    C    C    C
7                                      M    M    M    C    C    C
8                                           M    M    C    C    C
9                                                M    S    S    S
10                                                    S    S    S
11                                                         S    S
12                                                              S

E = Elementary    M = Middle/junior high    S = Senior high    C = Combined
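The classification rule in Table 3 can be written compactly as a function of a school's low and high grades. A sketch, encoding PK as -1 and K as 0:

    def instructional_level(low_grade: int, high_grade: int) -> str:
        """Classify a school per Table 3 (PK encoded as -1, K as 0)."""
        if low_grade <= 3:
            return "Elementary" if high_grade <= 8 else "Combined"
        if low_grade <= 8:
            return "Middle/junior high" if high_grade <= 9 else "Combined"
        if low_grade == 9:
            return "Middle/junior high" if high_grade == 9 else "Senior high"
        return "Senior high"

    # Examples: K-5 is elementary, 6-8 is middle, 9-12 is senior high, K-12 is combined.
    assert instructional_level(0, 5) == "Elementary"
    assert instructional_level(6, 8) == "Middle/junior high"
    assert instructional_level(9, 12) == "Senior high"
    assert instructional_level(0, 12) == "Combined"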





Table 4. Expected sample sizes (number of completed interviews) and corresponding standard errors for the school facilities survey, by selected analytic domains

                                         Standard error† of an estimated
                          Expected       proportion equal to ...
Subset of sample          sample size*   P = 0.20   P = 0.33   P = 0.50
Total sample                    1,620       0.011      0.013      0.014

Instructional level
  Elementary                      648       0.016      0.019      0.021
  Middle                          486       0.019      0.022      0.024
  Sec/combined                    486       0.020      0.024      0.025

Type of locale
  Central city                    425       0.022      0.026      0.028
  Urban fringe                    498       0.020      0.024      0.026
  Town                            215       0.031      0.037      0.039
  Rural                           482       0.021      0.024      0.026

Size of school
  Less than 300                   257       0.028      0.033      0.036
  300 to 499                      351       0.024      0.029      0.030
  500 to 999                      680       0.017      0.021      0.022
  1,000 +                         333       0.025      0.029      0.031






* Expected number of completed questionnaires, assuming a 90 percent response rate.

† Standard errors include unequal weighting design effects ranging from 1.1 to 1.3, depending on the subgroup.

1 This estimate is the average amount of time district staff respondents reported the questionnaire took to complete during the two pretest rounds.
