Schools and Staffing Survey
2007-2008 Full-Scale
OMB Supporting Statement
November 2006
Table of Contents

A. Justification
1. Circumstances Making Collection of Information Necessary
2. Purposes and Uses of SASS
3. Appropriate Use of Information Technology
4. Efforts to Identify Duplication
5. Methods Used to Minimize Burden on Small Entities
6. Frequency of Data Collection
7. Special Circumstances of Data Collection
8. Consultants Outside the Agency
9. Provision of Payments or Gifts to Respondents
10. Assurance of Confidentiality
11. Sensitive Questions
12. Estimates of Hour Burden for Information Collection
13. Estimates of Costs
14. Costs to the Federal Government
15. Reasons for Changes in Response Burden and Costs
16. Time Schedule for Field Tests and Full-Scale SASS
17. Approval to Not Display Expiration Date for OMB Approval
18. Agency Contact
B. Collection of Information Employing Statistical Methods
1. Respondent Universe
2. Statistical Procedures for Collecting Information
3. Procedures for Collection of Information
4. Methods for Maximizing Response Rates
5. Tests of Procedures and Methods
6. Reviewing Statisticians
C. Justifications for Questionnaire Content
1. Teacher Quality and Career Paths
2. Professional Development
3. School Reform
4. Parental Involvement
5. School Safety
6. Basic Descriptive Information
7. Library Media Center
8. Charter School
The materials in this document support a request for clearance to conduct the full-scale Schools and Staffing Survey (SASS) in 2007-2008. The results of a methodological field test in 2005-2006, together with other design and development activities, have led to changes in SASS field procedures, sampling design, and questionnaires. This package describes the planned new approach and updates information in other sections of the clearance package.
SASS has been conducted five times previously: in 1987-88, 1990-91, 1993-94, 1999-2000, and 2003-04 (OMB number 1850-0598). The basic components and key design features of the full-scale SASS are summarized below.
The need for comprehensive data on teachers, school administrators, and school policies and programs has been well established in recent years. In response to long-standing concerns about the status of teaching and education, State and local educational policymakers have sought more information about the composition of the school workforce and about policies affecting the recruitment, retention, and retirement of teachers. The full implementation of the No Child Left Behind Act has added scrutiny of hiring policies for paraprofessional instructional aides and principals.
The National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), U.S. Department of Education, is conducting this study, as authorized under Public Law 107-279, Title I, Part C, Sections 151(b) and 153(a) of the Education Sciences Reform Act of 2002, which states:
“The Statistics Center shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -
(1) collecting, acquiring, compiling (where appropriate, on a State-by-State basis), and disseminating full and complete statistics (disaggregated by the population characteristics described in paragraph (3)) on the condition and progress of education, at the preschool, elementary, secondary, postsecondary and adult levels in the United States, including data on -
(A) State and local education reform activities;
(B) State and local early childhood school readiness activities;
(C) student achievement in, at a minimum, the core academic areas of reading, mathematics, and science at all levels of education;
(D) secondary school completions, dropouts, and adult literacy and reading skills;
(E) access to, and opportunity for, postsecondary education, including data on financial aid to postsecondary students;
(F) teaching, including---
(i) data on in-service professional development, including a comparison of courses taken in the core academic areas of reading, mathematics, and science with courses in noncore academic areas, including technology courses; and
(ii) the percentage of teachers who are highly qualified (as such term is defined in section 9101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801)) in each State and, where feasible, in each local educational agency and school;
(G) instruction, the conditions of the education workplace, and the supply of, and demand for, teachers;
(H) the incidence, frequency, seriousness, and nature of violence affecting students, school personnel, and other individuals participating in school activities, as well as other indices of school safety, including information regarding ---
(i) the relationship between victims and perpetrators; and
(ii) the type of weapons used in incidents, as classified in the Uniform Crime Reports of the Federal Bureau of Investigation;
(I) the financing and management of education, including data on revenues and expenditures;
(J) the social and economic status of children;
(K) the existence and use of educational technology and access to the Internet by students and teachers in elementary and secondary schools;
(L) access to, and opportunity for, early childhood education;
(M) the availability of, and access to, before-school and after-school programs (including such programs during school recesses);
(N) student participation in and completion of secondary and postsecondary vocational and technical education programs by specific program areas; and
(O) the existence and use of school libraries;
(2) conducting and publishing reports and analyses of the meaning and significance of the statistics described in paragraph (1);
(3) collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, information by gender, race, ethnicity, socioeconomic status, limited English proficiency, mobility, disability, urban, rural, suburban districts, and other population characteristics when such disaggregated information will facilitate educational and policy decisionmaking;
(4) assisting public and private educational agencies, organizations, and institutions in improving and automating statistical and data collection activities, which may include assisting State educational agencies and local educational agencies with the disaggregation of data and with the development of longitudinal student data systems;
(5) determining voluntary standards and guidelines to assist State educational agencies in developing statewide longitudinal data systems that link individual student data consistent with the requirements of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 6301 et seq.), promote linkages across States, and protect student privacy consistent with section 183, to improve student academic achievement and close achievement gaps;
(6) acquiring and disseminating data on educational activities and student achievement (such as the Third International Mathematics and Science Study) in the United States compared with foreign nations;
(7) conducting longitudinal studies, as well as regular and special surveys and data collections, necessary to report on the condition and progress of education;
(8) assisting the Director in the preparation of a biennial report, as described in section 119; and
(9) determining, in consultation with the National Research Council of the National Academies, methodology by which States may accurately measure graduation rates (defined as the percentage of students who graduate from secondary school with a regular diploma in the standard number of years), school completion and dropout rates.”
Activities for SASS fall under paragraphs (1), (2), (3), and (7) of subsection (a), quoted above.
The Center assures participating individuals and institutions that any data collected under the 2007-2008 full-scale SASS will be handled in full conformity with NCES standards for protecting the privacy of individuals. Section 401(a)(3)(A) of the National Education Statistics Act of 1994, as amended, states that:
“DISCLOSURE. --- No Federal department, bureau, agency, officer, or employee and no recipient of a Federal grant, contract, or cooperative agreement may, for any reason, require the Director, any Commissioner of a National Education Center, or any other employee of the Institute to disclose individually identifiable information that has been collected or retained under this title.” (20 U.S.C. 9007, 9573).
Similarly, the Education Sciences Reform Act of 2002 states that:
“No person may -
(A) use any individually identifiable information furnished under the provisions of this section for any purpose other than statistical purposes for which it is supplied, except in the case of terrorism [note: see discussion below on the Patriot Act];
(B) make any publication whereby the data furnished by any particular person under this section can be identified; or
(C) permit anyone other than the individuals authorized by the Commissioner to examine the individual reports.” (Section 408(a)(2) of the National Education Statistics Act of 1994, as amended by the Education Sciences Reform Act of 2002, P.L. 107-279, Title I, Part C, Section 183, also Title IV, Section 401 redesignations, 20 U.S.C. 9007, 9573).
Penalties for misuse of information have been established in Section 408(b)(6) for which:
“Any person who uses any data provided by the Center, in conjunction with any other information or technique, to identify any individual student, teacher, administrator, or other individual and who knowingly discloses, publishes, or uses such data for a purpose other than a statistical purpose, or who otherwise violates subparagraph (A) or (B) of subsection (a)(2) of this section, shall be found guilty of a…felony and imprisoned …, or fined…, or both.”
The U.S.A. Patriot Act of 2001, (Public Law 107-56) permits the Attorney General to petition a court of competent jurisdiction for an ex parte order requiring the Secretary of the Department of Education to provide data relevant to an authorized investigation or prosecution of an offense concerning national or international terrorism. The law states that any data obtained by the Attorney General for such purposes “...may be used consistent with such guidelines as the Attorney General, after consultation with the Secretary, shall issue to protect confidentiality.” This law was incorporated into the Education Sciences Reform Act of 2002.
The Federal Statistical Confidentiality Order of 1997, an OMB directive, provides a consistent government policy for “…protecting the privacy and confidentiality interests of persons who provide information for Federal statistical programs…” The Order defines relevant terms and provides guidance on the content of confidentiality pledges that Federal statistical programs should use under different conditions. The Order provides language for confidentiality pledges under two conditions – first, when the data may only be used for statistical purposes; second, when the data are collected exclusively for statistical purposes, but the agency is compelled by law to disclose the data. Since the U.S. Patriot Act of 2001 includes a legal requirement that compels NCES to share the data under the conditions specified in the law, the second condition applies to NCES. In this case, the Order instructs the agency to “…at the time of collection, inform the respondents from whom the information is collected that such information may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law.”
Prior and Related Studies
Before the Schools and Staffing Survey, NCES conducted separate surveys of public and private schools, principals, teachers, and school districts under the Elementary and Secondary General Information System. The 1986 National Research Council report, "Creating a Center for Education Statistics: A Time for Action," noted:
"It is essential that any system of collecting education data recognize, reflect and react to the issue of timeliness…An example of such a lack is the case of teaching and teachers. With the publication of A Nation at Risk (National Commission on Excellence in Education, 1983), these topics emerged as fundamental issues of concern, and the need for data was sudden and immediate. Unfortunately, information on the number of teachers and other professional staff – which we would think would be an essential element of any continuing data system – was last collected at the elementary and secondary level in 1979-80. Data on minority teachers is even more archaic, having last been collected in 1968."
This report prompted a number of fundamental changes at the National Center for Education Statistics (formerly the Center for Education Statistics). Among them was the establishment of the Schools and Staffing Survey, which collects data on a periodic basis for schools, principals, teachers, districts, and school libraries, all in the same survey year. The survey was designed to provide data at the state level for public schools and at the "affiliation" level for private schools.
The General Education Provisions Act, as amended (20 U.S.C. 1221e-1), specified that the National Center for Education Statistics (NCES) design an integrated survey system called the Schools and Staffing Survey (SASS). Legislative authority for NCES to collect data through surveys was reauthorized under the Improving America's Schools Act of 1994 and most recently by the provisions of the Education Sciences Reform Act of 2002 cited above. SASS was first fielded in school year 1987-88 and again in 1990-91 and 1993-94; after a 6-year pause for major survey design revisions, it was next fielded in 1999-2000. The most recent administration was in 2003-04. SASS is scheduled for its next collection during the 2007-2008 school year and is expected to continue on a 4-year cycle for the foreseeable future (2011-2012, 2015-2016, etc.).
The National Education Association compiles a report on America's teachers every 5 years, but most of its data elements differ from those asked in SASS. The National Education Association and the American Federation of Teachers collect data on public school teacher salaries from state education associations, but there are no comparable data sources for private school teachers. Educational Research Service collects data on staffing for public schools but not for private schools. There are no other sources of data on Bureau of Indian Affairs-sponsored schools, principals, or teachers.
Congress, the Department of Education and other Federal agencies, State departments of education, education associations, and the education research community will use data from the 2007-2008 full-scale SASS to answer policy questions on a range of topics. In the past, some of the most heavily published subject matter has included average class size, the number of new teachers, out-of-field teaching, professional development, teacher attrition and retention, and teacher qualifications. Based on previous administrations of SASS, the data will be used to:
A. Assess and monitor teacher capacity as defined by teacher qualifications, teacher career paths, professional development needs and activities, and support for these aspects of teachers' careers by the school and the district;
B. Assess and monitor school capacity as defined by school organization and decision-making, management of curriculum and instruction, school safety, and parental involvement;
C. Assess and monitor district capacity as defined by policies over the recruitment, retention, and retirement of teachers; and
D. Collect national statistics on school libraries, which the Federal government has done since 1958. The survey is designed to provide a national picture of school library media center staffing, collections, technology, and services.
Information on public school districts, schools, principals, teachers, and school library media centers is available at the state level and is used extensively by state policymakers.
The U.S. Census Bureau has conducted every collection of the SASS and will administer the 2007-2008 full-scale SASS. The high quality of SASS data is, in part, attributable to the experience and expertise of the Census Bureau headquarters and field staff. The Census Bureau staff uses its expertise to apply technology appropriately to keep respondent burden to a minimum.
Potential respondents will be able to send questions to Census Bureau staff by e-mail as well as by telephone. However, because there are no national lists of district, school, principal, teacher, or school library e-mail addresses, the Census Bureau has no systematic way to contact sampled respondents by e-mail. Staff of the Common Core of Data, the universe from which the public school sample is drawn, contacted states to determine whether it was feasible to gather e-mail addresses. There was no consensus: some states supported the idea, while many others believed there was no way to guarantee the accuracy of such lists, given personnel turnover. E-mail addresses for elementary and secondary school staff are tied to the person's name; there are no generic addresses for schools, and the addresses are not publicly available from school districts. Because the survey is voluntary, there is no way to guarantee that e-mail addresses can be gathered from all sampled units as part of the survey collection. Until e-mail addresses become easier to gather, it will be impractical to use e-mail as a systematic electronic pathway for contacting respondents and urging them to file an electronic survey response. Consequently, in the 2007-08 SASS, we will attempt to collect e-mail addresses from both public and private school teachers, primarily for use in the 2008-09 Teacher Follow-up Survey. As part of the 2005-06 SASS methodological field test, sampled teachers were asked to provide both home and work e-mail addresses. This effort yielded a high percentage of addresses, about two-thirds of which, when subsequently tested, proved valid.
From 1999 to 2006, the National Center for Education Statistics (NCES) offered respondents an Internet reporting option on some rounds of the Private School Universe Survey (PSS) and on some components of SASS. These included the:
1999-2000 Library Media Center Questionnaire (LMC, a component of SASS),
2001-02 PSS,
2004-05 Teacher Follow-up Survey (TFS, a follow-up survey on a sample of SASS teachers), and
2005-06 PSS.
All of these collections follow the same data collection steps:
Mail Phase: First, the questionnaire is mailed to the school (or to the teacher in the case of the TFS). Approximately a week later, a reminder postcard is sent. Approximately 4 to 5 weeks after the initial mailout, a second questionnaire is sent to non-respondents.
Telephone Phase: A few weeks (depending on the survey) after the second mailout, remaining non-respondents are contacted by phone to complete the questionnaire. The telephone phase lasts one to two months.
Field Phase: Remaining non-respondents are assigned to field staff who attempt to get the questionnaires completed by phone or personal visit.
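As a rough illustration of the three-phase schedule above, the contact dates can be sketched in code. The exact offsets vary by survey, so the values below are assumptions for illustration, not the operational schedule.

```python
from datetime import date, timedelta

def contact_schedule(initial_mailout: date) -> dict:
    """Sketch of the mail/telephone/field contact schedule described above.
    Offsets are illustrative; actual timing varies by survey."""
    return {
        "initial_mailout": initial_mailout,
        "reminder_postcard": initial_mailout + timedelta(weeks=1),
        # Second questionnaire goes to non-respondents 4-5 weeks after mailout.
        "second_mailout": initial_mailout + timedelta(weeks=4),
        # Telephone follow-up starts a few weeks after the second mailout.
        "telephone_phase_start": initial_mailout + timedelta(weeks=7),
        # Field staff take over after one to two months of phone follow-up.
        "field_phase_start": initial_mailout + timedelta(weeks=15),
    }

schedule = contact_schedule(date(2007, 10, 1))
```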
The Internet option was offered at the beginning of the mail phase, along with the mailed questionnaire, in the LMC and the 2005-06 PSS. The 2001-02 PSS and the 2004-05 TFS varied the timing of the Internet option relative to the first mailout: either one week before (with or without mentioning that a mail questionnaire would be forthcoming) or concurrently. The Internet option was offered along with the second mailouts as well. The intended respondents were notified of the Internet option by mail. Except for the LMC, once telephone follow-up began, the Internet option ended.
The text refers to various response rates, which are defined as follows:
The self-administered response rate is the percent of completed returns during the mail phase (before the start-up of telephone follow-up). It is the number of completed mail questionnaires (and Internet, when offered) divided by the number of cases mailed (in every survey, every case was offered a mail questionnaire).
The Internet response rate is the number of completed Internet returns divided by the number of cases offered the Internet option (the number varied by survey).
The final response rate is the number of completed questionnaires from any mode divided by the number of cases in the sample reduced by the number of out-of-scope cases.
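The three rate definitions above amount to simple ratios. The sketch below computes them for a purely hypothetical sample; every count is invented for illustration and does not come from any actual SASS or PSS collection.

```python
def response_rates(mailed, completed_mail, completed_internet,
                   offered_internet, completed_any, sample_size, out_of_scope):
    """Compute the three response rates defined in the text.
    Every sampled case receives a mail questionnaire; only some are
    offered the Internet option."""
    # Self-administered rate: mail + Internet completes before phone follow-up.
    self_administered = (completed_mail + completed_internet) / mailed
    # Internet rate: Internet completes among cases offered that option.
    internet = completed_internet / offered_internet
    # Final rate: completes from any mode over the in-scope sample.
    final = completed_any / (sample_size - out_of_scope)
    return self_administered, internet, final

# Hypothetical example: 1,000 sampled cases, 20 out of scope, 980 mailed,
# 400 mail returns and 90 Internet returns (of 500 offered the option),
# 930 completes overall after telephone and field follow-up.
sa, net, fin = response_rates(mailed=980, completed_mail=400,
                              completed_internet=90, offered_internet=500,
                              completed_any=930, sample_size=1000,
                              out_of_scope=20)
```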
The following sections provide the rationale and results for the specific methodologies used with each survey and make recommendations for subsequent rounds of PSS and SASS. Specifically, the following sections cover:
Summary of the impact on self-administered and final response of including an Internet reporting option;
Summary of the Internet response rates on the LMC, PSS, and TFS.
Summary of the impact on self-administered and final response of including an Internet reporting option
This section examines the impact of adding an Internet option on response rates. Offering the Internet option in addition to the mail questionnaire had two potential advantages. First, if it increased self-administered response, the costs of telephone and field follow-up would be reduced because there would be fewer cases to follow up. Second, there would be potential data processing advantages: the Internet instrument captures the data electronically, applies some edit checks at entry, and could expedite the release of the final results.
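Entry-time edit checks of the kind mentioned above might look like the following sketch. The field names and rules are hypothetical illustrations, not the actual SASS instrument logic.

```python
def edit_checks(record: dict) -> list:
    """Return a list of problems found in a submitted record.
    Field names and thresholds are invented for illustration only."""
    problems = []
    # Range check: counts cannot be negative.
    if record.get("total_teachers", 0) < 0:
        problems.append("teacher count must be non-negative")
    # Consistency check: a component cannot exceed its total.
    if record.get("fte_teachers", 0) > record.get("total_staff", float("inf")):
        problems.append("FTE teachers cannot exceed total staff")
    # Plausibility check: students reported but no teachers.
    if record.get("enrollment", 0) > 0 and record.get("total_teachers", 0) == 0:
        problems.append("school reports students but no teachers; please verify")
    return problems

# A record that trips the plausibility check:
issues = edit_checks({"enrollment": 300, "total_teachers": 0})
```

An Internet instrument can surface such messages immediately, whereas a mailed form cannot, which is the data processing advantage described above.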
The 2001-02 PSS and 2004-05 TFS were designed to measure the impact of including an Internet option on self-administered response. These research projects followed a 2000-01 study on the American Community Survey (ACS) that similarly tested the inclusion of an Internet option along with a mail questionnaire.1 The results of these studies are shown in Table 1. Table 1 also includes response rate results from the LMC study, which included two Internet treatments and compared their self-administered and final response rates. The LMC study differed from the other three in that it did not specifically measure response of mail-only versus mail-plus-Internet groups.
Table 1. Impact of Including an Internet Option on Response in Four Studies

2001-02 PSS. Control panel was mail only. Panel 2 offered mail and Internet concurrently. Panel 3 offered Internet one week before mail, with mention of a mail questionnaire. Panel 4 offered Internet one week before mail, without mention of a mail questionnaire.

                               Mail    Mail and      Internet 1 wk     Internet 1 wk      Mail and
                               only    Internet      before, with      before, without    Internet
                                       concurrently  mention of mail   mention of mail    combined
  Self-administered response   55.9%   55.3%         57.5%             57.0%              56.6%
  Final response               95.7%   95.8%         95.3%             95.7%              95.6%

2004-05 TFS. Control panel was mail only. Panel 2 offered Internet one week before mail, with mention of a mail questionnaire. Panel 3 offered Internet one week before mail, without mention of a mail questionnaire. Each panel was subdivided into incentive/non-incentive groups, not broken out in this table.

                               Mail    Internet 1 wk     Internet 1 wk      Mail and
                               only    before, with      before, without    Internet
                                       mention of mail   mention of mail    combined
  Self-administered response   48.8%   42.4%             45.4%              43.9%

2000-01 ACS. Control panel was mail only. Experimental group offered mail and Internet. Results for the November, December, and January panels combined.

                               Mail    Mail and
                               only    Internet
                                       concurrently
  Self-administered response   43.6%   37.8%

1999-2000 LMC. Entire sample was offered both mail and Internet. Half were given greater encouragement to respond by Internet at each stage of data collection.

                               Greater          Less
                               encouragement    encouragement
  Self-administered response   36%              45%
  Final response               86%              86%
Overall, these studies indicate that adding the Internet option had a neutral or negative impact on initial response rates. When the mail and Internet options were offered concurrently, the results of the TFS and ACS studies showed that the self-administered response rate of the mail-only group exceeded that of the mail-plus-Internet group. While the self-administered response rates of the mail-plus-Internet groups exceeded that of the mail-only group in the PSS study, the difference was slight (and disappeared in the final response rates). While the LMC did not compare mail only versus mail plus Internet, the initial response of the group receiving greater encouragement to report by Internet was significantly lower.
The 2005-06 PSS offered the Internet option to nearly all respondents but did not include a design to measure self-administered response against a control group. However, its self-administered response rate was 0.8 percentage points lower than in the 2001-02 PSS, when one-third of the schools were offered an Internet option, and 3.8 percentage points lower than in the 2003-04 PSS, when no Internet option was offered.
The TFS and PSS studies also tested the self-administered response rates of Internet plus mail versus mail only when the Internet option was offered one week in advance of the mailout. Each study had two panels of these cases—the first mentioning that there would be a mail option, and the second not mentioning it. In both studies, the better response rates were achieved by the groups who were not informed in advance of the mail option.
The final response rates did not differ between control groups and experimental groups in any study.
The data from the questionnaires will be scanned optically, with keying only of write-in entries. The optical scanning system to be used was employed successfully in the 2002 Economic Census.
All available data sources were examined to determine that comparable data were not available elsewhere. Continuing discussions with state education agencies, private school associations, and other data providers and users, as well as continuing review of other data sources within NCES and other Federal agencies and programs, indicate that similar information is not available. The National Assessment of Educational Progress (NAEP) has background questionnaires for teachers, but only for those teaching 4th, 8th, and 12th grades. The Common Core of Data (CCD) collects data about all public elementary and secondary schools; the data collected about schools and staff, however, are mostly limited to general descriptive information (e.g., names of schools or districts, contact information for the school or district, summary counts of teachers or other staff). Since 1997, the CCD has expanded its collections to include items on whether the school has Title I services and whether those services are school-wide or targeted assistance. Consequently, it is not necessary for SASS to collect data on whether the school has Title I services, although data on the number of children in the program will be collected. The National Education Longitudinal Study (NELS), begun in 1988, followed a cohort of students every other year, beginning with eighth grade. Although NELS included a teacher questionnaire, the survey focused on the students and their transitions from eighth grade into postsecondary institutions or the workforce. Among the aforementioned NCES elementary and secondary school surveys, only SASS produces extensive data that are nationally representative of kindergarten through twelfth grade teachers. States collect data on schools and teachers, but the data lack uniformity across states, preventing meaningful cross-state comparisons.
There is no other national survey yielding similar in-depth data on public school districts or public and private schools, principals, teachers or library media centers. SASS data will permit analysis not only within each of the components of SASS (i.e., the district file, school file, principal file, teacher file and library media center file), but also across components or files. The data link teachers and principals to their schools and schools to their respective school districts. This linkage across the different respondent groups makes the SASS data unique among school surveys.
The burden on small schools and districts is minimized in the full-scale SASS through a sample design in which schools are selected as a function of size, defined by the number of teachers. Small schools and districts will therefore be sampled at lower rates, because each accounts for a smaller share of the teacher population.
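Selecting schools as a function of teacher count is essentially probability-proportional-to-size (PPS) sampling. The sketch below illustrates the idea with a simple with-replacement draw weighted by teacher counts; the school names and counts are invented, and the actual SASS design uses stratified systematic PPS without replacement, so this is only an illustrative sketch.

```python
import random

def pps_sample(schools, n, seed=0):
    """Draw n schools with probability proportional to teacher count.
    With-replacement draw for simplicity; illustrative only."""
    rng = random.Random(seed)
    names = [s["name"] for s in schools]
    weights = [s["teachers"] for s in schools]  # size measure = teacher count
    return rng.choices(names, weights=weights, k=n)

# Hypothetical frame: per draw, the 5-teacher school is 24 times less
# likely to be selected than the 120-teacher school, which is how the
# design keeps burden on small schools low.
schools = [
    {"name": "Small Rural Elementary", "teachers": 5},
    {"name": "Mid-size Middle School", "teachers": 40},
    {"name": "Large Urban High School", "teachers": 120},
]
picked = pps_sample(schools, n=10)
```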
The SASS data collection should maintain a four-year cycle because it is important to collect high-quality, detailed data on schools and teachers while at the same time collecting high-quality assessment data through NAEP.
There are no circumstances that will require special data collection efforts.
Mr. Steve Tourkin
Chief, Education Surveys Branch
Demographic Surveys Division
U.S. Census Bureau
Washington, DC 20233
Mr. Dennis Schwanz
Chief, Longitudinal Surveys Branch
Demographic Statistical Methods Division
U.S. Census Bureau
Washington, DC 20233
Since its inception, the development of SASS has relied on the substantive and technical review and comment of people both inside and outside the Department of Education. Outside experts who have commented on proposed revisions to SASS include:
Eric Hanushek,
Hoover Institution,
Stanford University
Dan Goldhaber,
Evans School of Public Affairs
University of Washington
Cathy Roller,
Director of Research and Policy,
International Reading Association
Mike Podgursky,
Department of Economics,
University of Missouri-Columbia
Dale Ballou,
Vanderbilt University
Frederick Hess
American Enterprise Institute
Kenneth Wong
Brown University
Patrick Wolf
Georgetown Public Policy Institute
Georgetown University
Karen Levesque
MPR Associates
Data On Vocational Education
Consultations within the U.S. Department of Education were made with Zollie Stevenson, Libby Witt, and Bob Stonehill of the Office of Elementary and Secondary Education, Compensatory Education Programs.
SASS staff at NCES make presentations at an annual meeting with the Private School Group, consisting of representatives of private school associations. Key NCES staff involved in these meetings are Jeff Owings, Kathryn Chandler, Kerry Gruber, and Steve Broughman. Edie McArthur, the OMB liaison within NCES, made comments on the revised questionnaires.
In addition to review within NCES, the following individuals reviewed data collection methods and content:
Mr. Steve Tourkin
Chief, Education Surveys Branch
Demographic Surveys Division
Bureau of the Census
Washington, DC 20233
Mr. Dennis Schwanz
Chief, Longitudinal Surveys Branch
Demographic Statistical Methods Division
Bureau of the Census
Washington, DC 20233
The review of the library media center questionnaire included consultation with Ms. Julie A. Walker, Executive Director of the American Association of School Librarians.
Comments were received in response to the Federal Register notice; one such comment follows.
-----Original
Message-----
From: CAPE [mailto:[email protected]]
Sent:
Wednesday, October 04, 2006 11:15 PM
To: McArthur, Edith;
Schneider, Mark
Subject: Proposed Amendment to SASS
Questionnaire
Edie and Mark:

Sorry for the delayed response to your email messages. Just catching up from three days of out-of-office meetings.

Thanks for the chance to offer a suggestion regarding the wording of question 78 in the 2003-04 SASS Private School Questionnaire.

CURRENT LANGUAGE:

78. Of the students enrolled in this school, how many have an Individual Education Plan (IEP) because they have special needs?

PROPOSED LANGUAGE:

78. Of the students enrolled in this school, how many have an Individualized Education Program (IEP) or a Services Plan because they have special needs AND are receiving services financed by a public school district to address those needs?

78a. Of the students enrolled in this school who do not have an IEP or Services Plan (and thus are not included in your response to question 78), how many have special needs that would likely warrant an IEP were they enrolled in public school?

RATIONALE:

1. "IEP" stands for "Individualized Education Program," not "Individual Education Plan." See http://www.ed.gov/parents/needs/speced/iepguide/index.html

2. Students with special needs who are placed by their parents in private schools AND who are served by public school districts under IDEA are provided a Services Plan, rather than an IEP. See http://www.ed.gov/policy/speced/leg/idea/brief10.html

3. Even under my proposed rewording, question 78 would not get at what I believe is the purpose of the original question, namely, to identify the number of students in the school with special needs. The problem is that not all children with special needs in private schools are receiving services financed by a public school district, a condition that would trigger a Services Plan (for parentally placed students) or an IEP (for district-placed students -- i.e., students that a district places in a specialized private school because the district does not have the required resources to meet the child's needs). Thus, we also need my proposed question 78a to identify the students with special needs who have neither an IEP nor a Services Plan. The sum of the responses to 78 and 78a should yield the total number of students in the school with special needs.

I'm sure the survey experts will be able to tighten the wording.

Best,
Joe
-------------------------------------------------
Joe McTighe
Executive Director
Council for American Private Education (CAPE)
PMB 457
13017 Wisteria Drive
Germantown, MD 20874
Tel - 301-916-8460
Fax - 301-916-8485
E-Mail - [email protected]
Web - http://www.capenet.org
SASS Response to CAPE comments:
The terminology will be corrected to “Individualized Education Program.”
Further comments from the Office of Non-Public Education (included below) indicate that not all students with disabilities in private schools would necessarily be counted as having an IEP or a services plan. The intent is to describe the school in terms of the percentage of its students who have a learning or physical disability that requires special services or adaptations in order for those students to succeed academically. The terms Individualized Education Program and services plan apply when the student came from a public school and the local school district is paying for or providing the special services. However, at least some private school students have never enrolled in a public school and yet may require special services. Therefore, the revised item on the Private School questionnaire will use the term “students with disabilities” in the main question, and will refer to an IEP or services plan only in the instructional note for that item.
The revised items will be:
49a. Of the students enrolled in this school as of October 1, do any have a formally identified disability?
(A student with a formally identified disability may or may not have an IEP (Individualized Education Program) or Services Plan.)
*Do not include prekindergarten, postsecondary, or adult education students.
__ Yes
__ No (skip to item 50)
49b. How many students in this school have a formally identified disability?
_________ Students
50a. Does this school primarily serve students with disabilities? [Unchanged]
*If you marked “Special education school – primarily serves students with disabilities” for item 15, then please mark “Yes” below.
__ Yes (go to item 51a)
__ No
50b. How many students with a formally identified disability are in each of the following instructional settings?
All day in a regular classroom (100 percent of the school day)
__ None
________ Students
Most of the day in a regular classroom (80-99 percent of the school day)
__ None
________ Students
Some of the day in a regular classroom (40-79 percent of the school day)
__ None
________ Students
Little or none of the day in a regular classroom (0-39 percent of the school day)
__ None
________ Students
On the Private School Teacher questionnaire, the revised item will be:
13. Of all the students you teach at this school, how many have a formally identified disability?
If none, mark (X) the box.
___ None
_____________ Students
From internal review of the OMB clearance package, Edith McArthur commented that the term “English language learner (ELL)” has become standard U.S. Department of Education terminology for limited-English proficient (LEP) students.
RESPONSE: The term “English language learner” will be incorporated into the current LEP student items on the Public School, Public School (with District), Private School, Public School Teacher, and Private School Teacher questionnaires.
Per the finalized U.S. Department of Education guidance on maintaining, collecting, and reporting data on race and ethnicity to the U.S. Department of Education (Federal Register notice dated Monday, August 7, 2006, Volume 71, No. 151, Part V), aggregate counts of students and teachers by race/ethnicity can be reported as early as 2007-08 in either the new categories or the older ones. The older, 5-category scheme is to be phased out by the 2009-2010 school year. In the meantime, the Schools and Staffing Survey will follow the lead of the Common Core of Data/EDNET data collection for 2006-07.
The items below use the “hybrid” version of the race/ethnicity data collection categories:
5. Around the first of October, how many students enrolled in grades K-12 and comparable ungraded levels were –
* Do NOT include prekindergarten, postsecondary, or adult education students.
* If none, please mark (X) the None box. For (d) and (f), report the number of students if the data are available; otherwise, mark (X) the “Not reportable” box. Please report these student counts using the data categories currently used in this state.
a. Hispanic or Latino, regardless of race?
___ None
_________ Students
b. White, not of Hispanic or Latino origin?
___ None
_________ Students
c. Black, not of Hispanic or Latino origin?
___ None
_________ Students
d. Asian or Pacific Islander?
___ None
_________ Students
Of these, how many were –
Asian?
___ None
___ Not reportable
_________ Students
Native Hawaiian or other Pacific Islander?
___ None
___ Not reportable
_________ Students
e. American Indian or Alaska Native?
___ None
_________ Students
f. Two or more races?
___ None
___ Not reportable
_________ Students
Total students (sum of entries 5a-f)
___,______ Students
29. Of the full-time and part-time TEACHERS in this school around the first of October, how many were –
* If none, please mark (X) the None box. For (d) and (f), report the number of teachers if the data are available; otherwise, mark (X) the “Not reportable” box. Please report these teacher counts using the data categories currently reported by this state.
a. Hispanic or Latino, regardless of race?
___ None
_________ Teachers
b. White, not of Hispanic or Latino origin?
___ None
_________ Teachers
c. Black, not of Hispanic or Latino origin?
___ None
_________ Teachers
d. Asian or Pacific Islander?
___ None
_________ Teachers
Of these, how many were –
Asian?
___ None
___ Not reportable
_________ Teachers
Native Hawaiian or other Pacific Islander?
___ None
___ Not reportable
_________ Teachers
e. American Indian or Alaska Native?
___ None
_________ Teachers
f. Two or more races?
___ None
___ Not reportable
_________ Teachers
Total Teachers (sum of entries 29a-f)
____,______ Teachers
We will not provide any cash payment to survey respondents. The school respondent will be provided with a token non-cash gift of the Census Bureau Statistical Abstract of the United States CD, as part of the effort to encourage participation.
A plan for assuring the confidentiality of individual data has been developed by NCES and the Census Bureau. Under this plan, the 2007-2008 SASS will conform to federal regulations – specifically, the Privacy Act of 1974 (5 U.S.C. 552a), Privacy Act Regulations (34 CFR Part 5b), the Hawkins-Stafford Amendments of 1988 (P.L. 100-297), the Computer Security Act of 1987, NCES Restricted-Use Data Procedures Manual, the Federal Statistical Confidentiality Order of 1997 (an OMB directive), the U.S.A. Patriot Act of 2001 (P.L. 107-56), the E-Government Act of 2002, Title V, Subtitle A, Confidential Information Protection, the Education Sciences Reform Act of 2002 (P.L. 107-279) and the NCES Statistical Standards and Policies handbook. From the initial contact with the participants in this survey through all of the follow-up efforts, careful attention will be paid to informing potential survey respondents that NCES and the Census Bureau will protect the confidentiality of their personal data. The Census Bureau will collect the data for NCES by the authority of Public Law 107-279, Title I, Part C, Section 183 of the Education Sciences Reform Act of 2002 (20 USC 9573), which guarantees the confidentiality of respondents. In addition, the Census Bureau and NCES will treat the data as confidential, based on the Privacy Act of 1974 and as amended under the U.S.A. Patriot Act of 2001.
All survey respondents will be informed that this is a voluntary survey in a survey cover letter signed by the Associate Commissioner of NCES. The cover letter will also state that the data will only be reported in statistical summaries that preclude the identification of any individual teacher or principal participating in the survey, unless otherwise compelled by law.
The Census Bureau will collect data under an interagency agreement with NCES. The Census Bureau will maintain the individually identifiable questionnaires as confidential material. The required plan will include the following:
1. Provisions for data collection in the field;
2. Provisions to protect the data-coding phase required before machine processing;
3. Provisions to safeguard completed survey documents;
4. Authorization procedures to access or obtain files containing identifying information; and
5. Provisions to remove printouts and other outputs that contain teacher identification information from normal operation. Such materials will be maintained in secured storage areas and will be destroyed as soon as practical by burning.
The SASS questionnaires contain several items about salary and benefits. Federal regulations governing the administration of these questions, which fall under the provisions of the Privacy Act of 1974, require (a) clear documentation of the need for such information as it relates to the primary purpose of the study, (b) provisions to respondents which clearly inform them of the voluntary nature of participation in the study, and (c) assurances of confidential treatment of responses.
The collection of data related to salary is central to understanding key policy issues driving this study. The recruitment of new teachers and retention of experienced teachers are related to the salary and benefits offered by a school or school system. Information about salary in relation to teaching experience, educational background and working conditions is essential to gaining an understanding of factors affecting teacher compensation. Comparisons of teaching conditions across types of schools (i.e., public and private schools) require information about the salary and benefits packages of the different types of schools.
The advance notification letter to each district or school respondent explains that participation in the survey is voluntary. In addition, each questionnaire states on the inside cover, “We are conducting this survey with only a sample of school [principals or teachers]. Therefore, the unique data you contribute helps to represent your state or area more accurately. We encourage you to participate in this voluntary survey.”
Assurances of the confidentiality of respondents’ data are printed in that same inside cover letter on each questionnaire, where it is stated, “The information you provide will be combined with the information provided by others in statistical reports. No individual data that links your name, address, or telephone number with your responses will be included in the statistical reports.” This is also true of salary data – the data presented are always aggregated averages or medians of the range of salary data reported.
The sample size, projected number of responses, estimated average response time and the total estimated respondent burden for the 2007-2008 full-scale SASS are as follows:
Table 3. Estimates of hour burden of 2007-2008 SASS, including Teacher Listing Forms (TLF) and all questionnaires
Questionnaire | Sample Size | Projected Number of Responses | Estimated Average Response Time per Respondent (Minutes) | Total Hours
One-school district identification | 640 | 608 | 4 | 41
District contact information | 4,500 | 4,455 | 3 | 223
Screen schools | 14,000 | 12,600 | 8 | 1,680
Public School TLF | 10,200 | 10,037 | 30 | 5,019
Private School TLF | 3,600 | 3,028 | 30 | 1,514
BIA School TLF | 160 | 150 | 30 | 75
School District | 5,700 | 4,725 | 25 | 1,969
Public School | 9,995 | 8,076 | 40 | 5,384
Public School with District Items | 1,500 | 1,212 | 35 | 707
Private School | 3,600 | 2,732 | 45 | 2,049
Public School Principal | 10,360 | 8,516 | 25 | 3,548
Private School Principal | 3,600 | 2,696 | 25 | 1,123
Public School Teacher | 57,640 | 48,879 | 45 | 36,659
Private School Teacher | 11,000 | 9,064 | 40 | 6,043
Public School LMC | 9,900 | 8,128 | 35 | 4,741
TOTAL | 146,395 | 124,906 | | 70,775
NOTE: The Public School with District Items questionnaire is different from the Public School questionnaire in that it goes to all of the sampled public charter schools, traditional public schools that are the only school in their district, Bureau of Indian Affairs-funded schools, and any other publicly-funded school that is not part of a regular school district (such as state schools for the deaf). The Public School with District Items questionnaire includes all of the district items except those that are redundant with the public school questionnaire or do not apply to these types of schools.
Projected number of responses is based on the weighted response rate for each type of questionnaire, according to the results of the 2003-04 SASS. The average length of time necessary to complete each type of questionnaire is estimated from the finalized version of the questionnaires, which are reduced in length from those used in the 2003-04 SASS.
Respondents to the full-scale SASS will not incur any costs other than the value of their time to respond. The standard NCES procedure for estimating respondent cost is to multiply the estimated total survey reporting hours by the average hourly salary of school employees (assumed to be $26.00 per hour). The respondent dollar cost is estimated to be $1,840,150.
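As an illustrative consistency check (not part of the official burden computation), the per-row arithmetic in Table 3 and the Section 13 respondent cost estimate can be reproduced from the table's own figures: each Total Hours entry is projected responses times average minutes per response, divided by 60 and rounded to the nearest hour.

```python
# Projected responses and average response time (minutes) from Table 3.
rows = [
    ("One-school district identification", 608, 4),
    ("District contact information", 4455, 3),
    ("Screen schools", 12600, 8),
    ("Public School TLF", 10037, 30),
    ("Private School TLF", 3028, 30),
    ("BIA School TLF", 150, 30),
    ("School District", 4725, 25),
    ("Public School", 8076, 40),
    ("Public School with District Items", 1212, 35),
    ("Private School", 2732, 45),
    ("Public School Principal", 8516, 25),
    ("Private School Principal", 2696, 25),
    ("Public School Teacher", 48879, 45),
    ("Private School Teacher", 9064, 40),
    ("Public School LMC", 8128, 35),
]

def hours(responses, minutes):
    # responses * minutes / 60, rounded half-up using integer arithmetic
    return (responses * minutes + 30) // 60

total_hours = sum(hours(r, m) for _, r, m in rows)
print(total_hours)       # 70775, matching the TOTAL row of Table 3

# Section 13 respondent dollar cost: total hours x $26.00/hour
print(total_hours * 26)  # 1840150, i.e., $1,840,150
```

The half-up rounding convention is an assumption, but it reproduces every row of the table exactly.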
We are not imposing any additional costs on our respondents, other than those reported in Section 12, Estimates of Hour Burden for Information Collection.
The cost to the federal government for the full-scale SASS is $17.0 million over the 4-year data collection cycle for SASS. The Census Bureau estimates were compiled from individual estimates developed within each Census Bureau division involved in the survey. Estimates were based on the sample sizes, the length of the questionnaires and the data processing requirements. Administrative overhead, forms design, printing and personnel costs are included.
The response burden of the full-scale SASS will be reduced from that of the 2003-04 SASS. The initial response burden estimate was based on the 2005-2006 methodological field test; the updated estimate of response time is based on the finalized 2007-08 SASS questionnaires. Funding for the collection of data from the universe of charter schools will not be available for this cycle of SASS. Principals and teachers at Bureau of Indian Affairs-funded schools are included in the counts of respondents to the Public School Principal and Public School Teacher questionnaires, respectively. The number of BIA schools has been refined from the Bureau of Indian Affairs directory total of roughly 200 to the expected eligible school count of 160 (pre-kindergarten or postsecondary BIA-sponsored schools are not eligible, nor are dormitories). Private schools will not receive the Library Media Center questionnaire, due to budget constraints.
The operational schedule for conducting the full-scale SASS is as follows:
Mail advance letters to school districts: June 2007
Mail questionnaires to school districts: August 2007
Mail advance letters and questionnaires to schools (includes Teacher Listing Form (TLF), School, Library Media Center (LMC), and Principal questionnaires): August 27, 2007
Screen schools and establish school coordinators: August 29 – October 12, 2007
Conduct telephone follow-up of TLFs
Second mail-out of School, LMC, and Principal questionnaires: October/November 2007
Conduct field follow-up of TLFs: October 29 – November 16, 2007
Mail Teacher questionnaires weekly as they are sampled: November – December 2007
Telephone reminders to mail in questionnaires: November 26, 2007 – January 15, 2008
Telephone non-response follow-up: December 12, 2007 – February 28, 2008
Conduct field follow-up of all questionnaires: February 1 – 28, 2008
Conduct field follow-up of remaining Teacher and School District questionnaires: March 11 – April 30, 2008
Data capture of all questionnaires: January – May 2008
Data processing: May – September 2008
Completion of files, tabulations, and codebooks: October 2008
Currently the only planned publications are “first release” publications: at least one and perhaps up to five First Look reports (10-15 pages of tables and a few bullet points), a Teacher Qualifications report, and potentially a teacher salary report. Some web tables may also be released for topics not covered in the First Look or other reports but deemed important to cover.
We are not seeking approval to not display the expiration date of OMB approval.
Edie McArthur, (202) 502-7393, [email protected].
1 Griffin, D., Fischer, D., and Morgan, M. (2001). “Testing an Internet Response Option for the American Community Survey.” Paper presented at the Annual Conference of the American Association for Public Opinion Research, Montreal, Quebec, May.