Appendix G Sampling Design

National Assessment of Educational Progress (NAEP) 2017-2019

OMB: 1850-0928

NATIONAL CENTER FOR EDUCATION STATISTICS
NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS
Appendix G
NAEP 2011 Sample Design

Request for Clearance for
NAEP Assessments for 2017-2019
OMB# 1850-NEW v.1
(previous OMB# 1850-0790 v.43)

July 29, 2016


NAEP Technical Documentation Website
NAEP 2011 Sample Design
The sample design for NAEP 2011 included samples for various operational, special study, and pilot test assessments. Representative samples were drawn for the following operational
assessments:

national assessments in mathematics and reading in public and private schools at grades 4 and 8;
national assessments in computer-based writing (WCBA) in public and private schools at grades 8 and 12;
national assessments in science in public and private schools at grade 8;
state-by-state and Trial Urban District Assessments (TUDA) assessments in mathematics and reading in public schools at grades 4 and 8; and
state-by-state assessments in science in public schools at grade 8.
Representative samples were drawn for the following special study and pilot test assessments:
mathematics computer-based study (MCBS) in public schools at grade 8;
study to examine a direct link between NAEP and Trends in International Mathematics and Science Study (TIMSS) in public schools at grade 8;
special mathematics assessment in Puerto Rico in public and private schools at grade 4 and in public schools at grade 8; and
pilot tests in reading and mathematics in public and private schools at grade 4, in reading and mathematics in public schools at grade 8, and in economics in public schools at grade 12.

The samples for the operational assessments were organized into four distinct components and sampled separately; the samples for the special studies and pilot tests were integrated into these components. The four components were:
mathematics, reading, and science assessments of fourth- and eighth-grade students in public schools;
mathematics, reading, and science assessments of fourth-grade and eighth-grade students in private schools;
computer-based writing assessments and mathematics study of eighth-grade and twelfth-grade students in public schools; and
computer-based writing assessments of eighth-grade and twelfth-grade students in private schools.
The national assessments were designed to achieve nationally representative samples of public and private school students in the fourth, eighth, and twelfth grades. Their target populations included all students in public, private, Bureau
of Indian Education (BIE), and Department of Defense Education Activity (DoDEA) schools, who were enrolled in fourth, eighth, and twelfth grades, respectively, at the time of assessment.
For the fourth- and eighth-grade mathematics, reading, and science assessments in public schools, the NAEP state student samples and assessments constitute the NAEP national student samples and assessments. Nationally
representative samples were drawn for the remaining populations of private school students in fourth and eighth grades.
The TUDA samples formed part of the corresponding state public school samples, and the state samples formed the public school grade 4 and 8 part of the national sample.
The mathematics, reading, and science samples were based on a two-stage sample design:
selection of schools within strata, and
selection of students within schools.
The computer-based writing and mathematics samples were based on a three-stage sample design:
selection of primary sampling units (PSUs),
selection of schools within strata, and
selection of students within schools.
In the three-stage design, schools were stratified and selected within the sampled PSUs. For both designs, the samples of schools were selected with probability proportional to a measure of size based on the estimated grade-specific enrollment in the schools.
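To make the school-selection stage concrete, the sketch below shows systematic probability-proportional-to-size (PPS) selection from a frame that has already been sorted by the stratification variables. The function and field names are illustrative assumptions, not part of the NAEP documentation, and the code is a minimal sketch rather than the operational sampling system.

```python
import random

def systematic_pps_sample(frame, n_sample, mos_key="mos", seed=None):
    """Draw n_sample schools from `frame` (a list of dicts already sorted in
    stratification order) with probability proportional to the measure of
    size stored under `mos_key`. A school whose measure of size exceeds the
    sampling interval can be hit more than once. Illustrative sketch only."""
    rng = random.Random(seed)
    total_mos = sum(school[mos_key] for school in frame)
    interval = total_mos / n_sample                 # sampling interval
    start = rng.uniform(0, interval)                # single random start
    hit_points = [start + k * interval for k in range(n_sample)]

    selected, cumulative = [], 0.0
    hits = iter(hit_points)
    next_hit = next(hits, None)
    for school in frame:
        cumulative += school[mos_key]
        while next_hit is not None and next_hit < cumulative:
            selected.append(school)                 # a large school may be hit twice
            next_hit = next(hits, None)
    return selected

# Example: draw 2 of 5 schools with probability proportional to enrollment.
frame = [{"id": s, "mos": m}
         for s, m in [("A", 120), ("B", 80), ("C", 300), ("D", 60), ("E", 40)]]
print([s["id"] for s in systematic_pps_sample(frame, 2, seed=1)])
```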
The state assessments were designed to achieve representative samples of students in the fourth and eighth grades. Their target populations included all students in each participating jurisdiction, which included states, District of
Columbia, BIE, DoDEA, and school districts chosen for the TUDA assessments. Each sample was designed to produce aggregate estimates with reliable precision for all the participating jurisdictions, as well as estimates for various
student subpopulations of interest.
At grades 4 and 8, all BIE schools were included in the mathematics, reading, and science assessments. Also, public schools with relatively high American Indian/Alaska Native populations were oversampled in six states (Arizona,
Minnesota, North Carolina, Oregon, Utah, and Washington). This was designed to enhance the reporting of results for American Indian students at the state level in those states with a sizable proportion of the nation's American Indian
students for the National Indian Education Study (NIES), which was conducted in conjunction with NAEP.
All states participated in the mathematics, reading, and science assessments. By design, BIE schools did not participate in the state science assessment, because the BIE jurisdiction lacked the required number of students for that assessment. A small sample of students in BIE schools did receive the science assessment to supplement the national science sample.
The District of Columbia, which generally does not have enough students for an assessment in a third subject, also participated in the grade 8 science assessment. To accomplish this, each student in the District of Columbia was assigned
to two of the three assessment subjects and thus tested twice over two days.
The table below presents the components of the NAEP 2011 samples, by assessment subject, grade, and school type.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_sampdsgn.aspx


Components of the NAEP samples, by assessment subject, school type, and grade: 2011

Assessment  | Grade 4 | Grade 8 | Grade 12
Reading     | 1, 3    | 1, 3    | †
Mathematics | 1, 3    | 1, 3    | †
Science     | †       | 1, 3    | †
WCBA        | †       | 2, 3    | 2, 3

† Not applicable.
1 Public/Bureau of Indian Education (BIE)/Department of Defense Education Activity (DoDEA).
2 Public.
3 Private.

NOTE: WCBA = Writing computer-based assessment.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Assessments.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/sampdsgn_2011_accessible.aspx


Sample Design for the 2011 National Assessment
The 2011 national assessment included the following components:
mathematics and reading assessments in public and private schools at grades 4 and 8;
writing computer-based assessment (WCBA) in public and private schools at grades 8 and 12;
science assessments in public and private schools at grade 8.
The sample design aimed to achieve a nationally representative sample of students in the defined populations who were enrolled at the time of assessment.
The mathematics and reading samples were based on a two-stage sample design:
selection of schools within strata, and
selection of students within schools.

The computer-based writing and mathematics samples were based on a three-stage sample design:
selection of primary sampling units (PSUs),
selection of schools within strata, and
selection of students within schools.
The samples of schools were selected with probability proportional to a measure of size based on the estimated grade-specific enrollment in the schools.
For the mathematics, reading, and science assessments in fourth- and eighth-grade public schools, the NAEP state student samples and assessments constitute the NAEP national student samples and assessments. Nationally representative samples were drawn for the remaining populations of private school students in fourth and eighth grades. By design, Bureau of Indian Education (BIE) schools did not participate in the state science assessment, because the BIE jurisdiction lacked the required number of students for that assessment. A small sample of students in BIE schools did receive the science assessment to supplement the national science sample.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_main.aspx


2011 Fourth- and Eighth-Grade Private School National Assessment
The private school samples were designed to produce nationally representative samples of students enrolled in private schools in the United States. Fourth- and eighth-grade students were assessed in mathematics and reading.

Private school students were sampled for the eighth-grade national science assessment at a very low rate: the three operational subjects (reading, mathematics, and science) were sampled in the ratio of 9:9:1. This ensured enough private school sample to report a national science result, but it did not support breakdowns by type of private school.

Reading pilots and a special mathematics assessment in Puerto Rico were also conducted in the private school samples for fourth grade.

Oversampling of private schools at grades 4 and 8, last implemented in 2005, was reintroduced. Response rates permitting, this allowed separate reporting of reading and mathematics results for Catholic, Lutheran, Conservative Christian, and other private schools.
The target sample sizes of assessed students for each grade and subject are shown in the table below. Prior to sampling, these target sample sizes were adjusted upward to offset expected rates of school and student attrition due to nonresponse and
ineligibility.

Target sample sizes of assessed students, private school national assessment, by subject and grade: 2011
Subject                        | Total  | Grade 4 | Grade 8
Total                          | 25,240 | 12,570  | 12,670
Mathematics                    | 12,000 | 6,000   | 6,000
Mathematics pilot              | 200    | 200     | †
Reading                        | 12,000 | 6,000   | 6,000
Reading pilot                  | 220    | 220     | †
Science                        | 670    | †       | 670
Special mathematics assessment | 150    | 150     | †

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Samples were based on a two-stage design that involved selection of schools within strata and selection of students within schools. The first-stage samples of schools were selected with probability proportional to a measure of size based
on the estimated grade-specific enrollment in the schools.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_priv_gr_4_8.aspx


Ineligible Schools for the 2011 Private School National Assessment
The Private School Universe Survey (PSS) school file, from which most of the sampled schools were drawn, corresponds to the 2007-2008 school year, 3 years prior to the assessment school year.
During the intervening period, some of these schools either closed, no longer offered the grade of interest, or were ineligible for other reasons. In such cases, the sampled schools were coded as
ineligible.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_inelg.aspx


Eligibility Status of Sampled Schools for the 2011 Private School National Assessment
The following table presents a breakdown by private school type of ineligible and eligible schools in the fourth- and eighth-grade private school samples. There are considerable differences across private school types at grades 4 and 8.
Schools whose private school type was unknown at the time of sampling subsequently had their affiliation determined during data collection. Therefore, such schools are not broken out separately.
Eligibility status of sampled private schools, national assessment, by grade and private school type: 2011
Private school type    | Eligibility status | Grade 4 count | Grade 4 percentage | Grade 8 count | Grade 8 percentage
All private            | Total              | 748           | 100.00             | 930           | 100.00
All private            | Ineligible         | 102           | 13.64              | 126           | 13.55
All private            | Eligible           | 646           | 86.36              | 804           | 86.45
Catholic               | Total              | 264           | 100.00             | 332           | 100.00
Catholic               | Ineligible         | 26            | 9.85               | 26            | 7.83
Catholic               | Eligible           | 238           | 90.15              | 306           | 92.17
Non-Catholic           | Total              | 484           | 100.00             | 598           | 100.00
Non-Catholic           | Ineligible         | 76            | 15.70              | 100           | 16.72
Non-Catholic           | Eligible           | 408           | 84.30              | 498           | 83.28
Lutheran               | Total              | 107           | 100.00             | 141           | 100.00
Lutheran               | Ineligible         | 8             | 7.48               | 7             | 4.96
Lutheran               | Eligible           | 99            | 92.52              | 134           | 95.04
Conservative Christian | Total              | 123           | 100.00             | 150           | 100.00
Conservative Christian | Ineligible         | 17            | 13.82              | 22            | 14.67
Conservative Christian | Eligible           | 106           | 86.18              | 128           | 85.33
Other private          | Total              | 254           | 100.00             | 307           | 100.00
Other private          | Ineligible         | 51            | 20.08              | 71            | 23.13
Other private          | Eligible           | 203           | 79.92              | 236           | 76.87

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_inel_elig_status.aspx


Ineligible Sampled Private Schools for the 2011 National Assessment
The table below presents unweighted counts of sampled schools, by grade and eligibility status, for the private school samples.

NAEP sample private schools, national assessment, by grade and eligibility status: 2011
Grade and eligibility status                  | Unweighted count of schools | Unweighted percentage
All fourth-grade sampled private schools      | 748 | 100.00
  Eligible                                    | 646 | 86.36
  Has sampled grade, but no eligible students | 14  | 1.87
  Does not have sampled grade                 | 22  | 2.94
  Closed                                      | 55  | 7.35
  Not a regular school                        | 7   | 0.94
  Duplicate on sampling frame                 | 2   | 0.27
  Other ineligible                            | 2   | 0.27
All eighth-grade sampled private schools      | 930 | 100.00
  Eligible                                    | 804 | 86.45
  Has sampled grade, but no eligible students | 19  | 2.04
  Does not have sampled grade                 | 26  | 2.80
  Closed                                      | 52  | 5.59
  Not a regular school                        | 19  | 2.04
  Duplicate on sampling frame                 | 4   | 0.43
  Other ineligible                            | 6   | 0.65

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_inelgtype.aspx


Sampling Frame for the 2011 Private School National Assessment
The frame of the private schools in all three grades was developed from the 2007-2008 Private School Universe Survey (PSS), a survey conducted by the U.S. Census Bureau for the National Center for
Education Statistics (NCES). The PSS is a biennial mail survey of all private schools in the 50 states and the District of Columbia. The PSS frame of schools comprises both a list frame and an area frame. The
2007-2008 list frame is an assembly of the 2005-2006 PSS frame and more up-to-date lists from state education agencies, private school associations, and other easily accessible sources. To improve the
coverage of the PSS list frame, the Census Bureau also conducted a survey to locate private schools in a random sample of geographic areas throughout the United States. The areas were single counties or
groups of counties sampled from an area frame constructed from all counties in the nation. Within each selected area a complete list of private schools was gathered using information from the Yellow Pages,
religious institutions, local education agencies, chambers of commerce, and local government offices. Schools not already on the list frame were identified and added to the frame of private schools. A
weighting component was computed by the Census Bureau so that the additional area-frame schools would represent all schools absent from the list frame, not just those in the selected areas.
The sampling frame excluded schools that were ungraded, provided only special education, were part of hospital or treatment center programs, were juvenile correctional institutions, were home-school
entities, or were for adult education.

Private school affiliation is unknown for nonrespondents to the PSS. Because oversampling was desired to report by affiliation, additional work was done to obtain affiliation for nonrespondents to the PSS. If a nonresponding school responded to a previous
PSS (either two or four years prior), affiliation was obtained from the previous response. For those schools that were nonrespondents for the last three cycles of the PSS, in some cases Internet research was used to establish affiliation. There were still schools with unknown affiliation
remaining after this process.
For quality control purposes, school and student counts from the 2011 sampling frame were compared to school and student counts from previous NAEP frames (2009 and 2007). No major issues were found.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_sampfrme.aspx


Fourth- and Eighth-Grade Schools and Enrollment in the 2011 Private School Sampling Frame
The following table displays, by grade and affiliation, the number of private schools in the sampling frame and their estimated enrollment. Enrollment was estimated for each school as the Private School Universe Survey (PSS)-reported enrollment averaged across grades 1 through 8.
The counts presented below are of schools with known affiliation. Schools with unknown affiliation do not appear in the table because their grade span, affiliation, and enrollment were unknown. Although PSS is a school universe survey,
participation is voluntary and not all private schools respond. Since the NAEP sample must represent all private schools, not just PSS respondents, a small sample of PSS nonrespondents with unknown affiliation was selected for each of
the targeted grades to improve NAEP coverage.
Number of schools and enrollment in private school sampling frame, national assessment, by affiliation and grade: 2011
Grade | Affiliation            | Number of schools | Estimated enrollment
4     | Total                  | 20,110            | 383,849
4     | Catholic               | 5,974             | 171,054
4     | Non-Catholic private   | 14,136            | 212,795
4     | Lutheran               | 1,374             | 18,086
4     | Conservative Christian | 4,080             | 61,504
4     | Other private          | 8,682             | 133,205
8     | Total                  | 17,968            | 369,381
8     | Catholic               | 5,465             | 170,509
8     | Non-Catholic private   | 12,503            | 198,872
8     | Lutheran               | 1,166             | 16,579
8     | Conservative Christian | 3,636             | 57,363
8     | Other private          | 7,701             | 124,930

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_sampfrme_gr_4_8.aspx


New-School Sampling Frame for the 2011 Private School Assessments
Whereas the Private School Universe Survey (PSS) file used for the frame corresponds to the 2007-2008 school year, the NAEP assessment year was the 2010-2011 school year. During this 3-year period, some schools closed, some
changed their grade span, and still others came into existence.
To achieve as close to full coverage as possible, the private school frame was supplemented by a sample of new Catholic schools. The goal was to allow every such school a chance of selection, thereby fully covering the target population
of Catholic schools in operation during the 2010-2011 school year. The first step in this process was the development of a new-school frame through the construction of a diocese-level file from the PSS school-level file. To develop the
frame, the diocese-level file was divided into two files: one for small dioceses and the other for medium and large dioceses.
Small dioceses contained no more than three schools on the frame in total, with no more than one school at each grade (fourth, eighth, and twelfth). New schools in small dioceses were identified during school recruitment and added to
the sample if the old school in the same diocese was sampled at the relevant grade. From a sampling perspective, the new school was viewed as an “annex” to the sampled school that had a well-defined probability of selection equal to
that of the old school. The “frame” in this case was, in fact, the original frame; when the old school was sampled in a small diocese, the new school was automatically sampled as well.
To limit respondent burden and keep the level of effort within reasonable bounds, the new-school frame was created using information obtained from a sample of the remaining dioceses. The remaining dioceses were separated into two
strata of large- and medium-size dioceses. These strata were defined by computing the percentage of the nation’s total Catholic school enrollment each diocese represents, sorting the dioceses in descending order by that percentage, and
cumulating the percentages across the sorted file. All dioceses up to and including the first diocese at or above the 80th cumulative percentage were defined as large dioceses. The remaining dioceses were defined as medium dioceses.
A simplified example is given below. Dioceses are ordered by percentage enrollment. The first six become large dioceses and the last six become medium dioceses.
Example showing assignment of Catholic dioceses to the large and medium strata, private school national assessment: 2011
Diocese    | Percent enrollment | Cumulative percentage enrollment | Stratum
Diocese 1  | 20                 | 20                               | L
Diocese 2  | 20                 | 40                               | L
Diocese 3  | 15                 | 55                               | L
Diocese 4  | 10                 | 65                               | L
Diocese 5  | 10                 | 75                               | L
Diocese 6  | 10                 | 85                               | L
Diocese 7  | 5                  | 90                               | M
Diocese 8  | 2                  | 92                               | M
Diocese 9  | 2                  | 94                               | M
Diocese 10 | 2                  | 96                               | M
Diocese 11 | 2                  | 98                               | M
Diocese 12 | 2                  | 100                              | M

NOTE: L = large diocese stratum; M = medium diocese stratum.

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
In actuality, there were 71 large and 103 medium dioceses in the sampling frame.
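As an illustration, the sketch below applies the cumulative-percentage rule just described to the simplified 12-diocese example; the function name and data structure are assumptions made for the example only.

```python
def assign_diocese_strata(enrollment_pcts, cutoff=80.0):
    """Sort dioceses by their percentage of national Catholic enrollment
    (descending), cumulate the percentages, and label every diocese up to and
    including the first one at or above `cutoff` as large ('L'); the rest are
    medium ('M'). Illustrative sketch of the rule described above."""
    ordered = sorted(enrollment_pcts.items(), key=lambda kv: kv[1], reverse=True)
    strata, cumulative, crossed = {}, 0.0, False
    for diocese, pct in ordered:
        cumulative += pct
        strata[diocese] = "M" if crossed else "L"
        if not crossed and cumulative >= cutoff:
            crossed = True                  # later dioceses fall in the medium stratum
    return strata

# Reproduces the simplified example: the first six dioceses (cumulative 85
# percent) are large, the remaining six are medium.
example = {f"Diocese {i + 1}": p
           for i, p in enumerate([20, 20, 15, 10, 10, 10, 5, 2, 2, 2, 2, 2])}
print(assign_diocese_strata(example))
```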
The target sample size was 10 dioceses total: 8 large and 2 medium. In the medium stratum, the dioceses were selected with equal probability. In the large stratum, dioceses were sampled with probability proportional to enrollment. These
probabilities were retained and used in all later stages of sampling and weighting in order to represent all dioceses, whether or not they had been selected as new school samples for the assessment.
Each selected diocese was sent a listing of its schools extracted from the 2007-2008 PSS file and was asked to provide information about new schools and any changes to grade span in existing schools. This information provided by the
selected dioceses was used to create sampling frames for the selection of new Catholic schools. The process of obtaining the information was conducted with the help of the National Catholic Educational Association (NCEA). NCEA was
sent the school lists for the 10 sampled dioceses and was responsible for returning the completed updates.
The eligibility of a new school at a particular grade was determined by its grade span. A school already on PSS also was classified as “new” if a change of grade span had occurred such that the school status changed from ineligible to
eligible at a particular grade.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_newschoolframe_4_8.aspx


Sampling of Schools for the 2011 Private School National Assessment
The private school samples were selected with probability proportional to size using systematic sampling from a sorted list. A school's measure of size was a complex function of the school's estimated grade
enrollment. For the eighth grade sample, multiple "hits" were allowed per school, but this was not the case for the fourth grade sample.

Schools were ordered within each school type using a serpentine sort involving the following variables:

census division,
urbanicity classification (based on urban-centric locale),
race/ethnicity status, and
estimated grade enrollment.
A systematic sample was then drawn with probability proportional to size using this serpentine sorted list and the measures of size.
Schools with unknown affiliation were treated separately. A sample of about 30 schools with unknown affiliation was selected at each of the two grades.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schlsamp.aspx


Computation of Measures of Size
There were five objectives underlying the process for determining the probability of selection for each school and for setting the number of students to be sampled within each selected school:
to meet the target student sample size for each grade;
to select an equal-probability sample of students;
to limit the number of students selected from any one school;
to ensure that the sample within a school does not include a very high percentage of the students in the school, unless all students are included; and
to reduce the rate of sampling of small schools, in recognition of the greater cost and burden per student of conducting assessments in such schools.
The goal in determining the school's measure of size is to optimize across the last four objectives in terms of maintaining the precision of estimates and the cost effectiveness of the sample design. The following algorithm was used to
assign a measure of size to each school based on its estimated grade-specific enrollment.
In the notation used here, xjs refers to the estimated grade enrollment for private school type j and school s, and Ps is a primary sampling unit (PSU) weight associated with the Private School Universe Survey (PSS) area sample. The preliminary measure of size (MOS) for each school was computed from these quantities.

The preliminary school measure of size was then rescaled to create an expected number of hits by applying a multiplicative constant bj, which varies by grade and school type. The private school sample design allowed multiple "hits": for example, a school with two hits had twice as many students sampled as a single-hit school. To limit respondent burden, constraints were placed on the number of hits allowed per school: one hit at grade 4 and two hits at grade 8. The final measure of size, Ejs, was obtained by applying bj to the preliminary measure of size and capping the result at uj, the maximum number of hits allowed. The school's probability of selection, πjs, followed from its final measure of size.

The value of bj can be chosen such that the expected overall student sample yield matches the desired targets specified by the design, where the expected yield is calculated by summing the product of an individual school's probability of selection and its student sample yield across all schools in the frame.

In addition, new and newly eligible Catholic schools were sampled from the new-school frame. The assigned measures of size for these schools used the bj and uj values from the main school sample for the grade and school type (i.e., the same sampling rates as for the main school sample), together with πdjs, the probability of selection of the diocese into the new-school diocese (d) sample.
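Written out in symbols, the relationships described above take roughly the following form. These expressions are inferred from the definitions in this section (rescaling by bj, a cap of uj hits, and a selection probability bounded by 1) and are an illustrative sketch rather than the exact operational formulas.

```latex
% Sketch of the measure-of-size relationships described above (inferred forms).
% MOS_{js}: preliminary measure of size built from x_{js} (and P_s for area-frame schools)
% b_j: rescaling constant by grade and school type; u_j: maximum hits (1 at grade 4, 2 at grade 8)
% \pi_{djs}: probability of selection of diocese d into the new-school diocese sample
\begin{align*}
  E_{js}   &= \min\!\bigl(b_j\,\mathrm{MOS}_{js},\; u_j\bigr)
           && \text{final measure of size (expected hits, capped at } u_j\text{)}\\
  \pi_{js} &= \min\!\bigl(1,\; E_{js}\bigr)
           && \text{school's probability of selection}\\
  E_{djs}  &= \min\!\left(\frac{b_j\,\mathrm{MOS}_{djs}}{\pi_{djs}},\; u_j\right)
           && \text{measure of size for a new school in sampled diocese } d
\end{align*}
```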

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schlsamp_mos.aspx


School Sample Sizes: Frame and New School
The following table presents the number of schools selected from the private school sampling frame (constructed from the Private School Universe Survey file) and the new-school sampling frame, for grades 4 and 8, by school type.
NAEP private school frame-based and new school samples, by grade and school type: 2011
Grade | Private school type    | Total school sample | Frame school sample | New school sample
4     | All private            | 748                 | 744                 | 4
4     | Catholic               | 264                 | 260                 | 4
4     | Non-Catholic           | 484                 | 484                 | 0
4     | Lutheran               | 109                 | 109                 | 0
4     | Conservative Christian | 120                 | 120                 | 0
4     | Other private          | 230                 | 230                 | 0
4     | Unknown affiliation    | 25                  | 25                  | 0
8     | All private            | 930                 | 923                 | 7
8     | Catholic               | 330                 | 323                 | 7
8     | Non-Catholic           | 600                 | 600                 | 0
8     | Lutheran               | 141                 | 141                 | 0
8     | Conservative Christian | 148                 | 148                 | 0
8     | Other private          | 285                 | 285                 | 0
8     | Unknown affiliation    | 26                  | 26                  | 0

NOTE: Non-Catholic comprises the Lutheran, Conservative Christian, Other private, and Unknown affiliation rows.

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schlsamp_sampsize.aspx


School and Student Participation Rates for the 2011 Private School National Assessment
Private school participation in NAEP is not mandatory. The 2011 assessment held true to the historic pattern of higher participation rates among Catholic and Lutheran schools than among Conservative Christian and other private schools. Although a portion of the participating school sample consisted of substitute schools, it is preferable to calculate school response rates on the basis of school participation before substitution.
In every NAEP survey, some of the sampled students are not assessed for the following reasons:
withdrawn students,
excluded students with disabilities (SD),
excluded English language learner (ELL) students, or
students absent from both the original session and the makeup session (not excluded but not assessed).
Withdrawn students are those who have left the school before the original assessment. Excluded students were determined by their school to be unable to meaningfully take the NAEP assessment in their
assigned subject, even with an accommodation. Excluded students must also be classified as SD and/or ELL. Other students who were absent for the initial session are assessed in the makeup session. The last
category includes students who were not excluded (i.e., “were to be assessed”) but were not assessed either due to absence from both sessions or because of a refusal to participate. Assessed students are also
classified as assessed without an accommodation or assessed with an accommodation. The latter group can be divided into SD students assessed with an accommodation, ELL students assessed with an
accommodation, or students who are both SD and ELL and accommodated. Note that some SD and ELL students are assessed without accommodations, and students who are neither SD nor ELL can only be
assessed without an accommodation.

The weighted response rates utilize the student base weights and indicate the weighted percentage of assessed students among all students to be assessed. The exclusion rates, in contrast, provide the weighted
percentage of excluded SD or ELL students among all absent, assessed, and excluded students.
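A minimal sketch of these two rates, assuming a simple list of student records carrying a base weight and a status code (the field names and status values are illustrative assumptions):

```python
def weighted_rates(students):
    """Compute the weighted student response and exclusion rates.

    `students` is a list of dicts with a student base weight `w` and a
    `status` of 'assessed', 'absent', or 'excluded'; withdrawn students are
    assumed to have been dropped beforehand. Illustrative sketch only."""
    w_assessed = sum(s["w"] for s in students if s["status"] == "assessed")
    w_absent   = sum(s["w"] for s in students if s["status"] == "absent")
    w_excluded = sum(s["w"] for s in students if s["status"] == "excluded")

    # Response rate: assessed students among all students to be assessed
    # (excluded students are not in the denominator).
    response_rate = 100 * w_assessed / (w_assessed + w_absent)
    # Exclusion rate: excluded SD/ELL students among all absent, assessed,
    # and excluded students.
    exclusion_rate = 100 * w_excluded / (w_assessed + w_absent + w_excluded)
    return response_rate, exclusion_rate

print(weighted_rates([{"w": 10, "status": "assessed"},
                      {"w": 10, "status": "absent"},
                      {"w": 5,  "status": "excluded"}]))   # (50.0, 20.0)
```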

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schl_and_stud_part.aspx


School Response Rates for the 2011 Private School National Assessment
The following table presents counts of eligible sampled schools and participating schools, as well as weighted school response rates, for the private school samples in which the mathematics and reading operational assessments were
conducted. The weighted school response rates estimate the proportion of the student population that is represented by the participating school sample prior to substitution.

Private school response rates, national assessment, by school type and grade: 2011
Grade | Private school type    | Eligible sampled schools | Participating schools, including substitutes | Weighted school response rate prior to substitution (percent)
4     | All private            | 646 | 557 | 73.51
4     | Catholic               | 238 | 236 | 96.27
4     | Non-Catholic           | 408 | 321 | 55.34
4     | Lutheran               | 99  | 97  | 94.87
4     | Conservative Christian | 106 | 94  | 73.13
4     | Other private          | 203 | 130 | 42.23
8     | All private            | 804 | 696 | 74.40
8     | Catholic               | 306 | 299 | 93.23
8     | Non-Catholic           | 498 | 397 | 57.54
8     | Lutheran               | 134 | 129 | 92.73
8     | Conservative Christian | 127 | 114 | 72.51
8     | Other private          | 237 | 154 | 45.71

NOTE: Detail may not sum to total due to rounding. Percentages are based on unrounded counts.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schresp_rates.aspx


Weighted Student Response and Exclusion Rates for the 2011 Private School National Mathematics Assessment
The following table presents the weighted student response and exclusion rates for the mathematics assessment. The exclusion rates give the percentage of students excluded among all eligible students. Excluded students must be either students with disabilities (SD) or English language learners (ELL). The response rates indicate the percentage of students assessed among those who were intended to take the assessment within the participating schools. Thus, students who were excluded are not included in the denominators of the response rates.
Weighted student response and exclusion rates for private schools, national mathematics assessment, by school type and grade: 2011
Grade | Private school type    | Weighted student response rate | Weighted percentage of all students who are SD and excluded | Weighted percentage of all students who are ELL and excluded
4     | All private            | 95.55 | 0.18 | 0.12
4     | Catholic               | 95.85 | 0.22 | 0.03
4     | Non-Catholic           | 95.19 | 0.15 | 0.20
4     | Lutheran               | 96.63 | 0.36 | 0.00
4     | Conservative Christian | 93.90 | 0.30 | 0.00
4     | Other private          | 95.70 | 0.05 | 0.32
8     | All private            | 94.77 | 0.44 | 0.06
8     | Catholic               | 95.05 | 0.46 | 0.07
8     | Non-Catholic           | 94.43 | 0.42 | 0.05
8     | Lutheran               | 96.30 | 0.33 | 0.00
8     | Conservative Christian | 94.50 | 0.29 | 0.17
8     | Other private          | 94.02 | 0.49 | 0.00

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Mathematics Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_studresp_math.aspx


Weighted Student Response and Exclusion Rates for the 2011 Private School National Reading Assessment
The following table presents the weighted student response and exclusion rates for the reading assessment. The exclusion rates give the percentage of students excluded among all eligible students. Excluded students must necessarily be
either students with disabilities (SD) or English language learners (ELL). The response rates indicate the percentage of students assessed among those who were intended to take the assessment from within the participating schools. Thus,
students who were excluded are not included in the denominators of the response rates.
Weighted student response and exclusion rates for private schools, national reading assessment, by school type and grade: 2011
Grade | Private school type    | Weighted student response rate | Weighted percentage of all students who are SD and excluded | Weighted percentage of all students who are ELL and excluded
4     | All private            | 95.16 | 0.30 | 0.22
4     | Catholic               | 95.49 | 0.30 | 0.18
4     | Non-Catholic           | 94.77 | 0.30 | 0.26
4     | Lutheran               | 96.27 | 1.17 | 0.00
4     | Conservative Christian | 95.48 | 0.00 | 0.00
4     | Other private          | 93.99 | 0.31 | 0.42
8     | All private            | 94.80 | 0.40 | 0.07
8     | Catholic               | 95.36 | 0.30 | 0.09
8     | Non-Catholic           | 94.11 | 0.50 | 0.06
8     | Lutheran               | 95.19 | 0.48 | 0.00
8     | Conservative Christian | 93.72 | 0.08 | 0.20
8     | Other private          | 94.13 | 0.69 | 0.00

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Reading Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_studresp_read.aspx


Weighted Student Response and Exclusion Rates for the 2011 Private School National Science Assessment
The following table presents the weighted student response and exclusion rates for the grade 8 national science assessment. The exclusion rates give the percentage of students excluded among all eligible students. Excluded students must
necessarily be either students with disabilities (SD) or English language learners (ELL). The response rates indicate the percentage of students assessed among those who were intended to take the assessment from within the participating
schools. Thus, students who were excluded are not included in the denominators of the response rates.
Weighted student response and exclusion rates for private schools, grade 8 science assessment, by school type: 2011
Private school type | Weighted student response rate | Weighted percentage of all students who are SD and excluded | Weighted percentage of all students who are ELL and excluded
All private         | 93.86 | 0.18 | 0.00
Catholic            | 94.27 | 0.35 | 0.00
Non-Catholic        | 93.34 | 0.00 | 0.00

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Science Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_studresp_science.aspx


Stratification of Schools in the 2011 Private School National Assessment
Explicit stratification for the NAEP 2011 private school samples was by private school type: Catholic, Lutheran, Conservative Christian, Other Private, and unknown affiliation. Private school affiliation was unknown for nonrespondents
to the NCES Private School Universe Survey (PSS) for the past three cycles.
The implicit stratification of the schools involved four dimensions. Within each explicit stratum, the private schools were hierarchically sorted by census division, urbanicity status, race/ethnicity status, and estimated grade enrollment.
The implicit stratification in this four-fold hierarchical stratification was achieved via a "serpentine sort."
Census division was used as the first level of implicit stratification for the NAEP 2011 private school sample.
Collapsing of census division varied by grade. For grade 4, all nine census divisions were used for stratifying Catholic and other private schools. However, due to small cell sizes, divisions in the Northeast and Midwest were collapsed
within census regions for Conservative Christian schools. For Lutheran schools, a South Central stratum was created within the southern region and divisions were collapsed across regions to create an East Coast stratum. For grade 8, all
census divisions were used to stratify Catholic and other private schools. Divisions in the Northeast were collapsed within region for both Conservative Christian and Lutheran schools. Additionally for Lutheran schools, two divisions
were collapsed within the southern region to create a South Central stratum.
The next level of stratification was an urbanicity classification based on urban-centric locale, as specified on the PSS. Within a census division-based stratum, urban-centric locale cells that were too small were collapsed. The criterion for
adequacy was that the cell had to have an expected school sample size of at least six.
The urbanicity variable was equal to the original urban-centric locale if no collapsing was necessary to cover an inadequate original cell. If collapsing was necessary, the scheme was to first collapse within the four major strata (city, suburbs, town, and rural). For example, if the expected number of large city schools sampled was less than six, large city was collapsed with midsize city. If the collapsed cell was still inadequate, it was further collapsed with small city. If a major stratum cell (all three cells collapsed together) was still deficient, it was collapsed with a neighboring major stratum cell. For example, city would be collapsed with suburbs.
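A minimal sketch of the within-stratum collapsing rule just described, with illustrative names and an assumed minimum expected sample size of six:

```python
def collapse_within_stratum(expected, locales, minimum=6.0):
    """Collapse adjacent urbanicity cells (listed in collapsing order, e.g.
    large city, midsize city, small city) until each collapsed cell reaches an
    expected school sample size of at least `minimum`; a deficient remainder is
    folded into the previous cell. Illustrative sketch only."""
    cells, current, size = [], [], 0.0
    for locale in locales:
        current.append(locale)
        size += expected.get(locale, 0.0)
        if size >= minimum:
            cells.append(current)
            current, size = [], 0.0
    if current:                     # leftover cells still below the minimum
        if cells:
            cells[-1].extend(current)
        else:
            cells.append(current)   # whole major stratum deficient: it would be
                                    # collapsed with a neighboring major stratum
    return cells

# Example: large city alone is too small, so it is collapsed with midsize city.
print(collapse_within_stratum({"large city": 3.2, "midsize city": 4.5, "small city": 7.0},
                              ["large city", "midsize city", "small city"]))
```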
The last stage of stratification was a division of the geographic/urbanicity strata into race/ethnicity strata if the expected number of schools sampled was large enough (i.e., at least equal to 12). This was done by deciding first on the
number of race/ethnicity strata and then dividing the geography/urbanicity stratum into that many pieces. The school frame was sorted by the percentage of students in each school who were Black, Hispanic, or American Indian. The
three race/ethnic groups defining the race/ethnicity strata were those that have historically performed substantially lower on NAEP assessments than White students. The sorted list was then divided into pieces, with roughly an equal
expected number of sampled schools in each piece.
Finally, schools were sorted within stratification cells by estimated grade enrollment.
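A minimal sketch of a serpentine sort over the hierarchical stratification variables. The direction of each lower-level sort key flips whenever the value of the key above it changes, so neighboring schools on the list stay similar; the names and data structures are illustrative assumptions.

```python
def serpentine_sort(schools, keys):
    """Hierarchically sort `schools` (a list of dicts) by `keys`, reversing
    the sort direction of each lower-level key every time the value of the
    key above it changes. Illustrative sketch only."""
    def sort_level(items, level, ascending):
        if level == len(keys):
            return items
        key = keys[level]
        ordered = sorted(items, key=lambda s: s[key], reverse=not ascending)
        result, direction, i = [], ascending, 0
        while i < len(ordered):
            j = i
            while j < len(ordered) and ordered[j][key] == ordered[i][key]:
                j += 1                                  # group equal values of this key
            result.extend(sort_level(ordered[i:j], level + 1, direction))
            direction = not direction                   # flip for the next group
            i = j
        return result
    return sort_level(schools, 0, True)

# Within each census division, the enrollment sort alternates direction so the
# sorted list "snakes" from stratum to stratum.
schools = [{"division": d, "enrollment": e}
           for d, e in [(1, 50), (1, 200), (2, 75), (2, 10), (3, 120), (3, 30)]]
for s in serpentine_sort(schools, ["division", "enrollment"]):
    print(s)
```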

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_strat.aspx


Student Sample Selection for the 2011 Private School National Assessment
The target student sample size within sampled schools for the fourth and eighth grades was 63 students. However, schools with 70 or fewer students automatically had all students sampled. In addition, at grade 4 only, a school that had
more than 70 students but fewer than 121 could choose to have all students sampled.
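A minimal sketch of the within-school sample-size rule just described (the function and argument names are illustrative assumptions):

```python
def student_sample_size(grade_enrollment, grade, take_all_requested=False):
    """Number of students to sample in a school: target 63 per school; schools
    with 70 or fewer students take all students; at grade 4 only, a school
    with 71-120 students may opt to have all students sampled. Sketch only."""
    if grade_enrollment <= 70:
        return grade_enrollment                      # take-all school
    if grade == 4 and grade_enrollment <= 120 and take_all_requested:
        return grade_enrollment                      # grade 4 opt-in take-all
    return 63                                        # target sample size

print(student_sample_size(58, 8))                            # 58
print(student_sample_size(95, 4, take_all_requested=True))   # 95
print(student_sample_size(400, 8))                           # 63
```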
There was only one spiral type for each grade. The percentage of booklets by subject within the spiral for each grade is given below.
Percentage of booklets, private school national assessment, by subject within the spiral and grade: 2011
Grade | Mathematics | Reading | Science | KaSA | Pilot
4     | 46.96       | 48.62   | †       | 1.55 | 2.87
8     | 48.34       | 46.69   | 4.97    | †    | †

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
The process of student list submission, sampling students from year-round schools, sampling new enrollees, and determining student eligibility and exclusion status was the same as for the state NAEP student sample.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_studsamp.aspx


Substitute Schools for the 2011 Private School National Assessment
Substitutes were preselected for the private school samples by sorting the school frame file according to the actual order used in the sampling process (the implicit stratification). Each sampled school had its two nearest neighbors on the
school frame file identified as potential substitutes. As the last sort ordering was by grade enrollment, the nearest neighbors had grade enrollment values very close to that of the sampled school.
Schools were disqualified as potential substitutes if they were already selected in the private school sample or assigned as a substitute for another private school (earlier in the sort ordering). Schools assigned as substitutes for
eighth-grade schools were disqualified as potential substitutes for fourth-grade schools.
If both nearest neighbors were still eligible to be substitutes, the one with the closer grade enrollment was chosen. If both nearest neighbors had the same grade enrollment (an uncommon occurrence), one of the two was randomly
selected.
In the process described above, only schools with the same affiliation were selected as substitutes.
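A minimal sketch of the substitute-assignment rule described above, assuming the frame is a list of school records in the sampling sort order, each carrying an id and an estimated grade enrollment (all names are illustrative):

```python
import random

def assign_substitutes(frame, sampled_ids, already_assigned=None, seed=0):
    """Assign each sampled school the nearest neighbor on the sorted frame as
    its substitute, skipping neighbors that were themselves sampled or already
    used as substitutes (e.g., for the other grade's sample, passed via
    `already_assigned`). Of two available neighbors, the one with the closer
    grade enrollment wins; equal enrollments are broken at random. Sketch only."""
    rng = random.Random(seed)
    assigned = dict(already_assigned or {})
    taken = set(sampled_ids) | set(assigned.values())
    for idx, school in enumerate(frame):
        if school["id"] not in sampled_ids:
            continue
        neighbors = [frame[j] for j in (idx - 1, idx + 1)
                     if 0 <= j < len(frame) and frame[j]["id"] not in taken]
        if not neighbors:
            continue
        neighbors.sort(key=lambda s: abs(s["enrollment"] - school["enrollment"]))
        if len(neighbors) == 2 and neighbors[0]["enrollment"] == neighbors[1]["enrollment"]:
            choice = rng.choice(neighbors)
        else:
            choice = neighbors[0]
        assigned[school["id"]] = choice["id"]
        taken.add(choice["id"])
    return assigned
```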

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_subs.aspx


Target Population for the 2011 Private School National Assessment
The target population for the 2011 Private School National Assessment included all students enrolled in private schools in grades 4 and 8 within the 50 states and the District of Columbia.
http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_targpop.aspx


2011 Fourth- and Eighth-Grade Public School National Assessment
For the mathematics, reading, and science assessments in fourth- and eighth-grade public schools, the national samples were the state assessment samples for each jurisdiction. All states participated in the mathematics, reading, and science assessments. By design, Bureau of Indian Education (BIE) schools did not participate in the state science assessment, because the BIE jurisdiction lacked the required number of students for that assessment. A small sample of students in BIE schools did receive the science assessment to supplement the national science sample.
Additional details of the national science sample are also described as part of the state assessment samples.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_pub_gr_4_8.aspx


2011 Writing Computer-Based Assessment (WCBA)
The sample design for the NAEP 2011 writing computer-based assessment (WCBA) provided a nationally representative sample of eighth- and twelfth-grade students.
This was accomplished by designing separate sample components for public and private schools. The selected samples were based on a three-stage sample design:
selection of primary sampling units (PSUs),
selection of schools within strata, and
selection of students within schools.
The samples of schools were selected with probability proportional to a measure of size based on the estimated eighth- and twelfth-grade enrollment in the schools.
The target populations included all students in public and private schools, Bureau of Indian Education (BIE) schools, and Department of Defense Education Activity (DoDEA) schools in the 50 states and the District of Columbia who were enrolled in the eighth and twelfth grades, respectively, at the time of assessment.

The table below shows the target student sample sizes of assessed students for each sample.
Target student sample sizes of assessed students for grades 8 and 12, writing computer-based assessment (WCBA), by school type: 2011

School type | Grade | Target student sample size
Public      | 8, 12 | 19,800
Private     | 8, 12 | 2,200

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Writing
Computer-Based Assessment.
To reduce the burden on any particular school, efforts were made to minimize the overlap between the 2011 PSU sample and all other PSU samples selected for NAEP since 2006. The school samples were designed to have minimum
overlap with both the United States school sample for the Trends in International Mathematics and Science Study (TIMSS), and the NAEP 2011 state sample.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_wcba.aspx


Private School 2011 Writing Computer-Based Assessment (WCBA)
The NAEP 2011 writing computer-based assessment (WCBA) sample design yielded nationally representative samples of private school students in grades 8 and 12 through a three-stage approach:
selection of primary sampling units (PSUs), selection of schools within strata, and selection of students within schools. The sample of schools was selected with probability proportional to a measure of size
based on the estimated grade enrollment in the schools.

The 2011 national WCBA sampling plan had a goal of assessing 2,200 eighth-graders and 2,200 twelfth-graders. Target sample sizes were adjusted to reflect expected private school and student response
and eligibility.

Schools on the sampling frame were explicitly stratified prior to sampling by private school affiliation (Catholic, non-Catholic, and unknown affiliation). Within affiliation type, schools were implicitly
stratified by PSU type (certainty/noncertainty). In certainty PSUs, further stratification was by census region, urban-centric locale, and estimated grade enrollment. In noncertainty PSUs, additional
stratification was by PSU stratum, urban-centric locale, and estimated grade enrollment.

From the stratified frame of private schools, systematic random samples of eighth- and twelfth-grade schools were drawn with probability proportional to a measure of size based on the estimated grade
enrollment of the school in the relevant grade.

Each selected school in the private school sample provided a list of eligible enrolled students from which a systematic, equal probability sample of students was drawn.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_priv_wbca.aspx


Ineligible Private Schools for the 2011 Writing Computer-Based Assessment (WCBA)
The Private School Universe Survey (PSS) school file from which most of the sampled schools were drawn corresponds to the 2007-2008 school year, 3 years prior to the assessment school year. During the intervening period, some of
these schools either closed, no longer offered the grade of interest, or were ineligible for other reasons. In such cases, the sampled schools were coded as ineligible.
The table below presents unweighted counts of sampled private schools by eligibility status, including the reason for ineligibility.
Number of sampled private schools, writing computer-based assessment (WCBA), by eligibility status and grade: 2011
Grade and eligibility status              | Unweighted count of schools | Unweighted percentage
All eighth-grade sampled private schools  | 157 | 100.00
  Eligible schools                        | 140 | 89.17
  No eligible students in grade           | 3   | 1.91
  Does not have grade                     | 4   | 2.55
  School closed                           | 8   | 5.10
  Not a regular school                    | 1   | 0.64
  Other ineligible school                 | 1   | 0.64
  Duplicate on sampling frame             | 0   | 0.00
All twelfth-grade sampled private schools | 177 | 100.00
  Eligible schools                        | 160 | 90.40
  No eligible students in grade           | 4   | 2.26
  Does not have grade                     | 4   | 2.26
  School closed                           | 2   | 1.13
  Not a regular school                    | 4   | 2.26
  Other ineligible school                 | 2   | 1.13
  Duplicate on sampling frame             | 1   | 0.56

NOTE: Detail may not add up to totals due to rounding. Percentages are based on rounded counts.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.
The table below presents unweighted counts of sampled private schools by collapsed private school type and eligibility status.
Number of sampled private schools, writing computer-based assessment (WCBA), by eligibility status and private school type: 2011
Grade | Private school type | Eligibility status | Unweighted count of schools | Unweighted percentage
8     | All private         | Total      | 157 | 100.00
8     | All private         | Eligible   | 140 | 89.17
8     | All private         | Ineligible | 17  | 10.83
8     | Catholic            | Total      | 50  | 100.00
8     | Catholic            | Eligible   | 42  | 84.00
8     | Catholic            | Ineligible | 8   | 16.00
8     | Non-Catholic        | Total      | 107 | 100.00
8     | Non-Catholic        | Eligible   | 98  | 91.59
8     | Non-Catholic        | Ineligible | 9   | 8.41
12    | All private         | Total      | 177 | 100.00
12    | All private         | Eligible   | 160 | 90.40
12    | All private         | Ineligible | 17  | 9.60
12    | Catholic            | Total      | 55  | 100.00
12    | Catholic            | Eligible   | 55  | 100.00
12    | Catholic            | Ineligible | 0   | 0.00
12    | Non-Catholic        | Total      | 122 | 100.00
12    | Non-Catholic        | Eligible   | 105 | 86.07
12    | Non-Catholic        | Ineligible | 17  | 13.93

NOTE: Detail may not add up to totals due to rounding. Percentages are based on rounded counts.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_inelg.aspx


Sampling Frame for the Private School 2011 Writing Computer-Based Assessment (WCBA)
The sampling frame for private schools was developed from the 2007-2008 Private School Universe Survey (PSS), a survey conducted by the U.S. Census Bureau for the National Center for Education
Statistics (NCES). The PSS is a biennial mail survey of all private schools in the 50 states and the District of Columbia. The PSS frame of schools comprises both a list frame and an area frame. The list frame
is an assembly of the 2005-2006 PSS frame and more up-to-date lists from state education agencies, private school associations, and other easily accessible sources. To improve the coverage of the PSS list
frame, the Census Bureau also conducted a survey to locate private schools in a random sample of geographic areas throughout the United States. The areas were single counties or groups of counties sampled
from an area frame constructed from all counties in the nation. Within each selected area a complete list of private schools was gathered using information from telephone directories, religious institutions,
local education agencies, chambers of commerce, and local government offices. Schools not already on the list frame were identified and added to the frame of private schools. A weighting component was
computed by the Census Bureau so that the additional area-frame schools would represent all schools absent from the list frame, not just those in the selected areas.
The sampling frame was restricted to schools located in the primary sampling units (PSUs) selected for the NAEP 2011 writing computer-based assessment (WCBA). In addition, the sampling frame
excluded ungraded schools, vocational schools with no enrollment, special-education-only schools, homeschool entities, prison and hospital schools, and juvenile correctional institutions.
For quality control purposes, school and student counts from the sampling frame were compared to school and student counts from previous private school frames by grade. No major discrepancies were found.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_sampfrme.aspx


Eighth- and Twelfth-Grade Schools and Enrollment in the Private School WCBA Sampling Frame
The following table presents the number of schools and estimated enrollment for the private school frame for grades 8 and 12. These enrollment numbers include only those schools with known affiliation. The unweighted estimated
enrollment is restricted to the selected primary sampling units (PSUs). The weighted estimated enrollment incorporates the PSU weight (inverse of the probability of selecting the PSU), as well as the Private School Universe Survey
(PSS) weight, and thus is a national estimate of the number of private school students in each grade.
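As a sketch of that calculation, using illustrative notation rather than the document's own, the weighted enrollment estimate for a grade and affiliation is the sum over frame schools of the PSU weight times the PSS weight times the school's estimated grade enrollment:

```latex
% Illustrative notation: w_PSU(s) = inverse of the probability of selecting the
% PSU containing school s; w_PSS(s) = PSS weight of school s; x_g(s) = estimated
% grade-g enrollment of school s; the sum runs over frame schools of affiliation a.
\[
  \widehat{X}_{g,a} \;=\; \sum_{s \in a} w_{\mathrm{PSU}}(s)\, w_{\mathrm{PSS}}(s)\, x_g(s)
\]
```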
Number of schools and enrollment in private school sampling frame for the writing computer-based assessment (WCBA), by school affiliation and grade: 2011
Grade | Affiliation  | Number of schools | Estimated enrollment (unweighted) | Estimated enrollment (weighted)
8     | Total        | 9,366 | 234,221 | 374,445
8     | Catholic     | 3,438 | 112,863 | 169,638
8     | Non-Catholic | 5,928 | 121,358 | 204,807
12    | Total        | 4,539 | 213,828 | 338,291
12    | Catholic     | 780   | 111,164 | 158,660
12    | Non-Catholic | 3,759 | 102,664 | 179,631

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_sampfrme_gr_8_12_wcba.aspx


New-School Sampling Frame for the Private School Writing Computer-Based Assessment (WCBA)
Whereas the Private School Universe Survey (PSS) file used for the frame corresponds to the 2007-2008 school year, the NAEP assessment year was the 2010-2011 school year. During this 3-year period, some schools closed, some
changed their grade span, and still others came into existence.
To achieve as close to full coverage as possible, the private school frame for the writing computer-based assessment (WCBA) was supplemented by a sample of new Catholic schools. The goal was to allow every such school a chance of
selection, thereby fully covering the target population of Catholic schools in operation during the 2010-2011 school year. The first step in this process was the development of a new-school frame through the construction of a
diocese-level file from the PSS school-level file. To develop the frame, the diocese-level file was divided into two files: one for small dioceses and a second for medium and large dioceses.
Small dioceses contained no more than three schools on the frame in total, with no more than one school at each grade (fourth, eighth, and twelfth). New schools in small dioceses were identified during school recruitment and added to
the sample if the old school in the same diocese was sampled at the relevant grade. From a sampling perspective, the new school was viewed as an “annex” to the sampled school that had a well-defined probability of selection equal to
that of the old school. The “frame” in this case was, in fact, the original frame; when the old school was sampled in a small diocese, the new school was automatically sampled as well.
To limit respondent burden and keep the level of effort within reasonable bounds, the new-school frame was created using information obtained from a sample of the remaining dioceses. The remaining dioceses were separated into two
strata of large- and medium-size dioceses. These strata were defined by computing the percentage of the nation’s total Catholic school enrollment each diocese represents, sorting the dioceses in descending order by that percentage, and
cumulating the percentages across the sorted file. All dioceses up to and including the first diocese at or above the 80th cumulative percentage were defined as large dioceses. The remaining dioceses were defined as medium dioceses.
A simplified example is given below. Dioceses are ordered by percentage enrollment. The first six become large dioceses and the last six become medium dioceses.
Example showing assignment of Catholic dioceses to the large and medium strata, private school assessment: 2011
Diocese       Percent enrollment   Cumulative percentage enrollment   Stratum
Diocese 1             20                         20                      L
Diocese 2             20                         40                      L
Diocese 3             15                         55                      L
Diocese 4             10                         65                      L
Diocese 5             10                         75                      L
Diocese 6             10                         85                      L
Diocese 7              5                         90                      M
Diocese 8              2                         92                      M
Diocese 9              2                         94                      M
Diocese 10             2                         96                      M
Diocese 11             2                         98                      M
Diocese 12             2                        100                      M

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
In actuality there were 71 large and 103 medium dioceses in the sampling frame.
The target sample size was 10 dioceses total: 8 large and 2 medium. In the medium stratum, the dioceses were selected with equal probability. In the large stratum, dioceses were sampled with probability proportional to enrollment. These
probabilities were retained and used in all later stages of sampling and weighting in order to represent all dioceses, whether or not they had been sampled to be surveyed for new schools.
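The stratum assignment and the two selection schemes can be summarized in a short sketch. The following Python fragment is illustrative only: the diocese names and enrollment shares mirror the simplified example above rather than the actual 2011 frame, and the sequential weighted draw used for the large stratum is a simplification of strict probability-proportional-to-enrollment selection without replacement.

import random

def split_large_medium(dioceses, cutoff=80.0):
    """Assign dioceses to the large (L) or medium (M) stratum.

    `dioceses` is a list of (name, enrollment) pairs. Dioceses are sorted in
    descending order of enrollment; every diocese up to and including the first
    one at or above the cumulative percentage cutoff (80 percent) is labeled
    large, and the rest are labeled medium."""
    total = sum(enrollment for _, enrollment in dioceses)
    ordered = sorted(dioceses, key=lambda d: d[1], reverse=True)
    strata, cumulative, passed_cutoff = {}, 0.0, False
    for name, enrollment in ordered:
        strata[name] = "M" if passed_cutoff else "L"
        cumulative += 100.0 * enrollment / total
        if cumulative >= cutoff:
            passed_cutoff = True
    return strata

def select_dioceses(dioceses, strata, n_large=8, n_medium=2, seed=None):
    """Equal-probability sample of medium dioceses; enrollment-weighted
    sequential draws (a simplification of strict PPS without replacement)
    for large dioceses."""
    rng = random.Random(seed)
    large = [d for d in dioceses if strata[d[0]] == "L"]
    medium = [d for d in dioceses if strata[d[0]] == "M"]
    chosen_medium = rng.sample(medium, min(n_medium, len(medium)))
    chosen_large, pool = [], list(large)
    for _ in range(min(n_large, len(pool))):
        weights = [enrollment for _, enrollment in pool]
        pick = rng.choices(pool, weights=weights, k=1)[0]
        chosen_large.append(pick)
        pool.remove(pick)
    return chosen_large + chosen_medium

# Hypothetical frame mirroring the simplified example table above.
frame = [("Diocese %d" % i, share) for i, share in
         enumerate([20, 20, 15, 10, 10, 10, 5, 2, 2, 2, 2, 2], start=1)]
strata = split_large_medium(frame)
print(strata)
print(select_dioceses(frame, strata, seed=1))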
Each selected diocese was sent a listing of its schools extracted from the 2007-2008 PSS file and was asked to provide information about new schools and any changes to grade span in existing schools. This information provided by the
selected dioceses was used to create sampling frames for the selection of new Catholic schools. The process of obtaining the information was conducted with the help of the National Catholic Educational Association (NCEA). NCEA was
sent the school lists for the 10 sampled dioceses and was responsible for returning the completed updates.
The eligibility of a new school at a particular grade was determined by its grade span. A school already on PSS also was classified as “new” if a change of grade span had occurred such that the school status changed from ineligible to
eligible at a particular grade.
As was done for the original sampling frame, the new-school sampling frame was restricted to schools located in the primary sampling units (PSUs) selected for the NAEP 2011 WCBA. Weights for schools in the new-school sample
were adjusted to account for the PSU selection probability.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_newschoolframe_wcba.aspx


Sampling of Private Schools for the 2011 Writing Computer-Based Assessment (WCBA)
The writing computer-based assessment (WCBA) private school sample was selected with probability proportional to size using systematic sampling from a sorted list. A school's measure of size was a
complex function of the school's estimated grade enrollment.
Schools were ordered within each grade using the serpentine sort described under the stratification of private schools. A systematic sample was then drawn using this serpentine sorted list and the measures of
size.
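Systematic selection with probability proportional to size from a sorted list can be sketched as follows; the school identifiers and measures of size are hypothetical, and the routine is a generic illustration of the technique rather than the production sampling program.

import random

def systematic_pps(frame, n_sample, seed=None):
    """Select n_sample units from `frame`, a list of (unit_id, measure_of_size)
    pairs already in serpentine sort order, by systematic PPS sampling: lay the
    measures of size end to end, choose a random start within the first sampling
    interval, and take every unit whose cumulative range contains a selection
    point. In the 2011 design the measures of size were scaled so that no school
    could be hit more than once; a unit larger than the interval would otherwise
    be hit repeatedly."""
    rng = random.Random(seed)
    total_mos = sum(mos for _, mos in frame)
    interval = total_mos / n_sample
    start = rng.uniform(0, interval)
    points = [start + k * interval for k in range(n_sample)]

    selected, cumulative, idx = [], 0.0, 0
    for unit_id, mos in frame:
        lower, upper = cumulative, cumulative + mos
        while idx < n_sample and lower <= points[idx] < upper:
            selected.append(unit_id)
            idx += 1
        cumulative = upper
    return selected

# Hypothetical sorted frame of (school id, measure of size).
frame = [("S01", 12.0), ("S02", 45.0), ("S03", 30.0), ("S04", 8.0), ("S05", 25.0)]
print(systematic_pps(frame, n_sample=2, seed=1))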

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_schlsamp.aspx


Computation of Measures of Size for the 2011 Private School Writing Computer-Based Assessment (WCBA)
In the design of each school sample, five objectives underlie the process of determining the probability of selection for each school and how many students are to be sampled from each selected school containing grade-eligible students:
to meet the target student sample size;
to select an equal-probability sample of students;
to limit the number of students who are selected from a school;
to ensure that the sample within a school does not include a very high percentage of the students in the school, unless all students are included; and
to reduce the rate of sampling of small schools, in recognition of the greater cost and burden per student of conducting assessments in such schools.
The goal in determining the school's measure of size is to optimize across the last four objectives in terms of maintaining the accuracy of estimates and the cost-effectiveness of the sample design. The following algorithm was used to
assign a measure of size to each school based on its estimated grade enrollment as indicated on the sampling frame.
The measures of size vary by enrollment size. The initial measures of size (MOS) were set as follows, for both eighth and twelfth grades:

where Xjs is the estimated grade enrollment for grade j (j = 8, 12) in school s, PSCHWTs= the Private School Universe Survey (PSS) area frame weight for school s, computed by the U.S. Census Bureau, and PSU_WTs = the primary
sampling unit (PSU) weight for school s.
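In symbols, and omitting the enrollment-dependent adjustments for small schools implied by the statement that the measures of size vary by enrollment size, a simplified form consistent with these definitions is

% sketch of the general structure only, not the exact 2011 formula
MOS_{js} \approx X_{js} \cdot PSCHWT_{s} \cdot PSU\_WT_{s}

that is, the estimated grade enrollment inflated by the school's PSS area-frame weight and its PSU weight.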
An adjustment to the initial measure of size was made for some schools. Schools in the PSU containing Honolulu County had their measure of size increased by a factor of two in order to double their probability of selection.
The school measure of size was then rescaled to create an expected number of hits by applying a multiplicative constant bj, which varies by grade and school type. For the national WCBA sample, by design, a school could not be selected
or "hit" in the sampling process more than once.
The rescaled measure of size, Ejs, was defined as:
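A sketch of the rescaling step described above, in which the constant bj converts the measure of size into an expected number of hits, is

E_{js} = b_{j} \cdot MOS_{js}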

For grade 8 only, a final adjustment was made to the measures of size (Ejs) in the national sample to attempt to reduce school burden by minimizing the number of schools that were selected for simultaneous administration of the WCBA,
the operational private school assessments (mathematics, reading, and science), and the Trends in International Mathematics and Science Study (TIMSS). The NAEP 2011 studies for grade 8 used an adaptation of the Keyfitz process to
compute conditional measures of size that, by their design, minimized the overlap of schools selected for the three types of assessment. Grade 12 did not have any operational assessments or a TIMSS sample in 2011.
The school's probability of selection πjs was given by:
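Because a school could be hit at most once, the expected number of hits also serves as the selection probability, capped at one; a form consistent with this description is

\pi_{js} = \min\left(1,\; E_{js}\right)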

One can choose a value of bj such that the expected overall student sample yield matches the desired targets specified by the design, where the expected yield is calculated by summing the product of an individual school’s probability and
its student sample yield across all schools in the frame.
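The search for such a constant can be illustrated with a short numerical routine. The sketch below is illustrative only: it assumes the within-school yield is the smaller of the 30-student cluster size and the school's grade enrollment, and the function and variable names (expected_yield, solve_b, frame) are hypothetical rather than part of the NAEP specification.

def expected_yield(b, frame, cluster=30):
    """Expected student yield for a given constant b, where `frame` holds
    (measure_of_size, grade_enrollment) pairs for every school on the frame.
    Each school contributes its selection probability, capped at 1, times the
    number of students that would be sampled from it."""
    return sum(min(1.0, b * mos) * min(cluster, enrollment)
               for mos, enrollment in frame)

def solve_b(frame, target, cluster=30):
    """Bisection search for the constant b at which the expected yield
    reaches the target student sample size."""
    lo, hi = 0.0, 1.0
    while expected_yield(hi, frame, cluster) < target and hi < 1e12:  # widen bracket
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if expected_yield(mid, frame, cluster) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical frame of (measure of size, estimated grade enrollment).
frame = [(180.0, 60), (95.0, 35), (40.0, 22), (300.0, 120), (15.0, 9)]
b = solve_b(frame, target=70)
print(round(b, 4), round(expected_yield(b, frame), 1))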
In addition, new and newly eligible schools were sampled from the new-school frame. The assigned measures of size for these schools,

used the bj value from the main school sample for the grade and school type (i.e., the same sampling rates as for the main school sample). The variable πdjs is the probability of selection of the diocese into the new-school diocese (d)
sample.
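One expression consistent with this description, in which a new school is sampled at the main sample's rate and its measure of size is inflated by the inverse of its diocese's selection probability, would be

% sketch of the general structure only; the exact 2011 expression is not reproduced here
E^{new}_{js} = \frac{b_{j} \cdot MOS_{js}}{\pi_{djs}}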

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schlsamp_wcba_mos.aspx


School Sample Sizes for 2011 Private School Writing Computer-Based Assessment (WCBA): Frame and New School
The following table presents the number of schools selected from the private school WCBA sampling frame (constructed from the Private School Universe Survey file) and the new-school sampling frame, for eighth and twelfth grade, by
school type.
NAEP private school frame-based and new school writing computer-based assessment (WCBA) samples, by grade and school type: 2011
Grade and private school type   Total school sample   Frame school sample   New school sample
Eighth grade
  All private                           157                   155                  2
  Catholic                               50                    48                  2
  Non-Catholic                          106                   106                  0
  Unknown affiliation                     1                     1                  0
Twelfth grade
  All private                           177                   177                  0
  Catholic                               55                    55                  0
  Non-Catholic                          120                   120                  0
  Unknown affiliation                     2                     2                  0

NOTE: Details may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_delta_schlsamp_sampsize_wcba.aspx


Stratification of Private Schools for the 2011 Writing Computer-Based Assessment (WCBA)
Prior to stratification, the private school sampling frame was divided into grade-specific files, one each for eighth and twelfth grade. For each such grade-specific file, schools were explicitly stratified by private school affiliation
(Catholic, non-Catholic, and unknown affiliation). Private school affiliation was unknown for nonrespondents to the NCES Private School Universe Survey (PSS). Within private school type, separate implicit stratification schemes were
used to sort schools in certainty primary sampling units (PSUs) and noncertainty PSUs. In all cases, the implicit stratification was achieved via a serpentine sort.
Within each certainty PSU, the schools were hierarchically sorted by
census region,
urbanization classification (urban-centric locale), and
estimated grade enrollment.
Schools in noncertainty PSUs were hierarchically sorted by
PSU stratum,
urbanization classification (urban-centric locale), and
estimated grade enrollment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_strat.aspx


Student Sample Selection for the Private School 2011 Writing Computer-Based Assessment (WCBA)
For the NAEP 2011 writing computer-based assessment (WCBA), the target student sample sizes within sampled schools were the same for both eighth and twelfth grades. All students were sampled if the school had 30 or fewer students
in that grade. Otherwise, a sample of 30 students was selected without replacement.
The process of list submission, sampling new enrollees, and determining student eligibility and exclusion status was the same as the process used for the NAEP 2011 state student samples.
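A minimal sketch of the within-school student selection rule, using a hypothetical helper rather than the operational student-sampling system, is:

import random

def sample_students(roster, cluster=30, seed=None):
    """Take all students when the grade has `cluster` or fewer; otherwise draw a
    systematic, equal-probability sample of `cluster` students without replacement."""
    if len(roster) <= cluster:
        return list(roster)
    rng = random.Random(seed)
    interval = len(roster) / cluster          # sampling interval greater than 1
    start = rng.uniform(0, interval)          # random start within first interval
    return [roster[int(start + k * interval)] for k in range(cluster)]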
http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_science_wcba_studsamp.aspx


Substitute Private Schools for the 2011 Writing Computer-Based Assessment (WCBA)
Substitutes were preselected for the private school samples by sorting the school frame file according to the actual order used in the sampling process (the implicit stratification). For operational reasons, the original selection order was
embedded within the sampled primary sampling unit (PSU). Each sampled school had each of its nearest neighbors within the same sampling stratum on the school frame file identified as a potential substitute. When grade enrollment
was used as the last sort ordering variable, the nearest neighbors had grade enrollment values very close to that of the sampled school. This was done to facilitate the selection of about the same number of students within the substitute as
would have been selected from the original sampled school.
Schools were disqualified as potential substitutes if they were already selected in any of the original private school samples or assigned as a substitute for another private school (earlier in the sort ordering), or if they were already selected
in the original 2011 Trends in International Mathematics and Science Study (TIMSS) sample. TIMSS substitutes were eligible to be used as substitutes for the writing computer-based assessment (WCBA). Schools assigned as substitutes
for twelfth-grade schools were disqualified as potential substitutes for eighth-grade schools.
If both nearest neighbors were still eligible to be substitutes, the one with a closer grade enrollment was chosen. If both nearest neighbors were equally distant from the sampled school in their grade enrollment (an uncommon
occurrence), one of the two was randomly selected. If the grade enrollment of the nearest neighbor school was less than half of the expected student sample size of the original sampled school, then it was considered ineligible as a
substitute for that school.
Of the approximately 330 originally sampled private schools for the WCBA, about 100 had a substitute activated because the original school, although eligible, did not participate. Ultimately, about 40 substitute private schools
participated in the WCBA.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_subs.aspx


Target Population of the Private School 2011 Writing Computer-Based Assessment (WCBA)
The target population for the private school 2011 writing computer-based assessment (WCBA) included all students who were enrolled in eighth and twelfth grades in private schools. The sample frame included private schools in the 50
states and the District of Columbia.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_priv_wcba_targpop.aspx


Public School 2011 Writing Computer-Based Assessment (WCBA)
The NAEP 2011 writing computer-based assessment (WCBA) sample design yielded nationally representative samples of public school students in each grade (grades 8 and 12) through a three-stage
approach: selection of primary sampling units (PSUs), selection of schools within strata, and selection of students within schools. The sample of schools was selected with probability proportional to a
measure of size based on the estimated grade enrollment in the schools.

Target Population

The 2011 WCBA was administered in both grades 8 and 12, with the goal of assessing 19,800 students in each grade. The target sample size was adjusted to reflect expected public school and student response and eligibility.

Sampling Frame

Stratification of Schools

Schools on the sampling frame were explicitly stratified prior to sampling by PSU type (certainty/noncertainty). Within certainty PSUs, schools were implicitly stratified by census region, urban-centric locale, and median household income in the zip code area where the school is located. Within noncertainty PSUs, schools were implicitly stratified by PSU stratum, urban-centric locale, and median household income in the zip code area where the school is located.

Sampling of Schools

From the stratified frame of public schools, systematic random samples of eighth- and twelfth-grade schools were drawn with probability proportional to a measure of size based on the estimated grade enrollment of the school in the relevant grade.

Ineligible Schools

Substitute Schools

Student Sample Selection

Each selected school in the public school samples provided a list of eligible enrolled students from which a systematic, equal-probability sample of students was drawn.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_science_pub_wcba.aspx


Ineligible Public Schools for the 2011 Writing Computer-Based Assessment (WCBA)
The Common Core of Data (CCD) public school file from which most of the sampled schools were drawn corresponds to the 2007-2008 school year, 3 years prior to the assessment school year. During the intervening period, some of
these schools either closed, no longer offered the grade of interest, or became ineligible for other reasons. In such cases, the sampled schools were considered to be ineligible.
The table below presents unweighted counts of sampled public schools by grade and eligibility status, including the reason for ineligibility.
Number of sampled public schools, writing computer-based assessment (WCBA), by eligibility status and grade: 2011
Eligibility status                              Unweighted count of schools   Unweighted percentage
All eighth-grade sampled public schools                     890                      100.00
  Eligible schools                                          841                       94.49
  No eligible students in grade                               1                        0.11
  Does not have grade                                        12                        1.35
  School closed                                              28                        3.15
  Not a regular school                                        8                        0.90
  Other ineligible school                                     0                        0.00
  Duplicate on sampling frame                                 0                        0.00
All twelfth-grade sampled public schools                  1,200                      100.00
  Eligible schools                                        1,100                       94.57
  No eligible students in grade                               3                        0.25
  Does not have grade                                        13                        1.09
  School closed                                              16                        1.34
  Not a regular school                                       30                        2.51
  Other ineligible school                                     3                        0.25
  Duplicate on sampling frame                                 0                        0.00

NOTE: Detail may not add up to totals due to rounding. Percentages are based on unrounded counts.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_wcba_inelg.aspx


Sampling Frame for the Public School 2011 Writing Computer-Based Assessment (WCBA)
The sampling frame for public schools was derived from the Common Core of Data (CCD) file corresponding to the 2007-2008 school year. The CCD file provided the frame of all regular public schools, state-operated public schools, Bureau of Indian Education (BIE) schools, and Department of Defense Domestic Dependent Elementary and Secondary Schools (DDESS) open during the 2007-2008 school year.
The sampling frame was restricted to schools located in the primary sampling units (PSUs) selected for the NAEP 2011 writing computer-based assessment (WCBA). The sampling frame also excluded
ungraded schools, vocational schools with no enrollment, special-education-only schools, homeschool entities, prison or hospital schools, and juvenile correctional institutions.
For quality control purposes, school and student counts from the sampling frame were compared to school and student counts from previous public school frames by grade. No major discrepancies were found.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_science_wcba_sampfrme.aspx


Eighth- and Twelfth-Grade Schools and Enrollment in the Public School WCBA Sampling Frame
The following table presents the number of schools and estimated enrollment for the public school frame for grades 8 and 12. The unweighted estimated enrollment is restricted to the selected primary sampling units (PSUs). The
weighted estimated enrollment incorporates the PSU weight (inverse of the probability of selecting the PSU), and thus is a national estimate of the number of public school students in each grade.
Number of schools and enrollment in public school sampling frame, writing computer-based assessment (WCBA), by grade: 2011
Grade   School count in sampled PSUs   Estimated enrollment (unweighted)   Estimated enrollment (weighted)
8                  11,379                         1,952,079                          3,635,336
12                  9,068                         1,833,707                          3,423,860

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_pub_sampfrme_gr_8_12.aspx


New-School Sampling Frame for the Public School Writing Computer-Based Assessment (WCBA)
The Common Core of Data (CCD) file used for the frame corresponds to the 2007-2008 school year, whereas the assessment year is the 2010-2011 school year. During this 3-year period, some schools closed, some changed structure (one
school becoming two schools, for example), and others came into existence.
To achieve as close to full coverage as possible, the writing computer-based assessment (WCBA) school frame was supplemented by a sample of new schools obtained from a sample of districts. Each sampled district was sent a list of the
CCD schools and asked to add in any new schools or old schools that had become newly eligible for eighth or twelfth grades.
Since asking every school district to list new- and newly-eligible schools would have generated too much of a burden, a sample of districts was contacted to obtain a list of new schools. To represent the unsampled districts in the full
sample of schools, weights for schools included in the new-school sample were adjusted to reflect the district selection probability.
As was done for the original sampling frame, the new-school sampling frame was restricted to schools located in the primary sampling units (PSUs) selected for the NAEP 2011 WCBA. Weights for schools in the new-school sample
were further adjusted to account for the PSU selection probability.
The goal was to allow every new school a chance of selection, thereby fully covering the target population of schools in operation during the 2010-2011 school year. The first step in this process was the development of a new-school
frame through the construction of a district-level file from the CCD school-level file. To develop the frame, the district-level file was divided into two files: one for small districts and a second for medium and large districts.
Small districts contained no more than three schools on the frame in total, with no more than one school at each targeted grade (fourth, eighth, and twelfth). New schools in small districts were identified during school recruitment and
added to the sample if the old school was sampled. From a sampling perspective, the new school was viewed as an “annex” to the sampled school that had a well-defined probability of selection equal to that of the old school. The “frame”
in this case was, in fact, the original frame; when the old school was sampled in a small district, the new school was automatically sampled as well.
The remaining districts were defined as medium and large districts. In these districts, a frame of new schools was developed based on information provided by the district. To limit the required effort, the new-school frame was created
through developing information on a sample of medium and large public school districts in each jurisdiction. All districts were selected in the following classes of districts:
jurisdictions where all schools were sampled with certainty at either grade 8 or 12 (so that all new schools would be selected with certainty, as well),
state-operated districts,
districts in states with fewer than 10 districts,
districts containing no schools other than charter schools, and
TUDA districts.
The remaining districts in each jurisdiction (excepting the certainty jurisdictions) were separated into two strata of large- and medium-size districts. These strata were defined by computing an aggregate percentage of enrollment for each
district within the state (removing districts in the certainty strata defined above) and sorting in descending order by percentage of jurisdiction enrollment represented by the district. All districts up to and including the first district at or
above the 80th cumulative percentage were defined as large districts. The remaining districts were defined as medium districts.
An example is given below. A state's districts are ordered by percentage enrollment. The first six become large districts and the last six become medium districts.
Large and medium districts example, state assessment, by enrollment, stratum, and district: 2011
District   Percentage enrollment   Cumulative percentage enrollment   Stratum
1                  20                           20                       L
2                  20                           40                       L
3                  15                           55                       L
4                  10                           65                       L
5                  10                           75                       L
6                  10                           85                       L
7                   5                           90                       M
8                   2                           92                       M
9                   2                           94                       M
10                  2                           96                       M
11                  2                           98                       M
12                  2                          100                       M

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011.
The target sample size for each jurisdiction was 10 districts. Where possible, we selected 8 large and 2 medium districts. However, in the example above, since there are only 6 large districts, all of the large districts and 4 of the medium
districts were selected for the new-school inquiry.
If sampling was needed in the medium stratum (i.e., it was not a certainty jurisdiction), the medium districts were selected with equal probability. If sampling was needed in the large stratum, the large districts were sampled with
probability proportional to enrollment. These probabilities were retained and used in all later stages of sampling and weighting, as the district probability then represented the number of other districts that were not sampled to be surveyed
for new schools.
The selected districts in each jurisdiction were then sent a listing of all their schools that appeared on the 2007-2008 CCD file and were asked to provide information about the new schools not included in the file and grade span changes
of existing schools. These listings provided by the selected districts were used as sampling frames for selection of new public schools and updates of existing schools. This process was conducted through the NAEP State Coordinator in
each jurisdiction. The coordinators were sent the information for all sampled districts in their respective states and were responsible for returning the completed updates.
The eligibility of a school was determined based on the grade span. A school also was classified as “new” if a change of grade span had occurred such that the school status changed from ineligible to eligible in a particular grade.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_pub_newschoolframe_wcba.aspx


Sampling of Public Schools for the 2011 Writing Computer-Based Assessment (WCBA)
The writing computer-based assessment (WCBA) public school sample was selected with probability proportional to size, using systematic sampling from a sorted list. A school's measure of size was a
complex function of the school's estimated grade enrollment.
Schools were ordered within each grade, using the serpentine sort described under the stratification of public schools. A systematic sample was then drawn using this serpentine-sorted list and the measures of
size.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_wcba_schlsamp.aspx


Computation of Measures of Size for the 2011 Public School Writing Computer-Based Assessment (WCBA)
In the design of each school sample, five objectives underlie the process of determining the probability of selection for each school and how many students are to be sampled from each selected school containing grade-eligible students:
to meet the target student sample size;
to select an equal-probability sample of students;
to limit the number of students who are selected from a school;
to ensure that the sample within a school does not include a very high percentage of the students in the school, unless all students are included; and
to reduce the rate of sampling of small schools, in recognition of the greater cost and burden per student of conducting assessments in such schools.
The goal in determining the school's measure of size is to optimize across the last four objectives in terms of maintaining the accuracy of estimates and the cost-effectiveness of the sample design. The following algorithm was used to
assign a measure of size to each school based on its estimated grade enrollment as indicated on the sampling frame.
The measures of size vary by enrollment size. The initial measures of size (MOS) were set as follows, for both eighth and twelfth grades:

where xjs is the estimated grade enrollment for grade j (j = 8, 12) in school s, and PSU_WTs is the primary sampling unit (PSU) weight for school s.
An adjustment to the initial measure of size was made for some schools. Schools with a high percentage of Black or Hispanic students, and schools in the PSU containing Honolulu County, had their measure of size increased by a factor
of two, in order to double their probability of selection.
The school measure of size was then rescaled to create an expected number of hits by applying a multiplicative constant bj, which varies by grade. For the national writing computer-based assessment (WCBA) sample, by design, a school
could not be selected or "hit" in the sampling process more than once.
The rescaled measure of size, Ejs, was defined as:

A final adjustment was made to the measures of size (Ejs) in the national sample to attempt to reduce school burden by minimizing the number of schools selected for simultaneous administration of both the state and national studies. The
NAEP 2011 studies used an adaptation of the Keyfitz process to compute conditional measures of size that, by their design, minimized the number of schools selected for the national study (WCBA) that were also selected for the state
assessment or the Trends in International Mathematics and Science Study (TIMSS).
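A generic two-sample version of this idea, in which a school's conditional probability for the later sample depends on whether it was drawn for the earlier one while its unconditional probability is preserved, is the textbook Keyfitz-type rule

% \pi^{A}_{s} is the school's probability in the earlier sample and \pi^{B}_{s} its target probability in the later sample
\pi^{B \mid \text{in } A}_{s} = \max\!\left(0,\; \frac{\pi^{A}_{s} + \pi^{B}_{s} - 1}{\pi^{A}_{s}}\right),
\qquad
\pi^{B \mid \text{not in } A}_{s} = \min\!\left(1,\; \frac{\pi^{B}_{s}}{1 - \pi^{A}_{s}}\right)

The NAEP 2011 computation generalized this idea to handle the state, TIMSS, and WCBA samples jointly; the two-sample rule above is shown only to convey the mechanism, not the exact adaptation used.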
The school's probability of selection πjs was given by:

One can choose a value of bj such that the expected overall student sample yield matches the desired targets specified by the design, where the expected yield is calculated by summing the product of an individual school’s probability and
its student sample yield across all schools in the frame.
In addition, new and newly eligible schools were sampled from the new-school frame. The assigned measures of size for these schools,

used the bj value from the main school sample for the grade (i.e., the same sampling rates as for the main school sample). The variable πdjs is the probability of selection of the district into the new-school district (d) sample.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_pub_schlsamp_wcba_mos.aspx


School Sample Sizes for 2011 Public School WCBA: Frame and New School
The following table presents the number of schools selected for the 2011 public school writing computer-based assessment from the public school sampling frame and the new school sampling frame, for grades 8 and 12.
NAEP public school WCBA frame-based and new school samples, by grade: 2011
Grade   Total school sample   Frame school sample   New school sample
8                890                   866                  24
12             1,200                 1,200                  12

NOTE: Details may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_pub_schlsamp_sampsize_wcba.aspx


Stratification of Public Schools for the 2011 Writing Computer-Based Assessment (WCBA)
Prior to stratification, the public school sampling frame was divided into grade-specific files, one each for eighth and twelfth grade. For each grade-specific frame file, separate implicit stratification schemes were used to sort schools into
certainty primary sampling units (PSUs) and noncertainty PSUs. In all cases, the implicit stratification was achieved via a "serpentine sort."
For certainty PSUs, the schools were hierarchically sorted by
census region,
urbanization classification (urban-centric locale), and
median household income in the zip code area where the school is located.
Schools in noncertainty PSUs were hierarchically sorted by
PSU stratum,
urbanization classification (urban-centric locale), and
median household income in the zip code area where the school is located.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_wcba_strat.aspx


Student Sample Selection for the Public School 2011 Writing Computer-Based Assessment (WCBA)
For the NAEP 2011 writing computer-based assessment (WCBA), the target student sample sizes within sampled schools were the same for both eighth and twelfth grades. All students were sampled if the school had 30 or fewer students
in that grade. Otherwise, a sample of 30 students was selected without replacement.
The process of list submission, sampling students from year-round schools, sampling new enrollees, and determining student eligibility and exclusion status was the same as the process used for the NAEP 2011 state student samples.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_science_wcba_studsamp.aspx


Substitute Public Schools for the 2011 Writing Computer-Based Assessment (WCBA)
Substitutes were preselected for the public school samples by sorting the school frame file according to the actual order used in the sampling process (the implicit stratification). For operational reasons, the original selection order was
embedded within the sampled primary sampling unit (PSU) and state. Each sampled school had each of its nearest neighbors within the same sampling stratum on the school frame file identified as a potential substitute. When grade
enrollment was used as the last sort ordering variable, the nearest neighbors had grade enrollment values very close to that of the sampled school. This was done to facilitate the selection of about the same number of students within the
substitute as would have been selected from the original sampled school.
Schools were disqualified as potential substitutes if they were already selected in any of the original public school samples or assigned as a substitute for another public school (earlier in the sort ordering), or if they were already selected
in the original 2011 Trends in International Mathematics and Science Study (TIMSS) sample. TIMSS substitutes could be used as substitutes for the writing computer-based assessment (WCBA). Schools assigned as substitutes for
twelfth-grade schools were disqualified as potential substitutes for eighth-grade schools.
If both nearest neighbors were still eligible to be substitutes, the one with a closer grade enrollment was chosen. If both nearest neighbors were equally distant from the sampled school in their grade enrollment (an uncommon
occurrence), one of the two was randomly selected. If the grade enrollment of the nearest neighbor school was less than half of the expected student sample size of the original sampled school, then it was considered ineligible as a
substitute for that school.
Of the approximately 2,090 originally sampled public schools for the WCBA, about 30 had a substitute activated because the original school, although eligible, did not participate. Ultimately, about 20 of the activated substitute public schools, all of them twelfth-grade schools, participated in the computer-based assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_wcba_subs.aspx


Target Population of the Public School 2011 Writing Computer-Based Assessment (WCBA)
The target population for the 2011 public school writing computer-based assessment (WCBA) included all students who were enrolled in eighth and twelfth grades, in public schools, Bureau of Indian Education (BIE) schools, and
Department of Defense Education Activity Schools (DoDEA) in the 50 states and the District of Columbia.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_pub_science_wcba_targpop.aspx


School and Student Participation Results for the 2011 Writing Computer-Based Assessment (WCBA)
Participation in the NAEP writing computer-based assessment (WCBA) is not mandatory. Although a portion of the participating school sample consisted of substitute schools, it is preferable to
calculate school response rates on the basis of school participation before substitution.
In every NAEP survey, some of the sampled students are not assessed for the following reasons:
withdrawn students,
excluded students with disabilities (SD),
excluded English language learner (ELL) students, or
students absent from both the original session and the makeup session (not excluded but not assessed).


Withdrawn students are those who have left the school before the original assessment. Excluded students were determined by their school to be unable to meaningfully take the NAEP assessment in their assigned subject, even with an
accommodation. Excluded students must also be classified as SD and/or ELL. Other students who were absent for the initial session are assessed in the makeup session. The last category includes students who were not excluded (i.e.,
“were to be assessed”) but were not assessed either due to absence from both sessions or because of a refusal to participate. Assessed students are also classified as assessed without an accommodation or assessed with an accommodation.
The latter group can be divided into SD students assessed with an accommodation, ELL students assessed with an accommodation, or students who are both SD and ELL and accommodated. Note that some SD and ELL students are
assessed without accommodations, and students who are neither SD nor ELL can only be assessed without an accommodation.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_science_results_wcba.aspx


School Response Rates for 2011 Writing Computer-Based Assessment (WCBA)
The following table presents counts of eligible sampled schools and participating schools, as well as weighted school response rates, for the writing computer-based assessment (WCBA) school sample. The weighted school response rates
estimate the proportion of the student population that is represented by the participating school sample prior to substitution.
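In terms of school base weights, the weighted rate in the last column is of the form

% w_{s} is the school base weight (inverse selection probability) and x_{s} its estimated grade enrollment
R_{\text{school}} = 100 \times \frac{\sum_{s \in \text{participating originals}} w_{s}\, x_{s}}{\sum_{s \in \text{eligible sampled}} w_{s}\, x_{s}}

The notation is illustrative; it is meant only to make explicit that the rate is weight- and enrollment-based rather than a simple count of schools.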
School response counts and rates for public and private schools, writing computer-based assessment (WCBA), by school type, geographic region, and grade: 2011

Grade   School type and geographic region   Number of sample eligible schools   Number of participating schools, including substitutes   Weighted school response rate prior to substitution (percent)
8       National all                                      981                                      947                                               97.27
8       Northeast all                                     167                                      156                                               95.36
8       Midwest all                                       189                                      186                                               98.83
8       South all                                         377                                      365                                               97.15
8       West all                                          248                                      240                                               97.43
8       National public                                   841                                      839                                               99.73
8       National private                                  140                                      108                                               71.21
8       Catholic                                           42                                       42                                               95.53
8       Non-Catholic private                               98                                       66                                               52.06
12      National all                                    1,300                                    1,200                                               93.52
12      Northeast all                                     233                                      213                                               91.91
12      Midwest all                                       249                                      245                                               96.93
12      South all                                         468                                      441                                               94.66
12      West all                                          341                                      318                                               89.70
12      National public                                 1,100                                    1,100                                               96.04
12      National private                                  160                                      122                                               67.23
12      Catholic                                           55                                       50                                               76.60
12      Non-Catholic private                              105                                       72                                               58.35

NOTE: Detail may not sum to totals because of rounding. Percentages are based on unrounded counts.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_schresp_rates_wcba.aspx


Weighted Student Response and Exclusion Rates for the 2011 Writing Computer-Based Assessment (WCBA)
The following table presents the weighted student response and exclusion rates for the writing computer-based assessment (WCBA). The exclusion rates give the percentage of students excluded among all eligible students. Excluded
students must be either students with disabilities (SD) or English language learners (ELL). The response rates indicate the percentage of students assessed among those who were intended to take the assessment in participating schools.
Thus, students who were excluded are not included in the denominators of the response rates.
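With student base weights w_{k}, the two quantities are of the following form (the notation is illustrative, but it mirrors the definitions above: excluded students appear in the denominator of the exclusion rate only and are removed from the denominator of the response rate):

\text{Exclusion rate} = 100 \times \frac{\sum_{k \in \text{excluded}} w_{k}}{\sum_{k \in \text{eligible}} w_{k}},
\qquad
\text{Response rate} = 100 \times \frac{\sum_{k \in \text{assessed}} w_{k}}{\sum_{k \in \text{eligible} \setminus \text{excluded}} w_{k}}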

Weighted student response and exclusion rates for public and private schools, writing computer-based assessment (WCBA), by school type, geographic region, and grade: 2011

Grade   School type and geographic region   Weighted student response rate (percent)   Weighted percent of all students who are SD and excluded   Weighted percent of all students who are ELL and excluded
8       National all                                       94.00                                        1.40                                                        0.49
8       Northeast all                                      93.18                                        1.52                                                        0.80
8       Midwest all                                        94.18                                        1.59                                                        0.33
8       South all                                          94.63                                        1.32                                                        0.44
8       West all                                           93.47                                        1.25                                                        0.46
8       National public                                    93.99                                        1.51                                                        0.52
8       National private                                   94.09                                        0.24                                                        0.05
8       Catholic                                           94.72                                        0.53                                                        0.00
8       Non-Catholic private                               93.29                                        0.00                                                        0.08
12      National all                                       86.98                                        2.11                                                        0.35
12      Northeast all                                      84.18                                        1.75                                                        0.43
12      Midwest all                                        86.39                                        2.11                                                        0.15
12      South all                                          88.38                                        2.48                                                        0.32
12      West all                                           87.71                                        1.84                                                        0.51
12      National public                                    86.98                                        2.29                                                        0.38
12      National private                                   87.01                                        0.24                                                        0.03
12      Catholic                                           85.95                                        0.11                                                        0.00
12      Non-Catholic private                               88.36                                        0.36                                                        0.06

NOTE: SD = students with disabilities; ELL = English language learners.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Computer-Based Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_science_wcba_stud_resp.aspx


Selection of Primary Sampling Units for the 2011 WCBA and MCBS Assessments
For the writing computer-based assessment (WCBA) and mathematics computer-based study (MCBS), a sample of 105 primary sampling units (PSUs) was drawn from a frame of PSUs based on
current Census information.
After the PSU frame was created, 29 certainty PSUs (those with measures of size large enough that it is efficient to take them with probability of selection equal to 1) were identified and set aside.
Stratification of the noncertainty PSUs (the remaining PSUs with probabilities of selection strictly less than 1) was carried out after analysis of Census 2000 data and NAEP 2000 achievement scores
identified the stratification variables. This analysis identified the set of PSU-level, Census-based variables that had as much association with NAEP assessment scores as possible. The intent was that
the results of this analysis and stratification would be used for multiple design years and subject matter. The results were used previously in 2006, 2008, 2009, and 2010. Periodically, this analysis
and stratification will be conducted according to the availability of Census data and key assessment scores. Measures of size and probabilities of selection were defined for each PSU, and a stratified
systematic sample of PSUs was drawn. For WCBA and MCBS, 76 noncertainty PSUs were selected.
The PSUs on the frame satisfied the following criteria:


The PSU sampling frame included all U.S. states and the District of Columbia, but excluded the U.S. territories and Puerto Rico;
PSUs consisted of one county or contiguous multiple counties;
Metropolitan Statistical Areas (MSAs) were designated as separate PSUs despite their large size, as they were sufficiently compact in terms of travel costs (owing to higher levels of transportation infrastructure);
PSUs did not cross Census region boundaries;
PSUs did not cross state boundaries, in general;
Non-MSA PSUs in the Northeast and South Census regions had a minimum population of 15,000 youths (age 0 to 17 inclusive), and in the Midwest and West Census regions had a minimum population of 10,000 youths, in general,
according to the 2003 U.S. Census Bureau's Population Estimates Program; and
Non-MSA PSUs were to be of minimum size (defined in terms of maximum distance between points—a rough proxy for travel time) while still satisfying the minimum population constraints.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_wcba_mcbs_psu_selection.aspx


Final Primary Sampling Unit (PSU) Sample for the 2011 Assessments
For the writing computer-based assessment (WCBA) and mathematics computer-based study (MCBS), a primary sampling unit (PSU) sample was drawn independently from each of the 76 noncertainty strata defined in Final Primary
Sampling Unit Strata. One PSU was selected with probability proportionate to size (with size equal to estimated number of youths) within each stratum. The selection of the noncertainty PSUs was designed to minimize the overlap with
the 2008 LTT sample, the 2009 science sample, and the 2010 sample.
Also, 29 PSUs were included in the sample of PSUs with certainty.
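Selecting one PSU per noncertainty stratum with probability proportionate to its youth count amounts to a single weighted draw in each stratum. The following sketch is illustrative only (hypothetical PSU identifiers and counts) and omits the minimization of overlap with the 2008-2010 samples described above:

import random

def select_one_psu_per_stratum(strata, seed=None):
    """`strata` maps a stratum label to a list of (psu_id, youth_count) pairs.
    One PSU is drawn from each stratum with probability proportional to its
    number of youths; certainty PSUs are handled outside this routine."""
    rng = random.Random(seed)
    sample = {}
    for label, psus in strata.items():
        ids = [psu_id for psu_id, _ in psus]
        weights = [youths for _, youths in psus]
        sample[label] = rng.choices(ids, weights=weights, k=1)[0]
    return sample

# Hypothetical strata of (PSU id, number of youths).
strata = {
    "Stratum 1": [("PSU-A", 120000), ("PSU-B", 95000), ("PSU-C", 60000)],
    "Stratum 2": [("PSU-D", 210000), ("PSU-E", 80000)],
}
print(select_one_psu_per_stratum(strata, seed=7))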
Distribution of sampled PSUs, computer-based writing and mathematics assessments, by PSU type: 2011
PSU type                           Number of sampled PSUs
Total                                       105
Census region
  Northeast                                  15
  Midwest                                    23
  South                                      42
  West                                       25
Certainty/metropolitan status
  Certainty metropolitan                     29
  Noncertainty metropolitan                  54
  Noncertainty non-metropolitan              22

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_psu_finalsample.aspx


Primary Sampling Unit (PSU) Generation: Certainty PSUs for the 2011 Assessments
Any primary sampling unit (PSU) was defined as a certainty PSU if it had 500,000 or more youths. The estimated number of youths is the number of persons age 17 or under from the 2008 U.S. Census Bureau's Population Estimates
Program.1 These PSUs were so large that a sample of schools was taken from all of them (rather than from only a subsample of them, as with noncertainty PSUs). There were two exceptions to the 500,000 cutoff. The Honolulu, Hawaii,
and Washington, D.C., PSUs were included as certainties by design: Honolulu, Hawaii in order to reduce the variability of including Native Hawaiian students, and Washington, D.C., as it is essentially a part of the larger MD-VA-DC
Washington area PSU. A total of 29 PSUs were classified as certainties in the 2011 frame. The table below provides a listing of the certainty PSUs by census region.
Metropolitan statistical area (MSA) definition for certainty PSUs, by primary sampling unit (PSU): 2011
Primary sampling unit (PSU)   Number of counties   Number of youths   Metropolitan statistical area (MSA)        State
Grand total                          203               30,407,927
Total Northeast                       40                6,753,238
1--1                                   5                  903,391    Boston-Cambridge-Quincy                     MA
1--2                                  13                1,518,504    New York-Northern New Jersey-Long Island    NJ-PA
1--3                                  10                2,915,787    New York-Northern New Jersey-Long Island    NY
1--4                                   7                  481,884    Pittsburgh                                  PA
1--5                                   5                  933,672    Philadelphia-Camden-Wilmington              PA
Total Midwest                         40                5,113,204
2--1                                   9                2,231,409    Chicago-Naperville-Joliet                   IL
2--2                                   6                1,089,901    Detroit-Warren-Livonia                      MI
2--3                                  11                  782,054    Minneapolis-St. Paul-Bloomington            MN
2--4                                   9                  519,876    St. Louis                                   MO
2--5                                   5                  489,964    Cleveland-Elyria-Mentor                     OH
Total South                           93                9,089,075
3--1                                   1                  112,016    Washington-Arlington-Alexandria             DC
3--2                                   4                  592,372    Tampa-St. Petersburg-Clearwater             FL
3--3                                   3                1,204,361    Miami-Fort Lauderdale-Miami Beach           FL
3--4                                  28                1,443,448    Atlanta-Sandy Springs-Marietta              GA
3--5                                   5                  546,557    Washington-Arlington-Alexandria             MD
3--6                                   7                  629,656    Baltimore-Towson                            MD
3--7                                   8                  561,126    San Antonio                                 TX
3--8                                  10                1,615,543    Houston-Sugar Land-Baytown                  TX
3--9                                  12                1,755,255    Dallas-Fort Worth-Arlington                 TX
3--10                                 15                  628,741    Washington-Arlington-Alexandria             VA
Total West                            30                9,452,410
4--1                                   2                1,168,524    Phoenix-Mesa-Scottsdale                     AZ
4--2                                   4                  519,855    Sacramento--Arden-Arcade--Roseville         CA
4--3                                   1                  744,470    San Diego-Carlsbad-San Marcos               CA
4--4                                   5                  923,680    San Francisco-Oakland-Fremont               CA
4--5                                   2                1,174,107    Riverside-San Bernardino-Ontario            CA
4--6                                   2                3,314,817    Los Angeles-Long Beach-Santa Ana            CA
4--7                                  10                  637,268    Denver-Aurora                               CO
4--8                                   1                  199,268    Honolulu                                    HI
4--9                                   3                  770,421    Seattle-Tacoma-Bellevue                     WA

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
1 The U.S. Census Bureau's Population Estimates Program (http://www.Census.gov/popest/) publishes yearly total resident population estimates by demographics such as age, sex, race, and Hispanic origin for the nation, states, and counties.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_psu_certainty.aspx


Primary Sampling Unit Frame: Stratification for the 2011 Assessments
The noncertainty primary sampling unit (PSU) strata were initially determined by census region and metropolitan status (metropolitan or non-metropolitan)—a total of eight primary strata. Measures
of size were defined for each of these strata, determined by the relative share of the eventual PSU sample (the sample size is designed to be proportional to the number of youths). The PSU stratum
measure of size then is the total number of youths in the stratum. The table below presents these counts for each of the eight primary strata. The relative share of the PSU sample size for each stratum
is the number of youths in the stratum divided by the total number of youths, multiplied by 76 (the total number of noncertainty PSU strata for the writing computer-based assessment [WCBA] and mathematics computer-based study [MCBS]). The results of these calculations are given in the table below.
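As a worked example using the entries in the table below, the Northeast metropolitan stratum contains 4,531,012 of the 43,533,921 noncertainty youths, so its share of the 76 strata is (4,531,012 / 43,533,921) × 76 ≈ 7.9, which was set to 8 PSU strata.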


Noncertainty primary sampling unit (PSU) frame size statistics, by primary stratum: 2011
Primary stratum                       PSUs   Counties       Youths     Target number of PSU strata   Set number of PSU strata   Youths per PSU stratum
Total noncertainty PSUs              1,040      2,937    43,533,921              76.0                          76                     572,815
Northeast Region Metropolitan           46         83     4,531,012               7.9                           8                     566,377
Northeast Region Non-Metropolitan       50         94     1,098,293               1.9                           2                     549,147
Midwest Region Metropolitan            100        246     7,458,159              13.0                          12                     621,513
Midwest Region Non-Metropolitan        249        769     3,505,128               6.1                           6                     584,188
South Region Metropolitan              153        458    13,269,054              23.2                          22                     603,139
South Region Non-Metropolitan          269        872     5,190,589               9.1                          10                     519,059
West Region Metropolitan                71        101     6,803,588              11.9                          12                     566,966
West Region Non-Metropolitan           102        314     1,678,098               2.9                           4                     419,525

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
The division of the primary strata into the final strata was done on a stratum-by-stratum basis. The criteria for good PSU strata were: (1) the strata should have as equal measures of size as possible, which reduces sampling variance, and
(2) the strata should be as heterogeneous in measured achievement as possible (i.e., there should be strata with low mean achievement, strata with mid-level mean achievement, and strata with high mean achievement). This second
criterion also ultimately reduces the variance of the assessment estimates since the final PSU sample will be balanced in terms of assessment means.
PSU assessment means from the current year cannot be used, as assessments are only conducted after sampling is completed. Information is available about PSU sociodemographic characteristics in advance, however. An analysis was
done within each primary stratum to find sociodemographic variables that were good predictors of the NAEP 2000 mathematics and science assessment results. Using these sociodemographic variables to define strata should increase the
chance of having efficient strata definitions. The page Stepwise Regression Analysis Results for PSU Stratification describes this analysis for each primary stratum.
The final step in stratification was to define the desired number of strata using the selected stratifiers while constructing strata that were as close to equal size as possible (with size defined by number of youths). The objective was to
establish strata that had a high between-stratum variance for the stratifiers (i.e., which "spread out" the stratifiers as much as possible). These strata are given on the page Final PSU Strata.

http://nces.ed.gov/nationsreportcard/tdw/sample_design/2011/2011_samp_natl_psu_framestrat.aspx


Final Primary Sampling Unit (PSU) Strata for the 2011 Assessments
The strata were defined using the selected stratifiers from the stepwise regression analysis (see Stepwise Regression Analysis Results for PSU Stratification). The cutoffs were selected so that roughly equal measures of size were
represented by each stratum.
Stratification for Northeast metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011
Stratum   Primary stratifier                         Secondary stratifier             PSUs   Measure of size
Total     †                                          †                                  46         4,531,012
1         Percent child poverty <=10.1%              Percent Black <=15.9%               8           572,628
2         Percent child poverty <=10.1%              15.9%< Percent Black <=27.7%        2           533,970
3         10.1%< Percent child poverty <=12.5%       Percent Black <=14.9%               7           578,198
4         10.1%< Percent child poverty <=12.5%       14.9%< Percent Black <=38.2%        4           624,044
5         12.5%< Percent child poverty <=13.4%       †                                   5           543,994
6         13.4%< Percent child poverty <=15.1%       †                                   7           574,735
7         15.1%< Percent child poverty <=17%         †                                   5           516,879
8         17%< Percent child poverty <=20.7%         †                                   8           586,564
Mean      †                                          †                                   †           566,377

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for Northeast non-metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011
Stratum   Primary stratifier                         PSUs   Measure of size
Total     †                                            50         1,098,293
1         Percent child poverty <=15.7%                22           544,762
2         15.7%< Percent child poverty <=22.8%         28           553,531
Mean      †                                             †           549,147

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for Midwest metropolitan noncertainty primary sampling units (PSUs), national assessments, by stratum: 2011
Stratum   Primary stratifier                         Secondary stratifier                    Tertiary stratifier        PSUs   Measure of size
Total     †                                          †                                       †                           100         7,458,159
1         Percent child poverty <=12.5%              †                                       Pct Asian <=1.1%             17           623,684
2         Percent child poverty <=12.5%              †                                       1.1%< Pct Asian <=1.4%        4           668,332
3         Percent child poverty <=12.5%              †                                       1.4%< Pct Asian <=2.4%        8           598,589
4         Percent child poverty <=12.5%              †                                       2.4%< Pct Asian <=2.6%        3           706,518
5         Percent child poverty <=12.5%              †                                       2.6%< Pct Asian <=3.4%        7           602,499
6         Percent child poverty <=12.5%              †                                       3.4%< Pct Asian <=10.3%      13           618,908
7         12.5%< Percent child poverty <=12.9%       †                                       †                             6           619,810
8         12.9%< Percent child poverty <=14.5%       †                                       Pct Asian <=1.3%              7           623,599
9         12.9%< Percent child poverty <=14.5%       †                                       1.3%< Pct Asian <=2.7%        7           602,409
10        14.5%< Percent child poverty <=27.6%       Med HH Income <=$38,291                 †                            17           603,071
11        14.5%< Percent child poverty <=27.6%       $38,291< Med HH Income <=$46,460        Pct Asian <=0.9%              7           569,605
12        14.5%< Percent child poverty <=27.6%       $38,291< Med HH Income <=$46,460        0.9%< Pct Asian <=3.1%        4           621,135
Mean      †                                          †                                       †                             †           621,513

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for Midwest non-metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011
Stratum   Primary stratifier                         Secondary stratifier                   Tertiary stratifier       PSUs   Measure of size
Total     †                                          †                                      †                          249         3,505,128
1         Percent child poverty <=15.7%              Percent college grd <=12.5%            †                           41           577,244
2         Percent child poverty <=15.7%              12.5%< Percent college grd <=36.0%     Pct BHI <=4.2%              42           577,144
3         Percent child poverty <=15.7%              12.5%< Percent college grd <=36.0%     4.2%< Pct BHI <=8.5%        42           582,552
4         Percent child poverty <=15.7%              12.5%< Percent college grd <=36.0%     8.5%< Pct BHI <=41.4%       38           591,909
5         15.7%< Percent child poverty <=45.5%       Percent college grd <=13.2%            †                           41           584,830
6         15.7%< Percent child poverty <=45.5%       13.2%< Percent college grd <=23.0%     †                           45           591,449
Mean      †                                          †                                      †                            †           584,188

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.

Stratification for South metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011

Stratum | Primary stratifier | Secondary stratifier | Tertiary stratifier | PSUs | Measure of size
Total | † | † | † | 153 | 13,269,054
1 | Percent child poverty <=22.7% | Percent Black <=39.7% | Percent Hispanic <=1.7% | 17 | 596,069
2 | Percent child poverty <=22.7% | Percent Black <=39.7% | 1.7%< Percent Hispanic <=2.6% | 11 | 630,434
3 | Percent child poverty <=22.7% | Percent Black <=39.7% | 2.6%< Percent Hispanic <=2.7% | 3 | 578,311
4 | Percent child poverty <=22.7% | Percent Black <=39.7% | 2.7%< Percent Hispanic <=3.0% | 8 | 571,617
5 | Percent child poverty <=22.7% | Percent Black <=39.7% | 3.0%< Percent Hispanic <=3.5% | 8 | 663,600
6 | Percent child poverty <=22.7% | Percent Black <=39.7% | 3.5%< Percent Hispanic <=4.2% | 5 | 655,658
7 | Percent child poverty <=22.7% | Percent Black <=39.7% | 4.2%< Percent Hispanic <=4.8% | 4 | 626,966
8 | Percent child poverty <=22.7% | Percent Black <=39.7% | 4.8%< Percent Hispanic <=5.5% | 6 | 518,112
9 | Percent child poverty <=22.7% | Percent Black <=39.7% | 5.5%< Percent Hispanic <=7.3% | 8 | 589,272
10 | Percent child poverty <=22.7% | Percent Black <=39.7% | 7.3%< Percent Hispanic <=8.5% | 5 | 531,498
11 | Percent child poverty <=22.7% | Percent Black <=39.7% | 8.5%< Percent Hispanic <=9.1% | 3 | 701,272
12 | Percent child poverty <=22.7% | Percent Black <=39.7% | 9.1%< Percent Hispanic <=11.2% | 7 | 700,785
13 | Percent child poverty <=22.7% | Percent Black <=39.7% | 11.2%< Percent Hispanic <=14.6% | 8 | 571,531
14 | Percent child poverty <=22.7% | Percent Black <=39.7% | 14.6%< Percent Hispanic <=21.1% | 6 | 548,529
15 | Percent child poverty <=22.7% | Percent Black <=39.7% | 21.1%< Percent Hispanic <=30.8% | 5 | 691,494
16 | Percent child poverty <=22.7% | Percent Black <=39.7% | 30.8%< Percent Hispanic <=51.2% | 7 | 710,494
17 | Percent child poverty <=22.7% | 39.7%< Percent Black <=56.6% | Percent Hispanic <=2.7% | 6 | 547,257
18 | Percent child poverty <=22.7% | 39.7%< Percent Black <=56.6% | 2.7%< Percent Hispanic <=7.8% | 6 | 655,387
19 | 22.7%< Percent child poverty <=24.3% | † | † | 11 | 498,732
20 | 24.3%< Percent child poverty <=45.7% | Percent Black <=2.0% | † | 4 | 712,425
21 | 24.3%< Percent child poverty <=45.7% | 2.0%< Percent Black <=60.8% | Percent Hispanic <=3.8% | 10 | 494,418
22 | 24.3%< Percent child poverty <=45.7% | 2.0%< Percent Black <=60.8% | 3.8%< Percent Hispanic <=64.1% | 5 | 475,193
Mean | † | † | † | † | 603,139

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for South non-metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011

Stratum | Primary stratifier | Secondary stratifier | Tertiary stratifier | PSUs | Measure of size
Total | † | † | † | 269 | 5,190,589
1 | Percent Black <=31.2% | Median HH income <=$36,049 | Percent Asian <=0.2% | 32 | 511,172
2 | Percent Black <=31.2% | Median HH income <=$36,049 | 0.2%< Percent Asian <0.3% | 29 | 515,004
3 | Percent Black <=31.2% | Median HH income <=$36,049 | Percent Asian =0.3% | 32 | 520,184
4 | Percent Black <=31.2% | Median HH income <=$36,049 | 0.3%< Percent Asian <=0.4% | 29 | 510,176
5 | Percent Black <=31.2% | Median HH income <=$36,049 | 0.4%< Percent Asian <=0.7% | 26 | 538,940
6 | Percent Black <=31.2% | Median HH income <=$36,049 | 0.7%< Percent Asian <=3.0% | 26 | 542,335
7 | Percent Black <=31.2% | $36,049< Median HH income <=$54,721 | † | 21 | 551,780
8 | 31.2%< Percent Black <=51.3% | Median HH income <=$29,555 | † | 25 | 513,918
9 | 31.2%< Percent Black <=51.3% | $29,555< Median HH income <=$44,421 | † | 22 | 499,209
10 | 51.3%< Percent Black <=79.4% | † | † | 27 | 487,871
Mean | † | † | † | † | 519,059

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for West metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011

Stratum | Primary stratifier | Secondary stratifier | PSUs | Measure of size
Total | † | † | 71 | 6,803,588
1 | Percent high school graduates <=70% | Percent college graduates <=13.5% | 7 | 635,482
2 | Percent high school graduates <=70% | 13.5%< Percent college graduates <=22.5% | 4 | 508,977
3 | 70%< Percent high school graduates <=78.9% | † | 6 | 513,746
4 | 78.9%< Percent high school graduates <=79.6% | † | 3 | 656,056
5 | 79.6%< Percent high school graduates <=88.3% | Percent college graduates <=21.8% | 14 | 559,117
6 | 79.6%< Percent high school graduates <=88.3% | 21.8%< Percent college graduates <=25.5% | 8 | 587,808
7 | 79.6%< Percent high school graduates <=88.3% | 25.5%< Percent college graduates <=26.9% | 4 | 525,628
8 | 79.6%< Percent high school graduates <=88.3% | 26.9%< Percent college graduates <=27.8% | 3 | 583,877
9 | 79.6%< Percent high school graduates <=88.3% | 27.8%< Percent college graduates <=30.3% | 3 | 557,408
10 | 79.6%< Percent high school graduates <=88.3% | 30.3%< Percent college graduates <=39.5% | 3 | 527,923
11 | 88.3%< Percent high school graduates <=90.1% | † | 8 | 556,141
12 | 90.1%< Percent high school graduates <=93.1% | † | 8 | 591,425
Mean | † | † | † | 555,966

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.
Stratification for West non-metropolitan noncertainty primary sampling units (PSUs), national assessment, by stratum: 2011

Stratum | Primary stratifier | Secondary stratifier | Tertiary stratifier | PSUs | Measure of size
Total | † | † |  |  | 
1 | Percent college graduates <=21% | Percent child poverty <=17.3% |  |  | 
2 | Percent college graduates <=21% | 17.3% |  |  | 
3 | Percent college graduates <=21% |  |  |  | 
4 | 21%< Percent college graduates <=42.6% |  |  |  | 
Mean | † |  |  |  | 

† Not applicable.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 National Assessment.