
National Center for Education Statistics

National Assessment of Educational Progress






SUPPORTING STATEMENT PART B



Request for System Clearance for

NAEP Assessments for 2014-2016


OMB# 1850-0790 v.36















November 1, 2012

(revised January 17, 2013)

Table of Contents

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Potential respondent universe

2. Procedures for collection of information

3. Methods to maximize response rates and deal with issues of nonresponse

4. Tests of procedures or methods to be undertaken

5. Consultants on statistical aspects of the design





Appendix A Statute Authorizing NAEP



Appendix B External Consultants

NAEP Design and Analysis Committee

NAEP Validity Studies Panel

NAEP Quality Assurance Technical Panel

NAEP Socio-Economic Status Panel

NAEP National Indian Education Study Technical Review Panel

NAEP Civics Standing Committee

NAEP Economics Standing Committee

NAEP Geography Standing Committee

NAEP Mathematics Standing Committee

NAEP Reading Standing Committee

NAEP Science Standing Committee

NAEP Technology and Engineering Literacy Standing Committee

NAEP U.S. History Standing Committee

NAEP Writing Standing Committee



Appendix C Example of Sample Design Document (2013 Assessment)

Appendix D Sample Parental Notification Letter

Appendix E Sample School Coordinator Responsibilities Brochure

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

  1. Potential respondent universe

The possible universe of student respondents for main NAEP is estimated to be 12 million students at grades 4, 8, and 12, attending approximately 154,000 public and private elementary and secondary schools. NAEP test booklets are administered to a sample of fourth-, eighth-, and twelfth-grade students in selected public and private schools.

Respondents are selected according to student sampling procedures with these possible exclusions:

  • The student is identified as an English language learner (ELL) and cannot participate in NAEP, even with the accommodations that NAEP allows.

  • The student is identified as having a disability that prevents participation in NAEP, even with the accommodations that NAEP allows, and has an Individualized Education Plan (IEP) or an equivalent classification, such as a Section 504 plan.

NAEP relies upon the professional judgment of school administrators as to how students or schools should be classified.

  2. Procedures for collection of information

Sampling

The sampling information in this system clearance package is an overview of the techniques and criteria that the current sampling contractor uses to draw NAEP samples. Each assessment will involve a different sample, depending on the number of students and subjects in that particular assessment. Planned sample sizes are based on the need to obtain representative samples on which to report achievement information.

To assess a representative sample of students, the process begins by identifying a sample of schools with student populations that reflect the varying demographics of a specific jurisdiction, be it the nation, a state, or a district. Within each selected school, students are chosen at random to participate and each has the same chance of being chosen, regardless of socio-economic status, disability, status as an English language learner, or any other factors. Selecting schools that are representative helps ensure that the student sample is representative.

The following steps are used to select a sample of public schools and students in a year when NAEP reports state-level results.

  1. Generate a sampling frame.
    For sampling frames, NAEP uses the most current versions of the NCES Common Core of Data (CCD; public schools) and Private School Universe Survey (PSS; private schools) files. Because the CCD file cannot reflect the most recent changes to schools by the time of the assessment, NAEP also surveys NAEP State Coordinators to check for additional schools in a sample of public school districts.

  2. Classify schools into groups.
    Using the list, schools are classified into groups, first by type of location and then by the racial/ethnic composition within those locations. This step takes into account the distribution of schools and students across rural, suburban, and urban areas in each state, and the diversity of the student population at each school. This ensures that NAEP assesses students in schools that represent different demographic groups.

  3. Within each group, order schools by a measure related to student achievement.
    Within each group, schools are sorted by student achievement to ensure that schools with varying levels of student achievement are represented in the NAEP sample. This is done using school-level results on state achievement tests. In a few cases where recent achievement data are not available, schools are sorted by the median household income for the area where the school is located.

  4. Assign a probability of selection to all schools.
    All schools on the list are assigned a probability of being selected for participation. A school’s probability is based on the size of its enrollment in relation to the size of the state’s student population at the selected grade level. Larger schools have a greater probability of selection because they represent a larger proportion of the state’s student population. This step ensures that students from schools of different sizes are appropriately represented in the sample.

  5. Select the school sample.
    After schools are assigned a probability of selection and arranged on an ordered list based on the characteristics described in the previous steps, the sample is selected using systematic sampling: for example, every third school on the ordered list might be selected, starting with the second school. A simplified sketch of this selection follows the list of steps.

  6. Confirm school eligibility.
    The list of schools selected to participate is sent to each state to verify that each school is eligible. A school may be ineligible, for example, because it has closed or because its grade span has changed so that it no longer includes a grade assessed by NAEP.

  7. Randomly select students to participate in NAEP.
    School principals are notified that their schools have been chosen to participate in NAEP. Within each sampled school, students are randomly selected with equal probability from a complete list of students at the grade to be assessed.
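The school-selection mechanics in steps 4 and 5 can be illustrated with a short sketch. The Python fragment below is a simplified illustration only, not the sampling contractor's production code; the school names, enrollments, target sample size, and random seed are hypothetical. It treats each school's grade enrollment as its measure of size and draws a systematic sample, with a random start, so that the probability of selection is proportional to that measure.

import random

def systematic_pps_sample(schools, n_sample, seed=None):
    """Systematic selection of schools with probability proportional
    to a measure of size (here, grade enrollment), from a frame that
    is already sorted by stratum and by the achievement measure."""
    rng = random.Random(seed)
    total_measure = sum(s["grade_enrollment"] for s in schools)
    interval = total_measure / n_sample      # skip interval on the measure-of-size scale
    next_hit = rng.random() * interval       # random start within the first interval
    selected, cumulative = [], 0.0
    for school in schools:
        cumulative += school["grade_enrollment"]
        # a school is selected whenever a selection point falls inside its size range;
        # larger schools span wider ranges and so are selected with higher probability
        while next_hit <= cumulative and len(selected) < n_sample:
            selected.append(school)
            next_hit += interval
    return selected

# Hypothetical ordered frame of ten schools; four are drawn.
frame = [{"name": f"School {i}", "grade_enrollment": e}
         for i, e in enumerate([60, 250, 90, 400, 120, 80, 310, 150, 70, 200], start=1)]
for s in systematic_pps_sample(frame, n_sample=4, seed=1):
    print(s["name"], s["grade_enrollment"])

In an actual design, a school whose measure of size exceeds the sampling interval would be selected with certainty and handled separately; the sketch omits that and the other adjustments detailed in the sample design memorandum (appendix C).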

NAEP yearly sample design plans are not available until the spring of the year preceding the assessments. For this system clearance submittal, we have included the 2013 sample design memorandum (refer to appendix C), which details the specific sampling procedures for the 2013 assessments. The sample design specifics for each of the 2014-2016 assessment years will be included in that year’s Wave I and Wave II clearance packages.

Design Features

As in the past, NAEP samples are based on multistage designs. The state assessment designs consist of stratified samples of public schools, where the stratification is derived from type of location (urban/suburban/large town/small town/rural), proportion of minority enrollment, school-level achievement on statewide testing programs, and a measure of household income in the zip code area of the school. The second stage of sampling is the selection of students from within each selected school. This is an equal-probability systematic sample from among all students in the appropriate grade.

For the national samples, a three-stage design is used. The first stage is the selection of primary sampling units (PSUs), which are individual counties or groups of contiguous counties. The second stage is the selection of schools within PSUs, and the third stage is the selection of students within schools.
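The final, within-school stage can be sketched in the same spirit. The fragment below is only an illustration, with a hypothetical roster, target of five students, and seed; it implements an equal-probability systematic draw from the grade roster, so every listed student has the same chance of being chosen.

import random

def systematic_student_sample(roster, n_students, seed=None):
    """Equal-probability systematic sample of students from a school's
    grade roster (the within-school stage of the design)."""
    rng = random.Random(seed)
    if len(roster) <= n_students:
        return list(roster)                  # small school: assess every listed student
    interval = len(roster) / n_students      # fractional skip interval
    start = rng.random() * interval          # random start; every student has probability n/N
    positions = [int(start + k * interval) for k in range(n_students)]
    return [roster[p] for p in positions]

# Hypothetical roster of 23 eighth graders; five are drawn.
roster = [f"Student {i:02d}" for i in range(1, 24)]
print(systematic_student_sample(roster, n_students=5, seed=7))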

The following are characteristic features of NAEP sampling designs:

  • for state-level assessments, approximately equal sample sizes (2,500–3,000 assessed students) from each participating state’s public schools, for each subject;

  • for district-level assessments, sample sizes of approximately 1,000–2,500 from each participating district’s public schools, for each subject;

  • sample sizes of approximately 6,000–12,000 for national-only operational subjects, depending on the size of the item pool;

  • in each school, some students to be assessed in each subject;

  • lists of schools obtained from the NCES CCD and PSS files;

  • schools grouped into strata;

  • schools assigned a measure of size;

  • sample selected with probability proportional to the measure of size; and

  • school stratification based on characteristics such as: type of location, enrollment by race/ethnicity, and school achievement.

Refer to appendix C for an example of the sampling procedures performed for the 2013 assessment.

  3. Methods to maximize response rates and deal with issues of nonresponse

NAEP attempts to minimize nonresponse of both students and schools. Chief state school officers and Local Education Agency (LEA) superintendents are provided with lists of the sampled schools in their jurisdictions, and their cooperation is requested. For each assessment, schools within each state will be selected, and the chief state school officer and the NAEP State Coordinator will be asked to solicit the schools’ cooperation. NCES will provide letters to states and districts in support of the operational and pilot tests. Because states and school districts receiving Title I funds are required under the National Assessment of Educational Progress Act to participate in the NAEP reading and mathematics assessments at grades 4 and 8, NAEP response rates for these assessments have improved.

Not all of the students in the sample will respond. Some will be unavailable during the assessment period because of absenteeism or other reasons. If a student decides not to participate, that decision is recorded, but no steps are taken to obtain participation. Response rates, in percentages, from a recent NAEP assessment are shown below:

Response Rates (percent)      Grade 4   Grade 8   Grade 12
Student Response Rates           94        93        87
School Response Rates
  Public Schools                100       100        96
  Private Schools                74        74        67

Note: The public school response rates for grades 4 and 8 rounded to 100, but were actually slightly lower (i.e., 99.8 percent).

  4. Tests of procedures or methods to be undertaken

The 2014–2016 administration procedures will be similar to those of previous NAEP assessments. If the final design for an assessment requires new procedures or methods, they will be tested in a special study or a pilot test prior to the operational assessment.

  5. Consultants on statistical aspects of the design

ETS, Fulcrum, Westat, and NCES staff have collaborated on the statistical aspects of the design. The persons primarily responsible are the following:

Jay Campbell

NAEP Executive Director, ETS


Peggy Carr

Associate Commissioner, NCES


Patricia Etienne

Program Director, Design, Analysis, and Reporting and Assessment Coordination, NCES


Scott Ferguson

NAEP Project Director, Fulcrum


Arnold Goldstein

Statistician, Assessment Reporting and Dissemination, NCES


Andrew Kolstad

Senior Technical Advisor, NCES


Andreas Oranje

NAEP Psychometric Director, ETS

Keith F. Rust

Vice-President, Westat


Holly Spurlock

Program Director, Assessment Operations, NCES


Dianne Walsh

Vice-President, Westat



In addition, the NAEP Design and Analysis Committee and the NAEP Validity Studies panel members (see appendix B) have also contributed to NAEP designs on an on-going basis.


