Part B (Sys Cl) Supporting Statement


National Assessment of Educational Progress (NAEP) 2011-13 System Clearance

OMB: 1850-0790


NATIONAL ASSESSMENT OF

EDUCATIONAL PROGRESS






SUPPORTING STATEMENT PART B



SYSTEM CLEARANCE PROPOSAL



NAEP SURVEYS


FOR THE YEARS 2011-2013







March 17, 2010


Table of Contents

A. JUSTIFICATION

1a. Circumstances making the collection of information necessary

1b. Overview of NAEP Assessments

1c. Overview of 2011-2013 NAEP Assessments

1d. Rationale for OMB System Clearance

2. How, by whom, and for what purpose the data will be used

3. Use of techniques to reduce burden

4. Efforts to identify duplication

5. Burden on small businesses or other small entities

6. Consequences of collecting information less frequently

7. Consistency with 5 C.F.R. 1320.5

8. Consultations outside the agency

9. Payments or gifts to respondents

10. Assurance of confidentiality

11. Sensitive questions

12. Estimation of respondent reporting burden (2011-2013)

13. Cost to respondents

14. Estimates of cost to the federal government

15. Reasons for changes in burden (from last System Clearance submittal)

16. Time schedule for data collection

17. Approval for not displaying OMB approval expiration date

18. Exceptions to certification statement





Appendix A Statute Authorizing NAEP


Appendix B Lists of Committee Members

NAEP Background Variable Standing Committee

NAEP Design and Analysis Committee

NAEP Validity Study Panel

NAEP National Indian Education Study (NIES) Technical Review Panel

NAEP Writing Standing Committee


Appendix C Example of Sample Design Document (2009 Assessment)

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


  1. Potential respondent universe.

The possible universe of student respondents is estimated to be 12 million fourth-, eighth-, and twelfth-grade students attending approximately 154,000 public and private elementary and secondary schools. NAEP test booklets are administered in selected public and private schools to a sample of students in grades 4, 8, and 12.


Students are selected according to student sampling procedures, with these possible exclusions:

  • The student is identified as an English Language Learner (ELL) and cannot participate in NAEP, even with the accommodations that NAEP allows.

  • The student is identified as having a disability that prevents participation in NAEP, even with the accommodations that NAEP allows, and has an Individualized Education Program (IEP) or an equivalent classification, such as a Section 504 plan.


NAEP relies upon the professional judgment of school administrators as to how students or schools should be classified.


2. Procedures for collection of information.

Sampling

The sampling information in this system clearance package is an overview of the sampling techniques and criteria used by the current sampling contractor for NAEP assessments. Each specific assessment will involve a different sample, depending on the subjects assessed and the sample sizes required. Planned sample sizes are based on the need to obtain representative samples on which to report achievement information.

For sampling frames, NAEP uses the most current versions of the NCES Common Core of Data (CCD) file for public schools and the Private School Universe Survey (PSS) file for private schools. In addition, because the CCD file is necessarily somewhat out of date by the time of the assessment, NAEP also conducts a survey of NAEP State Coordinators to check for additional schools in a sample of public school districts.


Design Features

As in the past, NAEP samples are based on multistage designs. The state assessment designs consist of stratified samples of public schools, with stratification derived from type of location (urban/suburban/large town/small town/rural), proportion of minority enrollment, school-level achievement on statewide testing programs, and a measure of household income in the school's zip code area. The second stage of sampling is the selection of students within each selected school: an equal-probability systematic sample from among all students in the appropriate grade.

For the national samples, a three-stage design is used. The first stage is the selection of primary sampling units (PSUs), which are individual counties or groups of contiguous counties. The second stage is the selection of schools within PSUs, and the third stage is the selection of students within schools. The following are characteristic features of NAEP sampling designs:

  • for state-level assessments, approximately equal sample sizes (2,500-3,000 assessed students) of public school students in each state, for each subject, at grades 4 and 8

  • sample sizes of approximately 10,000-12,000 for national-only operational subjects

  • in each school, some students to be assessed in each subject

  • lists of schools obtained from the NCES Common Core of Data (CCD)

  • schools grouped into strata

  • schools assigned a measure of size

  • sample selected with probability proportional to the measure of size

  • school stratification based on characteristics such as: type of location, minority enrollment, and school achievement



(Refer to Appendix C for an example of the sampling procedures contained in the 2009 assessment.)
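To illustrate the selection mechanics described above, the sketch below implements systematic sampling with probability proportional to a measure of size down an ordered (stratified) school frame. The school IDs and enrollment-based size measures are hypothetical, not drawn from any actual NAEP frame.

```python
import random

random.seed(1)  # reproducible for this illustration only

# Hypothetical school frame: (school_id, measure_of_size).
frame = [(f"school_{i}", random.randint(20, 600)) for i in range(200)]

def pps_systematic(frame, n):
    """Select n schools with probability proportional to their measure
    of size, via systematic sampling down the ordered frame."""
    total = sum(size for _, size in frame)
    interval = total / n                    # sampling interval
    next_hit = random.uniform(0, interval)  # random start
    selections, cum = [], 0.0
    for unit, size in frame:
        cum += size
        while cum > next_hit:               # a very large school can be hit twice
            selections.append(unit)
            next_hit += interval
    return selections

sample = pps_systematic(frame, 30)  # exactly 30 selections
```

Ordering the frame by the stratification variables before the systematic pass is what gives the implicit stratification; a school's chance of selection is proportional to its assigned measure of size.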



3. Methods to maximize response rates and deal with issues of nonresponse.


NAEP attempts to minimize nonresponse among both students and schools. Chief state school officers and LEA superintendents are provided with lists of the sampled schools in their jurisdictions, and their cooperation is requested; for each assessment, the chief state school officer and the NAEP State Coordinator are asked to solicit the cooperation of the selected schools. NCES provides letters to states and districts in support of the operational assessments and field tests. Because states and school districts receiving Title I funds are required under No Child Left Behind to participate in the NAEP reading and mathematics assessments at grades 4 and 8, NAEP response rates for these assessments have improved.

In previous NAEP administrations, 95 percent or more of sampled students have responded; between 85 and 90 percent of school administrators have responded; and among teachers, 85 percent have provided background information and 75 percent have provided class-period-specific information.
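As an illustration of how these component rates combine, an overall participation rate can be approximated as the product of the stage-level response rates (NAEP reports weighted versions of these rates; the unweighted figures below are taken from the ranges cited above):

```python
# Overall participation as the product of stage-level response rates.
# Figures are illustrative midpoints of the ranges cited in the text.
school_rate = 0.875   # midpoint of the 85-90 percent school range
student_rate = 0.95   # "95 percent or more" of sampled students

overall_rate = school_rate * student_rate
print(f"overall participation = {overall_rate:.1%}")  # prints "overall participation = 83.1%"
```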


Not all of the students in the sample will respond. Some will be unavailable during the sample time period because of absenteeism or other reasons. If a student decides not to complete an exercise, the action will be recorded, but no steps will be taken to obtain an answer.


4. Tests of procedures or methods to be undertaken.

The 2011-2013 operational, probe, and pilot main assessments will be administered in the January-March window of each year. The long-term trend assessment is administered in the fall (October-December) of 2011 for age 13 students, the winter (January-March) of 2012 for age 9 students, and the spring (March-May) of 2012 for age 17 students. Refer to item 16 for specific schedules. Each student in a session receives one booklet from a spiraled set, so that the different booklets are distributed systematically across the students in the session. In general, the operational and pilot test materials and some special studies will be administered in the same sessions. For some subjects (e.g., U.S. history and geography), separate sessions will be required because of different timing requirements or booklet layouts. The 2011-2013 administration procedures will be similar to those of previous NAEP operational assessments.
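A minimal sketch of the booklet spiraling mentioned above: booklets are handed out in a fixed repeating order down the session roster, so each booklet type lands on an even cross-section of students. The booklet labels and roster names are hypothetical.

```python
from itertools import cycle

def spiral_assign(students, booklets):
    """Hand out booklets in repeating (spiraled) order down the roster."""
    order = cycle(booklets)
    return {student: next(order) for student in students}

# Hypothetical session roster and booklet labels.
roster = [f"student_{i}" for i in range(10)]
assignments = spiral_assign(roster, ["B1", "B2", "B3", "B4"])
# student_0 gets B1, student_1 gets B2, ..., student_4 gets B1 again
```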




5. Consultants on statistical aspects of the design.

ETS, Fulcrum, Westat, and NCES staffs have collaborated on the statistical aspects of the design. The primary persons responsible are:

Nancy Caldwell

Vice-President, Westat


Jay Campbell

Executive Director, NAEP Project Director, ETS


Peggy Carr

Associate Commissioner, NCES


Patricia Etienne

Program Director, Assessment Coordination, NCES


Arnold Goldstein

Statistician, Assessment Reporting and Dissemination, NCES


Steve Gorman

Program Director, Design, Analysis, Reporting, NCES


Paul Harder

NAEP Project Director, Fulcrum


Andrew Kolstad

Senior Technical Advisor, NCES


Andreas Oranje

Senior Psychometrician and Psychometric Manager, ETS

Keith F. Rust

Vice-President, Westat


Holly Spurlock

Program Director, Assessment Operations, NCES



In addition, the NAEP Design and Analysis Committee (DAC) and the NAEP Validity Studies (NVS) panel members (see Appendix B) have also contributed to NAEP designs on an on-going basis.

