NAEP 07 Part A 5-22-07


National Assessment of Educational Progress 2008-2010 System Clearance

OMB: 1850-0790






NATIONAL ASSESSMENT OF

EDUCATIONAL PROGRESS





VOLUME I



SUPPORTING STATEMENT


SYSTEM CLEARANCE PROPOSAL



NAEP OPERATIONAL AND PILOT SURVEYS


FOR THE YEARS 2008-2010






December 18, 2006

(Revised 5-21-2007)



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Potential Respondent Universe

2. Procedures for Collection of Information

3. Methods to Maximize Response Rates and Deal with Issues of Nonresponse

4. Tests of Procedures or Methods to Be Undertaken

5. Consultants on Statistical Aspects of the Design


Appendix A (Statute Authorizing NAEP)

Appendix B Lists of Committee Members

NAEP Background Questionnaire Development Expert Panels

SES Standing Committee

NAEP Contextual Variables Expert Panel

NAEP Design and Analysis Committee

NAEP Validity Study Panel Members

Appendix C Example of Sample Design Document (2007 Assessment)

A. JUSTIFICATION

1a. Circumstances making the collection of information necessary.

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, economics, and the arts. In the current legislation that reauthorized NAEP (the No Child Left Behind Act of 2001, Public Law 107-110), Congress again mandated the collection of national education survey data through a national assessment program. The No Child Left Behind Act of 2001 requires the assessment to collect data on specified student groups, including race/ethnicity, gender, socio-economic status, disability, and limited English proficiency. It requires fair and accurate presentation of achievement data and permits the collection of background or descriptive information that is related to academic achievement and aids in fair reporting of results. The intent of the law is to provide representative-sample data on student achievement for the nation, the states, and subpopulations of students, and to monitor progress over time.


NAEP is administered by the National Center for Education Statistics (NCES) in the Institute of Education Sciences of the U.S. Department of Education. The National Assessment Governing Board sets policy for NAEP and determines the content framework for each assessment. As a result of the No Child Left Behind Act, the Governing Board is responsible for selecting and approving all of NAEP’s non-cognitive or background questions, as well as the cognitive items. These surveys are currently conducted by an alliance of corporations under contract with the U.S. Department of Education. The national surveys contain two kinds of questions: “cognitive” or test questions that measure student knowledge of academic subjects, and “background” or survey questions that gather information on student demographics as well as classroom instructional practices.

The federal authority mandating NAEP is found in Section 411 of Public Law 107-110. This law states:

"…(b)(1) -- The purpose of this section is to provide, in a timely manner, a fair and accurate measurement of student academic achievement and reporting trends in such achievement in reading, mathematics, and other subject matter as specified in this section.

"(2) MEASUREMENT AND REPORTING.-- The Commissioner, in carrying out the measurement and reporting described in paragraph (1), shall --

"(A) use a random sampling process which is consistent with relevant, widely accepted professional assessment standards and that produces data that are representative on a national and regional basis;

"(B) conduct a national assessment and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in public and private elementary schools and secondary schools at least once every 2 years, in grades 4 and 8 in reading and mathematics;

"(C) conduct a national assessment and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in public and private schools in reading and mathematics in grade 12 in regularly scheduled intervals, but at least as often as such assessments were conducted prior to the date of enactment of the No Child Left Behind Act of 2001;

"(D) to the extent time and resources allow, and after the requirements described in subparagraph (B) are implemented and the requirements described in subparagraph (C) are met, conduct additional national assessments and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in grades 4, 8, and 12 in public and private elementary schools and secondary schools in regularly scheduled intervals in additional subject matter, including writing, science, history, geography, civics, economics, foreign languages, and arts, and the trend assessment described in subparagraph (F);

"(E) conduct the reading and mathematics assessments described in subparagraph (B) in the same year, and every other year thereafter, to provide for 1 year in which no such assessments are conducted in between each administration of such assessments;

"(F) continue to conduct the trend assessment of academic achievement at ages 9, 13, and 17 for the purpose of maintaining data on long-term trends in reading and mathematics;

"(G) include information on special groups, including, whenever feasible, information collected, cross tabulated, compared, and reported by race, ethnicity, socioeconomic status, gender, disability and limited English proficiency; and

"(H) ensure that achievement data are made available on a timely basis following official reporting, in a manner that facilitates further analysis and that includes trend lines.


Discussion: NCLB requires that NAEP collect data and report state-level results for reading and mathematics achievement at grades 4 and 8 for "...special groups, including, whenever feasible, information collected, cross tabulated, compared, and reported by race, ethnicity, socioeconomic status, gender, disability and limited English proficiency." To provide reliable and valid subgroup data at the state level for these mandated assessments, NAEP selects state/jurisdiction samples of approximately 3,150 students per grade/subject. (State samples for states containing one or more districts participating in the NAEP Trial Urban District Assessment will have larger student samples.) This sample size allows for accurate reporting for the subgroups specified in NCLB. National results are calculated from the aggregate state samples. For the state assessments not mandated by NCLB (i.e., science and writing at grades 4 and 8), state/jurisdiction samples are also approximately 3,150 students per grade/subject. In states that do not volunteer for the non-NCLB subjects, a small sample of students is assessed to allow the state to contribute to the national results.
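The relationship between the roughly 3,150-student state samples and subgroup reporting precision can be illustrated with a rough standard-error calculation. This is a minimal sketch; the score standard deviation and design effect used below are hypothetical placeholders, not actual NAEP parameters.

```python
import math

# Illustrative only: approximate standard error of a mean score for a
# clustered survey sample, inflated by a design effect. The numeric
# values below are hypothetical, not NAEP's operational parameters.
def approx_se(sd, n, design_effect=1.0):
    """Approximate SE of a mean for a sample of n students whose scores
    have standard deviation sd, under the given survey design effect."""
    return sd * math.sqrt(design_effect / n)

# A full state sample versus a subgroup that is 10% of that sample:
state_se = approx_se(sd=35.0, n=3150, design_effect=2.0)
subgroup_se = approx_se(sd=35.0, n=315, design_effect=2.0)
```

The sketch shows the basic trade-off: a subgroup one-tenth the size of the state sample has a standard error about three times larger, which is why subgroup reporting drives the overall sample size upward.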

NCLB requires that grades 4 and 8 mathematics and reading be assessed at the state level by NAEP every two years. Governing Board policy requires that grades 4 and 8 science and writing be assessed at the state level every four years. The schedule of state assessments had reading, mathematics, and writing assessed in 2007, and will have reading, mathematics, and science assessed in 2009. In state assessment years, there are two NCLB-mandated subjects and one non-mandated subject (for writing, there is no grade 4 assessment). Therefore, the 2009 assessment covered under the requested 2008-2010 OMB Systems Clearance would involve three state assessments at grades 4 and 8, resulting in large sample sizes. In 2008 and 2010, which include only national assessments and pilot/field tests, NAEP does not estimate student achievement for subgroups at the state level. Therefore, the sample sizes for these years are considerably smaller.

A copy of the current statute is included in Appendix A.

1b. Overview of NAEP 2008-2010 Assessments

The following provides a broad overview of the 2008-2010 NAEP assessments. Please note that the National Assessment Governing Board determines NAEP policy and the assessment schedule, and future Governing Board decisions may result in changes to some aspects of an assessment (e.g., which subjects are assessed in which years). However, the overall methodology and assessment process will remain constant.


NAEP consists of two assessment programs: the NAEP Long Term Trend (LTT) assessment and the main NAEP assessment. The LTT assessments are designed to measure changes in academic performance over time. They are administered nationally every four years (but are not reported at the state or district level) and report student performance at ages 9, 13, and 17 in mathematics and reading. Measuring trends of student achievement, or change over time, requires the precise replication of past procedures.


The main NAEP assessment reports current achievement levels and short-term trends in student achievement at grades 4, 8, and 12. These assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. The subject-area frameworks evolve to match instructional practices. These assessments are distinguished from the NAEP LTT assessments, which, although also national, use frameworks and questions that remain constant over time.


An additional feature of the NAEP program is the High School Transcript Study (HSTS). Beginning in the summer and continuing through the fall of the year, NCES collects high school transcripts from students who graduated from selected schools across the nation. The study is conducted with a nationally representative sample of both public and nonpublic schools that have been selected to participate in NAEP. A representative sample of graduating seniors within each school is selected.


Assessment types listed in this document are described as follows:


Operational


An assessment whose results will be used for NAEP reporting purposes

Pre-calibration

An assessment that represents a second stage of item development (after pilot testing). The psychometric performance of items is determined so that statistical judgments can be made prior to the operational assessments


Pilot

An assessment that contains a pretest of items to obtain information regarding clarity, difficulty levels, timing, and feasibility of items



The 2008 data collection consists of:

  • national Long Term Trend assessments in reading and mathematics at ages 9, 13, and 17;

  • national assessments in the arts (music and visual arts) at grade 8;

  • pre-calibration assessments for 2009 reading and 2009 mathematics at grades 4 and 8; and

  • pilot assessments for 2009 reading and 2009 mathematics at grade 12; pilot assessments for 2009 science at grades 4, 8, and 12; pilot assessments for 2012 Long Term Trend at ages 9, 13, and 17; and a High School Transcript Study (HSTS) pilot study in 2008.


The 2009 data collection consists of:

  • national (grades 4, 8, and 12) and state (grades 4 and 8) assessments in reading and mathematics;

  • national (grades 4, 8, and 12) and state (grades 4 and 8) assessments in science;

  • pilot assessments for 2011 reading and 2011 mathematics at grades 4 and 8; for 2010 U.S. history, 2010 civics, and 2010 geography at grades 4, 8, and 12; and

  • High School Transcript Study at grade 12.


The 2010 data collection consists of:

  • national assessments in U.S. history, civics, and geography at grades 4, 8, and 12;

  • pre-calibration assessments for 2011 reading and 2011 mathematics at grades 4 and 8;

  • pilot assessments for 2011 writing at grades 4, 8, and 12, and for 2011 reading and 2011 mathematics at grade 12.



In all assessment years, questionnaires are generally administered to students at grades 4, 8, and 12; to teachers at grades 4 and 8; and to school administrators at grades 4, 8, and 12. SD/ELL (Students with Disabilities/ English Language Learner) questionnaires will be completed by teachers or administrators of students identified as learning disabled or as English Language Learners.

Survey sampling information needs to be gathered from schools for all NAEP assessments. This sampling information can be gathered manually or electronically at the school, district, or state level. Electronic filing (E-Filing) is encouraged at the state and district level, but if done at the school level some burden will be incurred by school personnel.

Sizes and categories of respondent burden are provided in Section 12 of this document.

Special Studies

Special small-scale studies are conducted in accordance with the developmental needs of NAEP. For example, over the past system clearance period, the following special studies were or will be conducted:

  • science bridge study in 2005;

  • National Indian Education Study in 2005 and 2007;

  • Sensitivity to Instruction study involving 8th grade mathematics in 2006;

  • reading Word Location Study in 4th and 8th grades in 2006;

  • meaning vocabulary study for grades 4 and 8 in 2007; and

  • pilot of an Extended Student Background Questionnaire (ESBQ) at grades 4 and 8 to gather socio-economic status data in 2007.


Over the course of the 2008-2010 NAEP assessments, special studies will be conducted as directed by NCES. The specifics of future studies will be included in the subsequent yearly OMB submittals. Though an exhaustive list cannot be provided given the real-time nature of these projects, the special studies currently under consideration include, but are not limited to:

  • extension of the 2007 Extended Student Background Questionnaire pilot to grade 12;

  • reading bridge study at grades 4 and 8 to link results from the previous frameworks to the new 2009 frameworks;

  • mathematics bridge study at grade 12 to link results from the previous frameworks to the new 2009 frameworks;

  • science bridge study at grades 4, 8, and 12 to link results from the previous frameworks to the new 2009 frameworks;

  • reading bridge study at grade 12 to link results from the previous frameworks to the new 2009 frameworks;

  • National Indian Education Study (NIES) for grades 4 and 8 in 2009;

  • pilot study of an on-line NAEP writing assessment;

  • special studies of student preparedness at grade 12; and

  • NAEP validity studies.

The instruments (i.e., assessments and questionnaires) and the student sample sizes needed to address the study questions are not known at this time; therefore, estimates of student, teacher, and school burden are not provided in the 2008-2010 OMB Systems Clearance application. Details for selected special studies will be included in subsequent annual clearance requests.

1c. Rationale for OMB System Clearance

NCES is requesting system clearance for the NAEP assessments to be administered in the 2008-2010 timeframe, similar to the system clearance approval that was granted for the 2005-2007 NAEP administrations (OMB 1850-0790). Under this clearance OMB would waive the first 60-day Federal Register Notice Period. Each submission under this system clearance would be announced with a 30-day Federal Register Notice at the time it is sent to OMB. OMB would agree to provide comment within 15 days after the end of the 30-day Notice period.

The essential reason for the system clearance request is that it enables NAEP to meet its large and complex assessment schedules and deliverables through a more efficient clearance process. The initial NAEP System Clearance request was submitted for the 2005-2007 NAEP assessments. The purpose of requesting a System Clearance process was to provide a more efficient clearance process for an increasing volume of OMB submittals. Since the passage of the No Child Left Behind (NCLB) legislation in 2001, the number of NAEP assessments, the volume of participants, and the complexity of the development and production processes have all increased dramatically. In the pre-NCLB years, operational subjects were administered in one year and pilot subjects were administered in the next year; operational, pilot, and field test (pre-calibration for reading and mathematics) assessments were not all administered simultaneously. Thus, instead of questions for a few subjects being submitted, NAEP now needs to submit many more sets of questions each year (in 2008, approximately 30 sets of questions will be submitted as part of the OMB submittals).

In addition, under NCLB requirements there is the factor of short turnaround times for production and reporting. NAEP is required to report mathematics and reading scores at the state and national levels every other year, and results must be reported within six months of the administration. Due to the increased volumes, complex development, and six-month reporting guidelines, there are shorter time frames for data analysis, question reviews, and assembly of questions for submittal to the Governing Board and OMB, and abbreviated windows for printing and distribution. The system clearance process shortens the timeframe for OMB approvals from 120 days to 45 days, which is critical for meeting printing and distribution deadlines.

 

The System Clearance process specifically covers the operational assessments approved by the National Assessment Governing Board, pilot and field tests that support operational assessments, and special studies that are intended to inform future decisions regarding NAEP. The system clearance is not sought for assessments or studies that are only tangentially related to or associated with NAEP. For example, a separate OMB submittal was done for the 2005 National Indian Education Study (NIES), which was sponsored by the Office of Indian Education and conducted by NCES. The NIES became an integrated component of the NAEP assessment in 2007 and background questions related to the study were submitted under the previous OMB Systems Clearance. It is expected that NIES will be repeated in 2009 and background questions for the 2009 study will be submitted under the systems clearance requested for the 2008-2010 period.

Because of NCLB requirements, every state participates in the reading and mathematics assessments at grades 4 and 8, and most states participate in the state-level writing and science assessments. These state-level assessments require that samples be selected to permit reporting of results for each state as well as for the nation, resulting in a much larger sample size than would otherwise be necessary. In addition, because of the complex sampling plan used by NAEP, upwards of 800 distinct test booklets may need to be printed for a single assessment year.


Volume I of this system clearance package contains supporting information for the operational, pre-calibration, and pilot assessments to be given in 2008-2010. Volume II contains those background questionnaires that are available at this time for 2008-2010 assessments. Additional questionnaires will be submitted as they are developed over the course of the system clearance period. Clearance of three years duration (beginning May 1, 2007) is requested for the background materials that will be used in the 2008-2010 operational, pre-calibration, and pilot assessments. It is understood that under system clearance each additional set of items would be cleared up to the end of the administration of the 2010 assessment (May, 2010). At that time, NCES would submit a new request for system clearance to cover future NAEP administrations.


2. How, by whom, and for what purpose the data will be used.

Under this request for system clearance, NCES is asking for approval of the various background questionnaires (e.g., student, teacher, and school) that will be part of the NAEP 2008-2010 national and state assessments, as well as of any special studies (e.g., NIES, HSTS) that will be conducted in the 2008-2010 timeframe. The 2008 through 2010 assessments will report student achievement in reading and mathematics (long-term trend) and the arts in 2008; reading, mathematics, and science in 2009; and civics, U.S. history, and geography in 2010. The schedule for subjects to be assessed has been established by the Governing Board; however, it is subject to modification. Therefore, NCES is requesting some leeway with regard to the specific subject assessments that will be administered while holding the methodology and burden constant. NCES is also requesting approval for the data collection instruments/procedures required for the HSTS in 2009 and for student sampling for long-term trend and main NAEP. As is described in later sections, neither the HSTS nor student sampling imposes any student-level burden. Both require input from school- or district-level staff.1


Given that the purpose of NAEP is to gather data on the achievement of students in the subject areas assessed for use in monitoring education progress, and because of the program's increasing visibility, it is incumbent on the program to develop the most reliable and valid instruments possible. To do so, NAEP employs four strategies to develop items:

  1. Small-scale special study pilot testing of new materials and test administration techniques;

  2. Pilot testing items to determine which items best measure the constructs under consideration;

  3. Pre-calibration field testing of operational assessments to accommodate the mandated six-month reporting (for grades 4 and 8 reading and mathematics only); and

  4. Full-scale operational assessments.


All four types of activities may simultaneously be in the field during any given data collection effort. Each is described in more detail below:

  1. Small-scale pilot testing. As the NAEP program has evolved, definitions of what will be tested have also evolved. For example, under the revised Reading Framework for 2009, the Governing Board is proposing the inclusion of a new measure of vocabulary. In 2007, items designed to measure vocabulary in this way were included in the pilot test instruments. In addition, NAEP has conducted small-scale studies using technology as the platform for test administration, including the on-line studies of assessments in mathematics, writing, and technology-rich environments. Both types of pilot testing will continue and are likely to increase over time as innovative techniques are implemented within the assessment field at large. In both situations described above, NAEP would first pilot test the method of item construction or the administration platform, use the data to refine the technique, and then conduct a comparative study of how it works relative to current methodology prior to any large-scale implementation.


  2. Pilot testing items to determine which items best measure the constructs under consideration. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational national and state assessments. Pilot testing is a cost-effective means for revising and selecting items prior to national data collection: although fewer students participate, items are administered to a nationally representative sample, and data are gathered about performance across the full spectrum of student achievement. In general, two items are tested for each one that will appear in the operational assessment. Pilot testing of cognitive and background items is carried out in all subject areas. In subjects other than reading and mathematics at grades 4 and 8, pilot testing is done one year prior to the operational assessment. In reading and mathematics at grades 4 and 8, pilot testing is done two years prior to the operational assessment so that pre-calibration field testing may occur (see item 3 below for further explanation).


  3. Pre-calibration field testing of operational assessments to accommodate the mandated six-month reporting. Under the current legislation, reading and mathematics assessments at grades 4 and 8 are to be administered every two years, and the results are to be made public within six months of the end of data collection. To streamline the data analysis process, NAEP has shifted many of its scaling operations so that they occur between pilot testing and the operational assessment. Pre-calibration field testing is performed on instruments that are fixed after pilot testing and administered one year prior to the operational assessment. These fixed instruments are “pre-calibrated” and remain the same for the operational assessment.


  4. Full-scale operational assessments. "Operational" NAEP administrations, as opposed to pilot or field test administrations, collect the data used to publicly report on educational achievement.


Development of background questionnaires follows a pattern similar to that described above for cognitive items, although fewer questions are pilot tested. Guidance for what is asked in background questionnaires is set by the Governing Board. NCES develops the questionnaires. The Governing Board then reviews and approves the questionnaires prior to pilot testing, and then again after pilot testing, using the data to select the final set of background questions. The questions are designed (a) to provide information for disaggregating data according to categories specified in the legislation, (b) to provide contextual information that is subject-specific (e.g., reading, mathematics) and has a known relationship to achievement, and (c) to provide policy-relevant information specified by the Governing Board.


Background questions that seem problematic can be dropped or modified before the operational administration. One typical modification is reducing the number of response categories offered; this modification is employed when field test data indicate very low response percentages in adjacent response categories. As the Governing Board introduces new subject-specific frameworks (e.g., reading for 2009), new questionnaire development will begin with a need for pilot testing and then approval of final questionnaire instruments.


In addition to the overarching goal of NAEP to provide data about student achievement at both the national and state levels, NAEP also provides specially targeted data on an as-needed basis. At times, this may mean only that a special analysis of the existing database is necessary. At other times, the targeting may include the addition of a short add-on questionnaire aimed at specified groups. For example, in 2005 and 2007 additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These instruments most closely resemble a fast-response survey, although they are administered only in tandem with an existing NAEP data collection. They are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.


Finally, the NAEP program may be required to design and administer additional studies to fulfill its mission to report on “progress” or trends in student achievement over time. If revised subject-area frameworks are adopted that change the scope of the content being measured, it may be necessary for the program to conduct “bridge studies” to link the assessment results from the previous framework to results from the revised framework. Bridge studies administer previous cognitive items to a portion of the sample to allow for statistical linking of the reporting scales across the two frameworks. While bridge studies may increase the necessary school and student sample sizes in a given year, no additional background questionnaires would be required.
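The statistical linking performed in a bridge study can be illustrated with a simple mean/sigma (linear) linking sketch. This is a hypothetical stand-in for NAEP's actual linking procedures, which are considerably more complex; the function name and scores below are invented for illustration.

```python
import statistics

# Illustrative mean/sigma linking: map scores obtained under a revised
# framework onto the old reporting metric so trend comparisons remain
# possible. Not NAEP's operational linking method.
def linear_link(new_scores, old_mean, old_sd):
    """Rescale new-framework scores so their mean and standard deviation
    match the old reporting scale's mean and standard deviation."""
    m = statistics.mean(new_scores)
    s = statistics.stdev(new_scores)
    return [old_mean + old_sd * (x - m) / s for x in new_scores]
```

In a real bridge study, the link is estimated from the subsample that takes the previous framework's items, so that the two reporting scales can be placed on a common metric.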


Subsequent clearance packages submitted for the specific assessment years will describe the scope of the assessment, including additional studies, school and student samples, estimated burden, and background questionnaires requiring clearance. The basic methodology and procedures described under this system clearance will be employed for the 2008, 2009, and 2010 NAEP assessments.


3. Use of improved techniques to reduce burden.

For all operational and pilot tests, NAEP will continue to take advantage of proven, modern measurement techniques, which greatly enhance the power and value of the NAEP data collected. Through the use of a partial balanced incomplete block (BIB) spiraling variant of matrix sampling, a variety of analyses are feasible because the data are not booklet-bound. Covariances are computed among all questions in a subject area, so that:

  • composites of questions can be appraised empirically for coherence and construct validity;

  • the dimensional structure of each subject area can be determined analytically as reflected in student performance consistencies;

  • item response theory (IRT) scaling can be applied to unidimensional sets of exercises regardless of what booklet they appear in;

  • IRT scales can be developed having common meaning across exercises, population subgroups, age levels, and time periods;

  • powerful trend analyses can be undertaken by means of these common scales;

  • performance scales can be correlated with background, attitudinal, and program variables to address a rich variety of educational and policy issues; and

  • public-use electronic files can be made much more useful because secondary analyses are also not booklet-bound.


No student takes the complete assessment, and given that NAEP results are reported for groups of students, BIB spiraling allows the program to administer a lengthy overall assessment without overburdening any one student.
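The BIB spiraling idea described above can be sketched in miniature. The 7-block, 7-booklet layout below is a textbook balanced incomplete block design (every pair of blocks shares exactly one booklet), not NAEP's actual, much larger booklet map.

```python
from itertools import combinations

# Hypothetical miniature BIB design: 7 item blocks arranged into 7
# booklets of 3 blocks each, so every pair of blocks appears together
# in exactly one booklet. Full pair coverage is what makes covariances
# among all items estimable even though no student sees the whole pool.
BOOKLETS = [
    (0, 1, 2), (0, 3, 4), (0, 5, 6),
    (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5),
]

def pair_coverage(booklets, n_blocks=7):
    """Count how often each pair of item blocks shares a booklet."""
    counts = {pair: 0 for pair in combinations(range(n_blocks), 2)}
    for booklet in booklets:
        for pair in combinations(sorted(booklet), 2):
            counts[pair] += 1
    return counts

def spiral(booklets, n_students):
    """Hand out booklets in strict rotation ('spiraling') so each
    classroom receives a balanced mix of forms."""
    return [booklets[i % len(booklets)] for i in range(n_students)]
```

Because every block pair co-occurs in some booklet, the between-block covariances needed for IRT scaling and composite analyses can be computed from the pooled data rather than from any single form.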

The various background questionnaires (student, teacher, and school) are constructed to minimize respondent burden using the information gathered from pilot testing. Based on pilot test data, questions that do not yield useful information or that produce redundant information can be eliminated. Therefore, the questions administered operationally will be a subset of the items pilot-tested.

To further reduce respondent burden, NAEP has a core set of background variables (required for reporting purposes) that are collected in every assessment, but other non-required topics are rotated across assessments, and within and across subject areas, thus limiting the number of questions asked of any individual student.

Another technique for improving the response process is to offer the teacher and school questionnaires through an online electronic completion system. In the most recent NAEP assessment year, approximately 8 percent of teacher and school questionnaires were completed online.


4. Efforts to identify duplication.

The proposed background questions do not exist in the same format or combination in the Department of Education or elsewhere. NAEP is the only comprehensive cross-sectional survey performed periodically or regularly on a large-scale basis whose background data can be related to extensive achievement data. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress. None of the major non-federal studies of education achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.


5. Burden on small businesses or other small entities.

Private schools are included in the sample proportional to their representation in the population.

6. Consequences of collecting information less frequently.

Under NCLB legislation, Congress has mandated the on-going collection of NAEP data.

Failure to collect the 2008-2010 operational, pilot, and precalibration field test data on the current schedule would affect the quality and schedule of the NAEP assessments and would result in assessments that do not fulfill the mandate of the legislation.


7. Consistency with 5 C.F.R. 1320.5.

No special circumstances are involved. This data collection observes all requirements of 5 C.F.R. 1320.5.


8. Consultations outside the agency.

No comments were received from the public during the Federal Register notice period.

In addition to the contractors responsible for the development of the background questions for the NAEP assessments, the program involves many consultants as well as reviews by specialists. These consultants and special reviewers represent expertise with students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socioeconomic levels. Staff and consultants have reviewed all exercises for bias and sensitivity issues, grade appropriateness, and congruence with testing at the state level. When appropriate, members of the subject-area standing committees review the background questions with regard to appropriateness within existing curricular and instructional practices.


Pilot testing of the background questions helps establish the validity and utility of the data from the viewpoint of respondent groups. Students provide essential feedback through their responses about the clarity, reasonableness, appropriateness, and vocabulary levels of the questions.


The following lists of outside personnel are provided in Appendix B:

  • Background Questionnaire Expert Panel for Reading

  • Background Questionnaire Expert Panel for Mathematics

  • Background Questionnaire Expert Panel for Science

  • Background Questionnaire Expert Panel for SES Variables

  • Background Questionnaire Expert Panel for Contextual Variables

  • NAEP Validity Study Panel

Note: Committee panels may be named, if needed, for upcoming assessments including, but not limited to, Arts, Civics, U.S. History, and Geography.


9. Payments or Gifts to Respondents.

There will be no gifts or payments to respondents except that schools that participate in the geocoding component of the SES pilot study will receive the geocoding software. Schools participating in the High School Transcript Study are paid the established fee for providing student transcripts. Given that the study pays schools the prevailing rate to perform a standard service, estimates of school-level burden for that function are not included in this volume.

10. Assurance of Confidentiality.

NAEP has policies and procedures that ensure privacy, security, and confidentiality. Specifically for the NAEP project, these policies and procedures comply with the Privacy Act of 1974 and its amendments, NCES Confidentiality Procedures, and the Department of Education ADP Security Manual. A NAEP Security and Confidentiality Plan has been developed, and NCES ensures that all current contractor policies and procedures comply with all NAEP security and confidentiality requirements.


All NAEP-contractor staff with access to confidential NAEP information are required to sign an “affidavit of nondisclosure” affirming, under severe penalty for unlawful action, that they will protect NAEP information from unauthorized access or disclosure. The affidavits are in keeping with the NCES Standard for Maintaining Confidentiality (IV-01-92). NAEP contractors are required to maintain and provide NCES with a list of all staff who have contact with NAEP secure information, along with certification that all such staff have taken an appropriate oath of confidentiality. The Data Collection contractor must also comply with Directive OM: 5-101, which requires that all staff with access to data protected by the Privacy Act and/or access to Department of Education systems who will work on the contract for 30 days or more go through the Department of Education Employee Security Clearance Procedures.

An important privacy and confidentiality issue is to protect the identity of assessed students, teachers, and schools. To assure this protection, NAEP has established security procedures that closely control access to identifying information. For example, the reporting contractor will not be privy to files that contain information linking assessment instruments to individuals or schools. These files will be produced and used by authorized sampling and data collection contractor staff for necessary conduct of the assessment. After the assessment takes place, the link files are not removed from the sampled school and remain secure within it. The school is asked to retain this link information for a specified period of time before destruction.


Furthermore, to ensure the anonymity of respondents, NAEP staff will use the following precautions.

  • Data files will not identify individual respondents.

  • No personally identifiable information, either by schools or respondents, will be gathered or released by third parties. No permanent files of names or addresses of respondents will be maintained.

  • Student participation is voluntary.


After the components of NAEP are completed in a school, neither student- nor teacher-reported data are retrievable by personal identifiers. We emphasize that confidentiality is completely assured for individual schools and for individual students, teachers, and principals.


For the HSTS component of NAEP, student transcripts are collected from schools for sampled students, and school staff members complete a School Information Form that provides general information about class periods, credits, graduation requirements, and other aspects of school policy. To maintain the privacy of student and school identities, students’ names are removed from the transcripts and questionnaires and replaced with a unique identification number. NCES ensures that the data collected from schools and students can be used for statistical purposes only.

Following is a discussion of the legislation covering our confidentiality requirements:

Confidentiality of Individuals and Institutions

NCES assures participating individuals and institutions that any data collected conforms to the IES standards for protecting the privacy of individuals as required by Section 183 of the Education Sciences Reform Act of 2002 (P.L. 107-279):

". . . all collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and center of the Institute, shall conform with the requirements of section 552a of Title 5, United States Code [which protects the confidentiality rights of individual respondents with regard to the data collected, reported, and published under this title]." (Section 183)

Under the Education Sciences Reform Act of 2002 (ESRA 2002), all individually identifiable information about students, their families, and their schools shall remain confidential. To this end, this law requires that no person may:

  • Use any individually identifiable information furnished under the provisions of this section for any purpose other than statistical purposes for which it is supplied, except in the case of terrorism;

  • Make any publication whereby the data furnished by any particular person under this section can be identified; or

  • Permit anyone other than the individuals authorized by the Commissioner to examine individual reports.

Further, individually identifiable information is immune from legal process, and shall not, without the consent of the individual concerned, be admitted as evidence or used for any purpose in any action, suit, or other judicial or administrative proceeding, except in the case of terrorism. Employees, including temporary employees, or other persons who have sworn to observe the limitations imposed by this law, who knowingly publish or communicate any individually identifiable information will be subject to fines of up to $250,000 or up to 5 years in prison, or both (Class E felony).

Protection and Security of Data

The confidentiality of individually identifiable information contained in project documents, data, and other information supplied by the National Center for Education Statistics, U.S. Department of Education (NCES/ED) or information acquired in the course of performance under this contract where the information was furnished under the provisions of Section 183 of the Education Sciences Reform Act of 2002 is a material aspect of the contract and must be maintained, secured, and protected from disclosure as provided in Section 183. The Privacy Act of 1974 (5 U.S.C. 552a) also applies.

The contractor shall enforce strict procedures for assuring confidentiality. These procedures shall apply to all phases of the project and should include but not be limited to: information used to locate study respondents, data collection in the field, coding and editing phases of data prior to machine processing, safeguarding response documents, and maintenance of any respondent follow-up information. The contractor shall physically separate the identifying data required for any follow-up from data required for research purposes.

The contractor shall be familiar with and comply with:

1.      The Privacy Act of 1974 (5 U.S.C. 552a),

2.      E-Government Act of 2002 (P.L. 107-347, Title V, Subtitle A, "Confidential Information Protection"),

3.      Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. 1232g; 34 CFR Part 99),

4.      The Freedom of Information Act (5 U.S.C. 552),

5.      Hawkins-Stafford Elementary and Secondary School Improvement Amendments of 1988 (P.L. 100-297),

6.      Section C of the Education Sciences Reform Act of 2002 (P.L. 107-279),

7.      Title IV of the Improving America's Schools Act of 1994 (P.L. 103-382),

8.      USA Patriot Act of 2001 (P.L. 107-56),

9.      Office of Management and Budget (OMB) Federal Statistical Confidentiality Order of 1997, and

10.  Any new legislation that impacts the data collection through this contract.

The contractor shall maintain the confidentiality of all documents, data, and other information supplied by NCES/ED or acquired in the course of performance of this contract, except for any documents or other information specifically designated as non-confidential by NCES/ED. The contractor shall take such measures as are necessary to maintain the required security and protection of confidential information (see section Data Security Plan). The contractor shall be prepared to develop compliance procedures in cooperation with the COR concurrently with the development of the study design.

1.1.2        Data Security Plan
1.1.2.1  Definition of Personally Identifiable Information and Direct Identifiers

The term "personally identifiable information" (hereinafter PII) means any information about an individual maintained by an agency, including, but not limited to, education, financial transactions, medical history, and criminal or employment history and information which can be used to distinguish or trace an individual's identity, such as their name, social security number, date and place of birth, mother's maiden name, biometric records, etc., including any other personal information which is linked or linkable to an individual. The term "direct identifier" denotes any single datum that alone is deemed sufficient to yield a high risk of disclosure if released, such as social security numbers, full names, or biometric records.

1.1.2.2  Development of Data Security Plan

The contractor shall present a detailed security plan that expands upon what was presented in the proposal to the COR for approval. In order to ensure the anonymity of individual respondents, the contractor must draft and execute this plan in compliance with all relevant laws, regulations, and policies governing the security of data, particularly PII. These include (but are not necessarily limited to):

1)            Section 183 of the Education Sciences Reform Act of 2002, which states:

(a) IN GENERAL.-All collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).

(b) STUDENT INFORMATION.-The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h);

2)            The Privacy Act of 1974 (P.L. 93-579, U.S.C. 552a);

3)            The E-Government Act of 2002 (H.R. 2458/S. 803);

4)            The relevant sections of Title 45 of the Code of Federal Regulations entitled "Protection of Human Subjects" (45 CFR 46);

5)            The U. S. Department of Education Handbook for Information Assurance Security Policy (June 2005);

6)            Office of Management and Budget Memorandum M-06-19, "Reporting Incidents Involving Personally Identifiable Information and Incorporating the Cost of Security in Agency Information Technology Investments";

The contractor shall enforce strict procedures for ensuring confidentiality. These procedures shall apply to all phases of the project.

11. Sensitive questions.

The National Assessment of Educational Progress emphasizes voluntary respondent participation, assures confidentiality of individual responses, and avoids asking for information that might be considered sensitive or offensive. Sensitive or offensive questions are prohibited by NCLB, and the Governing Board reviews all items for bias and sensitivity. Throughout the item development process, the staff works with consultants and internal reviewers to identify and eliminate potential bias in the questions.

12. Estimation of respondent reporting burden (2008-2010)

Design plans, estimated volumes, and burden estimates for the 2008-2010 assessments are contained in Exhibits at the end of this section. The average number of person hours estimated to complete the 2008-2010 background questionnaires for each respondent grade/age is summarized below:

2008 Grade 4 - PreCalibration (Math & Reading); Science Pilot; Reading Bridge

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 30,700 students will participate at grade 4, and the response burden for the grade 4 2008 administration is approximately 7,675 hours for students.


Teachers

Teachers at grade 4 will fill out Teacher Questionnaires. It is estimated that approximately 1,842 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 608 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 614 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 4 2008 survey. The total response burden will be about 307 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,075 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 355 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 245 schools at grade 4. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 245 hours for grade 4 school personnel.
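Every burden estimate in this section follows the same arithmetic: total hours equals the number of respondents times the per-respondent hours, rounded to whole hours, with 20 minutes treated as .33 hours. As an illustrative check, a hypothetical helper (not part of the NAEP program; the figures are copied from the grade 4 2008 estimates above):

```python
# Hypothetical helper illustrating the burden arithmetic used throughout
# this section: total hours = respondents x per-respondent hours, rounded.
def burden_hours(respondents, hours_each):
    return round(respondents * hours_each)

# Grade 4, 2008 figures from the text; 20 minutes is taken as .33 hours,
# matching the rounding convention used in this document.
grade4_2008 = {
    "students": burden_hours(30_700, 0.25),  # ~7,675 hours
    "teachers": burden_hours(1_842, 0.33),   # ~608 hours
    "schools":  burden_hours(614, 0.50),     # ~307 hours
    "sd_ell":   burden_hours(1_075, 0.33),   # ~355 hours
    "e_filing": burden_hours(245, 1.00),     # ~245 hours
}
```

The same calculation underlies the grade 8, grade 12, long-term trend, 2009, and 2010 estimates that follow.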


2008 Grade 8 - PreCalibration (Math & Reading); Science Pilot; Reading Bridge; Arts

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 39,200 students will participate at grade 8, and the response burden for the grade 8 2008 administration is approximately 9,800 hours for students.


Teachers

Teachers at grade 8 will fill out Teacher Questionnaires. It is estimated that approximately 2,352 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 776 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 784 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 8 2008 survey. The total response burden will be about 392 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,372 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 453 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 313 schools at grade 8. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 313 hours for school personnel.



2008 Grade 12 - Reading Pilot; Math Pilot; Science Pilot; Math Bridge

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 23,500 students will participate at grade 12, and the response burden for the grade 12 2008 administration is approximately 5,875 hours for students.


Teachers

There are no teacher questionnaires at grade 12.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 470 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 12 2008 survey. The total response burden will be about 235 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 823 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the grade 12 survey. The total response burden will be about 271 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 186 schools at grade 12. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 186 hours for school personnel.


2008 Age 9 - Long Term Trend (Mathematics and Reading)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 16,500 students will participate at age 9, and the response burden for the age 9 2008 administration is approximately 4,125 hours for students.


Teacher and School

There are no teacher or school surveys for the Long Term Trend assessments.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 578 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 191 hours for teacher/school personnel.


2008 Age 13 - Long Term Trend (Mathematics and Reading)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 16,500 students will participate at age 13, and the response burden for the age 13 2008 administration is approximately 4,125 hours for students.


Teacher and School

There are no teacher or school surveys for the Long Term Trend assessments.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 578 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 191 hours for teacher/school personnel.


2008 Age 17 - Long Term Trend (Mathematics and Reading)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 16,500 students will participate at age 17, and the response burden for the age 17 2008 administration is approximately 4,125 hours for students.


Teacher and School

There are no teacher or school surveys for the Long Term Trend assessments.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 578 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 191 hours for teacher/school personnel.


2009 Grade 4 - Operational (Math, Reading, Science); Pilot (Math, Reading, U.S. History, Civics, Geography)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 590,200 students will participate at grade 4, and the response burden for the grade 4 2009 administration is approximately 147,550 hours for students.


Teachers

Teachers at grade 4 will fill out Teacher Questionnaires. It is estimated that approximately 35,412 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 11,686 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 11,804 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 4 2009 survey. The total response burden will be about 5,902 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 20,657 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 6,817 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 4,475 schools at grade 4. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 4,475 hours for grade 4 school personnel.


2009 Grade 8 - Operational (Math, Reading, Science); Pilot (Math, Reading, U.S. History, Civics, Geography)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 513,900 students will participate at grade 8, and the response burden for the grade 8 2009 administration is approximately 128,475 hours for students.


Teachers

Teachers at grade 8 will fill out Teacher Questionnaires. It is estimated that approximately 30,834 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 10,175 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 10,278 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 8 2009 survey. The total response burden will be about 5,139 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 17,987 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 5,936 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 3,949 schools at grade 8. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 3,949 hours for school personnel.


2009 Grade 12 - Operational (Math, Reading, Science); Pilot (U.S. History, Civics, Geography)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 50,500 students will participate at grade 12, and the response burden for the grade 12 2009 administration is approximately 12,625 hours for students.


Teachers

There are no teacher questionnaires at grade 12.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 1,010 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 12 2009 survey. The total response burden will be about 505 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,768 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the grade 12 survey. The total response burden will be about 583 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 351 schools at grade 12. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 351 hours for school personnel.


HSTS

School personnel from schools sampled for the HSTS will complete a School Information Form that provides general information about class periods, credits, graduation requirements, and other aspects of school policy. School personnel will also be asked to provide a course catalog (a list of courses offered) for each of four consecutive years, from 2005-06 through 2008-09. It is estimated that 750-775 schools will be sampled for the 2009 HSTS.

The individual response burden for school personnel for the HSTS is 30 minutes (.50 hours) for the School Information Form and providing additional materials for the study. The total response burden for the HSTS will be about 388 hours for school personnel.


2010 Grade 4 - Operational (U.S. History, Civics, Geography); PreCalibration (Math & Reading); Pilot (Writing)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 30,500 students will participate at grade 4, and the response burden for the grade 4 2010 administration is approximately 7,625 hours for students.


Teachers

Teachers at grade 4 will fill out Teacher Questionnaires. It is estimated that approximately 1,830 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 604 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 610 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 4 2010 survey. The total response burden will be about 305 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,068 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 352 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 232 schools at grade 4. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 232 hours for grade 4 school personnel.


2010 Grade 8 - Operational (U.S. History, Civics, Geography); PreCalibration (Math & Reading); Pilot (Writing)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 37,200 students will participate at grade 8, and the response burden for the grade 8 2010 administration is approximately 9,300 hours for students.


Teachers

Teachers at grade 8 will fill out Teacher Questionnaires. It is estimated that approximately 2,232 teachers will fill out the questionnaires. The individual respondent burden for teachers is approximately 20 minutes (.33 hours). The total response burden will be about 737 hours.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 744 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 8 2010 survey. The total response burden will be about 372 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,302 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the survey. The total response burden will be about 430 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 283 schools at grade 8. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 283 hours for school personnel.



2010 Grade 12 - Operational (U.S. History, Civics, Geography); Pilot (Math, Reading, Writing)

Students

The student response burden is .25 hours (15 minutes) for background questions for each student sampled in the assessments. Approximately 51,000 students will participate at grade 12 and the response burden for the grade 12 2010 administration is approximately 12,750 hours for students.


Teachers

There are no teacher questionnaires at grade 12.


School Personnel (burden for Questionnaires only, not e-filing or HSTS totals)

A School Characteristics and Policies Questionnaire will be completed by the principal or school administrator. It is estimated that a total of 1,020 schools will be sampled.

The individual response burden for school personnel is 30 minutes (.50 hours) for the school questionnaire used in the grade 12 2010 survey. The total response burden will be about 510 hours for school personnel.


SD/ELL

An SD/ELL Questionnaire will be completed by the principal or teacher. It is estimated that a total of approximately 1,785 surveys will be completed. The individual response burden for school personnel is 20 minutes (.33 hours) for the SD/ELL questionnaire used in the grade 12 survey. The total response burden will be about 589 hours for teacher/school personnel.


E-Filing (School Personnel)

E-Filing information will be provided by approximately 387 schools at grade 12. The individual response burden for school personnel is 60 minutes (1.0 hours) for the e-filing process. Thus, the total response burden will be about 387 hours for school personnel.


To minimize the burden to participating schools and students, the following procedures will be used:

  • Trained administrators will conduct the operational, pre-calibration, and pilot assessments at all grades.

  • Assessment administrations will be limited, whenever possible, to about 60-90 minutes to facilitate school scheduling.

  • Students will not take every question in a particular subject area. Blocks are assembled in different booklets, and each booklet is given to a different sub-sample of students.
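The block-and-booklet procedure in the last item can be sketched as a simple spiraled assignment. This is an illustrative simplification, not the actual NAEP booklet-distribution algorithm; the function and booklet names are hypothetical:

```python
# Spiraled booklet assignment: each sampled student receives one booklet,
# cycling through the available booklets so that every booklet (and thus
# every block of questions) is taken by a different sub-sample of students.

def assign_booklets(num_students: int, booklets: list[str]) -> list[str]:
    """Round-robin assignment of booklets to sampled students."""
    return [booklets[i % len(booklets)] for i in range(num_students)]

booklets = ["Booklet 1", "Booklet 2", "Booklet 3"]
assignments = assign_booklets(7, booklets)
print(assignments[:4])  # ['Booklet 1', 'Booklet 2', 'Booklet 3', 'Booklet 1']
```

In the operational assessment many more booklets are used, but the effect is the same: no student takes every block, while every block is answered by a representative sub-sample.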



EXHIBIT ONE - Student Background Questionnaire Plan


GRADES 4, 8, 12

Subject Area   No. of Booklets   No. of Blocks   No. of Minutes/Block              Total Minutes/Student
Background     1                 2               5 (core); 10 (subject-specific)   15


EXHIBIT TWO

Estimated Respondent Burden
National Assessment of Educational Progress
2008-2010 Assessments

Universe                        Year   Respondents        Avg. Items       Type of                 Avg. Person   Total Burden
(approximate size)                     (approx. sample)   per Respondent   Respondent              Hours         (Person Hours)

Student Background Questions    2008   142,900            38               Student                 .25           35,725
(12 million)                    2009   1,154,600          38               Student                 .25           288,650
                                2010   118,700            38               Student                 .25           29,675

School and Teacher Questionnaires:

Teacher Questionnaires          2008   4,194              33               Teacher                 .33           1,384
(383,000)                       2009   66,246             33               Teacher                 .33           21,861
                                2010   4,062              33               Teacher                 .33           1,340

School Questionnaire            2008   1,868              44               Principal               .50           934
(154,000)                       2009   23,092             44               Principal               .50           11,546
                                2010   2,374              44               Principal               .50           1,187

SD/ELL                          2008   5,002              9                Teacher/Administrator   .33           1,650
(383,000)                       2009   40,411             9                Teacher/Administrator   .33           13,336
                                2010   4,155              9                Teacher/Administrator   .33           1,371

E-Filing                        2008   744                List production  School Personnel        1.0           744
(16,000)                        2009   8,775                               School Personnel        1.0           8,775
                                2010   902                                 School Personnel        1.0           902

HSTS                            2009   775                --               School Personnel        .50           388

Total respondents = 1,567,603*                                             Total burden hours = 419,468

* E-filing and HSTS respondents are subsets of schools where school personnel complete such activities.

EXHIBIT THREE

Estimated Respondent Burden for NAEP 2008-2010 Assessments
By Year, By Grade Level
(all burden figures in hours)

2008

Subjects                                            Students   Student   Teachers   Teacher   Schools   School   SD/ELL      SD/ELL   Total
                                                               Burden               Burden              Burden   Personnel   Burden   Burden

LTT Age 9                                           16,500     4,125     --         --        --        --       578         191      4,316
LTT Age 13                                          16,500     4,125     --         --        --        --       578         191      4,316
LTT Age 17                                          16,500     4,125     --         --        --        --       578         191      4,316
4th Grade - PreCalibration (Math & Reading);
  Science Pilot; Reading Bridge                     30,700     7,675     1,842      608       614       307      1,075       355      8,944
8th Grade - PreCalibration (Math & Reading);
  Science Pilot; Reading Bridge; Arts Operational   39,200     9,800     2,352      776       784       392      1,372       453      11,421
12th Grade - Reading Pilot; Math Pilot;
  Science Pilot; Math Bridge                        23,500     5,875     --         --        470       235      823         271      6,381
E-Filing2                                           --         --        --         --        --        744      --          --       744
Totals                                              142,900    35,725    4,194      1,384     1,868     1,678    5,002       1,650    40,438

2 A subset of the sample of schools where school personnel complete e-filing activities.

2009

Subjects                                                     Students    Student   Teachers   Teacher   Schools   School   SD/ELL      SD/ELL   Total
                                                                         Burden               Burden              Burden   Personnel   Burden   Burden

4th Grade - Operational (Math, Reading, Science);
  Pilot (Math, Reading, U.S. History, Civics, Geography)     590,200     147,550   35,412     11,686    11,804    5,902    20,657      6,817    171,955
8th Grade - Operational (Math, Reading, Science);
  Pilot (Math, Reading, U.S. History, Civics, Geography)     513,900     128,475   30,834     10,175    10,278    5,139    17,987      5,936    149,725
12th Grade - Operational (Math, Reading, Science);
  Pilot (U.S. History, Civics, Geography)                    50,500      12,625    --         --        1,010     505      1,768       583      13,713
HSTS                                                         --          --        --         --        --        388      --          --       388
E-Filing3                                                    --          --        --         --        --        8,775    --          --       8,775
Total                                                        1,154,600   288,650   66,246     21,861    23,092    20,709   40,411      13,336   344,556

3 A subset of the sample of schools where school personnel complete e-filing activities.

2010

Subjects                                                       Students   Student   Teachers   Teacher   Schools   School   SD/ELL      SD/ELL   Total
                                                                          Burden               Burden              Burden   Personnel   Burden   Burden

4th Grade - Operational (U.S. History, Civics, Geography);
  PreCalibration (Math & Reading); Pilot (Writing)             30,500     7,625     1,830      604       610       305      1,068       352      8,886
8th Grade - Operational (U.S. History, Civics, Geography);
  PreCalibration (Math & Reading); Pilot (Writing)             37,200     9,300     2,232      737       744       372      1,302       430      10,838
12th Grade - Operational (U.S. History, Civics, Geography);
  Pilot (Math, Reading, Writing)                               51,000     12,750    --         --        1,020     510      1,785       589      13,849
E-Filing4                                                      --         --        --         --        --        902      --          --       902
Totals                                                         118,700    29,675    4,062      1,340     2,374     2,089    4,155       1,371    34,475

2008-2010 Totals                                               1,416,200  354,050   74,502     24,586    27,334    24,476   49,567      16,357   419,468

Total Respondents: 1,567,603
Total Burden: 419,468 hours

4 A subset of the sample of schools where school personnel complete e-filing activities.


13. Cost to respondents.

There are no direct costs to respondents.


14. Estimates of cost to the federal government.

Contracts to conduct the 2008 through 2010 NAEP assessments have not been awarded; therefore, the total costs for the period covered by this system clearance package have not been established. The necessary contracts for the 2008-2010 assessments are expected to be awarded on or before September 28, 2007. Upon award of the contracts, NCES will append estimated costs to this system clearance application.

Based on the current contracts for the 2005-2007 assessments (the period covered under the existing system clearance), the total cost to the federal government for the development, printing, distribution, scoring, analysis, and reporting of the currently authorized 2005-2007 activities is approximately $214,707,000. This three-year total is broken down as follows:

  • Development Costs: $12.7M (includes question research, development, and cognitive lab tryout of questions)

  • Materials Preparation, Printing & Distribution, and Scoring: $59.6M (includes all test booklet preparation and printing, and the hiring and training of all scoring personnel)

  • Analyses: $13.9M (includes creation and analysis of design plans, item analysis, DIF analysis, scaling, and summarization of data)

  • Reporting: $12.1M (includes reporting assessment results, assembling technical reports, and producing the public-use data file)

  • Alliance Coordination and Research: $16.9M (includes management of all contractor coordination efforts, quality assurance and QC plans, and development and maintenance of a public communications receipt and tracking system)

  • Sampling and Data Collection: $93.6M (includes site visits for data collection and assessment administrations, and development of the sampling design and specifications)

  • Web Operations and Maintenance: $5.7M (includes development and support of a NAEP information management system)

15. Reasons for changes in burden.

For the 2008-10 OMB System Clearance package (as was done in the 2005-07 package), a summary of the total burden, by year and overall, is presented in Exhibit Two (Estimated Respondent Burden). The estimates for average time to respond to student background questions (0.25 hours), teacher background questions (0.33 hours), and SD/LEP background questions (0.33 hours) are the same across the two packages. The estimate for average time to respond to the school background questions was increased from 0.33 hours to 0.50 hours between the 2005-07 and 2008-10 packages to be consistent with National Assessment Governing Board guidelines. The 2008-10 package includes, for the first time, burden estimates for E-filing and the High School Transcript Study.

Across the three years, the total number of respondents referenced in the 2005-07 package is 2,718,965; the total referenced in the 2008-10 package is 1,582,765. The difference is due to differences in sample sizes between state assessment years (i.e., years that include one or more assessments that support state-by-state reporting) and national-only assessment years. In state assessment years, NAEP samples more than 1,000,000 students; in national-only assessment years, it samples about 100,000 students. NAEP conducted state assessments in two of the three years between 2005 and 2007, but will conduct them in only one year, 2009, between 2008 and 2010. As a result, almost 1.2 million more students are covered under the 2005-07 OMB System Clearance package than under the 2008-10 package. The decrease in student sample size is accompanied by decreases in the teacher and school samples. Taken together, these differences in sample size result in much larger total burden hours for the 2005-07 package (695,402 hours) than for the 2008-10 package (423,708 hours).

16. Time schedule for data collection.

The time schedule for the data collection for the 2008-2010 assessments is shown below.


2008

Operational (Long Term Trend)                                   October 2007-May 2008
Operational (Arts)                                              January-March 2008
Pre-calibration (Reading and Mathematics)                       January-March 2008
Pilot (Reading, Mathematics, Science, HSTS)                     January-March 2008


2009

Operational (Reading, Mathematics, Science)                     January-March 2009
Pilot (Reading, Mathematics, U.S. History, Civics, Geography)   January-March 2009
HSTS                                                            June-September 2009


2010

Operational (U.S. History, Civics, Geography)                   January-March 2010
Pre-calibration (Reading and Mathematics)                       January-March 2010
Pilot (Reading, Mathematics, Writing)                           January-March 2010


17. Approval for Not Displaying the OMB Approval Expiration Date.


No exception is requested.


18. Exceptions to Certification Statement.


No exception is requested.

1 State-level burden resulting from the e-filing of sampling information falls to NCES-funded staff at state education agencies; therefore, that burden is not included in the estimates provided in this volume.

File Type: application/msword
File Title: NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS
Author: Nada Ballator (ext. 1526)
Last Modified By: DoED
File Modified: 2007-05-22
File Created: 2007-05-22
