
NATIONAL ASSESSMENT OF

EDUCATIONAL PROGRESS






SUPPORTING STATEMENT PART A



SYSTEM CLEARANCE PROPOSAL



NAEP SURVEYS


FOR THE YEARS 2011-2013









March 17, 2010


Table of Contents


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Potential respondent universe.

2. Procedures for collection of information.

3. Methods to maximize response rates and deal with issues of nonresponse.

4. Tests of procedures or methods to be undertaken.

5. Consultants on statistical aspects of the design.


Appendix A Statute Authorizing NAEP


Appendix B Lists of Committee Members

NAEP Background Variable Standing Committee

NAEP Design and Analysis Committee

NAEP Validity Studies Panel

NAEP National Indian Education Study (NIES) Technical Review Panel

NAEP Writing Standing Committee


Appendix C Example of Sample Design Document (2009 Assessment)

A. JUSTIFICATION

1a. Circumstances making the collection of information necessary.

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12¹ in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. In the legislation that most recently reauthorized NAEP, the No Child Left Behind Act of 2001 (Public Law 107-110), Congress again mandated the collection of national education survey data through a national assessment program. The No Child Left Behind Act of 2001 (NCLB) requires the assessment to collect data on specified student groups, including race/ethnicity, gender, socio-economic status, disability, and limited English proficiency. It requires fair and accurate presentation of achievement data and permits the collection of background, non-cognitive, or descriptive information that is related to academic achievement and aids in fair reporting of results. The intent of the law is to provide representative sample data on student achievement for the nation, the states, and subpopulations of students, and to monitor progress over time.


NAEP is administered by the National Center for Education Statistics (NCES) in the Institute of Education Sciences of the U.S. Department of Education. The National Assessment Governing Board (henceforth referred to as the Governing Board) sets policy for NAEP and determines the content framework for each assessment. As a result of the No Child Left Behind Act, the Governing Board has final authority on the appropriateness of all cognitive and non-cognitive assessment items. These surveys are currently conducted by an alliance of organizations under contract with the U.S. Department of Education. The national surveys contain three kinds of questions: "cognitive" assessment questions, which measure student knowledge of an academic subject; "non-cognitive" assessment questions, which gather construct-related information that does not directly reflect subject knowledge, such as student motivation or effort; and "background" questions, which gather factual information such as demographic variables.


The federal authority mandating NAEP is found in Section 411 of Public Law 107-110. This law states:

"…(b)(1) -- The purpose of this section is to provide, in a timely manner, a fair and accurate measurement of student academic achievement and reporting trends in such achievement in reading, mathematics, and other subject matter as specified in this section.

"(2) MEASUREMENT AND REPORTING.-- The Commissioner, in carrying out the measurement and reporting described in paragraph (1), shall --

"(A) use a random sampling process which is consistent with relevant, widely accepted professional assessment standards and that produces data that are representative on a national and regional basis;

"(B) conduct a national assessment and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in public and private elementary schools and secondary schools at least once every 2 years, in grades 4 and 8 in reading and mathematics;

"(C) conduct a national assessment and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in public and private schools in reading and mathematics in grade 12 in regularly scheduled intervals, but at least as often as such assessments were conducted prior to the date of enactment of the No Child Left Behind Act of 2001;

"(D) to the extent time and resources allow, and after the requirements described in subparagraph (B) are implemented and the requirements described in subparagraph (C) are met, conduct additional national assessments and collect and report assessment data, including achievement data trends, in a valid and reliable manner on student academic achievement in grades 4, 8, and 12 in public and private elementary schools and secondary schools in regularly scheduled intervals in additional subject matter, including writing, science, history, geography, civics, economics, foreign languages, and arts, and the trend assessment described in subparagraph (F);

"(E) conduct the reading and mathematics assessments described in subparagraph (B) in the same year, and every other year thereafter, to provide for 1 year in which no such assessments are conducted in between each administration of such assessments;

"(F) continue to conduct the trend assessment of academic achievement at ages 9, 13, and 17 for the purpose of maintaining data on long-term trends in reading and mathematics;

"(G) include information on special groups, including, whenever feasible, information collected, cross tabulated, compared, and reported by race, ethnicity, socioeconomic status, gender, disability and limited English proficiency; and

"(H) ensure that achievement data are made available on a timely basis following official reporting, in a manner that facilitates further analysis and that includes trend lines.


A copy of the current statute is included in Appendix A.



1b. Overview of NAEP Assessments.

The following provides a broad overview of NAEP assessments. Please note that the Governing Board determines NAEP policy and the assessment schedule, and future Governing Board decisions may result in changes to some aspects of an assessment (e.g., which subjects are assessed in which years). However, the overall methodology and assessment process will remain consistent.


NAEP consists of two assessment programs: the NAEP Long-Term Trend (LTT) assessment and the main NAEP assessment. The LTT assessments are designed to provide information on changes in academic performance over time. They are administered nationally every four years (results are not reported at the state or district level) and report student performance at ages 9, 13, and 17 in mathematics and reading.


The main NAEP assessment reports current achievement levels and short-term trends in student achievement at grades 4, 8, and 12 for the nation and, for certain assessments, states and select urban districts. These assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. The subject-area frameworks evolve to match instructional practices.


Assessment types listed in this document are described as follows:

Operational

An assessment whose results will be used for NAEP reporting purposes.

Pilot

An assessment that contains a pretest of items and possible assessment conditions to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions.

Probe

An assessment that is exploratory in nature and may use smaller sample sizes than an operational assessment, assess a limited portion of the framework, or report on limited data and results.

Special Study

An additional study to investigate content issues, delivery options, linking to other surveys, or reporting variables.



All four types of activities may simultaneously be in the field during any given data collection effort. Each is described in more detail below:

  1. Operational assessments. "Operational" NAEP administrations, rather than pilot or field test administrations, collect data to publicly report on the educational achievement of students.

  2. Pilot testing. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means of revising and selecting items prior to an operational data collection because, while fewer students participate, the items are administered to a nationally representative sample of students and data are gathered about performance across the spectrum of student achievement. Prior to pilot testing, new items are cognitively tested (intensive one-on-one interviews) with small groups of sample participants. This procedure is useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken. Pilot testing also affords the opportunity to test different conditions (e.g., computer-delivered methods) under which items are administered. Pilot testing of cognitive and non-cognitive items is carried out in all subject areas. In main NAEP subjects other than reading and mathematics at grades 4 and 8, pilot testing is done one year prior to the operational assessment. In reading and mathematics at grades 4 and 8, pilot testing is conducted two years prior to the operational assessment to facilitate the analyses required to meet the shorter reporting deadlines.

  3. Probe assessments. Probes are used for exploring preferred methods of assessing and reporting on newer technologies, delivery methodologies, and constructs. Probes fall between a full operational assessment and a pilot assessment: they are typically smaller in sample size than a full operational assessment but still afford samples large enough to report the resulting data. Probes are exploratory in nature and are often used for new technologies, such as the 2012 technological literacy assessment and the 2009 science interactive computer tasks (ICT) and hands-on tasks (HOTs) assessments.

  4. Small-scale special studies. As the NAEP program has evolved, definitions of what will be assessed have also evolved. For example, under the revised Reading Framework in 2009, the Governing Board proposed the inclusion of a new measure of meaning vocabulary. In addition, NAEP has conducted small-scale studies using computers as the platform for test administration, such as the preliminary studies to prepare for the 2011 writing and 2009 science ICT assessments. As new frameworks, new administration modes and modules, and new constructs are considered, additional small-scale special studies will be required to investigate the effectiveness or effect of the topic in question prior to implementing it on a wider scale.


In addition to reporting overall results of student performance and achievement, NAEP also reports student performance results for various subgroups of students and on various educational factors. Guidance for what is asked in the assessments is set by the Governing Board. NCES is responsible for developing the items and for selecting the final set of questions. The questions are designed to (a) provide the information for disaggregating data according to categories specified in the legislation, (b) provide contextual information that is subject-specific (e.g., reading, mathematics) and has a known relationship to achievement, and (c) provide policy-relevant information specified by the Governing Board.


Questions that do not work well can be dropped or modified before the operational administration. One typical modification is reducing the number of response categories given. This modification is employed when pilot data indicate very low response percentages in adjacent response categories.
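As a hedged illustration of this kind of revision (the counts, scale labels, and 5 percent threshold below are hypothetical, not NAEP rules), a sparsely chosen response category can be folded into its neighbor as follows:

```python
# Hypothetical pilot counts for an ordered response scale; the 5 percent
# threshold is illustrative only, not an actual NAEP rule.
def collapse_sparse_categories(counts, min_share=0.05):
    """Fold any response category chosen by too few students into the
    category to its left, preserving the order of the scale."""
    total = sum(counts)
    merged = [counts[0]]
    for count in counts[1:]:
        if count / total < min_share:
            merged[-1] += count  # combine with the previous category
        else:
            merged.append(count)
    return merged

# Pilot counts for, e.g., "Never, Rarely, Sometimes, Often, Always":
print(collapse_sparse_categories([12, 8, 150, 430, 400]))
# -> [20, 150, 430, 400]: the two sparse adjacent categories are combined.
```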


In addition to the overarching goal of NAEP to provide data about student achievement at the national, state, and district levels, NAEP also provides specially targeted data on an as-needed basis. At times, this may only mean that a special analysis of the existing data is necessary. At other times the targeting may include the addition of a short add-on questionnaire targeted at specified groups. For example, in the past, additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study (NIES) that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These types of questionnaires are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.


Subsequent clearance packages submitted for the specific assessment years will describe the scope of the assessment, including additional studies, school and student samples, estimated burden, and items requiring clearance. The basic methodology and procedures described under this system clearance will be employed for the 2011, 2012, and 2013 NAEP assessments.



1c. Overview of 2011-2013 NAEP Assessments.

The 2011 data collection consists of the following:

  • National, state, and urban district assessment in reading at grades 4 and 8 (including the NIES study);

  • National, state, and urban district assessment in mathematics at grades 4 and 8 (including the NIES study);

  • National, state, and urban district assessment in writing at grade 4, and national only at grades 8 and 12;

  • Pilot assessments for 2013 reading and mathematics at grades 4 and 8;

  • Pilot assessments for 2012 technological literacy at grades 8 and 12; and

  • Pilot assessment for 2012 economics at grade 12.

The 2012 data collection consists of the following:

  • National Long-Term Trend (LTT) assessments in reading and mathematics at ages 9, 13, and 17;

  • National assessment in economics at grade 12;

  • Probe national assessment in technological literacy at grades 8 and 12;

  • Pilot assessments for 2016 LTT in reading and mathematics at ages 9, 13, and 17;

  • Pilot assessment in science, including ICT and HOTs, at grades 4, 8, and 12; and

  • Pilot assessments for 2013 reading and mathematics at grade 12.

The 2013 data collection consists of the following:

  • National, state, and urban district assessment in reading at grades 4 and 8 (including the NIES study), national and partial state assessment at grade 12;

  • National, state, and urban district assessment in mathematics at grades 4 and 8 (including the NIES study), national and partial state assessment at grade 12;

  • National, state, and urban district assessments in science, including ICT and HOTs, at grades 4 and 8, and national only at grade 12;

  • High School Transcript Study (HSTS) at grade 12;

  • Pilot assessments for 2015 reading and mathematics at grades 4 and 8; and

  • Pilot assessments for 2014 civics, geography, and U.S. history at grades 4, 8, and 12.



In all assessment years, questionnaires (whether delivered as part of a booklet or on computer) are generally administered to students at grades 4, 8, and 12; to teachers at grades 4 and 8 (and at grade 12 for economics); and to school administrators at grades 4, 8, and 12. SD and ELL (students with disabilities and English language learners) worksheets are completed by teachers or administrators of students identified as having a disability or as ELL.


Survey sampling information needs to be gathered from schools for all NAEP assessments. This sampling information can be gathered manually or electronically at the school, district, or state level. Electronic filing (E-filing) is encouraged at the state and district levels, but if done at the school level, some burden will be incurred by school personnel.


Sizes and categories of respondent burden are provided in Section 12 of this document.


Special Studies

Special small-scale studies are conducted in accordance with the assessment development, research, or additional reporting needs of NAEP. For example, over recent assessment cycles, the following special studies were conducted:

  • meaning vocabulary study for grades 4 and 8;

  • preparedness study at grade 12;

  • NAEP-SAT linking study at grade 12;

  • mathematics accessibility study (a NAEP validity study) at grades 4 and 8;

  • special teacher and student studies related to performance on the assessments delivered in Puerto Rico;

  • reading Word Location Study at grades 4 and 8;

  • mathematics inclusion booklet study at grade 4;

  • student try-outs of the computer-based NAEP writing assessment;

  • motivation study at grade 12; and

  • Extended Student Background Questionnaire (ESBQ) study at grades 4 and 8.


Over the course of the 2011-2013 NAEP assessments, special studies will be conducted as directed by NCES. The specifics of future studies will be included in the subsequent yearly OMB submittals. Though an exhaustive list of special studies cannot be provided given the real-time nature of these projects, the following provides a list of special studies (either currently known or under consideration):

  • NAEP-TIMSS (Trends in International Mathematics and Science Study) linking study in 2011;

  • Multi-stage test (MST) study in 2011;

  • bridge studies to link results from the previous frameworks to the new frameworks;

  • special studies of student preparedness at grade 12; and

  • NAEP validity studies.


The NAEP-TIMSS linking study in 2011 is likely to take place; it will link the results from NAEP to TIMSS so that states can be compared with the countries participating in TIMSS. While the exact design of the study is yet to be determined, the basic principle is that randomly equivalent groups of students will take NAEP and TIMSS in order to link the two sets of results. The MST study (briefly described later in this document) is also a likely special study.


The instruments (i.e., assessments and questionnaires) and the student sample sizes needed to address the study questions are not known beyond the estimates for the 2011 NAEP-TIMSS linking study and the 2011 MST study. Therefore, burden estimates for future special studies are not provided in this application. Details for selected special studies will be included in subsequent clearance requests.

1d. Rationale for OMB System Clearance.

NCES is requesting system clearance for the NAEP assessments to be administered in the 2011–2013 timeframe, similar to the system clearance approval that was granted for the 2005–2007 and 2008–2010 NAEP administrations (OMB 1850-0790). The primary reason for the system clearance request is that it enables NAEP to meet its large and complex assessment reporting schedules and deliverables through a more efficient clearance process.


Since the passage of the No Child Left Behind (NCLB) legislation, the numbers of assessments, volumes of participants, and complexity of the development and production processes have all increased dramatically. Because of NCLB requirements, every state participates in the reading and mathematics assessments at grades 4 and 8 and most states participate in the state-level writing and science assessments. These state-level assessments require that samples be selected to permit reporting of results for each state as well as for the nation, resulting in a much larger sample size than would otherwise be necessary. In addition, because of the complex matrix sampling plan used by NAEP, upwards of 1,000 distinct assessment booklets may need to be printed for one assessment year alone.


In addition to the large volume of development and assessment that takes place, there is the factor of short turnaround times for production. Because of the increased volumes and complex development, there are shorter time frames for data analysis, question reviews, and assembly of questions for submittal to the Governing Board and OMB, and abbreviated windows for printing and distribution. To meet these demanding timelines, NCES is requesting a waiver of the 60-day Federal Register notice for individual clearances of studies described in this system clearance package.


Parts A and B of this system clearance package contain supporting information for the operational, pilot, probe, and known special study assessments to be given in 2011-2013. This submittal does not contain any Volume II (specific questionnaire) material; specific yearly questionnaires will be submitted as they are developed over the course of the system clearance period. Clearance of three years' duration (beginning May 1, 2010) is requested for the background and non-cognitive materials that will be used in the 2011-2013 operational, probe, pilot, and special study assessments. It is understood that, under the system clearance, each set of items would be cleared. At the end of this system clearance period (i.e., April 30, 2013), NCES would submit a new request for system clearance to cover future NAEP administrations.


2. How, by whom, and for what purpose the data will be used.

Under this request for system clearance, NCES is asking for approval of the various assessments, including the various questionnaires (e.g., student, teacher, and school) and data collection efforts that will be part of the NAEP 2011–2013 national, state, and district assessments, as well as any special studies (e.g., bridge studies) that will be conducted in the 2011–2013 timeframe. Results will be reported on the 2011 assessments in reading, mathematics, and writing; the 2012 assessments in reading and mathematics (LTT), economics, and technological literacy; and the 2013 assessments in reading, mathematics, and science. The schedule for subjects to be assessed has been established by the Governing Board; however, it is subject to modification. Therefore, NCES is requesting some leeway with regard to the specific subject assessments that will be administered, while holding the methodology and general procedures constant. As described in later sections, neither the HSTS nor student sampling imposes any student-level burden; rather, both require input from school- or district-level staff.²


NAEP uses the results from the probes, pilot tests, and some special studies to inform future assessments and procedures. Questions are evaluated for their effectiveness and appropriateness for use in an operational assessment. Data from the questionnaires are used as part of the marginal estimation procedures that produce the student achievement results. In addition, questionnaire data are used to perform quality control checks on school-reported data and in special reports, such as the Black-White Gaps report.


Results from the operational assessments, probes, and some special studies are released to the public. The NAEP results are reported in the Nation’s Report Card, which is used by policymakers, state and local educators, principals, teachers, and parents to inform the debate over education.


NAEP provides results on subject-matter achievement, instructional experiences, and school environment for populations of students (e.g., all fourth-graders) and groups within those populations (e.g., female students, Hispanic students). NAEP does not provide scores for individual students or schools, although state NAEP can report results by selected large urban districts. NAEP results are based on representative samples of students at grades 4, 8, and 12 for the main assessments, or samples of students at ages 9, 13, or 17 years for the LTT assessments.


The NAEP report cards provide national results, trends for different student groups, results on scale scores and achievement levels, and sample questions. In reports with state or urban district results, sections provide overview information on the performance of these jurisdictions. In addition to the report card, more information is available online (http://nationsreportcard.gov/) and in one-page summary reports, called snapshots, for each participating state or urban district. Additional data tools are available online for those interested in conducting further analyses.


Finally, there are numerous opportunities for secondary data analysis because of NAEP's large scale, the regularity of its administrations, and its stringent quality control processes for data collection and analysis. NAEP data are used by researchers and educators who have diverse interests and varying levels of analytical experience.



3. Use of techniques to reduce burden.

For all operational and pilot assessments, NAEP will continue to take advantage of proven, modern measurement techniques, which greatly enhance the power and value of the NAEP data collected. Through the use of a partial balanced incomplete block (BIB) spiraling variant of matrix sampling, a variety of analyses are feasible because the data are not booklet-bound. Covariances are computed among all questions in a subject area, so that:

  • composites of questions can be appraised empirically for coherence and construct validity;

  • the dimensional structure of each subject area can be determined analytically as reflected in student performance consistencies;

  • item response theory (IRT) scaling can be applied to unidimensional sets of exercises regardless of which booklet they appear in;

  • IRT scales can be developed having common meaning across exercises, population subgroups, age levels, and time periods;

  • powerful trend analyses can be undertaken by means of these common scales;

  • performance scales can be correlated with background, attitudinal, and program variables to address a rich variety of educational and policy issues; and

  • public-use electronic files can be made much more useful because secondary analyses are also not booklet-bound.


No student takes the complete assessment; because NAEP results are reported for groups of students, the BIB spiral allows the program to administer a lengthy overall assessment without overburdening any one student.
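For illustration only, the sketch below shows the two mechanisms at work: blocks are paired across booklets so that covariances between any two blocks can be estimated from some booklet, and booklets are spiraled so that each classroom receives a mix of forms. The block labels and the complete pairing are hypothetical; operational NAEP uses many more blocks and a partial-BIB design.

```python
from itertools import combinations, cycle

# Hypothetical cognitive blocks; an operational NAEP subject uses many
# more blocks and a partial (rather than complete) pairing.
blocks = ["B1", "B2", "B3", "B4", "B5"]

# Pair every block with every other block exactly once, so any two
# blocks co-occur in some booklet and their covariance can be estimated.
booklets = list(combinations(blocks, 2))

def spiral(booklets, n_students):
    """Hand booklets out in rotation so each group of students receives
    a mix of forms rather than many copies of one form."""
    rotation = cycle(booklets)
    return [next(rotation) for _ in range(n_students)]

if __name__ == "__main__":
    for student, booklet in enumerate(spiral(booklets, 12), start=1):
        print(f"student {student:2d}: blocks {booklet}")
```

Because every pair of blocks appears together in some booklet, covariance estimation and IRT scaling are not restricted to items that share a booklet, which is the property the analyses listed above rely on.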


The various questionnaires (student, teacher, and school) are constructed to minimize respondent burden using the information gathered from pilot testing. Based on pilot test data, questions that do not yield useful information or that produce redundant information can be eliminated. Also, to reduce respondent burden, NAEP will employ some of the following techniques:

  • maintaining questions across years rather than piloting new ones;

  • rotating some non-required questions across assessments;

  • reducing the number of questions piloted for existing frameworks; and

  • piloting the reformatting of questions.


Another technique for improving the response process is providing the teacher and school questionnaires via an online electronic completion system. In the most recent NAEP assessment year, approximately 10 percent of teacher and school questionnaires were completed online. Starting in 2010, in an effort to reduce burden, the SD and ELL data collection process has been revamped: instead of requiring individual questionnaire booklets for each identified SD and/or ELL student, a worksheet that allows the collection of information for multiple students (up to 10 per worksheet) has been developed.


4. Efforts to identify duplication.

The proposed assessments, including the questionnaires, do not exist in the same format or combination in the U.S. Department of Education or elsewhere. NAEP is the only comprehensive cross-sectional survey performed periodically or regularly on a large-scale basis whose background and non-cognitive data can be related to extensive achievement data. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress. None of the major non-federal studies of educational achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.


5. Burden on small businesses or other small entities.

Private schools are included in the sample proportional to their representation in the population.

6. Consequences of collecting information less frequently.

Under NCLB legislation, Congress has mandated the on-going collection of NAEP data.

Failure to collect the 2011-2013 assessment data on the current schedule would affect the quality and schedule of the NAEP assessments, and would result in assessments that would not fulfill the mandate of the legislation.


7. Consistency with 5 C.F.R. 1320.5.

No special circumstances are involved. This data collection observes all requirements of 5 C.F.R. 1320.5.


8. Consultations outside the agency.

In addition to the contractors responsible for the development and administration of the NAEP assessments, the program involves many consultants and is also reviewed by specialists. These consultants and special reviewers represent expertise with students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socioeconomic levels. Contractor staff and consultants have reviewed all questions for bias and sensitivity issues, grade appropriateness, and congruence with assessing at the state level. When appropriate, members of subject-area standing committees will review the questionnaires with regard to appropriateness within existing curricular and instructional practices. For example, because the 2011 administration of writing will be the first under a new framework, the writing committee will provide feedback on the writing background and non-cognitive questionnaires.


Membership lists for the following committees and panels of outside experts are provided in Appendix B:

  • NAEP Background Variable Standing Committee

  • NAEP Design and Analysis Committee

  • NAEP Validity Studies Panel

  • NAEP National Indian Education Study (NIES) Technical Review Panel

  • NAEP Writing Standing Committee


One public comment was received in response to the 60-day Federal Register notice (Vol. 74, page 57159, published November 4, 2009). It pointed out that ELL student exclusion policies vary state by state and sometimes district by district. In response, NCES revised the ELL bullet point in Supporting Statement Part B, Section 1, so that it is not specific regarding which ELL students are excluded. The new bullet point reads: “Students are selected according to student sampling procedures with these possible exclusions: 1) The student is identified as an English Language Learner (ELL), and is prevented from participation in NAEP, even with accommodations allowed in NAEP.”


9. Payments or Gifts to Respondents.

In general, there will be no gifts or payments to respondents, although students do keep the NAEP pencils used in the assessment. On occasion, NAEP will leave educational materials behind at schools for their use (e.g., science kits from the science HOTs assessments). Schools participating in the High School Transcript Study are paid the established fee for providing student transcripts. Because the study pays schools the prevailing rate to perform a standard service, estimates of school-level burden for that function are not included in this volume. Special studies sometimes include remuneration for respondents (e.g., the motivation study in 2008); such remuneration will be explicated in the specific-year clearance package submittals. Some schools also offer recognition parties with pizza or other perks for students who participate; however, these are not reimbursed by NCES or the contractor. If any incentives are proposed as part of a future special study, they will be justified as part of that future clearance package, and the amounts will be consistent with amounts approved in other studies with similar conditions.

10. Assurance of Confidentiality.

NAEP has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and the Education Sciences Reform Act of 2002 (Public Law 107-279, 20 U.S.C. §9622). Specifically for the NAEP project, this ensures that privacy, security, and confidentiality policies and procedures comply with the Privacy Act of 1974 and its amendments, NCES Confidentiality Procedures, and the Department of Education ADP Security Manual. The statute authorizing NAEP (20 U.S.C. §9622) requires the confidentiality of personally identifiable information:

"(A) IN GENERAL.-- The Commissioner for Education Statistics shall ensure that all personally identifiable information about students, their academic achievement, and their families, and that information with respect to individual schools, remains confidential, in accordance with section 552a of title 5.


"(B) PROHIBITION.-- The Assessment Board, the Commissioner for Education Statistics, and any contractor or subcontractor shall not maintain any system of records containing a student's name, birth information, Social Security number, or parents' name or names, or any other personally identifiable information.


The NAEP Security and Confidentiality Plan has been developed and NCES ensures that all current contractor policies and procedures are in compliance with all NAEP security and confidentiality requirements.


All NAEP-contractor staff with access to confidential NAEP information are required to sign an “affidavit of nondisclosure” affirming, under severe penalty for unlawful action, that they will protect NAEP information from unauthorized access or disclosure. The affidavits are in keeping with the NCES Standard for Maintaining Confidentiality (IV-01-92). NAEP contractors are required to maintain and provide NCES with lists of all staff who have contact with secure NAEP information, along with certification that all such staff have taken an appropriate oath of confidentiality. The Data Collection contractor must also comply with Directive OM:5-101, which requires that all staff who have access to data protected by the Privacy Act and/or access to U.S. Department of Education systems, and who will work on the contract for 30 days or more, go through the Department of Education Employee Security Clearance Procedures.

An important privacy and confidentiality issue is protecting the identity of assessed students, their teachers, and their schools. To ensure this protection, NAEP has established security procedures, described below, that closely control access to potentially identifying information.


Students’ names are submitted to the Sampling and Data Collection contractor for selecting the student sample. The list also includes month/year of birth, race/ethnicity, gender, and status codes for students with disabilities, English language learners, and participation in the National School Lunch Program. The student sample is selected, and the data for selected students are submitted to the Materials and Printing contractor, who includes them in the Packaging and Distribution system for the production of the Administration Schedule and student labels, which are then forwarded to field staff and used to manage and facilitate the assessment. These data are also added to the School Control System (SCS) used by field staff to print materials used by the schools. Student information is deleted from the Packaging and Distribution system after the assessment begins.


All labels are removed from the assessment materials and destroyed at the schools upon completion of the assessment. The section of the Administration Schedule with names is removed and placed in the school folder, which contains all of the forms and materials with student names. The folder is kept at the school until the end of the school year and then destroyed by school personnel.


In addition to student information, teacher and principal names are collected and recorded on a roster, which is used to track the distribution and collection of NAEP teacher and school questionnaires. The roster is kept at the school until the questionnaires are collected. At that time, the questionnaires and the portion of the roster without names are returned for processing. The portion of the roster with names is kept at the school in the school folder, which is destroyed at the end of the school year.


The completed Administration Schedules and teacher and school questionnaire rosters are returned to the Materials and Printing contractor for processing.

Furthermore, to ensure the confidentiality of respondents, NAEP staff will use the following precautions:

  • Data files will not identify individual respondents.

  • No personally identifiable information about schools or respondents will be gathered or released by third parties. No permanent files of names or other direct identifiers of respondents will be maintained.

  • Student participation is voluntary.

  • NAEP data are perturbed. Data perturbation is a statistical data editing technique implemented to ensure privacy for student and school respondents to NAEP’s assessment questionnaires. The process is coordinated in strict confidence with the IES Disclosure Review Board (DRB) with details of the process shared only with the DRB and a minimal number of contractor staff.


After the components of NAEP are completed in a school, neither student- nor teacher-reported data are retrievable by personal identifiers. We emphasize that confidentiality is assured for individual schools and for individual students, teachers, and principals. The following text is printed on all student, teacher, and school questionnaires:

The information you provide will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, your responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you.


For the HSTS component of NAEP, student transcripts are collected from schools for sampled students, and school staff members complete a School Information Form that provides general information about class periods, credits, graduation requirements, and other aspects of school policy. To maintain the privacy of student and school identities, students’ names are removed from the transcripts and questionnaires at the school and replaced with a unique identification number, which is used to match the transcript records to the NAEP questionnaire and performance information on an individual basis. NCES ensures that the data collected from schools and students can be used for statistical purposes only.


11. Sensitive questions.

The National Assessment of Educational Progress emphasizes voluntary respondent participation, assures the confidentiality of individual responses, and avoids asking for information that might be considered sensitive or offensive. Insensitive or offensive questions are prohibited by NCLB, and the Governing Board reviews all items for bias and sensitivity. Throughout the item development process, NCES staff work with consultants, contractors, and internal reviewers to identify and eliminate potential bias in the questions.


12. Estimation of respondent reporting burden (2011-2013).

Because the burden numbers fluctuate considerably over the three years of this collection, per discussion with OMB, we will report each year's burden as it actually is rather than averaging the total over the three years. This approach more accurately represents the actual annual numbers for both burden and respondents. For this initial year of the 2011-2013 NAEP System Clearance, the burden for 2011 will be reported. The burden and respondent numbers will be updated accordingly for 2012 and 2013.


Design information, estimated volumes, and burden estimates for the 2011-2013 assessments are contained in Exhibits 2 and 3, which follow. Exhibit 2 provides an overall summary of the burden information by respondent group. Exhibit 3 provides the burden information per respondent group, by grade, and by year.


To minimize the burden to participating schools and students, the following procedures will be used:

  • Trained administrators will conduct the assessments at all grades.

  • Assessment administrations will be limited, whenever possible, to about 60-90 minutes to facilitate school scheduling (computer-based assessments can be scheduled for up to 120 minutes).

  • Students will not take every question in a particular subject area. Blocks are assembled into different booklets, and each booklet is given to a different sub-sample of students.


Descriptions of the respondent groups and studies are provided below as supporting information for the following exhibits:

Students - Students in fourth, eighth, and twelfth grades complete assessment booklets that commonly contain two 25-minute cognitive blocks, followed by two sections (a background section and a non-cognitive section) that require a total of 15 minutes to complete³ (about 65 minutes of assessment time in all). The first section contains core background questions related to demographic information; the second contains subject-specific non-cognitive questions. Exhibit 1 indicates the burden on students.

Teachers - The teachers of fourth- and eighth-grade students (and twelfth-grade economics students) participating in NAEP will be asked to complete questionnaires about their teaching background, education, training, and classroom organization. Fourth-grade teacher burden is estimated at 30 minutes, since fourth-grade teachers often have multiple subject-specific sections to complete, while eighth- and twelfth-grade teacher burden is estimated at 20 minutes.

Principals/Administrators - The school administrators in the sampled schools will be asked to complete a questionnaire. The core questions are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. A section with subject-specific questions concentrates on curriculum and instructional services issues.

SD and ELL worksheets will be completed for students identified with disabilities or as English language learners.

E-Filing and Pre-Assessment Visit - Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the school or district level, some burden will be incurred by school personnel. The Pre-Assessment Visit is the opportunity for NCES contractor field staff to meet with school personnel to review procedures and logistics for the upcoming assessment.

High School Transcript Study (HSTS): The HSTS information is provided by a sample of public and private schools. The sample of schools is nationally representative of all schools in the United States, and the sample of students is representative of graduating seniors from each school. The transcript study includes only those students whose transcripts indicate that they graduated in the year the study was conducted. Most of the students sampled in the transcript study are in schools that participated in NAEP. The data collected from students who participated in NAEP make it possible to link course-taking patterns to academic performance, as measured by NAEP.

National Indian Education Study (NIES): The NIES will be conducted as part of the 2011 and 2013 assessments in reading and mathematics. The national sample includes students from both public and nonpublic schools that have both large and small American Indian/Alaska Native (AI/AN) student populations. The administration of the NAEP assessment will be followed with the administration of a questionnaire specifically designed for the NIES study. Questionnaire data will be linked to NAEP performance data.

NAEP-TIMSS (Trends in International Mathematics and Science Study) linking study - This study will link the results from NAEP to TIMSS so that states can be compared with the countries participating in TIMSS. While the exact design of the study is yet to be determined, the basic principle is that randomly equivalent groups of students will take NAEP and TIMSS in order to link the two sets of results.

Multi-stage testing (MST) special study - This 2011 study will examine the possibility of using an adaptive testing algorithm in NAEP. The study, administered on computer, will be conducted in mathematics at grade 8. A sample of students will take a first cognitive block of medium difficulty; depending on the individual student's performance on that block, the computer will assign either an easy, medium, or difficult block as the second block. Another sample of students will take the same blocks via standard NAEP random assignment. Comparisons will be made between the results from these two samples of students to determine whether tailoring block difficulty to student ability is a worthwhile endeavor for NAEP.
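To make the routing concrete, a minimal sketch follows; the block labels, maximum score, and cut scores are hypothetical, since the actual routing rules for the 2011 study have not been set.

```python
# Hypothetical routing rule for the two-stage design described above;
# the cut scores and maximum score are illustrative, not NAEP's.
EASY, MEDIUM, HARD = "easy block", "medium block", "difficult block"

def route_second_block(first_block_score, max_score=30,
                       low_cut=0.40, high_cut=0.70):
    """Choose the second block from performance on the medium-difficulty
    first block: weak performance routes to the easy block, strong
    performance to the difficult block."""
    fraction_correct = first_block_score / max_score
    if fraction_correct < low_cut:
        return EASY
    if fraction_correct < high_cut:
        return MEDIUM
    return HARD

for score in (9, 16, 25):
    print(f"score {score}/30 -> {route_second_block(score)}")
```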


EXHIBIT 1

Student Questionnaire Plan

GRADES 4, 8, 12

Subject Area | No. of Booklets | No. of Blocks | No. of Min/Block | Total No. of Minutes/Student
Background and non-cognitive | 1 | 2 | 5 minutes for core; 10 minutes for subject-specific questions | 15

EXHIBIT 2

Estimated Respondent Burden
2011-2013 NAEP Assessments

Instrument and Year | Universe | Respondents (approx. sample size) | Avg. No. of Items per Respondent | Type of Respondent | Estimate of Avg. Person Hours | Total Respondent Burden (person hours)
Student Questions, 2011 | 12 million | 934,175 | 39 | Student | 0.25 | 233,544
Student Questions, 2012 | 12 million | 84,500 | 50 | Student | 0.25 | 21,125
Student Questions, 2013 | 12 million | 1,133,000 | 56 | Student | 0.25 | 283,250
Teacher Questionnaires, 2011 | 383,000 | 67,472 | 43 | Teacher | 0.50 | 28,786
Teacher Questionnaires, 2012 | 383,000 | 1,700 | 43 | Teacher | 0.33 | 617
Teacher Questionnaires, 2013 | 383,000 | 77,448 | 43 | Teacher | 0.33 | 32,270
School Questionnaires, 2011 | 154,000 | 17,297 | 64 | Principal | 0.50 | 8,650
School Questionnaires, 2012 | 154,000 | 837 | 64 | Principal | 0.50 | 420
School Questionnaires, 2013 | 154,000 | 20,980 | 64 | Principal | 0.50 | 10,490
SD/ELL Worksheets*, 2011 | 383,000 | 17,297 | 9 | Teacher/Administrator | 0.16 per student | 31,270
SD/ELL Worksheets*, 2012 | 383,000 | 837 | 9 | Teacher/Administrator | 0.16 per student | 2,412
SD/ELL Worksheets*, 2013 | 383,000 | 20,980 | 9 | Teacher/Administrator | 0.16 per student | 36,893
E-Filing and Pre-Assessment Visit**, 2011 | 154,000 | 17,297 | List production | School Personnel | 1.0 each | 23,334
E-Filing and Pre-Assessment Visit**, 2012 | 154,000 | 2,112 | List production | School Personnel | 1.0 each | 2,916
E-Filing and Pre-Assessment Visit**, 2013 | 154,000 | 20,980 | List production | School Personnel | 1.0 each | 28,953
HSTS, 2013 | | 750 | | School Personnel | 3.0 | 2,250

* The number of respondents represents the estimated total number of school personnel who will complete the worksheets. The SD and ELL burden totals in the last column are based on the estimated number of students who will need worksheets completed for them, which allows burden calculation for the school personnel completing the worksheets.

** School burden for pre-assessment visits and e-filing is listed separately from school questionnaire burden. Only a subset of schools participates in e-filing.

Total respondents = 2,417,662; total burden hours = 747,180
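The student rows above can be verified directly: burden hours are the number of respondents multiplied by the average response time (15 minutes = 0.25 hours). A short illustrative check in Python:

```python
# Reproduces the student rows of Exhibit 2: respondents times average
# hours per respondent (15 minutes = 0.25 hours), rounded to whole hours.
student_respondents = {"2011": 934_175, "2012": 84_500, "2013": 1_133_000}
HOURS_PER_STUDENT = 0.25

for year, respondents in student_respondents.items():
    burden = round(respondents * HOURS_PER_STUDENT)
    print(f"{year}: {respondents:,} x {HOURS_PER_STUDENT} = {burden:,} hours")
# 2011: 934,175 x 0.25 = 233,544 hours
# 2012: 84,500 x 0.25 = 21,125 hours
# 2013: 1,133,000 x 0.25 = 283,250 hours
```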

EXHIBIT 3

Estimated Respondent Burden for NAEP 2011-2013 Assessments
By Year, By Grade Level

2011

Subjects | # of Students | Avg. Time per Resp. | Student Burden (hrs) | # of Teachers | Avg. Time per Resp. | Teacher Burden (hrs) | # of Schools | Avg. Time per Resp. | School Burden (hrs) | # of SD/ELL Students | Avg. Time per Resp. | SD/ELL Burden, School Personnel (hrs) | Total Burden (hrs)
4th Grade - Operational (Math, Reading, Writing); Pilot (Math, Reading); NIES study | 509,975 | 15 min | 127,494 | 37,772 | 30 min | 18,886 | 9,443 | 30 min | 4,722 | 112,195 | 10 min | 18,699 | 169,801
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 13,031 | | | | 13,031
4th Grade - Totals | 509,975 | | 127,494 | 37,772 | | 18,886 | 9,443 | | 17,753 | 112,195 | | 18,699 | 182,832
8th Grade - Operational (Math, Reading, Writing); Pilot (Math, Reading, Tech Lit); NIES study | 380,000 | 15 min | 95,000 | 28,148 | 20 min | 9,383 | 7,037 | 30 min | 3,519 | 68,400 | 10 min | 11,400 | 119,301
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 9,711 | | | | 9,711
NAEP-TIMSS Linking Special Study, Gr. 8 | 12,000 | 15 min | 3,000 | 888 | 20 min | 296 | 222 | 30 min | 111 | 2,160 | 10 min | 360 | 3,767
MST Special Study, Gr. 8 | 9,000 | 15 min | 2,250 | 664 | 20 min | 221 | 166 | 30 min | 83 | 1,620 | 10 min | 270 | 2,824
8th Grade - Totals | 401,000 | | 100,250 | 29,700 | | 9,900 | 7,425 | | 13,424 | 72,180 | | 12,030 | 135,603
12th Grade - Operational (Writing); Pilot (Tech Lit, Economics) | 23,200 | 15 min | 5,800 | | | | 429 | 30 min | 215 | 3,248 | 10 min | 541 | 6,556
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 592 | | | | 592
12th Grade - Totals | 23,200 | | 5,800 | | | | 429 | | 807 | 3,248 | | 541 | 7,148
Grand Total | 934,175 | | 233,544 | 67,472 | | 28,786 | 17,297 | | 31,983 | 187,623 | | 31,270 | 325,583

¹ School personnel involved in pre-assessment data gathering and e-filing activities (only a subset of schools do e-filing).
NOTE: Due to rounding, some totals may differ slightly from the sum of subtotals.

2012

Subjects | # of Students | Avg. Time per Resp. | Student Burden (hrs) | # of Teachers | Avg. Time per Resp. | Teacher Burden (hrs) | # of Schools | Avg. Time per Resp. | School Burden (hrs) | # of SD/ELL Students | Avg. Time per Resp. | SD/ELL Burden, School Personnel (hrs) | Total Burden (hrs)
LTT¹ Age 9 | 17,000 | 15 min | 4,250 | | | | 425 | | | 3,740 | 10 min | 623 | 4,873
LTT Age 13 | 17,000 | 15 min | 4,250 | | | | 425 | | | 3,060 | 10 min | 510 | 4,760
LTT Age 17 | 17,000 | 15 min | 4,250 | | | | 425 | | | 2,380 | 10 min | 397 | 4,647
LTT Pre-assessment visit and E-filing² | | | | | | | | 2 hr | 1,760 | | | | 1,760
LTT Totals | 51,000 | | 12,750 | | | | 1,275 | | 1,760 | 9,180 | | 1,530 | 16,040
4th Grade - Science pilot, science ICT & HOTs pilot | 3,000 | 15 min | 750 | 300 | 30 min | 150 | 75 | 30 min | 38 | 660 | 10 min | 110 | 1,048
Pre-assessment visit and E-filing² | | | | | | | | 2 hr | 104 | | | | 104
4th Grade - Totals | 3,000 | | 750 | 300 | | 150 | 75 | | 142 | 660 | | 110 | 1,152
8th Grade - Science pilot, science ICT & HOTs pilot; Tech Lit probe | 9,000 | 15 min | 2,250 | 900 | 20 min | 300 | 225 | 30 min | 113 | 1,620 | 10 min | 270 | 2,933
Pre-assessment visit and E-filing² | | | | | | | | 2 hr | 311 | | | | 311
8th Grade - Totals | 9,000 | | 2,250 | 900 | | 300 | 225 | | 424 | 1,620 | | 270 | 3,244
12th Grade - Economics; Tech Lit probe; Reading pilot; Math pilot; Science pilot; Science ICT & HOTs pilot | 21,500 | 15 min | 5,375 | 500 | 20 min | 167 | 537 | 30 min | 269 | 3,010 | 10 min | 502 | 6,313
Pre-assessment visit and E-filing² | | | | | | | | 2 hr | 741 | | | | 741
12th Grade - Totals | 21,500 | | 5,375 | 500 | | 167 | 537 | | 1,010 | 3,010 | | 502 | 7,054
Grand Total | 84,500 | | 21,125 | 1,700 | | 617 | 2,112 | | 3,336 | 14,470 | | 2,412 | 27,490

¹ No teacher or school questionnaire burden is associated with LTT.
² School personnel involved in pre-assessment data gathering and e-filing activities (only a subset of schools do e-filing).
NOTE: Due to rounding, some totals may differ slightly from the sum of subtotals.



2013

Subjects | # of Students | Avg. Time per Resp. | Student Burden (hrs) | # of Teachers | Avg. Time per Resp. | Teacher Burden (hrs) | # of Schools | Avg. Time per Resp. | School Burden (hrs) | # of SD/ELL Students | Avg. Time per Resp. | SD/ELL Burden, School Personnel (hrs) | Total Burden (hrs)
4th Grade - Operational (Math, Reading, Science); Probe (Science ICT/HOTs); Pilot (Math, Reading, Civics, Geography, U.S. History); NIES study | 522,800 | 15 min | 130,700 | 38,724 | 30 min | 19,362 | 9,681 | 30 min | 4,841 | 115,016 | 10 min | 19,169 | 174,072
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 13,360 | | | | 13,360
4th Grade - Totals | 522,800 | | 130,700 | 38,724 | | 19,362 | 9,681 | | 18,201 | 115,016 | | 19,169 | 187,432
8th Grade - Operational (Math, Reading, Science); Probe (Science ICT/HOTs); Pilot (Math, Reading, Civics, Geography, U.S. History); NIES study | 522,800 | 15 min | 130,700 | 38,724 | 20 min | 12,908 | 9,681 | 30 min | 4,841 | 94,104 | 10 min | 15,684 | 164,133
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 13,360 | | | | 13,360
8th Grade - Totals | 522,800 | | 130,700 | 38,724 | | 12,908 | 9,681 | | 18,201 | 94,104 | | 15,684 | 177,493
12th Grade - Operational (Math, Reading, Science); Probe (Science ICT/HOTs); Pilot (Civics, Geography, U.S. History) | 87,400 | 15 min | 21,850 | | | | 1,618 | 30 min | 809 | 12,236 | 10 min | 2,039 | 24,698
Pre-assessment visit and E-filing¹ | | | | | | | | 2 hr | 2,233 | | | | 2,233
HSTS | | | | | | | | | 2,250 | | | | 2,250
12th Grade - Totals | 87,400 | | 21,850 | | | | 1,618 | | 5,292 | 12,236 | | 2,039 | 29,181
Grand Total | 1,133,000 | | 283,250 | 77,448 | | 32,270 | 20,980 | | 41,694 | 221,356 | | 36,892 | 394,106

¹ School personnel involved in pre-assessment data gathering and e-filing activities (only a subset of schools do e-filing).
NOTE: Due to rounding, some totals may differ slightly from the sum of subtotals.







13. Cost to respondents.

There are no direct costs to respondents.


14. Estimates of cost to the federal government.

Based on the current contracts for the 2011 and 2012 assessments and the estimated costs for the 2013 assessments, the total cost to the federal government for the administration of the 2011-2013 activities is approximately $66.8 million. The cost estimate is broken down as follows:

  • $21.1 million for the printing, packaging, and distribution phases of the administrations;

  • $44.2 million for the 2011-13 field supervisors and assessment administrators to go into schools to administer the assessments, including travel expenses; and

  • $1.5 million for web operations and maintenance costs related to the support of the science ICT, writing, and technological literacy assessments.


15. Reasons for changes in burden (from last System Clearance submittal).

For the 2011-2013 OMB System Clearance package (as was done in the 2008-2010 package), a summary of the total burden (by year, by respondent group, and overall) is presented in Exhibit 2 (Estimated Respondent Burden, 2011-2013 NAEP Assessments). The exhibit provides estimates of the average time to respond to student questionnaire questions (0.25 hours), teacher questions (0.50 and 0.33 hours), school questions (0.50 hours), and SD and ELL worksheets (0.16 hours).

The slight reduction in burden of 9,858 hours (from 2010 to 2011) is attributable in part to a change in the way burden is reported relative to previous system clearances: rather than taking an average of the burden hours over three years, for 2011 we are using the specific burden and respondent figures (and will treat the 2012 and 2013 figures the same way). Given the extreme variation in burden from 2011 to 2013, this is a more accurate way to handle burden over the next three years of this system clearance. As a result, 2011 reflects a slight reduction in burden and respondents. Other changes will also affect the burden numbers and the number of respondents over the three-year period. For instance:

 

1) The nature of NAEP is that burden alternates between relatively low burden in national-level administration years and substantially higher burden in state-level administration years, which include one or more assessments that support reporting for each state and certain urban districts. In state/district assessment years, NAEP samples approximately 1,000,000 students; in national-only assessment years, NAEP samples approximately 100,000 students. In 2011 and 2013, NAEP will conduct state/district assessments, and in 2012, national-level assessments.

 

2) Prior versions of NAEP system clearance packages used much lower percentages for the SD and ELL student participation rates. Over the past few years, NAEP has taken measures to include as many students as possible and has expanded the accommodations allowed. Based on the 2008 and 2009 data, the percentages of students identified as either SD or ELL have increased to 22 percent at grade 4, 18 percent at grade 8, and 14 percent at grade 12. (For example, the 2011 grade 4 SD/ELL count in Exhibit 3, 112,195, is 22 percent of the 509,975 sampled students.) Thus, the SD and ELL volumes and burdens have increased significantly over prior submittals.

 

3) 2013 will include the High School Transcript Study (HSTS), which will increase burden for that year. It will be explained in more detail in the OMB clearance package that will be submitted for 2013.

 

4) NAEP has also implemented a number of measures to minimize burden. For instance, as described in Section 3, the SD and ELL student data collection process has been revamped to reduce burden.



16. Time schedule for data collection.

The time schedule for the data collection for the 2011-2013 assessments is shown below.

Year | Assessment | Data Collection Window
2011 | Main NAEP | January-March 2011
2012 | Long-Term Trend | October 2011-May 2012
2012 | Main NAEP | January-March 2012
2013 | Main NAEP | January-March 2013





17. Approval for not Displaying OMB Approval Expiration Date.

No exception is requested.


18. Exceptions to Certification Statement.

No exception is requested.

¹ As described in Section 1b, main NAEP assesses students in grades 4, 8, and 12, and Long-Term Trend NAEP assesses students at ages 9, 13, and 17.

² State-level burden resulting from e-filing sampling information falls to NCES-funded staff at state education agencies; therefore, this burden is not included in the estimates provided in this volume.

³ The grade 8 and 12 writing assessments will be delivered on computer, and students will complete only one section of background/non-cognitive questions. The section is still 15 minutes long and combines all core and subject-specific questions.
