
National Center for Education Statistics

National Assessment of Educational Progress






SUPPORTING STATEMENT PART A



Request for System Clearance for

NAEP Assessments for 2014-2016


OMB# 1850-0790 v.36














November 1, 2012

(revised 1-17-13)

Table of Contents

A. JUSTIFICATION

1a. Circumstances making the collection of information necessary

1b. Overview of NAEP assessments

1c. Overview of 2014–2016 NAEP assessments

1d. Rationale for OMB System Clearance

2. How, by whom, and for what purpose the data will be used

3. Use of techniques to reduce burden

4. Efforts to identify duplication

5. Burden on small businesses or other small entities

6. Consequences of collecting information less frequently

7. Consistency with 5 CFR 1320.5

8. Consultations outside the agency

9. Payments or gifts to respondents

10. Assurance of confidentiality

11. Sensitive questions

12. Estimation of respondent reporting burden (2014–2016)

13. Cost to respondents

14. Estimates of cost to the federal government

15. Reasons for changes in burden (from last System Clearance submittal)

16. Time schedule for data collection

17. Approval for not displaying OMB approval expiration date

18. Exceptions to Certification Statement

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

Appendix A Statute Authorizing NAEP

Appendix B External Consultants

NAEP Design and Analysis Committee

NAEP Validity Studies Panel

NAEP Quality Assurance Technical Panel

NAEP Socio-Economic Status Panel

NAEP National Indian Education Study Technical Review Panel

NAEP Civics Standing Committee

NAEP Economics Standing Committee

NAEP Geography Standing Committee

NAEP Mathematics Standing Committee

NAEP Reading Standing Committee

NAEP Science Standing Committee

NAEP Technology and Engineering Literacy Standing Committee

NAEP U.S. History Standing Committee

NAEP Writing Standing Committee

Appendix C Example of Sample Design Document (2013 Assessment)

Appendix D Sample Parental Notification and School Coordinator Responsibilities

Appendix E Sample School Coordinator Responsibilities Brochure

A. JUSTIFICATION

1a. Circumstances making the collection of information necessary

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12¹ in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, technology and engineering literacy (TEL), and the arts. In the current legislation that reauthorized NAEP, the National Assessment of Educational Progress Authorization Act (Public Law 107-279 Title III, section 303), Congress mandates the collection of national education survey data through a national assessment program. The National Assessment of Educational Progress Authorization Act also requires the assessment to collect data on specified student groups and characteristics, including information organized by race/ethnicity, gender, socio-economic status, disability, and limited English proficiency. It requires fair and accurate presentation of achievement data and permits the collection of background, noncognitive, or descriptive information that is related to academic achievement and aids in fair reporting of results. The intent of the law is to provide representative sample data on student achievement for the nation, the states, and subpopulations of students and to monitor progress over time.

NAEP is administered by the National Center for Education Statistics (NCES) in the Institute for Education Sciences of the U.S. Department of Education. The National Assessment Governing Board (henceforth referred to as the Governing Board) sets policy for NAEP and determines the content framework for each assessment. As a result of the National Assessment of Educational Progress Authorization Act, the Governing Board has final authority on the appropriateness of all cognitive and noncognitive assessment items. The NAEP assessments are conducted by an alliance of organizations under contract with the U.S. Department of Education.² These assessments contain different kinds of items — “cognitive” assessment items, which measure student knowledge of an academic subject, and “noncognitive” or “background” items, which gather factual information such as demographic variables, as well as construct-related information, such as courses taken.

A copy of the current statute is included in appendix A.

1b. Overview of NAEP assessments

The following provides a broad overview of NAEP assessments. Please note that the Governing Board determines NAEP policy and the assessment schedule,³ and future Governing Board decisions may result in changes to some aspects of an assessment (e.g., which subjects are assessed in which years). However, the overall methodology and assessment process will remain consistent.

NAEP consists of two assessment programs: the NAEP long-term trend (LTT) assessment and the main NAEP assessment. The LTT assessments are designed to give information on the changes in academic performance and have been administered for over 40 years. They are administered nationally every four years (but are not reported at state or district level) and report student performance at ages 9, 13, and 17 in mathematics and reading.

The main NAEP assessments report current achievement levels and short-term trends in student achievement at grades 4, 8, and 12 for the nation and, for certain assessments, states and select urban districts. These assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. The subject-area frameworks evolve to match instructional practices. NCES is responsible for developing the cognitive items and for selecting the final set of items.

In addition to reporting overall results of student performance and achievement, NAEP also reports student performance results for various subgroups of students and on various educational factors. Guidance for what is asked in the questionnaires is set by the Governing Board. NCES is responsible for developing the noncognitive items and for selecting the final set of items. The items are designed to (a) provide the information for disaggregating data according to categories specified in the legislation, (b) provide contextual information that is subject-specific (e.g., reading, mathematics) and has a known relationship to achievement, and (c) provide policy-relevant information specified by the Governing Board.

Questionnaires are generally administered to students at grades 4, 8, and 12; to teachers at grades 4 and 8 (and at grade 12 for economics); and to school administrators at grades 4, 8, and 12. Worksheets for students with disabilities (SD) and English language learners (ELL) are completed by the teachers or administrators of students identified as SD or ELL, in order to determine whether individual students can meaningfully participate and, if so, which accommodation(s) they should receive.

Assessment types listed in this document are described as follows:

Operational

An assessment whose results will be used for NAEP reporting purposes.

Pilot

An assessment that contains a pretest of items and possible assessment conditions to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions.

Special Study

A study to investigate topics such as content issues, delivery options, accommodations, linking to other surveys, or reporting variables.


All three types of activities may simultaneously be in the field during any given data collection effort. Each is described in more detail below:

  1. Operational assessments: “Operational” NAEP administrations, unlike pilot administrations, collect data to publicly report on the educational achievement of students. The NAEP results are reported in the Nation’s Report Card, which is used by policymakers, state and local educators, principals, teachers, and parents to inform educational policy decisions.

  2. Pilot testing: Pilot testing of cognitive and noncognitive items is carried out in all subject areas. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because, although fewer students participate, the items are administered to a nationally representative sample and data are gathered on performance across the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration. For example, when pilot data indicate very low response percentages in some response categories, the number of response categories provided can be reduced.

Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB #1850-0803). All noncognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, in order to try out new item types, new formats, or challenging content.

  3. Special studies: Special studies are an opportunity for NAEP to investigate particular aspects of the assessment program without affecting the reporting of NAEP results. Previous special studies have focused on linking NAEP to other assessments or linking across NAEP frameworks, investigating alternative sets of items for particular samples, evaluating specific accommodations, investigating administration modes (such as computer-delivered assessment alternatives), and providing targeted data on specific student populations.

In addition to the overarching goal of NAEP to provide data about student achievement at the national, state, and district levels, NAEP also provides specially targeted data on an as-needed basis. At times, this may only mean that a special analysis of the existing data is necessary. At other times the targeting may include the addition of a short add-on questionnaire targeted at specified groups. For example, in the past, additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study (NIES) that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These types of special studies are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.

1c. Overview of 2014–2016 NAEP assessments

The basic methodology and procedures described under this system clearance will be employed for the 2014, 2015, and 2016 NAEP assessments. Subsequent clearance packages submitted for the specific assessment years will describe the scope of the assessment, including additional studies, school and student samples, estimated burden, and items requiring clearance.

The 2014 data collection consists of the following:

  • Operational national assessments for U.S. history, civics, and geography at grades 4, 8, and 12;

  • Operational national assessment for TEL⁴ at grade 8;

  • Pilot assessments for 2015 science, including interactive computer tasks (ICTs)⁴ and hands-on tasks (HOTs), at grades 4, 8, and 12; and

  • Pilot assessments for 2015 reading and mathematics at grade 12.

The 2015 data collection consists of the following:

  • Operational national and state assessments in reading, mathematics, and science at grades 4 and 8;

  • Operational assessments in reading and mathematics at grade 12, administered nationally plus in 15 states;

  • Operational national assessments in science ICTs⁴ and HOTs at grades 4, 8, and 12;

  • Operational Puerto Rico mathematics assessment at grades 4 and 8; and

  • Pilot assessments for 2017 reading, mathematics, and writing⁴ at grades 4, 8, and 12.

The 2016 data collection consists of the following:

  • Operational national arts assessment at grade 8;

  • Operational national LTT assessments in reading and mathematics for ages 9, 13, and 17;

  • Pilot assessments for 2020 LTT reading and mathematics for ages 9, 13, and 17;

  • Pilot assessments for 2018 U.S. history, geography, and civics for grades 4, 8, and 12; and

  • Pilot assessment for 2018 economics at grade 12.

Special Studies

Special studies are conducted in accordance with the assessment development, research, or additional reporting needs of NAEP. For example, over recent assessment cycles the following special studies were conducted:

  • NAEP–Trends in International Mathematics and Science Study (TIMSS) Linking Study;

  • Reading Accessible Blocks Study;

  • Knowledge and Skills Appropriate (KaSA) Study (Mathematics);

  • NAEP–Lexile© Study (Reading);

  • NAEP–Program for International Student Assessment (PISA) Linking Study (Mathematics); and

  • NAEP–High School Longitudinal Study (HSLS) Study.

Over the course of the 2014–2016 NAEP assessments, special studies will be conducted. The specifics of future studies will be included in the subsequent OMB submittals. Though an exhaustive list of special studies cannot be provided given the real-time nature of these projects, two special studies are currently being considered:

  • High School Transcript Study (HSTS) – The NAEP HSTS, planned for 2015, periodically surveys the curricula being followed in our nation’s high schools and the course-taking patterns of high school students through a collection of transcripts. The transcript information is provided by a sample of public and private schools, and the student sample is representative of the U.S. population of graduating seniors. The transcript study includes only those students whose transcripts indicate that they graduated in the year the study was conducted. Most of the students sampled in the transcript study are in schools that participated in NAEP. The data collected from those students who participated in NAEP make it possible to link course-taking patterns to academic performance, as measured by NAEP.

  • National Indian Education Study (NIES) – NIES is planned as part of the 2015 assessments in reading and mathematics. The national sample includes students from both public and nonpublic schools that have both large and small American Indian/Alaska Native (AI/AN) student populations. The administration of the NAEP assessment will be followed by the administration of a questionnaire specifically designed for the NIES study. Questionnaire data will be linked to NAEP performance data.

In addition, NAEP is in the process of transitioning assessments from paper-and-pencil to computer-delivered assessments. Therefore, the NAEP program will most likely undertake some special studies to explore aspects associated with the transition to computer-delivered assessments, such as item types, hardware, system performance, and comparability across assessment modes.

1d. Rationale for OMB System Clearance

NCES is requesting system clearance for the NAEP assessments to be administered in the 2014–2016 timeframe, similar to the system clearance approval that was granted for the 2011–2013 and 2008–2010 NAEP administrations (OMB 1850-0790). The primary reason for the system clearance request is that it enables NAEP to meet its large and complex assessment reporting schedules and deliverables through a more efficient clearance process.

Since the passage of the No Child Left Behind Act in 2001 (P.L. 107-110), the number of assessments, the volume of participants, and the complexity of the development and production processes have all increased dramatically. Because of the National Assessment of Educational Progress Authorization Act requirements, every state participates in the reading and mathematics assessments at grades 4 and 8, and most states participate in the state-level writing and science assessments.

These state-level assessments require that samples be selected to permit reporting of results for each state as well as for the nation, resulting in a much larger sample size than would otherwise be necessary. In addition, because of the complex matrix sampling plan used by NAEP, upwards of 1,000 distinct assessment booklets or forms may need to be printed or assembled for one assessment year alone.

Given the cyclical nature of the NAEP assessments and their complexity and scope, there is a relatively short and fixed time period in which production can occur. Prior to the finalization of the content, many activities must be conducted, including quality reviews, analysis of any prior data, assembly of forms, and approval by the Governing Board and OMB. To meet these demanding timelines, with this submission NCES justifies the need for and overall practical utility of the full study as proposed and presents an overarching plan for the phases of the data collection over the next three years, providing as much detail regarding the measures to be used as is available at the time of submission. As such, this is a request for OMB to approve the initial phase of this collection. Prior to fielding any additional phases of the study under this collection, NCES will publish a notice in the Federal Register allowing a 30-day public comment period on the details of each subsequent study component, concurrent with submission to OMB. However, the expiration date of the OMB Control number for this collection (1850-0790) will not be extended by these subsequent submissions unless NCES follows PRA requirements for both 60- and 30-day public comment periods.

Parts A and B of this system clearance package contain supporting information for the assessments to be given in 2014–2016. This submittal does not contain any volume II (specific questionnaire) material. Specific yearly questionnaires will be submitted as they are developed over the course of the system clearance period. Clearance of three years duration (beginning April 1, 2013) is requested for the 2014–2016 assessments. At the end of this system clearance period (i.e., prior to March 31, 2016), NCES will submit a new request for system clearance to cover future NAEP administrations.

2. How, by whom, and for what purpose the data will be used

Under this request for system clearance, NCES is asking for approval of the planned assessments, including the various questionnaires (e.g., student, teacher, and school) and data collection efforts that will be part of the NAEP 2014–2016 national, state, and district assessments, as well as any special studies that are planned to be conducted in the 2014–2016 timeframe. Results will be reported on the 2014 operational assessments in U.S. history, civics, geography, and TEL; the 2015 operational assessments in reading, mathematics, and science, including ICTs and HOTs; and the 2016 operational arts assessment and long-term trend assessments in reading and mathematics. The schedule for subjects to be assessed has been established by the Governing Board; however, it is subject to modification. Therefore, NCES is requesting some leeway with regard to the specific subject assessments that will be administered, while holding the methodology and general procedures constant.

NAEP uses the results from the pilot tests and some special studies to inform future assessments and procedures. Items are evaluated for their effectiveness and appropriateness to be used in an operational assessment.

Results from the operational assessments and some special studies are released to the public. The NAEP results are reported in the Nation’s Report Card, which is used by policymakers, state and local educators, principals, teachers, and parents to help inform educational policy decisions. The NAEP Report Cards provide national results, trends for different student groups, results on scale scores and achievement levels, and sample items. In reports with state or urban district results, there are sections that provide overview information on the performance of these jurisdictions. NAEP does not provide scores for individual students or schools.

In addition to the Report Card, more information is available online (http://nationsreportcard.gov/) and in one-page summary reports, called snapshots, for each participating state or urban district. Additional online data tools are available for those interested in exploring the results in greater depth.

In addition to contributing to the reporting tools mentioned above, data from the questionnaires are used as part of the marginal estimation procedures that produce the student achievement results. Questionnaire data are also used to perform quality control checks on school-reported data and in special reports, such as the Black–White Achievement Gap report.

Finally, there are numerous opportunities for secondary data analysis because of NAEP’s large scale, the regularity of its administrations, and its stringent quality control processes for data collection and analysis. NAEP data are used by researchers and educators who have diverse interests and varying levels of analytical experience.

3. Use of techniques to reduce burden

For all operational and pilot assessments, NAEP will continue to take advantage of proven, modern measurement techniques, which greatly enhance the power and value of the NAEP data collected. Through the use of a partial balanced incomplete block (BIB) spiraling variant of matrix sampling, a variety of analyses are feasible because the data are not booklet-bound (a minimal illustrative sketch follows the list below). Covariances are computed among all items in a subject area, so that

  • composites of items can be appraised empirically for coherence and construct validity;

  • the dimensional structure of each subject area can be determined analytically as reflected in student performance consistencies;

  • item response theory (IRT) scaling can be applied to unidimensional sets of items or tasks, regardless of which booklet they appear in;

  • IRT scales can be developed having common meaning across exercises, population subgroups, age levels, and time periods;

  • powerful trend analyses can be undertaken by means of these common scales;

  • performance scales can be correlated with background, attitudinal, and program variables to address a rich variety of educational and policy issues; and

  • public-use electronic files can be made much more useful because secondary analyses are also not booklet-bound.
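To make the matrix-sampling design concrete, the following is a minimal illustrative sketch in Python, not NAEP’s actual production design: it assembles booklets of two blocks from a small hypothetical block pool so that every pair of blocks co-occurs in exactly one booklet, and then spirals the booklets across students. All block names and counts are hypothetical.

# Minimal illustrative sketch of BIB booklet assembly and spiraling.
# Hypothetical block names and counts; NAEP's production design is a far
# larger, partially balanced variant (upwards of 1,000 distinct booklets).
from itertools import combinations, cycle

BLOCKS = ["B1", "B2", "B3", "B4", "B5"]  # cognitive item blocks (hypothetical)
BLOCKS_PER_BOOKLET = 2

# With booklets of two blocks drawn from five, every pair of blocks shares
# exactly one booklet, so covariances among all items remain estimable even
# though no student is administered the full item pool.
booklets = list(combinations(BLOCKS, BLOCKS_PER_BOOKLET))  # C(5, 2) = 10

def spiral(student_ids, booklets):
    """Assign booklets to students in rotation ("spiraling") so that each
    booklet reaches a comparable subsample in every participating school."""
    rotation = cycle(booklets)
    return {sid: next(rotation) for sid in student_ids}

students = [f"S{i:03d}" for i in range(1, 21)]  # 20 hypothetical students
for sid, booklet in list(spiral(students, booklets).items())[:5]:
    print(sid, booklet)

Because every pair of blocks shares a booklet, covariances among all item pairs are estimable, which is what frees IRT scaling and secondary analyses from being booklet-bound.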

To minimize the burden for participating schools and students, the following procedures will be used:

  • No student takes the complete assessment and, given that NAEP results are reported for groups of students, using a BIB spiral allows the program to administer a lengthy overall assessment without overburdening any one student.

  • Trained administrators will conduct the assessments at all grades.

  • Assessment administrations will be limited, whenever possible, to approximately 90 minutes to facilitate school scheduling (note, computer-delivered assessments may be scheduled for up to 120 minutes).

The various questionnaires (student, teacher, and school) are constructed to minimize respondent burden using the information gathered from pilot testing. Based on pilot test data, items that do not yield useful information or that produce redundant information can be eliminated. Also, to reduce respondent burden, NAEP may employ some or all of the following techniques:

  • maintaining items across administrations rather than piloting new ones;

  • rotating some non-required items across assessments;

  • spiraling items across respondents to provide a broader range of content coverage while minimizing individual respondent burden; and

  • reducing the number of new items piloted for existing frameworks.

Another technique for improving the response process consists of pre-testing items using cognitive interviews, item tryouts, and usability studies on a smaller number of students to maximize the success of items. These techniques allow items to be refined before they go to larger groups of students.

4. Efforts to identify duplication

The proposed assessments, including the questionnaires, do not exist in the same format or combination in the U.S. Department of Education or elsewhere. The noncognitive data gathered by NAEP comprise the only comprehensive cross-sectional survey performed regularly on a large-scale basis that can be related to extensive achievement data in the United States. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress and comparing these trends across states. None of the major non-federal studies of educational achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.

5. Burden on small businesses or other small entities

The school samples for NAEP contain small-, medium-, and large-size schools, including private schools. Schools are included in the sample proportional to their representation in the population, or as necessary to meet reporting goals. It is necessary to include small and private schools so that the students attending such schools are represented in the data collection and in the reports.

6. Consequences of collecting information less frequently

Under the National Assessment of Educational Progress Authorization Act, Congress has mandated the on-going collection of NAEP data. Failure to collect the 2014–2016 assessment data on the current schedule would affect the quality and schedule of the NAEP assessments, and would result in assessments that would not fulfill the mandate of the legislation.

7. Consistency with 5 CFR 1320.5

No special circumstances are involved. This data collection observes all requirements of 5 CFR 1320.5.

8. Consultations outside the agency

In addition to the contractors responsible for the development and administration of the NAEP assessments, the program involves many consultants and is also reviewed by specialists serving on various technical review panels. These consultants and special reviewers bring expertise concerning students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socioeconomic levels; the specific subject areas being assessed; the analysis methodologies employed; and large-scale assessment design and practices. Contractor staff and consultants have reviewed all items for bias and sensitivity issues, grade appropriateness, and appropriateness of content across states.

In particular, subject area standing committees play a central role in the development of NAEP assessment instruments and have been essential in helping create assessment content that is appropriate for the targeted populations, and that meets the expectations outlined in the Governing Board frameworks. One of the most important functions of the committees is to contribute to the validation of the assessments. Through detailed reviews of items, scoring guides, tasks, constructed-response item training sets for scorers, and other materials, the committees help establish that the assessments are accurate, accessible, fair, relevant, and grade-level appropriate and that each item measures the knowledge and skills it was designed to measure. When appropriate, members of subject area standing committees will also review the questionnaires for consistency with existing curricular and instructional practices.

The following examples of outside personnel, based on current membership, are provided in appendix B:

  • NAEP Design and Analysis Committee

  • NAEP Validity Studies Panel

  • NAEP Quality Assurance Technical Panel

  • NAEP Socio-Economic Status Panel

  • NAEP National Indian Education Study Technical Review Panel

  • NAEP Civics Standing Committee

  • NAEP Economics Standing Committee

  • NAEP Geography Standing Committee

  • NAEP Mathematics Standing Committee

  • NAEP Reading Standing Committee

  • NAEP Science Standing Committee

  • NAEP Technology and Engineering Literacy Standing Committee

  • NAEP U.S. History Standing Committee

  • NAEP Writing Standing Committee


As has been the practice for the past few years, OMB representatives will be invited to attend the technical review panel meetings that are most informative for OMB purposes.

9. Payments or gifts to respondents

In general, there will be no gifts or payments to respondents, although students do get to keep the NAEP pencils or earbuds used in the paper-and-pencil and computer-delivered assessments, respectively. On occasion, NAEP will leave educational materials behind at schools for their use (e.g., science kits from the science HOTs assessments). Schools participating in the High School Transcript Study are paid the established fee for providing student transcripts. Given that the study pays schools the prevailing rate to perform a standard service, estimates of school-level burden for that function are not included in this volume. Special studies (i.e., cognitive lab studies and background item studies) sometimes include remuneration for respondents and will be explicated in the specific-year clearance package submittals. Some schools also offer recognition parties with pizza or other perks for students who participate; however, these are not reimbursed by NCES or the contractor. If any incentives are proposed as part of a future special study, they would be justified as part of that future clearance package. In addition, as appropriate, the amounts would be consistent with amounts approved in other studies with similar conditions.

10. Assurance of confidentiality

NAEP has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the legislation (Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and the National Assessment of Educational Progress Authorization Act). Specifically for the NAEP project, this ensures that privacy, security, and confidentiality policies and procedures are in compliance with the Privacy Act of 1974 and its amendments, NCES Confidentiality Procedures, and the Department of Education ADP Security manual. The National Assessment of Educational Progress Authorization Act requires the confidentiality of personally identifiable information [20 U.S.C. §9622 (c) (3)]:

(A) IN GENERAL.-- The Commissioner for Education Statistics shall ensure that all personally identifiable information about students, their academic achievement, and their families, and that information with respect to individual schools, remains confidential, in accordance with section 552a of title 5, United States Code.

(B) PROHIBITION.-- The Assessment Board, the Commissioner for Education Statistics, and any contractor or subcontractor shall not maintain any system of records containing a student’s name, birth information, Social Security number, or parents’ name or names, or any other personally identifiable information.

Each contractor develops a Data Security Plan and NCES ensures that all current contractor policies and procedures are in compliance with all NAEP security and confidentiality requirements. In addition, all NAEP contractor staff with access to confidential NAEP information are required to sign an affidavit of nondisclosure that affirms, under severe penalty for unlawful action, that they will protect NAEP information from non-authorized access or disclosure. The affidavits are in keeping with the NCES Standard for Maintaining Confidentiality (Standard 4-2). All contractors must also comply with directive OM: 5-101, which requires that all staff with access to data protected by the Privacy Act and/or access to U.S. Department of Education systems and who will work on the contract for 30 days or more go through the security screening procedures. 

An important privacy and confidentiality issue is the protection of the identity of assessed students, their teachers, and their schools. To assure this protection, NAEP has established security procedures, described below, that closely control access to potentially identifying information.

Students’ names are submitted to the Sampling and Data Collection contractor for selecting the student sample. This list also includes the month/year of birth, race/ethnicity, gender, and a status code for students with disabilities, English language learners, and participation in the National School Lunch Program. The student sample is selected and the data for selected students are submitted to the Materials Preparation, Distribution, Processing and Scoring (MDPS) contractor, who includes the data in the Packaging and Distribution system for the production of the Administration Schedule and student labels, which are then forwarded to field staff and used to manage and facilitate the assessment. These data are also added to the School Control System (SCS) used by field staff to print materials used by the schools. Student information is deleted from the Packaging and Distribution system after the assessment begins.

All labels are removed from the assessment materials and destroyed at the schools, upon completion of the assessment. The section of the Administration Schedule with names is removed and placed in the school storage envelope. The school storage envelope contains all of the forms and materials with student names. It is kept at the school until the end of the school year and then destroyed by school personnel.

In addition to student information, teacher and principal names are collected and recorded on a roster, which is used to keep track of the distribution and collection of NAEP teacher and school questionnaires. The roster is kept at the school until questionnaires are collected. At that time, the questionnaires and the portion of the roster without names are returned for processing. The portion of the roster with names is kept at the school, in the school storage envelope, which is destroyed at the end of the school year.

The completed Administration Schedules and teacher and school questionnaire rosters (without student, teacher, or principal names) are returned to the MDPS contractor for processing.

Furthermore, to ensure the confidentiality of respondents, NAEP staff will use the following precautions:

  • Data files will not identify individual respondents.

  • No personally identifiable information, for either schools or respondents, will be gathered by or released to third parties. No permanent files of names or other direct identifiers of respondents will be maintained.

  • Student participation is voluntary.

  • NAEP data are perturbed. Data perturbation is a statistical data editing technique implemented to protect the privacy of student and school respondents to NAEP’s assessment questionnaires in assessments for which data are reported or available via restricted-use licensing arrangements with NCES. The process is coordinated in strict confidence with the IES Disclosure Review Board (DRB), with details of the process shared only with the DRB and a minimal number of contractor staff (a generic illustration of this class of technique follows this list).
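Because the actual NAEP perturbation procedure is confidential, no specifics can be shown here; the sketch below illustrates only one generic member of this class of disclosure-limitation techniques, random swapping of a small fraction of values between records, with all data and parameters hypothetical.

# Generic illustration of perturbation by random value swapping; this is
# NOT NAEP's actual (confidential) procedure, and all values are made up.
import random

def swap_perturb(values, swap_rate=0.05, seed=12345):
    """Return a copy of values with roughly swap_rate of entries swapped
    pairwise, so no released value can be tied to a particular respondent."""
    perturbed = list(values)
    n = len(perturbed)
    rng = random.Random(seed)
    for _ in range(int(n * swap_rate / 2)):
        i, j = rng.sample(range(n), 2)  # pick two records at random
        perturbed[i], perturbed[j] = perturbed[j], perturbed[i]
    return perturbed

# Hypothetical questionnaire responses on a 1-4 frequency scale.
responses = [1, 2, 2, 3, 4, 1, 3, 2, 4, 4] * 10
print(swap_perturb(responses)[:10])

Because swapping merely permutes a subset of values, the overall distribution of responses is preserved while the link between any individual record and its reported value is weakened.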

After the components of NAEP are completed in a school, neither student- nor teacher-reported data are retrievable by personal identifiers. We emphasize that confidentiality is assured for individual schools and for individual students, teachers, and principals. The following text is printed on all student, teacher, and school questionnaires:

The information you provide will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, your responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you.

In addition, parents are notified of the assessment. Appendix D includes a sample parental notification letter regarding NAEP. The letter is adapted for each grade/subject combination and the school principal may edit it. However, the information regarding confidentiality and the appropriate law reference will remain unchanged. The NAEP state coordinators⁵ work with the school coordinators in obtaining student and school participation in NAEP. A sample brochure from the NAEP state coordinators to the participating schools describing these school coordinator activities is included as appendix E.

For the HSTS component of NAEP, student transcripts are collected from schools for sampled students, and school staff members complete a School Information Form that provides general information about class periods, credits, graduation requirements, and other aspects of school policy. To maintain the privacy of student and school identities, students’ names are removed from the transcripts and questionnaires at the school, and each record is assigned a unique identification number, which is used to match the transcript records to the NAEP questionnaire and performance information on an individual basis (a minimal sketch of this linkage follows). NCES ensures that the data collected from schools and students are used for statistical purposes only.
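The matching step described above amounts to a join on the study-assigned identification number once names have been removed. The following is a minimal sketch of that idea; all record structures, field names, and values are hypothetical.

# Linking de-identified transcript records to NAEP records via a
# study-assigned ID (hypothetical IDs, field names, and values).
transcripts = {
    "T0001": {"credits": 24.5, "gpa": 3.2},
    "T0002": {"credits": 22.0, "gpa": 2.8},
}
naep_records = {
    "T0001": {"math_score": 153.4},
    "T0002": {"math_score": 147.9},
}

# Join on the shared study ID: course-taking patterns can then be related
# to NAEP performance without any personally identifiable information.
linked = {
    sid: {**transcripts[sid], **naep_records[sid]}
    for sid in transcripts.keys() & naep_records.keys()
}
print(linked["T0001"])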

11. Sensitive questions

The National Assessment of Educational Progress emphasizes voluntary respondent participation, assures confidentiality of individual responses, and avoids asking for information that might be considered sensitive or offensive. Insensitive or offensive items are prohibited by the National Assessment of Educational Progress Authorization Act, and the Governing Board reviews all items for bias and sensitivity. Throughout the item development process, NCES staff works with consultants, contractors, and internal reviewers to identify and eliminate potential bias in the items.

12. Estimation of respondent reporting burden (2014–2016)

Because the burden numbers fluctuate substantially over the three years of this collection, we will continue our practice of showing the estimate of each year’s burden rather than averaging the total over the three years. The burden numbers and respondents will be adjusted accordingly for each year, based on the final list of assessments and special studies. Exhibit 1 provides the burden information per respondent group, by grade and by year, for the 2014–2016 assessments. Exhibit 2 summarizes the burden across the three years.

A description of the respondents or study is provided below, as supporting information for Exhibit 1:

  • Students - Students in fourth, eighth, and twelfth grades complete assessment booklets or forms that contain 50 or 60 minutes of cognitive blocks, followed by noncognitive block(s) that require a total of 15 minutes to complete. The core noncognitive items are answered by all students across subject areas and are related to demographic information. In addition, students answer subject-specific noncognitive items. Additional student burden accounts for time to read directions, distribute test booklets (for paper-and-pencil assessments), and log on to the computer and view a tutorial (for computer-delivered assessments). This additional burden is estimated at 10 minutes for paper-and-pencil and 15 minutes for computer-delivered assessments. Therefore, the total burden for students is 25 minutes for paper-and-pencil and 30 minutes for computer-delivered assessments (see the worked sketch following this list).

  • Teachers - The teachers of fourth- and eighth-grade students (and twelfth-grade economics students) participating in NAEP are asked to complete questionnaires about their teaching background, education, training, and classroom organization. Fourth-grade teacher burden is estimated to be 30 minutes because fourth-grade teachers often have multiple subject-specific sections to complete. Eighth- and twelfth-grade teacher burden is 20 minutes because they only have one subject-specific section.

  • Principals/Administrators - The school administrators in the sampled schools are asked to complete a questionnaire. The core items are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. A section with subject-specific items concentrates on curriculum and instructional services issues. The burden for school administrators is estimated at 30 minutes.

  • SD and ELL Worksheets - SD and ELL worksheets are completed by school personnel for students identified as SD or ELL. These worksheets will be used to determine the appropriate accommodations for students.

  • Submission of Samples - Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the school or district level, some burden will be incurred by school personnel. It is estimated that it will take one hour for school personnel to complete the submission process. Based on recent comparable experience, 64 percent, 48.7 percent, and 70.4 percent of the schools or districts will complete the sample submission process for 2014, 2015, and 2016, respectively. Note that the estimated percent is dependent upon the nature of the sample (i.e., national or state) and the assessment program (i.e., inclusion of LTT).

  • Pre-Assessment Visit - The pre-assessment visit is the opportunity for the NCES contractor field staff to meet with the school personnel to review procedures and logistics for the upcoming assessment. It is estimated that it will take two hours for school personnel to complete the pre-assessment visit.

  • HSTS - The NAEP HSTS periodically surveys the curricula being followed in our nation’s high schools and the course-taking patterns of high school students through a collection of transcripts. This data collection requires three hours per school from a sample of approximately 800 schools.

  • National Indian Education Study (NIES) - NIES is designed to describe the condition of education for American Indian and Alaska Native (AI/AN) students in the United States. Additional questionnaires designed for NIES are given to students (estimated at 15 minutes), teachers (20 minutes), and school administrators (30 minutes).
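As noted in the Students bullet above, only the noncognitive time plus administrative overhead counts toward student burden. The arithmetic is simple enough to verify directly; a minimal sketch, with the minute values taken from the description above:

# Worked check of the per-student burden figures described above.
NONCOGNITIVE_MINUTES = 15  # noncognitive block(s), completed by all students
# Overhead: directions and booklet distribution for paper-and-pencil (P&P);
# log-on and tutorial for computer-delivered assessments (CDA).
OVERHEAD_MINUTES = {"P&P": 10, "CDA": 15}

for mode, overhead in OVERHEAD_MINUTES.items():
    total = NONCOGNITIVE_MINUTES + overhead
    print(f"{mode}: {NONCOGNITIVE_MINUTES} + {overhead} = {total} minutes")
# Prints 25 minutes for P&P and 30 minutes for CDA, matching the totals above.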


EXHIBIT 1

Estimated Burden for NAEP 2014–2016 Assessments, By Year, By Grade Level

(Note: all explanatory notes and footnotes are displayed following the 2016 table)


[Exhibit 1 tables: estimated burden for the 2014, 2015, and 2016 assessments]

Notes for all tables in Exhibit 1

Cells with cross-hashes are sub-sets or repeats of previous cells. Therefore, they are shown for row calculation purposes only and are not included in the column totals.

Light gray shaded cells are intentionally left blank because no burden is associated with that cell.

1 - Because the specific special studies are not known at this time, a placeholder is included for burden calculation purposes.

2 - Based on most recent comparable data, it is estimated that 64% of district or school personnel will perform sample selection activities in 2014, 48.7% in 2015, and 70.4% in 2016.

3 - P&P refers to paper-and-pencil assessments and CDA refers to computer-delivered assessments. All special study placeholders assume that half will be P&P and half will be CDA.

4 - SD and ELL burden is calculated as the number of hours that school personnel spend completing the SD/ELL worksheets for students identified as SD and/or ELL (estimated at 10 minutes per student). The overall SD/ELL burden is thus a function of the number of students identified as SD/ELL. Based on the most recent data, it is estimated that 23 percent of grade 4/age 9 students, 18 percent of grade 8/age 13 students, and 14 percent of grade 12/age 17 students will be identified as SD/ELL (see the sketch following these notes).

5 - Teacher and school questionnaires are not administered in association with the LTT assessments.
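Note 4 above reduces to a simple rate calculation: worksheet hours equal the number of sampled students, times the identification rate, times 10 minutes per worksheet, converted to hours. A minimal sketch using the rates from note 4 and a purely hypothetical sample size:

# SD/ELL worksheet burden per note 4: 10 minutes per identified student.
MINUTES_PER_WORKSHEET = 10
IDENTIFICATION_RATE = {
    "grade 4 / age 9": 0.23,
    "grade 8 / age 13": 0.18,
    "grade 12 / age 17": 0.14,
}

def sd_ell_hours(sampled_students, rate):
    """Hours of school-personnel burden for a given sample size and rate."""
    return sampled_students * rate * MINUTES_PER_WORKSHEET / 60

# Hypothetical: 10,000 sampled students at each grade/age level.
for level, rate in IDENTIFICATION_RATE.items():
    print(f"{level}: {sd_ell_hours(10_000, rate):,.0f} hours")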

EXHIBIT 2

Total Estimated Burden for NAEP 2014–2016 Assessments


13. Cost to respondents

There are no direct costs to respondents.

14. Estimates of cost to the federal government

The total cost to the federal government for the administrations of the 2014–2016 activities is estimated to be approximately $48.15 million. The cost estimate is broken down as follows:

  • $13.25 million for the printing, packaging, and distribution phases of the administrations;

  • $27.1 million for the cost for the field supervisors and data collectors to go into schools to administer the 2014–2016 assessments, including travel expenses and testing equipment costs; and

  • $7.8 million for web operations and maintenance costs related to the support of the computer-delivered assessments.

15. Reasons for changes in burden (from last System Clearance submittal)

The nature of NAEP is that burden alternates between relatively low burden in national-level administration years (i.e., even years) and substantially higher burden in state-level administration years, which include one or more assessments that support national, state-by-state, and certain urban district reporting (i.e., odd years). In state/district assessment years, NAEP samples approximately 1,000,000 students; in national-only assessment years, approximately 100,000 students. In 2015, NAEP will conduct state/district assessments, and in 2014 and 2016, national-level assessments. The previous System Clearance included burden for two state/district assessments (2011 and 2013) and only one national-level assessment (2012); therefore, the overall number of respondents and responses is lower in this clearance request than in the previous one. However, because we have adjusted the estimated respondent time for students and for the pre-assessment visit, the overall time burden estimate is slightly higher in this System Clearance than in the previous one. Previously, only the time spent responding to the noncognitive questionnaires was included in the total burden calculation for students; now we also include the time for instructions and booklet distribution, which increases the per-student burden estimate by 10 minutes for paper-and-pencil assessments and by 15 minutes for computer-delivered assessments. Finally, recent reports from the field staff have indicated that the pre-assessment visits require two hours, rather than the one hour previously estimated. Therefore, we have adjusted the burden estimate for the pre-assessment visit to two hours.

16. Time schedule for data collection

The time schedule for the data collection for the 2014–2016 assessments is shown below.

2014: Main NAEP, January–March 2014

2015: Main NAEP, January–March 2015

2016: Long-Term Trend, October 2015–May 2016; Main NAEP, January–March 2016



17. Approval for not displaying OMB approval expiration date

No exception is requested.

18. Exceptions to Certification Statement

No exception is requested.

¹ As described in section 1b, main NAEP assesses students in grades 4, 8, and 12, and long-term trend NAEP assesses students at ages 9, 13, and 17.

² The current contract expires on March 26, 2013. A new contract will be awarded prior to that date.

³ The Governing Board assessment schedule can be found at http://www.nagb.org/naep/assessment-schedule.htm.

⁴ TEL, science ICTs, and writing will be administered by computer. All other assessments are currently planned as paper-and-pencil administrations.

⁵ The NAEP State Coordinator serves as the liaison between the state education agency and NAEP, coordinating NAEP activities in his or her state.
