
National Center for Education Statistics

National Assessment of Educational Progress







National Assessment of Educational Progress (NAEP) 2021 Schools and Teacher Questionnaire Special Study



Supporting Statement

Part A




OMB# 1850-0956 v.2








March 2021


Note: OMB# 1850-0956 announced the postponement of the 2021 main NAEP student assessments. This Amendment (OMB# 1850-0956 v.2) is being submitted to continue to collect voluntary responses from teachers and schools via online survey questionnaires to capture their experiences during the COVID-19 outbreak and its impact on the 2019-2020 and current school years. NCES is requesting an Emergency Clearance for this Amendment due to the impact of the COVID-19 outbreak on the 2021 NAEP test administration.




A.1. Circumstances Making the Collection of Information Necessary

A.1.a. Purpose of Submission

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, and technology and engineering literacy (TEL).

NAEP is conducted by the National Center for Education Statistics (NCES) in the Institute of Education Sciences of the U.S. Department of Education. As such, NCES is responsible for designing and executing the assessment, including designing the assessment procedures and methodology, developing the assessment content, selecting the final assessment content, sampling schools and students, recruiting schools, administering the assessment, scoring student responses, determining the analysis procedures, analyzing the data, and reporting the results.1

The National Assessment Governing Board (henceforth referred to as the Governing Board), appointed by the Secretary of Education but independent of the Department, is a bipartisan group whose members include governors, state legislators, local and state school officials, educators, business representatives, and members of the general public. The Governing Board sets policy for NAEP and is responsible for developing the frameworks and test specifications that serve as the blueprint for the assessments.

The NAEP assessments contain diverse items such as “cognitive” assessment items, which measure what students know and can do in an academic subject, and “survey” or “non-cognitive” items, which gather information such as demographic variables, as well as construct-related information, such as courses taken. The survey portion includes a collection of data from students, teachers, and school administrators. Since NAEP assessments are administered uniformly using the same sets of test booklets across the nation, NAEP results serve as a common metric for all states and select urban districts. The assessment stays essentially the same from year to year, with only carefully documented changes. This permits NAEP to provide a clear picture of student academic progress over time.

NAEP consists of two assessment programs: the NAEP long-term trend (LTT) assessment and the main NAEP assessment. The LTT assessments are given at the national level only and are administered to students at ages 9, 13, and 17 in a manner that is very different from that used for the main NAEP assessments. LTT reports mathematics and reading results that present trend data since the 1970s. LTT was last administered in 2020 for ages 9 and 13, but due to the COVID-19 outbreak and school closures, NCES decided to delay the age 17 administration. This submission only covers the NAEP 2021 School and Teacher Questionnaire Special Study, given the cancellation of the main NAEP student assessments.

During a typical NAEP administration, the possible universe of student respondents is estimated to be 12 million at grades 4, 8, and 12 for main NAEP, and at ages 9, 13, and 17 for Long-Term Trend (LTT), attending the approximately 154,000 public and private elementary and secondary schools in 50 states and the District of Columbia, and including Bureau of Indian Education and Department of Defense Education Activity (DoDEA) Schools. Note that territories, including Puerto Rico, are not included in the national samples. For 2021, NCES will be collecting survey questionnaire responses from teachers and schools who volunteer to participate. Due to the cancellation of the main NAEP student assessments, there will be no student respondents.

During a typical NAEP administration, NAEP provides results on subject-matter achievement, instructional experiences, and school environment for populations of students (e.g., all fourth-graders) and groups within those populations (e.g., female students, Hispanic students). NAEP does not provide scores for individual students or schools. The main NAEP assessments report current achievement levels and trends in student achievement at grades 4, 8, and 12 for the nation and, for certain assessments (e.g., reading and mathematics), states and select urban districts. The Trial Urban District Assessment (TUDA) is a special project developed to determine the feasibility of reporting district-level results for large urban districts. Currently, the following 27 districts participate in the TUDA program: Albuquerque, Atlanta, Austin, Baltimore City, Boston, Charlotte, Chicago, Clark County (NV), Cleveland, Dallas, Denver, Detroit, District of Columbia (DCPS), Duval County (FL), Fort Worth, Fresno, Guilford County (NC), Hillsborough County (FL), Houston, Jefferson County (KY), Los Angeles, Miami-Dade, Milwaukee, New York City, Philadelphia, San Diego, and Shelby County (TN).

Previously, the NAEP 2021 assessments were approved (OMB# 1850-0928 v.21). Subsequently, the Commissioner postponed the 2021 main NAEP student assessments (see the Commissioner’s note on the NAEP website, https://nces.ed.gov/whatsnew/commissioner/remarks2020/11_25_2020.asp), and an announcement was made in the Emergency Clearance Package (OMB# 1850-0956). This current Amendment reflects NCES’ decision to continue to collect voluntary responses from teachers and schools via online survey questionnaires. Collecting this information will allow NCES to capture data about educational experiences during the COVID-19 outbreak and its impact on the 2019-2020 and current school years. Some of the assessment, questionnaire, and recruitment materials are translated into Spanish. In years in which NAEP is administered in Puerto Rico, such as 2021, Spanish versions of communication materials for teachers and school coordinators, as well as teacher and school questionnaires, are created by translating their English equivalents into Spanish. The final versions of the teacher and school survey questionnaires, as well as the Spanish translations, were approved previously in Amendment v.21 (OMB# 1850-0928), although they have been edited for this special administration.


A.1.b. Legislative Authorization

In the current legislation that reauthorized NAEP, the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622), Congress mandates the collection of national education survey data through a national assessment program:

  1. ESTABLISHMENT- The Commissioner for Education Statistics shall, with the advice of the Assessment Board established under section 302, carry out, through grants, contracts, or cooperative agreements with one or more qualified organizations, or consortia thereof, a National Assessment of Educational Progress, which collectively refers to a national assessment, State assessments, and a long-term trend assessment in reading and mathematics.

  2. PURPOSE; STATE ASSESSMENTS-

(1) PURPOSE- The purpose of this section is to provide, in a timely manner, a fair and accurate measurement of student academic achievement and reporting of trends in such achievement in reading, mathematics, and other subject matter as specified in this section.

The National Assessment of Educational Progress Authorization Act also requires the assessment to collect data on specified student groups and characteristics, including information organized by race/ethnicity, gender, socio-economic status, disability, and English language learners. This allows for the fair and accurate presentation of achievement data and permits the collection of background, non-cognitive, or descriptive information that is related to academic achievement and aids in the fair reporting of results. The intent of the law is to provide representative sample data on student achievement for the nation, the states, and a variety of populations of students, and to monitor progress over time.

The statute and regulation mandating or authorizing the collection of this information can be found at https://www.law.cornell.edu/uscode/text/20/9622.

A.1.c. Overview of NAEP Assessments

This section provides a broad overview of NAEP assessments, including information on the assessment frameworks, the cognitive and survey items, inclusion policies, the transition to digitally based assessments (DBA), and the assessment types.

A.1.c.1. NAEP Frameworks

NAEP assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. Frameworks capture a range of subject-specific content and thinking skills needed by students in order to deal with the complex issues they encounter inside and outside their classrooms. The NAEP frameworks are determined through a development process that ensures they are appropriate for current educational requirements. Because the assessments must remain flexible to mirror changes in educational objectives and curricula, the frameworks must be forward-looking and responsive, balancing current teaching practices with research findings.

NAEP frameworks can serve as guidelines for planning assessments or revising curricula. They also can provide information on skills appropriate to grades 4, 8, and 12 and can be models for measuring these skills in innovative ways. The subject-area frameworks evolve to match instructional practices. Developing a framework generally involves the following steps:

  • widespread participation and reviews by educators and state education officials;

  • reviews by steering committees whose members represent policymakers, practitioners, and members of the general public;

  • involvement of subject supervisors from education agencies;

  • public hearings; and

  • reviews by scholars in the field, by NCES staff, and by a policy advisory panel.

The frameworks can be found at https://www.nagb.gov/naep-frameworks/frameworks-overview.html.

A.1.c.2. Cognitive Item Development

As part of the item development process, NCES calls on many constituents to guide the process and review the assessment. Item development is guided by a multi-year design plan, which is guided by the framework and establishes the design principles, priorities, schedules, and reporting goals for each subject. Based on this plan, the NAEP contractor creates a development plan outlining the item inventory and objectives for new items and then begins the development process by developing more items than are needed. This item pool is then subjected to:

  • internal contractor review with content experts, teachers, and experts on political sensitivity and bias;

  • playtesting, tryouts, or cognitive interviews with small groups of students for select items (particularly those that have new item types, formats, or challenging content); and

  • refinement of items and scoring rubrics under NCES guidance.


Next, a standing committee of content experts, state and local education agency representatives, teachers, and representatives of professional associations reviews the items. The standing committee considers:

  • the appropriateness of the items for the particular grade;

  • the representative nature of the item set;

  • the compatibility of the items with the framework and test specifications; and

  • the quality of items and scoring rubrics.

For state-level assessments, this may be followed by a state item review where further feedback is received. Items are then revised and submitted to NCES and the Governing Board Assessment Development Committee for approval prior to pilot testing.

The pilot test is used to finalize the testing instrument. Items may be dropped from consideration or move forward to the operational assessment. The item set is once again subjected to review by the standing committee and NCES following generally the same procedure described above. A final set of test items is then assembled for NCES and the Governing Board’s review and approval. After the operational assessment, items are once again examined. In rare cases where item statistics indicate problems, the item may be dropped from the assessment. The remaining items are secured for reuse in future assessments, with a subset of those items publicly released.

A.1.c.3. Survey Items

In addition to assessing subject-area achievement, NAEP collects information that serves to fulfill the reporting requirements of the federal legislation and to provide context for the reporting of student performance. The legislation requires that, whenever feasible, NAEP includes information on special groups (e.g., information reported by race, ethnicity, socio-economic status, gender, disability, and limited English proficiency).

As part of most NAEP data collections, in addition to student assessments, three types of questionnaires are used to collect information: student, teacher, and school (including the new COVID-19 items for each respondent type). For 2021, only the teacher and school questionnaires will be administered; students will not participate in either an assessment or a questionnaire. An overview of the questionnaires is presented below, and additional information about the content of the questionnaires is presented in Part C.

Student Questionnaires

In a typical NAEP administration year, each NAEP student assessment booklet includes non-cognitive items, also known as the student questionnaire. The questionnaires appear in separately timed blocks of items in the assessment forms. The items collect information on students’ demographic characteristics, classroom experiences, and educational support. Students’ responses provide data that give context to NAEP results and/or allow researchers to track factors associated with academic achievement. Students complete the questionnaires voluntarily (see Section A.10 for confidentiality provisions). Student names are never reported with their responses or with the other information collected by NAEP.

Each student questionnaire includes three types of items:

  • General student information: Student responses to these items are used to collect information about factors such as race or ethnicity and parents’ education level. Answers on the questionnaires also provide information about factors associated with academic performance, including homework habits, the language spoken in the home, and the number of books in the home.

  • Other contextual/policy information: These items focus on students’ educational settings and experiences and collect information about students’ attendance (i.e., days absent), family discourse (i.e., talking about school at home), reading load (i.e., pages read per day), and exposure to English in the home. There are also items that ask about students’ effort on the assessment and the difficulty of the assessment. Answers on the questionnaires provide information on how aspects of education and educational resources are distributed among different groups.

  • Subject-specific information: In most NAEP administrations, these items cover three categories of information: (1) time spent studying the subject; (2) instructional experiences in the subject; and (3) student factors (e.g., effort, confidence) related to the subject and the assessment.

As described above, students are not participating in NAEP 2021, and therefore no student questionnaires will be part of NAEP 2021.

Teacher Questionnaires

To provide supplemental information about the instructional experiences reported by students, teachers are asked to complete an online questionnaire, via NAEPq, about their instructional practices, classroom organization, teaching background and training, and the subject in which students are being assessed. While completion of the questionnaire is voluntary, NAEP encourages teachers’ participation since their responses improve the accuracy and completeness of the NAEP assessment.

Teacher questionnaires are typically only given to teachers at grades 4 and 8; NAEP typically does not collect teacher information for grade 12. By grade 12, there is such variation in student course-taking experiences that students cannot be matched to individual teachers for each tested subject. For example, a student may not be taking a mathematics class in grade 12, so he or she cannot be matched to a teacher. Conversely, a student could be taking two mathematics classes at grade 12 and have multiple teachers related to mathematics. Only an economics teacher questionnaire has been developed and administered at grade 12. However, these data were not released (with either the 2006 or the 2012 results) due to a student-teacher match rate below statistical standards.2

Teacher questionnaires are organized into different parts. The first part covers background and general training and includes items concerning years of teaching experience, certifications, degrees, major and minor fields of study, coursework in education, coursework in specific subject areas, the amount of in-service training, the extent of control over instructional issues, and the availability of resources for the classroom. Subsequent parts tend to cover training in the subject area, classroom instructional information, and teacher exposure to issues related to the subject and the teaching of the subject; they also ask about pre- and in-service training, the ability level of the students in the class, the length of homework assignments, the use of particular resources, and how students are assigned to particular classes. For the NAEP 2021 School and Teacher Questionnaire Special Study, the teacher questionnaires have been expanded to include COVID-19 questions that ask about the impact of the outbreak on teachers’ instruction.

School Questionnaires

The school questionnaire provides supplemental information about school factors that may influence students’ achievement. It is given to the principal or another official of each school that participates in the NAEP assessment. While schools’ completion of the questionnaire is voluntary, NAEP encourages schools’ participation since it makes the NAEP assessment more accurate and complete.

The school questionnaire is accessed online through NAEPq and is organized into different parts. The first part tends to cover characteristics of the school, including the length of the school day and year, school enrollment, absenteeism, dropout rates, and the size and composition of the teaching staff. Subsequent parts of the school questionnaire tend to cover tracking policies, curricula, testing practices, special priorities, and schoolwide programs and problems. The questionnaire also collects information about the availability of resources, policies for parental involvement, special services, and community services.

The supplemental charter school questionnaire, designed to collect information on charter school policies and characteristics, is provided to administrators of charter schools that are sampled to participate in NAEP. The supplement covers organization and school governance, parental involvement, and curriculum and offerings.

Development of Survey Items

The Background Information Framework and the Governing Board’s Policy on the Collection and Reporting of Background Data (located at https://www.nagb.gov/content/nagb/assets/documents/policies/collection-report-backg-data.pdf) guide the collection and reporting of non-cognitive assessment information. In addition, subject-area frameworks provide guidance on subject-specific, non-cognitive assessment questions to be included in the questionnaires. The development process is very similar to that for the cognitive items, including review of the existing item pool; development of more items than are intended for use; review by experts (including the standing committee); and cognitive interviews with students, teachers, and school staff. When developing the questionnaires, NAEP uses a pretesting process so that the final questions are minimally intrusive or sensitive, are grounded in educational research, and yield answers that provide information relevant to the subject being assessed.

In the web-based NAEP Data Explorer,3 (located at https://www.nationsreportcard.gov/ndecore/landing) the results of the questionnaires are sorted into eight broad categories: Major Reporting Groups, Student Factors, Factors Beyond School, Instructional Content and Practice, Teacher Factors, School Factors, Community Factors, and Government Factors.

To minimize burden on the respondents and maximize the constructs addressed via the questionnaires, NAEP may spiral items across respondents and/or rotate some non-required items across assessment administrations. The final versions of survey items for the NAEP 2021 School and Teacher Questionnaire Special Study, for each subject and respondent, are included in the Appendix J series. Not all of the items presented will be given to an individual respondent or in a specific administration. The final teacher and school versions of the 2021 questionnaires are provided in this submission (see Appendix J-2 for teacher survey items and Appendix J-3 for school survey items).

A.1.c.4. Inclusion in NAEP

During a typical NAEP administration, it is important for NAEP to assess as many students selected to participate as possible. Assessing representative samples of students, including students with disabilities (SD) and English language learners (ELL), helps to ensure that NAEP results accurately reflect the educational performance of all students in the target population and can continue to serve as a meaningful measure of U.S. students’ academic achievement over time.

The National Assessment Governing Board, which sets policy for NAEP, has been exploring ways to ensure that NAEP continues to appropriately include as many students as possible and to do so in a consistent manner for all jurisdictions assessed and reported on. In March 2010, the Governing Board adopted a policy, NAEP Testing and Reporting on Students with Disabilities and English Language Learners (located at https://www.nagb.gov/content/nagb/assets/documents/policies/naep_testandreport_studentswithdisabilities.pdf). This policy was the culmination of work with experts in testing and curriculum and those who work with exceptional children and students learning to speak English. The policy aims to:

  • maximize participation of sampled students in NAEP;

  • reduce variation in exclusion rates for SD and ELL students across states and districts;

  • develop uniform national rules for including students in NAEP; and

  • ensure that NAEP is fully representative of SD and ELL students.

The policy defines specific inclusion goals for NAEP samples. At the national, state, and district levels, the goal is to include 95 percent of all students selected for the NAEP samples, and 85 percent of those in the NAEP sample who are identified as SD or ELL.

Students are selected to participate in NAEP based on a sampling procedure4 designed to yield a sample of students that is representative of students in all schools nationwide and in public schools within each state. First, schools are selected, and then students are sampled from within those schools without regard to disability or English language proficiency. Once students are selected, those previously identified as SD or ELL may be offered accommodations or excluded.
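To make the two-stage structure of this procedure concrete, the following is a minimal, illustrative Python sketch: schools are drawn first, then students are drawn within the selected schools. The school roster, the probability-proportional-to-enrollment draw, and the within-school sample sizes are assumptions for illustration only and do not reproduce NAEP’s actual sampling design.

```python
import random

random.seed(0)

# Hypothetical roster of schools; enrollment counts are made up for illustration.
schools = [{"id": f"school_{i:03d}", "enrollment": random.randint(50, 900)}
           for i in range(200)]

def sample_schools(frame, n_schools):
    """Stage 1: draw schools with probability proportional to enrollment.
    (Illustrative draw with replacement; operational PPS sampling is more involved.)"""
    weights = [s["enrollment"] for s in frame]
    return random.choices(frame, weights=weights, k=n_schools)

def sample_students(school, n_students):
    """Stage 2: draw students within a selected school, without regard to SD or ELL status."""
    roster = [f'{school["id"]}_student_{j:04d}' for j in range(school["enrollment"])]
    return random.sample(roster, min(n_students, len(roster)))

selected_schools = sample_schools(schools, n_schools=10)
student_sample = [sid for sch in selected_schools
                  for sid in sample_students(sch, n_students=25)]
print(f"{len(student_sample)} students sampled from {len(selected_schools)} schools")
```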

Accommodations in the testing environment or administration procedures are provided for SD and ELL students. Some examples of accommodations permitted by NAEP are extra time, testing in small-group or one-on-one sessions, reading aloud to a student, and scribing a student’s responses. Some examples of testing accommodations not allowed are giving the reading assessment in a language other than English or reading the passages in the reading assessment aloud to the student.

States and jurisdictions vary in their proportions of special-needs students and in their policies on inclusion and the use of accommodations. Despite the increasing identification of SD and ELL students in some states, in particular of ELL students at grade 4, NAEP inclusion rates have generally remained steady or increased since 2003. This reflects efforts on the part of states and jurisdictions to include all students who can meaningfully participate in the NAEP assessments. The NAEP inclusion policy is an effort to ensure that this trend continues.

A.1.c.5. Digitally Based Assessments (DBA)

Virtually all of our nation’s schools are equipped with computers, and an increasing number of schools are making digital tools an integral component of the learning environment, reflecting that the knowledge and skills needed for future post-secondary success involve the use of new technologies. NAEP is evolving to address the changing educational landscape through its transition to DBA.

NAEP DBA use current technology, and as technology evolves, so will the nature of delivery of the assessments. During a typical NAEP administration, NAEP administers the digital assessments on tablets, which NAEP field staff bring into the schools.5 Other administration models may be considered in the future, including the use of school equipment or a combination of approaches.

DBA allow NAEP to:

  • more accurately reflect what is happening in today’s classrooms;

  • improve measurement of knowledge and skills; and

  • collect new types of data that provide depth in our understanding of what students know and can do, including how they engage with new technologies to approach problem solving.

Leveraging New Technologies

NAEP DBA uses new testing methods and item types that reflect the growing use of technology in education. Examples of such new item types include:

  • Multimedia elements, such as videos and audio clips: The NAEP computer-based writing assessment, administered in 2011 at grades 8 and 12, made use of multimedia. These elements have been incorporated into other NAEP DBA as well. The 2011 writing tasks were presented to students on computers in a variety of ways, including text, audio, photographs, video, and animation. Examples of these tasks are available at http://www.nationsreportcard.gov/writing_2011/sample_quest.aspx.

  • Interactive items and tools: Some questions may allow the use of embedded technological features to form a response. For example, students may use “drag and drop” functionality to place labels on a graphic or may tap an area or zone on the screen to make a selection. Other questions may involve the use of digital tools. In the mathematics DBA, an online calculator is available for students to use when responding to some items. An equation editor is also provided for the entry of mathematical expressions and equations, and other digital tools, such as rulers, data graph builders, and function graphers, have been incorporated, with additional tools for gauging students’ mathematical skills under exploration. Students are shown how to use many of these interactive features and tools in the brief tutorials that are included at the beginning of each NAEP DBA. The 2019 tutorial is available at https://enaep-public.naepims.org/2019/english.html.

  • Immersive scenario-based tasks: Scenario-based tasks use multimedia features and tools to engage students in rich, authentic problem-solving contexts. NAEP’s first scenario-based tasks were administered in 2009, when students at grades 4, 8, and 12 were assessed with interactive computer tasks in science. The science tasks asked students to solve scientific problems and perform experiments, often by simulation. They provide students more opportunities than a paper-based assessment (PBA) to demonstrate skills involved in doing science without many of the logistical constraints associated with a natural or laboratory setting. The science tasks administered in 2009 can be explored at http://www.nationsreportcard.gov/science_2009/ict_summary.aspx. NAEP also administered scenario-based tasks in the 2014 technology and engineering literacy (TEL) assessment, where students were challenged to work through computer simulations of real-world situations they might encounter in their everyday lives. A sample TEL task can be viewed at http://nces.ed.gov/nationsreportcard/tel/wells_item.aspx. NAEP is continuing to expand the use of scenario-based tasks to measure knowledge and skills in other subject areas such as mathematics and reading.

In addition to new item types, the transition to DBA makes it possible for NAEP to employ an adaptive testing design, in which assessment content is targeted to a student’s ability based on performance during the test administration. Thus, students see items that are tailored to their ability levels, and they may be more likely to be able to engage in the assessment and demonstrate what they know and can do. The goal of implementing adaptive testing is to achieve better measurement of student knowledge and skills across the wide range of student performance levels on which NAEP reports. NAEP is considering using adaptive testing initially in the mathematics DBA and possibly in other NAEP assessments in the future.

The type of adaptive testing being considered for NAEP is a multi-stage test (MST) design that uses two stages. Students take two sections of cognitive items, just as in past NAEP administrations. Based on their performance on the first section of items, students receive a second section of items that is targeted to their ability level. For example, students who do not perform well on the first section receive a second section composed of somewhat easier items. The implementation of this two-stage MST design for NAEP mathematics grades 4 and 8 has been informed by previous research on the benefits, applicability, and feasibility of adaptive testing for NAEP. In particular, in 2011 NAEP conducted the mathematics computer-based study, which evaluated the use of a two-stage MST design for the grade 8 mathematics assessment.6 In addition, the 2015 Stage 1 pilots in mathematics and science also incorporated an MST design. Finally, an MST mathematics study was conducted in 2017 (approved in August 2016, OMB# 1850-0928 v.1), which informed the operational MST design for the 2019 assessment. Prior to adopting an MST design in other subject areas/grades, additional testing will be conducted for each subject area/grade. There is no MST component in the 2021 administration, but we will continue to consider it for future administrations.
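As a simple illustration of the two-stage routing idea described above, the sketch below assigns a second-stage block from a first-stage raw score. The cutoff value and block labels are illustrative assumptions, not NAEP’s operational routing rules.

```python
def route_second_stage(first_stage_correct: int, cutoff: int = 10) -> str:
    """Assign a second-stage item block based on first-stage performance.
    The cutoff and block names are hypothetical, for illustration only."""
    return "harder_block" if first_stage_correct >= cutoff else "easier_block"

# Example: a student answering 6 first-stage items correctly is routed to the
# easier block; a student answering 14 correctly is routed to the harder block.
print(route_second_stage(6))   # easier_block
print(route_second_stage(14))  # harder_block
```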

The DBA technology allows NAEP to capture information about what students do while attempting to answer questions. While PBA only yields the final responses in the test booklet, DBA capture actions students perform while interacting with the assessment tasks, as well as the time at which students take these actions. These student interactions with the assessment interface are generally not used to assess students’ knowledge and skills, but rather this information might be used to provide context for student performance. For example, more proficient students may use digital tools such as the calculator in mathematics or the spell-checker in writing assessments, compared to less proficient students. As such, NAEP will potentially uncover more information about which actions students use when they successfully (or unsuccessfully) answer specific questions on the assessment. Unless specifically required by the scoring rubrics, process data are not scored; they are primarily used for improving assessment design and for providing contexts for interpreting reported scores.

NAEP will capture the following actions in the DBA, although not all actions will be captured for all assessments:

  • Student navigation (e.g., clicking back/next; clicking on the progress navigator; clicking to leave a section);

  • Student use of tools (e.g., zooming; using text to speech; turning on scratchwork mode; using the highlighter tool; opening the calculator; using the equation editor; clicking the change language button);

  • Student responses (e.g., clicking a choice; eliminating a choice; clearing an answer; keystroke log of student typed text);

  • Writing interface (e.g., expanding the response field; collapsing the prompt; using keyboard commands such as CTRL+C to copy text; clicking buttons on the toolbar such as using the bold or undo button);

  • Other student events (e.g., vertical and horizontal scrolling; media interaction such as playing an audio stimulus);

  • Tutorial events (records student interactions with the tutorial such as correctly following the instructions of the tutorial; incorrectly following the instructions of the tutorial; or not interacting with the tutorial when prompted); and

  • Scratchwork canvas (the system saves an image of the final scratchwork canvas for each item where the scratchwork tool is available).
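For illustration, one way a single captured action from the list above might be represented is as a timestamped event record. The sketch below shows a hypothetical format; the field names and values are assumptions and do not reflect the actual NAEP process-data specification.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class ProcessEvent:
    """Hypothetical record of one student action captured during a DBA session."""
    session_id: str   # anonymized session identifier
    item_id: str      # item on screen when the action occurred
    event_type: str   # e.g., "navigation", "tool_use", "response"
    detail: str       # e.g., "clicked_next", "opened_calculator", "selected_choice_B"
    timestamp: str    # ISO-8601 time of the action

event = ProcessEvent(
    session_id="S-000123",
    item_id="M8-042",
    event_type="tool_use",
    detail="opened_calculator",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(json.dumps(asdict(event)))
```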

Development of Digitally Based Assessments (DBA)

NAEP’s item and system development processes include several types of activities that help to ensure that our DBA measure the subject-area knowledge and skills outlined in the NAEP frameworks and not students’ ability to use the tablet or the particular software and digital tools included in the DBA.

During item development, new digitally-based item types and tasks are studied and pretested with diverse groups of students. The purpose of these pretesting activities is to determine whether construct-irrelevant features, such as confusing wording, unfamiliar interactivity or contexts, or other factors, prevent students from demonstrating the targeted knowledge, skills, and abilities. Such activities help identify usability, design, and validity issues so that items and tasks may be further revised and refined prior to administration.

Development of the assessment delivery system, including the interface that students interact with when taking NAEP DBA, is informed by best practices in user experience design. Decisions about the availability, appearance, and functionality of system features and tools are also made based on the results of usability testing with students.

To help ensure that students know how to use the assessment system and tools, each administration of a NAEP DBA begins with a brief interactive tutorial that teaches students how to use the system features to take the assessment. Students actively engage with the tutorial, as they are asked to use specific tools and features. Help screens are also built into the system, and students can access them at any time while taking the assessment. The 2019 tutorial is available at https://enaep-public.naepims.org/2019/english.html.

Accommodations and Universal Design Features with DBA

New technologies are improving NAEP’s ability to offer accommodations that increase participation and provide universal access to students of all learning backgrounds, including students with disabilities and English language learners. In a digital environment, what used to be an accommodation for PBA becomes a seamless part of universal design, available to all students. This means that features such as adjusting font size, having test items read aloud in English (text-to-speech), switching the testing interface between higher- and lower-contrast displays, using a highlighter tool, and eliminating answer choices are available to all students during the test administration.

In addition to these universal design features, NAEP also continues to offer accommodations to students whose Individualized Education Programs (IEPs), Section 504 plans, or English language learner (ELL) plans require them. Some accommodations are available in the testing system (such as additional time, a magnification tool, or a Spanish/English version of the test), while others are provided by the test administrator or the school (such as breaks during testing, sign language interpretation of the test, or a bilingual dictionary). Section B.2.b provides more information on the classification of students and the assignment of accommodations.

A.1.c.6. Assessment Types

NAEP uses three types of assessment activities, which may simultaneously be in the field during any given data collection effort. Each is described in more detail below.

Operational Assessments

Operational NAEP administrations, unlike pilot administrations, collect data to publicly report on the educational achievement of students as required by federal law. The NAEP results are reported in The Nation’s Report Card (http://nationsreportcard.gov/), which is used by policymakers, state and local educators, principals, teachers, and parents to inform educational policy decisions.

Pilot Assessments

Pilot testing (also known as field testing) of cognitive and non-cognitive items is carried out in all subject areas. Pilot assessments are usually conducted in conjunction with operational assessments and use the same procedures as the operational assessments. The purpose of pilot testing is to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because the items are administered to a small nationally representative sample of students, and data are gathered about performance that crosses the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration.

Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB #1850-0803). All non-cognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, in order to test out new item types or formats, or challenging content. In addition, usability testing is conducted on new technologies and technology-based platforms and instruments.

Special Studies

Special studies are an opportunity for NAEP to investigate particular aspects of the assessment without impacting the reporting of NAEP results. Previous special studies have focused on linking NAEP to other assessments or linking across NAEP same-subject frameworks, investigating the expansion of the item pool, evaluating specific accommodations, investigating administration modes (such as DBA alternatives), and providing targeted data on specific student populations.

In addition to the overarching goal of NAEP to provide data about student achievement at the national, state, and district levels, NAEP also provides specially targeted data on an as-needed basis. At times, this may only mean that a special analysis of the existing data is necessary. At other times, this may include the addition of a short, add-on questionnaire targeted at specified groups. For example, in the past, additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study (NIES) that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These types of special studies are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.

A.1.d. Overview of 2021 NAEP Assessments

The Governing Board determines NAEP policy and the assessment schedule,7 and future Governing Board decisions may result in changes to the plans represented here. Any changes will be presented in subsequent clearance packages or revisions to the current package. Due to the COVID-19 outbreak, NCES has announced to the public the decision to postpone the student assessments. The public can access this information via the NCES website at https://nces.ed.gov/nationsreportcard/about/covid19.aspx. While the student assessment and questionnaire have been postponed, NCES will still collect survey information from teachers and school administrators.


The NAEP 2021 School and Teacher Questionnaire Special Study will consist of administering an online version of the teacher and school survey questionnaires to all schools originally sampled for the 2021 student assessment.

A.2. How, by Whom, and for What Purpose the Data Will Be Used

While the main operational NAEP administration was postponed in 2021, a special data collection is being undertaken to understand the impact of the COVID-19 pandemic on education. More specifically, the goal of this NAEP 2021 School and Teacher Questionnaire Special Study is to fill critical gaps in current knowledge about the impact of the COVID-19 pandemic on education across regions, TUDA districts, and states, and to understand how the following topics affect education across the nation:


  • Technology use and access

  • Resources for learning and instruction

  • Organization of instruction

  • Teacher preparation

  • Self-efficacy (related to delivering remote instruction)


A main goal is to understand the differential impacts of the COVID-19 outbreak on the instruction of students (i.e., organization of school instruction during the 2019-2020 and 2020-2021 school years, preparations and actions taken by teachers and schools to support students, and variations in instructional practices).


The results will be presented as the percentages of schools or teachers responding to each option for each questionnaire item. In addition to an NCES report, percentage data are likely to be made available in a special component of the NAEP Data Explorer, which will enable users to cross-tabulate variables. Results will be reported at the jurisdiction level (state and/or TUDA), and a national-level summary will be created if enough states participate. NCES will conduct analyses to explore the data for possible findings of interest to report. It will be possible to analyze data by type of location (e.g., urban, rural) as well as by other school characteristics, such as the demographics of schools’ student populations (e.g., percentage of students by race/ethnicity or eligibility for the National School Lunch Program [NSLP]). All analyses will be contingent on participation rates and minimum sample size requirements.
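As a rough sketch of the kind of tabulation described above, the Python example below computes the percentage of responding teachers selecting each option of a single item, cross-tabulated by jurisdiction. The data, column names, and response options are hypothetical, and the sketch omits the sampling weights and minimum-sample-size rules that the actual analyses would apply.

```python
import pandas as pd

# Hypothetical respondent-level data: one row per teacher response to one item.
responses = pd.DataFrame({
    "jurisdiction": ["CA", "CA", "TX", "TX", "TX", "NY"],
    "locale":       ["urban", "rural", "urban", "urban", "rural", "urban"],
    "remote_instruction": ["fully_remote", "hybrid", "hybrid",
                           "fully_remote", "in_person", "hybrid"],
})

# Percentage of responding teachers selecting each option, by jurisdiction.
pct_by_jurisdiction = (
    pd.crosstab(responses["jurisdiction"], responses["remote_instruction"],
                normalize="index") * 100
).round(1)
print(pct_by_jurisdiction)
```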


Results from each typical NAEP assessment are provided online in an interactive website (https://www.nationsreportcard.gov/) and in one-page summary reports, called snapshots, for each participating state or urban district. Additional data tools are also available online.

In addition to contributing to the reporting tools mentioned above, data from the questionnaires are used as part of the marginal estimation procedures that produce the student achievement results. Questionnaire data are also used to perform quality control checks on school-reported data and in special reports, such as the Black–White Achievement Gap report (http://nces.ed.gov/nationsreportcard/studies/gaps/) and the Classroom Instruction Report in reading, mathematics, and science based on the 2015 Student Questionnaire Data (https://www.nationsreportcard.gov/sq_classroom/#mathematics). In addition to this NAEP 2021 School and Teacher Questionnaire Special Study, a summative report presenting the results will be provided at the end of the NAEP 2021 School Survey data collection (see OMB #1850-0957).

Lastly, there are numerous opportunities for secondary data analysis because of NAEP’s large scale, the regularity of its administrations, and its stringent quality control processes for data collection and analysis. NAEP data are used by researchers and educators who have diverse interests and varying levels of analytical experience.

A.3. Improved Use of Technology

NAEP has continually moved to administration methods that include greater use of technology, as described below.

Online Teacher and School Questionnaires

The teacher and school questionnaires that accompany the NAEP assessment were traditionally available as paper-based questionnaires. Starting in 2001, NAEP offered teachers and school administrators an option of either completing the questionnaires on paper or online. In an effort to reduce costs and to streamline the data collection, starting in 2014 the NAEP program moved to the practice of having the teacher and school questionnaires available primarily online through a tool known as NAEPq. To support respondents who have limited internet connections, NAEP field staff have a limited number of printed copies of the questionnaires that can be distributed at the school’s request.

Electronic Pre-Assessment Activities

Each school participating in NAEP has a designated staff member to serve as its NAEP school coordinator. Pre-assessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, and planning logistics for the assessment. NAEP is moving in the direction of paperless administrations. An electronic pre-assessment system (known as MyNAEP) was developed so that school coordinators would provide requested administration information online, including logistical information, updates of student and teacher information, and the completion of inclusion and accommodation information.8

Digitally Based Assessments (DBA)

As described in Section A.1.c.5, NAEP is transitioning to DBA. The move to DBA allows NAEP to provide assessments consistent with other large-scale assessments (such as those given by the Partnership for Assessment of Readiness for College and Careers [PARCC] and the Smarter Balanced Assessment Consortium). In addition, the transition to DBA allows NAEP to more accurately reflect what is happening in today’s classrooms, improve measurement of knowledge and skills, and collect new types of data that provide depth in our understanding of what students know and can do.

Automated Scoring

NAEP administers a combination of selected-response items and open-ended or constructed-response items. NAEP currently uses human scorers to score the constructed-response items, using detailed scoring rubrics and proven scoring methodologies. With the increased use of technologies, the methodology and reliability of automated scoring (i.e., the scoring of constructed-response items using computer software) has advanced. While NAEP does not currently employ automated scoring methodologies, these are being investigated for possible future use.

One study involved using two different automated scoring engines and comparing the scores to those previously given by human scorers. This study was conducted on items from the 2011 writing assessment. For each constructed-response item, approximately two-thirds of responses were used to develop the automated scoring model (the Training/Evaluation set) and the other third of responses were used to test and validate the automated scoring model (the Test/Validation set). The sample was selected from approximately 2,000 responses to each of the 22 different grade 8 prompts, plus approximately 2,000 responses to each of the 22 different grade 12 prompts. Approximately 80,000 existing responses were scored using automated scoring models for this study. No new data collection or human scoring was required.

The Training/Evaluation set was used to train, evaluate, and tune each scoring engine so as to produce the best possible scoring models for each constructed-response item. The final scoring models were then applied to the Test/Validation set, producing a holistic score for each response. Automated scoring performance is typically evaluated by comparison with human scoring performance. Evaluation criteria for the scoring models included measures of scorability, correlation with word count, overall mean and standard deviation calculations, and agreement with human scores using kappa, quadratic-weighted kappa, and Pearson correlation coefficients. Fairness was also examined for focal and reference groups and compared to the results of human raters.9 In addition to comparing how well each individual scoring engine agreed with human scorers, we also compared how well the two scoring engines agreed with each other. Results of these investigations will inform whether automated scoring could be utilized for specific NAEP assessments or whether additional investigations are required.
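To illustrate the agreement statistics named above, the sketch below computes exact agreement, quadratic-weighted kappa, and a Pearson correlation for a pair of hypothetical human and engine score vectors. The scores and the 1-6 scale are made up for illustration; this is not NAEP’s evaluation code.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical holistic scores (1-6 scale) assigned to the same ten responses.
human_scores  = np.array([3, 4, 2, 5, 3, 4, 1, 6, 3, 2])
engine_scores = np.array([3, 4, 3, 5, 2, 4, 1, 5, 3, 2])

exact_agreement = float(np.mean(human_scores == engine_scores))
qwk = cohen_kappa_score(human_scores, engine_scores, weights="quadratic")
pearson_r, _ = pearsonr(human_scores, engine_scores)

print(f"exact agreement:          {exact_agreement:.3f}")
print(f"quadratic-weighted kappa: {qwk:.3f}")
print(f"Pearson correlation:      {pearson_r:.3f}")
```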

A.4. Efforts to Identify Duplication

The proposed assessments, including the questionnaires, do not exist in the same format or combination in the U.S. Department of Education or elsewhere. The non-cognitive data gathered by NAEP comprise the only comprehensive cross-sectional survey performed regularly on a large-scale basis that can be related to extensive achievement data in the United States. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress and comparing these trends across states. None of the major non-federal studies of educational achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.

While the survey items in NAEP are unique, the items are not developed in a vacuum. Their development is informed by similar items in other assessments and survey programs. In addition, in future rounds of development, NCES will continue to better align the NAEP survey questions with other surveys (particularly, but not limited to, those from other NCES and federal survey programs).

Historically, NAEP has served a critical national “audit” function, offering an extremely helpful reference point for interpreting score trends on “high-stakes” tests used for school accountability. The main NAEP scales have served this function well even though high-stakes state assessments were not always closely aligned with the corresponding NAEP assessments. Given the significant changes currently underway in the American educational landscape, including the Next Generation Science Standards, the Common Core State Standards, and the Partnership for Assessment of Readiness for College and Careers (PARCC) and Smarter Balanced consortia, this “audit” function is even more important.

NAEP has provided the best available information about the academic achievement of the nation’s students in relation to consensus assessment frameworks, maintaining long-term trend lines for decades. NAEP has offered reporting at the national level as well as achievement comparisons among participating states for more than two decades, and since 2003, all states have participated in the NAEP mathematics and reading assessments at the fourth and eighth grades. More recently, NAEP has also reported achievement for selected large urban school districts. In addition to characterizing the achievement of fourth-, eighth-, and twelfth-grade students in a variety of subject areas, NAEP has also served to document the often substantial disparities in achievement across demographic groups, tracking both achievement and achievement gaps over time. Additionally, in describing educational achievement, NAEP has furthered deliberation as to the scope and meaning of achievement in mathematics, reading, and other subject areas. NAEP assessments are aligned to ambitious assessment frameworks developed by a thoughtful process to reflect the best thinking of educators and content specialists. These frameworks have served as models for the states and other organizations to follow. Finally, NAEP has also served as a laboratory for innovation, developing and demonstrating new item formats, as well as statistical methods and models now emulated by large-scale assessments worldwide.

NAEP has functioned well as a suite of complex survey modules conducted as assessments of student achievement in fixed testing windows. The complexity of NAEP evolved by necessity to address its legal and policy reporting requirements and the complex sampling of items and students needed to make reliable and valid inferences at the subgroup, district, state, and national level for stakeholders, ranging from policymakers to secondary analysts, and do so without creating an undue burden on students and schools.

A.5. Burden on Small Businesses or Other Small Entities

The school samples for NAEP contain small-, medium-, and large-size schools, including private schools. Schools are included in the sample proportional to their representation in the population, or as necessary to meet reporting goals. During a typical NAEP administration, it is necessary to include small and private schools so that the students attending such schools are represented in the data collection and in the reports. The trained field staff work closely with all schools to ensure that the pre-assessment activities and the administration can be completed with minimal disruption.

A.6. Consequences of Collecting Information Less Frequently

Under the National Assessment of Educational Progress Authorization Act, Congress has mandated the on-going collection of NAEP data. As part of the Consolidated Appropriations Act, 2021, Congress postponed the 2021 main NAEP student assessments to 2022 in light of the impact of the COVID-19 outbreak (for more information, see https://www.govtrack.us/congress/bills/116/hr133/text/enr). NCES has decided to continue to collect voluntary responses from teachers and schools via online survey questionnaires in 2021. Collecting this critical information will allow NCES to capture information regarding the educational experiences during the COVID-19 outbreak and its impact on the 2019-2020 and current school years.

A.7. Consistency with 5 CFR 1320.5

No special circumstances are involved. This data collection observes all requirements of 5 CFR 1320.5.

A.8. Consultations Outside the Agency

The NAEP assessments are conducted by an alliance of organizations under contract with the U.S. Department of Education.10 The Alliance includes the following:

  • Management Strategies is responsible for managing the integration of multiple NAEP project schedules and providing data on timeliness, deliverables, and cost performance.

  • Educational Testing Service (ETS) is responsible for coordinating Alliance contractor activities, developing the assessment instruments, analyzing the data, preparing the reports, and developing the platform.

  • Huntington Ingalls Industries (HII) is responsible for NAEP web technology development, operations, and maintenance, including the Integrated Management System (IMS).

  • Pearson is responsible for printing and distributing the assessment materials, and for scanning and scoring students’ responses.

  • Westat is responsible for selecting the school and student samples and managing field operations and data collection.

In addition to the NAEP Alliance, other organizations support the NAEP program, all of which are under contract with the U.S. Department of Education. The current list of organizations includes:11

  • American Institutes for Research (AIR) is responsible for providing technical support, conducting studies on state-level NAEP assessments, and running the NAEP Validity Studies Panel.

  • Council of Chief State School Officers (CCSSO) is responsible for providing ongoing information about state policies and assessments.

  • CRP, Inc. is responsible for providing logistical and programmatic support.

  • Hager Sharp is responsible for supporting the planning, development, and dissemination of NAEP publications and outreach activities.

  • Optimal Solutions Group is responsible for providing technical support.

  • Tribal Tech is responsible for providing support for the National Indian Education Study.

In addition to the contractors responsible for the development and administration of the NAEP assessments, the program involves many consultants and is also reviewed by specialists serving on various technical review panels. These consultants and special reviewers bring expertise concerning students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socio-economic levels; the specific subject areas being assessed; the analysis methodologies employed; and large-scale assessment design and practices. Contractor staff and consultants have reviewed all items for bias and sensitivity issues, grade appropriateness, and appropriateness of content across states.

In particular, subject-area standing committees play a central role in the development of NAEP assessment instruments and have been essential in creating assessment content that is appropriate for the targeted populations, and that meets the expectations outlined in the Governing Board frameworks. One of the most important functions of the committees is to contribute to the validation of the assessments. Through detailed reviews of items, scoring guides, tasks, constructed-response item training sets for scorers, and other materials, the committees help establish that the assessments are accurate, accessible, fair, relevant, and grade-level appropriate, and that each item measures the knowledge and skills it was designed to measure. When appropriate, members of subject-area standing committees also review the questionnaires with regard to their alignment with existing curricular and instructional practices.

Appendix A lists the current members of the following NAEP advisory committees:

  • NAEP Design and Analysis Committee

  • NAEP Validity Studies Panel

  • NAEP Quality Assurance Technical Panel

  • NAEP National Indian Education Study Technical Review Panel

  • NAEP Mathematics Standing Committee

  • NAEP Reading Standing Committee

  • NAEP Science Standing Committee

  • NAEP Survey Questionnaires Standing Committee

  • NAEP Mathematics Translation Review Committee

  • NAEP Science Translation Review Committee

  • NAEP Grade 8 Social Studies Translation Review Committee

  • NAEP Grade 4 and 8 Survey Questionnaire and eNAEP DBA System Translation Review Committee

  • NAEP Principals’ Panel Standing Committee

As has been the practice for the past few years, OMB representatives will be invited to attend the technical review panel meetings that are most informative for OMB purposes.

In addition to the contractors and the external committees, NCES works with the NAEP State Coordinators, who serve as the liaisons between their state education agencies and NAEP and coordinate NAEP activities in their states. NAEP State Coordinators work directly with the schools sampled for NAEP.

A.9. Payments or Gifts to Respondents

In general, there will be no gifts or payments to respondents, although students in a typical NAEP administration do get to keep the NAEP earbuds used in DBA. On occasion, NAEP will leave educational materials at schools for their use (e.g., science kits from the science hands-on assessments). Some schools also offer recognition parties with pizza or other perks for students who participate; however, these are not reimbursed by NCES or the NAEP contractors. If any incentives are proposed as part of a future special study, they would be justified as part of that future clearance package. As appropriate, the amounts would be consistent with amounts approved in other studies with similar conditions.

A.10. Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for NAEP to ensure that all NAEP contractors and agents (see section A.8 in this document) comply with all privacy requirements, including:

  1. The Statements of Work of NAEP contracts;

  2. National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622);

  3. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  4. Privacy Act of 1974 (5 U.S.C. §552a);

  5. Privacy Act Regulations (34 CFR Part 5b);

  6. Computer Security Act of 1987;

  7. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  8. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  9. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  10. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the contract for this study.

Furthermore, all NAEP contractors and agents will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/. In addition, the Sampling and Data Collection (SDC) contractor has obtained an Authority to Operate (ATO) for the NCESLS System from the Education OCIO to operate at the FISMA moderate level through the Certification & Accreditation (C&A) process. Security controls include secure data processing centers and sites; properly vetted and cleared staff; and data sharing agreements.

An important privacy and confidentiality issue is the protection of the identity of assessed students, their teachers, and their schools. To assure this protection, NAEP has established security procedures, described below, that closely control access to potentially identifying information.

All assessment and questionnaire data are encrypted at all times. This means that NAEP applications that handle assessment and questionnaire data:

  • enforce effective authentication and password management policies, making it difficult to gain unauthorized access to the data;

  • limit authorization to individuals who truly need access to the data, only granting the minimum access to individuals as they need (i.e., least privilege user access);

  • keep data encrypted, both in storage and in transport, utilizing volume encryption and transport layer security protocols;

  • utilize SSL certificates and HTTPS protocols for web-based applications;

  • limit access to data via software and firewall configurations as well as not using well known ports for data connections; and

  • restrict access to the portable networks utilized to administer an assessment to only assessment devices.

Students’ names are submitted to the Sampling and Data Collection (SDC) contractor for selecting the student sample. This list also includes the month/year of birth, race/ethnicity, gender, and status codes for students with disabilities, English language learners, and participation in the National School Lunch Program. This data request for NAEP fully conforms to the requirements of the Family Educational Rights and Privacy Act of 1974 (FERPA) [20 U.S.C. 1232g; 34 CFR Part 99]. FERPA is designed to protect the privacy rights of students and their families, by providing consistent standards for the release of personally identifiable student and family information. NCES and its agents are explicitly authorized under an exception to FERPA’s general consent rule to obtain student level data from institutions. For the purposes of this collection of data, FERPA permits educational agencies and institutions to disclose personally identifiable information from students’ education records, without consent, to authorized representatives of the Secretary of Education in connection with an evaluation of federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35).

After the student sample is selected, the data for selected students are submitted to the Materials Preparation, Distribution, Processing and Scoring (MDPS) contractor, who includes the data in the packaging and distribution system for the production of student-specific materials (such as labels to attach to the student booklets or log-in ID cards), which are then forwarded to field staff and used to manage and facilitate the assessment. These data are also uploaded to the MyNAEP Prepare for Assessments online system for review by schools and added to the MyNAEP School Control System (SCS) used by field staff to print materials used by the schools. Student information is deleted from the packaging and distribution system before the assessment begins. Student information is deleted from the MyNAEP system typically two weeks after all quality control activities for the assessment are complete.

All paper-based student-specific materials linking personally identifiable information (PII) to assessment materials are destroyed at the schools upon completion of the assessment. The field staff remove names from forms and place the student names in the school storage envelope. The school storage envelope contains all of the forms and materials with student names and is kept at the school until the end of the school year and then destroyed by school personnel.12

In addition to student information, teacher and principal names are collected and recorded in the MyNAEP Prepare for Assessment online system, which is used to keep track of the distribution and collection of NAEP teacher and school questionnaires. A paper copy of the questionnaire report is printed for use during the assessment, and this paper copy is left in the school storage envelope, which is destroyed at the end of the school year. The teacher and principal names are deleted from the MyNAEP system at the same time the student information is deleted.

For DBA, NAEP data are stored on systems in a locked-down environment at a secure hosting facility with strict measures in place to prevent unauthorized online access. The student names are not included on the assessment tablets or stored by the same contractor or on the same database as the student responses. Shortly before, during, and after assessments, assessment data are transmitted through secure, encrypted channels (SSL, SSH) between NAEP systems, the NAEP assessment servers, and the assessment administration devices. Data on those devices are also encrypted—these data can be read only by the assessment software—and the devices are secured against unauthorized use.

Furthermore, to protect collected data, NAEP staff will use the following precautions:

  • Assessment and questionnaire data files will not identify individual respondents.

  • No personally identifiable information, whether about schools or respondents, will be gathered or released by third parties. No permanent files of names or other direct identifiers of respondents will be maintained.

  • Student participation is voluntary.

  • NAEP data are perturbed. Data perturbation is a statistical data editing technique implemented to ensure privacy for student and school respondents to NAEP’s assessment questionnaires, for assessments in which data are reported or obtainable via restricted-use licensing arrangements with NCES. The process is coordinated in strict confidence with the IES Disclosure Review Board (DRB), with details of the process shared only with the DRB and a minimal number of contractor staff.

The following text appears on all student assessments, the MyNAEP system, and teacher and school questionnaires:13

Paperwork Burden Statement, OMB Information

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0956 for NAEP 2021 School and Teacher Questionnaire Special Study and 1850-0957 for the NAEP 2021 School Survey. The time required to complete this information collection is estimated to average 65 minutes for the NAEP 2021 School and Teacher Questionnaire Special Study and 30 minutes for the NAEP 2021 School Survey, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this collection, or any comments or concerns regarding the status of your individual submission, please write to: National Assessment of Educational Progress (NAEP), National Center for Education Statistics (NCES), Potomac Center Plaza, 550 12th St., SW, 4th floor, Washington, DC 20202.


OMB No. 1850-0956 APPROVAL EXPIRES 8/31/2021 for the NAEP 2021 School and Teacher Questionnaire Special Study


OMB No. 1850-0957 APPROVAL EXPIRES 8/31/2021 for the NAEP 2021 School Survey.


Authorization and Confidentiality Assurance

National Center for Education Statistics (NCES) is authorized to conduct NAEP by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622) and to collect students’ education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35).


All of the information provided by participants may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). By law, every NCES employee as well as every NCES agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about participants. Electronic submission of participants’ information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015. The collected information will be combined across respondents to produce statistical reports.

In addition, the following text appears on the log-in screen for the MyNAEP system and NAEPq, the online system used for teacher and school administrator questionnaires.

MyNAEP

When you have finished or if you need to stop before finishing, please LOG OUT of the survey system by clicking "Save and exit" and CLOSE ALL browser windows or screens to keep your responses secure. For example, if you used Chrome or Safari to open the survey, make sure no Chrome or Safari windows or screens are open after you end the survey. Not closing all browsers may allow someone else to see your responses.

NAEPq

When you have finished or if you need to stop before finishing, please LOG OUT of the survey system by clicking "Exit" and CLOSE ALL browser windows or screens to keep your responses secure. For example, if you used Chrome or Safari to open the survey, make sure no Chrome or Safari windows or screens are open after you end the survey. Not closing all browsers may allow someone else to see your responses.



As part of the MyNAEP system, there is an additional screen after users log into the system. The text shown on that screen is below. The MyNAEP Data Security Agreement ensures that the registered user acknowledges having access to student PII and has the user certify that they will keep the information secure and confidential.

MyNAEP DATA SECURITY AGREEMENT

Under this agreement you will have access to MyNAEP, a secure site maintained by Westat on behalf of the National Center for Education Statistics. By accepting this agreement, you also agree to keep information from the site confidential as outlined below.


National Center for Education Statistics (NCES) is authorized to conduct NAEP by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622) and to collect students' education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

NAEP collects data in a manner consistent with Family Educational Rights and Privacy Act (FERPA) privacy conventions governing the release of student data. Generally, schools must have written parental permission in order to release any information from a student's education record. However, FERPA allows schools to disclose those records, without consent, to organizations conducting certain studies for or on behalf of the school [34 CFR Part 99.31(a)(6)(i)].


As a study authorized by the U.S. Secretary of Education, NAEP is permitted to obtain personally identifying student information without written parental permission. Even so, FERPA stipulates that data collection must be conducted in a manner that does not permit personal identification of parents and students by individuals other than representatives of the organization, and that the information is destroyed when no longer needed for the purposes for which the study was conducted [34 CFR Part 99.31 (a)(6)(ii)].


NAEP complies with FERPA confidentiality requirements through its use of data transmittal, data storage, and personnel protocols designed to safeguard personally identifying information. Information is protected during transmission to and from NAEP systems by the use of data encryption technologies, such as Secure Socket Layer (SSL) and digital certificates and signatures that encrypt data, validate data integrity, and authenticate the parties in a transaction. NAEP uses SSL for all restricted-access websites that are used to transfer data, such as www.mynaep.com. All individuals including contractors and school personnel must sign assurances of confidentiality in which they pledge to maintain data confidentiality and exercise reasonable caution to prevent access by others to this information when it is in their possession.


As a representative of your school working on NAEP, you will have access to personally identifying student information. By accepting this agreement, you are certifying that you are authorized to handle and process NAEP information on behalf of your school, and that you will keep the information secure and confidential.

Full name:

Date:

Email address:

School:

District:



More specific information about how NAEP handles PII is provided below:

PII is created in the following ways

  1. Public and non-public school samples are released by the SDC contractor to NAEP State Coordinators (public schools only), NAEP TUDA Coordinators (public schools only), and SDC Gaining Cooperation Field Staff (non-public schools only) using the secure MyNAEP for Schools website.

  2. Schools are recruited by SDC field staff for participation in NAEP.

  3. Participating schools need to submit a current roster of students for the sampled grade for student sampling.

  4. Rosters of students can be created by NAEP State Coordinators, NAEP TUDA Coordinators, or NAEP School Coordinators.

    a. Rosters are submitted through the secure MyNAEP for Schools website.

    b. Rosters must be in Excel format.

  5. PII is contained in the roster files: state unique identifiers (optional), student names, month/year of birth, race/ethnicity, gender, and status codes for students with disabilities, English language learners, and participation in the National School Lunch Program.

  6. PII is stored in the SDC contractor’s secure data environments.

PII is moved in the following ways

  1. Student names (PII) are moved to the MDPS contractor via a secure FTP site. These names are used to print Student Login Cards.

  2. Student Login Cards are only created for students taking DBA, so the student names for the PBA students are not moved.

  3. Student PII data are available to the NAEP School Coordinators and the SDC contractor’s Field Staff through the secure MyNAEP for Schools website.

    a. NAEP School Coordinators can view and update PII for their own schools.

    b. NAEP School Coordinators can print materials containing PII for their own schools.

    c. NAEP School Coordinators store materials containing PII for their own schools in the “NAEP Secure Storage Envelope”.

    d. SDC contractor Field Staff can update PII for schools within their assignment.

    e. SDC contractor Field Staff can print materials containing PII for schools within their assignment.

    f. SDC contractor Field Staff store materials containing PII for schools within their assignment in their “NAEP School Folders”.

  4. At no point in time does any individual contractor have simultaneous access to both the student names and the student assessment and questionnaire responses. MDPS has access to both, but never at the same time: MDPS uses student PII to print Student Login Cards months in advance of the NAEP assessment window and destroys the student PII file after the assessment begins. SDC never has access to student responses, and no other contractor has access to student PII.

PII is destroyed in the following ways

  1. The MDPS contractor destroys the PII after the assessment begins.

  2. School Coordinators destroy the materials containing PII on or before the end of the school year.

  3. SDC contractor Field Staff destroy the materials containing PII after the school assessment has been completed. SDC contractor Field Staff return their NAEP School Folders to the Westat Home Office for secure storage and eventual secure destruction.

  4. The SDC contractor destroys student names after all weighting quality control checks have been completed, approximately six months following the end of the administration.


A.11. Sensitive Questions

NAEP emphasizes voluntary respondent participation. Insensitive or offensive items are prohibited by the National Assessment of Educational Progress Authorization Act, and the Governing Board reviews all items for bias and sensitivity. The nature of the questions is guided by the reporting requirements in the legislation, the Governing Board’s Policy on the Collection and Reporting of Background Data, and the expertise and guidance of the NAEP Survey Questionnaire Standing Committee (see Appendix A-8). Throughout the item development process, NCES staff works with consultants, contractors, and internal reviewers to identify and eliminate potential bias in the items.

During a typical NAEP administration, the NAEP student questionnaires include items that require students to provide responses on factual questions about their family’s socio-economic background, self-reported behaviors, and learning contexts, both in the school setting as well as more generally. In compliance with legislation, student questionnaires do not include items about family or personal beliefs (e.g., religious or political beliefs). The student questionnaires focus only on contextual factors that clearly relate to academic achievement.

Educators, psychologists, economists, and others have called for the collection of non-cognitive student information that can explain why some students do better in school than others. Similar questions have been included in other NCES-administered assessments such as the Trends in International Mathematics and Science Study (TIMSS), the Program for International Student Assessment (PISA), the National School Climate Survey, and other federal questionnaires, including the U.S. Census. The insights achieved by the use of these well-established survey questions will help educators, policymakers, and other stakeholders make better informed decisions about how best to help students develop the knowledge and skills they need to succeed.


During a typical NAEP administration, NAEP does not report student responses at the individual or school level, but strictly in aggregate forms. To reduce the impact of any individual question on NAEP reporting, the program has shifted to a balanced reporting approach that includes multi-item indices, where possible, to maximize robustness and validity. In compliance with legislation and established practices through previous NAEP administrations, students may skip any question.

To provide additional context for 2021, the teacher and school questionnaires include items that ask about students’ learning experiences, teachers’ preparation and instructional practices, and schools’ preparation and instructional organization and practices related to the COVID-19 outbreak. During the development process, these COVID-19 related items underwent a similar series of reviews for bias and sensitivity as the main questionnaire items. This included a sensitivity review conducted by the contractor’s independent group of reviewers who are not part of the NAEP program to identify potentially delicate, inflammatory, or inappropriate language.


A.12. Estimation of Respondent Reporting Burden (2021)

The burden numbers for NAEP data collection for 2021 will account only for the preparation activities performed prior to the postponement of the 2021 student assessment, and for the NAEP 2021 School and Teacher Questionnaire Special Study.

Exhibit 1 provides the burden information per respondent group and by grade for the 2021 data collection.

Exhibit 2 summarizes the burden by respondent group.

A description of the respondents or study is provided below, as supporting information for Exhibit 1:

  • Students— Due to the cancellation of the 2021 student assessments and questionnaires, there will be no burden on students.

  • Teachers—The teachers of fourth- and eighth-grade students who would have participated in main NAEP are being asked to complete online questionnaires about their teaching background, education, training, and classroom organization. In 2021, teachers will also answer questions about their teaching preparation and instructional practices related to the COVID-19 outbreak. Average fourth-grade teacher burden is estimated to be 35 minutes because fourth-grade teachers often have multiple subject-specific sections to complete. Average eighth-grade teacher burden is 25 minutes if only one subject is taught and an additional 10 minutes for each additional subject taught. Based on timing data collected from cognitive interviews, adults can respond to approximately six non-cognitive items per minute. Using this information, the teacher questionnaires are assembled so that most teachers can complete the questionnaire in the estimated amount of time. For adult respondents, the burden listed is the estimated average burden.

  • Principals/Administrators—The school administrators in the sampled schools are asked to complete online questionnaires. The core items are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. Subject-specific items concentrate on curriculum and instructional services issues. In 2021, school administrators will also answer questions about their school’s preparation, instructional organization and practices related to the COVID-19 outbreak. The burden for school administrators is determined in the same manner as burden for teachers (see above) and is estimated to average 40 minutes per principal/administrator, although burden may vary depending on the number of subject-specific sections included.

  • SD and ELL— Due to the cancellation of the 2021 student assessments, there will be no collection of student SD and ELL information.

  • Submission of Samples—Some schools collected and submitted student survey sample information prior to the postponement of the NAEP student assessments. NCES has accounted for this burden incurred by schools in Exhibit 1 below, although this information will be discarded. Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the state level, some states require a data security agreement, which is customized based on the specific requests of the state and provides verbatim security and confidentiality information from Section A.10 above. If done at the school or district level, some burden will be incurred by school personnel. It is estimated that it will take two hours, on average, for school personnel to complete the submission process. Based on 2019 data, it is estimated that 26 percent of schools or districts will complete the sample submission process.

  • Pre-Assessment and Assessment Activities—The NAEP school coordinator will be responsible for providing information to teachers and school administrators so they may complete the survey questionnaires online. New instructional communications providing additional information about the teacher and school survey questionnaires can be found in Appendices D2-21-1 to D2-21-8S. Each school participating in main NAEP has a designated staff member who serves as its NAEP school coordinator. Pre-assessment and assessment activities include functions such as providing school information, conducting a pre-study call, and managing questionnaires. An electronic pre-assessment system (known as MyNAEP) was developed so that school coordinators can provide the requested administration information online, including logistical and teacher information. More information about the school coordinators’ responsibilities is included in Section B.2. Based on information collected from previous years’ use of MyNAEP, it is estimated that it will take 65 minutes, on average, for school personnel to complete these activities for 2021, including looking up information to enter into the system. We will continue to use MyNAEP system data to learn more about participant response patterns and use this information to further refine the system to minimize school coordinator burden.

  • Post-Assessment Follow-up Survey—Teachers and school administrators will not be asked to complete a post-assessment follow-up survey in 2021 given the cancellation of the NAEP student assessments.

  • Online SQ Collection from Remote Students— Due to the cancellation of the 2021 student assessments, there will be no burden on students.

EXHIBIT 1

Estimated Burden for NAEP 2021 School and Teacher Questionnaire Special Study

(Note: all explanatory notes and footnotes are displayed following the table. The original table’s columns for Students, SD/ELL (school personnel), and online SQ collection from remote students are not shown because every entry in them is N/A, given the cancellation of the 2021 student assessments. The number of schools shown applies to both the school questionnaire and the school coordinator columns.)

Subjects | # of Teachers | Teacher avg. minutes per response | Teacher burden (hours) | # of Schools | Principal avg. minutes per response | Principal burden (hours) | School coordinator burden (hours)1 | Total burden (hours)

4th Grade: Operational (Math and Reading) single-subject assessment | 14,044 | 35 | 8,192 | 3,511 | 40 | 2,341 | 5,630 | 16,163

4th Grade: Puerto Rico Math | 600 | 35 | 350 | 150 | 40 | 100 | 241 | 691

4th Grade Totals | 14,644 | — | 8,542 | 3,661 | — | 2,441 | 5,871 | 16,854

8th Grade: Operational (Reading, Math) single-subject assessment | 20,376 | 25 for teachers who teach 1 subject; additional 10 for each additional subject2 | 10,188 | 3,396 | 40 | 2,264 | 5,445 | 17,897

8th Grade: Puerto Rico Math | 870 | 25 | 363 | 145 | 40 | 97 | 232 | 692

8th Grade Totals | 21,246 | — | 10,551 | 3,541 | — | 2,361 | 5,677 | 18,589

Total Requested Burden | 35,890 | — | 19,093 | 7,202 | — | 4,802 | 11,548 | 35,443

Total number of respondents: 50,294

Total number of responses: 50,294










Notes for 2021 table in Exhibit 1

  1. The burden for the school coordinator for the NAEP 2021 School and Teacher Questionnaire Special Study is as follows: Pre-assessment burden is 65 minutes, which includes managing questionnaires, and the sample submission burden is 2 hours (for 26% of schools in 2021 based on 2019 data). For the purposes of the calculation of burden, we consider the performance of all of these tasks to constitute 1 response.

  2. Grade 8 teachers who teach one subject have an estimated burden of 25 minutes, with an additional 10 minutes for each additional subject. An estimated 50 percent of grade 8 teachers teach one subject and 50 percent teach two subjects.
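
For illustration, the burden-hour figures in Exhibit 1 are consistent with the per-respondent estimates described above (small differences are due to rounding):

  • Grade 4 operational teachers: 14,044 teachers × 35 minutes ÷ 60 ≈ 8,192 hours; grade 4 operational principals: 3,511 schools × 40 minutes ÷ 60 ≈ 2,341 hours.

  • Grade 8 operational teachers: with the 50/50 split noted in Note 2, the average response time is (25 + 35) ÷ 2 = 30 minutes, so 20,376 teachers × 30 minutes ÷ 60 = 10,188 hours.

  • School coordinators (Note 1): 65 minutes ÷ 60 plus 26 percent × 2 hours ≈ 1.60 hours per school; for example, 3,511 schools × 1.60 hours ≈ 5,630 hours for the grade 4 operational sample.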


EXHIBIT 2

Total Annual Estimated Burden Time Cost for NAEP 2021 Assessments

Data Collection Year | Number of Respondents | Number of Responses | Total Burden (in hours)

2021 | 50,294 | 50,294 | 35,443


The estimated respondent burden across all these activities translates into an estimated total of 35,443 hours, with a total burden time cost of $1,080,129,14 broken out by respondent group in the table below.


 | Students | Teachers and School Staff | Principals | Total

2021 Hours | N/A | 30,641 | 4,802 | 35,443

2021 Cost | N/A | $841,709 | $238,420 | $1,080,129
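
As a point of reference, these time costs follow from the hourly rates described in footnote 14: 30,641 hours × $27.47 ≈ $841,709 for teachers and school staff, and 4,802 hours × $49.65 ≈ $238,420 for principals, for a combined total of approximately $1,080,129.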


A.13. Cost to Respondents

There are no direct costs to respondents.

A.14. Estimates of Cost to the Federal Government

The total cost to the federal government for the preparation, data collection, and reporting of the NAEP 2021 School and Teacher Questionnaire Special Study (contract costs and NCES salaries and expenses) is estimated to be $6,086,800. The 2021 data collection cost estimate is shown in the table below.

NCES salaries and expenses | $100,000

Contract costs | $5,986,800

  Printing, packaging, distribution, and scoring | $879,000

  Item development | $305,000

  Sampling, training, data collection, and weighting | $3,000,000

  Recruitment and state support | $315,000

  Design, analysis, and reporting | $987,800

  Securing and transferring DBA assessment data | $0

  DBA system development | $500,000
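
As shown above, the contract cost components sum to the contract total: $879,000 + $305,000 + $3,000,000 + $315,000 + $987,800 + $0 + $500,000 = $5,986,800; adding the $100,000 in NCES salaries and expenses yields the total estimate of $6,086,800.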



A.15. Reasons for Changes in Burden (from last Clearance submittal)

The change in burden from the last amendment submission is due to the cancellation of the 2021 NAEP student assessments. Burden estimates for teachers, principals, and school coordinators have been revised to capture the estimated burden of participating in the online survey questionnaires.


A.16. Time Schedule for Data Collection and Publications

The time schedule for the data collection for the 2021 assessments is shown below.

Survey Questionnaires

March-April 2021: Data collection for NAEP 2021 School and Teacher Questionnaire Special Study



The report from the teacher and school administrator surveys is expected to be released in Fall 2021.

A.17. Approval for Not Displaying OMB Approval Expiration Date

No exception is requested.

A.18. Exceptions to Certification Statement

No exception is requested.

1 The role of NCES, led by the Commissioner for Education Statistics, is defined in 20 U.S.C. §9622 (https://www.law.cornell.edu/uscode/text/20/9622) and OMB Statistical Policy Directives No. 1 and 4 (https://obamawhitehouse.archives.gov/omb/inforeg_statpolicy).

2 The grade 12 economics teacher match rate was 56 percent in 2012. For comparison, the 2015 teacher match rates for grades 4 and 8 were approximately 94 percent and 86 percent, respectively.

3 See Section A.2 for more information about how NAEP results are reported.

4 See Section B.1.a for more information on the NAEP sampling procedures.

5 See Section B.2 regarding procedures for data collection.

6 The study design and results are summarized in Oranje, A., Mazzeo, J., Xu, X., & Kulick, E. (2014). A multistage testing approach to group-score assessments. In D. Yan, A. A. von Davier, & C. Lewis (Eds.), Computerized multistage testing: Theory and applications (pp. 371-389). Boca Raton, FL: CRC Press.

7 The Governing Board assessment schedule can be found at https://www.nagb.gov/about-naep/assessment-schedule.html.

8 Additional information on the MyNAEP site is included in the Section B.2.

9 These evaluation criteria were largely based on criteria advocated in Williamson, D. M., Xi, X., & Breyer, F. J. (2012). A framework for evaluation and use of automated scoring. Educational Measurement: Issues and Practices, 31(1), 2-13.

10 The current contract expires on June 30, 2024.

11 The current contracts expire at varying times. As such, the specific contracting organizations may change during the course of the time period covered under this submittal.

12 In early May, schools receive an email from the MyNAEP system reminding them to securely destroy the contents of the NAEP storage envelope and confirm that they have done so. The confirmation is recorded in the system and tracked.

13 Previous versions of NAEP instruments have displayed Government system warning banners. As of this package they are no longer included in NAEP instruments to assure consistency in language used across the different NAEP and NCES materials.

14 The average hourly earnings of teachers and principals, derived from the May 2019 Bureau of Labor Statistics (BLS) Occupational Employment Statistics, are $27.47 for teachers and school staff and $49.65 for principals. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage of $7.25 an hour. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: Elementary school teachers (25-2021); Middle school teachers (25-2022); High school teachers (25-2031); Principals (11-9032); last modified May 2019.
