
National Center for Education Statistics

National Assessment of Educational Progress






National Assessment of Educational Progress (NAEP) 2019 and 2020

Long-Term Trend (LTT) 2020 Update 3



Supporting Statement

Part B




OMB# 1850-0928 v.17









July 2019

revised August 2019




Table of Contents


Part B. Collection of Information Employing Statistical Methods

B.1. Potential Respondent Universe and Sampling Design

The possible universe of student respondents is estimated to be 12 million at grades 4, 8, and 12 for main NAEP, and at ages 9, 13, and 17[1] for Long-Term Trend (LTT), attending the approximately 154,000 public and private elementary and secondary schools in the 50 states and the District of Columbia, including Bureau of Indian Education and Department of Defense Education Activity (DoDEA) schools. Note that territories, including Puerto Rico, are not included in the national samples.

Respondents are selected according to student sampling procedures with these possible exclusions:

  • The student is identified as an English language learner (ELL) and cannot participate in NAEP, even with the accommodations allowed in NAEP.

  • The student is identified as having a disability (SD) which prevents participation in NAEP, even with accommodations as allowed in NAEP, and has an Individualized Education Plan (IEP) or equivalent classification, such as a Section 504 plan.

Additional information regarding the classification of students is provided in Section B.2.b.

B.1.a. Sampling Procedures

To assess a representative sample of students, the process begins by identifying a sample of schools with student populations that reflect the varying demographics of a specific jurisdiction, be it the nation, a state, or a district. Within each selected school, students are chosen at random to participate and each has the same chance of being chosen, regardless of socio-economic status, disability, status as an English language learner, or any other factors. Selecting schools that are representative helps ensure that the student sample is representative.

The following are characteristic features of NAEP sampling designs:

  • for state-level assessments, approximately equal sample sizes (2,200–3,000 assessed students) from each participating state's[2] public schools, for each subject;

  • for district-level assessments, sample sizes of approximately 1,200–2,000 from each participating district’s public schools, for each subject;

  • sample sizes of approximately 6,000–20,000 for national-only operational subjects, depending on the size of the item pool;[3]

  • sample sizes of approximately 3,000–12,000 for pilot assessments, depending on the size of the item pool;[4] and

  • in each school, some students to be assessed in each subject.

Additional information about the sampling procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/sample_design/. Note that, while the latest documentation for main NAEP that has been published (as of the drafting of this document) is from 2013, the procedures have remained essentially the same. A summary of the sampling procedures is included below. Additional details (taken from the main NAEP 2013 procedures on the technical documentation website) can be found in Appendix G1 (NAEP 2013 Sampling Design) and, for LTT (taken from the 2012 procedures on the technical documentation website), in Appendix G2 (LTT 2012 Sampling Design).

As in the past, NAEP samples are based on multistage designs. For the national samples, a two- or three-stage design is used. If a three-stage design is used, the first stage is the selection of primary sampling units (PSUs), which are individual counties or groups of contiguous counties. The next stage is the selection of schools (within PSUs, when a three-stage design is used) and the final stage is the selection of students within schools. The national samples have sufficient schools and students to yield results for public schools, private schools, each of the four Census Regions of the country, as well as gender, race, degree of urbanization of school location, parent education, and participation in the National School Lunch Program (NSLP).

The following steps are used to select a sample of public schools and students in a year when NAEP reports state-level results. Private schools are not included in a state-level sample, which focuses solely on public schools.

  1. Generate a sampling frame.
    For sampling frames, NAEP uses the most current versions of the NCES Common Core of Data (CCD; public schools) and Private School Universe Survey (PSS; private schools) files. In addition, to address the fact that the CCD file does not necessarily include the most recent changes to schools by the time of the assessment, NAEP also conducts a survey of NAEP State Coordinators to check for additional new schools in a sample of public school districts.

  2. Classify schools into groups.
    Using the list, schools are classified into groups, first by type of location and then by the race/ethnicity classification within those locations. This step takes into account the distribution of schools and students across rural, suburban, and urban areas in each state, and the diversity of the student population at each school.

  3. Within each group, order schools by a measure related to student achievement.
    Within each group, schools are sorted by student achievement to ensure that schools with varying levels of student achievement are represented in the NAEP sample. This is done using school-level results on state achievement tests. In a few cases where recent achievement data are not available, schools are sorted by the median household income for the area where the school is located.

  4. Assign a measure of size to all schools.
    All schools on the list are assigned a measure of size. A school’s measure of size is based on the size of its enrollment in relation to the size of the state’s student population at the selected grade-level. Larger schools have a larger measure of size as they represent a larger proportion of the state’s student population. This step ensures that students from schools of different sizes are appropriately represented in the sample.

  5. Select the school sample.
    After schools are assigned a measure of size and grouped on an ordered list based on the characteristics referred to in the previous steps, the sample is selected using stratified systematic sampling with probability proportional to the measure of size, using a sampling interval. This procedure ensures that each school has the required selection probability. By proceeding systematically through the entire list, schools of different sizes and varying demographics are selected, and a representative sample of students will be chosen for the assessment. A simplified illustration of this selection is sketched after this list. Additional details regarding the selection of the school sample are included in the technical documentation (https://nces.ed.gov/nationsreportcard/tdw/sample_design/2013/sample_design_for_the_2013_state_assessment.aspx).

  6. Confirm school eligibility.
    The list of schools selected to participate is sent to each state to verify that each school is eligible for participation. A school may be ineligible, for example, because it has closed or because its grade span has changed so that the grade level or age assessed by NAEP is no longer offered at the school. Eligibility counts are included in the technical documentation (https://nces.ed.gov/nationsreportcard/tdw/sample_design/2013/eligible_schools_sampled_for_the_2013_state_assessment.aspx). Information on response rates can be found in Section B.3.b.

  7. Select students to participate in NAEP.
    School principals are notified that their schools have been chosen to participate in NAEP. Within each sampled school, a systematic sample of students is selected with equal probability from a complete list of students at the grade or age to be assessed.
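
The probability-proportional-to-size school selection described in steps 4 and 5, and the equal-probability student sampling in step 7, can be illustrated with the simplified sketch below (written in Python). The single-stratum frame, the field names, and the fixed sample sizes are assumptions made for exposition only; this is not NAEP's operational sampling code.

    import random

    def select_schools_pps(frame, n_schools, seed=20190101):
        """Systematic selection with probability proportional to a measure of size.

        frame: schools already sorted by group and by the achievement-related
               measure (steps 2 and 3); each school is a dict whose 'enrollment'
               field serves as the measure of size (step 4).
        """
        total_size = sum(school["enrollment"] for school in frame)
        interval = total_size / n_schools                    # sampling interval
        next_hit = random.Random(seed).uniform(0, interval)  # random start

        selected, cumulative = [], 0.0
        for school in frame:
            cumulative += school["enrollment"]
            # A school whose measure of size spans a selection point is chosen;
            # a school larger than the interval can span several points and is
            # effectively taken with certainty.
            while cumulative > next_hit:
                selected.append(school)
                next_hit += interval
        return selected

    def sample_students(roster, n_students, seed=20190102):
        """Equal-probability systematic sample of students within a school (step 7)."""
        interval = len(roster) / n_students
        start = random.Random(seed).uniform(0, interval)
        return [roster[int(start + k * interval)] for k in range(n_students)]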

The process for private school selection is similar to the public school selection process but relies on the U.S. Department of Education's private school databases to create the initial list of all known private schools. Private schools are sampled to be representative of private schools nationwide. Results for private schools are not included in state-level results, which focus solely on public schools.

As described above, the selection process for schools uses stratified systematic sampling within categories of schools with similar characteristics. Some schools or groups of schools (districts) may be selected for each assessment cycle if they are unique in the state. For instance, if a particular district is in the only major metropolitan area of a state or has the majority of a minority population in the state, it may be selected for assessment more often. Additionally, even if a state decides not to participate at the state level, schools in that state identified for the national sample will still be asked to participate.

NAEP yearly sample design plans are not available until the spring of the year preceding the assessments. For this clearance submittal, we have included the 2019 and 2020 sample design memorandums (see Appendix C1 and C2) which detail the specific sampling procedures for the 2019 and 2020 assessments.

Additional information about the sampling procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/sample_design/.

B.1.b. Weighting Procedures

Since each selected school that participates in the assessment effort and each student assessed constitutes only a portion of the full population of interest, weights are applied to both schools and students. The weights permit valid inferences to be drawn from the student samples about the respective populations from which they were drawn and, most importantly, ensure that the results of the assessments are fully representative of the target populations.

Additional information about the weighting procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/weighting/. Note that, while the latest documentation that has been published (as of the drafting of this document) is from 2013, the procedures have remained essentially the same. A summary of the weighting procedures is included below. Additional details (taken from the main NAEP 2013 procedures on the technical documentation website) can be found in Appendix G1 (NAEP 2013 Sampling Design) and, for LTT (taken from the 2012 procedures on the technical documentation website), in Appendix G2 (LTT 2012 Sampling Design).

The final weights assigned to each student as a result of the estimation procedures are the product of the following steps (which are described in additional detail below):

  • assignment of a “base” weight, the reciprocal of the overall initial probability of selection;

  • adjustment of the school base weights to reduce extreme variability arising from special circumstances;

  • adjustments for school and student nonresponse;

  • adjustment (if needed) to reflect assignment to a specified assessment subject; and

  • adjustment of the student weights in state samples so that estimates for key student-level characteristics are in agreement across assessments in reading, math, and science.

School base weights are assigned separately by grade or age and, as noted, are the reciprocal of the school’s probability of selection for that grade or age level.

Each sampled student receives a student base weight, whether or not the student participated in the assessment process. The base weight reflects the number of students that the sampled student represents in the population of interest. The sum of the student base weights for a given subgroup provides an estimate of the total number of students in that subgroup.
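
As a purely hypothetical illustration of how a base weight is formed from the overall selection probability (the numbers below are invented for exposition and are not NAEP figures):

    # Hypothetical example of a student base weight (all numbers invented).
    school_selection_prob = 0.05             # school drawn with probability 1 in 20
    within_school_prob = 50 / 200            # 50 students sampled from 200 enrolled
    overall_prob = school_selection_prob * within_school_prob   # 0.0125
    base_weight = 1 / overall_prob           # 80.0

    # The assessed student represents roughly 80 students in the population of
    # interest; summing base weights over all sampled students in a subgroup
    # estimates the total number of students in that subgroup.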

Since nonresponse is unavoidable in any survey of a human population, a weighting adjustment is introduced to compensate for the loss of sample data and to improve the precision of the assessment estimates. Nonresponse adjustments are applied at both the school and the student levels; the weights of responding schools are adjusted to reflect the nonresponding schools, and the weights of responding students, in turn, receive an adjustment to account for nonresponding students. School nonresponse adjustment cells are formed in part by census division, urbanicity, and race/ethnicity. Student nonresponse adjustment cells are formed in part by SD/ELL status, school nonresponse cell, age, gender, and race/ethnicity.
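
The following sketch illustrates a generic cell-based nonresponse adjustment of the kind described above; the data structure and the way cells are encoded are assumptions for exposition only, not NAEP's operational specifications.

    from collections import defaultdict

    def nonresponse_adjust(units):
        """Inflate respondents' weights within each adjustment cell (sketch).

        units: dicts with 'cell', 'weight', and 'responded' fields. Within each
        cell, respondents' weights are scaled up so that they also account for
        the weight of eligible nonrespondents in that cell.
        """
        eligible = defaultdict(float)
        responding = defaultdict(float)
        for u in units:
            eligible[u["cell"]] += u["weight"]
            if u["responded"]:
                responding[u["cell"]] += u["weight"]

        for u in units:
            if u["responded"]:
                factor = eligible[u["cell"]] / responding[u["cell"]]
                u["adjusted_weight"] = u["weight"] * factor
            else:
                u["adjusted_weight"] = 0.0   # nonrespondents carry no final weight
        return units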

The complexity of the sample selection process as well as the variations in school enrollment can result in extremely large weights for both schools and students. Since unusually large weights are likely to produce large sampling variances for statistics of interest, and especially so when the large weights are associated with sample cases reflective of rare or atypical characteristics, such weights usually undergo an adjustment procedure that “trims” or reduces extreme weights. Again, the motivation is to improve the precision of the survey estimates. The student weight trimming procedure uses a multiple median rule to detect excessively large student weights.
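
A multiple-of-the-median trimming rule can be sketched as follows. The trimming multiple of 3.5 is an arbitrary illustrative value, not NAEP's operational constant, and the sketch omits the redistribution of trimmed weight that a production procedure would typically perform.

    import statistics

    def trim_weights(weights, multiple=3.5):
        """Cap any weight that exceeds a multiple of the median weight (sketch)."""
        ceiling = multiple * statistics.median(weights)
        return [min(w, ceiling) for w in weights]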

Weighted estimates of population totals for student-level subgroups for a given grade or age will vary across subjects even though the student samples for each subject generally come from the same schools. These differences are the result of sampling error associated with the random assignment of subjects to students through a process known as spiraling. For state assessments, in particular, any difference in demographic estimates between subjects, no matter how small, may raise concerns about data quality. To remove these random differences and potential data quality concerns, a new step was added to the NAEP weighting procedure starting in 2009. This step adjusts the student weights in such a way that the weighted sums of population totals for specific subgroups are the same across all subjects. It was implemented using a raking procedure and applied only to state-level assessments.
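
A minimal sketch of such a raking (iterative proportional fitting) adjustment is shown below, assuming simple Python data structures; the dimensions, control totals, and fixed iteration count are illustrative assumptions only.

    def rake(weights, memberships, control_totals, n_iter=25):
        """Adjust weights so weighted totals match control totals on each dimension.

        weights:        initial student weights for one subject sample.
        memberships:    one dict per student mapping dimension name -> category.
        control_totals: {dimension: {category: target weighted total}}.
        Assumes every control category contains at least one sampled student.
        """
        w = list(weights)
        for _ in range(n_iter):
            for dim, targets in control_totals.items():
                current = {category: 0.0 for category in targets}
                for wi, m in zip(w, memberships):
                    current[m[dim]] += wi
                # Scale weights so this dimension's weighted totals hit the targets.
                w = [wi * targets[m[dim]] / current[m[dim]]
                     for wi, m in zip(w, memberships)]
        return w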

Estimates of the sampling variance of statistics derived through the assessment effort are developed through a replication method known as “jackknife.” This process of replication involves the repeated selection of portions of the sample (replicates). A separate set of weights is produced for each replicate, using the same weighting procedures as for the full sample. The replicate weights, in turn, are used to produce estimates for each replicate (replicate estimates). The variability among the calculated replicate estimates is then used to obtain the variance of the full-sample estimate.
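
In outline, the variance estimate combines the replicate estimates as in the sketch below; the constant multiplier depends on how the replicate weights are formed, so the default of 1.0 here is illustrative rather than NAEP's operational value.

    def jackknife_variance(replicate_estimates, full_sample_estimate, multiplier=1.0):
        """Variance of a full-sample estimate from its replicate estimates (sketch).

        Each replicate estimate is computed with its own set of replicate weights;
        squared deviations from the full-sample estimate are summed and scaled by
        a design-dependent multiplier.
        """
        return multiplier * sum((r - full_sample_estimate) ** 2
                                for r in replicate_estimates)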

Additional information about the weighting procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/weighting/.

B.2. Procedures for Collection of Information

B.2.a Recruitment of Schools

Once the sample of schools is selected for the 2019 main NAEP administration and the 2020 LTT administration, the NAEP State Coordinator and NAEP field staff typically follow a standard set of procedures for securing the participation of public and nonpublic schools. The process includes:

  • sending initial contact letters to chief state school and testing officers (for 2019, see Appendix D2-15 for the letter and Appendices D2-13 and D2-14 for the included information; for the 2020 LTT letter, see Appendix D3-22);

  • sending a notice to the district superintendents of which and how many schools were selected for NAEP from their district (for 2019 see Appendix D2-3 for the letter and Appendix D2-1 for the included information; see D2-3-S-PR and D2-1-S-PR for the Spanish translations to be used in 2019 in Puerto Rico; for 2020, see Appendix D3-3);

  • sending a notice of each school’s selection for NAEP to the principal or other administrative official, along with an assessment information packet containing introductory information and materials (for 2019, see Appendix D2-4 for the letter and Appendix D2-2 for the included information; see D2-4-S-PR and D2-2-S-PR for the Spanish translations to be used in 2019 in Puerto Rico; for 2020, see Appendix D3-10);

  • sending a notice with each school’s NAEP assessment date to the principal or other administrative official, along with additional assessment information (for 2019, see Appendix D2-7 for the letter and Appendices D2-9 [public schools] and D2-8 [private schools] for the included information; see Appendix D2-7-S-PR and D2-9-S-PR for the Spanish translations to be used in 2019 in Puerto Rico; for 2020, see Appendix D3-14);

  • sending a letter to each school’s principal with instructions for assigning a school coordinator (for 2019, see Appendix D2-5; see D2-5-S-PR for the Spanish translation to be used in 2019 in Puerto Rico; for 2020, see Appendix D3-10); and

  • sending information to each school coordinator regarding his/her role (for 2019, see Appendix D2-6 for the letter and Appendices D1-5 [public schools] and D1-6 [private schools] for the brochure describing the role; see Appendices D2-6-S-PR and D1-5-S-PR for the Spanish translations to be used in 2019 in Puerto Rico; for 2020, see Appendix D3-13).

The National Indian Education Study (NIES) includes additional recruitment activities:

  • sending an endorsement letter from the Bureau of Indian Education or other agencies or organizations involved in American Indian education to the school principal to encourage participation (see Appendix D1-3);

  • sending a flyer to school principals or other administrative officials to inform them about the study and encourage participation (see Appendix D1-4); and

  • sending a fact sheet to Associate Deputy Directors, Education Program Administrators, and Education Line Officers to inform Bureau of Indian Education officials about their role in supporting their school’s participation in NAEP (see Appendix D1-2).

The High School Transcript Study (HSTS) includes additional recruitment activities:

  • sending an initial notification as part of the standard NAEP notification process (see Appendix D2-3); and

  • sending a notice to the HSTS coordinator from the NAEP State or TUDA Coordinator informing them about their participation in HSTS (see Appendix D2-18 for the letter and Appendix D2-19, D2-20, and D2-21 for the included information).

The Middle School Transcript Study (MSTS) includes additional recruitment activities:

  • sending a recruitment letter to TUDA district superintendents asking them to participate in the study (see Appendix D2-22 for the letter and Appendices D2-23 and D2-24 for the included information);

  • sending a notice to TUDAs selected to participate in the study (see Appendix D2-25);

  • calling the TUDAs to obtain information about the schools and course catalogs (see Appendix I-4); and

  • calling the TUDAs regarding the submittal of transcripts (see Appendix I-4).

Note: Appendices D1 and D2 provide the full finalized communication and recruitment materials to be used in NAEP 2019. Some communication materials will be also used in a Spanish-language version and the translated versions are included in appendices D1 and D2 as applicable. Appendix D3 provides the LTT communication and recruitment materials for 2020.

B.2.b School Coordinator Responsibilities

The school coordinators are responsible for preparing for the NAEP assessment in the school using the MyNAEP system, a secure online site that provides participating schools with a convenient way to prepare for the upcoming assessment. MyNAEP serves as the primary resource and action center throughout the assessment process. The secure MyNAEP system is used for all special studies (a revised version of the MyNAEP system will be used for the collection of transcripts in the HSTS and MSTS special studies). The site also offers school coordinators an electronic way to prepare for the assessment at their own pace. The NAEP field representative will schedule an initial call in December to review the major areas of the MyNAEP system with the school coordinator. The content of the MyNAEP system is provided in Appendix J1, the Spanish version in Appendix J2, and the HSTS and MSTS versions in Appendices J3 and J4.

The MyNAEP menu is a virtual checklist of all activities that school coordinators will need to complete throughout the school year. The following describes the different sections and activities that need to be completed, and the purpose and timeframe for each.

  • Register and Provide School Information

  • Tasks: Register for the MyNAEP website and provide school contact information and school characteristics, including student enrollment for the selected grade or age, charter school status, and important dates.

  • Purpose: Gain access to the secure MyNAEP website as the designated school coordinator and ensure that NAEP has the most up-to-date information about the school.

  • Timeline: main NAEP: August and October 2018; LTT 2020: August to November 2019.

  • Submit Student List/Sample

  • Tasks: NAEP collects a list of all students in the selected grade or age for each school. The school submits an Excel file with all students and their demographic data (see Appendix H). Note, as described in Section A.12, the school coordinator is only responsible for this task if the state coordinator has not previously submitted the student list for sampling. As such, only a portion of the school coordinators are responsible for this task.

  • Purpose: Draw a representative sample of students from the school to participate in the NAEP assessments. Ensure all students have an opportunity to be sampled.

  • Timeline: main NAEP: October and November 2018; LTT 2020: August to November 2019.

  • Review and Verify List of Students Selected for NAEP

  • Tasks: Review demographic data to make sure they are correct and add any missing demographic data. School coordinators will be asked to review and verify student information and also to indicate whether students were displaced by a natural disaster.

  • Purpose: Demographic data are used for reporting results of student groups in The Nation’s Report Card.

  • Timeline: main NAEP: December 2018 and January 2019; LTT 2020: August to November 2019.

  • Complete SD/ELL Student Information

  • Tasks: Determine how students participate in NAEP (i.e., without accommodations, with accommodations, or do not test). Provide the Individuals with Disabilities Education Act (IDEA) disability status, English proficiency, primary language, grade- or age-level performance, and accommodations, using the state-specific NAEP inclusion policies (see Appendices D1-8 and D1-9 for templates of the NAEP 2019 SD and ELL inclusion policies, which will be customized by the NAEP State Coordinators, and see D1-8-S-PR and D1-9-S-PR for their Spanish translations to be used in Puerto Rico). For the LTT 2020 SD/ELL templates, see Appendices D3-16 and D3-17.

  • Purpose: Make sure students have appropriate supports to access the NAEP assessment.

  • Timeline: main NAEP: December 2018 and January 2019; LTT 2020: September 2019 to March 2020.

  • Notify Parents

  • Tasks: Download and customize the parent notification letter (for NAEP 2019, see Appendix D2-12 [public schools], D2-11 [private schools], D2-12-S [public school Spanish Translation], and D2-11-S [private school Spanish translation] for the template of the letter; for LTT 2020, see Appendix D3-7 [public schools], Appendix D3-8 [private schools], Appendices D3-20 [public school Spanish Translation] and D3-21 [private school Spanish Translation] for the template of the letter), upload the customized letter to the system, and certify the date parents were notified.

  • Purpose: Ensure that parents/legal guardians are notified of their student’s selection to participate in NAEP, which is a requirement of the reauthorized Elementary and Secondary Education Act (ESEA).[5]

  • Timeline: main NAEP: December 2018 and January 2019; LTT 2020: September 2019 to March 2020.

  • Manage Questionnaires

  • Tasks: For the main NAEP administration only, identify respondents for school and teacher questionnaires, send respondents links to online questionnaires, and monitor completion of questionnaires. Distribute information about NAEP to teachers (see Appendix D1-7 for the English version and Appendix D1-7-S for the Spanish translation to be used in Puerto Rico).

  • Purpose: Results are used to provide contextual data from schools and teachers in The Nation’s Report Card.

  • Timeline: December and January (2019 main NAEP only).

  • Update Student List

  • Tasks: Identify any newly enrolled students since the original list of students was provided in the fall. Upload a current list of students via Excel or review the original list and add newly enrolled students.

  • Purpose: Ensures all students have an opportunity to be sampled so NAEP can assess a representative sample of students.

  • Timeline: main NAEP: January 2019; LTT 2020: December 2019 to March 2020 (note, LTT age 13 will not have a list update process as it is administered in the fall).

  • Plan for Assessment Day and Encourage Participation

  • Tasks: Determine assessment session times and locations, share cell phone policy to ensure security of NAEP items, and make a plan to encourage student participation.

  • Purpose: Ensure that the school is prepared for a successful administration of NAEP.

  • Timeline: main NAEP: December 2018 and January 2019; LTT 2020: September 2019 to March 2020.

  • Support Assessment Day Activities

  • Tasks: Print resources to notify students and teachers.

  • Purpose: Ensure students arrive at assessment location prepared and on time.

  • Timeline: One week prior to assessment date.

  • [For submission of electronic transcripts for HSTS and MSTS[6]] Submit Electronic Transcripts

    • Tasks: Download the list of sampled students from MyNAEP. Attach transcript information to the downloaded file and submit.

    • Purpose: To link NAEP students to the transcripts being collected.

    • Timeline: Summer to Fall 2019.

Before the assessment, the NAEP field representative will hold a Pre-assessment Review Call with the school coordinator to review the status of the completion of the tasks in the MyNAEP system, answer any questions, and review assessment day procedures.

After each assessment, the field staff will meet with the school coordinator for a debriefing interview. The purpose of this interview is to obtain feedback on how well the assessment went in that school, the usefulness of NAEP materials (publications, letters, etc.), preparation activities, strategies utilized for increasing participation, and any issues that were noted. The debriefing interview questions are included in Appendix E1-1 for 2019 and Appendix E2-1 for 2020. As part of the ongoing quality control of the assessment process, 25 percent of the schools will be randomly selected for an additional follow-up survey. Survey questions solicit feedback on pre-assessment, assessment, and procedural processes. The post-assessment follow-up survey is included in Appendix E1-2 for 2019 and Appendix E2-2 for 2020.

The final school coordinator responsibility occurs at the end of the school year, when he or she securely destroys any documents containing student identifying information.

B.2.c Administration Procedures

Trained NAEP field staff will set up and administer the assessment and provide all necessary equipment and assessment materials to the school, including paper booklets and pencils for the paper-and-pencil assessments; and tablets with an attached keyboard, stylus, earbuds, and, for some subjects, mouse for the digitally based assessments (DBA). Internet access is not required for DBA. NAEP field staff will pack up the equipment and leave the space as they found it.

Assessments are held in sessions of approximately 25 students; multiple sessions may be held concurrently (particularly for paper-and-pencil assessments) or two sessions may be held sequentially in the school (particularly for DBA).

The field staff use scripts and carefully timed sections to administer the paper-based assessments (PBA). Most of the instructions for DBA are provided on the tablets, from the assessment system.

B.3. Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse

Once schools within each state are selected, the chief state school officer and the NAEP State Coordinator will be asked to solicit their cooperation. Since states and school districts receiving Title I funds are required to participate in the main NAEP reading and mathematics assessments (grades 4 and 8) under the National Assessment of Educational Progress Authorization Act, NAEP response rates have improved for these assessments. Two areas that have typically had lower response rates in NAEP are grade 12 students and private schools. As such, NCES has created specialized materials targeted at these audiences:

  • The Best Practices: Strategies for Supporting Twelfth-Grade NAEP Participation (referred to as the Best Practices Guide for short) provides resources and strategies to increase twelfth-grade student motivation and participation (Appendix D1-10). For 2020 LTT, Best Practices: Strategies for Supporting High School NAEP Participation provides resources and strategies to increase age 17 students’ motivation and participation (Appendix D3-18). The Best Practices Guide is all digital and may be distributed via a flash drive. Customizable resources and templates can be downloaded directly from www.mynaep.com.

  • Videos and additional information on the NAEP website for schools, students, parents, and teachers (see http://nces.ed.gov/nationsreportcard/about/schools.aspx).

  • Additional brochures and resources targeting private schools, including Overview of NAEP for Private Schools (see Appendix D1-12), NAEP in Your Private School (see Appendix D2-8), and a webpage dedicated just to private schools (http://nces.ed.gov/nationsreportcard/about/nonpublicschools.aspx).

B.3.a. Methods to Maximize Response Rate

Efforts to maximize completion rates focus on four main areas: (1) early distribution of information and materials; (2) effective communication with school personnel; (3) efforts to encourage student participation; and (4) efforts made by field staff to avoid refusals and to convert initial refusals into cooperating schools.

Early Distribution of Information and Materials

Over the years, feedback from schools and states has indicated that notifying a school of its selection for the NAEP sample earlier rather than later helps the school plan and improves school response rates. As such, NAEP generally notifies schools of selection in May of the year prior to the assessment. In addition, to facilitate the school coordinators’ completion of the tasks associated with the administration, the MyNAEP system is available to school coordinators approximately 6–7 weeks before the administration window begins.

Effective Communication with School Staff

The participation of schools can be increased by effectively communicating information about NAEP, including what NAEP measures, the various assessment components, why it is important that schools, students, and teachers participate, and the role of the school staff. Effective communication materials from the State Coordinator and the field staff (as described in Section B.2.a) will help maximize the participation of schools. In addition, an intuitive and easy-to-use MyNAEP system (as described in Section B.2.b) will help ensure that the school coordinator’s experience is positive.

In addition, NCES may thank school staff and the principal for their participation in NAEP (see Appendix D2-17).

Encouraging Student Participation

Previous feedback from school administrators has shown that students respond more positively to the assessment when they know the assessment has the support of the school administration. Therefore, the field staff will encourage the school coordinator to make efforts to encourage students to do their best, including having the principal introduce the assessment. In addition, field staff will suggest to the school coordinator that grade 8 and 12 schools may want to issue community service credits for participating. Given that grade 12 student participation can be particularly challenging, NAEP has developed a Best Practices Guide to encourage grade 12 participation (Appendix D1-10), which is shared with sampled high schools.

Avoiding Refusals and Converting Initial School Refusals

Field staff will be trained in methods to maximize school participation, which will include being flexible in the assessment scheduling, following up with the school coordinators, and scheduling in-person preparation meetings, at the school coordinator’s request.

B.3.b. Statistical Approaches to Nonresponse

Not all of the students in the sample will respond. Some will be unavailable during the sample time period because of absenteeism or other reasons. If a student decides not to participate, the action will be recorded, but no steps will be taken to obtain participation. The NAEP response rates follow AAPOR (American Association for Public Opinion Research) guidelines. Response rates, in percentages, from the 2015 NAEP assessment are shown below. Previous years’ response rates can be found in the technical documentation (see for example, https://nces.ed.gov/nationsreportcard/tdw/sample_design/2012/2012_samp_econ_resp_school.aspx).

 

                            Grade 4    Grade 8    Grade 12
Student Response Rates         94         92         78
School Response Rates
   Public Schools             100         99         91
   Private Schools             61         56         57

Note: The public school response rate for grade 4 was rounded to 100, but was actually slightly lower (i.e., 99.7).


Compared to 2013, the rates have dropped for students at grade 12 (from 84 to 78) and for private schools at grades 4 and 8 (from 71 and 70 to 61 and 56, respectively). Grade 12 participation remains voluntary at the state, district, school, and student levels.

In the same vein, all private school participation in NAEP is voluntary. Anecdotal information from private schools suggests that some of the decline can be attributed to growing anti-testing sentiments, anti-government sentiments, reluctance to sacrifice instructional time, and limited school time and/or resources. Anti-technology sentiments were also beginning to play a part in refusals in 2015, as we began piloting digitally based assessments. We track reasons for school nonparticipation, and the most typical reasons for private schools are “no contact made” (i.e., school would not return our calls) or “definitive no” with no reason provided. As such, it is sometimes difficult to ascertain the specific reason for refusal in a large percentage of schools.

Many efforts are underway or intensifying to help boost twelfth-grade (and age 17) student and private school participation. For example, the Best Practices Guide (Appendices D1-10 and D3-18) is reviewed prior to each grade 12 (and age 17) assessment so that it includes strategies to encourage grade 12 (and age 17) participation. In addition, we are working to increase the engagement of private school organization leaders in recruitment efforts and requesting customized endorsement letters from these organizations (Appendix D2-10). We have also expanded outreach efforts to schools to promote the use of NAEP data tools and to highlight the value of NAEP data to private schools. Furthermore, efforts are underway to develop a customized dashboard for private schools on The Nation’s Report Card site.

NCES and the National Assessment Governing Board have established participation rate standards that states and jurisdictions are required to meet in order to have their results published. Beginning in 2003, if a state’s school response rate is below 85 percent, the results will not be published by NAEP, regardless of the response rate after substitution (see https://nces.ed.gov/nationsreportcard/about/participrates.aspx and https://www.nagb.org/content/nagb/assets/documents/policies/samplingpolicy1.pdf).

B.4. Pilot Testing and Data Uses

Pilot testing of cognitive and non-cognitive items is carried out in all subject areas. The purpose of pilot testing is to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because the items are administered to a small, nationally representative sample of students and data are gathered about performance that crosses the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration.

Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB #1850-0803). All non-cognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, in order to test new item types or formats, or challenging content. In addition, usability testing is conducted on new technologies and technology-based platforms and instruments.

The findings and recommendations from the NAEP 2017 MSTS Pilot Study (OMB# 1850-0803 v.180) are provided in Appendix L of this submission.

B.5. Consultants on NAEP Design

ETS, Fulcrum, Westat, and NCES staff have collaborated on aspects of the design. The primary persons responsible from NCES are: Peggy Carr, Patricia Etienne, Holly Spurlock, and William Tirre; from ETS: Jay Campbell and Amy Dresher; from Westat: Keith Rust and Greg Binzer; and from Fulcrum: Scott Ferguson. In addition, the NAEP Design and Analysis Committee, the NAEP Validity Studies Panel, and the NAEP Quality Assurance Technical Panel members (see Appendices A-1 through A-3) have also contributed to NAEP designs on an on-going basis.

[1] In some instances, students eligible for LTT may be a year younger or a year older depending on their birthday and on when the LTT assessment is administered.

[2] Participating states vary depending on the subject and grade assessed, but may include the 50 states, the District of Columbia, the Department of Defense Education Activity, and (for mathematics assessments only) Puerto Rico.

[3] NAEP IRT scaling requires a minimum sample size of 1,500–2,000 students per item in order to estimate stable item parameters. Therefore, national assessments with larger item pools have larger samples.

[4] NAEP IRT scaling is conducted for most pilot assessments, requiring a minimum of 1,500–2,000 students per item in order to estimate stable item parameters. Therefore, pilot assessments with larger item pools have larger samples.

[5] Please note that parents/legal guardians are required to receive notification of student participation, but NAEP does not require explicit parental consent (by law, parents/guardians of students selected to participate in NAEP must be notified in writing of their child’s selection prior to the administration of the assessment).

[6] The HSTS transcripts will be submitted by either a school coordinator or by state personnel. The MSTS transcripts will be submitted by TUDA personnel.





