

National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



The National Assessment of Educational Progress (NAEP) Oral Reading Fluency Pilot Study 2017



OMB# 1850-0803 v.174







September 2016





Table of Contents

1. Submittal-Related Information
2. Background and Study Rationale
3. Recruitment and Sample Characteristics
4. Study Design and Data Collection
5. Consultations Outside the Agency
6. Justification for Sensitive Questions
7. Paying Respondents
8. Assurance of Confidentiality
9. Estimate of Hourly Burden
10. Cost to Federal Government
11. Project Schedule

Appendix A: School Contact Script

Appendix B: Parent/Guardian Notification Letter

Appendix C: School Debriefing Script

  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803) that provides for NCES to conduct various procedures (such as field tests, cognitive interviews, usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences, in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the various subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

Oral Reading Fluency (ORF) was first administered by NCES in 1992¹ and later in 2002² as a measure of basic reading skills (i.e., decoding, word recognition, and fluency). A key finding was that over a quarter (26%) of NAEP fourth-graders had some degree of difficulty with basic reading skills, and 7% had considerable difficulty, none of it related to disability. Currently, NAEP yields no data that are sensitive to and descriptive of the lowest levels of reading skill, even though the proportion of NAEP students scoring below the Basic level in reading is large (31% in 2015). NAEP cannot say what percentage of current NAEP students have difficulty reading (i.e., how many still struggle with decoding, word recognition, and fluency).

A special study is planned for 2017 using methodology similar to the ORF study in the 2002 NAEP. This study will assess the capacity of ORF to provide new information ahead of a potential larger ORF administration in 2018 with a sub-sample of the nationally representative NAEP sample. Doing so would increase the generalizability of the findings to a larger population.

In this study, a sample of fourth-grade students will take the NAEP reading assessment followed by an ORF module and survey questions. The ORF module will consist of a set of materials that students will read aloud in English after completing the NAEP Reading Assessment. Each student will receive the same set of materials. Automated scoring will count the number of words read correctly per minute using speech recognition software. Extensive research and development³ has ensured that words read correctly per minute can be measured reliably and that speakers of non-standard varieties of English are scored fairly.
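For illustration, the sketch below (Python, standard library only) shows one way a words-correct-per-minute value could be derived once the speech recognizer has produced a transcript of a student's reading. The tokenization and difflib-based alignment are assumptions made for this example; they are not the actual algorithm used by the ORF tool.

```python
# Illustrative sketch only: derive words correct per minute (wcpm) from a
# recognized transcript and the reference passage. The alignment approach
# (difflib.SequenceMatcher) is an assumption, not the ORF tool's method.
import difflib
import re


def tokenize(text):
    """Lowercase and strip punctuation so 'Hook,' and 'hook' compare as equal."""
    return re.findall(r"[a-z']+", text.lower())


def words_correct_per_minute(reference, transcript, seconds):
    """Count reference words matched in the transcript, scaled to one minute."""
    ref, hyp = tokenize(reference), tokenize(transcript)
    matcher = difflib.SequenceMatcher(a=ref, b=hyp)
    correct = sum(block.size for block in matcher.get_matching_blocks())
    return correct / (seconds / 60.0)


if __name__ == "__main__":
    passage = "The small boat drifted slowly toward the rocky shore."
    recognized = "the small boat drifted toward the rocky shore"
    print(round(words_correct_per_minute(passage, recognized, seconds=6.0), 1))  # 80.0
```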

At the end of the module, students will respond orally to open-ended survey questions about their experience with reading aloud and with taking the assessment. The questions are open-ended to increase the likelihood of low-performing readers providing responses since they will be able to respond extemporaneously without needing to read additional text to select a response. Students will respond orally, and their responses will be recorded by the ORF tool, hand-coded according to a predetermined scheme, and analyzed.

The goals of the study are to gather data on student performance and to inform decision-making for a potential larger ORF study in 2018. The ultimate goals of measuring ORF in NAEP are as follows:

  • to improve interpretation of NAEP data and enrich findings

  • to provide useful information regarding common profiles of struggling, low fluency readers and encourage development of instructional practices to support struggling readers

  • to improve NCES assessment instruments by using technology-based instruments with capabilities for automated administration and scoring

Results from this study will not be publicly released, but they will be made available to NCES and used to inform future development of NAEP assessments, including a possible larger ORF study in 2018.

Volume I of this submittal contains descriptions as well as design, sampling, burden, cost, and schedule information for the study. Volume II contains the student debriefing questions, while the appendices contain sample scripts and notification documents.

  3. Recruitment and Sample Characteristics

The study will consist of approximately 20 schools with 10 fourth-grade students from each school for a total of 200 fourth-grade students. States will be asked to volunteer to participate in the study after they have been notified of their selection for international studies such as TIMSS/PIRLS so that states can avoid schools that are included in these international studies.

NAEP State Coordinators (see section 5) will leverage their relationships within selected states to contact schools and identify those willing to participate in the study. The NAEP State Coordinators will forward the contact information for participating schools to Westat, the NAEP data collection contractor (see section 5). Westat field administration staff will contact each school to make arrangements for students from the school to participate (see appendix A for a sample school contact script).

The sample will target participation from high-poverty schools with a minimum of 75% of students eligible for the National School Lunch Program (NSLP). Approximately 10 students from each school will be randomly selected to participate, as illustrated in the sketch below. School administrators will be asked to provide participating students' demographic information, such as gender, race/ethnicity, English Language Learner status, and free and reduced-price school lunch eligibility. A sample parental notification letter (see appendix B) will be provided to the schools for their use to notify parents or guardians of students in the study. The school principal may edit it; however, the information regarding confidentiality and the appropriate law reference will remain unchanged.
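As a minimal illustration of the random selection described above, the sketch below draws roughly ten students from a school roster. The roster format and the use of Python's random.sample are assumptions for the example only; they do not describe the NAEP sampling procedures.

```python
# Illustrative sketch: simple random selection of about ten students per school.
import random


def select_students(roster, n=10, seed=None):
    """Return a simple random sample of up to n students from a roster list."""
    rng = random.Random(seed)
    return rng.sample(roster, min(n, len(roster)))


if __name__ == "__main__":
    roster = ["Student {:02d}".format(i) for i in range(1, 26)]
    print(select_students(roster, n=10, seed=42))
```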

  4. Study Design and Data Collection

Prior to the study, Westat field administration staff will contact cooperating schools to confirm student sampling and make logistical arrangements (see Appendix A for the contact script). Westat field administration staff who are familiar with technology-based administration will conduct the study. They will bring all necessary materials, including the touch-enabled devices, headphones, and microphones to the schools on the assessment days.

Students will be provided a tutorial on the eNAEP test delivery system and then asked to complete two 30-minute cognitive blocks of NAEP reading and one 15-minute ORF module (including 2-3 minutes to answer open-ended survey questions about student experiences). The cognitive reading blocks will consist of operational grade 4 NAEP reading content. The 15-minute ORF module presents two sentences to repeat aloud (to determine which students have weak speaking skills), a 65-word narration to re-tell aloud (to ensure that students focus on comprehension when they read aloud), a list of words (e.g., hook) and non-words (e.g., bo), and four passages to read aloud (to determine oral reading fluency). After each read-aloud ORF passage, the student retells the passage from memory. Last, the student will orally respond to five survey questions. Students' oral responses will be recorded by the ORF tool.
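The schematic below summarizes the module structure described above as a simple configuration object. The counts and timings come from this section; the field names and example items are placeholders for illustration, not the actual module content.

```python
# Schematic summary of the 15-minute ORF module; field names are placeholders.
ORF_MODULE = {
    "sentences_to_repeat": 2,         # screens for weak speaking skills
    "retell_narration_words": 65,     # re-told aloud to keep the focus on comprehension
    "word_list_example": "hook",      # real words read aloud
    "nonword_list_example": "bo",     # non-words read aloud
    "read_aloud_passages": 4,         # each followed by an oral retell from memory
    "open_ended_survey_questions": 5,
    "total_minutes": 15,              # includes 2-3 minutes for the survey questions
}
```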

The study will require approximately 90 minutes: 15 minutes for getting students situated and logged on to the NAEP touch-enabled devices and 75 minutes of assessment time (including 2-3 minutes to respond to open-ended survey questions about the ORF assessment).⁴

In addition, after each assessment, the field administration staff may conduct a debriefing interview with the school coordinator. The purpose of this interview is to obtain feedback on how well the assessment went in that school and any issues that were noted. (See appendix C for the school debriefing script.)

Data will be analyzed with the primary goal of evaluating whether the stimuli, tasks, operation of the module, and derived variables appear to be performing within the range of expectations from previous research. A secondary goal is to explore the range of score metrics and derived variables that might be of interest in future research. For each ORF fluency task, variables will be created for accuracy and rate; these variables can be combined into a single derived variable such as words correct per minute (wcpm). For the oral retell tasks, the variable may be the count of exact words or paraphrases of the original passage. For the survey (usability) questions, responses will be categorized to identify module features or content that students find interesting or difficult to use or do.
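As one concrete example of a retell-based derived variable, the sketch below counts distinct passage words that reappear in a student's oral retell. Paraphrase matching would require a synonym resource and is omitted; the approach shown is an assumption for illustration, not the study's scoring rule.

```python
# Illustrative sketch: exact-word overlap between a passage and its oral retell.
import re


def _words(text):
    """Lowercase word set with punctuation stripped."""
    return set(re.findall(r"[a-z']+", text.lower()))


def retell_word_overlap(passage, retell):
    """Count distinct passage words that also occur in the retell transcript."""
    return len(_words(passage) & _words(retell))


if __name__ == "__main__":
    passage = "The fox found a hidden path through the tall grass."
    retell = "A fox walked through tall grass on a hidden path."
    print(retell_word_overlap(passage, retell))  # 7
```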

  5. Consultations Outside the Agency

NAEP State Coordinators serve as the liaisons between their state education agencies and NAEP, coordinating NAEP activities in their states. As previously noted, NAEP State Coordinators will work with schools within their states to identify participating schools. Westat is the Sampling and Data Collection (SDC) contractor for NAEP and will administer the 2017 ORF special study. American Institutes for Research (AIR) is coordinating the development of the ORF tool and its integration into the eNAEP test delivery platform for the 2017 ORF special study. Analytic Measures Inc. (AMI), a subcontractor to AIR, is developing the ORF tool to be used in the study. KADA Incorporated, also a subcontractor to AIR, will integrate the ORF tool into the eNAEP test delivery platform and will be responsible for the scoring and analysis of the ORF special study results.

  6. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  7. Paying Respondents

Schools will receive a $50 gift card to encourage participation and to thank them for their time and effort. The study will take place during regular school hours, and thus there will not be any monetary incentive for the student participants. However, students will be permitted to keep the earbuds or headphones used during the study.

  8. Assurance of Confidentiality

The study will not collect any personally identifiable information. Prior to the start of the study, participants will be notified that their participation is voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)].

Written notification will be sent to parents/guardians of students before the study is conducted (see appendix B). Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form.
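A minimal sketch of how such study-internal IDs could be generated is shown below; a random, UUID-based identifier carries no information about the student and therefore cannot be traced back to a name. The ORF- prefix and the use of uuid4 are assumptions for the example, not the study's actual ID scheme.

```python
# Illustrative sketch: random, non-identifying participant IDs for file management.
import uuid


def assign_participant_ids(n_participants):
    """Generate one random ID per participant; IDs encode nothing about the student."""
    return ["ORF-" + uuid.uuid4().hex[:8] for _ in range(n_participants)]


if __name__ == "__main__":
    print(assign_participant_ids(3))
```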

  9. Estimate of Hourly Burden

School administrators and personnel will provide pre-assessment information and help with the logistics of student and room coordination and other related duties. The school administrator burden is estimated at 20 minutes for the pre-assessment contact. The school personnel burden is estimated at 40 minutes for administration support and 10 minutes for the post-assessment debriefing interview.

Parents of participating students will receive a letter explaining the study, for which the parent’s burden is estimated at 3 minutes. An additional burden (15 minutes) is estimated for a small portion of parents (up to 20) who may write to refuse approval for their child or may research information related to the study.

Approximately 200 students from 20 schools will participate in the study. Student burden is calculated based on 15 minutes for setup and tutorial and 15 minutes for the ORF module (including 2-3 minutes to respond to the open-ended survey questions about student experience) from a total study time of 90 minutes.⁵

Estimated hourly burden for the participants is described in Table 1, below.

Table 1. Estimate of Hourly Burden

Person               | Task                                            | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours)
School administrator | Initial Contact by Westat                       | 20                    | 20                  | 0.33                 | 6.67
School personnel     | Scheduling, Logistics, and Debriefing Interview | 20                    | 20                  | 0.67                 | 13.3
Parents              | Initial Notification                            | 220                   | 220                 | 0.05                 | 11
Parents*             | Refusals or Additional Research                 | 20*                   | 20                  | 0.25                 | 5
Students             | ORF Special Study                               | 200                   | 200                 | 0.5                  | 100
Total                |                                                 | 460                   | 480                 | NA                   | 136

* These parents are a subset of those who were initially notified.
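The short Python check below reproduces the totals in Table 1 from the per-row figures; it is included only to make the arithmetic explicit.

```python
# Worked check of Table 1: (respondents, responses, hours per respondent) per row.
rows = {
    "School administrator": (20, 20, 0.33),
    "School personnel": (20, 20, 0.67),
    "Parents (notification)": (220, 220, 0.05),
    "Parents (refusals)": (20, 20, 0.25),  # subset of the notified parents
    "Students": (200, 200, 0.50),
}

total_burden = sum(n * hours for n, _, hours in rows.values())
total_responses = sum(responses for _, responses, _ in rows.values())
# Refusing parents are already counted among the notified parents.
total_respondents = sum(n for n, _, _ in rows.values()) - rows["Parents (refusals)"][0]

print(total_respondents, total_responses, round(total_burden))  # 460 480 136
```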

  10. Cost to Federal Government

Table 2 (below) provides the overall project cost estimates.

Table 2: Estimate of Costs

Activity                                                                | Provider | Estimated Cost
Design, preparation of tasks, scoring and analysis of data              | KADA     | $80,000
Recruitment and data collection activities                              | Westat   | $70,000
Development and support of technology-based system delivery activities  | AMI      | $90,000
Project coordination                                                    | AIR      | $30,000
Total                                                                   |          | $270,000

  11. Project Schedule

Table 3 (below) provides the overall schedule.

Table 3: Schedule

Date                       | Event
November 2016 – March 2017 | Task and System Development and Tablet Preparation
October – November 2016    | Recruitment
April – May 2017           | Data Collection
May – August 2017          | Scoring and Data Analysis



1 Pinnell, G.S., Pikulski, J.J., Wixson, K.K., Campbell, J.R., Gough, P.B., & Beatty, A.S. (1995). Listening to Children Read Aloud: Data from NAEP’s Integrated Reading Performance Record (IRPR) at Grade 4 (NAEP-23-FR-04; NCES-95-726).

2 Daane, M.C., Campbell, J.R., Grigg, W.S., Goodman, M.J., & Oranje, A. (2005). Fourth-grade students reading aloud: NAEP 2002 special study of oral reading (NCES 2006-469). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, US Department of Education.

3 Baer, J., Kutner, M., Sabatini, J., & White, S. (2009). Basic Reading Skills and the Literacy of America's Least Literate Adults: Results from the 2003 National Assessment of Adult Literacy (NAAL) Supplemental Studies (NCES 2009-481). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, US Department of Education.

4 Communications to schools and parents indicate 100 minutes to allow for transition time to and from the study classroom.

5 Similar to main NAEP assessments, the cognitive item portions of the study are not included in the burden calculation.
