National Center for Education Statistics

National Assessment of Educational Progress





Volume I

Supporting Statement



Request for Clearance for

NAEP Read-Aloud Accommodations Study 2013



OMB# 1850-0803 v.71

(NCES Generic Clearance for Cognitive, Pilot, and Field Test Studies)










August 16, 2012

Revised September 26, 2012

Volume I: Supporting Statement


  1. Submittal-Related Information

This material is being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests, surveys, and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.


  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the various subject areas and to collect questionnaire data to provide context for the reporting and interpretation of assessment results.1

The NAEP program endeavors to assess all students selected as a part of its sampling process. In all NAEP assessments, accommodations are provided as necessary for some of the students with disabilities (SD) and/or English language learners (ELL). The National Assessment Governing Board, which sets policy for NAEP, and NCES have been exploring ways to ensure that NAEP continues to appropriately include as many students as possible and to do so in a consistent manner for all jurisdictions assessed and reported. Use of appropriate accommodations provides more accessible assessments for SDs and ELLs and, thus, helps improve the inclusion rate for these students.

Currently, read-aloud accommodations are allowed for most NAEP assessments but, notably, not for the reading assessment. To gain information on the effectiveness and validity of offering read-aloud accommodations for reading assessments, NCES wishes to conduct a study of read-aloud accommodations for the NAEP reading assessment, as described in this submittal.

The goal of this study is to examine the effectiveness, validity, and differential impact of read-aloud accommodations for ELLs and SDs in the NAEP reading assessment. The results of this study will inform NCES inclusion and accommodations policies and practices for NAEP. In addition, the results will contribute to the overall body of research on read-aloud accommodations for reading assessments, which are currently being evaluated to inform decisions in NAEP, multistate consortia assessments, and other state assessments. Thus, the results may bring more consistency to accommodations policies between NAEP and state assessments. In addition, the methodology used in this study for examining the effectiveness and validity of read-aloud accommodations could serve as a model for evaluating other accommodations for ELLs and SDs in NAEP.


  3. Study Design and Context

Research Design

In this study, the effectiveness, validity, and differential impact of read-aloud accommodations will be examined. The outcome of this study will shed light on the following research questions:

  1. Do read-aloud accommodations (read-aloud of test directions, reading passages, test questions, and answer choices) make reading assessments more accessible to ELLs and to SDs?

  2. Do read-aloud accommodations (as in question 1, but without read-aloud of the reading passages) make reading assessments more accessible to ELLs and to SDs?

  3. Do read-aloud accommodations alter the reading construct for ELLs and for SDs?

  4. Do read-aloud accommodations differentially impact the performance of ELLs and SDs with different background characteristics?

Figure 1 presents the design for examining the effectiveness and validity of read-aloud accommodations. As Figure 1 shows, there are three conditions: (1) read-aloud of everything2 (test directions, reading passages, test questions, and answer choices); (2) read-aloud of everything except the reading passages; and (3) a standard condition with no read-aloud provision. Intact classrooms of students will be assigned to the three conditions.


Figure 1. Design of Read-Aloud Accommodations Study3


                                                      Student Categories
Conditions                                            ELL     SD      Non-ELL / Non-SD
Read-aloud everything                                 G1      G2      G3
Read-aloud directions, test questions, and
  answer choices                                      G4      G5      G6
Standard condition (no accommodations)                G7      G8      G9

Note: G = Group


Content of Assessment

The focus of this study will be on the NAEP reading assessment. The NAEP reading framework emphasizes students’ reading comprehension and students’ ability to apply vocabulary knowledge. The reading test will consist of two 25-minute reading blocks, each timed separately, along with a student background questionnaire. One Literary block and one Information block will be used, and the blocks will be counterbalanced, thus yielding two booklets.

It will take up to 135 minutes to administer these two blocks, along with the student background questionnaire, for each assessment condition.

Population and Sample

The study plans to use three high-incidence categories of disabilities to achieve a large enough sample size and enough power to detect meaningful differences. These categories are: (1) specific learning disability, (2) speech/language impairment, and (3) intellectual disability. While the study will not be administered by category of disability, analyses will be performed for each of these categories separately.

The population for this study consists of students in grades 4 and 8. The sample will be selected from schools in California with large numbers of ELLs and/or SDs. California was chosen because its diverse population provides a greater number of schools that meet the requirements for the proportion of ELLs and SDs. Schools participating in the main NAEP administration in 2013 will be excluded from the sample.

Sixteen schools at each grade are needed, and three intact classrooms will be selected from each of the 16 schools per grade. Given that grade 8 students typically change classes throughout the day, science classes will be used in this study4. Classes with large numbers of ELLs and/or SDs will be targeted so that there is sufficient representation of ELLs and SDs. Based on an average class size of 25 students (16 schools × 3 classrooms × 25 students), approximately 1,200 students per grade will participate in the study. This sample size will be sufficient for examining the effectiveness, validity, and differential impact of read-aloud accommodations in reading.

A power analysis was conducted to determine the sample size required for the read-aloud accommodations for the three groups of students in this study: (1) ELLs, (2) SDs, and (3) non-ELLs/non-SDs. The sample size was calculated based on a minimum detectable effect size (MDES) of 0.35 standard deviation, a Type I error rate of 0.05 (two-tailed), and statistical power of 0.80 (80%). Based on these parameters, the required sample size was calculated to be 130 subjects for each of the three subgroups of students with disabilities, (1) specific learning disability, (2) speech/language impairment, and (3) intellectual disability, as well as for the other subgroups of students (ELLs and non-ELLs/non-SDs)5. While obtaining this sample size for each of the three groups of students with disabilities might be challenging, focusing recruitment efforts on schools and classes with larger numbers of students with disabilities in the three categories may yield the required sample size. However, if these minimum numbers are not secured, we have the option of combining students in the three categories of disabilities to obtain the sample size required for analysis. For ELLs, we will have enough power to perform analyses for some subgroups of ELLs (such as ELLs at different levels of English proficiency). For non-ELLs/non-SDs, the sample will be large enough to allow analyses by levels of some background variables, such as gender, socioeconomic status (SES; as indicated by participation in the National School Lunch Program), and ethnicity.
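
For reference, the figure of roughly 130 per subgroup is consistent with the standard two-sample normal-approximation formula, n per group ≈ 2(z1−α/2 + z1−β)² / MDES². The following is a minimal sketch of that approximation under the stated parameters; it illustrates the arithmetic only and is not the procedure or software used for the study's power analysis, so the reported figures may reflect additional rounding or adjustments.

    # Minimal sketch of a two-sample power calculation using the normal
    # approximation n_per_group ~= 2 * (z_{1-alpha/2} + z_{1-beta})^2 / MDES^2.
    # Illustrative only; the study's reported figures (130 per subgroup, 110
    # under a one-tailed test) may reflect additional adjustments.
    import math
    from statistics import NormalDist

    def n_per_group(mdes: float, alpha: float = 0.05, power: float = 0.80,
                    two_tailed: bool = True) -> int:
        """Approximate required sample size per group for a two-group mean comparison."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2) if two_tailed else z.inv_cdf(1 - alpha)
        z_beta = z.inv_cdf(power)
        return math.ceil(2 * (z_alpha + z_beta) ** 2 / mdes ** 2)

    print(n_per_group(0.35))                    # approximately 129 per group (two-tailed)
    print(n_per_group(0.35, two_tailed=False))  # approximately 101 per group (one-tailed)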


  4. Data Collection Process

Data collection covers three main activities: recruitment, preassessment, and assessment. Westat, the NAEP sampling and data collection contractor, will be responsible for the data collection activities for the study.

Recruitment Activities

The California NAEP State Coordinator (see section 6) will leverage her relationship within the state to inform state and district representatives about the study, request their cooperation, and let them know that a representative from Westat will be contacting them to discuss the study in more detail. ARDAC will conduct internet searches to help identify potentially eligible schools with sufficient SD and ELL student representation. Westat will then follow up with principals of potentially eligible schools (see appendix I) to collect information to identify schools that meet the classroom SD and ELL study requirements.

Next, Westat representatives will contact the principals of schools identified as eligible to discuss the study details and obtain the schools’ cooperation, using the School Recruitment Checklist (see appendix A) to guide the discussion. Once cooperation is obtained, the Westat representatives will schedule the assessment date and collect the names and contact information of a school coordinator and the teachers of eligible classrooms. Those teachers and school coordinators will receive e-mail notifications regarding the study (see appendices B and C, respectively), along with a description of the study (see appendix D).

Preassessment Activities

Prior to the assessments, the teachers will be sent a questionnaire (provided in Volume II) to complete and return before the assessment day (see appendix E for the e-mail instructions to the teachers).

In addition, the school coordinators will be sent specific instructions and class rosters (see appendices F and G) to complete. The roster instructions will ask the school coordinators to create a list of the students in a particular class and assign each student an ID number. The school coordinators will then be instructed to record only the ID number on the roster, along with the following information:

  • Disability status

  • English language learner status

  • Accommodation(s) received on state assessment

  • National School Lunch Program (NSLP) status

  • Most recent state assessment score in math

  • Most recent state assessment score in reading

  • Most recent English language proficiency score

  • Gender

  • Race/Ethnicity

Finally, the school coordinators will be instructed to return the class rosters and keep the list of student names and ID numbers for assessment day.
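
For illustration only, a de-identified roster record containing the fields listed above might be represented as follows. The field names are hypothetical; the actual roster layout and instructions are those provided in appendices F and G.

    # Illustrative sketch of a de-identified class roster record based on the
    # fields listed above. Field names are hypothetical; appendices F and G
    # define the actual roster layout. Student names never appear in the record.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class RosterRecord:
        student_id: str                              # school-assigned ID number
        disability_status: Optional[str] = None      # e.g., specific learning disability
        ell_status: Optional[bool] = None            # English language learner status
        state_accommodations: List[str] = field(default_factory=list)  # accommodations on state assessment
        nslp_status: Optional[str] = None            # National School Lunch Program status
        state_math_score: Optional[float] = None     # most recent state assessment score, math
        state_reading_score: Optional[float] = None  # most recent state assessment score, reading
        english_proficiency_score: Optional[float] = None
        gender: Optional[str] = None
        race_ethnicity: Optional[str] = None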

Using the class rosters, Westat staff will prepare the assessment booklets by recording the school ID, session ID, and student ID numbers on the cover. The Westat team will bring the prepared booklets and a copy of the class rosters to the school on assessment day.

Assessment Activities

Prior to the beginning of the sessions, the Westat team of test administrators will collect the lists of student names and ID numbers from the school coordinators and record the student names on the blank removable labels on the corresponding booklets.

Each student will receive the booklet with his/her name and corresponding ID number. When all booklets have been distributed, the administrators will instruct the students to remove the label with their name on it and put it on the upper corner of their desks. Before the end of the assessment, the administrators will collect the labels with student names and destroy them before leaving the school, thus ensuring that student names are not associated with their completed assessment booklets. Upon conclusion of the assessment, the administrators will collect the booklets and prepare them for shipment to Pearson, the NAEP Materials, Distribution, Processing and Scoring contractor, for processing. In addition, administrators will provide feedback regarding the individual administrations, indicating any issues, difficulties, or concerns they had when administering the read-aloud accommodations.


  5. Additional Study Aspects

Sessions

As stated earlier, the study will be conducted in intact classrooms. Each condition will be conducted in a separate classroom, or session. Students from the three categories (ELL, SD, and non-ELL/non-SD) will be interspersed across the three conditions as part of the natural interspersion across classrooms.

Accommodations

Given that reading text aloud takes longer than allowing students to read the text themselves, students will, by default, be given extended time (time and a half). This accommodation will be given to all students, regardless of condition, so that the results are comparable. However, in order to maximize comparability of the groups within the study, no other accommodations will be provided. The three intact classrooms within each school will be randomly assigned to the conditions; thus, we cannot guarantee which condition a student will be in.
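
A minimal sketch of this within-school random assignment appears below. It assumes exactly three eligible intact classrooms per school and uses hypothetical school and classroom IDs; it illustrates the assignment logic only and is not Westat's actual procedure.

    # Minimal sketch: randomly assign the three intact classrooms in each school
    # to the three study conditions. Assumes exactly three eligible classrooms
    # per school; school and classroom IDs are hypothetical.
    import random
    from typing import Dict, List

    CONDITIONS = [
        "read-aloud everything",
        "read-aloud directions, test questions, and answer choices",
        "standard condition (no read-aloud)",
    ]

    def assign_classrooms(classrooms_by_school: Dict[str, List[str]],
                          seed: int = 2013) -> Dict[str, str]:
        """Return a mapping from classroom ID to assigned condition."""
        rng = random.Random(seed)
        assignment: Dict[str, str] = {}
        for school, classrooms in classrooms_by_school.items():
            shuffled = list(classrooms)
            rng.shuffle(shuffled)
            for classroom, condition in zip(shuffled, CONDITIONS):
                assignment[classroom] = condition
        return assignment

    # Example with hypothetical IDs
    print(assign_classrooms({"school_01": ["class_A", "class_B", "class_C"]}))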

Background Questions

The 2013 NAEP operational core and reading-specific background questions will be administered to students as part of the study. In addition, a brief section with follow-up accommodations questions will be administered to students (varying per the condition assigned). Note that at grade 4, the core background questions will be read aloud to all students, regardless of condition, as is standard NAEP practice. The reading-specific and follow-up accommodations questions will not be read aloud for any condition. See Volume II for the complete set of background questions (i.e., the core, reading-specific, and accommodations questions) that will be administered.

Class Roster Information

Additional cognitive and noncognitive data for the selected students will be provided by the school coordinators as part of the class roster described in section 4. These data will be needed to judge the possible initial differences between the accommodated and non-accommodated groups and to examine the possibility of differential impact of accommodations. The data will also be used to examine the criterion-related validity of the accommodated outcomes.
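
As one illustration of how these roster data could be used (an assumption about the analysis approach, not ARDAC's specified plan), baseline equivalence could be checked by comparing mean prior state reading scores across conditions, and criterion-related validity could be examined by correlating study reading scores with prior state reading scores within each condition:

    # Hedged sketch, not ARDAC's specified analysis plan: (1) compare mean prior
    # state reading scores across conditions as a rough baseline-equivalence
    # check, and (2) correlate study scores with prior state reading scores
    # within each condition as a simple criterion-related validity check.
    from collections import defaultdict
    from statistics import mean, correlation  # statistics.correlation requires Python 3.10+
    from typing import Dict, List, Tuple

    def baseline_means(records: List[Tuple[str, float]]) -> Dict[str, float]:
        """records are (condition, prior_state_reading_score) pairs."""
        by_condition: Dict[str, List[float]] = defaultdict(list)
        for condition, prior_score in records:
            by_condition[condition].append(prior_score)
        return {cond: mean(scores) for cond, scores in by_condition.items()}

    def criterion_correlations(records: List[Tuple[str, float, float]]) -> Dict[str, float]:
        """records are (condition, study_score, prior_state_reading_score) triples."""
        by_condition: Dict[str, List[Tuple[float, float]]] = defaultdict(list)
        for condition, study_score, prior_score in records:
            by_condition[condition].append((study_score, prior_score))
        return {cond: correlation([s for s, _ in pairs], [p for _, p in pairs])
                for cond, pairs in by_condition.items()}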

Analysis and Reporting

ARDAC will analyze the data and prepare a report for NCES. The report will include an introduction, a comprehensive literature review, the design of the study, the statistical design, possible limitations of the study, results of the analyses related to each of the study questions/hypotheses, and a discussion of the findings.



  6. Consultations Outside the Agency

Advance Research & Data Analyses Center (ARDAC) is an independent research center established in 1995 to conduct educational research and data analyses at the national, state, and district levels. ARDAC’s mission is to provide research and technical expertise that will inform and enhance state educational assessment systems with their existing instructional practices, assessments, and curricula. In addition to extensive experience in research on student instruction, assessment, curricula, and program evaluation, ARDAC’s staff and consultants have knowledge of and expertise with federal education programs and initiatives.

Dr. Jamal Abedi, who will lead the study, is a professor at the School of Education at the University of California, Davis, and a research partner at the National Center for Research on Evaluation, Standards, and Student Testing (CRESST). Dr. Abedi’s expertise includes psychometrics and test and scale development, application of latent-variable modeling in assessing validity and reliability of performance-based assessments, studies on the validity of assessment and accommodations for ELLs, and measurement of creativity. Dr. Fereshteh Hejri, who will be the Research Team Coordinator, is a senior researcher and director of ARDAC, where she has served as lead researcher on such large-scale research projects as the Enhanced Assessment studies; NAEP Secondary Analyses; California Health Interview Survey (CHIS), Department of Public Health at UCLA; Survey of Bank of America’s grant recipients; and Economic America’s Approach in Teaching Consumer Economics.

The NAEP State Coordinator serves as the liaison between the state education agency and NAEP, coordinating NAEP activities in his or her state.

Westat, the Sampling and Data Collection (SDC) contractor for NAEP, will recruit the participating schools and administer the study.

Educational Testing Service (ETS) serves as the Item Development (ID) contractor on the NAEP project, developing cognitive and background items for NAEP assessments.

Pearson is the NAEP Materials, Distribution, Processing and Scoring (MDPS) contractor and will be involved in the printing, packaging, distribution, and processing phases of the study.


  7. Assurance of Confidentiality

Respondents are notified that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)].

Parental notification will be sent prior to the study, and parents will have the option of not having their child participate (see appendix H for the parent/guardian notification letter). Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any form that will leave the school.

  8. Estimate of Hourly Burden

For school administration, the burden comprises completion of the school survey, contact with principals at selected schools, completion of the class rosters, and support activities for logistics and administration of the study. An oversampling of schools is included to account for possible school refusals. Completion of the school survey is estimated at one hour per contacted school. Contact with the selected principals is estimated at 20 minutes per school. The burden for participating schools is estimated at seven hours per school for completion of the class rosters and support logistics.

Burden for teachers includes completion of the teacher survey and is estimated to be 10 minutes per participating teacher.

Parents of participating students will receive a letter explaining the study, for which the burden is estimated at 2 minutes. Additional burden (10 minutes) is estimated for the small portion of parents who may write to refuse approval for their child or to request information related to the study.

Burden to the students is estimated to be 135 minutes.

The estimated total response burden is shown in Table 1.


Table 1. Estimate of Read-Aloud Accommodations Study Burden

Respondent/Activity                                     Hours per     Number of      Number of     Total
                                                        respondent    respondents    responses     hours
School
  Completion of school survey                           1             100*           100           100
  Contact with selected schools                         0.33          60**, ***      60**, ***     20
  Completion of class rosters and provision of
    support (space, logistics, class interactions)      7             32***          32***         224
  Sub-total                                                           100            192           344
Teacher
  Complete questionnaire                                0.166         96             96            16
  Sub-total                                                           96             96            16
Parent/Guardian
  Initial contact letter                                0.033         2,425**        2,425**       81
  Follow-up activities (refusal, information
    gathering)                                          0.166         25***          25***         4
  Sub-total                                                           2,425          2,450         85
Students
  Gr. 4 & 8 students                                    2.25          2,400          2,400         5,400
  Sub-total                                                           2,400          2,400         5,400
Total Burden                                                          5,021          5,138         5,845

* Includes overage for schools that are not eligible

** Includes overage for potential refusals

*** Subset of initial contact group


  9. Estimate of Costs for Recruiting and Paying Respondents

Because the study will take place during regular academic school hours, a monetary incentive will not be provided to individual student participants. Given the limited time required of teachers, a monetary incentive will not be provided to individual teacher participants either. To encourage school participation, the schools will receive $200 upon completion of the study as a token of appreciation.


  10. Cost to Federal Government

The overall project cost estimate, including design, preparation, and administration of student read-aloud accommodations study assessments (including recruitment, incentive costs, data collection, analysis, and reporting) is $936,400. Cost estimates by contractor are listed in Table 2.


Table 2. Estimate of Read-Aloud Accommodations Study Costs

Activity                                                       Provider    Estimated Cost
Design, preparation of study tasks, development of
  background questionnaire, analysis of data, report
  preparation                                                  ARDAC       $188,000
School incentives                                              ARDAC       $6,400
Recruitment and data collection activities                     Westat      $557,000
Printing, distribution, processing activities                  Pearson     $175,000
Item review activities                                         ETS         $10,000
Total                                                                      $936,400



  11. Schedule

The following table provides the schedule of milestones and deliverables:


Table 3. Schedule

Activity                                                 Dates
Recruit participants (subsequent to OMB clearance)       September–November 2012
Data collection, preparation, and coding                 February–March 2013
Data analysis                                            March–May 2013
Initial study report                                     July 2013


1 Education Sciences Reform Act of 2002 (ESRA), National Assessment of Educational Progress (20 USC § 9622).


2 Note, “everything” refers to the cognitive portions of the assessment only. As described in section 5, the background questions will be presented consistently across all conditions.

3 Students with disabilities who are also English language learners will be included in the study. However, the sample will likely not be large enough to analyze this group separately. Rather, analyses will be performed three different ways: 1) including the SD and ELL students in the ELL group, 2) including the SD and ELL students in the SD group, and 3) not including the SD and ELL students. In addition, descriptive analyses will be done on this group. Finally, if a large enough sample is obtained, a fourth student category will be added to the analysis plan.

4 While the focus of the study is on reading, SD and sometimes ELL students often receive specialized instruction in reading/English language arts. Therefore, to obtain a mix of students in the intact classrooms, science, a required course in California at grade 8, will be used.

5 The sample size in each subgroup could be reduced to 110 under a one-tailed test condition; however, a two-tailed test is preferred.
