NAEP Accessible Reading Cog Lab Items -- Justification




National Assessment of Educational Progress

National Center for Education Statistics





Volume I

Supporting Statement



Request for Clearance for 2011-2012 Cognitive Interview Studies of NAEP Cognitive Items for the Accessible Booklet Study in Reading



OMB# 1850-0803 v.58

(NCES Generic Clearance for Cognitive, Pilot and Field Test Studies)










November 1, 2011


Volume I: Supporting Statement







  1. Submittal-Related Information

This material is being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the various subject areas and to collect questionnaire data to provide context for the reporting and interpretation of assessment results.


For several years, NCES has been interested in the use of accessible blocks as a means of increasing inclusion in NAEP as well as improving measurement at the lower levels of the NAEP scale.1 A significant number of students tend to perform below the basic level on NAEP. For example, on the 2009 assessment, 18% of fourth-graders performed below the basic level in mathematics and 33% in reading. Only 39% of fourth-graders performed at or above the proficient level in mathematics and only 33% did so in reading. Very small percentages reached the advanced level in either subject. Furthermore, the percentages of students performing in the lower part of the distribution are much greater for many of the demographic groups that NAEP is required to report by law (Public Law 107-110). These students are the very groups that our society has in its mind’s eye when it makes a commitment to closing the achievement gap. Hence knowing what these students can do is essential to achieving national priorities.


Yet, given the need for NAEP assessments to measure the full range of content and skills specified in the frameworks, the reading assessment has tended to include many blocks (defined here as the combination of a passage and its accompanying 8-12 items) that low-achieving (especially below basic) students find difficult. The result is that achievement estimates at the lower extreme of the distribution have relatively large standard errors. The aim of including one or more accessible blocks would not be to make NAEP easier, but to improve the precision (and hence the information value) of measurement at the lower levels. Increased precision at the lower levels represents an important validity issue regarding the use of NAEP as a means of benchmarking and interpreting changes in student assessment results over time. If NAEP scores remain static for some demographic groups or subject areas, it may be due to NAEP’s inability to detect change at lower performance levels.


Accessible blocks could either be incorporated into the normal spiral or given selectively to students previously identified as likely to benefit.2 The inclusion of an accessible booklet, consisting of two accessible blocks, holds promise as a means of increasing the participation of students with disabilities (SD) and possibly also English Language Learners (ELL), thereby improving the validity of NAEP as a means of representing the performance of those subgroups. Offering an accessible booklet option to SDs and ELLs could also reduce the impact of construct-irrelevant variance (readability, language demand, visual distractors, etc.) on test results for these subgroups.


More research is needed to determine the extent to which incorporation of accessible reading blocks into the regular administration of NAEP increases precision at the lower end of the performance continuum, and to explore the development of item modification guidelines and expert review processes in the area of reading.


The proposed study is intended to examine the feasibility and effectiveness of accessible blocks that are aligned with the NAEP reading framework. The initial phase involves convening a panel of three to five expert item writers and test development specialists to create accessible blocks for grades 4 and 8, using accessibility guidelines previously created by the investigators, and refined during the phase 1 cog labs of standard NAEP reading blocks, as described below. The draft accessible blocks will then be reviewed by another expert literacy panel that will independently evaluate alignment with the NAEP framework and adherence to the accessibility guidelines.


Cognitive labs (the focus of this submittal) will be conducted with standard and accessible blocks to gain insight into how students interpret and respond to items. This method (referred to here as cog labs) involves intensive, one-on-one interviews in which the respondent is typically asked to "think-aloud" as he or she answers cognitive or survey questions. A number of techniques will be used, among them asking respondents to paraphrase questions and using probing questions to determine how respondents arrived at their answers. These are outlined in the cognitive interview protocol shown in Volume II.


The University of Illinois at Urbana-Champaign (UIUC) (see Section 5 for contractor information), on behalf of NCES, plans to conduct two phases of cognitive labs on cognitive items in 2011-2012:


  • Phase 1: Prior to the expert panels, UIUC will conduct cog labs with standard NAEP reading blocks and selected items from other assessments to identify aspects of the assessment that limit accessibility and to aid in the selection of blocks for modification; and

  • Phase 2: When alignment is verified and pilot blocks finalized, UIUC will conduct cog labs with both standard and accessible blocks to assess accessibility and inform the selection of blocks for inclusion in the field test.


The purpose of the cognitive interview study is to explore how students process items and generate responses. For example, the cog labs might try to determine if students get useful ideas from images and stimuli accompanying items and examine how such components are incorporated into students’ planning and the development of students’ responses. The goal is to identify areas of success and potential problems prior to administering items to a large number of respondents, so as to increase the quality of the items.


In the cognitive labs, an interviewer uses a structured protocol in a one-on-one interview using two methods: think-aloud interviewing and verbal probing techniques. With think-aloud interviewing, respondents are explicitly instructed to “think-aloud” (i.e., describe what they are thinking) as they figure out their responses and their strategies for responding to items. The respondent reads each question aloud, and then the interviewer records the cognitive processes that the respondent uses in arriving at a response or strategy. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think-aloud” process. These probes might include, for example, asking respondents to rephrase the question in their own words or to assess whether the response categories for selected response items are relevant.


Cognitive interview studies are largely observational. The data collected are mainly qualitative, consisting of verbal reports from think-aloud tasks along with comments volunteered by respondents. Based on the information learned during the cognitive interviews, items are revised, resulting in a set of items that are clear and meaningful to respondents (and thus less burdensome for them) and that yield accurate and meaningful measures of appropriate knowledge and skills.


During the cognitive labs, samples of students will take either two standard blocks (Phase 1) or both an accessible block and a standard block (Phase 2) in a counterbalanced design using a 1:1 administration with a trained observer. The observer will prompt the student to “think-aloud” as they complete the item blocks and will also debrief the student to gather information on the strategies that the student used. Student work will also be analyzed for further evidence on student strategies and to evaluate performance. Comparisons will be made between strategies, time to completion, and performance across accessible and standard blocks.
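The counterbalanced assignment described above can be sketched in code. The following is a minimal, hypothetical illustration of simple AB/BA counterbalancing (it is not the study's actual assignment procedure, and the student and block labels are made up):

```python
from itertools import cycle

def counterbalance(students, block_pair):
    """Alternate the order of two blocks across students so that
    roughly half see each block first (simple AB/BA counterbalancing)."""
    a, b = block_pair
    orders = cycle([(a, b), (b, a)])
    return {s: order for s, order in zip(students, orders)}

# Hypothetical Phase 2 pairing: one accessible and one standard block.
students = ["S01", "S02", "S03", "S04"]
assignments = counterbalance(students, ("accessible_block", "standard_block"))
for student, order in assignments.items():
    print(student, "->", order)
```

In this sketch, odd-numbered students receive the accessible block first and even-numbered students receive the standard block first, which balances order effects when comparing strategies, completion time, and performance across block types.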

  3. Study Design and Context

Subject-Area Overview

The cognitive lab study includes student interviews about their reactions to and performance on four accessible blocks and up to six standard reading blocks at each of grades 4 and 8. The item blocks are designed to measure students’ abilities to read and comprehend texts of various genres for different purposes. Many item blocks will be accompanied by figures and illustrations. Based on the NAEP Reading Framework, each item block is intended to take up to 25 minutes of student time; however, most students will take less time. In particular, low-performing students will take substantially less time with standard NAEP blocks because they are unable to engage with the material in depth.

Given the formative use of the cog lab data, the process for scheduling and conducting cog labs will be iterative. That is, information gathered from earlier cognitive interview sessions will help to shape the item blocks evaluated in the next series of cognitive interviews.



Sampling Plan

Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Existing research and practice have failed to offer a methodological or practical consensus regarding the minimum or optimal sample size necessary to provide valid results for cognitive question development.3 Several researchers have confirmed the standard of five as the minimum number of participants per subgroup for analysis (i.e., cell) for the purposes of exploratory cognitive interviewing for question development.4 Other researchers have indicated that although a sample size of five per cell will likely identify major problems with a question, more is better, up to approximately fifteen per cell.5 With this research in mind, we plan to interview five students for each of the item blocks considered in the study.


Table 1. Sampling Information by Study Phase

Phase 1: Refine accessibility guidelines through review of standard blocks

  Grade   # of students   # of blocks
  4       10              4 standard blocks
  8       10              4 standard blocks

Phase 2: Evaluate accessible blocks and compare performance on standard and accessible blocks

  Grade   # of students   # of blocks
  4       20              4 accessible & 4 standard blocks
  8       20              4 accessible & 4 standard blocks

NOTE: Standard blocks used in Phase 2 will overlap standard blocks used in Phase 1. Overall, up to 6 standard blocks will be used at each grade level.
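The sample sizes in Table 1 are consistent with the five-students-per-block target discussed in the sampling plan. A back-of-the-envelope check (this assumes each student takes two blocks and that blocks are rotated evenly across students, which is an inferred detail rather than a stated design rule):

```python
def readings_per_block(n_students, blocks_per_student, n_blocks):
    """Average number of student readings each block receives when
    blocks are distributed evenly across students."""
    return n_students * blocks_per_student / n_blocks

# Phase 1 (per grade): 10 students, 2 standard blocks each, 4 blocks in rotation.
phase1 = readings_per_block(10, 2, 4)

# Phase 2 (per grade): 20 students, 2 blocks each (1 accessible + 1 standard),
# drawn from 8 blocks (4 accessible + 4 standard).
phase2 = readings_per_block(20, 2, 8)

print(phase1, phase2)  # 5.0 5.0
```

Under these assumptions, each block in either phase would be seen by five students, matching the minimum cell size cited in the sampling literature above.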



Student Recruitment and Sample Characteristics

Four schools with which UIUC has an ongoing research relationship will participate in this study. Researchers will work with the school liaisons to obtain class lists, determine dates and times when interviews can be conducted, and arrange for space in which to conduct the interviews. The cog labs will take place outside academic instructional times. Parents and students will be recruited via e-mail using class lists provided by the participating schools (see Appendix A for the initial recruitment e-mail). Parents of potential student participants will be screened by telephone, after which assent will be solicited from student participants (see Appendices B and C). The screening process will be designed to yield a sample with the following criteria:

  • Mix of race (Black, White, Asian)

  • Mix of Hispanic ethnicity

  • Mix of socioeconomic background (based on parental education)

  • Mix of urban/suburban/rural

  • Mix of regular education and special education students


This stratification is necessary to investigate the possible impact of modifications on the performance of these subgroups and explore the potential utility of the accessible blocks for accommodation purposes.


Upon receiving agreement from the student and parent to participate in the study, confirmation letters will be sent and a follow-up phone call will be made (see Appendices D, E, and F). Upon the completion of the interview, thank you letters will be sent to the student and parent (see Appendices G and H).

  4. Data Collection Process

The cog labs will be conducted by research associates at UIUC, who will be screened for suitability for this assignment. They will then be trained on cognitive interviewing techniques and the use of the specific protocols used in the study.


Cognitive Interview Process

The script and cognitive interview protocols to be used in this study are included in Volume II of this submission. Participants will first be welcomed, introduced to the interviewer and the observer (if an in-room observer is present), and told that they are there to help answer questions about how people respond to reading assessment questions. Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573]. Interviewers will explain the think-aloud process and administer a practice question. Item blocks will be delivered to students in standard paper-and-pencil format, and participants will answer questions verbally. Interviewers will use several different cognitive interviewing techniques, including general think-aloud and question-specific probes, observation, and debriefing questions.


After the student has completed a think-aloud and interview for each of two item blocks (as shown in the think-aloud portion of the cognitive lab protocol in Volume II), the interviewer will select one of those two item blocks, based on a specified sampling plan, and have the student read aloud the reading passage and one of the associated questions. The student will also answer additional probes pertaining to this passage and question (as shown in the read-aloud portion of the cognitive lab protocol) so as to further assess student understanding.


The protocol consists mostly of generic interview questions that apply to all items and item blocks. For some items and item blocks, however, the read-aloud portion of the protocol may also include probes specific to those items or item blocks; for example, the probes may examine how students interact with specific features introduced in the accessible blocks. Because these blocks are not yet developed, we have included only placeholders in the protocol shown in Volume II.


Units of Analysis

The key unit of analysis is the item or item block. Items or item blocks will be analyzed across participants.


The types of data collected about the items and item blocks will include

  • think-aloud verbal reports;

  • behavioral coding (e.g., errors in reading passages or items);

  • responses to generic probes;

  • responses to item-specific probes;

  • additional volunteered participant comments; and

  • debriefing questions.


A coding frame will be developed for the responses to think-aloud questions and other verbal reports. The frame will be designed to identify problems associated with item or passage comprehension, memory retrieval, judgment/estimation processes, and response processes. The draft coding frame will be modified and supplemented based on reviewing the initial cognitive interviews.


Analysis Plan

The general analysis approach will be to compile the different types of data in spreadsheets and other formats to facilitate identification of patterns of responses for specific items or item blocks, for example, patterns of counts of verbal report codes and of responses to probes or debriefing questions.


Each type of data for an item will be examined both independently and in conjunction with item-specific features (e.g., nature of stimulus, item or passage length or complexity, item type, and number of response choices) in order to determine whether a feature or an effect of an item is observed across multiple measures and/or across administrations of the item.


This approach will ensure that the data are analyzed thoroughly and systematically, enhancing the identification of problems with items and supporting recommendations for fixing those problems.

  5. Consultations Outside the Agency

American Institutes for Research (AIR)

AIR is a not-for-profit research organization that has maintained the NAEP Validity Studies (NVS) panel under contract to NCES since 1995. The cognitive lab study described here was commissioned by the NVS panel on behalf of NCES and will be carried out by NVS panel member Dr. Lizanne DeStefano (UIUC). In addition, the study will be reviewed by the full NVS panel during its tri-annual meetings.


University of Illinois at Urbana-Champaign (UIUC)

UIUC will conduct the study on behalf of NCES. Dr. Lizanne DeStefano, Fox Family Professor of Education at the University of Illinois, will direct the study. Dr. DeStefano serves as a member of the NVS panel and has conducted an accessible booklet study in mathematics. She is an expert in large-scale assessment, accommodations, and special populations. Under Dr. DeStefano’s direction, UIUC will be responsible for recruitment of subjects, preparation of materials, conduct of the cog labs, data analysis, and reporting.

  6. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Education Sciences Reform Act of 2002 (20 U.S.C. §9573). This legislation ensures that security and confidentiality policies and procedures of all NCES studies, including the NAEP project, are in compliance with the Privacy Act of 1974 and its amendments, NCES confidentiality procedures, and the Department of Education ADP Security Manual.


Participation in the study is voluntary. Written consent will be obtained from legal guardians of minor students before interviews are conducted (see Appendix E for the consent form). Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way. The consent forms, which include the participant name, will be kept separate from the participant interview files, secured for the duration of the study, and destroyed after the final report is released.


The interviews will be recorded. The only identification included on the audio files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is submitted.

  7. Justification for Sensitive Questions

Throughout the interview protocol development process, effort has been made to avoid asking for information that might be considered sensitive or offensive. Item developers and reviewers have identified and eliminated potential bias in questions.

  8. Estimate of Hourly Burden

Following is a general description of the recruitment process. UIUC has an ongoing research relationship with the four schools to be used in this study and no separate effort is required to recruit schools. Using class lists from these participating schools, UIUC will e-mail a sufficient number of potential participants to obtain the desired sample. Follow-up screening phone calls with interested parents will then be made to gain participation and achieve the mix of sampling criteria desired for this specific study. If the parents agree, student assent will be solicited during the same phone call (if the student is available) or at school during the next few days. Study staff will work with the school liaisons to schedule interviews during non-academic class time. Confirmation calls will then be made with parents and students prior to the scheduled interview. It is expected that attrition will occur at each step of the process. The number of respondents in each phase is estimated based on similar prior cog lab recruitments done over the past year.


Coordination with school liaisons to obtain class lists, confirm dates and times when interviews can be conducted, and arrange for space in which to conduct the interviews is estimated at 45 minutes (0.75 hours) per school. Initial e-mail contact and response is estimated at 3 minutes (0.05 hours) per parent. The follow-up phone calls to screen and recruit student participants are estimated at 9 minutes (0.15 hours) per parent/student respondent. The follow-up phone call and letter to confirm participation are estimated at 3 minutes (0.05 hours) per individual. To fit with school schedules, student interviews will be limited to 1 hour. However, students in Phase 2 may be invited back for a second session if they are unable to complete an accessible block and a standard block within the one-hour limit. (During Phase 1, we are confident that we will collect sufficient information for purposes of refining the accessibility guidelines within the one-hour limit.) We expect to use 10 students per grade for Phase 1 and 20 per grade for Phase 2. For the burden statement, we have estimated that 10 of the 20 students recruited for Phase 2 at each grade will be invited back and will therefore participate for a second hour.


Table 2. Burden Table

Activity/Respondents                                   Hours per respondent   Number of respondents   Total hours
School Coordination
  School liaisons                                      0.75                   4                       3
Parent and Student Recruitment/Confirmation
  Initial e-mail contact (parents)                     0.05                   200                     10
  Follow-up via phone/at school (parents & students)   0.15                   160                     24
  Confirmation letters (parents & students)            0.05                   120                     6
  Confirmation via phone (students)                    0.05                   60                      3
Interviews - Grade 4 Students
  Phase 1                                              1                      10                      10
  Phase 2                                              1                      20                      20
  Phase 2 – students invited back                      1                      10                      10
Interviews - Grade 8 Students
  Phase 1                                              1                      10                      10
  Phase 2                                              1                      20                      20
  Phase 2 – students invited back                      1                      10                      10
Total Burden                                                                  264 (maximum)           126
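The total burden hours in Table 2 can be cross-checked with a short calculation (all hours-per-respondent and respondent counts are taken directly from the table):

```python
# (hours per respondent, number of respondents) for each activity in Table 2
activities = {
    "school liaisons":                    (0.75,   4),
    "initial e-mail contact (parents)":   (0.05, 200),
    "follow-up phone/at school":          (0.15, 160),
    "confirmation letters":               (0.05, 120),
    "confirmation via phone (students)":  (0.05,  60),
    "grade 4, phase 1 interviews":        (1,     10),
    "grade 4, phase 2 interviews":        (1,     20),
    "grade 4, phase 2 second session":    (1,     10),
    "grade 8, phase 1 interviews":        (1,     10),
    "grade 8, phase 2 interviews":        (1,     20),
    "grade 8, phase 2 second session":    (1,     10),
}

total_hours = sum(hours * n for hours, n in activities.values())
print(round(total_hours, 2))  # 126.0
```

The per-activity products (3 + 10 + 24 + 6 + 3 + 10 + 20 + 10 + 10 + 20 + 10) sum to the 126 total hours reported in the table.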


  9. Estimate of Costs for Recruiting and Paying Respondents

Because the study will take place during regular academic school hours, a monetary incentive to individual participants is not necessary. Schools will be given a selection of children’s books valued at $50 for hosting the study. This practice has proven effective in recruiting respondents to participate in similar research.

  10. Cost to Federal Government

The overall project cost estimate, including design, preparation, and conduct of the student cognitive interviews (recruitment, incentive costs, data collection, analysis, and reporting), is $68,000.

  11. Schedule

The following table provides the schedule of milestones and deliverables:

Table 3. Schedule

Activity                                               Dates
Submission to OMB                                      November 2011
Recruit participants (subsequent to OMB clearance)     November-December 2011
Data collection, preparation, and coding               December 2011 - April 2012
Data analysis                                          January - May 2012
Final study report                                     August 2012


1 NCES, through its NAEP Validity Studies (NVS) expert panel, has carried out promising development work for accessible blocks in mathematics. See DeStefano, L. and Johnson, J. (2011). Study of the Feasibility of a NAEP Accessible Booklet Alternative: Report on Phase 2. Unpublished.


2 For example, see a proposal for using state assessment scores to pre-assign booklets in McLaughlin, D.H., Scarloss, B.A., Stancavage, F.B., Blankenship, C.D. (2005). Using State Assessments to Assign Booklets to NAEP Students to Minimize Measurement Error: An Empirical Study in Four States. Palo Alto, CA: American Institutes for Research.

3 See Almond, P. J., Cameto, R., Johnstone, C. J., Laitusis, C., Lazarus, S., Nagle, K., Parker, C. E., Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress and Menlo Park, CA: SRI International.

4 See Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press.

5 See Willis, G. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage.

File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File Title: Background Cog Lab OMB Submission V.1
Subject: NAEP BQ
Author: Donnell Butler
File Created: 2021-01-31
