
2011 Cognitive Interview Studies of NAEP Cognitive Items




National Assessment of Educational Progress





Volume I

Supporting Statement



Request for Clearance for 2011 Cognitive Interview Studies of NAEP Cognitive Items



OMB# 1850-0803 v.45

(Generic Clearance for Cognitive, Pilot and Field Test Studies)







March 2, 2011








  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP's primary purpose is to assess student achievement in the various subject areas; NAEP also collects questionnaire data to provide context for the reporting and interpretation of assessment results.


As part of NAEP's cognitive item development process, a portion of assessment items are tested on a small number of students before they are pilot tested. A method often used in NAEP to explore new questions, both cognitive and background, is the cognitive interview. This method (referred to as cog labs) involves intensive, one-on-one interviews in which the respondent is typically asked to "think aloud" as he or she answers cognitive or survey questions. A number of different techniques may be involved, among them asking respondents to paraphrase questions and using probing questions to determine how respondents came up with their answers. Given the current assessment schedule determined by the National Assessment Governing Board, NAEP plans to conduct two sets of cognitive labs on cognitive items in 2011. These cognitive labs, covered under this submission, are in preparation for the following assessments:

  • the 2012 pilot of the technology and engineering literacy (TEL) computer-based assessment at grade 8, which will be the first time that NAEP assesses this new subject area; and

  • the 2013 pilot of computer-based writing assessment at grade 4, which will be the first time that NAEP assesses writing via computer at grade 4.


The purpose of the cognitive interview study is to explore how students process tasks and generate responses. For example, in the context of computer-based writing, we might try to determine if students get useful ideas from images and stimuli accompanying tasks and how such components are incorporated into students’ planning and development of their written responses. Our goal is to identify areas of success and potential problems prior to administering items and tasks to a large number of respondents, so that we can increase the quality of the items and tasks.


In the cognitive labs, an interviewer uses a structured protocol in a one-on-one interview that combines two methods: think-aloud interviewing and verbal probing. With think-aloud interviewing, respondents are explicitly instructed to "think aloud" (i.e., describe what they are thinking) as they work out their responses and strategies for responding to items and tasks. The respondent reads each question aloud, and the interviewer records the cognitive processes that the respondent uses in arriving at a response or strategy. With verbal probing, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the think-aloud process. These probes might include, for example, asking respondents to rephrase the question in their own words or probes to assess whether the response categories for selected-response items are relevant.


Cognitive interview studies are largely observational. The data collected will be qualitative, consisting mainly of verbal reports from think-aloud tasks, along with comments volunteered by respondents. Based on what is learned during the cognitive interviews, items and tasks are revised, resulting in a set of questions and tasks that are clear and meaningful to respondents (and thus less burdensome for them) and that yield accurate and meaningful measures of the appropriate knowledge and skills.

  3. Study Design and Context

Subject-Area Overview

For grade 4 writing, the study includes student interviews about their reactions to and performance on extended constructed-response writing tasks. Based on the Writing framework, these tasks are intended to take 25 minutes of student time (not counting additional time required to respond to interview questions). Assessment tasks are designed to measure students’ abilities to write to three purposes (to persuade, to explain, and to convey experience, real or imagined) and various audiences using the computer and commonly available word processing tools. Many tasks will be accompanied by drafts of multimedia stimuli to determine how students react to such stimuli.


For grade 8 TEL, the study includes student interviews about their reactions to and performance on sets of discrete items and scenario-based assessment tasks. Each set of discrete items or scenario-based tasks is intended to take between 10 and 40 minutes of student time (not counting additional time required to respond to interview questions). Some discrete item sets and scenario-based tasks will be relatively short, taking approximately 10-20 minutes of student time, and could therefore be combined with an additional short set or task for the purpose of the cog lab. Other item sets and tasks will be longer, requiring approximately 20-40 minutes to complete. Discrete items and scenario-based tasks are designed to measure various aspects of student performance in the following three areas of technology and engineering literacy: Technology and Society, Design and Systems, and Information and Communications Technology.


Given the duration, magnitude, and novelty of the TEL items and tasks to be studied, the process for scheduling and conducting TEL cog labs will be iterative. Cog labs will be performed on a few items and tasks at a time. We will develop a coding frame that will allow us to rapidly identify and code verbal responses and problems associated with item or task comprehension, memory retrieval, judgment/estimation processes, and response processes. The draft coding frame will be modified and supplemented based on review of the initial cognitive interviews. That is, information gathered from an earlier session of cognitive interviews will be integrated into the next session of cognitive interviews, performed on a new set of items and tasks at a later date.


Sampling Plan

Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Existing research and practice have not produced a methodological or practical consensus on the minimum or optimal sample size necessary to provide valid results for cognitive question development.1 Several researchers have confirmed the standard of five as the minimum number of participants per subgroup for analysis (i.e., cell) for the purposes of exploratory cognitive interviewing for question development.2 Other researchers have indicated that, although a sample size of five per cell will likely identify major problems with a question, more is better, up to approximately fifteen per cell.3 With this research in mind, we plan to interview five grade 4 students for each of the writing tasks. Given that the TEL assessment construct is new, we plan to interview 10 grade 8 students for each of the TEL tasks or sets of items.
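As an illustrative check of how these per-cell sample sizes translate into interview counts, the sketch below multiplies students per task by a number of tasks; the task counts used here are hypothetical assumptions for illustration only, not figures from this submission.

```python
# Illustrative sketch: total interviews = students per task x number of tasks.
# Per-task sample sizes (5 for writing, 10 for TEL) come from the text above;
# the task counts below are hypothetical.
def interviews_needed(students_per_task: int, num_tasks: int) -> int:
    return students_per_task * num_tasks

# e.g., 10 writing tasks at 5 grade 4 students each -> 50 interviews
writing_interviews = interviews_needed(5, 10)
# e.g., 23 short TEL item sets at 10 grade 8 students each -> 230 interviews
tel_interviews = interviews_needed(10, 23)
print(writing_interviews, tel_interviews)
```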



Student Recruitment and Sample Characteristics

Parents and students will be recruited via phone and email using participant databases of volunteers and through other available lists (see Appendix A for the initial recruitment email). Potential student participants and their parents will be screened by telephone (see Appendices B and C). The screening process will be designed to yield a sample meeting the following criteria:

  • Mix of race (Black, White, Asian)

  • Mix of Hispanic ethnicity

  • Mix of socioeconomic background (based on parental education)

  • Mix of urban/suburban/rural

  • For TEL only, mixed exposure to formal instruction in technology and engineering


The recruitment process described above will be conducted by UserWorks, Educational Testing Service, and possibly other contractors (see Section 5). Note that this recruitment procedure was used to recruit participants for cognitive laboratory interviews conducted for NCES over the past year.4


Upon receiving agreement from the student and parent to participate in the study, confirmation letters will be sent and a follow-up phone call will be made (see Appendices D, E, and F). Upon the completion of the interview, thank you letters will be sent to the student and parent (see Appendices G and H).

  4. Data Collection Process

The student interviews will be conducted by Abt Associates and Educational Testing Service (see Section 5). Qualified interviewers will be identified and trained on cognitive interviewing techniques and the use of the specific protocols used in the study.


Cognitive Interview Process

The script and cognitive interview protocols to be used in this study are included in Volume II of this submission. Participants will first be welcomed, introduced to the interviewer and the observer (if an in-room observer is present), and told that they are there to help answer questions about how people respond to assessment questions for writing or TEL, as appropriate. Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573]. Interviewers will explain the think-aloud process and conduct a practice question, and then participants will answer questions verbally. Interviewers will use several different cognitive interviewing techniques, including general think-aloud and question-specific probes, observation, and debriefing questions.


The protocols consist mostly of generic interview questions that apply to all tasks. However, they may also contain specific questions that apply to particular tasks, especially for TEL, since that assessment will contain items that vary much more in form and content than those used in the writing assessment. For example, the grade 8 TEL protocol contains generic language, such as "How do you think an action you took or this element on the screen helped you answer the question or do this part of the task?", which may be adapted for a particular task, such as "How did using this menu help you navigate through the website?" or "How did this figure on the screen help you answer the question?"


TEL tasks are ultimately intended to be delivered to students as interactive, computer-based tasks; though writing tasks are not interactive in the way TEL tasks are, they are also computer-based. Writing interviews will be conducted on computer, using an interface similar to the one planned for the grade 4 NAEP 2012 pilot, with task stimuli in either draft or mocked-up form that approximate the final versions. (Note that usability testing of the writing interface for grade 4 students is being conducted separately.5) To introduce students to the writing computer system, they will watch a short tutorial before beginning the actual tasks.


However, because the cog labs will be conducted at an early stage of development, when the full computer system will not yet be available, TEL interviews will be conducted using paper-and-pencil prototypes or, where feasible, computer prototypes with limited interactivity. Given that the purpose of the cog labs is to provide insight into student understanding of the items and tasks, the different mode of administration will not affect the interpretation of the cog lab results. Additional pilot testing of the items and tasks will be done on the computer prior to a full-scale administration.


Units of Analysis

The key unit of analysis is the item or task. Items or tasks will be analyzed across participants.


The types of data collected about the questions will include

  • think-aloud verbal reports;

  • behavioral coding (e.g., errors in reading items or tasks);

  • responses to generic probes;

  • responses to item or task-specific probes;

  • additional volunteered participant comments; and

  • responses to debriefing questions.


A coding frame may be developed for the responses to think-aloud questions and other verbal reports. The frame will be designed to identify and code verbal responses and problems associated with item or task comprehension, memory retrieval, judgment/estimation processes, and response processes. The draft coding frame will be modified and supplemented based on review of the initial cognitive interviews.


Analysis Plan

The general analysis approach will be to compile the different types of data in spreadsheets and other formats to facilitate identification of patterns of responses for specific items or tasks, for example, patterns of counts of verbal report codes and of responses to probes or debriefing questions.


Each type of data for an item or task will be examined both independently and in conjunction with item-specific features (e.g., nature of stimulus, item or task length or complexity, item type, and number of response choices) to determine whether an effect is observed across multiple measures and/or across administrations of the item or task.


This approach will ensure that the data are analyzed in a thorough, systematic way that enhances identification of problems with items or tasks and supports recommendations for fixing those problems.

  5. Consultations Outside the Agency

Educational Testing Service (ETS)

ETS serves as the Item Development contractor on the NAEP project, developing cognitive and background items for NAEP assessments. ETS staff (both Research & Development and Item Development) will be involved in the management of the cog lab interviews for writing and TEL. In addition, ETS may recruit for and conduct some of the cog lab interviews.



Abt Associates

Abt Associates is a large, established for-profit government and business research and consulting firm. Abt Associates is working as a subcontractor for ETS on this project to conduct some of the cog lab interviews. Abt Associates provides expert development, testing, and refinement of data collection tools to ensure their reliability and validity. Located in Bethesda, Maryland, Abt Associates has a Cognitive Testing Laboratory facility that offers a range of cognitive interviewing and usability testing services.


Johnny Blair, a Principal Scientist at Abt Associates, is a specialist in survey research design, cognitive interviewing, and usability testing. He has extensive experience in survey methodology, statistics, demography, instrument design, data analysis, and evaluation. He has published extensively on issues such as cognitive interview data quality, optimizing cognitive pretest sample sizes, and customizing cognitive interview protocols.


UserWorks

UserWorks is a Maryland-based company that specializes in recruiting for focus groups, cognitive interviews, and other research. UserWorks is a subcontractor of Abt Associates on this project to recruit for some of the cog lab interviews. The two organizations have worked together regularly for several years.



In addition, ETS will use outside recruiting and interviewing subcontractors to recruit students for and conduct the cog labs. As ETS is in the process of selecting these subcontractors, details are not finalized at this time. One potential subcontractor is EurekaFacts, LLC, located in Rockville, MD, which specializes in needs assessments, program evaluation, and outcome studies. EurekaFacts' experience conducting one-on-one interviews and focus groups positions it as a possible subcontractor for both recruiting participants and performing some of the cog lab interviews.

  6. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Education Sciences Reform Act of 2002 (20 U.S.C. §9573). This legislation ensures that security and confidentiality policies and procedures of all NCES studies, including the NAEP project, are in compliance with the Privacy Act of 1974 and its amendments, NCES confidentiality procedures, and the Department of Education ADP Security Manual.


Participation in the study is voluntary. Written consent will be obtained from legal guardians of minor students, before interviews are conducted (see Appendix E for the consent form). Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files and secured for the duration of the study and will be destroyed after the final report is released.


The interviews will be recorded. The only identification included on the files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is submitted.

  7. Justification for Sensitive Questions

Throughout the interview protocol development process, effort has been made to avoid asking for information that might be considered sensitive or offensive. Reviewers have identified and eliminated potential bias in questions.

  8. Estimate of Hourly Burden

A two-stage recruitment effort will be conducted via email and phone. Initial e-mail contact and response is estimated at 3 minutes (0.05 hours). The follow-up phone call to screen student participants is estimated at 9 minutes (0.15 hours) per family. The follow-up phone call and letter to confirm participation are estimated at 3 minutes (0.05 hours). Student interviews will be limited to 60 minutes (for writing and most TEL items and tasks) or 90 minutes (for extended TEL tasks).


Following is a general description of the recruitment process. The recruiting contractor will email a sufficient number of potential participants (in past studies, approximately 10 potential participants have been contacted for each required student). Follow-up screening phone calls with interested parents and students will then be made to gain participation and achieve the mix of sampling criteria desired for this specific study. Confirmation calls will then be made with the students prior to the scheduled interview. It is expected that attrition will occur at each step of the process. The number of respondents in each phase is estimated based on similar cog lab recruitments conducted over the past year.


The specific burden table for this study follows:


| Respondent | Hours per respondent | Number of respondents | Total hours |
| Parent and Student Recruitment | | | |
| Initial e-mail contact | 0.05 | 3,600 | 180 |
| Follow-up via phone | 0.15 | 720 | 108 |
| Confirmation via phone | 0.05 | 430 | 22 |
| Interviews (Writing - Grade 4) | | | |
| Extended constructed-response tasks | 1 | 50 | 50 |
| Interviews (TEL - Grade 8) | | | |
| Short interactive tasks or sets of items | 1 | 230 | 230 |
| Long interactive tasks | 1.5 | 80 | 120 |
| Total Burden | | 3,600 | 710 |

  9. Estimate of Costs for Recruiting and Paying Respondents

Because the study will take place outside of regular academic school hours, a monetary incentive will be offered to encourage participation and motivation. This practice has proven effective in recruiting respondents for similar research. Each participating student will receive a $25 gift card in compensation for time and effort. In addition, we are offering a gift card of $25 per parent to remunerate them for their time and to help offset the travel/transportation costs of bringing the participating student to and from the cognitive laboratory site. These amounts are consistent with previous similar studies. Generic gift cards that can be used anywhere credit cards are accepted are recommended because low-income participants do not always have bank accounts and check-cashing outlets often charge fees.

  10. Cost to Federal Government

The overall project cost estimates, which cover design, preparation, and the conduct of student cognitive interviews (including recruitment, incentive costs,6 data collection, analysis, and reporting), are as follows:

  • Writing: $160,000

  • TEL: $480,000

  • Total: $640,000

  11. Schedule

The following table provides the schedule of milestones and deliverables:


| Activity | Dates (Writing) | Dates (TEL)7 |
| Submission to OMB | March 2011 | March 2011 |
| Recruit participants (subsequent to OMB clearance) | March - April 2011 | March - October 2011 |
| Data collection, preparation, and coding | April 2011 | April - November 2011 |
| Data analysis | May 2011 | May - December 2011 |
| Final study report | June 2011 | December 2011 |


1 See Almond, P. J., Cameto, R., Johnstone, C. J., Laitusis, C., Lazarus, S., Nagle, K., Parker, C. E., Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress; Menlo Park, CA: SRI International.

2 See Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press.

3 See Willis, G. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.

4 The NAEP 8th-grade Writing Usability Study (May 2010) and the Background Question Cog Lab (January 2011) both employed this recruitment procedure.

5 The NAEP grade 4 writing computer-based assessment usability study was cleared by OMB in February 2011.

6 Note: gift card fees are included in the incentive costs.

7 As stated in Section 3, the TEL development and cognitive labs will be done iteratively over several months.
