
National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement


Cognitive Interview Study of the Effects of Visual Representations on Student Performance on NAEP Science Scenario-Based Tasks




OMB# 1850-0803 v. 156

April 15, 2016



  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which allows NCES to conduct various procedures (such as cognitive interviews) to test new methodologies, question types, or delivery methods in order to improve survey instruments and procedures. This request is to conduct cognitive interviews that probe one aspect of test validity: the effects of visual representations and associated interactive features on student performance on National Assessment of Educational Progress (NAEP) Science Scenario-Based Tasks (SBTs). The data generated by the study will help inform the development of new SBTs.

  2. Background and Study Rationale

NAEP is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in these subject areas; it also collects survey questionnaire (i.e., non-cognitive) data from students, teachers, and principals to provide context for the reporting and interpretation of assessment results.

Present plans for NAEP call for all NAEP assessments to transition to digitally based assessments (DBAs) beginning with the 2017 administration year. In support of this transition, new types of items have been developed that take advantage of the DBA environment to measure a wider range of knowledge and skills (for example, the Science SBTs). Like other NAEP DBA-enabled items, Science SBTs assess students through their interaction with multimedia tasks in which information is presented in two or more forms, such as on-screen text, auditory narration, static pictures (e.g., graphs, charts, and maps), and dynamic visual representations (e.g., animations, videos, and interactive illustrations). One general belief behind the assessment design is that a task with vivid and interesting multimedia aids is more likely to engage students and to facilitate their comprehension and performance. However, few NAEP research studies have collected evidence in support of this belief. It is possible that some uses of multimedia actually impede comprehension and performance (for at least some students) because they introduce construct-irrelevant information that is distracting or that overloads students’ capacity for cognitive processing.

This request is to conduct cognitive interviews (also referred to as cognitive labs) of the Science SBTs developed for grade 8 students. Examples of Science SBTs administered in 2009 (described as interactive computer tasks) can be found at http://www.nationsreportcard.gov/science_2009/. The Science SBTs to be used in the current study represent next-generation versions of interactive tasks as they will appear in NAEP starting in 2019. These SBTs were previously evaluated using cognitive interviews during the item development process, to ensure that they were free from errors and functioning as anticipated, and were then piloted in 2015. This study has a different focus, however. Specifically, we will use the Science SBTs as a platform for investigating the impact of various types of visual and interactive features on students’ ability to demonstrate their true level of achievement. Findings will be used to help generate guidelines for the development of future SBTs (in science and other subject areas) that support the validity of NAEP’s achievement estimates.

Cognitive interviews use a structured protocol in a one-on-one interview, drawing on methods from cognitive science. In this study, the objective is to explore how visual and interactive features interact with student participants’ comprehension and reasoning processes as they work through the tasks. Retrospective think-aloud and verbal probing techniques will be employed to elicit student feedback. The focus is on the ways in which the visual and interactive features of the SBTs might enhance or inhibit participants’ ability to show evidence of their mastery of the target knowledge and skills.

At the start of a cognitive lab, each participant will be given a brief introduction to taking NAEP on a tablet computer. Following this introduction, the participant will complete one 15-30 minute Science SBT on a tablet computer. Computer software will be used to record how the participant responds to the task shown on the screen. After the task is completed, the administrator will play back a recording of the participant’s performance to her/him. This recording will be used to help the student describe what she or he was thinking while working through the task.

As the participant talks through his/her experience in completing the SBT, the administrator will ask questions related to the participant’s understanding of the visual and interactive features of the task presented to her/him (e.g., the clarity and intelligibility of the images, text, tables and graphs, simulated 3D content, and interactive graphics). In addition, the administrator will ask participants for comments or suggestions on how to improve those features. The cognitive interview will end with a brief set of survey questions asking about the participant’s experiences and attitudes pertaining to technology and science education.

Overall, each cognitive interview will generate two forms of data: a digital video recording of the student’s actions while working through the SBT, which can be used to support the student’s think-aloud, and a recorded retrospective think-aloud whose feedback will be used to evaluate the visual and interactive features that enhance or inhibit students’ comprehension and problem-solving processes.

  3. Sampling and Recruitment Plan

NCES contracted the American Institutes for Research (AIR) to maintain the NAEP Validity Studies (NVS) Panel for NCES and to carry out the cognitive interview activity described in this package. AIR will recruit participants from various California locations, such as the Santa Barbara area, using existing contacts of AIR staff and of the Principal Investigator, Dr. Richard Durán, who will be directing this study as a member of the NVS Panel. Students will be recruited from multipurpose youth clubs and after-school programs, as well as science-related conferences, out-of-school science camps, and after-school Science, Technology, Engineering, and Mathematics (STEM) or computer learning programs. AIR will then communicate with parents of interested students to confirm eligibility and obtain parental consent. In addition, AIR will use a subcontractor, Elliott Benson Research and/or Nichols Research, to recruit students in the San Francisco Bay Area and potentially also the Sacramento area. Elliott Benson/Nichols will use targeted phone lists to call parents who might be interested and have eligible children. Recruitment will continue throughout the period for administering the cognitive interviews, as necessary. Paper flyers, e-mails, and phone calls will be used to contact potential participants (see Appendices A-H).

Students will be sampled so as to achieve the following participant criteria:

  • A mix of genders

  • A mix of native English speakers and students from homes in which English is not the dominant language

  • A mix of socioeconomic backgrounds, as evidenced by eligibility for the free or reduced-price lunch program

  • A mix of exposure to STEM and technology learning opportunities (i.e., some students will be recruited from academic programs featuring advanced STEM learning opportunities, such as out-of-school science camps, and after-school STEM or computer learning programs)

Although the sample will include a mix of student characteristics, results will not explicitly measure differences by these characteristics. Based on previous AIR and NVS Panel experience, we judge that six student participants for each of the five eighth-grade tablet-delivered SBTs will yield sufficient data for our cognitive lab study; administering one task per participant will be most feasible given the time required to complete all of the steps of the cognitive interview, as described in this submission. Consequently, we plan to recruit a total of 30 eighth-grade students.

Interested participants will be screened to ensure that they meet the criteria for participation (e.g., their parents/guardians have given consent and they are from the targeted demographic groups outlined above). When recruiting participants, AIR or Elliott Benson/Nichols staff will first communicate with the parent/guardian of the interested minor before starting the screening process. The parent/guardian will be informed about the objectives, purpose, and participation requirements of the study and the activities it entails. After confirming that a student is qualified and available to participate, a confirmation e‐mail/letter will be sent and parental/guardian informed consent for the minor’s participation will be obtained.1

  4. Data Collection Process

The cognitive interviews will take place at the University of California, Santa Barbara (UCSB), Elliott Benson’s Sacramento office, AIR’s office in San Mateo, CA, or another suitable venue (e.g., a research lab, conference room, or after-school center meeting room). Participants will first be welcomed, introduced to the interviewer and the observer (if an in‐room observer is present), and told that they are there to help answer questions about how students respond to science tasks administered on a tablet computer. Students will be reassured that their participation is voluntary and that their answers will be used only for informing the development of SBTs; their answers will not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573].

Interviewers will explain the think‐aloud process and conduct a practice session with a sample question (see Volume II for the welcome script, think‐aloud instructions, and generic probes). Next, participants will view a brief tutorial to orient them to specific features of the platform on which the SBTs will run.

The cognitive interviews will use a retrospective think-aloud method in which participants will first complete one 15-30 minute science task in a naturalistic way, without interruption (although students will not be discouraged from making comments about the task during the session). The screen-recording software Camtasia® will capture how students respond to the task shown on the screen. After the task is completed, the administrator will play back a video recording of the participant’s performance and ask her/him to explain what she or he was thinking while working through the task. Enabling students to see their actions facilitates their recollection of what they did during the task and helps them to further reconstruct their thinking at each point in the task.

The protocols for the think‐aloud sections will contain largely generic prompts, applied flexibly by the interviewer to facilitate and encourage students’ verbalization of their thoughts. An example prompt may be: “I see you were looking at the table on the right-hand side of the screen. What were you thinking?” Occasionally, and as relevant to each specific SBT, the administrator will ask the student probe questions that target prioritized visual and interactive features that experts hypothesize would support or inhibit students’ performance of the task. For example: “Did you know what to do? Was it easy for you to find the button you were supposed to press, or did it take you some time to find it?” A video and audio recorder will run for the entire session to capture each participant’s behavior and words. Interviewers will also take notes about participants’ reactions to the science task.

The cognitive interview session will end with the administration of a brief written questionnaire to gather information on the participant’s science background. (See Volume II.)

  5. Analysis Plan

The results will be compiled to identify patterns of responses for tasks, including patterns of responses to probes and types of actions observed from students at specific points while working through the SBTs. This approach will help to ensure that the data are analyzed in a thorough and systematic way. The analysis strategy will thus enable the identification of visual and interactive features that facilitate or inhibit comprehension and performance, and will allow us to develop recommendations for addressing any problems identified. A summary report will be produced, which will include a description of participant characteristics, positive and negative reactions to visual and interactive features within tasks, and principles drawn from the findings that can be used to guide the future design of NAEP SBTs.

  6. Consultations Outside the Agency

AIR is a not-for-profit research organization that has maintained the NVS Panel under contract to NCES since 1995. The NVS Panel is charged with investigating matters related to the validity of the NAEP assessment. The cognitive lab study described here was commissioned by the NVS Panel on behalf of NCES and will be carried out by AIR under the guidance of NVS Panel member Dr. Durán. In addition, the study will be reviewed by the full NVS Panel during its tri-annual meetings.

Under subcontract to AIR, UCSB will provide the facilities and equipment to be used during the design phase of the study and for conducting cognitive interviews in the Santa Barbara area. The subcontract will also cover a portion of Dr. Durán’s time during the design phase.

Elliott Benson Research is a market research agency in Sacramento, CA, that provides nationwide recruiting and field management services. Nichols Research is a full-service marketing research firm operating in the San Francisco Bay Area and Central California. The study will use Elliott Benson and/or Nichols Research to recruit up to half of the sample, as needed, after first pursuing participants through the existing contacts of AIR staff and of the Principal Investigator, Dr. Richard Durán.

  7. Assurance of Confidentiality

Students taking part in the cognitive interviews will be notified that their participation is voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)]. Written consent will be obtained from participants and from parents or legal guardians of students under age 18. Study participants will be identified by unique student identifiers, and the code book linking identifiers to true identities will be kept under lock and key in the data storage area of a project office room at UCSB. The consent forms, which include participant names, will be separated from the interview files and secured for the duration of the study; they will be destroyed after the final report is completed. The interviews will be recorded, and the only identification included on the recorded files will be the participants’ unique identifiers. The recorded files will be secured for the duration of the study and destroyed after the final report is submitted.

  8. Justification for Sensitive Questions

This study does not include sensitive questions.

  9. Estimate of Hourly Burden

The estimate of hourly burden assumes that AIR recruits half of the subjects and Elliott Benson/Nichols Research recruits the other half (15 participants each). AIR and Elliott Benson/Nichols Research have different recruitment strategies and attrition rates, and the burden estimates for the number of respondents and responses reflect this difference (each cell in Table 1 shows the estimates for AIR + estimates for Elliott Benson/Nichols Research). The estimated burden for recruitment assumes attrition throughout the process. Assumptions for approximate attrition rates for direct participant recruitment from initial contact to follow-up are 50 percent for AIR and 40 percent for Elliott Benson/Nichols Research. Assumptions for attrition from follow-up to confirmation are 20 percent for AIR and 50 percent for Elliott Benson/Nichols Research. All cognitive interviews will be conducted by AIR; cognitive interview sessions will be scheduled for no more than 120 minutes.
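For illustration, the parent or legal guardian counts in Table 1 follow from these attrition rates (rounding of fractional counts to whole respondents is assumed):

  AIR: 46 initial contacts × (1 − 0.50) = 23 follow-up contacts; 23 × (1 − 0.20) ≈ 18 confirmations
  Elliott Benson/Nichols Research: 60 initial contacts × (1 − 0.40) = 36 follow-up contacts; 36 × (1 − 0.50) = 18 confirmations

Burden hours in each row equal the number of responses multiplied by the hours per response, rounded up (e.g., the 23 + 36 = 59 follow-up responses × 0.15 hours ≈ 8.85 hours, reported as 9 in Table 1).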

Table 1. Burden for Cognitive Interviews (AIR recruited + Elliott Benson/Nichols Research recruited)

Respondent                                          Hours per respondent   Number of respondents   Number of responses   Total hours (rounded up)

Schools and Organizations
  Initial contact                                   0.05                   40 + 0                  40 + 0                2
  Follow-up contact                                 0.15                   10* + 0                 10 + 0                2
  Confirmation                                      0.05                   8* + 0                  8 + 0                 1
  Sub-Total                                                                40                      58                    5

Parent or Legal Guardian for Student Recruitment
  Initial contact                                   0.05                   46 + 60                 46 + 60               6
  Follow-up contact                                 0.15                   23* + 36*               23 + 36               9
  Consent form completion and return                0.13                   18* + 18*               18 + 18               5
  Confirmation                                      0.05                   18* + 18*               18 + 18               2
  Sub-Total                                                                106                     237                   22

Participation (Cognitive Interviews)
  Students                                          2.00                   30a                     30                    60
  Sub-Total                                                                30                      30                    60

Total Burden                                                               176                     325                   87

* Subset of initial contact group, not double counted in the total number of respondents.

a The estimated number of actual participants will be somewhat less than the confirmation numbers.

  10. Recruitment Costs

To encourage participation and to thank students for their time and effort, an incentive will be offered to each participating student. Because the cognitive labs will run for 120 minutes to accommodate the length of the SBTs, we will offer each student a $40 incentive. If a parent or legal guardian brings the student to and from the testing site, he or she will also receive $25 as a thank-you for the time, effort, and transportation provided for the child.

  11. Costs to Federal Government

The estimated cost to the federal government for the NAEP Science SBT cognitive interview study is $233,053, as delineated in Table 2.

Table 2. Estimate of Costs to Federal Government

Activity                                                           Provider                          Cost
Design, prepare, and conduct cognitive interviews (including
recruitment, allocation of incentive costs, data collection,
analysis, and reporting)                                           AIR                               $210,820
Design cognitive interviews; provide space and equipment to
conduct Santa Barbara-based interviews                             UCSB                              $18,703
Cognitive interview recruitment; provide space to conduct
Sacramento-based interviews                                        Elliott Benson/Nichols Research   $3,530
Total Estimate                                                                                       $233,053
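As an arithmetic check, the component costs sum to the total estimate: $210,820 + $18,703 + $3,530 = $233,053.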



  12. Schedule

Table 3 depicts the high-level schedule for the study’s activities, from recruitment through data collection, analysis, and reporting.

Table 3. Timeline for Cognitive Interviews for NAEP Science SBTs, Grade 8

Activity                                        Dates
Recruit participants                            April 2016 to July 2016
Data collection, preparation, and coding        May 2016 to August 2016
Data analysis of cognitive interview results    August 2016 to October 2016
Cognitive interview summary report              November 2016 to March 2017



1 If appropriate, relevant appendices (i.e., parental screening calls) may be translated into Spanish to facilitate communication.
