


National Center for Education Statistics

National Assessment of Educational Progress





Volume I

Supporting Statement





National Assessment of Educational Progress (NAEP) 2022

Economics Pretesting



OMB# 1850-0803 v.239







September 2018

Volume I Table of Contents

1. Submittal-Related Information
2. Background and Study Rationale
3. Recruitment and Data Collection
4. Consultations Outside the Agency
5. Justification for Sensitive Questions
6. Paying Respondents
7. Assurance of Confidentiality
8. Estimate of Hourly Burden
9. Cost to the Federal Government
10. Project Schedule

Attachments:

Volume II – Protocols

Appendices – Communication Materials



  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP), authorized by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622), is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is conducted by NCES, which is part of the Institute of Education Sciences within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

This request is to conduct pretesting of grade 12 discrete items (DI) and interactive item components (IICs) for the 2022 NAEP Economics Assessment. Pretesting is used to obtain data about new digitally enhanced items, tasks, and stimuli during the NAEP item development (ID) process and is intended to make the development of assessment instruments more efficient. Pretesting before piloting helps to identify and eliminate potential issues with items and tasks, which can mean fewer challenges in scoring and analysis and can lead to higher pilot item survival rates.

The overall goal of ID pretesting is to determine whether items and tasks appear to elicit the targeted knowledge and skills and/or to minimize the elicitation of unintended ones, in the form of either evidence that can be scored or qualitative data consisting of observations and student reactions. Pretesting helps to identify whether any item content or features cause confusion or introduce sources of construct-irrelevant variance, and it supplies data upon which refinements to items and scoring rubrics can be based.

This study will include cognitive interviews and tryouts. Cognitive interviews allow for the gathering of qualitative data about how students work through items and offer opportunities to probe potential sources of construct irrelevance. The larger samples and timed testing conditions of tryouts are especially useful for gathering quantitative data about timing and item performance before piloting.

In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview drawing on methods from cognitive science. In NAEP studies to date, two methods have been combined: think-aloud interviewing and verbal probing techniques. With think-aloud interviewing, respondents are explicitly instructed to "think aloud" (i.e., describe what they are thinking) as they work through questions. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think-aloud” process, or to explore additional issues that have been identified a priori as being of particular interest. This combination of allowing the students to verbalize their thought processes in an unconstrained way, supplemented by specific and targeted probes from the interviewer, has proven to be flexible and productive. This will be the primary approach in the NAEP economics cognitive interviews.

In tryouts, students will work uninterrupted through selected draft items. Tryouts allow for pretesting of a wider range of content and the collection of more robust data on ranges of student responses, item difficulty, assessment timing, and item functionality than is practical for cognitive interviews. The larger samples and timed testing conditions of tryouts are especially useful for gathering quantitative data about items, investigating the possible effects of different item features on student performance, and learning how long it takes students to complete items.

  3. Recruitment and Data Collection

Recruitment and Sample Characteristics

EurekaFacts, an NCES subcontractor for NAEP, will recruit a maximum of 20 students to participate in cognitive interviews and a maximum of 50 students to participate in tryouts.

EurekaFacts will recruit participants for the pretesting study from the District of Columbia, Maryland, Virginia, West Virginia, Delaware, and southern Pennsylvania. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Students will be recruited to meet the following criteria:

  • Students who are enrolled in 12th grade for the 2018-2019 school year;

  • A mix of gender;

  • A mix of race/ethnicity (Black, Asian, White, Hispanic, etc.);

  • A mix of socioeconomic backgrounds; and

  • A mix of urban/suburban/rural areas.

While EurekaFacts will use various outreach methods (see Appendices) to recruit students to participate, the bulk of the recruitment will be conducted by telephone and will be based on acquisition of targeted mailing lists containing residential addresses and landline and cellular telephone listings. EurekaFacts will also use a participant recruitment strategy that integrates multiple outreach methods and resources such as newspaper and internet ads, community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations), and mass media recruitment (e.g., postings on the EurekaFacts website).

Interested students 18 years of age or older (see Appendix G) and parents of students under 18 years of age (see Appendix D) will be screened to ensure that the recruited students meet the criteria for participation in the study (i.e., that the students are from the targeted demographic groups outlined above). When recruiting participants, EurekaFacts staff will first speak to the student (if 18 years of age or older) or to the parent/legal guardian of the interested minor before starting the screening process. During this communication, the student or the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities that it entails. After confirming that a participant is qualified, willing, and available to participate in this study, he or she will receive a confirmation email/letter and phone call. Written, informed consent will be obtained from parents and from student participants age 18 or over (see Appendices L and M) for all respondents who are interested in participating in the study.

Data Collection Process

Cognitive Interviews

Cognitive interviews will take place at a range of suitable venues. EurekaFacts will conduct the interviews at its Rockville, Maryland site or at other locations. In all cases, a suitable environment, such as a quiet room, will be used, and more than one adult will be present. Each cognitive interview session will last 90 minutes.

Each participant will first be welcomed by staff, introduced to the interviewer and the observer, and told that s/he is there to help answer research questions about how people answer social science questions. Interviewers will explain the cognitive interview process and, to the extent that the think-aloud process is used, conduct a practice session with a sample question.

Protocols for cognitive interviews will include probes for use as students work through item sets, and also probes for use after students finish answering items (see Volume II). Probes will include a combination of pre-planned questions, identified before the session, and ad hoc questions that the interviewer identifies as important from observations during the interview, such as clarifications or expansions on points raised by the student. For example, if a student paused for a long time over a particular item, appeared to be frustrated at any point, or indicated an ‘aha’ moment, the interviewer might probe these kinds of observations further, to find out what was going on. To minimize the burden on the student, efforts will be made to limit the number of verbal probes that can be used in any one session or in relation to any set of items. The welcome script, cognitive interview instructions, and hints for the interviewers are provided in Volume II.

Interactions and responses may be recorded via video screen-capture software (e.g., Morae® software by TechSmith). These recordings can be replayed for later analysis, to see how a given student progressed through the task. Digital audio recording will capture students’ verbal responses to the interview, using either the tablet’s integral microphone or an external digital recorder, depending on the specific tablet platform used and compatibility with the screen-capture software. Interviewers will also record their own notes separately, such as behaviors (e.g., “the participant appeared confused”), questions posed by students, and observations of how long various items take to complete.

The types of data collected about the items and components will include:

  • students’ reactions and responses to items and components;

  • behavioral data (e.g., actions observable from interviewer notes, process data, screen-captures, gaze patterns where collected);

  • responses to generic questions;

  • responses to targeted questions specific to the item(s);

  • additional volunteered participant comments; and

  • answers to debriefing questions.

Tryouts

Tryout sessions will be conducted by EurekaFacts in small groups. Each session will last 90 minutes. Because students complete items on their own and without interruption during tryouts, it is possible, and most efficient, to have several students work at the same time. A proctor will be present during the session and will follow a strict protocol to provide students with general instructions, guide the group through the tryout, administer any debriefing questions, and assist students in the case of any technical issues (see Volume II). The proctor will take notes on any observations or issues that occur during the tryout session. Finally, proctors will present students with follow-up verbal or written probes; this approach has been used successfully in mathematics and in the social sciences. Questions typically ask students about their reactions, areas of confusion, and background knowledge. ETS staff will develop these questions and share them with EurekaFacts staff. EurekaFacts will make sound recordings of the post-tryout discussion sessions. No other types of recordings will be made during tryouts.

The types of data collected will include:

  • process data (e.g., timing and students’ movements among items);

  • responses to items;

  • EurekaFacts observer notes; and

  • answers to debriefing questions.

  4. Consultations Outside the Agency

Educational Testing Service (ETS) is the Item Development, Data Analysis, and Reporting contractor for NAEP and will develop the items, analyze the results, and draft a report of the findings. EurekaFacts, a research and consulting firm based in Rockville, MD, and a subcontractor to ETS, will recruit participants and administer the cognitive interviews and tryouts.

  5. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  6. Paying Respondents

To encourage participation in a 90-minute session, a $35 gift card from a major credit card company will be offered to each student who participates in a pretesting session as a thank-you for his or her time and effort. If a parent or guardian provides transportation for a student, he or she will be offered a $25 gift card from a major credit card company as thanks for bringing the participating student to and from the testing site.

  7. Assurance of Confidentiality

The study will not retain any personally identifiable information. Prior to the start of the study, students will be notified that their participation is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Before students can participate in the study, written consent will be obtained from students 18 years of age or older and from the parents or legal guardians of students less than 18 years of age. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study, and will be destroyed after the final report is released. Pretesting activities may be recorded using audio or screen capture technology. The only identification included on the files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is completed.

  8. Estimate of Hourly Burden

The estimated burden for recruitment assumes attrition throughout the process.1 In all cases, each student will participate in one session lasting a total of 90 minutes. Table 1 details the estimated burden.

Table 1. Estimate of Hourly Burden for Pretesting Activities

Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours
Student Recruitment via Teachers and Staff | | | |
Initial contact with staff: email, flyer distribution, & planning | 10 | 10 | 0.33 | 4
Parent or Legal Guardian | | | |
Flyer and consent form review | 188 | 188 | 0.08 | 16
Consent form completion and return | 94* | 94 | 0.13 | 13
Confirmation to parent via email or letter | 70* | 70 | 0.05 | 4
Recruitment Totals | 198 | 362 | | 37
Student | | | |
Grade 12 Cognitive Interviews | 20 | 20 | 1.5 | 30
Grade 12 Tryouts | 50 | 50 | 1.5 | 75
Interview Totals | 70 | 70 | | 105
Total Burden | 268 | 432 | | 142

*Subset of initial contact group
Note: Numbers have been rounded; rounding may affect totals.
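As a quick check on the figures above, the short Python sketch below simply re-adds the published Table 1 values; the row tuples restate the table, and, as the rounding note indicates, the rounded hours-per-respondent figures mean that row products may differ slightly from the reported total hours.

```python
# Minimal sketch: re-adding the published Table 1 burden figures.
# Each tuple restates a table row: (label, respondents, responses, total hours).
recruitment_rows = [
    ("Initial contact with staff", 10, 10, 4),
    ("Flyer and consent form review", 188, 188, 16),
    ("Consent form completion and return", 94, 94, 13),          # subset of the 188 parents
    ("Confirmation to parent via email or letter", 70, 70, 4),   # subset of the 188 parents
]
student_rows = [
    ("Grade 12 cognitive interviews", 20, 20, 30),
    ("Grade 12 tryouts", 50, 50, 75),
]

# Unique recruitment respondents count only the 10 staff and 188 parents;
# the starred rows (94 and 70) are subsets of the 188 and are not re-counted.
recruitment_respondents = 10 + 188                                # 198
recruitment_responses = sum(row[2] for row in recruitment_rows)   # 362
recruitment_hours = sum(row[3] for row in recruitment_rows)       # 37

student_respondents = sum(row[1] for row in student_rows)         # 70
student_responses = sum(row[2] for row in student_rows)           # 70
student_hours = sum(row[3] for row in student_rows)               # 105

print(recruitment_respondents + student_respondents)  # 268 total respondents
print(recruitment_responses + student_responses)      # 432 total responses
print(recruitment_hours + student_hours)              # 142 total burden hours
```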

  9. Cost to the Federal Government

The total cost of the study is $376,542 as detailed in Table 2.

Table 2. Cost to the Federal Government

Activity | Provider | Estimated Cost
Cognitive Interviews | |
Design and prepare for up to two rounds of cognitive interviews; analyze findings & prepare report | ETS | $65,916
Prepare for and administer up to two rounds of cognitive interviews (including recruitment, incentive costs, data collection, analysis, & reporting) | EurekaFacts | $144,465
Tryouts | |
Design and prepare for task tryouts, analyze findings, and prepare report | ETS | $61,286
Prepare for and administer task tryouts (including recruitment, incentive costs, data collection, & reporting) | EurekaFacts | $104,875
Total | | $376,542
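As an arithmetic check, the four line items in Table 2 sum to the $376,542 total cited at the start of this section:

```python
# Sum of the Table 2 line items; matches the $376,542 total stated above.
print(65_916 + 144_465 + 61_286 + 104_875)  # 376542
```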



  10. Project Schedule

The schedule for this study, including all activities, is provided in Table 3.

Table 3. Project Schedule

Activity (each activity includes recruitment, data collection, and analyses) | Dates
Cognitive interviews | September 2018–January 2019
Small-scale tryouts | October 2018–January 2019
Pretesting reports submitted | January 2019



1 Assumptions for approximate attrition rates are 50 percent from initial contact to consent form completion and 25 percent from submission of consent form to participation.
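Applying these assumed attrition rates to the 188 parents initially contacted reproduces the consent and confirmation counts in Table 1. The brief calculation below is illustrative only; truncating the 70.5 expected participants down to 70 is an assumption made for this sketch.

```python
# Illustrative application of the assumed attrition rates to the Table 1 counts.
parents_contacted = 188                                        # flyer and consent form review
consent_forms_returned = parents_contacted * 0.50              # 50% attrition -> 94.0
participants_confirmed = consent_forms_returned * (1 - 0.25)   # 25% further attrition -> 70.5

print(int(consent_forms_returned))   # 94, matching the consent form completion row
print(int(participants_confirmed))   # 70, matching the confirmation row and the
                                     # 70 participating students (20 interviews + 50 tryouts)
```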


