
NCES System Clearance for Cognitive, Pilot, and Field Test Studies 2019-2022





National Center for Education Statistics (NCES)




Volume I

Supporting Statement



Trends in International Mathematics and Science Study (TIMSS 2023) Cognitive Interviews



OMB# 1850-0803 v.263




October 2020


Table of Contents

1) Submittal-Related Information
2) Background and Study Rationale
3) Recruitment and Data Collection
4) Consultations Outside the Agency
5) Assurance of Confidentiality
6) Estimate of Hourly Burden
7) Estimate of Costs for Paying Respondents
8) Costs to Federal Government
9) Schedule

Attachments:

Volume II – Protocols

Appendices – Communication Materials



1) Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803) that provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

2) Background and Study Rationale

The Trends in International Mathematics and Science Study (TIMSS) is an international assessment of fourth- and eighth-grade students’ achievement in mathematics and science. The United States’ participation in this study at regular intervals provides data on current and past education policies and allows U.S. education policies to be compared with those of other participating countries. The United States will participate in TIMSS 2023 to continue to monitor the progress of its students compared to that of other nations and to provide data on factors that may influence student achievement.


TIMSS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the frameworks used to develop the assessment and the background questionnaires. The IEA defines a common set of standards and procedures for collecting and reporting data, as well as the studies’ timeline, all of which must be followed by all participating countries. As a result, TIMSS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES) sponsors this study in collaboration with the IEA and other contractors (Westat, AIR, and Hager Sharp) to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards.


The TIMSS collection of data is consistent with the NCES mandate. The enabling legislation of the National Center for Education Statistics [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations." The Educational Sciences Reform Act of 2002 (ESRA 2002: 20 U.S.C., Section 9543) also specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. In addition to being essential for any international perspective on mathematics and science knowledge and skills, U.S. participation fulfills both the national and international aspects of NCES' mission.


The TIMSS & PIRLS International Study Center at Boston College employs a collaborative process to develop the new achievement items needed for each TIMSS cycle. The development process is directed and managed by the staff of the TIMSS & PIRLS International Study Center, who have considerable experience in the measurement and assessment of mathematics and science achievement. They work collectively with the participating countries to develop the assessment frameworks and items and then to review and test the items. A pilot study is conducted first for additional item development testing, and the items are then evaluated in a field test prior to the Main Study. In the beginning stages of item development, cognitive interviews are employed to understand more about students’ experiences with the mathematics and science content being assessed in the items. More details on the item development process can be found at https://timss.bc.edu/timss2019/methods/chapter-1.html.


TIMSS transitioned to a digitally based assessment, known as eTIMSS, in the 2019 administration. As part of the TIMSS 2023 item development process, specific new electronic assessment items will be pretested in a virtual cognitive laboratory setting with a small number of students before they are administered to a larger sample through the field test and main study. During the cognitive labs, these assessment items will be presented in an online Adobe PDF format rather than in a fully interactive digital format. Information gathered from the cognitive interviews will be used to inform the development of each item’s content before the items are created digitally for administration in the TIMSS field test.


This submittal requests clearance for cognitive interviews of mathematics and science assessment items that are being developed for the upcoming TIMSS 2023 assessment. Specifically, cognitive interviews will be conducted for the following four assessments:


  1. 4th-grade TIMSS 2023 mathematics

  2. 8th-grade TIMSS 2023 mathematics

  3. 4th-grade TIMSS 2023 science

  4. 8th-grade TIMSS 2023 science


Included in the submittal are

  • Volume I — supporting statement that describes the design, data collection, burden, cost, and schedules of the study;

  • Volume II — protocols and questions used in the cognitive interviews; and

  • Appendices — recruitment and communication materials.


American Institutes for Research (AIR) is the lead contractor for the cognitive interviews (see section 4). An overview of the cognitive interviews is presented below.


In cognitive interviews (often referred to as a cognitive laboratory study or cog lab), an interviewer uses a structured protocol in a one-on-one interview (see Volume II for protocols). Cognitive interview studies are largely observational. Due to the coronavirus pandemic, this cognitive interview study has been converted from an in-person to a virtual format to help limit the spread of the virus, reduce the burden on students and parents, and stay within the budget. For this study, students will use a computer in their home to join an online video interview session using Adobe Connect. The interviewer will display a sample of TIMSS assessment questions (or items) on the students’ screens in an online Adobe PDF format, and the students will use Adobe Connect tools to mark their responses. After each item, the interviewer will ask the student a few questions to gather information about the student’s reactions to the item. The largely qualitative data collected will consist mainly of verbal reports in response to these questions, along with volunteered comments.


The objective of this cognitive interview study is to assess the clarity of the presentation of the assessment items and how well students understand what they have to do as they work through the items. The items will be a group of newly drafted, higher-complexity electronic items, called Problem Solving and Inquiry (PSI) items, in the form of online Adobe PDF storyboards selected by the TIMSS & PIRLS International Study Center at Boston College. Each PSI item is estimated to take students either 10-15 minutes (short PSIs) or 20-30 minutes (long PSIs) to complete. The information gained from the study will be used by Boston College for potential revisions prior to creating the digital versions that will be tested during the TIMSS 2023 Field Test.

3) Recruitment and Data Collection

Sampling and Recruitment Plans

The TIMSS 2023 cognitive laboratory study will be conducted in a maximum of three phases. The first phase of cognitive interviews will be conducted in October 2020 (or as soon as OMB clearance is approved), and the two potential subsequent phases will be conducted in the winter or spring of 2021. These additional phases will be administered as determined by Boston College’s needs and based on the availability of new PSI storyboards to be tested. Implementing the study in phases will allow interviews to be conducted with assessment items that are ready to be tested first, while others are still being prepared. It will also allow item developers at Boston College to modify items tested in the initial phases and test them again in later phases to determine whether improvements can be made.


Each phase of cognitive interviews will include Grade 4 and Grade 8 PSI items for testing. Each item will be administered to a total of 6 students in order to gain a range of perspectives on the quality of the item. Each cognitive interview session will test 1-2 PSIs, depending on the length of the PSIs, and include time for debriefing questions, not to exceed a 1-hour session. Phase one of the cog labs, beginning in October, will include up to 2 short Grade 4 PSI items administered to 6 students, and up to 3 Grade 8 PSI items (1 long and 2 short) rotated among up to 12 students, for a maximum total of 18 students in phase one. Subsequent phases may include fewer students, depending on the number of PSI items to be tested, but will not exceed a maximum of 56 students across all 3 phases.


Table 1: Estimated Sample Size by Phase

Cog Lab Phase                    | Number of Grade 4 Students | Number of Grade 8 Students | Max. Total Number of Students
Phase 1 (October 2020)           | 6 (for 2 items)            | 12 (for 3 items)           | 18
Phases 2 & 3 combined (TBD 2021) | 6-24 (for 2-6 items)       | 6-24 (for 2-6 items)       | 38
Max. Total                       |                            |                            | 56*

*While number of students at each grade is to be determined and may not be equal across grades, the maximum number of students will not exceed 18 for Phase 1 and 38 for Phases 2 and 3 combined, and will not exceed a total of 56 students across all phases.
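The sample-size caps in Table 1 can be cross-checked with a line of arithmetic (a sketch only; the variable names are ours, and the figures come from the table):

```python
# Sample-size caps from Table 1 (variable names are ours; figures from the document).
phase1_grade4 = 6     # up to 2 short Grade 4 PSIs, each administered to 6 students
phase1_grade8 = 12    # up to 3 Grade 8 PSIs rotated among up to 12 students
phase1_max = phase1_grade4 + phase1_grade8   # Phase 1 cap
phases_2_3_max = 38                          # combined cap for Phases 2 and 3
overall_max = phase1_max + phases_2_3_max    # cap across all phases

print(phase1_max, overall_max)  # 18 56
```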


The cognitive labs will be conducted virtually by AIR staff using Adobe Connect software.


Students will be recruited from across the United States by EurekaFacts. The EurekaFacts participant database includes tens of thousands of individuals nationally and is constantly refreshed and updated through the company’s independent participant outreach methods in both English and Spanish. These efforts ensure that the research participants recruited by EurekaFacts are not “professional respondents”; instead, they are individuals with limited to no experience with qualitative research who can provide fresh and actionable information for stakeholders. As a result, the sample from the database will be representative of the survey’s population and should not impact testing results.


English-speaking students who demonstrate comfort with talking to an interviewer during the initial recruitment screening step will be recruited from EurekaFacts’ database of potential research participants, drawing from the following demographic populations:


  • A mix of races/ethnicities (Black, Asian, White, Hispanic, etc.);

  • A mix of mathematics ability levels;

  • A mix of school types (public and private);

  • A mix of socioeconomic backgrounds; and

  • A mix of genders.


EurekaFacts will collect this demographic information during the recruitment screener and use what information they might already have in their databases to aim for a diverse sample of students. The results will not explicitly measure differences by those characteristics, meaning that the sample will not necessarily be representative of the grade 4 and 8 student populations.


EurekaFacts will conduct recruitment by telephone based on its database of potential research participants. All prospective participants in the database are adults (18 or older), in keeping with the Children’s Online Privacy Protection Rule (COPPA), which protects children under the age of 13. EurekaFacts maintains data on adults (18+) who have informed EurekaFacts that they are parents and have children. Recruiting from such a database minimizes hang-ups and refusals, reducing response burden. It was decided not to recruit students through schools because recruitment through local schools is more indirect and may result in lower response rates from students and parents than direct calls.


Interested participants will be screened to ensure that students meet the criteria for participation in the study. See Appendix A for the recruitment screener. When recruiting participants, staff will first speak to the parent/guardian of the interested minor before starting the screening process. During this communication, the parent/guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort as well as the activities that it entails. After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation e-mail (see Appendix B). Informed parental consent will be obtained for all respondents under the age of 18 who are interested in participating in the data collection efforts (see Appendix C). After the interview, the parent of the student who participated will receive a thank-you e-mail and the electronic gift card incentives (see Appendix D). See Appendices A through D for all recruitment and communication materials.

Cognitive Interview Information

The template for the cognitive interviews is contained in Volume II of this submittal. The template includes:

  • welcome/thank you/introductory remarks,

  • a generic version of the cognitive interview scripts that will be customized for each of the items in the study, and

  • closing remarks/thanks.

Item Information

Items to be included in the study will be selected by the International Study Center at Boston College from the pool of Problem Solving and Inquiry (PSI) items being developed for the TIMSS 2023 Field Test. PSI items are intended to be administered digitally in the TIMSS Field Test and Main Study. The PSIs are scenario-based tasks with multiple questions that may take a student 10-15 minutes (short PSIs) or 20-30 minutes (long PSIs). For the purposes of these cognitive interviews, only online Adobe PDF storyboards of the PSIs will be presented to the students. The storyboards will allow the students to work through the items on screen much as they would on an electronic device. Storyboards are used during this stage of item development to gather student input on the content of the items, informing the development of the PSIs prior to the labor-intensive process of developing the digital versions.


For these cognitive interviews, Boston College is developing and will select PSI items across both mathematics and science and for Grades 4 and 8.

Calculator Use

Based on international guidelines set by the IEA for calculator use in TIMSS assessments since 2003, calculators will not be permitted during fourth-grade cognitive interviews. At the eighth grade, however, the TIMSS policy on calculator use is to give students the best opportunity to operate in settings that mirror their classroom experiences. Thus, following the United States’ implementation of the calculator policy in the TIMSS main assessments since 2003, calculators will be permitted, but not required, during eighth-grade cognitive interviews. Because the TIMSS main assessments are conducted in schools, students in the U.S. have used the calculators provided by their schools that they are accustomed to using. For this study, eighth graders will be permitted, but not required, to use a calculator they typically use for schoolwork.

Cognitive Interview Process

AIR will conduct the cognitive interviews, based on the protocol structures described in Volume II. Protocols for the interviews have been developed based on NCES cognitive lab materials, with appropriate adaptation for TIMSS and the purpose of this study.

Participants will first be welcomed, introduced to the interviewer, and told they are there to help answer questions about how students answer assessment items. Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573 and 6 U.S.C. §151]. The interviewer will then review the technical logistics of Adobe Connect with the student and parent. At this time, participants will also be notified that the interview will be audio and video recorded using the Adobe Connect software and asked for their verbal consent before the interview continues.


The interviews will focus on the following questions:

  • Can the student determine what is being asked of her/him at each point in the task?

  • How much time did the student spend on the task when s/he worked through it without guidance? What is the reading load for the task?

  • What is the student’s overall impression of the task? Is the task accessible? Engaging? Is the content and/or context familiar?

  • Is the level of complexity of each stage of the task appropriate for the grade level?

  • Is the student directing her/his attention appropriately at each stage of the task (where is the student’s focus directed at each stage)?


The interviews will include general questions about the clarity of the item; if the student reports issues with an item’s clarity, the interview will include targeted questions about parts of the item, which may be tailored to specific items. The interviewer script of questions will be generated by AIR for each PSI item and reviewed by NCES and Boston College (see Volume II).


The interviews will follow these steps:

  1. The interviewer will ask the student to work through the first PSI item for 15 minutes (or 30 minutes if identified as a long item by Boston College). This will give the student an opportunity to think about each item before the interviewer proceeds with questioning. Students will be asked to think aloud.

  2. After the student has produced answers or responses for the PSI item, or after the allotted amount of time has passed, the interviewer will begin to collect information through questioning the student directly. Initially, the student will be asked to report on the clarity of the item. For an item which the student finds to be clear with no issues there will be no additional questions after the general questions have been completed for this particular item. For an item the student finds unclear, the interviewer will proceed through a variety of additional probing questions directed at assessing the clarity of the item to the student. Areas of questioning will include: use and meaning of stimulus material provided (e.g., geometric figures, graphs, etc.), whether the student encountered any unfamiliar words, how the student solved the item, and whether the content was familiar.

  3. The interviewer will move to the second PSI item when questioning for the first PSI item is complete. The items for each student will be identified by AIR in advance.

  4. The interviewer, using his/her judgment based on experience, will note pertinent aspects of the interview process, such as the student’s level of motivation and any special circumstances that might affect the interview.

  5. As the student provides information during the session, interviewers will take notes to record the data. The generic item script developed by AIR will provide consistency in gathering data and will aid in data analysis (see Example Generic Item Scripts in Volume II).

  6. This process will not focus on whether the student produced a correct or incorrect answer for each question, but instead, on how that answer was determined and the clarity of the item (i.e., how the question was interpreted, the thinking process engaged in, etc.).

  7. As part of the process, the interviewer will record the student’s answer and, if the item is a multiple-choice item, whether it was correct; that information will not be shared with the student. If the item is a constructed-response item, the interviewer will score it after the interview has been completed. Students will not be given the correct answers by the interviewer, as this may affect any subsequent questions or interviews.

  8. After the interview concludes, the interviewer will input their notes from the session within 24 hours and submit them and the Adobe Connect recording to the interview manager.

Analysis Plans

For the cognitive interview data collections, the key unit of analysis is the item. Items will be analyzed across participants. The types of data collected about the items will include: responses to general questions about student reactions to the item; responses to targeted questions specific to the item or task; and additional volunteered participant comments.


The deliverable from the analyses of the cognitive interviews will be in the format of a memo for each phase. Each memo will contain results for each item and a brief summary of results. The memos will be delivered along with Excel files containing student responses AIR collected in interviewer notes. The general analysis approach will be to compile the data gathered through the interviewer note-taking sheet to present the data at the item-level. The presentation of the report will facilitate the identification of patterns of responses for each item.

4) Consultations Outside the Agency

AIR is an established not-for-profit research organization. AIR offers facilities, tools, and staff to collect and analyze both qualitative and quantitative data. AIR will use their trained cognitive interviewer staff to conduct the cognitive interviews virtually using Adobe Connect software.

The TIMSS & PIRLS International Study Center at Boston College in the United States is involved in consultations with NCES regarding the purpose and scope of the cognitive lab study. Boston College will provide the assessment items that will be used and the results from the cognitive lab study will be sent to Boston College to supplement item development processes for TIMSS 2023.


EurekaFacts is a for-profit research firm that will be responsible for the student recruitment.

5) Assurance of Confidentiality

Participants will be notified that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573 and 6 U.S.C. §151)].


A consent form that explains the purpose and duration of the interview—to be signed and returned before interviews are conducted—will be sent via e-mail to the parents or legal guardians of all students. The student will also be asked to provide verbal consent to participate at the start of the interview sessions, which will be recorded using Adobe Connect to assist with post-interview note-taking. Information about the interview recording will be included along with an opt-out box in the parent consent form (see Appendix C). Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study in a restricted-access folder on AIR’s network, and will be destroyed after the final report is released. All e-mail exchanges with parents or legal guardians will be saved and archived in a restricted-access folder on AIR’s network. The Adobe Connect recordings will be stored in a restricted-access folder on AIR’s network and will be deleted after the project concludes.

Justification for Sensitive Questions

Throughout the development of the items, tasks, and interview protocols, effort has been made to avoid asking for information that might be considered sensitive or offensive. Reviewers have attempted to identify and minimize potential bias in questions.

6) Estimate of Hourly Burden

Based on the proposed outreach and recruitment methods, we estimate an initial respondent burden of 0.15 hours per recruitment phone call. We estimate recruiting 24 participants in phase one to yield the maximum of 18 participating students, and a total of 70 recruited participants across the 3 phases of interviews to achieve a total of 56 participating students. Student interviews will be limited to 60 minutes for all students. The estimated burden for recruitment assumes attrition throughout the process. Interviews will be conducted virtually using the Adobe Connect software.


Table 2 details the estimated burden for the cognitive interviews.


Table 2. Estimate of Hourly Burden

Respondent                                       | Hours per respondent | Number of respondents | Number of responses | Total hours (rounded up)
Parent or Legal Guardian for Student Recruitment:
  Recruitment, screening, and inviting           | 0.15                 | 960                   | 960                 | 144
Participation (Cognitive Interviews):
  Students                                       | 1                    | 56                    | 56                  | 56
Total Burden                                     |                      | 1,016                 | 1,016               | 200
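The burden totals in Table 2 follow directly from the row figures; a quick arithmetic check (a sketch only; the variable names are ours, and the figures come from the table):

```python
import math

# Row figures from Table 2 (variable names are ours; figures from the document).
recruit_contacts = 960          # parents/guardians screened by phone
recruit_hours_each = 0.15       # hours per screening call
interview_students = 56         # maximum participating students
interview_hours_each = 1        # hours per interview session

recruit_hours = math.ceil(recruit_contacts * recruit_hours_each)  # rounded up, as in the table
interview_hours = interview_students * interview_hours_each

total_respondents = recruit_contacts + interview_students
total_hours = recruit_hours + interview_hours

print(total_respondents, total_hours)  # 1016 200
```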


7) Estimate of Costs for Paying Respondents

Each participating student will receive a $25 electronic Amazon gift card as a thank you for his or her time and effort. In addition, an electronic Amazon gift card of $25 will be given to the parent or legal guardian to thank him or her for the time involved in helping the child successfully connect to the Adobe Connect interview.

8) Costs to Federal Government

The total cost to the federal government for conducting the TIMSS 2023 cognitive interviews is estimated to be $100,000. The estimated costs for the activities detailed in this package are described below.


Table 3. Estimate of Costs

Activity                                                                                            | Provider    | Estimated Cost
Design, preparation, and conduct of cognitive interviews (data collection, analysis, and reporting) | AIR         | $84,000
Recruitment of students                                                                             | EurekaFacts | $16,000
Total                                                                                               |             | $100,000


9) Schedule

Table 4 provides the schedule of milestones and deliverables for the cognitive interviews. Phase 1 dates have been set based on the items from Boston College that are ready to be tested. Phase 2 and 3 dates have been planned but remain tentative, as they depend on how quickly Boston College develops new items that are ready for testing.


Table 4. Schedule of Milestones and Deliverables

Activity                                                       | Dates
Expected OMB clearance                                         | September 2020
Recruit participants for Phase 1 (subsequent to OMB clearance) | September - October 2020
Phase 1 data collection                                        | October 2020
Phase 1 data analysis                                          | October 2020
Phase 1 study report                                           | Early November 2020
Recruit participants for potential Phase 2                     | Tentative January 2021
Phase 2 data collection                                        | Tentative January 2021
Phase 2 data analysis                                          | Tentative January 2021
Phase 2 study report                                           | Tentative February 2021
Recruit participants for potential Phase 3                     | Tentative April 2021
Phase 3 data collection                                        | Tentative May 2021
Phase 3 data analysis                                          | Tentative May 2021
Phase 3 study report                                           | Tentative June 2021

