Trends in International Mathematics and Science Study (TIMSS 2015) Cognitive Interviews
Volume I
Supporting Statement
Request for OMB Clearance
OMB# 1850-0803 v.86
Submitted by:
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
September 2013
Volume I Table of Contents
1) Submittal-Related Information
2) Background and Study Rationale
4) Cognitive Interview Process
5) Consultations Outside the Agency
6) Assurance of Confidentiality
7) Estimate of Hourly Burden
8) Estimate of Costs for Paying Respondents
This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803) that provides for NCES to conduct various procedures (such as pilot tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.
The Trends in International Mathematics and Science Study (TIMSS) is an international assessment of fourth- and eighth-grade students’ achievement in mathematics and science. Participation in this study by the United States at regular intervals provides data on current and past education policies and a comparison of U.S. education policies with those of its international counterparts. The United States will participate in TIMSS 2015 to continue to monitor the progress of its students compared to that of other nations and to provide data on factors that may influence student achievement. In addition to the main study, the United States will also participate in TIMSS Advanced in 2015 to assess twelfth-grade students’ achievement in advanced mathematics and physics.
TIMSS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the frameworks used to develop the assessments and background questionnaires. The IEA establishes a common set of standards and procedures for collecting and reporting data and defines the studies’ timeline, all of which must be followed by all participating countries. As a result, TIMSS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES) sponsors this study in collaboration with the IEA and other contractors (Westat, Avar Consulting (Avar), the American Institutes for Research (AIR), and Hager Sharp) to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards.
The TIMSS collection of data is consistent with the NCES mandate. The enabling legislation of the National Center for Education Statistics, the Education Sciences Reform Act of 2002 (ESRA 2002; 20 U.S.C. §9543), specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. In addition to being essential for an international perspective on mathematics and science knowledge and skills, U.S. participation fulfills both the national and international aspects of NCES’ mission.
As part of the TIMSS item development process, specific assessment items will first be evaluated with a small sample of students through cognitive interviews, before being tested on a larger scale in the TIMSS 2015 field test and eventually administered to a nationally representative student sample in the main study. This submittal requests clearance for the cognitive interviews of mathematics and science assessment items proposed for the following six TIMSS 2015 and TIMSS Advanced 2015 assessments:
4th-grade TIMSS 2015 mathematics
8th-grade TIMSS 2015 mathematics
4th-grade TIMSS 2015 science
8th-grade TIMSS 2015 science
12th-grade TIMSS Advanced 2015 mathematics
12th-grade TIMSS Advanced 2015 physics.
In cognitive interviews (often referred to as a cognitive laboratory study or cog lab), an interviewer uses a structured protocol in a one-on-one interview (see Volume II for protocols). In this study, students will answer a sample of TIMSS assessment items as they normally would in a testing situation. After each item, the interviewer will ask the student a few questions to gather information about the student’s understanding of and reactions to the item. The objective of this cognitive interview study is to assess the clarity of the presentation of the problem-solving situations in the most complex items and how well high-achieving students understand what they have to do as they work through the items. The items will be a group of higher-complexity items selected from the pool of field test items by the TIMSS & PIRLS International Study Center at Boston College, which will use the information gained to improve the assessments that will be administered in the TIMSS 2015 field test.
Twenty items will be tested for each of the six assessments, and each student will respond to four items. Five students will be interviewed per item, which should be sufficient at this stage given that the key purpose of the cognitive interview is to identify qualitative patterns in how students think at different points in a given task and confirm the clarity of the items. This means that 25 students will be interviewed for each assessment, for a total of 150 students across the three grades and six assessments.
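As an illustrative check of this sample design, the following minimal sketch reproduces the per-assessment and total student counts using only the figures stated above (the constant names are hypothetical and not part of the TIMSS materials):

```python
# Illustrative check of the cognitive interview sample design described above.
ITEMS_PER_ASSESSMENT = 20   # items tested for each of the six assessments
STUDENTS_PER_ITEM = 5       # students interviewed per item
ITEMS_PER_STUDENT = 4       # items each student responds to
NUM_ASSESSMENTS = 6         # grades 4, 8, and 12 in mathematics and science/physics

# 20 items x 5 students per item = 100 item-by-student interviews per assessment;
# each student covers 4 of those interviews, so 25 students are needed per assessment.
students_per_assessment = ITEMS_PER_ASSESSMENT * STUDENTS_PER_ITEM // ITEMS_PER_STUDENT
total_students = students_per_assessment * NUM_ASSESSMENTS

print(students_per_assessment)  # 25
print(total_students)           # 150
```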
AIR staff will conduct the cognitive interviews as a subcontractor to Westat. The interviews will be conducted at facilities in Washington, DC, and San Mateo, California, to allow for a more diverse sample. All items will be tested at least twice in each location. Students will be recruited in the Washington, DC area by Shugoll Research and in the San Mateo, California area by Nichols Research Inc., 75 students in each location. Both companies plan to recruit students from the following demographic populations:
A mix of race/ethnicity (Black, Asian, White, Hispanic, etc.);
A mix of socioeconomic background; and
A mix of urban/suburban residence location.
Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics.
For recruiting 12th graders, both companies will recruit students who are either currently enrolled in or have already taken an advanced mathematics or physics course, as defined by the TIMSS Advanced framework. Using the NCES Secondary School Course Classification System: School Codes for the Exchange of Data (SCED) (http://nces.ed.gov/pubs2007/2007341.pdf), the following courses, at a minimum, will be eligible for the grade 12 mathematics assessment:
02121 Calculus
02122 Multivariate Calculus
02123 Differential Calculus
02124 AP Calculus AB
02125 AP Calculus BC
02126 Particular Topics in Calculus (when it follows a survey calculus course)
02132 IB Mathematics
The following courses in the SCED, at a minimum, will be eligible for the grade 12 physics assessment:
03152 Physics—Advanced Studies
03155 AP Physics B
03156 AP Physics C (either E&M or MECH)
03157 IB Physics
Both companies will recruit high-achieving students at all grades because the objective of the study is to determine whether the most complex and difficult items are understandable even for high-achieving students. High achievement will be determined by parent- and self-reporting of letter grades and achievement in relevant classes. Specifically, students will be recruited who have received As or Bs in their physics or calculus/IB math classes at grade 12, or who have received As or Bs (or excelled, if no letter grades are given) in their math or science classes at grades 4 and 8. For grades 4 and 8, because in some cases students might not receive letter grades in their classes, reports of achievement are considered from both parent and student. For grade 12, students are asked to report the letter grade they received for specific math and physics classes (see the recruitment screener in Appendix A for more details).
Both companies will conduct recruitment by telephone based on their databases, which contain individuals in the area who signed up to be potential research participants. Interested participants will be screened to ensure that students meet the criteria for participation in the study. Both companies will use the same phone script recruitment screener. When recruiting participants, staff will first speak to the parent/guardian of the interested minor before starting the screening process. During this communication, the parent/guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort as well as the activities that it entails. After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation e-mail/letter (Appendix B). Informed consent will be obtained from all respondents who are 18 years old or older who are interested in participating (Appendix D), and informed parental consent will be obtained for all respondents under the age of 18 (Appendix C).
The TIMSS & PIRLS International Study Center at Boston College selected mathematics and science items from the pool of items developed for the TIMSS 2015 and TIMSS Advanced 2015 field tests that expert test developers have evaluated as complex. The items span various content areas and item types and include various types of stimuli. They include multiple-choice and constructed-response items, written in the format of either a single question or a set of questions (an item set). For the purpose of this study, each question in an item set, or multi-part item, is considered an item.
Cognitive Interview Information
The template for the cognitive interviews is provided in Volume II of this submittal, and includes:
welcome/thank you/introductory remarks,
a generic version of the item scripts at each grade level that will be customized for each of the items in the study,
sample items, and
closing remarks/thanks.
Calculator Use
Based on international guidelines set by the IEA for calculator use in TIMSS assessments since 2003, calculators will not be permitted during fourth-grade cognitive interviews. However, the TIMSS policy on calculator use at the eighth grade is to give students the best opportunity to operate in settings that mirror their classroom experiences. Thus, following the United States’ implementation of the calculator policy in the TIMSS main assessments since 2003, calculators will be permitted, but not required, during eighth-grade cognitive interviews. Because the TIMSS main assessments are conducted in schools, students in the U.S. have used the calculators they are accustomed to, which frequently are provided by their school. For this study, eighth graders will be permitted, but not required, to bring a calculator that they like to use for schoolwork. Additionally, AIR will provide basic calculators for eighth-grade students who do not bring their own calculator, should they wish to use one. Twelfth-grade students will be required to bring their own graphing calculator to the interview; the TIMSS Advanced 2015 assessment items for grade 12 were drafted with the expectation that students will have calculators that can graph and perform all scientific functions.
Analysis Plans
For the cognitive interview data collections, the key unit of analysis is the item. Items will be analyzed across participants. The types of data collected about the items will include: responses to general questions about student reactions to the item; responses to targeted questions specific to the item or task; and additional volunteered participant comments.
The deliverable from the analyses of the cognitive interviews will be a report containing results for each item and a brief summary of those results, including information collected in AIR’s interviewer notes. The general analysis approach will be to compile the data gathered through the interviewer note-taking sheet and present it at the item level. The presentation of the report will facilitate the identification of patterns of responses for each item. In addition, AIR will supply a summary of results for each grade and subject.
Qualified interviewers, trained on the cognitive interviewing techniques of the protocols, will conduct the interviews. Participants will first be welcomed, introduced to the interviewer, and told they are there to help answer questions about how students respond to assessment items. Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C §9573].
The list of items for each student will be generated by AIR in advance. The interviews will focus on the following question: How well do students understand what they have to do as they work through an item that they have found to be unclear? The interviews will include general questions about the clarity of the item (see Volume II), and if the student reports issues with clarity in the item, the interview will include targeted questions about parts of the item, which may be tailored to specific items.
The interviews will follow these steps (a simplified illustrative sketch of this flow appears after the list):
The interviewer will ask the student to work through the item for a predetermined amount of time, between 3 and 8 minutes, depending on grade level and item. This will give the student an opportunity to think about the item before the interviewer proceeds with questioning.
After the student has produced a response for the item, or after the allotted amount of time has passed, the interviewer will collect information through questioning the student. Initially, the student will be asked to report on the clarity of the item. For an item which the student answers correctly and finds to be clear, after the general questions have been completed for the particular item, no additional questions will be asked. For an item the student finds unclear and/or answers incorrectly, the interviewer will proceed through a variety of additional probing questions directed at assessing the clarity of the item to the student. Areas of questioning will include: use and meaning of stimulus material provided (e.g., geometric figures, graphs, etc.), whether the student encountered any unfamiliar words, how the student solved the item, and whether the content was familiar.
When questioning for a particular item is complete, the interviewer will move to the next item, continuing until all items are completed or 60 minutes have passed.
The interviewer, using his/her judgment based on experience, will note pertinent aspects of the interview process, such as the student’s level of motivation and any special circumstances that might affect the interview.
As the student is providing information during the session, interviewers will take notes, following the generic item script developed by AIR, designed to provide consistency in data gathering and aid in data analysis (see Example Generic Item Scripts for each grade in Volume II).
The interview will not focus on whether the student produced a correct or incorrect answer for each question, but instead, on how that answer was determined and the clarity of the item (i.e., how the question was interpreted, the thinking process engaged in, etc.).
As part of the process, the interviewer will record the student’s answer and, if the item is a multiple-choice item, whether it was correct, but that information will not be shared with the student. If the item is a constructed-response item and the interviewer is unsure whether the student answered correctly, the interviewer will score the item after the interview has been completed. Students will not be given the correct answers by the interviewer, as this may affect the remainder of the interview.
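The branching in these steps can be summarized in the following simplified sketch. It is an illustration only, not part of the protocols in Volume II; the item names, work times, and outcomes are hypothetical placeholders.

```python
# Simplified, illustrative sketch of the interview flow described in the steps above.
SESSION_LIMIT_MINUTES = 60  # questioning stops once the 60-minute session is used up

def run_session(assigned_items):
    """Walk one student through assigned items: general clarity questions are asked
    for every item; targeted probes only when an item was found unclear or answered
    incorrectly."""
    elapsed = 0
    notes = []
    for item in assigned_items:
        if elapsed >= SESSION_LIMIT_MINUTES:
            break
        elapsed += item["work_minutes"]  # predetermined 3-8 minutes of work time
        needs_probes = (not item["found_clear"]) or (not item["answered_correctly"])
        notes.append({
            "item": item["name"],
            "general_questions": True,        # asked for every item
            "targeted_probes": needs_probes,  # stimulus use, unfamiliar words,
                                              # solution strategy, content familiarity
        })
    return notes

# Hypothetical example: four items assigned to one student.
assigned = [
    {"name": "item_1", "work_minutes": 5, "found_clear": True,  "answered_correctly": True},
    {"name": "item_2", "work_minutes": 8, "found_clear": False, "answered_correctly": True},
    {"name": "item_3", "work_minutes": 6, "found_clear": True,  "answered_correctly": False},
    {"name": "item_4", "work_minutes": 4, "found_clear": True,  "answered_correctly": True},
]
print(run_session(assigned))
```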
The TIMSS & PIRLS International Study Center at Boston College has been consulted by NCES regarding the purpose and scope of the cognitive lab study. Boston College will provide the assessment items to be used, and the results of the cognitive lab study will be sent to Boston College to supplement field test results.
AIR is an established not-for-profit research organization, offering facilities, tools, and staff to collect and analyze both qualitative and quantitative data. AIR will use facilities and staff in both Washington, DC and San Mateo, California to conduct the cognitive interviews. Shugoll Research is a for-profit research firm that will be responsible for the recruitment for the Washington, DC area, and Nichols Research is a for-profit research firm that will be responsible for the recruitment for the San Mateo, California area.
Participants will be notified that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)]. Written consent will be obtained from legal guardians (of minor students) and directly from students 18 years or older before interviews are conducted. Interview sessions will not be audio or video recorded. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files and secured for the duration of the study and will be destroyed after the final report is released.
Based on the proposed outreach and recruitment methods, we estimate the initial respondent burden at 0.10 hours per recruitment phone call. We estimate recruiting 90 to 95 participants in each location (San Mateo, CA, and Washington, DC), with 75 participants in each location attending their interview, for a total of 150 participants. Interviews will be limited to 60 minutes per student. Table 1 details the estimated burden for the cognitive interviews; the burden arithmetic is illustrated in the sketch following the table.
Table 1. Estimate of Hourly Burden
Respondent | Hours per respondent | Number of respondents: Math | Number of respondents: Science | Total burden hours
Student Recruitment | 0.10 | 1,080 | 1,080 | 216
Cog lab - Grade 4 Students | 1 | 25 | 25 | 50
Cog lab - Grade 8 Students | 1 | 25 | 25 | 50
Cog lab - Grade 12 Students | 1 | 25 | 25 | 50
Total Burden |  | 1,155 | 1,155 | 366
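As a check on the arithmetic in Table 1, the following sketch reproduces the totals from the per-respondent hours and respondent counts copied from the table (it is an illustration, not part of the burden estimate itself):

```python
# Reproduce the totals in Table 1 from the per-respondent hours and respondent counts.
rows = [
    # (respondent group, hours per respondent, math respondents, science respondents)
    ("Student Recruitment",         0.10, 1080, 1080),
    ("Cog lab - Grade 4 Students",  1.00,   25,   25),
    ("Cog lab - Grade 8 Students",  1.00,   25,   25),
    ("Cog lab - Grade 12 Students", 1.00,   25,   25),
]

total_math = sum(math for _, _, math, _ in rows)
total_science = sum(science for _, _, _, science in rows)
total_hours = sum(hours * (math + science) for _, hours, math, science in rows)

print(total_math, total_science, total_hours)  # 1155 1155 366.0
```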
Students will be given a $25 Visa gift card to thank them for their time and effort. In addition, if the participating student’s parent or legal guardian brings them to and from the interview site, the parent or guardian will also receive a $25 gift card to thank him or her for the time and effort involved and to offset transportation costs.
The total cost to the federal government for conducting the TIMSS cognitive interviews is estimated to be $100,000, as detailed in Table 2 below.
Table 2. Estimate of Costs

Activity | Provider | Estimated Cost
Conduct cognitive interviews and analyze results | AIR | $69,870
Recruit participants (Washington, DC) | Shugoll Research | $15,530
Recruit participants (San Mateo, CA) | Nichols Research | $14,600
Total |  | $100,000
Table 3 provides the schedule of milestones and deliverables for the cognitive interviews.
Table 3. Schedule of Milestones and Deliverables
Activity | Dates
Recruit participants (subsequent to OMB clearance) | Oct 2013 - Jan 2014
Data collection | Jan - Feb 2014
Data analysis | Feb - Mar 2014
Final study report | April 2014