Trends in International Mathematics and Science Study (TIMSS) 2019 Main Study





OMB# 1850-0695 v.14




Supporting Statement Part A




Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC










September 2018


Supporting Statement Part B

APPENDICES

A: TIMSS 2019 Main Study Recruitment Materials


B: TIMSS 2019 Main Study Notification Letters and Supporting Materials


C.1: TIMSS 2019 International Final Main Study Questionnaires

C.2: TIMSS 2019 Draft U.S. Adaptations to International Questionnaires

C.3: Summary of Changes from TIMSS 2019 International Field Test Questionnaires to International Main Study Questionnaires


D: TIMSS 2019 Non-Response Bias Analysis Plan



PREFACE

The Trends in International Mathematics and Science Study (TIMSS), conducted by the National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), is an international assessment of fourth and eighth grade students’ achievement in mathematics and science. Since its inception in 1995, TIMSS has continued to assess students every 4 years (1995, 1999, 2003, 2007, 2011, and 2015), with the next TIMSS assessment, TIMSS 2019, being the seventh iteration of the study. In TIMSS 2019, 59 countries and 6 other education systems plan to participate at grade 4, and 40 countries and 5 other education systems plan to participate at grade 8. The United States will participate in TIMSS 2019 to continue to monitor the progress of its students compared to that of other nations and to provide data on factors that may influence student achievement.

TIMSS is led by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the frameworks used to develop the assessment, the survey instruments, and the study timeline. The IEA agrees upon a common set of standards, procedures, and timelines for collecting and reporting data, which all participating countries must follow. As a result, TIMSS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., NCES conducts this study in collaboration with the IEA and a number of contractors (Westat, Avar Consulting (Avar), AIR, and Hager Sharp) to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards. Participation in TIMSS is consistent with NCES’s mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543)].

New in 2019, TIMSS will be a technology-based assessment conducted in an electronic format (referred to as “eTIMSS”) and administered using the eTIMSS Assessment Platform. There are two primary goals for the transition to a technology-based assessment: (1) maintaining continuity to measure trends with the past paper-and-pencil TIMSS assessments, and (2) developing relevant assessment innovations that take advantage of new technologies.

Compared to previous assessment cycles, TIMSS 2019 differs in several ways:

  • Unlike in TIMSS 2015, the IEA will not conduct TIMSS Advanced in 2019.

  • TIMSS 2019 will be a digitally-based assessment administered on the eTIMSS Assessment Platform.

  • A pilot study was administered in 2017, followed by a field test in 2018.

The United States participated in the 2017 pilot study to assist in the development of eTIMSS. The pilot was an important part of the transition from a paper-based to a technology-based assessment and was designed to provide information that could be used to reduce the data collection burden in 2019. The purpose of the pilot study was to (1) try out the new eTIMSS Assessment Platform, including the tablet player, data monitoring system, and online scoring; (2) conduct a mode effect study to examine the effect of administering the assessment on tablet versus paper; and (3) pilot newly developed problem-solving and inquiry assessment items. Data collection for the pilot occurred in April and early May 2017. Student samples were obtained by selecting two classes from each school (e.g., two mathematics classes for grade 8).

In preparation for the TIMSS 2019 main study, all countries were asked to implement a 2018 field test, in which the United States participated. The purpose of the TIMSS field test was to evaluate new assessment items and background questions, to ensure practices that promote low exclusion rates, and to confirm that the classroom sampling procedures proposed for the main study work as intended. Data collection for the field test occurred from March through April 2018.

Because TIMSS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of final field test and main study plans as well as draft and final questionnaires. In order to meet the international data collection schedule, to align with recruitment for other NCES studies (e.g., the National Assessment of Educational Progress, NAEP), and to allow schools to put the TIMSS 2019 assessment on their calendars, recruitment activities for the main study began in May 2018. The U.S. TIMSS main study will be conducted from April through May 2019.

Furthermore, prior to the main study data collection, NCES is conducting pretesting in fall-winter 2018 with students in order to test the operational procedures and main study versions of the electronic instruments (approved in September 2018; OMB# 1850-0803 v.238; see Part B.4 for more details).

The request for TIMSS 2019 main study recruitment and the field test was approved in July 2017, with the latest change request approved in July 2018 (OMB# 1850-0695 v.10-13). With that submission, NCES justified the need for and overall practical utility of the main study, presented an overarching plan for all phases of the data collection over the next 3 years, and provided detail on the measures that were available at the time of the submission. This request, with a 30-day public comment period notice published in the Federal Register, is to conduct the TIMSS 2019 main study data collection.

The Supporting Statement Parts A and B of this submission are largely the same as the last approved versions (OMB# 1850-0695 v.10-13), with minor updates to reflect the focus of this request. Appendices A and B provide the approved versions (OMB# 1850-0695 v.10-13) of the TIMSS 2019 main study state, district, and school communication materials and the parent notification materials. Appendix C has been updated with the TIMSS 2019 main study questionnaires, which are a subset of the approved field test questionnaires. Final international versions of the TIMSS 2019 main study questionnaires are provided in Appendix C1, draft U.S. changes/adaptations to the international versions are included in Appendix C2, and changes from the international versions of the field test questionnaires to the main study versions are provided in Appendix C3. Lastly, Appendix D provides an updated Non-Response Bias Analysis Plan. The final versions of the TIMSS 2019 main study instruments, including finalized U.S. adaptations, will be submitted to OMB as a change request in October 2018.

A. Justification

A.1 Importance of Information

Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. TIMSS provides a unique opportunity to compare U.S. students’ mathematics and science knowledge and skills at fourth and eighth grade with those of their peers in countries around the world. Science, technology, engineering, and mathematics (STEM) preparedness is key to economic improvement.

The continuation of U.S. participation allows for the study of past and current education policies that have shaped science and mathematics achievement over the past 24 years. Furthermore, participating countries are not only able to obtain information about students' knowledge and abilities in the specified subjects, but also about the cultural environments, teaching practices, curriculum goals, and institutional arrangements that are associated with student achievement in the respective subject areas.

TIMSS complements what we learn from national assessments such as the National Assessment of Educational Progress (NAEP) by identifying the strengths and weaknesses of U.S. students’ science and mathematics achievement relative to that of their peers in participating countries around the world. It provides valuable benchmarking information about educational policies enacted in other countries and about policies that could be applied to U.S. educational practices.

Based on earlier TIMSS data releases, it is likely that the results of TIMSS 2019 will draw great attention in the United States and elsewhere. It is therefore expected that TIMSS will contribute to ongoing national and international debates and efforts to improve mathematics and science learning and achievement.

A.2 Purposes and Uses of Data

TIMSS assesses mathematics and science knowledge and skills at grades 4 and 8. TIMSS is designed to align broadly with curricula in the participating countries. The results, therefore, suggest the degree to which students have learned concepts and skills likely to have been taught in school. TIMSS also collects background information on students, teachers, schools, curricula, and official education policies in order to allow cross-national comparison of educational contexts that may be related to student achievement.

Data collected from TIMSS 2019 allow evidence-based decisions to be made for the purposes of educational improvement. Each successive participation in TIMSS provides trend information about student achievement in mathematics and science relative to other countries, as well as indicators that show how this achievement relates to the demographic, curricular, school, teacher, and student factors that provide the educational context for achievement. These high-quality, internationally comparative trend data provide key information to inform education policy discussions.

Through participation in TIMSS and other international assessment programs, NCES is able to provide comparative indicators of student performance and school practices across countries in order to benchmark U.S. student performance, to suggest hypotheses about the relationship between student performance and factors that may influence performance, and to identify areas in which students have strengths or weaknesses. The international studies identify differences among countries over time in instructional practices, school policies, and opportunity to learn, which inform discussions about how to organize instruction.

NCES’s mandate [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. §1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations," and the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. TIMSS is essential for any international perspective on students’ mathematics and science knowledge and skills, and U.S. participation in TIMSS is aligned with both the national and international aspects of NCES’s mission.

TIMSS 2019 Components

The mathematics and science assessments at grades 4 and 8 are organized around a content dimension that specifies the subject matter to be assessed and a cognitive dimension that specifies the thinking processes to be assessed. The cognitive domains are the same in mathematics and science: knowing, applying, and reasoning. The TIMSS 2019 frameworks were published in August 2017 and are similar to the TIMSS 2015 frameworks, but slightly revised to provide more specificity for item writers and to better reflect current curricula in participating countries. It is not anticipated that there will be any revisions to the content or cognitive domains, nor to their target percentages, in either subject at either grade.

In fourth-grade mathematics, the content domains include number, geometric shapes and measures, and data display. More advanced content in these three domains is assessed in eighth grade, supplemented by a data and chance domain. TIMSS assesses a range of problem-solving situations within mathematics, with about two-thirds of the questions requiring students to use applying and reasoning skills.

In science at fourth grade, the content domains include: life science, physical science, and earth science. At eighth grade the content domains transition to a more discipline-based approach, reflecting the differences in instruction from elementary school. The content domains at eighth grade are: biology, chemistry, physics, and earth science. TIMSS 2019 will also measure science practices and science inquiry, reflecting recent emphasis on these skills in many countries’ curricula and content standards.

Assessment Mode

In TIMSS 2019, countries can choose to administer a digitally-based assessment rather than the traditional paper-and-pencil mode of assessment. The United States will participate in the digitally-based TIMSS. Studies conducted in 2017, 2018, and 2019 operationalize the new digital mode of assessment and allow analysis of mode effects so that the important TIMSS trend reporting can be maintained. The assessments will be conducted on tablets, such as the Microsoft Surface Pro, with a stylus and external keyboard.

Assessment Instruments

In order to minimize burden and to ensure broad subject-matter coverage, TIMSS will use a matrix sampling approach in which the mathematics and science items at each grade level are organized into a set of test booklets, with each student taking only one booklet. Test items are either multiple-choice or constructed-response items.
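
To make the matrix sampling idea concrete, the minimal sketch below (in Python) rotates illustrative item blocks into booklets and assigns each sampled student exactly one booklet. The block labels and booklet count are hypothetical and are not the actual TIMSS 2019 booklet design.

```python
# Minimal sketch of matrix sampling via booklet rotation (illustrative only;
# the block labels and booklet count below are NOT the actual TIMSS 2019 design).

from itertools import cycle

# Illustrative item blocks: each booklet pairs two blocks so that every block
# appears in more than one booklet, but no student sees every block.
blocks = ["M1", "M2", "M3", "M4", "S1", "S2", "S3", "S4"]

# Simple rotated design: booklet i contains block i and the next block.
booklets = [(blocks[i], blocks[(i + 1) % len(blocks)]) for i in range(len(blocks))]

def assign_booklets(student_ids):
    """Cycle through the booklets so each sampled student takes exactly one."""
    rotation = cycle(range(len(booklets)))
    return {sid: booklets[next(rotation)] for sid in student_ids}

if __name__ == "__main__":
    assignments = assign_booklets([f"S{n:03d}" for n in range(1, 11)])
    for sid, booklet in assignments.items():
        print(sid, booklet)
```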

Questionnaires

The background questionnaires for TIMSS 2019 were developed to address an internationally developed framework of background questions. The United States will adapt the questions to fit the U.S. education context, including adding a few questions, such as questions about students’ race/ethnicity. Teacher and school questionnaires will be offered online, with a paper-and-pencil backup. Students in both the field test and the main study answer their questionnaires on paper, after the cognitive assessment questions administered on tablets. In the main study, a subset of 1,500 students per grade level will take a paper TIMSS assessment (for the purposes of a bridging study between the paper and new electronic versions). TIMSS 2019 main study questionnaires (Appendix C1) are largely based on the TIMSS 2015 questionnaires, as well as the TIMSS 2019 field test questionnaires.

School Questionnaire. A representative from each participating school will be asked to provide information on mathematics and science resources, teacher availability and retention, principal leadership, school emphasis on academic success, school climate, and parental involvement in school activities. The TIMSS school questionnaire is expected to take 30 minutes to complete and will be offered online, with a paper-and-pencil backup. The school questionnaire was not administered during the pilot study.

Teacher Questionnaire. At grades 4 and 8, mathematics and science teachers of students in selected classes will be asked to complete a teacher questionnaire. Teacher questionnaires will include questions about teacher preparation and experience, mathematics and science topics taught, instructional resources and technology, instructional time, instructional engagement, classroom assessment, and technology resources and instruction in their classes. The teacher questionnaire is expected to take 30 minutes to complete and will be offered online, with a paper-and-pencil backup. The teacher questionnaire was not administered during the pilot study.

Student Questionnaire. Information will be collected from students about home resources, student motivation, self-concept, self-efficacy, and student characteristics such as gender and race/ethnicity. The student questionnaire is expected to take 30 minutes to complete and is administered on paper after the cognitive assessment. During the pilot study, we administered only an abbreviated student questionnaire that took about 10 minutes to complete and collected limited demographic and background information. During the field test, the student questionnaire was also administered on paper because the international contractor needed to focus on the technical aspects of the assessment. In the main study, students will first take the electronically-delivered cognitive assessment questions, followed by the paper version of the student questionnaire. A subset of students, instead of receiving the electronically-delivered cognitive assessment questions, will receive a paper-and-pencil version of the same test booklets used in the 2017 pilot equivalence study (booklets that consist of the TIMSS trend items carried over from TIMSS 2015).

A.3 Improved Information Technology (Reduction of Burden)

The TIMSS 2019 design and data collection procedures are prescribed internationally, including the administration of student assessments and questionnaires on tablets. TIMSS also administered paper-and-pencil versions of the assessment in the 2017 pilot test and will administer paper versions to a subsample of students in the 2019 main study in order to study the mode effect of transitioning from paper to digital administration. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to accommodate respondent preferences.

A communication website, MyTIMSS USA, was used during the 2017 pilot and 2018 field test, and will be used in the 2019 main study in order to provide a simple, single source of information to engage sample schools and maintain high levels of their involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored for their anticipated needs. We plan to gather class and student lists from participating schools electronically using an adaptation of Westat’s secure E-filing process. E-filing is an electronic system for submitting lists of student information, including student background information in school records. Instructions to school coordinators on how to submit class and student lists are included in Appendix A. E-filing has been used successfully in NAEP for more than 10 years, and was used in TIMSS 2015 and the PISA 2012 and 2015 assessments. The E-filing system provides advantageous features such as efficiency and data quality checks. Schools will access the E-filing system through the MyTIMSS web site.
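
As an illustration of the kind of data-quality checks an electronic listing system can apply, the sketch below validates a hypothetical class/student list submitted as a CSV file. The field names and rules are assumptions made for illustration only and are not the actual E-filing specification.

```python
# Illustrative sketch of data-quality checks on a submitted student list.
# The field names below are hypothetical examples, not the actual E-filing format.

import csv

REQUIRED_FIELDS = ["class_name", "student_name", "grade", "birth_month", "birth_year"]

def check_student_list(path):
    """Return a list of (row_number, problem) tuples for a submitted CSV list."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    problems.append((i, f"missing {field}"))
            if row.get("grade", "").strip() not in {"4", "8"}:
                problems.append((i, "grade should be 4 or 8 for TIMSS"))
    return problems

if __name__ == "__main__":
    with open("demo_list.csv", "w", newline="") as f:
        f.write("class_name,student_name,grade,birth_month,birth_year\n")
        f.write("Algebra 1,Student A,8,4,2004\n")
        f.write("Algebra 1,,8,5,2004\n")  # missing name -> flagged
    print(check_student_list("demo_list.csv"))
```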

The eTIMSS assessments will be implemented using tablets carried into schools by the data collection staff so that TIMSS will not need to use school equipment. Schools can continue to use their own equipment for instruction and assessment, and TIMSS will not need to burden school or district IT staff to set up school equipment or take down firewalls. TIMSS does not require internet access during the assessment. Student data are uploaded to Westat secure servers after the assessment.

A.4 Efforts to Identify Duplication

In the United States, mathematics and science achievement is systematically assessed at (1) the Federal level, where trend data have been collected on a fairly regular basis since 1971 through the National Assessment of Educational Progress (NAEP); (2) the state level, where data are routinely collected as part of state testing programs, though they vary across the states in terms of the frequency of testing, age/grades tested, and types of cognitive items administered; and (3) the district level, where data are collected through the use of commercially or locally developed standardized tests as well as tests developed in conjunction with the instructional programs used in schools. TIMSS 2019 does not duplicate these assessments.

TIMSS 2019 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent U.S. participation in international studies involving approximately 60-65 countries. As part of international cooperative studies, the United States must collect the same information at the same time as the other participating nations in order to make valid comparisons both with other countries and with previous TIMSS data. While some studies in the United States collect similar, though not identical, types of information (e.g., NAEP), the data from those studies cannot be substituted for the information collected in TIMSS because they do not allow for comparisons outside the United States. Furthermore, the data collected through TIMSS are based on unique frameworks that are not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in all other participating countries. Because the items measuring mathematics and science achievement have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.

A.5 Minimizing Burden for Small Entities

No small entities are part of this sample. The school samples for TIMSS contain small-, medium-, and large-size schools, including private schools, selected with probability proportionate to their size. All school sizes are needed to ensure an appropriate representation of each type of school in the selected sample of schools. Burden will be minimized wherever possible. For example, schools will be selected so as to avoid, as much as possible, overlap with other NCES assessments such as NAEP and PISA. In addition, contractor staff will conduct all test administrations, provide all equipment, and assist with parental notification, sampling, and other tasks as much as possible within each school.
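
For readers unfamiliar with probability-proportionate-to-size (PPS) selection, the sketch below shows a minimal systematic PPS draw from a school frame. It is illustrative only; the actual TIMSS school sample design (stratification, overlap control with NAEP and PISA, and replacement schools) is more involved and is described in Supporting Statement Part B.

```python
# Minimal sketch of systematic probability-proportionate-to-size (PPS) school
# selection. Illustrative only; not the actual TIMSS 2019 sample design.

import random

def pps_systematic_sample(schools, n):
    """schools: list of (school_id, enrollment); returns n school ids selected PPS."""
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]

    selected, cumulative, idx = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        # Select this school for every systematic point falling in its size range.
        while idx < n and points[idx] <= cumulative:
            selected.append(school_id)
            idx += 1
    return selected

if __name__ == "__main__":
    frame = [(f"SCH{i:03d}", random.randint(20, 400)) for i in range(200)]
    print(pps_systematic_sample(frame, 10))
```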

A.6 Frequency of Data Collection

The pilot data collection occurred in April and early May 2017. The field test data collection occurred from March 5 through April 15, 2018. The main study data collection is scheduled for April through May 2019. This schedule is prescribed by the international collective for TIMSS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain trend lines.

A.7 Special Circumstances

None of the special circumstances identified in the Instructions for Supporting Statement apply to the TIMSS study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An International Steering Committee has general oversight of the study and each National Research Coordinator participates in extensive discussions concerning the projects, usually with advice from national subject matter and testing experts. In addition, the IEA convened separate panels of mathematics and science experts from around the world to develop cognitive items.

The majority of the consultations (outside NCES) have involved the IEA-Amsterdam, in the Netherlands, the IEA-Data Processing Center (DPC) in Hamburg, Germany, and the TIMSS International Study Center (ISC) at Boston College in the United States. Key to these ongoing consultations are: Dirk Hastedt (executive director of the IEA); Oliver Neuschmidt (head of the IEA Data Processing and Research Center); Michael Martin, Ina V.S. Mullis, Victoria Centurino, and Kerry Cotter (directors of the ISC TIMSS teams), all of whom have extensive experience in developing and operating international education surveys (especially related to TIMSS).

A.9 Payments or Gifts to Respondents

In order to achieve acceptable school response rates in international studies, schools in the U.S. are usually offered $200 to thank them for their participation and for the time they invest in, and the space they make available for, the international assessments. High response rates are required by both the IEA and NCES, and are difficult to achieve in school-based studies. The U.S. has historically had difficulties in achieving sufficient participation levels. Based on incentives provided in past administrations of TIMSS and currently offered in other international assessments, and based on the field test, schools will be offered $200 for their time in the main study. In the 2017 eTIMSS pilot, students took both the paper-and-pencil version and an eTIMSS version of the assessment. Student time was twice what it typically is, and more assistance was needed from school staff to administer either two sessions in one day or one session on each of two days. Therefore, schools participating in the eTIMSS pilot were offered $400 for their time, help, and participation.

To address challenges that may be encountered with securing school cooperation for TIMSS 2019, as in the field test, we will utilize a second-tier incentive which will allow us to offer up to $800 to schools that are historically very difficult to recruit. This second-tier incentive will be offered only to:

  • Private schools in the original sample or substitute private schools. In ICILS 2018, only 40% of original private schools participated, with an overall rate of about 57% after substitutes were added.

  • Public schools that are selected for and participate in NAEP 2019 and are also selected for TIMSS 2019. Typically these schools participate only in NAEP because it is required, and refuse participation in TIMSS. There are about 40 such schools nationwide.

  • All substitute schools. Typically, refusals by original schools occur in the mid to late fall prior to the assessment in the spring, at which point substitute schools are recruited. Recruitment of substitutes is typically very challenging (less than 25% participate) because recruitment is late, after the school year has begun and school calendars are set (also, some states refuse to recruit schools that late in the school year).

A similar second-tier recruitment strategy has been used in other international studies conducted by NCES. Most recently, in ICILS 2018, we began offering the second-tier incentive very late, in the middle of data collection, and attempted to turn around schools that had already refused, as well as newly activated substitutes and schools that were not doing any work to prepare for the assessment. We were able to successfully recruit about 20% of these schools even with this late start, which brought us much closer to our target recruitment percentage. The extra incentive strategy should be even more successful in TIMSS if we begin it during the fall, well before data collection begins. If necessary to meet international participation requirements, we will also offer the second-tier incentive to select refusing original schools.

The school staff serving as School Coordinators will receive $100 for their time and effort. The School Coordinator serves a critical role in data collection, functioning as the central school contact and facilitating arrangements for the assessments. They are asked to file class and student listing forms; arrange the date, time and space for the assessment; and disseminate information to parents and students.

To encourage their participation, TIMSS will, for the first time, offer $20 to teachers who complete the teacher questionnaire. Both PISA and PIRLS (in 2016) have offered this level of incentive to teachers who completed a 30-minute questionnaire. In the past, field staff and Westat home office staff have spent considerable time tracking down teachers after the assessment to obtain a reasonable response rate. The teacher questionnaire was not administered during the pilot test, and thus no teacher incentive was used. In order to avoid sending 5 to 10 checks to each school for the school coordinator to distribute to teachers who complete the questionnaire, electronic Amazon gift cards in the amount of $20 will be used. Amazon gift cards were selected because they have no associated fees, unlike other cash card programs. Teacher email addresses will be collected prior to the assessment and will be used to electronically deliver teachers’ login credentials (i.e., user ID and password) for accessing the teacher questionnaire on the IEA Online Survey System. The teacher email addresses will also be used to send teachers their incentive (i.e., an Amazon gift code) after the questionnaire has been completed, and to send periodic reminders to teachers who have not yet completed their questionnaires.

Consistent with prior administrations of TIMSS, as a token of appreciation for their participation, students will receive a small gift valued at approximately $4. In TIMSS 2015, each participating student received a small flashlight that could be clipped with an attached carabiner to a backpack or belt loop. Students will also receive a certificate with their name thanking them for participating and representing the United States in TIMSS. In schools where it is permitted, eighth-grade students participating in TIMSS may also receive a certificate from the U.S. Department of Education for four hours of volunteer service. Additionally, some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor. In districts or schools that require active parental consent, which historically has been difficult to collect, we will offer a small party with refreshments for students who bring in their parental consent forms.

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for TIMSS to ensure that the TIMSS contractor for the U.S. and its subcontractors comply with all privacy requirements, including:

  1. The statement of work of this contract;

  2. Privacy Act of 1974 (5 U.S.C. §552a);

  3. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002;

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the inter-agency agreement for this study.

Furthermore, the contractor will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

The laws pertaining to the use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in Appendices A and B).

Letters to teachers and school coordinators, and their supporting materials, will read (Appendices A-1 and A-2):

NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Letters to states, districts, and schools and parent notification letters and supporting materials will read (Appendix A-1, A-2, B-1, and B-2):

NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information provided by school staff and students may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

The following statement will appear on the login page for eTIMSS, MyTIMSS, and the front cover of the printed questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaire):

The National Center for Education Statistics (NCES), within the U.S. Department of Education, conducts TIMSS in the United States as authorized by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0695. The time required to complete this information collection is estimated to average [XX] minutes per [respondent type], including the time to review instructions [, search existing data resources, gather the data needed,] and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or questions about the status of your individual submission of this form, write directly to: Trends in Mathematics and Science Study (TIMSS), National Center for Education Statistics, Potomac Center Plaza, 550 12th Street, SW, 4th floor, Washington, DC 20202.

OMB No. 1850-0695, Approval Expires xx/xx/2020.

The TIMSS 2019 confidentiality plan includes the signing of confidentiality agreements and notarized nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. Also included in the plan are personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file. In eTIMSS, students log into the automated assessment using non-identifying IDs, and thus the resulting data are collected and stored with only the non-identifying TIMSS-assigned ID. The data are collected from the tablets using a process that encrypts the data and uploads them to a secure Westat FTP site.
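
The sketch below illustrates the general "encrypt, then upload to a secure server" pattern described above, using the cryptography and paramiko Python packages. It is not the actual eTIMSS or Westat implementation; the host, paths, and credentials are placeholders.

```python
# Illustrative sketch of the "encrypt, then upload to a secure server" pattern.
# This is NOT the actual eTIMSS/Westat process; host, paths, and credentials
# below are placeholders.

import paramiko
from cryptography.fernet import Fernet

def encrypt_file(in_path, out_path, key):
    """Symmetrically encrypt a results file before it leaves the tablet."""
    with open(in_path, "rb") as f:
        token = Fernet(key).encrypt(f.read())
    with open(out_path, "wb") as f:
        f.write(token)

def upload_sftp(local_path, remote_path, host, username, password):
    """Send the encrypted file over SFTP (SSH), so it is also protected in transit."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=username, password=password)
    sftp = ssh.open_sftp()
    sftp.put(local_path, remote_path)
    sftp.close()
    ssh.close()

if __name__ == "__main__":
    # Create a small dummy results file keyed only by a non-identifying ID.
    with open("session_results.json", "w") as f:
        f.write('{"student_id": "non-identifying ID only", "responses": []}')
    key = Fernet.generate_key()  # in practice, key management would be centralized
    encrypt_file("session_results.json", "session_results.enc", key)
    # upload_sftp("session_results.enc", "/incoming/session_results.enc",
    #             "sftp.example.org", "fieldstaff", "********")
```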

NCES understands the legal and ethical need to protect the privacy of the TIMSS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the TIMSS 2019 data when preparing the data files for use by researchers, in compliance with ESRA 2002 (20 U.S.C. §9573). Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.
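
The toy sketch below demonstrates, on synthetic data, three of the masking strategies named above: swapping values between randomly paired records, omitting a key identification variable, and collapsing a continuous variable into categories. It is illustrative only and does not represent NCES's actual disclosure-avoidance procedures.

```python
# Toy sketch of masking strategies: data swapping, omitting an identifier, and
# collapsing a continuous variable. Synthetic data; not NCES's actual procedure.

import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=7)

df = pd.DataFrame({
    "school_id":  np.arange(1, 21),
    "zip_code":   rng.integers(10000, 99999, size=20),  # key identifier to omit
    "enrollment": rng.integers(50, 2000, size=20),       # continuous variable
})

# 1. Swap enrollment values between randomly paired records for a small fraction
#    of cases, so no single record is certainly reported as collected.
swap_idx = rng.choice(len(df), size=4, replace=False).reshape(2, 2)
for a, b in zip(swap_idx[0], swap_idx[1]):
    df.loc[[a, b], "enrollment"] = df.loc[[b, a], "enrollment"].values

# 2. Omit the key identification variable from the public-use file.
public_use = df.drop(columns=["zip_code"])

# 3. Collapse the continuous variable into categories that retain analytic value.
public_use["enrollment_cat"] = pd.cut(
    public_use.pop("enrollment"),
    bins=[0, 300, 600, 1200, np.inf],
    labels=["<300", "300-599", "600-1199", "1200+"],
)

print(public_use.head())
```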

A.11 Sensitive Questions

The questionnaires do not have items considered to be of a sensitive nature.

A.12 Estimates of Burden

This request is to conduct the TIMSS 2019 main study. Therefore, the burden estimates include burden for (1) contacting states, districts, schools, and parents in order to recruit for the TIMSS 2019 main study, including (a) sending recruitment letters to districts and schools selected for the main study, (b) contacting and seeking research approvals from special handling districts, where applicable, and (c) notifying parents of sampled students about their participation in the main study; and (2) collecting main study data. The estimated burden for these efforts is shown in Table A.1.

The district and school contact letters for the main study are assumed to impose a small burden on all contacted parties, both those that refuse and those that agree to participate in the TIMSS studies. Thus, the burden hours for this activity are based on the total number of districts and schools contacted rather than the total number agreeing to participate.

The special handling districts are those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Based on an initial assessment of previous TIMSS data collections, we estimate that there may be up to 10 special handling districts in the sample. Contacting special handling districts begins with updating district information based on what can be gleaned from online sources. Calls are then placed to verify the information about where to send the completed required research application forms, and, if necessary, to collect contact information for this process. During the call, inquiry is also made about the amount of time the districts spend reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy processes for approval. This operation should begin in the spring of the year preceding the start of the data collection to allow sufficient time for special handling districts’ review processes. We will continue to work with these districts until we receive a final response from each district (approval or denial of request) up until April 1, 2019 for the main study.

The total district and school response burden estimate for the main study recruitment is based on 10 minutes for districts to read materials and respond, and 20 minutes for schools to read materials and respond. The total response burden estimate for IRB approvals is based on 120 minutes for staff approval and 60 minutes for panel approval. The total response burden for parental notification is based on 10 minutes for reading and reviewing recruitment materials and notification forms. The total response burden for the main study data collection is based on 30 minutes for students and teachers to complete a questionnaire, and 30 minutes for a school administrator to complete a school questionnaire.

Please note that the questionnaire burden for students in Table A.1 is estimated at 50 minutes because experience has shown that administering the 30-minute student questionnaire after the assessment often takes longer, due to time for bathroom breaks after the assessment and for getting the class of students resettled to focus on the questionnaire. Based on the estimated hourly rates for principals/administrators, school coordinators, teachers, and parents of $47.81, $29.25, $29.25, and $24.34, respectively,1 and the federal minimum wage of $7.25 as the hourly rate for students, and based on the estimated total of 43,181 burden hours for TIMSS 2019 main study recruitment and data collection, the estimated respondent burden time cost is $536,494.

Table A.1. Burden estimates for TIMSS 2019 main study recruitment and data collection, in 2018 and 2019, respectively, for grades 4 and 8.

Activity | Sample size | Expected response rate | Number of respondents | Number of responses | Per respondent (minutes) | Total burden (hours)
---------------------------------------- | ----------- | ---------------------- | --------------------- | ------------------- | ------------------------ | --------------------
Main Study Recruitment | | | | | |
Contacting Districts | 670 | 1.00 | 670 | 670 | 10 | 112
School Recruitment (Original Schools) | 670 | 0.70* | 469 | 469 | 20 | 157
School Recruitment (Replacement Schools) | 201 | 0.50* | 101 | 101 | 20 | 34
District IRB Staff Study Approval | 84 | 1.00 | 84 | 84 | 120 | 168
District IRB Panel Study Approval | 504 | 1.00 | 504 | 504 | 60 | 504
Parental notification | 29,480 | 1.00 | 29,480 | 29,480 | 10 | 4,914
Main Study Recruitment Burden | | | 31,308 | 31,308 | | 5,889
Main Study Data Collection | | | | | |
Student | | | | | |
Grade 4 | | | | | |
  Directions | 14,740 | 0.96 | 14,151 | 14,151 | 20 | 4,717
  Assessment† | 14,740 | 0.96 | 14,151 | 14,151 | 72 | 16,982
  Student Questionnaire | 14,740 | 0.96 | 14,151 | 14,151 | 50 | 11,793
Grade 8 | | | | | |
  Directions | 14,740 | 0.96 | 14,151 | 14,151 | 20 | 4,717
  Assessment† | 14,740 | 0.96 | 14,151 | 14,151 | 90 | 21,227
  Student Questionnaire | 14,740 | 0.96 | 14,151 | 14,151 | 50 | 11,793
Total Student Burden Main Study | | | 28,302 | 56,604 | | 33,020
School Staff (Grades 4 and 8) | | | | | |
School Administrator | 670 | 0.95 | 637 | 637 | 30 | 319
Teacher | 2,680 | 0.95 | 2,546 | 2,546 | 30 | 1,273
School Coordinator (Grade 4 and 8) | 670 | 1.00 | 670 | 670 | 240 | 2,680
Total School Burden Main Study | | | 3,853 | 3,853 | | 4,272
Total Burden Main Study | | | 63,463 | 91,765 | | 43,181

Note: Total burden requested in this submission includes the main study data collection burden plus the already approved burden associated with TIMSS 2019 main study recruitment and pre-assessment activities. Total student burden does not include the time for the cognitive assessment (marked with † and presented here for informational purposes only), which is not subject to the PRA.

* A satisfactory sampling participation rate requires a final unweighted school response rate of at least 50% of original schools and at least 85% of original plus replacement schools, with the original school sample as the denominator.
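
The sketch below reproduces the Table A.1 arithmetic (respondent counts times minutes per respondent, with hours rounded up per row) and the approximate respondent time cost cited in section A.12, using the hourly rates from footnote 1. Assigning the administrator rate to district contacts and school recruitment is an assumption made here for illustration; assessment time is excluded because it is not subject to the PRA.

```python
# Sketch of the Table A.1 burden arithmetic and the A.12 respondent time cost.
# Assigning the $47.81 administrator rate to district contacts and school
# recruitment is an assumption for this illustration; assessment time excluded.

import math

rows = [
    # (activity, respondents, minutes each, hourly rate in dollars)
    ("Contacting Districts",                  670,  10, 47.81),
    ("School Recruitment (Original)",         469,  20, 47.81),
    ("School Recruitment (Replacement)",      101,  20, 47.81),
    ("District IRB Staff Study Approval",      84, 120, 47.81),
    ("District IRB Panel Study Approval",     504,  60, 47.81),
    ("Parental notification",               29480,  10, 24.34),
    ("Grade 4 Directions",                  14151,  20,  7.25),
    ("Grade 4 Student Questionnaire",       14151,  50,  7.25),
    ("Grade 8 Directions",                  14151,  20,  7.25),
    ("Grade 8 Student Questionnaire",       14151,  50,  7.25),
    ("School Administrator questionnaire",    637,  30, 47.81),
    ("Teacher questionnaire",                2546,  30, 29.25),
    ("School Coordinator tasks",              670, 240, 29.25),
]

total_hours = 0
total_cost = 0.0
for activity, respondents, minutes, wage in rows:
    hours = math.ceil(respondents * minutes / 60)  # rounded up per row, as in Table A.1
    total_hours += hours
    total_cost += hours * wage
    print(f"{activity:<36} {hours:>7,d} hours")

print(f"Total burden hours: {total_hours:,d}")       # 43,181 in Table A.1
print(f"Respondent time cost: ~${total_cost:,.0f}")  # ~$536,494 as reported in A.12
```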

A.13 Total Annual Cost Burden

There are no additional costs to respond beyond the time to respond.

A.14 Annualized Cost to Federal Government

The cost to the federal government for conducting all phases of TIMSS 2019, including the 2017 pilot, the 2018 field test operations (preparations, data collection, and scoring), and the 2019 main study, is estimated to be $7,860,615 over a 3-year period (see the table breakdown below). The total cost for the 2019 main study is estimated to be $5,755,146. These figures include all direct and indirect costs.

Components with breakdown | Estimated costs
---------------------------------------------------- | ---------------
PILOT (2017)† |
  Recruitment | 95,000
  Preparations (e.g., adapting instruments, sampling) | 177,851
  Data collection, scoring, and coding | 323,822
FIELD TEST (2018)† |
  Recruitment | 149,885
  Preparations (e.g., adapting instruments, sampling) | 409,276
  Data collection, scoring, and coding | 949,635
MAIN STUDY (2019) |
  Recruitment | 739,972
  Preparations (e.g., adapting instruments, sampling) | 53,278
  Data collection, scoring, and coding | 4,961,896
Current package components | $5,755,146
Grand total | $7,860,615

Note: Rows marked with † show the cost to the federal government for TIMSS activities that were approved by OMB in prior submissions (OMB# 1850-0695 v.8-10); they are not included in the “Current package components” total.


A.15 Program Changes or Adjustments

The apparent increase in burden from the last approval is due to the fact that the last request was to conduct the TIMSS 2019 field test in 2018 and to recruit schools and communicate with districts and parents in preparation for the TIMSS 2019 main study, while this request is to conduct the TIMSS 2019 main study, including all recruitment and data collection activities.

Relative to the previous main study administration, TIMSS 2019 differs from TIMSS 2015 in the following ways:

  • No federally funded state benchmarking;

  • Transition to a digitally-based assessment in 2019;

  • Addition of a pilot study preceding the field test; and

  • TIMSS Advanced will not be administered in 2019.

A.16 Plans for Tabulation and Publication

The TIMSS pilot and field test were designed to provide a statistical review of the performance of items on the cognitive assessment and questionnaires in preparation for the main study data collection. They also provided valuable experience in administering eTIMSS assessments on tablets and information on the performance of TIMSS items in a digitally-based environment. Corresponding paper-and-pencil assessments in 2017 and 2019 will provide information for evaluating the mode effect of the transition from paper to eTIMSS.

Based on the data collected in the main study, the TIMSS International Study Center will prepare separate reports for mathematics and science at grades 4 and 8. These reports will be released by December 2020. As has been customary, NCES will also release a report for each study at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics. There are currently no plans to conduct complex statistical analyses of either dataset. Examples of past reports on TIMSS can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013009. In the spring of 2021, the International Study Center will also prepare technical reports for TIMSS 2019, describing the design and development of the assessments as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User’s Guide for each study.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of TIMSS 2019 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:

Dates | Activity
-------------------------- | ----------------------------------------------------------------------------------------------
April—December 2016 | Prepare data collection manuals, forms, assessment materials, and questionnaires for pilot
July—August 2016 | Select school samples for pilot
October 2016—January 2017 | Contact and gain cooperation of states, districts, and schools for pilot
February—March 2017 | Select student samples for pilot
April—May 2017 | Collect pilot data
End of May 2017 | Deliver raw data to international sponsoring organization
July—August 2017 | Review pilot data results
April—December 2017 | Prepare data collection manuals, forms, assessment materials, and questionnaires for field test
March—April 2017 | Select school samples for field test
May—December 2017 | Contact and gain cooperation of states, districts, and schools for field test
February—March 2018 | Select student samples for field test
March—April 2018 | Collect field test data
May 15, 2018 | Deliver raw data to international sponsoring organization
July—August 2018 | Review field test results
May—December 2018 | Prepare for the main study/recruit schools
March—May 2019 | Collect main study data
June 2020 | Receive final data files from international sponsors
June 2020—December 2020 | Produce reports


Note: Activities with dates prior to this submission (September 2018) have been concluded.

A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The average hourly earnings of principals/education administrators in the May 2017 National Occupational Employment and Wage Estimates published by the Bureau of Labor Statistics (BLS) are $47.81; for school coordinators and teachers, $29.25; and for parents, $24.34. If a mean hourly wage was not provided, it was computed assuming 2,080 work hours per year. The student wage is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: Education Administrators, Elementary and Secondary Schools (11-9032), Elementary School Teachers, Except Special Education (25-2021), and All Employees (00-0000); accessed on July 16, 2018.

