Program for International Student Assessment 2025 (PISA 2025) Main Study Recruitment and Field Test






Supporting Statement Part A





OMB# 1850-0755 v.28








National Center for Education Statistics (NCES)

U.S. Department of Education

Institute of Education Sciences

Washington, DC









April 2023

revised July 2023









TABLE OF CONTENTS



A. JUSTIFICATION

A.1 Importance of Information

A.2 Purposes and Uses of Data

A.3 Improved Information Technology (Reduction of Burden)

A.4 Efforts to Identify Duplication

A.5 Minimizing Burden for Small Entities

A.6 Frequency of Data Collection

A.7 Special Circumstances

A.8 Consultations outside NCES

A.9 Payments or Gifts to Respondents

A.10 Assurance of Confidentiality

A.11 Sensitive Questions

A.12 Estimates of Burden

A.13 Total Annual Cost Burden

A.14 Annualized Cost to Federal Government

A.15 Program Changes or Adjustments

A.16 Plans for Tabulation and Publication

A.17 Display OMB Expiration Date

A.18 Exceptions to Certification Statement



Appendix A: Recruitment Materials

Appendix B: Parental Consent Materials

Appendix C: Data Collection Instruments



PREFACE

The Program for International Student Assessment (PISA) is an international assessment of 15-year-olds that focuses on students’ reading, mathematics, and science literacy. PISA was first administered in 2000 and is conducted every three years. The ninth cycle of the study, PISA 2025, is being administered at a time of increasing interest, both worldwide and in the United States, in how well schools are preparing students to meet the challenges of the future and in how students perform compared with their peers in other education systems of the world. Approximately 85 education systems, including the U.S., are expected to participate in 2025. The U.S. has participated in all previous cycles and is participating in 2025 in order to track trends and to compare the performance of U.S. students with that of students in other education systems.

PISA 2025 is sponsored by the Organization for Economic Cooperation and Development (OECD). In the U.S., PISA 2025 is conducted by the National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), U.S. Department of Education. PISA is a collaboration among the participating countries, the OECD, and a group of international organizations each under contract to the OECD (hereafter referred to as the PISA International Consortium), including the Australian Council for Educational Research (ACER), cApStan Linguistic Quality Control, Westat, and Open Assessment Technologies (OAT).

In each administration of PISA, one of the subject areas (reading, mathematics, or science literacy) is the major domain and has the broadest content coverage, while the other two subjects are the minor domains. Science literacy will be the major domain in PISA 2025. Other areas may also be assessed, such as, in the case of PISA 2025, Learning in a Digital World (LDW), which will be an innovative domain in 2025. PISA assesses students’ knowledge and skills gained both in and out of school environments. The focus on the “yield” of education in and out of school makes it different from other international assessments such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), which are closely tied to school curriculum frameworks and assess younger and grade-based populations.

As in the previous rounds of PISA in 2015, 2018, and 2022, the entire PISA 2025 assessment and the questionnaires will be administered on computer. While it is possible for countries to continue using paper-based instruments, and some countries are choosing to do so, those instruments will not include new assessment items. The U.S. will administer PISA 2025 on computer. In addition to the cognitive assessments, PISA 2025 will include questionnaires administered to school principals and assessed students. The school questionnaire will be delivered online. The school and student questionnaires are core components of PISA and as such are required for all participating countries. The teacher questionnaire, which is optional and was administered in previous rounds (2015 and 2018), will not be administered in 2025 because the U.S. has deemed the resulting data inadequate for analyses due to the lack of weights for the teacher data.

To prepare for the main study in 2025, PISA countries will conduct a field test in the spring of 2024, primarily to evaluate newly developed assessment and questionnaire items but also to test the assessment operations. The PISA 2025 field test data collection will occur in the U.S. from March to April 2024 and the main study data collection from September to November 2025. In order to meet the international data collection schedule for the spring 2024 field test, questionnaires must be finalized by September 2023 and recruiting activities begun by October 2023. This submission requests approval for all recruitment and data collection activities related to the 2025 field test and the PISA 2025 main study.1

Field test and main study recruitment materials, including letters to state and district officials and school principals, and text for a PISA field test brochure, summary of activities, and “Frequently Asked Questions” are provided in Appendices A-1. Parental consent letters and related materials for the field test are provided in Appendices B-1. The international versions of the field test questionnaire items with proposed adaptations to these items for use in the United States are provided in the C Appendices. In fall or early winter 2023, we will submit a change request to provide updated materials, including the final version of the US field test instrument, new screenshots of the reprogrammed MyPISA website, and an updated student video script for PISA 2025. (See Appendices A-1 and A-2 for more details.)

In order to begin recruiting schools for the main study by October 2024, we will submit a change request to OMB in May 2024 with the final main study recruitment materials and parental consent letters, details about any changes to the design and procedures for the main study, and updates to the respondent burden estimates for the main study data collection. Subsequently, in spring 2025, we will submit a clearance request, with a 30-day public comment period notice published in the Federal Register, containing the final main study procedures and instruments for data collection in the fall of 2025.

A. JUSTIFICATION

A.1 Importance of Information

As part of a continuing cycle of international studies, the U.S., through NCES, participates in several international education assessments and surveys. PISA, sponsored by OECD, is one of these studies.

In light of growing concerns related to international economic competitiveness, the changing face of our workplace, and the expanding international marketplace in which we trade, knowing how our students and adults compare with their peers around the world has become more prominent than ever before. Nationwide, interest in understanding what other nations are doing to further the educational achievement of their populations has increased beyond simple comparisons.

Data at critical points during the education career of U.S. students, such as those collected through PISA, have been used by policymakers in efforts to guide and examine the American education system. Consequently, generating comparative data about students in school, at the end of schooling, and about adults in the workplace and in the community has become an important focus for NCES.

PISA measures students' knowledge, skills, and competencies primarily in three subject areas – reading, mathematics, and science literacy. The overall strategy is to collect in-depth information on student capabilities in one of these three domains every 3 years so that detailed information on each becomes available every nine years. During each 3-year survey cycle, the major focus is on one content domain, with a minor focus on the other two content domains. The major focus for the data collection in 2025 is on science literacy, with a minor focus on reading and mathematics.

The results from PISA assessments, published every 3 years along with related indicators, allow national policymakers to compare the performance of their education systems with those of other countries and provide a basis for monitoring the effectiveness of education systems at the national level. Without these kinds of data, U.S. policymakers will be limited in their ability to gain insight into the educational performance and practices of other nations as they compare to the U.S. NCES provides extensive information to the public on PISA through its publications and its website (http://nces.ed.gov/surveys/pisa).

A.2 Purposes and Uses of Data

Governments and the general public want solid evidence of education outcomes. In the late 1990s, the OECD launched an extensive program for producing policy-oriented and internationally comparable indicators of student achievement on a regular basis and in a timely manner. PISA is at the heart of this program. How well are schools preparing students to meet the challenges of the future? Parents, students, the public, and those who run education systems need to know whether children are acquiring the necessary skills and knowledge, whether they are prepared to become tomorrow's workers, to continue learning throughout life, to analyze, to reason, and to communicate ideas effectively.

The results of OECD’s PISA, published every 3 years (with more detailed measures of each of the three major subject domains every 9 years) along with related indicators, allow national policymakers to compare the performance of their education systems with those of other countries, and to analyze the relationship between constructs measured through the PISA questionnaires with assessment results at national and international levels. Through PISA, the OECD and NCES produce three types of indicators:

  • Basic indicators that provide a baseline profile of the knowledge, skills, and competencies of students;

  • Contextual indicators that show how such skills relate to important demographic, social, economic, and education variables; and

  • Trend indicators that emerge from the ongoing, cyclical nature of the data collection.

PISA 2025 Components

The primary focus for the assessment and questionnaires for PISA 2025 will be on science literacy. The PISA framework defines outcomes of science education with a set of three competencies that an individual would be expected to display.

A scientifically educated person can engage in reasoned discourse about science, sustainability and technology to inform action. This requires the competencies to:

1. Explain phenomena scientifically: Recognize, construct, apply and evaluate explanations for a range of natural and technological phenomena.

2. Construct and evaluate designs for scientific enquiry and interpret scientific data and evidence critically: Appraise and evaluate ways of investigating questions scientifically, and interpret and evaluate scientific data critically.

3. Research, evaluate and use scientific information for decision making and action: Obtain scientific information on a specific global, local or personal science-related issue and evaluate its credibility, potential flaws and the implications for personal and communal decisions.

As in all administrations of PISA, reading and mathematics literacy also will be assessed, although they will be “minor domains” in 2025. The instruments to be administered in 2025 are as follows:

Assessment Instruments: There are a total of 69 forms in the field test, containing four clusters of science, reading, mathematics, and LDW items, which will be administered in a 2-hour session. Each student will receive one form; the combination of clusters depends on the form. Because multistage adaptive testing in science is planned for the 2025 main study, the field trial design includes variable unit positioning within the science item clusters and will investigate the effects of variable versus fixed unit positioning in preparation for the main study, the hypothesis being that item parameter invariance is supported only when intact clusters are used. PISA 2018 and PISA 2022 applied a similar procedure to develop multistage adaptive testing in the design of the reading and mathematics assessments, respectively. The reading and mathematics assessment components of the PISA 2025 field trial will use the multistage adaptive tests used in 2018 and 2022, respectively, but these will be reduced by about 25 percent.

Following the field test, cognitive and non-cognitive items will be evaluated for bias and interpretation issues, following standard protocols. For the main study, the pool of items will be reduced to only include those items that demonstrate validity across the participating education systems, as well as meeting the goals of content coverage to adequately measure the framework and providing the desired distribution of item types.

Background Questionnaire Instruments: Every participating country must implement two core background questionnaires for PISA 2025: school and student. Several optional questionnaires are also available, of which the U.S. will implement an additional student questionnaire on Information and Communication Technology (ICT) familiarity. This questionnaire was administered to U.S. students in the previous rounds of PISA in 2018 and 2022. These instruments have been developed to address the PISA 2025 questionnaire framework, which defines 20 modules across the school and student questionnaires comprising student background characteristics, teaching and learning practices, school governance, and non-cognitive/metacognitive constructs dealing with school-related outcomes, attitudes, and motivational strategies. In addition, the questionnaires include items that have been administered in multiple cycles of PISA, allowing the investigation of patterns and trends over time. Countries adapt the questions to fit their national context and the questionnaires are reviewed and verified to ensure they remain comparable across countries.

Participating students will be asked to provide information pertaining primarily to the major assessment domain, science, and about their demographics (e.g., age, gender2, language, race and ethnicity); socio-economic background of the student (e.g., parental education, economic background); student's education career; and access to educational resources and their use at home and at school. Domain-specific information will include instructional experiences and time spent in school, as perceived by the students, and student attitudes towards science. Multiple forms of the questionnaire will be used in the field test to try out different items and item formats, with the goal for the student questionnaire to take approximately 30 minutes to complete in the main study. The main study may or may not use multiple forms.

The ICT questionnaire aims to examine students’ ICT activities and domain-specific attitudes, including access to and use of ICT at home and at school; students’ attitudes towards and self-confidence in using computers; self-confidence in doing ICT tasks and activities; and navigation indices extracted from log-file data (number of pages visited, number of relevant pages visited). The core student questionnaire and ICT questionnaire will be computer-based and delivered to students via the assessment platform for PISA. The school questionnaire will be administered online, though hard copy versions will also be made available to those who request them.

A.3 Improved Information Technology (Reduction of Burden)

The PISA 2025 design and procedures are prescribed internationally. Data collection will consist of computer-based responses for science, reading, mathematics, and LDW. Responses to the computer-based assessments and questionnaires will be captured electronically. In the U.S., the computer-based assessments and student questionnaire will be implemented using tablets carried into schools by the data collection staff. The school questionnaire will be available on-line as the main mode of administration. This greatly reduces the burden on schools and staff by eliminating the need to use school-based equipment and computer labs. Online data collection of the school questionnaire was successfully used in the three previous rounds of PISA.

A communication website, MyPISA.us, will be used during the 2024 field test and 2025 main study in order to provide a simple, single source of information to engage sampled schools and maintain high levels of their involvement. This secure portal will be used throughout the assessment cycle to inform schools, particularly school coordinators, of their tasks and to provide them with easy access to information tailored for their anticipated needs. We will gather student sampling information from participating schools electronically using an adaptation of Westat’s secure E-filing process through the MyPISA.us portal. E-filing is an electronic system for submitting lists of student information, including limited background information in school records. Instructions to school coordinators on how to submit student lists are included in Appendix A. E-filing has been used successfully in the National Assessment of Educational Progress (NAEP) for more than 10 years, and was used in TIMSS 2015 and 2019, ICILS 2018, TALIS 2018 and the recent TALIS 2024 field test, and PISA 2012, 2015, 2018, and 2022 assessments. The E-filing system provides advantageous features such as efficiency and data quality checks, and secure data transmission.

A.4 Efforts to Identify Duplication

A number of international comparative studies already exist to measure achievement in science, mathematics, and reading, including TIMSS and PIRLS. The Program for the International Assessment of Adult Competencies (PIAAC), administered since 2012, measures the reading literacy, numeracy, and problem-solving skills of adults. In addition, the U.S. has been conducting its own national surveys of student achievement for more than 40 years through the NAEP program. PISA differs from these studies in several important ways:

Content. PISA is designed to measure “literacy” broadly, while other school-based studies, such as TIMSS and NAEP, have a strong link to curriculum frameworks and seek to measure students’ mastery of specific knowledge, skills, and concepts taught in schools. The content of PISA is drawn from broad content areas, such as understanding, using, and reflecting on written information for reading, in contrast to more specific curriculum-based content such as decoding and literal comprehension. Moreover, PISA differs from other assessments in the tasks that students are asked to do. PISA focuses on assessing students’ knowledge and skills in reading, mathematics, and science literacy in the context of everyday situations. That is, PISA emphasizes the application of knowledge to everyday situations by asking students to perform tasks that involve interpretation of real-world materials as much as possible. A study based on expert panelists’ reviews of mathematics and science items from PISA, TIMSS, and NAEP reported that PISA items required multi-step reasoning more often than either TIMSS or NAEP.3 The study also showed that PISA mathematics and science literacy items often involved the interpretation of charts and graphs or other “real world” material. These tasks reflect the underlying assumption of PISA: as 15-year-olds begin to make the transition to adult life, they need to know not only how to read, or know particular mathematical formulas or scientific concepts, but also how to apply this knowledge and these skills in the many different situations they will encounter in their lives. The computer-based assessments add additional “real world” tasks, given the predominance of technology in young adults’ lives and workplaces.

Age-based sample. The goal of PISA is to represent outcomes of learning rather than outcomes of schooling. By placing the emphasis on age, PISA intends to show not only what 15-year-olds have learned in school, but also what they have learned outside of school and over the years, not just in a particular grade. In contrast, NAEP, TIMSS, and PIRLS are all grade-based samples: NAEP assesses students in grades 4, 8, and 12; TIMSS assesses students in grades 4 and 8 (and, occasionally, grade 12); and PIRLS assesses students in grade 4. PISA thus seeks to show the overall yield of an education system and the cumulative effects of all learning experiences. Focusing on students whose modal age is 15 provides an opportunity to measure broad learning outcomes while all students are still required to be in school across the many participating nations. Finally, because years of education and school entry ages vary among countries, choosing an age-based sample makes comparisons across countries somewhat easier than a grade-based sample would.

Information collected. The kind of information PISA collects also reflects a policy purpose slightly different from the other assessments. PISA collects only background information related to general school context and student demographics. This differs from other international studies such as TIMSS, which collects background information related to how teachers in different countries approach the task of teaching and how the approved curriculum is implemented in the classroom. The results of PISA will certainly inform education policy and spur further investigation into differences within and between countries, but PISA is not intended to provide direct information about improving instructional practice in the classroom. The purpose of PISA is to generate useful indicators to benchmark performance more broadly and inform education policy.

Alternate sources for these data do not exist. This study represents the U.S. participation in an international study involving approximately 85 countries and jurisdictions in the PISA 2025 field test in spring of 2024 and the main study in fall of 2025. The U.S. must collect the same information, using the same instruments and procedures, at the same time as the other nations for purposes of making valid and meaningful international comparisons. No other study in the U.S. will be using the instruments developed by the OECD, and thus no alternative sources of comparable data are available.

A.5 Minimizing Burden for Small Entities

No small entities are part of this sample. The school sample for PISA will contain small-, medium-, and large-size schools from a wide range of school types, including private schools, and burden will be minimized wherever possible for all institutions participating in the data collection. Schools included in the field test will have a low likelihood of being included in the main study. Student burden will be reduced through the use of multiple forms of the assessment and student background questionnaire. In the field test this will allow PISA to test new background items or differing versions of items without adding to administration time. In addition, contractor staff will undertake all test administration and will assume as much of the organizational work as possible within each school, assisting with parental notification, sampling, and other tasks.

A.6 Frequency of Data Collection

PISA is conducted on a 3-year cycle as prescribed by the OECD, and adherence to this schedule is necessary to establish consistency in survey operations among the many participating countries.

A.7 Special Circumstances

The special circumstances identified in the Instructions for Supporting Statement do not apply to this study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The nature of the study requires this, because international studies typically are developed as a cooperative enterprise involving all participating countries. PISA 2025 is being developed and operated under the auspices of the OECD by a consortium of organizations. Key persons from these organizations who are involved in the design, development and operation of PISA 2025 are listed below.

Organization for Economic Cooperation and Development

Andreas Schleicher, Indicators and Analysis Division

2, rue André Pascal, 75775 Paris Cedex 16, FRANCE, Tel: +33 (1) 4524 9366, Fax: +33 (1) 4524 9098


Australian Council for Educational Research

Goran Lazendic, Project Director, ACER Corporate Headquarters

19 Prospect Hill Rd, Camberwell VIC 3124, Australia, Tel: +61 3 9277 5555


Westat

Jill DeMatteis, Director for Sampling

1600 Research Boulevard, Rockville, Maryland 20850-3129 USA, Tel: 301 251 8278, Fax: 301 294 2034


A 60-day notice was published in the Federal Register on May 8, 2023 (88 FR 29648). One public comment was received, but it was not substantive. A 30-day notice will be published.

A.9 Payments or Gifts to Respondents

Currently, the minimum response rate targets required by the OECD are 65 percent of original schools and 80 percent of students, while the NCES statistical standards require a minimum response rate target of 85 percent at both the school and student level. Historically, these high response rates have been difficult to achieve in school-based studies. The U.S. failed to reach the NCES school response rate targets for the study in all previous PISA administrations (2000, 2003, 2006, 2009, 2012, 2015, 2018, and 2022). Gaining sufficient student cooperation is also challenging. The U.S. has historically met the NCES target rate of 85 percent of students responding; however, this takes a great deal of effort. Student response rates exceeded the NCES requirement by 6 percent in PISA 2006, by 2 percent in PISA 2009, by 4 percent in PISA 2012, and by 5 percent in PISA 2015. The U.S. student response rate was 85 percent in 2018 and 80 percent in 2022, the most recent round of data collection. The monetary incentives, particularly for school coordinators, had a positive impact on maintaining student response rates. School coordinators indicated that the incentives were meaningful to them as well as to the students, and field staff reiterated this, reporting what they heard from school coordinators and students.

We are using a multi-pronged approach to address the challenge of gaining school and student cooperation and learn as much as possible during the field test about how to achieve acceptable participation rates. First, our PISA contractor reviewed the most recent PISA 2022 experience to understand where possible improvements can be made in materials and communication with schools. Staff with experience working on NAEP, PISA, other international assessments, other large-scale data collections, and with expertise in effective approaches to school recruitment have provided input towards identifying strategies for achieving high response rates and continue to serve as an ongoing source of ideas and feedback. Additionally, PISA 2025 continues the use of effective incentives. The proposed amounts are described below and are based on the amounts used previously in PISA 2018 and 2022 (please also see Part B, section B.3).

In the rare cases where state or school district laws or labor contracts do not allow school staff to receive incentives for participating in PISA, the school or school district will be offered the total amount of incentives that would have otherwise been distributed to the individual respondents. NCES and its contractor will work with schools to determine when this option will need to be implemented.

Schools. Schools participating in PISA will receive $250. In order to meet the minimum school response rates mandated by the PISA international governing board, and to thank each school for accommodating the disruption, we believe it is necessary to offer schools this incentive to encourage participation.

Although field tests tend to be successful, we anticipate difficulty in reaching the required school response rates in the main study, as has been the case in all past administrations of PISA and as was particularly evident in PISA 2022 following the COVID-19 pandemic. Field tests tend to have more schools participate than required, and the disposition of these schools tends to be positive toward PISA; however, the recruitment of field test schools does not follow the model that will be used in the main study because the field test does not require building a response rate for originally sampled schools. In the field test, we have to obtain participation from an adequate number of schools to get the required number of student responses for the test items; thus, we pool schools (both originally sampled schools and their replacement schools), approach the original schools first, and move to the replacement schools quickly with very little, if any, conversion effort. This will not be the case in the main study, where we must pursue the original schools until we obtain a satisfactory rate of participation. The historical experience for the U.S. is that obtaining a sufficient response rate of 65 percent of originally sampled, eligible schools is difficult and has required additional effort. Moreover, in the field test we tend not to go to schools in states where we have traditionally had difficulty gaining school cooperation, but we must do so in the main study. Finally, the climate regarding voluntary assessments has become more challenging with each subsequent administration of PISA.

Given these anticipated difficulties in securing sufficient school participation, as in PISA 2022 we plan a second tier of incentives for the main study, which will allow us to offer schools, when necessary, an $800 participation incentive instead of the standard PISA incentive of $250. This second tier will not be initiated until near the end of the 2024-25 academic year, in June 2025, after we have approached all original schools and had an opportunity to try different conversion efforts, such as addressing the specific concerns of refusing schools and making personal visits to schools to discuss the study face to face. If, by that time, we have not reached a participation rate of at least 68 percent of the original schools, we will implement the higher incentive rate (to meet the minimum requirement of 65 percent of original schools, we need to recruit at least 68 percent of original schools, factoring in a 3 percent attrition of schools over the summer months before data collection begins in fall 2025). We will approach refusing schools with the second-tier incentive only if necessary and at the point of our last chance to convert them.

In addition to a monetary incentive, in the main study, participating schools will be offered a school-level report with basic comparisons of the performance means of students in the school with overall means for the U.S., OECD countries, and other similar schools as measured in PISA 2025. Working with the NCES Chief Statistician, we have devised a set of sample and response rate requirements for a participating school to receive a report (schools will be grouped into 6 categories based on sample size and response rates). Failure to achieve the designated sample and response rate requirements will mean that a school will not be eligible to receive a school report with comparative achievement results. For other participating schools, we will also develop an alternate version that will report the non-cognitive questionnaire data, showing profiles of the school across a variety of variables, compared to other similar schools (for additional details see Part B, section B.3).4 This report version will allow us to provide a source of feedback to potentially all schools, if they request it. This alternate report requires work with the NCES Chief Statistician to be sure that the information provided meets NCES reporting standards. We will also directly ask school principals during recruitment if there is any reason they do not want to receive a report. In these rare cases, a school report will not be produced. School-level reports will be shared only with the principal of the school and will not be shared or distributed to anyone else. Prior to distribution, the school reports will be reviewed by the NCES Chief Statistician for accuracy and compliance with the sample and response rate requirements designated for PISA 2025.

School coordinators. The school coordinator will be offered $200. The role of the school coordinator is critical for the success of the study. The coordinator is expected to coordinate logistics with the data collection contractor; supply a list of eligible students for sampling to the data collection contractor; communicate with students and parents about the study to encourage participation; assist the test administrator in ensuring that the sampled students attend the testing sessions; and assist the test administrator in arranging for make-up sessions as needed.

Students. The student burden in PISA 2025 will be the same as in previous rounds of PISA, most recently in 2022 and, as in previous data collection cycles, all participating students will be offered $25.

Additionally, students participating in the assessment during non-school hours (after school or on a Saturday), which is an accommodation offered in the main study when it is not possible to find a suitable time within school hours, and one that is exercised rarely, will be offered $35. The increased incentive amount is designed to thank students for travelling to the assessment site and potentially missing outside of school activities (e.g., work, sports) in order to participate in the assessment outside of school hours.

In addition, all participating students will receive a volunteer service certificate of 4 hours from the U.S. Department of Education. Incentives for students will only be provided with the explicit permission of the school principal. All student incentives will be offered directly to the students. Parents will be informed of the amount of the payment the students will receive in the consent form/letter in advance of the assessment. The payments will be provided as a personal check, as has been successfully done since PISA 2009.

A.10 Assurance of Confidentiality

The primary contractor for this study is Westat. Data security and confidentiality protection procedures have been put in place for PISA 2025 to ensure that the contractor and its subcontractors comply with all privacy requirements, including:

  1. The statement of work of this contract;

  2. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232(g));

  3. Privacy Act of 1974 (5 U.S.C. §552a);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  9. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;

  10. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  11. The U.S. Department of Education Incident Handling Procedures (February 2009);

  12. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  13. NCES Statistical Standards; and

  14. All new legislation that impacts the data collected through the contract for this study.

Furthermore, the contractor will comply with the Department of Education’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: https://nces.ed.gov/statprog/2012/.

By law (20 U.S.C. §9573), a violation of the confidentiality restrictions is a felony, punishable by imprisonment of up to 5 years and/or a fine of up to $250,000. PISA procedures for maintaining confidentiality include notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers; personnel training regarding the meaning of confidentiality; controlled and protected access to computer files; built-in safeguards concerning status monitoring and receipt control systems; and a secure, staffed, in-house computing facility. PISA follows detailed guidelines for securing sensitive project data, including, but not limited to physical/environment protections, building access controls, system access controls, system login restrictions, user identification and authorization procedures, encryption, and project file storage/archiving/destruction.

Additionally, the contractor will take security measures to protect the web data collection applications from unauthorized access. The Department of Education has established a policy regarding the personnel security screening requirements for all contractor employees and their subcontractors. The contractor must comply with these personnel security screening requirements throughout the life of the contract, including several requirements that the contractor must meet for each employee working on the contract for 30 days or more. Among these requirements are that each person working on the contract must be assigned a position risk level. The risk levels are high, moderate, and low based upon the level of harm that a person in the position can cause to the Department’s interests. Each person working on the contract must complete the requirements for a “Contractor Security Screening.” Depending on the risk level assigned to each person’s position, a follow-up background investigation by the Department will occur.

Regarding student lists from administrative sources, the Family Educational Rights and Privacy Act (FERPA, 34 CFR Part 99) allows the disclosure of personally identifiable information from students’ education records without prior consent for the purposes of PISA 2025 according to the following excerpts: 34 CFR §99.31 asks, “Under what conditions is prior consent not required to disclose information?” and explains in 34 CFR §99.31(a) that “An educational agency or institution may disclose personally identifiable information from an education record of a student without the consent required by §99.30 if the disclosure meets one or more” of several conditions. These conditions include, at 34 CFR §99.31(a)(3):

The disclosure is, subject to the requirements of §99.35, to authorized representatives of--

(i) The Comptroller General of the United States;

(ii) The Attorney General of the United States;

(iii) The Secretary; or

(iv) State and local educational authorities.

PISA is collecting data under the Secretary’s authority. Any personally identifiable information is collected with adherence to the security protocol detailed in 34 CFR §99.35.

The laws pertaining to the use of personally identifiable information are clearly communicated in correspondence with states, districts, school administrators, parents, and students. Letters and information materials describe the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see Appendices A and B). Recruitment letters, supporting materials, login pages (for students, the launch page), and the front cover of each data collection instrument, including questionnaires,5 indicate:

The National Center for Education Statistics (NCES) is authorized to conduct the Program for International Student Assessment (PISA) by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), and to collect students’ education records from educational agencies or institutions for the purpose of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). The data are being collected for NCES by Westat, a U.S.-based research organization. All of the information [you / your child] provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Login pages (for students, the launch page) and the front cover of each data collection instrument, including questionnaires, also include the following statement (the phrase “gather the data needed, and complete and review the information collection” will not be included on the student questionnaire):

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0755. The time required to complete this information collection is estimated to average up to [240/45/180] minutes per [school coordinator/school administrator/student], including the time to review instructions, gather the data needed, and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or questions about the status of your individual submission of this form, write directly to: The Program for International Student Assessment (PISA) 2025, National Center for Education Statistics (NCES), Potomac Center Plaza, 550 12th Street, SW, Room 4007, Washington, DC 20202.

OMB No. 1850-0755, Approval Expires xx/xx/2025

NCES understands the legal and ethical need to protect the privacy of the PISA respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure risk analysis of the PISA 2025 data when preparing the data files for use by researchers, in compliance with ESRA (20 U.S.C. §9573). Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used. IES’s Disclosure Review Board (DRB) carefully reviews all datasets prior to release to ensure that disclosure risks have been properly addressed. The PISA 2025 data will be reviewed and approved by the DRB prior to any public release, as has been the protocol for all previous rounds of PISA.

A.11 Sensitive Questions

PISA 2025 does not include questions usually considered to be of a sensitive nature, such as items concerning religion, substance abuse, or sexual activity. Several items in the background questionnaires may be considered sensitive by some of the respondents, such as the socioeconomic context of the school, parents’ education and occupation, family possessions, and students’ belongings. Research indicates that the constructs these items represent are strongly correlated with academic achievement, and they have been used in the eight previous cycles of PISA (2000, 2003, 2006, 2009, 2012, 2015, 2018, and 2022) as well as in a number of other national and international studies. These items are considered essential for the anticipated analyses and to retain consistency in planned comparisons with the international data.

A.12 Estimates of Burden

This package requests approval for the field test recruitment and data collection, and for the main study recruitment activities and data collection activities, which are to begin early in the fall of 2024, after the completion of the field test. Burden estimates are shown in table A-1. The time required for students to respond to the assessment (cognitive items) portion of the study, and the associated directions, is marked with a dagger (†) in table A-1 and is not included in the totals because it is not subject to the PRA. Student and administrator field test questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time involved in: (1) special handling school district staff reviewing and processing NCES’s research application requests to conduct PISA in schools under their districts’ jurisdiction, and (2) schools deciding to participate, completing student listing forms, distributing parent consent materials, and arranging assessment space. Burden estimates for the main study are also provided for information purposes in table A-1. The total response burden is based on the following:

  • We estimate that there may be 15 special handling districts in the field trial sample and 30 in the national main study sample – those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Estimated burden hours for special handling districts are included in table A-1 under “Special Handling Districts IRB Staff Approval” (about 2 hours per IRB special handling district staff member to process and review the PISA application to conduct the study in their schools) and “Special Handling Districts IRB Panel Approval” (about 1 hour per IRB special handling district panel member, assuming an average panel size of 6 members, to discuss and respond to the PISA application). Contacting special districts begins with updating district information based on what can be gleaned from online sources, followed by calls to verify where to send the completed required research application forms and, if necessary, to collect contact information for this process. During the call, inquiry is also made about the amount of time the districts spend reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy approval processes. To allow sufficient time for special districts’ review processes, this operation will begin upon receiving OMB’s approval and continue until we receive final approval or denial of our request from each contacted district, up until April 30, 2024 for the field trial and up until October 1, 2025 for the main study.

  • For school recruitment, in 50 field test schools and 256 schools for the national main study sample: 90 minutes for school administrators during the recruitment process; and an average of 4 hours for school coordinators to: (a) coordinate logistics with the data collection contractor, (b) supply a list of eligible students for sampling to the data collection contractor, (c) communicate with students and parents about the study to encourage participation, (d) assist the test administrator in ensuring the sampled students attend the testing sessions, and (e) assist the test administrator in arranging make-up sessions as needed.

  • For data collection, in 50 field test schools and 256 schools for the national sample: (a) a 45-minute school administrator questionnaire, (b) a 3-minute review of consent forms by parents, (c) a 35-minute computer-based core background student questionnaire, and (d) a 15-minute ICT student questionnaire.

The main study burden estimates are shown below the field test estimates. Although NCES has yet to receive a firm commitment from any state interested in PISA, the main study burden estimates reflect burden for the inclusion of up to 3 states (TBD). The main study burden estimates will be updated following the field test as final decisions by states and territories are made. State-level burden estimates are not shown for the field test because state participation is considered a national option, administering the same assessment design as to the national sample. Therefore, PISA does not require a field test of state-level samples (the U.S. national field trial stands in for state-level participation).

Table A-1. Burden estimates for PISA 2025 field test and main study


Respondent/activity | Sample | Expected response rate | Number of respondents | Number of responses | Burden per respondent (minutes) | Total burden (hours)

FIELD TRIAL—Based on core + international options

Recruitment and Pre-Assessment Activity (includes Puerto Rico)
School Administrator (US sample) | 50 | 1 | 50 | 50 | 90 | 75
Special Handling Districts IRB Staff Approval (US sample) | 15 | 1 | 15 | 15 | 120 | 30
Special Handling Districts IRB Panel Approval (US sample) | 90 | 1 | 90 | 90 | 60 | 90
School Coordinator (US sample) | 50 | 1 | 50 | 50 | 240 | 200

School Administrator
Questionnaire (US sample) | 50 | 1 | 50 | 50 | 45 | 38

Parent
Student Participation Consent | 2,600 | 1 | 2,600 | 2,600 | 3 | 130

Total School Burden Field Trial | | | 2,855 | 2,855 | | 563

Student (US national sample)
Directions† | 2,600 | 0.8 | 2,080 | 2,080 | 10 | 347
Assessment† | 2,600 | 0.8 | 2,080 | 2,080 | 120 | 4,160
Student questionnaire (main questionnaire) | 2,600 | 0.8 | 2,080 | 2,080 | 35 | 1,213
Student questionnaire (ICT questionnaire) | 2,600 | 0.8 | 2,080 | 2,080 | 15 | 520

Total Student Burden Field Trial | | | 2,080 | 4,160 | | 1,733

Total Burden Field Trial | | | 4,935 | 7,015 | | 2,296

MAIN STUDY—Based on core + international options

US national sample

Recruitment and Pre-Assessment Activity
School Administrator | 298 | 0.86 | 256 | 256 | 90 | 384
Special Handling Districts IRB Staff Approval (US sample) | 31 | 1 | 31 | 31 | 120 | 62
Special Handling Districts IRB Panel Approval (US sample) | 187 | 1 | 187 | 187 | 60 | 187
School Coordinator | 256 | 0.85 | 218 | 218 | 240 | 872

School Administrator
Questionnaire | 218 | 1 | 218 | 218 | 45 | 164

Parent
Student Participation Consent | 10,713 | 1 | 10,713 | 10,713 | 3 | 536

Student
Directions† | 10,713 | 0.8 | 8,570 | 8,570 | 10 | 1,428
Assessment† | 10,713 | 0.8 | 8,570 | 8,570 | 120 | 17,140
Student questionnaire (main questionnaire) | 10,713 | 0.8 | 8,570 | 8,570 | 35 | 4,999
Student questionnaire (ICT questionnaire) | 10,713 | 0.8 | 8,570 | 8,570 | 15 | 2,142

State samples (up to 3 states)

Recruitment and Pre-Assessment Activity
School Administrator (US states) | 162 | 1 | 162 | 162 | 90 | 243
School Coordinator (US states) | 162 | 1 | 162 | 162 | 240 | 648

School Administrator
Questionnaire (US states) | 162 | 1 | 162 | 162 | 45 | 122

Parent
Student Participation Consent | 8,424 | 1 | 8,424 | 8,424 | 3 | 421

Student (US states, includes up to 3)
Directions† | 8,424 | 0.83 | 6,992 | 6,992 | 10 | 1,165
Assessment† | 8,424 | 0.83 | 6,992 | 6,992 | 120 | 13,984
Student questionnaire (main questionnaire) | 8,424 | 0.83 | 6,992 | 6,992 | 35 | 4,079
Student questionnaire (ICT questionnaire) | 8,424 | 0.83 | 6,992 | 6,992 | 15 | 1,748

Total Burden - Main Study | | | 36,095 | 51,657 | | 16,607

Total Burden Requested in this Submission | | | 41,030 | 58,672 | | 18,903

† Not included in the burden totals (the cognitive assessment and its associated directions are not subject to the PRA).


NOTE: OMB Clearance Requested: Total Burden includes all burden associated with conducting the PISA 2025 Field Test and the PISA 2025 Main Study. The estimated PISA 2025 Main Study burden is conservatively high because the Main Study may include up to 3 states; however, because of potential variability between states, the state-sample burden is held consistent with that of national sample schools. The total student burden does not include time for the cognitive assessment and its associated instructions, because the assessment is not subject to the PRA.

*Special note: For the national main study sample, we expect to draw an initial sample of 298 schools. Taking into account closed, merged, and ineligible schools (historically, around 14% of sampled schools), as well as the historical school-level response rate, we anticipate interacting with/recruiting about 256 of these schools, of which, we estimate, 218 will participate in the PISA 2025 main study. To obtain the required number of students, we will ask to sample up to 52 students in each school. However, some of the smaller schools will not have 52 students available. We estimate: 218 schools x 52 students sampled x 0.945 (a factor used to account for variations in student population sizes across the schools) = 10,713 starting student sample size that we will work to recruit. Based on historical student assessment rates (~80%), we estimate that, in the end, we will assess about 8,570 students (10,713 x 0.80), which will assure that we meet the minimum required 6,300 assessed students.
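To illustrate the calculation in the note above, the following minimal sketch (provided for illustration only; it is not part of the official sampling methodology) reproduces the student sample-size arithmetic.

```python
# Minimal sketch reproducing the student sample-size arithmetic in the special note
# above (illustrative only; not part of the official sampling methodology).

participating_schools = 218   # schools expected to participate in the main study
students_per_school = 52      # maximum number of students sampled per school
availability_factor = 0.945   # accounts for smaller schools with fewer than 52 eligible students
assessment_rate = 0.80        # historical student assessment (response) rate

starting_sample = participating_schools * students_per_school * availability_factor
assessed_students = starting_sample * assessment_rate

print(f"{starting_sample:,.0f}")    # about 10,713 students in the starting sample to be recruited
print(f"{assessed_students:,.0f}")  # about 8,570 assessed students (minimum required: 6,300)
```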



The estimated hourly rates for secondary noninstructional staff/coordinators, principals/education administrators, and parents ($23.87, $49.35, and $28.01, respectively) are based on Bureau of Labor Statistics (BLS) May 2021 National Occupational and Employment Wage Estimates.6 The federal minimum wage of $7.25 is used as the hourly rate for students. For the PISA 2025 field test and main study, for the national and state samples, a total of 18,903 burden hours are anticipated, resulting in an estimated burden time cost to respondents of approximately $246,929.
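As an illustration of how this cost estimate follows from the hours in table A-1 and the hourly rates above, the sketch below recomputes the total. The grouping of hours by respondent type and the use of the administrator rate for special handling district staff and panels are assumptions made for this illustration.

```python
# Illustrative sketch of the burden time cost calculation described above.
# The grouping of hours by respondent type, and the use of the administrator rate
# for special handling district staff and panels, are assumptions for illustration.

hours = {
    "students": 14_701,                # questionnaire hours, field test + main study (assessment excluded)
    "parents": 1_087,                  # consent review hours
    "school coordinators": 1_720,
    "school administrators": 1_026,    # recruitment plus school questionnaire hours
    "district staff and panels": 369,  # special handling district review hours
}

hourly_rates = {
    "students": 7.25,                  # federal minimum wage
    "parents": 28.01,
    "school coordinators": 23.87,
    "school administrators": 49.35,
    "district staff and panels": 49.35,  # assumed administrator rate
}

total_hours = sum(hours.values())
total_cost = sum(hours[k] * hourly_rates[k] for k in hours)

print(f"{total_hours:,} hours")  # 18,903 hours
print(f"${total_cost:,.0f}")     # approximately $246,929
```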

A.13 Total Annual Cost Burden

Other than the burden associated with completing the PISA questionnaires and assessments (estimated above in Section A.12), the field test and main study impose no additional cost to respondents.

A.14 Annualized Cost to Federal Government

The cost to the Federal Government for PISA 2025 field test and main study is estimated to be $7,572,372 from January 2023 to December 2025, and includes costs for the national sample. These estimates also include all estimated direct and indirect costs of the project. This corresponds to an annual cost to the federal government of $2,524,124.















Table A-2. Estimated costs for PISA 2025 field test and main study

Components with breakdown | Estimated costs

FIELD TEST (2024)
NCES salaries and expenses | $480,000
Recruitment | $339,804
Preparations (e.g., adapting instruments, sampling) | $256,864
Data collection, scoring, and coding | $1,232,124

MAIN STUDY (2025)
NCES salaries and expenses | $320,000
Recruitment | $707,126
Preparations (e.g., adapting instruments, sampling) | $227,277
Data collection, scoring, and coding | $3,141,546
Reporting and dissemination | $867,631

Grand total | $7,572,372
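As a quick consistency check (illustrative only), the component costs above can be summed to confirm the grand total and the annualized figure cited in section A.14.

```python
# Illustrative check that the table A-2 components sum to the grand total and that
# the grand total annualizes to the figure cited in section A.14.

field_test_2024 = {
    "NCES salaries and expenses": 480_000,
    "Recruitment": 339_804,
    "Preparations": 256_864,
    "Data collection, scoring, and coding": 1_232_124,
}

main_study_2025 = {
    "NCES salaries and expenses": 320_000,
    "Recruitment": 707_126,
    "Preparations": 227_277,
    "Data collection, scoring, and coding": 3_141_546,
    "Reporting and dissemination": 867_631,
}

grand_total = sum(field_test_2024.values()) + sum(main_study_2025.values())

print(f"${grand_total:,}")       # $7,572,372
print(f"${grand_total // 3:,}")  # $2,524,124 per year over January 2023 to December 2025
```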


A.15 Program Changes or Adjustments

The apparent increase in burden from last approval is due to the fact that the last request was to conduct the PISA 2022 main study, while this request is for all burden associated with the PISA 2025 field test and main study.

A.16 Plans for Tabulation and Publication

The PISA 2025 field test is designed to provide a statistical review of the performance of items on the assessments and questionnaires in preparation for the main study. The international contractor, ACER, will provide the international instruments to be used in the field test and will report to the participating countries on the results of the field test. Based on the field test results, ACER, with input and agreement from the participating countries, will make final revisions in the survey instruments, materials, and documents in preparation for the main study.

For the PISA 2025 main study, an analysis of the U.S. and international data will be undertaken to provide an understanding of the U.S. national results in relation to the international results. Based on proposed analyses of the international data set by ACER, and the need for NCES to report results from the perspective of an American constituency, a plan is being prepared for the statistical analysis of the U.S. national data set as compared to the international data set. Analysis of data will include examinations of the science, reading, and mathematics literacy and financial literacy of U.S. students in relation to their international counterparts, and the relationships between student performance and student and school background variables.

All reports and publications will be coordinated with the release of information from the international organizing body. Planned publications and reports for the PISA 2025 main study include the following:

General Audience Report. Approximately one year after data collection, in December 2026, to correspond with the international data release, NCES will publish the U.S. National First Look Report with information on the status of reading, mathematics, and science education among students in the U.S. in comparison to their international peers, written for a non-specialist, general U.S. audience. This report will present the results of analyses in a clear and non-technical way, conveying how U.S. students compare to their international peers, and what factors, if any, may be associated with the U.S. results.

The results for LDW will be released internationally in 2027, with NCES releasing a Data Point report to present the LDW results for the U.S. Accompanying both the First Look report and the Data Point will be a set of tables and figures released on the NCES website to provide additional details.

Following the release of the national report, additional data will be made available to secondary users in the form of the International Data Explorer (IDE), an online tool on the NCES website, and a U.S. public-release dataset.

Survey Operations/Technical Report. This document will detail the procedures used in the main study (e.g., sampling, recruitment, data collection, scoring, weighting, and imputation) and describe any problems encountered and the contractor’s response to them. The primary purpose of the main study survey operations/technical report is to document the steps undertaken by the U.S. in conducting and completing the study. This report will include an analysis of non-response bias, which will assess the presence and extent of bias due to nonresponse. Selected characteristics of respondent students and schools will be compared with those of non-respondent schools and students to provide information about whether and how they differ from respondents along dimensions for which we have data for the nonresponding units, as required by NCES standards.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of PISA 2025 results in the U.S. are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:


April-December 2023: Prepare data collection manuals, forms, assessment materials, and questionnaires for field test

November 2023-February 2024: Contact and gain cooperation of states, districts, and schools for field test

January-April 2024: Select student samples and collect field test data

June 2024: Deliver raw data to international sponsoring organization

August-September 2024: Receive field test report from international sponsors

October 2024-September 2025: Prepare for the main study phase/recruit schools

September-November 2025: Collect main study data

March-April 2026: Receive final data files from international sponsors

August-December 2026: Produce General Audience Report and Survey Operations/Technical Report for the U.S.

A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The materials that will be used in the 2025 main study will be based upon the field test materials included in this submission. Additionally, this submission is designed to adequately justify the need for and overall practical utility of the full study and to present the overarching plan for all of the phases of the data collection, providing as much detail about the measures to be used as is available at the time of this submission. As part of this submission, NCES is publishing a notice in the Federal Register allowing first a 60- and then a 30-day public comment period. For the final proposal for the national (main) study, after the field test, NCES will publish a notice in the Federal Register allowing an additional 30-day public comment period on the final details of 2025 main study.

2 PISA 2025 will allow countries the option to include a third response option for student gender. The proposed U.S. adaptations to the school questionnaire items asking about gender are provided in Appendix C. No third option will be applied to the student question.

3 Neidorf, T.S., Binkley, M., Gattis, K., and Nohara, D. (2006). Comparing Mathematics Content in the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Program for International Student Assessment (PISA) 2003 Assessments (NCES 2006-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

4 We originally planned to develop this report for PISA 2018; however, due to timing issues, we had to postpone this development to PISA 2025.

5 Students will first see the assessment launch page, as shown on p. 83 of Appendix C, which acts as a portal for all PISA activities for students and includes links to both the assessments and all questionnaires. The authorization, confidentiality, and PRA language detailed in this section is displayed only on the assessment platform page, to which students will return after they complete the assessment and before they begin the questionnaires (see Student Instrument data collection procedures in Part B).

6 The average hourly earnings of noninstructional staff in the May 2021 National Occupational and Employment Wage Estimates sponsored by the Bureau of Labor Statistics (BLS) is $23.87, of principals/education administrators is $49.35, and of all occupations (used to estimate parent wages) is $28.01. Source: BLS Occupation Employment Statistics, https://www.bls.gov/oes/current/oes_nat.htm#(4) data type: Occupation codes: Education, Training, and Library Workers, All Other (Elementary and Secondary Schools) (25-9099); Education Administrators, Elementary and Secondary Schools (11-9032); and All employees (00-0000); accessed on March 22, 2023. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year.
