PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT 2018 (PISA 2018) Main Study




OMB# 1850-0755 v.21



SUPPORTING STATEMENT PART B




Submitted by:


National Center for Education Statistics (NCES)

U.S. Department of Education

Institute of Education Sciences

Washington, DC








February 2018

revised March 2018








B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe

PISA 2018 assesses students nearing the “end of their compulsory school experience” and, as in all prior administrations of PISA, is conducted in the United States by the National Center for Education Statistics (NCES) within the U.S. Department of Education. For international comparability, the target population is defined as students who are 15 years old and in grade 7 or higher. The exact range of eligible birthdates is specified by the international coordinating committees based on the months in which the data will be collected; students must be between 15 years and 3 completed months and 16 years and 2 completed months of age at the beginning of the testing period. In the U.S., the universe for the selection of schools is all types of schools in the 50 states and the District of Columbia. Within sampled schools, students will be selected for participation by drawing a random sample of the 15-year-old students.

B.2 Procedures for the Collection of Information

This section presents the PISA international standards and describes the school and teacher sampling, recruitment, and data collection procedures for the PISA 2018 main study. Gaining schools’ and teachers’ cooperation in voluntary research is increasingly challenging, and employing effective strategies for securing school cooperation is central to the data collection effort. For the PISA 2018 main study, states, districts, and schools will be recruited beginning in early 2018 (OMB# 1850-0755 v.18-20), and data collection will be conducted from October 1 through November 23, 2018.

B2.a Statistical Methodology

The Technical Standards for main study PISA 2018 established by the international governing board include the following:

Standard 1.8 The student sample size for the computer-based mode is a minimum of 5,250 assessed students and 2,100 for additional adjudicated entities, or the entire PISA Defined Target Population where the PISA Defined Target Population is below 5,250¹ and 2,100, respectively. The student sample size of assessed students for the paper-based mode is a minimum of 5,250. The minimum student sample size for financial literacy in the national sample is an additional 1,650 students, for a total minimum of 6,900 students to be assessed in PISA 2018 in the United States. If individual states participate in the U.S. to obtain state-level estimates, each state administering financial literacy would add approximately 550 students.

Standard 1.9 The school sample size needs to result in a minimum of 150 participating schools, and 50 participating schools for additional adjudicated entities, or all schools that have students in the PISA Defined Target Population where the number of schools with students in the PISA Defined Target Population is below 150 and 50 respectively. Countries not having at least 150 schools, but which have more students than the required minimum student sample size, can be permitted, if agreed upon, to take a smaller sample of schools while still ensuring enough sampled PISA students overall.

Standard 1.10 The final weighted school response rate is at least 85 percent of sampled eligible and non-excluded schools. If the response rate is below 85 percent, an acceptable response rate can still be achieved through the agreed-upon use of replacement schools.

Standard 1.11 The final weighted student response rate is at least 80 percent of all sampled students across responding schools.

Standard 1.12 The final weighted sampling unit response rate for any optional cognitive assessment is at least 80 percent of all sampled students across responding schools.

In addition, NCES standards require a student response rate of at least 85 percent, and the sampling design described below is based on that rate.

The design for this study will be a self-weighting, stratified, two-stage design using probability-proportional-to-size (PPS) sampling, as sketched below. There will be no oversampling of schools or students. Schools will be selected in the first stage with PPS, and students will be sampled in the second stage, yielding overall equal probabilities of selection.
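To illustrate the mechanics, the following is a minimal Python sketch of PPS systematic selection of schools from a frame that has been sorted by stratification variables (implicit stratification). The function name, the toy frame structure, and the handling of very large schools are illustrative assumptions, not the actual KeyQuest implementation used for PISA.

    import random

    def pps_systematic_sample(frame, n_schools):
        # frame: list of (school_id, measure_of_size) tuples, pre-sorted
        # by the stratification variables (implicit stratification).
        # A school whose size exceeds the interval can be hit more than
        # once; operationally such schools are taken with certainty.
        total = sum(size for _, size in frame)
        interval = total / n_schools              # sampling interval
        start = random.uniform(0, interval)       # random start
        hits = [start + k * interval for k in range(n_schools)]
        sample, cum, i = [], 0.0, 0
        for school_id, size in frame:
            cum += size
            while i < len(hits) and hits[i] < cum:
                sample.append(school_id)          # school selected
                i += 1
        return sample

Because a school’s selection probability is proportional to its enrollment of eligible students, drawing a fixed number of students with equal probability within each selected school yields approximately equal overall selection probabilities for students, which is what makes the design self-weighting.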

Target Populations

The national PISA target population is 15-year-old students attending education institutions located within the U.S. in grades 7 and higher. The target population for any participating state is the same. The main survey will be conducted in the fall of 2018; the field test was conducted in the spring of 2017. The specific definition of age eligibility that will be used in the survey is “…between 15 years and 3 (completed) months to 16 years and 2 (completed) months at the beginning of the testing window.”

Sampling Frame of Schools

The population of schools for PISA 2018 is defined as all schools containing any 15-year-olds in grades 7 through 12. As in previous PISA cycles, the school sampling frame for the PISA 2018 main study sample was developed from the most up-to-date NCES Common Core of Data (CCD) and Private School Universe Survey (PSS) datasets. For the PISA 2018 main study, we used the school sampling frame prepared for the National Assessment of Educational Progress (NAEP) 2018, which uses the 2015-2016 CCD and the 2015-2016 PSS school data. We minimized overlap with the 2018 Teaching and Learning International Survey (TALIS), which will be collecting data in schools in the spring of 2018.

The grade structure of the school is a key stratification variable designed to reduce sampling error; this is especially so in PISA because data analyses have shown that achievement is highly related to grade. Other stratification variables may include public/private status, region of the country, location (urban/suburban/town/rural, etc.), and enrollment by race/ethnicity.

Overview of Main Study

For the core computer-based assessment in reading, mathematics, and science, the international required minimum number of completed assessments is 5,250 students in 150 schools. An additional 1,650 assessed students are required for education systems assessing financial literacy, for a total of 6,900 assessed students. To achieve the larger number of assessed students required in 2018, including those needed for financial literacy, and to account for anticipated nonparticipation and student ineligibility, the number of students sampled within each school will be increased to 52 from the 42 sampled in past cycles. Assuming the same response level as in PISA 2015, the initial target is a total sample of about 257 schools, an estimated 229 of them eligible, to yield about 190 participating schools (assuming an 83% school participation rate). As allowed under the international sampling standards, to achieve the target final school response rate, we will use replacement schools to complete the sample.

Assuming a within-school student assessment rate of 83 percent (rates were 81 percent in 2000, 70 percent in 2003, 83 percent in 2006, 78 percent in 2009, 81 percent in 2012, and 83 percent in 2015), the sample size within each school will be up to 52 students. In schools that do not have 52 PISA-eligible students, all eligible students will be sampled. Should any states participate in the 2018 assessment, each state would have a sample of 60 schools and 2,948 students, to yield 2,447 assessed students (given the expected 83% assessment rate). As the main study plans for states and subnational jurisdictions are finalized, this information will be updated in the respondent burden table in the Supporting Statement Part A. A rough check of these figures is sketched below.
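As an illustrative back-of-the-envelope check of the figures above (the official sample design may round differently), the quoted rates and counts fit together roughly as follows:

    # Illustrative arithmetic only; the rates are those stated in the text.
    sampled_schools  = 257
    eligible_schools = 229
    school_rate      = 0.83   # expected school participation rate
    students_per_sch = 52     # students sampled per participating school
    student_rate     = 0.83   # expected within-school assessment rate

    participating = round(eligible_schools * school_rate)            # ~190
    assessed      = participating * students_per_sch * student_rate  # ~8,200
    # ~8,200 potential assessed students comfortably exceeds the 6,900
    # minimum, leaving room for schools with fewer than 52 eligible students.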

Cognitive design

The U.S. will implement a design that includes cognitive items in reading, mathematics, science, and financial literacy. The core assessment includes 36 forms, each with four clusters. The financial literacy assessment will include 12 forms, each with four clusters: two financial literacy clusters and one cluster each of reading and mathematics. The 2018 main study design will also include multi-stage adaptive testing in reading.

The field test assessment design used 6 trend clusters and 12 new clusters of reading items, and 6 trend clusters each of science and mathematics items. These clusters were organized in a rotation across three groups of students. Within a school, sampled students were assigned to one of the three groups.

Group 1 received 2 trend clusters in combinations of science and mathematics, mathematics and reading, or science and reading, presented in fixed unit order (FUO). Data from this group described the degree of invariance between 2015 and 2018 in reading and mathematics. The trend science items provided information about variability between 2015 and 2018 and about the impact of different ordering in 2018.

Group 2 received new and trend items in reading, with variable unit ordering (VUO) applied within clusters. The design varied unit ordering within clusters, and the results were examined relative to Group 1. Each of the 24 Group 2 forms contained one of the six trend reading clusters and 3 of the 12 new reading clusters. Every trend cluster was paired with a new cluster once and appeared once or twice in each position.

Group 3 contained new reading clusters presented in a fixed order of units, to provide a basis of comparison with the varying unit orders in Group 2. Each form was administered to 32 students (768 students in total).

The results of the field test showed no significant difference between the FUO and VUO forms in the percentage of students who reached the items and selected the correct answers, for both trend and new reading items. Response times for paired trend and new items were nearly identical, and no unit-order effect was found. Based on this evidence, it was determined that the order in which units were presented did not significantly affect the estimation of item and person parameters, and that a multi-stage adaptive test (MAT) design could be introduced into the main study for the first time.

For the main study, the MAT design for reading relies on material equivalent to fifteen 30-minute clusters, organized into units rather than clusters. These units are combined into various “testlets.” A testlet is a set of items from one or more units; testlets are linked through common units, which ensures item parameter recovery. The MAT design consists of three stages: Core, Stage 1, and Stage 2. The Core stage includes a set of 5 different units, each composed of 5 items, yielding 8 possible testlets. In Stage 1, there is a set of 24 different units, each composed of 5 items and classified as easier (“low”) or somewhat more difficult (“high”), resulting in 16 possible testlets. Finally, in Stage 2, there is a set of 16 different units, each composed of 7 items, also varying in difficulty and resulting in 16 possible testlets. Each testlet is paired with another testlet in a parallel set at every stage, which increases the number of items and ensures better coverage of the domain. Thus, there are eight testlets at the Core stage and eight parallel testlets at each subsequent stage (eight low and eight high testlets at each of Stage 1 and Stage 2).
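The following is a simplified Python sketch of how such three-stage routing might operate. The testlet labels, the routing threshold, and the rule mapping earlier performance to the “low” or “high” path are hypothetical illustrations; the operational routing algorithm is specified by the international contractor.

    import random

    # Hypothetical pools mirroring the structure described above: 8 Core
    # testlets, and 8 low + 8 high testlets at each of Stage 1 and Stage 2.
    CORE   = [f"C{i}" for i in range(1, 9)]
    STAGE1 = {"low":  [f"S1L{i}" for i in range(1, 9)],
              "high": [f"S1H{i}" for i in range(1, 9)]}
    STAGE2 = {"low":  [f"S2L{i}" for i in range(1, 9)],
              "high": [f"S2H{i}" for i in range(1, 9)]}

    def route(prop_correct, threshold=0.6):
        # Hypothetical rule: provisional proportion correct at or above
        # the threshold routes the student to the "high" path.
        return "high" if prop_correct >= threshold else "low"

    def assemble_path(core_pct, stage1_pct):
        core = random.choice(CORE)                    # Core-stage testlet
        s1 = random.choice(STAGE1[route(core_pct)])   # route on Core result
        s2 = random.choice(STAGE2[route((core_pct + stage1_pct) / 2)])
        return [core, s1, s2]                         # one student's path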

Financial Literacy. The United States is again participating in the optional financial literacy assessment in 2018. In PISA 2015, students were subsampled for financial literacy from the core assessment group within each school, and these subsampled students returned for an additional hour of financial literacy testing. The 2018 design, however, is similar to the one used in PISA 2012, when an expanded sample was used to assess financial literacy in the same session in which mathematics, science, and reading were assessed. That is, students sampled for financial literacy in 2018 will not be required to return for a second session. Each student sampled for financial literacy will receive two clusters of mathematics or reading and two clusters of financial literacy, and will be asked to respond to a set of financial literacy-specific background questionnaire items. For the main study, approximately 12 students in each school will be selected to participate in the financial literacy assessment, in addition to the 40 students selected for the core assessment. These students will be selected separately from the core assessment sample but will be administered the assessment in the same session as the students taking the core assessment. The financial literacy instrument will contain 3 clusters with trend items from 2012 and 2015 as well as new interactive items.

Background Questionnaire Instruments. The questionnaires have been developed to address the questionnaire framework developed for PISA 2018. The framework defines 14 modules across the school, student, and teacher questionnaires comprising student background characteristics, teaching and learning practices, professional development of teachers, school governance, and non-cognitive/metacognitive constructs dealing with reading-related outcomes, attitudes, and motivational strategies. In addition, the questionnaires include items that have been included in multiple cycles of PISA, allowing the investigation of patterns and trends over time.

School questionnaire. The principal or designee from each participating school will be asked to provide information on basic demographics of the school population and more in-depth information on one or more specific issues (generally related to the content of the assessment in the major domain, which is reading in 2018). Basic information to be collected includes school location; measures of the socio-economic context of the school’s student population; school resources, facilities, and community resources; school size; staffing patterns; instructional practices; and school organization. The in-depth information is designed to address a very limited selection of issues of particular interest, focused primarily on the major content domain, reading. For both the field test and main study, it is anticipated that the school questionnaire will take approximately 45 minutes. It will be available to respondents online.

Teacher questionnaire. The teacher questionnaire will be offered online and is estimated to take approximately 30 minutes to complete in the main study. It gathers school-level contextual information about the structural and process characteristics of schools from a teacher’s perspective (e.g., teaching practices and learning opportunities in classrooms, leadership and school policies for professional development, vertical and horizontal differentiation of the school system) and will be analyzed alongside data from the school questionnaire to provide context for the student achievement scores.

Student core questionnaire. Participating students will be asked to provide information pertaining primarily to the major assessment domain in 2018, reading. Information to be collected includes demographics (e.g., age, gender, language, race, and ethnicity); socio-economic background of the student (e.g., parental education, economic background); students’ education careers; and access to educational resources and their use at home and at school, which have been standard questions in PISA since the earliest rounds. Domain-specific information will include instructional experiences and time spent in school, as perceived by the students, and student attitudes towards reading. In the field test, there were multiple forms of the questionnaire in order to try out different items and item formats. The main study will administer a single form of the student core questionnaire that is estimated to take approximately 30 minutes to complete.

Information and Communication Technology (ICT) Familiarity questionnaire. The ICT questionnaire examines students’ ICT activities and domain-specific attitudes, including access to and use of ICT at home and at school; attitudes towards and self-confidence in using computers; self-confidence in doing ICT tasks and activities; and navigation indices extracted from log-file data (number of pages visited, number of relevant pages visited). The ICT questionnaire for students is expected to take approximately 15 minutes to complete.

Financial Literacy (FL) questionnaire. The FL questionnaire examines students’ experience with money matters, such as having savings accounts or debit or prepaid cards, as well as whether they have received lessons related to finance in their school careers. Many of the items in the FL questionnaire were previously administered in 2012 and 2015, with a small number of new items piloted in the field test. The FL questionnaire for students is expected to take approximately 15 minutes to complete.

Main Study Within-School Sampling

For the main study, up to 52 students will be sampled within each participating school in order to yield the minimum required 6,900 assessed students (also see the special note under the response burden table in the Supporting Statement Part A). Up to 25 teachers will be sampled within each participating school in order to yield the minimum required national sample of 4,000 teachers. Within a school, up to 25 teachers who are eligible to teach the modal grade (grade 10) will be selected: up to 10 English/language arts (ELA) teachers (teachers eligible to teach grade 10 students in an English/language arts subject) and up to 15 non-ELA teachers (teachers eligible to teach grade 10, but in subjects other than ELA). The teacher and student data are not linked; that is, the sampled teachers are not necessarily teachers of the sampled students, and the teacher and student samples are drawn independently of one another, as sketched below.
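A minimal sketch of these within-school selection rules, assuming simple random sampling within each list (the field names and helper function are illustrative, not the KeyQuest procedure):

    import random

    def sample_within_school(eligible_students, ela_teachers, other_teachers):
        # Up to 52 eligible students; all students if 52 or fewer.
        if len(eligible_students) <= 52:
            students = list(eligible_students)
        else:
            students = random.sample(eligible_students, 52)
        # Up to 10 ELA and up to 15 non-ELA teachers, drawn
        # independently of the student sample.
        ela   = random.sample(ela_teachers, min(10, len(ela_teachers)))
        other = random.sample(other_teachers, min(15, len(other_teachers)))
        return students, ela + other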

As in the field test, each main study school will have a school coordinator to assist with arranging for the study and to prepare a list of all eligible students and teachers in the school using a standardized Student Listing Form and Teacher Listing Form. The listing forms are unchanged from those used in the field test. Each completed list will be submitted to Westat and the information entered into the KeyQuest sampling software. The collected data will be used only to select the student and teacher samples in each school. Once no further follow-up with sampled students and teachers is necessary, all student and teacher listing data will be destroyed.

Nonresponse Bias Analysis, Weighting, Sampling Errors

Nonresponse will inevitably occur at the school, teacher, and student levels. As required by NCES statistical standards, we will analyze the nonrespondents and report whether and how they differ from the respondents along dimensions for which we have data for the nonresponding units; a schematic example follows. After the international contractor calculates weights, sampling errors will be calculated for a selection of key indicators, incorporating the full complexity of the design, that is, clustering and stratification.
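One common form such an analysis takes is a base-weighted comparison of respondents and nonrespondents on frame characteristics. The sketch below, using pandas with illustrative column names, shows one possible shape of that comparison, not the prescribed NCES procedure:

    import pandas as pd

    def nonresponse_comparison(frame: pd.DataFrame, by: str) -> pd.DataFrame:
        # Weighted percent distribution of a frame characteristic `by`
        # (e.g., school type or region) for respondents vs. nonrespondents.
        # Expects columns: 'responded' (bool) and 'base_weight'.
        totals = (frame.groupby(['responded', by])['base_weight']
                       .sum().unstack(by))
        return totals.div(totals.sum(axis=1), axis=0) * 100

Large differences between the two weighted distributions would signal potential nonresponse bias on that dimension and would inform nonresponse weighting adjustments.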

B2.b Respondent Recruitment

Based on recruitment knowledge gained in the PISA 2018 field test and other NCES studies, states and districts will be notified about the study before the sampled schools in their jurisdictions are contacted. School staff often want to be assured that their school district knows about the study before they agree to participate. The PISA 2018 approach to state, district, and school recruitment is described in this section, and all of the respondent recruitment materials are provided in Appendices A and B. PISA 2018 recruitment activities were last approved in September 2017 (OMB# 1850-0755 v.18-20).

State Recruitment. In December 2017, state education agencies (SEAs) in states containing schools sampled for the PISA 2018 main study were mailed a package that includes the state letter and the PISA 2018 advance materials. We are working with the NAEP State Coordinators (NSCs) in this state recruitment effort. Some NSCs sent the state letter and personally followed up to answer any questions; otherwise, Westat school recruiters mailed the state package and followed up with the schools being recruited.

District Recruitment. Early in 2018, advance packages are being mailed to district superintendents. Each package contains an introductory letter, including a list of sampled schools in the district’s jurisdiction, and the PISA 2018 advance materials. The district mailings come from the NSC or Westat, depending on each NSC’s preference. Shortly after the mailing, the district superintendent is contacted by phone to inform him or her of the study, ensure the PISA 2018 package was received, and answer any questions. Any issues with approaching schools in the district are also discussed at that time.

The PISA 2018 study staff responds to districts’ requirements, such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter documents all concerns raised by the district so that a refusal conversion strategy can be formulated. As in the PISA 2018 field test and previous NCES studies, some NSCs talk to district staff themselves, others mail the package but do not contact districts further, and still others do not want to be involved in district recruitment at all. In cases where the NSCs do not wish to follow up, Westat’s recruiters work directly with the districts.

Based on prior recruitment experience across a variety of NCES studies, some districts are designated as “special handling districts.” Contacting special handling districts begins with updating district information from online sources, followed by calls to verify where to send the completed required research application forms and, if necessary, to collect contact information for this process. During the call, inquiries are also made about how long the district takes to review similar research applications.

School Recruitment. After each district with sampled public schools has been informed of the study and has confirmed receipt of the PISA 2018 package, the main study mailing to its public schools is triggered. Private school mailings begin at the same time that public districts are contacted. All of the school mailings, taking place from January through March of 2018, contain an introductory letter offering each sampled school a $250 check as a thank-you for participation (to be mailed with a thank-you letter after the end of data collection); the PISA brochure and timeline; and school administrator, teacher, and student Frequently Asked Questions (FAQ) sheets. Shortly after the mailing, the NSCs or Westat recruiters phone the school administrator to discuss the study, gain cooperation, and ask that a school staff person be assigned to serve as the PISA 2018 school coordinator, who will work with PISA staff to manage data collection in the school. The school coordinator will act as the liaison between study staff and the school.

In cases where recruitment proves more difficult, school recruiters consult with the PISA home office to develop a conversion plan for each school. Typical issues include principals who are difficult to reach, school staff who are considering participation but have not provided a final decision, principals or staff who express a concern and need follow-up, and principals who may require additional appreciation for agreeing to participate.

School Coordinator Contact. Shortly after permission has been granted by the school administrator, Westat emails the school coordinator the MyPISA website registration information. The MyPISA website (described in more detail below) contains the e-filing instructions and templates and other logistical information for the assessment. Each school will receive a unique ID that supports multiple school users. Each user must then provide his or her own contact information and set up a unique account. This registration process has been used across all of NCES’ international studies and in NAEP.

In August 2018, the school coordinator will receive a handbook detailing the procedures for administering the PISA 2018 survey in the school, and providing timelines and instructions for submission of the list of students and teachers via MyPISA. Westat PISA staff will also call the school coordinator to discuss the PISA activities at the school, including when to begin constructing and e-filing the student and teacher list.

Teacher sampling and recruitment. As described above, teacher lists, including teachers’ names and email addresses, will be collected beginning in August 2018 for drawing a teacher sample in each school. Edit checks will be run during sampling to verify column mappings and completeness of data, ensuring that all teacher listing forms are constructed properly for sampling and for contacting the sampled teachers (see Appendix A).

After sampling has been completed and correct teacher names and email addresses have been collected for the school, the sampled teachers and the school principal will be emailed personalized invitations to the PISA survey containing the survey address and credentials. The school coordinator serves as the study’s main point of contact for the teachers and answers any questions from the school administrator and teachers about PISA 2018. Shortly after teacher sampling, Westat will also email each sampled teacher a direct, unique link to the teacher questionnaire.

Student sampling. Beginning in August 2018, student lists will be collected along with the teacher lists to draw a student sample in each school. The student lists are designed to collect students’ names, grade, gender, and month and year of birth. Edit checks will be run during sampling to verify column mappings and completeness of data, ensuring that all student listing forms are constructed properly for sampling (see Appendix A); a sketch of such checks follows.
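The edit checks might resemble the sketch below, which flags missing required fields, out-of-range grades, and implausible birth years. The column names and the birth-year bound are illustrative assumptions rather than the actual specifications:

    REQUIRED = ['name', 'grade', 'gender', 'birth_month', 'birth_year']

    def check_student_listing(rows):
        # rows: list of dicts keyed by the REQUIRED column names.
        problems = []
        for i, row in enumerate(rows, start=1):
            for col in REQUIRED:
                if not row.get(col):
                    problems.append((i, f'missing {col}'))
            if row.get('grade') and not 7 <= int(row['grade']) <= 12:
                problems.append((i, 'grade outside 7-12'))
            # Students eligible for a fall 2018 administration were
            # generally born in 2002-2003 (rough bound; the exact
            # birth-date window is set internationally).
            if row.get('birth_year') and int(row['birth_year']) not in (2002, 2003):
                problems.append((i, 'birth year outside expected range'))
        return problems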

After student sampling has been completed, student tracking forms will be generated in KeyQuest and distributed to the school coordinator via MyPISA. These forms will also be used by the PISA field staff for administering the assessment and recording student participation status.

After students are sampled and school coordinators are notified of the sampled students, each school coordinator will notify and provide consent materials to parents of sampled students using the materials provided in Appendix B. Prior to the assessment, the test administrator assigned to each school will collect a copy of the notification and consent materials the school sent to parents as proof that the school took care of parental notification and consent.

Role of the MyPISA Website. The central purpose of MyPISA is to provide a way for schools to securely upload lists of students and teachers and to provide the tracking forms listing the sampled students and teachers to the school coordinator and PISA field staff. MyPISA also serves as a source of information about PISA, such as copies of the advance materials, descriptions of survey activities and school actions, and instructions and templates for submitting lists. School registrants are encouraged to update their contact information and access information about the study in MyPISA.

B.2.c PISA Data Collection

PISA 2018 main study data collection will occur in October-November 2018. Prior to data collection, Westat PISA staff will complete various pre-survey activities.

Pre-survey Activities

In August 2018, the school coordinators will receive a handbook and instructions for assembling a student and teacher list. The lists will be submitted to Westat via MyPISA and the samples will be drawn during August 2018. Beginning in August 2018, school coordinators will be asked to do the following activities:

  • Create and submit lists of eligible students and teachers via MyPISA;

  • Receive study materials from the PISA Home Office and distribute them to sampled students and teachers;

  • Encourage sampled teachers and the principal to complete their questionnaires; and

  • Meet with the PISA test administrator to review assessment logistics and hold a brief meeting with the sampled students to show a PISA video presentation explaining the study and the students’ role and contribution and to answer questions about PISA.

Data Collection

The school and teacher questionnaires will be available electronically, with hard copies available upon request. The principal and sampled teachers will be given links to their questionnaires as part of the invitation to PISA and will also be emailed their personalized links. The student questionnaire will be administered as part of the Student Delivery System (SDS) provided to countries by the International Consortium. The SDS includes the PISA main assessment, the financial literacy assessment, and the student questionnaires. The main study PISA questionnaires are provided in Appendix C.

The PISA assessments are administered to students by trained PISA test administrators hired and trained by Westat. The PISA test administrators will bring all assessment equipment to the school including student laptops and peripheral equipment (power cords and mice). They are responsible for set-up and breakdown of equipment and administration of the assessment to the students. All that is required from the school is an adequate space to set up the equipment and hold the assessment.

Students complete the cognitive assessment followed by the student questionnaires, which are administered in a separate session. Typically there is a long break between the cognitive assessment session and the student questionnaire session. All student assessment activities are planned to occur in a single day.

Throughout the data collection period, PISA staff and the school coordinator will monitor the return of school and teacher questionnaires and, working together, will follow up with nonresponders as needed. A “window-is-closing” nonresponse follow-up effort will be used, gently reminding principals and teachers to complete their questionnaires. Near the end of data collection, the two reminders before the last are designed to establish a deadline effect and will be followed by an extension email. This campaign-style approach is designed to provide soft reminders across the data collection window while creating a sense of urgency to respond toward the end.

School and school coordinator incentive checks will be distributed after the assessment is completed. Teacher incentive checks will be distributed as the teacher questionnaires are completed. Incentives will be mailed to schools on a weekly basis throughout the data collection period. The school coordinators will distribute the incentive checks. Student incentives will be distributed to the students at the end of the questionnaire session.

B.3 Maximizing Response Rates

Our approach to maximizing school and student response rates in the main study includes the following:

  • Use of a fall test administration, to avoid major conflicts with state testing;

  • Selecting and notifying schools at least a year in advance;

  • PISA 2018 summer training workshop for school coordinators or other school staff;

  • Communicating with state and district officials early in the process and applying a more proactive approach with states by coordinating with NSCs to gain assistance with sampled schools;

  • Assigning personal recruiters for specific schools;

  • Monetary and data report incentives for schools, school coordinators, teachers, and students (see Section A.9 and below);

  • Contact with schools and school coordinators at set intervals throughout the year preceding the assessment;

  • Use of an informational video about PISA 2018 to motivate student participation and full effort during the assessments; and

  • Use of individually tailored refusal conversion strategies to encourage participation.

As in the four previous cycles of PISA (2006, 2009, 2012, and 2015), the summer training workshop for representatives of sampled schools is designed to inform schools about PISA and keep them engaged in the study (see Section A.9). The Summer Training for PISA 2018 Schools provides an important channel of communication between NCES and Westat staff and the schools participating in PISA. The workshop is valuable for answering questions from schools about PISA, conveying the usefulness of PISA data both nationally and internationally, and working with school staff to help them understand the logistical requirements of the study in their schools. This includes working with student lists, defining special education needs, determining non-participants, understanding the room logistics and equipment involved, and following the specific timelines of activities. The summer training workshop for PISA 2018 will be held in June 2018 in Washington, D.C. The school coordinator from each participating school will be invited to attend. Airfare, hotel accommodation, and per diem will be provided for school participants who attend the training.

Schools that meet the criteria for receiving a report (see section A.9 of Supporting Statement Part A) will be provided school-level PISA 2018 results. While individual-level scores cannot be produced from PISA data, a school-level report showing comparative data for the school can be produced when the school has a participation rate of 85 percent or better and at least 10 assessed students; this rule is sketched below. The results in the school-level report will be comparative: they will not provide actual school means but will indicate how the school performed compared to country averages and to other U.S. schools with similar demographic characteristics. We are also exploring the design of a second report that would provide information from the school questionnaires to schools that do not meet these requirements.
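Encoded as a simple check (with illustrative names, and assuming the participation rate is computed as assessed students over sampled students), the reporting rule is:

    def eligible_for_school_report(assessed: int, sampled: int) -> bool:
        # A school qualifies for the comparative report when at least
        # 85 percent of its sampled students were assessed and at
        # least 10 students were assessed.
        return sampled > 0 and assessed >= 10 and assessed / sampled >= 0.85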

These approaches are based on recommendations from an NCES panel, experience with previous PISA administrations, as well as extensive discussions with NCES’ chief statistician.

B.4 Purpose of Field Test and Main Study Data Uses

In the U.S., the PISA 2018 field test was conducted from April to May 2017. The field test was used to evaluate newly developed assessment items, including new reading items developed to address the PISA 2018 framework, and to test the survey operations. It was also used to determine whether a multi-stage adaptive test design for reading would be feasible to implement in the main study. Finally, the field test was used to evaluate school recruitment, data collection, and data management procedures in preparation for the main study. The results of the field test were analyzed by the OECD and informed the main study materials and procedures presented in this submission.

B.5 Individuals Consulted on Study Design

Many people at OECD, ETS, and other organizations around the world have been involved in the design of PISA. Some of the lead people are listed in section A8. Overall direction for PISA is provided by Patrick Gonzales, the PISA National Project Manager, and other staff at NCES.

1 The required number of assessed students was revised from the original requirement of 6,300 to 5,250 students per education system that is administering PISA as a computer-based assessment and is not administering the Global Competence assessment.
