
PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT (PISA 2018) FIELD TEST AND RECRUITMENT FOR MAIN STUDY




OMB# 1850-0755 v.19



SUPPORTING STATEMENT PART A









Submitted by:


National Center for Education Statistics (NCES)

U.S. Department of Education

Institute of Education Sciences

Washington, DC










April 2016

Revised September 2016









TABLE OF CONTENTS



PREFACE

A. JUSTIFICATION

A.1 Importance of Information

A.2 Purposes and Uses of Data

A.3 Improved Information Technology (Reduction of Burden)

A.4 Efforts to Identify Duplication

A.5 Minimizing Burden for Small Entities

A.6 Frequency of Data Collection

A.7 Special Circumstances

A.8 Consultations outside NCES

A.9 Payments or Gifts to Respondents

A.10 Assurance of Confidentiality

A.11 Sensitive Questions

A.12 Estimates of Burden

A.13 Total Annual Cost Burden

A.14 Annualized Cost to Federal Government

A.15 Program Changes or Adjustments

A.16 Plans for Tabulation and Publication

A.17 Display OMB Expiration Date

A.18 Exceptions to Certification Statement



APPENDIX A: RECRUITMENT MATERIALS

APPENDIX B: PARENTAL CONSENT MATERIALS

APPENDIX C: INSTRUMENTS



PREFACE

The Program for International Student Assessment (PISA) is an international assessment of 15-year-olds that focuses on students’ reading, mathematics, and science literacy. PISA was first administered in 2000 and is conducted every three years. The seventh cycle of the study, PISA 2018, is being administered at a time of increasing interest, both worldwide and in the United States, in how well schools are preparing students to meet the challenges of the future and in how those students perform compared with their peers in other education systems of the world. Approximately 75 education systems, including the United States, are expected to participate in 2018. The U.S. has participated in all previous cycles and will participate in 2018 in order to track trends and to compare the performance of U.S. students with that of their peers in other education systems.

PISA 2018 is sponsored by the Organization for Economic Cooperation and Development (OECD). In the United States, PISA 2018 is conducted by the National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), U.S. Department of Education. PISA is a collaboration among the participating countries, the OECD, and a group of international organizations each under contract to the OECD (hereafter referred to as the PISA International Consortium), including the Educational Testing Service (ETS), Westat, the German Institute for Educational Research (DIPF), and Pearson.

In each administration of PISA, one of the subject areas (reading, mathematics, or science literacy) is the major domain and has the broadest content coverage, while the other two subjects are the minor domains. Reading literacy will be the major domain in PISA 2018. Other areas may also be assessed; in PISA 2018, these are financial literacy and global competence.1 PISA assesses students’ knowledge and skills gained both in and out of school environments. This focus on the “yield” of education in and out of school distinguishes PISA from other international assessments, such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), which are closely tied to school curriculum frameworks and assess younger, grade-based populations.

As in PISA 2015, in PISA 2018 the entire assessment and the questionnaires will be administered on computer. While countries may continue using paper-based instruments, and some are choosing to do so, those instruments will not include new items. The U.S. will administer PISA 2018 on computer. In addition to the cognitive assessments, PISA 2018 will include questionnaires administered to assessed students, school principals, and teachers. The school and teacher questionnaires will be delivered online. The school and student questionnaires are core components of PISA and as such are required of all participating countries, while the teacher questionnaire is optional. The U.S. will implement all three questionnaires in 2018.

To prepare for the main study in 2018, PISA countries will conduct a field test in the spring of 2017, primarily to evaluate newly developed assessment and questionnaire items but also to test the assessment operations. The U.S. PISA field test data collection will occur from April-May 2017 and the main study data collection from September-November 2018. In order to meet the international data collection schedule for the spring 2017 field test, questionnaires must be finalized by September 2016 and recruiting activities begun by October 2016. This submission requests approval for:

  1. recruitment and pre-assessment activities for the 2017 field test sample;

  2. administration of the field test; and

  3. the overarching plan and recruitment of schools for the 2018 main study sample2.

Field test recruitment materials, including letters to state and district officials and school principals, and text for a PISA field test brochure, summary of activities, and “Frequently Asked Questions” are included in Appendix A. Parental consent letters and related materials for the field test are included in Appendix B. Main study materials will be based on these, but will reflect the main study design and components to be administered. The international versions of the field test questionnaire items with proposed adaptations to these items for use in the United States are provided in Appendix C. The final versions of the questionnaires will be submitted to OMB as a change request upon the approval of this clearance package.

In order to begin recruiting schools for the main study by September 2017, NCES will submit a change-request to OMB in May 2017 with the final main study recruitment materials and parental consent letters, details about any changes to the design and procedures for the main study, and updates to the respondent burden estimates for the main study data collection. Subsequently, in late fall 2017, NCES will submit a clearance request, with a 30-day public comment period notice published in the Federal Register, with the final main study procedures and instruments for data collection in the fall of 2018.

A. JUSTIFICATION

A.1 Importance of Information

As part of a continuing cycle of international studies, the United States, through the National Center for Education Statistics (NCES), participates in several international education assessments and surveys. The Program for International Student Assessment (PISA), sponsored by the Organization for Economic Cooperation and Development (OECD), is one of these studies.

In light of growing concerns about international economic competitiveness, the changing nature of the workplace, and the expanding international marketplace in which we trade, knowing how our students and adults compare with their peers around the world has become a more prominent issue than ever before. Nationwide, interest in understanding what other nations are doing to further the educational achievement of their populations has grown beyond simple comparisons.

Data at critical points during the education career of U.S. students, such as those collected through PISA, have been used by policymakers in efforts to guide and examine the American education system. Consequently, generating comparative data about students in school, at the end of schooling, and about adults in the workplace and in the community has become an important focus for NCES.

PISA measures students' knowledge, skills, and competencies primarily in three subject areas – reading, mathematics, and science literacy. The overall strategy is to collect in-depth information on student capabilities in one of these three domains every 3 years, so that detailed information on each becomes available every 9 years. During each 3-year survey cycle, the major focus is on one content domain, with a minor focus on the other two content domains. The major focus for the data collection in 2018 is on reading literacy, with a minor focus on mathematics and science.

The results from PISA assessments, published every 3 years along with related indicators, allow national policymakers to compare the performance of their education systems with those of other countries and provide a basis for monitoring the effectiveness of education systems at the national level. Without these kinds of data, U.S. policymakers will be limited in their ability to gain insight into the educational performance and practices of other nations as they compare to the United States. NCES provides extensive information to the public on PISA through its publications and its website (http://nces.ed.gov/surveys/pisa).

A.2 Purposes and Uses of Data

Governments and the general public want solid evidence of education outcomes. In the late 1990s, the OECD launched an extensive program for producing policy-oriented and internationally comparable indicators of student achievement on a regular basis and in a timely manner. PISA is at the heart of this program. How well are schools preparing students to meet the challenges of the future? Parents, students, the public, and those who run education systems need to know whether children are acquiring the necessary skills and knowledge, whether they are prepared to become tomorrow's workers, to continue learning throughout life, to analyze, to reason, and to communicate ideas effectively.

The results of OECD’s PISA, published every 3 years (with more detailed measures of each of the three major subject domains every 9 years) along with related indicators, allow national policymakers to compare the performance of their education systems with those of other countries, and to analyze the relationship between constructs measured through the PISA questionnaires with assessment results at national and international levels. Through PISA, the OECD and NCES produce three types of indicators:

  • Basic indicators that provide a baseline profile of the knowledge, skills, and competencies of students;

  • Contextual indicators that show how such skills relate to important demographic, social, economic, and education variables; and

  • Trend indicators that emerge from the ongoing, cyclical nature of the data collection.

PISA 2018 Components

The primary focus for the assessment and questionnaires for PISA 2018 will be on reading literacy. The PISA reading framework defines reading literacy as an individual’s:

understanding, using, evaluating, reflecting on and engaging with texts in order to achieve one’s goals, to develop one’s knowledge and potential and to participate in society.

As in all administrations of PISA, mathematics and science literacy also will be assessed, although they will be “minor domains” in 2018. The instruments to be administered in 2018 are as follows:

Assessment Instruments: The field test includes a total of 66 forms, each containing 4 clusters drawn from reading, mathematics, science, and global competence, administered in a 2-hour session. Each student will receive one form, and the combination of clusters a student sees depends on the form. Because multistage adaptive testing is planned for the reading component of the 2018 main study, the field trial design includes variable unit positioning within clusters and will investigate the effects of variable unit positioning versus fixed positions in preparation for the main study; the working hypothesis is that item parameter invariance holds only when intact clusters are used.
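
To make the cluster-rotation idea concrete, the sketch below is purely illustrative and does not reproduce the actual PISA 2018 form design: it assumes a small hypothetical pool of clusters and shows how clusters might be rotated into forms, and how a form built from intact clusters differs from one in which unit positions vary within a cluster.

```python
# Illustrative sketch only: NOT the actual PISA 2018 field test design.
# It shows, under assumed numbers of clusters and units, how clusters could be
# rotated into forms and how "intact" cluster ordering differs from "variable
# unit positioning" within a cluster.
import itertools
import random

random.seed(0)

# Hypothetical item pool: two clusters per domain, each cluster a list of units.
clusters = {
    f"{domain}{i}": [f"{domain}{i}-U{u}" for u in range(1, 5)]
    for domain in ("R", "M", "S", "G")   # Reading, Math, Science, Global competence
    for i in (1, 2)
}

def build_form(cluster_ids, variable_positioning=False):
    """Assemble one form from four clusters.

    With intact clusters, units keep their fixed within-cluster order;
    with variable positioning, the unit order inside each cluster is shuffled.
    """
    form = []
    for cid in cluster_ids:
        units = list(clusters[cid])
        if variable_positioning:
            random.shuffle(units)
        form.extend(units)
    return form

# Rotate clusters so each form carries a different 4-cluster combination.
combinations = list(itertools.combinations(sorted(clusters), 4))[:6]
for k, combo in enumerate(combinations, start=1):
    intact = build_form(combo)                              # fixed unit positions
    varied = build_form(combo, variable_positioning=True)   # shuffled unit positions
    print(f"Form {k:2d} clusters={combo}")
    print("   intact :", intact[:4], "...")
    print("   varied :", varied[:4], "...")
```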

Global competence is a new, innovative domain to be assessed in PISA 2018. The PISA global competence assessment seeks to measure students’ readiness to live and work in a world that is increasingly interdependent. Global competence consists of the skills and habits of mind needed to understand such global interdependence and to live with meaning and direction in contexts where global interactions are increasing rapidly. It aims to gauge students’ sensitivity toward and understanding of other people’s cultures and values without judging their worth on the basis of preconceived notions; their understanding of international decision making and its implications; and the nature and importance of history in shaping contemporary society.

Following the field test, cognitive and non-cognitive items will be evaluated for bias and interpretation issues, following standard protocols. For the main study, the pool of items will be reduced to include only those items that demonstrate validity across the participating education systems, meet the goals of content coverage needed to adequately measure the framework, and provide the desired distribution of item types.

Background Questionnaire Instruments: Every participating country must implement two core background questionnaires for PISA 2018: school and student. Several optional questionnaires are also available, of which the United States will implement two: a teacher questionnaire and an additional student questionnaire on Information and Communication Technology (ICT) familiarity. These instruments have been developed to address the PISA 2018 questionnaire framework, which defines 14 modules across the school, student, and teacher questionnaires comprising student background characteristics, teaching and learning practices, professional development of teachers, school governance, and non-cognitive/metacognitive constructs dealing with reading-related outcomes, attitudes, and motivational strategies. In addition, the questionnaires include items that have been administered in multiple cycles of PISA, allowing the investigation of patterns and trends over time. Countries adapt the questions to fit their national context and the questionnaires are reviewed and verified to ensure they remain comparable across countries.

The teacher questionnaire, which was also implemented in 2015, gathers school-level contextual information about the structural and process characteristics of schools (e.g., teaching practices and learning opportunities in classrooms, leadership and school policies for professional development, vertical and horizontal differentiation of the school system) and will be analyzed alongside data received through the school questionnaire to provide a context for the student achievement scores.

Participating students will be asked to provide information pertaining primarily to the major assessment domain, reading, and about their demographics (e.g., age, gender, language, race, and ethnicity); socio-economic background of the student (e.g., parental education, economic background); student's education career; and access to educational resources and their use at home and at school. Domain-specific information will include instructional experiences and time spent in school, as perceived by the students, and student attitudes towards reading. Multiple forms of the questionnaire will be used in the field test to try out different items and item formats, with the goal for the student questionnaire to take approximately 30 minutes to complete in the main study. The main study may or may not use multiple forms.

The ICT questionnaire, in turn, examines students’ ICT activities and domain-specific attitudes, including access to and use of ICT at home and at school; attitudes toward and self-confidence in using computers; self-confidence in carrying out ICT tasks and activities; and navigation indices extracted from log-file data (e.g., number of pages visited, number of relevant pages visited). The core student questionnaire, financial literacy items, and ICT questionnaire will be computer-based and delivered to students via thumb drive. The school and teacher questionnaires will be administered online, though hard-copy versions will also be made available to those who request them.

A.3 Improved Information Technology (Reduction of Burden)

The PISA 2018 design and procedures are prescribed internationally. Data collection will consist of computer-based responses for reading, mathematics, science, financial literacy, and global competence. Responses to the computer-based assessments and questionnaires will be captured electronically. In the U.S., the computer-based assessments and student questionnaire will be implemented using laptops carried into schools by the data collection staff. The school and teacher questionnaires will be administered online as the main mode of administration. This greatly reduces the burden on schools and staff by eliminating the need to use school-based equipment and computer labs. Online data collection was successfully used in the 2015 cycle.

A communication website, MyPISAUSA, will be used during the 2017 field test and 2018 main study in order to provide a simple, single source of information to engage sampled schools and maintain high levels of their involvement. This portal will be used throughout the assessment cycle to inform schools, particularly school coordinators, of their tasks and to provide them with easy access to information tailored for their anticipated needs. We plan to gather student and teacher lists from participating schools electronically using an adaptation of Westat’s secure E-filing process through the MyPISAUSA portal. E-filing is an electronic system for submitting lists of student and teacher information, including limited background information from school records. Instructions to school coordinators on how to submit student and teacher lists are included in Appendix A. E-filing has been used successfully in NAEP for more than 10 years, and was used in TIMSS 2015 and the PISA 2012 and 2015 assessments. The E-filing system provides advantageous features such as efficiency and data quality checks.
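
As a purely hypothetical illustration of the kind of data quality checks an electronic listing system can apply, the sketch below validates a small student list. The field names, eligible birth years, and rules are illustrative assumptions, not the actual E-filing specification.

```python
# Hypothetical illustration only: the actual E-filing system's fields and rules
# are not specified here. This sketch shows the kind of data quality checks an
# electronic listing process can apply before a student list is accepted.
import csv
import io

REQUIRED_FIELDS = ["student_id", "birth_month", "birth_year", "grade"]  # assumed field names

def check_student_list(csv_text, eligible_birth_years=(2001, 2002)):
    """Return a list of (row_number, problem) tuples for a submitted list."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for n, row in enumerate(reader, start=2):  # row 1 is the header
        for field in REQUIRED_FIELDS:
            if not (row.get(field) or "").strip():
                problems.append((n, f"missing {field}"))
        year = row.get("birth_year", "") or ""
        if year.isdigit() and int(year) not in eligible_birth_years:
            problems.append((n, "birth year outside the assumed PISA-eligible range"))
    return problems

sample = """student_id,birth_month,birth_year,grade
1001,04,2001,10
1002,,2001,10
1003,07,1999,11
"""
for row_number, problem in check_student_list(sample):
    print(f"row {row_number}: {problem}")
```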

A.4 Efforts to Identify Duplication

A number of international comparative studies already exist to measure achievement in science, mathematics, and reading, including the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS). The Program for the International Assessment of Adult Competencies (PIAAC), administered in 2012, measures the reading literacy, numeracy, and problem-solving skills of adults. In addition, the United States has been conducting its own national surveys of student achievement for more than 40 years through the National Assessment of Educational Progress (NAEP) program. PISA differs from these studies in several important ways:

Content. PISA is designed to measure “literacy” broadly, while other studies, such as TIMSS and NAEP, have a strong link to curriculum frameworks and seek to measure students’ mastery of specific knowledge, skills, and concepts taught in schools. The content of PISA is drawn from broad content areas, such as understanding, using, and reflecting on written information for reading, in contrast to more specific curriculum-based content such as decoding and literal comprehension. Moreover, PISA differs from other assessments in the tasks that students are asked to do. PISA focuses on assessing students’ knowledge and skills in science, reading, and mathematics literacy in the context of everyday situations. That is, PISA emphasizes the application of knowledge to everyday situations by asking students to perform tasks that involve interpretation of real-world materials as much as possible. A study based on expert panelists’ reviews of mathematics and science items from PISA, TIMSS, and NAEP reported that PISA items required multi-step reasoning more often than either TIMSS or NAEP.3 The study also showed that PISA mathematics and science literacy items often involved the interpretation of charts and graphs or other “real world” material. These tasks reflect the underlying assumption of PISA: as 15-year-olds begin to make the transition to adult life, they need to know not only how to read, or know particular mathematical formulas or scientific concepts, but also how to apply this knowledge and these skills in the many different situations they will encounter in their lives. The computer-based assessments add additional “real world” tasks, given the predominance of technology in the lives of young adults and the workplace.

Age-based sample. The goal of PISA is to represent outcomes of learning rather than outcomes of schooling. By placing the emphasis on age, PISA intends to show not only what 15-year-olds have learned in school, but also outside of school and over the years, not just in a particular grade. In contrast, NAEP, TIMSS, and PIRLS are all grade-based samples: NAEP assesses students in grades 4, 8, and 12; TIMSS assesses students in grades 4 and 8 (and, occasionally, grade 12); and PIRLS assesses students in grade 4. PISA thus seeks to show the overall yield of an education system and the cumulative effects of all learning experiences. Focusing on students at age 15 provides an opportunity to measure broad learning outcomes while all students are still required to be in school across the many participating nations. Finally, because years of education vary among countries, choosing an age-based sample makes comparisons across countries somewhat easier than a grade-based sample.

Information collected. The kind of information PISA collects also reflects a policy purpose slightly different from the other assessments. PISA collects only background information related to general school context and student demographics. This differs from other international studies such as TIMSS, which collects background information related to how teachers in different countries approach the task of teaching and how the approved curriculum is implemented in the classroom. The results of PISA will certainly inform education policy and spur further investigation into differences within and between countries, but PISA is not intended to provide direct information about improving instructional practice in the classroom. The purpose of PISA is to generate useful indicators to benchmark performance more broadly and inform education policy.

Alternate sources for these data do not exist. This study represents the U.S. participation in an international study involving approximately 75 countries and jurisdictions in the PISA 2018 field test in spring of 2017 and the main study in fall of 2018. The United States must collect the same information, using the same instruments and procedures, at the same time as the other nations for purposes of making valid and meaningful international comparisons. No other study in the United States will be using the instruments developed by the OECD, and thus no alternative sources of comparable data are available.

A.5 Minimizing Burden for Small Entities

No small entities are part of this sample. The school sample for PISA will contain small-, medium-, and large-size schools from a wide range of school types, including private schools, and burden will be minimized wherever possible for all institutions participating in the data collection. For example, the selection of schools to be assessed in the PISA 2018 field test (spring 2017) will avoid overlap with the selection of schools for NAEP, TIMSS, or the Teaching and Learning International Survey (TALIS; an international survey of teachers and their professional lives), which will also be in the field in the spring of 2017. Schools included in the field test will have a low likelihood of being included in the main study. Student burden will be reduced through the use of multiple forms of the assessment and student background questionnaire; in the field test this will allow PISA to try out new background items or differing versions of items without adding to administration time. In addition, contractor staff will assume as much of the organization and administration work as possible within each school, undertaking all test administration and also assisting with parental notification, sampling, and other tasks.

A.6 Frequency of Data Collection

This request to OMB is for recruitment and pre-assessment activities for the PISA 2018 field trial, the administration of the field trial (spring 2017), and recruitment of the main study school sample. The main study data collection will occur in fall of 2018 and will be cleared under a subsequent request. PISA is conducted on a 3-year cycle as prescribed by the OECD, and adherence to this schedule is necessary to establish consistency in survey operations among the many participating countries.

A.7 Special Circumstances

The special circumstances identified in the Instructions for Supporting Statement do not apply to this study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The nature of the study requires this, because international studies typically are developed as a cooperative enterprise involving all participating countries. PISA 2018 is being developed and operated under the auspices of the OECD by a consortium of organizations. Key persons from these organizations who are involved in the design, development and operation of PISA 2018 are listed below.

Organization for Economic Cooperation and Development

Andreas Schleicher, Indicators and Analysis Division

2, rue André Pascal, 75775 Paris Cedex16, FRANCE, Tel: +33 (1) 4524 9366, Fax: +33 (1) 4524 9098


Educational Testing Service

Irwin Kirsch, Project Director, ETS Corporate Headquarters

660 Rosedale Road, Princeton, NJ 08541 USA, Tel: 1-609-921-9000, Fax: 1-609-734-5410


Westat

Keith Rust, Director of Sampling

1600 Research Boulevard, Rockville, Maryland 20850-3129 USA, Tel: 301 251 8278, Fax: 301 294 2034

A.9 Payments or Gifts to Respondents

Currently, the minimum response rate targets required by the OECD are 85 percent of original schools and 80 percent of students, while the NCES statistical standards require a minimum response rate target of 85 percent at the student level. Historically, these high response rates have been difficult to achieve in school-based studies. The United States failed to reach the school response rate targets in all previous PISA administrations (2000, 2003, 2006, 2009, 2012, and 2015) and had to adjust incentives upward in the middle of the recruitment and data collection period in order to meet minimum response rate requirements. Gaining sufficient student cooperation is also challenging. The United States has historically met the NCES target rate of 85 percent of students responding; however, this takes a great deal of effort. Student response rates exceeded the NCES requirement by 6 percentage points in PISA 2006, by 2 in PISA 2009, and by 4 in PISA 2012. Unweighted results from 2015 (the most recent round of data collection) suggest that the U.S. student response rate is 89 percent, as it was in 2012. The monetary incentives, particularly for school coordinators, helped maintain the student response rates. School coordinators indicated that the incentives were meaningful to them as well as to the students, and field staff reported hearing the same from school coordinators and students.

NCES is using a multi-pronged approach to address the challenge of gaining school and student cooperation and learn as much as possible during the field test about how to achieve acceptable participation rates. First, our PISA contractor will review the most recent PISA 2015 experience to understand where possible improvements can be made in materials and communication with schools. Staff with experience working on the National Assessment of Educational Progress (NAEP), PISA, other international assessments, other large-scale data collections, and with expertise in effective approaches to school recruitment will provide input so that strategies can be identified for achieving high response rates and serve as an ongoing source of ideas and feedback. We will also continue the use of effective incentives. The proposed amounts are described below and are based on the amounts used in PISA 2012 and PISA 2015.

Schools. Schools participating in PISA will receive $250. In order to meet the minimum school response rates mandated by the PISA international governing board, and to thank the school for accommodating the disruption, we believe it is necessary to offer schools this incentive to encourage participation.

School coordinators. The school coordinator will be offered $200. The role of the school coordinator is critical for the success of the study. The coordinator is expected to coordinate logistics with the data collection contractor; supply a list of eligible students and teachers for sampling to the data collection contractor; communicate with teachers, students, and parents about the study to encourage participation; assist the test administrator in ensuring that the sampled students attend the testing sessions; and assist the test administrator in arranging for make-up sessions as needed.

Teachers. The field test will implement a teacher questionnaire delivered as an online instrument. Up to 25 teachers will be selected from each school. As in the Teaching and Learning International Survey (TALIS), selected teachers will be offered $25 for completing the questionnaire.

Students. The student burden in PISA 2018 will be the same as in the previous rounds of PISA in 2012 and 2015, and, as in previous data collection cycles, all participating students will be offered $25. Unlike PISA 2012 and 2015, where students were subsampled for additional, optional assessments and assessed in an additional session after the core assessment, the assessment design for PISA 2018 will allow all students to be assessed in a single, core assessment session. This means that the added incentive offered to students attending the second session in 2012 and 2015 will not be necessary in 2018.

Additionally, students participating in the assessment during non-school hours (after school or on a Saturday) will be offered $35. This accommodation is offered in the main study when it is not possible to find a suitable time within school hours, and it is exercised rarely. The increased incentive amount is designed to thank students for traveling to the assessment site and potentially missing outside-of-school activities (e.g., work, sports) in order to participate in the assessment outside of school hours. Incentives for students will only be provided with the explicit permission of the school principal.

All student incentives will be offered directly to the student. Parents will be informed of the amount of the payment the students will receive in the consent form/letter in advance of the assessment. The payments will likely take the form of a personal check; this was the method used in both the field test and the main study in the previous rounds of PISA (2009, 2012, and 2015), and it worked well.

In the rare cases where state or school district laws or labor contracts do not allow school staff to receive incentives for participating in PISA, the school or school district will be offered the total amount of incentives that would have otherwise been distributed to the individual respondents. Student incentives are usually not affected by these laws and will be distributed directly to students. NCES and its contractor will work with schools to determine when this option will need to be implemented.
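
As a rough worked example of the redirected amount, the sketch below totals the staff incentives for a typical field test school. The per-school teacher count is an assumption inferred from the averages in table A.1 (1,750 teachers across 70 field test schools); actual counts will vary by school.

```python
# Worked example (assumption-laden): the per-school staff incentive total that
# could be redirected to a school or district when staff cannot accept payments.
# The teacher count is inferred from Table A-1 (1,750 teachers / 70 schools,
# i.e., about 25 per school); it is an illustrative assumption only.
SCHOOL_PAYMENT = 250          # offered to the participating school
COORDINATOR_PAYMENT = 200     # offered to the school coordinator
TEACHER_PAYMENT = 25          # offered per responding teacher
teachers_per_school = 25      # assumed average, from 1,750 / 70

staff_total = COORDINATOR_PAYMENT + teachers_per_school * TEACHER_PAYMENT
print(f"Staff incentives per school that could be redirected: ${staff_total}")
print(f"Including the school-level payment: ${staff_total + SCHOOL_PAYMENT}")
```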

A.10 Assurance of Confidentiality

Procedures for handling confidential aspects of the study in PISA 2018 will mirror those used in past administrations of PISA. Expertise in data security and confidentiality was a significant criterion in the selection of the PISA 2018 contractor. The plan for maintaining confidentiality includes signed confidentiality agreements and notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers; personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured, operator-manned in-house computing facility.

Letters and other materials will be sent to parents and school administrators describing the voluntary nature of this survey. The material sent will include a brochure that describes the study and conveys the extent to which respondents and their responses will be kept confidential (copies of letters to be used in the field test and the brochure text are included in Appendix A). The following statement will appear on the front cover of the questionnaires (the phrase “gather the data needed, and complete and review the information collection” will not be included on the student questionnaire):

Your answers will be combined with answers from other [respondent type] to calculate totals and averages. All information you provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Science Reform Act of 2002 (ESRA 2002), 20 U.S. Code, Section 9573].

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary survey is 1850-0755. The time required to complete this survey is estimated to average XX minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the survey. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this survey, or any comments or concerns regarding the status of your individual submission of this survey, please write to: Program for International Student Assessment (PISA), National Center for Education Statistics, Potomac Center Plaza, 550 12th Street, SW, Washington, DC 20202.

OMB No. 1850-0755, Approval Expires 09/30/2019.

Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. No school or individual names or addresses will be included on these files or documentation.

NCES understands the legal and ethical need to protect the privacy of the PISA respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure risk analysis of the PISA 2018 data when preparing the data files for use by researchers, in compliance with 20 U.S.C., § 9573. Schools with high disclosure risk will be identified and, to ensure that individuals cannot be identified from the data files, a variety of masking strategies will be used. These strategies include swapping data; omitting key identification variables (e.g., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or zip code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files. IES’s Disclosure Review Board (DRB) carefully reviews all datasets prior to release to ensure that disclosure risks have been properly addressed. The PISA 2018 data will be reviewed and approved by the DRB prior to any public release, as has been the protocol for all previous rounds of PISA.
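
The sketch below illustrates, in generic form, two of the masking ideas mentioned above: collapsing a continuous variable into categories, and swapping values between a small share of records. It is not NCES’s actual disclosure-avoidance procedure; all cut points and rates are illustrative assumptions.

```python
# Illustrative sketch only: NCES's actual disclosure-avoidance procedures and
# parameters are not reproduced here. This shows two generic masking ideas the
# text mentions: collapsing a continuous variable into categories, and swapping
# a variable's values between a small random subset of records.
import random

random.seed(1)

def collapse(value, cut_points=(100, 500, 1000)):
    """Map a continuous value (e.g., school enrollment) to a coarse category."""
    for k, cut in enumerate(cut_points, start=1):
        if value < cut:
            return k
    return len(cut_points) + 1

def swap(records, key, swap_rate=0.05):
    """Swap the value of `key` between randomly paired records for a small share."""
    masked = [dict(r) for r in records]
    n_pairs = max(1, int(len(masked) * swap_rate / 2))
    chosen = random.sample(range(len(masked)), 2 * n_pairs)
    for a, b in zip(chosen[::2], chosen[1::2]):
        masked[a][key], masked[b][key] = masked[b][key], masked[a][key]
    return masked

schools = [{"id": i, "enrollment": random.randint(50, 2000)} for i in range(20)]
for s in schools:
    s["enrollment_category"] = collapse(s["enrollment"])
masked = swap(schools, "enrollment_category")
print(masked[:3])
```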


A.11 Sensitive Questions

PISA 2018 does not include questions usually considered to be of a sensitive nature, such as items concerning religion, substance abuse, or sexual activity. Several items in the background questionnaires may be considered sensitive by some of the respondents, such as the socioeconomic context of the school, parents’ education and occupation, family possessions, and students’ belongings. Research indicates that the constructs these items represent are strongly correlated with academic achievement, and they have been used in the six previous cycles of PISA (2000, 2003, 2006, 2009, 2012, and 2015) as well as in a number of other national and international studies. These items are considered essential for the anticipated analyses and to retain consistency in planned comparisons with the international data.

A.12 Estimates of Burden

This package requests approval for the field test recruitment and data collection, and for the main study recruitment activities, which are to begin early in the fall of 2017, after the completion of the field test. Burden estimates are shown in table A.1. The time required for students to respond to the assessment (cognitive items) portion of the study, and the associated directions, is marked with an asterisk in table A.1 and is not included in the totals because it is not subject to the PRA. Student, administrator, and teacher field test questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent consent materials, and arranging assessment space. They also include the estimated burden associated with school district staff reviewing and processing special handling district research application materials. Burden estimates for the main study (calculated under the scenario of the United States administering the core components and the international optional teacher questionnaire) are also provided for information purposes in table A.1.

The burden to respondents for the PISA 2018 field test is calculated as the estimated time required for special handling district staff to review and process application requests to conduct PISA 2018 in schools under their jurisdiction, and for students and school staff (school administrators and school coordinators) to complete recruitment, pre-assessment, and assessment activities in 70 schools across the nation (see table A.1). In addition, Puerto Rico has indicated its intent to participate in PISA 2018, as it did in 2015. Puerto Rico expects to fund its own participation and to administer computer-based instruments. Because Puerto Rico previously participated in PISA using paper-based instruments (in PISA 2012 and 2015) and is moving to computer-based administration, it must participate in the field test and will assess students in 60 schools, as indicated in table A.1. Puerto Rico will administer reading, mathematics, science, and global competence, but not financial literacy. It will administer the standard school and student questionnaires but not the ICT questionnaire or teacher questionnaire.

The total response burden for districts and schools in the field test is based on the following:

  • In 70 schools for the national sample: a 45-minute school questionnaire for 70 school administrators; a 45-minute teacher questionnaire for 1,400 teachers; 90 minutes for 70 school administrators during the field test recruitment process; and an average of 4 hours for 70 school coordinators to (a) coordinate logistics with the data collection contractor, (b) supply a list of eligible students and teachers for sampling to the data collection contractor, (c) communicate with teachers, students, and parents about the study to encourage participation, (d) assist the test administrator in ensuring the sampled students attend the testing sessions, and (e) assist the test administrator in arranging make-up sessions as needed.

  • In 60 schools in Puerto Rico: a 45-minute school questionnaire for 60 school administrators; 90 minutes for 60 school administrators during the field test recruitment process; and an average of 4 hours for 60 Puerto Rican school coordinators performing the same duties described above for school coordinators in the national sample.

  • We estimate that there may be 9 special handling districts in the field trial sample – those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Estimated burden hours for special handling districts are included in table A.1 under “Special Handling Districts IRB Staff Approval” and “Special Handling Districts IRB Panel Approval.” Contacting special districts begins with updating district information based on what can be gleaned from online sources, followed by calls to verify the information about where to send the completed required research application forms, and, if necessary, to collect contact information for this process. During the call, inquiry is also made about the amount of time the districts spend reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy processes for approval. To allow sufficient time for special districts’ review processes, this operation will begin upon receiving OMB’s approval, and continue until we receive final approval or denial of our request from each contacted district, up until April 30, 2017.

For the field test, the total student response burden on 5,525 students is based on 2,975 students from the national field test sample taking (a) the 30-minute computer-based core background questionnaire, (b) an additional 15-minute ICT questionnaire, and (c) a 15-minute set of financial literacy questionnaire items; and 2,550 students taking the 30-minute core questionnaire in Puerto Rico.
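
The student questionnaire burden figures in table A.1 follow from this arithmetic: respondents equal the sample size times the expected response rate, and hours equal respondents times minutes divided by 60, rounded row by row. A minimal check of that calculation:

```python
# A minimal check of the student burden arithmetic behind Table A-1 for the
# field test: respondents = sample x expected response rate, and hours =
# respondents x minutes / 60, rounded per row as in the table.
rows = [
    # (label, sample, response_rate, minutes)
    ("US core questionnaire", 3500, 0.85, 30),
    ("US ICT questionnaire", 3500, 0.85, 15),
    ("US financial literacy items", 3500, 0.85, 15),
    ("Puerto Rico core questionnaire", 3000, 0.85, 30),
]

total_hours = 0
for label, sample, rate, minutes in rows:
    respondents = round(sample * rate)          # e.g., 3,500 x 0.85 = 2,975
    hours = round(respondents * minutes / 60)   # per-row rounding, as tabled
    total_hours += hours
    print(f"{label:32s} {respondents:5d} respondents, {hours:5d} hours")

print(f"Total student questionnaire burden: {total_hours} hours")  # 4,251, matching Table A-1
```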

The main study burden estimates are shown below the field test estimates. Although NCES has yet to receive a firm commitment from any state interested in PISA, the main study burden estimates reflect burden for the inclusion of up to 3 states (TBD) and Puerto Rico. The main study burden estimates will be updated following the field test as final decisions by states and territories are made. Other than for Puerto Rico, state-level burden estimates are not shown for the field test because state participation is considered a national option that uses the same assessment design as the national sample; PISA therefore does not require a field test of state-level samples (the U.S. national field trial stands in for state-level participation).

For the main study, burden for recruitment and pre-assessment activities is based on an average of 90 minutes per school administrator, 4 hours per school coordinator, 2 hours per special handling district IRB staff member to process and review the PISA application to conduct the study in their school(s), and 1 hour per special handling district IRB panel member to discuss and respond to the PISA application. The total main study recruitment burden requested here includes school administrators and school coordinators participating in the PISA national and state samples, accounting for the possible participation of up to three states and Puerto Rico.

Table A-1. Burden estimates for PISA 2018 field test and main study

Respondent group | Sample | Expected response rate | Number of respondents | Number of responses | Burden per respondent (minutes) | Total burden (hours)

FIELD TRIAL—Based on core + international options

Recruitment and Pre-Assessment Activity (includes Puerto Rico)
School Administrator (US sample) | 70 | 1.00 | 70 | 70 | 90 | 105
Special Handling Districts IRB Staff Approval (US sample) | 9 | 1.00 | 9 | 9 | 120 | 18
Special Handling Districts IRB Panel Approval (US sample) | 54 | 1.00 | 54 | 54 | 60 | 54
School Administrator (Puerto Rico sample) | 60 | 1.00 | 60 | 60 | 90 | 90
School Coordinator (US sample) | 70 | 1.00 | 70 | 70 | 240 | 280
School Coordinator (Puerto Rico sample) | 60 | 1.00 | 60 | 60 | 240 | 240

School Administrator
Questionnaire (US sample) | 70 | 1.00 | 70 | 70 | 45 | 53
Questionnaire (Puerto Rico sample) | 60 | 1.00 | 60 | 60 | 45 | 45

Teachers
Questionnaire (US sample) | 1,750 | 0.80 | 1,400 | 1,400 | 45 | 1,050

Total School Burden Field Trial |  |  | 1,723 | 1,853 |  | 1,935

Student

US national sample
Directions* | 3,500 | 0.85 | 2,975 | 2,975 | 10 | 496
Assessment* | 3,500 | 0.85 | 2,975 | 2,975 | 120 | 5,950
Student questionnaire (main questionnaire) | 3,500 | 0.85 | 2,975 | 2,975 | 30 | 1,488
Student questionnaire (ICT questionnaire) | 3,500 | 0.85 | 2,975 | 2,975 | 15 | 744
Student questionnaire (Financial Literacy questionnaire) | 3,500 | 0.85 | 2,975 | 2,975 | 15 | 744

Puerto Rico sample
Directions* | 3,000 | 0.85 | 2,550 | 2,550 | 10 | 425
Assessment* | 3,000 | 0.85 | 2,550 | 2,550 | 120 | 5,100
Student questionnaire (main questionnaire) | 3,000 | 0.85 | 2,550 | 2,550 | 30 | 1,275

Total Student Burden Field Trial |  |  | 5,525 | 11,475 |  | 4,251

Total Burden Field Trial |  |  | 7,248 | 13,328 |  | 6,186

MAIN STUDY—Based on core + international options

US national sample

Recruitment and Pre-Assessment Activity
School Administrator | 205 | 1.00 | 205 | 205 | 90 | 308
Special Handling Districts IRB Staff Approval (US sample) | 30 | 1.00 | 30 | 30 | 120 | 60
Special Handling Districts IRB Panel Approval (US sample) | 180 | 1.00 | 180 | 180 | 60 | 180
School Coordinator | 205 | 1.00 | 205 | 205 | 240 | 820

School Administrator
Questionnaire | 205 | 1.00 | 205 | 205 | 45 | 154

Teacher
Questionnaire | 5,125 | 0.85 | 4,356 | 4,356 | 30 | 2,178

Student
Directions* | 10,250 | 0.90 | 9,225 | 9,225 | 10 | 1,538
Assessment* | 10,250 | 0.90 | 9,225 | 9,225 | 120 | 18,450
Student questionnaire (main questionnaire) | 10,250 | 0.90 | 9,225 | 9,225 | 30 | 4,613
Student questionnaire (FL questionnaire) | 10,250 | 0.90 | 9,225 | 9,225 | 15 | 2,306
Student questionnaire (ICT questionnaire) | 10,250 | 0.90 | 9,225 | 9,225 | 15 | 2,306

State samples (up to 3 states & Puerto Rico)

Recruitment and Pre-Assessment Activity
School Administrator (US states) | 162 | 1.00 | 162 | 162 | 90 | 243
School Administrator (Puerto Rico) | 60 | 1.00 | 60 | 60 | 90 | 90
School Coordinator (US states) | 162 | 1.00 | 162 | 162 | 240 | 648
School Coordinator (Puerto Rico) | 60 | 1.00 | 60 | 60 | 240 | 240

School Administrator
Questionnaire (US states) | 162 | 1.00 | 162 | 162 | 45 | 122
Questionnaire (Puerto Rico) | 60 | 1.00 | 60 | 60 | 45 | 45

Teacher
Questionnaire (US states) | 3,750 | 0.85 | 3,188 | 3,188 | 30 | 1,594
Questionnaire (Puerto Rico) | - | 0.85 | - | - | 30 | -

Student

US states (includes up to 3)
Directions* | 8,100 | 0.90 | 7,290 | 7,290 | 10 | 1,215
Assessment* | 8,100 | 0.90 | 7,290 | 7,290 | 120 | 14,580
Student questionnaire (main questionnaire) | 8,100 | 0.90 | 7,290 | 7,290 | 30 | 3,645
Student questionnaire (FL questionnaire) | 8,100 | 0.90 | 7,290 | 7,290 | 5 | 608
Student questionnaire (ICT questionnaire) | 8,100 | 0.90 | 7,290 | 7,290 | 15 | 1,823

Puerto Rico
Directions* | 3,000 | 0.90 | 2,700 | 2,700 | 10 | 450
Assessment* | 3,000 | 0.90 | 2,700 | 2,700 | 120 | 5,400
Student questionnaire (main questionnaire) | 3,000 | 0.90 | 2,700 | 2,700 | 30 | 1,350

Total School Recruitment Burden - Main Study |  |  | 1,064 | 1,064 |  | 2,589

Total Burden Requested in this Submission |  |  | 8,312 | 14,392 |  | 8,775

* Time for assessment directions and the cognitive assessment is shown for information only and is not included in the burden totals because it is not subject to the PRA.

NOTE: OMB Clearance Requested: Total burden includes all burden associated with conducting the PISA 2018 field test and the recruitment and pre-assessment activities for the PISA 2018 main study. The main study burden estimate is conservatively high because the main study may include up to 3 states and Puerto Rico; however, the per-school burden is held consistent with that of national sample schools because of potential variability between states. Total student burden does not include time for the cognitive assessment and its associated directions. Puerto Rico has expressed intent to participate in PISA 2018 and would be expected to conduct a field trial as it moves from paper-based to computer-based assessment. Participating states are not required to conduct a field trial, as the national field trial is sufficient for this purpose. The estimates for Puerto Rico schools and students have been added to the burden estimates for the field trial and the main study. The teacher questionnaire will not be administered in Puerto Rico.


The hourly rates for secondary school teachers/instructional staff, noninstructional staff/coordinators, and principals ($28.98, $21.34, $44.68 respectively) are based on Bureau of Labor Statistics (BLS) May 2015 National Occupational and Employment Wage Estimates.4 The federal minimum wage of $7.25 is used as the hourly rate for students. For the PISA field test and recruitment for the main study, for the national, Puerto Rico, and state samples, a total of 8,775 burden hours are anticipated, resulting in an estimated burden time cost to respondents of approximately $164,466.
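
The cost figure can be approximately reconstructed from the table A.1 hours and the BLS rates above under one plausible, assumed mapping of respondent groups to rates (for example, costing special handling district staff at the administrator rate). The sketch below is that reconstruction, not an official calculation.

```python
# A minimal reconstruction of the burden cost arithmetic, under one plausible
# mapping of Table A-1 respondent groups to the BLS hourly rates cited in the
# text (the exact mapping, e.g., costing district IRB staff at the
# administrator rate, is an assumption, not stated in the document).
RATES = {
    "students": 7.25,         # federal minimum wage
    "teachers": 28.98,        # secondary school teachers/instructional staff
    "coordinators": 21.34,    # noninstructional staff/coordinators
    "administrators": 44.68,  # principals; district IRB staff assumed here too
}

# Hours drawn from Table A-1 (field test totals plus main study recruitment).
HOURS = {
    "students": 4251,                        # field test student questionnaires
    "teachers": 1050,                        # field test teacher questionnaire
    "coordinators": 520 + 1708,              # field test + main study recruitment
    "administrators": (195 + 98 + 72) + (641 + 240),  # incl. special handling districts
}

total_hours = sum(HOURS.values())
total_cost = sum(HOURS[g] * RATES[g] for g in HOURS)
print(f"Total burden hours: {total_hours:,}")          # 8,775
print(f"Estimated burden cost: ${total_cost:,.2f}")    # about $164,466
```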

A.13 Total Annual Cost Burden

Other than the burden associated with completing the PISA questionnaires and assessments (estimated above in Section A.12), the field test and main study impose no additional cost to respondents.

A.14 Annualized Cost to Federal Government

The cost to the Federal Government for conducting the PISA 2018 field test is estimated to be $2,483,434 over a 2-year period. The total cost to the Federal Government for conducting the PISA 2018 main study is estimated to be $4,407,282 over a 4-year period. These figures are based on the national data collection contract, valued at $6,965,219 from January 2016 to December 2020, which includes costs for the national sample. The estimates also include all estimated direct and indirect costs of the project.

A.15 Program Changes or Adjustments

There is an overall reduction in burden because the last approval was for the full-scale PISA 2015 collection, while this clearance request covers only field test and recruitment activities for PISA 2018.

With regard to content, there are some changes to PISA 2018 from previous rounds of data collection. The main change is that the assessment will focus on reading literacy during this cycle; as a result, the bulk of the items will be reading items, with mathematics and science as the secondary components. The global competence assessment is also new. Additionally, there are minor changes in wording to some of the questionnaire items, and questions that focused on student attitudes toward mathematics or science now focus on attitudes toward reading. The ICT questionnaire is also new for 2018, and each student will take the 15-minute questionnaire. The financial literacy questionnaire has been lengthened to 15 minutes, a change that is new for 2018. Each student will take the standard 30-minute student background questionnaire and the ICT and financial literacy modules.

A.16 Plans for Tabulation and Publication

The PISA 2018 field test is designed to provide a statistical review of the performance of items on the assessments and questionnaires in preparation for the main data collection. The international contractor, ETS, will provide the international instruments to be used in the field test and will report to the participating countries on the results of the field test. Based on the field test results, ETS, with input and agreement from the participating countries, will make final revisions in the survey instruments, materials, and documents in preparation for the main study.

For the main study in 2018, an analysis of the U.S. and international data will be undertaken to provide an understanding of the U.S. national results in relation to the international results. Based on proposed analyses of the international data set by ETS, and the need for NCES to report results from the perspective of an American constituency, a plan is being prepared for the statistical analysis of the U.S. national data set as compared to the international data set. Analysis of the data will include examinations of the reading, mathematics, and science literacy and global competence of U.S. students in relation to their international counterparts, and of the relationships between student performance and student and school background variables.

All reports and publications will be coordinated with the release of information from the international organizing body. Planned publications and reports for the PISA 2018 main study include the following:

General Audience Report. This report will present information on the status of reading, mathematics, and science education among students in the United States in comparison to their international peers, written for a non-specialist, general U.S. audience. This report will present the results of analyses in a clear and non-technical way, conveying how U.S. students compare to their international peers, and what factors, if any, may be associated with the U.S. results.

Survey Operations/Technical Report. This document will detail the procedures used in the main study (e.g., sampling, recruitment, data collection, scoring, weighting, and imputation) and describe any problems encountered and the contractor’s response to them. The primary purpose of the main study survey operations/technical report is to document the steps undertaken by the United States in conducting and completing the study. This report will include an analysis of non-response bias, which will assess the presence and extent of bias due to nonresponse. Selected characteristics of respondent students and schools will be compared with those of non-respondent students and schools to determine whether and how non-respondents differ from respondents along dimensions for which data are available for the nonresponding units, as required by NCES standards.
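
As a generic illustration of this kind of respondent versus non-respondent comparison (not the actual analysis plan, and with made-up frame variables and counts), the sketch below compares the mean of one hypothetical sampling-frame variable across responding and non-responding schools.

```python
# Illustrative sketch only: a generic respondent vs. non-respondent comparison
# of the kind a non-response bias analysis performs. Frame variables, school
# counts, and response probabilities here are made up for illustration.
import random

random.seed(2)

# Hypothetical sampling-frame data: each sampled school has a frame variable
# (here, percent of students eligible for free/reduced-price lunch) and a flag
# for whether it ultimately participated.
frame = [
    {"frl_pct": random.uniform(0, 100), "responded": random.random() < 0.7}
    for _ in range(200)
]

def mean(values):
    return sum(values) / len(values)

respondents = [s["frl_pct"] for s in frame if s["responded"]]
nonrespondents = [s["frl_pct"] for s in frame if not s["responded"]]

diff = mean(respondents) - mean(nonrespondents)
print(f"Respondent mean FRL%:     {mean(respondents):5.1f}")
print(f"Non-respondent mean FRL%: {mean(nonrespondents):5.1f}")
print(f"Difference (a large gap would signal potential bias): {diff:+.1f}")
```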

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of PISA 2018 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:


April - December 2016: Prepare OMB clearance documents, data collection manuals, forms, assessment materials, and questionnaires for the field test

October 2016 - February 2017: Contact and gain cooperation of states, districts, and schools for the field test

March - May 2017: Select student samples and collect field test data

July 2017: Deliver raw data to the international sponsoring organization

August - September 2017: Receive field test report from international sponsors

November 2017: Submit OMB package

September 2017 - September 2018: Prepare for the main study phase / recruit schools

June/July 2018: Summer conference for sampled schools (tentative)

September - November 2018: Collect main study data

March - April 2019: Receive final data files from international sponsors

August - December 2019: Produce General Audience Report and Survey Operations/Technical Report for the United States

A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions are requested to the "Certification for Paperwork Reduction Act Submissions" of OMB Form 83-I.

1 In addition, there is ongoing discussion at the OECD on a possible PISA-PIAAC Linking Study in 2018. At this stage of discussions, it is premature to make any assumptions about what the United States would choose to do if this became a viable option. If study plans progress and the United States is intent on participating, a revised clearance package that includes a description of the linking study and revised burden estimates will be submitted.

2 The materials that will be used in the 2018 main study will be based upon the field test materials included in this submission. Additionally, this submission is designed to adequately justify the need for and overall practical utility of the full study and to present the overarching plan for all of the phases of the data collection, providing as much detail about the measures to be used as is available at the time of this submission. As part of this submission, NCES is publishing a notice in the Federal Register allowing first a 60-day and then a 30-day public comment period. For the final proposal for the full study, after the field test, NCES will publish a notice in the Federal Register allowing an additional 30-day public comment period on the final details of the 2018 main study.

3 Neidorf, T.S., Binkley, M., Gattis, K., and Nohara, D. (2006). Comparing Mathematics Content in the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Program for International Student Assessment (PISA) 2003 Assessments (NCES 2006-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

4 The average hourly earnings of secondary school teachers/instructional staff in the May 2015 National Occupational and Employment Wage Estimates sponsored by the Bureau of Labor Statistics (BLS) is $28.98, of noninstructional staff is $21.34, and of principals/education administrators is $44.68. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/. Occupation codes: Secondary School Teachers (25-2030); Education, Training, and Library Workers, All Other (Elementary and Secondary Schools) (25-9099); and Education Administrators, Elementary and Secondary Schools (11-9032); accessed on April 5, 2016.
