
OECD

PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT (PISA 2015) FIELD TEST AND RECRUITMENT FOR FIELD TEST AND MAIN STUDY



REQUEST FOR OMB CLEARANCE

OMB# 1850-0755 v.14


SUPPORTING STATEMENT PART B




Prepared by:


National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






Submitted:

March 27, 2013





B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe

PISA 2015 assesses students nearing the "end of their compulsory school experience." For international comparability, this is defined as students who are 15 years old and in grade 7 or higher. A range of exact birthdates is specified by the international coordinating committees based on the months in which the data will be collected. In general, students must be between 15 years and 3 completed months and 16 years and 2 completed months of age at the beginning of the test period. The universe for the selection of schools is all types of schools in all states of the United States and the District of Columbia. Within sampled schools, students will be selected for participation by drawing a random sample of the 15-year-old students.
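
The eligibility window can be expressed directly in completed months of age. The sketch below (Python, for illustration only; the example birthdate and test start date are hypothetical, not official PISA dates) applies the rule stated above.

    from datetime import date

    # Illustrative check of the PISA age-eligibility rule described above:
    # at the beginning of the test period, a student must be between
    # 15 years 3 completed months and 16 years 2 completed months old.

    def completed_months(birth: date, reference: date) -> int:
        """Completed months of age at the reference date."""
        months = (reference.year - birth.year) * 12 + (reference.month - birth.month)
        if reference.day < birth.day:  # current month not yet completed
            months -= 1
        return months

    def is_pisa_eligible(birth: date, test_start: date) -> bool:
        age_in_months = completed_months(birth, test_start)
        return 15 * 12 + 3 <= age_in_months <= 16 * 12 + 2

    # Hypothetical example: a test period beginning October 1, 2015.
    print(is_pisa_eligible(date(2000, 5, 14), date(2015, 10, 1)))  # True (15 years, 4 completed months)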

B.2 Statistical Methodology

The Technical Standards for the PISA 2015 main study established by the international governing board include the following:

Standard 1.8. The student sample size must be a minimum of 5,250 assessed students, or all students in the National Defined Target Population.

Standard 1.9. The school sample size must be a minimum of 150 schools or all schools that have students in the National Defined Target Population.

Standard 1.10. The target cluster size is typically 35 PISA-eligible students, which upon agreement can be increased or reduced to a number not less than 20.

Standard 1.11. School response rates must be above 85 percent of sampled schools. If a response rate is below 85 percent, an acceptable response rate can still be achieved through agreed-upon use of replacement schools. PISA establishes three response rate zones: acceptable, intermediate, and not acceptable. “Acceptable” refers to original school response rates above 85 percent and means that the country’s data will be included in all international comparisons. “Not acceptable” refers to original response rates below 65 percent and means that the country’s data will be a candidate for exclusion from international comparisons unless considerable evidence is presented that nonresponse bias is minor. “Intermediate” refers to original school response rates between 65 and 85 percent and means that a decision on whether or not to include the country’s data in comparisons must be made taking into account a variety of factors, such as student response rates and quality control. In addition, schools with less than 50 percent student participation are not considered participating schools, and neither those schools nor the students who did participate are counted in the calculation of response rates. (An illustrative classification of these response rate zones follows the standards below.)

Standard 1.12. The overall student response rates must be above 80 percent of sampled students.

In addition, NCES has a standard that the student response rate should be at least 85 percent, and the sampling design described below is based on that rate.
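
For illustration only, the sketch below (Python) classifies an original school response rate into the three zones described in Standard 1.11; the treatment of the exact boundary values (65 and 85 percent) is an assumption of this sketch, and the example rates are hypothetical.

    # Illustrative classification of an original school response rate into the
    # PISA zones in Standard 1.11. Handling of the exact 65 and 85 percent
    # boundaries is an assumption of this sketch.

    def response_zone(original_school_rate_pct: float) -> str:
        if original_school_rate_pct > 85:
            return "acceptable"
        if original_school_rate_pct >= 65:
            return "intermediate"
        return "not acceptable"

    def counts_as_participating(student_participation_rate: float) -> bool:
        # Schools with less than 50 percent student participation are not
        # counted as participating schools.
        return student_participation_rate >= 0.5

    print(response_zone(88.0))             # acceptable
    print(response_zone(72.0))             # intermediate
    print(counts_as_participating(0.45))   # False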

Overview

The design for this study will be a self-weighting, stratified, two-stage design using probability proportional to size (PPS) sampling. There will be no oversampling of schools or students. Schools will be selected in the first stage with PPS, and students will be sampled in the second stage, yielding overall equal probabilities of selection.
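
The self-weighting property follows from multiplying the selection probabilities of the two stages. The sketch below (Python, with hypothetical measures of size) shows that a first-stage PPS draw combined with a fixed within-school student sample yields the same overall selection probability for every student.

    # Illustrative check of the self-weighting property of the two-stage design.
    # All measures of size (MOS) below are hypothetical.

    school_mos = [120, 80, 200, 150, 60]   # estimated 15-year-olds per school
    n_schools = 2                          # schools drawn PPS at the first stage
    students_per_school = 42               # fixed second-stage student sample

    total_mos = sum(school_mos)
    for mos in school_mos:
        p_school = n_schools * mos / total_mos        # first-stage PPS probability
        p_student_within = students_per_school / mos  # second-stage probability
        # Constant for every school: n_schools * students_per_school / total_mos
        print(round(p_school * p_student_within, 4))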

Target Populations

The PISA target population is 15-year-old students in grade 7 and higher attending education institutions located within the United States. The plan is to implement the survey in the fall of 2015, with a field test in the spring of 2014. The specific definition of age eligibility that will be used in the survey is “…between 15 years and 3 (completed) months to 16 years and 2 (completed) months at the beginning of the testing window.”

Sampling Frame of Schools

The population of schools for PISA 2015 is defined as all schools containing any 15-year-olds in grades 7 through 12. As in previous PISA cycles, the school sampling frame will be developed from the most up-to-date NCES Common Core of Data (CCD) and Private Schools Survey (PSS). For the PISA 2015 field test, we will use the school sampling frame prepared for the National Assessment of Educational Progress (NAEP) 2014, which used the 2011-2012 CCD and the 2011-2012 PSS school data. To the degree possible, we will avoid schools sampled for NAEP and TIMSS, which will be administered in high schools in the 2013-2014 school year.

The grade structure of the school is a key stratification variable for reducing sampling error, and this is especially so in PISA because data analyses have shown that achievement is highly related to grade. Other stratification variables may include public/private status, region of the country, location (urban/suburban/town/rural, etc.), and enrollment by race/ethnicity.

Field Test

International standards do not require a formal probability sample of schools for the PISA field test. It is sufficient that the sample of schools be representative of a broad range of schools from across the United States. The field test requires a minimum student sample of 1,944 students. The United States plans to select a sample of 60 schools, each with one substitute school, with the expectation that 60 schools will ultimately participate, to provide an adequate participating student sample. Among the 60 schools, 54 will be public schools and 6 will be small private schools. This allows for school and student nonresponse and also for school-level and within-school exclusions.

The KeyQuest sampling software provided by the consortium will be used to select the student sample in each school. The target cluster size will be 42 students per school, with the goal of assessing 36 students per school.
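
The arithmetic behind these field test targets is restated below for convenience (Python; the figures come directly from the text above).

    # Field test yield arithmetic (figures from the text above).

    field_test_schools = 60
    sampled_per_school = 42        # target cluster size
    assessed_per_school = 36       # assessment goal per school
    international_minimum = 1944   # minimum field test student sample

    expected_assessed = field_test_schools * assessed_per_school
    print(expected_assessed)                                     # 2160
    print(expected_assessed >= international_minimum)            # True
    print(round(assessed_per_school / sampled_per_school, 2))    # 0.86 implied within-school response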

To obtain a school sample that is broadly representative of schools across the United States, we will target a convenience sample of schools with grade 9 and above and enrollment of at least 50 students in grades 9 and 10 (where most 15-year-olds are found), excluding schools with grades 7 and 8 only, small public schools, and any schools sampled for other educational studies in 2011 (such as NAEP and TIMSS). We will use the stratification characteristics used in previous PISA cycles, including census region, locality (city/urban fringe/town/rural MSA), school type (public/private), grade span, and minority enrollment. We will also use stratification to separate subgroups of schools by test mode (computer-based assessment [CBA] plus paper-based, or CBA only). The sample will be a stratified systematic sample, with sampling probabilities proportional to measures of size, where the measure of size is the estimated number of 15-year-olds.
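
As an illustration of the systematic PPS selection described above, the sketch below (Python) cumulates a hypothetical measure of size along a sorted frame and selects schools at a fixed interval from a random start; the frame data are invented for the example.

    import random

    # Illustrative systematic PPS selection: sort the frame by the stratification
    # variables, compute a sampling interval from the total measure of size (MOS),
    # and select the school whose cumulative MOS first reaches each selection point.
    # All frame data are hypothetical.

    frame = [  # (school_id, estimated number of 15-year-olds), already sorted
        ("A", 300), ("B", 120), ("C", 80), ("D", 450), ("E", 150),
        ("F", 90), ("G", 210), ("H", 60), ("I", 340), ("J", 200),
    ]
    n_sample = 4

    total_mos = sum(mos for _, mos in frame)
    interval = total_mos / n_sample
    start = random.uniform(0, interval)
    selection_points = [start + k * interval for k in range(n_sample)]

    selected, cumulative, k = [], 0.0, 0
    for school_id, mos in frame:
        cumulative += mos
        while k < n_sample and selection_points[k] <= cumulative:
            selected.append(school_id)
            k += 1

    print(selected)  # four school IDs, e.g. ['A', 'D', 'G', 'I']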

Mode Effects Study

One of the purposes of the field test is to conduct a mode effects study to test the effects of moving from a paper-based assessment to a fully computer-based assessment by examining the invariance of the item parameters across the two assessment modes. The mode effects study will use intact item clusters from the four domains moving to the computer-based mode (science, mathematics, reading, and financial literacy), since these clusters will be used in both the paper-based and computer-based assessments. This is necessary in order to measure trends across previous rounds that used paper-based assessments.

In particular, the field test will include 18 paper-based forms that cover science, mathematics, and reading, as well as two financial literacy forms. The tasks from these 20 forms will be reworked into an equivalent set of 20 computer-based forms.

Financial Literacy

The United States is again participating in the optional financial literacy assessment in 2015. However, the 2015 assessment design creates concerns about the feasibility of conducting the financial literacy assessment efficiently in schools. Students sampled for financial literacy will be a subsample of the original student sample and will be required to return for a second session administered on a second day. This design differs from PISA 2012, when an expanded sample was used to assess financial literacy in the same session in which mathematics, science, and reading were assessed. Requiring students to return for a second session on another day could be viewed by schools as an unnecessary burden on schools and students, and could result in higher rates of school refusals to participate in PISA. Moreover, sampled students may not return for that second session, which could mean insufficient student response rates for the financial literacy assessment. Given this, NCES wants to evaluate the feasibility of including the financial literacy assessment in PISA 2015 at two points: (1) after the field test and (2) during the school recruitment period for the main study. After the field test, NCES will evaluate the following:

    • the student participation rate for the financial literacy assessment sessions in the field test to determine if the rate is sufficient;

    • the response rates in the 39 schools conducting the mode effects study in the field test (those fielding financial literacy and thus requiring a second session for students) compared with the 21 schools that will not field financial literacy and thus will not require students to return for a second session; and

    • feedback from school administrators in the field test schools about the perceived burden of this second testing session.

NCES will use this information to determine if administering financial literacy in the main study is feasible and would not negatively affect overall school response rates. If NCES proceeds with financial literacy in the main study, and we learn during school recruitment for the main study that schools object to the undue burden of the second testing session for students, NCES will offer schools the option of not including this second testing session. If the number of schools opting out reaches the point where an adequate student sample would not be possible, NCES will drop the financial literacy assessment for all schools in the main study.

Teacher Questionnaire

The United States will also field a teacher questionnaire in the field test. The teacher questionnaire will be given to a sample of up to 10 science teachers and 15 non-science teachers eligible to teach the modal grade (grade 10) within each school and will be delivered online. Burden rates are provided in part A.12.

To evaluate whether to field the teacher questionnaire in the main study, we will look at several criteria; a simple illustrative check of the quantitative thresholds follows this list:

  • School reaction to the inclusion of the teacher questionnaire. While gaining cooperation from field test schools, we will gather school administrators’ reactions to the inclusion of the teacher questionnaire.

  • Response rate. We will evaluate the teacher response rate and the degree to which we have within-school response rates of at least 50%, which is the OECD’s minimum within-school response rate for teachers in TALIS (Teaching and Learning International Survey).

  • Degree of completeness of the instruments. Since the questionnaire for the main study will be shorter, at 30 minutes, we will examine whether respondents completed the full instrument or at least 50 percent of it.

  • Burden on teachers. In our debriefings with schools, we will collect reactions to the burden experienced by the respondents.

  • Amount of follow-up with teachers. We will evaluate the amount of staff time required to communicate and follow up with respondents to obtain completed questionnaires.
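
A simple illustrative check of the two quantitative thresholds above, a within-school teacher response rate of at least 50 percent and instruments that are at least 50 percent complete (Python; the function and the example figures are hypothetical):

    # Illustrative check of the quantitative criteria above. Names and example
    # figures are hypothetical.

    def meets_quantitative_criteria(teachers_sampled: int, teachers_responded: int,
                                    items_answered: int, items_total: int) -> bool:
        response_rate = teachers_responded / teachers_sampled
        completeness = items_answered / items_total
        return response_rate >= 0.5 and completeness >= 0.5

    # Example: 25 teachers sampled, 14 responded; 40 of 60 items answered on average.
    print(meets_quantitative_criteria(25, 14, 40, 60))  # True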

Main Study

The international minimum number of completed assessments for the core computer-based assessments in science, mathematics, reading, and collaborative problem solving is 5,250 students in 150 schools. In PISA, the United States typically assesses between 5,600 and 5,900 students in 165 schools. Assuming the same response level as in PISA 2012, our initial target is a total sample of about 240 schools to yield about 165 participating schools. To achieve the target final school response rate, we will use replacement schools to complete the sample.

The student-per-school target for the core assessment is at least 36 completed student assessments per school. Assuming a within-school response rate of 85 percent (rates were 85 percent in 2000, 82 percent in 2003, 91 percent in 2006, and 86 percent in 2009), the original sample size of students within schools will be 42. Should any states participate in the 2015 assessment, each state would have a sample of 50 schools and 2,100 students to yield 1,890 assessed students.1 Like the states, Puerto Rico will also have a sample size of 50 schools and 2,100 students.
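
The arithmetic behind these main study targets is restated below (Python; figures come from the text above and footnote 1).

    # Main study yield arithmetic (figures from the text above and footnote 1).

    students_sampled_per_school = 42
    within_school_response = 0.85
    print(round(students_sampled_per_school * within_school_response))   # 36 assessed per school

    state_students_sampled = 2100
    state_response_rate = 0.90   # expected response rate in the states (footnote 1)
    print(int(state_students_sampled * state_response_rate))             # 1890 assessed students per state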

Nonresponse Bias Analysis, Weighting, Sampling Errors

Nonresponse will inevitably occur at both levels: school and student. We will analyze the nonrespondents and provide information about whether and how they differ from the respondents along dimensions for which we have data on the nonresponding units, as required by NCES standards. After the international contractor calculates weights, sampling errors will be calculated for a selection of key indicators, incorporating the full complexity of the design, that is, clustering and stratification.
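
A minimal sketch of the kind of comparison involved is shown below (Python; the data, variable names, and the single frame characteristic are hypothetical, and the actual analysis will follow NCES statistical standards).

    # Minimal nonresponse bias sketch: compare respondents and nonrespondents on a
    # characteristic available on the sampling frame (here, school enrollment).
    # All data are hypothetical.

    frame = [  # (enrollment, responded)
        (520, True), (310, False), (740, True), (450, True),
        (290, False), (630, True), (380, True), (510, False),
    ]

    def mean(values):
        return sum(values) / len(values)

    all_enrollments = [e for e, _ in frame]
    respondents = [e for e, responded in frame if responded]
    nonrespondents = [e for e, responded in frame if not responded]

    print(f"respondent mean enrollment:    {mean(respondents):.1f}")
    print(f"nonrespondent mean enrollment: {mean(nonrespondents):.1f}")
    print(f"respondent mean minus frame mean: {mean(respondents) - mean(all_enrollments):.1f}")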

B.3 Maximizing Response Rates

Our approach to maximizing school and student response rates in the main study includes the following:

  • Use of a fall test administration, to avoid conflicts with state testing;

  • Selecting and notifying schools at least a year in advance;

  • Communicating with state officials early in the process and applying a more proactive approach with states to gain assistance with sampled schools;

  • Assigning personal recruiters for specific schools;

  • Incentives for schools, school coordinators, teachers, and students (see Section A9);

  • Contact with schools and school coordinators at set intervals throughout the year preceding the assessment;

  • A tentative plan for a summer conference for representatives from sampled schools several months before the data collection (main study only) to inform them about PISA and keep them engaged in the study; and

  • Provision of school-level results on PISA.

These approaches are based on recommendations from an NCES panel and experience with previous PISA administrations.

B.4 Purpose of Field Test and Data Uses

Participation in the field test is an international requirement for participating in the PISA 2015 main study. The main focus of the field test is to collect enough assessment data to perform reliable tests of the items. However, during the field test, procedures for conducting the main study, including recruitment methods for obtaining school and student participation, will also be evaluated. This information will be used to (a) determine the main study design and the international options in which the United States will participate, and (b) improve our recruiting strategies and materials for the main study.

B.5 Individuals Consulted on Study Design

Many people at OECD, ETS, and other organizations around the world have been involved in the design of PISA. Some of the lead people are listed in section A8. Overall direction for PISA is provided by Holly Xie and Dana Kelly, the PISA National Project Managers at the National Center for Education Statistics, U.S. Department of Education.

1 This is based on an expected response rate in the states of 90 percent; the response rate in the states has historically been slightly higher than for the national sample. This is attributable to the fact that states recruit and manage the participation of the schools and the students.


