Progress in International Reading Literacy Study (PIRLS 2021) Main study data collection
OMB# 1850-0645 v.15
Supporting Statement Part B
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
November 2020
The PIRLS 2021 field test completed data collection in early spring 2020. The respondent universe for the field test was all students enrolled in grade 4, with a mean age of at least 9.5 years, during the 2019-2020 school year. The 2021 main study data collection was originally scheduled for the parallel population in spring 2021; however, due to the COVID-19 pandemic and under the guidance of IEA, the U.S. main study assessment period will be delayed, and students will be assessed at the beginning of fall 2021 for school year 2021-2022, instead of spring 2021 for school year 2020-2021. As a result, the respondent universe for the PIRLS 2021 main study will be all students enrolled in grade 5, with a mean age of at least 10.5 years, during the 2020-2021 school year. The universe for school selection is all public schools in 15 populous states for the field test, and all public and private schools across all 50 states and the District of Columbia for the main study.
The field test selected a sample of 45 schools, with the goal of obtaining participation from a minimum of 40 schools. The main study will select a nationally representative sample of 290 schools, with the goal of obtaining participation from a minimum of 243 schools. Within sampled schools, approximately 1,650 students for the field test and 9,280 students for the main study will be selected for participation by drawing a random sample of two classes (or including all eligible students if the school has two or fewer eligible classes). All selected students in the field test will participate in an electronically administered PIRLS using the digitalPIRLS assessment platform. Of the 9,280 students in the main study, 7,280 will participate in the electronically administered digitalPIRLS, and 2,000 will take PIRLS on paper in order to allow for a bridge analysis with the paper-based trend lines. Only intact classes of grade 4 students for the field test and grade 5 students for the main study will be assessed. School administrators and teachers of the selected classrooms will also be asked to complete questionnaires.
PIRLS requires that participating countries meet participation rate requirements based on school, classroom, and student participation, as described below (a brief illustrative sketch of how these rates combine follows the list).
A minimum school participation rate of 85%, based on sampled schools;
A minimum classroom participation rate of 95%, computed across the participating sampled schools and replacement schools; and
A minimum student participation rate of 85%, computed across the participating sampled schools and replacement schools
OR
A minimum combined school, classroom, and student participation rate of 75%, based on sampled schools (although classroom and student participation rates include replacement schools)
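For illustration only, the sketch below shows how these criteria could be checked, under the common convention that the combined rate is the product of the school, classroom, and student participation rates. The function and figures are hypothetical and are not the official IEA computation routines or actual PIRLS data.

```python
# Hypothetical illustration of the PIRLS participation-rate criteria.
# The combined rate is taken here to be the product of the three
# component rates; consult the PIRLS technical documentation for the
# authoritative definitions.

def meets_participation_requirements(school_rate, classroom_rate, student_rate):
    """Return True if either participation criterion is satisfied.

    Rates are proportions in [0, 1]. school_rate is before replacement;
    classroom_rate and student_rate include replacement schools.
    """
    criterion_a = (school_rate >= 0.85
                   and classroom_rate >= 0.95
                   and student_rate >= 0.85)
    combined = school_rate * classroom_rate * student_rate
    criterion_b = combined >= 0.75
    return criterion_a or criterion_b

# Hypothetical rates: 80% of schools, 97% of classrooms, 94% of students.
# Criterion A fails (schools below 85%) and the combined rate is about 0.73,
# so neither requirement is met.
print(meets_participation_requirements(0.80, 0.97, 0.94))  # False
```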
The most significant challenge in recruitment for PIRLS has been engaging schools and gaining their cooperation. Because intact classrooms are selected, student participation is not as great a challenge; historically, student participation rates have never fallen below 90 percent (see table 1). That said, it is important to U.S. PIRLS that students are engaged and try hard on the assessment.
Table 1. Historical PIRLS school and student participation rates
Year | School participation rate, before replacement (percent) | School participation rate, after replacement (percent) | Overall student participation rate (percent)
2016 PIRLS | 76 | 92 | 94
2016 ePIRLS | 74 | 89 | 90
2011 | 80 | 85 | 96
2006 | 57 | 86 | 95
2001 | 61 | 86 | 96
Field Test Sampling Plan and Sample
The purpose of the PIRLS field test is to try out the new PIRLS delivery system, assessment items, and background questionnaire (BQ) items, and to ensure that the classroom and student sampling procedures proposed for the main study are successful. In selecting a school sample for this purpose, it is important to minimize the burden on schools, districts, and states while also ensuring that the field test data are collected effectively.
As required by the PIRLS International Study Center, the field test sample must consist of at least 45 schools, with approximately 1,650 students selected for participation from a random sample of two classes in each school, which is estimated to yield a minimum of 1,400 assessed students. A probability sample of schools is not required for the field test because the field test is designed only to test items, questions, and procedures. However, the sample must include a broad range of schools covering such characteristics as public schools (including charter schools), large and small schools, urban and rural schools, and schools from a variety of states.
The field test sample will be drawn before the main study sample, and schools will be selected for the field test from the set of schools that may be included in the main study sample, though the chances of a school being selected for both samples are minimal. We will draw the field test sample from 15 states. (California, Illinois, Virginia, and Georgia are typically among the selected states because of their variation in size and their demographic diversity.) This approach will achieve the desired distribution of schools by region, poverty level, and ethnicity, and will inform the recruitment and data collection process for the main study.
For the field test sampling frame, schools in the selected states will be stratified by state and by high/low poverty1, resulting in 30 strata. Within each stratum, serpentine sorting will be used to order schools by locale (city, suburb, town, and rural), race/ethnicity status (“15 percent or above” or “below 15 percent” Black, Hispanic, Asian and Pacific Islander, and American Indian and Alaska Native students), and fourth-grade enrollment. A purposive sample of 45 schools, allocated equally across the selected states, will be drawn for the field test; schools within states may be chosen purposively so that, to the extent possible, the distribution of field test schools closely aligns with the main study school sampling frame on the margins of the stratification and sort characteristics described above. In addition, we will select the PIRLS field test sample so as to minimize overlap with the NAEP sample. Two replacement schools will be selected for each of the 45 sampled schools from the same stratum, with the same sort characteristics as the corresponding sampled school. Once the field test sample has been selected, a summary of the distribution of the characteristics of the selected schools will be prepared, showing the comparison with the national population of schools.
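To clarify the serpentine sorting step, the following minimal sketch (with hypothetical field names) illustrates how schools within a stratum could be ordered so that neighboring schools on the frame remain as similar as possible. It is an illustration only, not the IEA or Westat sampling software.

```python
# Minimal sketch of serpentine sorting for implicit stratification.
# Schools within a stratum are ordered by locale, then race/ethnicity
# status, then grade enrollment, with the ordering of the deeper keys
# reversed in alternate groups of the preceding key so that adjacent
# schools in the final list are as similar as possible.
# Field names are hypothetical, not the actual frame variables.

from itertools import groupby

def serpentine_sort(schools, keys):
    """Sort a list of school records (dicts) by the given keys in serpentine order."""
    if not keys:
        return list(schools)
    key, rest = keys[0], keys[1:]
    ordered = sorted(schools, key=lambda s: s[key])
    result, flip = [], False
    for _, group in groupby(ordered, key=lambda s: s[key]):
        block = serpentine_sort(list(group), rest)
        result.extend(reversed(block) if flip else block)
        flip = not flip
    return result

# Hypothetical schools in one stratum.
schools = [
    {"locale": "city", "minority_pct_ge_15": 1, "g4_enrollment": 120},
    {"locale": "rural", "minority_pct_ge_15": 0, "g4_enrollment": 45},
    {"locale": "city", "minority_pct_ge_15": 0, "g4_enrollment": 80},
]
ordered = serpentine_sort(schools, ["locale", "minority_pct_ge_15", "g4_enrollment"])
```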
Once the sample of schools has been submitted to and approved by IEA, IEA Hamburg will provide a school data file, including original schools and their replacement schools, in the IEA sampling software (referred to as the Windows Within-School Sampling Software [WinW3S]) prior to field test data collection. Sampled original schools will be contacted to obtain their cooperation and participation. A replacement school will be activated when an original school or level-one replacement school reaches a final participation status of refusal. Section B.3 describes the recruitment approaches and procedures in more detail.
The student sampling procedures for the field test corresponded as closely as feasible to those planned for the main study, so as to try out the operational procedures for student sample selection. The student sample consisted of two randomly selected classes per school, depending on the number of classes available in grade 4 and assuming that the school had at least two classes. Each participating school was asked to submit an exhaustive list of classes (that is, a list that accounts for each student in the grade exactly once). Smaller classes were combined to form “pseudo-classes” for the purposes of sampling. Once the list of classes was submitted, we used a sampling algorithm in the sampling software provided by the IEA to select two classes (or pseudo-classes) with equal probability. The student sample then consisted of all students in the selected classes.
Once cooperation was obtained from the schools, the class and student lists were collected from the participating schools electronically using a secure electronic filing process (as explained in Part A). Electronic filing provides advantageous features, such as efficiency and data quality checks. Schools accessed the electronic filing system through the secure MyPIRLS website. School coordinators were asked to provide school information, class lists, and student lists, along with associated teacher information, through the e-filing system. Parental notification materials (i.e., Facts for Parents about PIRLS and sample parent/guardian notification letters; see Appendices A1 and A2) were provided to the school coordinators to send to the parents or guardians of the sampled students in selected classes. The school coordinators collected parental consent forms and submitted them to the PIRLS staff.
The process of collecting teacher and student-teacher linkage information has been streamlined to improve user flexibility and efficiency and reduce burden for the data collection process. School coordinators were asked to provide teacher information through the MyPIRLS website on the “Submit Class List” page during the e-filing process by entering a complete and current list of all of their school’s fourth-grade classes, including the student roster for each listed class as well as the name and email address of the associated reading teacher. Excel templates of the student list were posted on the “Submit Student List” page for the school coordinators to provide student information and link teachers to the students of each class listed on the “Submit Class List” page. This data collection procedure eliminates the need for the previously used Student-Teacher Linkage Form (STLF). Furthermore, information about the associated teachers is securely stored in the e-filing system, which is connected to the database for the MyPIRLS website and is updated with teacher questionnaire participation status on a regular basis, making it easy and efficient for test administrators to track questionnaire status, and eliminating the need for the Teacher Tracking Form (TTF). This data collection approach centralizes the information to be shared with the school coordinators in one secured online location through the MyPIRLS website, rather than having multiple forms the school coordinators have to verify and confirm at different stages of the data collection phase.
Westat staff and the test administrators worked with the school coordinators to monitor the participation status for the school and teacher questionnaire completion through the MyPIRLS website. Automated reminder emails were sent, as needed, to schools and teachers who had not yet begun to complete their questionnaires. School coordinators were instructed to notify the Westat PIRLS team or the test administrators in cases where principals or teachers of the sampled classes were unable to complete the questionnaire. Principals could appoint a member of school staff (typically the assistant principal or the school coordinator) to complete the school questionnaire as described by IEA in the survey operation procedures.
Main Study Sampling Plan and Sample
Because of the fall assessment schedule and under the guidance of IEA, the school sample design for the main study must be a probability sample of schools that fully represents the entire fifth-grade population in the United States. At the same time, to ensure maximum participation, it must be designed to minimize overlap with other NCES studies involving student assessment that will be conducted around the same time. The main study will take place in the fall of 2021, in the school year following the NAEP 2021 assessment. NAEP 2021 will assess several thousand schools nationally at grades 4 and 8. To be fully representative, the PIRLS grade 5 sample may include some schools that will have participated in the main NAEP 2021 assessment at grade 4; however, this number will be kept to a minimum.
In order to assess the minimum required 5,000 students from 150 schools for digitalPIRLS, plus 1,500 students from 50 schools for the paperPIRLS bridge study, we will sample 290 schools and about 9,280 students. For each original sampled school, two replacement schools will also be identified. The sampling frame will be obtained from the most current versions of NCES’s Common Core of Data (CCD) and Private School Universe Survey (PSS) files, restricted to schools having grade 5 and excluding schools in Puerto Rico, the U.S. territories, and Department of Defense overseas schools. The sample will be stratified according to school characteristics such as public/private status, Census region, and poverty status2 (as measured by the percentage of students in the school receiving free or reduced-price lunch under the National School Lunch Program [NSLP]). This will ensure an appropriate representation of each type of school in the selected sample of schools. The process used to determine school eligibility, student eligibility, and student sampling is described below.
Schools will be selected with probability proportional to size (PPS), where the measure of size is based on the estimated number of students at grade 5. A PPS design ensures that all students have an approximately equal chance of selection, because the same number of students will be selected from each school regardless of school size. It also improves cost-efficiency by increasing the number of students per school.
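The following minimal sketch illustrates systematic PPS selection down a sorted frame, assuming a hypothetical enrollment field. It is not the actual sampling program and omits refinements such as certainty selections for schools whose size exceeds the sampling interval.

```python
# Minimal sketch of systematic PPS school selection (illustration only).
# Assumes the frame has already been stratified and serpentine sorted.
import random

def pps_systematic_sample(schools, n_sample, size_key="g5_enrollment"):
    """Select n_sample schools with probability proportional to size
    by stepping through the cumulative measure of size at a fixed interval."""
    total = sum(s[size_key] for s in schools)
    interval = total / n_sample
    start = random.uniform(0, interval)  # random start within the first interval
    selections, cumulative, next_hit = [], 0.0, start
    for school in schools:
        cumulative += school[size_key]
        # A school is selected each time the cumulative size passes a hit point.
        while cumulative > next_hit and len(selections) < n_sample:
            selections.append(school)
            next_hit += interval
    return selections
```

In practice, very large schools whose measure of size exceeds the sampling interval would be taken with certainty and handled separately, which this sketch does not show.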
Once cooperation is obtained from the schools, the class and student lists will be collected from the participating schools electronically using a secure electronic filing process (as explained in Part A). Electronic filing provides advantageous features, such as efficiency and data quality checks. Schools will access the electronic filing system through the secure MyPIRLS website. School coordinators will be asked to provide school information, class lists, and student lists along with associated teacher information through the e-filing system. Parental notification materials (i.e., Facts for Parents about PIRLS and sample parent/guardian notification letters; see Appendices A1 and A2) will be provided to the school coordinators to send to the parents or guardians of the sampled students in selected classes. The school coordinators will collect parental consent forms and submit them to the PIRLS staff.
The process of collecting teacher and student-teacher linkage information has been streamlined to improve user flexibility and efficiency and reduce burden for the data collection process. School coordinators are asked to provide teacher information through the MyPIRLS website on the “Submit Class List” page during the e-filing process by entering a complete and current list of all of their school’s fifth-grade classes, including the student roster for each listed class as well as the name and email address of the associated reading teacher. Excel templates of the student list are posted on the “Submit Student List” page for the school coordinators to provide student information and link teachers to the students of each class listed on the “Submit Class List” page. This data collection procedure eliminates the need for the previously used Student-Teacher Linkage Form (STLF). Furthermore, information about the associated teachers is securely stored in the e-filing system, which is connected to the database for the MyPIRLS website and is updated with teacher questionnaire participation status on a regular basis, making it easy and efficient for test administrators to track questionnaire status, and eliminating the need for the Teacher Tracking Form (TTF). This data collection approach centralizes the information to be shared with the school coordinators in one secured online location through the MyPIRLS website, rather than having multiple forms the school coordinators have to verify and confirm at different stages of the data collection phase.
Student sampling will be accomplished by randomly selecting up to two classes per school, depending on the number of classes available in grade 5 and assuming that the school has at least two classes. Smaller classes will be combined to form “pseudo-classes” for the purposes of sampling. The sampling algorithm in the sampling software provided by the IEA will be used to select the classes for each school. Each sampled school will be asked to prepare an exhaustive list of grade 5 classes, that is, a list that includes each grade 5 student in the school in exactly one of the listed classes.
The main study will implement the same data collection procedures used in the field test, using the secure e-filing system through the MyPIRLS website. The school coordinators will be able to submit the class and student lists via the secure e-filing system. Any class with fewer than ten students will be combined with another class to form a “pseudo-class” of at least ten students. Two classes (or pseudo-classes) will be selected from each school with equal probability (unless only one class is available), and all students in those classes or pseudo-classes will be included in the sample. If a school has only one class, then all students in the grade will be included in the sample.
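The class sampling step described above can be illustrated with the minimal sketch below, which combines small classes into pseudo-classes and selects two (pseudo-)classes with equal probability. The combining rule and data layout are simplified assumptions for illustration, not the WinW3S implementation.

```python
# Minimal sketch of pseudo-class formation and equal-probability class
# selection (illustration only; the operational rules may differ).
import random

MIN_CLASS_SIZE = 10  # classes smaller than this are combined into pseudo-classes

def form_pseudo_classes(classes):
    """Combine small classes so every sampling unit has at least MIN_CLASS_SIZE students."""
    units, pending = [], []
    for cls in classes:
        if len(cls) >= MIN_CLASS_SIZE:
            units.append(list(cls))
        else:
            pending.extend(cls)
            if len(pending) >= MIN_CLASS_SIZE:
                units.append(pending)
                pending = []
    if pending:  # attach any leftover small group to an existing unit
        if units:
            units[-1].extend(pending)
        else:
            units.append(pending)
    return units

def sample_students(classes, n_classes=2):
    """Select up to n_classes (pseudo-)classes with equal probability and
    return all of their students."""
    units = form_pseudo_classes(classes)
    chosen = units if len(units) <= n_classes else random.sample(units, n_classes)
    return [student for unit in chosen for student in unit]

# Hypothetical school with four grade 5 classes (students shown as ID numbers).
classes = [list(range(1, 23)), list(range(23, 31)),
           list(range(31, 55)), list(range(55, 70))]
selected_students = sample_students(classes)
```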
Westat staff and the test administrators will work with the school coordinators to monitor the participation status for the school and teacher questionnaire completion through the MyPIRLS website. Automated reminder emails will be sent, as needed, to schools and teachers who have not yet begun to complete their questionnaires. School coordinators will notify the Westat PIRLS team or the test administrators in cases where principals or teachers of the sampled classes are unable to complete the questionnaire. Principals may appoint a member of school staff (typically the assistant principal or the school coordinator) to complete the school questionnaire as described by IEA in the survey operation procedures.
Nonresponse Bias Analysis, Weighting, and Sampling Errors
It is inevitable that nonresponse will occur at both levels: school and student. We will analyze the nonrespondents and provide information about whether and how they differ from the respondents along dimensions for which we have data for the nonresponding units, as required by NCES standards. After the calculation of weights, sampling errors will be calculated for a selection of key indicators incorporating the full complexity of the design, that is, clustering and stratification (see Appendix B for more detail).
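For context, the sketch below illustrates one standard way to estimate sampling errors under a clustered, stratified design: a paired jackknife (jackknife repeated replication) for a weighted mean, with schools grouped into variance strata of two. This is a simplified illustration only and is not the exact replication scheme specified for PIRLS in Appendix B.

```python
# Simplified sketch of paired-jackknife (JRR) variance estimation for a
# weighted mean. Within each variance stratum of two schools, one school
# is dropped and the other's weights are doubled to form a replicate.
# Illustration only; not the PIRLS production weighting/variance code.

def weighted_mean(records):
    """records: list of (weight, value) tuples."""
    total_w = sum(w for w, _ in records)
    return sum(w * y for w, y in records) / total_w

def jrr_variance(strata):
    """strata: list of (school_a, school_b) pairs, where each school is a
    list of (weight, value) student records."""
    full = [rec for pair in strata for school in pair for rec in school]
    theta = weighted_mean(full)
    variance = 0.0
    for i, _ in enumerate(strata):
        replicate = []
        for j, (school_a, school_b) in enumerate(strata):
            if j == i:
                # Drop school_a and double school_b (one of the two possible replicates).
                replicate.extend((2 * w, y) for w, y in school_b)
            else:
                replicate.extend(school_a + school_b)
        variance += (weighted_mean(replicate) - theta) ** 2
    return variance
```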
Student Data Collection Procedures
Data collection protocols for digitalPIRLS assessment administration are the same for both the field test and the main study. Students begin the data collection activities by entering a room containing desks and PIRLS tablets with keyboards. Upon entering, each student takes a seat where they will see a student log-in card (see Appendices A1 and A2) containing that student’s unique log-in information. The first screen that students see is the PIRLS student log-in screen (see Appendix C), which acts as a portal for all PIRLS student data collection activities. The PIRLS test administrator verbally instructs the students to log in, and students then use their log-in information to begin. The log-in information is saved and does not need to be re-entered as long as students stay active; however, if a student takes a break longer than 10 minutes, the computer will time out and the student will have to re-enter their log-in information.
There are four sections to each digitalPIRLS student administration: Directions, Assessment 1, Assessment 2, and the Student Questionnaire. Immediately after students initially log in, they are directed to the Directions password screen, which has a password entry box. (See Appendix C for the Student Questionnaire password screen; the password screens are all of a similar design.) The session administrator verbally gives all students the same password, and students then enter that password to begin receiving the Directions for PIRLS. When students complete that section, they are directed to another password screen, this time for Assessment 1. Again, the administrator verbally gives the appropriate password to allow access to that block, and all students begin together. At the end of each section, students are directed to the password screen for the next section. This design allows students to take short breaks while still beginning each section together.
For students who will participate in the bridge analysis of the main study, the PIRLS assessment administration will follow data collection procedures similar to those for the digitalPIRLS administration described above, except that students will be given paper booklets with student ID labels instead of student log-in cards.
B.3 Maximizing Response Rates
The recruitment process begins with notification of Chief State School Officers and test directors about the sample of schools in their state, followed by notification of districts and then schools. Each notification letter includes audience-specific information about PIRLS and about participating in PIRLS. Following the initial notification, NAEP State Coordinators and/or experienced school recruiters contact schools that have not yet agreed to participate to provide additional details about PIRLS and their schools’ involvement. If a school indicates that it does not wish to participate in the study, recruiters will ask why the school is refusing and may seek to address specific issues (e.g., inconvenient date, concerns about privacy) before considering the school a final refusal. These follow-up communications continue until a school has reached a final participation status (i.e., cooperating or refusing).
When a school agrees to participate in PIRLS, we request that the school designate a School Coordinator to facilitate the school’s participation. The School Coordinator will notify teachers and students in selected classrooms about PIRLS, explain the importance of participating, and explain test day activities. Additionally, the School Coordinator may be asked to encourage sampled teachers to complete the teacher questionnaire.
Refusals may occur at the state, district, and school levels. When a school reaches a final participation status of refusal, a substitute school is activated. The activation of a substitute school triggers the notification process. If a state or district refuses to participate, no substitute schools from the same state/district will be activated. Since all substitute schools are within states with original schools in the sample, this notification process only includes district notifications (when applicable) and school notifications.
Our approach to maximizing school recruitment is to:
Obtain endorsements about the value of PIRLS from relevant organizations;
Work with NAEP state coordinators;
Inform Chief State School Officers and test directors about the sample of schools in their state. Enclose a sample letter of endorsement they can send to schools;
Send letters and informational materials to schools and districts. These letters will be customized by the type of school;
Train experienced school recruiters about PIRLS;
Implement strategies from NAEP’s Private School Recruiting Toolkit. This toolkit, developed for NAEP, includes well-honed techniques used to recruit a very challenging type of school;
Follow up mailings with telephone calls to explain the study and schools’ involvement, including placing the PIRLS assessment date on school calendars;
Offer schools $200 for participation and, as a second-tier incentive, $800 to schools that are historically very difficult to recruit (as explained in Part A);
Maintain continued contact until schools have built a relationship with the recruiter and fully understand PIRLS;
Offer a $100 incentive to the individual at the school identified to serve as the school coordinator; and
Make in-person visits to some schools, as necessary.
Our approach to maximizing student recruitment is to:
Send parental permission forms home to parents. Implied permission is encouraged, but written permission will be collected if required by the school district or school;
Ask teachers to encourage student participation;
Offer participating students a small gift valued at approximately $4. In PIRLS 2016, each participating student received a small wristwatch and a pencil. Comparably valued items will be distributed to participating students for the PIRLS 2021 data collection;
Provide students with a certificate bearing their name, thanking them for participating in and representing the U.S. in PIRLS 2021; and
When feasible, have the test administrator speak to the students prior to the scheduled test day to encourage participation.
Our approach to maximizing teacher recruitment is to:
Send letters and informational materials to teachers;
Provide the option of an electronic or hard-copy questionnaire;
Offer a $20 incentive for participation; and
Have the test administrator speak to the teacher on the day of the student session.
B.4 Purpose of Field Test and Main Study and Data Uses
The U.S. participated in the PIRLS field test in spring 2020 to evaluate new assessment items and background questions and to ensure that the classroom and student sampling procedures proposed for the main study are successful. Prior to the field test, we conducted pretesting sessions in a school environment and in a simulated classroom environment (OMB# 1850-0803 v. 257) to test the draft digitalPIRLS platform as well as the PIRLS delivery system. The pretesting sessions and the field test have provided information that will be used to improve data collection instruments and related systems prior to the main study. The goals of PIRLS are to (1) develop internationally valid instruments for measuring reading literacy suitable for establishing internationally comparable literacy levels in each of the participating education systems; (2) describe on one international scale the literacy profiles of fourth-graders in school in each of the participating education systems; (3) describe the reading habits of fourth-graders in each participating education system; and (4) identify the home, school, and societal factors associated with the literacy levels and reading habits of fourth-graders in school.
PIRLS 2021 is designed to bridge PIRLS from paper-based administrations to computer-based administrations by including a bridge study as part of the main study, which will support the mode transition while allowing PIRLS to report two decades of trend data. Each successive participation in PIRLS provides trend information about student achievement in reading relative to other countries, as well as indicators that show how this achievement relates to demographic and curricular, school, teacher, and student factors that provide the educational context for achievement. Data compiled and collected from PIRLS 2021 will allow for evidence-based decisions to be made for educational improvement. These high-quality, internationally comparative trend data are key in informing education policy discussions.
B.5 Individuals Consulted on Study Design
Overall direction for PIRLS is provided by Dr. Sheila Thompson, National PIRLS Research Coordinator (NCES) and Dr. Stephen Provasnik, International Studies Branch Chief (NCES).
The following persons are responsible for the sampling and statistical design of PIRLS:
Pierre Foy. TIMSS and PIRLS International Study Center, Boston College (617-552-6253); and
Sylvie LaRoche and Ahmed Almaskut from Statistics Canada (613-863-9480).
Sampling, analysis, and reporting will be performed by:
TIMSS and PIRLS International Study Center, Boston College;
the U.S. national contractor for PIRLS, Westat; and
National Center for Education Statistics (NCES), U.S. Department of Education.
1 High poverty schools are defined as having 50% or more students eligible for participation in the National School Lunch Program (NSLP), and low poverty schools have less than 50% of students eligible for NSLP. In addition, in the main study, which includes private schools in the sample, all private schools are classified as low poverty because no NSLP information is available.
2 High poverty schools are defined as having 50% or more students eligible for participation in the National School Lunch Program (NSLP), and low poverty schools have less than 50% of students eligible for NSLP. In addition, in the main study, which includes private schools in the sample, all private schools are classified as low poverty because no NSLP information is available.