
Progress in International Reading Literacy Study (PIRLS 2016) FIELD TEST AND RECRUITMENT FOR MAIN STUDY



REQUEST FOR OMB CLEARANCE

OMB# 1850-0645 v.8




Supporting Statement Part B




Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC



October 2014



  1. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe

The respondent universe for the PIRLS field test is all students enrolled in grade 4 who are at least 9.5 years of age during the 2014-2015 school year. The universe for the selection of schools is all types of schools in five populous states. A sample of 50 schools will be selected for the field test, with the goal of obtaining participation from a minimum of 25 schools. Within sampled schools, students will be selected for participation by drawing a random sample of two classes. Only students in intact classrooms will be assessed at each grade. School administrators and teachers of selected classrooms will also be asked to complete questionnaires.

B.2 Statistical Methodology

Field Test Sampling Plan and Sample

The purpose of the PIRLS field test is to test out new assessment items and background questions, and to ensure that classroom and student sampling procedures proposed for the main study are successful. In selecting a school sample for this purpose, it is important to minimize the burden on schools, districts, and states while also ensuring that the field test data are collected effectively. In addition, the U.S. will implement the new ePIRLS assessment for the first time.

As required by the PIRLS International Study Center, the field test sample is to consist of 25 schools with 800 students assessed. The student samples will be obtained by selecting two classes from each school. As the field test is designed only to test items, questions, and procedures, a probability sample of schools is not required. However, the sample must include a broad range of schools covering such features as public (including charter schools), private, large, small, urban, and rural schools, and schools from a variety of different states.

The field test sample will be drawn after the main study sample, and schools will be selected for the field test from the set of schools not included in the main study sample. We will draw the field test sample from five states – California, Illinois, New York, North Carolina, and Texas – chosen because of their large size and diverse demographics. This will allow for achieving the desired distribution of schools by region, poverty level, and ethnicity, and will inform the recruitment and data collection process for the nation as a whole.

Schools in California, Illinois, New York, North Carolina, and Texas that are not selected into the main study sample will comprise the field test sampling frame and will be stratified by state, high/low poverty,1 and public/private status, resulting in 15 different strata. Serpentine sorting will be used to sort schools by locale (city, suburb, town, and rural), race/ethnicity status (“15 percent or above” or “below 15 percent” Black, Hispanic, Asian and Pacific Islander, and American Indian and Alaska Native students), and fourth grade enrollment within each stratum. A purposive sample of 25 schools, allocated equally across the five states, will be selected for the field test. Within each state, schools will be selected purposively to ensure that, to the extent possible, the proportion of schools in the field test closely aligns with the proportion of schools in the main study school sampling frame on the margins of the stratification and sort characteristics described previously. In addition, we will investigate the possibility of NAEP or other NCES studies taking place in schools around the same time as the field test and will select the PIRLS field test sample so as to minimize overlap with the NAEP sample. Two replacement schools will be selected for each of the 25 sampled schools from the same strata, and will have the same sort characteristics as the corresponding sampled schools. Once the field test sample has been selected, a summary of the distribution of the characteristics of the selected schools will be prepared, showing the comparison with the national population of schools.
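The serpentine sort of the frame within each stratum can be sketched in Python as follows. This is an illustrative sketch only, not the operational sampling program; the field names (`locale`, `enroll`) are assumptions for demonstration. Schools are ordered on the first sort key, and the ordering on each subsequent key reverses direction from one group to the next, so that adjacent schools on the final list are as similar as possible:

```python
def serpentine_sort(schools, keys):
    """Serpentine (boustrophedonic) sort of a list of school dicts.

    Sort on the first key ascending; within each group of that key,
    the ordering on the remaining keys alternates direction from one
    group to the next.
    """
    if not keys:
        return schools
    key, rest = keys[0], keys[1:]
    # Group schools by the value of the leading sort key.
    groups = {}
    for school in sorted(schools, key=lambda s: s[key]):
        groups.setdefault(school[key], []).append(school)
    ordered, reverse = [], False
    for value in sorted(groups):
        block = serpentine_sort(groups[value], rest)
        if reverse:
            block = block[::-1]   # flip direction for alternating groups
        ordered.extend(block)
        reverse = not reverse
    return ordered
```

For example, with keys `["locale", "enroll"]`, enrollment runs ascending within the first locale group and descending within the second, which keeps neighboring schools on the list similar across group boundaries.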

The student sampling procedures for the field test will correspond as closely as feasible to those planned for the main study, so as to try out the operational procedures for student sample selection. One or two classes will be selected per school, depending on the number of classes available at grade 4. Each participating school will be asked to submit an exhaustive list of classes (that is, a list that accounts for each student in the grade exactly once). Smaller classes will be combined to form “pseudoclasses” for the purposes of sampling. Once the list of classes is submitted, we will use a sampling algorithm to select two classes (or pseudoclasses) with equal probability. The student sample will then consist of all students in the selected classes.

We plan to gather class and student lists from participating schools electronically using a secure electronic filing process (as explained in Part A). Electronic filing provides advantageous features such as efficiency and data quality checks. Schools will access the electronic filing system through a web site.

Main Study Sampling Plan and Sample

The school sample design for the main study must be more rigorous than that for the field test. It must be a probability sample of schools that fully represents the entire United States. At the same time, to ensure maximum participation it must be designed so as to minimize overlap with other NCES studies involving student assessment that will be conducted around the same time.

The main study will take place in the spring of 2016, about two months after the Main NAEP 2015 assessment. NAEP will assess several hundred schools nationally, at grades 4, 8, and 12. To be fully representative, the PIRLS sample may include some schools that will have participated in the Main NAEP 2015 at the same grade. However, this number can be kept to a minimum.

The sample size for the PIRLS main study will be 150 schools. For each original sample school, two replacement schools will also be identified. The sampling frame will be obtained from the most current versions of NCES’s Common Core of Data (CCD) and Private School Survey (PSS) files, restricted to schools having grade 4, and eliminating schools in Puerto Rico, U.S. territories, and Department of Defense overseas schools.

The sample will be stratified according to school characteristics such as public/private status, Census region, and poverty status (as measured by the percentage of students in the school receiving free or reduced-price lunch under the National School Lunch Program (NSLP)). This will ensure an appropriate representation of each type of school in the selected sample of schools.

Determining school eligibility, student eligibility, and student sampling will be accomplished as described below.

Schools will be selected with probability proportional to the number of estimated classes at grade 4, with schools expected to have either one or two classes being given the same selection probability. The use of a probability proportional to size (PPS) sample design ensures that all students have an approximately equal chance of selection, since two classes will be selected from each school regardless of the size of the school. Note that we will modify this equal probability design in the following way. So as to increase the available sample size of students in high poverty schools, we will double the probability of selection of each school with at least 50 percent of students eligible for free or reduced-price lunch under NSLP, relative to other schools of the same size.
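The selection scheme just described can be sketched as a systematic PPS draw. This is a minimal illustration under assumed field names (`classes`, `high_poverty`), not the operational sampling program: the measure of size is the estimated number of grade 4 classes, equalized for one- and two-class schools and doubled for high-poverty schools.

```python
import random

def pps_systematic_sample(frame, n, rng=random):
    """Systematic probability-proportional-to-size (PPS) school selection.

    Measure of size (MOS) = estimated grade 4 classes, with one- and
    two-class schools given the same MOS, and the MOS doubled for
    high-poverty schools (>= 50 percent NSLP-eligible).
    """
    def mos(school):
        m = max(school["classes"], 2)              # equalize 1- and 2-class schools
        return 2 * m if school["high_poverty"] else m

    total = sum(mos(s) for s in frame)
    interval = total / n
    start = rng.uniform(0, interval)               # random start in first interval
    points = [start + k * interval for k in range(n)]

    sample, cum, i = [], 0.0, 0
    for school in frame:
        cum += mos(school)
        while i < n and points[i] < cum:           # selection point lands in this school
            sample.append(school)
            i += 1
    return sample
```

In practice a school whose measure of size exceeds the sampling interval can be hit by more than one selection point; operational systems typically set such schools aside as certainty selections before drawing the systematic sample.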



Student sampling will be accomplished by selecting one or two classes per school. Each school will be asked to prepare a comprehensive list of classes that includes each grade 4 student in the school in exactly one of the listed classes. As described above, schools will submit these class and student lists via secure E-filing. Any class with fewer than ten students will be combined with another class to form a “pseudoclass” with at least ten students in it. We will then select one or two classes (or pseudoclasses) from each school, with equal probability, and all students in those classes/pseudoclasses will be included in the sample. If a school has only one class, then all students in the grade will be included in the sample.
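The class-sampling steps above can be sketched as follows. This is an illustrative sketch, assuming class lists keyed by class name and a greedy smallest-first merge rule for forming pseudoclasses; the operational merging rule may differ.

```python
import random

def form_pseudoclasses(classes, minimum=10):
    """Combine classes with fewer than `minimum` students into pseudoclasses.

    `classes` maps class name -> list of student IDs. The two smallest
    sampling units are merged repeatedly until every unit has at least
    `minimum` students (or only one unit remains).
    """
    units = sorted(([name, list(ids)] for name, ids in classes.items()),
                   key=lambda u: len(u[1]))
    while len(units) > 1 and len(units[0][1]) < minimum:
        first, second = units.pop(0), units.pop(0)
        units.append([first[0] + "+" + second[0], first[1] + second[1]])
        units.sort(key=lambda u: len(u[1]))
    return units

def sample_students(units, rng, k=2):
    """Select up to k classes/pseudoclasses with equal probability and
    return every student in the selected units."""
    chosen = rng.sample(units, k=min(k, len(units)))
    return [student for _, students in chosen for student in students]
```

Because whole units are taken, every student in a selected class or pseudoclass enters the sample, and a single-class school contributes all of its grade 4 students.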

Nonresponse Bias Analysis, Weighting, and Sampling Errors

It is inevitable that nonresponse will occur at both levels: school and student. We will analyze the nonrespondents and provide information about whether and how they differ from the respondents along dimensions for which we have data for the nonresponding units, as required by NCES standards. After the calculation of weights, sampling errors will be calculated for a selection of key indicators incorporating the full complexity of the design, that is, clustering and stratification (see Appendix B for more detail).
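As one illustration of the weighting step, the cell-based nonresponse adjustment commonly used in school surveys of this kind can be sketched as follows. This is a generic sketch, not the operational weighting specification; the field names (`prob`, `cell`, `responded`) are assumptions for demonstration.

```python
from collections import defaultdict

def nonresponse_adjusted_weights(units):
    """Base weight = 1 / selection probability; within each adjustment
    cell, respondents' weights are inflated so that they also carry the
    base weight of nonrespondents in the same cell."""
    cell_total = defaultdict(float)   # total base weight in each cell
    cell_resp = defaultdict(float)    # responding base weight in each cell
    for u in units:
        base = 1.0 / u["prob"]
        cell_total[u["cell"]] += base
        if u["responded"]:
            cell_resp[u["cell"]] += base
    weights = {}
    for u in units:
        if u["responded"]:
            adjustment = cell_total[u["cell"]] / cell_resp[u["cell"]]
            weights[u["id"]] = adjustment / u["prob"]
    return weights
```

Sampling errors for the resulting weighted estimates would then be computed with a variance estimator that reflects the stratified, clustered design, as noted above.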

B.3 Maximizing Response Rates

The most significant challenge in recruitment for PIRLS has been engaging the schools and gaining their cooperation. Given that classrooms are selected, student participation is not as great a challenge. Historically, student participation rates have never fallen below 90 percent (see table 1). That said, it is important to U.S. PIRLS that students are engaged and try hard on the assessment.

Table 1. Historical PIRLS school and student participation rates (percent)

              School Participation Rate               Overall Student
Year    Before Replacement    After Replacement     Participation Rate
2011            80                    85                     96
2006            57                    86                     95
2001            61                    86                     96

Our approach to school recruitment is to:

  • Obtain endorsements about the value of PIRLS from relevant organizations;

  • Work with NAEP state coordinators;

  • Inform Chief State School Officers and test directors about the sample of schools in their state. Enclose a sample letter of endorsement they can send to schools;

  • Send letters and informational materials to schools and districts. These letters will be customized by the type of school;

  • Train experienced school recruiters about PIRLS;

  • Implement strategies from NAEP’s Private School Recruiting Toolkit. This toolkit, developed for NAEP, includes well-honed techniques used to recruit a very challenging type of school;

  • Follow up mailings with telephone calls to explain the study and schools’ involvement, including placing the PIRLS assessment date on school calendars;

  • Offer schools $200 for participation;

  • During the field test, offer schools that find the inclusion of ePIRLS too burdensome the option of having some or all of their students complete only the paper-and-pencil version of PIRLS;

  • Maintain continued contact until schools have built a relationship with the recruiter and fully understand PIRLS;

  • Offer a $100 incentive to the individual at the school identified to serve as the school coordinator, plus $50 for running the ePIRLS system check, and assisting with computer setup on the second day of the test administration (these components may be delegated to a school IT coordinator if necessary); and

  • Make in-person visits to some schools, as necessary.

Our approach to student recruitment is to:

  • Send parental permission forms home to parents. Implied permission is encouraged but written permission will be collected if required by the school district or school;

  • Ask teachers to encourage student participation;

  • Offer participating students a small gift valued at approximately $4. In PIRLS 2011, each participating student received a small watch/stop watch that could be clipped securely (with an attached carabiner) to a backpack or belt loop. A similarly priced item will be distributed to participating students for the PIRLS 2016 data collection;

  • Provide each student with a certificate bearing their name, thanking them for participating in and representing the United States in PIRLS 2016; and

  • When feasible, have the test administrator speak to the students prior to the scheduled test day to encourage participation.

Our approach to teacher recruitment is to:

  • Send letters and informational materials to teachers;

  • Provide the option of an electronic or hard-copy questionnaire;

  • Offer a $20 incentive for participation; and

  • Have the test administrator speak to the teacher on the day of the student session.


B.4 Purpose of Field Test and Data Uses

The central goals for the field test are to evaluate new assessment items and background questions, and to ensure that classroom and student sampling procedures proposed for the main study are successful. The U.S. will also implement the ePIRLS assessment and will analyze data from the field test to inform decisions on whether to implement it in the main study.

One of the purposes of the field trial is to test the effects of administering the ePIRLS option under consideration on school and student recruitment and operations. Information gained from the field trial can be useful in weighing the value of the additional education data gained by participation in ePIRLS against the added cost and burden and the risk of not achieving acceptable school and student response rates for inclusion in the international comparisons.

Part of the long-term plan for PIRLS is for it to become an online assessment, eventually moving to fully electronic administration in place of paper delivery. ePIRLS is designed to bridge the reading-for-information portion of PIRLS from paper-based to computer-based administration. The primary value of the U.S. participating in ePIRLS 2016 is to evaluate the feasibility and validity of transitioning from paper to electronic delivery in PIRLS in the context specific to the United States. Additionally, the one-time results comparing student literacy across traditional print and digital formats, both within the U.S. and against other countries, will provide new and valuable information to educators, researchers, and policymakers.

The paper-and-pencil PIRLS assessment will be administered on the first day and ePIRLS on a different day selected by the school. Students will be sampled for ePIRLS from those who complete the paper-and-pencil version of PIRLS 2016; about half the students who participate in PIRLS will be sampled for ePIRLS. The sampled students will be asked to return for the ePIRLS session on a day designated by their school. During the school recruitment period, the need to ask students to return for a second session could be viewed by schools as an unnecessary burden on the school and students, and could result in schools declining to participate in PIRLS. Schools that find ePIRLS too burdensome may opt to have some or all of their students complete only the paper-and-pencil version of PIRLS. Beyond the concern about the impact of ePIRLS on school participation rates, students may also not return for the second session, which could result in insufficient student response rates for the ePIRLS assessment to be viable. NCES will use the field test results to make the final decision on whether to administer ePIRLS as part of the PIRLS 2016 main study. The decision will be based on the following factors:

  • feedback from school administrators in the field test schools about the perceived burden of the second, ePIRLS testing session (during conversations with sampled field test schools, we will gather any reactions to the inclusion of ePIRLS by school administrators);

  • student participation rate in the field test ePIRLS, and in particular whether at least 50 percent of the students from the initial PIRLS assessment day are available for the second, ePIRLS session; and

  • degree to which students complete ePIRLS in the field test (ePIRLS consists of two 40-minute assessment modules plus a 5-minute questionnaire, and students are allowed to stop at any time).

If NCES proceeds with ePIRLS in the main study, similarly to the field test procedure, NCES may offer those schools that object to the burden of the second testing session the option of including in ePIRLS only one-half or none of the students assessed in the paper-and-pencil PIRLS. If the number of schools opting out of ePIRLS in the main study reaches the point where an adequate student sample would not be possible, NCES will consider dropping ePIRLS for all schools.

B.5 Individuals Consulted on Study Design

Overall direction for PIRLS is provided by Dr. Sheila Thompson, National Research Coordinator, National Center for Education Statistics, U.S. Department of Education.

The following persons are responsible for the statistical design of PIRLS:

  • Pierre Foy. TIMSS and PIRLS International Study Center, Boston College (617-552-6253); and

  • Marc Joncas and Jean Dumais, Statistics Canada (613-951-0007).

Contractors responsible for sampling and data analysis:

  • David Wilson, RTI International (919-541-6990);

  • Patricia Green, RTI International (312-456-5260); and

  • Ben Dalton, RTI International (919-541-7228).

Analysis and reporting will be performed by:

  • TIMSS and PIRLS International Study Center, Boston College;

  • RTI International;

  • Insight Policy Research; and

  • National Center for Education Statistics, U.S. Department of Education.

1 High poverty schools are defined as having 50% or more students eligible for participation in the National School Lunch Program (NSLP), and low poverty schools have less than 50% of students eligible for NSLP. Private schools are all classified as low poverty because no NSLP information is available.


