NCES System Clearance for Cognitive, Pilot, and Field Test Studies 2019-2022

Progress in International Reading Literacy Study (PIRLS) 2021 Field Test Pretest





OMB# 1850-0803 v.257



Volume 1




Submitted by:



National Center for Education Statistics (NCES)

Institute of Education Sciences (IES)

U.S. Department of Education

Washington, DC







October 2019














  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures.

  2. Background and Study Rationale

The Progress in International Reading Literacy Study (PIRLS) is an international assessment of fourth-grade students’ achievement in reading. PIRLS reports on four benchmarks in reading achievement at grade 4 (Advanced, High, Medium, and Low) and on a variety of issues related to the education context of the students in the sample, including instructional practices, school resources, curriculum implementation, and learning supports outside of school. Since its inception in 2001, PIRLS has assessed students every 5 years (2001, 2006, 2011, and 2016); the next administration, PIRLS 2021, will be the fifth iteration of the study. Regular participation by the United States provides data on student achievement and on current and past education policies, and allows U.S. education policies and student performance to be compared with those of its international counterparts. In PIRLS 2016, 58 education systems participated. The U.S. will participate in PIRLS 2021 to continue to monitor the progress of its students compared to that of other nations and to provide data on factors that may influence student achievement.

PIRLS is coordinated by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the assessment framework, the assessment instrument, and the background questionnaires. The IEA agrees upon a common set of standards and procedures for collecting and reporting PIRLS data and defines the study’s timeline, all of which must be followed by every participating country. As a result, PIRLS is able to provide a reliable and comparable measure of student skills across participating countries. In the U.S., the National Center for Education Statistics (NCES) conducts this study, with support from U.S. Department of Education contractors, and works with the IEA to ensure proper implementation of the study and adherence to the IEA’s standards. Participation in PIRLS allows NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the U.S. compared with foreign nations [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C. §9543].

PIRLS 2021 differs from PIRLS 2016 in that it offers a redesigned digital format, referred to as digitalPIRLS to distinguish it from the traditional paper-and-pencil format, paperPIRLS, which will still be available to countries. Because the U.S. will administer the digital version of PIRLS, all discussion of PIRLS 2021 in this submission refers to digitalPIRLS unless otherwise specified. In addition, to reduce the burden on schools and students, the assessment will be administered on one day only (rather than over two days, as was done in 2016 when ePIRLS was administered independently for the first time). Given the transition to a digitally based assessment (DBA) for PIRLS, the IEA will conduct a bridge study between the two PIRLS formats during the main study. This design will require that approximately 1,500 students be administered the paperPIRLS booklets, which will increase the sample size required for the main study.

This request is to conduct a real-world test of the IEA digitalPIRLS Assessment System with students prior to the field test data collection in spring 2020, using the system in the manner in which it will be used in the field test, to help identify system issues, including any router performance issues. The rationale for this study is based on lessons learned, and issues encountered by students, during the electronic administration of TIMSS (also designed by the IEA) in the spring 2018 field test. Students are believed to use and interact with these systems differently than adult quality control (QC) testers do. Conducting pretesting and QC processes with students will allow us to identify existing issues in the PIRLS system and address them prior to the operational use of the system in the spring 2020 field test.

The pretesting will be conducted with grade 4 students, both in a real school classroom environment and in a simulated classroom, using the preliminary final version of the digitalPIRLS Assessment System. Up to three pretesting events will take place during winter 2020: two in-school pretest sessions and one session in a simulated classroom environment. Each session will last approximately 150 minutes.

  3. Recruitment and Data Collection

Recruitment and Sample Characteristics

In-School Pretest:

For the in-school pretest, a state will be asked to participate in the study on a voluntary basis. The NAEP State Coordinator (“Coordinator”) in the volunteer state will recruit schools in their state that are not part of the PIRLS 2021 field test sample (see Appendix A for sample recruitment letters). The Coordinator will leverage relationships within the state to contact schools and identify those willing to participate in the study, and will forward the contact information for participating schools to Westat.

A total of two schools will participate in the study. While the study will seek participation from schools with various demographic characteristics, including students from a mix of racial/ethnic and socioeconomic backgrounds, detailed sampling requirements will not be targeted because of the small number of schools involved. Additionally, schools in relatively close proximity to each other within the state will be selected so that one field administration staff member can conduct the study at all schools in the state. Private schools will not be recruited for this study.

In each school, an intact class of 4th graders will participate in one pretest session (for a total of two 4th-grade classes). Schools may select a class to participate or work with Westat to randomly sample one of their classes in the participating grade. Schools will be asked to provide the number of participating students in the selected class but will not be required to submit student lists or any information about students. PIRLS test administrators will communicate with the school coordinator to determine accommodation needs and discuss whether any students should be excluded from the pretest session. The session will be held in the school, on a date in January 2020 selected in coordination with the school.

Simulated Classroom Pretest:

EurekaFacts, under contract with Westat, will recruit up to 30 students for one session of PIRLS pretesting. The session will be held at the EurekaFacts facility in Rockville, Maryland, tentatively scheduled for Saturday, January 11, 2020.

EurekaFacts will recruit participants for the pretesting study from the District of Columbia, Maryland, Virginia, and West Virginia. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Students will be recruited to meet the following criteria:

  • A mix of race/ethnicity (Black, Asian, White, Hispanic);

  • A mix of socioeconomic background; and

  • A mix of urban/suburban/rural areas.

While EurekaFacts will use various outreach methods (see Appendix B) to recruit students, the bulk of the recruitment will be conducted by telephone, based on EurekaFacts’ acquisition of targeted mailing lists containing residential addresses and landline telephone listings. EurekaFacts will also use a participant recruitment strategy that integrates multiple outreach methods and resources, such as newspaper and internet ads, community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations), and mass media recruitment (e.g., postings on the EurekaFacts website).

Interested students will be screened (see Appendix B) to ensure that they meet the criteria for participation in the pretesting study (i.e., the students are from the targeted demographic groups outlined above and their parents/legal guardians have given consent). When recruiting participants, EurekaFacts staff will speak to the parent/legal guardian of the interested minor before starting the screening process. During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities it entails. After a participant is confirmed as qualified, willing, and available, he or she will receive a confirmation email/letter and phone call. Written informed parental consent (see Appendix B) will be obtained for all respondents who are interested in participating. The simulated classroom pretesting event with students will be held shortly after the preliminary U.S. version of the IEA PIRLS assessment software is released.

The session will last approximately 150 minutes, and it will be structured as follows:

  • During the session, each student will be asked to take the assessment under standard PIRLS assessment conditions (approximately 100 minutes, which includes 10 minutes for directions/tutorial, two 40-minute cognitive sessions for the assessment, and a 5-minute break following each assessment session). Students will also complete the PIRLS student questionnaire, which is estimated to take 40 minutes.

  • A group debrief (up to 10 minutes) will be conducted in each session to solicit feedback from the students. See Volume 2 for the debriefing script.

Data Collection Process

The PIRLS pretesting sessions will mirror PIRLS student data collection procedures. Students begin data collection activities by entering a room containing desks and PIRLS tablets with keyboards. Each digitalPIRLS student administration has four sections: Directions, Assessment 1, Assessment 2, and the Student Questionnaire. For both the simulated classroom and in-school pretests, normal data collection will be enabled by the PIRLS systems, and any errors generated will be collected automatically by the system. Note that student responses will not be scored. In addition to the information recorded by the PIRLS systems, administrators and observers from NCES, Westat, and/or EurekaFacts will monitor the assessments and record notes detailing any issues encountered by the students, as well as what the students were doing at the time each issue occurred. Observers may also ask individual students to clarify the actions they took before an issue or error occurred, with questions such as, “What is the error?”; “What was the last thing you saw before the error?”; “What were you expecting to happen?”; or “What did you do right before the error happened?”. Understanding and documenting what caused a system error is necessary to obtain enough information for staff to replicate the error and develop a fix for it.

The session in the simulated classroom environment may be audio and/or video recorded to capture information regarding any student actions that resulted in system errors or issues. Sessions that occur in schools will not be recorded.

  4. Consultations Outside the Agency

Westat is the contractor for PIRLS and will provide the tablets for students’ use and administer the pretesting study.

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An International Steering Committee has general oversight of the study and each National Research Coordinator participates in extensive discussions concerning the projects, usually with advice from national subject matter and testing experts. In addition, the IEA convened separate panels of experts from around the world to develop cognitive items.

The majority of the consultations (outside NCES) have involved the IEA and, in the United States, the PIRLS International Study Center at Boston College. Key to these ongoing consultations are: Dirk Hastedt (executive director of the IEA), Michael Martin, Ina V.S. Mullis, Pierre Foy, and Jenny Liu, all of whom have extensive experience in developing and operating international education surveys (especially related to PIRLS).

EurekaFacts is located in Rockville, Maryland. It is an established for-profit research and consulting firm, offering facilities, tools, and staff to collect and analyze both qualitative and quantitative data. EurekaFacts is working as a subcontractor for Westat to recruit participants and provide the facilities to be used for the study. In addition, EurekaFacts staff may assist in administering and/or observing some of the study sessions.

  5. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, an effort has been made to avoid asking for information that might be considered sensitive or offensive.

  6. Paying Respondents

Incentives used for the in-school sessions will reflect those planned for the PIRLS 2021 field test. Consistent with prior administrations of TIMSS, schools will receive a check for $200 after the session is completed and students will receive a small gift (e.g., a string backpack with a globe printed on it) valued at approximately $4 as a token of appreciation for their participation.

For the EurekaFacts simulated classroom environment sessions, to encourage participation and to thank students for their time and effort, a $35 gift card from a major credit card company will be offered to each participating student. If a parent or legal guardian brings their student to and from the testing site, they will also receive a $35 gift card to thank them for their time and effort in transporting their child. Similar NAEP studies (e.g., OMB# 1850-0803 v.199) have offered $25 to the student participant and $25 to the parent/guardian for sessions lasting approximately 1.5 hours, and in the ICILS pretest, student participants and parents/guardians were offered $50 for a 3-hour session. Given that this study requires more time than the NAEP sessions and less time than the ICILS session (approximately 2.5 hours), $35 will be offered to both the student and the parent/guardian to aid in recruitment and gain their cooperation in the study.

  7. Assurance of Confidentiality

The study will not retain any personally identifiable information. Prior to the start of the study, students will be notified that their participation is voluntary. As part of the study, students will be notified that the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

In the in-school environment, parents will be notified using the same procedures as in the TIMSS 2019 main study. A notification letter (see Appendix A) will be sent to the parent/legal guardian of each sampled student prior to the pretesting session, and parents/legal guardians will have the opportunity to opt the student out of the session. Before each pretesting session is administered in the simulated classroom environment, written consent will be obtained from the parent/legal guardian of each participating student (see Appendix B).

All participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study, and will be destroyed after the final report is released. Pretesting activities may be recorded using audio and/or screen capture technology. The only identification included on the files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is completed.

  8. Estimate of Hourly Burden

Tables 1, 2, and 3 detail the estimated burden of the various PIRLS pretesting activities.

Estimated Burden for In-School Environment Pretesting

The estimates assume recruitment of 2 schools with grade 4. Within each school, 30 students from a single classroom will be recruited to participate in the study (to yield around 28 students per class), and all parents will be notified. Each student will participate in one session for a total of 150 minutes, including breaks. For estimates of hourly burden in the in-school environment, see Table 1 below.

Table 1. Estimate of Hourly Burden – School Environment

Activity | Sample size | Expected response rate | Number of respondents | Number of responses | Time per respondent (minutes) | Total burden (hours)

In-School Pretest Recruitment
Contacting Districts | 2 | 1.00 | 2 | 2 | 10 | 1
Contacting Schools | 2 | 1.00 | 2 | 2 | 20 | 1
Parental notification | 60 | 1.00 | 60 | 60 | 10 | 10
Pretest Recruitment Burden | - | - | 64 | 64 | - | 12

In-School Pretest Data Collection (Students)
Directions | 60 | 0.93 | 56 | 56 | 10 | 9
Assessment* | 60 | 0.93 | 56 | 56 | 80 | 75
Student questionnaire | 60 | 0.93 | 56 | 56 | 40 | 37
Debrief | 60 | 0.93 | 56 | 56 | 10 | 9
Pretest Data Collection Burden | - | - | 56 | 168 | - | 55

Total burden – In-School Pretest | - | - | 120 | 232 | - | 67

Note: Numbers have been rounded, which may affect totals. *The cognitive item portion of the study (“Assessment”) is not included in the burden totals because it is not subject to the Paperwork Reduction Act (PRA).
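The burden-hour column in Table 1 follows a simple rule: responses × minutes ÷ 60, rounded to the nearest whole hour. A minimal sketch of that arithmetic, assuming (as the small recruitment rows suggest) a one-hour minimum per nonzero activity:

```python
# Sketch of the burden-hour arithmetic behind Table 1.
# Assumption: hours = responses * minutes / 60, rounded to the
# nearest whole hour, with a floor of 1 hour for any activity.

def burden_hours(responses: int, minutes: int) -> int:
    hours = responses * minutes / 60
    return max(1, round(hours))

# (responses, minutes per respondent) -> reported burden hours
rows = {
    "Contacting Districts": (2, 10),    # -> 1
    "Contacting Schools": (2, 20),      # -> 1
    "Parental notification": (60, 10),  # -> 10
    "Directions": (56, 10),             # -> 9
    "Assessment": (56, 80),             # -> 75 (excluded from PRA totals)
    "Student questionnaire": (56, 40),  # -> 37
    "Debrief": (56, 10),                # -> 9
}

for activity, (responses, minutes) in rows.items():
    print(activity, burden_hours(responses, minutes))
```

The per-row results sum to the table's subtotals (12 recruitment hours; 55 data collection hours, excluding the 75 assessment hours not subject to the PRA).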


Estimated Burden for the Simulated Classroom Pretesting

The estimated burden for recruitment assumes attrition throughout the process. Each student will participate in one session for a total of 150 minutes, including breaks. Table 2 details the estimated burden for the simulated classroom pretesting.

Table 2. Estimate of Hourly Burden – Simulated Classroom Environment

Activity | Number of respondents | Number of responses | Time per respondent (minutes) | Total hours

Parent or Legal Guardian for Student: Simulated Classroom Pretest Recruitment
Initial contact | 134 | 134 | 3 | 7
Follow-up via phone | 67* | 67 | 9 | 10
Consent & confirmation | 34* | 34 | 9 | 5
Recruitment Burden | 134 | 235 | - | 22

Simulated Classroom Pretest Data Collection
Directions | 30 | 30 | 10 | 5
Assessment** | 30 | 30 | 80 | 40
Student questionnaire | 30 | 30 | 40 | 20
Debrief | 30 | 30 | 10 | 5
Data Collection Burden | 30 | 90 | - | 30

Total burden – Simulated Classroom Pretest | 164 | 325 | - | 52

* Subset of the initial contact group; assumed retention rates are approximately 50 percent from initial contact to follow-up, 50 percent from follow-up to confirmation, and 90 percent from confirmation to participation.
** The cognitive item portion of the study (“Assessment”) is not included in the burden totals because it is not subject to the Paperwork Reduction Act (PRA).

Note: Numbers have been rounded, which may affect totals.
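The recruitment counts in Table 2 follow from applying the assumed stage-to-stage retention rates to the initial contact group. A minimal sketch, assuming the rates are applied sequentially and fractional counts are rounded up:

```python
# Sketch of the recruitment funnel behind Table 2.
# Assumed retention (per the table footnote): ~50% from initial
# contact to phone follow-up, ~50% from follow-up to consent and
# confirmation, ~90% from confirmation to participation.
import math

def funnel(initial: int, retention_rates: list) -> list:
    """Return the expected count remaining after each stage."""
    counts = [initial]
    for rate in retention_rates:
        counts.append(math.ceil(counts[-1] * rate))
    return counts

stages = funnel(134, [0.5, 0.5, 0.9])
print(stages)  # [134, 67, 34, 31] -- close to the 30 students planned
```

The exact final count depends on the rounding convention; the study plans for 30 participating students out of the 34 confirmed.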


Table 3 details the total estimated burden for the In-School Environment Pretesting and the Simulated Classroom Pretesting sessions.

Table 3. Estimate of Total Hourly Burden

Respondent | Number of Respondents | Number of Responses | Total Hours
In-School Pretesting Subtotal | 120 | 232 | 67
Simulated Classroom Pretesting Subtotal | 164 | 325 | 52
Total Burden | 284 | 557 | 119

Note: Numbers have been rounded, which may affect totals.

  9. Cost to the Federal Government

The total cost to the federal government for this study is $80,800, as detailed in Table 4.

Table 4. Estimate of Costs

Activity | Provider | Estimated Cost

School Environment
Recruiting states, schools, and students | Westat | $4,082
Administering the study | Westat | $34,674

Simulated Classroom Environment
Recruiting students and providing facilities for the study | EurekaFacts | $24,103
Administering the study | Westat | $17,941

Total | | $80,800



  10. Project Schedule

All activities for this study (recruitment, pretesting, data collection, and reporting of results) will begin in November 2019, as soon as OMB approval is granted, and the study will conclude by January 2020.
