Part A PIRLS 2016 Field Test & MS Recruitment

Progress in International Reading Literacy Study (PIRLS 2016) Field Test and Recruitment for Main Study

OMB: 1850-0645





Progress in International Reading Literacy Study (PIRLS 2016) FIELD TEST AND RECRUITMENT FOR MAIN STUDY



REQUEST FOR OMB Clearance

OMB# 1850-0645 v.8




Supporting Statement Part A




Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC



October 2014





TABLE OF CONTENTS



PREFACE


A. JUSTIFICATION





APPENDICES

A: Parental Consent and Respondent Recruitment Materials


B: Non-response Bias Analysis Plan


C: Data Collection Instruments





PREFACE

The Progress in International Reading Literacy Study (PIRLS) is an international assessment of fourth-grade students’ achievement in reading. Since its inception in 2001, PIRLS has assessed students every 5 years (2001, 2006, 2011); the next assessment, PIRLS 2016, will be the fourth iteration of the study. Regular participation by the United States provides data on student achievement and on current and past education policies, and allows comparison of U.S. education policies and student performance with those of its international counterparts. In PIRLS 2011, 53 education systems participated. The United States will participate in PIRLS 2016 to continue to monitor the progress of its students compared to that of students in other nations and to provide data on factors that may influence student achievement.

PIRLS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the assessment framework, the assessment, and the background questionnaires. The IEA agrees upon a common set of standards and procedures for collecting and reporting PIRLS data, and defines the study’s timeline, all of which must be followed by all participating countries. As a result, PIRLS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES) conducts this study and works with the IEA and, for PIRLS 2016, RTI International (under contract with the U.S. Department of Education), to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards. Participation in PIRLS also allows NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [The Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C., Section 9543].

PIRLS results provide four benchmarks in reading achievement at grade 4 (Advanced, High, Medium, and Low). PIRLS also reports on a variety of issues related to the education context for the students in the sample, including instructional practices, school resources, curriculum implementation, and learning supports outside of school.

PIRLS 2016 differs from PIRLS 2011 in that it will not involve the IEA’s bilateral coordination of PIRLS with the International Study Center’s Trends in International Mathematics and Science Study (TIMSS) assessment. The two studies are on different cycles (TIMSS takes place every 4 years) and coincide in the same year only every 20 years (the current TIMSS is being administered in 2015). PIRLS 2016 also includes a new assessment of online reading, ePIRLS, which is designed to help countries understand how successful they are in preparing fourth-grade students to read, comprehend, and interpret online information. The PIRLS 2016 assessment is designed to take place over two consecutive days, with the paper-and-pencil version administered on the first day and ePIRLS administered on the following day.

In preparation for the PIRLS 2016 main study, all countries are asked to implement a field test in 2015. The purpose of the PIRLS field test is to evaluate new assessment items and background questions, to ensure practices that promote low exclusion rates, and to ensure that classroom and student sampling procedures proposed for the main study are successful. In selecting a school sample for this purpose, it is important to minimize the burden on schools, districts, and states, while also ensuring that the field test data are collected effectively. PIRLS staff will also work to help respondents understand the study’s value relative to burden imposed, and to ensure a high level of school participation.

Data collection for the field test will occur from March through April 2015. The United States plans to recruit 25 public schools and assess 800 students. The student samples will be obtained by selecting two classes from each school. The U.S. PIRLS main study will be conducted in the spring of 2016, and will involve a nationally-representative sample of 4,800 students in the target population from 150 schools.

This submission describes the overarching plan for all phases of the data collection, including the 2016 main study. In addition to the supporting statements Parts A and B, Appendix A provides field test recruitment materials consisting of letters to state and district officials and school principals, text for a PIRLS field test brochure, “Frequently Asked Questions,” a “Summary of Activities”, and parental consent materials. Appendix B provides the non-response bias analysis plan, and Appendix C provides the PIRLS background questionnaires.

Because PIRLS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of draft and final questionnaires. In order to meet the international data collection schedule for the spring 2015 field test, recruitment activities are scheduled to begin in November 2014. Recruitment for the main study will begin in March of 2015 to align with recruitment for other NCES studies (e.g., the National Assessment of Education Progress, NAEP), and for schools to put the assessment on their calendars. We expect the main study materials and procedures to be very similar to those used in the field test. Therefore, this submission requests approval for:

  1. recruiting for the 2015 field test and 2016 main study;

  2. conducting the 2015 field test data collection; and

  3. a description of the overarching plan for all of the phases of the data collection, including the 2016 main study.1

One change request may follow this submission. In January 2015, we will submit a change-request memo with the final main study recruitment materials and parental consent letters, along with a description of changes to the design, procedures, and respondent burden for the main study, should any changes be made. In late 2015, we will submit a clearance package, with a 30-day notice published in the Federal Register, which will include the final main study instruments for data collection in February-May 2016. The main study questionnaires will be a subset of the field test instruments.

A. Justification

A.1 Importance of Information

Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. PIRLS provides a unique opportunity to compare U.S. students’ reading knowledge and skills at fourth grade with those of their peers in countries around the world.

The continuation of U.S. participation allows for the study of past and current education policies that have shaped reading achievement over the past 15 years. Furthermore, participating countries are not only able to obtain information about students' knowledge and abilities, but also about the cultural environments, teaching practices, curriculum goals, and institutional arrangements that are associated with student achievement.

PIRLS complements what we learn from national assessments such as the National Assessment of Educational Progress (NAEP) by identifying the strengths and weaknesses of student reading achievement relative to participating countries around the world. It provides valuable benchmarking information about educational policies enacted in other countries and policies that could be applied to U.S. educational practices.

Based on earlier PIRLS data releases, it is likely that the results of these studies will draw great attention in the United States and elsewhere. It is therefore expected that PIRLS will contribute to ongoing national and international debates and efforts to improve the teaching and learning of reading.

A.2 Purposes and Uses of Data

PIRLS assesses reading knowledge and skills at grade 4. PIRLS is designed to align broadly with curricula in the participating countries. The results, therefore, suggest the degree to which students have learned concepts and skills likely to have been taught in school. PIRLS also collects background information on students, teachers, schools, curricula, and official education policies in order to allow cross-national comparison of educational contexts that may be related to student achievement.

Data compiled and collected from PIRLS 2016 allow evidence-based decisions to be made for the purposes of educational improvement. Each successive participation in PIRLS provides trend information about student achievement in reading relative to other countries, as well as indicators that show how this achievement relates to demographic, curricular, school, teacher, and student factors that provide the educational context for achievement. These high-quality, internationally comparable trend data are key in informing education policy discussions.

Through the participation in PIRLS and other international assessment programs, NCES is able to provide comparative indicators on student performance and school practices across countries in order to benchmark U.S. student performance, and to suggest hypotheses about the relationship between student performance and factors that may influence performance as well as areas in which students have strengths or weaknesses. The international studies identify differences among countries over time in instructional practices, school policies, and opportunity-to-learn that can inform discussions about how to improve students’ ability to read.

This collection of data is consistent with the NCES mandate. The enabling legislation of the National Center for Education Statistics [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations." The Educational Sciences Reform Act of 2002 (ESRA 2002: 20 U.S.C., Section 9543) also mandates that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. In addition to being essential for any international perspective on reading knowledge and skills, U.S. participation fulfills both the national and international aspects of NCES' mission.

PIRLS 2016 Components

The reading assessment is organized around a content dimension that specifies the subject matter to be assessed and a cognitive dimension that specifies the thinking processes to be assessed. PIRLS assesses two purposes for reading that fourth-grade students typically engage in: reading for literary experience and reading to acquire and use information. PIRLS also assesses four broad processes of comprehension predominantly used by fourth-grade readers: focusing on and retrieving explicitly stated information; making straightforward inferences; interpreting and integrating ideas and information; and evaluating and critiquing content and textual elements. The PIRLS 2016 framework is similar to the 2011 framework, but has been slightly updated to provide more specificity for item writers and to better reflect current curricula in participating countries. There were no revisions to the content or cognitive domains, nor were there changes to the target percentages for either domain.

Assessment Instruments

To minimize burden and ensure broad subject-matter coverage, PIRLS paper-and-pencil reading tests will use a matrix sampling approach where the reading items are organized into a set of test booklets, with each student receiving only one booklet. This design is consistent with past practice.
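The matrix-sampling idea can be illustrated with a small sketch. The block and booklet counts below are hypothetical, not the actual PIRLS booklet design; the point is only that item blocks are rotated into booklets so every block is covered across the sample while each student sees a single booklet.

```python
# Hypothetical illustration of matrix sampling. The real PIRLS booklet
# design and block counts differ; this only sketches the idea.

def make_booklets(blocks):
    """Rotate item blocks into booklets of two blocks each, so every
    block appears in exactly two booklets."""
    n = len(blocks)
    return [(blocks[i], blocks[(i + 1) % n]) for i in range(n)]

def assign_booklets(student_ids, booklets):
    """Assign each student exactly one booklet, cycling through the
    booklets so coverage is balanced across the sample."""
    return {sid: booklets[i % len(booklets)]
            for i, sid in enumerate(student_ids)}

blocks = ["R1", "R2", "R3", "R4"]   # hypothetical reading blocks
booklets = make_booklets(blocks)     # [("R1","R2"), ("R2","R3"), ...]
assignment = assign_booklets(range(8), booklets)
```

With this rotation, every item block is answered by a subset of students, so the full item pool is covered without any one student taking every item.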

ePIRLS

ePIRLS is a new component of the study being introduced for the 2016 cycle. Because for many people the Internet has become the primary source for obtaining information, reading curricula around the world are acknowledging the importance of online reading. In order to measure how well students read online, IEA has created ePIRLS – an innovative, online reading assessment.

ePIRLS uses an engaging, simulated Internet environment to measure fourth grade students’ achievement in online reading for information-gathering purposes. The assessment presents fourth grade students with authentic school-like assignments about science and social studies topics. ePIRLS allows an accurate assessment of online reading and comprehension skills beyond those used in “traditional” print material.

What makes IEA’s ePIRLS truly innovative is its simulated online interface and assessment design. After logging into ePIRLS, students are introduced to their assignment. As ePIRLS begins, two windows appear: an Internet browser window at left and the ePIRLS assessment window at right. To successfully complete ePIRLS, students must not only be able to navigate and discriminate among informational texts in a non-linear online environment, but must also construct meaning from these Internet sources, retrieve data, make inferences, and integrate the online information. Importantly, at the end of the assessment, students must be able to synthesize information across multiple passages.

Each ePIRLS assessment task lasts about 40 minutes, and each student will be asked to complete two tasks. A brief online questionnaire will be administered at the end of the assessment, asking students about their online reading experience. The PIRLS 2016 assessment is designed to take place over two days, with the paper-and-pencil version administered on the first day and ePIRLS administered on the next available day. It is intended that about half of the students who take the paper-and-pencil PIRLS will be selected to take ePIRLS. The added burden of a second assessment session may reduce schools’ willingness to participate in the study. Therefore, we will use the field test to evaluate the feasibility of including ePIRLS in the PIRLS 2016 main study (see Part B, section 4). If during the field test schools refuse to allow their students to participate in both sessions, they will be offered the option of including only a portion of the students selected for PIRLS in the ePIRLS sample. If schools still refuse, they will be offered the option of dropping the ePIRLS administration entirely.

Questionnaires

The background questionnaires for PIRLS 2016 were developed to address the issues outlined in the PIRLS context questionnaire framework. The United States is adapting the questions to fit the U.S. education context, as appropriate, including adding a few questions, such as questions about students’ race/ethnicity. All but the student questionnaire will be offered online, with a paper-and-pencil backup; students will receive only a paper-and-pencil questionnaire.

School Questionnaire. A representative from each participating school will be asked to provide information on reading resources, teacher availability and retention, principal leadership, school emphasis on academic success, school climate, and parental involvement in school activities.

Teacher Questionnaire. Reading teachers of students in selected classes will be asked to complete a teacher questionnaire, which will include questions about teacher preparation and experience, reading instruction, instructional resources and technology, instructional time, instructional engagement, and classroom assessment.

Student Questionnaire. Students will be asked about home resources, motivation, self-concept, self-efficacy, and characteristics such as gender and race/ethnicity. The questionnaire will be administered to all students who have received parental permission to participate in PIRLS.

A.3 Improved Information Technology (Reduction of Burden)

The PIRLS 2016 design and procedures are prescribed internationally, and data collection involves paper-and-pencil student assessments and questionnaires, the ePIRLS online student assessments and questionnaires, and online or paper-and-pencil questionnaires for schools and teachers. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to accommodate respondent preferences. The online questionnaires will be provided on the secure NCES server, so that NCES can control access to the data to ensure confidentiality and minimize disclosure risk.

A communication website will be used for PIRLS 2016 during the field test and main study to provide a simple, single source of information and to engage and maintain high levels of school involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored to their anticipated needs. We plan to gather class and student lists from participating schools through a secure electronic filing process, a system for submitting lists of student information, including student background information from school records. The electronic filing system provides efficiency and built-in data quality checks.

A.4 Efforts to Identify Duplication

In the United States, reading achievement is systematically assessed at (1) the Federal level, where trend data have been collected on a fairly regular basis since 1971 through the National Assessment of Educational Progress (NAEP); (2) the state level, where data are routinely collected as part of state testing programs, though they vary across the states in terms of the frequency of testing, age/grades tested, and types of cognitive items administered; and (3) the district level, where data are collected through the use of commercially or locally developed standardized tests as well as tests developed in conjunction with the instructional programs used in schools. PIRLS 2016 does not duplicate these assessments.

PIRLS 2016 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent U.S. participation in international studies involving 40 to over 70 countries each. As part of international cooperative studies, the United States must collect the same information at the same time as the other nations in order to make valid comparisons both with other countries and with previous PIRLS data. While some studies in the United States collect similar, though not identical, kinds of information (e.g., NAEP), the data from those studies cannot be substituted for the information collected in PIRLS because they do not allow for comparisons outside the United States. Furthermore, the data collected through PIRLS are based on a unique framework that is not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in the other countries. Because the items measuring reading achievement have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.

A.5 Minimizing Burden for Small Entities

The school samples for PIRLS contain small-, medium-, and large-size schools, including private schools, selected with probability proportionate to their size. All school sizes are needed to ensure an appropriate representation of each type of school in the selected sample of schools. Burden will be minimized wherever possible. In addition, RTI staff will conduct all test administrations, and will assist with parental notification, sampling, and other tasks as much as possible within each school. The assessment will be administered to intact classes to minimize disruption to school schedules.

A.6 Frequency of Data Collection

The field test data collection is scheduled for March 1 through April 15, 2015, and the full-scale data collection for February through May, 2016. This schedule is prescribed by the international collective for PIRLS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain trend lines.

A.7 Special Circumstances

None of the special circumstances identified in the Instructions for Supporting Statement apply to the PIRLS study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of reading and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the projects, usually with advice from national subject-matter and testing experts. In addition, the IEA convened a panel of reading experts from around the world to develop cognitive items.

The majority of the consultations (outside NCES) have involved the TIMSS & PIRLS International Study Center at Boston College in the United States. Key to these ongoing consultations are: Dirk Hastedt (executive director of the IEA); Michael Martin, Ina V.S. Mullis, and Pierre Foy, all of whom have extensive experience in developing and operating international education surveys (especially related to PIRLS).

A.9 Payments or Gifts to Respondents

In order to achieve acceptable school response rates, schools have historically been offered incentives to participate, as a thank-you for the time they invest and the space they make available for the international assessments. High response rates are required by both the IEA and NCES, and are difficult to achieve in school-based studies; the United States has historically had difficulty achieving sufficient participation levels. We plan to offer schools $200 for participation, an amount based on incentives provided in past administrations of PIRLS and currently offered in other international assessments.

The school staff serving as School Coordinators will receive $100 for their time and effort in coordinating the traditional assessment plus $50 for running the ePIRLS system check, and assisting with computer setup (these components may be delegated to a school IT coordinator if necessary). The School Coordinator serves a critical role in data collection, functioning as the central school contact, and facilitating arrangements for the assessments. They are asked to file class and student listing forms; arrange the date, time, and space for the assessment; and disseminate information to parents and students.

Consistent with prior administrations of PIRLS, and as a token of appreciation for their participation, main study students will receive a small gift valued at approximately $4. In PIRLS 2011, each participating student received a small watch/stopwatch that could be clipped securely (with an attached carabiner) to a backpack or belt loop. A similarly priced item will be distributed to participating students for the PIRLS 2016 data collection. Students will also receive a certificate with their name thanking them for participating in and representing the United States in PIRLS 2016. Some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor.

Teachers will be offered $20 for completing the PIRLS teacher questionnaire. Historically, participation is high among school administrators without offering incentives; therefore, no incentive will be offered for completion of the school administrator questionnaire.

A.10 Assurance of Confidentiality

The laws pertaining to the collection and use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in appendix A):

NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002; 20 U.S.C. § 9543). By law, the data provided [by you, schools, staff, and students] may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. § 9573).

The following statement will appear on the front cover of the questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaires):

U.S. participation in this study is sponsored by the National Center for Education Statistics (NCES), U.S. Department of Education, and authorized by the Education Sciences Reform Act of 2002 (20 U.S.C., § 9543). Your responses are protected by federal statute (20 U.S.C., § 9573) and may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law.

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0645. The time required to complete this information collection is estimated to average [XX] minutes per [respondent type], including the time to review instructions [, search existing data resources, gather the data needed,] and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or the status of your individual submission of this form, write directly to: Progress in International Reading Literacy Study (PIRLS), National Center for Education Statistics, U.S. Department of Education, 1990 K Street, N.W., Washington, D.C. 20006.

OMB No. 1850-0645, Approval Expires xx/xx/2017.

The PIRLS 2016 confidentiality plan includes signing of confidentiality agreements and notarized nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. Also included in the plan are personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file.

NCES understands the legal and ethical need to protect the privacy of the PIRLS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the PIRLS 2016 data when preparing the data files for use by researchers, in compliance with 20 U.S.C., § 9573. Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.
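As a simplified illustration of one masking strategy named above, the sketch below shows a toy version of data swapping: exchanging the values of an identifying variable between a random pair of records so that no record’s combination of attributes can be traced with certainty, while univariate distributions are preserved. The record fields and swap rule here are hypothetical, and this is not NCES’s actual disclosure-limitation procedure.

```python
import random

def swap_values(records, field, rng):
    """Toy data swapping: exchange `field` between one random pair of
    records. The set of values for `field` is unchanged overall, but
    the link between that field and the rest of each record is broken."""
    out = [dict(r) for r in records]          # work on copies
    i, j = rng.sample(range(len(out)), 2)     # pick two distinct records
    out[i][field], out[j][field] = out[j][field], out[i][field]
    return out

# Hypothetical records (fields invented for illustration)
records = [
    {"school_id": "A", "zip": "20001", "score": 512},
    {"school_id": "B", "zip": "20002", "score": 498},
    {"school_id": "C", "zip": "20003", "score": 530},
]
masked = swap_values(records, "zip", random.Random(0))
```

After the swap, every ZIP code still appears exactly once in the file, but at least one ZIP is no longer attached to its original school record.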

A.11 Sensitive Questions

The questionnaires do not include items considered to be of a sensitive nature.

A.12 Estimates of Burden

This package shows estimated burden to respondents for all PIRLS 2016 activities, and requests approval for burden to respondents for the field test and for recruitment for the main study. Burden estimates are shown in Table A.1. The minimum sample size for the field test is 25 schools and 800 students, and for the main study, 150 schools and 4,000 students. The burden table assumes exceeding the minimum requirements and is based on samples of 1,000 students in the field test and 5,000 students in the main study. The time required for students to respond to the assessment (cognitive items) portion of the study and the associated directions is shown separately in Table A.1 and is not included in the totals because it is not subject to the PRA. Student, administrator, and teacher questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time for districts that require approval before their schools can be contacted to review study requirements, and the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent consent materials, and arranging assessment space. Burden estimates for the main study data collection are also provided for information purposes in Table A.1.

Table A.1. Burden estimates for PIRLS 2016 Field Test and Main Study.

Data collection instrument | Sample size | Expected response rate | Number of respondents | Number of responses | Minutes per respondent | Total burden hours

Field Test
Student Directions (sample of 1,000)† | 1,000 | 0.93 | 930 | 930 | 10 | 155
Student Assessment† | 1,000 | 0.93 | 930 | 930 | 80 | 1,240
ePIRLS Student Assessment† | 500 | 0.93 | 465 | 465 | 80 | 620
ePIRLS Student Questionnaire (sample of 1,000) | 500 | 0.93 | 465 | 465 | 5 | 39
Student Questionnaire (sample of 1,000) | 1,000 | 0.93 | 930 | 930 | 30 | 465
School Administrator Questionnaire (1 per school) | 25 | 0.99 | 25 | 25 | 40 | 17
Teacher Questionnaire (2 per school) | 50 | 0.97 | 49 | 49 | 40 | 33
School Administrator Recruitment | 50 | 0.88 | 44 | 44 | 90 | 66
School Coordinator (1 per participating school) | 25 | 1.00 | 25 | 25 | 1,140 | 475
Total Burden Field Test | | | 1,073 | 1,538 | | 1,095

Main Study

National sample
Student Directions (sample of 5,000)† | 5,000 | 0.93 | 4,650 | 4,650 | 10 | 775
Student Assessment† | 5,000 | 0.93 | 4,650 | 4,650 | 80 | 6,200
ePIRLS Student Assessment† | 3,500 | 0.93 | 3,255 | 3,255 | 80 | 4,340
ePIRLS Student Questionnaire (sample of 5,000) | 3,500 | 0.93 | 3,255 | 3,255 | 5 | 271
Student Questionnaire (sample of 5,000) | 5,000 | 0.93 | 4,650 | 4,650 | 30 | 2,325
School Administrator Questionnaire | 150 | 0.99 | 149 | 149 | 40 | 99
Teacher Questionnaire | 225 | 0.97 | 218 | 218 | 40 | 145
School Administrator Recruitment | 250 | 0.88 | 220 | 220 | 90 | 330
School Coordinator | 150 | 0.88 | 132 | 132 | 1,140 | 2,508
District IRB Staff Study Approval | 40 | 1.00 | 40 | 40 | 120 | 80
District IRB Panel Study Approval | 240 | 1.00 | 240 | 240 | 60 | 240

State sample
Student Directions (sample of 1,700)† | 1,700 | 0.93 | 1,581 | 1,581 | 10 | 264
Student Assessment† | 1,700 | 0.93 | 1,581 | 1,581 | 80 | 2,108
Student Questionnaire (sample of 1,700) | 1,700 | 0.93 | 1,581 | 1,581 | 30 | 1,318
School Administrator Questionnaire | 50 | 0.99 | 50 | 50 | 40 | 33
Teacher Questionnaire | 100 | 0.97 | 97 | 97 | 40 | 65
School Administrator Recruitment | 75 | 0.88 | 66 | 66 | 90 | 99
School Coordinator | 50 | 0.88 | 44 | 44 | 1,140 | 836
District IRB Staff Study Approval | 10 | 1.00 | 10 | 10 | 120 | 20
District IRB Panel Study Approval | 60 | 1.00 | 60 | 60 | 60 | 60

Total Burden District/School Recruitment and Pre-Assessment Activities, Main Study | | | 812 | 812 | | 4,173
Total Burden Requested in this Submission | | | 1,885 | 2,350 | | 5,268

† Cognitive assessment and directions rows (shown in gray font in the source document) are not included in the burden totals because they are not subject to the PRA.

Note: OMB Clearance Requested: The total burden includes the burden associated with conducting the PIRLS 2016 Field Test and the recruitment and pre-assessment activities for the PIRLS 2016 Main Study (items shown in bold in the source document). The PIRLS 2016 Main Study burden is conservatively high because the Main Study may include a benchmarking state; however, the state-sample burden is held consistent with that of national sample schools because of potential variability between states. Total student burden does not include the time for the cognitive assessment and its associated directions (shown in gray font).
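As an illustration only (not part of the submission), most rows of Table A.1 follow a two-step rounding convention: the number of respondents is the sample size times the expected response rate, rounded to a whole number, and the burden hours are respondents times minutes per respondent divided by 60, again rounded. The helper below is a hypothetical sketch of that convention, checked against a few field test rows; note that a few table rows round halves upward (e.g., 50 × 0.97 = 48.5 → 49 teachers), which Python's built-in half-to-even rounding does not reproduce, so those rows are not used here.

```python
def burden_hours(sample_size, response_rate, minutes):
    """Sketch of the Table A.1 convention: round respondents, then round hours.

    Returns (respondents, total_burden_hours).
    """
    respondents = round(sample_size * response_rate)
    hours = round(respondents * minutes / 60)
    return respondents, hours

# Field test rows from Table A.1:
print(burden_hours(1000, 0.93, 30))   # Student Questionnaire -> (930, 465)
print(burden_hours(500, 0.93, 5))     # ePIRLS Student Questionnaire -> (465, 39)
print(burden_hours(25, 1.00, 1140))   # School Coordinator -> (25, 475)
```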


In 2011, one state provided its own funding to participate in PIRLS. The main study burden estimates reflect the inclusion of one such state. They will be updated following the field test as final design decisions are made.

For students participating in the field test, the burden is estimated to be 543 hours, based on the 30-minute background questionnaire plus the 5-minute ePIRLS online reading questionnaire. At $7.25 per hour (the 2014 federal minimum wage), the total burden cost of the field test for students is estimated to be $3,937.

For the 40-minute field test school administrator questionnaire and all of the field test and main study recruitment and pre-assessment activities for the national and state collections, burden cost is calculated at an estimated $50.00 per hour for school and district administrators (912 hours) and an estimated $35.00 per hour for teachers and school coordinators (3,852 hours). The total burden cost to school and district staff for these activities is therefore estimated at $180,420.
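The dollar figures above follow directly from the stated hours and hourly rates; the snippet below is a simple arithmetic check (illustrative only, not part of the submission).

```python
# Hourly rates stated in Section A.12.
FEDERAL_MIN_WAGE_2014 = 7.25   # $/hour, applied to student hours
ADMIN_RATE = 50.00             # $/hour, school and district administrators
STAFF_RATE = 35.00             # $/hour, teachers and school coordinators

# 543 student hours at the 2014 federal minimum wage.
student_cost = 543 * FEDERAL_MIN_WAGE_2014
# 912 administrator hours plus 3,852 teacher/coordinator hours.
staff_cost = 912 * ADMIN_RATE + 3852 * STAFF_RATE

print(round(student_cost))  # 3937   (stated as $3,937)
print(staff_cost)           # 180420.0 (stated as $180,420)
```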

A.13 Total Annual Cost Burden

No cost to respondents is anticipated beyond the estimated burden cost described in Section A.12.

A.14 Annualized Cost to Federal Government

The total cost to the federal government for conducting the PIRLS 2016 full-scale study is estimated to be $10,658,124 over a 5-year period. The cost for the field test is estimated to be $1,673,731. These figures include all direct and indirect costs and are based on the national data collection contract, valued at $5,371,232 over five years, from July 2014 to June 2019.

A.15 Program Changes or Adjustments

This is a reinstatement of a previously approved collection, and as such it shows all burden as new. With regard to content, PIRLS 2016 differs from PIRLS 2011 in that PIRLS 2016 will not involve the IEA’s bilateral coordination of PIRLS with the IEA’s Trends in International Mathematics and Science Study (TIMSS) assessment. PIRLS 2016 will include the new ePIRLS assessment, with the paper-and-pencil version administered on the first day and ePIRLS administered on a subsequent day.

A.16 Plans for Tabulation and Publication

The PIRLS field test is designed to provide a statistical review of the performance of items on the cognitive assessment and questionnaires in preparation for the main study data collection.

Based on the data collected in the main study, the TIMSS & PIRLS International Study Center will prepare a report to be released in December 2017. As has been customary, NCES will also release a report at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics; there are currently no plans to conduct complex statistical analyses of either dataset. An example of a past PIRLS report can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013010rev. In the spring of 2018, the International Study Center will also prepare a technical report describing the design and development of the assessment as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User’s Guide for the study.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of PIRLS 2016 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:



Dates | Activity
April 2014–December 2014 | Prepare data collection manuals, forms, assessment materials, and questionnaires
November 2014–February 2015 | Contact and gain cooperation of states, districts, and schools for the field test
February 2015–March 2015 | Select student samples
March 2015–April 2015 | Collect field test data
May 15, 2015 | Deliver raw data to the international sponsoring organization
July 2015–August 2015 | Review field test results
March 2015–February 2016 | Prepare for the main study and recruit schools
February 2016–May 2016 | Collect main study data
June 2017 | Receive final data files from international sponsors
June 2017–December 2017 | Produce report


A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The materials that will be used in the 2016 main study will be based upon the field test materials included in this submission. Additionally, this submission is designed to adequately justify the need for and overall practical utility of the full study and to present the overarching plan for all of the phases of the data collection, providing as much detail about the measures to be used as is available at the time of this submission. As part of this submission, NCES is publishing a notice in the Federal Register allowing first a 60- and then a 30-day public comment period. For the final proposal for the main study, after the field test, NCES will publish a notice in the Federal Register allowing an additional 30-day public comment period on the final details of the 2016 main study.

