
Progress in International Reading Literacy Study (PIRLS 2016) MAIN STUDY





OMB# 1850-0645 v.9




Supporting Statement Part A




Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC









September 2015

Revised October 2015





TABLE OF CONTENTS



PREFACE


A. JUSTIFICATION





APPENDICES

A: Parental Consent and Respondent Recruitment Materials


B: Non-response Bias Analysis Plan


C: Data Collection Instruments







PREFACE

The Progress in International Reading Literacy Study (PIRLS) is an international assessment of fourth-grade students’ achievement in reading. Since its inception in 2001, PIRLS has assessed students every 5 years (2001, 2006, 2011); the next assessment, PIRLS 2016, is the fourth iteration of the study. Regular participation by the United States provides data on student achievement and on current and past education policies, and allows comparison of U.S. education policies and student performance with those of its international counterparts. In PIRLS 2011, 53 education systems participated. The United States will participate in PIRLS 2016 to continue to monitor the progress of its students compared to that of other nations and to provide data on factors that may influence student achievement.

PIRLS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the assessment framework, the assessment, and the background questionnaires. The IEA agrees upon a common set of standards and procedures for collecting and reporting PIRLS data, and defines the study’s timeline, all of which must be followed by all participating countries. As a result, PIRLS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES), within the Institute of Education Sciences of the U.S. Department of Education (ED), conducts this study and works with the IEA and, for PIRLS 2016, RTI International (under contract with ED), to ensure proper implementation of the study and adherence to the IEA’s standards. Participation in PIRLS also allows NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [The Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C., Section 9543].

PIRLS 2016 includes an innovative new assessment of online reading, ePIRLS, which is designed to help countries understand how successful they are in preparing fourth-grade students to read, comprehend, and interpret online information. The PIRLS 2016 assessment is designed to take place over two consecutive days, with the paper-and-pencil version administered on the first day and ePIRLS administered on the following day.

Data collection for the field test occurred from February through May 2015. The United States recruited 25 public schools and assessed 993 students. The student samples were obtained by selecting two classes from each school. The field test included the first-ever implementation of ePIRLS, a computerized assessment of online reading, which was conducted in 1 of the 2 sampled classrooms in each participating school.

NCES is requesting clearance for data collection materials and procedures for the PIRLS 2016 main study data collection. In November 2014, NCES received approval (OMB# 1850-0645 v.8) for the PIRLS 2016 field test data collection and main study recruitment. The materials to be used in the main study are based upon those approved in November 2014. In that submission, NCES justified the need for and overall practical utility of the full study as proposed, described an overarching plan for the phases of the data collection over the next 3 years, and provided as much detail on the measures to be used as was available at the time. OMB approved the initial phase of this collection in November 2014, and NCES has now published a notice in the Federal Register allowing a 30-day public comment period on the details of the subsequent study components – the PIRLS 2016 main study data collection described in this submission.

The IEA began its process of main study preparation upon completion of the field test. After reviewing item statistics from each country, the IEA made initial recommendations for changes to the instruments. These recommendations were distributed to the participating countries and representatives from each country met in August 2015 to discuss the proposed changes and work toward finalizing the instruments. IEA sent the finalized instruments to participating countries for translation and adaptation. Translation is not required in the United States since the instruments are provided in English. The proposed changes and adaptations are presented in Part C.

The U.S. PIRLS main study will take place between March and May 2016, and will include a nationally-representative sample of about 5,000 students in the target population from 150 schools. All classrooms participating in the main study will be asked to participate in ePIRLS. The main study will closely resemble the field test in approach, burden, and materials.

The three changes proposed since the field test for the main study are the plan to administer ePIRLS to all classrooms; the reduction in burden for school administrators, teachers, and school coordinators; and some changes to questionnaire items. In addition to the supporting statements Parts A and B, Part C provides justification for the questionnaire item changes. Appendix A provides the already approved main study recruitment materials, consisting of letters to state and district officials and school principals, text for a PIRLS main study brochure, “Frequently Asked Questions,” a “Summary of Activities,” and parental consent materials (appendix A is included again in this request because the main study recruitment activities are expected to continue at the time this request is approved). Appendix B provides the non-response bias analysis plan, and Appendix C provides the PIRLS 2016 main study background questionnaires.

Because PIRLS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of draft and final questionnaires. In order to meet the international data collection schedule for the spring 2016 main study, recruitment activities began in May 2015. Recruitment for the main study was also scheduled to align with recruitment for other NCES studies (e.g., the National Assessment of Educational Progress, NAEP) and to allow schools to put the assessment on their calendars. Main study materials and procedures are very similar to those used in the field test.

A. Justification

A.1 Importance of Information

Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. PIRLS provides a unique opportunity to compare U.S. students’ reading knowledge and skills at fourth grade with that of their peers in countries around the world.

The continuation of U.S. participation allows for the study of past and current education policies that have shaped reading achievement over the past 15 years. Furthermore, participating countries are not only able to obtain information about students' knowledge and abilities, but also about the cultural environments, teaching practices, curriculum goals, and institutional arrangements that are associated with student achievement.

PIRLS complements what we learn from national assessments such as the National Assessment of Educational Progress (NAEP) by identifying the strengths and weaknesses of student reading achievement relative to participating countries around the world. It provides valuable benchmarking information about educational policies enacted in other countries and policies that could be applied to U.S. educational practices.

Based on earlier PIRLS data releases, it is likely that the results of these studies will draw great attention in the United States and elsewhere. It is therefore expected that PIRLS will contribute to ongoing national and international debates and efforts to improve reading instruction and achievement.

A.2 Purposes and Uses of Data

PIRLS assesses reading knowledge and skills at grade 4. PIRLS is designed to align broadly with curricula in the participating countries. The results, therefore, suggest the degree to which students have learned concepts and skills likely to have been taught in school. PIRLS also collects background information on students, teachers, schools, curricula, and official education policies in order to allow cross-national comparison of educational contexts that may be related to student achievement.

Data collected from PIRLS 2016 allow evidence-based decisions to be made for the purposes of educational improvement. Each successive participation in PIRLS provides trend information about student achievement in reading relative to other countries, as well as indicators that show how this achievement relates to demographic, curricular, school, teacher, and student factors that provide the educational context for achievement. These high quality, internationally comparative trend data are key in informing education policy discussions.

Through the participation in PIRLS and other international assessment programs, NCES is able to provide comparative indicators on student performance and school practices across countries in order to benchmark U.S. student performance, and to suggest hypotheses about the relationship between student performance and factors that may influence performance as well as areas in which students have strengths or weaknesses. The international studies identify differences among countries over time in instructional practices, school policies, and opportunity-to-learn that can inform discussions about how to improve students’ ability to read.

This collection of data is consistent with the NCES mandate. The enabling legislation of NCES [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations." The Educational Sciences Reform Act of 2002 (ESRA 2002: 20 U.S.C., Section 9543) also mandates that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. In addition to being essential for any international perspective on reading knowledge and skills, U.S. participation fulfills both the national and international aspects of NCES' mission.

PIRLS 2016 Components

The reading assessment is organized around a content dimension that specifies the subject matter to be assessed and a cognitive dimension that specifies the thinking processes to be assessed. PIRLS assesses two purposes for reading that fourth-grade students typically engage in: reading for literary experience and reading to acquire and use information. PIRLS also assesses four broad processes of comprehension predominantly used by fourth-grade readers: focusing on and retrieving explicitly stated information; making straightforward inferences; interpreting and integrating ideas and information; and evaluating and critiquing content and textual elements. The PIRLS 2016 framework is similar to the 2011 framework, but has been slightly updated to provide more specificity for item writers and to better reflect current curricula in participating countries. There were no revisions to the content domains or cognitive domains, nor were there changes to the target percentages for either domain.

Assessment Instruments

To minimize burden and ensure broad subject-matter coverage, PIRLS paper-and-pencil reading tests will use a matrix sampling approach where the reading items are organized into a set of test booklets, with each student receiving only one booklet. This design is consistent with past practice.
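As an illustration of how a matrix-sampling rotation can cover a full item pool while each student sees only one booklet, consider the following sketch (the block and booklet counts here are hypothetical, not the official PIRLS 2016 booklet design):

```python
# Illustrative sketch of matrix sampling: the item pool is split into
# blocks, blocks are rotated through booklets, and each student takes
# only one booklet. Block/booklet counts are hypothetical, not the
# official PIRLS 2016 booklet design.
blocks = [f"Block{i}" for i in range(1, 9)]  # hypothetical item blocks

# Each booklet pairs a block with the next one, wrapping around, so
# every block appears in exactly two booklets.
booklets = [(blocks[i], blocks[(i + 1) % len(blocks)])
            for i in range(len(blocks))]

def assign_booklet(student_index: int):
    """Rotate booklets across students so coverage stays balanced."""
    return booklets[student_index % len(booklets)]

# Across the sample the full item pool is covered even though each
# student answers only a fraction of the items.
```

Because every block appears in more than one booklet, responses from different booklets can be linked psychometrically while per-student testing time stays fixed.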

ePIRLS is a new component of the study being introduced for the 2016 cycle. Because for many people the Internet has become the primary source for obtaining information, reading curricula around the world are acknowledging the importance of online reading. In order to measure how well students read online, IEA has created ePIRLS – an innovative, online reading assessment.

ePIRLS uses an engaging, simulated Internet environment to measure fourth-grade students’ achievement in online reading for information-gathering purposes. The assessment presents fourth-grade students with authentic school-like assignments about science and social studies topics. ePIRLS allows an accurate assessment of online reading and comprehension skills beyond those used in “traditional” print material.

What makes IEA’s ePIRLS truly innovative is its simulated online interface and assessment design. After logging into ePIRLS, students are introduced to their assignment. As ePIRLS begins, two windows appear: an Internet browser window at left and the ePIRLS assessment window at right. To successfully complete ePIRLS, students must not only be able to navigate and discriminate among informational texts in a non-linear online environment, but must also construct meaning from these Internet sources, retrieve data, make inferences, and integrate the online information. Importantly, at the end of the assessment, students must be able to synthesize information across multiple passages.

Each ePIRLS assessment task typically lasts about 40 minutes, and each student will be asked to complete two tasks. A brief online questionnaire will be administered at the end of the assessment, asking students about their online reading experience. The PIRLS 2016 assessment is designed to take place over two days, with the paper-and-pencil version administered on the first day and ePIRLS administered on the next available day. The field test was used to evaluate the feasibility of including ePIRLS with PIRLS. Only half of the classrooms selected for PIRLS were also selected for ePIRLS. For the main study, it is intended that all students who take the paper-and-pencil PIRLS will also take ePIRLS.

Questionnaires

The background questionnaires for PIRLS 2016 were developed to address the issues outlined in the PIRLS context questionnaire framework. The United States is adapting the questions to fit the U.S. education context as appropriate, including adding a few questions, such as ones about the race/ethnicity of students. All but the student questionnaire will be offered online with a paper-and-pencil backup; students will receive only a paper-and-pencil questionnaire.

School Questionnaire. A representative from each participating school will be asked to provide information on reading resources, teacher availability and retention, principal leadership, school emphasis on academic success, school climate, and parental involvement in school activities.

Teacher Questionnaire. Reading teachers of students in selected classes will be asked to complete a teacher questionnaire, which will include questions about teacher preparation and experience, reading instruction, instructional resources and technology, instructional time, instructional engagement, and classroom assessment.

Student Questionnaire. Students will be asked about home resources, motivation, self-concept, self-efficacy, and characteristics such as gender and race/ethnicity. The questionnaire will be administered to all students who have received parental permission to participate in PIRLS.

A.3 Improved Information Technology (Reduction of Burden)

The PIRLS 2016 design and procedures are prescribed internationally and data collection involves paper-and-pencil student assessments and questionnaires, the ePIRLS online student assessments and questionnaires, as well as online or paper-and-pencil questionnaires for schools and teachers. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to facilitate user preference for participation. The online questionnaires will be provided on the secure NCES server, so that NCES will be able to control access to the data to ensure confidentiality and minimize disclosure risk.

A communication website for PIRLS 2016 was developed for use during the field test and main study in order to provide a simple, single source of information to engage and maintain high levels of school involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored to their anticipated needs. We plan to gather class and student lists from participating schools through a secure electronic filing (e-filing) process, in which schools submit lists of student information, including background information from school records. The e-filing system offers efficiency and built-in data quality checks.

A.4 Efforts to Identify Duplication

In the United States, reading achievement is systematically assessed at (1) the Federal level, where trend data have been collected on a fairly regular basis since 1971 through the National Assessment of Educational Progress (NAEP); (2) the state level, where data are routinely collected as part of state testing programs, though they vary across the states in terms of the frequency of testing, age/grades tested, and types of cognitive items administered; and (3) the district level, where data are collected through the use of commercially or locally developed standardized tests as well as tests developed in conjunction with the instructional programs used in schools. PIRLS 2016 does not duplicate these assessments.

PIRLS 2016 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent U.S. participation in international studies involving 40 to over 70 countries each. As part of international cooperative studies, the United States must collect the same information at the same time as the other nations in order to make valid comparisons both with other countries and with previous PIRLS data. While some U.S. studies collect similar, though not identical, kinds of information (e.g., NAEP), the data from those studies cannot be substituted for the information collected in PIRLS because they do not allow for comparisons outside the United States. Furthermore, the data collected through PIRLS are based on a unique framework that is not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in the other countries. Because the items measuring reading achievement have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.

A.5 Minimizing Burden for Small Entities

The school samples for PIRLS contain small-, medium- and large-size schools, including private schools, selected based on probability proportionate to their size. All school sizes are needed to ensure an appropriate representation of each type of school in the selected sample of schools. Burden will be minimized wherever possible. In addition, RTI staff will conduct all test administrations, and will assist with parental notification, sampling, and other tasks as much as possible within each school. The assessment will be administered to intact classes to minimize disruption to school schedules.

A.6 Frequency of Data Collection

The main study data collection is scheduled for March through May 2016. This schedule is prescribed by the international collective for PIRLS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain trend lines.

A.7 Special Circumstances

None of the special circumstances identified in the Instructions for Supporting Statement apply to the PIRLS study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of reading and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the projects, usually with advice from national subject matter and testing experts. In addition, the IEA convened a panel of reading experts from around the world to develop cognitive items.

The majority of the consultations (outside NCES) have involved the TIMSS & PIRLS International Study Center at Boston College in the United States. Key to these ongoing consultations are Dirk Hastedt (Executive Director of the IEA) and Michael Martin, Ina V.S. Mullis, and Pierre Foy, all of whom have extensive experience in developing and operating international education surveys (especially related to PIRLS).

A.9 Payments or Gifts to Respondents

The PIRLS standard operating procedures specify a required minimum participation rate for schools, classrooms, and students. Participating countries must achieve an 85 percent school participation rate from schools in the original sample, not including replacement schools. Countries that are unable to meet the requirements with original schools but do achieve an 85 percent participation rate for schools after including replacement schools will be annotated in the international reports. National samples that do not meet the 85 percent requirements, even after including replacement schools, will be reported separately in the international reports.

In order to achieve acceptable school response rates, schools have historically been offered incentives to participate and as a thank you for the time they invest and the space they make available for the international assessments. High response rates are required by both IEA and NCES, and are difficult to achieve in school-based studies. The U.S. has historically had difficulties in achieving sufficient participation levels. Schools taking part in PIRLS 2016 are offered $200, which has been approved by OMB. The $200 is based on incentives provided in past administrations of PIRLS and currently offered in other international assessments. To address challenges encountered with securing school cooperation for PIRLS 2016, we propose a second-tier incentive which would allow us to offer $800 if necessary. This second-tier incentive would be offered only to schools in the original sample and only after all other refusal conversion methods have been exhausted. The second-tier incentive would be initiated in November 2015.

The school staff serving as School Coordinators will receive $100 for their time and effort in coordinating the traditional assessment plus $50 for running the ePIRLS system check, and assisting with computer setup (these components may be delegated to a school IT coordinator if necessary). The School Coordinator serves a critical role in data collection, functioning as the central school contact, and facilitating arrangements for the assessments. They are asked to file class and student listing forms; arrange the date, time, and space for the assessment; and disseminate information to parents and students.

Consistent with prior administrations of PIRLS, as a token of appreciation for their participation, the main study students will receive a small gift valued at approximately $4. In the PIRLS field test, each participating student received a small, digital wrist watch and a “USA” pencil. These same token incentives are planned again for the main study. Some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor.

Teachers will be offered $20 for completing the PIRLS teacher questionnaire. Historically, participation is high among school administrators without offering incentives; therefore, no incentive will be offered for completion of the school administrator questionnaire.

A.10 Assurance of Confidentiality

The laws pertaining to the collection and use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in appendix A):

NCES is authorized to conduct PIRLS 2016 by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C., § 9543). The data are being collected for NCES by RTI International, a nonprofit research organization based in North Carolina. The collected data may be used only for statistical purposes and may not be disclosed or used, in identifiable form, for any other purpose except as required by law (ESRA 2002, 20 U.S.C., § 9573). The collected information will be combined across respondents to produce statistical reports.

The following statement will appear on the front cover of the questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaires):

The National Center for Education Statistics (NCES), within the U.S. Department of Education, is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C., § 9543). The data are being collected for NCES by RTI International, a nonprofit research organization based in North Carolina. The collected data may be used only for statistical purposes and may not be disclosed or used, in identifiable form, for any other purpose except as required by law (ESRA 2002, 20 U.S.C., § 9573). The collected information will be combined across respondents to produce statistical reports.

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary survey is 1850-0645. The time required to complete this survey is estimated to average 20 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the survey. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or the status of your individual submission of this form, write directly to: Progress in International Reading Literacy Study (PIRLS), National Center for Education Statistics, U.S. Department of Education, 1990 K Street, N.W., Washington, D.C. 20006.

OMB No. 1850-0645, Approval Expires 11/30/2017.

The PIRLS 2016 confidentiality plan includes signing confidentiality agreements and notarized nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. Also included in the plan are personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file.

NCES understands the legal and ethical need to protect the privacy of the PIRLS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the PIRLS 2016 data when preparing the data files for use by researchers, in compliance with 20 U.S.C., § 9573. Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.

A.11 Sensitive Questions

The questionnaires do not have items considered to be of sensitive nature.

A.12 Estimates of Burden

This package shows estimated burden to respondents for all PIRLS 2016 activities and requests approval of respondent burden for the main study. Burden estimates are shown in Table A.1. The minimum sample size for the main study is 150 schools and 4,000 students; the burden table assumes exceeding the minimum requirements and is based on a sample of 5,000 students. The time required for students to respond to the assessment (cognitive items) portion of the study and the associated directions is shown in gray font and is not included in the totals because it is not subject to the PRA. Student, administrator, and teacher questionnaires are included in the requested burden totals. Based on the field test experience, burden has been reduced for the school coordinator, school administrator, and teacher. Recruitment and pre-assessment activities include the time to review study requirements by the districts that require approval before contacting their schools, and the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent consent materials, and arranging assessment space. Burden estimates for the already approved recruitment and pre-assessment activities are also provided for information purposes in Table A.1.

For students participating in the main study, the burden is estimated based on the 30-minute background questionnaire plus the 5-minute ePIRLS online reading questionnaire. At $7.25 per hour (the 2015 Federal minimum wage), the total burden cost of the main study for students is estimated to be $19,669.

For the main study school administrator 30-minute questionnaire and all of the main study recruitment and pre-assessment activities, burden cost is calculated at an estimated $45.89 per hour cost to school and district administrators for 724 hours, or $33,224. Teacher and school coordinator burden was estimated at $27.34 per hour for 1,327 hours, or $36,280. The total burden cost for PIRLS 2016 is $89,174; the total burden estimate is 4,764 hours.
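As a cross-check, the burden-cost arithmetic above can be reproduced with a short script; the hourly rates and hour totals are taken directly from the text, and rounding follows the reported figures:

```python
# Cross-check of the Section A.12 burden-cost arithmetic.
# Rates and hours are those stated in the text above.
student_hours = 2325 + 388          # student questionnaire + ePIRLS questionnaire
student_cost = student_hours * 7.25 # 2015 Federal minimum wage -> ~$19,669

admin_cost = 724 * 45.89            # school/district administrators -> ~$33,224
teacher_cost = 1327 * 27.34         # teachers and school coordinators -> ~$36,280

total_cost = student_cost + admin_cost + teacher_cost   # ~$89,174
total_hours = student_hours + 724 + 1327                # 4,764 hours
```

Rounding each component to whole dollars reproduces the totals reported in the text.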

Table A.1. Burden estimates for PIRLS 2016 Main Study

Main Study, National sample

Data collection instrument                     | Sample size | Expected response rate | Number of respondents | Number of responses | Minutes per respondent | Total burden hours
-----------------------------------------------|-------------|------------------------|-----------------------|---------------------|------------------------|-------------------
Student Directions (sample of 5,000)†          | 5,000       | 0.93                   | 4,650                 | 4,650               | 10                     | 775
Student Assessment†                            | 5,000       | 0.93                   | 4,650                 | 4,650               | 80                     | 6,200
ePIRLS Student Assessment†                     | 5,000       | 0.93                   | 4,650                 | 4,650               | 80                     | 6,200
ePIRLS Student Questionnaire (sample of 5,000) | 5,000       | 0.93                   | 4,650                 | 4,650               | 5                      | 388
Student Questionnaire (sample of 5,000)        | 5,000       | 0.93                   | 4,650                 | 4,650               | 30                     | 2,325
School Administrator Questionnaire             | 150         | 0.99                   | 149                   | 149                 | 30                     | 74
Teacher Questionnaire                          | 225         | 0.97                   | 218                   | 218                 | 35                     | 127
School Administrator Recruitment**             | 250         | 0.88                   | 220                   | 220                 | 90                     | 330
School Coordinator**                           | 150         | 1.00                   | 150                   | 150                 | 480                    | 1,200
District IRB Staff Study Approval**            | 40          | 1.00                   | 40                    | 40                  | 120                    | 80
District IRB Panel Study Approval**            | 240         | 1.00                   | 240                   | 240                 | 60                     | 240
Total Burden                                   |             |                        | 5,518*                | 10,317              | -                      | 4,764

† In the original document these rows appear in gray font; the assessment and directions time is not subject to the PRA and is excluded from the totals.

* The total number of respondents contains no duplication of the number of different responses expected for the given type of respondent.

** The burden associated with these entries (shown in gray font in the original document) has been approved previously, but is included here because the activity is still taking place.
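Each row of Table A.1 follows the same derivation: expected respondents are the sample size times the expected response rate, and total burden hours are respondents times minutes per respondent, converted to hours. A minimal sketch (function names are illustrative):

```python
def expected_respondents(sample_size, response_rate):
    """Expected number of respondents for a table row."""
    return round(sample_size * response_rate)

def total_burden_hours(respondents, minutes_per_respondent):
    """Total burden hours: per-respondent minutes converted to hours."""
    return round(respondents * minutes_per_respondent / 60)

# Student Questionnaire row: 5,000 sampled, 93% expected response,
# 30 minutes per respondent.
respondents = expected_respondents(5000, 0.93)  # 4,650
hours = total_burden_hours(respondents, 30)     # 2,325

# Teacher Questionnaire row: 225 sampled, 97% response, 35 minutes.
teacher_hours = total_burden_hours(expected_respondents(225, 0.97), 35)  # 127
```

Applying the same two steps to each row, and summing only the rows subject to the PRA, reproduces the 4,764-hour total.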


A.13 Total Annual Cost Burden

No cost to respondents is anticipated beyond the estimated burden cost described in Section A.12.

A.14 Annualized Cost to Federal Government

The total cost to the federal government for conducting the PIRLS 2016 full-scale study is estimated to be $10,658,124 over a 5-year period. This figure includes all direct and indirect costs, and is based on the national data collection contract, valued at $6,364,165 over five years, from July 2014 to June 2019.

A.15 Program Changes or Adjustments

The decrease in estimated burden to respondents reflects that the last approval covered field test recruitment, field test data collection, and main study recruitment, while this request covers main study recruitment and data collection only. It also reflects lowered burden estimates for school coordinators, school administrators, and teachers.

A.16 Plans for Tabulation and Publication

Based on the data collected in the main study, the TIMSS & PIRLS International Study Center will prepare a report to be released in December 2017. As has been customary, NCES will also release a report at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics; there are currently no plans to conduct complex statistical analyses of either dataset. An example of a past PIRLS report can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013010rev. In the spring of 2018, the International Study Center will also prepare a technical report describing the design and development of the assessment, as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User's Guide for the study.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of PIRLS 2016 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:

Dates                        | Activity
-----------------------------|---------------------------------------------------------------------------
May 2015—February 2016       | Prepare for the main study; contact and gain cooperation of states, districts, and schools for main study
October 2015—February 2016   | Select student samples
March 2016—May 2016          | Collect main study data
September 2016               | Deliver raw data to international sponsoring organization
June 2017                    | Receive final data files from international sponsors
June 2017—December 2017      | Produce report

A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The average hourly earnings of education administrators in the 2014 National Compensation Survey sponsored by the Bureau of Labor Statistics (BLS) are $44.13, and of elementary school teachers $27.34. Where a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: Elementary School Teachers (25-2021), Education Administrators (11-9032); accessed September 14, 2015.

