
Progress in International Reading Literacy Study (PIRLS 2016) MAIN STUDY





OMB# 1850-0645 v.9




Supporting Statement Part B





Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC









September 2015

Revised November 2015



  1. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe

The respondent universe for the PIRLS main study is all students enrolled in grade 4 during the 2015-2016 school year who are at least 9.5 years of age. The universe for the selection of schools is all types of schools in the U.S. A sample of 176 schools was selected for the main study, with the goal of obtaining participation from a minimum of 150 schools. Within sampled schools, students will be selected for participation by drawing a random sample of two classes. Only students in intact classrooms will be assessed. School administrators and teachers of selected classrooms will also be asked to complete questionnaires.

B.2 Statistical Methodology

Field Test Sampling Plan and Sample

The purpose of the PIRLS field test was to try out new assessment items and background questions and to ensure that the classroom and student sampling procedures proposed for the main study would work as intended. In selecting a school sample for this purpose, it was important to minimize the burden on schools, districts, and states while ensuring that the field test data were collected effectively. In addition, the U.S. implemented the new ePIRLS assessment for the first time.

The PIRLS International Study Center required that the field test sample consist of at least 25 schools with at least 800 students assessed. In the U.S., 933 students from 25 schools participated in the field test. The student samples were obtained by selecting two classes from each school. As the field test was designed only to test items, questions, and procedures, a probability sample of schools was not required. However, the sample included a broad range of schools covering such features as public (including charter schools), private, large, small, urban, and rural schools, and schools from a variety of different states.

The field test sample was drawn after the main study sample, and schools were selected for the field test from the set of schools not included in the main study sample. We drew the field test sample from five states – California, Illinois, New York, North Carolina, and Texas – chosen because of their large size and diverse demographics. This allowed us to achieve the desired distribution of schools by region, poverty level, and ethnicity, and informed the recruitment and data collection process for the nation as a whole.

Schools in California, Illinois, New York, North Carolina, and Texas that were not selected into the main study sample made up the field test sampling frame and were stratified by state, high/low poverty,1 and public/private status, resulting in 15 strata. Within each stratum, schools were sorted serpentine-style by locale (city, suburb, town, and rural), race/ethnicity status (“15 percent or above” or “below 15 percent” Black, Hispanic, Asian and Pacific Islander, and American Indian and Alaska Native students), and fourth grade enrollment. A purposive sample of 25 schools was then selected for the field test, allocated equally across the five states; within each state, schools were selected purposively so that, to the extent possible, the distribution of field test schools closely aligned with the distribution of schools in the main study school sampling frame on the margins of the stratification and sort characteristics described above. In addition, we investigated the possibility of NAEP or other NCES studies taking place in schools around the same time as the field test and selected the PIRLS field test sample so as to minimize overlap with the NAEP sample. Two replacement schools were selected for each of the 25 sampled schools from the same strata, with the same sort characteristics as the corresponding sampled schools. Once the field test sample was selected, a summary of the distribution of the characteristics of the selected schools was prepared, showing the comparison with the national population of schools.
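
A serpentine sort orders schools within each stratum so that adjacent schools on the frame are similar, alternating the sort direction of each successive variable within groups of the preceding one. The sketch below illustrates the general technique in Python; the field names and example records are hypothetical placeholders and do not reflect the actual frame layout or the contractor's implementation.

```python
from itertools import groupby

def serpentine_sort(records, keys):
    """Hierarchical sort that alternates the direction of each successive key
    within groups of the preceding key (serpentine order)."""
    def _sort(recs, level, descending):
        if level == len(keys):
            return list(recs)
        key = keys[level]
        ordered = sorted(recs, key=lambda r: r[key], reverse=descending)
        out, flip = [], descending
        for _, group in groupby(ordered, key=lambda r: r[key]):
            out.extend(_sort(list(group), level + 1, flip))
            flip = not flip  # next group of this key sorts the next key the other way
        return out
    return _sort(list(records), 0, False)

# Hypothetical frame records for one stratum.
frame = [
    {"school_id": 1, "locale": "city",  "minority_15plus": 1, "grade4_enrollment": 80},
    {"school_id": 2, "locale": "rural", "minority_15plus": 0, "grade4_enrollment": 45},
    {"school_id": 3, "locale": "city",  "minority_15plus": 0, "grade4_enrollment": 60},
    {"school_id": 4, "locale": "city",  "minority_15plus": 1, "grade4_enrollment": 30},
]
ordered = serpentine_sort(frame, ["locale", "minority_15plus", "grade4_enrollment"])
```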

The student sampling procedures for the field test corresponded as closely as feasible to what was planned for the main study, so as to try out the operational procedures for student sample selection. The student sample was obtained by selecting one or two classes per school, depending on the number of classes available at grade 4. Each participating school was asked to submit an exhaustive list of classes (that is, a list that accounts for each student in the grade exactly once). Smaller classes were combined to form “pseudoclasses” for the purposes of sampling. Once the list of classes was submitted, we used a sampling algorithm to select two classes (or pseudoclasses) with equal probability. The student sample then consisted of all students in the selected classes.

Class and student lists were gathered from participating schools electronically using a secure electronic filing process (as explained in Part A). Electronic filing improves efficiency and supports automated data quality checks. Schools were able to access the electronic filing system through a web site.

Main Study Sampling Plan and Sample

The main study sample was selected just prior to the selection of schools for the field test, and was approved by NCES and Statistics Canada, the PIRLS international sampling contractor. The school sample design for the main study is required to be more rigorous than that for the field test: it must be a probability sample of schools that fully represents the entire United States. At the same time, to ensure maximum participation, it must be designed to minimize overlap with other NCES studies involving student assessment that will be conducted around the same time.

The main study will take place in the spring of 2016, about two months after the NAEP 2015 assessment. NAEP will assess several hundred schools nationally, at grades 4, 8, and 12. NCES coordinated with the NAEP contractor to minimize the overlap between that study and PIRLS; the NAEP sample was drawn after the PIRLS sample.

The sample for the PIRLS main study needs to yield a minimum of 150 participating schools. For each original sampled school, two replacement schools have also been identified. The sampling frame was obtained from the most current versions of NCES’s Common Core of Data (CCD) and Private School Universe Survey (PSS) files, restricted to schools having grade 4 and excluding schools in Puerto Rico, U.S. territories, and Department of Defense overseas schools. A total of 176 schools and associated replacement schools were selected to ensure that the required minimum of 150 participating schools is achieved.
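
For illustration, the frame construction step described above amounts to filtering the combined public and private school files. The column names and jurisdiction codes below are hypothetical placeholders, not the actual CCD/PSS variable names; this is a sketch of the general approach, not the contractor's procedure.

```python
import pandas as pd

# Hypothetical codes for Puerto Rico, other U.S. territories, and DoD overseas
# schools; the real files use their own coding schemes.
EXCLUDED_JURISDICTIONS = {"PR", "VI", "GU", "AS", "MP", "DD"}

def build_frame(ccd: pd.DataFrame, pss: pd.DataFrame) -> pd.DataFrame:
    """Combine public (CCD) and private (PSS) school records, keep schools
    offering grade 4, and drop excluded jurisdictions."""
    frame = pd.concat([ccd.assign(sector="public"),
                       pss.assign(sector="private")], ignore_index=True)
    has_grade4 = (frame["lowest_grade"] <= 4) & (frame["highest_grade"] >= 4)
    frame = frame[has_grade4 & ~frame["state"].isin(EXCLUDED_JURISDICTIONS)]
    return frame.reset_index(drop=True)
```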

The sample was stratified according to school characteristics such as public/private status, Census region, and poverty status (as measured by the percentage of students in the school receiving free or reduced-price lunch through the National School Lunch Program (NSLP)). This ensures appropriate representation of each type of school in the selected sample.

Plans for determining school eligibility, student eligibility, and student sampling are described below.

Schools were selected with probability proportional to the estimated number of classes at grade 4, with schools expected to have either one or two classes given the same selection probability. The use of a probability proportional to size (PPS) sample design ensures that all students have an approximately equal chance of selection. We modified this equal-probability design in one respect: to increase the available sample size of students in high-poverty schools, we doubled the selection probability of each school with at least 50 percent of students eligible for free or reduced-price lunch under NSLP, relative to other schools of the same size.
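
A common way to implement such a design is systematic PPS selection from the sorted frame, with the measure of size adjusted as described above. The sketch below only illustrates that general technique under the stated rules (equal measures for one- and two-class schools, doubled measures for high-poverty schools); the data structure and function names are hypothetical, and the actual selection was carried out by the sampling contractor.

```python
import random

def measure_of_size(est_classes, high_poverty):
    """Estimated grade 4 classes, with one- and two-class schools treated alike
    and high-poverty schools (>= 50% NSLP-eligible) given double the measure."""
    mos = max(est_classes, 2)
    return 2 * mos if high_poverty else mos

def pps_systematic_sample(schools, n_sample, seed=2016):
    """schools: frame records (dicts with 'est_classes' and 'high_poverty'),
    already sorted in the intended frame order. Returns the selected records;
    a very large school can be hit more than once."""
    rng = random.Random(seed)
    sizes = [measure_of_size(s["est_classes"], s["high_poverty"]) for s in schools]
    interval = sum(sizes) / n_sample
    start = rng.uniform(0, interval)
    hits = [start + k * interval for k in range(n_sample)]

    selected, cum, i = [], 0.0, 0
    for school, size in zip(schools, sizes):
        cum += size
        while i < len(hits) and hits[i] < cum:
            selected.append(school)
            i += 1
    return selected
```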

Student sampling will be accomplished by selecting one or two classes per school. Each sampled school will be asked to prepare a comprehensive list of classes that places each grade 4 student in the school in exactly one of the listed classes. As described above, schools will submit these class and student lists via secure E-filing. Any class with fewer than ten students will be combined with another class to form a ‘pseudoclass’ with at least ten students in it. We will then select one or two classes (or pseudoclasses) from each school, with equal probability, and all students in those classes/pseudoclasses will be included in the sample. If a school has only one class, then all students in the grade will be included in the sample.
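
The within-school step can be sketched as follows: form pseudoclasses from any classes with fewer than ten students, then draw one or two (pseudo)classes with equal probability and take every student on their rosters. This is a minimal illustration under an assumed data structure (class rosters as lists of student IDs); the actual selection is performed by the sampling algorithm described above.

```python
import random

MIN_CLASS_SIZE = 10

def form_pseudoclasses(classes):
    """classes: list of (class_id, [student_ids]) pairs. Combine classes with
    fewer than MIN_CLASS_SIZE students so every sampling unit meets the minimum."""
    units, pending = [], []
    for class_id, roster in sorted(classes, key=lambda c: len(c[1])):
        pending.append((class_id, roster))
        if sum(len(r) for _, r in pending) >= MIN_CLASS_SIZE:
            units.append(("+".join(cid for cid, _ in pending),
                          [s for _, r in pending for s in r]))
            pending = []
    if pending:  # fold any leftover small classes into the last unit
        last_id, last_roster = units.pop() if units else ("", [])
        units.append(("+".join(filter(None, [last_id] + [cid for cid, _ in pending])),
                      last_roster + [s for _, r in pending for s in r]))
    return units

def sample_students(classes, n_classes=2, seed=2016):
    """Draw up to n_classes (pseudo)classes with equal probability and return
    every student in the selected units."""
    rng = random.Random(seed)
    units = form_pseudoclasses(classes)
    chosen = units if len(units) <= n_classes else rng.sample(units, n_classes)
    return [student for _, roster in chosen for student in roster]

# Hypothetical class lists for one school: 4C is small and gets combined.
classes = [("4A", list(range(1, 23))), ("4B", list(range(23, 44))), ("4C", list(range(44, 50)))]
sampled = sample_students(classes)
```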

Nonresponse Bias Analysis, Weighting, and Sampling Errors

It is inevitable that nonresponse will occur at both the school and student levels. We will analyze the nonrespondents and report whether and how they differ from the respondents along dimensions for which data are available for the nonresponding units, as required by NCES standards. After the weights are calculated, sampling errors will be computed for a selection of key indicators, incorporating the full complexity of the design, that is, clustering and stratification (see Appendix B for more detail).
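
As one illustration of a design-based variance estimator that reflects clustering and stratification, the sketch below computes a standard error for a weighted mean using a paired jackknife. The replication scheme actually used for PIRLS is specified in the international procedures referenced in Appendix B; the zone/unit structure and field names here are hypothetical.

```python
def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def paired_jackknife_se(records):
    """records: dicts with 'zone' (sampling zone), 'unit' (0 or 1, identifying
    the pseudo-PSU within the zone), analysis variable 'y', and 'weight'.
    Returns a jackknife standard error for the weighted mean of y."""
    full = weighted_mean([r["y"] for r in records], [r["weight"] for r in records])

    variance = 0.0
    for zone in sorted({r["zone"] for r in records}):
        # Drop one unit of the pair and double the weight of the other.
        # (In practice the dropped unit is chosen at random within each zone.)
        rep_weights = []
        for r in records:
            w = r["weight"]
            if r["zone"] == zone:
                w = 0.0 if r["unit"] == 0 else 2.0 * w
            rep_weights.append(w)
        rep = weighted_mean([r["y"] for r in records], rep_weights)
        variance += (rep - full) ** 2
    return variance ** 0.5
```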

B.3 Maximizing Response Rates

The most significant challenge in recruitment for PIRLS has been engaging the schools and gaining their cooperation. Because intact classrooms are selected, student participation is less of a challenge: historically, student participation rates have never fallen below 90 percent (see Table 1). That said, it is important to U.S. PIRLS that students are engaged and try hard on the assessment.

Table 1. Historical PIRLS school and student participation rates (percent)

Year    School Participation Rate                  Overall Student Participation Rate
        Before Replacement    After Replacement
2011           80                    85                           96
2006           57                    86                           95
2001           61                    86                           96


Our approach to school recruitment is to:

  • Obtain endorsements about the value of PIRLS from relevant organizations;

  • Work with NAEP state coordinators to obtain state endorsement of the study and to assist with school recruitment;

  • Inform Chief State School Officers and state test directors about the sample of schools in their state, enclosing a sample letter of endorsement that they can send to schools;

  • Send letters and informational materials to schools and districts. These letters will be customized by the type of school;

  • Train experienced school recruiters about PIRLS;

  • Follow up mailings with telephone calls to explain the study and the schools’ involvement, including placing the PIRLS assessment date on school calendars;

  • Maintain continued contact until schools have built a relationship with the recruiter and fully understand PIRLS;

  • Offer a $100 incentive to the individual at the school identified to serve as the school coordinator, plus $50 for running the ePIRLS system check and assisting with computer setup on the second day of the test administration (these components may be delegated to a school IT coordinator if necessary);

  • Attempt to convert refusals by addressing specific concerns and offering flexibility in assessment dates and times;

  • Make in-person visits to some schools, as necessary; and

  • Offer schools $200 for participation with a second-tier incentive of $800 for original schools as a last attempt at refusal conversion.

Our approach to student recruitment is to:

  • Send parental permission forms home to parents. Implied permission is encouraged but written permission will be collected if required by the school district or school;

  • Ask teachers to encourage student participation;

  • Offer participating students a small gift valued at approximately $4. In the PIRLS 2015 field test, each participating student received a small, digital wrist watch and a “USA” pencil. These items were well received and are planned for participating students for the PIRLS 2016 data collection; and

  • When feasible, have the test administrator speak to the students prior to the scheduled test day to encourage participation.

Our approach to teacher recruitment is to:

  • Send letters and informational materials to teachers;

  • Provide the option of an electronic or hard-copy questionnaire;

  • Offer a $20 incentive for participation; and

  • Have the test administrator speak to the teacher on the day of the student session.


B.4 Purpose of Field Test and Data Uses

The central goals for the field test were to evaluate new assessment items and background questions, and to ensure that the classroom and student sampling procedures proposed for the main study would be successful. The U.S. also implemented the ePIRLS assessment and analyzed data from the field test to inform decisions on whether to implement it in the main study.

One purpose of the field test was to test the effects that administering the ePIRLS option under consideration would have on school and student recruitment and operations. Information gained from the field test is useful in weighing the value of the additional education data gained by participation in ePIRLS against the added cost and burden and the risk of not achieving acceptable school and student response rates for inclusion in the international comparisons.

Part of the long-term plan for PIRLS is for it to become an online assessment, eventually administered entirely electronically rather than on paper. ePIRLS is designed to bridge the reading-for-information portion of PIRLS from paper-based to computer-based administration. The primary value of U.S. participation in ePIRLS 2016 is to evaluate the feasibility and validity of transitioning PIRLS from paper to electronic delivery in the context specific to the United States. Additionally, the one-time results comparing student literacy across traditional print and digital formats, both within the U.S. and against other countries, will provide new and valuable information to educators, researchers, and policymakers.

In the field test (as is planned for the main study), the paper-and-pencil PIRLS assessment was administered on the first day and ePIRLS on a different day selected by the school. Students were sampled for ePIRLS from those who completed the paper-and-pencil version of PIRLS 2016; about half of the students who participated in PIRLS were sampled for ePIRLS. The sampled students were asked to return for the ePIRLS session on a day designated by their school. During the school recruitment period, the fact that students needed to be asked to return for a second session could have been viewed by schools as an unnecessary burden on the school and students, and could have resulted in schools declining to participate in PIRLS. Schools that found ePIRLS too burdensome may have opted to have some or all of their students complete only the paper-and-pencil version of PIRLS. Beyond the concern about the impact of ePIRLS on school participation rates, students might also not have returned for the second session, which could have resulted in insufficient student response rates for the ePIRLS assessment to be viable. NCES used the field test results to make the final decision on whether to administer ePIRLS as part of the PIRLS 2016 main study. The decision was based on the following factors:

  • feedback from school administrators in the field test schools about the perceived burden of the second, ePIRLS testing session;

  • the student participation rate in the field test ePIRLS, including whether at least 50 percent of the students from the initial PIRLS assessment day proved available for the second, ePIRLS session; and

  • degree to which students completed ePIRLS in the field test (ePIRLS consists of two 40-minute assessment modules plus a 5-minute questionnaire, and students are allowed to stop at any time).

In summary, 24 of the 25 schools participating in the PIRLS 2016 field test participated in ePIRLS. School administrators indicated that setup took longer than anticipated but overall reported that it was a positive experience. Of the students selected for ePIRLS, 86 percent participated, and 455 of the 461 participating students completed ePIRLS. The 6 students who did not complete ePIRLS had legitimate reasons, such as computer issues, arriving late, or departing early. Given these results, NCES decided to proceed with ePIRLS in the main study. As in the field test, NCES may offer schools that object to the burden of the second testing session the option of administering ePIRLS to only one-half or none of the students assessed in the paper-and-pencil PIRLS.


B.5 Individuals Consulted on Study Design

Overall direction for PIRLS is provided by Dr. Sheila Thompson, National Research Coordinator, National Center for Education Statistics, U.S. Department of Education.

The following persons are responsible for the statistical design of PIRLS:

  • Pierre Foy. TIMSS and PIRLS International Study Center, Boston College (617-552-6253); and

  • Marc Joncas, Sylvie LaRoche and Jean Dumais, Statistics Canada (613-951-0007).

Contractors responsible for sampling and data analysis:

  • David Wilson, RTI International (919-541-6990);

  • Patricia Green, RTI International (312-456-5260); and

  • Ben Dalton, RTI International (919-541-7228).

Analysis and reporting will be performed by:

  • TIMSS and PIRLS International Study Center, Boston College;

  • RTI International;

  • Insight Policy Research; and

  • National Center for Education Statistics, U.S. Department of Education.






Progress in International Reading Literacy Study (PIRLS 2016) MAIN STUDY




OMB# 1850-0645 v.9




Supporting Statement Part C




Submitted by:



National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC








September 2015









NOTE: The final version of the questionnaires to be used in the United States, as approved by the International Association for the Evaluation of Educational Achievement (IEA), will be provided by the end of October 2015.

This section provides a justification for the changes to survey items between the field test and the main study in the Progress in International Reading Literacy Study (PIRLS). The PIRLS 2016 main study will be administered in more than 50 countries. The International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies, conducts PIRLS. The IEA created a framework for the PIRLS assessment and questionnaires that guides the development of the assessment and questionnaire items. The framework is implemented by the International Study Center, which coordinates with each country to ensure comparability of study protocols and administration. This framework has been in place since the study’s inception in 2001.

As part of the assessment and questionnaire development for the main study, the field test items were reviewed in multiple steps. First, the International Study Center reviews item statistics for each of the assessment and questionnaire items from each country. The International Study Center makes initial recommendations for the main study and presents them during a meeting of the National Research Coordinators from each of the participating countries. The recommendations are discussed and a determination is made about each item’s inclusion, adaptation, or exclusion for the main study. A final set of instruments is developed for all countries. Countries then identify any adaptations they wish to make and, if necessary, translate the instruments. All adaptations and translations go through a rigorous verification process. Translation is not required in the United States because the instruments are provided in English.

The item changes presented in this submission reflect the results of that meeting. The changes and associated justification for the changes are presented in Table 1.


Table 1. Questionnaire changes from the field test to the main study

Questionnaire

Field Test Location

Main Study Location

Description of Change

Field Test Wording

Main Study Wording

Justification

School Questionnaire

School

Cover

Cover

Response burden time reduced

It is estimated that you will need approximately 40 minutes to complete this questionnaire.

It is estimated that you will need approximately 30 minutes to complete this questionnaire.

Reflects field test timing data and item changes

Heading: School Enrollment and Characteristics


School

Q7 and Q8

Q7

Questions combined and wording modified

Does your school provide free breakfast for students?

--Yes, for all students

--Yes, for some students

--No, free breakfast is not provided


Does your school provide free lunch for students?

--Yes, for all students

--Yes, for some students

--No, free lunch is not provided

Does your school provide free meals for students?

-- Breakfast

--Yes for all students

--Yes, for some students

--No

--Lunch

--Yes for all students

--Yes, for some students

--No

Combined for clarity

School

Q10


Question removed

To what degree are the following health topics emphasized in your school? (options: Very high, High, Medium, Low)

  1. Washing hands

  2. Brushing teeth

  3. A healthy diet/nutrition

  4. Disease prevention


It was deemed that this question did not work effectively in all participating countries

Heading: Instructional Time


School

Q13


Question removed

As a general school policy, is student achievement used to assign fourth grade students to classes (e.g., streaming, tracking)? (Yes/no)


Removed to reduce questionnaire length

Heading: Resources and Technology


School

Q14A-C

Q11A-C

Reference to digital removed from question wording

How many books (print and digital)…

How many titles of magazines and other periodicals (print and digital)…

Can students borrow print or digital materials…

How many books (print)…

How many titles of magazines and other periodicals (print)…

Can students borrow print materials…

It was decided to collect information about digital books separately

School


Q12

Question added


Does the school provide access to digital books? (Yes/no)

Added to collect specific information about digital books.

Heading: School Emphasis on Academic Success


School


Q17


Q15


Items d and j removed

How would you characterize each of the following within your school?

d) Teachers working together to improve student achievement

j) Parental pressure for the school to maintain high academic standards


Removed to reduce redundancy

Item e added


e) Collaboration between school leadership (including master teachers) and teachers to plan instruction

Added to include leadership and still capture information from Q15d

Item m modified and renumbered to item l

m) Students’ respect for classmates who excel in school

l) Students’ respect for classmates who excel academically

Wording changed for clarity of intent

Heading: Changed from School Climate to School Discipline and Safety


School

18B

Q17

Subquestion moved to separate question and item c added

18B. To what degree is each of the following a problem among teachers in your school? (Not a problem/Minor problem/Moderate problem/Serious problem)

a) Arriving late or leaving early

b) Absenteeism

17. To what degree is each of the following a problem among teachers in your school? (Not a problem/Minor problem/Moderate problem/Serious problem)

a) Arriving late or leaving early

b) Absenteeism

c) Failure to complete the curriculum

Reorganized for clarity; item c added.

Heading: Changed from School Readiness to Students’ Literacy and Readiness


School

Q19

Q18

Item b removed

About how many of the students in your school can do the following when they begin the first grade of primary/elementary school? (<25%/25-50%/51-75%/>75%)

b) Tell a story


Removed to reduce questionnaire length

Heading: Changed from Principal Professional Development to Principal Experience and Education


School

Q21 and Q22

Q20 and Q21

Text in instructions bolded

Please round to the nearest whole number

Please round to the nearest whole number

Bolded for emphasis

School

Q22

Q22

Modified response option

What is the highest level of formal education you have completed?

- Completed an academic Master’s degree, postgraduate certificate program (e.g., teaching) or first professional degree (e.g., law, medicine, dentistry)

What is the highest level of formal education you have completed?

- Completed a Master’s degree, postgraduate certificate program (e.g., teaching), or professional degree (e.g., law, medicine, dentistry)

Modified to match items 4e in the teacher questionnaire and 23 in the school questionnaire

School

Q23B

Q23B

Modified item b

Do you hold any of the following professional qualifications in educational leadership?

b) An academic Master’s degree, postgraduate certificate program (e.g., teaching) or first professional degree (e.g., law, medicine, dentistry)

Do you hold any of the following professional qualifications in educational leadership?

b) A Master’s degree, postgraduate certificate program (e.g., teaching), or professional degree (e.g., law, medicine, dentistry)

Modified to match items 4e in the teacher questionnaire and 22 in the school questionnaire

School

Q25


Section and question removed

Heading: Leadership Activities


During the past year, approximately how much time have you spent on the following school leadership activities in your role as a school principal?


Removed to reduce questionnaire length

Teacher Questionnaire

Teacher

Cover

Cover

Response burden time reduced

It is estimated that you will need approximately 40 minutes to complete this questionnaire.

It is estimated that you will need approximately 30 minutes to complete this questionnaire.

Reflects field test timing data and item changes

Heading: About You


Teacher

Q4E

Q4E

Modified item e

What is the highest level of formal education you have completed?

e) Completed a Master’s degree or professional degree (MD, DDS, lawyer, minister)

What is the highest level of formal education you have completed?

e) Completed a Master’s degree, postgraduate certificate program (e.g., teaching), or professional degree (e.g., law, medicine, dentistry)

Modified to match items 22 and 23 in the school questionnaire

Teacher

Q5A

Q5A

Items d and e removed

During your college or university education, what was your major or main area(s) of study?

d) Mathematics

e) Science



Math and science were included in 2011, when TIMSS and PIRLS were administered together. No longer relevant.

Teacher

Q5B

Q5B

Item j added


As part of your formal education and/or training, to what extent did you study the following areas?

j) Early childhood education

Relevant to early grade reading instruction.

Teacher

Q6

Q6

Question wording modified

In the past two years, how many hours in total have you spent in formal in-service/professional development (e.g., workshops, seminars, lesson studies, etc.) that dealt directly with reading or teaching reading (e.g., reading theory, instructional methods)?

In the past two years, how many hours in total have you spent in formal professional development (e.g., workshops, seminars, lesson studies, etc.) that dealt directly with reading or teaching reading (e.g., reading theory, instructional methods)?

Underlined for emphasis

Heading: School Emphasis on Academic Success


Teacher


Q7


Q7


Items d, j, n, o, p, and q removed

How would you characterize each of the following within your school?

d)Teachers working together to improve student achievement

j) Parental pressure for the school to maintain high academic standards

n) Clarity of the school’s educational objectives

o) Collaboration between school leadership and teachers to plan instruction

p)Amount of instructional support provided to teachers by school leadership

q) School leadership’s support for teachers’ professional development


Removed to reduce redundancy among responses

Item e added


e) Collaboration between school leadership (including master teachers) and teachers to plan instruction

Added to include school leadership and still capture information from what was Q7d

Heading: School Environment


Teacher

Q9


Question removed

In your current school, how severe is each problem?


Removed to reduce questionnaire length

Heading: About Being a Teacher


Teacher

Q10


Q9


Items a, b, e, and f removed

How often do you have the following types of interactions with other teachers (Very often/Often/Sometimes/Never)

a) Discuss how to teach a particular topic

b) Collaborate in planning and preparing instructional materials

e) Work together to try out new ideas

f) Work as a group on implementing the curriculum


Removed to reduce questionnaire length

Item d modified and renumbered as item c

d) Visit another classroom to learn more about teaching

c) Observe another classroom to learn more about teaching

Wording changed for clarity of intent

Item d added


d) Work with teachers from other schools on the curriculum

Added to capture cross-school collaborations

Teacher

Q11

Q10

Items b and g removed

How often do you feel the following way about being a teacher?

b) I am satisfied with being a teacher at this school

g) I am going to continue teaching for as long as I can


Removed to reduce questionnaire length

Teacher

Q12


Question removed

Indicate the extent to which you agree or disagree with each of the following statements

  1. There are too many students in the classes

  2. I have too much material to cover in class

  3. I have too many teaching hours

  4. I need more time to prepare for class

  5. I need more time to assist individual students

  6. I feel too much pressure from parents

  7. I have difficulty keeping up with all of the changes to the curriculum

  8. I have too many administrative tasks


Removed to reduce questionnaire length

Heading: Changed from About Teaching the PIRLS Class to About Teaching Reading to the PIRLS Class


Teacher


Q17


Q15


Item f removed

In your view, to what extent do the following limit how you teach this class?

f) Students with physical disabilities


Deleted; low frequency

Items d and h added


d) Students absent from class

h) Lack of support for using information technology

Added based on discussion at international meeting.

Teacher

Q18


Question removed

How often do you do the following in teaching this class?


Removed to reduce questionnaire length

Teacher

Q21

Q18

Item a removed

When you have reading instruction and/or do reading activities, how often do you organize students in the following ways?

a) Students work independently on a goal they choose themselves


It was deemed that this question did not work effectively in all participating countries

Heading: About Teaching Reading to the PIRLS Class


Teacher


Q22


Q19


Item B(c) modified

When you have reading instruction and/or do reading activities with the students, how often do you have the students read the following types of text (in print or digitally)?

B. Informational Reading Materials

c) Nonfiction articles that describe and explain about things, people, events, or how things work

When you have reading instruction and/or do reading activities with the students, how often do you have the students read the following types of text (in print or digitally)?

B. Informational Reading Materials

c) Nonfiction articles that describe and explain about things, people, events, or how things work (e.g., newspaper articles, brochures)

Modified for clarity

Item B(d) removed

d) Authentic materials (e.g., menus, brochures, cartoons, newspaper articles, song lyrics)


Removed as a result of international discussion; the parenthetical list was not useful in all countries.

Teacher

Q23

Q20

Item f added


When you have reading instruction and/or do reading activities with the students, how often do you do the following?

f) Teach students how to summarize the main ideas

Removed to reduce redundancy

Teacher


Q24


Q21


Items c, d, and h removed

How often do you do the following in teaching reading to this class (Every lesson, About half, Some, Never):

c) Allow students to shift discussion in a new direction

d) Encourage students to ask questions that challenge the opinions of the teacher or other students

h) Give students the opportunity to explain their answers to classroom tests


Removed items that did not contribute to the scale.

Items b, c, and e added


b) Provide materials that are appropriate for the reading levels of individual students

c) Link new content to students’ prior knowledge

e) Encourage student discussions of texts

Added items that reflect best practice.

Heading: Computer and Library Resources


Teacher

Q27

Q24

Question split into two parts

Do the students in this class have computers (including tablets) available to use during their reading lessons?

  • Yes, each student has a computer

  • Yes, the class has computers that students can share

  • Yes, the school has computers that the class can use sometimes

Do the students in this class have computers (including tablets) available to use during their reading lessons?

  • Yes

  • No

If yes, what access do the students have to computers? (Yes/no)

  1. Each student has a computer

  2. The class has computers that students can share

  3. The school has computers that the class can use sometimes

Split into two questions for clarity of data.

Heading: Reading Homework


Teacher

Above Q30


Subheading removed

Questions 30-32 ask about homework for the fourth grade students in this class.


It was deemed that the subheading wasn’t needed

Heading: Reading Difficulty


Teacher

Above Q33


Subheading removed

Questions 33-34 ask about how you deal with reading difficulties of fourth grade students in this class.


It was deemed that the subheading wasn’t needed

Student Questionnaire


Heading: About you


Student

Q4A

Q4A

Item added: I don’t know

Was your mother (or stepmother or female legal guardian) born in the United States?

--Yes

--No


Was your mother (or stepmother or female legal guardian) born in the United States?

--Yes

--No

--I don’t know

Response added due to high nonresponse. Also added to TIMSS

Student

Q4B

Q4B

Item added: I don’t know

Was your father (or stepfather or male legal guardian) born in the United States?

--Yes

--No


Was your father (or stepfather or male legal guardian) born in the United States?

--Yes

--No

--I don’t know

Response added due to high nonresponse. Also added to TIMSS

Student

Q4C

Q4C

Item added: I don’t know

Were you born in the United States?

--Yes

--No


Were you born in the United States?

--Yes

--No

--I don’t know

Response added due to high nonresponse.

Also added to TIMSS

Student

Q6

Q6

Items a and b modified and combined into single item

Do you have any of these things at your home?

a) A computer or tablet of your own

b) A computer or tablet that is shared with other people at home

Do you have any of these things at your home?

a) A computer or tablet

Modified for clarity

Student

Q7

Q10

Question wording modified

How often do you use a computer or tablet in each of these?

How often do you use a computer or tablet in each of these places for schoolwork (including classroom tasks, homework, or studying outside of class)?

Modified for clarity of intent

Student

Q10

Q9

Item wording modified

How often do you eat breakfast on school days?

--Never

How often do you eat breakfast on school days?

--Never or almost never

Modified for clarity

Student


Q11

Question added


How much time do you spend using a computer or tablet to do these activities for your schoolwork on a normal school day? (No time/30 minutes or less/More than 30 minutes)

  1. Finding and reading information

  2. Preparing reports and presentations

IEA decided that more items measuring the use of computers and/or tablets are needed for the analysis of the ePIRLS data.

Student


Q12

Question added


How much time do you spend each day using a computer or tablet for any of the following activities? (No time/30 minutes or less/30 minutes up to 1 hour/From 1 hour up to 2 hours/2 hours or more)

  1. Playing games

  2. Watching videos

  3. Chatting

  4. Surfing the Internet

IEA decided that more items measuring the use of computers and/or tablets are needed for the analysis of the ePIRLS data.

Heading: Your School


Student

Q11

Q13

Items d and g removed

What do you think about your school? Tell how much you agree with these statements.

d) I like to see my classmates at school

g) I learn a lot in school


Removed to reduce questionnaire length

Student

Q12

Q14

Question wording modified

During this year, how often have other students from your school done any of the following things to you? Also, include through texting or the Internet.

During this year, how often have other students from your school done any of the following things to you (including through texting or the Internet)?

Modified for clarity

Heading: Lessons about reading


Student

Q13

Q15

Items g, h, and l removed

Think about the reading you do for school. How much do you agree with these statements about your reading lessons?

g) My teacher is enthusiastic about what we read for class

h) My teacher has clear answers to my questions

l) My teacher listens to what I have to say


Removed to reduce questionnaire length

Heading: Reading in school


Student

Q14

Q16

Item c added


In school, how often do these things happen?

c) My teacher asks us in class to talk about what we have read

Similar to item added to teacher questionnaire to reflect best practice.

Heading: Reading outside of school


Student

Q17

Q19

Item b removed

How often do you do these things outside of school?

b) I read things that I choose myself


Removed to reduce questionnaire length

Heading: What you think about reading


Student

Q18

Q20

Items a, h, and k removed

What do you think about reading? Tell how much you agree with each of these statements.

a) I read only if I have to

h) It is important to be a good reader

k) My parents like it when I read


Removed to reduce questionnaire length

Student

Q19

Q21

Items e, h, and i removed

How well do you read? Tell how much you agree with each of these statements.

e) My teacher tells me I am a good reader

h) I like to read out loud to other people

i) I can understand hard stories


Removed to reduce questionnaire length

ePIRLS Student Questionnaire

ePIRLS


Q2

Question added


How much time do you spend each day finding and reading information on the Internet?

o Less than 30 minutes

o 30 minutes up to 1 hour

o From 1 hour up to 2 hours

o 2 hours or more

Replaces former Q4

ePIRLS

Q4


Question removed

How often do you use a computer to do these things?

(Every day, Once or twice a week, Once or twice a month, Never or almost never)

a) Find out about things that interest you

b) Do your school work

c) Play games

d) Chat with your friends


Replaced by new Q2



1 High poverty schools are defined as having 50% or more students eligible for participation in the National School Lunch Program (NSLP), and low poverty schools have less than 50% of students eligible for NSLP. Private schools are all classified as low poverty because no NSLP information is available.


