




OECD

PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT
(PISA 2009)



REQUEST FOR OMB REVIEW



Standard Form (SF) 83-I

and Supporting Statement for Data Collection




Prepared by:


Windwalker Corporation

1355 Beverly Road, Suite 330

McLean, VA 22101



Prepared for:


National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






Submitted: November 29, 2007


TABLE OF CONTENTS



Chapter Page


Preface iii


A JUSTIFICATION 1


A.1 Importance of Information 1

A.2 Purposes and Uses of Data 4

A.3 Improved Information Technology (Reduction of Burden) 5

A.4 Efforts to Identify Duplication 5

A.5 Minimizing Burden for Small Institutions 7

A.6 Frequency of Data Collection 7

A.7 Special Circumstances 8

A.8 Consultations Outside NCES 8

A.9 Payments or Gifts to Respondents 9

A.10 Assurance of Confidentiality 10

A.11 Sensitive Questions 10

A.12 Estimates of Burden 11

A.13 Total Annual Cost Burden 12

A.14 Annualized Cost to Federal Government 12

A.15 Program Changes or Adjustments 13

A.16 Plans for Tabulation and Publication 13

A.17 Display OMB Expiration Date 15

A.18 Exceptions to Certification Statement 15


B Collections of Information Employing Statistical Methods 16


B.1 Respondent Universe 16

B.2 Statistical Methodology 16

B.3 Maximizing Response Rates 19

B.4 Purpose of Field Trial and Test of Procedures 19

B.5 Individuals Consulted on Statistical Design 19


APPENDIX - RECRUITMENT MATERIALS 21



List of Exhibits

Exhibit


1 Estimates of burden for students 11

2 Estimates of burden for school administrators 12

3 Estimates of burden for school coordinators 12




PREFACE


Beginning in 2000, the Program for International Student Assessment (PISA) has assessed the knowledge of 15-year-old students around the world in three subject areas – reading, mathematics, and science. PISA assesses students’ knowledge and skills gained both in and out of school environments, which distinguishes it from other international assessments, such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), that are closely tied to school curriculum frameworks.


PISA is a cyclical study, administered every three years. The major focus of the data collection rotates among the three content domains, so that in each administration in-depth information is collected on one domain with a minor focus on the other two. The focus in 2009 is on reading. The central aim of PISA is to assess the application of skills and competencies, embedded in the context of these subject areas.


The fourth phase of PISA, PISA 2009, continues this endeavor at a time when interest is increasing, both worldwide and in the United States, in how well schools are preparing students to meet the challenges of the future. International interest and involvement in PISA is high: 43 countries participated in 2000, 41 in 2003, 57 in 2006, and 68 are expected to participate in 2009. The U.S. has participated in all of the previous cycles and will participate in 2009 in order to track trends over time and to compare the performance of U.S. students with that of students in other countries.


PISA 2009 is sponsored by the Organization for Economic Cooperation and Development (OECD). In the United States, PISA 2009 is being conducted by the National Center for Education Statistics (NCES) of the Institute of Education Sciences, U.S. Department of Education.


To prepare for the assessment, PISA 2009 will conduct a field trial in the Spring of 2008. The purposes of the field trial are to collect data on assessment items and questionnaires and to test school recruitment, data collection, and data management procedures in preparation for the main study. The field trial also includes a new international optional component – an Electronic Reading Assessment (ERA) – that will include 24 countries and in which the U.S. has very recently decided to participate.


The purpose of this OMB submission is to request clearance for the recruiting for the 2008 field trial, the 2008 field trial data collection, recruiting for the 2009 main study, and the 2009 main study data collection. The international schedule calls for PISA field trial recruiting starting on December 1, 2007, field trial data collection from March 17-April 30, 2008, recruiting for the main study beginning in the Fall of 2008 (at least twelve months in advance of the data collection), and main study data collection in Fall 2009. It is important to note that because PISA is an international study, the U.S. will need to use internationally-developed instruments and follow international requirements and guidelines in conducting the assessment. If OMB grants clearance for both the field trial and the main study, NCES will prepare a memo to OMB after the field trial to update and continue the clearance process. The post-field trial memo will:


  1. Document any changes needed to the instruments and procedures for the main study in 2009.


  2. Describe changes in reporting burden for the main study data collection.


OMB approval for the field trial is requested as soon as possible so that recruiting activities can begin in order to meet the international data collection schedule for the Spring 2008 field trial. In order to begin recruiting schools for the main study by Fall 2008 (one year in advance, as recommended by a task force on improving response rates), we would need clearance for main study recruiting and administration no later than August 2008.

The questionnaires in the current materials are those that were used in 2006. They are placeholders until the 2009 questionnaires are available from the international development process. We do not anticipate that the questionnaires will change substantially. However, one known change is that the race category currently called “Pacific Islander” will be changed to “Native Hawaiian or Other Pacific Islander.”

A. JUSTIFICATION

A.1 Importance of Information

The Need for International Data on Education


As part of a continuing cycle of international education studies, the United States, through the National Center for Education Statistics (NCES), plans to participate in several international data gathering activities involving assessments and surveys in the coming years. The Program for International Student Assessment (PISA), sponsored by the Organization for Economic Cooperation and Development (OECD), is one of these studies.


In light of growing concerns about international economic competitiveness, the changing nature of our workplaces, and the expanding international marketplace in which we trade, knowing how our students and adults compare with their peers around the world has become more important than ever. Beyond simple comparisons, understanding what other nations are doing to further the educational achievement of their populations has also become increasingly important.


Data at critical points during the educational careers of our students will help policymakers in their efforts to guide and restructure the American educational system. These critical points may occur during primary, secondary, or tertiary education, as well as in adult education and training programs. Consequently, generating comparative data about students in school, at the end of schooling, and about adults in the workplace and in the community has become an important focus for NCES.


PISA 2009 is part of the larger international program that NCES has actively participated in through collaboration with, and representation at, the OECD, the Asia-Pacific Economic Cooperation (APEC), and the International Association for the Evaluation of Educational Achievement (IEA). Collaboration with Statistics Canada, Eurostat, and ministries of education throughout the world helps to round out the portfolio of data NCES compiles.


Through this active participation, NCES has sought to strengthen the quality, consistency and timeliness of international data. To continue this effort, the United States must follow through with well-organized and executed data gathering activities within our national boundaries. These efforts will allow NCES to build a data network that can provide the information necessary for informed decision making on the part of national, state, and local policy makers.


PISA


The OECD Program for International Student Assessment (PISA) will measure students' knowledge, skills, and competencies in three subject areas – reading, mathematics, and science. The overall strategy is to collect in-depth information on student capabilities in one of these three major domains every three years so that detailed information on each becomes available every nine years. During each three-year survey cycle, the major focus will be on one content domain, with a minor focus on the other two content domains. The major focus for the upcoming data collection, to take place in the year 2009, is on reading literacy, with a minor focus on science and mathematics. The 2009 data collection will be the second time the focus has been on reading literacy, thus allowing the first comparison of data on the same topic. The target population for this project will be a nationally representative sample of 15-year-old students.


The Australian Council for Educational Research (ACER), under the auspices of the OECD, is responsible for the international components of this project. The data collection contractor for the United States, the Windwalker Corporation, will work directly with ACER, the appropriate OECD and ACER committees, and the PISA National Project Manager at NCES.


Data Collection Instruments

The primary focus for the assessment and questionnaires for PISA 2009 will be on reading literacy. The international coordinating committee has defined reading literacy as:

  • Understanding, using, and reflecting on written information which involves an active and interactive role of the reader in gaining meaning from written texts; and

  • Having a set of linguistic tools that are important for meeting the demands of modern societies.


Final versions of the questionnaires have not yet been released by the international contractor for PISA, but copies of the PISA 2006 school and student questionnaires are attached to this submission. Current plans call for the field trial to use seven spiraled assessment booklets and two types of background questionnaires, one at the student level and one at the school level. The assessment booklets will be spiraled to reduce respondent burden: each student will be administered a subset of the items rather than all of them.

School questionnaires. A representative from each participating school will be asked to provide information on basic demographics of the school population and more in-depth information on one or more specific issues (generally related to the content of the tests in the major domain). Basic information to be collected includes school location; measures of the socio-economic context of the school, including school resources, facilities, and community resources; school and class size; staffing patterns; instructional practices; and school organization. The in-depth information is designed to address a very limited selection of issues that are of particular concern. It is anticipated that the school questionnaire will take approximately 25-30 minutes to complete.


Student questionnaires. Participating students will be asked to provide basic demographic data and in-depth information. Basic information to be collected includes demographics (e.g., age, gender, language, race/ethnicity); the socio-economic background of the student (e.g., parental education, economic background); the student's educational career; educational resources at home and at school, and their use; and instructional practices, curriculum, and time spent in school, as perceived by the students. It is anticipated that the student questionnaire will take approximately 30 minutes to complete.


Assessment instruments. The PISA field trial will include a two-hour assessment that focuses on reading literacy. The main study will also focus on reading, with a lesser emphasis on science and mathematics. Seven different test booklets will be used in the U.S. field trial. The main study in 2009 will consist of approximately thirteen booklets, each with four 30-minute blocks. There are fewer booklets in the field trial because some items used in prior rounds will be repeated, and these do not need to be field-tested. Each test booklet will include approximately two hours of test items and may include some items measuring students’ attitudes toward reading.


The field trial also includes an Electronic Reading Assessment (ERA) that will be administered to a subsample of students involved in the primary assessment. The PISA Reading Literacy Framework has been expanded to include computer text processing and navigation skills, and these skills will be assessed through a computer-based test lasting 40 minutes (the total ERA test session will occupy one hour). There will be five versions of the ERA, each containing two of the five 20-minute clusters of items. The clusters will be spiraled across versions to reduce respondent burden. Based on the results of the field trial, the international consortium will decide whether the ERA will be included in the main study, and if so, NCES will decide whether the U.S. will participate in this optional component.
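
To illustrate how such spiraling keeps each student's testing time fixed while still covering all of the material, the Python sketch below pairs five clusters into five two-cluster versions and rotates the versions across students. The cyclic pairing and the cluster labels are illustrative assumptions only; the actual cluster-to-version assignment is specified by the international consortium.

    # Illustrative sketch of a spiraled (rotated) two-cluster design like the ERA's.
    # The cyclic pairing below is an assumption for illustration only; the actual
    # PISA cluster-to-version assignment is set by the international consortium.
    CLUSTERS = ["C1", "C2", "C3", "C4", "C5"]   # five 20-minute item clusters

    def build_versions(clusters):
        """Pair each cluster with the next one (wrapping around), so every cluster
        appears in exactly two versions and no student sees all of the items."""
        n = len(clusters)
        return [(clusters[i], clusters[(i + 1) % n]) for i in range(n)]

    def assign_version(student_index, versions):
        """Rotate the versions across students in testing order."""
        return versions[student_index % len(versions)]

    if __name__ == "__main__":
        versions = build_versions(CLUSTERS)   # five versions, 40 minutes of items each
        for i in range(7):                    # e.g., the first seven sampled students
            print(f"Student {i + 1}: clusters {assign_version(i, versions)}")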

A.2 Purposes and Uses of Data


Governments and the general public want solid evidence of educational outcomes. The Organization for Economic Cooperation and Development (OECD) has therefore launched an extensive program for producing policy-oriented and internationally comparable indicators of student achievement on a regular basis and in a timely manner. How well are schools preparing students to meet the challenges of the future? Parents, students, the public and those who run education systems need to know whether children are acquiring the necessary skills and knowledge. Are they prepared to become tomorrow's workers, to continue learning throughout life, to analyze, to reason and communicate ideas effectively?


The results of the OECD tests, published every three years along with related indicators, will allow national policy makers to compare the performance of their education systems with those of other countries. Further, the results will provide a basis for better assessment and monitoring of the effectiveness of education systems at national levels.


Through PISA, OECD will produce three types of indicators:


  1. Basic indicators, providing a baseline profile of the knowledge, skills, and competencies of students;

  2. Contextual indicators, showing how such skills relate to important demographic, social, economic and educational variables; and

  3. Trend indicators that emerge from the on-going, cyclical nature of the data collection.

Consequences of Not Collecting the Data


Over the last few decades, the world has become accustomed to hearing about gross domestic product, consumer price indices, unemployment rates, and similar terms in news reports comparing national economies. These economic indicators allow complex economic activity to be discussed and debated using well-respected measures of that activity. Education policymakers and the general public have a similar need to discuss what is going on in the field of education using indicators based on valid and reliable data. Outcome data from PISA allow U.S. policymakers to gauge U.S. performance in relation to other countries, as well as to monitor progress over time in comparison with those countries. Without these kinds of data, U.S. policymakers will be limited in their ability to gain insight into how the educational performance and practices of other nations compare with those of the United States, and will have lost the investment made in previous cycles in measuring trends.


A.3 Improved Information Technology (Reduction of Burden)


The PISA 2009 design and procedures are prescribed internationally, and data collection involves paper-and-pencil responses for all components except the Electronic Reading Assessment (ERA). In the U.S., the ERA will be implemented using computers carried into schools by the data collection staff.


A.4 Efforts to Identify Duplication

A number of international comparative studies already exist to measure achievement in mathematics, science, and reading, including the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS). The Adult Literacy and Lifeskills Survey (ALL) measures the literacy skills of adults. In addition, the United States has been conducting its own national surveys of student achievement for more than 30 years through the National Assessment of Educational Progress (NAEP) program. PISA differs from these studies in several important ways:


Content. PISA is designed to measure “literacy” broadly while other studies, such as TIMSS and NAEP, have a strong link to curriculum frameworks and seek to measure students’ mastery of specific knowledge, skills, and concepts. The content of PISA is drawn from broad content areas, such as understanding, using, and reflecting on written information for reading, in contrast to more specific curriculum-based content such as decoding and literal comprehension.


Tasks. In addition to the differences in purpose and age coverage between PISA and other international comparative studies, PISA differs from other assessments in what students are asked to do. PISA focuses on assessing students’ knowledge and skills in reading, mathematics, and science literacy in the context of everyday situations. That is, PISA emphasizes the application of knowledge to everyday situations by asking students to perform tasks that involve interpretation of real-world materials as much as possible. A study based on expert panels’ reviews of mathematics and science items from PISA, TIMSS, and NAEP reports that PISA items require multi-step reasoning more often than either TIMSS or NAEP. The study also shows that PISA mathematics and science literacy items often involve the interpretation of charts and graphs or other “real world” material. These tasks reflect the underlying assumption of PISA: as 15-year-olds begin to make the transition to adult life, they need to know not only how to read or how to use particular mathematical formulas and scientific concepts, but also how to apply this knowledge and these skills in the many different situations they will encounter in their lives. The ERA adds a further “real world” task, that of processing and navigating within computer-based text material.


Age-based sample. The goal of PISA is to represent outcomes of learning rather than outcomes of schooling. By placing the emphasis on age, PISA intends to show not only what 15-year-olds have learned in school, but also what they have learned outside of school and over the years, not just in a particular grade. PISA thus seeks to show the overall yield of an educational system and the cumulative effects of all learning experiences. Focusing on age 15 provides an opportunity to measure broad learning outcomes while all students across the many participating nations are still required to be in school. Finally, because years of education vary among countries, choosing an age-based sample makes comparisons across countries somewhat easier.


Information collected. The kind of information PISA collects also reflects a policy purpose slightly different from the other assessments. PISA collects only background information related to general school context and student demographics. This differs from other international studies such as TIMSS, which collects background information related to how teachers in different countries approach the task of teaching and how the approved curriculum is implemented in the classroom. The TIMSS video studies further extend this work by actually capturing images of instruction across countries. The results of PISA will certainly inform education policy and spur further investigation into differences within and between countries, but PISA is not intended to provide direct information about improving instructional practice in the classroom. The purpose of PISA is to generate useful indicators to benchmark performance and inform policy.


Thus, while some studies in the United States collect similar, though not identical, kinds of information (e.g., NAEP), the data from those studies cannot be substituted for the information collected in PISA. Further, PISA 2009 will assess 15-year-olds (in the United States), and this group is typically not represented in existing data collections on academic achievement. In order to participate in the international study, the United States must agree to administer the same core instruments that will be administered in the other countries. Because the items measuring academic achievement have been developed with intensive international coordination, any changes to the PISA 2009 instruments would also require international coordination.


Hence, alternate sources for these data do not exist. This study represents the U.S. participation in an international study involving 68 nations in the PISA 2008 field trial. The United States must collect the same information at the same time as the other nations for purposes of making international comparisons. No other study in the United States will be using the instruments developed by the international sponsoring organization, and thus no alternative sources of comparable data are available.


A.5 Minimizing Burden for Small Institutions

The school sample for PISA will contain small, medium, and large schools from a wide range of school types, including private schools. It is necessary to include small and private schools so that the students attending such schools are represented in the data collection. Approximately 35 schools will be asked to participate in the field trial and 150 schools in the main study. Burden will be minimized wherever possible for all participating institutions. For example, the schools to be assessed in the PISA field trial will avoid overlap with schools selected for NAEP. Student burden will be reduced in the field trial through the use of three forms of the student background questionnaire, which will allow us to test new questions, or differing versions of questions, to see which are most effective without adding to administration time. In addition, contractor staff will undertake all test administration and will assist with parental notification, sampling, and other organizational tasks as much as possible within each school.


A.6 Frequency of Data Collection

This request to OMB is for the PISA 2008 field trial and PISA 2009 main study. PISA is conducted on a three-year cycle as prescribed by the international sponsoring organization, and adherence to this schedule is necessary to establish consistency in survey operations among the many participating countries.


A.7 Special Circumstances

No special circumstances exist in the data collection plan for PISA 2009 that would necessitate unique or unusual methods of data collection. None of the special circumstances identified in the Instructions for Supporting Statement applies to the PISA 2009 study.


A.8 Consultations Outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The nature of the study requires this, because international studies typically are developed as a cooperative enterprise involving all participating countries. PISA 2009 is being developed and operated, under the auspices of the OECD, by a consortium of organizations. Key persons from these organizations who are involved in the design, development and operation of PISA 2009 are listed below.

Organization for Economic Cooperation and Development

Andreas Schleicher

Indicators and Analysis Division

2, rue André Pascal

75775 Paris Cedex 16

FRANCE

Tel: +33 (1) 4524 9366

Fax: +33 (1) 4524 9098


Australian Council for Educational Research

Ray Adams, Project Director

ACER

19 Prospect Hill Road

CAMBERWELL VIC 3124

AUSTRALIA

Tel: 613 92775555

Fax: 613 92775500


Westat

Keith Rust, Director of Sampling

1650 Research Boulevard

Rockville, Maryland 20850-3129

USA

Tel: 301 251 8278

Fax: 301 294 2034


A.9 Payments or Gifts to Respondents

Currently, the minimum response rate targets required by the OECD are 85 percent of originally sampled schools and 80 percent of students, while NCES's minimum response rate target is 85 percent at the student level. These high response rates (85 percent at the school level and 85 percent at the student level) are becoming increasingly difficult to achieve in school-based studies; none of PISA 2000, 2003, or 2006 reached the school response rates targeted for the study. Thus, the contractor for PISA 2009 proposed to test the following incentive system during the field trial to determine whether it would be effective in generating the required response rates. This system is based on, but not identical to, the one used in PISA 2006.


Incentives for school coordinators. The role of the school coordinator is critical to the success of the study. The coordinator is expected to: coordinate logistics with the data collection contractor; supply a list of eligible students for sampling to the data collection contractor; communicate with teachers, students, and parents about the study to encourage participation; assist the test administrator in ensuring that the sampled students attend the testing session; and assist the test administrator in arranging make-up sessions as needed. The study will provide $150 to school coordinators for those schools that participate and meet acceptable rates of student response in the field trial. The rate for the main study will depend on whether the ERA is part of the data collection plan.


Incentives for students. The study will provide a $20 incentive for students participating in the assessment during regular school hours. This is higher than the rate used in PISA 2006 due to inflation and increases in the Federal minimum wage. Students participating in the ERA during regular school hours will receive an additional $10 incentive. Students participating in the assessment during non-school hours will be offered a $50 incentive (to compensate for missed activities, work, etc.) with an additional $15 incentive for those participating in the ERA. Incentives for students will only be provided with the explicit permission of the school principal.


Incentives for schools. In order to meet the minimum school response rates mandated by the international governing board, we believe it is necessary to offer school incentives. We plan to offer each school a $300 incentive for participating in the field trial assessment. The rate for the main study will depend on whether the ERA is part of the data collection plan.

A.10 Assurance of Confidentiality


PISA 2009 will conform to all relevant federal regulations – specifically, the Privacy Act of 1974 (5 U.S.C. 552a), the Education Sciences Reform Act of 2002, the National Education Statistics Act of 1994, the USA Patriot Act of 2001, the E-Government Act of 2002, and the NCES Standards and Policies. Because no individually identifying information is kept, the data collection conforms to the requirements under which a pledge of confidentiality can be made under the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). Therefore, the following pledges will be used in the data collection:


For the school letter --

The information you provide about yourself, your school staff and students will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions in Public Law 107-347 and the Confidentiality provisions in PL 107-279, Sec. 183, responses will be kept confidential and will not be disclosed in identifiable form. By law, everyone working on this NCES survey is subject to a jail term, a fine, or both if he or she willfully discloses ANY information that could identify you.


For the front of the survey forms --

The information you provide about yourself will be used for statistical purposes only. Your responses will be kept confidential and will not be disclosed in identifiable form. By law, everyone working on this NCES survey is subject to a jail term, a fine, or both if he or she willfully discloses ANY information that could identify you.


The plan for maintaining confidentiality includes signing confidentiality agreements and notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers. Also included in the plan is personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single data base manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility.

Letters and other materials will be sent to parents and school administrators describing the voluntary nature of this survey. The material sent will include a brochure to describe the study and to convey the extent to which respondents and their responses will be kept confidential. (Copies of letters are included in the Appendix).


Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file. A separate locator database for these sample members will be maintained in a secure location.


A.11 Sensitive Questions


Federal regulations governing the administration of questions that might be viewed by some as “sensitive” because they request personal or private information require (a) clear documentation of the need for such information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of participation in the study, and (c) assurances of confidential treatment of responses.


PISA 2009 does not include questions usually considered to be of a highly sensitive nature, such as items concerning religion, substance abuse, or sexual activity. However, the field trial questionnaires proposed by the international coordinators do include a few items that may be categorized as being included in the topics identified by the Protection of Pupil Rights Act (PPRA). All items are being reviewed by NCES, and items that ask for information covered by PPRA will be excluded from the U.S. questionnaire.


Several other items in the background questionnaires may be considered sensitive by some respondents, even though they do not fall into any of the PPRA domains. These items relate to the socioeconomic context of the school, parents' education and occupation, family possessions, and students' belongings. Research indicates that the constructs these items represent are strongly correlated with academic achievement, and they have been used in the three previous cycles of PISA (2000, 2003, and 2006). Therefore, the items are essential for the anticipated analyses and for retaining consistency in planned comparisons with the international data.


A.12 Estimates of Hour Burden for Information Collection


Estimates of response burden for students participating in the PISA field trial and main study data collection activities are shown in Exhibit 1. The average response burden is based on a 30-minute questionnaire and 5 minutes of attitudinal items included in the cognitive assessment booklet.



Exhibit 1. Estimated burden on student respondents for field trial and main study

Students              Sample    Expected        Number of      Average Burden/        Total Burden
                                Response Rate   Respondents    Response* (minutes)    (hours)

Field trial (2008)    1,412     85%             1,200          35                     700

Main study (2009)     5,294     85%             4,500          35                     2,625

TOTAL                 6,706                     5,700                                 3,325

*Student questionnaire and attitudinal items in cognitive assessment booklet

We have used $5.85 per hour to estimate the cost to students. The dollar cost for students is estimated at $4,095 for the field trial and $15,356 for the main study.
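
The figures in Exhibit 1 and the cost estimate above follow directly from the quoted sample sizes, response rate, per-response time, and hourly rate. The short Python sketch below reproduces that arithmetic using only the figures quoted in the text; rounding the respondent counts to the nearest hundred is an assumption about how the exhibit was prepared.

    # Sketch of the student burden and cost arithmetic in Exhibit 1, using only the
    # figures quoted in the text; rounding respondents to the nearest hundred is assumed.
    MINUTES_PER_RESPONSE = 35   # 30-minute questionnaire + 5 minutes of attitudinal items
    RESPONSE_RATE = 0.85        # expected student response rate
    HOURLY_RATE = 5.85          # dollars per hour used to value student time

    def student_burden(sample):
        """Return (respondents, burden hours, dollar cost) for a student sample."""
        respondents = int(round(sample * RESPONSE_RATE, -2))
        hours = respondents * MINUTES_PER_RESPONSE / 60
        return respondents, hours, hours * HOURLY_RATE

    for label, sample in [("Field trial (2008)", 1_412), ("Main study (2009)", 5_294)]:
        respondents, hours, cost = student_burden(sample)
        print(f"{label}: {respondents:,} respondents, {hours:,.0f} hours, ${cost:,.0f}")
    # Field trial (2008): 1,200 respondents, 700 hours, $4,095
    # Main study (2009): 4,500 respondents, 2,625 hours, $15,356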

Exhibit 2 presents information on response burden for the school administrator questionnaire. School administrators will be asked to complete a 30-minute questionnaire. Assuming a $50 hourly cost, the cost to school administrator respondents is $875 in the field trial and $3,750 in the main study. In addition to the burden on respondents, burden will also be placed on school coordinators. We expect that school coordinators will spend at least 8 hours supplying student lists for sampling, notifying teachers and students about the study, and assisting test administrators in making sure that sampled students attend the scheduled testing sessions.

Exhibit 2. Estimate of burden for school administrators

School administrators   Sample   Expected        Number of     Average Burden/      Range of Response   Total Burden
                                 Response Rate   Respondents   Response (minutes)   Times (minutes)     (hours)

Field trial (2008)      41       85%             35            30                   20-40               17.5

Main study (2009)       176      85%             150           30                   20-40               75

TOTAL                   217                      185                                                    92.5

Exhibit 3 presents an estimate of the level of burden for coordinators. Note that this burden is not included in the total since it does not include the completion of any questionnaires.


Exhibit 3. Estimate of burden for school coordinators

School coordinators    Number of       Average Burden   Range of   Total Burden
                       Coordinators    (hours)          Hours      (hours)

Field trial (2008)     35              8                6-10       280

Main study (2009)      150             8                6-10       1,200

TOTAL                  185                                         1,480

School coordinators may be instructional or non-instructional staff at the school. We estimate an average coordinator salary of $30 per hour; thus, the cost of coordinator time will be approximately $8,400 for the field trial and $36,000 for the main study.
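
The administrator and coordinator figures in Exhibits 2 and 3, and the associated cost estimates, follow from the same kind of arithmetic. The Python sketch below reproduces it using only the quantities quoted above; the assumption that one administrator respondent and one coordinator are expected per participating school reflects the counts shown in the exhibits.

    # Sketch of the school administrator (Exhibit 2) and coordinator (Exhibit 3)
    # burden and cost arithmetic, using only the figures quoted in the text.
    ADMIN_MINUTES = 30    # length of the school administrator questionnaire
    ADMIN_RATE = 50.0     # assumed hourly cost for administrators
    COORD_HOURS = 8       # average coordinator time per school
    COORD_RATE = 30.0     # estimated average coordinator salary per hour

    # One administrator respondent and one coordinator expected per participating school.
    for label, schools in [("Field trial (2008)", 35), ("Main study (2009)", 150)]:
        admin_hours = schools * ADMIN_MINUTES / 60
        coord_hours = schools * COORD_HOURS
        print(f"{label}: administrators {admin_hours:g} h (${admin_hours * ADMIN_RATE:,.0f}), "
              f"coordinators {coord_hours:,} h (${coord_hours * COORD_RATE:,.0f})")
    # Field trial (2008): administrators 17.5 h ($875), coordinators 280 h ($8,400)
    # Main study (2009): administrators 75 h ($3,750), coordinators 1,200 h ($36,000)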


A.13 Total Annual Cost Burden


Other than the burden associated with completing the PISA questionnaires and assessments (estimated above in Section A.12), the field trial and main study impose no additional cost to respondents.

A.14 Annualized Cost to Federal Government


The PISA 2009 data collection involves annualized costs to the Federal Government of an estimated $1,110,211 for the field trial (covering a one-year performance period), and an estimated $1,411,366 annualized for the main study for each year of a three-year performance period. These figures do not include costs for administration of the ERA, which are in the process of being negotiated between the government and the contractor.


A.15 Program Changes or Adjustments


There are few changes to PISA 2009 from the previous rounds of data collection. The main change is that the assessment will focus on reading literacy during this cycle. The result is that the bulk of the items will be reading items and that science and math will be the secondary components. The inclusion of the Electronic Reading Assessment (ERA) in the field trial also represents a significant change. There are also minor changes in wording to some of the questionnaire items, and questions that focused on student attitudes toward science or math now focus on attitudes toward reading. A new contractor, Windwalker Corporation, will conduct the field trial and main study for PISA 2009.

A.16 Plans for Tabulation and Publication


The PISA field trial is designed to provide a statistical review of the performance of items on the assessments and questionnaires in preparation for the main data collection. The Australian Council for Educational Research (ACER), the international sponsoring organization, will provide the international instruments to be used in the field trial and will report to the participating countries on the results of the field trial. Based on the field trial results ACER, with input and agreement from the participating countries, will make final revisions in the survey instruments, materials, and documents in preparation for the main study.


For the main study in 2009, an analysis of the U.S. and international data will be undertaken to provide an understanding of the U.S. national results in relation to the international results. Based on the analyses of the international data set proposed by ACER, and on the need for NCES to report results from the perspective of an American constituency, a plan is being prepared for the statistical analysis of the U.S. national data set in comparison with the international data set. Analysis of the data will include examination of the reading, mathematics, and science knowledge and awareness of U.S. students in relation to their international counterparts, and of the relationships between reading, mathematics, and science knowledge and awareness and student background.


All reports and publications will be coordinated with the release of information from the international organizing body. Planned publications and reports for the PISA 2009 main study include the following:


General Audience Report. This report will present information on the status of reading, mathematics, and science education among students in the United States in comparison to their international peers, written for a non-specialist, general American audience. This report will present the results of analyses in a clear and non-technical way, conveying how U.S. students compare to their international peers, and what factors, if any, may influence the U.S. results.


Survey Operations/Technical Report. This report will document the procedures used in the main study (e.g., sampling, recruitment, data collection, scoring, weighting, and imputation) and describe any problems encountered and the contractor’s response to them. The primary purpose of the main study survey operations/technical report is to document the steps taken by the United States in undertaking and completing the study.


Non-response Bias Analysis Report. This report will present data assessing the presence and extent of bias due to nonresponse. As required by NCES standards, selected characteristics of responding students and schools will be compared with those of nonresponding students and schools, along dimensions for which data are available for the nonresponding units, to determine whether and how the two groups differ.


Electronic versions of each publication are generally made available on the NCES website. Schedules for tabulation and publication of PISA 2009 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:


October - December 2007

Prepare OMB clearance documents, data collection manuals, forms, assessment materials, questionnaires for field trial



December 2007-February 2008

Gain cooperation of states, districts, schools for field trial



March – July 2008

Select student samples and collect field trial data



July 2008

Deliver raw data to international sponsoring organization



August – September 2008

Receive Field Trial Report from international sponsors, revise OMB package



September 2008–September 2009

Prepare for the main study phase/ recruit schools



September 2009–December 2009

Collect main study data



March - April 2010

Receive final data files from international sponsors



August - December 2010

Produce General Audience Report, Survey Report, and Technical Report for the United States


A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.


A.18 Exceptions to Certification Statement

No exceptions are requested to the "Certification for Paperwork Reduction Act Submissions" of OMB Form 83-I.






