




OECD

PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT
(PISA 2012)



REQUEST FOR OMB CLEARANCE

OMB# 1850-0755 v.10



SUPPORTING STATEMENT PART A









Prepared by:


National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






Submitted: September 10, 2010



PREFACE

The Program for International Student Assessment (PISA) is an international assessment of 15-year-old students that focuses on mathematics, science, and reading literacy. PISA was first administered in 2000 and is conducted every 3 years. The fifth phase of PISA, PISA 2012, is being administered at a time of increasing interest, both worldwide and in the United States, in how well schools are preparing students to meet the challenges of the future and in how students’ performance compares with that of their peers in other countries. Participation in PISA has grown significantly since the initial survey in 2000: 43 countries/jurisdictions1 participated in 2000, 41 in 2003, 57 in 2006, 66 in 2009, and 74 are expected to participate in 2012. The United States has participated in all of the previous cycles and will participate in 2012 in order to track trends and to compare the performance of U.S. students with that of students in other countries.

PISA 2012 is sponsored by the Organization for Economic Cooperation and Development (OECD). In the United States, PISA 2012 is being conducted by the National Center for Education Statistics (NCES) of the Institute of Education Sciences, U.S. Department of Education. PISA is a collaboration among the participating countries, the OECD, and a consortium of various international contractors, referred to as the PISA International Consortium, led by the Australian Council for Educational Research (ACER).

In each administration of PISA, one of the subject areas (mathematics, science, or reading literacy) is the major domain and has the broadest content coverage, while the other two subjects are the minor domains. Other areas, such as general problem solving, may also be assessed. PISA emphasizes functional skills that students have acquired as they near the end of mandatory schooling (age 15). Moreover, PISA assesses students’ knowledge and skills gained both in and out of school environments. This focus on the “yield” of education in and out of school makes PISA different from other international assessments, such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), which are closely tied to school curriculum frameworks and assess younger, grade-based populations.

PISA 2012 will focus on mathematics literacy as the major domain; reading and science literacy will be assessed as minor domains. All three will be assessed through a paper-and-pencil assessment, and there will also be computer-based assessments in mathematics and reading. In addition, there will be a general problem-solving assessment (computer-based only) and an assessment of financial literacy (paper-and-pencil only). PISA 2012 represents the second cycle with mathematics literacy as the major domain (PISA 2003 was the first). It is also the second cycle to include an assessment of general problem solving (2003 was the first), although the previous administration was a paper-and-pencil assessment. This will be the first PISA assessment of financial literacy. The paper-and-pencil mathematics, science, and reading literacy assessments and the computer-based problem-solving assessment are core components of PISA 2012, and all countries are required to participate in them. The computer-based mathematics and reading assessments and the financial literacy assessment are international options.

In addition to the cognitive assessments described above, PISA 2012 will include questionnaires administered to assessed students and school principals.

To prepare for the main study in 2012, PISA countries will conduct a field trial in the spring of 2011. The purpose of the field trial is to collect data on assessment items and questionnaires and to test school recruitment, data collection, and data management procedures in preparation for the main study. In the United States, the field trial will also be used to help determine which international options to participate in.

The U.S. PISA field trial data collection will occur from March through May 2011. The international PISA 2012 Field Trial Guidelines require that each assessment item be field-tested with 200 students. For countries planning to participate in both problem solving and the computer-based mathematics and reading assessments, this means that a minimum of 1,800 students must take a test on computer in the field trial. Assuming an 80 percent student response rate, the field trial will have 1,925 students assessed on computer (605 taking both paper-and-pencil and computer + 1,320 taking computer only). The need for a field trial sample of this size was emphasized in a memorandum sent to all PISA 2012 national centers by the PISA Consortium in late May.

The United States plans to recruit 124 schools for the field trial, with the expectation that 80 will participate. It is an international requirement that student sampling be carried out in at least a portion of the field trial schools just as it will be for the main survey. Thus, we anticipate that in 36 of the schools, 42 students each will be sampled for the paper-and-pencil field trial assessment (n=1,512). Half of these students will then be subsampled to take the computer-based field trial material (n=756). This mirrors the student sampling plan for the main survey and provides an adequate sample for the paper-and-pencil assessments. However, it will provide only about 25-30 percent of the students needed for trialing the computer-based material. We therefore plan to administer computer-based assessments in the other 44 schools, completing assessments with an average of 30 students per school (n=1,320). In all, about 1,925 (1,320 + 0.80*756) students will take the computer-based assessment during the field trial.
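This arithmetic can be verified with the short illustrative Python sketch below. All constants are taken from the paragraph above; following the formula given there, the 1,320 computer-only assessments are treated as completions by design (an average of 30 completed per school), and only the paper-and-pencil subsample is discounted by the assumed 80 percent response rate.

    # Illustrative reconstruction of the field trial sample arithmetic.
    PP_SCHOOLS = 36            # schools with paper-and-pencil sampling
    PP_PER_SCHOOL = 42         # students sampled per paper-and-pencil school
    CBA_ONLY_SCHOOLS = 44      # schools administering computer-based tests only
    CBA_PER_SCHOOL = 30        # average completed assessments per such school
    RESPONSE_RATE = 0.80       # assumed student response rate

    pp_sampled = PP_SCHOOLS * PP_PER_SCHOOL        # 1,512 students sampled
    cba_subsample = pp_sampled // 2                # 756 subsampled for computer
    cba_only = CBA_ONLY_SCHOOLS * CBA_PER_SCHOOL   # 1,320 completed by design

    # Expected computer-assessed students: 1,320 + 0.80 * 756, about 1,925
    expected_cba = cba_only + RESPONSE_RATE * cba_subsample
    print(pp_sampled, cba_subsample, round(expected_cba))  # 1512 756 1925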

In addition to the field trial, NCES plans to conduct small panels and focus groups with principals and students to examine the challenges of recruitment and ways to increase participation and knowledge of PISA. A separate OMB request has been submitted for these activities.

The U.S. PISA main study will be conducted from September through November 2012. If the United States participates only in the core components of PISA, the main study will involve a nationally representative sample of 5,600 students in the target population in 165 schools. Each student will be administered a 2-hour paper-and-pencil assessment that includes some combination of mathematics, reading, and science items, plus a 30-minute student questionnaire; 14 students per school will return for a second session to take a 40-minute computer-based assessment of general problem solving. The school principal of each sampled school will complete a 30-minute questionnaire.

The United States also may opt to participate in (a) the computer-based assessment of mathematics and reading; (b) the financial literacy assessment; or (c) both. Under option (a), in each school a total of 18 students who participated in the first session would take a second, 40-minute computer-based session that would include reading and mathematics as well as general problem solving. Under option (b), an additional 8 students per school would be sampled (increasing the overall sample size to 6,800 students); these students would participate only in the paper-and-pencil session (the financial literacy assessment would be folded into the larger mathematics, reading, and science paper-and-pencil assessment). Under option (c), an additional 8 students per school would be sampled (overall sample size = 6,800 students) for the financial literacy requirement, and in each school a total of 18 students who participated in the first session would take the second, 40-minute computer-based session.

The U.S. decision about which international options, if any, to participate in will depend on the results of the field trial recruitment and operations and on analysis of the assessment data, as well as funding considerations. Altogether, there are six possible scenarios for the main study; these are described in Part B of this document. OMB approval for the field trial is requested at this time so that recruiting activities can begin in September 2010 in order to meet the international data collection schedule for the spring 2011 field trial.

In this clearance package, NCES requests OMB’s approval for:

  1. recruiting for the 2011 field trial and 2012 main study;

  2. conducting the 2011 field trial data collection; and

  3. a waiver of the 60-day Federal Register notice for the 2012 main study data collection clearance.

The international schedule calls for field trial data collection from March-May 2011, recruiting for the main study beginning in the fall of 2011 (at least 12 months in advance of the data collection), and main study data collection in the fall of 2012. Field trial recruitment materials, including letters to state and district officials and school principals, text for a PISA field trial brochure, and “Frequently Asked Questions” to be provided to recipients of the recruitment letters are included in Appendix A. Parental consent letters and related materials for the field trial are included in Appendix B. Main study materials will be based on these but will reflect the main study design and components to be administered.

It is important to note that because PISA is a collaborative international study, the U.S. administration of PISA operates under some constraints, particularly around the schedule and the availability of instruments, which are negotiated internationally. For example, at the time this package is submitted, the final international versions of the student and school questionnaires are not available from the international contractor. Instead, NCES has included the 2003 PISA student and school questionnaires administered in the United States in Appendix C of this document. The PISA 2012 student and school questionnaires are expected to be very much like the 2003 versions because mathematics was also the major domain in PISA 2003, and thus the content questions focus on mathematics, as they will in PISA 2012. There will be some refinement for PISA 2012, though, so information about how the 2012 versions are likely to differ is included in Appendix C. The 5 minutes’ worth of background items to be administered as part of the financial literacy assessment are not included, however, because they have not yet been developed; the financial literacy assessment is new for PISA.

Further, the main study design and burden will be determined after the field trial when the United States determines in which international options it will participate. This clearance package, however, describes the study design and presents burden estimates for each of the possible options.

In submitting this package, NCES is seeking permission to submit the final field trial questionnaires and study design as a change request in January 2011, once the field trial questionnaires have been finalized. Any new information about the instruments and study design will be added to this clearance package prior to the publication of the 30-day notice and OMB’s review of the package materials.

Further, in order to begin recruiting schools for the main study by September 2011, in May 2011 we will submit a change-request memo to OMB that will provide the final main study recruitment materials and parental consent letters, summarize the results of the field trial and the options the United States will participate in during the main study, document changes made to the instruments and procedures for the main study, and detail the resulting respondent burden estimates for the main study data collection.

Lastly, in spring 2012 we will submit a clearance package, with a 30-day notice published in the Federal Register, which will include the final main study instruments for data collection in the fall of 2012.

A. JUSTIFICATION

A.1 Importance of Information

As part of a continuing cycle of international education studies, the United States, through the National Center for Education Statistics (NCES), is currently and in the coming years participating in several international assessments and surveys. The Program for International Student Assessment (PISA), sponsored by the Organization for Economic Cooperation and Development (OECD), is one of these studies.

In light of growing concerns about international economic competitiveness, the changing face of our workplace, and the expanding international marketplace in which we trade, knowing how our students and adults compare with their peers around the world has become more prominent than ever. Nationwide, interest in understanding what other nations are doing to further the educational achievement of their populations has increased, beyond simple comparisons.

Data at critical points during the education careers of our students will help inform policymakers in their efforts to guide and restructure the American education system. These critical points may occur during primary, secondary, or tertiary education, as well as in adult education and training programs. Consequently, generating comparative data about students in school, students at the end of schooling, and adults in the workplace and in the community has become an important focus for NCES.

PISA 2012 is part of the larger international program that NCES has actively participated in through collaboration with, and representation at, the OECD, the Asia-Pacific Economic Cooperation (APEC), and the International Association for the Evaluation of Educational Achievement (IEA). Collaboration with Statistics Canada, Eurostat, and ministries of education throughout the world helps to round out the portfolio of data NCES compiles.

Through this active participation, NCES has sought to strengthen the quality, consistency, and timeliness of international data. To continue this effort, the United States must follow through with well-organized and executed data gathering activities within our national boundaries. These efforts will allow NCES to build a data network that can provide the information necessary for informed decision-making on the part of national, state, and local policy makers.

PISA measures students' knowledge, skills, and competencies primarily in three subject areas: reading, mathematics, and science literacy. The overall strategy is to collect in-depth information on student capabilities in one of these three domains every 3 years, so that detailed information on each becomes available every 9 years. During each 3-year survey cycle, the major focus is on one content domain, with a minor focus on the other two. The major focus for the data collection in 2012 is on mathematics literacy, with a minor focus on science and reading. The 2012 data collection will be the second time the focus has been on mathematics literacy, thus allowing the first in-depth comparison of performance in mathematics over time. The target population for this project will be a nationally representative sample of 15-year-old students. PISA 2012 also includes computer-based assessments in mathematics, reading, and general problem solving. In addition to enabling PISA to measure parts of the domains that cannot be measured through traditional paper-and-pencil assessments, the inclusion of computer-based assessments in 2012 is part of PISA’s transition to being entirely computer-based in the future.

Over the last few decades, the world has become accustomed to hearing about gross domestic products, consumer price indices, unemployment rates, and similar terms in news reports comparing national economies. These economic indicators allow complex economic activity to be discussed and debated using well-respected measures of that activity. Education policymakers and the general public have a similar need to discuss what is going on in the field of education with indicators that are based on valid and reliable data and other information. Outcome data from PISA allow U.S. policymakers to gauge U.S. performance in relation to other countries and to monitor progress over time in comparison with those countries. The results of the PISA assessments, published every 3 years along with related indicators, allow national policymakers to compare the performance of their education systems with those of other countries. Further, the results provide a basis for better assessment and monitoring of the effectiveness of education systems at the national level. Without these kinds of data, U.S. policymakers would be limited in their ability to gain insight into the educational performance and practices of other nations as they compare with the United States and would lose the investment made in previous cycles in measuring trends.

The Australian Council for Educational Research (ACER), under the auspices of the OECD, is responsible for the international components of this project. Westat, the data collection contractor for the United States, will work directly with ACER and the U.S. PISA National Project Managers at NCES.

A.2 Purposes and Uses of Data

Governments and the general public want solid evidence of education outcomes. In the late 1990s, the OECD launched an extensive program for producing policy-oriented and internationally comparable indicators of student achievement on a regular basis and in a timely manner. PISA is at the heart of this program. How well are schools preparing students to meet the challenges of the future? Parents, students, the public, and those who run education systems need to know whether children are acquiring the necessary skills and knowledge, whether they are prepared to become tomorrow's workers, to continue learning throughout life, to analyze, to reason, and to communicate ideas effectively.

The results of OECD’s PISA, published every 3 years along with related indicators, allow national policy makers to compare the performance of their education systems with those of other countries. Further, the results provide a basis for better assessment and monitoring of the effectiveness of education systems at national levels.

Through PISA, OECD produces three types of indicators:

  • Basic indicators that provide a baseline profile of the knowledge, skills, and competencies of students;

  • Contextual indicators that show how such skills relate to important demographic, social, economic, and education variables; and

  • Trend indicators that emerge from the ongoing, cyclical nature of the data collection.



PISA 2012 Components

The primary focus for the assessment and questionnaires for PISA 2012 will be on mathematics literacy. The PISA mathematics framework defines mathematics literacy as:

an individual’s capacity to recognize, do and use mathematics, including to reason mathematically in a variety of contexts, and to identify the role that mathematics plays in the world by describing, modeling, explaining, and predicting phenomena. Mathematical literacy is a continuum—thus more mathematically literate individuals are better able to use mathematics and mathematical tools to make the well-founded judgments and decisions required by constructive, engaged and reflective citizens.

As in all administrations of PISA, reading and science literacy will also be assessed, although they will be “minor domains” in 2012. In addition, PISA 2012 includes computer-based assessments and a new financial literacy assessment. Questionnaires will be administered to students and school principals. As summarized in Table A-1, some components of PISA 2012 are “core,” in which countries are required to participate, while other components are “international options.” The United States will administer all components during the field trial and, following the field trial, will determine in which components to participate in the main study.

Table A-1. Assessment components of PISA 2012: Core and international options

Assessment Mode    | Core                                        | International Options
Paper-and-pencil   | Mathematics, science, and reading literacy  | Financial literacy
Computer-based     | General problem solving                     | Mathematics and reading



The PISA 2012 instruments and possible designs are described below.

Assessment instruments

Paper-and-pencil assessment. The PISA field trial will include a 2-hour paper-and-pencil assessment that includes primarily mathematics literacy items and also some financial literacy items. The main study will focus on mathematics, but will also include science and reading items as well as financial literacy if the United States participates in this option. Seven different test booklets will be used in the U.S. field trial. The main study in 2012 will consist of approximately 13 booklets with four 30-minute blocks per booklet. If financial literacy is administered, there will be a total of 15 booklets, 2 of which will include the financial literacy items and 5 minutes of background questionnaire items focused on financial literacy. There are fewer booklets in the field trial than in the main study because some mathematics items and all reading and science items that will be included in the main study booklets were used in prior rounds of PISA and do not need to be field-tested.

Computer-based assessments. The PISA field trial will also include a computer-based assessment, to be administered in a separate 40-minute session to a subsample of students who take the paper-and-pencil assessment. In the main study, if all three subjects (reading, mathematics, and general problem solving) are administered, there will be 11 forms of the computer-based assessment, each with two 20-minute blocks. A form could include problem-solving only, reading only, mathematics only, or a combination. If only problem-solving is administered, then there will be 8 forms, each comprising 2 problem-solving blocks.

Questionnaires

School questionnaire. A representative from each participating school will be asked to provide information on the basic demographics of the school population and more in-depth information on one or more specific issues (generally related to the content of the assessment in the major domain, mathematics). Basic information to be collected includes data on school location; measures of the socio-economic context of the school, including school resources, facilities, and community resources; school size; staffing patterns; instructional practices; and school organization. The in-depth information is designed to address a very limited selection of issues that are of particular interest and that focus primarily on the major content domain, mathematics. It is anticipated that the school questionnaire will take approximately 30 minutes to complete. It will be available to respondents on-line.

Student questionnaires. Participating students will be asked to provide basic demographic data and information pertaining to the major assessment domain, mathematics. Basic information to be collected includes demographics (e.g., age, gender, language, race, ethnicity); socio-economic background of the student (e.g., parental education, economic background); student's education career; and educational resources and their use at home and at school. Domain-specific information will include instructional experiences and time spent in school, as perceived by the students, and student attitudes. It is anticipated that the student questionnaire will take approximately 30 minutes to complete. In the field trial there will be multiple forms of the questionnaire in order to try out different items and item formats. In the main study there will be three forms of the student questionnaire with common and different items. While each student will complete a single questionnaire, multiple forms of the questionnaire will enable PISA to collect data on a broader set of variables.

Final versions of the PISA questionnaires have not yet been released by the international contractor, but the PISA 2012 student and school questionnaires are expected to be very much like the 2003 versions because mathematics was also the major domain in PISA 2003, and thus the content-related questions focus on mathematics, as they will in PISA 2012. There will be some refinements for PISA 2012, though, so information about how the 2012 versions are likely to differ (i.e., which items are likely to be deleted and possible new measures) is included in Appendix C together with the 2003 questionnaires. One purpose of the field trial is to try out alternative item formats to improve the quality and cross-national comparability of the data; items used in 2003 may therefore be presented in different formats (e.g., variations of existing Likert response scales), or 2003 constructs may be measured using new item formats (e.g., forced choice, situational judgment, vignettes), in order to find an optimal way of gathering the information.



A.3 Improved Information Technology (Reduction of Burden)

The PISA 2012 design and procedures are prescribed internationally. Data collection involves paper-and-pencil responses for the core mathematics, reading, and science assessment and the optional financial literacy assessment. In the computer-based mathematics, reading, and problem-solving assessments, to be administered in the United States for the first time in 2012, responses will be captured electronically. In the United States, during the field trial, the computer-based assessments will be implemented using laptops carried into schools by the data collection staff. During the field trial we will evaluate the feasibility of using school computers in the main study.

In PISA 2012, the school questionnaire will be available for the first time to school administrators on-line as well as in paper format. As in PISA 2009, the student questionnaire will be prepared in a scannable format; while the responses will be entered manually by respondents, they will be scanned to a data file.

A.4 Efforts to Identify Duplication

A number of international comparative studies already exist to measure achievement in mathematics, science, and reading, including the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS). The Program for the International Assessment of Adult Competencies (PIAAC), to be administered for the first time in 2011, will measure the reading literacy, numeracy, and problem-solving skills of adults. In addition, the United States has been conducting its own national surveys of student achievement for more than 30 years through the National Assessment of Educational Progress (NAEP) program. PISA differs from these studies in several important ways:

Content. PISA is designed to measure “literacy” broadly, while other studies, such as TIMSS and NAEP, have a strong link to curriculum frameworks and seek to measure students’ mastery of specific knowledge, skills, and concepts. The content of PISA is drawn from broad content areas, such as understanding, using, and reflecting on written information for reading, in contrast to more specific curriculum-based content such as decoding and literal comprehension. Moreover, PISA differs from other assessments in the tasks that students are asked to do. PISA focuses on assessing students’ knowledge and skills in reading, mathematics, and science literacy in the context of everyday situations. That is, PISA emphasizes the application of knowledge to everyday situations by asking students to perform tasks that involve interpretation of real-world materials as much as possible. A study based on expert panels’ reviews of mathematics and science items from PISA, TIMSS, and NAEP reports that PISA items require multi-step reasoning more often than either TIMSS or NAEP.2 The study also shows that PISA mathematics and science literacy items often involve the interpretation of charts and graphs or other “real world” material. These tasks reflect the underlying assumption of PISA: as 15-year-olds begin to make the transition to adult life, they need to know not only how to read, or know particular mathematical formulas or scientific concepts, but also how to apply this knowledge and these skills in the many different situations they will encounter in their lives. The computer-based assessments to be included in 2012 add further “real world” tasks, given the predominance of technology in the lives of young adults.

Age-based sample. The goal of PISA is to represent outcomes of learning rather than outcomes of schooling. By placing the emphasis on age, PISA intends to show what 15-year-olds have learned both in and outside of school over the years, not just in a particular grade. In contrast, NAEP, TIMSS, and PIRLS all use grade-based samples: NAEP assesses students in grades 4, 8, and 12; TIMSS assesses students in grades 4 and 8; and PIRLS assesses students in grade 4. PISA thus seeks to show the overall yield of an education system and the cumulative effects of all learning experience. Focusing on age 15 provides an opportunity to measure broad learning outcomes while all students are still required to be in school across the many participating nations. Finally, because years of education vary among countries, an age-based sample makes comparisons across countries somewhat easier than a grade-based sample.

Information collected. The kind of information PISA collects also reflects a policy purpose slightly different from the other assessments. PISA collects only background information related to general school context and student demographics. This differs from other international studies such as TIMSS, which collects background information related to how teachers in different countries approach the task of teaching and how the approved curriculum is implemented in the classroom. The results of PISA will certainly inform education policy and spur further investigation into differences within and between countries, but PISA is not intended to provide direct information about improving instructional practice in the classroom. The purpose of PISA is to generate useful indicators to benchmark performance and inform policy.

Alternate sources for these data do not exist. This study represents the U.S. participation in an international study involving 74 countries and jurisdictions in the PISA 2011 field trial. The United States must collect the same information at the same time as the other nations for purposes of making international comparisons. No other study in the United States will be using the instruments developed by the international sponsoring organization, and thus no alternative sources of comparable data are available.

In order to participate in the international study, the United States must agree to administer the same core instruments that will be administered in the other countries. Because the items measuring academic achievement have been developed with intensive international coordination, any changes to the PISA 2012 instruments would also require international coordination.

A.5 Minimizing Burden for Small Entities

No small entities are part of this sample. The school sample for PISA will contain small-, medium-, and large-size schools from a wide range of school types, including private schools, and burden will be minimized wherever possible for all institutions participating in the data collection. For example, the selection of schools to be assessed in the PISA field trial (spring 2011) will avoid overlap with the selection of schools for NAEP or TIMSS, which will also be in the field in the spring of 2011. Schools included in the field trial will not be included in the main study. Student burden will be reduced through the use of multiple forms of the student background questionnaire: in the field trial, this will allow PISA to test new background items or differing versions of items without adding to administration time, and in the main study it will enable PISA to gather a broad set of information without additional administration time. In addition, contractor staff will assume as much of the organization and test administration as possible within each school; they will undertake all test administration and will also assist with parental notification, sampling, and other tasks.

A.6 Frequency of Data Collection

This request to OMB is for the PISA 2011 field trial and PISA 2012 main study. PISA is conducted on a 3-year cycle as prescribed by the international sponsoring organization, and adherence to this schedule is necessary to establish consistency in survey operations among the many participating countries.

A.7 Special Circumstances

No special circumstances exist in the data collection plan for PISA 2012 that would necessitate unique or unusual methods of data collection. None of the special circumstances identified in the Instructions for Supporting Statement applies to the PISA 2012 study.

A.8 Consultations Outside NCES

The 60-day Federal Register notice was published on May 13, 2010 (75 FR, No. 92, p. 26943). No public comments have been received in response to this notice.

Consultations outside NCES have been extensive and will continue throughout the life of the project. The nature of the study requires this, because international studies typically are developed as a cooperative enterprise involving all participating countries. PISA 2012 is being developed and operated, under the auspices of the OECD, by a consortium of organizations. Key persons from these organizations who are involved in the design, development and operation of PISA 2012 are listed below.

Organization for Economic Cooperation and Development

Andreas Schleicher

Indicators and Analysis Division

2, rue André Pascal

75775 Paris Cedex 16

FRANCE

Tel: +33 (1) 4524 9366

Fax: +33 (1) 4524 9098


Australian Council for Educational Research

Ray Adams, Project Director

ACER

19 Prospect Hill Road

CAMBERWELL VIC 3124

AUSTRALIA

Tel: +613 92775555

Fax: +613 92775500


Westat

Keith Rust, Director of Sampling

1600 Research Boulevard

Rockville, Maryland 20850-3129

USA

Tel: 301 251 8278

Fax: 301 294 2034

A.9 Payments or Gifts to Respondents

Currently, the minimum response rate targets required by the OECD are 85 percent of original schools and 80 percent of students, while the NCES minimum response rate target is 85 percent at the student level. These high response rates are difficult to achieve in school-based studies. The United States failed to reach the school response rate targets in all previous PISA administrations (2000, 2003, 2006, and 2009) and had to adjust incentives upward in the middle of the recruitment and data collection period in order to meet minimum response rate requirements. With the addition of a second session in PISA 2012 to enable administration of the computer-based assessments, and a larger sample size to accommodate the financial literacy assessment, we are likely to face even greater resistance from schools. Gaining sufficient student cooperation is also challenging. While we exceeded the NCES student response rate requirement in PISA 2006 by 6 percentage points, unweighted results from PISA 2009 suggest we barely met the 85 percent response rate required by NCES (the unweighted student response rate is 86 percent), and there were 33 schools below the 80 percent required by the OECD (again, unweighted). Moreover, our experience in PISA thus far is based on a single test session; we anticipate even greater difficulty getting students to return for a second session. Failing to meet international response rate requirements puts the United States at great risk of not having its PISA results included in the international reports and database, which would mean, in effect, a loss of the millions of dollars the United States has invested in PISA, of the time invested by the schools and students that do participate, and of the comparative data the United States is seeking through the project.

NCES is using a multi-pronged approach to address the challenge of gaining school and student cooperation. First, our PISA contractor is convening a Response Rate Task Force composed of staff with experience working on the National Assessment of Educational Progress (NAEP), PISA and other international assessments, and other large-scale data collections, and with expertise in effective approaches to school recruitment. The task force, which will meet for the first time in September 2010, will identify strategies for achieving high response rates and serve as an ongoing source of ideas and feedback. Second, in September-October 2010 we will conduct focus groups with principals and students (for which OMB clearance has been requested under OMB# 1850-0803 v.36) to gain insights into desirable approaches to gaining school and student cooperation and to obtain feedback on recruitment materials. Finally, we propose conducting an incentive experiment in the field trial to examine whether increased respondent incentives (beyond what was provided in PISA 2009) increase participation rates. The rest of this section discusses the proposed incentive experiment.

To understand the relationship between different levels of monetary incentives and response rates, especially given the increased burden of PISA 2012, we propose an experiment in the field trial in which schools will be randomly assigned to two groups. In half of the schools, the school, school coordinator, and students will receive the same incentive amounts as used in PISA 2009 (Incentive 1); in the other half, they will receive larger incentive amounts (Incentive 2), as shown in Table A-2. The comparisons will examine the relative change in response rate between the two incentive groups by test option. Schools receiving the PISA 2009 incentive amounts will also be compared with PISA 2009 schools to determine the relative change in response rate resulting from test option differences (holding incentive amounts constant). To assess the response rate experiment, we will use data from PISA 2009 as a baseline for paper-and-pencil (P&P) only assessments: the observed school and student response rates in the field trial schools will be compared with the historic trend when PISA was administered using the paper-and-pencil method.

Table A-2. Summary of proposed incentives for field trial incentive experiment

Recipient               | Incentive 1 (2009 amounts) | Incentive 2
Schools                 | $200                       | $800
School coordinators     | $100                       | $200
Students: P&P only      | $20                        | $40
Students: P&P and CBA   | $20 + $15 for CBA          | $40 + $20 for CBA
Students: CBA only      | $15                        | $20


The incentive experiment will be administered at the school level. As noted, schools will be randomly assigned to high- and low-incentive groups. The hypothesis is that the response rate will be higher in the higher-incentive group, a one-directional (one-tailed) test. For a sample of 124 schools, the minimum detectable effect size must be in the mid-to-high range to attain 80 percent power at alpha = .05 for a one-tailed test. For example, the observed school response rate is currently about 65 percent; the higher-incentive group would need a response rate close to 84 percent for the comparison to be statistically significant at this level of accuracy. For students, taking into consideration the clustering within schools and assuming an effective sample size of two equal groups of 650 students each, a student response rate increase from 80 to 85 percent would be statistically significant at the same power and alpha level. The decision on the incentive scheme to recommend for the main study will depend on the consistency of improvement at both the school and the student levels.
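As a rough check on these figures, the illustrative Python sketch below reproduces the power calculations as a simple two-proportion comparison using the statsmodels library. It does not model the school-level clustering adjustment described above, so its results are approximate.

    # Illustrative power check for the incentive experiment (simple
    # two-proportion comparison; clustering is not modeled here).
    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    power_calc = NormalIndPower()

    # Schools: 124 split evenly into two incentive groups (62 per group).
    # Minimum detectable effect size (Cohen's h) at 80% power,
    # alpha = .05, one-tailed:
    mde = power_calc.solve_power(nobs1=62, alpha=0.05, power=0.80,
                                 ratio=1.0, alternative='larger')
    print(round(mde, 3))                                 # ~0.45

    # That is roughly the effect of moving the school response rate
    # from 65 percent to 84 percent:
    print(round(proportion_effectsize(0.84, 0.65), 3))   # ~0.44

    # Students: two effective groups of 650 each; approximate power to
    # detect an increase from 80 to 85 percent (unclustered):
    h = proportion_effectsize(0.85, 0.80)
    pw = power_calc.solve_power(effect_size=h, nobs1=650, alpha=0.05,
                                ratio=1.0, alternative='larger')
    print(round(pw, 2))                                  # ~0.77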

The rationale for the proposed amounts is described below.

Schools. In the proposed incentive experiment, half of the schools will receive $200 and the other half will receive $800. In order to meet the minimum school response rates mandated by the PISA international governing board, and in order to compensate schools for the increased disruption and burden resulting from the addition of a second session, we believe it is necessary to offer schools an incentive to encourage participation. The proposed increase from 2009 (to $800) reflects the increased disruption and burden associated with the PISA 2012 design. While in 2009 data collection staff spent approximately 4 hours at a school to conduct the assessment (including time for set-up, assessment administration, and wrap-up), for the PISA 2012 field trial it will be necessary to be at a school for the entire school day, upwards of 8 hours. In addition to being in the school for an entire day and sampling more students than in the past, we will require rooms suitable for test administration on laptops for the second, computer-based session. Thus, the burden for PISA 2012 is considerably larger than it was in 2009. We have also learned from schools that while an incentive is an important part of their decision to participate in a study like PISA, processing a check can be difficult for some schools, and for some, being able to choose from among a menu of equipment or supplies is more attractive. Depending upon the results of the field trial and the smaller focus group/panel studies we will conduct (for which OMB clearance has been requested under OMB# 1850-0803 v.36), in the main study we would like to be able to offer schools a choice between a check and supplies/equipment valued at the amount of the incentive approved for the main study.

Students. In half of the schools, we will provide a $20 incentive to students participating in the paper-and-pencil assessment session and an additional $15 incentive for the second, computer-based assessment (CBA) session (a $15 incentive for a second PISA field trial session was approved in 2008, although the originally planned and approved Electronic Reading Assessment (ERA) was in the end not field-tested due to ERA funding issues). In the other half of the schools, students will be offered $40 for the paper-and-pencil session and an additional $20 for the second session. Asking students to return for a second session has not been tested and will likely interfere with the remainder of each student’s school day. We are concerned that $15 may not be sufficiently attractive to ensure students return, and we would like to test whether a higher amount will secure the required response rate for both sessions. For students in schools in which only the CBA will be administered, we will offer $15 in half of the schools and $20 in the other half.

Students participating in the assessment during non-school hours will be offered a $50 to $75 incentive to compensate for travel, missed activities, work, etc., with an additional $25 incentive for those participating in a second session. This accommodation is offered when it is not possible to find a suitable time within school hours and is exercised only as a last resort. The amount will depend on when the assessment is conducted (e.g., after school versus on Saturday) and how far the students must travel to attend it (in 2009 and in 2006 it was necessary to adjust the incentive to $75 in the middle of the data collection period in order to secure the necessary response rates). We do not expect to administer the assessment outside of school hours in many field trial schools and thus will not be able to conduct an experiment in such schools. Incentives for students will be provided only with the explicit permission of the school principal.

School coordinators. In our proposed field test experiment, the school coordinator will be offered $100 in half of the schools and $200 in the other half. The role of the school coordinator is critical to the success of the study. The coordinator is expected to: coordinate logistics with the data collection contractor; supply a list of eligible students for sampling to the data collection contractor; communicate with teachers, students, and parents about the study to encourage participation; assist the test administrator in ensuring that the sampled students attend the testing sessions; and assist the test administrator in arranging make-up sessions as needed. In schools with both paper-and-pencil and computer-based administrations, the school coordinator will need to find space for morning and afternoon administrations, including space that can accommodate administration via laptops. In addition, in the main study, school coordinators will be asked to supply state unique student identifiers for each sampled student to support future methodological studies that NCES plans to conduct (see description in Part B). Given the significant increase in the time and effort required of school coordinators, a larger amount than was offered in 2009 ($200 compared with $100) may be needed.

In our experience, the amount of effort required of school coordinators varies considerably across school contexts. In the past, in some schools the tasks required only 3-5 hours of effort, while in others they required 10-12 hours. We have identified a few factors that seem to affect the effort required. One is the student population served by the school: schools that serve at-risk populations pose greater logistical challenges. Another factor is how the school is organized. For example, one school in the 2009 sample served a challenging student population (e.g., high truancy rates), had multiple campuses located miles apart, and conducted classes in multiple shifts. The tasks of submitting an accurate student list, arranging the testing session, and getting students to attend were far more challenging and time consuming there than in a traditional school. In another example from 2009, a high school in the sample comprised five campuses located in close proximity but operating mostly independently. Again, submitting an accurate student list, arranging the testing session, and getting students to attend was challenging and time consuming for the school coordinator; even determining whether this was one school or five, and how student sampling would be conducted, required considerable back and forth between the contractor and the school coordinator.

In the main study, it may again be appropriate to offer higher compensation to school coordinators who must expend significantly more effort to implement PISA in their school (up to $300 was approved during data collection in 2009 and in 2006 to compensate school coordinators for additional time spent getting students to attend the testing session). We will use the incentive experiment, focus groups, and other research results to propose appropriate compensation for school coordinators in the main study.


This request for field test recruitment is being submitted in September 2010. If the Response Rate Task Force or results of the focus groups (expected to be available by the end of October) have any bearing on our approach to field test recruitment, particularly in terms of the incentive experiment, we would submit a change request to OMB in late October, prior to initiating contact with schools. Moreover, in preparation for the main study, NCES will take into consideration all that is learned from the task force, the focus groups, the incentive experiment, and experience in the field during the field test to develop an appropriate proposal for recruiting and incentivizing schools.


A.10 Assurance of Confidentiality

The PISA 2012 plan for ensuring the confidentiality of the project and participants conforms with the following federal regulations and policies: the Privacy Act of 1974 (5 U.S.C. 552a), Privacy Act Regulations (34 CFR Part 5b), the Education Sciences Reform Act of 2002 (P.L. 107-279, Title I, Part C, Sec. 183), the Computer Security Act of 1987, the NCES Restricted-Use Data Procedures Manual, and the NCES Standards and Policies. Procedures for handling confidential aspects of the study that will be used in PISA 2012 will mirror those used in past administrations of PISA. Although the contractor has not yet been selected, expertise in data security and confidentiality is a significant criterion in the selection of the contractor.

The plan for maintaining confidentiality includes signed confidentiality agreements and notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers (see Exhibit A-1). Also included in the plan are personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility.

Letters and other materials will be sent to parents and school administrators describing the voluntary nature of this survey. The material sent will include a brochure that describes the study and conveys the extent to which respondents and their responses will be kept confidential (copies of letters to be used in the field trial and the brochure text are included in Appendix A). The following statement will appear on the front cover of the questionnaires (the phrase “gather the data needed, and complete and review the information collection” will not be included on the student questionnaire):

U.S. participation in this study is sponsored by the National Center for Education Statistics (NCES), U.S. Department of Education.  Your responses are protected by federal statute (P.L. 107-279, Title I, Part E, Sec. 183). Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law.  By law, everyone working on this NCES survey is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you.


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0755.  The time required to complete this information collection is estimated to average 30 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection.  If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving the form, please write to: U.S. Department of Education, Washington, D.C. 20202-4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Program for International Student Assessments (PISA), National Center for Education Statistics, U.S. Department of Education, 1990 K Street, N.W., Washington, D.C. 20006-5650.

 

O.M.B. No. 1850-0755, Approval Expires xx/xx/xxxx.

Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. No school or individual names or addresses will be included on these files or documentation.

NCES understands the legal and ethical need to protect the privacy of the PISA respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the PISA 2012 data when preparing the data files for use by researchers. This analysis will ensure that NCES has fully complied with the confidentiality provisions contained in PL 100-297. To protect the privacy of respondents as required by PL 100-297, schools with high disclosure risk will be identified, and a variety of masking strategies will be used to ensure that individuals may not be identified from the data files. These masking strategies include swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing categories or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.


Exhibit A-1. Affidavit of Nondisclosure


Affidavit of Nondisclosure




______________________________________________________________________

(Job Title)



______________________________________________________________________

(Date Assigned to Work with NCES Data)



______________________________________________________________________

(Organization, State or Local Agency Name)



______________________________________________________________________

(Organization or Agency Address)


____________________________________________________________

(NCES Individually Identifiable Data)



I, __________________________________ , do solemnly swear (or affirm) that I will not –

  1. make any disclosure or publication whereby a sample unit or survey respondent (including students and schools) could be identified or the data furnished by or related to any particular person or school under these sections could be identified; or

  2. use or reveal any individually identifiable information furnished, acquired, retrieved or assembled by me or others, under the provisions of Section 183 of the Education Sciences Reform Act of 2002 (P.L. 107-279) and Title V, Subtitle A of the E-Government Act of 2002 (P.L. 107-347), for any purpose other than statistical purposes specified in the NCES survey, project or contract.



___________________________________

(Signature)


[The penalty for unlawful disclosure is a fine of not more than $250,000 (under 18 U.S.C. 3571) or imprisonment for not more than five years (under 18 U.S.C. 3559), or both. The word "swear" should be stricken out when a person elects to affirm the affidavit rather than to swear to it.]



A.11 Sensitive Questions

Federal regulations governing the administration of questions that might be viewed by some as “sensitive” because they request personal or private information require (a) clear documentation of the need for such information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of participation in the study, and (c) assurances of confidential treatment of responses.

PISA 2012 does not include questions usually considered to be of a highly sensitive nature, such as items concerning religion, substance abuse, or sexual activity. However, the field trial questionnaires proposed by the international coordinators do include a few items that touch on topics identified by the Protection of Pupil Rights Act (PPRA). All items are being reviewed by NCES, and items that ask for information covered by PPRA will be excluded from the U.S. questionnaire.

Several other items in the background questionnaires may be considered sensitive by some of the respondents, even though they do not fall into any of the PPRA domains. These items relate to the socioeconomic context of the school, parents’ education and occupation, family possessions, and students’ belongings. Research indicates that the constructs these items represent are strongly correlated to academic achievement, and they have been used in the four previous cycles of PISA (2000, 2003, 2006, and 2009). Therefore, the items are essential for the anticipated analyses and to retain consistency in planned comparisons with the international data.

A.12 Estimates of Burden

The cost/burden to respondents for the PISA 2012 field trial is calculated from the estimated time required of students and school staff (school administrators and school coordinators) to complete recruitment, pre-assessment, and assessment activities (see Table A-3) in 80 schools (124 schools will be sampled, and we expect 65 percent to participate). Burden estimates for the main study are also provided for information purposes at the bottom of Table A-3 (calculated based on the scenario of the United States participating in the core and all international options); these estimates will be updated following the field trial as final design decisions are made. Assessment activities include the time needed to complete the student and school administrator questionnaires, as well as the time for assessment directions. The time required for students to respond to the assessment (cognitive items) portion of the study, and its associated directions, is shown for information only (marked with an asterisk in Table A-3) and is not included in the totals. Recruitment and pre-assessment activities include the time involved in deciding to participate, completing class and student listing forms, distributing parent consent materials, distributing the school questionnaire, and arranging assessment space.

For the field trial, the average response burden for 1,210 students is based on a 30-minute questionnaire and, for a subsample of 242 of these students, 5 minutes of financial literacy background items. The extra students who take the computer-based assessment only (n = 1,320) are included for the purposes of item development and reliability checks; the core questionnaire items will not be collected from them. Basic demographics (e.g., sex and grade) will be collected in the student roster provided for student sampling. At an estimated cost to students of $7.25 per hour (the 2009 federal minimum wage), the dollar cost of the field trial for students is estimated at $4,531.
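The burden figures in Table A-3 follow a single arithmetic pattern: respondents = sample × expected response rate, and burden hours = respondents × minutes per respondent ÷ 60. The sketch below reproduces the field-trial student rows that count toward the total; the rounding rule is an assumption, and the published totals (626 hours, $4,531) round some intermediate values differently, so results can differ by a few hours or dollars.

```python
# Sketch of the Table A-3 burden arithmetic for field-trial students.
# Rounding conventions are assumed; published totals may differ slightly.

MIN_WAGE = 7.25  # 2009 federal minimum wage, dollars per hour

def burden(sampled, response_rate, minutes_each):
    """Return (respondents, burden hours) for one Table A-3 row."""
    respondents = round(sampled * response_rate)
    return respondents, respondents * minutes_each / 60

n_core, h_core = burden(1_512, 0.80, 30)  # core questionnaire -> 1,210 students, 605 hours
n_fl, h_fl = burden(302, 0.80, 5)         # financial literacy subsample -> 242 students, ~20 hours

total_hours = h_core + h_fl               # ~625-626 hours depending on rounding
print(f"{n_core} students, {total_hours:.0f} hours, ${total_hours * MIN_WAGE:,.2f}")
```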

Table A-3. Burden estimates for PISA 2012 field trial and main study

| | Sample | Expected response rate | Number of respondents | Number of responses | Per respondent (minutes) | Total burden (hours) |
|---|---|---|---|---|---|---|
| FIELD TRIAL | | | | | | |
| Student | | | | | | |
| Directions, paper-and-pencil* | 1,512 | 0.80 | 1,210 | 1,210 | 10 | 202 |
| Paper-and-pencil test booklet* | 1,512 | 0.80 | 1,210 | 1,210 | 120 | 2,420 |
| Directions (computer-based assessment)* | 2,406 | 0.80 | 1,925 | 1,925 | 10 | 321 |
| Computer-based assessment in addition to paper-and-pencil* | 756 | 0.80 | 605 | 605 | 40 | 404 |
| Computer-based assessment only* | 1,650 | 0.80 | 1,320 | 1,320 | 40 | 880 |
| Financial Literacy background items | 302 | 0.80 | 242 | 242 | 5 | 21 |
| Core Questionnaire | 1,512 | 0.80 | 1,210 | 1,210 | 30 | 605 |
| Total Student Burden Field Trial | | | 1,210 | 1,452 | | 626 |
| School Administrator | | | | | | |
| Questionnaire | 124 | 0.65 | 81 | 81 | 30 | 41 |
| Recruitment and Pre-Assessment Activity | | | | | | |
| School Administrator | 124 | 1.00 | 124 | 124 | 90 | 186 |
| School Coordinator | 124 | 0.65 | 81 | 81 | 240 | 323 |
| Total School Burden Field Trial | | | 205 | 286 | | 550 |
| MAIN STUDY (based on core + international options) | | | | | | |
| Student | | | | | | |
| Directions* | 8,000 | 0.85 | 6,800 | 6,800 | 10 | 1,134 |
| Paper-and-pencil test booklet* | 8,000 | 0.85 | 6,800 | 6,800 | 120 | 13,600 |
| Core Questionnaire | 8,000 | 0.85 | 6,800 | 6,800 | 30 | 3,400 |
| Financial Literacy background items | 1,412 | 0.85 | 1,200 | 1,200 | 5 | 100 |
| Directions (computer-based assessment)* | 2,353 | 0.85 | 2,000 | 2,000 | 10 | 334 |
| Computer-based assessment* | 2,353 | 0.85 | 2,000 | 2,000 | 40 | 1,334 |
| Total Student Burden Main Study | | | 6,800 | 8,000 | | 3,500 |
| School Administrator | | | | | | |
| Questionnaire | 194 | 0.85 | 165 | 165 | 30 | 83 |
| Recruitment and Pre-Assessment Activity | | | | | | |
| School Administrator | 194 | 1.00 | 194 | 194 | 90 | 291 |
| School Coordinator | 194 | 0.85 | 165 | 165 | 240 | 660 |
| Total School Burden Main Study | | | 359 | 8,524 | | 4,534 |

NOTES: Rows marked with an asterisk (*) show the time for the cognitive assessment and its associated directions; they are provided for information only and are not included in the burden totals.


The average response burden of 550 hours for schools in the field trial is based on a 30-minute school questionnaire for 81 school administrators; 90 minutes for 124 school administrators during the recruitment process (all sampled schools); and an average of 4 hours for 81 school coordinators to coordinate logistics with the data collection contractor; supply a list of eligible students for sampling to the data collection contractor; communicate with teachers, students, and parents about the study to encourage participation; assist the test administrator in ensuring that the sampled students attend the testing sessions; and assist the test administrator in arranging make-up sessions as needed. At an estimated cost of $50.00 per hour for school administrators (227 hours) and $35.00 per hour for school coordinators (323 hours), the dollar cost of the field trial for schools is $22,655 ($11,350 for school administrators and $11,305 for school coordinators).
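As a quick arithmetic check of the school-level cost figures above (hour totals are taken as rounded in Table A-3; the hourly rates are the estimates stated in the text):

```python
# Check of the field-trial school cost arithmetic described above.
admin_hours, coord_hours = 227, 323        # administrators; coordinators (Table A-3 rounding)
admin_cost = admin_hours * 50.00           # 227 h x $50/h = $11,350
coord_cost = coord_hours * 35.00           # 323 h x $35/h = $11,305
print(admin_cost, coord_cost, admin_cost + coord_cost)   # 11350.0 11305.0 22655.0
```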

A.13 Total Annual Cost Burden

Other than the burden associated with completing the PISA questionnaires and assessments (estimated above in Section A.12), the field trial and main study impose no additional cost to respondents.

A.14 Annualized Cost to Federal Government

The total cost to the federal government for conducting the PISA 2012 field trial is estimated to be $1,102,839 over a 1-year period. The total cost to the federal government for conducting the PISA 2012 main study is estimated to be $1,919,200 per year for a 3-year period. These estimates include all direct and indirect costs of the project and are based on the United States participating in the core components and all international options.

A.15 Program Changes or Adjustments

There is an overall reduction in burden because the last approval was for the full-scale PISA 2009 collection, while this clearance request is only for field test and recruitment activities.

With regard to content, there are some changes in PISA 2012 from previous rounds of data collection. The main change is that the assessment will focus on mathematics literacy in this cycle; as a result, the bulk of the items will be mathematics items, and science and reading will be the secondary components. The inclusion of the computer-based mathematics, reading, and problem-solving assessments and the paper-and-pencil financial literacy assessment in the field trial also represents a significant change. There are also minor changes in wording to some of the questionnaire items, and questions that focused on student attitudes toward science or reading now focus on attitudes toward mathematics. Another change to the student questionnaire is the use of multiple forms in the main study: while each student will still take a single 30-minute questionnaire, there will be three forms with common and unique content, allowing PISA to collect more background information while keeping the burden on individuals at the same level as in past administrations. Finally, it is possible that a new contractor (not identified at the time this document was submitted) will conduct the field trial and main study for PISA 2012.

A.16 Plans for Tabulation and Publication

The PISA field trial is designed to provide a statistical review of the performance of items on the assessments and questionnaires in preparation for the main data collection. The international contractor, ACER, will provide the international instruments to be used in the field trial and will report to the participating countries on the results of the field trial. Based on the field trial results, ACER, with input and agreement from the participating countries, will make final revisions in the survey instruments, materials, and documents in preparation for the main study.

For the main study in 2012, an analysis of the U.S. and international data will be undertaken to provide an understanding of the U.S. national results in relation to the international results. Based on the analyses of the international data set proposed by ACER, and on the need for NCES to report results from the perspective of an American constituency, a plan is being prepared for the statistical analysis of the U.S. national data set in comparison with the international data set. Analyses will include examinations of the reading, mathematics, and science literacy of U.S. students in relation to their international counterparts, and of the relationships between reading, mathematics, and science literacy and student and school background variables.

All reports and publications will be coordinated with the release of information from the international organizing body. Planned publications and reports for the PISA 2012 main study include the following:

General Audience Report. Written for a non-specialist, general American audience, this report will present information on the status of reading, mathematics, and science education among students in the United States in comparison with their international peers. It will convey the results of the analyses in a clear and non-technical way, showing how U.S. students compare with their international peers and what factors, if any, may be associated with the U.S. results.

Survey Operations/Technical Report. This report will describe the procedures used in the main study (e.g., sampling, recruitment, data collection, scoring, weighting, and imputation) and any problems encountered, along with the contractor’s response to them. The primary purpose of the main study survey operations/technical report is to document the steps taken by the United States in undertaking and completing the study. The report will include a nonresponse bias analysis, which will assess the presence and extent of bias due to nonresponse. Selected characteristics of responding students and schools will be compared with those of nonresponding students and schools to determine whether and how the two groups differ along dimensions for which data are available for the nonresponding units, as required by NCES standards.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of PISA 2012 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are as follows:


| Dates | Activity |
|---|---|
| April–December 2010 | Prepare OMB clearance documents, data collection manuals, forms, assessment materials, and questionnaires for the field trial |
| September 2010–February 2011 | Contact and gain cooperation of states, districts, and schools for the field trial |
| March–July 2011 | Select student samples and collect field trial data |
| July 2011 | Deliver raw data to the international sponsoring organization |
| August–September 2011 | Receive field trial report from international sponsors; revise OMB package |
| September 2011–September 2012 | Prepare for the main study phase; recruit schools |
| June/July 2012 | Summer conference for sampled schools |
| September–November 2012 | Collect main study data |
| March–April 2013 | Receive final data files from international sponsors |
| August–December 2013 | Produce General Audience Report and Survey Operations/Technical Report for the United States |



A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions are requested to the "Certification for Paperwork Reduction Act Submissions" of OMB Form 83-I.



1 Some PISA participants are subnational jurisdictions (e.g., Hong Kong, China).

2 Neidorf, T.S., Binkley, M., Gattis, K., and Nohara, D. (2006). Comparing Mathematics Content in the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Program for International Student Assessment (PISA) 2003 Assessments (NCES 2006-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
