International Study of Adult Skills and Learning (ISASL) [Program for the International Assessment of Adult Competencies (PIAAC) Cycle II]

2022 Field Test


Supporting Statement Part A

OMB # 1850-0870 v.7

Submitted by

National Center for Education Statistics

U.S. Department of Education



revised October 2019

September 2019



As in Cycle I, a user-friendly name for PIAAC Cycle II was created – the International Study of Adult Skills and Learning (ISASL) – to represent the program to the public, and it will be used on all public-facing materials and reports. Because this international program is well known within the federal and education research communities, we continue to use "PIAAC" in all internal and OMB clearance materials and communications and use the "PIAAC" name throughout this submission. However, as seen in Appendix E, all recruitment and communication materials refer to the study as ISASL.



Table of Contents




Part B: Collections of Information Employing Statistical Methods


Appendix A: PIAAC Cycle II Field Test Screener


Appendix B: PIAAC Cycle II Field Test Case Initialization Items


Appendix C: PIAAC Cycle II Field Test Background Questionnaire (English)


Appendix C2: PIAAC Cycle II Field Test Background Questionnaire (Spanish)


Appendix D: PIAAC Cycle II Field Test Assessment Text


Appendix E: PIAAC Cycle II Field Test Respondent Contact Materials



Preface

The Program for the International Assessment of Adult Competencies (PIAAC) is a cyclical, large-scale study of adult skills and life experiences focusing on education and employment. PIAAC is an international study designed to assess adults in different countries over a broad range of abilities, from simple reading to complex problem-solving skills, and to collect information on individuals’ skill use and background. PIAAC is coordinated by the Organization for Economic Cooperation and Development (OECD) and developed by participating countries with the support of the OECD. In the United States, the National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), conducts PIAAC. NCES has contracted with Westat to administer the PIAAC Cycle II Field Test data collection in the U.S.

The U.S. participated in the PIAAC Main Study data collection in 2012 and conducted national supplement data collections in 2014 and 2017. All three of these collections are part of PIAAC Cycle I, in which 39 countries participated (24 countries in 2012, 9 new countries in 2014, and 5 more new countries in 2017), including the U.S., with close to 200,000 adults assessed across the 39 countries over the three data collections. A new PIAAC cycle is to be conducted every 10 years, and the PIAAC Cycle II Main Study data collection will be conducted from August 2021 through March 2022. In preparation for the main study collection, PIAAC Cycle II will begin with a Field Test in 2020, in which 34 countries are expected to participate, with the primary goals of evaluating newly developed assessment and questionnaire items and testing the planned PIAAC 2022 operations. This request is to conduct the PIAAC Cycle II Field Test in April-June 2020.

PIAAC 2022 defines four core competency domains of adult cognitive skills that are seen as key to facilitating the social and economic participation of adults in advanced economies: (1) literacy, (2) numeracy, (3) reading and numeracy components1, and (4) adaptive problem solving. All participating countries are required to assess the literacy and numeracy domains [(1) & (2)], but the reading and numeracy components and adaptive problem solving domains are optional [(3) & (4)].

The U.S. will administer all four domains of the PIAAC 2022 assessment to a nationally representative sample of adults, along with a background questionnaire that asks about their educational background, work history, the skills they use on the job and at home, their civic engagement, and their sense of health and well-being. The results are used to compare the skill capacities of workforce-aged adults in participating countries and to learn more about the relationships between educational background, employment, and other outcomes. In addition, in PIAAC 2022, a set of financial literacy questions will be included in the background questionnaire.

As noted above, PIAAC is a multi-cycle collaboration between the governments of participating countries, the OECD, and a consortium of various international organizations, referred to as the PIAAC Consortium. Members of the consortium include representatives from Educational Testing Service (ETS), Westat, cApStAn, The Research Centre for Education and the Labor Market (ROA), gesis-ZUMA Centre for Survey Research, German Institute for International Education Research (DIPF), and the Data Processing Centre of the International Association for the Evaluation of Educational Achievement (IEA-DPC).

PIAAC is collaborative and international by nature and is developed through an extensive series of international meetings and workgroups assisted by OECD staff. At these meetings, the PIAAC framework for the assessment, the background questionnaire, and the common standards and data collection procedures are all developed. Expert panels, researchers, the PIAAC Consortium’s support staff, and representatives of the participating countries collaboratively develop all these aspects of PIAAC and guide the development of the software platform for uniformly administering the assessment on tablets. All countries must follow the common standards and procedures and use the same software when conducting the survey and assessment. As a result, PIAAC is able to provide a reliable and comparable measure of adult skills in the adult population across the participating countries.

This submission contains the materials for the PIAAC Cycle II Field Test. The final U.S. versions of the field test instruments are provided in Appendices A-D, including the Screener (Appendix A), the Case Initialization Items (Appendix B), the Background Questionnaire (Appendix C), and the Assessment Text (Appendix D). The separate document Appendix C2 contains the Spanish translation of the Background Questionnaire. Finally, the PIAAC 2022 Cycle II Field Test respondent contact materials and the Spanish language versions of these materials, if applicable, are provided in Appendix E.

Subsequently, in Fall 2020, we will submit a clearance request with the procedures and instruments to be used in the Main Study data collection beginning in the late summer of 2021.



A.1 Importance of Information

Over the past three decades, national governments and other stakeholders have been increasingly interested in an international assessment of adult skills to monitor how well populations are prepared for the challenges of a knowledge-based society. In the mid-1990s, the International Adult Literacy Survey (IALS) assessed the prose, document, and quantitative literacy of adults in 22 countries or territories, including the U.S. Between 2002 and 2006, the Adult Literacy and Lifeskills (ALL) Survey assessed prose and document literacy, numeracy, and problem-solving in ten countries, including the U.S.

PIAAC’s measurement of competencies in problem solving and of skills used in the workplace moved the survey beyond conventional measurements of literacy. These two features helped to assess the extent to which adults have acquired a generic set of skills and competencies. At the same time, PIAAC looks more closely than previous surveys at the extent to which people with low literacy levels have the basic building blocks that they need to read effectively. Data from PIAAC Cycle I increased the understanding of the relationship between adult skills and life outcomes in health, employment status, and civic engagement, which have social and economic impacts both within and across countries.

U.S. participation in PIAAC Cycles I and II is consistent with the NCES mandate. The enabling legislation of the National Center for Education Statistics [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that “The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations.” The Education Sciences Reform Act of 2002 (HR 3801, Part C, Sec. 153) also specifies that NCES

shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including—(1) collecting, acquiring, compiling (where appropriate, on a State-by-State basis), and disseminating full and complete statistics (disaggregated by the population characteristics described in paragraph (3)) on the condition and progress of education, at the preschool, elementary, secondary, postsecondary, and adult levels in the United States, including data on…(D) secondary school completions, dropouts, and adult literacy and reading skills…[and] (6) acquiring and disseminating data on educational activities and student achievement…in the United States compared with foreign nations.2

In addition to being essential for any international perspective on adult literacy and reading skills, U.S. participation fulfills both the national and international aspects of NCES’ mission. By conducting PIAAC, NCES is able to provide policy-relevant data for international comparisons of the U.S. adult population’s competencies and skills, and help inform decision-making on the part of national, state, and local policymakers, especially those concerned with economic development and workforce training.

A.2 Purposes and Uses of the Data

The core objectives for PIAAC Cycle II are similar to those for Cycle I. NCES will use data from the U.S. participation in Cycle II to help continue to:

  • Identify factors that are associated with adult competencies;

  • Extend the measurement of skills held by the working age population;

  • Provide a better understanding of the relationship of education to adult skills; and

  • Allow comparisons across countries and, as PIAAC is cyclical, over time.

Additionally, information from the PIAAC Cycle II will be used by:

  • Federal policymakers and Congress to (a) better understand the distribution of skills within and across segments of the population and (b) be able to plan Federal policies and interventions that are most effective in developing key skills among various subgroups of the adult population;

  • State and local officials to plan and develop education and training policies targeted to those segments of the population in need of skill development;

  • News media to provide more detailed information to the public about the distribution of skills within the U.S. adult population, in general, and the U.S. workforce more specifically; and

  • Business and educational organizations to better understand the skills of the U.S. labor force and to properly invest in skills development among key segments of the workforce to address skill gaps.

NCES will use information gathered from the Field Test to refine the screener, background questionnaire, assessment, and training and data collection materials and procedures, before their implementation in the Main Study. Specifically, the objectives of the PIAAC Cycle II Field Test are to:

  • Evaluate computer-assisted personal interviewing (CAPI) system applications and data processing procedures;

  • Test the screener and background questionnaire content, including question ordering and skip patterns;

  • Evaluate the reading, numeracy, and adaptive problem-solving assessment items to determine which field-tested items should be selected for the Main Study;

  • Test data collection procedures, including transitions between instruments and the direct use of the automated instruments by respondents;

  • Evaluate the interviewer training program;

  • Review and analyze the Field Test data, including the variability and sensitivity of questions, the consistency of edit checks, and the administration time associated with each aspect of the data collection;

  • Test the within-household sample selection process;

  • Test the Quality Control (QC) sampling-related procedures; and

  • Test the flow of materials and the sample data from sample selection to the delivery of the Sample Design International File (SDIF) at the end of the data collection.

PIAAC Cycle II Field Test Components

As noted in the prior section, NCES will use the results from the Field Test to improve all aspects of the content and processes of the PIAAC before the Main Study.

As in Cycle I, a user-friendly name for PIAAC was created to represent the program to the public, and it will be used on all public-facing materials. The chosen name for the PIAAC Cycle II administration is the International Study of Adult Skills and Learning (ISASL). As the international program is well known within the federal research community and among education researchers, we have continued to use "PIAAC" in all internal and OMB clearance materials and communications and have used that name throughout this package. However, as seen in Appendix E, all recruitment and communication materials refer to the study as ISASL. For more information on the motivation and thinking behind this decision, please see p. 5 of Part B.

Also as in Cycle I, the PIAAC Cycle II instruments include a Screener (see Appendix A and section B.2), a set of verification items known as the Case Initialization (Appendix B), the Background Questionnaire (Appendix C), and the assessment. Below, we describe the Case Initialization, the Background Questionnaire, and the assessment.

Case Initialization

After administering the screener, the first step in the interview with the eligible respondent (who may or may not be the same as the person answering the screener questions) is the Case Initialization module. This module is used to verify information about the respondent before the Background Questionnaire begins: to ensure that the correct person is being interviewed, that the person is eligible, and that we have the correct contact information for the person. Through several screens, the module prompts the interviewer to ask the respondent to verify their name (or nickname), age, gender, address, and telephone number. This information will be prefilled from the answers obtained in the Screener, but the interviewer can correct any answers on the screens if needed.

Background Questionnaire (BQ)

The PIAAC BQ is meant to identify (a) what skills participants regularly use in their job and in their home environment, (b) how participants acquire those skills, and (c) how those skills are distributed throughout the population. To obtain this information, the BQ asks participants about their education and training; present and past work experience; the skills they use at work; their use of specific literacy, numeracy, and information and computer technology skills at work and home; personal traits; and health and other background information. For Cycle II, the BQ will include a set of questions on financial literacy, particularly about the respondent’s financial habits and understanding of financial information. The BQ is administered as a computer-assisted personal interview (CAPI) by a household interviewer on a study tablet computer, and takes approximately 30 minutes to complete (see Appendix C for the English version of the BQ, and Appendix C2 for the Spanish translation).

Assessment

The PIAAC direct assessment evaluates the skills of adults in four domains: literacy, numeracy, reading and numeracy components, and adaptive problem solving. These domains are considered to constitute key information-processing skills in that they provide a foundation for the development of other higher-order cognitive skills and are prerequisites for gaining access to and understanding specific domains of knowledge.

The PIAAC assessment is adaptive and computer-based. Under this design, the computer-based instrument directs respondents to a set of easier or more difficult items based on their answers to a set of locator items. The locator items are designed to assess the respondent’s ability to complete the domain-based items and to assess their basic computer skills (e.g., the capacity to use the stylus or a finger to highlight text on the tablet computer). As a result, some respondents may not complete all items in the assessment.

In this adaptive design, the instrument uses an algorithm to select assessment items for each participant. This algorithm uses a set of variables that includes (i) the participant’s level of education, (ii) the participant’s status as a native or non-native language speaker, and (iii) the participant’s performance on the locator tasks and on the computer-based items as they advance through the assessment. The key advantage of such an adaptive design is that it provides a more accurate assessment of participants’ abilities while using a smaller number of items than a traditional test design.

At the completion of the BQ, the participant completes a brief tablet tutorial (see Appendix D) and then begins the reading and numeracy components. Upon completing the components, the participant completes the locator and, based on the locator score, is routed to the literacy, numeracy, or adaptive problem-solving items on the assessment.
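
To make the routing described above concrete, the sketch below illustrates the general shape of a locator-based adaptive design. It is a simplified illustration only, not the PIAAC Consortium’s routing algorithm or scoring model: the cut points, weights, and function names are hypothetical placeholders standing in for the actual psychometric rules.

```python
# Illustrative sketch of locator-based adaptive routing.
# All thresholds and weights are invented; they are not the PIAAC rules.

from dataclasses import dataclass
import random

@dataclass
class Participant:
    education_level: int   # hypothetical coding: 1 = less than high school ... 4 = bachelor's or higher
    native_speaker: bool   # native vs. non-native language speaker
    locator_score: int     # number of locator items answered correctly (invented 0-16 scale)

def choose_first_block(p: Participant) -> str:
    """Choose an easier or harder starting block from the locator result,
    nudged by education level and language status (all cut points invented)."""
    adjusted = (p.locator_score
                + (1 if p.education_level >= 3 else 0)
                + (0 if p.native_speaker else -1))
    if adjusted < 6:
        return "easier literacy/numeracy items"
    elif adjusted < 11:
        return "medium literacy/numeracy items"
    return "harder items, including adaptive problem solving"

def next_block(proportion_correct: float) -> str:
    """Later stages keep adapting to observed performance on the computer-based items."""
    if proportion_correct < 0.4:
        return "step down to easier items"
    if proportion_correct > 0.75:
        return "step up to more difficult items"
    return "stay at the current difficulty"

# Example: a respondent with some college, a native speaker, strong locator result
p = Participant(education_level=3, native_speaker=True, locator_score=12)
print(choose_first_block(p))
print(next_block(proportion_correct=random.uniform(0, 1)))
```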

A.3 Improved Information Technology (Reduction of Burden)

As in PIAAC Cycle I, technology is a large component of the PIAAC Cycle II study. The screener (Appendix A) and the BQ (Appendix C) are CAPI administered. The interviewer will read the items aloud to the respondent from the screen on a tablet computer and will record all responses on the tablet. Because the assessment is computer-based, the data capture system uses automated skip patterns to display the appropriate questions depending on the preceding responses. For the screener, the tablet is used to run a sampling algorithm to determine who, if anyone, in the household is eligible to participate in the study, and it will select a respondent or respondents for the BQ and the assessment.
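
As an illustration of the kind of within-household selection the screener performs, the sketch below applies simplified rules drawn from figures elsewhere in this submission (eligibility at ages 16-74, two sampled persons in households with four or more eligible members per footnote 3, and selection of a 66-74-year-old in roughly one in five households per section A.12). The function and logic are hypothetical and are not the sampling algorithm specified in Part B.

```python
# Hypothetical illustration of within-household respondent selection on the
# screener tablet; the actual PIAAC procedure is documented in Part B.

import random

def select_respondents(ages):
    """Return the household members (by age) selected for the BQ and assessment
    under the simplified, hypothetical rules described above."""
    eligible_16_65 = [a for a in ages if 16 <= a <= 65]
    eligible_66_74 = [a for a in ages if 66 <= a <= 74]
    n_eligible = len(eligible_16_65) + len(eligible_66_74)

    # Two persons ages 16-65 are sampled when four or more household members are
    # eligible (footnote 3); otherwise one, if anyone in that age range is present.
    n_core = 2 if n_eligible >= 4 else 1
    n_core = min(n_core, len(eligible_16_65))
    selected = random.sample(eligible_16_65, n_core)

    # Persons ages 66-74 are sampled in roughly one of every five households
    # (the 0.2 adjustment cited in section A.12); a random draw stands in here
    # for the actual subsampling rule.
    if eligible_66_74 and random.random() < 0.2:
        selected.append(random.choice(eligible_66_74))

    return selected

# Example: a household with five adults, one of them age 70
print(select_respondents([17, 34, 35, 58, 70]))
```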

Additionally, a Field Management System (FMS), similar to the one used in Cycle I, will be used for the PIAAC Cycle II Field Test. The FMS comprises three modules: the Supervisor Management System, the Interviewer Management System, and the Home Office Management System. The Supervisor Management System will be used to manage data collection and produce productivity reports. Supervisors will use it to transfer cases between interviewers, plan and monitor travel to PSUs, close cases as needed, order and track the mailing of outreach materials to sampled households in a timely fashion, and manage adaptive design procedures. The Interviewer Management System allows interviewers to administer the automated instruments, manage case status, transmit data, and maintain information on the cases. In Cycle II, interviewers will enter all information about contacts with cases into the tablet, replacing the household folders used for that purpose in Cycle I. The Home Office Management System supports the packaging and shipping of cases to the field, the shipping of booklets for scoring report production, the processing of cases, receipt control, and the receipt and processing of automated data, and it is integrated with processes for editing and analysis. Additionally, a dashboard will summarize and present fieldwork paradata (e.g., information about time of contacts, data quality problems, hours worked, and costs) for use by field management and home office staff in monitoring and supervising interviewers.

A.4 Efforts to Identify Duplication

PIAAC Cycle I was the first study in the U.S. to incorporate a technology component among the measured skills. The international nature of the study allows comparisons of the prevalence of these skills in the U.S. versus other PIAAC participating countries. Although other international assessments examine a similar content area, PIAAC is unique in several ways, detailed here.

Content

PIAAC is a “literacy” assessment, designed to measure performance in certain skill areas at a broader level than school curricula, encompassing a broader set of skills that adults have acquired throughout life. The skills that are measured in PIAAC differ from those measured by other studies such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), which are curriculum based. The TIMSS and PIRLS are designed to assess what students have been taught in school in specific subjects (such as science, mathematics, or reading) using multiple-choice and open-ended test questions.

Another international assessment, the Program for International Student Assessment (PISA), assesses literacy, functional skills, and other broad learning outcomes and is designed to measure what 15-year-olds have learned inside and outside of school throughout their lives. Besides its different target age group, PISA measures skills at a much higher and more academic level than PIAAC; PIAAC focuses more on the extensive range of skills needed in everyday life and in the workforce. PIAAC contains tasks not included in the school-based assessments and assesses a range of skills from a very basic level to the higher workplace-level skills that adults encounter in everyday life.

PIAAC has also improved and expanded on the cognitive frameworks of previous large-scale adult literacy assessments (including the National Adult Literacy Survey (NALS), the National Assessment of Adult Literacy (NAAL), IALS, and ALL). The most significant difference between PIAAC Cycle II and previous large-scale assessments is that PIAAC will be administered on tablet computers and is designed to be an adaptive assessment, so respondents receive groups of items targeted to their performance levels. Because of these differences, PIAAC introduced a new set of scales to measure adult literacy, numeracy, and problem solving. Scales from IALS and ALL have been mapped to the PIAAC scales so that trends in performance can be measured over time.

Adult Household Sample

As an international assessment of adult competencies, PIAAC differs from student assessments in several ways. PIAAC assesses adults across a wide age range (16-74), irrespective of their schooling background, whereas student assessments target a specific age (e.g., 15-year-olds in the case of PISA) or grade (e.g., grade 4 in PIRLS) and are designed to evaluate the effects of schooling. PIAAC is administered in individuals’ homes, whereas international student assessments such as PIRLS, PISA, and TIMSS are conducted in schools.

Information collected

The kind of information PIAAC collects also reflects a policy purpose different from that of the other assessments. PIAAC provides policy-relevant data for international comparisons of the U.S. adult population’s competencies and skills and provides vital data to national, state, and local policymakers focused on economic development and adult workforce training.

A.5 Minimizing Burden for Small Entities

The PIAAC Cycle II Field Test will collect information from the 16-74 year-old population through households only. No small entities will be contacted to participate in the PIAAC Cycle II Field Test data collection.

A.6 Frequency of Data Collection

PIAAC is a cyclical study, intended by the OECD to be updated internationally every 10 years. During PIAAC Cycle I, three rounds of data collection were conducted in the U.S.: 2012, 2014, and 2017, as described in Table 1. The PIAAC Cycle II Field Test data collection is scheduled for 2020 while the Main Study data collection will take place in 2021-2022.

Table 1. U.S. PIAAC data collection efforts

International cycle | Data collection name | Sample | Time period
Cycle I | PIAAC 2012 Main Study | 5,100 adults ages 16-65 | 2011-2012
Cycle I | PIAAC 2014 National Supplement | 3,600 adults ages 16-74 and adults in prison (ages 16-74) | 2013-2014
Cycle I | PIAAC 2017 National Supplement | 3,660 adults ages 16-74 | 2017
Cycle II | PIAAC 2020 Field Test | 1,500 adults ages 16-74 | 2020
Cycle II | PIAAC 2022 Main Study | 5,500 adults ages 16-74 | 2021-2022


A.7 Special Circumstances

The special circumstances identified in the Instructions for Supporting Statement do not apply to this study.

A.8 Consultation outside NCES

As noted above, all participating countries develop PIAAC as a cooperative enterprise. PIAAC was developed under the auspices of the OECD by a consortium of organizations. The following are the key persons from these organizations who are involved in the design, development, and operation of PIAAC Cycle II:

  • William Thorn, Senior Analyst, Education and Skills Directorate, Organization for Economic Cooperation and Development, 2, rue André Pascal, 75775 Paris Cedex 16, FRANCE;

  • Irwin Kirsch, Project Director for PIAAC Consortium, Educational Testing Service (ETS) Corporate Headquarters, 660 Rosedale Road, Princeton, NJ 08541; and

  • Jacquie Hogan, Study Director, Westat, 1600 Research Boulevard, Rockville, Maryland 20850-3129.

A.9 Payments or Gifts to Respondents

In recent years, in-person household-based survey response rates have been declining. Research indicates that incentives play an important role in gaining respondent cooperation. To meet PIAAC Cycle II response rate goals, as was done in U.S. PIAAC Cycle I, sampled respondents will be offered $50 via a cash card to thank them for their time and effort spent participating in PIAAC (including completing the background questionnaire and assessment).

An experiment with incentives for completing the Screener will be conducted during the Field Test. Households in the test group will receive $2 in cash in the mail with the advance letter and $5 via a cash card at the completion of the Screener at the sampled address.

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for the PIAAC Cycle II Field Test to ensure that all contractors and agents working on PIAAC comply with all privacy requirements including, as applicable:

  1. The statement of work for the PIAAC contract;

  2. Privacy Act of 1974 (5 U.S.C. §552a);

  3. Privacy Act Regulations (34 CFR Part 5b);

  4. Computer Security Act of 1987;

  5. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  6. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  7. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  8. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;

  9. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  10. The U.S. Department of Education Incident Handling Procedures (February 2009);

  11. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  12. NCES Statistical Standards; and

  13. All new legislation that impacts the data collected through the inter-agency agreement and contract for this study.

Furthermore, all PIAAC contractors and agents will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

All study personnel will sign Westat and PIAAC Cycle II confidentiality agreements, and notarized NCES nondisclosure affidavits will be obtained from all personnel who will have access to individual identifiers. The protocols for satisfying the confidentiality requirements for the PIAAC Cycle II Field Test have been approved by the Institute of Education Sciences (IES) Disclosure Review Board (DRB). PIAAC Consortium organizations, including ETS and Westat, will follow the procedures set in the PIAAC Cycle II Technical Standards and Guidelines to support the data delivery, cleaning, analysis, scaling, and estimation. NCES will work closely with the DRB and the PIAAC Cycle II Consortium organizations to map out the details of the disclosure analysis plan for PIAAC Cycle II.

The physical and/or electronic transfer of personally identifiable information (PII), particularly first names and addresses, will be limited to the extent necessary to perform project requirements. This limitation includes both internal transfers (e.g., transfer of information between agents of Westat, including subcontractors and/or field workers) and external transfers (e.g., transfers between Westat and NCES, or between Westat and another government agency or private entity assisting in data collection). For PIAAC Cycle II, the only transfer of PII outside of Westat facilities is the automated transmission of case reassignments and completed cases between Westat and its field interviewing staff. Westat will delete all PII from the Field Test files before the data is delivered to ETS and IEA for analysis. The only geographic identifiers on the files sent to ETS and IEA will be the Census region and the urban/rural classification. No data coarsening will be done with the Field Test data because these data will not be released further than ETS and IEA.

The transmission of this information is secure, using approved methods of encryption. All field interviewer tablet computers are encrypted using full-disk encryption in compliance with FIPS 140-2 to preclude disclosure of PII should a tablet be lost or stolen. Westat will not transfer PIAAC Cycle II Field Test files (whether or not they contain PII or direct identifiers) of any type to any external entity without the express, advance approval of NCES.

Specifically, for electronic files, direct identifiers will not be included (a Westat-assigned study identifier will be used to uniquely identify cases), and these files will be encrypted according to NCES standards. If these are transferred on media, such as a USB drive, they will be encrypted in compliance with FIPS 140-2.

All PIAAC Cycle II Field Test data files constructed to conduct the study will be maintained in secure network areas at Westat and will be subject to Westat’s regularly scheduled backup process, with backups stored in secure facilities on and off site. These data will be stored and maintained in secure network and database locations where access is limited to the specifically authorized Westat staff assigned to the project who have completed the NCES Affidavit of Non-disclosure. Identifiers will be maintained in files required to conduct survey operations that will be physically separate from other research data including responses to questionnaire and assessment items, sampling frame information, and log data, and accessible only to sworn agency and contractor personnel. Westat will deliver data files, accompanying software, and documentation to NCES at the end of the study. Neither respondents’ names nor addresses will be included in those data files.

The following protocols are also part of the approved System Security Plan: (1) training personnel regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; (2) controlling and protecting access to computer files under the control of a single database manager; (3) building-in safeguards concerning status monitoring and receipt control systems; and (4) having a secured and operator-manned in-house computing facility.

The laws pertaining to the collection and use of personally identifiable information will be clearly communicated to participants in correspondence and prior to administering the PIAAC Cycle II Field Test BQ and assessment. Westat will provide a study introductory letter and brochure to households that will describe the study and its voluntary nature, convey the extent to which respondents and their responses will be kept confidential, and provide the NCES authorization and PRA statements. In addition, before beginning the BQ, the NCES confidentiality statement and the authorization and PRA statements will be shown to the BQ respondent on the tablet screen. The introductory letter and the screen shown to respondents before starting the BQ will include the following language:

The National Center for Education Statistics within the U.S. Department of Education is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). Individuals are never identified in any reports. All reported statistics refer to the U.S. as a whole or to national subgroups.

The National Center for Education Statistics within the U.S. Department of Education is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002; 20 U.S.C. § 9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). Individuals are never identified in any reports. All reported statistics refer to the U.S. as a whole or to national subgroups. According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary survey is 1850-0870. The time required to complete this survey is estimated to average between 5 minutes and two hours per response, depending on the components applicable to the household, including the time to review instructions, gather the data needed, and complete and review the ISASL (PIAAC Cycle II) questionnaire and exercise. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this survey, or any comments or concerns regarding this survey, please write directly to: PIAAC, National Center for Education Statistics, Potomac Center Plaza, 550 12th Street, SW, Washington, DC 20202. Approval expires MM/DD/YYYY.

A.11 Sensitive Questions

The screener and background questionnaire for PIAAC Cycle II Field Test will include questions about race/ethnicity, place of birth, and household income. These questions are considered standard practice in survey research and will conform to all existing laws regarding sensitive information.

A.12 Estimates of Burden

For the PIAAC Cycle II Field Test, the total response burden is estimated at two hours per respondent, including the time to answer the screener (5 minutes) and the background questionnaire (30 minutes), complete the locator and orientation modules (10 minutes), and complete the assessment (60 minutes).

Table 2. Estimates of burden for PIAAC Cycle II Field Test

Data collection instrument | Sample size | Expected response rate | Number of respondents | Number of responses | Burden per respondent (minutes) | Total burden hours

Households
PIAAC Screener | 3,433 | 71.4% | 2,451 | 2,451 | 5 | 205

Individuals
Background Questionnaire | 2,213 | 71.4% | 1,580* | 1,580 | 30 | 790
Locator and Orientation Module | 1,580 | 100% | 1,580* | 1,580 | 10 | 263
Assessment** | 1,580 | 98.1% | 1,550* | 1,550 | 60 | 1,550

Total | NA | NA | 2,451 | 5,611 | NA | 1,258

* Duplicate counts of individuals are not included in the total number of respondents estimate.

** Assessments are exempt from Paperwork Reduction Act reporting and thus are not included in the burden total.

NOTE: See table 4 in Part B for details on the sample yield estimates.

Table 2 presents the estimates of burden for the PIAAC Cycle II Field Test. The target total number of assessment respondents for the PIAAC Cycle II Field Test is 1,550, with a total burden time (excluding the assessment) of 1,258 hours and an expected overall response rate of about 50 percent. In the first row, 3,433 is the number of expected occupied households, computed as the total number of sampled dwelling units multiplied by the occupancy rate (3,928 * 0.874). Of these, 2,451 households are estimated to go through the screener (the number of expected occupied households multiplied by the screener response rate: 3,433 * 0.714). In the second row, 2,213 is the number of sampled persons, computed as the sum of sampled persons ages 16-65 and sampled persons ages 66-74. The number of sampled persons ages 16-65 is the product of (a) the number of completed screeners (2,451), (b) the proportion of households having at least one eligible person ages 16-65 (0.815), and (c) an adjustment for the proportion of households with two sample persons ages 16-65 selected3 (1.072). The number of sampled persons ages 66-74 is the product of (a) the number of completed screeners (2,451), (b) the proportion of households having at least one eligible person ages 66-74 (0.145), and (c) an adjustment to select persons ages 66-74 in approximately one out of every five households (0.2). In the third row, 1,580 is the number of sampled persons estimated to complete the BQ (the product of 2,213 and the BQ response rate of 0.714). Of these, all participants are expected to complete the locator and orientation module, and 1,550 participants are expected to complete the assessment (based on the expected 0.981 assessment response rate).

Assuming an average hourly cost of $24.98 for respondents (see footnote 4), the 1,258 total burden hours translate to an estimated $31,425 in total burden time cost to PIAAC Cycle II Field Test respondents.
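
For transparency, the short calculation below reproduces the yield, burden-hour, and cost arithmetic described in this section using the rates stated above. One-unit differences from the published figures (for example, 2,212 versus 2,213 sampled persons) reflect rounding of intermediate values in the text.

```python
# Reproduces the sample-yield, burden-hour, and burden-cost arithmetic of
# section A.12 using the rates stated in the text. Small discrepancies with
# the published figures reflect where intermediate values were rounded.

# Expected sample yield
occupied_households = 3928 * 0.874                   # ~3,433 occupied dwelling units
completed_screeners = occupied_households * 0.714    # ~2,451 completed screeners

persons_16_65 = completed_screeners * 0.815 * 1.072  # HHs with an eligible 16-65-year-old, two-person adjustment
persons_66_74 = completed_screeners * 0.145 * 0.2    # persons 66-74 sampled in ~1 of 5 HHs
sampled_persons = persons_16_65 + persons_66_74      # ~2,212-2,213 sampled persons

completed_bq = sampled_persons * 0.714               # ~1,580 completed BQs
completed_assessments = completed_bq * 0.981         # ~1,550 completed assessments

# Burden hours (assessment time is excluded from the PRA burden total)
screener_hours = 2451 * 5 / 60    # 204.25 (reported as 205 in Table 2)
bq_hours = 1580 * 30 / 60         # 790
locator_hours = 1580 * 10 / 60    # ~263
total_burden_hours = screener_hours + bq_hours + locator_hours  # ~1,257.6, reported as 1,258

# Burden time cost at the BLS-derived average hourly earnings of $24.98
burden_cost = 1258 * 24.98        # $31,424.84, reported as $31,425

print(round(sampled_persons), round(total_burden_hours), f"${burden_cost:,.2f}")
```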

A.13 Total Annual Cost Burden

There are no additional costs to respondents and no record-keeping requirements.

A.14 Annualized Cost to Federal Government

The total cost to the federal government, including all direct and indirect costs of preparing for and conducting the PIAAC Cycle II Field Test, is estimated to be $3,113,759 (see Table 3 for cost detail).

Table 3. Cost for conducting the PIAAC Cycle II Field Test

Item | Cost
Labor | $1,102,506
Other Direct Costs | $489,252
Respondent Incentives | $113,124
Overhead, G&A, and Fee | $1,273,577
Salaries of Federal Employees | $135,300
TOTAL PRICE | $3,113,759


A.15 Program Changes or Adjustments

This submission is a reinstatement of the PIAAC study and as such represents an increase in burden. In comparison to the now-discontinued last approval for PIAAC, it represents an apparent decrease in burden time, because the last approval was for the PIAAC 2017 National Supplement data collection while this request is to conduct a field test for PIAAC 2022.

Since PIAAC is a cyclical study, the PIAAC Cycle II Field Test and Main Study data collections are building on the Cycle I content and procedures. The most significant changes or adjustments to the program include the use of a tablet computer to administer the study instruments, modules added to the background questionnaire, and the elimination of paper booklets during the Field Test (to be included in the Main Study).

A.16 Plans for Tabulation and Publication

Because the Field Test is intended only for testing sampling and operational procedures in the field, the data from the PIAAC Cycle II Field Test will not be used to produce national estimates.

Technical Report

NCES will not produce a report for the Field Test and there are currently no plans to conduct statistical analyses of the Field Test dataset.

Table 4. PIAAC 2022 Cycle II Schedule

Dates | Activity
November 2019–February 2020 | Select sample for field test
September 2019–March 2020 | Prepare field test data collection manuals, forms, training materials, assessment materials, and questionnaires; train data collection staff
April–June 2020 | Collect field test data (includes recruitment and assessment activities)
August 2020 | Deliver field test raw data to international consortium
January–June 2021 | Select sample for main study
August 2020–July 2021 | Prepare main study data collection manuals, training materials, forms, assessment materials, and questionnaires; train data collection staff
August 2021–March 2022 | Collect main study data (includes recruitment and assessment activities)
May 2022 | Deliver main study raw data to international consortium
January 2023 | Receive data files from international consortium
January–August 2023 | Produce reports



A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The numeracy core competency domain refers to the ability to access, use, interpret, and communicate mathematical information and ideas, in order to engage in and manage the mathematical demands of a range of situations in adult life. Examples of numeracy skills include reading an airport timetable or calculating the sale price of an item based on an advertisement. The numeracy components core competency domain, on the other hand, is aimed at whether an individual has the foundational skills to develop the higher numeracy abilities necessary for functioning in society. Examples of these basic numeracy skills include determining how many objects are shown and which number is the largest.

2 See http://www.ed.gov/policy/rschstat/leg/PL107-279.pdf for the full Education Sciences Reform Act.

3 In households with four or more eligible respondents, two persons will be sampled to participate.

4 The average hourly earnings of adults, derived from the May 2018 Bureau of Labor Statistics (BLS) Occupational Employment Statistics, are $24.98. If the mean hourly wage was not provided, it was computed assuming 2,080 hours per year. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/ datatype: Occupation code: All employees (00-0000); accessed on August 30, 2019.

