
International Computer and

Information Literacy Study (ICILS 2023)

Main Study Sampling, Recruitment, and Data Collection




OMB #1850-0929 v.10





Supporting Statement Part A







Submitted by

National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education





November 2021

revised July 2022



TABLE OF CONTENTS



PREFACE


A. JUSTIFICATION





SUPPORTING STATEMENT PART B

APPENDICES

A: Main Study Recruitment and Parent Materials


B: Draft ICILS U.S. Main Study Questionnaires


C: Non-response Bias Analysis Plan





PREFACE

The International Computer and Information Literacy Study (ICILS) is a computer-based international assessment of eighth-grade students’ computer and information literacy (CIL) skills. ICILS was first administered internationally in 2013 in 21 education systems and again in 2018. The next administration of ICILS will be in 2023. The United States participated for the first time in the 2018 administration. Our participation in this study has provided data on students’ skills and experience using technology to investigate, create, and communicate, and has allowed comparison of U.S. students’ performance and technology access and use with those of their international peers. The 2023 study will allow the U.S. to begin monitoring the progress of its students compared with that of other nations and will provide data on factors that may influence students’ computer and information literacy skills. The data collected through ICILS will provide valuable information for understanding the nature and extent of the “digital divide” and have the potential to inform understanding of the relationship between technology skills and experience and student performance in other core subject areas.

ICILS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the assessment framework, the assessment, and the background questionnaires. The IEA establishes a common set of standards and procedures for collecting and reporting ICILS data and defines the study timeline, all of which must be followed by every participating country. As a result, ICILS is able to provide a reliable and comparable measure of student skills across participating countries. In the U.S., the National Center for Education Statistics (NCES) conducts this study and works with the IEA and RTI International to ensure proper implementation of the study and adherence to the IEA’s standards. Participation in ICILS will allow NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [the Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C. §9543].

The ICILS Assessment Framework defines computer and information literacy (CIL) as an “individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in the community” (Fraillon, Schulz, & Ainley, 2013, p. 18). ICILS reports on eighth-grade students’ abilities to collect, manage, evaluate, and share digital information, as well as their understanding of issues related to the safe and responsible use of electronic information. Achievement scores are reported across four proficiency levels of CIL. ICILS also collects a variety of data to provide context and investigate student access to, use of, and engagement with ICT at school and at home, school environments for teaching and learning CIL, and teacher practices and experiences with ICT.

ICILS 2023 plans to continue collecting and analyzing data to provide a better understanding of the modern, pervasive digital context, as well as of students’ skills in information management, communication, and computational thinking. ICILS 2023 will include more aspects related to digital citizenship, reflecting young people’s increasing opportunities for online citizenship participation and helping measure progress toward UNESCO's Sustainable Development Goal 4.4 (increasing the number of young people who have relevant skills for employment).

In preparation for the ICILS 2023 main study, all countries are asked to implement a field test between March 1 and April 15, 2022.1 The purpose of the ICILS field test is to evaluate new assessment items and background questions, to ensure practices that promote low exclusion rates, and to ensure that classroom and student sampling procedures proposed for the main study are successful. The U.S. ICILS main study will be conducted from March through May 2023, and will involve a nationally-representative sample of at least 3,000 eighth-grade students from a minimum of 150 schools.

The request to conduct the ICILS 2023 main study data recruitment and collection (OMB# 1850-0929 v.9) was approved in April 2022. The materials to be used in the main study are based upon those that were proposed most recently in October 2021. That submission described the overarching plan for all phases of the data collection for the 2023 main study, including the Non-Response Bias Analysis Plan in Appendix C.

Because ICILS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of draft and final questionnaires. For the previous submission, the content of Appendix B was updated with the latest approved drafts of the 2023 ICILS field study questionnaires. After feedback from interested parties, we chose to add new items related to the effects of the COVID pandemic to the main study questionnaires; this submission seeks approval for a questionnaire revision along with 30 days of public comment. We expect the final U.S. versions of the main study questionnaires to be approved by the IEA in late December 2022, at which point we will submit them to OMB for approval as a change request. We expect the final versions to be very similar to the drafts included in this submission.

A. Justification

A.1 Importance of Information

Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and it informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. ICILS provides a unique opportunity to compare U.S. eighth-grade students’ computer and information literacy skills and their access to and use of technology with those of their peers in countries around the world. ICILS was developed internationally in response to the increasing use of information and communication technology (ICT) in modern society and the need for citizens to develop relevant skills in order to participate effectively in the digital age.

Moreover, many international assessments (e.g., TIMSS) and the National Assessment of Educational Progress (NAEP) have nearly completed their transitions from paper-based to technology-based formats. The coronavirus pandemic has made it clear that technology is essential to students’ learning when schools are closed. An important question not currently addressed by any existing national data collection is the extent to which students’ computer skills, experience with digital devices, and exposure to technology-based instruction matter for their performance on technology-based assessments like eTIMSS. To support these transitions, ICILS will provide NCES with information about eighth-grade students’ computer skills in an effort to better understand the possible “digital divide” that may affect U.S. student performance in other subject areas, such as mathematics and science as measured by eTIMSS.

ICILS identifies the strengths and weaknesses of U.S. students’ computer and information literacy skills relative to those of students in other participating countries around the world. It also provides valuable benchmarking information about educational policies enacted in other countries that could inform U.S. educational practices.

Based on other similar international assessment data releases (such as TIMSS and PIRLS), it is likely that the results of this study will draw great attention in the United States and elsewhere. It is therefore expected that ICILS will contribute to ongoing national and international debates and efforts to improve computer and information literacy and support access to and use of technology.

A.2 Purposes and Uses of Data

ICILS assesses computer and information literacy knowledge and skills at grade 8 cross-nationally. ICILS asks how well students are prepared for life in the information age, exploring several key questions about student CIL and its contexts: (1) How does student computer and information literacy vary within and between countries? (2) What factors influence students’ computer and information literacy? and (3) What can education systems and schools do to improve students’ computer and information literacy? In order to gather data to explore such questions, ICILS also collects background information on students, teachers, schools, and official education policies.

Data compiled and collected from ICILS 2023 will allow for evidence-based decisions to be made for the purposes of educational improvement. The study will provide policymakers and education systems with an important data source on the contexts and outcomes of ICT-related education programs, and the role of schools and teachers in supporting students’ computer and information literacy achievement.

Through participation in ICILS and other international assessment programs, NCES is able to provide comparative indicators on student performance and school practices across countries in order to benchmark U.S. student performance, and to suggest hypotheses about the relationship between student performance and factors that may influence performance as well as areas in which students have strengths or weaknesses. The international studies identify differences among countries that can inform discussions about how to improve educational contexts and outcomes.

NCES’s mandate [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations," and the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. ICILS is essential for any international perspective on students’ computer and information literacy, and U.S. participation in ICILS is aligned with both the national and international aspects of NCES’s mission.

ICILS 2023 Components

ICILS consists of a computer-based student assessment and background questionnaires for students, teachers, ICT coordinators, and school principals.

The CIL student assessment framework for 2023 is based on the structure of the 2018 framework, with some modifications. Under the 2023 framework, ICILS will continue assessing two dimensions: computational thinking (dimension 1) and digital information (dimension 2). Each dimension is divided into strands, which are the overarching conceptual categories for framing the skills and knowledge addressed by the assessment instruments. Under computational thinking, two strands will be measured: conceptualizing problems and operationalizing solutions. Computational thinking is a newer dimension, added to the framework in 2018, and is optional for participating countries; the United States will continue to participate in it. Under the digital information dimension, four strands will be measured: understanding computers, gathering information, producing information, and digital communication.

Assessment Instruments

The student assessment is composed of modules that include three types of tasks: (i) multiple-choice or constructed-response items based on realistic stimulus material; (ii) software simulations of generic applications requiring students to complete an action in response to an instruction; and (iii) authentic tasks that require students to modify and create information products using 'live' computer software applications. The assessment will include 6 trend modules (used in the 2018 administration of ICILS) and 8 new modules (5 of the new modules fall under the computational thinking dimension and 3 under the computer and information literacy dimension), each expected to take 30 minutes to complete. To reduce burden on individual students, each student will complete 4 assessment modules according to a module rotation design, for a total of 120 minutes.

The ICILS assessment will be administered to students on a digital device. The type of device and specific model (a computer or a tablet with an attached keyboard and mouse) is to be determined by each country. Based on usability testing by the international contractor, countries were provided specifications and chose a single device that meets the requirements and works within their country for both the field test and the main study. The United States will use Chromebook computers to administer the field test and will use the same devices for the main study.

Questionnaires

The background questionnaires for ICILS 2023 were developed to address the issues outlined in the ICILS context questionnaire framework. In accordance with international study procedures, the United States will use the international questionnaire, but will adapt some questions to fit the U.S. education context, as appropriate, and will add a few questions, such as student race/ethnicity.

School and ICT Questionnaires. The school questionnaire consists of two parts: one is completed by the principal and is expected to take 15 minutes, and the other is completed by the ICT coordinator (or a school administrator, designated by the principal, who is familiar with ICT in the school) and is also expected to take 15 minutes (thus, 30 minutes total). The questionnaire will provide information on computer use, ICT resources, and relevant policies and practices in the school context. The school questionnaire will be offered online, with a paper-and-pencil backup.

Teacher Questionnaire. A teacher questionnaire will be administered to a random sample of eighth-grade teachers in each school and is expected to take about 30 minutes to complete. This questionnaire will provide information on computer use, ICT resources, and relevant policies and practices in the school context. The teacher questionnaire will be offered online, with a paper-and-pencil backup.

Student Questionnaire. A student questionnaire, which is computer-based, is expected to take 30 minutes, and is completed by each student after the student assessment. The student questionnaire gathers information about computer use in and outside of school, attitudes to technology, self-reported computer proficiency, and background characteristics such as gender and race/ethnicity.

A.3 Improved Information Technology (Reduction of Burden)

The ICILS 2023 design and procedures are prescribed internationally, and data collection involves computer-based student assessments and questionnaires as well as online or paper-and-pencil questionnaires for schools and teachers. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to accommodate respondent preferences.

A communication website will be used for ICILS 2023 to provide a simple, single source of information that engages schools and maintains high levels of school involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored to their anticipated needs. We plan to gather eighth-grade class/student and teacher lists from participating schools electronically using a secure electronic filing (e-filing) process, in which schools submit lists of student information, including student background information from school records. Instructions to school coordinators on how to submit student and teacher lists are included in Appendix A. E-filing has been used successfully in NAEP for more than 10 years and was used in the TIMSS 2019 and PISA 2018 assessments. The electronic filing system provides advantages such as efficiency and data quality checks.

A.4 Efforts to Identify Duplication

In the United States, the National Assessment of Educational Progress (NAEP) technology and engineering literacy (TEL) assessment was administered to a nationally-representative sample of eighth-grade students in 2014 and 2018, and the next administration is currently scheduled for 2024. TEL refers to the capacity to use, understand, and evaluate technology, as well as to understand the technological principles and strategies needed to develop solutions and achieve goals. NAEP TEL was completely computer-based and included interactive scenario-based tasks. ICILS does not duplicate the NAEP TEL assessment, as its focus on computer and information literacy differs from TEL’s focus on technology and engineering literacy. In addition, TEL does not include a teacher questionnaire to collect information directly on how teachers use technology in their practice.

ICILS 2023 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent U.S. participation in international studies involving a broad range of countries. As part of international cooperative studies, the United States must collect the same information at the same time as the other nations so that valid comparisons can be made both with other countries and with potential future ICILS data collections. While some studies in the United States may collect similar, though not identical, kinds of information (e.g., NAEP TEL), the data from those studies cannot be substituted for the information collected in ICILS because they do not allow for comparisons outside the United States. Furthermore, the data collected through ICILS are based on a unique framework that is not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in the other countries. Because the items measuring computer and information literacy have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.

A.5 Minimizing Burden for Small Entities

The school samples for ICILS contain small-, medium- and large-size schools, including private schools, selected based on probability proportionate to their size. All school sizes are needed to ensure an appropriate representation of each type of school in the selected sample of schools. Burden will be minimized wherever possible. In addition, national contractor staff will bring laptops or tablets to the schools to conduct the assessment, will conduct all test administrations, and will assist with parental notification, consent forms, sampling, and other tasks as much as possible within each school.

A.6 Frequency of Data Collection

The main study data collection will take place from March through May 2023. This timeline is prescribed by the international contractor for ICILS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain potential trend lines. Future ICILS data collections are not yet determined.

A.7 Special Circumstances

None of the special circumstances identified in the Instructions for Supporting Statement apply to the ICILS study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of computer and information literacy and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the project, usually with advice from national subject-matter and testing experts.

The majority of the consultations (outside NCES) involve IEA as the international study center for ICILS. IEA works closely with the national centers of participating countries. Key staff from IEA include: Mr. Julian Fraillon (research director) and Dr. Juliane Kobelt (assessment coordinator), both of whom have extensive experience in developing and operating international education surveys (especially related to ICILS).

A.9 Payments or Gifts to Respondents

In order to achieve acceptable school response rates in international studies, incentives are typically offered at the school, staff, and student levels to thank participants for the time they invest in, and the space they make available for, the international assessments. High response rates are required by both the IEA and NCES and are difficult to achieve in voluntary school-based studies. ICILS standards for participating countries require minimum participation rates for schools, classrooms, and students. Without sufficient participation levels, U.S. results may be flagged as unreliable in international reports or excluded from international reporting altogether. To be included in cross-national comparisons, each country must meet one of the following standards (a brief illustrative example follows the list):

        • A weighted or un-weighted school response rate with replacement of at least 85 percent AND an un-weighted overall student/teacher response rate of at least 85 percent; OR

        • The product of the weighted school response rate with replacement and the weighted overall student/teacher response rate of at least 75 percent.
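
For example, using hypothetical rates: a country with a weighted school response rate after replacement of 80 percent would fall short of the first standard, but if its weighted overall student response rate were 95 percent, the product of the two rates would be 0.80 × 0.95 = 0.76, or 76 percent, which satisfies the second standard.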



The U.S. has historically had difficulty achieving sufficient participation levels. To maximize the likelihood of participation, we propose a set of monetary and nonmonetary incentives. The incentives for ICILS 2023 are listed in Table A1, followed by a description and justification of each.

Table A1. ICILS 2023 Main Study Incentive Amounts

School
  • $200 or equivalent gift certificate for school supplies
  • Interactive webinar (for up to 3 staff); schools select one topic: social and emotional learning, project-based learning, or STEM
  • $800 or equivalent gift certificate for targeted schools that are the most difficult to recruit (see details in the section below)²

School coordinator
  • $100 or equivalent gift certificate for school supplies
  • Certificate of service (can be used toward continuing education if the district allows)

ICT coordinator
  • $25 or equivalent gift card for school supplies
  • Certificate of service (can be used toward continuing education if the district allows)

Teacher
  • $25 or equivalent gift card for school supplies
  • Certificate of service (can be used toward continuing education if the district allows)

Student
  • Token incentive, ~$4 value
  • Certificate of service (in schools where permitted)
  • Community service hours (in schools where permitted)



School incentives. As with other international assessments such as TIMSS and PIRLS, all schools contacted to participate in ICILS 2023 will be offered $200 as a token of our appreciation for participating. Each participating school will also be invited to send up to three staff to a 90-minute interactive webinar delivered online as a virtual workshop. Staff will be able to choose from one of three timely topics:

  • STEM and Inquiry-Based Learning. This inquiry-based learning session will embed a common language and instructional strategy across disciplines to foster blended learning and other processes that model curiosity, design thinking, and reflective learning. This option will provide a system that maximizes the staff’s opportunity to grow and succeed in teaching STEM through inquiry.

  • Social and Emotional Learning (SEL). Schools will learn how to create an environment where SEL can take place and build competence in self-awareness and social awareness in adults.

  • Project-Based Learning. The session is designed to support inquiry-driven project-based learning (PBL) for educators who are early or intermediate in their exploration of the concept. This workshop will engage participants in designing, assessing, and managing standards-focused exploration, as well as in using performance assessment to judge the relevant work generated by 21st-century learners.

For the ICILS 2023 main study, participating schools will also receive a school-level report; those with sufficient participation will receive indicators of performance. These reports are a condition of participation for many schools in large districts, which expect timely, relevant, and actionable data in exchange for participating.

While we hope that these incentives will be sufficient to meet the IEA participation requirements, school recruitment has become increasingly difficult in the U.S. and will likely be even more difficult given the loss of instruction and learning during the COVID-19 pandemic. To ensure that we are able to achieve the required participation targets for ICILS 2023, we plan to offer a second-tier incentive of up to $800 for the schools that are most challenging to recruit. This second-tier incentive would be used judiciously and strategically to increase participation in the following target groups:

  • Schools that have declined to participate but are in a sampling stratum that is falling short of participation targets; we may also return to original schools where the original school and both of its substitutes have declined to participate;

  • Private schools in the original or substitute sample. In ICILS 2018, only 40% of original private schools participated, with an overall rate of about 57% after substitutes were added; and

  • Substitute schools for which recruitment does not begin prior to July 2022. During previous TIMSS cycles, original schools typically refused in the mid to late fall prior to the spring assessment, at which point substitute schools were recruited. Recruitment of substitute schools is a challenge (fewer than 25 percent participate) because the school year is already underway and calendars are set.

This second-tier incentive has been used repeatedly and successfully across the NCES international studies, and it has recently been approved for two international studies currently in the field (PIRLS 2021 and PISA 2022), to help ensure that the studies meet the international participation requirements for inclusion in cross-national comparisons. Table A2 shows the history and impact of this second-tier incentive across U.S. participation in international studies.

Table A2. History and Impact of the Second-Tier Incentive on NCES International Studies

PISA 2012 Field Test
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Conducted as an experiment
  Impact: Considered successful and used for the main study

PISA 2012 Main Study
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: At the end of the prior academic year, after all original schools were contacted
  Impact: Met the minimum participation requirement after including substitute schools

PISA 2015
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: At the end of the prior academic year, after all original schools were contacted
  Impact: Met the minimum participation requirement after including substitute schools

PIRLS 2016
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Midway through recruitment
  Impact: Accepted by eight schools (about 20 percent of the schools offered the $800), which enabled the study to achieve the minimum participation target after including substitute schools

ICILS 2018
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Offered in the middle of data collection, through an OMB change request, after it became clear that participation rates would not be met with the Tier 1 $200 incentive alone
  Impact: About 20 percent of these schools were successfully recruited even with the late start; although ICILS fell short of its target participation rate, an earlier implementation may have helped achieve the required participation

TIMSS 2019
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Throughout recruitment for private and NAEP-overlap schools; beginning fall 2018 for substitute and refusal schools
  Impact: Accepted by 159 schools (about 46 percent of the schools offered the $800), which enabled the study to achieve the minimum participation target after including substitute schools

PIRLS 2021
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Recruitment in progress
  Impact: Recruitment in progress

PISA 2022
  Approved incentive: Tier 1: $200; Tier 2: up to $800 for a subset of schools
  Timing of initiation: Recruitment in progress
  Impact: Recruitment in progress



School coordinator incentive. The school staff serving as School Coordinators will receive $100 for their time and effort. The School Coordinator serves a critical role in data collection, functioning as the central school contact and facilitating arrangements for the assessments. They are asked to file class and student listing forms; arrange the date, time and space for the assessment; and disseminate information and consent forms to parents and students.

A check will be mailed to each school in the amount of $200, and to each school coordinator in the amount of $100, once the ICILS assessment has been conducted in their schools.

Student incentives. Consistent with other international assessments, students will receive a small gift valued at approximately $4 as a token of appreciation for their participation. Students will also receive a certificate bearing their name that thanks them for participating in ICILS and representing the United States and that credits them with three hours of community service. Some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor.

Teacher incentives. Teachers will be offered $25 for completing the ICILS teacher questionnaire to encourage their participation. To avoid sending up to 20 checks to each school for the school coordinator to distribute to teachers who complete the questionnaire, electronic Amazon gift cards in the amount of $25 will be used. Teacher email addresses are not collected prior to the assessment. Teacher invitation cards that provide information about how to access the online teacher questionnaire are distributed to selected teachers by the school coordinator. The card will include instructions for the teacher to email the ICILS Staff Help Desk upon completion of the questionnaire and to provide his or her email address. Once completion of the questionnaire is confirmed, the code to access the Amazon electronic gift card will be emailed to the teacher. In this way, teachers will see the direct link between completing the questionnaire and receiving the $25 thank-you token, and they will receive the incentive very quickly after survey completion. Amazon gift cards will be used because they have no associated fees, unlike other cash card programs.

ICT Coordinator incentives. Each ICT coordinator will be offered $25 per completed ICT coordinator questionnaire, as either a check or a gift card for school supplies. The ICT coordinator is the person with designated responsibility for ICT in the school who is knowledgeable about ICT and will complete the ICT coordinator questionnaire. This incentive aligns with the teacher incentives offered in this and other NCES international comparison studies and is necessary to obtain participation, as these staff are often busy and spread across multiple schools.

School administrators. Historically, participation is high among school administrators without offering incentives; therefore, no incentive will be offered for completion of the school administrator questionnaire.

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for ICILS to ensure that RTI International and its subcontractors comply with all privacy requirements, including:

  1. The statement of work of this contract;

  2. Privacy Act of 1974 (5 U.S.C. §552a);

  3. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002;

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the inter-agency agreement for this study.

Furthermore, RTI International will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

The laws pertaining to the use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in Appendix A):

The National Center for Education Statistics (NCES) is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), and to collect students’ education records from educational agencies or institutions for the purpose of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). In the United States, ICILS is conducted by NCES, part of the U.S. Department of Education, and the data are being collected by RTI International. The U.S. Office of Management and Budget has approved the data collection under OMB #1850-0929.

The following statement will appear on the login page for ICILS and the front cover of the printed questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaire):

The National Center for Education Statistics (NCES) is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), and to collect students’ education records from educational agencies or institutions for the purpose of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). In the United States, ICILS is conducted by NCES, part of the U.S. Department of Education, and the data are being collected by RTI International. The U.S. Office of Management and Budget has approved the data collection under OMB #1850-0929.

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0929. The time required to complete this information collection is estimated to average [XX] minutes per [respondent type], including the time to review instructions [, search existing data resources, gather the data needed,] and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or questions about the status of your individual submission of this form, write directly to: International Computer and Information Literacy Study (ICILS), National Center for Education Statistics, PCP, 550 12th St., SW, 4th floor, Washington, DC 20202.

OMB No. 1850-0929, Approval Expires xx/xx/20yy.

The ICILS confidentiality plan includes the signing of confidentiality agreements and nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. Also included in the plan are personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file.

NCES understands the legal and ethical need to protect the privacy of the ICILS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ICILS 2023 data when preparing the data files for use by researchers, in compliance with 20 U.S.C. §9573. Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.

A.11 Sensitive Questions

The questionnaires do not include items considered to be of a sensitive nature.

A.12 Estimates of Burden

This package shows the estimated burden to respondents for all ICILS 2023 activities and requests approval of the respondent burden for the main study recruitment and data collection; the field test burden, which was approved separately, is shown for context only. Burden estimates are shown in Table A.3.

Table A3. Burden estimates for ICILS 2023 field test and main study recruitment and data collection

Activity | Sample Size | Expected Response Rate | Number of Respondents | Number of Responses | Average Burden Time per Response (minutes) | Total Burden (hours) | Estimated Respondent Average Hourly Earnings¹ | Estimated Respondent Burden Time Cost

Pilot Field Test²
District Notification | 8 | 100% | 8 | 8 | 5 | 1 | $49.52 | $50
School Recruitment | 7 | 70% | 5 | 5 | 20 | 2 | $49.52 | $99
School Coordinators (1 per participating school) | 5 | 100% | 5 | 5 | 240 | 20 | $31.39 | $628
Students’ Parents (permission) | 125 | 80% | 100 | 100 | 10 | 17 | $27.07 | $460
Student Directions* | 125 | 80% | 100 | 100 | 30 | 50 | -- | --
Student Assessment* | 125 | 80% | 100 | 100 | 150 | 250 | -- | --
Student Questionnaire | 125 | 80% | 100 | 100 | 30 | 50 | $7.25 | $363
Principal Questionnaire (1 per school) | 5 | 100% | 5 | 5 | 15 | 2 | $49.52 | $100
Teacher Questionnaire | 30 | 85% | 26 | 26 | 30 | 13 | $31.39 | $409
ICT Coordinator Questionnaire (1 per school) | 5 | 100% | 5 | 5 | 15 | 2 | $31.39 | $63
Total Pilot Field Test² | -- | -- | 254 | 254 | -- | 107 | -- | $2,172

Main Study
Nonparticipating Districts | 170 | 30% | 51 | 51 | 10 | 9 | $49.52 | $446
Participating Districts | (same sample) | 70% | 119 | 119 | 10 | 20 | $49.52 | $990
District IRB Staff Study Approval | 68 | 100% | 68 | 68 | 120 | 136 | $49.52 | $6,735
District IRB Panel Study Approval³ | 340 | 100% | 340 | 340 | 120 | 680 | $49.52 | $33,674
Original Nonparticipating Eligible Schools | 225 | 40% | 90 | 90 | 20 | 30 | $49.52 | $1,486
Original Participating Eligible Schools | (same sample) | 60% | 135 | 135 | 20 | 45 | $49.52 | $2,229
Replacement Nonparticipating Eligible Schools | 450⁴ | 96.75% | 435 | 435 | 20 | 145 | $49.52 | $7,181
Replacement Participating Eligible Schools | (same sample) | 3.25% | 15 | 15 | 20 | 5 | $49.52 | $248
School Coordinators | 150 | 100% | 150 | 150 | 240 | 600 | $31.39 | $18,834
Students’ Parents (permission) | 3,370 | 90% | 3,033 | 3,033 | 10 | 506 | $27.07 | $13,698
Student Survey | 3,370⁵ | 89% | 3,000 | 3,000 | 30 | 1,500 | $7.25 | $10,875
Student Directions* | 3,370 | 89% | 3,000 | 3,000 | 30 | 1,500 | -- | --
Student Assessment* | 3,370 | 89% | 3,000 | 3,000 | 120 | 6,000 | -- | --
Teacher Survey | 2,250⁶ | 95% | 2,138 | 2,138 | 30 | 1,069 | $31.39 | $33,556
School Survey (Principal) | 150 | 95% | 143 | 143 | 15 | 36 | $49.52 | $1,783
ICT Coordinator Survey | 150 | 95% | 143 | 143 | 15 | 36 | $31.39 | $1,130
Main Study Total (requested burden) | -- | -- | 9,860 | 9,860 | -- | 4,817 | -- | $132,865

* Rows marked with an asterisk represent assessment activities, which are excluded from calculations of burden.

1 The average hourly earnings derived from the May 2020 Bureau of Labor Statistics (BLS) Occupational Employment Statistics are $27.07 for parents, $31.39 for teachers, and $49.52 for education administrators. If a mean hourly wage was not provided, it was computed assuming 2,080 hours of work per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: All employees (00-0000), Teachers (25-2020), Education Administrators (11-9032); accessed October 14, 2021.

2 This burden was requested and approved in a separate package, International Computer and Information Literacy Study (ICILS 2023) Pilot Field Test (OMB# 1850-0803 v.304), which can be inspected at https://www.reginfo.gov/public/do/PRAViewIC?ref_nbr=201903-1850-001&icID=249841. The burden is shown here to illustrate the function of the field test in the larger project and is not included in our burden calculations for this request.

3 Based on the estimate that 40% of the 170 sampled districts (68 districts) will require special handling and that, on average, there will be five individuals per review panel (68 × 5 = 340 panelists).

4 Two replacement schools will be sampled for each original school, but a replacement school will be contacted only if the original school declines to participate.

5 Sample size assumes that approximately one percent of students will be ineligible, and that 90 percent of eligible students’ parents will review permission materials.

6 Based on an estimated average of 15 teachers per school.


Some districts, known as “special handling districts,” require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Based on an initial assessment of previous data collections for similar studies, such as TIMSS, we have estimated the number of special handling districts in the main study sample (shown in Table A.3). Contacting special handling districts begins with updating district information based on what can be gleaned from online sources. Calls are then placed to verify where to send the completed research application forms and, if necessary, to collect contact information for this process. During the call, inquiry is also made about the amount of time the district spends reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy approval processes. This operation can take 6 to 8 months and should begin as early as possible to allow sufficient time for special handling districts’ review processes. We will continue to work with these districts until we receive a final response (approval or denial of the request), up to the end of data collection in May 2023.

For the main study, the target sample size for the United States is 150 schools and 3,000 students. The minimum sample size requirements for the field test were 32 schools and 570 students. The time required for students to respond to the assessment (cognitive items) portion of the study and the associated directions is shown in the rows marked with an asterisk and is not included in the totals because it is not subject to the PRA. Student, administrator, and teacher questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time for school districts that require a research application and approval to review study plans before their schools may be contacted, as well as the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent notification and consent materials, and arranging assessment space.
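
To illustrate how the student counts in Table A.3 follow from these assumptions (see note 5 to the table): approximately 90 percent of the 3,370 sampled students’ parents, or about 3,033 parents, are expected to review permission materials; and, with roughly one percent of students ineligible, about 0.99 × 0.90 ≈ 89 percent of the sampled students, or approximately 3,000 students, are expected to complete the questionnaire and assessment.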

Burden time costs are based on estimated hourly rates of $49.52 for principals/administrators, $31.39 for school coordinators and teachers, $27.07 for parents, and $7.25 for students. For the ICILS main study, a total of 4,817 burden hours is anticipated, resulting in an estimated burden time cost to respondents of approximately $132,865.
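
As an illustration of how these costs are derived from the figures in Table A.3, consider the teacher survey row: 2,250 sampled teachers at a 95 percent expected response rate yield about 2,138 respondents; 2,138 responses at 30 minutes each amount to roughly 1,069 hours; and 1,069 hours at $31.39 per hour correspond to an estimated cost of approximately $33,556.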

A.13 Total Annual Cost Burden

No cost to respondents is anticipated beyond the estimated burden cost described in Section A.12.

A.14 Annualized Cost to Federal Government

The cost to the federal government for conducting ICILS 2023 main study recruitment, data collection, and scoring is estimated to be $2,894,802 over a 3-year period (see the table breakdown below). These figures include all direct and indirect costs. The cost of the ICILS 2023 Pilot Field Test, as reported and approved in the International Computer and Information Literacy Study (ICILS 2023) Pilot Field Test (OMB# 1850-0803 v.304) package, is $494,350, bringing the total cost of administering ICILS 2023 to $3,389,152.

Components with breakdown | Estimated costs

MAIN STUDY (2023)
Assessment Preparations (sampling, instruments, training, etc.) | $433,449
Recruitment | $949,316
Data Collection | $1,186,710
Scoring and Dataset Preparation | $325,327
Total for this request | $2,894,802



A.15 Program Changes or Adjustments

The respondent burden reflects burden for the ICILS 2023 main study recruitment and data collection.

A.16 Plans for Tabulation and Publication

Based on the data collected in the main study, IEA will prepare an international report to be released in November 2024. As has been customary, NCES will also release a report at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics. There are currently no plans to conduct complex statistical analyses of either dataset. An example of a similar report on another international assessment can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013010rev. In the spring of 2025, IEA will also release the international technical report, describing the design and development of the assessment as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User’s Guide for the study.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of ICILS 2023 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are provided in Table A.4.

Table A.4. Schedule of Activities for ICILS 2023 Pilot Field Test and Main Study.


Activity | Start Date | End Date

Pilot Field Test
Prepare data collection manuals, forms, assessment materials, and questionnaires | October 2021 | March 2022
Select school sample | September 2021 | October 2021
Recruitment of states, districts, schools | November 2021 | April 2022
Collect field test data | March 2022 | April 2022
Deliver raw data to international sponsoring organization | April 2022 | May 2022

Main Study
Prepare data collection manuals, forms, assessment materials, and questionnaires | May 2022 | March 2023
Select school sample | March 2021 | July 2021
Recruitment of states, districts, schools for main study | January 2022 | May 2023
Collect main study data | March 2023 | May 2023
Deliver raw data to international sponsoring organization | July 2023 | July 2023
Receive final data files from international sponsors | June 2024 | June 2024
Produce reports | June 2024 | December 2024




A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 In October 2021 NCES submitted and OMB approved a separate package for the ICILS 2023 Pilot Field Test (OMB# 1850-0803 v.304), which can be inspected at https://www.reginfo.gov/public/do/PRAViewIC?ref_nbr=201903-1850-001&icID=249841. Although this submission refers to research activities associated with the field test, in this submission NCES is not requesting review or approval of the procedures, burden, budget, or instruments for the ICILS 2023 Pilot Field Test.

2 The incentive structure detailed in Table A1 is identical to what was offered to respondents during the ICILS 2023 Field Test (OMB# 1850-0803 v.304) with the exception of the additional incentive for targeted schools, which is held in reserve for main study use only.

