Part A ICILS 2018 Field Test & MS Recruitment


International Computer and Information Literacy Study (ICILS 2018) Field Test and Recruitment for Main Study

OMB: 1850-0929





International Computer and Information Literacy Study (ICILS 2018) Field Test and Recruitment for Main Study




Request for OMB Clearance

OMB# 1850-0929 v.5




Supporting Statement Part A




Submitted by:



National Center for Education Statistics (NCES)

Institute of Education Sciences (IES)

U.S. Department of Education

Washington, DC











Revised April 2017




TABLE OF CONTENTS



PREFACE


A. JUSTIFICATION





SUPPORTING STATEMENT PART B

APPENDICES

A: Field Test and Main Study Recruitment and Parent Materials


B: ICILS U.S. Field Test Questionnaires





PREFACE

The International Computer and Information Literacy Study (ICILS) is a computer-based international assessment of eighth-grade students’ computer and information literacy (CIL) skills. ICILS was first administered internationally in 2013 in 21 education systems and will be administered again in 2018. The United States will participate for the first time in the 2018 administration. U.S. participation in this study will provide data on students’ skills and experience using technology to investigate, create, and communicate, and will allow comparison of U.S. students’ performance and technology access and use with those of their international peers. This study will also allow the U.S. to begin monitoring the progress of its students compared with that of other nations and will provide data on factors that may influence student computer and information literacy skills. The data collected through ICILS will provide valuable information for understanding the nature and extent of the “digital divide” and have the potential to inform understanding of the relationship between technology skills and experience and student performance in other core subject areas.

ICILS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that creates the assessment framework, the assessment, and the background questionnaires. The IEA establishes a common set of standards and procedures for collecting and reporting ICILS data and defines the study timeline, all of which must be followed by all participating countries. As a result, ICILS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES) conducts this study and works with the IEA and Westat to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards. Participation in ICILS will allow NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C. § 9543].

The ICILS Assessment Framework defines computer and information literacy (CIL) as an “individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in the community” (Fraillon, Schulz, & Ainley, 2013, p. 18). ICILS reports on eighth-grade students’ abilities to collect, manage, evaluate, and share digital information, as well as their understanding of issues related to the safe and responsible use of electronic information. Achievement scores are reported across four proficiency levels of CIL. ICILS also collects a variety of contextual data to investigate student access to, use of, and engagement with information and communication technology (ICT) at school and at home, school environments for teaching and learning CIL, and teacher practices and experiences with ICT.

In preparation for the ICILS 2018 main study, all countries are asked to implement a field test in 2017. The purpose of the ICILS field test is to evaluate new assessment items and background questions, to ensure practices that promote low exclusion rates, and to ensure that classroom and student sampling procedures proposed for the main study are successful. In selecting a school sample for this purpose, it is important to minimize the burden on schools, districts, and states, while also ensuring that the field test data are collected effectively.

Data collection for the field test will occur from May through October 2017. The United States will administer the field test assessment and student questionnaire to a minimum of 500 students in approximately 32 schools; in each school a minimum of 20 students will be randomly selected from a comprehensive list of all eighth-grade students. In each school a minimum of 15 target grade teachers will be randomly selected from a comprehensive list of all eighth-grade teachers to complete the teacher questionnaire. The school principal and/or designee will complete a school questionnaire. The U.S. ICILS main study will be conducted from February through May 2018, and will involve a nationally-representative sample of at least 9,000 eighth-grade students from a minimum of 300 schools.

This submission describes the overarching plan for all phases of the data collection, including the 2018 main study. In addition to supporting statements Parts A and B, Appendix A provides field test and main study recruitment materials consisting of letters to state and district officials and school principals, text for an ICILS field test brochure, “Frequently Asked Questions,” a “Summary of Activities,” parental notification and consent materials, and student and teacher listing instructions. Appendix B provides the ICILS field test background questionnaires.

Because ICILS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of draft and final questionnaires. In order to meet the international data collection schedule for the spring 2017 field test, recruitment activities are scheduled to begin in October 2016. Recruitment for the main study will begin as early as May 2017, both to align with recruitment for other NCES studies (e.g., the Trends in International Mathematics and Science Study (TIMSS) and the National Assessment of Educational Progress (NAEP)) and to allow schools to put the ICILS assessment on their calendars. We expect the main study materials and procedures to be very similar to those used in the field test. Therefore, this submission requests approval for:

  1. recruiting for the 2017 field test and 2018 main study;

  2. conducting the 2017 field test data collection; and

  3. a description of the overarching plan for all of the phases of the data collection, including the 2018 main study.1

If needed, in summer 2017, we will submit a change-request memo describing any changes to the design, procedures, and respondent burden for the main study. In late 2017, we will submit a clearance package, with a 30-day public comment period announced in the Federal Register, that will include the final main study instruments for the February through May 2018 data collection. The main study questionnaires will be a subset of the field test instruments.

A. Justification

A.1 Importance of Information

Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. ICILS provides a unique opportunity to compare U.S. eighth-grade students’ computer and information literacy skills and access to and use of technology with those of their peers in countries around the world. ICILS was developed internationally as a response to the increasing use of information and communication technology (ICT) in modern society and the need for citizens to develop relevant skills in order to participate effectively in the digital age.

Moreover, many international assessments and the National Assessment of Educational Progress (NAEP) are transitioning from paper-based to technology-based formats. The Trends in International Mathematics and Science Study (TIMSS) is making this transition for its next administration in 2019 (eTIMSS). An important question not currently addressed by an existing national data collection is the extent to which students’ computer skills, experience with digital devices, and technology-based instruction matter for their performance on technology-based assessments like eTIMSS. In order to support these transitions, ICILS will provide NCES with information on eighth-grade students’ computer skills in an effort to better understand the possible “digital divide” that may affect U.S. student performance in other subject areas, such as mathematics and science as measured by eTIMSS.

ICILS identifies the strengths and weaknesses of U.S. students’ computer and information literacy skills relative to those of students in other participating countries around the world. It also provides valuable benchmarking information about educational policies enacted in other countries and policies that could be applied to U.S. educational practices.

Based on other similar international assessment data releases (such as TIMSS and PIRLS), it is likely that the results of this study will draw great attention in the United States and elsewhere. It is therefore expected that ICILS will contribute to ongoing national and international debates and efforts to improve computer and information literacy and support access to and use of technology.

A.2 Purposes and Uses of Data

ICILS assesses computer and information literacy knowledge and skills at grade 8 cross-nationally. ICILS asks how well students are prepared for life in the information age, exploring several key questions about student CIL and its contexts: (1) How does student computer and information literacy vary within and between countries? (2) What factors influence students’ computer and information literacy? and (3) What can education systems and schools do to improve students’ computer and information literacy? In order to gather data to explore such questions, ICILS also collects background information on students, teachers, schools, and official education policies.

Data compiled and collected from ICILS 2018 will allow for evidence-based decisions to be made for the purposes of educational improvement. The study will provide policymakers and education systems with an important data source on the contexts and outcomes of ICT-related education programs, and the role of schools and teachers in supporting students’ computer and information literacy achievement.

Through participation in ICILS and other international assessment programs, NCES is able to provide comparative indicators on student performance and school practices across countries in order to benchmark U.S. student performance, and to suggest hypotheses about the relationship between student performance and factors that may influence performance as well as areas in which students have strengths or weaknesses. The international studies identify differences among countries that can inform discussions about how to improve educational contexts and outcomes.

NCES’s mandate [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that “The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations,” and the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. ICILS is essential for any international perspective on students’ computer and information literacy, and U.S. participation in ICILS is aligned with both the national and international aspects of NCES’s mission.

ICILS 2018 Components

ICILS consists of a computer-based student assessment and background questionnaires for students, teachers, and school principals.

The CIL student assessment framework for 2018 is based on the structure of the 2013 framework with some modifications. Under the 2018 framework, CIL will be assessed as two dimensions with the working titles computational thinking (dimension 1) and digital information (dimension 2). Each dimension is divided into strands, which refer to the overarching conceptual categories framing the skills and knowledge addressed by the CIL instruments. Under computational thinking, two strands will be measured: conceptualizing problems and operationalizing solutions. Computational thinking is a new dimension added to the framework for 2018 and is optional for participating countries; the United States will participate in this dimension. Under the digital information dimension, four strands will be measured: understanding computers, gathering information, producing information, and digital communication. The digital information dimension is largely equivalent to the CIL construct reported in ICILS 2013, and trend analysis will be based on this dimension.

Assessment Instruments

The student CIL assessment is composed of modules that include three types of tasks: (i) multiple-choice or constructed-response items based on realistic stimulus material; (ii) software simulations of generic applications requiring students to complete an action in response to an instruction; and (iii) authentic tasks that require students to modify and create information products using ‘live’ computer software applications. The CIL assessment will include 3 trend modules (used in the 2013 administration of ICILS) and 4 new modules (2 of which fall under the computational thinking dimension), each expected to take 30 minutes to complete. To reduce burden on individual students, each student will complete 4 assessment modules according to a module rotation design, for a total of 120 minutes.
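The actual module rotation design is specified by the international contractor and is not detailed here. As a rough illustration only, the idea of a balanced rotation, in which each student receives 4 of the 7 modules and every module appears in the same number of rotations, can be sketched as follows (the module labels and the circular-window scheme are hypothetical):

```python
from collections import Counter

# Hypothetical labels: 3 trend modules, 2 new CIL modules, 2 computational
# thinking modules. The real ICILS design may differ in ordering and pairing.
MODULES = ["T1", "T2", "T3", "N1", "N2", "CT1", "CT2"]

def make_rotations(modules, per_student=4):
    """Build one rotation per module: a window of `per_student` consecutive
    modules in a circular ordering, so each module appears in the same
    number of rotations (here, 4 of 7)."""
    n = len(modules)
    return [[modules[(start + k) % n] for k in range(per_student)]
            for start in range(n)]

def assign(students, rotations):
    """Assign rotations to students round-robin."""
    return {s: rotations[i % len(rotations)] for i, s in enumerate(students)}

rotations = make_rotations(MODULES)
counts = Counter(m for rotation in rotations for m in rotation)
# Each of the 7 modules appears in exactly 4 of the 7 rotations.
```

Any design with this balance property spreads the 7 x 30-minute item pool evenly across students while holding each student to 120 minutes of testing.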

The ICILS assessment will be administered to students on a digital device. The type of device and specific model (a computer, or a tablet with a keyboard and mouse attached) is determined by each country. Based on usability testing by the international contractor, countries were provided specifications and each chose a single device, for both the field test and main study, that meets the requirements and works within their country. The United States will use the Microsoft Surface Pro tablet, with a mouse and keyboard attached, to administer the field test.

Questionnaires

The background questionnaires for ICILS 2018 were developed to address the issues outlined in the ICILS context questionnaire framework. In accordance with international study procedures, the United States will use the international questionnaire, but will adapt some questions to fit the U.S. education context, as appropriate, and will add a few questions, such as student race/ethnicity.

School Questionnaire. The school questionnaire consists of two parts, one of which is completed by the principal and is expected to take 15 minutes, and one of which is intended to be completed by the ICT-coordinator (or school administrator indicated by the principal who is familiar with ICT in the school) and is also expected to take 15 minutes to complete (thus, 30 minutes total). The questionnaire will provide information on computer use, ICT resources, and relevant policies and practices in the school context. The school questionnaire will be offered online, with a paper-and-pencil backup.

Teacher Questionnaire. A teacher questionnaire will be administered to a random sample of 8th-grade teachers in each school and is expected to take about 30 minutes to complete. This questionnaire will provide information on computer use, ICT resources, and relevant policies and practices in the school context. The teacher questionnaire will be offered online, with a paper-and-pencil backup.

Student Questionnaire. The computer-based student questionnaire is expected to take 30 minutes and is completed by each student after the assessment. It gathers information about computer use in and outside of school, attitudes toward technology, self-reported computer proficiency, and background characteristics such as gender and race/ethnicity.

A.3 Improved Information Technology (Reduction of Burden)

The ICILS 2018 design and procedures are prescribed internationally, and data collection involves computer-based student assessments and questionnaires, as well as online or paper-and-pencil questionnaires for schools and teachers. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to accommodate respondent preferences.

A communication website will be used for ICILS 2018 during the field test and main study in order to provide a simple, single source of information to engage schools and maintain high levels of school involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored to their anticipated needs. We plan to gather eighth-grade student and teacher lists from participating schools electronically using a secure electronic filing (e-filing) process, through which schools submit lists of student information, including student background information from school records. Instructions to school coordinators on how to submit student and teacher lists are included in Appendix A. E-filing has been used successfully in NAEP for more than 10 years, and was used in TIMSS 2015 and the PISA 2012 and 2015 assessments. The e-filing system provides advantages such as efficiency and built-in data quality checks.

A.4 Efforts to Identify Duplication

In the United States, the National Assessment of Educational Progress (NAEP) technology and engineering literacy (TEL) assessment was administered to a nationally-representative sample of eighth-grade students in 2014. TEL refers to the capacity to use, understand, and evaluate technology as well as to understand technological principles and strategies needed to develop solutions and achieve goals. NAEP TEL was completely computer-based and included interactive scenario-based tasks. ICILS does not duplicate the NAEP TEL assessment, as the focus on computer and information literacy differs from the focus on engineering literacy.

ICILS 2018 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent the U.S. participation in international studies involving a broad range of countries. As part of international cooperative studies, the United States must collect the same information at the same time as the other nations in order to make valid comparisons both with other countries and with potential future ICILS data collections. While some studies in the United States may collect similar, though not identical, information (e.g., NAEP TEL), the data from those studies cannot be substituted for the information collected in ICILS because they do not allow for comparisons outside the United States. Furthermore, the data collected through ICILS are based on a unique framework that is not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in the other countries. Because the items measuring computer and information literacy have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.

A.5 Minimizing Burden for Small Entities

The school samples for ICILS contain small-, medium-, and large-size schools, including private schools, selected with probability proportional to their size. All school sizes are needed to ensure an appropriate representation of each type of school in the selected sample. Burden will be minimized wherever possible. In addition, national contractor staff will bring laptops or tablets to the schools to conduct the assessment, will conduct all test administrations, and will assist with parental notification, consent forms, sampling, and other tasks as much as possible within each school.
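Selection with probability proportional to size is commonly implemented as systematic PPS sampling: schools are laid end to end by enrollment on a number line, and a random start plus a fixed skip interval picks the sample. The sketch below illustrates that general technique only, with hypothetical school names and enrollment counts; the actual ICILS sampling procedures are described in Supporting Statement Part B.

```python
import random

def pps_systematic_sample(schools, n):
    """Systematic PPS sampling. `schools` is a list of (name, size) pairs;
    a school's chance of selection is proportional to its size. A school
    larger than the skip interval can be selected more than once."""
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.uniform(0, interval)
    # Selection points: start, start + interval, start + 2*interval, ...
    points = iter(start + i * interval for i in range(n))
    selected, cumulative = [], 0.0
    point = next(points)
    for name, size in schools:
        cumulative += size
        # Select this school for every point falling inside its span.
        while point is not None and point <= cumulative:
            selected.append(name)
            point = next(points, None)
    return selected
```

With equal-size schools this reduces to an ordinary systematic sample; with unequal sizes, large schools (which enroll more eighth-graders) are proportionally more likely to be drawn, which is what keeps student-level selection probabilities roughly uniform.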

A.6 Frequency of Data Collection

The IEA has delayed the field test data collection period for all participating countries from the originally planned window of March through May 2017 to the end of May through October 2017. NCES’s preference is to collect data from schools in late May and early June rather than in October in order to have more time to analyze the data, evaluate study operations, and make any needed adjustments before the 2018 main study. The main study data collection will take place as originally scheduled, from February through May 2018. This timeline is prescribed by the international contractor for ICILS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain potential trend lines.

A.7 Special Circumstances

None of the special circumstances identified in the Instructions for Supporting Statement apply to the ICILS study.

A.8 Consultations outside NCES

Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of computer and information literacy and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the projects, usually with advice from national subject-matter and testing experts.

The majority of the consultations (outside NCES) involve the Australian Council for Educational Research (ACER), the international study center for ICILS. ACER staff are responsible for designing and implementing the study in close cooperation with the IEA Secretariat, the IEA Data Processing and Research Center, and the national centers of participating countries. Key staff from ACER include Dr. John Ainley (project coordinator), Mr. Julian Fraillon (research director), and Dr. Wolfram Schulz (assessment coordinator), all of whom have extensive experience in developing and operating international education surveys (especially related to ICILS).

A.9 Payments or Gifts to Respondents

In order to achieve acceptable school response rates in international studies, schools in the U.S. are usually offered $200 to thank them for their participation, the time they invest, and the space they make available for the international assessments. High response rates are required by both the IEA and NCES and are difficult to achieve in school-based studies; the U.S. has historically had difficulty achieving sufficient participation levels. As in other international assessments, such as TIMSS, schools will be offered $200 for their participation in ICILS.

The school staff member serving as School Coordinator will receive $100 for his or her time and effort. The School Coordinator serves a critical role in data collection, functioning as the central school contact and facilitating arrangements for the assessments. Coordinators are asked to file class and student listing forms; arrange the date, time, and space for the assessment; and disseminate information and consent forms to parents and students.

A check will be mailed to each school in the amount of $200, and to each school coordinator in the amount of $100, once the ICILS assessment has been conducted in their schools.

Consistent with other international assessments, as a token of appreciation for their participation, students will receive a small gift valued at approximately $4. In TIMSS 2015, each participating student received a small flashlight that could be clipped to a backpack or belt loop with an attached carabiner. Students will also receive a certificate with their name thanking them for participating in ICILS and representing the United States. Some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor.

Teachers will be offered $20 for completing the ICILS teacher questionnaire to encourage their participation. To avoid sending up to 20 checks to each school for the school coordinator to distribute to teachers who complete the questionnaire, electronic Amazon gift cards in the amount of $20 will be used. Teacher email addresses are not collected prior to the assessment; instead, teacher invitation cards that explain how to access the online teacher questionnaire are distributed to selected teachers by the school coordinator. The card instructs the teacher to email the ICILS Staff Help Desk upon completion of the questionnaire and provide his or her email address. Once completion of the questionnaire is confirmed, the code to access the Amazon electronic gift card will be emailed to the teacher. In this way, teachers will see the direct link between completing the questionnaire and receiving the $20 thank-you token, and will receive the incentive very quickly after survey completion. Amazon gift cards will be used because they have no associated fees, unlike other cash card programs.

Historically, participation is high among school administrators without offering incentives; therefore, no incentive will be offered for completion of the school administrator questionnaire.

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for ICILS to ensure that Westat and its subcontractors comply with all privacy requirements, including:

  1. The statement of work of this contract;

  2. Privacy Act of 1974 (5 U.S.C. §552a);

  3. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. USA PATRIOT Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002;

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the inter-agency agreement for this study.

Furthermore, Westat will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

The laws pertaining to the use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in appendix A):

NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

The following statement will appear on the login page for ICILS and the front cover of the printed questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaire):

The National Center for Education Statistics (NCES), within the U.S. Department of Education, conducts ICILS in the United States as authorized by the Education Sciences Reform Act of 2002 (20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0929. The time required to complete this information collection is estimated to average [XX] minutes per [respondent type], including the time to review instructions [, search existing data resources, gather the data needed,] and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or questions about the status of your individual submission of this form, write directly to: International Computer and Information Literacy Study (ICILS), National Center for Education Statistics, PCP, 550 12th St., SW, 4th floor, Washington, DC 20202.

OMB No. 1850-0929, Approval Expires xx/xx/20yy.

The ICILS confidentiality plan includes the signing of confidentiality agreements and notarized nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. The plan also includes personnel training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured, operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file.

NCES understands the legal and ethical need to protect the privacy of the ICILS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ICILS 2018 data when preparing the data files for use by researchers, in compliance with 20 U.S.C. §9573. Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.

A.11 Sensitive Questions

The questionnaires do not contain items considered to be of a sensitive nature.

A.12 Estimates of Burden

This package shows estimated burden to respondents for all ICILS 2018 activities, and requests approval for burden to respondents for the field test and for recruitment for the main study. Burden estimates are shown in Table A.1.

Table A.1. Burden estimates for ICILS 2018 Field Test and Main Study.

| Data collection instrument | Sample size | Expected response rate | Number of respondents | Number of responses | Minutes per respondent | Total burden hours |
|---|---|---|---|---|---|---|
| Field Test | | | | | | |
| School Administrator Recruitment | 38 | 1.00 | 38 | 38 | 90 | 57 |
| School Coordinator (1 per participating school) | 32 | 1.00 | 32 | 32 | 240 | 128 |
| District IRB Staff Study Approval | 11 | 1.00 | 11 | 11 | 120 | 22 |
| District IRB Panel Study Approval | 66 | 1.00 | 66 | 66 | 60 | 66 |
| Student Directions* | 800 | 0.85 | 680 | 680 | 15 | 170 |
| Student Assessment* | 800 | 0.85 | 680 | 680 | 120 | 1,360 |
| Student Questionnaire | 800 | 0.85 | 680 | 680 | 30 | 340 |
| School Administrator Questionnaire (1 per school) | 38 | 0.84 | 32 | 32 | 30 | 16 |
| Teacher Questionnaire (20 teachers sampled per school) | 640 | 0.85 | 544 | 544 | 30 | 272 |
| Total Burden, Field Test | -- | -- | 1,371 | 1,403 | -- | 901 |
| Main Study | | | | | | |
| School Administrator Recruitment | 353 | 0.85 | 300 | 300 | 90 | 450 |
| School Coordinator | 353 | 0.85 | 300 | 300 | 240 | 1,200 |
| District IRB Staff Study Approval | 40 | 1.00 | 40 | 40 | 120 | 80 |
| District IRB Panel Study Approval | 240 | 1.00 | 240 | 240 | 60 | 240 |
| Student Directions† | 11,250 | 0.85 | 9,562 | 9,562 | 15 | 2,391 |
| Student Assessment† | 11,250 | 0.85 | 9,562 | 9,562 | 120 | 19,124 |
| Student Questionnaire† | 11,250 | 0.85 | 9,562 | 9,562 | 30 | 4,781 |
| School Administrator Questionnaire† | 300 | 1.00 | 300 | 300 | 30 | 150 |
| Teacher Questionnaire (20 teachers sampled per school)† | 6,000 | 0.85 | 5,100 | 5,100 | 30 | 2,550 |
| Total Burden, Main Study School Recruitment | -- | -- | 880 | 880 | -- | 1,970 |
| Total Burden Requested in this Submission | -- | -- | 2,251 | 2,283 | -- | 2,871 |

* Field test cognitive assessment and associated directions; not included in the burden totals.
† Main study data collection; shown for information purposes only and not included in the burden totals requested in this submission.

Note: Total Burden Requested in this Submission includes burden associated with conducting the ICILS 2018 Field Test and the recruitment and pre-assessment activities for the ICILS 2018 Main Study. It does not include the time for the field test cognitive assessment and its associated directions (the field test Student Directions and Student Assessment rows) nor the main study data collection (the main study Student Directions, Student Assessment, Student Questionnaire, School Administrator Questionnaire, and Teacher Questionnaire rows).


Some districts, known as “special handling districts,” require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Based on an initial assessment of previous data collections for similar studies, such as TIMSS, we have estimated the number of special handling districts in the field test and main study samples (shown in Table A.1). Contacting special handling districts begins with updating district information based on what can be gleaned from online sources. Calls are then placed to verify where to send the completed research application forms and, if necessary, to collect contact information for this process. During the call, inquiry is also made about the amount of time the district spends reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy approval processes. This operation should begin in fall 2016 to allow sufficient time for special handling districts’ review processes. We will begin contacting these districts upon receiving OMB’s approval, and continue to work with them until we receive a final response (approval or denial of the request), up until March 31, 2017.

The minimum sample size requirements for the field test are 32 schools and 570 students; for the main study, the target sample sizes for the United States are 300 schools and 9,000 students. The burden table assumes exceeding these minimums and is based on a sample of 800 students in the field test and 11,250 students in the main study. The time required for students to respond to the assessment (cognitive items) portion of the study and the associated directions is not included in the totals because it is not subject to the Paperwork Reduction Act (PRA). Student, administrator, and teacher questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time for school districts that require a research application and approval to review study plans before their schools may be contacted, as well as the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent notification and consent materials, and arranging assessment space. Burden estimates for the main study data collection are also provided in Table A.1 for information purposes.
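The row arithmetic in Table A.1 can be sketched as follows (an illustrative check, not the official computation; the helper name is ours): the number of respondents is the sample size multiplied by the expected response rate, rounded down, and total burden hours are respondents multiplied by minutes per respondent divided by 60, rounded to the nearest hour.

```python
import math

def burden_row(sample_size, response_rate, minutes):
    # Respondents = sample size x expected response rate, rounded down.
    respondents = math.floor(sample_size * response_rate)
    # Burden hours = respondents x minutes / 60, rounded to nearest hour.
    hours = math.floor(respondents * minutes / 60 + 0.5)
    return respondents, hours

# Field test student questionnaire: 800 students, 85% response rate,
# 30 minutes each -> 680 respondents, 340 burden hours.
print(burden_row(800, 0.85, 30))
```

Applied to the main study student directions row (11,250 students, 0.85 response rate, 15 minutes), the same formula yields the tabled 9,562 respondents and 2,391 hours.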

The hourly rates for teachers/instructional staff, noninstructional staff/coordinators, and principals ($28.45, $21.34, and $44.68, respectively) are based on the Bureau of Labor Statistics (BLS) May 2015 National Occupational and Employment Wage Estimates2, assuming 2,080 hours per year. The federal minimum wage of $7.25 is used as the hourly rate for students. For the ICILS field test and recruitment for the main study, a total of 2,871 burden hours are anticipated, resulting in an estimated burden time cost to respondents of approximately $80,142.
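As a rough cross-check, combining the requested burden hours from Table A.1 with the hourly rates above reproduces the stated cost estimate to within rounding. This is a sketch only: the grouping of table rows into rate categories below is an inference (e.g., that district IRB reviewers and school administrators are both costed at the administrator rate), not stated in the document.

```python
# Hourly rates from the BLS May 2015 estimates; students at the federal
# minimum wage.
RATES = {"administrator": 44.68, "coordinator": 21.34,
         "teacher": 28.45, "student": 7.25}

# Requested burden hours from Table A.1 (field test plus main study
# recruitment). The assignment of rows to rate groups is an assumption
# made for illustration.
HOURS = {
    "administrator": 57 + 450 + 16 + 22 + 80 + 66 + 240,  # recruitment, admin questionnaire, IRB
    "coordinator": 128 + 1200,                            # school coordinators
    "teacher": 272,                                       # teacher questionnaire
    "student": 340,                                       # student questionnaire
}

total_hours = sum(HOURS.values())                      # 2,871 hours
total_cost = sum(HOURS[g] * RATES[g] for g in RATES)   # ~$80,140
```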

A.13 Total Annual Cost Burden

No cost to respondents is anticipated beyond the estimated burden cost described in Section A.12.

A.14 Annualized Cost to Federal Government

The cost to the federal government for conducting the initial phases of ICILS 2018, all field test operations (preparation, recruitment, field test data collection, and field test scoring), and preparation for the main study (including the sampling plan, preparation of assessment instruments, recruitment, and preparation for data collection and scoring) is estimated at $2,749,100 over a 2-year period. This figure includes all direct and indirect costs and is based on estimates for the initial phases of the implementation of ICILS, which is included in a national data collection contract, in conjunction with the transition of TIMSS to an electronic format, covering April 2016 to February 2018.

A.15 Program Changes or Adjustments

This is a new data collection effort by the United States, and as such all burden is shown as new.

A.16 Plans for Tabulation and Publication

The ICILS field test is designed to provide a statistical review of the performance of items on the cognitive assessment and questionnaires in preparation for the main study data collection.

Based on the data collected in the main study, ACER will prepare an international report to be released in November 2019. As has been customary, NCES will also release a report at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics. There are currently no plans to conduct complex statistical analyses of either dataset. An example of a similar report on another international assessment can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013010rev. In the spring of 2020, ACER will also release the international technical report, describing the design and development of the assessment as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User’s Guide for the study.

Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of ICILS 2018 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are provided in Table A.2.

Table A.2. Schedule of Activities for ICILS 2018 Field Test and Main Study.

| Dates | Activity |
|---|---|
| April 2016–December 2016 | Prepare data collection manuals, forms, assessment materials, and questionnaires |
| October 2016–December 2016 | Contact and gain cooperation of states, districts, and schools for field test |
| December 2016–March 2017 | Select student samples |
| May 2017–October 2017 | Collect field test data |
| July 2017–October 2017 | Deliver raw data to international sponsoring organization |
| August 2017–November 2017 | Review field test results |
| March 2017–February 2018 | Prepare for the main study / recruit schools |
| February 2018–May 2018 | Collect main study data |
| June 2019 | Receive final data files from international sponsors |
| June 2019–November 2019 | Produce report |



A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all data collection materials.

A.18 Exceptions to Certification Statement

No exceptions to the certifications are requested.

1 The materials that will be used in the 2018 main study will be based upon the field test materials included in this submission. Additionally, this submission is designed to adequately justify the need for and overall practical utility of the full study and to present the overarching plan for all phases of the data collection, providing as much detail about the measures to be used as is available at the time of this submission. As part of this submission, NCES is publishing a notice in the Federal Register allowing first a 60- and then a 30-day public comment period. For the final proposal for the main study, after the field test, NCES will publish a notice in the Federal Register allowing an additional 30-day public comment period on the final details of the 2018 main study.

2 The average hourly earnings of teachers/instructional staff in the May 2015 National Occupational and Employment Wage Estimates sponsored by the Bureau of Labor Statistics (BLS) is $28.45 (an average hourly rate of Elementary and Middle School Teachers, $27.91, and Secondary School Teachers, $28.98), of noninstructional staff is $21.34, and of principals/education administrators is $44.68. If mean hourly wage was not provided it was computed assuming 2,080 hours per year. The exception is student wage, which is based on the federal minimum wage. Source: BLS Occupation Employment Statistics, http://data.bls.gov/oes/ data type: Occupation codes: Elementary and Middle School Teachers (25-2020) and Secondary School Teachers (25-2030); Education, Training, and Library Workers, All Other (Elementary and Secondary Schools) (25-9099); and Education Administrators, Elementary and Secondary Schools (11-9032); accessed on April 5, 2016.

