Fast Response Survey System (FRSS) 109: Teachers’ Use of Technology for School and Homework Assignments


Supporting Statement Part A





OMB # 1850-0857 v.5













National Center for Education Statistics (NCES)

U.S. Department of Education

Institute of Education Sciences

Washington, DC





April 2018

Revised May 2018


A.1. Justification

The Fast Response Survey System (FRSS) 109 survey on teachers’ use of technology for school and homework assignments in public schools is conducted by the National Center for Education Statistics (NCES) as part of the IES response to the request in the Every Student Succeeds Act of 2015 (ESSA, 20 U.S.C. §6301 et seq.) to provide information about the educational impact of access to digital learning resources (DLRs) outside of the classroom.

The expanding use of technology affects the lives of students both inside and outside the classroom. For this reason, the role of technology in education is an increasingly important area of research. While access to technology can provide valuable learning opportunities to students, technology by itself does not guarantee successful outcomes. Schools and teachers play an important role in successfully integrating technology into teaching and learning. Findings from the FRSS 109 study will provide insight on the types and availability of DLRs outside of the classroom, and will contribute to IES reports on the educational impact of access to DLRs outside the classroom.

NCES is authorized to conduct FRSS by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). NCES has contracted with Westat to conduct all stages of the survey. The request for FRSS 109 preliminary activities, including securing research approval from special contact school districts beginning in April 2018, notifying superintendents of districts with sampled schools about the survey, and obtaining teacher lists from sampled schools beginning in August 2018, was approved in March 2018, with the latest change request approved in April 2018 (1850-0857 v.2-4). This request is to conduct the FRSS 109 data collection. Because the approved FRSS 109 preliminary activities are expected to continue after this request for data collection is approved, the approved procedures, materials, and burden for the preliminary activities are being carried over in this submission.

Overview of Data Collection

The first-stage school sample of 2,000 regular public schools was selected from the 2015-16 CCD Public School Universe file in March 2018. Any special requirements that school districts have for approval of research in schools within their jurisdiction will be met before schools are contacted. In addition, the superintendent of each district with sampled schools will be sent a letter informing them about the survey (see Appendix A-1). This letter will be sent to all districts with sampled schools except for the special contact districts, because those are already being notified through the research application process per their specific requirements.

The initial step of data collection will be to collect lists of eligible teachers from the sampled schools. These lists will provide the sampling frame for selecting the teacher sample. In early fall 2018, the principal of each sampled school will be asked to prepare a list of eligible teachers according to written instructions (see Appendix B). Telephone follow-up will begin 3 weeks after requests for teacher lists have been mailed to principals. Experienced telephone interviewers will be trained to conduct the prompt calls to encourage principals or their designees to submit a list of teachers.

The second stage of data collection will be to administer a teacher questionnaire to the sampled teachers. Respondents will have the option of completing either a paper or online version of the questionnaire. The questionnaire is limited to 3 pages of information readily available to respondents and can be completed by most teachers in 15 minutes (see Appendix C). These procedures are typical for FRSS surveys and result in minimal burden to respondents. Questionnaires will be mailed to teachers starting in late fall 2018. The cover letter will include information about the option to complete the online version of the survey. Telephone follow-up for nonresponse (see Appendix A-2) will begin about 3 weeks after the questionnaires have been mailed to the teachers. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat’s supervisory personnel during all interviewing hours.

A.2. Purposes and Uses of the Data

FRSS 109 will collect nationally representative data from public school teachers about their use of DLRs for teaching, and how their knowledge and beliefs about their students’ access to DLRs outside the classroom affect the assignments they give. FRSS 109 will provide valuable national data to IES to address the request in ESSA 2015. ESSA provides guidance to state governments on how to receive supplemental federal funding for public education. As part of the ESSA legislation, IES is required to produce a report on the educational impact of access to DLRs outside of the classroom. Specifically, ESSA requests that IES conduct research in the following five areas:

  1. An analysis of student habits related to digital learning resources outside of the classroom, including the location and types of devices and technologies that students use for educational purposes.

  2. An identification of the barriers students face in accessing DLRs outside of the classroom.

  3. A description of the challenges that students who lack home internet access face, including challenges related to student participation and engagement in the classroom and homework completion.

  4. An analysis of how the barriers and challenges such students face impact the instructional practice of educators.

  5. A description of the ways in which state education agencies, local education agencies, schools, and other entities, including partnerships of such entities, have developed effective means to address the barriers and challenges students face in accessing DLRs outside of the classroom.

ESSA defines “digital learning” as “any instructional practice that effectively uses technology to strengthen a student’s learning experience and encompasses a wide spectrum of tools and practices” (20 U.S.C. §7112 Definitions). However, for this survey, the main focus of digital learning resources will be computers and internet access.

To provide the needed data, FRSS 109 will collect nationally representative data from public school teachers about their use of computers and the internet for school and homework assignments, and how their knowledge and beliefs about their students’ access to computers and the internet outside the classroom affect the assignments they give. The survey will focus on information that teachers are best positioned to provide, based on their perspective and direct interaction with students.

A.3. Improved Information Technology

To speed up the teacher sampling, schools will have the option of submitting the list of teachers by email, secure fax server, or mail. When schools request replacement materials during nonresponse follow-up, the materials will be sent by email or fax server.

The survey will use a mixed-mode data collection approach, with teachers given the option of completing an online or paper version of the survey. When paper versions of the questionnaire are used, they will be transmitted to and from respondents by email or a secure fax server whenever possible. An email address to which respondents can direct questions will be included on the front of the questionnaire and in the materials for preparing the teacher list. These procedures are all designed to minimize burden on respondents.

A.4. Efforts to Identify Duplication

NCES is conducting FRSS 109 as part of the IES response to the request in ESSA 2015 to provide information about and produce a report on the educational impact of access to DLRs outside of the classroom. NCES is developing a report on this issue that analyzes existing research and identifies the needs for additional research. One area where additional research is needed is how teachers’ perceptions of student access to technology and the internet at home affect their instructional practices. The purpose of FRSS 109 is to provide this missing information by collecting nationally representative data from public school teachers about their use of technology for school and homework assignments, and how their knowledge and beliefs about their students’ access to DLRs outside the classroom affect the assignments they give. There is no existing national source of this information.

A.5. Methods Used to Minimize Burden on Small Entities

Under the sampling strategy described in Supporting Statement Part B, smaller schools were sampled at lower rates than larger schools. To minimize the burden on schools for the teacher list collection, respondents have the option of using existing staff lists, editing them as needed to eliminate ineligible teachers. We will accept lists in all formats and assist respondents by telephone and email. Teachers will be sampled from lists collected from sampled public schools of all sizes. The burden on teachers is minimized by keeping the questionnaire short (three pages) and restricting questions to generally available information, giving respondents the option of completing an online version of the questionnaire, conducting follow-up for nonresponse and data clarification by telephone, and transmitting paper versions of the questionnaire by email or fax whenever requested.

A.6. Consequences for Not Collecting the Information

If these data are not collected, NCES and IES will not be able to respond to the Congressional request to provide current teacher-level data about how teachers use DLRs for teaching, and how their knowledge and beliefs about their students’ access to DLRs outside the classroom affect the assignments they give.

A.7. Adherence to the Guidelines in 5 CFR 1320.5

The data collection will be conducted in a manner consistent with the guidelines in 5 CFR 1320.5. The only exception is that responses will be requested in fewer than 30 days, following the well-established procedures for NCES quick-response surveys such as FRSS, which are intended to collect data quickly.

A.8. Consultations Outside of the Agency

Development work is being conducted under the NCES developmental work generic clearance (OMB# 1850-0803). To date, the NCES Quality Review Board (QRB) members reviewed a draft list of questionnaire and discussion topics prior to submission of the request to OMB for feasibility calls (OMB# 1850-0803 v.202). Revisions were made to the list of topics based on input from the reviewers, and the list was used to develop an interview guide for the feasibility calls. As rounds of feasibility calls progressed, draft questionnaire items and then a draft questionnaire were developed. Following the last round of feasibility calls, the QRB members reviewed the draft questionnaire, and revisions were made based on their input. The revised version was approved by OMB in March 2018 for pretesting (OMB# 1850-0803 v.226). The survey pretest was conducted in April and May 2018, using the draft questionnaire. The results of the pretesting were used to develop the final version of the FRSS 109 questionnaire (Appendix C).

In addition to staff from NCES’ Statistical Standards group, the Annual Reports group, and each of the three Divisions, the QRB also included staff from ED’s Office of Educational Technology (OET) and the Policy and Program Studies Service of the Office of Planning, Evaluation, and Policy Development (OPEPD); the U.S. Commerce Department’s National Telecommunications and Information Administration; and the IBM Center for The Business of Government. The QRB members for this survey are listed below:


Rafi Goldberg, National Telecommunications and Information Administration, Commerce

Bernadette Adams, Office of Educational Technology

Andrew Abrams, OPEPD (Policy and Program Studies Service)

Dan Chenok, the IBM Center for The Business of Government

Halima Adenegan, NCES (Assessment Division)

Jamie Deaton, NCES (Assessment Division, NAEP)

John Ralph, NCES (Annual Reports and Information)

Tom Snyder, NCES (Annual Reports and Information)

Mark Glander, NCES (Administrative Records Division, CCD)

Chris Chapman, NCES (Sample Surveys Division, Longitudinal Branch)

Maura Spiegelman, NCES (Sample Surveys Division, Cross-sectional Surveys Branch)

Marilyn Seastrom, NCES (Statistical Standards and Data Confidentiality)

Kashka Kubzdela, NCES (Statistical Standards and Data Confidentiality)

A.9. Provisions for Payments or Gifts to Respondents

No payments or gifts will be made to respondents. However, some school districts charge a fee to process research application requests; these fees will be paid as necessary.

A.10. Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for FRSS to ensure that the FRSS contractor and any subcontractors, when applicable, comply with all privacy requirements, including:

  1. The statement of work of this contract;

  2. Family Educational Rights and Privacy Act of 1974 (FERPA) (20 U.S.C. §1232g);

  3. Privacy Act of 1974 (5 U.S.C. §552a);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002;

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the contract for this study.

Furthermore, the contractor will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

Data to be collected will not be released with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Each respondent will be assured that information identifying them or their school or agency will not be published in reports, and both the cover letter accompanying the questionnaire and the header on the questionnaire will state: “All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).”

All Westat staff members working on the study are required to sign the NCES Affidavit of Nondisclosure and Westat’s confidentiality pledge. Access to data processing and storage facilities is restricted to authorized personnel at all times, and access to the web versions of the questionnaire is ID- and password-protected using a Secure Sockets Layer (SSL) protocol. In addition, the network administrator for the study has established additional security features for the study. The study systems, including the web versions of the questionnaires, have passed ED’s extensive Certification and Accreditation security process.

A.11. Sensitive Questions

There are no questions of a sensitive nature included in the survey.

A.12. Estimates of Response Burden

The approved FRSS 109 preliminary activities include: (a) contacting and seeking research approvals from public school districts with an established research approval process (“special contact districts”), (b) notifying the superintendent of districts with sampled schools about the study, and (c) notifying sampled schools of their selection for the survey and requesting a teacher list from them. The approved estimated respondent burden hours and cost for these activities are shown in gray font in Exhibit 1.

The special contact districts are those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Contacting special contact districts begins with updating district information based on what can be gleaned from online sources and what is known from other NCES data collections. Individual districts will be contacted as needed to fill in gaps about where and to whom to send the completed required research application forms. This operation will begin in April 2018 to allow as much time as possible for special contact districts’ review processes, and will continue until we receive a final response (approval or denial of the request), for as long as there is sufficient time remaining for sampled schools and teachers to respond to FRSS 109. Any special requirements that districts have for approval of surveys will be met before schools in those districts are contacted. Each special contact district has unique requirements for obtaining approval. For teacher surveys, we estimate needing to work with approximately 200 special contact districts. The materials sent to special contact districts will be tailored to meet the specific requirements of each district, based on the district recruitment materials provided in Appendix A-1. The respondent burden for special contact districts is estimated to be approximately 2 hours per district for IRB review by one staff member, plus 60 minutes per member for district IRB panel review, assuming an average panel of six members.
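
A minimal Python sketch of the district-level burden arithmetic described above (the figures come from this paragraph and from Exhibit 1; the variable names are illustrative, not part of any FRSS system):

# Sketch: special contact district review burden, per the estimates above.
NUM_DISTRICTS = 200          # estimated number of special contact districts
STAFF_REVIEW_HOURS = 2       # IRB review by one staff member per district
PANEL_MEMBERS = 6            # assumed average IRB panel size
PANEL_HOURS_PER_MEMBER = 1   # 60 minutes per panel member

staff_hours = NUM_DISTRICTS * STAFF_REVIEW_HOURS                       # 400 hours
panel_hours = NUM_DISTRICTS * PANEL_MEMBERS * PANEL_HOURS_PER_MEMBER   # 1,200 hours
print(staff_hours, panel_hours)                                        # 400 1200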

The notification letter to district superintendents is not included in the burden estimates because the superintendents are not asked to take any action upon receiving the letter.

Principals of sampled schools will be asked to provide lists of eligible teachers to be used for sampling (see Appendix B). This is estimated to take about 20 minutes per school. Based on recent experience, we estimate that principals from about 85 percent of sampled schools will provide the teacher lists.

In this submission, we are requesting approval to conduct the teacher survey data collection. Sampled teachers will be asked to respond to the survey (see Appendix C), which is estimated to take about 15 minutes per sampled teacher. We estimate that about 85 percent of sampled teachers will respond to the survey. Telephone nonresponse follow-up is estimated to take about 5 minutes per call and is used to prompt sampled teachers who have not yet responded to complete the survey (see Appendix A-2). We estimate that about 75 percent of the sampled teachers will receive a nonresponse follow-up call. Information about estimated respondent burden hours and cost for teacher survey data collection is summarized in the bottom rows of Exhibit 1.
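
A short Python sketch of the school- and teacher-level burden hours described in the two paragraphs above (per-response hours use the rounded values shown in Exhibit 1; the names are illustrative):

# Sketch: teacher list and teacher survey burden hours, per the estimates above.
NUM_SCHOOLS = 2000
LIST_RESPONSE_RATE = 0.85
LIST_HOURS_PER_SCHOOL = 0.33      # about 20 minutes per school

NUM_TEACHERS = 4000
SURVEY_RESPONSE_RATE = 0.85
SURVEY_HOURS_PER_TEACHER = 0.25   # about 15 minutes per teacher
FOLLOWUP_RATE = 0.75
FOLLOWUP_HOURS_PER_CALL = 0.08    # about 5 minutes per call

list_hours = NUM_SCHOOLS * LIST_RESPONSE_RATE * LIST_HOURS_PER_SCHOOL           # 561 hours
survey_hours = NUM_TEACHERS * SURVEY_RESPONSE_RATE * SURVEY_HOURS_PER_TEACHER   # 850 hours
followup_hours = NUM_TEACHERS * FOLLOWUP_RATE * FOLLOWUP_HOURS_PER_CALL         # 240 hours
print(round(list_hours), round(survey_hours), round(followup_hours))            # 561 850 240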

The estimated average hourly earnings are $46.85 for elementary and secondary school administrators and $29.72 for school teachers.1 Therefore, based on 3,251 total burden hours for FRSS 109 preliminary activities and teacher survey data collection activities, the associated total estimated burden time cost to respondents for FRSS 109 is $133,638.

Exhibit 1. Estimated burden for FRSS 109 preliminary activities and teacher survey data collection

Type of collection | Sample size | Estimated response rate | Estimated number of respondents | Estimated number of responses | Burden hours per response | Total respondent burden hours | Respondent burden time cost*
Special contact district IRB Staff Review | 200 | 100% | 200 | 200 | 2 | 400 | $18,740
Special contact district IRB Panel Review | 200 × 6 | 100% | 1,200 | 1,200 | 1 | 1,200 | $56,220
Teacher lists collected from school principals | 2,000 | 85% | 1,700 | 1,700 | 0.33 | 561 | $26,283
Preliminary Activities Subtotal** | -- | -- | 3,100 | 3,100 | -- | 2,161 | $101,243
Teacher questionnaire | 4,000 | 85% | 3,400 | 3,400 | 0.25 | 850 | $25,262
Nonresponse follow-up call | 4,000 | 75% | 3,000 | 3,000 | 0.08 | 240 | $7,133
Teacher Data Collection Subtotal | -- | -- | 3,400 | 6,400 | -- | 1,090 | $32,395
TOTAL | -- | -- | 6,500 | 9,500 | -- | 3,251 | $133,638

*Cost per hour for preliminary activities is $46.85, and for teacher data collection is $29.72.

**Approved in March 2018 and being carried over in this submission.
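
The dollar amounts in Exhibit 1 can be reproduced from the burden hours and the hourly rates in footnote 1 (a Python sketch under the stated assumptions; each row’s cost is rounded to the nearest dollar before the rows are summed):

# Sketch: respondent time cost, per Exhibit 1 and footnote 1.
ADMIN_RATE = 46.85     # hourly rate applied to preliminary activities
TEACHER_RATE = 29.72   # hourly rate applied to teacher data collection

preliminary_hours = [400, 1200, 561]   # IRB staff review, IRB panel review, teacher lists
teacher_hours = [850, 240]             # teacher questionnaire, nonresponse follow-up call

preliminary_cost = sum(round(h * ADMIN_RATE) for h in preliminary_hours)   # $101,243
teacher_cost = sum(round(h * TEACHER_RATE) for h in teacher_hours)         # $32,395
total_hours = sum(preliminary_hours) + sum(teacher_hours)                  # 3,251 hours
total_cost = preliminary_cost + teacher_cost                               # $133,638
print(total_hours, preliminary_cost, teacher_cost, total_cost)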


A.13. Estimates of Cost Burden for Collection of Information

There are no costs to respondents beyond their time to participate. No equipment, printing, or postage charges will be incurred by the participants.

A.14. Estimates of Cost to the Federal Government

The estimated cost to the federal government for FRSS 109 is $1,495,276. The estimated cost to the federal government for the survey development work cleared under OMB# 1850-0803 is $30,000. The estimated cost to the federal government for the preliminary activities is $400,000, and for the remaining activities (survey data collection, analysis, reporting, data files, and project management) it is $1,065,276.

A.15. Changes in Burden

The apparent increase in the estimated respondent burden in this request reflects the fact that the previous approval covered only the FRSS 109 preliminary activities, while this request covers both the preliminary activities and the data collection.

A.16. Publications Plans and Time Schedule

An NCES report containing survey results will be published on the NCES website. In addition, public use and restricted use data files will be made available to researchers. The analyses of the questionnaire data will be descriptive in nature and will provide data users with tables and appropriate explanatory text. Tabulations will be produced for each data item. Cross-tabulations of data items will be made with selected classification variables such as school enrollment size, community type (locale), geographic region, and poverty level. The First Look report will be released on the NCES website in spring 2020.

Submission of district research applications for special contact districts will begin in April 2018. Collection of teacher sampling lists will begin in August 2018 and continue through February 2019. Collection of teacher questionnaires will be conducted from November 2018 through June 2019.

A.17. Approval to Not Display Expiration Date

No exemption from the requirement to display the expiration date for OMB approval of the information collection is being requested for FRSS 109.

A.18. Exceptions to the Certification Statement

No exceptions to the certification statement apply to FRSS 109.


1 The time cost to respondents is based on the mean hourly earnings of elementary and secondary school administrators and school teachers as reported in the May 2017 Bureau of Labor Statistics (BLS) Occupational Employment Statistics. The mean hourly wage was computed assuming 2,080 work hours per year. Source: BLS Occupational Employment Statistics, https://www.bls.gov/oes/current/oes_nat.htm#25-0000, Occupation codes: Education Administrators, Elementary and Secondary School (11-9032), Elementary and Middle School Teachers (25-2020), and Secondary School Teachers (25-2030), accessed on April 22, 2018.
