
Volume I:





Fast Response Survey System (FRSS) 110: Use of Educational Technology for Instruction in Public Schools





OMB# 1850-0733 v. 36















October 2019


National Center for Education Statistics (NCES)

U.S. Department of Education






Justification

The National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for the Quick Response Information System (QRIS) (OMB# 1850-0733) to conduct data collection for the Fast Response Survey System (FRSS) school survey #110 on use of educational technology for instruction in public schools. ED’s Office of Educational Technology requested that NCES conduct this FRSS survey.

The expanding use of technology affects the lives of students both inside and outside the classroom. For this reason, the role of technology in education is an increasingly important area of research. While access to technology can provide valuable learning opportunities to students, technology by itself does not guarantee successful outcomes. Schools and teachers play an important role in successfully integrating technology into teaching and learning. The purpose of this FRSS 110 survey is to collect nationally representative data from public schools about their use of educational technology for instruction.

NCES is authorized to conduct FRSS by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). NCES has contracted with Westat to carry out all stages of the survey, including preliminary activities and data collection. This request is to conduct the full-scale survey data collection, beginning in January 2020. The request to conduct the FRSS 110 preliminary activities, which involved securing research approval from special contact school districts, was approved by OMB in May 2019 under the OMB generic clearance for quick response surveys (OMB# 1850-0733 v.35).

Design

Overview of Survey Development

FRSS has established procedures for developing short surveys on a wide variety of topics. The techniques used to shape the survey design for FRSS 110 include input from the NCES Quality Review Board (QRB), several rounds of feasibility calls, and a pretest.

The current survey reflects input from the QRB, with modifications based on three rounds of feasibility calls (OMB# 1850-0803 v. 244) and a pretest (OMB# 1850-0803 v. 254) with the public school personnel most knowledgeable about the use of instructional technology in their schools. The reports of the survey development results are provided in Attachment 1. The first round of feasibility calls was conducted in February 2019 with respondents from 14 schools. Because this is a new survey, the first round of calls used an open-ended interview guide rather than a questionnaire. The second round of feasibility calls was conducted in April and May 2019 with respondents from 13 schools, who provided feedback on draft survey questions. The third round of feasibility calls was conducted in June 2019 with 13 respondents, who reviewed draft survey questions, instructions, and definitions revised based on the initial rounds of calls. The resulting draft questionnaire was then reviewed by the NCES QRB and revised accordingly to prepare it for the pretest.

Pretest calls with respondents from 8 schools were conducted in September and October 2019. For the pretest, respondents were asked to complete the questionnaire and return it to Westat, and then participate in a telephone debriefing with Westat to provide feedback on the questionnaire. The purpose of the pretest was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily accessible to respondents, and to determine whether the burden on respondents could be further reduced. Changes to the questionnaire were made based on the feedback received during the pretest. The revised questionnaire (Attachment 2) is being submitted with this request for OMB clearance.

Overview of Survey Collection

Approximately one week before the start of survey data collection, we will send a letter to the superintendent of each district with sampled schools informing them about the data collection (see the communication materials in Attachment 3). Then, in early January, we will send the principals of the sampled schools a survey package containing a paper copy of the questionnaire (which identifies the sampled school with a label on the front page, above the respondent information section), a “Dear Principal” cover letter, a web survey information sheet, and a business reply envelope. The letter and questionnaire will request that the survey be completed by the principal or the person most knowledgeable about the use of educational technology for instruction at the school. On the front of the survey and in the cover letter, respondents are assured that their participation is voluntary and that their answers may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). All respondents will have the option of completing the survey on paper or online.

We will conduct follow-up for nonresponse and data clarification using a combination of mail, email, and telephone contacts. Approximately three weeks after the start of data collection, we will begin nonresponse follow-up by telephone, using the Respondent Information Sheet to guide the calls (see the communication materials in Attachment 3). If a principal indicates that he or she did not receive or no longer has the survey package, we will send another survey package by either mail or email, using the appropriate version of the “Dear Principal” cover letter in the survey package. Late in the nonresponse follow-up period, we may send reminder packages using either the mail or email version (as appropriate) of the “Dear Survey Participant” letter along with the questionnaire, and may also include the “Make Your Voice Heard” survey overview to quickly highlight the importance of the survey and the low burden to respond.

Research Questions and Questionnaire

The purpose of the FRSS survey is to collect information from public schools about their use of educational technology for instruction, with a focus on equity and access. The cover of the questionnaire indicates that the survey is designed to be completed by the principal or the person most knowledgeable about the use of educational technology for instruction at the school indicated on the front of the survey. The cover also includes a definition of computers to be used by respondents while completing the questionnaire. To help ensure that respondents read this important definition, it is repeated in a box above question 1, and called out in questions 1 and 4.

The questionnaire collects information to address the following research questions.

  • What is the availability and location of computers for student instructional use?

  • Are there school-provided computers assigned to individual students, and are students allowed to take these computers home?

  • What is the quality of the instructional computers and software at the school, and to what extent do the computers meet the school’s instructional needs?

  • How easy is it for teachers at the school to find enough computers to use with their students?

  • How reliable is the Internet connection in the instructional areas of the school?

  • How much flexibility do school-level leaders have in determining the types and amount of educational technology purchased for the school?

  • How much flexibility do school-level leaders have in determining the types and amount of professional development in educational technology provided for the school?

  • Does the school allow students to borrow computers to take home on a short-term basis?

  • Does the school provide mobile hotspots or web-enabled devices with paid data plans for students to take home for Internet access?

  • To what extent are various types of online resources used for instruction at the school?

  • For what types of classroom activities do teachers at the school use educational technology?

  • What type of professional development in educational technology is provided to teachers at the school?

  • What types of staff work with the teachers at the school to integrate educational technology into instruction?

  • How is student learning affected by the use of educational technology in the instructional program at the school?

  • What challenges are faced by teachers in the school in using educational technology for instruction?

NCES Review and Consultations Outside of Agency

The NCES QRB members reviewed a draft list of questionnaire and discussion topics prior to the request for the feasibility calls (OMB#1850-0803 v.244). Revisions were made to the list of topics based on input from the reviewers, and the list was used to develop an interview guide for the feasibility calls. As rounds of feasibility calls progressed, draft questionnaire items and then a draft questionnaire were developed. Following the last round of feasibility calls, the QRB members reviewed the draft questionnaire, and revisions were made based on their input. The revised version was used for the pretest (OMB#1850-0803 v.254). In addition to staff from NCES’s Statistical Standards group, the Annual Reports group, and the Administrative Records and Sample Surveys Divisions, the QRB also includes staff from ED’s Office of Educational Technology (OET) and four education technology organizations. The QRB members for this survey are listed below:


Bernadette Adams, Office of Educational Technology

Jake Steele, Office of Educational Technology

Tom Snyder, NCES (Annual Reports and Information)

Ross Santy, NCES (Administrative Records Division)

Chris Chapman, NCES (Sample Surveys Division)

Kashka Kubzdela, NCES (Statistical Standards and Data Confidentiality)

Maria Worthen, iNACOL

Christina Luke, Digital Promise

Susan Bearden, CoSN

Ji Soo Song, ISTE

Assurance of Confidentiality

Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file will undergo extensive disclosure risk analysis and will be reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured on the questionnaire and in the cover letter that their participation in the survey is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

The statements below will appear on the front page of the questionnaire (paper and web versions).

NCES is authorized to conduct this study by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). While participation in this survey is voluntary, your cooperation is critical to make the results of this survey comprehensive, accurate, and timely. All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 1850–0733. The time required to complete this information collection is estimated to average 30 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this form, or any comments or concerns regarding the status of your individual submission of this form, please write directly to: Quick Response Information System (QRIS), National Center for Education Statistics (NCES), PCP, 550 12th Street, SW, 4th floor, Washington, DC 20202.

Description of Sample and Burden

The sample design is a nationally representative sample of approximately 1,300 public schools from the 2016–17 NCES Common Core of Data (CCD) Public School Universe File. The questionnaire is limited to three pages of items readily available to respondents and can be completed by most respondents in about 30 minutes. Any special requirements that districts have for approval of surveys will have been met before those districts are contacted. The request for special clearance district activities was approved by OMB in May 2019 (OMB# 1850-0733 v.35). Information about estimated respondent burden and response time cost for FRSS 110 survey data collection is provided in Table 1 (respondents are counted only once).

Table 1. Estimated burden for FRSS 110 survey data collection

| Type of collection | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Burden hours per respondent | Total response burden hours | Response burden time cost (@ $47.48 per hour) |
|---|---|---|---|---|---|---|---|
| Initial school contact | 1,300 | 100% | 1,300 | 1,300 | 0.083 | 108 | $5,128 |
| Questionnaire | 1,300 | 85% | 1,105 | 1,105 | 0.50 | 553 | $26,257 |
| Nonresponse follow-up call to school | 1,300 | 75% | 975 | 975 | 0.083 | 81 | $3,846 |
| Total | -- | -- | 1,300 | 3,380 | -- | 742 | $35,231 |



The estimated average hourly earnings of elementary and secondary school administrators are $47.48.¹ Therefore, based on 742 total burden hours for FRSS 110 survey data collection, the associated total estimated burden time cost to respondents is $35,231.
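For reference, the arithmetic behind Table 1 can be reproduced with a short script. The figures below are taken directly from the table and footnote 1; small differences of a dollar or an hour can arise depending on where rounding is applied.

```python
import math

# A minimal sketch reproducing the burden arithmetic in Table 1.
# Respondent counts, hours per respondent, and the hourly rate come from
# the table and footnote 1; the data structure is purely illustrative.

HOURLY_RATE = 47.48  # BLS mean hourly wage, school administrators

rows = [
    # (type of collection, estimated respondents, burden hours per respondent)
    ("Initial school contact", 1300, 0.083),
    ("Questionnaire", 1105, 0.50),
    ("Nonresponse follow-up call to school", 975, 0.083),
]

total_hours = 0
for name, respondents, hours_each in rows:
    burden_hours = math.floor(respondents * hours_each + 0.5)  # round half up
    total_hours += burden_hours
    print(f"{name}: {burden_hours} hours, ${burden_hours * HOURLY_RATE:,.0f}")

# Approximately 108 + 553 + 81 = 742 hours, i.e., roughly $35,230 at $47.48
# per hour (Table 1 shows $35,231 because of rounding in the source table).
print(f"Total: {total_hours} hours, ${total_hours * HOURLY_RATE:,.0f}")
```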

Survey Cost and Time Schedule

The entire survey is estimated to cost the federal government about $850,000, including about $800,000 for contractual costs and $50,000 for salaries and expenses. Contractual costs include the costs for survey preparation, preliminary activities, data collection, data analysis, and report preparation. The main survey data collection from schools will begin in January 2020 and is scheduled to end in June 2020.

Plan for Tabulation and Publication

The First Look report will be released on the NCES website in summer 2021 and will include explanatory text and tables. Participating schools will be notified when NCES releases the report. A public use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Cross-tabulations of data items will be made with selected classification variables, such as school level, enrollment size, community type (locale), geographic region, and category for percent of students eligible for free/reduced price lunch.

Statistical Methodology

This statistical methodology section was submitted in the request to conduct the FRSS 110 preliminary activities, which was approved by OMB in May 2019 under the OMB generic clearance for quick response surveys (OMB#1850-0733). It is being carried over in this request and remains unchanged from the approved version.

Reviewing Statisticians

Christopher Chapman, of NCES, is the Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design.

Respondent Universe

FRSS 110 will collect data from a nationally representative sample of public schools. Schools meeting the following conditions are in scope for the educational technology survey:

  • The school provides instruction in any of the grades 1 through 12.

  • The school is located within the 50 States or the District of Columbia.

  • The school does not have zero or missing enrollment.

  • The school is a regular school (excluding DoDEA and BIE schools, and schools that are fully or primarily virtual).
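To illustrate how these conditions might translate into a frame-construction step, here is a minimal sketch. It assumes the CCD universe file has been loaded into a pandas DataFrame with hypothetical column names; the actual CCD field names and codings differ.

```python
import pandas as pd

# Hypothetical column names (the real 2016-17 CCD fields differ):
#   low_grade, high_grade        - numeric grade span offered (e.g., KG coded 0)
#   state                        - two-letter postal code
#   enrollment                   - total enrollment (NaN if missing)
#   school_type                  - "regular", "special ed", etc.
#   is_virtual, is_dodea, is_bie - boolean exclusion flags

# Two-letter codes for the 50 states plus the District of Columbia.
STATES_PLUS_DC = {
    "AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DE", "FL", "GA",
    "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME", "MD",
    "MA", "MI", "MN", "MS", "MO", "MT", "NE", "NV", "NH", "NJ",
    "NM", "NY", "NC", "ND", "OH", "OK", "OR", "PA", "RI", "SC",
    "SD", "TN", "TX", "UT", "VT", "VA", "WA", "WV", "WI", "WY",
    "DC",
}

def build_frame(ccd: pd.DataFrame) -> pd.DataFrame:
    """Apply the four in-scope conditions listed above."""
    keep = (
        (ccd["high_grade"] >= 1) & (ccd["low_grade"] <= 12)       # any of grades 1-12
        & ccd["state"].isin(STATES_PLUS_DC)                       # 50 states or DC
        & ccd["enrollment"].fillna(0).gt(0)                       # nonzero, nonmissing enrollment
        & (ccd["school_type"] == "regular")                       # regular schools only...
        & ~(ccd["is_virtual"] | ccd["is_dodea"] | ccd["is_bie"])  # ...excluding virtual, DoDEA, BIE
    )
    return ccd.loc[keep].copy()
```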

Sampling Frame

The sampling frame (i.e., universe list) from which the school sample will be drawn will be constructed from the 2016–17 (or later edition if available) Common Core of Data (CCD) Universe Files maintained by NCES.² The CCD file contains a record for all known public schools along with selected characteristics such as instructional level, enrollment size, community type (type of locale), percent of students eligible for free/reduced price lunch, and others. As summarized in Table 2, there are over 87,000 schools that meet the conditions for inclusion in the educational technology survey.

Table 2. Distribution of eligible schools in the 2016–17 CCD Universe File, by enrollment size class and instructional level

| Enrollment size class | Elementary | Middle | High | Other* | Total |
|---|---|---|---|---|---|
| Less than 300 | 11,878 | 3,239 | 4,704 | 1,890 | 21,711 |
| 300 to 499 | 18,957 | 3,207 | 2,767 | 984 | 25,915 |
| 500 to 999 | 19,893 | 6,494 | 3,477 | 1,228 | 31,092 |
| 1,000 to 1,499 | 1,027 | 1,583 | 2,250 | 359 | 5,219 |
| 1,500 or more | 68 | 161 | 3,186 | 177 | 3,592 |
| All | 51,823 | 14,684 | 16,384 | 4,638 | 87,529 |

* These are schools in the CCD Universe file with grades that do not meet the traditional definition of elementary, middle, or high schools.

Sample Design and Stratification

Traditionally, surveys conducted under FRSS have employed stratified samples ranging in size from 1,200 to 1,800 schools, depending on analytic goals and available resources. Because FRSS is designed to provide estimates for broadly defined subgroups of interest as well as overall national estimates, a stratified sample design with primary strata defined by level, size class, and other characteristics has generally been found effective in meeting study objectives. Specifying explicit strata for sampling purposes allows schools to be selected at varying rates to (a) ensure that key subgroups are adequately represented in the sample, and (b) improve sampling precision for selected subgroup estimates. Moreover, using enrollment size as the primary stratifier helps ensure that sample-based estimates correlated with school size achieve reasonable levels of precision.

In view of the above considerations, we plan to select a stratified sample of 1,300 schools for the FRSS 110 survey, with strata defined by (a) instructional level (elementary, middle, high, other) and (b) enrollment size class (the following five size classes: [1] under 300 students; [2] 300 to 499; [3] 500 to 999; [4] 1,000 to 1,499; and [5] 1,500 or more). To allow comparisons among the instructional levels, we will select 400 schools from each of the elementary, middle, and high school strata, and 100 schools from the “other” stratum. Within each instructional level stratum, the samples will be selected at rates roughly proportional to the aggregate square root of the enrollment of the schools in the five size classes. Using the square root of enrollment as the measure of size for sample allocation has two main benefits. First, it gives schools with larger enrollments relatively higher probabilities of selection, which is beneficial for estimating school-level characteristics related to the number of students in the school. Second, it helps limit the size of the design effects (and associated increased variances) that can adversely affect the estimation of proportions or counts of schools reporting a specified characteristic.
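To make the allocation rule concrete, the following is a minimal sketch, assuming a pandas DataFrame of the in-scope frame with hypothetical columns "level", "size_class", and "enrollment" (the actual frame variables may be named differently). Each level's target sample is spread across the five size classes in proportion to the aggregate square root of enrollment.

```python
import pandas as pd

# Target sample sizes by instructional level, from the design above.
LEVEL_TARGETS = {"elementary": 400, "middle": 400, "high": 400, "other": 100}

def allocate(frame: pd.DataFrame) -> pd.Series:
    """Allocate each level's target across size classes in proportion to
    the aggregate square root of enrollment."""
    frame = frame.assign(sqrt_enr=frame["enrollment"].pow(0.5))
    # Aggregate sqrt(enrollment) by instructional level x size class.
    measure = frame.groupby(["level", "size_class"])["sqrt_enr"].sum()
    allocations = {}
    for level, n_level in LEVEL_TARGETS.items():
        m = measure.loc[level]                       # sqrt-enrollment by size class
        allocations[level] = (n_level * m / m.sum()).round().astype(int)
    # In practice, rounded allocations are adjusted so each level sums
    # exactly to its target; that bookkeeping is omitted here.
    return pd.concat(allocations, names=["level", "size_class"])
```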

Other variables, such as region and poverty status, will be used to sort the schools in the sampling frame prior to sample selection. This sorting induces an “implicit” stratification that helps ensure that schools with these characteristics are appropriately represented in the sample. Within each sampling stratum, schools will be selected systematically at rates that depend on the size class of the school. Table 3 summarizes the allocation of the sample of 1,300 schools to the four instructional-level strata and the corresponding numbers to be selected by size class. Assuming an 85 percent response rate, the expected number of responding schools is 1,105.

Table 3. Distribution of school sample, by enrollment size class and instructional level

| Enrollment size class | Elementary | Middle | High | Other | Total |
|---|---|---|---|---|---|
| Less than 300 | 58 | 48 | 52 | 22 | 181 |
| 300 to 499 | 140 | 75 | 51 | 21 | 287 |
| 500 to 999 | 188 | 205 | 86 | 35 | 514 |
| 1,000 to 1,499 | 13 | 64 | 74 | 13 | 164 |
| 1,500 or more | 1 | 8 | 136 | 9 | 154 |
| All | 400 | 400 | 400 | 100 | 1,300 |
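
As a sketch of the sorting and systematic selection step described above (hypothetical column names again): within each stratum, the frame is sorted by region and poverty status, and schools are then taken at a fixed interval from a random start.

```python
import numpy as np
import pandas as pd

def systematic_sample(stratum: pd.DataFrame, n: int,
                      rng: np.random.Generator) -> pd.DataFrame:
    """Equal-probability systematic selection of n schools from one stratum,
    sorted first to induce implicit stratification."""
    ordered = stratum.sort_values(["region", "poverty_status"])
    interval = len(ordered) / n            # sampling interval
    start = rng.uniform(0, interval)       # random start within the interval
    picks = np.floor(start + interval * np.arange(n)).astype(int)
    return ordered.iloc[picks]

# Example: draw the 188 elementary schools in the 500-999 size class.
# rng = np.random.default_rng(2020)
# sample = systematic_sample(stratum_frame, n=188, rng=rng)
```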







Expected Levels of Precision

Table 4 summarizes the expected sample sizes and levels of precision for selected subgroup estimates derived from the proposed sample design. The numbers of responding schools shown in the table are calculated assuming an overall response rate of 85 percent. Also shown are 95% confidence bounds around an estimated percentage derived from the respondent samples, for reported respondent characteristics ranging from a 20% characteristic to a 50% characteristic. As can be seen in the table, for subgroups with at least 340 respondents, estimates are expected to be relatively precise, with 95% confidence bounds ranging from ±5.2% to ±6.1% for an estimated 50 percent characteristic. Moreover, under the proposed sample design, the minimum detectable difference (MDD) in estimated percentages between subgroups of approximately 340 or more respondents would range from about 10% to 12% (e.g., using a t-test for significance).
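For illustration, the confidence bounds and MDD can be approximated as follows. The design-effect value and the 80%-power z-value below are assumptions made for this sketch, not parameters stated in the design.

```python
import math

def ci_halfwidth(p: float, n: int, deff: float = 1.1) -> float:
    """95% confidence bound (half-width) for an estimated proportion,
    inflated by an assumed design effect."""
    return 1.96 * math.sqrt(deff * p * (1 - p) / n)

def mdd(p: float, n1: int, n2: int, deff: float = 1.1) -> float:
    """Approximate minimum detectable difference between two subgroup
    proportions at 5% significance and (assumed) 80% power."""
    se_diff = math.sqrt(deff * p * (1 - p) * (1 / n1 + 1 / n2))
    return (1.96 + 0.84) * se_diff  # 0.84 is the z-value for 80% power

# For a 50% characteristic and 340 respondents per subgroup:
print(ci_halfwidth(0.5, 340))  # about 0.056, i.e., roughly +/-5.6%
print(mdd(0.5, 340, 340))      # about 0.11, consistent with the 10-12% range
```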

Table 4. Expected sample sizes (number of completed interviews) and 95% confidence bounds around an estimated proportion, by selected subgroups under proposed design

Note: The three rightmost columns show 95% confidence bounds around an estimated percentage equal to P.

| Subgroup | Number selected | Respondent schools | P = 20.0% | P = 33.0% | P = 50.0% |
|---|---|---|---|---|---|
| Total | 1,300 | 1,105 | ±2.9% | ±3.4% | ±3.6% |
| Instructional level | | | | | |
|   Elementary | 400 | 340 | ±4.4% | ±5.2% | ±5.5% |
|   Middle | 400 | 340 | ±4.5% | ±5.3% | ±5.6% |
|   High | 400 | 340 | ±4.8% | ±5.6% | ±6.0% |
|   Other | 100 | 85 | ±9.1% | ±10.7% | ±11.3% |
| Enrollment size class | | | | | |
|   Under 500 | 468 | 398 | ±4.4% | ±5.2% | ±5.5% |
|   500 to 999 | 514 | 437 | ±4.2% | ±5.0% | ±5.3% |
|   1,000 or more | 318 | 270 | ±5.4% | ±6.3% | ±6.7% |
| Type of locale | | | | | |
|   City | 358 | 304 | ±5.5% | ±6.5% | ±6.9% |
|   Suburban | 459 | 390 | ±4.9% | ±5.7% | ±6.1% |
|   Town | 165 | 140 | ±8.1% | ±9.5% | ±10.1% |
|   Rural | 318 | 270 | ±5.8% | ±6.9% | ±7.3% |
| Percent of students eligible for free/reduced price lunch | | | | | |
|   Under 35 percent | 453 | 385 | ±4.6% | ±5.5% | ±5.8% |
|   35 to 49 percent | 231 | 196 | ±6.5% | ±7.6% | ±8.1% |
|   50 to 75 percent | 330 | 280 | ±5.4% | ±6.4% | ±6.8% |
|   75 percent or more | 287 | 244 | ±5.8% | ±6.9% | ±7.3% |

Estimation and Calculation of Sampling Errors

For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50–100 subsamples or “replicates” will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
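A minimal sketch of the jackknife variance computation for a weighted mean follows, assuming the replicate weights have already been formed (the replicate construction and nonresponse adjustments are not shown); variable names are illustrative.

```python
import numpy as np

def jackknife_se(y: np.ndarray, full_wt: np.ndarray,
                 rep_wts: np.ndarray) -> float:
    """Standard error of a weighted mean via delete-a-group jackknife.

    y        -- outcome values, shape (n,)
    full_wt  -- full-sample weights, shape (n,)
    rep_wts  -- replicate weights, shape (R, n), one row per replicate
    """
    theta_full = np.average(y, weights=full_wt)
    theta_reps = np.array([np.average(y, weights=w) for w in rep_wts])
    R = rep_wts.shape[0]
    # JK1-style variance: scaled sum of squared deviations of the replicate
    # estimates around the full-sample estimate.
    variance = (R - 1) / R * np.sum((theta_reps - theta_full) ** 2)
    return float(np.sqrt(variance))
```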

¹ The time cost to respondents is the hourly earnings of elementary and secondary school administrators as reported in the May 2018 Bureau of Labor Statistics (BLS) Occupational Employment Statistics. The mean hourly wage was computed assuming 2,080 hours per year. Source: BLS Occupational Employment Statistics, https://www.bls.gov/oes/current/oes_nat.htm#11-0000, Occupation code: Education Administrators, Elementary and Secondary School (11-9032; accessed on March 29, 2019).

² Glander, M. (2019). Documentation to the 2016-17 Common Core of Data (CCD) Universe Files (NCES 2019-052). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved 3/27/2019 from http://nces.ed.gov/pubsearch/.
