
Volume I:





Fast Response Survey System (FRSS) 110: Use of Educational Technology for Instruction in Public Schools – Preliminary Activities





OMB# 1850-0733 v. 35

April 1, 2019


National Center for Education Statistics (NCES)

U.S. Department of Education






Justification

The National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for the Quick Response Information System (QRIS) (OMB# 1850-0733) to conduct district recruitment for the Fast Response Survey System (FRSS) school survey #110 on use of educational technology for instruction in public schools. ED’s Office of Educational Technology requested that NCES conduct this FRSS survey.

The expanding use of technology affects the lives of students both inside and outside the classroom. For this reason, the role of technology in education is an increasingly important area of research. While access to technology can provide valuable learning opportunities to students, technology by itself does not guarantee successful outcomes. Schools and teachers play an important role in successfully integrating technology into teaching and learning. The purpose of this FRSS 110 survey is to collect nationally representative data from public schools about their use of educational technology for instruction.

NCES is authorized to conduct FRSS by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). NCES has contracted with Westat for all stages of the survey, including preliminary activities and data collection. This request covers the FRSS 110 preliminary activities, which involve securing research approval from special contact school districts beginning in June 2019. A request to conduct the full-scale survey will be submitted at a later date under the OMB generic clearance for quick response surveys (OMB# 1850-0733).

Design

Overview of Survey Development

FRSS has established procedures for developing short surveys on a wide variety of topics. The survey design for FRSS 110 will be shaped by input from the NCES Quality Review Board (QRB), several rounds of feasibility calls, and up to two pretests.

We are conducting up to three rounds of feasibility calls (OMB# 1850-0803 v. 244), each with fifteen or fewer respondents. For new surveys such as the FRSS 110 survey on school use of educational technology, the initial feasibility calls use an open-ended interview guide rather than a questionnaire. As the rounds progress, respondents will be asked to review, but not complete, draft questionnaire items and ultimately a draft questionnaire. Conducting multiple rounds of feasibility calls will systematically inform us about public schools' use of educational technology for instruction: information gathered in the early rounds will be used to draft a questionnaire, and later rounds will provide in-depth information on respondents' perceptions of the draft survey and its response burden. The process will yield several iterations of the questionnaire items. Based on the feedback we receive, we will make any necessary changes to the survey items and prepare a draft survey, which will be reviewed by the NCES QRB and revised as necessary for pretesting.

For the pretest, respondents will be asked to complete the questionnaire and fax it to Westat, and then participate in a telephone debriefing with Westat to provide feedback on the questionnaire. The purpose of the pretest is to verify that all questions and corresponding instructions are clear and unambiguous, to determine if the information would be readily accessible to respondents, and to determine whether the burden on respondents could be further reduced. As necessary, changes to the questionnaire will be made based on the feedback received during the pretest.

Procedures and Materials for Preliminary Activities

The FRSS 110 preliminary activities requested in this submission include contacting and seeking research approvals from public school districts with an established research approval process ("special contact districts"). Special contact districts are those known to require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Activities begin with updating district information based on what can be gleaned from online sources and what is known from other NCES data collections. Individual districts will be contacted as needed to fill in gaps about where and to whom to send the completed research application forms. This operation will begin in June 2019 to allow as much time as possible for the districts' review processes, and will continue until we receive a final response (approval or denial) from each district, as long as there is sufficient time remaining for sampled schools to respond to FRSS 110. Any special requirements that districts have for approving surveys will be met before schools in those districts are contacted. Because each special contact district has unique requirements for obtaining approval, the materials sent to each district will be tailored to its specific requirements, based on the district recruitment materials provided in Appendix A.

Overview of Survey Collection

In winter 2019, we will submit a request for clearance to collect surveys from a sample of 1,300 public schools. We will send the principals of the sampled schools a survey package requesting that the questionnaire be completed by the person in the school most knowledgeable about educational technology. Respondents will have the option of completing the survey on paper or online. We will conduct a nonresponse follow-up operation using a combination of mail, email, and telephone contacts. The final data collection details and materials will be provided in winter 2019.

NCES Review and Consultations Outside of Agency

The NCES QRB members reviewed a draft list of questionnaire and discussion topics prior to this request for preliminary activities. Revisions were made to the list of topics based on input from the reviewers, and the list was used to develop an interview guide for the feasibility calls. In addition to staff from NCES's Statistical Standards group, the Annual Reports group, and each of the three Divisions, the QRB also includes staff from ED's Office of Educational Technology (OET); the U.S. Commerce Department's National Telecommunications and Information Administration; the National Science Foundation; and four education technology organizations. The QRB members for this survey are listed below:


Rafi Goldberg, National Telecommunications and Information Administration, Commerce

Lee Zia, National Science Foundation

Bernadette Adams, Office of Educational Technology

Halima Adenegan, NCES (Assessment Division; ED Tech Equity Initiative)

Tom Snyder, NCES (Annual Reports and Information)

Ross Santy, NCES (Administrative Records Division)

Chris Chapman, NCES (Sample Surveys Division)

Kashka Kubzdela, NCES (Office of the Commissioner)

Maria Worthen, iNACOL

Christina Luke, Digital Promise

Susan Bearden, CoSN

Ji Soo Song, ISTE

Assurance of Confidentiality

Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file will undergo extensive disclosure risk analysis and will be reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured that their participation in the survey is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Description of Sample and Burden

The sample design is a nationally representative sample of approximately 1,300 public schools drawn from the 2016–17 (or most recent) NCES Common Core of Data (CCD) Public School Universe File. For the school survey, we estimate needing to work with approximately 200 special contact districts. The respondent burden for special contact districts is estimated at approximately 2 hours for IRB review by one staff member per district, plus 60 minutes per member for district IRB panel review, assuming an average of six members per panel. Information about estimated respondent burden and response time cost for the FRSS 110 preliminary activities is provided in Table 1.

Table 1. Estimated burden for FRSS 110 preliminary activities

Type of collection                        | Sample size | Est. response rate (%) | Est. respondents and responses | Burden hours per respondent | Total burden hours | Time cost (@$47.48/hour)
Special contact district IRB staff review | 200         | 100                    | 200                            | 2                           | 400                | $18,992
Special contact district IRB panel review | 200*6       | 100                    | 1,200                          | 1                           | 1,200              | $56,976
Total                                     | --          | --                     | 1,400                          | --                          | 1,600              | $75,968

The estimated average hourly earnings of elementary and secondary school administrators are $47.48.¹ Therefore, based on 1,600 total burden hours for FRSS 110 preliminary activities, the associated total estimated burden time cost to respondents is $75,968.
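As a check on the arithmetic, the totals in Table 1 follow directly from the per-respondent figures. Below is a minimal Python sketch reproducing them; all figures are taken from Table 1 above.

    # Reproduce the burden totals in Table 1.
    HOURLY_RATE = 47.48  # mean hourly wage of school administrators (BLS, May 2018)

    rows = [
        # (activity, number of respondents, burden hours per respondent)
        ("IRB staff review", 200, 2.0),      # one staff member per district
        ("IRB panel review", 200 * 6, 1.0),  # six panel members per district
    ]

    total_hours = sum(n * h for _, n, h in rows)
    total_cost = total_hours * HOURLY_RATE
    print(f"Total burden hours: {total_hours:,.0f}")  # 1,600
    print(f"Total time cost:    ${total_cost:,.0f}")  # $75,968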

Survey Cost and Time Schedule

The entire survey, including these preliminary activities and the conduct of the main survey, is estimated to cost the federal government about $850,000, including about $800,000 for contractual costs and $50,000 for salaries and expenses. Contractual costs include the costs for survey preparation, preliminary activities, data collection, data analysis, and report preparation.

Preliminary activities to obtain approval from special contact districts will begin in June 2019. The main survey data collection from schools will begin in January 2020 and is scheduled to end in June 2020.

Plan for Tabulation and Publication

The First Look report will be released on the NCES website in summer 2021 and will include explanatory text and tables. Participating schools will be notified when NCES releases the report. A public-use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item, and cross-tabulations of data items will be made with selected classification variables, such as school level, enrollment size, community type (locale), geographic region, and category of percent of students eligible for free or reduced-price lunch.

Statistical Methodology

Reviewing Statisticians

Christopher Chapman, of NCES, is the Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design.

Respondent Universe

FRSS 110 will collect data from a nationally representative sample of public schools. Schools meeting the following conditions are in scope for the educational technology survey:

  • The school provides instruction in any of grades 1 through 12.

  • The school is located within the 50 states or the District of Columbia.

  • The school has nonzero, nonmissing enrollment.

  • The school is a regular school, excluding Department of Defense Education Activity (DoDEA) and Bureau of Indian Education (BIE) schools and schools that are fully or primarily virtual.

Sampling Frame

The sampling frame (i.e., universe list) from which the school sample will be drawn will be constructed from the 2016–17 (or later edition if available) Common Core of Data (CCD) Universe Files maintained by NCES.² The CCD file contains a record for all known public schools along with selected characteristics such as instructional level, enrollment size, community type (type of locale), percent of students eligible for free/reduced price lunch, and others. As summarized in Table 2, there are over 87,000 schools that meet the conditions for inclusion in the educational technology survey.
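For illustration, the sketch below shows how the in-scope conditions listed above might be applied to a CCD extract. This is a hypothetical sketch only: the file name and all column names are illustrative placeholders, not the actual CCD variable names.

    import pandas as pd

    # Minimal sketch of applying the in-scope conditions to a CCD extract.
    # Column names (state, lowest_grade, highest_grade, enrollment,
    # school_type, agency_type, virtual_status) are placeholders.
    ccd = pd.read_csv("ccd_public_school_universe_2016_17.csv")  # hypothetical file

    states_plus_dc = set(
        "AL AK AZ AR CA CO CT DE DC FL GA HI ID IL IN IA KS KY LA ME MD MA MI "
        "MN MS MO MT NE NV NH NJ NM NY NC ND OH OK OR PA RI SC SD TN TX UT VT "
        "VA WA WV WI WY".split()
    )  # the 50 states plus the District of Columbia

    frame = ccd[
        # provides instruction in any of grades 1-12 (grades coded numerically, K = 0)
        (ccd["highest_grade"] >= 1) & (ccd["lowest_grade"] <= 12)
        # located in the 50 states or DC
        & ccd["state"].isin(states_plus_dc)
        # nonzero, nonmissing enrollment
        & ccd["enrollment"].notna() & (ccd["enrollment"] > 0)
        # regular schools only, excluding DoDEA/BIE and fully/primarily virtual schools
        & (ccd["school_type"] == "regular")
        & ~ccd["agency_type"].isin(["DoDEA", "BIE"])
        & (ccd["virtual_status"] != "full_or_primarily_virtual")
    ].copy()

    print(f"{len(frame):,} schools in scope")  # roughly 87,500 per Table 2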

Table 2. Distribution of eligible schools in the 2016–17 CCD Universe File, by enrollment size class and instructional level

Enrollment size class | Elementary | Middle | High   | Other* | Total
Less than 300         | 11,878     | 3,239  | 4,704  | 1,890  | 21,711
300 to 499            | 18,957     | 3,207  | 2,767  | 984    | 25,915
500 to 999            | 19,893     | 6,494  | 3,477  | 1,228  | 31,092
1,000 to 1,499        | 1,027      | 1,583  | 2,250  | 359    | 5,219
1,500 or more         | 68         | 161    | 3,186  | 177    | 3,592
All                   | 51,823     | 14,684 | 16,384 | 4,638  | 87,529

* These are schools in the CCD Universe file with grades that do not meet the traditional definition of elementary, middle, or high schools.

Sample Design and Stratification

Traditionally, surveys conducted under FRSS have employed stratified samples ranging in size from 1,200 to 1,800 schools, depending on analytic goals and available resources. Because FRSS is designed to provide estimates for broadly defined subgroups of interest as well as overall national estimates, a stratified sample design with primary strata defined by instructional level, enrollment size class, and other characteristics has generally been found effective in meeting study objectives. Specifying explicit strata for sampling allows schools to be selected at varying rates in order to: (a) ensure that key subgroups are adequately represented in the sample, and (b) improve sampling precision for selected subgroup estimates. Moreover, using enrollment size as the primary stratifier helps ensure reasonable levels of precision for estimates that are correlated with school size.

In view of these considerations, we plan to select a stratified sample of 1,300 schools for the FRSS 110 survey, with strata defined by: (a) instructional level (elementary, middle, high, other) and (b) enrollment size class (five classes: [1] under 300 students; [2] 300 to 499; [3] 500 to 999; [4] 1,000 to 1,499; and [5] 1,500 or more). To allow comparisons among the instructional levels, we will select 400 schools from each of the elementary, middle, and high school strata, and 100 schools from the "other" stratum. Within each instructional-level stratum, the samples will be selected at rates roughly proportional to the aggregate square root of the enrollment of the schools in each of the five size classes. Using the square root of enrollment as the measure of size for sample allocation has two main benefits. First, it gives schools with larger enrollments relatively higher probabilities of selection, which is beneficial for estimating school-level characteristics related to the number of students in the school. Second, it helps limit the design effects (and associated variance increases) that can adversely affect the estimation of proportions or counts of schools that report a specified characteristic.
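To make the allocation rule concrete, the sketch below computes a within-stratum allocation proportional to the aggregate square root of enrollment. It is illustrative only: the function name and toy data are hypothetical, and the production allocation also applies rounding and other design constraints not shown here.

    import math
    from collections import defaultdict

    def allocate_sqrt(schools, n_target):
        """Allocate a stratum's sample across enrollment size classes in
        proportion to the aggregate square root of enrollment.
        `schools` is a list of (size_class, enrollment) pairs for one
        instructional-level stratum; returns {size_class: sample size}."""
        measure = defaultdict(float)
        for size_class, enrollment in schools:
            measure[size_class] += math.sqrt(enrollment)
        total = sum(measure.values())
        raw = {c: n_target * m / total for c, m in measure.items()}
        # largest-remainder rounding so the allocation sums to n_target
        alloc = {c: int(r) for c, r in raw.items()}
        leftover = n_target - sum(alloc.values())
        for c in sorted(raw, key=lambda k: raw[k] - alloc[k], reverse=True)[:leftover]:
            alloc[c] += 1
        return alloc

    # Toy example: six schools in four size classes, target sample of 10
    demo = [("<300", 250), ("<300", 150), ("300-499", 400),
            ("500-999", 800), ("500-999", 650), ("1500+", 2200)]
    print(allocate_sqrt(demo, n_target=10))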

Other variables, such as region and poverty status, will be used to sort the schools in the sampling frame prior to sample selection. The sorting induces an “implicit” stratification that helps ensure that schools with the selected characteristics are appropriately represented in the sample. Within each sampling stratum, schools will be selected systematically at rates that depend on the size class of the school. Table 3 summarizes the allocation of the sample of 1,300 schools to the four instructional level strata and the corresponding numbers to be selected by size class. Assuming an 85 percent response rate, the expected number of responding schools is 1,105.
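The sketch below illustrates this selection step: the stratum file is sorted by the implicit stratifiers, and schools are then drawn systematically with a random start, so a fixed skip interval spreads the sample across the sorted characteristics. The record fields and sort keys are illustrative, not the actual FRSS implementation.

    import random

    def systematic_sample(stratum, n):
        """Equal-probability systematic sample of n records from one
        explicit stratum; `stratum` must already be sorted by the
        implicit stratifiers so the skip interval spreads the sample."""
        interval = len(stratum) / n
        start = random.uniform(0, interval)  # random start in [0, interval)
        return [stratum[int(start + k * interval)] for k in range(n)]

    # Usage: sort by region, then poverty category, before selecting
    # stratum.sort(key=lambda s: (s["region"], s["poverty_category"]))
    # sample = systematic_sample(stratum, n=86)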

Table 3. Distribution of school sample by enrollment size class and instructional level

Enrollment size class | Elementary | Middle | High | Other | Total
Less than 300         | 58         | 48     | 52   | 22    | 181
300 to 499            | 140        | 75     | 51   | 21    | 287
500 to 999            | 188        | 205    | 86   | 35    | 514
1,000 to 1,499        | 13         | 64     | 74   | 13    | 164
1,500 or more         | 1          | 8      | 136  | 9     | 154
All                   | 400        | 400    | 400  | 100   | 1,300







Expected Levels of Precision

Table 4 summarizes the expected sample sizes and levels of precision for selected subgroup estimates derived from the proposed sample design. The numbers of "responding schools" shown in the table are calculated assuming an overall response rate of 85 percent. Also shown are 95 percent confidence bounds around an estimated percentage derived from the respondent samples; the bounds are given for reported respondent characteristics ranging from a 20 percent characteristic to a 50 percent characteristic. As the table shows, for subgroups with at least 340 respondents, estimates are expected to be relatively precise, with 95 percent confidence bounds ranging from ±5.3% to ±6.1% for an estimated 50 percent characteristic. Moreover, under the proposed sample design, the minimum detectable difference (MDD) in estimated percentages between subgroups of approximately 340 or more respondents would range from about 10 to 12 percentage points (e.g., using a t-test for significance).
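The bounds in Table 4 behave like the usual normal-approximation half-width inflated by a design effect, and the MDD follows from a standard two-sample power calculation. The sketch below shows the form of these calculations; the design effect (deff = 1.1) and the 80 percent power assumption are illustrative values, not the parameters used to produce Table 4.

    import math

    def ci_halfwidth(p, n, deff=1.1, z=1.96):
        """Approximate 95% confidence half-width for a proportion p
        estimated from n respondents under a complex design."""
        return z * math.sqrt(deff * p * (1.0 - p) / n)

    def mdd(p, n1, n2, deff=1.1, z_alpha=1.96, z_beta=0.84):
        """Approximate minimum detectable difference between two subgroup
        proportions at 5% significance and 80% power."""
        return (z_alpha + z_beta) * math.sqrt(deff * p * (1.0 - p) * (1.0 / n1 + 1.0 / n2))

    # A 50% characteristic estimated from 340 responding schools:
    print(f"±{100 * ci_halfwidth(0.50, 340):.1f} points")  # ≈ 5.6, vs. ±5.5 to ±6.1 in Table 4
    # Difference detectable between two subgroups of 340 respondents each:
    print(f"{100 * mdd(0.50, 340, 340):.1f} points")       # ≈ 11.3, consistent with 10-12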

Table 4. Expected sample sizes (number of completed interviews) and 95% confidence bounds around an estimated proportion by selected subgroups under proposed design

(The last three columns give 95% confidence bounds around an estimated percentage equal to P.)

Subgroup                                                  | Number selected | Respondent schools | P = 20.0% | P = 33.0% | P = 50.0%
Total                                                     | 1,300           | 1,105              | ±2.9%     | ±3.4%     | ±3.6%
Instructional level
  Elementary                                              | 400             | 340                | ±4.4%     | ±5.2%     | ±5.5%
  Middle                                                  | 400             | 340                | ±4.5%     | ±5.3%     | ±5.6%
  High                                                    | 400             | 340                | ±4.8%     | ±5.6%     | ±6.0%
  Other                                                   | 100             | 85                 | ±9.1%     | ±10.7%    | ±11.3%
Enrollment size class
  Under 500                                               | 468             | 398                | ±4.4%     | ±5.2%     | ±5.5%
  500 to 999                                              | 514             | 437                | ±4.2%     | ±5.0%     | ±5.3%
  1,000 or more                                           | 318             | 270                | ±5.4%     | ±6.3%     | ±6.7%
Type of locale
  City                                                    | 358             | 304                | ±5.5%     | ±6.5%     | ±6.9%
  Suburban                                                | 459             | 390                | ±4.9%     | ±5.7%     | ±6.1%
  Town                                                    | 165             | 140                | ±8.1%     | ±9.5%     | ±10.1%
  Rural                                                   | 318             | 270                | ±5.8%     | ±6.9%     | ±7.3%
Percent of students eligible for free/reduced price lunch
  Under 35 percent                                        | 453             | 385                | ±4.6%     | ±5.5%     | ±5.8%
  35 to 49 percent                                        | 231             | 196                | ±6.5%     | ±7.6%     | ±8.1%
  50 to 75 percent                                        | 330             | 280                | ±5.4%     | ±6.4%     | ±6.8%
  75 percent or more                                      | 287             | 244                | ±5.8%     | ±6.9%     | ±7.3%

Estimation and Calculation of Sampling Errors

For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50–100 subsamples or “replicates” will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
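As an illustration of the replication approach described above, the sketch below computes a delete-a-group jackknife standard error for a weighted estimate. It is a simplified sketch: real FRSS replicate weights are constructed to mirror the stratified design and carry nonresponse adjustments, whereas here replicate groups are formed by simple random assignment.

    import random

    def jackknife_se(records, estimator, n_reps=50):
        """Delete-a-group jackknife standard error. `records` is a list of
        (weight, value) pairs; `estimator` maps such a list to a statistic."""
        groups = [i % n_reps for i in range(len(records))]
        random.shuffle(groups)  # simple random group assignment (illustrative)
        theta_full = estimator(records)
        var = 0.0
        for g in range(n_reps):
            # drop group g and reweight the remaining records to compensate
            factor = n_reps / (n_reps - 1)
            replicate = [(w * factor, y)
                         for (w, y), grp in zip(records, groups) if grp != g]
            var += (n_reps - 1) / n_reps * (estimator(replicate) - theta_full) ** 2
        return var ** 0.5

    def weighted_mean(recs):
        return sum(w * y for w, y in recs) / sum(w for w, _ in recs)

    # Toy usage: weighted proportion of 1,105 schools reporting a characteristic
    data = [(random.uniform(50, 90), random.random() < 0.4) for _ in range(1105)]
    est, se = weighted_mean(data), jackknife_se(data, weighted_mean)
    print(f"estimate = {est:.3f}, standard error = {se:.3f}")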

¹ The time cost to respondents is based on the hourly earnings of elementary and secondary school administrators as reported in the May 2018 Bureau of Labor Statistics (BLS) Occupational Employment Statistics. The mean hourly wage was computed assuming 2,080 work hours per year. Source: BLS Occupational Employment Statistics, https://www.bls.gov/oes/current/oes_nat.htm#11-0000, occupation code: Education Administrators, Elementary and Secondary School (11-9032; accessed March 29, 2019).

² Glander, M. (2019). Documentation to the 2016–17 Common Core of Data (CCD) Universe Files (NCES 2019-052). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved March 27, 2019, from http://nces.ed.gov/pubsearch/.
