Justification

System Clearance for Cognitive, Pilot and Field Test Studies

OMB: 1850-0803



Volume I:


Request for Clearance for Pretest of the Proposed Fast Response Survey System (FRSS) 98: Distance Education Courses for Public Elementary and Secondary School Students: 2009-10


OMB# 1850-0803 v. 28

May 13, 2010

Justification


The National Center for Education Statistics (NCES), U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for Cognitive, Pilot, and Field Test studies (OMB #1850-0803) to pretest a district survey about technology-based distance education for public elementary and secondary school students, conducted as part of the Fast Response Survey System (FRSS). NCES is requesting clearance to conduct up to two rounds of pretest calls, if necessary. This survey, referred to as FRSS 98, will provide nationally representative data on current enrollments in distance education courses in the nation’s public elementary and secondary schools. It will also cover tracking and monitoring of student progress in distance education courses, district record-keeping, entities with which districts partner to deliver distance education courses, reasons for having distance education, types of distance education courses, and technologies used to deliver these courses. The purpose of the pretest is to identify and correct any potential issues with the content and format of the survey before its full-scale implementation later this year, to ensure that each question captures its intended meaning and minimizes the burden imposed on respondents. The request to conduct the full survey will be submitted at a later date under the OMB generic clearance for quick response surveys (OMB #1850-0733), which are authorized under the Education Sciences Reform Act of 2002.


Design


Overview of Survey Development


NCES has contracted with Westat to conduct the FRSS 98 survey and its pretest. The survey was requested and funded by the Office of Educational Technology (OET), U.S. Department of Education. Two previous iterations of the district survey Distance Education Courses for Public Elementary and Secondary School Students were conducted by NCES for school years 2002-03 and 2004-05. The current survey (2009-10) is based on the previous versions, with modifications based on four rounds of feasibility calls with public school district personnel most knowledgeable about K-12 distance education. The four rounds of feasibility calls, each with nine or fewer respondents, were conducted between February 2009 and February 2010 to update the survey questions. The first round of calls focused on potential survey topics and potential revisions to the definition of distance education. In the second round, the feasibility of and burden associated with providing specific enrollment numbers was explored, and the definition was finalized. The third round provided a review of the entire questionnaire, and the fourth round focused on a brief review of a subset of modified questions. The resulting draft of the survey was then reviewed by the NCES Quality Review Board (QRB) and revised accordingly to prepare it for the pretest described here.


Consultations Outside of Agency


The data requesters for this survey, the Office of Educational Technology, provided extensive input during survey development, and reviewed and approved all questions.


Sample, Burden, and Cost


In this submission, we are requesting approval to conduct up to two rounds of pretest calls with the revised questionnaire, with 15 respondents from school districts around the nation in each round. Respondents will be asked to review, complete, and fax back the 3-page paper-and-pencil survey, and will be invited to provide their feedback by telephone. Administrators who are knowledgeable about their district’s distance education courses for public elementary and secondary schools, drawn from districts of different sizes, urbanicities, and regions of the country, will be identified and recruited by phone (see the phone recruitment script in Attachment 1). To recruit 15 districts for a pretest round, we anticipate contacting approximately 45 districts. On average, recruitment calls with respondents who agree to participate are expected to take about 10 minutes to explain the purpose of the pretest and set up an appointment to discuss the survey; all other recruitment calls are expected to take about 3 minutes. The questionnaire is expected to take approximately 30 minutes to complete, and verbal feedback is expected to take another 30 minutes or less. The total expected response burden is therefore about 1 hour and 10 minutes per participating respondent (17.5 hours across 15 respondents), plus 1.5 hours for contacts with districts that do not participate in the pretest, for a total estimated burden of 19 hours per round of pretest calls (see Table 1). The feedback obtained will be used to revise the survey. If the revisions needed after the first pretest are minor, only one round of pretest calls will be made; if major revisions are needed, a second round of pretest calls will be conducted. We anticipate that the cost of the pretest to the federal government will be approximately $10,000 per round.
Following the pretest, NCES will submit the revised questionnaire along with an official request for OMB clearance to conduct the national study of FRSS 98.


Table 1. Maximum burden time for up to two rounds of pretest calls

Respondents                                            Number of     Number of    Burden Hours      Total
                                                       Respondents   Responses    per Respondent    Burden Hours
Each round:
  Recruitment – districts not participating
  in the pretest                                           30            30           0.05              1.5
  Recruitment – districts participating
  in the pretest                                           15            15           0.167             2.5
  Pretest survey and debrief                               15            15           1                15
  Total per round                                          45            60           -                19
Total for two rounds                                       90           120           -                38
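As an illustrative cross-check (not part of the original submission), the per-round burden estimates in Table 1 can be reproduced from the contact counts and call durations stated in the narrative above:

```python
# Illustrative cross-check of the Table 1 burden estimates; the counts and
# call durations come from the narrative above, not from any official formula.

MINUTES_PER_HOUR = 60

def round_burden_hours(nonparticipants=30, participants=15):
    """Total estimated burden hours for one round of pretest calls."""
    # Districts contacted that do not participate: ~3-minute calls.
    nonparticipant_hours = nonparticipants * 3 / MINUTES_PER_HOUR       # 1.5
    # Participating districts: ~10-minute recruitment call each.
    recruitment_hours = participants * 10 / MINUTES_PER_HOUR            # 2.5
    # Participants also complete the survey (~30 min) and a debrief (~30 min).
    survey_debrief_hours = participants * (30 + 30) / MINUTES_PER_HOUR  # 15.0
    return nonparticipant_hours + recruitment_hours + survey_debrief_hours

print(round_burden_hours())      # 19.0 hours per round
print(2 * round_burden_hours())  # 38.0 hours for two rounds
```

The 17.5-hour per-respondent subtotal in the text corresponds to the 10-minute recruitment call plus the hour of survey and debrief time (70 minutes) across the 15 participating districts.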



Data Collection Instrument


A cover letter (Attachment 2) and questionnaire (Attachment 3) will be emailed or faxed to each participating district. The cover letter thanks the respondent for agreeing to participate in the pretest; introduces the purpose and content of the survey; indicates that participation is voluntary; includes instructions on how to complete and return the survey; lists questions for respondents to consider while completing the survey, to help them provide feedback; and provides contact information should any questions arise before the scheduled discussion with the survey manager. On the cover of the survey, respondents are assured that their participation is voluntary and that their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law; the public law is cited on the front page of the survey (Attachment 3). While the current draft of the questionnaire exceeds the 3-page limit for FRSS surveys, the final questionnaire will be reduced to 3 pages based on the results of the pretest.


The survey is designed to collect general information on distance education courses for public elementary and secondary school students in the nation’s public school districts. The first three questions ask about enrollments in distance education courses. The first question is provided to ‘screen out’ districts that do not have students enrolled in distance education courses. The second question asks for the number of enrollments in distance education courses, i.e., a duplicated count of students, reported by instructional level; this is the same as the 2004-05 version of the question. Question 3 asks whether the district can provide the number of students enrolled in distance education courses, i.e., an unduplicated count. Respondents in the first three rounds of feasibility calls indicated that these counts were not always available, and that when they were available, providing them was burdensome. As a result, the question was modified to have a Yes-No response. OET is interested in this information for possible future research.
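To make the duplicated/unduplicated distinction concrete, a small sketch with hypothetical enrollment records follows; the student IDs and course names are invented for illustration and do not come from the survey instrument:

```python
# Hypothetical (student_id, course) enrollment records; the IDs and course
# names are made up for illustration only.
records = [
    ("S1", "Algebra II"),
    ("S1", "Spanish I"),   # S1 is enrolled in two courses
    ("S2", "Algebra II"),
    ("S3", "Physics"),
]

# Question 2's figure: enrollments, a duplicated count (S1 counts twice).
enrollment_count = len(records)

# Question 3's figure: students, an unduplicated count (S1 counts once).
student_count = len({student for student, _ in records})

print(enrollment_count, student_count)  # 4 3
```

A district that tracks only course rosters can report the duplicated count easily, while the unduplicated count requires matching students across courses, which is why respondents described it as burdensome.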


Questions 4-6 focus on tracking and monitoring distance education courses. Question 4 asks whether districts distinguish distance education courses on students’ academic records; this provides information about the ease with which districts can identify these courses in student records, and OET is interested in it for possible future research. Question 5 asks about tracking of information on completions of these courses. During feasibility calls, respondents indicated that providing numbers of completions would be burdensome and problematic because of differing definitions and time periods for tracking completion information; the question was therefore modified to ask whether distance education and classroom courses are tracked in the same way. Question 6 asks about specific ways districts may monitor progress in distance education courses.


Questions 7 and 8 were included to obtain information about district policies regarding distance education courses and programs. Question 7 asks about written policies that specify the consequences of not successfully completing a distance education course. Question 8 asks whether districts offer these courses as an entire course load and/or as a full program leading to graduation. Respondents during the feasibility calls indicated that taking a full course load or a full program through distance education was often handled differently depending on why distance education was being used, so the response categories reflect the different uses of distance education courses by specific programs and by the regular high school program.


Districts may use many entities to deliver and/or develop distance education courses. Question 9 asks which entities deliver these courses and asks respondents to rank the five entities that most frequently do so. This question has been slightly modified from the 2004-05 version of the questionnaire with the addition of the Part 2 ranking task. Question 10 asks, on a 5-point scale, about the extent to which districts or other entities develop distance education courses.


Question 11 asks about the reasons for having distance education courses. This item is similar to a question asked in the 2002-03 version of the questionnaire. Question 12 asks about the types of courses taken, which was identified in feasibility calls as an important question that is distinct from the reasons students take distance education courses.


Questions 13 and 14 address the technologies used for delivery of distance education courses. Question 13, which asks about the extent to which each technology is used, is a modified version of a similar question on the 2004-05 version of the questionnaire. However, the extent scale replaces Yes/No response categories, and the response options (a-f) have been modified slightly for clarity. The extent scale is included to provide more information than just Yes/No about usage of each type of technology. Question 14 is the same as a question on the 2004-05 questionnaire, which asks for the primary mode of technology used for the greatest number of courses taken.


Question 15 asks about courses delivered over the Internet and is included to set up the skip pattern for Question 16. Question 16 asks about locations in which students were accessing Internet-based courses. This is a slightly modified version of a question asked on the 2004-05 version.


Because there is interest in whether providing distance education courses is a growing trend, Question 17 asks whether the district will expand the number of distance education courses offered over the next three years. This is also modified slightly from a question on the 2004-05 version.

Lastly, Question 18 asks districts whether they deliver distance education courses to students not regularly enrolled in the district. This is also a slightly modified version of a question on the 2004-05 version.


Timeline

Pretest activities are expected to begin as soon as approval for them is received from OMB. It is anticipated that participant recruitment, completion of the pretest calls, write up of the memorandum summarizing the results, and survey revision will take approximately 8 weeks.

Author: Priscilla Carver