
Request for OMB Clearance

Teachers’ Use of Educational Technology in U.S. Public Schools

Supporting Statement for Paperwork Reduction Act Submissions


Section A. Justification


A.1. Importance of the Information


The National Center for Education Statistics (NCES), U.S. Department of Education, proposes to conduct a new survey on teachers’ use of educational technology in public elementary and secondary schools. The survey was requested by the Office of Educational Technology (OET), U.S. Department of Education, to provide national data on the use of current and emerging educational technology by public school teachers. The survey is included in the National Educational Technology Leadership national activities spending plan.


This is one of three proposed surveys that OET has requested. The survey will follow procedures that are standard for Fast Response Survey System (FRSS) surveys, but it is being submitted for full clearance review rather than under the FRSS system clearance because it is a survey of teachers. The other two surveys are district-level and school-level surveys and will be submitted under the FRSS clearance (OMB No. 1850-0733). OET envisions the three new surveys as a barometer of technology access and use within public elementary and secondary school districts, schools, and classrooms. The teacher survey will collect information on technology use and teachers’ perceptions that cannot be obtained at the school or district level.


In 1999, NCES conducted the FRSS survey Public School Teachers’ Use of Computers and the Internet. That survey found that nearly all public school teachers (99 percent) reported having computers available somewhere in their schools, and approximately half of the teachers who had computers or the Internet available in their schools used them for classroom instruction. Approximately one-third of teachers reported feeling well prepared or very well prepared to use computers and the Internet for classroom instruction, with less experienced teachers indicating they felt better prepared to use technology than their more experienced colleagues.


The new teacher survey will cover a broader range of educational technology topics and will include new and emerging technology. It will provide current national statistics on the following topics:


  • Technology equipment (e.g., availability and use of computers and other hardware);

  • Uses of educational technology for instructional and administrative purposes;

  • Teacher preparation to use educational technology;

  • Teacher professional development in educational technology; and

  • Teacher information (e.g., grades and subjects taught, years of experience) to be used as classification variables.


By addressing teachers’ use of, and preparation to use, current and emerging educational technology in public schools, the survey will provide valuable national data for OET and other educational policymakers at the national, state, and district levels.


The survey is authorized under Section 153(a) of the Education Sciences Reform Act of 2002 (Public Law 107-279), which states that the purpose of NCES is “to collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations.”


Overview of Data Collection. Westat has been contracted by NCES to conduct this survey. Westat is responsible for the questionnaire development; sample design and selection; data collection; telephone follow up; editing, coding, keying, and verification of the data; and production of tabulations and the report detailing the results of the survey.


The survey includes a nested sample design that links districts, schools, and teachers. We will draw an initial sample of 2,000 schools from the most recent NCES Common Core of Data (CCD) Public School Universe file and expect a response rate of 90 percent (based on previous similar surveys), yielding 1,800 participating schools. From lists provided by the participating schools, we will draw a nationally representative sample of about 4,000 teachers. Prior to contacting schools or teachers, a courtesy information packet consisting of a cover letter and a copy of the questionnaire will be mailed to the superintendent of each district with schools selected for participation. The packet also will include a list of the sampled schools within the district. Any special requirements that districts have for approval of surveys will be met before schools in those districts are contacted.
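
As an illustration of this design, the Python sketch below outlines the two-stage (school, then teacher) selection and the yields implied by the assumed response rates. It is a minimal, hypothetical outline only: the function names and data structures are ours, simple random sampling is used purely for illustration, and the actual design may stratify the CCD frame and sample teachers at different rates by school size (see Section A.5).

    import random

    # Illustrative two-stage selection sketch; names and data structures are
    # hypothetical, and simple random sampling stands in for the actual design.

    SCHOOL_SAMPLE_SIZE = 2_000     # initial school sample drawn from the CCD frame
    TEACHER_SAMPLE_SIZE = 4_000    # teachers drawn from lists supplied by schools
    EXPECTED_RESPONSE_RATE = 0.90  # based on previous similar FRSS surveys

    def select_schools(ccd_frame):
        """Stage 1: draw the school sample from the CCD Public School Universe file."""
        return random.sample(ccd_frame, SCHOOL_SAMPLE_SIZE)

    def select_teachers(teacher_lists):
        """Stage 2: pool the teacher lists returned by participating schools
        and draw the teacher sample from the pooled list."""
        pooled = [teacher for school_list in teacher_lists for teacher in school_list]
        return random.sample(pooled, TEACHER_SAMPLE_SIZE)

    # Yields implied by the 90 percent response-rate assumption.
    print(int(SCHOOL_SAMPLE_SIZE * EXPECTED_RESPONSE_RATE))   # about 1,800 participating schools
    print(int(TEACHER_SAMPLE_SIZE * EXPECTED_RESPONSE_RATE))  # about 3,600 completed questionnaires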


The first stage of data collection will be to collect lists of eligible teachers from the sampled schools. These lists will provide the sampling frame to select teachers. In September 2008, the principal of each sampled school will be asked to have a list of eligible teachers prepared and sent to Westat by mail or fax, according to written instructions. Appendix A contains two documents that will be used for teacher list collection: (1) instructions for preparing the list; and (2) a form to be returned with the list of teachers. For confidentiality reasons, this form does not include the name of the survey and will contain a random ID number (Westat ID) that only Westat staff can use to identify the school. To minimize the burden on schools, Westat will coordinate the collection of teacher sampling lists with the collection of the school survey. Collection and followup activities for the teacher lists and school surveys will be handled by the same Westat staff to minimize the number of contacts made to principals and other school staff.


The second stage of data collection will be to collect a self-administered survey from the sampled teachers. Respondents will have the option of completing the survey on a traditional paper and pencil questionnaire or on a web version of the questionnaire that will be accessed through the Internet. The questionnaire (see Appendix B) is limited to three pages of information readily available to respondents and can be completed by most teachers in 20 minutes. These procedures are typical for FRSS surveys and result in minimal burden on respondents. Questionnaires will be mailed to teachers starting in January 2009. The cover letter (see Appendix C) will include information about the option to complete a web version of the survey on the Internet.


Telephone followup for nonresponse will begin about 3 weeks after the questionnaires have been mailed to the teachers. Experienced telephone interviewers will be trained to conduct the nonresponse followup and will be monitored by Westat supervisory personnel during all interviewing hours.



A.2. Purposes and Uses of the Data


The survey will provide valuable national data for OET and other educational policymakers at the national, state, and district levels. The teacher survey is a vital component of the three linked technology surveys (district, school, and teacher) that are part of OET’s National Educational Technology Leadership national activities spending plan. Early research for these surveys revealed that while district and school respondents are good sources of data on availability of technology in schools, teachers are the best, and sometimes only, source of data on how this technology is being used. The availability of educational technology changes rapidly. For example, the ratio of students to instructional computers with Internet access in public schools was about 9 to 1 in 1999 and almost 4 to 1 in 2005. The percentage of public school instructional rooms with Internet access was 64 percent in 1999 and 94 percent in 2005.1 The proposed teacher survey will report how available technology is actually being used by teachers and students. In addition, the proposed survey will provide current data on the use of new equipment and technology (e.g., interactive whiteboards, classroom response systems, online student assessments).


An NCES report containing survey results will be published on the NCES website. In addition, public use and restricted use data files will be made available to researchers.



A.3. Improved Information Technology


Sampled teachers will be given the option of completing the survey using a traditional pencil and paper questionnaire, or using a Web version of the questionnaire that is accessed through the Internet. When paper versions of the questionnaire are used, they will be transmitted to and from respondents by fax whenever possible. In addition, the email address for the contractor (Westat) responsible for answering respondent questions will be included on the front of the questionnaire and in the materials for preparing the sampling list of teachers. These procedures are all designed to minimize the burden on respondents.



A.4. Efforts to Identify Duplication


In addition to the 1999 FRSS survey Public School Teachers’ Use of Computers and the Internet, the Office of Planning, Evaluation, and Policy Development (OPEPD) of the Department of Education conducted a national survey of teachers in 2005 and 2007 as part of the Department's National Educational Technology Trends Study (NETTS). This survey asked teachers about the technology-related professional development activities they participated in, their access to and use of technology for instruction, the factors that facilitated or impeded technology use, and the changes in teaching and learning that may have occurred as a result of teacher and student use of technology. The two rounds of NETTS data collection also used a nested survey design, although at the state, district, and teacher levels (rather than the district, school, and teacher levels). Both the NETTS and FRSS surveys were requested by the Department’s Office of Educational Technology. We have consulted with Bernadette Adams Yates of OPEPD on the NETTS.


To check for duplication of effort, we reviewed the 2007 NETTS data collection plan and questionnaire. The key results of this review are that (1) FRSS 95 will provide more current data (the most recent NETTS is for the 2006-2007 school year, while FRSS 95 is for the spring semester of 2009); (2) FRSS is designed to provide data more quickly than most surveys; and (3) there is no duplication of questions on the two surveys, although some of the same topics are addressed. In addition, the new FRSS teacher survey will allow comparisons with 1999 items on the total number of computers in the classroom and how many of those computers have Internet access. Examples of how the two surveys differ in their treatment of related topics are listed below.


  • The FRSS asks how frequently the teacher has used specific types of software for classroom preparation, instruction, or administrative tasks (never, rarely, sometimes, often). The NETTS asked about the presence of specific software but not the frequency of use by the teacher.


  • Regarding use of technology to communicate with parents and students: The FRSS asks how often the teacher uses specific types of technology (e.g., email, online bulletin board, course or teacher blog) to communicate with (1) parents; and (2) students (never, rarely, sometimes, often). The NETTS asked simply about how often the teacher used technology to communicate to students and parents, but not about the specific type of communication that was used.


  • Regarding training: The FRSS asks about the extent to which activities (including undergraduate teacher education, graduate teacher education, professional development, training by school technology staff, and independent learning) have prepared the teacher to use educational technology. The NETTS asked about specific instructional components restricted to a teacher’s preservice preparation program.


  • Use of systems on school or district networks: The FRSS asks how frequently the teacher uses a system on the school or district network for entering or viewing student data (e.g., grades, attendance records, IEPs) or administering assessments. The NETTS did not ask about use of systems on school or district networks, but asked what kind of data, tools, and instructional supports the teacher can access through an electronic student data system.


  • Use of remote access: The FRSS asks how frequently the teacher uses remote access (e.g., access from home) for each of the following: school email, documents on the school/district server, student data, and school/district software applications. This was not collected in the NETTS.



A.5. Methods Used to Minimize Burden on Small Entities


Survey data will be collected from individuals (teachers). Teachers will be sampled from lists collected from sampled public schools of all sizes. To minimize the teacher list collection burden on schools, we will suggest that respondents use existing staff lists, editing them as needed. We will accept lists in all formats and provide list collection assistance to respondents by telephone and email. In addition, smaller schools generally will be sampled at a lower rate than large schools.



A.6. Consequences of Not Collecting the Information


If these data are not collected, OET will not have current teacher-level data about technology access and use. These data will provide policymakers at OET with the information specified in the National Educational Technology Leadership national activities spending plan.



A.7. Adherence to the Guidelines in 5 CFR 1320.5


Data collection will be conducted in a manner consistent with the guidelines in 5 CFR 1320.5. The only exception is that responses will be requested in fewer than 30 days, following the well-developed procedures for NCES quick response surveys such as FRSS, which are intended to collect data quickly.



A.8. Consultations Outside NCES


Substantial development work was conducted to identify survey items and to explore the feasibility of collecting data on these items. After discussions with OET about desired survey topics, Westat conducted a literature review and a search of existing survey instruments. Westat then conducted feasibility calls to test and improve the instrument. Teachers with various characteristics were asked to review and discuss the survey in 30- to 45-minute telephone interviews. Respondents were asked about the clarity and relevance of the survey items and whether they could answer each question without undue burden. During feasibility calls, respondents were not required to complete the questionnaire, but rather to review it and give feedback about the survey. These calls were conducted over several months, and the questionnaire for each round was substantially different from that used in the previous round. We contacted nine or fewer respondents for each round. After the calls, the instrument was revised and submitted to OET and NCES for review and further revision.


Based on feedback from the NCES Questionnaire Review Board (QRB), the survey was revised and a pretest of nine teachers was conducted to identify problems survey respondents might have in providing the requested information. The purpose of the pretest was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily available to respondents, and to determine whether the burden on respondents could be further reduced. The pretest included teachers at five elementary schools and four secondary schools. The schools varied by size and state. The teachers were teaching a variety of grades and subjects. Their years of teaching experience ranged from two to thirty-five years. Responses and comments on the pretest questionnaire were collected by fax and telephone, and were discussed with NCES and OET. Changes to the questionnaire were made based on the feedback received from the pretest, and documented in a memorandum summarizing the pretest results.



A.9. Payments to Respondents


Not applicable. No payments or gifts to respondents will be made.



A.10. Assurance of Confidentiality


Data to be collected will not be released with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. A statement to this effect is included in the cover letter accompanying each questionnaire. In addition, the public use data file will undergo extensive disclosure risk analysis and be reviewed by the NCES/IES Disclosure Review Board before release.


Respondents will be assured that all information identifying them or their school will be kept confidential in compliance with the Education Sciences Reform Act of 2002 (P.L. 107-279), which requires that no person may:


  • use any individually identifiable information furnished under the provisions of this section for any purpose other than the statistical purposes for which it is supplied;

  • make any publication whereby the data furnished by any particular person under this section can be identified; or

  • permit anyone other than the individuals authorized by the Commissioner to examine the individual reports.


All Westat staff members working on the study are required to sign the NCES Affidavit of Nondisclosure, as well as Westat's confidentiality pledge, which appears as Exhibit 1.



Exhibit 1. Westat confidentiality statement

WESTAT, INC.

EMPLOYEE OR CONTRACTOR'S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


Westat is firmly committed to the principle that the confidentiality of individual data obtained through Westat surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All Westat employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.


2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.


3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.


4. Survey data containing personal identifiers in Westat offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.


Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of Westat, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.


5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record, and identifiers such as name, address, and Social Security number shall not ordinarily be a part of the machine record. When identifiers are part of the machine data record, Westat's Manager of Data Processing shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.


6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.


7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.


8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable) and the effects on the respondents, if any, of not responding.


PLEDGE


I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by Westat. In addition, I will comply with any additional procedures established by Westat for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature

A.11. Sensitive Questions


There are no questions of a sensitive nature included in the survey.



A.12. Estimates of Response Burden


The survey includes two types of collections. Respondents in sampled schools will be asked to provide lists of eligible teachers to be used for sampling. The instructions state that schools may use existing staff lists and cross off or delete teachers who are not eligible for the survey; this is estimated to take 30 minutes or less. Sampled teachers will be asked to complete the questionnaire, which is estimated to take 20 minutes based on a pretest with nine respondents. Based on previous FRSS surveys, we expect a 90 percent response rate for each type of collection. Thus, an initial sample of 2,000 schools will yield about 1,800 participating schools, and we will sample about 4,000 teachers from the lists provided by the participating schools. This initial sample of 4,000 teachers will yield about 3,600 completed teacher questionnaires. The estimated respondent burden hours and costs for both collections are summarized below.


Type of collection | Sample size | Estimated response rate | Estimated number of respondents | Estimated time to complete (hours) | Respondent burden hours | Respondent cost (@ $25 per hour)
Teacher lists collected from sampled schools | 2,000 | 90% | 1,800 | 0.5 | 900 | $22,500
Questionnaires collected from sampled teachers | 4,000 | 90% | 3,600 | 0.33 | 1,200 | $30,000
Total | | | | | 2,100 | $52,500
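
As a cross-check on the table above, the short Python calculation below reproduces the burden-hour and cost figures from the stated sample sizes, response rates, completion times, and the $25 hourly rate. It is only an illustrative arithmetic check; the variable names are ours.

    # Illustrative check of the burden estimates: 30 minutes per teacher list,
    # 20 minutes per questionnaire, 90 percent response rates, $25 per burden hour.

    HOURLY_RATE = 25  # dollars per respondent burden hour

    collections = [
        # (label, sample size, response rate, minutes per response)
        ("Teacher lists from sampled schools", 2_000, 0.90, 30),
        ("Questionnaires from sampled teachers", 4_000, 0.90, 20),
    ]

    total_hours = 0
    total_cost = 0
    for label, sample, rate, minutes in collections:
        respondents = int(sample * rate)    # 1,800 schools; 3,600 teachers
        hours = respondents * minutes / 60  # 900 and 1,200 burden hours
        cost = hours * HOURLY_RATE          # $22,500 and $30,000
        total_hours += hours
        total_cost += cost
        print(f"{label}: {respondents:,} respondents, {hours:,.0f} hours, ${cost:,.0f}")

    print(f"Total: {total_hours:,.0f} hours, ${total_cost:,.0f}")  # 2,100 hours, $52,500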


A.13. Estimates of Cost Burden for Collection of Information


Not applicable. Respondents will not need to purchase or maintain equipment or services.



A.14. Estimates of Cost to the Federal Government


The survey is estimated to cost the Federal government about $780,000, including about $750,000 for contractual costs and $30,000 for salaries and expenses. Based upon costs of past FRSS sample surveys, contractual costs are divided into the subtask costs shown below.


Subtask | Cost
Teacher list collection and processing | $200,000
Sampling | $25,000
Survey preparation | $50,000
Data collection | $350,000
Data analysis and report preparation | $125,000
Total | $750,000


A.15. Changes in Burden


This is a new survey; no adjustments are being requested.



A.16. Publication Plans/Time Schedule


Plans for Tabulation and Publication


Most of the analyses of the questionnaire data will be descriptive in nature, providing NCES, OET, and other data users with tables and appropriate explanatory text. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item, and crosstabulations of data items will be made with selected classification variables, including school-level and teacher characteristics such as the following.


School Characteristics:

  • School level (elementary and secondary/combined);

  • School enrollment (less than 300, 300-499, and 500 or more);

  • Geographical region (Northeast, Southeast, Central, and West);

  • Percent minority enrollment (less than 6 percent, 6-20 percent, 21-49 percent, 50 percent or more); and

  • Percent of students eligible for free or reduced-price lunch (less than 35 percent, 35-49 percent, 50-74 percent, 75 percent or more).

Teacher Characteristics:

  • Main teaching assignment field (self-contained or subject area);

  • Grade level taught; and

  • Years of teaching experience.


Reports of the findings will be distributed to the data requester, survey respondents, and, upon request, to other interested individuals and organizations. Westat will also submit a procedural report to NCES.


Time Schedule


Collection of teacher sampling lists will begin in September 2008 and continue through February 2009. Collection of teacher questionnaires will be conducted January through May 2009. Dates of key activities for list collection, survey collection, and tabulation and publication are listed below.


Teacher List Collection

  • September 2008: Mail package to principal with request for teacher list.

  • About 3 weeks after mailout, begin telephone followup for nonresponse and data retrieval.

  • February 2009: End list collection.


Survey Collection

  • January 2009: Begin mailing surveys to sampled teachers. Mailing will continue through February on a flow basis.

  • One week after mailout, send a postcard thanking those who have responded and reminding those who have not yet responded.

  • About 3 weeks after mailout, begin telephone followup for nonresponse and data retrieval.

  • May 2009: End survey collection.


Tabulation and Publication

Key activities are listed below by number of weeks after the end of survey collection.

  • 8 weeks – submit initial draft of tables to NCES.

  • 12 weeks – submit first draft of complete report for NCES Project Officer review.

  • 20 weeks – submit revised report to NCES Chief Statistician for technical/peer review.

  • 32 weeks – submit revised report for review by IES.

  • 45 weeks – NCES releases report.



A.17. Approval to Not Display Expiration Date


Not applicable. The survey will display the expiration date for OMB approval of the information collection.



A.18. Exceptions to the Certification Statement


Not applicable. No exceptions to the certification statement are being sought.








1 Source: U.S. Department of Education, National Center for Education Statistics, Fast Response Survey System, Internet Access in U.S. Public Schools, Fall 1999 and Fall 2005.

