Attachment 1

Survey Development Results





Fast Response Survey System (FRSS) 110: Use of Educational Technology for Instruction in Public Schools




OMB# 1850-0733 v. 36








October 2019


National Center for Education Statistics (NCES)

U.S. Department of Education

Date:

March 8, 2019


To:

Chris Chapman

Bernadette Adams


From:

Laurie Lewis

Cindy Gray


Subject:

FRSS 110 Use of Educational Technology for Instruction: Feasibility Calls, Round 1


This memo describes findings from the first round of feasibility calls for the FRSS 110 survey on use of educational technology for instruction. We conducted telephone interviews from February 7-28, 2019, with respondents from 14 schools. One respondent was a Digital Integration Specialist at the school, and the remainder were school principals. We spoke to respondents at six elementary schools (including one K-8 school), five middle schools, and three high schools. Schools were located in all types of communities (city, suburban, town, and rural), and were located in various geographic areas of the country. Below is a summary of the information collected from the respondents, presented by topic. Following the summary are topics for discussion during our call next week.


Technology Available at the School for Student and Teacher Use

Respondents were asked whether the school had a 1:1 program that provided a district- or school-issued computer for every student. Among our respondents, half (seven schools) reported a 1:1 program, with three of those schools (one elementary and two middle schools) reporting that the computers were assigned to classrooms and four of the schools (two middle and two high schools) reporting that the computers were assigned to individual students. Schools reported having had 1:1 programs for 2 to 6 years. All four schools that assigned computers to individual students had Chromebooks and allowed students to take the computers home with them.


Among the schools with 1:1 programs reporting that computers were assigned to classrooms, two (one elementary and one middle school) had enough Chromebooks in every classroom that each student had their own computer while in the classroom. The situation was more complex at the third school (a middle school), where the principal indicated the school had 1:1 computers assigned to classrooms. At that school, the principal described having a combination of some iPads (described as not very useful), older laptops (which were not Chromebooks) in math and science classrooms, many carts of Chromebooks, and a lab used only for coding.


This combination of different types of computers, with some in classrooms, some on carts or in tubs, and some in labs or library media centers, was similar to the situation at some of the schools that said they did not have 1:1 programs. In particular, one middle school principal who said the school did not have a 1:1 program described having classroom sets of Chromebooks in most core subject area classrooms, as well as two large computer labs and 20 Chromebooks in the library. Thus, while this school did not have one computer for each of the approximately 500 students in the school, students had substantial access to current-model computers during their core academic periods. This school plans to move to 1:1 within classrooms within the next two years. The principal at this school also commented that to her, the term 1:1 refers only to programs where computers are assigned to individual students and students are allowed to take the computers home. It does not include computers assigned to students that stay at school, or having enough computers in every classroom that all students can be on a computer at the same time (1:1 with classroom assignments). If used, this term would need to be carefully defined for respondents.


Respondents at schools that did not have 1:1 programs were asked to estimate about how many students per instructional computer the school has. Respondents were often surprised by this question and were unsure how to answer. They generally gave a description of the number, type, and location of computers in their school, and sometimes reported the number of students in the school. While this did not answer our question, it did provide us with insight into the wide mix of types and ages of computers in many schools. If OET decides that it wants to know the number of students per instructional computer, then it should be collected by asking for the number of computers and number of students in the school, as was done in the 2008 survey.
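
As a hypothetical illustration of how that ratio would be derived from the two reported counts (the figures here are invented, not drawn from any respondent): a school reporting 500 students and 125 instructional computers would have 500 ÷ 125 = 4 students per instructional computer.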


All of the schools we talked to that did not have 1:1 programs reported having multiple types of computers. Across schools, four types of computers were reported: desktop computers, laptops, Chromebooks (which respondents distinguished from laptops), and iPads (always referred to by brand). Some of the non-Chromebook computers were identified as older and not able to be upgraded, making them less useful for some school activities. However, some schools were using these older computers for specific functions that met needs within their schools. For example, one school was using older laptops in science labs because they could be programmed to support particular lab activities, and another school was using older desktop computers assigned to classrooms to run its wireless projectors and smartboards, while teachers used their newer laptops or tablets for other classroom tasks.


Computers were located in classrooms (e.g., four desktops per classroom), on carts or in tubs (some assigned to particular classrooms and others rotated or checked out as needed), in computer labs (some high quality and some old and obsolete, sometimes within the same school), and in the library media center. What came through clearly from the information provided by schools is that asking how many computers a school has and the average age of the computers, even broken down by category, does not provide meaningful information about the amount of access or quality of technology available to students in these schools. Most of the schools we talked to without 1:1 programs did not have concrete plans for a 1:1 program within the next two years.


Schools had varying responses to the question about whether the school has a “Bring Your Own Device” or “Bring Your Own Technology” policy. No schools used this approach to have students provide their own technology in place of the school providing it. Some schools, particularly at the middle and high school levels, said that students could bring their own laptops if they wanted to, but that almost none of them did. Phones were generally supposed to be put away during instructional periods, but some schools said that individual teachers may have students use them for a specific activity (e.g., an online quiz using Kahoot). Some schools did not allow students to bring any of their own technology to school; the reasons given were student age (elementary schools), concerns about equity in high-poverty areas (some students having devices while others did not), and, in some schools with 1:1 programs, a focus on the technology provided by the school. None of the schools allowed students to borrow computers to take home on a short-term basis.


Respondents at all schools reported that every teacher had an instructional computer assigned to him or her. At some schools, two computers were assigned to each teacher, with one used to run classroom devices such as wireless projectors and smartboards, and the other used for other classroom tasks.


All respondents except one indicated that the quality of the instructional computers at the school was sufficient to meet the instructional needs of the students and teachers at the school. In later calls, we also asked respondents whether the quantity of instructional computers was sufficient. One respondent (the same one who indicated that the quality was not sufficient) indicated that the quantity was not sufficient, three respondents said it was sufficient but they could use more, and the rest thought the quantity of computers they had was fine.


With the exception of one respondent, whose state had blocked access to previously used software because of ADA compliance issues, respondents reported that the quality of instructional software was sufficient, although two respondents qualified that with statements such as “as far as I know.” Many of the schools were using Chromebooks and running web-based Google apps, so the software was not on the computers. Google Classroom and suites of Google apps were frequently mentioned.


Internet Access Available at the School

All schools had Internet access for all their instructional computers and in all instructional areas of the school. All schools had wireless Internet access, and most schools also had at least some wired connections, although most of their access was wireless. Respondents were a bit confused by the wording of the question about whether the Internet access was “sufficient to support a digitally enhanced curriculum.” Most respondents seemed to interpret this as signal strength and reliability. Most respondents indicated that their Internet access was fairly to very reliable, although a few mentioned specific concerns related to problems encountered during state testing when so many students needed to be online at once. A couple of respondents mentioned that their wireless access and reliability had improved in recent years when the district added additional routers to boost the signal throughout the school. One respondent indicated that their wired Internet connections were very reliable, but their wireless connections were not reliable. A number of respondents indicated that they were not sure how to respond to the question about whether the school has high-speed (broadband) Internet access, because they did not know what the school has and/or did not know whether it met the definition of high-speed Internet access.


Staff Support and Professional Development

Respondents were asked whether there is a staff member at their school whose job it is to support teachers with integrating technology into instruction. One finding that quickly emerged was that whether the staff supporting technology integration were school-based or district-based did not indicate the amount or quality of support or assistance that teachers received. Similarly, whether technology integration was that staff member’s only or full-time job did not indicate the support or assistance that teachers received.


One principal, whose school was in the fourth year of a 1:1 program, said that technology coaching positions were no longer needed because teachers were fully integrating the technology by that time. Several schools described having district curriculum specialists or instructional coaches who worked with individual teachers or teams of teachers at the school on all aspects of curriculum integration, with technology integration only a part of what this staff member worked on with teachers. In some cases, these were regularly scheduled meetings (e.g., weekly meetings with grade-level or subject-area teams), and in other cases the staff member came to the school in response to a teacher request. Other schools described having someone at the school, such as the library media specialist, who was trained to assist teachers with technology integration as one of many job functions. Another approach frequently mentioned was the use of “teacher leaders” who received training from the district and then worked with other teachers in their school.


In schools with less support, there was informal teacher-to-teacher sharing of knowledge, and in a couple of schools, there was no assistance at all provided to the teachers, either from the school or from the district. A couple of principals also mentioned that younger/newer teachers had an easier time and were more inclined to integrate technology into their classroom activities than older/more experienced teachers, who sometimes were resistant to change.


Respondents were also asked whether there was a staff member at the school whose job it is to provide technical support for educational technology at the school (e.g., troubleshooting/maintenance of hardware, software, or networks). As with the technology integration item, whether the technical support was school-based or district-based was not the key issue in the quality of the support. Some schools described a process where they notified the district about any problems, and they were usually resolved very quickly, sometimes remotely. Other schools had someone onsite at the school to handle problems. Sometimes this was the person’s only job, but other times it was a teacher or library media specialist who had this as one of their duties. Sometimes there was a combination of onsite and district-based support. For example, one school said that the library media specialist managed the Chromebook inventory (the school was 1:1), a teacher spent half his time on technical support and half teaching, and the district had a tech team that handled repair and purchases. One school said that an outside company handled technical support.


The ways professional development was provided tended to align with how technology integration was done. Professional development in technology was job-embedded for those schools that used curriculum specialists or instructional coaches who included technology integration as part of their ongoing work with teachers. Schools frequently mentioned that the district provided professional development in technology for all teachers when it was rolling out something new, such as the introduction of Chromebooks or the switch to Google Classroom (required of all teachers). However, in the absence of new technology that all teachers were required to use, professional development was usually offered “cafeteria style” over the summer or on teacher professional days, with technology sessions just one of many options from which teachers could choose. Thus, principals could not respond to the series of questions about professional development in educational technology (such as how often it was provided or how many hours a year teachers generally received) because the questions did not match the way teachers received professional development.


Online Textbooks and Resources

Online textbooks were used to a limited degree. One high school said they only use online textbooks, and do not issue any hardcopy textbooks. Other schools said that there were online versions of the hardcopy textbooks that students could access from home, or that a few courses at the school used online textbooks. Most schools said they do not use online textbooks.


The use of other types of online curriculum resources and supplemental materials was more common. These included online materials that came with the hardcopy textbooks, reading books for elementary school students that could be accessed online, free-standing programs (e.g., Read 180 or Imagine Math), and many resources that teachers locate themselves and use in their classes. YouTube videos were ubiquitous, and Khan Academy videos were also mentioned by several respondents. Several respondents mentioned that Kahoot was used for quizzes. A couple of schools mentioned that they use eLearning with students when school is closed due to the weather. One of these schools said that the assignments do not require Internet access, and that extra time is given for the assignments if needed. Another school spontaneously noted that it cannot use eLearning because its students lack technology at home. A few schools used games, but other emerging technologies were rarely reported. With the exception of online textbooks and curriculum resources that are purchased or provided by the school or district, principals could not reliably report on what online resources their teachers used.


Barriers or Challenges Faced by Teachers and Students

Respondents were asked what kinds of barriers or challenges they think the teachers and students in their school face in using educational technology for instructional purposes. Responses fell into three broad categories: general, teacher-related, and student-related. General comments included that the district would not provide technology because of cost, that the school was limited because it was not 1:1 with its Chromebooks, concerns about the reliability of the school computers (this school had mostly old computers), and a lack of trust in the robustness of the network (this school said its wireless connection was unreliable).


Several respondents mentioned time as a major barrier or challenge for their teachers. With all the many demands on their time, it is a challenge for teachers to find the time to become familiar with new technologies and integrate them into their teaching in effective ways. Numerous respondents mentioned the knowledge, skills, and training of teachers as a major barrier. Some respondents commented that their teachers do not understand how to use the available technology, and that they need to make sure that all of their teachers are trained to use the technology. A couple of respondents mentioned that some teachers, particularly older or more experienced teachers, do not want to change the way that they do things, and question whether technology makes instruction better. One respondent commented that many of the teachers at her school were older and could really use more professional development and exposure to get them up to speed with technologies as they emerge, while the younger teachers added these skills to their repertoire easily since they were used to using technology. One respondent noted that initially the challenge was convincing teachers of the value of 1:1 Chromebooks in the district; now the challenge is making sure that teachers know whether students are engaged and learning.


By far the most frequently mentioned barrier or challenge for students was that many students did not have access to technology, and especially the Internet, at home. A couple of respondents mentioned that many students did not have parents who spoke English, which limited the parents’ ability to help their children with schoolwork. High rates of poverty and student homelessness were also mentioned as challenges that limited student access to and experience with technology. This created a wide range of experience with technology at some schools, leading to equity concerns. The other challenge mentioned for students was proper usage, encompassing both the need to teach students how to use the technology for academic purposes and the need to teach digital citizenship.


State, District, or School Policies about Technology Use for School Assignments

The only policies reported were acceptable use policies, and policies about appropriate treatment of and responsibilities for school-issued computers.


Topics for Discussion

The first round of feasibility calls provided information on many topics related to educational technology in public schools, summarized above. The next round of feasibility calls needs to prioritize topics to move us closer to a questionnaire that both collects the information needed by OET and meets the requirements of FRSS (three pages of questions with a respondent burden of about 30 minutes). Below we list the various topics to be prioritized, followed by issues with individual topics to be discussed during our call.


  1. Relative Priorities of Various Topics

  1. Computers and other technology equipment

  2. Internet access

  3. Staff support and professional development

  4. Online textbooks and resources

  5. Barriers or challenges


  2. Issues with Individual Topics

  1. Computers: knowing the number of computers and calculating student to computer ratios does not provide full information about the amount of access and quality of technology available to students.

  2. Internet access: What type of information is needed about Internet access? Is the main focus on reliability?

  3. Staff support: For questions about technology support provided to teachers and schools, how much focus should be put on the ways in which the support is provided versus the extent to which principals believe they get the level and type of support that is needed?

  4. Professional development: Any questions asked about professional development need to move away from the types of questions in the Round 1 interview guide, since this does not appear to reflect the way teachers currently receive professional development in educational technology. Aspects of professional development may be able to be incorporated into questions about staff support.

  5. Online resources: What information does OET want to know about online resources that principals would be able to report about, since many resources are located and used by individual teachers in their classes?

  6. Barriers or challenges: It may be better to replace this with an opinion item of the type used in the 2008 FRSS school technology survey (see attached item), with modifications to the opinion statements in the list. A topic of interest is the extent to which teachers use technology in ways that are creative or innovative, but it is questionable whether principals could report this accurately.

  7. Policies: We are not recommending that this topic be included on the survey since the only policies reported by schools were policies about acceptable use and responsibilities for school-issued computers.


Date:

May 6, 2019


To:

Chris Chapman

Bernadette Adams


From:

Laurie Lewis

Cindy Gray


Subject:

FRSS 110 Use of Educational Technology for Instruction: Feasibility Calls, Round 2


This memo describes findings from the second round of feasibility calls for the FRSS 110 survey on use of educational technology for instruction. We conducted telephone interviews from April 15–May 2, 2019, with respondents from 13 schools. The respondents were a combination of principals and other school administrators (10) and technology specialists at the school (3). We spoke to respondents at four elementary schools, five middle schools, and four high schools. Schools were located in all types of communities (city, suburban, town, and rural), and were located in various geographic areas of the country. For this round of calls, the interview guide was primarily closed-ended questionnaire items that we discussed with respondents over the phone (see attached). Below is a summary of the information collected from the respondents and our recommendations for changes for the final round of feasibility calls.


Definition of Computer

The following definition of computer was added to the survey:

For purposes of this survey, computers include desktop, laptop, and tablet computers (including Chromebooks and iPads). Smartphones are not included in the definition of computers.

The definition was referenced in questions 1 and 4 of the survey, to stress how computers were being defined for the survey. Respondents reported that the definition was clear to them. We do not recommend any changes to the definition. However, we do recommend referencing the definition or at least repeating the exclusion of smartphones in the question that asks about student access to a computer at home, since several respondents asked about smartphones in the context of this question.


Computers for Student Use

The first three questions were designed to determine whether the school had a computer for every student in the school (Q1), whether the school had a computer for every student in some grade levels (and in later interviews, in some grade levels or classrooms) in the school (Q2), and how computers were assigned for student use in schools with a computer for every student in at least some grades (Q3). The Q2–Q3 sequence used for most of the calls had some weaknesses that were identified during the interviews, and was revised for the last two calls (see attached revised questions). For example, some schools had a computer for every student in some types of classrooms, such as classrooms for core academic subjects. For Q3, respondents were not always sure whether to report that computers were assigned to students or to the classroom. Most of the information collected in Q3 could be obtained from Q4, with the exception of whether students could take computers home. Therefore, we changed Q3 to ask only whether students could take school-provided computers home. We think this revised version, in conjunction with the information in Q4, will work better and provide the information being sought. We recommend keeping the revised versions of Q2 and Q3 for the next round of calls.


Question 4 asked about the number of computers for student use in various locations of the school. Since respondents had not been asked to compile this information before the call, we asked them to tell us whether they had any computers for student use in that location, and then to tell us how easy or difficult it would be to provide the numbers of computers. All respondents could tell us whether they had computers in the various locations. Respondents indicated that they could obtain the number of computers by location either by consulting their inventory, by contacting their technology support person, or by calculating it (e.g., by multiplying the number of classrooms by the number of computers in each classroom, or the number of computers on a cart by the number of carts). We do not recommend any changes to this question for the next round of calls.
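
As a hypothetical illustration of the kind of calculation respondents described (the figures here are invented, not drawn from any respondent): a school with 25 classrooms of 4 desktops each, 6 carts of 30 Chromebooks each, and 40 computers across two labs would report 25 × 4 = 100 classroom computers, 6 × 30 = 180 cart computers, and 40 lab computers, for a total of 320 computers for student use.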


Question 5 asked about the number of various types of computers for student use (part 1), and the general age (in categories) of that type of computer in the school (part 2). Since respondents had not been asked to compile the number of computers before the call, we asked them to tell us whether the school had computers of that type for student use, and then to tell us how easy or difficult it would be to provide the numbers of computers. All respondents could tell us whether they had computers of the various types, and could obtain the number of computers of various types in the same way they would obtain the information about the numbers of computers by location. However, part 2 of the question, asking about the general age of that type of computer in the school, did not work well. Many schools had computers of a particular type in different age categories, since computers were purchased as the school had funds, or portions of the inventory were refreshed on a continuing basis. Also, unless the computers were very new or very old, many respondents were just guessing about the age of the computers. Obtaining more accurate or more detailed information about the age of the computers would almost certainly exceed the response burden for the survey. We recommend keeping the part of the question asking about types of computers for student use, but dropping the part of the question asking about the age of the computers.


Additional Computer and Internet Questions

Question 6 asked whether every teacher had an instructional computer. It should be noted that every teacher in all schools in both sets of calls (rounds 1 and 2) had an instructional computer. Since there does not appear to be variation in the responses to this question, this item may be a good candidate to drop if space constraints require eliminating items. If this question is kept, we recommend re-ordering it to come after Q8 so that respondents do not mistakenly think that Q7 and Q8 only apply to the computers for teachers.


The next set of questions (Q7–Q8) asked respondents to rate the quality of instructional computers and software. There were no problems with either of these questions.


Questions 9 and 10 ask whether students are expected to bring their own computers to school to use for instructional purposes, and how many students (on a scale from none to most) bring their own computers to school to use for instructional purposes. No schools expected students to bring their own computers, and a number of respondents indicated that this was not allowed. Two schools said that “few” students brought their own computers, and one school said that “some” students brought their own computers. The other schools all responded “none” to this question. These questions do not seem to provide useful information, and are candidates to be dropped for the next round of calls.


Questions 11 and 12 ask whether the school allows students to borrow computers to take home on a short-term basis and whether the school provides mobile hotspots for students to take home. All schools said no to both items. However, we recommend keeping both items: we know from work on other surveys that some public schools would answer “yes” to these questions, the items are clear and easy to respond to, and they provide information about ways that schools may increase access to computers and the Internet for student use at home.


Questions 13 and 14 ask for an estimate of the percentage of students with access to a computer at home, and the likelihood that the computers have Internet access. Respondents did provide answers to these items, although a number of them indicated that it was a guess. The questions are written to request an estimate (“in your estimation”), and mirror the questions asked on the FRSS 109 teacher technology survey. We recommend keeping both items, and suggest adding an instruction to Q13 indicating that smartphones should be excluded, since this was raised by several respondents.


Questions 15 and 16 ask about the reliability of the Internet connection in the instructional areas of the school. There were no problems with these questions, and we recommend keeping them for the next round of calls.


Online Instructional Resources

Question 17 asks about the extent to which various types of online instructional resources were used by the school and by teachers. Respondents could answer the question, although for a few of the rows, particularly d, f, and g, it was not always clear to us what respondents were including in their responses. When we asked for examples of these types of resources, some respondents could not provide examples, even though they provided an extent rating. The first set of calls also found that principals could not reliably report about what online resources their teachers used, beyond those that were purchased or provided by the school or district. Thus, while this item overall works fine, OET may wish to consider trimming some of the types of resources.


Support for Integrating Educational Technology into Instruction

Question 18 asks about the ways in which support for integrating educational technology into instruction is provided to teachers at the school. Because this question was so dense and had a complex format, we sent it to respondents the day before the call with the email reminder message about the call. We asked them to have the question in front of them during the call to facilitate the discussion. Overall, this question worked well. We recommend a few minor changes to the wording of the types of support in rows a–e, but otherwise recommend keeping the item in its current format for the next round of calls.


Professional Development in Educational Technology

The next two questions ask about professional development in educational technology offered to teachers at the school. In question 19, respondents understood row a, job-embedded coaching. However, respondents understood rows b and c less well, especially row c. We tried some modified wording in row c for the last few calls (“sessions where teachers can select among various topics, which may include educational technology topics as well as other topics”), but the change was not particularly helpful. For rows b and c, respondents were not sure what we were asking them. They would often seek clarification of what these categories meant, sometimes asking if we meant teacher professional days. In addition to problems understanding the rows, the response options of Yes-No did not work well. Instead, we got responses such as “sometimes,” “very rarely,” or “we’ve done that a few times over the years.” We recommend changing question 19 to ask only about job-embedded coaching, and changing the response options to a rating scale, such as how often teachers receive job-embedded coaching in educational technology.


In question 20, we asked respondents (primarily principals) an open-ended question about what types of topics are typically covered during the professional development sessions offered in educational technology. As was the case in the first round of calls, principals could not provide much information about educational technology topics covered in teacher professional development, giving only general answers such as that the sessions would involve how to use a program or an application. Google Classroom was frequently mentioned. Respondents also said that when something new was introduced or a change was made, all teachers would be required to participate in training. We do not recommend continuing to ask about topics for teacher professional development because school-level respondents are not able to provide this information.


Rating Scales about the Use of Educational Technology at the School

Question 21 asked respondents to indicate the extent to which they agreed or disagreed with statements about how student learning is affected by the use of educational technology in the instructional program at the school. The statements were drawn from the Initial Evaluation Report of the Utah State Board of Education Digital Teaching and Learning Grant Program. This question worked well, and we recommend keeping it for the next round of calls.


Question 22 asked respondents to indicate the extent to which they agreed or disagreed with statements about the use of educational technology in the instructional program at the school. The statements were taken from the FRSS 92 survey on educational technology in U.S. public schools, fall 2008. This question worked well, and we recommend keeping it for the next round of calls.


Question 23 asks respondents to indicate the extent to which various challenges in using educational technology for instruction apply to teachers at the school. The statements were drawn partly from the Initial Evaluation Report of the Utah State Board of Education Digital Teaching and Learning Grant Program and partly from the input we received during round 1 of the feasibility calls. This question worked well, and we recommend keeping it for the next round of calls.


Policies

Question 24 asked whether the school had policies on acceptable use of educational technology, digital responsibility, and cyberbullying. All schools had policies that cover these topics. Question 25 asks whether the district uses web filtering or content-control software to control student access to websites. All schools responded in the affirmative to this question. Since there does not appear to be variation in the responses to these questions, we recommend dropping them from the survey.


Next Steps

The next round of feasibility calls will involve sending respondents a draft questionnaire to review but not complete, followed by a telephone discussion about the draft questionnaire. Thus, we need to decide which topics and items from the current round of calls will be kept for the third (and final) round of feasibility calls. In addition, if there are additional topics or revised versions of current items that OET wants to try, this third round of calls is the final opportunity for exploratory work. As soon as we receive this input from OET, we will create a fully formatted draft questionnaire and start work on the final round of feasibility calls, which needs to be completed by mid-June due to school closings. We look forward to discussing the report of the round 2 calls with OET and NCES soon.



Date:

July 5, 2019


To:

Chris Chapman

Bernadette Adams


From:

Laurie Lewis

Cindy Gray


Subject:

FRSS 110 Use of Educational Technology for Instruction: Feasibility Calls, Round 3


This memo describes findings from the third round of feasibility calls for the FRSS 110 survey on use of educational technology for instruction. We conducted telephone interviews from June 6–June 27, 2019, with respondents from 13 schools. The respondents were a combination of principals (8) and technology staff (5), all of whom provided responses for a single school. We spoke to respondents at five elementary schools, four middle schools, and four high schools. Schools were located in all types of communities (city, suburban, town, and rural), and were located in various geographic areas of the country. For this round of calls, respondents were sent the draft questionnaire (see attached) when the interview was scheduled. Respondents were asked to review but not complete the questionnaire, and to have it in front of them during the interview so that we could go over it together and get their feedback about the items. Below is a summary of the information collected from the respondents and our recommendations for changes for the version that will be sent to the Quality Review Board (QRB) for review.


Survey Cover Page

The cover page of the survey included three bullets:

  • An instruction that the survey is designed to be completed by the person most knowledgeable about the use of educational technology for instruction at the school indicated on the front of the survey (for our call, this was the school for which our respondent was recruited);

  • An instruction to respond for the 2019-20 school year (because the survey will be fielded next school year); and

  • The definition of computer that was added to the survey during the second round of feasibility calls. That definition is: “For purposes of this survey, computers include desktop, laptop, and tablet computers (including Chromebooks and iPads). Smartphones are not included in the definition of computers.”

All respondents reported that the bulleted instructions and definition were clear.


Computers for Student Use

The first three questions were designed to determine whether the school had a computer for every student in the school (Q1), whether the school had a computer for every student in some grade levels or classrooms (Q2), and whether students are allowed to take school-provided computers home with them at the end of the day (Q3). Respondents thought these items were clear, and were able to provide an answer and to follow the indicated skip patterns.


Question 4 asked about the number of computers for student use in various locations of the school. Since respondents had not been asked to compile this information before the call, we asked them to tell us whether they had any computers for student use in that location. All respondents could tell us whether they had computers in the various locations. We recommend minor changes to the instructions for this question to indicate that respondents should count all of their computers for student use, and should report each computer in only one location.


Additional Computer and Internet Questions

The next set of questions (Q5–Q6) asked respondents to rate the overall quality of instructional computers and software at the school. For question 6, two respondents were not sure whether the term “instructional software” referred only to software designed for instructional purposes or whether it also included other software used for instruction, such as Google platforms, Adobe products, Word, and Excel. We recommend replacing “instructional software” with “software used for instruction.”


Question 7 asked about the extent to which the computers at the school meet the school’s instructional needs. There were no problems with this question. Question 8 asked how easy it is for teachers to find enough computers to use with their students in a lab or classroom. All respondents could easily answer this question.


Question 9 asked whether every teacher had an instructional computer available to him or her. All respondents could easily answer this question. It should be noted that every teacher in all schools in all three sets of feasibility calls had an instructional computer.


Question 10 asked how much flexibility school-level leaders have in determining which types and how much technology is purchased for that school. This item is a slightly modified version of an item on technology leadership that Bernadette provided after the round 2 feasibility calls (that version said “you and other school level leaders”). While this item has some methodological issues regarding asking two different questions (which types of technology; how much technology), none of the respondents in the feasibility calls found this question confusing, and all were able to respond. When asked who they would include as “school-level leaders” in this question, answers generally included more than the school principal. Many respondents included a subset of teachers (e.g., grade-level leads, special education teachers), instructional coaches, school counselors, or technology staff, as well as assistant principals.


Questions 11 and 12 asked whether the school allows students to borrow computers to take home on a short-term basis and whether the school provides mobile hotspots for students to take home for Internet access. All respondents could answer these questions. For question 11, we suggest moving the “not applicable” response to the top of the response options so that it stands out to respondents. We also suggest adding parenthetical examples of what we mean by a short-term basis (e.g., for a day or a week) so that short-term is not interpreted as a semester or school year, which one respondent suggested could be the case.


Question 13 asked how reliable the Internet connection was in the instructional areas of the school. While all respondents could answer the question, there was a perceived mismatch between the category labels and the text descriptions, particularly for moderately reliable. Several respondents commented that they would not consider a connection lost a few times a week to be moderately reliable, but rather not reliable. We recommend changing the response options to a four-point Likert scale without additional descriptive text. That scale would be: very reliable, somewhat reliable, slightly reliable, and not reliable.


Question 14 asked about the extent to which the school experiences problems with Internet connectivity or speed when large numbers of students must be online at the same time. All respondents could easily answer this question.


Question 15 asked how long it usually takes to: (a) get a computer repaired, (b) get help on a software problem or question, and (c) get network services restored when the network goes down. Respondents could answer this question using the time frames provided, and also provided additional comments about items a and c. Regarding getting a computer repaired, there were two general types of comments. First, respondents said that how long a repair took depended on what was wrong with the computer. Simple repairs could be done quickly, but major repairs such as a cracked laptop screen required sending the computer out to be fixed, which could take days. The other type of comment for this item was that schools or districts that were technology-intensive (e.g., had 1:1 programs) usually had extra computers that could be loaned to students who could not use their own computer for some reason (such as the computer being repaired, the student forgetting the computer at home, or the laptop not being charged). Therefore, the length of time it takes to repair a computer does not necessarily reflect the amount of time a student or teacher is without a computer. Regarding getting network services restored, respondents noted that outages were generally caused by something outside their district’s control, such as bad weather or a network cable being severed somewhere offsite. How long it took to get network services restored depended on what caused the network to go down.


Access to Computers and Smartphones at Home

Questions 16 and 17 asked for an estimate of the percentage of students with access to a computer at home, and the likelihood that the computer has reliable Internet access at home. Questions 18 and 19 asked for an estimate of the percentage of students with access to a smartphone at home, and the likelihood that the smartphone has reliable Internet access from home. These questions mirror the questions asked on the FRSS 109 teacher survey, but request information about all the students at the school rather than just the students taught by one teacher. Respondents did provide answers to these items, although many respondents indicated that it was a guess, especially for Internet access (which has a “don’t know” response option). An important consideration for the questions about computer and Internet access at home is the ability of students to use school-provided Chromebooks at home if they do not have Internet access at home. School respondents (and teachers in FRSS 109) indicated that school-provided Chromebooks were not useful to students at home unless they had Internet access. Thus, access to a computer at home does not necessarily translate into a useable device for school assignments. Respondents were also less sure about their estimates of smartphone access than computer access. If the questions about smartphones are retained, additional instructions should be added to clarify that access to a smartphone includes access to a smartphone owned by a parent, since some respondents interpreted the question to be asking only about smartphones owned by students.


OET may wish to consider dropping this set of items. In addition to the issues described above, these items are not directly related to the focus of this survey on the use of educational technology for instruction in schools. Since the survey is limited to 3 pages of questions and responses to these items are available at the teacher level from FRSS 109, including these items in FRSS 110 does not seem like the best use of survey space. Responses from teachers in FRSS 109 will provide better quality data about home access because teachers are in a better position to make these estimates for the students they teach compared to school-level respondents trying to estimate for the entire school.


Online Instructional Resources

Question 20 asked about the extent to which various types of online instructional resources were used by the school and by teachers. Respondents thought the question was clear and that they would be able to provide answers.


Support for Integrating Educational Technology into Instruction

Question 21 asked about the ways in which support for integrating educational technology into instruction is provided to teachers at the school. Part 1 asked (yes or no) whether support was provided that way, and part 2 asked (if yes in part 1) whether that type of support was provided during regularly scheduled meetings and in response to teacher requests.


The descriptions of the type of support in items “a” through “e” were not clear to respondents. Respondents had the following types of issues with these items:

  • The types of positions or support were not clear and not distinct from each other.

  • Respondents were not sure whether they should consider staff from the district as well as from the school.

  • Respondents frequently placed the same staff member in multiple categories, particularly when they did not have someone with a particular title or who exactly matched the description. Sometimes the way that a staff member’s job was configured at the school made it hard to know which category was the best fit, and so the respondent said yes to each category that described any aspect of the person’s job rather than the one category where they fit best.

  • In items c and d, there were different interpretations of staff receiving specialized training, ranging from the district providing specialized training in order to assign this responsibility to the staff member, to staff taking training on their own and then sharing it with teachers on their own initiative.

  • Respondents did not understand that the term “teacher leaders” was intended to include only teachers.

In part 2, we found that “regularly scheduled meetings” was open to wide interpretation. Examples include: teachers bringing up technology integration at their general teacher meetings, staff setting a time to meet with a teacher in response to the teacher’s question or request, training/support provided on professional development days, and a single meeting scheduled by the principal to go over a specific topic. This variety of interpretations makes the responses not very meaningful. The original intent was to differentiate ongoing support from sporadic or one-time support. However, based on the discussions with respondents, it is unlikely that we can collect data with this distinction.


We recommend that this question be completely revised, with a focus on clarifying the types of staff providing support currently listed in items “a” through “e”. We also recommend that part 2 of this question be dropped since it is not providing useful data.


Professional Development in Educational Technology

Question 22 asked whether teachers at the school were provided professional development in educational technology as job-embedded coaching. A number of respondents asked what we meant by job-embedded coaching, and when we asked respondents to tell us what they thought the term meant, we got a wide variety of responses. If OET would like to keep this question, we recommend that they provide a definition that we can integrate into the question.


Question 23 asked how much flexibility school-level leaders have in determining which types and how much professional development is provided for their school. As with question 10, this item is a slightly modified version of an item on technology leadership that Bernadette provided after the round 2 feasibility calls (that version said “you and other school level leaders”). While this item has some methodological issues regarding asking two different questions (which types and how much), none of the respondents in the feasibility calls found this question confusing, and all were able to respond. We do suggest changing the stem slightly to ask about professional development in educational technology.


Continuum Questions

Questions 24 and 25 were added in this round of calls at the request of OET to try to get a sense of “where schools were in their journey” from basic usage to a more integrated approach. A few respondents understood the questions, but most were confused or did not know how to interpret the questions. Even after reading the questions a couple of times, a number of respondents said they did not understand what we were asking them, and did not know how to respond. Some of the respondents did not understand that the responses were a continuum on which they were to place their school, and thought that they had to choose either School A or School B. These items did not work and should be dropped from the survey in their current format. However, if OET has a strong interest in trying to collect information about these concepts, we could draft new items to measure them with an extent scale rather than with a continuum.


Rating Scales about the Use of Educational Technology at the School

Question 26 asked respondents to indicate the extent to which they agreed or disagreed with statements about how student learning is affected by the use of educational technology in the instructional program at the school. The statements were drawn from the Initial Evaluation Report of the Utah State Board of Education Digital Teaching and Learning Grant Program. This question worked well.


Question 27 asked respondents to indicate the extent to which they agreed or disagreed with statements about the use of educational technology in the instructional program at the school. The statements were taken from the FRSS 92 survey on educational technology in U.S. public schools, fall 2008. This question worked well.


Question 28 asked respondents to indicate the extent to which various challenges in using educational technology for instruction apply to teachers at the school. The statements were drawn partly from the Initial Evaluation Report of the Utah State Board of Education Digital Teaching and Learning Grant Program, partly from an item on barriers to effective technology integration provided by Bernadette after the second round of feasibility calls, and partly from the input we received during earlier rounds of the feasibility calls. This question worked well.


Privacy Issues Assessed by the District

Question 29 asked to what extent the district assesses various privacy issues prior to purchasing online educational technology for students. None of the respondents thought that they were in a position to answer this question. All respondents indicated that this question should be asked of district respondents, not school respondents. This question should be dropped from the survey.


Next Steps

The purpose of our conference call on July 9 is to decide which questions will be kept on the survey to fit within the FRSS limit of 3 pages of questions. We will also discuss any revisions needed to clarify these questions for respondents.


After the call, the next step in the process is to prepare the version of the questionnaire that will be reviewed by the QRB. Once the version of the questionnaire for QRB review is approved by NCES and OET, the QRB meeting is scheduled and the questionnaire is sent to the members for review. QRB members send written comments on the questionnaire within 1 week, and then Westat has 1 week to prepare written responses to their comments. The comments and responses are discussed during the QRB meeting (likely a conference call again), and then Westat prepares a revised questionnaire. Once NCES and OET approve the revised questionnaire, it is sent to OMB in the Cog Lab package requesting approval to conduct the survey pretest.


Date:

October 8, 2019


To:

Chris Chapman

Bernadette Adams


From:

Laurie Lewis

Cindy Gray


Subject:

FRSS 110 Use of Educational Technology for Instruction: Pretest Call Report


This memo describes findings from the pretest calls for the FRSS 110 survey on use of educational technology for instruction. We conducted telephone interviews from September 26–October 7, 2019 with respondents from 8 schools. The respondents were a combination of principals and technology staff, all of whom provided responses for a single school. We spoke to respondents at all instructional levels (elementary, middle, and high schools). Schools were located in city, suburban, and rural areas, and were located in various geographic areas of the country. For this round of calls, respondents were sent the draft questionnaire when the interview was scheduled. Respondents were asked to complete the questionnaire and send it back to us prior to their interview, and to also have it in front of them during the interview so that we could go over it together and get their feedback about the items.


Respondents did not have any comments or questions about the bullets on the cover page. They indicated that the definition of computers in the third bullet was clear and appropriate. The same definition appears in the box above the first question. The only change we made to the information in the box as a result of the pretest calls was to add the statement, “Answer only for the school indicated on the front of this survey.” This reinforces the statement on the cover page about responding for the indicated school. This is important since respondents may work at more than one school or in a district technology office, and it is important that they respond only for the sampled school.


We made only minor changes to the survey questions following the pretest. The headers for question 4 were modified slightly, changing “count each computer in only one location” to “count each computer only once,” and adding “(if none, enter 0)” above the response column. For question 6, the question stem was revised to add the phrase, “Include instructional software accessed through the Internet as well as software loaded on the computers.” Many schools use Chromebooks, and a couple of respondents noted that the software was accessed through the Internet rather than residing on the computers. In question 19, the wording in item e was clarified, and now reads, “Competing priorities in the classroom adversely affect the use of educational technology.”


The revised questionnaire is attached. Once NCES and OET have provided comments or approved these changes, we will prepare the OMB package for the full data collection.

