Study of Digital Learning Resources for Instructing English Learner Students

OMB: 1875-0279

Task Order 27
Study of Digital Learning Resources for Instructing English Learner Students
Task 4.4 (Part 1) Revised OMB Part A

July 11, 2016 (Final)

Submitted to:
U.S. Department of Education
Office of Planning, Evaluation and Policy Development

Submitted by:
Westat
1600 Research Boulevard
Rockville, Maryland 20850-3129
(301) 251-1500

Study of Digital Learning Resources (DLRs) for Instructing English Learner Students

Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification

Contract GS-23F-8144H/ED-PEP-11-O-0088

Table of Contents

A. Justification
   Introduction
   Overview of the Study
   A.1. Circumstances Making the Collection of Information Necessary
   A.2. Purposes and Use of the Information Collection
   A.3. Use of Information Technology to Reduce Burden
   A.4. Efforts to Identify and Avoid Duplication
   A.5. Efforts to Minimize Burden on Small Businesses or Other Entities
   A.6. Consequences of Not Collecting the Information
   A.7. Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
   A.8. Consultation Outside the Agency
   A.9. Payments to Respondents
   A.10. Assurance of Confidentiality
   A.11. Questions of a Sensitive Nature
   A.12. Estimate of Respondent Burden
   A.13. Estimates of the Cost Burden to Respondents
   A.14. Estimates of Annualized Costs to the Federal Government
   A.15. Changes in Hour Burden
   A.16. Plans for Analysis, Publication, and Schedule
   A.17. Approval to Not Display Expiration Date
   A.18. Exceptions to Certification Statement
References

Exhibits
   A.1. Research questions by data source matrix
   A.2. Estimates of respondent burden
   A.3. Estimates of annualized cost to respondents
   A.4. Data collection timeline
   A.5. Draft overview of final report chapters

A. Justification
Introduction
The Study of Digital Learning Resources for Instructing English Learner Students will address a gap in
research and in supports for practice in the instruction of students identified as English learners (ELs).
The limited focus in the field on use of digital learning resources (DLRs) in instruction of EL students is a
particularly critical issue given recent demographic changes. These changes are reflected in large
increases in the numbers of EL students enrolled in public schools in grades K–12 (Capps et al., 2005;
NCELA, 2014), presenting a “new mainstream” (Enright, 2011) in schools today.
The goal of this research effort is to provide an understanding of the current use of DLRs for instructing
EL students in order to inform further research and policy development efforts. To achieve this goal, the
study has conducted market research on DLRs to guide the design of the data collection. The study will
survey school district administrators and teachers regarding their identification and use of DLRs and will
conduct site visits to districts and schools to collect additional information through interviews,
DLR demonstrations, and classroom observations.

Overview of the Study
To address the Department’s research questions, the two-year study consists of four components:
1) Market research to identify DLRs available for K–12 instruction that support EL students' second
language acquisition and their learning of academic English and of concepts and skills in core
content areas;
2) A nationally representative survey of 1,000 school district administrators responsible for
instructional and technology decisions (including districts with low, medium, and high
representation of ELs);
3) A survey of 1,200 teachers of EL students, including mainstream teachers of EL students and EL
specialist teachers at elementary and secondary grade levels; and
4) Case studies of 6 districts and 12 schools to provide more in-depth, qualitative data on the use
of DLRs for instructing EL students.
The study will be guided by input from a five-member Technical Working Group (TWG) of researchers
and local district or school representatives, and by an Expert Panel that includes technology developers,
practitioners, and education researchers.

A.1. Circumstances Making the Collection of Information Necessary

The National Educational Technology Plan (NETP) (2010) outlined a forward-looking vision for DLRs as a
core element in education of the future. The most recent NETP (2016) emphasizes the importance of
equity in access to and use of technology. To achieve the NETP goal in instructing EL students, it is
important to understand how technology is being used within classrooms in instructing students from
language backgrounds other than English who are identified as EL students.


As of the 2011-2012 school year there were 4.5 million EL students in grades K–12, representing about
nine percent of the total public school enrollment (OELA, 2016). As a result, many districts and schools
nationally are building capacity to meet the needs of their new EL student populations (Capps et al.,
2005; NCELA, 2014). EL students are challenged to learn academic content while also gaining proficiency
in English. Teachers report using DLRs to help meet EL students’ needs; however, many do not receive
the guidance they need to best support their EL students (Warschauer et al., 2004; Zehler et al., 2008).
This study will provide the Department with key descriptive information on DLR identification and use in
instructing EL students, information that is not currently available. The findings also will provide the
foundation for guidance to educators and DLR developers, to be included in the study products, an
educator guide and a developer guide.
This study is authorized under the Elementary and Secondary Education Act, Title III — Language
Instruction for Limited English Proficient and Immigrant Students, Section 3111 (c)(1)(C).

Program Background
This study represents an important initiative to understand more about the use of DLRs in grades K–12
and in particular to learn how DLRs are used for instructing EL students. DLRs have become increasingly
prevalent in schools as instructional tools and resources, but very little is known about actual practice
using DLRs, and in particular, about their use in instructing EL students.

Previous Studies
At the current time, the limited research available on the use of DLRs does not offer sufficient findings to
guide policy or decision-making. The products of this Study of Digital Learning Resources (DLRs) for
Instructing English Learner (EL) students will address a critical gap in research and in supports for
practice in instruction of EL students. A prior literature review found that as of 2009 much of the
research on technology in education focused on foreign language learning and post-secondary contexts,
and there were very few studies relevant to use of technology for instruction of EL students in grades
K–12 (Zehler, Yilmazel-Sahin, Massoud et al., 2011, 2012). The findings supported earlier calls for greater
attention to this area (Parker 2008; Proctor, Dalton & Grisham 2007; Zhao, 2005).

A.2. Purposes and Use of the Information Collection

The information collected under this study will be used to help provide an understanding of the current
use of DLRs for instructing EL students. The findings will also help to identify key areas for guidance to
teachers and to developers of DLRs for ELs.
Two surveys will be conducted: (1) A district survey will address the technology and student background
context of the district, and obtain information on sources for DLRs, criteria for selecting DLRs, the types
of DLRs used, and barriers to and supports for use of DLRs. (2) A teacher survey, including both
mainstream and EL-specialist teachers, will provide information on the population of students they
teach, and on their practices in identifying, selecting, and using DLRs with their students. Teachers will
also indicate the types of DLRs they use, barriers to and supports for their use, and the types of
professional development and other support they have received related to the use of DLRs with
students and, in particular, to the use of DLRs in instructing EL students. The surveys will consist
primarily of closed-ended items but will include a small number of open-ended items to elicit examples
of specific DLRs used and comments on gaps and needs related to DLRs.
Six case studies will use interviews, observations, and DLR demonstrations to provide detailed
information on DLR use in a small number of districts and schools. The information collected through
the case studies will be used to complement the data collected in the surveys by gathering additional
detailed information. The information will be related to the following topics mapped to the key research
questions: (1) district decision-making around DLR use; (2) the DLRs commonly used with EL students;
(3) the way in which teachers use DLRs for the instruction of EL students; (4) the professional
development and/or coaching provided to teachers around the use of DLRs; (5) barriers to and supports
for the use of DLRs by ELs and their teachers; and (6) districts’ and teachers’ metrics and approaches for
evaluating their use of technology to support ELs. Data gathered from case studies will also inform
research question (7) on how developers and practitioners could improve the use of DLRs for ELs. For
example, case study research will document teachers’ current practices around using DLRs with ELs
(through interviews and observations in case study schools) and analyze the extent to which current
practices reflect research-based principles for promoting ELs’ proficiency in academic language and
gains in content area learning as identified by the research literature and input from the TWG meeting.
The study findings will be presented in a final report and will guide the development of content for two
short field-focused guides for educators and technology developers. The educator guide will include
information to assist teachers in using DLRs. It will include, for example, key questions to consider when
selecting DLRs to use with English learners and examples of how teachers may use DLRs to support EL
student learning and address any barriers to EL students’ use of DLRs. The developer guide will present
information from the study that can support developers in designing DLRs to better serve the needs of
the EL population, including both DLRs designed specifically for EL students as well as those designed for
a broader student population.

Study Objectives
The objective of the Study of Digital Learning Resources for Instructing English Learner Students is to
understand how digital learning resources (DLRs) are used in instructing English learner (EL) students.
The study’s research questions outline the key areas in which the Department requires information on
the use of DLRs for instruction of EL students. The research questions and sub-questions to be
addressed are listed below. Exhibit A.1 outlines the data sources related to each research question.

Research Questions
1. How do districts and teachers identify and select DLRs in general? How do districts and teachers
identify and select DLRs specifically to support EL students?
1.1 Who identifies DLRs and makes decisions about purchase of and access to DLRs within the
district?
1.2 What criteria do districts use in identifying and selecting DLRs? To what extent do these
decisions specifically consider EL students’ needs?
1.3 To what extent do districts identify DLRs that are designed for EL students?
1.4 In what ways do teachers identify possible DLRs?


1.5 What criteria do teachers use in selecting DLRs for their instruction in general? For instruction
of their EL students specifically?
2. What types of DLRs do districts report using to support English learners? What types of DLRs do
teachers report using in instructing and structuring learning activities for their EL students?
2.1 To what extent do teachers report using DLRs in instructional activities for their instruction in
general? In instructional activities for their EL students specifically?
2.2 What types of DLRs do district administrators report obtaining for their districts? What types
of DLRs do districts report that they obtain specifically to address the needs of EL students?
2.3 What types of DLRs do teachers report using for instruction of EL students?
2.4 To what extent are the DLRs used by teachers designed specifically for EL students, or do they
include features to support EL students?

3. How do teachers of EL students use DLRs in the instruction of EL students?
3.1 What are the purposes for which teachers currently use DLRs? (e.g., student engagement,
motivation, skills practice, content knowledge, family engagement)
3.2 What are the areas of knowledge or skills instruction for which teachers use DLRs?
3.3 What instructional activity contexts (such as teacher-led, whole class, small group, individual
work or a combination of these contexts) do teachers report for use of DLRs in instruction?
3.4 To what extent do teachers combine use of DLRs and non-DLR activities?
3.5 What are teachers’ perceptions of the value of DLR use in general and in particular, with EL
students?

4. To what extent do teachers receive professional development (PD) in effective use of DLRs for
instruction? Which professional development approaches do teachers report to be most helpful in
supporting their use of DLRs in instruction?
4.1. What professional development (PD), if any, do teachers receive that is related to use of DLRs
in general? Related to instruction of ELs and to use of DLRs for instructing EL students? What
are the topics addressed by the PD?
4.2 Who provides the PD, and how much PD have teachers received?
4.3 What types of PD or other supports on use of DLRs do teachers identify as most helpful?

5. What are barriers to and supports for (1) the use of DLRs in instruction of EL students and (2) the
use of DLRs by students at home? How can districts, schools, and DLR developers address these?
5.1 What barriers to use of DLRs with ELs do district administrators report?
5.2 What barriers do teachers report for their use of DLRs in the classroom for instruction of ELs?
5.3 What supports for use of DLRs with ELs do district administrators report?
5.4 What supports do teachers report for their use of DLRs for instruction of ELs?
5.5 What supports do teachers report as helpful for engaging (1) students and (2) parents or
family members in working with DLRs with students, for use of DLRs outside of the classroom?
6. How do districts and teachers define and measure the success of their use of technology to support
EL students?
6.1 What do district administrators and staff report as their indicators of successful use of
DLRs?


6.2 What do teachers report as their indicators of successful use of DLRs?
7. How could developers and practitioners improve the usefulness of DLRs for instructing EL students?
7.1. What do district administrators report as gaps in DLRs available for instructing ELs?
7.2. What do teachers report as gaps in DLRs for instructing ELs?
7.3. Do districts and/or teachers report using the full range of types of DLRs available? If there are
gaps between what is available and what is used, what are these gaps and their implications
for developers and educators?
7.4. What other needs do district administrators or teachers identify related to the use of DLRs for
instructing EL students?


Exhibit A.1. Research questions by data source matrix

[Exhibit A.1 is a matrix mapping research questions 1 through 7 to the study data sources: the surveys (district survey and teacher survey) and the case study components (district interviews, school administrator interviews, technology coordinator and coach interviews, teacher interviews, classroom observations, and DLR demonstrations).]

A.3. Use of Information Technology to Reduce Burden

The research team will use information technology to reduce burden on school districts, schools, and
respondents for both the survey and interview portions of the data collection. Each data collection task
uses a form of technology that facilitates the collection of consistent and reliable information in the
most effective way while lessening respondent burden.
To minimize burden during the district and teacher surveys, the surveys will be administered online via
a web survey platform that allows entry into the survey from any computer while preserving the
unique identity of the district or teacher respondent. The survey platform will also allow for multiple
entries into the survey, saving all information entered at each point, so that respondents can complete
the survey through multiple brief sessions if needed. This will allow respondents to complete the survey
at their convenience. Another advantage of online administration is that it reduces the time and human
error associated with manual data entry, since the data will be entered directly by the respondent and
automatically loaded into a master data file. Even so, we will provide respondents with access to an
electronic copy of the survey so that, if they prefer not to use the online survey, they can print out and
complete a hard copy or indicate responses on the electronic copy. This may be helpful for
respondents without ready access to computers or to reliable internet, as might be the case in rural
districts, especially outside of school campuses.
To reduce the burden on case study sites, the research team will collect and review key publicly
available documents and data (when these exist), such as district technology plans and information on
school urbanicity and student population, for each case study district and school. The research team will
use this extant information (for example, information gathered from the research team's review of
district documents, such as the district technology plan or district department structures) to refine and
tailor interview protocols for each site, ensuring that the time spent with respondents is used to gain the
information most relevant to each respondent. Communication with case study respondents (e.g., for
scheduling or any required follow-up) will be conducted via email as much as possible. Alternate modes
of communication (e.g., telephone) will be used to follow up only if email proves inefficient or if
specifically requested.

A.4. Efforts to Identify and Avoid Duplication

There are no federal collection efforts that gather the level of detailed information that this study will
collect.

A.5. Efforts to Minimize Burden on Small Businesses or Other Entities

Small businesses will not be involved as respondents. The primary entities for this study are randomly
selected school districts.

A.6. Consequences of Not Collecting the Information

If the proposed information is not collected, the federal government will miss the opportunity to
provide timely and practical information to stakeholders. For example, it will lose the opportunity to
obtain information that could help it provide more targeted and effective guidance for teachers and for
instruction of EL students through the use of DLRs. Without the study's findings, the Department also
will not have the information needed to guide policy and decision-making or to address areas of need
to ensure more effective instruction for EL students.
The consequences of not collecting specific data include:


•	Without the district and teacher surveys, the study team would not be able to obtain information
on how DLRs are identified and selected to support EL students.
•	Without classroom observations and DLR demonstrations, the study team would not be able to
identify examples of current practices that present a complete picture of how DLRs are used for
instruction of EL students.
•	Without district, principal, and teacher interviews, we would not be able to address key levers of
DLR selection, adoption, implementation, and evaluation.

A.7. Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances. This collection of information is conducted in a manner consistent
with the guidelines in the Code of Federal Regulations, 5 CFR 1320.5.

A.8. Consultation Outside the Agency

A.8.1 Federal Register Announcement
A 60-day notice for this study was published in the Federal Register on May 12, 2016 (Vol. 81, No. 92, pp.
29551–29552). To date, no public comments have been received.

A.8.2 Consultations Outside the Agency
The Department has contracted with Westat to conduct this study. Westat and its team designed the
study and developed the data collection and analysis plans that were submitted to the Department for
review. Members of the Program and Policies Studies Service (PPSS) office, the Office of English
Language Acquisition (OELA), and the Office of Educational Technology (OET) who have in-depth
knowledge of the topic areas reviewed the documents and provided feedback to Westat.
In addition, the project team has assembled a Technical Working Group (TWG) (in consultation with ED)
composed of researchers and local school district or school representatives. The TWG convened on April
7, 2016 to review and provide feedback on the design and data collection instruments. The meeting
provided key guidance to the project team including: confirmation that the teacher sample should
include both mainstream teachers of EL students and EL-specialist teachers; confirmation that the study
should include all teachers, including first-year teachers; recommendation that the survey should be
implemented in the winter-spring rather than the fall; and proposed edits to the data collection
instruments.


The TWG members are as follows:
•	Rebecca Black, Associate Professor, University of California, Irvine
•	Chris Hansen, Director of Curriculum, Hortonville Area School District, Hortonville, WI
•	Maria Santos, Co-chair and Senior Advisor for Leadership, Understanding Language; and Director for School and District Services, Comprehensive School Assistance Program, WestEd
•	Rebecca Silverman, Associate Professor, University of Maryland
•	Binbin Zheng, Assistant Professor, University of Michigan

The project team will also convene an Expert Panel including technology developers, practitioners, and
education researchers. The Expert Panel will provide input on what they consider to be the key issues
and questions, both for reporting on the use of DLRs for ELs and for structuring guidance for educators
and developers. The Expert Panel will meet at a point during data collection, and the panel members'
input will guide the development of the draft educator and developer guides.

A.9. Payments to Respondents

To encourage participation in the study, researchers will offer a small incentive of $25 to participants
who complete surveys and/or participate in interviews. In the researchers' experience, incentives are an
important tool in helping to reach the desired 85 percent response rate.

A.10. Assurance of Confidentiality

Researchers will adhere to federal rules regarding the protection of human subjects in research. The
research team has a duty to protect all information, but particularly anything sensitive or potentially
embarrassing to individuals. The following provisions will apply to this project:


•	We will establish procedures and train all case study researchers in data security procedures, and
will document the data security procedures to be used by all participating researchers (Westat, SRI,
and OLC) for the case study data collection specifically, and for any transfers of case study data by
these organizations. This will occur prior to any case study contacts or data collection.



•	As part of the data collection training, all members of the research team will be trained on data
confidentiality. Specifically, researchers will be trained on how to store data without individual
names and how to discuss survey and interview data only within a team context for analysis
purposes.



•	As part of obtaining consent for surveys and interviews, each respondent will be apprised that
their participation in the project is voluntary, that they may cease participation at any time during
the survey or interview, and that the study team will protect and maintain the confidentiality of
their responses except as may be required by law. In interviews, researchers will provide this
information orally as well as in writing in the consent form. All case study respondents will be asked
to sign a consent form (see Appendix I for copies of the two consent forms: one for interview
participants and one for classroom observation participants). Respondents also will be informed
that responses to the data collection will be used to summarize findings in an aggregate manner
(across surveys and across case study sites, as appropriate) or to provide examples of program
implementation in a manner that does not associate responses with a specific site or individual. In
any reporting of the case study findings, pseudonyms will be used for each site. The study team may
refer to the generic title of an individual (e.g., “district director”), but neither the site name nor the
individual name will be used. All efforts will be made to keep the description of each site general
enough that a reader would not be able to determine the true name or identity of the site or of
individuals at the site. The study team will make sure that access to all data with identifiable
information is limited to members of the study team. The study team will not provide information
that associates responses or findings with a subject or district to anyone outside the study team and
will protect and maintain confidentiality for data collected, except as required by law.


•	Each respondent will be assigned a unique study identification number to protect and maintain
confidentiality of respondents.



•	The voluntary nature of project participation, the confidentiality provisions, and the consent forms
are subject to and overseen by Westat's and SRI's respective Institutional Review Boards for human
subjects research.



•	All electronic data will be stored on secure servers. Access to the servers is password protected,
with password changes required at regular intervals and strong password requirements. Each user's
access is limited and determined by the network administrator.



•	Westat's and SRI's standard practice is to shred documents and destroy electronic data once the
data are no longer required, typically within three years of study completion, to allow for any
questions that may arise after publication.

A.11. Questions of a Sensitive Nature

The data collection instruments (attached in Appendices A-C) do not ask questions of a sensitive nature.
However, the research team as a standard of practice takes precautions for the unlikely situation that a
question makes a respondent uncomfortable. Both the survey and interview consent forms note that
participation is voluntary and may be withdrawn at any time. The surveys allow respondents to skip
questions, and interviewees may decline to answer questions. Maintaining confidentiality, particularly
student confidentiality, is a paramount concern. In the research team’s experience, district and school
personnel are very careful not to disclose confidential information as they deal with student privacy
concerns daily. In addition, we are not collecting student-level data, and the questions do not solicit
information regarding individuals; thus, the risk of any confidentiality breach is further minimized.


A.12. Estimate of Respondent Burden

The total respondent burden for the data collection effort is 2,480 hours for the one-time data collection
(fall 2016 to winter 2017), an annualized burden of 827 hours. Information for this study will be collected
through responses to district and teacher surveys and through six case studies that include in-person
interviews, classroom observations, and DLR demonstrations. The information will be collected from a
nationally representative sample of 999 school districts that serve at least one EL student according to
the most recent NCES Common Core of Data (CCD) Local Education Agency Universe File. A total of
1,200 teachers will be selected: 600 mainstream teachers of at least one EL student and 600 EL-specialist
teachers. One teacher of each type will be selected from one school drawn from each of 600 districts
within a subsample of the 999 districts in the main study sample.
Exhibit A.2 presents time estimates of respondent burden for the data collection activities requested for
approval in this submission. Exhibit A.3 presents estimates of the costs to respondents based on the
estimated number of hours required to respond to the one-time data collection efforts, including
administrative staff time for coordination and for work as part of the process for developing the case
study sample.


Exhibit A.2. Estimates of respondent burden

| Data Collection Activity | Type of Respondent | Number of Respondents | Responses per Respondent | Total Responses (respondents × responses) | Average Time per Response (Minutes) | Total Burden Hours (responses × minutes ÷ 60) |
Rosters:
| List of Schools | District Administrative Assistant | 600 | 1 | 600 | 30 | 300 |
| List of Teachers | School Administrative Assistant | 600 | 1 | 600 | 40 | 400 |
Surveys:
| District Survey | Administrator | 999 | 1 | 999 | 55 | 916 |
| Teacher Survey | Teacher | 1,200 | 1 | 1,200 | 35 | 700 |
Site Visits:
| Sample Selection Information | School Administrative Assistant | 18 | 1 | 18 | 60 | 18 |
| Site Visit Coordinator | School Administrative Assistant | 12 | 1 | 12 | 60 | 12 |
| District Technology Director Interview | Administrator | 6 | 1 | 6 | 60 | 6 |
| District EL Staff Interview | Administrator | 6 | 1 | 6 | 60 | 6 |
| District Curriculum and Instruction Director Interview | Administrator | 6 | 1 | 6 | 60 | 6 |
| School Principal Interview | Principal | 12 | 1 | 12 | 60 | 12 |
| School EL Coordinator Interview | School-level administrator | 12 | 1 | 12 | 60 | 12 |
| School Instructional Technology (IT) Specialist or Coach | School-level administrator | 12 | 1 | 12 | 60 | 12 |
| EL Specialist Teacher Interview | Teacher | 12 | 1 | 12 | 60 | 12 |
| Mainstream Teacher Interview | Teacher | 36 | 1 | 36 | 60 | 36 |
| Classroom Observations | Teacher | 20 | 1 | 20 | 60 | 20 |
| DLR Demonstration | Teacher | 12 | 1 | 12 | 60 | 12 |
| Total | | 3,563 | | 3,563 | | 2,480 |
| Annual Respondent Burden | | 1,188 | | 1,188 | | 827 |
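To make the arithmetic behind Exhibit A.2 easy to verify, the short sketch below recomputes burden hours for a few rows (respondents × responses per respondent × minutes ÷ 60) and the three-year annualization used for the totals. It is an illustrative check only; the row values are copied from the exhibit.

```python
# Illustrative recomputation of burden hours from Exhibit A.2.
# Hours = respondents x responses per respondent x minutes per response / 60.

rows = [
    # (activity, respondents, responses_per_respondent, minutes_per_response)
    ("List of Schools", 600, 1, 30),    # reported as 300 hours
    ("List of Teachers", 600, 1, 40),   # reported as 400 hours
    ("District Survey", 999, 1, 55),    # 915.75, reported as 916 hours
    ("Teacher Survey", 1200, 1, 35),    # reported as 700 hours
]

for activity, respondents, responses, minutes in rows:
    hours = respondents * responses * minutes / 60
    print(f"{activity}: {hours:,.0f} hours")

# The full table sums to 3,563 responses and 2,480 burden hours; annualized over
# the three-year clearance period, 2,480 / 3 is approximately 827 hours per year.
print(f"Annualized total burden: {2480 / 3:,.0f} hours")
```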

Exhibit A.3. Estimates of cost to respondents

| Data Collection Activity | Estimated Burden (Hours) | Average Hourly Rate | Estimated Cost to Respondents |
Rosters:
| List of Schools | 300 | $15.62 | $4,686.00 |
| List of Teachers | 400 | $15.62 | $6,248.00 |
Surveys:
| District Survey | 916 | $44.13 | $40,412.05 |
| Teacher Survey | 700 | $26.90 | $18,830.00 |
Site Visit:
| Sample Selection Information | 18 | $15.62 | $281.16 |
| Site Visit Coordinator | 12 | $15.62 | $187.44 |
| District Technology Director Interview | 6 | $44.13 | $264.78 |
| District EL Staff Interview | 6 | $44.13 | $264.78 |
| District Curriculum and Instruction Director Interview | 6 | $44.13 | $264.78 |
| School Principal Interview | 12 | $36.30 | $435.60 |
| School EL Coordinator Interview | 12 | $27.11 | $325.32 |
| EL Specialist Teacher Interview | 12 | $27.11 | $325.32 |
| School Instructional Technology (IT) Specialist | 12 | $27.11 | $325.32 |
| Mainstream Teacher Interview | 36 | $27.11 | $975.96 |
| Classroom Observations | 20 | NA | NA |
| DLR Demonstration | 12 | $27.11 | $325.32 |
| Total | 2,480 | | $74,151.83 |
| Annual Cost | 827 | | $24,717.28 |

NOTE: Average hourly rate for educational administrators and administrative assistants derived from the Bureau of Labor
Statistics' Occupational Employment and Wages, May 2014. Average hourly rates for principals and teachers derived from the
Digest of Education Statistics, 2013.

A.13. Estimates of the Cost Burden to Respondents

There is no capital or start-up cost component to these data collection activities, nor is there any
operation, maintenance, or purchase-of-services cost associated with the study.

A.14. Estimates of Annualized Costs to the Federal Government

The estimated annualized cost of the study to the federal government is $460,415.33. This estimate is
based on the total contract cost of $1,381,246, amortized over a 36-month performance period. It
includes costs already invoiced, plus budgeted future costs that will be charged to the government for
the study design, sampling, data collection, analysis, and reporting.
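The annualized figure is the straight-line amortization of the stated contract total over the three-year performance period:

$$
\text{Annualized cost} = \frac{\$1{,}381{,}246}{3\ \text{years}} \approx \$460{,}415.33\ \text{per year}
$$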

A.15. Changes in Hour Burden

This is a new study and new data collection.

A.16. Plans for Analysis, Publication, and Schedule

A.16.1 District and Teacher Surveys
Analysis of the district and teacher survey data will consist of simple descriptive statistics on all survey
items and item-level results disaggregated by key district- and teacher-level characteristics. The latter
set of analyses will involve cross-tabulations and tests of significance such as chi-square tests, t-tests,
and ANOVA, with adjustment for multiple comparisons where appropriate, to compare results across
districts, schools, or teachers with different characteristics.
In developing analyses of the data, we will draw on three main sources of information:


•	The categories of districts identified in the district sample (e.g., sampling strata) for levels of EL
representation: significant, moderate, and low-incidence;
•	The categories of teachers, including the two categories identified in the teacher sample:
EL-specialist teacher of EL students and mainstream teacher of EL students; and
•	Grade levels for DLR use, based on the elementary and secondary grade level categories of
teachers selected.

Given the complex sample design, we will use WesVar to calculate accurate variance estimates for each
subgroup when conducting tests of significance on weighted survey data (e.g., comparing EL specialists'
use of technology with that of mainstream teachers of ELs).
Key areas of analysis will address the primary research questions. The analyses will examine, for
example, differences by district level of EL representation in the types of DLRs used, in the supports for
DLR use provided to teachers, and in the degree to which districts specifically address EL student needs
in their DLR selections. Analyses of data related to teachers will examine, for example, the types of DLRs
used; access to DLRs and to professional development on use of DLRs, and the extent to which this
professional development is specific to EL students; the types of learning activity contexts teachers
structure for use of DLRs (e.g., EL students work independently, collaborate in pairs, or work in groups);
and the goals teachers identify for DLR use.
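To make the planned item-level comparisons concrete, the sketch below illustrates the kind of cross-tabulation and chi-square testing described above, with a Bonferroni adjustment for multiple comparisons. It is a simplified, unweighted illustration only: the actual analyses will apply the survey weights and use WesVar replicate-weight variance estimation, and the column names shown (el_representation, uses_dlrs_for_els, provides_dlr_pd) are hypothetical.

```python
# Illustrative sketch of the planned cross-tabulations and significance tests.
# Simplified and unweighted: the study's analyses will use survey weights and
# WesVar replicate-weight variance estimation. Column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency
from statsmodels.stats.multitest import multipletests

# Hypothetical district-survey extract: EL-representation stratum and two items.
districts = pd.DataFrame({
    "el_representation": ["low", "low", "moderate", "moderate", "significant", "significant"],
    "uses_dlrs_for_els": ["yes", "no", "yes", "no", "yes", "yes"],
    "provides_dlr_pd":   ["no", "no", "yes", "yes", "no", "yes"],
})

items = ["uses_dlrs_for_els", "provides_dlr_pd"]
p_values = []
for item in items:
    table = pd.crosstab(districts["el_representation"], districts[item])
    chi2, p, dof, _ = chi2_contingency(table)
    p_values.append(p)
    print(f"{item}: chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Adjust for multiple comparisons across items (Bonferroni shown as one option).
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
for item, p_adj, significant in zip(items, p_adjusted, reject):
    print(f"{item}: adjusted p = {p_adj:.3f}, significant at 0.05 = {significant}")
```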

A.16.2 Case Study
We will follow an iterative approach to analyzing the case study data, beginning before each site visit,
and continuing through internal case study reports to cross-site analysis. Before each site visit, the case
study team will review relevant documents (e.g., district- and school-technology plans, and overviews of
district-licensed DLR use where available). Once each district visit is completed, the site visitors will draft
district-level case study reports, integrating data from the district interviews and two sets of school-level
interviews, DLR demonstrations, observations, and documents. The qualitative data will be coded
iteratively using qualitative analysis software (e.g., NVivo) to identify emerging themes. Case study
researchers will meet to discuss key themes and to describe dimensions of similarity
and variation across districts (e.g., on decision-making, use and instructional practices in use of DLRs to
support ELs’ learning, facilitating factors, barriers and challenges, teachers’ professional development
needs, among others). Our analysis will also examine similarity and variation across types of
respondents (e.g., district administrators, principals, and teachers). The overall case study report will
summarize across the site visit data reports to quantify themes and synthesize findings to address the
evaluation questions.
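As a simple illustration of the cross-site synthesis step (quantifying themes across the district-level case study reports), the sketch below tallies hypothetical theme codes by site and respondent type. The coding itself will be carried out in qualitative analysis software such as NVivo; the site names, respondent types, and theme labels shown here are assumptions for illustration only.

```python
# Illustrative tally of coded themes across case study sites (hypothetical codes;
# the study's coding will be carried out in software such as NVivo).
import pandas as pd

coded_excerpts = pd.DataFrame({
    "site":            ["District A", "District A", "District B", "District B", "District C"],
    "respondent_type": ["teacher", "principal", "teacher", "district admin", "teacher"],
    "theme":           ["barrier: home internet access", "support: coaching",
                        "barrier: home internet access", "support: coaching",
                        "barrier: device availability"],
})

# How often each theme appears, and in how many distinct sites it appears.
theme_counts = (coded_excerpts
                .groupby("theme")
                .agg(mentions=("theme", "size"), sites=("site", "nunique"))
                .sort_values("mentions", ascending=False))
print(theme_counts)

# Variation by respondent type (e.g., district administrators versus teachers).
print(pd.crosstab(coded_excerpts["theme"], coded_excerpts["respondent_type"]))
```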

Timeline and Publication Plans
Study Timeline
The study data collection will follow an amended timeline. Under the proposed revised plan, data
collection will begin in September 2016 and continue through March 2017 (see Exhibit A.4).
Exhibit A.4. Data collection timeline

| Task | Timeline |
| Select sample of districts for survey | September 2016 |
| OMB approval | September 2016 |
| Contact sampled districts | Immediately after OMB approval |
| Case study recruitment | January–February 2017 |
| Begin data collection: district survey | January 2017 |
| Select subsample of districts and identify teacher survey sample | September 2016 – December 2016 |
| Begin data collection: teacher survey | January 2017 |
| Data collection: case studies | February–March 2017 |
| End survey data collection | May 2017 |

Publication Plans
The study findings will be reported in a Results in Brief document (a concise summary of the study and
its key findings) and in the Final Report. In addition, the findings will inform the design of the guides for
developers and educators.
The Final Report will use clear language and will be formatted so that a reader can easily identify the
“take-away” points. The report will be structured to provide an overview of the study goals, design,
research questions, sources of expert input and guidance, study implementation, methodology,
analyses, and findings (see Exhibit A.5). The Final Report will include quantitative analyses of the
survey data, examining key comparisons by district and teacher types, and will summarize and
integrate the survey and case study findings. The case study findings will provide additional depth of
information and insights into on-the-ground perspectives and practices regarding use of DLRs in
instructing and structuring learning activities for EL students.
Exhibit A.5. Draft overview of final report chapters

Executive Summary: Summary of key content and findings.

1. Introduction: Overview of the study goals, key research questions, and framework for discussing DLRs.

2. Identification of DLRs for ELs: Findings related to the identification of DLRs for ELs as reported by districts and teachers, including types of individual DLRs and licensed integrated DLR sets, and reported barriers and supports.
   •	For districts, the findings will be examined by key district EL-representation groups and by key district characteristics. Sources are the district survey, with further information drawn from the case study findings.
   •	For teachers, the findings will be examined by type of teacher and other key characteristics. Sources are the teacher survey, with further information drawn from the case study findings.

3. Use of DLRs for Instructing EL Students: District goals for use of DLRs and teachers' reported use of DLRs, including the types of student activity contexts (for example, individual use, peer pair or small group use, degree of teacher-led versus student-driven activities); the language groups and grade levels of students instructed using DLRs; the use of DLRs reported by mainstream versus specialist teachers of ELs; and differences by district EL-representation category. Also included will be the measures of success in use of DLRs reported by districts and teachers. Sources are the district and teacher surveys, with further information drawn from the case study findings.

4. Professional Development and Other Supports for Teachers to Use DLRs in Instruction: Findings on the types of professional development, coaching, and other supports provided within a district to teachers, as reported by teachers, and the characteristics of those considered most helpful by teachers. Sources are both the district and teacher surveys and related case study findings.

5. Barriers to the Use of DLRs: Reported barriers to use of DLRs based on the district and teacher surveys. These will be examined by district EL-representation category, by teacher type, and by other key characteristics of districts and/or teachers.

6. Summary and Implications of the Findings: Discussion of key findings with a focus on findings that identify, among other areas:
   •	potential areas for further DLR development to address identified needs of educators working with EL students;
   •	potential areas for further educator support to expand and maximize the value of DLRs in instructing EL students;
   •	barriers to use of DLRs that may be addressed; and
   •	supports for educators to enhance instructional practice using DLRs in instructing EL students.

In addition, the study team will prepare two guidance documents based on the findings.


•	The Developer's Guide will describe gaps the study findings have identified, both in the DLRs
available and in the types of information about DLRs presented to educators. The toolkit will provide
guidance where the findings suggest steps that might address these gaps. It will be based on the
summary findings across all components of the study, including the market research, the district and
teacher surveys, and the case study data collection. The toolkit will include, as appropriate, examples
and templates to guide developers, and framework descriptions that align with the information
provided to educators in the educator guide.
•	The Guide for Educators will provide guidance on the range of DLRs and on the potential for using
DLRs in instructing and structuring learning activities for EL students. The guide content will be based
on the market research, survey, and case study findings. The guidance is expected to include an
overview of DLRs, presenting these in a framework that outlines broad categories and types of DLRs,
as well as descriptors of DLR features and functions. The framework will be supported by descriptions
of example DLRs drawn from among those reported as used in the study. The guide is intended for
both administrators and teachers and will be organized to reflect differences in the guidance
appropriate to each.

A.17. Approval to Not Display Expiration Date

The agency plans to display the expiration date for OMB approval of the information collection on all
data collection instruments.


A.18. Exceptions to Certification Statement

This study does not require any exceptions to the Certification for Paperwork Reduction Act Submissions
(5 CFR 1320.9).

References
Capps, R., M. Fix, J. Murray, J. Ost, J. Passel, and S. Herwantoro. The New Demography of America’s
Schools: Immigration and the No Child Left Behind Act. 2005.
http://www.urban.org/research/publication/new-demography-americas-schools.
Enright, K. A. “Language and Literacy for the New Mainstream.” American Educational Research Journal
48 no. 1 (2011): 80–118. doi: 10.3102/0002831210368989.
NCELA (National Clearinghouse for English Language Acquisition). NCELA State Title III Information
System. 2014. http://www.ncela.us.
OELA (Office of English Language Acquisition). Fast Facts: Profiles of English Learners (ELs). 2016.
http://www.ncela.us/files/fast_facts/OELA_FastFacts_ProfilesOfELs.pdf
Parker, L. L. ed. Technology-Mediated Learning Environments for Young English Learners. New York, NY:
Erlbaum, 2008.
U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics,
Common Core of Data (CCD). Local Education Agency Universe Survey. n.d. http://nces.ed.gov.
U.S. Department of Education, Office of Educational Technology. Future Ready Learning: Reimagining
the Role of Technology in Education. Washington, DC, 2016.
Warschauer, M., D. Grant, G. Del Real, and M. Rousseau. “Promoting Academic Literacy with
Technology: Successful Laptop Programs In K-12 Schools.” System 32 (2004): 525–537.
Zehler, A. M., H. L. Fleischman, P. J. Hopstock, T. G. Stephenson, M. L. Pendzick, and S. Sapru. Descriptive
Study of Services to Limited English Proficient (LEP) Students and LEP Students with Disabilities.
Volumes I-III. Arlington, VA: Development Associates, Inc., 2003.
Zehler, A. M., C. Adger, C. Coburn, I. Arteagoitia, K. Williams, and L. Jacobson. Preparing to Serve English
Language Learner Students: School Districts with Emerging English Language Learner
Communities (REL 2008–No. 049). Washington, DC: National Center for Education Evaluation
and Regional Assistance, Regional Educational Laboratory-Appalachia, 2008.
http://ies.ed.gov/ncee/edlabs/projects/project.asp?ProjectID=151.
Zehler, A. M., Y. Yilmazel-Sahin, L. Massoud, S. C. Moore, C. Yin, and K. Kramer. The Implementation of
Technology for Instruction of English Learner Students: District Survey. Technical assistance
memorandum submitted to the U.S. Department of Education, Institute of Education Sciences,
National Center for Education Evaluation and Regional Assistance, Regional Educational
Laboratory-Appalachia, 2011.


Zehler, A. M., Y. Yilmazel-Sahin, L. Massoud, S. C. Moore, C. Yin, and K. Kramer. “Using
Technology-Based Resources: Technology Integration in Instruction of EL Students.” Poster
presented at the International Society for Technology in Education Annual Conference, San
Diego, CA, June 26, 2012.
Zhao, Y. “Recent Developments in Technology and Language Learning: A Literature Review and
Meta-Analysis.” In Research in Technology and Second Language Learning: Developments and
Directions, edited by Y. Zhao, 17–37. Greenwich, CT: Information Age, 2005.


