
Impact Evaluation of Academic Language Intervention

OMB: 1850-0941






Impact Evaluation of

Academic Language Intervention



■ ■ ■



OMB Forms Clearance Request:

Supporting Statement Part A



MAY 2017


Prepared for:

Institute of Education Sciences

United States Department of Education

Contract No. ED-IES-15-C-0050


Prepared By:

MDRC

16 East 34th Street, 19th Floor

New York, NY 10016

William Corrin, Project Director

[email protected]

(212) 340-8840






Supporting Statement for Paperwork Reduction Act Submission

This package requests clearance from the Office of Management and Budget (OMB) to conduct data collection activities for a rigorous evaluation of the impact of an academic language intervention on English Learner (EL) students’ and disadvantaged non-EL students’ language and reading skills. The Institute of Education Sciences, within the U.S. Department of Education, awarded the contract to conduct this evaluation to MDRC and its partners, Abt Associates and the Florida Center for Reading Research at Florida State University (referred to hereafter collectively as “the study team”), in September 2015.

Some research suggests that ELs and economically disadvantaged students are at particular risk for poor academic outcomes due to underdeveloped academic language skills (Kieffer, 2010). Academic language generally refers to linguistic features that are prevalent in academic discourse across school content areas but infrequent in colloquial conversation. Specifically for this project, academic language is defined as knowledge and understanding of the words and discourse found in text that form the basis for the language of schooling. Knowledge of academic words and discourse can be taught, practiced, and demonstrated in school in oral modalities (speaking and listening) and text modalities (reading and writing). A growing body of work suggests that ELs and economically disadvantaged students struggle to develop the academic language proficiency needed to access the content of academic texts and academic talk; to think and learn like a scientist, historian, mathematician, or writer; and to succeed academically overall (Bailey & Heritage, 2008; Foorman, Koon, Petscher, Mitchell, & Truckenmiller, 2015; Guerrero, 2004; Hakuta et al., 2000; Honig, 2010; Shanahan & Shanahan, 2008).

Although prior studies of academic language instruction provide some initial evidence of efficacious instructional practices, there is little confirmation regarding the large-scale effectiveness of academic language instruction or intervention. The goal of this evaluation is to assess the impact of an academic language intervention, implemented at a larger scale, on the language and reading skills of EL students and disadvantaged non-EL students (e.g., students from low-income families). This evaluation will contribute to the knowledge base on instructional practices that improve language and literacy outcomes.

This submission requests clearance to conduct data collection during the baseline period prior to implementing the selected academic language intervention, during the implementation year (the 2017-18 school year), and during a follow-up year (spring 2019). The evaluation will examine the implementation and impact of WordGen Elementary, an academic language intervention, using a random assignment design in which participating schools in each district are randomly assigned either to a treatment group, whose 4th and 5th grade teachers receive training and materials to implement the intervention, or to a control group, whose teachers do not. The analyses for this study will draw on the following data sources: teacher surveys, teacher and student rosters, school district records data, student assessments, and classroom observations.


1. Circumstances Making the Collection of Information Necessary

  a. Statement of need for a rigorous evaluation

The Elementary and Secondary Education Act emphasizes the performance of EL and disadvantaged students and requires schools to demonstrate how they are improving these students’ English language proficiency and academic achievement.1 The study’s data collection will permit a rigorous assessment of a promising academic language intervention in multiple school districts across the country as part of a large-scale randomized controlled trial that will provide evidence on the effectiveness of the selected intervention for ELs and disadvantaged non-EL students in grades 4 and 5.

There is a growing body of work pointing to the importance of academic language proficiency for accessing the content of academic texts and academic talk; learning to think and learn like a scientist, historian, mathematician, or writer; and overall academic achievement (Bailey & Heritage, 2008; Guerrero, 2004; Hakuta et al., 2000; Honig, 2010; Shanahan & Shanahan, 2008). Academic vocabulary, perhaps one of the most studied aspects of academic language, has consistently been identified as a key factor in students’ academic success. Further, academic language has been found to correlate significantly with reading comprehension skills in developing readers (Uccelli, Galloway, Barr, Meneses, & Dobbs, 2015). ELs and children growing up in poverty are at particular risk for poor academic outcomes due to their emerging academic language skills. These at-risk children are often caught trying to develop English language proficiency while simultaneously learning academic content, and therefore need to learn with tremendous efficiency to keep pace with the demands of the curriculum (August & Shanahan, 2006).

Recent work has identified several promising practices in supporting the development of academic language of ELs and other disadvantaged students. The specific features of desirable interventions include teaching a set of academic vocabulary words intensively across several days; integrating oral and written language instruction into content-area teaching; providing regular, structured writing opportunities; and providing small-group instructional intervention to struggling students (Baker et al., 2014). While prior studies provide some initial evidence of effective instructional practices, there is little confirmation regarding the effectiveness of academic language interventions when implemented at scale—this is the gap that the Institute of Education Sciences (IES) seeks to fill with the current study. In particular, the study intends to assess the effectiveness of the selected academic language intervention on academic outcomes for EL students and for disadvantaged non-EL students in grades 4-5.



  b. Study Logic Model and WordGen Elementary Intervention

Logic Model. The primary hypothesis of this evaluation is that high quality instruction explicitly promoting the acquisition of academic language will improve academic word knowledge and knowledge of academic discourse, as well as reading and academic achievement, for EL students and their disadvantaged classmates who are non-EL students. The study team anticipates that the impact of high quality instruction will likely be moderated by several student characteristics: English language status, socioeconomic status, baseline reading skills, and grade level.

The logic model in Exhibit 1 displays connections between academic language intervention implementation supports, the intervention’s core components, and the proximal and distal outcomes to be measured and analyzed by the study team:

  • Implementation supports: Professional development and other supports designed to ensure high-fidelity implementation of the academic language intervention and to increase teachers’ understanding of individual differences in language development, the linguistic challenges that students can encounter in text, and how academic language contributes to reading comprehension.

  • Core intervention components: Participation in the professional development and access to other supports associated with the academic language intervention are expected to result in teachers’ acquisition of knowledge about academic language and the adoption of instructional practices that improve the academic language instructional environment for students. Academic language instruction should give students opportunities to engage with academic language orally (through authentic opportunities to speak and listen with teachers and peers) and through text (through rich opportunities to read text and generate written responses).

  • Proximal student outcomes: Changes in the instructional environment are expected to directly influence students’ academic language skill associated with word knowledge and discourse elements.

  • Distal student outcomes: Improvements in academic language are expected to affect distal student outcomes in reading achievement (as measured by reading comprehension skill) and academic achievement (as measured by performance on state tests). Given that EL students represent one of the subgroups of interest in this evaluation, the study team will also examine whether improvements in academic language also lead to progression in or exit from EL status. These distal outcomes are highly relevant for theory and policy.



Exhibit 1. Academic Language Intervention Logic Model



Characteristics of WordGen Elementary—the academic language intervention to be tested. In summer 2016, the study team issued a Request for Proposals to developers of academic language interventions. In early 2017, the study team selected WordGen Elementary as the intervention to be included in the study. WordGen Elementary is an academic language intervention developed by the Strategic Education Research Partnership (SERP) together with some of the nation’s leading literacy experts.

Word Generation Elementary is structured around 12 two-week teaching units, each of which introduces 5-6 high-frequency academic vocabulary words that are used across disciplines. Each unit begins with a video newscast and a “Reader’s Theater” that introduce multiple perspectives on a topic designed to be interesting to 4th and 5th graders – for example, “What is fair?” and “Who should decide what we eat?” Each unit provides students with repeated, authentic opportunities to actively engage in using academic language in the classroom by reading a variety of texts, participating in word-learning activities and writing tasks, and discussing and debating each topic using the focus words. Examples of focus words related to the “Who should decide what we eat?” topic include: nutrition, effective, campaign, respect, and eliminate. Each lesson lasts approximately 45 minutes and is designed to be implemented every day.

SERP will collaborate with districts to support the implementation of the classroom-based WordGen Elementary activities. SERP will hire and train a locally based coach to provide ongoing in-school and other supports to teachers. Coaches will participate in a two- to three-day summer training and will receive ongoing support from SERP staff throughout the school year. In each district, coaches and SERP staff will then co-facilitate a two-day introductory training for local teachers. The introductory training will cover such topics as the value of discussion to academic achievement; an introduction to and deep dive into the Word Generation Elementary curriculum; discussion and debate in Word Generation Elementary classrooms; and how English Learners and students struggling academically can benefit from the intervention. During the 2017-18 school year, teachers will receive in-school support from the locally based coach and SERP staff, as well as online support via webinars and an online WordGen Elementary community.



  c. Research Questions and Study Design

Research questions. The evaluation will be anchored in the logic model above and will address the following primary research questions:

  • What is the impact of the academic language intervention on student achievement?

  • What is the impact of the academic language intervention on classroom instruction?

  • Was the academic language intervention implemented with fidelity?

  • Is there variation in the implementation or impact of the academic language intervention?



Design. The study team will assess the impact of the academic language intervention using an experimental design in which participating schools are randomly assigned to the intervention group or the business as usual (BAU) control group. In the intervention schools, the intervention will be implemented in grades 4 and 5 during the 2017-2018 school year, and the BAU schools will not implement it in any grade during the 2017-2018 school year. The impact estimation approach for this design is straightforward: the effect of the intervention can be estimated by comparing the average outcomes between the two groups. The analysis approach is discussed below in section A.16 and in Supporting Statement Part B.
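To make the design concrete, the following minimal sketch illustrates school-level random assignment blocked by district. The district and school names, the even split within each block, and the fixed random seed are illustrative assumptions, not the study’s actual assignment procedure.

```python
# A minimal, hypothetical sketch of school-level random assignment blocked by
# district. The rosters, seed, and even split are assumptions for illustration.
import random

# Hypothetical roster: district -> participating schools
schools_by_district = {
    "District A": ["School 1", "School 2", "School 3", "School 4"],
    "District B": ["School 5", "School 6", "School 7", "School 8"],
}

random.seed(20170901)  # fixed seed so the assignment is reproducible

assignments = {}
for district, schools in schools_by_district.items():
    shuffled = random.sample(schools, k=len(schools))
    half = len(shuffled) // 2
    for school in shuffled[:half]:
        assignments[school] = "treatment"      # grade 4-5 teachers receive WordGen Elementary training and materials
    for school in shuffled[half:]:
        assignments[school] = "control (BAU)"  # business-as-usual instruction

for school, group in sorted(assignments.items()):
    print(f"{school}: {group}")
```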



  d. Data Collection Needs/Plan/Schedule

The evaluation includes several complementary data collection efforts that will allow the study team to address the study’s research questions. Exhibit 2 presents the data collection instruments, need, respondents, modes, and schedule. Additional details about the data sources are provided in section 2 below.

Exhibit 2. Data collection needs

Instrument | Data Need | Respondent | Mode | Schedule
Teacher surveys | Instructional strategies used, professional development received, experience, background characteristics | Teachers | Electronic with hard copy follow-up | Fall 2017, Spring 2018, Spring 2019
Teacher and student rosters | Enrollment and classroom teaching assignments | School staff | Electronic | Fall 2017, Winter 2018
School district records | Reading or language arts and math standardized test score data, English language proficiency status, and student demographic and other characteristics | District staff | Electronic | Fall 2018, Fall 2019
Student assessments | Academic language skills and reading comprehension | Students | Paper | Spring 2018
Classroom observations | Classroom instructional strategies and fidelity of implementation of the tested intervention | NA | Paper | Fall 2017, Spring 2018



2. Purpose and Use of the Information

Data for the evaluation will be collected and analyzed by the contractor selected under contract ED-IES-15-C-0050. The information gathered through this data collection will be analyzed by the study team to study the implementation and impacts of an academic language intervention in the late elementary grades. The findings from this evaluation will provide important evidence for educators and policymakers on the impacts of academic language instruction on EL and disadvantaged non-EL students’ language skills and reading achievement. The evaluation will also provide important insights into implementation challenges and how educators may overcome them. In addition, the data collection for this evaluation will be available as a restricted use data file, which will serve as a valuable resource for other researchers.

The study team will gather information from existing data sources to the extent possible, but some necessary information can only be obtained directly from study respondents. The data collected in the evaluation will be used to address the evaluation’s research questions, as shown in Exhibit 3. Details about each data source are discussed in the section following Exhibit 3.

Exhibit 3. Research questions and data sources

Primary Research Question | Data Sources
What is the impact of the academic language intervention on student achievement? | Student assessments; school district records data
Was the academic language intervention implemented with fidelity? | Teacher surveys; classroom observations of instruction and fidelity of implementation
What is the impact of the academic language intervention on classroom instruction? | Teacher surveys; classroom observations of instruction and fidelity of implementation
Is there variation in the implementation or impact of the academic language intervention? | Student assessments; school district records data; teacher surveys; classroom observations of instruction and fidelity of implementation



Teacher surveys: The teacher survey will be used to measure differences in language instruction between the treatment and BAU classrooms and to measure fidelity of implementation of the intervention by teachers in the treatment group. The survey will include, for example, items about teachers’ prior experience and training in teaching ELs and disadvantaged non-EL students, participation in professional development (not specific to the intervention) related to academic language instructional strategies, and self-reported use of instructional strategies to support students’ acquisition of academic language skills. In addition, the surveys will include separate items for teachers in the treatment group: these items will be specific to delivering the core instructional components of the intervention, including their use of intervention-specific instructional techniques, resources, and materials; their self-efficacy for applying the core elements of the intervention; and challenges encountered in implementing the intervention. Teacher surveys will be administered online in fall 2017, spring 2018, and spring 2019, with teachers contacted via email. For teachers who do not respond to the online survey, the study team will mail a hard copy of the survey to their schools.

Teacher and Student Rosters: To permit tracking of the participation of students and teachers in the treatment and BAU classrooms, schools will be asked to submit rosters of the students enrolled in each classroom as well as the name of the teacher and his/her contact information. The study team will work with the schools to identify a liaison at each school (the “school liaison”) who will support the study’s data collection activities during the 2017-18 school year. Rosters will be requested in fall 2017 and winter 2018 for all classrooms and teachers in the study. These rosters are necessary to ensure that data are collected for all teachers and students in the study (i.e., that teacher surveys and student assessments are administered to the right participants).

School District Records Data Collection: The study team will request extant data from school district records, including demographic data (e.g., gender, free/reduced-price lunch eligibility, EL and special education status) for students enrolled in 4th or 5th grade in each participating school in 2017-18 and 2018-19, as well as state reading or language arts and math achievement test scores from the spring 2017, spring 2018, and spring 2019 administrations for students in the study. Data will be collected from districts in fall 2018 and fall 2019. These data will be used to determine the impact of the intervention on student reading/language arts and math achievement, one of the key outcomes of interest. Student demographic data will be used as covariates in the study’s impact analyses.

Student Assessments: To estimate the impact of the intervention on the key proximal student outcome, the study will administer a direct assessment of students’ academic language skills (as measured by the Core Academic Language Skills Instrument). To estimate whether the intervention has an impact on the longer-term key student outcome of reading comprehension, the study team will administer a direct assessment of students’ reading comprehension skills (measure TBD).

Classroom Observations: In order to assess the impact of the intervention on classroom instructional practices, the study team will conduct classroom observations in approximately 40 percent of the study classrooms (approximately 3 of the 8 classrooms per school). To capture the degree to which teachers are delivering instruction that supports academic language and reading development irrespective of curriculum and assigned treatment condition, the team will collect observational data using the Classroom Assessment Scoring System-Upper Elementary version (CLASS-UE), a well-validated, reliable measure of instructional quality independent of any specific intervention. Furthermore, the team will collect observational data using a study-modified version of the Word Generation Elementary Fidelity Instrument, in order to capture teachers’ coverage of the intervention’s curricular units and content and delivery of intervention-specific instructional strategies.



3. Use of Technology to Reduce Burden

The data collection plan is designed to obtain reliable information in an efficient way that minimizes respondent burden. The study team will gather information from existing data sources as much as possible, but some necessary information can only be obtained directly from study respondents. Whenever possible we will use technology to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden on respondents. In particular, we will collect teacher and student rosters and extant school records data electronically via a secure data transfer portal, in whatever file format and structure is most convenient for school liaisons and district staff.


In addition, the study team will implement teacher surveys electronically using the FluidSurvey platform. FluidSurvey has built-in, customizable routines for inviting participants, tracking completion of surveys, and presenting a broad range of question types (e.g., select one response; select all responses that apply; Likert scaled items) in a user-friendly online format. FluidSurvey also allows programming of pre-filled text, conditional skip logic, and other automated features that minimize the burden on respondents. By administering the surveys online, respondents can complete them easily at a time and place most convenient for them. Additionally, online administration can reduce time and human error associated with manual data entry because the data will be entered directly by respondents and loaded automatically into an electronic data file.



4. Efforts to Avoid Duplication

As described above, prior studies provide some initial evidence of effective instructional practices, but there is little confirmation regarding the effectiveness of academic language interventions when implemented at scale—this is the gap that the current study seeks to fill.

The study team will use existing data whenever possible and will avoid duplicating data collection efforts. While the study will rely on existing data to the extent possible, some new data collection is necessary because there are no current large-scale studies or data collection efforts that examine the same or similar data.



5. Efforts to Minimize Burden on Small Businesses or Other Small Entities

The primary entities for the study are district and school staff. We will minimize burden for all respondents by requesting only the minimum data required to meet study objectives. Burden on respondents will be further minimized through the careful specification of information needs. We will also keep our data collection instruments short and focused on the data of most interest. Sample sizes and data requirements for each respondent group were determined by careful consideration of the information needed to meet the study objectives, and were reviewed by the study’s technical working group (TWG).



6. Consequences of Not Collecting the Information

The data collection plan described in this submission is necessary for ED to examine the large-scale effectiveness of academic language instruction shown to be promising on a smaller scale. Although prior studies of academic language instruction provide some initial evidence of efficacious instructional practices, there is little confirmation regarding the large-scale effectiveness of academic language instruction or intervention. The goal of this evaluation is to assess the impact of an academic language intervention, implemented at a larger scale, on the language and reading skills of EL students and disadvantaged non-EL students (e.g., students from low-income families). The research questions that the current study seeks to address also have important policy relevance. Starting in 2017, the Every Student Succeeds Act places a new emphasis on the performance of EL and disadvantaged students, requiring schools to demonstrate how they are improving the English language proficiency of ELs and disadvantaged students. This evaluation will contribute to the knowledge base on instructional practices that improve literacy outcomes for these students.

Failing to conduct this study would mean missing a key opportunity to learn about the instructional practices that may improve the language and reading skills of ELs and disadvantaged non-EL students and support decisions that school, district, and other education leaders are making as they seek strategies for addressing the needs of students who may not be receiving adequate support to succeed in school.

Without the information from teacher surveys, the study will be unable to examine the impact of the intervention and training on teachers’ instruction and on the professional development they receive. The study will also be unable to assess differences in teacher experience and practice for the treatment and control teachers in the study. In addition, without the information on teachers’ demographic backgrounds, educational attainment, and professional experience, the study will be unable to capture teacher characteristics that may influence the implementation or effectiveness of the intervention.

Without teacher and student rosters, the study will be unable to identify and track participants in the study.

Without school records, the study will not be able to analyze the ultimate impact of the intervention on student outcomes, such as their English language proficiency status or performance on state tests, and the study will not be able to control for important characteristics, such as students’ race, gender, or EL level.

Without student assessments, the study will not be able to analyze the impact of the intervention on students’ academic language skills or reading comprehension, which are the critical outcomes the intervention is hypothesized to affect.

Without classroom observations, the study will be unable to understand differences in instruction between the treatment and control classrooms or assess the extent to which the intervention is implemented with fidelity.



7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

There are no special circumstances concerning the collection of information in this study.



8. Federal Register Announcement and Consultation Outside the Agency

a. Federal Register Announcement

The 60-day Federal Register notice was published on March 31, 2017 (Vol. 82, p. 16030). No substantive public comments have been received to date. The 30-day notice will be published to solicit additional public comments.



b. Consultation Outside the Agency

The study team has sought input on the study, the request for developer proposals, and developer selection from an Expert Panel, which includes some of the nation’s leading experts in language, literacy instruction, instruction for ELs, and statistical methods. The study team will continue to consult with the panel throughout the study on other issues that would benefit from their input. The following table lists the Expert Panel members.

Name | Title and Affiliation
David Francis | Director of the Texas Institute for Measurement, Evaluation, and Statistics, University of Houston
C. Patrick Proctor | Associate Professor, Lynch School of Education, Boston College
Jeannette Mancilla-Martinez | Associate Professor of Literacy Instruction, Vanderbilt University
Julie Washington | Professor and Program Director in Communication Sciences and Disorders, Georgia State University
Jeffrey Smith | Professor of Economics and Public Policy, University of Michigan
David Figlio | Professor of Education and Social Policy and Economics; Director of the Institute for Policy Research, Northwestern University
Amy Crosson | Assistant Professor of Curriculum and Instruction, Penn State



9. Payments or Gifts to Respondents

We are aware that teachers are the targets of numerous requests to complete data collection instruments on a wide variety of topics from state and district offices, independent researchers, and ED, and several decades of survey research support the benefits of offering incentives. Accordingly, we propose incentives for the teacher surveys to partially offset respondents’ time and effort in completing the surveys. We propose offering teachers a $25 incentive each time they complete a survey, to acknowledge the 35 minutes required to complete each survey. This proposed amount is within the incentive guidelines outlined in the March 22, 2005 memo, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB.

Incentives are also proposed because high response rates are needed to make the survey findings reliable and data from the teacher survey are essential to conducting impact analyses on instructional practices. Although some districts will have solicited buy-in from teachers to participate in the evaluation, our recent experience with numerous teacher surveys supports our view that obtaining teacher buy-in on intervention training and implementation does not guarantee teachers will be willing to devote the time necessary to complete a survey, and monetary incentives increase the likelihood of cooperation of school staff.

The study team has reviewed the research literature on the effectiveness of incentives in increasing response rates for surveys. In the Reading First Impact Study commissioned by ED (OMB control number 1850-0797), monetary incentives proved to have significant effects on response rates among teachers. A sub-study requested by OMB on the effect of incentives on survey response rates for teachers showed significant increases when an incentive of $15 or $30 was offered to teachers as opposed to no incentive (Gamse et al., 2008). In another study, Rodgers (2011) offered adult participants $20, $30, or $50 in one wave of a longitudinal study and found that offering the highest incentive of $50 showed the greatest improvement in response rates and also had a positive impact on response rates for the next four waves.



10. Assurance of Confidentiality

The study team will conduct all data collection activities for this evaluation in accordance with all relevant regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[all] collection, maintenance, use, and wide dissemination of data by the Institute … to conform with the requirements of section 552 of Title 5, United States Code, the confidentiality standards of subsections (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.

Respondents will be assured that confidentiality will be maintained, except as required by law. The following statement will be included under the Notice of Confidentiality in all voluntary requests for data:

Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school or individual. We will not provide information that identifies you or your school or district to anyone outside the study team, except as required by law. Additionally, no one at your school or in your district will see your responses.

The following safeguards are routinely required of contractors for IES to carry out confidentiality assurance, and they will be consistently applied to this study:

  • All data collection employees sign confidentiality agreements that emphasize the importance of confidentiality and specify employees’ obligations to maintain it.

  • Personally identifiable information (PII) is maintained on separate forms and files, which are linked only by sample identification numbers.

  • Access to a crosswalk file linking sample identification numbers to personally identifiable information and contact information is limited to a small number of individuals who have a need to know this information.

  • Access to hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Access to electronic files is protected by secure usernames and passwords, which are only available to approved users. Access to identifying information for sample members is limited to those who have direct responsibility for providing and maintaining sample crosswalk and contact information. At the conclusion of the study, these data are destroyed.

  • Sensitive data is encrypted and stored on removable storage devices that are kept physically secure when not in use.

  • The plan for maintaining confidentiality includes staff training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. It also includes built-in safeguards concerning status monitoring and receipt control systems.


All data containing individually identifiable records will be destroyed by an appropriate fail-safe method, including physical destruction of the media itself or deletion of the contents on our servers. After the study is completed, the study team will create a restricted access file of the data collected and submit that file to IES. This file will have been stripped of all individual, school, and district identifiers.



11. Questions of a Sensitive Nature

There are no questions of a sensitive nature included in the information requested.



12. Estimate of Response Burden

Exhibit 4 provides an estimate of the time burden for the data collection activities for this evaluation. These estimates are based on the instruments included in the appendices and the study team’s experience collecting administrative data from districts and administering surveys to teachers. The total of 1,477 hours includes the following efforts: up to 16 hours for each of the 12 districts to collect and assemble administrative records on students participating in the evaluation; 35 minutes for 490 teachers (85 percent of the anticipated sample) to complete the teacher survey in fall 2017, spring 2018, and spring 2019; and up to 6 hours each for 72 school liaisons to collect and assemble school-level rosters of students and teachers.

Averaged over the three-year clearance period, the annual number of respondents for this collection is 191, the annual number of responses is 546, and the annual burden is 492 hours.

Exhibit 4. Estimate of Respondent Burden

Respondent / Data Collection Activity | # of Targeted Respondents | Expected Response Rate | Expected Number of Respondents | # of Responses per Respondent | Hours per Response | Total Burden Hours | Total Costs
School staff: Teacher surveys | 576 | 85% | 490 | 3 | 0.58 | 853 | $28,396.37²
School staff: Student and teacher rosters | 72 | 100% | 72 | 2 | 3 | 432 | $15,884.64³
Districts: Student test scores and demographic data | 12 | 100% | 12 | 2 | 8 | 192 | $8,530.56⁴
Total | | | 574 | | | 1,477 | $52,811.57
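As a check on the figures in Exhibit 4, the short sketch below reproduces the burden-hour and cost totals from the values stated above. The hourly wage rates come from footnotes 2 through 4; the rounding convention (rounding burden hours before applying wage rates) is an assumption about how the table totals were computed.

```python
# Worked check of the Exhibit 4 burden and cost totals.
# (Activity, expected respondents, responses per respondent, hours per response, hourly wage rate)
rows = [
    ("Teacher surveys",             490, 3, 0.58, 33.29),
    ("Student and teacher rosters",  72, 2, 3.00, 36.77),
    ("District records requests",    12, 2, 8.00, 44.43),
]

total_hours = 0
total_cost = 0.0
total_responses = 0
for name, n, k, hours, rate in rows:
    burden = round(n * k * hours)   # e.g., 490 x 3 x 0.58 = 852.6 -> 853
    cost = burden * rate
    total_hours += burden
    total_cost += cost
    total_responses += n * k
    print(f"{name}: {burden:,} hours, ${cost:,.2f}")

print(f"Total: {total_hours:,} hours, ${total_cost:,.2f}")  # 1,477 hours, $52,811.57
print(f"Annual responses: {total_responses / 3:,.0f}")      # 546
print(f"Annual burden hours: {total_hours / 3:,.0f}")       # 492
```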



13. Estimate of Other Total Annual Cost to Respondents

There are no additional respondent costs associated with this data collection beyond the burden estimated in item A12.



14. Estimates of Costs to the Federal Government

The estimated cost to the federal government of this six-year evaluation is $15,284,959, inclusive of all options. Thus, the average annual cost to the federal government is $2,547,493.



15. Changes in Burden

This is a new collection and there is an annual program change increase of 492 burden hours.

16. Plans for Analysis, Publication and Schedule

Publication Plan & Schedule

Exhibit 5 displays the anticipated timetable for project publications. The study team will prepare two public reports. The first will describe data analyses and findings in response to the key implementation and impact evaluation questions. The report will discuss the design and data collection, the nature and implementation of the academic language intervention training and support, the nature and implementation of the academic language intervention in treatment schools as well as services in the BAU schools, and impact findings. We expect this report to be published by September 2020. The second report will focus on the longer-term impact of the academic language intervention. Both reports will be written and organized so that they are accessible to policymakers and research-savvy practitioners rather than academic researchers, and both will follow guidance provided in the NCES Statistical Standards and the IES Style Guide.

Exhibit 5. Publications

Report | Drafts of Report | Final Public Report
First report: Impact and Implementation of Academic Language Intervention | January 2020, April 2020, August 2020 | September 2020
Second report: Longer-term Impact of Academic Language Intervention | September 2020, January 2021, May 2021 | June 2021



Analysis Plan

The experimental design—school-level random assignment—will allow the study team to examine differences in mean outcomes between the treatment and control schools in a straightforward way. The prototypical impact estimation model essentially compares the mean outcomes between these two groups of schools, taking into account random assignment blocking by district and clustering of students within schools. It also includes baseline covariates, such as students’ baseline reading performance, to improve estimation precision. A two-level hierarchical regression model will be used for the estimation. The model estimates a separate treatment impact for each district; the district-specific estimates will then be averaged across districts, weighting each by the number of treatment group schools in that district, to yield the overall impact estimate for the average treatment school in the sample. In addition to the impact analyses on various samples and different outcomes measured at varying time points, the team will conduct exploratory analyses that address the following topics: 1) the extent to which impacts vary across sites; and 2) whether such variation is related to setting characteristics and features of intervention implementation.
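The following sketch illustrates the estimation logic described above: fit a two-level model (students nested in schools) within each district block, then average the district-specific impact estimates, weighting by the number of treatment-group schools. The file name and column names (outcome, treatment, baseline_reading, district, school_id) are hypothetical, and the sketch is an illustration of the approach rather than the study team’s actual analysis code.

```python
# Illustrative sketch of the blocked, school-clustered impact estimation described above.
# Column names and the input file are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_analysis_file.csv")  # hypothetical student-level analysis file

district_impacts = {}
district_weights = {}
for district, block in df.groupby("district"):
    # Two-level model within the random assignment block: students (level 1)
    # nested in schools (level 2, random intercept), with a baseline covariate
    # included to improve precision.
    model = smf.mixedlm(
        "outcome ~ treatment + baseline_reading",
        data=block,
        groups=block["school_id"],
    )
    fit = model.fit()
    district_impacts[district] = fit.params["treatment"]
    # Weight each district by its number of treatment-group schools.
    district_weights[district] = block.loc[block["treatment"] == 1, "school_id"].nunique()

impacts = pd.Series(district_impacts)
weights = pd.Series(district_weights)
overall_impact = (impacts * weights).sum() / weights.sum()
print(f"Overall impact estimate for the average treatment school: {overall_impact:.3f}")
```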

17. Approval to Not Display Expiration Date

All data collection instruments for which we are requesting clearance will display the OMB number and expiration date.

18. Exceptions to Item 19 of OMB Form 83-1

No exceptions are requested.



References

Bailey, A. L., & Heritage, M. (2008). Formative assessment for literacy, grades K-6: Building reading and academic language skills across the curriculum. Corwin Press.

Baker, S., Lesaux, N., Jayanthi, M., Dimino, J., Proctor, C. P., Morris, J., & Newman-Gonchar, R. (2014). Teaching academic content and literacy to English learners in elementary and middle school (NCEE 2014-4012). Washington, DC: National Center for Education Evaluation and Regional Assistance.

Catts, H. W., Adlof, S. M., Hogan, T. P., & Weismer, S. E. (2005). Are specific language impairment and dyslexia distinct disorders? Journal of Speech, Language, and Hearing Research, 48(6), 1378-1396.

Foorman, B. R., Koon, S., Petscher, Y., Mitchell, A., & Truckenmiller, A. (2015). Examining general and specific factors in the dimensionality of oral language and reading in 4th–10th grades. Journal of Educational Psychology, 107(3), 884.

Gamse, B. C., Bloom, H. S., Kemple, J. J., & Jacob, R. T. (2008). Reading First Impact Study: Interim Report. NCEE 2008-4016. National Center for Education Evaluation and Regional Assistance.

Guerrero, M. D. (2004). Acquiring academic English in one year: An unlikely proposition for English language learners. Urban Education, 39(2), 172-199.

Hakuta, K., Butler, Y. G., & Witt, D. (2000). How long does it take English learners to attain proficiency? (Policy Report 2000-1). University of California Linguistic Minority Research Institute.

Honig, S. L. (2010). A framework for supporting scientific language in primary grades. The Reading Teacher, 64(1), 23-32.

Kieffer, M. J. (2010). Socioeconomic status, English proficiency, and late-emerging reading difficulties. Educational Researcher, 39(6), 484–486.

Rodgers, W. (2011). Effects of increasing the incentive size in a longitudinal survey. Journal of Official Statistics, 27(2), 279-299.

Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59.

Uccelli, P., Galloway, E. P., Barr, C. D., Meneses, A., & Dobbs, C. L. (2015). Beyond vocabulary: Exploring cross-disciplinary academic-language proficiency and its association with reading comprehension. Reading Research Quarterly, 50(3), 337-356.

1 This impact evaluation is authorized under two legislative authorities. The first is Title III, Part B, Subpart 2 Section 3221 of the Elementary and Secondary Education Act, which covers research on language instruction. In addition, the Consolidated Appropriations Act of 2014 (P.L. 113-67) allows the Department to strengthen impact evaluation work by pooling resources across ESEA programs.

2 Based on average hourly wage rate of $33.29/hour for elementary school teachers. Bureau of Labor Statistics, U.S. Department of Labor, Occupational Employment Statistics, accessed online at http://www.bls.gov/oes/current/naics4_611100.htm#25-0000 (May 2015)

3 Based on average hourly wage rate of $36.77/hour for elementary school teachers with master’s degree and 11-20 years of experience. National Center for Education Statistics, U.S. Department of Education, accessed online at https://nces.ed.gov/programs/digest/d14/tables/dt14_211.40.asp

4 Based on average hourly wage rate of $44.43/hour for education administrators. Bureau of Labor Statistics, U.S. Department of Labor, Occupational Employment Statistics, accessed online at http://www.bls.gov/oes/current/naics4_611100.htm#25-0000 (May 2015)
