Study of Schools Targeted for Improvement Using Title I Section 1003(g) Funds Provided Under ARRA (Study of School Turnaround)

OMB: 1850-0878

American Institutes for Research®





Study of School Turnaround





OMB Clearance Request

For Data Collection Instruments



Part A: Supporting Statement for Paperwork Reduction Act Submission



February 10, 2011




Prepared for:

United States Department of Education

Contract No. ED‑04‑CO‑0025/0022





Prepared by:

American Institutes for Research

Mathematica Policy Research

Decision Information Resources

Education Northwest

Contents

List of Appendices

Appendix A: Criteria for the Selection of States A–1

Appendix B: Construct Matrix B–1

Appendix C: Protocols and Consent Forms C–1

Appendix C–1: Draft State Administrator Interview Protocol and Consent Form C–1

Appendix C–2: Draft District Administrator Interview Protocol and Consent Form C–8

Appendix C–3: Draft Elementary School Principal Interview Protocol and Consent Form C–18

Appendix C–4: Draft High School Principal Interview Protocol and Consent Form C–28

Appendix C–5: Draft Elementary School Teacher Interview Protocol and Consent Form C–38

Appendix C–6: Draft High School Teacher Interview Protocol and Consent Form C–46

Appendix C–7: Draft Instructional Coach Interview Protocol and Consent Form C–55

Appendix C–8: Draft Union Representative Interview Protocol and Consent Form C–61

Appendix C–9: Draft External Support Provider Protocol and Consent Form C–67

Appendix C–10: Draft Elementary School Teacher Focus Group Protocol and
Consent Form C–75

Appendix C–11: Draft High School Teacher Focus Group Protocol and Consent Form C–83

Appendix C–12: Draft School Improvement Team Focus Group Protocol and
Consent Form C–92

Appendix C–13: Draft Parent Community Focus Group Protocol and Consent Form C–100

Appendix C–14: Draft High School Student Focus Group Protocol and Consent Form C–105

Appendix C–15: Draft Elementary School ELL Teacher Interview Protocol and
Consent Form C–113

Appendix C–16: Draft High School ELL Teacher Interview Protocol and Consent Form C–121

Appendix C–17: Draft District ELL Coordinator Interview Protocol and Consent Form C–130

Appendix D: Teacher Surveys D–1

Appendix D–1: Teacher Survey: Elementary Longitudinal Module D–1

Appendix D–2: Teacher Survey: High School Longitudinal Module D–9

Appendix E: Request for Documents and Files (RDF), District Director of Fiscal Services,
2008–09 and 2009–10 School Years E–1

Appendix F: Walk Through Observation Guide F–1

Appendix G: State, District, and School Notification G–1



List of Exhibits

Exhibit 1. Conceptual Framework

Exhibit 2. Study of School Turnaround Evaluation Questions

Exhibit 3. Main Study Components, Proposed Sample, and Schedule of Data Collection Activities

Exhibit 4. Sample Indicators of a School-Level Analytic Rubric

Exhibit 5. Estimates of Respondent Burden


Introduction

The Institute of Education Sciences (IES) of the U.S. Department of Education (ED) requests clearance for the data collection for the Study of School Turnaround (SST). The purpose of the study is to document over time the intervention models, approaches, and strategies adopted and implemented by a subset of schools receiving federal School Improvement Grant (SIG) funds. To this end, the evaluation will employ multiple data collection strategies. Clearance is requested for the study’s design, sampling strategy, data collection, and analytic approach. This submission also includes the clearance request for the data collection instruments.

This document contains three major sections with multiple subsections:

  • Study of School Turnaround

    • Overview

    • Conceptual Framework

    • Evaluation Questions

    • Sampling Design

    • Data Collection Procedures

    • Analytic Approach

  • Supporting Statement for Paperwork Reduction Act Submission

    • Justification (Part A)

  • Appendices contain a 50‑state table detailing sampling criteria, the study’s construct matrix, site visit interview and focus group protocols, a teacher survey, state interview protocol, Request for Documents and Files, school observation guide, consent forms for all respondents, and notification materials for state and district participants.

Study of School Turnaround

Overview

The Study of School Turnaround (SST)1 will involve case studies to document over time the intervention models, approaches, and strategies adopted and implemented by a subset of schools receiving federal School Improvement Grant (SIG) funds. Authorized under Section 1003(g) of Title I of the Elementary and Secondary Education Act (ESEA) and supplemented by the American Recovery and Reinvestment Act (ARRA), SIGs will target $3.5 billion over the next three years toward the goal of turning around the nation’s lowest‑performing schools. Guidance issued by the U.S. Department of Education has defined both the criteria for selecting eligible schools and the permitted intervention models (School Improvement Grants, 2010). Eligible schools are defined as belonging to one of three categories:

  • Tier I, which includes any Title I school in improvement, corrective action, or restructuring that (1) is among the lowest‑achieving five percent of those schools in the state; or (2) is a high school that has had a graduation rate below 60 percent for a number of years.2

  • Tier II, which includes any secondary school that is eligible for, but does not receive Title I, Part A funds and (1) is among the lowest‑achieving five percent of such secondary schools in the state; or (2) has a graduation rate below 60 percent for a number of years.3

  • Tier III, which includes the remaining Title I schools in improvement, corrective action, or restructuring that are not Tier I schools.4

For each Tier I and Tier II school identified in a local educational agency’s (LEA’s) SIG subgrant application, the LEA must specify one of four improvement models to be implemented in an effort to turn around the school.

  • Turnaround model: replaces the principal and no less than 50 percent of the staff, introduces a new governance structure, implements significant instructional reforms, increases learning time, and provides flexibility and support;

  • Restart model: reopens the school under the management of a charter school operator, charter management organization, or an education management organization;

  • School closure: closes the school and reassigns students to higher achieving schools; and

  • Transformation model: replaces the principal, introduces significant instructional reforms, increases learning time, and provides flexibility and support.

These models are consistent with those defined in other ARRA‑funded initiatives, including Race to the Top (RTT) and the State Fiscal Stabilization Funds (SFSF)‑Phase 2.

The SST will follow the experiences of 60 case study schools in “real time,” from the point at which they receive their SIG funding through a three‑year period thereafter. The study will involve the following data collection strategies: (1) site visits, (2) telephone interviews, (3) teacher surveys, and (4) document collection at the state, district, and school levels that includes fiscal data and information on the school turnaround process.

The approach to this study’s design embraces three interrelated objectives:

Objective 1: To document the change process in a set of chronically low‑performing schools receiving SIG funds.

This study will describe the characteristics of 60 SIG schools, the decisions and strategies they undertake, and the constraints they face as they work to implement intervention models intended to improve student outcomes. Because the study will collect “real time” longitudinal information over the course of three years in a variety of school contexts, it will offer a unique window on how SIG implementation unfolds. In particular, the study team will seek to understand the school‑level processes associated with the planning, implementation, and sustainability of change strategies. School change is a dynamic process, requiring attention, motivation, and internal capacity on the part of school‑level stakeholders. In these 60 schools, the study team will examine the extent to which school‑level actors are engaged in school improvement processes and the level and quality of the implementation of their change strategies.5

The study team recognizes, however, that neither school improvement nor school failure occurs in isolation. Data will be collected from the states and districts in which the case study schools are located, examining school practices as the product of complex and interacting factors. These factors include decisions and practices at multiple levels of the system, characteristics of school populations and personnel, prior reform histories and resulting capacities, the actions of external providers and partners, and use of fiscal resources. Indeed, a particularly important aspect of the study will be the integration of data concerning resource allocation with information about other aspects of the change process.

Objective 2: To study leading indicators of school turnaround.

An objective of the study is to examine factors that are hypothesized to promote the change process in SIG schools. Drawing on existing studies of school improvement and turnaround, the conceptual framework for this study delineates a set of potential leading indicators. The study will track these indicators for study schools over the course of the project.

Objective 3: To support schools undertaking actions to turn around student performance by sharing accumulating knowledge and lessons from study schools with SIG program staff and other key stakeholders.

Each year, the study team will produce reports and research briefs with annual study findings. The study also will share accumulating knowledge with program staff in ED, with the goal of informing ED’s management of the grant program and provision of technical assistance to states, districts, and schools. These knowledge‑sharing activities will enrich the study and its reach and will yield lessons for future evaluations.

Conceptual Framework

The conceptual framework for this study addresses the research questions in Exhibit 2, drawing on an understanding of the SIG program requirements and on the research literature concerning organizational change processes, policy implementation, and effective schools. Undergirding the framework and the design are several assumptions based on prior research:

  • The heart of the change process (and thus of this study) consists of people, activities, and relationships inside the school. At the same time, school performance is influenced by the systems in which these schools are situated; thus, systemic contributors to chronic low performance must also be considered.

  • The strategies that states, districts, and schools select and employ will reflect different “theories of action”—including different conceptions of the problem(s) to be addressed and different assumptions about how the chosen strategies will address that (those) problem(s). The study should seek to understand both what people do to turn around the lowest‑performing schools and why they do so.

  • Schools are complex social systems—the characteristics of the schools and the various improvement strategies they employ will interact and overlap, making it difficult to tease out causality or predict effects.

  • The quality of implementation is a critical determinant of the effect of any policy or program, and implementation takes shape as policies and practices are interpreted and acted on across multiple levels of the system.

  • Interventions and strategies have both descriptive characteristics that can be directly observed or measured and qualitative dimensions that can only be derived from analysis across multiple characteristics. For example, the literature on external support providers suggests that an important determinant of their effectiveness in a given school is the “fit” between the strategies they employ or promote and the needs of that school. Fit, however, cannot be measured directly but must be inferred analytically from data on the school, its past performance, potential contributing factors to that performance, and the actions and strategies of the support provider.

  • Policy interpretation and implementation are mediated by intervening contextual variables, which also will influence outcomes.

  • Implementation changes over time as effects accumulate and as individuals and units interpret results and modify practice.

Exhibit 1 depicts the conceptual framework that guides the study, reflecting these assumptions and study goals. Several aspects of the graphic are important to note, as discussed below.

Schools at the core: Highlighted in pale blue are the boxes labeled “School Implementation of SIG” and “Leading Indicators of School Improvement,” emphasizing that the core purposes of this study are to document the actions of the 60 study schools to “turn around” their chronic low performance and to track a set of leading indicators that are hypothesized to be associated with subsequent gains in student achievement. The study’s evaluation questions (in the next section) specify domains targeted by the SIG program guidance as likely to foster improved student outcomes; the study will attend to school actions in each of these domains. At the same time, study schools are likely to combine their actions in these domains differently, and the choices made across domains may be interrelated. For example, decisions about the choice of models or instructional improvement strategies (EQs 1 and 2) and staffing (EQ3) may be integral aspects of a change in governance (EQ4), such as a charter conversion. Also, depending on their own determination of needs and “theories of action,” schools will differ in their selection of “entry points” for turnaround efforts. For example, some schools may start the process by changing school leadership or staff, while others may start by changing their governance (i.e., becoming a charter school), and still others by striving for a “quick win” (e.g., bringing order to a chaotic environment) to spur motivation and attention.

In addition to examining actions and strategies undertaken in each school, the study will examine the qualities of the schools’ approaches to turnaround. The degree of coherence across multiple strategies, their divergence from past practices, and the level of buy-in, for example, are some of the qualities likely to influence depth of implementation and eventual effectiveness; indeed, these qualities may spell the difference between success and failure across schools following the very same intervention “model.” In each box in Exhibit 1, therefore, the relevant descriptive characteristics of the strategies and the analytic qualities of the larger approach to turnaround in the school are indicated.

The actions and strategies undertaken by schools are expected to influence improvements in student outcomes through the changes they bring about in the behaviors and capacities of the staff and students. Because such changes are expected precursors to improved student achievement, they are referred to as “leading indicators.” For example, changes in school personnel or professional development efforts would be likely to improve student outcomes only if they result in staff with increased knowledge and skills. A culture of high expectations for students and a culture of continuous improvement are also examples of likely precursors of changes in student outcomes. Tracking such leading indicators and the strategies related to them is a central concern of the turnaround literature and an important goal of this study.

Exhibit 1. Conceptual Framework

[Exhibit 1 graphic: conceptual framework diagram]

Multi‑level implementation of SIG: As illustrated in the conceptual framework, district and state implementation of the SIG program shape schools’ implementation. Districts may have the primary role in selecting the intervention models to be used by SIG schools, or they may prescribe instructional approaches, provide additional flexibility to SIG schools, or provide technical assistance specifically for SIG schools. The study will examine the actions and strategies undertaken by the districts in which study schools are located. As with school implementation, the study will go beyond documentation of actions and strategies and attempt to understand qualities of district approaches that are likely to be associated with successful school implementation. For example, districts’ SIG‑related strategies may differ in their specificity, or in their emphasis on applying pressure vs. providing support to SIG schools; districts also will differ in the comprehensiveness and accessibility of data that schools can use to inform their improvement efforts. The study will examine how district actions contribute to (or impede) school implementation. Of course, state SIG policies, such as the definition of eligible schools and SIG guidance, set the parameters for district and school policies as illustrated in the conceptual framework, and the study will examine SIG policies for the states in which study schools and districts are located.

The role of external support providers: Most approaches to school turnaround recognize that chronically low‑performing schools are low performing in part because they lack the capacity to improve significantly on their own. An expected component of this study, therefore, and one of its research questions, will be to examine the role of external change agents and the assistance they provide to schools. There are several sorts of external support providers, all of which will be considered in this study. One set (educational management organizations and charter management organizations) provides comprehensive support to chronically low‑performing schools and has tools and processes that guide the turnaround process; in some cases, this assistance may be combined with actual line authority over the schools. Another set consists of outside vendors and nonprofits that help schools with one or more aspects of their improvement strategies (such as professional development in mathematics, or how to collect and manage classroom observation data). Finally, individuals contracted with the state (for example, those affiliated with the statewide system of support) may provide direct, long‑term assistance to the case study schools. Given these diverse and prominent roles, the study will examine the work of external agents in the case study schools, as noted in the box in Exhibit 1 marked “External Partners.”

The role of context: The SIG program does not intervene in a vacuum. A key feature of the study’s conceptual framework, therefore, is the emphasis on contextual and systemic factors that mediate SIG‑specific actions and strategies, and their qualities, at the school, district and state levels. School change is embedded in a system. While many prior studies of school reform have focused exclusively on the school, this study will examine the systemic, historical, community and other contexts of school actions and how they influence school actions.

Time: Exhibit 1 shows that the full range of variables examined in Year 1 will be examined again in each of the three years of data collection. Thus, unlike prior, retrospective research, this study will capture the dynamics of the change process. Arrows in the conceptual framework illustrate a process of continuous feedback; one particular focus will be the extent to which student outcome data are used to revisit intervention models at the school, district, and state levels. Because this is a longitudinal, real‑time evaluation, the study team will be in a better position to examine how the turnaround process begins and evolves, how the different levels of the education system interact with one another, and the extent to which existing contextual factors are related to subsequent decisions.

Evaluation Questions

To meet its three objectives, the SST will document the contexts, actions, strategies, and qualities of these strategies that are implemented in a subset of schools that receive SIG funds. The evaluation questions for this study address seven aspects of school turnaround relevant to the SIG grants: (1) selection of intervention models, (2) instructional improvement strategies, (3) human capital strategies, (4) approaches to school governance, (5) use and role of external support partners, (6) the allocation of SIG funds, and (7) contextual and systemic influences. Within each of these, the study team will focus a subset of analyses on issues related to English Language Learners (ELLs). Exhibit 2 presents the broad evaluation questions that will guide the data collection and analysis for each aspect.

Exhibit 2. Study of School Turnaround Evaluation Questions

EQ1: Intervention Models. Which of the four intervention models are districts and schools selecting for turning around the SIG‑funded schools in this study, and why? What roles are states, districts, schools and turnaround partners playing in the decision making and design of the intervention, and how do these roles change over time?


EQ2: Instructional Improvement Strategies. What specific actions are states, districts, and schools taking to improve instruction and outcomes in the SIG‑funded schools in this study? What are the rationales for these actions and how are the decisions made? How well are these strategies planned, implemented, refined, and sustained? How do they change over time?


EQ3: Human Capital. What strategies are states, districts, and schools using to improve the qualifications and effectiveness of teachers, principals, and other staff at the SIG‑funded schools in this study? To what extent do these strategies change over time?


EQ4: Approaches to School Governance and Flexibility. What new governance approaches are being adopted for study schools? Are states and districts providing significant new flexibility to enable implementation of SIG intervention models, instructional improvement strategies, or human capital strategies? To what extent do governance approaches change over time?


EQ5: External Support. What is the nature and quality of external support provided to SIG schools in this study? What roles do states, districts, and turnaround partners play in guiding the change process and helping schools implement improvement strategies? How does support for study schools change over time?


EQ6: Uses of Funds. How are states and districts in this study allocating school improvement funds provided under Sections 1003(a) and 1003(g)? How are study states, districts, and schools using these funds? To what extent do the allocation and uses of funds change over time?


EQ7: Contextual and Systemic Influences. How do school, district, and state contexts shape the adoption, implementation, and changes over time of the strategies employed at each level of the system? How do prior school, district, and state interventions, human capital strategies, and governance approaches contribute to the strategies employed in SIG schools?

Sampling Design

The main components of this study are presented in Exhibit 3 along with the proposed sample and schedule of data collection activities. A detailed discussion of the sampling design is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B of this package.

Briefly, the study will include a base sample of 60 schools and two nested subsamples. The first nested sample (the core case studies) will consist of 25 schools in which the study team will conduct in-depth case studies over three years of data collection. The second nested sample (special topics case studies) will consist of two sets of 10 schools in which the study team will explore focused topics of policy interest.



Exhibit 3. Main Study Components, Proposed Sample, and Schedule of Data Collection Activities

Base Sample: 60 Schools

  • State interviews: Fall 2010, Fall 2011, Fall 2012

  • Principal phone interviews: Fall 2010, Fall 2011, Fall 2012

  • Longitudinal teacher survey: Winter 2011, Winter 2012, Winter 2013

Core Case Studies: 25 Schools

  • Site visits: Winter 2011, Fall 2011, Spring 2012, Fall 2012, Spring 2013

  • Spring supplement teacher survey: once each spring (in addition to the fall administration, as part of the base sample)

Special Topics Case Studies: 2 Sets of 10 Schools

  • Site visits: Special Topic A: Spring 2011 and Spring 2012; Special Topic B: Fall 2012 and Fall 2013

  • Special topic survey supplement: once each year (twice for each set of schools)


Data Collection Procedures

The data collection for this study includes site visits, telephone interviews, teacher surveys, and document data collection. All of the study’s data collection instruments have been included in this submission.6 Exhibit 3 above presents a summary of the data collection activities. A more detailed discussion of these procedures is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B section of this package. Copies of the site visit interview and focus group protocols and state interview protocol are located in Appendices C–1 through C–17. Copies of the teacher survey, Request for Documents and Files, school observation guide, and state notification letter are included in Appendices D, E, F, and G, respectively.

Each year, the study team will collect data from a base sample of 60 schools through a teacher survey and principal interviews. In addition, the study team will interview state officials from the states in which the 60 schools are nested. Among these 60 schools, 25 will be selected as core case study schools, from which the study team will collect additional data through site visits and a survey supplement. In addition, the study team will identify two sets of 10 schools from the base sample of 60 schools, in which the study team will explore special topics of policy interest. The first set of 10 schools will include schools with a high proportion of ELLs; the focus of the second set of 10 schools will be determined in consultation with IES and the study’s Technical Working Group (TWG).

Analytic Approach

Site Visits

The most important element of the study is the site visit, which will consist of data collection in the 25 core case study schools and two sets of special topic case study schools. For each school, study staff will conduct interviews or focus groups with the principal, teachers, support providers, and other stakeholders, as well as interviews with officials of the district in which the school is located. These interviews and focus groups will be guided by semi-structured interview protocols designed to ensure that discussion of specific topics of interest is consistent across respondents and that respondents are also able to describe school improvement processes and policies in their own words.

Analyses of the qualitative site visit data will enable the study team to answer the full gamut of evaluation questions at the school level, for example: how decisions are being made about which intervention models to use (EQ1); what new instructional practices schools are implementing (EQ2); what professional development approaches schools are implementing (EQ3); to what extent school governance approaches change over time (EQ4); what external support is being provided (EQ5); how schools use SIG funds (EQ6); and the role of contextual and systemic influences (EQ7).

Qualitative site visit data will be analyzed through a carefully structured five-step analytic process guided by the study’s evaluation questions and conceptual framework. The process is designed to build reliability and validity into the case study process by both building a chain of evidence and using triangulation to identify themes (Yin, 2003). Steps one through four will be conducted for each case study site visit, and step five will involve an analysis across cases. In step one, preliminary data capture will occur immediately following each site visit and will require researchers to enter information into a data capture template in a web-based software platform. In step two, site visit interview and focus group data will be coded by researchers using a code book and a qualitative analysis software program such as ATLAS.ti, NVivo, or HyperRESEARCH. In step three, within-case data will be analyzed across interviews and focus groups within a single case. Analyses will be guided by structured rubrics that align with the study’s conceptual framework. In step four, a case narrative guided by a common outline template will combine analyses from steps 1-3 and will also include context-specific observations that clarify critical school, district, and state dynamics. Finally, in step five, case narratives will be analyzed collectively using rubrics to determine cross-cutting themes.
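
To make the five-step flow concrete, the sketch below shows, in Python, one way coded excerpts could move from capture (step 1) through coding (step 2), within-case rating (step 3), and cross-case comparison (step 5). It is illustrative only: the codes, field names, and rating function are hypothetical, and in the actual process ratings are assigned by trained researchers using the rubrics, not computed.

```python
from collections import defaultdict

# Step 2: each interview excerpt is tagged with a code from the code book.
# Codes and fields here are hypothetical examples keyed to the framework.
coded_excerpts = [
    {"case": "school_01", "respondent": "principal", "code": "coherence",
     "text": "Our PD plan follows directly from our turnaround goals."},
    {"case": "school_01", "respondent": "teacher", "code": "coherence",
     "text": "The new initiatives feel disconnected from one another."},
]

# Step 3: within-case analysis -- group coded excerpts by construct, then
# assign a rubric rating (1-3) per construct for the case.
def rate(excerpts):
    """Placeholder: in practice a trained researcher assigns this rating
    using the rubric's level descriptors, so it is not computed."""
    return 2

by_construct = defaultdict(list)
for ex in coded_excerpts:
    by_construct[(ex["case"], ex["code"])].append(ex)

ratings = {key: rate(items) for key, items in by_construct.items()}

# Step 5: cross-case analysis -- compare ratings on a construct across cases.
for (case, code), rating in sorted(ratings.items()):
    print(f"{case} | {code}: rating {rating}")
```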


Reliability and Validity Strategies. Research suggests that several strategies, if embedded in the data collection and analysis processes, can improve the reliability and validity of data analysis (e.g., Yin, 1992, 2003). First, all researchers will be trained and provided with guidance materials to improve consistency in data capture and analysis. Second, researchers conducting analyses will be convened at least every two weeks to discuss the data analysis process, questions about the coding of data, and other discrepancies. As a result of these meetings, additional trainings and revisions to guidance materials will be made and distributed to the team. Third, two lead researchers will review the data analyses of each team member on a weekly basis to improve consistency in reporting and analysis across cases. Discrepancies identified during these data analysis checks will be discussed and resolved by the team during the regular meetings. Fourth, the analytic process and sources of data collection allow for triangulation, which will allow researchers to verify observations in the data (Stake, 2000). Last, in each step of the analysis process, procedures for ensuring inter-rater reliability will be implemented, and discrepancies identified will be discussed and resolved at the team meetings (Armstrong, Gosling, Weinman, & Marteau, 1997).
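
As one concrete instance of the kind of inter-rater reliability check described above, the sketch below computes Cohen's kappa for two researchers who coded the same ten excerpts. The code labels are invented for illustration; the study's actual checks may use different statistics or tooling.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters' labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum(counts_a[lab] * counts_b[lab] for lab in labels) / n ** 2  # chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by two researchers to the same ten excerpts.
rater_1 = ["coherence", "buy_in", "coherence", "context", "buy_in",
           "coherence", "context", "buy_in", "coherence", "context"]
rater_2 = ["coherence", "buy_in", "context", "context", "buy_in",
           "coherence", "context", "coherence", "coherence", "context"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # ~0.70 for these data
```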

The following is a more in-depth discussion of each of the five steps of the analytic process. Within each step, the analysis process is described, followed by a brief discussion of step-specific validity and reliability measures.

  1. Preliminary Data Capture

In order to capture initial impressions about case study schools and case study respondents that can sometimes be lost in the time that elapses between site visits and the recording of field notes, the study team has designed a web-based data capture field workbook. The data capture field workbook will be aligned with the key dimensions and qualities identified in the study’s conceptual framework. The web-based platform will allow field researchers to catalog responses to interviews and focus groups immediately following each visit. First impressions regarding the development of turnaround strategies and the quality of implementation will be recorded consistently across all cases. This will ensure that all field teams are collecting data consistently and that no data sources or areas of interest are overlooked. Preliminary data capture will also allow the research team to identify emerging themes and to focus on these as the study progresses. Use of this capture template will be ongoing, and more detailed analysis of these data during steps three through five will produce quantified analysis on key indicators both within and across cases.

Researchers will be trained in the use of the data capture field workbook and will participate in weekly meetings to discuss site visits and issues related to the use of the workbook. Additionally, lead researchers will review the field workbooks weekly to identify possible gaps in data collection and to inform additional guidance provided during these meetings.

  2. Case Interview and Focus Group Coding

Site visit interview and focus group notes will be cleaned, reviewed for accuracy, and then coded. To guide the coding process, a coding book organized by the dimensions and qualities identified in the study’s conceptual framework and evaluation questions will be developed. The coding book will detail how each code should be applied and include examples from the data that illustrate each code. Researchers will code the data using a qualitative analysis software package. Qualitative analysis software programs such as ATLAS.ti facilitate the analysis of large quantities of qualitative data by enabling researchers to develop and test hypotheses using codes that are assigned to specific portions of the narrative. Such software also allows the research team to organize and categorize data within a case or across cases on a year-to-year basis.

Researchers will be trained to use the coding book and qualitative analysis software. The coding process will be piloted, allowing the research team to reach consensus about the meaning of specific codes and thereby improving the reliability and consistency of coding. Regular meetings of the researchers analyzing the data will ensure consistency across coders, and spot-checks of coding conducted by a lead researcher will improve inter-rater reliability.

  3. Within-Case Analysis

After all within-case data have been cataloged and coded using the web-based tools and qualitative analysis software, the research team will use rubrics and matrices to organize the data by theme and to quantify the evidence across subgroups. Rubrics will be organized by construct, and within construct by indicator. For each indicator, the rubric will delineate levels with concrete descriptors. Analysts will assign a rating to each indicator, based on coded data from each set of respondents. Exhibit 4 provides an example of two indicators in a rubric developed for the dimension of coherence at the school level. (The full rubric for coherence includes six indicators; this is included only as an example.) The corresponding matrices will embed direct evidence (in the form of quotes from interviews or documents) to support each rating. Finally, analysis using within-case rubrics will strengthen the reliability of the analysis by cataloging thematic data across multiple sources to ensure that findings are triangulated and consistent across sources.

To ensure the validity and reliability of the within-case analyses, researchers will be trained in the use of the rubrics and matrices. An introductory training will be supplemented by a pilot within-case analysis guided by a lead researcher. Additionally, the research team will meet regularly to discuss and resolve discrepancies. Last, a lead researcher will review all coding and provide one-on-one and group feedback as needed.


Exhibit 4: Sample Indicators of a School-Level Analytic Rubric

School-level COHERENCE

Indicator 1: School leadership adopts new strategies that are consistent with turnaround goals.

Level descriptors:

  1. There is limited or no connection between the strategies/actions being implemented and the school’s turnaround goals.

  2. While some strategies and actions are aligned with the school’s turnaround goals, there are still some strategies that have limited or no connection to the school’s turnaround goals.

  3. All or nearly all of the strategies/actions that have been adopted are consistent with the school’s turnaround goals.

Rating for principal interview*: [insert numeric rating], [insert supporting quote], [insert hyperlink to interview transcript]

Indicator 2: Staff believe strategies/actions implemented are aligned with the school’s turnaround goals.

Level descriptors:

  1. Staff describe fragmented strategies/actions that are disconnected from or in conflict with the school’s turnaround goals.

  2. Staff clearly articulate how some strategies/actions are aligned with the school’s turnaround goals, but also describe strategies/actions that are fragmented.

  3. Staff clearly articulate how most strategies/actions are aligned with the school’s turnaround goals.

Rating for principal interview*: [insert number for rating], [insert supporting quote], [insert hyperlink to interview transcript]

*Note: The full rubric will include columns for different respondents, e.g., teacher interview, instructional coach interview, etc. However, they have been omitted from this example because many columns would limit the readability in this format.
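
A rubric of this kind maps naturally onto a simple record structure that keeps each rating tied to its supporting evidence. The sketch below is a minimal illustration in Python; all field names and values are hypothetical, not the study's actual data model.

```python
from dataclasses import dataclass

@dataclass
class RubricCell:
    """One rating cell in a school-level rubric (field names hypothetical)."""
    construct: str       # e.g., "coherence"
    indicator: str       # e.g., "leadership adopts consistent strategies"
    respondent: str      # e.g., "principal interview"
    rating: int          # 1, 2, or 3 per the level descriptors
    quote: str           # supporting evidence from the coded data
    transcript_ref: str  # pointer back to the source transcript

cell = RubricCell(
    construct="coherence",
    indicator="School leadership adopts new strategies consistent with turnaround goals",
    respondent="principal interview",
    rating=3,
    quote="Every initiative we started this year traces back to our turnaround plan.",
    transcript_ref="school_01/principal_2011_fall.txt",
)
print(cell.rating, "-", cell.quote)
```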

  4. Case Narrative

The primary purpose of the case narrative is to develop a cohesive, comprehensive summary for each case that integrates the data from steps 1-3 but also includes important contextual data that would be difficult to capture using the analysis tools described above. These narratives will be 5- to 10-page summaries of each case that convey how the study’s conceptual framework has been operationalized within the school, paying close attention not only to the types of turnaround strategies being implemented but also to the quality of the implementation efforts. Additionally, because this is a large-scale, longitudinal study, case narratives will prove valuable for identifying changes in school context and quality of implementation from year to year and for capturing changes in school culture that may be affecting the turnaround process but that are sometimes difficult to quantify using rubrics. Each case narrative will be reviewed by a lead researcher to ensure consistency in reporting across cases.

  5. Cross-Case Analysis

The study team will conduct cross‑case analyses to identify emergent themes, associations, and processes. The analysis will include a comparison of topics across the schools, districts, and states in the case study sample. The primary data sources for these analyses will be the rubrics, as this quantified form of qualitative data facilitates cross-case comparisons and the identification of associations among practices and school characteristics. In addition, the case narratives will provide contextual information that will help to explain patterns and relationships evidenced by the rubric data.

Reliability measures for the cross-case analysis will focus on training, regular meetings, and inter-rater reliability checks. Researchers will be trained to use the rubrics and matrices for cross-case analyses. Regular meetings of researchers will be convened to discuss discrepancies, improve definitions of codes, and provide examples from the data to support a mutual understanding of the codes and analyses. Last, continual inter-rater reliability checks will be conducted.

Fiscal Data

Although the fiscal data must be integrated into analyses for each of the 25 core case study schools and the two sets of special topic case study schools, they necessitate some different analytic strategies. In addition to the qualitative fiscal information collected from interviews, which will be analyzed in the case study reports and the cross‑case analyses, the evaluation team also will analyze documents collected at the state and district levels. These will include consolidated applications for federal funds for districts with case study schools and expenditures for each case study school (extracted from its district’s full expenditure files). Site codes contained in the expenditure files will allow the study team to analyze expenditures at each case study school site individually and to observe changes in expenditure patterns in case study schools over time. Object and function codes will permit the documentation of changes over time in policy‑relevant expenditure categories such as personnel, contracted services, and technology. Fund codes will provide descriptive data on how SIG funds themselves were used and on how expenditures overall changed after receipt of the SIG grant. Information obtained through interviews with district and school officials will provide insight into the improvement strategies behind the expenditure decisions (e.g., whether increases in expenditures on personnel represent a reduction in class sizes, an increase in the number of specialized staff such as coaches, or other strategies). In addition, the study team will seek to determine whether there are unique features of the financial decisions in schools that are more successful (assuming the sample captures some such schools).
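
To illustrate how site, function, and fund codes support these analyses, the following sketch uses pandas to aggregate hypothetical expenditure records by school site and fund and to compare totals across years. Column names, codes, and amounts are invented, since each district's chart of accounts will differ.

```python
import pandas as pd

# Hypothetical expenditure extract; real files use district-specific codes.
records = pd.DataFrame([
    {"site": "school_01", "year": "2008-09", "fund": "general", "function": "instruction", "amount": 950_000},
    {"site": "school_01", "year": "2009-10", "fund": "general", "function": "instruction", "amount": 940_000},
    {"site": "school_01", "year": "2009-10", "fund": "sig",     "function": "instruction", "amount": 310_000},
    {"site": "school_01", "year": "2009-10", "fund": "sig",     "function": "prof_dev",    "amount": 120_000},
])

# Expenditures per site, year, and fund: how SIG dollars layer onto base spending.
by_fund = records.pivot_table(index=["site", "year"], columns="fund",
                              values="amount", aggfunc="sum", fill_value=0)
print(by_fund)

# Year-over-year change in total spending at each case study school.
totals = records.groupby(["site", "year"])["amount"].sum()
print(totals.groupby(level="site").diff())
```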

State Interviews

The study team believes that school‑level turnaround processes are likely to be shaped by the historical and policy context of each state and its demographic and urban characteristics. Interviews with state officials in the five states in which the 60 schools are situated will provide needed insight into state‑level decisions with regard to a range of evaluation questions on state funding (EQ6), state contexts (e.g., legal constraints and flexibility) (EQ4), and state actions and technical assistance (EQ5). In addition, the state interviews will address questions related to recent changes in teacher licensure systems, certification requirements, and teacher evaluation procedures. The analysis will consist of coding text data, an iterative process that includes reading, reviewing, and filtering data to identify prevalent themes relating to each of these evaluation questions.

State Extant Analyses

To situate the 60 case study schools in a broader context, the study team will analyze and report on extant national data, including state and district SIG applications, and data from EDFacts and the Common Core of Data (CCD). The study team will use these data to describe (1) SIG policies and guidance provided by states and districts to SIG schools and (2) the characteristics of SIG‑eligible and SIG‑awarded schools. First, the review of state and district SIG applications will document critical background information about SIG policies and guidance, including, for example, evaluation criteria for reviewing and prioritizing district applications; planned processes for monitoring SIG implementation and reviewing districts’ annual goals for student achievement for SIG‑awarded schools; and planned activities related to technical assistance. Second, using data on the SIG‑eligible and SIG‑awarded schools, in conjunction with data from EDFacts and CCD, the study team will conduct analyses of the features of SIG schools, including grade level, size, urbanicity, funding levels, and characteristics of enrolled students. The study team will also report on school‑level AYP performance and accountability status of the SIG‑funded schools. In addition, the study team will use these data to address questions related to state strategies for targeting resources and variation in selected intervention models by state, region, school level, and urbanicity.
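
A minimal sketch of the kind of merge these analyses imply appears below: a list of SIG schools is joined to school characteristics and then described by subgroup. The column names and values are invented stand-ins; actual EDFacts and CCD layouts differ.

```python
import pandas as pd

# Hypothetical SIG school list and CCD-style characteristics file.
sig_schools = pd.DataFrame({
    "school_id": ["0101", "0102"],          # school identifiers (invented)
    "sig_status": ["awarded", "eligible"],
    "model": ["transformation", None],
})
characteristics = pd.DataFrame({
    "school_id": ["0101", "0102"],
    "level": ["high", "elementary"],
    "urbanicity": ["city", "rural"],
    "enrollment": [1450, 320],
})

merged = sig_schools.merge(characteristics, on="school_id", how="left")
# Describe SIG-awarded vs. SIG-eligible schools by urbanicity.
print(merged.groupby(["sig_status", "urbanicity"])["enrollment"].describe())
```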

Principal Interviews

Each fall, the study team will conduct telephone interviews with the principal in each of the 60 base sample schools. Collecting longitudinal data on the processes associated with SIG implementation in all 60 schools enables the study team to situate the core and focused case study schools in a larger sample and to provide necessary background data on each. In the first year of data collection, these telephone interviews will serve as a screening process for selecting the 25 core case studies. In subsequent years, the principal interview will also help the study team to identify schools to be included in the special topic case studies.

Teacher Survey

A brief Web‑based teacher survey (approximately 10 minutes response time) will be administered to all teachers in the 60 sampled SIG schools in the winter of the 2010-11, 2011-12, and 2012-13 school years. A spring supplement of similar length will also be administered annually for three years to teachers in the 25 schools that will be visited by study staff. Finally, a “special topic” supplement will be administered annually for two years to two subsets of 10 schools in which the study team will explore focused topics of policy interest. The purpose of these surveys will be (a) to collect longitudinal data from the larger set of 60 schools to inform case study analyses and the selection of focused samples, and (b) to collect data on topics for which feedback from all teachers is necessary and for which focus groups are not the optimal strategy.

Although a high response rate is expected, the study team does not anticipate a 100 percent response rate; thus, the study team will analyze whether the teachers who respond to the survey differ from the full population of teachers in each school in observable ways. The survey administration group will closely monitor response rates among subgroups of teachers and will target follow-up prompts to any subgroup with a low response rate.
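
Subgroup response-rate monitoring of this kind might look like the following sketch; the subgroup labels and the 80 percent follow-up threshold are illustrative assumptions, not study specifications.

```python
import pandas as pd

# Hypothetical teacher roster with survey completion status.
roster = pd.DataFrame({
    "teacher_id": range(8),
    "subgroup": ["math", "math", "ela", "ela", "ela", "sped", "sped", "sped"],
    "responded": [True, False, True, True, True, False, False, True],
})

rates = roster.groupby("subgroup")["responded"].mean()
print(rates)

# Flag subgroups below an illustrative 80 percent threshold for follow-up.
for subgroup, rate in rates.items():
    if rate < 0.80:
        print(f"follow up with {subgroup}: response rate {rate:.0%}")
```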

Because the survey is intended to inform understanding of each school as a “case,” most analyses will be within-school. The study team anticipates that most of the analyses will involve univariate (means, frequencies, etc.) and bivariate (comparisons of means, cross‑tabulations, etc.) approaches to provide an overview of the data and investigate changes in knowledge, perspectives, and behaviors of respondents by independent background characteristics. The study team also will analyze data from the open‑ended survey items and will develop matrices to summarize data across respondents within a given school.
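
As a brief illustration of the within-school univariate and bivariate summaries described above, the sketch below computes a mean, a frequency distribution, and a cross-tabulation for invented survey items.

```python
import pandas as pd

# Hypothetical responses from one school's teachers.
survey = pd.DataFrame({
    "years_experience": [2, 11, 5, 20, 1, 8],
    "new_to_school": [True, False, True, False, True, False],
    "uses_data_weekly": [True, True, False, True, False, True],
})

# Univariate: means and frequencies.
print(survey["years_experience"].mean())
print(survey["uses_data_weekly"].value_counts(normalize=True))

# Bivariate: cross-tabulation of data use by whether the teacher is new.
print(pd.crosstab(survey["new_to_school"], survey["uses_data_weekly"]))
```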

Supporting Statement for Paperwork Reduction Act Submission

Justification (Part A)

  1. Circumstances Making Collection of Information Necessary

The U.S. Department of Education (ED) has authorized School Improvement Grant (SIG) funds under Section 1003(g) of Title I of the Elementary and Secondary Education Act (ESEA), supplemented by funds from the American Recovery and Reinvestment Act (ARRA). In addition to receiving a large, one‑time infusion of resources, grantee schools are required by the SIG program to make dramatic changes, for example, replacing principals and teachers or transitioning away from district management (e.g., to management by a charter school operator, charter management organization, or an education management organization). Data collection is needed in order to learn how these grant funds are used and to document how the change process unfolds in chronically low‑performing schools. In addition, data are needed to inform decision‑makers in schools that are attempting unprecedented reform.

Specifically, the Study of School Turnaround (SST) is designed to:

  • document the change process in a set of chronically low performing schools receiving SIG funds;

  • study leading indicators of school turnaround; and

  • support schools undertaking actions to turn around student performance by sharing accumulating knowledge and lessons from study schools with SIG program staff and other key stakeholders.

  2. Purposes and Uses of Data

The data collection associated with the SST will be of immediate interest and significance for policymakers and practitioners, as it will provide real‑time, detailed, and policy‑relevant information on a major federal initiative. The study will offer unique insight into how the implementation of turnaround strategies unfolds, provide an in‑depth look at the change process in struggling schools, and allow program administrators to fine‑tune guidance provided to districts and schools participating in the program. The approach embraces mixed‑methods data collection, and the associated analyses will enable the study to report on the implementation of various district and state SIG policies. Ultimately, the information may be used to support other schools that endeavor to turn around student performance.

Prior studies of school reform have identified many factors that appear to be related to improved student outcomes (including, for example, data use, strong leadership, and adults taking responsibility for student learning), but they have not provided guidance on how to initiate and sustain the change process. Thus, despite decades of school improvement research, many schools in the nation remain chronically low‑performing. Because this study follows schools from the outset of the SIG grants, it will yield information on the change process as it unfolds, which will help policy makers understand how to stimulate dramatic change and provide guidance to low‑performing schools.

In addition, it is worth noting that the SIG program is designed to stimulate particularly dramatic change processes. However, research on NCLB implementation has demonstrated that relatively few schools identified as low‑performing attempt to implement the most dramatic interventions. Thus, this study will be an important source of information for policy and practice about how such dramatic interventions play out in a subset of SIG schools.

  3. Use of Technology to Reduce Burden

The study team will use a variety of information technologies to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden the evaluation places on respondents at the state, district, and school levels:

  • When possible, data will be collected from ED’s and states’ Web sites and from other Web‑based sources such as EDFacts.

  • Teacher surveys will be administered through a Web‑based platform to streamline the response process.

  • Electronic fiscal files that include information on revenues and expenditures will be requested from districts.

  • State phone interview protocols will be designed to fully explore the evaluation questions without placing undue burden on the respondents.

  • A toll‑free number and email address will be available during the data collection process to permit respondents to contact interview staff with questions or requests for assistance. The toll‑free number and email address will be included in all communication with respondents.



  4. Efforts to Identify Duplication

Whenever possible, AIR, Mathematica, and DIR will use existing data including EDFacts, state SIG applications and subgrant applications, Consolidated State Performance Reports (CSPRs) and Office of Elementary and Secondary Education (OESE) monitoring reports, federal, state, and local fiscal and payroll files, and master class or instructional schedules and class lists. This will reduce the number of questions asked in the case study interviews and focus groups, thus limiting respondent burden and minimizing duplication of previous data collection efforts and information. In addition, the contractor will seek to coordinate activities with other studies of ARRA that are to be undertaken by ED in the near future, and will avoid duplication of effort.

  5. Methods to Minimize Burden on Small Entities

No small businesses or entities will be involved as respondents.

  6. Consequences of Not Collecting Data

Failure to collect the data proposed through this study would prevent ED from gaining an ongoing, in‑depth understanding of how SIG funds are distributed from the federal government to schools and how they are being used at the state, district, and school levels. Understanding the strategies and approaches that the study schools implement and how they use SIG funds will enable federal policy makers and program managers to monitor the program and provide useful, ongoing guidance to states and districts. Furthermore, without the “real‑time” data that this study will provide, schools and districts will not receive timely feedback on how the turnaround process is evolving. This may hinder schools’ and districts’ capacity to discern which types of strategies and expenditures appear to support school turnaround.

  7. Special Circumstances

None of the special circumstances listed apply to this data collection.

  8. Federal Register Comments and Persons Consulted Outside the Agency

A 60‑day notice about the study was published in the Federal Register (Volume 75, page 20346) on April 19, 2010 to provide the opportunity for public comment. A 30‑day comment period has been provided with this collection at the same time as the OMB submittal.

To assist with the study’s complex technical and substantive issues, the study team has drawn on the experience and expertise of researchers with substantive expertise that is complementary to that of the core study team. In particular, the study team has consulted with Dr. Margaret Goertz, a researcher affiliated with the Consortium for Policy Research in Education (CPRE) at the University of Pennsylvania. In addition to decades of experience studying school improvement (often through case studies), Dr. Goertz brings expertise in fiscal data collection and analyses.

In addition, at the 2010 annual meeting of the American Educational Research Association in Denver, Colorado, the study team convened a small group of leading researchers to provide feedback on the study design, data collection strategies, and dissemination plan. Meeting attendees included Dr. Cynthia Coburn, Dr. Margaret Goertz, Dr. Andrew Porter, Dr. Jim Spillane, Dr. Joan Talbert, Dr. Brenda Turnbull, Dr. Penny Sebring, and Dr. Penny Wohlstetter.

The study team has also convened a Technical Working Group, which met in January 2011. The TWG members in attendance included Dr. Cynthia Coburn, Dr. Joan Talbert, Dr. Penny Wohlstetter, and Ms. Angelica Infante, in addition to Dr. Margaret Goertz, project consultant.

  9. Payment or Gifts

The Web-based teacher survey, which will require approximately 10 minutes to complete, will include a token incentive to ensure a high response rate. The planned incentive amount is $10 per respondent for each survey module, for a maximum total cost of $111,000. This maximum assumes that all 2,250 teachers in the 60 selected schools complete the longitudinal module in each of the three years of the study, that all 950 teachers in the 25 core case study schools complete the case study supplement survey in each of the three years of the study, and that all 750 teachers in the 20 special topic schools complete the topical module survey in each of the two years.
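
The $111,000 maximum follows directly from the counts stated above (11,100 possible survey completions at $10 each), as this quick check shows:

```python
# Survey completions x $10 incentive, using the counts stated above.
longitudinal = 2_250 * 3    # all teachers in the 60 schools, 3 annual waves
core_supplement = 950 * 3   # teachers in the 25 core case study schools, 3 waves
special_topic = 750 * 2     # teachers in the 20 special topic schools, 2 waves

completions = longitudinal + core_supplement + special_topic  # 11,100
print(10 * completions)  # 111000, i.e., the $111,000 maximum
```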

  10. Assurances of Confidentiality

The study team has established procedures to ensure the confidentiality and security of its data. This approach will be in accordance with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183, which requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to the requirements of Subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.

The study team will protect the full privacy and confidentiality of all individuals who provide data for case studies (with the exception of state officials, as described below). The study will not have data associated with personally identifiable information (PII), as study staff will assign random ID numbers to all data records and then strip any PII from the records. In addition to the data safeguards described here, the study team will ensure that no respondent names, schools, or districts are identified in publicly available reports or findings, and if necessary, the study team will mask distinguishing characteristics. A statement to this effect will be included with all requests for data:

This survey is voluntary. You can decide to not participate or discontinue participation at any time without penalty. You do not have to answer any questions you do not want to, but we encourage you to answer as many as you can. There are no known risks to participating in this survey. We will treat the information that you supply in a manner that protects your privacy, in accordance with the Education Sciences Reform Act. Only selected research staff will have access to data. We will NOT present results in any way that would permit them to be identified with you or any other specific individual. No personally identifiable information, such as your name or your district or school affiliation, will be disclosed to anyone outside the project. We will not provide any information that identifies you or your school to anyone outside of the study team, except as required by law.

The case of state‑level respondents is somewhat different: The state‑level interviews, by their very nature, focus on policy topics that are in the public domain. Moreover, it would not be difficult to identify Title I and school improvement directors in each state and thus determine the identity of the state‑level respondents. Having acknowledged that, the study team will endeavor to protect the privacy of the state‑level interviewees, and as with district‑ and school‑level respondents, the study team will avoid using their names in reports and attributing any quotes to specific individuals. The study team will primarily report on the numbers of schools that engage in specific practices, thus avoiding reference to specific schools.

Hard copy data, including focus group notes, will be kept in locked file cabinets during nonworking hours. All members of the study team with access to the data will be trained and certified on the importance of confidentiality and data security.

All electronic data will be protected using several methods. AIR and its subcontractors provide secure FTP services that allow encrypted transfer of large data files with clients when necessary. The internal networks of AIR, Mathematica, and DIR are protected from unauthorized access using defense‑in‑depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The networks are configured so that each user has a tailored set of rights, granted by the network administrator, to approved files stored on the LAN. Access to computer systems is password protected, and network passwords must be changed on a regular basis and conform to the strong password policies to which all three partner organizations adhere. All project staff assigned to tasks involving sensitive data will be required to provide specific assurances of confidentiality and obtain any clearances that may be necessary.

A11. Justification of Sensitive Questions

No questions of a sensitive nature will be included in this study.

A12. Estimates of Hour Burden

The estimated hour burden for the study’s data collections is 3,992 hours. Based on average hourly wages for participants, this corresponds to an estimated monetary cost of $118,557. Exhibit 5 summarizes the estimates of respondent burden for study activities.
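As a worked check, the category subtotals shown in Exhibit 5 sum to the figure above: 30 + 18 + 180 + 2,145 + 387 + 330 + 902 = 3,992 hours. Each row of Exhibit 5 follows the same arithmetic: the hour burden is the number of respondents times the per‑administration time estimate times the number of administrations, rounded to whole hours, and the monetary cost is that rounded burden times the hourly rate. The brief sketch below (in Python; the function name and the choice of example rows are ours, for illustration only, not part of the study’s materials) reproduces three rows of the exhibit:

    def burden_row(respondents, hours_per_administration, administrations, hourly_rate):
        # One row of Exhibit 5: total hour burden, then monetary cost of that burden.
        # Rounding to whole hours before costing matches the exhibit's figures
        # (an assumption on our part, verified against the published rows).
        total_hours = round(respondents * hours_per_administration * administrations)
        return total_hours, total_hours * hourly_rate

    print(burden_row(60, 1.0, 3, 45))      # Principal interviews: (180, 8100)
    print(burden_row(75, 0.75, 5, 29))     # Core teacher interviews: (281, 8149)
    print(burden_row(1800, 0.167, 3, 29))  # Longitudinal survey module: (902, 26158)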

The burden estimate associated with the recruitment of states, districts, and schools is 30 hours. This burden estimate includes:

  • Time associated with gaining the cooperation of 6 states;

  • Time associated with gaining the cooperation of 23 case study districts; and

  • Time associated with gaining the cooperation of 60 case study schools.

The burden estimate associated with the state interviews is 18 hours. This burden estimate includes:

  • Time for 6 state Title I directors/SIG coordinators to participate in a 1‑hour telephone interview in each of 3 annual administrations.

The burden estimate associated with the principal interviews is 180 hours. This burden estimate includes:

  • Time for 60 school principals to participate in a 1‑hour telephone interview in each of 3 annual administrations.

The burden estimate associated with the core site visits is 2,145 hours. This burden estimate includes:

  • Time for district staff across the 16 districts in which the core case studies will occur to prepare fiscal data annually for 3 years;

  • Time for 16 external support providers across those 16 districts to participate in a 1‑hour interview once in the first year of data collection and twice per year in years 2 and 3;

  • Time for 16 district staff across those 16 districts to participate in a 1‑hour interview once in the first year of data collection and twice per year in years 2 and 3;

  • Time for 16 district union representatives across those 16 districts to participate in a 1‑hour interview once per year for 3 years;

  • Time for 25 school principals in 25 schools to participate in a 1‑hour interview once per year for 3 years;

  • Time for 25 instructional coaches in 25 schools to participate in a 1‑hour interview once per year for 3 years;

  • Time for 75 teachers in 25 schools to participate in a 45‑minute interview once in the first year of data collection and twice per year in years 2 and 3;

  • Time for 150 teachers in 25 schools to participate in a 1‑hour focus group once per year for 3 years;

  • Time for 125 school improvement team members in 25 schools to participate in a 1‑hour focus group once per year for 3 years;

  • Time for 125 parents/community liaisons in 25 schools to participate in a 1‑hour focus group once per year for 3 years; and

  • Time for 78 secondary students in 13 schools to participate in a 1‑hour focus group once per year for 3 years.

The burden estimate associated with the ELL special topic site visits is 387 hours. This burden estimate includes:

  • Time for district staff in the 6 districts in which the ELL case studies will occur to prepare fiscal data annually for 2 years;

  • Time for 6 external support providers across those 6 districts to participate in a 1‑hour interview annually for 2 years;

  • Time for 6 district staff across those 6 districts to participate in a 1‑hour interview annually for 2 years;

  • Time for 10 school principals in 10 schools to participate in a 1‑hour interview once per year for 2 years;

  • Time for 30 teachers in 10 schools to participate in a 45‑minute interview once per year for 2 years;

  • Time for 60 teachers in 10 schools to participate in a 1‑hour focus group once per year for 2 years;

  • Time for 50 school improvement team members in 10 schools to participate in a 1‑hour focus group once per year for 2 years; and

  • Time for 30 secondary students in 5 schools to participate in a 1‑hour focus group once per year for 2 years.

The burden estimate associated with the second set of special topic site visits is 330 hours. This burden estimate includes:

  • Time for district staff in the 6 districts in which the special topic case studies will occur to prepare fiscal data annually for 2 years;

  • Time for 6 external support providers across those 6 districts to participate in a 1‑hour interview annually for 2 years;

  • Time for 10 school principals in 10 schools to participate in a 1‑hour interview once per year for 2 years;

  • Time for 60 teachers in 10 schools to participate in a 1‑hour focus group once per year for 2 years;

  • Time for 50 school improvement team members in 10 schools to participate in a 1‑hour focus group once per year for 2 years; and

  • Time for 30 secondary students in 5 schools to participate in a 1‑hour focus group once per year for 2 years.

The burden estimate associated with the teacher surveys is 902 hours. This burden estimate includes:

  • Time for 80 percent of the 2,250 teachers in 60 schools to respond to a 10‑minute longitudinal module survey once per year for 3 years.

Also included in Exhibit 5 are burden estimates associated with data collection instruments that have yet to be developed but are expected to be included in a future clearance package (see the rows marked with footnote 1). Please note that the burden estimates for these instruments are not included in any of the totals presented in this section; they are provided for supplemental informational purposes only. The burden estimate associated with these instruments is 827 hours, which covers the footnote‑marked district staff and teacher interviews under the Special Topic (TBD) site visits as well as the following survey modules (a worked reconciliation appears after this list):

  • Time for 80 percent of 950 teachers in 25 schools to respond to a 5‑minute case study supplement survey once per year for 3 years.

  • Time for 80 percent of 950 teachers in 25 schools to respond to a 10‑minute case study module once per year for 3 years.

  • Time for 80 percent of 375 teachers in 10 schools to respond to a 10‑minute ELL topical module supplement survey once per year for 2 years.

  • Time for 80 percent of 375 teachers in 10 schools to respond to a 10‑minute topical (TBD) module supplement survey once per year for 2 years.
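As a worked check (our reconciliation against Exhibit 5): the four survey modules above account for 189 + 381 + 100 + 100 = 770 hours, and the footnote‑marked Special Topic (TBD) district staff interviews (12 hours) and teacher interviews (45 hours) account for the remaining 57 hours, for a total of 770 + 57 = 827 hours.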





Exhibit 5. Summary of Estimates of Hour Burden

Task | Total Sample Size | Estimated Response Rate | Number of Respondents | Time Estimate (in hours) | Number of Administrations | Total Hour Burden | Hourly Rate | Estimated Monetary Cost of Burden

Recruitment of States, Districts, and Schools
Gaining cooperation of states | 6 | 100% | 6 | 0.5 | -- | 3 | $45 | $135
Gaining cooperation of case study districts | 23 | 100% | 23 | 0.5 | -- | 12 | $45 | $540
Gaining cooperation of case study schools | 60 | 100% | 60 | 0.25 | -- | 15 | $45 | $675
Total for Recruitment | -- | -- | -- | -- | -- | 30 | -- | $1,350

State Interviews
Conducting interviews | 6 | 100% | 6 | 1.0 | 3 | 18 | $45 | $810
Total for State Interviews | -- | -- | -- | -- | -- | 18 | -- | $810

Principal Interviews
Conducting interviews | 60 | 100% | 60 | 1.0 | 3 | 180 | $45 | $8,100
Total for Principal Interviews | -- | -- | -- | -- | -- | 180 | -- | $8,100

Core Site Visits
Preparing fiscal data | 16 | 100% | 16 | 1.5 | 3 | 72 | $45 | $3,240
Participating in interviews—External support providers | 16 | 100% | 16 | 1.0 | 5 | 80 | $45 | $3,600
Participating in interviews—District staff | 16 | 100% | 16 | 1.0 | 5 | 80 | $45 | $3,600
Participating in interviews—District union representative | 16 | 100% | 16 | 1.0 | 3 | 48 | $45 | $2,160
Participating in interviews—School principal | 25 | 100% | 25 | 1.0 | 3 | 75 | $45 | $3,375
Participating in interviews—Instructional coach | 25 | 100% | 25 | 1.0 | 3 | 75 | $29 | $2,175
Participating in interviews—Teachers | 75 | 100% | 75 | 0.75 | 5 | 281 | $29 | $8,149
Participating in focus groups—Teachers | 150 | 100% | 150 | 1.0 | 3 | 450 | $29 | $13,050
Participating in focus groups—School improvement team | 125 | 100% | 125 | 1.0 | 3 | 375 | $29 | $10,875
Participating in focus groups—Parent/community liaisons | 125 | 100% | 125 | 1.0 | 3 | 375 | $20 | $7,500
Participating in focus groups—Students (Secondary) | 78 | 100% | 78 | 1.0 | 3 | 234 | $15 | $3,510
Total for Core Site Visits | -- | -- | -- | -- | -- | 2,145 | -- | $61,234

ELL Special Topic Site Visits
Preparing fiscal data | 6 | 100% | 6 | 1.5 | 2 | 18 | $45 | $810
Participating in interviews—External support providers | 6 | 100% | 6 | 1.0 | 2 | 12 | $45 | $540
Participating in interviews—District staff | 6 | 100% | 6 | 1.0 | 2 | 12 | $45 | $540
Participating in interviews—School principal | 10 | 100% | 10 | 1.0 | 2 | 20 | $45 | $900
Participating in interviews—Teachers | 30 | 100% | 30 | 0.75 | 2 | 45 | $29 | $1,305
Participating in focus groups—Teachers | 60 | 100% | 60 | 1.0 | 2 | 120 | $29 | $3,480
Participating in focus groups—School improvement team | 50 | 100% | 50 | 1.0 | 2 | 100 | $29 | $2,900
Participating in focus groups—Students (Secondary) | 30 | 100% | 30 | 1.0 | 2 | 60 | $15 | $900
Total for ELL Special Topic Site Visits | -- | -- | -- | -- | -- | 387 | -- | $11,375

Special Topic (TBD) Site Visits
Preparing fiscal data | 6 | 100% | 6 | 1.5 | 2 | 18 | $45 | $810
Participating in interviews—External support providers | 6 | 100% | 6 | 1.0 | 2 | 12 | $45 | $540
Participating in interviews—District staff¹ | 6 | 100% | 6 | 1.0 | 2 | 12 | $45 | $540
Participating in interviews—School principal | 10 | 100% | 10 | 1.0 | 2 | 20 | $45 | $900
Participating in interviews—Teachers¹ | 30 | 100% | 30 | 0.75 | 2 | 45 | $29 | $1,305
Participating in focus groups—Teachers | 60 | 100% | 60 | 1.0 | 2 | 120 | $29 | $3,480
Participating in focus groups—School improvement team | 50 | 100% | 50 | 1.0 | 2 | 100 | $29 | $2,900
Participating in focus groups—Students (Secondary) | 30 | 100% | 30 | 1.0 | 2 | 60 | $15 | $900
Total for Special Topic Site Visits | -- | -- | -- | -- | -- | 330 | -- | $9,530

Teacher Survey
Conducting longitudinal module survey | 2,250 | 80% | 1,800 | 0.167 | 3 | 902 | $29 | $26,158
Conducting case study supplement survey¹ | 950 | 80% | 760 | 0.083 | 3 | 189 | $29 | $5,481
Conducting case study module survey¹ | 950 | 80% | 760 | 0.167 | 3 | 381 | $29 | $11,049
Conducting ELL topical module survey¹ | 375 | 80% | 300 | 0.167 | 2 | 100 | $29 | $2,900
Conducting topical (TBD) module survey¹ | 375 | 80% | 300 | 0.167 | 2 | 100 | $29 | $2,900
Total for Teacher Survey | -- | -- | -- | -- | -- | 902 | -- | $26,158

TOTAL | -- | -- | -- | -- | -- | 3,992 | -- | $118,557

¹ These data collection instruments have yet to be developed as their purpose is to address topics that will emerge during the first case study visits, or to explore a “special topic” that will be identified later in the study. Rows marked with this footnote are excluded from section totals and from the overall total.

A13. Estimate of Cost Burden to Respondents

There are no additional respondent costs associated with this data collection beyond the hour burden estimated in item A12.

A14. Estimate of Annual Cost to the Federal Government

The estimated cost for this study, including development of a detailed study design, data collection instruments, justification package, data collection, data analysis, and report preparation, is $6,470,074 for the four years, or approximately $1,617,518 per year.

A15. Program Changes or Adjustments

This request is for a new information collection.

A16. Plans for Tabulation and Publication of Results

Data collection for the Study of School Turnaround will begin in Fall 2010 and end in Spring 2013. Findings will be reported to ED by AIR and its partners in four substantive reports, the dissemination schedule for which is summarized in Exhibit 6.

Exhibit 6. Schedule for Dissemination of Study Results

Task Number | Deliverable | Date Due

Task 6: Baseline Data Collection and Report
6.1 | List of extant documents to be collected | Completed
6.2 | Draft report outline | Completed
-- | Revised report outline | Completed
-- | Summary of key findings | Completed
6.3 | First draft of report | 10/27/10
-- | Second draft of report | 10/29/10
-- | Third draft of report | 11/30/10
-- | Final version of report | 12/30/10

Task 7: Year 1 Data Collection and Report
7.2 | Draft report outline | 2/15/11
-- | Revised report outline | 3/15/11
-- | Summary of key findings | 5/16/11
7.3 | First draft of report | 5/31/11
-- | Second draft of report | 6/30/11
-- | Third draft of report | 7/29/11
-- | Final version of report | 9/30/11

Task 8: Year 2 Data Collection and Report
8.2 | Draft report outline | 2/15/12
-- | Revised report outline | 3/15/12
-- | Summary of key findings | 5/15/12
8.3 | First draft of report | 5/31/12
-- | Second draft of report | 6/29/12
-- | Third draft of report | 7/30/12
-- | Final version of report | 9/28/12

Task 9: Year 3 Data Collection and Report
9.2 | Draft report outline | 2/15/13
-- | Revised report outline | 3/15/13
-- | Summary of key findings | 5/15/13
9.3 | First draft of report | 5/31/13
-- | Second draft of report | 6/28/13
-- | Third draft of report | 7/30/13
-- | Final version of report | 9/30/13

The baseline report, which will be submitted in final form on December 30, 2010, will include critical background information about the SIG policy and guidance, including information on the SIG applications, state allocations of funds, the number of SIG‑awarded schools, and grant amounts.

The Year 1, Year 2, and Year 3 reports will be submitted in final form on September 30, 2011; September 28, 2012; and September 30, 2013, respectively.

For this study, the research team also will communicate and disseminate information to ED and other stakeholders through the following:

  • School profiles for all participating case study schools, produced annually

  • Annual briefings for ED staff and other stakeholders

  • Annual presentations at professional and practitioner conferences

A17. Approval to Not Display OMB Expiration Date

All data collection instruments will include the OMB expiration date.

A18. Explanation of Exceptions

No exceptions are requested.






1 The contractors for this study are the American Institutes for Research (AIR), Mathematica Policy Research, Decision Information Resources (DIR), and Education Northwest.

2 States have the option of identifying Title I eligible elementary schools that (1) are no higher achieving than the highest‑achieving school identified as a persistently lowest‑achieving school in Tier I; and that (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

3 States may also identify as Tier II schools Title I eligible secondary schools that (1) are no higher achieving than the highest‑achieving school identified as a persistently lowest‑achieving school in Tier II; or that have a graduation rate of less than 60 percent over a number of years; and that (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

4 States have the option of identifying as Tier III schools (1) Title I eligible schools that do not meet the requirements to be in Tier I or Tier II; and (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

5 Please see page 10 for a preliminary discussion of the analytic processes for this study.

6 With the exception of the state interview protocol (which will be administered to fewer than nine respondents), all instruments require clearance.

