Impact Evaluation of Race to the Top (RTT) and School Improvement Grants (SIG)

OMB: 1850-0884




Impact Evaluation of RTT and SIG

Part B

June 20, 2011


Contract Number:

ED-IES-10-C-0077

Mathematica Reference Number:

06844.062

Submitted to:

National Center for Education Evaluation

555 New Jersey Ave NW, Suite 500h

Washington, DC 20208

Project Officer: Thomas E. Wei, PhD

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Susanne James-Burdumy, PhD







CONTENTS

PART B SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

Collection of Information Requiring Statistical Methods

1. Respondent Universe and Sampling Methods

2. Statistical Methods for Sample Selection and Degree of Accuracy

3. Methods to Maximize Response Rates and Deal with Nonresponse

4. Pilot Testing

5. Individuals Consulted on Statistical Aspects of the Design


REFERENCE


APPENDIX A: STATE LETTER


APPENDIX B: STUDY INFORMATION SHEET


APPENDIX C: RECRUITING PROTOCOL WITH STATE ADMINISTRATOR


APPENDIX D: RECRUITING PROTOCOL WITH DISTRICT ADMINISTRATOR (DISTRICTS IN THE SCHOOL TURNAROUND MODEL [STM] SAMPLE)


APPENDIX E: RECRUITING PROTOCOL WITH DISTRICT ADMINISTRATOR (DISTRICTS IN THE RACE TO THE TOP [RTT] SAMPLE)


APPENDIX F: DISTRICT LETTER




TABLES

B.1 RDD Sample Size Requirements


FIGURES

B.1 Venn Diagram of Study Sample



PART B. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

This Office of Management and Budget (OMB) package requests clearance to recruit 50 states, the District of Columbia, approximately 240 school districts, and approximately 1,200 schools for inclusion in an evaluation of the Race to the Top (RTT) and School Improvement Grants (SIG) programs. The RTT-SIG evaluation will provide important information on the implementation and impacts of school turnaround efforts and educational reforms funded through these two federal grant programs. The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) has contracted with Mathematica Policy Research and its subcontractors, the American Institutes for Research and Social Policy Research Associates, to conduct the evaluation.

The RTT-SIG evaluation will include implementation and impact components. For the evaluation of RTT, the implementation component will include semi-structured interviews with state and district officials, while the impact component will be based on an interrupted time series (ITS) design. For the evaluation of RTT- and SIG-funded school turnaround models (STMs), the implementation component will include semi-structured interviews with state and district officials and a web survey of school principals. The impact evaluation of STMs will be based on a regression discontinuity design (RDD).

This OMB clearance request is the first of two for this evaluation and includes materials that will be used in the study’s recruitment process. We are submitting two clearance requests because recruitment efforts must begin before all of the study’s data collection instruments can be developed. Included in this first OMB clearance request are drafts of the state recruitment letter (Appendix A), the RTT/SIG study information sheet (Appendix B), protocols for recruitment calls and site visits (Appendices C, D, and E), and the district recruitment letter (Appendix F). We include an overview of the study’s design and eventual data collection plans for context, but they are not the focus of this request. A later request will seek clearance for activities to collect information from the states, districts, and schools included in the evaluation, and will include data collection instruments for the study.

Collection of Information Requiring Statistical Methods

1. Respondent Universe and Sampling Methods

The RTT-SIG evaluation is designed to provide a descriptive account of the implementation of RTT and SIG; the most rigorous possible estimates of the effects of RTT and SIG; and the contextual information needed to fully understand and interpret those effects. The study will be based on two samples of school districts, strategically selected both to provide information on RTT and SIG implementation and to support a rigorous analysis of program impacts. To estimate the impact of STMs on student achievement, the evaluation’s first choice is to use a rigorous RDD, exploiting approaches for awarding STM funds to schools that are based on a continuous measure. The second-choice design, which would be used if an RDD were not feasible, is an ITS design. The evaluation will also assess the correlation between turnaround models—and the specific turnaround strategies used within such models—and improvements in school outcomes. Separately, to assess the relationship between RTT and student outcomes, the evaluation will use an ITS analysis.

The study will involve two samples, one for the evaluation of STMs and one for the evaluation of RTT (see Figure B.1). The sample for the evaluation of STMs (referred to throughout as the STM sample) will consist of approximately 1,200 schools within an estimated 120 school districts across 30 states (roughly 600 schools will form the treatment group, and roughly 600 schools will form the comparison group). The districts in the STM sample will be purposefully selected based on suitability for the RDD. The sample for the evaluation of RTT (referred to throughout as the RTT sample) will include all 50 states and the District of Columbia and, within the 12 RTT-winning states and the 12 states with the highest application scores among the losing states,1 a sample of approximately 120 school districts. To be eligible for this component of the study, a district must have been identified as a “participating district” in the state’s RTT application (these applications are available both for the RTT winners and losers). From among those eligible districts we will draw our stratified random sample. We anticipate that there will be some overlap between the RTT and STM samples, based on a preliminary examination of districts that may be suitable for the RDD and states that received RTT grants.

Figure B.1. Venn Diagram of Study Sample

[Venn diagram showing the anticipated overlap between the STM sample and the RTT sample]

a. Select States, Districts, and Schools

For the STM impact and implementation components of the study, we will purposefully form a prioritized list of school districts from which up to 120 districts—and 1,200 schools within them—will be recruited. The list will only include districts that applied for STM funding. To ensure an adequate sample for the RDD analysis, we will prioritize districts (1) with a large number of schools implementing STMs and (2) with a high percentage of eligible schools implementing STMs (to reduce fuzziness of the RDD).
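
To illustrate the prioritization logic described above, the sketch below ranks a small set of hypothetical districts by the two criteria. The district names, field names, and the lexicographic ordering of the two criteria are illustrative assumptions, not the study’s actual selection procedure.

```python
# Illustrative sketch only: ranking candidate districts for the STM sample by
# (1) the number of schools implementing STMs and (2) the share of eligible
# schools implementing STMs (a higher share implies less RDD fuzziness).
# All values and field names below are hypothetical.

districts = [
    {"name": "District A", "n_stm_schools": 14, "share_eligible_implementing": 0.90},
    {"name": "District B", "n_stm_schools": 6,  "share_eligible_implementing": 0.95},
    {"name": "District C", "n_stm_schools": 14, "share_eligible_implementing": 0.60},
]

# Sort in descending order on both criteria (criterion 1 takes precedence).
prioritized = sorted(
    districts,
    key=lambda d: (d["n_stm_schools"], d["share_eligible_implementing"]),
    reverse=True,
)

for rank, district in enumerate(prioritized, start=1):
    print(rank, district["name"])
```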

The sample for the RTT implementation component of the study will include all 50 states, the District of Columbia, and 120 school districts in 24 states (the 12 states that won RTT grants and the 12 non-winning states that came closest to winning). The sampling frame for the school districts will be the school districts identified as “participating districts” on states’ RTT applications. We will draw a stratified random sample in each state, stratifying by whether districts are rural or not. If feasible, we may stratify by additional district characteristics, including the percentage of students receiving free or reduced-price lunch and the percentage of students classified as English language learners (ELL). The number of districts selected in each state will be proportional to state size, with a minimum of two districts per state. The sample for the RTT impact component will consist of all 50 states and the District of Columbia.

b. Recruit States, Districts, and Schools

(i) Recruitment for the STM Component

The recruitment plan for the STM component of the evaluation comprises two stages: (1) state recruitment and (2) district recruitment. The STM component of the evaluation may include schools implementing STMs using either SIG or RTT funds.

State Recruitment. In the first phase of recruitment, we will contact each state and the District of Columbia to introduce the study, gauge the state’s suitability for inclusion in the RDD component, and generate interest in the study. We will send introductory FedEx packages to the State Contacts for the SIG and RTT programs. Each package will include:

  • A state notification letter (Appendix A), which will explain the study’s importance, provide an overview of the study, and indicate that a member of the study team will call to provide more details. This letter will be printed on ED letterhead and signed by IES Director John Easton to underscore the study’s high-level federal support.

  • A study information sheet (Appendix B), which will provide a nontechnical description of the study, including a summary of evaluation activities, the study’s research questions, and a timeline for study activities.

The recruiter assigned to the state will follow up with the State Contacts for the SIG and RTT programs to confirm receipt of the mailing and arrange appointments with the appropriate agency staff for a telephone discussion guided by a recruiting protocol (Appendices C, D, and E). The protocol will guide recruiters in providing a nontechnical description of the study design and data collection activities, confirming the process the state used to define STM eligibility categories and to rank STM-eligible schools (for both SIG- and RTT-funded STMs), reviewing the types of student-level data maintained by the state’s data system and the state’s willingness to provide data to the study team, and securing participation in the study. The recruiter will also highlight the study’s importance and address questions or concerns.

District and School Recruitment. Based on the discussions with the states, we will determine whether the state is well suited for the RDD component of the evaluation and, if so, which districts and schools should be included in the STM sample. For these districts and schools, we will send FedEx mailings to the District Contacts for the SIG program and/or RTT program (where RTT-funded STMs are being included). These mailings will include the district notification letter (Appendix F) and the nontechnical study information sheet (Appendix B).

Recruiters will make follow-up calls to the District Contacts to introduce themselves and the study, and to arrange a time to discuss the study further. We anticipate that recruiting communications will predominantly take place by phone and e-mail, but in-person meetings will be arranged where they are deemed necessary to facilitate recruitment efforts. Recruitment discussions with districts will be guided by a modified version of the state recruitment protocol, with a greater emphasis on district and school participation. During these discussions, recruiters will review the study design and planned data collection activities (such as interviews with district administrators, principal surveys, and the collection of student-level data if the state cannot provide these data), identify the schools in the district that would be included in the STM component, and discuss what participation in the study would entail. We anticipate that the bulk of recruiting discussions will take place with the targeted district staff (and that these district staff will facilitate the participation of their schools in the study). However, we will follow up with individual schools as requested by the district.

Research Applications and MOUs. In states or districts with policies concerning external research projects, the contractor will gain the necessary approvals and abide by the relevant guidelines for conducting the study. The contractor will seek expedited research application reviews where possible and emphasize that the evaluation has received prior review by its institutional review board (IRB).

Once state and district officials make an oral commitment to participate in the study, they will be sent a memorandum of understanding (MOU) to sign; the MOU will also be signed by a representative of the evaluation contractor. The MOU will describe the agreed-upon roles and responsibilities of the study team and of participating states, districts, and schools.

Recruitment Training and Tracking. Each state and district will be assigned one primary recruiter to facilitate rapport between the recruiter and state and district officials and their support staff. Recruiters will participate in a one-and-a-half-day training session covering the following topics: a study overview; a review of RDD requirements; recruitment procedures and timeline; effective, nontechnical communication of the study’s needs, importance, and methodological issues; handling issues that might arise; and task management (such as the schedule, weekly meetings, and use of the tracking system). Trainers will also share key strategies based on the contractor’s extensive experience recruiting districts and schools for IES studies.

All contacts will be documented in an electronic tracking system to minimize redundant contacts and to provide the Contracting Officer’s Representative (COR) with up-to-date information on recruiting progress. The web-based system will be accessible to recruiters from Mathematica and its subcontractors.

(ii) Recruitment for the RTT Component

No separate state-level recruitment contacts are planned for the RTT component of the evaluation. Instead, as part of the STM state recruitment contacts described above (after gathering the information needed to assess suitability for the RDD study component), we will discuss with state administrators the evaluation’s plan to gather information about their implementation of RTT-related reforms and solicit their cooperation with these activities.

RTT-specific recruitment calls are planned for school districts in the RTT sample. The goals of these calls are to (1) provide basic information about the study; (2) obtain buy-in for planned data collection; and (3) identify the appropriate respondents for the RTT district interviews, according to area of expertise. Protocols are included in Appendices C, D, and E.

2. Statistical Methods for Sample Selection and Degree of Accuracy

Our goal is to include a minimum of 1,200 schools in the evaluation’s STM impact analysis. This sample should make it possible to detect impacts between 0.10 and 0.15 standard deviations with high probability if the RDD has a moderate amount of fuzziness and 50 percent of schools are included in the bandwidth. Power calculations and other details are shown below.

a. Statistical Methodology for Stratification and Sample Selection

We will not draw a random sample of districts for the STM component of the study. Rather, states, districts, and schools will be selected purposefully to provide information on implementation and to support a rigorous analysis of program impacts.

For the RTT implementation component of the study, we will randomly select school districts across pre-determined strata in 24 states (the 12 states that won RTT grants and the 12 non-winning states that came closest to winning). The sampling frame for the school districts will be the school districts identified as “participating districts” on states’ RTT applications. We will draw a stratified random sample in each state, stratifying by whether districts are rural or not. If feasible, we may stratify by additional district characteristics, including the percentage of students receiving free or reduced-price lunch and the percentage of students classified as ELL. The number of districts selected in each state will be proportional to state size, with a minimum of two districts per state.
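
As a rough illustration of the allocation just described, the sketch below assigns districts to states in proportion to state size with a floor of two districts per state, and then draws a stratified random sample within one state by rural status. The state names, district counts, and sample total are hypothetical; a production implementation would also rebalance the allocation so that it sums exactly to the target.

```python
import random

# Minimal sketch with hypothetical inputs: allocate districts to states in
# proportion to state size (number of "participating districts"), with a
# floor of two districts per state, then draw a stratified random sample
# within each state by rural status.

random.seed(20110620)  # for a reproducible illustration

state_sizes = {"State A": 300, "State B": 120, "State C": 30}  # hypothetical
districts_to_sample = 24                                       # hypothetical total

total_size = sum(state_sizes.values())
allocation = {
    state: max(2, round(districts_to_sample * size / total_size))
    for state, size in state_sizes.items()
}
# Rounding plus the two-district floor can make the allocation sum differ
# slightly from the target; a real implementation would rebalance.

# Hypothetical frame for one state: (district id, is_rural) pairs.
frame = [("district_%03d" % i, i % 3 == 0) for i in range(state_sizes["State C"])]

def stratified_sample(frame, n):
    """Sample n districts, allocating across rural/non-rural strata proportionally."""
    rural = [d for d, is_rural in frame if is_rural]
    non_rural = [d for d, is_rural in frame if not is_rural]
    n_rural = min(len(rural), max(1, round(n * len(rural) / len(frame))))
    return random.sample(rural, n_rural) + random.sample(non_rural, n - n_rural)

print(allocation)
print(stratified_sample(frame, allocation["State C"]))
```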

b. Estimation Procedures

There are no estimation, tabulation, or publication plans based on this package because no data are being collected.

c. Degree of Accuracy Needed

Achieving a minimum detectable effect (MDE) close to 0.10 using an RDD should be possible with a sample of 1,200 schools (600 in the treatment group and 600 in the comparison group). In calculating the MDE for the study’s RDD component, one must account for (1) the correlation between the assignment variable and the treatment variable, (2) clustering of students within schools, (3) fuzziness in the RDD, and (4) the sample size reduction that results from selecting an optimal bandwidth for estimating RD impacts.

Table B.1 shows the study’s sample size requirements under different assumptions about key design parameters. In a preliminary review of state SIG applications and awards to assess potential fuzziness, we found that a sample size of approximately 1,200 schools may be attainable within states that meet a maximum fuzziness threshold. Specifically, we measured the difference in the proportion of schools receiving awards between the treatment group (Tier I and II eligible schools) and the comparison group (Tier III eligible schools); larger differences in this proportion correspond to lower levels of fuzziness. Based on an analysis of the relationship between fuzziness and finite sample bias, we determined that the lowest acceptable value of this difference for the study is 40 percent. In states where the difference is at least 40 percent, the average difference in the proportion of schools receiving awards between the treatment and comparison groups is 73 percent (a “moderate” degree of RD fuzziness in Table B.1). In addition, in the evaluation of supplemental educational services (SES), which also used an RDD, the optimal bandwidth typically excluded about half of the analysis sample.
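
To make the screening metric concrete, the sketch below computes the award-rate difference for one hypothetical state; the tier counts and award counts are invented solely to illustrate the calculation and the 40 percent threshold.

```python
# Illustrative sketch (hypothetical data): the state-screening metric described
# above -- the difference between the share of Tier I/II eligible schools that
# received SIG awards and the share of Tier III eligible schools that did.

def award_rate_difference(schools):
    """schools: list of dicts with 'tier' (1, 2, or 3) and 'received_award' (bool)."""
    treatment = [s for s in schools if s["tier"] in (1, 2)]
    comparison = [s for s in schools if s["tier"] == 3]
    p_treatment = sum(s["received_award"] for s in treatment) / len(treatment)
    p_comparison = sum(s["received_award"] for s in comparison) / len(comparison)
    return p_treatment - p_comparison

# Hypothetical state: 20 Tier I/II schools (16 funded) and 40 Tier III schools (4 funded).
schools = (
    [{"tier": 1, "received_award": i < 16} for i in range(20)]
    + [{"tier": 3, "received_award": i < 4} for i in range(40)]
)

difference = award_rate_difference(schools)
print(f"Award-rate difference: {difference:.0%}")       # 0.80 - 0.10 = 70%
print("Meets 40% screening threshold:", difference >= 0.40)
```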

Table B.1. RDD Sample Size Requirements

                                Proportion of Treatment Group Implementing STM (Fuzziness)
Proportion of Schools           100%                 80%                   70%
Included in RDD Bandwidth       (No Fuzziness)       (Light Fuzziness)     (Moderate Fuzziness)

Number of Schools Needed for an MDE of 0.10
  100%                          400                  625                   825
  75%                           550                  850                   1,100
  50%                           800                  1,250                 1,650

Number of Schools Needed for an MDE of 0.15
  100%                          185                  285                   370
  75%                           245                  375                   500
  50%                           360                  575                   750


Note: The numbers in the table represent the number of schools needed to achieve the MDE target shown in each panel heading (0.10 or 0.15). The MDEs are expressed in effect size units and were calculated assuming (1) a two-tailed test; (2) a 5 percent significance level; (3) an 80 percent level of power; (4) a reduction in variance of 40 percent at the student level and 70 percent at the school level due to the use of regression models to estimate impacts; (5) an intra-class correlation of 0.15; and (6) an RD assignment variable that follows the normal distribution. The table entries were calculated using the following formula:

\[
\mathrm{MDE} \;=\; \frac{fct\,\sqrt{RD}}{\pi}\,\sqrt{\frac{\rho\,(1-R_{S}^{2})}{P_{SC}(1-P_{SC})\,(s_{T}+s_{C})} \;+\; \frac{(1-\rho)\,(1-R_{I}^{2})}{P_{SC}(1-P_{SC})\,(n_{T}+n_{C})}}
\]

where fct is the sum of two critical values (corresponding to \(\alpha/2\) and \(1-\beta\)) from the T-distribution with df degrees of freedom; RD is the regression discontinuity design effect (Schochet 2008), whose value depends on the PDF, the probability density function of the assignment variable used to determine participation in the RD design (assumed normal); \(\pi\) is the difference in participation rates between students below and above the RD cutoff (the degree of fuzziness in the design); \(\rho\) is the intra-class correlation; \(R_{S}^{2}\) and \(R_{I}^{2}\) are the school- and student-level variance reductions from the regression models; \(P_{SC}\) is the proportion of students in the control group (assumed to be 50 percent); and \(s_{T}\), \(s_{C}\), \(n_{T}\), and \(n_{C}\) are the number of schools and students in the treatment and control groups (we assume 200 students per school). We assume no schools in the comparison group implement STMs.
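
As a cross-check on Table B.1, the sketch below implements the MDE formula above under the assumptions listed in the note. The specific constants used for fct (about 2.80) and for the RD design effect (about 2.75 for a normal assignment variable with the cutoff at its mean) reflect our reading of Schochet (2008) and should be treated as approximations rather than the study’s exact calculation.

```python
import math

# Sketch implementing the MDE formula above under the Note's assumptions.
# The design effect (~2.75) and fct (~2.80 = 1.96 + 0.84 for a two-tailed
# 5 percent test at 80 percent power, large df) are approximations based on
# Schochet (2008), not the study's exact constants.

FCT = 1.96 + 0.84          # sum of critical values, large-sample approximation
RD_DESIGN_EFFECT = 2.75    # RDD variance inflation, normal assignment variable
ICC = 0.15                 # intra-class correlation
R2_SCHOOL = 0.70           # variance reduction at the school level
R2_STUDENT = 0.40          # variance reduction at the student level
P_SC = 0.50                # proportion of students in the control group
STUDENTS_PER_SCHOOL = 200

def mde(total_schools, fuzziness, bandwidth_share):
    """Minimum detectable effect for a given recruited sample of schools."""
    schools_in_bandwidth = total_schools * bandwidth_share
    students_in_bandwidth = schools_in_bandwidth * STUDENTS_PER_SCHOOL
    variance = (
        ICC * (1 - R2_SCHOOL) / (P_SC * (1 - P_SC) * schools_in_bandwidth)
        + (1 - ICC) * (1 - R2_STUDENT) / (P_SC * (1 - P_SC) * students_in_bandwidth)
    )
    return FCT * math.sqrt(RD_DESIGN_EFFECT) * math.sqrt(variance) / fuzziness

# Spot-checks: 1,200 recruited schools with moderate fuzziness and half the
# sample inside the bandwidth yields roughly 0.12 (within the 0.10-0.15 range
# cited in Section 2); the two table cells below come out near 0.10 and 0.15.
print(round(mde(total_schools=1200, fuzziness=0.70, bandwidth_share=0.50), 3))  # ~0.12
print(round(mde(total_schools=400, fuzziness=1.00, bandwidth_share=1.00), 3))   # ~0.10
print(round(mde(total_schools=185, fuzziness=1.00, bandwidth_share=1.00), 3))   # ~0.15
```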

3. Methods to Maximize Response Rates and Deal with Nonresponse

We expect that states and districts will readily engage in recruiting discussions about participation in the RTT-SIG evaluation, with little nonresponse, given their active pursuit of RTT and SIG grants. The expectations for reporting data on program performance and for participating in evaluations conducted for the Secretary of Education are explicitly stated in the RTT and SIG grant application forms (see page 19 of the SIG application [http://www2.ed.gov/programs/sif/application.doc] and pages 5, 14, and 96 of the RTT application [http://www2.ed.gov/programs/racetothetop/phase2-application.doc]).

Cooperation with evaluation efforts will be further encouraged by a clear description of the study design presented to states and school districts, and by the data collection’s heavy reliance on administrative data, which avoids any direct data collection from students or intrusion on classroom instructional time.

4. Pilot Testing

We will not conduct any pilot testing for the recruitment phase of the RTT-SIG evaluation.

5. Individuals Consulted on Statistical Aspects of the Design

The following individuals were consulted on the statistical aspects of this study:

Name                       Title                                            Telephone
Susanne James-Burdumy      Associate Director of Research, Mathematica      609-275-2248
John Deke                  Senior Researcher, Mathematica                   609-275-2230
Irma Perez-Johnson         Senior Researcher, Mathematica                   609-275-2339
Lisa Dragoset              Researcher, Mathematica                          609-945-3348



The following individuals will be responsible for data collection and analysis:

Name                       Title                                            Telephone
Susanne James-Burdumy      Associate Director of Research, Mathematica      609-275-2248
John Deke                  Senior Researcher, Mathematica                   609-275-2230
Rebecca Herman             Managing Research Analyst, AIR                   202-403-5449
Irma Perez-Johnson         Senior Researcher, Mathematica                   609-275-2339
Nancy Carey                Senior Survey Researcher, Mathematica            202-264-3483



REFERENCE

Schochet, Peter Z. “Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations.” NCEE 2008-4026. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, 2008.






1 ED recently announced that the nine States that were closest to winning RTT Phase 2 grants are eligible to compete for $200 million in additional funds. To compete, States will propose specific parts of their Phase 2 plans that they would implement with the new funds. While the new funding might have implications for interpretation and analysis, we do not currently see a need to change the study’s design or sampling plans. When we know which states win these additional funds and exactly what they intend to use them for, we will reassess our design and analysis plans.

