OMB: 1850-0858

Contract No.: ED-01-CO-0038-0009

MPR Reference No.: 6413-140





Feasibility and Conduct of an Impact Evaluation of Title I Supplemental Education Services


Part A: Supporting Statement for Request for OMB Approval of Data Collection Instruments


August 20, 2008




















Submitted to:


Institute of Education Sciences

U.S. Department of Education

80 F Street, NW

Room 308D

Washington, DC 20208


Project Officer:

Audrey Pendleton


Submitted by:


Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005


Project Director:

Brian Gill, Ph.D.



CONTENTS

SUPPORTING STATEMENT REQUEST FOR OMB APPROVAL OF DATA COLLECTION INSTRUMENTS


A. JUSTIFICATION


1. Circumstances Necessitating Collection of Information

2. How, by Whom, and for What Purpose Information Is To Be Used

3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques

4. Efforts to Avoid Duplication of Effort

5. Sensitivity to Burden on Small Entities

6. Consequences to Federal Program or Policy Activities if the Collection Is Not Conducted or Is Conducted Less Frequently than Proposed

7. Special Circumstances

8. Federal Register Announcement and Consultation

9. Payment or Gift to Respondents

10. Confidentiality of the Data

11. Additional Justification for Sensitive Questions

12. Estimates of Hour Burden

13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers

14. Estimates of Annualized Cost to the Federal Government

15. Reasons for Program Changes or Adjustments

16. Tabulation, Publication Plans, and Time Schedules

17. Approval Not to Display the Expiration Date for OMB Approval

18. Exception to the Certification Statement


REFERENCES



APPENDIX A: CONFIDENTIALITY PLEDGE

SUPPORTING STATEMENT
REQUEST FOR OMB APPROVAL OF DATA COLLECTION INSTRUMENTS

A. JUSTIFICATION

Collection of information is needed to support a rigorous evaluation of supplemental educational services (SES) for the U.S. Department of Education (ED). The No Child Left Behind Act (NCLB) requires school districts to offer SES to students who attend schools that have failed to make adequate yearly progress (AYP) for three years. SES are tutoring or other academic support services offered outside the regular school day by state-approved providers, free of charge to eligible students. Parents can choose the specific SES provider from among a list of providers approved to serve their area. This evaluation is authorized under the No Child Left Behind Act of 2001, Section 1501 (PL No 107-110).


Mathematica Policy Research (MPR) is working with ED to design and conduct a rigorous evaluation of SES based on a regression discontinuity (RD) design in up to 12 districts. The primary research questions to be answered by the evaluation are: (1) what is the effect of SES on student achievement? and (2) how does the effect of SES vary by provider type? MPR will assess the impact of SES by comparing a treatment group and a control group of students, where the two groups are formed based on a measure of prior achievement (such as a test score or grade point average). Valid estimates of the effect of SES can be obtained by comparing the average reading and math scores of students who were accepted into SES to the average scores of students who were not accepted, after regression adjusting for the measure of prior achievement used to determine acceptance; this comparison is the defining feature of an RD design. MPR will assess how impacts vary by provider type by calculating provider-specific impacts and then relating those impacts to provider type, as measured using a survey of SES providers.


We are requesting OMB approval for the regression discontinuity design and for baseline data collection activities. The evaluation consists of three phases: recruitment, baseline data collection, and outcome data collection. The recruitment phase includes assessing feasibility and recruiting up to 12 districts. The baseline data collection phase includes collecting parents’ choices of providers from districts; we have learned that districts typically ask for parents’ provider preferences on the SES parent application form. OMB approval for the outcome data collection phase, which will take place in spring 2009, will be requested in the fall of 2008. That phase will include collection of information from SES providers and collection of school records.



1. Circumstances Necessitating Collection of Information

Below we describe the need for a rigorous evaluation of SES, the research questions that an impact evaluation would answer, the rationale for using an RD design, and the data collection activities of the study.

a. Statement of Need for a Rigorous Evaluation of SES Using an RD Design


The No Child Left Behind Act (NCLB) requires districts with Title I schools that fall short of state standards for three consecutive years to offer SES to their students from low-income families. Hundreds of thousands of students participate in SES, but so far little systematic information is available on the effectiveness of SES in promoting student achievement or on the operational characteristics of effective SES providers. During the 2006–2007 school year, 529,627 students participated in SES nationwide. The potential market for SES is substantially larger: participating students constituted only 14.5 percent of the 3,645,665 students eligible to receive SES in 2006–2007 (U.S. Department of Education, Annual Performance Reports). The few studies that have examined the relationship between participation in supplemental services and student achievement have relied on non-experimental designs (for example, Zimmer et al. 2007). Additional research using more rigorous methods that permit more definitive causal inference is needed to assess the achievement impacts of SES.


We will assess the impact of SES by comparing a treatment group and a control group of students, where the two groups are formed based on a measure of prior achievement. Valid estimates of the effect of SES can be obtained by comparing the average reading and math scores of students who were accepted into SES to the average scores of students who were not accepted, after regression adjusting for the measure of prior achievement used to determine acceptance; this comparison is the defining feature of an RD design.


A randomized experimental evaluation of SES is precluded by NCLB, which requires that all eligible students who request services receive them, as long as resources are available. Although a randomized design is precluded by statute, NCLB’s rules about the allocation of services when resources are constrained create the opportunity for an RD analysis that will allow causal inferences with rigor approaching that of a randomized experiment. Under the RD design, we will evaluate SES in districts that, because of funding constraints, cannot serve all students eligible for SES services and that must ration services on the basis of a quantifiable, continuous score (such as an achievement test score). Eligible applicants with scores below a preselected, fixed cutoff score will be offered SES services (treatment students), whereas eligible applicants with scores above the cutoff value (control students) will not. Unbiased estimates of the impacts of SES services can then be obtained by comparing the outcomes of eligible applicants below and above the cutoff value, after adjusting for baseline assignment scores. RD is generally considered one of the strongest quasi-experimental designs available to researchers for purposes of causal inference (see, for example, Shadish et al. 2002), and it is the only methodology other than random assignment to fully meet the standards of ED’s What Works Clearinghouse (http://ies.ed.gov/ncee/wwc/overview/review.asp?ag=pi). A detailed description of the RD design is provided in Section A.16.
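
In notation, the comparison described above corresponds to an estimating equation of the following general form (a minimal sketch; the symbols are ours for illustration, and functional-form choices are discussed in Section A.16):

    $$Y_i = \alpha + \tau D_i + f(S_i - c) + \varepsilon_i, \qquad D_i = \mathbf{1}\{S_i \le c\},$$

where $Y_i$ is student $i$'s subsequent achievement score, $S_i$ is the assignment score, $c$ is the cutoff, $f(\cdot)$ is a smooth function (linear in the simplest case), and $\tau$ is the impact of the offer of SES for students at the cutoff.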


Evaluation of Title I programs and services is authorized in the No Child Left Behind Act, Title I, Part E, Section 1501. Although there is no federal requirement or legislation specifically requiring an evaluation of the SES program, findings of the current study will not only inform national policy discussions about SES but also provide direct feedback to participating districts about the effectiveness of the SES offered in their districts.



b. Research Questions for the Full Evaluation


The contractor will examine the following research questions in conducting a rigorous evaluation of SES:



1. What is the impact of participation in Title I Supplemental Educational Services on student achievement in reading and mathematics? SES represents a considerable investment of resources with the specific purpose of improving the academic achievement of students attending schools that are failing to make AYP. This question focuses on whether SES as a whole are achieving that purpose.

2. Are district characteristics and practices, SES provider characteristics and services, and student characteristics related to the impact on student achievement? Of particular interest to ED is whether specific provider types are more effective than others. Dimensions along which providers might vary include substantive focus (for example, math or reading), intensity (for example, frequency of student attendance), and method of delivery (for example, small group activities, one-on-one tutoring, or the use of computer technology). Relevant student subgroups include those defined by prior achievement and by whether students are served by their parents’ first choice of provider.


c. Overview of the Design and Feasibility of the Study


Power calculations for an evaluation based on an RD design in up to 12 districts indicate that a total sample of 50,000 students will be sufficient to answer the study’s research questions.1 We anticipate that 12 districts, drawn from a list of 24 provided by the Office of Innovation and Improvement (OII), will yield the needed student sample. The 12 districts would be split evenly between Florida districts and districts in other states. Florida has enacted rules requiring districts to document their SES recruitment efforts when applying to use Title I funds set aside for SES for other purposes. OII has provided the names of school districts that are oversubscribed for SES during the current school year (see Table A.1).


The feasibility of the evaluation will be assessed through informal conversations with school district officials in 9 of these districts. Feasibility will depend on the likelihood of oversubscription and on whether districts allocate services based on quantifiable measures of prior student achievement (or similar assignment variables), as required to evaluate SES using an RD design. We will also assess whether the districts would be willing to participate in a rigorous evaluation of SES. Specifically, we will assess the following:



  • The Number of Oversubscribed Districts That Use Quantifiable Information to Assign Students to Services. To be included in the study, school districts must not only be oversubscribed but also allocate admission to SES using a continuous, quantifiable measure of students’ prior achievement.

  • The Number of Students Participating in SES in the Oversubscribed Districts and the Rates of Oversubscription. Our goal is to include about 50,000 SES applicants in the study, with enough oversubscription such that at least 10 percent of the 50,000 applicants can be included in the control group (statistical power calculations supporting these goals are presented in Section B).

  • Whether Districts Are Able to Provide Students’ Scores on District and State Tests That Can Be Used As the Study’s Outcome Measure. Because NCLB requires districts to test students beginning in third grade, we anticipate that these data will be available in most districts.

TABLE A.1


SCHOOL DISTRICTS FROM WHICH FEASIBILITY WILL BE ASSESSED
AND SAMPLE WILL BE RECRUITED


District Name                           State
Albuquerque Public Schools              New Mexico
Baltimore City Public Schools           Maryland
Bay County School District              Florida
Brevard County School District          Florida
Bridgeport                              Connecticut
City of Chicago SD 299                  Illinois
Collier County School District          Florida
Dade County School District             Florida
Denver County 1                         Colorado
Flagler County                          Florida
Gadsden County                          Florida
Hillsborough County School District     Florida
Indianapolis Public Schools             Indiana
Lee County School District              Florida
Leon County School District             Florida
Little Rock School District             Arkansas
Los Angeles Unified                     California
Oakland Unified                         California
Osceola County School District          Florida
Palm Beach County School District       Florida
Pinellas County School District         Florida
Polk County School District             Florida
Sacramento City Unified                 California
San Francisco Unified                   California



d. Rationale for a Regression Discontinuity Design


The RD design is best suited for an evaluation of SES given the requirements of NCLB. NCLB requires districts with eligible students to make available the equivalent of up to 20 percent of their total Title I funds for a combination of SES and transportation for students using NCLB’s school-choice option. Total required expenditures for SES are therefore capped at 20 percent or less, with the specific cap depending on the amount the district is spending to transport students using the school-choice option. When the number of students participating in SES is large enough that costs reach this cap, NCLB permits districts to ration services, giving priority to the lowest-achieving eligible students who apply to participate. In districts that are oversubscribed for SES and that assign eligible applicants to services according to a score based on prior achievement, it should therefore be possible to analyze the impact of services using an RD analysis. This analysis would examine whether there is a discontinuity in the relationship between the assignment score (prior achievement) and the outcome (subsequent achievement) at the prior achievement level used as the cutoff for assignment to services. Additional details of the RD design are described in Section A.16.



e. Structure of the Data Collection Effort

To help ED address the study research questions, the contractor will collect and analyze data from several sources. Clearance is currently being requested for the collection of baseline data from district records of parents’ applications for their children to participate in the SES program; these records will be used to link applicants to providers in order to answer the second research question. Most district application forms already ask for parents’ first, second, and third choices of SES provider.


ED will request OMB clearance to collect outcome data in an addendum to the current OMB package, including: (1) an SES provider survey (which will allow the contractor to assess provider characteristics that can then be linked to impacts) and (2) the collection of student records/district test scores in Spring/Summer 2009 (the main outcome for the evaluation). Table A.2 shows the schedule of these data collection activities.



f. Data Collection Activities for which OMB Clearance Is Being Requested


Collection of SES Application Data


The contractor will gather information from school districts on parents’ preferred providers listed as part of the SES application process in the fall of 2008. The district application forms typically ask parents to provide the names of their first, second, and third choice of SES providers. We will ask districts to record this data from SES applications and submit information on parents’ preferred providers in an electronic file to the contractor.





TABLE A.2


DATA COLLECTION SCHEDULE

Baseline Data Collection, Fall 2008
  Activity: Collect SES application data (50,000 records from up to 12 districts)
  Respondent: School districts
  Clearance: Requested in current package

Outcome Data Collection, Spring 2009
  Activity: SES provider survey (300 providers at most)
  Respondent: SES providers
  Clearance: To be requested in addendum

Outcome Data Collection, Summer/Fall 2009
  Activity: Obtain student records/district test scores (50,000 records)
  Respondent: District/school staff
  Clearance: To be requested in addendum



We anticipate that 50,000 parents will complete the SES application and provide information about preferred providers (a 95 percent response rate among the approximately 52,600 parents who fill out SES applications).



ED will request OMB clearance in an addendum to the current OMB package for the collection of schools’ Adequate Yearly Progress (AYP) status information, an SES provider survey, and school records collection.


Collection of School AYP Status Information


In spring 2009 we will collect data on whether each school represented in the study met AYP targets for each student subgroup on the basis of spring 2008 testing. We expect this information will be readily available on state department of education websites, so districts and schools will experience no burden in the collection of these data. These data will be merged with our student-level data to enable us to conduct subgroup analyses on students who are members of groups whose schools missed AYP targets.



SES Provider Survey


In spring 2009, the contractor will use two instruments to collect information from SES providers. OMB clearance for these forms will be requested in an addendum in the fall of 2008. The first will be a mail questionnaire, with telephone follow-up, focused on provider characteristics (for example, type and size of organization, years in existence); staff characteristics (gender, ethnicity, prior teaching experience, current certification, employment in study district); services provided (type, frequency, delivery methods); and characteristics of all the students they serve, not just those in the study. For efficiency, reliability, and comparability, the contractor will, to the extent possible, reuse questions from instruments that have already received OMB clearance on other ED projects.


Second, the contractor will ask each SES provider for a list of the students receiving SES services in each of the up to 12 districts, along with information on the type and amount of services provided to each participating student.



Collection of Demographic Data and Student Achievement Scores


During the summer/early fall of 2009, the contractor will collect scores from tests administered by the state or district in school years 2006–2007, 2007–2008, and 2008–2009. The demographic and other student-level information we will collect from the districts includes grade level, month and year of birth, race/ethnicity, English proficiency, disability status, eligibility for free or reduced-price school lunch, student grades, and attendance. We anticipate that we will be able to obtain school records for 50,000 students.



2. How, by Whom, and for What Purpose Information Is To Be Used

The information collected will support an evaluation of the impact of Title I SES on the reading and mathematics achievement of third- through eighth-grade students. The data will also be useful to state and local policymakers, districts and schools, and parents, and will inform policy decisions about the approval and funding of SES providers. Specifically, data collection efforts will be used in the following ways:


SES Application Data. The contractor will use data from parents’ SES district applications to identify parents’ preferred SES providers. Districts will then record the data and submit information in an electronic file to the contractor. By identifying parents’ preferred SES providers before the RD cutoff is determined, we can assess whether SES has a greater impact for students who get their first choice provider.


SES Provider Survey. When combined with information from SES application forms, data from the SES provider survey will allow ED to relate provider effectiveness to provider characteristics and practices. The contractor will also collect participation and attendance records for students who attend SES or other district-provided after school programs. This information will allow ED to describe the counterfactual experiences of students in the control group and calculate the effect of participating in SES (as opposed to the effect of offering SES—see Section A.16). The 25 largest providers in each district will be asked to participate in the provider survey (we expect fewer than 25 providers in most districts).


Student Records and District Test Scores. The main outcome for this study will be students’ scores on districts’ existing tests. These tests are most relevant to districts because they are the tests used for accountability purposes. Earlier test scores or other continuous measures from student records (GPA, attendance) can also be used as the RD cutoff variable, and to improve the precision of impact estimates. These records will be collected for all 50,000 students in the study.



3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing electronic data sources, such as program and school records. Since some school districts and SES providers may not use electronic means of collecting and storing data, the contractor expects to receive some data from reporting forms or preexisting documents. To avoid burdening the school districts and SES providers, the contractor will offer them the option of delivering participation data electronically, filling out a straightforward reporting form manually, or submitting hard-copy documents that already exist. Minimizing evaluation costs and reducing respondent burden were key considerations in the decision to collect preferred provider information via existing application forms as opposed to administering new parent surveys.

4. Efforts to Avoid Duplication of Effort

This effort will yield unique data to evaluate the impact of Title I SES. There are no similar evaluations being conducted and there is no alternative source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information for this study and particularly the reliability of the information and efficiency in gathering the information. The data collection plan avoids unnecessary collection of information from multiple sources.



5. Sensitivity to Burden on Small Entities

All data collection will be coordinated by the evaluation contractor so as to minimize burden on school and district staff and SES providers. The primary entities for this study are schools and the districts to which they belong, along with providers of SES and non-SES after school providers. Burden is reduced for all respondents by requesting only the minimum information required to meet the study objectives. The burden on schools, districts, and providers has been minimized by carefully specifying information needs, restricting questions to generally available information, and designing the data collection strategy to minimize burden on respondents.



6. Consequences to Federal Program or Policy Activities if the Collection Is Not Conducted or Is Conducted Less Frequently than Proposed

In the absence of the impact evaluation, ED will not be able to assess the impacts of SES on student achievement. The data collection plan calls for the minimum amount of data needed to measure differences in student achievement based on SES provider. The collection of SES application data will be a one-time collection.



7. Special Circumstances

There are no special circumstances involved with this data collection.

8. Federal Register Announcement and Consultation

A request for comment on the proposed data collection activities and instruments was published in the Federal Register on March 6, 2008 (73 FR 12149).


a. Comments

One public comment was received. The commenter objected to the planned fall 2008 baseline achievement test for third graders as interfering with parents’ choice of providers and taking time away from instruction. The baseline test would have been administered to third graders across all providers and would have taken place outside of school and SES instruction time. Nonetheless, ED has decided that sufficient data are available without administering a separate test, and will not test third graders as planned.


b. Consultations Outside the Agency

During preparation of the study design and data collection plan for this evaluation, professional counsel is being sought from a number of people. Input is being solicited from a broad range of researchers, most of whom are members of the Technical Working Group under contract to design the impact evaluation.


These individuals are:

Ron Zimmer, RAND

Steven Ross, University of Memphis

Drew Gitomer, Educational Testing Service

Jeffrey Smith, University of Michigan

Thomas Cook, Northwestern University

Robert Linn, University of Colorado

Erica Harris, Chicago School District

Tim Silva, Mathematica Policy Research

Peter Schochet, Mathematica Policy Research

c. Unresolved Issues

None.



9. Payment or Gift to Respondents

Parents of students in the study will be asked only to fill out the SES application form, as required by districts. We will not offer payments or gifts to these students or parents.



10. Confidentiality of the Data

All data collection activities will be conducted in full compliance with ED regulations, which require maintaining the confidentiality of data obtained on private persons and protecting the rights and welfare of human research subjects.


The contractor will follow the new policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of Section 552 of Title 5, United States Code, the confidentiality standards of Subsection (c) of this section, and Sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.


In addition, the contractor will ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools shall remain confidential in accordance with Section 552a of Title 5, United States Code, the confidentiality standards of Subsection (c) of this section, and Sections 444 and 445 of the General Education Provision Act.


Subsection (c) of Section 183 referenced above requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.”


Subsection (d) of Section 183 prohibits disclosure of individually identifiable information and makes the publishing or communicating of such information by employees or staff a felony.


The contractor will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their program, district, and school but not to any individually identifiable information. No individually identifiable information will be maintained by the study team. All staff who have access to respondents, schools, or data will: (1) sign a Confidentiality Pledge (Appendix A) and (2) obtain security clearance through NCEE’s security clearance officer.


In addition, the following verbatim language will appear on all letters, brochures, and other study materials:



Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.


ED is also in the process of preparing a System of Records Notice.



11. Additional Justification for Sensitive Questions

There are no questions of a sensitive nature included in data collection instruments or procedures. Participation in the study is voluntary and all data collection activities will be conducted in full compliance with ED regulations.



12. Estimates of Hour Burden

Table A.4 provides an estimate of time burden. The average annual burden for this data collection effort is 2,000 hours (16,667 records per year at 0.12 hours per record), annualized over the three-year clearance period. School districts will collect choice-of-provider information from parents as part of the district SES application form. Districts will then record the data and submit the information in an electronic file to the contractor.



TABLE A.4

AVERAGE ANNUAL BURDEN TO RESPONDENTS IN HOURS


Record information on parents’ preferred providers*
  Average annual number of respondents: 16,667
  Number of responses per respondent: 1
  Average burden hours per respondent: 0.12
  Total average annual burden hours: 2,000

Estimated total average annual burden hours: 2,000

* It is assumed that approximately 4,167 parents in each of the 12 school districts will fill out an SES application and provide information on preferred providers. School district staff will key in the parents’ preferred-provider information and send an electronic file with this information to the contractor.


13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers

There are no direct costs to individual participants.



14. Estimates of Annualized Cost to the Federal Government

The estimated cost to the federal government to carry out the Impact Evaluation of Title I Supplemental Education Services is $2,147,060. The study will be carried out over roughly three years (fall 2007 to spring 2010). The annualized cost of the data collection requested in this package, including analysis of these data, is $715,687.



15. Reasons for Program Changes or Adjustments

There is a program change of 2,000 hours since this is a new collection.



16. Tabulation, Publication Plans, and Time Schedules

a. Tabulation Plans


Using an RD design, valid estimates of the effect of SES can be determined by comparing the average reading and math scores of students who were accepted into SES to the average scores of students who were not accepted into SES, after regression adjusting for the measure of prior achievement used to determine acceptance. Figure A.1 illustrates the RD design graphically, using a hypothetical district. In this example, students with an assignment score of 50 or less receive SES (the treatment group), and students with a score over 50 do not (the control group). The figure plots student math test scores against assignment scores and displays the fitted regression line for each group. The estimated impact on math test scores is the vertical distance between the two regression lines at the cutoff value of 50. In this example, all data are used to calculate the impact, including data from students who are far from the RD cutoff. Making use of all available data increases the statistical precision of the impact estimate because it improves our ability to regression adjust for the measure of prior achievement.2 An important consideration in calculating impacts using an RD design is the functional form used to regression adjust for prior achievement. In Figure A.1, the functional form is linear. In practice, we will also calculate impacts using non-parametric regression techniques that allow for a more flexible functional form.
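
For concreteness, the following is a minimal sketch of this calculation in Python, assuming a student-level data file with hypothetical column names (math_score for the outcome, assign_score for the assignment variable) and a single cutoff of 50; the actual analysis will use district-specific cutoffs and assignment variables and, as noted above, non-parametric specifications as well.

    import pandas as pd
    import statsmodels.formula.api as smf

    def sharp_rd_impact(df: pd.DataFrame, cutoff: float = 50.0) -> float:
        """Estimate the RD impact as the gap between regression lines at the cutoff.

        Column names are hypothetical: 'math_score' is the outcome and
        'assign_score' is the assignment variable; students at or below
        the cutoff are offered SES.
        """
        df = df.copy()
        df["treat"] = (df["assign_score"] <= cutoff).astype(int)
        # Center the assignment score at the cutoff so the coefficient on
        # 'treat' equals the vertical distance between the two fitted lines
        # at the cutoff value.
        df["score_c"] = df["assign_score"] - cutoff
        # Linear functional form with separate slopes on each side of the cutoff.
        result = smf.ols("math_score ~ treat + score_c + treat:score_c",
                         data=df).fit()
        return result.params["treat"]  # estimated impact at the cutoff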


Because the assignment score will be defined differently across districts (we anticipate that in most cases it will be based on a prior year’s test score) and because each district will use a different cutoff for allocating services, we will estimate separate impacts for each district in the sample and then compute a weighted average of these estimates to obtain an overall estimate of the impact of SES among the districts in our sample.3 We will weight district-specific estimates according to the number of eligible students in each district, which will provide an estimate of the impact of SES on the average student under study.4
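
The pooling step can be sketched as follows, with hypothetical variable names, assuming district-specific impact estimates and standard errors have already been computed and treating the district estimates as statistically independent.

    import numpy as np

    def pooled_impact(impacts, std_errors, n_eligible):
        """Weighted average of district-specific RD impact estimates.

        Weights are the number of eligible students in each district, so the
        result estimates the impact of SES on the average student under study.
        """
        w = np.asarray(n_eligible, dtype=float)
        tau = np.asarray(impacts, dtype=float)
        se = np.asarray(std_errors, dtype=float)
        tau_bar = np.sum(w * tau) / np.sum(w)
        # Standard error of the weighted mean, assuming independence across
        # district-specific estimates.
        se_bar = np.sqrt(np.sum((w * se) ** 2)) / np.sum(w)
        return tau_bar, se_bar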



FIGURE A.1

HYPOTHETICAL EXAMPLE OF THE RD METHOD

[Figure: plot of student math test scores against assignment scores, with fitted regression lines for the treatment group (assignment scores of 50 or less) and the control group (scores over 50); the estimated impact is the vertical distance between the lines at the cutoff of 50.]

We will also use the RD design to explore the relationship between SES provider characteristics and effectiveness. In the SES application materials, we will ask parents to name their preferred SES provider. Because we will identify the preferred providers prior to determining the RD cutoff, we will be able to calculate provider-specific impacts (for example, by estimating a separate impact regression for each provider). We can then correlate impacts with provider characteristics and practices. Dimensions along which interventions might vary include substantive focus (for example, math or reading), intensity (for example, frequency of student attendance), and method of delivery (for example, small group activities, one-on-one tutoring, or the use of computer technology).
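
As an illustrative sketch of this second step (the column names below are hypothetical), provider-specific impact estimates and their standard errors could be merged with survey measures and related to provider characteristics through a precision-weighted regression.

    import pandas as pd
    import statsmodels.formula.api as smf

    def relate_impacts_to_characteristics(provider_df: pd.DataFrame):
        """Regress provider-specific RD impacts on provider characteristics.

        Assumes one row per provider with hypothetical columns 'impact' and
        'impact_se' (from the provider-specific RD regressions) and survey
        indicators such as 'one_on_one' and 'computer_based'. Inverse-variance
        weights give more influence to precisely estimated provider impacts.
        """
        weights = 1.0 / provider_df["impact_se"] ** 2
        model = smf.wls("impact ~ one_on_one + computer_based",
                        data=provider_df, weights=weights)
        return model.fit()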


One additional consideration is that some students offered SES might not receive the services, and some students whose assignment score exceeds the cutoff might nonetheless manage to receive SES.5 If this is the case, the impact estimates will represent the impact of offering students SES rather than the effect of receiving SES. We propose to collect data on whether students received SES from the provider survey and from district administrative records. If many students who were offered SES chose not to receive them, or if students who should not have received SES according to their assignment score do in fact receive them, we can compute an additional estimate reflecting the impact on students of receiving SES using what is known as a “fuzzy” RD design (Trochim 1984; Hahn et al. 2001). This approach is similar to calculating the impact of the treatment on the treated in a randomized controlled trial using a Bloom (1984) adjustment; essentially, it uses an indicator for falling below the assignment-score cutoff as an instrumental variable for SES receipt, holding constant a function of the assignment score.
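
The fuzzy RD adjustment can be sketched under the same hypothetical data layout, adding a column received that indicates actual SES receipt; the estimate is the discontinuity in outcomes at the cutoff divided by the discontinuity in receipt rates, the Wald-style ratio described above.

    import pandas as pd
    import statsmodels.formula.api as smf

    def fuzzy_rd_impact(df: pd.DataFrame, cutoff: float = 50.0) -> float:
        """Estimate the impact of receiving (not merely being offered) SES.

        'below' indicates falling at or below the cutoff (the instrument);
        'received' indicates actual SES receipt, which may differ from the
        offer because of no-shows and crossovers.
        """
        df = df.copy()
        df["below"] = (df["assign_score"] <= cutoff).astype(int)
        df["score_c"] = df["assign_score"] - cutoff
        # Reduced form: discontinuity in the outcome at the cutoff.
        itt = smf.ols("math_score ~ below + score_c + below:score_c",
                      data=df).fit().params["below"]
        # First stage: discontinuity in the SES receipt rate at the cutoff.
        take_up = smf.ols("received ~ below + score_c + below:score_c",
                          data=df).fit().params["below"]
        return itt / take_up  # impact of SES receipt at the cutoff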



b. Publication Plans

The evaluation report will be completed after all data from the 2008–2009 school year have been collected and analyzed. A draft report will be completed by January 2, 2010, and the final report will be completed by the end of September 2010.



c. Time Schedule

The full timeline for the evaluation is shown in Table A.5. The timeline calls for design and district recruiting in summer and fall 2008 after OMB clearance, data collection between fall 2008 and summer 2009, and analysis and report writing between summer 2009 and spring 2010.



17. Approval Not to Display the Expiration Date for OMB Approval

Approval not to display the expiration date for OMB approval is not requested.



18. Exception to the Certification Statement

No exceptions to the certification statement are requested or required.











TABLE A.5


STUDY ACTIVITIES TIMELINE


Spring 2008
  Contractor contacts districts to assess feasibility of study.

Late Summer/Early Fall 2008
  Contractor recruits districts to be in study after receiving OMB approval.

Fall 2008
  Districts enroll students in SES (September-October 2008).
  Contractor provides technical assistance to districts during the enrollment process (September-October 2008).
  Districts provide contractor with study data:
  • Specification of which students are enrolled in SES and which are not
  • Cutoff score that determined SES enrollment
  • Measures of prior achievement for students participating in the study
  • Information on parents’ preferred providers

Spring 2009
  Contractor conducts SES provider survey and collects student attendance information from SES providers.

Spring/Summer 2009
  Districts provide contractor with student-level data files on spring 2009 test results, demographics, and level of participation in SES.
  Contractor collects AYP status of schools.

Summer 2009-Spring 2010
  Contractor conducts analyses and writes report.

REFERENCES

Bloom, H.S. “Accounting for No-Shows in Experimental Evaluation Designs.” Evaluation Review, vol. 8, 1984, pp. 225-246.


Center on Education Policy. From the Capital to the Classroom: Year 4 of the No Child Left Behind Act. Washington, DC: Center on Education Policy, March 2006.


Hahn, Jinyong, Petra Todd, and Wilbert Van der Klaauw. “Identification and Estimation of Treatment Effects with a Regression-Discontinuity Design.” Econometrica, vol. 69, no. 1, January 2001.


Shadish, William R., Thomas D. Cook, and Donald T. Campbell. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin, 2002.


Stullich, Stephanie, Elizabeth Eisner, Joseph McCrary, and Collette Roney. National Assessment of Title I Final Report. Volume 1: Implementation of Title I. Washington, DC: U.S. Department of Education, February 2007.


Trochim, W. Research Design for Program Evaluation: the Regression-Discontinuity Approach. Beverly Hills: Sage Publications, 1984.


Zimmer, Ron, Brian Gill, Paula Razquin, Kevin Booker, and J.R. Lockwood III. State and Local Implementation of the No Child Left Behind Act. Volume I: Title I School Choice, Supplemental Educational Services, and Student Achievement. Washington, DC: U.S. Department of Education, 2007.

1 This study requires more students than some other education evaluations for two reasons. First, a key research question is how the effects of SES vary by provider type, which requires a large overall sample in order to support smaller subgroup analyses. Second, the RD design is not as statistically powerful as an experimental design, requiring more students in order to detect effects.

2 We plan to calculate impacts using all available data, but we will also calculate impacts using only students who are close to the RD cutoff as a sensitivity analysis.

3 In some districts, assignment score and/or cutoff might also differ by grade, in which case we will estimate district/grade-specific impacts.

4 As a sensitivity analysis, we will also calculate the impact on the average district by giving an equal weight to each district-level impact.

5 This second concern is known as comparison group “crossover,” which might occur if the district erroneously provides the student SES or does not have a systematic approach for allocating available services from a waiting list when students initially offered SES decline them.
