





U.S. Department of Education






Evaluation of the Effectiveness of the Scholarships for
Opportunity and Results (SOAR) Act Program:
Supporting Statement for OMB Clearance



PART B: COLLECTION OF INFORMATION
EMPLOYING STATISTICAL METHODS











December 1, 2015


















TABLE OF CONTENTS





INTRODUCTION


B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe and Sampling Procedures

B.2 Statistical Methods for Sample Selection

B.3 Methods to Maximize Response Rates

B.4 Pilot Testing

B.5 Individuals and Organizations Involved in this Project





EVALUATION OF THE EFFECTIVENESS OF THE SCHOLARSHIPS FOR OPPORTUNITY AND RESULTS (SOAR) ACT PROGRAM


SUPPORTING STATEMENT

COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


A third cohort has been added to the evaluation, which means an extra round of follow-up data collection in spring 2017. Cohort 3 has been added to Table 1. Other than that change, the ICR is identical to the one previously approved.






INTRODUCTION


This document requests forms clearance approval from the Office of Management and Budget (OMB) for the collection of data under the Evaluation of the Effectiveness of the Scholarships for Opportunity and Results (SOAR) Act Program. In particular, we are requesting approval for: (1) parent, student, and principal surveys, and (2) records abstraction from DC Public Schools (DCPS), District of Columbia Public Charter School Board, and private school administrative files. For context, we also describe other aspects of the evaluation plan that do not contribute to burden. This is a request for an extension of collection #1850-0800, since a third cohort of data collection is needed to obtain the sample size required to carry out this evaluation.


Overview of the Program


The Scholarships for Opportunity and Results (SOAR) Act, H.R. 1473 (P.L. 112-10), signed into law on April 15, 2011, reauthorized the DC School Choice Incentive Act and provided for a five-year continuation of a school choice program for low-income residents of Washington, DC. The program, still titled the Opportunity Scholarship Program (OSP), now provides scholarships of up to $12,000 per student per year to enable low-income elementary and secondary students to attend private schools in the District of Columbia in lieu of the public schools already available to them. The statute specifies that certain students be given priority in the award of scholarships, including students who have siblings already participating in the program, students who were previously awarded a scholarship under the earlier program but did not use it, and students from public schools designated as “in need of improvement” (SINI) or in corrective action under the federal Elementary and Secondary Education Act.


The OSP is operated under a grant from the U.S. Department of Education (ED) to the DC Children and Youth Investment Trust Corporation (the Trust).1 The Trust awarded 1,014 new scholarships in summer 2011 (soon after the program was reauthorized) to all eligible applicants at that time. Since then, 316 scholarships were awarded in summer 2012, 394 in summer 2013, and 285 in summer 2014 through lotteries of eligible applicants.


Overview of the Evaluation


The reauthorization once again stipulated that an evaluation of the program be conducted “using the strongest possible research design for determining the effectiveness” of the program (Section 309, see Appendix A). ED awarded a contract to Westat and its research partners, Pemberton Research and the University of California at San Diego, to: (1) provide technical assistance to the program operator, particularly with respect to the design and conduct of the lotteries of applicants, and (2) conduct an evaluation of the impacts of the program.


The foundation of the evaluation will be a randomized controlled trial (RCT) comparing outcomes of eligible applicants (students and their parents) assigned by lottery to receive or not receive a scholarship. This design is consistent with the requirement for a rigorous evaluation as well as with the need to allocate the scholarships fairly if the program is oversubscribed. Because the law also specified other kinds of comparisons and analyses, the planned evaluation includes both quantitative and qualitative components.



Research Questions


The study is designed to address the following key questions:


  • What is the impact of the program on student achievement? As described in the statute, the purpose of the program is to allow low-income parents to enroll their children in schools other than DC public schools because test scores in the public schools remain below the national average. The law therefore placed a priority on examining whether the program improves the academic achievement levels and growth of eligible students who would otherwise be in a public school setting. The evaluation will calculate the impact on achievement (as well as on other outcomes) of the offer of a scholarship (the “Intent to Treat” estimate) and the impact of using a scholarship (the “Treatment on Treated” estimate); a brief sketch of how these two estimates relate follows this list.

  • What is the impact on other measures of student success? The law calls for examining other indicators of school success, including persistence, grade retention, high school graduation and, if possible, college enrollment. Measures of student engagement, such as school attendance, will also be examined.

  • Does the program affect parent and student reports of school satisfaction and safety, or parent involvement in their child’s education? A key desired outcome of school choice is an increase in both the school choices possible and parents’ and students’ satisfaction with the choices they have made. The SOAR statute extends the outcomes to be studied to include: how parents and students view the safety of the child’s school and the success of the program in increasing parent involvement.

  • Why do parents choose to participate in the program? Previous studies of school choice suggest that parents consider a variety of factors in choosing whether to pursue private schooling for their children, and that certain school characteristics most affect their specific school selections. The statute specifies that the evaluation examine these issues for parents who apply to the OSP.

  • Does the program change students’ instructional environment and opportunities? Whatever the effects of the OSP on key outcomes, researchers and policymakers have long been interested in the mechanisms by which voucher programs might be expected to benefit students. Among the hypotheses that will be explored in the evaluation are whether participating students are exposed to more motivated or better performing peers and the extent to which school organization, instruction, or services are different in public vs. private schools.
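
As noted in the first bullet above, the evaluation will report both an offer effect and a use effect. The following is a minimal simulation sketch, in Python, of how the two relate under the standard Bloom adjustment (dividing the Intent-to-Treat estimate by the treatment-control difference in scholarship-use rates). It is not the study's estimation code; the offer rate, the 75 percent take-up (the planning assumption cited later in this statement), and the score scale are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated lottery: 1 = offered a scholarship, 0 = not offered.
n = 1771                                       # evaluation sample size
offered = rng.binomial(1, 0.59, n)             # ~59 percent average award rate
used = offered * rng.binomial(1, 0.75, n)      # assume 75 percent of winners use it
score = 600 + 5 * used + rng.normal(0, 40, n)  # illustrative outcome; true use effect = 5

# Intent to Treat: difference in mean outcomes by lottery result.
itt = score[offered == 1].mean() - score[offered == 0].mean()

# Treatment on Treated (Bloom adjustment): rescale the ITT by the
# treatment-control difference in scholarship-use rates.
take_up_diff = used[offered == 1].mean() - used[offered == 0].mean()
tot = itt / take_up_diff

print(f"ITT: {itt:.2f}  TOT: {tot:.2f}")       # TOT should land near the true effect of 5
```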


Sample


In order for the evaluation to have sufficient statistical power to detect policy-relevant impacts, the sample will consist of approximately 1,771 eligible program applicants who entered the OSP in spring 2012 (cohort 1; n = 536), spring 2013 (cohort 2; n = 718), and spring 2014 (cohort 3; n = 517) (see Part B of this submission). To be included in the evaluation sample, an applicant must be eligible for the program, be a rising kindergartener (K) or already attending a public school, and participate in a lottery to determine whether they will receive a scholarship award.2


Data Collection


Evaluation data will be collected for the three cohorts of program applicants from a variety of sources, as summarized in Table 1. Each cohort will have baseline data3 as well as three years of follow-up (post-lottery) data collection: 2013-2015 for cohort 1, 2014-2016 for cohort 2, and 2015-2017 for cohort 3. In addition to estimating program impacts, we will use this experimental study to conduct research about interim outcomes.


Table 1. Data Measures for the Evaluation of the DC Opportunity Scholarship Program

Student assessments: The Terra Nova assessment will be administered to the eligible sample before the lotteries are conducted and each spring following the lotteries for three years (spring 2013-2015 for the 2012 cohort, spring 2014-2016 for the 2013 cohort, and spring 2015-2017 for the 2014 cohort). The follow-up assessments will be administered in students’ schools and will provide the primary outcome measure for the impact evaluation.

School records: Administrative records will be collected from DCPS, the District of Columbia Public Charter School Board, and participating private schools in the fall of each year to obtain data on prior-year attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups.

Parent surveys: The evaluation will include annual surveys of evaluation sample members’ parents in each follow-up year. These surveys will examine such issues as reasons for continued participation or withdrawal, involvement in school, satisfaction with school choices, and perceptions of school safety, leadership, and offerings. The survey will be mixed mode (web, with phone or paper follow-up).

Student surveys: The study will survey evaluation sample members in grades four and above to collect information about students’ satisfaction with their schools, perceptions of safety, and other characteristics of their school program and environment. The surveys will be administered in each of the follow-up years at the same time (and place) as the student assessments.

Principal surveys: The study design calls for annual surveys to be administered to principals in the DC traditional public school, charter school, and private school systems in 2013-2017. Data from principals of students in the treatment and control groups will provide information about school organization and offerings for descriptive analyses of students’ school environments and for use as mediators in the impact analysis. The web-based principal surveys will also be used to examine how aware public and private schools are of the DC Opportunity Scholarship Program and whether they are making any changes in response to it.

DC Opportunity Scholarship Program operator records: As the administrator of the DC Opportunity Scholarship Program, the operator is responsible for confirming ongoing eligibility for the program and continuing participation for scholarship recipients. Westat will obtain application data for all sample members as well as annual participation information for individual students from the program operator.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Respondent Universe and Sampling Procedures


The Impact Evaluation of the DC Opportunity Scholarship Program will include all students eligible for the evaluation and their parents, as well as the universe of school principals. Therefore, no sampling is proposed for this study.



B.2. Statistical Methods for Sample Selection


An overview of the statistical methods used to run the lottery and of the calculation of the minimum detectable effect size is presented next.

Lottery


The lottery design took into account several unique features of the DC OSP. The enabling legislation stipulates that three groups are to be given a priority, that is, assigned a higher probability of winning the lottery. These priority groups are:


  • Group 1: students with a sibling already participating in the program

  • Group 2: students attending a public school in need of improvement (SINI)

  • Group 3: students previously awarded a scholarship but who had never used it


Number of Scholarships Awarded

The overall number of scholarships was determined by the budget available and an assumption that the take-up rate would be 75 percent. The Trust awarded 1,014 new scholarships in summer 2011 (soon after the program was reauthorized) to all eligible applicants at that time, and 316 scholarships in summer 2012, 394 in summer 2013, and 285 in summer 2014 through lotteries of 536, 718, and 517 eligible applicants, respectively.


Assigning Probabilities of Receiving a Scholarship by Priority Group


The U.S. Department of Education determined that, compared to an applicant in none of the three priority groups, students in groups 2 and 3 (SINI and previous awardees) should have a 25 percent higher probability of award, and students in group 1 (applicants with a sibling in the program) should have a 40 percent higher probability.


These figures were used for the first-year lottery, held on July 31, 2012. Overall, students with no priority had a 48 percent chance of receiving an award, compared to 60 percent for students in priority groups 2 and 3 (1.25 × 48 percent) and 67.2 percent for students in priority group 1 (1.4 × 48 percent). The average probability of receiving a scholarship was 59 percent. Overall, 35 students from the “no priority” group, 223 from priority groups 2 and 3, and 41 from priority group 1 were awarded scholarships, for a total of 299.


The U.S. Department of Education further determined that a supplemental lottery should be held for rising kindergarteners currently attending private schools. The same priority groups and relative probabilities were used for this lottery, although there were no applicants in priority groups 2 and 3 because, by definition, private schools cannot be designated SINI and kindergarten students could not have received a scholarship in an earlier year. Overall, 11 students (50 percent) who were not in a priority group and 6 (67 percent) who were in priority group 1 were offered a scholarship, for an overall award rate of 55 percent.


In subsequent lotteries (2013 and 2014), the Department of Education established the relative probabilities among priority groups and the overall average probability of receiving a scholarship based on the number of slots available in participating private schools and/or funding availability, and the number of applicants.
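
To make the award mechanics concrete, here is a minimal sketch, in Python, of the probability assignment described above. The multipliers (1.25 and 1.40) and the 2012 base rate of 48 percent come from this section; the per-group applicant counts used to illustrate solving for a base rate are hypothetical, not figures from the source.

```python
# Priority-group multipliers relative to the no-priority base probability.
MULTIPLIERS = {"no priority": 1.00, "groups 2 and 3": 1.25, "group 1": 1.40}

def group_probabilities(p_base):
    """Scale the base award probability by each priority group's multiplier."""
    return {g: p_base * m for g, m in MULTIPLIERS.items()}

print(group_probabilities(0.48))
# ≈ {'no priority': 0.48, 'groups 2 and 3': 0.60, 'group 1': 0.672}

# Going the other way: choose a base probability so that the expected number
# of awards matches the available slots. Applicant counts per group here are
# hypothetical, for illustration only.
applicants = {"no priority": 73, "groups 2 and 3": 372, "group 1": 61}
slots = 299
p_base = slots / sum(n * MULTIPLIERS[g] for g, n in applicants.items())
print(round(p_base, 3))  # ~0.48
```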


Minimum Detectable Effect Size

Statistical power analysis is a useful tool for estimating the sample sizes needed to estimate a program effect with a desired degree of precision. For this study, the sample sizes are set by the number of eligible applicants to the program, but examining statistical power gives a sense of how much power exists for the full sample and for subgroups of various sizes. To set up the power analysis, several parameters are set as defaults. The desired Type I error rate (the probability of finding a statistically significant effect when none exists) is set to 5 percent. The sample is assumed to be created by independent random assignment, and students’ baseline characteristics are assumed to explain 50 percent of the variation in the follow-up test score. This latter assumption has an important effect on power but is supported by data from previous studies. In fact, baseline factors typically explain a larger share of variation, but a conservative estimate is used to accommodate the increase in variance that arises because students are clustered in families.


Figure 2-1 has three curves, for three assumed values of the effect size (0.10, 0.20, and 0.30). The red line on the graph at power of 70 percent provides a visual reference point for “conventional” power levels. The sample size at the point where the 70-percent line intersects each curve is the focal point. As Figure 2-1 shows, for effect sizes of 0.10 or greater, the lottery’s sample of 1,800 students will reach the conventional power level. In other words, the probability that the statistical tests indicate the measured effect is significant will exceed 70 percent when the true effect is 0.10 or greater. Figure 2-1 can also be read in the opposite direction, from power to sample size, to get a sense of the power to detect effects for subgroups.


If the effect size is 0.20 (the middle curve), a subgroup of 320 students or more can reach a power level of 70 percent (the exact sample size is the point at which the 70-percent line intersects the power curve). Such a subgroup might be based on gender or on proficiency according to the baseline test score.


Figure 2-1. Power curves
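
As a cross-check on the reading of Figure 2-1, the following is a minimal sketch, in Python, of the normal-approximation power calculation under the defaults stated above (two-sided 5 percent Type I error, baseline covariates explaining 50 percent of follow-up variance, independent random assignment). The 50/50 treatment-control split is an added assumption; the document does not state the split behind the curves.

```python
from math import sqrt
from statistics import NormalDist

def power_two_group(n, effect_size, r2=0.50, alpha=0.05, p_treat=0.5):
    """Approximate power of a two-sided test in an individually randomized
    two-group design; effect_size is in standard-deviation units and r2 is
    the share of outcome variance explained by baseline covariates."""
    se = sqrt((1 - r2) / (n * p_treat * (1 - p_treat)))
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(effect_size / se - z_crit)

# Full lottery sample of roughly 1,800 students, true effect 0.10:
print(round(power_two_group(1800, 0.10), 2))  # ~0.85, above the 70 percent line

# Subgroup of 320 students, true effect 0.20:
print(round(power_two_group(320, 0.20), 2))   # ~0.72, just above 70 percent
```

Both values are consistent with the figure: the full sample clears the 70 percent benchmark for effects of 0.10 or larger, and a subgroup of about 320 does so at an effect of 0.20.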




We do not anticipate any unusual problems requiring specialized sampling procedures. Student, parent, and principal surveys all use the universe of respondents.


B.3. Methods to Maximize Response Rates


This package requests an extension of this collection to complete follow-up data collection with the third cohort, which was added in order to obtain the sample size needed to carry out this evaluation. The number of students who applied to the DC Opportunity Scholarship Program in the first two years of the lottery was smaller than expected. Response rates did not contribute to the need for a larger sample.


We will maximize response rates by developing survey instruments that are easy for respondents to complete and by following up with nonresponders by mail, fax, and telephone. The study is striving for a response rate of 80 percent. Obtaining high response rates in the Impact Evaluation of the DC Opportunity Scholarship Program will be critical to the success of the study. It will be particularly important to obtain response rates that are not only high overall but approximately equal in the treatment and control groups.


We have several strategies for ensuring a high rate of response. First, we plan to collect the student survey data at the same time and place as the administration of the assessment. Second, because of a key provision in the law, the Trust communicated to parents that student and parent participation in the evaluation’s data collection is required for students to be entered into the lottery, to keep their scholarship, or to remain eligible to receive a scholarship in the future. We believe this requirement will be a formidable incentive to respond to the surveys and assessments. Furthermore, parents will have the option of completing the survey on the web, being interviewed by phone at a time they indicate is convenient, or completing a paper version. Third, we have found that a cover letter signed by a recognized official encouraging principals to participate is extremely effective in getting principals to pay attention to the importance of their participation. We will work with the COR to obtain a letter from DCPS for public school principals and a letter from the program operator for private schools. We will also discuss obtaining a letter from the Director of IES. Finally, we will employ a sophisticated tracking system to ensure that we follow up on nonresponse in a timely and comprehensive way, using a combination of reminder postcards, emails, follow-up letters, and telephone calls to encourage respondents to complete the surveys.


B.4. Pilot Testing


We pretested each of the surveys with nine or fewer people who are demographically similar to respondents in the study. We asked pretest respondents first to complete the relevant survey and then to participate in a focus group about it. In the focus group discussions we tested for completion time, feelings of burden, salience of language, concept recognition, and understanding of terms. After the pretest, and with input from the Technical Working Group (TWG), we revised the surveys as needed based on the pretest results.



B.5. Individuals and Organizations Involved in this Project


The statistical aspects of the design have been reviewed thoroughly by staff at the Institute of Education Sciences. Table 5 shows the individuals most closely involved in developing the statistical procedures and who will be responsible for data collection and analysis.


Table 5. Individuals Involved in this Project

Name                   Affiliation                             Role                       Phone Number
Juanita Lucas-McLean   Westat                                  Project Director           (301) 294-2866
Mark Dynarski          Pemberton Research                      Co-Principal Investigator  (609) 443-1981
Julian Betts           University of California at San Diego   Co-Principal Investigator  (858) 534-3369
Lou Rizzo              Westat                                  Statistician               (301) 294-4486
Meredith Bachman       ED/IES                                  COR                        (202) 219-2014


1 In May 2012, a grant to run the program was awarded to the DC Children and Youth Investment Trust Corporation (“Trust”), a non-profit organization that operates a privately-funded scholarship program for students in the DC area. A new grant was awarded to Serving Our Children in 2015 to serve as the current program operator.

2 Although students who attend a private school when they apply to the OSP are eligible for a scholarship and may be awarded one through a lottery, these students are not included in the evaluation because the “treatment” for them differs significantly from the OSP treatment for students from public schools. For students already attending a private school when they apply, the lottery determines only who will pay for their private school tuition: the federal OSP program vs. other scholarship programs or the families themselves. We have no hypothesis that this difference could result in improvement in achievement, although it could affect family resources. In contrast, the lottery of public school applicants in most cases determines whether a student attends a private school or a public school, and there is a body of evidence suggesting that such differences in school settings could lead to differences in achievement.

3 Baseline data collection was approved on November 3, 2011 (#1855-0015).
