
U.S. Department of Education






Impact Evaluation of the DC Opportunity Scholarship Program







Office of Management and Budget

Statement for Paperwork Reduction Act Submission


Part A: Justification



Contract ED-04-CO-0126




July 10, 2008

Revised September 25, 2008

Revised December 11, 2008




PART A. JUSTIFICATION


This package represents a request for approval to collect one additional round of data – in spring 2009 – for the Impact Evaluation of the DC Opportunity Scholarship Program (OSP) using, with one exception, the same evaluation design, instruments, and data collection and analysis procedures previously approved by OMB as OMB No. 1850-0800 (initial approval notice dated 4/15/05; extension beyond the original three-year expiration date approved by notice dated 1/25/08). The one exception is the addition of a short telephone follow up survey to collect information on dropping out and on post-high school activities. New burden estimates are also included.


This additional round of data collection was not anticipated when ED submitted the package to extend data collection by 3 months (to complete the already approved data collection) or when ED acted to discontinue approval for this evaluation’s collection beyond the new 7/31/08 expiration date (NOD 5/27/08). At that time, re-authorization of and funding for the OSP was uncertain.


However, it now appears likely that there will be funds for both scholarships and the evaluation[1] in FY 2009, based on the recent House committee vote (June 25, 2008) to appropriate an extra year of funds (for FY 2009) for the DC School Choice Incentive Act. This request for OMB approval to administer an additional round of student testing and student, parent, and principal surveys reflects policymakers’ interest in collecting longer-term information about OSP participants, giving the evaluation the ability to estimate fourth year program impacts. Additionally, we are adding a short follow up parent telephone survey that will enable us to respond to the language in the law that calls for examining other indicators of student success, including “retention rates, dropout rates, and (if appropriate) graduation and college admissions rates” (Title III, Section 309(a)(4)(A)(D); see Attachment 1 for the authorizing legislation). Only in FY 2009 will a sufficient number of the evaluation’s impact sample have aged to the point where it is appropriate to ask about these outcomes and where statistically significant impacts, if any, can be detected.


Since we are submitting an already-approved design and set of instruments and procedures, this package is identical to the package approved by OMB on 1/25/08 with the exceptions described below. We have proceeded in this manner so that this package can function as a stand-alone document reflecting the full evaluation. Throughout the remaining sections of this document, any differences between the recently approved package and the current one are shown in bold, to assist OMB in its review. The differences are:


  • For every occurrence of the number of years of data collection, we have updated the language to say “4 years of follow up data collection for each cohort.” (See Table 1, pages 4 and 5; pages 6 and 7; pages 9 and 10.)


  • We have added references to the additional data collection instrument – the follow up parent survey – in the appropriate sections. (See page 2; Table 1, pages 4 and 5; Table 2, pages 6 and 7; page 12.)


  • We have expanded the discussion in Section A6 on consequences of less-frequent data collection to include the four years of follow up data collection (See page 8).


  • In the section on estimates of respondent burden, we have added three columns to Table 3 to detail the burden of the requested additional year of data collection and we have added one row to detail the burden of the requested new instrument. (See Section A12, pages 12 - 14.)


  • In the section on estimates of annualized government costs, we updated the cost information to reflect our best estimate of expenditures for the extra year of data collection, given that we are in the process of amending the scope of the current evaluation contract to accommodate the new work (see Section A14, page 14.)


  • In the section on changes in hour burden, we updated the burden hours. (See Section A15, page 14.) Because the approval for the existing (and completed) collection was discontinued, this is a reinstatement with new hours.


  • We updated the reporting schedule to include a report describing the 4th year impacts (See Table 4, Section A16, page 18.)



A1. Explanation of Circumstances That Make Collection of Data Necessary


Introduction


In early 2004, the U.S. Congress passed the DC School Choice Incentive Act, Title III of the District of Columbia Appropriations Act of 2004, Division C of HR 2673 (PL 108-199). The legislation established a new, five-year school choice program for low-income residents of Washington, DC, and provided for a program operator to design and oversee parent outreach efforts, school recruitment, the student application process, and the distribution of scholarships.[2] The program provides scholarships of up to $7,500 per student per year to enable low-income elementary and secondary students to attend private schools in lieu of the public schools already available to them. It is anticipated that, given annual appropriations of $13 million, up to 2,000 students could be supported by scholarships each year, since most private schools in DC charge less than the ceiling amount for tuition and fees. The law requires that students be assigned scholarships by lottery if there are more eligible applicants than can be accommodated by the appropriation or the availability of seats in participating private schools.


The law also requires an evaluation of the program “using the strongest possible research design for determining the effectiveness” of the program (Section 309, see Attachment 1). The U.S. Department of Education (ED) awarded a contract to Westat and its research partners, the University of Arkansas (formerly Georgetown University) and Chesapeake Research Associates, to (1) provide technical assistance to the program operator, particularly with respect to the design and conduct of the random assignment of participants during the baseline year of 2004, and (2) perform a 5-year impact evaluation of the program.


This document represents the Supporting Statement for the data collection and analysis to be conducted under the Impact Evaluation of the DC Opportunity Scholarship Program. In particular, we are requesting approval for: (1) parent, student, and principal surveys, (2) ongoing testing of student applicants, (3) records abstraction from DC Public Schools (DCPS) administrative files, and (4) a follow up parent survey.


Study Design


The foundation of the DC Opportunity Scholarship Program evaluation will be a randomized controlled trial (RCT) comparing outcomes of eligible applicants (students and their parents) assigned by lottery to receive or not receive a scholarship. This design is consistent with the requirement for a rigorous evaluation as well as the need to fairly allocate scholarships if the program is oversubscribed. At the same time, the law specified other kinds of comparisons and analyses, resulting in a planned evaluation that includes both quantitative and qualitative components, and both performance (progress) reporting and measures of impact.


Research Questions


The study is designed to address the following key questions:


  • What is the impact of the program on student academic achievement? The law places high priority on examining whether the program—the availability and offer of scholarships—improves the academic achievement of eligible students. This question can be addressed most rigorously by comparing the academic achievement of student applicants randomly assigned to receive and not receive scholarships. However, the law also asks for a comparison of the academic achievement of students who participate in the program with their grade-level counterparts in DCPS.


  • What is the impact of attending private versus public schools? Because it is likely that some students offered scholarships will choose not to use them, the evaluation will also use accepted econometric methods to examine the effects for students who take the scholarship offer and enroll in a private school.


  • What is the impact of the program on other student measures? The law calls for examining other indicators of student school success, including persistence, retention, graduation and, if possible, college enrollment. In addition, Congress required the evaluation to assess the school safety of students who receive the scholarships relative to those who did not receive scholarships.


  • What effect does the program have on student and parent satisfaction with the educational options available in DC and with children’s actual school experiences? A key desired outcome of scholarship programs is an increase in both the school choices possible and parents’ and students’ satisfaction with the choices they have made. These issues will be examined by comparing the satisfaction and reasons for applying to the DC Opportunity Scholarship Program among applicants assigned by lottery to receive scholarships and those assigned to not receive scholarships.


  • To what extent is the program having an impact on schools in Washington, DC? Scholarship programs have been hypothesized to affect not only the students who receive the scholarships but also the broader population of public schools and students. Theory suggests that these broader outcomes could occur when public school systems respond to a fear of losing students, and therefore revenues, to private schools. These competitive effects might include changing curricula, adopting new themes or missions, and other modifications to existing policies and practices to make the public schools more attractive. Choice programs might also affect the larger population of private schools, beyond those in which the programs’ participants are currently enrolled; if choice programs are successful, additional private schools may choose to participate or new schools may be established to meet enrollment demand. However, exploring these potential systemic effects of the DC Opportunity Scholarship Program will be challenging, given the existing design of the program and the limited resources available to address this question.


Data Collection


Evaluation data will be collected for two cohorts of program applicants using a variety of data collection methodologies. To achieve the sample sizes necessary for statistical power, the evaluation will track the progress and experiences of applicants in spring 2004 and in spring 2005. The evaluation team is collecting pre-program (“baseline”) measures of family background and student achievement and is planning to collect annual “in program” measures in order to conduct a rigorous evaluation of program impacts. These measures will be collected from the data sources described in Table 1.


Table 1. Data Measures for the Evaluation of the DC Opportunity Scholarship Program

Data Source

Description

Student assessments

  • Baseline test scores (SAT-9) will be abstracted from DCPS records for public school applicants; applicants who did not participate in DCPS spring assessments (primarily children below grade 3) will be tested the following fall by the evaluators using the same assessment at Saturday sessions.

  • After the baseline year, the evaluators will administer the SAT-9 to all treatment and control group members in the spring of each year, at annual events to re-establish eligibility for the program – for 4 years of follow up data collection for each cohort.

  • The study will also obtain DCPS test score data for all public school students in those years, in order to draw a random sample of similarly low-income DCPS nonapplicant students, stratified by grade level, to compare with the applicants to the DC Opportunity Scholarship Program, as required by law.

School records

Administrative records will be collected from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. In addition, the study will seek to obtain these data for all public school students, including those in charter schools, so that the program applicants can be compared to nonapplicant DCPS students in the relevant grade levels, as required by the DC Choice Act.

Parent surveys

The study will conduct surveys of parents (of students in the treatment and control groups) – for 4 years of follow up data collection for each cohort. These surveys will examine such issues as reasons for applying, satisfaction with school choices, and perceptions of school safety, educational climate, and offerings. It is likely that these surveys will be administered during the annual program renewal events, with telephone follow up as necessary.

Student surveys

The study will conduct surveys of treatment and control group students who are in grades four and above – for 4 years of follow up data collection for each cohort.

These surveys will collect information about students’ satisfaction with their schools, perceptions of safety, and other characteristics of their school program and environment. The surveys will be administered at the same time (and place) as the student assessments.

Principal surveys

The study design calls for a survey of principals in the spring of each year: (1) principals of all 109 private schools, and (2) principals of all 160 regular public and charter schools in DCPS. The surveys will be administered for 4 years of follow up data collection for each cohort. The surveys will collect information about school conditions and the school environment that might affect student achievement, and about awareness of and response to the DC Opportunity Scholarship Program.



DC Opportunity Scholarship Program Operator Records

As the administrator of the DC Opportunity Scholarship Program, the operator is responsible for confirming ongoing eligibility for the program and continuing participation for scholarship recipients. Although surveys of parents and students will also be conducted, Westat will collect annual data from the program operator about individual student program participation.

Follow Up Parent Telephone Survey

We will administer a short, telephone follow up parent survey in the final year of data collection to collect information on whether students are still in school and, if not, whether they have dropped out or have graduated. We will also collect information on whether students are attending college or are employed. The survey will be administered to parents of students who have turned 16 by June 30, 2009, since 16 is the age of compulsory school attendance in DC.



A2. How the Information Will Be Collected, by Whom, For What Purpose


Information on the DC Opportunity Scholarship Program and the outcomes of program applicants will be collected primarily by Westat, with data analyzed by Westat and its research partners, the University of Arkansas (formerly Georgetown University) and Chesapeake Research Associates. This work will be conducted under Contract Number ED-04-CO-0126. The data to be collected will be obtained from student assessments, school records, and surveys of parents, students, and principals, and used to address the research questions and topics identified in the authorizing legislation. The legislation also specifies that the evaluation report annually on the performance of the program and the students; thus, annual data collection is necessary and cannot be reduced to a lesser frequency. The student, parent, and principal surveys will all include the universe of respondents. In no case do we anticipate any unusual problems requiring specialized sampling procedures. Table 2 shows how each of the sources of data relates to the study questions, followed by detailed descriptions of the data sources.


A2.1 Student Assessments


Based on the legislated language, the key outcome measure for judging the effectiveness of the program is student achievement. Moreover, the law requires the independent evaluator to measure student achievement each year. For the purposes of the evaluation, we have interpreted “student achievement” as students’ skills in reading and mathematics (not science or history).


There are several key considerations that must be taken into account in order to ensure that the measurement of student achievement is a valid indicator of program impacts. Most importantly, to the extent possible, the same administration and testing environments must be maintained for both scholarship recipients (treatment group) and those who applied for but did not receive scholarships (control group). This is easy in the case of the “baseline” measurement of achievement. DCPS annually administers the SAT-9 in April in all of its schools, following a consistent test administration guide for each grade level; we plan to abstract these data for all public school applicants to the DC Opportunity Scholarship Program.


However, going forward beyond the baseline year offers some challenges. Only the control group and members of the treatment group who have declined to use their scholarships or who attrited from the program will be attending DCPS schools and participating in DCPS testing. Most treatment group members will be dispersed throughout a set of participating private schools. Private schools are unlikely to allow us to pull members of the treatment group out of their school day in order to administer the DCPS test to them. Moreover, comparing test results in those circumstances to results for students in the public schools who took the DCPS test along with all students at their schools introduces a substantial bias. For the DCPS students, the DCPS test is likely to be more consequential, with teachers planning and preparing for it for at least several weeks. In contrast, students in the private schools will have little warning or preparation, placing them at a serious disadvantage in the comparison of achievement with public school students (the control group). This option, although requiring less burden on the control group, would lay the evaluation open to serious criticism in estimating and interpreting the key program impacts.


Table 2. Relationship Between the Study Questions and Proposed Sources of Data

Data sources: student assessments; school records; parent surveys; student surveys; principal surveys; DC Opportunity Scholarship Program operator records; follow up parent survey.

Study questions:

  • What is the impact of the program on student academic achievement?

  • What is the impact of attending private versus public schools?

  • What is the impact of the program on other student measures?

  • What effect does the program have on student and parent satisfaction with the educational options available in DC and with children’s actual school experiences?

  • To what extent is the program having an impact on schools in Washington, DC?


Instead, at the current time, we plan to administer the SAT-9 math and reading assessments when the treatment and control group families come in to renew their eligibility for the Program, so that the test administration will be similar across all groups in the evaluation sample. The scholarship users will clearly be the most motivated to attend, and we will be conscious of the need to take steps to encourage the scholarship non-users (decliners) and control group members to fulfill the requirements to participate in the evaluation’s data collection. These assessments will be administered in early April of each year, including the new (4th) round in spring 2009 for which we are now requesting approval.


A2.2 School Records


Administrative records will be collected from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. In addition, Westat will seek to obtain these data for all public school students, including those in charter schools, so that the program applicants can be compared to other students in the relevant grade levels, as required by the DC Choice Act.


A2.3 Parent Surveys


The legislation requires the evaluation to examine the impact of the program on parents. The study will conduct surveys of parents (of students in the treatment and control groups) for the 4 years of follow up data collection for each cohort, including this new round in spring 2009. These surveys will examine such issues as reasons for applying to and remaining with the program, satisfaction with school choices, and perceptions of school safety, educational climate, and offerings. These surveys will be administered to the parents when they come in to renew their child’s program eligibility, with telephone follow up as necessary.


A2.4 Student Surveys


The study will conduct surveys of treatment and control group students who are in grades four and above, to collect information about students’ satisfaction with their schools, perceptions of safety, reports of behavior both within and outside of school, and other characteristics of their school program and environment. The surveys will be administered for the 4 years of follow up data collection for each cohort (including the new round in spring 2009), and are likely to occur at the same time (and place) as the student assessments – the family events where the parents come in to renew program eligibility.


A2.5 Principal Surveys


The study design calls for two separate principal surveys: (1) principals of all 109 private schools in DC, administered toward the end of each of the four years and (2) principals of all of the 160 regular public and charter schools in DCPS, administered toward the end of each of the four years of follow up data collection for each cohort, including the new round in spring 2009.

The private school principal survey will focus on knowledge of the DC Opportunity Scholarship Program and ask specific questions about perceptions of the program, why the school does (or does not) participate, and how the program is integrated within the school. The public school principal survey will collect information about school characteristics and climate, principals’ knowledge of the DC Opportunity Scholarship Program, and whether schools are changing anything in response to the program.


A2.6 Follow Up Parent Surveys


In the fourth year of data collection for each cohort, we will conduct a telephone survey of parents whose children are 16 and older as of June 30, 2009. The age of 16 was chosen as the cutoff, because it is the age of compulsory school attendance in Washington, DC. The survey will collect information on whether the students are still in school, have dropped out, or have graduated. We will also ask about education and employment status, both for those who have dropped out and those who have graduated. It is necessary to collect this information through a telephone survey, because students (and their parents) who have dropped out of school are less likely to attend the testing events at which we administer other surveys and because those who are identified as completing 12th grade are not invited in for K-12 testing.



A3. Use of Improved Technology to Reduce Burden


The data collection plan has been designed to maximize efficiency and accuracy, and to minimize respondent burden. A key consideration in the decision to abstract baseline student achievement data from DCPS records (rather than administer our own evaluation assessment) was to minimize evaluation costs and reduce respondent burden. We will ask parents to complete a paper survey form at the time they come in to renew their eligibility, and we will follow up with telephone interviewing to offer parents the opportunity to provide the information in the format most convenient to them.



A4. Efforts to Identify and Avoid Duplication


As an examination of a new program, serving students at least half of whom will be outside public school district records, the evaluation must collect much of its own data. We are using existing data to the extent possible – for example, relying on the DCPS assessment for the baseline measures of student achievement. However, other information collected as part of the evaluation – the ongoing student assessments and the surveys of parents, students, and principals – is not available elsewhere.



A5. Efforts to Minimize Burden on Small Businesses and Other Entities


There is no anticipated impact on small businesses or other small entities. The primary respondents for this study are students and parents, although some data will be collected from principals in public and private schools. Burden is reduced for all respondents by requesting only the minimum information required to meet the study objectives. The burden on schools has also been minimized through the careful specification of information needs, restricting questions to generally available information where possible, and designing the data collection strategy – particularly the survey methods – to minimize burden on respondents. For example, we will obtain some descriptive information on public and private schools from the Common Core of Data (CCD) available from the National Center for Education Statistics. We will also administer the surveys to students and parents when they are attending events to re-establish their eligibility for the program.



A6. Consequences of Less-Frequent Data Collection


This data collection is necessary in order to evaluate the DC Opportunity Scholarship Program and comply with the evaluation mandate in the DC School Choice Incentive Act to report annually to Congress. Virtually all of the data collection activities—respondents, topics, and the need for annual collection—stem directly from the legislative requirements. In fact, the statute called for five years of data collection and reports. Congress expected that the evaluation would follow students for a longer period than had other voucher studies, but that task could not be accomplished within the original five-year contract period and funding. The current request to add a fourth year of follow up data collection is consistent with Congress’ intent.



A7. Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations


There are no special circumstances associated with this data collection.



A8. Federal Register Comments and Persons Consulted Outside the Agency


Consultations on the research design, sample design, data sources and needs, and study reports have occurred during the study’s design phase and will continue to take place throughout the study. The purpose of such consultations is to ensure the technical soundness of the study and the relevance of its findings, and to verify the importance, relevance, and accessibility of the information sought in the study.


Westat and its subcontractors, the University of Arkansas (formerly Georgetown University) and Chesapeake Research Associates (CRA), have provided substantial input to ED for the study. Senior technical staff from these organizations who are conducting the study are listed below:


Westat Babette Gutmann, Project Director (301) 738-3626

Alex Ratnofsky, Vice President (301) 251-8249

Juanita Lucas-McLean, Senior Analyst (301) 294-2866


University of Arkansas Patrick Wolf, Principal Investigator (479) 575-2084

Nada Eissa, Senior Analyst (Georgetown) (202) 687-0626


CRA Michael Puma, Senior Analyst (410) 897-4968


The Department has also consulted with an Expert Advisory Panel, a group that includes both eminent school choice experts and evaluation methodologists. This advisory panel includes:


Professor Julian Betts, University of California, San Diego

Professor Thomas Cook, Northwestern University

Professor Jeff Henig, Columbia University

Professor William Howell, University of Chicago

Professor Guido Imbens, Harvard University

Dr. Larry Orr, Abt Associates (retired)

Professor Rebecca Maynard, University of Pennsylvania.


The original 60-day notice for this data collection was published in the Federal Register on July 18, 2008.



A9. Payments to Respondents


We realize that participation in the evaluation of the DC Opportunity Scholarship Program will place demands on each of the respondents. Specifically, it is critical to the study design that parents, students, and principals participate in the assessments and complete the survey forms each year, as we will be following each cohort and their parents for four years of follow up data collection for each cohort. The fourth year of follow up data collection for each cohort, for which approval is currently being sought, is particularly important to policymakers, because it will allow the evaluation to estimate the impact of having a scholarship and attending a private school for a cumulative four years, a point at which the impacts of the scholarship program should be stable.


However, unlike other studies of educational programs, we cannot depend on testing students while they are at school, because neither the private schools nor the public schools will allow those evaluation activities to take place on campus.[3] Instead, we must encourage parents and students (both treatment group and control group) to attend testing events on Saturdays and evenings throughout the spring of each year in schools and community locations around Washington, DC. While appeals can be made to the treatment group through the program, it is particularly difficult to obtain responses from the control group, and other experimental studies of voucher programs have faced substantial differential response rates for treatment versus control groups, raising the possibility of bias in the analysis. Because a study of a voucher program is controversial, there are similar issues in collecting data from principals in treatment and control schools.


To mitigate that problem, we originally proposed – and OMB approved – a set of payments that we expected would generate high rates of response among the evaluation sample. Based on the first year of data collection completed, we requested and OMB approved a set of increased incentives for parents and principals (NOC 1/10/06). Subsequently, we requested and OMB approved (NOA 4/24/07) the option to split the parent payment between the parent and older students in order to incentivize the older students (for whom we had particularly low rates of response), since when they attended the testing events they generally did so on their own and had to forgo social and athletic activities to participate in the data collection.


The table below shows the currently approved incentive payments (NOC 1/10/06 and NOA 4/24/07), including the payments that apply to the fourth year of data collection for which this approval is being sought.


| Instrument | Payment |
|---|---|
| Parents: Baseline/Year 1 Follow Up Data Collection | $50 ($25 in original OMB submission) |
| Parents: Year 2 Follow Up Data Collection | $100 |
| Parents: Year 3 Follow Up Data Collection | $150 |
| Parents: Year 4 Follow Up Data Collection | $150 |
| Principal | $20 ($10 in original OMB submission) |





A10. Assurances of Confidentiality


All data collection activities will be conducted in full compliance with Department of Education regulations on maintaining the confidentiality of data obtained on private persons and on protecting the rights and welfare of human research subjects, as well as with other applicable federal regulations. Research participants will be informed about the nature of the information that will be requested and about confidentiality protections, and they will be assured that information will be reported only in aggregate, statistical form in reports and public use data files. Respondents will also be informed that their names will not be associated with their answers and that no one will have access to this information except as may be required by law, regulation, or subpoena, or unless permission is given by both the parent and the participating child.


In particular, it is very important that parents or legal guardians of sample members understand that information is being collected regarding their children, and that this information is being held confidential. When parents apply to the DC Opportunity Scholarship Program on behalf of their child(ren), they receive an oral presentation on evaluation activities and requirements and a written statement of the same; they are asked to sign the consent form and only those who sign are part of the program and the evaluation (see the consent form, Attachment 2).


All surveys will also contain a statement regarding the confidentiality of their responses, as follows:


“Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law.”


Specific Procedures to Maintain Confidentiality


Westat, in the conduct of the Evaluation of the DC Opportunity Scholarship Program, will follow procedures for ensuring and maintaining participant privacy, consistent with the Education Sciences Reform Act of 2002. Title I, Part E, Section 183 of this Act requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.


In addition, Section 309 of the DC Choice Act includes a particular specification that no personally identifiable information can be disclosed as part of the evaluation. As a result of this provision, in publishing the Privacy Act Notice for the System of Records for this evaluation, ED has eliminated all possible routine disclosures to which any data collected or obtained for the evaluation might be subjected. Under the notice, personal information (names, addresses, student ID numbers) may only be disclosed to Westat and in the unlikely case of a terrorist threat.


Westat, as ED’s “authorized representative” for the collection and maintenance of data for the evaluation, will take the confidentiality requirements very seriously. Employees of Westat are required to sign Westat’s “employee or contractor’s assurance of confidentiality of data” (see Attachment 3). This document outlines the general requirements and responsibilities of employees and contractors with regard to maintaining the confidentiality and privacy of data. In addition, each project at Westat is required, upon inception, to develop a customized confidentiality plan. The Westat project director develops the confidentiality plan for the evaluation, which takes into account assurances made to respondents, what project information is confidential, who is authorized to have access to it, and how access can be controlled. This plan will be shared with all project staff, who will then be expected to implement it. Some of the components of the plan include:


  • Keeping hard-copy confidential information under lock and key.

  • Storing confidential electronic information in a secure location.

  • Communicating about cases via email without violating confidentiality and privacy.

  • Clearly labeling documents containing confidential information “confidential.”

  • Limiting the number of copies of confidential documents.

  • Arranging for security when sending confidential jobs to a network printer.

  • Ensuring that only authorized personnel see faxes containing confidential information.

  • Adhering to the telephone research center’s (TRC) protocols for transporting confidential data to and from the TRC.

  • Adhering to data entry’s protocols for transporting confidential data to and from data entry.

  • Using mail and delivery services appropriate for the sensitivity level of the confidential data.

  • Not bringing confidential data home.

  • Disposing of confidential information properly when it is no longer needed.


Institutional Review Board (IRB)


Westat has sought clearance from its Institutional Review Board (IRB) for the DC Opportunity Scholarship Program application and consent form, and for all other protocols associated with students’ participation in the study. In the case of the DC Choice Act, the Congress specified a requirement that all applicants, even those who ultimately do not receive a scholarship through the lottery process, participate in the evaluation’s data collection in order to be eligible for a scholarship in succeeding years. The Congress considered such support for data collection critical to ensure that comprehensive and comparable data were collected from both the treatment and control group members. The IRB provided guidance on how to clarify these requirements on the application and consent forms that all applicants must sign.



A11. Questions of a Sensitive Nature


There are no questions of a sensitive nature on the data collection instruments.



A12. Estimates of Respondent Burden


The study calls for surveys of students, parents, and principals, as well as records abstraction and test administration. The instruments were developed to maximize respondent completion of the surveys and to minimize respondent burden. All survey instruments are brief and focus on collecting only information essential to the study.


The research team will administer the student surveys as part of the student assessment that will be administered to students at the family renewal events. The parent survey will be administered at the family renewal events, with telephone follow up. These surveys are designed to be completed in paper and pencil format and will collect information on the respondents’ perception of the school program and environment. The follow up parent survey will be administered as a telephone survey.


The two principal surveys will be administered as a mail survey with telephone follow up. The surveys will be mailed to principals with instructions to complete the survey and mail or fax it back to the research team. Principals who do not respond by the stated deadline will be contacted by telephone in an attempt to obtain a completed response.


The research team will administer the assessment to the treatment and control groups each spring. In addition, they will collect administrative records from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline.


Table 3 shows the estimated burden for each of the data sources, as well as the additional burden expected for the spring 2009 round of data collection and the follow up parent survey – for which approval is now being sought.



Table 3. Annual Burden Estimates, by Data Source

The first three numeric columns cover the already approved and completed collection; the last three cover the 4th year follow up, for which approval is now being sought. Burden figures are in hours.

| Data Source | Respondents | Responses (approved) | Burden per Response (approved) | Total Burden (approved) | Responses (4th year) | Burden per Response (4th year) | Total Burden (4th year) |
|---|---|---|---|---|---|---|---|
| Student Assessments | Eligible applicants in grades K-12 | 2,705 | 2.5 | 6,762.50 | 1,616 | 2.5 | 4,040.00 |
| School Records | DCPS staff and charter school authorizers | 2 | 40.0 | 80.00 | 2 | 40.0 | 80.00 |
| Student Survey (at testing events) | Eligible applicants in grades 4-12 | 2,705 | 0.25 | 676.25 | 1,616 | 0.25 | 404.00 |
| Parent Survey (at testing events) | Parents of eligible applicants | 2,705 | 0.25 | 676.25 | 1,616 | 0.25 | 404.00 |
| Private School Principal Survey | Private school principals of participating and non-participating schools | 109 | 0.17 | 18.53 | 76 | 0.17 | 12.92 |
| Public School Principal Survey | Principals of DC public schools | 150 | 0.17 | 25.50 | 105 | 0.17 | 17.85 |
| DC Opportunity Scholarship Program Operator Records | | 1 | 40.0 | 40.00 | 1 | 40.0 | 40.00 |
| Follow Up Parent Survey (phone) | Parents of eligible students age 16 or older as of June 30, 2009 | -- | -- | -- | 914 | 0.083 | 75.86 |
| Total | | 8,377 | | 8,279.03 | 5,946 | | 5,074.63 |

Notes: The information in this table describes the surveys and burden for one (annual) cycle of data collection. This cycle will be repeated for a total of 4 years of follow up data collection for each cohort.

The original package mistakenly showed 8,662 responses and 8,564 hours. During the approval process, the adjusted numbers were 8,377 responses and 8,279 hours, reflected on the 83C and noted in the memo to OMB dated 1/3/05.

The estimated number of respondents for which we are seeking approval for this 4th year of data collection is the number of applicants in the impact sample (2,308) times our expected response rate of 70 percent.


The estimated number of respondents for this final round of data collection is smaller than in prior (completed) years of data collection for two reasons.


    1. In the first three years of data collection we tested and surveyed applicants who were randomized (the impact sample of 2,308) as well as applicants who were not randomized and received the offer of a scholarship (851 public school applicants and 216 applicants already in private school), for a total of 3,375. We tested these non-impact sample scholarship recipients to adhere to the statute’s requirement that all applicants be subject to testing. This was particularly important because some families include students both in and not in the impact sample; we were concerned that eliminating the testing requirement for those not in the impact sample would undermine response rates from impact sample members. In addition, in earlier reports we produced some information on all cohort 1 and 2 applicants, not just those in the impact sample. However, for this fourth year of data collection, we are planning to test and survey only the impact sample of 2,308 in an effort to achieve a sufficient response rate.


    2. The number of respondents in the first three years reflects an estimated 80 percent response rate. The number of respondents in the fourth year reflects an estimated 70 percent response rate (based on the experience with turnout in the third follow up year and our expectation of receiving permission to raise the incentive for year 4).


The instruments are identical to those used in earlier rounds with one exception – we are adding a (short) follow up parent survey, so the estimated annual burden per respondent is slightly higher. As in prior years, the estimated annual burden is the product of the number of respondents and the estimated annual burden per respondent, for a total of 5,074.63 hours projected for the spring 2009 activities.
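
The year 4 figures in Table 3 follow from simple products and sums. A minimal sketch in Python (purely illustrative; every figure below is copied from Table 3 and the notes above):

```python
# Sketch: reproduces the year 4 burden arithmetic in Table 3 and the notes
# above. All inputs are taken from this document; nothing here is new data.

impact_sample = 2308              # randomized impact sample (cohorts 1 and 2)
response_rate = 0.70              # expected year 4 response rate
respondents = round(impact_sample * response_rate)   # 1615.6 rounds to 1,616

# (data source, estimated responses, burden per response in hours)
rows = [
    ("Student assessments",             1616, 2.5),
    ("School records",                  2,    40.0),
    ("Student survey",                  1616, 0.25),
    ("Parent survey",                   1616, 0.25),
    ("Private school principal survey", 76,   0.17),
    ("Public school principal survey",  105,  0.17),
    ("Program operator records",        1,    40.0),
    ("Follow up parent survey (phone)", 914,  0.083),
]

total_responses = sum(n for _, n, _ in rows)    # 5,946
total_hours = sum(n * h for _, n, h in rows)    # 5,074.632
print(respondents, total_responses, round(total_hours, 2))  # 1616 5946 5074.63
```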



A13. Estimate of the Cost Burden to Respondents


There are no additional respondent costs associated with this data collection other than the hour burden estimated in item A12.



A14. Estimates of Annualized Government Costs


The estimated cost to the federal government of conducting the Impact Evaluation of the DC Opportunity Scholarship Program is based on the government's contracted cost of the data collection and related study activities along with personnel cost of government employees involved in oversight and/or analysis. For all the data collection and reporting activities, including the additional (4th) year for which OMB approval is currently being requested, the overall cost to the government is estimated to be $2,516,884.



A15. Changes in Hour Burden


This is a request for an additional year of data collection. But because we discontinued the approval for the earlier collections, the current inventory is zero. The total new estimated annual burden, in hours, is 5,074.63, as reflected in the last column of Table 3 (on page 13).



A16. Time Schedule, Publication, and Analysis Plan


All data will be analyzed according to rigorous technical standards, and woven together to provide a complete assessment of whether the DC Opportunity Scholarship Program achieved its goals. The focus of the analysis and reports will be evidence regarding: (1) who applies for and uses a scholarship; (2) what impacts the offer and use of a scholarship have on student test scores, parental satisfaction, school safety, and other participant outcomes; and (3) whether principals at DC public schools and private schools plan to manage their educational institutions differently in response to the establishment of the DC Opportunity Scholarship Program.


General Analytic Strategy


It is well known that the independent effects of school choice on student outcomes are difficult to estimate. Perhaps the most significant difficulty faced by researchers is selection bias – the self-selection of families who seek out a new school choice for their child, and the mutual student/school decision process that sorts students into different types of schools. Because this bias generally results from unmeasured factors, most researchers have preferred a randomized experiment over non-experimental statistical methods. Because the DC Opportunity Scholarship Program distributes scholarships randomly by lottery, we will use experimental methods to the extent possible, under certain conditions and within certain parameters, to estimate most programmatic impacts.


To motivate the discussion of how we identify the effect of the scholarship program on test scores, it is useful to begin with a simple representation of the selection problem as a missing data problem, using the potential outcomes approach. This approach defines causal effects in terms of potential outcomes or counterfactuals. Conceptually, the causal effect of treatment is defined as the difference between the “outcome for individuals assigned to the treatment group” and the “outcome for the treatment group if it had not received the treatment,” or:


(E.1) $E(Y_i \mid X_i, T_i = 1) - E(Y_i \mid X_i, T_i = 0)$


In the case of scholarships, the treatment effect – the effect of the scholarships on academic achievement – would be defined as the difference between “test scores for program students” and “test scores for program students if they had not received a scholarship.” The fundamental problem is that a student is never observed simultaneously in both states of the world. What is observed is a student in the treatment group ($T_i = 1$) or in the control group ($T_i = 0$). The outcome in the absence of treatment, $E(Y_i \mid X_i, T_i = 0)$, is then the counterfactual – what would have occurred to those students receiving the scholarships if they had not received them.


If students receiving scholarships were identical to other students in both observable and unobservable characteristics, the counterfactual could be generated directly from an appropriately selected comparison group. Valid comparison groups are rarely found in practice, however. The random assignment of students into the program generates the counterfactual from the control group – eligible applicants who did not receive a scholarship.[4] If correctly implemented, random assignment yields statistically equivalent groups, and allows estimation of the program impact through differences in mean outcomes between the two groups.

Consistent with this approach is the following basic analytic model of the effects of school choice scholarships on outcomes. Consider first the outcome equation for the test score of student $i$ in year $t$. It is reasonable to assume that test scores ($Y_{it}$) are determined as follows:


(E.2) $Y_{it} = \alpha + \tau T_{it} + X_i \gamma + \varepsilon_{it}$, if $t > k$ (period after the program takes effect)


In equation (E.2), $T_{it}$ is equal to one if the student has the opportunity to participate in the voucher program (i.e., the award rather than the actual use of the voucher) and equal to zero otherwise. $X_i$ is a vector of student characteristics (measured at baseline) known to influence future academic achievement, such as prior test scores, mother’s level of education, family income, etc. In this model, $\tau$ represents the effect of vouchers on test scores for students in the program, conditional on $X_i$. With a properly designed experiment, using a concise and judiciously chosen set of statistical controls for characteristics that predict future achievement should improve the precision of the estimated impact; the estimated treatment effect $\tau$ should match, in expectation, the difference in mean outcomes between the treatment and control groups.
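
A minimal simulation sketch of (E.2) in Python (illustrative only, not the study’s analysis code; the sample size, true impact, and covariate strength are invented): under random assignment, both the simple difference in means and the covariate-adjusted OLS coefficient recover the true $\tau$, but the adjusted estimate has a smaller standard error because the baseline score absorbs outcome variance.

```python
# Sketch: randomized experiment per equation (E.2); parameter values are
# illustrative assumptions, not estimates from the OSP evaluation.
import numpy as np

rng = np.random.default_rng(0)
n, tau = 2000, 5.0                      # assumed sample size and true impact
x = rng.normal(50, 10, n)               # baseline test score (covariate)
t = rng.integers(0, 2, n)               # lottery: random 0/1 assignment
y = 10 + tau * t + 0.8 * x + rng.normal(0, 8, n)

# Unadjusted estimate: simple difference in treatment/control means.
diff = y[t == 1].mean() - y[t == 0].mean()
se_diff = np.sqrt(y[t == 1].var(ddof=1) / (t == 1).sum() +
                  y[t == 0].var(ddof=1) / (t == 0).sum())

# Covariate-adjusted estimate: OLS of y on [1, T, X].
X = np.column_stack([np.ones(n), t, x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se_adj = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

print(f"diff-in-means: {diff:.2f} (se {se_diff:.2f}); "
      f"adjusted tau-hat: {beta[1]:.2f} (se {se_adj:.2f})")
```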


Customization of General Analytic Strategy


Since the initial applicants were randomized within certain relevant subgroups, we propose a randomized block design for analyzing scholarship program impacts. The randomized block design divides the sample into relatively homogeneous groups (called blocks), and vouchers are then randomly assigned within each block. We are interested in how academic achievement ($Y$) is affected by assignment into a voucher program. Suppose we could identify $b$ blocks – based on grade and scholarship priority status – each of size $n$. Consider then the following statistical model for this randomized block design:


(E.3) $Y_{ikt} = \mu + \tau T_{ikt} + \sum_{j=2}^{b} \rho_j B_{ikj} + X_{ik} \gamma + \varepsilon_{ikt}$


where

$i = 1, \ldots, n$ indexes observations and $k = 1, \ldots, b$ indexes blocks (defined by grade and priority status);

$Y_{ikt}$ is the outcome for student $i$ in block $k$ at time $t$;

$\mu$ is the overall mean outcome (e.g., test score);

$\tau$ is the treatment (scholarship program) effect;

$\rho_j$ is the $j$th block effect;

$T_{ikt}$ is assignment into the voucher program;

$B_{ikj}$ indicates assignment of student $i$ to block $j$;

$X_{ik}$ represents observable characteristics, measured at baseline;

$\varepsilon_{ikt}$ is the random error, independent $N(0, \sigma_{\varepsilon}^2)$.


This analytical framework follows naturally from the randomization scheme and is easily implemented and interpreted. $Y$ can be measured in several different dimensions, including test scores, school satisfaction, parental satisfaction, grade completion and, where appropriate, high school graduation. $\mu$ is the average outcome for all program members; $\rho_j$ is the average effect of block $j$; and $\tau$ is the effect of vouchers on academic achievement. The remainder of this section discusses econometric concerns and the associated empirical methods.
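
As a concrete illustration of (E.3), the following Python sketch (again illustrative; the number of blocks, block sizes, and effect sizes are invented) estimates $\tau$ from within-block treatment contrasts, with $b - 1$ block indicators absorbing mean differences across blocks.

```python
# Sketch: randomized block design per equation (E.3); blocks stand in for
# grade-by-priority-status strata, with invented sizes and effects.
import numpy as np

rng = np.random.default_rng(1)
n_blocks, per_block, tau = 6, 300, 4.0
block = np.repeat(np.arange(n_blocks), per_block)
# Lottery within each block: exactly half of each block is offered a voucher.
t = np.concatenate([rng.permutation([1] * 150 + [0] * 150)
                    for _ in range(n_blocks)])
block_effect = rng.normal(0, 6, n_blocks)[block]   # rho_j, drawn once per block
y = 40 + tau * t + block_effect + rng.normal(0, 10, block.size)

# Design matrix: intercept, treatment, and b-1 block indicators (j = 2..b).
dummies = (block[:, None] == np.arange(1, n_blocks)).astype(float)
X = np.column_stack([np.ones(block.size), t, dummies])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"tau-hat = {beta[1]:.2f}")   # close to the true tau = 4.0
```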


Take-Up of Scholarships


Even with a properly implemented experiment, we may expect slippage between the random assignment into the experiment and use of the scholarship at a private school. This occurrence has been observed in very different experimental settings, including medical trials, job training and health insurance experiments. More relevant to our exercise is the slippage that has been observed in previous school voucher experiments, such as the Milwaukee Parental Choice Program. Such slippage has important implications for the estimators of the effect of the scholarship program. Generally we define two broad estimators of interest. The first, commonly referred to as the "Intent to Treat" (ITT), is the effect of the offer of a scholarship on student outcomes. All students randomized into the sample make up the experimental sample, regardless of whether they use the scholarship to attend a private school.


Policymakers are typically also interested in the effect of scholarship use on student achievement. This estimator, commonly referred to as the "Impact of the Treated" (IOT), is based on the sample of scholarship users. Instrumental variable analysis provides us with a well-established method to generate an estimate of the scholarship impact on the treated from the ITT estimator.[5]


Using only the sample of scholarship users in this case could introduce a form of selection bias, in that the sample of students using the voucher to attend private schools is selected (from the randomized-in sample). Self-selection bias results when family characteristics (observable and unobservable) that affect student outcomes also affect the decision to use the voucher. For example, families who care more about education and are more able to gather and analyze relevant information about schools are also the families whose children are more likely to make use of the voucher, all else equal. Students in such families are also more likely to do better once in a private school setting than their randomized-in counterparts who do not use the voucher. To see the point, consider the following models of actual use of the voucher and of student test scores.[6]


(E.4) $V_{it} = \sigma_0 + \sigma_1 T_{it} + X_i \sigma_2 + \varepsilon_{it}$



where

$i$ indexes students and $t$ time;

$V_{it}$ indicates use of the voucher;

$T_{it}$ indicates treatment status (= 1 if selected in the lottery);

$X_i$ represents observable characteristics.


Note that when oversubscribed schools randomly select from applicants, that selection is random conditional on the school and grade of the applicant. Such effects would be controlled for in the randomized block design proposed in equation (E.3).


We also recognize that a model of student outcomes would be based on actual voucher use/attendance at a private institution.


(E.5) $Y_{it} = \pi_0 + \pi_1 V_{it} + X_i \pi_2 + \nu_{it}$


Combining equations (E.4) and (E.5), we get


(E.6) $Y_{it} = \psi_0 + \psi_1 T_{it} + X_i \psi_2 + \xi_{it}$
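
Substituting (E.4) into (E.5) makes the mapping between the structural coefficients and the reduced form coefficients in (E.6) explicit:

```latex
\begin{aligned}
Y_{it} &= \pi_0 + \pi_1\bigl(\sigma_0 + \sigma_1 T_{it} + X_i\sigma_2 + \varepsilon_{it}\bigr) + X_i\pi_2 + \nu_{it} \\
       &= \underbrace{(\pi_0 + \pi_1\sigma_0)}_{\psi_0}
        + \underbrace{\pi_1\sigma_1}_{\psi_1}\,T_{it}
        + X_i\,\underbrace{(\pi_1\sigma_2 + \pi_2)}_{\psi_2}
        + \underbrace{(\pi_1\varepsilon_{it} + \nu_{it})}_{\xi_{it}}.
\end{aligned}
```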


What these equations show is that the estimated treatment effect $\psi_1$ combines the effect of selection into the program on voucher use with the effect of private school attendance on student outcomes ($\psi_1 = \pi_1 \sigma_1$). Note that $\psi_1$ corresponds to the treatment effect in the empirical models (E.2) and (E.3). What we estimate in the ITT model is therefore the reduced form effect of both margins of response: student learning in private schools and family take-up of scholarship dollars. It is important to note that $\psi_1$ is in some respects the policy parameter of interest, since families cannot be compelled to use available scholarships. Its decomposition is nevertheless quite useful for learning about the effectiveness of different types of schools on educational attainment, and about the success of, in this case, publicly funded scholarships. Our empirical analysis will examine, among other margins, family choices regarding take-up of the scholarships as well as the types of schools selected.
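
A minimal simulation sketch in Python of the ITT/IOT distinction (illustrative only; the take-up process and effect sizes are invented): the lottery offer $T$ serves as an instrument for voucher use $V$, and with a binary instrument and no covariates the IV estimate reduces to the Wald ratio of the ITT effect to the first-stage take-up rate, while a naive comparison of users to non-users is contaminated by self-selection.

```python
# Sketch: ITT vs. IOT under imperfect take-up, per equations (E.4)-(E.6).
# The "ability" confounder and all rates below are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 4000
ability = rng.normal(0, 1, n)          # unobserved; drives self-selection
t = rng.integers(0, 2, n)              # lottery offer (randomly assigned)
# Take-up: only offered students can use a voucher, and higher-"ability"
# families are more likely to take it up -- the selection problem.
v = (t == 1) & (rng.random(n) < 0.3 + 0.4 * (ability > 0))
y = 0.25 * v + 0.5 * ability + rng.normal(0, 1, n)   # true use effect = 0.25

itt = y[t == 1].mean() - y[t == 0].mean()        # psi_1: effect of the offer
take_up = v[t == 1].mean() - v[t == 0].mean()    # sigma_1: first stage
iot = itt / take_up                              # Wald/IV estimate of pi_1
naive = y[v].mean() - y[~v].mean()               # biased user/non-user gap
print(f"ITT = {itt:.3f}, IOT (IV) = {iot:.3f}, naive = {naive:.3f}")
```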


These types of analyses will be performed for the various outcome measures called for in the law, including academic achievement, safety, satisfaction, and other student outcomes.


Reports


The DC Choice Act requires annual reporting to Congress. The first report will describe who applied to the DC Opportunity Scholarship Program, largely by comparing the demographic characteristics and achievement of program applicants with those of other DCPS students. Subsequent reports will focus on the impact of the program, using the experimental and multivariate regression techniques described above to estimate differences in outcomes between the treatment and control group members. Based on the guidance in the legislation, the reports will focus primarily on conditions and outcomes involving student academic performance, parental satisfaction, school safety, and the process by which parents select schools. A schedule for the reports and data files is provided in Table 4.


Table 4. Deliverable Schedule

| Deliverable | Schedule |
|---|---|
| Descriptive Report | Spring 2006 |
| First Impact Year Report | Spring 2007 |
| Second Impact Year Report | Spring 2008 |
| Third Impact Year Report | Spring 2009 |
| Fourth Impact Year Report | Spring 2010 |
| Data Files with Documentation | July 2010 |



A17. Display of Expiration Date for OMB Approval


All data collection instruments will include the OMB expiration date.



A18. Exceptions to Certification Statement


No exceptions are requested.


[1] If, after OMB grants approval for this new round of data collection, the funds to conduct the collection ultimately fail to be appropriated by the full Congress, ED will formally terminate the approval.

[2] In March 2004, a grant to run the program was awarded to the Washington Scholarship Fund, a non-profit organization that operates a privately funded scholarship program for students in the DC area.

[3] Participating private schools do not want government intrusion into their campus activities, and view the evaluation testing as one source of that intrusion; the public school system did not want to appear to be endorsing the voucher program being studied by allowing testing in the public schools.

[4] See the following studies, which all use the same data from an evaluation of a New York City privately funded scholarship program: Howell, William G., Patrick J. Wolf, David E. Campbell, and Paul E. Peterson, “School Vouchers and Academic Performance: Results from Three Randomized Field Trials,” Journal of Policy Analysis and Management, 21:2, 2002; Barnard, John, Constantine E. Frangakis, Jennifer L. Hill, and Donald B. Rubin, “Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City,” Journal of the American Statistical Association, 98:462, 2003; Krueger, Alan B., and Pei Zhu, “Another Look at the New York City School Voucher Experiment,” Working Paper Series, Education Research Section, Princeton University, March 2003.

[5] For an extended discussion of the use of this technique under such circumstances, see Howell et al., The Education Gap, pp. 49-51.

[6] Rouse, Cecilia Elena, “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program,” The Quarterly Journal of Economics, May 1998, pp. 553-602.


