
U.S. Department of Education






Impact Evaluation of the DC Opportunity Scholarship Program







Office of Management and Budget

Statement for Paperwork Reduction Act Submission



Part B: Collection of Information Employing Statistical Methods


Contract ED-04-CO-0126






October 3, 2007



TABLE OF CONTENTS




PART B. Collection of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling Procedures

B.2. Information Collection Procedures

B.3. Methods to Maximize Response Rates

B.4. Tests of Procedures

B.5. Individuals Consulted on Statistical Aspects of Design




PART B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


This package requests OMB clearance for data collection instruments previously approved by OMB (OMB No. 1850-0800, approval notice dated 4/15/05). This package is identical in content to the previously approved package with one exception: the incentive section (A9) has been updated to reflect increases in the incentive amounts approved by OMB on 7/22/05 and 12/19/05. (Minor changes in wording have been made to the section headings to reflect the current OMB headings.)

B.1. Respondent Universe and Sampling Procedures

The Impact Evaluation of the DC Opportunity Scholarship Program will be based on administering assessments and surveys to the universe of program applicants and their parents, as well as to school principals. The surveys that will provide information on schools’ competitive response to the program will include the universe of DCPS principals and of private school principals. Therefore, no sampling is proposed for this study.



B.2. Information Collection Procedures

Student Assessments


We plan to administer the SAT-9 math and reading assessments when the treatment and control group families come in to renew their eligibility for the Program, so that test administration conditions will be similar for all members of the evaluation sample. Scholarship users will clearly be the most motivated to attend, so we will take steps to encourage scholarship non-users (decliners) and control group members to fulfill the requirements to participate in the evaluation’s data collection. These assessments will be administered in early April of each year for the four years of the evaluation’s data collection.


School Records


Administrative records will be collected from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. In addition, Westat will seek to obtain these data for all public school students, including those in charter schools, so that the program applicants can be compared to other students in the relevant grade levels, as required by the DC Choice Act.


Parent Surveys


The legislation requires the evaluation to examine the impact of the program on parents. Parent surveys will be administered when parents come in to renew their child’s program eligibility, with telephone follow-up as necessary.


Student Surveys


Each year, the study will survey treatment and control group students in grades four and above. The surveys will be administered in each year of the program and are likely to occur at the same time (and place) as the student assessments: the family events where parents come in to renew program eligibility.


Principal Surveys


Surveys will be mailed to principals of all private schools in DC and principals of all public and charter schools in DCPS. The surveys will be mailed in the spring of each data collection year, with telephone follow-up.



B.3. Methods to Maximize Response Rates


We will maximize the response rate for this portion of the study both by distributing survey instruments that are fairly easy for respondents to complete and by following up with non-respondents by mail, fax, and telephone. Although the primary respondents are quite disadvantaged (eligibility requires family income at or below 185 percent of the poverty level), the study is striving for a response rate of 80 percent.


Obtaining high response rates in the Impact Evaluation of the DC Opportunity Scholarship Program will be critical to the success of the study. It will be particularly important to obtain response rates that are not only high overall but also approximately equal in the treatment and control groups, since differential nonresponse across the groups could bias the impact estimates. This will be challenging because, while most of the treatment group will presumably be in a relatively small set of participating private schools, control group students will likely attend a large number of different DCPS schools, and the identities of these schools will not be known in advance.


We have several strategies for ensuring a high rate of response. First, we plan to conduct most of the data collection (student assessments and student and parent surveys) at events the program operator will hold for the treatment and control groups to re-establish eligibility for the program. Second, because of a key provision in the law, in our communications with parents we can stress that participation in the evaluation’s data collection is required for students to keep their scholarship or remain eligible to receive a scholarship in the future. We believe these requirements will be a formidable incentive to respond to the surveys and assessments. Finally, we will employ a sophisticated tracking system to ensure that we follow up with non-respondents in a timely and comprehensive way.



B.4. Tests of Procedures


We pre-tested each of the surveys with nine or fewer people who were demographically similar to respondents in the study. We asked the pretest respondents to first complete the relevant survey and then participate in a focus group about it. In the focus group discussions we assessed completion times, perceived burden, salience of language, concept recognition, and understanding of terms. After the pretests, we revised the surveys as needed based on the pretest results.



B.5. Individuals Consulted on Statistical Aspects of Design

The statistical aspects of the design have been reviewed thoroughly by staff at the Institute of Education Sciences, as well as by members of the study’s expert panel (listed in Part A, section A8). Table 1 shows the individuals most closely involved in developing the statistical procedures.


Table 1. Individuals Involved in this Project

Name                  Affiliation             Role                         Phone Number
Babette Gutmann       Westat                  Project Director             (301) 738-3626
Patrick Wolf          University of Arkansas  Principal Investigator       (479) 575-2084
Mike Puma             CRA                     Senior Analyst               (410) 897-4968
Juanita Lucas-McLean  Westat                  Director of Data Collection  (301) 294-2866
Marsha Silverberg     ED/IES                  Economist, COR               (202) 208-7178

