

National Evaluation of the

Voluntary Public School Choice (VPSC) Program:

Final Phase



Draft

Request for OMB Clearance



May 2, 2006



Submitted by:


COSMOS Corporation

3 Bethesda Metro Center, Suite 400

Bethesda, Maryland 20814



Submitted to:


Policy and Program Studies Service

U.S. Department of Education

400 Maryland Avenue, SW

Room 6W207

Washington, DC 20202-0498



Task Order No. 1

Contract No. ED-04-CO-0047


CONTENTS


Part A. Justification

A1. Circumstances Requiring the Collection of Data
A2. Purposes and Uses of the Data
A3. Use of Technology to Reduce Burden
A4. Efforts to Identify Duplication
A5. Methods to Minimize Burden on Small Entities
A6. Consequences of Not Collecting the Data
A7. Special Circumstances
A8. Federal Register Comments and Persons Consulted Outside the Agency
A9. Payment to Respondents
A10. Assurance of Confidentiality
A11. Justification of Sensitive Questions
A12. Estimates of Response Burden
A13. Estimates of Cost Burden to Respondents
A14. Estimate of Annual Cost to the Federal Government
A15. Program Changes or Adjustment
A16. Plans for Tabulation and Publication of Results
A17. Approval to Not Display Expiration Date
A18. Explanation of Exceptions


Part B. Description of Statistical Methods

B1. Respondent Universe
B2. Procedures for Data Collection
B3. Methods to Maximize Response Rates
B4. Tests of Procedures or Methods
B5. Names and Telephone Numbers of Individuals Consulted


Exhibits

1. Aggregate Annual Respondents and Hour Burden for Final Phase Evaluation
1a. Estimated Annual Burden for Gaining Cooperation from Comparison Sites
1b. Estimated Annual Burden for Participants in Site Visits
1c. Estimated Annual Burden for Respondents to School Survey
1d. Estimated Annual Burden for Student-Level Achievement Data
1e. Estimated Annual Burden for Lottery Data
2. Timeline of Data Collection Activities
3. Evaluation Framework Guiding Data Collection Activities
4. Overview of Data Collection


Attachments

1. No Child Left Behind Act of 2001 (P.L. 107-110, Title V, Part B, Subpart 3)
2. Federal Register Notice
3. Data Collection Instruments

PART A. JUSTIFICATION


A1. Circumstances Requiring the Collection of Data

This request for clearance covers the data collection activities for the National Evaluation of the Voluntary Public School Choice (VPSC) Program: Final Phase. Earlier data collection activities under the Initial Phase were covered by OMB Clearance No. 1875-0224, which expires on October 31, 2006.

The VPSC Program. The VPSC Program supports the expansion of public school choice as part of the No Child Left Behind Act of 2001 (P.L. 107-110, Title V, Part B, Subpart 3). (Attachment 1 contains the language from sections 5241-5246 authorizing VPSC.) The VPSC Program is helping selected school districts to establish or expand public school choice initiatives and to provide options for parents to secure a high-quality education for their children, especially options for students in low-performing schools (“sending” schools) to transfer to higher performing schools (“receiving” schools). In September 2002, the program competitively awarded 13 five-year grants, ranging from $3.4 million to $17.8 million. The awards averaged $9.2 million, or approximately $1.8 million per year, and were made to state education agencies and local school districts as well as to one nonprofit (charter school) organization.

The grant activities, far from representing a common “intervention,” are extremely heterogeneous. The grants range from statewide programs, to districtwide programs, to programs within selected zones of a district. The use of VPSC funds also varies: some of the VPSC Program’s sites have focused more of their funds on implementing parent information centers that provide information on students’ school choice options; others have used more funds to build new capacities (i.e., educational programs) at receiving schools; and yet others have invested more heavily in professional development related to teachers’ needs in dealing with newly incoming students.


The National Evaluation. The Final Phase of the National Evaluation of VPSC (June 2006–March 2008) will follow the design developed for the Initial Phase of the National Evaluation (Sept. 2002–May 2006). Although the Initial Phase’s data collection experiences have led to slight modifications in the design, the evaluation questions have remained the same throughout both phases:


1. What are the characteristics of the VPSC Program’s sites?


1.1 What organizations or partnerships received grants?

1.2 Are the funded initiatives located in diverse areas (e.g., urban, suburban, rural)?

1.3 What are the characteristics of the students who choose another school (e.g., demographic and academic characteristics)?


2. How and to what extent does the program promote educational equity and excellence?


2.1 What strategies are funded with the federal grants (e.g., transportation, marketing, or funding for schools of choice)?

2.2 How do the strategies work in conjunction with Title I choice accountability provisions?

2.3 How do the initiatives improve parents’ awareness of their children’s options?

2.4 What percent of eligible students and schools participate in the initiatives?

2.5 How do the initiatives facilitate choices by low-income, minority, or low-performing students?

2.6 To what extent do the initiatives enable students to move from low- to higher performing schools?

2.7 Are the choice options associated with changes in other districts or public schools in the area that are not part of the federally funded program?


3. What academic achievement is associated with the VPSC Program?


3.1 What are the academic trends for students who transfer from low- to higher performing schools?

3.2 To what extent are the funded initiatives associated with these trends?

3.3 To what extent are the funded initiatives associated with the overall academic quality of schools and districts in the area?


Overall, the evaluation’s objective is to understand the choice initiatives and outcomes associated with the VPSC Program’s funds.


A2. Purposes and Uses of the Data

The primary purpose for the data is program evaluation. The data will enable ED to document and assess the progress and accomplishments of the VPSC Program and its awardees. ED can use the data to: 1) determine whether to modify or extend the VPSC concepts, and 2) share examples of the diversity of choice practices with school, district, and state educators. Furthermore, the data will provide insights into the working assumptions about public school choice initiatives as a strategy for improving public education. Finally, the data will help inform policy decisions by providing Congress with important information about school choice and the use of the appropriated funds.


A3. Use of Technology to Reduce Burden

Technology will be used to reduce burden on the VPSC Program’s sites. Examples include the collection of data through FAX, e-mail, and Web site communications, as well as the exchange of electronic data files. For instance, a school survey to an average of 50 schools at each site has been conducted via FAX transmissions instead of being administered as a traditional mail survey. As a second example, the team routinely uses e-mail to submit draft site visit reports to the sites for their review and correction. Finally, the VPSC Program’s sites have submitted detailed data about their students’ performance in electronic files.


A4. Efforts to Identify Duplication

The National Evaluation is the only study of the 13 sites in the VPSC Program and therefore is not duplicated by any other cross-site study. Each of the 13 sites is conducting its own evaluation, but mainly to guide the implementation of its own initiative. Nevertheless, these local evaluation efforts may potentially overlap with those of the National Evaluation. Therefore, the following steps have already been taken to identify and avoid duplication. First, the National Evaluation helped to convene a plenary meeting of the VPSC Program’s Project Directors (January 2003 in Tampa, FL) to discuss these matters and the potential overlaps in data collection in detail. The team then participated in the VPSC Program’s Project Directors’ meetings in May 2004 (in Rockford, IL) and in July 2005 (in Washington, DC), and it plans to participate in the fourth meeting, tentatively scheduled for June 2006 (in Minneapolis, MN). At these meetings, the National Evaluation team presents preliminary findings and outlines future data collection activities. The team also meets individually with the sites and local evaluators to coordinate ongoing data collection efforts.

A5. Methods to Minimize Burden on Small Entities

The evaluation has no plans for collecting data from small entities, as all of the data will come from public school entities. Therefore, no impact on small entities is expected.


A6. Consequences of Not Collecting the Data

VPSC represents an initiative of major interest in contemporary educational policy. Expanding school choice options is high among federal, state, and local priorities, and the progress made by the VPSC Program has been of national interest. The National Evaluation is unique because it is collecting cross-site evaluation data that will not be compiled or available elsewhere. The VPSC Program’s progress cannot be assessed from extant databases alone; without the National Evaluation, the needed information will not be available for dissemination to interested parties at the federal, state, or local levels. In addition, ED would be unable to comply with its congressional mandate to evaluate the VPSC Program.


A7. Special Circumstances

There are no special circumstances. The proposed data collection fully complies with 5 CFR 1320.5(d)(2).


A8. Federal Register Comments and Persons Consulted Outside the Agency

To support the present clearance request, a Federal Register Notice (Vol. XX, No. XX) was published on [xx/xx/xxxx] (Attachment 2 contains a copy). During the ensuing three-month period, no public comments were recorded.

The development of the design for the National Evaluation and the planned data collection benefited from the advice and review of an eight-member expert panel that has met periodically during the Initial Phase of the evaluation. Individual members of the panel also have been consulted directly throughout the life of the evaluation. The panel was further convened in April 2006 to review and comment on the revised Evaluation Design and this current document. The panel includes several scholars who have published papers on school choice initiatives; its members are:


Frank Brown, Ph.D.

Professor of Education

Director, Educational Research and Policy Studies

University of North Carolina, Chapel Hill


Peter W. Cookson, Jr., Ph.D.

Dean of Graduate School

Professor of Educational Administration

Lewis & Clark College


David Heistad, Ph.D.

Executive Director

Testing, Evaluation, and Student Information

Minneapolis Public Schools


Valerie Lee, Ed.D.

Professor, School of Education

University of Michigan


Janelle T. Scott, Ph.D.

Assistant Professor

Steinhardt School of Education, New York University


Paul Teske, Ph.D.

Professor

Graduate School of Public Affairs

University of Colorado at Denver


Patrick J. Wolf, Ph.D.

Associate Professor

Georgetown Public Policy Institute


Todd Ziebarth, MPA

Senior Policy Analyst

National Alliance for Public Charter Schools



In addition to this panel, the evaluation team also has consulted with the following individual who has a special expertise in evaluating school choice initiatives:


Clive Belfield, Ph.D.

Assistant Director

National Center on School Privatization

Teachers College



Dr. Belfield reviewed the Initial Phase evaluation design and interim reports, and he has provided written comments. Together, he and the panel members represent a variety of specialties and academic disciplines (e.g., education, sociology, political science, statistics, and economics).


A9. Payment to Respondents

No payments or gifts will be made to any of the respondents in this evaluation.


A10. Assurance of Confidentiality

All individual and institutional data collection by the National Evaluation will be conducted in accordance with the provisions of the Privacy Act of 1974. The National Evaluation team will repeat the procedures it used during the Initial Phase of the National Evaluation of the VPSC Program, which include using individual student records in aggregate form only and referencing individual interview records with coded identifiers to preserve each respondent’s anonymity. In addition, the National Evaluation team will not identify any comparison sites by name.


A11. Justification of Sensitive Questions

No questions will be asked that are of a sensitive nature.


A12. Estimates of Response Burden

The estimated annual response burden is 1,106 person-hours. This total represents the sum of the estimated burden for all portions of the Final Phase evaluation (see exhibit 1). Detailed breakdowns of the estimated hours and cost burden for each item listed in exhibit 1 are included in the narrative and exhibits that follow later in this section (see exhibits 1a-1e).


Exhibit 1

Aggregate Annual Respondents and Hour Burden for Final Phase Evaluation

Item                                            Number of      Hour      Estimated cost
                                                respondents    burden    of burden
a. Gaining cooperation from comparison sites         26           26         $1,404
b. Site visits                                      338          455        $19,695
c. School survey                                    585          193         $8,880
d. Student-level achievement data                    13          312        $16,848
e. Lottery data                                       3          120         $6,480
Total                                               965        1,106        $53,307
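
The arithmetic behind exhibit 1 and the detail exhibits 1a-1e can be re-derived directly. The following minimal Python sketch is illustrative only (it is not part of the clearance package); it encodes each respondent group as a number of respondents, hours per respondent, and hourly rate, and reproduces the exhibit totals, including the rounding in the school survey row:

```python
# Re-derivation of the exhibit 1 totals from the component rows in exhibits 1a-1e.
# Each tuple: (label, number of respondents, hours per respondent, hourly rate).
rows = [
    ("Gaining cooperation: district administrators", 26, 1.0, 54),
    ("Site visits: district/state administrators",   65, 3.0, 54),
    ("Site visits: school principals",               26, 3.0, 46),
    ("Site visits: teachers",                        78, 0.5, 35),
    ("Site visits: parents (no hourly rate)",       130, 0.5,  0),
    ("Site visits: comparison-site administrators",  39, 2.0, 54),
    ("School survey: principals",                   585, 0.33, 46),  # 20 minutes, as rounded in exhibit 1c
    ("Student-level achievement data",               13, 24.0, 54),
    ("Lottery data",                                  3, 40.0, 54),
]

total_respondents = sum(n for _, n, _, _ in rows)
total_hours = sum(n * h for _, n, h, _ in rows)
total_cost = sum(n * h * r for _, n, h, r in rows)

print(f"Respondents: {total_respondents}")        # 965
print(f"Annual hours: {round(total_hours):,}")    # 1,106
print(f"Estimated cost: ${round(total_cost):,}")  # $53,307
```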

Gaining Cooperation from Comparison Sites. The evaluation team worked with the VPSC Program’s sites to identify potential comparison sites, based on the criteria described in Section B1. At the start of data collection, the site visit team contacted the Title I coordinator at each comparison site to explain the nature of the study and obtain the site’s cooperation. In only one case did a potential comparison site decline to participate in the evaluation; in that instance, the evaluation team contacted and obtained cooperation from a second comparison site. In addition, several sites involve multiple districts and thus have required a multiple-district comparison site. In these instances, the evaluation team contacted the coordinators in several districts to secure the needed participation.

During the Final Phase of the evaluation, the evaluation team estimates that securing the continued cooperation of the 13 (in some cases multiple-district) comparison sites will require a one (1) hour phone conversation with each of approximately 26 district administrators, for a total burden of 26 person-hours (see exhibit 1a).

Exhibit 1a

Estimated Annual Burden for Gaining Cooperation from Comparison Sites

Item                        Type of           Number of      Time estimate    Total    Hourly    Estimated cost
                            respondent        respondents    (in hours)       hours    rate*     of burden
Gaining cooperation from    District          26             1                26       $54       $1,404
comparison sites            administrators
Total                                         26             -                26       -         $1,404

* Estimates based on National Occupational Employment and Wage Estimates (U.S. Bureau of Labor Statistics, November 2005).



Site Visits. During the Final Phase of the National Evaluation, the evaluation team will conduct site visits to the 13 VPSC sites and 13 comparison sites. At each VPSC Program site, the site visit team will interview five (5) district or state administrators, including the VPSC project director and staff, the Title I coordinator, and other relevant district personnel (possibly including representatives from the transportation, budget, and parent information offices). These interviews will last approximately three (3) hours each. At each VPSC Program site, the site visit team also will visit one sending and one receiving school. At each school, the site visit team will interview the principal (for 3 hours), three teachers (for 0.5 hours each), and five parents (for 0.5 hours each). The teacher and parent interviews may take the form of a small focus group. The total burden on the VPSC Program’s sites will be 377 person-hours (see exhibit 1b).

The evaluation team also will conduct visits to the comparison sites. To minimize the burden on these sites, the team will focus the interviews on key personnel in the districts. Approximately three (3) district administrators will be interviewed at each site (as compared to the five interviews deemed necessary at each VPSC site), including the Title I coordinator and other relevant district personnel (possibly including representatives from the magnet or charter schools offices, as well as a representative from the parent liaison or transportation offices). These interviews will last approximately two (2) hours each. The evaluation team will not conduct site visits to schools at comparison sites, and therefore no principals, teachers, or parents will be interviewed at the comparison sites. The total burden on the comparison sites will be 78 person-hours (see exhibit 1b).


Exhibit 1b

Estimated Annual Burden for Participants in Site Visits

Site                   Type of respondent                         Number of      Time estimate    Total    Hourly    Estimated cost
                                                                  respondents    (in hours)       hours    rate*     of burden
VPSC Program’s site    District or state administrators
                       (5 per site)                               65             3                195      $54       $10,530
                       School principals (2 schools per site)     26             3                78       $46       $3,588
                       Teachers (3 per school)                    78             .5               39       $35       $1,365
                       Parents (5 per school)                     130            .5               65       $0*       $0
                       Subtotal                                   299            -                377      -         $15,483
Comparison site        District administrators (3 per site)       39             2                78       $54       $4,212
Total                                                             338            -                455      -         $19,695

* Estimates based on National Occupational Employment and Wage Estimates (U.S. Bureau of Labor Statistics, November 2005). Parents will not be interviewed in their professional capacity and therefore will have no direct costs other than their time to participate.


In total, the site visits will require an annual burden of 455 person hours.


School Survey. Principals from up to 650 schools across the sites will be asked to complete a FAX survey that should take no longer than 20 minutes of their time. The number of estimated respondents is 585 (a 90 percent response rate), for an annual burden of 193 person-hours (see exhibit 1c).

Exhibit 1c

Estimated Annual Burden for Respondents to School Survey

Respondent                  Total sample    Estimated        Number of      Time estimate    Total    Hourly    Estimated cost
                            size            response rate    respondents    (in hours)       hours    rate*     of burden
School principals
(average of 50 per site)    650             90%              585            .33              193      $46       $8,880
Total                       650             90%              585            -                193      -         $8,880

* Estimates based on National Occupational Employment and Wage Estimates (U.S. Bureau of Labor Statistics, November 2005).



Student-Level Achievement Data. The evaluation team has asked the VPSC Program’s sites to submit student-level achievement data for all of the students participating in the VPSC initiative and for a comparison group of students. As part of their own implementation procedures, the sites should already have collected this information, so providing it to the evaluation team should be only a matter of copying existing records and duplicating relevant documentation. In addition, the sites strip all identifying information from the records (e.g., name, social security number) so that the files remain completely anonymous. The evaluation team estimates that these efforts, along with occasional phone or e-mail dialogue with the sites, will take approximately 3 business days (24 hours) for each grant administrator. The number of estimated respondents is 13 (a 100 percent response rate), for an annual burden of 312 person-hours (see exhibit 1d).


Exhibit 1d

Estimated Annual Burden for Student-Level Achievement Data

Item                       Type of           Number of      Time estimate    Total    Hourly    Estimated cost
                           respondent        respondents    (in hours)       hours    rate*     of burden
Obtaining student-level    District          13             24               312      $54       $16,848
achievement data           administrators
Total                                        13             -                312      -         $16,848

* Estimates based on National Occupational Employment and Wage Estimates (U.S. Bureau of Labor Statistics, November 2005).
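
As context for the de-identification step described above, the following is a minimal pandas sketch of the kind of processing a site might perform before sharing records; the file layout and column names are hypothetical, not the sites’ actual formats:

```python
# Hypothetical sketch: drop direct identifiers from student records
# before sharing, keeping only the analysis variables.
import pandas as pd

records = pd.DataFrame({
    "name": ["A. Student", "B. Student"],                      # identifier (dropped)
    "social_security_number": ["000-00-0001", "000-00-0002"],  # identifier (dropped)
    "grade": [4, 5],
    "transfer_status": ["enrollee", "non-applicant"],
    "reading_score": [642, 615],
})

identifiers = ["name", "social_security_number"]
anonymous = records.drop(columns=identifiers)
anonymous.to_csv("student_records_anonymous.csv", index=False)  # file to share
```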



Lottery Data. When the number of applicants exceeds the number of available seats, the VPSC Program stipulates that the sites must select students to participate on the basis of a lottery. During the Final Phase of the National Evaluation, the evaluation team intends to use data from a sample of the sites’ lotteries to evaluate the choice initiatives more closely. The sample consists of up to three of the VPSC Program’s sites that are planning to use lotteries in anticipation of oversubscription. The evaluation team will ask these sites to share the lottery results, as well as subsequent student achievement scores for both the students placed in a school of choice through the lottery and the applicants not placed through the lottery. The sites should have these data available in a form that is easy to share, requiring only the copying of existing records, the preparation of documentation, and the removal of any identifying information. The evaluation team estimates that these efforts will take approximately 5 business days (40 hours) for each site. The number of estimated respondents is 3 (a 100 percent response rate), for an annual burden of 120 person-hours (see exhibit 1e).



Exhibit 1e

Estimated Annual Burden for Lottery Data

Item                 Type of           Number of      Time estimate    Total    Hourly    Estimated cost
                     respondent        respondents    (in hours)       hours    rate*     of burden
Obtaining lottery    District          3              40               120      $54       $6,480
data                 administrators
Total                                  3              -                120      -         $6,480

* Estimates based on National Occupational Employment and Wage Estimates (U.S. Bureau of Labor Statistics, November 2005).
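
Because lottery placement is random, a comparison of placed applicants with applicants who were not placed approximates an experimental contrast. The following sketch illustrates that comparison with fabricated placeholder scores (the evaluation’s actual analysis plans are described in section A16):

```python
# Illustrative lottery-based contrast: later achievement of applicants
# placed through the lottery vs. applicants not placed. Scores are
# fabricated placeholders, not evaluation data.
from scipy import stats

placed_scores = [652, 640, 668, 655, 649]
not_placed_scores = [630, 645, 628, 638, 641]

t, p = stats.ttest_ind(placed_scores, not_placed_scores, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```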



A13. Estimates of Cost Burden to Respondents

The planned respondents range from district administrators to parents. The hourly rate for each respondent is indicated in the exhibits in section A12. There are no respondent costs other than those outlined in section A12.


A14. Estimate of Annual Cost to the Federal Government

The total cost for the evaluation is $823,865 over 42 months. The average annualized cost is $235,390. Most of the costs for the evaluation are incurred in year 3, when data collection efforts are underway.


A15. Program Changes or Adjustment

This request is for an extension to continue information collection that began during the Initial Phase of the National Evaluation.


A16. Plans for Tabulation and Publication of Results


Plans for Tabulating Results. The analysis plans cover both quantitative and qualitative data. On the quantitative side, the National Evaluation will establish baseline and implementation trends of student academic performance. The evaluation is not intended to be a complete outcome analysis, because achievement scores for the final year of VPSC Program implementation (June 2007) will likely not be available in a form that can be analyzed before the proposed evaluation ends (March 2008). The quantitative analysis began by determining the robustness of any differences in student achievement trends, as of the middle of the VPSC Program, between the VPSC Program’s sites and comparison sites (school level) and between transferring and non-transferring students (student level, including the multiple categories of non-transferring students discussed later). Where statistically significant differences are found, rate-of-return and other analyses will then be conducted to examine the potential costs and benefits of the choice initiatives in greater depth.
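
As an illustration of the school-level trend comparison described above, the following sketch contrasts least-squares achievement trends for a VPSC site and its comparison site; the score means are hypothetical, not evaluation data:

```python
# Hypothetical sketch: compare baseline-to-implementation achievement
# trends (points per year) for a VPSC site vs. its comparison site.
import numpy as np

years = np.array([2003, 2004, 2005, 2006], dtype=float)
vpsc_means = np.array([640.0, 646.0, 653.0, 661.0])        # placeholder school means
comparison_means = np.array([642.0, 645.0, 648.0, 652.0])  # placeholder school means

vpsc_slope = np.polyfit(years, vpsc_means, 1)[0]
comparison_slope = np.polyfit(years, comparison_means, 1)[0]
print(f"VPSC trend: {vpsc_slope:+.1f} pts/yr; comparison: {comparison_slope:+.1f} pts/yr")
```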

On the qualitative side, the initial analysis has focused on documenting the intensity, completeness, and fidelity of the implementation of the VPSC initiatives. To the extent that implementation has taken place satisfactorily, the analysis will then investigate the plausible arguments that can be made regarding the relationship between the VPSC initiatives and subsequent events, including the consideration of rival explanations.

Because of the heterogeneity of the VPSC initiatives, all cross-grant analyses will follow the principles of meta-analysis, treating each grant in effect as if it were an independent study.
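
A minimal sketch of that pooling principle combines per-grant effect estimates with inverse-variance (fixed-effect) weights; the estimates and standard errors below are placeholders, not evaluation results:

```python
# Fixed-effect (inverse-variance) pooling across grants, treating each
# grant's effect estimate as an independent study. Placeholder numbers.
import math

grant_estimates = [(0.12, 0.05), (0.08, 0.07), (0.20, 0.10)]  # (effect, SE) per grant

weights = [1 / se**2 for _, se in grant_estimates]
pooled = sum(w * e for (e, _), w in zip(grant_estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```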


Plans for Publishing the Results. The Final Phase of the National Evaluation will produce a single, final report. A draft will be available in the fall of 2007, with the final document scheduled for release in March 2008. The evaluation team also will hold periodic briefings to inform ED and other officials of the progress of the VPSC Program and of the National Evaluation, with a final briefing for ED staff in March 2008.

A17. Approval to Not Display Expiration Date

No request is being made for any exemption from displaying the expiration date.


A18. Explanation of Exceptions

This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.

PART B. DESCRIPTION OF STATISTICAL METHODS


B1. Respondent Universe

The respondent universe comprises the 13 VPSC Program’s sites and the 13 comparison sites.


A Nested, Multiple-Case Design. The design is nested because each of the 13 VPSC Program’s sites has groups of schools and groups of students within these schools. The purpose of the nested design is to acknowledge the unique attributes of each site. At the school level, the relevant schools at each site can be depicted as a “system of schools,” consisting of:


(a) All district schools, or all schools within a specified geographic portion of a district, if the initiative involves a districtwide or zone-wide arrangement;

(b) Schools from which students have transferred (sending schools);

(c) Schools to which students have transferred (receiving schools);

(d) Schools eligible to serve as sending schools but where no transfers may have occurred (other eligible sending schools); and

(e) Schools eligible to serve as receiving schools but where no transfers may have occurred (other eligible receiving schools).



Aggregate (school-level) data will be collected about the trends in academic performance of all of these different types of schools, to permit comparisons among the trends. The data also will cover the demographic characteristics of the enrolled students. Among other issues, the possibility exists that, if a choice program does not include either racial fairness or socioeconomic fairness guidelines, schools may become more segregated or socially stratified.

At the student level in the nested design, the relevant students include the following:


(a) Students applying for transfer and then transferring (enrollees);

(b) Students applying for transfer but not transferring, either by their own decision or because their applications could not be honored—e.g., due to a lack of seats (applicants);

(c) Students eligible to apply but not applying (non-applicants); and

(d) Students not eligible to apply—e.g., students already at a receiving school (non-eligibles).



The National Evaluation will use student-level data, gathered by the 13 sites, to compare the trends among these groups of students.


Definition of Comparison Sites. For each of the VPSC Program’s sites, the evaluation team selected a comparison site, consisting of a set of schools in a district or zone not involved in the VPSC Program. The comparison sites did not include schools that were part of the “system” of schools participating in a VPSC-funded initiative, including those schools in the system that were eligible but did not send or receive transferring students.

In addition to being outside of the “system” of schools, the selected comparison sites have several preferred characteristics. First, they were selected based on proximity to the VPSC-funded site, using the following criteria:


• For sites that limit choice to a well-defined zone or zones in their district, the comparison site is a similar set of other schools outside of the zone(s) but in the same district;

• For sites whose choice initiatives cover the whole district, the comparison site is another district in the same state;

• For sites that cover multiple districts (e.g., a metropolitan area or a rural area), the comparison site is a similar kind of multiple-district area in the same state; and

• For statewide sites, the comparison sites are non-participating districts in the same state that are similar to the participating districts.



Second, the comparison sites are comparable to the VPSC-funded sites in academic performance, demographic characteristics, and enrollment size. Where all three criteria could not be met, priority was given to the first two.

Third, the comparison sites had, at the time of selection, few public school choice opportunities similar to the VPSC Program. While some comparison sites offer magnet and charter schools, these same choice options previously existed at many of the VPSC Program’s sites. Similarly, both the comparison sites and the VPSC Program’s sites must offer choice related to Title I provisions. However, the comparison sites have historically had no choice initiatives resembling those funded by the VPSC Program.


Definition of the Initiatives Being Studied. As a final design issue, the National Evaluation defines the initiatives of interest as the ones being supported by VPSC funds. The use of VPSC funds has not only defined the initiatives of interest for the National Evaluation but also has helped focus attention on specific facets of the initiatives. For instance, at some of the 13 sites, the bulk of the funds is used to support and enhance parent information centers; other sites have invested heavily in parent outreach and media campaigns; yet others have provided funds for new educational programs at receiving schools to increase their capacity and attractiveness in serving transferring students. The use of funds thus has directed the National Evaluation’s attention to specific activities, while still attending to the overall VPSC choice initiatives at the 13 sites.


Sufficiency of Sample Size. The planned analysis will compare different groups of schools and students, using student achievement scores as outcome data. As part of the nested design, the number of schools in the study, estimated to exceed 1,000, constitutes a sample sufficiently large to detect medium effect sizes.
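
As a rough check on that claim, a standard two-group power calculation (sketched here with statsmodels, and assuming alpha = .05, power = .80, and Cohen's d = 0.5 for a "medium" effect) indicates that roughly 64 schools per group would suffice, well below the 1,000-plus schools expected in the study:

```python
# Rough power check: schools needed per group to detect a medium effect
# (Cohen's d = 0.5) in a two-sample comparison at alpha = .05, power = .80.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Schools needed per group: {n_per_group:.0f}")  # roughly 64
```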


B2. Procedures for Data Collection

The timeline of data collection activities is outlined in exhibit 2 below.


Exhibit 2

Timeline of Data Collection Activities

Data collection activity                                      Projected     Projected
                                                              start date    end date
Site visits
  Interviews with VPSC Project Directors and Staff            10/1/2006     5/15/2007
  Interviews with other participating persons                 10/1/2006     5/15/2007
  Interviews with the comparison site district and staff      10/1/2006     5/15/2007
Archival data                                                 10/1/2006     5/15/2007
Surveys
  School Survey                                               10/1/2006     5/31/2007
Achievement data
  Student-level achievement data                              Ongoing       -
  Lottery-related student records                             Ongoing       -

The data collection plans have been organized according to a three-step evaluation framework (see exhibit 3). The heading for each step reflects one of the three main evaluation questions, and within each step is a set of subquestions that the National Evaluation will address. The evaluation framework helps to assure that all variables of interest are included in the actual data collection instruments.

The relevant data will include qualitative and quantitative data from all of the sites, including: one additional round of site visits; a survey covering an average of 50 schools at each site (all schools at sites with fewer than 50 schools, or a random sample of schools when there are more than 50 schools); and the collection of documentary and archival data about the participating students and the performance of schools in the district (or the site) as a whole. Site visits will be made to the comparison sites, mainly to confirm the absence of a VPSC-like initiative at the site and to document the nature of other choice options that may be in place. The archival data will be collected about all of the schools, including those at the comparison sites. These data will be analyzed in conjunction with the data gathered during the two rounds of site visits and the two administrations of the school survey that occurred during the Initial Phase.

Exhibit 3

Evaluation Framework Guiding Data Collection Activities

[Framework diagram not reproduced in this text version.]

The planned instruments have been designed so that they cover the relevant portions of the three evaluation questions. These instruments (see exhibit 3) include:


(A) Protocol A: Interview with the VPSC Project Director and Staff (a site visit instrument, covering VPSC implementation and trends associated with VPSC, and largely directed at the VPSC project director and staff);

(B) Protocol B: Interview with Other Participating Persons (a site visit instrument directed to school principals, teachers, or parents of transferring and non-transferring students);

(C) Protocol C: Interview with the Comparison Site District and Title I Staff (a comparison site instrument, mainly for collecting data from the district offices); and

(D) School Survey (a school survey instrument directed to principals of schools participating in the VPSC Program’s initiatives and to administrators of participating districts in the two statewide initiatives).



The list of instruments does not include the collection and analysis of quantitative archival data, which will address the evaluation questions related to student achievement. School-level data will come from existing databases compiled by the U.S. Department of Education, providing annual data for nearly every school in the country, starting with the 1998-99 school year. These databases may be supplemented by website data from State Departments of Education. Student-level data will be compiled and provided to the National Evaluation by the 13 VPSC Program’s sites.




Exhibit 4

Overview of Data Collection

Data sources, by type of site:

  VPSC Program sites: VPSC PD & staff; other district staff; other persons (principals, teachers, and parents); archival data; school survey

  Comparison sites: district staff; archival data

Evaluation topics and research issues:

  Topic 1: How and to what extent does the program promote educational equity and excellence [VPSC Program or other choice programs]?

  Topic 2: What are the characteristics of the VPSC Program’s sites [or the comparison sites]?

  Topic 3: What academic achievement is associated with the VPSC Program [or trends at the comparison sites]?

[The matrix cells mapping each topic to its data sources are not reproduced in this text version.]






B3. Methods to Maximize Response Rates

The site visits to VPSC Program’s sites will gather responses from persons at the district level and at a minimum of two schools. Because the sites have received grants and are interested in the evaluation’s results, they are assumed to be highly motivated to participate, yielding a high response rate. During the first two rounds of site visits, response rates were nearly 100 percent. The National Evaluation team also has maintained periodic contact with the sites so that any problems that arise can be identified and remedied quickly. The evaluation team anticipates that it will once again obtain a near-100 percent response rate during the site visits (with the only non-responses resulting from illness or scheduling conflicts).

Responses also are needed from the comparison sites, at the district level. The National Evaluation has tried to assure a high response rate from the comparison sites by minimizing the burden on them: the comparison sites have been asked for only a few hours of their time, mainly involving the Title I coordinators, who are accustomed to dealing with ED and its evaluation efforts. Should sites be reluctant to participate, the evaluation team will ask an ED Title I official to try to persuade them. Even as a last resort, a lack of response by a comparison site would not jeopardize the main analysis, because the relevant data are available from archival sources and do not come from the site itself. During the first two rounds of site visits, the team successfully met with representatives from all of the comparison sites, and the team anticipates a similar response rate during the Final Phase of the National Evaluation.

During the first two rounds of administering the school survey, the evaluation team developed a successful method for obtaining a high response rate. At the request of the sites, the evaluation team gave the sites’ project directors the survey to distribute to the schools, with a personalized cover letter from the VPSC Program’s sites requesting the schools’ participation. Schools were asked to either fax the survey directly to the evaluation team, or to return the completed surveys to the site (depending on the sites’ preferences). The sites sent reminder e-mails to the non-responding schools approximately one week following the submission deadline. If necessary, the sites then made follow-up calls to the schools.

Following the effort by the VPSC Program’s site staff, and with the permission of the site, the evaluation team then made at least two rounds of follow-up calls directly to the non-responding schools, and frequently re-FAXed the survey to the schools’ FAX numbers. ED project staff also volunteered to make calls directly to non-responding schools during the second round of survey administration. However, this was not necessary once a response rate of 91 percent was obtained. The team plans to follow the procedures outlined above and anticipates a 90 percent or higher response rate for the school survey during the Final Phase of the National Evaluation.


B4. Tests of Procedures or Methods

All of the planned data collection activities and procedures for the Final Phase have been conducted as part of the Initial Phase evaluation. In fact, the evaluation team has completed two rounds of site visits to each of the VPSC Program’s 13 sites, and has collected two rounds of school survey data. All of the data collection activities were completed without difficulty and with no unanticipated problems. In addition, the evaluation team has used the VPSC Program’s Project Directors’ meetings, discussed earlier, to explain the data collection procedures. The VPSC Program’s sites have not voiced any major problems or unanticipated burden in responding to requests for data, interviews, or supporting documentation. Therefore, the evaluation team will utilize the same procedures and methods in the Final Phase that were used during the Initial Phase.


B5. Names and Telephone Numbers of Individuals Consulted

Frank Brown, Ph.D.

Professor of Education

Director, Educational Research and Policy Studies

University of North Carolina, Chapel Hill

Phone: 919-962-2522

E-mail: [email protected]


Peter W. Cookson, Jr., Ph.D.

Dean of Graduate School

Professor of Educational Administration

Lewis & Clark College

Phone: 503-768-6002

E-mail: [email protected]


David Heistad, Ph.D.

Executive Director

Testing, Evaluation, and Student Information

Minneapolis Public Schools

Phone: 612-668-0571

E-mail: [email protected]


Valerie Lee, Ed.D.

Professor, School of Education

University of Michigan

Phone: 734-647-2456

E-mail: [email protected]


Janelle T. Scott, Ph.D.

Assistant Professor

Steinhardt School of Education, New York University

Phone: 212-998-5621

E-mail: [email protected]


Paul Teske, Ph.D.

Professor

Graduate School of Public Affairs

University of Colorado at Denver

Phone: 303-556-5970

E-mail: [email protected]


Patrick J. Wolf, Ph.D.

Associate Professor

Georgetown Public Policy Institute

Phone: 202-687-9152

E-mail: [email protected]


Todd Ziebarth, MPA

Senior Policy Analyst

National Alliance for Public Charter Schools

Phone: 202-289-2700

E-mail: [email protected]


Project Monitor: Adrienne Hosek

Policy and Program Studies Service

U.S. Department of Education

Phone: 202-260-4189

E-mail: [email protected]


Evaluation Team: Robert K. Yin, Ph.D.,

Pirkko Ahonen, Ph.D., Janeula Burt, Ph.D., and Patricia Freitag, Ed.D.

COSMOS Corporation

Phone: 301-215-9100

E-mail: [email protected]

ATTACHMENT 1


No Child Left Behind Act of 2001 (P.L. 107-110, Title V, Part B, Subpart 3)


No Child Left Behind Act, Public Law 107–110, 107th Congress


TITLE V—PROMOTING INFORMED PARENTAL CHOICE AND INNOVATIVE

PROGRAMS


‘‘PART B—PUBLIC CHARTER SCHOOLS


‘‘Subpart 3—Voluntary Public School Choice Programs


‘‘SEC. 5241. GRANTS.

‘‘(a) AUTHORIZATION.—From funds made available under section 5248 to carry out this subpart, the Secretary shall award grants, on a competitive basis, to eligible entities to enable the entities to establish or expand a program of public school choice (referred to in this subpart as a ‘program’) in accordance with this subpart.

‘‘(b) DURATION.—Grants awarded under subsection (a) may be awarded for a period of not more than 5 years.


‘‘SEC. 5242. USES OF FUNDS.

‘‘(a) REQUIRED USE OF FUNDS.—An eligible entity that receives a grant under this subpart shall use the grant funds to provide students selected to participate in the program with transportation services or the cost of transportation to and from the public elementary schools and secondary schools, including charter schools, that the students choose to attend under the program.

‘‘(b) PERMISSIBLE USES OF FUNDS.—An eligible entity that receives a grant under this subpart may use the grant funds for—

‘‘(1) planning or designing a program (for not more than 1 year);

‘‘(2) the cost of making tuition transfer payments to public elementary schools or secondary schools to which students transfer under the program;

‘‘(3) the cost of capacity-enhancing activities that enable high-demand public elementary schools or secondary schools to accommodate transfer requests under the program;

‘‘(4) the cost of carrying out public education campaigns to inform students and parents about the program; and

‘‘(5) other costs reasonably necessary to implement the program.

‘‘(c) NONPERMISSIBLE USES OF FUNDS.—An eligible entity that receives a grant under this subpart may not use the grant funds for school construction.

‘‘(d) ADMINISTRATIVE EXPENSES.—The eligible entity may use not more than 5 percent of the funds made available through the grant for any fiscal year for administrative expenses.


‘‘SEC. 5243. APPLICATIONS.

‘‘(a) SUBMISSION.—An eligible entity that desires a grant under this subpart shall submit an application to the Secretary at such time, in such manner, and containing such information as the Secretary may require.

‘‘(b) CONTENTS.—An application submitted under subsection (a) shall include—

‘‘(1) a description of the program for which the eligible entity seeks funds and the goals for such program;

‘‘(2) a description of how and when parents of students will be given the notice required under section 5245(a)(2);

‘‘(3) a description of how students will be selected for the program;

‘‘(4) a description of how the program will be coordinated with, and will complement and enhance, other related Federal and non-Federal projects;

‘‘(5) if the program is to be carried out by a partnership, the name of each partner and a description of the partner’s responsibilities; and

‘‘(6) such other information as the Secretary may require.


‘‘SEC. 5244. PRIORITIES.

‘‘In awarding grants under this subpart, the Secretary shall give priority to an eligible entity—

‘‘(1) whose program would provide the widest variety of choices to all students in participating schools;

‘‘(2) whose program would, through various choice options, have the most impact in allowing students in low-performing schools to attend higher-performing schools; and

‘‘(3) that is a partnership that seeks to implement an interdistrict approach to carrying out a program.


‘‘SEC. 5245. REQUIREMENTS AND VOLUNTARY PARTICIPATION.

‘‘(a) PARENT AND COMMUNITY INVOLVEMENT AND NOTICE.—In carrying out a program under this subpart, an eligible entity shall—

‘‘(1) develop the program with—

‘‘(A) the involvement of parents and others in the community to be served; and

‘‘(B) individuals who will carry out the program, including administrators, teachers, principals, and other staff; and

‘‘(2) provide to parents of students in the area to be served by the program with prompt notice of—

‘‘(A) the existence of the program;

‘‘(B) the program’s availability; and

‘‘(C) a clear explanation of how the program will operate.

‘‘(b) SELECTION OF STUDENTS.—An eligible entity that receives a grant under this subpart shall select students to participate in a program on the basis of a lottery, if more students apply for admission to the program than can be accommodated.

‘‘(c) VOLUNTARY PARTICIPATION.—Student participation in a program funded under this subpart shall be voluntary.


‘‘SEC. 5246. EVALUATIONS.

‘‘(a) IN GENERAL.—From the amount made available to carry out this subpart for any fiscal year, the Secretary may reserve not more than 5 percent—

‘‘(1) to carry out evaluations;

‘‘(2) to provide technical assistance; and

‘‘(3) to disseminate information.

‘‘(b) EVALUATIONS.—In carrying out the evaluations under subsection (a), the Secretary shall, at a minimum, address—

‘‘(1) how, and the extent to which, the programs promote educational equity and excellence;

‘‘(2) the characteristics of the students participating in the programs; and

‘‘(3) the effect of the programs on the academic achievement of students participating in the programs, particularly students who move from schools identified under section 1116 to schools not so identified, and on the overall quality of participating schools and districts.

ATTACHMENT 2


Federal Register Notice


[To Come]




ATTACHMENT 3


Data Collection Instruments


Data Collection Instruments



This appendix includes the data collection instruments for the National Evaluation of the Voluntary Public School Choice Program. The instruments include protocols for:


A. Interview with the VPSC Project Director and Staff, 2006-07;


B. Interview with Other Participating Persons (e.g., principals), 2006-07;


C. Interview with the Comparison Site District and Title I Staff, 2006-07; and


D. School Survey, 2006-07.


These protocols were developed to address the three main evaluation questions and the series of subquestions guiding this evaluation (see exhibit 3-1).


The protocols (with the exception of the School Survey) will be used to guide data collection activities in the field. Site visit team members should be prepared to address the protocol questions in their written reports when they return from the field visits. In addition, the team should collect some demographic information while on site (see exhibit 3-2).

Exhibit 3-1

Crosswalk of Evaluation Questions with Questions from the Protocols*

Evaluation    # of
Question      Items    Item
1.1           0        Face Sheet
1.2           0        Face Sheet
1.3           6        A3.1, A3.2, A11.1, B5, D2, D4
2.1           28       A1.1, A1.2, A1.3, A1.4, A1.5, A1.6, A2.1, A2.2, A2.3, A2.5, A2.6, A2.7, A3.2, A3.3, A6.1, A6.3, A6.4, A7.1, A7.2, A8.1, A8.2, A8.3, A9.3, B2, B3, B4, B6, B7
2.2           5        A2.1, A3.3, A4.3, A9.2, A10.1, B8
2.3           21       A2.3, A2.4, A3.4, A4.1, A4.2, A4.3, A5.2, A5.4, A5.6, A5.7, A5.8, A7.3, B1, B2, B3, B9, D5, D6
2.4           6        A2.1, A2.7, A2.8, A3.2, A3.3, A11.1
2.5           7        A2.3, A2.4, A2.5, A2.6, A2.8, A3.3, A8.1, B2, B3
2.6           7        A2.3, A2.4, A2.5, A2.6, A2.7, A2.8, A3.3
2.7           5        A2.3, A3.4, A9.1, A13.3, A13.5, B11
3.1           7        A11.2, A12.1, B10
3.2           7        A11.2, A12.1, B10
3.3           11       A11.2, A12.4, A12.5, A13.1, A13.2, A13.4, A13.5, B10, D7

*This exhibit focuses on the qualitative data that will be collected to address the evaluation questions; not shown are archival items that address evaluation questions 1.1 and 1.2, though those questions are in part covered by the face sheet in exhibit 3-2. Also not shown are the items to be addressed during the site visit to the comparison sites.


Exhibit 3-2
(Data collection: 2006-07)

Face Sheet To Be Completed for Each VPSC Site and Comparison Site
at the Time of the Site Visit


Date of Site Visit: ____________________

Site Visit Team: ____________________

Name of Site: ____________________

Lead Organization and Dept. for VPSC Initiative: ____________________

Partnering Organizations: ____________________

Type of Jurisdiction (e.g., rural, suburban, urban): ____________________

Student Population: ____________________

Ethnicity:
    White: _______
    African American: _______
    Hispanic: _______
    Asian: _______
    Other: _______

Respondent (# of hrs)               Name                Date of Interview
Grant Program Director (1-2):       ____________        ____________
Grant Program Staff (1-2):          ____________        ____________
Local Evaluation (1):               ____________        ____________
Interviews with others:
    Principals (0.75):              ____________        ____________
    Teachers (0.5):                 ____________        ____________
    Parents (0.5):                  ____________        ____________
    Others (0.5):                   ____________        ____________



A. Interview with the VPSC Project Director and Staff

(Data collection: 2006-07)


1. Definition of the VPSC Initiative


1.1 What activities or staff are being supported with VPSC funds? [Use the revised proposal budget as a starting point, but be prepared for changes to have occurred.]


1.2 Is it reasonable to define the initiative according to how the funds are being spent? If not, provide some other rationale for defining the initiative. [Also check the revised proposal budget.]


1.3 Describe the VPSC initiative in 2-3 short sentences.


1.4 How participatory was the planning process that was used in developing the VPSC initiative? [Probe mainly for parent and school participation, and describe the nature of the participation.]


1.5 What were the main problems encountered in planning (not implementing) the VPSC initiative?


1.6 What have been the main problems encountered in implementing the VPSC initiative?



2. Identity of the Schools Involved in the VPSC Initiative


2.1 Define the breadth of the initiative across its participating schools:







2.1.1 Academic years covered by implementation (not including planning year):




2.1.2 No. and grade level of sending schools:




2.1.3 No. and grade level of receiving schools or sites, by district: [Also see Qs. 7.1 and 8.1 below.]




2.1.4 No. and grade level of remaining schools, by district:

Within this number, no. and grade level of schools that might become sending schools in the near future:

[Define criteria used.]










2.2 Describe any “zone” pattern and how zones were defined. [If possible, obtain a map of the schools and of the zone pattern.]


2.3 How were the sending and receiving schools identified to be part of the initiative? [If related to school performance, obtain documentation of such performance.]


2.4 Were any logically eligible schools “exempted” from becoming sending or receiving schools? [If YES, please describe.]


2.5 What are the grades for which choice options can be exercised (e.g., the entering grade for each school, but not the other grades)?

2.6 About how many students are eligible to transfer, due to the VPSC initiative, by grade level (and school, if possible)?


2.7 About how many seats will be created at receiving schools, due to the VPSC initiative, by grade level (and school or site, if possible)? [Also see responses to Qs. 7.1 and 8.1 below.]


2.8 How did the district estimate the number of seats available at receiving schools? [Also comment on the quality of the procedures and data used by the district.]


2.9 What is the schedule of tuition transfers, if any, that accompanied the student transfer?




3. Student Assignment Criteria and Procedures


3.1 What criteria have been used to define students eligible for transfer, and what source of data is used to review whether students are eligible?


3.2 How and when do the eligible students indicate whether they want to transfer, and how many potential receiving schools may they identify? [Obtain an application form and review a sample of the completed forms.]







3.2.1 Do they have to indicate, explicitly, that they wish to stay at their present school?




3.2.2 How does the date when the VPSC choice applications are due coincide with the application dates for other choice options—e.g., to charters, magnets, or any other special schools?




3.2.3 Identify the location(s) where the applications must be submitted and the rationale for selecting the location(s):










3.3 What criteria are used in deciding to approve a transfer? [Confirm the criteria through documentation and discussions with others, including parents, if possible.]









Check whether each of the following student assignment criteria is used (Present); if YES, state the criterion (Criterion):


3.3.1 Proximity preferences (e.g., 50 percent of the seats are reserved for students who live near enough to walk to school): [Note whether these families still have to fill out a choice application.]






3.3.2 Racial fairness guidelines (e.g., the ratio of Whites and non-Whites cannot vary more than 10 percentage points from that of the district as a whole):






3.3.3 Socioeconomic fairness guidelines (e.g., the ratio of students eligible for free or reduced lunch cannot vary more than 10 percentage points from that of the district as a whole):





3.3.4 Sibling preferences:






3.3.5 Special provisions, if any, for students requiring bilingual or special education programs:






3.3.6 Substitution rules when available seats, especially those designated by proximity, race, or socioeconomics, are undersubscribed:






3.3.7 Lottery or random assignment when seats are oversubscribed (and especially whether separate lotteries in effect exist for different subgroups of seats—e.g., Whites and non-Whites): [If used, describe the lottery or random assignment procedure.]






3.3.8 Whether and how a transferring student will receive supplemental services:






3.3.9 Whether and how an eligible but non-transferring student will receive supplemental services:






3.3.10 Whether the program maintains a waiting list, and how late in the semester students can still transfer, if a seat becomes vacant:






3.3.11 Whether the students originally enrolled in the receiving schools can be displaced:
















3.4 Have there been any complaints regarding the student assignment process:







3.4.1 By students and their families? [If YES, describe.]




3.4.2 By sending schools’ staff? [If YES, describe.]




3.4.3 By receiving schools’ staff? [If YES, describe.]




3.4.4 By any other relevant parties? [If YES, describe.]








3.5 How has the district’s student population changed, relative to the population of school-age children?


4. Parent Notification Procedures


4.1 How did parents participate (and how many participated) in the planning of the choice program and especially in the design of the parent information center?


4.2 When were parents first notified of the choice options supported by the VPSC initiative? [Obtain a copy of the written notification.]

4.3 What feedback information is collected to assure that all parents are being properly informed about their choice options?



5. Parent Information Center(s) (PICs)


5.1 What district-wide enrollment does the PIC cover, and how many students is it actually serving? [Depending upon the choice program, the number of applicants for transfer may only be a portion of the total student enrollment—obtain both numbers.]


5.2 What are the physical characteristics of the PIC’s location?







5.2.1 Access conditions (e.g., parking and public transportation):




5.2.2 Physical facility (e.g., in a high school; working atmosphere):




5.2.3 Hours and days of operation:




5.2.4 No. of different languages spoken by staff, and match to student population:











5.3 Enumerate the PIC’s staff, by title and FTE. Also note the supervisory relationships and to whom the PIC director reports in the district.


5.4 Enumerate the outreach activities undertaken by the PIC during the past school year. Also, give some estimate of the frequency of each type of activity:









5.4.1 Home visits or individual conferences with school (or choice) staff:

How often?



5.4.2 Use of mass media:




5.4.3 Use of community or neighborhood events or facilities:




5.4.4 Other:








5.5 How have these outreach activities been adjusted, if at all, from year to year?


5.6 How does the PIC know that its outreach activities are sufficient in reaching all eligible families? [Cite the actual data used by the PIC in developing this knowledge.]


5.7 What is the nature of the parent survey (sample and instrument), if any, conducted by the PIC, and how are the survey results used?


5.8 How does the PIC work with individual schools? [If YES to any of the following, please describe]:







5.8.1 The PIC maintains up-to-date information about the schools’ enrollments:




5.8.2 The PIC arranges tours of the schools:




5.8.3 The PIC encourages schools to develop their own marketing materials:




5.8.4 The PIC helps the schools to disseminate information about themselves:











6. Transportation Support within VPSC Initiative


6.1 Were VPSC funds used to support new transportation services? [If YES, please describe.]







6.1.1 Purchase of buses:




6.1.2 Hiring of drivers:




6.1.3 Increases in drivers’ salaries:




6.1.4 Revamping of technological infrastructure (e.g., computerized bus routes):







6.2 Were school hours changed (e.g., staggered daily times or alternate school calendars) to control transportation costs? [If so, describe and note whether the changes were related to VPSC or not.]

6.3 Whether VPSC funds were used to support new transportation services or not, collect data prior to and during the VPSC Program on: [If possible, obtain these data for individual schools, especially the sending and receiving schools.]







6.3.1 The number of buses:




6.3.2 The number of students riding buses:




6.3.3 Average and range of trip times: [See the arithmetic sketch following this list.]




6.3.4 The number of schools served each day by a single bus:
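
Item 6.3.3 asks for both the average and the range of trip times. The minimal Python sketch below shows that arithmetic; the minute values are hypothetical, not site data.

    # Minimal sketch for item 6.3.3: average and range of bus trip times.
    trip_minutes = [22, 35, 41, 28, 55]          # hypothetical trip times
    average = sum(trip_minutes) / len(trip_minutes)
    lo, hi = min(trip_minutes), max(trip_minutes)
    print(f"average: {average:.1f} min; range: {lo}-{hi} min")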








6.4 How important are the augmented transportation services to the entire VPSC initiative (e.g., did parents previously pay for or otherwise support a portion of the existing services)?


7. Capacity-Building (New Schools or Sites to Serve as Receiving Schools)


7.1 How many new schools or sites (and how many seats), if any, were opened as receiving schools, with VPSC support? [Also see Q. 8.1.]


7.2 What are the substantive educational themes at these schools, and in what way do they reflect a diversification or replication of themes already covered by the existing schools?


7.3 How were the themes chosen, and did parents participate in the process?






8. Capacity-Building (New or Expanded Programs at Existing Schools or Sites)


8.1 How many existing schools or sites have been expanded or added as receiving schools (and how many seats), with VPSC support? [The total of the schools or sites and seats in 7.1 and 8.1 should coincide with the responses to Qs. 1.4.3 and 2.7 above.]


8.2 What are the substantive educational themes at these schools, and in what way do they reflect a diversification or replication of themes already covered?


8.3 Describe the specific actions taken to increase the capacity of the existing receiving schools: [Check each of the following, and if YES, describe.]







8.3.1 New educational programs (e.g., magnets or academies within the school) at the existing schools:




8.3.2 Administrative changes to increase the school’s enrollment capacity:




8.3.3 Supplemental services to assist the transferring students:




8.3.4 Professional development to assist educators in providing instruction to the transferring students:




8.3.5 Other:








9. Capacity-Building (Sending Schools)


9.1 Do sending schools lose funds in relation to out-transfers? [If so, describe the procedures and the amount of funds involved.]


9.2 Are the sending schools under a formal sanction category (e.g., in need of improvement, in corrective action, or restructuring), and how long have they been in the category?





9.3 Whether in a formal sanction category or not, what actions will be (or have been) taken to improve the future performance of the sending schools (check all of the following)?






9.3.1 Professional development or other educational assistance:




9.3.2 Adoption of new educational practices:




9.3.3 Restructuring of the school:




9.3.4 Elimination of the school:









10. Relationship to Title I Provisions of NCLB


10.1 Has the district started to offer the school choice options under the new provisions of Title I, and if so, how do these arrangements relate to those of the VPSC initiative?







10.1.1 Are different schools involved in each, or are VPSC funds used to complement the use of Title I funds for the same schools?









11. Transferring Students and Comparisons with non-Transferring Students


11.1 For each relevant academic year in implementing the VPSC initiative, obtain the following data (preferably noting the individual sending and receiving schools): [An illustrative rate computation follows these items.]






11.1.1 How many students were eligible to apply for transfer?




11.1.2 How many applied, and how many were transferred?




11.1.3 What were the main experiences in dealing with under- or over-subscription?




11.1.4 What were the main lessons learned about the transfer process, and did these lessons lead to modifications in the procedures used the following year?
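
The counts in items 11.1.1-11.1.2 combine into two simple rates. The Python sketch below shows the arithmetic with hypothetical figures, not data from any site.

    # Hypothetical counts for one academic year (items 11.1.1-11.1.2).
    eligible, applied, transferred = 1200, 300, 180
    application_rate = applied / eligible    # share of eligible students who applied
    transfer_rate = transferred / applied    # share of applicants who actually moved
    print(f"application rate: {application_rate:.1%}; "
          f"transfer rate: {transfer_rate:.1%}")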











11.2 Based on the analysis of individual student records [such analysis may be done by the VPSC site], how do the transferring students compare with those eligible but not transferring, on the following characteristics? [An illustrative comparison sketch follows these items.]







11.2.1 Annual academic performance: [Start with the year prior to the initiative.]




11.2.2 Demographic characteristics (White vs. non-White; school lunch vs. non-school lunch):




11.2.3 Other characteristics that might account for academic differences between the two groups (e.g., number of siblings; level of parents’ education; nature of parents’ employment and income):
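
As a hedged illustration of the comparison in item 11.2, the following Python sketch contrasts transferring students with eligible non-transferring students on prior achievement and demographics. The table and its column names (transferred, score_prior, white, lunch) are hypothetical stand-ins, not fields from any site's records.

    import pandas as pd

    # Hypothetical student-level records; a real analysis would use the
    # site's own extract of eligible students (item 11.2).
    records = pd.DataFrame({
        "transferred": [1, 1, 0, 0, 0],
        "score_prior": [210, 225, 230, 215, 220],  # year before the initiative
        "white":       [0, 1, 1, 0, 1],
        "lunch":       [1, 0, 0, 1, 1],            # free/reduced-price lunch
    })

    # Group means address items 11.2.1-11.2.2: how do transferring students
    # compare with eligible non-transferring students?
    print(records.groupby("transferred")[["score_prior", "white", "lunch"]].mean())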









12. Sending and Receiving Schools and Their Changes Over Time


12.1 For each sending and receiving school and for each academic year of the initiative, track:







12.1.1 The total enrollments, by grade level:




12.1.2 The proportion of White and non-White students:




12.1.3 The proportion of students eligible for free or reduced lunch:




12.1.4 The academic performance (mathematics and reading) of all the students at the school: [Use all test data available, not just for the transferring students, and include grades not necessarily affected by the VPSC initiative.]




12.1.5 Obtain district-wide averages for the above, omitting the sending and receiving schools, to permit contextual comparisons:
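
Once the school-level measures in items 12.1.1-12.1.4 are assembled, the contextual benchmark in item 12.1.5 reduces to an average over the remaining schools. The Python sketch below is illustrative only, assuming a hypothetical schools table with a role column flagging the sending and receiving schools.

    import pandas as pd

    # Hypothetical school-level table (items 12.1.1-12.1.4 supply the columns).
    schools = pd.DataFrame({
        "school": ["A", "B", "C", "D"],
        "role":   ["sending", "receiving", None, None],  # None = all other schools
        "pct_lunch": [78.0, 41.0, 55.0, 60.0],
        "math_mean": [198.0, 231.0, 214.0, 220.0],
    })

    # District-wide averages omitting the sending and receiving schools,
    # as item 12.1.5 specifies, to serve as the contextual benchmark.
    benchmark = schools[schools["role"].isna()][["pct_lunch", "math_mean"]].mean()
    print(benchmark)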








12.2 At the receiving schools, what problems have emerged that might have been associated with the transfers or transfer process, and what remedies have been put into place?


12.3 At the sending schools, what problems have emerged that might have been associated with the transfers or transfer process, and what remedies have been put into place?


12.4 What is the evidence that the sending schools have taken more serious measures to improve their performance or otherwise to become more competitive?



12.5 Is there any evidence that potential sending schools are taking more serious measures to improve their performance or otherwise to become more competitive?



13. Contextual Conditions


13.1 What other choice options and programs are available in the district (not covered by the VPSC initiative), and how many students are involved in such options or programs?


13.2 Have these other choice options and programs changed during the period of the VPSC initiative in ways that might have played a role in the preceding early outcomes?


13.3 Have there been any notable changes by the surrounding districts that might have played a role in the preceding early outcomes?


13.4 Have there been any changes in the community (e.g., residential relocation patterns) that might have played a role in the preceding early outcomes?


13.5 How has the district’s “share” of the student population changed, relative to that of private schools? [Obtain annual data by grade level, if possible.]






B. Interview with Other Participating Persons (e.g., principals)

(Data collection: 2006-07)


1. What is your connection to the VPSC initiative [e.g., principal or teacher at participating school, parent of participating student, other]?


2. How were you made aware of the VPSC initiative?


3. Can you briefly describe your understanding of the VPSC initiative?


4. What changes have you seen or experienced as a result of the VPSC initiative?


5. How has student enrollment at your school [or your child’s school] changed during the past year, and to what extent have these changes been associated with the VPSC initiative, compared to other conditions?


6. What changes in the academic program have been made as part of the VPSC initiative, and how well are these changes working?


7. What changes in school administration (e.g., redefining school hours or the school year) have been made as part of the VPSC initiative, and how well are these changes working?


8. How smooth or disruptive was the timing or sequence of key events in the VPSC initiative—e.g., application deadlines or parent notification dates?


9. How well do you think the VPSC initiative has been received by teachers, principals, and parents, and do you think they are sufficiently informed about any new role(s) expected of them under the initiative?


10. How have student behavior and academic performance changed, and to what extent are these changes associated with VPSC?


11. Did VPSC provide any other benefits or create any other problems that have not been addressed by the preceding questions?




C. Interview with the Comparison Site District and Title I Staff

(Data collection: 2006-07)


Procedures


One-day site visits will be made to comparison sites in spring of 2007 (February-May). Where the comparison sites are within the jurisdiction already covered by the VPSC Program’s site (e.g., another “zone” in a VPSC district or another district within a statewide VPSC initiative), the needed data should be collected from the VPSC and related district staff.


Where the comparison sites are outside of the VPSC jurisdiction, initial contact with the sites should emphasize that our National Evaluation is: 1) covering public school choice; 2) selecting some sites where minimal choice options other than Title I, magnets, and charters are in operation; and 3) making site visits to sites that are near a VPSC Program site but did not receive a VPSC award. To minimize burden, the site visit will involve interviews with the Title I coordinator and any related district staff (e.g., staff overseeing magnets and charters).


Preparation for Site Visits. Prior to any site visit, field teams should collect and analyze data about the targeted area. This information will come from several sources, including the VPSC Program site that is aware of (and helped to select) the comparison site, as well as materials gathered from district or state Web sites.


Assembling of Evidence and Preliminary Reports Immediately Following Site Visits. Field teams are urged to begin the formal analysis and report-writing process as soon as a site visit has ended, though additional data may still have to be collected. Assembling data and drafting narratives proceed more efficiently and with much higher quality if this time sequence is followed. Teams should reserve the day or two after the site visit for this activity, avoiding other commitments.


Outline of Report. The report should follow the same heading structure as the topics of inquiry described next.



Topics of Inquiry


The topics below not only set the “agenda” to be followed by the field team but also give explicit probes and examples of the type of evidence being sought. As a result, the protocol provides guidance on what to look and listen for, and on how to recognize relevant evidence when it is encountered.




1. Public School Choice Options at This Site


Enumerate the public school choice options at this site, giving the number of schools and students involved by type of option:


1.1 Schools identified for improvement under Title I:


1.2 Magnet Schools (no. of schools and programs):


1.3 Charter Schools:


1.4 Other choice arrangements, including unsafe school choice and desegregation (describe):


[If options were identified under 1.2, 1.3, or 1.4 above, be sure to cover Q. 3 below.]



2. New Title I Choice Arrangements


How has the district started to offer the school choice options under the new provisions of Title I? (Probe for the following:)


2.1 Use of designation as a Title I school identified for improvement in defining sending and receiving schools:


2.2 Expanded transportation services:


2.3 Provision of supplemental services:


2.4 Expanded or improved parent notification procedures:


2.5 Other (describe):


3. Other, non-Title I Choice Arrangements


Describe the procedures used for the other choice options identified in Q. 1 above (address the following questions for each type of option separately):


3.1 How are eligible students defined?


3.2 How are students/parents notified of their choices? [Be sure to cover all the communication modes—e.g., see Q.5 in the School Survey instrument.]


3.3 What criteria are used in selecting students to exercise their choice?

[Be sure to cover all criteria—e.g., see Q3.3 in the VPSC field

instrument.]



3.4 What proportion of eligible students were able to exercise their choice? [Also determine the proportion receiving their 1st, 2nd, etc. choices; obtain data for the past 2-3 years, if possible.]


3.5 How have transportation services been affected, if at all, by the choice arrangement? [Be sure to cover all facets of these services—e.g., see Q.6 in the VPSC instrument.]


3.6 Are other choice options likely to be implemented in the future?



4. Trends Associated with Choice Options


Determine whether the district maintains records of individual students who have transferred or of the performance of sending and receiving schools. The following questions may be addressed once, even if the district has more than one choice arrangement.


4.1 Are there any data showing trends in academic performance by the students who have exercised their choice? [Probe for data for the past 2-3 years, if possible.]


4.2 Are there any data showing trends in academic performance by the schools involved in the choice arrangement? [Probe for data for the past 2-3 years, if possible.]


4.3 What are the district’s policies, views, and preferences regarding school choice options?


4.4 What interest has been expressed by parents, communities, or students regarding school choice options at this district?



5. Contextual Conditions


5.1 Have there been any notable changes by the surrounding districts that might be relevant to either the choice options or their outcomes at this district?


5.2 Have there been any changes in the community (e.g., residential relocation patterns) that might be relevant to either the choice options or their outcomes at this district?


5.3 How has the district’s “share” of the student population changed, relative to that of private schools in the area? [Obtain annual data by grade level, if possible.]


5.4 Is the district implementing other approaches to school improvement as an alternative to choice?



D. School Survey: 2006-07


COVER SHEET


QUESTIONNAIRE FOR INDIVIDUAL SCHOOLS

National Evaluation of the Voluntary Public School Choice Program









Date:





Respondent:





Title:





Phone:





School Name:





District:





State:



















FOR NATIONAL EVALUATION USE


ID Number: _________________


Date Received: ______________










QUESTIONNAIRE FOR INDIVIDUAL SCHOOLS



1. School’s Name, Address, and Grade Levels:

Name:


Address:



Grade Levels (circle lowest and highest): pre-K K 1 2 3 4 5 6 7 8 9 10 11 12


2. The following data were calculated based on student enrollment from which school semester? (check ONE only)

___ for Spring 2006
___ for Fall 2006
___ for Spring 2007
___ Other date (specify the date): ______________

Total No. of Students: __________ (no.)

American Indian or Alaska Native: ______ (%)
Asian: ______ (%)
Black or African American: ______ (%)
Hispanic: ______ (%)
Native Hawaiian or other Pac. Isld.: ______ (%)
White: ______ (%)
Other: ______ (%)

Eligible for Free and Reduced Price Lunch: ______ (%)
With IEP: ______ (%)
With Limited English Proficiency: ______ (%)
Migrant: ______ (%)


3. Does this school receive Title I assistance? ___Yes (schoolwide) ___Yes (targeted) ___No

If yes, has the school been identified as failing to make Adequate Yearly Progress, based on student achievement scores? (check ALL that apply)

___ for the 2006-2007 school year
___ for the 2005-2006 school year
___ for the 2004-2005 school year
___ none of the above





4. School Choice Options for Students at this School: (check ALL that apply)

a. ___ students within the district may transfer to this school
b. ___ students outside the district may transfer to this school
c. ___ students may transfer from this school to other schools within the district
d. ___ students may transfer from this school to other schools outside the district
e. ___ other – e.g., open enrollment (please explain): ______________
f. ___ there are no school choice options (If you answered “f,” please stop; do not respond to the remaining survey items.)

(If you checked items a, b, c, or d:)

For 2006-07, about how many students have transferred?

g. ______ (number of students transferring to this school)
h. ______ (number of students transferring from this school)

i. Did teachers receive any extra staff or professional development, in relation to the transfer process? ___Yes ___No

j. If yes, what were the main topics of the staff or professional development?

______________ (topic 1)
______________ (topic 2)


5. Notifying Parents about School Choice Options:

What actions did the school take to notify parents/families of their choice options? (check ALL that apply)

a. ___ individual, face-to-face meetings with school officials
b. ___ group meetings with school officials
c. ___ enrollment fairs or similar events for parents to learn about choice options
d. ___ open houses at receiving schools
e. ___ letter mailed to parents/families
f. ___ letter sent home with students
g. ___ announcements in community newspapers or other media
h. ___ contacts made by the district’s parent information center(s)
i. ___ other (please explain): ______________

j. How many languages, other than English, have been used in these notification procedures? ___ (no.)



6. In your opinion, what proportion of the parents/families had a good understanding of their choice options last year? (check ONE only)

a. ___ all parents/families
b. ___ most parents/families (e.g., over 50 percent)
c. ___ some parents/families (e.g., between 20-50 percent)
d. ___ few parents/families (e.g., less than 20 percent)

If you checked 6b, 6c, or 6d, what is the most important thing you can recommend to improve parents/families’ understanding of their choice options?





7. Has your school started new programs (e.g., magnets, academies, small learning communities, new academic subjects) to be more attractive, either to reduce the number of students transferring out or to increase the number transferring in? (check ALL that apply)

a. ___ becoming a charter school
b. ___ starting new magnets, academies, or small learning communities
c. ___ starting other new academic programs or subjects
d. ___ making other changes in school administration (e.g., changing school hours)
e. ___ other
f. ___ no new programs

Briefly describe the new programs and the main changes in school operation and administration.



















1. For the two statewide initiatives, survey questions will be directed to districts. Title I administrators of participating districts will be asked to complete the survey.

2. The fact that all districts are eligible to participate may create a complication. However, this complication was considered less troublesome than selecting sites from an entirely separate state, where the choice and other educational policy conditions could be entirely different.

3. The National School-Level Assessment Database compiled by ED consists of aggregate (school-level) student achievement scores on state assessments for virtually every school in the country, over a multi-year period that begins with the 1998-99 school year (the first year when data were available for over 90 percent of the schools; the database contains more partial data for the 1997-98 school year). Data are currently available through the 2002-03 school year. Data for more recent years are scheduled to be released in 2006, which should allow the evaluation team to continue analyzing school-level achievement scores in a similar fashion. Because the data are available in aggregate form only, any school-level comparisons over time in fact represent a series of cross-sections over time, not a true longitudinal design.
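
As a purely illustrative aside on footnote 3’s caveat: with aggregate data, a school-level “gain” is simply the difference between two cross-sections and may reflect a changed student mix rather than growth by the same students. The Python sketch below uses made-up numbers in the shape footnote 3 describes.

    import pandas as pd

    # Hypothetical aggregate records in the style footnote 3 describes:
    # one school-level mean per year, with no student identifiers.
    agg = pd.DataFrame({
        "school": ["X", "X"],
        "year":   ["1998-99", "1999-00"],
        "mean_score": [205.0, 212.0],
    })

    # A school-level "gain" is the difference of two cross-sections; it can
    # reflect a changed student mix rather than growth by the same students.
    gain = agg.sort_values("year")["mean_score"].diff().iloc[-1]
    print(f"school-level change: {gain:+.1f} (not a longitudinal gain)")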
