
Conversion Magnet Schools Evaluation


Revised OMB Clearance Request

Part A






January 2010




Prepared for:

Institute of Education Sciences

United States Department of Education

Contract No. ED-04-CO-0025/0009



Prepared By:

American Institutes for Research®

Table of Contents




Part A Submission


Appendices

Appendix A: Authorizing Legislation for Evaluation of Magnet Schools Assistance Grants

Appendix B: Grantee Screening Protocol-Module A: MSAP Project Director Interview

Appendix C: Grantee Screening Protocol-Module B: District Assessment Representative Interview

Appendix D: Grantee Screening Protocol-Module C: District School Choice Coordinator Interview

Appendix E: Grantee Screening Protocol-Module D: District Data Management System Representative Interview

Appendix F: Grantee Screening Notification Letter to MSAP Project Director

Appendix G: Feasibility Phase Brochure

Appendix H: Purposes of Variables in Grantee Screening Protocol for Modules A through D

Appendix I: Student Records Data Collection Plan

Appendix J: Evaluation Phase Brochure

Appendix K: Purposes of Variables in Student Data Collection Plan

Appendix L: Principal Survey – 2009-2010 School Year

Appendix M: Purposes of Variables in Principal Survey – 2009-2010 School Year

Appendix N: MSAP Project/School Choice Coordinator Interview Guide

Appendix O: Notification Letter to Coordinator for Follow-Up Interview

Appendix P: Purposes of Variables in Coordinator’s Follow-Up Interview

Appendix Q: IES Memo to OMB on the Results of the Evaluation of Conversion Magnet Schools Feasibility Phase including IES Responses to OMB Questions

List of Exhibits




Introduction


Overview

Magnet programs were designed to help address racial equity issues in American public education and have become an important component of public school choice, as well as a possible mechanism for improving the achievement of all students, particularly students who are disadvantaged. The Conversion Magnet Schools Evaluation is being conducted to determine whether efforts to turn around low-performing schools by converting them to magnet schools supported by the Magnet Schools Assistance Program (MSAP) are associated with improved achievement and reduced minority group isolation, using a Comparative Interrupted Time Series (CITS) design.


An OMB clearance request that (1) described the study design and full data collection activities and (2) requested approval for the burden associated with the first three years of data collection was approved in 2007 (OMB Number 1850-0832; approved 7/13/07; expires 7/31/10). As described in the original clearance request, the study included a feasibility phase, to determine whether an adequate sample existed to permit the full evaluation, as well as an evaluation phase. In 2008, we completed the feasibility phase and updated OMB on the results, including the viability of moving forward with the CITS study and a description of the actual study sample (please see Appendix Q; approved on 3/21/08). We also adjusted the burden estimates for years one through three to reflect the actual rather than the expected sample.


We are now requesting clearance for the burden associated with the final fourth and fifth years of data collection necessary for the rigorous CITS design using instruments included in the original request (with revisions). The three data collection activities are (1) collection of student records data for the 2009-2010 school year from participating districts, (2) a survey of the principals of magnet and comparison schools participating in the study, and (3) an interview with the local MSAP project director or school choice coordinator in each participating district. The corresponding data collection instruments are included in this clearance request.


The content in this final OMB request is identical to the original approved request, with some exceptions. In addition to generally updating the package (e.g. summarizing completed activities and providing information unavailable at the time of the original clearance request), we made three key revisions:


  1. The burden hours requested here for years 4 and 5 (373 total; 187 annually) are smaller. The new estimate reflects the actual sample and revised power calculations with updated assumptions (provided to OMB in 2008). The actual sample includes 14 grantee districts and approximately 25 magnet schools and 50 comparison schools.

  2. The description of using lottery-based studies to examine the effects of magnet schools on students who apply from outside the neighborhood attendance zone was deleted. We will instead use student fixed effects analysis (also described in our original package) to estimate these effects, because so few of the newly converted magnet schools in our sample are oversubscribed and use lotteries.

  3. We made revisions to the principal survey and MSAP project director interview in order to update the reference period and to simplify the respondents’ task while improving the utility of the data collection. These revisions were informed by initial data collection efforts and feedback from the study advisory team.


This document contains two sections. The first, introductory section provides background for the data collection instruments for which clearance is sought by describing the policy context within which magnet schools operate, the characteristics of the federally funded magnet schools that are the focus of the Conversion Magnet Schools Evaluation, and the major features of the evaluation study. The second section contains Part A (Justification) of the supporting statement for the Paperwork Reduction Act Submission. A set of appendices contains the instruments for which we are requesting clearance, and a companion document contains Part B (Statistical Methods) of the supporting statement for the Paperwork Reduction Act Submission.




Study Background


Since the mid-1970s, magnet schools have been critical to school districts’ efforts to implement voluntary desegregation plans and, in some cases, court desegregation orders. More recently, they have become an important component of public school choice options available to parents, and admissions policies of such programs have evolved to include considerations other than ethnic balance—e.g., promoting socio-economic diversity, and providing options to families who want to move their children out of underperforming schools. It is estimated that, in 1999-2000, the U.S. had about 3,000 magnet schools enrolling 2.5 million students.1


The federal government has supported the development of magnet schools through the Magnet Schools Assistance Program (MSAP) since the mid-1980s. Legislative authority for MSAP is found in the Elementary and Secondary Education Act of 1965, as amended, Title V, Part C; 20 U.S.C. 7231-7231j.2 The program was most recently amended by sections 5301-5311 of the No Child Left Behind Act of 2001 (NCLB). The program awards 3-year grants to school districts for use in implementing new or revised magnet programs in elementary and secondary schools. In a given year, about 10 percent of the nation’s public magnet schools receive support from MSAP. During the current funding cycle (grants awarded in 2007), MSAP supported programs in over 175 schools located in 41 districts. The annual appropriation for MSAP between 2004 and 2009 averaged $106.6 million.3


Section 5310 of the NCLB statute authorizes the secretary of education to use MSAP monies to evaluate the program. The National Center for Education Evaluation (NCEE) of the Institute of Education Sciences (IES), in collaboration with the Office of Innovation and Improvement (OII), has awarded a five-year contract to the American Institutes for Research (AIR) and its subcontractor, Berkeley Policy Associates (BPA), to investigate the relationship between the introduction of elementary magnet programs using funds from MSAP grants awarded in 2004 or 2007 and student outcomes, including student achievement and minority group isolation.

Conversion Magnet Schools and Their Students

Federal statute defines a magnet school as a public school or education center that offers a special curriculum capable of attracting substantial numbers of students of different racial backgrounds.4 Most elementary magnet schools begin as “regular” schools that serve the population that lives within the boundaries of a residentially defined service area—the school’s attendance area (also often referred to as an attendance zone). Students in this area who attend the school are its “resident” students. Typically, these schools serve attendance areas with high concentrations of low-income, minority group students who historically have had low levels of academic achievement.


Schools become conversion magnet schools when they introduce a special curriculum or instructional approach (the magnet program) and seek to attract and enroll non-resident students (i.e., students living outside the school’s regular attendance area). The expectation is that by increasing the diversity of students attending the school and involving students in engaging and rigorous academic programs, the conversion will improve the resident students’ academic performance and reduce minority group isolation of students in schools with substantial proportions of minority students.5 Non-resident students are expected also to benefit academically from the quality of the magnet program and the diversity of their classmates. In addition, non-resident students may benefit because the magnet school that their family has chosen for them may be better able to provide them with opportunities that match their particular needs and/or interests.

The Conversion Magnet Schools Evaluation

Despite the popularity and persistence of magnet programs, there have been only a few quantitative studies of their relationship to important student outcomes. Results have been mixed, and no definitive conclusions can yet be drawn. Drawing broad conclusions is particularly challenging because the structures and target populations of magnet school programs are so varied, but the studies conducted have treated magnet schools as a single type of intervention. For instance, elementary and secondary programs differ in the degree to which they can capitalize on students’ career and college plans and the pattern of standardized testing that provides evidence of student achievement over time. Some schools operate “programs-within-a-school” (PWSs) in which the magnet program serves only a fraction of their enrollment, while others operate whole-school programs that are designed to affect all of their students. While most magnet schools (particularly at the elementary level) serve both the residents of an attendance area and non-residents who must apply for admission, some magnets have no attendance area and require all students to apply for admission. Magnet programs also vary in maturity—they may be new, well-established, or “stale” and in need of revision to strengthen the curriculum and generate new interest among potential applicants.


The Conversion Magnet Schools Evaluation to which this request pertains is an investigation of the relationships between some MSAP-funded magnet schools and the academic achievement and minority group isolation of the students who attend them. The study avoids some of the limitations of earlier studies by focusing on a single, relatively large group of these schools that have several characteristics in common: they are elementary schools that converted into whole-school magnets during the 2004 or 2007 MSAP grant cycle, and serve a mixed enrollment of resident and non-resident students.


The differing circumstances of the two groups of students necessitate the use of different research designs to study the relationship between conversion magnet schools and student outcomes. Because resident students are not randomly selected for admission to the magnet school, resident student outcomes cannot be examined using an experimental design. Instead, we are conducting a rigorous quasi-experimental, comparative interrupted time series analysis in which outcomes for resident students in MSAP-funded conversion elementary magnet schools are compared with those for resident students in matched non-magnet comparison schools in the same district. For non-resident students, we are estimating a student fixed effects model in which the achievement gains of individual students switching between traditional public schools and conversion magnet schools are examined. This approach removes the influence of unobserved characteristics, such as students' level of motivation, to the extent that these characteristics are fixed during the period under study.6
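In schematic form, the two analytic approaches can be written as follows. This is an illustrative sketch using generic notation and assumed variable definitions, not the study's exact statistical specification:

```latex
% Illustrative CITS model for resident students:
%   Y_{ist} = outcome for student i in school s in year t
%   M_s = 1 for magnet schools, 0 for comparison schools
%   P_t = 1 in post-conversion years; t_0 = year of conversion
Y_{ist} = \alpha_s + \beta_1 t + \beta_2 P_t + \beta_3 (t - t_0)P_t
        + \gamma_1 M_s t + \gamma_2 M_s P_t + \gamma_3 M_s (t - t_0)P_t
        + X_{ist}\delta + \varepsilon_{ist}
% \gamma_2 and \gamma_3 capture post-conversion deviations in level and trend
% for magnet schools relative to their matched comparison schools.

% Illustrative student fixed effects model for non-resident students:
%   \theta_i absorbs fixed student characteristics (e.g., motivation)
Y_{it} = \theta_i + \lambda_t + \tau\,\mathit{Magnet}_{it} + \varepsilon_{it}
```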

Supporting Statement for Paperwork Reduction Act Submission


A. Justification

1. Circumstances Making Collection of Information Necessary

Through the Magnet Schools Assistance Program (MSAP), the federal government aims to make a significant contribution to supporting magnet schools by promoting an intervention strategy that:


  • offers a distinctive educational curriculum or teaching method in a manner that increases parental choice by making programs available to students both inside and outside of neighborhood attendance areas; and

  • prevents, reduces, or eliminates minority group isolation by attracting students of diverse backgrounds.


The research literature identifies each of these mechanisms—curricular focus or teaching method, parental choice, and diversity in student composition—as possible avenues for enhancing the academic achievement of students. Moreover, the explicit objective of reducing racial and ethnic minority group isolation makes MSAP unique among the federally funded programs of the Elementary and Secondary Education Act.


MSAP priorities include supporting schools and students most in need of educational improvement, the primary target of ESEA and of new programs under the American Recovery and Reinvestment Act of 2009 (Pub. L. 111-5). Specifically, MSAP prioritizes support to districts that (a) establish magnet programs in schools identified for school improvement, corrective action, or restructuring under Title I in order to improve the quality of teaching and instruction in those schools or (b) maximize the opportunity for students in low-performing schools to attend higher-performing magnet schools, thereby reducing minority group isolation in the low-performing sending schools.7


As mentioned above, magnet programs are a popular and longstanding intervention with an estimated 2.5 million students enrolled as of 1999-2000. MSAP has supported magnet schools since the mid-1980s and with an annual appropriation that recently averaged $106.6 million (2004 through 2009). Most recently, MSAP is supporting 175 schools in 41 school districts (grants awarded in 2007).


Research on Magnet School Programs


Despite the popularity and durability of the magnet school concept, scientifically rigorous research on the effectiveness of magnet school programs is inconclusive. A review of the research literature on the effects of magnet programs on student achievement identifies 3 random assignment studies using lotteries and 12 quasi-experimental studies using non-randomized comparison groups with pre- and post-test controls.


Collectively, the lottery-based studies of the effects of magnet schools on student achievement are inconclusive. A study of programs-within-a-school (PWSs) and whole-school “career” magnet programs in New York City high schools in the late 1980s and early 1990s determined that the programs not only failed to have an effect on reading scores, absenteeism, or the likelihood of students taking advanced graduation/college admissions tests, but also appeared to have a negative effect on high school graduation rates and mathematics test scores.8 In a recent study of elementary and secondary schools in San Diego, the authors generally found no differences in mathematics or reading test scores between lottery winners and losers in any of the district’s school choice programs, including magnet programs. As an important exception, the authors did report that winners of lotteries to attend magnet high schools performed better on mathematics achievement tests 2 and 3 years later.9 In the third study, of lottery winners and losers in middle school magnet programs in a mid-sized Southern school district, the positive effect of magnet programs on mathematics achievement tests disappeared when the authors controlled for student demographics and prior achievement. The authors suggest that the most likely explanation is a differential pattern of attrition among lottery winners and losers.10, 11


The 12 quasi-experimental studies of the relationship between magnet programs and student achievement date largely from student cohorts of the 1980s and early 1990s and include analysis of the relationship of magnet programs to test scores in reading, mathematics, and other subjects.12 Seven studies were conducted on elementary school magnet programs, three on middle school programs, and two on high school magnets. Some of the studies examine whole school programs, while others focus on program-within-a-school (PWS) magnets, and a few consider both. The studies of PWS magnets tend to be more consistent in showing positive outcomes than studies of whole school programs. The PWS magnets, however, are often very selective of students, and the studies of those programs may be particularly subject to selection bias. Although whole school magnets provide programs that are generally more available to all students, the results from studies of those magnet programs tend to be mixed.


Research on the relationship of magnet schools to desegregation is even more limited than research on their relationship to student achievement. Two descriptive studies reported that over half of desegregation-targeted schools in MSAP-funded districts succeeded in preventing, eliminating, or reducing minority group isolation.13,14 The San Diego study cited previously used the outcomes of lotteries to examine the effect of school choice options, including magnet schools, on racial, socioeconomic, and other forms of integration district-wide.15 The results indicated that magnet schools increased the exposure of White to non-White students, and vice versa. The effect of magnet schools on socioeconomic integration was inconclusive.16 While the San Diego study makes an important contribution to examining the relationship of magnet schools to integration, it is restricted to a single district. The earlier descriptive studies of the relationship of MSAP-funded magnet schools to minority group isolation do not provide for a controlled comparison between magnet and non-magnet schools.


Each of the prior studies has limitations, and the mixed findings indicate that no definitive conclusions can yet be drawn about the effects of magnet schools and programs on important student outcomes. Drawing broad conclusions is particularly challenging because the structure and target population of magnet school programs are so varied.17,18 Another important limitation of existing studies is that most focus on students who have actively applied for a program, thereby overlooking the effect of magnet programs on resident students who may have been admitted to the program because they live in the school’s attendance area.


Conversion Magnet Schools Evaluation


IES, in collaboration with OII, initiated the Conversion Magnet Schools Evaluation due to the popularity and persistence of magnet programs and the inconclusive research on the relationship of these programs to important student outcomes. Section 5310 of ESEA authorizes the Secretary of Education to carry out evaluations that address, among other things, how and to what extent magnet school programs lead to educational quality and improvement, and the extent to which they lead to elimination, reduction, or prevention of minority group isolation. The legislation also directs the Secretary to collect and disseminate information on successful magnet schools to the general public (see Appendix A).


This study is addressing the limitations of previous work identified above by (1) focusing on schools that are similar in that they represent the most common category of school receiving MSAP funding (elementary whole-school magnets) and are new (i.e., conversion) magnets; (2) studying the relationship between participation in magnet programs and important outcomes for both resident and non-resident students; and (3) examining the relationship of magnet school conversion to minority group isolation through a controlled comparison of magnet and non-magnet schools.






2. Purposes and Uses of the Data

The data collected as part of the Conversion Magnet Schools Evaluation are needed to address the following main research questions:


  1. How does the conversion of a neighborhood school to a magnet school relate to the educational achievement of resident (neighborhood) students?

  2. To what extent does the conversion of a neighborhood school to a magnet school reduce minority group isolation in the school?

  3. How does magnet school attendance relate to the educational achievement of non-resident students?

  4. To what extent do the new magnet schools funded through the 2004 and 2007 MSAP grants evolve over time in terms of their program structure and content?


A grantee screening protocol was used to address the feasibility study research questions (that phase is now complete). To address the evaluation research questions, we are using student records data, a principal survey, and an interview with MSAP project directors/school choice coordinators.

Grantee Screening Protocol

The purpose of the district screening protocol was to gather information about the 2004 and 2007 MSAP grantee districts and schools needed to determine whether a rigorous analysis of the relationship between attending a magnet school and student achievement was feasible, and whether the districts had the capacity to provide the necessary data for such a study. The protocol was organized by topic into four modules, each directed at a district official knowledgeable about that subject:

  • Module A: MSAP Project Director Interview, which covers the number and characteristics of both elementary magnet schools funded by MSAP and potential comparison schools (see Appendix B);

  • Module B: District Assessment Representative Interview, which covers the assessments used by the district since 2001-2002 (see Appendix C);

  • Module C: District School Choice Coordinator Interview, which covers the operation of the district’s magnet assignment procedures, including the numbers of students winning and losing in lotteries to attend the magnet elementary schools, and record-keeping of these data (see Appendix D); and

  • Module D: District Data Management System Representative Interview, which covers the content, format, and linkability of record data in the district’s student data management system(s) (see Appendix E).


A notification letter was sent to the 2004 and 2007 MSAP grantees that were potentially eligible to be in the study. A sample notification letter to the MSAP Project Directors is presented in Appendix F. A brochure describing the overall study accompanied the notification letter and was distributed to all interviewees in the screening process. This brochure is shown in Appendix G. The variables from the modules in the protocol and their purposes are detailed in Appendix H.


The screening of grantee districts was, of necessity, a semi-structured process, as districts varied considerably in size, data management capacity and sophistication, and the details of their choice systems. While the protocol indicated the specific information needed by the study, the interviewers asked for clarification when initial responses were ambiguous, incomplete, or unanticipated. Although the information needed to assess the feasibility of including each district was fairly extensive, the screening process was designed to reduce burden in three ways. First, it was organized by topic into four modules (each of which required approximately 30 minutes to complete) for use with officials who were particularly knowledgeable about specific subjects. The two modules that were administered first (those pertaining to the district’s magnet/comparison schools and student achievement tests) allowed researchers to eliminate some districts from consideration without administering the two other modules (pertaining to the school choice system/lotteries and the data management system). Thus, while the estimated average time to complete all four modules was 2 hours (distributed among three or four different individuals), the burden for some districts was an hour or less. We estimated that all four modules would be administered to about two-thirds of the districts. Second, to the extent possible, existing data about schools and assessments (e.g., enrollment data from the CCD and assessment information from federal, state, and district websites) were collected and pre-coded into each district’s protocol prior to the interviews.


Pre-coding enabled district staff to verify some information rather than search for it themselves. In addition, it helped to highlight areas in which a district might not meet study criteria so that interviewers could focus their initial questions on the factors most likely to eliminate the district from consideration. Finally, advance copies of the pre-coded protocol were sent to the district so that officials would know what information would be requested during the interview.

Student Data Collection Plan

Student test scores (the outcome measure) and data on student background characteristics (covariate measures) are the core data needed by the Conversion Magnet Schools Evaluation to answer the research questions pertaining to magnet schools’ relationship to student achievement and minority group isolation. Districts participating in the study from both the 2004 and 2007 cohorts have been asked to provide individual student records data for each of the 3 years prior to their receiving an MSAP grant and for at least 3 years after the grant award.19 The 2004 grantees have been asked to provide data for the six years following the award of their grants, from 2004-2005 through 2009-2010. The 2007 grantees have been asked to provide data for the three years following the award of their grants, from 2007-2008 through 2009-2010. Data for all years prior to 2008-2009 were requested in early 2009, and data for 2008-2009 are being requested in late 2009 and early 2010. These data collections were approved as part of the first OMB review. The data for 2009-2010 will be collected in late 2010 and early 2011, as scores for spring 2010 state tests become available.


Data required by the evaluation include individual student test scores in English language arts and mathematics; information allowing each student to be identified as a resident or non-resident of the school he or she attended; and demographic variables that will be used as covariates to reduce variance in the analysis. Data for each student stored in different data systems must either be linked together in the files provided by the district or transmitted in a form that will allow the study team to make the linkages themselves.
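As a minimal illustration of the kind of linkage involved (the file and column names here are hypothetical; actual district layouts vary), two extracts sharing a district-assigned pseudo-ID might be joined as follows:

```python
import pandas as pd

# Hypothetical district extracts; real layouts differ by district.
demographics = pd.read_csv("demographics_2009_10.csv")  # pseudo_id, school_code, resident_flag, ...
test_scores = pd.read_csv("test_scores_2009_10.csv")    # pseudo_id, ela_scale_score, math_scale_score

# Link the two files on the shared pseudo-ID, keeping every student in the
# demographic file even if no score is found.
linked = demographics.merge(test_scores, on="pseudo_id", how="left", validate="one_to_one")

# Report linkage coverage so gaps can be followed up with the district.
missing = linked["math_scale_score"].isna().sum()
print(f"{missing} of {len(linked)} students have no mathematics score in the linked file")
```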


In all of the participating districts, student record data are being collected for all elementary school students in the district rather than for selected schools. This is done for three reasons. First, it enables the study to examine outcomes for resident students in the magnet and comparison schools, and students who transfer to the magnet school. Second, having data for all elementary students in the district will permit supplementary analyses that might either strengthen the interrupted time series or student fixed effects analysis, or allow alternative analyses should those approaches not be feasible for a particular district. Third, extracting data for all students is easier than extracting data for selected schools and/or individual students, and thus reduces the burden of data collection for the districts’ data managers.


Most of the data required to investigate the reduction of minority group isolation in conversion magnet schools are being obtained from extant data sources (e.g., the National Center for Education Statistics’ (NCES) Common Core of Data (CCD) or ED’s EDFacts, a centralized performance database). In addition to examining the composition of entire schools, the minority group isolation study is also investigating, to the extent feasible, the composition of classrooms within schools. The study is requesting information on individual students’ classroom assignments as part of the student data request. Where student-level data are not available, the study is determining whether summary data on classroom composition by such factors as gender, race-ethnic group, income, and English language status are maintained and can be readily provided.
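As one illustration of how such composition data can be summarized, the sketch below computes a standard exposure index (the average share of group B schoolmates experienced by members of group A). This is a common segregation measure offered for illustration; it is not necessarily the exact metric the study will report:

```python
import pandas as pd

def exposure_index(schools: pd.DataFrame, group_a: str, group_b: str) -> float:
    """Average proportion of group_b schoolmates experienced by group_a students.

    `schools` has one row per school with enrollment counts by group
    and a 'total' enrollment column.
    """
    share_a = schools[group_a] / schools[group_a].sum()  # each school's share of all group A students
    prop_b = schools[group_b] / schools["total"]         # group B's share of each school's enrollment
    return float((share_a * prop_b).sum())

# Hypothetical school composition counts (e.g., derived from the CCD).
district = pd.DataFrame({
    "white": [50, 200, 120],
    "minority": [450, 300, 80],
    "total": [500, 500, 200],
})
print(exposure_index(district, "minority", "white"))  # exposure of minority students to White students
```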


The student data collection plan describes the longitudinal student records data and the classroom- and school-level enrollment summaries that are being requested from each district. The Student Data Collection Plan appears in Appendix I. It is accompanied by a brochure describing the evaluation (Appendix J) that is given to data managers for districts in the study. The variables being collected with the student record data and their purposes are detailed in Appendix K.


The study is taking steps to minimize the burden on districts of providing these data. First, AIR has systems programmers with experience in dealing with complex data formats and a variety of data platforms. The experience of these programmers is being used to provide assistance, for example, in linking data files. Additionally, an individual AIR or BPA staff member has been assigned to work with each district to compile detailed information on the data being provided. For some districts, the compilation of disparate sets of documentation may, in fact, be a side benefit of participating in this evaluation.

Principal Survey

The principal survey was administered to principals in the 2004 MSAP grant cohort of districts in 2008, and will be administered in spring 2010 to principals in both the 2004 and 2007 MSAP grant cohorts. The survey provides key information needed to (1) interpret the results of the achievement and desegregation analyses; (2) document the differences between the magnet and comparison schools; (3) place these schools in the larger context of the nation’s schools (through comparisons on key variables between the study schools and the schools surveyed in NCES’s Schools and Staffing Survey (SASS) and Early Childhood Longitudinal Study (ECLS)); and (4) describe the nature and evolution of the magnet schools in the years following the grant award. Questions relating to the first three of these purposes are answered by both magnet and comparison school principals; questions relating to the evolution of the school’s magnet program are answered only by the magnet school principals. The spring 2010 principal survey submitted in this clearance request is a revised version of the survey that was approved by OMB in July 2007 and administered in May 2008 to magnet and comparison school principals in the 2004 MSAP grant cohort. The revised Principal Survey appears in Appendix L. The variables from the survey and their purposes are detailed in Appendix M.


Revisions to Principal Survey


Changes to the principal survey have been made to update the reference period and to simplify the respondents’ task while improving the utility of the data collected. The original survey was administered in 2008 in only 3 districts, and principals in those districts will also respond to the 2009-2010 survey, which will be administered to principals in 14 districts.


The following updates to items are largely a result of the later time period:


  1. School year references throughout the survey have been updated.


  2. The categories in item 3 for reporting the ethnic and racial composition of the faculty have been updated to be consistent with NCES guidance for meeting OMB standards.20 Recognizing that districts are at different stages of updating their record-keeping systems, item 3 includes instructions to help principals use the information in their record systems to complete this item in a consistent manner.21


  3. Item 23, which asked about the school’s history in meeting AYP and the school improvement efforts it has undertaken, has been revised in the 2009-2010 survey to (a) encompass the increased number of years that AYP has been tracked, and (b) permit reporting of school improvement actions undertaken to avoid failing to meet AYP goals. The revised item also eliminates a screening question that asked principals to indicate the years in which AYP was met, as this information can now be obtained from public records, whereas some principals, such as those more recently assigned to a school, may be unfamiliar with their school’s entire AYP history. For the latter reason, we have also included an option for principals to indicate that they do not know what improvement actions their school may have taken in an effort to meet AYP.


The following changes clarify or improve an item or otherwise add to the information obtained from the principals:


  1. On item 17, which asks principals to indicate the approximate percentage of parents involved in selected activities, a response category was added for indicating the approximate percentage of parents who are “volunteers in the school.” The addition of this category for measuring parental involvement is in response to general advice from the study’s technical work group (TWG) on the importance of capturing information about school context to inform the analysis of the relationship of magnet schools to student achievement.


  2. Item 18 in the 2009-2010 Principal Survey, which asks principals to select one of four options to describe their school’s “primary focus in terms of program content,” replaces item 18 in the 2006-2007 Principal Survey, which asked respondents to identify “special features” of their school’s program from a list of 27 features. Judging from responses to the 2006-2007 Principal Survey, the term “special feature” is too broad or ambiguous to elicit responses that indicate the key or essential characteristic of a school’s program. The replacement item in the 2009-2010 survey is more explicit in asking principals to identify their school’s “primary focus” and requires them to select one of four alternatives: a comprehensive curriculum (no specialized area of focus); a special curricular focus (e.g., arts, math/science, foreign language, character education); a particular education philosophy or set of values (e.g., Montessori, open school); or another focus that they are asked to specify.22


  3. The formatting of item 20 was modified to facilitate more consistent and complete recording of the number of hours and minutes in a week that a typical third grade student spends studying each of the following: mathematics, reading, science, social studies/history, art/music, physical education/health, and other subjects. In the 2006-2007 survey, respondents were provided with separate columns for reporting hours and minutes. Some respondents reported hours in the “minutes” column in addition to or instead of the “hours” column (e.g., 5 hours, 300 minutes). Further, some responses summed to what appeared to be an unrealistically small amount of instructional time per week. To facilitate consistent reporting, the 2009-2010 survey simplifies the format for recording time to a line for each subject in which hours and minutes are separated by a colon (i.e., Hours : Minutes). To facilitate complete recording of instructional time, the 2009-2010 survey also includes a line for “Total instructional time per week.”


  4. The final question, addressed only to principals of the MSAP schools (i.e., item 35 for principals of magnet schools funded by the 2007 MSAP grants and item 40 for the principals of magnet schools funded by the 2004 MSAP grants), is a new item on the principal survey that asks respondents to indicate the approximate percentage of students in each grade that received instruction in 2009-2010 using the curriculum or teaching method that is part of the school’s magnet program. While all the magnet schools in this study operate “whole school” programs, it is not necessarily true that every student in the school receives instruction based on the magnet curriculum and/or teaching method (e.g., some schools may be phasing in their program). This item is intended to indicate the extent to which the program reaches all students in the school. This question was added in response to the TWG’s recommendation of obtaining contextual information to inform the analysis of student achievement.


Although some of the preceding changes will reduce comparability between responses to the 2008 and 2010 Principal Surveys, we believe that the gain in the quality of responses will outweigh the loss of comparability. Moreover, the number of respondents to the 2006-2007 survey is small relative to the number of principals who will receive the 2009-2010 survey. Following the TWG’s recommendation to continue following schools from the 2004 MSAP grant cohort, principals in the three districts surveyed with the 2006-2007 version will have an opportunity to answer the revised version in 2010. Finally, the proposed changes will not affect the estimated time needed to answer the revised items in 2010.


MSAP Project Director/School Choice Coordinator Interview Protocol

This semi-structured interview is designed to obtain information from the MSAP Project Director or School Choice Coordinator in each of the 14 grantee districts that will inform the analyses of student achievement and minority group isolation and help describe the evolution of the magnet school programs in those districts. The protocol in the original submission to OMB was drafted in early 2007, three years before the interviews would be conducted. Over the intervening years, interactions with participating districts, general advice from the study’s TWG to gather contextual information, and time to further assess what information would be useful in addressing the research questions resulted in some additions to the interview protocol.


Revisions to Project Director/School Choice Coordinator Interview Protocol


The changes to the protocol include:


  • verifying the theme or focus of the programs in the MSAP magnet schools and whether or not this changed during the course of implementing the grant (items 1b, 1c);


  • gathering more information on school choice in the district by adding questions about district support for non-MSAP magnets (item 4), school choice options resulting from ESEA requirements (items 5, 6, and 7), and competition for students that MSAP schools experience from charter and private schools (items 8, 9, 10, and 11); and


  • requesting a summary of the number and acceptance/refusal of applications to each MSAP-funded magnet school in the study for the 2009-2010 school year (items 19 and 20).


These additions to the interview protocol will be used in describing the evolution of the magnet programs and the district context within which the programs developed. Some of the items might also inform analyses of student achievement and minority group isolation.


On the basis of the screening protocols used at the beginning of the study, we estimate the average time to complete the revised interview will be 30 minutes. To facilitate an efficient interview process, portions of the protocol will be pre-coded with information gathered during the earlier feasibility or screening phase of the study so that our informants can simply confirm that the information is still accurate or indicate changes that may have occurred. Additionally, interviewers will be directed to complete the interviews within 30 minutes.

3. Use of Information Technology to Reduce Burden

During the feasibility phase, the initial screening of the grantees involved semi-structured telephone interviews with district officials. The participating officials were sent copies of the interview protocols via email in advance of the interviews so they could prepare themselves for the discussion, but they were not required to produce written responses. Where possible, the study team also reduced burden on district officials by obtaining enrollment and accountability data from the CCD, EDFacts, and other electronic sources (e.g., from district websites and state testing agencies).


During the evaluation phase, the major data collections involve extraction of student demographic and achievement data from pre-existing electronic data files maintained by districts. All data and documentation will be submitted to the study team via secure electronic transmissions.


Only one paper-and-pencil survey will be administered during this study. During the evaluation phase, a survey of magnet and comparison school principals is being administered at the end of each grant period. This survey will not involve information technology, as developing web-based methods would not be cost-effective given the relatively modest sample size of the study. In some circumstances, information technology will be used to reduce burden. For instance, telephone calls will be used to remind respondents to complete the surveys, and project staff will offer to help non-respondents complete the survey over the phone. The survey is designed to reduce burden on respondents: most of its questions are answered by checking a response option, and the form is short enough to be answered in 35 minutes or less. In addition, principals are given the option of completing a paper-and-pencil or fillable electronic version of the survey.

4. Efforts to Identify Duplication

The Conversion Magnet Schools Evaluation is the only large-scale study currently underway that applies rigorous quasi-experimental methods to examining the relationship between magnet school programs and the achievement and minority group isolation of the students who attend them. Magnet schools are identified in NCES’s CCD and SASS surveys, and this information is also included in the National Longitudinal School-Level Standardized Score Database. However, data available through these surveys do not allow analysts to differentiate between resident and non-resident students and thus would not permit rigorous analyses based on disaggregated data.


States and districts nationwide already collect student assessment and background data. The Conversion Magnet Schools Evaluation will assemble these existing data rather than administering its own achievement tests, thereby reducing burden on the participating districts and schools and avoiding duplication of data collections.

5. Methods to Minimize Burden on Small Entities

No small entities will be impacted by this project.

6. Consequences of Not Collecting the Information

As required by ESEA, school districts must provide families whose children attend underperforming schools with additional schools in which to enroll their children. The Conversion Magnet Schools Evaluation represents one of the first efforts by the Department of Education to conduct a rigorous study of the relationship between magnet school programs and student outcomes. Failure to collect the information proposed in this request will prevent ED from evaluating this form of special educational programming that is supported by federal Magnet Schools Assistance Program grants. More generally, without this study, policymakers will have a limited basis on which to judge the value of investing in magnet schools, and parents will lack information for deciding whether to send their children to magnet schools.

7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.5

No special circumstances apply to this study with respect to any requirements of respondents, pledges of confidentiality, or use of statistical data classification.

8. Consultation Outside the Agency

The 60-day Federal Register notice was published in Volume 75, page 4053, on January 26, 2010. No public comments have been received. The study employs a technical work group (TWG) to advise AIR/BPA on data collection instruments, study feasibility, research designs, and methodological issues, including combining results across sites that use diverse types of student assessments. The consultants, all of whom have agreed to serve on the panel, bring expertise in magnet schools, school choice, studies of racial segregation and achievement gaps, experimental and interrupted time series designs, and the analysis of complex student assessment data. The consultants and their affiliations are listed in Exhibit 2.



Exhibit 2. Technical Working Group Members for the Evaluation of Conversion Magnet Schools Study


Name | Position and Institution | Expertise
Adam Gamoran | Professor of Sociology and Educational Policy Studies (University of Wisconsin—Madison) | Empirical studies of magnet schools and school choice
Dale Ballou | Associate Professor of Public Policy and Education (Vanderbilt University) | Empirical studies of magnet schools and school choice
Ellen Goldring | Professor of Education Policy and Leadership (Vanderbilt University) | Studies of magnet program characteristics
Ronald Ferguson | Lecturer in Public Policy and Senior Research Associate (Kennedy School of Government, Harvard University) | Studies of racial segregation and achievement gaps
Steven Rivkin | Associate Professor of Economics (Amherst College) | Studies of racial segregation and achievement gaps
Thomas Dee | Associate Professor of Economics (Swarthmore College—on leave at Stanford University School of Education, 2006-7) | Studies of racial segregation and achievement gaps
Jason Snipes | Vice President, Center for Research Evaluation and Technology (CERET), Academy for Educational Development | Experimental and interrupted time series designs
Larry Hedges | Professor of Sociology and Psychology (University of Chicago) | Combining measurements from different state tests



During the feasibility phase of the project, the TWG members were available for consultation on an as-needed basis to comment on data collection instruments and to review drafts of the memorandum on evaluation feasibility and study options. TWG members meet twice during the evaluation phase. At the first meeting, in October 2008, the TWG reviewed and commented on the analysis plan for the evaluation. The second meeting will be to review and comment on preliminary results from the analyses. TWG members may also review drafts of the final report.

9. Payments or Gifts to Respondents

Principals will be compensated $25 to complete the principal survey, which will require about 35 minutes of their time. Principals have many demands on their time during the school day and thus have limited time to respond to surveys. Typically, they complete surveys outside normal work hours. Therefore, we will pay principals to offset the time they spend completing the survey, with the payment proportional to their estimated hourly wage. Since the survey is expected to take 35 minutes, about 60 percent of an hour, principals will be compensated $25, or approximately 60 percent of the median elementary school principal’s hourly rate. We believe that this is a medium burden activity.23

10. Assurances of Confidentiality

None of the information collected will be reported or published in a manner that would identify individual respondents.


To ensure that the data collected are not available to anyone other than authorized project staff of the contractor and subcontractor, a set of standard confidentiality procedures is being followed during the data collection process:


  • All project staff agree in writing to an assurance of confidentiality.

  • All project staff keep completely confidential the names of all respondents, all information or opinions collected during the course of the study, and any information about respondents learned incidentally.

  • Reasonable caution is being exercised in limiting access to data during transfer, storage, and analysis to persons working on the project who have been instructed in the applicable confidentiality requirements for the project. In particular, electronic data files are encrypted for transfer between the districts and the contractor, and stored on secure servers at AIR and BPA. Respondents send paper surveys directly to AIR/BPA via FedEx (which provides rapid transmission and tracking if problems arise). Paper surveys are stored in locked files. After the project is completed, the contractors will destroy all identifying information.

  • To allow the linking of student data across files (e.g., connecting demographic information and test scores, or test scores from successive years) without using student names or the identification codes used by their districts, all of the participating districts are creating a randomly assigned pseudo-ID number for each student (a sketch of this procedure follows this list). Pseudo-ID numbers rather than the actual ID codes are included in all data files provided to the study in order to protect the identities of individual students. The cross-walk between actual identifiers and the pseudo-IDs is being maintained by the districts and not provided to the study team.

  • The Project Director is responsible for ensuring that all contractor personnel involved in handling data on the project are instructed in these procedures and comply with these procedures throughout the study.

  • The Project Director ensures that the data collection process adheres to provisions of the U.S. Privacy Act of 1974 with regard to surveys of individuals for the Federal government.
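The following is a minimal sketch of how a district might generate such a crosswalk (file and column names are hypothetical; each district’s actual procedure may differ):

```python
import secrets
import pandas as pd

# Hypothetical district roster; 'district_id' is the real student identifier.
roster = pd.read_csv("student_roster.csv")

# Assign each student a random pseudo-ID that cannot be reversed to the real ID.
roster["pseudo_id"] = [secrets.token_hex(8) for _ in range(len(roster))]

# The crosswalk (real ID <-> pseudo-ID) is retained by the district only.
roster[["district_id", "pseudo_id"]].to_csv("crosswalk_district_only.csv", index=False)

# Files shared with the study team carry only the pseudo-ID.
shared = roster.drop(columns=["district_id", "student_name"], errors="ignore")
shared.to_csv("records_for_study.csv", index=False)
```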


Finally, the following explicit statement regarding confidentiality is included in notification letters, study descriptions, and instructions to survey respondents: “Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you, your school, or your district to anyone outside the study team, except as required by law.”

11. Questions of a Sensitive Nature

The data collection instruments do not request any data of a sensitive nature about individual persons.

12. Estimates of Response Burden

Exhibit 3a presents the original estimates of total respondent burden for the data collections during the 3-year period for which OMB approval of instruments was sought in 2007. These estimates were based on the assumptions that 40 MSAP grantee districts would be screened, that 50 conversion magnet schools and 100 comparison schools from 20 districts would participate in the study, and that 10 of the participating districts would come from each of the two grantee cohorts. The total estimated hour burden for these 3 years was 3,186 hours: 64 hours for district officials; 3,085 hours for district data managers; and 37 hours for principals. Across these 3 years, the average annual number of responses was 94 and the average annual response hours were 1,062.24


By spring 2008, it was apparent that the sample of districts and schools recruited for the study would be smaller than the numbers on which Exhibit 3a was based. The revised estimates were based on the actual sample of 25 magnet and 50 comparison schools rather than the 50 magnet and 100 comparison schools projected in the original OMB request. Accordingly, in March 2008, NCEE submitted a revised burden estimate in which the estimated total numbers of respondents and hours for the first three years of the study were halved. Specifically, the number of respondents was reduced from 282 to 141 (which reduced the annual average number of respondents from 94 to 47), and the number of respondent hours was reduced from 3,186 to 1,593 (which reduced the annual average hours from 1,062 to 531).



Exhibit 3a. Time Burden for Respondents During the First Three Years of the Evaluation (April 2007 Estimates; Already Approved by OMB)



Task | Total Sample Size | Estimated Response Rate | Projected Number of Respondents | Time Estimate (in person-hours) | Number of Administrations or Collections | Total Hours

District Officials
Grantee Screening Interview (Module A)1 | 40 | 100% | 40 | 0.5 | 1 | 20
Grantee Screening Interview (Module B) | 36 | 100% | 36 | 0.5 | 1 | 18
Grantee Screening Interview (Module C) | 26 | 100% | 26 | 0.5 | 1 | 13
Grantee Screening Interview (Module D) | 26 | 100% | 26 | 0.5 | 1 | 13

District Data Managers
First Student Data Request for 2004 Grantee Cohort (2001-2002 through 2006-2007)2 | 10 | 100% | 10 | 121 | 1 | 1,210
First Student Data Request for 2007 Grantee Cohort (2004-2005 through 2006-2007)2 | 10 | 100% | 10 | 76 | 1 | 760
Second Student Data Requests for 2004 and 2007 Grantee Cohorts (2007-2008)2 | 20 | 100% | 20 | 23 | 1 | 460
Third Student Data Requests for 2004 and 2007 Grantee Cohorts (2008-2009)2 | 20 | 100% | 20 | 23 | 1 | 460
First Classroom Data Request for 2004 Grantee Cohort (2001-2002 through 2006-2007)3 | 5 | 100% | 5 | 18 | 1 | 90
First Classroom Data Request for 2007 Grantee Cohort (2004-2005 through 2006-2007)3 | 5 | 100% | 5 | 9 | 1 | 45
Second Classroom Data Requests for 2004 and 2007 Grantee Cohorts (2007-2008)3 | 10 | 100% | 10 | 3 | 1 | 30
Third Classroom Data Requests for 2004 and 2007 Grantee Cohorts (2008-2009)3 | 10 | 100% | 10 | 3 | 1 | 30

Principals
2007 Principal Survey (2004 Grantee Cohort)4 | 75 | 85% | 64 | 0.6 | 1 | 37

Totals | | | 282 | | | 3,186


1 The grantee screening protocol consists of four modules, each of which will require an average of 30 minutes to complete. Information gathered in Modules A and B will eliminate some districts from the sample, making administration of other modules unnecessary.

2 Hours estimated for providing student records data include time in consultation with AIR/BPA staff about content and formatting needs, identifying lottery applicants, retrieving and extracting data, linking data across files, and documenting and transmitting files. Estimates assume that providing the three earliest years of data (2001-2002 through 2003-2004) will require 20 hours per year of data while providing more recent years of data will require 15 hours per year of data. Estimates also include 16 hours to create a file of student pseudo-IDs during the first wave of student data collection and 8 hours for each of the three subsequent waves to update the file.

3 Assumes that half of the districts in each grantee cohort supply classroom data, and that file extractions require 3 hours each per year of data.

4 The principal survey is estimated to take 35 minutes (0.58 hours) to complete.


Exhibit 3b summarizes the estimated respondent burden for the data collections planned for the final two years of the study, for which clearance is now being requested. The exhibit below is a revised version of the exhibit that appeared in the April 2007 clearance request. Revisions are based on current information about the numbers of districts and schools now in the study sample. The total estimated hour burden for these 2 years is 373 hours: 6 hours for district officials; 329 hours for district data managers; and 38 hours for principals. Across these 2 years, the average annual number of responses is 50 and the average annual response hours are 187. This burden estimate includes the following data collection activities:


  • Local MSAP project directors or school choice coordinators will participate in semi-structured interviews focused on changes that have occurred over the years of their MSAP grants. The interviews will take approximately 30 minutes to complete. (Assumes 14 grantee districts and an 85% response rate.)

  • District data management staff from 13 data management departments representing the 14 participating districts (two of which share a common data management system), will respond to one request for student data for the 2009-2010 school year. They will (1) extract, format, and submit student records data along with a data dictionary; and (2) create and maintain a file of pseudo-ID numbers cross-walked with students’ actual identifiers.25,26

  • District data management personnel will, as necessary, extract and submit classroom composition data for the 2009-2010 school year; it is expected that 10 of the 13 data management departments will be able to provide these data.27

  • Principals of 64 magnet and comparison schools in the 14 MSAP grantee districts will spend about 35 minutes completing a principal survey in 2010 (assumes 75 schools and an 85 percent response rate).

Exhibit 3b. Time Burden for Respondents During the Final Two Years of the Evaluation



Task | Total Sample Size | Estimated Response Rate | Projected Number of Respondents | Time Estimate (in person-hours) | Number of Administrations or Collections | Total Hours

District Officials
Coordinator Interview in 2010 | 14 | 85% | 12 | 0.50 | 1 | 6

District Data Managers
Fourth Student Data Request for 2004 and 2007 Grantee Cohorts (2009-2010)1,2 | 13 | 100% | 13 | 23 | 1 | 299
Fourth Classroom Data Request for 2004 and 2007 Grantee Cohorts (2009-2010)3 | 13 | 75% | 10 | 3 | 1 | 30

Principals
2010 Principal Survey (2004 and 2007 Grantee Cohorts)4 | 75 | 85% | 64 | 0.6 | 1 | 38.4

Totals | | | 99 | | | 373.4


1 Hours estimated for providing student records data include time in consultation with AIR/BPA staff about content and formatting needs, identifying lottery applicants, retrieving and extracting data, linking data across files, and documenting and transmitting files.

2 The study includes 14 grantees of which 2 share a common data management system. Consequently, while there are 14 magnet program coordinators, student data will be requested from 13 data managers.

3 Assumes that 75 percent of the districts in each grantee cohort supply classroom data, and that file extraction requires 3 hours per year of data.

4 The principal survey is estimated to take 35 minutes (0.58 hours) to complete.
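
For reference, each Total Hours entry in Exhibit 3b is the product of the projected number of respondents, the per-response time estimate, and the number of collections, as footnote 24 describes. A minimal Python sketch of that arithmetic, with values taken from the exhibit:

    # (projected respondents, hours per response, number of collections)
    activities = {
        "coordinator interview": (12, 0.50, 1),
        "student records request": (13, 23, 1),
        "classroom data request": (10, 3, 1),
        "principal survey": (64, 0.6, 1),
    }

    total_hours = sum(n * hrs * k for n, hrs, k in activities.values())  # 373.4
    annual_hours = total_hours / 2  # about 187 hours per year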


13. Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record-Keepers

There are no capital or startup costs.

14. Estimates of Costs to the Federal Government

Cost of the evaluation contract: The estimated cost for the 5-year study contracted to AIR/BPA is $3,420,130 overall, averaging $684,026 per year. This includes conducting a feasibility investigation; developing a detailed study design, data collection instruments, and a justification package; and completing data collection, data analysis, and reporting. The total cost of the feasibility phase is $495,279, and the total cost of the evaluation phase is $2,924,851.

15. Changes in Burden

This submission includes a reduction of 344 annual burden hours. The 531 annual burden hours approved for years 1 through 3 will no longer be needed, as the data collection activities in study years 1 through 3 will have been completed. The data collection activities for study years 4 and 5, for which clearance is currently being requested, will entail an annual burden of 187 hours.

16. Plans and Schedule for Analysis, Tabulation and Publication of Results

The schedules for data collections and for the publication of results for the two phases of the Conversion Magnet Schools Evaluation are shown at the end of this section in Exhibits 4 and 5, respectively.

A. Description of Study Reports


Feasibility Phase Reports

Two documents were submitted to ED at the end of the feasibility phase. The first was a memorandum documenting the results of the feasibility investigation and presenting recommendations about whether and how to proceed with a full evaluation of conversion magnet schools. The second was a descriptive report focusing on school characteristics during the years leading up to the conversion of the magnet schools funded through either the 2004 or 2007 MSAP grants. Tabulations show levels and trends in characteristics such as total enrollment, the proportion of students from minority groups, and the proportion of students eligible for free and reduced-price meals, for the conversion magnet schools compared with other schools in their districts.



Evaluation Phase Report

A final report will be submitted at the end of the evaluation phase. It will present the results of analyses estimating relationships between conversion magnet schools and key desired outcomes of the Magnet Schools Assistance Program—improvement of students’ academic achievement and reduction of minority group isolation. These analyses will be based on data for the school years 2001-2002 through 2009-2010. The report will include the following tabulations and analyses:

  • Descriptive Information

  • Description of the characteristics of conversion magnet schools in the 2004 and 2007 MSAP grantee cohorts (e.g., location, size, and themes)

  • Comparison of study schools (conversion magnets and non-magnet comparison schools) and schools nationwide to provide context for understanding how the study schools differ (if they do) from other magnet and non-magnet schools

  • Description of dependent variables (student achievement scores, minority group isolation in the schools)

  • Description of changes in enrollment composition of the districts, conversion magnets, and comparison schools between 2001-2002 and 2009-2010 (from 3 years prior to the MSAP grant to 6 years after the 2004 MSAP grants and 3 years after the 2007 MSAP grants)

  • Description of changes in the content and structure of the magnet programs during the 3 years following the award of their MSAP grants


  • Estimation of the relationship between conversion magnets and key outcomes: student achievement and minority group isolation

  • Results of analyses of the achievement of resident students in conversion magnet schools in the 2004 and 2007 MSAP grantee cohorts before and after the schools converted.

  • Results of analyses of the achievement of non-resident students who switch between a traditional public school and a conversion magnet school in the 2004 and 2007 MSAP grantee cohorts after the schools converted.28

  • Results of analyses of trends in minority group isolation in the conversion magnet schools in the 2004 and 2007 MSAP cohorts

B. Complex Techniques for Analyses of the Relationship Between Conversion Magnet Schools, Student Achievement, and Minority Group Isolation

As has been explained earlier, differences in the manner in which resident and non-resident students are assigned to conversion magnet schools require different methods for analyzing the relationship between conversion magnet schools and student achievement. This section provides additional detail on the estimation methods that will be used for the core and supplemental achievement analyses that will be conducted for each group of students. The section also discusses models for analyzing the relationship between magnet conversions and minority group isolation.



Core Analysis for Resident Students: Comparative Interrupted Time Series

The primary estimation method for analysis of the relationship between conversion magnet schools and the achievement of resident students will be an interrupted time series analysis that uses a three-level hierarchical linear model (HLM), with students at the first level, years at the second level, and schools at the highest level. The analysis will be conducted separately by grade level (2nd, 3rd, 4th, or 5th grade) and by subject area (English language arts/reading or mathematics). Each grade level of resident students will be compared before and after the conversion of the school. A grade-level sample comprises the students in a particular grade during the 3 years before and at least 3 years after their school converts; for example, the third-grade classes of 2001 through 2006 constitute one sample for analysis.29, 30

The HLM is specified as follows:


Level 1 – Individual Student Level

$$Y_{ijks} = \pi_{0jks} + \sum_{q=1}^{Q} \pi_{qjks} X_{qijks} + e_{ijks} \quad (1)$$

where

$Y_{ijks}$ = the outcome for student i, in grade level j, from school pair k and school s.

$X_{qijks}$ = mean-centered background characteristic q for student i in grade level j from school pair k and school s.

$e_{ijks}$ = a random error term for student i in grade level j from school pair k and school s (assumed independently and identically distributed (iid) across students in a grade level).


Level 2 – Year/Grade Level (e.g., third-grade students over 6 years or more)

$$\pi_{0jks} = \pi_{0k} + \pi_{1k} P1_{js} + \pi_{2k} P2_{js} + \pi_{3k} P3_{js} + \pi_{4k} MAG_{s} + \pi_{5k} (P1_{js} \times MAG_{s}) + \pi_{6k} (P2_{js} \times MAG_{s}) + \pi_{7k} (P3_{js} \times MAG_{s}) + u_{jk} + r_{jks} \quad (2)$$

where

$\pi_{0jks}$ = the mean outcome for grade level j from school pair k and school s for students with mean background characteristics for grade level j and school s.

$P1_{js}$ = dummy variable equal to 1 for the first year after conversion, and 0 otherwise.

$P2_{js}$ = dummy variable equal to 1 for the second year after conversion, and 0 otherwise.

$P3_{js}$ = dummy variable equal to 1 for the third year after conversion, and 0 otherwise.

$MAG_{s}$ = dummy variable equal to 1 if school s is a magnet school, pre- or post-conversion, and 0 otherwise.

$u_{jk}$ = random error term for grade level j, school pair k, iid across school pairs.

$r_{jks}$ = random error term for grade level j, school pair k, school s, iid across schools.


Level 3 – School Pair Level

$$\pi_{pk} = \gamma_{p} + v_{pk}, \qquad p = 0, 1, \ldots, 7 \quad (3)$$

where the $v_{pk}$ are random error terms, iid across school pairs.31


The first level is a simple regression of outcomes for individual students in a given grade j as a function of their background characteristics. The equation is included in the model to control statistically for any compositional shifts that might occur over time in the measured background characteristics of students at a given school.


The second level of the model is the comparative interrupted time-series analysis of regression-adjusted mean outcomes for a given grade-level group from a single school within a single magnet/pooled-comparison school pair. $\pi_{0k}$ is the regression-adjusted baseline mean student outcome for the two comparison schools combined. $\pi_{1k}$, $\pi_{2k}$, and $\pi_{3k}$ are the deviations from the baseline mean student outcome for the comparison schools in the three follow-up years. $\pi_{4k}$ is the difference between the regression-adjusted baseline mean outcome for the magnet school (in years before and after conversion) and that of its comparison schools. $\pi_{5k}$, $\pi_{6k}$, and $\pi_{7k}$ are the differences between the deviations from the baseline mean for the magnet school and for its comparison school counterpart; these are the estimated effects of the magnet conversion in the first, second, and third years of magnet status, respectively. For the 2004 cohort, it may be possible to include as many as six years after conversion in the analysis.


The third level represents the distribution of these parameters across magnet/comparison school pairs, providing a way to summarize the findings. The means of $\pi_{5k}$, $\pi_{6k}$, and $\pi_{7k}$ across pairs (the $\gamma_{p}$ parameters in equation 3) are the best available estimates of the typical effects of a magnet school, and the standard deviations of these estimates provide measures of the consistency of those effects.
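
To make the estimation concrete, the following is a minimal sketch of how the level-2 CITS specification could be fit as a mixed model with random school-pair effects, using Python and statsmodels. The file and column names (mean_score, post1 through post3, magnet, pair_id) are illustrative assumptions rather than the study's actual variable names, and the sketch collapses the three levels into a single model for regression-adjusted grade-level means.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per school, grade level, and year, holding the
    # regression-adjusted mean outcome from the level-1 model.
    df = pd.read_csv("grade_level_means.csv")

    # post1-post3 flag the first three years after conversion; magnet flags
    # the converting school. The magnet-by-post interactions estimate the
    # pi5k-pi7k effects; random intercepts by school pair stand in for the
    # level-3 random effects.
    model = smf.mixedlm(
        "mean_score ~ post1 + post2 + post3 + magnet"
        " + magnet:post1 + magnet:post2 + magnet:post3",
        data=df,
        groups=df["pair_id"],
    )
    print(model.fit().summary())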


Core Analysis for Non-Resident Students: Fixed-Effects Analysis

Non-resident students who switch between traditional public schools and the conversion magnet schools will be studied separately from resident students if there are enough students switching schools. The comparative interrupted time series approach we will use for resident students is not useful in the analysis of non-resident students, in part because there are no obvious comparison schools in this case. Moreover, students who self-select to apply to enroll in a magnet school are likely to differ in both observable and unobservable ways from students in their local schools who do not apply to the magnet school, as well as from resident students in the magnet schools.

Students who switch into magnet schools are likely to differ in important ways from students who decide not to do so. This heterogeneity makes it problematic to compare switchers with non-switchers as a way of inferring the relationship between switching to a magnet school and student achievement. As a remedy, we will use student-level fixed-effects models in which we do not compare one student with another; instead, we compare each student's achievement growth before he or she switches to a magnet school with the growth he or she experiences after switching.

If a sufficient number of students switch between non-magnet and magnet schools, a student-fixed effects analysis will be conducted to assess the relationship of attending magnet schools to student achievement for non-resident students. The analysis will make use of data on all students in the district attending a non-magnet school and those non-resident students attending a magnet school (i.e., those students attending a school that adopted a magnet program who live outside the magnet school’s attendance zone). That is, the only students in the district excluded from the analysis are the resident magnet school students who live in the attendance zone of the magnet school that they attend. The achievement gains of individual students will be compared before and after students switch between the conversion magnet schools and traditional public schools. This approach avoids making inter-student comparisons, and removes the influence of unobserved characteristics such as student motivation, to the extent that these characteristics are fixed during the period under study.

The main requirement of this fixed-effects model is that there be students who have at least one test-score gain while attending a non-magnet school and at least one test-score gain while attending a magnet school. For example, in a district that tests students starting in grade three, a student who switches from a non-magnet school to a converted MSAP magnet school at the beginning of fifth grade would have a gain score between third and fourth grade from the traditional public school and a gain score between fourth and fifth grade attributed to the switch to the magnet school. In districts in which elementary schools include sixth grade, the student would also have a gain score between fifth and sixth grade attributable to the switch to a magnet school if he or she remained in that school.32 A sketch of this gain-score construction follows.
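
The following is a minimal Python sketch of how the gain scores and the analysis sample could be constructed; the file and column names (student_id, year, score, magnet) are illustrative assumptions, not the study's actual data layout.

    import pandas as pd

    # Assumed long file: one row per student per tested year.
    scores = pd.read_csv("student_scores.csv")  # student_id, year, score, magnet (0/1)
    scores = scores.sort_values(["student_id", "year"])

    # Year-to-year gain, attributed to the school attended in the current year.
    scores["gain"] = scores.groupby("student_id")["score"].diff()
    gains = scores.dropna(subset=["gain"])

    # Keep only students with at least one gain in a non-magnet year and
    # at least one gain in a magnet year, as the model requires.
    has_both = gains.groupby("student_id")["magnet"].agg(
        lambda m: (m == 0).any() and (m == 1).any()
    )
    analysis_sample = gains[gains["student_id"].isin(has_both[has_both].index)]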

The planned fixed-effects model to estimate the effect of magnet schools on non-resident student achievement is specified as follows:

$$\Delta Y_{igsdt} = \sum_{i=1}^{I} \alpha_{i} D_{i} + \sum_{s=1}^{S} \phi_{s} S_{sdt} + \sum_{g} \lambda_{g} G_{igt} + \sum_{q=1}^{Q} \beta_{q} X_{qigsdt} + \sum_{j=3}^{6} \delta_{j} T_{jt} + \sum_{j=4}^{6} \tau_{j} \left( M_{ist} \times T_{jt} \right) + \varepsilon_{igsdt}$$

where the observation subscripts refer to student i in grade g attending school s in district d in school year t. The variables in the model are defined as follows:

$D_{i}$ = a set of dummy variables for all students, where there are I students in total and $D_{i}$ is equal to 1 for observations from student i, and 0 otherwise.

$S_{sdt}$ = a set of dummy variables for all schools, where there are S schools in total and $S_{sdt}$ is equal to 1 for observations from school s in district d at time t, and 0 otherwise.

$G_{igt}$ = a set of grade dummy variables, where $G_{igt}$ is equal to 1 for observations from student i in grade g at time t, and 0 otherwise.

$X_{qigsdt}$ = a set of time-varying student background characteristics, where there are Q in total, for student i enrolled in grade g at school s in district d at time t.

$T_{jt}$ = dummy variable equal to 1 if the observation for student i in year t corresponds to study year j, and 0 otherwise, defined separately for j = 3, ..., 6 (study year 2, the first year in which a gain can be observed, serves as the reference).

$M_{ist}$ = dummy variable equal to 1 if the school attended by student i in year t is a magnet, and 0 otherwise; its interactions with the study-year dummies are defined separately for j = 4, 5, and 6.

$\varepsilon_{igsdt}$ = random error term for student i in grade g, school s, district d, and year t.


The dependent variable is the year-to-year change in achievement (ELA and mathematics outcomes will be modeled separately) from the previous year to the current year (i.e., from year t - 1 to year t). The coefficients $\alpha_{i}$ represent the regression-adjusted mean outcome for each student across all schools attended and grades enrolled in. The estimated parameters $\phi_{s}$ show deviations from the student mean outcome for students in school s, while the $\lambda_{g}$ coefficients show deviations from the individual mean outcome attributable to grade level g. The $\beta_{q}$ coefficients represent deviations from individual mean outcomes due to background characteristics that vary over time. The estimated parameters $\delta_{3}$, $\delta_{4}$, $\delta_{5}$, and $\delta_{6}$ show year-specific effects of being observed in study years 3, 4, 5, or 6. Of particular interest are the final three coefficients, $\tau_{4}$, $\tau_{5}$, and $\tau_{6}$, which denote the effect on individual student outcomes of switching to a magnet school in study years 4, 5, or 6.
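
To illustrate how such a specification could be estimated, the following is a minimal sketch using ordinary least squares with explicit dummy variables. It assumes a gains file like the one sketched earlier, with columns gain, student_id, school_id, grade, study_year, and indicator columns magnet_y4, magnet_y5, and magnet_y6; all names are illustrative, and time-varying covariates are omitted for brevity. In practice the student dummies would be absorbed rather than estimated directly.

    import pandas as pd
    import statsmodels.formula.api as smf

    gains = pd.read_csv("student_gains.csv")

    # Student, school, grade, and study-year dummies correspond to the
    # alpha, phi, lambda, and delta terms in the model; the magnet_y*
    # indicators carry the tau coefficients of interest.
    fe = smf.ols(
        "gain ~ C(student_id) + C(school_id) + C(grade) + C(study_year)"
        " + magnet_y4 + magnet_y5 + magnet_y6",
        data=gains,
    )
    res = fe.fit(cov_type="cluster", cov_kwds={"groups": gains["student_id"]})
    print(res.params.filter(like="magnet_y"))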

The main challenge to this analysis is that we cannot know until the data are examined exactly how many non-resident students switch into magnet schools during the first three years in which those schools operate their magnet programs. The statistical power of this analysis will increase with the number of students who switch.



Supplementary Estimation Technique for Resident Student CITS Analysis

The CITS analysis discussed above will provide a quasi-experimental measure of the effect of attending a conversion magnet school for resident students. This section briefly describes a supplementary student fixed-effects model to estimate the effects of staying in a school after it converts to magnet status. This model can improve the internal validity of the CITS analysis by controlling for all fixed characteristics of students, both observed and unobserved, thereby reducing the chance that unobserved differences between students are confounded with the magnet status of the school.


One approach to this fixed-effects model that is highly analogous to the comparative interrupted time series HLM is to estimate a difference-in-differences model that compares individual students' gains in achievement before and after a school converts to magnet status, while using the gains of individual students at comparison schools to control for district-wide trends, as sketched below.
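
One way to write such a specification, using assumed notation consistent with the fixed-effects model above rather than notation taken from the study materials, is:

$$\Delta Y_{it} = \alpha_{i} + \lambda_{t} + \sum_{m=1}^{3} \tau_{m} \left( MAG_{s(i,t)} \times POST^{m}_{t} \right) + \varepsilon_{it}$$

where $\alpha_{i}$ are student fixed effects, $\lambda_{t}$ are year effects identified largely by students in the comparison schools, $POST^{m}_{t}$ indicates the m-th year after conversion, and the $\tau_{m}$ are difference-in-differences estimates analogous to $\pi_{5k}$, $\pi_{6k}$, and $\pi_{7k}$ in the CITS model.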



Core Analysis of the Estimated Effect of Magnet Conversion on Minority Group Isolation

The study will also evaluate the relationship between converting elementary schools into magnets and various measures of minority group isolation, such as the percentage of students at the school who are from minority groups. Let $R_{jks}$ be one of the measures of minority group isolation for grade level j from school pair k and school s. This variable can then be modeled using equations (2) and (3) from the HLM proposed earlier for test scores. (The student-level equation (1) is dispensed with because minority group isolation is measured at the school level or higher.) In this version of the HLM, the coefficients $\pi_{5k}$, $\pi_{6k}$, and $\pi_{7k}$ indicate whether the conversion magnet school changed its trend on the given measure of minority group isolation relative to its comparison schools.
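
As a concrete illustration, the simplest of these measures can be computed directly from enrollment counts; the following minimal Python sketch uses assumed file and column names and applies the regulatory threshold cited earlier (minority enrollment above 50 percent).

    import pandas as pd

    # Assumed: one row per school per year with enrollment counts.
    enroll = pd.read_csv("school_enrollment.csv")  # school_id, year, n_minority, n_total

    # Percent minority enrollment, one candidate for the R_jks measure above.
    enroll["pct_minority"] = 100 * enroll["n_minority"] / enroll["n_total"]

    # Per 34 CFR 280.4(b), a school is minority-group isolated when minority
    # students constitute more than 50 percent of enrollment.
    enroll["minority_isolated"] = enroll["pct_minority"] > 50

These school-year values then take the place of the grade-level mean outcomes in equations (2) and (3).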


It is also possible to test whether district-wide measures of minority group isolation have changed from their pre-existing trends in the years after one or more elementary schools are converted to magnet status. However, such an approach does not allow for a comparison set of schools to be used, and so this corresponds to an interrupted time series research design rather than a comparative interrupted time series design.

C. Schedules of Data Collections and Reports

Exhibit 4 presents the schedule for the data collection activities during the feasibility and evaluation phases of the Conversion Magnet Schools Evaluation, and Exhibit 5 presents the schedule of deliverables for the two phases.



Exhibit 4. Schedule for Data Collections


Feasibility Phase | Beginning and End Dates
Screen 2004 MSAP Cohort | May 1, 2007-August 1, 2007
Screen 2007 MSAP Cohort | July 2, 2007-September 28, 2007

Evaluation Phase |
Survey principals in 2004 cohort | December 3, 2007-March 31, 2008
Student records data collection for 2001-2002 through 2006-2007 | December 3, 2007-March 31, 2008
Student records data collection for 2007-2008 | December 1, 2008-March 31, 2009
Student records data collection for 2008-2009 | December 1, 2009-March 31, 2010
Student records data collection for 2009-2010 | December 1, 2010-March 31, 2011
Survey principals in both cohorts | April 15, 2010-September 30, 2010
Final interview of MSAP project directors or district choice representatives | April 15, 2010-December 1, 2010



Exhibit 5. Schedule for Dissemination of Study Results


Feasibility Phase | Deliverable Dates
Feasibility Memorandum |
  First draft of memo | October 30, 2007
  Final draft of memo | February 25, 2008
Descriptive Report |
  First draft of report | December 31, 2007
  Second draft of report | January 31, 2008
  Final report | March 31, 2008

Evaluation Phase Deliverables |
Evaluation Report |
  First draft of report | May 31, 2011
  Second draft of report | July 31, 2011
  Final report | September 30, 2011


17. Approval to Not Display Expiration Date

All data collection instruments will include the OMB expiration date.

18. Exceptions to Item 19 of OMB Form 83-1

No exceptions are requested.






1 See Christenson, B., Eaton, M., Garet, M., & Doolittle, F. (2004). Review of literature on magnet schools (Report submitted to U.S. Department of Education). Washington, DC: American Institutes for Research and MDRC, p. 3. See also U.S. Department of Education. (2005). Magnet schools assistance. Retrieved January 8, 2007, from http://www.ed.gov/legislation/ESEA02/pg65.html; and Rossell, C. H. (2005, Spring). What ever happened to magnet schools? Education Next, 2. Retrieved January 8, 2007, from http://www.educationnext.org/20052/44.html

2 Program regulations are in 34 CFR 280.

3 Information downloaded from www.ed.gov/programs/magnet/funding.html

4 The statute authorizes funding of grants to districts to carry out magnet programs that are part of an approved desegregation plan and are designed to bring students from different social, economic, ethnic, and racial backgrounds together. (Sec. 5303 of NCLB)

5 The Code of Federal Regulations definition of terms for the federal government’s magnet assistance program identifies minority group to include American Indian or Alaskan Natives, Asian or Pacific Islanders, Hispanics, and Blacks (not of Hispanic origin). The code defines minority group isolation in reference to a school as, “the condition in which minority group children constitute more than 50 percent of the enrollment of the school.” (34 CFR 280.4(b) )

6 As mentioned in the introduction, we are conducting a student fixed-effects analysis to assess the relationship of attending magnet schools to student achievement for non-resident students, instead of a lottery study, because only a few magnet schools were oversubscribed and relied on lotteries for admittance. Most magnet schools accepted all applicants.

7 See Federal Register, March 9, 2007 (Volume 72, Number 46), page 10723.

8 Crain, R., Allen, A., Thaler, R., Sullivan, D., Zellman, G., Little, J., & Quigley, D. (1999). The effects of academic career magnet education on high schools and their graduates. Berkeley, CA: University of California Berkeley, National Center for Research in Vocational Education.

9 Betts, J., Rice, L., Zau, A., Tang ,Y., & Koedel, C. (2006). Does school choice work? Effects on student integration and achievement. San Francisco: Public Policy Institute of California.

10 Ballou, D., Goldring, E., & Liu, K. (2006). Magnet schools and student achievement. New York: Columbia University, Teachers College, National Center for the Study of Privatization in Education.

11 The lottery applicants in the study by Ballou are non-resident students applying to schools outside of the assigned school zone. Although not specifically discussed, the magnet lottery applicants in the San Diego study by Betts et al. are also likely to be non-resident students, particularly at the elementary school level. In Crain et al.’s New York City study, however, students had to apply to participate in the career magnet academies and were selected through a lottery process without apparent regard to residence.

12 See Christenson, B., Eaton, M., Garet, M., & Doolittle, F. (2004). Review of literature on magnet schools (report submitted to U.S. Department of Education). Washington, DC: American Institutes for Research and MDRC.

13 A descriptive evaluation of the 1998 MSAP funded school districts reports that 57 percent of schools targeted for desegregation with MSAP funds succeeded in preventing, eliminating, or reducing minority group isolation. (U. S. Department of Education, Office of the Under Secretary. (2003). Evaluation of the Magnet Schools Assistance Program, 1998 grantees. Washington, DC: Author. See page xii. [Retrieved January 8, 2007, from http://www.ed.gov/rschstat/eval/choice/magneteval/finalreport.pdf]). An earlier descriptive study of the 1989 and 1991 MSAP funded school districts reported that 64 percent of desegregation targeted schools met or made progress in meeting their desegregation objectives, with 57 percent moving closer to the district-wide average in terms of minority enrollment. (U.S. Department of Education, Office of the Under Secretary. (1996). Reducing, eliminating, and preventing minority isolation in American schools: The impact of the Magnet Schools Assistance Program. Washington, DC: Author.)

14 It should be noted that not all of the magnet schools that MSAP supports are targets for desegregation. In a small proportion of cases, the magnet program in one school is intended to affect the enrollment composition of one or more other schools by drawing students from those schools into the magnet school. In such cases, these other schools are the targets for desegregation.

15 Betts, J., Rice, L., Zau, A., Tang ,Y., & Koedel, C. (2006). Does school choice work? Effects on student integration and achievement. San Francisco: Public Policy Institute of California.

16 While the exposure of students whose parents have low education to students whose parents have high education increased, so did the exposure of students whose parents’ education is unknown to students whose parents have high education. Since it is unclear what the unknown category represents, it is difficult to draw a conclusion.

17 The 2004 and 2007 MSAP grants provided funding for 385 schools. Nearly two-thirds (253 schools) were new magnet schools, and of these over half (137 schools) were elementary schools.

18 Nationally in 1999-2000, over three-fifths of all magnet schools were located in elementary schools (Christenson et al., 2004, p. 3).

19 All of the districts are providing data in which each student’s records are linkable across years. Having longitudinally linked data for individual students makes possible stronger analyses than are possible with individual records that cannot be longitudinally linked.

20 The revisions contained in NCES’s guidance on reporting aggregate ethnicity-race data for teachers, staff, and students that meet OMB standards include creating a race category for Native Hawaiians and Pacific Islanders that is separate from the Asian race category, and adding a category for non-Hispanics of two or more races.

21 The position of this question has been changed from item 4 in the 2006-2007 Principal Survey to item 3 in the 2009-2010 Principal Survey to place it closer to items 1 and 2 that also involve counts of “teachers” as opposed to counts of “teachers and other staff”.

22 Item 18 in the 2009-2010 survey is borrowed from a set of questionnaires for principals of magnet, charter, regular public, and private schools used in the 2008 “Survey of What Makes Schools Work” conducted by the Northwest Evaluation Association and Vanderbilt University.

23 According to the Department of Labor’s Occupational Outlook Handbook, 2006-2007 edition, the median annual salary for elementary school principals in 2004-2005 was $74,062. Assuming 220 workdays (or 1760 work hours) per year, the hourly rate for elementary principals without fringe benefits is $42.

24 Each row in Exhibits 3a and 3b represents a distinct data collection activity. The total projected number of respondents for the first 3 years of the project is obtained by summing the Projected Number of Respondents column of Exhibit 3a. As shown in the bottom row of that exhibit, we projected a total of 282 respondents during the first 3 years of the evaluation, or an average of 94 respondents per year. The Total Hours entry for each data collection activity is obtained by multiplying the projected number of respondents by the estimated time it will take each respondent to complete that activity. As shown in the bottom row of Exhibit 3a, we projected a total of 3,186 hours of respondent time during the first 3 years of the evaluation, or an average of 1,062 hours per year.

25 Actual student identifiers will be replaced by pseudo-ID codes in the files provided to AIR/BPA to ensure confidentiality of individual students’ records.

26 The estimate assumes that the response rate for this data collection will be 100 percent because the participating districts have all agreed to provide data.

27 The estimate assumes that the response rate for this data collection will be 75 percent because some of the participating districts may be able to provide other types of student record data but not classroom assignment data to the study.



28 As indicated in the introduction, a lottery-based analysis of the achievement of non-resident students in conversion magnet schools has been dropped from the evaluation. However, a quasi-experimental analysis based on the fixed-effects models described below may provide an alternative for studying the achievement of non-resident students in conversion magnet schools at no additional burden to districts.

29 For this analysis, students in the two comparison schools for each magnet will be pooled together as one pseudo-comparison school before being compared with the magnet school.

30 Schools using the same achievement tests will be grouped together for analysis, so it is likely that the analyses will be grouped by state.

31 The model as shown treats variation in estimated effects across magnet schools as random effects; it would also be possible to treat this variation as fixed effects.

32 In practice, the analysis will make use of the total number of switchers into or out of magnet schools to identify the relationship of magnet schools to achievement of nonresident students.
