Impact Study of Federally-Funded Magnet Schools: OMB Data Collection Package

January 26, 2022



Submitted to:

Institute of Education Sciences
550 12th Street, SW

Room 4104
Washington, DC 20004

Project Officer: Lauren Angelo
Contract Number: ED-IES-17-C-0066

Submitted by:

Mathematica Policy Research

P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005

Project Director: Christina Tuttle
Reference Number: 50526.01.026.220.000






CONTENTS

PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

A. Justification

1. Circumstances necessitating the collection of information

2. Purposes and uses of the data

3. Use of technology to reduce burden

4. Efforts to avoid duplication of effort

5. Methods to minimize burden on small entities

6. Consequences of not collecting data

7. Special circumstances

8. Federal Register announcement and consultation

9. Payments or gifts

10. Assurances of confidentiality

11. Additional justification for sensitive questions

12. Estimates of hours burden

13. Estimates of cost burden to respondents

14. Estimates of annual costs to the federal government

15. Reasons for program changes or adjustments

16. Plan for tabulation and publication of results

17. Approval not to display the OMB expiration date

18. Explanation of exceptions

References

APPENDIX A: STUDENT RECORDS REQUEST

APPENDIX B: LOTTERY RECORDS REQUEST

APPENDIX C: PRINCIPAL SURVEY ON SCHOOL ORGANIZATION AND INSTRUCTION

APPENDIX D: CHARTER SCHOOL ADMISSIONS SURVEY

APPENDIX E: CONFIDENTIALITY PLEDGE



TABLES

A.1. Magnet study research questions and data sources

A.2. Source, mode, and timing

A.3. Annual reporting and record-keeping hour burden

A.4. Timeline for project publications



PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

This Office of Management and Budget (OMB) package requests clearance for data collection activities to support a rigorous Impact Study of Federally-Funded Magnet Schools. The Institute of Education Sciences (IES) at the U.S. Department of Education (the Department) has contracted with Mathematica Policy Research and its subcontractor, Social Policy Research Associates (SPR), to conduct this evaluation.

The study team completed a feasibility assessment under a previously approved clearance package (1850-0943). Information gathered from fiscal year (FY) 2016 and 2017 Magnet Schools Assistance Program (MSAP) grantee districts and schools confirmed that a sufficient number of students are being admitted to these schools through lotteries to support an impact evaluation. A brief (currently under review) describes how MSAP-funded schools recruit and select students for admission, a topic of interest to the program office and to the Department's policymakers.

This updated clearance package expands on the prior one to request approval for impact study data collection. If the package is approved, the study would collect survey data from principals and district administrative records on admissions lotteries and student progress. The study would use these data to estimate the impacts of magnet schools on student achievement and diversity and to describe whether particular features of magnet schools are associated with greater success.

A. Justification

1. Circumstances necessitating the collection of information

a. Statement of need for a rigorous evaluation of magnet schools

The MSAP is one of the Department's key school choice programs and one of the key levers used to promote school improvement. In light of this, the Department's new Learning Agenda includes questions about (1) whether the program works to improve student achievement and school diversity and (2) how specifically magnet schools might achieve these goals.

The number of magnet schools has grown in recent years to more than 3,200 schools in more than 600 districts across the country (Glander 2016). Today, magnet schools typically implement a distinctive theme or instructional method and recruit at least some students from outside the school's neighborhood to attract students with a broader range of backgrounds. The assumption is that all students will benefit academically from the school's thematic focus and, potentially, from exposure to a diverse student body. Growth in the magnet school sector has been aided by the MSAP, which has provided grants to school districts to support magnet programs since 1985.

Despite the growth noted above, in recent years, federal funding for MSAP has declined while funding for other forms of school choice (such as public charter schools and private school voucher programs) has increased. The lack of conclusive evidence about the effectiveness of magnet schools in general (Ballou 2009) and of the MSAP program in particular may play a role in funding decisions. The most recent FY 2016 and FY 2017 MSAP grant competitions provide an opportunity to conduct a large-scale rigorous study of the MSAP program, drawing primarily on administrative records. The study would help answer the Department’s learning agenda questions and provide information to guide investment decisions.

The impact evaluation would measure the effects of a wide range of MSAP schools on student achievement and school diversity, using a rigorous lottery-based random assignment design. The impact study data collection would make it possible to estimate the impact of admission to a magnet school on: (1) student achievement and other available academic outcomes; and (2) school environment, including exposure to a more diverse set of peers. The study would also examine whether particular features of MSAP schools are associated with greater success, which could inform program improvement efforts in the future.

This impact evaluation of MSAP schools is authorized by Part B, section 8601, of the Elementary and Secondary Education Act of 1965 (ESEA), as amended by the Every Student Succeeds Act (ESSA), which allows the Department to pool resources across ESEA programs in order to fund rigorous evaluations of individual Federal education programs that currently lack sufficient evaluation dollars. The Magnet Schools Assistance Program (MSAP) itself is authorized under Title IV, Part D of the ESEA, as amended by ESSA, and provides grants to local educational agencies (LEAs) and consortia of LEAs to support magnet schools under an approved desegregation plan, whether required or voluntary.

b. Overview of study design and research questions

The study includes districts that are currently receiving federal support through multi-year MSAP grants awarded in FY2016 and FY2017. The feasibility phase revealed that there are 14 MSAP-supported school districts that are eligible to be recruited for the impact study. From these districts, we anticipate the study would include approximately 70 MSAP schools. Each of these schools used lotteries to determine which students were admitted in the 2018-2019 school year or the 2019-2020 school year.1

The study’s analyses would estimate the impact of each magnet school in the study sample by comparing the outcomes of students who receive an admission offer through the lottery to the outcomes of students who do not receive an offer through the lottery. The lotteries randomly determine which students are in the study’s treatment group (“lottery winners,” who are offered admission to the study’s magnet schools) and which are in the control group (“lottery losers,” who are not offered admission). The study would examine the outcomes of two cohorts of students: the first study cohort (applying through a lottery to enter an MSAP school in 2018-2019) would be followed for four years, and the second study cohort (applying through a lottery to enter an MSAP school in 2019-2020) would be followed for three years.2 Our analytic approach takes into account that variation will exist in schools’ lottery procedures, grade levels of applicants, and the ratio of applicants to available seats; including as many different magnet school lotteries as possible in the study will maximize the sample’s size and policy relevance.

Table A.1 lists the study's research questions, along with the data sources and analysis methods for addressing them.

Table A.1. Magnet study research questions and data sources

Research questions

Data sources and analysis methods

Feasibility study (completed)

  1. How do the districts and schools funded through the 2016 and 2017 MSAP grants recruit and select students for admission?

  • Screener interviews with 2016 and 2017 MSAP districts and school surveys on student recruitment and admissions

  • Comparison of reported school/district practices

  2. How many schools hold eligible lotteries and are willing to participate in the study?

  • Screener interviews with 2016 and 2017 MSAP districts

  3. Is the number of lottery participants and eligible schools sufficient to conduct the study?

  • Screener interviews on lottery procedures and information available in district administrative data

  • Projections of the number of students participating in usable lotteries in spring 2018 and 2019

  • Analysis of statistical power to detect meaningful impacts

Impact evaluation

  4. What is the impact of admission to the magnet program on student academic outcomes (achievement and/or other relevant measures of student success, such as persistence in school or graduation)?

  • Data on student outcomes from district student-level records

  • Experimental impact analysis comparing outcomes of treatment group students (who win lotteries and are admitted to magnet schools) and control group students (who lose lotteries)

  5. What is the impact of admission to the magnet program on the type of school that students attend, including their school's educational programs and the students' exposure to a diverse range of peers?

  • Student- and school-level administrative records, supplemented by data on school characteristics from principal survey

  • Experimental impact analysis comparing characteristics of schools attended by treatment group and control group students

  6. To what extent is there a relationship between school characteristics, including measures of diversity, and school impacts on student outcomes?

  • Subgroup analysis, in which impacts are calculated for subgroups of schools with particular characteristics

  • Correlational analysis, examining relationship between school characteristics and measures of diversity (from principal survey and extant data) and impacts

  7. How do charter schools select students for admission?

  • Surveys of a sample of currently operating charter schools

  • Descriptive analyses of practices used



By comparing the outcomes of treatment group and control group students, we would estimate the impact of admission to an MSAP school. Using primarily administrative records, we would measure these outcomes for four years. Following the sample for four years makes it possible to observe student test scores in grade 3 for students entering a magnet school in kindergarten, and for some outcomes it would also make it possible to assess whether any impacts grow or diminish over time (which could be particularly important in examining the effectiveness of newer magnet schools). By comparing the characteristics of the schools that treatment group and control group students attend, we also would assess the effect of admission to a magnet program on the type of schools that students attend, including the content of their schools' educational programs and the students' exposure to diverse groups of peers. We expect a broad set of schools to participate in the study, so our approach also includes comparisons of the impacts of admission to different groups of magnet schools, such as newer versus older magnet schools, or magnets with different themes. This correlational analysis may yield policy-relevant findings on the conditions under which magnet schools may be most likely to be successful.

The feasibility study findings revealed the importance of adding a research question, focused on charter school admissions practices, to the magnet program evaluation. Specifically, the feasibility study found that almost all MSAP schools are located in districts with other kinds of school choice, including charter schools. Information about charter school admissions practices, alongside already collected information about MSAP school practices, would provide context for the impact evaluation findings. Moreover, information about charter school admissions practices (including use of lotteries) would make it possible to assess whether a future charter school study, modeled on the current magnet school evaluation, would be feasible.

2. Purposes and uses of the data

To address the study’s research questions, the evaluation would collect and analyze data from several sources (Table A.2). These data will be used to understand the impact of admission to a magnet program on student academic outcomes and type of school attended (including peer characteristics), providing evidence on the extent to which magnet schools are improving student outcomes and increasing diversity. Next, we describe how the study would use each data source.

Table A.2. Source, mode, and timing

Data source | Mode and timing | Respondent group

District interview (completed feasibility study)
District interview protocol | 60-minute interview, administered in fall 2018a | All nine FY 2016 and 32 FY 2017 MSAP grantee districts (40 unique MSAP districts)b

Magnet school survey (completed feasibility study)
School recruitment and admissions survey | 10-minute survey, administered in fall 2018 | All 38 FY 2016 and 124 FY 2017 MSAP grantee schools (162 unique MSAP schools)

Charter school survey (impact study)
Charter school admissions survey | 30-minute survey, administered electronically, with telephone option, in fall 2020 | Nationally representative sample of 2,000 charter schools

Principal survey (impact study)
Principal survey on school organization and instruction | 30-minute survey, administered electronically, with telephone option, in fall 2020 | Principals of schools attended by impact study treatment group and control group students (including charter schools)

District/school records (impact study)
Lottery records request | Electronic data and characteristics of lottery implementation for prior and current school years, collected from districts in spring through fall 2019, for the 2018–2019 school lottery (retrospectively) and the 2019–2020 school lottery | All impact study sample districts (comprising schools attended by treatment group and control group students)
Student records request | Electronic student-level records data for prior school years, requested from districts in fall 2020 (for the 2016–2017 through 2019–2020 school years) and fall 2022 (for the 2020–2021 and 2021–2022 school years) | All impact study sample districts (comprising schools attended by treatment group and control group students)

a We assume that respondents will need an additional hour to locate and compile information in advance of the interview.

b One district (New Haven Public Schools) received a grant in both FY 2016 and FY 2017. This count also includes seven different 'community school district' grantees managed by the New York City Department of Education; the study separately contacted a representative of each of these community school districts in the feasibility study. For the impact study, the lottery records extraction and district records data extraction in New York City will be conducted directly with the central New York City Department of Education office.



District interview protocol (completed feasibility study). District interviews collected information on student recruitment activities, student selection, the structure of MSAP grantee admissions lotteries, the extent to which grantee schools are oversubscribed, and the availability of administrative data. This information helped determine whether a school was eligible for the impact study.

To reduce burden on district staff, we reviewed grantees' district websites and other publicly available information relevant to our items of interest before the interview; we asked respondents to verify or update that information, rather than submit new answers. We designed the protocol so that the discussion with one staff member could be completed in 60 minutes. We provided the protocol to respondents in advance of the phone discussion, along with a study brochure, and assumed that they would spend up to an hour locating and compiling information for the interview.

Magnet school recruitment and admissions survey (completed feasibility study). Short magnet school surveys collected only essential information on student recruitment activities and student selection. This information also helped determine whether a school was eligible for the impact study and provided information of interest to the program office.

Charter school admissions survey (impact study). Short charter school surveys will collect only essential information on student admissions. This information will help provide context for magnet school admissions practices and program impacts. It will also help determine whether a school might be eligible for a charter school impact study, should that become of interest to the Department and other stakeholders.

Lottery records request (impact study). For the impact study, it is essential to collect information about how lotteries are conducted and the outcomes of lotteries in order to define the study's treatment and control groups. To gather this information, the study team would share a memo with districts that defines the data that districts and schools will need to provide for each cohort of lottery applicants. We would structure the memo to define lottery data in a manner that accounts for several factors, such as single-school versus districtwide common lotteries, processes for making admissions offers (for example, location or sibling preferences), processes for late offers from wait lists, and identifiers to match lottery records to enrollment records. Districts would provide the associated records in the format used by the district.
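To make the request concrete, the sketch below shows one hypothetical layout for the lottery records described above. The field names are illustrative only, not a required format; as noted, districts would provide records in whatever format they already use.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LotteryRecord:
    """One applicant's entry in one admissions lottery (illustrative layout only)."""
    district_student_id: str  # identifier used to match lottery records to enrollment records
    lottery_id: str           # one school/grade/year lottery, or one round of a districtwide lottery
    school_year: str          # e.g., "2018-2019"
    grade_applied: int
    sibling_preference: bool  # True if admitted under a sibling or similar preference
    initial_offer: bool       # True if offered a seat when the lottery was first run
    waitlist_offer_date: Optional[str]  # date of any late offer from the wait list, if applicable
```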

Principal survey on school organization and instruction (impact study). Magnet schools offer specialized curricula or instructional methods to attract students from diverse backgrounds. A 30-minute web-based survey of school organization and instruction would provide a description of the magnet programs in MSAP schools, or the “treatment” in this evaluation. An important contribution of the impact study would be to gather information not only on the magnet schools but also on the schools that the study’s control group students attend. Research offers little insight into the relationship between the practices and characteristics of magnet schools and school and student outcomes. The survey of school organization and instruction would allow us to analyze which characteristics may be related to the effectiveness of magnet schools in improving student outcomes and increasing integration.

Student records request (impact study). In the impact study, we would share a memo with districts that describes the data we will ask them to provide. These student-level administrative records include demographic characteristics, school enrollment, and test scores, as well as attendance, persistence, and graduation, where applicable and available. The data request memo would clearly and concisely summarize (1) the sample of students for whom we are requesting data, (2) the data elements, and (3) the school years for which we are requesting each data element.

We also would use publicly available school-level data from the Common Core of Data and district websites to measure school and staff characteristics. These characteristics will include school size, racial/ethnic and socioeconomic student composition, and teacher/pupil ratio.
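As an illustration of how these public data might be combined with the study's school list, the sketch below merges a hypothetical study file with a CCD extract on the NCES school identifier. The file names are placeholders, and CCD field names vary somewhat by year; NCESSCH, MEMBER, and FTE are typical names for the school ID, student membership, and full-time-equivalent teacher count.

```python
import pandas as pd

# Merge the study's school list with a public Common Core of Data (CCD) extract.
# File names are placeholders; read IDs as strings to preserve leading zeros.
schools = pd.read_csv("study_schools.csv", dtype={"NCESSCH": str})
ccd = pd.read_csv("ccd_extract.csv", dtype={"NCESSCH": str})

merged = schools.merge(ccd[["NCESSCH", "MEMBER", "FTE"]], on="NCESSCH", how="left")
merged["pupil_teacher_ratio"] = merged["MEMBER"] / merged["FTE"]
```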

3. Use of technology to reduce burden

To minimize respondent burden, the data collection plan is designed to obtain information efficiently. When feasible, we will gather information from existing data sources, using the most efficient methods available. This will be especially helpful in reducing burden for the district/school interviews, because we will collect publicly available data on the schools’ admissions and lottery processes before the interview. Table A.2 provides information on the source, mode, and timing for each data collection activity.

We would survey principals primarily electronically. This mode allows respondents to complete the data collection instrument where and when they choose, and its built-in editing checks and programmed skips will reduce response errors. We also would offer a telephone or paper mode for principals who prefer these modes.

The types of district records needed for the evaluation are almost always available as computer files and would be transferred to Mathematica in that form. We would specify the required data elements; however, to reduce burden on the district, we would accept any format in which the data are provided. We would convert these data to a consistent format so that they can be combined with data from other districts and used in analysis.
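A minimal sketch of that conversion step appears below, assuming two districts whose files use different column names; all file, district, and column names are hypothetical.

```python
import pandas as pd

COMMON_SCHEMA = ["student_id", "school_year", "grade", "school_id", "math_score"]

# Hypothetical mappings from each district's column names to the common schema.
COLUMN_MAPS = {
    "district_a": {"StuID": "student_id", "SchYear": "school_year", "Gr": "grade",
                   "SchoolCode": "school_id", "MathScaleScore": "math_score"},
    "district_b": {"student_number": "student_id", "year": "school_year",
                   "grade_level": "grade", "school": "school_id", "m_ss": "math_score"},
}

def harmonize(path: str, district: str) -> pd.DataFrame:
    """Read one district's file and return it under the common schema."""
    return pd.read_csv(path).rename(columns=COLUMN_MAPS[district])[COMMON_SCHEMA]

# Files from different districts can then be pooled for analysis:
# pooled = pd.concat([harmonize("a.csv", "district_a"), harmonize("b.csv", "district_b")])
```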

4. Efforts to avoid duplication of effort

There is currently no other study examining the effectiveness of the MSAP. In addition, this study will require little effort on the part of respondents. The study will rely extensively on pre-existing administrative records, which would help to limit respondent burden. District records provide the basis of the study’s key outcomes, including student achievement and school diversity, and we would collect admission lottery information from schools and districts that already document these procedures in detail.

5. Methods to minimize burden on small entities

No small businesses or entities will be involved as respondents.

6. Consequences of not collecting data

These data are needed to evaluate the MSAP and its magnet schools. Little is known about these schools’ success in promoting diversity and boosting student achievement, the program’s main goals. Failing to collect these data would make it impossible to inform policymakers and the field about effectiveness and potential ways to address program improvement, or to address the Department’s Learning Agenda.

Consequences of not collecting data from specific sources. Each data source provides information needed for the evaluation.

Without information from the screener interviews with districts and surveys of magnet schools on recruitment and admissions, we would not have been able to collect information on the MSAP grantee admissions lotteries, the extent to which grantee schools are oversubscribed, and the availability of administrative data. Without all this information, we would not have been able to determine whether there was a sufficient number of schools and students for the impact evaluation. In addition, without the magnet school survey we would not have been able to collect information on student recruitment and admission practices. This information, alongside information from the charter school admissions surveys, will be valuable to education policymakers interested in understanding these practices. If we did not collect this information, it would not be possible for the study to highlight common and innovative approaches magnet schools use to attract students and increase diversity, or to show how their admissions practices compare with those of other schools of choice.

Without the information from principal surveys on school organization and instruction, we would be unable to understand the characteristics and organization of magnet schools, determine how they differ from schools enrolling control group students, or analyze which characteristics may be related to the effectiveness of magnet schools in improving student outcomes and diversity.

Without lottery records, we cannot conduct the impact evaluation. We would be unable to (1) accurately define students’ treatment group or control group status, and (2) identify the largest group of schools and students appropriate for the analyses.

Without student records, we would not be able to analyze the impact of magnet schools on student outcomes, including test scores, persistence, graduation, and exposure to diverse groups of peers.

7. Special circumstances

There are no special circumstances involved with this data collection.

8. Federal Register announcement and consultation

a. Federal Register announcement

The 60-day notice to solicit public comments will be published in the Federal Register.

b. Consultations outside the agency

We convened a panel of experts to advise on the feasibility and the design of the impact evaluation. These experts and their affiliations are listed below:

  • Atila Abdulkadiroglu, Duke University

  • Bob Bifulco, Syracuse University

  • Kelly Bucherie, Magnet Schools of America

  • Sarah Cohodes, Columbia University

  • Keisha Crowder-Davis, Dallas Independent School District

  • Erica Frankenberg, Penn State University

  • Megan Gallagher, Urban Institute

  • Ellen Goldring, Vanderbilt University

  • Jia Wang, UCLA CRESST



9. Payments or gifts

As part of their funding requirements, MSAP grantees must participate in evaluation activities; however, the same is not true of schools that become part of the control group samples. Because a strength of the intended impact evaluation is an exploration of the relationship between school organization/instruction and magnet school effects, the survey of treatment and control group principals is a critical data collection activity. We believe respondent payments are necessary for the principals in the control group sample. Specifically, control principals completing the survey will receive a $30 payment, which is consistent with incentive guidelines for a high-burden, 30-minute survey.3 We do not propose respondent payments for principals in grant-funded schools because we believe grant requirements will be sufficient to obtain their responses.

For the charter school admissions survey, those who complete the survey within the first three weeks will be offered a $50 gift card; those completing it after the first three weeks will be offered a $25 gift card.

10. Assurances of confidentiality

The study team has established procedures to protect the confidentiality and security of its data. This approach will be in accordance with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Section 183, Subsection (c), which requires the Director of IES to "develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data." The study also will adhere to the requirements of Subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.

The study team will protect the full privacy and confidentiality of all people who provide data. The study will not retain data associated with personally identifiable information (PII): study staff will assign random ID numbers to all data records and then strip any PII from them. In addition to the data safeguards described here, the study team will ensure that no respondent names are identified in publicly available reports or findings, and, if necessary, the study team will mask distinguishing characteristics. The following statement to this effect will be included with all requests for data:

“Mathematica Policy Research and its subcontractor SPR follow the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for research purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies respondents to anyone outside the study team, except as required by law.”

Mathematica uses the following safeguards to protect confidentiality:

  • All Mathematica employees sign a pledge that emphasizes the importance of confidentiality and describes their obligation (see Appendix E).

  • All internal networks are protected from unauthorized access by using defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The networks are configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the network. Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Computer data files are protected with passwords, and access is limited to specific users, who must change their passwords on a regular basis and conform to strong password policies.

  • Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.

After the study concludes, the study data may be transmitted to the National Center for Education Statistics (NCES) for safekeeping as a restricted-use file. Before transmittal, the data will be stripped of any individual identifiers. Researchers wishing to access the data for secondary analysis must apply for an NCES license and agree to the rules and procedures guiding the use of restricted-use files.
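One common pattern for the random-ID procedure described above is sketched below. It is illustrative rather than the study team's exact implementation; the crosswalk linking study IDs back to PII would be stored separately under restricted access.

```python
import secrets
import pandas as pd

def deidentify(records: pd.DataFrame, pii_cols: list) -> tuple:
    """Assign a random study ID to each record and strip PII columns.

    Returns the de-identified analysis file and a crosswalk mapping study IDs
    back to PII; the crosswalk is kept in secure, access-controlled storage.
    """
    out = records.copy()
    out["study_id"] = [secrets.token_hex(8) for _ in range(len(out))]  # random, non-identifying IDs
    crosswalk = out[["study_id"] + pii_cols]
    return out.drop(columns=pii_cols), crosswalk
```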

11. Additional justification for sensitive questions

No questions of a sensitive nature will be included in this study.

12. Estimates of hours burden

Table A.3 provides an estimate of time burden for the data collections, broken down by instrument and respondent. A total of 107 hours was already approved for the now-complete feasibility study. This updated OMB package requests 2,115.5 hours for the impact evaluation. These estimates are based on our experience collecting administrative data from districts and administering surveys to school administrators and principals. The table also presents annualized estimates of indirect costs to all respondents for each data collection instrument. Next, we provide details on the time and cost burdens for each data collection activity.

District interview (completed feasibility study): This estimate included up to 2 hours, in fall 2018, for all 40 FY 2016 and FY 2017 MSAP grantee districts to complete the screener interview. This estimate assumed one hour for the interview and one hour to collect and assemble information on lottery procedures and student recruitment and admission. The estimated cost burden for the district screener was based on an average wage of $44.46 per hour for 80 hours, coming to a total of $3,556.80.

Magnet school recruitment and admissions survey (completed feasibility study): This estimate included 10 minutes, in fall 2018, for all 162 FY 2016 and FY 2017 MSAP grantee schools to complete the survey. Completing the survey did not require respondents to collect or assemble information. The estimated cost burden for the school survey was based on an average wage of $44.46 per hour for 27 hours, coming to a total of $1,200.42.

Charter school admissions survey (impact study): This estimate includes up to 45 minutes, in fall 2020, for 1,600 school administrators (80 percent of the 2,000 sampled charter schools) to gather information and complete the survey. This estimate assumes 15 minutes for assembling information on lottery procedures and 30 minutes for the survey. The estimated cost burden for the school survey is based on an average wage of $44.46 per hour for 1,200 hours, coming to a total of $53,352.00. These estimates are based in part on the actual burden of the completed interviews for the feasibility study, which used an instrument similar to the planned charter school admissions survey.

Lottery records request (impact study). We assume 16 hours for each of the 14 districts in the impact study to work with the district liaison to compile the data requested in the lottery records extraction memo (224 total hours). An average wage of $44.46 per hour was used to calculate the total burden cost of $9,959.04 for collecting these records.

Principal survey on school organization and instruction (impact study). Based on power calculations, we anticipate recruiting 100 MSAP schools into the study sample; in addition to the principals of these schools, we anticipate surveying principals at approximately 1,000 schools attended by students who were not admitted to MSAP schools through the lottery (that is, schools attended by the study's control group). In total, 1,100 principals will be contacted to complete the school survey in spring 2022. We assume 935 principals (85 percent of the anticipated 1,100 principals in the sample) will complete the 30-minute survey (467.5 total hours). Assuming an average wage of $44.46 per hour, the total burden for the school surveys comes to $20,785.05.

Student records request (impact study). Similar to the burden associated with lottery records, we assume 8 hours for each of the 14 districts in the impact study at two points in time (fall 2020 and fall 2022) to work with the district liaison to obtain the administrative records defined in the memo (224 total hours). An average wage of $44.46 per hour was used to calculate the total burden cost of $9,959.04 for collecting these records.

The total burden for this impact evaluation collection is 2,115.5 hours, or an average of 705.2 hours per year over three years. A total burden cost of $94,055.13 was calculated across all impact study data collection activities, for an average yearly burden of $31,351.71.
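The impact study totals can be reproduced directly from the per-activity assumptions above; the following quick check of the arithmetic in Table A.3 is illustrative.

```python
WAGE = 44.46  # assumed average hourly wage for all respondents

activities = {  # responses, hours per response
    "Student records request": (28, 8),   # 14 districts x 2 collections
    "Lottery records request": (14, 16),
    "Principal survey": (935, 0.5),
    "Charter school admissions survey": (1600, 0.75),
}

total_hours = sum(n * hrs for n, hrs in activities.values())
total_cost = total_hours * WAGE
print(total_hours, round(total_cost, 2))                    # 2115.5 hours, $94,055.13
print(round(total_hours / 3, 1), round(total_cost / 3, 2))  # annualized: 705.2 hours, $31,351.71
```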





Table A.3. Annual reporting and record-keeping hour burden

Respondent/data request | Total respondents | Total responses | Time per response (hours) | Total hours | Average hourly wage | Respondent labor cost

Feasibility study (completed)
District interview | 40 | 40 | 2a | 80 | $44.46 | $3,556.80
School recruitment and admissions survey | 162 | 162 | 0.17 | 27 | $44.46 | $1,200.42

Impact study
Student records request (Appendix A) | 14 | 28 | 8 | 224 | $44.46 | $9,959.04
Lottery records request (Appendix B) | 14 | 14 | 16 | 224 | $44.46 | $9,959.04
Principal survey on school organization and instruction (Appendix C) | 935 | 935 | 0.5 | 467.5 | $44.46 | $20,785.05
Charter school admissions surveyb (Appendix D) | 1,600 | 1,600 | 0.75 | 1,200 | $44.46 | $53,352.00

Total (impact study) | 2,563 | 2,577 | n.a. | 2,115.5 | n.a. | $94,055.13
Average annual (across 3 years) | 854 | 859 | n.a. | 705.2 | n.a. | $31,351.71

a We assume that the telephone interview will take about an hour to complete and an hour for the respondent to gather information to prepare for the call.

b We assume that the charter school admissions survey includes 15 minutes of preparation time and 30 minutes to complete the survey.

n.a. = not applicable



13. Estimates of cost burden to respondents

There are no additional respondent costs associated with this data collection beyond the burden estimated in item A.12.

14. Estimates of annual costs to the federal government

The estimated cost for this six-and-a-half-year study, including establishing a technical working group, design and feasibility memo, brief on grantee recruitment and admissions, data collection instruments, district recruitment, data collection, data analysis, and report preparation, is $4,563,441 or approximately $702,068 per year.

15. Reasons for program changes or adjustments

The original burden estimate of 238 hours for the principal survey on school organization and instruction is being increased by 229.5 hours to a total of 467.5 hours. This is due to the number of expected respondents for the 30-minute survey increasing from 476 (85% of a total sample of 560) to 935 (85% of a total sample of 1,100) principals.

The survey's sample size increased for two reasons. First, there is an opportunity to include more MSAP schools in the analysis than originally projected. Initially, it was assumed that 70 MSAP schools would meet the study's inclusion criteria (i.e., used lotteries to determine which students were admitted in the 2018-2019 or 2019-2020 school year). However, more districts than expected use unified lotteries, and review of the lottery data collected for the study since the initial OMB submission demonstrated that the use of a unified lottery increases the number of schools whose admissions are affected by the lottery outcome. As a result, the study can include up to 100 MSAP schools in the treatment group for the impact study.

Second, it was originally assumed that the students who were not admitted via lottery to an MSAP school—the control group—would ultimately attend an average of 7 other (non-MSAP) schools per MSAP school, whose principals would also need to be surveyed (to understand how the characteristics and organization of schools enrolling control group students differ from those of the MSAP schools). However, based on review of the administrative data collected since the initial OMB submission, it is now estimated that the study will need to survey principals from an average of 10 non-MSAP schools per MSAP school to collect information on school characteristics for 90 percent of the control group.
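These assumptions imply the original and revised survey samples directly, as the arithmetic below illustrates.

```python
# Original assumption: 70 MSAP schools plus about 7 non-MSAP schools per MSAP school.
old_sample = 70 + 70 * 7      # = 560 schools
# Revised assumption: 100 MSAP schools plus about 10 non-MSAP schools per MSAP school.
new_sample = 100 + 100 * 10   # = 1,100 schools

# Expected respondents at an 85 percent response rate.
print(round(old_sample * 0.85), round(new_sample * 0.85))  # 476 and 935
```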

Including these additional schools and respondents benefits the evaluation in two ways: it increases the statistical precision of the study (due to the larger sample size) and the external validity of the MSAP study (by including a larger proportion of MSAP schools in the impact study).

16. Plan for tabulation and publication of results

a. Tabulation plans

Descriptive briefs

The purpose of the descriptive briefs will be to provide information about the recruitment and admissions practices of MSAP grantees and how those compare to the admissions practices of charter schools. This information may be useful to future MSAP grantees and other magnet schools outside the MSAP program. The briefs will summarize information collected in the study’s feasibility assessment (in particular, the initial screener interviews with district and school officials in spring 2018) and information from the charter school surveys. The analyses for these briefs will be purely descriptive; the briefs will not contain any impact analyses or complex statistical modeling. The document will be limited to descriptive tables and figures reporting how often schools in the study samples report using specific types of recruitment (magnet schools) and admissions practices (magnet and charter schools).

Impact report

The impact report will summarize the results of the full impact study, including rigorous estimates of the impacts of magnet schools on student achievement and diversity. The impact analyses summarized in this report will examine the effects of being offered admission to a magnet school on student achievement and other outcomes. Key elements of the plan include the following five components:

Basic model. The basic model shows how we will use treatment/control comparisons to estimate the impact of being offered admission to a magnet school on student achievement and other outcomes, including characteristics of the schools that students attend. Impacts on the characteristics of schools that students attend will tell us whether magnet schools affect the diversity of students' peers and the characteristics of students' educational programs. This analysis also will shed light on possible mechanisms for magnet school impacts, by revealing how these schools differ in their organization, instruction, and other characteristics from the schools their students would otherwise attend. We will base our primary student achievement outcome on scores for state assessments in reading and math (grades 3 through 8), covering additional subjects and grades as available from the district (including high school assessments). These scores will be standardized using state- and grade-level population means and standard deviations, to make comparisons across states and grades easier. We will examine other achievement outcomes (such as science or social studies test scores) to the extent that students are tested in these areas, to assess whether magnet schools with related themes affect achievement in those subjects. We also will examine impacts reflecting student progress and attainment, including grade promotion and graduation, if applicable.

A simple version of the model for estimating magnet school impacts is:

(1) $y_{ijt} = \alpha T_{ij} + \beta X_{ij} + \lambda_j + \varepsilon_{ijt}$,

where $y_{ijt}$ is the outcome in year $t$ for student $i$ who participated in lottery $j$; $T_{ij}$ is an indicator of whether student $i$ was assigned to the treatment group; $X_{ij}$ represents student baseline characteristics, including prior test scores; $\lambda_j$ is a set of lottery fixed effects; and $\varepsilon_{ijt}$ is a random error term. The lottery fixed effects account for differences between characteristics of applicants to different schools (and applicants who participate in different lotteries in a given school), as well as differences in the proportion of students offered admission in different lotteries (Abdulkadiroglu et al. 2011). The coefficient $\alpha$ represents the estimated impact of being offered admission to a magnet school—an intent-to-treat estimate, because not all those offered admission will attend the school. We also will estimate the impact of attending a study magnet school, using a complier average causal effect (CACE) estimator (Schochet and Chiang 2009).
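As a minimal sketch of how equation (1) might be estimated, the snippet below standardizes scores within state and grade and fits the lottery fixed-effects model. It assumes a student-level analysis file with hypothetical column names (treat, baseline_score, lottery_id, and so on); the study's actual estimation procedures may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_file.csv")  # hypothetical student-level analysis file

# Standardize test scores within state and grade. (The study would use state-
# and grade-level population statistics; within-sample z-scores are shown here
# for simplicity.)
df["z_score"] = df.groupby(["state", "grade"])["test_score"].transform(
    lambda s: (s - s.mean()) / s.std()
)

# Equation (1): outcome on the treatment indicator, baseline covariates, and
# lottery fixed effects. The coefficient on `treat` is the intent-to-treat
# estimate of the impact of a magnet school admission offer.
df = df.dropna(subset=["z_score", "treat", "baseline_score", "lottery_id"])
fit = smf.ols("z_score ~ treat + baseline_score + C(lottery_id)", data=df).fit()
print(fit.params["treat"])
```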

Variation in lottery designs. We will use the information we collect on lottery processes and outcomes to carefully account for issues that might otherwise thwart efforts to define the treatment group of magnet school students and the proper counterfactual against which their outcomes should be compared. For example, we will be prepared to analyze data not only from a simple, single-school lottery (based on the model described above), but also from a districtwide unified lottery. In the latter case, students or their parents may express their preferences for a number of schools, and the district randomly assigns students to schools. To account for different types of lotteries, we will adjust the sample included in the model estimation, as well as the lottery fixed-effects variables included in the model, accordingly.

Other lottery features that our analysis approach will be prepared to handle include exemptions, preferences, and stratification. Lottery exemptions allow certain students (such as siblings or those from a feeder school on a magnet school pathway) to be admitted to the school without having to participate in the lottery. Stratification occurs when certain groups of students are randomized before others or when a set number of seats are reserved for the group regardless of the number of applicants. Finally, we also may encounter situations in which students apply to multiple magnet schools with single-school lotteries. We will adapt methods developed in prior work on schools of choice (Gleason et al. 2010) to handle these types of situations.

The counterfactual. The counterfactual condition tells us what would happen to treatment group students if they had not received an admission offer to a MSAP magnet school. The experiences of control group students provide evidence about this counterfactual. For example, a large proportion of control group students attending private schools suggests that magnet schools attract students who would otherwise leave the district. The counterfactual condition affects the interpretation of impact estimates. In the simplest case, control group students would all attend neighborhood public schools, and we would interpret impacts as reflecting a comparison of magnets and these neighborhood schools. A more likely scenario is that the counterfactual is more varied, with control group students attending a mix of neighborhood, charter, and private schools. If many control group students attend private schools where data on administrative outcomes are not available (or in districts where charter school data is not included with district data on traditional schools and magnet schools), this could lead to differential attrition in the sample (with a higher attrition rate in the control group than in the treatment group), which can pose a threat to the study’s internal validity. In addition, control group students also may attend other magnet schools, which could complicate interpretation of the study’s results. We will minimize these risks by avoiding lottery situations in which such conditions are widespread. In cases in which a large proportion of control group students in our sample do attend magnet schools, we will focus on complier average causal effect (CACE) estimates.
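For the CACE estimates mentioned above, the randomized admission offer can serve as an instrument for actual magnet school attendance. The two-stage least squares sketch below approximates that estimator under hypothetical column names; the study cites Schochet and Chiang (2009) for the formal approach.

```python
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("analysis_file.csv").dropna(
    subset=["z_score", "offer", "attended_magnet", "baseline_score", "lottery_id"]
)

# Exogenous regressors: constant, baseline score, and lottery fixed effects.
exog = pd.get_dummies(df["lottery_id"], prefix="lot", drop_first=True, dtype=float)
exog.insert(0, "baseline_score", df["baseline_score"])
exog.insert(0, "const", 1.0)

# The offer (randomized) instruments for attendance (chosen); the coefficient
# on attended_magnet is the complier average causal effect of attending.
cace = IV2SLS(df["z_score"], exog, df["attended_magnet"], df["offer"]).fit()
print(cace.params["attended_magnet"])
```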

We will estimate the counterfactual based on schools that control group students attend. To the extent possible, we will measure magnet school impacts separately for subgroups of students based on the type of school they would likely have attended if they did not attend the magnet school (for example, students who would otherwise attend a traditional public school versus a charter school). We will explore multiple ways of carrying out this type of subgroup analysis. For example, one approach would be to identify magnet schools where nearly all control group students attend a traditional public school, and compare these schools to other magnet schools where nearly all control group students attend a charter school. An alternative approach would be to examine unified district-wide lotteries where students rank multiple schools: in these lotteries, it may be possible to estimate impacts separately for subgroups of treatment students who chose either traditional public schools or charter schools as their second-ranked option.

Correlational analysis. Our impact analysis will not only provide rigorous evidence on the overall impacts of magnet schools for students, but will also explore differential effects among subgroups of students and schools. This involves defining subgroups of schools with particular characteristics, then examining correlations between these characteristics and impacts. One focus will be on magnet school themes, to shed light on whether some types of schools—for example, those with a Science, Technology, Engineering, and Mathematics (STEM) focus—may be more successful than others. We will examine school characteristics that the charter school literature suggests may correlate with more positive impacts, such as serving larger proportions of disadvantaged students (Gleason et al. 2010), providing intensive tutoring for students (Furgeson et al. 2012), or providing substantial coaching and feedback to teachers (Dobbie and Fryer 2013). We also will assess the estimated impacts of different types of magnets, such as older magnets versus newer ones. This estimate will be relevant to districts considering whether to expand their magnet school sector, either by opening new magnet programs or by expanding the number of seats available in existing programs. Finally, we will estimate differential impacts by school grade levels.
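A simple version of this analysis could regress school-level impact estimates on school characteristics, weighting each school by the precision of its impact estimate. The sketch below uses hypothetical column names and is one of several ways the correlational analysis could be implemented.

```python
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("school_impacts.csv")     # one row per magnet school
schools["w"] = 1.0 / schools["impact_se"] ** 2  # precision weights

fit = smf.wls(
    "impact ~ stem_theme + years_as_magnet + pct_disadvantaged",
    data=schools,
    weights=schools["w"],
).fit()
print(fit.params)  # associations between school characteristics and impacts
```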

Longitudinal data. Using student-level and school-level administrative data in each of up to four follow-up years, we will track changes in key student- and school-level characteristics. Tracking students for four years will make it possible to observe administrative data on test scores in third grade (typically the earliest grade in which standardized tests are administered) for students who entered a magnet elementary school in kindergarten. One key student-level measure will be students' school enrollment. We will examine whether treatment group students who enter magnet schools stay there or move to nonmagnet schools. We will examine whether control group members enter magnet schools at any point during the follow-up period. At the school level, we will describe how magnet schools' characteristics evolve during the study period. For example, schools that convert to magnet status may not have fully put the school's theme in place in the first year or two, so we may observe changes in key school characteristics, such as student composition. Finally, the data will allow us to track magnet school impacts after additional years of potential enrollment in the schools.

b. Publication plans

The study team will prepare three reports. As part of the feasibility assessment, the study team will prepare a descriptive brief documenting the recruitment and admissions practices used by FY 2016 and 2017 MSAP grantees. To provide context for the impact evaluation and to inform the feasibility of a potential charter school impact study, the study team will prepare a descriptive brief documenting the admissions practices of charter schools. Finally, the study team will prepare a single comprehensive report presenting a portrait of magnet schools and their impacts on student achievement and school diversity. We will draw on experience preparing IES reports that use figures and accompanying briefs to present clear, accessible findings that can guide district, school, and policymaker decisions. Table A.4 presents the timeline for these reports.

17. Approval not to display the OMB expiration date

All data collection instruments will include the OMB expiration date.

18. Explanation of exceptions

No exceptions are requested.

Table A.4. Timeline for project publications

Activity | Date

Descriptive brief (magnets)
First draft | December 2019
Revised draft | February 2020
Final brief | July 2020

Descriptive brief (charters)
First draft | February 2021
Revised draft | April 2021
Final brief | September 2021

Study report
First draft | July 2023
Revised draft | September 2023
Final report | April 2024





References

Abdulkadiroglu, A., J. Angrist, S. Dynarski, T.J. Kane, and P. Pathak. “Accountability and Flexibility in Public Schools: Evidence from Boston’s Charters and Pilots.” Quarterly Journal of Economics, vol. 126, no. 2, 2011, pp. 699–748.

Ballou, Dale. “Magnet School Outcomes.” In Handbook of Research on School Choice, edited by Mark Berends, Matthew G. Springer, Dale Ballou, and Herbert Walberg. New York: Taylor and Francis, 2009.

Berends, Mark, and Joseph R. Waddington. “Impact of the Indiana Choice Scholarship Program: Achievement Effects for Students in Upper Elementary and Middle School.” Working paper. Available at http://creo.nd.edu/images/people/Waddington__Berends_Indiana_Voucher_Impacts_06.24.17.pdf. Accessed July 13, 2017.

Betts, J., S. Kitmitto, J. Levin, J. Bos, and M. Eaton. “What Happens When Schools Become Magnet Schools? A Longitudinal Study of Diversity and Achievement.” Washington, DC: American Institutes for Research, May 2015.

Bifulco, Robert, Casey Cobb, and Courtney Bell. “Can Interdistrict Choice Boost Student Achievement? The Case of Connecticut’s Interdistrict Magnet School Program.” Educational Evaluation and Policy Analysis, vol. 31, no. 4, December 2009, pp. 323–345.

Blank, R., R. Levine, and L. Steel. “After Fifteen Years: Magnet Schools in Urban Education.” In Who Chooses? Who Loses? Culture, Institutions, and the Unequal Effects of School Choice, edited by B. Fuller, R. Elmore, and G. Orfield, pp. 154–172. New York: Teachers College Press, 1996.

Christenson, B., M. Eaton, M. Garet, L. Miller, H. Hiwaka, and P. DuBois. “Evaluation of the Magnet Schools Assistance Program, 1998 Grantees.” Washington, DC: American Institutes for Research, 2003.

Crain, R.L., A. Allen, R. Thaler, D. Sullivan, G. Zellman, J. Warren Little, and D. Quigley. “The Effects of Academic Career Magnet Education on High Schools and Their Graduates.” Berkeley, CA: National Center for Research in Vocational Education, 1999.

Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method, 2nd ed. New Jersey: John Wiley & Sons, 2007.

Dobbie, Will, and Roland G. Fryer Jr. "Getting beneath the Veil of Effective Schools: Evidence from New York City." American Economic Journal: Applied Economics, vol. 5, no. 4, October 2013, pp. 28–60.

Finn, Chester, and Jessica Hockett. “Exam Schools: Inside America’s Most Selective Public High Schools.” Princeton, NJ: Princeton University Press, 2012.

Furgeson, Joshua, Brian Gill, Joshua Haimson, Alexandra Killewald, Moira McCullough, Ira Nichols-Barrer, Bing ru Teh, Natalya Verbitsky Savitz, Melissa Bowen, Allison Demeritt, Paul Hill, and Robin Lake. “Charter-School Management Organizations: Diverse Strategies and Diverse Student Impacts.” Cambridge, MA: Mathematica Policy Research, January 2012.

Glander, Mark. “Selected Statistics from Public Elementary and Secondary Education Universe: School Year 2014-15.” Washington, DC: National Center for Education Statistics, September 2016.

Gleason, P., M. Clark, C. Tuttle, and E. Dwoyer. “The Evaluation of Charter School Impacts.” Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, June 2010.

National Center for Education Evaluation. “Guidelines for Incentives for NCEE Impact Evaluations.” Memorandum to OMB, March 22, 2005. Washington, DC: National Center for Education Evaluation.

Polikoff, Morgan, and Tenice Hardaway. “Don’t Forget Magnet Schools When Thinking About School Choice.” Washington, DC: Brookings Institution, March 2017.

Schochet, Peter, and Hanley Chiang. “Technical Methods Report: Estimation and Identification of the Complier Average Causal Effect Parameter in Education RCTs.” Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, April 2009.

Steel, L., and R. Levine. "Educational Innovation in Multiracial Contexts: The Growth of Magnet Schools in American Education." Prepared for the U.S. Department of Education under contract by American Institutes for Research, Palo Alto, California, 1994.

Wang, J., J. Schweig, and J. Herman. “Is There a Magnet School Effect? Using Meta-Analysis to Explore Variation in Magnet School Success.” Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing and Center for the Study of Evaluation, December 2014.





1 Though the focus is on schools funded with the new grants, we would include magnet schools that received MSAP grant funds between 2010 and 2014 in the same 2016–2017 grantee districts. These earlier-funded schools are likely to have more mature magnet programs and stronger demand for admission, making them potentially better candidates for a lottery-based impact study of the MSAP.

2 Data collection activities scheduled to occur after the first three years of the study may be undertaken as an option in the contract with ED for this project. If ED chooses to exercise an option for additional years of data collection beyond the first three-year period, at that time another forms clearance package will be submitted for approval.

3 Providing data collection incentives is important in federal studies, given the recognized burden and need for high response rates. The use of incentives has been shown to be effective in increasing response rates (Dillman, 2007). In 2005, the National Center for Education Evaluation (NCEE) submitted a memorandum to OMB outlining guidelines for incentives for NCEE Evaluation Studies and tying recommended incentive levels to the level of burden (represented by the length of the survey).
