Impact Study of Federally-Funded Magnet Schools

OMB: 1850-0943


Part B

Impact Study of Federally-Funded Magnet Schools: OMB Data Collection Package

March 23, 2020



Submitted to:

Institute of Education Sciences
550 12th Street, SW

Room 4104
Washington, DC 20004

Project Officer: Lauren Angelo
Contract Number: ED-IES-17-C-0066

Submitted by:

Mathematica Policy Research

P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005

Project Director: Christina Tuttle
Reference Number: 50526.01.026.220.000




CONTENTS

PART B. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

B. Collection of information employing statistical methods

1. Respondent universe and sampling methods

2. Procedures for the collection of information

3. Methods to maximize response rates and deal with nonresponse

4. Tests of procedures or methods to be undertaken

5. Individuals consulted on statistical aspects of the design

References

TABLES

B.1. Magnet study research questions and data sources

B.2. Individuals consulted on study design




PART B. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

This Office of Management and Budget (OMB) package requests clearance for data collection activities to support a rigorous Impact Study of Federally-Funded Magnet Schools. The Institute of Education Sciences (IES) at the U.S. Department of Education (the Department) has contracted with Mathematica Policy Research and its subcontractor, Social Policy Research Associates (SPR), to conduct this evaluation.

The study team completed a feasibility assessment under a previously approved clearance package (1850-0943). Information gathered from fiscal year (FY) 2016 and 2017 Magnet Schools Assistance Program (MSAP) grantee districts and schools confirmed that a sufficient number of students are being admitted to these schools through lotteries to support an impact evaluation. A brief (currently under review) describes how MSAP-funded schools recruit and select students for admission, a topic of interest to the program office and to the Department’s policy makers.

This updated clearance package expands on the prior one to request approval for impact study data collection. If the package is approved, the study would collect survey data from principals and district administrative records on admissions lotteries and student progress. The study would use these data to estimate the impacts of magnet schools on student achievement and diversity and to describe whether particular features of magnet schools are associated with greater success.

B. Collection of information employing statistical methods

The most recent FY 2016 and FY 2017 MSAP grant competitions provide an opportunity to conduct a rigorous study of the MSAP program, drawing primarily on administrative records. The grant notices in those years included a competitive preference priority for grant applicants that proposed to select students to attend the magnet schools by methods such as lotteries. The impact evaluation would measure the effects of a wide range of MSAP schools on student achievement and school diversity, using a rigorous lottery-based random assignment design. The impact study data collection would make it possible to estimate the impact of admission to a magnet school on (1) student achievement and other available academic outcomes and (2) school environment, including exposure to a more diverse set of peers. The study would also examine whether particular features of MSAP schools are associated with greater success, which could inform future program improvement efforts.

This impact evaluation of MSAP schools is authorized by Part B, section 8601, of the Elementary and Secondary Education Act of 1965 (ESEA), as amended by the Every Student Succeeds Act (ESSA), which allows the Department to pool resources across ESEA programs in order to fund rigorous evaluations of individual federal education programs that currently lack sufficient evaluation dollars. The Magnet Schools Assistance Program (MSAP) itself is authorized under Title IV, Part D of the ESEA, as amended by ESSA, and provides grants to local educational agencies (LEAs) and consortia of LEAs to support magnet schools under an approved desegregation plan, whether required or voluntary.

The study design takes advantage of the lotteries that districts or schools conduct when more students seek admission to a magnet school than the school has available seats. To increase the policy relevance of the study, the analysis would focus on the districts currently receiving federal support through multi-year MSAP grants awarded in FY 2016 and FY 2017. We anticipate recruiting approximately 70 MSAP schools.1 The analysis would estimate the impact of each magnet school in the study sample by comparing the outcomes of students who receive an admission offer through the lottery to the outcomes of students who do not. Our analytic approach takes into account that substantial variation will exist in schools' lottery procedures, grade levels of applicants, and the ratio of applicants to available seats; including as many different magnet school lotteries as possible in the study will maximize the sample size and the study's policy relevance.
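The package does not spell out the estimation model, but a lottery-based comparison of this kind is commonly implemented as a regression of a student outcome on an admission-offer indicator with lottery ("risk set") fixed effects, so that winners are compared only with losers of the same lottery. The sketch below illustrates one such specification; it is an assumption for orientation, not the study team's confirmed method, and the file name and column names (test_score, offered, lottery_id, baseline_score) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per lottery participant.
# test_score is a follow-up outcome in standard deviation units;
# offered = 1 if the student won an admission offer; lottery_id
# identifies the school-grade-year lottery the student entered.
df = pd.read_csv("lottery_sample.csv")

# Lottery fixed effects absorb differences in admission odds across
# schools, grades, and years, so offers are compared only among
# students who faced the same lottery. Standard errors are clustered
# by lottery.
model = smf.ols(
    "test_score ~ offered + baseline_score + C(lottery_id)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["lottery_id"]})

print(model.params["offered"])  # estimated impact of an admission offer
```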

A final research question focuses on charter school admissions practices. The feasibility study found that almost all MSAP schools are located in districts offering other kinds of school choice, including charter schools. Information about charter school admissions practices, alongside the information already collected about MSAP school practices, would provide context for the impact evaluation findings.

Table B.1 describes these research questions, as well as data sources and analysis methods for addressing those questions.

Table B.1. Magnet study research questions and data sources

Research questions

Data sources and analysis methods

Feasibility study

  1. How do the districts and schools funded through the 2016 and 2017 MSAP grants recruit and select students for admission?

  • Screener interviews with 2016 and 2017 MSAP districts and school surveys on student recruitment and admissions

  • Comparison of reported school/district practices

  2. How many schools hold eligible lotteries and are willing to participate in the study?

  • Screener interviews with 2016 and 2017 MSAP districts

  3. Is the number of lottery participants and eligible schools sufficient to conduct the study?

  • Screener interviews on lottery procedures and information available in district administrative data

  • Projections of the number of students participating in usable lotteries in spring 2018 and 2019

  • Analysis of statistical power to detect meaningful impacts

Impact evaluation

  4. What is the impact of admission to the magnet program on student academic outcomes (achievement and/or other relevant measures of student success, such as persistence in school or graduation)?

  • Data on student outcomes from district student-level records

  • Experimental impact analysis comparing outcomes of treatment group students (who win lotteries and are admitted to magnet schools) and control group students (who lose lotteries)

  5. What is the impact of admission to the magnet program on the type of school that students attend, including the school's educational programs and the students' exposure to a diverse range of peers?

  • Student- and school-level administrative records, supplemented by data on school characteristics from the principal survey

  • Experimental impact analysis comparing characteristics of schools attended by treatment group and control group students

  6. To what extent is there a relationship between school characteristics, including measures of diversity, and school impacts on student outcomes?

  • Subgroup analysis, in which impacts are calculated for subgroups of schools with particular characteristics

  • Correlational analysis, examining the relationship between school characteristics and measures of diversity (from the school survey and extant data) and impacts

  7. How do charter schools select students for admission?

  • Surveys of a sample of 2,000 currently operating charter schools

  • Descriptive analyses of practices used



1. Respondent universe and sampling methods

For the district screener interviews, the respondent universe was all school districts receiving Magnet Schools Assistance Program (MSAP) grants in 2016 and 2017; 40 different school districts received these grants in FY 2016 and FY 2017. In addition, school survey supplements included one administrator in each of the 162 schools funded by the 2016 and 2017 MSAP grants. Those interviews and surveys determined which schools are eligible for the impact study (i.e., are oversubscribed and use lotteries to admit students) and therefore subject to its data collection, which includes a principal survey and student and lottery records collection.

The respondent universe for the student sample includes all students who applied to the eligible schools in 2018 or 2019 and participated in the schools’ admissions lotteries. These lotteries determine students in the treatment group (“lottery winners,” who are offered admission to the study’s magnet schools) and students in the control group (“lottery losers,” who are not offered admission). We would collect survey data from principals in the schools the treatment group and control group students attend, and would collect administrative data on these same schools from districts. We would collect administrative data on all sample students in the study schools. Finally, we would survey a sample of charter school administrators about admissions practices at their schools. Next, we describe each stage in greater detail.

a. Selection of magnet districts and schools

The study conducted screening interviews with one MSAP coordinator in each of the FY 2016 and FY 2017 MSAP grantee districts. In addition, surveys included one administrator in each of the 162 schools funded by the 2016 and 2017 MSAP grants. From those interviews and surveys, we identified potential districts and magnet schools to include in the impact study: districts with MSAP schools that tend to be oversubscribed and use lotteries to admit students. The impact evaluation sample would consist of approximately 70 MSAP schools that held 2018 and/or 2019 admissions lotteries (power calculations indicate that this potential sample is sufficient to produce impact estimates with adequate statistical precision). We expect a broad set of schools to participate in the study, including new, preexisting, and conversion magnets, as well as magnets with different themes. For example, the study would be able to compare the impacts of newer MSAP schools with those of more established MSAP schools, and to compare the impacts of MSAP schools using a science, technology, engineering, and math (STEM) theme with those of schools using a different theme.

b. Principal survey sample (impact study)

The principal survey sample will include all principals in the 70 treatment group schools and those in up to 490 schools attended by students in the control group, for a total sample of 560 principals. If students in the control group attend more than 490 schools, the study will contain costs by selecting the sample to prioritize schools that enroll the largest number of control group students. We expect that approximately 85 percent of principals in the sample will complete the survey.

c. Student sample (impact study)

The respondent universe for student data collection will include all students who participate in the 2018 or 2019 lotteries for admission to study MSAP schools. To conduct a study with adequate statistical precision, our power calculations indicate that we will need to recruit a sample of approximately 70 MSAP schools and include two cohorts of students per school, representing a universe of approximately 14,000 students who applied to and participated in the lotteries for these schools. The lotteries will determine students in the treatment group (“lottery winners,” who are offered admission to the study’s magnet schools) and students in the control group (“lottery losers,” who are not offered admission). To examine effects of magnet schools on student outcomes, we will collect district administrative records data on student demographics (age, sex, race/ethnicity, and eligibility for free or reduced-price lunch), school enrollment, test scores, attendance, persistence, and graduation.

d. Charter school survey sample (impact study)

The charter school sample would consist of 2,000 charter schools randomly sampled from the universe of roughly 7,700 charter schools operating in 2018. We expect that approximately 80 percent of charter school administrators in the sample will complete the survey.



2. Procedures for the collection of information

a. Statistical methodology for stratification and sample selection

We will select a sample of approximately 70 MSAP grantee schools that are willing and able to participate in the impact evaluation, and for each MSAP school the study sample will include two cohorts of students who applied for admission via a lottery. Based on the information collected in the study's feasibility phase, we project that this sample of schools will yield a student-level sample of more than 14,000 treatment group and control group students. The proposed design and sample size are sufficient to detect policy-relevant impacts both for the full sample and for key subgroups of students and schools. For MSAP middle schools, the design supports a minimum detectable effect (MDE) of 0.05 on achievement outcomes for the overall sample, or an MDE of 0.08 for a 50 percent subgroup of students or schools (such as recently established MSAP schools). This level of power for subsets of schools also implies that the design will support a useful correlational analysis of how magnet school characteristics relate to impacts. The sample also supports impact estimates for MSAP elementary schools (MDE of 0.10) and MSAP high schools (MDE of 0.08).
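The MDE figures above come from the study team's own power calculations, which are not reproduced in this package. For orientation only, the sketch below applies the textbook MDE approximation for a student-level randomized design; it ignores design effects from clustering within lotteries and from non-compliance, so it produces smaller MDEs than the values reported above. All inputs are illustrative assumptions.

```python
import math

def mde(n_students, p_offer=0.5, r2=0.0, multiplier=2.8):
    """Approximate minimum detectable effect, in standard deviation
    units, for a student-level randomized design.

    n_students: total sample size; p_offer: share offered admission;
    r2: outcome variance explained by baseline covariates; 2.8 is the
    conventional multiplier for 80% power with a 5% two-sided test.
    """
    return multiplier * math.sqrt(
        (1 - r2) / (p_offer * (1 - p_offer) * n_students)
    )

# Illustrative inputs only: ~14,000 students, with baseline test
# scores assumed to explain half of the outcome variance.
print(round(mde(14_000, r2=0.5), 3))  # full sample: ~0.033
print(round(mde(7_000, r2=0.5), 3))   # 50 percent subgroup: ~0.047
```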

Next, we describe our selection of the sample of principals and students for each data collection activity, should the impact evaluation go forward.

Selection of principal sample. We will not randomly sample principals for any data collection activities. Instead, we will attempt to collect survey data from all principals of treatment group schools and of up to 490 schools attended by control group students. Treatment group schools are defined as 2016 and 2017 MSAP grant-supported schools with eligible admission lotteries. Control group schools are defined as schools not supported by MSAP grants in which students enrolled after participating in an admission lottery for a treatment group school and not receiving an admission offer. If control group students attend more than 490 schools, we will survey principals in the schools that enrolled the largest number of control group students, as illustrated in the sketch below.
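A minimal sketch of that cost-containment rule, assuming a hypothetical district extract with one row per control group student (the file and column names are illustrative):

```python
import pandas as pd

# Hypothetical extract: one row per control group student, recording
# the school in which the student enrolled after losing the lottery.
enrollment = pd.read_csv("control_group_enrollment.csv")

# Count control group students per school and keep the 490 schools
# enrolling the most, so the principal survey covers as many control
# group students as possible.
counts = enrollment.groupby("school_id")["student_id"].nunique()
selected_schools = counts.sort_values(ascending=False).head(490)
```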

Selection of student sample. We will not randomly sample students for the collection of district administrative records. Instead, we will attempt to collect administrative records data on all students who entered the sample by participating in an MSAP school lottery seeking admission in fall 2018 or fall 2019, regardless of which schools the students attended in the grantee district during the study's follow-up period. The study team will not modify the lottery procedures used by MSAP-supported districts and schools; for example, any systems that districts use to give some types of students a higher chance of receiving an admissions offer than others (such as lottery stratification) will remain in place and will not be altered by the study.

Selection of charter school sample. We will randomly sample 2,000 charter schools from the universe of roughly 7,700 charter schools operating in 2018. The random sample will be stratified on three factors: (1) whether or not a school has received support from the Department's Charter School Program, (2) the state in which a school is located, and (3) the grade levels offered (high school, middle school, or elementary school). The sample size allows for making generalizations about charter school admissions practices, learning how oversubscribed schools prioritize students for admission, and making select comparisons (e.g., comparing Charter School Program grantees with schools that have never received federal support).
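The package names the three stratifiers but not the allocation rule. One common way to implement such a design is proportional allocation, sketched below; the allocation rule, file name, and column names (csp_grantee, state, grade_level) are assumptions for illustration.

```python
import pandas as pd

# Hypothetical universe file of roughly 7,700 operating charter
# schools, with one column per stratifier named above.
universe = pd.read_csv("charter_universe.csv")

TARGET = 2_000
fraction = TARGET / len(universe)

# Proportional allocation: draw the same sampling fraction within
# every stratum so the sample mirrors the universe on all three
# factors. Per-stratum rounding makes the total approximately 2,000.
sample = (
    universe
    .groupby(["csp_grantee", "state", "grade_level"], group_keys=False)
    .apply(lambda g: g.sample(frac=fraction, random_state=20180501))
)
```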

b. Data collection

District interview (completed feasibility study). We conducted a screening interview with each MSAP district to describe the study, collect school eligibility information, and seek support to encourage school participation in study activities, including screening interviews and school organization and instruction surveys.

We developed the protocol with the goal of completing a discussion with one staff member in 60 minutes. We provided the protocol to respondents in advance of the discussion and assumed that they would spend up to an hour locating and compiling information for the interview. To reduce burden on district staff, we reviewed grantees' district websites and other publicly available information to identify information relevant to our items of interest and asked respondents to verify or update that information rather than submit new answers.

School survey on recruitment and admissions (completed feasibility study). Short school surveys collected only essential information on student recruitment activities and student selection. This information helped determine whether a school was eligible for the impact study and will provide information of interest to the program office.

Principal survey on school organization and instruction (impact study). The data collection team will administer a 30-minute survey electronically to treatment group and control group principals. To help achieve an 85 percent response rate, we also will provide an option for principals to complete the survey by telephone.

We will administer principal surveys in fall 2020. The survey will gather information on school instruction and organization. Principals will respond to questions on curricular focus; admissions processes; student programs; instructional organization, approach, and resources; faculty and staff; safety and behavior policies; and community and parent engagement. Data from the principal surveys will allow us to analyze which school organization and instructional approaches may be related to the effectiveness of magnet schools in improving student outcomes and diversity.

The survey draws on valid and reliable items from instruments we have developed to survey principals at choice and traditional public schools, including the principal surveys from the IES charter school study and the National Evaluation of Charter Management Organization (CMO) Effectiveness, both used to identify promising practices in choice schools. We also drew from other relevant surveys, including the School Survey on Crime and Safety, the Schools and Staffing Survey, and the National Center on School Choice surveys (Berends et al. 2009).

Lottery records request (impact study). For the impact study, it is essential to collect information about how lotteries are conducted and about lottery outcomes in order to define the study's treatment and control groups. The study team will share a memo with districts that defines the data that districts and schools will need to provide for each cohort of lottery applicants, accounting for variation in lottery implementation and in the school options available. The extraction memo is structured to describe lottery data in a manner that accounts for the following factors:

  • Single-school versus districtwide common lotteries

  • Lottery processes for making admissions offers

  • Accounting for wait list offers for eventual admissions

  • Identifying treatment group assignment compliers: matching lottery records to enrollment records

In eligible schools and districts, collecting and processing data will involve close coordination with staff before, during, and after lotteries. Multiple points of coordination will allow us to document and verify nuances in the lottery procedures and data that we should account for in the research design. We will establish a liaison at the school or district with whom we will communicate about lottery procedures and outcomes. We will communicate during the recruitment process to confirm that the lottery structure is consistent with the information gathered during recruitment. We will use updated information that we gather from liaisons at this stage to develop a plan for completing the lottery results extraction form retrospectively for 2018 lotteries and 2019 lotteries. We will collect lottery data in a single data collection round. Districts will provide the associated records in the format used by the district.
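One factor listed above is matching lottery records to enrollment records to identify treatment group assignment compliers. A minimal sketch of that linkage step, with hypothetical file and column names (not the study's actual file layouts):

```python
import pandas as pd

# Hypothetical extracts keyed on a district student ID.
lottery = pd.read_csv("lottery_results.csv")      # student_id, magnet_school_id, offered
enrollment = pd.read_csv("fall_enrollment.csv")   # student_id, enrolled_school_id

# Match lottery records to fall enrollment records. A complier here is
# a student who received an offer and enrolled in the magnet school.
merged = lottery.merge(enrollment, on="student_id", how="left")
merged["complier"] = (merged["offered"] == 1) & (
    merged["enrolled_school_id"] == merged["magnet_school_id"]
)
```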

Student records request (impact study). For the impact study, the data collection team will collect data on students in treatment group and control group schools. We developed a standardized memo to share with districts that describes the data districts will provide so they can be collected in a consistent manner. This includes (1) student-level records, including demographic characteristics, school enrollment, and test scores, as well as attendance, persistence, and graduation where applicable and available; and (2) school-level data, such as teacher experience, principal experience, and average student characteristics. The data request form clearly and concisely summarizes (1) the samples of schools and students for whom we are requesting data (including identifying information where possible), (2) the data elements, and (3) the school years for which we are requesting each data element.

We also will use publicly available school-level data from the Common Core of Data and district websites to measure additional school and staff characteristics. These characteristics will include school size, racial/ethnic and socioeconomic student composition, and teacher/pupil ratio.

Charter school admissions survey (impact study). Short (30-minute) surveys will be administered electronically to charter school administrators to collect only essential information on student admissions. This information will help provide context for magnet school admissions practices and program impacts. Data collection will begin in fall 2020.

3. Methods to maximize response rates and deal with nonresponse

Next, we describe our methods for maximizing response rates and minimizing nonresponse in our collection of extant data from districts and our collection of primary data from principals (should the impact evaluation be feasible). All MSAP grantees are expected to participate in data collection activities, including completing the screener, as a condition of their grant funding.

Collection of administrative data. To reduce districts' burden in submitting administrative records (including student-level lottery documentation and longitudinal school records data) and to maximize response rates, we will allow districts to submit data in the format most convenient for them. Federal rules permit the Department and its designated agents to collect school records data from schools and districts without prior parental or student consent (Family Educational Rights and Privacy Act [20 U.S.C. 1232g; 34 CFR Part 99]). To further maximize the response rate and minimize burden on schools, we will follow these federal rules.

Principal survey on school organization and instruction (impact study). We expect to achieve a response rate of 85 percent for the principal survey. The Department has indicated that grantees are expected to complete the survey as a condition of their funding, which we expect to assist with response rates in the study’s treatment group. However, this will not apply to principals of schools attended by students in the control group. For these principals, we will offer a $30 gift card for completing the survey. For all principals, we will use the following approach, which is designed to maximize efficiency and minimize costs.

We will send an initial welcome email to all sample principals. The email will contain information on the study and a link to access the survey. Principals will be given 12 weeks to complete the survey, and we will send an email every 2 weeks reminding them to do so. We will also forward a list of study principals to a coordinator in each district, and ask districts to encourage the principals to complete the survey. To validate data and ensure quality control, we will conduct (1) regular, real-time checks of survey responses to ensure completeness and face validity and to detect issues such as instructions that need clarification; (2) an interim review of aggregate data to validate instrument skip patterns and review preliminary statistics, early enough to make critical fixes for most of the sample; and (3) a full review, cleaning, and editing of the complete data files.

For treatment group principals, we will leverage the expectation that they will cooperate as a condition of their grant funding. For control group principals, we will implement an incentive ($30) and leverage district participation in the study by requesting help from the district liaison identified in the district memorandum of understanding. The district liaison will provide an advance letter and other communication and follow-up, as appropriate, to participating treatment group and control group principals to encourage their cooperation.

Charter school admissions survey (impact study). We expect to achieve a response rate of 80 percent for the charter school administrator survey. We will use the following approach, which is designed to maximize efficiency and minimize costs.

We will send an initial welcome email to the entire sample. The email will contain information on the study and a link to access the survey. Administrators will be given 12 weeks to complete the survey, and we will send reminder emails up to once a week. For a portion of the sample, we will also conduct reminder phone calls and offer the option of completing the survey over the phone with a trained interviewer. Those who complete the survey within the first three weeks will be offered a $50 gift card; those completing it after the first three weeks will be offered a $25 gift card.

4. Tests of procedures or methods to be undertaken

Feasibility study. Pre-testing the district/school screener was vital to the integrity of data collection. We reviewed previously used questions and developed new questions for the evaluation. We conducted pilot testing of the district/school screener during the 60-day comment period for the initial OMB package. Through this testing, we learned that some items needed to be targeted specifically to the district role and others to school administrators. Therefore, the instrument was split into a district interview and a short school survey on recruitment.

Impact evaluation. We plan to pre-test the charter school admissions survey and the principal survey on school organization and instruction with up to nine respondents each. We will mail participants a hard copy of the survey and ask them to complete the survey within two weeks. We will provide participants with a list of questions to consider as they complete the survey and ask them to make notes in the survey regarding any questions that may be confusing or any terms or phrases that are unclear. We will also schedule a time to conduct a follow-up phone call with respondents to make sure questions were understood correctly and to debrief about the survey. The debrief will focus on their experience completing the survey, rather than their responses to the questions. Specifically, the debrief will focus on whether:

  • Questions are worded simply, clearly, and briefly, as well as in an unbiased manner.

  • Respondents can readily understand key terms and concepts.

  • Question response categories are appropriate, mutually exclusive, and reasonably exhaustive, given the intent of the questions.

  • Questions are accompanied by clear, concise instructions and probes so that respondents know exactly what is expected of them.

The goal of the pre-test is to assess how respondents understand the terms and questions presented in the survey, assess the accuracy and relevance of our questions, and determine whether we are missing important elements in our questions. The pre-test will also allow us to determine how long the survey takes to complete. We will share the pre-test findings with IES and consult with IES on any recommended changes to the survey instruments.

5. Individuals consulted on statistical aspects of the design

Members of the study team with expertise in statistical design were consulted on aspects of the sampling plan. Additionally, members of the advisory panel for the study reviewed the study design and provided feedback. These individuals are listed below.

Table B.2. Individuals consulted on study design

Name: Title and affiliation

Phil Gleason: Associate director, Human Services Research, and senior fellow, Mathematica

Ira Nichols-Barrer: Senior researcher, Mathematica

Atila Abdulkadiroglu: Duke University

Bob Bifulco: Syracuse University

Kelly Bucherie: Magnet Schools of America

Sarah Cohodes: Columbia University

Keisha Crowder-Davis: Dallas Independent School District

Erica Frankenberg: Penn State University

Megan Gallagher: Urban Institute

Ellen Goldring: Vanderbilt University

Jia Wang: UCLA CRESST

References

Berends, Mark, Marisa Cannata, Ellen Goldring, and Roberto Penaloza. “Innovations in Schools of Choice.” Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA, April 13–18, 2009. Available at https://www.researchgate.net/publication/267563706_Innovation_in_Schools_of_Choice. Accessed July 13, 2017.

Goldring, R., and S. Taie. "Principal Attrition and Mobility: Results from the 2012–13 Principal Follow-up Survey." (NCES 2014-064). Washington, DC: U.S. Department of Education, National Center for Education Statistics, 2014. Available at http://files.eric.ed.gov/fulltext/ED545366.pdf. Accessed October 23, 2017.




1 Though the focus is on schools funded with the new grants, we would include magnet schools that received MSAP grant funds between 2010 and 2014 in the same 2016–2017 grantee districts. These earlier-funded schools are likely to have more mature magnet programs and stronger demand for admission, making them potentially better candidates for a lottery-based impact study.

