Part B: Supporting Statement for Paperwork Reduction Act Submission




Evaluating the DC Opportunity Scholarship Program After the 2017 Reauthorization




February 2021




Prepared for:

Meredith Bachman

U.S. Department of Education

555 New Jersey Ave, NW

Room 502I

Washington, DC 20208-5500




Submitted by:

Abt Associates Inc.

10 Fawcett Street

Cambridge, MA 02138



Table of Contents




B. Collection of Information Employing Statistical Methods
  Introduction
  B.1 Respondent Universe and Sampling Methods
  B.2 Statistical Methods for Sample Selection and Degree of Accuracy Needed
  B.3 Methods to Maximize Response Rates and Deal with Nonresponse
  B.4 Test of Procedures and Methods to Be Undertaken
  B.5 Individuals Consulted on Statistical Aspects of the Design
References

B. Collection of Information Employing Statistical Methods

Introduction

The U.S. Department of Education (ED)’s Institute of Education Sciences (IES) requests clearance for data collection activities to support a congressionally mandated study of the District of Columbia (DC) Opportunity Scholarship Program (OSP). Specifically, this request covers the collection of administrative data; an interview with the OSP program operator; and surveys of administrators of OSP-participating and non-participating private schools, administrators of DC public schools, OSP applicants, and OSP users.

In 2004, Congress established the DC Opportunity Scholarship Program (OSP), the only federally funded private school choice program. The OSP awards scholarships to low-income DC residents to allow their children to attend participating DC private schools.

Collecting information about the OSP is critical given ED’s interest in private school choice as a way to improve students’ educational outcomes and Congress’s focus on the program. Proposed legislation supports both expanding the OSP to serve more students in DC and new tax credits that would make up to $5 billion available to fund similar programs nationwide.

The importance of the OSP to Congress is reflected in its requirement that IES conduct a third evaluation of the program, following those completed in 2011 and 2019. The two previous evaluations relied on lotteries to award private school scholarships to provide the conditions for a rigorous assessment of the program’s effectiveness. Together, those earlier evaluations raised some questions about how well the program was improving student achievement and parent and student satisfaction with their schools. Additionally, the most recent evaluation found declining rates of participation in the OSP program.

In the 2017 Scholarship for Opportunity and Results (SOAR) Reauthorization Act, Congress mandated that this third program evaluation not rely on a lottery-based allocation of scholarships. Therefore, the data collection for the new evaluation will address how the OSP is being implemented and what changes might be made to better fulfill the mission of increasing access to high-quality education for all students (SOAR Act § 38-1853.02(4)). ED plans to use the study results to inform program improvements aimed at increasing families’ participation in and satisfaction with the OSP and students’ academic success. Results may also inform a future reauthorization of the program.

IES has contracted with Abt Associates Inc. and its partners, the Center on Reinventing Public Education (CRPE) and Dr. Anna Egalite (together, the “study team”), to conduct the evaluation, including all data collection. The new OSP evaluation will collect data to answer and report on the questions shown in Exhibit B1 below.

Exhibit B1: Key Research Questions

  1. Who is the OSP serving, and who is it not serving?

  2. What are the top challenges families face participating in the OSP?

  3. What does the OSP program operator do or not do to help families overcome challenges in applying for, using, and continuing to use an OSP scholarship to enroll and stay enrolled in a private school?

  4. What do schools do or not do to help families overcome challenges to enrolling and staying enrolled?

  5. How are OSP students progressing academically?

The study will collect information from the program operator; from both OSP-participating and non-participating private schools in DC; and from DC public schools (both traditional and charter). The study will also collect administrative data relevant to answering the research questions. Exhibit B2 shows the timing and frequency of data collection activities through fall of 2024.

Exhibit B2. Timing of Data Collection Activities

Data Source | Respondent | Data Source Type

Spring 2021
OSP application data | OSP Program Operator | Administrative data
OSP participant data | OSP Program Operator | Administrative data
My School DC data | Office of the State Superintendent of Education (OSSE) | Administrative data
Public school enrollment data | OSSE | Administrative data
Public school characteristics data | OSSE | Administrative data
OSP student achievement data | OSSE and DC private schools that participated in OSP in 2017-2018 and 2018-2019 | Administrative data
OSP private school website content (a) | Study team will collect data directly from OSP-participating private schools’ public websites | Administrative data

Fall 2021
OSP application data | OSP Program Operator | Administrative data
OSP participant data | OSP Program Operator | Administrative data
My School DC data | OSSE | Administrative data
Public school enrollment data | OSSE | Administrative data
Public school characteristics data | OSSE | Administrative data
Program operator interview | Executive Director, Assistant Director, Enrollment Coordinator | Primary data

Spring 2022
Parent applicant survey | Parent of each eligible 2021-2022 OSP scholarship applicant | Primary data
Student applicant survey | Eligible 2021-2022 OSP scholarship applicants in grades 4-12 | Primary data
OSP school administrator survey | School administrators in private schools participating in the OSP | Primary data
Non-OSP school administrator survey | One administrator at each DC private school not participating in the OSP | Primary data
Public school administrator survey | One administrator at each DC public (traditional or charter) school | Primary data

Fall 2022
OSP application data | OSP Program Operator | Administrative data
OSP participant data | OSP Program Operator | Administrative data
Public school enrollment data | OSSE | Administrative data
Public school characteristics data | OSSE | Administrative data
OSP student achievement data | OSSE and DC private schools that participated in OSP in 2021-2022 | Administrative data

Spring 2023
Parent user survey | Parent of each eligible 2021-2022 OSP scholarship applicant who used an OSP scholarship to enroll in an OSP-participating private school for the 2021-2022 school year | Primary data
Student user survey | Eligible 2021-2022 OSP scholarship applicants in grades 4-12 who used an OSP scholarship to enroll in an OSP-participating private school for the 2021-2022 school year | Primary data

Fall 2023
Public school enrollment data | OSSE | Administrative data
Public school characteristics data | OSSE | Administrative data
OSP student achievement data | OSSE and DC private schools that participated in OSP in 2021-2022 | Administrative data

Fall 2024
High school graduation data (b) | OSSE and DC private schools that participated in OSP in 2012-2013 to 2014-2015 | Administrative data
College enrollment data (c) | National Student Clearinghouse | Administrative data

a These data will be extracted by the study team and are not covered in this clearance request.

b These data will be extracted in the fall of 2024 and are not covered in this clearance request.

c These data will be extracted in the fall of 2024 and are not covered in this clearance request but will have no associated burden, as the team would pay NSC to extract the data.

B.1 Respondent Universe and Sampling Methods

For each administrative and primary data source/respondent proposed, Exhibits B3 and B4 summarize the respondent universe, sampling methods, and expected response rates; Exhibit B4 also provides corresponding response rates from the most recent prior evaluation of the OSP.

B.1.1 Respondent Universe

Respondents include the OSP program operator; DC’s Office of the State Superintendent of Education (OSSE); the National Student Clearinghouse; OSP private school, non-OSP private school, and public school administrators; parents of eligible student applicants1 for a 2021-2022 OSP scholarship; parents of the subset of OSP scholarship applicants who use an OSP scholarship in the 2021-2022 school year; eligible student applicants for a 2021-2022 OSP scholarship; and the subset of student applicants who use an OSP scholarship in the 2021-2022 school year.

Where the exact size of the respondent universe is unknown, the study team has estimated the population sizes as shown in Exhibits B3 and B4. We prepared these estimates as follows:

  • Student achievement data (estimated population: 43 schools). The respondent universe consists of the 43 OSP schools in operation in 2019-2020.

  • High school graduation data (estimated population: 20 high schools plus OSSE). The respondent universe consists of the DC private schools that participated in the OSP program in 2012-13, 2013-14, and/or 2014-15 that include grade 12 and that remain open in spring 2024 when the study team will request these data. Among the 43 OSP schools operating in 2019-2020, 20 schools include grade 12. High school graduation data on former OSP students who graduated from a public school will be requested from OSSE. Clearance will be requested for these data in a future package because the data will be extracted in fall 2024.

  • Parents of eligible 2021-22 OSP scholarship applicants (estimated population: 750 parents). The respondent universe consists of one parent per household with one or more eligible scholarship applicants. For each household with multiple OSP applicants, the study team will randomly select one target child and ask the parent to complete the survey based on their experiences with OSP for that child. For 2019-20, the program operator reported 856 eligible new applicants. The study team has rounded up the estimated population of eligible student applicants to 900. Using historical data on the number of households with one, two, and three or more OSP applicants, we estimate 750 parents with at least one eligible OSP applicant for the 2021-22 year.2

  • Parents of 2021-22 OSP scholarship users (estimated population: 450 parents). The respondent universe consists of one parent per scholarship user. In 2018-2019, 42% of eligible students offered an OSP scholarship used it to attend a participating school (Webber et al., 2019a, Figure 11). Assuming a population of 750 parents of 2021-22 eligible OSP applicants, the number of parents of students who go on to use an OSP scholarship should not exceed 450 (60% of the 750 parents, a conservative upper bound given the observed 42% usage rate).

  • Eligible 2021-22 OSP scholarship student applicants in grades 4-12 (estimated population: 500 students). Based on the distribution of scholarship applicants by grade in the second evaluation of the OSP (Webber et al., 2019b, Figure A-3), approximately 46% of eligible applicant students were entering 4th to 12th grades (in 2012-2013). Applying this share to a rounded-up base of 1,000 eligible applicants (1,000 × .46 = 460) and rounding up again, the study team estimates 500 eligible student applicants in grades 4-12 for the 2021-22 OSP scholarship.

  • Eligible 2021-22 OSP scholarship student users in grades 4-12 (estimated population: 300 students). Assuming a population of 500 eligible applicants in grades 4-12, and the 42% usage rate observed for 2018-2019, the study team estimates the population of student users to be no more than 300 (60% of the population, a conservative estimate). The arithmetic behind these population estimates is reproduced in the sketch following this list.

  • OSP private, non-OSP private, and public school administrators (estimated populations: 43, 21, and 225 administrators, respectively). Assuming one administrator per school, the study team consulted the program operator’s list of OSP-participating schools, the National Center for Education Statistics Private School Universe Survey data (2015-16 school year), along with internet searches for lists of private schools in DC and OSSE’s website (https://osse.dc.gov/page/data-and-reports-0), to estimate the number of each type of school.

  • Study liaisons in OSP schools (estimated population: 43 schools). The study team will ask to visit the 43 OSP schools to administer student surveys to OSP students in grades 4-12. In each school, the study team will request assistance scheduling these visits from one staff member—the study liaison—per school.
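
The arithmetic behind these population estimates can be reproduced directly. The sketch below is a minimal illustration, assuming the rounded bases stated above (900 eligible applicants for the parent estimate, 1,000 for the grade 4-12 estimate) and reading the household-size percentages in footnote 2 as shares of applicants.

```python
# Reproduce the population estimates in Section B.1.1.
# Assumes 900 eligible applicants (rounded up from the 856 reported for
# 2019-20) and reads footnote 2's percentages as shares of applicants.
applicants = 900
applicant_share_by_household_size = {1: 0.698, 2: 0.223, 3: 0.052,
                                     4: 0.021, 5: 0.005, 6: 0.0008}

# One parent responds per household: an applicant in a k-applicant
# household contributes 1/k of a parent, so divide each share by k.
parents = applicants * sum(share / k for k, share
                           in applicant_share_by_household_size.items())
print(round(parents))              # -> 750 parents of eligible applicants

# Conservative 60% usage bound (the observed 2018-19 rate was 42%).
print(round(parents * 0.60))       # -> 450 parents of scholarship users

# Grades 4-12: ~46% of a rounded-up base of 1,000 applicants.
students_4_12 = round(1000 * 0.46) # -> 460, rounded up to 500 in the text
print(students_4_12)
print(round(500 * 0.60))           # -> 300 student users (conservative bound)
```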

B.1.2 Sampling Methods

The study team will not use sampling methods but will collect data from the census of respondents in each group.

B.1.3 Expected Response Rates

The study team anticipates a 100% response rate for information collected from administrative data sources (see Exhibit B3). We expect a 100% response rate for the primary data collected via the program operator interview and the OSP school administrator survey (see Exhibit B4). We expect an 80% response rate for the remaining primary data collected via the non-OSP and public school administrator surveys as well as the parent and student surveys (see Exhibit B4). Details about how the study will achieve these response rates are provided below.

Administrative data sources: For administrative data sources, the study team will coordinate closely with each respondent to ensure that the team’s data requests clearly specify the records and data fields requested, the data security methods that the study team will use to protect the privacy of individuals and schools included in the data files, and the frequency and timing of data requests.

Program operator interview: The study team will work with the OSP program operator to schedule the interview at a convenient time when the Executive Director, Assistant Director, and Enrollment Coordinator are available. The study team anticipates that the program operator’s staff will all participate in this interview given their strong interest in the findings of the study.



School administrator, parent, and student surveys: The target response rate for the school administrator, student, and parent surveys is 80%. The study team will use methods that proved successful in the most recent prior evaluation to encourage participation (see Section B.3 for details).

Exhibit B3. Respondent Universe, Sample, and Expected Response Rate for Administrative Data Sources

Data Source | Respondent | Respondent Universe | Sample | Expected Response Rate

OSP application data (2017-2018 through 2021-2022 OSP scholarship applicants) | OSP Program Operator | 1 | census | 100%
OSP participant data (2017-2018 through 2022-2023 OSP scholarship users) | OSP Program Operator | 1 | census | 100%
My School DC data (students entering the lottery for DC public [traditional, charter] schools for enrollment in 2017-2018 through 2021-2022) | Office of the State Superintendent of Education (OSSE) | 1 | census | 100%
Public school enrollment data (students enrolled in DC public [traditional, charter] schools in 2019-2020 through 2021-2022) | OSSE | 1 | census | 100%
Public school characteristics data (DC public [traditional, charter] schools in 2017-2018 through 2022-2023) | OSSE | 1 | census | 100%
OSP student achievement data (OSP students who take norm-referenced tests in mathematics and English/language arts in 2017-2018 through 2022-2023) | OSP-participating private schools | 43 (estimated) | census | 100%
High school graduation data (a) (OSP applicants for 2012-13, 2013-14, and 2014-15 randomly assigned to receive or not receive an OSP scholarship offer; see Webber et al., 2019a) | OSSE and DC private schools that participated in OSP in 2012-2013 to 2014-2015 | 21 (estimated) | census | 100%
College enrollment data (b) (OSP applicants for 2012-13, 2013-14, and 2014-15 randomly assigned to receive or not receive an OSP scholarship offer; see Webber et al., 2019a) | National Student Clearinghouse | 1 | census | 100%
OSP school website content (c) (schools participating in OSP in 2020-2021) | None: study team will collect data directly from each of the 44 schools’ public websites | NA | NA | NA

a These data will be extracted in the fall of 2024 and are not covered in this clearance request.

b These data will be extracted in the fall of 2024 and are not covered in this clearance request but have no associated burden, as the team would pay NSC to extract the data.

c These data will be extracted by the study team and are not covered in this clearance request.

Exhibit B4. Respondent Universe, Sample, and Expected and Prior Response Rates for Primary Data Sources

Data Source | Respondent | Respondent Universe | Sample | Expected Response Rate | Prior Response Rate

Program operator interview | 3 employees (Executive Director, Assistant Director, Enrollment Coordinator) of the OSP program operator | 3 | census | 100% | NA
Parent applicant survey | One parent of each eligible 2021-2022 OSP scholarship applicant | 750 (estimated) | census | 80% | 77% (a)
Parent user survey | One parent of each eligible 2021-22 OSP scholarship applicant who used an OSP scholarship to enroll in an OSP-participating private school for the 2021-22 school year | 450 (estimated) | census | 80% | 72% (b)
Student applicant survey | Eligible 2021-22 OSP scholarship applicants in grades 4-12 | 500 (estimated) | census | 80% | 68% (c)
Student user survey | Eligible 2021-22 OSP scholarship applicants in grades 4-12 who used an OSP scholarship to enroll in an OSP-participating private school for the 2021-2022 school year | 300 (estimated) | census | 80% | 68% (b)
OSP school administrator survey | Head of school, principal, or other school leader (one per school participating in the OSP) | 43 | census | 100% | 100% (d)
Non-OSP school administrator survey | Head of school, principal, or other school leader (one per DC private school not participating in the OSP) | 23 | census | 80% | 61% (d)
Public school administrator survey | Principal, vice principal, or other school leader (one per DC public school) | 225 | census | 80% | 91% (d)

Note: The prior response rates shown for the parent and student surveys reflect the prior evaluation’s treatment group response rates only. The study team expects that in the current evaluation all eligible OSP applicants will be offered an OSP scholarship, as recent data show that the OSP is not oversubscribed.

a Treatment group only, Dynarski et al. (2017, p. B-10).

b Treatment group only, Dynarski et al. (2018, p. B-9).

c Treatment group only, Dynarski et al. (2017, p. B-11).

d Betts et al. (2016, Table A-2, p. 21).

B.2 Statistical Methods for Sample Selection and Degree of Accuracy Needed

B.2.1 Sample Selection

As described above in Section B.1.2 Sampling Methods, the study will collect data from the census of respondents in each respondent group. No sample selection is planned.

B.2.2 Estimation Procedures

Exhibit B5 indicates which analysis methods will be used for each research question. To address the research questions, the study team will use three types of analytic methods:

  • Descriptive analyses. The study team will produce summary statistics such as means and standard deviations (for continuous variables) and tabulations such as frequency distributions and percentages (for categorical variables).

  • Comparative analyses. To compare groups, such as the demographic characteristics of OSP scholarship users and non-users, the study team will use common statistical tests, such as an F-test, to determine whether observed differences are statistically significant or likely due to chance (see the example following this list).

  • Regression analyses. The study team will use two types of regression models. (1) To investigate which school characteristics are most important to DC families when deciding on a school, we will use a model that takes into account demographic and school characteristics along with families’ schooling preferences to understand which characteristics families value most when choosing public and private schools and how preferences vary by characteristics of the choosers. (2) To estimate the effects of receiving an OSP scholarship on long-term student outcomes (high school graduation, college enrollment and completion), the study team will use a regression model for each outcome to compare the average value for students in the 2012-13, 2013-14, and 2014-15 cohorts offered an OSP scholarship (the treatment group) versus the average value for students not offered a scholarship (the control group). This model will include covariates for the demographic and academic characteristics of participating students before random assignment (at baseline). We will test the difference in outcomes—the impact—for statistical significance using a probability threshold of .05 (5%), a level used by most researchers (i.e., a 95% likelihood that an impact observed by the study was not due to chance).
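
To make the comparative analyses concrete, the sketch below applies the kind of F-test described above to simulated data on a single continuous characteristic of scholarship users and non-users. The variable names and values are hypothetical, not study data.

```python
# Minimal sketch of a comparative analysis on simulated (hypothetical) data:
# do scholarship users and non-users differ on a baseline characteristic?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
users_income = rng.normal(28_000, 6_000, size=300)     # hypothetical group 1
nonusers_income = rng.normal(30_000, 6_000, size=450)  # hypothetical group 2

# One-way F-test across the two groups (with exactly two groups this is
# equivalent to a two-sample t-test).
f_stat, p_value = stats.f_oneway(users_income, nonusers_income)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A difference with p < .05 would be reported as statistically significant.
```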

Schooling Decisions

The study team will use a maximum likelihood estimation method (Hole, 2007; Train, 2003) to identify the school characteristics families value most when they decide on a school to attend. The model will draw on data on the private schools to which students applied for admission; the public schools that they ranked in the My School DC lottery system or that are in the by-right attendance zone in which they live; the private schools that offered the student admission; the results of the My School DC lottery; and enrollment decisions. The large number of characteristics to be included in this model and the size of the dataset (approximately 1,000 students and 100 school attributes) generate a high risk of overfitting the model.3 To reduce this risk, the study team will “penalize” models in iterative fashion using the least absolute shrinkage and selection operator (LASSO) approach (Hastie, Tibshirani, & Wainwright, 2015). This approach “shrinks” the coefficients of predictors so that some coefficients are effectively zero (and as a result, some predictors contribute negligibly to the overall model). LASSO simultaneously improves predictive accuracy and reduces the number of independent variables in the final model. The study team will use the results of this final model to examine correlations between parent characteristics and the relative weights parents place on school characteristics (for example, lower-income parents may place a higher weight on a school’s average class size than on test scores, while higher-income parents may weight test scores more highly than class size).
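
As a simplified illustration of the shrinkage step, the sketch below fits an L1-penalized (LASSO-style) logistic regression to student-school choice pairs. The file name and attribute columns are hypothetical placeholders, and plain logistic regression stands in for the alternative-specific mixed logit likelihood of Hole (2007); this is a sketch of how LASSO zeroes out weak predictors, not the study’s actual estimator.

```python
# Simplified stand-in for the LASSO-penalized choice model. Each row is a
# (student, candidate school) pair; chosen = 1 if the family picked that
# school. File and column names are hypothetical; the real model would use
# ~100 attributes and a conditional-logit likelihood.
import pandas as pd
from sklearn.linear_model import LogisticRegressionCV
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("choice_pairs.csv")  # hypothetical analysis file
attributes = ["avg_class_size", "test_score_pctile", "distance_miles",
              "has_aftercare", "tuition"]  # hypothetical subset of attributes

X = StandardScaler().fit_transform(df[attributes])
y = df["chosen"].to_numpy()

# The L1 penalty shrinks weak coefficients to exactly zero; cross-validation
# chooses the penalty strength, mirroring the iterative "penalize" step.
model = LogisticRegressionCV(penalty="l1", solver="saga", Cs=10, cv=5,
                             max_iter=5000).fit(X, y)

kept = {name: round(coef, 3)
        for name, coef in zip(attributes, model.coef_[0]) if coef != 0.0}
print("Attributes retained by LASSO:", kept)
```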

Effect of an OSP scholarship on high school and college outcomes

To estimate the intent-to-treat effect of receiving an OSP scholarship on long-term student outcomes, the study team will use a linear regression-adjusted model with fixed treatment effects. This regression model’s equation is:

$$Y_{it} = \beta_0 + \beta_1 T_{i0} + \beta_2 \mathit{READ}_{i0} + \beta_3 \mathit{MATH}_{i0} + \gamma' \mathit{BaselineChar}_{i0} + \delta \, \mathit{Days}_{it} + \varepsilon_{it}$$

where i indexes students and t indexes time. Y_it is the outcome (for example, graduation from high school) for student i in year t (years measured relative to random assignment); T_i0 is 1 if student i received/used an OSP scholarship at t = 0 (year of randomization) and 0 if student i did not receive an OSP scholarship at t = 0. READ_i0 and MATH_i0 are reading and mathematics test scores for student i measured at t = 0, and BaselineChar_i0 is a set of baseline characteristics measured at t = 0. If the timing of tests relative to the average test date varies across students, Days_it measures the deviation from the average test date at follow-up time t (if there is no variation in Days, this term drops out of the regression equation). In this model, β1 measures the average treatment effect, namely the average impact of receiving an OSP scholarship on student outcomes. To test for impacts, the study will conduct two-tailed t-tests at the .05 level.
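
As a minimal sketch of how this regression might be estimated, the code below fits the model with statsmodels and tests the treatment coefficient with a two-tailed t-test at the .05 level. The data file and variable names (treat for T, read0/math0 for the baseline scores, and a few stand-in baseline characteristics) are hypothetical placeholders, not the study’s analysis file.

```python
# Minimal sketch of the intent-to-treat regression described above.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("osp_long_term_outcomes.csv")  # hypothetical analysis file

# grad = outcome Y; treat = T (offered scholarship at t=0); read0/math0 =
# baseline scores; female/age0/frl = stand-ins for BaselineChar; days = the
# deviation-from-average-test-date term (dropped if it does not vary).
model = smf.ols("grad ~ treat + read0 + math0 + female + age0 + frl + days",
                data=df).fit(cov_type="HC2")  # one common robust-SE choice

beta1 = model.params["treat"]   # estimated average treatment effect (beta_1)
p_val = model.pvalues["treat"]  # two-tailed t-test p-value
print(f"ITT effect = {beta1:.3f}, p = {p_val:.4f}",
      "(statistically significant at .05)" if p_val < 0.05 else "")
```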





Exhibit B5. Estimation Methods for Each Study Research Question

RQ 1. Who is the OSP serving, and who is it not serving? Who…

  1. applies for an OSP scholarship, and how do applicant and non-applicant families differ?

  2. uses an OSP scholarship, and how do OSP users differ from non-users?

  3. continues to use an OSP scholarship to stay enrolled in a private school (into a second school year), and how do “stayers” differ from “leavers”?

Methods: descriptive and comparative analyses

RQ 2. What are the top challenges families face participating in the OSP? Why do families…

  1. apply for an OSP scholarship, and what challenges do they face?

  2. use (or not use) an OSP scholarship to enroll initially in a private school, and what challenges do they face?

  3. continue (or not) to use an OSP scholarship to stay enrolled in a private school, and what challenges do they face?

Methods: descriptive and comparative analyses

RQ 3. What does the OSP program operator do, and not do, to help families overcome challenges in applying for, using, and continuing to use an OSP scholarship to enroll and stay enrolled in a private school?

  1. Is the program operator aware of families’ challenges at each stage of the program?

  2. How does the program operator support families at each stage of the program and engage OSP schools to support families?

  3. What information or supports could the program operator expand or add?

  4. What challenges does the program operator face?

Methods: descriptive and comparative analyses

RQ 4. What do schools do and not do to help families overcome challenges to enrolling and staying enrolled?

  1. Are OSP schools aware of families’ challenges?

  2. What information and supports do OSP schools provide to help families enroll and stay enrolled?

  3. How do the schooling environments in OSP schools, non-OSP schools, and public schools relate to families’ schooling decisions?

Methods: descriptive, comparative, and regression analyses

RQ 5. How are OSP students progressing academically?

  1. What is the average national percentile ranking for OSP participants, and what is the average gain in national percentile ranking for OSP participants between two school years?

  2. What is the effect of receiving and using an OSP scholarship on long-term outcomes of students compared to those of students not offered an OSP scholarship?

Methods: descriptive and regression analyses





B.2.3 Unusual Problems Requiring Specialized Sampling Procedures

Not applicable.

B.2.4 Use of Periodic Data Collection Cycles to Reduce Burden

Exhibit B2 shows the timing of data collection activities.4 To minimize burden, the study team will collect My School DC public lottery data only twice (spring 2021 and fall 2021). Surveys of school administrators will be conducted once during the study period (spring 2022). We will collect high school graduation and college enrollment data for prior cohorts of OSP applicants once (fall 2024).5

To provide timely information about the OSP, the study will collect data about who participates in the OSP and about the academic progress of OSP participants for four recent cohorts of OSP students (2017-2018, 2018-2019, 2019-2020, and 2020-2021). The study will also follow the 2021-2022 cohort of OSP scholarship applicants and their parents across two years of school enrollment (2021-22 and 2022-23). To ensure accurate data on OSP participation and academic progress across these five cohorts, the study team must collect other administrative data once per year for three to four years:

  • OSP applicant and participant data in spring 2021, fall 2021, and fall 2022;

  • OSP student achievement data in spring 2021, fall 2022, and fall 2023; and

  • Public school enrollment data from OSSE in spring 2021, fall 2021, fall 2022, and fall 2023.

Likewise, to track school enrollment and satisfaction outcomes for 2021-2022 OSP scholarship applicants across two school years, the study team will collect parent and student survey data twice:

  • Parent and student OSP applicant surveys in spring of 2022; and

  • Parent and student OSP user surveys in spring of 2023.

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

B.3.1 Maximize Response Rates

To maximize response rates, the study team will use strategies that have proven successful in the prior DC OSP evaluations as well as past studies the team has conducted with similar populations of school administrators, students, and parents (e.g., Reading First Impact Study, Evaluation of the U.S. Department of Education’s Student Mentoring Program, Evaluation of the Massachusetts Expanded Learning Time Initiative, Enhanced Reading Opportunities Study, Career Academies Evaluation, The Teacher Incentive Fund Evaluation, the Study of Enhanced College Advising in Upward Bound, the Study of Student Messaging in GEAR UP, and the Impact Evaluation of Academic Language Interventions). Below we begin by outlining general strategies and then describe those specific to particular data collection activities.

General strategies to maximize response rates include:

  • Advance agreements for administrative data collection. The study team will work with the OSP program operator to establish a memorandum of understanding (MOU) that outlines the study activities with which we would like the program operator’s cooperation. These activities include the program operator interview and the provision of administrative data on OSP scholarship applicants and OSP participants (scholarship users), as well as assistance identifying school administrators and study liaisons in the participating OSP schools. To obtain administrative data from the program operator and OSSE, we will establish a data use agreement with each that specifies the timing and frequency of data requests, the number and type of records for which data are requested, and the specific data elements requested. The study team will establish similar MOUs and data sharing agreements with each OSP school in order to obtain OSP student achievement data (in OSP schools that administer norm-referenced standardized tests of reading and/or mathematics) and to collect graduation data from each OSP school from which OSP applicants in 2012-13, 2013-14, or 2014-15 may have graduated.

  • Multi-tiered, multi-mode approach for survey data collection. For each survey effort, the study team will communicate with each target respondent in advance of the survey, at the start of the survey field period, and during the survey fielding period using multiple modes of communication (mail, phone, email, text message, and in person). The study team will offer respondents the option of completing the survey by telephone, via a web-based survey link, or in person. Web surveys will be compatible with mobile devices as well as traditional (desktop) screen sizes. In addition, the team will offer a monetary incentive to each respondent who completes a survey. Finally, the study team has developed surveys that use clear language, are easy to complete, and can be finished within 15 minutes.

  • Single point of contact for OSP school visits. To facilitate the in-person administration of student surveys, we will ask local education organizations (the Archdiocese of Washington Catholic Schools for OSP Catholic schools and the Association of Independent Schools of Greater Washington for independent private schools in DC) to encourage schools to accommodate study activities. We will ask the program operator to help us identify a point of contact at each school to facilitate one-on-one outreach. The study team will then work with each school liaison to discuss logistics and scheduling so that both the school’s scheduling needs and the study’s requirements are met. To encourage each school’s engagement, the study team will describe the importance of the study, the school’s role in the study, and the timeline of activities.

Below are the procedures the study team will use for particular data collection activities:

  • Parent and Student Applicant and User Surveys.

    • In advance of parent or student survey administration, parents will receive initial information about the study via an email or mailed letter that describes the study’s purpose, the study team, and parents’ role in the study, and that lets parents know that we will be asking their child to complete a survey during school hours in the coming weeks. This first communication will provide the study’s toll-free telephone number that parents can call to opt their child out of the student survey. It will also include a link to redeem a $5 advance incentive for parents and will describe the additional monetary incentives for parents and students who complete the surveys.

    • The study team will work with a designated school liaison to administer the survey to OSP applicants attending OSP schools.

    • For students in non-OSP schools, students who missed an in-school survey administration, and parents, the study team will send parents an email and letter containing web links to the student and parent surveys, along with a reminder of the incentives for survey completion ($10 age-appropriate gift card for the student and $20 for the parent).

    • Parents will receive weekly email and text message reminders for a maximum of six weeks. After four weeks of email and text reminders, the study team will telephone non-respondents: up to 15 phone call reminders will be made. If those attempts are unsuccessful, the team will next send a field interviewer to conduct the survey(s) in person. This interviewer will attempt to contact the respondent both at the respondent’s address and via any alternate contacts. The interviewer will leave specially designed study flyers and Sorry-I-Missed-You cards with family members or friends. Field managers will work with interviewers to locate each respondent and will conduct additional searches for respondents with outdated or inaccurate contact information.

    • For the Parent and Student User Surveys scheduled for administration in spring 2023 to students who used an OSP scholarship to enroll in an OSP school (and their parents), the study team will first check the OSP program operator’s participant records for any contact information updates. For all respondents, but particularly those no longer participating in the OSP, the study team will use both active and passive tracking methods to locate them prior to survey administration. Active tracking will include direct contact with the study participants through email, USPS mail, or telephone calls. Parents will be asked to update their contact information (via an online submission form, by telephone, or by mail). Parents who update their contact information will receive a $5 gift card. Passive tracking includes the collection of contact information from sources such as postal address updates, directory assistance, reverse directories, administrative data, and other commercial databases such as Accurint.

    • Methods to maximize response rates to the Parent and Student User Surveys will mirror those used for the Parent and Student Applicant Surveys; this includes a second incentive ($10 age-appropriate gift card for the student and $25 for the parent) to complete the survey.

  • School Administrator Survey. Following a similar approach to that used for the parent and student surveys, the study team will deliver an advance email and letter describing the study, study team, and the school administrator’s role in the study. Letters of support for the study and encouragement to complete the survey will also be sent by the OSP program operator (to OSP school administrators) or by OSSE (to public school administrators). To complete the survey, each administrator will receive an email with a link to an online survey (a paper copy of the survey will also be made available). The study team will follow up with non-respondents with up to six email reminders and eight phone reminders. Each administrator will receive an incentive to complete the survey. Finally, the study team will pilot-test the survey to ensure that it is clearly worded and easy to complete within 15 minutes.

B.3.2 Dealing with Non-Response

As discussed above, to meet target survey response rates, the study team will:

  • Employ a multi-tiered approach, including advance letter, email, text message, phone, and in-person follow-up;

  • Use multiple modes allowing respondents to complete the survey online (optimized for display on a desktop or mobile device), by phone, or in person;

  • Conduct pilot tests of surveys to ensure that each requires no more than 15 minutes to complete; and

  • Give monetary incentives to survey respondents to promote their cooperation and improve data quality by minimizing non-response bias (see Section B.3.1 Maximize Response Rates for more details about incentives).

For each respondent group, if response rates fall below 80%, the study team will conduct non-response analyses. First, the study team will compare administrative data on the characteristics of individuals who completed the surveys to the characteristics of those who did not. Second, using these baseline characteristics we will use a statistical model to predict the probability that a targeted survey respondent was located and responded to the survey. If these analyses point to the possibility of non-response bias, sampling weights will be created based on the observable baseline characteristics and used in analyses.
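
A minimal sketch of this second step appears below, assuming hypothetical baseline variables: a logistic model predicts each targeted respondent’s probability of responding, and respondents are then weighted by the inverse of that predicted probability.

```python
# Sketch of a non-response adjustment: model response propensity from
# baseline characteristics, then weight respondents by 1 / propensity.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

frame = pd.read_csv("survey_frame.csv")  # one row per targeted respondent
baseline = ["age", "grade", "female", "frl", "prior_test_score"]

propensity = LogisticRegression(max_iter=1000).fit(
    frame[baseline], frame["responded"])
frame["p_respond"] = propensity.predict_proba(frame[baseline])[:, 1]

# Inverse-propensity weights for respondents only, normalized to mean 1.
respondents = frame[frame["responded"] == 1].copy()
respondents["weight"] = 1.0 / respondents["p_respond"]
respondents["weight"] /= respondents["weight"].mean()
print(respondents["weight"].describe())  # inspect for extreme weights first
```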

B.4 Test of Procedures and Methods to Be Undertaken

During the 60-day public comment period, the study team pilot tested each of the three survey instruments with no more than nine individuals per instrument. Each pilot test included representatives of the relevant respondent population: four students pilot tested the student survey, four parents pilot tested the parent survey, and five school administrators pilot tested the school administrator survey. Respondents identified questions that were challenging to answer and items where additional clarity was necessary.

Exhibit B6 shows the average time to completion for each survey.

Exhibit B6. Average Minutes to Complete Each Survey Instrument

Survey | Average Minutes to Complete Survey | Standard Deviation

School Administrator Surveys (a)
  OSP/Non-OSP private school administrator | 21 | 1.4
  Public school administrator | 13 | 2.8

Parent Surveys
  Parent applicant | 17 | 0.7
  Parent user | 12 | 3.5

Student Surveys (b)
  Student applicant | 8 | 0.7
  Student user | 11 | 2.8

a Because the Non-OSP private school administrator survey has 13 fewer items than the OSP version, the estimated burden per respondent is three minutes less than the estimated burden per respondent for OSP private school administrators.

b Due to minor differences in the versions of the student survey for students in different grade levels, pilot respondents included OSP students in elementary, middle, and high school grades.

The timing review and feedback from the pilot testing did not suggest any changes to the student applicant survey or the student user survey. However, they did result in the following revisions to the parent and school administrator surveys, to shorten their length and to clarify survey item wording and response options:

  • On the parent applicant and user survey, the wording of some questions and response options was edited for clarity. Three questions were dropped, and two response options were dropped from a third question to shorten the overall survey length.

  • On the private school administrator survey, the wording of some questions and response options was edited for clarity. Fifteen questions were dropped because they were identified as redundant or non-essential to the study goals. Seven questions were added to request information not available from other sources and to cover topics related to COVID-19.

  • On the public school administrator survey, the wording of some questions and response options was edited for clarity. Five questions were dropped because they were identified as redundant or non-essential to the study goals. Two questions were added to the survey to allow for comparison with private school administrator responses and to collect information related to COVID-19.

After the instruments are approved, the study team will program the surveys for administration via computer-assisted web interviewing (CAWI) and computer-assisted personal interviewing (CAPI) methods. Prior to deployment, the team will test the programmed survey instruments (CAWI and CAPI modes) to ensure they function as designed. This will include extensive manual testing for skip patterns, fills, and other logic. To reduce data entry errors, numerical entries will be checked against an acceptable range and, where appropriate, prompts will be presented for valid but unlikely values. This testing will increase the accuracy of data collected, while ensuring that respondent burden is minimized. The study team has conducted similar survey administrations on the Massachusetts PEG Evaluation and on the Early Learning Study at Harvard (ELS@H) and Impact Evaluation of Academic Language Interventions (ALI) projects (OMB Control No. 1850-0941).
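
To illustrate the range checks and soft prompts described above, here is a small sketch of hard and soft validation bounds. The item, bounds, and return values are hypothetical examples, not the programmed instruments’ actual logic.

```python
# Sketch of hard/soft range validation for a numeric survey entry.
# Item names and bounds are hypothetical examples.
def check_numeric(value: float, hard: tuple, soft: tuple) -> str:
    """Return 'reject' if outside the acceptable (hard) range, 'confirm' if
    valid but unlikely (outside the soft range, so the respondent is
    re-prompted to verify), and 'accept' otherwise."""
    hard_lo, hard_hi = hard
    if not hard_lo <= value <= hard_hi:
        return "reject"   # outside the acceptable range entirely
    soft_lo, soft_hi = soft
    if not soft_lo <= value <= soft_hi:
        return "confirm"  # valid but unlikely: present a verification prompt
    return "accept"

# Example: years at current school (hard bounds 0-13; soft bounds 0-8).
print(check_numeric(11, hard=(0, 13), soft=(0, 8)))  # -> 'confirm'
```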

B.5 Individuals Consulted on Statistical Aspects of the Design

The following individuals were consulted on the statistical aspects of the study:

Name | Title/Affiliation | Telephone
Carter Epstein | Senior Associate, Abt Associates | (617) 349-2543
Tamara Linkow | Principal Associate, Abt Associates | (617) 520-2978
Austin Nichols | Principal Associate, Abt Associates | (301) 347-5679
Amanda Parsad | Principal Associate, Abt Associates | (301) 634-1791

The following individuals will be responsible for the data collection and analysis:

Name | Title/Affiliation | Telephone
Donna Demarco | Principal Associate, Abt Associates | (617) 349-2322
Carter Epstein | Senior Associate, Abt Associates | (617) 349-2543
Tamara Linkow | Principal Associate, Abt Associates | (617) 520-2978
Amanda Parsad | Principal Associate, Abt Associates | (301) 634-1791
Brenda Rodriguez | Senior Associate, Abt Associates | (617) 520-2351


References

Betts, J., Dynarski, M., & Feldman, J. (2016). Evaluation of the DC Opportunity Scholarship Program: Features of Schools in DC (NCEE Evaluation Brief). Completed under contract number ED-12-CO-0018.

Dynarski, M., Rui, N., Webber, A., & Gutmann, B. (2017). Evaluation of the DC Opportunity Scholarship Program: Impacts After One Year (NCEE 2017-4022). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Dynarski, M., Rui, N., Webber, A., & Gutmann, B. (2018). Evaluation of the DC Opportunity Scholarship Program: Impacts Two Years After Students Applied (NCEE 2018-4010). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Hastie, T., Tibshirani, R., & Wainwright, M. (2015). Statistical Learning with Sparsity: The Lasso and Generalizations. Boca Raton, FL: CRC Press.

Hole, A. R. (2007). Fitting mixed logit models by using maximum simulated likelihood. Stata Journal, 7(3), 388–401. https://www.stata-journal.com/sjpdf.html?articlenum=st0133

Train, K. E. (2003). Discrete Choice Methods with Simulation. Cambridge, UK: Cambridge University Press.

Webber, A., Rui, N., Garrison-Mogren, R., Olsen, R., & Gutmann, B. (2019a). Evaluation of the DC Opportunity Scholarship Program: Impacts Three Years After Students Applied (NCEE 2019-4006). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Webber, A., Rui, N., Garrison-Mogren, R., Olsen, R., & Gutmann, B. (2019b). Evaluation of the DC Opportunity Scholarship Program: Impacts Three Years After Students Applied. Technical Appendix (NCEE 2019-4006). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

1 Eligible students must be current DC residents; be 5 years old or entering kindergarten through 12th grade for the upcoming school year; and either receive SNAP benefits (food stamps) or meet income guidelines: at or below 185% of the income threshold for first-time applicants or 300% of the income threshold for renewing families. Source: https://servingourchildrendc.org/our-program/apply/

2 Data from the prior evaluation of the OSP (Webber et al., 2019a) showed that 69.8% of applications were from households with one OSP applicant; 22.3% from households with two OSP applicants; 5.2% from households with three OSP applicants, 2.1% from households with four OSP applicants, 0.5% from households with five OSP applicants, and 0.08% from households with six OSP applicants. Assuming we select one focal child per household and 900 eligible student applicants, the total number of parents is 750.

3 Overfitting a model means that the relationships observed with the independent variables only apply to the current data used, reducing the generalizability outside of the current sample (that is, the model would not perform well on new data, as the model was not trained to handle small changes in relationships between variables). Overfitting can produce misleading regression coefficients; a penalized maximum likelihood estimator can reduce or eliminate this risk of overfitting.

4 Review of OSP schools’ websites in the spring of 2021 imposes no respondent burden because the data collection will be conducted directly by the study team.

5 High school graduation data will be extracted from OSSE and OSP private schools in the fall of 2024 and are not covered in this data request.

