
Evaluation of the Pell Grant Experiments Under the Experimental Sites Initiative

OMB Control Number: 1850-0892

OMB Renewal Supporting Statement:

Part B

May 20, 2015



























TABLES

B.1 Sample Sizes and Precision, by Experiment

B.2 Sample Size and Minimum Detectable Impacts, by Experiment





FIGURES

Figure B.1. Stylized Model of the Recruitment, Enrollment, and Random Assignment Process for PGE When There Is Need-Blind Admissions

Figure B.2. Time Line for the Pell Grant Experiments Study



PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


The initial Information Collection Request (ICR) for this evaluation, including the evaluation design and data collection instruments, was approved through a Notice of Action (NOA) on August 16, 2012 for the period up to August 31, 2015. Since that time, aspects of the evaluation have changed that now require an extension of the data collection period through August 2018. Specifically, two issues necessitate this extension:

  • Smaller than expected sample size. The initial ICR indicated that 51 separate schools and 10,800 students would participate in the two Pell experiments during financial aid award years 2012-2013 and 2013-2014, based on estimates provided by FSA. While projections of the number of participating schools proved accurate, schools have been slower than initially projected to identify and enroll eligible students. To achieve sample sizes sufficient to estimate impacts of the Pell Grant experiments, it is necessary to extend the enrollment period for the experiments, as well as the data collection period for school records, for an additional two award years (2014-2015 and 2015-2016) beyond the three years initially requested. Data collection originally scheduled for fall 2013, 2014, and 2015 will instead occur in summer 2015, 2017, and 2018. A smaller study sample will result in a burden reduction for participating schools.

  • Lower than expected survey response rates. The initial ICR included a survey of 2,500 students in the sample: very low-income adults who are unemployed or underemployed, most enrolled in very short-term training programs (less than 15 weeks). Despite repeated follow-up efforts by phone, email, and regular mail, the current response rate (30 percent) is too low to yield a sample that can be used to estimate impacts on measures from the survey. Given the high costs associated with fielding the survey and the smaller than expected sample sizes, continuing to field the survey would not be cost-effective. Discontinuing the survey allows the remaining resources to be reallocated to a longer enrollment period and results in an overall burden reduction. Further mention of the survey has been removed from this submission.

The changes in this package reflect date changes resulting from extending the data collection period and reductions in burden from having a smaller than expected sample size and discontinuing the student survey.

The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) requests approval to conduct an evaluation of the effects of two Pell Grant Experiments (PGE) demonstrations under the Experimental Sites Initiative (ESI). The ESI, authorized by section 487A(b) of the Higher Education Act of 1965 (HEA), allows the Secretary to grant waivers from specific Title IV HEA statutory or regulatory requirements to enable institutions to test alternative methods for administering those federal student aid programs. The two demonstrations are targeted to income-eligible postsecondary students who are interested in vocational training but who could not otherwise receive a Pell Grant because: (1) they currently hold a bachelor’s degree, or (2) they seek to enroll in a vocational program that is shorter than the current minimum duration and clock hours. Because of the potentially high costs (and benefits) of expanding Pell Grant eligibility in these two ways, ED has decided to rigorously assess the demonstration programs using a random assignment design. The study will examine the impacts of each experiment on employment and earnings, participation in education and training and job support activities, and student debt and financial aid receipt.

OVERVIEW OF THE DEMONSTRATIONS AND STUDY APPROACH

Under the ESI, Title IV institutions choose to participate in demonstrations or “experiments” in response to a notice from ED’s Office of Federal Student Aid (FSA). FSA published such a notice in October 2011, inviting postsecondary schools to participate in any of eight different experiments,1 two of which expanded Pell Grant eligibility for students seeking job training. That notice also specified the institutions’ obligations to provide data and to ensure that a control or comparison group could be formed so that the effects of participating in the experiments could be evaluated. In subsequent webinars, FSA has provided additional detail to interested institutions about the demonstrations and the evaluation.

1. The Two Pell Grant Experiments (PGE)

Under the current ESI, postsecondary schools will receive waivers to enable them to provide Pell Grants to students who would not otherwise qualify under current Pell Grant rules. The PGE evaluation will include two substudies, each of which relaxes one eligibility criterion for receipt of a Pell Grant:

  1. Experiment 1. Students who already hold a bachelor’s degree and who document that they are unemployed or underemployed will be able to receive Pell Grant award support. This support can be for up to a one-year program of vocational education intended to help them obtain employment, to be used over no more than two award years. Current rules do not allow individuals with a bachelor’s degree to receive Pell support unless it is to be used for teacher certification or licensure.

  2. Experiment 2. Students will be able to receive a prorated amount of Pell Grant financial support for short-term vocational training that lasts at least 150 clock hours over a period of at least 8 weeks. Current rules require that a student’s academic program be at least 600 clock hours (or the equivalent in semester, trimester, or quarter hours) over at least 15 weeks to qualify for Pell support.

2. Selecting Schools

Schools that volunteered to implement Experiments 1 and 2, that were in good standing in administering Title IV programs (e.g., with respect to compliance, default rates, etc.), and that agreed to meet the requirements of the evaluation form the study school sample. ED expects the sample to include a maximum of 28 schools for Experiment 1 and 40 for Experiment 2, with approximately 17 intending to participate in both experiments. Although there will be 51 distinct participating schools, because each experiment will be studied separately there will be a total of 68 school-level experiments underway. Each school will identify the set of vocational or job training programs to which the experiments will apply.

3. Identifying Eligible Students

Recruitment, enrollment, and random assignment of sample members into the PGE study will be the same for both substudies and will involve several steps (Figure B.1). Participating schools will recruit applicants and encourage them to submit both the Free Application for Federal Student Aid (FAFSA) (typically completed online) and an application to the PGE-eligible program in which the student wants to enroll. Simultaneously or sequentially, FSA will process the FAFSA and the school will determine whether the student can be admitted to the vocational program. Students will receive a Student Aid Report (SAR) and schools an Institutional Student Information Record (ISIR), which provides an assessment of the applicant’s expected family contribution (EFC) toward his or her educational expenses.

Because the potential participants in the study would not ordinarily be eligible for Pell grants, by virtue of their educational characteristics or their program, the PGE schools will need to determine a way to identify candidates for the experiments rather than processing their aid packages in the usual manner. Most likely, the institutions will ensure that financial aid office staff flag students who apply to the PGE eligible programs and review their ISIRs separately.

Figure B.1. Stylized Model of the Recruitment, Enrollment, and Random Assignment Process for PGE When There Is Need-Blind Admissions



4. Random Assignment

Once candidates for the experiments are identified by the institutions, school staff will send these eligible individuals information about the study, provided by the evaluation contractor, that also requests students’ consent to participate. School staff will enter the names and Social Security numbers of eligible admitted applicants who have given consent, along with a very limited amount of other information about the individual and the PGE program, into a web-accessible, study-specific random assignment system so that random assignment can be conducted.2 In real time (with little delay), the school will then be notified of the research group status of each study participant. Approximately 60 percent of participants will be assigned to the treatment group, and the remaining 40 percent will be assigned to the control group.
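For illustration, the sketch below shows the basic logic of a 60/40 random assignment carried out within each PGE program (consistent with footnote 2). It is a simplified Python example only, not the study's actual web-based system, and the record fields shown are hypothetical.

import math
import random

# Illustrative sketch only (not the study's actual web-based random assignment
# system): within each PGE program, assign roughly 60 percent of consented,
# eligible applicants to the treatment group (Pell Grant offer) and 40 percent
# to the control group. The record fields below are hypothetical.

def assign_within_programs(participants, treatment_share=0.60, seed=12345):
    """Return a dict mapping participant ID to 'treatment' or 'control'."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    assignments = {}
    by_program = {}
    for person in participants:
        by_program.setdefault(person["program_id"], []).append(person)
    for program_id, group in by_program.items():
        rng.shuffle(group)  # randomize the order within the program
        n_treatment = math.ceil(treatment_share * len(group))
        for i, person in enumerate(group):
            assignments[person["id"]] = "treatment" if i < n_treatment else "control"
    return assignments

# Example usage with hypothetical records.
sample = [
    {"id": "A001", "program_id": "welding-certificate"},
    {"id": "A002", "program_id": "welding-certificate"},
    {"id": "A003", "program_id": "medical-coding"},
]
print(assign_within_programs(sample))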

Control group members will have access to the normal financial support that they are eligible for (i.e., excluding a Pell Grant). Study participants assigned to the treatment group will be offered a Pell grant, and the school will take this into account in determining any other aid for which the student is eligible. The financial aid packages will then be provided to the study participants. Regardless of whether the participant is assigned to the treatment or control group, he or she can choose to enroll at the PGE school, enroll at another school to which he or she has been admitted, or pursue some other type of activity.3

It is estimated that schools in Experiment 1 will enroll 25 participants, on average, while schools in Experiment 2 will enroll 100 participants into the study, for a total of 700 sample members in Experiment 1 and 4,000 in Experiment 2. Thus, total sample enrollment for the study will be 4,700. The study participants will consist of individuals who have been determined to be eligible for the study under either experiment and who have consented to be in the study.

5. Collecting Data

Both substudies of PGE will have the same data collection plans. These plans include new burden imposed by collecting PGE school data for all study participants. The plans also include the use of two other types of data that do not impose data collection burden on participating schools or students: FSA data and annual earnings data maintained by the Social Security Administration (SSA).4 These data are described in detail in Section A.2. Together, these data will provide a rich set of information from which to estimate the impacts of expanded Pell Grant eligibility on study participants’ educational experiences and student debt, to describe the characteristics of participants and their vocational programs, and to conduct exploratory analysis of impacts on participants’ employment and earnings outcomes.

6. Reporting

The schedules for sample enrollment and program participation, as well as when post-program outcomes can be observed, drive the project’s reporting schedule. The study is expected to last 6.5 years, from October 2012 to March 2019 (Figure B.2). Enrollment of school applicants into the study began in November 2012. Although each of the 68 experiments in the study might take a slightly different amount of time to complete its enrollment of study participants, enrollment for the study is expected to continue through June 2016.

Most of the study participants who enroll in Experiment 2 are expected to complete their participation in education or training in a fairly short time (two to four months), while participants who enroll in Experiment 1 are expected to take 9 to 14 months, or up to two years if attending less than full time. It is expected that all sample members who participate in a PGE program will complete their training program by summer 2018. The first full post-program calendar year for all study participants will be 2019, although many of the participants who entered the study early in the sample enrollment period are expected to have had a full year of post-program experiences before then. SSA data covering calendar year 2017 are expected to be available for analysis in preliminary form in summer 2018,5 making it possible to draft a report and have it go through IES’ statutorily required review process for publication in late spring 2019.

B1. Respondent Universe and Samples

All three data collection efforts (the FSA data, the PGE school data, and the SSA data) will provide administrative data for all study participants. As noted earlier, the FSA data and the SSA data will not generate new burden as a result of the study. The discussion in this section describes the respondent universe for the administrative data, grouping these three sources of data together because of their similarities.

The study is designed to collect data on individuals who are ineligible for the Pell Grant program because they either (Experiment 1) applied to vocational or career training programs but already have a bachelor’s degree or (Experiment 2) applied to a short-term training program. In spring 2012, the Office of Federal Student Aid (FSA) recruited schools to volunteer programs for the study. As described earlier, to date 27 schools have participated in Experiment 1 and 27 schools have participated in Experiment 2. On average, each school in Experiment 1 is expected to enroll 25 participants, and each school in Experiment 2 is expected to enroll 100 participants. The potential respondent universe consists of these 4,700 study participants, with 700 in Experiment 1 and 4,000 in Experiment 2. The data collection effort is designed to be representative of the two groups of individuals at the programs in the PGE study. It does not generalize to any other population of individuals or programs because of the processes used to select schools (open invitation plus screening for Title IV administrative compliance by FSA), programs (the criteria listed in the invitation notice plus schools’ preferences), and students (recruiting approaches used by schools).

All three types of administrative data are expected to be comprehensive in their coverage. Data on eligible candidates entered by PGE school staff for the purpose of random assignment will define the universe of study participants. The evaluation contractor will request data extracts from PGE school records for this sample of potential and actual enrollees; study participants without an enrollment record are assumed not to have enrolled in a program at a PGE school. It is assumed that PGE programs already track student enrollment because they must verify it before students can receive financial aid.6 In addition, PGE programs are already required to report six-year graduation rates to the Integrated Postsecondary Education Data System (IPEDS). As a result, it is likely that PGE programs already have databases that track graduation outcomes over time.

The study expects a 100 percent response rate for the PGE program data collection effort. The federal notice inviting schools to participate in the experiments and subsequent communication from FSA require that all PGE schools provide relevant administrative data as a condition for participation. As a result, the study will include only individuals with administrative data from PGE programs.

The FSA and SSA data also are expected to be available for 100 percent of study participants. The reason for the expected 100 percent response rate for the FSA data is analogous to that for the PGE program data: in both cases, only individuals with the data are eligible to participate in the study. The study will assume that individuals without an SSA earnings record have zero earnings and no employment. This approach is consistent with that of other studies that use data from the SSA Master Earnings File (Schochet et al. 2003).
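To make the matching convention concrete, the following sketch (in Python, using the pandas library) shows one way a left join between the study roster and an SSA earnings extract could be implemented, with non-matches assigned zero earnings and no employment. The file and column names are hypothetical; this illustrates the stated assumption rather than the study's actual processing code.

import pandas as pd

# Sketch of the matching convention described above: left-join the study
# roster to the SSA earnings extract and treat participants without a
# matching earnings record as having zero earnings and no employment.
# File and column names are hypothetical.

roster = pd.read_csv("study_roster.csv")            # one row per study participant
earnings = pd.read_csv("ssa_earnings_extract.csv")  # rows only for matched participants

analysis = roster.merge(earnings, on="participant_id", how="left")
analysis["annual_earnings"] = analysis["annual_earnings"].fillna(0.0)
analysis["employed"] = (analysis["annual_earnings"] > 0).astype(int)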

Figure B.2. Time Line for the Pell Grant Experiments Study


[Figure B.2 is a quarterly timeline chart covering October 2012 through June 2019. Its rows show the schedule for: (1) the PGE school planning phase; (2) study participant enrollment; (3) the expected program completion period; (4) FSA data extraction; (5) data extracts from PGE schools; (6) analysis of PGE school data; (7) analysis of SSA data; and (8) the final report. The legend distinguishes draft deliverables, final deliverables, and data extracts.]

B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed

The administrative data from FSA, PGE schools, and SSA will contain information for all study participants.7 The analytical sample is a purposively selected sample that is designed to generalize to the students who participated in the Pell Grant experiments. Because the analytical sample will have data on 100 percent of the study participants, the study will not need to use sampling weights to correctly represent the population; implicitly, the sampling weight for each respondent will be one.

To demonstrate the precision associated with this sampling approach, Table B.1 provides the half-widths of the 95 percent confidence intervals for two potential proportions of the outcome variable. For Experiment 1, the half-width of the confidence interval is 0.037 when half of the population has a particular outcome. When the proportion is only 0.10, the half-width falls to 0.022. In addition, the half-width of the confidence interval for annual earnings is $821. The confidence intervals for Experiment 2 are smaller than those of Experiment 1 because it has a larger sample size. These figures indicate that the study will produce relatively precise estimates of the outcome variables for both experiments.

The study will also produce descriptive statistics by treatment and control group within each experiment. Even with these smaller sample sizes, the study will produce relatively precise estimates of the outcome variables. For example, the half-width of the confidence interval for a proportion of 0.50 for the control group in Experiment 1 is 0.059. The corresponding half-width for earnings is $1,298. These two half-widths represent the least precision available to the study using the PGE program data. Even so, the estimates from this sample will provide useful insights about the population of study participants.

Table B.1. Sample Sizes and Precision, by Experiment


                      Sample     Half-width of 95% CI,   Half-width of 95% CI,   Half-width of 95% CI,
                      size       proportion of 0.50      proportion of 0.10      earnings (dollars)

Experiment 1            700           0.037                   0.022                   821.0
  Treatment group       420           0.048                   0.029                 1,059.9
  Control group         280           0.059                   0.035                 1,298.1
Experiment 2          4,000           0.015                   0.009                   343.5
  Treatment group     2,400           0.020                   0.012                   443.4
  Control group       1,600           0.024                   0.015                   543.0

Notes: The confidence intervals are based on a 95 percent probability level.
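The half-widths in Table B.1 can be approximately reproduced with the standard formulas for a 95 percent confidence interval, 1.96 * sqrt(p(1 - p)/n) for a proportion and 1.96 * SD/sqrt(n) for a mean. The short Python sketch below illustrates this; it assumes a standard deviation of annual earnings of about $11,083, the value shown in Table B.2.

import math

Z_95 = 1.96  # critical value for a 95 percent confidence interval

def halfwidth_proportion(p, n):
    """Half-width of a 95% confidence interval for a proportion p with sample size n."""
    return Z_95 * math.sqrt(p * (1.0 - p) / n)

def halfwidth_mean(sd, n):
    """Half-width of a 95% confidence interval for a mean with standard deviation sd."""
    return Z_95 * sd / math.sqrt(n)

EARNINGS_SD = 11_082.8  # assumed SD of annual earnings, taken from Table B.2

for label, n in [("Experiment 1", 700), ("  Treatment group", 420),
                 ("  Control group", 280), ("Experiment 2", 4_000),
                 ("  Treatment group", 2_400), ("  Control group", 1_600)]:
    print(f"{label:20s} n={n:5d}  "
          f"p=0.50: {halfwidth_proportion(0.50, n):.3f}  "
          f"p=0.10: {halfwidth_proportion(0.10, n):.3f}  "
          f"earnings: ${halfwidth_mean(EARNINGS_SD, n):,.0f}")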



The estimation procedures used for this analytical sample are designed to measure the impacts of the offer of Pell Grants. Because the average Pell Grant amount, program content/duration, and student characteristics will differ by experiment, the study will analyze the impacts separately for each experiment. The study will estimate ordinary least squares regression models of the form shown in Equation (1). The dependent variable is y_ip, the outcome of interest for study participant i in program p. The main outcome variables will be employment and earnings, but the study will also include enrollment, graduation, and other measures as secondary outcome variables. The variable g_i indicates whether the study participant was randomly assigned to the treatment or control group. In this specification, the parameter γ is the effect of access to a Pell Grant on the outcome y; that is, γ is the average treatment effect of Pell Grant access for this population.

y_ip = γg_i + βX_ip + μ_p + ε_ip        (1)

The regression model will control for a variety of baseline characteristics in X_ip, such as the participant’s age, educational background, and earnings before random assignment. Including these control variables will improve the precision of the estimated effects of Pell Grant access. The remaining terms, μ_p and ε_ip, represent program fixed effects and a stochastic error term, respectively.
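As an illustration of how Equation (1) could be estimated, the sketch below fits an ordinary least squares model with a treatment indicator, a few baseline covariates, and program fixed effects using the Python statsmodels package. The variable names and covariate list are hypothetical, and the heteroskedasticity-robust standard errors shown are one reasonable option rather than the study's specified approach.

import pandas as pd
import statsmodels.formula.api as smf

# Minimal sketch of Equation (1): regress an outcome on a 0/1 treatment
# indicator (the Pell Grant offer), baseline covariates, and program fixed
# effects. Column names are hypothetical; the study's actual covariate list
# and standard-error treatment may differ.

def estimate_impact(df: pd.DataFrame, outcome: str):
    formula = (f"{outcome} ~ treatment + age + prior_earnings + "
               "has_dependents + C(program_id)")
    # Heteroskedasticity-robust standard errors are one reasonable choice.
    model = smf.ols(formula, data=df).fit(cov_type="HC2")
    gamma = model.params["treatment"]        # estimated impact of the offer
    ci_low, ci_high = model.conf_int().loc["treatment"]
    return gamma, (ci_low, ci_high)

# Example usage with a hypothetical analysis file:
# df = pd.read_csv("pge_experiment1_analysis_file.csv")
# impact, ci = estimate_impact(df, outcome="annual_earnings")
# print(f"Estimated impact of the Pell Grant offer: {impact:,.0f} (95% CI {ci})")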

To determine whether the study can detect the impact of Pell Grants, Table B.2 presents the minimum detectable impacts (MDIs) of the estimation procedure, defined as the minimum detectable effect sizes multiplied by the standard deviations of the outcomes. The power calculations are based on a two-tailed test with an alpha of 0.05 and statistical power of 0.80. The means and standard deviations for the four key outcomes are based on studies of unemployed or underemployed adults seeking job training at postsecondary institutions and studies of postsecondary institutions more generally (Baum et al. 2011; Brock and Richburg-Hayes 2006; Maguire et al. 2010; McConnell et al. 2006; U.S. Department of Education 2009; U.S. Department of Education 2011).

Table B.2. Sample Size and Minimum Detectable Impacts, by Experiment


                                              Sample       Mean    Standard    MDI with    MDI with
                                                size               deviation   R2 = 0.2    R2 = 0.4

Experiment 1
  Enrollment at Study School (%)                 700       70.0        45.8        8.9         7.7
  One Year Completion at Study School (%)        700       12.2        32.7        6.3         5.5
  Annual Earnings ($)                            700   11,082.8    11,082.8    2,145.6     1,858.1
  Annual Employment (%)                          700       34.0        47.4        9.2         7.9

Experiment 2
  Enrollment at Study School (%)               4,000       70.0        45.8        3.7         3.2
  One Year Completion at Study School (%)      4,000       22.8        42.0        3.3         2.9
  Annual Earnings ($)                          4,000   11,082.8    11,082.8      896.5       776.4
  Annual Employment (%)                        4,000       34.0        47.4        3.8         3.3

Note: The power calculations are based on a two-tailed test with an alpha of 0.05 and statistical power of 0.80. The MDIs are for differences between the treatment and control groups, where the treatment group is 60 percent of the sample and the control group is 40 percent of the sample. The results assume a 100 percent response rate for the administrative data.

MDI = minimum detectable impact.



Under standard assumptions, the power calculations show that the estimation procedure can detect meaningful differences between the treatment and control groups.8 For example, with an R2 equal to 0.2, the procedure is powered to detect a difference of 8.9 percentage points in the probability of enrollment and a difference of 6.3 percentage points in the probability of school completion at a study school in Experiment 1. Both experiments are powered to detect even smaller differences when the regression model explains a larger portion of the variance in the outcome (R2 = 0.4). In this setting, the procedure is powered to detect a difference of 7.7 percentage points in the probability of enrollment and a difference of 5.5 percentage points in the probability of completion in Experiment 1. Thus, the estimation procedures are likely to detect the true effects of access to Pell Grants on the outcomes if the true effects exceed these MDIs.
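The MDIs in Table B.2 can be approximated with the standard minimum detectable effect formula, (z_{1-alpha/2} + z_{power}) * SD * sqrt((1 - R^2) / (n * P * (1 - P))), where P = 0.60 is the treatment share. The Python sketch below illustrates the calculation for the Experiment 1 enrollment outcome; small differences from the table would reflect rounding or adjustments in the study's own calculations.

import math
from statistics import NormalDist

# Approximate reproduction of the MDIs in Table B.2 using the standard
# minimum-detectable-effect formula for a two-tailed test with alpha = 0.05,
# power = 0.80, a 60/40 treatment-control split, and covariates explaining a
# share R^2 of the outcome variance. Small differences from the table are
# expected from rounding and any adjustments in the study's own calculations.

ALPHA, POWER, TREAT_SHARE = 0.05, 0.80, 0.60
FACTOR = NormalDist().inv_cdf(1 - ALPHA / 2) + NormalDist().inv_cdf(POWER)  # about 2.80

def mdi(sd, n, r_squared, p=TREAT_SHARE):
    """Minimum detectable impact, expressed in the outcome's own units."""
    return FACTOR * sd * math.sqrt((1 - r_squared) / (n * p * (1 - p)))

# Experiment 1 enrollment outcome: SD = 45.8 percentage points, n = 700.
print(round(mdi(45.8, 700, 0.2), 1))  # about 8.9 (Table B.2 reports 8.9)
print(round(mdi(45.8, 700, 0.4), 1))  # about 7.7 (Table B.2 reports 7.7)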



B3. Maximize Response Rates

As explained in Section B.1, it is expected that the study team will be able to obtain FSA, PGE school, and SSA data for all study participants. The collection and analysis of these data will be based on the assumption that there is a 100 percent match rate between the list of study participants and the administrative records files. If a study participant is not in the SSA data files, for example, it will be assumed that he or she did not have Social-Security-covered earnings during the relevant time period. Therefore, it is expected that data will be collected on all study participants and no special procedures will be necessary to maximize response rates.

B4. Tests of Procedures or Methods

Because this ICR no longer includes a survey, no tests of procedures or methods are necessary.

B5. Individuals Consulted on Statistical Aspects of the Design

The statistical aspects of the design were chosen to provide rigorous answers to the research questions that will be of use to ED. To date, ED has consulted with its contractor, Social Policy Research Associates (SPR), and its subcontractor, Mathematica Policy Research, as well as with a Technical Working Group for the study. Specific individuals are identified below.

Name                      Affiliation                Telephone Number/Email

Dr. Andrew Wiegand        SPR                        (510) 788-2455
Dr. Ronald D’Amico        SPR                        (510) 788-2484
Dr. Karen Needels         Mathematica                (541) 753-0201
Dr. Albert Liu            Mathematica                (510) 830-3706
Judith Scott-Clayton      Columbia University        [email protected]
Sara Goldrick-Rab         University of Wisconsin    [email protected]
Kevin Hollenbeck          Upjohn Institute           [email protected]
Dave Marcotte             University of Maryland     [email protected]
Debra Bragg               University of Illinois     (217) 244-8974

REFERENCES

Baum, Sandy, Kathie Little, and Kathleen Payea. “Trends in Community College Education: Enrollment, Prices, Student Aid, and Debt Levels.” New York: The College Board, 2011. Available at http://advocacy.collegeboard.org/sites/default/files/11b_3741_CC_Trends_Brief_WEB_110620.pdf.

Brock, Thomas, and Lashawn Richburg-Hayes. “Paying for Persistence: Early Results of a Louisiana Scholarship Program for Low-Income Parents Attending Community College.” New York: MDRC, 2006. Available at http://www.mdrc.org/sites/default/files/full_472.pdf.

Maguire, Sheila, Joshua Freely, Carol Clymer, Maureen Conway, and Deena Schwartz. “Tuning in to Local Labor Markets: Findings from the Sectoral Employment Impact Study.” New York: Public/Private Ventures, 2010.

McConnell, Sheena, Elizabeth Stuart, Kenneth Fortson, Paul Decker, Irma Perez-Johnson, Barbara Harris, and Jeffrey Salzman. “Managing Customers’ Training Choices: Findings from the Individual Training Account Experiment.” Washington, DC: Mathematica Policy Research, 2006. Available at http://wdr.doleta.gov/research/FullText_Documents/managing_customers_choices.pdf.

Schochet, Peter Z., Sheena McConnell, and John Burghardt. “National Job Corps Study: Findings Using Administrative Earnings Records Data.” Final report prepared for the U.S. Department of Labor. Princeton, NJ: Mathematica Policy Research, October 2003.

U.S. Department of Education, Office of Postsecondary Education. “The Federal Pell Grant Program End-of-Year Report, 2010–2011.” Washington, DC: Federal Student Aid Data Center, n.d. Available at http://trends.collegeboard.org/student-aid/figures-tables/fed-aid-maximum-and-average-pell-grant-over-time.

U.S. Department of Education, National Center for Education Statistics. “2001-02 to 2008-09 Integrated Postsecondary Education Data System, Fall 2001, and Spring 2002 through Spring 2009.” Washington, DC: U.S. Department of Education, 2010. Available at http://nces.ed.gov/programs/digest/d10/tables/dt10_341.asp.





2 Randomly assigning within programs will promote treatment-control group balance on this important dimension. This might allow the evaluation to calculate impacts separately by occupational area.

3 The particular methods that schools use to recruit potential sample members and any screening that is conducted to assess applicants’ interest levels in the PGE program before random assignment is conducted will have an influence on the rate at which study participants enroll in the PGE program.

4 There is also some possibility of obtaining quarterly wage data from the National Directory of New Hires (NDNH) maintained by the U.S. Department of Health and Human Services. There is pending legislation to expand access to the database for federal research purposes. If this access becomes available during the evaluation period, we would consider substituting NDNH data for the SSA annual earnings data.

5 A full year of post-program SSA data on employment and earnings will only be available for a partial sample of participants (i.e., those who completed their training as of the end of 2016).

6 The characteristics of study participants can be collected by the school that houses the PGE program.

7 If a match for a sample member is not found, it will be assumed that he or she did not participate in the activity covered by the data.

8 The table presents hypothetical means and standard deviations that could be expected based on other research. The particular methods that schools use to recruit potential sample members and any screening that is conducted to assess applicants’ interest levels in the PGE program before random assignment is conducted will have an influence on the rates at which treatment and control group members participate in an educational program and achieve other outcomes of interest to the study. A different rate of enrollment than is assumed in the table, for example, would lead to a different MDI.


