NCER-NPSAS Grant Study - Connecting Students with Financial Aid (CSFA) 2017: Testing the Effectiveness of FAFSA Interventions on College Outcomes

OMB: 1850-0931












NCER-NPSAS Grant Study

Connecting Students with Financial Aid (CSFA) 2017: Testing the Effectiveness of FAFSA Interventions on College Outcomes




Supporting Statement Part B
OMB # 1850-New v.1








Submitted by

National Center for Education Statistics

U.S. Department of Education










October 2016

revised December 2016








Contents



Tables




  1. Collection of Information Employing Statistical Methods

This submission requests clearance for the data collection methods, materials, and survey instrument planned for the NCER-NPSAS grant study entitled, Could Connecting Students with Financial Aid Lead to Better College Outcomes? A Proposal to Test the Effectiveness of FAFSA Interventions Using the NPSAS Sample (referred to as “Connecting Students with Financial Aid (CSFA) 2017”). An overview of the design is provided below; for more information see Part A.2. Intervention materials, survey contacting materials, and the survey are provided in appendices A, B, and C, respectively.

    1. Respondent Universe

The respondent universe for the CSFA 2017 study is that of the 2015-16 National Postsecondary Student Aid Study (NPSAS:16). Details about the respondent universe for NPSAS:16, approved by OMB (#1850-0666 v.17-19), are provided below.

      1. Institution Universe

To be eligible for NPSAS:16, an institution was required, during the 2015-16 academic year, to:

  • Offer an educational program designed for persons who had completed secondary education;

  • Offer at least one academic, occupational, or vocational program of study lasting at least 3 months or 300 clock hours;

  • Offer courses that are open to more than the employees or members of the company or group (e.g., union) that administered the institution;

  • Be located in the 50 states, the District of Columbia, or Puerto Rico;1

  • Be other than a U.S. Service Academy; and

  • Have a signed Title IV participation agreement with the U.S. Department of Education.

Institutions providing only avocational, recreational, or remedial courses or only in-house courses for their own employees were excluded. The seven U.S. Service Academies were excluded because of their unique funding/tuition base.

      2. Student Universe

The students eligible for inclusion in the NPSAS:16 sample were those enrolled in a NPSAS-eligible institution in any term or course of instruction between July 1, 2015 and April 30, 2016, who were:

  • Enrolled in (a) an academic program; (b) at least one course for credit that could be applied toward fulfilling the requirements for an academic degree; (c) exclusively non-credit remedial coursework but who the institution has determined are eligible for Title IV aid; or (d) an occupational or vocational program that required at least 3 months or 300 clock hours of instruction to receive a degree, certificate, or other formal award;

  • Not currently enrolled in high school; and

  • Not enrolled solely in a GED or other high school completion program.

    2. Statistical Methodology

The student sample for the CSFA 2017 study will be selected from among the NPSAS:16 student sample members who (1) completed the NPSAS:16 survey, (2) were not selected for participation in the 2016/17 Baccalaureate and Beyond Longitudinal Study (B&B:16/17; OMB# 1850-0926), (3) were in the first three years of their undergraduate education, and (4) agreed to participate in follow-up research in response to the following NPSAS:16 survey question:

We are almost done with this survey. But first, we want to let you know that some students may be invited to participate in follow-up studies to learn more about their education and employment experiences after completing this survey. These follow-up studies will be led by external researchers not affiliated with the Department of Education. We would like to seek your permission to allow RTI to re-contact you on behalf of one of the external researchers. Your participation in future studies is completely voluntary, but there is no substitute for your response. Are you willing to be contacted about these future studies? Yes/No

This CSFA 2017 study will implement a set of interventions designed to maximize FAFSA submission and renewal and to encourage better informed course enrollment through more generous financial aid. RTI will send students letters and emails with information about the importance of the FAFSA, guidance on how to complete the FAFSA, and suggestions for ways to get help completing the form (e.g., websites vetted for accuracy and usefulness, and non-profit organizations that give free guidance). The intervention mailings will occur from January to May 2017, with up to four mailings over that period.

Table 1 provides sample sizes for the CSFA 2017 study. At the time the sample is randomly assigned to groups, RTI will check students’ 2016-17 and 2017-18 FAFSA filing statuses to determine what message will be conveyed in the materials sent to students as the intervention.

Group assignment for students who have not yet filed a FAFSA for the 2017-18 school year will be based on their 2016-17 filing status. This group will be used to investigate whether the framing of the disseminated information matters. “Positive framing,” for example, will remind students that they could receive up to a $5,920 Pell Grant by completing the FAFSA. For the “negative framing” group, the message will indicate that financial aid will be lost if the FAFSA is not completed. All letters will include information on how the number of credits taken relates to financial aid awards, to help the research team understand the effects of receiving information about how more intensive attendance could increase the aid received and thus reduce overall time-to-degree and possibly loan burden. The study also includes a control group.

Table 1. Distribution of the Connecting Students with Financial Aid (CSFA) 2017 sample, by treatment group

Treatment group       Did not submit 2017-18 FAFSA    Submitted 2017-18    Total
                      as of Jan 13, 2017              FAFSA
Control                          3,000                     1,000           4,000
Neutral framing                  3,000                     1,100           4,100
Positive framing                 3,000                       ---           3,000
Negative framing                 3,000                       ---           3,000
Total                           12,000                     2,100          14,100

Students who already filed a FAFSA for the 2017-18 school year will be randomly assigned into one of the treatment groups or into the control group. All treatment groups will receive mailings focused on the next steps after submitting the FAFSA and information on how credit levels relate to financial aid awards. The control group members will receive no mailings. FAFSA filing for 2016-17 is not considered for students who have already filed for 2017-18. Until randomization in January 2017, the relative proportion of students who will have already completed their FAFSAs for the 2017-18 school year will not be known. Thus the numbers in Table 1 are approximate.
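The randomization step described above can be sketched as follows. This is a minimal illustration, not study code: the function and variable names are hypothetical, and equal within-stratum assignment probabilities are used for simplicity (Table 1's targets for filers are slightly unequal).

```python
import random

def assign_groups(students, seed=2017):
    """Sketch of the randomization step: each sampled student is assigned
    to a condition based on his or her 2017-18 FAFSA filing status at the
    time of randomization. `students` maps a student ID to True if a
    2017-18 FAFSA was already submitted (all names are illustrative)."""
    rng = random.Random(seed)
    assignments = {}
    for sid, filed in students.items():
        if filed:
            # Filers receive either no mailings (control) or the
            # neutral next-steps mailings.
            assignments[sid] = rng.choice(["control", "neutral"])
        else:
            # Non-filers are spread across the control group and all
            # three framing conditions.
            assignments[sid] = rng.choice(
                ["control", "neutral", "positive", "negative"])
    return assignments
```

Because assignment is at the individual level and conditions on filing status only, filers can never land in the positive- or negative-framing arms, matching the design in Table 1.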

In the fall of 2017, RTI will again contact all students selected to participate in the study to ask that they complete a survey focused on their experiences with financial aid awareness and how they navigated the aid system generally. All sample members, whether assigned to a treatment group or the control group, will be asked to complete the survey.

On an annual basis, RTI will match the grant sample to the federal financial aid data systems, the Central Processing System (CPS), and the National Student Loan Data System (NSLDS) to obtain detailed data on aid receipt and enrollment choices (e.g., intensity, persistence, etc.). The sample will also be matched to the National Student Clearinghouse (NSC) to observe their enrollment patterns and intensity.

    3. Methods for Maximizing Response Rates

Response rates in the NPSAS:16 full-scale study are a function of success in two basic activities: locating the sample members and gaining their cooperation. The initial phase of CSFA 2017 (January to May 2017) requires no response from participants. However, the second phase – survey completion (October to December 2017) – will seek a response. Achieving the desired response rate will require an integrated tracing approach designed to yield the maximum number of locates with the least expense. Because the sample members were already located, participated in NPSAS:16, and agreed to participate in follow-up studies, they should be fairly easy to locate. The tracing plan includes the following elements:

  • Advance Tracing – this stage includes tracing steps taken prior to the start of data collection, including batch database searches. Additional searches will be conducted through interactive databases if necessary.

  • Data Collection Mailings and emails – will be sent to the physical and email addresses provided by students during the NPSAS:16 interview. The National Change of Address database will be used to correct or forward outdated addresses.

  • Other Locating Options – will be used as needed, including a LexisNexis e-mail search if initial attempts to reach sample members are unsuccessful.

During the survey phase, efforts to achieve the desired response rate will also include providing both online and hard copy survey options, a series of reminder prompts via direct postcard mailings and emails, and offering a $30 incentive to encourage response.

Communications with Participants. Table A1 in appendix A provides the text that will be included in the letters and emails to students randomly chosen for the treatment groups. The language gives a sense of the goal of the communication and what types of information will be provided to each treatment group. In December 2016, the study team conducted focus groups to determine whether the language of the intervention letters and emails is understandable to a group of college students from a variety of schools (1850-0803 v.184). In response to the feedback from the focus groups, the wording of the intervention letters and emails has been revised and replaced in this submission.

For students who have not yet submitted a FAFSA for 2017-18, the first part of the letter will attempt to connect with students by recognizing their current FAFSA filing status. For instance, “According to our records, as of {INSERT DATE}, you have not yet applied for financial aid for next year.” For the full-time students, there are several variations in the text based on whether the student has been chosen to receive the information framed in a positive, neutral, or negative way. For example, students receiving the positive framing will receive:

Many students can get a Federal Pell Grant up to $5,920, which does not need to be repaid. That’s thousands of dollars to help you pay for college!

In contrast, students with the negative frame will receive:

Thousands of dollars are at stake. You could lose your chance to get a Pell Grant worth up to $5,920 from the federal government.

The letter with neutral framing will not contain either line.


The letter will then go on to provide basic information about what the FAFSA is and the need to complete the process to obtain financial aid. The letters and emails will be personalized to reflect the student’s previous (2016-17) FAFSA filing status (did not submit or submitted a complete FAFSA). See Appendix A for a sample letter.

Next, the letter will address some of the common myths about FAFSA and who should complete the form and why. The letter will go into some detail about what is needed to complete the FAFSA (e.g., SSN for student and parents, tax return information) and how to complete the FAFSA.

All letters will explain how financial aid awards can vary according to the number of credit hours taken. The goal of this communication is to help students understand the tradeoffs between financial aid and the number of credits taken. The letters will provide information on the amount of aid a student would receive based on the number of credit hours taken. To underscore this point, the letters will read,

“Did you know you could get more financial aid if you take more courses? This could help you finish your studies sooner. As shown below, students who attend full-time could receive twice as much financial aid with a Pell Grant, which is free money that does not need to be repaid.”

Finally, the letter will then suggest ways for students to get additional help and support completing the FAFSA. The letters will include links and a Quick Response (QR) code to the U.S. Department of Education website designed to educate and help students complete the FAFSA (listed on the intervention letters in Appendix A).

For students who have already submitted a FAFSA for 2017-18, the intervention letters differ because these students do not need help completing the FAFSA. Students in the treatment group will either receive general information encouraging persistence or be encouraged to enroll full-time by making clear how credit levels relate to financial aid awards. Control group members will receive no mailings.

Follow-Up Survey. The follow-up survey will be sent in the fall of 2017 to all students selected for participation, not just those who received the intervention mailings. Contacting materials to accompany the follow-up survey are provided in appendix B. The content of the survey is provided in appendix C. It will be prepared as a hard copy survey and programmed using Voxco, off-the-shelf survey design software that can run from inside the ESN.

The survey will provide direct measures of students’ experiences with financial aid. The survey questions will illuminate students’ awareness of financial aid programs and of the process to receive aid, whether they completed the FAFSA, from whom or where they learned about financial aid and got help through the process, how easy or hard they found the process, and whether and how aid influenced their enrollment choices (where and at what intensity).

Cognitive interviews on the survey items will be conducted in March 2017 to examine whether college students correctly understand the question wording and whether their answers get adequately captured in multiple-choice questions. The request to conduct cognitive testing will be submitted to OMB for review under the NCES generic clearance for pretesting (OMB# 1850-0803). With input from cognitive interviews, we expect to finalize the CSFA survey in April 2017. Any changes to the survey will be submitted to OMB for review in 2017 with the goal of implementing the survey beginning in October 2017.

    4. Tests of Procedures and Methods

The CSFA 2017 study will implement a set of interventions intended to maximize FAFSA submission and renewal as well as encourage better informed course-enrollment with more generous financial aid. The first set of research questions is:

  • Among students who did not submit a FAFSA, does providing clear information about financial aid eligibility, simplified instructions for completing the FAFSA, and suggestions about other resources available to them have a positive effect on college outcomes?

  • Among students who submitted a FAFSA the previous year, does providing clear information about the need to resubmit the FAFSA and how to do so have a positive effect on college outcomes?

  • Among students who have started the process of submitting a FAFSA for 2017-18, does providing clear information about the steps needed to complete the process and encouragement to reenroll have a positive effect on college outcomes?

For each case, the grantees will examine outcomes such as FAFSA submission, aid receipt, college persistence semester-to-semester, and degree or certificate completion.

While it is expected that better information will result in better outcomes, there are questions about how to best provide this information in terms of the framing of the messages. Therefore, within the treatment groups that target full-time students, grantees will randomize the nature of the information that is provided to answer the question:

  • Does the framing of the information, whether positive, negative, or neutral, affect whether students respond by submitting the FAFSA and/or persisting in college, and does the magnitude of any response vary by treatment group?

Finally, for the subset of students who attended school part-time during the baseline year or who have already completed the FAFSA for 2017-18, grantees will examine whether information can influence not only the completion of tasks (such as completing the FAFSA), but also decisions about enrollment intensity (i.e., how many credits a student takes each term) by addressing the problem of low awareness of financial aid rules to answer the question:

  • Does making explicit the tradeoffs between the number of credits taken and the financial aid award amount influence enrollment intensity?

Another set of questions relates to whether increasing aid receipt also increases rates of persistence over several years.

  • If the interventions increase aid receipt by encouraging FAFSA completion, is this also a driver of increased college persistence and completion?

The grantee’s research team plans to investigate both the overall effects of the interventions and whether the effects vary by subgroup. Given that CSFA sample members were selected from NPSAS:16, a nationally-representative sample of college students, the sample is expected to have a diverse group in terms of gender, race/ethnicity, age, income, region, type of college, and first-generation college student status. The research team will utilize the diversity of the sample to explore not only if, but for whom, the informational interventions work.

Another major advantage of using the NPSAS sample and data is its heavy integration with existing administrative data. First, the NPSAS data collection includes an in-depth student interview, giving a wealth of information about a student’s background, goals, and experiences. Then, NPSAS links the survey information to administrative data (a description of the confidentiality procedures in place for the administrative record matching is provided in appendix D). From the baseline NPSAS year, grantees will have access to students’ enrollment, credits attempted and completed, course selection, major, and term-to-term persistence. NPSAS also includes a match to NSLDS and institutional aid files to provide detailed data on aid receipt and enrollment choices (e.g., intensity, persistence). In the years after the intervention, RTI will continue to match the sample to CPS, NSLDS, and NSC data to further track the enrollment patterns of students who have left their original institution and transferred elsewhere in the country. NSC data will also provide the part-time status of students, allowing examination of whether information about financial aid rules influences enrollment intensity. Additionally, NSLDS will indicate where students receive their federal financial aid, if any, thereby providing general enrollment information.

Based on all of the data that will be available for the NPSAS sample both before and after the study interventions, grantees will have a rich set of background, college experience, and outcome information. Student background data collected in the original NPSAS study will be used for important covariates and to define subgroups for analysis. Additionally, the key outcome measures for the CSFA study – FAFSA submission, financial aid receipt, enrollment and persistence, part-time status, transfer behavior, and degree or certificate completion – can all be measured using administrative records, resulting in only a few cases being lost to attrition. Although NSC does not cover all institutions, NSC data can be used with other government sources to triangulate the effects of the interventions and to gauge how institutions missing from NSC might affect the estimated magnitude of the effects.

In matching to the administrative data sources, grantees will be able to look at both the immediate and longer-term effects of the interventions, immediately after exposure as well as one and two years after the interventions. It may be the case that once students understand the importance of the FAFSA and how to complete it, they continue to do so in future years.

Administrative data will also be used to provide information on moderators and mediators of the treatment effect. The baseline NPSAS:16 data collection offers a rich set of moderators. In their confirmatory analyses, grantees will examine how race, gender, low-income status, and first-generation status moderate the treatment effect. Additional exploratory analyses can examine how other student characteristics (e.g., prior aid receipt) and institutional characteristics (e.g., four- vs. two-year attendance) moderate the impacts of the interventions.

One of the central mediators is likely the filing of the FAFSA and the subsequent receipt of financial aid, observable through administrative data. Other potential mediators will also be tracked through the NPSAS:16 survey and the grant’s follow-up survey (e.g. students’ engagement in college, students’ understanding of the financial aid process, institutional supportiveness). Described below are plans for identifying the roles of moderators and mediators across data sources.

As described above, for purposes of the CSFA study, students are being randomly assigned to the control or treatment groups, and these randomized controlled trials (RCTs) are being employed to obtain unbiased causal estimates of the impact of the interventions. Because the intervention will be administered on a student-by-student basis, the unit of treatment will be the individual. The dispersed nature of the sample, and the fact that the treatment will occur outside of school, mean there will be little to no contamination among students. Randomization will occur at the individual level without blocking, which, in large samples of students, would provide only minimal improvement in standard errors.

Through the follow-up survey, grantees will attempt to measure institutional supports and outreach that students are receiving. In principle, any such supports should be orthogonal to the treatment status and, hence, should not bias results. However, if institutional supports are found to be redundant, they may reduce the effectiveness of the interventions, and may be important to measure to understand not only the intention-to-treat effects but also the impacts of the treatment-on-the-treated.

The following equation will be used to estimate the intention-to-treat estimates of the treatment:

y_i = α + β_k Treatment_ik + γX_i + ε_i    (1)

where y_i is the outcome for student i; Treatment_ik is an indicator of whether student i was assigned to treatment k rather than the control; X_i is a vector of additional student-level covariates (e.g., race, gender, family income), with corresponding coefficients γ; and ε_i is a student-specific random error term. The treatment effects are indexed by k to signify the multiple treatments being explored. Some of the student-level covariates may include characteristics of the student’s initial campus of attendance. The inclusion of covariates should improve the standard errors. Missing values for covariates will be handled using imputation techniques, with controls accounting for such imputation. Since treatment occurs at the student level, heteroskedasticity will be accounted for in computing the standard errors.
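As a concrete sketch, equation (1) can be estimated by ordinary least squares. The snippet below uses synthetic data with a known treatment effect; all data and names are illustrative and are not drawn from the study.

```python
import numpy as np

def itt_ols(y, treat, X):
    """Estimate equation (1) by OLS: regress the outcome on treatment
    indicator(s) and covariates, with an intercept. Returns coefficients
    ordered [intercept, treatment..., covariates...]."""
    D = np.column_stack([np.ones(len(y)), treat, X])
    beta, _, _, _ = np.linalg.lstsq(D, y, rcond=None)
    return beta

# Illustrative synthetic data with a true ITT effect of 0.05
rng = np.random.default_rng(0)
n = 20000
treat = rng.integers(0, 2, n)              # Treatment_ik indicator
x = rng.normal(size=n)                     # one covariate in X_i
y = 0.5 + 0.05 * treat + 0.1 * x + rng.normal(scale=0.5, size=n)

beta = itt_ols(y, treat[:, None], x[:, None])
# beta[1] is the estimated ITT effect beta_k (close to 0.05 here)
```

In practice, heteroskedasticity-robust standard errors would be computed alongside these point estimates, as the text describes.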

To test for potential moderators, the equation will include an interaction between the treatment indicator and the potential moderator (e.g., gender), while retaining the main treatment effect and the main effect of the characteristic among the covariates. As mentioned above, the central confirmatory moderators to be examined are gender, race, income, and first-generation status. A similar strategy will be used for mediators (e.g., financial aid receipt), where interactions between the treatment and potential mediators are included.

Equivalence of the treatment groups will be tested using a modified version of equation (1). Rather than using an outcome as the dependent variable, the equation will be used to compare individual covariates and their balance across treatment categories. Because the separate covariates may be related, multiple comparison corrections will be applied as needed (Schochet, 2008).
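A covariate balance check of this kind can be sketched as follows. Bonferroni is shown here as one simple multiple-comparison correction (Schochet, 2008, discusses alternatives), and all covariate names are illustrative.

```python
import numpy as np
from math import erf, sqrt

def balance_test(treat_mask, covariates, alpha=0.05):
    """Compare treatment and control means for each baseline covariate
    and flag differences that survive a Bonferroni correction for the
    number of comparisons. `covariates` maps name -> 1-D array."""
    m = len(covariates)
    flagged = []
    for name, x in covariates.items():
        a, b = x[treat_mask], x[~treat_mask]
        se = sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = (a.mean() - b.mean()) / se
        p = 2 * (1 - 0.5 * (1 + erf(abs(t) / sqrt(2))))  # normal approx.
        if p < alpha / m:  # Bonferroni-adjusted threshold
            flagged.append(name)
    return flagged
```

Under successful randomization, few or no covariates should be flagged; a flagged covariate signals imbalance worth investigating before analysis.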

The planned analyses will have excellent power as a result of the large initial NPSAS sample. For the power calculations, the sample is restricted to students in their 1st, 2nd, and 3rd year of undergraduate studies. All power calculations assume a 95 percent confidence level (two-sided α = 0.05), 80 percent power, and equal assignment across treatment and control groups. In addition, it is assumed that 40 percent of the variance in the outcome can be explained by existing covariates. For the first set of comparisons, focused on improving college outcomes by providing information and reminders/support to existing students, the minimum detectable effect sizes (MDES) are 0.08 and 0.07, which correspond to 4.0 and 3.5 percentage-point changes in college completion (assuming completion rates of 50 percent). If the impacts of the various prompts appear statistically equivalent, data can be pooled within full-time and within part-time students. In the pooled analysis, the MDES would be 0.065 and 0.060, respectively, corresponding to 3.3 and 3.0 percentage-point changes in completion (again assuming completion rates of 50 percent).
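The stated assumptions map onto the standard MDES approximation for an individually randomized design, MDES = (z_{α/2} + z_{power}) · sqrt((1 − R²) / (p(1 − p)N)). The sketch below applies that formula with illustrative sample sizes; the exact N behind each MDES figure in the text is not restated here, so the inputs are assumptions.

```python
from math import sqrt

def mdes(n_total, r2=0.40, p_treat=0.5, z_alpha=1.96, z_power=0.84):
    """Minimum detectable effect size for individual random assignment:
    two-sided alpha = 0.05 (z = 1.96), 80 percent power (z = 0.84),
    treatment allocation share p_treat, covariate R-squared r2."""
    return (z_alpha + z_power) * sqrt(
        (1 - r2) / (p_treat * (1 - p_treat) * n_total))

print(round(mdes(3000), 3))  # prints 0.079 for an illustrative N of 3,000
print(round(mdes(6000), 3))  # prints 0.056 for an illustrative N of 6,000
```

As expected, the MDES shrinks as the pooled sample grows, which is why pooling statistically equivalent treatment arms improves detectable effect sizes.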

Perhaps the study most similar to the CSFA 2017 study is Castleman and Page (2016), which used text messaging to encourage FAFSA renewal. The CSFA planned outreach is more aggressive than that used by Castleman and Page, and the 3.5-4.0 percentage-point impacts are smaller than the impacts they report (11 percentage points for students in two-year colleges). Hence, the anticipated MDES is in line with prior studies.

Power begins to diminish in the subsample analysis. In a simple bifurcation of the overall samples (e.g., male vs. female) when treatment conditions are pooled, the MDES are 0.09 and 0.08. These are the upper bounds of detectable effects (although still much lower than the impacts in Castleman and Page, 2016). Without pooling the treatments, the MDES are 0.11 and 0.10. In sum, detecting subsample and treatment variation is possible, but only if the impacts are as large as others have found.

In addition to the above, two separate cost analyses will be conducted. The first focuses exclusively on the intervention. Careful records of the costs of sending the intervention will be maintained; the main costs will be those of preparing and sending information to students, which represent the marginal costs of conducting the intervention. Records will also be kept of any other costs incurred, such as those for identifying and preparing the information needed to send the interventions and for maintaining a website to administer the survey. These costs may differ if the project were replicated at a specific university. The second cost analysis is specific to the IES funding category. The use of NPSAS samples as a base for subsequent research is an experiment, and the costs of the research might help IES make strategic decisions on whether to consider future collaborations. The differences in costs when baseline data collection is already complete and when the research builds on existing data might inform IES’ future competitions.

    5. Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

The CSFA 2017 study requires substantial coordination between the grantee, RTI, and NCES. The grantee is responsible for research design, data analysis, and dissemination of results, and includes the following individuals: Dr. Bridget Terry Long, Principal Investigator, Harvard University, and Dr. Eric Bettinger, Co-Principal Investigator, Stanford University. The following staff members at RTI are responsible for sample selection, respondent contacting and follow-up, data collection and processing, and weighting of the data (if applicable): Ms. Kristin Dudley, Mr. Jeff Franklin, Mr. Peter Siegel, and Dr. Jennifer Wine. Dr. Tracy Hunt-White and Dr. Sean Simone, from NCES, and Dr. James Benson, from NCER, are the statisticians responsible for ensuring that the confidentiality of NPSAS:16 sample members is protected (including ensuring that proper data security protocols are in place and that NCES Statistical Standards are met) and for the general oversight of the NCER-NPSAS grant program.



  2. References

Adelman, Clifford. 2006. The Toolbox Revisited: Paths to Degree Completion from High School Through College. Washington, D.C.: U. S. Department of Education.

Advisory Committee on Student Financial Assistance (2008). “Early and Often: Designing a Comprehensive System of Financial Aid Information.” Abridged Report, Washington, D.C.: U.S. Department of Education. https://www2.ed.gov/about/bdscomm/list/acsfa/earlyoftenreport.pdf

Bettinger, Eric (2012). “Financial Aid: A Blunt Instrument for Increasing Degree Attainment.” In Andrew Kelly and Mark Schneider (Eds.), Getting to Graduation: The Completion Agenda in Higher Education, Baltimore, MD: Johns Hopkins Press.

Bettinger, Eric (2015). “Need-Based Aid and College Persistence: The Effects of the Ohio College Opportunity Grant.” Educational Evaluation and Policy Analysis, 37(1): 102S-119S.

Bettinger, Eric, Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu (2012). “The Role of Application Assistance and Information in College Decisions: Results from the H&R Block FAFSA Experiment.” Quarterly Journal of Economics, 127(3): 1-38.

Bird, K. and Benjamin L. Castleman (2014). “Here Today, Gone Tomorrow? Investigating Rates and Patterns of Financial Aid Renewal Among College Freshmen.” Center for Education Policy and Workforce Competitiveness Working Paper No. 25. Charlottesville, VA: University of Virginia.

Castleman, Benjamin L. and Bridget Terry Long (2016). “Looking Beyond Enrollment: The Causal Effect of Need-Based Grants on College Access, Persistence, and Graduation.” Journal of Labor Economics 34(4), 1023–1073.

Castleman, Benjamin L. and Lindsay C. Page (2016). “Freshman Year Financial Aid Nudges: An Experiment to Increase FAFSA Renewal and College Persistence.” Journal of Human Resources, 51(2): 389-415.

Deming, David and Susan Dynarski. (2009) “Into College, Out of Poverty? Policies to Increase the Postsecondary Attainment of the Poor.” Philip Levine and David Zimmerman, eds. Targeting Investments in Children: Fighting Poverty When Resources Are Limited. Chicago: University of Chicago Press.

Dynarski, Susan (2008). “Building the Stock of College-Educated Labor.” Journal of Human Resources, 43(3): 576-610.

Dynarski, Susan M. and Judith Scott-Clayton (2006). “The Cost Of Complexity In Federal Student Aid: Lessons From Optimal Tax Theory And Behavioral Economics,” National Tax Journal, 59(2): 319-356.

Dynarski, Susan and Judith Scott-Clayton (2007). “College Grants on a Postcard: A Proposal for a Simple and Predictable Federal Student Aid.” The Hamilton Project Discussion Paper 2007-01. Washington, D.C.: The Brookings Institute.

Dynarski, Susan and Judith Scott-Clayton (2013). “Financial Aid Policy: Lessons from Research.” Future of Children, 23(1): 67-92.

Goldrick-Rab, Sara, Douglas Harris, Robert Kelchen, and James Benson (2016). “Reducing Income Inequality in Educational Attainment: Experimental Evidence on the Impact of Financial Aid on College Completion.” American Journal of Sociology, 121(6): 1762-1817.

Horn, Laura J., Xianglei Chen, and Chris Chapman (2003). “Getting Ready to Pay for College: What Students and Their Parents Know About the Cost of College Tuition and What They Are Doing to Find Out.” National Center for Education Statistics Report No. 2003030. Washington, D.C.: National Center for Education Statistics.

Hoxby, Caroline and Sarah Turner (2012). “Expanding College Opportunities for High-Achieving, Low-Income Students.” Stanford Institute for Economic Policy Research Working Paper Number 12-014.

Hoxby, Caroline and Sarah Turner (2015). “What High-Achieving Low-Income Students Know About College.” American Economic Review, 105(5): 514-517.

Kane, Thomas J. and Christopher Avery (2004). “Student Perceptions of College Opportunities: The Boston COACH Program.” In Caroline Hoxby (Ed.), College Decisions: The New Economics of Choosing, Attending and Completing College, Chicago, IL: University of Chicago Press.

Kantrowitz, M. (2009). “Analysis of Why Some Students Do Not Apply for Financial Aid.” Policy Analysis paper, finaid.org. http://www.finaid.org/educators/20090427CharacteristicsOfNonApplicants.pdf

King, Jacqueline E. (2004). “Missed Opportunities: Students Who Do Not Apply for Financial Aid.” Issue Brief, Washington, D.C.: American Council on Education. https://www.soe.vt.edu/highered/files/Perspectives_PolicyNews/10-04/2004FAFSA.pdf

King, Jacqueline E. (2006). “Missed Opportunities Revisited: Students Who Do Not Apply for Financial Aid.” Issue Brief, Washington, D.C.: American Council on Education, Center for Policy Analysis. http://datacenter.spps.org/uploads/Missed_Opportunities_Revisited_2.pdf

Kofoed, M. S. (2013). “To Apply or Not Apply: FAFSA Completion and Financial Aid Gaps.” Working Paper, United States Military Academy. http://www.cuny.edu/about/administration/offices/ira/opr/seminars/currentseries/ToApplyorNottoApply.pdf

Levin, Adam (August 2, 2015). “Not So FAFSA: How to Avoid a Student Aid Scam.” ABC News. Accessed August 2, 2015 at: http://abcnews.go.com/Business/fafsa-avoid-student-aid-scam/story?id=32788041

Long, Bridget Terry (2007). “The Contributions of Economics to the Study of College Access and Success.” Teachers College Record, 109(10): 2367-2443.

Long, Bridget Terry (2010). “Making College Affordable by Improving Aid Policy” (Issues in Science and Technology). Washington, D.C.: National Academy of Sciences, Division of Behavioral and Social Sciences and Education.

Long, Bridget Terry and Melissa Bert (2015). “Encouraging Family Engagement: How the Framing of the Message Matters.” Mimeo, Harvard University. No URL available.

Schochet, Peter Z. (2008). “Technical Methods Report: Guidelines for Multiple Testing in Impact Evaluations.” Washington, D.C.: U.S. Department of Education. NCEE 2008-4018.

Scott-Clayton, J. (2011). “On Money and Motivation: A Quasi-Experimental Analysis of Financial Incentives for College Achievement.” Journal of Human Resources, 46(3): 614-646.

Shireman, Robert, Sandy Baum, and Patricia Steele (2012). “How People Think about College Prices, Quality, and Financial Aid.” Change: The Magazine of Higher Learning, 44(5): 43-48.

Tversky, Amos and Daniel Kahneman (1981). “Framing of Decisions and the Psychology of Choice.” Science, New Series, 211(4481): 453-458.

U. S. Department of Education (2014). 2011–12 National Postsecondary Student Aid Study (NPSAS:12) Data File Documentation (NCES 2014-182). Washington, D.C.

U. S. Department of Education (2015). Digest of Education Statistics: 2013 (NCES 2015-011). Washington, D.C.

1 Institutions in Puerto Rico were not eligible for NPSAS:12.


