Part A: Supporting Statement for Paperwork Reduction Act Submission



Study of Enhanced College Advising in Upward Bound



Prepared for:

Marsha Silverberg

U.S. Department of Education

555 New Jersey Ave, NW

Room 502I

Washington, DC 20208-5500




Submitted by:

Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138





A. Justification

This supporting statement updates the burden request under OMB control number 1850-0912, approved on 8/8/2014, by including burden from Phase II of the data collection, annualizing the burden (updates in Sections A.12 and A.15), and appending the Phase II data collection instruments and related communications (Appendices G through J).

Introduction

The U.S. Department of Education (ED) will fulfill a congressional mandate to assess the effectiveness of a promising practice in its long-standing Upward Bound (UB) program by conducting a research demonstration to enhance college advising in UB.

The study is being sponsored by ED’s Institute of Education Sciences, in collaboration with the Office of Postsecondary Education, and implemented by Abt Associates Inc. and its partners, Decision Information Resources (DIR), Survey Research Management (SRM), and American Institutes for Research (AIR).

Overview of the Upward Bound Program

The UB program is designed to improve college access for students from disadvantaged backgrounds. Established under the Economic Opportunity Act of 1964 as part of the War on Poverty, the UB program is the oldest of the Federal TRIO programs. In fiscal year 2014, approximately $265 million was spent to fund 814 UB projects and serve over 61,000 high school participants.1

Each version of the Higher Education Act, including the most recent 2008 Higher Education Opportunity Act (HEOA) (20 USC 1070A-18), has prescribed major details of the UB program. Most UB grantees are two- or four-year colleges, but other organizations, such as local education agencies, nonprofit organizations, other community organizations, and state education agencies, may also host UB projects. Eligible UB students must come from families with household income below 150 percent of the poverty line or in which neither parent holds a bachelor’s degree, and two-thirds of any project’s participants must satisfy both criteria. Individual UB projects must provide an array of services to participants, who typically enter the program early in high school.

UB projects are required to provide students with:

  • academic tutoring to prepare students to complete secondary or postsecondary courses;

  • guidance on course selection;

  • assistance in preparing for college entrance examinations and completing college admission applications;

  • information on Federal student financial aid programs and benefits, and resources for locating public and private scholarships;

  • assistance completing financial aid applications;

  • education or counseling services to improve the financial literacy and economic literacy of students or their parents, including financial planning for postsecondary education; and

  • assistance for high school dropouts with secondary school reentry, entry into alternative education programs, and entry into general educational development (GED) programs or postsecondary education programs (20 USC 1070A-13b).

According to grantee-provided data, more than 80 percent of Upward Bound participants attend college within two years of graduating high school,2 with older data suggesting that three-quarters of those students (60 percent overall) enroll in a four-year college or university.3 However, like many low-income students, UB participants may miss opportunities to enroll in more selective colleges and universities that better match their academic capabilities. A previous study of UB found that only 11 percent of UB participants enroll in four-year institutions classified by the Barron’s guide as “most competitive,” “highly competitive,” or “very competitive” (Seftor, Mamun, and Schirm, 2009). Further, more than a third of participants overall attend their host institution (45 percent for those participating in UB programs hosted at a two-year institution), but whether that reflects the best outcome for those students or a lost opportunity is uncertain. While all UB projects provide college advising and application help, there is variation in the emphasis and intensity of these activities and room to improve the “match” or “fit” between UB student qualifications, financial circumstances, and interests and the college in which they enroll.

Overview of the Enhanced College Advising Demonstration

The demonstration will build on advising activities grantees already conduct, but take into account information and approaches emerging from recent rigorous research (Avery, 2013; Hoxby and Turner, 2013; Roderick, Nagaoka, Coca, and Moeller, 2009; Sherwin, 2012; Carrell and Sacerdote, 2013). The intervention is a professional development program for UB staff and a set of tools and resources for them to use in working with students from the spring of their junior year through early senior year. Both the staff training and the student tools and resources will focus on the benefits of attending higher quality institutions, the concepts of net costs and completion rates in comparing colleges of interest, the availability of financial aid, other factors to consider in finding a “fit,” and the importance of applying to at least four colleges (Smith, 2011), with fee waivers to ensure that household income does not constrain the number of applications.

The professional development will include review of emerging research and best practices; introduce key concepts in enhanced college advising; simulate enhanced advising activities with materials, tools, and resources; and support staff in developing a plan for implementing enhanced advising strategies. The professional development will be offered in a series of webinars. The intervention builds on lessons learned from recent Hoxby-Turner (2013) research by providing students with: (1) an illustrative example of colleges customized to their PSAT/SAT and/or ACT/PLAN score and their location, and (2) a list of scholarships and grants available in their state.4 In contrast to the Hoxby-Turner study, where packets were mailed directly to very high-achieving, low-income students, the UB demonstration will address students with a wider range of academic backgrounds and encourage staff support to help students understand and act on these materials.

Overview of the Evaluation

The 2008 Higher Education Opportunity Act (HEOA) (20 USC 1070A-18) requires ED to conduct a rigorous study of a promising practice that has the potential to improve key outcomes for UB participants. At the same time, the law prohibits any evaluation of a TRIO program that would require grantees to “recruit additional students beyond those the program or project would normally recruit” or that would result “in the denial of services for an eligible student under the program or project.” The proposed research demonstration fulfills HEOA’s mandate to examine a promising practice and is consistent with the prohibition against denying students UB services as part of the evaluation. Under the demonstration design, both the treatment and control group projects would continue providing regular UB services. In addition, ED has committed to providing the professional development program to both groups, with the control group projects receiving training after the experimental period is over.

The professional development program will be evaluated using a delayed treatment randomized control trial (RCT) design. This design will ensure that all UB projects that volunteer have access to the college advising intervention at some point. Approximately 200 Upward Bound projects (awarded grants in 2012) will be recruited to volunteer for the demonstration. These projects will be randomly assigned to two waves: Wave 1 (treatment) projects will receive the professional development program beginning in spring 2015,5 and Wave 2 (control) projects will receive it beginning in fall 2016.
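
As footnote 5 notes, the random assignment will be blocked. Purely as an illustration of the mechanics, and not the study’s actual procedure, the sketch below shows one way projects could be randomized to waves within blocks; the block definition, project identifiers, and seed are hypothetical.

```python
import random

def assign_waves(projects, seed=2015):
    """Illustrative blocked random assignment of UB projects to Wave 1
    (treatment) or Wave 2 (delayed treatment). Each project is a dict
    with hypothetical 'id' and 'block' keys (e.g., region x host type)."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    blocks = {}
    for project in projects:
        blocks.setdefault(project["block"], []).append(project)
    assignments = {}
    for members in blocks.values():
        rng.shuffle(members)
        half = len(members) // 2  # odd-sized blocks put the extra project in Wave 2
        for i, project in enumerate(members):
            assignments[project["id"]] = "Wave 1" if i < half else "Wave 2"
    return assignments

# Tiny worked example with two blocks of two projects each.
projects = [
    {"id": "UB-001", "block": ("Northeast", "4-year host")},
    {"id": "UB-002", "block": ("Northeast", "4-year host")},
    {"id": "UB-003", "block": ("South", "2-year host")},
    {"id": "UB-004", "block": ("South", "2-year host")},
]
print(assign_waves(projects))
```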

Students in both groups who were high school juniors in 2014-15 will be tracked over time to collect administrative and survey data on key outcomes, including college application behavior, college acceptance and matriculation, and receipt of financial aid. ED expects to exercise an option on the current contract to enable collection of longer-term data on college persistence, which is the most important measure of match or fit.

The study will use the data to assess not only whether the intervention is effective, but how well it was implemented and whether its effectiveness depends on key components of the UB program or features of the college advising intervention as it was designed and implemented. In particular, the evaluation is designed to answer three main research questions:

  1. To what extent do the professional development package and tools have an effect— above and beyond the services Upward Bound grantees already provide—on important student outcomes?

  2. How fully was the intervention implemented (e.g., in terms of staff participation in training and staff implementation of the intervention model)? And to what extent did the intervention produce a difference in the Upward Bound college advising provided to treatment and control group students?

  3. Is there variation in the impacts of the enhanced college advising intervention on student outcomes and to what extent is the variation associated with other project features or characteristics of participating students? For example, do impacts vary between projects hosted by two-year institutions and projects hosted by four-year institutions? Are differences in the implementation of the enhanced college advising associated with differences in impacts?

To answer these questions we will conduct both impact and descriptive analyses. The first report, which will address each research question, will be available in 2017, and the second report will be published in 2018. ED expects to issue a later report on persistence impacts in 2020. To minimize costs, the evaluation will rely to the extent possible on easily available administrative data for many of the outcome measures.

Exhibit A-1 presents the research questions along with the data sources for each question, the analytic approach, and the outcomes of interest.

Exhibit A-1. Evaluation Questions, Data Sources, Analytic Approach, and Outcomes of Interest

Research Question 1: To what extent do the professional development package and tools have an effect— above and beyond the services Upward Bound grantees already provide—on student outcomes?

  • Data sources: student survey data; National Student Clearinghouse data; Federal Student Aid data; NCES IPEDS data; College Board and ACT data

  • Analytic approach: impact analysis; sample of 4,000 students in 200 UB projects

  • Outcomes of interest: number and type of college applications submitted; selectivity of colleges applied to; knowledge of college net costs; knowledge of financial aid options; completion of the FAFSA; type of college enrolled in; selectivity of college enrolled in; persistence in college

Research Question 2: How fully was the intervention implemented (e.g., in terms of staff participation in training and staff implementation of the intervention model)? And to what extent did the intervention produce a difference in the Upward Bound college advising provided to treatment and control group students?

  • Data sources: UB project director survey data; student survey data

  • Analytic approach: descriptive/impact analysis; sample of 4,000 students in 200 UB projects and 200 UB project directors

  • Outcomes of interest: enhanced college advising experiences; receipt and use of student advising materials; staff knowledge and awareness; staff behaviors and practice

Research Question 3: Is there variation in the impacts of the enhanced college advising intervention on student outcomes, and to what extent is the variation associated with other project features or characteristics of participating students? For example, do impacts vary between projects hosted by two-year institutions and projects hosted by four-year institutions? Are differences in the implementation of the enhanced college advising associated with differences in impacts?

  • Data sources: student survey data; project director survey data; National Student Clearinghouse data; Federal Student Aid data; NCES IPEDS data; College Board and ACT data

  • Analytic approach: impact analysis/moderator analysis; sample of 4,000 students in 200 UB projects and 200 UB project directors

  • Outcomes of interest: number and type of college applications submitted; selectivity of colleges applied to; knowledge of college net costs; knowledge of financial aid options; completion of the FAFSA; type of college enrolled in; selectivity of college enrolled in


A.1 Circumstances Making the Collection of Information Necessary

The study’s data collection and design will allow for a rigorous assessment of a promising practice within UB that focuses on program improvement, as required by the 2008 Higher Education Opportunity Act (HEOA) (20 USC 1070A-18). It also builds on promising new research about college advising practices and on strong policy interest in addressing the issue of college undermatching and fit among low-income students.6 Finally, the demonstration and its research will test a strategy geared towards improving college persistence, an outcome of great interest to both policymakers and UB projects.

An accumulating set of studies indicates that, when applying to college, high-achieving low-income students often aim for less-selective institutions, or “undermatch” (Bowen, Chingos, and McPherson, 2009; Byndloss and Reid, 2013; Roderick, Coca, and Nagaoka, 2011; Smith, Pender, and Howell, 2013). These students tend to select colleges that are less academically rigorous, have lower graduation rates, and have higher net costs to the student than more academically rigorous institutions. The research indicates that some low-income students have limited access to information about college affordability, selectivity, and outcomes (Avery, 2013; Bowen, Chingos, and McPherson, 2009; Hoxby and Turner, 2013).

Experimental and non-experimental evidence suggests promising college advising practices to address “match” or “fit” that can be incorporated into or adapted for Upward Bound (Avery, 2013; Carrell and Sacerdote, 2013; Roderick, Nagaoka, Coca, and Moeller, 2009; Sherwin, 2012; Hoxby and Turner, 2013; Byndloss and Reid, 2013). These strategies include practical help on the logistics of applying to colleges (e.g., deadlines and plans), ways to expand students’ efforts to apply for financial aid, approaches to widen students’ aspirations and expectations about school quality, and accommodations for other important student considerations such as locality, programs of study, and campus attributes.

A.2 Purposes and Use of the Information Collection

There are two phases to the information collection for this study. The first ICR (approved on August 8, 2014) requested approval for Phase I – Random Assignment and Collection of Student Rosters and Baseline Student Surveys. During Phase I volunteer projects were randomly assigned to either receive the intervention beginning in spring 2015 or spring 2016, student rosters were collected to identify student participants, and a baseline student survey was administered. This ICR requests clearance for Phase II – Collection of Follow-Up Survey, Project Director Survey, and Administrative Data.

All information will be collected by the study team. A combination of administrative and survey data will be collected to examine the implementation and impacts of the Enhanced College Advising demonstration (see summary in Exhibit A-2).


Exhibit A-2. Data Collection Plan

Schedule | Data | Purpose | Respondent | Mode

Phase I – Random Assignment and Collection of Student Rosters and Baseline Survey

Fall 2014 | 2014-15 student rosters | Identify student study sample of high school juniors in 2014-15 in both treatment and control group projects | 200 UB project directors in study | Electronic data or paper

Winter 2015 | Baseline student surveys | Collect baseline student data from study sample (2014-15 high school juniors) in both treatment and control group projects | 4,000 students in study sample | Electronic data

Phase II – Collection of Follow-Up Survey, Project Director Survey, and Administrative Data

Spring 2015 | UB APR data | Collect student identifiers for matching study sample to SAT and ACT data | None; extant data on 4,000 students in study sample | Electronic data

Spring 2015 | SAT and ACT data | Collect SAT and ACT data on 2014-15 high school juniors in all projects | None; extant data on 4,000 students in study sample | Electronic data

Fall 2015 | 2015-16 student rosters | Identify 2015-16 high school juniors in control group projects that will receive access to the intervention in 2015-16 | 100 UB project directors in study control group | Electronic data or paper

Winter 2016 | Project director survey | Collect data on college advising in UB from project directors in both treatment and control group projects | 200 UB project directors in study | Electronic data

Spring 2016 | Follow-up student survey | Collect follow-up data from study sample (2014-15 high school juniors who are now high school seniors) | 4,000 students in study sample | Telephone and web

Spring 2016 | SAT and ACT data | Collect SAT and ACT data on 2015-16 high school juniors in control projects | None; extant data on 2,000 students in the delayed treatment group (2015-16 high school juniors in control group projects; these students are not in the study sample and will not be included in the analyses) | Electronic data

Summer and Fall 2016 | Federal Student Aid data | Collect data on FAFSA completion and financial aid receipt | None; extant data on 4,000 students in study sample | Electronic data

Fall 2016 | National Student Clearinghouse data | Obtain college enrollment data | None; extant data on 4,000 students in study sample | Electronic data

Fall 2018 | National Student Clearinghouse data | Obtain college enrollment data | None; extant data on 4,000 students in study sample | Electronic data

A.2.1 Phase I Data

Approval was received for the Phase I collection of student rosters and student baseline surveys.

Student Rosters. The study team will collect rosters of high school juniors participating at each of the approximately 200 UB projects from project directors in the fall of 2014. The students on these rosters will define the evaluation sample for all analyses (Questions 1 and 3), which is estimated to include 4,000 students. The roster data will also help in locating students and parents for the study’s data collection. The study team will request only directory information as designated under the Family Educational Rights and Privacy Act (FERPA) and its implementing regulations (20 U.S.C. 1232g and 34 CFR Part 99), such as the student’s name, email, home address, telephone number, date of birth, and parent name.

To obtain student roster data during Phase I, the study team will rely on the UB project directors. In the fall of the 2014-15 school year, the study team will send each project director an email requesting rosters, with instructions on what is being requested and how to submit them (see Appendix A). Next, the study team will follow up on the request with a telephone call to each project director during the fall of the 2014-15 school year. Project directors will be asked to submit student rosters through a password protected secure file transfer portal (SFTP). The study team will hold two webinars for project directors to learn how to use the SFTP and why this method of roster submission is required to ensure student confidentiality protections.

Rosters must be collected in the fall of the 2014-15 school year to identify the sample for the baseline survey, which will be administered in early 2015.

Student Baseline Survey. The baseline survey will serve multiple purposes. First, it will allow for a check of whether the treatment and control groups are statistically similar prior to any treatment. Second, the data will provide key covariates to improve the precision of the impact estimates. Finally, some of the variables may help to form subgroups of students, to examine whether different types of students benefit differentially from the college advising strategy.

The survey will be administered in early 2015 to all high school juniors in the treatment and control groups (approximately 4,000 students) before those in the treatment group are exposed to the intervention in the late spring of 2015. It will focus on characteristics that are likely correlated with college-related outcomes, such as students’ college-going expectations and plans; the number of colleges to which a student plans to apply; the name of the college a student is most likely to attend; and students’ understanding of college costs. In developing the survey, the study team relied heavily on existing survey instruments that address topic areas relevant to the key outcomes of the study (see Appendix B).

The baseline survey will be administered in a web-based online format with the option for telephone administration, if requested. The study team will send students emails and letters to their home addresses with the survey’s URL and their individualized survey login information (see Appendices C and D). The email and letter will ask students to complete the brief 15-minute survey at a time most convenient for them.

Prior to fielding the student baseline survey, parents will receive an informational letter that describes the study and its data collection activities and provides instructions to follow (including contacting a toll-free phone number and email address) if parents do not wish their child to participate in the baseline or follow-up surveys (see Appendix E). Students in UB projects assigned to the treatment group whose parents opt them out of the student surveys will still be eligible to participate in all college advising activities.

A.2.2 Phase II Data

Phase II of the evaluation will involve additional data sources and this ICR requests approval for these data collection activities. Where possible, the study relies on existing data sources, and surveys will be used to gather information where extant data sources are not available. Phase II includes the collection of: UB APR data, college entrance exam data, Federal Student Aid (FSA) data, National Student Clearinghouse (NSC) data, student rosters, student follow-up surveys, and project director surveys.

UB APR data. The study team will use the UB APR data, which include student identifiers such as first and last names, birthdates, and SSNs, in conjunction with student identifiers collected from the UB rosters to create a UB study participant data file with student names, birthdates and SSNs for matching to data from the College Board, the ACT, the FSA, and the NSC.7 The UB APR data will be obtained from ED through a password protected SFTP.

College Entrance Exam data. To customize the college profiles of the approximately 4,000 UB study participants, the study team will obtain PSAT, SAT, and ACT data from the College Board and the ACT. The UB study participant data file will be securely transferred to the College Board and the ACT through a password protected SFTP. The College Board and the ACT will then match the UB study participant data file to their database and send, through the SFTP, a new data file that includes UB participants and their PSAT, SAT, and ACT scores to Abt.

Federal Student Aid (FSA) data. These data will provide measures of key study outcomes, including FAFSA completion and receipt of Federal student aid (such as Pell Grants and Federal Work-Study). Additionally, household income, a key covariate in the impact analysis, will be collected from the FSA office. These data will be collected in 2016 for the full study sample (the 4,000 high school juniors from the 2014-15 school year at participating UB projects). The FSA data will be obtained from ED through a password protected SFTP.

National Student Clearinghouse (NSC) data. To measure two of the study’s key outcomes, initial and ongoing college enrollment, the study team will collect postsecondary enrollment data from the NSC for study students (i.e., UB students at participating projects who were juniors in the 2014-15 school year). The NSC data will include enrollment status and the institution enrolled in. The NSC provides student-level data on college enrollment and completion for 9,800 member institutions, both public and private, in all 50 states; together, these institutions represent 91 percent of enrollment in higher education in the U.S. Although the NSC has lower coverage of two-year than four-year institutions, fewer private than public institutions, and gaps in data in some states, no other national-level datasets contain individual student-level data on postsecondary enrollment and completion.

In fall 2016 and again in fall 2018, the UB study participant data file will be securely transferred to the NSC through a password protected SFTP.8 The NSC will then match the UB study participant data file to their database and send, through the SFTP, a new data file that includes UB participants and college enrollment data to Abt.

Student Rosters. To identify the students eligible for access to the intervention in 2015-16 (non-experimental group), the study team will collect rosters of high school juniors in the UB program at participating control group projects in the fall of 2015. The collection of the student rosters in fall 2015 will follow similar procedures to those developed for the 2014 roster collection.

Outcomes data will not be collected on high school juniors participating in UB at control projects in the 2015-16 school year.

Project Director Survey. The study team will administer the project director survey via the web in the winter of 2016 to the approximately 200 UB project directors participating in the study. The survey will collect information about the features of college advising provided in participating UB projects, both to help estimate the difference in the experimental conditions faced by students in the two groups and to provide data on important factors mediating potential intervention impacts (see Appendix G for the invitation and Appendix H for the survey). Because students in both experimental conditions may be exposed to similar college advising topics, the survey will include items specifically developed to determine the differences between the college advising provided to students in control group projects and that provided to students in treatment group projects. The survey will not cover other UB service offerings because this information was recently collected under a separate contract led by DIR in partnership with Abt. That contract included a comprehensive survey of all UB projects (obtaining a 95 percent response rate) that covers the full range of required program offerings, and these data will be available to the study team for the purposes of the current study of enhanced college advising.

Follow-Up Student Survey. The follow-up survey will collect key student outcomes data. In the spring of 2016, near the end of the senior year for students in the study sample (i.e., the 4,000 high school juniors from 2014-15), the study team will administer a web-based follow-up survey. This survey will capture features of the college advising that students received; students’ experiences with the college and financial aid application processes; the number and types of colleges to which students applied; reasons why they chose to apply to these colleges; the number of colleges to which they were admitted; the types and amount of financial aid offered; their college enrollment decision; reasons for choosing to enroll at the selected college; and their understanding of financial aid and college costs (see Appendix I for the invitation and Appendix J for the survey). The survey will be fielded after April 15, 2016, because most undergraduate admissions decisions are communicated on or before this date annually.




A.3 Use of Information Technology and Burden Reduction

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden.

The study will use a combination of mechanical and electronic technology to collect data. For each data collection task, the study team has selected the form of technology that enables the collection of valid and reliable information in an efficient way while minimizing respondent burden.

To minimize burden during the collection of student rosters, the evaluator’s email address and toll-free telephone number will be included on the roster request so that project directors can ask questions, and project directors will be provided with an SFTP link to upload rosters with minimal effort and time. Taken together, these procedures are designed to minimize the burden on project directors.

The student baseline and follow-up surveys, as well as the project director survey, will be administered primarily online, allowing respondents to complete the survey at a time and place most convenient for them. Additionally, online administration can reduce the time and human error associated with manual data entry because the data will be entered directly by respondents and loaded automatically into an electronic data file. Follow-up phone calls will be made to nonrespondents, and telephone administration of the survey will be available during these calls.

A.4 Efforts to Identify Duplication

To date, there has been no rigorous study that addresses the congressional mandate to investigate one or more strategies that could improve the effectiveness of Upward Bound.

To the extent possible, the study team will use existing data for the study rather than duplicate data collection efforts. For example, the study team will use data on how projects are implementing most UB services from the 2013 Upward Bound Project Director survey (collected for a different study under OMB #1850-0899, NOA 7/16/13) instead of collecting these data again for this demonstration in 2015-16. The study team will also use publicly available data on high school characteristics and UB programming. The information collected in the student rosters and in the student and project director surveys is not available elsewhere.

A.5 Efforts to Minimize Burden on Small Businesses

No small businesses will be involved as respondents. The primary entities for this study are UB projects. The study team will minimize burden by training data collection staff to make their contacts with UB project directors as straightforward and concise as possible. Student surveys and project director surveys will be administered via the web, minimizing the burden placed on UB project directors and students. All notification mailings, conversations, and presentations are designed to be clear, brief, and informative.


A.6 Consequences of Not Collecting the Information

The consequences of not collecting specific data include:

  • Without collecting the student rosters, the study team could not identify the survey sample or define treatment and control groups, rendering it impossible to conduct the study.

  • Without administering the baseline and follow-up student surveys, the study team could not complete the impact analyses.

  • Without administering the project director survey, the study team could not estimate the difference in the experimental conditions faced by students in the two groups or analyze data on important factors mediating potential intervention impacts.

  • Without conducting the study, ED would not comply with the evaluation requirements set forth in the 2008 Higher Education Opportunity Act (HEOA) (20 USC 1070A-18).

A.7 Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

There are no special circumstances concerning the collection of information in this study.

A.8 Consultation Outside the Agency

A.8.1 Federal Register Announcement

A 60-day notice to solicit public comments was published in the Federal Register on August 25, 2015 (vol. 80, no. 164, p. 51543). One public comment that addressed this data collection was received. The comment did not require any changes.

A.8.2 Consultations Outside the Agency

The study team, in consultation with ED, has assembled a Technical Working Group composed of consultants with expertise in areas relevant to this study. The Technical Working Group convened on June 10, 2014 and discussed the study design and data collection plans. The group did not recommend any substantial changes to the study design or data collection plans.

The intervention’s professional development and resources were developed in collaboration with the Council for Opportunity in Education (COE), the College Board, and ACT. Furthermore, COE, the association of Upward Bound grantees, was consulted about the study’s recruitment and outreach plans.

A.8.3 Unresolved Issues

There are no unresolved issues.

A.9 Payments or Gifts to Respondents

During Phase I—Random Assignment and Collection of Student Rosters and Baseline Student Surveys, there will be no payments.

For Phase II – Collection of Follow-Up Survey, Project Director Survey, and Administrative Data, we propose to provide a modest incentive payment ($15 gift card) to each student participant after completion of the follow-up survey. An incentive payment is necessary for the follow-up student survey because this survey will collect key outcome data essential for the impact analysis. Incentives are appropriately used in Federal statistical surveys with respondents whose failure to participate would jeopardize the quality of the survey data (Graham, 2006).

To develop this strategy we reviewed the research literature on the problem of attrition in panel and longitudinal surveys and on the effectiveness of incentives. For example, Jäckle & Lynn (2008) considered the cumulative effects of conditional and unconditional incentives in a panel study of teenagers (aged 16-17 during the first survey wave) in the UK. Unconditional incentives significantly reduced attrition in a multi-mode panel study, with no impact on attrition bias, regardless of mode or type of incentive, suggesting that incentives are effective in maintaining sample sizes in a panel study. Rodgers (2011) offered adult participants $20, $30, or $50 in one wave of a longitudinal study and found that the highest incentive, $50, showed the greatest improvement in response rates and also had a positive impact on response rates for the next four waves.

We have no plans to offer an incentive for completion of the project director survey, because all projects will receive the professional development and tools, and because signing up for the demonstration required projects to agree to participate in the data collection components.

A.10 Assurance of Confidentiality

The study team will conduct all activities in Phases I and II in accordance with all relevant regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[all] collection, maintenance, use, and wide dissemination of data by the Institute … to conform with the requirements of section 552 of Title 5, United States Code, the confidentiality standards of subsections (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.

In addition, all data collected for the study shall be maintained in accordance with Section 552a of Title 5, United States Code, the confidentiality standards of subsection (c), and sections 444 and 445 of the General Education Provisions Act. Subsection (c) of Section 183, referenced above, requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to the requirements of subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony. All collaborating and partner organizations with which the study team shares personally identifiable information will be required, under data sharing agreements, to abide by all of these statutes as well.

In addition, the following verbatim language will appear on all letters, fact sheets, and other study materials:



Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific program, district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.

Data will be presented in aggregate statistical form only. All study staff involved in collecting, reviewing, or analyzing individual-level data will be knowledgeable about data security procedures and will sign nondisclosure agreements (see Appendix F). Respondents will be assured that all information identifying them will be kept private to the extent allowed by law. The confidentiality procedures adopted for this study during all rounds of data collection, data processing, and analysis consist of the following:

  • All paper files will be converted to an electronic format and the paper files will be shredded immediately after they have been converted.

  • Electronic data files with sensitive data will be removed from computers and working servers in a manner that ensures that the information cannot be recovered.

  • At the end of the contract with ED, the study team will destroy all student identifiers but will retain de-identified student data.

  • All electronic copies of de-identified student data will be destroyed within three years of the final contract payment from ED, unless ED directs otherwise.

A.11 Questions of a Sensitive Nature

There are no questions of a sensitive nature included in the information requested.

A.12 Estimate of Response Burden

The total annual respondent burden for the data collection effort is 885 hours, with an estimated annual cost to respondents of $5,571. Exhibit A-3 presents time estimates of respondent burden for the data collection activities requested for approval in this submission. Because this is a three-year clearance, total burden is divided by three. The burden estimates are based on the following assumptions (a worked check of the annualized figures follows the list):

  • One project director is expected at each of the 200 UB projects.

  • The cost to project directors is based on an hourly wage of $25.67 in 2010-11 for School and Career Counselors.9

  • An average of 20 UB student participants is expected at each of the 200 UB projects.
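
As a worked check of the annualized figures in Exhibit A-3 (fractional annual response counts are rounded up):

\[
\begin{aligned}
\text{Rosters: } & 300/3 = 100 \text{ responses/year}; & 100 \times 2.0\ \text{hr} &= 200\ \text{hr} \\
\text{Director survey: } & 200/3 \approx 67 \text{ responses/year}; & 67 \times 0.25\ \text{hr} &\approx 17\ \text{hr} \\
\text{Baseline survey: } & 4{,}000/3 \approx 1{,}334 \text{ responses/year}; & 1{,}334 \times 0.25\ \text{hr} &\approx 334\ \text{hr} \\
\text{Follow-up survey: } & 4{,}000/3 \approx 1{,}334 \text{ responses/year}; & 1{,}334 \times 0.25\ \text{hr} &\approx 334\ \text{hr} \\
\text{Total hours: } & & 200 + 17 + 334 + 334 &= 885\ \text{hr} \\
\text{Annual cost: } & & (200 + 17)\ \text{hr} \times \$25.67 &\approx \$5{,}571
\end{aligned}
\]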

Exhibit A-3. Estimate of Annual Respondent Burden


Activity | Total # of Responses | Annual # of Responses | Hours/Response | Annual Burden Hours | Estimated Hourly Wage | Annual Costs

UB Project Directors (n=200)

Collect and submit student rosters (Phases I & II) | 300 | 100 | 2.00 | 200 | $25.67 | $5,134

Project director survey (Phase II) | 200 | 67 | 0.25 | 17 | $25.67 | $437

Students (n=4,000)

Baseline survey (Phase I) | 4,000 | 1,334 | 0.25 | 334 | NA | NA

Follow-up survey (Phase II) | 4,000 | 1,334 | 0.25 | 334 | NA | NA

Totals | 8,500 | 2,836 | | 885 | | $5,571



A.13 Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record-Keepers

There are no annualized capital/startup or ongoing operation and maintenance costs involved in random assignment or the collection of student rosters, administration of the student baseline and follow-up survey, project director surveys, or collection of administrative data.

A.14 Estimates of Costs to the Federal Government

The total estimated cost to the federal government of the data collection activities for the study as described above (Phases I and II) is $1,573,989. The estimated cost to the federal government of the data collection activities for Phase I is $580,210 and the estimated cost to the federal government of the data collection activities for Phase II is $993,779. The data collection activities will be carried out over five years, Fall 2014 to Fall 2018. Thus, the average annual cost to the federal government is $314,798.



A.15 Changes in Burden

This collection adds the Phase II data collection efforts and annualizes the total burden over the three-year clearance period. The annual burden of 885 hours represents a decrease of 515 hours from the 1,400 hours previously approved.

A.16 Plans for Analysis, Publication, and Schedule

A.16.1 Analysis Plans

Impact and descriptive analyses will be conducted to answer the study research questions, as described here. Part B of this Supporting Statement provides additional detail on these analyses.

  1. To what extent do the professional development package and tools have an effect— above and beyond the services Upward Bound grantees already provide—on student outcomes?

To examine the impacts of the intervention (Question 1), the study will exploit the experimental design to estimate Intent-to-Treat (ITT) effects of the intervention relative to the control condition. These impacts will be estimated for each of the outcomes (i.e., the number and types of colleges students apply to; the selectivity of the college in which students enroll; financial aid obtained; college costs borne by students and their families; and overall enrollment in postsecondary education). We will test whether each impact is statistically significant to determine whether there is convincing scientific evidence that the intervention caused improvements in student outcomes.

In conducting the analysis, the study team will estimate two-level regression models, with students (level 1) nested within Upward Bound projects (level 2), to account for clustering.10 The student level will control for student demographic characteristics (e.g., race/ethnicity, English language learner status), aspects of students’ educational plans collected from the baseline student survey (e.g., whether the student’s first planned postsecondary degree is a two- or four-year degree), and whether or not the student is the first in the family to attend college. The project level will include the treatment indicator, to distinguish between treatment and control projects, and indicators for any stratifying variables used in random assignment; it will also control for baseline characteristics of the high schools attended by students in the project, including, for example, the percentage of seniors applying for federal financial aid, average school achievement, and the percentage of students eligible for free or reduced-price lunch.

Hierarchical Linear Model

The following hierarchical linear model will be used to estimate program impacts on continuous outcome variables.

The Level-1 (student level) model is:

\[
Y_{ij} = \beta_{0j} + \beta_{1}FirstGen_{ij} + \sum_{e=1}^{E}\beta_{1+e}EdPlan_{eij} + \sum_{k=1}^{K}\beta_{1+E+k}Dem_{kij} + \epsilon_{ij} \qquad (1)
\]

for i = (1, 2, …, n) students per project and j = (1, 2, …, P) UB projects,

where \(Y_{ij}\) is the value of the outcome (e.g., number of college applications submitted) for the ith student in the jth UB project; \(FirstGen_{ij}\) is 1 if the ith student in the jth UB project is the first in the family to attend college and 0 otherwise, centered at the grand mean; \(EdPlan_{eij}\) are E covariates representing educational plans for the ith student in the jth UB project (e.g., whether the student’s first planned postsecondary degree is a two- or four-year degree), centered at the grand mean; \(Dem_{kij}\) are K additional covariates representing demographic characteristics of the ith student in the jth UB project (e.g., race/ethnicity, English language learner status), each centered at the grand mean; \(\beta_{0j}\) is the covariate-adjusted mean value of the outcome for control students in the jth UB project; \(\beta_{1}\) through \(\beta_{1+E+K}\) are regression coefficients indicating the effects of each student-level covariate on the outcome variable \(Y_{ij}\); and \(\epsilon_{ij}\) is the random effect representing the difference between student ij’s score and the predicted mean score for project j.

The Level-2 (UB project-level) model is:

\[
\beta_{0j} = \gamma_{00} + \gamma_{01}Treatment_{j} + \boldsymbol{\gamma}_{0X}\mathbf{X}_{j} + u_{0j} \qquad (2)
\]

where \(\gamma_{00}\) is the covariate-adjusted mean value of the outcome measure across control UB projects; \(\gamma_{01}\) is the treatment effect, i.e., the difference between the covariate-adjusted means of the treatment and control projects; \(Treatment_{j}\) is the treatment status dummy variable with a value of 1 if the jth project is assigned to the treatment group and 0 if assigned to the control group; \(\mathbf{X}_{j}\) is a vector of k variables, with coefficient vector \(\boldsymbol{\gamma}_{0X}\), measuring the characteristics of the jth project in school year 2014-15 prior to random assignment (e.g., the percentage of seniors applying for federal financial aid, the percentage of students eligible for free or reduced-price lunch), each centered at the grand mean; and \(u_{0j}\) is the deviation of UB project j’s mean from the grand mean, conditional on covariates.

The parameter \(\gamma_{01}\) indicates the impact of the demonstration on the outcome. A two-tailed t-test will be conducted to test the null hypothesis of no treatment impact using an alpha level criterion of 0.05. A positive and statistically significant estimate of \(\gamma_{01}\) will indicate that there is compelling scientific evidence (at the 5 percent level) that the demonstration had an impact on the targeted outcome. The estimate \(\hat{\gamma}_{01}\) also indicates the magnitude of the impact: enhanced college advising in UB projects is estimated to have, on average, a \(\hat{\gamma}_{01}\)-point effect on the specified outcome.

A standardized effect size will be calculated by dividing the estimated impact (\(\hat{\gamma}_{01}\)) by the standard deviation of the outcome variable in the control group (\(\sigma_{C}\)): \(ES = \hat{\gamma}_{01}/\sigma_{C}\). The control group standard deviation will be used, as recommended by Burghardt, Deke, Kisker, Puma, and Schochet (2009), rather than the pooled standard deviation, because the intervention might affect the standard deviation in the treatment group.
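
To make the estimation concrete, the following is a minimal sketch of how a random-intercept model of this form could be fit in Python with statsmodels. The data file name and all column names (outcome, treatment, project_id, and the centered covariates) are hypothetical placeholders, and the study team’s actual software and specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per student, with the project-level
# treatment indicator and covariates already merged in.
df = pd.read_csv("ub_analysis_file.csv")

# Grand-mean centering, as specified for the covariates in Equation 1.
for col in ["first_gen", "plan_four_year", "pct_seniors_fafsa_hs"]:
    df[col + "_c"] = df[col] - df[col].mean()

# A random intercept for each UB project captures u_0j in Equation 2;
# the coefficient on `treatment` estimates the ITT impact (gamma_01).
model = smf.mixedlm(
    "outcome ~ treatment + first_gen_c + plan_four_year_c + pct_seniors_fafsa_hs_c",
    data=df,
    groups=df["project_id"],
)
result = model.fit()
print(result.summary())

# Standardized effect size: estimated impact over the control-group SD.
effect_size = result.params["treatment"] / df.loc[df["treatment"] == 0, "outcome"].std()
print("Standardized effect size:", effect_size)
```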

  2. How fully was the intervention implemented (e.g., in terms of staff participation in training and staff implementation of the intervention model)? And to what extent did the intervention produce a difference in the Upward Bound college advising provided to treatment and control group students?

Descriptive analyses will be used to provide information on implementation and the fidelity of implementation (Question 2). Information on the overall level of, and variation in, implementation fidelity will provide important contextual information for interpreting the impact findings. Our proposed implementation analysis will cover: (1) the professional development received by UB staff; (2) the nature of the college advising that is offered to participants; (3) the extent to which participants receive college advising from Upward Bound; (4) the alignment between UB’s college advising and the college advising that projects in the treatment group were expected to provide; and (5) the difference between the college advising received by students in the treatment group and students in the control group.

To capture services offered, we will use project director surveys to compare treatment projects’ rollout of the intervention with control group directors’ reports of the college advising activities they provided. To collect data on the services students receive, especially those related to college and financial aid planning and applications, we plan to rely on the student follow-up survey. However, we recognize the limitations of these data; student self-reports may suffer from recall error and potential response biases (e.g., successful students tend to over-attribute their success to mentors and tutors and tend to rate the quality of those services more highly than students who are less successful).

To characterize the difference between the college advising received by students in the treatment group and students in the control group (i.e., the treatment-control contrast), we will conduct an impact analysis using the same methods described earlier in this section. The magnitude of the treatment’s impact on college advising services will be used to characterize the treatment-control contrast.

  3. Is there variation in the impacts of the enhanced college advising intervention on student outcomes and to what extent is the variation associated with other project features or characteristics of participating students? For example, do impacts vary between projects hosted by two-year institutions and projects hosted by four-year institutions? Are differences in the implementation of the enhanced college advising associated with differences in impacts?

An important goal of the study is to identify implementation features or other factors that may influence the impacts of the intervention. To address Question 3 (i.e., variation in impacts), we propose to augment our analysis model with interaction terms and test whether there are statistically significant correlational relationships between the impacts of the intervention and the way in which it was implemented, as well as other site-level characteristics. For example, to test for variation in the program impact between projects hosted by two-year institutions and projects hosted by four-year institutions, the study team will include in Equation 2 an interaction between the treatment indicator and an indicator for whether the project is hosted by a four-year institution, as follows:

\[
\beta_{0j} = \gamma_{00} + \gamma_{01}Treatment_{j} + \gamma_{02}FourYr_{j} + \gamma_{03}(FourYr_{j} \times Treatment_{j}) + \boldsymbol{\gamma}_{0X}\mathbf{X}_{j} + u_{0j} \qquad (3)
\]

where \(\gamma_{00}\) is the covariate-adjusted mean value of the outcome measure (or log-odds of the outcome occurring) among control projects hosted by two-year institutions; \(\gamma_{01}\) is the mean difference in the covariate-adjusted outcome between treatment and control projects (i.e., the treatment impact); \(Treatment_{j}\) is the treatment status dummy variable with a value of 1 if the jth project is assigned to the treatment group and 0 if assigned to the control group; \(\gamma_{02}\) is the difference between projects hosted by four-year institutions and projects hosted by two-year institutions in the covariate-adjusted mean value (or log-odds) of the outcome among control projects; \(FourYr_{j}\) is a host type indicator variable with a value of 1 if the jth project is hosted by a four-year institution and 0 if hosted by a two-year institution; \(\gamma_{03}\) is the difference in the treatment impact between projects hosted by four-year institutions and projects hosted by two-year institutions; \(FourYr_{j} \times Treatment_{j}\) is the host type by treatment condition interaction term; and \(\mathbf{X}_{j}\) is a vector of k variables measuring the characteristics of the jth project in school year 2014-15 prior to random assignment (e.g., the percentage of seniors applying for federal financial aid, the percentage of students eligible for free or reduced-price lunch), each centered at the grand mean.

Using a .05 level criterion, the study team will conduct a test of the null hypothesis that the parameter for the interaction term is zero, i.e., that there is no difference in the program impact by type of host institution. The null hypothesis is expressed as follows:

\[
H_{0}\colon \gamma_{03} = 0
\]

Because project characteristics cannot be randomly assigned, the study team will interpret a statistically significant difference as suggestive evidence that the program impact varies by type of host institution. If a statistically significant difference in program impacts is found, the study team will then test whether the program impact within each type of host institution or subgroup is statistically significant. Standardized effect sizes will be calculated as described above.
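
Continuing the illustrative statsmodels sketch from Question 1 (same hypothetical file and column names; `four_year_host` is an assumed project-level indicator, not a variable named by the study), the moderator test in Equation 3 amounts to adding an interaction term and inspecting its p-value:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ub_analysis_file.csv")  # same hypothetical analysis file as above
df["first_gen_c"] = df["first_gen"] - df["first_gen"].mean()

# `four_year_host` (1 = project hosted by a four-year institution,
# 0 = two-year host) is a hypothetical indicator merged onto each row.
model = smf.mixedlm(
    "outcome ~ treatment * four_year_host + first_gen_c",
    data=df,
    groups=df["project_id"],
)
result = model.fit()

# Test H0: gamma_03 = 0 (no difference in impact by host type).
term = "treatment:four_year_host"
print("Interaction estimate:", result.params[term])
print("p-value:", result.pvalues[term])  # compared against alpha = .05
```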

A.16.2 Timeline and Publication Plans

Timeline

The study is expected to be conducted across seven years, assuming ED exercises the option for longer-term data collection. Exhibit A-4 (below) displays the full study timeline. Phase I data collection activities will occur in the first half of study year 2: the collection of student rosters will occur in the fall of 2014, the administration of the baseline survey will take place in the winter of 2015, and random assignment will occur, after administration of the baseline survey, in late winter of 2015. All additional data collection efforts, which are part of Phase II and are covered by this revised ICR, will take place no earlier than the spring of 2015.

Exhibit A-4. Study Timeline

Activity | Date

Study Year 1 (Phase I)

Design Surveys | October 2013 – March 2014
UB Projects Volunteer to Participate | February – May 2014
Pilot Baseline Survey | April 2014
Convene Technical Working Group | April – May 2014
Pilot Follow-Up and PD Surveys | July 2014

Study Year 2 (Phases I and II)

Collection of Student Rosters | November – December 2014
Notify UB Parents | December 2014 – January 2015
Administer Baseline Student Survey | January – March 2015
Random Assignment of UB Projects into Wave 1* or Wave 2 | March 2015
Collection of SAT and ACT Data | Spring 2015
Professional Development for Wave 1 Projects | Spring – Fall 2015
Enhanced College Advising in Wave 1 Projects | Summer – Fall 2015

Study Year 3 (Phase II)

Collection of Rosters of Wave 2 Juniors | November – December 2015
Administer Project Director Survey | January – March 2016
Collection of SAT and ACT Data for Wave 2 Juniors | Spring 2016
Administer Follow-Up Student Survey | Spring 2016
Professional Development for Wave 2 Project Directors | Summer – Fall 2016
Collection of FSA Data | June 2016

Study Year 4 (Phase II)

Expanded College Advising for Wave 2 Juniors | Fall 2016
Collection of NSC Data | November 2016
Collection of Updated FSA Data | November 2016
Release of Report #1 | Spring 2017

Study Year 5 (Phase II)

Release of Report #2 | Winter 2018
Submission of Data File with Documentation | Winter 2018

Study Year 6 (Phase II)

Collection of NSC Data | November 2018

Study Year 7 (Phase II)

Release of Report #3 | Winter 2020

*Wave 1 (spring/summer/fall 2015): receives PD in spring 2015, with customized college profiles provided to juniors in summer 2015 and expanded college advising strategies implemented in summer and fall 2015.

Wave 2 (spring/summer/fall 2016): receives PD in summer 2016, with customized college profiles provided to juniors in summer 2016 and expanded college advising implemented in summer and fall 2016. Wave 2 projects will have student outcomes data (i.e., surveys, NSC, or FSA data) collected on their 2014-15 juniors but will implement expanded college advising strategies with their 2015-16 juniors.

Publication Plans

The evaluation plans call for three reports, including the option period. The first, to be published in May 2017, will be based on data collected through June 2016 (from both the Phase I and II ICRs) and will focus on outcomes measured prior to high school graduation, such as college matriculation plans and students’ understanding of financial aid and college costs. The second report will be available in March 2018 and will include results on actual postsecondary enrollment, the selectivity of the colleges in which students enrolled, and students’ use of Federal financial aid. The third report will be available in March 2020 and will include results on college persistence.

A.17 Approval to Not Display Expiration Date

No exemption is requested. The data collection instruments will display the expiration date.

A.18 Exceptions to Item 19 of OMB Form 83-1

This submission requires no exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9).

References

Avery, C. (2013). Evaluation of the College Possible Program: Results from a Randomized Controlled Trial. (Working Paper 19562). Retrieved from National Bureau of Economic Research: http://www.nber.org/papers/w19562

Bowen, W. G., Chingos, M. M., & McPherson, M. S. (2009). Crossing the Finish Line: Completing College at America’s Public Universities. Princeton, NJ: Princeton University Press.

Burghardt, J., Deke, J., Kisker, E., Puma, M., & Schochet, P. (2009). Regional educational laboratory rigorous applied research studies: Frequently asked analysis questions. Institute of Education Sciences, U.S. Department of Education. Princeton, NJ: Mathematica Policy Research.

Bureau of Labor Statistics, U.S. Department of Labor. Occupational Outlook Handbook, 2012‑13 Edition. Accessed online at: http://www.bls.gov/ooh/community-and-social-service/school‑and-career-counselors.htm (accessed November 15, 2013).

Byndloss, C. D. & Reid, C. (2013). “Promoting College Match for Low-Income Students: Lessons for Practitioners.” MDRC Policy Brief, September 2013. Available at: http://www.mdrc.org/sites/default/files/college_match_brief.pdf (accessed November 15, 2013).

Carrell, S. E. & Sacerdote, B. (2013). Late Interventions Matter Too: The Case of College Coaching in New Hampshire. (Working Paper 19031). Retrieved from National Bureau of Economic Research: http://www.nber.org/papers/w19031.

Family Educational Rights and Privacy Act of 1974, 20 U.S.C. § 1232g; 34 CFR Part 99.3.

Graham, J. D. (2006). Questions and Answers When Designing Surveys for Information Collections. Washington, D.C., Office of Management and Budget.

Hoxby, C. & Turner, S. (2013). Expanding College Opportunities for High-Achieving, Low Income Students. Stanford Institute for Economic Policy Research. SIEPR Discussion Paper No. 12-014.

Jäckle, A. & Lynn, P. (2008). Respondent Incentives in a Multi-Mode Panel Survey: Cumulative Effects on Nonresponse and Bias. Survey Methodology, 34(1), 105-117.

Rodgers, W. (2011). Effects of Increasing the Incentive Size in a Longitudinal Study. Journal of Official Statistics, 27(2), 279-299.

Roderick, M., Nagaoka, J., Coca, V., & Moeller, E. (2009). From High School to the Future: Making Hard Work Pay Off. Chicago: Consortium on Chicago School Research.

Roderick, M., Coca, V., & Nagaoka, J. (2011). "Potholes on the Road to College: High School Effects in Shaping Urban Students’ Participation in College Application, Four-year College Enrollment, and College Match." Sociology of Education 84(3): 178-211.

Schochet, P. Z. (2008). Statistical power for random assignment evaluations of education programs. Journal of Educational and Behavioral Statistics, 33(1), 62-87.

Seftor, N. S., Mamun, A. & Schirm, A. (2009). “The Impacts of Regular Upward Bound on Postsecondary Outcomes 7-9 Years After Scheduled High School Graduation.” Mathematica Policy Research, Inc. Available at: http://www.mathematica-mpr.com/publications/pdfs/upwardboundoutcomes.pdf

Sherwin, J. (2012). “Make Me a Match: Helping Low-Income and First-Generation Students Make Good College Choices.” MDRC Policy Brief, April 2012. Available at: http://www.mdrc.org/sites/default/files/policybrief_24.pdf

Smith, J. (2011). “Can Applying to More Colleges Increase Enrollment Rates?” The College Board. Available at: http://advocacy.collegeboard.org/sites/default/files/11b_4313_College%20App%20Research%20Brief_WEB_111026.pdf

Smith, J., Pender, M., & Howell, J. (2013). The full extent of student-college academic undermatch. Economics of Education Review, 32, 247-261.

U.S. Department of Education, Office of Postsecondary Education. Annual Performance Report: Upward Bound Program Awards FY2012. Available at http://www2.ed.gov/programs/trioupbound/ubgrantees2012.xls





1 U.S. Department of Education, Office of Postsecondary Education, Upward Bound Program Awards FY2014, available at http://www2.ed.gov/programs/trioupbound/ubgrantees2014.pdf

2 See http://www2.ed.gov/programs/trioupbound/ubgranteelevel-exp0910.pdf

4 The UB host institution will be one of five examples provided, so that students (and potentially their parents) can compare the net costs and performance of the host institution to other postsecondary institutions.

5 We will conduct blocked random assignment of projects, using region and one or more other blocking factors that are associated with the key student outcomes in this study.

6 For example, the White House convened a group of university administrators in October 2013 to discuss the issue of and possible remedies for undermatching among high achieving, low-income students.

7 Both ACT and the College Board are collaborators in the demonstration, and will have data sharing agreements with Abt that bind them to all of the confidentiality and privacy regulations that Abt must adhere to as an IES contractor.

8 Abt will enter into a data agreement with the NSC that binds them to all of the confidentiality and privacy regulations that Abt must adhere to as an IES contractor.



9 Bureau of Labor Statistics, U.S. Department of Labor, Occupational Outlook Handbook, 2012-13 Edition, accessed online at  http://www.bls.gov/ooh/community-and-social-service/school-and-career-counselors.htm (October 12, 2012).

10 Since each Upward Bound project serves several target schools, it is reasonable to ask whether it might be more appropriate to estimate a three-level model, with schools nested within projects and students nested within schools. However, Schochet (2008) shows that in this context it is only necessary to capture variation across schools if the study selects a sample of target schools from each project. Because the study sample will include all target schools for each randomized project, adding a school level to the model is unnecessary.



