Study of Implementation and Outcomes in Upward Bound and other TRIO Programs

OMB: 1850-0899


A Study of Implementation and Outcomes in Upward Bound and Other TRIO Programs


Part A: Supporting Statement for Paperwork Reduction Act Submission


April 10, 2013

Prepared for

Institute of Education Sciences

U.S. Department of Education



Prepared by

Decision Information Resources, Inc.

Abt Associates


Contract Number:

ED-IES-10-R-0015

Contents


Part A: Justification

List of Tables


Table 1. Estimated Annual and Total Reporting Burden for Survey Respondents


Table 2. Anticipated Sequence of Data-Collection and Reporting Activities


Table 3. Example—Grantees by Selected Organizational Characteristics


Table 4. Example—Approaches to Providing Academic Tutoring, by Location of Project


Table 5. Approaches to Providing College Entrance Exam Preparation, by Type of Host Institution



INTRODUCTION


This clearance request is submitted to the Office of Management and Budget (OMB) in support of data collection and analysis planned as part of A Study of Implementation and Outcomes of Upward Bound and Other TRIO Programs, sponsored by the Institute of Education Sciences (IES) at the U.S. Department of Education (ED). This request is a resubmission on behalf of this project, after ED and OMB determined that an earlier plan for the study was not feasible.1


Overview of the Upward Bound Program


The Upward Bound (UB) Program, the oldest and largest of the federal TRIO programs, was initiated under Title IV of the Higher Education Act of 1965 as a pre-college program designed to help economically disadvantaged high school students (low-income and/or first-generation college-going) prepare for, enter, and succeed in postsecondary education. With grant awards approximating $5,000 per student per year, Upward Bound provides an intensive set of services. In the most recent (FY 2012) grant competition, ED sought to make the provision of those services more productive and cost effective by incentivizing project applicants to propose new delivery methods and approaches that could reduce the cost per student without sacrificing implementation quality. Awards were made to 820 Upward Bound grantees totaling $268.2 million to serve more than 62,500 students.


The structure of Upward Bound projects is well established, defined largely by specific statutory language prescribing project and student eligibility and the services offered. Students usually enter the program while in the ninth or tenth grade. Although students may participate in Upward Bound through the summer following twelfth grade (for three to four years total), participants spend an average of 21 months in the program.2 Upward Bound projects are generally operated by two- or four-year colleges. Projects must offer academic instruction as well as college counseling, mentoring, and other support services. In addition to regularly scheduled meetings throughout the school year, projects also offer an intensive instructional program that meets daily for about six weeks during the summer. Upward Bound projects may provide additional services or supports to students, beyond the set of required core activities. Currently, very little is known about the intensity, duration, and mix of services, or how they are delivered, particularly for FY 2012 grantees that may have changed implementation strategies under the new grant requirements.


Overview of the Study and the Role of the Proposed Data Collection


The Study of Implementation and Outcomes of Upward Bound and Other TRIO Programs was awarded in August 2010 as a first step towards launching a congressionally mandated study. The 2008 reauthorization of the Higher Education Act (HEA) required ED to conduct a study that would identify particular institutional, community, and program or project practices that are effective in improving Upward Bound student outcomes (20 U.S.C. 1070a-18) (see Appendix A). However, the law also prohibited any study of a TRIO program that would require grantees to “recruit additional students beyond those that the program or project would normally recruit” or that would result “in the denial of services for an eligible student under the program or project.” These prohibitions were initially seen as eliminating the possibility of conducting a randomized control trial within the Upward Bound program.


The new contract was intended to explore possible designs for, and the feasibility of, a study of the effectiveness of one or more promising practices in Upward Bound. The design work completed under the contract highlighted the weaknesses of popular, program-supported study approaches, such as matching grantees on certain project characteristics and then assessing variation in student outcomes that could be attributed to differences in project implementation.3 In consultation with OMB in summer 2011, ED moved forward to investigate options for testing certain “enhanced” practices or alternative strategies for implementing core components of Upward Bound using a randomized control trial.


The data collection for which we are now seeking OMB approval is part of that effort.4 The survey of FY 2012 Upward Bound grantee project directors has two objectives:


  1. To help identify one or more strategies to test as part of a random assignment demonstration. One challenge in pursuing a rigorous study of promising practices in Upward Bound, given that it is a service-rich program, is understanding how (not just whether) projects implement core components, where gaps in services might exist, and the prevalence of both the implementation approaches and the gaps. To avoid violating the statutory prohibition against denying regular Upward Bound services to student participants, we must ensure that any promising strategy we evaluate is not already widely implemented, yet is sufficiently attractive to Upward Bound projects that we can recruit grantees not already implementing the strategy to participate in a research demonstration involving random assignment.


  2. To provide new and up-to-date information about the services and strategies that Upward Bound grantees implement, particularly under the new grant productivity requirements. The last survey of program implementation was conducted in 1997,5 and the program office has an interest in obtaining more current information. The survey will attempt to capture the approaches to program services and supports that participants experience and that may improve their prospects of successfully completing high school and entering and/or completing college.

The survey is designed to address the following research questions:

  • How are the key components and service areas implemented within Upward Bound?

  • What is the prevalence of specific strategies or approaches for implementing service areas or program components across grantee projects?

  • Where are there gaps in services that might be linked to improving student outcomes?

  • What strategies or approaches may be amenable to testing through an experimental design to measure impact on student outcomes?


The project director survey will be administered to all 820 regular Upward Bound grantees in the summer of 2013, so that grantees can report on Upward Bound activities and services during their summer program as well as during the school year. Because of the need to collect information on how services are implemented, the instrument is detailed and designed to take approximately 40 minutes to complete. We solicited and received input from the TRIO community and other college access experts to improve and revise the instrument during the public comment period.6 The results of the data collection will be tabulated and released in a summary report.


A.1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


This survey is necessary in order to prepare for and conduct the congressionally mandated study of promising practices in Upward Bound (see Appendix A), given the comprehensiveness of required services under the program and the statutory prohibitions on changing recruitment of student applicants or denying regular Upward Bound services to participants. These factors make identifying practices or strategies that can be evaluated rigorously more dependent on obtaining information on current implementation approaches or gaps in service. The timing of the survey (approximately 12-14 months after grant award) is designed to ensure that Upward Bound projects already know their implementation strategies, so that any demonstration of promising practices is clearly an “add-on” to what participating projects intended to offer.



A.2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


a. How and by Whom the Data Will Be Collected


The Upward Bound implementation survey and its analysis will be conducted by Decision Information Resources (DIR) and its partner, Abt Associates, who were awarded a contract by IES through a competitive procurement. The survey of project directors is expected to take approximately 40 minutes to administer and will be fielded with all 820 FY 2012 Upward Bound grantees. A draft of the letter inviting project directors to complete the survey is included in Appendix B and will be sent in spring 2013. The survey will not collect personally identifiable information other than the names and contact information for UB project directors, which are publicly available on the UB websites and in project directories.


All data for this study will be collected through a web-based survey, although provision will be made for alternative hard-copy and phone completion, if necessary. Our approach to implementing web-based surveys will help to ensure we attain the high response rate desired to minimize non-response bias. To obtain high response rates, the study team will employ a number of features in the design and administration of the survey that have been demonstrated to be effective in web-based surveys. These features include:

  • clearly identifying the survey’s sponsorship and its support by key stakeholders;

  • providing multiple modes of completion, including the web, hard copy, and phone, if the respondent desires; and

  • using reminder follow-up calls.7


Ongoing data files will be extracted for analysis throughout the survey period to ensure responses are being appropriately captured, regardless of mode.


b. Purpose for Collecting the Information


The survey of project directors is necessary to obtain current and detailed information on the approaches and strategies used to implement core components of Upward Bound, information that will inform efforts to fulfill the congressionally mandated study of promising practices within the UB program. The UB Annual Performance Reports (APRs) submitted by project directors to ED are insufficiently detailed to capture meaningful variations in grantee strategies for implementing program components or to help categorize practices that could, in turn, measure relationships between implementation strategies and student outcomes.




A.3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The survey instrument for this study has been designed to be administered through a web platform; however, provision will be made for alternative modes of completion (i.e., telephone or mail). The web platform allows respondents to complete the survey at a convenient time and pace and reduces response burden through appropriate skip logic. A web-based survey also reduces the time and human error associated with manual data entry by allowing respondent-entered data to be loaded directly into a data file.
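As an illustration of how skip logic reduces burden, a minimal sketch follows (the item names and branching structure are invented for this example, not taken from the actual instrument): a respondent whose answer to a gateway item is "no" is never shown its follow-up items.

```python
# Hypothetical sketch of web-survey skip logic (item names are invented
# for illustration). A respondent who answers "no" to a gateway item is
# never shown its follow-up items, which shortens the survey.
def items_to_present(offers_online_tutoring):
    items = ["Q1_offers_online_tutoring"]
    if offers_online_tutoring:
        # Follow-ups appear only when the gateway item is answered "yes".
        items += ["Q1a_hours_per_week", "Q1b_platform_used"]
    return items

print(items_to_present(False))  # only the gateway item is presented
```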


A.4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The information to be gathered by this data collection does not currently exist in a systematic format. Similar information has not been collected concerning UB grantees for over 15 years.


A.5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


No small businesses will be involved as respondents. However, some grantees may be considered small entities. To minimize burden on these small entities, ED will use a web-based survey to allow respondents to complete the survey at a time and pace most convenient for them.


A.6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This submission includes a survey that will be conducted only one time during the full study period and will collect information not otherwise available to the study. The survey of Upward Bound Project directors will provide new and current information about the services and strategies grantees implement and identify potential promising practices. Not collecting this information from grantees would prevent ED from fulfilling the mandate to conduct a rigorous study of promising practices in Upward Bound.


A.7. Explain any special circumstances requiring collection of information in a manner inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations.


There are no special circumstances associated with this data collection.



A.8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.


In accordance with 5 CFR 1320.8(d), a 60-day notice was published in the Federal Register on 12/26/12 (Vol. 77, p. 76012), announcing the agency’s intention to request OMB review of data-collection activities and providing a 60-day period for public comments. A second notice will be published in the Federal Register, and another 30-day period will be provided for public comments.


During the 60-day comment period, we received comments from four members of the TRIO community, including the advocacy group representing Upward Bound and other TRIO programs, providing feedback on the survey instrument. The comments suggested the following:

  • Make sure survey items accurately represent UB projects. Members of the Upward Bound community provided feedback on the structure and language of questions, highlighting inconsistencies with program practices. For example, some survey items referenced eighth grade participants, when statutory regulations require applicants to have completed the eighth grade unless it can be demonstrated that the requirement would defeat the purpose of the program. As a result of this feedback, the eighth-grade references were removed.


  • Improve the organization of survey items. Suggestions were provided for how items could be reorganized to reduce burden on respondents. For example, questions regarding the use of technology were asked for each required service area. Changes were made to the instrument to consolidate similar questions, minimizing repetition and improving the overall organization of survey items.


  • Simplify survey items to reduce response time. It was suggested that the wording of questions and the structure of responses could be improved to reduce the time necessary to complete the survey. For example, some items ask project directors to provide numeric values regarding student participation levels for specific services. These items were modified so that responses are pre-coded, allowing respondents to select a range (e.g., 0-25%, 26-50%) rather than report absolute values.


  • Allow for grantee input. While coded responses can reduce response time, it was also mentioned that adding an open-ended question would allow grantees to describe and discuss innovative or promising practices. In response to this comment, an open-ended question was added to allow for more direct input from project directors.


In addition, the study team solicited advice from individuals with expertise on college access prior to revising the survey instrument. These individuals included Dr. William Tierney of the University of Southern California, Dr. Consuelo Arbona of the University of Houston, and Dr. Laura Perna of the University of Pennsylvania. We also obtained advice from staff in ED’s Office of Postsecondary Education (OPE) intended to minimize burden on respondents. Feedback from both the external experts and OPE staff is reflected in the attached, revised instrument.


Finally, during the OMB review period we will pilot the web-based version of the revised survey with no more than 8 respondents to ensure that the survey is easy to navigate, that it functions well, that all skips operate appropriately, and that the resulting data file is accurate. Because the instrument already reflects feedback from current project directors and members of the TRIO community, we expect any further changes to be primarily operational.


A.9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No payments will be made to respondents as they are all current grantees receiving ED funding.


A.10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


ED, its study contractor (Decision Information Resources, Inc.), and other parties involved in the study will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183. The act requires, “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. Other than the names and contact information for the project directors, no data collected for this survey will contain personally identifiable information. UB project director names and contact information are already publicly available on UB websites and in directories of UB projects; however, no names or contact information will be released as part of the study.


ED, the survey contractor (Decision Information Resources, Inc.), and other parties involved in the study will protect the confidentiality of all information collected for the study and will use it for research purposes only. The following safeguards will be put in place:


  • Respondents will be assured that confidentiality will be maintained, except as required by law.

  • Identifying information about respondents (e.g., respondent name, address, and telephone number) will not be entered into the analysis data file, and will only be kept in hard copy in a locked filing cabinet.

  • A fax machine used to send or receive documents that contain confidential information will be kept in a locked file room, accessible only to study team members.

  • Confidential materials will be printed on a printer located in a limited access file room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.

  • In public reports, findings will be presented in aggregate by type of respondent or for subgroups of interest. No reports will identify individual respondents or local agencies.

  • Access to the sample files will be limited to authorized study staff only; no others will be authorized such access.

  • All members of the study team will be briefed regarding confidentiality of the data.

  • A control system will be in place, beginning at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry. This includes sign-in/sign-out sheets and the hand-carrying of documents by authorized project staff only.

  • All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.

  • When any hard copies containing confidential information are no longer needed, they will be shredded.


In addition, the following language will be included in the recruitment letters:


As required by the policies and procedures of the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for the study will summarize findings across individuals and institutions and will not associate responses with a specific UB grantee, school district, school, or person. We will not provide information that identifies grantee respondents to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.


A.11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no questions of a sensitive nature included on the data-collection instruments for this study.



A.12. Provide estimates of the hour burden of the collection of information. The statement should:


• Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.


All of the hour and cost burden for this study will fall on Upward Bound project directors who will be asked to complete the survey. The burden estimates assume 100% of respondents complete the survey. Non-response would reduce the burden estimate accordingly.


Estimated Hour Burden for Survey Respondents.


The planned study includes one survey of Upward Bound project directors. The same survey instrument will be used for all respondents. The survey will be administered to all 820 regular Upward Bound grantees. The time to complete the survey is estimated to be 0.67 hours. Table 1 presents our annual and total hour burden estimates for survey respondents. The total hour burden for survey respondents is estimated to be 549 hours (820 respondents x 0.67 hours per respondent). Although this is a one-time data-collection effort, because it falls under a 3-year contract the annual number of respondents is 274 (820 respondents ÷ 3 years) and the annual burden over the three years is 183 burden hours (549 hours ÷ 3 years).


Table 1. Estimated Annual and Total Reporting Burden for Survey Respondents

Respondents      | Number of Respondents | Frequency of Collection | Average Time per Respondent | Annual Hour Burden | Estimated Hourly Wage* | Estimated Annual Cost
Project Director | 274                   | -                       | 0.67 hours                  | 183 hours          | $32.65                 | $5,975
TOTAL            | 820                   | 1                       |                             | 549 hours          |                        | $17,925

* 2010 Statistical Abstract of the U.S.—Table 630. Mean Hourly Earnings and Weekly Hours by Selected Characteristics: 2007. The rate used here for the project director reflects the hourly wage for a manager in state and local governments, which may be higher than the wages for a UB project director.


Estimate of the Cost Burden to Survey Respondents.


As shown in Table 1, the estimated total hour burden for survey respondents is 549 hours (820 respondents x 0.67 hours per respondent). The cost burden per hour reflects the hourly wage ($32.65) for a manager in state and local governments. Multiplying the total burden hours by this wage yields a total cost estimate of $17,925. Although this is a one-time data-collection effort, the annual cost burden over the three years is $5,975 ($17,925 ÷ 3 years). Apart from their time, respondents incur no other direct monetary costs.
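The burden and cost arithmetic above can be re-derived as follows (an illustrative check only, not part of the collection instruments; the rounding conventions are assumptions inferred from the published figures):

```python
# Hypothetical sketch re-deriving the Table 1 burden and cost figures
# from the quantities stated in the text. Rounding conventions are
# assumptions inferred from the published numbers.
import math

respondents = 820        # all FY 2012 regular Upward Bound grantees
hours_each = 0.67        # ~40 minutes per survey
wage = 32.65             # assumed hourly wage for a manager (Table 1 note)
contract_years = 3

total_hours = round(respondents * hours_each)                 # 549 hours
annual_respondents = math.ceil(respondents / contract_years)  # 274 per year
annual_hours = round(total_hours / contract_years)            # 183 hours
total_cost = round(total_hours * wage)                        # $17,925
annual_cost = round(total_cost / contract_years)              # $5,975

print(total_hours, annual_respondents, annual_hours, total_cost, annual_cost)
```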


A.13. Provide an estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).


No annualized capital-startup costs or ongoing operation and maintenance costs are associated with collecting the information.


A.14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The total cost to the federal government for the Implementation and Outcomes Study of Upward Bound and Other TRIO Programs is $1,495,180. This total includes the substantial work already conducted between 2010 and 2012 under this contract: examining the feasibility of alternative experimental and quasi-experimental design options and data sources for an evaluation of UB, analyzing data from UB Annual Performance Reports and grant applications, and preparing materials for a set of qualitative case study visits. The annual cost for activities conducted in FY 2012-2013, which include the design and implementation of the project director survey (including obtaining input from expert reviewers and stakeholders), is $508,878.


A.15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I (these references will change when the Information Collection Submission Worksheet is finalized).


This is a new data collection request.


A.16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


This section outlines the time schedule for activities and deliverables and presents analysis plans for the project director survey data.


Time Schedule for Activities and Deliverables


The schedule shown in Table 2 lists the sequence of activities required to conduct the information-collection activities described in this submission and includes key dates for activities related to respondent notification, data collection, analysis, and reporting.


Table 2. Anticipated Sequence of Data-Collection and Reporting Activities

Activity                                                                                                                | Expected Dates
Administration of survey                                                                                                | 6/1/2013 - 7/31/2013
Internal memo summarizing initial findings (to inform development of the research demonstration on promising practices) | 8/30/2013
Release of report                                                                                                       | 4/30/2014


Analysis Plans for Project Director Survey Data


The focus of the analysis will be to describe the implementation strategies used for key components of the UB program by current grantees. We will group implementation strategies by categories of grantees defined by key UB organizational characteristics—including type of host institution, location of grantee services, and number of target schools. Descriptive analysis will examine the approaches to components offered overall and by categories of grantees, illuminating possible relationships between service strategies and organizational characteristics. Examples of these types of tabulations are provided in Tables 3-5.


The analytic approach is straightforward and will include tabulation of means and distributions. Because we have sampled the universe of grantees and expect a 100 percent or near 100 percent response, it is not appropriate to compare approaches using traditional t-tests.
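A tabulation of this kind can be sketched in a few lines (the grantee records and field names below are invented for illustration; the actual analysis will use the survey data and the categories defined above):

```python
# Hypothetical sketch of the planned descriptive tabulation: percent (n)
# of projects using an approach, by location, in the format of the
# Table 4 shell. Records and values are illustrative only.
from collections import defaultdict

# Each record: (location, uses_group_tutoring)
grantees = [
    ("Urban", True), ("Urban", False), ("Urban", True),
    ("Suburban", True), ("Rural", False), ("Rural", True),
]

counts = defaultdict(lambda: [0, 0])  # location -> [uses approach, total]
for location, uses_group in grantees:
    counts[location][1] += 1
    if uses_group:
        counts[location][0] += 1

# Percent (n) of projects using the approach, by location.
table = {loc: (round(100 * k / n), n) for loc, (k, n) in counts.items()}
print(table)
```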



Table 3. Example—Grantees by Selected Organizational Characteristics

Study Sites                     | Number of Grantees | Percent of Grantees
Location                        |                    |
    Urban                       |                    |
    Suburban                    |                    |
    Rural                       |                    |
Type of Host Institution        |                    |
    Two-Year                    |                    |
    Four-Year                   |                    |
Number of Active Target Schools |                    |
    1                           |                    |
    2-3                         |                    |
    4-5                         |                    |
    More than 5                 |                    |



Table 4. Example—Approaches to Providing Academic Tutoring, by Location

UB Project Provision of Academic Tutoring, by Location of Project

                                                                 | Urban % (n) | Suburban % (n) | Rural % (n)
Approaches to Providing Academic Tutoring                        |             |                |
    Provide group tutoring                                       |             |                |
    Provide one-on-one tutoring                                  |             |                |
    Use on-line tutoring resources                               |             |                |
Level of Participation in Tutoring Services                      |             |                |
    % of projects in which tutoring is required of some students |             |                |
Frequency of Tutoring                                            |             |                |
    Offered daily                                                |             |                |
    Offered once weekly                                          |             |                |
    Offered less than weekly                                     |             |                |
Methods Used to Determine Tutorial Offerings to Students         |             |                |
    Use school records                                           |             |                |
    Use teacher referrals                                        |             |                |
    Use student assessments                                      |             |                |
    Parents or students express need                             |             |                |
Characteristics of Tutorial Staff                                |             |                |
    Use paid UB staff                                            |             |                |
    Use volunteers                                               |             |                |
    Use paid college students from host institution              |             |                |


Table 5. Approaches to Providing College Entrance Exam Preparation, by Type of Host Institution

UB Project Provision of College Entrance Exam Preparation, by Host Institution Type

                                                                   | Two-Year % (n) | Four-Year % (n)
College Exam (SAT-ACT) Prep Services                               |                |
    Provide group preparation assistance                           |                |
    Provide one-on-one preparation assistance                      |                |
    Use on-line preparation assistance                             |                |
    Use third-party assistance (e.g., Princeton Review)            |                |
Level of Participation in College Exam Prep Services               |                |
    % of projects in which participation is required of students   |                |
    % requiring 9th grade students to participate                  |                |
    % requiring 10th grade students to participate                 |                |
    % requiring 11th grade students to participate                 |                |
    % requiring 12th grade students to participate                 |                |
Intensity of College Exam Prep Offerings                           |                |
    Mean number of hours of exam prep offered per student per year |                |
Characteristics of Exam Prep Staff                                 |                |
    Use paid UB staff                                              |                |
    Use volunteers                                                 |                |
    Use paid college students from host institution                |                |
    Use third-party group to provide service                       |                |


The study team proposes a two-stage process to help identify one or more promising approaches:


The first step will be to use the descriptive data to explore each required service area. With detailed data on the offerings, we will be able to classify service areas as either higher intensity (e.g., frequent and convenient offerings, participation requirements, one-on-one attention, or "always available" offerings) or lower intensity (e.g., infrequent and less convenient offerings, no participation requirements). With data on participation levels, we can then explore the relationship between intensity and participation. For example, identifying lower-intensity service areas with low participation levels would be one way to home in on particular service areas that could benefit from an intervention designed to increase intensity.


The second step will include (a) identifying strategies that would address those lower-intensity offerings, (b) searching for fully formed interventions that contain those strategies, and, if none exist, (c) developing interventions that incorporate those strategies.
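The first-stage screen described above can be sketched in a few lines. Everything in this example is an assumption for illustration: the service-area records, the single "offered weekly" intensity indicator, and the 50 percent participation cutoff are hypothetical stand-ins for the richer classifications the study would actually use.

```python
# Hypothetical service-area records; fields and values are illustrative
# stand-ins for the study's intensity and participation measures.
service_areas = [
    {"area": "Academic tutoring", "offered_weekly": True,  "participation_rate": 0.80},
    {"area": "Exam preparation",  "offered_weekly": False, "participation_rate": 0.35},
]

LOW_PARTICIPATION = 0.50  # assumed cutoff, not a value from the study plan

def candidates_for_intervention(areas):
    """Flag service areas with lower-intensity offerings AND low participation,
    i.e., the areas the first stage would nominate for an intervention."""
    return [a["area"] for a in areas
            if not a["offered_weekly"] and a["participation_rate"] < LOW_PARTICIPATION]

print(candidates_for_intervention(service_areas))  # ['Exam preparation']
```

The nominated areas would then feed the second stage: matching them to existing interventions or, failing that, developing new ones.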



A.17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


All data-collection instruments and associated forms will display the expiration date for OMB approval.


A.18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.


No exception is requested or needed.

1 The original OMB submission (1850-NEW (04471)), for site visits to 20 Upward Bound grantees, was submitted on 4/20/11 and later withdrawn.

2 U.S. Department of Education, Office of the Under Secretary, Policy and Program Studies Service, The Impacts of Regular Upward Bound: Results from the Third Follow-Up Data Collection, Washington, D.C., 2004.

3 This type of quasi-experimental study would not yield reliable or conclusive results because of the inability to fully isolate the effects of any particular implementation strategies. Upward Bound projects differ in the students they serve, the relationship between the grantee “host institution” and its participating high schools, the level of external resources, and many other factors in addition to their approaches to implementing core components and services. To measure the effects of implementing a particular (potentially promising) approach to a UB component an evaluation would need to statistically “control for” everything else that might differ among projects and influence outcomes; there would inevitably be unknown or not easily measured factors that could have a powerful effect on outcomes.

4 IES has also awarded a separate contract with an option to conduct an RCT of a promising practice within Upward Bound, once such a practice or strategy can be identified under the current contract.

5 Moore, Mary T. “The National Evaluation of Upward Bound: A 1990's View of Upward Bound Programs Offered, Students Served, and Operational Issues.” Washington, D.C.: U.S. Department of Education, 1997.

6 During the 60-day public comment period, comments were received from four members of the TRIO community, including the advocacy group representing UB and other TRIO programs. Input was also solicited from three experts. In all, comments and input came from no more than eight individuals.

7 Millar, Morgan M., and Donald Dillman. "Improving Response to Web and Mixed-Mode Surveys." Public Opinion Quarterly, Volume 75, Number 2, Summer 2011; Heerwegh, Dirk. "Effects of Personal Salutations in Email Invitations to Participate in a Web Survey." Public Opinion Quarterly, Volume 69, Number 4, Winter 2005.

http://survey.sesrc.wsu.edu/dillman/papers/websurveyppr.pdf, accessed 6 November 2002.

