LSAMP Supporting Statement

EHR Generic Clearance

OMB: 3145-0136

Supporting Statement (3145-0136)

Request For Clearance: National Science Foundation, Directorate for Education and Human Resources, Division of Human Resource Development

Distance Monitoring of Louis Stokes Alliances for Minority Participation (LSAMP)

Attachment F

Section A

Introduction

This request for Office of Management and Budget (OMB) review is part of the renewal process for the National Science Foundation (NSF) Directorate for Education and Human Resources (EHR) Generic Clearance, OMB 3145-0136, which will expire on January 31, 2008. The EHR Generic Clearance includes collections of information about NSF's education and training (E&T) activities. This particular request addresses management or monitoring for the Louis Stokes Alliances for Minority Participation (LSAMP) program within EHR's Division of Human Resource Development (HRD). This task is the oldest survey task in the EHR Generic Clearance; LSAMP was one reason OMB requested that NSF apply to establish an EHR Generic Clearance for program and project monitoring in 1995.

A.1. Circumstances Requiring the Collection of Data

In the early 1990s the LSAMP program began as a multidisciplinary comprehensive undergraduate program designed to increase the quantity and quality of minority students receiving baccalaureate degrees in science, technology, engineering, and mathematics (STEM).  In the early 2000s, LSAMP's programmatic design expanded to include financial support for activities that encourage enrollment in full-time graduate study.  For example, the LSAMP Bridge to the Doctorate activity allows eligible, currently funded LSAMP projects to apply to NSF to receive additional funds to support graduate students, particularly those traditionally underrepresented in STEM fields, to pursue and attain a doctorate in a STEM field supported by NSF.

The LSAMP program requires funded projects to form and maintain multi-institutional alliances. The LSAMP program funds projects that address processes and factors that promote baccalaureate and graduate degree attainment, preparation for graduate study, and preparation for successful STEM careers outside of the higher education enterprise. The latest LSAMP solicitation is available at http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf07566&org=NSF.

The LSAMP program supports the NSF strategic outcome goal of cultivating "a world-class, broadly inclusive science and engineering workforce," and expanding "the scientific literacy of all citizens," labeled as 'Learning' on page 5 of the FY 2006-2011 Strategic Plan, http://www.nsf.gov/pubs/2006/nsf0648/NSF-06-48.pdf. In particular, the program will help promote NSF's Learning-related investment priorities to "develop methods to effectively bridge critical junctures in STEM education pathways" and "prepare a diverse, globally engaged STEM workforce" (reference page 7 of the plan).

Data collected from LSAMP alliances through the monitoring system are needed by NSF for project and program monitoring, to fulfill policy and program reporting needs, and to serve as preliminary work for future impact assessment and evaluation activities. The data collected as part of  OMB 3145-0136 allow NSF officials to document the overall program investment in  individual alliances, and make future funding and program policy decisions.

Data collection instruments are included in appendices.

A.2. Purposes and Uses of the Data

The information collected in this task is required for effective administration, communication, and program and project monitoring; for meeting reporting requirements; for measuring attainment of NSF's program, project and strategic goals as laid out in NSF’s Strategic Plan; and as a baseline for future program evaluations.

 

The primary purpose of this collection is program management, also known as program monitoring. This data collection activity is designed to track the extent to which LSAMP awards meet the objectives of the program. Within the HRD division, this information is used to administer and monitor the progress of the program. The findings are used to recommend, among other things, administrative changes in program functions, level of award support, individual program focus and emphasis, and recruiting efforts. 

 

The LSAMP program also uses the data to fulfill reporting requirements. As a part of its performance assessment activities, NSF relies on the judgment of external experts to maintain high standards of program management. Directorate and Office advisory committees (ACs) meet twice a year, while Committees of Visitors (COVs) for divisions or programs meet once every three years. Data collected in the LSAMP monitoring system may be used to report to these committees on program activities. In addition, NSF is required to measure the attainment of its program, project and strategic goals by the President's Management Agenda as represented by the Office of Management and Budget's (OMB) Program Assessment Rating Tool (PART), by the Government Performance and Results Act (GPRA) of 1993, and by NSF's Strategic Plan. Data collected in the LSAMP monitoring system help NSF management examine progress towards the Foundation's goals and respond to these reporting requirements.

 

Finally, the data can also be used as a preliminary step in more detailed future evaluation efforts, such as the sort of rigorous evaluations described in the May 2007 Report of the Academic Competitiveness Council, which was established by the Deficit Reduction Act of 2005 (P.L. 109-171) to serve as a multi-agency effort to identify federal STEM education programs and establish their effectiveness. The full ACC report can be accessed at http://www.ed.gov/about/inits/ed/competitiveness/acc-mathscience/index.html.

Under the LSAMP monitoring system, each LSAMP alliance and institution provides annual data using the Web-based data collection system (see appendix A). The following is an overview of the types of information collected: 

  • Alliance Data: The alliance respondent is asked to provide summaries of alliance-supported activities (e.g., student activities, faculty development), alliance accomplishments and obstacles to program goals.  The alliance respondent is also asked to provide line item budget data for the current reporting year. Additionally, alliances are asked to name and describe their nonacademic partners.
  • Institution data: Since LSAMP alliances involve a number of academic institutions, specific data about each participating institution are collected. Institution respondents are asked to provide counts of student enrollment and degrees awarded by field of study, gender, race/ethnicity and academic level.  The Web survey includes a data collection screen for each field of study by academic level (e.g., sophomore) and for each field of study by degree (e.g., Bachelor's).  The screenshots in Appendix A provide an example of each screen using the field of Agricultural Science.  There are nine additional fields of study included in the Web survey: Chemistry, Computer Science, Engineering, Geosciences, Life/Biological Sciences, Mathematics, Physics/Astronomy, Environmental Science, and Non-STEM fields.  In addition to counts of student enrollment and degrees awarded, institutions report the number of incoming graduate students who received direct LSAMP support as undergraduates and provide descriptions of sponsored activities and a count of students that participated in each activity.
  • Data on Individuals: Some information is collected about all students and faculty participating in LSAMP.  Name, Social Security number (SSN), gender, race, ethnicity, disability status and field of study are collected on all individuals.  Faculty rank is also collected.  Additional student data includes class (e.g., sophomore), GPA, mentor's name, whether the student graduated during the current reporting year, and whether the student received financial support during the academic year and/or summer.  A checklist of LSAMP activities in which the student participated is included.  

In addition, a second module in the LSAMP system will collect information from institutions participating in the LSAMP Bridge to the Doctorate activity. In order to measure the effectiveness of this additional component (funding for graduate students) in the LSAMP program, PIs will be asked to report the following kinds of data on students receiving funding and on additional comparable students at their institutions:

  • Demographic data
  • Undergraduate major, GPA, and research experience
  • Graduate field
  • Stipends received from the LSAMP program
  • Status of graduate degree work
  • Field/type of student employment

A preliminary version of this module, in which Bridge students were respondents, was cleared in 2005. Since that time, the start of this data collection was postponed while the elements to be collected were modified. When the Bridge to the Doctorate data collection begins in 2008, only Bridge data coordinators, and not graduate students, will be asked to respond. To see the instrument for this module, see Appendix B, and for more details on the differences between the preliminary instrument cleared in 2005 and the current instrument, see the item crosswalk in Appendix C.

Other than changes to the "Bridge" component, there are no changes to the LSAMP data collection system. See Appendix C for a detailed list of data elements.

A.3. Use of Information Technology To Reduce Burden

EHR typically uses Web-based systems because they can facilitate respondents' data entry across hardware and software platforms.  An innovative feature of many of the individual Web systems designed by Macro International Inc. for NSF is the thorough editing of all submitted data for completeness, validity and consistency.  Editing is performed as data are entered.  Most invalid data cannot enter the system, and questionable or incomplete entries are called to respondents' attention before they are submitted to NSF.

LSAMP surveys employ user-friendly features such as automated tabulation, data entry with custom controls such as checkboxes, data verification with error messages for easy online correction, standard menus, and predefined charts and graphics.  All these features facilitate the reporting process, provide useful and rapid feedback to the data providers, and reduce burden.

The WebAMP system was first used in the spring of 1998 in response to user requests for an improvement over the previously used disk-based survey and to minimize burden. The 508-compliant Web-based software facilitates respondents' data entry by ensuring more complete and correct data submissions, thus reducing the need for follow-up after a response is submitted to NSF.  Fields are also marked with out-of-range indicators, and respondents are warned to check their data if they appear to be out of range.

Under WebAMP, respondents see data submitted in previous collection cycles (if any). Most projects (alliances) have a multi-year lifecycle (often five years or longer), so this feature makes correcting or completing a previous year's data, particularly data on student enrollments, far easier and less burdensome than re-entering the data.  Additionally, because the collection is Web-based, minor bugs or item formatting problems can be (and have been) easily corrected in response to user feedback.

A.4. Efforts To Identify Duplication

This system does not duplicate other NSF efforts. Comparable data are not currently being collected on an annual basis for the LSAMP program.  In addition, the collection is coordinated with the NSF FastLane Project Reports system (OMB 3145-0058) to ensure that the two collections do not collect similar data. As much as possible, data from other NSF monitoring collections are used to pre-fill LSAMP items, further minimizing overall response burden. Additionally, aggregate data are being shared with NSF-funded researchers as appropriate, thereby minimizing the possibility that other researchers will duplicate these efforts in their own future collections.

A.5. Small Business

No information is to be collected from small businesses.

A.6. Consequences of Not Collecting the Information

Without this information, NSF would be restricted in managing and reporting on the activities of awards in the LSAMP program. Without this feedback, NSF would have no way of making systematic modifications to the LSAMP program (e.g., adequacy of funding amount, duration of award, and institutional supports needed). These data will ensure that NSF makes informed decisions about future directions of the LSAMP program. The information requested here is not available elsewhere.

 

Additionally, without this information NSF would find it difficult to meet GPRA and PART reporting requirements and would be unable to comply fully with congressional and presidential mandates that the Foundation assess its STEM education programs.

A.7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

The data collection will continue to comply with 5 CFR 1320.6.

A.8. Consultation Outside the Agency

The notice inviting comments on the EHR Generic Clearance (OMB 3145-0136) was published in the Federal Register August 24, 2007, Volume 72, Number 164, page 48694. No comments were received.

During initial system development, principal investigators (PIs) from LSAMP awards reviewed the system; their responses to the survey and their assessments of the institution survey were taken into account in the development of the system.  Changes in the system since initial development are informed by ongoing consultations with the respondents, Macro International Inc. (the contractor that designed the Web interface and database system), and Abt Associates, Inc. (the contractor that produces reports and presentations of aggregate data).  Macro International currently maintains the surveys and survey databases and provides technical support to respondents as needed.

A.9. Payments or Gifts to Respondents

No payments or gifts will be provided to respondents.

A.10. Assurance of Confidentiality

Data collected under this task are only available to the respondents, NSF, and the firms hired to manage the data and data collection software. Data are processed according to Federal and State privacy statutes. To protect privacy, only composite data or graphical representations will be released to the public.

 For the collection covered by this clearance request, when respondents are presented with the first screen of the survey, they are additionally instructed as follows: "Information from this data collection system will be retained by the National Science Foundation, a federal agency, and will be an integral part of its Privacy Act System of Records in accordance with the Privacy Act of 1974 and maintained in the Education and Training System of Records 63 Fed. Reg. 264, 272 (January 5, 1998). These are confidential files accessible only to appropriate National Science Foundation (NSF) officials, their staffs, and their contractors responsible for monitoring, assessing, and evaluating NSF programs. Only data in highly aggregated form, or data explicitly requested as "for general use," will be made available to anyone outside of the National Science Foundation for research purposes. Data submitted will be used in accordance with criteria established by NSF for monitoring research and education grants, and in response to Public Law 99-383 and 42 USC 1885c. The Social Security number (SSN) will be maintained in accordance with the requirements of the Privacy Act of 1974. Submission of the SSN is voluntary. It is used for survey quality control, program evaluation, and for matching with other data sets maintained in the Education and Training System of Records 63 Fed. Reg. 264, 272 (January 5, 1998)."

A.11. Questions of a Sensitive Nature

In some cases, instruments request information from respondents including name, address, Social Security number (SSN), date of birth, and grade point average (GPA).  These data are collected in order to monitor the award sites and measure the progress of the individual award projects.

LSAMP requests information on gender, race/ethnicity, disability (if any), academic discipline, and class in order to monitor the sites' participant populations. GPA, graduation status, enrollment status, mentor, financial support indicators, and activity participation are needed to assess the impact of NSF's grant investment.

Names and Social Security numbers are collected to permit tracking of program participants across time and place (e.g., from 2-year to 4-year institutions to Ph.D.-granting institutions) within a particular alliance or across alliances. Respondents have the option of not providing information that they consider privileged by marking the "not reported" option or by leaving their Social Security numbers blank. In addition, individual participant activity status is requested, not required. Respondents are advised that identifiable data are provided only to LSAMP program staff and NSF contractors conducting studies in compliance with the Privacy Act.

A.12 Estimates of Response Burden

A.12.1. Number of Respondents, Frequency of Response, and Annual Hour Burden

The total number of annual respondents is 415 (80 project PIs/Co-PIs; 300 LSAMP institution personnel; and an estimated 35 Bridge to the Doctorate data coordinators involved in the Bridge to the Doctorate module) and the total annual person-hours is 14,380.

The Web-based collection is an annual activity of the LSAMP program. There are approximately 40 LSAMP alliances with 2 or more co-PIs and project personnel at alliance institutions.   New alliances (and institutions within currently funded alliances) will be added to the program over the next three years.   The new institutions enter at approximately the same rate that alliances or institutions leave the program or project as their funding expires.

The annualized burden for the component surveys in the current task (PI and Institution Personnel) was calculated by taking the average number of respondents from previous survey cycles and estimating their response burden, based on a question in the Web-based data collection asking how long it takes respondents to complete the survey.  The annual burden for the new component survey addressed to LSAMP Bridge program managers was estimated using the burden reported in similar monitoring systems.  The Bridge activity began in 2003, and during the first year of Bridge data collection, coordinators will enter data for past years of support activity. The average annual burden for these respondents is estimated at 20 hours; this burden is expected to be lower in the second and third years of data collection, after past years' data have been entered. The burden estimates for each type of respondent are outlined below:

Type of Respondent | Average Number of Respondents | Burden Hours Per Respondent | Annual Person-Hours
PIs/Co-PIs | 80 | 36 hours | 2,880
LSAMP institution personnel | 300 | 36 hours | 10,800
Bridge to the Doctorate data coordinators | 35 | 20 hours | 700
Total respondents | 415 | | Total estimated hours: 14,380
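
For clarity, the totals shown above are simply the sum of each respondent group's count multiplied by its per-respondent burden (a restatement of the table's figures, not new data):

\[
80 + 300 + 35 = 415 \text{ respondents}
\]
\[
(80 \times 36) + (300 \times 36) + (35 \times 20) = 2{,}880 + 10{,}800 + 700 = 14{,}380 \text{ person-hours}
\]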

A.12.2. Hour Burden Estimates by Each Form and Aggregate Hour Burdens

As mentioned above, respondents will be project PIs, Co-PIs, other project personnel, and data coordinators for the Bridge to the Doctorate component of the LSAMP program.  The total annual response burden is 14,380 person-hours. The annual burden by form was calculated as follows:


Form Type | Respondent Type | Number of Respondents | Burden Hours Per Respondent | Total Person-Hours
LSAMP Annual Survey | PIs, Co-PIs, LSAMP institution personnel | 380 | 36 hours | 13,680
Bridge to the Doctorate module | Bridge to the Doctorate data coordinators | 35 | 20 hours | 700

A.12.3. Estimates of Annualized Cost to Respondents for the Hour Burdens

The overall cost to the respondents is estimated to be $244,560. The following table shows the annualized estimates of costs to respondents.  Estimated PI hourly rates are based on a report in the April 20, 2007, edition of The Chronicle of Higher Education (2007. “What Professors Earn.” The Chronicle of Higher Education, 53(33), Washington, D.C.: The Chronicle of Higher Education, Inc.). According to the report, the average salary of an associate professor across all types of doctoral-granting institutions (public, private, church-related) was $76,639. When divided by the number of standard annual work hours (2,080), this comes to approximately $37 per respondent hour.

 

Respondents | Number of Respondents | Hours per Respondent | Average Hourly Rate | Total Annual Costs
Project PIs | 80 | 36 hours | $37 | $106,560
Institution personnel | 300 | 36 hours | $12 | $129,600
Bridge to the Doctorate data coordinators | 35 | 20 hours | $12 | $8,400
Total estimated costs | | | | $244,560
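
For clarity, the hourly rate and the line totals above follow directly from the figures cited (the $12 rate used for institution personnel and data coordinators is the rate shown in the table itself):

\[
\$76{,}639 \div 2{,}080 \text{ hours} \approx \$36.85 \approx \$37 \text{ per hour}
\]
\[
(80 \times 36 \times \$37) + (300 \times 36 \times \$12) + (35 \times 20 \times \$12) = \$106{,}560 + \$129{,}600 + \$8{,}400 = \$244{,}560
\]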

A.13. Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record Keepers

There is no overall annual cost burden to respondents or record-keepers that results from the distance monitoring of the LSAMP program other than the time spent responding to the data collection instrument attached as Appendix A to this request.

It is usual and customary for individuals involved in education and training activities in the United States to keep descriptive records. The information being requested is from records that are maintained as part of normal educational or training practice. Furthermore, the majority of respondents are active or former grantees or participants in programs or projects once funded by NSF. In order to be funded by NSF, institutions must follow the instructions in the NSF Grant Proposal Guide (GPG), which is cleared under OMB 3145-0058. The GPG requires that all applicants submit requests for NSF funding and that all active NSF awardees do administrative reporting via FastLane, an Internet-based forms system. Thus, LSAMP PIs and program personnel make use of standard office equipment (e.g., computers) and Internet connectivity that are already required as startup and maintenance costs under the NSF GPG.

A.14. Estimates of Costs to the Federal Government

The annualized cost to NSF for the LSAMP data collection was computed by taking the budgets for three years and calculating the costs for each of the following operational activities involved in producing, maintaining, and conducting the LSAMP data collection:

Operational Activities | Cost Over 3 Years
System Development (includes initial development of the database and Web-based application, and later changes requested by the program, e.g., increased reporting tools, additional validations) | $215,500
System Maintenance, Updates, and Tech Support (the system requires updates each year before opening the collection; maintenance is required to keep the system current with technology, e.g., database servers, operating systems) | $104,675
Data Collection Opening and Support (e.g., online and telephone support to respondents and contacting respondents to encourage completion of the questions), Reporting (as defined by HRD), and Follow-up Activities (e.g., providing data to other consultants) | $135,500
3-Year Total for All Operational Activities | $455,675

The annualized cost was computed as one-third of the total 3-year costs; thus, the annualized cost to NSF for the LSAMP data collection is $151,892.

A.15. Changes in Burden

In this request for renewal, the number of respondents has decreased from the 2005 clearance, although the total hour burden has increased slightly. In the 2004 clearance request the burden was 13,336 hours for 701 respondents. This renewal request is for 14,380 hours for 415 respondents. This is due to an adjustment in the data collection plan for the Bridge to the Doctorate component of the collection. In the 2005 clearance, NSF proposed collecting data from each of the 350 Bridge to the Doctorate participants; NSF now plans to collect data from the 35 Bridge to the Doctorate data coordinators. This reduced the number of respondents, but the data coordinators are expected to have a higher burden than the individual participants (20 hours each instead of 2), causing the slightly higher burden. Other than this change, small adjustments have been made to the number of PIs and institution personnel to reflect variations in awards and staffing. There are no changes to the LSAMP annual survey that would affect burden. For more details on the instruments, see the crosswalks in Appendix C.

A.16. Plans for Publication, Analysis, and Schedule

Data collection begins in June each year and ends in October. NSF program officers extend the October deadline upon request of the respondents.  Once the data collection has been completed, agency staff can access the data through the on-line system as needed.

Like many agencies, NSF is reducing its reliance on formal (i.e., traditional) publication methods and publication formats.  Macro International Inc., the contractor that manages the data collection Web site and database, is contractually forbidden from publishing results unless NSF instructs it to do so.  In short, all products of the collections are the property of NSF, and NSF is the exclusive publisher of the information being gathered.

The data from this collection are used primarily for internal review purposes and to monitor the LSAMP alliances, as well as for baseline data in NSF-contracted third-party program evaluations and descriptive analysis studies used in reporting to Congress (e.g., the GPRA Annual Performance Plan and the Program Assessment Rating Tool (PART)). Reports to NSF management, PIs, OMB and Congress deal with characteristics and performance of the LSAMP program and may include statistical tables and charts generated from the LSAMP surveys.

Data from these surveys may be used for NSF reports addressing the goal of increasing minority participation in STEM education and research. For example, in 2000, NSF, the National Aeronautics and Space Administration (NASA), and the National Institutes of Health (NIH) participated in a joint project (Study of Services for Underrepresented Students) that described the activities supported by these programs, which share a joint goal of increasing the participation of traditionally underrepresented minorities in undergraduate study in STEM fields. The final report highlighted methods that promote the achievement of traditionally underserved students in STEM fields. That report became an NSF publication, A Description and Analysis of Best Practice Findings of Programs Promoting Participation of Underrepresented Undergraduate Students in Science, Math, Engineering, and Technology Fields, December 2000, Westat (NSF 01-31).  NSF 01-31 makes passing references to the survey data.

During the 2001-2004 clearance period, in accordance with OMB approval, NSF provided the historic LSAMP database (note that the LSAMP surveys are sometimes called MARS, a name referring to the pre-Web method of delivering the survey) to NSF's contractor, the Urban Institute. The Urban Institute's evaluative study of the LSAMP program was cleared through OMB under OMB 3145-0190, and the Final Report on the Evaluation of the National Science Foundation Louis Stokes Alliances for Minority Participation Program was released in November 2005; more information on the report is available at http://www.nsf.gov/pubs/2007/nsf0701/pdf/19.pdf.

A.17. Approval to Not Display Expiration Date

Not applicable.

A.18 Exceptions to Item 19 of OMB Form 83-I

No exceptions apply.

Section B

Introduction

B.1. Respondent Universe and Sampling Methods

The sample is the entire universe of LSAMP projects, which consists of an annual average of 40 multi-year grants and cooperative agreements made by NSF to eligible lead institutions of higher education (IHEs). Each lead awardee has multiple partner IHEs as subawardees or collaborative awardees.  The individual respondents come from both a project's lead institution and its partnering institutions of higher education. The annual average number of individual respondents is 415.  As mentioned above in Section A, the respondents include a project's PIs/Co-PIs, other project personnel, and data coordinators. This annual number of 415 is expected to remain stable throughout the clearance period.

Population | Estimated Universe Size | Sample Size
LSAMP Project Participants | 415 | 415

 

B.2. Information Collection Procedures/Limitations of the Study

This data collection uses a Web-based survey.  Participating individuals from each LSAMP project provide descriptive data each year for the duration of their NSF funding.  The data are primarily useful for program management, monitoring and descriptive analysis. 

NSF understands the limitations of the data collection, particularly in terms of using the data to determine program effectiveness. Data collected through the LSAMP system are not used to determine the ultimate effectiveness of its STEM educational interventions, but are used in program planning and management, to report on agency activities and goals, and to lay the groundwork for future evaluations.

B.2.1. Statistical Methodology for Stratification and Sample Selection

This data collection is a census, so no sampling is required.

B.2.2. Estimation Procedure

Not Applicable

B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification

Not Applicable

B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

Not Applicable

B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

Not Applicable

B.3. Methods for Maximizing the Response Rate and Addressing Issues of Nonresponse

Past collections have had 100 percent response rates, and NSF anticipates that the rate will remain the same.  The collection is part of the reporting required of LSAMP projects to maintain their NSF funding.  Additionally, considerable effort is made to follow up with alliances and institutions that have not provided complete reports.  E-mail reminders are sent at regular intervals during the collection cycle, and phone calls are made to alliance personnel as the end of the collection cycle approaches.  Examples of the e-mails announcing the opening of the system and reminding respondents to log in and enter data are included in Appendix D.

B.4. Tests of Procedures or Methods

This system has been operational since 1998.  Most alliance PIs tested the system while it was in development and provided valuable feedback.  Additionally, respondents continually provide feedback on system improvements. Most of the items and response categories utilized in this survey follow formats that are already in place in other NSF monitoring systems. The Bridge to the Doctorate data collection module is based on other NSF monitoring systems and has been pilot tested with potential respondents.

 

B.5. Names and Telephone Numbers of Individuals Consulted

Agency

A. James Hicks, National Science Foundation, (703) 292-4668

Contractors

Macro International Inc. will be responsible for data collection and analysis under the direction of Lea Mesner, (301) 657-3077.
