
Education and Human Resources Program Monitoring Data Collections

OMB: 3145-0226


Supporting Statement (Part A; Control No. 3145-0226, 2019 renewal)



REQUEST FOR RENEWAL OF EHR

PROGRAM MONITORING DATA COLLECTIONS


Forms Clearance Package


Submitted by:


National Science Foundation

2415 Eisenhower Avenue

Alexandria, VA 22314

Executive Summary

This is a request for renewal of OMB approval for the National Science Foundation’s “Education and Human Resources Program Monitoring Data Collections” (OMB Control No. 3145-0226). Specifically, NSF requests approval to continue collecting data for seven data collections that have similar elements and purposes and provide essential information for program monitoring. Attachments are included for each collection, providing collection overviews (crosswalks of items to common collection categories, estimates of hour burdens and annualized costs to respondents and the Federal government, and the collection items; see attachments A1 through G1) and screenshots of the online collection instruments. Four other collections included in the prior request for renewal are not included in this renewal request; these legacy programs have been replaced by new programs, combined with other programs, or ended, or they no longer require program-specific monitoring data collections.

Section A

Introduction

The National Science Foundation (NSF) is an independent federal agency that supports research at the frontiers of knowledge, across all fields of science and engineering (S&E) and S&E education (NSF, “Building the Future: Investing in Discovery and Innovation,” NSF Strategic Plan for Fiscal Years (FY) 2018–2022, NSF 18-045). NSF is “the funding source for approximately 27 percent of the total federal budget for basic research conducted at U.S. colleges and universities.”1 The Foundation awards grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations, and other research organizations throughout the U.S.2


Within NSF, the Directorate for Education and Human Resources (EHR) has primary responsibility for promoting rigor and vitality within the Nation’s science, technology, engineering, and mathematics (STEM) education enterprise to further the development of the 21st century’s STEM workforce and public scientific literacy. In order to support the development of a diverse and well-prepared workforce of scientists, technicians, engineers, mathematicians, and educators and a well-informed citizenry that has access to the tools of science and engineering, EHR’s mission includes identifying means and methods to promote excellence in U.S. STEM education at all levels and in all settings (both formal and informal). To these ends, EHR provides support for research and implementation activities that may improve STEM learning and education from pre-school through postdoctoral studies, in traditional and non-traditional venues, among all United States citizens, permanent residents, and nationals. EHR also focuses on broadening participation in STEM learning and careers, particularly among those individuals traditionally underrepresented and underemployed in the STEM workforce, including, but not limited to, women, persons with disabilities, and racial and ethnic minorities.


This request seeks renewal of OMB 3145-0226 for seven data collections that have similar elements and purposes and provide essential information for program monitoring purposes. The collections contain items in two categories of programs (i.e., scholarship/fellowship programs and implementation, development, and research programs).


Data collected by EHR program monitoring systems are used for program planning, management, evaluation, and audit purposes. Summaries of monitoring data are used to respond to queries from Congress, the public, NSF’s external merit reviewers who serve as advisors, including Committees of Visitors (COVs), and NSF’s Office of the Inspector General. These data are needed for effective administration, program and project monitoring, evaluation, and measuring attainment of NSF’s program and strategic goals, consistent with the Government Performance and Results Act (GPRA) Modernization Act of 2010 and NSF’s Strategic Plan.


The seven program-specific collections included in this request (see attachments A1 through G3) are designed to assist in the management of specific programs, divisions, or multi-agency initiatives and to serve as data resources for program evaluations. Of the seven collections contained in this request, two collect data from the remaining projects in legacy programs that are no longer making new awards (identified in Exhibit 1). EHR believes it is important to complete the collection of data from currently active projects in those programs to ensure that there is a complete data repository from all projects in those programs for use in a future evaluation or research project. Four other collections included in the prior request for renewal—for the Advancing Informal STEM Learning (AISL), Graduate STEM Fellows in K-12 (GK-12), Research in Disabilities Education (RDE), and Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics (TUES) programs—are not included in this renewal request. Because of changes in program focus and emphasis since this collection was last cleared, these legacy programs have been replaced by new programs, combined with other programs, or ended, or they no longer require program-specific monitoring data collections. Accordingly, the level of monitoring activity and the total burden for the collections covered by this approval have decreased.


Exhibit 1: Collections covered by this request

Program | Type of Program
Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | Implementation, Development, & Research
Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System (no longer making new awards) | Scholarships and Fellowships
Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | Implementation, Development, & Research; Scholarships and Fellowships
Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | Scholarships and Fellowships
Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | Scholarships and Fellowships
Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | Scholarships and Fellowships
Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System (no longer making new awards) | Implementation, Development, & Research

A.1. Circumstances Requiring the Collection of Data

EHR is responsible for analyzing and evaluating STEM education and human resource development activities and research in NSF’s Education and Training (E&T) portfolio.


EHR Monitoring Systems Clearance

Since the original request for this collection, EHR has continued to refine and enhance strategies for generating evidence to inform assessments of, and decisions regarding, EHR programs and portfolios of investments. For example, EHR has collaborated on NSF-wide initiatives, led by the NSF Office of Integrative Activities’ Evaluation and Assessment Capability (EAC), that are intended to coordinate evaluation and monitoring efforts across NSF, including methods and processes for collecting standard data about projects that are currently limited to annual and final reports, as well as the new reporting requirements of the Research Performance Progress Report (RPPR). With support from EAC, a pilot study compiling a human capital data asset inventory was completed; a common item bank of program monitoring survey questions was assembled; and analyses were conducted to assess the feasibility of using monitoring data to address formative and summative evaluation questions.

EHR’s Evaluation and Monitoring Group (EMG) continues to assess the ways in which monitoring data are used within the Directorate, and plausible alternative sources of information (including NSF enterprise data) to meet EHR’s (and other stakeholders’) needs for evidence on the Directorate’s STEM workforce development, broadening participation and institutional capacity, and STEM education research programs and investments. Additionally, the EMG continues to collaborate with EHR program staff and leadership, EAC, and colleagues across the Federal Government to: establish evidence-building agendas; develop strategies and processes to address questions about EHR programs and investments; and provide guidance in the development of common metrics and scalable approaches to provide robust, timely evidence for a wide range of improvement, management, evaluative, transparency, and accountability purposes.

Issues Addressed in the Initial Collection Request

The initial request that created OMB 3145-0226 addressed the extent to which monitoring data in the collection were used, focusing on two questions:

  1. Do monitoring systems collect data needed to assess programs?

These monitoring systems provide data required to assess the progress of projects in each program. The monitoring data also contribute to the overall assessment of program performance.

In the case of programs that are primarily fellowship or scholarship programs, collection of information about participants in those programs is essential to any future tracking of their progress and determination of the impact of participation in the program. As an example, the S-STEM program recently was asked to identify the number of graduate students participating in the program. The source of this information was the monitoring data, without which the program would have been unable to respond in a timely fashion.

The importance of monitoring data is illustrated by the following description of S-STEM monitoring activities, drawn from the program’s recent management plan:

The program monitoring system operated by ICF International requests and gathers responses from PIs to a common set of items on a semester/quarterly basis. The items are tailored to the information needs of the program and are strongly aligned with the program’s goals. Monitoring activities are administered through a Web-based survey in which Principal Investigators (PIs) report student scholarship recipient demographics and status. For example, the system collects and stores demographic information (e.g., gender, race/ethnicity, scholarship amount, discipline, degree program). At the start of each semester/quarter, PIs report the academic status of each unique recipient (e.g., still in school and active in project activities, graduated, left the program) and the types of activities in which the recipient participated. ICF International assists the leadership team in contacting PIs, administering the survey, and following up with PIs to ensure the collection of the requisite information for each project.

Individual program directors (PDs) monitor compliance with the semester/quarterly reporting requirement for each award on which they serve as the cognizant PD. Each PD has access to the data for the projects they manage. PDs do not sign off on Annual or Final Reports until data compliance is complete and correct. The management team utilizes data from these reports to assess and report on the impact of S-STEM on student recruitment, retention, and graduation in STEM fields.

The monitoring system is the source for both (a) the collection of information to support the documentation of program performance metrics (e.g., number of STEM majors, number and type of support activities, and number of graduates) and (b) the primary and/or secondary data source for the program evaluation and a project’s Third Review.

Programs with implementation and/or development goals utilize detailed information about the initial efforts of individual projects to track the potential impact of those efforts in successive locations. The monitoring systems collect project-level information on: the scale, scope, and state of each project along with information on types of activities implemented; results, such as publications and number of students and/or faculty involved in the project; and partners. This information is essential for program management and reporting purposes; it is used to document the development, implementation, adaptation, dissemination, and results of supported activities. In addition, the program monitoring system data are a primary source of information for program evaluation.

  2. To what extent are monitoring data used to shape questions for a third-party evaluation?

NSF policy requires the development of a management plan to accompany every program announcement and solicitation. A plan for monitoring and assessing program activities is a required element of the program management plan; this may include plans for data collection and external evaluation.

As noted in the previous clearance request, and reiterated above, EHR programs rely on their monitoring data collections to contribute to and inform such third-party evaluations. Without these data, third-party evaluators could be required to collect data about program participants and program projects after awards had been completed rather than during the period of performance of an award.

Circumstances of Data Collection


To fulfill its planning and management responsibilities, and to answer queries from Congress, OMB, and NSF management, EHR needs current and standardized information about projects in NSF’s Education and Training System of Records portfolio. This information is specifically important to support studies and evaluations by EHR, and studies by other NSF organizational units for project monitoring and effective program administration. The information is retained in accordance with the Education and Training System of Records (63 Fed. Reg. 264, 272 January 5, 1998). The Education and Training System of Records has several purposes, including:


  • providing a source of information on the demographic and educational characteristics and employment plans of participants in NSF-funded educational projects, in compliance with the Foundation’s responsibility to monitor scientific and technical resources; this enables NSF to monitor the effectiveness of NSF-sponsored projects and to identify outputs of projects funded under NSF awards for management purposes and for reporting to the Administration and Congress (e.g., further to the GPRA Modernization Act of 2010, 5 U.S.C. 306, 39 U.S.C. 2801-2805, and other requirements); and


  • creating public use files (which contain no personally identifiable information) for research purposes.


The collections covered by this request are initiative-specific, division-specific, and program-specific quantitative and qualitative data collection activities. Data from these collections focus on participant demographic detail (particularly for scholarship and fellowship programs) and on activities and outputs (i.e., the accomplishments of program grantees [projects] in terms of specific objectives). These descriptive data collections provide essential information for documenting progress toward NSF’s major performance goals, as described in NSF’s Strategic Plan (NSF 18-045): expand knowledge in science, engineering, and learning; advance the capability of the Nation to meet current and future challenges; and enhance NSF’s performance of its mission.


A.2. Purposes and Uses of the Data


Data collected under this request are required for effective program administration, program and project monitoring, evaluation, and for measuring attainment of NSF’s program and strategic goals as laid out in NSF’s Strategic Plan. This section describes how data to be collected under the clearance authority will be used for internal program management and administration; as a data source for NSF’s performance assessment activities, including COVs and Directorate and Office Advisory Committees; for documenting the attainment of NSF’s program and strategic goals; and as a foundation for rigorous assessment of the effectiveness of STEM education programs.3


Program Management and Administration


One of the primary uses of data from the EHR program monitoring clearance is for the general oversight of project and program activities by EHR staff. EHR has a limited number of staff members who must monitor hundreds of projects. Large-scale data collection is an efficient and effective mechanism for program officers to track project activities. The monitoring systems that fall under OMB 3145-0226 allow program officers and other NSF staff to integrate pre-existing data from the NSF administrative data system and newly generated data in a coherent and timely manner, providing information needed to adjust program portfolios. This kind of monitoring may stimulate respondents to iteratively refine their projects’ activities, facilitate changes in program guidelines and/or NSF funding levels to a particular project, and result in improved benefits to participants in NSF projects.


Data for Performance Assessment


Data from the monitoring systems contribute to NSF’s performance assessment activities and support the larger NSF evaluation model. NSF relies on the judgment of external experts to maintain high standards of program management, to provide advice for continuous improvement of NSF performance, and to ensure openness to the research and education community served by the Foundation. COV reviews are conducted at regular intervals of approximately four years for programs and offices that recommend or award grants, cooperative agreements, and/or contracts and whose main focus is the conduct or support of NSF research and education in science and engineering.4 COV reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the results generated by awardees have contributed to the attainment of NSF’s mission and strategic outcome goals. Data collected in the monitoring systems are often used in these reviews. For example, data collected via the STEP monitoring system were included in materials for the October 2016 STEP COV and data from the CREST program monitoring system were used to inform the development of materials for the EHR Division of Human Resource Development’s (HRD’s) November 2016 COV.


Another central use of the EHR program monitoring data collected subject to this approval is to report on progress toward and document attainment of NSF program and strategic goals. The Foundation’s FY 2018–2022 Strategic Plan describes three strategic goals: (1) expand knowledge in science, engineering, and learning; (2) advance the capability of the Nation to meet current and future challenges; and (3) enhance NSF’s performance of its mission. EHR contributes to the attainment of these goals through programs of activity that:

  • prepare the next generation of STEM professionals and attract and retain more Americans to STEM careers;

  • develop a robust research community that can conduct rigorous research and evaluation that will support excellence in STEM education and that integrates research and education;

  • increase the technological, scientific and quantitative literacy of all Americans so that they can exercise responsible citizenship and live productive lives in an increasingly technological society; and

  • broaden participation (individuals, geographic regions, types of institutions, STEM disciplines) and close achievement gaps in all STEM fields.5


The seven EHR programs whose awards and activities are the subject of the data collections described in this request for renewal play critical roles in the attainment of each of these objectives. Much of the information that enables EHR to monitor their progress and report on their accomplishments (e.g., consistent with the performance and improvement requirements of the GPRA Modernization Act of 2010) is derived from the data elements collected in the monitoring systems under OMB 3145-0226.


A Foundation for Future Evaluations


EHR places a strong emphasis on evidence-based decision making, and is committed to generating robust evidence to inform the development, management, and assessment of its programs and portfolios of investment. While the monitoring systems used to collect data under this collection play a role in this work, it is understood that they are not evaluative studies. NSF does conduct program-level management reviews to ensure that programs are administered properly and in accordance with federal guidelines and agency missions. This is currently one use of data from the EHR monitoring systems.


The importance of program evaluation to generate evidence about both what ‘works’ and how programs can be improved is underscored in the Administration’s 2018 reform plan and reorganization recommendations, and Chapter 6 of the FY 2019 Budget Analytical Perspectives.6 Data collected via the monitoring efforts that are the subject of this request for renewal contribute to the formal evaluation of programs and provide regular measures of program performance. With continued collection of these data, EHR will be able to leverage program monitoring system data from prior years and more efficiently and effectively track outputs, outcomes, and progress towards the attainment of program and Directorate goals over time. More generally, access to these data is critical to realizing the commitments to building and using evidence that are central to the Administration’s vision for results-driven government and the provisions of the Foundations for Evidence-Based Policymaking Act of 2018 (Pub.L. 115-435).


EHR has encouraged the use of monitoring data in its evaluation activities, creating a foundation for robust assessments of program activities. While data collected under this clearance were not themselves used to evaluate program effectiveness, some of the data gathered through OMB 3145-0226 facilitate the design and conduct of rigorous evaluation studies.




Ways in which these monitoring data might be used in evaluation include:

  • creating a universe data set with which to compare and establish representativeness of sample data;

  • providing data with which to verify/assess quality of evaluation data; and

  • providing data with which to establish population baseline and/or trend data.

A.3. Use of Information Technology to Reduce Burden

All of the collections included under this clearance request use Web-based data collection systems to minimize data duplication and respondent burden. EHR favors Web-based systems because they facilitate respondents’ data entry across computer platforms. One innovative feature of many of the individual Web systems is that all submitted data are reviewed and edited for completeness, validity, and consistency. Editing and validation are performed as data are entered: most invalid data cannot be entered into the system, and questionable or incomplete entries are called to respondents’ attention before they are submitted to NSF.
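For illustration only, the kind of as-you-enter checking described above might resemble the following sketch; the field names and rules are hypothetical and are not drawn from any actual EHR collection instrument.

```python
# Hypothetical illustration of as-you-enter validation; the field names and rules
# are invented for this example and do not describe any actual EHR instrument.
def validate_entry(field, value):
    """Return a list of problems to show the respondent before submission."""
    problems = []
    if field == "gpa":
        try:
            gpa = float(value)
            if not 0.0 <= gpa <= 4.0:
                problems.append("GPA must be between 0.0 and 4.0.")
        except ValueError:
            problems.append("GPA must be a number.")
    elif field == "students_supported":
        if not value.isdigit():
            problems.append("Number of students must be a whole number.")
    elif field == "discipline" and not value.strip():
        problems.append("Please select a discipline before continuing.")
    return problems

# Example: the respondent is prompted to correct the entry before it reaches NSF.
print(validate_entry("gpa", "4.7"))   # ['GPA must be between 0.0 and 4.0.']
```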


EHR program monitoring Web-based data collection systems employ user-friendly features such as automated tabulation, data entry with custom controls such as checkboxes, data verification with error messages for easy online correction, standard menus, and predefined charts and graphics. All of these features facilitate the reporting process, provide useful and rapid feedback to the data providers, and reduce burden.


All collections in the EHR program monitoring clearance comply with Section 508 of the Rehabilitation Act of 1973, as amended in 1998, which mandates that electronic and information technology used by federal agencies be accessible to people with disabilities.

A.4. Efforts to Identify Duplication

The EHR program monitoring clearance does not duplicate efforts undertaken by the Foundation, other federal agencies, or other data collection agents. For example, NSF grants require the submission of annual and final project reports in accordance with OMB 3145-0058. Recipients of NSF grants, such as PIs, create and submit annual and final project reports through Research.gov. Data collected under the EHR program monitoring clearance are unique and not available in either the NSF annual or final reporting system. The introduction of the new annual and final reports based on the RPPR format has improved the submission of project information but does not change the need for additional information that monitoring systems provide on a program-specific basis.

A.5. Small Business

None of the seven collections included in this request for renewal of the EHR program monitoring clearance collects information from small businesses.


A.6. Consequences of Not Collecting the Information

Data collected for the EHR program monitoring clearance are used to manage programs, monitor projects, inform project and program evaluations, coordinate with federal and non-federal education partners, provide Congress with information about government-supported activities, and report for GPRA and other requirements. In many cases, the data need to be collected annually to inform the NSF management and evaluation processes. Data collected under the EHR program monitoring clearance can be used by NSF management to document and measure NSF’s success at achieving both strategic outcome goals and internal annual performance goals.


If the information were not collected, NSF would be unable to document the implementation of project activities and outcomes of its programs. It would be unable to meet its accountability requirements or assess the degree to which projects and programs are meeting their goals.

A.7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

All data collections will comply with 5 CFR 1320.6. All collections under the EHR program monitoring clearance ask respondents for data annually, with the exception of the IGERT monitoring system, for which awards in their final year submit both an annual and final report, and the S-STEM monitoring system, which asks respondents to submit data each semester/quarter. See attachments B1 and F1 for more information on the frequency of these collections.

A.8. Consultation Outside the Agency

The notice inviting comments on the EHR program monitoring clearance (OMB 3145-0226) was published in the Federal Register on October 31, 2018 (Volume 83, Number 211, pages 54786-54787). No substantial comments were received in response to the Federal Register notice.


When developing collection instruments, EHR routinely consults with research and evaluation experts, PIs, and educators affected by EHR investments. The purpose of these consultations is to assess the relevance, availability, and clarity of items. As suggested by OMB guidelines, these consultations also enable EHR staff to obtain a reliable estimate of the respondent burden generated by new instruments. When a new collection is added or when an existing collection is modified to add new instruments, each instrument is pretested with nine or fewer individuals and revised following debriefings with participating respondents.


For data collections conducted earlier under the EHR program monitoring clearance, consultations have included knowledgeable outsiders such as representatives of EHR contractors responsible for technical and evaluation tasks and fellows who work at the Foundation as guests under programs such as the Einstein Fellows Program or the American Association for the Advancement of Science Washington Fellows Program.

A.9. Payments or Gifts to Respondents

To date, no payments or gifts have been provided to respondents. There are no plans to provide incentives because program and project monitoring surveys are of value to the respondents as well as to NSF. Program monitoring can be used by projects as a foundation for project-level evaluation.

A.10. Assurance of Confidentiality

Respondents are informed that any information on specific individuals is maintained in accordance with the Privacy Act of 1974. Every data collection instrument displays both OMB and Privacy Act notices.


Respondents are told that data collected for the EHR program monitoring clearance are available to NSF officials and staff, evaluation contractors, and the contractors hired to manage the data and data collection software. Data are processed according to federal and state privacy statutes. Detailed procedures followed by EHR for making information available to various categories of users are specified in the Education and Training System of Records (63 Fed. Reg. 264, 272 January 5, 1998). This system limits access to personally identifiable information to authorized users. Data submitted are used in accordance with criteria established by NSF for monitoring research and education grants and in response to Public Law 99-383 and 42 USC 1885c.


The information requested through NSF monitoring systems may be disclosed to qualified researchers and contractors in order to coordinate programs and to a federal agency, court, or party in court or federal administrative proceedings, if the government is a party.

A.11. Questions of a Sensitive Nature

Seven of the proposed collections in the EHR program monitoring clearance request information of a sensitive nature from respondents, including name, address, Social Security Number (SSN), date of birth (DOB), and/or grade point average (GPA). These data are collected in order to monitor the award sites and evaluate the success of the award programs. Information of this nature is also used to track recipients of funding and training. For example, in the IGERT survey (attachments B1 and B2), trainees’ SSNs are used as a tracking mechanism to permit follow-up studies that examine the long-term effect of the IGERT program on individuals’ success. However, in the IGERT collection and in all collections that request SSN, SSN is a voluntary field. Responses to all items of a sensitive nature are voluntary: respondents may choose not to provide information that they deem privileged, such as SSN, address, or DOB. Any individual-level data that are collected are provided only to program staff and consultants conducting studies using the data as authorized by NSF. Any public reporting of data is in aggregate form. Exhibit 2 shows which individual collections include questions of a sensitive nature.





Exhibit 2: Questions of a sensitive nature

Attachments | Collection Title | Address | DOB | GPA | Name | SSN
A1-A2 | Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | X | | | X |
B1-B2 | Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | X | | X* | X | X
C1-C2 | Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | | | X | X | X
D1-D2 | Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | X | | X | X | X
E1-E5 | Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | | X | X | X |
F1-F2 | Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | X | X | X | X |
G1-G3 | Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | X** | | | X** |

*IGERT does not collect GPAs but does collect the Graduate Record Exam scores of individual trainees.

**STEP collects names and addresses for PIs/respondents but not for individual students.



A.12. Estimates of Response Burden

A.12.1. Number of Respondents, Frequency of Response, and Annual Hour Burden

As shown in Appendix A and in Exhibit 3 below, the annual response burden for the seven collections under OMB 3145-0226 is 32,698 hours (for 2,511 respondents and 3,211 responses). Given the diversity of respondent types, the methods used to arrive at individual collection burden estimates are described in detail in attachments A1 through G1.



Exhibit 3: Respondents, responses, and annual hour burden

Attachment | Collection Title | No. of Respondents | No. of Responses | Annual Hour Burden
A1 | Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | 42 | 42 | 1,648
B1 | Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | 513 | 513 (8 PI respondents per year will submit both an annual and a final report) | 2,342
C1 | Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | 625 | 625 | 16,250
D1 | Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | 56 | 56 | 1,008
E1 | Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | 550 | 550 | 6,050
F1 | Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | 700 | 1,400 (700 respondents x 2 responses/yr.) | 4,900
G1 | Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | 25 | 25 | 500
Total | | 2,511 | 3,211 | 32,698
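As an illustrative check on the totals in Exhibit 3, the following short Python sketch (the structure and names are ours; only the figures come from the exhibit) sums the per-collection respondents, responses, and hours:

```python
# Illustrative check of the Exhibit 3 totals; the figures are taken from the exhibit,
# and the code itself is not part of any NSF system.
collections = {
    # collection: (respondents, responses, annual hour burden)
    "CREST/HBCU-RISE": (42, 42, 1648),
    "IGERT": (513, 513, 2342),
    "LSAMP": (625, 625, 16250),
    "LSAMP-BD": (56, 56, 1008),
    "Noyce": (550, 550, 6050),
    "S-STEM": (700, 1400, 4900),   # 700 respondents x 2 responses/yr.
    "STEP": (25, 25, 500),
}

total_respondents = sum(v[0] for v in collections.values())
total_responses = sum(v[1] for v in collections.values())
total_hours = sum(v[2] for v in collections.values())

print(total_respondents, total_responses, total_hours)   # 2511 3211 32698
```

The sums reproduce the 2,511 respondents, 3,211 responses, and 32,698 hours reported above.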

EHR anticipates that one additional collection may need to be cleared under the EHR program monitoring clearance during the next three years, depending on budgetary limitations and Congressional mandates. The overall response burden in any year should not exceed 40,000 hours.

Exhibit 4 presents an example illustrating how the hour burden was estimated for the CREST/HBCU-RISE monitoring system (detailed in Attachment A1). The estimated average number of annual respondents is 42 (31 CREST center PIs/program coordinators and 11 HBCU-RISE award PIs/program coordinators), with an estimated annual response burden of 1,648 hours. The Web-based data collection is an annual activity of the CREST/HBCU-RISE program. The respondents are either PIs or program coordinators; generally, one PI or program coordinator per award completes the questionnaire. The estimated annual hour burden per respondent was determined using the burden information reported by respondents during the last two collection cycles.




Exhibit 4: Sample calculation of burden hour estimate

Respondent Type | Estimated Average Annual No. of Respondents | Estimated Average Annual Burden Hours Per Respondent | Estimated Annual Burden Hour Total
CREST center PIs/program coordinators | 31 | 45 | 1,395
HBCU-RISE award PIs/program coordinators | 11 | 23 | 253
Total | 42 | 39.24 | 1,648
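For illustration, the arithmetic behind Exhibit 4 can be expressed as a short Python sketch; the sketch is not part of any collection instrument and the variable names are ours, but the figures are those reported in Attachment A1:

```python
# Illustrative reproduction of the Exhibit 4 arithmetic; figures from Attachment A1.
respondent_groups = {
    # respondent type: (annual respondents, burden hours per respondent)
    "CREST center PIs/program coordinators": (31, 45),
    "HBCU-RISE award PIs/program coordinators": (11, 23),
}

total_hours = sum(n * hours for n, hours in respondent_groups.values())   # 1,648
total_respondents = sum(n for n, _ in respondent_groups.values())         # 42
average_hours_per_respondent = total_hours / total_respondents            # ~39.24

print(total_respondents, total_hours, round(average_hours_per_respondent, 2))
```

Multiplying each respondent group’s count by its per-respondent burden and summing gives the 1,648-hour total; dividing by the 42 respondents gives the weighted average of approximately 39.24 hours per respondent.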


A.12.2. Hour Burden Estimates by Each Form and Aggregate Hour Burdens

Details on the burdens of each form can be found in attachments A1 through G1. Exhibit 5 provides an example of how this burden was estimated for the CREST/HBCU-RISE monitoring system (details in Attachment A1):


Exhibit 5: Sample calculation of hour burden estimates by form

Form Type | Respondent Type | No. of Respondents | Burden Hours Per Respondent | Total Burden Hours
CREST/HBCU-RISE data collection form | PIs/program coordinators | 42 | 39.24 | 1,648
Total | | 42 | | 1,648

A.12.3. Estimates of Annualized Cost to Respondents for the Hour Burdens

As shown in Appendix A, the total annual cost to respondents generated by the seven ongoing data collections is currently estimated to be $1,227,195. Following is an example of the method used to calculate cost burden for the CREST/HBCU-RISE monitoring system (details in Attachment A1):


The overall annualized cost to the respondents is estimated to be $75,812. Exhibit 6 shows the annualized estimate of cost to PI/program coordinator respondents, who are generally university professors. The estimated hourly rate is based on a report from the American Association of University Professors, “The Annual Report on the Economic Status of the Profession, 2017–18,” Survey Report Table 1. According to this report, the average salary across all academic ranks and across all types of doctoral-granting institutions (public, private-independent, and religiously affiliated) was $95,176. Dividing by the number of standard annual work hours (2,080) yields an hourly rate of approximately $46.




Exhibit 6: Sample calculation of estimated annualized cost to respondents

Respondent Type | No. of Respondents | Burden Hours Per Respondent | Average Hourly Rate | Estimated Annual Cost
PIs/Program Coordinators | 42 | 39.24 | $46 | $75,812
Total | 42 | | | $75,812
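The figures in Exhibit 6 follow directly from the hourly-rate derivation above. A minimal illustrative sketch of that arithmetic (assuming the rounding described in the text; the code is not part of any NSF system) is:

```python
# Illustrative reproduction of the Exhibit 6 cost calculation.
average_salary = 95_176      # AAUP 2017-18 average salary, all ranks, doctoral institutions
annual_work_hours = 2_080    # standard annual work hours
hourly_rate = round(average_salary / annual_work_hours)   # approximately $46 per hour

respondents = 42
burden_hours_per_respondent = 39.24
annual_cost = respondents * burden_hours_per_respondent * hourly_rate

print(hourly_rate, round(annual_cost))   # 46  75812
```

The product of 42 respondents, 39.24 burden hours per respondent, and the $46 hourly rate yields the estimated annual cost of approximately $75,812.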

The costs to respondents generated by each data collection are described in attachments A1 through G1.

A.13. Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record Keepers

There is no overall annual cost burden to respondents or record keepers resulting from the EHR program monitoring clearance other than the time spent responding to the online questionnaires described in detail in attachments A1 through G1. It is usual and customary for individuals involved in education and training activities in the United States to keep descriptive records, and the information being requested comes from records that are maintained as part of normal educational or training practice. Furthermore, the majority of respondents are active or former grantees or participants in programs or projects funded by NSF. In order to receive funding, institutions must follow the instructions in the NSF Proposal and Award Policies and Procedures Guide (PAPPG), which is cleared under OMB 3145-0058. The PAPPG requires that all applicants submit requests for NSF funding and that all active NSF awardees complete administrative reporting via FastLane or Research.gov. Thus, PIs, K-12 administrators, faculty members, and college students, who are the primary respondents to the individual data collections within the EHR program monitoring clearance, respond using standard office equipment (e.g., computers), Internet connectivity that is already required as a startup and maintenance cost under OMB 3145-0058, and free software (e.g., a Web browser).

A.14. Estimates of Costs to the Federal Government

As shown in Appendix A, the total annual cost to the Federal Government of the seven ongoing data collections is currently estimated to be $1,316,674. Details of the cost of each collection can be found in Appendix A. Following is an example of the calculation of cost to the Federal Government for the CREST/HBCU-RISE data collection (details in Attachment A1). More details on the costs of existing collections can be found in attachments A1 through G1.


Computing the annualized cost to NSF for the CREST/HBCU-RISE data collection was done by taking the projected budget for the next three years and calculating the cost for each of the following operational activities involved in producing, maintaining, and conducting the data collection (see Exhibit 7).




Exhibit 7: Sample calculation of estimated annualized cost to the Federal Government

Operational Activities | Cost Over Three Years
System Development (includes initial development of the database and Web-based application, and later changes requested by the program, e.g., increased reporting tools, additional validations) | $148,113
System Maintenance, Updates, and Technical Support (the system requires updates each year before opening the collection; maintenance is required to keep the system current with technology, e.g., database servers, operating systems) | $266,603
Data Collection Opening and Support (e.g., online and telephone support to respondents and contacting respondents to encourage completion of the questions), Reporting (as defined by HRD), and Follow-up Activities (e.g., providing data to other consultants) | $177,735
Three-Year Total for All Operational Activities | $592,450


The annualized cost was computed as one-third of the total three-year cost; thus, the annualized cost to NSF for the CREST/HBCU-RISE data collection is $197,483.

A.15. Changes in Burden

The current OMB inventory for the EHR program monitoring clearance covers 11 individual collection tasks. The OMB inventory records show a total of 7,784 responses and 57,249 burden hours.


This renewal includes seven individual collection tasks and requests 3,211 responses and 32,698 total hours; details can be found in Appendix A. The change in burden is due to shifts in the number of respondents. Exhibit 8 shows the changes in burden in the individual tasks:


Exhibit 8: Hour changes in task burdens

Collection Title | Previously Cleared Burden | Currently Requested Burden | Change in Burden
Advancing Informal STEM Learning (AISL) Monitoring System | 1,921 | 0 | (1,921)
Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | 1,810 | 1,648 | (162)
NSF Graduate STEM Fellows in K-12 Education (GK-12) Monitoring System | 3,529 | 0 | (3,529)
Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | 12,282 | 2,342 | (9,940)
Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | 12,949 | 16,250 | 3,301
Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | 2,090 | 1,008 | (1,082)
Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | 5,908 | 6,050 | 142
Research in Disabilities Education (RDE) Monitoring System | 1,368 | 0 | (1,368)
Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | 6,000 | 4,900 | (1,100)
Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | 6,648 | 500 | (6,148)
Transforming Undergraduate Education in Science, Technology, Engineering and Mathematics (TUES) Monitoring System | 2,744 | 0 | (2,744)
NSF Burden Estimate Total | 57,249 | 32,698 | (24,551)


The total change in burden is a decrease of 24,551 hours.
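As a purely illustrative check of Exhibit 8 (the code structure and program labels below are ours; the figures come from the exhibit), the change column and the overall decrease can be reproduced as follows:

```python
# Illustrative check of the Exhibit 8 burden changes; figures are from the exhibit.
previously_cleared = {
    "AISL": 1921, "CREST/HBCU-RISE": 1810, "GK-12": 3529, "IGERT": 12282,
    "LSAMP": 12949, "LSAMP-BD": 2090, "Noyce": 5908, "RDE": 1368,
    "S-STEM": 6000, "STEP": 6648, "TUES": 2744,
}
currently_requested = {
    "AISL": 0, "CREST/HBCU-RISE": 1648, "GK-12": 0, "IGERT": 2342,
    "LSAMP": 16250, "LSAMP-BD": 1008, "Noyce": 6050, "RDE": 0,
    "S-STEM": 4900, "STEP": 500, "TUES": 0,
}

changes = {name: currently_requested[name] - previously_cleared[name]
           for name in previously_cleared}
total_change = sum(changes.values())

print(sum(previously_cleared.values()), sum(currently_requested.values()), total_change)
# 57249 32698 -24551 (a decrease of 24,551 hours)
```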


Changes in the hour burden are accompanied by changes in the number of respondents. Exhibit 9 shows the changes in total number of respondents.


Exhibit 9: Changes in number of respondents

Collection Title | Previously Cleared No. of Respondents | Currently Requested No. of Respondents | Change in No. of Respondents
Advancing Informal STEM Learning (AISL) Monitoring System | 155 | 0 | (155)
Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | 40 | 42 | 2
NSF Graduate STEM Fellows in K-12 Education (GK-12) Monitoring System | 1,267 | 0 | (1,267)
Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | 3,307 | 513 | (2,794)
Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | 563 | 625 | 62
Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | 55 | 56 | 1
Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | 422 | 550 | 128
Research in Disabilities Education (RDE) Monitoring System | 12 | 0 | (12)
Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | 500 | 700 | 200
Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | 277 | 25 | (252)
Transforming Undergraduate Education in Science, Technology, Engineering and Mathematics (TUES) Monitoring System | 686 | 0 | (686)
NSF Respondent Estimate Total | 7,284 | 2,511 | (4,773)


The decrease in respondents is due largely to (1) transitions in several programs that are no longer making new awards (i.e., GK-12, IGERT, RDE, STEP, and TUES), and (2) a decision by the AISL program to leverage NSF enterprise data to provide evidence necessary for monitoring, post-award management, and assessment of outcomes associated with AISL projects.

In future years, the burden will be affected by the deletion and addition of some subtasks and respondents. NSF will notify OMB when there are significant changes to the burden.

A.16. Plans for Publication, Analysis, and Schedule

The data collections that are the subject of this renewal request are utilized for multiple purposes, described in sections A.1 (Circumstances Requiring the Collection of Data) and A.2 (Purposes and Uses of the Data). Some individual collections contribute to formal products (e.g., analytical reports) that can be published through NSF’s On-Line Document System (ODS). Most of the data the EHR program monitoring clearance collects, however, are not published as a stand-alone product, because the data are an input to how NSF manages, documents, evaluates, and measures its performance as an agency. NSF’s GPRA Performance Report or an individual division’s annual report to the NSF Director may use information from the collections to report to Congress; this occurs on an annual cycle.


The data collection efforts included under this request are administered by third-party contractors that deliver (1) analytical reports, (2) the raw data from the collections, or (3) both. Third parties are contractually forbidden from publishing results unless NSF has made a specific exception; in short, all products of the collections are the property of NSF. After the products are delivered, NSF determines whether their quality warrants verbatim publication; i.e., NSF typically is the exclusive publisher of the information collected. Often it is only after seeing the quality of the information a collection delivers that NSF decides the format (raw or analytical) and the manner (in the ODS or simply a page on the NSF Web site) in which to publish.


When reports on studies that employ monitoring data, or documents presenting analyses of monitoring data, are approved for publication, distribution is typically electronic. For content authored by NSF or by a third party at NSF’s request, the agency rarely publishes on paper; NSF publishes most documents, from requests for proposals to evaluation and statistical reports, electronically on the agency’s Web site using the ODS archive. Public reports on studies that make use of monitoring data are typically made available from the EHR main Web page, part of NSF’s main public Web site.


EHR recurring studies based on monitoring data are requested by program staff and are done to monitor, manage, and communicate with and about the clients funded by NSF’s investment in education and training. In most cases the primary purpose for each recurring study is program management. These studies generate data that enable both NSF and the funded education and training projects to improve management and performance. Typically, recurring studies generate information that NSF uses as inputs to other reports, and therefore EHR cites no specific publication plans other than internal or general use to meet reporting requirements.


EHR uses data from recurring studies to provide information that can be mined for program evaluation purposes, such as identifying best practices in the education of graduate and undergraduate students, or as a baseline for summative evaluation reports.

A.17. Approval to Not Display Expiration Date

Not applicable

A.18. Exceptions to Item 19 of OMB Form 83-I

No exceptions apply.

1 National Science Foundation. (2019). About NSF: At a Glance. Retrieved February 2019 from https://www.nsf.gov/about/glance.jsp.

2 National Science Foundation. (2019). How We Work. Retrieved February 2019 from https://www.nsf.gov/about/how.jsp.

3 For general information on NSF performance assessment activities see https://www.nsf.gov/about/performance/.

4 See “Committee of Visitors (COV),” retrieved March 2019 from https://www.nsf.gov/od/oia/activities/cov/. COV reports are available at https://www.nsf.gov/od/oia/activities/cov/covs.jsp.

5 See “About Education and Human Resources (EHR),” retrieved March 2019 from https://www.nsf.gov/ehr/about.jsp.

6 See Delivering Government Solutions in the 21st Century: Reform Plan and Reorganization Recommendations, pp. 118-120, retrieved March 2019 from https://www.whitehouse.gov/wp-content/uploads/2018/06/Government-Reform-and-Reorg-Plan.pdf and “Building and Using Evidence to Improve Government Effectiveness,” Chapter 6, FY 2019 Budget Analytical Perspectives, retrieved March 2019 from https://www.whitehouse.gov/wp-content/uploads/2018/02/ap_6_evidence-fy2019.pdf.


