REQUEST FOR RENEWAL OF EHR GENERIC CLEARANCE
Forms Clearance Package
Submitted by:
National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230
The National Science Foundation (NSF) funds basic research in fields of science and engineering, as well as research on education and learning in those fields at all educational levels. NSF awards grants, contracts, and cooperative agreements to more than 2,000 colleges, universities, and other eligible institutions, and provides graduate fellowships to individuals in all parts of the United States.
NSF is responsible for nearly 20 percent of Federal support to academic institutions for basic research. The Directorate for Education and Human Resources (EHR) is the unit within NSF primarily responsible for promoting rigor and vitality within the Nation's science, technology, engineering, and mathematics (STEM) education enterprise to further the development of the 21st century's STEM workforce. The mission of EHR is to identify means and methods to promote excellence in U.S. STEM education at all levels and in all settings (both formal and informal), in order to support the development of a diverse and well-prepared workforce of scientists, technicians, engineers, mathematicians, and educators and a well-informed citizenry that has access to the ideas and tools of science and engineering. EHR provides support for research and implementation activities that may improve STEM learning and education from pre-school through postdoctoral, in traditional and non-traditional venues and among all citizens, residents, and nationals. EHR also focuses on broadening participation in STEM learning and careers, particularly among those individuals traditionally underrepresented in the STEM research workforce, including but not limited to women, persons with disabilities, and racial and ethnic minorities.
This request for Office of Management and Budget (OMB) review asks for a standard three-year renewal of the EHR Generic Clearance, OMB Control Number 3145-0136, which expires on January 31, 2008.
Data collected under the EHR Generic Clearance are primarily used for program planning, management, and audit purposes and to respond to queries from the Congress, the public, NSF's external merit reviewers who serve as advisors, including Committees of Visitors (COVs), and the NSF's Office of the Inspector General. These data are required for effective administration, communication, program and project monitoring and evaluation, and for measuring attainment of NSF's program, project, and strategic goals, as required by the President's Management Agenda as represented by OMB's Program Assessment Rating Tool (PART); the Deficit Reduction Act of 2005 (P.L. 109-171), which established the Academic Competitiveness Council (ACC); and NSF's Strategic Plan and other performance assessment activities.
The EHR Generic Clearance relates to information collected under the NSF's Grant Proposal Guide (GPG), OMB Control Number 3145-0058. Data gathered via OMB 3145-0058 are housed in NSF's main administrative database, the Proposal and Award System (PARS). Most of the information in the EHR Generic Clearance, however, originates from specialized, custom collections. These individual collections (see attachments A through J) are designed to assist in management of specific programs, divisions, or multi-agency initiatives.
Most programs subject to EHR Generic data collection are funded by the EHR Directorate, but some are funded in whole or in part by disciplinary directorates or multi-disciplinary or cross-cutting programs. There are currently 13 previously approved collections under the existing clearance that will expire in January 2008. Four (4) of these collections will end upon the January 2008 expiration and one new task has been added. Therefore, this request asks for clearance of ten tasks.
The NSF Directorate of Education and Human Resources (EHR) is responsible for collecting, analyzing, evaluating, and communicating information on STEM education and human resource development activities, and for coordinating analytical and policy support for all of NSF's Education and Training (E&T) portfolio.
History of the EHR Generic Clearance
In 1995, at the request of the Office of Management and Budget and in response to the Government Performance and Results Act (GPRA) of 1993, an EHR Generic Clearance was established to integrate management, monitoring, and evaluation information pertaining to the NSF's Education and Training (E&T) portfolio. Under this generic survey clearance (OMB 3145-0136), data from the NSF administrative databases are incorporated with findings gathered through initiative-, division-, and program-specific data collections. These data are used for monitoring, managing, and communicating about NSF's investment in E&T programs, initiatives, and activities.
When the EHR Generic was cleared in 1998, the Terms of Clearance (TOC) specified how individual packages would be handled. Those terms stated that "All . . . individual tasks associated with this generic . . . must be submitted to OMB for clearance prior to implementation. If approved those individual approvals will expire, at the latest, when this generic expires in 9/2001 . . . When NSF seeks to add additional tasks to 3145-0136 other than those previously mentioned, the additional request will be accompanied by an 83-C burden change sheet so that the appropriate burden total for the generic clearance can be changed accordingly. Further, each additional request shall contain a cover memo which describes why the specific task is appropriate to include in the generic. . . Consistent with past procedures under this generic clearance, submission of individual task are done informally (i.e., sent directly to the desk officer rather than to the docket library) and OMB will attempt to complete the review expeditiously."
The 2001 Terms of Clearance further prescribed a "cross-walk that was provided by NSF on 11/6" and specifies that the cover memos submitted with new requests "should contain a similar crosswalk that details how the new questions fit into the three categories given." In addition, the 2001 TOC stated that "NSF has agreed to consider this clearance to encompass only 'monitoring' surveys, and no program evaluations will be completed under this generic clearance. Evaluations will need to go through a full clearance review under the PRA. All monitoring studies must conform to the three-category configuration explained in the memo of 10/24." In accordance with the 2001 and 2005 Terms of Clearance, NSF primarily uses the data from the EHR Generic Clearance for program planning, management, and audit purposes, and evaluation studies are submitted to OMB under separate information collection requests.
Circumstances of Data Collection
To fulfill its planning and management responsibilities, and to answer queries from Congress, OMB, and NSF management, EHR needs current and reliable information about projects in NSF's E&T portfolio. This information is especially important to support studies and evaluations by EHR, and studies by other NSF organizational units, for project monitoring and effective administration. The information is retained in accordance with the Education and Training System of Records (63 Fed. Reg. 264, 272, January 5, 1998). The Education and Training System of Records has several purposes, including:
Providing a source of information on demographic and educational characteristics and employment plans of participants in NSF-funded educational projects, in compliance with Foundation responsibilities to monitor scientific and technical resources
Enabling NSF to monitor the effectiveness of NSF-sponsored projects and identify outputs of projects funded under NSF awards for management and for reporting to the Administration and Congress, especially under GPRA, 5 U.S.C. 306 and 39 U.S.C. 2801-2805 and under the President's Management Agenda as represented by the Office of Management and Budget's Program Assessment Rating Tool
Creating public use files (which contain no personally identifiable information) for research purposes
The EHR Generic Clearance and the Education and Training System of Records enable NSF staff members and third-party evaluators to collect and combine data from:
Surveys (paper, electronic [i.e., Web], and telephone)
Observations (i.e., site visits)
Face-to-face interviews
Focus groups
OMB 3145-0136 is focused on initiative-, division-, and program-specific quantitative and qualitative data collection activities. Data from these collections focus on activities and outputs (i.e., the accomplishments of program grantees [projects] in terms of specific objectives). These descriptive data collections provide essential information for assessing progress toward NSF's major performance goals, as described in NSF's Strategic Plan. (The Foundation's FY 2006-2011 Strategic Plan describes four strategic outcome goals of Discovery, Learning, Research Infrastructure, and Stewardship. See www.nsf.gov/publications/pub_summ.jsp for the complete strategic plan.)
In addition to the requirements of the NSF Strategic Plan, and PART and GPRA reporting, some collections under this Generic have statutory requirements for data collection for monitoring and reporting purposes. For example, the public law authorizing the Robert Noyce Scholarship (Noyce) program (attachment G), requires award recipients to provide monitoring information to the NSF: “An institution of higher education (or consortium thereof) receiving a grant under this section shall, as a condition of participating in the program, enter into an agreement with the Director to monitor the compliance of scholarship and stipend recipients with their respective service requirements” (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_public_laws&docid=f:publ368.107). See individual task attachments for more information.
The EHR Generic Clearance also may be used to clear data collections for other ACC agencies, such as the National Aeronautics and Space Administration (NASA). In February 2007, NASA and NSF signed a Memorandum of Understanding (MOU) to coordinate efforts promoting STEM education, the participation of individuals underrepresented in STEM, and evaluation of STEM education projects and programs in formal and informal settings. In this clearance, Task J represents the initial use of this MOU to clear a NASA monitoring system through 3145-0136. Additional information on the NSF-NASA MOU can be found at: http://education.nasa.gov/divisions/higher/overview/F_One_Giant_Step_STEM_Education.html.
A renewal of the EHR Generic Clearance that allows continued collection of these data is requested. Many of the data collection instruments have similar structures, and while they seek information about different activities, they are often designed to collect information to allow for monitoring and comparison across activities. In accordance with OMB’s 2001 Terms of Clearance, all EHR Generic Clearance data elements fall into one of three categories:
Staff and participant characteristics
Project implementation characteristics
Project outputs
A crosswalk comparing the data collected across the task collections can be found in appendix C.
The information collected under the EHR Generic Clearance is required for effective administration, communication, and program and project monitoring, and for measuring attainment of NSF's program, project, and strategic goals as laid out in NSF’s Strategic Plan. This section will describe how the data collected under OMB 3145-0136 will be used for internal program management and administration; as a data source for NSF’s performance assessment activities, including Committees of Visitors and Directorate and Office Advisory Committees (ACs); for measuring the attainment of NSF’s program, project, and strategic goals through PART and GPRA reporting; and as a foundation for the rigorous research required to evaluate the effectiveness of STEM education programs, as described by the Academic Competitiveness Council. For more general information about NSF’s performance assessment activities see: http://www.nsf.gov/about/performance/.
Program Management and Administration
One of the primary uses of data from the EHR Generic Clearance is the general monitoring of project and program activities by EHR staff. Because EHR's limited staff must monitor hundreds of projects, large-scale data collection is the only way program officers can track project activities. The monitoring systems that fall under OMB 3145-0136 allow program officers and other NSF staff to integrate pre-existing and newly generated data in a coherent and timely manner, giving them the information needed to adjust the program portfolio. For example, NSF decided to sunset the Collaboratives for Excellence in Teacher Preparation (CETP) program, and no money was requested by NSF to support new CETP projects. Information from the EHR Generic Clearance regarding the CETP program's activities had a significant influence on this decision, and the CETP monitoring task is not being renewed this year. While most uses are not as dramatic as eliminating a program, they are significant to the normal operation of the EHR Directorate and to individual projects outside the Foundation. This kind of monitoring can lead respondents to correct their project activities, may facilitate changes in program guidelines and/or NSF funding levels for a particular project, and may result in improved benefits to participants in NSF projects.
Two executive orders, one longstanding and one recent, provide the authority for these collections. The 1993 Executive Order (EO) 12862, "Setting Customer Service Standards," directs agencies to put the public first by having a "revolution within the Federal government to change the way it does business." EO 12862 requires continual reform of government practices and operations to the end that, "when dealing with the Federal agencies, all people receive service that matches or exceeds the best service available in the private sector." Section 1(b) of this EO requires agencies to "survey customers to determine the kind and quality of services they want and their level of satisfaction with existing services." The newer EO 13450, "Improving Government Program Performance," issued in November 2007, asks agencies to:
(i) assess performance of each program administered in whole or in part by the agency; and
(ii) consider means to improve the performance and efficiency of such program;
These Presidential directives establish an ongoing need for NSF to engage in an interactive process of collecting information and using it to improve program services and processes.
Data for NSF’s Performance Assessments, including COVs and ACs
Data from the monitoring systems play a key role in NSF's performance assessment activities and feed into the larger NSF evaluation model. NSF relies on the judgment of external experts to maintain high standards of program management and to provide advice for continuous improvement of NSF performance. Directorate and Office advisory committees meet twice a year, while Committees of Visitors for divisions or programs meet once every three years. COV reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the results generated by awardees have contributed to the attainment of NSF's mission and strategic outcome goals. Data collected in the monitoring systems are often used in these reviews. For example, the September 2007 Centers of Research Excellence in Science and Technology (CREST) COV materials included multiple years of summary data about the program that had been collected in the CREST monitoring system (attachment A). Further, the 2005 Noyce program COV, conducted before the launch of the Noyce monitoring system (attachment G), specifically cited the need for more systematic data collection beyond the fragmentary information available through annual reports from Principal Investigators (PIs). COV reports are available at http://www.nsf.gov/od/oia/activities/cov/covs.jsp.
PART and GPRA reporting
Another central use of the EHR Generic Clearance Data is to measure attainment of NSF's program, project and strategic goals and to report on the attainment of these goals. NSF’s performance assessment is guided by three elements: the Government Performance and Results Act of 1993, OMB’s Program Assessment Rating Tool, and NSF’s Strategic Plan. The Foundation’s FY 2006-2011 Strategic Plan describes four strategic outcome goals of Discovery, Learning, Research Infrastructure, and Stewardship. EHR’s portfolio of E&T programs is a critical part of the Foundation’s Learning mission to “[c]ultivate a world-class, broadly inclusive science and engineering workforce, and expand the scientific literacy of all citizens” (p. 5). NSF has integrated its GPRA reporting with the PART evaluation process designed by OMB, which examines program performance through a series of questions on program purpose and design, strategic planning, program management, and program results/accountability. Information collected under the EHR Generic Clearance may be used for each EHR division's annual report, and these annual reports are used by NSF's leadership to respond to the performance assessment requests. 
NSF’s 2006 Performance and Accountability Report (available at http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) specifically describes the types of data needed to assess performance on developing People (the major goal in NSF’s FY 2003-2008 Strategic Plan that many EHR programs addressed) as including “Student, teacher and faculty participants in NSF activities; demographics of participants; descriptions of student involvement; education and outreach activities under grants; demographics of science and engineering students and workforce; numbers and quality of educational models, products and practices used/developed; number and quality of teachers trained; and student outcomes including enrollments in mathematics and science courses, retention, achievement, and science and mathematics degrees received” (page II-9). Many of these data elements are collected in the monitoring systems cleared under OMB 3145-0136. EHR collects considerably more quantitative data than other NSF directorates, so monitoring systems that allow these data to be gathered, managed, reported, and used in performance assessments are critical.
A Foundation for Future Evaluations
Finally, a key measure of NSF’s success at achieving its goals is the effectiveness of its STEM education programs. The Deficit Reduction Act of 2005 (P.L. 109-171) established the Academic Competitiveness Council, a multi-agency effort to identify Federal STEM education programs and evaluate their effectiveness. NSF funded 29 percent of the programs identified by the ACC, and several of the programs monitored under the EHR Generic Clearance are specifically identified in the ACC’s May 2007 report, including the Historically Black Colleges and Universities-Undergraduate Program (HBCU-UP) (attachment H), Louis Stokes Alliances for Minority Participation (LSAMP) (attachment F), Noyce (attachment G), Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) (attachment I), Graduate Teaching Fellows in K-12 Education (GK-12) (attachment D), and Integrative Graduate Education and Research Traineeship (IGERT) (attachment E). (The full ACC report can be accessed at http://www.ed.gov/about/inits/ed/competitiveness/acc-mathscience/index.html.) After identifying the Federal STEM education programs, the ACC report describes a hierarchy of study designs that should be used to evaluate the effectiveness of these programs. The report names experimental designs (such as well-designed randomized controlled trials) as the most preferred evaluation method, followed by quasi-experimental designs (such as well-matched comparison group studies), and finally places other designs (such as pre-post studies and comparison groups without careful matching) at the bottom of the hierarchy. This sort of rigorous program and project evaluation is essential to determining whether an education program is effective enough to be continued or expanded. It also helps agencies identify programs that are not effective and should be changed or discontinued.
NSF is committed to performing this type of program evaluation. While the monitoring systems used to collect data under the EHR Generic Clearance play a role in this work, they are not evaluative studies. The ACC report allows that “[m]any agencies do conduct program-level management reviews to ensure that programs are administered properly and in accordance with federal guidelines and agency missions” (p. 16), and this is the primary use of the EHR Generic data. However, EHR Generic data can help create a foundation for the kind of evaluation the ACC requires of Federal agencies. While data collected under this generic are not used to evaluate program effectiveness, some of the data can serve as baseline data for separate research and evaluation studies. For example, to conduct program- or portfolio-level evaluations, both experimental and quasi-experimental evaluation research studies on STEM education interventions require researchers to identify individual-level and organizational or project-level control, comparison, or treatment groups. NSF-funded contract or grantee researchers and evaluators may use some of the descriptive data gathered through OMB 3145-0136 to identify such groups for NSF’s E&T portfolio and to conduct well-designed, rigorous research and portfolio evaluation studies.
Two examples of third-party evaluations that used EHR OMB 3145-0136 data to inform study design are: OMB 3145-0190 (expired 5/2005), Evaluation of NSF's Louis Stokes Alliances for Minority Participation program, conducted by the Urban Institute; and OMB 3145-0182 (expired 7/2005), Evaluation of the Initial Impacts of the Integrative Graduate Education and Research Traineeship program, conducted by Abt Associates. For more information on these and other NSF-funded evaluations, please see NSF’s FY 2006 Full Performance and Accountability Report, Appendix 4B: Table of External Evaluations, at http://www.nsf.gov/pubs/2007/nsf0701/pdf/19.pdf.
All of the task collections included under this generic clearance request use advanced electronic information technology, including Web-based data collection systems and email, to minimize data duplication and respondent burden. Any new collections submitted in the future are also expected to be either Web- or email-based. The collections included in this clearance package and their methods of data collection are shown in chart 1.
Chart 1. Collections and Methods of Data Collection

Attachment | Collection Title | Method of Data Collection
A | Centers of Research Excellence in Science and Technology Monitoring System (CREST) | Web
B | Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) | Web
C | Division of Undergraduate Education Project Information Resource System (DUE-PIRS) | Web
D | Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) | Web
E | Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) | Web
F | Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring | Web
G | Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) | Web
H | Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
I | Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) | Web
J | NASA Educators Survey | Web
EHR tends to favor Web-based systems because they facilitate respondents' data entry across computer platforms. One innovative feature of many of the individual Web systems is the thorough editing of all submitted data for completeness, validity, and consistency. Editing and validation are performed as data are entered: most invalid data cannot be entered into the system, and questionable or incomplete entries are called to respondents' attention before they are submitted to NSF. In cases where data are collected from small populations, spreadsheet forms are emailed to respondents, who enter the data and return the completed forms by email.
EHR Generic Clearance Web-based surveys employ user-friendly features such as automated tabulation, data entry with custom controls such as checkboxes, data verification with error messages for easy online correction, standard menus, and predefined charts and graphics. All of these features facilitate the reporting process, provide useful and rapid feedback to the data providers, and reduce burden.
All collections in the EHR Generic comply with Section 508, the 1998 amendment to the Federal Rehabilitation Act, which mandates that the electronic and information technology used by Federal agencies be made accessible to all people with disabilities.
The EHR Generic Clearance does not duplicate efforts undertaken by the Foundation, other Federal agencies, or other data collection agents.
For example, NSF grants require the submission of Annual and Final Project Reports in accordance with OMB 3145-0058. Recipients of NSF grants, such as Principal Investigators, must create and submit annual and final project reports using NSF's nationally recognized FastLane Web template. (For more information on FastLane, see www.fldemo.nsf.gov.) To minimize overall response burden, OMB 3145-0136 items are designed so that they can be shared with or use the FastLane Project Reports System surveys, ensuring that data collection is not duplicated and that data collected under the EHR Generic Clearance are unique and not available elsewhere. Specifically, financial data on program funding are drawn from OMB 3145-0058, which covers applications submitted through the NSF FastLane system and the upcoming grants.gov.
Only one task, the Division of Undergraduate Education Project Information Resource System (attachment C), collects data from small businesses, which usually are partnered with an academic or education institution. EHR anticipates that an average of only 18 small businesses will be affected annually, and that only a small amount of data will be collected from any small business organizations, with the total small business response burden being less than 1 percent of the total EHR Generic Clearance response burden. Full details can be found in the subtask’s individual clearance requests.
Data collected for the EHR Generic Clearance are used to manage programs, monitor projects, coordinate with Federal and non-Federal education partners, provide Congress with information about government-supported activities, and report for GPRA and PART requirements. In many cases, the data need to be collected annually to inform the NSF management and evaluation processes. Data collected under the EHR Generic Clearance can be used by NSF management to measure NSF’s success at achieving both Strategic Outcome Goals and internal Annual Performance Goals, as described in NSF’s PART reporting.
If the information were not collected, NSF would be unable to document the effectiveness and outcomes of its programs. It would not be able to meet its accountability requirements or assess the degree to which projects are meeting their goals. Moreover, NSF would be unable to comply fully with the congressional mandate that the Foundation evaluate its STEM education programs. The ACC May 2007 report recommended that Federal support of STEM education programs not be increased until a plan for rigorous, independent evaluation of program impacts is in place, and the EHR Generic is an important cornerstone in NSF’s larger evaluation plans.
All data collections will comply with 5 CFR 1320.6. All tasks under the EHR Generic Clearance ask respondents for data annually, with the exception of the Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) (attachment B), which asks respondents to submit data quarterly. See attachment B for more information on the frequency of this collection.
The notice inviting comments on the EHR Generic Clearance (OMB 3145-0136) was published in the Federal Register August 24, 2007, Volume 72, Number 164, page 48694. No comments were received. A copy of the notice can be found at the end of this document.
EHR routinely consults with research and evaluation experts, PIs, and educators affected by EHR investments when developing collection instruments. The purpose of these consultations is to assess the relevance, availability, and clarity of items. As suggested by OMB guidelines, these consultations also enable EHR staff to obtain a reliable estimate of the respondent burden generated by new instruments. When a new task is added to the collection or when an existing task is modified to add new instruments, each instrument is pretested with fewer than nine individuals and revised following debriefings with participating respondents.
All outside consultations are described within the context of the specific data collection tasks. In tasks conducted earlier under the EHR Generic Clearance, consultations have included knowledgeable outsiders such as representatives of EHR contractors responsible for technical and evaluation tasks and fellows who work at the Foundation as guests under programs such as the Einstein Fellows Program or the American Association for the Advancement of Science (AAAS) Washington Fellows Program.
To date, no payments or gifts have been provided to respondents. There are no plans to provide incentives to respondents, since program and project monitoring surveys are of value to both the respondents and NSF.
Respondents are advised that any information on specific individuals is maintained in accordance with the Privacy Act of 1974. Every data collection instrument displays both OMB and Privacy Act notices.
Respondents are told that data collected for the EHR Generic Clearance are available to NSF officials and staff, evaluation contractors, and the contractors hired to manage the data and data collection software. Data are processed according to Federal and State privacy statutes. Detailed procedures followed by EHR for making information available to various categories of users are specified in the Education and Training System of Records (63 Fed. Reg. 264, 272 January 5, 1998). That system limits access to personally identifiable information to authorized users. Data submitted are used in accordance with criteria established by NSF for monitoring research and education grants and in response to Public Law 99-383 and 42 USC 1885c.
The information requested may be disclosed to qualified researchers and contractors in order to coordinate programs and to a Federal agency, court, or party in court, or Federal administrative proceeding, if the government is a party.
In some cases, collections in the EHR Generic Clearance request information from respondents including name, address, Social Security number (SSN), date of birth, and grade point average. These data are collected in order to monitor the award sites and evaluate the success of the award programs. Information of this nature is also used to track recipients of funding and training. For example, in the IGERT survey (attachment E), trainees' SSNs are used as a tracking mechanism to permit follow-up studies that examine the long-term effect of the IGERT program on individuals' success. However, in the IGERT collection and in all tasks that request SSN, SSN is a voluntary field. Indeed, all items of a sensitive nature are voluntary. Respondents may choose not to provide information that they feel is privileged, such as SSN, address, or date of birth. Any individualized data that are collected are provided only to program staff and consultants conducting studies using the data as authorized by NSF. Any public reporting of data is in aggregate form.
The chart below shows which individual tasks include questions of a sensitive nature.
Attachment |
Collection Title |
Contains Questions of Sensitive Nature |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
Yes |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
Yes |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
No |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
Yes |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
Yes |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
Yes |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
Yes |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
No |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
No |
J |
NASA Educators Survey |
Yes |
As shown in appendix B, and in Chart 3 below, the annual response burden for the ten tasks under OMB 3145-0136 is 44,756 hours (for 45,222 responses). Given the diversity of respondent types, the methods used to arrive at individual task burden estimates are described in detail in attachments A through J.
Attachment |
Collection Title |
Number of Respondents |
Number of Responses |
Annual Hour Burden |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
27 |
27 |
1,971 |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
12,400 |
13,200 |
3,200 |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
1,800 |
1,800 |
1,200 |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
2,280 |
2,280 |
6,120 |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
2,136 |
2,136 |
9,440 |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
415 |
415 |
14,380 |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
75 |
75 |
1,050 |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
112 |
112 |
2,016 |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
177 |
177 |
3,295 |
J |
NASA Educators Survey |
25,000 |
25,000 |
2,084 |
Total |
44,422 |
45,222 |
44,756 |
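As a cross-check, the per-task figures in the chart can be summed to reproduce the totals (a minimal sketch; task order follows attachments A through J):

```python
# Per-task figures from the chart, listed in attachment order A through J.
respondents = [27, 12_400, 1_800, 2_280, 2_136, 415, 75, 112, 177, 25_000]
responses = [27, 13_200, 1_800, 2_280, 2_136, 415, 75, 112, 177, 25_000]
hours = [1_971, 3_200, 1_200, 6_120, 9_440, 14_380, 1_050, 2_016, 3_295, 2_084]

total_respondents = sum(respondents)  # 44,422
total_responses = sum(responses)      # 45,222
total_hours = sum(hours)              # 44,756
```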
NSF estimates that approximately five new tasks will need to be cleared under the EHR Generic Clearance during the next three years, depending on budgetary limitations and Congressional mandates. The overall response burden in any year should not exceed 60,000 hours. The burden associated with each new task will be outlined in the individual request submitted to OMB with a burden change request form.
Below is an example that shows how the hour burden was estimated for the LSAMP system, attachment F:
The total number of annual respondents for the LSAMP monitoring system is 415 (80 project PIs/Co-PIs; 300 LSAMP institution personnel; and an estimated 35 Bridge to the Doctorate data coordinators involved in the Bridge to the Doctorate module) and the total annual person-hours is 14,380.
The Web-based collection is an annual activity of the LSAMP program. There are approximately 40 LSAMP alliances, each with 2 or more co-PIs and project personnel at alliance institutions. New alliances (and institutions within currently funded alliances) will be added to the program over the next three years; new institutions enter at approximately the same rate that alliances or institutions leave the program as their funding expires.
The annualized burden for the component surveys in the current task (PI and institution personnel) was calculated by taking the average number of respondents from the previous survey cycles and estimating their response burden, based on a question in the Web-based data collection asking how long it takes respondents to complete the survey. The annual burden for the new component survey addressed to LSAMP Bridge program managers was estimated using the burden reported in similar monitoring systems. The Bridge activity began in 2003, and during the first year of Bridge data collection, coordinators will enter data for past years of support activity. While the average annual burden is estimated at 20 hours for these respondents, this burden is expected to be lower in the second and third years of data collection, after past years' data have been entered. The burden estimates for the three types of respondents are outlined below.
Calculations Estimating Overall Response Burden for the LSAMP Monitoring System |
Type of Respondent |
Average Number of Respondents |
Burden Hours Per Respondent |
Annual Person-Hours |
PIs/Co-PIs |
80 |
36 hours |
2,880 |
LSAMP Institution Personnel |
300 |
36 hours |
10,800 |
Bridge to the Doctorate Data Coordinators |
35 |
20 hours |
700 |
Total respondents |
415 |
Total estimated hours |
14,380 |
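The arithmetic behind the LSAMP table can be sketched as follows (a minimal cross-check; respondent counts and per-respondent hours are taken from the table above):

```python
# LSAMP burden: (number of respondents, burden hours per respondent) by type.
lsamp_burden = {
    "PIs/Co-PIs": (80, 36),
    "LSAMP Institution Personnel": (300, 36),
    "Bridge to the Doctorate Data Coordinators": (35, 20),
}

total_respondents = sum(n for n, _ in lsamp_burden.values())    # 415
total_hours = sum(n * hrs for n, hrs in lsamp_burden.values())  # 14,380
```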
Details on the burdens of each form can be found in the task clearances. The chart below is an example of how this burden was estimated for the GK-12 monitoring system, Attachment D:
Calculations Used to Estimate Burden by Form for the GK-12 Monitoring System |
Form Type |
Respondent Type |
Number of Respondents |
Burden Hours Per Respondent |
Total Person Hours |
PI Survey |
PI/Program Coordinator |
120 |
23 |
2,760 |
Fellows Survey |
Graduate Fellows |
960 |
1 |
960 |
Teachers Survey |
Cooperating Teachers |
1,200 |
2 |
2,400 |
Total |
|
2,280 |
|
6,120 |
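The GK-12 per-form burden works out the same way (a minimal cross-check using the figures in the table above):

```python
# GK-12 burden by form: (number of respondents, burden hours per respondent).
gk12_forms = {
    "PI Survey": (120, 23),
    "Fellows Survey": (960, 1),
    "Teachers Survey": (1_200, 2),
}

total_respondents = sum(n for n, _ in gk12_forms.values())    # 2,280
total_hours = sum(n * hrs for n, hrs in gk12_forms.values())  # 6,120
```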
As shown in appendix B, the total annual cost to respondents generated by the ten ongoing data collections is currently estimated to be $1,504,740. Below is an example of the method used to calculate cost burden for the CREST monitoring system, attachment A:
The overall annualized cost to the respondents for the CREST data collection is estimated to be $72,927. The following table shows the annualized estimates of costs to PI respondents, who are generally university professors. These estimated hourly rates are based on a report in the April 20, 2007, edition of The Chronicle of Higher Education (2007. “What Professors Earn.” The Chronicle of Higher Education, 53(33), Washington, D.C.: The Chronicle of Higher Education, Inc.). According to the report, the average salary of an associate professor across all types of doctoral-granting institutions (public, private, church-related) was $76,639. When divided by the number of standard annual work hours (2,080), this comes to approximately $37 per respondent hour.
Calculations Used to Estimate the Cost to Respondents for the CREST Monitoring System |
Respondent Type |
Number, Rate, and Burden |
Costs |
PIs/Program Coordinators |
(27 x $37/hour x 73 hours) |
$72,927 |
Total |
|
$72,927 |
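The rate and cost calculations described above can be reproduced directly (a minimal sketch; the salary, work-hour, and burden figures come from the preceding text and table):

```python
# Hourly rate: average associate-professor salary divided by 2,080 standard
# annual work hours, rounded to the nearest dollar.
salary = 76_639
hourly_rate = round(salary / 2_080)  # 36.85 -> 37

# Cost: 27 PI respondents x $37/hour x 73 burden hours each.
crest_cost = 27 * hourly_rate * 73  # $72,927
```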
The costs to respondents generated by additional data collections will be described in the individual task request for each data collection.
There is no overall annual cost burden to respondents or record-keepers that results from the EHR Generic Clearance other than the time spent responding to surveys, which is described in specific detail under A.12 within the attached individual task justifications (attachments A through J).
It is usual and customary for individuals involved in education and training activities in the United States to keep descriptive records. The information being requested is from records that are maintained as part of normal educational or training practice. Furthermore, the majority of respondents are active or former grantees or participants in programs or projects once funded by NSF. In order to be funded by NSF, institutions must follow the instructions in the NSF Grant Proposal Guide (GPG), which is cleared under OMB 3145-0058. The GPG requires that all applicants submit requests for NSF funding and that all active NSF awardees do administrative reporting via FastLane, an Internet-based forms system, or via the upcoming grants.gov. Thus, principal investigators, K-12 administrators, faculty members, and college students, who are the primary respondents to the individual data collection tasks within the EHR Generic Clearance, use standard office equipment (e.g., computers), Internet connectivity that is already required as a startup and maintenance cost under OMB 3145-0058, and free software (e.g., Netscape or Microsoft Internet Explorer) to respond.
As shown in appendix B, the total annual cost to the Federal government of the ten ongoing data collections is currently estimated to be $2,203,394. Details of the costs of each task can be found in appendix B.
Below is an example of the costs to the Federal government from the CREST data collection, attachment A:
The annualized cost to NSF for the CREST data collection was computed by taking the budgets for 3 years and calculating the costs of each of the following operational activities involved in producing, maintaining, and conducting the collection:
Costs to the Federal Government for the CREST data collection
Operational Activities |
Cost Over 3 Years |
System Development (includes initial development of the database and Web-based application, and later changes requested by the program, e.g., increased reporting tools, additional validations) |
$346,100 |
System Maintenance, Updates, and Tech Support (system requires updates each year before opening the collection; maintenance is required to keep the system current with technology, e.g., database servers, operating systems) |
$173,050 |
Data Collection Opening and Support (e.g., online and telephone support to respondents and contacting respondents to encourage completion of the questions), Reporting (as defined by HRD), and Followup activities (e.g., providing data to other consultants) |
$130,000 |
3-Year Total for All Operational Activities |
$649,150 |
The annualized cost was computed as one-third of the total 3-year costs; thus, the annualized cost to NSF for the CREST data collection is $216,383.
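The annualization step can be sketched as follows (a minimal cross-check; the three line items come from the table above, and the annualized figure is truncated to whole dollars):

```python
# 3-year operational costs for the CREST collection, from the table above.
costs_3yr = {
    "System Development": 346_100,
    "System Maintenance, Updates, and Tech Support": 173_050,
    "Data Collection Opening, Support, Reporting, and Followup": 130_000,
}

total_3yr = sum(costs_3yr.values())  # $649,150
annualized = total_3yr // 3          # $216,383 (one-third of the 3-year total)
```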
More details on the costs of existing tasks can be found in the individual task clearances. The costs to the government generated by future data collections will be described in the clearance request for each data collection.
During the last three years, in accordance with OMB's 2001 and 2005 Terms of Clearance, NSF has requested both:
Clearance of new (also called additional) collections as they are formulated
Revisions of previously cleared tasks
The current inventory numbers at OMB for the EHR Generic package cover 13 individual collection tasks. The OMB inventory records show a total number of responses of 24,792 and total hours of 56,948.
During the extensive document review to prepare this package, NSF estimated that these burden numbers are slightly high—the total hour burden appears to be off by approximately 4,600 hours. We believe this discrepancy primarily derives from two OMB Notices of Action (NOAs) for approved burden changes in April 2007. The IGERT and GK-12 monitoring systems requested changes in burden, and the subsequent NOAs recorded not the requested additional burden, but the new total burden. NSF records the current inventory numbers for the EHR Generic Clearance as 23,123 responses and 52,330 hours.
For this renewal, four of the previous tasks are sunsetting and one new task was added, so we request that OMB approve the ten individual tasks as requested and set their expiration to coincide with the EHR Generic Clearance's expiration in 2011. This renewal requests 45,222 total responses and 44,756 total hours; details can be found in appendix B. The change in burden is due to shifts in the number of respondents and small adjustments in the data requested. The chart below shows the changes in burden in the individual tasks:
Attachment |
Collection Title |
Previously Cleared Burden |
Currently Requested Burden |
Change in Burden |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
1,022 |
1,971 |
949 |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
3,200 |
3,200 |
0 |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
1,200 |
1,200 |
0 |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
9,360 |
6,120 |
-3,240 |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
7,200 |
9,440 |
2,240 |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
13,336 |
14,380 |
1,044 |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
1,200 |
1,050 |
-150 |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
1,692 |
2,016 |
324 |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
4,972 |
3,295 |
-1,677 |
J |
NASA Educators Survey |
0 |
2,084 |
2,084 |
Not being renewed |
Survey for Course, Curriculum, and Laboratory Improvement Program (CCLI), Division of Undergraduate Education |
349 |
0 |
-349 |
Not being renewed |
Collaboratives for Excellence in Teacher Preparation Distance Monitoring Data Collection (CETP) |
4,680 |
0 |
-4680 |
Not being renewed |
Centers for Learning and Teaching Monitoring System (CLT) |
1,219 |
0 |
-1219 |
Not being renewed |
Systemic Initiatives Monitoring for Educational Systemic Reform (ESR) |
2,900 |
0 |
-2900 |
NSF Burden Estimates Totals (OMB Estimated Totals) |
52,330 (56,948) |
44,756 |
-7,574 (-12,192) |
According to the OMB inventory records, the total change in burden is a decrease of 12,192 hours. As the table above shows, NSF can account for 7,574 of those hours. The difference between the numbers can be explained through discrepancies in Notices of Action.
Changes in the hour burden are accompanied by changes in the number of respondents. The chart below shows the changes in total number of responses.
Attachment |
Collection Title |
Previously Cleared Number of Responses |
Currently Requested Number of Responses |
Change in Number of Responses |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
14 |
27 |
13 |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
13,200 |
13,200 |
0 |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
1,800 |
1,800 |
0 |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
3,510 |
2,280 |
-1,230 |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
1,700 |
2,136 |
436 |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
701 |
415 |
-286 |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
60 |
75 |
15 |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
94 |
112 |
18 |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
161 |
177 |
16 |
J |
NASA Educators Survey |
0 |
25,000 |
25,000 |
Not being renewed |
Survey for Course, Curriculum, and Laboratory Improvement Program (CCLI), Division of Undergraduate Education |
779 |
0 |
-779 |
Not being renewed |
Collaboratives for Excellence in Teacher Preparation Distance Monitoring Data Collection (CETP) |
130 |
0 |
-130 |
Not being renewed |
Centers for Learning and Teaching Monitoring System (CLT) |
933 |
0 |
-933 |
Not being renewed |
Systemic Initiatives Monitoring for Educational Systemic Reform (ESR) |
41 |
0 |
-41 |
NSF Burden Estimates Totals (OMB Estimated Total) |
23,123 (24,792) |
45,222 |
22,099 |
The increase in respondents is due largely to the addition of the new NASA Educators survey. Again, the difference in the OMB and NSF estimates can be explained through discrepancies in Notices of Action; more details on these discrepancies are available upon request.
In future years, the burden will be affected by the deletion and addition of some subtasks and respondents. NSF will notify OMB whenever there are significant changes to the burden.
While burden changes are often due to adjustments in the numbers of respondents, some changes in burden are due to the addition of new items to previously cleared surveys. The chart below indicates which tasks in this clearance have had major items added since their last OMB clearance. More details can be found in individual clearances.
Attachment |
Collection Title |
Major New Items Added |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
None |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
None |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
None |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
None |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
Yes (Some additional questions added to investigator and trainee surveys) |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
Yes (Changes to questions and respondent type in Bridge to the Doctorate module) |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
None |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
None |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
None |
J |
NASA Educators Survey |
N/A (New task) |
Like many agencies, NSF no longer relies on formal (i.e., traditional) publication methods and publication formats. News media advisories, notices of funding opportunities for colleges and universities, and results from survey collections are all examples of the types of publications that NSF regularly publishes without putting ink to paper.
For content authored by NSF or by a third party at NSF’s request, the agency rarely uses paper to publish the information. NSF publishes most documents, from requests for proposals to evaluation and statistical reports, electronically only, via the agency's Web site, using an archive called the On-Line Document System (ODS).
In addition, NSF runs a Custom News Service, an e-mail and Web-based alert service that sends documents newly published in the ODS (e.g., vacancy announcements, calls for proposals, statistical reports) to subscribers. Subscribers receive electronically only those NSF documents of interest, not the agency's entire publications line.
The other major venue for NSF publications is FastLane. The NSF FastLane system collects and publishes information from NSF's clients (i.e., applicants for funding to NSF) using the Web. When an applicant's proposal has been funded, that applicant's name and other key data are published on NSF's Web site. Each week the FastLane Web site publishes a list of new awards using data gathered from the application process.
Like NSF itself, the publication plans and practices of the EHR Generic Clearance (OMB 3145-0136) have a dual nature. Some individual collections contribute to formal products (e.g., analytical reports) that can be published through NSF's ODS. Other collections produce only the respondents' replies, which are posted verbatim on the EHR portion of the NSF Web site for anyone to download.
Most of what the EHR Generic Clearance OMB 3145-0136 collects, however, is not published as a stand-alone product, because the data are an input to how NSF manages and measures its performance as an agency. NSF's GPRA Performance Report, PART reports, and individual divisions' annual reports to the NSF Director use information from OMB 3145-0136 to report to Congress. This is an annual cycle.
Most of these tasks are the work of third-party contractors that deliver 1) analytical reports, 2) the raw data from the collections, or 3) both. Third parties are contractually forbidden from publishing results unless NSF has made a specific exception. In short, all products of the collections are the property of NSF. After the products are delivered, NSF determines whether the quality of the products deserves publication verbatim by NSF; i.e., NSF typically is the exclusive publisher of the information collected by OMB 3145-0136. Often it is only after seeing the quality of the information the collection delivers that NSF decides the format (raw or analytical) and manner (in the ODS or simply a page on the NSF Web site) in which to publish.
EHR recurring studies are done to monitor, manage, and communicate with and about the clients funded by NSF's investment in education and training. In most cases the primary purpose for each recurring study is program management. These studies generate data that enable both NSF and the funded education and training projects to improve management and performance. Typically, recurring studies generate information that NSF uses as inputs to other reports and therefore EHR cites no specific publication plans other than internal or general use to meet reporting requirements.
There are, however, several collections within the EHR Generic Clearance that do, as previously approved, directly publish raw data from an individual collection on the NSF Web site. The model being employed is DUE's Project Information Resource System (attachment C). DUE-PIRS collects information from the Division's grantees and instantaneously publishes their verbatim responses to the EHR Web site. An on-line system (at https://www.ehr.nsf.gov/PIRS_PRS_Web/Search/default.asp) allows anyone to generate customized reports using data collected by the PIRS system.
EHR uses data from recurring studies to provide information that can be mined for program evaluation purposes, such as identifying best practices in the education of graduate and undergraduate students or as a baseline for summative evaluation reports. In the past, using data in part, but not exclusively, from OMB 3145-0136, the following evaluative or descriptive analysis research reports have been produced:
A Description and Analysis of Best Practice Findings of Programs Promoting Participation of Underrepresented Undergraduate Students in Science, Mathematics, Engineering and Technology (Westat) (NSF 01-31) (http://www.nsf.gov/pubsys/ods/getpub.cfm?nsf0131)
Summary Report on the Impact Study of the National Science Foundation's Program for Women and Girls, December 2000, (The Urban Institute) (NSF 01-27) (http://www.nsf.gov/pubsys/ods/getpub.cfm?nsf0127)
At this time, NSF has no set timeline for publishing reports from these recurring studies, but plans that a summary or descriptive report be produced within two years of completion of the data collections for each recurring study.
Not applicable.
No exceptions apply.
In keeping with the original 1995 request and subsequent 1998, 2001, and 2005 OMB renewed approvals, the EHR Generic Clearance's (OMB 3145-0136) goal is a portfolio of individual collections used to count and describe the universe of NSF-funded or -partnered education and training projects. The statistical method employed in all ten task collections is that of a census of NSF-funded projects. Some projects have only one respondent type, typically a Principal Investigator; others have several types of respondents.
Data collection for the tasks involves all awardees in the programs involved. The chart below shows the total universe and sample size for each of the tasks.
Chart 7. Respondent Universe and Sample Size of EHR Generic Clearance Surveys
Attachment |
Collection Title |
Universe of Respondents |
Sample Size |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
27 |
27 |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
12,400 |
12,400 |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
1,800 |
1,800 |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
2,280 |
2,280 |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
2,136 |
2,136 |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
415 |
415 |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
75 |
75 |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
112 |
112 |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
177 |
177 |
J |
NASA Educators Survey |
25,000 |
25,000 |
|
Total |
44,422 |
44,422 |
The data collections in this generic clearance use either Web- or e-mail-based surveys. Each respondent will provide answers once a year, with the exception of respondents to the S-STEM survey (attachment B), who enter data each semester or quarter, for an average of three times a year.
NSF understands the limitations of the EHR Generic Clearance, particularly in terms of using the data to determine program effectiveness. Data collected under this generic clearance are for monitoring purposes; evaluation studies are cleared under separate OMB requests. OMB 3145-0136 data are not a part of the hierarchy of evaluation study designs described in the ACC report, but they may serve as preliminary foundation work for later, independent program evaluations. EHR Generic data are not used to determine the ultimate effectiveness of STEM educational interventions, but they are a key element in NSF’s efforts to manage its program portfolio, to report on agency activities and goals, and to lay the groundwork for future evaluations.
Each of the ten tasks for which clearance is requested is a census, in which the sample size is the universe. Details on the size of the universe in each collection are included in individual clearances.
Not applicable.
Not applicable.
Not applicable.
Not applicable.
All task collections in this generic clearance are a part of the reporting required of awardees, so a high response rate is expected. The table below shows the expected response rates for each of the individual tasks.
Attachment |
Collection title |
Response Rate |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
100% |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
90% |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
50% all respondents/ 100% new awards |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
100% PIs and fellows/ 60% teachers |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
100% |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
100% |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
100% |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
100% |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
100% |
J |
NASA Educators Survey |
50% |
Principal investigators are responsible for ensuring that other individuals involved in the project submit all necessary data, and in many cases have access to status information in the Web-based systems indicating whether or not individual respondents in their projects have completed their data entry. In addition, EHR staff have access to on-line monitoring sections of many of the Web-based systems and can check the status of reporting. A series of e-mail messages and phone calls is also used to follow up with respondents and ensure that all necessary data are collected. See individual task collections for examples of the follow-up e-mail messages that are sent and more specific information on how response rates are supported.
All of the collections for which clearance is being requested are currently in operation and have been tested both before initial implementation and throughout the data collection. The LSAMP monitoring system, for example, has been operational since 1995. Input on this system is continually received from users and their suggestions are implemented as the system is upgraded. Other test methods used by the various collections in the EHR Generic include feedback from PIs both as data are collected and during meetings and conferences, review by NSF staff, and testing performed by the system developer. Many systems are based on data collection methods currently used by other NSF groups, and many of the items and response categories follow formats that are already in place.
The following individuals were consulted on the EHR Generic Clearance:
William Neufeld (703-292-5148), Division of Research on Learning in Formal and Informal Settings, National Science Foundation, 703-292-5150
The following table shows the individuals involved in each task:
Attachment |
Collection Title |
NSF Agency Unit |
Contractor or Grantee |
A |
Centers for Research Excellence in Science and Technology Monitoring System (CREST) |
Victor Santiago, (703) 292-4673 |
Lea Mesner, Macro International Inc., (301) 657-3077 |
B |
Survey Form for the Division of Undergraduate Education Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) |
Duncan McBride, 703-292-4630 |
-- |
C |
Division of Undergraduate Education Project Information Resource System (DUE-PIRS) |
Lea Zia, (703) 292-5140 |
-- |
D |
Graduate Teaching Fellows in K-12 Education Distance Monitoring System (GK-12) |
Carol Stoel, (703) 292-8630 |
Lea Mesner, Macro International Inc., (301) 657-3077 |
E |
Distance Monitoring System for the Division of Graduate Education Integrative Graduate Education and Research Traineeship Program (IGERT) |
Carol Van Hartesveldt, (703) 292-8696 |
Lea Mesner, Macro International Inc., (301) 657-3077 |
F |
Louis Stokes Alliances for Minority Participation (LSAMP) Distance Monitoring |
A. James Hicks, (703) 292-4668 |
Lea Mesner, Macro International Inc., (301) 657-3077 |
G |
Program Monitoring System for the Robert Noyce Scholarship Program (Noyce) |
Joan Prival, (703) 292-4635 and Deh-I Hsiung, (703) 292-5153 |
Lea Mesner, Macro International Inc., (301) 657-3077 |
H |
Self-Evaluation Indicator System (SEIS) Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) for Awardees |
Victor Santiago (703) 292-4673 and Jessie DeAro (703) 292-5350 |
Jason J. Kim and Linda M. Crasco, Systemic Research, Inc. (781) 278-0300 |
I |
Program Monitoring System for the Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) |
Susan H. Hixson, (703) 292-4623
|
Lea Mesner, Macro International Inc., (301) 657-3077 |
J |
NASA Educators Survey |
Mary Sladek, NASA, (202) 358-0861 |
-- |