Survey of Science and Engineering Research Facilities

OMB: 3145-0101


SUPPORTING STATEMENT

for

FY 2007 and FY 2009 Survey of Science and Engineering Research Facilities

















Part A: Justification


A.1. Circumstances Necessitating Collection of Information

The National Science Foundation’s (NSF) Survey of Science and Engineering Research Facilities is a Congressionally mandated biennial survey that has been conducted since 1986. In FY 2001, NSF undertook an extensive redesign effort that addressed issues relating to data quality, response burden, response rate, and the changing nature of research. Important changes to the survey included the deletion of a question on the adequacy of research space, the deletion of the Large Facilities Follow-up Survey, and the addition of individual construction project sheets. A new section was also added to the survey requesting information on computing and networking capacity, which has become an essential part of the infrastructure for science and engineering research and will continue to be critical in the future. The redesigned survey was implemented in the FY 2003 and FY 2005 survey cycles.


Background


Academic research facilities in the fields of science and engineering (S&E) are an important national resource. Extensive hearings were held during the 99th Congress in the House and Senate committees on science and technology to examine the research facilities needs of universities and colleges. Both committees found “sufficient evidence to suggest the presence of a serious and growing problem…” and expressed concern that the Federal government did not have in place an ongoing analytical system to document the current status of and needs for research facilities by major field of science and engineering. Such systematic information was needed to understand current and future facilities needs and pressures and to formulate sound solutions over time.


In recognition of the need for objective information about research facilities, Congress directed the National Science Foundation (NSF), in the Authorization Act of November 22, 1985 (P.L. 99-159, Section 108), which states:


The National Science Foundation is authorized to design, establish, and maintain a data collection and analysis capability in the Foundation for the purpose of identifying and assessing the research facilities needs of universities. The needs of universities, by field of science and engineering, for construction and modernization of research laboratories, including fixed equipment and major research equipment, shall be documented. University expenditures for the construction and modernization of research facilities, the sources of funds, and other appropriate data shall be collected and analyzed. The Foundation, in conjunction with other appropriate Federal agencies, shall conduct the necessary survey every 2 years and report the results to the Congress.


In response to this directive, NSF has now conducted ten biennial surveys. The National Institutes of Health (NIH) has cosponsored the surveys with NSF since the beginning of the series. At the request of NIH, the survey also collects data from nonprofit, biomedical research organizations and hospitals that receive NIH research funds.


The NSF Division of Science Resources Statistics (SRS) is responsible for developing and maintaining the research facilities survey system. SRS now requests approval to conduct the next cycles of the Survey of Science and Engineering Research Facilities (hereafter Facilities survey) in academic years 2007/2008 and 2009/2010.


Since 1986, NSF has modified the survey several times with the goals of improving data quality and reducing respondent burden. The original survey was limited and was conducted using NSF’s Quick Response Survey questionnaire capability in order to meet the Congressional timeframes. The surveys from 1988 to 1996 built upon the 1986 Quick Response Survey instrument but expanded the level of detail in the data collected and expanded the size of the universe surveyed.


During the FY 1998 cycle, the Large Facilities Follow-up survey was added to the Facilities survey. The Follow-up survey collected data on the construction of buildings costing at least $25 million and housing at least some science and engineering research facilities, as required in OMB Circular A-21 (revision dated June 1998). The data collected mostly concerned the costs of construction.


During the next survey cycle, data collection was expanded from a sample of institutions to a census of institutions in order to make the data collected more useful to Federal and State agencies and to the higher education community. A census of institutions allowed for more precise estimates of the status of S&E facilities, and allowed for the analysis of subgroups of institutions.


In the FY 2001 survey cycle, NSF reduced the number of survey items from 10 to two. The two remaining survey items requested data on 1) the amount of net assignable square feet used for instruction and research by field of science and 2) the adequacy of the amount of S&E research space available at each institution. NSF also eliminated the Follow-up survey. This significant reduction in the survey was done in full consultation with the Office of Management and Budget, relevant Congressional staff, the National Institutes of Health, and other interested parties.


The purpose of implementing the abbreviated survey was for NSF to focus completely on redesigning the questionnaire for the FY 2003 cycle of the survey. The goal of the redesign was to improve data quality and relevance and to reduce respondent burden. Until the abbreviated FY 2001 survey, the questionnaire had essentially remained the same over the years; a substantial detailed review and redesign of the survey had not been undertaken for 14 years.


Subsequently, the redesigned survey was implemented for survey cycles FY 2003 and FY 2005. Following survey implementation, NSF continued to examine quality measures such as item nonresponse rate and the number of respondents receiving error messages on each individual survey question, among others.


General survey description


NSF is now requesting approval to conduct the Facilities survey for the FY 2007 and FY 2009 cycles of the survey. The following is a broad overview of the survey.


Questions 1 through 5 request data on the amount of space used for science and engineering research. The questions ask for specifics about the research facilities (e.g., how much of the space is in medical schools) because different kinds of space have different costs and requirements for preparing and using the space. Question 8 requests information on the biosafety level of any research animal space included in the estimates of total research space. NSF has reduced the burden of these questions by deleting a question on leased research space.


In addition, NSF significantly revised question 2 which asks respondents to report the square footage of their research space by field of science. This question previously required respondents to look up definitions for the individual fields of science in an attachment at the back of the survey. Beginning with the FY 2007 survey, the field of science definitions are incorporated into the survey question itself so that respondents can read the information required to respond without the burden of looking up definitions in an attachment. In addition to reducing burden, this revision should also increase data quality.


Questions 6 and 7 ask about the condition of the research facilities because of a recurring concern about aging research infrastructure in the U.S. Also, these questions may provide data on whether existing research space is appropriate for the rapidly changing needs of scientific research.


Questions 9, 10, and 11 ask about the costs of repairs and renovations in the previous two fiscal years. Similar to questions 6 and 7, these also relate to the issues of a potentially aging infrastructure and the changing needs of science, but from a financial perspective.


Question 12 and an accompanying project worksheet ask about new construction in the previous two fiscal years. New construction provides a way to increase the total amount of research space, to replace inadequate facilities, and to adjust to the changing needs of science. The worksheet asks about individual projects to address questions about the costs per square feet of new construction.


Question 13 asks about the sources of project funding for repairs/renovations and new construction to determine the roles of various sectors in funding research facilities.


Questions 14, 15, and 16 ask about planned repairs and renovations over the next two fiscal years, and questions 17, 18, and 19 similarly ask about plans for new construction. These provide data on the extent to which institutions are addressing their space needs.


Questions 20 through 25 ask about deferred repairs/renovation and new construction. These provide another measure of institutions’ anticipated need for space, while also examining the ability of institutions to meet their needs.


The new survey section on cyberinfrastructure with questions on computing and networking capacity asks about an important component of institutional research infrastructure that has largely been unmeasured until now. Researchers widely agree that the increased power of computers, combined with the use of networking to share resources and facilitate communication, is greatly changing the nature of scientific research. In some cases, these capabilities may also reduce the need for “bricks and mortar” research facilities, now and in the future, because they provide the ability to share the use of remote facilities or to use computer simulation rather than building new facilities.

Questions 1 through 12 in this section focus on institutional networking capabilities. These questions ask respondents to report on their different types of networking connections and the source and speed of these connections. Scientific research can depend on the use of massive databases, and therefore the ability to transfer large amounts of data quickly through networking is an important measure of the capacity to productively use such databases. Further, networking has become an important communication mechanism (e.g., collaboratories) and is increasingly being used to remotely connect to off-site research facilities. Question 12 asks about the infrastructure for wireless communications. Some institutions use wireless communications as a substitute for hard wire connections, while others use wireless to increase the ease of access to the network and for specialized applications such as robotics.


Questions 13 through 37 ask about the availability of high performance computing and the supporting storage capacity. High performance computation is becoming increasingly necessary to address large scientific issues.


A.2. Purposes and Use of the Data

Data from this survey serve several audiences. First, the data provide Congress with a broad, quantitative picture of the inventory and condition of existing S&E research space at research-performing academic institutions and at research organizations receiving research funds from the NIH. The survey also provides data on the current and future capital expenditures for research facilities by universities, colleges, and biomedical institutions; sources of funding for research facilities at these institutions; plans for future repair/renovation and new construction of science and engineering research facilities; and computing and networking capacity.


Second, the National Institutes of Health use the survey information to assist them in developing their biomedical research facilities construction and modernization program. Third, State legislatures, State agencies and universities and colleges use the data to assist them in making budgeting decisions and in planning future activities. Private sector architectural, engineering, and construction firms use the data for several purposes including forecasting the demand for their products and services.

Finally, professional associations such as the Association of American Medical Colleges use the data to assist them in assessing the needs of their fields of science.


In the past, only aggregated data collected from this survey were publicly released in order to maintain the confidentiality of individual institutions. This created a serious constraint on the ability of individual institutions to use the data for benchmarking and planning purposes. For example, many institutions have a set of peer institutions that they compare themselves with in order to evaluate their own progress and needs. They are used to making comparisons using sources such as the Integrated Postsecondary Education Data System (IPEDS), but have not been able to make comparisons using the facilities data. In an effort to increase the usefulness of the data, most of the survey data for the FY 2003 and FY 2005 surveys was made publicly available with institutional identifiers (see A.11. on sensitive questions for exceptions). The Foundation created a public-use database that allows institutions (or others) to access the data for analytical purposes. We expect this change to increase the number of data users, the quality of the survey responses, and the ways in which the data are used.


A.3. Use of Improved Information Technology

The survey is a mixed-mode survey: respondents have the option of completing a paper version of the survey or a World Wide Web version. The large majority of institutions (92% in the FY 2005 cycle) respond to the survey via the web.


The web version of the survey is programmed in HTML and is a Macromedia ColdFusion application. This version has a real-time monitoring system, which allows the Foundation to monitor data, response status, and comments from respondents. From the perspective of the respondents, the web version is more convenient and simpler to complete (e.g., skip patterns are automated and totals are calculated automatically). NSF also benefits from the web version through improved data quality. Data quality is improved using several techniques, such as skip-pattern checks and consistency checks. With skip patterns, respondents automatically do not receive questions that do not apply to them based on their earlier answers. With consistency checks, respondents receive messages while completing the survey if a response does not appear consistent with their answers to other questions. NSF also created a public-use database on the web, which significantly increased the usability of the data.
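The following minimal sketch, written in Python rather than the ColdFusion used by the production web instrument, illustrates the general form of such skip-pattern and consistency checks. The question identifiers, field names, and rules shown are hypothetical examples and do not reproduce the actual survey logic.

```python
# Illustrative sketch only: generic skip-pattern and consistency checks of the
# kind described above. Question identifiers and rules are hypothetical and do
# not reproduce the production ColdFusion application's logic.

def applicable_questions(responses):
    """Return the set of question IDs this respondent should receive."""
    questions = {"Q1", "Q2", "Q6", "Q7", "Q8"}
    # Hypothetical skip pattern: an institution reporting no research animal
    # space does not receive the biosafety-level question.
    if responses.get("animal_space_nasf", 0) == 0:
        questions.discard("Q8")
    return questions

def consistency_messages(responses):
    """Return real-time messages for answers that appear inconsistent."""
    messages = []
    total = responses.get("total_research_nasf", 0)
    by_field = sum(responses.get("nasf_by_field", {}).values())
    # Hypothetical consistency check: space reported by field should not
    # exceed the total research space reported elsewhere.
    if by_field > total:
        messages.append(
            f"Space reported by field ({by_field:,} NASF) exceeds total "
            f"research space ({total:,} NASF); please review your entries."
        )
    return messages

if __name__ == "__main__":
    example = {
        "total_research_nasf": 120_000,
        "animal_space_nasf": 0,
        "nasf_by_field": {"Engineering": 70_000, "Physical sciences": 60_000},
    }
    print(applicable_questions(example))   # Q8 is skipped
    print(consistency_messages(example))   # flags the field total as too large
```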


A.4. Efforts to Identify Duplication

As part of the redesign effort, an extensive literature review was conducted to identify other surveys (or data) of science and engineering research facilities. At the national level, data on the book value of higher education physical plant assets are collected periodically through the NCES Integrated Postsecondary Education Data System. However, these data are collected only in the aggregate for an institution, are not available for research space, and are not available by S&E field. The Survey of Science and Engineering Research Facilities is the only survey to collect data from research-performing higher education institutions and biomedical research organizations on research space and fields of S&E. The data do not duplicate statistical data from any other sources.


A.5. Collection of Data from Small Businesses

Data are not collected from small businesses.


A.6. Consequences of Less Frequent Data Collection

The Congressional mandate requires that data collection occur every two years, which ensures that the data are current. This also lessens burden when compared with an annual survey. Conducting the survey less frequently would adversely affect the relevance of the data both for policymakers and users of the data. The Foundation consulted with an expert panel and numerous individual institutions (see section A.8. on public comment and consultations outside the agency), and a two-year period largely corresponds with the frequency that many institutions update their records and with their planning processes.


A.7. Special Circumstances

No special circumstances.


A.8. Public Comment and Consultations Outside the Agency

Public notice was published in the Federal Register, May 23, 2007, 72 FR 29002. No substantial comments were received.


Following implementation of the FY 2003 and FY 2005 facilities surveys, NSF made a concerted effort to consult with survey respondents and cyberinfrastructure experts outside the agency.


Cyberinfrastructure


In the fall of 2004, NSF obtained the services of The Educational Consulting Foundation (ECF), consultants who assist educational institutions and organizations in optimizing the use of their technological resources and advise on institutional information technology infrastructure. ECF engaged in several activities to assist NSF in assessing the first survey cycle addressing computing and networking capacity. ECF conducted seven site visits which included interviews of numerous staff from colleges, universities, and biomedical institutions that responded to the FY 2003 survey. The institutions were selected based on the following criteria: complexity of their network environment (as reflected in their survey responses); knowledge of special features of their environment (e.g., supercomputing); and relationship to other institutions through campus, state, and other networks.


Interviews were conducted to determine how respondents interpreted the survey questions, to learn the availability of the requested data, and to elicit feedback from respondents about the data collection. The following institutions participated in the on-site interviews:


Penn State University

Brigham and Women’s Hospital

University of Massachusetts, Amherst

University of North Carolina, Chapel Hill

Virginia Polytechnic Institute and State University

Loma Linda Adventist Health Science Center

Oregon State University


Telephone interviews were also conducted with the University of North Carolina Systems Office.

Based on the site visits, ECF made recommendations for future survey topics and question wording.


Additionally, ECF reviewed the FY 2003 computing and networking survey responses for inconsistent answers and missing data. Based on these reviews, they made recommendations on question wording for future surveys.


In October of 2004, the Foundation convened a workshop to obtain feedback from FY 2003 survey respondents focusing primarily on the networking survey questions although some aspects of high performance computing were also discussed. Representatives from eight academic institutions who had or shared responsibility for their institutions’ computing and networking environment, and were familiar with their institutions’ responses to the survey, participated in the workshop. Institutions were selected to participate based on several criteria including: 1) characteristics of their computing and network structures (e.g., topology, computation speed); 2) affiliation with state and other network consortia; and 3) geographic location.


The workshop participants discussed their:


  • experiences with the survey questions (i.e., the process they used to answer the question, how they defined question terminology, data available at their institution to answer the questions) and any problems they encountered when trying to answer the questions,

  • recommendations for changing questions (e.g., suggestions for revised or additional definitions, instructions, or wording), and finally,

  • ideas for new measures and concepts that would help NSF meet the objectives of the survey.


The workshop panel consisted of the following members:


Michael Carlin, Director

Infrastructure and Support

University of Maryland, Baltimore County


Herb Kuryliw

Chief Network Architect

Northern Illinois University


Lesley Tolman, Director

University IT Infrastructure

Tufts University


Joe St. Sauver, Director

User Services and Network Applications

University of Oregon


Robert Lummis

Assistant Dean for Information Technology

Albert Einstein College of Medicine

Yeshiva University


Nathan Bohlmann, Manager

IT Information Analysis

Purdue University


Tad Reynales, Director

Network and Telecommunication Services

University of California, Santa Cruz


Willis Marti, Associate Director

Networking, Computing & Information Services

Texas A&M University



In the spring of 2005, based on the workshop input, information from the site visits, information from the expert review of the FY 2003 survey results, and paradata from the FY 2003 survey (e.g., frequencies of error messages and missing data), NSF revised the survey questions for potential use on the FY 2005 survey cycle. Also, questions were revised so that topics and question wording kept abreast of continuous changes in computing and networking technology and terminology. Draft questions were sent to the workshop participants via email. Each participant was asked to answer the questions as if they were completing the survey. Participants were then debriefed on their responses and reactions. Where appropriate, questions were subsequently revised based on this input.


In the summer of 2005 NSF used these newly revised questions for another set of site visits. The following six institutions were visited:


University of Alabama at Birmingham

Alabama A&M University

Xavier University

Louisiana State University Health Sciences Center

Brown University

University of Rhode Island


The institutions were selected based on their size (amount of research expenditures) and their status as a public or private institution. Prior to NSF arriving on campus, the institutions were provided the draft questions so that they might determine their responses. Cognitive interviews were conducted and covered the interpretation of each question, the meaning of specific terms, the usefulness of definitions and instructions, and the process used to select responses.


Usability tests were also conducted. The usability protocol called for one participant to work through the web version of the survey using a think-aloud procedure. Additional probes covered the experiences of logging into the survey, navigating through the survey, responding to the variety of item types, exiting and returning to the survey, and submitting the survey.


A similar process was undertaken to update and revise the survey questions focused on high performance computation. In December of 2006 the Foundation convened a workshop to obtain expertise and feedback from FY 2005 survey respondents on high performance computing (HPC). Representatives from 12 academic and biomedical research institutions participated in the workshop. Institutions were selected to participate based on several criteria including characteristics of their computing environment and geographic location. The workshop participants discussed issues associated with HPC including:


  • Definitions of HPC and other related concepts;

  • Indicators of HPC capacity;

  • Data availability;

  • Measures;

  • Variation in capacity across institutions; and,

  • Emerging issues in HPC.


Workshop participants also discussed specific survey questions, including wording and potential response options.


The workshop participants were:


Vijay Agarwala, Director

Academic Services and Emerging Technology

Pennsylvania State University


Michael Denisevich, Director

Research Computing

The Scripps Research Institute


Curt Hillegas, Manager

Computational Science and Engineering Support

Princeton University


Christoph Hoffman, Director

Rosen Center

Department of Computer Science

Purdue University


Bill Labate, Director

Research Computing Technology and Academic Technology Services

University of California, Los Angeles


Jim Loter, Director

Computing Services

College of Engineering

University of Washington

Ruth Marinshaw

Acting Assistant Vice Chancellor for Research Computing/

Senior Research Manager

University of North Carolina at Chapel Hill


Carolyn Mattingly, Director

Department of Bioinformatics

Mount Desert Island Biological Laboratory


Rosalinda Mendez, Associate Director

Texas Learning and Computation Center

University of Houston


Oren Sreebny, Director

Emerging Technology, Computing & Communications

University of Washington


John Towns, Director

Persistent Infrastructure

National Center for Supercomputing Applications

University of Illinois at Urbana-Champaign



Following the workshop, the Foundation developed draft HPC questions for the FY 2007 survey cycle. NSF obtained the consulting services of John Towns, an HPC expert at the National Center for Supercomputing Applications at the University of Illinois, to work with the Foundation in writing and reviewing the survey questions. These questions were sent to four workshop participants, and each participant was asked to answer the questions as if they were completing the survey. Participants were then debriefed on their question responses and their reactions to the HPC definitions, terminology, and response options. Based on the debriefings, the questions were further revised.


Finally, these revised questions were sent to five institutions to complete as if they were completing the survey. The Foundation then visited each of the five institutions to conduct interviews with the staff involved in providing the data. Interviews were conducted to determine how respondents interpreted the survey questions, to learn the availability of the requested data, and to elicit feedback from respondents about the data collection. The five institutions were Columbia University, State University of New York Health Sciences Center, Memorial Sloan-Kettering Cancer Institute, Ohio State University, and Battelle Memorial Institute.

Research Space


In addition to investigating issues associated with the networking and computing section of the survey, NSF engaged in several activities to examine participant responses to the research space survey questions. Following the FY 2003 field period, NSF conducted respondent debriefings on the newly revised question on the amount of research space by field of science. In the FY 2003 cycle, for the first time, respondents had been asked to break out their total research space into four categories: laboratory space, laboratory support space, office space and “other research space.” The debriefings focused on the nature of the space that respondents included in the “other research space” category. Respondents were asked to describe the kinds of space they included in the “other” category and why they did not include this space in one of the three other available categories.


NSF was interested in understanding more about this type of space for several reasons. First, NSF wanted to ascertain that this space was actually research space as defined in the survey. Second, if it was, this confirmed that respondents had been including the appropriate space (in terms of the survey definition) in their responses to previous survey cycles, when they were asked to report only total research space.


In a second activity, NSF contacted FY 2003 survey respondents from eight institutions to learn more about their reports of total gross square feet, research net assignable square feet, and costs for new construction projects. The purpose of the telephone interviews was to ascertain how institutions interpreted the various measures of research space requested on the survey. The telephone contacts indicated no common misinterpretations of the survey questions.


Third, because the assignment of research space to fields of S&E is so important to the quality of the survey data, NSF examined the process respondents use to classify their research space into these fields. The research protocol employed to investigate these questions consisted of telephone interviews with FY 2005 survey respondents from 55 institutions. Institutions were selected based on 1) the size of their R&D expenditures, 2) their geographic location, 3) whether they were a biomedical or academic institution, 4) whether the institution had a medical school, and 5) the timing of their FY 2005 survey submission. Some of the questions addressed were:


  • To what extent was the method used influenced by the institution’s structure?

  • To what extent was the assignment method used by respondents affected by the definitions of S&E fields at the end of the survey?

  • Did the respondents view the list of disciplines at the end of the survey as a comprehensive list of all disciplines to be included in their survey responses?

  • What information is needed by respondents to assign research space to S&E fields and what information should be presented in the survey?


Based on this investigation, NSF revised the survey question on research space to include definitions of S&E fields in the actual survey question, as well as at the end of the survey. The tool provided at the end of the survey was organized by the NCES Classification of Instructional Programs for those respondents who had institutional databases organized in this manner.


Other efforts to consult outside the Foundation in order to improve research space data quality included the following:


  • Four site visits to conduct cognitive interviews and usability tests of the revised question on research space for the FY 2007 survey. Telephone interviews were conducted with an additional two institutions. On-site interviews were conducted with the University of Virginia, Morgan State University, Virginia Commonwealth University, and Virginia Tech. Telephone interviews were conducted with Massachusetts General Hospital and the University of Arizona.

  • Telephone calls to the principal investigators of seven clinical trials to discuss the types of space used in clinical trials.

A.9. Payments or Gifts to Respondents

Respondents will not be paid or receive gifts.


A.10. Assurance of Confidentiality

Data on individuals are not collected. Historically, the survey promised respondents that all individual institutional data would be confidential and that responses would not be used in any manner which would identify any individual institution’s responses. Only aggregated data for statistical summaries would be presented. However, over the years, respondents and other organizations have requested the data in a less aggregated and more accessible form. For example, colleges and universities wish to compare their institution to other institutions they define as their peers. State agencies and legislatures are interested in comparing institutions in their state to institutions in other states that they define as relevant. Academic departments at colleges and universities wish to compare themselves to like academic departments at other institutions (e.g., computer science departments). Professional associations wish to compare institutional data. And finally, private sector firms wish to know which institutions are planning significant repair/renovation or construction work.


Based on the idea that the data should be available to all interested parties and users and that making the institutional level data more accessible will provide institutions with a reason to respond to the survey, the Foundation decided to make the data publicly available on an institution by institution basis with a few key exceptions (see section A.11 on sensitive questions). Institutions were clearly informed of the change in confidentiality policy in letters to the institutions’ presidents and survey coordinators, as well as on the survey form itself. The publication of micro-level, institutional data was consistent with NSF release practices for its other surveys of academic institutions. For more than 40 years, NSF has published institution data collected on its Survey of Research and Development Expenditures at Universities and Colleges and the Survey of Graduate Students and Postdoctorates in Science and Engineering.


A.11. Sensitive Questions

This is a voluntary survey. Institutions are not required to participate in the survey and respondents may decline to answer any question in the survey. While the Foundation decided to make the majority of the survey data publicly available, the expert panel and the cognitive interviews during the survey redesign identified several survey items as particularly sensitive. The release of data on those questions could make institutions less inclined to respond to the survey. Also, during pretests of the redesigned survey respondents expressed concern about the confidentiality of certain items. To address this issue, the Foundation retains confidentiality on the following survey items: condition of research space (questions 6 and 7) and amount of research animal space (appearing throughout the questionnaire).


A.12. Estimated Response Burden

Burden hours


The Facilities survey is a biennial survey of approximately 477 academic institutions and 190 nonprofit biomedical research organizations. For academic institutions, the hour burden for any particular institution will be affected by two major factors: the size of the institution (in terms of the number of S&E departments and/or the number and size of research facilities) and the status of the institution's computerized central records system. Previous surveys showed that many institutions, as part of their compliance with OMB Circular A-21, have created centralized databases, including a measurement of space devoted to research.


Because of these factors, the completion time varies among institutions. Based on pretests, the time to complete the research space section of the survey (Part 1) ranged from 10 to 85 hours, with an average of 40 hours. The time to complete the computing and networking section of the survey (Part 2) averaged 60 minutes. Therefore, in total, the time per academic institution to complete the survey is expected to average approximately 41 hours. Assuming a 94% response rate, this would result in an estimated burden of 18,368 hours in FY 2007 and a similar burden in FY 2009. [(.94 response rate x 477 institutions ≈ 448 responding institutions) x 41 hours = 18,368 hours]


Because biomedical research organizations generally are not as large, diverse, or complex as colleges and universities, there is substantially less variation in the survey completion time. On average, completion time per biomedical research organization for Part 1 of the survey was 4 hours. For Part 2 of the survey, average completion time per biomedical research organization was 1 hour. Therefore, in total, the time per biomedical research organization to complete the survey is expected to average approximately 5 hours. Assuming a 94% response rate, this would result in an estimated burden of 895 hours in FY 2007 and a similar burden in FY 2009. [(.94 response rate x 190 institutions ≈ 179 responding institutions) x 5 hours = 895 hours]


The total estimated burden for academic and biomedical institutions is 19,263 hours.
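For reference, the burden figures above follow directly from the stated averages and the 94% response rate once the number of responding institutions is rounded to a whole number; the short sketch below simply restates that arithmetic.

```python
# Restates the burden-hour arithmetic above (responding institutions rounded
# to whole numbers; 41 hours per academic institution, 5 per biomedical).
RESPONSE_RATE = 0.94

academic_respondents = round(RESPONSE_RATE * 477)    # 448
biomedical_respondents = round(RESPONSE_RATE * 190)  # 179

academic_hours = academic_respondents * 41            # 18,368 hours
biomedical_hours = biomedical_respondents * 5         # 895 hours
total_hours = academic_hours + biomedical_hours       # 19,263 hours

print(academic_hours, biomedical_hours, total_hours)
```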


Costs to responding institutions

Costs that will be incurred by responding institutions involve the completion time of the survey and the time of an institutional coordinator associated with managing the distribution of survey forms and transmittal to the contractor. The institutional coordinator makes the initial contacts with the research administrators and facilities planners at the institution informing them of the survey, distributes the forms, follows up on the return of the forms or data to the coordinator’s office, and transmits the forms or data to the contractor.


At an estimated cost of $35 an hour for institution staff, the average cost to each academic institution is $1,435 for FY 2007 and a similar amount for FY 2009. For biomedical research institutions the average cost to each institution is $175 in FY 2007 and a similar amount for FY 2009.


Assuming a 94% response rate for academic and biomedical institutions (based on FY 2005 response rates), the total respondent costs for the FY 2007 survey are estimated to be:


Academic: $1,435 per academic institution x 447 institutions = $641,445.

Biomedical: $175 per biomedical research organization x 179 institutions = $31,325.

Total: $672,770
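As a check on the figures above, the per-institution costs are simply the estimated completion hours multiplied by the $35 hourly rate, and the totals use the responding-institution counts given in this section; the brief sketch below restates that arithmetic.

```python
# Restates the respondent-cost arithmetic above ($35/hour staff cost; counts
# of responding institutions as given in this section).
HOURLY_RATE = 35

academic_cost_each = HOURLY_RATE * 41      # $1,435 per academic institution
biomedical_cost_each = HOURLY_RATE * 5     # $175 per biomedical organization

academic_total = academic_cost_each * 447      # $641,445
biomedical_total = biomedical_cost_each * 179  # $31,325
print(academic_total + biomedical_total)       # $672,770
```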


These costs are expected to be similar for the FY 2009 cycle of the survey.


A.13. Annualized Cost to Respondents

There are no capital or startup costs to the respondents for the Facilities survey.


A.14. Annualized Cost to the Federal Government

The estimated total cost of the FY 2007 Facilities survey contract to the Federal Government is as follows:


Survey planning by NSF/NIH staff $25,000

Supervision of ongoing contract $75,000

Contract to conduct the survey $1,399,995

Total estimate $1,499,995



The contract will include all direct and indirect costs of data collection, analysis, reporting, and the production of public and proprietary data sets. These costs are expected to be similar for the FY 2009 survey cycle.


A.15. Reasons for Program Changes

No program changes.


A.16. Publication Plans and Project Schedule

The FY 2007 survey mailout is planned to begin in October of 2007. Data collection will take place over a five-month period in order to provide institutions adequate time to appoint a coordinator and to gather the appropriate records and data. Data collection is expected to end in the spring of 2008. Data collection for the FY 2009 cycle will begin in the fall of 2009 and end in approximately the spring of 2010.


The contractor will submit draft data tabulations within four months following completion of data collection. These will be reviewed and revised as needed by NSF and NIH before production of final data tabulations. NSF will be responsible for electronic dissemination of the tables to NSF offices, NIH, participating academic institutions, professional associations, and others who may request information about the survey.


The contractor will also provide data for a public use data file that will be accessible via the World Wide Web and will include electronic data files for use on personal computers. The data files, including documentation, will contain institutional data and will be delivered within 12 months of data collection. In addition, the contractor will provide a methodology report detailing all survey activities, materials, and procedures.


An NSF-produced Infobrief on the results of the FY 2007 survey is scheduled for December 2008. The Infobrief will include a presentation of major findings, charts and graphs relevant to the survey, and relevant statistical tables. It will also refer the reader to additional information about survey limitations and other technical information.


NSF anticipates a similar schedule for the FY 2009 cycle of the survey.


A.17. Approval for Not Displaying the Expiration Date for OMB Approval

Approval not requested.


A.18. Exceptions to the Certification Statement

There are no exceptions.




